Tesla on Autopilot crashes and kills two; owner charged with “vehicular manslaughter”, and Tesla testified in court

Welcome to the WeChat official account of “Sina Technology”: techsina

Text / Jia Haonan

Source: Smart Car Reference (ID: AI4Auto)

In a world first, a Tesla driver has been charged with “vehicular manslaughter” and is now on trial.

Here’s the thing: Autopilot was engaged at the time of the serious crash, which killed two people.

Accidents involving Tesla’s self-proclaimed “full self-driving” have happened more than once or twice, but this time the judicial authorities believe the fault lies with the driver and has nothing to do with Autopilot.

The unprecedented trial, coming just as Musk is personally setting up Tesla’s litigation department, has made the controversy even more acute:

Does the system provider need to be held responsible for accidents related to autonomous driving? Is the legal balance more likely to tilt in favor of business?

The driver is on trial for “vehicular manslaughter”, and Tesla testified in court

Los Angeles County prosecutors charged the driver with one count: vehicular manslaughter.

There is no equivalent offense in China, and it has appeared in the United States only in recent years.

By definition, vehicular manslaughter is “an offense in which a person’s death results from gross negligence, drunk driving, reckless driving, speeding, or other unlawful operation of a motor vehicle.”

In theory, this is a serious criminal offense, since a human life is involved.

In practice, however, the punishment varies widely. A death caused by mere slight speeding usually draws a sentence of at most one year in prison, while a driver who was drunk or on drugs faces a much heavier sentence.

In fact, this charge was introduced to replace the manslaughter charge previously common in US traffic-accident cases; legislators believed the associated penalties were too severe and unfair to defendants in traffic cases.

Whether US law is reasonable is beside the point here. But judging from the definition of the charge, there must be evidence showing that the driver was grossly negligent while driving.

The Los Angeles court said there was sufficient evidence to convict the Tesla driver of vehicular manslaughter.

The trial has only just begun, so the evidence has not been made public, but something can be inferred from the accident itself.

On December 29, 2019, Kevin George Aziz Riad, driving a Tesla Model S, exited a freeway into the city of Gardena in Los Angeles County, traveling well above the speed limit.

Immediately afterwards, the Tesla ran a red light in downtown Gardena at 119 kilometers per hour and struck a Honda Civic at the intersection, killing both occupants of the Civic on the spot.

Both occupants of the Tesla were also injured, though not life-threateningly.

At the time of the accident, Autopilot was on.

Given the capabilities of the Autopilot version available at the end of 2019, the system could not recognize traffic lights or respond to them autonomously.

The prosecution’s argument, therefore, is that the driver failed to effectively supervise Autopilot, which led to the accident.

Tesla also sent an engineer to testify in court, confirming that Autopilot was indeed engaged at the time of the accident and that the driver had a hand on the steering wheel, but that the vehicle recorded no braking or deceleration in the six minutes before the crash.

This effectively corroborates the accusation of killing through “negligence”.

After all, following numerous accidents, Tesla has changed its tune on “full self-driving”, emphasizing instead that both Autopilot and FSD require the driver to be ready to take over at any time.

As for why the vehicle did not slow down after leaving the freeway, why it ran the red light, and why the driver never braked even once, that remains a mystery.

Some netizens abroad speculate that the driver may have engaged Navigate on Autopilot (NOA) on the freeway; after exiting, the system automatically fell back to basic Autopilot, which cannot recognize traffic lights, and the driver did not notice the change of driving mode.

The speed issue is unsurprising: owners at home and abroad have long complained that Tesla’s NOA fails to accelerate when merging onto a freeway from a ramp, and fails to decelerate automatically after exiting.

That may be part of the cause of this accident.

However, Tesla now has good grounds to deflect the blame, thanks to the disclaimers it issued in advance.

The judiciary likewise holds that the driver’s failure to properly supervise Autopilot constitutes gross negligence and is the root cause of the accident; whether the Autopilot technology itself is mature is treated as not directly relevant.

If convicted, Kevin George Aziz Riad will become the first human driver in history to bear full responsibility for an accident involving automated driving, and the case may have a major impact on the development of autonomous driving.

What is the impact of holding the driver liable?

Whether the system provider should bear responsibility for an accident involving autonomous driving is a societal question.

The core reason is that, at this stage, L2+ and L3 autonomous driving cannot be 100% unmanned and reliable.

L2+ and L3 systems can drive automatically under certain conditions, and some even allow the driver to take their eyes off the road, requiring the driver to take over only when the vehicle prompts for it.

L3 thus sits in an awkward position: it is fundamentally different from conventional driver assistance, yet cannot reach the reliability of L4.

Call it absolutely reliable, and it still cannot do without human intervention; call it unreliable, and L3-grade technology can already relieve humans of most of the work of driving.

Since 2017, no national regulator has issued clear approval rules for L3, partly because it is hard to quantify when and under what conditions a human must take over systems like Tesla’s Autopilot and FSD.

Therefore, at this stage, every L2+ and above car on the road is operating in a gray zone, where the division of rights and responsibilities between the human driver and the AI driver has no legal basis.

This has also contributed to the chaotic handling of incidents over the past few years.

Europe tends to be conservative: it does not allow Tesla or other autonomous-driving companies to advertise “driverless” capability, and rulings tend to favor the safety of users’ lives and property.

North America has been more aggressive, with several serious crashes ultimately attributed to the human driver, such as the famous Uber self-driving fatality, and now this case.

There have also been many similar accidents in China, but domestic manufacturers keep a lower profile, typically settling out of court. In the field of low-speed unmanned delivery vehicles, however, there have been cases with a completely different division of responsibility, such as the Meituan unmanned-vehicle accident we covered earlier.

Whether the manufacturer or the user should ultimately bear responsibility, no one can answer yet.

Not long ago, Mercedes-Benz made a bold pledge that the manufacturer would bear full responsibility for accidents under its L3 system, which impressed the industry and users alike.

On closer inspection, however, Mercedes-Benz attached strict conditions of use to its L3 system, covering weather, road sections, speed and so on; if any one condition is not met, Mercedes-Benz bears no responsibility. And scenarios that meet all the conditions are rare in daily use.

Mercedes-Benz’s commitment is therefore far more about show than substance: it has done little to advance either the technology or the law, beyond sparking a few meaningful discussions.

But this case is different. In the United States, a case-law jurisdiction, if the Tesla case concludes with the driver bearing full responsibility for vehicular manslaughter, the impact on the industry will be significant.

Whether this step is forward or backward, similar autonomous-driving accidents in the future will have a precedent to follow.

As responsibility tilts toward users, autonomous-driving providers will undoubtedly be bolder about mass-producing their new technologies, and may also act more aggressively on both features and safety.

For a company like Tesla, the money lost in lawsuits is not the point. What matters is that once the law rules it must pay for technical defects, investor confidence and the company’s reputation are damaged, which in turn affects its operations and technical iteration.

Judging from the apparent leaning of the court in this case, the outcome is undoubtedly favorable to Tesla: the user supplies the manufacturer with free road-test data, and also bears responsibility for accidents.

Someone offered a vivid metaphor: the sub-L4 autonomous-driving race is like a track crowded with athletes chasing one another, yet unfortunately the rules of the race have not been written at all, so running fast does not mean winning;

What is even more regrettable is that, for the ordinary people watching the race and shopping for a car, no matter who wins, they may not dare to get in the car and ride along.

One more thing

While the California judiciary is prosecuting a Tesla driver for vehicular manslaughter, Tesla itself has made a new move.

Tesla has set up an in-house department dedicated to legal proceedings, handling both defense against lawsuits and active litigation.

Musk revealed that this new department is “hardcore” and reports directly to him.

Intriguing indeed.

Is this a self-protection strategy after being plagued by lawsuits, or has Tesla sensed the shift in the judicial winds and decided it now holds a “gold medal of immunity”?

This article is reproduced from: http://finance.sina.com.cn/tech/csj/2022-05-23/doc-imizirau4326924.shtml
This site is for inclusion only, and the copyright belongs to the original author.
