Silicon Valley’s fully driverless taxi crashed just after commercial launch, injuring two people! The investigation’s conclusion is even more striking: the situation was technically unsolvable

Welcome to the WeChat subscription number of “Sina Technology”: techsina

Text / Deng Simiao

Source: Smart Car Reference (ID: AI4Auto)

GM Cruise, the first company in the United States to commercialize driverless taxis – the kind with no one in the driver’s seat or front passenger seat – has had a crash.

Two people were injured in the accident, and 80 vehicles were recalled for a software upgrade.

The results of the accident investigation have also been released: the robotaxi handled an unprotected left turn poorly. The manufacturer even admitted that in this accident scenario, the autonomous driving system could do nothing but wait for the collision…

What happened?

GM Cruise recalled 80 of its self-driving cars after the accident injured two passengers and damaged the Robotaxi’s rear fender.

Here is the accident in detail.

As the driverless car was making an unprotected left turn, an oncoming car in the right-turn lane was approaching at 40 mph (about 64 km/h), well over that lane’s 25 mph speed limit.

The driverless car predicted that the oncoming car was about to turn right, so it braked hard and stopped in the middle of the intersection.

Cruise said this was meant to prevent the two cars from colliding as they merged onto the same road.

Unexpectedly, the oncoming car exited the right-turn lane without turning right and slammed straight into the robotaxi.

In this accident, Cruise was somewhat wronged; after all, it was not primarily at fault. But it also shows that AI drivers still lack the emergency response and flexibility of human drivers.

What is even more embarrassing is that the accident happened on June 3, when GM Cruise had just obtained a license for commercial operation of unmanned vehicles in California.

Cruise’s filing with the NHTSA later showed that during unprotected left turns, its autonomous driving system weighs possible outcomes and chooses the least harmful course of action.

If other vehicles on the road suddenly change direction or travel well over the speed limit, it is hard for the driverless car to judge and react to traffic conditions in time. In other words, it can only stop and “wait for the impact”.
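Cruise has not published its actual decision logic, but the filing’s idea of “weigh possible outcomes, pick the least harmful action” can be sketched as a toy expected-harm minimization. All candidate actions, probabilities, and harm costs below are invented for illustration and are not Cruise’s real planner:

```python
# Toy sketch of least-harm action selection at an unprotected left turn.
# Every number and name here is an illustrative assumption, not Cruise's code.

CANDIDATE_ACTIONS = ["proceed_left_turn", "hard_brake", "yield_and_creep"]

# Assumed predictions for the oncoming car's behavior, with probabilities.
PREDICTIONS = [
    {"behavior": "turns_right", "prob": 0.8},
    {"behavior": "goes_straight_at_40mph", "prob": 0.2},
]

# Assumed harm cost for each (our action, their behavior) pair.
HARM = {
    ("proceed_left_turn", "turns_right"): 0.0,
    ("proceed_left_turn", "goes_straight_at_40mph"): 10.0,  # high-speed side impact
    ("hard_brake", "turns_right"): 0.1,          # stopped in intersection, minor risk
    ("hard_brake", "goes_straight_at_40mph"): 5.0,  # still exposed to impact
    ("yield_and_creep", "turns_right"): 0.3,
    ("yield_and_creep", "goes_straight_at_40mph"): 6.0,
}

def expected_harm(action: str) -> float:
    """Probability-weighted harm of an action over all predicted behaviors."""
    return sum(p["prob"] * HARM[(action, p["behavior"])] for p in PREDICTIONS)

def least_harm_action() -> str:
    """Pick the action with the lowest expected harm."""
    return min(CANDIDATE_ACTIONS, key=expected_harm)

if __name__ == "__main__":
    for action in CANDIDATE_ACTIONS:
        print(action, expected_harm(action))
    print("chosen:", least_harm_action())
```

With these made-up numbers, hard braking minimizes expected harm, which mirrors what the robotaxi actually did. The sketch also shows the filing’s point: once the other car both speeds and breaks its predicted behavior, every remaining option still carries harm.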

Cruise added that, of the 123,560 unprotected left turns made previously, this was the only accident.

After the accident, Cruise temporarily disabled the robotaxis’ unprotected left-turn function and narrowed the scope of commercial operations, avoiding downtown roads.

The maximum speed was capped at 30 mph (about 48 km/h), and operating hours were limited to between 10 pm and 6 am, excluding rainy and foggy conditions.

On July 6, Cruise announced a software update for all of the recalled vehicles.

The update improves the predictive capabilities of the self-driving software; to keep a collision like this one from happening again, the vehicle would now choose a different path next time.

What do netizens think about the accident?

This was indeed unlucky for GM Cruise: a rough start the moment it began operations.

Netizens had plenty to say.

Some defended Cruise:

It sounds like the other car was at fault, and Cruise is fixing its self-driving software so it can predict better the next time it meets such a scenario.

Even human drivers get into accidents like this through negligence.

There were also Musk believers who took the chance to question lidar:

Lidar all over the car, and it still couldn’t avoid this accident.

However, @Whole Mars Catalog, a Tesla fan blogger, said that the controversy is not about which sensor solution to use:

Whether you use lidar or pure camera vision, you can “see” the car. The hard part is deciding what to do after you see it.

Many people in the thread agreed:

Lidar or ordinary radar, both are just for ranging. The sensors saw the scene, but the system couldn’t understand it and didn’t know how to respond.

Others suggested accelerating the rollout of vehicle-road coordination (V2X) infrastructure:

Driverless vehicles really need V2X for safety, because a human driver’s every move is hard for self-driving software to predict.

Traffic would flow more smoothly if each car could signal which lane it intends to take next, so other cars could adjust quickly.
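The commenter’s idea amounts to cars broadcasting their intent instead of forcing others to infer it. A minimal sketch of such an “intent” message, with field names invented for illustration (real V2X standards such as SAE J2735 define their own message sets):

```python
# Hypothetical V2X intent broadcast, as the commenter imagines it.
# Field names are illustrative assumptions, not any real V2X standard.
import json
from dataclasses import asdict, dataclass

@dataclass
class IntentMessage:
    vehicle_id: str
    current_lane: str       # e.g. "right_turn_lane"
    intended_maneuver: str  # e.g. "go_straight", "turn_right"
    speed_mph: float

    def to_json(self) -> str:
        """Serialize the message for broadcast."""
        return json.dumps(asdict(self))

# Had the human driver's car broadcast this, the robotaxi would not have
# had to guess the maneuver from trajectory alone:
msg = IntentMessage("car_42", "right_turn_lane", "go_straight", 40.0)
print(msg.to_json())
```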

Others said the situation in the Cruise accident came down entirely to the human driver’s “intent”.

Sure, this advice sounds plausible, but it’s far too idealistic and not the essential way to help self-driving systems improve today.


(Disclaimer: This article only represents the author’s point of view and does not represent the position of Sina.com.)

This article is reproduced from: http://finance.sina.com.cn/tech/csj/2022-09-02/doc-imqmmtha5639470.shtml
This site is for inclusion only, and the copyright belongs to the original author.
