

EDELSTEIN LAW, LLP: THE JURIST JOURNAL BLOG


Autonomous Vehicles (Self-Driving Cars)

Several Unsolved Problems in the Push Toward Autonomous Vehicles

Two recent crashes involving Tesla vehicles operating on Autopilot, including a high-profile fatality in Florida involving a Tesla Model S, show that the push toward self-driving cars still has many unsolved problems. The fatal crash occurred when the driver had engaged Autopilot and the system failed to recognize the white side of a big rig turning in front of the vehicle. The U.S. National Highway Traffic Safety Administration (NHTSA) is investigating not just the incidents but the Autopilot system itself.

Consumer Reports, meanwhile, has openly criticized Tesla, saying the company misleads car owners by calling its semi-autonomous driving system “Autopilot.” Consumer Reports has called on Tesla to drop the Autopilot name and disable the automatic steering system until it is updated to ensure the driver’s hands stay on the wheel at all times. As it stands, the system warns the driver only after several minutes of hands-off driving.

Grappling with the Human Aspect

It seems that automakers and researchers alike are grappling with the human aspect of self-driving cars, and the questions are daunting. When should drivers take over, and will their reaction time be sufficient if they have not been paying attention to the road? Where does the driver’s responsibility end and the vehicle’s begin? And if a car’s computer system cannot detect a problem ahead on the road, how can it possibly alert the driver?

What if drivers are daydreaming or absorbed in a book or their smartphones? These are some of the questions researchers are trying to answer. Another open question is whether autonomous driving and traditional manual driving can, or should, coexist in the same vehicle. Google has already concluded that they cannot.

Automakers Should Take Responsibility

The fact is simply that self-driving technology is not ready for full autonomy. The sensors on the Model S that crashed were not designed to detect the kind of hazard it encountered in the turning big rig. A man died because his self-driving car could not drive.

It is shocking that federal safety regulators did not require Tesla to test the system before putting these vehicles on the market. Such testing could have prevented Tesla occupants from becoming human guinea pigs. Automakers like Tesla must stop blaming victims and take legal responsibility for Autopilot failures.