Autopilot in action. Tesla

Tesla’s Autopilot semi-self-driving technology is under fire after a fatal crash in Florida in May called into question whether some owners have placed too much trust in the system’s capabilities.

Two federal-government agencies are investigating the tragic incident, along with other nonfatal accidents involving Autopilot.

The crash has also set off a debate about what the future holds for driverless cars.

But much of the discussion around Tesla’s response to the incident is missing the point of Autopilot technology altogether.

Tesla was never trying to empower cars to drive themselves while passengers took naps or watched TV. The company was instead tackling a specific problem, with technology as the solution.

Safety first

That problem is safety — specifically, the extreme fallibility of human drivers. A daunting 35,000 people are killed in auto-related accidents every year in the US alone, and many more are injured. Most of the crashes are the result of driver error.

There’s no technology that can completely relieve the driver of responsibility for operating the vehicle, so since the advent of the seat belt decades ago, safety in cars has concentrated on protecting drivers and passengers from injury, and on preventing accidents in the first place.

Seat belts led to air bags and crumple zones — sections of cars and trucks that are designed to collapse on impact, dissipating the energy of a crash.

Crash testing. Bill Pugliano/Getty Images

Then new systems were developed: antilock brakes, traction and stability control, and, more recently, automatic emergency braking, lane-departure warning, and collision avoidance — technologies that use sensors, cameras, and radar.

Tesla Autopilot is a logical evolution of these advancements, consistent with the company’s vision of addressing transportation challenges sooner than the traditional auto industry does.

The bottom line is that Tesla is trying to design vehicles with safety as the primary consideration. And, for the most part, safety advocates agree that giving drivers more technology on this front, rather than more distracting or complicated infotainment systems, is a good way to go.

A problem of perception

Unfortunately, Autopilot has been perceived as ultra-advanced cruise control verging on self-driving tech rather than as an enhanced safety system.

Even Autopilot’s most controversial feature, auto-steering, is calibrated not to let the driver take their hands off the wheel while the car steers itself, but rather to follow a curve more precisely than a human would. In my experience, the system plots a series of short straight lines through a curve, rather than steering through it in one continuous motion, which reduces the chance that the car will “oversteer.”
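
To make that piecewise idea concrete, here is a minimal, purely illustrative sketch in Python. It is not Tesla’s actual control code, and the function name and the 200-meter, 30-degree bend are hypothetical; it simply shows why splitting a curve into enough short straight segments keeps the plotted path within a small fraction of a meter of the true arc.

import math

def piecewise_linear_arc(radius_m, arc_angle_rad, n_segments):
    """Approximate a circular arc with short straight segments, the way a
    lane-keeping controller might plot straight-line targets through a curve.

    Returns the waypoints and the maximum lateral deviation (the sagitta)
    between each straight segment and the true arc.
    """
    # Waypoints evenly spaced along the arc, starting at the origin and
    # initially heading along the x-axis.
    waypoints = [
        (radius_m * math.sin(arc_angle_rad * i / n_segments),
         radius_m * (1 - math.cos(arc_angle_rad * i / n_segments)))
        for i in range(n_segments + 1)
    ]

    # Sagitta of each chord: longer segments cut the corner more,
    # shorter segments hug the curve.
    half_segment_angle = arc_angle_rad / (2 * n_segments)
    max_deviation_m = radius_m * (1 - math.cos(half_segment_angle))
    return waypoints, max_deviation_m

if __name__ == "__main__":
    # Hypothetical numbers: a 200 m-radius highway bend swept over 30 degrees.
    for segments in (2, 5, 10, 20):
        _, deviation = piecewise_linear_arc(200.0, math.radians(30), segments)
        print(f"{segments:2d} segments -> max deviation from curve: {deviation:.3f} m")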

The interior of a Tesla Model S is shown in autopilot mode in San Francisco, California, U.S., April 7, 2016. REUTERS/Alexandria Sage/File Photo

It’s ironic that the safety of Tesla vehicles is now in doubt as a result of the company pushing the safety envelope. But as I’ve already argued, additional owner training — call it “Autopilot 101” — could correct any misperceptions that owners might have about the capabilities of their Teslas.

They’re not supposed to be driving themselves. They’re supposed to be helping you do a better and therefore safer job of driving yourself.

Perfection is a process

Familiar safety technologies haven’t always been perfect. Seat belts can injure people in a crash if they aren’t equipped with precrash tensioning systems.

Passenger air bags caused deaths and injuries to children before manufacturers added deactivation sensors to the front seats. Newer air bags also have vents in the bags themselves so that they can deflate slightly, preventing injuries when drivers or passengers hit them.

The issue with the next wave of sensor, radar, and camera systems is that, because their purpose is to avoid crashes, they can give drivers the wrong impression: that they can tune out and let the car take control. Reasonable and experienced drivers will quickly realize that they can’t do that.

But because Tesla Autopilot handles mundane driving so well, even a veteran driver’s prudent distrust can be switched off, leaving a machine that weighs thousands of pounds and can go 100 mph to do its own thing.

Avoiding that situation is simply a matter of reminding drivers that safety isn’t magic and that, until autonomous cars have millions of miles in the record books at some very distant future point, truly safe driving begins and ends with the person behind the wheel.

As reported by Business Insider