Will First Fatality Affect the Development of Self-Driving Cars?

A May 7 crash of a Tesla Model S in Florida may have outsized implications for the future of driverless technology. The details of the single-fatality crash were made public in a June 30 Tesla blog post, though they had been reported to NHTSA immediately after the crash.

Read Time: 4 minutes

July 9, 2016, 5:00 AM PDT

By Irvin Dawid

Self-Driving Car (RioPatuca / Shutterstock)

Self-driving or 'automated vehicle safety' technology holds enormous implications for reducing traffic crashes, and thus fatalities and injuries. An estimated 35,200 people lost their lives on American roads last year, up 7.7 percent from 2014.

"Ninety-four percent of crashes can be tied back to a human choice or error, so we know we need to focus our efforts on improving human behavior while promoting vehicle technology that not only protects people in crashes, but helps prevent crashes in the first place," said National Highway Traffic Safety Administration Administrator Dr. Mark Rosekind in a July 1 press release. [Italics added.]

Details of the May 7 crash were made available by the National Highway Traffic Safety Administration (NHTSA).

"In a statement, [NHTSA] said preliminary reports indicated that the crash occurred when a tractor-trailer made a left turn in front of the Tesla, and the car failed to apply the brakes," reported Neal E. Boudette and Bill Vlasic for The New York Times on June 30.

It is the first known fatal accident involving a vehicle driving itself by means of sophisticated computer software, sensors, cameras, and radar.

The Silicon Valley-based electric car company's driverless technology is known as Autopilot. "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla states in its June 30 blog post. However, that would not account for the failure of the radar to spot the trailer.

A Tesla equipped with Autopilot makes the Model S a "semi-autonomous" car, according to Gear Patrol staff writer Andrew Connor, as opposed to the fully autonomous technology that Google and Uber are working towards. Specifically, Tesla Autopilot is a "Level 2" technology, adds Connor, while Volvo Intellisafe Autopilot is Level 3, and Google Cars Level 4.

"NHTSA defines vehicle automation as having five levels," according to its May 2013 press release. Its table below is similar, but not identical, to the one used by Connor.

  • No-Automation (Level 0): The driver is in complete and sole control of the primary vehicle controls – brake, steering, throttle, and motive power – at all times.
  • Function-specific Automation (Level 1): Automation at this level involves one or more specific control functions. Examples include electronic stability control or pre-charged brakes, where the vehicle automatically assists with braking to enable the driver to regain control of the vehicle or stop faster than possible by acting alone.
  • Combined Function Automation (Level 2): This level involves automation of at least two primary control functions designed to work in unison to relieve the driver of control of those functions. An example of combined functions enabling a Level 2 system is adaptive cruise control in combination with lane centering.
  • Limited Self-Driving Automation (Level 3): Vehicles at this level of automation enable the driver to cede full control of all safety-critical functions under certain traffic or environmental conditions and in those conditions to rely heavily on the vehicle to monitor for changes in those conditions requiring transition back to driver control. The driver is expected to be available for occasional control, but with sufficiently comfortable transition time. The Google car is an example of limited self-driving automation.
  • Full Self-Driving Automation (Level 4): The vehicle is designed to perform all safety-critical driving functions and monitor roadway conditions for an entire trip. Such a design anticipates that the driver will provide destination or navigation input, but is not expected to be available for control at any time during the trip. This includes both occupied and unoccupied vehicles.

According to Tesla's Autopilot software description, "Tesla requires drivers to remain engaged and aware when Autosteer is enabled. Drivers must keep their hands on the steering wheel." The description adds that the system warns drivers about side collisions, not frontal ones, the type of collision that occurred on May 7. However, on the company's homepage, Tesla states that Autopilot "helps avoid collisions from the front and sides."

Tracey Lien, Bay Area-based technology reporter for the Los Angeles Times, explains the different paths taken by Google and Tesla.

The divergent approaches reflect companies with different goals and business strategies. Tesla’s rapid-fire approach is in line with its image as a small but significant auto industry disruptor, while Google — a tech company from whom no one expects auto products — has the luxury of time. 

The crash hasn't deterred Tesla or its devoted clientele using Autopilot, who "view themselves as part of the test process for the autopilot feature – which the Palo Alto company describes as still being in beta mode," write Natalie Kitroeff and Samantha Masunaga for the Los Angeles Times.

A prescient June 24 Planetizen post, published before the details of the May 7 crash were made public, lists five principles issued by the National Association of City Transportation Officials (NACTO) to make autonomous vehicles urban-friendly. The first bears repeating here:

  • Plan for fully automated vehicles, not half-measures: Going halfway with partially automated vehicles, instead of fully automated, would require drivers to take over if the vehicle encounters a dangerous situation. In practice, such vehicles have been shown to encourage unsafe driving behavior, with drivers reading more, texting more, and generally being inattentive while the vehicle is in motion.

Read NACTO’s full policy statement on automated vehicles (PDF).

Wednesday, July 6, 2016 in Planetizen
