Self-driving or 'automated vehicle safety' technology holds enormous implications for reducing traffic crashes, and thus fatalities and injuries. An estimated 35,200 people lost their lives on American roads last year, up 7.7 percent from 2014.
"Ninety-four percent of crashes can be tied back to a human choice or error, so we know we need to focus our efforts on improving human behavior while promoting vehicle technology that not only protects people in crashes, but helps prevent crashes in the first place," said National Highway Traffic Safety Administration Administrator Dr. Mark Rosekind in a July 1 press release. [Italics added.]
Details of the May 7 crash were made available by the National Highway Traffic Safety Administration (NHTSA).
"In a statement, [NHTSA] said preliminary reports indicated that the crash occurred when a tractor-trailer made a left turn in front of the Tesla, and the car failed to apply the brakes," reported Neal E. Boudette and Bill Vlasic for The New York Times on June 30.
It is the first known fatal accident involving a vehicle driving itself by means of sophisticated computer software, sensors, cameras, and radar.
The Silicon Valley-based electric car company's driverless technology is known as Autopilot. "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla states in its June 30 blog post. However, that would not account for the failure of the radar to spot the trailer.
Autopilot makes the Model S a "semi-autonomous" car, according to Gear Patrol staff writer Andrew Connor, as opposed to the fully autonomous technology that Google and Uber are working toward. Specifically, Tesla Autopilot is a "Level 2" technology, adds Connor, while Volvo Intellisafe Autopilot is Level 3 and Google's cars are Level 4.
"NHTSA defines vehicle automation as having five levels," according to the agency's May 2013 press release. Its table below is similar, but not identical, to the one used by Connor.
According to Tesla's Autopilot software description, "Tesla requires drivers to remain engaged and aware when Autosteer is enabled. Drivers must keep their hands on the steering wheel." It adds that the system warns drivers about side collisions, but not frontal ones, which is what occurred on May 7. However, on the company's homepage, Tesla states that Autopilot "helps avoid collisions from the front and sides."
The divergent approaches reflect companies with different goals and business strategies. Tesla’s rapid-fire approach is in line with its image as a small but significant auto industry disruptor, while Google — a tech company from whom no one expects auto products — has the luxury of time.
The crash hasn't deterred Tesla or its devoted clientele from using Autopilot. Many owners "view themselves as part of the test process for the autopilot feature – which the Palo Alto company describes as still being in beta mode," write Natalie Kitroeff and Samantha Masunaga for the Los Angeles Times.
A prescient June 24 Planetizen post, published five days before the details of the May 7 crash were made public, lists five principles issued by the National Association of City Transportation Officials (NACTO) to make autonomous vehicles urban-friendly. The first bears repeating here:
Read NACTO’s full policy statement on automated vehicles (PDF).