Tesla crash raises concerns about autonomous vehicle regulation


The fatal crash of a Tesla Motors Inc Model S in Autopilot mode has turned up pressure on auto industry executives and regulators to ensure that automated driving technology is deployed safely.

The first such known accident, which occurred in Florida in May, has highlighted tensions surrounding efforts to turn over responsibility for braking, steering and driving judgments to machines. It may delay the U.S. government's plan to outline guidelines for self-driving cars this month.

The cause of the Model S crash is still under investigation by federal and Florida state authorities, which are looking into whether the driver was distracted before his 2015 Model S went under a truck trailer.

Shares of Tesla and Mobileye NV, the maker of the camera vision system used in the Model S, rose on Friday as analysts said the accident was likely a short-term setback. The stocks had fallen in after-hours trading on Thursday after news of the crash investigation emerged.

Advocates of automated driving point to research showing that 90 percent of accidents are caused by human error. But machines can also make mistakes, or encounter situations they are not designed to handle.

On Friday, the U.S. National Highway Traffic Safety Administration (NHTSA) said U.S. traffic deaths rose by 7.7 percent to 35,200 in 2015 - the highest annual tally since 2008 and biggest single-year jump since 1966. Federal officials and industry executives say that toll could be cut by technology such as brakes that automatically engage when sensors detect an impending crash.

In March, 20 automakers agreed with regulators to make automatic emergency braking standard on nearly all U.S. vehicles by 2022, a move that could prevent thousands of rear-end crashes annually.

But automakers have issued numerous recalls for problems with such systems. Honda Motor Co recalled nearly 50,000 Acura SUVs and cars in June 2015 because the system could apply the brakes when it detected a vehicle ahead accelerating while the Acura was driving alongside a metal fence or guardrail.


U.S. Transportation Secretary Anthony Foxx told Reuters earlier this year he planned to propose regulatory guidelines by mid-July to clear the way for wider deployment of automated driving systems.

"This technology is coming," Foxx said. "Ready or not, it's coming."

On Wednesday, Foxx said the guidelines could take more time, and cautioned there are questions "that are probably unanswerable at this point."

NHTSA said in a report in March that there are relatively few hurdles to fully autonomous vehicles being used on U.S. roads, as long as vehicle design "allows a human driver to operate the vehicle with a wheel and pedals."

At a conference in Detroit last month, NHTSA chief Mark Rosekind said he would accept technology that was "two times" better than conventional vehicles at preventing collisions.


Hours before the crash became public knowledge on Thursday, U.S. National Transportation Safety Board Chairman Christopher Hart said driverless cars will not be perfect.

"There will be fatal crashes, that's for sure," Hart told the audience at the National Press Club in Washington, but added that will not derail the move toward driverless cars, even if the vehicles are not ready: "This train has left the station."

Alphabet Inc's Google unit and other companies are racing to get self-driving cars on U.S. roads. Google has logged about 1.5 million miles of test driving, but has not said when it would offer its technology for sale.

"We have a responsibility to get this out there as soon as we can and really as soon as we have data that says we're better than the current system of flawed human drivers," Google's self-driving car CEO John Krafcik said last month in Washington.


Former NHTSA chief David Strickland, who is heading a self-driving coalition including Google and Ford Motor Co, told Reuters on Friday he does not "think this crash is going to change the arc for the entire industry ... and our pathway toward full self-driving."

Automakers have wide latitude to install systems that intervene when drivers are not attentive - from lane-keeping systems to automatic emergency braking - and do not need prior approval from regulators, even if the systems are described as in "beta", or public testing mode.

Former NHTSA chief Joan Claybrook said in an interview the agency needs to set performance standards for electronic systems like Autopilot. "It's like the Wild West. The regulatory system is not being used," Claybrook said.

Tesla's Autopilot system "is explicitly denoted as a beta product," said Jason Corso, an associate professor of electrical and computer engineering at the University of Michigan. The accident is a "wake-up call that significant further study is needed to model the sensors and the underlying recognition technologies on which these systems rely," he said.

Timothy Carone, a business professor at the University of Notre Dame, said there will be more such events as more automated cars, planes, trains and weapons are put into use.

Deaths will start to rise, but will then decline, he said, as "artificial intelligence, big data, and sensors for collecting data begin to mature and become capable of handling unusual situations that are difficult to simulate in test environments."

(Reporting by Narottam Medhora in Bengaluru, Bernie Woodall in Detroit and David Shepardson in Washington; Editing by Bill Rigby and Jeffrey Benkoe)
