A sleeping driver and a terrified granny are among the many people who appear in popular YouTube videos of Tesla's Autopilot technology in use – and not always being used as intended.
It is an unwelcome social media image for the car company as it contends with a federal investigation into a fatal accident involving its Model S car operating in Autopilot mode in Florida in May. It was the first known fatality involving a Model S operating on Autopilot.
Tesla and other manufacturers working on systems that allow cars to pilot themselves under certain conditions are seeking to improve road safety by reducing some of the burden on drivers. Tesla issues strict safety protocols for using its Autopilot and stresses that the technology is still in development.
Not all users are heeding the warnings.
One video, filmed by someone in another car, shows the driver of a Tesla sleeping at the wheel while his vehicle inches along in gridlock traffic. The footage was uploaded in May and has already been viewed more than 2.3 million times. (bit.ly/1NM46bi)
Another video, “Granny on Tesla’s Autopilot Mode,” shows a woman at the wheel of a Tesla shrieking in shock and alarm over the Autopilot feature, as someone in the passenger seat films her.
“Oh, there’s cars coming!” she yells. “Put me back for me controlling it! Oh, dear Jesus.”
Several YouTube users have reposted the footage across the video-sharing platform. (bit.ly/29hGBan)
In another video, posted by the account DragTimes in October, a driver who filmed himself using Tesla’s Autopilot feature reads a speeding ticket aloud as the car continues on, explaining that he had been pulled over because the car, on Autopilot, was going 75 miles per hour (120 kph) in a 60 mph (95 kph) zone. (bit.ly/1Pxjfez)
A DragTimes spokesman told Reuters he believed the technology has been “steadily improving” since the video was filmed. “The recent death is certainly sad and awakening for Tesla owners to be careful with this technology and only use it as recommended,” he added.
The U.S. National Highway Traffic Safety Administration said on Thursday it is investigating 25,000 Model S sedans that are equipped with the Autopilot system, after the death of 40-year-old Joshua Brown in Williston, Florida. Its probe will add to debate within the auto industry and in legal circles over the safety of systems that take partial control of steering and braking from drivers.
In a statement on Thursday, Tesla noted its safety warnings.
“When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot ‘is an assist feature that requires you to keep your hands on the steering wheel at all times,’ and that ‘you need to maintain control and responsibility for your vehicle’ while using it,” Tesla said.
YouTube is full of videos either showing or discussing Autopilot – a search for “Tesla Autopilot” on the site yields some 27,000 results. Many are not “screaming granny” videos, but the wackier ones often get far more clicks than the straightforward ones.
(Reporting By Amy Tennery; Editing by Frances Kerry)