Tesla in More Trouble: Did Elon Musk Lie?
Feds Seek Autopilot Data From Tesla in Crash Probe
By Tom Krisher, AP Auto Writer
DETROIT —
Federal safety investigators are asking electric car maker Tesla Motors for details on how its Autopilot system works and why it failed to detect a tractor trailer that crossed its path in a Florida crash.
The National Highway Traffic Safety Administration, in a letter to Tesla posted Tuesday, also requests data on all crashes that happened because its system did not work as expected.
The agency is investigating the May 7 crash in Williston, Florida, that killed 40-year-old Joshua Brown, of Canton, Ohio. Tesla says the cameras on his Model S sedan failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky and didn’t automatically brake.
The agency gave Tesla until Aug. 26 to fully comply with its request. The company faces penalties of up to $21,000 per day, to a maximum of $105 million if it doesn’t comply.
Although the agency called the problem with Tesla’s Autopilot system an “alleged defect,” a spokesman said in a statement that it hasn’t determined if a safety defect exists. The information request is a routine step in an investigation into the crash, spokesman Bryan Thomas said.
The investigation could have broad implications for the auto industry and its steps toward self-driving cars. If the NHTSA probe finds defects in Tesla’s system, the agency could seek a recall. Other automakers already offer or are developing similar systems, which may need to be changed as a result of the probe.
Tesla’s system uses cameras, radar and computers to detect objects and automatically brake its vehicles if they’re about to hit something. It also can steer the car to keep it centered in its lane. The company says that before Autopilot can be used, drivers have to acknowledge that the system is an “assist feature” that requires a driver to keep both hands on the wheel at all times. Drivers are told they need to “maintain control and responsibility for your vehicle” while using the system, and they must be prepared to take over at any time, Tesla has said.
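To make the description above concrete, here is a minimal sketch of how such a driver-assistance loop could be structured. It is purely illustrative: the Detection fields, the time-to-collision threshold, and the steering gain are assumptions made for this example, not Tesla’s actual design.

    from dataclasses import dataclass

    @dataclass
    class Detection:
        """A hypothetical fused camera/radar track of one object ahead."""
        distance_m: float         # distance ahead of the car, in meters
        closing_speed_mps: float  # how fast the gap is shrinking, m/s
        in_path: bool             # does the object overlap the car's lane?

    def autopilot_step(detections, lane_offset_m, ttc_brake_threshold_s=1.5):
        """Illustrative decision loop: brake if a collision looks imminent,
        otherwise apply a small steering correction to stay centered in the lane.
        Thresholds and structure are invented, not Tesla's algorithm."""
        brake = False
        for d in detections:
            if d.in_path and d.closing_speed_mps > 0:
                time_to_collision_s = d.distance_m / d.closing_speed_mps
                if time_to_collision_s < ttc_brake_threshold_s:
                    brake = True  # automatic emergency braking would trigger here
        steering_correction = -0.1 * lane_offset_m  # nudge back toward lane center
        return brake, steering_correction

    # Example: an object 20 m ahead closing at 25 m/s, car 0.3 m off lane center
    print(autopilot_step([Detection(20.0, 25.0, True)], lane_offset_m=0.3))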
In the letter, which was dated July 8, NHTSA also asked Tesla for results of its own investigation into the May 7 crash, and for all consumer complaints, field reports from dealers, reports of crashes, lawsuits and all data logs and images from problems with the Autopilot system. It also seeks details on any modification to the Autopilot system that Tesla has made.
“Describe all assessments, analyses, tests, test results, studies, surveys, simulations, reconstructions, investigations, inquiries and or evaluations that relate to or may relate to the alleged defect,” the letter says.
Investigators also want to know how the system recognizes objects and decides whether they are crossing the path of a Tesla. They also asked the company to describe how the system detects how signals from cameras or other sensors have been compromised or degraded and when that information is communicated to the driver.
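The “degraded sensor” question can be pictured with an equally simplified sketch. The checks, thresholds, and warning below are invented for illustration only; NHTSA’s letter asks Tesla to describe how its real system handles this.

    def sensor_health(camera_contrast, radar_snr_db,
                      min_contrast=0.15, min_snr_db=10.0):
        """Hypothetical health check: flag inputs that look too degraded to
        trust, such as a washed-out camera image or a weak radar return."""
        issues = []
        if camera_contrast < min_contrast:
            issues.append("camera image has very low contrast (glare or bright sky)")
        if radar_snr_db < min_snr_db:
            issues.append("radar signal-to-noise ratio below usable level")
        return issues

    def warn_driver_if_degraded(issues):
        """If any sensor looks degraded, hand responsibility back to the driver."""
        if issues:
            print("TAKE OVER IMMEDIATELY:", "; ".join(issues))
        return bool(issues)

    # Example: washed-out camera, healthy radar -> the driver is warned
    warn_driver_if_degraded(sensor_health(camera_contrast=0.08, radar_snr_db=14.0))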
Tesla Has No Plans to Disable Autopilot Feature in Its Cars
Tesla sees Autopilot as stock-market pump technology and plans to redouble its efforts to educate customers on its use, so that Musk does not look like even more of a loser.
U.S. Deepens Investigation of Tesla in Fatal Autopilot Collision
The dashboard of a Model S equipped with Autopilot, Tesla’s autonomous driving system. Credit David Paul Morris/Bloomberg
Federal highway safety officials, deepening their investigation of Tesla Motors and the company’s Autopilot technology, have asked a set of detailed questions about the company’s crash-prevention systems and any incidents in which they failed to work properly, including the fatal May 7 accident that sparked the inquiry.
The request, which the National Highway Traffic Safety Administration announced on Tuesday, comes as the federal government is increasing its scrutiny of Tesla.
A separate agency, the National Transportation Safety Board, which more typically investigates airline accidents, is also looking into the accident, which killed Joshua Brown, an entrepreneur from Ohio.
Mr. Brown’s 2015 Tesla Model S collided with a tractor-trailer on a Florida highway while the car’s Autopilot system was engaged.
Graphic: Inside the Self-Driving Tesla Fatal Accident
The questions raised by the National Highway Traffic Safety Administration, in a nine-page letter dated July 8, indicate the agency is investigating whether there are defects in the various crash-prevention systems related to the Autopilot system. Those include automatic emergency braking, which is supposed to stop Tesla models from running into other vehicles detected by radar and a camera, and Autosteer, which uses radar and camera input to guide the vehicles on highways or in slow-moving traffic.
The company warns drivers they must keep their hands on the steering wheel and remain alert while using Autopilot. And since the May 7 crash, the company has emphasized that Autopilot is intended to be a driver assistance feature and not a fully automated driving system.
Photo: Joshua Brown, shown with his Tesla Model S, had the Autopilot system engaged when he was killed in a crash in May. Credit Krista Kitchen, via Associated Press
The agency’s letter asks Tesla to provide a wide range of data, including a list of all vehicles sold in the United States that are equipped with the Autopilot system; the miles driven with Autosteer activated; the number of incidents in which Tesla vehicles’ automatic emergency braking was activated; and how many times the automatic system warned drivers to put their hands back on the steering wheel.
Tesla remotely collects vast volumes of information about its cars as they are being driven.
The letter also asks Tesla to turn over any information on consumer complaints or reports of crashes or other incidents in which a vehicle’s accident-prevention systems may not have worked properly. The agency is also seeking detailed data about how Tesla’s technology detects pedestrians, bicycles and other vehicles, including vehicles moving laterally across a road.
The tractor-trailer involved in the Florida crash made a left turn across the path of Mr. Brown’s Tesla, which, the accident investigation report indicated, did not slow before hitting the trailer, continued under it at high speed, and eventually ran off the road and struck a utility pole that brought the car to a stop.
Tesla has said that in Mr. Brown’s accident, the Autopilot system failed to see the white truck against a bright sky and that neither the system nor Mr. Brown braked before the impact.
Tesla Motors: It’s the Nondisclosure, Not the Crash
By Ben Levisohn
UBS analysts Colin Langan and Eddie Hsieh assess the news that Tesla Motors (TSLA) is now being investigated by the SEC for failing to disclose the death of a driver using its Autopilot system:
Yesterday, the WSJ reported that the SEC is investigating Tesla for failing to disclose to investors the fatal crash in May involving a Model S operating in Autopilot mode. This is the latest in a series of negative Tesla headlines, including the original report of the Autopilot crash, the Q2 delivery miss, and concerns around the SolarCity (SCTY) deal. The SEC is reportedly investigating whether the accident is considered a material event that a reasonable investor would have deemed important. The fatal accident occurred prior to the May capital raise. The article states that the investigation is in the very early stages, and Tesla said it has not received any communication from the SEC. Legal experts cited in the media note that the stock rose the day after the crash was announced, which will make it tough to prove the crash was a material event, and that it is common for SEC investigations to ultimately lead nowhere.
Autopilot issue largely focused on disclosure, not systems performance: Many autonomous-driving experts believe Tesla’s Autopilot is a good ‘Level 2’ semiautonomous system. However, there is debate over whether the system’s limitations are communicated sufficiently to owners. For example, according to Reuters, the driver in the Autopilot fatality may have been watching a DVD, clearly placing too much trust in the system’s capabilities. Ensuring the driver is ready and able to take back control of the vehicle may ultimately be a key addition to this type of system (for example, adding a camera to monitor the driver). That said, we believe Tesla’s comparison of the Autopilot fatality (the first in more than 130 million miles of Autopilot driving) with all US vehicles (one fatality in every 94 million miles) doesn’t take into account that Autopilot is operated in restricted conditions (highways) and with human oversight, while the overall US statistics cover all environments and a broad range of safety technology.
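Taking the figures quoted above at face value, the implied rates work out roughly as follows. This is illustrative arithmetic only; as the analysts note, the exposure behind the two numbers is not directly comparable.

    # Fatality rates implied by the figures quoted above (illustrative arithmetic only)
    autopilot_miles = 130e6        # ">130 million miles" driven on Autopilot, 1 fatality
    us_miles_per_fatality = 94e6   # one US road fatality per 94 million vehicle miles

    autopilot_rate = 1 / autopilot_miles * 100e6       # fatalities per 100M miles
    us_rate = 1 / us_miles_per_fatality * 100e6

    print(f"Autopilot: ~{autopilot_rate:.2f} fatalities per 100M miles")  # ~0.77
    print(f"US fleet:  ~{us_rate:.2f} fatalities per 100M miles")         # ~1.06
    # The Autopilot miles are mostly supervised highway driving, while the US
    # figure covers all roads, vehicles and conditions, so the rates are not
    # an apples-to-apples comparison.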
Shares of Tesla Motors have advanced 0.3% to $225.50 at 11:15 a.m. today, while SolarCity has risen 0.9% to $24.75.
No, it IS the crash.
Or, more specifically, the crashes.
Several Tesla cars have now been wrecked while ‘self-driving’, not just the three in the last couple of weeks.
Tesla is promoting its self-driving Autopilot function as a primary selling point of the car’s technology.
Tesla has already said that in the near future the car will be able to drive itself all the way from NYC to LA with no one in the car. THAT is the level of expectation Tesla has created.
The reality is, unless you are fully engaged and ready to take control of the car in an instant, your LIFE IS IN DANGER.
As with all things Tesla, they cannot cash the checks that Elon’s ego is writing.
That is why the SEC is involved: Tesla cars are NOT what Tesla claims, and the crashes are the proof.
There are larger issues here than just the non-disclosure of the crash.
TSLA had a $2B stock offering on 5/18 and then announced the $7.7B acquisition of SCTY on 6/20. This represents a 25% dilution of TSLA’s stock. It is impossible that the board was not contemplating the SCTY offer on 5/18. Musk has admitted to speaking with large shareholders prior to 5/18.
“… They are always people who have fallen far, far short of his success in life….”
Elon is personally billions of dollars in debt. Last fall he took a $1.6B personal loan, using his Tesla stock as collateral at $216 per share. He needed that money to bail out SpaceX, which was burning cash even faster after its rocket explosion in the summer of 2015 left SpaceX sitting dead in the water for over six months.
Tesla is billions in debt. SolarCity is billions in debt. None of Elon’s companies are making a profit.
I am eternally grateful to the powers that be, that I am what you consider “far short of his success in life”.
Saying that Autopilot functions will get better and save thousands of lives is pure bull. Matching the intelligence of an average driver would take a room full of computers for EACH car.
The joke is on you, and you are the punch line. Before the fatal Tesla accident, people were debating whether a self-driving Tesla should run over a box of kittens, crash into pedestrians, ram a school bus full of disabled children, or sacrifice its own driver in a no-win scenario. As if somehow THAT was the level of sophistication of self-driving cars and the ethical dilemma they were struggling with.
And then reality rears its ugly head, and a Tesla broadsides an 18-wheeler at 80 mph because it thought the truck, the trailer, and all nine visible three-foot-diameter moving black wheels were a sign over the roadway that it could just drive ‘under’.
It is now perfectly clear that when it comes to self-driving cars, Tesla’s got nothing going on.
“Autopilot crashes and investigations are minor bumps in the road.”
Pretty compassionate of you to refer to the man whose head got splattered like a pumpkin all over the highway in his self-driving Tesla as a minor bump in the road.