U.S. probing Autopilot problems on 765,000 Tesla vehicles

Investigation covers almost everything that Tesla has sold in the U.S. since the start of the 2014 model year

The U.S. government has opened a formal investigation into Tesla's Autopilot partially automated driving system after a series of collisions with parked emergency vehicles.

The investigation covers 765,000 vehicles, almost everything that Tesla has sold in the U.S. since the start of the 2014 model year. In the crashes identified by the National Highway Traffic Safety Administration as part of the probe, 17 people were injured and one was killed.

NHTSA says it has identified 11 crashes since 2018 in which Teslas on Autopilot or Traffic Aware Cruise Control have hit vehicles at scenes where first responders have used flashing lights, flares, an illuminated arrow board or cones warning of hazards. The agency announced the action Monday in a posting on its website.

The probe is another sign that NHTSA under President Joe Biden is taking a tougher stance on automated vehicle safety than under previous administrations. Previously, the agency was reluctant to regulate the new technology for fear of hampering adoption of the potentially life-saving systems.

The investigation covers Tesla's entire current model lineup, the Models Y, X, S and 3 from the 2014 through 2021 model years.

The National Transportation Safety Board, which also has investigated some of the Tesla crashes dating to 2016, has recommended that NHTSA and Tesla limit Autopilot's use to areas where it can safely operate. The NTSB also recommended that NHTSA require Tesla to have a better system to make sure drivers are paying attention. NHTSA has not taken action on any of the recommendations. The NTSB has no enforcement powers and can only make recommendations to other federal agencies.

Last year the NTSB blamed Tesla, drivers and lax regulation by NHTSA for two collisions in which Teslas crashed beneath crossing tractor-trailers. The NTSB took the unusual step of accusing NHTSA of contributing to the crash for failing to make sure automakers put safeguards in place to limit use of electronic driving systems.

The agency made the determinations after investigating a 2019 crash in Delray Beach, Florida, in which the 50-year-old driver of a Tesla Model 3 was killed. The car was driving on Autopilot when neither the driver nor the Autopilot system braked or tried to avoid a tractor-trailer crossing in its path.

"We are glad to see NHTSA finally acknowledge our long-standing call to investigate Tesla for putting technology on the road that will be foreseeably misused in a way that is leading to crashes, injuries, and deaths," said Jason Levine, executive director of the nonprofit Center for Auto Safety, an advocacy group. "If anything, this probe needs to go far beyond crashes involving first responder vehicles because the danger is to all drivers, passengers, and pedestrians when Autopilot is engaged."

Autopilot has frequently been misused by Tesla drivers, who have been caught driving drunk or even riding in the back seat while a car rolled down a California highway.

A message was left early Monday seeking comment from Tesla, which has disbanded its media relations office.

NHTSA has sent investigative teams to 31 crashes involving partially automated driver assist systems since June of 2016. Such systems can keep a vehicle centered in its lane and a safe distance from vehicles in front of it. Of those crashes, 25 involved Tesla Autopilot in which 10 deaths were reported, according to data released by the agency.

Tesla and other manufacturers warn that drivers using the systems must be ready to intervene at all times. In addition to crossing semis, Teslas using Autopilot have crashed into stopped emergency vehicles and a roadway barrier.

The probe by NHTSA is long overdue, said Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University who studies automated vehicles.

Tesla's failure to effectively monitor drivers to make sure they're paying attention should be the top priority in the probe, Rajkumar said. Teslas detect pressure on the steering wheel to make sure drivers are engaged, but drivers often fool the system.

"It's very easy to bypass the steering pressure thing," Rajkumar said. "It's been going on since 2014. We have been discussing this for a long time now."

The crashes into emergency vehicles cited by NHTSA began on Jan. 22, 2018, in Culver City, California, near Los Angeles, when a Tesla using Autopilot struck a parked firetruck that was partially in the travel lanes with its lights flashing. Crews were handling another crash at the time.

Since then, the agency said there were crashes in Laguna Beach, California; Norwalk, Connecticut; Cloverdale, Indiana; West Bridgewater, Massachusetts; Cochise County, Arizona; Charlotte, North Carolina; Montgomery County, Texas; Lansing, Michigan; and Miami, Florida.

"The investigation will assess the technologies and methods used to monitor, assist and enforce the driver's engagement with the dynamic driving task during Autopilot operation," NHTSA said in its investigation documents.

In addition, the probe will cover object and event detection by the system, as well as where it is allowed to operate. NHTSA says it will examine "contributing circumstances" to the crashes, as well as similar crashes.

An investigation could lead to a recall or other enforcement action by NHTSA.

"NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves," the agency said in a statement. "Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for operation of their vehicles."

The agency said it has "robust enforcement tools" to protect the public and investigate potential safety issues, and it will act when it finds evidence "of noncompliance or an unreasonable risk to safety."

In June NHTSA ordered all automakers to report any crashes involving fully autonomous vehicles or partially automated driver assist systems.

Shares of Tesla Inc., based in Palo Alto, California, fell 3.5% at the opening bell Monday.

Tesla uses a camera-based system, a lot of computing power, and sometimes radar to spot obstacles, determine what they are, and then decide what the vehicles should do. But Carnegie Mellon's Rajkumar said the company's radar was plagued by "false positive" signals and would stop cars after determining overpasses were obstacles.

Now Tesla has eliminated radar in favor of cameras and thousands of images that the computer neural network uses to determine if there are objects in the way. The system, he said, does a very good job on most objects that would be seen in the real world. But it has had trouble with parked emergency vehicles and perpendicular trucks in its path.

"It can only find patterns that it has been 'quote unquote' trained on," Rajkumar said. "Clearly the inputs that the neural network was trained on just do not contain enough images. They're only as good as the inputs and training. Almost by definition, the training will never be good enough."

Tesla also is allowing selected owners to test what it calls a "full self-driving" system. Rajkumar said that should be investigated as well.

—Tom Krisher, The Associated Press




