GM’s Cruise will pay a $500,000 fine for submitting a false accident report

GM's robotaxi unit Cruise has agreed to pay a $500,000 fine for submitting a false accident report as part of a deferred prosecution agreement. The US Justice Department (DoJ) said that Cruise failed to disclose vital details about a serious October 2023 accident in which one of its vehicles struck a pedestrian who had first been hit by another car, then dragged her 20 feet.

"Federal laws and regulations are in place to protect public safety on our roads. Companies with self-driving cars that seek to share our roads and crosswalks must be fully truthful in their reports to their regulators,” said Martha Boersch, Chief of the Office of the U.S. Attorney’s Criminal Division. Uber has yet to comment on the matter. 

Under the terms of the three-year settlement, Cruise must cooperate with the government, put a safety compliance program into place and provide annual reports to the US Attorney's office. The company could still be prosecuted if it fails to comply with those conditions. Cruise was previously fined $1.5 million by the National Highway Traffic Safety Administration (NHTSA) and reportedly reached a settlement with the victim worth at least $8 million.

According to the US Attorney's office, a Cruise driverless vehicle operating in San Francisco ran over a pedestrian who had been thrown into its path after being struck by a separate, human-operated vehicle. The Cruise vehicle initially stopped after running over the pedestrian, but its systems failed to detect that she was still under the vehicle. It then tried to pull over to the side, dragging the woman over 20 feet. Cruise's report to the NHTSA said nothing about the vehicle dragging the victim after striking her. (Cruise also omitted this information in statements to the press at the time of the accident.)

Cruise was subsequently stripped of its license to operate self-driving vehicles in California. The company stopped all operations of both its driverless cars and its manned robotaxi service in order to conduct a comprehensive safety review. CEO Kyle Vogt resigned in November and GM announced plans to slash Cruise's funding and restructure its leadership based on external safety reviews. Nearly a quarter of the company's workforce was cut that December.

Cruise vehicles stayed off roads for several more months but returned to Arizona in April and to Houston in June under the supervision of human drivers. In September this year, Cruise recommenced operations in California, again with human drivers at the wheel. In August, the company said its self-driving vehicles would come to Uber starting next year.

GM’s Cruise fined $1.5 million for omitting details about its gruesome 2023 crash

On Monday, the National Highway Traffic Safety Administration (NHTSA) fined Cruise, GM’s self-driving vehicle division, $1.5 million. The penalty was imposed for omitting key details from an October 2023 accident in which one of the company’s autonomous vehicles struck and dragged a San Francisco pedestrian.

Cruise is being fined for initially submitting several incomplete reports. The NHTSA requires crash reports to include pre-crash, crash and post-crash details, which the company supplied to the agency while omitting a critical one: that the pedestrian was dragged by the vehicle for 20 feet at around 7 MPH, causing severe injuries. Eventually, the company released a 100-page report from a law firm detailing its failures surrounding the accident.

That report states that Cruise executives initially played a video of the accident during October 3 meetings with the San Francisco Mayor's Office, NHTSA, DMV and other officials. However, the video stream was “hampered by internet connectivity issues” that concealed the part where the vehicle dragged the victim. Executives, who the report stated knew about the dragging, also failed to verbally mention that crucial detail in the initial meetings because they wanted to let “the video speak for itself.” 

Investigators finally found out about the dragging after the NHTSA asked the company to submit the full video. The agency says Cruise also amended four other incomplete crash reports involving its vehicles to add missing details.

The NHTSA's new requirements for Cruise include submitting a corrective action plan, along with reports covering its total number of vehicles, their miles traveled and whether they operated without a driver. The company also has to summarize software updates that affect operation, report citations and observed violations of traffic laws, and let the agency know how it will improve safety. Finally, Cruise will have to meet with the NHTSA quarterly to discuss the state of its operations while reviewing its reports and compliance.

The order lasts at least two years, and the NHTSA can extend it to a third year. Reuters reported on Monday that, despite the fine, the NHTSA’s investigation into whether Cruise is taking proper safety precautions to protect pedestrians is still open. Cruise still faces probes by the Department of Justice and the Securities and Exchange Commission.

To say the incident sparked shakeups at Cruise would be an understatement. The company halted its self-driving operations after the accident. Then, last November, the dominoes began to fall: Its CEO resigned, and GM said it would cut its Cruise investment by “hundreds of millions of dollars” and restructure its leadership. Nine more executives were dismissed in December.

Nonetheless, Cruise is trying to rebound under its new leadership. Vehicles with drivers returned to Arizona and Houston this year, and GM said it's pouring an additional $850 million into the division. Earlier this month, it began operating in California again, also with drivers, which, it's safe to say, is a good thing.

Tesla Semi fire required 50,000 gallons of water to extinguish

California firefighters needed to spray 50,000 gallons of water to extinguish a roadside Tesla Semi fire, the US National Transportation Safety Board (NTSB) announced in a preliminary report. Crews also used an aircraft to drop fire retardant in the "immediate area as a precautionary measure," according to the agency.

The crash happened at 3:13 AM on August 19 on the I-80 freeway east of Sacramento. The tractor-trailer departed the roadway while navigating a curve, struck a traffic delineator and eventually hit a tree. The driver was uninjured but was taken to a hospital as a precaution.

The Tesla Semi's large 900 kWh battery caught fire and reached a temperature of 1,000 degrees F while spewing toxic fumes. It continued to burn into the late afternoon as firefighters doused it with water to cool it down (Tesla sent a technical expert to assess high-voltage hazards and fire safety). It wasn't until 7:20 PM, over 16 hours after the crash, that the freeway was reopened.

All of that caught the attention of the NTSB, which sent a team of investigators, mainly to examine the fire risks posed by large lithium-ion battery packs. The agency — which can only make safety recommendations and has no enforcement authority — said that "all aspects of the crash remain under investigation while the NTSB determines the probable cause." 

Given the long road shutdown time, dangerously hot fire and toxic fumes, the accident is likely to provoke a lot of discussion in and out of government. The NTSB concluded in 2021 that battery fires pose a risk to emergency responders and that manufacturers' guidelines around such fires were inadequate. 

Cybertruck crash and fire reportedly causes first fatality

A Tesla Cybertruck veered off a Texas road and crashed into a culvert, bursting into flames and killing the driver, KHOU 11 Houston reported. It appears to be the first fatality involving Tesla's new electric pickup and has triggered a probe by the National Highway Traffic Safety Administration (NHTSA), according to Reuters. The driver has not yet been identified.

It's not clear what caused the accident, but it wasn't related to Tesla's Autopilot as the Cybertruck has yet to gain that feature. Video from the scene shows that the vehicle was nearly completely consumed by the fire, which thwarted identification of the vehicle and driver, Texas state troopers said. 

The first reported Cybertruck accident happened late last year near Palo Alto, with no injuries and little damage. Around 15,000 vehicles have now been sold, so the number of accident reports has ramped up of late. Tesla has issued four recalls for the vehicle, including one for an accelerator pedal that could get stuck.

Since its launch in November 2023, the Cybertruck has been criticized for poor build quality, malfunctions and strange design decisions, like "guillotine" body panels deemed unsafe for children and others. It's also come under fire for getting stuck easily and not performing well off road, in one case being outclassed by an ancient French Citroën C15.

Tesla sued over fatal Autopilot crash

Tesla is facing yet more legal action over Autopilot after the parents of a motorcyclist who was killed in a crash involving a Model 3 sued the company. The plaintiffs, who also sued the driver of the Tesla, claimed that the car's driver assistance tech and other safety measures are "defective and inadequate."

The plaintiffs argued in the complaint, which was obtained by Reuters, that Autopilot sensors and cameras “should have identified the hazard posed by" the motorcycle. Autopilot was engaged when the Model 3 struck the back of Landon Embry's motorcycle at 75-80 miles per hour in Utah in 2022. Embry died at the scene.

His parents also claim the Model 3 driver was tired and that "a reasonably prudent driver, or adequate auto braking system, would have, and could have slowed or stopped without colliding with the motorcycle." Tesla does not have a public relations department that can be reached for comment.

This is the latest in a long line of legal and regulatory issues that Tesla has contended with over the Autopilot and Full Self-Driving features. Just this week, Washington state investigators determined that a Tesla Model S involved in a fatal crash with a motorcycle in April had Full Self-Driving engaged at the time.

Tesla involved in fatal Washington crash was using self-driving mode

A deadly crash in Washington that took the life of a motorcyclist earlier this year was caused by a Tesla vehicle while it was in "Full Self-Driving" mode. The Associated Press reported that investigators from the Washington State Patrol confirmed, based on the car's event-data recorder, that the 2022 Tesla Model S involved in the fatal April collision was in self-driving mode.

The crash occurred on April 19 on the eastbound side of State Route 522, approximately 15 miles northeast of Seattle. The unidentified driver told police he had his Tesla's self-driving mode on and was looking at his phone at the time of the crash. The vehicle crashed into the back of the motorcycle, pinning Jeffrey Nissen, 28, underneath it. Paramedics pronounced Nissen dead at the scene, according to Seattle-based KIRO 7 News.

Tesla chief executive officer Elon Musk has been making promises about autonomous cars for years now. Those promises, coupled with vehicle safety concerns, prompted Sens. Ed Markey and Richard Blumenthal to issue a letter to the Federal Trade Commission (FTC) urging it to open an investigation into Tesla's "misleading advertising and marketing" practices for its Autopilot and Full Self-Driving modes. Last year, the National Highway Traffic Safety Administration (NHTSA) recalled over 2 million Tesla vehicles due to concerns about driver inattention while Autopilot is engaged.

Musk also promised in 2019 that "one million robotaxis" would arrive by the end of the following year. Four years later, the car company is still delaying the unveiling of its robotaxi initiative due to design changes.

The Washington incident happened just a few days before the NHTSA concluded a review that linked 14 deaths across 13 crashes to Tesla vehicles operating in Autopilot mode. The NHTSA's report concluded that "Tesla's weak driver engagement system was not appropriate for Autopilot's permissive operating capabilities" and that the Full Self-Driving mode "did not adequately ensure that drivers maintained their attention on the driving task."

The Wall Street Journal conducted its own investigation into Tesla's Autopilot mode using data obtained from cars involved in accidents and published its findings on Monday. The video report found that Tesla has reported over 1,000 crashes to the NHTSA since 2016. Of the 222 crashes the WSJ obtained data for, 44 occurred while Autopilot was engaged.

Waymo issues recall after one of its self-driving taxis crashed into a pole

Waymo is voluntarily recalling its robotaxis after one of them collided with a telephone pole in an alley en route to pick up a passenger, The Verge reported. The vehicle was unoccupied and no bystanders were injured.

During the May 21 accident, the Waymo vehicle drove through an alley lined with telephone poles mounted at street level rather than on a curb, with a yellow line marking where to drive. While pulling over, it struck one of the poles at 8 MPH and sustained some damage, Waymo said.

"It never made it to pick us up," the passenger waiting for the car, Jericka Mitchell, told 12News. Mitchell reportedly heard, but didn't see the accident. 

The company filed a recall with the National Highway Traffic Safety Administration (NHTSA) after updating the software in its entire self-driving fleet of 672 vehicles. The update is designed to fix an error that assigned a low damage score to the pole and failed to account for the alleyway's hard edge. 
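
Waymo's recall notice only describes the flaw at a high level, but the shape of the bug can be sketched for illustration. Below is a minimal, purely hypothetical example of an obstacle scorer that assigns pole-like objects a damage score too low to trigger avoidance and ignores whether the object sits on the drivable surface. Waymo's actual software is proprietary; every name, field and threshold here is invented.

```python
# Hypothetical sketch of the failure mode described above. Not Waymo's code.
from dataclasses import dataclass

@dataclass
class Obstacle:
    kind: str                # e.g. "pole", "vehicle", "pedestrian"
    on_drivable_area: bool   # True if the object sits on the mapped driving surface

AVOIDANCE_THRESHOLD = 0.5    # planner treats scores above this as must-avoid
BASE_SCORES = {"pedestrian": 1.0, "vehicle": 0.9, "pole": 0.2}

def damage_score_buggy(ob: Obstacle) -> float:
    """Pre-fix behavior: pole-like objects get a low base score, and the
    hard road edge (street-level poles, no curb) is never considered."""
    return BASE_SCORES.get(ob.kind, 0.4)

def damage_score_fixed(ob: Obstacle) -> float:
    """Post-fix behavior: a solid object on the drivable surface is escalated
    to a must-avoid score, regardless of its class."""
    score = BASE_SCORES.get(ob.kind, 0.4)
    if ob.on_drivable_area:
        score = max(score, 0.9)  # a street-level pole is a hard constraint
    return score

pole = Obstacle(kind="pole", on_drivable_area=True)
assert damage_score_buggy(pole) < AVOIDANCE_THRESHOLD  # planner would clip it
assert damage_score_fixed(pole) > AVOIDANCE_THRESHOLD  # planner now avoids it
```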

It's only Waymo's second recall. The first happened earlier this year, when two of its autonomous vehicles crashed into the same pickup truck as it was being towed. In that case, Waymo found that its software had failed to predict the truck's movements due to a "persistent orientation mismatch" between the towed vehicle and the one towing it.
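
That "persistent orientation mismatch" can be pictured the same way: if a motion predictor extrapolates a vehicle's path along its body heading, a towed truck, whose body can point well away from its actual direction of travel, will be predicted incorrectly for as long as it remains under tow. The sketch below is illustrative only; the function and numbers are invented.

```python
import math

def predict_position(x, y, heading_rad, speed, dt):
    """Naive prediction: assume the vehicle travels along its own heading."""
    return (x + speed * dt * math.cos(heading_rad),
            y + speed * dt * math.sin(heading_rad))

# A pickup under tow: its body points 30 degrees away from the direction it
# is actually moving (it follows the tow truck's path, heading 0 rad).
body_heading = math.radians(30)
true_motion_heading = 0.0
speed, dt = 10.0, 1.0  # m/s, seconds

predicted = predict_position(0, 0, body_heading, speed, dt)
actual = predict_position(0, 0, true_motion_heading, speed, dt)

# The error never shrinks, because the mismatch between body orientation and
# travel direction persists for as long as the vehicle is being towed.
error = math.dist(predicted, actual)
print(f"one-second prediction error: {error:.1f} m")  # ~5.2 m
```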

Waymo is also under investigation for more than 24 incidents, including crashes and traffic violations. Rival Cruise, owned by GM, was involved in a more serious incident last year, in which one of its robotaxis dragged a pedestrian who had been hit by another vehicle a few dozen feet down a San Francisco street. California then suspended its license to operate in the state and Cruise eventually paused all robotaxi operations.

Tesla settles lawsuit over fatal Model X crash that killed an Apple engineer

Back in 2019, the family of Apple engineer Wei Lun Huang (aka Walter Huang) sued Tesla a year after he was killed when his Model X crashed into a median in Mountain View while Autopilot was engaged. That case is officially closed, now that the automaker has settled the lawsuit on the very day jury selection was supposed to take place. According to CNBC and The New York Times, Tesla's lawyers asked the court to seal the settlement agreement so that the exact amount the company paid wouldn't be made public. The company didn't want "other potential claimants (or the plaintiffs' bar) [to] perceive the settlement amount as evidence of Tesla's potential liability for losses, which may have a chilling effect on settlement opportunity in subsequent cases."

Tesla confirmed shortly after the accident that Autopilot was switched on at the time of the crash, but it also insisted that Huang had time to react and had an unobstructed view of the divider. In a statement to the press, the company insisted that the driver was at fault and that the only way for the accident to have occurred was if Huang "was not paying attention to the road, despite the car providing multiple warnings to do so." In the lawsuit, Huang's lawyers pointed to Autopilot marketing materials from Tesla suggesting that its cars are safe enough to use on the road without drivers having to keep their hands on the wheel at all times; a video on Tesla's Autopilot page, for instance, shows a driver with their hands on their lap.

The incident became big enough to attract the attention of the National Transportation Safety Board (NTSB), which conducted an investigation and found that Huang had reported the car steering away from the highway on prior trips. In fact, his family said that he used to complain about his car swerving towards the exact barrier he crashed into and had even reported it to the Tesla dealership, which couldn't replicate the issue. The agency also concluded that Tesla's collision warning system didn't alert the driver and that its emergency braking system didn't activate as it should have when the car started making its way toward the barrier.

That said, the NTSB also discovered that Huang was running a mobile game on his phone at the time of the accident; it just couldn't determine whether the phone was in his hands when the crash occurred. The Times said Tesla was preparing to show proof to the court that Huang was playing a game when he crashed, which his lawyers denied. Regardless of who's truly at fault, a trial would've drawn renewed attention to the safety of Tesla's driver assistance system. Settling puts an end to the case a few months before the company unveils its own robotaxi on August 8.
