US to Probe Tesla's 'Full Self-Driving' System after Pedestrian Killed in Low Visibility Conditions

16 June 2015, Ebringen: The logo of Tesla electric vehicle company is pictured on an S model vehicle. (dpa)

The US government's road safety agency is again investigating Tesla's “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration said in documents that it opened the probe on Thursday after the company reported four crashes in which Teslas entered areas of low visibility, including sun glare, fog and airborne dust, The AP reported.

In addition to the pedestrian's death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

The investigation covers roughly 2.4 million Teslas from the 2016 through 2024 model years.

A message was left early Friday seeking comment from Tesla, which has repeatedly said the system cannot drive itself and human drivers must be ready to intervene at all times.

Last week Tesla held an event at a Hollywood studio to unveil a fully autonomous robotaxi without a steering wheel or pedals. CEO Elon Musk said the company plans to have fully autonomous vehicles running without human drivers next year, and robotaxis available in 2026.

The agency also said it would look into whether any other similar crashes involving “Full Self-Driving” have happened in low visibility conditions, and it will seek information from the company on whether any updates affected the system’s performance in those conditions.

“In particular, this review will assess the timing, purpose and capabilities of any such updates, as well as Tesla’s assessment of their safety impact,” the documents said.

Tesla has twice recalled “Full Self-Driving” under pressure from the agency, which in July sought information from law enforcement and the company after a Tesla using the system struck and killed a motorcyclist near Seattle.

The recalls were issued because the system was programmed to run stop signs at slow speeds and because it disobeyed other traffic laws.

Critics have said that Tesla’s system, which uses only cameras to spot hazards, lacks the sensors needed to be fully self-driving. Nearly all other companies working on autonomous vehicles use radar and laser sensors in addition to cameras to see better in the dark or in poor visibility conditions.



Meta Must Face US State Lawsuits over Teen Social Media Addiction

A man walks past a logo of mobile application Instagram, during a conference in Mumbai, India, September 20, 2023. (Reuters)

Facebook parent company Meta must face lawsuits by US states accusing it of fueling mental health problems among teens by making its Facebook and Instagram platforms addictive, a federal judge in California ruled on Tuesday.

Oakland-based US District Judge Yvonne Gonzalez Rogers rejected Meta's bid to toss the claims made by the states in two separate lawsuits filed last year, one involving more than 30 states including California and New York and the other brought by Florida.

Rogers put some limits on the states' claims, agreeing with Meta that a federal law known as Section 230 regulating online platforms partly shielded the company. However, she found that the states had put forward enough detail about allegedly misleading statements made by the company to go forward with most of their case.

The judge also rejected motions by Meta, ByteDance's TikTok, Google parent Alphabet's YouTube and Snap's Snapchat to dismiss related personal injury lawsuits by individual plaintiffs. The other companies are not defendants in the states' lawsuits.

The ruling clears the way for states and other plaintiffs to seek more evidence and potentially go to trial. It is not a final ruling on the merits of their cases.

"Meta needs to be held accountable for the very real harm it has inflicted on children here in California and across the country," California Attorney General Rob Bonta said in a statement.

Lawyers for the personal injury plaintiffs in a joint statement called the ruling "a significant victory for young people nationwide who have been negatively impacted by addictive and harmful social media platforms."

A Meta spokesperson said the company disagreed with the ruling overall and that it had "developed numerous tools to support parents and teens," including new "Teen Accounts" on Instagram with added protections.

A Google spokesperson called the allegations "simply not true" and said, "providing young people with a safer, healthier experience has always been core to our work."

The other social media companies did not immediately respond to requests for comment.

The states are seeking court orders against Meta's allegedly illegal business practices as well as unspecified monetary damages.

Hundreds of lawsuits have been filed by various plaintiffs accusing the social media companies of designing addictive algorithms that lead to anxiety, depression and body-image issues among adolescents, and failing to warn of their risks.