This post was submitted on 14 Dec 2023
116 points (96.8% liked)

Tesla recalls nearly all vehicles sold in US to fix system that monitors drivers using Autopilot

Tesla is recalling nearly all vehicles sold in the U.S., more than 2 million, to update software and fix a defective system that's supposed to ensure drivers are paying attention when using Autopilot.

top 24 comments
[–] [email protected] 38 points 11 months ago (2 children)

The attempt to address the flaws in Autopilot seemed like a case of too little, too late to Dillon Angulo, who was seriously injured in a 2019 crash involving a Tesla that was using the technology along a rural stretch of Florida highway where the software isn’t supposed to be deployed.

"This technology is not safe, we have to get it off the road,” said Angulo, who is suing Tesla as he recovers from injuries that included brain trauma and broken bones. “The government has to do something about it. We can’t be experimenting like this.”

This is the important part: it's not just Tesla not giving a shit about their customers' safety, it's dangerous to literally anyone on the same road as a Tesla.

[–] [email protected] 9 points 11 months ago (1 children)

This case already went to trial and the jury sided with Tesla, because the driver was holding his foot on the accelerator to override cruise control, ignored the vehicle's warnings, drove through numerous flashing lights, crashed through a stop sign, and then hit the couple, later stating, “I expect to be the driver and be responsible for this… I was highly aware that was still my responsibility to operate the vehicle safely.” Any vehicle is capable of doing this with a reckless driver behind the wheel.

[–] [email protected] 2 points 11 months ago (1 children)

You may be confused; there are a lot of Tesla court cases to keep track of, though. I should have been more specific.

https://arstechnica.com/tech-policy/2023/12/tesla-fights-autopilot-false-advertising-claim-with-free-speech-argument/

[–] [email protected] 3 points 11 months ago* (last edited 11 months ago)

You may be confused, as I'm referring to the case involving Angulo, who was quoted in the comment above.

I don't see how a dispute between the California DMV and Tesla over the terms "Autopilot" and "Full Self Driving" makes Autopilot any more dangerous than calling it something else. Other companies use similar names, like "ProPilot" or "PilotAssist," for their LKAS systems.

[–] [email protected] 14 points 11 months ago (1 children)

I live in a major metropolitan area, drive a model 3, and almost never use autopilot.

I am lucky enough to rarely be in stop-and-go traffic, but when I am, I don’t even use cruise control, because it’s too reactive to the car in front of me and consequently too jerky for my preference.

As for autopilot, I was on a relatively unpopulated freeway in the second lane from the right when a small truck came around a cloverleaf to merge into the right lane next to me. My car flipped out and slammed on the brakes. The truck wasn’t even coming into my lane; he was just merging. Thankfully there was a large gap behind me, and I was paying enough attention to immediately jam on the accelerator to counteract it, but it spooked me pretty badly. And this was on a road that it’s designed for.

Autopilot (much less FSD) can’t really think like our brains can. It can only “see” so far ahead and behind. It can’t look at another driver’s behavior and assess that they might be distracted or drunk. We’re not there yet. We’re FAR from there.

[–] [email protected] 6 points 11 months ago (1 children)

I don't think AI will ever be able to account for context, like "Black Friday sales are on and the general trend on the road is a lot of fuckery, so I'll drive extra extra carefully today."

[–] [email protected] 8 points 11 months ago

Or things like: that idiot's mattress is going to fly off his roof any second; I'm not driving behind him.

[–] [email protected] 3 points 11 months ago

Why can't we just restrict Autopilot to the interstate system for the time being? There's no reason a person needs to be attentive for miles on end with no stops. And if the tech isn't there for complicated traffic patterns in suburban and urban environments, make it inaccessible unless you are on the interstate. Seems like an easy enough fix for now.

[–] [email protected] 2 points 11 months ago

This is the best summary I could come up with:


DETROIT (AP) — Tesla is recalling nearly all vehicles sold in the U.S., more than 2 million, to update software and fix a defective system that’s supposed to ensure drivers are paying attention when using Autopilot.

The recall comes after a two-year investigation by the National Highway Traffic Safety Administration into a series of crashes that happened while the Autopilot partially automated driving system was in use.

But safety experts said that, while the recall is a good step, it still makes the driver responsible and doesn’t fix the underlying problem that Tesla’s automated systems have with spotting and stopping for obstacles in their path.

The attempt to address the flaws in Autopilot seemed like a case of too little, too late to Dillon Angulo, who was seriously injured in a 2019 crash involving a Tesla that was using the technology along a rural stretch of Florida highway where the software isn’t supposed to be deployed.

Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University who studies autonomous vehicle safety, called the software update a compromise that doesn’t address a lack of night vision cameras to watch drivers’ eyes, as well as Teslas failing to spot and stop for obstacles.

In its statement Wednesday, NHTSA said the investigation remains open “as we monitor the efficacy of Tesla’s remedies and continue to work with the automaker to ensure the highest level of safety.”


The original article contains 1,030 words, the summary contains 235 words. Saved 77%. I'm a bot and I'm open source!

[–] [email protected] 1 points 11 months ago (1 children)

A recall for software? What happened to over-the-air updates?

[–] [email protected] 5 points 11 months ago* (last edited 11 months ago)

Recalls can be delivered as OTA updates. A recall doesn't necessarily mean you need to bring your car to a dealership or shop.