this post was submitted on 18 Oct 2024
784 points (98.4% liked)

Technology


The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday with the company reporting four crashes after Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

[–] [email protected] 60 points 1 month ago* (last edited 1 month ago) (16 children)

Musk has said that humans drive with only eyesight, so cars should be able to drive with just cameras.

This of course assumes 1) that cameras are just as good as eyes (they're not) and 2) that the processing of visual data that the human brain does can be replicated by a machine, which seems highly dubious given that we only partially understand how humans process visual data to make decisions.

Finally, it assumes that the current rate of human-caused crashes is acceptable. Which it isn't. We tolerate crashes from human drivers because improving humans at scale is impractical. In an automated system, if a bit of additional hardware can significantly reduce crashes, it's irrational not to add it.
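The cost-benefit argument above can be sketched as a toy calculation. Every number here is a made-up illustration (hardware cost, crash probability, reduction rate, crash cost), not real Tesla or NHTSA data; the point is only the shape of the comparison:

```python
# Toy expected-value sketch: compare the cost of extra sensing hardware
# to the expected crash-cost savings over a car's lifetime.
# All figures below are assumed for illustration, not real data.

lidar_cost = 500.0             # assumed per-car hardware cost ($)
crash_prob_per_year = 1e-4     # assumed annual serious-crash probability per car
crash_reduction = 0.30         # assumed relative reduction with added sensor
avg_crash_cost = 2_000_000.0   # assumed societal cost per serious crash ($)
car_lifetime_years = 10

expected_savings = (crash_prob_per_year * crash_reduction
                    * avg_crash_cost * car_lifetime_years)

print(f"expected savings per car: ${expected_savings:,.0f}")
print("hardware pays for itself" if expected_savings > lidar_cost
      else "hardware does not pay for itself")
```

With these illustrative numbers the expected savings ($600 per car) exceed the hardware cost, which is the commenter's point: even modest crash reductions can justify sensor cost when multiplied across a fleet.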

[–] [email protected] 22 points 1 month ago (4 children)

This is a direct result of Elon's edict that Tesla cars not use lidar. If you aren't aware, Elon set that requirement at the start of Tesla's self-driving project because he didn't want to spend the money to put lidar in every Tesla.

His "first principles" logic is that humans don't use lidar, therefore self-driving should be achievable without (expensive) enhanced vision tools. While that statement has some modicum of truth, it obviously trades away safety in situations where vision is compromised. Think fog, sunlight shining into your cameras/eyes, or a person running across the street at night wearing all black. There are obvious scenarios where lidar is a massive safety advantage, but Elon decided for $$ reasons not to have it. This sounds like a direct and obvious outcome of that edict.

[–] [email protected] 3 points 1 month ago (1 children)

My vacuum robot uses lidar. How expensive could it be??

[–] [email protected] 2 points 1 month ago

You need considerably more advanced lidar for cars because you need to be able to see farther ahead than 10 ft, and you need to see in adverse weather conditions (rain, fog, snow) that I assume you don't experience indoors. That said, it really isn't as expensive as he makes it out to be.
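The range requirement in this comment can be put in numbers with basic stopping-distance arithmetic. The deceleration and reaction-time figures below are typical textbook assumptions, not specs from any particular vehicle:

```python
# Rough stopping-distance arithmetic showing why an automotive lidar
# needs far more range than a vacuum robot's ~10 ft sensor.
# Assumed values: 0.5 s system reaction time, ~7 m/s^2 dry-road braking.

def stopping_distance_m(speed_mps, reaction_s=0.5, decel_mps2=7.0):
    """Distance covered during reaction time plus braking distance v^2/(2a)."""
    return speed_mps * reaction_s + speed_mps**2 / (2 * decel_mps2)

highway_speed = 70 * 0.44704  # 70 mph converted to m/s (~31.3 m/s)
d = stopping_distance_m(highway_speed)
print(f"~{d:.0f} m needed to stop from 70 mph")
```

At highway speed this works out to roughly 85 m (about 280 ft) just to stop, so a car's forward sensor needs on the order of 25x the range of a 10 ft robot-vacuum lidar before you even account for rain or fog attenuation.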
