I am not a fan of Tesla/Elon, but are you sure that no human driver would fall for this?
Part of the problem is the question of who is at fault when an autonomous car crashes. If a human falls for this and crashes, it's their fault: they are responsible for their own damages and for the damages caused by their negligence. We expect a human driver to be able to handle road hazards.

If a self-driving car crashes, whose fault is it? Tesla's? They say their self-driving is a beta, so drivers must remain attentive at all times. The human occupant's? Most people would expect a self-driving car to drive itself. If it crashes, I would expect the people who made the faulty software to be at fault, but they are doing everything they can to shift the blame off of themselves. If a self-driving car crashes, they expect the owner to eat the cost.
As soon as we have hard data from real-world use showing that FSD is safer than the average human driver, it would be unethical not to solve the regulatory and legal issues and deploy it on a larger scale to save lives.
If a human driver causes a crash, their insurance pays. Why shouldn't insurance also pay when a computer causes the crash, if that computer drives more safely overall, even if only by, say, 10%?
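To make that "10% safer" arithmetic concrete, here's a minimal sketch. Every number in it (the human crash rate, the fleet mileage, and the 10% figure itself) is a hypothetical assumption for illustration, not real crash data:

```python
# Back-of-the-envelope comparison. All numbers below are illustrative
# assumptions, not real crash statistics.

HUMAN_CRASHES_PER_MILLION_MILES = 2.0  # assumed human crash rate
FSD_IMPROVEMENT = 0.10                 # the hypothetical "10% safer"
MILES_DRIVEN_PER_YEAR = 3.0e12         # assumed fleet-wide annual mileage

def expected_crashes(rate_per_million_miles: float, miles: float) -> float:
    """Expected crash count for a given per-million-mile rate and mileage."""
    return rate_per_million_miles * miles / 1e6

human = expected_crashes(HUMAN_CRASHES_PER_MILLION_MILES, MILES_DRIVEN_PER_YEAR)
computer = expected_crashes(
    HUMAN_CRASHES_PER_MILLION_MILES * (1 - FSD_IMPROVEMENT),
    MILES_DRIVEN_PER_YEAR,
)

print(f"Human drivers:        {human:,.0f} expected crashes per year")
print(f"Computer (10% safer): {computer:,.0f} expected crashes per year")
print(f"Crashes avoided:      {human - computer:,.0f}")
```

Even a modest percentage improvement scales to a large absolute number of avoided crashes under these assumptions, which is the core of the ethical argument above.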
I agree that it would be unethical to ignore self-driving, since it has the potential to be far safer than a human driver. I just have a problem with companies overpromising what their software can do.
As for the insurance part: why should my premium increase because of a software defect? If a manufacturing defect causes me to crash my car, the manufacturer is at fault, not me. You wouldn't be liable if the brakes gave out on a new car.
Also keep in mind that gathering hard data from the real world means putting these vehicles on the road alongside other drivers. Deficiencies in the software mean potential crashes and deaths. The data will be valuable, but we can't forget that there are people behind it. Self-driving is going to shake things up and will probably be a net positive overall; I just think we should be mindful as we begin to embrace it.