this post was submitted on 20 Apr 2024
580 points (96.9% liked)
Technology
Honestly, I'm sure there will be a lot of unfortunate mistakes until computers and self-driving systems can be relied upon. But there needs to be an entry point for manufacturers, and this is it. Technology gets better over time; it always has. Eventually self-driving cars will be the norm.
That still doesn't address all the issues surrounding it. I'm unsure whether you're just young and unaware of how these things work, or terribly naive. Companies will always cut corners to protect profits. Regulation forces a certain level of quality control (ideally). Just letting them do their thing because "it'll eventually get better" is a gateway to absurd amounts of damage. Also, not all technology gets better; plenty of it just gets abandoned.
But to circle back: if I get hit by a car tomorrow and all these things you think are unimportant remain unanswered, does that mean I might not get legal justice or compensation? If there isn't clearly codified law, I might not, and you might be callous enough to say you don't care about me. But what about you? What if you got hit by an unmonitored self-driving car tomorrow and were then told you'd have to go through a long, expensive court battle to determine fault, because no one had done it before? So you're in and out of a hospital recovering, draining all of your money on bills both legal and medical, to eventually, hopefully, get compensated for something that wasn't your fault.
That is why people here are asking these questions. Few people actually oppose progress. They just need to know that reasonable precautions are taken for predictable failures.
To be clear, I never said that I didn't care about an individual's safety; you inferred that somehow from my post, and frankly that's quite disrespectful. I simply stated that autonomous vehicles are here to stay and that the technology will improve with time.
The legal implications of self-driving cars are still being determined, as this is literally one of the first such approved technologies available. Tesla doesn't count, as it's not an SAE Level 3 autonomous driving system. There are some references in the liability section of the wiki.
https://en.m.wikipedia.org/wiki/Regulation_of_self-driving_cars
But then it's good that the manufacturer states the driver isn't obliged to watch the road, because that shifts responsibility toward the manufacturer, which is a great incentive to make the technology as safe as possible.
Can't the entry point just be that you have to pay attention while it's driving for you, until they figure it out?
You’re deciding to prioritize economic development over human safety.