this post was submitted on 09 Jul 2024
260 points (97.4% liked)
Asklemmy
Yeah, it turns out humans be humaning. We are not robots. You still have the option to operate a car safely on your own, but if you happen to have a moment where you can't, the safety features help you out. You can drive a vehicle with lane assist and never notice it is enabled, and you can turn it off. You can also drive with adaptive cruise control enabled and never notice it, so long as you are operating the vehicle properly. These features do not prevent people from driving safely on their own. They exist because a fuck ton of people cannot, and never have been able to. Driver mortality rates were higher before these safety features were an option, and that is clear evidence of it.
Again, if you are indeed a robot and have never drifted over the lines, gone over the speed limit, or checked your rear-view mirror at an inopportune moment while the person in front of you slams on their brakes, you can still operate a vehicle exactly as you would if those features were not there. Hell, you can simply disable them. But those safety features exist for the rest of us who recognize that shit happens.
Now I will certainly agree that many people should not be driving. I believe you should need a hell of a lot more than six months of driver's education and a single, very simple test to be allowed to drive for the rest of your life. I also recognize that driving is a requirement for many people's jobs. I welcome alternatives to driving, but they are not a reality yet. Until they are, the spread of safety features helps minimize death and injury.
I see this as the problem. We're becoming more reliant on robots to accomplish basic tasks. If the mode of transportation is fully automated, fine. But that is not the case yet. It's still the licensed driver's responsibility if there's a crash. You can't tell a judge your robot made a mistake.
You know how they say Gen Alpha doesn't know how to turn on a computer or use a file system? It's like that. We can't just give the robots full control of our lives. We should know the basics of operating a car, of being aware of our surroundings, of how to instinctively make a split-second decision.
I'll offer a compromise: there should be two (or more) tiers of operating licenses. If you want your car to do everything for you, you don't get the same permissions as someone who knows how to fully drive a car, which means you can't rent or borrow a car that requires your full attention. At the very least, this creates stricter legal ramifications when someone who has been dependent on driver-assist features for a decade gets behind the wheel of a "dumb" car and kills someone because they don't know how to merge onto a highway. Frankly, we could benefit from applying this premise to existing drivers and vehicles today.