Was the driver asleep or something? The car drove quite a bit on the tracks... sure, blame Tesla all you want (and rightly so), but you can't really claim today that the car has "autopilot" unless you're hunting for a lawsuit. So what was the driver doing?
That's what I was thinking, your car starts doing something fucking stupid and you just let it?
Tesla owners be like.
Elon take the wheel
Well what's the point of self driving if you can't have a wank on the drive home?
I can tell you wank with your eyes closed :D
Delicately put. But essentially that's why self-driving cars aren't really seen outside of Tesla: unless the technology is basically perfect, there's essentially no point to it.
Tesla have it because they use the public as guinea pigs.
I wouldn't mind if they all had to go to some dedicated test track to try it out and train it, and outside of those environments it wouldn't turn on. If they want to risk their lives, that's their prerogative; my problem is that it might drive into me one day, and I don't own a Tesla, so why should I take that risk?
It's rather reminiscent of the old days of GPS, when people would follow it to the letter, and drive into rivers, go the wrong way up a one-way street, etc.
There was a legal case recently where somebody followed their GPS off a bridge that was no longer there. At some point you have to take personal responsibility, since the outcomes will be extremely personal.
It's hard to tell who's worse at driving: Tesla owners or their autopilot.
California, so I'm guessing the driver was getting head at the time whilst drinking beer.
He was focusing on the caboose
Who needs a driver? This car has AUTOPILOT.
But seriously, Tesla "autopilot" is nothing more than a cruise control you have to keep an eye on. Which means it's NOT "autopilot." This technology is not ready for the real world. Sooner or later, it's going to cause a major, horrible accident involving dozens of people. Musk has enough connections to avoid any real-world consequences, but maybe enough people will get over their child-like worship of billionaires and stop treating him like he's the next Bill Gates.
Somewhat ironically, autopilot for airplanes has been more or less attitude/speed holding for most of its history. More modern systems can autoland or follow a preprogrammed route (the flight plan plugged into the FMS), but even then, events like TCAS advisories are usually left to the pilots to handle. Autopilots are also expected to hand control back to the pilots in any kind of unexpected situation.
So in a way Tesla's naming here isn't so off; it's just the generic understanding of the term "autopilot" that is somewhat off. That said, their system also isn't doing much more than most other level 2 ADAS systems offer.
On the other hand, Elon loves going off about Full Self Driving mode a lot, and that's absolutely bullshit.
Commercial pilots also have a lot of training, a huge list of regulations and procedures for every contingency, and a copilot to double-check their work.
Tesla has dumb-fuck drivers who are actively trying to find ways to kill themselves. And the copilot is an orange wedged into the steering wheel to trick the sensors.
Maybe the latter should not be trusted with the nuance that is the "autopilot" branding.
I think the counter to that is that aircraft manufacturers know that the people flying their aircraft are not idiots and actually know what the autopilot button does. Meanwhile Tesla knows that the people driving their cars are idiots and don't know what the autopilot does.
In the US they let kids drive for god's sake. Sure they've passed a test but what does that mean in the real world?
Because the years since 2007 have seen an influx of new computer users, mostly on mobile devices, many of them thinking that this is what computer use looks like now and that this is the future.
Now the iPhone generation (including adults and seniors who haven't used anything smarter) thinks that you can replace any expert UI with an Angry Birds-like arcade on a touchscreen.
If an autopilot trustworthy enough to be left alone were possible for aircraft now, we'd see fully automated drone swarms in every warzone, and likely automated jets (not constrained by the G-forces a human can survive, and not requiring life support systems at all). But in real life it's still human-controlled FPV drones and human-piloted jets.
Though I think drone swarms are coming. It's, of course, important to have control over where the force is applied, but a bomb that destroys a town when you need to destroy a house is often preferable to no bomb at all.
The point was that people want magic now and believe crooks who promise them magic now. Education is the way to counter this.
Even if asleep, how do you sleep through that? Would be super bumpy.