this post was submitted on 23 May 2025
493 points (98.6% liked)

Technology

A 2025 Tesla Model 3 in Full-Self Driving mode drives off of a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, able to post about it on social media.

I just don't see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.

top 50 comments
[–] [email protected] 2 points 1 hour ago

Elon took the wheel because that person made a mean tweet about him

[–] [email protected] 5 points 10 hours ago (1 children)

"Kill me," it said in a robotic voice that got slower, glitchier, and deeper as it drove off the road.

[–] [email protected] 1 points 1 hour ago

EXTERMINAAAAAATE!!!

[–] [email protected] 31 points 19 hours ago (3 children)

The car made a fatal decision faster than any human could possibly correct it. Tesla’s idea that drivers can “supervise” these systems is, at this point, nothing more than a legal loophole.

What I don't get is how years of this false advertising haven't bankrupted Tesla already.

[–] [email protected] 19 points 16 hours ago (1 children)

Because the US is an insane country where you can straight up just break the law and as long as you're rich enough you don't even get a slap on the wrist. If some small startup had done the same thing they'd have been shut down.

What I don't get is why teslas aren't banned all over the world for being so fundamentally unsafe.

[–] [email protected] 2 points 6 hours ago

What I don’t get is why teslas aren’t banned all over the world for being so fundamentally unsafe.

I've been arguing this point for the past year; there are obvious safety problems with Teslas, even without considering FSD.
Like the blinker controls on the steering wheel, manual door handles that are hard to find in emergencies, and distractions from common operations being buried in menus on the screen instead of having directly accessible buttons. With Autopilot they also tend to brake for no reason, even on the autobahn with a clear road ahead, which can also create dangerous situations.

[–] [email protected] 6 points 17 hours ago (1 children)

Well, because 99% of the time, it's fairly decent. That 1%'ll getchya tho.

[–] [email protected] 4 points 16 hours ago (2 children)

That's probably not the actual failure rate, but a 1% failure rate would be several thousand times higher than what NASA would consider an abort risk condition.

Let's say that it's only a 0.01% risk; that's still several thousand crashes per year. Even if we could guarantee that all of them would be non-fatal and would not involve any bystanders such as pedestrians, the cost of replacing all of those vehicles every time they crashed, plus fixing the damage to things they crashed into (lamp posts, shop windows, etc.), would be so high that it would exceed any benefit of the technology.
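For scale, here's a minimal back-of-envelope sketch of that estimate. The fleet-usage figure is an assumption for illustration only (the comment gives no fleet size); it just shows how a tiny per-trip failure rate still compounds into thousands of crashes a year:

```python
# Back-of-envelope: crashes per year implied by a per-trip failure rate.
# Both inputs are illustrative assumptions, not figures from the thread.
failure_rate = 0.0001      # 0.01% chance of a crash-causing failure per trip
trips_per_day = 100_000    # assumed daily supervised-FSD trips across the fleet

crashes_per_year = failure_rate * trips_per_day * 365
print(round(crashes_per_year))  # → 3650
```

At roughly 100,000 trips per day, a 0.01% per-trip failure rate already lands in "several thousand crashes per year" territory; a 1% rate would be a hundred times that.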

It wouldn't be as bad if this were prototype technology that was constantly improving, but Tesla has made it very clear they're never going to add lidar scanners, so it's literally never going to get any better; it's always going to be this bad.

[–] [email protected] -3 points 5 hours ago (1 children)

Saying it’s never going to get better is ridiculous and demonstrably wrong. It has improved in leaps and bounds over generations. It doesn’t need LiDAR.

The biggest thing you're missing is that with FSD the driver is still supposed to be paying attention at all times, ready to take over like a driving instructor does when a learner is doing something dangerous. Just because it's in FSD (Supervised) mode doesn't mean you should just sit back and watch it drive you off the road into a lake.

[–] [email protected] 1 points 1 hour ago

You're saying this on a video where it drove into a tree and flipped over. There isn't time for a human to react; that's like saying we don't need emergency stops on chainsaws, the operator just needs to not drop it.

[–] [email protected] 1 points 16 hours ago

...is literally never going to get any better; it's always going to be this bad.

Hey now! That's unfair. It is constantly changing. Software updates introduce new regressions all the time. So it will be this bad, or significantly worse, and you won't know which until it tries to kill you in new and unexpected ways :j

[–] [email protected] 4 points 16 hours ago (2 children)

To be fair, that grey tree trunk looked a lot like a road

[–] [email protected] 6 points 15 hours ago

It's fine, nothing at all wrong with using just camera vision for autonomous driving. Nothing wrong at all. So a few cars run off roads or don't stop for pedestrians or drive off a cliff. So freaking what, that's the price for progress my friend!

I'd like to think this is unnecessary but just in case here's a /s for y'all.

[–] [email protected] 2 points 16 hours ago

GPS data predicted the road would go straight as far as the horizon. Camera said the tree or shadow was an unexpected 90 degree bend in the road. So the only rational move was to turn 90 degrees, obviously! No notes, no whammies, flawless

[–] [email protected] 7 points 19 hours ago (4 children)

It got the most recent update, and thought a tunnel was a wall.

[–] [email protected] 2 points 16 hours ago

... and a tree was a painting.

load more comments (3 replies)
[–] [email protected] 21 points 1 day ago
[–] [email protected] 14 points 1 day ago (2 children)

I have visions of Elon sitting in his lair, stroking his cat, and using his laptop to cause this crash. /s

[–] [email protected] 11 points 1 day ago

Why would you inflict that guy on a poor innocent kitty?

[–] [email protected] 3 points 19 hours ago

That tree cast shade on his brand.

It had to go.

[–] [email protected] 3 points 19 hours ago

HAL 9000 had "Daisy Bell"!

Has Tesla been training their AI with the lumberjack song?

[–] [email protected] 89 points 1 day ago (3 children)

The worst part is that this problem has already been solved by using LIDAR. Vegas had fully self-driving cars that I saw perform flawlessly, because they were manufactured by a company that doesn’t skimp on tech and rip people off.

[–] [email protected] 1 points 5 hours ago (1 children)

Lidar doesn’t completely solve the issue lol. Lidar can’t see line markings, speed signs, pedestrian crossings, etc. Cars equipped with lidar crash into things too.

[–] [email protected] 1 points 1 hour ago

I oversold it in my original comment, but it still performs better than using regular cameras like Tesla did. It performs better in weather and other scenarios than standard cameras. Elon is dumb though and doesn’t think LiDAR is needed for self-driving.

[–] [email protected] 7 points 18 hours ago* (last edited 18 hours ago)

I wouldn't really call it a solved problem when Waymo, with lidar, is crashing into physical objects:

https://www.msn.com/en-us/autos/news/waymo-recalls-1200-robotaxis-after-cars-crash-into-chains-gates-and-utility-poles/ar-AA1EMVTF

NHTSA stated that the crashes “involved collisions with clearly visible objects that a competent driver would be expected to avoid.” The agency is continuing its investigation.

It'd probably be better to say that Lidar is the path to solving these problems, or a tool that can help solve it. But not solved.

Just because you see a car working perfectly, doesn't mean it always is working perfectly.

[–] [email protected] 13 points 1 day ago (4 children)

Are those the ones that you can completely immobilize with a traffic cone?

[–] [email protected] 2 points 6 hours ago

A human also (hopefully anyway) wouldn't drive if you put a cone over their head.

Like yeah, if you purposely block the car's vision, it should refuse to drive.

[–] [email protected] 10 points 21 hours ago

You say that like it's a bad thing lol if it kept going, that cone would fly off and hit somebody.

[–] [email protected] 25 points 1 day ago (1 children)

Probably Zoox, but conceptually similar, LiDAR backed.

You can immobilize them by setting anything large on them. Your purse, a traffic cone, a person :)

Probably makes sense to be a little cautious with the gas pedal when there is anything on top of the vehicle.

load more comments (1 replies)
load more comments (1 replies)
[–] [email protected] 5 points 22 hours ago

Don't drive Tesla

[–] [email protected] 11 points 1 day ago (1 children)

Why would someone be a passenger in a self-driving vehicle? Do they know that they are test subjects, part of a "car trial" (or whatever it should be called)? Self-driving is not reliable and not necessary. Too much money is invested in something that is low priority to have. There are perfectly fast and safe self-driving solutions already, like high-speed trains.

[–] [email protected] 2 points 19 hours ago

I have no idea, I guess they have a lot more confidence in self driving (ESPECIALLY Tesla) than I do.

[–] [email protected] 27 points 1 day ago (1 children)

I mean, if Elon was my dad, I'd probably have some suicidal tendencies too.

load more comments (1 replies)