this post was submitted on 27 Apr 2024
886 points (96.0% liked)

Technology

69298 readers
4022 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] [email protected] 23 points 1 year ago (60 children)

Is the investigation exhaustive? If these are all the crashes they could find related to the driver-assist / self-driving features, then it is probably much safer than a human driver. 1,000 crashes out of 5M+ Teslas sold in the last 5 years is actually a very small number.

I would want an article to try to find the rate of accidents per 100,000, group it by severity, and then compare and contrast that with human-caused accidents.

Because while it's clear by now Teslas aren't the perfect self-driving machines we were promised, there is no doubt at all that humans are bad drivers.

We lose over 40k people a year to car accidents. And fatal car accidents are rare, so multiply that by something like 100 to get the total number of car accidents.
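The comparison sketched above can be made concrete with a quick back-of-envelope calculation. All the inputs below are the rough figures from this comment (1,000 crashes, 5M Teslas, 40k fatalities scaled by ~100) plus an assumed count of US licensed drivers; none of them are verified statistics, so treat the output as an illustration of the method, not a real safety finding:

```python
# Back-of-envelope crash-rate comparison using the rough figures above.
# Every input here is an estimate from the comment, not verified data.

tesla_crashes = 1_000        # crashes tied to driver assist in the investigation
teslas_on_road = 5_000_000   # Teslas sold over ~5 years

us_fatalities_per_year = 40_000  # rough annual US traffic deaths
accidents_per_fatality = 100     # assumed ratio of all accidents to fatal ones
us_drivers = 230_000_000         # rough count of licensed US drivers (assumption)

# Normalize both to a rate per 100,000 vehicles/drivers.
tesla_rate = tesla_crashes / teslas_on_road * 100_000
human_rate = (us_fatalities_per_year * accidents_per_fatality) / us_drivers * 100_000

print(f"Driver-assist crashes per 100k Teslas: {tesla_rate:.0f}")
print(f"Estimated accidents per 100k human drivers per year: {human_rate:.0f}")
```

Note the units don't quite line up (crashes over five years vs. accidents per year), which is exactly why the comment asks for an article that does this properly.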

[–] [email protected] 30 points 1 year ago (40 children)

The question isn't "are they safer than the average human driver?"

The question is "who goes to prison when that self-driving car has an oopsie, veers across three lanes of traffic and wipes out a family of four?"

Because if the answer is "nobody", they shouldn't be on the road. There's zero accountability, and because it's all wibbly-wobbly AI bullshit, there's no way to prove that the issues are actually fixed.

[–] Maddier1993 3 points 1 year ago

I don't agree with your argument.

Making a human go to prison for wiping out a family of four isn't going to bring back the family of four. So you're just using deterrence to hopefully make drivers more cautious.

Yet, year after year, humans cause more deaths by negligence than tools cause by failing.

The question is definitely "How much safer are they compared to human drivers"

It's also much easier to prove that the system has those issues fixed than to train a human and hope that their critical faculties are intact. Rigorous software testing and mechanical testing are within legislative reach and can be made strict requirements.
