[email protected] 4 points 1 year ago

As a software developer, I can tell you that's not how testing works. QA always tries to come up with weird edge cases to test, but once software is out in the wild with thousands (or more) of real-world users, there will always be something nobody ever thought to test.
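For what it's worth, this is exactly what property-based testing tools try to mechanize. Here's a minimal sketch in Python using the Hypothesis library (the parser and its name are hypothetical, purely for illustration): the generator hammers the function with adversarial inputs, but it can still only explore the input space we thought to model, which is the point.

```python
# Minimal sketch of QA-style edge-case generation with Hypothesis.
# parse_speed_limit is a hypothetical function, not from this thread;
# the generator finds weird strings, but only within the input model
# we chose -- real-world inputs are a far larger space.
from hypothesis import given, strategies as st

def parse_speed_limit(sign_text: str):
    """Hypothetical parser for speed-limit sign text."""
    digits = "".join(ch for ch in sign_text if ch in "0123456789")
    return int(digits) if digits else None

@given(st.text())  # random unicode strings: empty, emoji, control chars...
def test_parser_never_crashes(sign_text):
    result = parse_speed_limit(sign_text)
    assert result is None or result >= 0
```

Run it with pytest and Hypothesis will throw thousands of generated strings at the parser; it still can't generate the real-world input nobody modeled.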

For example, there was a crash where an unmarked truck, exactly the same color as the sky, was sitting 90° sideways across the highway. That's just not something you'd think to test under lab conditions.

[email protected] 2 points 1 year ago

> there will always be something nobody ever thought to test.

That's not what is happening, though. We aren't seeing weird edge cases; we're seeing self-driving cars blocking emergency vehicles and driving through barriers.

> For example, there was a crash where an unmarked truck, exactly the same color as the sky, was sitting 90° sideways across the highway.

The sky is blue and the truck was white. Testing the dynamic range of the camera system is absolutely something you do in a lab setting. And a thing blocking the road isn't exactly unforeseen either.
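To make that concrete, here's a minimal sketch of the kind of contrast check a lab test could run (the pixel values and detection threshold are my assumptions, not any vendor's actual numbers): a white trailer against a bright sky can fall below the contrast a vision-only stack needs to separate object from background.

```python
# Illustrative lab-style contrast check: does a region of interest
# (the trailer) remain distinguishable from the background (the sky)
# in the camera's output? Michelson contrast below some detection
# threshold means the obstacle is effectively invisible to vision.
import numpy as np

def michelson_contrast(roi: np.ndarray, background: np.ndarray) -> float:
    lum_roi = float(roi.mean())
    lum_bg = float(background.mean())
    return abs(lum_roi - lum_bg) / (lum_roi + lum_bg + 1e-9)

# Hypothetical 8-bit frame: bright sky (~240) and a white trailer (~235).
sky = np.full((100, 100), 240, dtype=np.uint8)
trailer = np.full((40, 80), 235, dtype=np.uint8)

DETECTION_THRESHOLD = 0.05  # assumed minimum usable contrast
c = michelson_contrast(trailer, sky)
print(f"contrast={c:.4f}, detectable={c >= DETECTION_THRESHOLD}")
# contrast=0.0105, detectable=False
```

You don't need a truck on a real highway to run this; a test chart and controlled lighting will tell you where your camera's dynamic range gives out.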

Or how about railroad crossings: Tesla can't even tell the difference between a truck and a train. Trucks blipping in and out of existence, or even changing direction, are totally normal for Tesla too.

I don't expect self-driving cars to be perfect and to handle everything, but I do expect manufacturers to be transparent about their systems' abilities, and they aren't. Furthermore, I expect a self-driving system to have a way to react to unforeseen situations: crashing in fog is not acceptable when the presence of fog was plainly obvious.
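To sketch what "a way to react" could look like (a hypothetical policy of my own, with assumed braking and visibility numbers, not any manufacturer's actual logic): compare how far the sensors can currently see against the distance needed to stop, and degrade accordingly instead of plowing on.

```python
# Sketch of a graceful-degradation policy for low-visibility driving.
# All numbers are assumptions for illustration only.
from enum import Enum, auto

class Action(Enum):
    CONTINUE = auto()
    REDUCE_SPEED = auto()
    MINIMAL_RISK_STOP = auto()  # slow to a safe stop, alert the driver

def degrade_policy(visibility_m: float, speed_mps: float) -> Action:
    # Stopping distance from v^2 / (2a), assuming ~6 m/s^2 braking.
    stopping_distance = speed_mps ** 2 / (2 * 6.0)
    if visibility_m > 2 * stopping_distance:
        return Action.CONTINUE
    if visibility_m > stopping_distance:
        return Action.REDUCE_SPEED
    return Action.MINIMAL_RISK_STOP

# In fog with 40 m visibility at highway speed (~33 m/s), stopping
# distance is ~91 m, so the only defensible action is to stop safely.
print(degrade_policy(visibility_m=40.0, speed_mps=33.0))
```

Nothing about this is hard to state; the hard part is the honesty to admit when the system can't see far enough.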

[email protected] 3 points 1 year ago (last edited 1 year ago)

> And a thing blocking the road isn't exactly unforeseen either.

Tesla's system intentionally assumes "a thing blocking the road" is a sensor error.

They have said that if they didn't do that, then about every hour or so you'd drive past a building, the car would slam on the brakes, and it would stop in the middle of the road for no reason (and then, probably, a car would crash into you from behind).
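Here's a sketch of that trade-off (my reconstruction of the failure mode described above, not Tesla's actual code): a naive filter that drops stationary radar returns to kill phantom braking also, by construction, drops a real stationary truck across the lane, because the two look identical in radial speed alone.

```python
# Illustrative clutter filter for radar returns. Dropping everything
# that isn't moving over the ground suppresses phantom braking at
# buildings and overhead signs -- and also suppresses a real, deadly
# stationary obstacle. Sign conventions and thresholds are assumed.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float
    radial_speed_mps: float  # d(range)/dt; negative = closing in

def filter_stationary_clutter(returns, ego_speed_mps):
    kept = []
    for r in returns:
        # Estimated over-the-ground speed of the target.
        ground_speed = ego_speed_mps + r.radial_speed_mps
        # Naive rule: stationary returns are assumed to be
        # infrastructure, so they are discarded as clutter.
        if abs(ground_speed) > 1.0:
            kept.append(r)
    return kept

ego = 30.0  # m/s
overpass = RadarReturn(range_m=120.0, radial_speed_mps=-30.0)   # harmless
truck_across_lane = RadarReturn(range_m=120.0, radial_speed_mps=-30.0)  # deadly
# Both look identical to the filter, and both get dropped.
print(filter_stationary_clutter([overpass, truck_across_lane], ego))  # []
```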

The good sensors used by companies like Waymo (lidar, in particular) don't have that problem: they're accurate enough that real stationary obstacles don't need to be filtered out as noise.