this post was submitted on 29 Apr 2025
83 points (92.8% liked)

[–] BB_C 9 points 3 days ago (2 children)

Reads okay for the most part. But I like how the same point about AI shows up as a feature in some more serious real-life projects. There, we frame it as "Rust makes it harder for a 'contributor' to sneak in LLM-generated crap".

[–] firelizzard 5 points 2 days ago (1 children)

In what situation are you accepting contributions that you haven't vetted thoroughly enough to detect crap code? I've seen a lot of crap from developers that's as bad as or worse than LLM-generated crap, so there's no way I'll ever accept contributions to an important system without thoroughly vetting them, unless they're from one of a very small number of people I trust implicitly.

[–] [email protected] 5 points 2 days ago (1 children)

Well, no matter how thoroughly you vet, it's always good to have a tool to back you up.

For example, we once got a pull request which was purely AI-generated, but I couldn't tell that right away. So I skimmed it to make sure no malicious code was part of it, then handed it to the CI runner. That failed pretty much immediately during a compile check, which made it obvious that the pull request author had never even tried to compile it.

At that point, I could stop wasting my time on that pull request rather than trying to debug why it wasn't working or having to vet it more thoroughly...
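
For illustration, here's a minimal sketch of that kind of compile-check gate (a hypothetical xtask-style helper, not the commenter's actual CI setup): it shells out to `cargo check` and fails fast, so a pull request that doesn't even build is rejected before anyone spends review time on it.

```rust
// Hypothetical CI gate: reject a change that doesn't compile.
// Runs `cargo check` against the checked-out pull request; any failure
// stops the pipeline before reviewers look at the code in depth.
use std::process::Command;

fn main() {
    let status = Command::new("cargo")
        .args(["check", "--all-targets", "--all-features"])
        .status()
        .expect("failed to invoke cargo");

    if !status.success() {
        eprintln!("compile check failed; stopping before further review");
        std::process::exit(1);
    }
}
```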

[–] firelizzard 1 points 1 day ago

I thoroughly agree: you should always have CI tools to ensure it builds, passes tests, and meets whatever formatting and/or linting standards the team sets. I was specifically responding to "Rust makes it harder for a 'contributor' to sneak in LLM-generated crap".

If I get a contribution from an untrusted party, I will start with the assumption that it's utter garbage: buggy, broken, and malicious, and review it until I'm convinced it's not. Not because I assume the dev is bad, but because it's safer to assume the code is garbage. If I get a contribution from a trusted party (e.g. a member of the dev team/employee/whatever), I will review the code carefully, though not with as much paranoia. I don't particularly care if my teammates are using LLMs, but if they're submitting code they don't understand, that's a great way to get ejected from the "trusted contributors" group, and if they're an employee, it's a good way to get fired if they keep doing it after being warned not to.

[–] vivendi -2 points 2 days ago

If nothing else, it should be easier with Rust because:

a) That fucking syntax is probably more legible to an AI than to a human (sue me, Rust absolutists)

b) The language has more safety barriers, making use of AI safer by association (see the sketch below)
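
As a concrete illustration of (b), here's a minimal, hypothetical sketch of the kind of barrier meant here: shared mutable state across threads has to go through `Arc<Mutex<..>>`, so code (human- or LLM-written) that tried to hand the threads a plain mutable reference instead simply wouldn't compile.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Shared state is wrapped in Arc<Mutex<..>>; passing a plain `&mut` to the
// threads instead would be rejected by the borrow checker, so a data race
// can't be snuck in, whether the author is a human or an LLM.
fn main() {
    let counter = Arc::new(Mutex::new(0u32));

    let handles: Vec<_> = (0..4)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                // The lock is the only way to touch the shared counter here.
                *counter.lock().unwrap() += 1;
            })
        })
        .collect();

    for handle in handles {
        handle.join().unwrap();
    }

    println!("counter = {}", *counter.lock().unwrap());
}
```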