this post was submitted on 26 May 2025

Programming

top 19 comments
[–] [email protected] 27 points 2 days ago* (last edited 2 days ago) (3 children)

I'm someone who uses LLMs, including Claude Code, a lot.
But come on, this could be a tiny script that does the same thing. Write it with Claude Code if you don't want to write it yourself. Instead of having Claude Code run each time, you'd be deterministic this way!

[–] [email protected] 15 points 2 days ago

Yeah this should just be a standard GitHub action. It’s a waste of energy to have an LLM do this over and over.

I see this trend happening a lot lately, where instead of getting the LLM to write code that does a thing, the LLM is repeatedly asked to do the thing, which leads to it just writing the same or very nearly the same code over and over.
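For the "standard GitHub action" version of this: GitHub can already generate release notes deterministically from merged PRs, so the whole flow can be a one-liner (the tag name here is just an example):

```shell
# Create a release and let GitHub assemble the notes from merged PRs --
# repeatable, auditable, and no LLM invocation per release.
gh release create v1.2.3 --generate-notes
```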

[–] [email protected] 14 points 2 days ago (1 children)

Their company is an AI assistant for shopping, so trying to put AI everywhere, including places it shouldn't be, is gonna happen.

I like my build scripts dependable, debuggable, and deterministic. This is wild. When the bot makes a pull request and the user (who may be someone else at some point) doesn't respond with exactly what the prompt expects, what happens? What happens when Claude Code updates, or has an outage? And don't rename that GitHub action at the end without remembering to update the prompt as well.

[–] [email protected] 5 points 2 days ago

Or worse. A single bad actor (according to the company) poisoned Grok into white supremacy. How many unsupervised, privileged LLM commands could run in a short time if an angry employee at Anthropic poisons the LLM to cause malicious damage to servers, environments, or pipelines it has access to?

[–] [email protected] 2 points 2 days ago

They only actually need Claude to skip QA and hope that reading the code is a good enough substitute.

This really could be a script to create a PR for the merge, request a review from Claude, then automate the rest.
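That "script to create a PR and request a review" could look like this hypothetical two-liner; `release` as the head branch and `claude-reviewer` as the reviewer handle are made-up names:

```shell
# Open the merge PR with title/body filled from commit messages,
# then ask the review bot to look at it. Everything else stays scripted.
gh pr create --base main --head release --fill
gh pr edit --add-reviewer claude-reviewer
```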

[–] [email protected] 11 points 2 days ago

Uhm, we have release pipelines in Azure DevOps that do all of this and much more with precision and reliability. Relying on an LLM for actual releases to production of all things seems like a pretty big and wholly unnecessary gamble.

[–] [email protected] 18 points 2 days ago (1 children)

The only part of this process I'd consider automating with a LLM is summarizing the changes, and even then I'd only be interested in looking at a suggested changelog, not something fully automated.

It's amazing to me how far people will go to avoid writing a simple script. Thankfully determinism isn't a requirement for a release pipeline. You wouldn't want all of your releases to go smoothly. That would be no fun.

[–] FizzyOrange 1 points 2 days ago

I agree. The right way to integrate AI into this process is to pre-fill the "release notes" box with an AI suggestion that you then edit.
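That "AI suggests, human edits" flow can be sketched like this; `llm-draft-notes` is a made-up placeholder for whatever LLM call produces the draft:

```shell
# Hypothetical flow: an LLM drafts the notes to a file, a human edits
# them, and only the edited text is published.
llm-draft-notes > notes.md        # placeholder LLM call; output is a suggestion
"${EDITOR:-vi}" notes.md          # human reviews and edits the draft
gh release create v1.2.3 --notes-file notes.md
```

The key property is that nothing the model wrote ships without passing through a human editor first.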

[–] natecox 18 points 2 days ago (1 children)

So this is Jenkins except it guzzles water and ruins the lives of people near data centers?

[–] [email protected] 0 points 2 days ago (1 children)

Not every datacenter is like Elon's datacenters.

[–] natecox 3 points 2 days ago (1 children)

Sure, some of them are much bigger and more environmentally destructive.

[–] [email protected] -4 points 2 days ago (1 children)

Not at all. Bigger datacenters usually use lower-carbon energy sources, try to lower the energy spent on cooling, and try to recycle the heat.
New datacenters increasingly reuse the heat they generate:
https://blog.equinix.com/blog/2024/06/05/what-is-data-center-heat-export-and-how-does-it-work/

The real water polluters are PFAS producers, not datacenters.

[–] natecox 3 points 2 days ago (1 children)

Go explain to these people why “bigger DCs are actually better”: https://youtu.be/DGjj7wDYaiI

Here’s the MIT Press detailing the ridiculous carbon and water impacts of data centers: https://thereader.mitpress.mit.edu/the-staggering-ecological-impacts-of-computation-and-the-cloud/

“AI” is not worth this.

[–] [email protected] -3 points 2 days ago (1 children)

AI is not the whole cloud; it's a fraction of the cloud.
The MIT Press article is from 2022, citing 2019 data. Datacenter tech and heat reuse have intensified dramatically in the last few years, so that data is clearly out of date.

Go explain to these people why “bigger DCs are actually better”:

Tell me where there is any proof this is Meta's fault. Because they live near the datacenter? Do you have any idea how much water a datacenter consumes?

[–] natecox 1 points 2 days ago (1 children)

AI is not the whole cloud, it's a fraction of the cloud

And yet it uses so much more energy for its share of the cloud. This is not like a secret, dude.

Do you have any idea of the amount of water a datacenter consume ?

No. Nobody has exact figures, because there is a mountain of intentional obfuscation hiding them. We know that it is a tremendous amount of water because we can estimate, and we can see the data of towns literally going into extreme droughts right next to data centers, but it is suspiciously difficult to get actual numbers. This should tell you a lot.

https://www.datacenterdynamics.com/en/analysis/data-center-water-usage-remains-hidden/

Tell me where there is any proof this is meta fault

Did… did you not actually watch the video? The video very clearly answers this. Like, multiple times.

[–] [email protected] 1 points 1 day ago* (last edited 1 day ago)

The video very clearly answers this. Like, multiple times.

No, they made assertions; that's not proof.
For the first location, they say the loss of water pressure AND the sediment are due to the datacenter.
But they get their water from a well, and if a well runs low, you get more sediment.
Is this your "clear answer"?

We know that it is a tremendous amount of water because we can estimate and we can see the data of towns literally going into extreme droughts right next to data centers.

If this comes from your video again, then I doubt your statements again.

Datacenters don't make water magically disappear; it has to go somewhere.
You would see a release pipe, meaning the water is returned, or a vapor cloud, which should be very visible.
But we don't see any vapor cloud.

[–] [email protected] 5 points 2 days ago

This doesn't seem like a good idea.

One, releasing should be easy. At my last job, you clicked "new release" or whatever on GitHub. It then listed all the commits for you. If you "need" an AI to summarize the commits, you fucked up earlier: write better commit messages. Review the changes. Use your brain (something the AI can't do) to make sure you actually want all of this to go out. Click the button. GitHub runs checks and you're done.

Most of the time this whole process took a couple of minutes at most.
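The "it listed all the commits for you" step is one git command, assuming releases are marked with tags:

```shell
# Show every commit since the most recent tag -- the raw material for
# release notes, no summarization model required.
git log --oneline "$(git describe --tags --abbrev=0)"..HEAD
```

If the one-line subjects aren't readable enough to paste into release notes, that's a commit-message problem, which is the commenter's point.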

[–] [email protected] 5 points 2 days ago

I'm a big fan of both AI and automation but this is just 😬

[–] xtools 1 points 2 days ago* (last edited 2 days ago)

this seems overengineered to me - it's easy enough to have fully automated builds, deployments and releases already. you can even have ai sketch your github actions or similar config. anything beyond that is simply a downgrade or reinventing the wheel. a git commit hook might be just enough
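the kind of hook that comment has in mind might look like this minimal sketch, where `make build test` stands in for whatever checks the project actually runs:

```shell
#!/bin/sh
# Hypothetical .git/hooks/pre-commit: run the project's checks before
# every commit; a non-zero exit aborts the commit. Deterministic, local,
# and no LLM anywhere in the path.
set -e
make build test
```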