this post was submitted on 18 Feb 2024
30 points (76.8% liked)
How much coding work is left to be done? Infinity. There will always be more needed. Always. And while there is a certain truth to the idea that software just needs to be good enough, software that is merely good enough very quickly becomes nearly impossible to maintain or extend.
AI doesn't make us 30% more efficient. There are certain tasks it's really helpful for, but they are quite limited. I can see junior developers being replaced by AI while they're at the stage where training them takes more work than just doing the job yourself. Beyond that, a good developer has skills and experience that AI will never be able to replace, especially since the code has to be maintained.
Well, no. That's just plain wrong. There is only a certain amount of demand for software, as for every other product or service. That's literally Economics 101.
You don't know that. Think about how much time you spend on boilerplating. Not only the "traditional" boilerplate, but maintenance, regular updates, breaking upgrades for dependencies, documentation.
Think about search. Google isn't that good at actually understanding what you want to find; an AI might find that one obscure blog post from 5 years ago, but in 10 seconds, not 10 hours.
Think about all the tests you write that are super obvious: testing for HTTP 500 handling, etc.
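To make the point concrete, here's a sketch (not from the thread) of the kind of "super obvious" test meant here: checking that a client fails gracefully when the upstream service returns HTTP 500. The names `fetch_user` and `UpstreamError` are hypothetical, invented for illustration.

```python
class UpstreamError(Exception):
    """Raised when the upstream service returns a server error."""


def fetch_user(status_code: int, body: str) -> str:
    # Stand-in for a real HTTP client call; takes a canned response
    # so the example stays self-contained.
    if status_code >= 500:
        raise UpstreamError(f"upstream returned {status_code}")
    return body


def test_http_500_raises_upstream_error():
    # The boring-but-necessary case: a 500 must surface as our error type.
    try:
        fetch_user(500, "")
    except UpstreamError:
        pass
    else:
        raise AssertionError("expected UpstreamError on HTTP 500")


test_http_500_raises_upstream_error()
```

Tests like this are mechanical to write, which is exactly why they're a plausible target for automation.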
A technology doesn't have to replace you to make you more efficient, just taking some work off your shoulders can boost productivity.
One thing that is somewhat unique about software engineering is that a large part of it is dedicated to making itself more efficient and always has been. From programming languages, protocols, frameworks, and services, all of it has made programmers thousands of times more efficient than the guys who used to punch holes into cards to program the computer.
Nothing has infinite demand, clearly, but the question is more whether or not we're anywhere near the peak, such that more efficiency will result in an overall decrease in employment. So far, the answer has been no. The industry has only grown as it's become more efficient.
I still think the answer is no. There's far more of our lives and the way people do business that can be automated as the cost of doing so is reduced. I don't think we're close to any kind of maximum saturation of tech.
Here again, I think, is a somewhat tech-centric view on economics.
There is only a finite amount of automation demand, simply because human labor exists.
Inside of our tech bubble, automation simply means more "functionality" per person per time unit. What took 10 devs a year yesterday can be done by 5 people in 6 months today. That's all fine and dandy, but at some point, software clashes with the hard reality of physics. Software doesn't produce anything; it's often just an enabler for physical production. Lube, or grease.
Now, that production obviously can be automated tremendously as well, but with diminishing returns. Each generation of automation is harder than the one before. And each generation has to compete with a guy in Vietnam/Kenya/Mexico. And each generation also has to compete with its own costs.
Why do you think chips are so incredibly expensive lately? R&D costs are going through the roof, production equipment is getting harder and harder to produce, and due to the time pressure, you have to squeeze as much money as possible out of your equipment. So prices go up. But that can't go on forever; at some point the customers can't justify/afford the expense. So there's a kind of feedback loop.
Yes, what I'm saying is that lower costs for software, which AI will help with, will make software more competitive against human production labor. The standard assumption is that if software companies can reduce the cost of producing software, they'll start firing programmers but the entire history of software engineering has shown us that that's not true as long as the lower cost opens up new economic opportunities for software users, thus increasing demand.
That pattern stops only when there are no economic opportunities to be unlocked. The only way I think that happens is when automation has become so prevalent that further advancement has minimal impact. I don't think we're there yet. Labor costs are still huge and automation is still relatively primitive.
But that demand isn't going anywhere. A company with good profits is always going to be willing to re-invest a percentage of those profits in better software. A new company starting out is always going to invest whatever amount of risk they can tolerate on brand new software written from scratch.
That money will not be spent on AI, because AI is practically free. It will always be spent on humans doing the work of creating software. Maybe the work looks a bit different, as in "computer, make that button green" vs
button.color = 'green'
However, it's still work that needs to be done, and honestly it's not that big of an efficiency gain. It's certainly not as big as the jump we made decades ago from punch-card programming to typing code on a keyboard. That jump did not result in layoffs; we have far more programmers now than we did then. If productivity improves, if anything that will mean more job opportunities. It lowers the barrier to entry, allowing new projects that were not financially viable in the past.
Yes, there will always be demand for coders, but will there be enough demand for the current (increasing) supply? Right now the global number of software developers is growing by about a million per year (the total is only 28.7 million), which means that (very roughly) to keep salaries stable, demand for new software also needs to grow by about 3.5% per year. I know that doesn't sound like a lot, but a decade from now you'll need 1.4 jobs for every job today to keep up with the supply.
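The arithmetic behind those figures checks out; here's the back-of-envelope version (the 28.7 million base and 1M/year growth are the comment's own numbers, not independently sourced):

```python
# ~1M new developers/year on a base of 28.7M developers.
base = 28.7e6            # global developer count cited above
annual_growth = 1e6 / base   # fraction of supply added per year

# Compound that rate over a decade to get the required demand multiple.
decade_multiple = (1 + annual_growth) ** 10

print(round(annual_growth * 100, 1))   # percent per year, ~3.5
print(round(decade_multiple, 2))       # jobs per current job, ~1.41
```

So the "1.4 jobs for every job now" figure follows directly from compounding ~3.5% for ten years.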
In the past we had new dynamics that got end-users to spend more and more time using computers, and hence software (desktop PCs, video games, the internet, mobile phones, social media, etc.). At this point there's so little time left in a consumer's day for tech to grow into that I worry any further advancement will have to cannibalize another area; i.e., we've reached peak demand for software.
I agree with you on the testing point, but I disagree with everything else that you’ve said. I didn't say infinite demand, did I? I said there is an infinite need for coding. Just in the realm of business software, there will never come a day when there isn’t one more business requirement, and there will never come a day when there isn’t a need to translate that requirement into code. Is that literal infinity or only effectively so? I don't care. I'm not writing an academic thesis here.
Coding demand is not constrained by the amount of work that needs to be done, but by money (and, to a degree, organization, because larger groups become less efficient), because there will always be more coding that needs to be done.