this post was submitted on 05 Feb 2024
202 points (84.1% liked)

Asklemmy

OK, let me give a little bit of context. I'll turn 40 in a couple of months, and I've been a C++ software developer for more than 18 years. I enjoy coding, and I enjoy writing "good" code: readable and so on.

However, for the past few months I've become really afraid for the future of the job I like, given the progress of artificial intelligence. Very often I can't sleep at night because of this.

I fear that my job, even if it doesn't completely disappear, will become a very boring one consisting of debugging automatically generated code, or that the job will disappear entirely.

For now, I'm not using AI. A few of my colleagues do, but I don't want to, for two reasons: one, it removes a part of coding that I like, and two, I have the feeling that using it is sawing off the branch I'm sitting on, if you see what I mean. I fear that in the near future, people who don't use it will be fired because management sees them as less productive...

Am I the only one feeling this way? I get the impression that all tech people are enthusiastic about AI.

you are viewing a single comment's thread
[–] [email protected] 32 points 9 months ago (4 children)

AI is a really bad term for what we are all talking about. These sophisticated chatbots are just cool tools that make coding easier and faster, and for me, more enjoyable.

What the calculator is to math, LLMs are to coding: nothing more. Actual sci-fi-style AI, like self-aware code, would be scary if it were ever demonstrated to even be possible, which it has not been.

If you ever have the chance to use these programs to help you speed up writing code, you will see that they absolutely do not live up to the hype attributed to them. The people shouting that the end is nigh are seemingly exclusively people who don't understand the technology.

[–] [email protected] 22 points 9 months ago

I've never had to double check the results of my calculator by redoing the problem manually, either.

[–] [email protected] 2 points 9 months ago (1 children)

Haven't we started using AGI, or artificial general intelligence, as the term to describe the kind of AI you are referring to? That self-aware, intelligent software?

Now AI just means reactive code designed to mimic certain behaviours, or even self-learning algorithms.

[–] [email protected] 1 points 9 months ago (1 children)

That’s true, and language is constantly evolving for sure. I just feel like AI is a bit misleading because it’s such a loaded term.

[–] [email protected] 1 points 9 months ago

I get what you mean, and I think a lot of laymen do have unreasonable ideas about what LLMs are capable of, but as a counterpoint, we have used the label AI to refer to very simple bits of code for decades, e.g. video game characters.

[–] Lmaydev 2 points 9 months ago (1 children)

AI is the correct term. It's the name of the field of study and anything that mimics intelligence is an AI.

Neural networks are a perfect example of an AI. What you actually code is very simple: a bunch of nodes that pass numbers forward through the system, applying weights to the values. Once trained, their capabilities far outstrip the simple code they run, and they seem intelligent.

What you are referring to is general AI.

[–] [email protected] 4 points 9 months ago (2 children)

It's a misnomer, but if you want to pass off LLMs as "artificial intelligence" on a technicality of definition, you'd also have to include

advanced web search engines (e.g., Google Search), recommendation systems (used by YouTube, Amazon, and Netflix)

etc.

[–] Lmaydev 2 points 9 months ago* (last edited 9 months ago)

Indeed you do.

Neural networks are some of the original AIs.

[–] [email protected] 2 points 9 months ago

Yes those are also examples of AI, see relevant Wikipedia article:

AI is whatever hasn't been done yet

We need better terms to specify exactly what we mean, e.g. a numeric scale of intelligence or maybe even something more complex like a radar chart.

[–] [email protected] 2 points 9 months ago

Yeah, this is the thing that always bothers me. By the very nature of being large language models, they can generate convincing language. Image "AI" can likewise generate convincing images. Calling it AI is both a PR move for branding and an attempt to conceal the fact that it's all just regurgitating bits of stolen copyrighted content.

Everyone talks about AI "getting smarter", but by the very nature of how these types of algorithms work, they can't "get smarter". Yes, you can make them work better, but they will still only be either interpolating or extrapolating from the training set.