ChatGPT has been cited as a cause in at least one of the layoffs. The tech industry is especially positioned to be quickly affected by AI, but AI is going to impact 80% of the jobs on the planet within the next 8 years. Our world is about to experience a massive change in the way things are run. We can try to prepare, but it's going to change in ways previously unimaginable.
The chatbot that's wrong 50% of the time? That's hard to believe.
You're doing yourself a big disservice if you limit your understanding of AI to what you read from the opinions of Lemmings. It is incredibly powerful, and every major corporation has large investments in AI integration.
I'm not doing that at all; this is my personal experience.
Then you're using it wrong.
Clearly the most reasonable response. Did you work for apple back in the day?
I wish!
I can't wait to be starving
Yeah, it's pretty scary. I don't know what the future holds, but I think a lot of jobs are going to go the way of the vacuum tube salesman.
There's no way capitalism and those who buy into it can responsibly use AI.
True as that may be, it's not going to stop them.
Maybe people should learn to fight for a better tomorrow instead of just trying to get through today
I don't disagree with you. But that doesn't change the reality we have to live with today, and it's veering off topic. AI has already achieved a lot of cool stuff, like assisting in the creation of vaccines at record speed, helping us identify diseases, finding planets in habitable zones around distant stars, and a bunch of other cool shit. Just because it can be abused doesn't mean it can't also accomplish great things. If you're worried about corporations abusing AI, then call your senators, or launch a petition for a new bill. But just sticking your head in the sand doesn't accomplish anything, and doesn't change what is to come.
And how many major corporations that invested in NFTs are still doing so?
Look man, I don't work for OpenAI, or Microsoft, or Google, or any other company that is developing their own AI. I'm not evangelizing. If you don't want to use it, then don't. But it's going to affect everyone, regardless of what they think about it. If you want a leg up in the changing world, then start learning about it. Or don't. I'm not your dad.
It doesn't need to be right to make money, often more money than companies get by paying people to do a job properly.
... Yes, it does in the tech sector. If you're wrong, it doesn't work.
I've tried the tools out. You go from writing code for an hour and debugging for half an hour to writing code for 15 minutes and debugging for three hours.
Half the time you've ripped out literally every bit of code the AI wrote by the time you're done making it work.
Get better at the prompts you use. My entire team uses it daily, and it has made us probably 600% more effective. Learning to prompt AI is a valuable skill right now.
What metric tells you that you improved 600%?
Oh, I know how to prompt AI. Getting it to spit out workable code doesn't mean you don't have to review the code, or make sure it's integrated correctly.
You also have to make sure it's not generating blatantly braindead code, which makes the review and debugging cycle take longer.
I remain unconvinced that it's suitable for domains where there is a right and wrong answer, like engineering or law.
I've found more value in the systems that do a good job understanding the problem description and then returning references to documentation and prior art on techniques, as opposed to the actual code.
I don't need a virtual junior dev I have to hand-hold; I already have those, and mine actually get better. I want a virtual "person who worked on something like this once and knows the links to the good articles".
Imagine training what's going to replace you lol
It's going to happen regardless of what you personally do. You might as well get some benefit from it while you can.
Nah, I have standards
You have standards, but the companies firing all these people don't.
I mean, I can imagine it. It's the industrial revolution all over again but Cyberpunk style.