this post was submitted on 17 Mar 2024
460 points (95.8% liked)

[–] [email protected] 82 points 8 months ago (3 children)

I'm not a developer, but I use AI tools at work (mostly LLMs).

You need to treat AI like a junior intern. You give it a task, but you still need to check the output and use critical thinking. You can't just take some work from an intern, blindly incorporate it into your presentation, and then blame the intern if the work is shoddy.

AI should be a time saver for certain tasks. It cannot (currently) replace a good worker.

[–] Lmaydev 34 points 8 months ago* (last edited 8 months ago) (1 children)

As a developer I use it mainly for learning.

What used to be a Google search followed by skimming a few articles or docs pages is now a single question.

It pulls out the specific info I need, cites its sources, and allows follow-up questions.

I've noticed the new juniors can get up to speed on new tech very quickly nowadays.

As for code I don't trust it beyond snippets I can use as a base.

[–] [email protected] 0 points 8 months ago* (last edited 8 months ago) (1 children)

JFC they've certainly got the unethical shills out in full force today. Language Models do not and will never amount to proper human work. It's almost always a net negative everywhere it is used, final products considered.

[–] Lmaydev 1 points 8 months ago (1 children)
[–] [email protected] 1 points 8 months ago (1 children)

Its intended use is to replace human work in exchange for lower accuracy. There is no ethical use case scenario.

[–] Lmaydev 1 points 8 months ago (1 children)

It's intended to showcase its ability to generate text. How people use it is up to them.

As I said, it's great for learning because it's very accurate when summarising articles and docs. It even cites its sources so you can read up more if needed.

[–] [email protected] 0 points 8 months ago (1 children)

It's been known to claim commands and documentation exist when they don't. It very commonly gets simple addition wrong.

[–] Lmaydev 1 points 8 months ago (1 children)

That's because it's a language model, not a calculator. As I said, you're using it wrong.

[–] [email protected] 1 points 8 months ago (1 children)

So the correct usage is to have documents incorrectly explained to you? I fail to see how that does any good.

[–] Lmaydev 1 points 8 months ago

I know you do buddy.

[–] [email protected] 15 points 8 months ago (1 children)

It's clutch for boring emails and tedious document summaries. Sometimes I get a day's work done in 4 hours.

Automation can be great, when it comes from the bottom-up.

[–] [email protected] 2 points 8 months ago

Honestly, that's been my favorite - bringing in automation tech to help me in low-tech industries (almost all corporate-type office jobs). When I started my current role, I was working consistently 50 hours a week. I slowly automated almost all the processes and now usually work about 2-3 hours a day with the same outputs. The trick is to not increase outputs or that becomes the new baseline expectation.
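As a hypothetical sketch of the kind of bottom-up automation described above — the file layout, team names, and column names are invented for illustration, not taken from the commenter's actual workflow:

```python
# Sketch of a bottom-up office automation: turn a pile of per-team
# CSV reports into one summary email, the sort of repetitive task
# the comment above describes scripting away. All names here are
# hypothetical examples.
import csv
import io


def summarize_reports(reports):
    """Aggregate {team_name: csv_text} into total hours per team."""
    totals = {}
    for team, csv_text in reports.items():
        reader = csv.DictReader(io.StringIO(csv_text))
        totals[team] = sum(float(row["hours"]) for row in reader)
    return totals


def draft_email(totals):
    """Render a plain-text weekly status email from the totals."""
    lines = ["Subject: Weekly hours summary", ""]
    for team, hours in sorted(totals.items()):
        lines.append(f"- {team}: {hours:.1f} hours")
    return "\n".join(lines)


if __name__ == "__main__":
    reports = {
        "ops": "task,hours\ntriage,3.5\npatching,4.0\n",
        "dev": "task,hours\nfeature,6.0\nreview,2.5\n",
    }
    print(draft_email(summarize_reports(reports)))
```

The point of scripts like this isn't sophistication; it's that a few dozen lines run on a schedule can quietly replace an hour of daily copy-paste work.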

[–] [email protected] 8 points 8 months ago

I am a developer and that's exactly how I see it too. I think AI will be able to write PRs for simple stories, but it will need a human to review those PRs and either approve them, give feedback for it to fix, or manually intervene to tweak the output.