this post was submitted on 30 Jul 2024
961 points (98.1% liked)

Technology


If you've watched any Olympics coverage this week, you've likely been confronted with an ad for Google's Gemini AI called "Dear Sydney." In it, a proud father seeks help writing a letter on behalf of his daughter, who is an aspiring runner and superfan of world-record-holding hurdler Sydney McLaughlin-Levrone.

"I'm pretty good with words, but this has to be just right," the father intones before asking Gemini to "Help my daughter write a letter telling Sydney how inspiring she is..." Gemini dutifully responds with a draft letter in which the LLM tells the runner, on behalf of the daughter, that she wants to be "just like you."

I think the most offensive thing about the ad is what it implies about the kinds of human tasks Google sees AI replacing. Rather than using LLMs to automate tedious busywork or difficult research questions, "Dear Sydney" presents a world where Gemini can help us offload a heartwarming shared moment of connection with our children.

Inserting Gemini into a child's heartfelt request for parental help makes it seem like the parent in question is offloading their responsibilities to a computer in the coldest, most sterile way possible. More than that, it comes across as an attempt to avoid an opportunity to bond with a child over a shared interest in a creative way.

[–] lowleveldata 40 points 4 months ago (1 children)

this has to be just right

And then he couldn't even be bothered to choose the words himself.

[–] [email protected] 9 points 4 months ago* (last edited 4 months ago) (2 children)

It's not implying he can't be bothered, but that the machine can do a better job.

...which may be true, depending on just how bad he is at writing. Like, I was just watching this classic the other day. If this guy writes like some of those people, the machine may in fact be better.

That said, for most people it's stupid, and the tech isn't able to do a better job of expressing such things.

Yet.

[–] [email protected] 11 points 4 months ago* (last edited 4 months ago)

But doesn’t it matter that the machine isn’t expressing anything? It’s regurgitating words that are a facsimile of emotion. That matters to me, especially in the long term.

Since shorthand and texting became a thing, kids’ writing has gotten way, way worse according to TAs and teachers I know. That was a byproduct of a change in writing styles, so while kinda pathetic, it’s somewhat understandable. But this is just shoving itself between us and our own feelings.

Say Google gets their wish, and everything we write to each other that matters more than a simple surface-level conversation is expressed via LLMs. Where will that leave us? What does that leave us? We’re closing ourselves off from the world with technology, and we’re cheering for a new tech that will let us retreat even further from human experience. That’s goddamn depressing if you ask me. And to answer my own question: it leaves us work, consumption, and fucking nothin.

This tech isn’t here to free us. From work, from tedium. It’s here to relegate us only to the tedium.

[–] [email protected] 5 points 4 months ago

LLMs can also be helpful when your actual feelings should NOT be conveyed. For example, I can have a genuine response to someone along the lines of, "You are dumb for so many reasons, here are just a few of them that show you are out of touch with what our product can do, and frankly with reality itself. {Enumerated list with copious amounts of cursing and belittling}" Ok LLM, rewrite that message using professional office language because I emotionally refuse to.
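The workflow described above can be sketched in a few lines. This is a hypothetical illustration, not any real product's API: the function name `build_rewrite_prompt` is invented, and only the prompt construction is shown; the actual call to an LLM service is deliberately left out.

```python
# Sketch of the "tone-laundering" use case: wrap a blunt draft in an
# instruction asking the model to rewrite it professionally.
# `build_rewrite_prompt` is an illustrative helper, not a real library.

def build_rewrite_prompt(draft: str) -> str:
    """Build the prompt that would be sent to an LLM, asking it to
    keep the factual points but strip the cursing and belittling."""
    return (
        "Rewrite the following message using professional office "
        "language. Keep the factual points, but remove all insults "
        "and cursing:\n\n" + draft
    )

draft = "You are dumb for so many reasons. Our product cannot do that."
prompt = build_rewrite_prompt(draft)
print(prompt)
```

The model's reply to such a prompt would then replace the original message, which is exactly the emotional outsourcing the comment describes, just aimed at feelings you'd rather not convey.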