this post was submitted on 19 Oct 2023
546 points (96.6% liked)

Black Mirror creator unafraid of AI because it’s “boring”::Charlie Brooker doesn’t think AI is taking his job any time soon because it only produces trash

[–] [email protected] 16 points 1 year ago (2 children)

By their nature, Large Language Models won't ever be truly innovative; after all, they rely on expected patterns. But a lot of the media that we consume is also made to appeal to patterns we expect: genres, tropes, familiar messages. AI could replace a lot of it, and frankly, that's a scary thought in a world where we need to work to earn our living.

Truly groundbreaking art may not be what people usually seek; it's often something they don't even know they want until they experience it, or they might even fail to appreciate it. It likely won't be automated unless AI achieves full consciousness, and if that happens we will have a much more complicated situation on our hands than "we can command AI to make art better than we can ourselves".

Still, getting paranoid over the uncertain latter won't help us with the former, which is just around the corner.

[–] [email protected] 7 points 1 year ago (1 children)

Good points.

One problem with replacing everything with AI that people don't think about: middle managers will start to be replaced too. There's no way to ask an LLM "why did you do that?" Fewer people will need to be managed.

[–] [email protected] 5 points 1 year ago (2 children)

It seems unwise to replace managers with LLMs, because LLMs don't understand the real-world implications of their responses. They have no awareness of the real world; they simply give you frequently used language patterns, which can be inaccurate or biased because of flawed human data. But it would be a great way for sketchy human executives to offload responsibility for unethical actions and feign objectivity or uninvolvement, so I don't doubt they will try.

Even if we imagine a perfect AI that takes into account every objective fact and philosophical argument, that still leaves the question of how the people who get replaced in all these intellectual, artistic and service jobs will make a living. That's not an answer technology will give us; that will be a nasty political situation.

[–] [email protected] 5 points 1 year ago (1 children)

No, you misunderstood. The managers are fired because there are fewer people to manage.

[–] [email protected] 2 points 1 year ago

That makes sense too. Overall, a lot of people's jobs are threatened, but I don't think "learn AI" is going to cut it this time. Not for all these people.

[–] [email protected] 1 points 1 year ago

LLMs don’t understand the real-world implications of their responses

LLMs don't, but specialised AI trained for that specific purpose would.

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago) (2 children)

Truly groundbreaking art may not be what people usually seek; it’s often something they don’t even know they want until they experience it, or they might even fail to appreciate it.

Everyone in these threads likes to talk about being impressed or unimpressed by these LLMs as if it were some sort of intelligence test. I think of it more as a test of a person's sense of creativity.

It spits out a lot of passable text very easily, but as you're saying here, its creativity is essentially nil. Even its "hallucinations" are just versions of things it borrowed from elsewhere, injected slightly to wildly out of context in order to satisfy a prompt.

I tried to play a generative AI RPG builder game online and it came up with scenarios so boring I can't imagine playing it for longer than ten minutes.

I also find the same with generated content in other video games. At its best it's passable and that's about it. No Man's Sky has infinite worlds full of weird liger creatures, and after you've visited a couple dozen worlds they're pretty much all the same.

[–] [email protected] 3 points 1 year ago (1 children)

And who is to say that we humans don't process creativity exactly the same way? By borrowing from things we encounter.

Even the earliest creative expressions of humans were just things we saw in nature, which we drew on cave walls.

We humans just have more experience, since we've existed longer, so the line feels a lot more blurred.

I also encountered games made by humans that were so boring I couldn't manage more than 10 minutes.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

And who is to say that we humans don’t process creativity exactly the same way? By borrowing from things we encounter.

That's part of it, but it's definitely not all of it.

There's more creativity in the average prompt than there is in any response I've ever seen from ChatGPT.

If creativity were as simple as mashing a few things together, as you're saying, ChatGPT would be there already, because that's obviously what it's doing.

I also encountered games made by humans that were so boring I couldn’t manage more than 10 minutes.

Me too, but that's an indictment of a single creator or team's boring idea, not an indictment of the whole system. This thing was basically a framework with the LLM as the central "creator". It would find the most boring aspects of the prompts and lean into them. This is of course a subjective assessment, but I'd argue that it's not an uninformed one.

[–] [email protected] 1 points 1 year ago (1 children)

I also find the same with generated content in other video games. At its best it's passable and that's about it.

Minecraft would like to have a word with you...

[–] [email protected] 1 points 1 year ago

Minecraft isn't generating new animals or narrative. Landscape generation is relatively straightforward from an algorithmic / computational perspective. If it started generating its own models, characters, or character dialogue, I suspect it would very quickly fall into the territory of what I'm talking about.
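
To illustrate the "straightforward from an algorithmic perspective" point: Minecraft-style terrain is commonly described as layered (octave) noise, where every block height is a pure, repeatable function of the world seed and the coordinates, so an "infinite" world needs no stored content. Below is a minimal Python sketch of that idea using simple hash-based value noise; the function names, constants, and scaling are invented for illustration and are much cruder than what the actual game does.

```python
# Minimal sketch of noise-based terrain generation (illustrative only).
# Real engines typically use Perlin/simplex noise; the principle is the same:
# the landscape is a deterministic function of (seed, x, z).
import math

def lattice_value(seed: int, x: int, z: int) -> float:
    """Repeatable pseudo-random value in [0, 1) for an integer grid point."""
    h = (x * 374761393 + z * 668265263 + seed * 2147483647) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return (h ^ (h >> 16)) / 0x100000000

def smooth_noise(seed: int, x: float, z: float) -> float:
    """Bilinearly interpolate between the four surrounding lattice values."""
    x0, z0 = math.floor(x), math.floor(z)
    tx, tz = x - x0, z - z0
    # Smoothstep easing hides the grid creases.
    tx, tz = tx * tx * (3 - 2 * tx), tz * tz * (3 - 2 * tz)
    top = (1 - tx) * lattice_value(seed, x0, z0) + tx * lattice_value(seed, x0 + 1, z0)
    bottom = (1 - tx) * lattice_value(seed, x0, z0 + 1) + tx * lattice_value(seed, x0 + 1, z0 + 1)
    return (1 - tz) * top + tz * bottom

def terrain_height(seed: int, x: float, z: float, octaves: int = 4) -> int:
    """Sum several octaves of noise and scale the result to a block height."""
    height, amplitude, frequency = 0.0, 1.0, 1 / 64
    for _ in range(octaves):
        height += amplitude * smooth_noise(seed, x * frequency, z * frequency)
        amplitude *= 0.5
        frequency *= 2
    return 60 + int(height * 40)  # rough "sea level plus hills" scaling

# Any region of the "infinite" world can be regenerated on demand from the seed:
print([terrain_height(seed=42, x=x, z=0) for x in range(8)])
```

The point isn't fidelity; it's that the whole landscape falls out of a few dozen lines of deterministic math, which is part of why it can start to feel samey once you've seen enough of it.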

To me there's just a pervasive feeling of emptiness in games whose main narrative or gameplay elements are randomly generated.