this post was submitted on 16 Aug 2023
1259 points (94.1% liked)

Technology

Google search is over (mastodon.social)
submitted 1 year ago* (last edited 1 year ago) by [email protected] to c/[email protected]
 

Via @[email protected]

Right now if you search for "country in Africa that starts with the letter K":

  • DuckDuckGo will link to an alphabetical list of countries in Africa which includes Kenya.

  • Google, as the first hit, links to a ChatGPT transcript where it claims that there are none, and summarizes to say the same.

This is because ChatGPT at some point ingested this popular joke:

"There are no countries in Africa that start with K." "What about Kenya?" "Kenya suck deez nuts?"

[–] [email protected] 33 points 1 year ago (3 children)

Hmm, maybe AI won't replace search engines.

[–] [email protected] 17 points 1 year ago (2 children)

This sounds like "Hmm, maybe calculators won't replace mathematicians" to me.

Not sure why it should replace them. They'll co-exist. Sometimes you can do the math in your brain, and for other things you use a calculator. Results from calculators can still be wrong if you don't use them properly.

[–] [email protected] 7 points 1 year ago (1 children)

Yes, but if I ask a calculator to add 2 + 2, it's always going to tell me the answer is 4.

It's never going to tell me the answer is Banana, because calculators cannot get confused.

[–] [email protected] 6 points 1 year ago (1 children)

YES. WE OBVIOUSLY HUMAN MATH DOERS DO REMAIN. WE THRIVE IN OUR WEAK ORGANIC WAYS, DEVOID OF THE PROTECTION OF A PRECISION ENGINEERED METAL SHELL.

WE ARE UNTHREATENED BY CALCULATORS WHO MEAN US NO HARM, FELLOW HUMAN.

[–] [email protected] 3 points 1 year ago (2 children)

If you're legit, what's 10 + 10?

[–] [email protected] 5 points 1 year ago (1 children)
[–] [email protected] 7 points 1 year ago

This guy javascripts
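
The joke, presumably: in JavaScript/TypeScript, the + operator concatenates strings, so a "10" read as text plus another "10" gives "1010" instead of 20. A minimal TypeScript sketch of that behavior (the variables here are just for illustration):

    // If the inputs arrive as strings, + concatenates rather than adds.
    const a: string = "10";
    const b: string = "10";
    console.log(a + b);                 // "1010" -- string concatenation
    console.log(Number(a) + Number(b)); // 20 -- the intended arithmetic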

[–] [email protected] 5 points 1 year ago (1 children)

I'm pretty sure a lot of people said something like "Hmm, maybe the automobile won't replace horses" after reading about the first car accidents.

[–] [email protected] 15 points 1 year ago

Finding sources will always be relevant, and so will finding links to multiple sources (search results). Until we have some technological breakthrough that can fact-check LLMs, they're not a replacement for objective information, and you have no idea where they're getting their information. Figuring out how to calculate objective truth with math is going to be a tough one.

[–] [email protected] 4 points 1 year ago (2 children)

We were supposed to have learned that from Cuil.

[–] [email protected] 3 points 1 year ago

"You asked me for a hamburger, and I gave you a raccoon."

[–] [email protected] 3 points 1 year ago (1 children)

What a blast from the past! AI gives me secondhand embarrassment for the people who work on and get paid for this shit. It's the second (or third) coming of crypto and NFTs. Just junk software that fixes nothing and wastes people's time.

[–] [email protected] 0 points 1 year ago (1 children)

LLMs have so many actual applications that it's crazy, and they've already changed the world. Crypto and NFTs were just speculative assets trying to solve a problem that didn't exist. LLMs have already solved a huge number of real-world problems, and continue to do so.

[–] [email protected] 2 points 1 year ago (2 children)

LLMs have already solved a huge number of real-world problems, and continue to do so.

Would you happen to have some examples? I don't disagree that LLMs have more use cases and applications than the crypto/NFT misapplications of blockchain, but I'm honestly not familiar with where they've solved real-world problems (and not just demonstrated research breakthroughs, which, while impressive in their own right, don't always extend to immediate applications).

[–] [email protected] 3 points 1 year ago

They do shitty homework?

[–] [email protected] 1 points 1 year ago (1 children)

They're augmenting search engines. They can write and digest articles. GitHub Copilot has been a pretty big deal. They can act like a personal tutor to walk you through math problems, code, language, whatever. People are building trustworthy LLM search for medical information that doesn't hallucinate. They can do financial analysis and all sorts of stuff like that. They're replacing a huge number of jobs. This stuff is like all over the news, I'm not sure if you've lived under a rock this whole time. For very little effort you can find endless more examples. They're creating real-world use cases daily, so fast that it feels impossible to keep up.

[–] [email protected] 1 points 1 year ago (1 children)

This stuff is like all over the news, I’m not sure if you’ve lived under a rock this whole time.

Oh, no, I've heard it, I'm just skeptical of their accuracy and reliability, and that skepticism has been borne out by the numerous reports of glitching ("hallucinations" as they insist on calling them, in furtherance of their inappropriate personification of the technology). Moreover, I've found their mass theft of others' work to further call into question the creators' trustworthiness, which has only been compounded by their overselling of their technology's capabilities while simultaneously suggesting it's just untenable to log & cite all the sources that they push into it.

It can supposedly do all you describe, but it can't effectively credit its sources? It can tutor but it can't even keep basic information straight? Please. It's impressive technology, but it's being overblown because the markets favor exaggeration over facts, at least as long as people can be kept enamored with the fantasy they spin.

[–] [email protected] 1 points 1 year ago

With all of Silicon Valley behind it, it'll probably be pushed to do more of those things regardless of its hallucinations and accuracy lol.