this post was submitted on 26 Jul 2024
518 points (100.0% liked)

196

16195 readers
2219 users here now

Be sure to follow the rule before you head out.

Rule: You must post before you leave.

^other^ ^rules^

founded 1 year ago
all 28 comments
[–] [email protected] 149 points 1 month ago (3 children)

Good. It would be very time consuming

[–] [email protected] 40 points 1 month ago (2 children)
[–] [email protected] 27 points 1 month ago
[–] [email protected] 8 points 1 month ago

Did you just ASSUME skynet's gender

[–] [email protected] 14 points 1 month ago

It takes time to digest the joke.

[–] [email protected] 9 points 1 month ago

Ok google, order another tube of genital rash cream

[–] [email protected] 92 points 1 month ago

Same energy:

[–] [email protected] 47 points 1 month ago (1 children)

They're delicious. You'll go back for seconds

[–] [email protected] 14 points 1 month ago

You went for seconds? No wonder you took so long.

[–] [email protected] 33 points 1 month ago

I'm laughing.

[–] [email protected] 29 points 1 month ago (1 children)

Clock I did not read clock the first two times.

[–] [email protected] 16 points 1 month ago
[–] [email protected] 17 points 1 month ago (1 children)

You can eat it with your hands

[–] [email protected] 34 points 1 month ago (1 children)

Eat it with your hands? Not on my watch.

[–] [email protected] 7 points 1 month ago

Now this is good

[–] [email protected] 7 points 1 month ago

Well, it did make me laugh.

[–] [email protected] 5 points 1 month ago (3 children)

I don't get it. Too tired or too stupid.

[–] [email protected] 8 points 1 month ago (1 children)

Google Assistant is bad, and so are its jokes

[–] [email protected] 5 points 1 month ago
[–] [email protected] 4 points 1 month ago

Why not both?

[–] [email protected] 3 points 1 month ago (1 children)

Same - could someone please explain for those of us who aren't so enlightened?

[–] [email protected] 4 points 1 month ago (1 children)

AIs have certain input words that break the conversation, presumably to keep you safe and give you an off switch. Google's assistant doesn't seem to be aware that a user's "no" will make it stop and drop everything, and it tells a joke in the format of a yes/no question that expects a "no" response if you don't already know the joke. So when the user says no, it just ends the conversation with an "Ok". Luckily, that breaks our expectations of what should have happened in a way that makes some of us laugh anyway.
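A minimal sketch of the failure mode described above, with entirely hypothetical names (this is not Google's actual code): if a global cancel check runs on the user's reply before the reply is routed to the skill that asked the question, then a joke whose setup expects a "no" answer gets cancelled instead of finished.

```python
# Hypothetical dialog loop illustrating a global cancel/stop-word check
# intercepting the very answer a yes/no joke needs.

CANCEL_WORDS = {"no", "stop", "cancel", "never mind"}

def handle_reply(user_reply: str, active_skill=None) -> str:
    text = user_reply.strip().lower()
    # The cancel check runs first, regardless of conversation context.
    if text in CANCEL_WORDS:
        return "Ok"  # conversation dropped; the joke never finishes
    if active_skill is not None:
        return active_skill(text)  # reply routed to the skill that asked
    return "Sorry, I didn't get that."

def clock_joke(answer: str) -> str:
    # Setup ("Have you heard the one about the clock?") expects "no".
    if answer in {"no", "i haven't"}:
        return "It's very time consuming!"
    return "Oh, you've heard it."
```

With this sketch, `handle_reply("No", clock_joke)` returns `"Ok"` because the global check fires first, while `handle_reply("I haven't", clock_joke)` dodges the cancel list and lets the joke land, matching the last comment in the thread.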

[–] [email protected] 2 points 1 month ago
[–] [email protected] 4 points 1 month ago

I mean, it made me chuckle

[–] [email protected] 1 points 1 month ago

google is the joke, at this point.

[–] [email protected] 1 points 1 month ago

The response halted it. Saying "I haven't" would've probably let it continue.