It's annoying because either all of it should be AI or none of it.
Ask Lemmy
A Fediverse community for open-ended, thought provoking questions
which do not think on their own, but pass Turing tests (fool humans into thinking that they can think).
How do you know that?
Wait for the next buzzword to come out; it'll pass.
Used GPT-3 once, but haven't had a use case for it since. I'll use an """AI""" assistant when they are legitimately useful.
It's still good to start training one's AI prompt muscles and to learn what an LLM can and can't do.
Humans possess an esoteric ability to create new ideas out of nowhere, never before thought of. Humans are also capable of inspiration, which may appear similar to the way that AIs remix old inputs into "new" outputs, but the rules of creativity aren't bound by any set parameters the way an LLM is. I'm going to risk making a comment that ages like milk and just spitball: true artificial intelligence that matches a human is impossible.
No, it's just a buzzword. I just saw a joke today that AI means "absent Indian".
Not even driverless cars are actually driverless: https://www.jwz.org/blog/2024/01/driverless-cars-always-have-a-driver/
We do have A.I. The Turing test is there for a reason. We just don't have what movies told us A.I. would be like. Corporations don't need an A.I. that can think for itself to replace you. In fact, that's one of the reasons to replace you.
which do not think on their own,
Most humans don't either. But I think you are conflating two different things: intelligence (the ability to reason) and consciousness (being able to do so on your own). I personally believe both of those things spontaneously came into existence in our brains once they became complex enough, and that we are quantitatively not very far from creating networks complex enough ourselves. The last big breakthrough was the ability to create training data sets for AI with AI that don't make the models degenerate.
Of course we have “real” AI. We can literally be surprised while talking to these things.
People who claim it’s not general AI consistently, 100% of the time, fail to answer this question: what can a human mind do that these cannot?
In precise terms. If you say "a human mind can understand", then I need a precise technical definition of "understand". Because the people making the claim that "it's not general AI" are always waving their own flag of technical expertise. So, in technical terms: what can a general AI do that an LLM cannot?
NFT
Title: Unpopular Opinion: The Term "AI" is Just a Marketing Buzzword!
Hey fellow Redditors, let's talk about the elephant in the room: AI. 🤖💬
I can't be the only one feeling a bit agitated by how the term "Artificial Intelligence" gets thrown around, right? Real AI seems like a distant dream, and what we have right now are these Large Language Models (LLMs). They're good at passing Turing tests, but let's be real – they're not thinking on their own.
Am I the only one who thinks "AI" is just a fancy label created by those rich, capitalistic individuals already knee-deep in LLM stocks? It feels like a slick way to boost investments and make us believe these machines are more intelligent than they really are. Thoughts? 🔍🧠💭
You are misunderstanding what AI means, probably due to its overuse in pop culture. What you are thinking of is a subcategory of AI. It goes: AI > Machine Learning > Artificial Life