this post was submitted on 23 Jul 2024
227 points (96.7% liked)
Technology
Alexa and LLMs aren't fundamentally that different from each other. It's a slightly different architecture and, most importantly, a much larger network.
The problem with LLMs is that they require immense compute power.
I don't see how LLMs will get into households any time soon. It's not economical.
To train. But you can run a relatively simple one like phi-3 on quite modest hardware.
The immense computing power is needed for training LLMs; far less is needed to run a pre-trained model on a local machine.
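A back-of-envelope sketch of why inference fits on modest hardware (assuming phi-3-mini's roughly 3.8 billion parameters; the helper function and bit-widths are illustrative, not from any library):

```python
# Rough memory estimate for holding a model's weights locally.
# Assumption: phi-3-mini has ~3.8B parameters; quantization levels are illustrative.

def model_memory_gb(num_params: float, bits_per_param: int) -> float:
    """Approximate memory needed just for the weights, in gigabytes."""
    return num_params * bits_per_param / 8 / 1e9

PHI3_MINI_PARAMS = 3.8e9

fp16_gb = model_memory_gb(PHI3_MINI_PARAMS, 16)  # full-precision-ish weights
q4_gb = model_memory_gb(PHI3_MINI_PARAMS, 4)     # common 4-bit quantization

print(f"fp16: {fp16_gb:.1f} GB, 4-bit: {q4_gb:.1f} GB")  # ~7.6 GB vs ~1.9 GB
```

At 4-bit quantization the weights fit in under 2 GB, which is within reach of a typical consumer GPU or even plain CPU RAM — training is where the data-center-scale compute goes.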
You realize the current systems run in the cloud?
Well, yeah. You could slap Gemini onto Google Home today. You probably wouldn't even need a new device for that. The reason they don't do it is economical.
My point is that LLMs aren't replacing those devices. They're essentially the same thing; one is just a trimmed-down version of the other for economic reasons.