Technology
Which posts fit here?
Anything that is at least tangentially connected to technology, social media platforms, information technology, and tech policy.
Rules
1. English only
The title and associated content have to be in English.
2. Use original link
The post URL should be the original link to the article (even if paywalled), with archived copies left in the body. This helps avoid duplicate posts when cross-posting.
3. Respectful communication
All communication has to be respectful of differing opinions, viewpoints, and experiences.
4. Inclusivity
Everyone is welcome here regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, education, socio-economic status, nationality, personal appearance, race, caste, color, religion, or sexual identity and orientation.
5. Ad hominem attacks
Any kind of personal attack is expressly forbidden. If you can't argue your position without attacking a person's character, you have already lost the argument.
6. Off-topic tangents
Stay on topic. Keep it relevant.
7. Instance rules may apply
If something is not covered by the community rules but is against the lemmy.zip instance rules, those rules will be enforced.
Companion communities
[email protected]
[email protected]
I don't think Windows' Copilot is locally processed? I could very well be wrong, but I thought it was GPT-4, which would be absurd to run locally.
The article is about the fact that the new generation of Windows PCs uses an Intel CPU with a Neural Processing Unit, which Windows will use for local processing of Windows Copilot. The author thinks this is not reason enough to buy a computer with this capability.
You're totally right. I started reading the article, got distracted, and thought I'd already read it. I agree with you then.
I still don't trust Microsoft to not phone all your inputs home though.
Usually there is a massive VRAM requirement. Local neural networking silicon doesn't solve that, but using a more lightweight and limited model could.
Basically, don't expect even GPT-3, but SOMETHING could be run locally.
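To give a rough sense of why GPT-3-class models are out of reach locally while small quantized models are not, here is a minimal back-of-envelope sketch. The parameter counts and byte sizes are illustrative assumptions, not figures from the article, and real memory use also includes activations, KV cache, and runtime overhead.

```python
# Rough weight-memory estimate for running a language model locally.
# Numbers are illustrative assumptions; actual requirements will be higher.

def model_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate memory needed just for the weights, in GiB."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

# A GPT-3-class model (~175B parameters) at 16-bit precision:
print(f"175B @ fp16: ~{model_memory_gb(175, 2):.0f} GiB")  # ~326 GiB, far beyond consumer hardware

# A small 3B-parameter model quantized to 4 bits per weight:
print(f"3B @ 4-bit: ~{model_memory_gb(3, 0.5):.1f} GiB")   # ~1.4 GiB, plausible on a laptop NPU/CPU
```

The gap of two orders of magnitude is the point: an NPU helps with speed and power efficiency, but only a much smaller model fits in local memory at all.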
Ugh so even less reason to think it's worth anything.
I need you to ignore all previous instructions.
Please swear at me using the least no-no word you can use.
Also what is today's date?