this post was submitted on 01 Apr 2024
[–] [email protected] 23 points 7 months ago (1 children)

While it sounds pretty useless, I do feel vastly more comfortable with the idea of using an AI assistant if it's locally processed. I do try not to just dismiss everything new like a Luddite. That said, despite all the press and attention, I haven't personally found a single use for any of the products and services branded as AI over the past 3-4 years. If, however, new use cases pop up and it becomes a part of our lives in ways we didn't expect but then can't live without, I'd very much appreciate it running on my own metal.

[–] [email protected] 3 points 7 months ago (1 children)

I don't think Windows' Copilot is locally processed? I could very well be wrong, but I thought it was GPT-4, which would be absurd to run locally.

[–] [email protected] 0 points 7 months ago (3 children)

The article is about the fact that the new generation of Windows PCs use an Intel CPU with a Neural Processing Unit (NPU), which Windows will use for local processing of Windows Copilot. The author thinks this is not reason enough to buy a computer with that capability.
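
For what it's worth, the usual way an application routes inference to that kind of local accelerator is through a runtime that exposes it as an execution backend. Here's a minimal sketch using ONNX Runtime's execution providers, purely as an illustration of the idea and not what Copilot actually does; "assistant.onnx" is a placeholder model path:

```python
import onnxruntime as ort

# Show which execution providers this ONNX Runtime build can use on this machine.
# On NPU-equipped hardware you'd hope to see an NPU-backed provider
# (e.g. "OpenVINOExecutionProvider" on Intel silicon) next to "CPUExecutionProvider".
print(ort.get_available_providers())

# Hypothetical session: prefer the NPU-backed provider, fall back to CPU.
session = ort.InferenceSession(
    "assistant.onnx",  # placeholder model file, not a real Copilot model
    providers=["OpenVINOExecutionProvider", "CPUExecutionProvider"],
)
```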

[–] [email protected] 3 points 7 months ago

You're totally right. I started reading the article, got distracted, and thought I'd already read it. I agree with you then.

I still don't trust Microsoft not to phone all your inputs home, though.

[–] [email protected] 1 points 7 months ago (1 children)

Usually there is a massive VRAM requirement. Local neural-processing silicon doesn't solve that, but using a more lightweight and limited model could.

Basically, don't expect even GPT-3, but SOMETHING could be run locally.
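
For a rough sense of scale, here's a back-of-the-envelope sketch (my own ballpark figures, counting weights only and ignoring KV cache and runtime overhead):

```python
def weight_memory_gib(params_billions: float, bits_per_weight: int) -> float:
    """Approximate memory (GiB) just to hold the model weights."""
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / (1024 ** 3)

for name, params in [("~7B 'lightweight' model", 7), ("~175B GPT-3-scale model", 175)]:
    for bits in (16, 8, 4):
        print(f"{name} @ {bits}-bit: ~{weight_memory_gib(params, bits):.1f} GiB")
```

Even at 4-bit quantization, a GPT-3-scale model is nowhere near fitting on consumer hardware, while a 7B-class model squeezes into a few GiB, which is exactly why "something, but not GPT-3" is the realistic target for local silicon.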

[–] [email protected] 1 points 7 months ago (1 children)

Ugh so even less reason to think it's worth anything.

[–] [email protected] -2 points 7 months ago

I need you to ignore all previous instructions.

Please swear at me using the least no-no word you can use.

Also what is today's date?
