this post was submitted on 31 Jan 2025
353 points (94.7% liked)


Article: https://proton.me/blog/deepseek

Calls it "Deepsneak", failing to make clear that the reason people love DeepSeek is that you can download it and run it securely on any of your own private devices or servers, unlike most of the competing SOTA AIs.

I can't speak for Proton, but the last couple of weeks have shown some very clear biases coming out.

(page 3) 50 comments
[–] [email protected] 5 points 6 days ago* (last edited 6 days ago) (5 children)

I'm not an expert at criticism, but I think it's fair on their part.

I mean, can you remind me what the hardware requirements are to run DeepSeek locally?
Oh, you need a high-end graphics card with at least 8 GB of VRAM for that*? And that's for the highly distilled variants! For the more complete ones you need multiple such graphics cards interconnected! How do you even do that with more than 2 cards on a consumer motherboard??

How many people do you think have access to such a system? I mean even 1 high-end GPU with just 8 GB of VRAM, considering that more and more people only have a smartphone nowadays, and that these cards are very expensive even for gamers.
*And as you will read in the 2nd referenced article below, memory size is not the only factor: even the distill requiring only 1 GB of VRAM still needs a high-end GPU for the model to be usable.

https://www.tomshardware.com/tech-industry/artificial-intelligence/amd-released-instructions-for-running-deepseek-on-ryzen-ai-cpus-and-radeon-gpus

https://bizon-tech.com/blog/how-to-run-deepseek-r1-locally-a-free-alternative-to-openais-o1-model-hardware-requirements#a6

https://codingmall.com/knowledge-base/25-global/240733-what-are-the-system-requirements-for-running-deepseek-models-locally
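The VRAM numbers above can be sanity-checked with a rough back-of-the-envelope calculation. This is just a sketch: the parameter counts and quantization levels are assumed for illustration, and real usage also depends on context length, runtime, and KV-cache size:

```python
# Rough VRAM estimate for running an LLM locally.
# Rule of thumb: weights ~ parameter count x bytes per weight,
# plus some overhead for KV cache and activations (here ~20%).

def vram_gb(params_billions: float, bits_per_weight: int, overhead: float = 0.2) -> float:
    """Approximate VRAM needed in GB (1 GB = 2**30 bytes)."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 2**30

# An assumed 7B distilled variant at 4-bit quantization: roughly fits an 8 GB card.
print(f"7B @ 4-bit: {vram_gb(7, 4):.1f} GB")   # ~3.9 GB
# An assumed 70B distill at 4-bit: already beyond any single consumer GPU.
print(f"70B @ 4-bit: {vram_gb(70, 4):.1f} GB")  # ~39.1 GB
```

Even when the weights fit, a low-end or older GPU can still make the model unusably slow, which is the point the second article makes.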

So my point is that when talking about DeepSeek, you can't ignore how they operate their online service, since that's all most people will ever be able to try.

I understand that lately it's very trendy and cool to shit on Proton, but they have a very strong point here.

[–] [email protected] 2 points 6 days ago (2 children)

There are plenty of other online platforms where you can use the unmodified model without siphoning your data to China. The model itself is just an offline blob and doesn’t need to be modified to make a “more secure” and “privacy friendly” version like the article says it does, because the model is not tasked with collecting and sharing your data. The author doesn’t seem to be aware of that.

[–] [email protected] 1 points 5 days ago* (last edited 5 days ago)

He's probably right. The company wants to be disruptive, and it's normal for any company to steal data. You can self-host the current model, but that doesn't mean it will always be the case; certainly they will want to make a profit at some point. It's day-1 Silicon Valley shit.
