this post was submitted on 14 Jun 2023
14 points (93.8% liked)
PC Gaming
8553 readers
536 users here now
For PC gaming news and discussion. PCGamingWiki
Rules:
- Be Respectful.
- No Spam or Porn.
- No Advertising.
- No Memes.
- No Tech Support.
- No questions about buying/building computers.
- No game suggestions, friend requests, surveys, or begging.
- No Let's Plays, streams, highlight reels/montages, random videos or shorts.
- No off-topic posts/comments.
- Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source if possible, unless from paywalled or non-english sources. If the title is clickbait or lacks context you may lightly edit the title.)
founded 1 year ago
MODERATORS
I'd bump the RAM up to 64GB. It's overkill for most use cases, but it future-proofs your device. Also, if you have any interest in AI, you can run some of the open-source large language models pretty comfortably on the CPU with 64GB of RAM (see the llama.cpp project).
OOh, now there's an idea. I'm always looking for new things to play around with. Thanks!
I mean, if you want to future proof you could go with ddr5 ram but then you would need to change your motherboard and a bunch of other stuff I imagine.
To be clear, you can probably already run some medium-sized models on 32GB using llama.cpp, since it's been optimized a ton over the past few months - but I know that early on, some models were eating up 48GB of my RAM.
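If you're curious why quantized models fit in so much less RAM, here's a rough back-of-envelope sketch. The overhead factor and the numbers are my own ballpark assumptions, not llama.cpp's actual accounting - real usage also depends on context size and the specific quantization format:

```python
# Back-of-envelope RAM estimate for running an LLM on CPU.
# Assumption: the weights dominate memory use; the overhead
# factor (~20%) for KV cache etc. is a rough guess, not measured.

def model_ram_gb(params_billions: float, bits_per_weight: float,
                 overhead: float = 1.2) -> float:
    """Approximate resident RAM in GB for a model of the given size."""
    bytes_per_weight = bits_per_weight / 8
    return params_billions * 1e9 * bytes_per_weight * overhead / 1e9

# A hypothetical 13B-parameter model, 4-bit quantized vs. fp16:
print(round(model_ram_gb(13, 4), 1))   # ~7.8 GB - easily fits in 32GB
print(round(model_ram_gb(13, 16), 1))  # ~31.2 GB - why unquantized runs hurt
```

The takeaway is just that 4-bit quantization cuts weight memory to a quarter of fp16, which is why the same machine went from struggling to comfortable.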