this post was submitted on 13 Apr 2024
Technology
Wow! 13GB! I've done some heavy stuff on my computer, with a shit ton of Docker containers running together plus deployments, and I never reached 13GB!
Without disclosing private company information, lol, what are you doing? ;)
Not OP, but I have to run the frontend and backend of a project in Docker simultaneously (multiple Postgres and Redis DBs, queues, search index, etc., plus two webservers). With a few browser tabs and two VSCode instances open on top of that, it regularly pushes my machine over 15GB of RAM usage.
pretty much like this
That is basically my use-case. You add a DB service (or two), DNS, reverse proxy, Redis, Memcached, etc... maybe some containers for additional proprietary backend services like APIs, and then the applications themselves that need all of those things to run... it adds up FAST. The advantage is that you can have multiple projects all running simultaneously, and you can add/remove/swap them pretty easily.
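A stack like that is usually wired up in one Compose file. This is a hypothetical `docker-compose.yml` sketch of the kind of dev stack described above; the service names, images, and project layout are illustrative assumptions, not anything from this thread:

```yaml
# Illustrative multi-service dev stack -- each service is its own container,
# and each one claims its own slice of RAM.
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: dev   # dev-only credential, assumed for the sketch
  cache:
    image: redis:7
  memcached:
    image: memcached:1.6
  search:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.13.0
    environment:
      discovery.type: single-node   # single-node dev mode; the JVM heap alone can take 1-2GB
  proxy:
    image: nginx:alpine   # reverse proxy in front of the app servers
    ports:
      - "8080:80"
  api:
    build: ./backend    # assumed project layout
  web:
    build: ./frontend   # assumed project layout
```

`docker compose up -d` brings the whole stack up, and `docker stats --no-stream` shows per-container memory, which is the quickest way to see where those gigabytes actually go.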
RAM is cheap. There is no excuse for shipping an 8GB computer... even if it's mostly going to be used for family photos and the internet.
Running a suite of services in containers (DBs, DNS, reverse proxy, Memcached, Redis, Elasticsearch, shared services, etc.) plus a number of discrete applications that use all those things. My day-to-day usage hovers around 20GB, with spikes to 32 (my max allocation) when I run parallelized test suites.
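The "max allocation" mentioned above is presumably the cap on the VM that Docker Desktop runs containers in. On macOS it's set in Docker Desktop's Resources settings; on Windows with the WSL2 backend it comes from `%UserProfile%\.wslconfig`. A minimal example, assuming a 32GB cap like the one described:

```ini
; %UserProfile%\.wslconfig -- limits the WSL2 VM that backs Docker Desktop
[wsl2]
memory=32GB
```

Containers can use RAM freely up to that VM-wide cap, so one memory-hungry test run can push the whole VM to its limit.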
Docker's memory usage really adds up fast.