this post was submitted on 24 Jun 2024
743 points (95.8% liked)
Technology
58303 readers
I still can't fully accept that 1GB is not normal, 2GB is not very good, and 4GB is not all you're ever gonna need.
If only it got bloated for some good reasons.
I remember when I got my first computer with 1GB of RAM, where my previous computer had 64MB, later upgraded to 192MB. And there were only like 3 or 4 years in between them.
It was like: holy shit, now I can put all the things in RAM. I will never run out.
The moment you use a file that is bigger than 1GB, that computer will explode.
Some of us do more than just browse Lemmy.
Wow. Have you ever considered how people worked with files bigger than their total RAM back in the normal days of computing?
So, in your opinion, if you have a 2GB+ log file, editing it should occupy 2GB of RAM?
I just have no words. The ignorance.
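For what it's worth, the classic way to handle a log file bigger than RAM is to stream it in fixed-size chunks, so memory use stays constant no matter how big the file is. A minimal sketch in Python (the `count_matches` helper and the sample file are made up for illustration):

```python
import os
import tempfile

def count_matches(path, needle, chunk_size=1 << 20):
    """Scan a file of any size while holding at most ~chunk_size bytes.

    This is why working with a 2GB+ log never required 2GB of RAM:
    tools read a window at a time instead of loading the whole file.
    """
    count = 0
    carry = b""  # tail bytes, in case a match spans two chunks
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            data = carry + chunk
            count += data.count(needle)
            # keep len(needle)-1 trailing bytes; a full match can't fit
            # inside the carry alone, so nothing is counted twice
            carry = data[-(len(needle) - 1):] if len(needle) > 1 else b""
    return count

# demo with a small temp file standing in for a multi-gigabyte log
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"ERROR one\nok\nERROR two\n" * 1000)
    path = tmp.name

n = count_matches(path, b"ERROR")
print(n)  # 2000
os.remove(path)
```

Peak memory here is roughly one chunk plus the carry, regardless of file size.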
High quality content is the reason. Sit in a terminal and your memory usage will be low.
So we're just going to ignore stuff like Electron, unoptimized assets, etc... Basically every other known problem... Yeah let's just ignore all that
Is Electron that bad? Really? I have Slack open right now with two servers and it takes around 350MB of RAM. Not that bad, considering that every other colleague thinks that posting dumb shit GIFs into work chats is cool. That's definitely nowhere close to Firefox, Chrome and WebStorm eating multiple gigs each.
Yes, it really is that bad. 350MB of RAM for something that could otherwise have taken less than 100? That isn't bad to you? And it's not just RAM, it's every resource, including CPU, which is especially bad with Electron.
I don't really mind Electron myself because I have enough resources. But pretending the lack of optimization isn't a real problem is just not right.
First of all, 350MB is a drop in the bucket. But what's more important is performance, because it affects things like power consumption, carbon emissions, etc. I'd rather see Slack "eating" one gig of RAM and running smoothly on a single E-core below boost clocks with pretty much zero CPU use. That's the whole point of having fast memory: so you can cache and pre-render as much as possible and let it rest statically in memory.
CPU usage is famously terrible with Electron, which I also pointed out in the comment you're replying to. But yes, having multiple Chromium instances running for each "app" is terrible.
No, it's not.
... Okay?
Do you really want me to go into the details of how JIT works in V8 and which Electron APIs allow the apps to idle correctly?
Yes it is.
"iT'S oNLy a FeW hUnDrED MB oF LiBRAriES and BiNAriES pEr aPp, iT'S oNLy dOuBLe oR tRiPLe tHe RAM, DiSk, anD cpU uSAgE"
Then we have the fucking shit show of 6-8GB of RAM used just by booting the fucking machine. Chromium/WebKit is practically an OS by itself with all the I/O, media handling, and built-in libraries upon libraries of shit. Let's run that whole entire stack for all these Electron apps, and then fragment each one independently of the others (hello Discord, who used Electron 12 for WAY too long), then say "bUt iT's pORtaBLe!".
Yes, it isn't just terrible, it's fucking obnoxiously and horrendously terrible, like we snatched defeat from the jaws of victory terrible, and moronically insipid. Optimization in the fucking trash can and a fire hydrant in all our fucking assholes terrible. That's HOW terrible it actually is, so you're wrong.
RAM usage doesn't matter in the slightest.
People don't run just a single app in their machines. If we triple ram usage of several apps, it results in a massive increase. That's how bloat happens, it's a cumulative increase on everything. If we analyze single cases, we could say that they're not that bad individually, but the end result is the necessity for a constant and fast increase in hardware resources.
That's not bloat, that's people running more apps than ever.
That's not true. 8 to 16GB RAM machines became common in the early 2010s, and barely anyone is using 32 gigs today. Even if we look at the most recent Steam Hardware & Software Survey, we will see that even gamers are pretty much stuck with 16 gigs. 32 gigs are installed on less than 30% of machines, and more than that is barely 4%. Ten years ago 8 gigs was the most common option, with 12+ gigs (Steam didn't have a 16-gig category in 2014) being the third option. The switch to 16 gigs being number one happened in December 2019, so we're five years in with 16 gigs being the most common option, and more RAM is not getting anywhere close to replacing it (47.08% for 16 gigs and 28.72% for 32 gigs as of May 2024).
Now if you look at the late '90s and 2000s, you will see that RAM was doubling pretty much every 2-3 years. We can look at Steam data once again. Back in 2008 (that's the earliest data available on archive.org), 2 gigs were the most common option. The next year, the 3-gig option got very close and sat in 2nd place. In 2010, 2GB, 3GB, and 4GB were neck and neck. The 4GB option became the most common in 2011, with the 3GB variant a very close 2nd. The 5GB option became king in 2012. And the very next year, 8 gigs became the norm.
So: 2 gigs in 2008, 4 gigs in 2011, and 8 gigs in 2013. You can check the historical data yourself here: https://web.archive.org/web/20130915000000*/http://store.steampowered.com/hwsurvey/
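The slowdown those numbers describe is easy to quantify. A quick back-of-the-envelope calculation in Python (the `doubling_time` helper is just for illustration, using the survey milestones above):

```python
import math

def doubling_time(start_gb, end_gb, start_year, end_year):
    """Years per doubling implied by two data points."""
    doublings = math.log2(end_gb / start_gb)
    return (end_year - start_year) / doublings

# 2 gigs most common in 2008, 8 gigs by 2013 (per the Steam survey data)
pace = doubling_time(2, 8, 2008, 2013)
print(pace)  # 2.5 years per doubling

# at that pace, 16 gigs (on top since December 2019) should have given
# way to 32 gigs around mid-2021 -- instead it's still number one in 2024
print(2019 + pace)  # 2021.5
```

Five years at the top for a single RAM tier simply has no precedent in the 2008-2013 data.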
Not necessarily. People were writing text documents while looking up references on the internet, listening to music, and chatting with friends, all at the same time, back in 2010 and even earlier. But the same use case (office suite + browser + music player + chat app) takes much more resources today, with only a small increase in usability and features.
Bloat is a complicated thing to discuss because there's no hard definition of it, and each person thinks about it in a different way, so what one person considers bloat another may not, and we end up talking about different things. You're right that hardware resources have been increasing at a slower rate, and it may force some more optimization, but a lot of software is still getting heavier without bringing new functionality.
The software is getting heavier because of content, not code. Again, we can look at games. Take some old games like GTA V or Skyrim: they will fly on modern high-end machines! Now add mods with 8K textures, higher-definition models, HDR support, etc., and these old games will bring your RTX 4090 to its knees.
Content is also getting heavier, but both things aren't mutually exclusive. It's more objective to compare modern software, instead of older and newer ones. Before reddit created obstacles for third-party apps, they were famous for being much lighter than the official one, while doing the same (some even had more features). Now, if we compare lemmy to reddit, it's also much lighter, while providing a very similar experience. Telegram has a desktop app that does everything the web version does, and more, while lighter on resources. Most linux distros will work fine with far less hardware resources than windows. If you install lineageos on an older phone, it will perform better than the stock rom, even while using a newer aosp version. If you play a video on youtube, and the same one on vlc, vlc will do the same with less resources. If you use most sites with and without content blockers, the second one will be lighter, while not losing anything important.
I could go on and on, but that's enough examples. There is a bloat component to software getting heavier, and not everything can be explained by heavier content and more features.
Just wanted to point out that the number one performance blocker for the CPU is memory. In the general case, if you're wasting memory, you're wasting CPU. These two things really can't be talked about in isolation.
No, that's the other way round. You either have high CPU load and low memory, or low CPU load and high memory.
I'm not sure what metric you're using to determine this. The bottom line is, if you're trying to get the CPU to really fly, using memory efficiently is just as important (if not more so) than the actual instructions you send to it. The reason is the high latency of going out to external memory. This is performance 101.
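The latency argument can be demonstrated even from a high-level language: summing the same values in sequential versus random order does identical arithmetic, but the random pattern defeats the caches and the prefetcher. A rough sketch (timings are machine-dependent, so treat the numbers as illustrative):

```python
import random
import time

N = 2_000_000
data = list(range(N))

seq_order = list(range(N))
rnd_order = seq_order[:]
random.shuffle(rnd_order)

def sum_in_order(values, order):
    total = 0
    for i in order:
        total += values[i]
    return total

t0 = time.perf_counter()
s_seq = sum_in_order(data, seq_order)
t1 = time.perf_counter()
s_rnd = sum_in_order(data, rnd_order)
t2 = time.perf_counter()

# same number of additions, same result -- only the access pattern differs
assert s_seq == s_rnd
print(f"sequential: {t1 - t0:.3f}s  random: {t2 - t1:.3f}s")
```

The random-order pass is typically noticeably slower despite doing exactly the same work; in C or Rust, where object indirection doesn't dominate, the gap from cache misses alone is larger still.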
When (according to about:unloads) my average Firefox tab is 70-230MB depending on what it is and how old it is (YouTube tabs, for example, bloat up the longer they're open), a chat app using over 350MB is a pretty big deal.
Just checked: my Firefox is using 4.5GB of RAM, while Telegram is using 2.3GB while minimized to the system tray. Granted, Telegram doesn't use Electron, but this is a trend across lots of programs, and Electron is a big enough offender that I avoid apps using it. When I get off shift I can launch Discord and check it too, but it's usually bad enough that I close it entirely when not in use.
Telegram is using only 66 megs here. Again - it's about content.
Well, that other guy said it's only 66 megs, so you're wrong.
/s
Again - content.
If a program has to keep in RAM all the things you are not currently, as in right now, displaying or editing, then its author shouldn't be in the profession. That also applies to the portions of a huge text file or a huge image you aren't touching right now.
EDIT: Thankfully people writing video players usually understand this.
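That's essentially windowed access: seek to the visible region and read only it. A toy sketch in Python (`read_window` is a hypothetical helper; real editors and video players add caching and read-ahead on top of this idea):

```python
import os
import tempfile

def read_window(path, offset, length):
    """Return only the requested slice of a file, however large the file is."""
    with open(path, "rb") as f:
        f.seek(offset)       # jump straight to the visible region
        return f.read(length)

# demo: a small file stands in for a multi-gigabyte text file or video
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"x" * 10_000 + b"VISIBLE PART" + b"x" * 10_000)
    path = tmp.name

window = read_window(path, 10_000, 12)
print(window)  # b'VISIBLE PART'
os.remove(path)
```

Memory use is bounded by the window size, not the file size, which is exactly why a video player never needs the whole file in RAM.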
Again, you're wrong.
Ahaha, ok!
It sure is. I'm running Ferdium at this very moment with 3 chat apps open, and it consumes almost a gigabyte for something that could take just a few megabytes.
What's wrong with using GIFs in work chat lmao, gotta laugh or smile while hating your job like the rest of us.
Get a better job.
256MB or 512MB was fine for high-quality content in 2002; what was that, then?
Suppose the amount of pixels and everything quadrupled - OK, then 2GB it is.
But 4GB being not enough? Do you realize what 4GB is?
They didn't just quadruple. They're orders of magnitude higher these days. So content is a real thing.
But that's not what's actually being discussed here, memory usage these days is much more of a problem caused by bad practices rather than just content.
I know. BTW, if something is done an order of magnitude less efficiently than it could be, and it has been, one might consider it the result of an intentional policy aimed at neutering development. Just not clear whose. There are fewer corporations affecting this than big governments, and those are capable of reaching consensus from time to time. So it's not a conspiracy theory.
One frame for a 4K monitor takes 33MB of memory. You need three of them for the triple buffering used back in 2002, so half of your 256MB would go to simply displaying a bloody UI. But there's more! Today we're using viewport composition, so the more apps you run, the more memory you need just to display the UI. Now, this is what the OS will use to render the final result, but your app will use additional memory for high-res icons, fonts, photos, videos, etc. 4GB today is nothing.
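The framebuffer arithmetic is easy to check: a 4K frame at 4 bytes per pixel (32-bit RGBA, a common assumption) comes to about 33MB, and triple buffering needs three of them:

```python
WIDTH, HEIGHT = 3840, 2160   # 4K UHD
BYTES_PER_PIXEL = 4          # 32-bit RGBA

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
print(frame_bytes / 1_000_000)      # 33.1776 -> ~33MB per frame
print(3 * frame_bytes / 1_000_000)  # 99.5328 -> ~100MB for triple buffering
```

So three 4K buffers alone would have eaten well over a third of a 256MB machine, before any application memory.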
I can tell you an anecdote. My partner was making a set of photo collages, about 7 art works to be printed in large format (think 5m+ per side). So 7 photo collages with source material saved on an external drive took 500 gigs. Tell me more about 256MB, lol.
Yes, you wouldn't have 4K in 2002.
My normal usage would be kinda strained with it, but possible.
I can do a cold boot and show you empty RAM as well. So fucking what?
It's not a cold boot and it's not empty.