[–] FizzyOrange 1 points 1 day ago (1 children)

In my experience, taking an inefficient format and copping out with "we can just compress it" is always rubbish. Compression tends to be slow, rules out sparse reads, is awkward to deal with remotely, and you usually end up with the inefficient decompressed data in the end anyway, whether in temporarily decompressed files or in memory.
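Rough sketch of what I mean by ruling out sparse reads, assuming a gzipped JSON dump (the file name and structure here are made up):

```python
import gzip
import json

# To get at one record in a compressed JSON dump you have to
# decompress and parse the whole thing into memory anyway.
with gzip.open("profile.json.gz", "rt") as f:
    data = json.load(f)          # full decompress + full parse
sample = data["samples"][12345]  # ...just to look at one entry
```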

I worked at a company where they went against my recommendation not to use JSON for memory profiler output. We ended up with 10 GB JSON files; even compressed, they were super annoying.

We switched to SQLite in the end, which was far superior.
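For comparison, roughly the kind of query that becomes cheap with SQLite (table and column names invented for illustration):

```python
import sqlite3

con = sqlite3.connect("profile.db")
# Indexed storage lets you read just the rows you care about,
# instead of decompressing and parsing the whole file first.
rows = con.execute(
    "SELECT timestamp, bytes_allocated FROM samples"
    " WHERE callsite = ? ORDER BY timestamp LIMIT 100",
    ("my_function",),
).fetchall()
con.close()
```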

[–] [email protected] 1 points 1 day ago (1 children)

Of course compression isn't a good solution for this stuff. The point of the comment was just to show how unremarkable the original claim was.

[–] FizzyOrange 1 points 21 hours ago