this post was submitted on 11 May 2025
819 points (97.5% liked)

Programmer Humor

23202 readers
787 users here now

Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code there's also Programming Horror.

founded 2 years ago
[–] [email protected] 26 points 13 hours ago (2 children)

Good luck with your 256 characters.

[–] [email protected] 26 points 12 hours ago (1 children)

When you run out of characters, you simply create another 0 byte file to encode the rest.

Check mate, storage manufacturers.
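The scheme above can actually be sketched. A toy version in Python, assuming a POSIX filesystem; it uses base32 rather than base64 because base64's `/` is illegal in file names, and the 150-byte chunk size keeps each name under the usual 255-byte limit:

```python
import base64, os, tempfile

def store(data: bytes, directory: str, chunk: int = 150) -> None:
    """Encode data into the *names* of zero-byte files: <index>_<base32>."""
    for n, i in enumerate(range(0, len(data), chunk)):
        name = f"{n:06d}_" + base64.b32encode(data[i:i + chunk]).decode()
        open(os.path.join(directory, name), "w").close()  # the file itself: 0 bytes

def load(directory: str) -> bytes:
    parts = sorted(os.listdir(directory))  # zero-padded indices sort correctly
    return b"".join(base64.b32decode(p.split("_", 1)[1]) for p in parts)

d = tempfile.mkdtemp()
store(b"check mate, storage manufacturers", d)
print(load(d))
```

Each 150-byte chunk becomes a 247-character name, so the payload really does live entirely in directory metadata, which is exactly why the MFT in the reply below balloons.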

[–] [email protected] 12 points 12 hours ago* (last edited 11 hours ago)

File name file system! Looks like we broke the universe! Wait, why is my MFT so large?!

[–] [email protected] 12 points 12 hours ago* (last edited 12 hours ago) (1 children)

255 bytes, generally, because of null termination. ZFS allows 1023, the argument being not "people should have long filenames" but "Unicode exists"; ReiserFS allowed 4032, Reiser4 3976. Not that anyone uses Reiser any more. Linux's PATH_MAX of 4096 still applies to the full path, though in the end that's just a POSIX define; I'm not sure whether that limit is actually enforced by open(2). The man page mentions ENAMETOOLONG but doesn't give a maximum.

It's not that filesystems couldn't support more, it's that FS people consider it pointless. ZFS does, in principle, support gigantic file metadata, but using it would break use cases like having a separate vdev for your volume's metadata. What's the point of having (effectively) separate index drives when your data drives are empty?
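The 255-byte component limit is easy to observe directly: the kernel rejects a longer name at open(2) time with ENAMETOOLONG. A quick probe, assuming a typical Linux filesystem with a NAME_MAX of 255:

```python
import errno, os, tempfile

d = tempfile.mkdtemp()
ok = os.path.join(d, "a" * 255)   # exactly NAME_MAX on common Linux filesystems
open(ok, "w").close()             # fine

err = None
try:
    open(os.path.join(d, "a" * 256), "w").close()  # one byte over the limit
except OSError as e:
    err = e.errno
print(err == errno.ENAMETOOLONG)
```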

[–] [email protected] 1 points 5 hours ago* (last edited 5 hours ago) (1 children)

...Just asking, just asking: Why is the default FILENAME_MAX on Linux/glibc 4096?

[–] [email protected] 1 points 35 minutes ago

Because PATH_MAX is? Also because it's a 4k page.

FILENAME_MAX is not safe to use for buffer allocations, by the way; it could be INT_MAX.
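Rather than trusting compile-time constants, you can ask the filesystem itself via pathconf(3), exposed in Python as os.pathconf. A small sketch (the values are per-filesystem, so yours may differ; /tmp is just an example path):

```python
import os

# Query the actual per-filesystem limits instead of compile-time macros.
name_max = os.pathconf("/tmp", "PC_NAME_MAX")  # max bytes in one path component
path_max = os.pathconf("/tmp", "PC_PATH_MAX")  # max bytes in a relative path
print(name_max, path_max)  # commonly 255 and 4096 on Linux
```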

[–] [email protected] 15 points 13 hours ago (1 children)

I remember the first time I ran out of inodes: it was very confusing. You just start getting ENOSPC, but df still says you have half the disk space available.
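For anyone hitting this today: statvfs(3) reports inode counts alongside block counts, which is what `df -i` reads. A quick check, using / as an example mount point:

```python
import os

# statvfs exposes both block usage and inode usage; plain `df` only shows the
# former, which is why an inode-exhausted disk still looks half empty.
st = os.statvfs("/")
print("free blocks:", st.f_bavail, "of", st.f_blocks)
print("free inodes:", st.f_favail, "of", st.f_files)  # `df -i` shows these
```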

[–] [email protected] 4 points 12 hours ago

Ah memories. That was an interesting lesson.

[–] [email protected] 43 points 17 hours ago (4 children)

You want real infinite storage space? Here you go: https://github.com/philipl/pifs
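For the uninitiated, πfs's pitch is that every file already "occurs" somewhere in π, so you only need to store the offset. A toy version over a short hardcoded digit string (the function names are mine, and real πfs computes digits on demand rather than hardcoding them):

```python
# Toy version of the pifs idea: a "file" is just an offset into pi's digits.
PI = "141592653589793238462643383279502884197169399375105820974944"

def pi_store(data: str) -> int:
    """Return the offset where `data` occurs in pi; that offset IS the file."""
    off = PI.find(data)
    if off < 0:
        raise ValueError("need more digits of pi")
    return off

def pi_load(offset: int, length: int) -> str:
    return PI[offset:offset + length]

off = pi_store("26")
print(off, pi_load(off, 2))
```

The catch, of course: the offset typically needs at least as many bits as the data it locates, so you trade storage for compute and then lose on storage anyway.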

[–] [email protected] 2 points 22 minutes ago

Finally someone uses the fact that compute time is so much cheaper than storage!

[–] [email protected] 7 points 14 hours ago* (last edited 14 hours ago)

That's awesome! I'm just migrating all my data to πfs. Finally mathematics is put to a proper use!

[–] [email protected] 4 points 14 hours ago

Breakthrough vibes

[–] [email protected] 50 points 19 hours ago (9 children)

I once had a manager tell me, during a casual conversation and with complete sincerity, that one day, with advancements in compression algorithms, we could get any file down to a single bit. I really didn't know what to say to that level of absurdity. I just nodded.

[–] [email protected] 9 points 12 hours ago* (last edited 12 hours ago)

You can give me any file, and I can create a compression algorithm that reduces it to 1 bit. (*)

(*) No guarantees about the size of the decompression algorithm or its efficacy on other files
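The footnote, formalized as a sketch (names invented; the compression ratio is exactly as advertised and exactly as useless):

```python
# A "compressor" that gets one chosen file down to a single token.
def make_compressor(the_file: bytes):
    def compress(data: bytes) -> bytes:
        return b"\x01" if data == the_file else data  # no gain on anything else
    def decompress(blob: bytes) -> bytes:
        return the_file if blob == b"\x01" else blob  # the file hides in here
    return compress, decompress

c, d = make_compressor(b"my 4GB movie")
print(c(b"my 4GB movie"))                        # b'\x01'
print(d(c(b"my 4GB movie")) == b"my 4GB movie")  # True
```

The round trip also silently breaks if the input happens to already be b"\x01", which is the footnote's "no guarantees on other files" in action.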

[–] [email protected] 22 points 15 hours ago* (last edited 15 hours ago)

That's the kind of manager who also tells you that you just lack creativity and vision when you say it's not possible. They also post regularly on LinkedIn.

[–] [email protected] 9 points 15 hours ago

You can have everything in a single bit, if the decompressor includes the whole universe.

[–] [email protected] 8 points 15 hours ago

Send him your work: 1 (or 0 ofc)

[–] [email protected] 3 points 12 hours ago* (last edited 10 hours ago) (1 children)

It's an interesting question, though. How far CAN you compress? At some point you've extracted all the information there is and pushed the density to its maximum, but what is that maximum?
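There's a real answer for lossless compression: the floor is the entropy of the source. A sketch of order-0 Shannon entropy in bits per byte, which is only a loose lower bound (real sources have structure across symbols, and for a single fixed file the true limit is its Kolmogorov complexity, which isn't computable):

```python
import math
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Order-0 Shannon entropy: lower bound for per-symbol lossless coding."""
    counts = Counter(data)
    n = len(data)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

print(entropy_bits_per_byte(b"aaaaaaaa"))        # 0.0: perfectly predictable
print(entropy_bits_per_byte(bytes(range(256))))  # 8.0: looks incompressible
```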

[–] [email protected] 3 points 11 hours ago

I think by the time we reach some future extreme of data density, it will be in a method of storage beyond our current understanding. It will be measured in coordinates or atoms or fractions of a dimension that we nullify.

[–] [email protected] 1 points 10 hours ago

How to tell someone you don't know how compression algorithms work, without telling them directly.

[–] [email protected] 5 points 16 hours ago

Just make a file system that maps each file name to 2 files. The 0 file and the 1 file.

Now with just a file name and 1 bit, you can have any file! The file is just 1 bit. It's the filesystem that needs more than that.

[–] [email protected] 7 points 19 hours ago

That’s precisely when you bet on it.

[–] [email protected] 21 points 23 hours ago (1 children)

It's like that chiptune webpage where the entire track is encoded in the URL.

[–] [email protected] 11 points 21 hours ago (2 children)
[–] PoolloverNathan 10 points 17 hours ago
[–] [email protected] 9 points 19 hours ago

Are you trying to get rickrolled?

[–] [email protected] 157 points 1 day ago (5 children)

If you have a tub full of water and a take a sip, you still have a tub full of water. Therefore only drink in small sips and you will have infinite water.

Water shortage is a scam.

[–] [email protected] 3 points 5 hours ago

If you have a water bottle and only drink half of it each time, you will also have infinite 💦
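Sadly the series converges: halving every sip forever still drains exactly one bottle. A quick check of the partial sums:

```python
# Drinking half the remaining water each sip: the total approaches 1 bottle.
total, remaining = 0.0, 1.0
for sip in range(50):
    drink = remaining / 2
    total += drink
    remaining -= drink
print(total, remaining)  # total approaches 1.0, remaining approaches 0.0
```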

[–] [email protected] 79 points 1 day ago (1 children)

It's all fun and games until your computer turns into a black hole because there's too much information in too small a volume.

[–] [email protected] 37 points 1 day ago (4 children)

Even better! According to the no-hiding theorem, you can't destroy information. With black holes you might even be able to recover the data as it leaks out through Hawking radiation.
Perfect for long-term storage.

[–] [email protected] 28 points 1 day ago (1 children)

Can't wait to hear news about a major site leaking user passwords through Hawking radiation.

[–] [email protected] 35 points 1 day ago* (last edited 1 day ago) (3 children)

Stupid, BUT: making the font bigger in LibreOffice saves space. Size 11 is readable, but changing the font size to something like 500 can save a few MB per page.
I don't know how it works, I just noticed it at some point.

Edit: I think it was KB, not MB

[–] [email protected] 11 points 19 hours ago

Have a macro that decreases all font size on opening and then increases all again before closing.

Follow me irl for more compression techniques.

[–] [email protected] 17 points 23 hours ago

per page

I mean, yes. obviously.

If you had 1000 bytes of text on one page before, you now have 1 byte per page across 1000 pages.

[–] [email protected] 6 points 1 day ago

You could always diff the XML before and after to see what's causing it.

[–] [email protected] 41 points 1 day ago (6 children)
[–] [email protected] 1 points 11 hours ago

Both people sound obnoxious lol

[–] [email protected] 3 points 19 hours ago (1 children)

I was sort of on the side of Mike Goldman (the challenge giver) until I saw the great point made at the end: the entire challenge was akin to a bar-room bet. Goldman had set it up as a kind of scam from the start and was clearly more than happy to take $100 from anyone who fell for it, so he should have taken responsibility when someone managed to meet the wording of his challenge.

[–] [email protected] 2 points 15 hours ago

Yeah, he was bamboozled as soon as he agreed to allow multiple separate files. The challenge was bs from the start, but he could have at least nailed it down with more explicit language and by forbidding any exceptions. I think it's kind of ironic that the instructions for a challenge related to different representations of information failed themselves to actually convey the intended information.

[–] [email protected] 1 points 15 hours ago

Nice read, thanks!

[–] [email protected] 115 points 1 day ago* (last edited 1 day ago) (2 children)

Awesome idea. In base64, to deal with all the funky characters.

It will be really nice to browse this filesystem...

[–] [email protected] 85 points 1 day ago

The design is very human

[–] [email protected] 93 points 1 day ago (3 children)

Broke: file names have a max character length.

Woke: split b64-encoded data into numbered parts and add .part-1..n suffix to each file name.
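Sketching the woke version: base64url-encode the payload (plain base64 emits `/`, which can't appear in a file name), chop it into name-sized pieces, and tag each with a .part-N suffix. Function names and the 255-byte budget are assumptions:

```python
import base64

def filename_chunks(data: bytes, max_name: int = 255) -> list[str]:
    """Split a base64url payload into file names of the form <data>.part-<n>."""
    b64 = base64.urlsafe_b64encode(data).decode()
    room = max_name - len(".part-0000")  # reserve space for the suffix
    return [f"{b64[i:i + room]}.part-{n:04d}"
            for n, i in enumerate(range(0, len(b64), room))]

def reassemble(names: list[str]) -> bytes:
    ordered = sorted(names, key=lambda s: s.rsplit("part-", 1)[1])
    payload = "".join(n.rsplit(".part-", 1)[0] for n in ordered)
    return base64.urlsafe_b64decode(payload)

names = filename_chunks(bytes(range(256)) * 3)
print(len(names), all(len(n) <= 255 for n in names))
```

Zero-padding the part numbers keeps a plain lexicographic sort correct up to 10,000 parts.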
