Good luck with your 256 characters.
Programmer Humor
Welcome to Programmer Humor!
This is a place where you can post jokes, memes, humor, etc. related to programming!
For sharing awful code there's also Programming Horror.
Rules
- Keep content in English
- No advertisements
- Posts must be related to programming or programmer topics
When you run out of characters, you simply create another 0 byte file to encode the rest.
Check mate, storage manufacturers.
File name file system! Looks like we broke the universe! Wait, why is my MFT so large?!
255, generally, because of null termination. ZFS does 1023, the argument not being "people should have long filenames" but "Unicode exists"; ReiserFS 4032, Reiser4 3976. Not that anyone uses Reiser anymore. Also Linux's PATH_MAX of 4096 still applies, though that's in the end just a POSIX define; I'm not sure whether that limit is actually enforced by open(2)... the man page speaks of ENAMETOOLONG but doesn't give a maximum.
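The open(2) question has a testable answer: on Linux the kernel does enforce the per-component limit and fails with ENAMETOOLONG. A quick sketch, querying the limits via pathconf so the numbers are whatever your filesystem actually reports (typically 255 and 4096 on ext4, not guaranteed):

```python
# Does open(2) actually enforce the name-length limit? Ask the kernel.
import errno
import os
import tempfile

with tempfile.TemporaryDirectory() as d:
    name_max = os.pathconf(d, "PC_NAME_MAX")   # per-filesystem; often 255
    path_max = os.pathconf(d, "PC_PATH_MAX")   # often 4096 on Linux
    print(f"NAME_MAX={name_max}, PATH_MAX={path_max}")
    try:
        # one byte over the per-component limit
        open(os.path.join(d, "x" * (name_max + 1)), "w")
    except OSError as e:
        print(e.errno == errno.ENAMETOOLONG)  # the kernel says no
```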
It's not like filesystems couldn't support it; it's that FS people consider it pointless. ZFS does, in principle, support gigantic file metadata, but using it would break use cases like having a separate vdev for your volume's metadata. What's the point of having (effectively) separate index drives when your data drives are empty?
...Just asking, just asking: why is the default FILENAME_MAX on Linux/glibc 4096?
Because PATH_MAX is? Also because it's a 4k page.
FILENAME_MAX is not safe to use for buffer allocations, btw; it could be INT_MAX.
I remember the first time I ran out of inodes: it was very confusing. You just start getting ENOSPC, but df still says you have half the disk space available.
Ah memories. That was an interesting lesson.
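That symptom is visible programmatically, too: statvfs reports free blocks and free inodes separately, and ENOSPC with plenty of free space means the inode column hit zero. A minimal sketch (checks `/`; any mount point works):

```python
# Free space vs. free inodes: you need both to create new files.
import os

st = os.statvfs("/")
free_bytes = st.f_bavail * st.f_frsize   # what "df -h" shows as available
free_inodes = st.f_favail                # what "df -i" shows as IFree
print(f"{free_bytes} bytes free, {free_inodes} inodes free")
# If free_inodes is 0, creating any new file fails with ENOSPC,
# no matter how large free_bytes is.
```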
You want real infinite storage space? Here you go: https://github.com/philipl/pifs
That's awesome! I'm just migrating all my data to πfs. Finally, mathematics is put to a proper use!
Breakthrough vibes
I had a manager once tell me during a casual conversation with complete sincerity that one day with advancements in compression algorithms we could get any file down to a single bit. I really didn't know what to say to that level of absurdity. I just nodded.
You can give me any file, and I can create a compression algorithm that reduces it to 1 bit. (*)
spoiler
(*) No guarantees about the size of the decompression algorithm or its efficacy on other files
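The footnote can be made literal. A toy sketch, where `TARGET` is a stand-in for whichever single file the "algorithm" was built for:

```python
# "Compress any one file to a single bit" -- by hiding the entire file
# inside the decompressor. The pigeonhole principle guarantees every
# other input must grow by at least the tag byte we prepend.
TARGET = b"the one file this algorithm was built for"  # hypothetical

def compress(data: bytes) -> bytes:
    return b"\x01" if data == TARGET else b"\x00" + data

def decompress(blob: bytes) -> bytes:
    return TARGET if blob == b"\x01" else blob[1:]

assert decompress(compress(TARGET)) == TARGET
assert decompress(compress(b"anything else")) == b"anything else"
```

This is also the short argument against the manager's single-bit dream: a lossless scheme that shrinks some inputs necessarily grows others.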
That's the kind of manager that also tells you that you just lack creativity and vision if you tell them that it's not possible. They also post regularly on LinkedIn
You can have everything in a single bit, if the decompressor includes the whole universe.
Send him your work: 1 (or 0 ofc)
It's an interesting question, though. How far CAN you compress? At some point you've extracted every information contained and increased the density to a maximum amount - but what is that density?
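There is an actual answer to that density question: under a fixed statistical model, Shannon entropy is the lower bound in bits per symbol, and for a single string the limit is its Kolmogorov complexity, which is uncomputable, so there's no effective general answer. A quick sketch of the order-0 (independent bytes) entropy bound:

```python
# Shannon entropy of a byte stream: the average bits-per-byte floor
# for any lossless code under an order-0 (i.i.d. bytes) model.
from collections import Counter
from math import log2

def entropy_bits_per_byte(data: bytes) -> float:
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * log2(c / n) for c in counts.values())

print(entropy_bits_per_byte(b"aaaaaaaa"))        # fully redundant: 0 bits/byte
print(entropy_bits_per_byte(bytes(range(256))))  # maximally mixed: 8 bits/byte
```

Real compressors beat the order-0 bound by modeling context (repeats, structure), but no model ever gets a typical random file below its entropy.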
I think by the time we reach some future extreme of data density, it will be in a method of storage beyond our current understanding. It will be measured in coordinates or atoms or fractions of a dimension that we nullify.
How to tell someone you don't know how compression algorithms work, without telling them directly.
Just make a file system that maps each file name to 2 files. The 0 file and the 1 file.
Now with just a filename and 1 bit, you can have any file! The file is just 1 bit. It's the filesystem that needs more than that.
That’s precisely when you bet on it.
It's like that chip tune webpage where the entire track is encoded in the url.
If you have a tub full of water and take a sip, you still have a tub full of water. Therefore only drink in small sips and you will have infinite water.
Water shortage is a scam.
If you have a water bottle and only drink half of it each time, you will also have infinite 💦
It's all fun and games until your computer turns into a black hole because there is too much information in too little of a volume.
Even better! According to the no-hiding theorem, you can't destroy information. With black holes you might even be able to recover the data as it leaks out through Hawking radiation.
Perfect for long term storage
Can't wait to hear news about a major site leaking user passwords through Hawking radiation.
Stupid, BUT: making the font in LibreOffice bigger saves space. Having 11 is readable, but by changing the font size to like 500 it can save some MB per page.
I don't know how it works, I just noticed it at some point.
Edit: I think it was KB, not MB.
Have a macro that decreases all font sizes on opening and then increases them all again before closing.
Follow me irl for more compression techniques.
per page
I mean, yes. obviously.
If you had 1000 bytes of text on 1 page before, you now have 1 byte per page on 1000 pages afterwards.
You could always diff the XML before and after to see what's causing it.
Reality is stranger than fiction:
Both people sound obnoxious lol
Nice read, thanks!
I was sort of on the side of Mike Goldman (the challenge giver) until I saw the great point made at the end: the entire challenge was akin to a barroom bet. Goldman had set it up as a kind of scam from the start and was clearly more than happy to take $100 from anyone who fell for it, so he should have taken responsibility when someone managed to meet the wording of his challenge.
Yeah, he was bamboozled as soon as he agreed to allow multiple separate files. The challenge was bs from the start, but he could have at least nailed it down with more explicit language and by forbidding any exceptions. I think it's kind of ironic that the instructions for a challenge related to different representations of information failed themselves to actually convey the intended information.
Awesome idea. In base 64 to deal with all the funky characters.
It will be really nice to browse this filesystem...
The design is very human
Broke: file names have a max character length.
Woke: split b64-encoded data into numbered parts and add .part-1..n suffix to each file name.
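The "woke" scheme actually works. A toy sketch of storing a blob entirely in the names of zero-byte files, using base64url (so `/` never appears in a name, per the "funky characters" concern), chunks of 200 characters to stay under the usual 255-byte name limit, and the suggested `.part-N` suffix:

```python
# Store a payload in file NAMES only; every file is zero bytes long.
import base64
import os

def write_nameonly(data: bytes, directory: str, chunk: int = 200) -> None:
    b64 = base64.urlsafe_b64encode(data).decode()
    for i in range(0, len(b64), chunk):
        # the file content is empty -- the payload lives in the name
        name = f"{b64[i:i + chunk]}.part-{i // chunk}"
        open(os.path.join(directory, name), "w").close()

def read_nameonly(directory: str) -> bytes:
    parts = sorted(os.listdir(directory),
                   key=lambda n: int(n.rsplit(".part-", 1)[1]))
    b64 = "".join(n.rsplit(".part-", 1)[0] for n in parts)
    return base64.urlsafe_b64decode(b64)
```

Check mate indeed, except the names still consume directory-entry and metadata space (hello, giant MFT), so the storage manufacturers get paid either way.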