this post was submitted on 20 Oct 2024
507 points (99.8% liked)
Technology
Yep, that seems like the ideal decentralized solution. If all the info can be distributed via torrent, anyone with spare disk space can help back up the data and anyone with spare bandwidth can help serve it.
Most of us can't afford the sort of disk capacity they use, but it would be really cool if there were a project that gave volunteers pieces of the archive so the information was spread out. Volunteers could specify whether they want to contribute a few gigabytes or multiple terabytes of drive space, and the software could send out updated pieces any time the content changes. Hmm, this description sounds familiar, but I can't think of what else might be doing something similar -- anyone know of anything like that that could be applied to the archive?
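The piece-distribution scheme described above could be sketched roughly like this. This is only an illustration of the idea, not any existing project's code; every name and parameter here is made up:

```python
def assign_pieces(pieces, volunteers, replicas=3):
    """Assign each archive piece to `replicas` volunteers, weighted by pledged space.

    pieces: dict of piece_id -> size in bytes
    volunteers: dict of volunteer_id -> pledged capacity in bytes
    Returns a dict of volunteer_id -> list of piece_ids.
    """
    remaining = dict(volunteers)              # capacity left per volunteer
    assignment = {v: [] for v in volunteers}
    # Place the biggest pieces first so they still fit somewhere.
    for piece_id, size in sorted(pieces.items(), key=lambda p: -p[1]):
        # Candidates with enough room left, most free space first,
        # so copies spread out instead of piling onto one volunteer.
        candidates = sorted(
            (v for v, free in remaining.items() if free >= size),
            key=lambda v: -remaining[v],
        )
        for v in candidates[:replicas]:
            assignment[v].append(piece_id)
            remaining[v] -= size
    return assignment
```

A real system would also need to rebalance when volunteers leave and to push out changed pieces, but the core "split the archive across pledged space, with a few copies of everything" idea is about this simple.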
Yeah, the projects I've heard about that have done something like this broke it into multiples.
For example, 1000GB could be broken into forty 25GB torrents and within that, you can tell the client to only download some of the files.
At scale, a webpage could show the seed/leech numbers and averages for each torrent over a time period, to give an idea of what is well mirrored and what people could shore up. You could also change which torrent is shown as the top download when people go to the contributor page and say they want to help host, ensuring a better distribution.
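Picking which torrent to surface to new contributors could be as simple as choosing the one with the fewest seeders. A minimal sketch, assuming you already have per-torrent stats from a tracker (the field names here are invented):

```python
def least_seeded(torrents):
    """Pick the torrent most in need of mirrors: fewest seeders,
    breaking ties by the most leechers waiting on it.

    torrents: list of dicts with 'name', 'seeders', 'leechers'.
    """
    return min(torrents, key=lambda t: (t["seeders"], -t["leechers"]))

stats = [
    {"name": "archive-part-01", "seeders": 42, "leechers": 3},
    {"name": "archive-part-02", "seeders": 2,  "leechers": 11},
    {"name": "archive-part-03", "seeders": 17, "leechers": 0},
]
print(least_seeded(stats)["name"])  # archive-part-02
```

Averaging the counts over a time window, as suggested above, would just mean feeding in smoothed numbers instead of instantaneous ones.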
Since I'm spamming this same idea right now -- the description is similar to Freenet (the old one, now called Hyphanet), but you'd need some way to choose which collections of data get stored in your contributed storage, while with Freenet it's the whole network. (Unless you form a separate F2F net -- there is such an option -- but then there's no way to be sure that all peers, ahem, store only IA data and not their own porn collections, for example, taking up precious storage.) I've described one idea in my previous comment, but it's purely an idea; I'm nowhere close to having the knowledge to build such a thing.
There's an issue with torrents: only the most popular ones get replicated, and the process is manual/social.
Something like Freenet is needed, which automatically "spreads" data over machines contributing storage -- but Freenet is unreliable storage, basically a cache where older and unwanted stuff gets erased.
So it should be something like Freenet, but possibly with some "clusters" or "communities", each with a central (cryptographically enforced) authority able to determine the state of some collection of data as a whole and to set priorities. My layman's understanding is that this would be something between Freenet and Ceph, LOL. More like a cluster filesystem spread over many nodes, not like a cache.
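The "authority determines the state of a collection as a whole" part could look something like a signed manifest: a hash over the sorted list of piece hashes, which the authority signs and peers verify. A stdlib-only sketch of the idea -- HMAC with a shared key stands in here for a real public-key signature scheme like Ed25519, which is what you'd actually want so peers only need the authority's public key:

```python
import hashlib
import hmac
import json

def collection_manifest(pieces):
    """Fix the exact state of a collection: a sorted list of
    (piece_id, sha256) pairs, serialized, plus a root hash over it.

    pieces: dict of piece_id -> bytes content.
    """
    entries = sorted(
        (pid, hashlib.sha256(data).hexdigest()) for pid, data in pieces.items()
    )
    body = json.dumps(entries).encode()
    return body, hashlib.sha256(body).hexdigest()

def sign_manifest(body, key):
    # HMAC is a stand-in: a real deployment would use an asymmetric
    # signature so storage peers can't forge manifests themselves.
    return hmac.new(key, body, hashlib.sha256).hexdigest()

def verify_manifest(body, key, tag):
    return hmac.compare_digest(sign_manifest(body, key), tag)
```

With something like this, any peer can check that the set of pieces it holds matches what the cluster's authority declared, and a new manifest version is how the authority would publish changes or priorities.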
You have more knowledge on this than I do. I enjoyed reading about Freenet and Ceph; I've dealt with cloud stuff, but not as much at the technical-underpinnings level. My first Freenet impression, from reading some articles, gives me '90s internet vibes based on the common use cases they listed.
I remember Ceph because I ended up building it from the AUR once on my weak little personal laptop: it got dropped from some repository or whatever but was still flagged to stay installed. I could have saved myself an hours-long build if I had read the release notes.
That's correct, I meant the way it works.