RedditCrossPostBot

 

Hello fellow Data Hoarders!

I've been eagerly awaiting Gitea's PR 20311 for over a year, but since it keeps getting pushed out for every release I figured I'd create something in the meantime.

This tool sets up and manages pull mirrors from GitHub repositories to Gitea repositories, including the entire codebase, issues, PRs, releases, and wikis.

It includes a nice web UI with scheduling functions, metadata mirroring, safety features to not overwrite or delete existing repos, and much more.
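For context, Gitea's own v1 API can create a pull mirror in one call, which is presumably what a tool like this drives; a hedged sketch (the `/repos/migrate` endpoint is Gitea's, but the URL, token, and repo names here are placeholders):

```shell
# Create a pull mirror of a GitHub repo on a Gitea instance.
# GITEA_URL and TOKEN are placeholders for your instance and API token.
curl -X POST "$GITEA_URL/api/v1/repos/migrate" \
  -H "Authorization: token $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
        "clone_addr": "https://github.com/jonasrosland/gitmirror.git",
        "repo_name": "gitmirror",
        "mirror": true
      }'
```

The API call alone only mirrors the git data on a schedule; syncing issues, PRs, releases, and wikis is the part PR 20311 (and this tool) add on top.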

Take a look, and let me know what you think!

https://github.com/jonasrosland/gitmirror


Originally posted by u/jonasrosland on Reddit.com/r/datahoarder


beep boop I'm a bot to seed discussions from Reddit. Upvote or downvote posts like normal, discuss the topics here as well!

 

Looking for a new solution to back up my raw photos, currently about 5 TB, and I have a few questions:

  1. Should I use 2 separate external HDDs and sync them from time to time or is 1 enclosure with 2 mirrored HDDs better? I am leaning towards 2 separate ones as it appears to be more redundant.
  2. If I get 2 separate HDDs should I buy 2 different brands or is it safe enough to buy 2 of the same model?
  3. Anyone here who could share their experience with the G-Drive Project 12 TB?
  4. Any other suggestions?

Thanks in advance.


Originally posted by u/Rick-Valassi on Reddit.com/r/datahoarder



 

Someone was trying to dedupe 1 million videos, which got me interested in the project again. I made a bunch of improvements to the video part as a result, though there is still a lot left to do. The video search is much faster, has a tunable speed/accuracy parameter (-i.vradix), and now also supports much longer videos, which were previously limited to 65k frames.

To help index all those videos (not giving up on decoding every single frame yet ;-), hardware decoding is improved and exposes most of the capabilities in ffmpeg (nvdec, vulkan, quicksync, vaapi, d3d11va...), so it should be possible to find something that works for most GPUs, not just Nvidia. I've only been able to test on Nvidia and QuickSync, however, so YMMV.
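For reference, ffmpeg itself can tell you which of those hardware backends your build supports, which helps when picking one for indexing (the flags below are ffmpeg's own; the input file is a placeholder):

```shell
# List the hardware acceleration methods compiled into this ffmpeg build
ffmpeg -hwaccels

# Quick decode test with one backend (e.g. vaapi), discarding the output;
# swap vaapi for cuda, qsv, d3d11va, etc. depending on your GPU
ffmpeg -hwaccel vaapi -i input.mp4 -f null -
```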

New binary release and info here

If you want the best performance, I recommend using a Linux system and compiling from source. The binary release is compiled without AVX instructions, which may help performance.


Originally posted by u/JohnDorian111 on Reddit.com/r/datahoarder



 

Forgive my ignorance, as I'm still pretty inexperienced with this, but is there a group or a project that makes data from various sources available, the way Kiwix does for downloading Wikipedia? I figure the last 2 months have been a real wake-up call, and I have since downloaded the .zim for Wikipedia, but I wonder if there is something similar that crawls .gov sites or university (.edu) sites for archiving purposes, packaged for easy distribution and downloading?

Keep in mind, I have no idea how much effort goes into projects like that, and I can definitely appreciate it now that we have seen what happens when we take something for granted.

Just a thought that crossed my mind this morning and I wanted to post it before I forgot.


Originally posted by u/canigetahint on Reddit.com/r/DataHoarder



 

I'm trying to pull some videos and haven't found any add-on or app that can do it from Podia.com (an online course platform).

Thanks in advance for any thoughts.


Originally posted by u/magicmikela on Reddit.com/r/DataHoarder



 

I have been successfully running Proxmox and TrueNAS Core for a while now. Proxmox runs a small number of servers such as Home Assistant, Nextcloud, and Plex. TrueNAS Core provides network storage over SMB and NFS. In the interest of lower power consumption, a smaller physical footprint, and a better connection between compute and data, I am considering transitioning to TrueNAS Scale for both my VMs and network storage. Can anyone who has made this transition share their experience? What gotchas might I be missing? What difficulties should I expect? Is TrueNAS Scale as good a hypervisor as Proxmox? Any and all opinions are welcome. Thank you in advance!


Originally posted by u/american_engineer on Reddit.com/r/homelab



 

Hello all!

I'm looking to build an 'all-in-one' homelab server (home automation; Kubernetes/Docker for various apps like Vaultwarden, Plex, the *arrs, and general /r/selfhosted stuff; perhaps some local AI assistants or chat, inference only, not training), as well as migrating off a Synology NAS. Ideally I want to buy once, cry once, and only upgrade parts as needed over the next few years.

Here's what I have so far.

  • Fractal Design Define 7 XL case
  • ASRock X870E Taichi
  • AMD Ryzen 7 9800X3D
  • Noctua NH-D15S chromax.black CPU cooler
  • 2x Samsung 990 PRO 2TB - SSD M.2 2280 PCIe 4.0 NVMe
  • 4x Western Digital Red Pro 8TB (256MB cache)
  • Seasonic Prime TX 1300W

I'm still missing ECC RAM (unbuffered) and a graphics card. It's hard to tell what is meant for a gaming rig versus what is best for Plex transcoding (rare, but sometimes needed) and running AI workloads.

Feel free to critique any other parts of the build as well.


Originally posted by u/sur-vivant on Reddit.com/r/homelab

 

What the title suggests. I've already looked for server simulation games but haven't found any first-person ones. Something well done, along the lines of "Viscera Cleanup Detail" (I'm not talking about anything like Cisco gear or a network simulator), could be an interesting game project.


Originally posted by u/Which-Relative-2803 on Reddit

 

I recently set up a backup LTE connection for my home network's OPNsense router using a cheap Huawei USB modem. While the modem worked out of the box on Linux with NetworkManager, getting it running on OPNsense (FreeBSD-based) turned into a deep dive into USB communication. Unlike on Linux, where /dev/cdc-wdmX lets you get this modem online with a single AT command (echo -e 'AT^NDISDUP=1,1\r' > /dev/cdc-wdm0), the OPNsense/FreeBSD driver does not create an equivalent CDC WDM device.

After some USB monitoring and protocol analysis, I found a way to send a raw USB control message that initializes the connection: a single usbconfig command was all it took to get the modem online:

usbconfig -d 8.2 -i 0 do_request 0x21 0 0 2 16 0x41 0x54 0x5e 0x4e 0x44 0x49 0x53 0x44 0x55 0x50 0x3d 0x31 0x2c 0x31 0x0d 0x0a
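For clarity, the 16 data bytes in that control transfer are just the ASCII encoding of the same AT command used on Linux; a quick Python check (purely illustrative, decoding the payload from the write-up):

```python
# The 16 data bytes from the usbconfig command above
payload = bytes([0x41, 0x54, 0x5e, 0x4e, 0x44, 0x49, 0x53, 0x44,
                 0x55, 0x50, 0x3d, 0x31, 0x2c, 0x31, 0x0d, 0x0a])
print(repr(payload.decode("ascii")))  # 'AT^NDISDUP=1,1\r\n'
```

So the control transfer is simply delivering "AT^NDISDUP=1,1" plus CR/LF to the modem's control interface, the same command NetworkManager's /dev/cdc-wdm0 path carries on Linux.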

Full write-up here: https://dawidwrobel.com/journal/initializing-lte-modem-using-raw-usb-communication/


Originally posted by u/wrobelda on Reddit

 

many thanks to: https://www.reddit.com/r/homelab/comments/hix44v/comment/kdhhp02/?context=3

This post assumes you have already flashed the hacked firmware; it shows how to use the hack on this specific server model. It also serves as a refresher if you ever forget how to apply the hack again.

  1. SSH into your iLO IP, using your own username, password, and IP:
     ssh -o KexAlgorithms=+diffie-hellman-group14-sha1 -o HostKeyAlgorithms=ssh-rsa user@iLOipaddress
  2. Once logged in, the commands are simple. The fan PIDs range from 0-3 (four fans in total):
     fan p 0 min 10
     fan p 1 min 10
     fan p 2 min 10
     fan p 3 min 10
     fan p 0 max 60
     fan p 1 max 60
     fan p 2 max 60
     fan p 3 max 60
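Those eight per-fan commands can also be generated in a loop and piped into the iLO session rather than typed one by one (a sketch; the ssh options are the ones from step 1, user and IP are yours):

```shell
# Emit min/max commands for fan PIDs 0-3
for p in 0 1 2 3; do
  printf 'fan p %d min 10\n' "$p"
  printf 'fan p %d max 60\n' "$p"
done

# To apply them, pipe the same loop into the iLO SSH session, e.g.:
# for p in 0 1 2 3; do ...; done | \
#   ssh -o KexAlgorithms=+diffie-hellman-group14-sha1 \
#       -o HostKeyAlgorithms=ssh-rsa user@iLOipaddress
```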

Feel free to tinker with the max speeds. With 60, my fans stay at 23% at most, and it is not loud at all.


Originally posted by u/NefariousProxMox on Reddit

 


Dear homelab community!

I have been running two Raspberry Pis (3 B+) for years now. One hosts Zigbee2MQTT and the other one Homebridge. I have dozens of home automation devices (lights, plugs, blinds, thermometers) in my house.

Yesterday I added another Raspberry Pi (also 3 B+) which hosts Adguard Home. I’ve bought a nice little “mini rack” that can house up to four Raspberry Pis and moved the whole thing to the room in the basement where the cable modem, router and switch are. My wife started calling that room the “server room” - That made me happier than would actually be appropriate…

Some time ago, I realized that you don't need a separate computer for every service. Nevertheless, I have ordered a fourth Raspberry Pi (a 4 with 8 GB RAM) for the next expansion: paperless-ngx and WireGuard (my router is an ER605). I couldn't install paperless-ngx on the first two Pis because they both run 32-bit Linux, and the Pi with AdGuard has an SD card that is too small. I also wanted a little more computing power for paperless-ngx.

Now comes my question: Should I simply continue to operate four Raspberrys, or would you migrate the existing services (Zigbee2MQTT, Homebridge, Adguard) to the new Raspberry? If you were to set it up from scratch, you would probably only use one Raspberry. But I'm worried that I'll mess up my smarthome configuration and it will all be a huge effort.

Alternatively, I could just install Adguard Home on the new Raspberry 4 in addition to paperless-ngx, which would at least save me one device.
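If you go that route, the two services can share the Pi 4 via containers; a hedged sketch using docker run (adguard/adguardhome is the image AdGuard publishes on Docker Hub; the host paths and ports here are illustrative):

```shell
# AdGuard Home in a container: DNS on port 53, first-run setup UI on 3000.
# The two volumes persist its working data and configuration.
docker run -d --name adguardhome \
  -v "$HOME/adguard/work:/opt/adguardhome/work" \
  -v "$HOME/adguard/conf:/opt/adguardhome/conf" \
  -p 53:53/tcp -p 53:53/udp -p 3000:3000 \
  adguard/adguardhome

# paperless-ngx is usually deployed from its official docker compose files,
# since it needs a Redis broker alongside the web container.
```

Containerizing both keeps them isolated from each other on the one Pi, which also addresses the worry about one service's setup breaking another.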

Of course, I am aware that there is no “real” need to reduce the number of Raspberries. I don't mind the little bit of electricity costs. But somehow it's also a question of honor to do the whole thing according to best practice.

What would you recommend?


Originally posted by u/Training_Anything179 on Reddit
