Linux is their bread and butter when it comes to servers and machine learning, but that's a specialized environment and they don't really care about general desktop use on arbitrary distros. They care about big businesses with big support contracts. Nobody's running Wayland on their supercomputer clusters.
I cannot wait until architecture-agnostic ML libraries are dominant and I can kiss CUDA goodbye for good. I swear, 90% of my tech problems over the past 5 years have boiled down to "Nvidia sucks". I've changed distros three times hoping it would make things easier, and it never really does; it just creates exciting new problems to play whack-a-mole with. I currently have Ubuntu LTS working, and I'm hoping I never need to breathe on it again.
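For what it's worth, the frameworks are already most of the way there: PyTorch's ROCm build exposes AMD GPUs through the same `torch.cuda` API, so device-agnostic code is mostly a matter of never hardcoding the backend. A toy sketch (the model and sizes are made up purely to show the pattern):

```python
import torch

# PyTorch's ROCm build reuses the torch.cuda namespace, so this one
# check covers both Nvidia (CUDA) and AMD (ROCm) GPUs, and falls
# back to CPU otherwise.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Toy model just to show the pattern; any nn.Module moves the same way.
model = torch.nn.Linear(128, 10).to(device)
x = torch.randn(32, 128, device=device)
print(model(x).shape, "on", device)
```

Of course, that only helps when a project sticks to the framework's own ops; anything shipping hand-written CUDA kernels still locks you in.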
That said, there's honestly some grass-is-greener syndrome going on here, because you know what sucks almost as much as using Nvidia on Linux? Using Nvidia on Windows.
I really hope this happens. After being on Nvidia for over a decade (a 960 for 5 years, and similar midrange cards before that), I finally went AMD at the end of last year. Then of course AI burst onto the scene this year, and I still haven't managed to get Stable Diffusion running, to the point that it's made me wonder whether I made a bad choice.
It's possible to run Stable Diffusion on AMD cards; it's just a bit more tedious and a lot slower. I managed to get it working on my RX 6700 under Arch Linux just fine. Now that I'm on Fedora, it doesn't really want to work for some reason, but I'm sure that can be fixed too; I just haven't spent enough time on it.
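For anyone attempting the same thing, here's a minimal sketch of the commonly documented route (an assumption-laden sketch, not a guaranteed recipe): install the ROCm build of PyTorch from the wheel index on pytorch.org plus Hugging Face's `diffusers`, and since the RX 6700's gfx1031 target isn't on ROCm's official support list, override it to gfx1030. After that, the generation code is identical to the Nvidia path, because ROCm GPUs show up as `cuda` in PyTorch:

```python
import os

# The RX 6700's gfx1031 target isn't officially supported by ROCm;
# pretending to be gfx1030 is a widely used workaround. This must be
# set before torch touches the GPU. (Assumption: adjust for your card.)
os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "10.3.0")

import torch
from diffusers import StableDiffusionPipeline

# ROCm devices appear as "cuda" in PyTorch, so from here on the code
# is identical to what you'd run on an Nvidia card.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # any SD 1.x checkpoint works
    torch_dtype=torch.float16,
).to("cuda")

image = pipe("a watercolor painting of a penguin").images[0]
image.save("out.png")
```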