#1 is just not being the default for 99% of devices. If someone gets a new computer, why would they go through the effort of installing a new OS when the one it comes with works fine? Hell, I bet at least 50% of people in the market for a PC don't even know what an OS is.
Agreed. Android and Chrome OS are used happily by tens of millions without any idea they're Linux distros.
I bet if small, cheap netbooks came out running Mint or Fedora or something, people wouldn't even know or care that it was Linux.
In middle school I had a USB drive with Linux Mint installed on it, which I was using on school PCs. We only used those PCs for internet browsing and office work. Not a single soul noticed it wasn't Windows. The teacher only noticed two differences: "You have a different version of Office installed here," and a note she gave me for "changing the wallpaper", which was strictly prohibited for some reason.
Which actually means Linux is being successfully adopted by the general public in a similar way to Windows: as a general-use system that doesn't require a lot of technical knowledge.
Fully customizable distros will never be popular with the general public. They want systems that just do the general stuff and have it work automatically.
Of course they know what an OS is. There's only two of them: Apple and Microsoft.
New user: I have a problem 😊
Everyone:👍
- are you on xorg or wayland?
- pulseaudio or pipewire?
- what WM/DE are you using?
- amd or nvidia?
- what distro?
- systemd?
New user: Nevermind 😮💨
if a new user is using a distro that doesn't use systemd they fell for a meme
At this point, my biggest dream is that these 'new user' distros would use only Wayland, Pipewire, systemd and Flatpaks, simply to simplify things. Hopefully we're less than 2024 away from NoVideo Wayland support.
Also, as soon as XFCE releases its Wayland support, I expect it'll become the most popular DE choice for Mint.
What really makes me happy is seeing how well supported Pipewire already is. Pipewire has never shown any problems in new installs for me.
- Isn't pre-installed on well known machines by well known brands.
- Popular applications that people want (whether productivity, creativity, or games) do not work out of the box. It doesn't matter that alternatives exist, or that you can use things like Wine. If it's more than just clicking the icon, it's too much.
- If things cannot be done purely through touch / the mouse, it is too hard for most people.
If things cannot be done purely through touch / the mouse, it is too hard for most people.
100%. Even as a power user (understatement) who overwhelmingly prefers keyboard input to control things when I'm "gettin' stuff done", I will sometimes miss the general level of consideration in Windows' input handling when it comes to mouse and especially touch. Mouse is pretty damn good these days on Linux, but touch...
Touch is abysmal. A ton of modern laptops have touchscreens, or are actually 2-in-1s that fold into tablets, etc., and the support is just barely there, if at all. I'm not talking about driver support; that is often fairly acceptable. My laptop's touch and pen interface worked right out of the box... technically. But KDE Plasma 5 with Wayland, an allegedly very modern desktop stack, is not pleasant when I fold into tablet mode.
The sole (seriously, I've looked) Wayland on-screen-keyboard, Maliit, is just terrible. No settings of any kind (there is a settings button! it is not wired to anything, it does nothing), no language options, no layout options (the default layout is abysmal and lacks any 'functional' keys like arrows, pgup/dn, home/end, delete, F keys, tab, etc), and most egregiously, it resists being manually summoned which is terrible because it does not summon itself at appropriate times. Firefox is invisible to it. KRunner is invisible to it. The application search bar is invisible to it. It will happily pop up when I tap into Konsole, but it's totally useless as it is completely devoid of vital keys. Touch on Wayland is absolutely pointless.
Of course, there is a diverse ecosystem of virtual keyboards and such on Xorg! However, Xorg performance across all applications is typically abysmal (below 1FPS) if the screen is rotated at all. This is evidently a well known issue that I doubt will ever be fixed.
In the spirit of Open Source Software, and knowing that simply complaining loudly has little benefit for anyone, I have several times channeled my frustration into developing a reasonable Wayland virtual keyboard, but it's a daunting project fraught with serious problems and I have little free time, so it's barely left its infancy in my dev folder. In the meanwhile I reluctantly just flip my keyboard back around on the couch with a sigh, briefly envious of my friend's extremely touch-capable Windows 2-in-1.
Preinstalled.
Like, we're nerds and we fuck with our computers n stuff. But most people are lucky to know what a power cord is.
Honestly if Linux with a good DE like KDE or Cinnamon was already on their PC at boot they would figure it out. Most people just use a web browser anyways.
I have put my dad on Kubuntu. I don't like anything *buntu personally, but I have to admit it's quite stable and has sane defaults. He hasn't complained since, and support calls have dropped considerably. He spends most of his time in Firefox anyway, where I've added uBlock.
The problem with Windows was, he'd occasionally browse the web with Edge by mistake (or because MS forces it down your throat), and as soon as an 80+ y.o. browses the web without ad blocking, getting a virus is just a matter of time.
All this is to say that I agree that preinstalled is key. I wish more effort were focused on fewer distros; I feel that so much talent and energy is being lost in marginal projects.
But many people do this out of passion, and it's of course their choice where to contribute, or whether to spin up a brand-new distro entirely; I can't judge them for that. I'm just observing that that energy could be better spent smoothing out rough edges on the more popular distros, to make them even more appealing to OEMs and convince them to ship those on their hardware.
Most people buy computers with the OS already installed and would get just as lost trying to install MacOS or Windows.
Based on my tests on my family and friends, the main problem is tech support. Most geeks seem to assume other people want the same things as they do (privacy, freedom, etc.). Well, they don't. They want a computer that just works.
Overall, people using Linux actually don't need much tech support, but they do need some. My father put it really well: "the best OS is your neighbor's."
I apply a few rules:
- The deal with my family and friends is simple: you want tech support from me? OK, then I get to pick your computer (usually an old Lenovo ThinkPad bought on eBay for ~300€) and I'm going to install Linux on it.
- I'm not shy. I ask them if they want me to have remote access to their computer. If they accept, I install a MeshCentral agent. The thing is, on other OSes they are already spied on by Google, Microsoft, Apple, etc., and most people think "they have nothing to hide". So why should they worry more about a family member or a friend than about some unknown big company? Fun fact: I've been really surprised by how easily people accept that I keep remote access to their computer, even people who aren't family! Pretty much everybody has gladly agreed so far (and God knows I've been really clear that I can access their computer whenever I want).
- I install the system for them and I do the major updates for them. Since I have remote access to the system, I pick the distribution I'm most at ease with (Debian). They just don't care what actually runs on their computers.
- When they have a problem, they call me after 8pm. With remote access, most problems are solved in a matter of minutes. Usually they call me a few times in the first days, and then I never hear from them again until the next major update.
So far, everybody seems really happy with this deal. And for those wondering, I can see in Meshcentral they really do use those computers :-P
When I told my dad I could install RustDesk on his computer to do remote support (I've moved out), he asked me, "Does that mean you can look at my computer whenever you want?" I'm really proud of him; he actually listened.
- Self-updating without user interaction by default.
- Better support of codecs and drivers.
Linux does have better codecs and drivers than Windows for some stuff (Bluetooth for example), but it has worse codecs and drivers for some important proprietary hardware stuff (Nvidia for example)
It needs to "just work". It's not more complicated than that.
This. A lot of people talk about the preinstalled thing, but Linux still has a lot of friction. Linux is big, it's open, and it's made to run on almost any device with an ARM or x86 processor, yet it's usually a pain in the ass on edge cases, and we can't ignore that. Some years ago dealing with drivers on Linux was hell; today it's better, but there are still edge cases (usually not Linux's fault, vendors are usually the problem, but it causes friction). Audio was only recently sorted out with the adoption of Pipewire, and PulseAudio had a lot of caveats. Now we are getting rid of X11, which is fine for the usual use cases but is full of workarounds if you want to do something simple like run two monitors with different refresh rates. There are a lot of these things, but Linux is moving forward. Last year I could finally make my full switch since gaming on Linux became a thing, but it definitely was not plug and play.
- All of the basics should just work well out of the box with minimal tweaking. Yes even NVIDIA stuff.
- The software center needs a massive overhaul. It feels like an afterthought by people who would rather use a command line.
Linux really isn't ideal for anyone who isn't already a tech enthusiast on some level. I recently did a fresh install of Kubuntu and after about a week, it prompted me that there were updates, so I clicked the notification and ran the updates, after which my BIOS could no longer detect the UEFI partition. I had to use a live usb to chroot into the system and repair it, as well as update grub, in order to fix it.
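For the curious, the repair went roughly like this, a sketch from memory; the partition names are examples and will differ per machine:

```bash
# From a live USB session. /dev/nvme0n1p2 (root) and /dev/nvme0n1p1 (EFI)
# are example partitions -- substitute your own (check with lsblk).
sudo mount /dev/nvme0n1p2 /mnt
sudo mount /dev/nvme0n1p1 /mnt/boot/efi
for d in /dev /proc /sys; do sudo mount --bind "$d" "/mnt$d"; done
sudo chroot /mnt
# Inside the chroot: reinstall GRUB to the EFI partition and regenerate its config.
grub-install --target=x86_64-efi --efi-directory=/boot/efi
update-grub
```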
It's fixable, but it's not something anyone who doesn't already know what they're doing can fix. I've had auto-updates in the past put me in boot loops thanks to Nvidia drivers, etc.
This kind of thing needs to almost never happen for Linux to be friendly to those who just want their computer to work without any technical understanding. Honestly, though, that can't happen because of the nature of distros: you can never guarantee that everything will work, because every distro ships slightly different packages.
Wine is getting better, but compatibility is still an issue, especially for people who rely really heavily on Microsoft Office or Adobe products.
- The installation process of Linux is complicated for the average Joe (bootable USB? ISO file? boot priority? format? What are these scary terms?)
- Lack of availability of pre-installed Linux PCs at physical shops
- Lack of availability of industry-standard software
- Confusion for an average Joe due to excess choice of distros/application packaging format. Average people don't want choices, they want to be guided.
- (Minor point) Most available guides for doing something rely heavily on terminal usage, which can be daunting for new users
The actual answer: there is no reason to switch. The vast majority of users do not care about Linux or see why they would want to. For us there are lots of benefits and things we enjoy about getting away from Windows, but for them it's just "why?"
Speaking from experience, from a long time ago, and from the people/family I've installed it for on older machines: It's different. That's 90% of it.
The people who had little to no Windows/PC experience actually took to Linux a lot more easily, not having to relearn or change habits from Windows.
When's the last time the average user has had to install an operating system?
That's the biggest obstacle right there. I think plenty of non-techy people would use linux if it came preinstalled.
Also, if it came pre-installed, one would assume all the hardware was properly supported. A big pain point with Linux is that sometimes things just don't work right, and there's nobody to turn to for help except Google. It's been a while since I attempted to run Linux on a laptop, but when I did I struggled a lot getting good battery life, good trackpad support, and a sleep mode that worked correctly.
Reputations live on for decades after they are earned. Perhaps all of my laptop problems are ancient history, but I have no way to know without trying, and it's too much effort.
To be honest, one part is what everyone mentioned here. Not being preinstalled and all that.
The other part is that, unfortunately, at least according to my own experience as a Linux noob a few years ago, some Linux communities can be very toxic. You ask how to do X, and someone comes along with "why do you even want to do X when you could do Y?", where Y is something entirely different that only does something vaguely similar.
That's one of the things.
And then there are other curiosities. I cannot, for the life of me, get my main monitor to work under Linux with any new kernel version. My laptop just refuses to output to it, or to the second monitor attached via DisplayPort daisy-chaining. On the older version it works; on the newer it's broken. I have tried troubleshooting this problem for over half a year and it's still broken. And that's out of the box on Ubuntu LTS...
So I don't really understand this question. There are major roadblocks. With Wayland, which is the default for Ubuntu now, those roadblocks just became bigger. Screen sharing in multiple apps, including Slack, is outright broken unless you use the shitty web app. The main player, Office 365, largely doesn't work at all on Linux. All these things that should work for a desktop operating system don't work out of the box as they should.
That's why people aren't using it and companies aren't preinstalling it.
Most folks have been sold a story that every new technology they start using is supposed to be "intuitive"; and that if it is not "intuitive" then it must be defective or willfully perverse.
For example, novice programmers often stumble when learning their second or third language, because it differs from their first. Maybe it uses indentation instead of curly braces; maybe type declarations are written in a different order; maybe it doesn't put $ on its variables; maybe capitalization of identifiers is syntactically significant.
And so they declare that Python is not "intuitive" because it doesn't look like C; or Go is not "intuitive" because it doesn't feel like PHP.
It should be obvious that this has nothing to do with intuition, and everything to do with familiarity and comfort-level.
Commercial, consumer-oriented technology has leaned heavily into the "intuitive" illusion. On an iPhone or Windows, Android or Mac, you're supposed to be able to just guess how to do things without ever having to confront unfamiliarity. You might use a search engine to find a how-to document with screenshots — but you're not supposed to have to learn new concepts or anything. That would be hard.
That's not how to learn, though. To learn, you need to get into unfamiliar things, recognize that they are unfamiliar, and then become familiar with them.
Comfort-level is also important. It sucks to be doing experimental risky things on the computer that's storing your only copy of your master's thesis research. If you want to try installing a new OS, it sure helps if you can experiment with it in a way that doesn't put any of your "real work" at risk. That can be on a spare computer, or booting from a USB drive, or just having all your "real work" backed up on Dropbox or Google Drive or somewhere that your experimentation can't possibly break it.
To me, the big problem is still updates breaking things.
Everybody needs to update their system from time to time, but if doing so leaves your system in an unusable (for the average person, not a linux terminal guru) state, users aren't going to stay.
I think immutable/atomic OSes like Silverblue, VanillaOS and SteamOS are heading in the right direction to solve this issue, particularly if they allow users to easily roll back a bad update. Otherwise, maybe there is some way to detect and warn about potential compatibility issues before people update.
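For example (this is just my understanding of the rpm-ostree-based ones like Silverblue), recovering from a bad update is supposed to be a single command rather than a chroot adventure:

```bash
# Fedora Silverblue / other rpm-ostree systems (a sketch, assuming rpm-ostree manages updates).
rpm-ostree status     # list the current and previous deployments
rpm-ostree rollback   # make the previous (working) deployment the default again
systemctl reboot
```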
It's the first step of installation: making a bootable USB/CD. Most non-technical people can't be arsed to create a bootable drive and then go into the BIOS boot settings to run it. I haven't used Windows in a long time, so I don't know how it's installed these days, but the fact that it comes installed out of the box when people buy a computer lets them skip the first and biggest step to running Linux, which is getting it installed in the first place.
Distros have come a long way that a Windows user trying Linux Mint can hit the ground running. It's no longer about the learning curve for USING linux, it's INSTALLING linux that's the problem.
Exactly. I'd argue that some supposedly mainstream distros are hard to install even for the competent. Last time I checked, Debian's funnel for newbies consisted of a 90s-era website with "instructions" in the form of a rambling block of jargon-filled text with mentions of "CD-ROMs" and a vague discussion of third-party apps for burning ISOs. I mean, on Linux flashing a USB stick is a matter of a single dd command with some obscure switches, but even that was nowhere to be found and I had to search forums for it. Incredible! Hard to imagine how forbidding it must all seem to the average Windows user! No Debian for them!
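(For the record, the incantation I eventually dug up was something along these lines; the ISO path and device name are placeholders, and pointing dd at the wrong device will happily destroy the wrong disk:)

```bash
# Flash a downloaded ISO onto a USB stick. /dev/sdX is a placeholder --
# check `lsblk` first, because dd will overwrite whatever you point it at.
sudo dd if=path/to/debian.iso of=/dev/sdX bs=4M status=progress conv=fsync
```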
IIRC Ubuntu's process was much easier but still not as easy-peasy as it could have been.
The only hope for desktop Linux is a crystal-clear, bulletproof, 1-2-3-style onboarding funnel that takes the user from "this is the distro's website" to "I have a bootable USB". From that point on it's plain sailing.
A lot of people have already talked about the onboarding/installation experience, so I'll just chime in and say a lot of new users are unfamiliar with using a terminal for commands and instead favour a GUI for all their tasks. Most modern, commercially appealing distros are moving in this direction (i.e. applications running the same terminal commands in the background with an easy-to-understand UI at the front), but I'd still say the community's insistence on the terminal over all other ways of executing a command may be a turn-off for the layman trying Linux for the first time after Windows and macOS.
Almost makes me think it would be more ideal to reduce the stigma associated with executing commands in the terminal and find some way to get people more comfortable with using it, both via Linux and also CMD for Windows as well.
3rd-party software/hardware. Companies don't develop for Linux, and Linux developers can't reverse engineer everything.
Make it just run, and pre-install it on most computers.
With "just run" I mean things like:
- Audio just working
- Bluetooth just working
- Bluetooth and audio working together ~~(I still can't get this one right, after 5 evenings of trying)~~
- WiFi supporting all the frequencies, instead of just some
- Remembering monitor configurations
- Troubleshooting audio shouldn't mean that you almost completely kill your OS in the process
You know, things like that, things that might cost you an evening or two or three to figure out and make you feel like you're the rarest edge case alive. On Windows, these work just fine out of the box.
I know this isn't easy to achieve, but I can't recommend Linux to people when even things a phone does perfectly fine out of the box result in at least an evening of troubleshooting.
Man, you must be using some fucked-up distro, because I've never had those problems in the last 4 years.
Linux needs developers developers developers developers developers developers developers. Notably game devs. And KDE needs to be the default. OS X is only popular in a couple of countries.
One thing I always talk about is how the DE is much more important for a new user than the distro. New users will only use the GUI anyway, so their choice of DE has to be the most comfortable one.
It personally took me years to switch to Linux, trying stuff like Ubuntu or PopOS, and I couldn't understand why it didn't "click" for me until I realized I simply dislike GNOME (being an ex-Windows user). Tried a KDE distro and it clicked immediately; never looked back. Now I don't even use KDE, but it helped me get through the initial frustration period.
When you have a problem, the solution is fragmented across distros, configurations, opinions, and time, as solutions constantly change and they all have subtle repercussions. It becomes very overwhelming to figure out a solution and pick the right one.
I recently switched and could only do it because of ChatGPT. There are a lot of things that work differently in Linux, like package managers, the file system in general, the focus on the terminal, stuff that works differently across distros. For almost all questions, ChatGPT helped me within seconds. This is even more true when I don't quite know what my question actually is; then it helps to give me some good buzzwords to Google for. If I had done this with just Reddit and forums and Stack or something, I'd have gotten so many unhelpful, gatekeeping, belittling answers, if any.
- The misconception that you need to "know linux" to use a computer with linux.
You need to "know linux" to administer linux servers, or contribute to kernel development. My wife is a retired pharmacist, and she uses exclusively a computer with Linux since around 2008. She knows that's Linux, because I told her so. If I had told her it was a different version of Windows, she'd be using it anyway - she was using win95 at work before, so any current windows would have been a big change anyway (granted, nothing like gnome, that's why I gave her kubuntu).
This misconception is fed by "experienced" Linux users who like to be seen as "hackers" just because they "know Linux".
Nobody uses the OS. You use programs that run on the OS. My wife doesn't "use Linux". She uses Chrome, the file manager (whatever that is in the ancient LTS Kubuntu release I have there and update only when LTS is over), LibreOffice Writer and Calc, a pdf reader (not adobe's, whatever was in the distro), the HP scanner app. The closest she gets to "Linux" is occasionally accepting the popup asking for updates.
Users shouldn't need to care about which OS (or which distro, for that matter) they're running their apps on. The OS (and distro) should be as unobtrusive and transparent as possible.
- Distro hopping cult. It's ok to try a few distros when adopting Linux, or even flirt with new ones after you've already settled with one. Even keep doing it forever, on a secondary machine or live usbs, if you're curious.
Doing it forever on a primary machine is stupid; NO FSCK DISTRO WILL BE PERFECT. Windows users whine and cry every time Microsoft shoves a new and worse Windows version up their SSDs, but they stick with Windows anyway.
Distro hoppers hop often because they give up at the first inconvenience. They never feel at home or make it their home, because they never actually use their computers long enough with any one distro. They are more focused on the OS than on using the computer. Nothing wrong with that, but they'll forever be "Linux explorers", not actual "Linux users".
There will always be some other distro with that small thing that doesn't come by default on this one. There will always be compromises. It's like marriage. Commit, negotiate, adapt. Settle down ffs.
The OS/distro shouldn't be important for the average user; the OS/distro shouldn't get in the way between the user and the apps, which is what the user uses.
Of course there are distros with specific usage in mind (pen test, gaming, video production, etc), as they conveniently have all main utilities packaged and integrated. But for real average user apps, the OS shouldn't matter to the end user, let alone look like the user should know what window manager or packaging system they're using.
Then, when they are faced with dozens of "experts" debating which distro has the edge over another, and the gory technical details of why, and comparing how many distros they've hopped, well, it sounds like Linux is a goal in itself, when all they wanted was to watch YouTube and access their messages and social media.
When my wife started using a Linux computer I didn't tell her which distro was there (she probably knows the name kubuntu because it shows during boot). I didn't give her a lecture about Gnome vs KDE, rpm vs deb, or the thousands of customizations she could have now. "You log in here, here's the app menu, here's chrome, this is the file manager, here's the printer app". Done, linux user since 2008.
Linux will never be mainstream while we make it look like "using Linux", or "this distro", matters, and that is an objective in itself. Most users don't care. They want to use their apps.
The last time I was hired as a code monkey, we used Linux with a dual-monitor setup. The display settings would not, under any circumstances, see one of those 1080p monitors as anything more than 480p.
I spent literally half the first day of work looking for solutions, and eventually settled on running, at startup, some random command copied from the internet that I don't understand.
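If it's the workaround I think it is (an assumption on my part; the output name and the timings below come from cvt and are only an example), it adds the missing 1080p mode by hand with xrandr:

```bash
# Generate a modeline for 1920x1080 @ 60 Hz and register it on the stuck output.
# "DP-1" is an example output name -- run `xrandr` to find yours.
cvt 1920 1080 60
xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
xrandr --addmode DP-1 "1920x1080_60.00"
xrandr --output DP-1 --mode "1920x1080_60.00"
```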