I just have to say, after having booted into Windows, that Linux is so much nicer than Windows when it comes to doing system “updates.”
So, here I am, sitting in my chair for about 20 minutes looking at a mostly black screen and a highly dubious looking percentage number going up very slowly. It tells me that Windows is “updating” and that I should keep the computer turned on. Good thing I have the computer turned on or I wouldn’t know that I shouldn’t have it turned off, right?
Anyway, I start to think about how this experience goes in Linux. In my experience, I do “system” updates about once a month, and I can see each individual package being installed (if I glance away from my browser session, that is). In Windows, I have no choice but to sit here and wonder if the system will even work again.
Windows decides that it wants to update drivers, apparently (I honestly have no idea what it’s doing, which is part of what pisses me off), because it reboots the computer. Then it reboots again. Then, eventually, everything goes back to the familiar Windows desktop. WTF?
How anyone could prefer Windows to Linux is truly a mystery to me.
Sometimes people have their computers turned on, but then they turn them off. I know, it’s wild.
I’ve never, in multiple decades of using Windows and thousands of updates, had an update leave my computer unable to work afterward. I suspect this is most people’s experience, or they wouldn’t use it.
Because most people are not system administrators and don’t have the time or knowledge to debug their computers every 5 minutes, or to figure out how to do what they want it to do or run the program they need to run. I’ve used both extensively and Windows is, by a landslide, the easier system to use, regardless of what the reasons are.
I’ll give you the benefit of the doubt that you aren’t trolling and instead congratulate you on being a lucky Windows user. That’s unicorn-level awesome to me. As a former tech for public universities for 14 years, I can attest to the validity of OP’s description.
Faculty and staff begged for ways to postpone updates that randomly introduced breaking changes, and it’s easy to recall the many times I was in a lecture hall rolling back audio drivers that broke the A/V setup after updates. Professors would be mid-lecture or mid-exam when a video card driver updated without warning and set their screen to mirror instead of extend, putting their notes or answer key up for the class to see and breaking their lesson plan. Disabled hardware would be updated and re-enabled, breaking input or output devices.
I’ve certainly had updates (especially once they began including BIOS updates without asking) break system function irreversibly as well, like when whole campuses had a new TPM version (1.x → 2.x) pushed without warning, which caused machines to fail to boot with the static image they were running. The state was slow to fully implement WSUS, but got on the ball by 2018. That changed everything.
Suffice it to say that while you may have gotten lucky and never experienced any downtime from an unscheduled Windows update, others definitely have.
No one said others haven’t. I wasn’t claiming it’s never happened to any of the several billion Windows users. I was claiming that it’s not common.
I mean, I had Arch break GRUB with an update, which would really suck for a computer beginner. And I had an OpenWRT router boot-loop after an update. On my Windows machine, the only updates that led to a boot problem were Nvidia ones.
We have found the one Windows fan on Lemmy!
I still use Windows almost entirely because of certain software I need for work, but if not for that I’d switch in a heartbeat. I’m not the most tech-savvy person, but in my experience Linux is much nicer and easier to use, and if you need to debug it every 5 minutes you’re doing something very wrong. The only downside is software support, which I’d argue isn’t the fault of Linux.
Yeah, definitely not. I still use Linux on like 6 computers.
But if the reason people use Windows is truly a “mystery” to you then you’re simply delusional. I am not a genius but I’m competent enough to make it functional.
It just frustrates me less than a remote server constantly fucking with my computer and actively preventing me from doing what I want.
Probably. That doesn’t make it any less frustrating.
Whose fault it is is completely irrelevant. If ya can’t do it, ya can’t do it.
I mean, you can break Windows enough to have to debug it every five minutes too; that would also be frustrating.
For the average user who isn’t tinkering with everything Linux is a pretty smooth and pleasant experience.
Also I’m not mystified by Windows’ market dominance, but we all know the reason isn’t because it simply provides a better experience. Most Windows users have no choice in the matter as it’s just the default.
Also software availability doesn’t have much to do with the OS. It’s a reason I don’t use Linux more, but it’s not something the OS does poorly. It’s something software developers do poorly.
I don’t break anything. The most recent debacle I experienced was that the maintainer somehow lost the signing keys or something and it just gave a generic error message and refused to update.
We all know that’s not correct. Why do you think it’s the default? Why do you think people pay real money to have it installed on their computers vs. the free option?
LOL wat?
It doesn’t matter whose fault it is. It doesn’t work. That’s all that matters.
Windows is fine at being an OS, most of the time it just works. I think the exact same thing is true of certain Linux distros, especially for the average user who could load it up, browse the internet and watch videos without ever breaking or having to debug anything.
If we’re purely talking about the OS. Forget software or imagine you’re exclusively using software that works fine on both. I think Linux is a much nicer experience. It has really improved over the years.
Obviously we can’t just ignore software, though, and that’s a huge part of why Windows is still so popular. But another huge part is that Microsoft pays a lot of money to make it the default OS on lots of hardware. I can’t even think of a single person I know who chose Windows; it’s just what companies use and what most computers come with pre-installed. Companies like it because Microsoft provides tech support. There are many reasons why Windows is so popular that have nothing to do with the user experience.
You have that backwards.
Sorta but not really, it’s ubiquitous now so it almost has to be on new hardware, and Microsoft offers big discounts for OEM versions. They lose money to guarantee it stays the default I guess? Either way, I still don’t think there’s a lot of people actively choosing it.
Sometimes people find a thing easier to use, but then it turns out they only believe that because they have a lot more (or more recent) experience with it than the alternative.
I have used both Windows and Linux extensively. The easier system to use is always the one I’m more familiar with. (This became obvious when I tried using Windows again after being away from it for a decade or two.)
I hear this all the time and it’s just not true. I’ve been using Linux for ~3 years now and it’s still significantly more complicated. And it’s incredibly easy to see why.
Do a Google search for “how to X on Linux” and tell me you’re not instructed to enter a bunch of commands you don’t understand into a terminal, where it inevitably kicks back some generic, nondescript error or just…does nothing at all.
And how long ago did you start using Windows?
Does it matter? How many years do you think it should take to become familiar with the basic functions of an OS?
Yes. Of course it matters. You just disputed an observation about relative amounts, with only a single amount to support your argument. With no point of comparison, your argument is meaningless.
But now I see you already provided an answer in an earlier comment: “multiple decades of using Windows”. Compared to your “~3 years” with Linux. That doesn’t refute my observation at all, now does it?
(We don’t even have to consider the likelihood that you’ve also spent more time per year on Windows than you have on Linux, since the difference in years is so significant on its own.)
If you were to complain that googling for random people’s ideas on how to solve a problem tends to yield more helpful results with the older and globally dominant desktop OS than it does with the younger one with a tiny minority desktop market share, then I might say you were right about that. But instead you wrote, “it’s just not true,” about something that you’re not in a position to know. That’s a bit of an overreach, don’t you think?
It’s fine not to like a thing. It’s fine not to understand a thing. But to go around condemning it as inferior based on your subjective and limited experience is unfair, and more than a little biased.
Hard to say, given that most of us have been using our OS of choice for long enough to no longer clearly remember how long it took us. It’s complicated by the fact that so many people learn Windows as their first OS, so their expectations and habits are built around it from a young age, and those shape their approach and assumptions when trying something different. But in my family, grandma got familiar and productive with the basic functions of Linux in roughly 2-3 months. I imagine it varies a lot from person to person.
I’ve already explained this in the comment you just replied to. I only need a single amount because it is more than sufficient.
It has nothing to do with Google, it has to do with convoluted processes to complete tasks that are very simple and intuitive on Windows because it has a GUI and you just click around the menus until you find it. Or use the search (but not on W11 obvi).
That’s not the issue. And you already know this, because I’ve already said I’ve been using it for several years. This is just a vain attempt to derail the conversation.
Also not the issue. The issue is being complicated and difficult to understand.
You’re making shit up again.
I have seen storage corruption on a Windows 10 computer cause a boot loop where it constantly tries to verify the integrity of the filesystem or something like that. I forget what happened to that computer though. I think it was just replaced.
Also, niche types of computers like tiny laptops tend to get blue screens a lot more than more common computers because there’s probably some faulty driver, but you can say that’s not really gonna be Microsoft’s fault.
All computers can crash if you hit rare or weird bugs and hardware failures. How often it happens is a statistical question that’s hard to answer when there are so many variables, and so many opinions about what could be happening.
I would think generally that relatively new computers running Windows, especially gaming PCs, tend to very rarely encounter significant issues (outside of actual manufacturing faults). They have well supported hardware and that hardware runs fast.
Slowdowns and weird behaviour are another category too. You could have a super fast computer that’s always crashing, or a slow old computer you’ve never once seen crash (a lot of servers are kind of like that). I had an old computer running Windows 10 (someone gave it to me when it was being gotten rid of for an upgrade), and it was horrible because everything in the interface would take actual seconds to load, even though this computer could run fairly recent games fine around 5 years ago. I installed Linux on that instead and everything was silky smooth, smoother than that computer had ever felt to use before.
Linux often works a lot better when it comes to old computers. The drivers can, in some cases, be supported and updated long after the machine stopped being new, both by a handful of enthusiasts and by big companies that hire people to keep servers from crashing. Thinkpads can be like this, so if you ever wanted to get a computer specifically for Linux, it would be best to get one a few years old that’s popular in tech company offices.
How well you can use an interface is a bit of a skill issue, or rather an experience issue. It’s a bit like the stereotype of a possibly senile old person who can’t use the menus on their TV because it’s too complicated. If you’re used to a certain interface, it shouldn’t take any thought at all to do the things you usually do on a computer.
Interfaces are also a matter of preference, and Linux has lots of different interfaces you can use. If you tried them all you’d probably find that one was your favourite for a few different reasons, but most reasonable people don’t have time for that so they’ll stick with whatever they can work out how to use and find alright.
It doesn’t seem like you’re gonna be using Linux much (at least not by choice) any time soon, and you might even have had some bad experiences with it before. That’s fair; I’m not trying to convince you that you need to use Linux and it sounds like Windows works very well for you.
You do seem to have the impression though that all Linux computers are constantly crashing and glitching out in weird ways, which is simply not true. Most servers around the world that make up the internet use Linux, and most of the popular ones rarely ever go down. That means these computers are constantly running, humming away with their bits in a big room somewhere, and Linux almost never has any issues that stop them, especially full system crashes. Power outages are a more common reason for servers to be down, and you don’t need to be a sysadmin constantly making intricate bugfixes to keep a Linux PC running while you play games.
TL;DR: All computers can have bugs or crash and Linux isn’t constantly crashing. That’s probably why everyone seems like they’re seething at what you said. Lol