What has allowed the Russian state to treat him with such a degree of cruelty is the war with Ukraine. News of his arrest, imprisonment, and treatment in jail was overshadowed by news of yet another airstrike on a civilian population. Not to mention, it sent a clear message to the internal opposition that those willing to act against the state would face not just legal hurdles but also tyranny and physical violence.
There are plenty of photos of russians bombing civilians and hospitals; the Mariupol drama theater had "children" written in letters so big they could be seen from space. Still bombed. Just a few weeks ago a funeral was hit by a rocket, 50+ civilians dead. Let alone the constant attacks on civilian power and port infrastructure, trying to freeze and starve not only Ukraine.
This is just a completely ignorant take that the Israelis are somehow worse than russians.
I'm just saying exactly that - anybody may now take two sets of photos and compare the amount of damage over the same time period to approximate the "DPS" done to civilian infrastructure.
Of course, I cannot prevent that person from being in denial due to cognitive dissonance.
My thoughts are that the press (and even more so Twitter) has a very skewed way of reporting things. It is designed for entertainment value and for pushing a narrative, not for rigor.
Having two sets of inputs regarding somewhat similar events allows you to do a reality check. For example, you can compare civilian infrastructure damage footage from Gaza with that from Ukraine and arrive at some conclusions.
Because the Gaza footage is quite a mess. You will have a hard time matching it with Ukraine footage. Sometimes whole neighbourhoods of civilian high-rises were levelled.
Meanwhile, the carnage of civilians during the initial Gaza raid is obviously also something you will have a hard time matching with anything happening in Ukraine.
Perhaps you will derive something different from these pictures; I just suggest doing that instead of going along with narratives.
> Because the Gaza footage is quite a mess. You will have a hard time matching it with Ukraine footage. Sometimes whole neighbourhoods of civilian high-rises were levelled.
Russia has literally deleted entire Ukrainian cities off the map, so this is a very strange paragraph to write.
The only difference is that in Ukraine the civilians were told to evacuate their homes when the Russians were nearby, while in Gaza they are told to stay in their homes by Hamas.
It is also much harder to evacuate people in Gaza.
> That is the difference. People in Gaza also have nowhere to go if they wanted.
Exactly, the difference isn't that in the Ukraine war the Russians are being careful or precise; it's that the Ukrainian citizens can actually evacuate from the areas they are in.
As a fact, no, I don't.
The amount of carnage seen on Palestine photos and videos is not matched.
If you read the words written by the press, you may get the impression that it is comparable. But you will struggle to find evidence backing that up, because there is none.
Also, you seem to be fixated on the carnage inflicted on Palestine, but there's not a word about the videos and images of beheadings and wanton murder of Israelis by the Hamas terrorists. Why is that? Did Hamas not unleash this round of violence with absolutely barbaric attacks?
As was already noted, it was not due to unannounced airstrikes in Bakhmut but due to the ground offensive. Any civilians could and should have been evacuated by then, or at least have taken cover.
If we see a ground offensive in Gaza, the picture will likely be the same but on a larger scale. This is on top of the airstrike damage, for which the population in these blocks could not prepare.
> As was already noted, it was not due to unannounced airstrikes in Bakhmut but due to the ground offensive. Any civilians could and should have been evacuated by then, or at least have taken cover.
Sure,
So like the Dnipro apartment block strike that was from a Russian Kh-22 and flattened part of the building.
Again, this is one missile (maybe it was shot down by AA rocket, or just went off course). And most of the building is still standing. The point of this launch was likely some military target.
Compare that to Palestinian videos where a dozen blocks are levelled indiscriminately.
Also note that it's been just two weeks, as opposed to almost two years. Two years of fighting which resulted in just a few cherry-picked examples.
> Again, this is one missile (maybe it was shot down by AA rocket, or just went off course).
The amount of precision-guided missiles that we have seen hit civilian targets makes me think this is just as likely to be intentional targeting as it is "flying off course".
Also it would be an AA missile not rocket, rockets are unguided.
> The point of this launch was likely some military target.
You have no way of knowing this, and one could say literally the same thing about the Israeli strikes.
> As a fact, no, I don't. The amount of carnage seen on Palestine photos and videos is not matched.
You’re not looking very hard then.
The carnage is identical; the Russians have literally leveled entire cities in Ukraine. I dare say the Russians have leveled an area that is overall larger than Gaza in its entirety.
This is before we get to the especially brutal parts of Russia's war, like when they launched an anti-ship missile at a shopping centre.
> Launching "a missile" at a target pales in comparison with a row of apartment blocks levelled by air strikes.
It literally nearly levelled a shopping mall, killed 20+ civilians, and injured close to 60.
So I guess the real question becomes: are civilian casualties okay or not? If they are not okay, then Russia's indiscriminate targeting and killing of civilians in Ukraine is just as bad as Israel's.
> And no, ground offensives are not the same. I hope we won't see those in Gaza - there is still chance.
So if Israel kills civilians in a ground offensive, does that make it okay?
A ground offensive makes it worse, but it's already very bad. Worse than most of the stuff seen in the Russo-Ukrainian war.
The question is actually a different one: about the collateral damage. But in the case of both Palestine and Israel, that question does not even enter the picture. Both sides show no discretion in killing one another.
You never answered the question: are civilian casualties okay or not?
Given how multiple cities in Ukraine look after Russia's "ground offensive", I presume you'd be fine with Israel flattening Gaza in a ground offensive?
Considering that’s exactly what Russia is doing to Ukraine.
> The question is actually a different one: about the collateral damage.
Or are you saying that if Ukraine used civilians as meat shields and forced them to stay in place, you would have a problem with what Russia is doing?
Why do you write "Israeli", when it was the Gazans who bombed their hospital themselves? Like other autocracies tend to do to their citizens, by not valuing their lives.
Sorry, I didn’t get that at first. Thanks for the clarification; I reread the comment you replied to, and it’s now clear they’re trying to whitewash terrorists (Russia and Hamas) while blaming the victims.
With a rocket made of water pipe and ammonia fertilizer? Obviously it would not do that kind of damage. We've seen the impacts of the stuff Hamas has, and it can't do that.
The angle from which Israel is doing this damage control and reciprocal blaming is laughable.
I don't know who bombed the hospital in this case. But in the past, Hamas showed very clearly that they don't value the lives of their people at all, and they used women and children as human shields on numerous occasions.
I had an experience using Starlink at a hotel in the Wadi Rum desert in Jordan. Funnily enough, I only became aware of this after some websites began redirecting me to their German versions; it turns out our traffic was being routed through a host with an IP from one of their STRLNK-POP-FRNTDEU1 pools. The list @mkimball linked to doesn't include Jordan, likely because the service hasn't been officially launched in the country. This also means that, for now, you can enjoy a truly anonymous VPN-less browsing experience from there :)
How would it be anonymous? Spacex ties your account to your traffic and most likely logs quite a bit. Or do you just mean geo-located to the wrong country?
"No HTML Club" stands as the only logical step forward in this evolution. Browsers are perfectly capable of rendering plaintext, what could we ever need those pesky "tags" for?
(Yes, I know that technically a codepage isn't ASCII. I guess you could use the box-drawing extensions to draw foreign-language characters if you wanted. Or maybe just an SVG of the text.)
please not codepages. finally being able to write multiple languages in a single document, which is something i need to do frequently, really makes a difference. codepages were a nightmare compared to the simplicity that unicode is in this aspect.
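the difference is easy to demonstrate (a small sketch in Python; the mixed-script string is just an example i made up):

```python
# One UTF-8 document can mix scripts freely; a single legacy
# codepage cannot represent all of these characters at once.
text = "Grüße, привет, שלום, こんにちは"

utf8 = text.encode("utf-8")  # always succeeds

try:
    text.encode("cp1252")    # Western European codepage
    ok = True
except UnicodeEncodeError:
    ok = False               # cp1252 has no Cyrillic/Hebrew/Japanese

print(ok)  # → False
```

with codepages, mixing these languages meant juggling escape sequences or separate files; with unicode it's just one string.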
These make me feel weird. On one hand, I love the way the justified text takes me back to old READMEs, GameFAQs, etc., but on the other, all accessibility is thrown out the window.
Clicking on links is a bad idea; links are a bad idea. Besides the horrendous potential for wasting your time, you expose your reader to all kinds of dangers. Links could be changed in ways you don't control. They could point at offensive content, things you are not supposed to know, illegal things, phishing websites, hackers, and much more you are better off not knowing about.
just the other day i logged into a MUD which presented me with an interactive world with a graphical map in color. the same MUD has a built-in http server, so it can display the same information via the interactive, feature-rich yet compact telnet interface, as well as via the round-trip-heavy and verbose http/html interface.
History, forgotten. Chernobyl happened just 37 years ago. The RBMK had a flawed design, and this could've only happened in a corrupt socialist country, you might've said, but then we got Fukushima 25 years later. Now you might say that the new generation of reactors will never misbehave in such a way, that we've learned from these mistakes. But accidents happen, negligence and corruption happen, wars and terrorist attacks happen, and any reactor, if mishandled, has the potential to irradiate half the planet.
I'm familiar with both sides of the debate and am not strictly against new nuclear power plants, but your comment is a microcosm of what I think is wrong with the pro-nuclear side of the debate. We do have a history of devastating accidents and close calls, so why dismiss our concerns as those of loud extremists?
Ok, please state the total number of people who died because of nuclear plant disasters (direct and from radiation).
After that, please state the total number of people who died because of fossil fuel pollution + disasters + radiation from coal.
Please also give the total number of people who died because of hydro disasters (we can add those to solar and wind, since hydro is used as backup for them by storing excess).
Not debating, just interested in how the numbers differ in the context of you mentioning Chernobyl and Fukushima; the world should already have these stats.
I think this is a valid, serious concern; I was thinking more of the unreasonable nuclear pollution/waste fears and the "why bother" pro-fossil-fuel attitude.
But when it comes down to it, I think the meltdown risk is worth a modest investment in new and existing reactors, in reasonable locations, using low-risk designs. The scenario where the world leans too heavily into fission seems impossible.
It's just more stable, at least this has been my experience. I've tried hard to become a full-time workstation Linux user for years, daily driving Ubuntu, Mint, and Fedora for months at a time, but I always had to come back to Windows. Nvidia and Intel driver issues, package manager bugs, reduced laptop battery life, general UI clunkiness, and times when GRUB suddenly decided not to boot have taken so many hours of troubleshooting that could've been spent doing something actually productive.
Windows has many issues, but it never decided to break on me in the middle of the day. For me, an OS is not a religious affiliation but a tool, and Windows performs much better as one.
I’ve experienced the same. In fact, I recently tried migrating to Ubuntu. The user experience is a lot better than it once was, but it’s still not great.

For instance, if I want to see what the temperature outside is on gnome, I need to install a weather app. There are several, and amongst them, the Ubuntu software installer says they’re not verifiable because a 3rd party developed them. Ok, fine, I just want the one most people are using, because I assume that is the one that is best maintained and has the best features. I’m not sure which one that is. Oh well, install the first one after a brief search to determine which is considered most “native” to gnome and Ubuntu. After installation, I don’t see the weather on my top bar. I open the weather app, look around the settings, but there’s no option to see the weather displayed on the bar. I give up.

Later, my machine seems to be stuttering a bit (64 GB RAM, AMD 5970, RTX 3060), so I reboot and it’s back to normal. I try to play a game, and get an error stating that Vulkan isn’t installed (it is). I reboot instead of fiddling with it to find the root cause, and it’s working again.
I don’t have to do this stuff with Windows. It just works. I don’t mean to downplay the efforts Ubuntu developers have gone to in order to get it to its current usability. It’s pretty good, it just has a bit more maturing to do before I can make the permanent jump. A while back, I read that Ubuntu was hiring a product manager for the desktop, or maybe gaming? Anyway, I wish them luck, and hope they’re able to make strides on the experience.
> For instance, if I want to see what the temperature outside is on gnome,
In the amount of time it took you to do all that, you could have opened a browser and typed weather.com to see the weather.
I think this is the grandparent OP's point: Showing you news or showing you the weather is not the job of an operating system. The operating system is there to manage system memory, the filesystem, networking, security and permissions, drive peripherals and accessories, maybe provide a desktop environment.
That said, I would expect my operating system's vendor to also ship high quality applications that I can optionally install after I install my operating system. Ubuntu should have a weather application, or at least a strong opinion about which third party one is the best and that new users should use. So, you're not wrong. The whole "search through 40,000 half-assed weather applications and hope user reviews are accurate" situation is also bad.
I've never understood the obsession some people have with "weather" apps.
Heck, if a linux user wants to know the weather, all they have to do is look at their windows. (might have to go up the stairs though :-)
I run W11 - and it SUCKS... one weird thing was I have my webcam covered in tape 100% of the time. Here was a creepy popup I got one day - it slid down from directly top-center of screen and gave me a notification asking me to uncover my webcam.
it only happened once - but WTF - and I haven't seen it since, and I couldn't find anything on Google about it. WTF is that?
> I run W11 - and it SUCKS... one weird thing was I have my webcam covered in tape 100% of the time. Here was a creepy popup I got one day - it slid down from directly top-center of screen and gave me a notification asking me to uncover my webcam.
When I run Windows these days, I assume every single part of it is compromised, either by scummy third party software running in the background or by Microsoft's scummy software running in the background. This includes cameras, microphones, any radio, the networking stack, any drives (local or network) the machine can so much as ping, everything. I have a special vlan prison I put my Windows machines in because I treat them like the hostile attackers they are.
Say what you will about Apple's "walled gardens" but every time a frustrated 3rd party developer complains online that they can't do X, Y, or Z on Macs because of permissions or security, I get a little more comforted that my Mac's software is not constantly attacking me.
When you say "black electrical tape", do you mean from, say, Harbor Freight, or from 3M? Since the 3M stuff blocks UV fairly well, I can't really see light getting through it, although I'd have to find some to test.
Years and years ago we'd buy a roll of film, pull it out in the sun, and then get it developed but not printed. You could use that as an IR-passthrough filter on a camera lens - this is if my memory isn't faulty.
You gotta open the window to find out how hot it is. After noon, stand facing West and see how many hand-widths between the horizon and the sun… each hand-width is an hour until sunset. Source: Boy Scouts
My office does not have windows (the hole in the wall, not the OS). Also, I usually don't want to know the weather right now, but for the next couple of hours or the rest of the day. It's not an obsession; it's a handy piece of information which is nice to check at a glance without having to open some app/website.
I had a coworker who didn't have a window, but did have a whiteboard. If there was a significant change in the weather, he'd go draw it on a "window" on his whiteboard.
It shouldn't be preinstalled, but it should be easy to find professionally reviewed applications for the most common user application categories. Android's Google Play Store has "editor's choice" for example. If Ubuntu is trying to be THE desktop linux, this is something they should be doing.
Because even if Ubuntu knows that, users often want conflicting things.
User A may want a weather app preinstalled; user B may not want their computer knowing their location. User A and user B might even be the same person.
And that's assuming Ubuntu knows it, which let's be real, Ubuntu isn't great at knowing what its users want.
And all of that is assuming it's even true that most people do want a weather app.
I think KDE would serve users coming from Windows much better; you'd have a much better experience out of the box. I have used Ubuntu, Fedora, and Arch Linux with gnome-shell and rigorously kept my extensions up and working for a few years, but eventually got tired of them breaking with every GNOME update and the desktop crashing every few days. I have since switched to tiling window managers such as i3/sway for work and to KDE for personal use (for example with openSUSE Tumbleweed).
Ubuntu is literally just a collection of thousands of individual packages. It’s not a cohesive whole developed in the same repository like Windows.
Windows is a completely unified OS where there is a huge repo that can build the whole OS from source.
That makes Ubuntu extremely customizable. You can swap out the window manager, or just remove it, use a KDE file manager instead of gnome, do whatever the hell you want. That comes at a cost.
Windows is just Windows. You can’t replace the desktop (you used to be able to swap the shell), the file manager, the task manager, or the installer system. That makes things integrated and easy to use. That comes at a cost also.
For me, I use Ubuntu with i3 for work, Windows for gaming and personal stuff, and macOS because I have old work laptops that are MacBooks.
They all have their pros and cons. Having all three means I always have the right tool for the job available.
Just upgraded to Windows 11 (for its HDR features) and weather is now part of the "Widgets" bombarding me with ads and poor news sources. I genuinely tried to customize my "feed" but it's all junk (no reputable sources whatsoever) and I didn't find a way to remove them.
So the only sane course of action was to disable widgets altogether while I still can. And now I don't have the weather anymore.
The issue with the taskbar is that there are a couple of different implementation APIs, and your shell probably only supports the GNOME one out of the box. I don't recall the name, but there's an extension that will add support for the KDE API for taskbar extensions. I'm running Budgie with a relatively customized setup, and that was a long while ago, so I'm not as immediately familiar with all that I did.
Will likely switch back to PopOS when the next LTS comes out though.
It's kind of funny that this is often brought up as some Achilles' heel of Linux, but honestly my Windows PCs have always been larger headaches.
In fact, unless I was new and heavily tinkering with my distro, Linux has easily been the more "stable" one. All my problems were... definitely me problems.
At the end of the day, they're both OSes running on a jaw droppingly wide variety of hardware, but whenever I look up a problem I have on linux, I find an answer that makes sense.
Meanwhile, the brand-new, mainstream hardware I bought for gaming, with Windows forcibly sold to me with it, spent a year not being able to play audio properly while Microsoft publicly insisted it had nothing to do with them, until it was quietly fixed in a Windows update, which I'm sure had nothing to do with them.
Also, waking my computer from sleep occasionally just crashes my entire system, or even booting it up will cause it to crash or bootloop a few times. It's genuinely amazing what "paid development" gets you from monopolists.
> “while Microsoft publicly insisted it had nothing to do with them, until it was quietly fixed in a Windows update, which I'm sure had nothing to do with them”
Microsoft is often taking the blame for, and working around, other vendors’ bugs. Just because they fixed it doesn’t mean they broke it.
That’s all very well, but my end-of-the-day take is that if you want to convert more Windows/Mac users, you need zero friction. So often you get these handwavey (snobby?) attitudes of “why don’t you just [insert hard-to-do thing for the average user]”, and in the meantime nobody is the wiser.
Also not saying that things aren’t getting better, but it’s a snail’s pace.
Windows for all its flaws is zero friction and will win against any competition.
> Windows for all its flaws is zero friction and will win against any competition.
It wins because it's less friction, not zero friction. There's a reason, other than old applications, that there are still Windows 7 installations. Many people don't want to upgrade their Windows until they upgrade their hardware because it's a hassle getting the interface back to the way you want it.
Anecdotally I'm not a programmer and I switched to Ubuntu when I bought this laptop in 2013, with about 3 or 4 years of dual booting for software purposes before I stayed on Ubuntu. I'll switch away from Ubuntu to a more user friendly distribution with my next computer because it's pushing features I really don't like, and deleting features I really do like. My wife is also not a programmer and with the upgrade to Windows 10 we had to do a bunch of searching and tinkering to make the user interface satisfactory. She's avoiding Windows 11 for as long as possible.
I was having a conversation with my mother today about changing her bank. She ultimately decided against it because she doesn't want to change her bank account number.
People are creatures of habit. Microsoft learned this when they removed the start button in 8.
It is, but that’s a little meaningless when I haven’t been to a bank in maybe 20 years. I see stories here about banking infrastructure and suspect it might be very hard to get changes made.
The difference isn't even less friction. It's more familiar friction.
Here's the most crucial point: windows has the most thoroughly documented friction. If you ever have a problem, chances are 1,000 or more other people have had that problem, or a closely related one, and 1 or 2 of them even wrote about it somewhere. Life is way harder than it needs to be, but you are not even remotely alone.
Apple takes the opposite approach: walls instead of friction. If you can't figure it out, it's because your computer can't do it. That implies your computer shouldn't be able to do it. You would be surprised at how comfortable people are with this conclusion. It doesn't get them what they want, but it saves them time and energy by providing early and confident rejection.
Linux maximizes the ability to manage friction. There is always a way to actually resolve it with constructive effort. That's an unfamiliar strategy, and it requires some level of education that the average user refuses to accommodate, even if it will definitely save them time and effort.
> It's kind of funny that this is often brought up as some Achilles' heel of Linux, but honestly my Windows PCs have always been larger headaches.
Same. I switched to Ubuntu a decade ago when my Windows machine started displaying blue screens. Somehow the motherboard itself became incompatible with Windows overnight, even from a clean install. Instead of junking the board I put Ubuntu on it ... and it's still my daily driver a decade later. And though there have been issues, I'd say fewer than I had with Windows overall.
Was that machine a laptop? In my experience, power-oriented laptops and Windows mix like oil and water.
I had an ASUS ROG Zephyrus G15 for a little bit, and its Nvidia GPU was weirdly fussy in that it had to be running ASUS-provided Nvidia drivers; if it wasn’t, it’d perform 20-30% worse while running just as hot as if it were at full performance. This was maddening because Windows Update would want to update the ASUS drivers because they were old, but this of course nerfed performance. I tried restricting this in the Windows policy manager thing, but unbeknownst to me the Nvidia driver is split up into several pieces, which then resulted in the pieces getting mismatched, which broke all sorts of things.
I ended up returning it and putting the money towards a custom built tower instead, which has had none of these issues.
This is why NixOS has been so great for me: it factors out an entire class of "me" problems.
If I decide to go down a rabbit hole that involves totally messing up my system, I can undo all of that by simply rebooting into an older generation. NixOS never diverges from "fresh install".
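For anyone curious, a sketch of what that rollback looks like (standard NixOS commands; generation numbers and profiles vary per machine):

```shell
# Every `nixos-rebuild switch` records a new system generation;
# list the ones you can fall back to:
sudo nix-env --list-generations --profile /nix/var/nix/profiles/system

# Revert the running system to the previous generation
# (the same thing choosing an older entry in the boot menu does):
sudo nixos-rebuild switch --rollback
```

The boot menu route works even when the current generation won't boot at all, which is what makes the "never diverges from fresh install" property so forgiving.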
Now if we can just get the UX together, it will be incredible.
> Now if we can just get the UX together, it will be incredible.
The crux of the NixOS issue right here. I tried NixOS a few times, even this past weekend, and it was such a pain that I gave up each time!
I am planning to integrate Nix (the package manager) into my recent fresh OS install if I have some time this week. I want to use Nix to have, at the very least, a controllable way to install and remove toolchains of different versions in a reproducible manner; if I can swing it, I'm going to use it to install pretty much anything that requires any sort of configuration care (for the rest I'll just use apt). I also want to integrate more tools like asdf or pyenv, which help with that, but I'd prefer to do it all through one package manager like Nix. I finally separated my /home onto another drive this time, so that'll be nice for future re-installs.
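For the toolchain part, a minimal sketch of what that looks like with plain Nix on a non-NixOS distro (assumes a standard Nix install with a nixpkgs channel; the package names are just examples):

```shell
# Drop into a throwaway shell with specific toolchains on PATH;
# nothing is installed system-wide, and exiting the shell removes
# them from scope (the store paths get garbage-collected later)
nix-shell -p python3 nodejs

# Or declare it per-project in a shell.nix, so the same environment
# comes back on any machine or after a re-install
cat > shell.nix <<'EOF'
{ pkgs ? import <nixpkgs> {} }:
pkgs.mkShell {
  packages = [ pkgs.python3 pkgs.nodejs ];
}
EOF
nix-shell   # picks up ./shell.nix automatically
```

This covers most of what asdf/pyenv do, through the one package manager, which is the appeal.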
The fact that Nix's user experience can be so bad is the greatest evidence of its inherent usefulness. If you are able to get it working for you, it's somehow worth it.
It's funny because I've had a lot of people share this exact same experience with me, since they know I've been a Linux user since the late 90s. But this is my experience with Windows! Especially the shortened laptop battery life. Windows runs so much in the background that performance feels 10x slower on the same machine and the battery drains much faster ... in my experience anyway. But I also have issues with Windows drivers and applications crashing often. Whereas on Linux things "just work" for me the vast majority of the time.
To be fair, though, there was a short-lived period a couple of years ago where a lot of laptop trackpads wouldn't work, and I had a work-issued laptop that didn't seem to want to play nicely with an external monitor.
So I wonder if this is largely what you're used to. I run Linux on all of my devices, which are hardware I've chosen myself and had high confidence would have good Linux support. It's only been the work-issued machines that I've had issues with... so I probably just have a sense of how to get things to play nicely because it's what I've used primarily for so long.
It's the times when it breaks that make the difference for me. On Windows, most breakage is an annoyance. On Linux it can grind my day to a halt. Here's something I did on an Ubuntu install recently:
apt install python3
Oops. That's not what I wanted.
apt autoremove python3
OMG! What have I done! IIRC that stripped out so much stuff I didn't even have networking. Lol.
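For others who end up here: a safer habit (a sketch, assuming Debian/Ubuntu's apt; `python3` stands in for whatever package bit you) is to preview the removal set before committing:

```shell
# Show exactly what autoremove would take with it,
# without actually removing anything
sudo apt-get autoremove --dry-run python3

# If a package was pulled in as a dependency but you want to keep it,
# mark it as manually installed so autoremove leaves it alone
sudo apt-mark manual python3
```

`autoremove` removes the named package plus every auto-installed dependency nothing else needs, which on Ubuntu (where half the desktop depends on python3) is how you end up with no networking.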
macOS doesn't have the issues you've described, while giving you many of the same tools Linux has, in addition to support for most mainstream Windows software.
And while macOS isn't as free as Linux, it's certainly less "spyware" than Windows.
Growing up in a developing country, until recently Apple devices (laptops/desktops especially) have been a bit out of price range for me. Although I can afford one now, my current laptop is nowhere near its end of life, and something in my soviet-scarcity-mentality-influenced mind doesn't feel right about upgrading just for the sake of upgrading. That said, Apple laptops look very convincing at the moment, and when the time comes they will probably be my first choice.
One thing that more than balances it out for me: my Macs have a much longer lifespan than my Windows machines. I have a larger up-front cost, but get far more out of them over time. I average at least 7 years without having to do anything with my Macs (and used my last two for just under 10 years). Windows has improved dramatically since the 2000s, but I seem to either need to reload Windows from scratch on the same hardware to maintain decent performance, or upgrade hardware a lot more frequently than I have with my Macs. Also, when setting up a new machine, I have found Apple's migration assistant to be flawless in moving everything from my old machine to my new one. I only deviated with my current M1 MacBook Pro because of the CPU architecture change, and friends' overwhelmingly positive experiences using migration assistant to move from Intel to AS Macs indicate I probably wasted a bunch of time needlessly in setting everything up fresh.
One thing that helped me is realising that I should be selling old electronics, so it goes to the people who need it most, instead of having it collect dust in the drawer until it becomes dead and obsolete.
I've not seen this and I set up new Macs every year. Can you expand? You'd have to open the news app to see this AFAIK. Unless it was part of some iCloud on-boarding process which I haven't done in years.
I don't recall the details. Possibly it appeared as part of an onboarding process.
I think in the same way we now see advertising on school buses and display-covered vending machines (even inside a state office), we're going to end up with forms of outreach/ads in our tools unless there are more robust forms of support (could be paying, could be a more multi-capital flow of support).
That only happens if the loved one opens the Apple News app, and grants permission for notifications. If they never open Apple News, or if they open it and deny permission, no notification.
My personal record is 11 months on Mint. I have also used Ubuntu, Fedora and (a long time ago) SUSE. But as you say, sooner or later something comes up that forces me to go back to Windows. Things like poor GPU performance for certain applications (like Obsidian for example) or GRUB acting up or WLAN/GPU drivers suddenly not working after a kernel upgrade and so on.
Would you mind sharing your Win10 setup? I use it too, but it's a stock version with just some basic cleanup.
For the record, I've never had a WLAN issue in 3 years of Linux (Ubuntu and then openSUSE). I can't attest to GPU as I don't have a dedicated GPU though.
GRUB has also been quite the happy camper in my experience (at least if you don't go mucking about with config files).
@LorenD: "For the record, I've never had a WLAN issue in 3 years of Linux (Ubuntu and then openSUSE). I can't attest to GPU as I don't have a dedicated GPU though.
GRUB has also been quite the happy camper in my experience (at least if you don't go mucking about with config files)."
How dare you not have a problem in a discussion on Linux. (Not just there yet :)
I did the same thing with Ubuntu. I used it roughly for a year before getting tired of having to find workarounds for things like a webcam. I moved over to a Mac mini and life has been good.
I still think the future is Linux. I see Microsoft and Apple taking their O/S in directions that are anti-consumer.
> It's just more stable, at least this has been my experience.
It was more stable, that's why I used it. Then starting with a certain Windows 10 update I had to reinstall the system multiple times because automatic updates kept breaking it overnight, it started crashing the USB driver, suddenly it kept randomly switching keyboard layouts by itself, and somewhere around the third ruined weekend due to an unbootable system I had enough. Switched to an Arch-based distro for 3 years in which I only had one update-related issue and it took me a whole 5 minutes and one reboot to fix. Now I partially use Mac OS and while I'm disappointed by some of its aspects I can at least be certain it will boot tomorrow and it won't install a system update without my confirmation.
I had similar issues... especially after Win11. Admittedly, I was running Insiders builds, because I wanted new WSLg features, etc. Then one day I was on Windows 11... okay, got the app bar pinned back on the left... a month later, oh, you don't have secure boot enabled, you'll need to reinstall... enabled secure boot, still had to reinstall... a few months later, the nvidia drivers kept borking out and blanking my screen. The Windows release kept overriding the newer NVidia drivers for w11. Figured out how to pin them... another couple months, start seeing adverts in the damned start menu search. That's it, I'm out. I reinstalled Win10 in case I needed it, disabled secure boot and tpm... and haven't booted back to my windows drive since. I've had two small issues in Ubuntu, both relatively easily fixed.
I don't think I'm going back. I use Win10 at work, and fortunately most of my actual day is in VS Code under WSL. And that's about all I can stand.
In fairness, you can't take your experience with insider builds as an indicator of the stability of the OS. Insider builds are expected to be unstable.
When I reinstalled after the insiders build first nuked itself, I switched to stable/mainline. Still had issues after that. I had run insiders on Win10 about 3 years without issue before that.
One of the reasons I went with AMD for my new GPU was Linux support. Nvidia's Linux support was abhorrent even before Torvalds made the famous gesture, and that was over 10 years ago! My old workstation had an Nvidia card and performance was all over the place, and that's on lucky days!
I have found that more recent cards from both vendors are a lot better on Linux than their older gear. My 10 year old GPU had huge issues on Linux (running fine on Windows), but when I got a more recent GPU for my Linux box, it ran fine.
That's one of the main issues that is solved with open drivers.
A year or two after AMD acquired ATI in 2006, I had just gotten my hands on my first ever modern graphics card: the All-in-Wonder 2006 edition. It was basically a Radeon 9600 with a built-in capture card.
This was also around the time I was really getting into Linux. I'm pretty sure I could dig up a CD with Ubuntu 8.04 that I burned fresh in 2008.
As a poor teenager living on abandoned hardware, I watched the full life cycle of that card's Linux support. I lived it.
At first, the proprietary driver support was pretty good. I could just open Ubuntu's handy dandy "driver manager", and get a neatly wrapped .deb installed. A quick restart of Xorg, and I had full GPU support. I could turn on all the flashy compiz effects: wobbly windows and a cube of virtual desktops.
This was the most exciting era for the Linux desktop. It was easy, familiar, and powerful. All we needed was a compatible MS office alternative and a few well-ported AAA games, and we would be living the dream. The future of Linux was bright and close.
A few years passed, and proprietary Radeon drivers weren't getting packaged anymore. The free radeon driver was stable, but didn't have working DRM (the kernel's direct rendering manager), so no hardware acceleration. Even on Windows, there wasn't great driver support for ATI cards. This was pain from every direction, and for whose benefit?
A few more years passed, and the free radeon driver became the best option: better than the proprietary one. By the time this happened, though, you could get a vastly more powerful card for ~$30, so the point was moot.
When AMDGPU was announced, I was ecstatic. Finally, a major hardware company found the value in making a full-featured, performant, and open driver. Never again will I need to fight the most purposeless incompatibility, the pain with no benefit, the hell that need not exist in the first place: proprietary video drivers.
Had a 5700 XT at launch... drivers were effectively broken in mainline Ubuntu for nearly 6 months. I had to use a beta kernel, which broke other things. I sold that and managed to get a 3080 via Newegg Shuffle, then went back to Linux after various Windows issues.
Intel is pretty good about upstreaming drivers into the kernel. The only bugs I've ever run into are around brand new wifi cards that haven't been mainlined yet. And even then I don't think I've seen that in about ten years. Nvidia on the other hand is a huge pain on Linux, but that's deliberate on Nvidia's part.
> reduced laptop battery life.
Been using Linux as a daily driver for over 15 years and laptop battery life has been better than Windows nearly the entire time.
To be fair, I cut my teeth automating Linux environments in physical datacenters. So I've lived in a world where power consumption mattered, know how to select hardware with good driver support, and can tune the OS.
That said, you can get a brand new Lenovo idling under 5w without that knowledge and by simply installing tlp. With additional know how you can get it under 3w.
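The "simply installing tlp" step above can be sketched roughly like this; this assumes a Debian/Ubuntu-based laptop, and package names may differ on other distros:

```shell
# Install TLP (applies sensible power-saving defaults) and powertop (to measure).
sudo apt install tlp powertop
sudo systemctl enable --now tlp   # start now and on every boot
sudo tlp-stat -s                  # sanity check: TLP active, on battery power

# To see the idle draw figures quoted above: unplug AC, let the machine
# settle for a minute, then read the discharge rate in powertop's overview.
sudo powertop
```

The wattage you actually get depends heavily on the panel brightness, the wifi chipset, and whether the GPU can reach its deep idle states.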
> For me, an OS is not a religious affiliation but a tool, and Windows performs much better as one.
Funny how you and I have the same value but wound up at opposite conclusions. I guess it's all about the tools and how we need/expect to use them.
I have been using Linux for work and my home server for around 20 years. For my personal computing I have been using Windows and macOS.
Every now and then I get my hopes up that the year of Linux is finally here and I install the latest.
I have a simple heuristic. If in the first day of setting up the system I am required to fire up the terminal, it means that more pain is coming in the future, so I immediately delete the Linux partition.
I am still using just windows and macos for my personal computing needs.
> I have a simple heuristic. If in the first day of setting up the system I am required to fire up the terminal, it means that more pain is coming in the future, so I immediately delete the Linux partition.
This is exactly my litmus test. The requirement to touch the CLI indicates little thought for the UX or care for users who don't want to use the CLI. Every year I boot up another flavour of Linux and every year it fails this test. Linux is built by developers, for developers. That's fine, but let's be honest about it.
Last time for me, the audio was glitching, which never happened on my PC. I think the solution had something to do with changing the audio sampling rate.
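The commenter doesn't remember the exact fix, but on a PipeWire-based distro the usual version of "changing the sampling rate" is pinning the clock rate (and bumping the minimum buffer size) in a user-level drop-in - a sketch, assuming PipeWire is the sound server:

```shell
# Pin the sample rate and raise the minimum quantum to stop crackling,
# then restart the user-session PipeWire services.
mkdir -p ~/.config/pipewire/pipewire.conf.d
cat > ~/.config/pipewire/pipewire.conf.d/10-rates.conf <<'EOF'
context.properties = {
    default.clock.rate        = 48000   # force 48 kHz instead of autodetect
    default.clock.min-quantum = 512     # larger buffers = fewer glitches
}
EOF
systemctl --user restart pipewire pipewire-pulse
```

On older PulseAudio-only systems the equivalent knobs are `default-sample-rate` and fragment settings in `~/.config/pulse/daemon.conf`.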
The issue is pretty much the same as it ever was -- hardware manufacturers support Windows first. Linux is usually later if at all. Community support sometimes steps in to fill in the gap, and some manufacturers (most notably AMD) are coming around, but this is still usually the issue.
Oddly enough, this means that Linux tends to work better as the computer it's running on gets older. The reverse is true for Windows -- updates tend to make it slower and/or have more compatibility issues. A computer that worked better with Windows a few years ago will not-infrequently perform better under Ubuntu today. It's not usually suggested on new PCs unless it's spec'd specifically with Linux compatibility in mind.
I dual booted Windows on my desktop and laptop for a few years and also noticed lots of weird issues - reduced battery life on my laptop, sleep/hibernate being broken, GRUB occasionally just dying on me. I eventually got rid of Windows all together and now just run Manjaro. I was surprised that suspend issues and battery life on my laptop, for instance, completely went away.
The main thing that kept me on Windows for years was games, but once I jumped into using Proton via Steam on Linux (and now the tweaked Proton GE), I can run almost all of my game library at full speed. The few games I can't play are due to anti-cheat software like Battleye.
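For anyone curious about the "tweaked Proton GE" step: GE builds aren't shipped by Steam itself, you drop them into Steam's compatibility-tools directory by hand. A sketch, where the tarball name stands in for whatever release you downloaded from the proton-ge-custom GitHub releases page:

```shell
# Unpack a downloaded GE-Proton release where Steam looks for custom tools.
mkdir -p ~/.steam/root/compatibilitytools.d
tar -xf GE-Proton8-25.tar.gz -C ~/.steam/root/compatibilitytools.d

# Then restart Steam and, under Settings -> Compatibility, enable Steam Play
# for all titles and select the GE-Proton version from the drop-down.
```

Per-game overrides are also possible via each title's Properties -> Compatibility tab.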
>For me, an OS is not a religious affiliation but a tool
I hard agree on this.
I use my W11 22H2 as a daily driver because of that: it just works (TM).
Also, I don't see any kind of news or the likes on Windows, and it feels really close to Windows 7. Here are the things I've done so far:
- Bought a Windows 10 Education license from a shady website, and it's been working fine for the past 4? years. The upgrade to Windows 11 Education went smoothly as well.
- Local account only. No Sign in to Windows crap.
- (Most important imo) Installed O&O ShutUp10++ (the ++ version supports Windows 11) and turned almost everything off there.
- Bought a Start11 license for 5 USD and installed it, and switched to the Windows 7 start menu using it.
No ads anywhere in Windows, feels like Windows 7, talks much less to whatever 3rd party people MS have contracts with to get my data.
FWIW, I did try daily-driving Linux Mint Cinnamon, which is the best Linux for me imho. However, there is a show-stopping bug as follows:
I have 3 monitors, with the main one being a 4K over DP. My monitors are set to turn off after 5 minutes, and in Windows, they turn back on just fine when I move the mouse (minor annoyance: display scaling is not reflected across all application windows till I minimize and open the window again but no big deal).
For Linux, the DP monitor won't get detected after waking; it behaves as if my other 2 monitors are the only ones connected. I have to flip the monitor's power switch off and on again for it to be detected. I looked around but didn't find any good way to solve it...
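On an X11 session, one workaround sometimes worth trying before reaching for the power switch is forcing a re-probe of the DisplayPort output - a sketch, noting that the output name (`DP-1` here) varies by GPU and driver:

```shell
# List outputs; a DP monitor lost after DPMS sleep often shows "disconnected".
xrandr

# Force the output off, then re-probe and re-enable at its preferred mode.
xrandr --output DP-1 --off
xrandr --output DP-1 --auto
```

Whether this helps depends on the driver; DP link training after monitor power-down is a long-standing trouble spot, and Wayland sessions need compositor-specific equivalents.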
My experience is the opposite. I use Windows at work and Linux for everything else. Linux is much more stable and when something is wrong, much easier to fix. Windows has definitely broken for me in the middle of the day.
Pretty much the same as me. Windows hardware support is really great, Linux is always a hassle for me. I've tried and tried with Linux, but I've given up on it as my primary desktop
On various hardware, over the years, I've had both the same and the opposite experiences. For example, linux doesn't decide to reboot during the day to perform updates without your consent.
Overall, when there is not some specific hardware issue, I've found linux running much smoother and more user friendly _for me_. Gnome is a lot less cluttered, things are easier to find. It is also often much better in supporting older hardware and, ironically, older windows applications.
There is a middle ground. I use Windows/Mac for gaming, entertainment, and casual browsing, and I run Mint in a VMware VM for serious stuff. The added benefit is that I can easily back up, snapshot, and transfer my work OS anywhere. And Mint/Ubuntu provide much better out-of-the-box productivity tools. I map my multiple Mint desktops to numpad keys. You can't do that with the other ones without additional software.
Interesting. My experience is that Linux (Debian, anyway -- Ubuntu has never given me anything but headaches and instability) is at least an order of magnitude more reliable than Windows. It's been over a decade since I've hit a serious or crashy bug in Linux. I hit one about every other day with Windows.
I wonder why there's a difference in our experiences here?
> It's just more stable, at least this has been my experience. I always had to come back to Windows. Nvidia and Intel driver issues, package manager bugs, reduced laptop battery life, general UI clunkiness, and times when GRUB suddenly decided not to boot
Well I've had the exact opposite experience. Windows was an endless source of bugs, crashes, and instability. Linux (Mint) is rock-solid, clean, fast, pretty, and stable. I've had more blue screens than I can count, but I remember less than a handful of kernel panics over the last 10 years. No more fiddling around in settings, no more having to use external tools off some forum thread to accomplish something as simple as updating drivers.
The only issue I give you credit for is the battery life, which is indeed better on Windows by some ~20%.
All these might have been true for Windows 10, but with Windows 11 all the things you mentioned are my daily woes. W11 suddenly "upgrades" the video driver and Explorer crashes; it updates the BIOS and the camera borks. With Linux I have the opportunity to freeze the upgrades and get on with my work. Also, due to behind-the-scenes shenanigans, battery life is worse across the board - I am managing more than 100 laptops. In addition, with this OS 8 GB of RAM becomes a joke, and most of my users are running office applications.
Linux has its own pain points, I agree, but especially after 2019 they are rare. With Pop!_OS I never experience any of the stuff I deal with at work. I dare say Pop makes Linux boring - because everything works out of the box.
Every time someone mentions Linux driver problems, I see that name.
For me, the strategy that has worked for the longest time is to get boring computers. The boring Thinkpad, the boring Vostros and Latitudes, the boring ThinkStation and ThinkServer boxes. Large PC makers don't want their corporate-oriented products causing support calls, and that forces them to not be overly creative with their implementations. With that powerful incentive, the hardware is usually well supported by the two boring operating systems (for generic hardware) out there. Either that, or get a machine that's designed together with its OS (and know the odds of you installing anything other than that are slim).
*NIX is the only viable choice for mission-critical desktop stability in 2023. But that isn't to say Windows isn't feature complete. It is still king when it comes to gaming.
Desktop stability and reliability hasn't been a top-level OKR at Microsoft for many years. The company has been growing more product driven for years and falling victim to roadmaps being driven by muggles.
The only place where you can actually design a truly stable and reliable desktop is with open-source kernels and user software.
>> and times when GRUB suddenly decided not to boot
I suspect you are dual-booting, which is itself a hacky middle ground full of bugs. Linux and windows will never share a drive well.
>> reduced laptop battery life
??? Odd. I find battery life on my laptops far better on linux, generally because linux knows how to actually stop doing things when asked. Windows, no matter what you do, will randomly decide to install/download something.
>> general UI clunkiness,
For me, the fact that Linux UIs don't change every few months, and when they do I can undo the changes, makes Windows the clunkier UI. It is Monday morning here. I have so far had to restart Outlook twice on my work computer as new "updates" are applied. I'd take a thousand clunky-looking window borders over MS's daily popup pollution of my screen time.
> Linux and windows will never share a drive well.
Nowadays with UEFI and GPT, sharing a drive is not a problem; they don't stomp on each other's MBR anymore and even UEFI itself comes with a boot manager.
The bigger problem is to learn about these "new" things ("new" because they were introduced ~15 years ago) and stop doing stupid shit that worked with legacy BIOS and is not necessary anymore. GRUB breaking itself randomly is a mostly self-inflicted problem.
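The UEFI boot-manager point above is easy to see for yourself: each OS registers its own NVRAM entry pointing at its own loader in the ESP, so nothing gets "stomped on". A sketch using `efibootmgr` (entry numbers are examples):

```shell
# List firmware boot entries with their loader paths. A dual-boot machine
# typically shows something like:
#   Boot0000* Windows Boot Manager  ...\EFI\Microsoft\Boot\bootmgfw.efi
#   Boot0001* ubuntu                ...\EFI\ubuntu\shimx64.efi
efibootmgr -v

# Reorder entries, e.g. to make the firmware try Linux first:
sudo efibootmgr -o 0001,0000
```

Many firmwares also expose the same reordering in their setup UI, so `efibootmgr` is a convenience rather than a requirement.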
> I suspect you are dual-booting, which is itself a hacky middle ground full of bugs. Linux and windows will never share a drive well.
The Windows installer unfortunately will happily clobber the EFI partitions on completely unrelated drives. Had it happen on a triple boot (Win/Linux/Hackintosh) setup a couple of times, with each OS getting its own drive. MS almost certainly is not testing against multi-OS setups of any kind.
When Windows breaks, it stays broken. I had a W10 install unable to update; running the update would break Windows and leave me unable to log in at all. It required rolling back the upgrade. Best part? It would automatically try to install that update every time I forgot to click the dead man's switch. I'd regularly try to unlock to a useless machine. No online support was helpful, just had to re-install.
Linux is the opposite, nearly everything has a documented fix. It can be fixed in less time than it takes to backup, reimage, and reconfigure your machine.
Linux driver support was hell from roughly around 2010 to 2016. Both major GPU manufacturers had awful proprietary drivers (with even worse packaging), and most wifi chipsets required proprietary firmware blobs to work at all: which was very tricky to package, because of copyright bullshit.
This was also the era of major desktop environments playing fast and loose with their UX. GNOME3 was released in 2011. Ubuntu started defaulting to Unity in 2010, and started their Wayland competitor (Mir) in 2014. KDE Plasma 5 (2014) defaulted to fancy compositing, and felt really bloated relative to the others. The only desktop environments that really kept true to the good old days (~2008) are XFCE4 and MATE (the GNOME2 fork). KDE5 isn't bad, either, but it's still a bit too bloated.
The other problem caused by proprietary video drivers was package versioning. It's tricky to have the right kernel version and Xorg version necessary to run a proprietary video driver blob; and keep the rest of your system up-to-date. Ubuntu found its initial success by creating a generally stable package repository roughly as up-to-date as Debian unstable. Unfortunately, Ubuntu became a bloated mess with strange things like Unity and Mir bundled in. Archlinux has been a good alternative, but it does expect a level of familiarity with shell utilities. Linux Mint (an Ubuntu or Debian fork) is still my first recommendation to casual users. One of these days, it will be NixOS, which is a giant leap in stability and package versioning.
The last change of that era that has been breaking the Linux experience is the switch from BIOS/MBR to UEFI/GPT. This shift was slow and messy, with most hardware adoption following the release of Windows 8 in 2012. GRUB used to break in one predictable way: windows overwrites the MBR, replacing GRUB with its bootloader. Now, with UEFI, boot entries are saved directly to the motherboard, and the bootloader itself lives in the ESP partition. The windows installer will put its bootloader in the first ESP it can find, and you don't get to choose which one that is. Now you have to worry about the ESP running out of space, but that's about it: everything else has been generally resolved, and the UEFI bootloader experience is very solid (apart from the windows installer caveat).
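The "ESP running out of space" worry above is quick to check; the mount point is commonly `/boot/efi`, though it is distro-dependent:

```shell
# Windows-created ESPs are often only ~100 MB, which fills up fast once
# a Linux distro starts storing kernels or large bootloaders there.
df -h /boot/efi

# One vendor directory per installed OS/bootloader, e.g. Microsoft, ubuntu.
ls /boot/efi/EFI
```

If the ESP is too small, the clean fix is recreating it larger before the next install; resizing it in place next to a Windows partition is fiddly.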
Now that AMDGPU is mature, and NVIDIA's drivers are relatively well maintained (and packaged), the Linux desktop experience is even more stable than its heyday back in 2008. If you install a distro that targets relatively recent package versions, like Archlinux, Linux Mint, or even Fedora; and you use a solid familiar desktop environment like MATE or XFCE4; you can avoid most UI/UX clunkiness and have very little need to fiddle with your package manager. Boot issues are pretty unlikely now, so long as you install in UEFI mode (not legacy BIOS emulation), and completely avoid MBR.
>I've tried hard to become a full-time workstation Linux user for years, daily driving Ubuntu, Mint, and Fedora for months at a time, but I always had to come back to Windows. Nvidia and Intel driver issues, package manager bugs, reduced laptop battery life, general UI clunkiness
Oh boy where to begin?
>Nvidia and Intel driver issues
Not a personal dig, but this is a result of having been spoiled. I still remember the small dime novel that came with my box of WinNT 4.0 workstation, all it did was list the various hardware that was on its hardware compatibility list. You wanted to buy some piece of hardware? Better do the homework because not everything was going to be compatible or even supported at the same level. Today everyone expects everything to just work out of the box when you throw an operating system at it. They've completely forgotten the need to even check for compatibility, they've outsourced that to the operating system. They expect it to 'just work' without input.
When it works well it's great! It's magical! But people forget that it's a relatively recent thing and that to get the best use of your hardware you're advised to research it before purchasing and to make sure you check compatibility with the operating system(s) you plan to use.
>package manager bugs
OK, this has hit us all eventually. Valid. But I've noticed most of the time when I've run into this it was a result of me doing things that I really shouldn't or at least which should prime me to monitor my system more carefully. Such as installing Debian packages into Ubuntu. Sure it can work, especially if you do your best to install any needed dependencies. But you'd better know what you're doing and watch for issues after doing so. I'm sure there are other ways the package manager can crap the bed. It's not all on us when it does so. But I really don't think Windows is any better in this regard. I've had stuff eat itself there too with applications and systems upgrading DLLs and leaving me up the famous creek without a paddle.
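The failure mode described above - installing a standalone .deb and ending up with broken dependencies - has a well-worn recovery path. A sketch, with a hypothetical package name:

```shell
# dpkg does not resolve dependencies, so this may exit with
# "dependency problems" and leave the package half-configured:
sudo dpkg -i some-package.deb

# Let apt pull in whatever is missing and finish configuration:
sudo apt-get -f install
```

On any recent apt, `sudo apt install ./some-package.deb` does both steps at once, which avoids the broken intermediate state entirely.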
>reduced laptop battery life
Valid as well. But have you looked into tlp? Have you tried tuning it for battery life?
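For the "tuning it" part, a few TLP knobs that commonly help on battery - a sketch using a drop-in under `/etc/tlp.d/` (supported by TLP 1.3+); the defaults are already sensible, so treat these as starting points:

```shell
# Write a small override file with battery-biased settings.
sudo tee /etc/tlp.d/10-battery.conf <<'EOF'
CPU_ENERGY_PERF_POLICY_ON_BAT=power   # bias the CPU toward efficiency
PLATFORM_PROFILE_ON_BAT=low-power     # firmware power profile, if supported
WIFI_PWR_ON_BAT=on                    # enable wifi power saving
RUNTIME_PM_ON_BAT=auto                # runtime PM for PCIe devices
EOF
sudo systemctl restart tlp
```

`tlp-stat` afterwards shows which settings actually took effect on the given hardware.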
>general UI clunkiness
This heavily depends on your desktop of choice. As a Mate desktop user I've been fairly happy with how my UI behaves. To the point where it is actively annoying to be in another desktop now. Different strokes for different folks though. If Windows is the UI you rely on to the point you have muscle memory, I can sympathize. I'd argue there is a desktop that can match that UI for you on Linux but you'll have to customize it a bit and you'll have to test for which one is closest to what you prefer.
But if your preferred desktop UI is indeed Windows, it's not Linux's fault that it is not Windows any more than it would be OSX's fault it is not Windows. You have to adapt and accept that things work differently in a different operating system. Not wrong. Not misconfigured. Different.
The thing is: with Windows I don't have to do any of the compatibility checking, tuning for battery life, etc. You might have had to in the past, but you can't compare past Windows to Linux today.
I just want to get my work done, and be able to reliably turn my computer on and run my applications. Windows lets me do that. Linux doesn't. I haven't had a Windows update break things in years, where my last Linux experience had the Ubuntu live USB work fine and completely fail to boot to a GUI environment after the install. I don't have time in my life to troubleshoot kernel issues anymore.
Windows is the standard bearer of paid OS's - yes, true.
Ubuntu is the standard bearer of free Linux OS's - not really, and (this is important) less true over time.
What's happening is that, as Windows is improving, Linux appears to be getting worse. But that's really just an Ubuntu problem.
I don't know how Ubuntu got the crown exactly, but it seems to be performing less well over time, and is, increasingly, not the default choice. I would understand if other distros are harder to learn or simply unsupported, but that's not the case.
It feels like 90% of these issues could be resolved by saying "Start with Fedora. In 2023, that's the actual default Linux distro that fixes these problems."
People forget now but Fedora was created because Red Hat abandoned the home desktop market in 2003. Then Fedora was spun off to be a test bed for their enterprise offerings and it was no longer possible to buy a copy of workstation in stores. So when Canonical showed up in 2004 and was focused on the desktop they were able to get a lot of people to move over fairly quickly. The fact that they were using a different type of desktop interface with Gnome that had the two panels unlike Fedora which still had the single large panel like Gnome 1.x made it stand out even more. That and the way almost every other Linux desktop at the time was KDE based...
So yeah, Ubuntu took the crown because it wanted it. It maintains that crown because outside of it and its various spinoffs and flavors no one else is really seeking to be a desktop operating system. Since Canonical has made it clear that its focus is now also Enterprise at the expense of the desktop experience, I imagine it's only a matter of time before someone else steals that crown by focusing on the desktop again. We just need one of these billionaires to fund a company to make it happen... Say what you will about Shuttleworth, he did put his money where his mouth was, and I for one am grateful for the many years of good use I got from Ubuntu as a result. I will be sad the day when inevitably the pain points outweigh the benefits and I must switch away from Ubuntu-Mate to some other system.
> The thing is: with Windows I don't have to do any of the compatibility checking, tuning for battery life, etc. You might have had to in the past, but you can't compare past Windows to Linux today.
I suspect you are talking about some other Windows than the rest of us.
> I just want to get my work done, and be able to reliably turn my computer on and run my applications.
Don't we all?
> Windows lets me do that. Linux doesn't.
You, ok. Others? It's the other way around.
> I haven't had a Windows update break things in years,
Last time? The December 2022 cumulative update for 22H2... that's not that long ago.
>When it works well it's great! It's magical! But people forget that it's a relatively recent thing
It absolutely is not! I recently put together a Windows 95 VM and was blown away by how straightforward and automatic everything was. It automatically recognized most hardware I threw at it, and didn't even need manual driver installation or anything. Things just worked after a reboot.
Early versions of NT (pre 2000) were not consumer oriented and that's why they were more finicky, but by the time of XP, you could expect it to just work with mostly anything again.
> I recently put together a Windows 95 VM and was blown away by how straightforward and automatic everything was. It automatically recognized most hardware I threw at it, and didn't even need manual driver installation or anything.
A virtual machine you say? With a virtualized set of hardware prechosen for compatibility so that Windows would recognize it without issues? It just recognized this collection of virtual hardware selected for compatibility without further interactions? You don't say? ;)
As someone who began with Windows 95 OSR2 on real hardware you will forgive my amusement I hope?
No, a 486 emulator that I installed Windows on, and "installed" an ATI Rage series "card" into, and didn't even have to look around for a driver CD, because Windows just kinda found it.
And don't make assumptions about me; our 1996 Toshiba came with OSR2, including beta USB drivers and more built-in driver profiles for commodity hardware than the plug-and-play "pick device" window could actually handle (it wasn't resizable!).
By the time of OSR2, and then 98, if your device had been reviewed in a PC magazine, you could probably just plug it in, select it from a list, and go on your way.
>No, a 486 emulator that I installed Windows on, and "installed" an ATI Rage series "card" into, and didn't even have to look around for a driver CD, because Windows just kinda found it.
Actually I have 3 children from elementary to high school age. Only my eldest has a phone. We got it for her for our convenience as she is often participating in activities with unknown end times.
Well, huge respect for raising reasonable and well behaved kids.
What makes it hard, imo, is that kids chase status and conformity (social school dynamics, probably). What could be worse than seeing everybody in class with a shiny new smartphone and not having one? Also, them being precisely engineered addiction machines doesn't really help. Which makes me curious, how did you manage to pull it off?
My daughter started asking for a phone at 9. After just saying no a few times she stopped asking. Before we got her a phone she realized why we had been saying no. Her friends who had phones younger were already addicted to them to the degree that the phone was more important than friends who were physically present. It is my job to protect my children from danger. I see device addiction as being as dangerous as drugs. Peer pressure is tough to deal with, but that affects all aspects of growing up, not just keeping up with the Joneses. It takes help to navigate that feeling of missing out.
If the kid is old enough to need a phone, they're old enough to be a caddy or babysitter or something legal for kids to do and buy it themselves. You don't have to facilitate the purchase.
I think there's a strong class component to it, too.
I've got kids in public schools and one in a slightly-fancy private school.
The public school kids all get phones super-early and are prone to mocking kids for "being poor" over stupid shit like not having a phone (one of mine was so-mocked for "only having one backpack"—JFC, was Fussell ever right to shit on the class-anxious, pitiful, absurd Middle, give me "High Prole" over that crap any day).
Meanwhile, the private school kids whose parents are doctors and attorneys and VPs and related to major local politicians, get phones later and don't regard them as a status symbol. Phone ownership rate is maybe 50% by 7th grade, while I'd say it's that high (maybe higher?) by 4th or 5th in the public schools, and more like 95% by 7th. The private school also has much stricter rules about phone use during school (I gather all the area public schools have totally given up on stopping all but the most egregious use of them in class, as the parents who gave the kids the phones won't back them on enforcing anti-phone rules, and will in fact throw tantrums over any such enforcement).
According to official sources, Russian forces were deployed to guard important sites, which allowed Kazakh forces to quell the terrorists and bring the situation under control. So Russian forces did not engage in the actual fighting, but rather backed up the Kazakh forces.
What really happened is that the Kazakh military did not want to obey the President's orders when this chaos started. It's speculated that this was an act of treason by some high officials in an attempted coup. That's why Tokaev asked Russia to intervene. After Russia confirmed that they actually do support Tokaev by sending some symbolic forces, the Kazakh military started to obey orders and do what they had to. That's my opinion.
Things aren't actually as clear-cut as you say. While the protests started out peacefully, things quickly escalated into violence, especially in Almaty. Most notably, armed bandits captured the city's airport, burned most of the government buildings, tried to assault multiple law enforcement agency offices, tried to go live on multiple TV channels, robbed ATMs, etc. The government has promised a large-scale investigation into the events, but it is already clear those people wanted neither freedom nor democracy, and were quite probably influenced by criminal and illegal armed groups.
In a video from January 5th, people can be seen snatching up rifles offloaded by a truck. Some were likely convinced on the spot to grab one of the firearms being handed out, but who was responsible for acquiring them?
Escalation into violence does not discredit a revolution, even though it IS extremely regrettable and perhaps discredits the people involved. If escalation into violence were able to discredit a revolution, the regular everyday violence inflicted on the civilian population by bad cops, etc. would discredit the government, or at least policing.
At the point where rioters are being killed instead of merely arrested it's hard for me to look negatively at the rioters choosing to strike out with violence of their own. They aren't using the tools and backing of the state to oppress.
When there is a major protest that keeps the Police busy, criminals take the opportunity to loot stores while there is no law enforcement to stop them.
BTW I'm French and I'm used to this kind of rhetoric. Every time there's a demonstration (and we do a lot of them), the newspapers and "the other side" will use the minority of troublemakers to discredit the movement. It's so common I wish there was a name for this strategy. Perhaps we should call it a "misrepresenting minority fallacy".
I went to dozens of BLM protests in 2020, including the one in SoHo that got the most publicity. The number of looters from among the protestors rounds to zero.
It sure is easy to make political comments on countries that you know so little about, know no one from, and have never been to. FYI, I am from Almaty and was in the city in January. Can you even imagine how it felt here, waking up from gunfire in the night, having no internet or mobile connection, no means of getting in touch with people you care about, wondering if tomorrow you will still have running water, electricity, and heating? Exactly. This was no people's protest, but one by a small bunch of barbaric radicals, a shock and a fever dream for all of the citizens of the country.
Yeah that's definitely what it was. I'm no westerner, so I know what this looks like and what local fascists interpret it as. Tell your stories to someone else.
Also, quoting your link:
"According to the Alma-Ata police, on the morning of January 6, dozens of rioters were killed during the storming of administrative buildings in the city and its environs."
The top comment implies that the protestors who demanded lower gas (propane, not gasoline) prices and later political reforms were the ones responsible for violence, or involved in it. There's plenty of evidence that this isn't the case.
People who started the violence were clearly trained for this and very organized, they were shooting police on the streets, looting firearm stores, giving away firearms to random people to encourage chaos. Also the government ordered police and SWAT teams to leave the cities for 1-2 days which hints at some officials being involved in this.
Russian army wasn't involved in fighting on the streets and certainly wasn't "killing" anyone. Russians were invited mainly for a political and morale effect.
I don't know what the top commenter was reading to get these conclusions, but they are clearly not representative of any of the mainstream hypotheses explaining the events. I'm saying this as someone who had his family there when it happened and followed every single piece of news from there full time. And after everything was over, I went there and talked to some people who witnessed the events.
> The top comment implies that the protestors who demanded lower gas (propane, not gasoline) prices and later political reforms were the ones responsible for violence, or involved in it.
What section are you referring to? I must have missed something.
If anyone uses Firefox, it is possible to disable websites from knowing about any clipboard-related events. Go to about:config, search for dom.event.clipboardevents.enabled and set it to false. Done.
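For context on what that pref actually blocks: pages can register handlers for clipboard events like "copy" and rewrite what ends up on your clipboard (e.g. appending attribution text). Here's a minimal sketch of that kind of handler; the handler is written as a plain function taking the event, and the `selectedText` field is a stand-in for `window.getSelection()` so the logic is clear outside a browser. With `dom.event.clipboardevents.enabled` set to false, handlers like this simply never fire.

```javascript
// Sketch of the kind of "copy" handler the Firefox pref suppresses.
// In a real page you'd do: document.addEventListener("copy", onCopy);
// and read the selection via window.getSelection().toString().
function onCopy(event) {
  // Text the user selected (stand-in field for illustration).
  const selected = event.selectedText;

  // Rewrite the clipboard payload, e.g. to tack on attribution.
  event.clipboardData.setData(
    "text/plain",
    selected + "\n\nRead more at https://example.com"
  );

  // Cancel the browser's default copy so the modified text wins.
  event.preventDefault();
}
```

Sites use the same hook legitimately (rich-text editors, "copy code" buttons), which is the trade-off of flipping the pref globally.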
Sorry to go off topic, but is there documentation for the settings in about:config? And why don't they have a better UI?
It's pretty sad that something as security sensitive as browser settings requires trusting random strangers on the internet or digging through source code to understand what's going on.
If Mozilla made a chrome UI for those settings, they'd be accused of adding bloat to Firefox, creating complex options that confuse users, and making Firefox harder to use than Chrome-based browsers. I've noticed that open source really can't win in that department, so power user settings are put out of sight and out of mind, or are eliminated completely.
> I still miss Opera; they were the only browser developer who didn't take the "Users are stupid and easily confused" approach to browser development.
Old Opera was amazing, and incredibly fast. For a while there on pre-Android and early Android phones, Opera rendered the fastest out of all the mobile browsers.
> If Mozilla made a chrome UI for those settings, they'd be accused of adding bloat to Firefox, creating complex options that confuse users, and making Firefox harder to use than Chrome-based browsers.
Possibly, but does Mozilla actually care about accusations of bloat or poor usability?
I think that ultimately this project might do more environmental harm through the electricity it consumes (and the resulting carbon emissions) to function and transfer webpages than it can offset by increasing engagement with environmental content.