I switched all the machines at https://lanparty.house over to Linux a couple months ago. So far, we've experienced noticeably fewer problems on Linux compared to Windows. Stability and performance are better. I can't think of one game we tried that didn't work. And wow is it nice not to have all the ads and crapware in our faces anymore.
(I'm aware that Battlefield series and League of Legends won't work due to draconian anti-cheat -- but nobody in my group cares to play those I guess.)
I used multi-seat on Linux with systemd: I just threw some old graphics cards and sound cards into my gaming PC so that the children could play on separate monitors while I worked. Multi-seat is very cool. When upgrading to a new gaming PC, it was much cheaper than building 4 separate machines, because CPUs and motherboards with enough PCIe lanes are very expensive.
GPUs still run at decent performance with half the PCIe lanes available, so if you already have a gaming PC with many slots and don't need top performance, it could still be worth it to get two more cheap GPUs and use multi-seat, for those building a mini LAN gaming room at home.
One annoying thing is that Linux can't run many different GPU drivers at the same time, so you have to make sure the cards all work with the same driver.
Proprietary third-party multi-seat software also exists for Windows, but Linux has built-in support and it's free.
I am super curious about your setup. I played with MS years ago, but I lost the need. It is a super cool tech that I'd love to see its efficiencies embraced in some way.
Install an old GPU,
connect a monitor to the extra GPU,
connect a mouse and keyboard,
then use the loginctl command to list available devices/USB ports and attach them to a seat.
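The loginctl part of those steps looks roughly like this (a sketch; `seat1` and the sysfs paths below are hypothetical examples, take the real paths from the seat-status output):

```shell
# List existing seats and inspect which devices belong to the default seat
loginctl list-seats
loginctl seat-status seat0

# Attach the extra GPU and a USB port (for the second keyboard/mouse) to a
# new seat; the device paths here are made-up examples
sudo loginctl attach seat1 /sys/devices/pci0000:00/0000:00:02.0/drm/card1
sudo loginctl attach seat1 /sys/devices/pci0000:00/0000:00:14.0/usb1/1-2

# If something goes wrong, drop all custom device-to-seat assignments
sudo loginctl flush-devices
```

Assignments are persisted by systemd-logind, and seats other than seat0 are created implicitly the first time you attach a device to them.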
I suggest using Arch Linux, although loginctl should be available in all distributions using systemd now.
If you don't have enough USB ports you can use a USB hub; some monitors come with a built-in USB hub, and some with built-in sound. Or you can use a wireless headset.
My main issue was that driver support was dropped for my oldest GPU, so one day when I upgraded the OS it just stopped working. To be on the safe side, get another GPU like the one you already have.
Sorry, being an IT guy, I wondered about the logic. I understand the need to align. But if one fails, all fail, and children, like customers, are … not the patient kind? Or do you have two systems within each … then across the …. Sorry, cannot stop my mind spinning.
On a similar note, performance is sometimes better. As a direct comparison, the SteamOS version of Lenovo's Legion Go S handheld is significantly more performant than the Windows version. Like 20% better FPS and double the battery life. Literally the only difference between the two is the OS.
Though from what I've read, Microsoft could fix that relatively quickly, if they made some tweaks to Windows (and called it a special 'handheld gaming edition' or so).
For some reason, the Legion Go S's Windows build still comes with a lot of baggage, background services, etc.
If LTT is to be believed, this is in the works
Maybe SteamOS managed to ruffle enough feathers to start moving the inertial colossus that is Microsoft. Not that I have much trust in their willingness to let a good idea remain good in the long term.
It's called Xbox Full-Screen Experience and is marketed as Xbox PC.
It's available now, but nobody's been impressed yet. Gives you a gamepad-compatible launcher (although the gamepad PIN to login is buggy). Doesn't seem to actually save resources.
Microsoft is a big, expensive oil tanker. They have the resources to turn the ship around, but they need to feel incentivized to do so. I love using Linux and won't go back to Windows (it's been quite a while for me), but they could blow the performance problems out of the water if they really cared to.
It keeps surprising me how many people don't care about some of the most popular games today. I mean, I don't care about Battlefield or League of Legends either, but in earlier decades of PC gaming, almost everyone had some of the most popular games: Doom, Half-Life (1 + 2) and such.
The games market today seems more similar to music in its fragmentation.
Strange things are happening regarding genres served by AAA studios. Some of them come and go fairly quickly. There was a brief resurgence of strategy(!!) games because of XCOM.
The addressable consumer market is just a lot bigger and more diverse than it used to be. Go back to the late 90s and it's a market dominated by teenage boys. Go back and look at some of the 00s and early 10s E3 presentations from the big three and it's very cringe-inducing how focused they are on a teenage-boy demographic, how hard they try to appear edgy, and how blatantly sexist they are in their language. For example, at the E3 conference where MS announced Xbox Live (2004?) they explicitly said that girls don't play games (there were actually plenty of girls who played games at that time), but that they might want to use Xbox Live to design t-shirts to sell to boys on their online marketplace. This was also still the era of booth babes trying to pull men into booths with barely dressed women. Nearly every game ad was just a wall of explosions and violence, or just the latest NFL game.
Today you have fully grown adults in their 30s-50s with very different tastes, and a lot more women and girls playing.
On top of that, we have a lot of diversity in who creates games and in the kinds of games they can create while still being commercially successful. Lots of interesting narratively focused games, puzzle games, platformers, and more artsy games. But if you want your multiplayer shooter, Battlefield and CS2 are still there for you.
This apparently is something that came about after the Atari days, when games were social activities in bars and home-console advertising featured both boys and girls. When the NES came out, it was marketed in the US as a toy, so it had to mold itself to retail store layouts and pick a toy aisle, and they picked boys.
In the mobile era, and perhaps as early as The Sims beating Myst as the bestselling game, the industry started to develop a more balanced marketing approach.
Take Star Wars, say: until someone has a daughter and suddenly finds out there is nothing in it for her, and then, shall we say, now we have one very annoying girl Jedi forced onto the team.
The reason some of the most popular games are popular isn't because they are fun, it's because they've built an esports industry. Those popular games get spectators which in turn makes the games more popular.
As an aside..
I went down a mini-rabbit hole learning about the LAN Party House, read your website and about Sandstorm[0] and how that ended up with you at Cloudflare leading Workers. That’s a really cool and honestly inspirational path. Would love to learn more if you’ve written elsewhere…!
I was also impressed by his wife's Chez JJ work. I suspect that she has done much more impressive stuff, but that kind of thing is a dime a dozen in SV. The hacker-housing stuff speaks to her humanity, and I like humans.
I see not being able to install invasive kernel level anti-cheat as a positive. I uninstalled all Riot games before they rolled it out. I would’ve been pretty miffed if I had accidentally gotten their kernel modules simply because I wasn’t reading tech news before the auto update.
"And wow is it nice not to have all the ads and crapware in our faces anymore."
I don't understand this - and I'm not being a Windows defender here, I use Linux when I can (and promote its use).
But my Windows 11 installation has zero ads and zero "crapware". And it's a Dell!
Everything that I didn't want on the machine was removed when I purchased it (two years ago). I see no ads. If I did, this could be fixed easily by even non-technical users with O&O ShutUp10 or similar, or just edited with a registry change.
I've been using Windows since 3.1 and there were some ugly years but that is not the current state-of-the-state. I'm just calling it like I see it at this point.
The UI is full of Bing and Copilot tie-ins that I consider to be essentially ads. Recommended content in the start menu. The weather widget that shows you news headlines. The lock-screen-of-the-day with the text description that if you accidentally click on it, you open some Bing page. The Edge default home page. Everything is trying to push me towards engaging with Microsoft's online services, which I have never used and have no desire to use. These are ads.
It's probably the case that I could turn all of these off by hunting down the right config options, and if I used Windows as my primary desktop I'm sure I would. But it's just on my game machines which I don't want to spend a lot of time maintaining, and new crap keeps popping up in updates. It's exhausting.
A Debian Linux desktop, in comparison, is not trying to push you to anything. It's a breath of fresh air (not a term I use often but really fits here).
Note: I never made it to Windows 11, only Windows 10. But my understanding is that these things are getting worse, not better. And while not exactly the same thing, there has been a lot of talk lately about how the file explorer has become so bloated and slow that they have to preload it into memory at startup so that it can respond quickly when you click it... omg, I do not want that.
I'm surprised to hear that you were talking about Windows 10. Windows 11 is MUCH worse than 10 with the ads and seems to be the first one where people are complaining en masse. It also comes with a start menu that's both dumbed down and has performance issues. Yes, the start menu. It's slow.
I just know it from a new laptop where I'm keeping the preinstalled Windows for occasions that require it (very rare these days).
The real problem is with trust and encroachment. I think a lot of people that spend a fair amount of time on their computers start to feel like their OS is their home and they go on excursions through apps. Previously, ads were limited to apps you had to go to yourself. Ads showing up as wallpaper in your house would be unsettling, and it reveals that your homeownership was illusory from the start: you never really controlled anything.
Yes, you can use cleanup software to fix the symptoms, but that's not the real issue here.
Edit: further research revealed my original first point was a false assumption.
We must be using different Windows 11s, then. Last time I booted up Windows, instead of shoving Cortana everywhere it was shoving Copilot. The telemetry it sends would make spyware jealous.
The "current" state does not matter. What matters is that MS can shittify your experience at any time. Your machine can stop working if you don't agree to MS "updates". On Linux you have the assurance that the state of your machine can be preserved and you know exactly what's being installed on it.
FTFY: Windows is spyware. The fact that you paid for spyware or it came on your computer or it has useful properties (like Bonzi Buddy) doesn't make it not spyware.
I did a clean Windows 11 install a few months ago. I expected to be bombarded with ads and all of the other things I kept reading about in comments here, but it’s been fine.
I do find it interesting that so many of the comments about how bad Windows 11 is come from commenters who also admit they aren't using Windows 11. Not everything in Windows 11 is my favorite design choice, but the anti-Windows-11 comments have taken on a life of their own that isn't always based in reality.
Yes, on Linux I was able to move the copy-on-write overlays to use local disks, which is one reason it performs much better (admittedly not a reason that would affect most people).
I am just using dm-snapshot for this -- block device level, no fancy filesystems.
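A dm-snapshot overlay of that sort can be assembled roughly like this (a sketch; the device paths are hypothetical, and in a netboot setup this would run from the initramfs before mounting root):

```shell
# Shared read-only base: the iSCSI volume every machine sees (hypothetical path)
BASE=/dev/sdb
# Local copy-on-write store: a partition on this machine's own disk (hypothetical path)
COW=/dev/nvme0n1p3

# Size of the base device in 512-byte sectors
SECTORS=$(blockdev --getsz "$BASE")

# Table format: start length snapshot <origin> <cow-dev> <P|N> <chunk-sectors>
# "P 8" = persistent snapshot with 4 KiB chunks. Writes land on the local
# COW device; unmodified blocks are read from the shared base.
sudo dmsetup create gamedisk --table "0 $SECTORS snapshot $BASE $COW P 8"

# Each machine then mounts /dev/mapper/gamedisk as its own private disk
```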
Yep, I've been gaming exclusively on Ubuntu (mainly because I want my desktop to match my servers) for several years. If you aren't playing the latest AAA FPS, then everything pretty much works.
Up until now I didn't care how my software was installed, but snaps REALLY don't play nice, so it's time to retire them. Canonical has lost this battle, and the sooner they accept it and move on, the sooner they can recover their reputation and put this madness behind them.
Not that much. To be honest, I have a few installed (Heroic Games Launcher, for one), but the main one I wanted to avoid was Firefox, which is easily doable. It is annoying that we have yet another way of packaging apps; it would have been better if they had just supported Flatpak.
Do you ever find it "updated" to the snap version? I have Ubuntu on my work laptop and every so often after an update Firefox will suddenly be the snap version and I'll have to reinstall it.
As someone else says, for Firefox (and Thunderbird) I just uninstalled the package-manager version entirely and dropped Mozilla's regular distro-agnostic binary tarballs into my home folder. Using the built-in update system also avoids the problem with .deb versions where updating the package could make the browser yell at you that it needs to be restarted when you try to open a tab.
I recommend downloading the executable-in-a-tarball form of Firefox and running that. I personally do that with Nightly, and I find it works quite well.
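For anyone who wants to try it, the tarball route is roughly this (a sketch; the URL is Mozilla's standard latest-release redirect, and `tar -xJ` assumes the current .tar.xz packaging):

```shell
# Fetch and unpack the distro-agnostic Firefox build into ~/apps
mkdir -p ~/apps
curl -L 'https://download.mozilla.org/?product=firefox-latest-ssl&os=linux64&lang=en-US' \
  | tar -xJ -C ~/apps

# Run it; from here Firefox keeps itself current via Mozilla's own updater
~/apps/firefox/firefox &
```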
I no longer remember all the exact steps I did but I only googled them in the first place so presumably they are there to be googled still. But it's possible to fully remove snapd and all snap support and then taboo it so that it never comes back. Or at least, it's been a few years and it hasn't come back. FF has remained a real .deb from the mozillateam ppa. It was a few different steps though not just uninstalling a few packages but also editing some apt config files I think. Sorry that sounds useless but like I say I just googled it up at the time, did 15-20 minutes of reading and poking, and never had to touch it again since then. It's been several version bumps.
Edit: I installed a dummy package that displaces the nagware about the Pro version too, so I never get those messages during apt update any more.
Taking a quick (and definitely incomplete) look, I see at least:
and removed ubuntu-pro-esm-apps and ubuntu-pro-esm-infra from that same dir
but also there is a mozillateam ppa in sources.list.d, and I don't see any installed package name that looks like it might be that dummy ubuntu-pro-esm thing, so maybe it got removed during a version upgrade and I never noticed because ubuntu stopped that nonsense and it isn't needed any more? Or there is some other config somewhere I'm forgetting that is keeping that hole plugged.
Anyway, it WAS a little bit of fiddling around one day, but at least it was only a one and done thing so far.
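For anyone repeating this, the commonly circulated recipe looks roughly like the following (a sketch assembled from the usual guides, not necessarily this poster's exact steps):

```shell
# Remove installed snaps, then snapd itself, and keep it from returning
sudo snap remove firefox
sudo apt purge snapd
sudo apt-mark hold snapd

# Pin snapd to a negative priority so nothing can pull it back in
sudo tee /etc/apt/preferences.d/no-snap.pref <<'EOF'
Package: snapd
Pin: release a=*
Pin-Priority: -10
EOF

# Install the real .deb Firefox from the Mozilla team PPA, pinned above
# Ubuntu's transition package that would swap it back to the snap
sudo add-apt-repository ppa:mozillateam/ppa
sudo tee /etc/apt/preferences.d/mozillateam.pref <<'EOF'
Package: firefox*
Pin: release o=LP-PPA-mozillateam
Pin-Priority: 1001
EOF
sudo apt update && sudo apt install firefox
```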
I kind of expected to be off of Ubuntu by now, because once someone starts doing anything like that, it doesn't matter if you can work around it; the real problem is that they want to do things like that at all in the first place. Well, they still want what they want, and that problem is never going away. They will just keep trying some other thing, and then some other thing. So rather than fight them forever, it's better to find someone you don't have to fight. I mean, that's why we're on Linux at all in the first place, right? But so far it's been a few version bumps since then and still more or less fine.
I also game on Ubuntu and snaps have never been in my way. I actually like them and wish more non-game software was distributed this way, but Canonical has a brown thumb when it comes to growing their weird little side projects.
Ubuntu releases supported (aka "really supposed to work") versions much more frequently than Debian, or I would have switched already. As it is, I just make the appropriate changes to purge Snap and run Firefox from a Mozilla apt repo and Thunderbird from Flatpak via Flathub.
You are talking about Debian Stable, which is released approximately once every 2 years.
People who want more (or the most) recent software on Debian should go for Debian Testing. Or Debian Sid, which gets upstream updates almost instantly but requires more Linux knowledge in case something breaks.
That's an amazing website; I just spent 30 minutes reading all of its content. Thank you for sharing! I haven't been to a LAN party in probably 20 years and it made me jealous!
That is literally the coolest house humanity has built this side of the Industrial Revolution, if not ever. Congrats on envisioning and executing such an amazing project!
league of legends is basically the only thing holding me back from switching to Linux for myself :/ really want to just swap over to linux fully. love your website + house!
You might be saying it in jest, but I think there may be something to this line of thinking. Whenever I read about anything on HN and bazzite comes up, it feels like the beginning of a new holy war is approaching. I am starting to wonder if the concern from the 'older' linux crowd is that the gamers will introduce changes to linux that will corrupt it more than systemd ever could ( like kernel level drm in games ).
I think the GP means because the game is a time sink, but you raise an interesting point. I wanted to say "that'll never happen", but someone could fork Linux, add DRM, and then that becomes massively popular and we're done.
Realistically, that's only going to happen if Valve, specifically, decides to do it. There isn't really another player in the linux-gaming business with enough skin in the game to make it worthwhile.
I could imagine Epic Games trying to keep the Epic Games Store relevant by making a SteamOS competitor, with the idea of attracting developers with strong kernel-level DRM and anti-cheat
Great excuse to start learning DOTA. You won't regret it (until a few thousand hours of gameplay later you realise how much of your life you wasted on it).
I mean, of course he'll regret it, as he probably regrets learning LoL. But yes, DotA is the better game of the two, according to my objective opinion.
> (I'm aware that Battlefield series and League of Legends won't work due to draconian anti-cheat -- but nobody in my group cares to play those I guess.)
If every PC gamer would finally just hop onto Linux for a year (no Windows, don't buy games that don't work, just a hard boycott), we could actually drive a drastic change for PC gaming. As they say, "vote with your wallet." Well, I've been using Linux as my main OS, with all my Steam games, for about 3 years now. I only use Windows on work computers, and I'm increasingly only going to apply to places that either let me use Linux or will give me a Mac. I'm voting with my wallet full force. Windows has been turning into a steaming pile of trash.
I've repeatedly said, if Microsoft would release "Windows for Gamers" without all the bloated BS on it, I would consider using it.
I spent 2 grand or more on a prebuilt gaming system and it came with Windows Home, which didn't let me add new users, so I was stuck with a Microsoft-account user. Joke's on Microsoft: that, and their antivirus sending files to their servers without any audit log, was the last straw for me. I'm not supporting Microsoft, and I say this as someone who is a "fanboy" for Microsoft (.NET is the only good thing they have going, but they can't even stick to one GUI stack; they have to reinvent it 1000 more times).
The only game I tried on Steam that didn't work was Slave Zero, a game from the 90s. Unfortunately, I still have to use Windows for VR games. It is too troublesome on Linux (at least for the Meta Quest 2).
I'm sorry if you hear this a lot, but your house is so cool, and I must admit I am more than a little jealous.
I've also said it here before but I will just give up on PC gaming wholesale before I go back to Windows. It's crazy how much gaming on Linux has improved in just the past couple years.
Battlefield 4's anticheat runs fine on Linux, if you end up needing one. It definitely slakes my BF fix, in the same way Deadlock is filling the LoL-shaped hole in my contemptible subsistence.
Does this mean the GitHub repo linked with the scripts now includes up-to-date Linux versions? Last time I looked it was all Windows-specific, but I'd love to set up something similar with stations for (much lower power) versions.
Sorry, I haven't gotten around to updating it yet, although it basically works to follow the same instructions except replace Windows with Linux and skip all the workarounds for Windows-specific bugs.
What stability and performance? Yeah, I don't see BSODs, but Bluetooth doesn't work and it doesn't wake up after sleep 85 percent of the time. Kind of crap, really.
Bluetooth is admittedly less snappy than on Mac or Windows, although it absolutely does work. As for wake after sleep, I've not had a single issue in five years of daily driving Linux. No idea what you're talking about.
A lot of hardware does have problems with properly resuming from sleep, but that is pretty much universal, not OS-dependent. People report just as many problems on Windows.
> (I'm aware that Battlefield series and League of Legends won't work due to draconian anti-cheat -- but nobody in my group cares to play those I guess.)
As I've said elsewhere, Battlefield 6 has got a far better user experience on Linux than Windows and I would recommend it to anyone.
I find that Fedora hits the right balance of stability while being up to date for anything desktop and specifically gaming focused, Debian has different priorities and packages can be a bit too old. And it’s less of a faff than Arch.
You are comparing Fedora with Debian Stable. Everyone who wants Debian's stability (and ecosystem) with newer upstream software should go for Debian Testing (and don't be fooled by the name "testing"!).
Debian Stable is for servers, Debian Testing is for desktops.
Just try Debian Testing
(and I have used Slackware, Red Hat, Ubuntu, Debian)
This is from like 20 years ago, but I remember Debian Testing as the one where updates broke the system most frequently, or maybe the longest without fixes: Stable was stable, Sid / unstable was what most Debian developers were using... and Testing was the weird thing that was neither a release nor tested and fixed "live" by developers.
But who actually tests Testing? If it's not the Debian developers themselves, fixes could take a while. I seem to recall Testing breaking because of package version combinations that never existed, so were never tested, in Sid.
Archlinux can be a pretty good choice for gaming. Not necessarily because of anything Archlinux does: most distros can do anything, if you configure them.
No, just because the Steamdeck's distro is built on Arch, and so you can piggyback on what they are doing.
Arch is really, in a sense, the absence of a distro, while keeping a package manager with up-to-date packages. No bloat bundled; just install exactly what you want.
I don't see why 'piggyback on what [Steam deck is] doing' wouldn't work just as well on any distro, you'd just have a load of extra stuff you're not using too.
That's nothing against Arch, it's what I use, I'm just saying really the only magic is in doing less.
Eh, aside from GPU drivers -- which I download directly from nvidia anyway -- I don't feel like gaming is much affected by the distro packages being a couple years old. We pretty much just run Steam, Discord, and Chrome on these things, and those all have their own update schedule independent of the distro.
I'm hopeful that's been fixed by now, but when I switched to Linux a year ago I started with Debian, and had a lot of issues with input latency for games on Wayland. Switched to Fedora which was two KDE versions ahead and never had that issue again.
Because you used Debian stable (which is mostly for servers).
Try Debian Testing. And don't get fooled by its name "testing" - it is because Debian community reserved "stable" for Debian stable. Debian testing is also stable :-)
I download the nvidia drivers directly from nvidia. Their installer script is actually pretty decent and then I don't have to worry about whether the distro packages are up-to-date.
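That flow, for reference (a sketch; the .run file name here is an example, use whatever version you actually download from NVIDIA):

```shell
# Drop to a text console so the installer can unload the in-use driver
sudo systemctl isolate multi-user.target

# Run NVIDIA's installer; --dkms rebuilds the kernel module automatically
# on future kernel updates
sudo sh ./NVIDIA-Linux-x86_64-550.142.run --dkms

# Return to the graphical session
sudo systemctl isolate graphical.target
```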
From what I read, SteamOS isn't really intended to run on any device other than official Steam devices. In particular I read that it doesn't support nvidia GPUs at all, since all official Steam devices are AMD.
Also, I have a pretty unusual setup: my machines netboot from an shared iSCSI volume, setting up a local copy-on-write overlay on each machine.
SteamOS is based on Arch, so I'm sure it would be possible to make it do anything Arch can do. But I don't know Arch -- I know Debian. So I was a lot more comfortable installing Debian and tweaking it the way I needed, then installing Steam on top.
Your house sounds like a great place to hold a fighting game local tournament (or something like the old Smash Summit series for Smash Bros Melee and Ultimate before Beyond The Summit shut down)
I think it's interesting that the mainstream PC gaming press is now talking about Linux. We have the benchmark YouTube channels doing some benchmarks of it as well, and plenty of reports of "it just works", which is pretty promising, at least for the games that aren't intentionally excluded by DRM. For me it's still controller and equipment incompatibility, due to my VR headset and sim wheel/pedals setup; I use Linux everywhere else, in my router and home servers. I just hope that Nvidia notices there does appear to be a swing happening and improves their driver situation.
I'd say there are two remaining roadblocks. First and biggest is kernel level anti-cheat frameworks as you point out. But there's also no open source HDMI 2.1 implementation allowed by the HDMI cartel so people like me with an AMD card max out at 4K60 even for open source games like Visual Pinball (unless you count an adapter with hacked firmware between the card and the display). NVidia and Intel get away with it because they implement the functionality in their closed source blobs.
This is kind of a niche problem. It only affects people with AMD GPUs running games at over 4k60 with HDMI. Get an NVidia or stay at 60 FPS or stay at 1080p or use DisplayPort and you will be fine.
It is not really a roadblock, more like a bump, and it is not the only bump by far. Some games just don't run on Linux, or run quite terribly, and don't have a big enough community for people to care. Sometimes one of your pieces of hardware, maybe an exotic controller, doesn't like Linux. Sometimes it is not the fault of the game at all: you want to do something else with that PC that isn't supported on Linux, and you don't want to dual boot. Overall, you will have fewer problems gaming on Windows, especially if you don't really enjoy a trip to Stack Overflow and the command line. But except maybe for anti-cheat, there is no "big" reason, just a lot of small ones.
This is the first I learned of this since personally I have no need of anything over 4k@60 (that already borders on absurd in my mind). I'm curious if this is something that's likely to get reverse engineered by the community at large?
Outrageous that a ubiquitous connection protocol is allowed to be encumbered in this way.
For the particular use case I mentioned in my earlier post (Visual Pinball), 4k@120 is actually a pretty big deal. We often play on screens 42" and up so the 4k detail is put to good use and makes things like the instruction cards in the corners legible. But the bigger difference is the smoothness in gameplay that 120Hz gets you. The ball travels really fast so 120 Hz helps gameplay a lot while reducing lag at the same time. And because a large chunk of the playfield is static at any one time, you don't need something like a 5090 to hit 120 Hz at that resolution like you might with a triple-A shooter.
It's a blocker if you want to use a TV, there are almost 0 TVs with DP. This HDMI licensing crap is also the reason a Steam Deck can't output HDMI > 4K@60 unless you install Windows on it.
Yup, this works, but there's as of yet no UHBR13.5-or-better input, so you're not getting a full HDMI 2.1 equivalent. But if you don't mind DSC instead of uncompressed 24 bits per pixel, you can have an otherwise flawless 4K 120 Hz experience.
DP is something like a free superset of HDMI, so you can use a fully passive DP-HDMI cable. Obviously the feature set will be limited, but it will work.
DP however can't transfer audio, which doesn't matter for a desktop but matters a lot for a TV.
> DP is something like a free superset of HDMI, so you can use a fully passive DP-HDMI cable.
No, it's not, the protocol is completely different (DP is packet-based while HDMI traditionally was not, though AFAIK HDMI 2.1 copied DP's approach for its higher speed modes). When you use a passive DP-HDMI cable (which AFAIK is not fully passive, it has level shifters since the voltages are different), it works only because the graphics card detects it and switches to using the HDMI protocol on that port; if it's not a dual-mode port (aka "DP++" port) it won't work and you'll need an active DP-HDMI adapter.
> DP however can't transfer audio, which doesn't matter for a desktop but matters a lot for a TV.
On the desktop I'm using to type this message, I use the speakers built into the DP-connected monitor (a Dell E2222HS). So yes, DP can and does transfer audio just fine. If it couldn't, then active DP to HDMI adapters wouldn't be able to transfer audio too.
The only thing DP doesn't have AFAIK is ARC, which might matter for a few more exotic TV use cases, and HEC, which AFAIK nobody uses.
Up until a year or two ago, the majority of monitors (and graphic cards) used DisplayPort 1.4 and HDMI 2.1. With HDMI 2.1 (42 Gbps) having more bandwidth than the DisplayPort (26 Gbps).
This is my case with my relatively new/high-end RTX 4080 and OLED monitor. So until I upgrade both, I use HDMI to be able to drive a 1440p 240hz 10-bit HDR signal @ 30 Gbps.
I had said I wouldn’t upgrade from my RTX 3080 until I could run “true 4K”.
I finally got the 240hz 4K uncompressed but it required buying a $1300 Asus OLED monitor and the RTX 5090. It looks amazing though, even with frame gen. Monster Hunter had some particularly breathtaking HDR scenes. I think it uses DisplayPort 2.1? Even finding the cable is difficult, Microcenter didn’t have them in April and the only one that worked was the one that came with the monitor.
TVs don't support displayport, so it makes Linux PCs like the Steam Machine inferior console replacements if you want high refresh rates. A lot of TVs now support 4K/120hz with VRR, the PS5 and Xbox Series X also support those modes.
(Some games support 120, but it's also used to present a 40hz image in a 120hz container to improve input latency for games that can't hit 60 at high graphics quality.)
It took a long time to move from the old component input over to HDMI. The main thing that drove it was the SD to HD change. You needed HDMI to do 1080p (I believe, IDK that component ever supported that high of a resolution).
Moving from HDMI to display port is going to be the same issue. People already have all their favorite HDMI devices plugged in and setup for their TVs.
You need a feature that people want which HDMI isn't or can't provide in order to incentivize a switch.
For example, perhaps display port could offer something like power delivery. That could allow things like media sticks to be solely powered by the TV eliminating some cable management.
The legacy issue is even worse than that. I have a very new Onkyo RZ30 receiver and it is all HDMI with no DisplayPort to be seen. So it is the whole ecosystem including the TV that would need to switch to DP support.
> For example, perhaps display port could offer something like power delivery.
It already does. A guaranteed minimum of 1.65W at 3.3V is to be provided. Until very recently, HDMI only provided a guaranteed minimum of something like 0.25W at 5V.
It's not nothing, but it's also very little to play with.
5W is what I'd think is about the minimum for doing something useful; 25W would actually be usable by a large swath of devices. The Raspberry Pi 4, for example, has a 10W requirement. Amazon's Fire Stick has a ~5W requirement.
> It's not nothing, but it's also very little to play with.
Sure. But it's ~6.6x more than what HDMI has historically guaranteed. It's pretty obvious to anyone with two neurons to spark together that the problem here isn't "amount of power you can suck out of the display port". If it were, DP would have swept away HDMI ages ago.
> It's pretty obvious to anyone with two neurons to spark together that the problem here isn't "amount of power you can suck out of the display port".
Nobody said it was.
I gave that out as an example of a feature that DP might adopt in order to sway TV manufacturers and media device manufacturers to adopt it.
But not for nothing, 0.25W and 1.65W are virtually the same thing in terms of application. Just because it's "6.6x more" doesn't mean that it's usable. 0.25W is 25x more than 0.01W; that doesn't make it practically usable for anything related to media.
I think it's not really an issue for the 95-99% of users who use devices with non-open-source drivers, so there is no incentive for manufacturers to add it?
For the same sorts of reasons that made it so for decades nearly every prebuilt PC shipped with an Intel CPU and Windows preinstalled: dirty backroom dealings. But in this case, the consortium that controls HDMI are the ones doing the dealings, rather than Intel and Microsoft.
"But Displayport doesn't implement the TV-control protocols that I use!", you say. That's totally correct, but DisplayPort has the out-of-band control channel needed to implement that stuff. If there had been any real chance of getting DisplayPort on mainstream TVs, then you'd see those protocols in the DisplayPort standard, too. As it stands now, why bother supporting something that will never, ever get used?
Also, DP -> HDMI active adapters exist. HDR reportedly works reliably, and VRR often works, but it depends on the specifics of the display.
Correction: you can get 4K@120Hz with HDMI 2.0, but you won't get full 4:4:4 chroma; 4:2:0 will be forced instead.
In my case I have an HTPC running Linux and a Radeon 6600 connected via HDMI to a 4K@120Hz-capable TV, and honestly, at that sitting distance/TV size and using 2x DPI scaling you just can't tell any chroma subsampling is happening. It is of course a ginormous problem in a desktop setting, and even worse if you try 1x DPI scaling.
What you will lose, however, is the newer forms of VRR, and it may be unstable with lots of dropouts.
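The back-of-the-envelope bandwidth math behind why HDMI 2.0 forces 4:2:0 at 4K/120 (a sketch: it ignores blanking intervals, which push the real numbers even higher):

```python
# Active pixel rate for 4K at 120 Hz.
px_per_s = 3840 * 2160 * 120  # ~995 Mpx/s

# Bits per pixel at 8-bit depth for each chroma format.
bpp = {"4:4:4": 24, "4:2:0": 12}

# HDMI 2.0: 18 Gbit/s TMDS rate; 8b/10b encoding leaves ~14.4 Gbit/s for video.
hdmi20_data_gbps = 18 * 8 / 10

for fmt, bits in bpp.items():
    gbps = px_per_s * bits / 1e9
    print(f"{fmt}: {gbps:.1f} Gbit/s, fits HDMI 2.0: {gbps <= hdmi20_data_gbps}")
```

4:4:4 needs roughly 24 Gbit/s of active video alone, well past HDMI 2.0's effective rate, while halving the chroma data to 4:2:0 brings it under the limit.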
I'm a bit puzzled: isn't VRR more for low-powered hardware to consume less battery (handhelds like the Steam Deck)? How does it fit hardware that is constantly connected to power?
Pirate everything. Stop feeding beasts and they have no power.
The idea that you need intrusive surveillance in order to make games fair is absurd. If you need fair games, you need referees and moderation, which means you need to train and pay competent people and establish open and transparent rules and tools. You can also give your refs latitude, so if someone is obviously cheating, they have the power to do something about it. You should also require and implement publicly transparent and auditable actions with recourse for players to prevent abuses of power.
That's expensive. It's much easier to create a terms of service with vague guidelines, implement a totally intrusive, absurdly invasive rootkit that does some bare minimum scanning for known cheats and patterns, which establishes an arms race and provides bad actors a nice little point of ingress when the responsible company inevitably fails to protect their users competently.
Just like media platforms, if you cannot moderate at the scale at which you're operating, then it shouldn't be legal to operate at that scale.
People should stop giving money to companies that don't deserve it. No game is worth sacrificing your integrity for. "Just trust us, we know what we're doing" is a huge red flag, and it should be criminal to do what they do.
AI refs are going to be a very real possibility in the near future that can be just as fair and competent as humans, so the "necessity" for rootkits won't be a valid argument for much longer. It'll still be expensive, but multiplayer gaming fairness shouldn't ever serve as a reason for nuking privacy.
Competent cheat makers don't have much difficulty in defeating in-kernel anticheats on Windows. With the amount of insight and control available on Linux anticheat makers stand little chance.
The best Valve could do is offer a special locked down kernel with perhaps some anticheat capabilities and lock down the hardware with attestation. If they offer the sources and do verified builds it might even be accepted by some.
Doubt it would be popular or even successful on non-Valve machines. But I'm not an online gamer and couldn't care less about anticheats.
Anticheat is one of those things where I probably sound really old, but man, it's just a game. If you hate cheating, don't play on pub servers with randoms; find a group of people you can play with, like how real life works.
For competitive gaming, I think attested hardware & software actually is the right way to go. Don’t force kernel-level malware on everyone.
Yeah, that's hilariously impractical if you like these games.
> pub servers
Most of these popular competitive games probably don't even have community servers of any kind. Maybe some games like RTSes have custom matches, but they're not used much for the standard game mode, at least not for public lobbies.
Sorry but you're just old IMO :) PUBG or Arc Raiders have over 100 players in a game. Even Valorant or League have 10 players in a match. It's definitely not easy to find 9 friends to play the same game at the same time as you. And playing any of these games with a cheater can completely wreck the match. If the cheaters go unchecked, over time they start to dominate games where like 30% might be cheaters who can see through walls and insta-headshot you, and the entire multiplayer mode of the game is ruined. Even worse, some cheaters are sneaky: they might have a wallhack or a map showing all players but use it cautiously, and it can be quite hard to prove they're cheating, but they build up a huge advantage nonetheless.

Most of us are happy to have effective anti-cheat, and it's not forced upon us. I understand the tradeoff to having mostly cheater-free games is having to trust the game maker more, and am fine with that. Riot for example is quite transparent about what their anti-cheat does and how it works, and I don't consider it "malware" any more than I consider a driver for my graphics card to be "malware", even if they do operate in kernel mode.
This was never an issue 20 years ago when we had 64 player servers, but the 64 player servers also generally had a few people online with referee access to kick/ban people at any given time. That seemed like it worked well to me.
Exactly 20 years ago I was both a competitive CS player and I also liked reverse engineering so I was somewhat interested in the cheating community and even programmed a custom injector and cheat for CS (it was surprisingly easy if you knew a bit about Windows APIs).
Cheats were a problem. Not even a nascent problem, but already established. Bad enough that VAC was released in 2002, Punkbuster in 2000...
In competitive gaming you cannot just find a stable friends group to play against: you need competition, and a diverse one. We somewhat palliated this by physically playing in LAN, but that still limits to a radius around you and it's cumbersome when you can just find an opponent online (we had manual matchmaking on IRC before modern matchmaking existed).
The problem is that cheating can be very subtle if done correctly. The difference between "that guy is better than me" and "that guy can see through walls" is pretty much undetectable through non-technical means if the cheater is not an idiot. This poisons the competitive scene.
Competitive gaming is huge. It was big back in the day but now it's a monster. Just check the largest categories on Twitch: LoL, TFT, WoW, CS, Valorant...
Competitive gaming cannot possibly be huge. Like literally it is impossible for 99% of gamers to be competitive in any meaningful sense (if you play a game with 1M players and are in the top 1%, congrats, there are 10,000 people who are better than you. You are still unremarkable). It never was huge; it was just a niche you were in. There are massively more people that are just playing the game to blow off steam.
People play competitive sports. They accept no, or minimal amounts of, cheating. Your personal feelings about it don't matter. The kid who plays basketball with 12-year-olds on Saturday mornings has the right not to have to deal with cheaters, and it doesn't matter if he's in the top .0001% or a shitty player who cannot distinguish his hands from his ears.
Have a quick look at the ladder on Counter-Strike, or Faceit, or ranked play on League of Legends/Valorant/whatever: it's not a niche. These games requiring kernel AC no matter the type of play is another subject, but people play to compare themselves to others, massively.
It invalidates the idea that we need to take it seriously and have locked down computers with remote attestation to play games. People who take games seriously are a very small niche. You are in a bubble if you think otherwise.
This is like saying we need to institute drug testing at all parks to play football. Cheating in sports is a problem that very few players are concerned with. Caring about who wins isn't even common. Most are just kicking a ball around with their mates.
Those players can have their own solutions. They should recognize they are a tiny bubble and not insist the other 999,000 players need the same.
And they don't even need it all the time either. I did once participate in a CS:S tournament, so I guess I was "competitive", but half the time I was on gun game or ice world or surf maps. My friends and I played normal Warcraft 3 against each other, but otherwise I pretty much only played custom maps, which were apparently popular enough to spawn an entire new genre. I never ran into problems queueing for something like preschool wars or wintermaul. When we did queue for ladder sometimes it was like 10 minutes to find a match.
To your earlier point about e.g. Valorant: my mom invited me to play on weekends with her and my sister. I know my mom is 0% competitive. This was not some serious thing. I couldn't play with them because I'm not going to buy another computer just to run it. That's the absurdity here.
I have been watching this thread and you are triple downing on a point that you have no real experience with. Competitive e-sports is a real thing. There are e-sports arenas. (How are people even arguing this on HN?)
The International (a DOTA 2 competition) has like $40m in prizes. EWC in 2025 was $70m. 99.6 million people watched the League of Legends World Championship final. And we're not even talking about the millions of dollars of sponsorship involved.
That's great your mom isn't competitive in Valorant, but massively irrelevant. It's like me saying "I play flag football with friends, there is no competitive football."
Anti-cheat is important because this is how the best players are discovered, this is how they're recruited. If a game is 50%+ cheaters, the game will die... DOTA2 would cease to exist today as a big deal. Same with Valorant.
Aside from competitive gaming, GTA V online makes $1 BILLION in ARR. That would be $0 if the game was flooded with cheaters.
Now this isn't me defending kernel level anti-cheat, I think there are better ways to do it and some games do a great job here.
But man, calling GTA V online and competitive e-sports a "tiny bubble" is like calling the NFL a "tiny bubble".
That's really the paradigm shift - communities were self-organizing and self-moderating before. Now game publishers want to control all aspects of the online experience so they can sell you content and skins, so that means matchmaking and it means they have to shoulder the moderation burden.
> communities were self-organizing and self-moderating before
This led to legit players that were just good being banned by salty mods, or cheaters that were subtle enough to only gain a slight edge not being banned.
And now, you have false anticheat bans. If you get banned from a server you can just join another server. (or even start your own!) If you get falsely banned from the game by anti cheat your money was in some sense stolen.
It was still enough of an issue that some developers made BattlEye for anti-cheat 20 years ago for Battlefield games. It's still one of the more popular anticheats today.
Other games did similarly. Quake 3 Arena added Punkbuster in a patch. Competitive 3rd party Starcraft 1 server ICCUP had an "anti-hack client" as a requirement.
> Most of us are happy to have effective anti-cheat
I could almost get on board with the idea of invasive kernel anti-cheat software if it actually was effective, but these games still have cheaters. So you get the worst of both worlds--you have to accept the security and portability problems as a condition for playing the game AND there are still cheaters!
It's kind of like when people say Google is getting worse and has too many spam results, even while I suspect they're actually improving, but the volume and quality of spam has gone up 100x so it looks like they're doing worse. The question is what is the base rate of attempts to cheat, and how many of those attempts does kernel anti-cheat prevent vs. conventional mechanisms.

I don't have the answer, but my intuition is cheating is more accessible and viral in many ways now, with professional-level marketplaces and actors working to build and sell cheats. I also don't think the industry would dedicate so much effort to invasive anti-cheat, which is difficult, risky and gets them negative PR, unless they felt it truly necessary. Counter-Strike a few years ago had huge, huge numbers of cheaters, and the super popular games like that attract a lot of attention. But ultimately this is a cat-and-mouse game like search & SEO, so you're right: there are still cheaters, and getting that number to 0 is probably impossible.
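The base-rate argument can be made concrete with a toy model (all the numbers here are made up purely for illustration):

```python
# Toy model: visible cheaters = cheat attempts that slip past detection.
attempts_then = 1_000     # hypothetical "old" era of cheat attempts
attempts_now = 100_000    # 100x more attempts via cheat marketplaces

# Hypothetical detection rates: 80% then, 99% now.
visible_then = attempts_then * 20 // 100  # 20% slip through
visible_now = attempts_now * 1 // 100     # 1% slip through

# Detection improved from 80% to 99%, yet players meet 5x more cheaters.
print(visible_then, visible_now)  # 200 1000
```

Under these made-up numbers, anti-cheat got dramatically better while the player experience got worse, which is exactly why "there are still cheaters" doesn't by itself tell you whether the anti-cheat is effective.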
Valorant really is the only FPS where I was never once suspicious that someone may be hacking. I mean, I don’t play it and the anti-cheat is part of the reason, but it does absolutely work.
Worst of both worlds? In theory this is accurate; in practice, it isn't. The crux of why people are fine with it, as far as I can identify, is "but these games still have cheaters": people aren't looking for 0 cheaters so much as < X% cheaters, keeping the odds low that any given match they are in has a cheater.
> I don't consider it "malware" anymore than I consider a driver for my graphics card to be "malware" even if they do operate in kernel mode.
The bloggers/journalists calling it malware are doing the conversation a disservice. The problem is only really the risk of bugs or problems with kernel-level anti-cheat, which _could_ be exploited in the worst case, and in the best case, cause outages.
The classic example recently is the CrowdStrike-triggered outage of computers worldwide due to kernel-level antivirus/malware scanning. Anti-cheat could potentially have the exact same outcome (but perhaps smaller in scale, as only gamers would have it).
If Windows provided a better framework, it is feasible that such errors would be recoverable and fixable without outages.
I'm not giving a small time software vendor proprietary access to my machine at that level. I honestly think that anyone who accepts it must be woefully uninformed about the risks involved.
I'm already salty about the binary blobs required by various pieces of firmware.
I play a lot of Dota 2 and never really notice anything that is obviously cheat-related. IMO League would probably be fine with Valve-level anti-cheat; it's even less twitchy a game than Dota.
FPSs can just say "the console is the competitive ranked machine", add mouse + keyboard support and call it a day. But in those games cheaters can really ruin things with aimbots, so maybe it is necessary for the ecosystem, I dunno.
Nobody plays RTSs competitively anymore and low-twitch MMOs need better data hiding for what they send clients so 'cheating' is not relevant.
We are at the point where camera + modded input devices are cheap and easy enough I dunno if anti-cheat matters anymore.
You clearly don’t play competitive shooters and thus aren’t qualified to opine on the matter.
Competition vs other human beings is the entire point of that genre, and the intensity when you’re in the top .1% of the playerbase in Overwatch/Valorant/CSGO is really unmatched.
I think the problem comes when someone makes a cool, fun, silly little game that is otherwise great when played with randoms, and cheating just sorta spoils it.
Case in point from a few years back - Fall Guys. Silly fun, sloppy controls, a laugh. And then you get people literally flying around because they've installed a hack, so other players can't progress as they can't make the top X players in a round.
So to throw it back - it is just a game, it's so sad that a minority think winning is more important than just enjoying things, or think their own enjoyment is more important than everyone else's.
As an old-timer myself, we thought it was despicable when people replaced downloaded skins in QuakeWorld with all-fullbright versions in their local client, so they could get an advantage spotting other players... I suppose that does show us that multiplayer cheating is almost as old as internet gaming.
Not a gamer, but it seems like super competitive games should be played on locked down consoles not custom-built PCs where the players have full control?
Also, for more casual play, don't players have rankings so that you play with others about your level? Cheaters would all end up just playing with other cheaters in that case, wouldn't they?
At one point I recall that Valve implemented a rating system so that cheaters who got reported would all end up playing in the same pool with each other.
Sure, you can secure boot the kernel and the game binary itself, but then you have all the surrounding support from the OS that also needs to interop without being tamperable. Screenshots, network and input devices, for example, are routed through user space before reaching the game, and they can be used to make cheats. Now some of those layers are getting more isolated, for example with Wayland. Even so, that means your secure boot chain must go all the way up to include a non-tampered window manager too, taking you closer and closer to reinventing an Android-like console OS.
This seems both semi probably but also like maybe a bit of a critical moral hazard for Valve. Right now folks love Valve. They do good things for Linux.
Making a Valve-only Linux solution would take a lot of the joy of this moment away for many. But it would also help Valve significantly. It's very uncomfortable to consider, imo.
You don't have to play these specific games though. I mean, what's your privacy, what's not being bombarded by ads in your OS worth to you? Have you taken an honest thought about this?
If you want to play games with friends, you have to play whatever the group plays. This is especially problematic as the group tries out new games, increasing the chance you can’t join because you’re not on Windows.
Personally I'd be interested to see what would happen if Sony/MS did what they could to make keyboard/mouse experience as good as possible on their consoles (I'm writing from a position of ignorance on the state of mouse/keys with current consoles) and encouraged developers to offer a choice in inputs, so that the locked-down machines can become the place for highest confidence in no/low cheaters. If other people want to pay through the nose to go beyond what consoles offer on the detail/resolution/framerate trifecta then I'm sure they could do so, but I really don't see how you lock down an open platform. That challenge has been going for decades.
> I'm writing from a position of ignorance on the state of mouse/keys with current consoles
I'm far from an authority on this topic but from my understanding both Sony/MS have introduced mkb support, but so far it looks to be an opt-in kind of thing and it's still relatively new.
All major consoles support keyboard & mouse or similar.
The problem is more the audience. Console players generally expect to be able to just connect the console to the TV, sit on the sofa and play with the official controller. That's all games are required to support to be published on the platform.
Even if you were willing to play at a desk, you’d be matchmaking into a special (and small) mouse pool on the console game. Anyone willing to go through so much faff will accept the extra annoyances of a PC, even with kernel anti cheat.
This really depends on the friends you have. I've never encountered this limitation because no one in my friend group plays competitive ranked games. Basically anything with private sessions doesn't require anticheat, so Valheim, RV There Yet, Deep Rock Galactic, etc. all work fine.
Yes, but Linux really has gotten a lot better in recent years. At least whatever runs on Steam. I almost never had any problems with newer indie games.
My friends are understanding that I don't play games with rootkit anti cheat (whether on Linux or Windows). There are enough games that we can play other games together still, and when they want to play the games with such anti-cheat (e.g. Helldivers 2) they simply play without me. No big deal.
Yes, but sometimes it is nice to socialize with other people and they might play these types of games. I don’t enjoy Call of Duty, but I’ll play it from time to time so I can chat with my brother (this is the only way to get him on the phone/microphone for some reason). I value the time I am spending with him more than a bit of privacy (in that context).
I am very pro-Linux and pro-privacy, and hope that the situation improves so I don’t have to continue to compromise.
Besides ads and privacy concerns, it's been such a delight not having to deal with unwanted updates, hunting phantom processes that eat up CPU time, or the file explorer that takes forever to show ten files in the Downloads folder. I could not be paid to use Windows at this point.
Well yeah, but then eBPF would not work, and the anti-cheat could just detect that it's not running and lock you out.
This isn't complicated.
Even the CrowdStrike Falcon agent has switched to BPF, because it lowers the risk that a kernel driver will brick systems downstream, like what happened with Windows that one time. I recently configured a corporate single sign-on to simply not work if the BPF component was disabled.
Well but then attackers just compile a kernel with a rootkit that hides the hack and itself from the APIs of the BPF program, so it has to deal with that too or it's trivially bypassed.
Anticheat and antivirus are two similar but different games. It's very complicated.
The bpf api isn't the only telemetry source for an anti cheat module. There's a lot of other things you can look at. A bpf api showing blanks for known pid descendent trees would be a big red flag. You're right that it's very complicated but the toolchain is there if someone wanted to do the hard work of making an attempt. It's really telemetry forensics and what can you do if the cheat is external to the system.
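One classic flavor of that cross-checking, sketched in Python (a userspace toy, not a real anti-cheat: compare the PIDs visible in /proc against a second, independent probe, the same way rootkit detectors look for processes hidden from one API but not another; assumes Linux):

```python
import os

def pids_from_proc():
    """PIDs as reported by listing /proc -- the view a rootkit typically filters."""
    return {int(d) for d in os.listdir("/proc") if d.isdigit()}

def pid_alive(pid):
    """Independent probe: signal 0 checks for existence without sending anything."""
    try:
        os.kill(pid, 0)
        return True
    except ProcessLookupError:
        return False        # no such process
    except PermissionError:
        return True         # exists, just not ours

def hidden_pids(scan_up_to=32768):
    """PIDs that answer the probe but are missing from /proc: a big red flag."""
    listed = pids_from_proc()
    return [p for p in range(2, scan_up_to) if pid_alive(p) and p not in listed]
```

A real implementation would have to account for races (processes spawning between the listing and the probe) and would combine many such independent telemetry sources, but the principle is the same: inconsistency between views is itself the signal.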
I'd be less antianticheat if I could just select the handcuffs at boot time for the rare occasion where I need them.
Although even then I'd still have qualms about paying for the creation of something that might pave the path for hardware vendors to work with authoritarian governments to restrict users to approved kernel builds. The potential harms are just not in the same league as whatever problems it might solve for gamers.
Once a slave, always a slave. Running an explicitly anti-user proprietary kernel module that does god-knows-what is not something I'd ever be willing to do, games be damned. It might just inject exploits into all of your binaries and you'd be none the wiser. Since it wouldn't work in VMs, you'd have to use a dedicated physical machine for it. Seems too high a price to pay for just a few games.
Being able to snapshot and restore memory is a pretty common feature across all decent hypervisors. That in and of itself enables most client-side cheats. I doubt they'd bother to provide such a hypervisor for the vanishingly small intersection of people who:
- Want to play these adversarial games
- Don't care about compromising control of hypervisor
>Being able to snapshot and restore memory is a pretty common feature across all decent hypervisors
A hypervisor that protects against this already exists for Linux with Android's pKVM. Android properly enforces isolation between all guests.
Desktop Linux distros are way behind in terms of security compared to Android. If desktop Linux users ever want L1 DRM to work to get access to high resolution movies and such they are going to need such a hypervisor. This is not a niche use case.
It "protects" against this given the user already does not control the hypervisor, at which point all bets are off with regard to your rights anyway. It's actually worse than Windows in this regard.
I would never use a computer I don't have full control over as my main desktop, especially not to satisfy an external party's desire for control. It seems a lot more convenient to just use a separate machine.
Even mainstream consumers are getting tired of DRM crap ruining their games and movies. I doubt a significant number of Linux users would actually want to compromise their ownership of the computer just to watch movies or play games.
I do agree that Linux userland security is lackluster, though. Flatpak seems to be a neat advancement, at least in regard to stopping things from basically uploading your filesystem. There are already a lot of kernel interfaces that can do this, like user namespaces. I wish someone would come up with something like QubesOS, but making use of containers instead of VMs and Wayland proxies for better performance.
You already don't control the firmware on the CPU. Would you be okay with this if the hypervisor was moved into the firmware of the CPU and other components instead?
I honestly think you would be content as long as the computer offered the ability to host an arbitrary operating system just like has always been possible. Just because there may be an optional guest running that you can't fully control that doesn't take away from the ability to have an arbitrary guest you can fully customize.
>to satisfy an external party's desire for control.
The external party is reflecting the average consumer's demand for there not being cheaters in the game they are playing.
>It seems a lot more convenient to just use a separate machine.
It really isn't. It's much more convenient to launch a game on the computer you are already using than going to a separate one.
Ah, I see, you're talking about Intel ME/AMD PSP? That's unfortunate and I'm obviously not happy with it, but so far there seems to be no evidence of it being abused against normal users.
It's a little funny that the two interests of adtech are colliding a bit here: They want maximum control and data collection, but implementing control in a palatable way (like you describe) would limit their data collection abilities.
My answer to your question: No, I don't like it at all, even if I fully trust the hypervisor. It would reduce the barrier for implementing all kinds of anti-user technologies. If that were possible, it would quickly be required to interact with everything, and your arbitrary guest would soon be pretty useless, just like the "integrity" bullshit on Android. Yeah, you can boot your rooted AOSP, but good luck interacting with banks, government services (often required by law!!), etc. That's still a net minus compared to the status quo.
In general, I dislike any methods that try to apply an arbitrary set of criteria to entitle you to a "free" service to prevent "abuse", be it captchas, play integrity, or Altman's worldcoin. That "abuse" is just rational behavior from misaligned incentives, because non-market mechanisms like this are fundamentally flawed and there is always a large incentive to exploit it. They want to have their cake and eat it too, by eating your cake. I don't want to let them have their way.
> The external party is reflecting the average consumer's demand for there not being cheaters in the game they are playing.
Pretty sure we already have enough technology to fully automate many games with robotics. If there is a will, there is a way. As with everything else on the internet, everyone you don't know will be considered untrusted by default. Not the happiest outcome, but I prefer it to losing general purpose computing.
I'm talking about the entire chip. You are unable to implement a new instruction for the CPU for example. Only Intel or AMD can do so. You already don't have full control over the CPU. You only have as much control as the documentation for the computer gives you. The idea of full control is not a real thing and it is not necessary for a computer to be useful or accomplish what you want.
>and your arbitrary guest will soon be pretty useless
If software doesn't want to support insecure guests, the option is between being unable to use it, or being able to use it in a secure guest. Your entire computer will become useless without the secure guest.
>Yeah you can boot your rooted AOSP, but good luck interacting with banks, government services (often required by law!!), etc.
This could be handled by also running another guest that was supported by those app developers that provide the required security requirements compared to your arbitrary one.
>That "abuse" is just rational behavior from misaligned incentives
Often these can't be fixed, or fixing them would result in a poor user experience for everyone due to a few bad actors. If your answer is to just not build the app in the first place, that is not a satisfying answer. It's a net positive to be able to do things like watch movies for free on YouTube. It's beneficial for all parties. I don't think it is in anyone's best interest to not do such a thing because there isn't a proper market incentive in place to stop people from ripping the movie.
>If there is a will, there is a way.
The goal of anticheat is to minimize customer frustration caused due to cheaters. It can still be successful even if it technically does not stop every possible cheat.
>general purpose computing
General purpose computing will always be possible. It just will no longer be the wild west where there was no security and every program could mess with every other program. Within a program's own context it is still able to do whatever it wants; you can implement a Turing machine (bar the infinite memory).
They certainly aren't perfect, but they don't seem to be hell-bent on spying on or shoving crap into my face every waking hour for the time being.
> insecure guests
"Insecure" for the program against the user. It's such a dystopian idea that I don't know what to respond with.
> required security requirements
I don't believe any external party has the right to require me to use my own property in a certain way. This ends freedom as we know it. The most immediate consequence is that we'd be subject to more ads with no way to opt out, but that would just be the beginning.
> stop people from ripping the movie
This is physically impossible anyway. There's always the analog hole, recording screens, etc, and I'm sure AI denoising will close the gap in quality.
> it technically does not stop every possible cheat
The bar gets lower by the day with locally deployable AI. We'd lose all this freedom for nothing at the end of the day. If you don't want cheating, the game needs to be played in a supervised context, just like how students take exams or sports competitions have referees.
And these are my concerns with your ideal "hypervisor" provided by a benevolent party. In this world we live in, the hypervisor is provided by the same people who don't want you to have any control whatsoever, and would probably inject ads/backdoors/telemetry into your "free" guest anyway. After all, they've gotten away with worse.
We already tried out trusting the users and it turns out that a few bad apples can spoil the bunch.
>It's such a dystopian idea that I don't know what to respond with.
Plenty of other devices are designed so that you can only use it in safe ways the designer intends. For example a microwave won't function while the door is open. This is not dystopia despite potentially going against what the user wants to be able to do.
>I don't believe any external party has the right to require me to use my own property in a certain way.
And companies are not obligated to support running on your custom modified property.
>The bar gets lower by the day with locally deployable AI.
The bar at least can be raised from searching "free hacks" and double clicking the cheat exe.
>who don't want you to have any control whatsoever
This isn't true. These systems offer plenty of control, but they are just designed in a way that security actually exists and can't be easily bypassed.
>and would probably inject ads/backdoors/telemetry into your "free" guest anyway.
This is very unlikely. It is unsupported speculation.
> We already tried out trusting the users and it turns out that a few bad apples can spoil the bunch.
You say this as if the user is a guest on your machine and not the other way around.
It's not a symmetrical relationship. If companies don't trust me, they don't get my money. And if I don't trust them, they don't get my money.
The only direction that gets them paid is if I trust them. For that to happen they don't have to go out of their way to support my use cases, but they can't be going out of their way to limit them either.
> designed in a way that security actually exists
When some remote party has placed countermeasures against how you want to use your computer, that's the opposite of security. That's malware.
Yep, there's plenty of prior art on how to implement the necessary attestations. Valve could totally ship their boxes with support for anticheat kernel attestation.
Is it possible to do this in a relatively hardware-agnostic, but reliable manner? Probably not.
What do you mean? Ship a computer with preinstalled Linux that you can't tamper with? Sounds like Android. For ordinary computers, secure boot is fully configurable, so it won't work: I can disable it, I can install my own keys, etc. And for any userspace way to check it, I'll fool it if I own the kernel.
No, just have the anti-cheat trust kernels signed by the major Linux vendors and use secure boot with remote attestation. Remote attestation can't be fooled from kernel space, that's the entire point of the technology.
That way you could use an official kernel from Fedora, Ubuntu, Debian, Arch etc. A custom one wouldn't be supported but that's significantly better than blocking things universally.
You can't implement remote attestation without a full chain of exploits (from the perspective of the user). Remote attestation works on Android because there is dedicated hardware that directly establishes communication with Google's servers and runs independently (as a backchannel). There is no such hardware in PCs. Software-based attestation was easily fooled on older Android/Linux.
The caller asks the TPM to present the signed boot chain; you can't fake that because it wouldn't be cryptographically valid. The TPM is that independent hardware.
How would that be implemented? I'd be curious to know.
I'm not aware that a TPM is capable of hiding a key without the OS being able to access/unseal it at some point. It can display a signed boot chain but what would it be signed with?
If it's not signed with a key out of the reach of the system, you can always implement a fake driver pretty easily to spoof it.
Basically, the TPM includes a key that is itself signed with a manufacturer key. You can't just extract it, and the signature ensures that this key is "trusted". When asked, the TPM will return the boot chain (including the bootloader or UKI hash), signed by its own key, which you can present to the remote party. The whole protocol is more complicated and includes a challenge.
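The shape of that protocol is easy to model. Here's a deliberately simplified sketch in Python, with an HMAC standing in for the TPM's asymmetric attestation key and made-up digests standing in for PCR values; a real flow uses TPM2_Quote, PCR registers, and an attestation key certified by the EK:

```python
import hashlib, hmac, os

# Toy model of a TPM quote. The HMAC secret stands in for "a signing key
# only the TPM can use"; real TPMs use an asymmetric Attestation Key
# certified by the Endorsement Key, and the TPM2_Quote command.
TPM_SECRET = os.urandom(32)          # never leaves the "TPM"

def tpm_quote(pcr_digest: bytes, nonce: bytes) -> bytes:
    """Sign the current boot-chain digest together with the verifier's nonce."""
    return hmac.new(TPM_SECRET, pcr_digest + nonce, hashlib.sha256).digest()

def verify_quote(pcr_digest, nonce, quote, trusted_digests) -> bool:
    """Server side: the signature must check out AND the digest must be allowlisted."""
    expected = hmac.new(TPM_SECRET, pcr_digest + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(quote, expected) and pcr_digest in trusted_digests

# Hypothetical boot-chain digests for illustration.
fedora_kernel = hashlib.sha256(b"shim+grub+fedora-kernel-6.8").digest()
patched_kernel = hashlib.sha256(b"shim+grub+my-cheat-kernel").digest()
trusted = {fedora_kernel}

nonce = os.urandom(16)               # fresh challenge defeats replay
assert verify_quote(fedora_kernel, nonce, tpm_quote(fedora_kernel, nonce), trusted)
# A tampered kernel produces a different digest, so its quote is rejected.
assert not verify_quote(patched_kernel, nonce, tpm_quote(patched_kernel, nonce), trusted)
# Replaying an old quote against a new nonce also fails.
assert not verify_quote(fedora_kernel, os.urandom(16), tpm_quote(fedora_kernel, nonce), trusted)
```

The nonce is what makes replaying an old quote useless, and the allowlist of digests is where "kernels signed by the major Linux vendors" would live.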
The TPM isn't designed for this use case. You can use it for disk encryption or for identity attestation, but step 1 of identity attestation is asking the TPM to generate a key and then trusting that fingerprint from then on, after doing a test sign with a binary blob. The running kernel is just a binary that can be hashed and whitelisted by a userspace application. You don't need a TPM for that.
Ah, got it. With enough motivation this is still pretty easily defeated, though. The key is in some kind of NVRAM, which can be read with specialized equipment, and once it's out you can use it to spoof signatures on a different machine and cheat as usual. The TPM implementations in a lot of consumer hardware are also rather questionable.
These attestation methods would probably work well enough if you pin a specific key like for a hardened anti-evil-maid setup in a colo, but I doubt it'd work if it trusts a large number of vendor keys by default.
Once it's out you could, but EKs are unique and tied to hardware. Using an EK to sign a boot state on hardware that doesn't match is a flag to an anti-cheat tool, and would only ever work for one person.
It also means that if you do get banned for any reason (obvious cheating) they then ban the EK and you need to go source more hardware.
It's not perfect but it raises the bar significantly for cheaters to the point that they don't bother.
> Using an EK to sign a boot state on hardware that doesn't match is a flag to an anti-cheat tool
The idea is you implement a fake driver to sign whatever message you want, totally faking your hardware list too. As long as the models are relatively similar I doubt there's a good way to tell.
Yeah, I think there are much easier ways to cheat at this point, like robotics/special hardware, so it probably does raise the bar.
Any sane scheme would whitelist TPM implementations. Anyway fTPMs are a thing now which would ultimately tie the underlying security of the anticheat to the CPU manufacturer.
I wonder if you could use check-point and restore in userspace (https://criu.org/Main_Page) so that after the game boots and passes the checks on a valid system you can move it to an "invalid" system (where you have all the mods and all the tools to tamper with it).
I don't really care about games, but i do care about messing up people and companies that do such heinous crimes against humanity (kernel-level anti-cheat).
The war is lost. The most popular game that refuses to use kernel-level anti-cheat is Valve's Counter-Strike 2, so the community implemented it themselves (FACEIT) and requires it for the competitive scene.
Uh, you'd have to compile a Kernel that doesn't allow it while claiming it does ... And behaves as if it does - otherwise you'd just fail the check, no?
I feel like this is way overstated. It's not that easy to do, and could conceptually be done on Windows too via hardware simulation/virtual machines. Both would require significant investment in development to pull off.
Right, the very thing that works against AC on Linux also works for it. There are multiple layers (don't forget Wine/Proton) to inject a cheat, but those same layers could also be exploited to detect cheats (especially adding fingerprints over time and issuing massive ban-waves).
And then you have BasicallyHomeless on YouTube who is stimulating nerves and using actuators to "cheat." With the likes of the RP2040, even something like an aim-correcting mouse becomes completely cheap and trivial. There is a sweet-spot for AC and I feel like kernel-level might be a bit too far.
All it takes is cd /usr/src/linux, running make menuconfig, turning off a few build flags, hitting save, and then running make to recompile. But that's like saying "well, if I remove fat32 support I can't use fat32". Yeah, it will lock you out, showing you have it disabled. No big deal.
That would require that they actually make the effort to develop Linux support. The current "it just works" reality is that the games developers don't need to support running on Linux.
I always wondered. Isn't this exactly what eBPF would allow you to do?
Assuming that cheats work by reading (and modifying) the memory of the game process, you can attach a kprobe to the sys_ptrace system call. Every time any process uses it, your eBPF program triggers. You can then capture the PID and UID of the requester and compare it against a whitelist (e.g. only the game engine can mess with the memory of that process). If the requester is unauthorized, the eBPF program can even override the return value to deny access before the kernel finishes the request.
Of course there are other attack vectors (like spoofing the PID/process name), but eBPF can cover those as well.
All of this to say that Linux already has sane primitives to allow that, but that, as long as devs don't prioritize Linux, we won't see this happening.
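For the curious, the per-call decision described above is tiny. This is just the logic, sketched in Python with made-up PIDs and names; a real implementation would be an eBPF program in C, with the whitelist kept in a BPF map and the denial done via bpf_override_return:

```python
# Sketch of the decision a sys_ptrace kprobe handler would make.
# All PIDs/UIDs below are illustrative, not from any real system.
EPERM = 1

def on_sys_ptrace(requester_pid, requester_uid, target_pid,
                  protected_pids, whitelist):
    """Return None to let the kernel proceed, or a negative errno to override."""
    if target_pid not in protected_pids:
        return None            # not the game process: don't interfere
    if (requester_pid, requester_uid) in whitelist:
        return None            # e.g. the game engine's own authorized helper
    return -EPERM              # unauthorized ptrace: deny before it runs

protected = {4242}              # hypothetical PID of the game process
allowed = {(4243, 1000)}        # hypothetical authorized (pid, uid) pair

assert on_sys_ptrace(4243, 1000, 4242, protected, allowed) is None   # allowed
assert on_sys_ptrace(6666, 1000, 4242, protected, allowed) == -1     # denied
assert on_sys_ptrace(6666, 1000, 9999, protected, allowed) is None   # unrelated target
```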
but how does the anti-cheat know that the kernel is not modified such that it disables certain eBPF programs (or misreports cheats/spoofs data etc)?
This is the problem with anti-cheat in general (and the same exists with DRM) - the machine is (supposedly) under the user's total control and therefore, unless your anti-cheat is running at the lowest level, outside of the control of the user's tampering, it is not trustworthy. This leads to TPM requirements and other anti-user measures that are dressed as pro-user in windows.
There's no such thing in linux, which makes it inoperable as one of these anti-cheat platforms imho.
Great point. As I mentioned, there are other attack vectors and you can mitigate them. To mitigate the one you're describing, for instance, you don't just run one eBPF program, but a cluster of them that watch each other:
(The following was refined by an LLM because I didn't remember the details of when I was pondering this a while back)
All your anti-cheats are eBPF programs hooked to the bpf() syscall itself.
Whenever any process tries to call BPF_PROG_DETACH or BPF_LINK_DETACH, your monitors check if the target is one of the anti-cheats in your cluster.
If an unauthorized process (even root) tries to detach any of your anti-cheat programs, the eBPF program uses bpf_override_return to send an EPERM (Permission Denied) error back to the cheat.
(End LLM part)
Of course, you can always circumvent this by modifying and compiling the kernel so that those syscalls when targeting a specific PID/process name/UID aren't triggered. But this raises the difficulty of cheating a lot as you can't simply download a script, but you need to install and boot a custom kernel.
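To make the watchdog idea concrete, here's a toy Python model of the detach guard (program IDs, PIDs and the "manager" concept are all invented for illustration; the real thing would again be eBPF hooked on the bpf() syscall):

```python
# Toy model of "monitors watching each other": every BPF_PROG_DETACH request
# is itself inspected, and detaching any member of the cluster is refused
# unless the requester is the designated manager process.
EPERM = 1

def on_bpf_detach(requester_pid, requester_uid, target_prog_id,
                  cluster_progs, manager_pid):
    if target_prog_id not in cluster_progs:
        return None          # detaching unrelated programs is fine
    if requester_pid == manager_pid:
        return None          # only the manager may tear the cluster down
    return -EPERM            # deny, even for root (uid 0)

cluster = {101, 102, 103}    # hypothetical program IDs of the mutual watchdogs

assert on_bpf_detach(500, 0, 102, cluster, manager_pid=1) == -1    # root denied
assert on_bpf_detach(1, 0, 102, cluster, manager_pid=1) is None    # manager allowed
assert on_bpf_detach(500, 1000, 999, cluster, manager_pid=1) is None  # unrelated prog
```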
So this would stop the random user cheating in an online match. Pros with enough motivation can and will cheat anyway, but that is true on Windows too. Finally, at top gaming events there is so much scrutiny (you play on stage on vetted PCs) that this is a non-issue.
It's open source. Somebody will simply publish an AUR package with a custom kernel that is one command away. You're underestimating the capability of motivated nerds to make a good UX when needed :p. This is how we ended up with SteamOS in the first place
But given the Linux kernel is monolithic and you can enforce signing of kernel modules too, using the TPM to make sure the kernel isn't tampered with is honestly the way to go.
You can't, but circumventing anti cheats already happens on windows with all their fancy kernel level anti cheats.
I believe the goal is to make it so uncomfortable and painful that 99.999% of users will say fuck it and won't do it. In this case users would need to boot a custom kernel downloaded from the internet, which might contain keyloggers and other nasty things. It's not just downloading a script and executing it.
For cheat developers, instead, this implies doing the modifications to allow those sys-calls to fly under the radar while keeping the system bootable and usable. This might not be trivial.
Isn't it a more fundamental problem? I can imagine a cheating setup where you have a separate PC with a HDMI capture stick ("analog hole") and access to the controllers.
I wonder whether a game could be shipped with its own "kernel" and "hypervisor", basically an entire VM. Yes, performance will take a hit, but in my experience with my own VMs it's more like 15-20%.
Another unresolved roadblock is Nvidia cards seriously underperforming in DX12 games under Proton compared to Windows. Implementing DX12 semantics on top of Vulkan runs into some nasty performance cliffs on their hardware, so Khronos is working on amending the Vulkan spec to smooth that over.
What percentage of games require DX12? From what I recall, a surprisingly large percentage of games support DX11, including Arc Raiders, BF6 and Helldivers 2, just to name a few popular titles.
At the same time, Vulkan support is also getting pretty widespread, I think notably idTech games prefer Vulkan as the API.
DX12 is overwhelmingly the default for AAA games at this point. The three titles you listed all officially require DX12, what DX11 support they have is vestigial, undocumented and unsupported. Many other AAAs have already stripped their legacy DX11 support out entirely.
Id Software do prefer Vulkan but they are an outlier.
DX12 is less and less the default, most gamedev that I’ve seen is surrounding Vulkan now.
DX12 worked decently better than OpenGL before, all the gamedevs had Windows, and it was required for Xbox... but those things are less and less true now.
The PlayStation was always the odd man out when it came to graphics processing, and we used a lot of shims, but then Stadia came along and was a proper Linux, so we rewrote a huge amount of our renderer to be better behaved for Vulkan.
All subsequent games on that engine have thus had a Vulkan-friendly renderer by default, one that is implemented more cleanly than the DX12 one and works natively pretty much everywhere. So it's the new default.
Clearly, once there are enough Linux gamers, another solution to the kernel-level anti-cheat issue will be found. After all, the most played competitive shooter is CS, and Valve does not use kernel-level AC.
> After all, the most played competitive shooter is CS, and Valve does not use kernel-level AC.
Valve doesn't employ kernel AC but in practice others have taken that into their own hands - the prevalence of cheating on the official CS servers has driven the adoption of third-party matchmaking providers like FACEIT, which layer their own kernel AC on top of the game. The bulk of casual play happens on the former, but serious competitive play mostly happens on the latter.
The best description I've been able to give of the dichotomy of CS is this: there is no way for a person to become good enough to get their signature into the game, without using kernel-level ACs.
The competitive CS leagues do use AC though. The big issue for these games is the free-to-play model does not work without anti-cheat. Having a ~$20 fee to cheat for a while before getting banned significantly reduces the number of cheaters, and that's what CS does with their prime server model.
And for what it's worth, I'm pretty sure Valorant is the most played competitive shooter at the moment.
How does their revenue rely on it? People won't buy/recommend their games if they can't solve a fundamental problem, without full control over the machine their product is running on? Then they can change their business model and/or game mechanics. Simple as that. The only reason that blatant security violation was ever considered a viable option is because Microsoft gave them the ability to actually do it with the click of a button. Those companies can adapt, or die.
First, let's ask ourselves how many PC users play games with anti-cheat frameworks. I'm absolutely no expert, but if it's more than, what, 10%? Let's even say 20%; I'd be surprised.
> and unfortunately a good majority of the gaming industry by revenue relies on it.
Well, it used to be the case that game makers relied on copy protection in floppy discs, and movie distributors on DVD/BluRay copy protection. Conditions changed and they adapted.
The Quest 3 works offline with ALVR streaming over a private (non-Internet-connected) WiFi network. Together with my 3090 I get 8k @ 120fps with 20ms latency over a WiFi 6E dongle. I had to manually install the dkms module for the dongle on Pop!_OS, but apart from that it just works. ALVR starts SteamVR and then I use Steam to start the game. Proton seems to use Vulkan for rendering.
What might help is if AMD or Nvidia take the gamble and create decent drivers and advertise Linux compatibility, driving up sales, forcing their competitor to do the same.
> I just hope that Nvidia notices that there does appear to be a swing happening and improves their driver situation.
I firmly believe that Nvidia doesn't want the general public to ever have better hardware than what is current as people could just run their own local models and take away from the ridiculous money they're making from data centers.
In step with that, they're now renting their gaming GPUs to players through their GeForce Now service.
Nvidia's gaming market share is a rounding error now next to AI datacenter orders. I won't hold my breath waiting for them to revisit their established drivers for Linux.
> I firmly believe that Nvidia doesn't want the general public to ever have better hardware than what is current as people could just run their own local models and take away from the ridiculous money they're making from data centers.
You're underestimating them. They don't even want rich professional users to own hardware that could compete with their datacenter cash cow.
Take RTX 6000 Pro, a $10k USD GPU. They say in their marketing materials that these have fifth-generation tensor cores. This is a lie, as you can't really use any 5th-gen specific features.
Take a look at their PTX docs[1]. The RTX 6000 Pro is sm_120 in that table, while their datacenter GPUs are sm_100/sm_110. See the 'tcgen05' instructions in the table? It's called 'tcgen05' because it stands for "Tensor Core GEN 05". And they're all unsupported on sm_120.
I'll keep repeating it: the more people vote with their wallet, the more game companies will support Linux, including with their anticheat.
EAC has the support for Linux, you just have to enable it as a developer.
I know this; I worked on games that used it. EAC was used on Stadia (which was a Debian box) for The Division, because the server had to detect that EAC was actually running on the client.
I feel like I bring this up all the time here but people don’t believe me for some reason.
This does not mean it supports the full feature set of EAC on Windows. As an analogy, it's like saying Microsoft Excel supports the iPad. It's true, but without VBA support there aren't going to be many serious attempts to port more complicated spreadsheets to it.
Funnily enough, the most annoying thing on my system at the moment is RGB and keyboard/mouse customisation.
I haven't found a tool that can access all the extra settings of my Logitech mouse, nor my Logitech speakers.
OpenRGB is amazing but I’m stuck on a version that constantly crashes; this should be fixed in the recent versions but nixpkgs doesn’t seem to have it (last I checked).
On the other hand I did manage to get SteamVR somewhat working with ALVR on the Quest 3, but performance wasn’t great or consistent at all from what I remember (RTX 3070, Wayland KDE).
Have you tried running the windows RGB utility via Wine with HIDRAW enabled for the device?
Alternatively, given you're running NixOS, you can just override the `src` of the derivation with a newer version. This is part of the point of running NixOS: making small modifications to packages on the fly.
I was annoyed recently because I replaced my GPU and I had to boot into Windows for the first time in months and install drivers just to turn off the RGB on the card because OpenRGB wouldn't find it.
When that steam deck clone came out and games played better on SteamOS than on Windows on the exact same hardware, it woke a bunch of people up. Microsoft scrambled to bring the startup time and footprint down but shots had already been fired.
You don't want a vendor you have to publicly shame to get them to do the right thing. And that's MS, if any single sentence has ever described them without using curse words.
I've got the Legion Go S with Steam OS, and that shit is great. It's stable, my games run well, the OS is pretty much entirely in the background, but I can still access it fully if I need to. Love it.
I don't get the feeling they care. Microsoft is so lost under Satya at this point, totally blinded by Azure, AI, and stock price growth. At some point they're going to realize all the ground they've lost and it's going to be a real problem. They're repeating a lot of the same mistakes that cost them the browser and mobile markets.
Yeah. MS must have been so hurt about losing to the iPhone, they really jumped the gun on AI as if to avoid a similar mistake. It's Satya's major play and I think they are already paying for that decision. xbox is hollowed out so that AI can be funded, while the pc/console hybrid project is doomed to fail because "windows everywhere" doesn't work if windows is crap. indeed, they might be left with just the cloud business in the end.
And the funniest thing is: not having a mobile platform anymore will be the death knell for all of their AI efforts.
I’m not really into this AI shenanigans, but it seems to me that if you want people to use /your/ bot, you gotta give it to people in the most seamless and efficient way possible, and that does not translate well to a desktop OS.
I don’t think they would have dethroned iOS or even Android had they stayed their ground, but they probably would’ve had a stronger base to build upon for their Copilot nonsense. Those that used Windows Phone used it because they loved it, Copilot could’ve garnered some good rep from those already sold on Microsoft’s platform; instead, they’re trying to shove it down people’s throats even though very few people actually use Windows because they actively like it, most use it because it’s the “default” OS and they do not (and care not to) know any better.
"Totally blinded by Azure and AI and stock price growth."
Stock price growth is their core business because that is how large firms operate.
MS used to embrace games etc because the whole point was all PCs should run Windows. Now the plan is to get you onto a subscription to their cloud. The PC bit is largely immaterial in that model. Enterprises get the rather horrible Intune bollocks to play with but the goal is to lock everyone into subs.
Remember when the rumour was that Windows 10 would be the last Windows? I don't think people thought that would be because Win11 would be so unbearable it would finally drive users to Linux... but here we are. RIP.
If people were buying new PCs every year like they used to, it'd be worth it. Turns out there isn't as much value in having a "captive market" on a PC unless it's locked down.
The irony is that gaming on Linux got better, but the instigator was not the OSS community. All of it was funded by closed-source software competing with other closed-source software. The OSS community by itself did not have the conviction to climb over this bulwark.
But when Steam started to develop Proton, WINE was 90% there! Valve only had to provide the remaining 90%.
The strength of Linux and Free software in general is not in that it's completely built by unpaid labor. It's built by a lot of paid, full-time labor. But the results are shared with everyone. The strength of Free software is that it fosters and enforces cooperation of all interested parties, and provides a guarantee that defection is an unprofitable move.
This is one of the reasons you see Linux everywhere, and *BSD, rarely.
> This is one of the reasons you see Linux everywhere, and *BSD, rarely.
I doubt it's a large reason. I'd put more weight on eg Linus being a great project lead and he happens to work on Linux. And a lot of other historical contingencies.
BSD does a few things right, hence it's used by Netflix (who share back some of their work), userland of macOS (because Apple don't like GPL, I assume), PS4 and PS5 (IDK if anything seeps back upstream from there).
It isn’t about conviction. Gaming takes tremendous resources and they were not there. But if this starts shifting the tides there is a possible future where game developers start building for Linux as a primary target and to run games on Windows or Mac you would use emulation. In fact this seems like a better overall approach given that there are no hidden APIs with Linux.
Money and resources suddenly materialized once someone realized that there was profit in it is pretty much the expected way this goes. OpenTofu happened not because of some OSS force of will but because a group of companies needed it to exist for their business.
This flow is basically the bread and butter for the OSS community and the only way high effort projects get done.
This still has a "sometimes" on it. More than a few games need magic Proton flags to run well (nothing you can't look up on ProtonDB), but lots of games you'd want to play with friends might have some nasty anti-cheat that just won't let you play at all.
Gaming works fine with exception of things like BF6 that require kernel level anti cheat.
The one thing I haven’t been able to get working reliably is steam remote play with the Linux machine as host. Most games work fine, others will only capture black screens.
Proton has gotten so good now that I don't even bother checking compatibility before buying games.
Granted, I don't play online games, so that might change things, but for years I used to have to make a concession that "yeah Windows is better for games...", but in the last couple years that simply has not been true. Games seem to run better on Linux than Windows, and I don't have to deal with a bunch of Microsoft advertising bullshit.
Hell, even the Microsoft Xbox One controllers work perfectly fine with xpad and the SteamOS/tenfoot interface recognizes it as an Xbox pad immediately, and this is with the official Microsoft Xbox dongle.
At this point, the only valid excuses to stay on Windows, in my opinion, are online games and Microsoft Office. I don't use Office since I've been on Unixey things so long that I've more or less just gotten used to its options, but I've been wholly unable to convince my parents to change.
I love my parents, but sometimes I want to kick their ass, because they can be a bit stuck in their ways. I'm the one who is expected to fix their computer every time Windows decides to brick itself, yet they act like it's weird for me to ask them to install Linux. If I'm the one who has to perform unpaid maintenance, I don't think it's weird to push them toward an operating system with diagnostic tools that actually work.
As far as I can tell, the diagnostic and repair tools in Windows have never worked for any human in history, and they certainly have never worked for me. I don't see why anyone puts up with it when macOS and Linux have had tools that actually work for a very long time.
> At this point, the only valid excuses to stay on Windows, in my opinion
I didn’t see a performance increase moving to Linux for the vast majority of titles tested. Certainly not enough to outweigh the fact that I want EVERY game to work out of the box, and to never have to think if it will or won’t. And not all of my games did, and a not insignificant number needed serious tweaking to get working right.
I troubleshoot Linux issues all day long, I’ve zero interest in ever having to do it in my recreation time.
That’s a good enough reason for me to keep my windows box around.
I use Linux and OSX for everything that isn’t games, but windows functions just fine for me as a dumb console and I don’t seem to suffer any of these extreme and constant issues HN users seem to have with it from either a performance or reliability standpoint.
But they work out of the box, which is my point. You can use a device in between that places the screen in a fixed space in front of you, for example. While that is cool, it's kind of a hassle to have a device in between. I just plug them in directly and they work.
Steve Burke from GamersNexus tested eight games from their benchmark suite on Linux last month. Although his conclusion was generally positive, there were problems with nearly every game:
- F1 2024 didn't load due to anti-cheat
- Dragon's Dogma 2 and Resident Evil 4 had non-functional raytracing
- Cyberpunk 2077 with raytracing on consistently crashes when reloading a save game
- Dying Light 2 occasionally freezes for a whole minute
- Starfield takes 25 minutes to compile shaders on first run, and framerates for Nvidia are halved compared to Windows
- Black Myth: Wukong judders badly on Nvidia cards
- Baldur's Gate 3 Linux build is a slideshow on Nvidia cards, and the Windows build fails for some AMD cards
If you research these games in discussion forums, you can find some configuration tweaks which might fix the issues. ProtonDB's rating is not a perfect indicator (BM:W is rated "platinum").
And while Steve says measurements from Linux and Windows are not directly comparable, I did so anyway and saw that Linux suffers a 10-30% drop in average FPS across the board when compared to Windows, depending on the game and video card.
AFAIK this comes down largely to NVIDIA not putting enough effort into the Linux drivers. There is a pretty well documented and understood reason for the perf hit NVIDIA GPUs take on Linux.
Honestly, considering where we came from, a 10-30% perf drop is good and a reasonable tradeoff to consider. Especially for all the people who don't want to touch Windows 11 with an 11-foot pole (which includes me), it's a more than decent path. I can reboot into my unsupported Win10 install if I really need the frames.
Really, Linux benchmarks need to be split between AMD and NVIDIA. Both are useful, as the "just buy an AMD card lol" crowd is ignoring the actually large NVIDIA install base, and it's not like I'm gonna swap out my RTX 3090 to go Linux.
Thanks for the comparison! Would you have an apples to apples, or rather an NVIDIA to NVIDIA comparison instead of "across the board"? I'd suspect the numbers are worse for the pure NVIDIA comparison, for what I mentioned above.
I'm not. The situation is improving rapidly, and I'd expect the gap to close soon.
I still have the windows install. And with an RTX 3090, framerate is not that much of a consideration for most games, especially since my main monitor is "only" 1440p, albeit a 144Hz one.
Couple that with G-Sync and framerate fluctuations aren't really noticeable. Gone are the days when dipping below 60fps was a no-no. The most important metrics are stutter and 1% lows; those really affect the feel of your game. My TV is 120Hz with G-Sync too, and couch games with a controller are much less sensitive to framerate.
Do I leave performance on the table? Surely. Do I care? In the short term, no. The last GPU-intensive games I played were Hogwarts Legacy and Satisfactory, both of which can take a hit (Satisfactory doesn't max out the GPU, and Hogwarts can tolerate DLSS). The next intensive game I plan on playing is GTA VI, and by then I fully expect the perf gap to have closed, and the game to play fine, given how much care Rockstar puts into the performance of their games, more so with the Gabe Cube being an actual target.
In the long run, I agree this is not a "happy" compromise. I paid for that hardware dammit. But the NVIDIA situation will be solved by the time I buy a new GPU: either they completely drop out of the gaming business to focus on AI, or they fix their shit because Linux will be an actual gaming market and they can't keep giving the finger to the penguin.
> The Start menu works great with no lag, even immediately after booting.
The very fact that this has to be explicitly mentioned is laughable.
Even $100 Chinese phones can achieve the same; this is the bare minimum for a modern system capable of driving a 240Hz monitor (I assume it can do so with most games).
Considering I found the win10 start menu too slow, the w11 one doesn't stand a chance. But your comment makes me hopeful: it shows that w11 is not the complete shitshow people make it out to be, though the few times I used it on relatives' computers I found it not responsive enough.
I'm testing daily-drive on my main rig (high-end from a few years ago, 5900x + 3090), and honestly I'm rediscovering my computer. A combination of less fluff, less animations, better fs performance (NTFS on NVMe is suboptimal), etc. I was getting fed up by a few windows quirks: weird updates breaking stuff, weird audio issues (e.g. the audio subsystem getting ~10s latency for any interaction like playing a new media file or opening the output switcher), weird display issues (computer locking up when powering on/off my 4k tv), and whatnot. I'm still keeping the w10 install around, as having an unsupported OS is less of a problem for the occasional game, especially since I mostly play offline games.
As for the dev env, you're not limited to bazzite, I run Arch. Well, I've been running it for two weeks on the rig. But you really get the best devex with linux.
The few win11 machines I've touched were all on NVMe drives, and I'm pretty sure those are fast enough for a start menu. I mean, gear like yours shouldn't be needed to get a responsive start menu.
I'm curious, did you clean up what's by default in the start menu? Stuff like "recommended", "candy crush", and the likes? On the win11 I tested, those parts loaded slower than the rest, I wonder if the start menu has a timeout of "load then open".
Had I switched to win11 I'd have slapped Classic Shell on it, as I did on win10. It's a reimplementation of the win7 start menu with windows-version-appropriate design, but with win7 reactivity (opens literally the next frame, in no small parts thanks to the absence of animation).
After checking the responsiveness of the start menu earlier, I uninstalled or unpinned the useless stuff in Pinned.
I don't think it made a difference, it was already lag free before.
It's annoying they put Office Copilot and Instagram there, but it uninstalled with just two clicks per item, taking a minute or so to get rid of everything.
> Baldur's Gate 3 Linux build is a slideshow on Nvidia cards
I played Baldur's Gate 3 on Linux on a GeForce GTX 1060 (which is almost 10 years old!) without a fan (I found out later that it was broken) and I generally did not have issues (a couple of times in the whole game it slowed down for a couple of seconds, but nothing major).
The key word was Linux build. There's now an official Linux version so that BG3 runs better on Steam Deck. Everyone else should keep using Proton to run it, as they've done thus far.
Which applies to basically all games. Nowadays I make sure to select Proton before even running a game for the first time, in case it has a Linux build -- that will invariably be the buggier experience, so I want to avoid it.
That's the whole problem: no consistency. Some configurations work, others don't, even though they should be way more capable.
That's not even limited to Linux or gaming. A few weeks ago I tried to apply the latest Windows update to my 2018 Lenovo ThinkPad. It complained about insufficient space (I had 20GB free). I then used a USB drive as extra space (required by Windows) and tried to install the update. Gave up after an hour without progress...
Hardware+OS really seems unfixable in some cases. I'm 100% getting a macbook next time. At least with Apple I can schedule a support appointment.
For gaming macOS does not seem a great choice. I have friends with macOS and, at least on Steam, there are very few games running on that platform.
Additionally, when I was using macOS for work, I also ran into unexpected issues whenever I wanted to do anything a bit more special (think packages installed using Homebrew, compiling a thing from source, etc.).
So for me the options are: either use a locked-down device where you can't do anything other than what the designers thought of, and if you are lucky it will be good; OR use something where you have complete freedom and take responsibility for tweaking when things don't work. macOS tries to be the first option (but in my opinion does not succeed as much as it claims to), Linux is the second option (but it is harder than it could be in many cases), and Windows tries to do both (and is worse than the other two alternatives).
> Baldur's Gate 3 Linux build is a slideshow on Nvidia cards
Not at all my experience which makes me question the rest. Also https://www.protondb.com/app/1086940 most people seem quite happy with it so it's not a "me" problem.
Finally, the "10-30% drop in average FPS across the board" might be correct -- but so what? I understand a LOT of gamers want "the best" performance for what they paid good money for, but pretty much NO game becomes less fun with even a 30% FPS drop; you just adjust the settings and go play. I think a lot of gamers get confused and treat maximizing performance as a game in itself. That might be fun, and that's 100% OK, but it's also NOT what playing an actual game is about.
Those are mostly reports for the Windows build of Baldur's Gate 3, running through Proton/Wine. He's talking about the newer Linux native build of the game from 3 months ago.
> pretty much NO game becomes less fun with even a 30% FPS drop
I mostly play fighting games. A 7% drop in FPS is more than enough to break the whole game experience, as combos rely on frame data. For example, Street Fighter 6 is locked at 60 fps. A low punch needs 4 frames to come out and leaves a 4-frame window to land another hit.
If there was a 7% drop in FPS, you would miss your combo. Even the tiniest drop in FPS makes the game unplayable.
It's the same for almost every fighting game. I know it's a niche genre, but I'm quite sure it's similar in other genres. It's a complete dealbreaker for competitive play.
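The arithmetic here can be made concrete; a quick sketch using the Street Fighter 6 numbers above (60 fps lock, 4-frame link, 7% slowdown):

```shell
# Real-time length of a 4-frame link window at 60 fps, and at 7% slower.
# The window itself stretches, but muscle memory is trained against the
# 60 fps rhythm, so every input lands ~5 ms early on each link -- enough
# to drop a practiced combo.
awk 'BEGIN {
  frames  = 4
  ideal   = frames * 1000 / 60          # ~66.7 ms at a locked 60 fps
  dropped = frames * 1000 / (60 * 0.93) # ~71.7 ms when running 7% slow
  printf "%.1f ms vs %.1f ms (drift %.1f ms per link)\n", ideal, dropped, dropped - ideal
}'
```

That ~5 ms of drift compounds on every link in a combo, which is why even a small, steady slowdown breaks timing that raw hardware headroom can't fix.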
I played competitive Quake on LAN and online. If your setup -- hardware or software -- can't handle your configuration, you either get a better one (spend money, roll back your OS, etc.) or adjust (lower your settings; nobody plays competitive gaming for the aesthetics -- Quake in such a context is damn ugly and nobody cares).
It's not about a drop in game, it's about being prepared for the game. If you get a 7% drop, or even a .1% drop (whatever is noticeable to you) then you adjust.
To be clear, I'm not saying worse performance is OK. I'm saying everybody wants 500FPS for $1 of hardware but nobody gets that. Consequently we get a compromise, e.g. pay $2000 for 60FPS and so be it. If you have to pay $2000 + $600, or lower graphics settings to still get 60FPS, that's what you do.
PS: FWIW competitive gaming is niche in gaming. Most people might want to compete but in practice most people are not, at least not professionally. It's still an important use case but it's not the majority. Also from my own personal experience I didn't get performance drop.
> It's a complete dealbreaker for competitive play
Very true, and this is the biggest issue for me when it comes to gaming on Linux. And it's not just raw FPS count. You can usually brute force your way around that with better hardware. (I'm guessing you could probably get a locked 60 in Street Fighter 6 even with a 30% performance loss?). It's things like input lag and stutter, which in my experience is almost impossible to resolve.
If it weren't for competitive shooters, I could probably go all Linux. But for now I still need to switch over to Windows for that.
(cue arrogance)
People on HackerNews complaining about Linux Desktop is pretty disappointing. You guys are supposed to be the real enthusiasts... you can make it work.
(cue superiority complex)
I've been using desktop Linux for over 10 years. It's great for literally everything. Gaming admittedly is like 8/10 for compatibility, but I just use a VM with PCIe passthrough to pass in a GPU and load up a game for Windows, or use CAD, etc. Seriously, ez.
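For anyone curious what the passthrough route involves, the usual first step is checking that the GPU sits in its own IOMMU group. A common sketch (standard sysfs paths; assumes a bash shell and `lspci` installed -- this is a starting point, not a full VFIO guide):

```shell
#!/bin/bash
# List IOMMU groups and the devices in each. The GPU you pass through
# should ideally share its group only with its own HDMI audio function;
# if the directory is empty, IOMMU is likely disabled in firmware/kernel.
shopt -s nullglob
for group in /sys/kernel/iommu_groups/*; do
    echo "IOMMU group ${group##*/}:"
    for dev in "$group"/devices/*; do
        echo "  $(lspci -nns "${dev##*/}")"
    done
done
```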
Never had issues with NVIDIA GFX with any of the desktop cards. Laptops... sure they glitch out.
Originally Wine, then Proton, now Bazzite make it super easy to game natively.
The only issues I ever had with games were from the bundled kernel-level anti-cheats. The anti-cheats just weren't available for Linux, so the games didn't start. Anyone familiar with those knows it's not a Linux thing, it's a publisher/anti-cheat mechanism thing. Just lazy devs, really.
(cue opinionated anti-corporate ideology)
I like to keep Microsoft chained up in a VM where it belongs, so it can't do its shady crap. Also, with a VM you can do shared folders and clipboard. Super handy, actually.
Weirdly enough, MacOS in a VM is a huge pita, and doesn't work well.
I have been working professionally on Linux for many years. But about once a year I have to reinstall the os because it craps out for various reasons. The same story goes for most of my team, but for some reason they seem ok with this. My issue with Linux is this: I don’t feel like a consumer, but a janitor. I don’t want this. Yes you can do whatever you want, but I don’t want to do those things. I want to write code and play games, not maintain the intricacies of a running computer.
For a server there is no better choice than Linux, but for my desktop/laptop, I find other alternatives better. Perhaps I haven’t found «the right distro», if so let me know, but until Linux is as low maintenance as windows or macos, it will be for those with an interest in doing that maintenance.
I realize I have a love-hate relationship with Linux. It is perfect, but flawed.
> I don’t feel like a consumer, but a janitor. I don’t want this.
I think it was Jorge Castro, the creator of Universal Blue, who called it the sysadmin culture. Most Linux distros are made by sysadmins for sysadmins, and you're expected to change and configure your system. I was a sysadmin myself for a long time. I used Slackware; switched from the 2.4 kernel to 2.6; tweaked CFLAGS on Gentoo; replaced SysV init with systemd; used PipeWire from the earliest versions - you name it, I did it.
Nowadays I use https://aeondesktop.github.io/ - an immutable system with Btrfs snapshots. Everything is installed from Flathub. The major roadblock is that much of the Linux world expects you to modify the system one way or another, so your mileage may vary. I replaced my printer because I did not want to install binary blobs from HP/Samsung.
> Perhaps I haven’t found «the right distro»
I’d look at immutable or image-based offerings, which aim at low or no maintenance: Aeon Desktop, Universal Blue, Endless OS. There are reviews on sites like LWN.net.
I don't know what you are doing, but I have had my Arch Linux running since about 2013. I've needed to intervene a few times -- I think 4 in total -- but the base installation is from 2013, now nearly 13 years ago.
That's pretty good, I'm jealous! The last time I reinstalled my OS (Slackware) from scratch was 2009, but I run into serious problems every couple of years when upgrading it to 'Slackware64-current' pre-release, because Slackware's package manager doesn't track dependencies and you can just install stuff in the wrong order: I usually don't upgrade the whole OS at once... just have to fix any .so link errors (I've got a script to pull old libraries from btrfs snapshots). I've even ended up without a working libc more than once! When you can't run any program it sure is useful that you can upgrade everything aside from the kernel without rebooting!
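For the curious, a snapshot-rescue script like the one mentioned is roughly this shape (the snapshot path, script name, and library path are hypothetical placeholders, not the actual script):

```shell
#!/bin/bash
# Hypothetical sketch: copy a library back from an older btrfs snapshot
# after a botched partial upgrade left .so link errors.
set -euo pipefail

SNAPSHOT=/.snapshots/pre-upgrade/snapshot   # read-only snapshot taken before upgrading
LIB=${1:?usage: rescue-lib.sh /usr/lib64/libexample.so.1}

cp -a "$SNAPSHOT$LIB" "$LIB"   # restore the old shared object in place
ldconfig                        # rebuild the dynamic linker cache
```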
I share the same sentiment. I've had the same Arch install running since ~2016 and have been using Arch since about 2013 and the number of times I've needed to chroot from a live image is under 10 and were mostly related to systemd breaking things during an update which is pretty much entirely no longer an issue these days.
Compared to Windows-land where nuking and reinstalling the entire OS is a routine maintenance task, checking arch news to see if there's any manual intervention and running `pacman -Syu` is all I really ever think about.
I think this is a very interesting observation, because my experience has been fairly opposite. Disclaimer, I've grown up with windows.
Yet I've never had to reinstall windows on any of my devices ever. I've never had things behave in unusual or unpredictable ways.
Meanwhile, a highly suggested utility (on reddit, SE/SO, and even a few distro forums) for touchpad gestures borked my gnome setup. (Uninstalling it, as you might have guessed from my story and tone, did diddly squat.)
Just today I manually flushed my dnf cache (or cleared it? Not sure of the terminology). In the past, I had to debug manually because the default timeout for Fedora was causing issues with a few hundred milliseconds of internet latency. That was a fun rabbit hole: "why can't I install an app that's only available via dnf install?" "Oh, because Fedora assumes you have good internet. But don't worry, if you have Ubuntu, that doesn't have these issues!"
...I've never even been made aware what download timeouts windows has. As it should be for a user.
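For reference, the knobs involved in that dnf rabbit hole are documented dnf.conf options; a sketch of the workaround (the values are examples, tune them to your link):

```shell
# Raise dnf's per-connection timeout and lower the minimum transfer rate
# it tolerates before aborting a mirror -- both help on high-latency links.
# 'timeout' and 'minrate' are documented keys in dnf.conf(5).
sudo tee -a /etc/dnf/dnf.conf <<'EOF'
timeout=120
minrate=100
EOF
```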
I could go on and on. My Windows partition goes nearly months between reboots, typically only rebooting if I run out of battery or want to install an update. Linux... doesn't have hibernate yet. Fortunately it doesn't matter! ...Because some odd memory leak (and GPU driver stuff, perhaps?) forces me to shut down every so often. Oh well.
I'm not trying to "challenge" your experience, it's your experience. But mine is completely different so I'll offer it for anyone who might be reading along...
I've been using Linux at work and at home every day for 15 years and I think in that whole time I've only ever had to reinstall the OS due to system issues once.
(I ran an Ubuntu system update on my laptop while on low battery, and it died. The APT database was irrevocably fucked afterwards. I'm not even sure it's fair to blame the OS for this, it was a dumb thing for me to do. I would also not be at all surprised if it's possible to fuck up a Windows installation in a similar manner).
Nowadays I run NixOS and yes that requires quite regular attention. But I've also used Ubuntu, Fedora and Debian extensively and all of them are just completely stable all the time.
(Only exception I can think of: Ubuntu used to have issues with /boot getting full which was a PITA).
You mention the "right distro", but you did not say what you have tried or what you had problems with.
From my experience, some examples: with Gentoo you are much more than a janitor -- you must be everything, all the time; with Red Hat-based distros you can get a major headache with some version upgrades; with Arch (currently using, same install for 7 years) -- update monthly, and I have had very few and minor issues.
Most problems in most distros are solvable with enough knowhow and enough research time. That time investment is not always worth it in a commercial context. It is a fairly known amount of time to reinstall a distro and get back up and running, and an unknown amount of time to fix an exotic Linux problem.
Many seem to be interested in knowing my distro. I’m not interested in throwing shade on a distro in particular, but it is one of the bigger and well known ones.
Which is shockingly bad in its own way. For having a tightly integrated hardware stack, Apple sure has managed to trash their desktop OS. Reminds me a lot of Windows in the xp => vista stage.
My read is they don't really give a shit about it anymore because the revenue comes from mobile/tablets. Same reason Microsoft is comfortable trashing windows... the revenue is coming from O365 & Azure now. The OS is a loss leader to sell those, and it definitely feels like it these days.
Once a company eats from the fruit of the "ads" tree... they tend to degrade into "awful" from the user side, because the user stops being the primary customer - the conflict of interest there is unavoidable.
I tried running various Linux distros on my desktop some years ago and definitely agree on the crap-out experience and having to reinstall. Eventually settled on macOS and it's been okay.
The game changer for me has been Nix. It works on macOS. I have had coworkers use it on Ubuntu. I am soon planning to switch to NixOS.
People complain about the syntax but honestly AI gets you around that. You will still do janitorial work, but you mostly only need to do it once.
I've been using Fedora and OpenSUSE Tumbleweed on my laptop and desktop respectively. Both are going around a year, and I haven't had major issues with them.
Looking at the logs I installed Fedora 35 on this laptop over 4 years ago when I got it and have upgraded through to 43 with no serious issues aside from some mDNS configuration that I had to fix.
Shameless self-promotion, but I 80% vibe-coded a pip package for interfacing with LLMs right from the terminal: ‘pip install lask’. It has helped me a lot, since it works from the terminal regardless of what the graphics drivers are doing.
I had my 80-year-old mom on Linux Mint for 15 years, from about 62 to 78, and she didn't even know it. She tried to show me her "Windows" with exactly one problem 15 years in, and I was presented with the Mint boot screen. The problem at that point was her laptop, not Mint. Grandmas tend to do very simple things, and the OS can just chug along forever without problems.
Okay, people say this. Could you please (and this is not a rhetorical device, it's a sincere question) explain: how do you keep the browser updated without updating the operating system? Or if you are updating the OS, doesn't that change the user interface? And if the user interface is changing, doesn't that confuse your grandmother? I installed Ubuntu for my mom, and after four years Firefox was out of date, and the banking website she used had checks where logging in was only possible if the user agent was recent enough. One can fake that, but I didn't want to. But updating Firefox meant updating Ubuntu, which meant that every single icon and every single menu position changed, and I didn't want to have to teach her where everything was again. How do you avoid this?
I haven't dealt with this for her in a few years, but basically:
Pin all their apps in Favorites and they will persist through updates. Updates don't overwrite desktop shortcuts either (although, as on other OSes, a couple might be added that need to be removed). This might be more difficult in GNOME; I wouldn't know, since I am firmly in the KDE camp.
To stay as up to date as possible, use the mozilla apt repo:
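The repo setup looks roughly like this (check Mozilla's current instructions before trusting the exact key URL or suite name, as these details change over time):

```shell
# Add Mozilla's own apt repository so Firefox updates independently of the OS.
sudo install -d -m 0755 /etc/apt/keyrings
wget -q https://packages.mozilla.org/apt/repo-signing-key.gpg -O- \
  | sudo tee /etc/apt/keyrings/packages.mozilla.org.asc > /dev/null
echo "deb [signed-by=/etc/apt/keyrings/packages.mozilla.org.asc] https://packages.mozilla.org/apt mozilla main" \
  | sudo tee /etc/apt/sources.list.d/mozilla.list
sudo apt-get update && sudo apt-get install -y firefox
```

This keeps the browser current on an otherwise frozen LTS install, which is the point of the grandma setup above.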
Perfect analogy. I'm using Debian for a few months now on my main laptop, and everything is flawed. Seriously, everything.
- Hybrid graphics simply doesn't work. The exception is when it works. Don't even try Wayland with it.
- Graphics card handling is still full of race conditions. It's random whether everything works as intended without manual intervention.
- Switching monitors is pain. Sometimes works, sometimes doesn't. Waking up my laptop with a new monitor plugged in is a gamble.
- Energy efficiency was bad with hybrid graphics, but since I had to turn that off, I haven't even tried to optimize it since.
- It was a pain to make my laptop speakers work. A lot of searching, and applying random fixes until one worked (in reality two fixes together).
- My main Bluetooth headset has a feature to mute itself, or stop the music, when it's not on my head. Guess which is the only device I have that has a problem with this? The funny thing is that it's a random event, again. The sound comes back fully about 10% of the time. Another 10% of the time, the sound comes back in some apps but not in others. The other 80%, I have to reconnect it.
- Don't even talk about printers. It's a gamble, again. Some printers worked at some point in time, some simply don't work, and never will, because nobody cares about them anymore enough.
- Game performance is simply worse than on Windows. First of all, it wasn't trivial to force some games to use my GPU when I had hybrid graphics. The internet is full of outdated information. But even after that, my FPS is consistently worse. I've heard others with the opposite experience. But this tells me, again, that the whole thing is a gamble. Probably it's also a gamble per game.
- When I press the power button to put it to sleep, or initiate a normal shutdown, I need to force shutdown the whole laptop. Sometimes I get a notification that a text editor is preventing shutdown, asking whether I want to force quit it, but it doesn't matter which I click: the "it will be force quit in 60 seconds if I don't select something" is a lie, the whole X session is killed after a few seconds, and the laptop remains powered on, with the lie "the computer will shut down now" in the terminal. This happens even when I don't get a notification that something would prevent power off. Shutdown initiated from the OS menu works, and closing the lid puts it to sleep.
And this is my current laptop. I simply couldn't use my previous one with Linux because of some stupid problem with the video card, which I couldn't solve in months. Even installation was a challenge.
I've used Linux on and off over the past 25 years. It's getting better, but it still has a long way to go. You need some janitorial work with Windows too, especially nowadays, but it's still a way better experience to click "leave me alone" once a month than this constant tinkering and daily annoyance. I want to build things, not fix things that should just work.
Desktop/Laptop Linux is improving pretty fast, but by using an LTS distro like Debian you miss out on a lot of that.
I had to run Ubuntu 22.04 on a laptop for a while and encountered similar monitor switching and bluetooth issues. Eventually I figured out I could get the latest version of most desktop packages from the KDE Neon repos since they were also based on 22.04 at the time.
Running the latest KDE Plasma desktop with the latest mesa and pipewire made a huge difference. Monitor switching now works every time, all the bluetooth features worked, battery life improved, and Firefox stopped crashing when using webgl.
I'm not saying it'll fix all your problems, but most of these problems are being actively worked on, and I think it's worth trying a distro that actually keeps up with the pace of that work.
I recently installed Debian instead of Ubuntu on my laptop. Although I recognize many of your problems as "you need to know the right way to configure that or it's super annoying" -- which sucks but is not impossible to overcome -- I also find that Debian is much more bug-filled as a laptop OS than Ubuntu. I was actually extremely surprised by this; I didn't think Ubuntu was doing much of anything.
That said, I am running Debian Trixie using Wayland / KDE / CUPS / NVIDIA / etc. and do not have any of your problems: my graphics work, my printers work, my Bluetooth works, sleep works. They all required a lot more configuration than the last several versions of Ubuntu required (which shouldn't be the case when there is a better example right next door), but none of the problems are persistent.
Same here. I recently bought one laptop that I researched to make sure it was supported in Linux, and it had a ton of issues the reviewers didn't mention. So I bought a different laptop with Linux shipped from the factory, and it's better, but still has issues.
I think Bluetooth and printers are broken on pretty much every OS (especially on old devices), I certainly didn't have a better experience on Windows, it's maybe even worse.
What? I have been using Linux daily for almost 20 years, and I have typically only installed the OS fresh once per machine. I've been using Fedora as my daily driver for well over a decade. I can't remember having to re-install a distro unless I was switching distros. My current system was installed in 2019, Fedora 30. Over a dozen painless upgrades, the last several of which have had the Steam flatpak installed with no breakages.
Fully open source drivers using AMD video cards. It just works (minus the early x11/wayland debacle, I had to switch back to x11 for a while).
I've observed that most "enthusiasts" are really just brand ambassadors. They've been captured by some proprietary software that doesn't run on Linux, and that's the problem of Linux. The day their set of products runs perfectly on Linux is the day Linux will be ready for them.
I think that if affinity chooses to make it work well on linux that would be a game changer for a lot of people. daVinci resolve works on linux for video so having a proper photo editor/illustrator tool that is not gimp would open up the option for most people to daily drive it. that's really the missing piece.
I mean, yes. That's how people work: They don't care about the OS for itself, the OS is a means to run the software they want to run, and it'll be ready when it runs that software.
(I'm typing this on my Linux desktop right now... but also have a separate Windows PC for running the games I want to run that don't work on Linux yet. When they work, I'll be thrilled to put Linux on that machine or its successor.)
I agree, and this is why I also consider Windows barely working: I had to install 7, then 8, then 10, then 11 -- what's next? It should just turn on and work. Stop changing it around and making me install different random crap to get it working.
That said, tech folk routinely underestimate how much they rely on their own technical skill. Try using Linux for a week without ever opening a terminal. Terminal is a "f this I'm going back to Windows" button for most people.
>but I just use a VM with PCIe passthrough to pass in a gpu and to load up a game for windows
Many games refuse to run in a VM, even if that VM is a Windows one. I bet there is a trick to bypass it, but then you are at risk of being banned, or of not receiving support when needed.
That isn't weird. It's by design. MacOS is only designed to run on Apple hardware, and a VM, even if the host is Apple hardware isn't really Apple hardware.
(cue nitpicking) Can Okular sign a damn pdf with a pfx certificate yet or do I still need a PhD to set that up? MS office has an online version now but it is arguably very ass and Libreoffice is not even worth a mention, using it feels like time travelling twenty years back.
Linux would be the desktop of choice years ago if anything from Adobe or Office actually worked on it, the two things that make the world go round. Valve has done their part to develop Proton, but there is no equal push for things people can't do work without.
I remember when I first started reading HN how disappointed I was to see so many comments shitting on Linux and/or FOSS. I was kind of shocked because this is exactly the group that should be evangelizing this stuff. At the end of the day I realized I’m willing to put up with some inconvenience in exchange for freedom, but most people just aren’t.
The amount of hate spewed at FOSS is astounding really. People are literally giving you shit for free. Chill out.
I'm tired of people saying Steam on Linux just works. It doesn't.
Tried running Worms: instant crash, no error message.
Tried running Among Us: instant crash, had to add cryptic arguments to the command line to get it to run.
Tried running Parkitect: crashes after 5 minutes.
These three games are extremely simple, graphically speaking. They don't use any complicated anti-cheat measure. This shouldn't be complicated, yet it is.
Oh and I'm using Arch (BTW), the exact distro SteamOS is based on.
And of course, as always, those for whom it works will tell you you're doing-it-wrong™.
These games are all rated gold or platinum on protondb, indicating that they work perfectly for most people.
Hard to say what might be going wrong for you without more details. I would guess there's something wrong with your video driver. Maybe you have an nvidia card and the OS has installed the nouveau drivers by default? Installing the nvidia first-party drivers (downloaded from the nvidia web site) will fix a lot of things. This is indeed a sore spot for Linux gaming, though to be fair graphics driver problems are not exactly unheard of on Windows either.
Personally I have a bunch of machines dedicated to gaming in my house (https://lanparty.house) which have proven to be much more stable running Linux than they were with Windows. I think this is because the particular NIC in these machines just has terrible Windows drivers, but decent Linux drivers (and I am netbooting, so network driver stability is pretty critical to the whole system).
AoE2:DE is rated gold even though multiplayer is broken for everyone, and it lags. By now someone has posted a very complex workaround to the MP issue, but it was gold even before that.
BeamNG (before a very recent native Linux beta) was gold despite a serious fps drop and also a memleak to crash any time there's traffic.
> Installing the nvidia first-party drivers (downloaded from the nvidia web site) will fix a lot of things
Interesting. I saw somewhere else that you're using Debian. Is that as opposed to Nouveau, or to the proprietary drivers from the Debian repos?
I'm currently testing daily-driving my desktop with Linux on an NVIDIA GPU, and the Arch wiki explicitly recommends the drivers from their repos. However, Arch is rolling, and the repo drivers are supposedly much more up to date than Debian's. Still, I'll keep your comment in mind if I run into anything.
I am not familiar with Arch, so my advice might be wrong for Arch.
But I have a lot of experience on Debian and Ubuntu trying to use the packages that handle the nvidia driver installation for you. It works OK. But one day on a lark I tried downloading the blob directly from nvidia and installing that way, and I was surprised to find it was quite smooth and thorough, so I've been doing it that way ever since.
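For anyone wanting to try that route, it goes roughly like this on Debian/Ubuntu (the installer file name and version number here are examples, not a recommendation; the installer should be run from a console, outside X/Wayland):

```shell
# Prerequisites: compiler toolchain and headers matching the running kernel,
# so the installer can build its kernel module.
sudo apt-get install -y build-essential linux-headers-"$(uname -r)"

# Installer blob downloaded from nvidia.com -- version shown is just an example.
chmod +x NVIDIA-Linux-x86_64-550.67.run
sudo ./NVIDIA-Linux-x86_64-550.67.run
```

One caveat with this route: the module has to be rebuilt after kernel updates (the distro packages automate that via DKMS), so weigh the smoother install against the extra maintenance.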
> Installing the nvidia first-party drivers (downloaded from the nvidia web site) will fix a lot of things.
Crazy—it used to be that nvidia drivers were by far the least stable parts of an install, and nouveau was a giant leap forward. Good to know their software reputation has improved somewhat
Nouveau has never been good for gaming. Not their fault (they had to reverse engineer everything), but it was only really ever viable for mostly 2D desktops in my experience.
Sure, but nvidia has always been seen as a liability for basic operation of the computer. Their driver quality is notoriously as bad as it gets. Nouveau fixed this.
Everyone says this but it is not my experience at all. Every time I try AMD cards I run into weird problems. The Nvidia drivers are a pain to install and tend to break randomly on kernel updates, but once built properly they always just work for me...
Did you use the proprietary AMD drivers? You need to use the open source drivers. As far as I know these should be the default on all distros, so just click through the OS installer, install Steam, and start gaming. Don't touch the drivers.
In my most recent attempt to use AMD, my problems were:
1. I needed to install a bleeding-edge kernel version in order to get support for the very new AMD card I had purchased, which was a bit of a pain on Debian. (With NVidia, the latest drivers will support the latest hardware on older kernels just fine.)
2. AMD can't support HDMI 2.1 in their open source drivers. Not their fault -- it's a shitty decision by the HDMI forum to ban open source implementations. But I was trying to drive an 8k monitor and for other reasons I had to use HDMI, so this was a deal-breaker for me. (This is actually now solvable using a DP->HDMI dongle, but I didn't discover that solution at the time.)
But every time I've tried to use AMD the problems have been different. This is just the most recent example.
Obviously I'm using the open source drivers, since the entire point of everyone's argument for AMD on Linux is the open source part.
The root problem may just be that I'm deeply familiar with the nvidia linux experience after 25 years of using it whereas the AMD experience is unfamiliar whenever I try it, so I'm more likely to get stuck on basic issues.
This has been my experience too. When I upgraded my GPU, I wanted to switch to Linux full time, so I went with AMD because everywhere people kept saying NVIDIA GPUs had a lot of issues -- but it turned out to be the opposite. With my old card, I just have to install the proprietary NVIDIA driver: zero issues.
I think people are still clinging to old "wisdom" that hasn't been true for years, like "updating breaks Arch". Go figure.
I imagine the people saying “it just works” are saying it because it does, at least for them.
SteamOS is based on Arch, but customized and aimed at specific hardware configurations. It’d be interesting to know what hardware you’re using and if any of your components are not well supported.
FWIW, I’ve used Steam on Linux (mostly PopOS until this year, then Bazzite) for years and years without many problems. ISTR having to do something to make Quake III work a few years ago, but it ran fine after and I’ve recently reinstalled it and didn’t have to fuss with anything.
Granted, I don’t run a huge variety of games, but I’ve finished several or played for many hours without crashes, etc.
I use OpenSUSE Tumbleweed, and I've never had trouble running a game that's rated gold or above. I've even gotten an Easy AntiCheat game to work correctly.
I've been gaming on Linux exclusively for about 8 years now and have had very few issues running Windows games. Sometimes the Windows version, run through Proton, runs better than the native port. I don't tend to play AAA games right after launch day, though, so it could be that my taste in games is shaping my experience.
I just bought a second Dell workstation (I admit I used to hate those) and can't wait to install SteamOS when it's released to the public. I don't care about AAA gaming, but the integrated card should be able to handle most games from ten years ago.
> And of course, as always, those for which it works will tell you you're doing-it-wrong™ .
This sounds like you are rejecting help because you have made up your mind in frustration already.
Because you are doing it wrong. If you want an OS that just works, you should use Ubuntu or Fedora. Why is SteamOS based on Arch then? Because Valve wants to tweak things in it and tinker with it themselves to get it how they like.
You don't.
So use an OS that requires less from you and that tries to just work out of the box, not one that is notorious for being something you break and tinker with constantly (Arch).
I've been using Arch for 15 years, it's not like I'm suddenly discovering the concept of the distro.
But when something crashes with no error message whatsoever, it makes it a tiny bit harder to troubleshoot.
Especially when so many people answer, just like I had predicted, "works on my machine". Which would only be a gotcha if I had implied it worked on no machine whatsoever. Which I didn't.
I'll tinker some more and I'll be sure to post my findings if I get these games to work.
Well then, look at the logs? Sure, it's not as in-your-face, but Steam/Proton does log, and I'm fairly sure that setting at most one launch parameter and looking at the game logs and system logs will show you the exact problem. Given that these games run just fine for a lot of people, the fix is probably trivial.
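For what it's worth, the usual "one launch parameter" here is Proton's logging switch, set per-game in Steam (the log location shown is Proton's documented default):

```shell
# In Steam: right-click the game -> Properties -> Launch Options, enter:
#   PROTON_LOG=1 %command%
# Proton then writes a log to $HOME/steam-<appid>.log for that game.
# Follow it from a terminal while the game starts:
tail -f ~/steam-*.log
```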
I am using Arch and all the games I played on Steam (at least 20, not the ones mentioned above) worked perfectly.
One thing that I do though is get most games at least one year after release, when probably many issues are fixed. I had tons of issues many years ago, with buggy games bought immediately after release (on Windows back then), so now I changed strategy...
I don't have your other games, but I do have a few Worms games and they worked out of the box for me with GE Proton on NixOS.
I'm not saying "you're doing it wrong", because obviously if you're having trouble then that is, if nothing else, bad UX design, but I actually am kind of curious as to what you're doing different than me. I have an extremely vanilla NixOS setup that boots into GameScope + Tenfoot and I drive everything with a gamepad and it works about as easily as a console does for me.
If anything this is the challenge with PC as a platform being so varied, any random software/hardware/config variation could bring a whole load of quirks.
That probably includes anything that isn't a PC in a time capsule from when the game originally released -- so any OS/driver changes since then -- and I don't think we've reached the point where we can emulate specific hardware models to plug into a VM. One of the reasons the geforce/radeon drivers (e.g. the geforce "game ready" branding) are so big is that they carry a whole catalogue of quirk workarounds, either for when a game's renderer is coded badly or to make it a better fit to the hardware, which lets them advertise +15% performance in a new version. Part of the work for wine/proton/dxvk is replicating that, instead of doing a blunt translation strictly to the standards.
Yeah, I think Linus himself pointed out that the desktop is the hardest platform to support because it's unbelievably diverse and varied.
With regards to Linux I generally just focus on hardware from brands that have historically had good Linux support, but that's just a rule of thumb, certainly not perfect.
As an Arch (btw) user myself, yes, you're doing something wrong.
Arch won't hold your hand to ensure everything required is installed, because many dependencies are either optional (you have to read the pacman logs) or just hidden (because they're in the game itself). Valve actually does a great job providing a "works everywhere" runtime, as their games are distributed in a flatpak-like fashion, but things can slip through the cracks.
The compositor can have an effect. The desktop settings. The GPU drivers. What's installed as far as e.g. fonts go. RAM setup, with or without swap.
As for SteamOS, the real difference is that despite being Arch-based, you're not installing Arch, but SteamOS: a pre-packaged, pre-configured Arch Linux with a set of opinionated software and pre-made config files, targeting a small set of (1) devices. It's not really Arch you're installing, but a full-blown distro that happens to be Arch-based.
That said, I understand your frustration as I've hit this many times on a laptop with dual graphics. Getting PRIME to run with the very first drivers that supported it was fun. Oh and I'm likely to hit the same walls as you since I just switched my gaming rig to Arch. GLHF!
Arch is nice if you want to tinker. Based on your reasoning, I wouldn't recommend it.
But if you still want Arch-based, I would recommend EndeavourOS; for an even simpler/better distro, Bazzite.
You are definitely doing it wrong. I rarely have issues, and when I do, I just switch compatibility tools. I play multiple indie games and Marvel Rivals, and I played lots of Among Us on my machine in 2020. Running Pop!_OS.
Well, but many games just work. Actually, I try starting the games without any tweaks before heading over to protondb.com, and often they run just fine.
But it is also true that many games still require minor tweaks. For example, just last week, I found out that I had to enable hardware acceleration for the webview within Steam, just to be able to log in to Halo Infinite. It was just clicking a checkbox, but otherwise, the game would not have been playable.
But I am always surprised when you find out you have those kinds of issues with Windows as well.
Yeah, same here. I sometimes google "wine WoW issues" and every time there are recent threads, so I don't even try. Linux has a long way to go to become a gaming platform.
The games don't fail to run because they are so "graphically powerful" they fail to run because you chose to set up your system without the necessary runtime.
There are people who make stripped-down versions of windows. Is it fair to say that because these releases exist that windows isn't "just works" either?
I switched my desktop from macOS (10+ years) to Ubuntu 25 last year and I'm not going back. The latest release includes a Gnome update which fixed some remaining annoyances with high res monitors.
I'd say it pretty much "just works" except less popular apps are a bit more work to install. On occasion you have to compile apps from source, but it's usually relatively straightforward and on the upside you get the latest version :)
For anyone who is a developer professionally I'd say the pros outweigh the cons at this point for your work machine.
> The latest release includes a Gnome update which fixed some remaining annoyances with high res monitors.
Interesting, I've had to switch off from Gnome after the new release changed the choices for HiDPI fractional scaling. Now, for my display, they only support "perfect vision" and "legally blind" scaling options.
By default Gnome doesn’t let you choose any fractional scaling in the UI because it has some remaining TODOs on that front. So from the UI you choose 100% or 200%. But the code is there and it works if you just open a terminal and type a command to enable this “experimental” feature.
Now whether or not this feature should have remained experimental is a different debate. I personally find that similar to the fact that Gmail has labeled itself beta for many years.
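For reference, the "experimental" toggle in question is (assuming a GNOME/Mutter Wayland session) a single gsettings flag:

```shell
# Expose fractional scaling options in GNOME's display settings (Wayland)
gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"
# Log out and back in; 125%/150%/175% then appear under Settings -> Displays
```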
I've got the feature turned on. But Gnome 49 only supports fractional scaling ratios that divide your display into a whole, integer number of pixels. And they only calculate candidate ratios by dividing your resolution up to a denominator of 4.
So on my Framework 13, I no longer have the 150% option. I can pick 133%, double, or triple. 160% would be great, but that requires a denominator of 5, which Gnome doesn't evaluate. And you can't define your own values in monitors.xml anymore.
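To make the constraint concrete, here's a rough sketch of the integer-pixel rule as described above -- this is not GNOME's actual selection code (which also bounds the usable range), and the 2256x1504 panel resolution is my assumption about this Framework 13:

```python
from fractions import Fraction

def candidate_scales(w, h, max_den=4, max_scale=3):
    """Sketch: integer scales are always offered; a fractional scale n/d
    (denominator <= max_den) is offered only if it divides WxH into a
    whole, integer number of logical pixels on both axes."""
    found = {Fraction(n) for n in range(1, max_scale + 1)}  # 100%, 200%, 300%
    for d in range(2, max_den + 1):
        for n in range(d + 1, max_scale * d):
            s = Fraction(n, d)
            # logical size = physical / scale = (w*d)/n; must be integral
            if (w * d) % n == 0 and (h * d) % n == 0:
                found.add(s)
    return sorted(found)

# 150% (3/2) fails here: 1504 * 2 / 3 isn't an integer.
# 160% (8/5) is never even evaluated: its denominator is 5 > max_den.
print([f"{float(s) * 100:g}%" for s in candidate_scales(2256, 1504)])
```

Under this model 133% (4/3) survives because both 2256 and 1504 divide cleanly by 4/3, which matches the options described above.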
I switched in 1999. I've never really had any problems in all that time.
Although it was to BSDi then, and then FreeBSD and then OpenBSD for 5 years or so. I can't remember why I switched to Debian but I've been there ever since.
But what about laptops? I don't use desktop machines anymore (last time was in 2012). Apple laptops are top notch. I use Ubuntu as a headless VM for software development, though.
Best you can do is build a high end desktop at home and access it remotely with any laptop you desire. The laptop performance then becomes mostly irrelevant (even the OS is less relevant) and by using modern game streaming protocols you can actually get great image quality, low latency and 60+ fps. Though, optimizing it for low bandwidth is still a chore.
Have that desktop be reachable with SSH for all your CLI and sys admin needs, use sunshine/moonlight for the remote streaming and tailscale for securing and making sunshine globally available.
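A minimal sketch of that setup -- package/service names and the Moonlight CLI invocation are assumptions that vary by distro and client:

```shell
# On the desktop (host):
sudo tailscale up                        # join your tailnet
systemctl --user enable --now sunshine   # start the Sunshine streaming host

# On the laptop (client), over Tailscale:
ssh user@desktop-tailnet-name                       # CLI / sysadmin work
moonlight stream desktop-tailnet-name "Desktop"     # full desktop stream
```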
Bandwidth is not really a problem if you live in decent city. The problem is latency and data usage. 1 Hour streaming consumes GBs of data, that's a big problem if you use cellular network.
Latency is another problem: a recent LTT video showed that even as little as 5-10 ms of added latency can negatively impact your gaming performance, even if you don't notice it. You begin to notice at around 20 ms.
How is bandwidth not a problem if data usage on a cellular network is? You can dramatically lower your data usage by constraining bandwidth to say, ~2mbps, but doing so while keeping a decent image requires many sacrifices, like lowering resolution or using a software encoder that can squeeze out as much quality as possible out of 2mbps at a penalty for your latencies (won't matter much since you are already incurring latencies from your internet connection). You may also switch to a wi-fi hotspot once that's an option, and then even lift the bandwidth restrictions.
Regarding latency, this solution is meant as a way to use your notebook for any task, not just gaming. You can still play and enjoy most fps games with a mouse even at 20ms of extra latency, and you can tolerate much more when playing games with a gamepad. If you need to perform your best on a competitive match of cs2 you obviously should be on a wired connection, in front of a nice desktop pc (the very same you were using to stream to your notebook perhaps) and with a nice high refresh rate monitor. Notebooks are usually garbage for that anyways.
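The back-of-envelope arithmetic behind the data-usage point is simple (the bitrates here are illustrative, not measurements):

```python
def gb_per_hour(mbps: float) -> float:
    # megabits/s -> gigabytes/hour: /8 bits per byte, *3600 s, /1000 MB per GB
    return mbps / 8 * 3600 / 1000

# A constrained 2 Mbps stream vs. typical higher-quality game-stream bitrates
for rate in (2, 20, 50):
    print(f"{rate:>3} Mbps ~= {gb_per_hour(rate):.1f} GB/hour")
```

So a 2 Mbps cap is about 0.9 GB per hour, while an unconstrained 50 Mbps stream burns through 22.5 GB per hour, which is why capping bandwidth matters on cellular even though latency is unaffected.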
I don't have an x86 laptop at the moment so sticking with Macbook for now. My assumption is Mac laptops still are far superior given M-series chips and OS that are tuned for battery efficiency. Would love to find out this is no longer the case.
I have the HP ZBook Ultra G1a: AMD 395+, 128GB RAM, 4TB 2280 SSD. Works great with Ubuntu 24.04 and the OEM kernel. Plays Steam games, runs OpenCL AI models. Only nit is that it's very picky about which USB PD chargers it will charge on at all; UGreen has a 140W that works.
I've had Linux running on a variety of laptops since the noughties. I've had no more issues than with Windows. ndiswrapper was a bit shit but did work back in the day.
I haven't, because I buy hardware that's designed to work with Linux. But if you buy hardware that doesn't have Linux drivers, it just won't work. That might mean Wifi not working, it might mean a fingerprint reader not working, etc.
My HP ZBooks have been a dream. My current Studio G10 with an i9-13900 and 4070M has largely Just Worked™ with recent versions of both Fedora and Ubuntu.
HP releases firmware updates on LVFS for both the ZBook and its companion Thunderbolt 4 dock(!). They also got it Ubuntu certified, like most of their business laptops.
I did some investigation into this the other day. The short answer seems to be that if you like MacBooks, you aren't willing to accept a downgrade along any axis, and you really want to use Linux, your best bet today is an M2 machine. But you'll still be sacrificing a few hours of battery life, Touch ID support (likely unfixable), and a handful of hardware support edge cases. Apple made M3s and M4s harder to support, so Linux is still playing catch-up on getting those usable.
Beyond that, Lunar Lake chips are evidently really really good. The Dell XPS line in particular shows a lot of promise for becoming a strict upgrade or sidegrade to the M2 line within a few years, assuming the haptic touchpad works as well as claimed. In the meantime, I'm sure the XPS is still great if you can live with some compromises, and it even has official Linux support.
What I mean is: on a normal laptop, when you scroll with two fingers on the scroll wheel, the distance you scroll is nearly a continuous function of how much you move your fingers; that is, if you only move your fingers a tiny bit, you will only scroll a few pixels or just one.
Most VM software (at least all of it that I've tried) doesn't properly emulate this. Instead, after you've moved your fingers some distance, it's translated to one discrete "tick" of a mouse scroll wheel, which causes the document to scroll a few lines.
The VM software I use is UTM, which is a frontend to QEMU or Apple Virtualization framework depending on which setting you pick when setting up the VM.
> Linux is still playing catch-up on getting those usable
This is an understatement. It is completely impossible to even attempt to install Linux at all on an M3 or M4, and AFAIK there have been no public reports of any progress or anyone working on it. (Maybe there are people working on it, I don’t know).
In his talk a few days ago, one of the main Asahi developers (Sven) shared that there is someone working on M3 support. There are screenshots of an M3 machine running Linux and playing DOOM at around 31:34 here: https://media.ccc.de/v/39c3-asahi-linux-porting-linux-to-app...
Sounds like the GPU architecture changed significantly with M3. With M4 and M5, the technique for efficiently reverse-engineering drivers using a hypervisor no longer works.
Not working with Linux is a function of Apple, not Linux. There is a crew who have wasted the last half decade trying to make Asahi Linux, a distro to run on ARM macbooks. The result is after all that time, getting an almost reasonably working OS on old hardware, Apple released the M4 and crippled the whole effort. There's been a lot of drama around the core team who have tried to cast blame, but it's clear they are frustrated by the fact that the OEM would rather Asahi didn't exist.
I can't personally consider a laptop which can't run linux "top notch." But I gave up on macbooks around 10 years ago. You can call me biased.
I just put Asahi on an M2 Air and it works so incredibly well that I was thinking this might finally be the year linux takes the desktop .. I wasn't aware of the drama w/Apple but I imagine M2 hardware will become valuable and sought after over M3+ just for the ability to run Asahi
Again, I've had two 4k monitors on Linux for about ten years, and it has worked well the whole time. Back then I used "gnome tweak" to increase the size of widgets etc. Nowadays its built into mate, cinnamon, etc.
Did you start using Linux on the Mac hardware or on PC hardware? I have a late era Intel Macbook and was considering switching it to Ubuntu or Debian since it is getting kinda slow.
Not the OP, but I have a 2015 Macbook Pro and a desktop PC both running Linux. I love Fedora, so that's on the desktop, but I followed online recommendations to put Mint on the Macbook and it seems to run very well. However, I did need to install mbpfan (https://github.com/linux-on-mac/mbpfan) to get more sane power options and this package (https://github.com/patjak/facetimehd) to get the camera working. It runs better than Mac OS, but you'll need to really tweak some power settings to get it to the efficiency of the older Mac versions.
I switched to a new x86 machine. Running Linux on Mac just made things unnecessarily complicated and hurt performance. Im still open to using docker on Mac to run Linux containers but once you want a GUI life was simpler when I switched off.
I've been using Linux on all PC's for a long time.
Experience is slowly getting better. There is nothing I haven't been able to get to work, but with tricks or adjustments.
I think the "best bonus" is using LLMs in deep-research mode to wade through all the blog posts, Reddit threads, etc. to get something to work by discovering the aforementioned tricks. Before, you had to do that yourself, and it sucked. Now I get 3 good ideas from Claude, ranked by how likely each is to work, so 99% of games I get running in 5 minutes with a shell command or two. Lutris is also pretty good.
Omarchy on my laptop has finally made computers fun for me again, it's so great and nostalgic. Happy to be back after my brief work-mandated adventure into MacOS.
"the real problem is a feeling that my computer isn't mine, that I am somehow renting this thing I put together with my own two hands from an AI corporation in Redmond."
I've installed Windows on all the PCs I've built for home and work over the last 20 years or so, until my latest in October. It was the ads in the lock screen that pushed me over the edge. Why should I pay for a license for that? Double-dipping fools. Am happily running Bazzite now.
I'm not against Wayland, but I think Wayland is currently not good for the Linux ecosystem. I've had lots of friends try Linux, and they've had issues with Discord global keyboard shortcuts not working, and window positions not restoring at application start, and lots of other small issues, which add up in the end. But once they switched to X11, they've all been very happy.
Yup. I fully understand that X11 is a shitshow under the hood, but it works and Wayland frequently does not work. Screen recording, window positions, various multi-monitor and calibration issues, ...
On my laptop I use to write blog posts, that never ever gets plugged into a second screen? Sure, Wayland's great. On a computer that I expect normal people to be able to use without dumb problems? Hell no!
Comparing X11 and Wayland isn't even correct, because for a functional desktop you need XWayland anyway. X11 never went away; we piled more code on top, and now we have an eternal transitional period and two ways of doing things.
I think Wayland is good for more technical users. Going from i3 to sway or bspwm to river feels like essentially nothing has changed. On the other hand, Gnome X11 to Wayland might be a bigger shock.
Unfortunately, Wayland inherently can't be like Pipewire, which instantly solved basically 90% of audio issues on Linux through its compatibility with Pulseaudio, while having (in my experience) zero drawbacks. If someone could make the equivalent of Pipewire for X11, that'd be nice. Probably far-fetched though.
It can absolutely be like that. Global keyboard shortcuts not working is a deliberate design choice in Wayland (as is non-foreground apps not having access to the clipboard).
"window positions not restoring at application start"
Well you see, you are actually just silly for wanting this or asking for this, because it's actually just a security flaw...or something. I will not elaborate further.
--geometry is an exploit that will end in your financial ruin. Spend your weekend figuring out which tiling manager and dbus commands will come close to approximating a replacement before giving up and realizing you can manually move windows for the rest of your life. Two plus two is five.
hyprland is a fun spectacle, but takes insane effort to make remotely livable. Also any apparent shortcut (dotfiles) will do nasty damage to your install. Anyone hypr-curious should sandbox in an install they don't mind wiping.
I’m really curious about your experience, what distro you used hyprland on, what dotfiles did damage to your install etc.
I just installed hyprland yesterday and outside of having to switch back to i3 once to install what they had set for a terminal in their default config(kitty), I haven’t had to leave again.
Asahi and hyde. "Nasty damage" isn't irreparable, but it would be significant effort to enumerate every small touch that affects defaults from other DEs and restore them. There is no "restore all touched configs to default" afaik. Since my asahi install was a lightly used toy anyway I just reinstalled. My next attempt will be with a VM that I make image backups of.
Curious: do enterprises using Windows suffer through all the system-level ads and nagware? Or do they get a version that lets their employees actually focus on work instead of learning the many reasons they should consider switching back to Edge?
It’s all turned on by default even in Windows 11 Enterprise. You can turn everything off via AD Group Policy or your MDM but you have to go through the labyrinth of Windows policies and find them all. Thankfully you only have to do it once and then push it to all of your devices.
No nagware but, at least on the machines of my colleagues, an even worse enemy: Microsoft Defender with all the checkboxes ticked. Grinds the machine to an absolute halt for any development work - sometimes the responsible security department has mercy and gives exceptions for certain folders/processes, sometimes not.
From my tests, Defender has minimal impact on performance even when doing a full scan, except for making some I/O slower when you're e.g. unpacking new files -- but NTFS is plenty slow by itself there.
Enterprise likes to layer multiple invasive security products though that'll do a lot worse than defender
My work machine is grossly slow due to all the various security software.
Loading Teams can take minutes. I'm often late to meetings waiting for the damn thing to load.
Feels like early-90s computing, as if Moore's Law were an excuse for bad coding practices and pushing newer hardware so that "shit you don't care about but is 'part of the system'" can do more monitoring and have more control of 'your' computer.
You _can_ curate the Enterprise edition a lot more with Group Policy/Intune and remove all that stuff, but my experience has been that most corporate IT departments don't care or don't know how to do it, and MS will just randomly enable new things without asking, same as the home editions, so you have to keep an eye on it and go disable them.
HDR still doesn't really work on Linux w/ nVidia GPUs.
1. 10bpp color depth is not supported on RGB monitors, which are the majority of LCD displays on the market. Concretely, ARGB2101010 and XRGB2101010 modes are not supported by current nVidia Linux drivers - the drivers only offer ABGR2101010 and XBGR2101010 (See: https://github.com/NVIDIA/open-gpu-kernel-modules/blob/main/...).
2. Common browsers like Chrome and Firefox have no real support for HDR video playback on nVidia Linux drivers. The "HDR" option appears on YouTube, but no HDR color can be displayed with an nVidia GPU.
Also, video backgrounds in Google Meet on Chrome are broken with nVidia GPUs and Wayland. Ironically it works on Firefox. This has been broken for a few years and no fix is in sight.
The "HDR" toggle you get on Plasma or Mutter is hiding a ton of problems behind the scenes. If you only have 8bpp, even if you can find an app that somehow displays HDR colors on nVidia/Wayland - you'll see artifacts on color gradients.
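The RGB-vs-BGR distinction above is just the channel order inside the 32-bit word. A quick sketch of the two layouts, following the drm_fourcc naming (XRGB2101010 puts red in the high color bits, XBGR2101010 puts blue there):

```python
def pack_xrgb2101010(r: int, g: int, b: int) -> int:
    # DRM_FORMAT_XRGB2101010: bits [31:30] unused, R in 29:20, G in 19:10, B in 9:0
    assert all(0 <= c <= 1023 for c in (r, g, b))
    return (r << 20) | (g << 10) | b

def pack_xbgr2101010(r: int, g: int, b: int) -> int:
    # DRM_FORMAT_XBGR2101010: bits [31:30] unused, B in 29:20, G in 19:10, R in 9:0
    assert all(0 <= c <= 1023 for c in (r, g, b))
    return (b << 20) | (g << 10) | r

# Pure red at 10 bits per channel: same color, different word
print(hex(pack_xrgb2101010(1023, 0, 0)))  # 0x3ff00000
print(hex(pack_xbgr2101010(1023, 0, 0)))  # 0x3ff
```

The complaint in this thread is that the nVidia driver exposes only the BGR-ordered variants, while many LCD scanout pipelines want the RGB-ordered ones.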
I have Interstellar on 4K UltraHD Blu-ray that features HDR on the cover, Sony 4K Blu-ray player (UBP-X700) and a LG G4 OLED television. I also have an AVR (Denon AVR-S760H 7.2 Ch) connecting both the Blu-ray and a PC running Linux with a RTX 3060 12GB graphic card to the television. I've been meaning to compare HDR on Linux with the Blu-ray. I guess now better than never. I'll reply back to my post after I am done.
Try it with the different monitors you have. The current nVidia Linux drivers only have BGR output for 10bpp, which works on TVs and OLEDs but not on most LCD monitors.
My monitors (InnoCN 27M2V and Cooler Master GP27U) require RGB input, which means it's limited to 8bpp even with HDR enabled on Wayland. There's another commentator below who uses a Dell monitor and manages to get BGR input working and full HDR in nVidia/Linux.
Television HDR mode is set to FILMMAKER,
OLED brightness 100%,
Energy Saving Mode is off.
Connected to AVR with HDMI cable that says 8K.
PC has Manjaro Linux with RTX 3060 12GB
Graphic card driver: Nvidia 580.119.02
KDE Plasma Version 6.5.4
KDE Frameworks Version: 6.21.0
Qt Version: 6.10.1
Kernel Version 6.12.63-1-MANJARO
Graphics Platform: Wayland
Display Configuration
High Dynamic Range: Enable HDR is checked
There is a button for brightness calibration that I used for adjustment.
Color accuracy: Prefer color accuracy
sRGB color intensity: This seems to do nothing (even after apply). I've set it to 0%.
Brightness: 100%
TV is reporting HDR signal.
AVR is reporting...
Resolution: 4KA VRR
HDR: HDR10
Color Space RGB /BT.2020
Pixel Depth: 10bits
FRL Rate 24Gbps
I compared Interstellar at 19s into the YouTube video, in three different ways on Linux, and at 2:07:26 on the Blu-ray.
For Firefox 146.0.1 by default there is no HDR option on Youtube. 4K video clearly doesn't have HDR. I enabled HDR in firefox by going to about:config and setting the following to true: gfx.wayland.hdr, gfx.wayland.hdr.force-enabled, gfx.webrender.compositor.force-enabled.
Colors looked completely washed out.
For Chromium 143.0.7499.169 HDR enabled by default. This looks like HDR.
I downloaded the HDR video from Youtube and played it using MPV v0.40.0-dirty with settings --vo=gpu-next --gpu-api=vulkan --gpu-context=waylandvk. Without these settings the video seems a little too bright like the Chromium playback. This was the best playback of the three on Linux.
On the Blu-ray the HDR is Dolby Vision according to both the TV and the AVR. The AVR is reporting...
Resolution: 4k24
HDR: Dolby Vision
Color Space: RGB
Pixel Depth 8bits
FRL Rate: no info
...I looked into this and apparently Dolby Vision uses RGB tunneling for its high-bit-depth (12-bit) YCbCr 4:2:2 data.
The Blu-ray looks like it has the same brightness range but the color of the explosion (2:07:26) seems richer compared to the best playback on Linux (19s).
I would say the colors over all look better on the Blu-ray.
I might be able to calibrate it better if the sRGB color setting worked in the display configuration. Also I think my brightness setting is too high compared to the Blu-ray. I'll play around with it more once the sRGB color setting is fixed.
*Edit: Sorry Hacker News has completely changed the format of my text.
I don't think the Interstellar Blu-ray has Dolby Vision (or Dolby Atmos), just regular HDR10. If the TV/AVR says it's Dolby Vision something in your setup might be doing some kind of upconversion.
Funny how it went from "just get an Nvidia card for Linux" and "oh my god, what did I do to deserve fglrx?" to "just get an AMD card" and "it's Nvidia, what did you expect?"
They're also selling $3000 nVidia AI workstations that exclusively run Linux. But what if you want to watch an HDR video on one? No. What if you want to use Google Meet on Chrome/Wayland? It's broken.
I don't think this is true. I can go into my display settings in kde plasma and enable HDR and configure the brightness. I have a nvidia blackwell card.
You can enable, yes. But (assuming you're on an LCD display and not an OLED), you're likely still on XRGB8888 - i.e. 8-bit per channel. Check `drm_info`.
Do it once on "HDR" on Linux, and then on Windows. The "HDR" in nVidia/Linux is fake.
The brightness you see on Plasma or Mutter is indeed related to the HDR support in the driver. But - it's not really useful for the most common HDR tasks at the moment.
Your Display Configuration
Both monitors are outputting 10-bit color using the ABGR2101010 pixel format.
| Monitor | Connector | Format | Color Depth | HDR | Colorspace |
|------------------------|-----------|-------------|-------------|--------------|------------|
| Dell U2725QE (XXXXXXX) | HDMI-A-1 | ABGR2101010 | 10-bit | Enabled (PQ) | BT2020_RGB |
| Dell U2725QE (XXXXXXX) | HDMI-A-2 | ABGR2101010 | 10-bit | Disabled | Default |
* Changed the serial numbers to XXXXXXX
I am on Wayland and outputting via HDMI 2.1 if that helps.
EDIT: Claude explained how it determined this with drm_info, and manually verified it:
> Planes 0 and 3 are the primary planes (type=1) for CRTCs 62 and 81 respectively - these are what actually display your desktop content. The Format: field shows the pixel format of the currently attached framebuffer.
EDIT: Also note that I am slowbanned on this site, so may not be able to respond for a bit.
EDIT: You should try connecting with HDMI 2.1 (you will need a 8k HDMI cable or it will fall back to older standards instead of FRL).
EDIT: HDR on YouTube appears to work for me. YouTube correctly identifies HDR on only 1 of my monitors, and I can see a big difference in the flames between them on this scene: https://www.youtube.com/watch?v=WjJWvAhNq34
I don't have a Dell U2725QE, but on InnoCN 27M2V and Cooler Master GP27U there's no ABGR2101010 support. These monitors would only work with ARGB2101010 or XRGB2101010 which nVidia drivers do not provide.
HDR playback in chrome on KDE works as expected from what I can tell. For GNOME 49.2 it does not, it doesn't get the luminance that it should at this time. 49.3 may fix this.
I don’t think your problem is RGB instead of BGR. That’s just the compositor’s internal working format, and your monitor never sees it (it includes an alpha channel). Have you tried KDE Plasma? It sounds like KWin uses 10-bit planes by default when available. Maybe Ubuntu’s compositor (Mutter?) doesn’t support 30-bit color, or needs to be configured? Or maybe you need the nvidia driver >= 580.94.11 for VK_EXT_hdr_metadata (https://www.phoronix.com/news/NVIDIA-580.94.11-Linux-Driver)
It's not obvious how to interpret the output. I pasted it into chatgpt and it thinks I am using "Format: ABGR2101010" for both monitors (only 1 has HDR on) so I don't trust it.
Switched in, ooh, I dunno, '98 or '99. Quality is about where it was then, relatively speaking. Sure, things have improved - mainly just systemd - and we got ACPI and, later, power-management stuff for laptops.
Prior to that, Windows was better on laptops thanks to proprietary drivers and working ACPI. But it was pretty poor in terms of reliability, and the main problem was that the included software was incredibly bare-bones, combined with how awful the experience of finding and installing software was (especially if you hadn't got an unlimited credit card to pay for "big professional solutions").
Every time the year of the Linux desktop arrives, I'm baffled, since not much has changed on this end.
This is a strange statement for me, because I'd say that since '99 almost everything has changed. Maybe your definition of quality is a bit different than mine.
I tried to use Linux back in high school. I had a Pentium 4 computer, which was pretty fast for its time. However, I had a dial-up Windows soft modem. You remember the driver situation. I had to boot into Windows to check my email.
Also, I was basically a child and had no idea what I was doing (I still don't, but that's beside the point). Things have definitely gotten better.
I'm sorry, but no. I ran Slackware 96, Red Hat 4.2, Mandrake 5.0, a bunch of Ubuntus from 12.04 onward, and Fedora now. It is absolutely, qualitatively different now than it was at the turn of the century.
In the Red Hat 4.2 days, it was something that I was able to use because I was a giant nerd, but I'd never ever ever have recommended it to a normal person. By Ubuntu 12.04, 15 years later, it was good enough that I'd recommend it to someone who didn't do any gaming and didn't need to use any of the desktop apps that were still then semi-common. In 2026, it's fine for just about anyone unless you are playing particular (albeit very popular) games.
> Every time the year of the Linux desktop arrives, I'm baffled, since not much has changed on this end.
It's Critic's Disease: When a band moves to a major label, they "suddenly" put out their critically acclaimed masterpiece, when before nobody would review a thing they did and mocked their fanatic fans.
"Now, they've matured."
Let them have it, though. People need to rationalize their past hostility to the right thing in some way in order to progress. If you wait for people to admit they were ever wrong, you'll die waiting. The situation where they insisted on staying no matter what - because they weren't stuck-up nerds who care about stupid stuff that no one cares about - became completely intolerable. They were finally humiliated enough to move.
They'll end up moving to weird semi-commercial distributions that market specifically to them, too, and ridicule people who criticize those distributions for being stuck-up nerds who care about stupid stuff no one cares about. As long as it doesn't break Debian, I'm cool.
I made the move about a month ago to bazzite on my desktop with an nvidia graphics card. I still have my windows drive for when I need it but that's pretty rare. Bazzite isn't perfect but we've reached the point where the rough edges are less painful than the self sabotage microsoft has been inflicting on their users in recent versions of windows.
I tried bazzite but ended up on cachyos. The whole layered / immutable thing got a bit annoying. I'd rather just run snapshots and manage my packages more traditionally
I love the layered thing except for the rough edges. Unfortunately the rough edges for me are that Linux containerization and permissions are completely idiotic.
In Fedora Atomic it should be dead simple to set up a system account with access to specific USB devices via a group, and to attach a volume that a non-root user inside the container can easily write to.
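For the USB-access-via-group piece, the usual mechanism is a udev rule; a hedged sketch (the vendor/product IDs and group name here are made up - substitute your device's IDs from `lsusb`):

```
# /etc/udev/rules.d/99-my-usb-device.rules  (hypothetical IDs and group)
SUBSYSTEM=="usb", ATTR{idVendor}=="1a2b", ATTR{idProduct}=="3c4d", GROUP="usbdev", MODE="0660"
```

After `udevadm control --reload` and re-plugging the device, any account in the `usbdev` group - including one passed into a container with `--group-add` - should be able to open it.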
I think we've reached a point where Windows is about as rough as Linux. But the problem is still that people are familiar with Windows and have learned how to deal with the roughness; not so on Linux. And so long as Windows owns the business and education sectors, it will always have the benefit of that familiarity.
Been working and playing on Linux for years, from 2D indie e.g. Baba is You to AAA e.g. Elden Ring, BG3, Expedition 33 to AAA VR e.g. Half-life: Alyx to VR indie e.g. Cubism and my experience has just been great.
It's MY system and I do whatever the heck I want, from playing boring stuff to weird prototyping. I get no - like, literally 0 - anti-features. I'm not "scared" that an update will limit my agency. I'm just zen, and that is priceless.
Also, quite importantly, it works wonderfully with all my other devices and peripherals. I switch between Bluetooth headsets easily, I switch monitors, video projectors, XR devices, CV camera inputs, I share files with KDE Connect, I receive SMS notifications from my (deGoogled) Android phone and reply from my desktop, I get a notification when my Steam Deck is nearly out of battery, etc. ALL my devices play nicely with each other.
So yes, Linux is good now. It's been for a while but it's been even better for the last few years.
The thing that gets me about Windows is: when's the last time Microsoft added a major new feature to Windows I actually gave a shit about? A feature that actually felt useful to me as a consumer?
Windows 7 was nice, but since then, I struggle to think of anything. Since 7, it feels like it's just been upsell after upsell, features that are about Microsoft selling me on something or annoying the crap out of me rather than providing actual utility.
Recently switched to Linux Mint from Windows and it has not only been good. It has been cathartic. I enjoy computers again! I am self-hosting some services, what an absolute joy.
Long-time Linux-on-the-desktop user here. I don't feel Linux has become significantly better recently. It's more that Windows reached a new low that is just below the threshold for many. Also, Apple, what are you doing?
A large part of it is that for most people, the vast majority of their computer use is in a web browser. Even "standalone" programs are often just an Electron app, so they don't even have to use their computer differently than they are used to. Yes, Windows has gotten bad, and Linux no longer has some of the major issues people would frequently run into (e.g. hardware compatibility is largely a non-issue, audio just works, etc.), but I think it is mostly that things are just way more platform-agnostic today.
I'd say it's both... In particular 6.16 seems to be a defining point in terms of stability and performance at least for me. My RX 9070XT is finally running with no issues since 6.16 that I've noticed in any of the admittedly few games I play.
Mesa, the kernel drivers and Proton have all seen a lot of growth this past year combined with a bunch of garbage decisions MS has doubled down on... not to mention, enough Linux users in tech combined with Valve/Steam's efforts have made it visible enough that even normies are considering giving Linux a try.
Linux desktop is amazing. Coming from Debian, I installed Windows and had to quickly purge it from my hardware! Super bloated, slow, constantly phoned some CC center, automatically connected to OneDrive, …
Debian is a breath of fresh air in comparison. Totally quiet and snappy.
Debian (stable) is great but I wouldn't use it for a gaming PC on modern hardware. The drivers included are just too old. Bazzite or Arch (DIY option) seem better options.
I don't game, but all my computers run Debian Stable, and my oldest child wastes considerable time gaming on Steam. I had to tweak one or two things for him early on, but it all seems to work fine.
People who don't use Debian misunderstand Stable. It's released every two years, and a subset of the software is kept up to date in Backports. For anything not included in Backports, it's trivial to run Debian Testing or Unstable in a chroot on your Stable machine.
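A minimal sketch of the chroot route, assuming `debootstrap` is installed (target path is arbitrary; run as root; not a tested transcript):

```shell
# Build a Testing tree alongside the Stable system:
debootstrap testing /srv/testing http://deb.debian.org/debian
# Enter it and install whatever is too old in Stable:
chroot /srv/testing apt-get update
chroot /srv/testing apt-get install <newer-package>
# systemd-nspawn -D /srv/testing gives a tidier, container-style shell, if available.
```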
I moved to Debian Stable ~20 years ago because constant updates in other distros always screwed up CUPS printing (among other things). Curiously, I was using Ubuntu earlier this year and the same thing happened. Never going back.
Debian Stable gamer here, with modern hardware, having a great time.
> The drivers included are just too old.
This can usually be fixed by enabling Debian Backports. In some cases, it doesn't even need fixing, because userland drivers like Mesa can be included in the runtimes provided by Steam, Flatpak, etc.
Once set up, Debian is a very low-maintenance system that respects my time, and I love it for that.
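Concretely, Backports is one sources entry plus an explicit `-t` at install time; a sketch (the codename is illustrative - match your release):

```
# /etc/apt/sources.list.d/backports.list
deb http://deb.debian.org/debian trixie-backports main contrib non-free-firmware
```

Then `apt update` followed by e.g. `apt install -t trixie-backports linux-image-amd64` pulls the newer kernel while everything else stays on Stable.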
It is good, and for 99+% of use cases for 90+% of users (who mostly use nothing but the browser), they will hardly even notice a difference, besides the lack of obnoxious, intrusive MS behavior.
However, despite really, really wanting to switch (and having it installed on my laptop), I keep finding things that don't quite work right that are preventing me from switching some of my machines. On my living room PC, which my TV is connected to, the DVR software that runs my TV tuner card doesn't quite work right (despite having a native Linux installer), and I couldn't get channels to come through as clearly and as easily. I spent a couple of hours troubleshooting and gave up.
My work PC needs to have the Dropbox app (which has a Linux installer), but it also needs the "online-only" functionality so that I can see and browse the entire (very large) Dropbox directory without needing to have it all stored locally. This feature has been requested on the Linux version of the app for years, and Dropbox appears unlikely to add it anytime soon.
Both of these are pretty niche issues that I don't expect to affect the vast majority of users (and the dropbox one in particular shouldn't be an issue at all if my org didn't insist on using dropbox in a way that it is very much not intended to be used, and for which better solutions exist, but I have given up on that fight a long time ago), and like I said, I've had linux on my laptop for a couple of years so far without any issue, and I love it.
I am curious how many "edge cases" like mine exist out there though. Maybe there exists some such edge case for a lot of people even while almost no one has the same edge case issue.
There are plenty. I run only Linux at home, but CAD software for hobbies (Fusion 360), most games that want kernel-level anti-cheat, and some embedded DRM-enabled media all sort of just fail. Other things, like GPU tuning or messing with your displays/drivers, are harder than they should be. My Bluetooth earbuds just don't work with my Linux machines.
I don’t think he did get it running. It’s one of my main blockers as well. Last time I tried, I got as far as it starting up and logging in to their identity server via the browser, but the redirect back to the application didn’t work. Such a silly thing to prevent it from working. Why does a CAD program need online auth, anyway? (I know the reason, but it’s an annoying one)
FUSE can provide Dropbox in a more integrated way than Windows does (e.g. in the terminal), and a cursory Google revealed some projects for Dropbox that do the JIT download you are after - they are old, but I wager they still work just fine (an inactive project can just mean that it's complete).
I just switched to Linux. It's a great gig, and I'm actively encouraging everyone I know still infected with the malware known as Windows 11 to switch.
But some of the drawbacks really aren't edge cases. Apparently there is still no way for me to have access to most creative apps (e.g. Adobe, Affinity) with GPU acceleration. It's irritating that so few Linux install processes are turnkey the way they are for Windows/Mac, with errors and caveats that cost less-than-expert users hours of experimenting and mucking with documentation.
I could go on, but it really feels like a bad time to be a casual PC user these days, because Windows is an inhospitable swamp, and Linux still has some sharp edges.
I use OneDrive and Google Drive heavily, and there just are not good Linux clients for those that I have found - especially ones with the ability to not sync files but still make them "look" like they are there in the filesystem. That is my main stopper now.
Whilst initially reluctant, I made a one-off payment to Insync ( https://www.insynchq.com ) many years ago for my Google Drive account - and it has worked flawlessly.
I agree. I'd cut off dual booting and go full Linux once the hardware and software I use supports it. One blocker is a PCIe Elgato capture card; another is an audio mixer with no driver support, where the alternatives are very hacky and too complicated for me.
I permanently switched from Windows to Linux about five years ago. I had the same issue as you with Dropbox, so I switched to using the Maestral client for Dropbox instead which has support for selective sync. Works like a charm for me.
I quit gaming a year ago and no longer have a consumer OS installed on any machine. I can't imagine ever willingly going back after getting used to being able to set my machine up any way I want, and know it will work exactly as I've specified, and won't ever spy on me or monetize my data, and actually has an ecosystem for extending it in basically any way I can imagine, with no bloatware, an app ecosystem with no bundled spyware or adware, etc.
After a few months of testing the waters, I just moved my gaming PC over to full-time Linux this weekend. Proton has really been revolutionary, as I haven't yet encountered something in my Steam library that won't work.
I've been on the Linux desktop for ages, but it's not quite stable enough that I can recommend it to anyone. Space Marine 2 was the first game in quite a while that didn't just work out of the box, but...
E.g. three weeks ago Nvidia pushed bad drivers which broke my desktop after a reboot. I had to switch to a virtual console (Ctrl-Alt-F3, etc.) - I never got into GNOME at all - and roll back to an earlier version. Automatic rollback of bad drivers would have saved me here.
It might depend on the distro, but were you running a 10-series or earlier card? They dropped Pascal and earlier GPUs with the 590 driver. I know Arch migrated what the nvidia package installed in a way that could leave someone without an appropriate driver unless they manually moved to a different source.
Then again Arch is one of those distros that has the attitude that you need to be a little engaged/responsible for ongoing maintenance of your system, which is why I'm against blind "just use (distro)" recommendations unless it's very basic and low assumptions about the user.
I've had mixed experiences with AMD. Back in the day - a bit after Linus told Nvidia to fuck off - I tried to get my Radeon HD 5850 (I think?) working on Ubuntu. It was one of those things I spent the whole weekend (OS reinstalls really add up) trying to make work, to no avail. Relative to that nonsense, the equivalent proprietary Nvidia driver just worked after being installed.
A couple of months ago I bought a second hand RX 7800 XT, and prepared myself for a painful experience, but I think it just worked. Like I got frustrated trying to find out how to download and install the driver, when I think it just came with Linux Mint already.
I've been using a full AMD build with Arch on it for years now. I've never had graphics-related issues after an update. My biggest gripe is with the HDMI Forum and how we can't have proper HDMI support in the open source drivers.
- Firefox seems to be able to freeze both itself and, sometimes, the whole system. Usually while typing text into a large text box.
- Recently, printing didn't work for two days. Some pushed update installed a version of the CUPS daemon which reported a syntax error on the cupsd.conf file. A few days later, the problem went away, after much discussion on forums about workarounds.
- Can't use more than half of memory before the OOM killer kicks in. The default rule of the OOM killer daemon is that if a process has more than half of memory for a minute, kill it. Rust builds get killed. Firefox gets killed. This is a huge pain on the 8GB machine. Yes, I could edit some config file and stop this, but that tends to interfere with config file updates from Ubuntu and from the GUI tools.
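For reference, the knob being described lives in systemd-oomd's config, and a drop-in avoids fighting the distro's own file on upgrades. A sketch (values illustrative - see `man oomd.conf` for the real keys and defaults):

```
# /etc/systemd/oomd.conf.d/relax.conf
[OOM]
SwapUsedLimit=95%
DefaultMemoryPressureLimit=80%
DefaultMemoryPressureDurationSec=2min
```

Because it's a drop-in rather than an edit to the shipped config, package updates leave it alone.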
The Gnome desktop environment usability degradation in recent releases (stuff like drag and dropping files and folders between the desktop and the file explorer not working anymore, or not being able to create new empty files with a right click by default without having to create custom templates, being unable to pin apps to the launcher without messing with files, and more) was so horrendous that it felt like actual sabotage was being committed. Who in their right mind would decide to make the UI actively worse?!
These seem annoying, but I'd argue that these problems are in some ways less significant on Linux than on Windows. If some function of Windows is broken or unsatisfactory, there is not necessarily a way to fix it.
But you can adjust your own system. It'd be unhelpful of me to suggest to an unhappy Windows user that they should switch to another operating system, as that demands a drastic change of environment. On the other hand, you're already familiar with Linux, so the switching cost to a different Linux distribution is significantly lower. Thus I can fairly say that "Ubuntu getting worse" is less of a problem than "Windows getting worse." You have many convenient options. A Windows user has fewer.
What are you talking about? Firefox hasn't been single-process for more than 10 years. At most, it uses 7% of memory for the main process, and I have thousands of tabs open. I can't speak to the other two, but I've had processes use 60% of system memory without problems (everything else is slow due to swapping, but that's expected).
They were talking about instability. I had an old Radeon workstation card in my desktop at home for at least a decade, but with the most recent AMD drivers, Firefox (with hardware acceleration turned on) would crash Gnome and the system when watching videos on YouTube. So I wasted money on one of those new Intel graphics cards to get the stability back (in addition to the time wasted diagnosing the problem).
Just recently started using the desktop machine (under my desk, as opposed to my laptop which sits on my desktop) and put NixOS on it, and found myself pleasantly surprised. There's certainly still some parts of NixOS that require some expertise and getting your head around its package model, but overall I was surprised at how idiotproof it was to install and use. I mostly play games on it with Steam, which also Just Works.
NixOS is really a profound experience, once you embrace it. I used Arch for ~3 years and ended up reinstalling it maybe 15 times on my desktop alone. Switched to NixOS and I've used the same installation for 3 years, synced with my laptop and server, switching from x11 to Wayland to KDE to GNOME then back again with no problem.
It doesn't feel real sometimes. My dotfiles are modularized, backed up on GitHub, and versioned with UEFI rollback when I update. I might be using this for the rest of my life now.
I also have the same Arch install from 2014 on my main hardware. Each replacement computer is nothing more than taking the old drive out, placing it in a USB enclosure, booting a live USB, setting up the partitions on the new drive, and _rsync_-ing the content from the old to the new, finalizing with registering the UEFI boot loader.
One just needs to make sure to use the proper _rsync_ options to preserve hard links, or files will be duplicated.
I personally remember being inspired by Erase your Darlings and Paranoid NixOS Setup back in the day, less for the hardening measures and more because of how great the Nix syntax looked. Huge, monumental ass-pain setups could be scripted away in one or two lines like it was nothing. You could create wildly optimized configurations for your specific use-case, and then divide them into modules so they're portable.
It's not advisable to switch to one of these paranoid configurations outright, but they're a great introduction to the flexibility provided by the NixOS configuration system. I'd also recommend Xe's documentation of Nix Flakes, which can be used on any UNIX-like system including macOS: https://xeiaso.net/blog/nix-flakes-1-2022-02-21/
For what it's worth: I no longer suggest the use of NixOS for any purpose. I only have one NixOS system in my house because it's my NAS and I am a coward.
Minus work, I've exclusively been using Arch Linux, Ubuntu Server & macOS on my machines and haven't looked back for the last couple years. The biggest annoyance I have is the lack of (modern) Adobe applications on Linux. I'll be real though, I mostly just like Photoshop out of familiarity, I could probably get used to the alternatives but I'm lazy. Photoshop CS6 I hear works fine in WINE which is good enough for me since my workflow hasn't changed much since CS5.
Oh, and also the anti-cheat games forcing me to use Windows. It makes me sick to my stomach booting into Windows 11 every couple of months and watching my PC's performance tank while it downloads updates, runs Windows Defender scans, etc. for 30 minutes.
This may be an unpopular opinion, but I feel Bazzite (and immutable distros in general) is the future for normal users. Yes, I know, they take away the freedom to mess with the system core, but for most people this is fine. All they need is a device that works without any problems.
The first time I switched to the asus kernel from the generic one, it was magic - I know asus-linux exists, and following the instructions probably would have ended in a working system, but with Bazzite I ran only one command and everything worked. It still feels weird not to monkey around with package installations (that was a dangerous path anyway, usually ending in more work for me), but it's a tradeoff I can live with. The software I use - luckily - had already moved to Flatpak, so everything was a breeze. Also, the fact that I can switch back to a working state with one keypress is a stress reliever.
I agree: Linux is good now - for the common user. I still can't see immutable distros being used for every scenario, but for gaming/home use this is an approach I can easily recommend to friends and family who just want a computer that works without messing with the console.
Been so happy with my switch to Linux about 8 months ago. The nvidia gremlins that stopped me in prior years are all smoothed out.
One big plus with Linux, it's more amenable to AI assistance - just copy & paste shell commands, rather than follow GUI step-by-steps. And Linux has been in the world long enough to be deeply in the LLM training corpuses.
I'm slowly de-Microsofting my computing. I've traded OneDrive for Syncthing. I ditched one PC for a Mac. I have the technical skills to run Linux effectively, but the biggest obstacle for my Linux adoption is distro fatigue. Run Ubuntu? Debian? Fedora? PopOS? Kubuntu? Arch? The article introduced yet another one to consider--Bazzite.
The Linux world is amazing for its experimentation and collaboration. But the fragmentation makes it hard for even technical people like me who just want to get work done to embrace it for the desktop.
Ubuntu LTS is probably the right choice. But it's just one more thing I have to go research.
Don't go for Ubuntu LTS. There are many choices that would work equally well for you. I'd go for Fedora KDE edition (but it could easily be EndeavourOS or openSUSE Tumbleweed).
As a beginner, just pick Ubuntu and get on with your life imo. Switching distros isn't that big of a lift later on and pretty much everything you learn carries over from one to the other. It's much more worthwhile to just pick _something_ and learn some basics and become comfortable with the OS imo.
Pick a popular distro, and during installation, put your /home directory on its own partition. This way, you won't have much to reconfigure if you ever have a reason to switch distros. (You might not ever have a reason; they're all pretty capable.)
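The separate-/home arrangement comes down to one fstab line once the partition exists; a sketch (the UUID is made up - `lsblk -f` shows the real one):

```
# /etc/fstab - reuse the same /home partition across distro reinstalls
UUID=0a1b2c3d-4e5f-6789-abcd-ef0123456789  /home  ext4  defaults  0  2
```

On a reinstall you point the new distro's installer at the partition, mark it as /home, and tell it not to format.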
Just use Debian. If you have technical skills, run Debian Testing. If not, run Debian Stable, or something that repackages Debian, such as Mint or (as you mentioned) Ubuntu LTS.
Debian Testing will sometimes break, so technical skills are necessary if you want to always be sure you can be up and running. Otherwise, something may not work for a few days to a week, like CUPS (printing). 99.9% of the time, it won't be your networking or something super-important, but it could be. When you update, read the list of changes and absolutely make sure that you know what the things are that are being uninstalled, and whether you can do without them for a few days. Check the internet for when packages are removed from testing (and why) or will be moved into testing from unstable. Don't forget that you can use LLMs now when you have a problem.
Once you've been able to handle Debian Testing for a while, especially through a couple of breakages, you'll probably be confident enough and knowledgeable enough to know if you want to go to another distro. I personally don't need anything other than testing for my desktops, and stable for my servers.
edit: Debian Testing gets software that has worked smoothly on Debian Unstable for a two-week period. Sometimes things get missed during those two weeks, and sometimes Debian decides to reorganize packages radically in a way that takes more than one update. One thing to remember is that urgent bugfixes to Debian Stable might bypass testing altogether, and might actually arrive later to testing than everywhere else. You'll probably hear about those on the news or on HN, and you might want to manually install those fixes before they actually hit testing.
Ubuntu stopped caring about the desktop experience when they switched to GNOME. Now they have annoying snaps. They are a business, and they are going to continue enshittifying it.
I haven't tried Bazzite because I'm not into gaming but Linux Mint is working very well for a lot of people coming from Windows. It just works and has great defaults. Windows users seem to pick it up pretty easily.
Also, Linux Mint upgrades very well. I've had a lot of success upgrading to new versions without needing to reinstall everything. Ubuntu and other distros I've tried often have failed during upgrading and I had to reinstall.
I think fragmentation is the wrong way to look at it; they're all basically compatible at the end of the day. It's more like an endless list of people who want to min-max.
Any reasonably popular distro will have enough other users that you can find resources for fixing hitches. The deciding factor that made me go with EndeavourOS was that their website had cool pictures of space on it. If you don't already care then the criteria don't need to be any deeper than that.
Once you use it enough to develop opinions, the huge list of options will thin itself out.
I want to switch to Linux for gaming. But I’m also stressed about leaving Lightroom for Darktable, which seems like it is much more complicated and with a worse UI (even if people say it’s more powerful under the hood)
I'm glad that I am not the only one saying this. I made the switch 20+ years ago for my day to day use, and I have rarely experienced any problems with it.
I’d argue the majority of casual online PC discourse is driven by gaming. By the numbers LTT is the largest PC/IT/consumer computer YouTube channel and the majority of their content is focused on gaming.
A long time ago, when I was in university, I was a volunteer in the Ubuntu group. In addition to evangelizing Linux/OSS, we were trying to convince our university to switch to open-source software for at least some engineering education, with only a little bit of success.
After a particularly busy OSS event, a non-programmer friend of mine asked me: why is it that the Linux people seem so needy for everyone to make the same choices they make? Trying to answer that question changed my perspective on the entire community. And here we are; after all these years the same question seems to still apply.
Why are we so needy for ALL users and use-cases to be Linux-based and Linux-centric once we make that choice ourselves? What is it about Linux? The BSD people don't seem to suffer from this, and I've never heard anyone advocate for migration to OSX in spite of it being superior for specific use-cases (like music production).
IMO if you're a creator, operating systems are tools; use the tool that fits the task.
When you (try to) use libre software, the problems you run into tend not to be related to insufficient engineering; they are more societal and economic, and would be less likely to appear if there were more people in your cohort.
Examples:
- An important document is sent to me in a proprietary format
- A streaming service uses a DRM service owned by a tech giant that refuses to let it work with open source projects
- A video game developer thinks making games work on Linux isn't worth getting rid of rootkit anticheat
The downside is Windows users would have to live in a world without subscription-based office suites, locked-down media, and letting the CCP into their ring 0.
It’s bad for society for the desktop OS market to be a proprietary monopoly. It basically allows Microsoft to extract rent from the public.
I do understand the evangelism being obnoxious. I don’t advocate for people to switch if they have key use cases that ONLY windows or OS X can meet. Certainly not good to be pushy. But otherwise, people are really getting a better experience by switching to Linux.
Because there are people who care about Free software from a philosophical standpoint on how societies should function and interact.
The community aspect of free software pushes both for more people to participate and, often, for other groups to be excluded as "wrong" or "evil".
But that community offers only secondary benefits - economic factors, risk aversion, functionality, and so on - to those who are authors or painters or photographers rather than software developers. The FLOSS communities are almost invariably driven by hobbyists and developers rather than authors, artists, gamers, and the like - people whose interest lies outside of tinkering with and/or improving software.
The BSDs were never really a movement in that sense, and macOS is still just a product even if there are enthusiastic users of them both.
Similarly on the Linux side: Android, Steam Deck, and countless IoT devices are examples of successful products where the Linux aspect of them is not really even advertised.
> why is it that the Linux people seem to be so needy for everyone to make the same choices they make?
This is the sort of question an apolitical person would ask a liberal (I am aware liberalism has been tainted in recent times), like: why is it you people are so needy, constantly preaching about democracy?
I moved to Linux this month for good, once I realized I no longer needed Microsoft services (Excel, for example, "runs on Mac" but is missing important features). I chose Red Hat because it's what I've been using for over a decade at work and it feels like home. The only thing I miss is CapCut, as that workflow was pretty ironed out. Getting the hang of Kdenlive.
I use a Linux PC every day but I wouldn't recommend it to normal people. They're not going to feel any renewed sense of ownership from it, just annoyance at having to think about technical gibberish when they just want to get on with using the computer.
Yeah, because getting ads and pushed to use more Microsoft products all day long isn't an annoyance when they just want to get on with using the computer.
I've been really enjoying my experience using CachyOS on my (formerly Windows) gaming PC. I chose to use Limine and btrfs so now if it gets borked by a bad package install/uninstall I can roll back pretty easily. My next step is to replace my Nvidia GPU with an AMD one so I can stop worrying about that aspect in the future.
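For anyone curious what that rollback safety net looks like underneath, a minimal manual sketch (assuming a btrfs root and a `/.snapshots` directory; CachyOS's Limine/snapper integration automates this, so these commands are illustrative, not its actual mechanism):

```shell
# Take a read-only snapshot of the root subvolume before a risky update
sudo btrfs subvolume snapshot -r / /.snapshots/pre-update

# List existing subvolumes/snapshots to find one to roll back to
sudo btrfs subvolume list /
```

Actually rolling back then means booting the snapshot entry from the Limine menu rather than the live root.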
Linux is even getting more accessible. I'm thinking of Elementary OS, which not only posted about their accessibility work, but linked to the articles which really fired things up. I'm a Fedora guy, mainly because I want the latest Orca, AT-SPI2 and such, so I don't feel like an Ubuntu derivative would work as well.
So I installed Fedora on my work machine and find that I can still get all of my work done. Well except the parts that require testing accessibility on Windows screen readers or helping with Windows-related issues.
The only things I miss now are the many add-ons made for NVDA, especially the ones for image descriptions. But if I can get something to work with Wayland, I could probably vibe code some of them. Thank goodness for Claude Code.
Every year at around this time there is a lot of linux related content in tech media.
It's a slow moving evergreen topic perfect for a scheduled release while the author is on holiday. This is just filler content that could have been written at any point in the last 10 years with minor changes.
I've been working on the Linux desktop for 20 years, and I've been using it on the desktop since 1999, so I lived through the infamous "Year of the Linux Desktop" era.
I've not seen anything like the current level of momentum, ever, nor this level of mainstream exposure. Gaming has changed the equation, and 2026 will be wild.
Not just gaming. This year, both Windows and Mac OS had absolutely terrible years. The Mac effed up its UI with liquid glass, to the point where Alan Dye fled to Meta. Microsoft pushed LLMs and ads into everything, screwing up what was otherwise a decent release.
On the other hand, on the Linux side, we had the release of COSMIC, which is an extremely user-friendly desktop. KDE, Gnome, and others are all at a point where they feel polished and stable.
To be honest, I always figured we'd make it in the long run. We're a thrifty bunch, we aim to set up sustainable organizations, we're more enshittification-resistant by nature. As long as we're reliable and stick around for long enough.
I don't think the prevalence of these articles this time of year is because the authors go on holiday, but instead is because the new year is the perfect time to ponder: "Will this be the year of the Linux desktop?"
I guess everyone’s in a “fuck it I’m ready to try some new stuff” mood too so this content is perfectly suited for new years. Would never have noticed this without your comment.
Except every year you didn't have people like PewDiePie and DHH pushing Linux, as well as channels like GamersNexus doing Linux benchmarks, at the same time as Windows and Mac were making very dumb mistakes. So this time it does feel different, even if it might not be in the end.
I'm solo developing a spaceflight simulator on Linux (using the Godot engine), exporting binaries in both Linux and Windows. It turns out that I really didn't need to bother with the Linux export anyway because Steam runs the Windows version on Linux without any problems.
The ONLY thing I'm still having trouble with under Linux is Steam VR on the HTC Vive. It works. Barely.
Yes, most likely. Steam is dominant, and it's not hard to make a Windows release that works under Proton.
Though in my case, I currently offer demo/beta releases for both Windows and Linux directly from Github. If I ultimately elect to release my game under a GPL license, then supporting both Linux and Windows directly would make sense.
This is not an argument in good faith. Nintendo is a company that primarily makes hardware to sell its own games. The Wii is not a general computing device and we never expected it to do that.
Linux claims to be a general computing operating system, but had historically not prioritized gaming (or UI…). This has changed and the article notes as much. Windows OS is used for a benchmark because it is still the gold standard, and the OS that most games are intended to be executed with.
One thing I'm curious about: how is the laptop dual-GPU situation these days?
I opted to install Linux in a VM under Hyper-V on Windows to avoid hassles with the dual GPUs in my ThinkPad P52, but this comes with several other hassles I'd like to avoid. (Like no GPU access in Linux at all...)
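On bare metal, the hybrid-GPU situation is mostly handled by render offload these days. A quick way to check which GPU is doing the work (assuming Mesa drivers, or the `nvidia-prime` package when using the proprietary driver):

```shell
# Default renderer (usually the integrated GPU on hybrid laptops)
glxinfo | grep "OpenGL renderer"

# Force the discrete GPU via Mesa's PRIME render offload
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"

# With the proprietary NVIDIA driver, the nvidia-prime wrapper does the same
prime-run glxinfo | grep "OpenGL renderer"
```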
I’ve been around the block with Linux distributions since 2020. I personally think that Bazzite is the way to go for most people coming from Windows, or people experienced with Linux that want something as close to “set and forget” as you can.
One thing that can be annoying is how quickly things have moved in the Linux gaming space over the past 5 years. I have been a part of conversations with coworkers who talk about how Linux gaming was in 2019 or 2020. I feel like anyone familiar with Linux will know the feeling of how quickly things can improve while documentation and public information cannot keep up.
For me like for many others anti-cheat support is what is still keeping me away. The vast majority of the games I play are incompatible because of this.
I think it will probably change at some point, but until then I just can't use it.
I've been sceptical of the 'Linux desktop' for a long time, but I recently started using Bazzite on my gaming PC and I'm super impressed. In just a few years since I last daily drove a Linux distro it's come such a long way. KDE Plasma is fast and beautiful.
So far all the games I want to play run really well, with no noticeable performance difference. If anything, they feel faster, but it could be placebo because the DE is more responsive.
If Microsoft could get their heads out of their rears, they could potentially get back to a better OS for gaming. The hybrid kernel Dave Cutler designed is in many ways still better than the Linux kernel. It's the userland that is the issue with Windows 11. Just by enabling true NVMe support, you close the gap between Linux and Windows performance-wise.
Linux is the best. Been using it since Slackware, RedHat first came out, now use Ubuntu or any distro which makes it easy to interact with its Desktop, e.g. GNOME :)
Linux desktops have felt flaky for me for a few years now. I’m trying to figure out how much of that is bad choices vs real problems.
Ubuntu’s default desktop felt unstable in a macOS VM. Dual-booting on a couple of HP laptops slowed to a crawl after installing a few desktop apps, apparently because they pulled in background services. What surprised me was how quickly the system became unpleasant to use without any obvious “you just broke X” moment.
My current guess: not Linux in general, but heavy defaults (GNOME, Snap, systemd timers), desktop apps dragging in daemons, and OEM firmware / power-management quirks that don’t play well with Linux. Server Linux holds up because everything stays explicit. Desktop distros hide complexity and don’t give much visibility when things start to rot.
Does this line up with others’ experience? If yes, what actually works long-term? Minimal bases, immutable distros, avoiding certain package systems, strict service hygiene, specific hardware?
The only real obnoxious slow-down daemons I'm familiar with are the "system indexing" things (GNOME Tracker, KDE Baloo) -- highly recommend disabling them.
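If you want to try that, the off switches are roughly as follows (service names vary a bit by version; these assume Tracker 3 on GNOME and a recent KDE Plasma):

```shell
# KDE: turn off Baloo file indexing
balooctl disable            # balooctl6 on newer Plasma releases

# GNOME: mask the Tracker 3 filesystem miner for your user
systemctl --user mask tracker-miner-fs-3.service
```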
I've been using Kubuntu for years with good results. I prefer KDE to Gnome, which Kubuntu takes care of, and I normally add in the flatpak repositories so I don't need snap. That has generally worked well for me in the last 5 years.
For certain time periods I have needed to switch to Fedora, or the Fedora KDE spin, to get access to more recent software if I'm using newer hardware. That has generally also been pretty stable, but the constant stream of updates and short OS life are not really what I'm looking for in a desktop experience.
There are three issues that linux still has, which are across the board:
- Lack of commercial mechanical engineering software support (CAD & CAE software)
- Inability to reliably suspend or sleep for laptops
- Worse battery life on laptops
If you are using a desktop and don't care about CAD or CAE software I think it's probably a better experience overall than windows. Laptops are still more for advanced users imho but if you go with something that has good linux support from the factory (Dell XPS 13, Framework, etc.) it will be mostly frictionless. It just sucks on that one day where you install an update, close the laptop lid, put it in your backpack, and find it absolutely cooking and near 0% when you take it out.
I also have never found something that gave me the battery life I wanted with linux. I used two XPS 13's and they were the closest but still were only like 75% of what I would like. My current Framework 16 is like 50% of what I would like. That is with always going for a 1080p display but using a VPN which doesn't help battery life.
We live in a world with the internet and distributed version control, so essentially every piece of software in the world has a tradeoff where the people maintaining it might push an update that breaks something at any time, but also those updates often do good things too, like add functionality, make stuff more efficient, fix bugs, or probably most crucially, patch out security vulnerabilities.
My experience with FOSS has mostly been that mature projects with any reasonable-sized userbase tend to break things in updates less often than proprietary software does, whether it's an OS or just some SaaS product. YMMV. However, I think the most potent way to keep problems like this from ever mattering is a combination of doing my updates manually (or at least on an opt-in basis) and being willing to go back a version if something breaks. Usually this isn't necessary for more than a week or so for well-maintained software, even in the worst case. I use Arch with downgrade (which lets you go back and choose an old version of any given package) and need to actually use downgrade maybe once a year on average, less in the last 5 years.
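For reference, the workflow with `downgrade` (an AUR helper) looks like this; the package name and cached version below are just examples:

```shell
# Interactively pick an older version from the pacman cache or the Arch Archive
sudo downgrade mesa

# Equivalent manual route: reinstall a cached package directly
sudo pacman -U /var/cache/pacman/pkg/mesa-24.0.1-1-x86_64.pkg.tar.zst
```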
> Does this line up with others’ experience? If yes, what actually works long-term? Minimal bases, immutable distros, avoiding certain package systems, strict service hygiene, specific hardware?
No, not really. A Linux desktop with a DE will always be slower and more brittle than a headless machine due to the sheer number of packages/components, but something like Arch + Plasma Shell (without the whole KDE ecosystem) should be very stable and snappy. The headaches caused by immutable distros and flatpaks are not worth it IMO, but YMMV.
With debian and KDE (both personal preference), but no snap or flatpak, it works wonderfully. Power/sleep-management has become better than a default windows install. All hardware, including the fingerprint sensor, just works.
I've run Void Linux + Xmonad for many years without any such issues. I also recently installed CachyOS for my kid to game on (KDE Plasma) and it works super well.
Not really, no. What did you install that slowed things down?
> If yes, what actually works long-term?
Plain ordinary Ubuntu 24.04 LTS, running on an ancient Thinkpad T430 with a whopping 8GB of RAM and an SSD (which is failing, but that's not Linux's fault, it's been on its way out for about a year and I should probably stop compiling Haiku nightlies on it).
Can you give an example of which desktop apps are "dragging in daemons"?
I have been using Linux in different flavours for the past 10 years. It has become more reliable over that time. The last 5 years have had noticeably few issues across the distros.
I have always wanted to use linux as my main OS. I tried with Ubuntu twice the past and always ran into really painful hurdles or missing features. This year I tried again with Mint and it absolutely stuck the landing. I have completely switched my desktop and laptop (and plex server) to mint. I have never even booted back into windows. I have not had any big issues and have been able to make it better than my windows desktop ever was.
I switched in 2020. I run Fedora and Arch. I don’t miss MacOS at all. The last Windows I used was 8, so my opinion is out of date, but yeah… I don’t miss Windows, either.
I have been using Linux for almost 23 years now. I don't praise it as flawless in any way, but compared to Microsoft it is a much more efficient operating system. The Top 500 supercomputers all running Linux also kind of hints that Linux is very good.
Despite this, Linux as ecosystem has numerous problems. The "wayland is the future" annoys me a lot. The wayland protocol was released in 2008. Now it is almost 20 years. I don't feel wayland is ever going to win a "linux desktop of the year" award. Things that work on xorg-server still do not work on wayland - and probably never will. I am not saying wayland is useless, I ran it for a while on KDE (though I actually don't use KDE, I use icewm typically), but it is just annoying how important things like GUI on Linux simply suck. In many ways Linux is kind of a server computer system, not really a desktop computer system. I use it as one, but the design philosophy is much more catering to the server or compute-work station objective.
Also, GTK ... this thing keeps on getting worse and worse with every new release. I have no idea what they are doing, but I have an old GTK2-based editor and this one consistently works better than the GTK3 or GTK4 ported version. It was a huge mistake to downgrade and nerf GTK to a gnomey-toolkit only. Don't even get me started on GNOME ...
Hopefully all this news Linux is getting for games translates to pressure on makers of all the tools that don't run reliably/at all in Wine and co. to start working on ports. Or at least on making changes to make API translation work better.
I can't move until the software I've invested in moves too. There are no Linux alternatives.
VMs, second machine with firewall etc. I recommend moving everything you can to trustworthy tech. Doesn’t have to be all at once but a gradual process.
There is a strange, but pleasant feeling when you hear someone claiming “they’re early to Linux” and think it’s going to be something big. (Happened recently.)
The success measurements are quite strange. How am I supposed to think Linux is finally good when 96.8% of users do not care to adopt it. I can't think of anything else with that high of a rejection rate. The vast majority do not consider it good enough to use over Windows.
I have been switching between linux-windows for a while now, and i think 2026 is not the year of linux for now.
Linux still suffers from the same fragmentation issue: oh, you want to play games, you should use distro X; oh, you want average web browsing and office work, you should use distro Y; or for programming, use Z. Of course all of them can do what the others can do, but that is the way the community has decided to present it.
Yesterday I read a Reddit thread about a user sharing his issue with Pop!_OS, and most (if not all) comments said he was using the wrong distro. He was using the latest release (not the nightly build), which is a reasonable thing to do as a new user.
Not sure if Linux Mint has changed this, but I remember having to add the "non-free" repo to use the official Nvidia driver. Not a big deal to people who know what they are doing, but still, that is unnecessary friction.
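On plain Debian that friction looks roughly like this (a sketch; the exact component names depend on the release, and newer releases split firmware into `non-free-firmware`):

```shell
# Enable the non-free components in the apt sources, then install
# the packaged NVIDIA driver
sudo sed -i 's/ main$/ main contrib non-free non-free-firmware/' /etc/apt/sources.list
sudo apt update
sudo apt install nvidia-driver
```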
I just got a laptop for Christmas (first thing I've bought for myself in a good while) with 64GB of DDR5 RAM, a video card inside of it, AMD Ryzen 7 CPU, AMD Radeon 6550M. 144hz screen.
Not the best, but works for me.
I put CachyOS on it. Using Steam, you just run the game's installer after adding it as a game to your library, then select which Proton you want (cachyos-proton) from a dropdown in the Properties in the Steam library. That's it.
It's lightweight, Arch (I ditched Manjaro), runs KDE and games perfectly, Cursor IDE runs great, VMs run great.
first thing I did when I got it from fedex was remove Windows and put Linux on it. I thought 'maybe I'll just bite the bullet and sign up a Microsoft cloud account to be able to access ..my desktop' and 1/4 through its install I held the power button and popped a flash drive in. just say no to windows and you'll all be happy, trust me.
the only effort it required was for me to say f this on using Lutris and just use Steam as the wrapper.
2026 is definitely the year for linux. every year is. valve heavily invested in Arch, proton, and is using Linux on their devices and honestly: Windows is spyware, and after their vibe coded jank 25H2 update that broke a ton of things and Windows 10 being EOL, I hope more people get to enjoy throwing Ventoy on a USB stick with a bunch of linux isos copied over to it and boot and play with what they love.
so I disagree, 2026 is the year for Linux, and Linux is love.
Now that Microsoft are removing support for Windows 10, which is still running on many not-that-old devices that don't support Windows 11, I think the correct, mainstream advice HAS to be to install Linux on those machines. Those are still perfectly usable machines that can be used for productivity or enjoy a massive catalog of games.
We've reached a point where Microsoft greed and carelessness is degrading Windows from all angles. With the constant forced Copilot, forced sign-ups, annoying pop-ups and ads, it is figuratively unusable; in the case of machines stuck on Windows 10 it is literally unusable.
They are now banking entirely on a captive market of Enterprise customers who have invested too much to quit. The enshittification is feature complete.
An ex lease Thinkpad T Series with Intel graphics is a good choice for value and compatibility. eg a T490 or T14 era machine.
Using hardware at least 6-12 months old is a good way to get better compatibility.
Generally Linux drivers only start development after the hardware is available and in the hands of devs, while Windows drivers usually get a head start before release. Brand new hardware on a LTS (long term support) distro with an older kernel is usually the worst compatibility combo.
What is really blocking the move for me is Zscaler and Zoom (they may exist on Linux; I'm not sure how integrated they are), but especially Outlook (the client). The OWA version is subpar, and without it I cannot function in a work environment.
> without it I cannot function in a work environment.
This is more about what you choose as your operating environment, not what your work imposes as your working environment.
Most places of work, mine included, run Microsoft services that lock them into the ecosystem incredibly tightly.
As per the article title, "if you want to feel like you actually own your PC", this is about your PC, not the one provided to you by your workplace (since it's likely owned by them).
One thing I'm worried about in my work environment is Microsoft enforcing the web versions of Office and deprecating the stand alone desktop applications. The web versions are a massive step down in terms of functionality and ease of use. Your mention of OWA makes me feel as if that is what Outlook will be sacrificed for at some point in the future anyway.
I had a similar issue, but I ended up installing Debian and running Windows 10 as a virtual machine with VirtualBox. The webcam can be accessed as if it were installed on the guest OS, and I haven't had a problem with Zoom or Teams. Just sharing in case it helps.
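In case it helps anyone replicating this: VirtualBox's webcam passthrough is driven from the host with `VBoxManage` (the extension pack is required, and "Windows 10" below is whatever your VM happens to be named):

```shell
# List the host webcams, then attach the default one (alias .0) to a running VM
VBoxManage list webcams
VBoxManage controlvm "Windows 10" webcam attach .0
```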
I considered that, but it is such a waste of resources in my case: I deliberately use a lighter laptop that just covers my dev needs.
But yes, this is a possibility, or accessing Windows via RDP. The loss would be the "always-handy" kind of setup, where Outlook is a click away and pops up its calendar reminders.
Linux has been my main and sole desktop since around 2006. I needed Windows for TurboTax a few hours a year in the past, but that's it. I didn't do PC games, though, just regular desktop stuff including developing code.
We've made the switch and it's been great. On top of that, my partner, who is not a computer person, picked up Linux Mint in a couple weeks to the level at which she could use Windows.
Yes. The reason the year of the Linux desktop has yet to arrive is because most people don't understand this joke. Linux is powerful because it is made for power users (although certain distros are changing this)
I love this. I spent my holidays hearing non-technical family members complain about their ever deteriorating Windows experiences, issues that make me righteously angry at Microsoft.
IMO the next important unblocker for Linux adoption is the Adobe suite. In a post-mobile world one can use a tablet or phone for almost any media consumption. But production is still in the realm of the desktop UX, and photo/video/creative work is the most common form of output. An Adobe CC Linux option would enable that set of "power users". And regardless of their actual percentage of desktop users, just about every YouTuber or streamer talking about technology is by definition a content creator, so opening Linux up to them would have a big effect on adoption.
And yes I've tried most of the Linux alternatives, like GIMP, Inkscape, DaVinci, RawTherapee, etc. They're mostly /fine/ but it's one of the weaker software categories in FOSS-alternatives IMO. It also adds an unnecessary learning curve. Gamers would laugh if they were told that Linux gaming was great, they just have to learn and play an entirely different set of games.
Photoshop (for example) largely works in Wine, although it's not stable enough for production usage. The problem is the CC itself and the installer, which is unimaginably bloated and glued to the Internet Exp... I mean Edge Web View and many other Windows-only things.
Hahaha. Try sharing a couple old printers and scanners connected to a Linux box on your home network. At best, when it's working, you get lowest-common-denominator functionality. Want to run some VMs? Works great until you update your distro and the VM host's kernel modules aren't compatible anymore. Oh, want to use a later version of some package like Docker? Did I use apt or snap or flatpak???
Yes, you can get this stuff working, but if you enjoy doing other things in life, have a job, and don't live alone, it is SSSOOOOO much easier to get a Mac mini. Or even Windows 11 if that's your thing.
Sounds ultra-specific to your experience. VMs, package management and networking are all things that macOS and Windows stumble with for regular usage. I've used all three OSes professionally, and Linux requires the least configuration to get work done.
CachyOS is a good OS that is also performant. It's Arch, though, so there are quirks around the rolling update model, but you always have the newest-ish packages, and if you update regularly there seems to be less headache.
I tried a number of distros and settled on Omarchy because it has a coherent design and nice aesthetics, but it has some weird quirks about messing with my dotfiles on updates. It's so new I suspect this will be ironed out soon.
Honestly I loved it a lot more pre-2022, when Ubuntu added a super aggressive OOM killer that only operates on the level of an entire systemd run unit. Meaning that if you are running computation in, say, a shell, and one of your subprocesses takes too much memory, it takes out the entire shell and terminal window, leaving no trace of what happened, including all the terminal logs.
And if you are running Chrome, and something starts taking a lot of memory, say goodbye to the entire app without any niceties.
(Yes, this is a mere pet peeve, but it has been causing me so much pain over the past year, and it's such an inferior way to deal with memory limits compared to what came before it. I don't know why anybody would have taken OOM logic from systemd services and applied it to user-launched processes.)
I have to wonder if Ubuntu's prescriptive stance on things like this is becoming increasingly outdated in an age where there's actually a decent out-of-the-box experience for a lot more stuff on Linux. I've long since moved on from using it personally for my devices, but I'm fairly certain my tolerance for spending effort tinkering to get things working like I want is a lot higher than even most Linux users', so it's hard for me to gauge whether the window has moved significantly in that regard for the average Linux user.
It's not just Ubuntu; Arch is just as bad. The primary problem is systemd, which provides an adequate OOM daemon for services, but then all the distributions seem to be using it for interactively launched processes.
If anybody can help me out with a better solution with a modern distribution, that's about 75% of the reason I'm posting. But it's been a major pain and all the GitHub issues I have encountered on it show a big resistance to having better behavior like is the default for MacOS, Windows, or older Linux.
It's funny how you say the way it used to be was better when people always complained about the OOM killer waiting until the system had entirely ground to a halt before acting, to the point some preferred to run with 0 swap so the system would just immediately go down instead.
Regardless, I believe EarlyOOM is pretty configurable, if you care to check it out.
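For example, earlyoom's thresholds and kill preferences are set via flags, typically in `/etc/default/earlyoom` (the path varies by distro, and the percentages and process names here are just illustrative):

```shell
# Kill the largest process when both free RAM and free swap drop below 5%,
# but avoid picking the shell or terminal multiplexer as the victim
EARLYOOM_ARGS="-m 5 -s 5 --avoid '(^|/)(tmux|bash|sshd)$'"
sudo systemctl restart earlyoom
```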
Thanks for the EarlyOOM pointer, it's one that I found (from HN) on my investigation of why an entire process group was getting killed rather than single processes.
The problem is not that OOM killing happens earlier under memory pressure; the problem is what gets killed. Previously, the offending process would get killed. Now it's an entire cgroup. So if you are using process isolation to run a batch of computation jobs, each of which takes a different amount of memory that isn't foreseeable until runtime, the OOM killer takes out the batch manager, its shell, and everything else. The process can't know ahead of time if it's taking too much memory, because allocations never fail, and the process itself shouldn't be monitoring the rest of the system to make runtime decisions to quit. The entire batch of jobs is killed, rather than a single process dying (as happens for any number of errors) while the rest of the batch continues. In fact, without interacting directly with systemd-run to create a new cgroup, it's impossible to monitor WTF happened to your process because of this new "nuke it from orbit" behavior.
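The workaround alluded to above, giving each job its own transient cgroup so only that job is eligible to be killed, is a one-liner with systemd-run (`./heavy_job` is a stand-in for your actual batch worker):

```shell
# Run the job in its own transient scope with an explicit memory cap; an OOM
# kill then takes out only this scope, not the launching shell's cgroup
systemd-run --user --scope -p MemoryMax=8G ./heavy_job
```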
During my searches on this another common error case is in an IDE where one process goes wild and takes too much memory, and then the whole IDE gets killed silently instead of single process killing allowing the app to save state.
This is a very fundamental change to how Linux has worked. It's a novel concept unfamiliar to long-time users (who actually knows about cgroups or uses them extensively, except people heavily into containerization?), and workarounds for the behavior require introducing a heavy dependency on systemd in order to get basic functionality, making my code far less portable. I can understand being dependent on GNU, and on some Linuxisms in syscalls, but changing the basic semantics of launching new processes such that new code dependencies are needed for intricate cgroup control, well, that's a bit much for me. Leave systemd-oomd to manage cgroups and containers, but having it manage desktop apps and standard Unix process launching leads to bad code.
Because the entire cgroup gets killed rather than individual processes, there's zero trace left. When I first encountered it I was running a multi-day compute pipeline in tmux, and I saw my compute pane gone and thought I must have accidentally nuked the entire pane, killing the job. A few more attempts and I finally realized it wasn't me, and I checked journalctl to find out it was OOM killed. But I couldn't for the life of me figure out why the shell got killed too; what's the point of killing a process with tiny memory usage? Turns out that is the desired behavior of systemd, and thus of many distributions now.
I find it interesting how many people have Ubuntu in mind when it comes to a Linux desktop, when it hasn't been a great experience ever since they switched to GNOME. They don't really care about the desktop anymore. They are now a corporation that is enshittifying their product with things like snaps.
If you want a distro that really cares about the desktop experience today, try Linux Mint. Windows users seem to adapt to it quite quickly and easily. It's familiar and has really good defaults that match what people expect.
This is really annoying me as well. I use a program for work that can occasionally use a lot of RAM, while saving or interpolating for example. On my little MacBook Air with just 8GB of RAM everything works fine; it just swaps a whole lot more for a short period. On my desktop with 16GB of RAM and Ubuntu, the OOM killer just kills it. My workaround is the swapspace package, which adds swap files under high load. Works so far.
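If you'd rather provision swap up front than rely on the swapspace daemon, a static swap file is a few commands (the 8G size is just an example; note that on btrfs the file needs special handling, e.g. disabling copy-on-write):

```shell
# Create, lock down, format, and enable an 8 GiB swap file
sudo fallocate -l 8G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
```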
It sounds like your primary issue is that you have a severe RAM deficiency for what you're trying to use your machine for. Any OOM killer, be it the kernel's per-process one or systemd-oomd's per-service one, only exists to try to recover from an out-of-memory scenario where the alternative is to kernel panic (in the case of the kernel's oom killer) or for the system to completely lock up (in the case of systemd-oomd).
My primary issue is that a system that did an OK job at dealing with low memory situations has been replaced with a completely inadequate system.
If your solution is "don't ever run out of memory" my solution is "I won't ever use your OS unless forced to."
Every other OS handles this better, and my work literally requires pushing the bounds of memory on the box, whether it's 64GB or 1TB of RAM. Killing an entire cgroup is never an acceptable solution, except for the long-running servers that systemd is meant to run.
As far as I know, Windows just grinds to a halt entirely, system processes start crashing, or you get a BSOD, and mobile OSes kill the app without any trace. I never had an OOM situation on Macs so I don't know about macOS.
Windows is unstable even if you have more than enough memory but your swap is disabled, due to how its virtual memory works. It generally behaves much worse than others under heavy load and when various system resources are nearly exhausted.
There are several advanced and very flexible OOM killers available for Linux, you can use them if it really bothers you (honestly you're the first I've seen complaining about it). Some gaming/realtime distros are using them by default.
Even NT4 handled OOM scenarios better than modern Linux. No, it didn't grind to a halt, it would grind the rust off of the spinning platters. But it would continue to run your applications until the application was finished or you intervened.
The kernel OOM killer has never done an adequate job for me. It tends to hesitate to kill anything until the system has literally been completely 100% unresponsive for over half an hour. That's completely unacceptable. Killing a cgroup before the system becomes unresponsive is a million times more desirable default behaviour for a normal desktop system (which Ubuntu Desktop is).
Of course, if it's absolutely not compatible with your work, you can just disable systemd-oomd. I'm wondering though, what sort of work are you doing where you can't tune stuff to use 95% of your 1TB of memory instead of 105% of it?
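For completeness, opting out or tuning is straightforward (the 80% figure below is just an example threshold):

```shell
# Option 1: disable systemd-oomd entirely, falling back to the kernel OOM killer
sudo systemctl disable --now systemd-oomd.service

# Option 2: keep it, but only let it act under higher memory pressure,
# via a drop-in for the user slice containing e.g.:
#   [Service]
#   ManagedOOMMemoryPressureLimit=80%
sudo systemctl edit user@.service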
Hah, I actually got a new personal 64GB box just for some experiments I was doing where I needed a 33GB dataset in RAM at a time. Fortunately eBay makes this sort of bearable, especially using last generation's tech.
A few months ago I almost did an eBay order for 1TB of RAM for a different project, but chickened out because of how far RAM prices had risen recently, and I thought they'd probably go down. How wrong I was!
No, it really isn't. I use Ubuntu daily and not a month passes without a serious issue that no normal user would be able to solve. Last week my desktop completely froze again (as it often does) and Linux broke a mounted NTFS partition when I shut off the power.
What would that be useful for? A properly implemented software passkey like Keepassxc would be secure against anything short of a local root exploit. A TPM would not really help against that either.
Passkeys were designed to be bound to a device. Also, I don't like using browser extensions as password managers, or connecting a password manager to the browser via an extension; that has been shown to be unsafe many times.
The personal desktop has fallen in relevance enough for that to be possible. The goalposts moved: now Linux needs phone, tablet, and laptop with smooth, effortless integration between them all.
I recently switched to using a thumb drive to transfer files to and from my phone/tablet; I became demoralized when faced with getting it all set up.
KDE has phone and laptop integrated well enough for me. It's worth giving it a try but the more devices you want integrated the more of a risk it is in case it doesn't quite work right. But I've got enough other devices in the house which I can't put KDE on (work laptop, Windows machine I need for some specific software) that I can recommend https://github.com/9001/copyparty over thumb drives.
I actually intended to set everything up but did not have time and needed to copy some files, so I dusted off a thumb drive. I am liking it quite a bit and I think I prefer it to the alternatives.
YOTLD has nothing to do with my needs and wants, and I am perfectly happy with my thumb drive and the weird little ways Linux imposes itself on my life.
Spoken like a true Windows UX aficionado. Who doesn't love multiple system settings apps, a mix of minimal new context menus and overcrowded legacy context menus just one more click away.
Only true if those inconsistencies actually matter to your workflow. Not going to deny that they exist, obviously, but their impact is largely overplayed (and gratuitously downplayed on Windows, in my experience).
This is just not true anymore. The only things that don't work anymore are a few AAA titles that use particular types of anti-cheat systems that rely on Windows kernel drivers (League of Legends is one that comes to mind).
If I remember correctly, after the Crowdstrike BSOD-all-windows-instances update last year Microsoft wanted to make some changes to their kernel driver program and these anti-cheat measures on Windows might need to find a new mechanism soon anyway. That's a long way of saying, it's plausible that even that last barrier might come down sooner rather than later.
I probably sound like a hipster or something, but it really feels like many people are finally catching up to the way I've been working for the past 15 years.
I fully switched to GNU/Linux back then and have never looked back. Initially I was quite evangelical but got tired of it and gave up probably around 10 years ago thinking "oh well, their loss". But slowly more and more of the world has switched over, first servers, then developer workstations and now finally just "normal" users.
Similarly, I've always been hugely invested into my tools and have a strong distaste for manual labour. I often watch how others work and can't believe how slow and inefficient it is. Typing up repetitive syntax every time, copy/pasting huge blocks of code, performing repetitive actions when booting their PC etc. I simply haven't been doing this for my whole career, I've been writing scripts, using clever editors, using programming languages to their fullest etc.
I think this is why LLMs don't seem like such a huge breakthrough to me as they do to others. I wasn't doing this stuff manually before, that's ridiculous. I don't need to generate globs of code because I already know how to get the computer to do it for me, and I know how to do that in a sustainable and maintainable way too. It's sad that LLMs are giving people their first real sense of control, when it's actually been available for a very long time now, and in a way that you can actually own it, rather than paying for a service that might be taken away at any moment.
What amazes me is that on Steam they no longer make the distinction (in the standard library view) between Windows and Linux: every game is assumed to launch on Linux, using Proton behind the scenes if needed. There's still a "Linux games" toggle, but now every game appears ungrayed by default.
And it mostly works! At least for my games library. The only game I wasn't able to get to work so far is Space Marine 2, but on ProtonDB people report they got it to work.
As for the rest: I've been an exclusive Linux user on the desktop for ~20 years now, no regrets.
Linux is not suitable for the average user. I use Xubuntu on all my old computers, but I am 100% sure a normie would not tolerate the tedium of it. People want shiny icons with animations and a bunch of garbage on their computers to make them feel they are doing something. Linux is too static for that.
If I have an issue with an application, or if I want an application, I must use the terminal. I can't imagine a Mac user bothering to learn it. Linux is for people who want to maximize the use of their computer without being spied on and without weird background processes. Linux won't die, but it won't catch Windows or Mac in the next five decades. People are too lazy for it; forget about learning. I bet you $100 that 99% of the people on the street have never even seen Linux in their lives, nor heard of it. It is not because of marketing; it is because people who tried it returned to Windows or Mac after deciding that installing a driver or an application was too hard to learn.
I wouldn't recommend Xubuntu for the average user. What you feel is about Xubuntu, not Linux. Normies are doing well adapting to Linux Mint. It's easy for Windows users to get used to within a few days and it has sane defaults that match what users expect. It just works.
Using an Android tablet vs. a Linux PC is a completely separate thing. Not even close. I am talking about a user who knows Linux and uses it at home for personal PC purposes. Such a user is non-existent.
They still don't know what Linux is, nor do they use it intentionally. Chromebook's OS is no different than an Android OS. You seem to want to argue by just throwing random things. Good luck. :)
Still can't play big titles with anti-cheat, like Call of Duty. The only reason I'm stuck on Windows on the gaming PC is that I'm somewhat addicted to BO7 zombies.
Linux has been good for years. The only thing that's changed is that Valve put a bunch of effort into Proton so now Linux has enough game titles for that to no longer be an excuse to not switch.
I've been using Linux full-time (no other OSes at all) for nearly 20 years. Went through all my university education using only Linux. It's problem free if you use it like a grandma would (don't mess with the base system) and even if you mess with it, most things are easily reversible.
That being said, I have noticed that the newfound interest in Linux seems to be a result of big tech being SO abusive towards its customers that even normies are suddenly into computing "freedom".
Deluding ourselves here; it's nice to think the day has finally arrived, but it hasn't.
We need to address the problems rather than pretending it's already great.
Shutting down your laptop and having to wait five minutes for systemd to shut down because of some timeout, right when you need to catch your flight, is just one of the reasons you end up going back to Windows.
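For what it's worth, the five-minute shutdown stall is usually one unit ignoring SIGTERM until `DefaultTimeoutStopSec` (90 seconds by default) expires, sometimes repeatedly; that ceiling can be lowered system-wide. A sketch of a drop-in, with an arbitrary value:

```ini
# /etc/systemd/system.conf.d/10-stop-timeout.conf (sketch; the
# directory may need to be created). Caps how long systemd waits
# for a stubborn service before sending SIGKILL at shutdown.
[Manager]
DefaultTimeoutStopSec=15s
```

The console message at shutdown ("A stop job is running for ...") names the offending unit, which is usually a better lead to chase than lowering the timeout blindly.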
I think you have decided to bury your head in the sand, if you think systemd is a dealbreaker for desktop use.
Truth is, Linux has been fine for many people for a while now. I've used it for six years without needing to install Windows for anything. Steam Deck owners are putting in millions of hours playing the hot new Windows releases, do you think they are complaining about systemd on Reddit?
> Deluding ourselves here it's nice to think the day has finally arrived but it hasn't
Apart from systemd, tell me what isn't great. The whole benefit of the ecosystem is that you can pick and choose your system components, and if you don't like any of them, build your own. I am not trying to be dismissive; if I use Windows these days, it is for work and some errant VM to run a Windows-specific app (H&R Block comes to mind; I am just not willing to spend time running it any other way).
My company had an onsite where presenters were plugging their computers into a projector via HDMI. I watched seven individuals go up with their MacBook Pros, plug the cable in, then get confused why nothing was showing on the screen. Seven times in a row, someone had to run onto the stage and accept the permissions dialog on the Mac allowing the user to share their screen via HDMI. When the eighth presenter took the stage, he plugged the HDMI cable into his laptop running Linux, and the screen immediately output an image. Linux just works.
I switched full-time to Linux in 2022, when Elden Ring launched with better performance in its first week on Linux than on Windows. I personally switched over to KDE Plasma powered by Arch. The first thing I noticed was how polished it was immediately. So I'd push the title of the article one step further: Linux has been good for a while.
In the four years since switching over from a dual Windows (for gaming) and Mac (for programming, web browsing, and everything else) setup, I have had an almost entirely better experience in every single area. I still use macOS daily for work, and it is constantly driving me mad. For every task I have thrown at it (from gaming to programming to game dev to photo editing), Linux just works.
On Mac it's more like, "there's an app for that". I have third party package managers on Mac. I use a third party app to display if my internet connection is using Ethernet. It yells at me to delete the CSV file that I created and requires an instruction manual with instructions for the Settings app that have changed three times in three years for how to open the file, add Bluetooth to the menu bar, etc. It even had a permanent red icon on the Settings about not being signed into an Apple ID. And once I signed in, the Settings app has a permanent red icon about paying for Apple Care. My parents have made comments about how they're worried as they get older that they won't be able to keep up with the constant updates and changes to macOS and iOS.
I don't have much to say about Windows besides good riddance. It was far less confusing to use than macOS but was filled with too much bloat and pop up notifications.
The final thing I'll mention is that the first time my girlfriend used my computer, she sat down, opened the browser, and completed her task. She thought that she was using Windows and was able to navigate the new interface without having to spend any time learning anything. For her regular use case of using the PC for an internet browser, Linux just worked. She even asked me afterwards to install it on her laptop to replace Windows! I can't believe we're in a world where that's asked by someone non-technical who just wants a computer to get out of their way so that they can perform their tasks.
Unless one has a rack of older GPU hardware that uses an abandoned, EOL NVIDIA kernel driver that is difficult to install past kernel 6.12.x. Then one faces the harsh reality of Windows users rightfully laughing at a perpetually-beta Linux OS: Win11 still boots with the older drivers, while the dkms build randomly implodes at some point.
People dual-boot for very good reasons, as kernel churn is not necessarily progress in FOSS. Linux is usable by regular users these days, but "good" is relative to your use case. YMMV =3
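One blunt workaround on Debian-family systems, if the legacy NVIDIA dkms module last built against 6.12.x, is to hold the kernel rather than fight the driver. A sketch of an apt pin; the package glob and version here are illustrative:

```
# /etc/apt/preferences.d/pin-kernel (sketch for Debian/Ubuntu)
# Keeps apt from upgrading past the last kernel the legacy
# NVIDIA dkms module still builds against.
Package: linux-image-* linux-headers-*
Pin: version 6.12.*
Pin-Priority: 1001
```

(`sudo apt-mark hold <kernel-package>` is an even blunter alternative.) It trades kernel security updates for a working desktop, which is exactly the kind of tradeoff being complained about above.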
Hey, macOS obviously still sucks ass, but it sucks ass less than all the other options. The userland situations for Linux and Windows are both horrifyingly, incomprehensibly bad. Just look at their reliance on a single fucking key, "control", for all operations, when that was always an inexplicably bad choice. You have so many fucking keys; why double dip when it de facto cripples both CLI and GUI contexts?
Nope, sorry. macOS sucks more than every other option. It's why it has a smaller PC install base than Windows, and it's why macOS is treated like a second-class citizen for server software. Apple's approach is dying because it cannot compete on equal footing; it's critically dependent on third-party support that simply isn't materializing. In the same fashion as Minix and BSD before it, XNU is being smothered by its sole beneficiary.
Also, FYI, "userland" does not mean "desktop software" exclusively. It extends to the user's runtime, the stability and redundancy of their dependencies, and the completeness of its drivers and SDKs. macOS is not a stable runtime, static-links almost every dependency, and has woefully incomplete driver support for third-party hardware. It is the worst of both worlds from Linux and Windows, with a few innovative roadblocks that only Apple could universally enforce.
I would be 100% off Windows if it weren’t for Adobe Suite and Ableton Live not being ported to Linux. I’m guessing both of these companies are avoiding it not for technical reasons but because Linux is a support nightmare given all of the distros and variations of the platform.
What makes Linux a viable desktop for so many people now is the fact that they don’t need to run very much software anymore. It runs Chrome so you’re good.
Tried to switch to Linux plenty of times over the past few decades; this year it finally stuck. I can confidently say I'll never install Windows again. Everything pretty much just works, and any issues I've had have been quickly resolved with the help of LLMs.
I've been giving Linux a go as a daily driver for a few months.
I tried Cinnamon, and while it was pleasantly customizable, the single-threadedness of the UI killed it for me. It was too easy to do the wrong thing and lock the UI thread, including with several desktop or tray Spices from the official repo.
I'm switching to KDE. Seems peppier.
The biggest hardware challenge I've faced is my Logitech mouse, which is a huge jump from the old days of fighting with Wi-Fi and sound support. Sound is a bit messy, presenting a plethora of audio devices that would be hidden under Windows (like digital and analog options for each device), and occasionally digital-vs-analog compatibility will be flaky in a game or something, but I'll take it.
The biggest hassle, IMHO, is still installing non-repo software. So many packages offer a flatpak, and a snap, and build-from-source instructions where you have to figure out the local package names for each dependency, and one .deb for each different version of Debian and its derivatives; it's just so tedious to figure out which is the right one.
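The codename-matching chore above can be sketched in a few lines; the package name `foo` and the filename scheme are hypothetical, but `VERSION_CODENAME` is the field real `/etc/os-release` files carry on Debian and its derivatives:

```python
# Sketch: pick the right vendor .deb by matching VERSION_CODENAME,
# the way a download helper might. SAMPLE_OS_RELEASE stands in for
# the contents of /etc/os-release; "foo" is a hypothetical package.
SAMPLE_OS_RELEASE = '''NAME="Ubuntu"
VERSION_ID="24.04"
VERSION_CODENAME=noble'''

def pick_deb(os_release: str, package: str, version: str) -> str:
    # Parse KEY=VALUE lines, dropping any optional quotes.
    fields = dict(line.split("=", 1)
                  for line in os_release.splitlines() if "=" in line)
    codename = fields["VERSION_CODENAME"].strip('"')
    return f"{package}_{version}_{codename}_amd64.deb"

print(pick_deb(SAMPLE_OS_RELEASE, "foo", "1.0"))
# → foo_1.0_noble_amd64.deb
```

On a real system you would read `/etc/os-release` instead of the sample string; the point is only that this lookup is exactly the busywork the comment is describing.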
Sadly the project feels semi-abandoned. No new releases so I had to build it from source. Also the PR board seems to be ignored (one or two of those are mine - I tried to fix the misleading button labels).
Can I run Solidworks on Linux yet? Excel? Labview? Vivado? Adobe products? Altium Designer? (Matlab is mostly yes) Not everybody is just writing Javascript and PHP.
Can I get a laptop to sleep after closing the lid yet?
Not that long ago the answer to these questions was mostly no (or sort of yes... but very painfully)
> Can I get a laptop to sleep after closing the lid yet?
> on windows all of this just works
Disagree on the sleep one - my work laptop doesn’t go to sleep properly. The only laptop I’ve ever used that behaves as expected with sleep is a macbook.
Not sure why you're insinuating that I dislike Apple products. My personal MacBook Air doesn't have this issue, and most of my household is on Apple.
I'm also seeing results for "macbook pro doesn't go to sleep when lid closed", so other people see this problem too. You can't really claim that other platforms have them beat here if there isn't data to support the claim.
> Not sure why you're insinuating that I dislike apple products.
Your comment was written in a manner that echoes the same anti-Apple bias that's frequently found on HN. If that's not you, then it's just a misread on my part.
> You can't really claim that other platforms have them beat here if there isn't data to support the claim.
I can, because by and large those are still anecdotal experiences posted online. The deeper integration of OS/hardware due to Apple controlling the entire chain has made sleep mostly a non-issue; it's typically a misbehaving application that might prevent it. There are valid reasons an app might need to do that, so it's not like macOS is going to prevent it - but if sleep's not working right on macOS, it's typically a user error.
This is different from Linux (and Windows, to a lesser degree) where you have a crazy amount of moving parts with drivers/hardware/resources/etc.
Macs do sleep well, when they manage to sleep. Sometimes macOS takes issue with certain programs, the last stack I used at work had a ~50/50 chance of inhibiting sleep when it was spun up.
All in all, I've given up on sleep entirely and default to suspend/hibernate now.
A buggy program preventing sleep is a bug in that program, not a mark on the overall support and reliability of sleep functionality in macOS.
There are valid reasons why a program might need to block sleep, so it's not like macOS is going to hard-prevent it if a program does this. Most programs should not be doing that though.
Still no big CAD names that I'm aware of (annoyingly), Libre Calc works fine for me as an Excel alternative, I have used Matlab on it but not recently, not sure on the others.
Laptop sleep and suspend can still be finicky unfortunately.
I will say my experience using CAD or other CAE software on Windows has gotten progressively worse over the years, to the point that FEA is more stable on Linux than on Windows.
We do really need a Solidworks, Creo or NX on linux though. My hope has been that eventually something like Wine, Proton, or other efforts to bring windows games to linux will result in us getting the ability to run them. They are one of the last things holding me back from fully moving away from windows.
These are all pretty niche products at this point. For the true professionals that need these tools they're stuck but most people can find reasonable alternatives for their hobby or side hustle.
I hear you, and I also value Excel and a few other products, but I hit my personal limit with Windows enshittification early last year and changed my daily driver at home to Linux.
I added a couple of VMs running Windows, Linux, and whatever else I need in Proxmox with xrdp/RDP and Remmina, and it's really the best of both worlds. I travel a good deal, and being able to remotely connect and pick up where I left off, while also not dealing with Windows nagware, has been great.
I get that people are tired of Year of Linux on Desktop, but I feel like last year it actually started happening for real, mostly due to Arch, which is not what I ever expected.
On one hand we have Steam, which will make thousands of games available on an easy-to-use platform based on Arch.
For developers, we have Omarchy, which makes the experience much more streamlined, pleasant, and productive. I moved both my desktop and laptop to Omarchy and keep one Mac laptop. It is a really good experience; not everything is perfect, but when I switch to the Mac after Omarchy, I often discover how hard the Mac is to use, how many clicks it takes to do something simple.
I think both Microsoft and Apple need some serious competition, and again, it came from Arch, which turned out to be more stable and serious than Ubuntu.
My main joy of Linux is having a tiling window manager, and having one machine on which I can both play games and work, which I could never make happen on Windows.
Well, unless it is a native ELF/Linux game, you get shabby compatibility that is not reliable over time and zero official technical support (which is legally required for paid games); Proton (Wine), ewwwww...
The only sane options: either a "correct" set of native ELF/Linux binaries, or Proton = 0 bucks (namely only free-to-play, with 0 cents in any microtransactions).
Linux can't be a good desktop, almost by definition.
1. Look at commercial desktop OSes (Windows, MacOS). They spend hundreds of millions to develop and maintain the OS, do updates, quality assurance testing, working with hundreds of thousands of hardware vendors and enterprises, etc, just to try to not break things constantly. This is with "an ecosystem" that is one stack developed by one company. And even they can't get it right. Several Linux-Desktop companies have tried to mimic the commercial companies by not only custom-tailoring their own stack and doing QA, but sometimes even partnering on certified hardware. They're spending serious cash to try to do it right. But still there's plenty of bugs (go look at their issue trackers, community forums, package updates, etc) and no significant benefit over the competition.
2. There is no incentive for Linux to have consistency, quality, or a good UX. The incentive is to create free software that a developer wants. The entire ethos of the OSS community is, and has always been, I want the software, I make the software, you're welcome to use it too. And that's fine, for developers! But that's not how you make something regular people can use reliably and enjoyably. It's a hodge-podge of different solutions glued together. Which works up to a point, but then...
3. Eventually Linux desktop reaches a point where it doesn't work. The extra buttons on the new mouse you bought don't work. Or the expensive webcam you bought can't be controlled because it requires a custom app only shipped on Windows/Mac. Or your graphics card's vendor uses proprietary firmware blobs, causing bugs only on Linux for unknown reasons. Or your speakers sound like crap because they need a custom firmware blob loaded by the commercial OSes. Or your touchscreen can't be enabled/disabled because Wayland doesn't support the X extensions that used to allow that to work with xrandr. Or you need to look up obscure bootloader flags, edit the bootloader, and restart, to enable/disable some obscure fix for your hardware (LCD low-power settings, ACPI, disk controller, or any of a thousand other issues). Or, quite simply, the software you try to install just doesn't work; random errors pop up, things don't work, and you don't know why. In any of these cases, your only hope is... to go on Reddit and ask for help from strangers. There's no customer support line. Your ISP ain't gonna help you. The Geek Squad just shrugs. You are on your own.
And this is the most frustrating part... the extremely nerdy core fan-group, like those on HN or Reddit, who are lucky enough not to be experiencing the problems unique to Linux, gaslight you and tell you all your problems are imagined or your fault.
> Linux can't be a good desktop, almost by definition.
By your problem statements 1, 2, and 3, there is never likely to be a great desktop OS. The best that we will ever have is a compromise (no cyber pun intended).
1) every OS is buggy, 2) every OS is a hodge-podge, and 3) users end up yelling at clouds and are then forced to upgrade to the next version of frustration.
> who are lucky enough not to be experiencing the problems unique to Linux, gaslight you
It isn't just luck: people use Linux every day to do their jobs and pursue their interests. But if no GNU/Linux distro works for your uses, you have whatever commercial OS you are currently using to meet your needs.
As for actual gaslighting: yikes, I hope large groups of people are not conspiring to ruin your day. I personally react in a similar way when corporations tell me "please wait, your call is important to us, our menu options have changed."
It is interesting and fascinating to see the growth of Linux.
As many have pointed out, the biggest factor is obviously the enshittification of Microsoft. Valve has crept up in gaming. And I think it's understated how incredibly nice the tiling WMs are. They really do offer an experience that is impossible to replicate on Mac or Windows, both aesthetically and functionally.
Linux, I think, rewards the power user. Microsoft and Apple couldn't give a crap about their power users. Apple has seemed to devolve into "Name That Product Line" fanboy fantasy land and has lost all but the most diehard fans. Microsoft is just outright hostile.
I'm interested to see what direction app development goes in. I think TUIs will continue to rise in popularity. They are snappier and overall a much better experience. In addition, they work over SSH. There is now an entire overclass of power users who are very comfortable moving around in different servers in a shell. I don't think people are going to want to settle for AI SaaS cloudslop after they get a taste of local-first, and when they realize that running a homelab is basically just Linux, I think all bets are off as far as which direction "personal computing" goes. Also standing firmly in the way of total SSH app freedom are iPhone and Android, which keep pushing that almost tangible utopia of amazing software frustratingly far out of reach.
It doesn't seem like there is a clear winner in the noob-friendly distro category; they're all pretty good. The gaming distros seem really effective. I finally installed Omarchy, having thought "I don't need it, I can rice my own Arch", etc., and I must say the experience has been wonderful.
I'm pretty comfortable at the "cutting edge" (read: with all my stuff being broken), so my own tastes in OS have moved from Arch to the systemd-free Artix or OpenBSD. I don't really see the more traditional "advanced" Linuxes like Slackware or Gentoo pulling much weight. I've heard interesting things about users building declarative Nix environments and I think that's an interesting path. Personally, I hope we see some new, non-Unix operating systems that are more data- and database-oriented than file-oriented. For now, OpenBSD feels very comfortable; it feels like I have a prayer of understanding what's on my system and that I learn things by using it, the latter of which is a feature of Arch too. The emphasis on clean and concise code is really quite good, and serves as a good reminder that, for all the "memory safe" features of these new languages, it's tough to beat truly great C developers for code quality. If you're going to stick with Unix, you might as well go for the best.
More and more I find myself wanting to integrate "personal computing" into my workflow, whether that's apps made for me and me alone, Emacs lisp, custom vim plugins, or homelab stuff. I look with envy at the smalltalks of the world, like Marvelous Toolkit, the Forths, or the Clojure based Easel. I really crave fluency - the ability for code to just pour out - none of the hesitation or system knowledge gaps which come from Stack Overflow or LLM use. I want mastery. I've also become much more tactical on which code I want to maintain. I really have tried to curb "not invented here" syndrome because eventually you realize you aren't going to be able to maintain it all. Really I just want a fun programming environment where I can read Don Knuth and make wireframe graphics demos!
I have a Windows 11 PC strictly for gaming. Nearly every time I interact with Windows, it infuriates me with garbage code, Microsoft business BS, and anti-privacy. I'd love to switch, but has Linux gaming solved the anti-cheat requirement issue? Do Epic and EA games work on Linux?
I also play a decent amount of Flight Simulator 2024 and losing that is almost a non-starter for switching.
Anti-cheat is not a Linux issue; it's a developer issue.
It seems facially easy to solve: pair players with the type of game they want.
Turn on anti-cheat if you want to join no-cheat sessions.
If you want a cheat game, turn off anti-cheat and you join sessions with other cheating players.
The whole dilemma comes out of malignant users who enjoy destroying other users' ability to enjoy the game.
Go nuclear on clients that manage to join anti-cheat sessions with cheats turned on.
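Mechanically, the pairing idea above is just partitioning the matchmaking queue by an opt-in flag; a minimal sketch, with all names illustrative rather than any real game's API:

```python
# Sketch of opt-in matchmaking: players who enable anti-cheat only
# ever get grouped with each other; everyone else plays in the
# anything-goes pool. Player/split_queue are hypothetical names.
from dataclasses import dataclass

@dataclass
class Player:
    name: str
    anticheat_enabled: bool

def split_queue(queue):
    """Partition the queue into protected and open-play pools."""
    protected = [p for p in queue if p.anticheat_enabled]
    open_play = [p for p in queue if not p.anticheat_enabled]
    return protected, open_play

queue = [Player("ana", True), Player("bob", False), Player("cy", True)]
protected, open_play = split_queue(queue)
print([p.name for p in protected])   # → ['ana', 'cy']
print([p.name for p in open_play])   # → ['bob']
```

The hard part in practice is of course the "go nuclear" enforcement step (verifying the flag is honest), not the partitioning itself.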
The article's title - and the original title of the submission - was specific, bold, and contained a call to action. The new title is bland and unspecific (Linux has been "good" for servers for decades now).
Please revert this submission to use the correct title.
It's good until you boot your system and end up with an unrecoverable black screen that messes up your day of work for no good reason. Linux is free if you don't value your time.
You can't really make blanket statements like this about "Linux" in general because it depends on what distro you use. For example, in NixOS to fix this type of problem all you have to do is rollback to a previous configuration that is known to work. I've not used it, but I believe Arch has something similar.
Even with imperatively configured distros like Ubuntu, it's generally much easier to recover from a "screen of death" than in Windows because the former is less of a black box than the latter. This means its easier to work out what the problem is and find a fix for it. With LLMs that's now easier than ever.
And, in the worst case that you have to resort to reinstalling your system, it's far less unpleasant to do that in a Linux distro than in Windows. The modern Windows installer is painful to get through, and then you face hours or days of manually reinstalling and reconfiguring software which you can do with a small handful of commands in Linux to get back to a state that is reasonably similar to what you had before.
I spent years (maybe a decade) without seeing them in the Windows 7 and early 10 era, but in the last few years I have them sometimes. Many seem Nvidia-related, but I also remember some due to a bad update that broke things in some laptops.
I dunno, I spend less time fighting with any of my several linux systems than the macbook I'm required to use for work, even without trying to do anything new with it. I choose to view this charitably and assume most of the time investment people perceive when switching operating systems is familiarity penalties, essentially a switching cost. The longer this remains the case, the less charitably I'm willing to view this.
You can also mitigate a lot of the "familiarity penalties" by planning ahead. For example, by the time I made the decision to switch from Windows around 15 years ago, I'd already been preferring multi-platform FOSS software for many years because I had in mind that I might switch one day. This meant that when it came time to switch, I was able to go through the list of all the software I was using and find that almost all of it was already available in Linux, leaving just a small handful of cases that I was able to easily find replacements for.
The result was that from day 1 of using Linux I never looked back.
Of course, MS seems to enjoy inflicting familiarity penalties on its established user base every couple of years anyway. After having your skills negated in this way enough times, the jump to Linux might not look so bad.
Not in my experience. I've run both Windows and Linux for the last decade and Windows is the only OS that I ever have problems with updates wasting my time and breaking things. I've been running image-based Linux for the last two years and the worst case is rebooting to rollback to the last deployment. Before that it was booting a different btrfs snapshot.
Fun aside: I had a hardware failure a few years ago on my old workstation where the first few sectors of every disk got erased. I had Linux up and running in 10 minutes: I just had to recreate the EFI partition and regenerate a UKI after mounting my OS from a live USB. Didn't even miss a meeting I had 15 minutes later. I spent hours trying to recover my Windows install. I'm rather familiar with the (largely undocumented) Windows boot process, but I just couldn't get it to boot after hours of work. I gave up, reinstalled Windows from scratch, and recovered from a restic backup.
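That recovery might look roughly like this; a sketch only, assuming an Arch-style live USB, dracut as the UKI builder, and example device names (/dev/nvme0n1p1 for the ESP, p2 for root) that are mine, not the commenter's:

```shell
# From a live USB. Device names and paths below are examples.
mount /dev/nvme0n1p2 /mnt
# Recreate the wiped EFI system partition and mount it where the
# installed system expects it.
mkfs.vfat -F32 /dev/nvme0n1p1
mount /dev/nvme0n1p1 /mnt/efi
# Chroot in and rebuild the unified kernel image.
arch-chroot /mnt dracut --uefi --force
# Re-register a firmware boot entry pointing at the UKI.
arch-chroot /mnt efibootmgr --create --disk /dev/nvme0n1 --part 1 \
  --label "Linux" --loader '\EFI\Linux\linux.efi'
```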
Windows has recently been a complete shitshow -- so even if Linux hasn't gotten any better (it has), it's now likely less painful than fiddling around with unfucking Windows, which has been doing things like deleting all your files.
You can put some work into Windows to slim it down: an unattended-install generator to turn most of the crap off at install time, and then O&O ShutUp10 goes a long way.
There's an ever-growing list of things to do in order to fix Windows, and that list is likely longer than Linux's. This whole "your time is free" argument hinges on Windows not having exactly the same issue, or worse.
A couple of nitpicks with all these articles about trying Linux: they should explain words like "distro" before throwing them around. This one also uses "Debian" without explanation. If the audience really is Windows users, these terms are going to be rather cryptic.
Linux is not good. Some hardware support is still based on reverse engineering, or on the best-effort activity of a few individuals. Linux needs manufacturers' first-hand commitment to quality open source to be truly good.
Linux is not good. Some software doesn't have feature parity across operating systems, with Linux as the software kingdom's poor Cinderella. Linux needs software feature parity to be truly good.
Linux is not good. Because too many mainstream new PCs come with some other operating system pre-installed (and paid for) even if you don't need it. Linux needs freedom of choice from the first power-on, like a stub to download whatever OS you want (and pay for), or to boot from removable media, for Linux to become truly good.
Linux is not good. Because there is still "stuff" that requires some specific non-Linux software, running under some specific non-Linux operating system, to get useful things done. We need manufacturers to ditch this for Linux to become truly good.
I have been a happy user of Linux on my primary PC for 20+ years now. But I still have to fight for my freedom every now and again because of one or more of the points above.
Now let's jump back to gaming.
Linux is not good because the game industry thinks proprietary platforms and operating systems are better for their business. There is only one platform fully supporting Linux, and too few titles. Gaming Linux hardly hits 5% of the market share, basically the same as desktop Linux, while server Linux is beyond 75%.
I think the reasons could be two-fold.
On one hand, the industry doesn't perceive Linux as being as attractive as proprietary platforms. Maybe it can squeeze much more money out of the latter.
On the other hand, it could be that most development resources are NOT ORIENTED towards gaming and desktop, so those markets simply lag behind.
Of course, I could be totally wrong: these are my humble opinions, with some facts backing them.