
Every time I praise WSL on HN I pay the karma tax, but I will die on this hill. WSL is more powerful than Linux because of how easy it is to run multiple OSes on the same computer simultaneously. It's as powerful as Linux with some janky custom local Docker wrappers for device support, local storage mapping, and network mapping. Except it's not janky at all. It's an absolute delight to use, out of the box, on a desktop or laptop, with no configuration required.

Edit: for clarity, by "multiple OS" I mean multiple Linux versions. Like if one project has a dependency on Ubuntu22 and another is easier with Ubuntu24. You don't have to stress "do I update my OS?"
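For anyone who hasn't tried it, the multi-distro setup really is just a couple of commands from a Windows prompt (a minimal sketch; the distro names come from wsl --list --online):

  wsl --install -d Ubuntu-22.04      # a second distro installs alongside the first
  wsl --install -d Ubuntu-24.04
  wsl -l -v                          # list installed distros and their WSL versions
  wsl -d Ubuntu-22.04                # drop into the older environment for the legacy project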



You can accomplish the same with Distrobox on Linux, but there's definitely something to be said about having the best of both worlds by running Windows + WSL.
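For the unfamiliar, the Distrobox version of the same trick looks roughly like this (container names are just examples; the images get pulled from a container registry):

  distrobox create --name ubu22 --image ubuntu:22.04
  distrobox create --name ubu24 --image ubuntu:24.04
  distrobox enter ubu24              # home dir, devices, and the display are shared with the host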

I honestly think Microsoft could win back some mind share from Apple if they:

* Put out a version of Windows without all the crap. Call it "Dev edition" or something and turn off or down the telemetry, preinstalled stuff, ads, and Copilot.

* Put some effort into silicon to get us hardware with no compromises, like the MacBooks.

I'm on Mac now, and I jump back and forth between a Mac laptop and a Linux desktop. I actually prefer Windows + WSL, but ideologically I can't use it. It has potential: PowerToys is fantastic, WSL is great, I actually like PowerShell as a scripting language, and an entire new PC setup can now be done with PowerShell + Winget DSC. But I just can't tolerate the user-hostile behavior from Microsoft, nor the stop the world updates that take entirely too long. They should probably do what macOS, Silverblue, etc. do and move to an immutable/read-only base and deploy image-based updates instead of whatever janky patching they do now.

Plus, I can't get a laptop that's on par with my M4 Pro. The Surface Laptop 7 (the arm one) comes close, but still not good enough.


I'm not saying it's a perfect solution, but with Windows 11 Pro and group policy I was able to disable all of the annoying stuff, and because it is group policy it has persisted through several years of updates. It is annoying you have to do this, and it does take some time to get set up right. But it's a solution.
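For reference, a hedged sketch of what one of those policies looks like under the hood; this is the registry form of "Turn off Microsoft consumer experiences" (Computer Configuration > Administrative Templates > Windows Components > Cloud Content), though whether every edition honors it is another question:

  # run from an elevated prompt; the supported route is gpedit.msc on Pro/Enterprise
  reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\CloudContent" /v DisableWindowsConsumerFeatures /t REG_DWORD /d 1 /f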

That said I'd pay for a dev edition as you described it, that would be fantastic.


You can make your own clean version, legally, with this unattend file generator: https://schneegans.de/windows/unattend-generator

I get that customers and most people don't know about it, but it's kind of ridiculous that techy people in a tech forum don't know how to do it.


> it's kind of ridiculous that techy people in a tech forum don't know how to do it.

Why? HN has traditionally always largely been a macOS and Linux crowd. Why do we have to care about fixing an OS that is broken out of the box (that most of us don't use anyway)?


Because someone cannot make informed comments about the "other" party unless they have a reasonably deep knowledge of it, too.

Far too many Linux users, especially, make fun of Windows, and if you dig a bit you see that most of their complaints are things that are solved with 5 minutes of googling. Some complaints are philosophical, and those I agree with, but even in that case, I'd be curious how consistent they are with their philosophy when, for example, Linux desktop environments do weird things.

Summarizing a bit: Linux users with years or decades of sysadmin-level tinkering experience on Linux frequently make junior-level user complaints about Windows usage, often based on outdated information about it.

I say this as someone who has been using both Linux and Windows for a few decades now and has a fairly decent level of sysadmin skill on both.


I didn't know about this. My knowledge of Windows is very limited. I use it every day for work, but it's managed by our IT and Security departments. It's locked down. You cannot use external drives. You can't install applications yourself, and you can't run unapproved applications. So I learned over the years to never touch anything that hasn't already been approved, even settings. If you want to apply for something to be approved, you can submit a written justification co-signed by your manager. My manager has never rejected anything I requested, but it's a huge hassle. Most of us just don't bother, even developers.


This seems pretty useful, thanks! I had certainly never heard of it.


Thanks for this! I didn’t know this tool existed


There is no flavor of Windows 11 that is acceptable. Even the UI itself is a disaster. A cornucopia of libraries and paradigms from React Native to legacy APIs as if an interdimensional wave function of bad ideas had collapsed into an OS, but with ads.


[deleted]


There are literal ads for apps in the Windows 11 start menu. That's unacceptable. Even if one day they roll it back, it will still have been unacceptable. The fact that anyone ever green-lit that decision is unforgivable for a corporation.

macOS isn't perfect but the issue you point out is more of a docker problem, not an OS one. I'm told https://orbstack.dev/ is the solution.


I hear this a lot, and I do seem to remember that back when I first got Windows 11 I might have seen something stupid like Candy Crush, but I'll be honest, I literally never see ads anywhere in the OS. Truth be told I hardly ever use the start menu since they ruined it, but this complaint about ads everywhere makes it sound like a typical webpage. I just don't see it. Maybe because I'm on Win11 Pro?


Windows LTSC already exists, but Microsoft, in all their wisdom, restricts it to enterprise licensees only, and seems to actively discourage using it as a desktop OS. The first problem is of course fixable with some KMS server shenanigans, but the second can be kinda painful when it comes to keeping drivers up-to-date, installing apps that rely on features LTSC excludes (and doesn't provide an easy way to install manually), etc.

I've often said that if Microsoft had just iterated on Windows 2000 forever I'd probably still be a full-time Windows user. If Microsoft had maintained an LTSC-like Windows variant that was installable from the normal retail installation media and with a normal retail product key (at the very least Pro, but ideally Home), that also likely would have kept me on Windows full-time instead of switching to Linux as my daily driver.


I use Windows 11 IoT Enterprise LTSC, which as far as I'm aware has all the features that Pro has (plus the IoT Enterprise stuff) and zero bloat. I switched to it from my already de-bloated 11 Pro installation (because it removes some telemetry you're normally unable to disable) and have had 0 issues with it. I can't say I activated it using a normal retail product key, however, there are easy solutions to that.


Ya, I totally get that. The way I view it is that Windows is a glorified driver support layer and any actual work I do is almost exclusively in the Linux container.

When I used to have free time it was great for games too


IMO Linux is better even as that glorified driver support layer these days, at least on x86 hardware (I can't attest to anything ARM-based besides various SBCs). I have to fiddle with drivers much more often on Windows (usually because of Windows Update shenanigans, like "lmao let's silently downgrade the manually-installed AMD GPU driver for no fucking reason at all").


> I can't get a laptop that's on par with my M4 Pro.

This is the only reason I have not requested a Windows laptop from my company. WSL is better for Docker development than a Mac can be in basically every way (disclaimer: haven't tried OrbStack yet, heard good things, but my base assumption is it can't be better than WSL2), except it is literally impossible to get hardware as good as the M3 or M4 for any OS other than macOS.


I replaced my M1 with a Snapdragon laptop running Win11 and upgraded that to Pro. For what I do with it, it runs great with very long battery life, for less than Apple quoted to repair the M1. I don't use the Copilot features and haven't seen any ads so far, except maybe for Office during setup.


(Used OS X for 15 years, now Win11)

The biggest difference between OS X and Windows is that Apple adds (some say steals) functionality from the competition and open source, and makes it neat. On Windows, to get something comparable working, you need WezTerm, Everything for search, Windhawk for a vertical taskbar on the right, PowerToys for an app starter, Folder Size for disk management, etc. If you spend a lot of time on it, Win11 can be OK to work with.

If PowerPoint and Affinity worked on Linux, I'd use Linux, though.


Maybe just for your specific preferences. Terminal is plenty fine. Vertical taskbar on the right is straight up user preference. PowerToys for an app starter? Like Alfred? The start search does a decent enough job of that. Folder Size is nice, but enumerating all files is very taxing.


Oh running Ice to wrangle the menu bar app icons or Rectangle to properly manage windows ('cause Apple screwed that one up) must be unnecessary.

Each OS is going to have extension applications to improve on the OOTB experience. This is an invalid argument for choosing one over the other.


>Windhawk for a vertical taskbar on the right

Huh? Windows supports vertical taskbar.


It was removed in Win11, when they rewrote the taskbar to pretend that it's macOS dock (icons centered by default). Today your only options are horizontal taskbar along the top or the bottom edge, and icons aligned left or center.


Last time I checked, Windows 11 lost this capability and 3p solutions like Windhawk are needed. I'd be very happy if they brought this back though, feel free to share a link to some info about how to do it natively.



"natively" is the key word here, this looks like a 3p hack.


That was my impression too.


Outside the US and countries of similar income level, Windows is doing quite alright in mindshare, and it will keep doing that unless Apple stops pretending to be the computer version of audiophile gear.

I on the other hand cannot get an affordable Mac that has the same GPU, disk space and memory size as my workstation class laptop.


To the tech savvy, there is essentially only one advantage to running Windows, and that is the ability to run Windows-only software. In all technical respects - control, performance, flexibility - it is inferior to the alternatives. Don't confuse vendor lockin with technology.

I find it dismaying that people on Hacker News willingly submit to incredibly user-hostile behavior from Microsoft and call it "the best of both worlds". Presumably a nontrivial proportion here are building the next generation of software products - and if we don't even respect ourselves, how likely is it that we will respect our users?


"I find it dismaying that people on Hacker News willingly submit to incredibly user-hostile behavior from Microsoft"

And I find it funny that the crowd that spends whole days implementing user-hostile features in yet another SaaS crapware has so much to say about Microsoft's bad behavior.


There is an additional reason: Some (many?) people simply prefer the Windows UI conventions (once you remove all the enshittifications post Windows 7).


I'm not aware of any particular UI convention that's in Windows that isn't available in, say, Plasma. Day-to-day usage is extremely similar, and where they diverge it's usually because 1) Plasma has a feature that Windows doesn't, or 2) someone at Microsoft opted for senseless change for change's sake: a toy interface is layered over a functional one, often (but not always) grudgingly allowing access to the old behavior with extra steps, in a tacit admission of no-confidence. This behavior is pervasive: the "new control panel", the new context menu ("show more options" to get to the original, an extra click that yields a menu with many of the same options but in a different order with different icons), and best of all moving the "Start button" to the center, a change which more than any other exemplifies the silliness, because it 1) at best achieves nothing, and 2) flies in the face of the original UI research based on Fitts's law that informed 30 years of Windows UI tradition.

I honestly can't imagine anyone preferring all that. </rant>


I don't think Microsoft losing the mind share has anything to do with software. Macbooks are winning the laptop war because of superior hardware.


Only in countries where people earn salaries big enough to pay the Apple hardware tax.


What Apple hardware tax? The MacBook Air is the best-value laptop there is. If the latest version is out of budget, you can buy older generations used. Even an M1 Air would be better than any Windows laptop at a comparable price point.


Yeah, because only being able to afford used stuff is such a great place to be.


Better than buying a new but crap product for sure?


70% of the world doesn't think it is crap.

https://gs.statcounter.com/os-market-share/desktop/worldwide


Superior hardware with terrible software. Also, they straight up artificially limit their hardware so they don't cannibalize their sales, which is slightly understandable, but they do it in the dumbest ways. My SO's MacBook Air can only do one external monitor, even though it has the same specs as her work Pro. Oh, and good luck actually getting that external display to work; I swear only like 50% of USB-C docks work on the platform.


> Superior hardware with terrible software.

Funny how that was the other way around just a few years ago. Macs had inferior hardware, but they were supposed to have better software. At least that's what the Mac users claimed.


I fell for that, years ago. No, the software wasn't superior either. I remember having to manually install codecs, which on Linux had been a problem many, many years before but had already been solved.


> My SO's MacBook Air can only do one external monitor,

The MacBook Air M4 supports two external displays now (with the lid open):

https://support.apple.com/guide/macbook-air/use-an-external-...

> My SO's MacBook Air can only do one external monitor, even though it has the same specs as her work Pro.

The MacBook Pro with the non-Pro/Max chip (i.e. MacBook Pro M3) has the same limitations as the corresponding MacBook Air (i.e. MacBook Air M3).


>Macbooks are winning the laptop war because of superior hardware.

No. This is just you repeating marketing.

No Nvidia chip = B tier at best.

I have a $700 Asus with a 3060 that is better. Go ahead and scale up to a $2000 computer with an Nvidia chip and it's so obviously better, there is nothing to debate.

No one cares about performance per watt; it's like someone ran a 5k race, came in 3rd and said "Well, at least I burned fewer calories than the winner!"


> No Nvidia chip = B tier at best.

Nvidia chip = 45 minutes of battery life


Not the one you're talking to, but I'm a dev that does not need extensive battery life. All my dev computers are desktops.


You know they can be turned on or off depending on need right?


Yes, but a few problems:

1. Turning them on/off à la Bumblebee isn't a solved problem. It's buggy, especially on non-Windows. Even on Windows, it's going to be buggy, especially with regard to sleep.

2. You obviously lose the advantage of an Nvidia GPU that way. If you always have to have it off to get decent battery life, which you do, then it's kind of moot. If you turn it on for your 30-minute workload, then there goes 70% of your battery.


And you can never ever plug it into the power grid because?


You can, I just think it's inconvenient so I favor laptops with better battery. Besides, I almost never find myself being on the go and needing a dedicated GPU.


If you're never on the go you don't even need a laptop to be fair…


Well, I'll have to strongly disagree. You want a laptop whose battery life is not 1 hour at best. That wasn't a thing in Windows/Linux laptops until the M1 came along with arm64. 6 hours of intense work? Good luck with that.

Not only that, but being able to run very intensive work (Pro Audio, Development...) seamlessly is an absolute pleasure.

Its screen is one of the best screens out there.

The trackpad (and some keyboards) are an absolute pleasure.

The robustness of the laptop is amazing.

I don't care about the marketing of Apple, I don't buy anything new they launch, and I condemn all of their obscure pricing techniques for the tech they sell. But my M1 is rocking like the first day, after four years of daily use. That's something my Windows laptops have never delivered to me.

Apple has done a lot of things wrong, and I will not buy another Apple laptop in the future, but I don't want Nvidia on a Laptop, I want it to be portable, powerful and durable.

That is changing now, and it's amazing. I want my laptop to be mine, and to be able to install any OS I like. New laptops with arm64 and Intel Lake CPUs are promising, but we're not there yet, at least not in my experience.

Each to their own for sure, and for you, the Nvidia requirement is important. For me it's not about brands, but usability for my work and hobbies.


The 6 hours of real work battery that Apple manages with ARM is genuinely impressive, and finally I think shifted the landscape to take ARM seriously as a CPU for consumers.

But it's just not that big a deal. Sure, I COULD spend a day working without power, but it's 2025 and USB-C power delivery is a mature spec. My desk has power. My work desk has power. My living room has power. My bedroom has power. The coffee shop has power. Airplanes have power. My fucking CAR has power.

Where are you working that you need a full 6 hours of hard working power without occasional access to a power outlet and a battery bank won't meet your needs?

I would be satisfied with 2 hours of hard working battery, which is what Ryzen powered Windows laptops deliver. My girlfriend uses her $800 mid range Ryzen laptop to play games and other power hungry things off charger every single day. It's also what work laptops other than Macs have always provided. Sure, my Thinkpad from 2012 needed a giant tumor of a battery to provide that, but it was always an available option, and you could swap it out for a tiny battery if you really wanted to slim it down.

Never an option in apple land. Battery not good enough? Fuck you, too bad.


I can do 6 hours of work on my 10-year-old ThinkPad… It's nothing special really.


Please tell me which laptop.

Also, is it powerful enough to run a development environment (docker compose/k3s with a DB & cache, IntelliJ/VS Code, etc.) without having issues?

Genuine questions, I am no fanboy of anything


I have a ThinkPad T560 with only 8GB. I develop using Docker and I use Kate with python3-pylsp for completion. And of course the occasional Zoom/Teams.

Instead of slack I normally use localslackirc, so that alone probably saves a ton of battery rather than using the electron one.

When I compile a lot I still manage to get half a day on battery. If I want to save power I just ssh to a server and do everything there :)

edit: that model also has a hot-swappable battery, so if you really, really need more battery life you can buy a spare.


> *You* want a laptop whose battery life is not 1 hour at best.

But why?

I mean, I can see why some want that. But why would I, or most devs in general, want that? I very rarely code on a laptop, and almost never when not at a desk.


Why would I need an Nvidia chip in my laptop?


For some groundbreaking Artificial Intelligence work, obviously.

In reality, he probably just wants to play CS2 :D


This would be fantastic. But Microsoft doesn't have to do this. Their users are captives.


Some of them are.

But the increasing market share of Macs and even Linux these days, plus the ever-increasing OSS initiatives from Microsoft, point to Microsoft knowing that a lot fewer of their users are as captive as they were in the '90s, for example.


More specifically: a lot fewer developers are as captive as they were in the 90's. And while normal users vastly outnumber developers, Microsoft has figured out that those normal users ain't inclined to stick around if those developers jump ship and stop developing for Windows.

In other words, specifically those of a former Microsoft CEO (who understood the problem but not the solution):

DEVELOPERS DEVELOPERS DEVELOPERS DEVELOPERS DEVELOPERS DEVELOPERS DEVELOPERS DEVELOPERS DEVELOPERS DEVELOPERS DEVELOPERS DEVELOPERS DEVELOPERS DEVELOPERS ... YES


Even for regular users, a big chunk of regular users are looking at other platforms:

- "creatives" have always been a core Apple market and they've grown, so that market has grown; plus, since Windows is globally less dominant, a lot of "Photoshop/video editing software/3D modeling + Windows" folks are now on Macs

- gamers now have Proton + Steam on Linux + SteamOS so quite a few more of them are on Linux now, especially since Valve is pushing in this direction to keep Microsoft honest

- a large number of regular office workers have iPhones, especially as you move towards the top of the hierarchy, and are far more tempted than they would have been in the past to try or use a Mac

- in many schools there are now Chromebooks instead of Windows laptops; this is primarily a US thing, but it does pop up in some other places, too

Windows is sort of stable but probably still bleeding users slowly.


There's a dedicated settings page for quickly setting popular dev settings such as showing extensions and full paths. Getting rid of the rest just involves tweaking a few other settings like don't show tips or welcome screen. I also hide the weather and news widget because it's tabloid rubbish but many people seem to love it.
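If you'd rather script it than click through Settings, a hedged sketch of the per-user registry values behind a couple of those toggles (these are the long-standing Explorer value names; sign out or restart Explorer for them to apply):

  # show file extensions
  reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced" /v HideFileExt /t REG_DWORD /d 0 /f
  # show hidden files
  reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced" /v Hidden /t REG_DWORD /d 1 /f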


> nor the stop the world updates that take entirely too long

Interestingly enough, beyond release upgrades, which happen maybe once a year, all or maybe 99% of updates take ~5 minutes of interruption for me, including the needed reboot. I really wonder how others manage to have "entirely too long" updates.


5 minutes is too long. My Debian systems never demand that I update them. When I update them, it never even takes two minutes.


That can't be helped. I go for a smoke and when I come back the system is already upgraded.

I've not been using Debian setups lately, but on Ubuntu, the alert about needs-reboot packages after the daily unattended-upgrades run happens almost every month. I'm fairly sure Debian is on a similar schedule here.


> a version of windows without all the crap

LTSC is a version like that


> "Microsoft doesn't make any release from the Long-Term Servicing Channel available for regular consumers. The company only makes it available to volume licensing customers, typically large organizations and enterprises. This means that individual users cannot purchase or download Windows 11 LTSC from Microsoft's website."

https://www.windowscentral.com/software-apps/windows-11/what...


Just use mas


> without all the crap

as far as MS are concerned, that crap is their business.

Or, possibly, that crap is the multitude of little software empires built by the management layer now in control.


"More powerful than Linux" is silly. It's a VM. The most useful thing is that it does a bunch of convenience features for you. I am not suggesting that it is not extremely convenient, but it's not somehow more powerful than just using Linux.

You know what's even more convenient than a VM? Not needing a VM and still having the exact same functionality. And you don't need a bunch of janky wrapper scripts; there's more than one tool that gives you essentially the same thing. I have used both Distrobox and toolbx to quickly drop into an Ubuntu or Fedora shell. It's pretty handy on NixOS if I want to test building some software in a more typical Linux environment. As a bonus, you get working hardware acceleration, graphical applications work out of the box, there is no I/O tax for going over a 9p bridge because there is no 9p bridge, and there are no weird memory balloon issues to deal with because there is no VM and there is no guest kernel.

I get that WSL is revolutionary for Windows users, but I'm sorry, the reason there's no WSL equivalent on Linux is that we don't need VMs to use Linux. It's that simple...


Yeah, if you are working with Linux only, it's better to go full Linux.

WSL2 is really handy when you want to run other software though. For example, I use Solidworks, so I need to run windows. Forscan for Ford vehicles also has to run under Windows. Having WSL2 means that I can just have one laptop and run any software that I want.


My development is mainly Windows, and I prefer a Linux host with Windows VM guests. The experience is more stable, and I can revert to a snapshot when a Windows or Microsoft product update breaks something, or a new test configuration does. It also lets me back up and retain multiple QA environments that are rarely used, like a client's Oracle DB. It is nice being able to save the VM state at the end of the week and shut it all down so you can start the next week right where you left off. You cannot do that when your development environment is the bare-metal OS. Windows has known issues with waking a sleeping laptop.


I too think a Linux host with Windows VM guests would definitely be more stable, but I can see the other way around being more convenient to get commercial support for. Though with the VMware licensing changes, I think what is by default easier for commercial support options may be changing too.


> Windows has known issues with waking a sleeping laptop.

Doesn't Linux as well?


I'm on a Lenovo Yoga 6, Gentoo, 6.12 kernel, Xfce 4.20. Sleep works perfectly. Same on my Asus+AMD desktop. I've not had sleep-related issues for years. And last time I did, it was an out-of-tree WiFi driver causing the whole mess.


I'm on Ubuntu 25.04, 128GB RAM, pcie 5 SSD, NVIDIA 5080, 9950X3D.

I discovered over the weekend that only 1 monitor works over HDMI, DisplayPort isn't working, and I tried different drivers. Suspend takes a good 5 minutes, and on resume, the UI is either torn or things barely display.

I might buy a Windows license, especially if I can't get multi-screen to work.


Be pragmatic: use the binaries provided by Nvidia and not the ones provided by Ubuntu.

Or use SUSE, the only distro that manages that well. Forget PopOS. Really, either the binaries or SUSE.

If someone else here is entrenched on Arch, do this: https://github.com/Frogging-Family/nvidia-all

If on Fedora, just use the binaries... trust me.

Hope this helps someone.


Try a lower version of the Nvidia driver. The newer version was causing me and folk I work with a lot of problems.


This has been a pain point for us and our development process… not all versions of Nvidia drivers are the same… even released ones. You have to find a “good” version and keep to it, and then selectively upgrade… at least this has been the case the last 5 years, folks shout out if they have had different experiences.

Side note: our main use case is using cuda for image processing.


In my experience Ubuntu has the worst issues with displays of any distro.

To be fair I stay away from NVIDIA to, I would probably run a separate headless box for those GPU workloads if I needed to


Yeah, Ubuntu used to be the distro that "just worked" while nowadays that crown has passed to Fedora.


> In my experience Ubuntu has the worst issues with displays of any distro.

In my experience, it has zero issues. I use nvidia binary build. I have since 2006 through various nvidia GPU's.


Install Pop_OS! for better OOTB NVIDIA support.


Make sure your device is compatible with WSL this way; it's very fragile and prone to breaking.


Ahhh, the famous "Works on my machine!" stamp of truth.


"Works on my machine!" is stupid when it comes to software running under an OS, because a userland program that is correct shouldn't work any differently from box to box. (Exceptions you already know notwithstanding.) It is very different when it comes to an operating system.

I know people here hate this, but if you want a good Linux experience, you need to start by picking the right hardware. Hardware support is far and away the number one issue with having a good Linux experience these days. It's, unfortunately, very possible to even set out to pick good hardware and get burnt for various reasons, like people misrepresenting how well a given device works, or perhaps just simply very similar SKUs having vastly different hardware/support. Still, I'm not saying you have to buy something from a vendor like System76 that specifically caters to Linux. You could also choose a machine that just happens to have good Linux support by happenstance, or a vendor that explicitly supports Linux as an option. I'm running a Framework Laptop 16 and it works just fine, no sleep issues. As far as I know, the sole errata that exists for this laptop is that Panel Self Refresh is broken in the AMDGPU driver. It sorta works, but it's a bit buggy, causing occasional screen artifacts. NixOS with nixos-hardware disables it for me using the kernel cmdline argument amdgpu.dcdebugmask=0x10. That's about it. The fingerprint reader is a little fidgety, and Linux could do a better job at laptop audio out of the box, but generally speaking the hardware works day in and day out. It's not held together with duct tape.

I don't usually bother checking to see if a given motherboard will work under Linux before buying it, since desktop motherboards tend to be much better about actually running Linux well. For laptops, Arch wiki often has useful information for a given laptop. For example, here's the Arch wiki regarding the Framework 16:

https://wiki.archlinux.org/title/Framework_Laptop_16

It's fair to blame Linux for the faults it actually has, which are definitely numerous. But let's be fair here, if you just pick a given random device, there is a good chance it will have some issues.


I recall having a sleep issue with Linux 15 years ago. I think it's been fixed long ago; except maybe on some very new hardware, or if you install the wrong Linux on an M-series Mac, you could have issues with sleep.


I had these issues with Windows, but with Linux Mint it works perfectly.


Not if you don't buy Windows hardware and slap Linux on it.

Unfortunately, most (almost all) hardware is Windows hardware. So far, System76 is the only one that I've had actually work.


The less coupled software is to hardware, the less likely it is tested in that hardware and the higher likelihood of bugs. Linux can run fine but arbitrary Linux distros may not. This is not the fault of hardware makers.


> The less coupled software is to hardware, the less likely it is tested in that hardware and the higher likelihood of bugs.

Yes, exactly! There are whole teams inside Dell etc. dealing with this. The term is "system integration." If you're doing this on your own, without support or chip info, you are going to (potentially) have a very, very bad time.

> This is not the fault of hardware makers.

It is if they ship Linux on their hardware.

This is why you have to buy a computer that was built for Linux, that ships with Linux, and with support that you can call.


Tell me how it's not their fault?


Hardware support is more than just kernel support. Additionally, not every kernel release works well for every piece of hardware. Each distro is unique and ensuring the correct software is used together to support the hardware can be difficult when you are not involved in the distro. This is why vertical integration between the distro and hardware leads to higher quality.


Firmware also plays a huge role these days (fan curves, ACPI, power management, etc.)

But saying it can vary largely by distro is overstating it by a lot. Mostly, distro issues are going to be based on how old their kernels are.

But definitely, modern hardware is much too complex to just slap Linux on Windows hardware (and vice versa).


I have Linux on MacBooks from 6 different years. They all work flawlessly. I also have a Lenovo that works well.

Sorry you have had such bad luck.


I am running ChromeOS with Debian 'slapped on it' and that also experiences sleep-related issues.

Big fan of Linux, but saying that Linux works on system76 while they have a tiny sliver of the Linux market share seems like a nonstarter.


ChromeOS, where sleep presumably worked, is also Linux. You just exchanged a working Linux for a distro with more bugs. The fact that you're able to do that is pretty cool.

That's not to detract from the larger point here though. It's pretty funny that all of the replies in this thread identify different causes and suggest different fixes for the same symptom. Matches my experience learning Linux very well.


Turns out you get what you pay for.

You can either get hardware that works or you can deal with breakage.


System76 seems janky though if you use anything but PopOS


I run Gentoo on all but one of my system76 boxen, and have not seen any jank


You can force it to behave on Linux ;)


Can you share more details of how you make that work well? What hypervisor, what backup/replication, for instance? I can only imagine that being a world of irritation.


It's been a few years since I used it, but Virtualbox (free) had perfectly good suspend/restore functionality, and the suspended VM state was just a file.


I use virt-manager and suspend/restore for the same feature, without using an Oracle product (with all the side effects that brings).


I use libvirt/kvm/qemu. It works fine to do all the things mentioned like snapshots.
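A minimal sketch of the workflow described above (the domain name "win11-qa" is made up):

  virsh snapshot-create-as win11-qa pre-update "before the Windows update"
  virsh snapshot-revert win11-qa pre-update     # roll back if the update breaks something
  virsh managedsave win11-qa                    # park the VM at the end of the week; virsh start resumes it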


> My development is mainly Windows and I prefer Linux host with Windows VM guests

I've tried this in the past but I was unable to get the debugger to work from within a VM.

Has this improved, or is there a trick, or are you just going without a debugger?


In the same spirit of "it depends", there are other options that may work for people with different Linux/Windows balance points:

* Wine is surprisingly good these days for a lot of software. If you only have an app or two that need Windows it is probably worth trying Wine to see if it meets your needs.

* Similarly, if gaming is your thing Valve has made enormous strides in getting the majority of games to work flawlessly on Linux.

* If neither of the above are good enough, dual booting is nearly painless these days, with easy setup and fast boot times across both OSes. I have grub set to boot Linux by default but give me a few seconds to pick Windows instead if I need to do one of the few things that I actually use Windows for.

Which you go for really depends on your ratio of Linux to Windows usage and whether you regularly need to mix the two.
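For the dual-boot route, the "Linux by default, a few seconds to pick Windows" behavior is just a couple of lines in /etc/default/grub (Debian/Ubuntu-style tooling shown; regenerate the config afterwards):

  GRUB_DEFAULT=0     # boot the first menu entry (Linux here) unless told otherwise
  GRUB_TIMEOUT=5     # seconds to interrupt the menu and pick Windows instead
  # then: sudo update-grub    (or: sudo grub-mkconfig -o /boot/grub/grub.cfg)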


And you also can just run a windows VM when needed for a few apps if that works for your use case.


I'm struggling to find an option for running x86 Windows software on MacOS/Apple Silicon performantly. (LiDAR point cloud processing.)

The possibilities seem endless and kind of confusing, with Windows on ARM vs. Rosetta and Wine; I think there are some other options which use macOS's included virtualization frameworks.


Have you tried CloudCompare? Native Mac ARM support.

https://www.cloudcompare.org/

(Edit: just so you know, the UI is a bit weird, there is a bit of a learning curve. But the app behaves in a very sane manner, with every step the previous state is maintained and a new node is created. It takes time to get used to it, but you'll learn to appreciate it.

May your cloud have oriented normals, and your samples be uniformly distributed. Godspeed!)


I like CloudCompare, but it's not really in the same space. I'm trying to achieve bulk tile processing of large datasets using LAStools.


That’s interesting; I’d expect something techie like that to have good Linux programs.


Have you tried to install Windows 11 ARM under UTM on Mac? UTM is a kind of open source Parallels. Then you'll run x86 software using Windows' variant of Rosetta. Probably slower than Rosetta but perhaps good enough.


In case others were similarly confused, I thought that UTM was commercial but it is Apache 2 https://github.com/utmapp/UTM/blob/v4.6.5/LICENSE


I wanted to play around with Windows 11 for a while now. It boots in UTM just to the degree that I can confirm my suspicions that Windows 11 sucks compared to Windows 10, but is not otherwise usable. (MacBook Air M3, slightly outdated macOS)



> Forscan for Ford vehicles also has to run under Windows.

I've successfully run it with WINE. Though, my Forscan executable was 3 years old or so and that may have changed, but I doubt it.


The thing about WINE is that it's not necessarily solid enough to rely on at work. You never know when the next software upgrade will break something that used to work.

That's always true, of course. But, compared to other options, relying on WINE increases the chances of it happening by an amount that someone could be forgiven for thinking isn't acceptable.


In my mind, I almost feel like the opposite is true. Wine is getting better and better, especially with the amount of resources that Valve is putting into it.

If you want a stable, repeatable way to wrangle a Windows tool: Wine is it. It's easy to deploy and repeat, requires no licenses, and has consistent behavior every time (unless you upgrade your Wine version or something). Great integration with Linux. No Windows Updates are going to come in and wreck your systems. No licensing, no IT issues, no active directory requirements, no forced reboots.


You can fix this issue by using a Wine "bottle manager" like... Bottles. This allows you to easily manage multiple instances of Wine installations (like having multiple Windows installations) with better and easier-to-use tooling around them. More importantly, it also allows you to select among many system-agnostic versions of Wine that won't be upgraded automatically, thus reducing the possibility of something you rely on breaking.


Or pony up for CodeWeavers. Their code goes into WINE, and they are (the?) major WINE devs. They've had bottles for years, if not decades now.


I used to, a long time ago, but even back then I was getting more value out of q4wine (a defunct project now) than from the CodeWeavers stuff. Granted, I was perhaps too much of an "enthusiast", using git versions of Wine with staging patches and my own patches rolled in, so q4wine's (and I guess now Bottles') more DIY approach won me over.

That all said, I haven't tried CodeWeavers in almost 10 years, so it might have improved a lot.


No, if wine itself breaks a bottle won't save you.


Same with Windows upgrades nowadays, really; there's a ton of software that just stopped working.


When I hear cases of using Wine etc as a substitute, I can't help but think of the "We have McDonald's at home" meme!


Wine is fantastic, but it is fantastic in the sense of being an amazing piece of technology. It's really lacking bits that would make it a great product.

It's possible to see what Wine as a great product would look like. No offense to crossover because they do good work, but Valve's Steam Play shows what you can really do with Wine if you focus on delivering a product using Wine.

Steam offers two main things:

- It pins the version of Wine, providing a unified stable runtime. Apps don't just break with Wine updates, they're tested with specific Proton versions. You can manually override this and 9 times out of 10 it's totally fine. Often times it's better. But, if you want it to work 10 out of 10 times, you have to do what Valve does here.

- It manages the wineserver (the lifecycle of the running Wine instance) and wine prefix for you.

The latter is an interesting bit to me. I think desktop environments should in fact integrate with Wine. I think they should show a tray icon or something when a Wineserver is running and offer options like killing the wineserver or spawning task manager. (I actually experimented with a standalone program to do this.[1]) Wine processes should show up nested under a wineserver in system process views, with an option to go to the wineprefix, and there should be graphical tools to manage wine prefixes.
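For reference, the raw commands such integration would wrap already exist today (the prefix path here is just an example):

  WINEPREFIX=~/prefixes/game wine taskmgr     # Wine's built-in task manager, scoped to that prefix
  WINEPREFIX=~/prefixes/game wineserver -k    # kill that prefix's wineserver and its processes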

To be fair, some of that has existed forever in some forms, but it never really felt that great. I think to feel good, it needs to feel like it's all a part of the desktop system, like Wine can really integrate into GNOME and KDE as a first-class thing. Really it'd be nice if Wine could optionally expose a D-Bus interface to make it so that desktop environments could nicely integrate with it without needing to do very nasty things, but Wine really likes to just be as C/POSIX/XDG as possible so I have no idea if something like that would have a snowball's chance in hell of working either on the Wine or desktop environment side.

Still, it bums me out a bit.

One pet peeve of mine regarding using Wine on Linux is that EXE icons didn't work out of the box on Dolphin in NixOS; I found that the old EXE thumb creator in kio-extras was a bit gnarly and involved shelling out to an old weird C program that wasn't all that fast and parsing the command line output. NixOS was missing the runtime dependency, but I decided it'd be better to just write a new EXE parser to extract the icon, and thankfully KDE accepted this approach, so now KDE has its own PE/NE parser. Thumb creators are not sandboxed on KDE yet, so enable it at your own risk; it should be disabled by default but available if you have kio-extras installed. (Sidenote: I don't know anything about icons in OS/2 LX executables, but I think it'd be cool to make those work, too.) The next pet peeve I had is that over network shares, most EXE files I had wouldn't get icons... It's because of the file size limit for remote thumbnails. If you bump the limit up really high, you'll get EXE thumbnails, but at the cost of downloading every single EXE, every single time you browse a remote folder. Yes, no caching, due to another bug. The next KDE frameworks version fixes most of this: other people sorted out multiple PreviewJob issues with caching on remote files, and I finally merged an MR that makes KIO use kio-fuse when available to spawn thumb creators instead of always copying to a temporary file. With these improvements combined, not just EXE thumbnails, but also video thumbnails work great on remote shares provided you have kio-fuse running. There's still no mechanism to bypass the file size limit even if both the thumbcreator and kio-fuse remote can handle reading only a small portion of the file, but maybe some day. (This would require more work. Some kio slaves, like for example the mpt one, could support partially reading files but don't because it's complicated. Others can't but there's no way for a kio-fuse client to know that. Meanwhile thumb creators may sometimes be able to produce a thumbnail without reading most of the file and sometimes not, so it feels like you would need a way to bail out if it turns out you need to read a lot of data. Complicated...)

I could've left most of that detail out, but I want to keep the giant textwall. To me this little bit of polish actually matters. If you browse an SMB share on Linux you should see icons for the EXE files just like on Windows, without any need to configure anything. If you don't have that, then right from the very first double-click the first experience is a bad one. That sucks.

Linux has thousands of these papercuts everywhere and easily hundreds for Wine alone. They seem small, but when you try to fix them it's not actually that easy; you can make a quick hack, but what if we want to do things right, and make a robust integration? Not as easy. But if you don't do that work, you get where we're at today, where users just expect and somewhat tolerate mediocre user experience. I think we can do better, but it takes a lot more people doing some ultimately very boring groundwork. And the payoff is not something that feels amazing, it's the opposite: it's something boring, where the user never really has any hesitation because they already know it will work and never even think about the idea that it might not. Once you can get users into that mode you know you've done something right.

Thanks for coming to my TED talk. Next time you have a minor pet peeve on Linux, please try to file a bug. The maintainers may not care, and maybe there won't be anyone to work on it, and maybe it would be hard to coordinate a fix across multiple projects. But honestly, I think a huge component of the problem is literally complacency. Most of us Linux users have dealt with desktop Linux forever and don't even register the workarounds we do (any more than Windows or Mac users do, albeit they probably have a lot fewer of them). To get to a better state, we've gotta confront those workarounds and attack them at the source.

[1]: https://github.com/jchv/winemon just an experiment though.


If you (or whoever is reading this) want(s) a more refined Wine, I highly recommend CodeWeavers. Their work gets folded back into open source WINE, no less.

> To get to a better state, we've gotta confront those workarounds and attack them at the source.

To my eye, the biggest problem with Linux is that so few are willing to pony up for its support. From hardware to software.

Buy Linux computers and donate to the projects you use!


That's true, but even when money is donated, it needs to be directed somewhere. And one big problem, IMO, is that polish and UX issues are not usually the highest priority to sort out; many would rather focus on higher impact. That's all well and good and there's plenty of high impact work that needs to be done (we need more funding on accessibility, for example.) But if there's always bigger fires to put out, it's going to be rather hard to ever find time to do anything about the random smaller issues. I think the best thing anyone can do about the smaller issues is having more individual people reporting and working on them.


If you're at work, it's probably a Windows shop. Use Windows. At home you can chance a bad update, and you probably also have access to Windows. You can always use a VM; Wine is great in some cases, like WSL. Neither meets every use case.


They named it “Forscan?” They really named it that, not thinking it could sound close to something else entirely unrelated?


Surely you don't think the executives at Ford expect us to Power Stroke without FORScan?


Ford’s own software is called FDRS.

Forscan was developed independently by some Russian gentlemen, probably with plenty of reference to FDRS/IDS internals.


Volkswagen's equivalent is VAG-COM


why bring wine into a vm discussion? just run windows in a vm too. problem solved without entering the whining about wine not being better than windows itself


I work in embedded systems. In that space, it's pretty common to need some vendor-provided tool that's Windows-only. I often need to automate that tool, maybe as part of a CI/CD pipeline or something.

If I were to do it with a Windows VM, I'd need to:

  1. Create the VM image and figure out how to build/deploy it.
  2. Sort out the Windows licensing concerns.
  3. Figure out how to launch my tool (maybe put an SSH server into the VM).
  4. Figure out how to share the filesystem (maybe rsync-on-SSH? Or an SMB fileshare?).

If I do it with Wine instead, all I need to do is:

  1. Install some pinned version of Wine.
  2. Install my tool into Wine.
  3. Run it directly.
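As a rough sketch of what that looks like in a CI job (the tool name, installer switch, and paths are all hypothetical; silent-install flags depend on the installer):

  export WINEPREFIX="$PWD/.wineprefix"     # keep the prefix next to the project so CI can cache it
  wine vendor-tool-setup.exe /S            # one-time install into the prefix
  wine "$WINEPREFIX/drive_c/Program Files/VendorTool/cli.exe" --input build/firmware.bin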


> For example, I use Solidworks, so I need to run windows.

Right. One of the things a lot of people don't get is the extent to which multidisciplinary workflows require Windows. This is particularly true of web-centric software engineers who simply do not have any exposure to the rest of the engineering universe.

Years ago this was the reason we had to drop using Raspberry Pi's little embedded microcontroller. The company is Linux-centric to such an extent that they simply could not comprehend how telling someone "Just switch to Linux" falls somewhere between impossible and nonsensical. They were, effectively, asking people to upend their PLM process just for the sake of using a little $0.50 part. You would have to do things like store entire OS images and configurations just to be able to reconstruct and maintain a design iteration from a few years ago.

WSL2 is pretty good. We still haven't fully integrated this into PLM workflows though. That said, what we've done on our machines was to install a separate SSD for WSL2. With that in place, backing-up and maintaining Linux distributions or distributions created in support of a project is much, much easier. This, effectively, in some ways, isolates WSL2 distributions from Windows. I can clone that drive and move it from a Windows 10 machine to a Windows 11 machine and life is good.

For AI workflows with NVIDIA GPU's WSL2 is less than ideal. I don't know if things have changed in this domain since I last looked. Our conclusion from a while back was that, if you have to do AI with the usual toolchains, you need to be on a machine running Linux natively rather than a VM running under Windows. It would be fantastic if this changed and one could run AI workflows on WSL2 without CUDA and other issues. Like I said, I have not checked in probably a year, maybe things are better now?

EDIT: The other reality is that one can have a nice powerful Linux machine next to the Windows box and simply SSH into it to work. Most good IDE's these days support remote development as well. If you are doing something serious, this is probably the best setup. This is what we do.


Did you know that Forscan works flawlessly under Wine if you're not using Bluetooth?


I'm sure with enough tinkering I could get Solidworks to run. The thing is, I don't want to spend time tinkering, I want to spend time doing. WSL2 gives me the optimal solution for all of that + dev.


I really want to like Windows 11, and I enjoy using WSL, but Microsoft treats me too much like an adversary for me to tolerate it as a daily driver. Only a complete scumbag of a product manager would think pushing Candy Crush ads is a good idea.

I’ve got an airgapped Toughbook that I use for the few Windows apps I really need to talk to strange hardware.


I suggest looking into Windows LTSC. It has solved most of the annoyances for me.


You don't need LTSC, you just need Windows Pro versions.

Lots of people bitch and moan about Windows problems that only exist because they buy the cheaper "Home" or whatever license and complain that Microsoft made different product decisions for average users than for people who have bought the explicitly labeled "power user" version.

Remember, the average computer user IS a hostile entity to Microsoft. They will delete System32 and then cry that Windows is so bad! They will turn off all antivirus software and bitch about Windows being insecure. They refuse to update and then get pwned and complain. They blame Microsoft for all the BSODs that were caused by Nvidia's drivers during the Vista era. They will follow a step by step procedure in some random forum from ten years ago that tells them to turn off their entire swap file despite running with lots of RAM and spinning rust and then bitch that Windows is slow.

Don't expect Microsoft to not deal with morons using their software. Buy the Pro versions if you don't want the version meant for morons.


I’m on Enterprise.

I shouldn’t need to spend this much time and energy turning off AI rubbish, bypassing cloud features, or knobbling telemetry and ads because some shitbag at Microsoft decided this was a good way of getting a promotion.

My computer is supposed to work for me, not the other way around.


Windows is only free if you don't value your time, it seems :-)


You do need to get Win 11 pro to be able to disable all of those features.


I run Windows in a VM where I need windows. It’s so much easier to fix a broken Linux installation than a broken Windows installation.


My coworkers stubbornly try to use WSL instead of Linux directly. They constantly run into corner cases and waste time working around them compared to just using Linux. Some tooling detects that it is running on Windows, and some detects that it is running on Linux. In practice, it's the worst of both worlds.


Saying running full Linux avoids wasting time on fiddly workarounds kinda blows my mind.

Full hardware support is still not a given, and Windows emulation is still needed for so many cases (e.g. games, specialized software, etc.).

Until I can choose any machine based on form factor and specs alone and just run Linux on it, WSL will be the best version of Linux it can run.


> Full hardware support is still not a given

What might your workload be? The only things that aren't working on Linux on day 1 are GPUs, and that's mostly because of kernel/distro timing (we haven't had a GPU release without mainline kernel support in years).


I am into small and portable, decently powerful, high-DPI laptops (battery be damned), ideally with touch support. And this category just gets no love in the Linux world.

I was holding out hope for the Framework 12", but they cheaped out on the screen to target the student market, with no upgrade option at this point.


Or the wireless chipset that your corporate laptop happens to have. Or Bluetooth. Or it won't suspend properly.


Or a way worse touchpad experience. No swiping gestures. No smooth scrolling. Fn buttons not working. Or a million other issues. I have never been able to install Linux on a laptop and get things working within a weekend. And then I revert because I need my computer.


Run Wayland instead of Xorg… Also get better laptops.


> better laptops

The absolute best built laptops on the market right now don't come with Linux support...


If you're thinking of Apple… as a former Apple owner and current ThinkPad owner… the build quality of Apple is severely overrated. Please come back with comments that are not just shilling.


Buy a System76


That was kind of my point: we're still at a stage where checking a list of supported laptops and vendors is pretty much mandatory.

This is totally laptop vendors' fault, but that doesn't change the fact of the matter.

PS: it would be fine if there were a few good options in all categories. Right now I see nothing comparable to an Asus Z13 but with first-class Linux support, for instance.


What modern hardware isn't supported by Linux? I haven't had driver problems in probably over a decade. I don't even target Linux for my builds, it just works. Same with the pile of random laptops I've installed it on. Wifi out of the box etc.


> What modern hardware isn't supported by Linux?

Fingerprint sensors and IR login cameras that are pre-installed on many laptops, and have Windows-only drivers.

As an end-user (yes, I'm an engineer too, but from the perspective of the OS and driver developers I am an end-user) I don't care who is in charge of getting the device to work on an OS—I only care that it works or not. And these devices don't, on Linux. So, they are broken.


My fingerprint scanner works, but I don't use it because typing my password is faster.


Yeah, those are weird, since a huge chunk of those drivers is userland.


> Full hardware support is still not a given,

If you're not buying your hardware from a vendor you can call and get support with Linux from, you're going to have a hard time.


In the case of some "compatibility" subsystem, it's absolutely true. It's complexity that requires fiddly workarounds.

Just use Linux.


> Full hardware support is still not a given

I bought an iPhone and then got angry it didn't run Android


Why would your primary work device be running an OS not supported by the device vendor? That's just bizarre.

I use Linux as my primary OS, and while Proton/Steam are pretty good now I'm still rebooting into (unactivated) Windows for some games. It's fine. It's also the only thing I use Windows for.

On an unrelated note, I'm frankly confused about who wants Apple's janky OS, because I've been forced to use it for work and it is very annoying.


What detects it is running on windows out of interest?

I use WSL extensively, with lots of languages, and I’ve never had anything do that.

It’s running in a VM, so that would be some kind of weird VM escape?


It's easy, it's right there in uname -r.
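For the curious, the WSL2 kernel announces itself in its release string (the exact version will differ):

  $ uname -r
  5.15.167.4-microsoft-standard-WSL2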


I would love to hear about these edge cases and which tooling fails to detect that it was launched from Linux.

Sounds a lot like a picnic problem but you didn’t give nearly enough details.


Yesterday, they tried to get a Python library that built a native library using Meson to work. They were working under WSL, but somehow, Meson was attempting to use the MSVC toolchain and failing.


And they were using pip/uv whatever from linux, the linux version.

One of the most common issues is calling a windows executable from within wsl… it’s a “convenience” feature that takes about 2 seconds to disable in the wsl config but causes these kinds of weird bugs


If on WSL2, they need an

    [interop]
    appendWindowsPath=false

section in /etc/wsl.conf.

Then everything will go flawlessly.
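The change only takes effect after the distro is restarted; from the Windows side, something like

    wsl --shutdown

(or wsl --terminate <distro> for a single distro) does the trick.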


> "More powerful than Linux" is silly. It's a VM.

I don't think it's silly. Sure, it's a VM, but it's so nice that I barely reboot into Linux. You get the best of both worlds with WSL.


For me, the best part of running Linux as the base OS is not having to deal with Windows.

No ridiculous start menu spam; a sane, non-bloated operating system (imagine being able to update user space libraries without a reboot, due to being able to delete files that other processes still have opened!); being able to back up my data at the file level without relying on weird block-level imaging shenanigans and so much more.

How is inverting the host/guest relationship an improvement on that?


> For me, the best part of running Linux as the base OS is not having to deal with Windows.

This is correct, but let's not pretend that linux is perfect. 99% of linux _for me_ is my terminal environment. WSL delivers on that _for me_.

I don't see any start menu spam because I rarely use it, when I do I type what I'm looking for before my eyes even move to look at that start menu.

oh, I can play destiny 2 and other games without shenanigans. Also don't need to figure out why Slack wants to open links in chromium, but discord in firefox (I have to deal with edge asking to be a default browser, but IMO it's less annoying).

Oh and multi-monitor with multiple DPI values works out of the box without looking up how to handle it in one of the frameworks this app uses.


> when I do I type what I'm looking for before my eyes even move to look at that start menu.

That's a /s, right? When I start typing immediately after the windows button, the initial letters are lost, the results are bad either way, and most turn into just web suggestions rather than things named exactly like the input.


> That's a /s, right? When I start typing immediately after the windows button, the initial letters are lost, the results are bad either way, and most turn into just web suggestions rather than things named exactly like the input.

No, I rarely have issues with search in start menu.


Turn off web suggestions then?


I did. They come back after some of the updates.


> imagine being able to update user space libraries without a reboot

That's... a very weird criticism to level at Windows, considering that the advice I've seen for Linux is to reboot if you update glibc (which is very much a user space library).


Why? It directly results in almost every Windows update requiring a reboot to apply, compared to usually only an application restart or at most desktop logout/login on Linux.

Having to constantly reboot my computer, or risk missing important security patches, was very annoying to me on Windows.

I've never had to reboot after updating glibc in years of using Linux, as far as I can remember.


You got some moderately bad advice.

Running programs will continue to use the libc version that was on disk when they started. They won't even know glibc was upgraded. If something is broken before rebooting, it'll stay broken after.
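If you want to check this yourself, a rough sketch is to look for processes that still map a libc which has since been replaced on disk:

    # lists /proc/<pid>/maps files that reference a deleted (replaced) libc
    sudo grep -l 'libc.*(deleted)' /proc/*/maps 2>/dev/null

Tools like needrestart automate the same idea and tell you which services are worth restarting after an upgrade.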


This is not true. Different programs on the same system that interoperate and use different versions of the same shared library can absolutely cause issues.

For a trivial change to glibc, it won't cause issues. But there's a lot of shared libraries and lots of different kinds of changes in different kinds of libraries that can happen.

I still haven't nailed down whether it was due to a shared library update, but just the other day, after running upgrades, I was unable to su or sudo / authenticate as a user until after rebooting.


It does happen, but it's pretty rare compared to Windows in my experience, where inconvenience is essentially guaranteed.

Firefox on Linux did not really enjoy being updated while running, as far as I remember; Chrome was fine with it, but only since it does some extra work to bypass the problem via its "zygote process": https://chromium.googlesource.com/chromium/src/+/main/docs/l...


Works fine with Nix and Guix, since they don't replace the JS or other shared config files in place to perform updates.


The only time I need to reboot my Linux Mint is when the Linux kernel is updated. I understand why.


I responded "This is not true" to a sibling comment about this same topic, but about "shared libraries", which is the opposite problem (multiple programs could load the same shared library and try to interact).

This is absolutely not true for Linux kernel updating. While you won't be using the new kernel before rebooting, there's 0 risk in not rebooting, because there's exactly 1 version of the kernel running on the machine -- it's loaded into memory when your computer starts.

There's of course rare exceptions, like when a dynamically linked library you just installed depends on a minimum specific version of the Linux kernel you also just installed, but this is extremely rare in Linux land, as backwards compatibility of programs with older kernels is generally a given. "We do not break userspace"


One problem with not rebooting after a kernel update is drivers. They aren't all built in.

Most distros leave the current running kernel and boot into the new one next time.

Some, like Arch, overwrite the kernel on an update, so modules can’t be loaded. It is a shock the first time you plug in a USB drive and nothing happens.


Good point, thanks for the insight!


I have a theory that 99.9% of preferring Windows or Linux comes down to "do ads in the start menu trigger my OCD".


It runs much deeper than that for me.

Windows at its core just does not seem like a serious operating system to me. Whenever there are two ways to do something, its developers seem to have picked the non-reasonable one compared to Unix – and doing that for decades adds up.

But yes, first impressions undoubtedly matter too.


I have no idea what Windows does with the various network services, but my Pi-hole gets rate-limited when it connects to the network--there are just constant DNS lookups to countless MS domains, far beyond what could reasonably be expected for a barebones install.

This isn't even a corpo-sloptop with Qualys and Zscaler and crap running, just a basic Windows box I rarely boot. It's deeply offensive to me.


When you compare things at the API level, NT is generally superior to POSIX - just look at what a mess fork() is, for one example, or fd reuse, or async I/O.


Want to talk about how each process has to implement its own custom escaping and splitting of the command line string?

That's much more complicated and error prone than fork.


It is not the standard in Windows land to run processes by handing them fifty commandline arguments. Simple as that. Win32 apps have strong support for selecting multiple files to pass to the app from within the file select dialog, as long as you follow the documentation.

It's like complaining that Unix is hard to use because I can't just drop a dll into a folder to hook functionality like I can on Windows. It's a radically different design following different ideologies and you can't magically expect everything to transfer over perfectly. If you want to do that on Linux land, you learn about LD_PRELOAD or hook system calls.

If you want to build powerful, interoperable modules that can pipe into each other and compose on the command line, PowerShell has existed since 2006. IMO, passing well-formed objects from module to module is RADICALLY better than passing around text strings that you have to parse or mangle or fuck with if you want actual composability. PowerShell's equivalent of ls doesn't have to go looking at whether it is being called by an actual terminal or by an app pipe, for example, in order to support weird quirks. PowerShell support for Windows internals and functionality is also just radically better than mucking around in "everything is a file" pseudo-folders that are a hacky way to represent important parts of the operating system, or calling IOCTLs.
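To illustrate the objects-over-text point, a throwaway pipeline like this passes typed Process objects the whole way through, with no column parsing anywhere:

    PS> Get-Process | Where-Object WorkingSet -gt 500MB | Sort-Object CPU -Descending | Select-Object Name, Id, CPU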

I also think the way Windows OS handles scheduled tasks and operations is better than cron.

I also think Windows Event logging is better than something like dmesg, but that's preference.

Also EVERYTHING in Windows land is designed around remote administration. Both the scheduled tasks and Event Logging systems are transparently and magically functional from other machines if you have your AD set up right. Is there anything in Linux land like AD?


> Win32 apps have strong support for selecting multiple files to pass to the app from within the file select dialog

The problem is when you want to click a file on your file manager and you want it to open in the associated application. Because the file manager can only hope the associated application parses the escapes the same way it generates them. Otherwise it's file not found :)

I'm not going to bother to reply point by point since you completely missed the point in the first few words.


The C runtime will do that for you, and it has been a standard OS component since Win10.

But also, no, it's not worse than fork. Fork literally breaks every threaded app.


> standard OS component since Win10.

So, basically yesterday, and not the default the way it is with execve, and you can never know whether the command you're trying to call implements it the same way or does different escaping.

Care to explain how fork "breaks" threaded apps? You can't mix them for doing multiprocessing, but it's fine if you use one model or the other.


Win10 has been around for literally a decade now. So much so that it's going out of support.

fork() breaks threaded apps by forking the state of all threads, including any locks (such as e.g. the global heap lock!) that any given thread might hold at that moment. In practice this means that you have to choose either fork or threads for your process. And this extends to libraries - if the library that you need happens to spawn a background thread for any reason, no more fork for you. On macOS this means that many system APIs are unusable. Nor is any of this hypothetical - it's a footgun that people run into regularly (just google for "fork deadlock") even in higher level languages such as Python.


How long has fork() existed? Is it less than 10 years? Is it much, much more?

> just google for "fork deadlock"

I did, results were completely unrelated to what you're talking about.

Anyway libraries spawning hidden threads… I bet they don't even bother to use reentrant functions? I mean… ok they are written by clueless developers. There's lots and lots of them, they exist on windows too. What's your point?



I have used Windows for years, and I loved it. I never understood why Linux and Mac users kept bashing on it. I just didn't know any better.

These days I'm avoiding booting into Windows unless I really have no choice. The ridiculousness of it is simply limitless. I would open a folder with a bunch of files in it and Explorer shows me a progress bar for nearly a minute. Why? What the heck is it doing? I just want to see the list of files, I'm not even doing anything crazy. Why the heck does no other file navigator do that — not on Linux, not on Mac, darn — even the specialized apps built for Windows work fine, but the built-in thing just doesn't. What gives? I would close the window and re-open the exact same folder, not even three minutes later, and it shows the progress bar again. "WTF? Can't you fucker just cache it? Da fuk you doing?"

Or I would install an app. And seconds after installing it I would try to search for it in the Start menu, and guess what? Windows instead opens Edge and searches the web for it. wat? Why the heck can't I remove that Edge BS once and for all? Nope, not really possible. wat?

Or like why can't I ever rebind Win+L? I can disable it but can't rebind it, there's just no way. Is it trying to operate my computer, or does the 'S' in 'OS' stand for "soul"?

Or for whatever reason it can't even get the time right. Every single time I boot into it, my clock time is wrong. I have to manually re-sync it. It just doesn't do it, even with the location enabled. Stupid ass bitch.

And don't even let me rant about those pesky updates.

I dunno, I just cannot not hate Windows anymore. Even when I need to boot in it "for just a few minutes", it always ends up taking more time for some absolute fiddlesticks made of bullcrap. Screw Windows! Especially the 11 one.


> Or for whatever reason it can't even get the time right. Every single time I boot into it, my clock time is wrong.

Dual booting will do that because linux & windows treat the system clock differently. From what I recall one of them will set it directly to the local time and the other always sets it to UTC and then applies the offset.


The most reliable fix is to get Windows to use UTC for the hardware clock, which is usually the default on Linux. (It's more reliable because it means the hardware clock doesn't need to be adjusted when DST begins or ends, so there's no need for the OSs to cooperate on that.)

https://wiki.archlinux.org/title/System_time#UTC_in_Microsof...
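The recipe that page describes boils down to a single registry value on the Windows side (run from an elevated prompt; this is the usual community recipe, not official Microsoft guidance):

    reg add "HKLM\SYSTEM\CurrentControlSet\Control\TimeZoneInformation" /v RealTimeIsUniversal /t REG_DWORD /d 1 /f

Going the other way, timedatectl set-local-rtc 1 makes Linux use local time instead, but that reintroduces the DST problem.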


That flag has been broken for at least several Windows versions, unfortunately. A shame, given that that's the only sane way of using the RTC in the presence of DST or time zone shifts...

That's exactly the type of Windows-ism I'm talking about. Two options (use UTC or the local time), and Windows chose to pick the nonsensical one.


Yeah, well, I use ntfs in Linux. It somehow knows how to treat the partitions. Even though it can't fix the issues when they arise (which almost never happens) — there's no chkdsk for Linux. So, I just don't understand why Windows can't automatically sync the clock (as it explicitly set to do it) when it boots? Why does one have to get creative to fix the darn clock? If I can't even trust the OS to manage the time correctly, what can I trust it with, if anything at all?


Windows syncs the clock to time.windows.com OOTB. This can be changed to any time provider.

https://learn.microsoft.com/en-us/windows-server/networking/...
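And when it still drifts, a manual resync from an elevated prompt usually sorts it out (assuming the Windows Time service is running):

    w32tm /resync
    w32tm /query /status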


I have the same issue and don’t dual boot.


I loved Windows XP and Windows 7. They were a bit brittle regarding malware, but I was using a lot of pirated software at the time, so that may have been me. Win 8 was bad UX-wise, but 8.1 resolved a lot of the issues. But since then, I have barely touched Windows.

I want an OS, not an entertainment center, meaning I want to launch a program, organize my files, and connect to other computers. Anything that hinders those is bad. I moved from macOS for the same reason, as they are trying to make those difficult too.


> I want an OS, not an entertainment center

Exactomundo! I'm a software developer, not a florist. I don't care about all those animations, transitions, dancing emojis, styled sliding notifications, windings and dingleberries. If I want to rebind a fucking key I should be able to. If I want to replace the entire desktop with a tiling manager of my choosing — that should be possible. And definitely, absolutely, in no way, should just about any kind of app, especially a web-browser, be shoved in my face. "Edge is not that bad", they would say. And would be completely missing the whole point.


Are you one of those guys that fiddles with registry settings and decrapifiers? To me, it sounds like you turned off file indexing. I turn it off when doing audio recording and yeah, that slows down file browsing.


> fiddles with registry settings

nope, that's with a pristine, freshly installed Windows Pro instance.


The reason varies by the decade. Microsoft has a tendency to fix one thing, then break another.

That said, a distaste for advertising goes beyond OCD. Advertisers frequently have questionable ethics, ranging from intruding upon people's privacy (in the many senses of the word) to manipulating people. It is simply something that many of us would rather do without.


I would say in my case it’s less about OCD and more about, inexplicably, dignity.


Advertising triggers a lot more than OCD in me outside of my start menu. On my machine, where I spend most of my waking hours, it was certainly the last straw for me.

But there's also the thing where Microsoft stops supporting older machines, creating a massive pile of insecure boxes and normie-generated e-waste; and the thing where it dials home constantly; and the thing where they try and force their browser on you, and the expensive and predatory software ecosystem, and the insane bloat, and the requiring a Microsoft account just to use my own computer. Oh yeah, and I gotta pay for this crap?!

I went full Linux back when Windows 11 came out and will only use it if a job requires. Utterly disgusting software.


Seems sorta not cool toward people with OCD to use their condition for rhetorical effect.


Take a chill pill.


What makes you think I’m not chill already? You engaged in a slightly rude trope, and I provided a very mild push back, at least from my point of view the stakes are all correctly low.


But you still get the worst of the Windows world, which is more than many are willing to deal with. I was using Windows for years as my main gaming OS, but after they announced W11 as the only way forward, switching to Linux on the desktop was like a breath of fresh air. I'll leave it at that.

If I were to run an OS in a VM, it's gonna be Windows, not Linux.


> You get the best of both worlds with WSL.

You obviously don't. Maybe WSL is the best compromise for people who need both Windows and Linux.

But it's ridiculous to think that WSL is better than just Linux for people who don't need Windows at all. And that's kind of what the author of this thread seems to imply.


I think that case could be made. For example for people who have a laptop that is not well supported by linux. With WSL they get linux and can use all of their hardware.


If it’s impossible to massage Linux into working well with your laptop – sure. But you’re missing out so much, like, well, not having to deal with Windows.


Similarly powerful would be totally fine. More powerful really is silly. Personally I couldn't make a lot of my workflows work very well with WSL2. Some of the stuff I run is very memory intensive and the behavior is pretty bad for this in WSL2. Their Wayland compositor is also pretty buggy and unpolished last I used it, and I was never able to get hardware acceleration working right even with the special drivers installed, but hopefully they've made some progress on that front.

Having Windows and Linux in the same desktop the way that WSL2 does obviously means that it does add a lot of value, but what you get in the box isn't exactly the same as the thing running natively. Rather than a strict superset or strict subset, it's a bit more like a Venn diagram of strengths.


By default WSL2 grabs half of the memory, but that's adjustable. The biggest pain point I have is running servers inside WSL that serve to non-localhost clients (localhost works auto-magically).

I am surprised you had such problems with WSL2 graphics acceleration. That just worked for me, including CUDA-accelerated workloads on the Linux side.
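For anyone who hasn't tweaked it: the knobs live in %UserProfile%\.wslconfig on the Windows side, roughly like this (values are just examples; run wsl --shutdown afterwards for them to apply):

    [wsl2]
    memory=8GB      # cap the VM instead of the default ~50% of host RAM
    processors=4
    # on recent WSL versions, mirrored networking also helps with the
    # serve-to-non-localhost problem:
    # networkingMode=mirrored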


Technically it's not a VM, it's a subsystem, the same way Win32, Win64, Posix, OS/2, etc. are.

It's a feature of the NT-family of kernels where you can create many environments sharing the same underlying executive and HAL.

It's a quite interesting way to build an OS: https://en.wikipedia.org/wiki/Architecture_of_Windows_NT


As everyone said, WSL2 is actually a virtual machine, and it is what most people are actually using now. That said, I feel the need to chime in and say I actually love WSL1 and I love Windows NT the kernel. It bums me out all the time that we probably won't get major portions of the NT kernel, even an out-of-date version, in some open source form.

I like Linux, and I use Linux as my daily desktop, but it's not because I think Linux or even UNIX is really that elegant. If I had to pick a favorite design it would be Windows NT for sure, even with all its warts. That said, the company behind Windows NT really likes to pile a lot of shit I hate on top of that pretty neat OS design, and now it's full of dubious practices. Automatic "malware submission" on by default, sending apps you download and compile yourself to Microsoft and even executing them in a VM. Forced updates with versions that expire. Unbelievable volumes of network traffic, exfiltrating untold amounts of data from your local machine to Microsoft. Ads and unwanted news all over the UI. Increasing insistence in using a Microsoft account. I could go on and on.

From a technical standpoint I do not think the Linux OS design is superior. I think Linux has some amazing tools and APIs. dmabufs are sweet. Namespaces and cgroups are cool. BPF and its various integrations are borderline insane. But at its core... it's kinda ugly. These things don't all compose nicely and the kernel is an enormous hard-to-tame beast. Windows NT has its design warts too, all over, like the amount of involvement the kernel has in the GUI for historical reasons, and the enormous syscall surface area, and untold amounts of legacy cruft. But all in all, I think the core of what they made is really cool, the subsystems concept is super cool, and it is an OS design that has stood up well to time. I also think the PE format is better than ELF and that it is literally better for the capabilities it doesn't have w.r.t. symbols. Sure it's ugly, in part due to the COFF lineage, but it's functionally very well done IMO.

I feel the need to say this because I think I probably came off as a hater, and tbh I'm not even a hater of WSL2. It's not as cool as WSL1 and subsystems and pico processes, but it's very practical and the 9p bridge works way better than it has any right to.

Thanks for pointing this out.


Put another way: Worse is Better


It used to be. They moved to a VM.

Turns out that it's easier to emulate a CPU than syscalls. The CPU churns a lot less, too, which means that once things start working things tend to keep working.


> Turns out that it's easier to emulate a CPU than syscalls

I don't think WSL2 supports CPU emulation. It might not even support (or at least rely on) driver emulation, though Hyper-V itself does.


You're thinking of the POSIX personality of Windows NT of old. This was based on Interix, was deprecated about two decades ago, and is now buried so deep that it couldn't be revived.

WSL1 uses kernel call translation, like Wine in reverse, and WSL2 runs a full-blown Linux kernel in a Hyper-V VM. To my knowledge, neither of these shares anything with the aforementioned POSIX subsystem.


WSL 2 is actually virtualized despite the name


WSL1 was a subsystem. WSL2 is mostly a VM.


They had to give that up because it was too slow, I think for IO. Unfortunate.


It's complicated. WSL1 is much faster at accessing the drives mounted in Windows, but much slower at accessing its own emulated drive.

If you have control over where you put your git repo, WSL2 will hit max speed. If you want it shared between OSes, WSL2 will be slower.
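Concretely, the difference is just which filesystem the repo lives on (paths are examples):

    # fast under WSL2: native ext4 inside the VM
    cd ~/src/myrepo && git status
    # slow under WSL2: the Windows drive is crossed via a 9p network filesystem
    cd /mnt/c/Users/me/src/myrepo && git status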


It also didn't have working fsync, and corrupted SQLite databases. I think that's more important.


I mean... Wine does the same thing for Windows software on Linux, but Microsoft refuses to release API docs for all of its internal APIs. They get to ship WSL by relying on Linux's openness, while refusing the same in return.


Then they discontinue WSL1 and just do a VM instead because... reasons. I really don’t understand how MSFT works on the inside.


A big one of those reasons was Docker. Docker was still fairly niche when WSL was released in 2016, but demand for it grew rapidly, and I don't think there was any realistic way they could have made it work on the NT kernel.


The integration between Windows and the WSL VM is far deeper than a typical VM hypervisor.

You cannot claim with a straight face that Virtualbox is easier to use.


It's deeper but let's not overblow it.

I think the two fairly deep integrations are Windows' ability to navigate WSL's filesystem and WSLg's fairly good ability to serve up GUIs.

The filesystem navigation is something that AFAIK can't easily be replicated. wslg, however, is something that other VMs have and can do. It's a bit of a pain, but doable.

What makes WSL nice is the fact that it feels pretty close to being a native terminal that can launch native applications.

I do wish that WSL1 had been taken further. My biggest gripe with WSL is the fact that it is a VM and thus has a large memory footprint. It'd be nice if the WSL1 approach had panned out and we instead had a nice clean compatibility wrapper over winapi for Linux applications.


> The filesystem navigation is something that AFAIK can't easily be replicated.

The filesystem navigation getting partially open sourced is one of the more interesting parts of this announcement. The Plan9 file server that serves files from Windows into Linux is included in the new open source dump. (The Windows filesystem driver that runs a Plan9 client on the Windows side to get files from Linux is not in the open source expansion.)

It's still fascinating that the whole thing is Plan9-based, given the OS never really succeeded, but apparently its network file system is a really good inter-compatibility file communication layer between Linux and Windows.

> I do wish that WSL1 was taken further.

WSL1 survives and there's still a chance it will see more work eventually, as the tides shift. I think the biggest thing that blocked WSL1 from more success was lack of partners and user interest in Windows Subsystem for Android apps. That still remains a potentially good idea for Windows if it had been allowed "real" access to Google Play Services and App Store, rather than second rate copy of Amazon's copy of Google Play Services and Fire App Store. An actual Google partnership seems doomed given one of the reasons to get Windows Subsystem for Android competitive was fear of ChromeOS, but Google still loves to talk about how "Open" Android is despite the Google Play Services moat and that still sounds like something that a court with enough fortitude could challenge (even if it is probably unlikely to happen).


> The integration between Windows and the WSL VM is far deeper than a typical VM hypervisor.

Sure, but I never claimed otherwise.

> You cannot claim with a straight face that Virtualbox is easier to use.

I also didn't claim that. I wasn't comparing WSL to other virtualization solutions.

WSL2 is cool. Linux doesn't have a tool like WSL2 that manages Linux virtual machines.

The catch-22 is that it doesn't need one. If you want to drop a shell in a virtual environment, Linux can do that six ways to Sunday with no hardware VM in sight, using the myriad of namespacing technologies available.

So while you don't have WSL2 on Linux, you don't need it. If you just want a ubuntu2204 shell or something, and you want it to magically work, you don't need a huge thing with tons of integration like WSL2. A standalone program can provide all of the functionality.

I have a feeling people might actually be legitimately skeptical. Let me prove this out. I am on NixOS, on a machine that does not have distrobox. It's not even installed, and I don't really have to install it since it's just a simple standalone program. I will do:

    $ nix run nixpkgs#distrobox enter

Here's what happened:

    $ nix run nixpkgs#distrobox enter
    Error: no such container my-distrobox
    Create it now, out of image registry.fedoraproject.org/fedora-toolbox:latest? [Y/n]: Y
    Creating the container my-distrobox
    Trying to pull registry.fedoraproject.org/fedora-toolbox:latest...
    ...
    0f3de909e96d48bd294d138b1a525a6a22621f38cb775a991974313eda1a4119
    Creating 'my-distrobox' using image registry.fedoraproject.org/fedora-toolbox:latest [ OK ]
    Distrobox 'my-distrobox' successfully created.
    To enter, run:

    distrobox enter my-distrobox

    Starting container...                    [ OK ]
    Installing basic packages...             [ OK ]
    Setting up devpts mounts...              [ OK ]
    Setting up read-only mounts...           [ OK ]
    Setting up read-write mounts...          [ OK ]
    Setting up host's sockets integration... [ OK ]
    Integrating host's themes, icons, fonts... [ OK ]
    Setting up distrobox profile...          [ OK ]
    Setting up sudo...                       [ OK ]
    Setting up user groups...                [ OK ]
    Setting up user's group list...          [ OK ]
    Setting up existing user...              [ OK ]
    Ensuring user's access...                [ OK ]

    Container Setup Complete!
    [john@my-distrobox]~% sudo yum install glxgears
    ...
    Complete!
    [john@my-distrobox]~% glxgears
    Running synchronized to the vertical refresh.  The framerate should be
    approximately the same as the monitor refresh rate.
    302 frames in 5.0 seconds = 60.261 FPS
    ^C
No steps omitted. I can install software, including desktop software, including things that need hardware acceleration (yep, even on NixOS where everything is weird) and just run them. There's nothing to configure at all.

That's just Fedora. WSL can run a lot of distros, including Ubuntu. Of course, you can do the same thing with Distrobox. Is it hard? Let's find out by using Ubuntu 22.04 instead, with console output omitted:

   $ distrobox create --image ubuntu:22.04
   ...
   $ distrobox enter ubuntu-22-04
   ...
   $ sudo apt install openarena
   ...
   $ /usr/games/openarena
To be completely, 100% fair: running an old version of Ubuntu like this does actually have one downside: it triggers OpenGL software rendering for me, because the OpenGL drivers in Ubuntu 22.04 are too old to support my relatively new RX 9070 XT. You'd need to install or copy in newer drivers to make it work. There are in fact ways to do that (Ubuntu has no shortage of repos just for getting more up-to-date drivers and they work inside Distrobox pretty much the same way they work in real hardware.) Amusingly, this problem doesn't impact NVIDIA since you can just tell distrobox to copy in the NVIDIA driver verbatim with the --nvidia flag. (One of the few major points in favor of proprietary drivers, I suppose.)

On the other hand, even trying pretty hard (and using special drivers) I could never get hardware acceleration for OpenGL working inside of WSL2, so it could be worse.

That aside, everything works. More complex applications (e.g. file browsers, Krita, Blender) work just fine and you get your normal home folder mapped in just like you'd expect.


Distrobox seems a lot like WSL to me. You can run many different Linux distros, each well integrated into the host system.

Except that Distrobox does not require a VM of course as the host kernel is Linux.


Yes, yes I can. Also does most of everything. WSL has severe issues with hardware translation.


> I get that WSL is revolutionary for Windows users

It is... I'm working these days on bringing a legacy windows only application to the 21st century.

We are throwing a WSL container behind it and relying on the huge ecosystem of server software available for Linux to add functionality.

Yes that stuff could run directly on windows, but you'd be a lot more limited in what's supported. Even for some restricted values of supported. And you'd have to reinvent the wheel for a few parts.


And if they think that this version of Linux "isn't janky" but regular Linux is, then idk what to say.


With WSL you can use “Linux the good parts” (command line tools, efficient-enough paradigms for fork() servers) and completely avoid X Windows, the Wayland death spiral, 100 revisions of Gnome and KDE that not so much reinvent the wheel but instead show us why the wheel is not square or triangular…


It's all opinion of course, but IMO Windows is the most clumsy and unintuitive desktop experience out there. We're all just used to the jank upon jank that we think it's intuitive.

KDE is much more cohesive, stable, and has significantly more features.


>the Wayland death spiral

That sounds like Wayland getting worse, but it's actually been slowly improving and it's pretty good now. Only took a decade+ to get there.


Mir was good from year one.


Judging from what happened to X11, that means wayland will be deprecated very soon. /s


/s indeed because there are actually no plans at all to replace Wayland!

I think the infamous cascade of attention-deficit teenagers (CADT) has slowed down quite a bit in the desktop space because... well, most developers there are over 30 now.


Not unlike Win10 vs 11.


It blows my mind that people can complain about the direction KDE is going when trying to paint a picture about how it's so much nicer to use Windows. I know the boiling frog experiment is fake, but just checking: are you sure the water isn't getting a little uncomfortably warm in the Windows pool right now?


I know you're saying you don't have to use it, but for any that didn't know, WSL2 does ship with its own Wayland compositor. And it does have some weird bugs.


After having used i3 and Sway, Windows is surprisingly bad at handling windows for an OS called Windows.

It requires a bit of work to set up to your liking, of course, but hey, at least you have the option to set it up to your liking.


Agreed. I used tiling WMs for a long while (ion3, XMonad) and it was such a productivity boost.

Then I was forced to use a Mac for work, so I was using a floating WM again. On my personal machine, ion3 went away and I never fully got around to migrate to i3.

By the time I got enough free time to really work on my personal setup, it had accumulated two huge monitors and was a different machine. I found I was pretty happy just scattering windows around everywhere. Especially with a trackball's cursor throw. This was pretty surprising to me at first.

Anyway this is just my little personal anecdote. If I go back to a Linux install I'll definitely have to check out i3 again. Thanks for reminding me :)


I've been compiling and testing cross-platform software for Linux lately (Ubuntu and similar)... You can't even launch an application or script without the CLI. Bad UX, IMO. For these decisions, there are always reasons, a justification, something about security. I don't buy it.


> You can't even launch an application or script without CLI.

Care to elaborate? I'm not sure I understand what you're saying here.


I compile my program using WSL, or Linux natively. It won't launch; it's not marked executable. So, into the CLI: chmod +x. OK. It's a compiled binary program, so semantically I don't see the purpose of this. Probably another use case bleeding into this. (I think there's a GUI way too.) Still can't double-click it. Nothing to launch from the right-click menu. After doing some research, it appears you used to be able to do it (Ubuntu/GNOME[?]), but it was removed at some point. It can be launched from the CLI.

I make a .desktop file and a shell script to move it to the right place. Double-click the shell file. It opens a text editor. Search the right-click menu; still no way. To the CLI we go: chmod +x, and launch it from the CLI. Then, after adding the desktop icon, I can launch it.

On Windows, you just double-click the identified-through-file-extension executable file. This, like most things in Linux, implies the UX is designed for workflows I don't use as a PC user. Likely servers?


This sounds very weird to me. Any sane build toolchain should produce a runnable executable that already has +x. What did you use to compile it?

Removing double-click to run an executable binary certainly sounds like something either Gnome or Ubuntu would do, but thankfully that's not the only option in town. In KDE I believe the same exact Windows workflow would just work.


> Any sane build toolchain should produce a runnable executable that already has +x. What did you use to compile it?

`cargo build --release`

Good to know KDE doesn't do that!


Even stranger then. Just to make sure I'm not missing something, I just tried this on my Mac:

  $ cargo --version
  cargo 1.86.0

  $ cargo new hello-rs
     Creating binary (application) `hello-rs` package

  $ cd hello-rs && cargo build --release
     Compiling hello-rs v0.1.0 (/Users/int19h/src/hello-rs)
      Finished `release` profile [optimized] target(s) in 0.73s

  $ ls -la target/release/hello-rs
  -rwxr-xr-x@ 1 int19h  staff  468608 May 20 20:16 target/release/hello-rs*

  $ ./target/release/hello-rs
  Hello, world!
Are you sure it's not because the package in question does some kind of weird custom build steps?


Might have gotten lost in translation when I moved it from WSL to a Windows-made zip file. I think that workflow nukes permissions.
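In hindsight, a tarball created inside WSL would probably have kept the permission bits, e.g.

    tar -czf hello.tar.gz -C target/release hello-rs

or I could just chmod +x again after unpacking on the other end.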


Yeah, the typical way programs are run is by using a .desktop file that's installed. The reason nobody cares is that running random executables that have a GUI is a pretty rare use case for Linux desktops. We don't have wizards or .msi installers; we just install using the package manager. And then it shows up where it needs to.

If you're on KDE, you can right-click the start menu and add the application. Also, the right-click menu should give you a run option.
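For reference, a minimal .desktop file dropped into ~/.local/share/applications is usually all it takes for the launcher to pick an app up (names and paths here are just placeholders):

    [Desktop Entry]
    Type=Application
    Name=My Program
    Exec=/home/me/bin/my-program
    Terminal=false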


This is very much YMMV thing. There is no objectively best platform. There are different users and requirements.

I’ve been a software developer for 20 years and in _my_ opinion Windows is the best platform for professional software development. I only drop down to Linux when I need some of the excellent POSIX tools, but my whole work ergonomics is based on Windows shortcuts and Visual Studio.

I’ve been forced to use Mac for the past 1.5y but would prefer not to.

Why would Windows be superior for me? Because that’s where the users are (for the work stuff I did before this latest gig). I started in real time graphics and then spent over a decade in CAD for AEC (developing components for various offerings including SketchUp). The most critical thing for the stuff I did was the need to develop on the same platform as users run the software - C++ is only theoretically platform independent.

Windows APIs are shit for sure, for the most part.

But still, from this pov, WSL was and will be the best Linux for me as well.

YMMV.


I fully agree with you - "YMMV" is the one true take. Visual Studio has never been particularly attractive to me, my whole workflow is filled with POSIX tools, and my code mostly runs on Docker and Linux servers. Windows is just another thing to worry about for me, be it having to deal with the subtle quirks of WSL not running on raw metal or having to deal with running UNIX-first tooling (or finding alternatives) on Windows. If it wasn't for our work provided machines being Windows by default, and at home, being into VR gaming and audio production (mostly commercial plugins), I'd completely ditch Windows in a heartbeat.


It's a VM plus some usability automation. Can't ignore the usability benefits.


Just FYI, you may also enjoy systemd-nspawn (managed via machinectl / systemd-machined). It's essentially the same thing as Toolbx, but it handles the system bus much more sanely, and you can see everything running inside the guest from the host's systemctl.
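A rough sketch of what that looks like, assuming a Debian-ish host with debootstrap available:

    sudo debootstrap stable /var/lib/machines/testbox   # minimal root filesystem
    sudo systemd-nspawn -D /var/lib/machines/testbox    # shell inside the container
    sudo systemd-nspawn -bD /var/lib/machines/testbox   # ...or boot it as a full machine
    machinectl list                                     # booted machines, seen from the host
    sudo systemctl -M testbox status                    # poke at the guest's units from the host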


If Windows provided easier access to hardware, especially USB, from WSL it would be nice. In fact, if WSL enumerated devices and dealt with them as native Linux does, even better.
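For what it's worth, USB passthrough is possible today via the separate usbipd-win tool, though it's nowhere near as seamless as native enumeration would be. As far as I remember the workflow is roughly this, from an elevated PowerShell (bus IDs are just examples):

    usbipd list                      # attachable devices and their bus IDs
    usbipd bind --busid 4-2          # share the device (one-time, needs admin)
    usbipd attach --wsl --busid 4-2  # it now shows up inside the distro (check with lsusb)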


Windows has a lot of useful software that is not available on Linux.

So, for me, Windows + WSL is more productive than just using Linux. The UI is still better on Windows (basic utilities like File Explorer and config management are better on Windows). No remoting software beats RDP. When I remote to a Windows workstation through RDP, I can't tell the difference. VNC is always janky. And of course there is Word/Excel/Illustrator, which is simply not available on Linux.


File Explorer is better on Windows? How? I tried Windows 11 for the first time a month ago and it takes several seconds for File Explorer to open, it's asynchronously loading like 3 different UI frameworks as random elements pop in with no consistency, there are two different right-click menus because they couldn't figure out how to make the new one have all the functionality of the old one so they decided to just keep the old one behind "Show More Options", and it's constantly pushing OneDrive in your face. I'm offended that this is what they thought was good enough to ship to a billion users.


The File Explorer on Windows 11 is the worst experience ever. Windows 7 was snappy as hell, but I don't know what they did to damage it that badly. I use XYplorer, which is written in Visual Basic (so a 32-bit application), but it is so much faster than the native Explorer (and is full of features).


> No Remoting Software beats RDP. When I remote to a Windows workstation through RDP, I can't tell the difference. VNC is always janky

Any recent distro running Gnome or KDE has built-in support for connecting and hosting an RDP session. This used to be a pain point, you don't need to use VNC anymore.

It's actually worse on Windows, since you need to pony up for a Pro license to get RDP hosting support...


> The UI is still better on Windows(basic utilities like File Explorer and Config Management is better on Windows).

5 years ago, we would be comparing old GNOME 3 or KDE Plasma 5 on X11 and Windows 10. I would be forced to agree. The Windows UI was better in many ways at that point.

Today we have KDE Plasma 6.3 on Wayland and Windows 11. This is an entirely different ball game. It's hard to explain. Wayland feels like it has taken an eternity to lift off, like well over a decade, but now things change dramatically on the scale of months. A few months ago HDR basically didn't work anywhere. Right now it's right in front of me and it works great. You can configure color profiles, SDR applications don't break ever, and you even get emulated brightness. Display scaling? Multiple monitors with different scale factors? What about one monitor at 150% and another at 175% scale factor? What about seamlessly dragging windows between displays with different scale factors? Yes, Yes, Yes, and Yes. No `xrandr` commands. You configure it in the GUI. I am dead serious.

File Explorer? That's the application that has two context menus, right? I think at this point Windows users might actually be better off installing KDE's Dolphin file manager in Windows for the sake of their own productivity. If I had the option to use Windows File Explorer on KDE I would impolitely decline. I have not encountered any advertising built into my file explorer. I do not have an annoying OneDrive item in the menu on the left. I have a file tree, a list of Places, and some remote file shares. When I right click it does not freeze, instead it tends to show the context menu right away. And no, I'm not impressed by Tabs and Dark Mode, because we've had that on Linux file managers for so long that some people reading this were probably born after it was already supported.

Windows still has the edge in some areas, but it just isn't what it used to be. The Linux UI is no longer a toy.

> When I remote to a Windows workstation through RDP, I can't tell the difference. VNC is always janky.

I don't really blame you if you don't believe me, but I, just now, went into System Settings, went to the Remote Desktop setting, and clicked a toggle box, at which point an RDP server spawned. Yes, RDP, not VNC, not something else. I just logged into it using Remmina.

Not everything on Linux is seamless and simple like this, but in this case it really is. I'm not omitting a bunch of confusing troubleshooting steps here, you really can do this on a modern Linux setup, with your mouse cursor. Only one hand required.

> Of course there is Word/Excel/Illustrator which is simply not available on Linux

True, but if you want to use Linux and you're held back by needing some specific software, maybe it's not the end of the world. You have many options today. You can install VirtualBox and run your spreadsheets in there. You can use Office 365 in a browser. You can run Crossover[1] and emulate it. You can use an office alternative, like possibly WPS Office. You can dual boot. You can go the crazy route and set up a KVM GPU passthrough virtual machine, for actually native performance without needing to reboot.

The point I'm making here is not "Look, Linux is better now! Everyone go use it and get disappointed ASAP!" If you are happy with Windows, there's literally no point in going and setting yourself up for disappointment. Most people who use Linux do so because they are very much not happy with Windows. I'm sure you can tell that I am not. However, in trying to temper the unending optimism of Linux nerds, sometimes people go too far the other way and represent Linux as being in far worse of a state than it actually is. It really isn't that bad.

The worst thing about modern Linux is, IMO, getting it to work well on your hardware. Once you have that part figured out, I think modern Linux is a pretty good experience, and I highly recommend people give it a shot if they're curious. I think Bazzite is a really nice distro to throw on a random spare computer just to see what modern Linux is actually capable of. It's not the absolute most cutting edge, but it gives you a nice blend of fairly up-to-date software and a fairly modern RPM ostree base system for better stability and robustness, and it's pretty user-friendly. And if you don't like it, you can easily get a full refund!

[1]: https://www.codeweavers.com/compatibility/crossover/microsof...


> You can use an office alternative, like possibly WPS Office.

Or ONLYOFFICE, which is FOSS (and what I use personally). Or LibreOffice (also free/libre software, of course). I don’t miss MS Office one bit, the compatibility is nothing short of excellent nowadays, and the speed and UX both surpass it.

There are specialized software packages that are Windows-only, of course, but at least office programs ain’t it.


The last time I deployed Linux servers on bare metal was about 2010.

Apparently Linux VMs on other people's computers are very much appreciated.


I definitely prefer working in Linux.

But having Windows tightly integrated when needed is nice.

If only I could replace the Windows shell with a Linux DE...


Is it a VM? It seems to be much faster than most VMs I've used.


Literally built on top of MS's Hyper-V.

IDK how many VMs you've used, but there has been a lot of work specifically with x86 to make VMs nearly as fast as native. If you interact with cloud services everything you do is likely on a VM.


It's handy if you have other services that are Windows-based, though. And, being a VM, it's fairly convenient to have multiple versions and to back up.


So, how do you run Windows on Linux like WSL does?


Methods I know of are qemu/Wine/Proxmox/VirtualBox.


But he was acting as if Linux didn't need VMs ;)


Linux doesn't need VMs, people need VMs. If you spend most of your time in Windows-exclusive apps and use WSL2 on occasion, then you already know what you want, why are you worried about arguing about it on the Internet?

For many software engineers, a lot of our work is Linux, and it wouldn't be atypical to spend most of the time doing Linux development. I work on Linux and deploy to Linux, it's just a no-brainer to run Linux, too, aside from the fact that I simply loathe using modern Windows to begin with.

(Outside of that, frankly, most people period live inside of the web browser, Slack, Discord, and/or Steam, none of which are Windows-exclusive.)

My point isn't that Linux is better than Windows, it's that WSL2 isn't better than literally running Linux. If you need to do Linux things, it is worse than Linux at basically all of them.


Steam by itself is irrelevant, what matters is whether the game you want to play runs on Linux.

For anything that is PvP multiplayer, this is very much not a given because of how pervasive kernel-level anti-cheat solutions are today.


You still have to go and make sure that what you want is there and works, but it's not a bad bet. A few major omissions aside, there is a pretty big library of supported games.

> For anything that is PvP multiplayer, this is very much not a given because of how pervasive kernel-level anti-cheat solutions are today.

To be fair, though, you probably still have a better shot of being able to play the games you want under Linux than under macOS, and that doesn't seem to be that bad of an issue for Mac users. (I mean, I'm sure many of them game on PC anyway, but even that considered, macOS has greater market share than Linux, so that's a lot of people who are either able to deal with it or have two computers.)


Speaking as a Mac user, it's really bad. Much worse than Linux/SteamOS, actually. Not only are most games just not there, but many games that are advertised as Mac-compatible are actually broken because they haven't been updated for a long time, and macOS is not particularly ABI-stable when it comes to GUI. Sometimes they just don't support hi-DPI, so you can play, but forget about 4K. But sometimes it just straight up won't start.

I do indeed have two computers with a KVM setup largely for this reason, with a secondary Windows box relegated to gaming console role.


Fair point. I know it was rough when Apple dropped 32-bit support.

Still, the point is that you can make it work if you want to make it work. Off the top of my head:

- Two computers, completely separate. Maybe a desktop and a laptop.

- Two computers, one desk and a KVM like you suggest.

- Two computers, one desk. No proper KVM, just set up remote desktop and game streaming.

- (on Linux) KVM with GPU passthrough, or GPU passthrough with frame relay. One computer, one desk.

- Game streaming services, for more casual and occasional uses.

- Ordinary virtualization with emulated GPU. Not usually great for multimedia, but still.

- And of course, Steam Play/Heroic Launcher/WINE. Not as applicable on macOS, but I know CodeWeavers does a lot to keep macOS well-supported with Crossover. With the aforementioned limitations, of course.

Obviously two computers has a downside, managing two boxen is harder than one, and you will pay more for the privilege. On the other hand, it gives you "the real thing" whenever you need it. With some monitors having basic KVM functionality built-in, especially over USB-C, and a variety of mini PCs that have enough muscle to game, it's not really the least practical approach.

I suspect for a lot of us here there is a reasonable option if we really don't want to compromise on our choice of primary desktop OS.


> You know what's even more convenient than a VM? Not needing a VM and still having the exact same functionality.

Exactly.


I heard 2025 was the year of Linux on the desktop!


> You know what's even more convenient than a VM? Not needing a VM and still having the exact same functionality

I mean this is basically heresy now.

Most code is virtualised, or sandboxed, or in a VM, or a Docker container, or several of the above at the same time.


The important bit though is that Docker containers are not VMs or sandboxes, they're "just" a combination of technologies that give you an isolated userland using mostly Linux namespaces. If you're running a Linux host you already have namespaces, so you can just use them directly. Distrobox gives you basically the same sort of experience as WSL2 except it doesn't have any of the weird parts of running a VM because it's not VMs.
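If you've never poked at them directly, the building blocks are one command away (a trivial sketch):

    # a throwaway PID + mount namespace, no daemon and no VM involved
    sudo unshare --fork --pid --mount-proc bash
    # inside, `ps aux` shows only this bash (as PID 1) and ps itself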


Your comment that you can do Linux things on Linux missed the point entirely.

Where is the reverse WSL on Linux, where Windows is deeply embedded and you have all the Windows features in your hands?

You can use Wine/Crossover, which is cool, but even now the number of software products it supports is tiny. Steam has a lot of games.

You can run a virtual machine with Windows on it. That is identical to what you can do on Windows with Linux.

WSL2 is a virtual machine with unique tooling around it that makes it easier to use and integrates well with Windows.


Windows supports Linux because the latter is open source, it's a lot easier than the reverse.

Linux, on the other hand, barely supports Windows because the latter is closed, and not just closed: Windows issues component updates which specifically check if they run in Wine and stop running, being actively hostile to a potential Linux host.

The two are not equivalent, nobody in the Linux kernel team is actively sabotaging WSL, whereas Microsoft is actively sabotaging wine.


> whereas Microsoft is actively sabotaging wine

Do you have a link to where I can read more about this? My understanding is that Microsoft saw Wine as inconsequential to their business, even offloading the Mono runtime to them [1] when they dropped support for it.

[1] https://www.mono-project.com/


> Until 2020, Microsoft had not made any public statements about Wine. However, the Windows Update online service will block updates to Microsoft applications running in Wine. On 16 February 2005, Ivan Leo Puoti discovered that Microsoft had started checking the Windows Registry for the Wine configuration key and would block the Windows Update for any component.[125] As Puoti noted: "It's also the first time Microsoft acknowledges the existence of Wine."

https://en.m.wikipedia.org/wiki/Wine_(software)


This. Microsoft needs to open source Windows. End of story.


Microsoft seems to be taking an outside-in "component at a time" approach to open sourcing Windows. Terminal, Notepad, Paint, Calculator, the new Edit.com replacement, a lot of WSL now, etc.

This approach has been fascinating so far, but yeah not "exciting" from "what crazy things can I do with Windows like put it in a toaster" side of things.

It would be great to see at least a little bit more "middle-out" from Windows Open Source efforts. A minimal build of the NT Kernel and some core Windows components has been "free as in beer" for a while for hobby projects with small screens, if you really want to try a very minimal "toaster build" (there are some interesting RPi builds out there), but the path to commercialization is rough after that point, and the "small screens" thing is a bit of a weird line in the sand (though understandable given Microsoft's position of power on the desktop and sort of the tablet, but not the phone).

The NT Kernel is one of the most interesting microkernels left in active use [0], especially given how many processor architectures it has supported over decades and how many it still supports (even the ones that Windows isn't very commercially successful on today). It could be a wealth of power to research and academia if it were open source, even if Microsoft didn't open source any of the Windows Subsystems. It would be academically interesting to see what sort of cool/weird/strange Subsystems people would build if NT were open source. I suppose Microsoft still fears it would be commercially interesting, too.

[0] Some offense, I suppose to XNU here. Apple's kernel is often called a microkernel for its roots from the Mach kernel, but it has rebuilt some monoliths on top of that over the years (Wikipedia more kindly calls it a "hybrid kernel"), and Mach itself is still so Unix flavored. NT's "object oriented" approach is rather unique today, with its more VMS heritage, a deeply alternate path from POSIX/Unix/Linux(/BSD).


I doubt it would happen. Large projects that aren't open source from the outset and are decades old can contain licensed or patented code; Microsoft would have to verify line by line that they can open source it.


Wait long enough and it will happen, the question is just "how long". (Microsoft has open-sourced OS and languages from the 1980s) Some days it seems like Microsoft is more interested in Azure, Copilot and GAME PASS and Windows is an afterthought.


I would certainly love it if Microsoft stopped trying to sell Windows and just open sourced it. I think Windows is a much more pleasant desktop operating system than Linux, minus all the ads and mandatory bloat Microsoft has put in lately. But if Windows was open source the community could just take that out.

I really don't see it happening any time in the next decade at least, though. While Windows might not be Microsoft's biggest focus any more it's still a huge income stream for them. They won't just give that up.


I preferred WSL to running Linux directly even though I had no need for any Windows-only software. Not having to spend time configuring my computer to make basic things work (suspend/wake on lid close, battery life, hardware-accelerated video playback in the browser, display scaling on an external monitor, and so on) was reason enough.


All this usually works out of the box now, especially if you pick your hardware accordingly.


That was certainly not the case ~2 years ago, the last time I installed linux on a laptop.

It also doesn't appear to be the case even now. I searched for laptops available in my country that fit my budget and, for each laptop, searched "<laptop name> linux reddit" on Google and filtered for results <1 year old. Each laptop's reports included some bug or other.

https://www.reddit.com/r/linuxhardware/comments/1hfqptw/linu...

https://www.reddit.com/r/linuxhardware/comments/1esntt3/leno...

https://www.reddit.com/r/linuxhardware/comments/1j3983j/hp_o...

https://www.reddit.com/r/linuxhardware/comments/1k1nsm8/audi...

The laptop with the best reported linux support seemed to be Thinkpad P14s but even there users reported tweaking some config to get fans to run silently and to make the speakers sound acceptable.

https://www.reddit.com/r/thinkpad/comments/1c81rw4/thinkpad_...


You are going to find issues for any computer for any OS by looking things up like this.

And yeah, it's best to wait a bit for new models, as support is sorted out, if the manufacturer doesn't support Linux itself. Or pick a manufacturer that sells laptops with Linux preinstalled. That makes the comparison with a laptop with Windows preinstalled fair.


> You are going to find issues for any computer for any OS by looking things up like this

I wasn't cherry-picking things. I literally searched for laptops available in my budget in my country and looked up what was the linux support like for those laptops as reported by people on reddit.

> Or pick a manufacturer that sells laptops with Linux preinstalled

I suppose you are talking about System76, Tuxedo etc. These manufacturers don't ship to my country. Even if I am able to get it shipped, how am I supposed to get warranty?


You weren't cherry picking but the search query you used would lead to issue reports.

HP, Dell and Lenovo also sell Linux laptops on which Linux runs well.

I sympathize with the more limited availability and budget restrictions, but comparisons must be fair: compare a preinstalled Windows and a preinstalled linux, or at least a linux installed on hardware whose manufacturer bothered to work on Linux support.

When the manufacturer did their homework, Linux doesn't have the issues listed earlier. I've seen several laptops of these three brands work flawlessly on Linux and it's been like this for a decade.

I certainly choose my laptops with Linux in mind and I know just picking random models would probably lead me to little issues here and there, and I don't want to deal with this. Although I have installed Linux on random laptops for other people and fortunately haven't run into issues.


As a buyer, how am I supposed to know which manufacturer did their homework and on which laptops?

> it's been like this for a decade

Again, depends on the definition of "flawlessly". Afaik, support for hardware-accelerated video playback in browsers was broken across the board only three years ago.


> As a buyer, how am I supposed to know which manufacturer did their homework and on which laptops?

Your first option is to buy a laptop with Linux preinstalled from one of the many manufacturers that provide this. This requires no particular knowledge or time. Admittedly, this may lead you to more expensive options; entry-level laptops won't be an option.

Your second best bet is to read tech reviews. Admittedly this requires time and knowledge, but often enough people turn to their tech literate acquaintance for advice when they want to buy hardware.

> Afaik, support for hardware accelerated videoplayback on browsers was broken across the board only three years ago.

Yes indeed, that's something we didn't have. I agree it sucks. Now, all the OSes have flaws that the others don't, and it's not like the videos didn't play; in practice it was an issue if you wanted to watch 4K videos for hours on battery. Playing regular videos worked, and you can always lower the quality if your situation doesn't allow the higher qualities. Often enough, you could also get the video and play it outside the browser. I know, not ideal, but also way less annoying than the laptop not suspending when you close the lid because of a glitch or something like that.


> You first option is to buy a laptop with linux preinstalled

I have earnestly spent >20 minutes trying to find such a laptop from any reputed manufacturer in my country (India) and come up empty-handed. Please suggest any that you can find. Even with Thinkpads, the only options are "Windows" or "No Operating System".

>Your second best bet is to read tech reviews.

Which tech reviews specifically point out linux support?

>Playing regular videos worked, and you can always lower the quality if your situation doesn't allow the higher qualities

The issue was never about whether playing the video worked. CPU video decoding uses much more energy and leads to your laptop running hot and draining battery life.

Can we at least agree to reduce the timeframe for things working flawlessly to "less than two years" instead of "a decade"? Yes you were able to go to the toilet downstairs but the toilet upstairs was definitely broken.


"Thinkpad linux" with region set to India on DDG yields many results, including https://www.lenovo.com/us/en/d/linux-laptops-desktops/

If buying with Linux preinstalled is not an option where you live, you can always buy one of the many models found with this search without an OS and install it yourself. Most Thinkpads should be all right. Most Elitebooks should do. Dell laptops sold with Ubuntu somewhere on the planet should do. I'm afraid I can't help more, you'll have to do your own search. Finding out which laptops are sold with Linux somewhere should not be rocket science. I don't buy laptops very often, I tend to keep my computers for a healthy amount of time, so I can't say what it's like in India in 2025.

> Can we at least agree to reduce the timeframe for things working flawlessly to "less than two years" instead of "a decade"? Yes you were able to go to the toilet downstairs but the toilet upstairs was definitely broken.

No. I understand that it can be a dealbreaker for some, but that's a minor issue for me on laptops, even unplugged, and I do watch a lot of videos (for environmental reasons I tend to avoid watching videos in very high resolutions anyway, so software rendering is a bummer but not a blocker). There are still things that don't work, like Photoshop or MS Office, so you could say that it's still not flawless, still, that doesn't affect me.


>many results, including https://www.lenovo.com/us/en/d/linux-laptops-desktops/

Many results, including a US-specific page of the Lenovo website.

>If buying with Linux is not an option at your place, you can always buy one of the many models found with this search without OS and install it yourself.

>Finding out which laptops are sold with Linux somewhere should not be rocket science.

It should not. Given the amount of time I have already spent on trying to find one, it is fair to say that there are none easily available in India, at least in the consumer laptop market.

> I understand that it can be a dealbreaker for some, but that's a minor issue for me on laptops

Stockholm syndrome.


> Stockholm syndrome.

Stockholm Syndrome was bullshit made up on the spot to cover for the inability of the person making it up to defend their position with facts or logic, and... that fits most metaphorical uses quite well, too, though it's not usually the message the metaphor is intended to communicate.


> Many results, including a US-specific page of the Lenovo website.

Are you failing to see that this US-specific page gives you a long list of models you can consider elsewhere?

> Stockholm syndrome.

Yeah, no. It just appears I have different needs than you and value different tradeoffs. It appears that the incredible comfort Linux brings me offsets the minor inconvenience software rendered browser video playback causes me.

I'm done with this discussion; we've been quite far away from the kind of interesting discussions I come to HN for, for a few comments now.


On Windows, I don't have to pick my hardware accordingly.

I have to onboard a lot of students to work on our research. The software is all linux (of course), and mostly distribution-agnostic. Can't be too old, that's it.

If a student comes with a random laptop, I install WSL on it, mostly Ubuntu, then apt install <curated list of packages>. Done. Linux laptops are OK too, I think, but so far I've only had one student with one. macOS used to be easy, but gets harder with every release, and every new OS version breaks something (mainly CERN ROOT) and people have to wait until it's fixed.
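
For anyone curious, the whole flow is roughly this (the distro name and package list below are just placeholders for whatever a project needs):

    # Windows side, from an elevated PowerShell prompt: install WSL with a distro
    wsl --install -d Ubuntu-24.04

    # Inside the new Ubuntu shell: pull in the curated toolchain
    sudo apt-get update
    sudo apt-get install -y build-essential git cmake python3 python3-pip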


> On Windows, I don't have to pick my hardware accordingly.

Fair enough. I think the best way to run Linux, if you want to be sure you won't have to tweak stuff, is to buy hardware with Linux preinstalled. That your choice is more limited is a different matter from "Linux can't suspend".

Comparing a preinstalled Windows with Linux installed on a random laptop whose manufacturer can't be bothered to support it is a bit unfair.

Linux on a laptop where the manufacturer did their work runs well.


Yes, machines with Linux preinstalled normally work quite well. But it's still a downside of choosing Linux that the choice of laptops is so much smaller. Similar to the downside of macOS that you are locked into pricey-but-well-built laptops, or the downside of Windows that "it runs Windows" doesn't mean the hardware isn't bottom-of-the-barrel crap from a vendor who doesn't care about Linux compatibility. WSL allows you to run a sane development environment even then :)


100% agree


I use Windows with wsl for work, and Linux and MacOS at home. Windows is a mess, it blows my mind that people pay for it. Sleep has worked less reliably on my work machine than my Fedora Thinkpad, and my Fedora machine is more responsive in pretty much every way despite having modest specs in comparison. Things just randomly stop working on Windows in a way that just doesn't happen on other OSes. It's garbage.


> You can use Wine/Crosseover, which is cool, but even now the number of software products it supports is tiny. Steam has a lot of games.

This isn't really the case, and hasn't been for some years now, especially since Valve started investing heavily in Wine. The quality of Wine these days is absolutely stunning, to the point that some software runs better under Wine than it does on Win11. Then there's the breadth of support, which has moved the experience from there being a slight chance of something running on Wine to it now being surprising when something doesn't.



> Where is the reverse WSL on Linux, where Windows is deeply embedded and you have all the Windows features in your hands?

https://github.com/Fmstrat/winapps

Enjoy.


Late I am, but this reminded me of someone else here posting about https://usebottles.com

( https://news.ycombinator.com/item?id=44025837 )

Seems to be based on very polished https://www.winehq.org , so no full VM, but maybe it's less 'heavy'?


I am experimenting with Bottles for an article right now, as it happens -- but no, it does not seem to me that Bottles is at all comparable to WinApps.

WinApps runs the apps on real native Windows in a VM, but integrates their UI with the host OS.

WINE does this anyway, and it's an inherent property of WINE because there is no guest OS. WINE does some fakery and indirection to make Unix filesystems appear on drive letters and things, but the app is still executing on the host OS, just via a translation layer.

Bottles runs the apps on top of WINE, but maintains separate WINE instances for each app and allows different ones to have different auxiliary tools, such as games-compatibility libraries, different versions of WINE, etc.


Sigh, I knew that. I just posted this in the wider context of running Windows applications under Linux, no matter which way and how, because at the end of the day that's all that counts. Yah, well. Maybe not, because the WINE approach could be seen as less resource intensive, while a full VM feels rather bloated, though more stable.

And because it bubbled up instantly, having read about it before, right here on HN, just ...uhhhmmm...hours ago.


OK, fair enough.

FWIW, I tested Bottles on 2 machines here, one with Ubuntu 22.04 and one with Ubuntu 24.04.

I could not get any app to install in Bottles that wouldn't run under bare WINE. Apart from a friendly GUI -- although it looks awful on any other desktop, like most Gtk 4 apps -- I can't see any benefit to it, TBH.


I was actually looking for something like this.


> WSL is more powerful than Linux

This is the kind of statement that makes you pay the karma tax. WSL is great, I use it on a day to day basis. I also use Linux on a day to day basis. And as great as WSL is, for running Linux software on supported hardware, Linux beats WSL hands down. And I mean, of course it does, do you expect a VM to beat native? In the same way that Windows software runs better on Windows. (with a few exceptions on both sides).

Compared to Linux, WSL I/O is slow, graphics is slow and a bit janky, I sometimes get crashes, memory management is suboptimal, networking has some quirks, etc... These problems are typical of VMs as it is hard for the host and guest OS to coordinate resource use. If you have an overpowered computer with plenty of RAM, and are mostly just using the command line, and don't do anything unusual with your network, then sure it may be "better" than Linux. But the truth is that it really depends on your situation.


[flagged]


Do you believe the 600+ people with the same problem here: https://github.com/microsoft/WSL/issues/4197


I knew which issue this was before I clicked it. Oh hey, there's me commenting in the issue a year ago!


WSL 1 had fast IO but couldn't support all features.

WSL 2 supports all features but has famously slow IO.

Example:

1. Shell into WSL

2. Clone a repo

3. Make a bunch of changes to the repo with a program within WSL

4. Run git status (should finish in less than a second)

5. Open repo from a Windows IDE

6. Run git status. This makes Windows change each file's permissions, ownership, etc. so it can access the files, as git status recursively travels through every file and folder

7. Go for coffee

8. Go for lunch

9. Git status finished after 35 minutes.

10. Close IDE

11. Shell back into WSL

12. Make a change in WSL

13. Run git status from within WSL

14. Wait another 35 minutes as Windows restores each file's ownership and permissions one by one

------------------------------------

The IO overhead is so bad that Microsoft built two new products just to get around it:

1. VSCode WSL remote-client architecture.

VSCode acts as a server within WSL and a client within Windows. Connect both VSCode instances (through a proxy/tunnel if needed) and the server can perform file IO ops on behalf of the client, rather than letting an application on Windows try to interact with any of WSL's file systems.

2. Windows DevDrive

Basically, set aside a virtual disk/partition and set it up as a different file system (ReFS) that doesn't use Windows file permissions and ownership, doesn't decrypt then decompress on each file read, doesn't compress then encrypt on each file write, and doesn't virus scan the files on use.

TL;DR Store the files on a network drive and hope race-condition ops from both WSL and Windows don't corrupt any files.
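
The practical workaround, if you have to live with WSL2, is to keep the repo on the Linux-side ext4 disk and only cross the boundary when you really must. A rough illustration (paths are made up):

    # Inside WSL2: the distro's own ext4 disk image is fast
    cd ~/src/myrepo && time git status            # typically well under a second

    # The Windows drives are exposed over a 9P network filesystem;
    # the same command crossing that boundary is what crawls
    cd /mnt/c/Users/me/src/myrepo && time git status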


Well, WSL is Linux. It's really just a VM of it (since WSL2; WSL1 actually ran on the Windows kernel, which was pretty cool).

The big drawback to WSL to me is the slow filesystem access because NTFS sucks. And having to deal with Windows in the first place.

PS: I wouldn't worry about your karma. It's just a number :P


NTFS is not the problem.

The problem is Windows IO filters and whatnot, Microsoft Defender trying to lazily intercept every file operation, and if you're crossing between windows and Linux land, possibly 9pfs network shares.

WSL2's own disk is just a VM image and fairly fast - you're just accessing a single file with some special optimizations. Usually far, far more responsive than anything done by windows itself. Don't do your work in your network-shared windows home folder.


>The problem is Windows IO filters

Not the biggest of the issues: 'find' and 'git status' on WSL2 in a big project are still >100 times slower on a Windows dev drive (which avoids those filters) than they are with WSL 1 on a dev drive.

WSL 1 on regular NTFS with Defender disabled is about 4x slower than WSL 1 on a dev drive, so that stuff does cause some of it, but WSL2 feels hopelessly slow. And WSL 2 can't share memory as well or take as much advantage of the filesystem cache (doubling it if you use the Windows drive in both places, I think, unless the network-drive representation of it doesn't get cached on the WSL2 side).


WSL2, in my testing, is orders of magnitude faster at file-heavy operations than anything outside WSL, dev drive or not. We have an R&D department that's using WSL2 and jumping through hoops to forward hardware because it's night and day compared to trying under Windows on the same machine. It provided other benefits too, but the sheer performance was the main selling point.

WSL2 does not take less advantage of filesystem caches. Linux's block cache is perfectly capable. Hyper-V is a semi-serious hypervisor, so it should be using a direct I/O abstraction for writing to the disk image. Memory is also ballooning, and can dynamically grow and shrink depending on memory pressure.

Linux VMs are something Microsoft has poured a lot of money into optimizing, as that's what the vast majority of Azure is. Cramming more out of a single machine, and therefore more things into a single machine, directly correlates with profits, so that's a heavy investment.

I wonder why you're seeing different results. I have no experience with WSL1, and looking into a proprietary legacy solution with known issues and limited features would be a purely academic exercise that I'm not sure is worth it.

(I personally don't use Windows, but I work with departments whose parent companies enforce it on their networks.)


> Linux's block cache is perfectly capable. HyperV is a semi-serious hypervisor, so it should be using a direct I/O abstraction for writing to the disk image.

Files on the WSL2 disk image work great. They're complaining about accessing files that aren't on the disk image, where everything is relayed over a 9P network filesystem and not a block device. That's the part that gets really slow in WSL2, much slower than WSL1's nearly-native access.

> Memory is also balloning, and can dynamically grow and shrink depending on memory pressure.

In my experience this works pretty badly.
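
For what it's worth, you can at least cap the VM rather than trusting the ballooning; something like this in %UserProfile%\.wslconfig (the values here are just examples), then run wsl --shutdown to apply:

    [wsl2]
    memory=8GB        # hard ceiling for the VM instead of relying on ballooning
    processors=4
    swap=2GB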

> a proprietary legacy solution with known issues and limited features

Well at least at the launch of WSL2 they said WSL1 wasn't legacy, I'm not sure if that has changed.

But either way you're using a highly proprietary system, and both WSL1 and WSL2 have significant known issues and limited features, neither one clearly better than the other.


> WSL2 does not take less advantage of filesystem caches.

My understanding is that when you access files on the Windows drive, the Linux VM in WSL2 caches them in its own memory, and the Windows side caches them in its own: now you have double the memory usage on disk cache for files that are active on both sides, taking much less advantage of caches than if you had used WSL1, where Windows serves as the sole cache for Windows drives.

I'm only comparing working on windows filesystems that can be accessed by both. My use case is developing on large windows game projects, where the game needs the files fast when running, and WSL needs the files fast when searching code, using git, etc. WSL1 was usable on plain NTFS, and now much closer to ext4 with dev drive NTFS. WSL2 I couldn't make fast.

You could potentially have the Windows files on a network drive on the WSL2 side, living in native ext4, but with that you get the double filesystem-caching issue, you might slow a game editor launch on the Windows side by way too much, your files are inaccessible during upgrades, and you always have to keep RAM dedicated to a running WSL2 just to be able to read your files. MS Store versions of WSL2 will even auto-upgrade while running and randomly make that drive unavailable.


Running WSL2 on Dev Drive means that you're effectively doing network I/O (to localhost); of course it's slow. It's also very pointless since your WSL2 FS is already a separate VHD.


Not pointless if you are working on a windows project but using unix tools to search code, do commits, etc. WSL2 just isn't usable for it in large projects. git status can take 5 minutes on unreal engine.


If you just need the stock Unix command line tools, MSYS2 will give you them at native speed, no VM needed, no funky path mappings etc.

WSL is for when you actually need it to be Linux.


I don't need just stock commandline tools, I use Claude Code etc. Maybe it can work in msys2 (it uses node) but it works well in WSL1. Also do a good bit with Python where some dependencies haven't worked in msys2.

I do windows and android gamedev and the server side is Linux.

However, WSL1 is pretty much abandoned and lots of newer distros have moved to forcing systemd or container based stuff that doesn't work in it, and some elf binaries no longer work in it (including newer node.js versions).

I use WSL2 as well when I really need it to be Linux and am doing webdev or something.

I use msys2 for a few things, usually compiling windows dependencies with the gnu toolchain that need GCC's stuff like inline assembly or computed goto (codecs etc.). Maybe possible from wsl1 too but they usually have full build and dependency instructions for msys2.


I use it, I am required to use Windows, and it’s a huge improvement over doing Data Science on native Windows, but the terrible filesystem access ruins what otherwise would be a seamless experience.

It’s fine for running small models but when you get to large training sets that don’t fit in RAM it becomes miserable.

There is a line where the convenience of training or developing locally gives way to a larger on demand cloud VM, but on WSL the line is much closer.


Slow IO is why I still use wsl1.


This. WSL was SO much more interesting in v1 times.


I liked the networking in WSL1 more too


Corporate networking is why I still use WSL1 (I didn’t spend enough time to check why it doesn’t with WSL2, zScaler could be the culprit maybe).

However it’s not perfect, for example I hit this bug when trying to run node a few days ago https://github.com/microsoft/WSL/issues/8219#issuecomment-10... and I don’t think they’re fixing bugs in WSL1 anymore


I still use WSL1, also because VMware runs so dreadfully slowly with any kind of Hyper-V enabled - if Hyper-V is on, VMware must also use it, so you get a Type-2 hypervisor running under a Type-1, and the lag and performance are untenable.


>The big drawback to WSL to me is the slow filesystem access because NTFS sucks

That's if you are crossing the VM/host boundary. If you use the space allocated to the VM, it's pretty fast.


Is it really an NTFS issue?

The culprit would be the Plan 9 bits (think of SMB or NFS but... wilder? why are they using 9P again?)


I'm guessing they use plan9 because distros already ship support for it, and it's super simple compared to NFS? It doesn't seem like CIFS/NFS would be any faster, and they introduce a lot more complexity.
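
You can see the arrangement from inside WSL2: the Windows drives show up as 9P mounts rather than block devices. Roughly (options trimmed; the exact ones vary by build):

    $ findmnt -t 9p
    TARGET  SOURCE  FSTYPE  OPTIONS
    /mnt/c  C:\     9p      rw,dirsync,aname=drvfs;path=C:\,...,trans=virtio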


Where are you experiencing filesystem slowness? I've been using WSL in some advanced configurations (building Win32 apps by cross-compiling from Linux CLANG and dropping the .exe into a Windows folder, copying large files from Linux->Windows and vice versa, automating Linux with .BAT files, etc.) and I haven't seen this slowness at all.



While I can see the subtle distinction you're trying to draw people's attention to (NTFS is not the problem, filesystem operations generally on Windows are the problem), I have to say it seems like a distinction without a difference in real terms. They made a range of changes that seem to produce more complicated code everywhere because the overhead of various filesystem tasks is substantially higher on this OS than on every other OS.

But in the end they had to get the OS vendor to bless their process name anyway, just so the OS would stop doing things that tank the performance for everybody else doing something similar but who haven't opened a direct line up with the OS vendor and got their process name on a list.

This seems like a pain point for the vendor to fix, rather than everybody shipping software to their OS


I find it to be incredibly janky. Pretty much every time my computer sleeps (so every morning, at least) I have to restart it, because somehow the VM-host networking gets screwed up and VS Code connections into the VM stop working. You also can't just put things in your Windows user directory, because the filesystem driver is so slow that git commands will take multiple seconds, so now you have two home directories to keep track of. There were also some extremely arcane things I had to fix when setting it up, involving host DNS and VPN adapter priority not getting propagated into the VM, so networking was completely broken. IIRC the time would also stop matching the host after a sleep and get extremely far out of sync, though I haven't run into that for a while since now I have to reboot Windows constantly anyway.

I don't have a need to run multiple OSes though. All of my tools are Linux based, and in companies that don't let people run Linux, the actual tools of the trade are almost all in a Linux VM because it's the only reasonable way to use them, and everything else is cross-platform. The outer OS just creates needless issues so that you now need to be a power user with two operating systems and their weird interactions.


> somehow the VM-host networking gets screwed up

> extremely arcane things I had to fix when setting it up involving host DNS and VPN adapter priority not getting propagated into the VM so networking was completely broken

Are you sure you set up the VPN properly? Messing around with Linux configs is a good way to end up with "somehow" bugs like that.


I don't know how it's set up. That's kind of my point, though. I have to now be an expert in Linux and Windows to debug this stuff, which is a waste of my time as someone whose job it is to develop (server, i.e. Linux) software. I had exactly zero issues when I was using Fedora. At one point my company made all of the Linux users move off (we do now have an IT-supported Linux image, but I haven't found the time to re-set up my laptop and don't fully trust that it will work without a bunch of trouble/IT back-and-forth, because they also made Windows users start using passkeys), and since then I've seen way more issues with Windows than Linux (e.g. one day my start menu just stopped reacting to me clicking on programs), in addition to things like ads on the lock screen and popups for some Xbox pass thing that I had to turn off, which is just insane in a "professional" OS. A lot of days I end up having to hold down the power button to reboot because it just locks up entirely.

OSX was a bit janky with docker filesystem slowness, homebrew being the generally recommended package manager despite being awful (why do I sometimes tap a cask and sometimes pour a bottle? Don't tell me; I don't care. Just make it be "install". Also, don't take "install" as a cue to go update all of my other programs with incompatible versions without asking), annoying 1+ second animations that you can't turn off that make it so the only reasonable way to use your computer is to never maximize a window (with no tiling support of course), and completely broken external monitor support (text is completely illegible IIRC), but Windows takes jank to another level.

By contrast, I never encounter the issues people complain about on Linux. Bluetooth works fine. Wifi works fine. nVidia GPUs and games work fine. Containers are easy to use because they're natively part of the OS. I prefer Linux exactly because I stopped enjoying "tinkering" with my computer like 10 years ago, and I want it to just quietly work without drawing attention to itself (and because Windows 8 and the flat themes that followed were hideous and I was never going to downgrade to that from Windows 7).


That's odd. I have none of these problems. Sleep doesn't interrupt the VM. And I regularly use the git CLI through WSL on projects living within Windows user directories. Both work fine.


FWIW, you can run a VPN (e.g. tailscale) in WSL2. I have WSL2 start up on boot and I can remotely ssh to WSL2 without logging into Windows at all.

I also have tailscale running on Windows itself and they don't conflict.
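
For anyone wanting to replicate the start-on-boot part, one way (not necessarily how the parent does it) is a scheduled task at startup. A rough PowerShell sketch, where the distro name, service and task name are placeholders and you may need to adjust the task principal so it runs without a logged-in user:

    $action  = New-ScheduledTaskAction -Execute "wsl.exe" -Argument "-d Ubuntu -u root -e service ssh start"
    $trigger = New-ScheduledTaskTrigger -AtStartup
    Register-ScheduledTask -TaskName "Start WSL sshd" -Action $action -Trigger $trigger -RunLevel Highest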


I think you might want to give more context.

I use linux. I don't need WSL at all. Not at work nor at home.

So you praise WSL because you use Windows as your main system? Then yes, it's great. It definitely makes the Windows experience a lot better.

OpenSSH for Windows was also a game changer. Honestly, I have no clue why Microsoft needed so long for that.


OpenSSH should have been a game changer, but they made a classic OpenSSH porting bug (not reading all bytes from the channel on close) and have now sat on the fix in "prerelease" for years. I prodded the VP over the group about the issue and they repeatedly made excuses about how the team is too small and getting updates over to the Windows team is too hard. That was multiple Windows releases ago. Over on GitHub, if you look up git receive-pack errors being frequent clone problems for Windows users, you'll find constant reports ever since the Git distribution stopped using its own ssh. I know a bunch of good people at Microsoft, but this leadership is incapable of operating in a user-centric manner and shouldn't be trusted with embedded OSS forks.


I'm a simple man, if I open the shell and `ssh foo@bar.com` doesn't work, I don't use that computer. Idk if Windows has fixed that yet or why it's so hard for them. Also couldn't even find the shell on a Chromebook.


PuTTY is no longer necessary? That would be a wild upgrade in usability for the work laptop, shall go try it


openssh has been an optional windows component for... almost a decade now? including the server, so you can ssh into powershell as easily as into any unix-like. (last time I set it up there was some fiddling with file permissions required for key auth to work, but it does work.)
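
For reference, the rough incantation from an elevated PowerShell prompt (capability names/versions can differ slightly between builds, and the key-auth fiddling mentioned is mostly about the administrators_authorized_keys file and its ACLs):

    # See what's available, then add the client and/or server
    Get-WindowsCapability -Online | Where-Object Name -like 'OpenSSH*'
    Add-WindowsCapability -Online -Name OpenSSH.Client~~~~0.0.1.0
    Add-WindowsCapability -Online -Name OpenSSH.Server~~~~0.0.1.0

    # Start the server and have it come back after reboot
    Start-Service sshd
    Set-Service -Name sshd -StartupType Automatic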


OpenSSH on Windows is great for the odd connection and SFTP session, but I still feel strongly that any serious usage should just stick with PuTTY and WinSCP. The GUI capabilities these provide are what Windows users are used to. The only benefit of built-in SSH is if you're working with some minimal image stuff, like Windows Server Core or Tiny11. IMHO.


IIRC (it's been a while) I used the server with vscode remote ssh extension.


imo the interesting part is openssh into Windows.


I feel old, but it's only 6 years, not a decade :P


I guess 'before covid' and 'decade ago' is the same in my mind ;) I might have been using a preview build back then, too


I dislike using putty, I use the ssh client from WSL. Just feels .. better. And bash/fish history helps.



On the other hand sometimes the GUI on WSL decides to break and you have to restart the whole thing.


Aged like fine milk


Running a Linux VM on Windows is nicer than just booting into Linux? That's quite a take. Windows is so user-hostile these days that I feel bad for those who have to deal with it. Calling it delightful must be symptomatic of some sort of Stockholm syndrome.


> symptomatic of some sort of Stockholm syndrome

I have since moved to MacBooks for the hardware, but until not too long ago WSL was my Linux "distro" of choice because I didn't want to spend time configuring my computer to make basic things work (suspend/wake on lid close, battery life, hardware-accelerated video playback in the browser, display scaling on an external monitor, and so on).


Who deals with this? All this is fine out of the box on a modern Linux distro.


That was certainly not the case ~2 years ago, the last time I installed linux on a laptop.

It also doesn't appear to be the case even now. I searched for laptops available in my country that fit my budget and, for each laptop, searched "<laptop name> linux reddit" on Google and filtered for results <1 year old. Each laptop's reports included some bug or other.

https://www.reddit.com/r/linuxhardware/comments/1hfqptw/linu...

https://www.reddit.com/r/linuxhardware/comments/1esntt3/leno...

https://www.reddit.com/r/linuxhardware/comments/1j3983j/hp_o...

https://www.reddit.com/r/linuxhardware/comments/1k1nsm8/audi...

The laptop with the best reported linux support seemed to be Thinkpad P14s but even there users reported tweaking some config to get fans to run silently and to make the speakers sound acceptable.

https://www.reddit.com/r/thinkpad/comments/1c81rw4/thinkpad_...


> linux

Which Linux? Each distro is essentially a different operating system.


I thought you said everything should work seamlessly on any modern distro.


Not all distros that exist in the current year are "modern". Mint for example, still ships with X11 and old forks of Gnome. Lots of people are running Arch with weird components that don't work well for whatever reason. And so on...

Modern means systemd, pipewire, Wayland, Gnome, an up to date kernel, etc... So the current Ubuntu and Fedora releases.

I've had 100% working laptops for 15 years now. Because I always run the newest Ubuntu.


I run Ubuntu and suspend is pretty much a nightmare, to the point that I just gave up pretending it exists. These are Dell computers sold with supposed Ubuntu support. Closing the lid and putting it in a backpack is inevitably an invitation for a hot laptop or an empty battery when you pull it out a few hours later (for the record: Windows isn't any better at this in my experience, so WSL never solved that problem either).

On previous laptops (all ThinkPads) I used to be able to get everything to work (Debian), but it did take effort and finding the correct resources. Unfortunately all the old documentation about this stuff is pre-systemd and pre-UEFI, and it's not exactly straightforward anymore.


Google "Dell suspend issues". It's just their computers, it doesn't work any better on Windows. My wife has had 2 Dell laptops now, neither suspended properly ever (and she only runs Windows). According to the internet, this is a Dell problem. One of her laptops also had the Wifi card break within 4 hours of use, brand new. But she likes the "design" and is stubborn.


Google harder. It's a general Windows problem. Microsoft can't even get it to work on their own Surface devices. Show me a Windows laptop that suspends properly and I'll show you a liar.


Well there you go. Meanwhile Linux suspend does work more often than not in my experience. I've had a ThinkPad, Acer and MSI laptop with working suspend on Linux.


Other than an up to date kernel, your list of what "modern" means is entirely wrong. The rest of the entries are polarizing freedesktop-isms. There's nothing out of date about, e.g., KDE Plasma.


Afaict, all the reporters used the newest available Ubuntu/Fedora/Arch.


I read all the links; most of the problems weren't bugs (Fan runs loud? Fans run under Windows as well... Only modern suspend? Literally created for Windows...). From all those links, the only thing that was a bug was an issue with a kernel regression, and 4 of the 5 distros they listed weren't ones I listed.

Maybe I was too positive on Fedora (I was going by its reputation; I use Ubuntu for work). Ubuntu is solid.


Issues reported:

Link 1: screen only updating every 2 seconds, visual glitches.
Link 2: brightness reset to full on screen unlock, fans turning on when charging.
Link 3: Bluetooth troubles, speakers can't be muted if the headphone jack is on mute.
Link 4: audio quality and low volume, wifi not coming back after sleep.
Link 5: fans being too loud, poor sound quality.

Either your Stockholm syndrome is affecting your reading comprehension or you just take bugs like these as part of the normal "working perfectly" linux experience.


Aren't these issues almost always kernel-related?


Nothing works out of the box with Linux. They may "seem" to work out of the box but you realize how many little tweaks go into making a laptop/consumer device work fully when you work as an embedded dev. It is quite difficult to get to the same power consumption levels and same exact hardware / software driver capabilities under Linux. There are simply no APIs for many things. So the entire driver has to live in userspace using some ioctls to write random stuff to memory or it cannot exist. There are also algorithms that the hardware manufacturer wants to keep closed.

Note that NVIDIA drivers didn't get better since they are more open source now. They are not. GPUs are now entire independent computers with their own little operating system. Some significant parts of the driver now run on that computer.

Yes, the manufacturers may allocate some people to deal with it and with the corrosiveness of the kernel community. But why? Intel and AMD use that as a marketing and sales strategy. If the hardware manufacturer is the best one there is, where is the profit in supporting Linux? Even Thinkpads don't have 100% support for all the little sensors and PMICs.

The HiDPI issue hasn't been completely solved yet. Bluetooth is still quite unreliable. MIPI support should be the best due to the number of devices, until you realize everybody did their own shitty external driver and there are no good common drivers for MIPI cameras, so your webcam doesn't work. The USB stack is still dodgy. Microsoft in the 90s had a cart of random hardware populating the entire USB tree, and they just fucked with the NT kernel, plugging and unplugging until it didn't break anymore, for love's sake. Who did that level of testing with Linux?


This is why you buy computers designed for Linux, with Linux preinstalled, and with support that you can call to get help if there is an issue.


Then you cannot claim that Linux works out of the box. It doesn't if you need to select hardware for it. However, I already know that, since I have actually used Linux for 15 years: on the consumer side as a normal user, and now as an embedded Linux developer. The underlying architecture of GNU/Linux distros is heavily server-biased, which is often the polar opposite of a consumer system.

Except for Apple (and maybe Framework), all laptops are designed by contract original design manufacturers (ODMs) in Taiwan, Korea and China. Your usual Linux laptop OEMs like System76 and Tuxedo just buy better combinations of the white-label stuff. They are inferior to the actual big OEMs' designs, which contain more sophisticated sensors, power management and extra UEFI features. This includes business laptops like Dell Latitudes, HP Elitebooks and Lenovo Thinkpads. None of those manufacturers actually do Linux-based driver development. All the device development, manufacturing and testing is done under Windows and only for Windows. The laptops are booted into Windows to do functional tests at the factory, not Linux.

Linux is an afterthought for all OEMs. After the Windows parts are released and tested, the kernel changes for Linux are added. The result is rudimentary support that doesn't include 100% of the feature set. Many drivers today have quite a proprietary user-space side. You'll get none of that from any laptop manufacturer. You may say you don't care about those and you're okay with a 10-20% power loss. That's not the definition of out-of-the-box for me.


> Then you cannot claim that Linux works out of the box. It doesn't if you need to select hardware for it

That is not what that means. At all.

> Your usual Linux laptop OEMs like System76 and Tuxedo just buy better combinations of the whitelabel stuff.

This is not what System76 do, actually.

> Many drivers today have quite proprietary user-space side. You'll get none of that from any laptop manufacturer.

Not with System76

> You may say you don't care about those and you're okay with 10 - 20% power loss.

I'm not. That's why I stopped buying Windows hardware and started buying Linux hardware!


Apple users' whole identity is based on thinking Linux users do this daily.


You need new reasons to hate Linux, because all those issues were solved a while ago.


There is a reason why 1) people whose main environment is Linux feel (correctly) that these problems have been solved a long time ago, and 2) people whose main environment is not Linux but who try Linux occasionally feel (correctly) that these problems still occasionally crop up.

People whose main environment is Linux intentionally buy hardware that works flawlessly with Linux.

People who try Linux occasionally do it on whatever hardware they have, which still almost always works with Linux, but there are occasional issues with sketchy Windows-only hardware or insufficiently tested firmware or flaky wifi cards, and that is enough for there to be valid anecdotes in any given comments section with several people saying they tried it and it isn't perfect. Because "perfect" is a very high bar.


>People whose main environment is Linux intentionally buy hardware that works flawlessly with Linux.

Hm, recently I bought a random "gamer PC" for the beefier GPU (mainly to experiment with local LLMs), installed Linux on it, and everything just worked out of the box. I remember having tons of problems back in 2009 when I first tried Ubuntu, though. I have dual boot, just today I ran a few benchmarks with Qwen3. On Windows, token generation is 15% slower. Whenever I have to boot into Windows (mainly to let the kid play Roblox), everything feels about 30% slower and clunkier.

At work, we use Linux too - Dell laptops. The main irritating problem has been that on Linux, Dell's Dock Stations are often buggy with dual monitors (when switching, the screen will just freeze). The rest works flawlessly for me. It wasn't that long ago when my Windows (before I migrated to Linux) had BSODs every other day...


My random "gamer PC" won't even boot into any Linux live CD, so I can't install it at all.

Anecdotes are like that.


> people whose main environment is Linux feel (correctly) that these problems have been solved a long time ago

There is also the quiet part to this. People who religiously use Linux and think that it is the best OS that can ever be, don't realize how many little optimizations go into a consumer OS. They use outdated hardware. They use the lower end models of the peripherals (people still recommend 96 DPI screens just for this). They use limited capabilities of that hardware. They don't rely on deeply interactive user interfaces.


I own a 2011 thinkpad, a 2014 i7 desktop and a "brand new" 2024 zen5 desktop. They all work wonderfully and all functionality I paid for is working. I haven't had a single problem with the newest machine since I bought it other than doing the rigmarole to get accelerated video encoder/decoder to work on Fedora. Sucks but I can't complain.

The older machines I've owned since around 2014, and I remember the hardware support was fairly competent but far from perfect; graphics and multimedia performance was mediocre at best, with ZERO support for accelerated video encode/decode. Fast forward to around the last year or two, and Linux on both of these machines is screaming fast (within those machines' capabilities...), graphics and multimedia are as good as you could get on Windows (thanks Wayland and PipeWire!), and accelerated video decode/encode works great (still have to do the rigmarole in Fedora, but it's OOTB in Manjaro).

Both the 2014 machine and the 2025 one sport a 4K display @120Hz (no frame drops!) with no issues using 200% scaling for HiDPI usage. Pretty much all of the apps are HiDPI aware, with the exception of a few running on WINE, which until a few months ago wasn't HiDPI aware. (This feature is experimental and, among many other improvements in WINE, may take another year to mature and be 100% stable.)


200% is just rendering the same pixels and then drawing them 4 times, and driving a single monitor at a single resolution is easy stuff. Would your HiDPI system work with one monitor at 125%, one at 100% and another at 150% scaling? This is when the font rendering gets fucked up and your HiDPI-native toolkits start blurring icons. That's my setup. Windows is perfectly capable of making this work. GTK wasn't able to do fractional scaling until recently and Qt has 100s of papercuts.

I got a Thinkpad just to run this setup under Linux in 2020. AMD didn't solve the problem in their driver until 2022, when I was able to drive all of them at 60 Hz.


No, 200% is rendering 4 pixels with "features" 2x larger in each axis. You may get 200% scaling as you said with some legacy apps that give zero fucks about DPI scaling but are still scaled through some mechanism to properly match other apps.

Fractional scaling has been a problem across all platforms, but I agree Linux has taken its time to get it right and still has some gotchas. You should try to avoid it on any platform, honestly; you can sometimes get blurry apps even on Windows. AFAIK KDE is the first to get it right in these complex situations where you mix multiple monitors with different fractional scaling ratios and have legacy apps to boot. GNOME has had experimental fractional scaling for a while, but it's still hidden behind a flag.

It also helps to not have nVidia trash on your old (and sometimes even new) computers if you want longevity. My old machines have intel and AMD graphics with full support from current kernel and mesa.


Linux is basically everyone's go to for older devices. Windows 10 will run like shit on a 10 year old laptop with 4GB RAM but latest Ubuntu is nice and snappy.


I have a 13 year old laptop that runs Windows 10. I cannot run Linux because neither nouveau nor Nvidia drivers support its GPU. It has 8 GiBs of RAM and it works perfectly for light browsing and document editing.


What GPU?


I don't need new reasons to hate Linux. Like I said, I have moved to macbooks as my personal computing device because of the better hardware.

> solved a while ago

That cannot be the case, because I was facing these issues less than a couple of years ago.

I was responding to the "Stockholm syndrome" comment specifically because there are a number of hardware and software problems (e.g. https://jayfax.neocities.org/mediocrity/gnome-has-no-thumbna...) with using linux as a desktop operating system that linux users have to find their way around, so I found the comment rather full of irony.

PS: I already know that the file-picker issue has been fixed. That does not take away from the fact that it was in fact broken for decades. It is only meant as an example.


> Can not be the case because I was facing these issues less than a couple of years ago

Just like with Mac and Windows, you choose the supported hardware, and everything is flawless.


If there's some set of fully Linux-capable laptops out there, it's a small subset of the Windows-capable ones.

And it's not clear what the Linux ones are. Like, our dept ordered officially Linux-supported Thinkpads for whoever wanted them, and turns out they still have unsolved Bluetooth audio problems. Those people use wired headphones now.


This is true. Until people pay reliably for Linux hardware instead of Windows, that will always be the case, just as it is for Mac.

Just like Mac, though, the key is to buy from a vendor that ships hardware designed for Linux, with Linux preinstalled, and with support for Linux.

Unlike Mac, though, Linux won't block you from installing it on Windows hardware, so it's not as obvious that you're on your own.


And what is supported hardware here? What even is "support"?


I'm writing this from Purism Librem 14, which works flawlessly, including suspend. There's also System76, Framework and more. See also: https://news.ycombinator.com/item?id=32964519.


As far as I can tell, Chromebooks are the only truly supported GNU/Linux laptops.


System76 is my go-to. There are others. You can even get some major vendors (Dell, Lenovo) to ship with Linux preinstalled, though I don't know if the firmware or chips diverge from the Windows variants.


Basically any thinkpad


There's no way, especially if you include Bluetooth in that list.


> Running a Linux VM on Windows is nicer than just booting into Linux

Indeed, it is. Having a stable system, not dealing with Linux on the desktop, and clear tradeoffs (like "just add another 16GB RAM stick in the laptop/desktop and you are golden") is great for peace of mind.

The average uptime on my laptops (note the plural) is ~3 weeks, until the next Windows Update has to be applied. I have no nostalgia for the days of using Linux on the desktop (~2003 student times, ~2008 giving it one more try, ~2015 as required by the day job).

Of course, it also helps that I can tell people around me (who are often not tech guys, but smart enough to know basic concepts and be able to run bash scripts provided to them) "yep, a machine with 32GB+ of RAM will work fine, choose any you like" - and it works.


I'm confused, in what world does running Linux require more RAM than Windows?

The suspend/hibernate on laptops isn't that great, but tbh I never had great results on windows either (macos is decent though).

And uptimes for desktop systems are similarly just limited by whenever there's a kernel update.


I meant the RAM overhead you need to run WSL2.


This is the opposite of what I've heard. Most often you hear of people installing Linux on old machines due to it performing better than Windows on low resources.


I'm talking about the more common situation of dealing with new hardware: why on earth would I go with an outdated and limiting T480 when the T16 Gen 4 is around the corner? Or ARM-based laptops.


If for some reason I could never use a MacBook again, it wouldn't be easy to decide between Windows or Linux as the host OS on a laptop. Do I want something that's intentionally user-hostile or something that's unintentionally broken a lot?

I'd at least try Linux cause I abhor Microsoft, but idk if it'd work out.


Maybe it is both-sidesism but the motd you get by default on Ubuntu these days is as bad as any OS. (“Ubuntu Advantage” sounds about as good as https://prospect.org/health/2024-01-12-great-medicare-advant...)

At least the nags in Windows look like modern web-based UI (so much so that 'use Electron' seems to be the post-Win 8 answer to 'how to make Windows apps'), in contrast to macOS, which drove my wife crazy with nag dialogs that look like a 1999 refresh of what modal dialogs looked like on the classic Mac in 1984.


My acid test for WSL2 was to install the Linux version of Google Chrome in it, and then play Youtube videos fullscreen with that. It worked. Somehow WSL1 was the more impressive hack but how can you argue with what works? WSL2 works fine.

Also 1980s style X11 widgets on the Windows desktop in their own windows? Cool.


I have to say too, though, once you get the hang of the way an EFI system boots, it's really good for dual boot. I let the Linux installer mount the undersized existing one as /boot/orig_efi and made a new, bigger EFI system partition. Not only was the UEFI on that particular laptop fine with it, scanning both EFI system partitions for bootable stuff, but also, grub2 installed in the new one automatically included the Windows boot in the old one as a boot option.

Cool because nothing about how Windows boots is intercepted; you can just nuke the new partitions (or overwrite them with a new Linux installation). I still prefer a native Linux boot with "just in case" Windows option to WSL.
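
For the curious, the after-install sanity check looks something like this on a Debian-family install (entry names and numbers will differ):

    # The firmware sees both ESPs and their loaders
    sudo efibootmgr
    # BootOrder: 0001,0000
    # Boot0000* Windows Boot Manager
    # Boot0001* ubuntu

    # If grub didn't pick up Windows automatically, let os-prober find it and regenerate
    echo 'GRUB_DISABLE_OS_PROBER=false' | sudo tee -a /etc/default/grub
    sudo update-grub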


But not having to dual boot and getting both worlds at the same time definitely beats dual booting.


I don't think people are using WSL to avoid problems with dual booting. Dual-booting has become about as simple as it can be, thanks to UEFI, but it's still not exactly fun to have to close all of your open apps to switch to another OS to run just one app.


You get much nicer window decorations if you use the wayland support instead of X11.


> You get much nicer window decorations if you use the wayland support instead of X11.

Wayland supports window managers ?


Step it up a notch and see if Netflix works w/ its DRM.


Forced to work on Windows for the ++nth job, I was looking forward to WSL. Indeed, while it worked, it was magic. Sadly, I have had no end of bizarre bugs. The latest one almost crashed my whole desktop - as far as I can piece together, something crashed, leading to a core dump the size of my desktop's entire memory - half the machine's RAM. This in turn put WSL in a weird state: it would neither run nor let me uninstall it. Googling found bug reports with similar experiences, no responses from Microsoft, and magic incantations that maybe worked for some people - but not for me.

It might be due to my corpo's particular setup etc. but for me 95% of the value of WSL would be the ability to run it on "corporate" Windows boxes. Alas.


I'm sure that feature is important for whatever work you're doing, but it's a feature I've _never_ desired, and WSL is missing plenty of features that are important for my work.

Hardware performance counters basically do not work in WSL2, which among other issues, makes it extremely difficult to use rr. https://github.com/rr-debugger/rr/issues/2506#issuecomment-2... Some people say they got it working, but I and many other users encounter esoteric blockers.

The Dozen driver is never at feature parity with native Linux Vulkan drivers, and that's always going to be the case.

By default, WSL security mitigations cause GCC trampolines to just not work, which partly motivated the opt-in alternative implementations of trampolines last year. https://gcc.gnu.org/git/?p=gcc.git;a=commit;h=28d8c680aaea46...

gWSL is also a terrible X11 server that makes many very basic window management configurations impossible, and while I prefer VcXsrv, it has its own different terrible issues.

I can imagine that WSL2 looks attractive if all you want to do is run command line apps in multiple isolated environments, but it is miserable for anything graphical or interactive.


> I can imagine that WSL2 looks attractive if all you want to do is run command line apps in multiple isolated environments, but it is miserable for anything graphical or interactive.

Indeed, that's my case - using the CLI mostly for ssh, curl, vim, Ansible and Puppet, and so on.

For GUI part, Windows is chosen and shines for me.


I think it really depends on what you do and whether the Linux side of it has hard dependencies on system packages. Personally, at work I much prefer working directly on my Linux workstation, and at home have even switched to using Linux for my gaming desktop. I really don't like the direction Windows has been trending for the past few years, and with the specter of a forced Windows 11 upgrade on the horizon I decided it's time to go all in. My system runs better and I can still play all my games. The jankiest thing I do is I have a mingw toolchain so I can compile some game mods into Windows DLLs to be loaded by Wine, but even that ended up being pretty seamless. Just install the toolchain and the project just compiled.


I don't understand. Docker/podman/distrobox/LXC all allow you to do the exact same thing without the virtual machine overhead. I think the real win of WSL is that it's the best of all worlds: you get to use Windows, with access to every game ever made plus all of the proprietary apps everyone needs to use, along with all of the upside of having a full and complete Linux command line experience.


You get all of Windows telemetry, vulnerabilities and backdoors, the always fun game of spot the new Advertising opportunity, AI “copilot” spyware I mean feature, updates that reset your machine at will, a terrible UAC model that encourages “just click OK already!”, and dependence on a company that has gone out of their way to prove how much of an unstoppable behemoth they are; and best of all you get to pay for the privileges above.

I know… every year is the year of the Linux desktop… but seriously the AI spyware included was enough to get me gone for good.


It's hard to pick the Windows feature I hate the most, but floating around at the top is Defender. It can't be disabled, at least not easily, and it demolishes IO performance. And Windows update takes the computer hostage, and takes ages to do anything giving no feedback in the process, meanwhile APT can update to a new major version in like 5-10 minutes.


You can set up local and limited user accounts under Windows. Many applications, including every development tool out there, don't need any admin permissions.

Spyware and adware is a government policy / regulation problem. Thanks to GDPR and DMA, using Windows in the EU is a significantly better experience (try setting up a Windows desktop with an EU image). You can remove almost all of the apps, including Edge and Copilot. There are no ads in the UI, neither in Explorer nor in the Start menu.


The current process to install Windows 11 with a local account… is to press SHIFT + F10 at a screen in the middle of the install after the first reboot, enter OOBE\BYPASSNRO into the command prompt, and disconnect from any internet options and/or disable your networking with ipconfig…

But guess what? Fuck you, because that is now the old way of doing it, and the new command is start ms-chx:localonly

This is a company that fucking hates you.


Or you just use Rufus to build the USB installation disk.


ventoy does it too


Yes, you get Windows telemetry which enabled fixing bugs without a bug report, you get minimal ads in the start menu (if you're playing "spot the new advertising opportunity" I found it. It's in the start menu. You can stop playing now), AI "copilot" which isn't spyware just because you think it is, updates that ASK you nicely multiple times to update (I don't want to be ableist, if you suffer from a Christopher Nolan Memento-like disability where you don't remember the warnings, you might think it's "resetting at will", but I assure you, it isn't), a great UAC model that's a lot better than "just type your root password into this terminal already, and just hope the binary wasn't hijacked in some way to keylog you, because unlike UAC, there is no visual evidence that you're not getting hacked", and dependence on a company that SV_BubbleTime thinks "has gone out of their way to prove how much of an unstoppable behemoth they are" with no evidence or clarity so they must just be making FUD, and best of all the OS costs so little you can pay it in 8 hours of working as a software developer.


I don't even care about privacy. Windows is too slow, nagging, and plastered with ads.


Stockholm’s my man.


Sunk cost, my man.


Good you diagnosed yourself


You're hilarious


with the virtual machine overhead.


Because it's easier to set up a local dev environment in WSL than in any of those.


How is it easier to setup a linux dev environment in WSL than in https://containertoolbx.org/ or https://distrobox.it/ or just in Linux directly?


I meant if you're using Windows to begin with


Gnome (a linux desktop environment) ships a "Boxes" app [0] that is very impressive. You can, with a few clicks, install one of a huge number of Linux distros in an auto-provisioned VM, enable hardware passthrough for USB devices and host 3D acceleration, and manage files with drag-and-drop from the host system. I also use it for Windows and MacOS VMs (don't tell Apple), but you need to provide your own images.

[0]: https://apps.gnome.org/Boxes/


> WSL is more powerful than Linux ...

Are you a Windows user who is happy to have a good way to run Linux on Windows, or are you a Linux user trying to convince other Linux user that instead of using Linux, they should use Linux in a VM running on Windows?

I am a longtime Linux user, and I can't see a reason in the universe why I would want to access my Linux through a VM on Windows. That seems absolutely insane.


Look I get it. I’m forced to use Windows at work and I thank the lord WSL is a thing. But I would switch to Linux base in a heartbeat if I could. WSL is jank as fuck compared to just using Linux.


I will also die on this hill - NixOS on WSL + Windows + komorebi[1] for tiling window management is peak productivity for me.

[1]: https://github.com/LGUG2Z/komorebi


Why not a Linux distro with i3wm, instead? What could possibly hold you back from upgrading?


I've yet to find anything comparable feature-wise on Linux - and they all come with the huge downside of having to roll your own cohesive settings widget ecosystem for basic everyday things like WiFi and Bluetooth connectivity. I run Cosmic Epoch on my old Macbook which is better, but again, feature-wise, it's just not comparable for serious work.


Thanks for your reply, but as a Linux user for over 20 years, all I take away from your post is that you haven't really tried, probably because the variety of distros vastly exceeds the two classic options of mac vs windows.

I understand the "roll your own" argument very well. In my time, I've experienced quite the variety of configs and dotfiles, but I'm not young anymore, so I've settled on using Regolith, which is an opinionated set of tools, including my favourite i3wm, on top of Ubuntu, and I simply use defaults for most things.

Anyway, it's much easier to use Linux as a daily driver than it's ever been. The choice of distro is simply which package manager to use, and everything else just works, as long as it's in the package manager's inventory.

I haven't compiled my own computer's kernel in 6 years (but I still cross compile for rpi and other IoT), and I haven't used my dotfiles in 3 years, just defaults.


> Thanks for your reply, but as a Linux user for over 20 years, all I take away from your post is that you haven't really tried, probably because the variety of distros vastly exceeds the two classic options of mac vs windows.

A very big and very incorrect assumption. This reads like you asked the initial question without any actual curiosity behind it.


Thank you for the details!


> having to roll your own cohesive settings widget ecosystem

What gets you that on windows? The builtin stuff is far from cohesive.


I just run NixOS, but that feels like a respectable answer.


> WSL is more powerful than Linux because of how easy it is to run multiple OS on the same computer simultaneously.

I'd venture to say this depends on which OS you're more comfortable with. I'm more comfortable with Linux, so I'd say it's easier/better/less janky to use Linux as a host OS.

> Like if one project has a dependency on Ubuntu22 and another is easier with Ubuntu24. You don't have to stress "do I update my OS?"

Once you're a developer who's been burned by this enough times, you do this with containers or dedicated dev VMs. You do not develop on your host OS and stay sane.


I think it depends a lot on what you're trying to do. I found that anything GPU-related was a nightmare of drivers and configuration which was a show-stopper for me. Now I just run arch/kde and that all works fine out of the box


Well, I'd still rather just use linux, but I take your meaning.


Me too. Particularly after having to do Docker things a few years ago, destroying my productivity due to file system speed.

However, for those of us that went Linux many years ago, and like our free open source, in 2025, is it better to go back to the dark side, to run Windows and have things like a LAMP stack and terminals run with WSL?

I don't play games or run Adobe products, I use Google Docs and I don't need lots of different Linux kernels. Hence, is it better to run Linux in Windows now? Genuinely asking.


As someone who occasionally does use WSL, I definitely think it's not better, no. But I'm still biased, because I know a lot more about using Linux than I do about using Windows, and WSL is still Windows.


for me,

> is it better to run Linux in Windows now? Genuinely asking.

definitely is. Servicing takes ~1 minute per month to click "yeah, let's apply those updates and reboot". Peace of mind from not worrying about whether external hardware won't work, or the monitor will have issues, or the laptop won't sleep, or the battery will discharge faster during a call due to lack of hardware acceleration, or noise cancellation won't work, or ...


wsl2 is linux


*on bare metal

not on a shitty wrapper running on an ad-platform.


I would rather use Linux, outside of VM.


While I mostly agree with this sentiment, sidestepping the power management and sleep issues, and getting better driver support and touchpad handling on some laptops, makes it quite a bit better.


If you have sleep and power management issues, your hardware does not support Linux.

This is not a Linux issue, it's a "I bought a Windows computer, slapped Linux on it, and expected that to work" issue.


I've been installing Linux almost universally on "Windows computers" [sic], per your characterization, for the past two decades or more. Sometimes great, sometimes meh. Your point? I am simply illustrating that there's value in WSL over bare metal in some cases, not playing the whose-fault-is-it game.


Sic? You don't understand the argument at all then.

Buy computers that were designed for and ship with Linux, and with support you can call to get help. Modern hardware is far too complex to handle multiple OSes without a major effort. Assuming they even want to support anything but Windows, which most don't.


Two things:

First, that's not the discussion at all. The question is does WSL have valid use cases and benefits over bare metal Linux. The answer is absolutely yes. For whatever reason you have the computer in front of you and you have the choice between the two modalities (many times you don't buy it, employer does, etc.)

Second, if everyone had your attitude, seeing PCs as "Windows computers", and stayed in their lane in the '90s and 2000s, you would not have the option of the three and a half supported "Linux computers" you are alluding to today. Viva hackers who see beyond the label.


WSL is better than no option, sure. It's not as good as Linux on Linux hardware.

The hackers sure. Reverse engineering takes a lot of skill and my hat's off to them.

Almost everyone here, though, is not in either camp. Most have the means and ability to buy a Linux computer if they so choose. But they don't, and then complain when Linux fails to run well on a system that has never had a team doing dedicated system integration work on it.


I agree. Back in the day (10+ years ago), I used to argue with people about why I ran VMs instead of just partitioning the disk and booting up the OS I needed.

XAMPP did not work out of the box for me on Windows (skill issue on my part, I know), so my preferred setup was to run an Ubuntu Server VM (LAMP stack) and then develop whatever I had in a Windows IDE.

I could have done that under full Linux, I just did not want that. Then Vagrant came into existence, which I'd say was made for my use case (but I never got around to adopting it).

I'm really happy with my WSL2 setup. I stopped using VMware Workstation when WSL2 broke it, but WSL2 is exactly what I needed to match my use case.


> XAMPP did not work out of the box with me on Windows (skill issue on my part, I know), so my preferred setup was to run a Ubuntu Server VM (LAMP stack) and then develop whatever I had on a Windows IDE.

Why wouldn't you have just spent 5 minutes to get XAMPP working?


It's really a skill issue on my part.

LAMP stack worked for me perfectly on Linux out of the box, whether Ubuntu Server or any RHEL-based distro (even with SELinux enabled!).

I spent a solid 8+ hours on that, decided it was uneconomical, and went the VM way.


> I stopped using VMware Workstation when WSL2 broke it

Is it still broken?


Nope, VMWare added the capability to work as a sort of nested hypervisor atop Hyper-V (which WSL2 and newer Windows security features depend on).

That being said, there is a performance impact.


WSL gave me the push to switch from macOS to Windows. And I couldn't be happier, tbh. There was a lot lacking in my Hackintosh/Windows dual boot setup.


> Edit: for clarity, by "multiple OS" I mean multiple Linux versions. Like if one project has a dependency on Ubuntu22 and another is easier with Ubuntu24. You don't have to stress "do I update my OS?"

For this part, I just create systemd-nspawn containers.

Last time I wanted to test something in a very old version of WebKit, creating a Debian Jessie container took a few minutes. Things run at native speed.
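Roughly, assuming debootstrap and systemd-container are installed (the directory and mirror below are just illustrative; Jessie lives on archive.debian.org these days):

    # bootstrap a minimal Jessie userland into a machine directory
    sudo debootstrap jessie /var/lib/machines/jessie http://archive.debian.org/debian
    # get a shell inside the container; add -b to boot its init instead
    sudo systemd-nspawn -D /var/lib/machines/jessie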


You use distrobox (https://distrobox.it/) and move on with your life. At work I use multiple versions of Ubuntu seamlessly on a host Fedora box, without messing with VMs and without issue. That includes building things like .deb packages.
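For anyone who hasn't tried it, the whole flow is a couple of commands (the name and image here are just examples):

    # create an Ubuntu 22.04 box on whatever host distro you run
    distrobox create --name ubuntu22 --image ubuntu:22.04
    # enter it; your $HOME is shared with the container by default
    distrobox enter ubuntu22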


> Like if one project has a dependency on Ubuntu22 and another is easier with Ubuntu24. You don't have to stress "do I update my OS?"

Have you tried lxd? It's far less janky than Docker (IMHO) to achieve what you describe. Docker is uniquely unsuited to your use case.
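As a sketch of what that looks like (the container name is arbitrary):

    # pull the image and start a system container
    lxc launch ubuntu:22.04 dev22
    # drop into a shell inside it
    lxc exec dev22 -- bash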


I love WSL, but you can do these things with Distrobox.


I'm with you - after years of messing with dualboot Linux, including (foolishly) running multiday Gentoo builds, WSL + Windows now gives me everything I want from Linux with zero friction.

In fact, I'm a little annoyed that I can't get a comparably smooth experience on my MacBook without spinning up a full QEMU VM. I know it's a bit hypocritical since, like most people, I run WSL2 (which is container/VM-based), not WSL1 (the original magic syscall translation vision).

Does anyone know why there's no lightweight solution on macOS - something like LXC plus a filesystem gadget - that would let me run stuff like "apt-get install chromium"?


Try https://tart.run/

> Native performance: Tart is using Apple's native Virtualization.Framework that was developed along with architecting the first M1 chip. This seamless integration between hardware and software ensures smooth performance without any drawbacks.


> WSL1 (the original magic syscall translation vision).

Actually, the OG "magic syscall translation" is Cygwin[0], which dates back to 1995[1].

[0] https://cygwin.com

[1] https://en.wikipedia.org/wiki/Cygwin

Edit: Fixed prose.


Absolutely! I remember playing and struggling with Cygwin back in the day… I meant original in the sense of the original vision for WSL.


Perhaps someone is working on it using the Mac hypervisor framework.

But Qemu (via UTM) starts up pretty quickly for me. No slower than WSL2 under Windows. My only issue is that it seems to drain power even when idle.


Is this close enough? https://github.com/lima-vm/lima
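In case it helps, the basic lima workflow is roughly this (from memory; the default template is an Ubuntu guest, and "lima" is a shorthand for "limactl shell default"):

    limactl start            # create and boot the "default" instance
    limactl shell default    # get a shell inside the guest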


I think WSL is great but if your only goal is to run several Linux OSes, any hypervisor will do. I think Proxmox is better suited to your use-case (hosted on Linux).

I love WSL because it lets me have the best of Windows and Linux.


Is it not the case that WSL2 is a VM, that it requires Hyper-V enablement, and that this turns your main Windows OS into effectively a type of privileged VM, since Hyper-V is a type 1 bare-metal hypervisor?

This is not often discussed, so it took me a lot of digging a couple of years ago, but I'm still surprised it's never mentioned as a consequence / side effect / downside of WSL2. There are performance impacts to turning on Hyper-V, which may or may not be relevant to the user (e.g. if this is also their gaming machine, etc.).


You don't stress about Windows updates? Hard to believe it.


Yeah exactly ... I want Windows running in Linux, not the other way around, so I actually control the software and the updates!

I actually just tried WINE for the FIRST time (surprisingly, I have been out of the Windows world for so long)

https://www.winehq.org/

And as long as I installed the binaries from their repo, not Debian 12, it worked very well

Wine is an impressive project too. It's not a VM, which has upsides and downsides, but I was able to run GCC-TDM, Python 3, and git bash in it!


What do you mean by that?


As a reply to: You don't have to stress "do I update my OS?"


I'm also not sure about your question. Over the last 5 years, the average interruption is ~5 minutes to apply an update, which happens roughly once every 3 weeks or so. Once or twice per year, release updates happen, and those take maybe 30 minutes of interruption (not totally sure here, as I usually grab my coffee and cigarettes and go read the news on the balcony, which may easily take ~1h for me).

So for me, updates practically don't affect my workflow at all.


Congrats. I'm a Linux desktop user, and I've still had to waste hours nursing miscellaneous recalcitrant Windows updates that annoyed the people around me.


I'm genuinely intrigued how you achieve this, assuming you are on a standard Windows Defender setup, not some three AV/DLP tools working on the system at the same time.

My setup/config has NOTHING special usually - get a laptop, start the bundled Windows (10/11) Pro, run "reset this PC" to ensure a fresh-like install, and use it for several years. No magical steps involved, nor whatever "bloatware cleanup" people do.


Having had this conversation with Linux users 100x, I've come to the conclusion that some people are just cursed. Me, I use Windows and have never run into the kind of boondoggles that they swear were common occurrences for them: Drivers breaking on update, BSODs, two-hour-long-forced-updates, etc. etc. It just... never happens. Maybe I'm the one that's blessed.


> It's an absolute delight to use, out of the box, on a desktop or laptop, with no configuration required.

I have been using it since the beginning of WSL 1 with a very terminal heavy set up but it has some issues.

For example WSLg's clipboard sharing is buggy compared to VcXsrv. It doesn't handle pasting into Linux apps without introducing Windows CRs. I opened an issue for this https://github.com/microsoft/wslg/issues/1326 but it hasn't gotten a reply.

Also, systemd is still pretty sketchy. It takes over 2 minutes for systemd services to start and if you close a WSL 2 terminal for just a few minutes systemd will delay a new terminal from opening for quite some time. This basically means disabling systemd to use WSL 2 in your day to day.

Then there's this 6 year old issue with 1,000+ upvotes https://github.com/microsoft/WSL/issues/4699 around WSL not reclaiming disk space. It means you need to routinely shut everything down and compress your VM's disk or you'll run out of space.
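For reference, the usual workaround is something like this from PowerShell (the vhdx path is a placeholder for wherever your distro keeps its ext4.vhdx, and Optimize-VHD needs the Hyper-V PowerShell module):

    wsl --shutdown
    Optimize-VHD -Path "$env:LOCALAPPDATA\Packages\<YourDistroPackage>\LocalState\ext4.vhdx" -Mode Full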

Beyond that it does work well, so I'm happy it exists.


> Also, systemd is still pretty sketchy. It takes over 2 minutes for systemd services to start and if you close a WSL 2 terminal for just a few minutes systemd will delay a new terminal from opening for quite some time. This basically means disabling systemd to use WSL 2 in your day to day.

That doesn't sound good. I was planning to set up a Windows/WSL2 box, but this gives me second thoughts. Where can I read more about this?


It's still ok even without systemd. Technically systemd is disabled by default, you have to turn it on with systemd=true in /etc/wsl.conf.

I can't find a definitive source with an open ticket, but if you Google around for "WSL 2 systemd delay startup" you'll find assorted folks talking about it, with a number of different reasons given.

I just went by my end results: there is a delay with systemd enabled and no delay with it disabled.
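For completeness, the toggle I mean goes under the [boot] section of the distro's /etc/wsl.conf, followed by a wsl --shutdown and restart:

    [boot]
    systemd=true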


Never had problems with systemd / 2-minute delays.

not sure what would be the correct test here, but:

    root@LP-T16:~# uname -rn
    LP-T16 5.15.167.4-microsoft-standard-WSL2
    root@LP-T16:~# time systemctl restart ssh

    real    0m0.039s
    user    0m0.008s
    sys     0m0.001s


The delay is related to starting WSL 2, not starting a systemd service btw.

Maybe it's specific to Windows 10 Pro, who knows. I'm using the latest WSL 2 from the MS app store.

I just know that when I installed Docker directly into WSL 2 and launched a terminal, I could not run `docker info` and connect to the Docker daemon for 2 minutes. The culprit was that the Docker service was not available. I was able to reproduce this on Arch and Ubuntu distros.

Separate to that systemd also delayed a terminal from opening for ~15 seconds (unrelated to Docker).

After ~10 minutes of the terminal being closed, both issues happened. They went away as soon as I disabled systemd.


The first opening of my main WSL2 Ubuntu 22.04 instance takes roughly 20 seconds; subsequent new terminals open in ~1s. As that happens once every 3 weeks or so, when Windows reboots for updates, I don't care much.

It takes me more time to enter the passphrases for my ssh keys into the agent anyway.

Granted, I'm not using native docker inside.


Jumping on the anti-WSL bandwagon: I just can't abide the loss of control on Windows. Will the next update ignore/reset/override my privacy settings? What Gordian knot must I cut to have a local-only account (thanks, Rufus!)? How do I turn off/uninstall a million things I don't want, Xbox Game Bar?!?

Linux or *BSD give so much more respect to the user; on Windows you are the product! Stand up for yourself and your data!


It doesn't work on any of my 3 Windows machines, all completely different hardware. Jank factor 100% for me. I wish I was seeing what you're seeing.


I like that wsl is a thing when I'm on a windows machine, but it can also serve as a reminder of the often unnecessary frictions that exist between operating systems.

When the answer to a "how do I do X on windows" question begins with "start WSL", my primary reaction is frustration because they're basically saying "there's not a good way to do that on Windows, so fire up a Linux VM".

Just to pick my most recent example, from today: I wanted to verify the signatures on some downloaded rpm files, and the rpm tools only work on Linux. I know, rpm files are native to a family of Linux distros, so it's not surprising that the tools for retrieving and verifying their signatures don't work on Windows, but... it also seems reasonable to want a world where those tools can install and run on Windows, straight from a PowerShell session, with no VM.

Multiply that by all the little utilities that can't be deployed across multiple operating systems, and it just seems like some incompatibility headaches are never really going to go away.


Still somewhat janky. I use it on my work machine (since it at least seems a bit faster than using VirtualBox) and regularly run into issues where npm won't build my project due to the existence of symlinks [1,2]. wslg windows also don't yet have first-party support from the windowing system [3]. I also remember having trouble setting up self-signed certs and getting SSL working.

1. https://stackoverflow.com/questions/57580420/wsl-using-a-wsl... 2. https://github.com/microsoft/WSL/issues/5118 3. https://github.com/microsoft/wslg/issues/22


Now if they could only do Windows 12 by taking baby steps with yearly releases of Windows 11.1, 11.2, etc.

Iterating on improvements and polishing screens and designs they haven't touched in the past 30 years. Improving ARM support, etc. And STOP adding ads to the OS.

And the Surface Laptop continues to push hardware quality forward: speakers, touchpad, screen, motherboard, etc.


It's not literally true that "Weasels Ripped My Flesh"[1] but WSL2 did rip the python support in QGIS by polluting my PATH with space characters.

[1] https://en.m.wikipedia.org/wiki/Weasels_Ripped_My_Flesh


I like WSL for this single reason too - it gives me space to run isolated experiments without touching my primary OS. So if that's what windows users get out of it, cool.

You can do the same thing with many other technologies on most other operating systems. I've used, in chronological order: FreeBSD jails, VMs, cloud-hosted VMs, Docker, K8s, and Nix flakes. WSL is probably somewhere around K8s.

My point is, we've had the ability to run "subsystems" for decades, by different names, on every OS. WSL is cool but quite late to the game, far from being "more powerful than linux".


Perhaps "more powerful" is also a factor of who is the computer user. For example, Linux is not as "powerful" if the computer user is someone who knows little about how to use it.

For a person who will not invest the time to learn, e.g., how to avoid or minimise dependencies, indeed something like Windows with WSL may appear "more powerful".

The point of this comment is that "power" comes from learning and know-how as much as, if not more than, simply from the choice of operating system. That said, some choices may ultimately spell the difference between limitations and possibilities.


Install Proxmox or TrueNAS on a bare metal desktop to experience the true power of multiple operating systems running simultaneously. On most days, I am running multiple VMs with these OSes in parallel: Windows Server 2025, Windows 11 Pro, and these flavours of Linux - TrueNAS/Debian, Ubuntu, Manjaro, Zorin OS. I also have a dozen or more lightweight containers running, some with LXC on the bare metal host and others with Docker inside the TrueNAS VM.

This setup automatically backs up my data and is resilient to disk failures. It’s the ultimate form of power and bliss.


I used to agree with this for WSL1. Syscall translation gave solid performance, decent FS integration, and interop within WSL with windows executables. I really liked it.

WSL2 has been such a pain. You're basically managing a VM with VMWare Tools somewhat more integrated. I gave up on WSL2 after a few months and went back to booting my arch installation most of the time. Now I'm on a mac for the first time in a long time because windows has gotten so bad.

This is doubly sad because the NT kernel is so well designed to host multiple OSes due to the OS/2 stuff decades ago. All wasted.


It is really good but honestly would prefer something a little more like:

- Linux that works great on a laptop / does the right thing when closing the lid
- Linux that doesn't have worse battery life than Windows / macOS
- Seamlessly runs Windows when you need to run something (e.g. click on Excel)
- Isn't necessarily free (prefer quality over low price in this situation)

Windows of course has many of these traits and WSL is a pretty good compromise, but I would prefer to boot into Linux and use Windows only when necessary (since my need for it is less common).


WSL is great if you're on Windows, but I wouldn't say it's more powerful than Linux. Distrobox on Linux covers your "multiple OS" use case quite well.


Windows treats you like a baby. You cannot learn the internals of it and it forces decisions on you. With Windows, the computer that you paid for is not yours.


You can run multiple OSes simultaneously on Linux itself - Linux can run VMs just fine. I.e. Linux guests on Linux host and so on. Take a look for example at virt-manager (libvirt / qemu + kvm).

And WSL is a limited VM using Hyper-V anyway. If you want to run a VM, you can just as well run a proper one which isn't limited and runs a full-blown distro with a sane configuration.
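For the CLI-inclined, a bare-bones qemu+kvm run looks roughly like this (disk and ISO names are placeholders):

    # create a disk image, then boot the installer with KVM acceleration
    qemu-img create -f qcow2 ubuntu24.qcow2 40G
    qemu-system-x86_64 -enable-kvm -m 4G -smp 4 \
      -drive file=ubuntu24.qcow2,if=virtio \
      -cdrom ubuntu-24.04-desktop-amd64.iso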

So WSL is definitely not more powerful than normal Linux.


And yet when I reboot my computer windows has shown me an entirely new place I can see ads - this week it was my lock screen.

So I left - I am willing to do more work to be spied on less, to be used as a product less, and to fight with my computer about who owns it less.


> and to fight with my computer about who owns it less.

This is a great way of saying it and expresses the uneasy feeling windows has given me recently. I use Linux machines but I have 1 windows machine in my home as a media PC; and for the last several years windows has made me feel like I don’t own that computer but I’m just lucky to be along for the ride. Ramming ads on the task bar and start menu, forcing updates on me, forcing me to make a Microsoft account before I can login (or just having a dark UI pattern so I can’t figure out how to avoid it, for the pedantic).

With Linux I feel like the machine is a turing complete wonderbox of assistance and possibility, with windows it feels like Microsoft have forced their way into my home and are obnoxiously telling me they know best, while condescendingly telling me I’m lucky to be here at all. It’s a very different feeling.


Yeah, "Weather and More" is such a joke. I like the idea of Weather on my lock screen in theory, and I sometimes miss Windows 8's great support for Lock Screen live data, but I have huge problems with almost everything else in the "and More" (news, no thanks, ads, definitely no thanks, tips, maybe not). Thankfully it is still really easy to turn off "Weather and More", but I wish they'd give us a "Weather and Nothing Else". (Same reason one of the first things I do is disable the "Widgets" display on the taskbar in Windows 11. Weather is great, everything else I don't want and/or actively hate.)


Yeah this is what pisses me off the most about windows. Telemetry that can't be turned off normally. Ads everywhere. Microsoft deciding when I must restart for updates. Microsoft trying to manage my behaviour telling me to try new features. Screw that. My computer is my own and must do what I choose.

This feature thing is really one of their strategies. At work they send us "adoption managers" that run reports to check whether people use feature xyz enough and set up stupid comms campaigns to push them to do so.

I really hate that. I decide how I use my computer. Not a vendor.


The development experience is relatively cumbersome compared to using a native Linux distribution and containerizing application dependencies where needed.


Last time I used it, Windows kept hogging some common keyboard shortcuts for whatever Windows stuff even though the VM window was focused. Did they stop that?


I've lived in WSL for 3 years now, and have zero complaints. It has worked with no issues what so ever. In 2025, Windows is the best Linux UI.


Using WSL on Win11. I would prefer Linux, but I never got used to Open Office/Gimp/... and need to use PowerPoint / Affinity. But WSL mostly works, and I added some tools and config to make it useful with WezTerm:

https://www.amazingcto.com/upgrading-wsl-with-zsh-and-comman...


> Edit: for clarity, by "multiple OS" I mean multiple Linux versions. Like if one project has a dependency on Ubuntu22 and another is easier with Ubuntu24. You don't have to stress "do I update my OS?"

You can run multiple Linux distributions in chroots or containers, such as Docker containers. I have shown people how to build packages for Ubuntu 22.04 on Ubuntu 20.04, for example.
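The one-liner version of that, just as an illustration, is to mount your source tree into the newer userland and build inside it:

    docker run --rm -it -v "$PWD":/src -w /src ubuntu:22.04 bash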


This is what tools like toolbx or distrobox solve. You can have easy to use containers with libs from any distro with a few commands, using podman or docker as the backend.


It's a... VM? Like the Linux VMs running on Linux computers in the cloud?

Sorry but not sorry, it's not easier to run than on linux. It requires the Windows store to work, and to use Hyper-V (which breaks VMware workstation, among other things).

It's in a better package, to be sure, but it's not "easier to run multiple OS on the same computer". It's easier to use multiple OSes (no SSH, GUI forwarding, etc), as long as all those OSes are Linux flavors supported by WSL.

Want FreeBSD or Windows? Nope!


Does it really need the store? I thought you could just go "wsl --install" on the console.


The files `wsl --install` installs, including and especially the distro files, still originate from the Store's CDN, so the truly paranoid who distrust the Store (including some corporate environments) and entirely block Store CDN access at the DNS and/or firewall level still break WSL installs.


There's a --web-download argument which helped with issues when I had limited access to the store.
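e.g. something like this, with the distro name taken from what wsl --list --online reports:

    wsl --install -d Ubuntu-22.04 --web-download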


You're likely right, I haven't used it in ages. Though I recall that at one point you had to get distributions from the Store, but it may have been that long ago that it was still being called "Bash for Windows".


As of 24H2, you can just "wsl --install" from the command line and it'll do all the necessary setup to get you up and running, including installation of Hyper-V components if needed.


You don't need the store.


> Want FreeBSD or Windows? Nope!

Well, it is the Windows Subsystem for Linux :) not the Windows Subsystem for Windows, or for FreeBSD for that matter :)

PS: I wonder if you can make your own image? After all, it's really just Hyper-V with some config candy.


It's a bit more than just some candy; there's substantial glue on both the Linux and Windows sides to get Plan9, WSLg, and the other components to work.

That said, the kernel they distribute is open source and you're not limited to just the distros they're working with directly. There are a number of third-party ones (e.g. there's no Arch from Arch or Microsoft, but there's a completely compatible third-party package that gives you Arch in WSL2).


>e.g. there's no Arch from Arch or Microsoft, but there's a completely compatible third party package that gives you Arch in WSL2

No longer true since last month.

https://lists.archlinux.org/archives/list/arch-dev-public@li...


I'm shocked. They were adamant it wasn't going to happen for a long long time.


The main complaint was the marketplace TOS that gave Microsoft a free pass on any trademarked assets. The new WSL2 installation path avoids all of this.

Along with the glibc hacks needed by WSL1.

(I was part of the discussion and also very adamant about this not happening)


Haha yes, I was being cheeky :)

I'm pretty sure that with the open-sourcing, we'll see FreeBSD or more exotic systems popping up quite quickly. Heck, macOS would be fun!


> Heck, macOS would be fun!

Especially in licensing! /sarcasm


That would make it even funnier in my book!


You're right, it is incredibly nice. Just the other day I got a Windows-only developer to install and use the POSIX/*NIX toolkit we use for development/deployment. In 30 minutes he was editing and deploying left and right with our normal open source stack. No messing around with Cygwin or MSYS or anything, it all just worked in Ubuntu on WSL. It's fantastic.


I share your sentiments. It makes testing my builds against Windows, Ubuntu 22, Ubuntu 24, etc. a breeze. It pretty much 'just works' and I can take it to go on my laptop. Even though I do most of my work in Linux, Windows is a convenient 'compatibility layer'. I was skeptical at first when my friend suggested I try this, but daily usage has won me over.


WSL is massively slower than Linux. Not just the 10% or so for VM, but probably 50-90% slower for disk access. It takes many times longer to start tmux. It has update bugs that crash open terminals and that's not even part of the regular windows forced-update fiasco. In short, it's garbage. It's one of the primary reasons I moved back to Linux for my daily driver.


I'm old enough to remember that before docker there was chroot. It's fairly easy to put lots of different user mode portions of Linux distros into directories and chroot into them from the same kernel. It seems a bit like what you're asking for.

There's also debootstrap which is useful for this technique, not sure if it also works on Ubuntu.
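The basic technique looks something like this (suite, target directory, and mirror are illustrative):

    # bootstrap a minimal Debian userland into a directory, then enter it
    sudo debootstrap bookworm ./bookworm-root http://deb.debian.org/debian
    sudo chroot ./bookworm-root /bin/bash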


debootstrap absolutely works in Ubuntu


For WSL 1, I kinda agree. It was basically the Posix Subsystem re-implemented and improved. Technically amazing, and running parallel to Windows without virtualization. Too bad it had so many performance issues.

But WSL2 is just a VM, no more, no less. You can do the same with VMware Workstation or similar tools, where you even get a nice accelerated virtual GPU.


My only big gripe with WSL right now is GUI applications. wslg is not good, and the only good experience is when applications have a good remote development UX such as vscode.

Another, smaller, gripe is networking. Because of how WSL is networked, I've run into edge-case issues with connecting to networked applications running in WSL from Windows.


You need to make sure that they use Wayland. Running X11 apps is significantly slower in WSLg; native Wayland apps run much faster.


Run a rootless X server (XWin, Xming) on Windows, network the two (SSH tunnel), you have GUI Linux apps on Windows.


Lack of support for all packet types disqualified it for me. Is there any hope for nmap, etc.?


I use WSL, but I'm actively looking for a way to move away from it. The only things holding me back are languages like Ruby or Python, which are designed to work in a Unix-like environment. I briefly considered forking Ruby and stripping out all of the Unix-isms, but in the end I gave up and just installed Linux (WSL).


docker is pretty easy to use on linux (even rootless docker isn't particularly painful) and KVM using QEMU is also pretty easy for running Windows things. I used WSL quite a bit but ultimately have switched back to running Ubuntu as my main.

Here's the main difference between making Windows vs Linux the main OS from my POV: Windows is a lot of work and only the corporate editions can be converted into not-a-hot-mess-of-distractions (supposedly). Out of the box Linux doesn't have all of the bullshit that you have to spend time ripping out of Windows. You can easily re-install Linux to get the "powerwash" effect. But if you powerwash Windows you have to go back and undo all the default bullshit again.

Having said that Windows+WSL is a very nice lifeline if you're stuck in Windows-land. It's a much better combo than MacOS.


WSL gives you no support for USB devices, which is a massive pain for embedded development when IT forces you to use Windows. Also, this might just be specific to my setup but WSL networking is very finicky with my company's VPN, and breaks completely if the VPN ever drops out requiring a full reboot.


WSL2 can forward USB devices

https://learn.microsoft.com/en-us/windows/wsl/connect-usb

I regularly run ADB through WSL2 using this.
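The flow from those docs is roughly this (run from an elevated PowerShell; the busid below is a placeholder for whatever usbipd list reports for your device):

    usbipd list
    usbipd bind --busid 4-2
    usbipd attach --wsl --busid 4-2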


That doesn't work for mass storage devices without a custom kernel, and that's just too much hassle to bother with.

https://askubuntu.com/a/1533361


There are always going to be niche cases. In general USB storage devices are slow to transfer data anyway, so you are better off copying the files directly from the Windows-mounted location.


For me it was slow, full of compatibility issues, and glitchy. Some simple packages wouldn't even install in the official Ubuntu WSL distro. To be honest I don't know what the use case for this is, other than to run some one-off Linux thing once in a while without having to use another box.


How long ago did you try that?

I use WSL2 to handle Linux (and Windows cross-) compilation regularly, along with running a number of native tools that are specific to Linux.

I've never had any issues with that, even to the point that I've been able to run MAME natively from Linux and have it show up like any other windowed app.


I agree with your opinion on WSL. I pay a similar "tax" when I defend ChromeOS, and I will not stop, just like you won't.

Linux on the desktop is finally approaching, in more than one "shape", none of which is the shape some people expected/wanted.


Windows 10 with WSL(2) is/was peak Windows for me. You could build stuff and edit MS Office documents in the same place. Sadly, it wasn't meant to last. I have no intention of giving W11 a try, not yet decided what I'll be using come this fall.


I'll second you: WSL makes Windows a first-class experience because now I can seamlessly have Linux and Windows apps on one laptop. Yes, I could run VMware Workstation or Hyper-V, etc., but this is just better integrated.


It's my daily driver. It completely changed the way I work. Am I curious if something will compile? Open a terminal and type make. The files are all already there. You can even run graphics apps. It's wonderful.


As of a couple of years ago the integration was not that great and I switched to just using a full-fledged VM instead. For example, trying to use binaries in WSL from within Visual Studio or vice versa was not great.


> WSL is more powerful than Linux because of how easy it is to run multiple OS on the same computer simultaneously.

I do that with KVM too, and each has their own kernel, not one shared kernel made and controlled by one vendor.


I heart WSL. Years ago I was going to switch to macOS to have a more Unix-like experience/workflow. Then WSL came out and I stayed, because Linux is the environment I spend most of my time in.


> Like if one project has a dependency on Ubuntu22 and another is easier with Ubuntu24.

Sounds like you could benefit from Qubes OS, which runs everything in VMs with a great UX. Including Windows.


I agree it is a convenient way to run multiple Linux VMs, but it comes with the drawback of having to use Windows, which is a major impediment to anything I may want to do with my computer.


I use Ubuntu 22 in a LXC container on Ubuntu 24 (because of Webex).

I also run other Linux instances with KVM.

I even run a Linux x86_64 executable on an ARM SBC using QEMU.

I just feel that Linux is so much more flexible than Windows.


The power of linux with the professionalism of paid MSFT engineers


You can run multiple linux distros on linux just fine via KVM/QEMU, there is nothing special WSL offers except that it is a must if you're doomed to use windows.


qemu on Linux solves a bunch of these problems as well. But yeah, UX-wise WSL is pretty good at solving the problem of “provide Windows devs a POSIX environment”.


QEMU is nothing like WSL UX-wise. The UX on Windows is: double-click GIMP and a window for GIMP opens. With QEMU, it opens a new window for the guest's window manager, has awkward input-focus interactions, you probably have to log in to the VM, and it cannot easily be set up to automatically open the app you want.


I used to love WSL when I had a Windows machine because I used lots of docker containers, but now that I am in a Mac with Apple Silicon, there is no going back.


ELI5 does it allow me to run windows programs in Linux?


no


WSL sucks and I much prefer having a true VM in Hyper-V. WSL is full of weird behaviour and gotchas. Docker?? No PID 1, a weird kernel, etc...


>WSL is more powerful than Linux because of how easy it is to run multiple OS on the same computer simultaneously

Is VMWare more powerful than Linux?


Previously, I had a dual boot with Ubuntu and Windows. Sometime last year I just removed Ubuntu, and haven't regretted it.

WSL works well enough.


I still have issues with the networking, but I agree. It's a fantastic system, and it only shits me that it could be a bit better.


It’s a delight to use if you don’t mind your computer conducting 24/7 surveillance on you for a multinational corporation.


If you want to “run multiple versions of Linux at once” and don’t like plain Docker, maybe check out Podman Desktop.


Most people have little use for running multiple OSes, and that drops a lot when you just abandon Windows entirely.


I want to know what limitations and tradeoffs I am embracing when using WSL vs. booting Linux off a USB stick.


On linux, I've been using lxc (now incus) for years to get different distros.


Can do the same with FreeBSD's Linuxulator. I run Arch Linux on FreeBSD, emulated.


WSL is so incredible. But support for it from 3rd party dev tools is so terrible.


I agree with you. Maybe if it had AI shoehorned in, hn would be happy.


This would be a great point if WSL didn't require running Windows


I tried it and found it to be such an abomination. I can’t understand why any self respecting software developer would use Windows with a bastard linux like WSL instead of just using actual Linux. Feels like a massive skill issue.


it's for when your corpo provides you with a windows laptop for development. hope that helps with your understanding


Well I guess now you just need to add WSL support to wine.


You can do the same on Linux. Distrobox exists.


We already have Distrobox that does same thing.


> Every time I praise WSL on hn I pay the karma tax

Hmm...

> WSL is more powerful than Linux

Oh.


I'm not the biggest fan of WSL2, but it's definitely good enough for people to like it. It's worked well enough for me in the past, but the last time I used it, there were enough problems with mDNS and BPF that it just made more sense for me to boot into leenucks.

But you're definitely not crazy for liking it. And people should chill out instead of downvoting for someone who just says what works for them.

I haven't tried Win11 and probably won't unless my employer forces me to. But if Win11+WSL2 works for you, more power to you.


Have you used Distrobox?


I won't downvote you, but I will die on the other hill - the one over there that has a guy sitting down with his arms folded, sporting an angry face every time someone says something positive about WSL. There's at least three of us on that hill. And we're not going anywhere.


I'll second this, and I'm someone who ran a certain alternative OS to Linux, before Linux was viable, instead of running Windows; worked as a developer of Win16 and Win32 apps early in my career, which gave me a deep love-hate of the platform; couldn't stand Microsoft's monopoly tactics back in the 1990s and 2000s; and remain ever-sceptical of Microsoft's open source and Linux initiatives...

... but WSL is an excellent piece of work. It's really easy to deploy apps on. Frankly, it can be easier to do a deployment there than on a Linux or macOS system, for example for the reasons detailed above.


Real talk. And anybody who argues is taking a heavy dose of copium to justify their use of Linux and the ensuing suite of compatibility issues that entails. Let them have their sense of superiority :' )


... You know that you can run VMs, or full-OS containers on a Linux desktop right?

Or on a macOS Desktop. Bonus: doing so on either platform doesn't also mean your host OS is running under a hypervisor, as it does with WSL2.

Bigger bonus: you don't have to run fucking Windows.


> Bonus: doing so on either platform doesn't also mean your host OS is running under a hypervisor

Why do you think, technologically, this is some form of "bonus"?


Because it broke/put restrictions on the ability to run other hypervisors as the user.


Windows by default runs on a hypervisor since some Windows 11 version.


That just sounds like another reason not to use Windows at all honestly.


That may all very well be, but uuh, you're then forced to use Windows


> WSL is more powerful than Linux because of how easy it is to run multiple OS on the same computer simultaneously.

This is why you pay the karma tax. This statement is so clearly false.

My Linux can run multiple Linuxes as well, without VM overhead - something Windows can't do. Furthermore, Wine allows me to forgo running any VM to run Windows applications.

I developed on WSL for 3 years and consistently the biggest issue was the lack of ability to use tooling across the shared OSes.

Your karma-depleting statements are biased and unfounded, and it shows, as you do not really provide counter-evidence. That's why you lose karma.


Except Wine can't cover all of Windows (partly through Windows' own fault). I can't run UWP apps, for example. Windows is not a good operating system, but if you need it, WSL creates a way more intuitive working environment for you. So even if you can run multiple Linux OSes on Linux, you can't run Windows as easily as you can run Linux on Windows. So OP's statement is not incorrect.


There are virtual machines for Linux with seamless window integration, so upgrading to Linux is still recommended imo.

OP's statement remains incorrect, because their assumption is that the WSL experience can't be reproduced in Linux.


Still can't run everything. Especially apps or games that do VM detection.


Another thing is that GUI integration is not as good as with WSL. You can't display Windows windows as native Linux windows. You can do that easily with WSL.


> You can't display Windows windows as native Linux windows

I can easily do that by using VirtualBox with Seamless Windows mode enabled.


I've never seen a good UWP app. My biggest issue with Wine is that it can't run anything that needs a driver. That means any hardware with garbage Windows-only control software (hello Roboteq) needs a proper VM.


Is anything using UWP? It's a complete dead end.


I totally agree and will join you on the hill. I used Linux exclusively at my job for two years straight and now do the same job from Windows 11 with WSL 2 on the same physical ThinkPad T41 laptop. Windows gets the basics right more than Linux did (sleep states, display, printing). And as the OP notes, it makes it easy to run multiple distributions and never fear that something I install or reconfigure within the WSL2 terminal will screw up my host. Having a different OS improves isolation in this regard, not at a technical level, but in terms of me making mistakes and entering commands in the wrong place, since Windows does not accept Linux commands. JetBrains and VSCode both have great support for WSL2.



