
Can someone please explain why "feature parity" even matters here? If Nvidia can do something with GNU/Linux that is technically hard to do with Windows, why shouldn't they do it? In what way does it make sense to force technical limitations of Windows on GNU/Linux users?


"feature parity" is a poor choice of words, but nVidia builds their Windows and Linux drivers from a single code base.

Keeping support for four monitors in the Linux code probably meant adding a bunch of #ifdefs in various files (roughly like the sketch below).

They just decided it wasn't worth the effort to maintain.
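
To make that concrete (a purely invented sketch; none of the macro or type names below come from Nvidia's actual source), a shared code base ends up carrying per-OS guards like this, and every platform difference adds more of them:

    /* Hypothetical illustration only -- the NV_* macros, nv_gpu_t, and the
     * display limits are made up, not taken from Nvidia's driver.
     * Build with e.g. "cc -DNV_LINUX demo.c" or "cc -DNV_WINDOWS demo.c". */
    #include <stdio.h>

    typedef struct {
        int is_quadro;   /* 1 for a "professional" card, 0 for a consumer card */
    } nv_gpu_t;

    static int nv_max_displays(const nv_gpu_t *gpu)
    {
        (void)gpu;       /* unused in the fallback branch below */
    #if defined(NV_WINDOWS)
        return gpu->is_quadro ? 8 : 4;   /* Windows build limits (illustrative) */
    #elif defined(NV_LINUX)
        return gpu->is_quadro ? 8 : 3;   /* Linux build limits (illustrative) */
    #else
        return 3;
    #endif
    }

    int main(void)
    {
        nv_gpu_t consumer = { 0 };
        printf("max displays on this build: %d\n", nv_max_displays(&consumer));
        return 0;
    }

The same file compiled twice gives two platform drivers; dropping a Linux-only difference is one less guard like this to keep in sync.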


That may be true, but they are presumably already maintaining a bunch of #ifdefs for the Windows-only features.


Given the relative popularity of Windows vs Linux, it probably is worth the effort to maintain Windows-only features.


True, but Windows has the much larger market share so it becomes worth it.


> That may be true

We'll never know. That's a binary blob on your machine.


This is true, but I fail to see the relevance here. This post just comes across as an "any chance to mention Free Software" type of post. This thread is discussing whether this decision by Nvidia makes sense; injecting "well, we could all just ignore Nvidia if we were using Free Software" doesn't really contribute much to that conversation.

> That's a binary blob on your machine.

On my machine? It's less likely than you might think.


Being able to assume that the #ifdefs for features all go in one direction can have some benefit when reasoning about the code they relate to. In other words, there may be benefits to keeping the Windows features a superset of the Linux features, most likely in preventing stupid programmer errors.
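
For instance (again an invented sketch, not real driver code), if the convention is that feature guards only ever add Windows-only code on top of the shared path, then the Linux build is always a strict subset and there is never a Linux-only branch to account for:

    /* Hypothetical sketch with invented names: feature guards only ever ADD
     * Windows-only code, so the Linux build is a strict subset. */
    #include <stdio.h>

    static void setup_common_heads(void)   { puts("common head setup"); }
    #if defined(NV_WINDOWS)
    static void setup_windows_extras(void) { puts("windows-only extras"); }
    #endif

    static void setup_heads(void)
    {
        setup_common_heads();        /* shared path, built on every OS */
    #if defined(NV_WINDOWS)
        setup_windows_extras();      /* Windows-only extension */
    #endif
        /* By convention there is never an "#if defined(NV_LINUX)" branch here,
         * so nobody has to reason about a Linux-only code path. */
    }

    int main(void) { setup_heads(); return 0; }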


It goes (or at least at one time went) well beyond that. Last I heard, nvidia.o was shared across Linux, FreeBSD, Solaris, and Windows.


Don't these systems use different object-file formats?


At the very least I can confirm the object is identical (same md5sum) across Linux, FreeBSD, and Solaris. Past sources have indicated to me that Windows shares this object.


How is this even a limitation with Windows? I'm sure Windows can drive more than 3 displays, so why does Nvidia have this restriction on either platform?


Because they sell "professional" Quadro/Tesla cards with this feature enabled at enormous prices.


Aaaah... so this probably is not about Linux vs. Windows, but actually about 'GeForce GTX 560 Ti' vs. Quadro. They aren't worried about driving people away from Windows; they are worried about cannibalizing Quadro sales.

There are probably a lot of use cases for many monitors that don't really require Windows (perhaps pro power users, like traders, who need lots of browser windows open at once), so they need to ensure that their premium cards retain their edge even on Linux.

That makes much more sense.


But how many of those users would need to use NVIDIA's drivers? As I understand it, the benefit of NVIDIA's drivers over Nouveau is performance, not basic hardware support. If all you need is many monitors for web browsers, then you should not have a problem using Nouveau.


> But how many of those users would need to use NVIDIA's drivers?

That's a good point; I cannot say. Perhaps they think they are dealing with people who think 3D performance is necessary, or with people (IT organizations?) who prefer first-party drivers to hardware savings.

I don't think Microsoft is really involved in this though.


I have some experience here. With Nouveau, XRandr works out of the box -- great if you have one card. With the proprietary driver, no such luck (last I checked). On the other hand, my workstation has two cards, and XRandr is not up to the task; it was back to Xinerama and playing with xorg.conf.
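
For reference, that setup needed something along these lines in xorg.conf; this is a rough sketch from memory rather than a tested config, and the identifiers and BusIDs are placeholders:

    # Sketch of a two-card Xinerama layout; adjust the BusIDs to match lspci output.
    Section "ServerLayout"
        Identifier "TwoCards"
        Screen 0 "Screen0" 0 0
        Screen 1 "Screen1" RightOf "Screen0"
        Option "Xinerama" "on"
    EndSection

    Section "Device"
        Identifier "Card0"
        Driver     "nvidia"
        BusID      "PCI:1:0:0"
    EndSection

    Section "Device"
        Identifier "Card1"
        Driver     "nvidia"
        BusID      "PCI:2:0:0"
    EndSection

    Section "Screen"
        Identifier "Screen0"
        Device     "Card0"
    EndSection

    Section "Screen"
        Identifier "Screen1"
        Device     "Card1"
    EndSection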


I'm using the proprietary driver; it has had XRandr support for a while now.


Ah, my information is a bit out of date (at least a year, maybe more?). Thanks.


This sort of logic completely baffles me.

People buy hardware to use its advertised features. If you are the type of person who only needs 2D rendering and video, then you should use an Intel GPU. I would assume people who buy a state-of-the-art dedicated GPU want to use it for actual 3D acceleration.


Or possibly the advertised feature they're interested in is 'ridiculous number of attached monitors'. Intel GPUs generally only have a couple of display connectors; if you want a card with a lot of physical ports you'll want to buy a discrete GPU whether or not it supports 3D acceleration.


Even some of the Quadro cards (Q2000m in my personal experience) won't do more than 3 displays, and doing even that requires changing BIOS settings that make games unplayable for the most part.


Yeah, even Windows 98 could run 9 displays. From the sound of it, though, it looked like this was about presenting a single display to the OS across multiple GPUs. That's significantly different from Windows' built-in multi-display support.

Think full-screen apps that call APIs that aren't going to work across screen boundaries.


It doesn't. That's just their excuse. If it did, they'd start removing Windows driver features too, since those can't be found in the Linux driver and therefore there's no feature parity either.


I bet it has more to do with them pushing new cards and wanting users to replace their old ones.

I have the 560 Ti and it's capable of playing most current games comfortably. The newer cards seem to be able to do 4 monitors, but the 500 series is being left behind by Nvidia as they push the 600/700 series with their Shield and ShadowPlay features.


It's possible they have a licensing deal with MS that forces them to do this, the same way Amazon dictates that it must get the best price on a book.

Then again, I find the simplicity argument somewhat convincing as well. NVidia's unified driver model must be hellishly complicated anyway; anything they can do to keep the complexity under control is probably something they'd greet with open arms.


Maybe Nvidia doesn't want end users to feel like they're missing out on anything on Windows, and therefore feels the need to cripple the Linux driver?


But what is Nvidia's motivation for that? How does Nvidia win if people stay on Windows vs. moving to Linux? As far as graphics cards are concerned, they would just start putting more effort into their Linux support (to match the swing in customer OS usage).


Pressure from Microsoft in some fashion?


Valid point. We know how MS has managed to maintain its point-of-sale monopoly for decades (forced Windows sales via mandatory pre-installations). We know how they wanted to offer only their own browser on Windows (the EU disagreed with that idea).



