
> NVIDIA doesn't support GBM

If Nvidia don't support it, how much of a useful standard can it really be?



GBM is a very useful standard if you only ever want to request buffers from drivers that use the Linux kernel's GPU buffer management code and modesetting code. (Preferably only Mesa-based drivers too.) NVidia doesn't use this and probably can't for licensing and other reasons. If you want to talk to any graphics driver that isn't correctly and intimately entwined with the right parts of the Linux kernel, GBM is basically useless. It is not in any way, shape, or form a generic standard for buffer management.

(In principle GBM isn't quite a single-implementation Mesa only standard - third party implementations are possible, though the only one that exists right now is by ARM for their newer Mali GPUs. People seem to have had mixed results with it and I'm not sure it's even intended to run desktop Wayland.)
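For illustration, here is roughly what requesting a buffer through GBM looks like. This is a minimal C sketch against libgbm; the device path, resolution, and format are assumptions, and it only succeeds where the kernel exposes a DRM device whose driver has a GBM backend (i.e. Mesa-style drivers, not the NVidia blob):

```c
#include <fcntl.h>
#include <gbm.h>
#include <stdio.h>
#include <unistd.h>

int main(void) {
    /* Assumed device node; on a multi-GPU machine it may be card1, etc. */
    int fd = open("/dev/dri/card0", O_RDWR | O_CLOEXEC);
    if (fd < 0) { perror("open /dev/dri/card0"); return 1; }

    /* gbm_create_device only works if the driver has a GBM backend;
     * this is exactly the call that fails on the proprietary stack. */
    struct gbm_device *gbm = gbm_create_device(fd);
    if (!gbm) { fprintf(stderr, "no GBM backend for this driver\n"); close(fd); return 1; }

    /* Ask for a buffer usable both for rendering and for KMS scanout. */
    struct gbm_bo *bo = gbm_bo_create(gbm, 1920, 1080, GBM_FORMAT_XRGB8888,
                                      GBM_BO_USE_SCANOUT | GBM_BO_USE_RENDERING);
    if (bo) {
        printf("stride: %u bytes\n", gbm_bo_get_stride(bo));
        gbm_bo_destroy(bo);
    }
    gbm_device_destroy(gbm);
    close(fd);
    return 0;
}
```

A compositor would hand the resulting buffer object to KMS for scanout; that handoff has no equivalent on the proprietary driver, which is why NVidia proposed EGLStreams instead.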


I'm not super familiar with the details of this stuff -- does this mean that there's basically no hope of any wlroots-based window manager ever working on non-Linux operating systems like BSD?


Most of the BSD variants have some (generally outdated) port of the Linux kernel graphics stack, sometimes even with a wrapper layer to try and make the BSD kernel internals look enough like Linux for it to run unmodified. So it's probably not completely hopeless, but that's mainly because the best shot at getting the graphics acceleration it needs is a straight port of the Linux kernel drivers. (At least for non-NVidia users.)


Thankfully, getting less and less outdated (thanks to FreeBSD's wrapper layer, and OpenBSD Foundation sponsoring its own work).

FreeBSD 12.0 has Linux DRM 4.16 code (April 2018), while -current has Linux DRM 5.0 code (March 2019).

OpenBSD -current (next release) has Linux 4.19 DRM code (October 2018, LTS), with initial AMDGPU backports from 4.20 (AMD Picasso & Raven 2 support).


wlroots already supports FreeBSD and OpenBSD support is in the works.


Intel and AMD maintain open source drivers that have supported this for ages. It's only NVIDIA that still tries to shovel outdated proprietary drivers without support for community developed standards down our throats.

Add the fact that their proprietary drivers are not particularly good, and NVIDIA is not a very popular choice for a Linux machine.

Just remember that it took them almost a decade to add KMS support.


It's an interesting position to suggest that Nvidia, which has no obligation to support you in any way, shape, or form, is shoveling anything down your gullet by not supporting the standards you prefer.

You could vote with your wallet but there aren't enough Linux users to move anyone's needles as far as gpus.


> but there aren't enough Linux users to move anyone's needles as far as gpus.

Well, for server and ML workloads, we are the vast majority. Things like Google Stadia are certainly enough to move needles, and if AMD ends up able to compete in ML with future products, we'd be able to make a huge dent in NVIDIA's revenue.


NVidia has a horrible history of Linux support.

GBM works just fine, they've simply chosen not to support it.


NVidia has a different problem too. Every time I want to install one of their cards for computing purposes only (not graphics), the installation procedure starts messing with my display system. Drives me crazy.


Nvidia has excellent support, I think you are conflating Linux with open source. For decades, Nvidia had (and arguably still has) excellent support for the former, without really caring for the latter.


Are you referring to the small decade it took them to support KMS so that their driver behaved even remotely like a modern one?

Or perhaps their marvelous installation methods of running a random script as root that rewrites configuration files, and its configuration interface that likewise also rewrites configuration files in attempts to get multi-monitor setups working that until recently hardly ever worked?

Maybe you are referring to their magnificent support, forcing you to stay on outdated kernels as upgrading would break compatibility and render you without a functioning graphics adapter short of a VGA-resolution framebuffer console?

It could also be their fantastic backwards compatibility, requiring you to keep track of driver series compatible with certain adapters, where every other GPU in existence just works OOB.

I used NVIDIA up until about two years back. While you could arguably get things to work, claiming they had excellent support is laughable at best.


I think he refers to the fact that Nvidia's drivers provided excellent OpenGL support for multiple years compared to the absolute dumpsterfire that fglrx was. Before Valve decided to pay attention to Linux and thus prod AMD to improve their drivers, if you wanted anything approaching serious 3D performance on Linux, you had to use Nvidia. Anything else would be a waste of money.


Fair enough: That was indeed true many years ago.


Not that many: fglrx's replacement amdgpu was released in April 2015, which is just four years ago. And it wasn't until later that it became usable.


Most distros have an NVIDIA driver package that aligns with the kernel version shipped. This also means updating one updates the other automatically.

Furthermore, the lag time for supporting the newest kernel is normally what, 30 days?

In brief, what you are describing is a self-inflicted wound.


Indeed, any problems stemming from using NVIDIA can be considered self-inflicted, which is why I left that dumpsterfire.


Except supporting Wayland properly, which all other major vendors have been doing for quite a while. I wouldn't call this excellent at all...


> Except supporting Wayland properly

Doing what everyone else does != doing it properly

> I wouldn't call this excellent at all

So far, Wayland support is furthest along in GNOME, and even there a lot of features are still missing or broken despite Wayland being well over a decade old at this point. A lot of very basic features (remote desktop, screen sharing, exclusive fullscreen, keyboard/mouse shortcuts) are still in their infancy. Not to mention both GNOME and KDE have implemented their Wayland compositor inside the shell process, so now a compositor crash means you lose all your open apps, which hasn't been a problem on Linux for decades.


> A lot of very basic features (remote desktop, screen sharing, exclusive fullscreen, keyboard/mouse shortcuts) are still in their infancy.

- remote desktop: good point. Considering that screen recordings work fine, I don't see a technical reason that a Wayland remote desktop setup couldn't work.

- exclusive fullscreen: what do you mean by this? Using Firefox, F11 and Super-F both work in Sway and F11 works with GNOME.

- keyboard/mouse shortcuts: maybe? The fact that a random process can no longer read all of your keystrokes seems like a plus, to me. Otherwise, just add a shortcut to your window manager or DE and run a command of your choice.

- losing all apps on compositor restart: Only an issue with GNOME and KDE. Pressing Super-Shift-C on Sway reloads the compositor and not your apps.
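For example, the "add a shortcut to your window manager" approach in Sway is a couple of config lines. This is an illustrative fragment: the grim/slurp screenshot tools and the $mod+p binding are my own choices, not anything Sway ships; the $mod+Shift+c reload binding is Sway's default:

```
# ~/.config/sway/config (fragment)
# Global screenshot shortcut, handled by the compositor itself
bindsym $mod+p exec grim -g "$(slurp)" ~/screenshot.png
# Reload the compositor in place; open apps keep running
bindsym $mod+Shift+c reload
```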


> The fact that a random process can no longer read all of your keystrokes seems like a plus

It is a plus, unless you write software that allows for global keyboard/mouse shortcuts (which I have done). In which case it is just a huge pain in the ass to not have it, and then hearing from some developers that you can just "add a shortcut to your window manager" is incredibly frustrating. It's not like you can trust an average end-user to actually do so, even the ones that do run linux. Then you will get complained at for not having functionality that existed at some point in the past.


Exclusive fullscreen = the application renders directly to the screen's framebuffer, bypassing the compositor. Not just a window that happens to be taking up the whole screen.


This is never going to exist the same way it did in X, which was a hack around the X display model.


It doesn't matter how it is going to exist, what matters is for it to exist.


Well--no, not in the same way, but it can still be a hint passed along somewhere, to allow the compositor to swap out its root framebuffer for the program's.


Screen sharing is quite important in my workflow. Fellow developers and customers will have me buying Windows or a Mac if all screen sharing applications on Linux stop working. I'm using Slack and Meet. Skype is almost abandoned among my customers.


I would like to see the approach you're proposing, of "not doing it like the others", applied to some other protocol like TCP. Networking would be really fun and the Internet a great success. The very concept of a protocol is that everybody does the same thing; only the underlying implementation differs.

Wayland and its current limitations have nothing to do with, and do not excuse, nvidia's non-compliant implementation.


The protocol is over a decade old; the implementations of it are not.


How long did it take to get multiple monitors working with xrandr, again?


Funnily enough, this is exactly what I was talking about. Nvidia supported multiple monitors on Linux for decades using TwinView, they simply didn't add support for the open source xrandr for 4-5 years, but even that has been supported for ~7 years now.


> Nvidia supported multiple monitors on Linux for decades using TwinView

And Xorg supported them for decades before using Xinerama. Instead of collaborating with upstream, they just dumped something incompatible and broken, and said "Deal with it".


Multi-monitor display worked perfectly for a long time, long before the "it works on Intel, let's call it the new standard" crew decided that a flat shared framebuffer was the only way to go (apparently because of compositors and nothing else), a change that also broke support for multi-GPU, and especially heterogeneous multi-GPU, setups.


Zaphod mode is still available. nVidia pushed its TwinView solution while everybody else was using either Zaphod mode or Xinerama. I don't think TwinView on Linux allowed any dynamic configuration. So, when the rest of the world switched to the RandR extension, every graphics card was able to dynamically add/remove monitors while nVidia users were stuck with a subpar solution.
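For reference, Zaphod mode meant statically describing each head as its own X screen in xorg.conf, roughly like the fragment below (driver name and BusID are placeholders). The key limitation is visible in the layout itself: it's fixed at server start, which is exactly what RandR later made dynamic:

```
# xorg.conf fragment: classic "Zaphod" dual-head setup,
# one X screen per head on the same card
Section "Device"
    Identifier "Card0-Head0"
    Driver     "radeon"        # placeholder driver
    BusID      "PCI:1:0:0"     # placeholder bus ID
    Screen     0
EndSection

Section "Device"
    Identifier "Card0-Head1"
    Driver     "radeon"
    BusID      "PCI:1:0:0"
    Screen     1
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Card0-Head0"
EndSection

Section "Screen"
    Identifier "Screen1"
    Device     "Card0-Head1"
EndSection

Section "ServerLayout"
    Identifier "Dual"
    Screen  0  "Screen0"
    Screen  1  "Screen1" RightOf "Screen0"
EndSection
```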


It's an nvidia problem, not a standard problem.


But I mean what's the point of declaring something to be a standard if the important people weren't interested in implementing and supporting it?

Anyone can come up with a standard in isolation. Getting the relevant people on board and able and willing to support it is the useful bit.

Did they say they'd support it? If they never said they would, it seems unfair to criticise. Are you going to support my graphics standard that I just made up?


This is a standard problem. You can't be a standard if the market doesn't care about you. You're just a specification and a set of guidelines.


Intel and AMD care about it. If the Linux desktop market were to grow, Nvidia would quickly change their mind. Now, whether this market grows is another question... But for me, NVIDIA GPUs are just not an option because they lack good drivers on Linux.


So how do you want to grow Linux market share if you say FU to a significant part of the potential market? The Steam hardware survey is pretty clear about the AMD:Nvidia GPU ratio.


Steam is a pretty niche userbase and not at all representative of the desktop market at large.


The same can be said about Linux desktop share. Should we draw the same conclusions?

Except gaming is not that "niche":

1.35 billion PCs worldwide - https://www.statista.com/statistics/610271/worldwide-persona...

47M daily active and 90M monthly users for Steam: https://www.pcgamer.com/steam-now-has-90-million-monthly-use...


I don't think so - I see nobody with a desktop computer these days except gamers or people running CAD or modelling who are basically doing the same thing as running a game.


And all people using a computer for work...


(For gamers.)


> If the Linux desktop market would grow

You can think of any number of fantasy scenarios that would make one of the biggest GPU vendors care about your standards, but sadly that is not the world we live in. The world we live in is one where Nvidia has the best game support on the market (far, far better than AMD - let's not even think about Intel here) and therefore anyone who still plays games on a PC will own one of them unless they are some sort of AMD advocate.

At the end of the day, nobody wants to be reading "we don't support this because they're not nice to us". That's going straight back to the linux dark ages of 15 years ago, where you needed to care about all sorts of weird and arcane details to get a functioning desktop system.


A standard supported by nothing; that reminds me of XHTML 2...


It is very condescending to label Intel, AMD and pretty much every other GPU vendor other than Nvidia as nothing.


Discrete GPU market share is 80%+ Nvidia, last time I checked.


Discrete GPUs are themselves a minority of the total market, last I checked.


It's also a significant portion of the "people who pay Linux developer salaries" demographic, in the form of the few workstation users running Linux-based software that is often paired with Quadros.


So are desktop Linux users who don't use GNOME or KDE, if we're honest with ourselves


If Nvidia don't support it, then it's just a standard, not the standard.


Meanwhile X is still on death watch and Nvidia will be brought into the fold or be forced to stop pretending they support the linux desktop.


Wherein "death watch" means X is still expected to work without difficulty for the next decade or longer.

They aren't pretending: they support Linux, BSD, Solaris, Mac, and Windows for a decade after each card is released.

While this was true, ATI/AMD were shipping garbage that barely worked for years. This has only changed recently.

We are only a few years into having the one dedicated GPU maker in the fold, and already talking about using your massive 2% market share to strong-arm the other. You could afford to be more humble and less entitled.


It really doesn't matter to me if Nvidia drops out of the linux desktop market or not. I've never owned their hardware and never will, so it makes not an iota of difference to me.

I'm just observing that Redhat is the trendsetter and if they say X is legacy, that makes it so. Unless Nvidia or Nvidia customers decide to pick up where Redhat is leaving off and pay for developers to work on X, but I sincerely doubt that's going to happen.

Unless Nvidia decides to support the new system, they can't plausibly claim to support the linux desktop. I just hope whatever happens will result in less online whining from linux-using Nvidia customers.


I think having a substantial chunk of potential machines no longer work would be a meaningful difference to the overall actual and potential user base regardless of how you feel personally. I think we should therefore act in everyone's interests.


Who's going to put up the money? Not Redhat, they don't want to pay for it anymore. Will the nvidia users? Will they be willing to pay for their card twice? Will Nvidia pay for the continued development of X, when they already seem to loathe spending money on Linux?

If nobody picks up the slack, the paths ahead seems pretty clear. Either Nvidia produces a proper driver, or their support for the Linux desktop can be classified as legacy at best. Feeling upset about this situation doesn't change the nature of it.


This is exactly the kind of reasoning that has always stood in the way of real people using Linux. It makes me sad that we've come a really long way towards taking that mindset out of the OS and we seem to be moving back to it again.


You may not like my reasoning, but how is my reasoning wrong? Somebody needs to pick up the slack now that Redhat has no interest in funding X. All you're doing is protesting that you don't appreciate this situation, but that's irrelevant.

It doesn't matter who the ones "suffering" are. What do you expect to do, convince me of the moral necessity of supporting nvidia cards so that I turn back time and devote my life to reverse engineering GPUs and implementing FOSS drivers? That can't happen. I don't have the power to support nvidia cards; only Nvidia has that power, and they seem to have no interest in exercising it. It doesn't matter if you convince me that nvidia card support is more important than curing childhood cancer; the situation remains unchanged. To change the situation you must convince nvidia; complaining to anybody else about it is a waste of your time.


NVidia doesn't care, why should the rest of the community subsidize their development costs?

The solution has been presented, and they reject it.


The point I was trying to make is that your end users are fundamentally the ones suffering. You are not suffering, because you avoid the problem by throwing money at it. Nvidia is not suffering, because you as a community don't matter to their wallet.

The only people who suffer are the people we write software for. Maybe that doesn't matter to you, because you write software only for people like you. That's fair. It matters to me though.


It is true: Intel's and AMD's GPUs can be considered nothing.



