Back in the early 2000s, Sun ("the network is the computer") had a similar solution that worked seamlessly for most of their software org: the Sun Ray. https://en.wikipedia.org/wiki/Sun_Ray
It was a network terminal. Your files and entire session were on the server. Your “local” terminal consisted only of a network interface and enough compute power to display your session. The way they had it set up was that you could insert your Sun employee ID – the same card used to get into the building – into a slot in the terminal. That authenticated you to the server and displayed your session instantly. Want to show a colleague something you’re working on? Just put your ID into their Sun Ray and show them exactly what you were doing. That was cool! It was a frictionless way to demo and collaborate.
I worked on a Sun Ray. I and many colleagues absolutely hated it.
These tiny machines were just way too slow to handle even the tiny amount of work they had to do. Also, everyone knows that X over the network is just not made for modern applications ("modern" in the year 2000!). I worked with Matlab and had a lot of fun trying to rotate 3D plots with a few thousand points. It was just unbearable.
Then of course the "single point of failure" thing. Network problems? No one can work. Main server has a drive failure? No one can work. Main server needs an upgrade? No one can work.
The Sun Rays had super-poor USB support. My ergonomic keyboard had no auto-repeat when connected to these things, absolutely impossible to fix or even debug. Then of course there was Sun software: although they invented Java, their JVM was leaking like a sieve and everything Java had to be restarted regularly. The Sun coreutils were just very limited compared to the GNU counterparts. We complained endlessly, and in the end, IT budged and we all got our dedicated Linux machines.
How long did the networked machines experiment last at Sun before you switched to Linux PCs?
I remember that famous Larry Ellison speech in the mid-90s about how thin clients and networked applications would be the future. It's apparently what helped make him famous, because at the time no one in the tech media cared about enterprise DBs.
> everyone knows that X over network is just not made for modern applications ("modern" in the year 2000!). I worked with Matlab
No. I don't think everyone knows that at all. At my university we used X terminals connected to a central Sun SPARCcenter 2000 over ethernet and it was fine. We used MATLAB, Maple V, SAS, etc etc.
What's old is new again. How soon until we realize the X window system actually had some good ideas again and start running desktop apps on cloud servers for remote work?
It had some good ideas, but that was the extent of it. Every iteration of the solution since then has concluded that streaming all application UI is always going to be janky and wasteful. It's a lot better to host the data on the server side and let the client handle visual rendering by itself.
These days lots of people stream games and CAD sessions over the internet using software like Parsec. It works quite well and is often better than the UI running locally on an underpowered GPU.
Sure but most regular consumer or business applications don't really need that level of graphical power. Rendering a menu or button or blurb of text locally from some layout language is always going to be more performant than streaming raw pixels from a server.
It would be, if everyone agreed on a toolkit. As it is, X11, Wayland (AFAIK), and Windows/RDP all ended up just throwing pixels over the wire, because every program renders text/menus/whatever differently.
I wonder what the world would look like if we decoupled what something is supposed to be and what data it's supposed to have, from how it actually looks and acts. Say, some "lowest common denominator" that works reasonably across all platforms.
For example, I'd say that I need:
A dropdown for a single option, with options: A, B, C
And let the device itself decide what needs to be displayed in the native GUI toolkit. Then just send that specification over the wire, instead of needlessly wasting the bandwidth on lots of pixels.
Actually, I think I'm just describing an analogue to HTML with the equivalent of CSS provided by the platform, but for native desktop toolkits (hopefully without the complexity of a browser engine).
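Purely as a sketch of the idea above, with an entirely made-up JSON wire format (the field names and sizes are hypothetical, not any real protocol):

```python
import json

# Hypothetical wire format: describe *what* the control is and let the
# client render it with whatever native toolkit it has.
widget_spec = {
    "type": "dropdown",
    "label": "Choose one",
    "options": ["A", "B", "C"],
    "selected": "A",
}

payload = json.dumps(widget_spec).encode("utf-8")

# Compare with shipping the same control as raw pixels: even a modest
# 200x30 dropdown at 4 bytes per pixel (RGBA) is 24,000 bytes per frame.
pixel_bytes = 200 * 30 * 4

print(f"spec: {len(payload)} bytes, pixels: {pixel_bytes} bytes")
```

The spec stays under a hundred bytes and only needs resending when the state changes, while the pixel version has to be re-streamed on every redraw.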
It wasn't really groundbreaking tech; it had been possible for years (see OnLive). It's just that the business model sucks. We did some prototypes at a large company I worked for, and it was easy to tell that if you could put servers in local data centers with fat pipes, it would work. We had demos using EC2 back in 2013. Then we ran the numbers and were like "why?", especially with free-to-play becoming more and more of a thing, so we canned it.
Nvidia had a successful product as well before Stadia. Hell, I remember reading "use an AWS GPU instance as a gaming machine" blog posts before Stadia.
I don’t think Stadia itself was groundbreaking. Xbox Cloud Gaming worked as well as Stadia.
But a few years in, the big players in the cloud gaming industry figured out some critical latency issues around the controllers and optimized delivery of the video feed, to the point where lag was almost non-existent on a good fiber connection. The datacenter stuff was obviously the core innovation though.
I still prefer to download my Xbox games to my Series S/X, but I spent a year playing Xbox cloud games exclusively and I could 100% see that being the default for a big part of the casual audience.
I have streamed my work session with NoMachine and I have used VS Code Remote. NoMachine is really good, but I would still prefer to have a beefy enough machine.
I used to run Firefox on my Debian server with X tunneling in high school sometimes. It wasn't that bad then and surely it's even better with today's internet.
I bet X11 forwarding was pretty sweet on a LAN, but when I tried it over Wi-Fi/tethering it was pretty janky (and before someone suggests xpra, that wasn't buttery smooth either). But I do like the idea behind X11 forwarding better than VNC, where the whole desktop is shared. I read that on Windows, RDP can "stream" only a single window, whereas on Linux the implementation is full-desktop only? (Please correct me if I'm wrong, because I would love to tunnel apps from servers.)
Honestly, I think xpra is probably the best open solution you will find. It does have (too) many knobs to configure, so it takes time to try out different codecs and settings to find a sweet spot using their session-info toolset. And the mailing list is very helpful. For example, if bandwidth is plentiful, you can use the straight RGB encoding and it performs well. Turning on audio can have an effect on your latency as well.
IIRC, part of the problem is that X11 forwarding works best when the software uses raw X drawing primitives (someone can correct my terminology if I'm wrong), because then X11 can just send draw commands. Neither GTK nor Qt uses the X11 primitives, so instead the remote side has to send entire pixel buffers, which is naturally a lot slower.
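A rough back-of-the-envelope comparison of the two paths (the request size is an assumption in the right ballpark, not an exact protocol figure):

```python
# Assumed size of one rectangle-style X draw request (header + geometry);
# real X11 requests are roughly this small, but the exact number varies.
draw_request_bytes = 28

# The same widget rendered client-side and shipped as an RGBA pixel buffer.
width, height, bytes_per_pixel = 300, 40, 4
pixel_buffer_bytes = width * height * bytes_per_pixel  # 48,000 bytes

ratio = pixel_buffer_bytes // draw_request_bytes
print(f"pixel path is ~{ratio}x larger per update")
```

Even with generous assumptions, shipping pixels for a toolbar-sized widget costs three orders of magnitude more bandwidth than one draw command, which is why toolkit-rendered apps feel so much worse over forwarded X.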
X maintainers are promoting a new platform that doesn’t provide for remote hardware rendering. Ironically, shipping Javascript to a web browser (local to the display) seems to be the way ahead for performance.
We had SunRay at a municipality here in Sweden where I worked in the IT department. I thought it was really cool, and I had built our own "session routing" script that could connect terminals to different servers based on the smartcard ID.
Terminals in the schools connected to a school server where students could login without smartcards, but if a teacher inserted their card it would connect them to the "admin" server or to a windows VM.
I had my own server in the DC that I could then connect to from my desk using a multi-monitor SunRay terminal. At home I had a SunRay connecting in to the office with VPN. I could move between terminals by just inserting my smartcard in whichever terminal I was at.
There was even a company making a SunRay laptop called the Gobi that I tried using, but a regular laptop with the software client was a much better experience.
I helped deploy the previous version of that, the JavaStation, to everyone in MPK. They did not take away the existing computers, so the JavaStations mostly went unused.