Moore's law has been unsustainable for 20 years; I remember Pentium 4s pushing toward 4 GHz. But that hasn't seemed to matter in terms of real day-to-day performance improvements. This article makes some great points about scaling costs and the reduced market opportunity for more than 2 or 3 makers in the market, but that's a trend we've seen in every market in the world; to be honest, I'm surprised it took this long to get there.
As interesting as this breakdown of the current state of things is, it doesn't tell us much we didn't already know, and it doesn't predict much about the future. That's the thing I most wanted from an expert article on the subject, even if we'd have to take it with a large pinch of salt.
>software is getting slower more rapidly than hardware is becoming faster.
>Wirth attributed the saying to Martin Reiser, who in the preface to his book on the Oberon System wrote: "The hope is that the progress in hardware will cure all software ills. However, a critical observer may observe that software manages to outgrow hardware in size and sluggishness."
I wish there were more developments like Mac OS X 10.6, where rather than new features, the software was simply optimized for a given CPU architecture and the focus was on improving performance.
This doesn't hold with my experience; I remember how slow my old machines used to be, especially the ones with spinning hard disks. Not to mention early smartphones… In my youth I had computers I'd turn on and then go make a coffee while they booted. Now it's practically instant.
Yeeeah dude, you have some seriously rose-tinted glasses on.
My Windows machine can go from completely powered off to a big application like a web browser fully open and operational in maybe 10-15 seconds, and yes, fully powered off. No way I'm making coffee in that time. Sure, Windows might run stuff in the background, but it's so fast that stuff doesn't get in the way of me using the thing at all.
As for phones, "forever" is a silly thing to say, but even if I granted you that one-off boot time, you have somewhat moved the goalposts, since you talked about the act of using software in day-to-day usage. On my laptop I can genuinely open 10 apps, close 10 apps, and open 10 apps again, all within 10 seconds, probably faster if I weren't constrained by literal click actions. My computers growing up would grind to a halt if you tried to run like 3 things at once!
There are always instances of inefficient software, but there were back in the day too; I remember generating a new map in Civilization 1 on my 286 and it taking 10 hours. There is no way you can possibly say that on average software is slower today than it used to be; that statement is just completely false.
> I remember generating a new map in Civilization 1 on my 286 and it taking 10 hours
Hmm. If my glasses are rose-tinted, yours are black-tinted. I've played quite a lot of Civ 1, even on an XT with an 8088, and I'm pretty sure map generation did not take 10 hours.
Perhaps if you hacked yours to use larger maps...
I'm also pretty sure Civ 1 got to the menu on my 8088 faster than Death Stranding 2 does on my PS5. Although that might be my rose-tinted glasses indeed. Civ 1 also didn't have a 29 GB day-one patch :)
Let's be honest for a second: we could both pick examples of slow software on old machines, slow software on new machines, fast software on old machines, and fast software on new machines, and talk about cherry-picked experiences all day long to "prove our points".
Which… by default… means you're going to have a lot of trouble asserting your "everything is slower now" point, if you see my point ;). I may also have to accept that I can't necessarily assert everything is faster now. Fortunately I don't really need to, as I'm happy to restrict my point to simple activities, such as web browsing and the like, but that's by the by.
And to answer your question: I don't remember why it took quite so long to generate Civ 1 maps, but I have a vague memory of trying to do weird things with the maps with my dad, so perhaps they were larger or there was some generation mod. I honestly couldn't tell you; my memory is too fuzzy. I do remember the copy protection that required looking up words in the manual, though…
The improvements you describe come down to one thing: solid-state drives. If we had had them in the 90s, your software would've been starting up just as fast (or even faster) then. Meanwhile, once the software did start, it ran faster and with fewer resources than software does today. It's not rose-colored glasses: software today truly is worse on average than software 20 years ago.
You say that as if, once the software had started, it had washed its hands of that slow spinning disk. Anything that interacted with data was a nightmare (god, remember trying to do video editing 20 years ago?? It was hell).
And oh yeah, 100% I agree SSDs are a huge factor in why software runs faster now than it used to, on average, for day-to-day activities. Thanks for mentioning one of the things that helps demonstrate my point.
But it's not the only thing, that's for sure. I can think of a few other critical things that have made everyday computing faster over the years, and I bet you can too.
Partially unrelated, but I once downloaded anime on dial-up; it used to take 30 hours per episode at low quality…
The horrors of booting Windows 2000 on a Pentium 1.
But there is something to be said for the old optimizations made under limited resources. A few months back a friend was restoring a G4 iMac, and I was astounded at how snappy the whole thing felt once the OS had finally loaded.
We forget how snappy iTunes was while using only 30 MB of RAM.
If you could combine the best of both worlds, there could be something special there.
Modern OSes and software are very bloated, but it is astounding how fast the hardware crunches through all that. Still, it could be better.
There was bloated software back then too; everything's relative in that regard. A huge amount of the "bloat" can also be attributed to project assets; we forget how restricted we used to be in the look and feel of applications as well. I bet people who wrote 1980s software said the same thing about iTunes needing a "bloat-worthy" 30 MB of RAM. I mean, what single application could possibly need that much! Am I right?
Software optimized for a particular processor is today's tech. In the future we may see CPUs that alter themselves to better serve the needs of software. I can see a day when a "CPU" is actually a bunch of different processors connected via an FPGA, so that software could reconfigure the CPU on the fly.
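To make that concrete, here's a toy sketch of what the software-facing side might look like. Every name in it (PROFILES, reconfigure, the bitstream paths, the commented-out fpga.load_bitstream call) is invented purely for illustration; no such API exists today:

```python
# Hypothetical sketch: software picks a hardware "shape" to match its workload.
# Imagined mapping from workload type to a pre-synthesized FPGA bitstream.
PROFILES = {
    "branchy_integer": "bitstreams/wide_ooo_core.bit",    # deep speculation, big branch predictor
    "dense_linear_algebra": "bitstreams/simd_array.bit",  # wide vector lanes, little control logic
    "pointer_chasing": "bitstreams/many_tiny_cores.bit",  # many threads to hide memory latency
}

def reconfigure(workload: str) -> None:
    """Ask the (imaginary) fabric to load the profile best suited to this workload."""
    bitstream = PROFILES[workload]
    print(f"reloading fabric with {bitstream} ...")
    # fpga.load_bitstream(bitstream)  # <- hypothetical driver call, does not exist

# An application would hint its next phase before entering it:
reconfigure("dense_linear_algebra")   # ... run the matrix-heavy phase ...
reconfigure("pointer_chasing")        # ... then walk the big graph ...
```

The open question, of course, is whether reloading the fabric could ever be fast enough to be worth it mid-run.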
The great thing about AI is that it is finally a killer feature that is enjoyed by, and useful to, users worldwide. And the tech industry finally has an excuse to upsell 16 GB as the baseline, and perhaps even push 24 or 32 GB of memory, along with GPU/NPU and CPU upgrades.
For users, a few hundred dollars extra (on top of the original purchase) is such a small number compared to the productivity gain over the usage span of the computer.
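Back-of-envelope version of that claim, with numbers I'm making up purely for illustration:

```python
# Rough arithmetic: does a memory/NPU upgrade pay for itself in saved time?
extra_cost = 300.0          # assumed up-front premium for the bigger config, USD
lifespan_years = 4          # assumed usage span of the machine
minutes_saved_per_day = 5   # assumed productivity gain from the faster config
workdays_per_year = 230
hourly_value = 30.0         # assumed value of the user's time, USD/hour

hours_saved = minutes_saved_per_day / 60 * workdays_per_year * lifespan_years
value_of_time = hours_saved * hourly_value

print(f"cost of upgrade:     ${extra_cost:.0f}")
print(f"value of time saved: ${value_of_time:.0f}")  # ~$2300 with these numbers
```

Even a few minutes a day swamps the hardware premium, if the gain is real.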
AI alone has increased not only server hardware requirements but also client requirements. It basically answers the question everyone has been asking: what comes after the smartphone? To a degree, the answer is AI (or LLMs).
This will easily push the whole semiconductor industry forward all the way to 2032-2035. We will be at 8A or 6A nodes by then.
PCIe 7, possibly PCIe 8? Wi-Fi 9, which is a fixed version of Wi-Fi 8. There are so many great hardware improvements coming, all because of the demand for greater computing usage.
The software side has been rather boring, TBH. I really like the phrase Alan Kay uses to describe modern-day software: "reinventing the flat tire".
I think the future of compute will look much like today!
Given the power and ubiquity of smart phones, most people don't need any other computer in their personal life. What can be done locally on a smart phone seems like it will be more constrained by battery life and physical size than anything else, and there will continue to be a mix of things that can run on-device and other more compute-hungry functions run in the cloud. I don't see smartphones being replaced or augmented by other devices like smart glasses - people want one device that does it all, and not one they wear on their face.
The same is somewhat true for business use, especially if compute-heavy AI use becomes more widespread: some functions local, the heavy work done in AI-ready datacenters. I'm mildly surprised that there hasn't already been a greater shift away from local compute to things like Chromebooks (ubiquitous in schools), since that model has so many benefits (cost, ease of management, reduced need to upgrade), but maybe it will still come if reliance on datacenter compute increases.
Even if we imagine futuristic power-sipping neuromorphic chips, I don't see that changing things very much, other than increasing the scope of what can be done locally on a power budget.
> Given the power and ubiquity of smart phones, most people don't need any other computer in their personal life.
Maybe, but they need other form factors, too. Like a big screen and a keyboard.
And once you have those, you might as well add some compute: getting smartphone-like performance out of desktop-like components (or even laptop components) is pretty cheap, so you might as well throw them in.
Chromebooks are fairly capable 'thin' clients. Their specs aren't that much worse than other laptops.
However, all that being said, I mostly agree that, at least with current-style AI, we will likely see a lot more datacenter compute in the future. At least as long as hardware costs are a significant fraction of total costs: the "cloud" can multiplex multiple users over expensive hardware, while your local computer is mostly idle, waiting for your input. (And it has to be, if you want low latency.)
However, if costs shift so that electricity and not hardware is the main cost, we might see a return to local compute. Maybe.
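To put rough numbers on that trade-off (all of these are made-up assumptions, not real prices):

```python
# Cost per user per year: a local box vs. a cloud box multiplexed across users.
hardware_cost = 2000.0    # assumed price of one capable machine, USD
lifetime_years = 4
users_per_cloud_box = 20  # assumed multiplexing factor (users are mostly idle)
work_energy_cost = 40.0   # assumed USD/user/year of electricity for actual work

hw_per_year = hardware_cost / lifetime_years  # amortized hardware, $500/year

local = hw_per_year + work_energy_cost                       # whole box for one user
cloud = hw_per_year / users_per_cloud_box + work_energy_cost  # hardware shared

print(f"local: ${local:.0f}/user/year, cloud: ${cloud:.0f}/user/year")
# -> local: $540, cloud: $65 with these numbers. If hardware gets cheap and
# energy dominates, the two converge: the "return to local compute" scenario.
```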
> Maybe, but they need other form factors, too. Like a big screen and a keyboard.
It seems generational habits are changing. TV is dead, and the younger generation seems OK watching things on small screens, even smartphones. You can always sync your phone to a parent's or a hotel's big-screen TV if one is available.
God of War 2 was made for a 300 MHz CPU and 32 MB of RAM.
We haven't been bound by Moore's law, because we just waste computing power; programmers are expensive. No one tries to optimize nowadays except in very niche places, and when push comes to shove we just start wasting slightly less, like adding a JIT to a scripting language 15 years too late.
As someone who has met people whose entire job is to optimise games for consoles, this isn't true. It's OK to want to get the most out of hardware in terms of features too.
Can you point to a couple of games from the last few years that would make one exclaim, "what kind of forbidden magic did they use to squeeze so much performance out of the platform"?
Like, seriously, the fact that they just went all-in on higher-end hardware for Doom: The Dark Ages was kind of disappointing.
Doom 2016 runs at almost 60 fps on a Pentium 4!
The one title that did impress me was "Humanity", which uses a heavily modified Unreal 4 engine; it runs very smoothly on limited hardware because of some smart design choices.
This question is a trap; I could answer it, but you'd find a problem with any one of my answers just to "prove" your point. How about you humour me and suggest some you think might at least come close?