It is astonishing how oblivious people who are "paid to spend time on the product" can be about the products they ship.
Case in point: Microsoft shipped an EXPLORER.EXE with Windows 10 that is 20% whitespace because of an Adobe-invented metadata format that I bet nobody involved with Windows ever thought about. See http://ontology2.com/essays/LookingForMetadataInAllTheWrongP...
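For what it's worth, a claim like "20% whitespace" is easy to sanity-check yourself. A minimal Python sketch (the file path below is just an example, adjust for your system):

```python
# Count what fraction of a file's bytes are ASCII whitespace.
# A crude sanity check for "X% of this binary is whitespace" claims.

def whitespace_ratio(data: bytes) -> float:
    whitespace = b" \t\r\n"
    ws = sum(1 for b in data if b in whitespace)
    return ws / len(data)

# Example usage (path is illustrative):
# with open("C:/Windows/explorer.exe", "rb") as f:
#     print(f"{whitespace_ratio(f.read()):.1%}")
```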
Wasting space is a byproduct of having too much space. Just because it's bloated does not mean they didn't know it was bloated. They probably didn't care.
As time goes by, minimum hardware requirements for operating systems increase. This isn't because it has become more difficult to write software, or because the hardware stopped being capable of doing what it used to do. It's because software design wastes more resources as those resources become more available.
Today, people ship statically compiled binaries of tens to hundreds of megabytes in size because it is convenient to do so. That would have been absolutely insane before increased bandwidth and storage space made it practical. I have an internet connection so fast my wifi card can't use all the bandwidth.
This is one of the reasons Docker containers are so popular. They are basically statically linked apps, carrying everything they need within themselves.
But - I like that. Not having to worry about some library version, because I know the one that is supplied is the same one the developer put there. Nice.
Yep. The production box no longer needs to have the exact same library packages installed as the dev box, but now it needs the same docker images installed. It's just moving the goal posts - with respect to "dependency problems".
The actual benefit to containers is abstracted and unified software components in a complex system. But as we create new, different cloud computing tech, this becomes more difficult, and defeats the purpose somewhat.
Sure, but that should be part of updates. Even if the developer doesn't take care of it, you can rebuild the image to update its libraries. Of course the app might stop working, but at least other apps/containers still work and you can revert the change easily.
I don't know. I like static linking. I like things that just work™. I mean, look at PHP for Windows. I can't run php --version because I don't have the Visual C++ runtime installed. Why? Wouldn't it be easier to include everything you need with your binaries? How much extra stuff are we talking about?
To me, RAM and processor are much scarcer resources.
To me, that just speaks to Microsoft's failure to make the C++ redistributable API-compatible across versions. If it were, they could just bundle it with the OS.
The game is more graphically intensive than you might think and the people that care about this kind of game very much value a solid 60 FPS frame rate.
It would've been possible on the Saturn, and on the Dreamcast at 60 fps, easily. How much more powerful are even mediocre modern PCs compared to the Dreamcast?
Saturn would most likely choke on memory requirements, given how much is going on in the levels, as well as lack of transparency support. Symphony of the Night was the state of the art 2D platformer for that hardware generation and it has frequent loading pauses, despite being much more static.
Dreamcast would probably work, given enough effort by people who know the platform inside and out. But then, a much lower barrier to entry is probably a significant reason as to why Sega was willing to fund another 2D Sonic game in the first place.
That's an incredibly unfair comparison. Old hardware had built-in support for sprites, and as long as you kept within the limit the game would run smoothly. It had nothing to do with CPU speed - the machine had hardware support to draw, say, 8 sprites at the same time, so if you were only drawing 8 it was fine, but try drawing 9 and you wouldn't render even a frame a second. Nowadays we take an extremely brute-force approach because everything is its own object that has to be refreshed every frame - and in a game like Sonic Mania there is an insane amount of animation going on. In a way it would be easier to do a 3D game than a very complex 2D game like this.
Uhm, no? Even a PC video card from 10 years back can literally render millions of textured polygons (=basically sprites) at a steady 60 fps. Any modern card with programmable shaders would be able to do all the transforms and effects on the GPU with virtually zero CPU overhead at the same time.
There really is no excuse whatsoever for a 2D game like Sonic Mania to run badly even on low-end PCs with built-in graphics hardware.
Memory bandwidth is the main concern when blitting several layers of pixels 1:1, unless you have some kind of crafty system in place for avoiding overdraw. Although Sonic Mania seems to genuinely use a 320x240-ish internal framebuffer, so my wild guess is that the bottlenecks relate to scaling and post-processing, or possibly some funky driver use causing sync issues.
> Quick note: The situation is not different on Linux, MacOS, or other operating systems because all executable formats have some way to embed images. What publishers should do is remove unnecessary metadata before publishing, which is easy to do in the case of the PNG because we can simply omit the iTXt chunk.
From your link - so I guess the situation on linux isn't different? Or am I missing something, other than that it depends on the embedded PNG?
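The iTXt-stripping step the quoted note mentions really is that simple, since PNG is just a signature followed by length-prefixed chunks. A minimal Python sketch, working directly on the chunk structure (no imaging library needed; error handling omitted):

```python
import struct

PNG_SIG = b"\x89PNG\r\n\x1a\n"

def strip_text_chunks(data: bytes) -> bytes:
    """Return a copy of a PNG byte string with iTXt metadata chunks removed.

    Each chunk is: 4-byte big-endian length, 4-byte type, payload, 4-byte CRC.
    (tEXt and zTXt could be dropped the same way.)
    """
    assert data[:8] == PNG_SIG, "not a PNG file"
    out = bytearray(PNG_SIG)
    pos = 8
    while pos < len(data):
        (length,) = struct.unpack(">I", data[pos:pos + 4])
        ctype = data[pos + 4:pos + 8]
        chunk = data[pos:pos + 12 + length]  # length + type + payload + CRC
        if ctype != b"iTXt":
            out += chunk  # keep everything that isn't text metadata
        pos += 12 + length
    return bytes(out)
```

Since the chunk is copied or skipped whole, the surviving chunks' CRCs stay valid and the result is still a well-formed PNG.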
I'm not quite sure what exactly the author's point is.
Yes, you could embed images in ELFs, too, if you really wanted, but why? There's no standardized way to do it, and putting the images in separate files is easier.
There is a standard way to embed any file into an ELF. The program "objcopy" creates object files from arbitrary input files. You then simply link the objects into your executable. The program can then access those arbitrary embedded data chunks by their symbol names, just as if they were literal strings in the source code.