
It is astonishing how oblivious people who are "paid to spend time on the product" can be about the products they ship.

Case in point: Microsoft shipped an EXPLORER.EXE with Windows 10 that is 20% whitespace, because of an Adobe-invented metadata format that I bet nobody involved with Windows ever thought about. See

http://ontology2.com/essays/LookingForMetadataInAllTheWrongP...
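
Claims like that are easy to check for yourself. Here's a minimal C sketch (nothing Windows-specific; it just reports what fraction of any file's bytes are ASCII whitespace - point it at explorer.exe or any other binary):

    #include <ctype.h>
    #include <stdio.h>

    /* Report what fraction of a file's bytes are ASCII whitespace. */
    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s <file>\n", argv[0]);
            return 1;
        }
        FILE *f = fopen(argv[1], "rb");
        if (!f) {
            perror("fopen");
            return 1;
        }
        unsigned long long total = 0, ws = 0;
        int c;
        while ((c = fgetc(f)) != EOF) {
            total++;
            if (isspace(c))
                ws++;
        }
        fclose(f);
        if (total > 0)
            printf("%llu of %llu bytes (%.1f%%) are whitespace\n",
                   ws, total, 100.0 * ws / total);
        return 0;
    }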



Wasting space is a byproduct of having too much space. Just because it's bloated does not mean they didn't know it was bloated. They probably didn't care.

As time goes by, minimum hardware requirements for operating systems increase. This isn't because it is more difficult to write software, or the hardware stopped being capable of doing the same thing it used to do. It's because software design wastes more resources as those resources become increasingly available.

Today, people ship statically compiled binaries of tens to hundreds of megabytes in size because it is convenient to do so. That would have been absolutely insane before increased bandwidth and storage space made it practical. I have an internet connection so fast my wifi card can't use all the bandwidth.

What a time to be alive.


This is one of the reasons Docker containers are so popular. They are basically statically linked apps, carrying everything they need within themselves.

But - I like that. Not having to worry about some library version, because I know the one that is supplied is the same one the developer put there. Nice.


I believe this is a myth! You still very much have to care - for security updates.

However, more often than not, you've just lost visibility into what libraries and versions you're actually running.


Yep. The production box no longer needs to have the exact same library packages installed as the dev box, but now it needs the same docker images installed. It's just moving the goal posts - with respect to "dependency problems".

The actual benefit to containers is abstracted and unified software components in a complex system. But as we create new, different cloud computing tech, this becomes more difficult, and defeats the purpose somewhat.


Sure, but that should be part of updates. Even if the developer doesn't take care of it, you can rebuild the image to update the libraries. Of course the app might stop working, but at least the other apps/containers still work, and you can revert the change easily.


I don't know, I like static linking. I like things that just work™. I mean, look at PHP for Windows: I can't run php --version because I don't have Visual C++ installed. Why? Wouldn't it be easier to include everything you need with your binaries? How much extra stuff are we talking about? To me, RAM and processor are much scarcer resources.


To me, that just speaks to Microsoft's failure to make the C++ redistributable API-compatible across versions. If it were, they could just bundle it with the OS.


The PC release of Sonic Mania was delayed because of slowdown issues.

Slowdown issues. On a game that is literally nothing but 90s graphics.


Unrelated, but about a year ago I downloaded Sonic the Hedgehog 2 from the Xbox Store onto my Xbox One (I think it cost £0.99).

Anyway, it was about 500MB in size, and it runs Sonic 2 on the Xbox One's Xbox 360 emulator, which is then emulating a Sega Mega Drive.

That just blows my mind! 8 x86 cores to emulate 3 PPC cores to emulate a Mega Drive.


The game is more graphically intensive than you might think, and the people who care about this kind of game very much value a solid 60 FPS frame rate.


Works at 60fps on Nintendo Switch though, which isn't exactly the most powerful machine in the world.


Not just works - it's 100% solid on the Switch outside of the Mode 7-ish bonus game, with no stuttering or frame drops.


It's good but definitely not 100% solid 60 fps according to Digital Foundry's analysis: https://youtu.be/DXturRMOzU4?t=7m23s


I imagine it is easier to get working on a known hardware configuration.


It's about Sega Saturn levels of graphically intensive, and should be well within the range of capability of even pleb-tier GPUs from the modern era.


Yet somehow it ran at 60fps 20 years ago, on hardware that was 10,000 times slower.

hmm


The game includes graphics which wouldn't be possible on the Mega Drive / Genesis - for example, alpha blending or lots of sprite rotation.

Also, Sonic on the Mega Drive had multiple framerate drops, e.g. when you lose lots of rings.


It would've been possible on the Saturn, and on the Dreamcast at 60 fps, easily. How much more powerful are even mediocre modern PCs compared to the Dreamcast?


Saturn would most likely choke on the memory requirements, given how much is going on in the levels, as well as its lack of transparency support. Symphony of the Night was the state-of-the-art 2D platformer for that hardware generation, and it has frequent loading pauses despite being much more static.

Dreamcast would probably work, given enough effort by people who know the platform inside and out. But then, a much lower barrier to entry is probably a significant reason as to why Sega was willing to fund another 2D Sonic game in the first place.


Though we do have 4K TVs now.


That's an incredibly unfair comparison. Old hardware had built-in support for sprites, and as long as you kept within the limit, the game would run smoothly. It had nothing to do with CPU speed: the machine had hardware support to draw, say, 8 sprites at the same time, so if you were only drawing 8 it was fine, but try drawing 9 and you wouldn't render even a frame a second.

Nowadays we take an extremely brute-force approach, because everything is its own object that has to be refreshed every frame - and in a game like Sonic Mania there is so much animation going on it's insane. In a way, it would be easier to do a 3D game than a very complex 2D game like this.


Uhm, no? Even a PC video card from 10 years back can literally render millions of textured polygons (=basically sprites) at a steady 60 fps. Any modern card with programmable shaders would be able to do all the transforms and effects on the GPU with virtually zero CPU overhead at the same time.

There really is no excuse whatsoever for a 2D game like Sonic Mania to run badly on low-end PC hardware, even with built-in graphics.


Note that they actually did set the minimum requirement to "a PC video card from 10 years back": http://store.steampowered.com/app/584400/Sonic_Mania/

Memory bandwidth is the main concern when blitting several layers of pixels 1:1, unless you have some kind of crafty system in place for avoiding overdraw. Although Sonic Mania seems to genuinely use a 320x240-ish internal framebuffer, so my wild guess is that the bottlenecks relate to scaling and post-processing, or possibly some funky driver use causing sync issues.
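
For a rough sense of the numbers, here's a back-of-the-envelope sketch in C; the resolution, layer count, and pixel size are illustrative assumptions, not measurements from the game:

    #include <stdio.h>

    /* Back-of-the-envelope bandwidth for blitting full-screen layers 1:1,
     * with no overdraw avoidance. All figures are assumptions. */
    int main(void)
    {
        const double width  = 3840;  /* output resolution (4K) */
        const double height = 2160;
        const double bpp    = 4;     /* bytes per RGBA pixel */
        const double layers = 6;     /* parallax + sprites + UI, say */
        const double fps    = 60;

        /* Each layer is read once per frame; the result is written once. */
        double read_gbs  = width * height * bpp * layers * fps / 1e9;
        double write_gbs = width * height * bpp * fps / 1e9;
        printf("~%.1f GB/s read + %.1f GB/s write\n", read_gbs, write_gbs);
        return 0;
    }

With six full-screen layers at 4K, that already comes to roughly 12 GB/s of reads - a noticeable slice of the shared memory bandwidth on a typical integrated GPU, though a low-res internal framebuffer shrinks it dramatically.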


There's slowdown in some sections on the Xbox One version, too.


That is hardcore dedication, props.


> Quick note: The situation is not different on Linux, MacOS, or other operating systems because all executable formats have some way to embed images. What publishers should do is remove unnecessary metadata before publishing, which is easy to do in the case of the PNG because we can simply omit the iTXt chunk.

From your link - so I guess the situation on Linux isn't different? Or am I missing something, other than that it depends on the embedded PNG?
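
For the curious, omitting the chunk really is mechanical: a PNG is an 8-byte signature followed by length/type/data/CRC chunks, and the textual metadata chunks (iTXt, tEXt, zTXt) are ancillary, so a copy that skips them is still a valid PNG. A minimal C sketch, assuming a well-formed input, with error handling kept short:

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Copy a PNG, omitting textual metadata chunks (iTXt, tEXt, zTXt). */
    static uint32_t read_be32(const unsigned char *p)
    {
        return ((uint32_t)p[0] << 24) | ((uint32_t)p[1] << 16) |
               ((uint32_t)p[2] << 8)  |  (uint32_t)p[3];
    }

    int main(int argc, char **argv)
    {
        if (argc != 3) {
            fprintf(stderr, "usage: %s in.png out.png\n", argv[0]);
            return 1;
        }
        FILE *in = fopen(argv[1], "rb"), *out = fopen(argv[2], "wb");
        if (!in || !out) { perror("fopen"); return 1; }

        unsigned char sig[8];                    /* 8-byte PNG signature */
        if (fread(sig, 1, 8, in) != 8) return 1;
        fwrite(sig, 1, 8, out);

        unsigned char hdr[8];                    /* 4-byte length + 4-byte type */
        while (fread(hdr, 1, 8, in) == 8) {
            uint32_t len = read_be32(hdr);
            size_t rest = (size_t)len + 4;       /* chunk data + CRC */
            unsigned char *buf = malloc(rest);
            if (!buf || fread(buf, 1, rest, in) != rest) return 1;

            int is_text = !memcmp(hdr + 4, "iTXt", 4) ||
                          !memcmp(hdr + 4, "tEXt", 4) ||
                          !memcmp(hdr + 4, "zTXt", 4);
            if (!is_text) {                      /* keep everything else as-is */
                fwrite(hdr, 1, 8, out);
                fwrite(buf, 1, rest, out);
            }
            free(buf);
        }
        fclose(in);
        fclose(out);
        return 0;
    }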


I'm not quite sure what exactly the author's point is.

Yes, you could embed images in ELFs, too, if you really wanted, but why? There's no standardized way to do it, and putting the images in separate files is easier.


There is a standard way to embed any file into an ELF. The program "objcopy" creates object files from arbitrary input files. You then simply link the objects into your executable. The program can then access those arbitrary embedded data chunks by their symbol names, just as if they were literal strings in the source code.

I explain how to do this in the context of embedding a version file here: https://stackoverflow.com/questions/16349557/does-gcc-have-a...
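
For concreteness, here's a minimal sketch of that workflow, assuming x86-64 Linux and GNU binutils; the file name version.txt is just an example, and objcopy derives the symbol names from it:

    /* Build steps:
     *   objcopy -I binary -O elf64-x86-64 -B i386:x86-64 version.txt version.o
     *   cc main.c version.o -o demo
     */
    #include <stdio.h>

    /* objcopy mangles "version.txt" into these symbol names. */
    extern const char _binary_version_txt_start[];
    extern const char _binary_version_txt_end[];

    int main(void)
    {
        size_t len = (size_t)(_binary_version_txt_end
                              - _binary_version_txt_start);
        /* The embedded bytes are not NUL-terminated,
         * so print with an explicit length. */
        printf("embedded version: %.*s\n",
               (int)len, _binary_version_txt_start);
        return 0;
    }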


If we're comparing platforms, Xcode (Apple's IDE for iOS/macOS apps) runs pngcrush on all PNG assets by default.


I don't expect it to be different on Linux or any other operating system.



