Hacker News

The CVE database proves otherwise.

Unfortunately liability is not yet a thing across the industry.



What does that site prove?

https://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=stack (3496 entries)
https://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=pointer (2389 entries)
https://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=java (2034 entries)

A possible outcome is that you would trade pointer bugs for "Java bugs" if embedded Java were used everywhere. Embedding a complete runtime increases the attack surface a lot.


Contrary to urban myths, C has a runtime as well.

Many of those Java exploits are in the C and C++ layers of the code, yet another reason to get rid of those languages in security-critical code.

According to the Microsoft Security Response Center and the Google-driven Linux Kernel Self-Protection Project, the industry's losses due to memory corruption in C-written software run into billions of dollars per year.

Several reports are available.


> Contrary to urban myths, C has a runtime as well.

Would you like to tell us how it compares to e.g. Java's runtime as well?


For starters, easier to exploit.

Then, unless we are speaking of a non-conforming ISO C implementation for bare-metal deployments, it provides the initialization that runs before main() starts, floating-point emulation, signal handling on non-UNIX OSes, and VLAs.


So basically all the stuff that no one cares about? Maybe it's only a runtime if you really want to win a pointless argument?


How Apple, Google, Microsoft, Sony, ARM are now steering the industry regarding their OS SDKs has already proven who is on the right side.

I have been at this since the BBS days, I don't care about Internet brownie points.

The only thing left is actually having some liability in place for business damages caused by exploits. I am fairly confident that it will eventually happen, even if it takes a couple more years or decades to arrive there.


Claiming that a little process startup code (which isn't really part of a particular language, but is much more part of the OS ABI) is easier to exploit than an entire JRE is just dishonest.

I would never think of "floating point emulation, handling of signals on non-UNIX OSes, VLAs" as anything resembling a "runtime". These are mostly irrelevant anyway, but apart from that they are just little library nuggets or a few assembly instructions that get inserted as part of the regular compilation.

By "runtime", I believe most people mean a runtime system (like the JRE), and that is an entirely different world. It runs your "compiled" byte code because that can't run on its own.


I care about computer science definitions, not what most people think.

Dishonest is selling C for writing any kind of quality software, especially anything connected to the Internet, unless one's quality bar is very low.


So "computer science" defines a little process startup code to be equal to the JRE? If that is so, I admit defeat to your infallible logic.


Computer science defines any kind of code used to support language features as a language runtime.

Assume whatever you feel like.


So that seems to be about how balanced and deep you want discussions to go. Thanks for the propaganda anyway.


Propaganda goes both ways.


Most of the notorious security breaches in practice (Stuxnet, NSA, IoT mega-botnets…) had nothing to do with buffer overflows.


A few stars don't hide the impact of billions of dollars in fixing buffer overflows and their related exploits.

> 70% of the vulnerabilities Microsoft assigns a CVE each year continue to be memory safety issues

https://msrc-blog.microsoft.com/2019/07/18/we-need-a-safer-s...

If you are going to argue that is 'cause Windows, there are similar reports from Google regarding Linux.



