
Was there any study done on the (un)reliability of C? I know for a fact that practically every piece of software I use is programmed in either C or C++.

The sole exceptions are Anki and Gentoo's portage system, both written in Python. And I'm pretty sure the reason portage is so extremely slow is that it is written in Python (I've checked; it's not I/O-bound). And Anki is very unreliable software.

In fact, give me a single big desktop software project made with a language that is not C or C++.



"I know for a fact that practically every piece of software I use is programmed in either C or C++."

How confident are you that that software will work as expected? How much are you willing to bet?

The fact that a language is popular does not prove that the language is good, nor that it should be used, nor that it is not causing us problems. Just the other day on HN, there was an article about a massive number of vulnerabilities in X11 applications -- all resulting from problems that you have in C and C++ but not in higher-level languages. Entire classes of bugs that are common problems in C are just not an issue in other languages.
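To make "entire classes of bugs" concrete: C string functions like strcpy perform no bounds checking, so a long input silently writes past the end of a fixed buffer, which is exactly the kind of flaw behind many such vulnerabilities. A minimal sketch of the safer idiom (the function name is made up for illustration); snprintf never writes more than the given size and always NUL-terminates when the size is nonzero:

```c
#include <stdio.h>
#include <string.h>

/* strcpy(dst, src) does no bounds check: if src is longer than dst,
 * it writes past the end of the buffer -- the classic C vulnerability.
 * snprintf writes at most dstsize bytes (including the NUL), so
 * truncation replaces overflow, and its return value (the number of
 * bytes it *wanted* to write) makes the truncation detectable. */
int copy_name(char *dst, size_t dstsize, const char *src) {
    int needed = snprintf(dst, dstsize, "%s", src);
    /* Returns 1 if src fit, 0 if it was truncated. */
    return needed >= 0 && (size_t)needed < dstsize;
}
```

With an 8-byte buffer, copying "definitely-too-long" leaves the buffer holding the truncated, still NUL-terminated string "definit" instead of corrupting adjacent memory.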

Sure, it is possible to write unreliable code in high-level languages. It is just a lot easier to write unreliable C code, and C programmers are much more likely to do so (even those with years of experience).


"There are only two kinds of languages: the ones people complain about and the ones nobody uses"

Bjarne Stroustrup

http://www.stroustrup.com/bs_faq.html#really-say-that

Of course, that just dismisses the criticism of the language. But to say that it "should [not] be used" ignores the current landscape of engineers, employers and problems.


That quote is always annoying, because it applies equally well to all criticisms of any programming language in active use, but not all criticisms are created equal. A useful heuristic would actually help distinguish between reasonable criticism and inevitable kvetching.


I'm not betting anything. I'm just saying that it seems to me the advantages of higher-level languages do not outweigh the disadvantages. Most of the software I use was written relatively recently, which means other programming languages were available, yet C was chosen over them.


Perhaps you should face your fears and learn C ;-)


In fact, give me a single big desktop software project made with a language that is not C or C++.

I find your requirements odd and vague (why big? why desktop?), but speaking just for myself: Eclipse, jEdit, CyberDuck -- all written in Java.

I use plenty of applications written in C and C++ too, of course; but I think that's largely due to (1) inertia in the application development industry, and (2) it took a while for runtime environments like the JVM to perform well, so they got a bad reputation early on that's no longer really deserved -- but the reputation persists in the minds of many developers.


The parent post said C was bad and that software shouldn't be written in it. So I retorted by talking about desktop software, because this is an area where C and C++ reign supreme.

I specified big because big software projects carry more weight. A calculator written in your favourite language might be pretty handy, but it doesn't show your language can be used for real software. I use some Python scripts, but I don't see Python in a lot of serious applications, apart from the dead-slow portage and Anki, which is the buggiest software on my machine.

I don't think inertia is a real factor. Most software I use is brand spanking new. Google Chrome, to give an example, is barely five years old! Most of the other software I use is from the GNOME project, which saw a major rewrite with GNOME 3 about two years ago, when they could have chosen a high-level language. Yet practically all of it is still in C, with a JavaScript layer for the Shell (which I both love and loathe).

And speed is interesting. It might be due to old compilers/interpreters, bad programming habits, or the age of the software, but all non-C/C++/C# software I have ever used was dead slow.


Google Chrome is based on WebKit, which was based on KHTML, which has been around since 1998. Moreover, Chrome depends on system and external libraries that provide C/C++ headers. Any viable alternative would need to work with C/C++ headers without needing large amounts of glue code, which isn't necessarily easy.

People still use C not because all of their code is so performance-sensitive that they can't deal with the overhead of bounds checks and garbage collection, but because there still aren't reasonably high-performance alternatives that can easily integrate with existing codebases. Go and Rust are two promising contenders in this regard, but the former has only recently achieved C-level performance, and the latter is still not ready for production use. That said, Mozilla is writing a browser in Rust (Servo), which shows their aspirations.


WebKit might be C/C++, but that doesn't mean the browser itself needs to be. Anki, for example, is a Python program that works with the Qt toolkit, which is C++.

But point taken. I'm not saying we should use C; I'm eager for a future where other languages can be used for serious applications. I was just trying to show C cannot possibly be that bad, as it is still, together with C++, choice number one for every desktop application (and if Objective-C counts, it also dominates the mobile market).


My point in all this has been that language popularity is nearly orthogonal to technical pros/cons. Languages become popular for non-technical reasons. The popularity of C, C++, Objective-C, and related languages has almost nothing to do with the technical features of those languages, and almost everything to do with the marketing of popular OSes: Unix, Windows, and iOS. If an OS written in ML had become dominant in the 80s or 90s, it is nearly certain that ML would be a popular language. The inertia created by this large ecosystem cannot be denied; it is part of the reason you keep seeing new software being written in C/C++.

C absolutely is that bad. It is poorly defined. It forces programmers to explicitly write out things that can and should be done automatically. There is no standard error-handling system, just conventions involving return values and global error flags; there is no error-recovery system at all. Something as seemingly simple as computing the average of two signed integers winds up being non-trivial. C++ is even worse: not only does it inherit most of the problems of C, it introduces an entirely new list of pointless problems. Debugging code written in these languages is needlessly difficult -- you spend as much time on high-level problems (i.e. design problems) as on tracing pointers and figuring out where some uninitialized value became a problem. It is not unreasonable to estimate that the dominance of C and C++ carries a cost of billions of dollars spent dealing with the headaches caused by these languages' problems.
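To spell out the averaging example: the obvious (a + b) / 2 invokes undefined behavior whenever a + b overflows int, e.g. for two values near INT_MAX. One overflow-free formulation (a sketch; other tricks exist):

```c
#include <limits.h>

/* Naive: (a + b) / 2 -- the intermediate sum a + b can overflow,
 * which is undefined behavior for signed integers in C.
 *
 * Overflow-free: halve each operand separately, then add back the
 * unit that is lost when both halvings truncate (both operands odd
 * with the same sign). Each term stays within int range, and the
 * result matches C's truncate-toward-zero division throughout. */
int average(int a, int b) {
    return a / 2 + b / 2 + (a % 2 + b % 2) / 2;
}
```

For instance, average(INT_MAX, INT_MAX) correctly yields INT_MAX, where the naive version would overflow.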

It is hard to come up with a list of technical advantages to counter the above. C has few features, and those features are not very powerful. All that C really has going for it is that you can be "close to the machine," though it should be clear that other languages let you do this too (since other languages have been used to implement entire OSes). C++ has a few technical features that may be advantageous -- but they all compose poorly with each other, and their value is weighed down by all the baggage C++ carried from C (and indeed, most of the really bizarre bugs you can make in C++ stem from this baggage).

The non-technical reasons for C's popularity vastly outweigh the technical deficiencies of the language. The reason C/C++ is "choice number one" for desktop software is almost entirely a result of those non-technical reasons.


Nobody is doubting there's plenty of great software written in C. That doesn't mean we should choose C for all the new software we write, though. I'm sure one would find slow and unreliable code written in C too, if they looked hard enough.


> And I'm pretty sure the reason portage is so extremely slow is because it is in Python (I've checked, it's not I/O-bound)

I was under the impression portage was "slow" (relative to other package management tools) due to the fact that it built everything from source (for which it uses make). Where is the Python bottleneck?

> In fact, give me a single big desktop software project made with a language that is not C or C++.

What constitutes "big"? There are a few large desktop projects that run on the JVM, for instance.


> I was under the impression portage was "slow" (relative to other package management tools) due to the fact that it built everything from source (for which it uses make). Where is the Python bottleneck?

I was talking about the overhead of computing the list of packages to update/install. A pretend install of a simple package takes 7 seconds; a pretend update takes 13 seconds. That's a very long time, because I might want to review the list of packages and make some changes, and every time I have to wait 13 seconds again.

The compiling part is true, but that part runs unsupervised.


I'm pretty sure the 'emerge -pv' performance is I/O-bound and would not be significantly improved by writing it in C.


It's been a while since I used Gentoo, but most of the serious users consider build-from-source as a feature and don't penalize portage for that. I believe your parent is referring to the sometimes annoyingly long dependency computation time of something like emerge -uDN world.


Eclipse is made with Java.


A major part of Firefox (hundreds of KLOC) is written in JavaScript. While the layout and network code is all C++, most parts that aren't performance-critical and don't need to interact directly with the OS are JS. This includes all of the high-level UI code.


And the JS VM is written in C.


Just three, ask for more if you wish:

The original version of Skype was done in Delphi.

The first versions of Mac OS were done in Apple Pascal.

Photoshop was originally done in Apple Pascal.


I'm not siding with or against you here, but each of your examples is of something originally being in a language other than C.

Your position isn't strengthened if you can only name systems that were written in something else before someone converted them to C. In fact, if anything, that benefits the opposing side. I'm just saying.


1) The original UI for Skype was written in Delphi. The core functionality is, and has always been, written in C/C++.

2) Mac OS is not a desktop application.

3) Adobe Photoshop is all C/C++ now. They converted it to C/C++ because they decided that was a better choice. It's tough to sell that as a case for a major desktop application not in C/C++.


The parent poster asked for desktop software written in languages besides C or C++, without any reference to hybrid applications or to the timeframe in which they were written.

Since when are desktop operating systems not desktop applications?

I can provide other examples, but most likely I will get extra requirements along the way, so why bother...

After all, you can always use the argument that any application needs to call C APIs when the OS is written in C, therefore all applications are written in C.


What a victim complex. If someone is arguing that we should stop using C, and I ask for software not written in C, it's obvious that I'm looking for contemporary software. And the reasoning that all software is "C" because it needs to make system/library calls would be ludicrous; preemptively accusing me of that is insulting.

I've done some research myself, and there are some interesting and big projects in languages other than C: Eclipse in Java, Dropbox in Python, to name two. But my point still stands. I looked at all the desktop applications I have installed, and they are without exception C or C++.


My impression is that desktop software is/was largely written in C/C++/C#/Objective-C/Objective-C++ because those are the languages to which today's operating systems expose their APIs. For example, Win32 = C/C++, Cocoa = Objective-C, Metro = C# (or, I guess, anything compiled to the Common Intermediate Language (CIL)?).

Now that said, most languages provide a bridging layer that allows them to call out to those "native" APIs. These are used to build API wrappers; however, because these are provided by third parties (I believe), there has been a tendency to gravitate toward the "blessed" language of the OS vendor.
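The reason those bridging layers all bottom out in C: a plain C function exports a stable, unmangled symbol with a well-defined calling convention that virtually every language's FFI (Python's ctypes, Java's JNI, and so on) can locate in a shared library. A minimal sketch; the function and library names are made up for illustration:

```c
#include <stdint.h>

/* A plain C function: no name mangling, a fixed calling convention,
 * and fixed-width types that keep the ABI unambiguous -- which is
 * why C headers are the lingua franca that FFI bridges target. */
int32_t add_i32(int32_t a, int32_t b) {
    return a + b;
}
```

Compiled with something like `cc -shared -fPIC -o libadd.so add.c`, this could then be called from Python via `ctypes.CDLL("./libadd.so").add_i32(2, 3)` without any glue code in between.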

One of the big advantages of Java (to me at least) was that it provided a platform independent windowing capability inbuilt within the JDK that has been maintained by Sun/Oracle/(and Apple) as new operating system revisions were released.

Note, for example, that Java programs aren't/weren't allowed in the Mac App Store (I'm not sure if this is still the case...). (Personally, I ported a Java app to Objective-C++ because of this.)

But generally I agree with your point that this isn't the only reason C was so pervasively used. However, you also need to consider that a large number of programming environments and tools were specifically developed to aid C/C++ programmers, e.g. Borland C++, Visual C++, CodeWarrior, Xcode. It's also worth remembering that the GNU C Compiler and Debugger were important contributions to free software back in the day.

But also consider the distribution of compilers etc. I think it is pretty fair to say that a lot of programmers learnt to program using Borland Pascal/C++, because at that point the Internet was not as accessible as it is today and copies of these could be "obtained".

The advent of the Internet has not only allowed the distribution of compilers and environments for other programming languages, it has also meant that the languages used for backend systems, i.e. web servers and web applications, are irrelevant to the user's web browser.

Anecdotally, for safety-critical system software an issue with some languages other than C is that they have not been suitable for real-time systems. I don't know much about this other than that exception handling and also garbage collection can cause issues due to their non-determinism.

I fear that you'll think that the above is a bit too much like saying "all software is 'C' because it needs to do system/library calls", however I think it's probably fairer to say "all software is 'C' because many people have really, really liked it" and "better the devil you know".


Metro is an updated version of COM.

On the desktop side, you have C++/CX, which is C++ with some language extensions to make talking to COM easier.

For those who would rather use plain ISO C++, there is the Windows Runtime C++ Template Library (WRL).

.NET code is JITed or compiled AOT with NGEN and makes use of CCW to interop with WinRT.

http://msdn.microsoft.com/de-de/magazine/jj651569.aspx

On the mobile side, Windows Phone 8 only has native applications, even .NET gets compiled to native code.

http://channel9.msdn.com/Shows/Going+Deep/Mani-Ramaswamy-and...


There are a lot of applications in C#, a lot in Java, a lot in pure Python (e.g. Ubuntu utilities), and some in Flex (cheers to Balsamiq the wonderful). The fact that you haven't installed them doesn't mean they aren't there. Moreover, you shouldn't say C/C++; they are very different languages, and I can bet my hat that your desktop applications are full of C++ with a high-level framework like Qt. Not really close to the metal.


Bingo Card Creator, by our very own patio11 (Patrick McKenzie), is written in Java.


I thought Bingo Card Creator was a Ruby on Rails app...


That's the webapp. The original software is a desktop version, and patio11 wrote it in Java as far as I know.

http://www.kalzumeus.com/2006/06/28/the-sorrows-of-java-prin...


The desktop client is indeed written in Java Swing. Fun fact: for PDF printing and getting updates, it actually does interact with a web service in the Ruby on Rails application. Both are ungodly bits of code put together years ago which shame me but seem to continue functioning.


It's all "original" and "first versions", which kind of strengthens the OP's point.


To be honest, I was looking for some more recent software. Skype is 10 years old, and Mac OS and Photoshop are both decades old. And Pascal is not "better" than C; it is not a high-level language.


Funny; as someone who started coding back in the day when Assembly was enterprise coding, my understanding of a high-level language is a bit different from yours.

You asked for desktop software without any mention of time.


The implication is that his test is a proof of C's merit.

Merit is time-sensitive: if software is converted to another language, that language presumably proved more profitable/efficient for that software.

Therefore, the implication is also that systems converted to C demonstrate C's merit, because it's been decades since they were actually in the language you speak of.


What can I say. I am older than C and remember the days when it was used only as the UNIX system's language.

UNIX's success made people want to port UNIX's tooling to their home computers, and this led to the spread of C implementations outside UNIX.

Had UNIX been written in language XYZ, the article would be called "Learn XYZ". Merit is relative.



