"But no thank you Microsoft. No thank you spreadsheets. No thank you
Unix. No thank you C++. No thank you unmodifiable syntax. No thank you
von Neumann architectures. No thank you unrepentant snake oil industry
that has lured us into performing the same old menial tasks by mouse instead
of by lever and duped almost everyone involved into thinking of this as some
kind of progress."
Wow, did he ever go off the rails on that paragraph.
Spreadsheets? Unix? C++? Unmodifiable syntax? The theory behind every single one of these concepts was to save time.
Faster calculations, an OS with a simple concept applied everywhere, a language implementing an interesting design pattern, a style designed to reduce bugs.
Not that I'd necessarily agree with it, but I think you're missing his overall point: he's arguing that those things are time savers only for tasks within the domain they were designed to tackle, whereas macros allow more general, more automated solutions that save time across all domains.
The author's ultimate reference point is the Lisp Machine. Being Lisp "all the way down," the user could modify running code even at the OS level upon encountering an error or other undesirable result.
The idea of someone updating the kernel due to an undesirable result makes me fear for my life. "That's odd, why is it failing when I deref 0, that is an odd check, I will just remove it"
Under the assumption that Lisp is the best programming language that will ever be created and the x86 processor is a horribly flawed piece of garbage, the author is correct.
Both are pretty good, actually. Lisp is a great language, and its features have been highly influential (and Lisp-influenced features have generally been good things). x86 is a terrible architecture, and it has only survived because of the need for binary compatibility.
I agree but still find the blog author to be a ridiculous caricature. There are also other interesting languages and x86 isn't preventing anyone from hot loading bug fixes.
Washing machines did not save time! This is because with the advent of these machines our hygiene habits also changed. We wash a lot more than before, so there is no time saving, only cleaner clothes.
The same is true for computers. They did not save time; they changed the way we treat information. We have a lot more information to process than before.
>Washing machines did not save time! This is because with the advent of these machines our hygiene habits also changed. We wash a lot more than before, so there is no time saving, only cleaner clothes.
That's not true; the ratio is not at all the same, unless you compare something like 17th-century hygiene habits (and those of the poorer classes, at that) to modern-day habits.
But frequent clothes washing long predates the advent of the washing machine. Not to mention that what you describe could also be solved with bigger washing machines and more clothing (so that you can still wash at longer intervals but have things to wear in between).
Surely you don't mean to refute Hans Rosling's testimony that his mother had more time to read to him? I think you have an interesting viewpoint. If these technologies don't actually save any time, it follows that we spend as much time doing X before the technology as after. So 4 hrs/wk washing 1 load/wk + washing machine => 4 hrs/wk washing 4 loads/wk.
What if you reverse this and apply it to computers? I spend maybe 2-3 hours in Excel every week. If I were born 100 years earlier, would it follow that I'd also spend 2-3 hrs/wk making spreadsheets, albeit at a much slower work pace? It seems doubtful. I've got only a few dozen sheets, tops. Lots of time spent in Excel is tweaking, making things line up / look right, updating kludges, etc. Busywork, basically. Maybe this is my fault; the 'math' tells me I should have lots of sheets, but I get caught up doing busywork on this machine that's supposed to automate the whole ordeal. What a scam!
Regarding spreadsheets: Excel (and similar) is a great toy for doodling, but for serious work one would want the 'real thing'. In the 80s there was Javelin Plus, then Lotus Improv in the 90s. Now there's Quantrix.
With endemic spreadsheet errors (since the 80s), and far better solutions being ignored (also since the 80s), there's certainly something to rant about! ;-)
Another example: the advent of the car and the plane did not reduce the lifetime spent on traveling; it expanded the range.
People's priorities and motivations for giving a slice of their lifetime to a particular kind of activity do not change greatly because of technology.
There are rumors the current lack of infectious disease is really just an intensely selective breeding ground for much more effective infectious disease.
I would at least blame the security issues we have nowadays on UNIX, for making C widespread for systems programming instead of more suitable languages.
This is a million times wrong. It's not about humans being smart or stupid, it's about humans being bad at being careful millions and millions of times without error.
The answer isn't to tell them to be more careful. They can be more careful, but they are human, so they will still make mistakes. The answer is for them to use higher level languages that eliminate whole classes of errors.
Sure, but from the overall tone of the comment, it seems clear that he's blaming the "idiots", not the "C", because his only statement about alternatives to C is that they'll have similar problems.
You can't really have a totally safe systems programming language (correct me if I'm wrong here).
It depends what you mean by totally safe, but Cyclone (http://cyclone.thelanguage.org/) is a type and memory safe dialect of C. It works by adding additional information to the type of every pointer, e.g. if it can be NULL, if it's NUL terminated, if you know the length, the scope it was allocated in, the location on the stack it's from, etc...
This all happens at compile time, and it runs about the same speed as C, so I think it qualifies.
From what I've read, this has had more of an influence on programming language designers/implementers (rust, possibly c++11, probably more), than on anybody else, though.
I can do that in Ada, Turbo Pascal, Delphi, Modula-2, Oberon, Oberon-2, Component Pascal, Active Oberon, Modula-3, Cyclone, and many others; what is so special about C?
"C also makes memory access and device driver code very easy. The mechanical issues are a side effect of that."
I have two problems with this reasoning. First, C is being used for more than writing device drivers; most programs written in C are not device drivers. Second, and more important, is that low-level memory access and the sort of bit-twiddling you are talking about can be done in Common Lisp, and I doubt that you would be shocked to know that the operating systems written in Lisp for Lisp Machines had device drivers. It is possible for any high-level language to expose a low-level pointer type, and doing so will not force programmers to use such low-level constructs in higher-level code.
"As for design errors, they are hard to prevent anywhere."
While true, they are much harder to prevent in C because of the complicated semantics, the enormous amount of undefined behavior, and the plethora of edge cases one must account for. When something as simple as incrementing a counter forces you to deal with edge cases and potentially undefined behavior, it is pretty hard to verify that a C program actually implements a design, even when that design was formally specified and checked. The lack of high-level features and constructs worsens this situation by inflating the number of lines of code in an implementation.
>I'd actually blame it on idiots writing C myself.
Couldn't be more wrong though.
The people who wrote UNIX and its whole C userland were far from idiots -- more like geniuses. People like K&R, Bill Joy, etc. (And until the 90s there weren't even that many of them: mainly people from elite computing schools and research centers.)
It's not like some moron high school dropout, "Learn C in 24 hours"-reading people wrote UNIX and the common server stuff we use (Bind, Sendmail, etc).
I like that Unix, Smalltalk and Lisp have a similar idea of smaller things getting composed. They are all different but I wouldn't judge the ideas based on what we have today.
Agree here. My statement could perhaps be better written. I'm not saying that idiots write C, but that idiots can write C badly and often do. Much as anyone can cross a road, idiots can do it and reliably get squashed every time.
I disagree about sendmail though - that was a stinking turd and I have been ever grateful since postfix arrived on the scene :)
Languages like Ada and Modula-2, just to cite two examples from many, allow for safer coding while having similar performance.
People who cite the performance impact of bounds checking and other such features tend to ignore that those checks can be disabled where performance demands it, because it suits their argument that no other language can achieve C-like performance.
Or they like to forget that the industry has invested more in optimizing compilers for C than for other languages. So they end up discussing languages without regard for the implementations.
I was referring to operating systems architectures, not only programming languages.
While I believe you could write an operating system in Haskell, for instance, I don't know anything about its performance. Microkernels, however, even though they are safer, have some performance drawbacks.
I think the problem with macros is that they do not always compile and do not always work correctly. Debugging macros is hard by any definition, because you have to reason about both what is written and what goes on behind the scenes. A macro is great if the same genius is both writing the macro and using it; otherwise, in practical terms, it sucks.
With a function things are much easier: one knows what goes in, and you can tell the author of the function what has gone wrong. That is much easier than trying to understand how a macro expanded; with macros the additional complexity is that one has to understand what the user of the macro was trying to do.
But as soon as you need to express new domain concepts in a language that does not natively support them, macros become essential to maintaining good, concise code.
There are good arguments for macros, but this is not one of them. Macros are certainly not essential for this purpose.
No, but helpful. In addition to data abstraction and functional abstraction, it's nice to have syntactic abstraction as yet another tool for reducing code duplication.
>Spreadsheets? Unix? C++? Unmodifiable syntax? The theory behind every single one of these concepts was to save time.
>Faster calculations, an OS with a simple concept applied everywhere, a language implementing an interesting design pattern, a style designed to reduce bugs.
I don't think I follow this story.