
It looks to me as though the recommendation that RankRed has titled "Rule No. 5 — Low Assertion Density" would be better described as "High Assertion Density" — the recommendation is for a minimum of two assertions per function (and functions are supposed to be short per rule 4).

The recommendations look good to me and (with one caveat) correspond to rules that I apply when writing C code with a high reliability requirement.

My one caveat is in "Rule No. 8 – Limited Use of Preprocessor" which bans all complex uses of the preprocessor. The problem is that it is common in C to encounter situations where the only way to avoid lots of code duplication is to store a table of facts in a macro definition and use the preprocessor to expand those facts into the relevant bits of code in each place where they are used. So in these situations you face a trade-off between the risks due to complex preprocessor use, and the risks due to code duplication (namely, a maintainer might change a fact in one place where it is used but fail to change it in another). My experience is that the risks due to code duplication are very high, and so it's worth the risk of using complex preprocessor macros to avoid them. The risks can be mitigated by implementing the necessary macros in a structured way to keep the complexity under control: http://garethrees.org/2007/04/24/relational-macros/
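To make the trade-off concrete, here is a minimal sketch of the "table of facts in a macro" technique (often called an X-macro); the error codes and messages are made up for illustration, not taken from the linked article:

```c
/* The facts are stated exactly once, in one table macro.
 * Each use of ERROR_TABLE expands them into the code that
 * needs them, so a maintainer cannot update one copy of a
 * fact and forget another. */
#define ERROR_TABLE(X)           \
    X(ERR_OK,    "ok")           \
    X(ERR_IO,    "i/o failure")  \
    X(ERR_PARSE, "parse failure")

/* Expansion 1: the enum of error codes. */
#define AS_ENUM(code, msg) code,
enum error { ERROR_TABLE(AS_ENUM) ERR_COUNT };

/* Expansion 2: the matching message strings, in the same order. */
#define AS_STRING(code, msg) msg,
static const char *error_messages[] = { ERROR_TABLE(AS_STRING) };

const char *error_message(enum error e)
{
    return error_messages[e];
}
```

Adding a new error code means editing the table once; the enum and the string array can never drift out of sync, which is exactly the code-duplication risk being traded against preprocessor complexity.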



Assertions in code cut both ways. Sometimes they can be great, telling you exactly which assumed invariant was violated. However, that doesn't tell you where or how the invariant was violated.

Sometimes assertions are just crutches for lazy programming. Instead of handling a perfectly valid (corner) case, some people just assert that it doesn't happen. Lo and behold, years later, it does happen. And by then, the context is completely lost. How easy is it to handle the corner case now? Hard. In this situation, instead of simply asserting that the case doesn't happen, it might be warranted to assume that it _can_ happen and handle it, then follow up with coverage testing and (actually try hard to) come up with an input that triggers it.
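The contrast the parent describes might look like this sketch (the function names and error convention are hypothetical):

```c
#include <stddef.h>

/* Lazy version: "an empty buffer can't happen here."
 * Years later it does, the assert fires (or is compiled out),
 * and the original context is long gone.
 *
 *     assert(len > 0);
 *     return buf[0];
 */

/* Defensive version: treat the "impossible" case as a real
 * input and give the caller a visible, testable error path. */
int first_byte(const unsigned char *buf, size_t len)
{
    if (buf == NULL || len == 0)
        return -1;      /* explicit error, handleable today */
    return buf[0];
}
```

The defensive version costs one branch, but the corner case can be exercised by a test now, while the intent is still fresh, rather than reconstructed from a crash report years later.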

Assertions can also have bugs. The worst is trying to figure out which assertions are really valuable and which aren't.

Shotgun-spraying assertions in the code is not a good strategy in general.


Obviously a recommendation to have a high assertion density does not mean "shotgun-spraying assertions in the code".

If your point is just that the rule could be applied mechanically and without thinking and that would be bad, then that's true, but it applies to everything, not just to the rule about assertions. Someone could apply the "keep functions definitions below 60 lines" rule in a perverse way, by splitting every long function definition at an arbitrary point in the first 60 lines and tail-calling a continuation. It doesn't mean the rule isn't a good one.


In my experience, that's what will happen in practice, especially when there is a specific metric attached. Doubly so once there is a mechanically enforced minimum number of assertions.

Wouldn't it be better if we could create programs that were correct by construction (and thus needed no assertions)?


I see — your experience suggests that rules inevitably get turned into thoughtlessly evaluated metrics that then get gamed.

It is a shame when that happens, but when it does, it's not the fault of the rules, it's the fault of the organizational culture. The rules would still be valuable if you used them thoughtfully and with the aim of improving the reliability of the product, not gaming some management system.


I don't think they're intentionally gamed, it's just part of human nature. If you don't have to defend your assertions (because more assertions is presumed better, as per the rules), then people are liable to err on the side of putting more in than they need to.

With the exception of assert(ptr != NULL) assertions, I think most of the ones I've hit have actually been completely duff, and with some thought could just be removed. I dread to think what would happen if I grepped the commit histories of all the projects I've worked on for "removed duff assert".


>(and thus needed no assertions)

Assertions are not meant to be needed. Any bug-free program should behave exactly the same with and without assertions.

On the other hand, in any sufficiently complex algorithm, asserting the pre- and postconditions generally helps readability, maintainability and correctness.
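A small sketch of what that looks like in practice: an integer square root whose postconditions are asserted. In a bug-free build the asserts change nothing; they document and check the contract at the point where it must hold:

```c
#include <assert.h>

/* Returns the largest r with r*r <= n. The wide accumulator
 * keeps (r + 1) * (r + 1) from overflowing for any unsigned n. */
unsigned isqrt(unsigned n)
{
    unsigned long long r = 0;
    while ((r + 1) * (r + 1) <= n)
        r++;
    assert(r * r <= n);              /* postcondition: not too big   */
    assert((r + 1) * (r + 1) > n);   /* postcondition: r is maximal  */
    return (unsigned)r;
}
```

A reader who has never seen the loop can verify the function's contract from the two asserts alone, which is the readability benefit being claimed above.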



