In my experience it's useful, even when writing high-level applications, to be aware of the relative cost of low-level operations. The "Numbers Everyone Should Know" slide from this deck is a reasonable starting point: http://research.google.com/people/jeff/stanford-295-talk.pdf

To generalize a bit: the small numbers at the top of that chart are mostly a concern for systems programmers, but as you progress down the list you'll find operations costly enough to have a noticeable impact on application programs. For example, if you build a typical web application in a high-level language, your user won't be able to tell whether you add a hundred mispredicted branches or a hundred L1 cache misses per page view; but add a hundred network round trips per page view and they'll see a measurable slowdown. Similarly, if you're making a game that displays graphics at 60 frames per second, you can do quite a lot of computation per frame, but you can't read a file from disk on every frame.
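To make the scale difference concrete, here's a small Python sketch that multiplies out a hundred occurrences of each operation using rough, order-of-magnitude latencies in the spirit of that slide. The exact figures are my assumptions, not measurements; real values vary by hardware, network, and year:

```python
# Rough per-operation latencies in nanoseconds, loosely based on the
# "Numbers Everyone Should Know" chart. These are order-of-magnitude
# assumptions, not measurements of any particular machine.
LATENCY_NS = {
    "branch mispredict": 5,
    "L1 cache reference": 0.5,
    "main memory reference": 100,
    "round trip within datacenter": 500_000,
    "disk seek": 10_000_000,
    "network round trip (cross-country)": 150_000_000,
}

def cost_of(op, count=100):
    """Total cost in milliseconds of `count` occurrences of `op`."""
    return LATENCY_NS[op] * count / 1e6

# 100 branch mispredicts: 0.0005 ms -- invisible to a user.
# 100 cross-country round trips: 15,000 ms -- a 15-second page load.
for op in LATENCY_NS:
    print(f"{op}: {cost_of(op):.4f} ms per 100 ops")
```

The same arithmetic explains the game example: at 60 frames per second the budget is about 16.7 ms per frame, so even a single 10 ms disk seek eats most of it, while the entries near the top of the table are effectively free.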