In my experience, eagerly evaluated code operating on lazy, immutable data structures brings you most of the benefits of lazy evaluation with the control of eager evaluation.
Clojure is an example of that. You can plant printfs there and it just works, but you can also set up a huge amount of processing on a huge amount of data and eventually consume and crunch only a small part of it.
The arguments of 'take' and 'map' are eagerly evaluated. The sequence returned by 'infinite-series' is immutable, infinite, and lazy; as returned, it gets passed directly to 'map', which operates on it. Finally, we use 'take' to extract the first ten items, and because the data structure is lazy, only ten items are ever generated and multiplied by 10.
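A rough Python sketch of the same pipeline, using generators as a stand-in for Clojure's lazy seqs (the `infinite_series` name mirrors the hypothetical Clojure function above; the counter is just instrumentation to show how few items are produced):

```python
from itertools import count, islice

calls = 0  # counts how many items the lazy source actually generates

def infinite_series():
    """Yield 0, 1, 2, ... lazily, counting each item produced."""
    global calls
    for n in count():
        calls += 1
        yield n

# map is lazy over the lazy sequence; islice plays the role of 'take'.
result = list(islice(map(lambda x: x * 10, infinite_series()), 10))

print(result)  # [0, 10, 20, 30, 40, 50, 60, 70, 80, 90]
print(calls)   # 10 -- only ten items were ever generated
```

Even though the calls themselves happen eagerly, consuming the first ten elements is what drives the computation, so the infinite source is only ever asked for ten values.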
The code would look conceptually the same in Haskell, but if the 10 in 'take' were a variable that could occasionally be 0, in Clojure you could be sure that a debug (println) placed in the 'infinite-series' function would actually get executed. Haskell would probably defer the execution forever, since "take 0" is a no-op. Yet Clojure still doesn't generate or multiply anything if we call (take 0 ...).
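That take-0 behavior can be sketched in Python too, modeling Clojure's split between the eager function call and the lazy sequence it returns: an ordinary function that logs (the 'debug println'), then hands back a generator (names here are illustrative):

```python
from itertools import count, islice

log = []

def infinite_series():
    log.append("entered infinite-series")  # the 'debug println'
    def gen():
        for n in count():
            log.append(f"generated {n}")
            yield n
    return gen()

# 'take 0': the call to infinite_series happens eagerly, so the debug
# line runs -- but the generator is never advanced, so nothing is
# generated or multiplied.
result = list(islice(map(lambda x: x * 10, infinite_series()), 0))

print(result)  # []
print(log)     # ['entered infinite-series']
```

The eager call gives you the debuggability; the lazy sequence gives you the "do no unnecessary work" property.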
A lazy data structure exposes access methods that exhibit lazy-evaluator behavior. For example, you have a "rectangle" structure that contains position, width and height, and the structure has methods that let you apply matrix transformations to scale, rotate, and skew the rectangle's corners. The matrix methods perform the computation, and then cache that result in case it's needed again.
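A minimal sketch of that kind of lazy structure, under the stated assumptions (the `Rect` class and its method names are illustrative, not from any real library): the transform is only recorded when requested, the actual math is deferred until a corner is first accessed, and the result is cached for later accesses.

```python
import math

class Rect:
    def __init__(self, x, y, w, h):
        self._base = [(x, y), (x + w, y), (x + w, y + h), (x, y + h)]
        self._angle = 0.0
        self._corners = None  # cache; invalidated by each transform

    def rotate(self, radians):
        """Record the rotation; no math happens yet."""
        self._angle += radians
        self._corners = None
        return self

    def corners(self):
        """Compute the transformed corners on first access, then cache."""
        if self._corners is None:
            c, s = math.cos(self._angle), math.sin(self._angle)
            self._corners = [(x * c - y * s, x * s + y * c)
                             for x, y in self._base]
        return self._corners

# Rotating a 2x1 rectangle 90 degrees; corner (2, 0) lands at ~(0, 2).
r = Rect(0, 0, 2, 1).rotate(math.pi / 2)
print(r.corners()[1])
```

Repeated calls to `corners()` reuse the cached list until the next transform invalidates it, which is the access-method laziness described above.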
That is more about functional programming than laziness, but maybe I am splitting hairs. OpenGL, for example, implements the matrix stack functionality, but then it only works for one class. It's a lot of work to get it everywhere you need it.
Clojure is doing a lot of work with fusion and sequences to propagate laziness from the underlying collections up to the other objects that consume them, but it gets harder the farther you move from the source.