Average-case complexity for a hash table is O(1), and when your keys are single bytes from [0..255], a good stdlib implementation shouldn't be far off from that.
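To make the byte-key case concrete: with keys drawn from [0..255] you don't even need hashing, since a flat 256-slot array gives the constant-time lookup directly, and that's roughly the behavior a decent stdlib hash table approximates. A small illustrative sketch (function names are mine, not from the article):

```python
# Illustrative only: a 256-slot table keyed by single bytes.
# With keys in [0..255], a flat list already delivers the O(1)
# lookup a hash table promises -- the key *is* the index.
def make_byte_table():
    return [None] * 256

def put(table, key, value):
    table[key] = value      # direct indexing, no hash function

def get(table, key):
    return table[key]       # O(1), like an ideal hash table

table = make_byte_table()
put(table, 0x41, "A")
print(get(table, 0x41))  # -> A
print(get(table, 0x42))  # -> None
```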
In other words, I just found his "not in the same ballpark" remark to be an exaggeration, although I understand where he's coming from: I'm learning Haskell at the moment, and I often ask myself how practical the pure functional style can get.
And what's up with this passive-aggressive attitude?
I didn't think it was all that passive. It's frustrating to read a comment that nitpicks the article while making the exact same point as the author, wielding big-O notation as if that were the issue. Hint: it's not. The point is that whatever purely functional data structure is being used, either the whole thing is copied on every single recursive call, or some tricky behind-the-scenes magic is going on to make destructive-looking operations not actually destructive. Either way, you're nuts if you think it's approaching C speed.
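For what it's worth, the usual "behind-the-scenes magic" for tree-shaped persistent structures is path copying: an insert allocates fresh nodes only along the root-to-leaf path and shares every untouched subtree with the old version. So it's not a whole-structure copy, but it's still far more allocation than an in-place C update. A minimal sketch, assuming a toy unbalanced binary search tree (purely for illustration):

```python
# Illustrative persistent (immutable) binary search tree.
# insert() never mutates; it rebuilds only the path to the new
# leaf and shares every other subtree with the previous version.
class Node:
    __slots__ = ("left", "value", "right")
    def __init__(self, left, value, right):
        self.left, self.value, self.right = left, value, right

def insert(tree, x):
    if tree is None:
        return Node(None, x, None)
    if x < tree.value:
        # copy this node, share the entire right subtree
        return Node(insert(tree.left, x), tree.value, tree.right)
    if x > tree.value:
        # copy this node, share the entire left subtree
        return Node(tree.left, tree.value, insert(tree.right, x))
    return tree  # already present: share everything

def member(tree, x):
    while tree is not None:
        if x < tree.value:
            tree = tree.left
        elif x > tree.value:
            tree = tree.right
        else:
            return True
    return False

t1 = insert(insert(None, 2), 1)
t2 = insert(t1, 3)
# Both versions stay observable after the "update":
print(member(t2, 3), member(t1, 3))  # -> True False
```

Note that t1 is still intact after building t2, which is exactly what an in-place destructive update can't give you for free.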