I fall into that camp. I don't look down on 'theoretical CS' stuff, but it's rarely had to be a consideration in projects I've worked on, which have included ecommerce systems selling billions of dollars of stuff (large quantities, small price per item), real-time reporting of financial data, and numerous other projects requiring a degree of scale or speed or both (php, vb, java and other stuff over the years).
Not everyone works on facebook, nor is everyone writing real-time device drivers, nor is everyone writing something that will explode in users tomorrow, devastating the business if not every customer is served in less than 50ms.
I may be one of those in the 'get it done' camp, but I've also learned over the past several years to opt for using well-known libraries when possible, to take advantage of the expertise and skills that I don't have (yet). 15 years ago, I was firmly in the 'write it from scratch and focus on performance, tightness, elegance, etc' camp, but not so much today. Part of that is because we simply didn't have the wealth of free software libraries we do today, so you had to write more stuff from scratch, but that's not as much of an excuse today.
I've yet to have to write a binary search after 17 years in professional software development, and my projects have not suffered because of it. I had probably a good 10 years of hobbyist time before that, so I probably have simply picked up 'good' patterns to common problems without necessarily knowing the specific CS theory or names behind them in some cases.
All that said, my answer to performance issues in my apps is generally not 'throw more hardware at it' - I will profile to look for bottlenecks, isolate specific areas and rewrite sections of code to make them more performant, which sometimes means changing how data is organized/stored, or modifying queries, or something else.
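As a rough illustration of that profile-first workflow, here's a minimal sketch using Python's standard cProfile and pstats modules. The `slow_lookup` function is an invented stand-in for whatever hotspot profiling would actually surface in a real app:

```python
import cProfile
import pstats
import io

def slow_lookup(items, target):
    # Hypothetical hotspot: a linear scan that profiling would surface
    return [i for i in items if i == target]

def run():
    data = list(range(10_000))
    for _ in range(100):
        slow_lookup(data, 9_999)

# Profile the workload, then print the top entries sorted by cumulative time
profiler = cProfile.Profile()
profiler.enable()
run()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
report = stream.getvalue()  # the report names the functions dominating runtime
```

The point of the exercise is that the report names specific functions, which tells you where a rewrite (or a change in data organization) would actually pay off, rather than guessing.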
Lastly, while I think I understand the type of person/attitude you're talking about ("full of self-important people like this"), I don't think I see myself in that particular camp (but of course, no one ever does, right?). There's a degree of pragmatism that needs to happen in 'real world' software dev (isn't agile all about "you ain't gonna need it"?). I've also had to 'clean up' after enough other developers over the past 17 years - including myself on a number of occasions - that my perspective may be sufficiently different from that of people who work at company X for 10 years and rise to the rank of 'sr dev' having only ever worked in one or two places. Or maybe I'm just a self-important ass who's justifying himself too much in public?
EDIT: One other thing that has popped into my head - I've worked on more than a couple of projects where other people on the team (before me or concurrently) insisted on certain things being done "right", simply because that was the "right" way. Two things were apparent - they typically didn't know any other way at all (lack of experience), and they had no understanding of what the real use of the application was - they never talked to end users or other depts/units, and were creating far more work for everyone else by not implementing things differently (in their minds, 'compromising on correctness'). A couple of different scenarios in the past few years spring to mind.
Despite being someone with a formal CS background, I completely understand where you're coming from. I've worked on practical applications for the last 5 years with very little consideration for the details of algorithms or the underlying theories behind what I'm doing.
Despite my highly developed pragmatism, I still think understanding those theories at great depth makes me a better programmer.
It's almost subconscious, but knowing how the compiler works, how the processor runs the compiled code, and how everything works at the most basic level gives me a more complete idea of what I'm doing under the hood.
I guess you could say that it's like having a physics degree would make you a better car mechanic, when the reality is probably far from it. But then, I know lots of car mechanics who would benefit from knowing a little more about the theory of their work rather than just practical know-how. It's a fine balance.
Just to respond to your last point: I've been one of those people insisting on doing something the "right" way. I disagree that this is a bad thing, or a sign of inexperience or lack of understanding. Often when I have to argue hard for taking this "right" path, it's because of experience with doing it the other way, and a belief that one would only do it the "wrong" way because of inexperience. So in other words, I have exactly the opposite view on that entire subject.
Often the "right" way to do things takes a little more time, a little more work, but is usually more rock solid in the long term or easy to understand for other programmers. It's more deterministic and predictable, and usually more about good architecture than raw performance (or it should be).
In general you have good practical points that every programmer should be aware of. By no means should real programming be done in an academic and theoretical manner. But at the same time, theoretical CS is provably correct about a vast number of concepts (that is its business, after all), and by no means should it be ignored or looked down upon. The truth lies in a balance between practicality and theory here, and one should never take a side. Look at each situation with the perspective it demands, solve it using the wide variety of tools available to you, and be prepared to accept that a solution might be more theoretical or more practical than you're willing or able to understand.
I don't come from a CS background but rather my education was in computer engineering, mostly of the hardware and low-level systems programming variety. The majority of what I actually do for work is entirely self-taught.
I find that having a background in the lower-level aspects of computing does help a lot. The situation where I've seen people who know mostly software falter is when debugging problems that involve system interactions. My belief is that being able to construct a mental model of what's happening from top to bottom is crucial in solving some problems and people who can't do it eventually resort to just trying random things until something works.
I guess the take-away from this conversation is that it's useful to have knowledge in another related domain, be it computer hardware, low-level programming, physics, computer science, etc., even when you are developing only high-level applications.
I struggle between wanting to clarify my 'example' situations more and not wanting to 'name names' (or implicate people based on identifiable information). In two recent cases, scenarios that were being argued for as "right" (as in "there is only one right way, because I have a CS degree") were in actuality subpar options where there was no definable 'right way' (think arguments over which order invoicing/accounting software should process debits/credits in). To some extent there are 'right' ways in some fields, but the real right answer is to work with the business units in question, determine what they need now and how to move them to another process if there's a compelling reason to change, and document all of it for posterity. In contrast, I was dealing with something built in isolation, undocumented, and justified with hand-wavy CS pseudo-BS. This isn't an argument against CS theory at all, but as much as people have run into justifications for bad code with "well, it worked and we had a deadline", I've also run into quite a number of piss-poor designs that didn't really work (or required massive workarounds) because someone with a CS degree was given free rein without challenge, and never understood the pragmatic aspects of the business needs they were purporting to serve. In other words, I suspect we would probably have the exact same view on that subject (if I'd shared explicit details).
I do not at all 'look down' on CS stuff, but I also don't feel that I need to be an expert on every single aspect of it to get things done. For example, my Linux desktop worked pretty well for my needs both before and after Linux kernel scheduling patches. When I read about them, I could understand what was going on, and if I were deep in the kernel, might even have been able to identify what to do and where. I appreciate and benefit from proper and efficient algorithms in libraries I use, and can (if need be) identify a particular library that's suited to a particular project based on the algorithms it uses (and may even dip into the code to verify it's doing what I really need).
My own background is that I did 6502 machine code (by hand, no assembler) in the 80s (not professionally - I was a kid - did it as a hobbyist), so while I do not claim to be a computer scientist by any stretch, I have a decent conceptual idea of what's going on at the low level. Yes, most of that is out of date today, but I think conceptually I 'get' it. I also don't think it's actually helped that much in day-to-day work, but perhaps it has and I just don't see it any more because it's second nature(?)
Last point(?) - even if what you're doing is technically/algorithmically the most optimal solution to something, please document what you're doing. Even something as basic as identifying "hey, using a bloom filter here because XYZ needed ABC" in some comments will help people coming along who aren't familiar with particular patterns to get up to speed.
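To make that concrete: here's a hypothetical sketch of what such a documented pattern might look like - a minimal Bloom filter in Python with the kind of comment suggested above. All names and parameters here are invented for illustration:

```python
import hashlib

class BloomFilter:
    """Probabilistic set: membership tests may give false positives,
    never false negatives.

    Using a bloom filter here (hypothetically) because the lookup path
    needed a cheap way to skip items we've definitely never seen -
    exactly the kind of 'why' comment that helps the next maintainer.
    """

    def __init__(self, size_bits=1024, num_hashes=3):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits)  # one byte per bit, for clarity

    def _positions(self, item):
        # Derive num_hashes independent positions from sha256 digests
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def might_contain(self, item):
        # True means "maybe present"; False means "definitely absent"
        return all(self.bits[pos] for pos in self._positions(item))
```

The docstring comment is the real payload here - a reader unfamiliar with Bloom filters gets both the name of the pattern and the reason it was chosen.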
Some real world examples I've seen where people have screwed up because of not understanding theory:
A graphical toolkit system that stored all the styles applied to components as a linked list, but each element included a pointer to the head of the list, and every time a new component was instantiated it was added to the head - so every existing element's head pointer had to be updated. Which meant that creating a new graphical element went from O(1) to O(N) and caused significant slowdown (i.e., seconds of time).
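If I've understood that description correctly, the pattern looks something like this Python sketch (names invented): the redundant per-node head reference is what turns an O(1) prepend into an O(N) walk.

```python
class StyleNode:
    def __init__(self, name, next_node=None):
        self.name = name
        self.next = next_node
        self.head = None  # every node redundantly tracks the list head

def prepend(head, name):
    """Add a new style at the front. O(N), because every existing node's
    redundant head pointer must be updated to the new head."""
    node = StyleNode(name, next_node=head)
    cur = node
    while cur is not None:  # walk the whole list updating head pointers
        cur.head = node
        cur = cur.next
    return node

# Build a list of 5 styles; each prepend touches every existing node.
head = None
for i in range(5):
    head = prepend(head, f"style{i}")
```

Dropping the per-node head reference (and passing the head around, or keeping it in one owner object) makes prepend O(1) again without changing anything else.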
A well-known open-source xml parsing library that stored the attributes of an xml element in a linked list; as part of its xml validation it had to ensure attribute uniqueness, and to do that it had to iterate through every existing attribute. This means that inserting N attributes would take O(N^2) - again, enough to cause significant performance degradation.
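A hedged illustration of that pitfall (not the actual library's code): checking uniqueness by scanning the existing attributes makes inserting N of them O(N^2), while keeping a parallel set of seen names brings it back to O(N).

```python
def insert_attrs_list(attr_pairs):
    """O(N^2): every insert scans all previously inserted attributes."""
    attrs = []
    for name, value in attr_pairs:
        for existing_name, _ in attrs:  # O(N) scan per insert
            if existing_name == name:
                raise ValueError(f"duplicate attribute: {name}")
        attrs.append((name, value))
    return attrs

def insert_attrs_set(attr_pairs):
    """O(N): a set of seen names gives O(1) duplicate checks,
    while the list still preserves attribute order."""
    attrs, seen = [], set()
    for name, value in attr_pairs:
        if name in seen:  # O(1) membership test
            raise ValueError(f"duplicate attribute: {name}")
        seen.add(name)
        attrs.append((name, value))
    return attrs
```

Both versions reject duplicates and preserve insertion order; the only difference is the cost of the uniqueness check, which is exactly where the quadratic behavior came from.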
I've seen cases where people with degrees in CS have screwed up in equally bad or even worse ways.
Writing software is complicated, and it's difficult to remember exactly how every part of a large system works. Sometimes people forget an important detail and end up writing code that performs badly. It's hardly limited to people who don't understand CS theory.
As a self-taught developer, I know what Big O notation is, and I've worked with many CS grads (including from Stanford) who did not take algorithmic efficiency into consideration.
I think these kinds of examples are very anecdotal.
The bigger point is that it's almost all anecdotal - the number of people in this field is too broad and varied to be able to draw any substantial conclusions about anything, imo.
Was that because the author had no comprehension of complexity and couldn't understand what was wrong with the design, or because of a lazy oversight that was fixable once the problem was brought to light?
I suspect the "self-important people like this" that trekkin is talking about are those who tell IT to "copy every program (and call) in the codebase, add 'Acme' to the front of all the names, and overwrite a clone of the present general data with Acme's data. Why can't this simple task be done by the end of the week, so I can 'deliver a business solution' to Acme next week?". This is just an extreme case I've seen of how the legacy code in many corporate IT depts consists of thousands of copies of the same basic code patterns.
Yep. I've been out of school since 1991 and the last time I wrote a binary search, or balanced a tree, or even implemented a linked list, was in school.