When I did Algorithms in college ~18yrs ago, I don't recall it being called DP. We certainly learned it, but without a distinct name it didn't stick as a unique concept, just an obvious strategy/approach.
Same. Probably the worst class I ever took was my algorithms and data structures class. I swear the professor transcribed most of CLRS onto the blackboard over 10 weeks verbatim, and we never so much as touched a keyboard.
Honestly, it's hard to completely grok a concept until you have solved at least two or three problems on the topic, no matter how many examples or lines of pseudocode you have seen. I think that's his complaint, and I have to agree with him. I also took a class some years ago that was purely theoretical, and I didn't really learn much besides some general concepts.
I was actually referring to solving problems yourself in pseudocode on paper.
In any case, I'm happy with a computer science course that doesn't involve actual computer implementations, and it's not just because it drives home the point that they are different things, but also because you don't have to get bogged down in the practicalities of wrestling with a particular language or a particular toolchain.
Although there is a lot of merit in implementing an algorithm rather than just writing pseudocode, doing it on paper has some advantages:
- It's a lot quicker to sketch an algorithm on paper (you can ignore details that are either trivial or irrelevant to the problem)
- At a certain level you are expected to be able to convert pseudocode into actual code
- The most important part of an algorithm is knowing it exists and what problems it solves (including variations), as well as the "trick" that makes it solve something particularly well, e.g. dynamic programming for overlapping subproblems. Whether I implement an algorithm or just write the pseudocode, I will forget the details fairly quickly, but the takeaway is knowing that for problems of type X I can use algorithms of type Y (and sometimes I'll remember I can use Y because of fact C related to that particular problem or algorithm)
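Since the dynamic-programming "trick" for subproblems comes up above, here is a minimal sketch of it (my own illustration, not from the thread): the classic coin-change problem, where naive recursion re-solves the same sub-amounts exponentially often, and caching those sub-results makes it cheap.

```python
from functools import lru_cache

def min_coins(amount, coins=(1, 2, 5)):
    """Fewest coins summing to `amount`, or None if impossible.

    Naive recursion recomputes the same sub-amounts over and over;
    memoizing them (the DP "trick") makes the work linear in `amount`.
    """
    @lru_cache(maxsize=None)
    def best(a):
        if a == 0:
            return 0
        sub = [best(a - c) for c in coins if c <= a]
        sub = [s for s in sub if s is not None]
        return min(sub) + 1 if sub else None

    return best(amount)

print(min_coins(11))  # 3 (5 + 5 + 1)
```

The point is the one made above: even if you forget the implementation details, remembering "overlapping subproblems → cache them" is what lets you recognize where DP applies.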
> you can ignore some details which are either trivial, or irrelevant to the problem
The most important insight of the old saw (teaching something builds your understanding, but being able to code it proves you have actual, deep understanding) is this: the details you ignore as "trivial" or "irrelevant to the problem" are quite likely the crucial details for understanding it and making it work. You can't safely handwave away parts of the problem until you have a good understanding of the entire problem.
I can't even count the times I thought I understood some algorithm (either in uni, or more recently, from reading a paper), then sat down to implement it and realized I didn't really understand shit about it.
Sometimes I program in an actual programming language using paper and pen, solely for the pleasure of commenting my code in mathematics rather than in English. I make sure that the code is syntactically and semantically valid: stripped of the comments, it's a perfectly valid program that you can run.
Maybe it isn't an awful format for some people, but it sure didn't work for me.
I mean, I ended up implementing most of the problem sets in Python anyway - then had to transpile them into the bizarre undocumented form of pseudocode that was apparently the only thing the instructor and TAs could understand.
Combined with the complete lack of value added by the in-class lectures (compared to just reading the textbook) and the draconian attendance policy, it was far and away the worst class I endured in college. Particularly galling for a subject so central to the course of study, and one whose material could be so interesting if handled differently.
All our classes required delivering some kind of working code, either at the end of the semester or throughout it.
It was also the class that introduced us to unit tests: the code was delivered as a C library, which the teacher would link against to run her tests.
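The workflow described above (students deliver a library, the instructor runs her own tests against its public interface) can be sketched in miniature. This is my own Python analogue of that C setup, with hypothetical names, not the actual course harness:

```python
# The "library" a student would deliver (hypothetical example).
class Stack:
    def __init__(self):
        self._items = []

    def push(self, x):
        self._items.append(x)

    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

# Instructor side: the grading harness exercises only the agreed-upon
# interface, just as linking against a C library exercises only what
# the header exposes.
def run_instructor_tests(stack_cls):
    s = stack_cls()
    s.push(1)
    s.push(2)
    assert s.pop() == 2 and s.pop() == 1, "LIFO order"
    try:
        stack_cls().pop()
    except IndexError:
        pass
    else:
        raise AssertionError("pop from empty stack should raise")
    return "all tests passed"

print(run_instructor_tests(Stack))
```

The nice property of this arrangement is that the tests are written against the interface, not any particular student's implementation, which is exactly what makes them reusable for grading.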
Yeah, I attended it until a few years ago. They also teach OCaml and Prolog, but Java is the main language for algos, and for the AI class too. C is now only used for the Systems class.