I really don't see how. When students are first exposed to computer programming, it might make sense to start with toy / compact languages that don't have any real-world use. But assembly is not the first language you're supposed to learn!
It's very utilitarian and most commonly just used for debugging and reverse engineering. So why would you waste time on the assembly language of a long-obsolete platform?
Plus, the best way to learn assembly is to experiment. Write some code in your favorite language and look at the intermediate assembler output, or peek under the hood with objdump or gdb. Try to make changes and see what happens. Yes, you can do that with an emulator of a vintage computer, but it's going to be harder. You need to learn the architecture of that computer, including all the hardware and ROM facilities the assembly is interacting with to do something as simple as putting text on the screen... and none of this is going to be even remotely applicable to Linux (or Windows) on x86-64.
I struggled with C until I learned to program in hex and assembly on the 68HC11. Maybe I'm just a moron, but things like pointers seemed so abstract and obtuse to a complete beginner until I learned how to do indirect addressing in assembly; then suddenly it was painfully obvious why C had pointers and how they worked. Before that I had mostly used Python, where it's all far more abstracted away. People forget that many features like pointers exist to address hardware and performance limitations, which is not immediately obvious to new devs; without the "why and what" of what's actually happening inside the CPU, your intuitive understanding is limited.
That's why some people see the C language as "portable assembly". I think C is at its best when you want assembly-like memory-addressing flexibility but don't want to deal with instruction-set-specific idiosyncrasies.
This is my experience as well, for a lot of features that seem like magic in high level languages. I could somewhat accept pointers even without understanding them, but object-oriented programming with its classes and all was such a mystery to me that I was scared to even try using it.
It wasn't until I learned assembly language that I understood pointers, and from there I could implement a basic OOP system in C and finally understand what objects are all about. It only clicked when I learned how to do it from scratch.
I have a hypothesis that to move beyond being a consumer of technology to being a producer of technology, modern education in software development should build fundamentals from first principles, especially with kids and young adults. It should be analogous to how we learn counting, arithmetic, and higher level maths.
One of the best features of obsolete, constrained architectures is their simplicity. Recovery from mistakes is quick, and there is (generally) no permanent damage. All of this makes it much easier to understand what is happening at a lower level.
Once you have a basic understanding, then you are ready for the next level of abstraction.
I assume that openness and availability of IP is the biggest challenge to putting some kind of curriculum together. I would be highly interested in whether anyone has curated this into a cohesive educational approach, especially one that targets childhood development.
Spending a couple of months tinkering with hardware and programming assembly will make you understand the basics of how a computer works in a totally different way than any high level language will. I haven't programmed a single line of assembly since high school (back in the 80s...), but the fundamental understanding of how operations are executed and how registers work makes it so much easier to understand the whys, the ifs and buts of programming and optimization. And you'll certainly start to appreciate clean, effective code.
I wish there were books like this today, that could lead a kid from knowing virtually nothing to a graduate level understanding of computer architecture, digital storage, logic and set theory, graphics, mathematical modelling, networking, numerical methods, signal processing, electrical engineering, software design, ...
Wow, so many memories there! Yeah, BASIC (TI-99), then Atari Basic, then Basic XL (along with reading a 6502 book), then GFA BASIC, then Megamax / Laser C, then Pascal and 68K and Ansi C at uni. RISC was part of the 4th year digital architecture/design course.
There are roughly two ways to learn programming: top down (start with abstract concept and go down to actual implementation, e.g. SICP) and bottom up (start with the concrete low-level code and let the abstractions naturally emerge).
I studied electronics, so naturally we began with assembly (Motorola HC11). By the end of the course everyone had independently developed their own macros to do things like for-loops, so it was a natural progression from there to C. By the end of the C course "C-style OOP" had also emerged naturally, which led to the next course in C++.
The downside of this approach is that there is no gradual route from there to functional paradigms (or non-imperative ones in general). Also, one develops the habit of always thinking about how the language works under the hood, which can be counterproductive. E.g. when I was trying to learn Haskell, my mind kept trying to understand how the interpreter worked.
Learning assembler is not just about the language, but about understanding how the machine works (buses, memory-mapped peripherals, etc). On older platforms this is much simpler, so while ARM instructions may be easier to learn than the CISC instructions of the HC11, everything else is much friendlier for the beginner on the HC11.
With the dmd compiler, compiling with -vasm will show the generated assembly as it compiles. It's been poo-pooed because why not use objdump or -S? But once you try it, you'll know why it's so convenient, as it just emits the assembler, and not the huge pile of boilerplate needed to make an object file.
For example, I'm working on an AArch64 code generator, more specifically, generating floating point code. I have a function:
You might think that the compiler was generating assembler code, and then assembling the code to binary. Nope. It generates the binary instructions directly. The compiler has a builtin disassembler for the display. This makes for a fantastic way to make sure the correct binary is generated. It has saved me an enormous amount of debugging time.
Learning assembly is what finally made programming "click" for me. With a solid intuition for instruction sets, pointers, and addressing modes I could suddenly reason about programs on another level.
I have found good results with this model of teaching since then and wish that more people tried it.
I started programming in AppleSoft BASIC in 1986, but quickly became fascinated with 65C02 assembly. By the time I got to college and started taking different programming language classes, I quickly fell in love with C. Knowing assembly helped me understand C, and even though I haven't learned any other assembly language since then, when I read articles about the low-level architecture of processors, I can follow along.
Let me take that back: I did learn 16-bit x86 assembly and the INT 21h DOS calls in college.
I went from BASIC straight to Assembly. I wish they had taught me Assembly when I was 9 years old (in the 1970s) instead of BASIC. Once I learned Assembly I never used BASIC again.
If you had a 16 kB memory card (a.k.a. the "Language Card") you could overlay the ROM with RAM and load the Integer BASIC ROM over the AppleSoft BASIC ROM.
AppleSoft did "everything" with floating-point variables, even things like loop indices into arrays. It's amazing that programs ran at usable speeds on a 1 MHz machine.
My C64 didn't come with a monitor. I typed one in from a magazine, then learned assembly by entering instructions directly into it. I was so thrilled to discover assemblers later.
I thought I was in heaven when I got the Action Replay cartridge (not a genuine one, but a clone with the same software) that came with a monitor, a "freezer", a fastloader, and various disk utilities.
> The era in which there was nothing but assembly language was very short-lived.
Hell, I'm not even sure the era existed.
Grace Hopper was creating the first few high level languages for UNIVAC I. A-0 was complete in May 1952. A-2 (the first to see extensive use) was created in August 1953.
As far as I can tell, UNIVAC I never had an assembler. Programmers who weren't using A-0 were expected to just type in raw machine code. Here is a UNIVAC programming manual from 1953 [1], and there is no mention of an assembler. Oh, and if you think you see instruction mnemonics... no, those are just letters that the CPU's instruction codes happen to map onto.
Over in the IBM world, at least the 701 launched with a proper symbolic assembler in April 1953. But it also launched with Speedcoding [2], a somewhat higher level language halfway between non-symbolic assembler (it decodes mnemonics, but the programmer has to specify all addresses as absolute numbers) and an interpreter.
None of the other early computers seem to have had assemblers (though some, like the Manchester Mark 1, had high level languages).
There might have been some programmers at IBM who had access to the prototype 701 and the symbolic assembler before Speedcoding existed [3]. But everyone else seems to have gotten access to high-level languages at the same time as, or before, they got access to assemblers.
[3] It's also possible that Speedcoding development was largely finished before the first 701 was operational. I'm finding it hard to find exact dates for that.
Part of the problem is that "Assembly language era" is ill-defined. Personally, I don't think it counts as Assembly language unless you are using a symbolic assembler, because that's what modern programmers think about when you say "assembly".
There is a reasonably common interpretation that includes the whole machine-code era as "assembly language", since you were writing it out and then hand-assembling it. By that interpretation the UNIVAC's C-10 machine code counts as "assembly language", even if nobody used that terminology, and the "assembly language era" lasted a few years. But I think this interpretation is very misleading to any programmer exposed to a proper assembler.
Anyway, even with my stricter definition, there was an assembly-only era, but it only seems to have existed inside IBM's research labs. They had their first symbolic assemblers running on the "test assembly" by October 1950.
There is very little information about this "test assembly" computer on the internet; it doesn't even have a Wikipedia page (same with the "Tape Processing Machine", or TPM, that followed). But "IBM's Early Computers" by Charles J. Bashe documents it. This computer had not one but two symbolic assemblers running by autumn 1950, which does actually seem to beat most high-level languages.
Long before the first 701 was even installed (in April 1953, at IBM's HQ), programmers had already gotten sick of assembly programming, which is why Speedcoding was created.
This wasn't just internal IBM programmers, though. Customers who bought the 701 were given documentation about both the computer and the assembler as early as 1951. These customers had hired programmers, who started writing assembly code months before the 701 assembler was even debugged and running on the first prototype 701, and years before they received their computers. So maybe there was also an "assembly language only era" in the offices of these early 701 customers. But it's kind of an edge case, given that they didn't yet have a computer to run the assembler on or test their programs.
I assume these early programmers were occasionally visiting the prototype 701 to assemble and test their code.