Hacker News

"This is not a Motorola MC14500 computer, but it was the MC14500 that introduced me to the idea of one-bit computing. Exploring ways to reduce the chip count, a series of pencil & paper designs revealed the MC14500 itself could be omitted!"

That's pure gold. I love these optimization rounds.

My worst experience with this: I once spent a lot of time optimizing a function that looked like it was eating a whole pile of time, only to realize afterwards that a hand-optimized assembly version had already been graciously provided in the same subdirectory. And it ran a lot faster than mine :(



More on that one, with a VHDL description too:

http://www.brouhaha.com/~eric/retrocomputing/motorola/mc1450...

The handbook has this interesting statement: "When a job is dominated by calculations or data logging, a multi-bit processor is more appropriate. When the task is decision and command oriented, a one-bit machine is an excellent choice."

I'd like to see that put to the test with a number of simple control systems to see how it measures up. Plus, I'd love to see the demoscene have a go at 4- and 1-bitters to see what they're really capable of.
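The decision-and-command case is easy to get a feel for in software: a minimal sketch of an MC14500-style one-bit interpreter, with a single result register (RR) and logic instructions over named input/output bits. The opcode mnemonics (LD, LDC, AND, OR, XNOR, STO, STOC) follow the real chip's instruction set; the interlock program and the signal names are hypothetical, and the real ICU also has IEN/OEN/JMP/etc. that are omitted here.

```python
def scan(program, inputs, outputs):
    """Run one scan cycle of (opcode, address) pairs, PLC-style.
    rr is the chip's single 1-bit result register."""
    rr = 0
    for op, addr in program:
        if op == "LD":     rr = inputs[addr]            # load input bit
        elif op == "LDC":  rr = 1 - inputs[addr]        # load complement
        elif op == "AND":  rr = rr & inputs[addr]
        elif op == "OR":   rr = rr | inputs[addr]
        elif op == "XNOR": rr = 1 - (rr ^ inputs[addr])
        elif op == "STO":  outputs[addr] = rr           # store to output
        elif op == "STOC": outputs[addr] = 1 - rr
    return outputs

# Hypothetical interlock: motor runs only if start is pressed AND the guard
# is closed AND the emergency stop is not tripped.
program = [("LD", "start"), ("AND", "guard"), ("ANDC", "estop"), ("STO", "motor")]
program = [(op if op != "ANDC" else "AND", a) for op, a in program]  # keep sketch to listed ops
inputs = {"start": 1, "guard": 1, "estop": 0}
outputs = {"motor": 0}
scan([("LD", "start"), ("AND", "guard"), ("STO", "motor")], inputs, outputs)
# outputs["motor"] -> 1
```

The whole "CPU" state is one bit, which is exactly why ladder-logic-style control fits it so naturally: each rung is just a short LD/AND/OR/STO sequence.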




