Tangentially related question: how does one learn "computer architecture" or "computer engineering" as a hobbyist? I'm actually not even sure what those terms really mean.
I'm amazed that people can build homebrew computers using Z80, MOS 6502/6510, or Motorola 68k series microprocessors.
I could buy one of those homebrew Z80 kits and assemble it, but I don't think I'll gain much real understanding just from the act of assembling the kit.
I have long wondered how things work down to the bare-metal level. What does it take to be able to design (and program?) a homebrew computer? Let's say I leave out the FPGA route and only focus on using real, off-the-shelf ICs.
I'm mostly just a software guy (I write Python, and I learned automata theory at university), but I haven't learned any real electronics beyond things like Ohm's law, KCL, KVL, capacitance, and impedance as a complex number. I don't even know my transistors or diodes.
The first six textbook chapters (all the hardware material) are available on the website under the 'Projects' tab; you'll need to buy the book for the other six. All 12 chapter assignments are available on the website, and the Software section has a hardware simulator for testing your designs.
I'd like to second this -- I'm slowly making my way through Nand2Tetris (currently building a VM -> assembly transpiler, and if you'd asked me to do that a few months ago, I'd have had no idea what you were talking about), and it's been really fun, taught me a lot about low-level computing, and given me the opportunity to learn a new language (I'm building everything in C). I highly recommend it!
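For a taste of what that translation step looks like, here's a minimal C sketch of one VM command. The emitted Hack assembly follows the course spec; everything a real translator needs (parsing, the other memory segments, arithmetic commands) is omitted:

    #include <stdio.h>

    /* Toy sketch of the Nand2Tetris VM -> assembly step: translate the
     * VM command "push constant n" into Hack assembly. A real translator
     * parses input lines and handles every segment and command; this
     * just shows the shape of one translation. */
    static void emit_push_constant(int n)
    {
        printf("@%d\n", n);   /* A = n          */
        printf("D=A\n");      /* D = n          */
        printf("@SP\n");
        printf("A=M\n");      /* A = *SP        */
        printf("M=D\n");      /* RAM[SP] = D    */
        printf("@SP\n");
        printf("M=M+1\n");    /* SP++           */
    }

    int main(void)
    {
        emit_push_constant(7);   /* output for the VM line "push constant 7" */
        return 0;
    }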
I think you'd find that once you dig into those old machines it's a lot simpler than you think. The 6502 for example is pretty easy to interface. Electronics knowledge isn't much needed at first, as it's mostly just logic. You can make a simple "computer" on a breadboard that just cycles a no-op through all of memory: https://coronax.wordpress.com/2012/11/11/running-in-circles/
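To make the trick concrete, here's a tiny C model of what that circuit does (reset sequencing and real cycle timing are ignored; the opcode is the 6502's actual NOP, everything else is simplified):

    #include <stdint.h>
    #include <stdio.h>

    /* Toy model of the free-running trick: resistors tie the data bus to
     * 0xEA (the 6502 NOP opcode), so every fetch decodes to NOP and the
     * address bus simply counts through the whole address space. */
    int main(void)
    {
        uint16_t pc = 0x0000;
        for (int fetch = 0; fetch < 8; fetch++) {
            uint8_t data = 0xEA;              /* the bus always reads NOP  */
            printf("addr=$%04X data=$%02X  NOP\n", pc, data);
            pc++;                             /* nothing to do but move on */
        }
        return 0;
    }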
From there it's not a tall order to start interfacing SRAMs and some EPROMs and a UART chip and you can get a machine you can connect to over serial.
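As a concrete example of the glue logic involved, here's a hypothetical memory map expressed in C. The region boundaries are invented for illustration, but the idea really is this simple: a few gates (often a single '138 decoder) driving the chip-select lines.

    #include <stdint.h>
    #include <stdio.h>

    /* Sketch of an address decoder for a 6502-style homebrew machine:
     * regions of the 64K address space map to the RAM, UART, and EPROM
     * chip-select lines. This particular memory map is made up. */
    static const char *decode(uint16_t addr)
    {
        if (addr < 0x8000) return "RAM";    /* $0000-$7FFF: 32K SRAM        */
        if (addr < 0xC000) return "UART";   /* $8000-$BFFF: serial chip     */
        return "ROM";                       /* $C000-$FFFF: 16K EPROM; the  */
    }                                       /* reset vector sits at $FFFC   */

    int main(void)
    {
        uint16_t probes[] = { 0x0200, 0x8000, 0xFFFC };
        for (int i = 0; i < 3; i++)
            printf("$%04X -> %s\n", probes[i], decode(probes[i]));
        return 0;
    }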
The Z80 has RAM-refresh support built in, and can be hand-clocked if you wish, making it one of the simplest processors to get running in terms of required external components.
Ben Eater's Youtube series building a simple CPU from scratch was a real "lightbulb" moment for me. After 40 years of programming (including Z80 assembly back in the 8-bit days), it took my understanding of the CPU from a mysterious black box to what's basically a simple look-up table and some control signals.
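To give a flavor of what "a look-up table and some control signals" means, here's a toy C version of such a control unit. The signal names and instruction encodings below are invented for illustration, not Ben Eater's actual design:

    #include <stdint.h>
    #include <stdio.h>

    /* Toy microcode "ROM" in the spirit of a breadboard CPU: the control
     * unit is literally a table indexed by (opcode, microstep), and each
     * entry is a bitmask of control lines. */
    enum {                     /* one bit per control line */
        MI = 1 << 0,           /* latch memory address register   */
        RO = 1 << 1,           /* enable RAM onto the bus         */
        II = 1 << 2,           /* latch instruction register      */
        CE = 1 << 3,           /* increment program counter       */
        CO = 1 << 4,           /* enable program counter onto bus */
        AI = 1 << 5,           /* latch A register                */
        AO = 1 << 6,           /* enable A register onto the bus  */
        OI = 1 << 7,           /* latch output register           */
    };

    #define N_OPS   2
    #define N_STEPS 4

    static const uint8_t ucode[N_OPS][N_STEPS] = {
        /* every instruction shares the same two fetch steps */
        { CO|MI, RO|II|CE, CO|MI, RO|AI|CE },  /* LDI: load next byte into A */
        { CO|MI, RO|II|CE, AO|OI, 0        },  /* OUT: copy A to the output  */
    };

    int main(void)
    {
        /* "Run" opcode 0 (LDI): print which lines are asserted each step. */
        for (int step = 0; step < N_STEPS; step++)
            printf("LDI step %d: control word = 0x%02X\n", step, ucode[0][step]);
        return 0;
    }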
There are some popular textbooks on computer architecture; Patterson's stuff is good, but there are better texts. Some techno-hobbyist sites (like Ars Technica) do CPU reviews that get into various microarchitectures. Datasheets from processor manufacturers are free and often have interesting details.
Going after things at a really low level, books like The Art of Electronics take you from transistors (and through things you probably don't care about, like power supplies and amplifiers) to digital electronics, to simple computer system design. You can probably find value by skimming the component-level stuff, skipping the goopy analog chapters, and just reading the digital electronics sections. (This is more or less how I started with personal computers in 1974 or so).
Can you give some examples and why/in what respect they are better than (David) Patterson's books? Or are those "better texts" the Ars Technica CPU reviews and manufacturer datasheets that you mentioned in the subsequent sentence?
I liked John Shen's Modern Processor Design and Noam Nisan's The Elements of Computing Systems.
The websites I mentioned have hobbyist-level details that are pretty interesting (but in general won't teach you much about things like concurrency hazards and branch prediction and so forth).
I happen to be a computer architect both professionally and as a hobbyist, but the basics of it really aren't that complicated. All of the _hard_ parts are people being terribly clever to eke out the last bit of performance in a given technology. Getting any modern <$100 FPGA board and designing a really simple processor is definitely doable as a complete novice. I haven't read it, but I've heard good things about the Nand2Tetris book someone else in this thread mentioned.
I don't see a reply to this and I'm not sure one is forthcoming, but just on the off chance there isn't: at that level everything is analog, and when you're optimizing for multiple variables at once (power consumption, clock speed, reliability, gate count) it gets complicated in a hurry.
The big breakthrough was the ability to realistically simulate larger circuits at higher frequencies, in order to work out how to make those circuits stable rather than oscillate wildly at the frequencies they operate at. Once you develop a bit of appreciation for what is required to make a hundred gates work reliably at 10 MHz or so, you can begin to understand the black magic required to make billions of transistors work reliably at GHz frequencies.
So the skill is not at all related to puzzle games, but it is a puzzle, and the tools and knowledge required were usually hard won. At those frequencies and part sizes anything is an antenna; there is no such thing as a perfect isolator, and there are coils, resistors, and capacitors all over the place even if you did not design them into the circuit (parasitics).
If this sort of stuff interests you, I highly recommend the SPICE tutorials, and/or getting an oscilloscope and some analog parts to play with.
Is that what “high speed digital design” is? Like the stuff that Dr Howard Johnson teaches? Or is that not the same scope or maybe still something entirely different?
"High Speed Digital Design" generally refers to PCB level design. The Black Magic books are pretty old, covering ancient stuff like DIP packages on manhattan-routed boards. These days the keyword is SI/PI (Signal Integrity, Power Integrity).
I'm not familiar with that book, but I just looked at some previews and it seems to be exactly what I'm getting at. I built a large number of radio transmitters in my younger years (long story) in the 100 MHz range; that was all analog, so I had a pretty good idea of what it was like to design high-frequency stuff, or so I thought. Then I tried to do a bunch of digital circuits at 1/10th that frequency, and even with only a handful of components in a circuit you'd get the weirdest instabilities. From there to the point where you can reliably design digital circuitry is a fascinating journey, and it gives you infinite respect for what goes on under the hood of a modern-day computer.
Yes, and that's a reason for its subtitle, "A Handbook of Black Magic" :). In the olden days EEs would more or less guess by intuition, with a heavy sprinkling of randomly placed pullups/pulldowns/capacitors to force designs into stable working order.
I don't remember the exact video, but Bill Herd has mentioned many times that on-site, last-minute fab fixes involved prodding the product on a hunch about where the problem might be.
Today you can simulate and measure pretty much everything, plus automated design rule tools will warn you of potential problems beforehand.
As a skill, I think it's just called "engineering in a competitive market." Anything you can do to make your chip more attractive to customers is a win, so designers go to great lengths to do so. Things like caches, pipelining, branch predictors, register renaming, out-of-order execution, etc. aren't in any way fundamental to how processors work; they're just tricks to make your program run a few percent faster. Designing a simple CPU in an FPGA that executes everything in a single cycle and uses on-chip block RAM is probably only a few hours of work for a novice. It would probably take you longer to download and install the FPGA software!
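For a sense of scale, here's the whole single-cycle idea as a C simulation. A real version would be a screenful of Verilog or VHDL, and this three-instruction ISA is invented purely for illustration:

    #include <stdint.h>
    #include <stdio.h>

    /* C simulation of a single-cycle toy CPU: every instruction fetches,
     * decodes, and writes back in one loop iteration, with "block RAM"
     * modeled as a plain array preloaded with the program. */
    enum { OP_LOADI, OP_ADD, OP_HALT };

    typedef struct { uint8_t op, rd, rs, imm; } insn_t;

    int main(void)
    {
        insn_t ram[] = {
            { OP_LOADI, 0, 0, 2  },          /* r0 = 2        */
            { OP_LOADI, 1, 0, 40 },          /* r1 = 40       */
            { OP_ADD,   0, 1, 0  },          /* r0 = r0 + r1  */
            { OP_HALT,  0, 0, 0  },
        };
        uint8_t reg[4] = {0};
        for (unsigned pc = 0; ; pc++) {      /* one iteration == one cycle */
            insn_t i = ram[pc];
            if (i.op == OP_HALT)       break;
            else if (i.op == OP_LOADI) reg[i.rd] = i.imm;
            else if (i.op == OP_ADD)   reg[i.rd] = reg[i.rd] + reg[i.rs];
        }
        printf("r0 = %d\n", reg[0]);         /* prints 42 */
        return 0;
    }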
A few responses pointing to Ben Eater's videos. I have only a rudimentary understanding of electronics. I built my first 6502 computer following his first two videos. Ben's able to strip away a lot of things you don't need to know immediately to get up and running. Supplementing this with reading sites like 6502.org has built a lot of confidence for me to go beyond what Ben's released so far.