Maybe I overlooked it, but I missed a mention of ladder logic [1], which was often used in process control devices (even in devices for homes; I've seen some of those still in use).
There's an obvious translation from ladder logic to an MC14500B program, so it's easy to replace those devices with a simpler MC14500B device.
I even think I've seen ladder logic mentioned in some of the MC14500B documentation (but it's been some time since I looked at that).
I originally mentioned ladder logic briefly, but I figured nobody would be familiar with it, so I cut it out. The documentation discusses ladder logic a lot (as you remembered), and how to convert from ladder logic to the MC14500B:
http://www.bitsavers.org/components/motorola/14500/MC14500B_...
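To make the translation concrete, here is a minimal sketch (not from the handbook; instruction mnemonics follow the MC14500B documentation, but the tiny interpreter and the I/O naming are my own illustration). A ladder rung meaning "Y is energized when (A AND B) OR C" maps directly onto the chip's single-operand instructions operating on its one-bit result register:

```python
# Minimal interpreter for a fragment of the MC14500B instruction set,
# enough to run a program translated from one ladder-logic rung.
# RR is the machine's single 1-bit result register (accumulator).

def run(program, inputs):
    io = dict(inputs)       # named 1-bit inputs/outputs ("A", "B", ...)
    rr = 0                  # result register
    for op, operand in program:
        if op == "LD":      rr = io[operand]        # load input into RR
        elif op == "LDC":   rr = 1 - io[operand]    # load complement
        elif op == "AND":   rr &= io[operand]
        elif op == "OR":    rr |= io[operand]
        elif op == "STO":   io[operand] = rr        # write RR to output
    return io

# Ladder rung:  Y = (A AND B) OR C
rung = [("LD", "A"), ("AND", "B"), ("OR", "C"), ("STO", "Y")]

for a, b, c in [(1, 1, 0), (0, 1, 0), (0, 0, 1)]:
    out = run(rung, {"A": a, "B": b, "C": c, "Y": 0})
    print(a, b, c, "->", out["Y"])  # prints 1, 0, 1 respectively
```

Each series contact becomes an AND, each parallel branch an OR, and each coil a STO, which is why the translation is so mechanical.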
My father and his colleagues were still using ladder logic at the paper mill where they worked as of the nineties. An article on PLCs from 2016 claimed that ladder logic still dominated...
Great choice of subject matter, as usual, Ken! May I direct your attention to a one-bit machine [1] that's already something of a favorite on HN, posted for discussion in 2014 [2], 2016 [3] and 2019 [4].
"This is not a Motorola MC14500 computer, but it was the MC14500 that introduced me to the idea of one-bit computing. Exploring ways to reduce the chip count, a series of pencil & paper designs revealed the MC14500 itself could be omitted!"
This is another great post! I had a question about the following:
>"Another key circuit in the processor is the transmission gate. This acts as a switch, either passing a signal through or blocking it."
This is the first time I think I've come across a "transmission gate" circuit in one of your posts. Is this uncommon then? What is the actual input to the transmission gate?
The transmission gate is pretty common in microprocessors. In NMOS circuits, it's a single transistor called a "pass transistor", but in CMOS it uses two transistors and is usually called a transmission gate.
It takes a logic signal as input, as well as a control signal. If the control signal is 1, the logic signal goes through to the output. If the control signal is 0, the transmission gate is disconnected. You can think of it like a relay-controlled switch, or a tri-state buffer.
The nice thing about MOS circuits is that the gate input resistance is almost infinite, so if you switch a pass transistor off, a gate on the output side will keep the old value (for a few milliseconds at least). So you can create latches almost for free. This is used very often in microprocessors. The disadvantage is that the chip has a minimum clock speed, or else the data will leak away.
Pass transistors / transmission gates can also be used to implement multiplexers, selecting one of the inputs.
A disadvantage compared to regular logic gates is that a logic gate amplifies the input signal, while a pass transistor weakens the input signal. So you usually can't connect two pass transistor circuits together directly.
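A rough behavioral sketch of the switch analogy above (my own toy model, not circuitry from the post; function names are illustrative): a transmission gate passes its input when the control is 1 and presents a high-impedance output otherwise, and two of them with complementary controls make a 2:1 multiplexer.

```python
# Toy model of a transmission gate: when the control is 1 the output
# follows the input; when 0 the output is disconnected (high-Z, None).
def tgate(data, ctrl):
    return data if ctrl else None  # None stands for high impedance

# A 2:1 multiplexer from two transmission gates with complementary
# controls: exactly one gate conducts at a time, so the shared output
# node is never driven by both inputs.
def mux2(a, b, select):
    out_a = tgate(a, 1 - select)   # conducts when select == 0
    out_b = tgate(b, select)       # conducts when select == 1
    return out_a if out_a is not None else out_b

print(mux2(0, 1, 0))  # 0: selects input a
print(mux2(0, 1, 1))  # 1: selects input b
```

The model also shows the drawback mentioned above: nothing in `tgate` amplifies, so a real chain of these would need a buffer stage to restore the signal.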
Interesting, so it has different names in different contexts (NMOS vs. CMOS). I'm familiar with the "pass transistor" nomenclature and have read about that in at least one of your previous posts. Good to know. Cheers.
When I was in college we had lots of TTL breadboard labs to make various types of digital circuits. I wonder if something like this might be a good part of a curriculum that builds up to making a full processor?
Possibly, but I suspect this chip would probably be more of a tangent than something on the path to a full processor. Also, a 1-bit processor isn't very exciting as far as applications, compared to even a 4-bit processor.
The Connection Machine had a very unusual architecture that was sort of 1-bit, but sort of 32-bit. It was a massively parallel computer of the 1980s with 16,384 processing elements. Each processing element handled one bit at a time. But usually each processing element performed arithmetic on a 32-bit value, in a bit-serial fashion.
You can call this a 1-bit processing element, but I think calling it a 32-bit serial processor is more descriptive. Processing data serially using a 1-bit ALU was not uncommon, from the early EDSAC computer to the PDP-8/S minicomputer to the Datapoint 2200 desktop computer, but these are not considered 1-bit computers. The Connection Machine was more flexible with word size than these, so calling it a 32-bit computer isn't quite accurate either.
In any case, the MC14500B didn't have any support for bit-serial operations. (For instance, to do addition you want the processor to carry from one bit position into the next.) Arithmetic was possible on the MC14500B (it's Turing-complete, after all), but it was very slow, taking 12 instructions per bit to manipulate the sum and carry. The documentation recommended using an external chip if you needed to do arithmetic.
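The sum-and-carry shuffle described above can be sketched in a few lines (my own illustration of the underlying logic; the "12 instructions per bit" figure is the manual's, and refers to doing this with one-bit logic ops and scratch memory):

```python
# Bit-serial addition: process one bit per step, least significant bit
# first, keeping a single carry bit of state between steps -- the
# operation a 1-bit ALU repeats to add multi-bit words.

def serial_add(a_bits, b_bits):
    carry = 0
    out = []
    for a, b in zip(a_bits, b_bits):                  # LSB-first lists
        out.append(a ^ b ^ carry)                     # sum bit
        carry = (a & b) | (a & carry) | (b & carry)   # majority -> carry
    out.append(carry)                                 # final carry-out
    return out

# 6 + 3 = 9, as LSB-first bit lists: 6 = [0,1,1], 3 = [1,1,0]
print(serial_add([0, 1, 1], [1, 1, 0]))  # [1, 0, 0, 1], i.e. 9
```

On a machine like EDSAC or the PDP-8/S this loop body is hardware; on the MC14500B every XOR and majority term has to be built from sequential one-bit instructions, which is where the cost per bit comes from.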
You mention this briefly, but it's worth pointing out that the CM-2 (and, I believe, the CM-1 as well) was indeed a single-bit-at-a-time processor, but the word length was arbitrary, not 32 bits. You could, in *Lisp (and maybe the low-level calls that C* used, I didn't work with that enough to know) define pretty much any bit length you wanted, up to the number of bits on the processor.
They later implemented a floating-point accelerator that worked with 32 of the 1-bit processors in "slice-wise" mode to do 32 (and possibly 64) bit arithmetic, where the word was spread across all 32 processors.
The CM-5 used true 32-bit SPARC processors.
Also, the CM-2 had up to 128k bits per processor, and you could have 64k of them in one CM-2.
I'm no FPGA expert but I think two problems would be the 4K per-processor memory and the inter-processor communication. You might have routing problems with the Connection Machine's hypercube routing.
Random Connection Machine fact I found on Wikipedia: Maya Lin, who designed the famous Vietnam War memorial in Washington also designed the exterior of the Connection Machine CM-5.
See my post from above. In industrial control there are a lot of processes with a large number of on/off sensors and actuators.
One-bit programmable controllers like 14500B were good enough to handle those.
In such old installations, the sensors and actuators that could not be handled by one-bit controllers were handled by analog controllers. In modern installations, by contrast, an MCU digitally handles not only the Boolean variables but also the analog values, through ADCs and DACs, so arithmetic computations are required in addition to the logic operations that something like the 14500B could already perform.
I was early in my career at the time, and remember looking at the datasheet and asking myself the same question. Reflecting back, I think the sibling post that mentions an upgrade of an older discrete TTL PLC system could be the bulk of the target market. So while it was unlikely someone would begin a clean new design with a 1-bit CPU, there were older systems that already had the 1-bit architecture that suited this device for a mid-life re-do.
Besides that, semi vendors have on occasion been known to manufacture a device that has no market success due to poor product/market fit, or because technology bypassed it by the time it was finished. Most devices were originally made as custom designs for some specific customer or application, so perhaps there's a boatload of these in some 1970s car.
> In 1977, the MC14500B cost $7.58 in quantities of 100 ($32 in current dollars), which seems expensive. However, at the time, an 8080A CPU cost $20 and a Z80 cost $50 ($85 and $215 in current dollars) so there was a significant cost saving to the MC14500B [5]. However, the steady fall of processor prices soon made the MC14500B less attractive.
Perhaps in 1977 you couldn't source the 4004 in the quantities needed to make a product, leaving you with only the more expensive contemporary processors?
Yes, cost was an issue. Also complexity; the documentation says:
"Computers and microcomputers may also be used [for control tasks], but they tend to overcomplicate the task and often require highly trained personnel to develop and maintain the system. A simpler device, designed to operate on inputs and outputs one-at-a-time and configured to resemble a relay system, was introduced. These devices became known to the controls industry as Programmable Logic Controllers (PLC). The Motorola MC14500B Industrial Control Unit (ICU) is the monolithic embodiment of the PLC's central architecture."
I couldn't find data on how popular the MC14500B was, but I think microcontrollers such as the Texas Instruments TMS1000 were much more popular.
Yes, in hindsight with our computer-centric perspective today, it seems like programming the MC14500 is something you'd get CS and programming people to do.
But the user manual has sections on how to translate ladder logic and similar control logic into a program counter circuit and appropriate code, step by step. If someone was familiar with industrial control and basic digital electronics, I think the manual is about all they would need. The same can't be said for most other microcontrollers which are indisputably full computers, with all of their complexity.
The complete schematics of the Xerox Alto are on Bitsavers. I don't think photos would help a lot with generating a PCB, since the chips cover much of the wiring.
A bigger problem with a Xerox Alto replica is the wire-wrapped backplane. It would be a pain to redo that by hand.
For most purposes, you'd be better off using the ContrAlto emulator.