The vintage IBM circuit board below has a large metal block on it that caught my attention, so I investigated it in detail. It turns out that the board is part of a modem, and the large metal box is a transformer. This blog post summarizes what I learned about this board, along with a bit of history on modems.
This board is a Standardized Modular System (SMS) card, but a very unusual one. In the late 1950s, IBM introduced SMS cards, small circuit boards that each held a simple circuit, and used these boards to build computers and peripherals into the mid-1960s. The idea was to design a small number of standardized boards that implemented logic functions and other basic circuits. The number of different board designs spiraled out of control, however, with thousands of different types of SMS cards. (I've made an SMS card database describing over 1400 different cards.)
Most SMS cards look like the one above, so the card with the metal block struck me as very unusual. Although some SMS cards are double-width "twin cards", I'd never seen one with a large metal block sandwiched between two boards, so it piqued my curiosity.
One suggestion was that the metal box was an oven-controlled crystal oscillator (OCXO). An OCXO is often used when a high-precision frequency source is required. The frequency of a quartz crystal varies with temperature, so by putting the crystal in a temperature-controlled module (like the one below), the frequency remains stable.
However, measurements of the module by Curious Marc and Eric Schlaepfer (TubeTimeUS) determined that the metal box was a large transformer (1:1 ratio, about 8 mH inductance). The photo below shows the four connections to the windings, while the external metal wires grounded the case. The transformer is heavy—the board weighs almost exactly one pound—so it's probably filled with oil.
The board shows its age through its germanium transistors, which were used before silicon transistors became popular. Most of the transistors are PNP, apparently because it was easier to produce PNP germanium transistors than NPN. (Silicon transistors are the opposite, with NPN much more common than PNP, largely because the electrons in NPN transistors move more easily than the holes in PNP transistors, giving NPN transistors better performance.)
I found a document1 that listed the board's part number as a transmitter board for an IBM modem, used to send data across phone lines. The large transformer would have connected the modem to the phone lines while maintaining the necessary isolation. The modem used frequency-shift keying (FSK), with one frequency for a 1 bit and a second frequency for a 0 bit. I reverse-engineered the board by closely studying it and discovered that it generates these two frequencies, controlled by a data input line. This confirmed that the board was a modem transmitter board.
The photo below shows the underside of the board, with the traces that connect the components. The board is single-sided, with traces only on the underside, so traces tend to wander around a lot, using jumper wires on the other side to cross over other traces. (It took me a while to realize that the transformer's case was just wired to ground, since the trace wanders all over the board before reaching the ground connection.) At the bottom of the board are the two gold-plated 16-pin connectors that plug into the system's backplane. The connector on the left provides power, while the connector on the right has the signals.
The result of my reverse-engineering is the schematic below. (Click for a larger version.) The circuit seems complicated for a board that just generates a varying frequency, but it took a lot of parts to do anything back then. At the left of the schematic are the board's two inputs: a binary data signal, and an enable signal that turns the oscillator on. Next are the oscillator that produces the signal, and a 13 millisecond delay (both discussed below). The output from the oscillator goes through a filter that makes it somewhat more sine-like. The signal is then amplified to drive the transformer, as well as to produce a direct output.
The oscillator
The oscilloscope trace below shows the output that I measured from the board after powering it up. The blue line shows the data input, while the cyan waveform above shows the frequency output. You can see that the output frequency is different for a "1" input and a "0" input, encoding the data. (The height also changes, but I think that's just a side-effect of the circuit.)
The modem is supposed to generate frequencies of 1020 Hertz for a "mark" (1) and 2200 Hertz for a "space" (0). However, I measured frequencies of 893 and 1920 Hz, about 13% too low. This seems like reasonable accuracy for components that are 55 years old. (I don't know what the expected accuracy was at the time. There aren't any adjustments, so the frequencies probably weren't critical. Also, since the two frequencies differ by more than a factor of two, there's a large margin. Another possibility is that my guess of ±12V for the power supply is wrong; different voltages might yield more accurate frequencies.)
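For what it's worth, here's the quick arithmetic behind that "13% too low" figure:

```python
# Drift of the measured tones from the nominal mark/space frequencies.
for nominal, measured in ((1020, 893), (2200, 1920)):
    drift = (nominal - measured) / nominal * 100
    print(f"{nominal} Hz nominal, {measured} Hz measured: {drift:.0f}% low")
```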
The modem operated at up to 600 baud. This corresponded to 100 characters per second for 6-bit characters, or 75 characters per second for 8-bit characters. The oscilloscope trace below shows the signal changing at 600 baud. At this rate, one bit is represented by only 1.7 cycles of the slower frequency, so the receiver doesn't have a lot of information to distinguish a 0 or a 1 bit. Also note that the waveform is somewhat distorted, not a clean sine wave.
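To make the encoding concrete, here's a minimal Python sketch that generates an FSK waveform the way this board does, one tone per bit at 600 baud. The sample rate and the bit pattern are arbitrary choices of mine for illustration, not anything taken from the board.

```python
import math

BAUD = 600
F_MARK, F_SPACE = 1020.0, 2200.0   # nominal tones for a 1 and a 0
FS = 48_000                        # samples per second of the generated waveform

def fsk(bits):
    """Generate a continuous-phase FSK waveform for the given bits."""
    samples, phase = [], 0.0
    for bit in bits:
        freq = F_MARK if bit else F_SPACE
        for _ in range(FS // BAUD):
            samples.append(math.sin(phase))
            phase += 2 * math.pi * freq / FS   # phase stays continuous across bit boundaries
    return samples

wave = fsk([1, 0, 1, 1, 0])
print(f"{len(wave)} samples; a mark bit lasts only {F_MARK / BAUD:.1f} cycles")
```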
The heart of this board is the frequency-shift keying oscillator that generates the variable output frequency.2 The input data bit selects one of two control voltages to the oscillator, controlling its output frequency.
The oscillator is a fairly common transistor-pair circuit. The diagram below illustrates how it works. (It uses PNP transistors and runs on -12 volts, so ground is the higher voltage, which may be a bit confusing.) Suppose transistor T1 is on and T2 is off. Capacitor C2 will discharge through resistor R2, as shown. When its voltage reaches about -0.6 volts, T2 will turn on. This will pull the right side of C1 up to ground; it was previously at -12 volts because of R4. This causes the left side of C1 to jump up to about +12 volts, turning off T1.
The process then repeats on the other side, with C1 discharging through R1 until T1 turns off and T2 turns on. The result is that the circuit oscillates. The discharge rate is controlled by the values of R1 and R2, and the control voltage; a lower voltage will cause the capacitors to discharge faster and thus faster oscillations.
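Here's a small numerical model of that behavior, just to show that the mechanism produces roughly the right two frequencies. The RC product, the +12 volt starting point, and the -0.6 volt germanium threshold are my assumptions (I didn't measure the board's actual timing components), so treat it as a sketch rather than the real design values.

```python
import math

V_START = 11.4       # base voltage just after the other transistor switches (assumed)
V_THRESHOLD = -0.6   # germanium transistor turn-on voltage (assumed)
RC = 250e-6          # base resistor x timing capacitor, in seconds (assumed)

def half_period(v_control):
    """Time for the base to decay from V_START toward v_control, reaching V_THRESHOLD."""
    return RC * math.log((V_START - v_control) / (V_THRESHOLD - v_control))

def frequency(v_control):
    """Oscillation frequency, assuming both halves see the same control voltage."""
    return 1.0 / (2.0 * half_period(v_control))

for v in (-2.5, -9.0):   # the two control voltages selected by the data bit
    print(f"control {v:5.1f} V -> roughly {frequency(v):4.0f} Hz")
```

With these assumed values, the -2.5 volt control voltage lands near 1000 Hz and the -9 volt one near 2250 Hz, in the right neighborhood of the nominal 1020 and 2200 Hz tones, and the lower control voltage does indeed give the faster oscillation.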
The traces above show the action of the oscillator, producing the cyan output signal. The yellow curve shows the voltage on the left side of C2, the pink trace shows the voltage on the left side of C1, and the blue trace shows the voltage on the right side of C2. The pink and blue traces show the alternating discharge cycles for the capacitors; the faster discharge yields a higher output frequency.
The output of the oscillator is essentially a square wave, so it goes through some resistor-capacitor filtering stages that shape it to better approximate a sine wave. The top line (yellow) shows the output of the oscillator, and the lines below show the signal as it progresses through the filter. The result is still fairly distorted, but much smoother than the original square wave.
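If you want to see why the filtering also shrinks the higher tone (visible in the oscilloscope traces), here's a toy simulation of a few cascaded one-pole RC stages acting on a square wave. The cutoff frequency and number of stages are made-up values for illustration, not the board's actual filter components.

```python
import math

FS = 200_000         # simulation sample rate, Hz
F_CUTOFF = 2000.0    # assumed cutoff of each RC stage, Hz
STAGES = 3           # assumed number of cascaded stages
alpha = 1.0 / (1.0 + FS / (2 * math.pi * F_CUTOFF))   # one-pole low-pass coefficient

def filtered_peak(freq, cycles=20):
    """Peak amplitude of a +/-1 square wave at `freq` after the RC stages."""
    n = int(FS / freq) * cycles
    signal = [1.0 if (i * freq / FS) % 1.0 < 0.5 else -1.0 for i in range(n)]
    for _ in range(STAGES):
        y, out = 0.0, []
        for x in signal:
            y += alpha * (x - y)    # y[n] = y[n-1] + alpha * (x[n] - y[n-1])
            out.append(y)
        signal = out
    return max(signal[n // 2:])     # measure after the filter has settled

for f in (1020, 2200):
    print(f"{f} Hz square wave -> filtered peak about {filtered_peak(f):.2f}")
```

The edges get rounded toward a sine, and the 2200 Hz tone comes out noticeably smaller than the 1020 Hz tone, matching the amplitude change seen in the output waveform.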
Delay circuit
Another interesting circuit takes the enable signal and outputs this signal delayed by 13 milliseconds. When I reverse-engineered this circuit (below), I figured it was just buffering the signal but it appeared overly complex for that. I measured its behavior and discovered that it implements a delay.
The circuit contains several buffers, but the heart of it is a resistor-capacitor delay. When the enable line is activated, the capacitor is pulled to -12V slowly through the resistors, creating the delay. The photo below shows the delay capacitor and associated resistors.
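The delay follows the usual RC charging math. The resistor and capacitor values below are guesses on my part (I didn't trace the actual values), along with an assumed switching threshold halfway to the supply, but they show how a delay around 13 milliseconds falls out of modest component values:

```python
import math

V_SUPPLY = 12.0     # magnitude of the -12 V supply
V_SWITCH = 6.0      # assumed threshold where the next stage switches (volts)
R = 190_000         # assumed resistance, ohms
C = 0.1e-6          # assumed capacitance, farads

# Capacitor charging toward the supply: v(t) = V_SUPPLY * (1 - exp(-t / RC)).
# Solve for the time at which v(t) reaches V_SWITCH.
delay = R * C * math.log(V_SUPPLY / (V_SUPPLY - V_SWITCH))
print(f"delay of about {delay * 1000:.0f} ms")   # roughly 13 ms with these values
```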
The oscilloscope trace shows the operation of the delay circuit. When the (inverted) enable line (blue) goes low, the signal output (cyan) immediately turns on. However, the enable outputs (yellow and pink) are delayed by about 13 milliseconds.
I don't know the reason behind this delay circuit. Maybe it gives the oscillator time to settle after being enabled? Maybe the modem protocol uses 13 milliseconds of signal to indicate the start of a new message?
Some background on Teleprocessing
If you used computers in the 1990s, you probably used a dial-up modem like the one below to call a provider such as AOL through your phone line. The name "modem" is short for MOdulator-DEModulator, since it modulates the analog signal to encode the digital bits, as well as demodulating the received signal back to digital. In this way, the modem provided the connection between your computer's digital signals and the analog frequencies transmitted by phone lines.
The history of modems goes back much further, though. IBM introduced what they called "Teleprocessing" in the early 1940s, converting punch-card data to paper tape and sending it over telegraph lines for the U.S. Army.1 In the early 1950s, a device called Data Transceiver removed the intermediate paper tape, connecting directly to a telephone line. With the introduction of the IBM System/360 mainframe in 1964, Teleprocessing became widespread, used for many applications such as remote data entry and remote queries. Banking and airline reservations made heavy use of Teleprocessing. Timesharing systems allowed users to access a mainframe computer over remote terminals, kind of like cloud computing. Even the Olympics used Teleprocessing, transmitting data between widely-separated sites and a central computer that computed scores.
Back then, modems were large cabinets. The board that I examined could be used in an IBM 1026 Transmission Control Unit (below).3 This low-cost unit was designed to "make a modest start toward satisfying your data communication requirements ... until it is time to step up to more powerful transmission control units". It could connect a computer such as the IBM 1401 to a single communications line.
Larger installations could use the IBM 1448 Transmission Control Unit (below). This refrigerator-sized cabinet was 5 feet high and could support up to 40 communications links.
Nowadays, people often use a cable modem or DSL modem to connect to the Internet. Fortunately, technology has greatly improved and these modems aren't the large cabinets of the 1960s. Speeds have also greatly improved; a modern 180 Mbps network connection is 300,000 times faster than the 600 baud modem board that I examined. At that rate, a web page that now loads in a second would have taken over three days!
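The arithmetic behind that comparison:

```python
modern_bps = 180e6   # 180 Mbps connection
modem_bps = 600      # this board's 600 baud (one bit per symbol)

ratio = modern_bps / modem_bps
print(f"{ratio:,.0f} times faster")                               # 300,000
print(f"1 second today = {ratio / 86400:.1f} days at 600 baud")   # about 3.5 days
```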
Conclusion
This may seem like an overly detailed analysis of a random circuit board. But I was curious about the board due to its unusual transformer. I also figured it would be interesting to reverse-engineer the board to see how IBM built analog circuits back in the 1960s. Hopefully you've enjoyed this look at a vintage modem board.
I announce my latest blog posts on Twitter, so follow me at kenshirriff. I also have an RSS feed. Thanks to Nick Bletsch for sending me the board. I discussed this board on a couple of Twitter threads and got a bunch of interesting comments.
Notes and references
1. For more information, see Introduction to Teleprocessing. Technical information is in the Teleprocessing—General FE Handbook: page 7-7 lists part number 373807 (my board) as a Transmitter card Type 2A, and page 7-30 describes some characteristics of this modem type. IBM Teleprocessing 1940-1960 provides a historical look.
2. The oscillator is essentially a voltage-controlled oscillator (VCO). However, since it only takes two different input voltages (about -2.5 and -9 volts), the circuit isn't as challenging as a typical VCO, which takes a wide range of inputs and needs to have a linear response.
3. The modem card I examined could be used with an IBM 1050 or 1060 Data Communications System, which I believe was the remote terminal subsystem. It could also be used with the IBM 1448 and IBM 1026 Transmission Control Units. (The IBM 1448 connected to an IBM 1410 or IBM 7010 computer.)
10 comments:
"The idea was to design a small number of standardized boards that implemented logic functions and other basic circuits. The number of different board designs spiraled out of control, however, with thousands of different types of SMS cards."
Same thing happened with the SEM (Standard Electronic Module) system used by the USN. Though, to be fair, a lot of specialized combat systems were built using SEM.
Check the value of the resistors. Those are Carbon-Composition resistors, which have an annoying habit of drifting higher in value with age. An increase in resistance value would have a corresponding drop in the frequency. Given that they appear to have a gold band, their tolerance should be 5 percent. But, I'm guessing that they may have drifted to a higher value of resistance, well outside that 5 percent tolerance.
Also, check the leakage resistance on those capacitors. Many designs from that era used waxed paper for the dielectric in capacitors, and this material tended to degrade and produce electrical leakage. Most of the capacitors from that era were not hermetically sealed (even the ones which initially appeared to be), such that humidity could enter the device and contribute to the leakage. This could cause various weird effects in the circuit (it was even worse in the vacuum tube era, where higher voltages were used, such that grids could be biased positively, which caused very weird, and sometimes damaging, effects. But that's a story for another time.).
Also, note that early Germanium transistors frequently had high levels of collector to base leakage currents. This could also upset the bias of the circuits. Note, especially, that passivation wasn't as advanced in the Germanium transistor era, and various damaging effects could occur to the interior transistor elements (e.g., Purple Plague, Sodium migration, etc.).
Dave
According to the manual this modem was capable of operating on a two-wire circuit, which at that time meant half duplex. I wonder if the delay had something to do with turning the line around from transmit to receive and back.
The synthesizer audio guys would love those distorted sines! They remind me of a Moog 901B Oscillator with unijunction transistors. Maybe a fun one to breadboard.
But can you mine bitcoin with it?
Why was the transformer so heavy, and drowned in oil?
Perhaps the transformer was big to resist lightning strikes on the phone line.
The reason the amplitude of the higher frequency tone is lower is simply because it goes through a low pass filter.
Am I missing something in the math here?
> a modern 180 Mbps network connection is 300,000 times faster than the 600 baud modem board
> a web page that now loads in a second would have taken almost 3 months
...but 300000 s = 3.47 days, not 3 months.
Is this properly a modem or what IBM called a “line adapter”? A line adapter achieved the same thing as a modem, but was much cheaper, and was usually mounted inside the device it served. The disadvantages were that the line had to be solid copper, stay on your property, and not cross a public right of way.
Anonymous: you are correct; I messed up the math.
John W: The Teleprocessing Handbook has a chapter titled "IBM modems (Line Adapters)", so it seems like they considered them the same thing at the time. But I find IBM's terminology confusing and you probably have a better understanding of it.