This page was last updated on 09/29/2001
Note: Most of this page is yet to be filled in. Listed below are all the topics that I plan to address (since I have the item or at least the documentation for it). The "live" links have technical data filled in; the others are simply place-holders for topics I have yet to compose.
H-8 Cabinet, Power Supply and Bus
The H-8 was stylistically advanced compared to many of the hobby computers of the era. When most other systems were simple boxes with their front surface covered with bit-wise LED indicators and toggle switches, the H-8 had a nice two-tone "sculpted" look, especially when viewed from the side. Overall, the H-8 was approximately 16" wide, 6" high and 17" deep. The front panel was a light gray; the rest of the machine was black. The front panel sloped at about 60 degrees (see drawing). It was aluminum; the bottom, back and louvered top cover were steel and the sides were plastic. The system was convection-cooled.
The power supply of the H-8 consisted of a husky transformer with dual primary windings that could be switched for 120 or 240 volt operation and two center-tapped secondaries, each feeding full-wave rectifiers mounted on the backplane. The supply delivered unregulated +8 VDC (at ten amps), +18 VDC and -18 VDC (at 250 mA); regulation was performed on each card via three-terminal regulators. The mounting bracket for the card served as the heat-sink for the regulators.
The backplane had ten connectors. Each connector consisted of a pair of 25-pin single-inline pin plugs. The pins were on 1/10" spacing and were about 3/8" high. The cards had a pair of 25-pin sockets that mated with the plugs. All these pins were made of tinned steel stock, which caused some problems due to oxidation in areas of high humidity; I eventually replaced the entire set of pins (but not the sockets) with gold-plated pins in my H-8 and had no trouble after that.
The layout of the backplane was such that not all connectors were the same distance apart. The first and last pairs of connectors (those furthest from and closest to the power supply components) were more closely spaced than the rest. Since the mounting brackets for the cards were just about as wide as the spacing between the eight evenly-spaced connectors, it appeared that there could only be eight cards installed in the backplane. Actually, a ninth card was permitted (the HA8-8 Extended configuration card); it mounted backwards in the rearmost slot -- with its etch side facing forward, unlike the rest of the cards. It was also much shorter than a standard card and it had no heat sink / mounting bracket. The front panel cables plugged into the front-most connector.
(The standard-size cards were 6-1/8" high by 12" long. When mounted in the H-8 they were tilted about 30 degrees from vertical because the cabinet had less than six inches of clearance inside.)
Pin-out: Pins on the backplane were numbered from zero to 49, pin zero was at the bottom of the connector.
H-8 Control Panel
The H-8 was unlike most other hobby computers of its day in that it did not use a bit-level switches-and-LEDs front panel. Whereas most systems of the era required you to laboriously enter binary bits and toggle control signals to load a program, the H-8 used a calculator-like keypad and had a nine-digit numeric display. Heathkit accomplished this feat by providing a program in ROM that would recognize keystrokes and drive the display. This was, in effect, a ROM BIOS for the machine. (The front panel monitor program's ROM was on the CPU card.)
The front panel monitor program (PAM) was more than just the control panel, though: there were routines that implemented commands for input from or output to any I/O port, for examining or modifying processor register contents, a single-step capability so you could "walk through" your code and single-button commands to load memory from or dump memory to the optional audio tape.
All this required surprisingly little actual hardware. The control panel was a card similar in size to all the other cards the H-8 used, but it mounted behind the front panel and plugged into the bus with a pair of 25-pin cables. (There was also a five-pin cable that connected directly between the CPU card and the front panel. This carried four signals -- /INT20, /IE, /INT10 and RESIN -- which were not provided on the bus.) Besides the 16-button keypad, the nine MAN-1 style seven-segment displays and four status LEDs, the control panel contained two eight-bit output registers and an eight-bit input port, several octal decoders, driver transistors for the LED display and a handful of gates and flops. There was also a circuit which turned the 2.048 MHz system clock into a 2 millisecond periodic interrupt signal and a 1000 Hz tone which was then gated with a bit on one of the output ports and sent to a speaker.
In detail, here's how it worked:
This was a set of nine MAN-1 style seven-segment displays. Physically, these were mounted behind a red, transparent, plastic window and were arranged in three groups of three digits (spacing was such that it looked like "XXX XXX XXX" -- the first six digits were labeled "address", the remaining three "data/register"). All the displays had their eight segment bits (yeah, eight: the eighth "segment" is actually the decimal point) wired in common. These were fed from the eight bits of Output Port 361 [all port numbers in this document are expressed in octal].
Each indicator had its "common" pin driven (with a transistor) from a separate decimal decoder which was looking at bits 0 through 3 of Output Port 360. The software selected a digit by sending a digit number to port 360 and displayed a combination of segments by writing the pattern to port 361. (Digit 1 was the leftmost; digit 9 was the rightmost. See "Software Environment", below, for the segment bit assignment.) Every 2 milliseconds, the segment pattern was automatically set to "all segments off". This prevented the display from being burned out if the software somehow "locked up" -- in order to achieve enough brightness in the scanned display, the design supplied more current to each digit than was safe at anything above a 50% duty-cycle. In normal operation, any given digit was on for about 1.5 milliseconds out of every 18.
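The scan loop described above can be sketched in a few lines. This is an illustration of the technique, not Heath's actual PAM code; the `out()` helper is a hypothetical stand-in for the 8080 OUT instruction, and port numbers are octal as in the text:

```python
# Illustrative sketch of the display scan: write a digit number to port
# 360 to select a digit, then a segment pattern to port 361 to light it.
# out(port, value) is a hypothetical stand-in for the 8080 OUT instruction.

DIGIT_SELECT = 0o360   # bits 0-3 select one of the nine digits
SEGMENT_PORT = 0o361   # eight segment bits (one of which is the decimal point)

def refresh(display_buffer, out):
    """display_buffer: nine segment patterns; index 0 is digit 1 (leftmost)."""
    for digit, pattern in enumerate(display_buffer, start=1):
        out(DIGIT_SELECT, digit)    # pick the digit (1 = leftmost, 9 = rightmost)
        out(SEGMENT_PORT, pattern)  # light its segments
        # the real scan would dwell here ~1.5 ms before moving to the next digit
```

One pass through the loop is one complete scan of the display; repeating it continuously (and letting the 2 ms watchdog blank the segments if it ever stops) gives the behavior described above.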
There were four status LEDs mounted to the left of digit 1. These were labeled (from top to bottom) "ION", "MON", "RUN" and "PWR". I'll describe them from bottom to top, just to be perverse: "PWR" was the power indicator -- it was essentially lit by the +8V power bus. The "RUN" indicator was lit by the processor's M1 signal (stretched to 4.7 microseconds by a one-shot, since the pulse was normally only 200 nsec wide) -- this indicator basically showed that the CPU was executing instructions. The "MON" indicator required that every 2 milliseconds there be an IO Write to port 360 with data bit 5 set, which the front-panel software did. Hence, this indicator showed that the panel monitor program was running. Finally, "ION" was simply connected to the CPU's EI signal and indicated that interrupts were enabled.
The sixteen keys were wired to octal priority encoders as two ranks of eight keys: 0, 1, 2, 3, 4, 5, 6 and 7 were one group; 8, 9, plus, minus, *, /, # and period were the other group. (See "Software Environment" for the bit patterns which could be encoded.) Additionally, the zero key was connected to one input of each of two two-input AND gates. For one of these gates, the other input was the / key. This combination (zero-slash) caused a RESET to be generated. On the other gate, the # key was combined with the zero key to force /INT10 to be asserted. This action caused the panel monitor code to be re-entered via a RST 1 instruction, as will be explained when I get to describing the CPU card. Other than these two special functions, PAM used the keypad one key at a time. I'm not aware of any software that normally used the keypad for anything, but there was nothing to prevent one from writing code to read port 360 and look for keystrokes. But if you wrote something that did, your code would have to re-sample the port and compare the current value to that obtained from previous read operations -- "de-bounce" the switch contacts, in effect. This is a not-infrequently encountered strategy in microprocessorville. PAM did this. The result was a keyboard that was quite serviceable.
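The de-bounce strategy just described -- re-sample the port and only accept a value once consecutive reads agree -- can be sketched like this. The function name and `inp()` helper are mine, purely for illustration:

```python
# A minimal de-bounce loop of the kind described in the text: sample the
# keypad input port repeatedly and accept a keycode only after it reads
# the same on several consecutive samples.  inp(port) is a hypothetical
# stand-in for the 8080 IN instruction.

KEYPAD_PORT = 0o360

def read_key_debounced(inp, samples=3):
    """Return the encoded keycode once `samples` consecutive reads agree."""
    last, count = None, 0
    while True:
        value = inp(KEYPAD_PORT)
        if value == last:
            count += 1
            if count >= samples:
                return value        # contacts have settled; accept the key
        else:
            last, count = value, 1  # still bouncing; start counting again
```

PAM's actual implementation differed in detail, but the principle -- compare the current sample against previous ones before believing it -- is the same.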
The system bus clock (2.048 MHz) signal is divided by 2048 to create a 1000 Hz square-wave. This is ANDed with bit 7 of Output Port 360 and the result drives a small speaker. The same frequency divider circuit has another output which divides the system clock by 4096. This is the 500 Hz signal which provides the timing for the 2 millisecond real-time interrupt, where it sets a flip-flop whose output is gated with bit 6 of output port 360 to assert /INT10.
The single-step feature is implemented by causing an /INT20 interrupt to occur two M1 cycles after interrupts are enabled. The PAM code arranges things so that the /INT20 happens after the program being stepped has a chance to do a single instruction. Bit 4 of output port 360 enables this feature. (Because of the way the logic is hooked up, the /INT20 interrupt is automatically deasserted two M1 cycles after interrupts are disabled.)
The original 8080 CPU Card
The CPU card that shipped with the H-8 used an 8080A processor, the 8224 clock generator and the 8228 system controller -- a pretty standard arrangement, except there was an additional flip-flop to "catch" the M1 status for the control panel logic. There was a 2708 EPROM containing the PAM-8 code, buffers to drive the bus, an address decoder for the PROM and an 8-input priority encoder to convert the eight interrupt lines into a "RST n" instruction for the IACK cycle.
The PAMGO modification swapped out the 2708 for a (larger) 2716 -- you had to move a couple of jumpers to bring an additional address bit into the ROM and take it off of the address decoder. You also had to remove a jumper and wire a pin to +5V. The XCON modification required the same re-jumpering, used yet another version of PAM and also required adding a jumper to bring the ROM_DISABLE signal into the ROM address decoder.
|Part Number||PAM Version|
The HA-8-6 Z-80 CPU Card
The HA-8-6 Z-80 CPU card was an optional replacement for the standard H-8's 8080-based processor card. Besides the new processor chip, the card also included all the capability of the XCON enhancement in on-board circuitry.
The Z80's clock is generated by dividing the output of an 18.432 MHz TTL crystal oscillator by nine. This signal is also the system clock (pin 22) on the bus. A power-on reset is created from an NE-555 circuit. (RESIN from the front panel is also hooked into the '555.) The Z80's processor status signals are transformed, via a bunch of gates and a couple of flip-flops, into MEMWR, MEMRD, IOR, IOW, M1 and HLDA.
The interrupt scheme deviates significantly from the 8080 CPU's. To begin with, instead of the 8080's IACK, the Z-80 asserts its IORQ and M1 status signals. A gate combines these to make BINTA, which sends the "RST n" instruction on the data bus. The Z-80 has no signal that's equivalent to the 8080's INTE output. Instead, the data bus is monitored for the opcode being fetched from memory and the IEN signal is synthesized from the detection of an "EI" or "DI" instruction -- "EI" sets the IEN signal; "DI" resets it.
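The IEN synthesis amounts to a one-bit state machine watching opcode fetches. As a sketch (the class is mine, not a description of the actual gate-level circuit), using the real 8080/Z80 opcodes EI = 0FBh and DI = 0F3h:

```python
# Sketch of the IEN synthesis described above: watch each opcode fetched
# during an M1 cycle and set or clear a flag on EI/DI.
# EI = 0xFB and DI = 0xF3 on both the 8080 and the Z80.

EI_OPCODE = 0xFB
DI_OPCODE = 0xF3

class InterruptEnableTracker:
    def __init__(self):
        self.ien = False              # interrupts disabled after reset

    def opcode_fetched(self, opcode):
        """Called for every M1 opcode fetch; returns the current IEN state."""
        if opcode == EI_OPCODE:
            self.ien = True
        elif opcode == DI_OPCODE:
            self.ien = False
        return self.ien
```

In hardware this is just an opcode comparator feeding the set and reset inputs of a flip-flop, but the behavior is the same: IEN tracks the last EI or DI the processor executed.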
The CPU card contains sockets for two ROMS -- one contains the PAM37 program (444-140). The other socket is shipped with an unknown ROM part not mentioned in the manual -- probably a copy of the ROM for the H-17 controller. With older H-17 controllers, some minor wiring changes have to be made to allow the "Org Zero" capability to disable the H-17 ROM. You also have to re-label the keys on the control panel with a new set of sixteen key labels.
The XCON features are also incorporated into this card. This means basically a DIPSwitch read as Input Port 362, control of ROM DISABLE and a Side Select signal courtesy of bits D3 and D4 (respectively) from Output Port 362. (The ability to use double-sided drives in the H-17 by controlling side select from Output Port 362, bit D4 was not a standard capability of Heathkit software.)
The DIPSwitch was read by the PAM-37 code in order to determine the "default" boot device. If you simply hit PRI (the 1 key on the keyboard) or SEC (the 2 key), this switch would tell the code what device (but always unit zero) to boot from. The switch was coded as follows:
|Position||Encoding (0 = off / 1 = on)|
|7||0 = Normal Boot; 1 = Auto Boot|
|6||Must be zero (not used)|
|5||Must be zero (not used)|
|4||0 = Boot Primary on 174/177; 1 = Boot Primary on 170/173|
|3,2||10 = H67 on 170/173; 01 = H47 on 170/173; 00 = H37 on 170/173|
|1,0||10 = H67 on 174/177; 01 = H47 on 174/177; 00 = H17 on 174/177|
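Assuming the switch is read as a single byte (with switch positions corresponding to bit numbers), the table can be decoded as sketched below. The function name and dictionary layout are mine, for illustration only -- this is not Heath's PAM-37 code:

```python
# Hypothetical decode of the Input Port 362 boot switch, per the table
# above.  Bit numbers correspond to switch positions.

DEV_170 = {0b10: "H67", 0b01: "H47", 0b00: "H37"}   # positions 3,2: device on 170/173
DEV_174 = {0b10: "H67", 0b01: "H47", 0b00: "H17"}   # positions 1,0: device on 174/177

def decode_boot_switch(value):
    auto_boot   = bool(value & 0x80)               # position 7: auto boot
    primary_low = bool(value & 0x10)               # position 4: 1 = primary on 170/173
    dev_low  = DEV_170.get((value >> 2) & 0b11)    # device at ports 170/173
    dev_high = DEV_174.get(value & 0b11)           # device at ports 174/177
    return {"auto_boot": auto_boot,
            "primary":  dev_low if primary_low else dev_high,
            "170/173":  dev_low,
            "174/177":  dev_high}
```

With all switches off, for instance, the primary boot device comes out as the H17 on ports 174/177 -- the most common configuration.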
There was also the ability to boot from any unit of any device: you hit the BOOT (zero) key; then you selected the device: 0 for the H17, 1 for the H47, 2 for the H67 or 3 for the H37; next you specified what port the device was on: 0 for 170, 1 for 174, 2 for 270 and 3 for 274; finally you selected the unit number (0 through 3). Auto-boot would cause the primary device to boot from unit zero at power-up or reset (not recommended).
H8-1 / H8-3 RAM
The H8-1 memory card held eight or sixteen TMS4044 4096-bit (4Kx1) static RAM chips. Eight were included with the memory card, the second bank was the optional H8-3 memory chip set. Memory address decoding was accomplished by an octal decoder off of address bits 15 through 13 plus bank selection using address bit 12. The address decoder let you put the board at any 8K boundary from 0 through 56K. You then set another jumper depending on whether the board had one bank of RAM or two.
If the second bank was not installed, the control signals and bus transceivers were only enabled for the first 4K bytes -- if you had a small board it had to be at the top end of your memory space to avoid a 4K byte gap in RAM. If you had a full 8K populated, the bus transceivers and control signals were enabled for the whole 8K bytes.
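The H8-1's address decode can be summarized in two small functions -- a sketch of the scheme described above (octal decoder on A15-A13, bank select on A12), not the actual board logic:

```python
# Sketch of the H8-1 address decode: address bits 15-13 pick one of
# eight 8K blocks (the board jumper), and bit 12 selects the bank.

def card_selected(address, base_8k):
    """base_8k: which 8K block (0-7) the board's jumper selects."""
    return (address >> 13) & 0b111 == base_8k

def bank(address):
    """0 = first 4K bank (standard), 1 = second 4K bank (optional H8-3)."""
    return (address >> 12) & 1
```

So a board jumpered for block 2 responds to addresses 16384 through 24575, with the upper 4K of that range only usable if the H8-3 chip set is installed.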
H8-2 Parallel I/O Controller
The H8-2 card was the parallel port card. The card provided three bi-directional eight-bit parallel I/O ports, all identical. The ports were implemented by hooking an 8251 USART head-to-head with a 6402 UART so that the serial input and output from the USART were connected to the serial output and input of the UART. The parallel lines from the USART went to the system bus; the parallel lines from the UART went to the I/O connector.
The baud clock for the data transfer between the two devices was the system clock (2.048 MHz) divided by 5. The USART or UART would then divide that clock signal by sixteen and transfer data at 25,600 bits per second from one chip to the other, giving the port a basic transfer rate of up to 2560 bytes per second.
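The arithmetic behind those figures works out exactly:

```python
# The transfer-rate arithmetic for the H8-2, from the figures above.

system_clock = 2_048_000          # Hz
baud_clock = system_clock // 5    # 409,600 Hz into the USART/UART pair
bit_rate = baud_clock // 16       # 25,600 bits per second on the serial link
byte_rate = bit_rate // 10        # bytes/s: 8 data bits + start + stop per frame
```

The divide-by-10 reflects the serial framing between the two chips: each byte costs eight data bits plus a start and a stop bit.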
Address decoding was via the (now familiar) octal decoder array, in this case you could actually set any of the ports to any address (each USART used two port numbers -- even and odd) and there was the usual data bus buffer.
On the I/O connector side, the data output pins could be jumpered as active-high or active-low, thanks to a rank of XOR gates buffering the UART's output pins. TakeData was derived from the UART's DR output and DataTaken was fed to its DRR input, allowing the parallel output port to have full handshaking. On the input side, the USART's RXRDY pin became SendData and the DataSent input drove the UART's TBRL -- again, a way to implement full handshaking for the input side of the port.
If that all sounds like a very convoluted way to implement something that could have been simply a pair of eight-bit registers and a flip-flop or two, remember: the software to drive the 8251 as a serial port (in the H8-5) was completely appropriate for also driving a parallel port on the H8-2 instead. Pretty neat, actually.
HA-8-2 Dual-channel D/A Interface and Music Synthesizer
HA-8-3 Sprite Graphics Video Card
H8-4 Four-port Serial I/O Card
The H8-4 I/O card provided four serial ports, implemented with the 8250 UART. Three of the ports were identical, providing RS-232-C interface as either DTE or DCE connection; the fourth port was able to be configured for use in current-loop mode, like the earlier H8-5 console port. (See the description of the H8-5, below.) Each port used a block of eight I/O addresses. I/O decoding was via a pair of octal decoders decoding the high-order five bits of the address; the low-order three bits were fed to the A2 through A0 inputs of all UARTS. You selected the chip enable for each port by picking one output from each decoder, or a logic-one level if you didn't want to use a port.
Each UART got its baud clock from an on-board 1.8432 MHz crystal, the oscillator circuit being provided by the X1 and X2 pins of the UART for port zero and the X2 output of that chip providing the signal to the X1 inputs of the other three UARTs. The 1.8432 MHz baud clock was divided by various factors to produce serial data rates from 38,400 baud down. Here are the division factors:
|Baud Rate||Division Factors|
|38,400||3 and 16*|
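Assuming the standard 8250 arrangement -- a programmed divisor followed by the chip's fixed divide-by-16 -- the division factor for any rate follows directly from the crystal frequency:

```python
# How the division factors relate to the 1.8432 MHz crystal: the 8250
# divides the clock by a programmed divisor, then by 16, to produce the
# actual baud rate.  (Standard 8250 behavior, assumed here.)

CRYSTAL = 1_843_200   # Hz

def divisor_for(baud):
    """The 8250 divisor latch value for a given baud rate."""
    return CRYSTAL // (16 * baud)
```

For example, 38,400 baud needs a divisor of 3 (matching the surviving table entry), and 9600 baud needs 12.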
Each UART has an interrupt output. These could be connected to one of the IRQ signals on the Benton Harbor Bus, via jumpers. The bus interface consisted of the usual 74LS245 bi-directional octal bus interface circuit (operated unidirectionally) for the eight address bits and three 74LS242 quad bi-directional bus interface circuits (the third one operated unidirectionally to buffer the control signals). Lastly, there was a jumper to disable the whole card if necessary to debug a problem on the bus.
Connections to peripherals were made by connecting a 25-pin "D" connector to a 15-pin header on the card. There were eight such headers: a pair for each port. One header had signals arranged for DTE connection; the other had the same signals but connected for DCE. This saved having to re-wire cables or use a "null modem".
Describing the 8250 is beyond the scope of this paragraph. But follow this link: 8250 Technical Description; it will give you a good description of how to program the beast.
H8-5 Serial I/O and Cassette Tape Interface
In spite of having a higher model number than the H8-4, the H8-5 was an earlier design. It incorporated two serial ports, one available for connection to a terminal or other RS-232-C device, the other attached to an interface circuit that encoded data as audio tones for use with cassette tape as a data storage system. Both ports used the 8251 USART, like the H8-2 did. The RS-232-C port was pretty straightforward but the audio tape interface was less so.
Address decoding was accomplished with an unusual decoder circuitry. The card needs four I/O port numbers -- two for each USART. Because of the decoder arrangement, you could only select two pairs of I/O addresses out of four possible pairs in a contiguous block. Decoding was through three cascaded decoders: A7 and A6 were decoded by one quad decoder. You chose one of the outputs of this chip and fed it to the enable pin of an octal decoder which decoded A5, A4 and A3. Then you selected one output of that chip and fed it into the enable pin of another quad decoder which decoded A2 and A1. Two outputs of this chip were chosen as the chip select signals for the two USARTS (which also looked at A0 for command/status versus data address selection). The standard port assignments were 370/371 for the tape interface and 372/373 for the RS-232-C port. But you could elect to flip a switch and interchange this assignment (don't ask me why).
This card is covered with jumpers! The baudrate clock to each USART is synthesized by dividing a 4 MHz oscillator circuit by 13, then 16, 11 and 8 in a chain of counter chips. Various outputs of this chain provide clock rates for 9600, 4800, 2400, 1200, 600, 300, 150 and 110 baud (the actual frequencies are 16 times the baudrate). You selected the data rate by jumpering the USART's baud clock pin to one of these signals. The tape interface used 1200 baud; the RS-232-C port could be any rate from the set you desired -- in fact, this port could use a different rate for data in and data out, as the RX clock and TX clock pins were jumpered separately.
You also used jumpers to decide the source of and interrupt number for any interrupt request you wanted the card to generate. (The standard set-up defined IRQ3 as RXRDY from the RS-232-C port if your console was hooked up to this card; if you had your console implemented on some other card, you would of course not use that IRQ assignment.)
The RS-232-C interface supplied TXD, RXD, DTR and RTS. Strictly speaking, the interface wasn't necessarilly RS-232-C: you could -- you guessed it -- set some jumpers and get a current-loop TXD and RXD with optoisolators. The cable was a bit weird, though -- the card's connector was a ten-pin single-in-line header (a shorter version of the type of connectors that made the Benton Harbor Bus). This was wired to a 15-pin MOLEX-style connector that snapped into a rectangular hole in the back of the H-8. From there, you were pretty much on your own to connect it to your terminal.
The audio tape interface needs considerable explaining. First, some history: there was at the time an encoding standard for computer data stored on audio tape called the "Kansas City Standard" after the hobby computer fair where it was first proposed. This standard defined that audio tones would encode serial data sent at 300 baud, 8 data bits, one stop bit and no parity. 2400 Hz was defined as "mark" and 1200 Hz as "space". A variation of this standard quickly arose which operated at four times the data rate (1200 baud) using the same two frequencies but fewer cycles of each tone. It was this high-speed variant that Heathkit implemented, though -- via jumpers, what else? -- you could set the board back to 300 baud if you wanted to (though you also had to change the value of one capacitor).
I'll cover data encoding first, because it's simpler: the TX output of the USART feeds an arrangement of two J/K flip-flops that forms a programmable frequency divider. When the USART outputs a zero bit, the flip-flops divide a 4800 Hz signal from the baudrate chain mentioned above by four, creating a 1200 Hz tone. When the USART outputs a one bit, the divider divides by 2, producing a 2400 Hz tone. This tone goes through an R-C network which acts as a low-pass filter, rounding off the edges of the square-wave into something a bit easier for the audio tape to digest. This frequency-modulated tone (actually FSK modulation) is then fed to the audio input of the cassette recorder.
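The encoding side reduces to a one-line rule -- each data bit selects a division factor for the 4800 Hz source. A sketch (illustrative function names, not circuit description):

```python
# Illustrative model of the FSK encoder described above: each serial
# data bit selects a divider for the 4800 Hz source -- divide-by-4 for
# a zero (1200 Hz, "space") or divide-by-2 for a one (2400 Hz, "mark").

def tone_for_bit(bit, source_hz=4800):
    divider = 2 if bit else 4
    return source_hz // divider

def encode_bits(bits):
    """Map a serial bit stream to the tone frequency sent to tape."""
    return [tone_for_bit(b) for b in bits]
```

In the real circuit the "selection" is done by the J/K flip-flop pair changing its division ratio on the fly, but the mapping from bits to tones is exactly this.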
When you play back a tape recorded from this signal, you must recover the logic bits from the tones. Heathkit accomplished this with a frequency discriminator made from one-shots (monostable multivibrators) and a phase-locked loop. Here's how that worked -- watch closely, it gets a bit bumpy here! First, the audio output of the tape playback was clipped by a pair of back-to-back diodes in an effort to convert a semi-sinusoidal waveform back to a square-wave. This was amplified and further clipped by an operational amplifier. The output of that fired a pair of one-shots; one triggering on the rising edge and the other on the falling. These were ORed together, resulting in frequency-doubling. So when the recorder played a 1200 Hz tone, the OR gate created a 2400 Hz signal and when the recorder played a 2400 Hz tone, the output of the gate was a signal at 4800 Hz. Follow me so far?
This tone output feeds several circuits. First, it feeds another one-shot whose period is such that its output is constantly a one if the tone is at 4800 Hz and is a square-wave if the tone is at 2400 Hz. This one-shot feeds the D input of the first of two D flip-flops forming a two-stage shift-register, also clocked from the tone. The result of this arrangement is that the output of the shift register is a one for the high-frequency tone and a zero for the low-frequency tone. Thus, the one-shot and shift-register circuit has recovered the ones and zeros from the tape. But that's only half the job -- the USART also needs a baudrate clock. And we can't use the frequency divider chain on the H8-5 for the purpose -- it's not in phase with the data stream from the tape and minor variations in tape speed mean the crystal-controlled baudrate clock would probably not even be on the right frequency from one moment to the next. (This is quite different from a serial port connection or even a MODEM -- in those cases, the baudrate at each end of the cable is essentially identical and the baudrate is also not varying from one millisecond to the next.)
Recovering the clock is accomplished by a phase-locked loop circuit. Again, we start with the output of that OR gate -- this time, we feed a one-shot and flip-flops arrangement which is designed to produce 1200 Hz out no matter whether the tone is 2400 or 4800 Hz from the OR gate. (Of course, the actual frequency will vary with the imperfectly-controlled tape speed, but it's nominally 1200 Hz.) This signal is the "reference frequency" for the PLL. The PLL is set to produce 19,200 Hz output -- the data rate times 16 that the USART requires. This signal feeds the USART's RX clock. It also feeds a divide-by-16 counter that provides the other input to the PLL's phase detector. Inside the PLL chip, the phase detector measures the frequency difference between the 1200 Hz reference signal and the 19,200 Hz-divided-by-sixteen frequency and adjusts the oscillator such that the output is always precisely 16 times the reference frequency. This is the ideal condition for the RX clock of the USART.
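The steady-state behavior of that loop is easy to state numerically: the PLL forces its output, divided by 16, to match the reference recovered from tape, so the RX clock is always 16 times the actual (speed-dependent) data rate. A sketch of just that relationship:

```python
# The steady-state PLL relationship described above: the phase detector
# locks (output / 16) to the ~1200 Hz reference recovered from tape, so
# the USART's RX clock tracks tape speed automatically.

def rx_clock(reference_hz, divide=16):
    """Locked PLL output for a given recovered reference frequency."""
    return reference_hz * divide
```

At nominal tape speed the reference is 1200 Hz and the RX clock is 19,200 Hz, exactly 16x the 1200 baud data. If the tape runs 2% fast, the reference rises to 1224 Hz and the RX clock tracks it -- still exactly 16x the incoming bit rate, which is what the USART needs.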
That about covers the tape interface, except for a couple of miscellaneous circuits. First, the card contains a built-in logic probe circuit (needed to help adjust the one-shot circuits for detecting data). This is simply an LED driven by a pulse-stretching circuit. You connect the input of the pulse stretcher to some test point and watch the LED indicate activity. Finally, the cassette machines' motor start/stop circuitry is accomplished by a couple of relay driver circuits. The recorder's control is activated by TXE from the USART. The player's control is activated by RTS. (The DUMP and LOAD routines in the PAM monitor software will set these control bits when the recorder or player is supposed to be activated.) The recorder's control signal also goes through a five-second timer circuit and sets DSR and RTS -- this lets the recorder get up to speed before the USART starts sending data to the FSK modulator.
H8-7 Breadboarding Card
Heath Co. recognized that many H-8 owners wanted to create unique circuitry to add to their computers. The company offered two products to help hobbyists do this. The first was the H8-7 breadboarding card; the other was the H8-10 wire-wrap card. Apparently, the logic of this was that one would first use the breadboarding card to debug the design and then transfer the circuitry to the wire-wrap card to make it more rugged. Since the H8-7 was considerably more expensive than the H8-10 and re-usable besides, that was just what most of us did.
The card consisted of four solderless breadboard modules and five bus strips, a bus interface with the usual decoder-and-jumpers address decoder and three-terminal regulators for the +5V bus. The solderless breadboard modules were plastic rectangular sockets designed to take small-diameter leads (#22 wire gauge) or IC pins with two columns of sets of five holes separated by a grooved region that was 0.3" wide. The holes were on 0.1" centers. (Like this)
The breadboard modules allowed one to insert discrete components or DIPs and interconnect them with wires. The five holes in each set were connected together, so if you were using a typical DIP of the time-frame (16 pins, narrow width) you could plug about half a dozen DIPs into the module (straddling the groove) and make up to four connections per pin. So the whole H8-7 card had the capacity to breadboard about two dozen chips worth of circuitry. The bus strips were similar to the breadboard modules, but were much narrower and each one consisted of two 36-hole connectors running the length of the strip (like this). These were intended for supplying +5V and ground to the circuitry; there were power taps at one end of each bus strip that were connected to the card's +5V and ground wiring.
The bus interface circuitry was pretty much typical of the H-8: the data bus was buffered by a pair of 74LS240 octal tri-state buffers to create eight-bit data-in and data-out busses; A8 through A15 were buffered with 74LS14 schmitt triggers, whose outputs could then be used directly or one could elect to use an array of decoders wired similar to what the H8-2 used to decode I/O addresses. (Address bits A0 through A7 had no on-card buffering; if you were going to use those lines you were expected to provide your own buffers as part of your design.) There were also gates for some of the control signals so you could conveniently construct IORead, IOWrite, MEMRead and MEMWrite, but you were on your own if you needed to assert an interrupt request signal.
The instruction manual went into considerable detail about how to interface your design to the H-8, even mentioning constraints on available +5V current (the on-card regulators could supply about one ampere each, the H-8's power supply could supply ten amperes total but you had to subtract the requirements of the other cards in your system in order to determine what was actually available).
HA8-8 Extended Configuration Card
H8-9 PAMGO ROM Replacement Kit
This "kit" was simply a replacement for the PAM-8 monitor ROM chip (444-13) and the instructions for installing it. The new part (444-60) gave you "one button boot" capability; it was a very simple change to the PAM-8 program such that the initial value of PC was preset to 030.000. But the new part was a 2716 whereas the old part was a 2708, so you had to re-jumper the CPU board to connect another address bit.
H8-10 Wire-wrap card
H8-16 16K Static RAM Card
MH-8 64K DRAM Memory (Tryonix)
Bus Extender Card -- model number and original manufacturer unknown
H-17 5-1/4" Hard-sectored Diskette System
H-17-1 Diskette Drive
The H-17 used 5-1/4" full-height drives. (For those readers who have never seen a 5-1/4" floppy disk, here's a picture.) The two models of drive that I owned were the Wangco (Perkin-Elmer) model 82 and the Siemens FDD 100-5B. There may have been other vendors supplying drives to Heath Co. as well.
These drives were single-sided and capable of seeking up to 40 tracks (the standard capability was 35 tracks). The Siemens drive had a tad faster access time but they were otherwise equivalent. The Heath H-17 controller recorded data in single-density format on ten 256-byte sectors per track for a total capacity of 100 K bytes (102,400 bytes). The H-17 used hard-sector formatting. This meant that the media had ten "sector holes" punched in it in addition to the usual index hole. The sector holes were evenly spaced around the media (every 36 degrees), with the index hole being half way between the sector hole for the tenth sector and the sector hole for the first.
Given the rotation speed of 300 RPM (or five revolutions per second), that means that a sector hole was detected every 20 milliseconds, with the index hole adding one additional pulse 190 milliseconds after the first sector pulse. The width of these pulses was determined by the diameter of the hole punched in the media and the rotation speed and was specified to be between 2.5 and 5.5 milliseconds, nominally 4 milliseconds. These pulses were produced by a photodetector.
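The rotational timing quoted above follows directly from the spindle speed and sector count:

```python
# The rotational timing arithmetic for the H-17 drives.

RPM = 300
SECTORS = 10

rev_ms = 60_000 / RPM                  # 200 ms per revolution
sector_interval_ms = rev_ms / SECTORS  # 20 ms between sector-hole pulses
# the index hole sits halfway between the last and first sector holes,
# so its pulse arrives half a sector interval before the next sector 1:
index_offset_ms = rev_ms - sector_interval_ms / 2
```

That last figure is the 190 milliseconds mentioned in the text: the index pulse lands 190 ms after the first sector pulse of each revolution.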
The drive had a second photodetector to sense write protection; diskettes were normally made with a small rectangular notch punched on one edge of the diskette's envelope. If this notch in the envelope was covered with opaque tape (or if the diskette was manufactured without a notch in its envelope, as some distribution diskettes were), a photodetector would inform the controller and inhibit writing to the diskette. There was also some kind of detector (an actual switch in the Wangco drive) to sense when the head had been stepped to the outermost track (track zero).
The drive had a red LED indicator which was lit when the drive was selected. Drive selection was by one of three "Drive Select" signals supplied by the controller; you programmed which drive was selected by which signal by altering a program shunt jumper on the drive. These types of drives could therefore be programmed as SY0, SY1 or SY2. The electrical interface was open-collector TTL at about 130 Ohms impedance, with the last drive in the "daisy chain" providing a 132 Ohm termination at a voltage of +3 VDC. Data transfer rate was approximately 125,000 bits per second, serially. The drive required +5 V at 1.2 A and +12 V at 0.5 A.
One of the characteristics of these drives that the software had to deal with was the time required to initiate a seek to a particular track. This could be a fairly complex combination of the time required for the head to be brought in contact with the media (head load time, plus head load delay time) and the time required for the head to be positioned to the appropriate track (seek time per track times the number of tracks to traverse and settling time to allow the head position to stabilize). The seek time was specified as 30 milliseconds per track, but the drives typically performed better (sometimes as good as 8 milliseconds per track); seek stabilization time was 20 milliseconds; head load time was 60 milliseconds and head settling was 20 milliseconds. But seeking across multiple tracks could be done much faster than 33 tracks per second, so the access time was not easy to predict when the drive was read randomly.
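Those figures combine into a rough worst-case model. This sketch uses the specified 30 ms/track rather than typical performance; the function and its structure are mine, not anything from Heath's driver code:

```python
# A rough worst-case access-time model built from the figures above:
# 30 ms per track stepped, 20 ms seek stabilization, plus 60 ms head
# load and 20 ms head settling if the head wasn't already loaded.

def seek_time_ms(tracks_to_move, head_loaded=True,
                 ms_per_track=30, settle_ms=20, head_load_ms=60,
                 head_settle_ms=20):
    t = tracks_to_move * ms_per_track
    if tracks_to_move:
        t += settle_ms                       # stabilization after stepping
    if not head_loaded:
        t += head_load_ms + head_settle_ms   # bring the head onto the media
    return t
```

A ten-track seek with the head already loaded comes out at 320 ms by this model -- and since real drives often stepped far faster than spec, actual times could be much lower, which is exactly why random-access timing was hard to predict.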
Fortunately, the operating system software takes care of these ugly details!
H-19 Video Terminal
H-47 8" Diskette System
H-67 Hard Drive System
The H8 could address up to 65536 bytes of memory. Memory could be RAM or ROM (or in the case of the HA-8-2, a couple of bytes worth of hardware registers). ROM is, of course, read-only; sometimes a memory address was instead write-only. The following table lists what I know of memory allocations:
|Address Range||Type||Resource(s) occupying that space|
|000.000 - 037.377||ROM and Hardware Resources|
|000.000 - 000.001||WO||HA-8-2 Data Register|
|000.000 - 003.377||RO||PAM-8 Monitor ROM|
|000.000 - 007.377||RO||XCON/PAMGO Monitor ROM|
|000.000 - 017.377||RO||PAM37 Monitor ROM|
|020.000 - 023.377||RW||Unused (Low RAM if XCON or Z80 installed)|
|024.000 - 027.377||PRW||H17 Controller RAM (WO if XCON or Z80 installed)|
|030.000 - 037.377||RO||H17 Controller ROM|
|040.000 - 377.377||RAM (if fully populated)|
|040.000 - 040.077||RW||Work locations used by PAM|
|040.100 - 042.177||RW||HDOS Parameters and Vectors|
|042.200 - 362.377||RW||Approx. User Program Space|
|363.000 - 377.377||RW||HDOS Resident code and stack|
Key: RW = read/write, RO = read-only (may actually be ROM contents "shadowed" in low RAM if XCON or Z80 installed), WO = write-only, PRW = protected read/write. Software address space assignments are for HDOS 2.0 with fully-populated memory space and not configured Stand-Alone.
Standard I/O Port Assignments
CP/M (Early -- 8K origin)
CP/M (With Org-Zero, later)