Thoughts on Restoration

Why work with early microcomputers?

March 2021

In the early '90s, I went to college with a new 16MHz Gateway 2000 386sx desktop running MS-DOS 5.0. I may have had Windows 3.1 installed, but I don't remember using it much until my sophomore year. Even then, I'm not sure I booted Windows unless I was procrastinating by playing Minesweeper or Solitaire. The machine had an 80 megabyte hard drive, a 14" color VGA monitor, and 1.2 and 1.44MB disk drives. I had bought it with my life savings of paper route earnings and graduation gifts. I think it cost about $1500. When I arrived at school, I was issued an email account and granted access to the campus network. My dorm was networked with Token Ring. After ogling the beautiful Macs at the campus computer store (well beyond my means), I plunked down the cash to rent an IBM networking card. In a week or so (we had to apply for static IP addresses on a paper form, just like our class schedules), I was on Usenet and ftp, downloading documents and abusing my unlimited printing privileges at the central printer facility (a 10-minute walk from my dorm). Print jobs were posted hourly, and I can remember going there at 2AM to pick up reams of quality works such as "Skinhead Hamlet" and "The Anarchist Cookbook".

Logging into the central system gave us access to Unix. It was mesmerizing. I'm sure that I'm also conflating experiences on the workstations and black-and-white X terminals in our computer clusters with my summer job working on SGI Personal IRIS machines. [1] At school, we had networked games (Netrek, MUDs) and vibrant newsgroups to read and post to, and we enjoyed multitasking desktop environments. Multitasking! Multiple windows, Xeyes, Xtop, email -- all there, movable, clickable, updating (slowly!) on the screen. These experiences drove strong feelings. Along with my geeky peers, I wanted my own personal Unix workstation. For most of us, the cost of a workstation was still far out of reach. [2]

In 1993, I was working a summer job at another nearby university, programming data visualizations for a cell physiology lab. It would be my last programming gig before I turned to experimental research and grad school. One day, a senior lab scientist and I met with a math professor. At the end of our meeting -- why did this even come up? -- the professor made an offhanded remark about a free Unix for PCs called Linux. While I had one friend who had succeeded at installing the Mach kernel on a 386 (at a high academic cost), Linux promised something like the Real Thing. From that moment on, I spent the rest of my summer job (well, a good part of it) downloading the Softlanding Linux System (SLS) distribution and installing Linux with kernel 0.97 on my trusty 386sx. I think it took the better part of a week to even get it to boot. The installation required twenty-some 1.44MB disks. I scraped together perhaps a dozen, overwriting old files and a free copy of OS/2 to reuse them. I distinctly remember a lot of fiddling with the Xconfig file, the panic you feel when your monitor suddenly emits a loud, angry whine, and the very slow process of troubleshooting. Even if I could have "Googled" for solutions, I was off campus by then with only a 1200-baud modem. Around the same time, I upgraded to a 135MB hard drive. I remember it cost about $150 from a mail-order house. (Peripherals Plus?)

I've come to think that the desire my friends and I had for our own personal Unix workstations was the same feeling that early microcomputer enthusiasts had years before us -- striving for the opportunity to have and program your very own computer, free of the access constraints and cost of institutional machines; a computer to explore and hack on without interference or worry. The early microcomputers (the Altair 8800 and IMSAI 8080) even mimicked mainframe and minicomputer front panels, with blinkenlights and switches -- features that disappeared as microcomputers quickly evolved (think of the Sol-20, TRS-80, Commodore PET, and Apple II). Working with early microcomputers is a chance to explore and maybe experience a little of what those pioneers were feeling -- a feeling I had in my own time and environment.

Early microcomputers have an aesthetic appeal, too. Within them are delightful puzzles. They are fun machines to get "under the hood" with. The electronics are relatively simple and accessible. You can typically fix them if they're not working, a task made easier by their through-hole ICs and other components and, often, the availability of full schematics and theory of operation documents. The computers click and clack with toggle switches, they blink away with each calculation, and the disk drives whir, snap, and stutter. Creative coding can be toggling a few dozen bytes into the front panel and hitting run, halting the processor, and examining memory. It's tactile, and full of stimuli. You can take pleasure in crafting simple tools in assembly or perhaps BASIC. The limitations become intriguing constraints for your creativity to explore, much like haiku.
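
To make that concrete, here is a minimal sketch of the kind of program you might toggle in from a front panel: a dozen bytes of 8080 machine code that add two numbers held in memory and then halt. The addresses and the hand-assembled listing format are my own illustration, not taken from any particular manual.

    ; An illustrative hand-assembled 8080 toggle-in program.
    ; Deposit the bytes starting at 0000h, put the two operands at
    ; 0010h and 0011h, and run; after the HLT, examine 0012h for the sum.
    0000: 3A 10 00    LDA 0010h   ; load the first operand into A
    0003: 47          MOV B,A     ; stash it in register B
    0004: 3A 11 00    LDA 0011h   ; load the second operand into A
    0007: 80          ADD B       ; A = A + B
    0008: 32 12 00    STA 0012h   ; store the sum at 0012h
    000B: 76          HLT         ; halt; examine memory from the panel

Twelve bytes, deposited one switch register at a time, with the result read back in binary on the data LEDs -- that is the entire load-run-examine loop.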

From there, it is a chance to branch out through computing history -- run emulators of IBM System/360s or PDP-11s; try your hand at the programming environments of the elders; write scripts for the early Bourne shell with its fussy syntax; calculate sets of numbers in the original C language (hello, world). It's an environment where the output is strictly ASCII, an early information-age celebration of type and a stream of glyphs. You can go as far as I have, restoring a teleprinter and daisy wheel printers to recover the clacking, clicking, and hammering sounds of slow output, now largely replaced in daily life by the drone of laptop fans, HVAC cooling, silent SSD drives, the lonely night light of LCDs, and, of course, endless scrolling.

I started down my journey of revisiting early microcomputing when I picked up an Apple II Plus and a TRS-80 Model III in 2013. These were among the first personal computers that I learned on and played with as a kid. Friends and relatives had machines like the Commodore VIC-20 and TI-99/4A. Eventually, our family bought a used Timex Sinclair 1000. Computers were fascinating to me. I attribute my middle-age interest in these machines to a dopamine receptor response that formed as I pined over the Atari 800s in the Sears catalog's computer section or loitered in Radio Shack long enough to be asked to leave. Yes, I was a coke-bottle-glasses fourth-grade computer store hoodlum. With nefarious intent, I took piano lessons because my teacher had an Apple II Plus, and I could use it for a half hour while I waited my turn at the piano (Lemonade Stand! Compute! type-in programs!). Imagine my surprise when I discovered in 2013 that the Apple II keyboard had shrunk in the thirty years since I last used one. No, wait! It was my hands that grew!

More than nostalgia, this journey over nearly a decade has given me the opportunity to explore and study different eras of computing history, creative coding, and computer art. There are interesting lessons in that history -- drivers for new industries, business school studies of short-lived startups that burned brightly and flamed out, and technological disruption that would bring even behemoths like IBM to the brink of extinction.

Thoughts on restoration

I restore (repair) vintage computers, electronics, and other information-age machines. While I value their historic nature as artifacts, I am not a museum. Restoration is not conservation. My activities alter the "historical integrity" of each object. My work is best viewed as a part of their continuing provenance. My goal is not to "freeze" each machine as a true document of a point in computing history, as a museum would, but to explore its design and operate it.

Some of the machines came to me with evidence of use in the very early days of personal or home computing, and it is that spirit I am most interested in. What were these machines like to use? What were their capabilities? You can sense the excitement that early microcomputer enthusiasts had. The machines have appealing aesthetics, too---big 8" floppy disks, electromechanical whirring and clanking drives, blinkenlights, tactile front panel switches---as well as an elegant simplicity. Toggling a machine language program into an IMSAI 8080 with just a processor and 1K of RAM is about as close to a computing machine's internal workings as one can get. It is a sublime distillation, like computing haiku. But then, the Processor Technology Sol-20, which arrived just a year after the IMSAI, is a testament to the rapid evolution of microcomputers---away from machines that mimicked the mainframes and minicomputers of the time, and towards their own distinct class of computing hardware. After forty years, those machines have evolved into the thin, black supercomputer slabs we carry around every day in our pockets.

I am interested in good practices to follow for my restoration work. Lately, I've tried keeping alterations to a minimum, preserving the "carbon scoring," because the droids in the workshop have seen a lot of action. I document their condition, my changes and repairs, and I try to keep as many of the original pieces and parts as possible. Any alterations could, in theory, be reversed---even if that means returning the machine to an inoperative state. Even this effort has a certain futility; after all, one cannot preserve the original solder when defective parts are replaced. Irreversible change is an inevitable consequence of any activity. It is our own Ship of Theseus dilemma.

So, how does one identify good restoration techniques? I would prefer to avoid archivally unsafe practices. We should ask whether the restoration work will cause irreversible damage, either immediately or in the future. What resources exist for weighing the best methods and practices? Several machines in my collection have scattered spots of corrosion. I'm confident that my storage conditions will essentially stabilize it (or maybe not, if I don't keep the humidity in my retrobunker below 55 percent). Should I try to remove the corrosion? What would be the best way to do this? Should it be done mechanically or chemically? What implications would each approach have for the future stability of the materials?

The corrosion is a good reminder that the components and materials of these machines each have their "inherent vice." Metal parts corrode from environmental exposure or through galvanic processes. Capacitors and batteries leak, wire insulation loses its elasticity, rubber parts degrade and stiffen (or sometimes fluidize), plastic cases yellow and become brittle, foam disintegrates, semiconductor dopants diffuse. These machines, like all material tools and objects, will not work or last forever. It's somewhat amazing to me that they have survived forty years and can be coaxed into operation once more.

Notes

  1. I was lucky to have a summer job at a tech spinoff from our local university---quite an outlier in our rust-belt Appalachian town, and fueled partly by congressional pork-barrel politics (over the mountain from the Bud Shuster Highway). The company was founded to provide numerical models and analysis for metal casting processes in the defense industry. Foundries used the simulations to identify the best placement of risers and other casting features and to predict the stress distribution that developed in a part during solidification. SGI machines ran the back-end finite difference models and a user interface for setting up a calculation by importing solid models and configuring the model parameters. We hard-coded each radio button and data-entry field. The front end was written in C, and CVS was used to check the source files in and out. Compilation took on the order of 20-30 minutes. The group was about a dozen people, including me and another intern. My official title was Junior Software Specialist. I have fond memories of those summers. My first summer working there, I fell for the Tektronix Phaser printer and generated many images of Mandelbrot sets from the SGI demo program dragon. The utility enscript was another favorite, with its pretty headers and clear, two-column code printouts flashing off the workgroup laser printer. It left my Epson FX-80 dot-matrix printer at home in the dust.
  2. Martin came to school with a pizza-box NeXTstation that he won in a wager with his father.