Achieving Accuracy: A Legacy of Computers and Missiles
Ebook, 672 pages (10 hours)


About this ebook

Achieving Accuracy: A Legacy of Computers and Missiles is an intensively researched, photo-enhanced discussion of digital computing and missile development in the twentieth century, organized in two sections. (No matter what anyone has been told, virtually all of the digital machines ever designed are binary deep down inside. Number representations may have varied, but the binary logic discussed here prevails.) After a bit of early history, the computing section begins in earnest with Turing’s Bombe, used to decrypt Enigma traffic, then investigates digital systems one by one, from early room-sized serial machines through the beginning of the modern parallel era, ending with the massively parallel supercomputers of the post-2000 years. Unlike most computing histories, Achieving Accuracy deals in detail with military computing systems generally omitted for lack of definitive information. (Computer design and computer-controlled missile guidance and submarine navigation occupied some thirty years of the author’s professional career.)

Achieving Accuracy’s missile descriptions and discussions begin with weapon systems that existed well before WW2 and cover virtually all US smart bombs, cruise missiles, and ballistic missiles of that century. Missile guidance has ranged from the V-1’s dead reckoning, through simple but jammable radio control, to the exceedingly complex self-contained inertial guidance systems discussed at length. The reader may be surprised to learn that a “smart bomb” flew in 1917, and that several different models were used in anger in WW2. The Minuteman III leg of the present triad is described in detail, along with a somewhat bizarre set of proposed basing plans for the Peacekeeper missile that were precursors of the recently proposed “subway” basing plan for MMIII. The missile legacy includes a sub-section, necessarily less complete, describing Soviet/Russian missilery through 2000, noting that early Soviet ballistic missile development was based almost entirely on the German V-2.

Language: English
Publisher: Xlibris US
Release date: Dec 11, 2008
ISBN: 9781462810659
Author

Marshall William McMurran

About the Author
BS Mathematics/Chemical Engineering, Oregon State College/University, 1951
Distinguished Military Graduate, Oregon State, 1951
Appointed Lieutenant, USAF, June 1951
Awarded Graduate Certificate in Meteorology, UCLA (USAF-sponsored), 1952
USAF Meteorologist, 1952-1955: Hamilton AFB, California; Kunsan AB, Korea; Hill AFB, Utah
Upon honorable discharge from USAF active duty, joined the Guidance Analysis Group, Autonetics Division of North American Aviation. Simulated Autonetics N6A Navigator gyrocompassing performance prior to the Nautilus under-pole voyage in 1957. Programmed the first successful digital general-purpose real-time control of a shipboard inertial system (perhaps the first-ever real-time, general-purpose, fully digital control of any complex system). Programmed guidance and controls for the Hound Dog missile system; this work formed the basis for the later Polaris and Poseidon shipboard and A3J Vigilante navigation programs. Worked on the configuration of the Minuteman I digital flight computer (D-17). Laid out and programmed the first of the Minuteman I flight and ground control programs. Managed Autonetics Inertial Navigation Division Computer Systems, Guidance Systems Engineering, and Inertial Instruments and Processes. Served as Director of Calculator Development, Hybrid Businesses, and Microelectronics Engineering, retiring as Chief Engineer of the Semiconductor Division of Rockwell International.
Authored various papers dealing with computers, inertial systems, and semiconductors.
Author, Programming Microprocessors, 1977, Tab Books, Blue Ridge Summit, PA
Author, A Comprehensive Summary of Signal Processing Devices, 1987, Rockwell International internal research document
Author, If It Weren’t for People, Management Would Be a Science, 1998, Institute of Industrial Engineers
Flight Instructor; Commercial Pilot, Instrument-rated, Multi-engine
Professional Engineer, Control Systems



    Achieving Accuracy - Marshall William McMurran

    Copyright © 2008 by Marshall William McMurran.

    All rights reserved. No part of this book may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or by any information storage and retrieval system, without permission in writing from the copyright owner.

    Any people depicted in stock imagery provided by Getty Images are models, and such images are being used for illustrative purposes only.

    Certain stock imagery © Getty Images.

    Rev. date: 07/08/2021

    Xlibris

    844-714-8691

    www.Xlibris.com

    579421

    CONTENTS

    Chapter 1: The Early Days of Computing

    Chapter 2: Korean-Inspired Cash Matures the Digital World

    Chapter 3: The Second and Later Rounds of (Mostly) Big Iron

    Chapter 4: Small Magnetic Drum Computers of the 1950s-70s

    Chapter 5: Real-Time Control Computers

    Chapter 6: NASA Control Computers

    Chapter 7: Late Model High-Speed Supercomputers

    Chapter 8: Before the Second World War

    Chapter 9: V-1s, V-2s, and Their Derivatives

    Chapter 10: Early US Missile Programs

    Chapter 11: United States Missile Development Matures

    Chapter 12: US Missile Development (1960-1999)

    Chapter 13: Soviet and Russian Land-based Missile Systems

    Chapter 14: Soviet and Russian Naval Missile Systems

    To

    Lisa

    FOREWORD

    THIS BOOK IS devoted to a compiled and edited history of digital computing and missilery, beginning with early arithmetic aids and continuing through (and a little beyond) the twentieth century. Massive US and Soviet technical innovations burgeoned in the 1950s, initially fueled by cash infusions triggered by the Korean War. This was followed in the US by post-Sputnik money and, later, continuous cold war spending. Since the lion’s share of western computer and missile development was funded by the US government, much of the development work was US government owned, and innovations were often hidden from public view, held deep in classified documents. As a result, a great deal of the technical development history available today is incomplete and occasionally inaccurate. (It was necessary to search and edit many redundant sources rather carefully to arrive at what should be a good approximation to the historic truth concerning both the digital computing machinery and the missile systems.) Some histories may well be missing from this compilation, and unfortunately, a great deal of that lost history may never be recovered. Classification, security, compartmentation, company proprietary: whatever the name, all worked to suppress public disclosure of the sweat of a lot of technical folks. Secrecy was, and still is, a necessary evil, but it is often overused. Sadly, it has kept a wealth of interesting and useful technical information from the people who paid for it. The prime motivation for this book was to provide a vehicle for preserving some of the information that still exists.

    Since a myriad of computer and weapons systems were developed during and immediately after the Korean War, it seemed best to deal principally with machines and systems whose designers had made major contributions to the arts of guidance, propulsion, and digital computation. The reader will find more meat in the discussions of computers and missile systems that I worked with than in those I did not. It is inevitable that some favorite and well-known (to those that know them well) systems have been given short shrift. For that, I apologize.

    For quite some time, the missile legacy built on prior German work. Even so, the systems derived by both the United States and the Soviets during the 1950s represented a giant step forward in weapons system complexity and potential capability. Unfortunately, the technology often got ahead of itself and stumbled. Nevertheless, propulsion and guidance development proceeded faster than at any other comparable time before or since. It all fitted together rather well and defined the course of several later technologies that were refined as the Vietnam War came and went.

    In addition to many written sources, I was fortunate to be able to pick the brains and, on occasion, review the personal files of a number of influential managers, educators, engineers, and scientists. Among them were Hal Engebretsen, Dale McLeod, Al Grant, Jeff Schmidt, Hugh Galt, Bob Nease, Joe Cherney, John Pinson, Dee Lyon, Don Pickerell, Bob Knox, Jim Rex, Gene Pentecost, Joe Boltinghouse, Walt Pondrom, Walt Evans, Bob Doty, John Slater, Roger duPlessis, Greg McMurran, Terri Quinn, Bret McMurran, and Grant McMurran. There were many more, but to name them all would require a very long list. All have my gratitude. Any errors in this writing are surely all mine.

    A LEGACY OF COMPUTERS

    CHAPTER 1

    THE EARLY DAYS OF COMPUTING

    IT WASN’T UNTIL the eve of WWII that even rudimentary, somewhat programmable electronic digital computing devices existed on this planet. These few machines were the products of a handful of forward-looking designers and the odd tinkerer. At the beginning of the Korean War in 1950, digital computing technology was primitive even by the standards of the later 1950s. The cautious and rather fragmented computer evolution of the 1930s and 1940s quickly became a revolution, fueled by the influx of Korean-inspired military research and development dollars. The flood of new weapons systems put to work in the 1950s on the western side of the Iron Curtain sparked a dramatic increase in both the sophistication and number of high-speed digital machines. American industry quickly embraced these new tools as they emerged from the clutching hands of professors, engineers, generals, and admirals. This frenetic activity led rather directly to the microelectronics revolution, yielding integrated circuits, the personal computer, and the now indispensable Internet. The blindingly fast, cutting-edge, massively parallel 300++ teraflop machines of the twenty-first century owe homage to the Korean legacy.

    Diverse early inventors (circa 3000 BC-AD 1930) conceived of a myriad of imaginative written schemes and mechanical contraptions to manipulate and combine numbers in special ways. In a broad sense, they all were contributors to the architecture of the modern digital computer. The Sumerians invented and developed arithmetic techniques using several different number systems, including a mixed-radix system alternating numerical bases between base 10 and base 6. This rather unwieldy sexagesimal system was the standard number system in Sumer and Babylonia, and led to a timekeeping scheme having sixty seconds to the minute, sixty minutes to the hour, and using a twelve-hour day. The Sumerian time and angular measurement notation, together with the Sumerian twelve-month calendar, is, of course, still in use. The Babylonians had successfully used an abacus-like device around 3000 BC. This earliest abacus appears to have been a board or slab covered with a thin layer of sand used by the Babylonians to trace their symbols. The Chinese made their abacus a going thing around 1300 BC. They have been using it quite effectively ever since.¹
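
    The Sumerian mixed-radix legacy is easy to exercise in modern code. A minimal sketch (function names are my own, not anything from the historical record) converts a plain count of seconds to and from the base-60 hours:minutes:seconds scheme we still use:

```python
def to_sexagesimal(seconds):
    """Split a count of seconds into (hours, minutes, seconds),
    the base-60 grouping inherited from Sumerian timekeeping."""
    minutes, sec = divmod(seconds, 60)
    hours, minutes = divmod(minutes, 60)
    return hours, minutes, sec

def from_sexagesimal(h, m, s):
    """Recombine hours, minutes, and seconds into a plain second count."""
    return (h * 60 + m) * 60 + s

print(to_sexagesimal(4000))  # (1, 6, 40)
```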


    ANTIKYTHERA DEVICE

    At this writing, a loose group of Greek and British scientists has been guessing at the purpose and operational details of the Antikythera Mechanism, a fascinating geared device retrieved back in 1900 from an old Roman shipwreck. The forms of the Greek lettering on the device suggest that Hipparchus, an astronomer of the era, might have been involved in its creation around 150-100 BC. When found, the mechanism was badly corroded, but appears to have used as many as seventy gears in the calculations. The machine may be a combination of an odd sort of clock and a calendar, reproducing the motions of the sun and moon, using Babylonian arithmetic progression cycles calculated by the gearing. An investigating team has credited the early designer(s) with producing at least one very odd gear ratio of 235/19 with an estimated precision of one part in forty thousand.
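
    That 235/19 gear ratio is the Metonic relation: 235 lunar (synodic) months very nearly equal 19 solar years. A quick check against modern mean values (the constants below are today’s figures, not anything recovered from the device) shows why the ratio was worth cutting gears for:

```python
from fractions import Fraction

# Modern mean values, used only to sanity-check the ancient ratio
SYNODIC_MONTH = 29.530589   # days per lunation
TROPICAL_YEAR = 365.24219   # days per solar year

metonic = Fraction(235, 19)                  # the gear ratio credited to the device
true_ratio = TROPICAL_YEAR / SYNODIC_MONTH   # lunations per year, about 12.3683

error = abs(float(metonic) / true_ratio - 1)
print(f"235/19 = {float(metonic):.6f}  vs  {true_ratio:.6f}")
print(f"relative error: roughly 1 part in {1 / error:,.0f}")
```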

    Imagine for a moment, the handwork required to produce even one of these gears. Once the artisan had extracted a rough cast bronze gear blank from his mold, he must have carefully trued the rim to an approximation of a circle. He then established the center, likely using a crude form of a divider, laboriously checking for the true center by trial and error. He then carefully punched a drill guide, and began drilling. He probably spent hours or days worrying his way through, using a set of rapidly dulling bronze or stone drills. Once the center hole was drilled through, he may have worked for several more days insuring equidistant radii to preclude binding of the gear. The mechanism’s gear teeth are triangular, probably marked using a template and cut by a bronze or granite file slightly deepening each tooth groove in turn to insure that the gear pattern was consistent around the circle. The artisan undoubtedly had to use all of his considerable skill to accomplish this, since the bronze casting was likely full of hard spots and voids.

    It appears that the device was intended for portable use by relatively unskilled users. The differential gear system employed a considerable number of ratioed gears allowing multiple inputs, and is the earliest known example of this sort of gearing. (Differential gears weren’t used again until around 1877.)

    Not much of real consequence in the digital mechanism design world is found after Antikythera until the 1600s, when a significant calculating device appeared thanks to John Napier (AD 1550-1617) of Napierian logarithm fame. He devised a set of marked-off sticks that he called bones to reduce multiplication and division to the simpler operations of addition and subtraction. Napier’s bones made up a sort of nonsliding slide rule. As Napier worked with his logarithms, he apparently relied on his bones as well as his tables.
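
    The trade that made the logarithms valuable, exchanging multiplication for the addition of logs, can be shown in a few lines. This is a sketch of the principle only; Napier of course worked from printed tables rather than a math library:

```python
import math

def multiply_by_logs(a, b):
    """Multiply two positive numbers using only the addition of their
    logarithms -- the trick behind Napier's tables and, later, Gunter's
    end-to-end logarithmic scales."""
    log_sum = math.log10(a) + math.log10(b)   # 'add the logs'
    return 10 ** log_sum                      # look up the antilog

print(round(multiply_by_logs(123, 456)))  # 56088
```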


    A SCHICKARD MACHINE MODEL AT THE COMPUTER

    SCIENCE MUSEUM (PHOTO BY AUTHOR)

    Around 1607, Edmund Gunter² created a device that was a predecessor of the modern slide rule. In 1623, he published a description of this instrument, noting that it was made up of two logarithmic scales, each ranging from 1 to 10 and placed end to end.

    Perhaps the earliest modern working European calculating tool was created around 1623 as well. This was a simple mechanism due to Wilhelm Schickard, a German scientist and cartographer. The Schickard machine consisted of a series of movable numbered sticks extending from the side of the device, driving a set of dials using cams or detents. When the user moved the numbered sticks back and forth, the readout dials followed from their prior position. This calculator could accumulate sums and differences.

    A bit later, but still in the 1600s, both Blaise Pascal and Samuel Morland hit upon mechanical designs to aid/perform addition and subtraction. Pascal began work on his device in 1642 when he was only nineteen. His father was a French tax commissioner, and Pascal’s intent was to reduce his dad’s workload. The Pascal calculator was known as both a Pascalina and an Arithmetique, when it was known at all.

    The Pascal mechanism could both add and subtract, although users seem to have felt that addition was somewhat easier than subtraction. The user dialed both input numbers and the required operation into the Pascalina, activating the computation gears. The Pascalina then displayed the results above the input dials. Pascal innovatively used complementary arithmetic for subtraction to simplify the mechanization.³ He actually produced some fifty of these devices. However, he had sold only a dozen or so by 1652. Unfortunately, the Pascal machines were decimal-based, but the French currency of the time wasn’t, so the machines never caught on. The French were using coinage denominations rather similar to those of the British before decimalisation: French livres corresponded to British pounds, sols to shillings, and deniers to pence. So the user had to perform rather annoying adjusting calculations if the Pascalinas were to be used as currency calculators. Shrugging off the poor sales, Pascal went on to greater things.
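
    Pascal’s complementary-arithmetic trick is easy to demonstrate. The sketch below (a modern illustration of the principle, not a model of his gearing) subtracts by adding the complement and letting the carry fall off the end:

```python
def subtract_by_complement(a, b, digits=6):
    """Compute a - b (for 0 <= b <= a) using addition only, via the
    nines' complement scheme Pascal built into the Pascalina."""
    nines = 10**digits - 1
    complement = nines - b        # nines' complement of b; on the machine this
                                  # was read directly off a second ring of digits
    total = a + complement + 1    # adding 1 makes it the tens' complement
    return total % 10**digits     # the overflow carry simply falls off the wheels

print(subtract_by_complement(875, 312))  # 563
```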


    A PASCALINA REPRODUCTION ON DISPLAY AT THE

    COMPUTER HISTORY MUSEUM (PHOTO BY AUTHOR)

    Samuel Morland (1625-1695) devised not one, but two calculating machines. He published a paper in 1673 entitled The Description and Use of Two Arithmetick Instruments. Morland’s calculators (or perhaps Morland) were important enough to catch the eye of Britain’s King Charles II. At any rate, Charles accepted a gift of a pair of Morland’s machines in 1662. (A Humphry Adanson would later manufacture Morland’s instruments and offer them for sale. At that time, it seems Humphry was living with Jonas Moore in the Tower of London. It’s not clear what either was doing there.) According to the Morland/Adanson sales pitch, the instruments were mechanical wonders. The advertising literature said in part: “By means of . . . (these machines) . . . the four fundamental rules of arithmetic are readily worked without charging the memory, disturbing the mind, or exposing the operations to any uncertainty.” Sounds good, but Samuel Pepys dryly characterized a Morland machine that he saw as being “very pretty but not very useful.”

    One Morland machine appears to have been a modification of the Pascal calculator. The other, Morland’s Machina Cyclologica Trigonometrica was constructed in 1663 of silvered brass and silver. (It does sound very pretty, so Pepys may have been at least half-right.) That device had geared scales that could pivot and shift around a central point. The scales and arms of the graduated protractors were arranged to form a triangle whose sides and angles could be measured by two small indicating gauges. As an added refinement, the device could be set to work as a graphic multiplier and divider. At this writing, a Morland machine is on display at the Science Museum in South Kensington, United Kingdom.

    Gottfried von Leibniz produced a calculating machine in 1672 using gears and a stepped drum. Known as the “stepped reckoner,” it could, with some care, add, subtract, multiply, and divide. The Leibniz design was continually improved over the years. A derivative product, another arithmometer, perfected by Charles Xavier Thomas about a hundred and fifty years later, was a hit. Copies and near-copies of the Thomas machines sold in Europe for many years afterward.

    In the early 1800s Joseph-Marie Jacquard devised a loom pattern controller using punched cards as information sources. The Jacquard loom was a refinement of earlier inventions created by several Frenchmen including Basile Bouchon, Jean Falcon, and Jacques Vaucanson. (Their works dated between 1725 and 1740.) Programmers controlled the Jacquard mechanism by prepunching holes in sets of cards that were strung together to sequentially select loom threads. This chain of cards was mechanically drawn through a set of sensing heads to produce an error-free (mostly) complex woven pattern.

    The term Jacquard loom is really a misnomer. Better to think of a Jacquard head, which is the programmer-controller that works with a great many Dobby Looms⁴ controlling the weaving action to create the intricate Jacquard patterns. A number of such looms remain in use to this day. Some still use cards strung together on cords. The early looms all used very heavy cardstock. Hole sensing in a Jacquard loom is entirely mechanical, and tends to be quite hard on cards. Strings of well-used Jacquard cards are still occasionally offered for sale to collectors, but modern high-volume Jacquard looms typically use durable metal cards that last quite a while.


    A DOBBY (JACQUARD) LOOM

    Each hole in the card controls the position of a hook, raising or lowering the harness, carrying and guiding the warp thread. As the weave progresses, weft threads are programmed to lie above or below the warp. This idea of being able to change the pattern of a loom’s weave by sequentially sensing programming holes in cards on the fly is a very important logical step in the concept of a stored-program computer.
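
    The hole-to-hook logic is simple enough to mimic in a few lines. In this toy sketch (names and card format invented for illustration), each card is a row of hole/no-hole positions read in sequence, just as the Jacquard head reads its card chain:

```python
def weave(cards):
    """Read punched cards in sequence: each card is a row of 0/1 'holes',
    and each hole raises (1) or leaves down (0) the hook for one warp
    thread.  Returns the resulting pattern, one text row per card."""
    pattern = []
    for card in cards:  # the card chain, drawn through the head in order
        row = "".join("X" if hole else "." for hole in card)
        pattern.append(row)
    return pattern

# a three-card chain controlling four warp threads
for row in weave([(1, 0, 1, 0), (0, 1, 0, 1), (1, 1, 0, 0)]):
    print(row)
```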

    In 1822 Charles Babbage, using a bit of the work of others, conceived a calculating difference engine. The production of function tables in those days was not only laborious but also prone to error. A contemporary of Babbage, a certain Dionysius Lardner, wrote at the time that a random selection of forty volumes of numerical tables contained no fewer than 3,700 acknowledged errors (and very likely, a comparably large but unknown number of unacknowledged ones). Numerical tables of the time suffered from three basic error sources: calculation errors, transcription (copying) errors, and typesetting or printing errors. Babbage’s prime motive for his engine design was to eliminate calculation errors in the production of these mathematical tables. In 1834, Babbage designed a second machine, an analytical engine. Later, in 1842, Lady Ada Byron, Lord Byron’s daughter, worked with Babbage. She may well have created programs of a sort for Babbage. (The Ada programming language was named for her.)


    A DIFFERENCE ENGINE CAM AND GEAR (FROM THE

    COMPUTER HISTORY MUSEUM)—PHOTO BY AUTHOR

    The use of Jacquard loom cards as control inputs influenced Babbage, who liked the idea of using sequences of punched cards to direct the computational sequences of his Analytical Engine. Babbage’s cards were strung together much like Jacquard’s. This Babbage mechanization played a crucial legal role in later years. It provided a precedent that was key to preventing Hollerith’s company from claiming patent rights on the idea of storing data on punched cards. Even so, the Jacquard idea of sequentially reading holes on punched cards to control mechanical activity would find an eighty-column home at IBM.

    Babbage designed his engine around the idea that the nth successive ordered difference of an nth order polynomial is a constant. Finer entries can be generated by intelligently chosen interpolations and subtractions. Babbage’s engine calculated table entries using Newton’s Method of Divided Differences, producing intermediate polynomial values without ever having to multiply. (A pure difference engine needs only to subtract.) Unfortunately, Babbage’s differencing machine was never completed during his lifetime, although he once demonstrated a six-wheeled model. Recently, volunteers have completed the assembly of a difference engine according to Babbage’s drawings. The builders found Babbage’s work to be accurate and the engine operable.
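
    Babbage’s principle is easy to reproduce: seed the registers with the initial value and its successive differences, then generate every new table entry with additions alone. The sketch below tabulates p(x) = x*x + x + 41, a polynomial often associated with difference engine demonstrations:

```python
def difference_engine(initial_values, steps):
    """Tabulate a polynomial using addition only, as Babbage's engine
    did.  initial_values holds [p(0), first difference, second
    difference, ...]; for an nth-degree polynomial the nth difference
    is constant, so the registers never need a multiplier."""
    regs = list(initial_values)
    table = [regs[0]]
    for _ in range(steps):
        # fold each difference into the register above it, top-down,
        # so every register advances by the previous step's difference
        for i in range(len(regs) - 1):
            regs[i] += regs[i + 1]
        table.append(regs[0])
    return table

# p(x) = x*x + x + 41: p(0) = 41, first difference = 2, second = 2
print(difference_engine([41, 2, 2], 5))  # [41, 43, 47, 53, 61, 71]
```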

    For the record, the English-born George Boole came along and published The Mathematical Analysis of Logic, incorporating ever so important methods of binary function manipulation. Around the 1850s in Sweden, Georg and Edvard Scheutz built a practical sort of mechanical computer. In the 1870s, Frank Baldwin created a compact single-cylinder mechanical calculating (adding) machine. The Franklin Institute of Philadelphia awarded him a medal in 1874 for his design. Baldwin fiddled for a time with patents for the machine, but never placed it on the market.


    ODHNER AND HIS ARITHMOMETER

    In 1874-79, Willgodt Odhner of St. Petersburg in Russia received a pair of US patents for what he called a new and improved “arithmometer.” Odhner briefly described it as follows: “My invention is an instrument for assisting in calculating, it being adapted to add, subtract, multiply, and divide numbers without any labor on the part of the operator other than that required to set and rotate certain numbered counting wheels, and to adjust a slide carrying a series of recording-wheels.” Odhner built perhaps fifty such machines under his 1874 patent, and perhaps others under his second 1879 patent.

    Beginning in 1882, William Seward Burroughs developed, patented, and sold very workable geared and cam adding machines in the United States through the American Arithmometer Company that he jointly formed in 1886 with Thomas Metcalfe, R. M. Scruggs, and W. C. Metcalfe. Thomas Metcalfe was elected company president, and William Burroughs vice president. The company initially set the price of their adding machine offering at $475 each. By 1895, American Arithmometer had sold 284 of these machines. In 1904, the expanding company moved to new quarters in Detroit from St. Louis, relocating all of the employees in one day on a special train. Company officers renamed the company the Burroughs Adding Machine Company soon after the move. We’ll hear more from the Burroughs Company a bit later.

    ANALOG COMPUTERS


    A BUSH WHEEL AND DISK INTEGRATOR

    Assorted analog computers came early in the modern cycle and now have mostly gone. Widely used before the present digital era, a typical analog computing system is designed to work on a range of special problems. Once set up, an analog computer configuration might be retained for years, performing parametric studies of water flow, engine performance, controlling flight simulators, or a myriad of other uses. Relying in part on the predetermined behavior of mechanical and electronic components, these analog machines generally provide solutions to combinations of simple first-order differential equations. The analog machine’s digital counterpart is/was the Digital Differential Analyzer (DDA). A DDA consists of a set of interconnected summing registers or integrators. The DDA has been quite accurately, although somewhat facetiously, described as a stuttering analog computer.

    Perhaps the best-known mechanical analog integrating element is the Bush integrator. Vannevar Bush noted that if a rotating disk drove a second disk with an axis of rotation perpendicular to that of the driving disk, and if the distance of the second disk from the driving disk’s center was varied, the lash-up could mechanically solve a simple first-order differential equation of the form dz = y dx (Whew!). Since the outputs and inputs are all shaft rotations, combinations of interconnected Bush integrators, given proper care and feeding, can solve large classes of complex differential equations.
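
    The wheel-and-disk relation dz = y dx is just incremental integration, and the DDA’s “stuttering” version of it can be sketched numerically (a simple rectangular-step sum for illustration, not a model of any particular machine):

```python
def dda_integrate(y_of_x, x0, x1, steps):
    """Accumulate z = integral of y dx in small increments, the way a
    Bush wheel-and-disk integrator (or its digital DDA cousin) does:
    each step adds y * dx to the running output shaft position z."""
    dx = (x1 - x0) / steps
    x, z = x0, 0.0
    for _ in range(steps):
        z += y_of_x(x) * dx   # the dz = y dx relation, one step at a time
        x += dx
    return z

# integrate y = 2x from 0 to 1; the exact answer is 1
print(dda_integrate(lambda x: 2 * x, 0.0, 1.0, 100_000))
```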

    Electronic integrators or controlled summers are the bases for the more recent analog computers. These integrator elements were (and to a degree, still are) used in conjunction with cams, gears, hydraulics . . . often incorporating pieces of the hardware under study, with the aim of simulating the full behavior of various kinds of complex gadgetry. During its heyday, analog computer programming was typically done by connecting integrators using a patch board much as connections were made on a manual telephone switchboard. Programming such a machine required both the knowledge of expected hardware behavior and of the underlying describing mathematics. Even the act of scaling the solution (finding the order of magnitude of an answer derived from the computer’s output) is nontrivial. Analog computer solutions typically take the form of plotted graphs, voltage measurements, rotating shaft positions, and more recently, digital displays and readouts.

    THE REAL, BUT CAUTIOUS DIGITAL BEGINNINGS (1930S)


    TURING’S BOMBE

    Several small groups and individuals were working on primitive digital-style computing machines in the 1930s and ’40s. Functional forms of modern digital computers, though, got a big push from the work of the highly successful WWII British code-breaking groups of Bletchley Park. Alan Turing was a premier mathematician and philosopher at Bletchley. He was instrumental in the design and assembly of a special-purpose interconnect-searching computer known as Turing’s Bombe. The Bombe was a generalization of an earlier Polish Cryptological Bomb that had successfully decrypted a large class of German Wehrmacht messages. Beginning in 1932, the Poles had manually, laboriously, and consistently broken German encrypted message traffic. In October of 1938, Marian Rejewski developed a device that would automate and speed up this reconstruction of the German Enigma cipher keys. After the fall of Poland, Rejewski escaped to France, where he and Alan Turing met in 1940. The British version of the Polish Bomb soon followed.

    To decrypt Enigma traffic successfully, it was necessary to begin with some guess of the likely form of clear text in an Enigma encoded message (a crib). The Enigma mechanized the processes of encryption and decryption using three (later four or five) rotors that were variably and sequentially connected from input to output. The encryption key depended on the rotors selected, the starting rotor positions, the rotor ring positions and, if used, plug board (Stecker) settings that gave an additional level of encryption. Bombes isolated the correct Enigma configurations by sequentially disproving each incorrect setting in turn until clear text was reconstructed.
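
    The Bombe’s strategy of sequentially disproving settings can be illustrated with a toy cipher. Here a single Caesar shift stands in for the vastly more complex Enigma rotor stack (everything below is a deliberate simplification), and a crib eliminates every key but the right one:

```python
def shift_encrypt(text, key):
    """Toy 'rotor': shift each uppercase letter 'key' places."""
    return "".join(chr((ord(c) - 65 + key) % 26 + 65) for c in text)

def bombe_search(ciphertext, crib):
    """Try every possible key and keep only those the crib does not
    disprove -- the Bombe's elimination strategy in miniature."""
    survivors = []
    for key in range(26):                        # every candidate setting
        trial = shift_encrypt(ciphertext, -key)  # decrypt under this key
        if crib in trial:                        # the crib fails to disprove it
            survivors.append((key, trial))
    return survivors

ct = shift_encrypt("WETTERBERICHT", 7)   # 'weather report', a classic crib word
print(bombe_search(ct, "WETTER"))        # [(7, 'WETTERBERICHT')]
```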

    The Bombes were electromechanical nightmares. Each weighed about a ton. As can be seen from the accompanying photograph, the hardware was enclosed in a cabinet about twice the size of a refrigerator with well over one hundred shafts protruding from the front. Massive plug boards provided most of the interconnect options. The Bombes claim to fame as forerunners of the modern digital computer stems from the fact that the Bombes were all digital, and mechanized digital sequential logic continually testing for desired interconnect characteristics. Some two hundred of these Bombes were built during WWII. British security people destroyed all of them at the end of the war.

    THE COLOSSI (1943-1945)


    A PIECE OF COLOSSUS

    These machines were among the first programmable (well, somewhat programmable) digital electronic computers. The Colossus machines were instrumental in attacking and decrypting German codes. This time, the codes were generated by German Lorenz SZ40/42 encryption devices. These Lorenz codes were at least as tough to decrypt as those generated by the Enigma, but the Lorenz work has never gotten the same press as the Enigma decryptions. The Lorenz crypto machines, code-named Tunny, were in full use by the Germans in 1941.

    The Colossus computers lived up to their name. They were huge. Each was built of state-of-the-art electronics of the era. A Colossus incorporated around 1,600 vacuum tubes and a large, but now uncertain number of relays. No matter how carefully vacuum tubes (or valves as the British call them) are fabricated and tested, tubes are prone to failures. Anyone once owning a vacuum tube TV set that contained perhaps ten to fifteen tubes can appreciate what the maintenance problems must have been with a computer using 1,600 or so, of these potentially failing components. As unreliable as the tubes were when continuously powered, they were worse when turned on and off. Temperatures cycled, causing filaments to burn out and glass-to-metal seals to crack. Therefore, the Colossi were simply left on all the time (except when they failed, of course). This simple operational change markedly improved Colossi reliability. With loving care, the Colossi seemed to have stayed up and running long enough to do their assigned decryption jobs.
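
    The maintenance arithmetic compounds quickly. Assuming, purely for illustration, that each tube independently survives a given period with probability 0.999 (not a measured Colossus figure), a 1,600-tube machine seldom gets through that period failure-free:

```python
# Illustrative reliability arithmetic -- assumed numbers, not measured
# Colossus data.  The machine runs only if ALL of its tubes do, so the
# survival probabilities multiply across every tube.
per_tube_survival = 0.999
tubes = 1600

machine_survival = per_tube_survival ** tubes
print(f"chance of a failure-free period: {machine_survival:.1%}")  # about 20%
```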

    The first Colossus Mark I arrived at Bletchley Park in 1943, followed by nine other Colossi in short order. An important innovation in the Colossus design, perhaps the most important, was largely due to Tommy Flowers, a Post Office engineer working with the Bletchley Park code breakers. Flowers pushed hard for internal bulk information storage. This bulk data store was quite important in the code-breaking process, but it also needed a huge array of tubes/valves.

    Memories, particularly high-speed memories, were exceedingly hard to come by in those days. Flowers took the project on as a labor of love. He spent ten months working on the first Colossus at the Post Office Research Center at Dollis Hill, then personally delivered the completed machine to Bletchley Park on 8 December 1943. All the Colossus design work was carried out under wraps, two years before the completion of the ENIAC in the United States, but because of the stringent British security, no computer designers outside the small British group of code breakers had an inkling of the details of the Colossus design.

    The Mark I Colossi were refined and upgraded to Mark IIs beginning in June of 1944. The Mark II was about five times faster than the Mark I, but it needed nearly 2,500 valves. (That must have done wonders for Mark II reliability.)⁵ A Colossus Mark II was available to the code breakers just in time to decrypt German traffic prior to D-Day. These decoded messages permitted General Eisenhower to be reasonably sure that the German High Command had swallowed the deceptive bait he had put about concerning the location of the D-Day landings.

    Because of the secrecy surrounding the Colossi, even the existence of these machines wasn't publicly known until 1976, when the first of several periods covered by the British Official Secrets Act expired. Even then, the British only grudgingly doled out information in bits and pieces. (Some details were still classified by the British War Office as late as 1996.) There may well be more information hidden in the War Office archives. At the end of WWII, Winston Churchill was so intent on keeping the Colossi's existence quiet that he ordered all but two crunched into pieces no bigger than a man's hand. The other two were destroyed or disassembled over the years as well. All the security notwithstanding, the Colossi were similar in style to several more or less experimental designs of the late 1930s. AT&T's Bell Labs even produced a publicly known relay-based digital machine of roughly similar configuration at about the same time, but no one outside of Bell Labs ever seemed to care much.

    The United Kingdom sponsored a number of follow-on cryptanalysis projects during the immediate postwar years of 1945-1955. According to the IEEE Annals, one of the more significant of these was Oedipus, a special-purpose rapid analytical machine using novel digital storage. GCHQ (Government Communications Headquarters) and the UK companies Elliott and Ferranti developed Oedipus. Oedipus contained a large semiconductor associative memory, a magnetic drum with on-the-fly searching, and a high-speed RAM cache. Its history is sparse and has only recently been made publicly available.

    THE ZUSE MACHINES (CIRCA 1940-1960S)

    Konrad Zuse designed a series of mechanical calculators in Germany, the first operating as early as 1938. He demonstrated the third machine of this sequence, the Z-3, in 1941 in Berlin. The Z-3 has gained a degree of acceptance these days as a very early stored-program computer despite the very important shortcoming that, during operation, Z-3 instructions and data were not co-stored in the same memory. (This co-store feature is an important characteristic of the von Neumann architecture and is central to the configuration of virtually all general-purpose stored-program digital machines today.)⁶ The Z-3 repertoire didn't include conditional transfers, another major deficiency of the machine. The Z-3 was, however, quite advanced in another way, perhaps unnecessarily so: the computation center could perform floating-point arithmetic.⁷ Considering the state of the art at the time, Z-3 floating-point hardware must have markedly increased the logic complexity, slowed the machine, and degraded the reliability.

    The Z-3 word was twenty-two bits long, including sign, characteristic, and mantissa. A typical Z-3 add took about 0.7 seconds; it could multiply or divide two numbers in roughly three seconds. The Z-3 used relays for most of the logic functions. (The clattering Z-3 was said to be exceedingly noisy as well as of questionable reliability.) No matter; Zuse rebuilt his war-torn Z-3 after the war and persisted with the Z4, a somewhat simpler but still relay-based machine. He developed another Zuse machine, the S-1, circa 1942. After the war, Zuse formed a company, Zuse KG, that built and sold some 300 machines in the late 1940s. These machines were variations on the Z4 theme. In 1950, the Leitz Company in Wetzlar, Germany, ordered a large relay computer from Zuse, the Z5. The Z5 sold at that time for some three hundred thousand Deutsche marks. Both the Z4 and the Z5 were moderately successful projects in postwar Germany.


    KONRAD ZUSE POSING WITH HIS REBUILT Z-3

    Konrad Zuse used 2,800 telephone relays in the Z5 to overcome the poor reliability of vacuum tubes. The Z5 was a very large computer, occupying 10 meters by 4.5 meters of floor space. Zuse claimed that the Z5 was the first commercial computer built in Germany (perhaps in Europe) and the biggest relay computer ever constructed (not really a milestone to be proud of). The Z5 was six times faster than the Z4 and had a twelve-word memory, each word containing thirty-five bits. Floating-point numbers were represented using one bit for the sign, seven bits for the exponent, and twenty-seven bits for the mantissa. It also had a dedicated memory storing eight frequently used constants. The Z5 was a three-address machine⁸ with a clock frequency of about 40 Hertz. The Z5 was not a stored-program machine; instead, it employed subroutines (sub-programs) that were called using three punched-tape readers (based on standard 35 mm movie film), apparently activated by each other. The punched-tape readers could read twelve 20-bit combinations per second.
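    The Z5 word layout described above can be sketched as simple bit-field packing. In this illustrative Python fragment, the 1/7/27 field widths come from the text; the field order and the raw, unbiased exponent are assumptions for illustration, not Zuse's actual coding:

```python
def pack_z5_word(sign, exponent, mantissa):
    """Pack a 35-bit word: 1 sign bit, 7 exponent bits, 27 mantissa bits.

    Field widths follow the Z5 description; the field order and the
    absence of an exponent bias are illustrative assumptions only.
    """
    assert sign in (0, 1)
    assert 0 <= exponent < (1 << 7)
    assert 0 <= mantissa < (1 << 27)
    return (sign << 34) | (exponent << 27) | mantissa


def unpack_z5_word(word):
    """Split a 35-bit word back into its three fields."""
    sign = (word >> 34) & 0x1
    exponent = (word >> 27) & 0x7F
    mantissa = word & ((1 << 27) - 1)
    return sign, exponent, mantissa


word = pack_z5_word(1, 64, 12345)
print(unpack_z5_word(word))  # -> (1, 64, 12345)
```

Twelve such 35-bit words, plus eight constants, made up the Z5's entire data store.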

    The Z5 was followed by the SM1 and then the Z11, both relay machines. By 1957, Zuse KG had enough confidence in vacuum tubes to build the Z22. (The jump from the Z11 designation to Z22 was supposed to accompany the technical advance from relays to much faster vacuum-tube-based logic.) The Z22 was a serial machine and was Zuse KG's first stored-program computer, of sorts. The machine contained 600 vacuum tubes and 2,400 diodes. It had a clock frequency of 150 kHz, a word length of 38 bits, and a magnetic drum central memory of 8 kilobits. These machines were, for a time, successfully delivered to universities and research institutes. In 1960, when the Z22R became available, Zuse KG claimed that it was the first machine to use a ferrite core memory (a highly questionable claim). A rebuilt variant, the Z23, is on display at the Computer History Museum in Mountain View, California.

    As an aside, as we have noted, AT&T's powerful Bell Laboratories dabbled in the digital computer world early on, but their contributions were largely in special-purpose, rather than general-purpose, systems. Bell Labs built a few relay calculators along the way. At least one of these, like the Zuse machines, incorporated floating-point arithmetic.

    IBM AND THE HARVARD MACHINES (1940-1950)


    THE LEFT SIDE OF HARVARD MARK I ON DISPLAY

    By the late 1930s, IBM and Harvard had formed a collaboration to design and build an Automatic Sequence Controlled Calculator, or ASCC. The design of this machine, perhaps better known as the Harvard Mark I, was completed in 1944. (During WWII, most IBM facilities were working on military-related products ranging from rifles to engines to bombsights. In 1943, IBM Laboratories perfected a Vacuum Tube Multiplier, a critical component in the electronics of the day that was useful in the construction of early analog computers.)

    The Harvard-IBM Mark I collaboration was the first time that IBM became involved with any significant computer project. It appears that IBM contributed little of the technology leading to the Mark I design, but supplied a good deal of the development money. The principal designer of the Mark I was Howard Aiken, a graduate of the University of Wisconsin who had received his PhD from Harvard in 1939.

    Like the Colossi, the Mark I was a monster. It was built from more than 750,000 components, was over 50 feet long and 8 feet tall, and weighed some 5 tons. When operating, it continuously consumed 25 kilowatts of power. The Mark I was made up of switches, relays, rotating shafts, and clutches, and was said to have sounded like a roomful of Madame Defarges knitting at top speed. The basic calculating units (counters) had to be synchronized mechanically; a 50-foot-long shaft powered by a five-horsepower electric motor drove them all. The Mark I could store all of seventy-two numbers, each one twenty-three decimal digits long. It could perform three additions or subtractions per second. Multiplications took six seconds each. Division was even slower, needing something like fifteen seconds! Programming sequences were controlled by instructions read one at a time from punched tape. Program loops were executed by rerunning that part of the program tape, often by looping the tape itself. Another option often used to create program loops was simply to duplicate the instructions on the program tape for as many passes as required. The Mark I had no conditional transfer, so the number of times a particular program sequence was executed had to be predetermined. In spite of its size, weight, complex internal hardware, and the comments of a few folks, the Mark I doesn't qualify as a stored-program machine.
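    The cost of having no conditional transfer is easy to illustrate. In this Python sketch (the body() function is a hypothetical stand-in for one pass of a computation), a Mark I style "loop" is simply the same sequence written out a predetermined number of times, just as programmers duplicated sections of instruction tape:

```python
def body(state):
    """One pass of the repeated computation (purely illustrative)."""
    return state + 3


# A modern conditional loop decides at run time when to stop...
state = 0
for _ in range(4):
    state = body(state)

# ...whereas the Mark I required the pass count to be fixed in
# advance, with each pass written out (or the tape itself rerun):
unrolled = body(body(body(body(0))))

print(state, unrolled)  # -> 12 12
```

Both forms compute the same result; the difference is that the unrolled version, like a Mark I tape, cannot change its iteration count based on intermediate results.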

    Howard Aiken also led the designs of the Harvard Mark II and III. These were relay computers of sorts, and the US Navy funded both. The Mark II incorporated thirteen thousand relays. The naval foray into the computer world during World War II initially brought Dr. Aiken to the Naval Proving Ground at Dahlgren, to be accompanied later by the Mark II Aiken Relay Calculator. (The installation of that machine represented a naval research milestone leading to the later receipt at Dahlgren of the Mark III, probably the most sophisticated computer in the world at the time.)

    The Mark II relays substantially reimplemented the mechanical counters of the Mark I. The Mark II was perhaps three times faster than the Mark I, depending upon the problem running at the time. The Mark II could add eight numbers per second and form a product about once a second. It also implemented floating-point arithmetic and incorporated macro operations (square root, logarithms, exponentials, and trigonometric functions) using specially designed built-in hardware. Like the Mark I, the Mark II read its program data sequentially from paper tape. Other relay computers, the Mark 22 (a.k.a. the Bell Model IV) and the Bell Models V and VI, were operating at about the same time. (All of these machines were busy working for the military on sets of design and ballistic trajectory problems.)

    A story circulating through the computer world in the 1950s told of an individual at the Naval Research Laboratory (NRL) who had a uniquely thorough understanding of the idiosyncrasies of relay computers of the era. It seemed that he could nearly always put them right with a minimum of fuss. Unfortunately, he had a drinking problem, probably brought on by spending his waking hours with the clattering beasts. When he was caught drunk on the job, the navy brass summarily fired him. The NRL computer crashed soon after. After several unsuccessful attempts by the existing staff to get it going, and much discussion up the chain of command, he was finally rehired. First, though, as the story went, he insisted on, and got, more money. If true, he was one of very few individuals to garner a pay raise for drinking on the job, but perhaps it's just another urban computer legend.

    The Harvard Mark III, or ADEC (Aiken Dahlgren Electronic Calculator), was an electronic redesign of the Mark II. The Mark III used the services of some 4,000-5,000 vacuum tubes, 3,000 relays, and 1,500 semiconductor diodes. Even with 3,000 Mark II-style relays still clattering, most of the Mark II relay functions had been replaced by vacuum tubes. The Mark III system was completed in the fall of 1949 and delivered to Dahlgren in March of 1950. The Mark III formed the cornerstone of the navy's computer research activities for some time.

    Grace Hopper, instrumental in the design of the COBOL programming system/language, was a key programmer of the Mark III. She was a naval officer holding a PhD from Yale (1934). She was one of the very few people, men or women, to hold the rank of commodore in recent times. When the Navy mothballed its commodores, she became a rear admiral. She was well respected by her peers. Although I'm sure she wouldn't care, I've even forgiven her for her work on COBOL.

    JOHN VON NEUMANN (1903-1956)

    John von Neumann was a mental giant among men. His contributions to today's store of technical knowledge are immense. His computer architectural treatise, done in conjunction with Eckert and Mauchly, formed the basis for modern digital machines. Born in Hungary in 1903, at a time of turmoil in Eastern Europe, into a quite wealthy family, he entered the University of Berlin in 1921 as a chemistry student, moving on to Zurich in 1923, where he received outstanding honors in mathematics examinations even though he hadn't attended mathematics classes. He earned a diploma in chemical engineering from the Technische Hochschule in Zurich and a doctorate in mathematics from the University of Budapest, both in 1926! He lectured and continued his studies under Hilbert, of sets and matrix fame, before coming to the United States in 1930 as a visiting lecturer at Princeton. He was tendered a professorship there in 1931. In 1933, he joined the Institute for Advanced Study along with other notables such as Albert Einstein and J. W. Alexander, the topologist.

    Von Neumann’s contributions to science and engineering included major advancements of set theory, quantum mechanics, statistical mechanics, game theory, hydrodynamics, and logic design.⁹ He died relatively young of cancer at age fifty-three. There is a continuing controversy about the use of the term von Neumann Architecture. Some of the early ENIAC development work, logic, or architectural descriptions may have predated the von Neumann EDVAC paper describing a stored-program computer. No matter, von Neumann’s contributions to computer architecture are so many and so important, that whatever notions might have been suggested by others, no one should ever begrudge John von Neumann any of his honors.¹⁰

    ENIAC (1942-1955)


    ENIAC

    We’re now on the threshold of the modern style, but still handcrafted, stored-program digital computer. The ENIAC (Electronic Numerical Integrator and Computer) was conceived in 1942. The enabling contract with the US Army Ballistics Research Laboratory (BRL) was signed on 31 March 1943. The Army wanted a machine to compute firing tables rapidly and accurately, and was willing to pay for it. ENIAC was completed in 1945 or perhaps 1946. Leaders of the ENIAC design team were Chief Engineer J. Eckert and Principal Consultant J. Mauchly, both of the University of Pennsylvania Moore School. Like most of the other first-round electronic computing machines, ENIAC was a monster vacuum tube computer using over 18,000 vacuum tubes augmented by 1,500 relays and over 7,000 diodes. It weighed about 27 tons, and needed more than 1,800 square feet of floor space to be comfortable. ENIAC was formally dedicated at the Moore School of Electrical Engineering, University of Pennsylvania on 15 February 1946. It was accepted by the US Army Ordnance Corps later in July. During the rest of 1946, ENIAC remained in operation at the Moore School. The dismantling of ENIAC for shipment to Aberdeen began very late in 1946. The first pieces arrived at Aberdeen Proving Ground in January 1947. ENIAC was finally up and running in its new home in August 1947. ENIAC was used nearly continuously from its acceptance, providing solutions for diverse military research projects until it was shut down for the last time in October of 1955.

    ENIAC wasn’t a stored-program computer, but it was getting close. The internal memory was limited to register storage that held on to intermediate solutions just long enough to perform the next calculation. ENIAC used ten-digit ring counters for storage and arithmetic operations. Arithmetic was performed by counting pulses, and generating carries when the ring counters wrapped around (overflowed). That rather quirky part of ENIAC design was based on the notion of simulating the operation of the counting wheels on a mechanical adding machine.

    ENIAC had twenty ten-digit signed accumulators to sop up the ring counter outputs. These accumulators used pseudo ten’s complement¹¹ arithmetic (ENIAC still operated in binary, no matter how the binary digits were grouped). ENIAC could perform five thousand addition or subtraction operations between any two of the accumulators every second. It was even possible to connect several of the accumulators to run simultaneously, speeding up solutions by taking advantage of the parallel operation. Up to three accumulators could also be logically hooked end-to-end for double and triple precision operations. ENIAC programming was done via prewiring of patch boards, presenting a challenge to the best of engineers. Once connected, the hard-wired program was fixed and couldn’t be modified while the program was running. (This had to have been a big contributor to the proverbial nightmarish ENIAC program checkout sequences.) Making things worse yet, the first ENIAC input/output hardware was limited to an IBM card reader and a tape punch. It was a slow pain for people to communicate with ENIAC, and vice versa.
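    The pseudo ten's complement trick can be shown in a few lines. In this Python sketch (the helper names are illustrative, not ENIAC's), a negative value in a ten-digit accumulator is stored as 10^10 minus its magnitude, so subtraction reduces to addition with the carry out of the top digit discarded:

```python
DIGITS = 10
MOD = 10 ** DIGITS  # a ten-digit accumulator works modulo 10^10


def tens_complement(n):
    """Ten's complement of a ten-digit value: how a negative is encoded."""
    return (MOD - n) % MOD


def acc_add(a, b):
    """Add two accumulator values; any carry out of the top digit is dropped."""
    return (a + b) % MOD


# Subtraction becomes addition of the ten's complement:
# 42 - 17 == 42 + complement(17), modulo 10^10.
result = acc_add(42, tens_complement(17))
print(result)  # -> 25
```

A negative result simply stays in complemented form: 17 - 42 comes out as 9999999975, the ten's complement of 25. The binary equivalent of this trick, two's complement, is how virtually every machine since has handled signed arithmetic.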

    At first, a few of the eighteen thousand tubes failed almost every day, leaving ENIAC nonfunctional about half the time. This made ENIAC’s first years at Aberdeen difficult ones for the operating and maintenance crews. Tube failures typically occurred during warm-up and cool-down periods, when the tube heaters and cathodes were under maximum thermal stress. The operators soon took a page from the Colossus book and left power on, reducing ENIAC’s tube failures to a somewhat more acceptable level. At the time, ENIAC represented the largest collection of interconnected electronic circuitry in existence on the planet. The vast majority of its thousands of components all had to work correctly and simultaneously. Major component improvement, preventive maintenance, and failure assessment/testing programs were surely called for.

    Prudently, Aberdeen engineers introduced analysis programs to point up flaws leading to system modifications and component replacements. Tube types were life-tested. Statistical data on failures were compiled, leading to manufacturing and design changes by tube fabricators. Power fluctuations and failures made continuous operation directly off the transformer mains virtually impossible. The substantial quantity of heat dissipated from ENIAC into the warm, humid Aberdeen atmosphere created a heat-removal problem of major proportions. It
