COMPUTERS—THE MACHINES WE THINK WITH

D. S. HALACY, JR.

HARPER & ROW, PUBLISHERS
NEW YORK, EVANSTON, AND LONDON

Original publication: New York: Harper & Row, 1962

------------------------------------------------------------------------

Contents

1. Computers—The Machines We Think With
2. The Computer’s Past
3. How Computers Work
4. Computer Cousins—Analog and Digital
5. The Binary Boolean Bit
6. The Electronic Brain
7. Uncle Sam’s Computers
8. The Computer in Business and Industry
9. The Computer and Automation
10. The Academic Computer
11. The Road Ahead

------------------------------------------------------------------------

1: Computers—The Machines We Think With

While you are reading this sentence, an electronic computer is performing 3 million mathematical operations! Before you read this page, another computer could translate it and several others into a foreign language. Electronic “brains” are taking over chores that include the calculation of everything from automobile parking fees to zero hour for space missile launchings.

Despite bitter winter weather, a recent conference on computers drew some 4,000 delegates to Washington, D.C., indicating the importance and scope of the new industry. The 1962 domestic market for computers and associated equipment is estimated at just under $3 billion, with more than 150,000 people employed in manufacture, operation, and maintenance of the machines. In the short time since the first electronic computer made its appearance, these thinking machines have made such fantastic strides in so many different directions that most of us are unaware how much our lives are already being affected by them.
Banking, for example, employs complex machines that process checks and handle accounts so much faster than human bookkeepers that they do more than an hour’s work in less than thirty seconds.

[Illustration: _General Electric Co., Computer Dept._ Programmer at console of computer used in electronic processing of bank checking accounts.]

Our government is one of the largest users of computers and “data-processing machines.” The census depends on such equipment, and census work played a part in the development of early mechanical types of computers when Hollerith invented his punched-card system many years ago. In another application, the post office uses letter readers that scan addresses and sort mail at speeds faster than the human eye can keep up with. Many magazines have put these electronic readers to work whizzing through mailing lists.

[Illustration: _General Electric Co., Computer Dept._ Numbers across bottom of check are printed in magnetic ink and can be read by the computer.]

In Sweden, writer Astrid Lindgren received additional royalties of 9,000 _kronor_ for one year because of library loans. Since this was based on 850,000 total loans of her books from thousands of schools and libraries, the bookkeeping was possible only with an electronic computer.

Computers are beginning to take over control of factories, steel mills, bakeries, chemical plants, and even the manufacture of ice cream. In scientific research, computers are solving mathematical and logical problems so complex that they would go forever unsolved if men had to do the work. One of the largest computing systems yet designed, incorporating half a million transistors and millions of other parts, handles ticket reservations for the airlines. Others do flight planning and air traffic control itself. Gigantic computerized air defense systems like SAGE and NORAD help guard us from enemy attack.

When John Glenn made his space flight, giant computers on the ground made the vital calculations to bring him safely back. Tiny computers in space vehicles themselves have proved they can survive the shocks of launching and the environment of space. These airborne computers make possible the operation of Polaris, Atlas, and Minuteman missiles. Such applications are indicative of the scope of computer technology today; the ground-based machines are huge, taking up rooms and even entire buildings, while those tailored for missiles may fit in the palm of the hand. One current military project is such an airborne computer, the size of a pack of cigarettes yet able to perform thousands of mathematical and logical operations a second.

Computers are a vital part of automation, and already they are running production lines and railroads, making mechanical drawings and weather predictions, and figuring statistics for insurance companies as well as odds for gamblers. Electronic machines permit the blind to read a page of ordinary type, and also control material patterns in knitting mills. This last use is of particular interest since it represents almost a full circle in computer science. Oddly, it was the loom that inspired the first punched cards invented and used to good advantage by the French designer Jacquard. These homely forerunners of stored information sparked the science that now returns to control the mills. Men very wisely are now letting computers design other computers, and in one recent project a Bell Laboratories computer did a job in twenty-five minutes that would have taken a human designer a month.
Even more challenging are the modern-day “robots” performing precision operations in industrial plants. One such, called “Unimate,” is simply guided through the mechanical operations one time, and can then handle the job alone. “TransfeRobot 200” is already doing assembly-line work in dozens of plants.

The hope has been expressed that computer extension of our brainpower by a thousandfold would give our country a lead over potential enemies. This is a rather vain hope, since the United States has no corner on the computer market. There is worldwide interest in computers, and machines are being built in Russia, England, France, Germany, Switzerland, Holland, Sweden, and Japan, as well as in Africa and other parts of the world. A remarkable computer in Japan recognizes 8,000 colors and analyzes them instantly. Computer translation from one language to another has been mentioned, and work is even being done on machines that will permit us to speak English into a phone in this country and have it come out French, or whatever we will, overseas! Of course, computers have a terminology all their own too; words like analog and digital, memory cores, clock rates, and so on.

The broad application of computers has been called the “second industrial revolution.” What the steam engine did for muscles, the modern computer is beginning to do for our brains. In their slow climb from caveman days, humans have encountered ever more problems; one of the biggest of these problems eventually came to be merely how to solve all the other problems. At first man counted on his fingers, and then his toes. As the problems grew in size, he used pebbles and sticks, and finally beads. These became the abacus, a clever calculating device still in constant use in many parts of the world. Only now, with the advent of low-cost computers, are the Japanese turning from the _soroban_, their version of the abacus.

The large-scale computers we are becoming familiar with are not really as new as they seem. An Englishman named Babbage built what he called a “difference engine” way back in 1831. This complex mechanical computer cost a huge sum even by today’s standards, and although it was never completed to Babbage’s satisfaction, it was the forerunner and model for the successful large computers that began to appear a hundred years later. In the meantime, of course, electronics has come to the aid of the designer. Today, computer switches operate at billionths-of-a-second speeds and thus make possible the rapid handling of quantities of work like the 14 billion checks we Americans wrote in 1961.

There are dozens of companies now in the computer manufacturing field, producing a variety of machines ranging from less than a hundred dollars in purchase price to rental fees of $100,000 a month or more. Even at these higher prices the big problem of some manufacturers is to keep up with demand. A $1 billion market in 1960, the computer field is predicted to climb to $5 billion by 1965, and after that it is anyone’s guess. Thus far all expert predictions have proved extremely conservative.

The path of computer progress is not always smooth. Recently a computer which had been installed on a toll road to calculate charges was so badly treated by motorists it had to be removed. Another unfortunate occurrence happened on Wall Street. A clever man juggled the controls of a large computer used in stock-market work and “made” himself a quarter of a million dollars, though he ultimately landed in jail for his illegal computer button pushing.
Interestingly, there is one corrective institution which already offers a course in computer engineering for its inmates. So great is the impact of computers that lawyers recently met for a three-day conference on the legal aspects of the new machines. Points taken up included: Can business records on magnetic tape or other storage media be used as evidence? Can companies be charged with mismanagement for not using computers in their business? How can confidential material be handled satisfactorily on computers? Along with computing machines a whole new technology is growing. Universities and colleges—even high schools—are teaching courses in computers. And the computer itself is getting into the teaching business too. The “teaching machine” is one of the most challenging computer developments to come along so far. These mechanical professors range from simple “programmed” notebooks, such as the Book of Knowledge and Encyclopedia Britannica are experimenting with, to complex computerized systems such as that developed by U.S. Industries, Inc., for the Air Force and others. The computer as a teaching machine immediately raises the question of intelligence, and whether or not the computer has any. Debate waxes hot on this subject; but perhaps one authority was only half joking when he said that the computer designer’s competition was a unit about the size of a grapefruit, using only a tenth of a volt of electricity, with a memory 10,000 times as extensive as any existing electronic computer. This is a brief description of the human brain, of course. When the first computers appeared, those like ENIAC and BINAC, fiction writers and even some science writers had a field day turning the machines into diabolical “brains.” Whether or not the computer really thinks remains a controversial question. Some top scientists claim that the computer will eventually be far smarter than its human builder; equally reputable authorities are just as sure that no computer will ever have an original thought in its head. Perhaps a safe middle road is expressed with the title of this book; namely that the machine is simply an extension of the human brain. A high-speed abacus or slide rule, if you will; accurate and foolproof, but a moron nonetheless. There are some interesting machine-brain parallels, of course. Besides its ability to do mathematics, the computer can perform logical reasoning and even make decisions. It can read and translate; remembering is a basic part of its function. Scientists are now even talking of making computers “dream” in an attempt to come up with new ideas! More similarities are being discovered or suggested. For instance, the interconnections in a computer are being compared with, and even crudely patterned after, the brain’s neurons. A new scientific discipline, called “bionics,” concerns itself with such studies. Far from being a one-way street, bionics works both ways so that engineers and biologists alike benefit. In fact, some new courses being taught in universities are designed to “bridge the gap between engineering and biology.” At one time the only learning a computer had was “soldered in”; today the machines are being “forced” to learn by the application of punishment or reward as necessary. “Free” learning in computers of the Perceptron class is being experimented with. 
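The “punishment or reward” training just mentioned can be given a concrete, if much simplified, shape. The sketch below, in the modern programming language Python, applies a perceptron-style rule: whenever the machine answers wrongly, every active connection is nudged toward the correct answer. It is only an illustration of the principle, not the circuitry of any actual Perceptron, and the little teaching task (the logical “and” of two signals) is our own choice for the example.

    # "Reward and punishment" learning in miniature: when the machine's
    # answer is wrong, every active connection is nudged toward the
    # correct answer.
    def train(samples, steps=25, rate=1.0):
        n = len(samples[0][0])
        weights = [0.0] * n
        threshold = 0.0
        for _ in range(steps):
            for inputs, desired in samples:
                total = sum(w * x for w, x in zip(weights, inputs))
                answer = 1 if total > threshold else 0
                error = desired - answer       # +1 is reward, -1 punishment
                weights = [w + rate * error * x
                           for w, x in zip(weights, inputs)]
                threshold -= rate * error
        return weights, threshold

    # Teach the machine that both signals must be "on" (logical "and").
    samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    weights, threshold = train(samples)
    for inputs, desired in samples:
        total = sum(w * x for w, x in zip(weights, inputs))
        print(inputs, "->", 1 if total > threshold else 0)

After a handful of passes through the four examples the weights settle, and the machine answers all four correctly; nothing was “soldered in” except the rule for adjusting itself.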
These studies, and statements like those of renowned scientist Linus Pauling that he expects a “molecular theory” of learning in human beings to be developed, are food for thought as we consider the parallels our electronic machines share with us. Psychologists at the University of London foresee computers not only training humans, but actually watching over them and predicting imminent nervous breakdowns in their charges! [Illustration: _Cornell Aeronautical Laboratory_ Bank of “association” units in Mark I Perceptron, a machine that “learns” from experience. ] To demonstrate their skill many computers play games of tick-tack-toe, checkers, chess, Nim, and the like. A simple electromechanical computer designed for young people to build can be programmed to play tick-tack-toe expertly. Checker- and chess-playing computers are more sophisticated, many of them learning as they play and capable of an occasional move classed as brilliant by expert human players. The IBM 704 computer has been programmed to inspect the results of its possible decisions several moves ahead and to select the best choice. At the end of the game it prints out the winner and thanks its opponent for the game. Rated as polite, but only an indifferent player by experts, the computer is much like the checker-playing dog whose master scoffed at him for getting beaten three games out of five. Chess may well be an ultimate challenge for any kind of brain, since the fastest computer in operation today could not possibly work out all the possible moves in a game during a human lifetime! As evidenced in the science-fiction treatment early machines got, the first computers were monsters at least in size. Pioneering design efforts on machines with the capacity of the brain led to plans for something roughly the size of the Pentagon, equipped with its own Niagara for power and cooling, and a price tag the world couldn’t afford. As often seems to happen when a need arises, though, new developments have come along to offset the initial obstacles of size and cost. One such development was the transistor and other semiconductor devices. Tiny and rugged, these components require little power. With the old vacuum-tubes replaced, computers shrank immediately and dramatically. On the heels of this micro-miniaturization have come new and even smaller devices called “ferrite cores” and “cryotrons” using magnetism and supercold temperatures instead of conventional electronic techniques. As a result, an amazing number of parts can be packed into a tiny volume. So-called “molecular electronics” now seems to be a possibility, and designers of computers have a gleam in their eyes as they consider progress being made toward matching the “packaging density” of the brain. This human computer has an estimated 100 billion parts per cubic foot! We have talked of reading and translating. Some new computers can also accept voice commands and speak themselves. Others furnish information in typed or printed form, punched cards, or a display on a tube or screen. Like us, the computer can be frustrated by a task beyond its capabilities. A wrong command can set its parts clicking rapidly but in futile circles. Early computers, for example, could be panicked by the order to divide a number by zero. The solution to that problem of course is infinity, and the poor machine had a hard time trying to make such an answer good. [Illustration: _Aeronutronic Division, Ford Motor Co._ This printed-circuit card contains more than 300 BIAX memory elements. 
Multiples of such cards mounted in computers store large amounts of information.]

There are other, quainter stories like that of the pioneer General Electric computer that simply could not function in the dark. All day long it hummed efficiently, but problems left with it overnight came out horribly botched for no reason that engineers could discover. At last it was found that a light had to be left burning with the scary machine! Neon bulbs in the computer were affected enough by light and darkness to upset the delicate electronic balance of the machine.

Among the computer’s unusual talents is the ability to compose music. Such music has been published and is of a quality to give rise to thoughtful speculation that perhaps great composers are simply good selectors of music. In other words, all the combinations of notes and meter exist: the composer just picks the right ones. No less an authority than Aaron Copland suggests that “we’ll get our new music by feeding information into an electronic computer.” Not content with merely writing music, some computers can even play a tune. At Christmas time, carols are rendered by computers specially programmed for the task. The result is not unlike a melody played on a pipe organ. In an interesting switch of this musical ability on the part of the machine, Russian engineers check the reliability of their computers by having them memorize Mozart and Grieg. Each part of the complex machines is assigned a definite musical value, and when the composition is “played back” by the computer, the engineer can spot any defects existing in its circuitry. Such computer maintenance would seem to be an ideal field for the music lover.

In a playful mood, computers match pennies with visitors, explain their inner workings as they whiz through complex mathematics, and are even capable of what is called heuristic reasoning. This amounts to playing hunches to reach short-cut solutions to otherwise unsolvable problems. A Rand Corporation computer named JOHNNIAC demonstrated this recently. It was given some basic axioms and asked to prove some theorems. JOHNNIAC came up with the answers, and in one case produced a proof that was simpler than that given in the text. As one scientist puts it, “If computers don’t really think, they at least put on a pretty creditable imitation of the real thing.”

Computers are here to stay; this has been established beyond doubt. The only question remaining is how fast the predictions made by dreamers and science-fiction writers—and now by sober scientists—will come to be a reality. When we consider that in the few years since the 1953 crop of computers, their capacity and speed have been increased more than fiftyfold and are expected to jump another thousandfold in two years, these dreams begin to sound more and more plausible.

One quite probable use for computers is medical diagnosis and prescription of treatment. Electronic equipment can already monitor an ailing patient, and send an alarm when help is needed. We may one day see computers with a built-in bedside manner aiding the family doctor. The accomplished inroads of computing machines in business are as nothing to what will eventually take place. Already computer “game-playing” has extended to business management, and serious executives participate to improve their administrative ability. We speak of decision-making machines; business decisions are logical applications for this ability.
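The look-ahead play described earlier for the checker- and chess-playing machines, and borrowed by the executives for their management games, can be sketched in modern programming terms. The few lines below show only the principle (score each possible move by assuming the opponent will answer with his own best move, and so on down the tree); they are not the actual program of the IBM 704 or any other machine, and the little game tree is invented for the example.

    # A toy sketch of "looking several moves ahead." Each branch of the
    # tree is a legal move; each leaf is a final score, higher favoring us.
    def best_value(tree, our_turn=True):
        if isinstance(tree, (int, float)):
            return tree                      # a final position: its score
        values = [best_value(branch, not our_turn) for branch in tree]
        return max(values) if our_turn else min(values)

    # Two moves ahead: we choose, then the opponent replies.
    game = [
        [3, 12, 8],    # outcomes if we make our first possible move
        [2, 4, 6],     # our second possible move
        [14, 5, 2],    # our third possible move
    ]
    scores = [best_value(branch, our_turn=False) for branch in game]
    print("move values:", scores)                        # [3, 2, 2]
    print("best move:", scores.index(max(scores)) + 1)   # move 1

Against a perfect opponent, the first move guarantees the best obtainable result. The chess player’s trouble is that the full tree is unimaginably larger than this one, which is why the machines must prune their searches and play hunches.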
Computers have been given the job of evaluating personnel and assigning salaries on a strictly logical basis. Perhaps this is why in surveys questioning increased use of the machines, each executive level in general tends to rate the machine’s ability just below its own. Other games played by the computer are war games, and computers like SAGE are well known. This system not only monitors all air activity but also makes decisions, assigns targets, and then even flies the interceptor planes and guided missiles on their missions.

Again in the sky, commercial air traffic has perhaps increased beyond the limit of human ability to control it. Computers are beginning to take over here too, planning flights and literally flying the planes. Surface transport can also be computer-controlled. Railroads are beginning to use computer techniques, and automatic highways are inevitable. Ships also benefit, and special systems coupled to radar can predict courses and take corrective action when necessary. Men seem to have temporarily given up trying to control the weather, but using computers, meteorologists can take the huge mass of data from all over the world and make predictions rapidly enough to be of use.

We have talked of the computer’s giant strides in banking. Its wide use in stores is not far off. An English computer firm has designed an automatic supermarket that assembles ordered items, prices them, and delivers them to the check stand. At the same time it keeps a running inventory, price record, and profit and loss statement, besides billing the customer with periodic statements. The storekeeper will have only to wash the windows and pay his electric power bill. Even trading stamps may be superseded by computer techniques that keep track of a customer’s purchases and credit him with premiums as he earns them. Credit cards have helped pioneer computer use in billing; it is not farfetched to foresee the day when we are issued a lifetime, all-inclusive credit card—perhaps with our birth certificate!—a card with our thumbprint on it, that will buy our food, pay our rent and utilities and other bills. A central computer system will balance our expenses against deposits and from time to time let us know how we stand financially.

As with many other important inventions, the computer and its technology were spurred by war and are aided now by continuing threats of war. It is therefore pleasant to think on the possibilities of a computer system “programmed” for peace: a gigantic, worldwide system whose input includes all recorded history of all nations, all economic and cultural data, all weather information and other scientific knowledge. The output of such a machine hopefully would be a “best plan” for all of us. Such a computer would have no ax to grind and no selfish interests unless they were fed into it. Given all the facts, it would punch out for us a set of instructions that would guarantee us the best life possible.

This has long been a dream of science writers. H. G. Wells was one of these, suggesting a world clearinghouse of information in his book _World Brain_ written in the thirties. In this country, scientist Vannevar Bush suggested a similar computer called “Memex” which could store huge amounts of data and answer questions put to it. The huge amounts of information—books, articles, speeches, and records of all sorts—are beginning to make an efficient information retrieval system absolutely necessary.
Many cases have been noted in which much time and effort are spent on a project which has already been completed but then has become lost in the welter of literature crammed into libraries. The computer is a logical device for such work; in a recent test such a machine scored 86 per cent in its efforts to locate specific data on file. Trained workers rated only 38 per cent in the same test!

[Illustration: _The Boeing Co._ Engineers using computers to solve complex problems in aircraft design.]

The science of communication is advancing along with that of computers, and can help make the dream of a worldwide “brain” come true. Computers in distant cities are now linked by telephone lines or radio, and high-speed techniques permit the transmission of many thousands of words per _second_ across these “data links.” An interesting sidelight is the fact that an ailing computer can be hooked by telephone line to a repair center many miles away and its ailments diagnosed by remote control. Communications satellites that are soon to be dotting the sky like tiny moons may well play a big part in computing systems of the future. Global weather prediction and worldwide coordination of trade immediately come to mind.

While we envision such far-reaching applications, let’s not lose sight of the possibilities for computer use closer to home—right in our homes, as a matter of fact. Just as early inventors of mechanical power devices did not foresee the day when electric drills and saws for hobbyists would be commonplace and the gasoline engine would do such everyday chores as cutting the grass in our yards, the makers of computers today cannot predict how far the computer will go in this direction. Perhaps we may one day buy a “Little Dandy Electro-Brain” and plug it into the wall socket for solving many of the everyday problems we now often guess wrong on.

[Illustration: _Royal McBee Corp._ Students at Staples High School, Westport, Connecticut, attend a summer session to learn the techniques of programming and operating an electronic computer.]

[Illustration: _The Saturday Evening Post_ “Herbert’s been replaced by an electronic brain—one of the simpler types.”]

Some years ago a group of experts predicted that by 1967 the world champion chess player would be an electronic computer. No one has yet claimed that we would have a president of metal and wire, but some interesting signposts are being put up. Computers are now used widely to predict the result of elections. Computers count the votes, and some have suggested that computers could make it possible for us to vote at home. The government is investigating the effectiveness of a decision-making computer as a stand-by aid for the President in this complex age we are moving into. No man has the ability to weigh every factor and to make decisions affecting the world. Perhaps a computer can serve in an advisory capacity to a president or to a World Council; perhaps—

It is comforting to remember that men will always tell the computer what it is supposed to do. No computer will ever run the world any more than the cotton gin or the steam engine or television runs the world. And in an emergency, we can always pull out the wallplug, can’t we?

------------------------------------------------------------------------

“_History is but the unrolled scroll of prophecy._” —James A.
Garfield

2: The Computer’s Past

Although it seemed to burst upon us suddenly, the jet airplane can trace its beginnings back through the fabric wings of the Wrights to the wax wings of Icarus and Daedalus, and the steam aeolipile of Hero in ancient Greece. The same thing is true of the computer, the “thinking machine” we are just now becoming uncomfortably aware of. No brash upstart, it has a long and honorable history.

Naturalists tell us that man is not the only animal that counts. Birds, particularly, also have an idea of numbers. Birds, incidentally, use tools too. We seem to have done more with the discoveries than our feathered friends; at least no one has yet observed a robin with a slide rule or a snowy egret punching the controls of an electronic digital computer. However, the very notion of mere birds being tool and number users does give us an idea of the antiquity and lengthy heritage of the computer.

The computer was inevitable when man first began to make his own problems. When he lived as an animal, life was far simpler, and all he had to worry about was finding game and plants to eat, and keeping from being eaten or otherwise killed himself. But when he began to dabble in agriculture and the raising of flocks, when he began to think consciously and to reflect about things, man needed help. First came the hand tools that made him more powerful, the spears and bows and arrows and clubs that killed game and enemies. Then came the tools to aid his waking brain.

Some 25,000 years ago, man began to count. This was no mean achievement, the dim, foggy dawning of the concept of number, perhaps in the caves in Europe where the walls have been found marked with realistic drawings of bison. Some budding mathematical genius in a skin garment only slightly shaggier than his mop of hair stared blinking at the drawings of two animals and then dropped his gaze to his two hands. A crude, tentative connection jelled in his inchoate gray matter and he shook his head as if it hurt. It was enough to hurt, this discovery of “number,” and perhaps this particular pioneer never again put two and two together. But others did; if not that year, the next.

Armed with his grasp of numbers, man didn’t need to draw two mastodons, or sheep, or whatever. Two pebbles would do, or two leaves or two sticks. He could count his children on his fingers—we retain the expression “a handful” to this day, though often our children are another sort of handful. Of course, the caveman did not of a sudden do sums and multiplications. When he began to write, perhaps 20,000 years later, he had formed the concept of “one,” “two,” “several,” and “many.”

Besides counting his flock and his children, and the number of the enemy, man had need for counting in another way. There were the seasons of the year, and a farmer or breeder had to have a way of reckoning the approach of new life. His calendar may well have been the first mathematical device sophisticated enough to be called a computer.

It was natural that numbers be associated with sex. The calendar was related to the seasons and the bearing of young. The number three, for example, took on mystic and potent connotation, representing as it did man’s genitals. Indeed, numbers themselves came quaintly to have sex. One, three, and the other odd numbers were male; the symmetrical, even numbers logically were female. The notion that man used the decimal system because of his ten fingers and toes is general, but it was some time before this refinement took place.
Some early peoples clung to a simpler system with a base of only two; and interestingly a tribe of Australian aborigines counts today thus: _enea_ (1), _petchaval_ (2), _enea petchaval_ (3), _petchaval petchaval_ (4). Before we look down our noses at this naïve system, let us consider that high-speed electronic computers use only two values, 1 and 0. But slowly symbols evolved for more and more numbers, numbers that at first were fingers, and then perhaps knots tied in a strip of hide. This crude counting aid persists today, and cowboys sometimes keep rough tallies of a herd by knotting a string for every five that pass. Somehow numbers took on other meanings, like those that figure in courtship in certain Nigerian tribes. In their language, the number six also means “I love you.” If the African belle is of a mind when her boyfriend tenderly murmurs the magic number, she replies in like tone, “Eight!”, which means “I feel the same way!” From the dawn of history there have apparently been two classes of us human beings, the “haves” and the “have nots.” Nowadays we get bills or statements from our creditors; in early days, when a slate or clay tablet was the document, a forerunner of the carbon copy or duplicate paper developed. Tallies were marked for the amount of the debt, the clay tablet was broken across the marks, and creditor and debtor each took half. No chance for cheating, since a broken half would fit only the proper mate! Numbers at first applied only to discrete, or distinctly separate, things. The scratches on a calendar, the tallies signifying the count of a flock; these were more easily reckoned. The idea of another kind of number inspired the first clocks. Here was a monumental breakthrough in mathematics. Nature provided the sunrise that clearly marked the beginning of each day; man himself thought to break the day into “hours,” or parts of the whole. Such a division led eventually to measurement of size and weight. Now early man knew not only how many goats he had, but how many “hands” high they were, and how many “stones” they weighed. This further division ordained another kind of mechanical computer man must someday contrive—the analog. The first counting machines used were pebbles or sea shells. For the Stone Age businessman to carry around a handful of rocks for all his transactions was at times awkward, and big deals may well have gone unconsummated for want of a stone. Then some genius hit on the idea of stringing shells on a bit of reed or hide; or more probably the necklace came first as adornment and the utilitarian spotted it after this style note had been added. At any rate, the portable adding machine became available and our early day accountant grew adroit at sliding the beads back and forth on the string. From here it was only a small step, taken perhaps as early as 3000 B.C., to the rigid counter known as the abacus. The word “counter” is one we use in everyday conversation. We buy stock over the counter; some deals are under the counter. We all know what the counter itself is—that wide board that holds the cash register and separates us from the shopkeeper. At one time the cash register _was_ the counter; actually the counting board had rods of beads like the abacus, or at least grooves in which beads could be moved. The totting up of a transaction was done on the “counter”; it is still there although we have forgotten whence came its name. 
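The two-value counting mentioned above is worth a modern aside. An electronic computer renders any count with only the marks 1 and 0, by repeated halving; the short Python sketch below illustrates the idea. The code is ours, offered only as an illustration, and the parallel with the tribal words is loose (the aborigines add pairs, while the machine uses place value), but both schemes get along on just two symbols.

    # Counting with only two symbols, as the electronic computer does.
    # Repeated division by two yields the binary digits of a number.
    def to_binary(n):
        digits = ""
        while n > 0:
            digits = str(n % 2) + digits   # collect remainders, last first
            n //= 2
        return digits or "0"

    for n in range(1, 5):
        print(n, "=", to_binary(n))
    # Prints 1 = 1, 2 = 10, 3 = 11, 4 = 100; compare the aborigines'
    # enea, petchaval, enea petchaval, petchaval petchaval.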
The most successful computer used for the next 5,000 years, the portable counter, or the abacus, is a masterpiece of simplicity and effectiveness. Though only a frame with several rows of beads, it is sophisticated enough that as late as 1947 Kiyoshi Matsuzaki of the Japanese Ministry of Communications, armed with the Japanese version, a _soroban_, bested Private Tom Wood of the U. S. Army of Occupation punching the keys of an up-to-the-minute electric calculating machine in four of five problem categories! Only recently have Japanese banks gone over to modern calculators, and shopkeepers there and in other lands still conduct business by this rule of thumb and forefinger.

[Illustration: The abacus, ancient mechanical computer, is still in use in many parts of the world. Here is the Japanese version, the _soroban_, with problem being set up.]

The name abacus comes to us by way of the Greek _abax_, meaning “dust.” Scholars infer that early sums were done schoolboy fashion in Greece with a stylus on a dusty slate, and that the word was carried over to the mechanical counter. The design has changed but little over the years and all abacuses bear a resemblance. The major difference is the number of beads on each row, determined by the mathematical base used in the particular country. Some in India, for example, were set up to handle pounds and shillings for use in shops. Others have a base of twelve. The majority, however, use the decimal system. Each row has seven beads, with a runner separating one or two beads from the others. Some systems use two beads on the narrow side, some only one; this is a mathematical consideration with political implications, incidentally: The Japanese _soroban_ has the single-bead design; Korea’s _son pan_ uses two. When Japan took over Korea the two-bead models were tabu, and went out of use until the Koreans were later able to win their independence again.

About the only thing added to the ancient abacus in recent years is a movable arrow for marking the decimal point. W. D. Loy patented such a gadget in the United States. Today the abacus remains a useful device, not only for business, but also for the teaching of mathematics to youngsters, who can literally “grasp their numbers.” For that reason it ought also to be helpful to the blind, and as a therapeutic aid for manual dexterity. Apparently caught up in the trend toward smaller computers, the abacus has been miniaturized to the extent that it can be worn as earrings or on a key chain.

Even with mechanical counters, early mathematicians needed written numbers. The caveman’s straight-line scratches gave way to hieroglyphics, to the Sumerian cuneiform “wedges,” to Roman numerals, and finally to Hindu and Arabic. Until the numbers 1, 2, 3, 4, 5, 6, 7, 8, 9, and that most wonderful of all, 0 or zero, computations of any but the simplest type were apt to be laborious and time-consuming. Even though the Romans and Greeks had evolved a decimal system, their numbering was complex. To count to 999 in Greek required not ten numbers but twenty-seven. The Roman number for 888 was DCCCLXXXVIII. Multiplying CCXVII times XXIX yielded an answer of MMMMMMCCXCIII, to be sure, but not without some difficulty. It required an abacus to do any kind of multiplication or division. Indeed, it was perhaps from the abacus that the clue to Arabic simplicity came. The Babylonians, antedating the Greeks, had nevertheless gone them one better in arithmetic by using a “place” system.
In other words, the position of a number denoted its value. The Babylonians simply left an empty space between cuneiform number symbols to mark a vacant position in the system. Sometime prior to 300 B.C. a clever mathematician tired of losing track and punched a dot in his clay tablet to fill the empty space and avoid possible error. The abacus shows these empty spaces on its rows of beads, too, and finally the Hindus combined their nine numerals with a “dot with a hole in it” and gave the mathematical world the zero. In Arabic it was _sifr_, corrupted to _zephirum_ in Latin, and it gives us today both cipher and zero. This enigma of nothingness would one day be used by Leibnitz to prove that God made the world; it would later become half the input of the electronic computer! Meantime, it was developed independently in various other parts of the world, the ancient Mayans being one example.

Impressed as we may be by an electronic computer, it may take some charity to recognize its forebears in the scratchings on a rock. To call the calendar a computer, we must in honesty add a qualifying term like “passive.” The same applies to the abacus despite its movable counters. But time, which produced the simple calendar, also furnished the incentive for the first “active” computers too. The hourglass is a primitive example, as is the sundial. Both had an input, a power source, and a readout.

The clock interestingly ended up with not a decimal scheme, but one with a base of twelve. Early astronomers began conventionally bunching days into groups of ten, and located different stars on the horizon to mark the passage of the ten days. It was but a step from here to use these “decans,” as they were called, to further divide each night itself into segments. It turned out that 12 decans did the trick, and since symmetry was a virtue the daylight was similarly divided by twelve, giving us a day of 24 hours rather than 10 or 20.

From the simple hourglass and the more complex water clocks, the Greeks progressed to some truly remarkable celestial motion computers. One of these, built almost a hundred years before the birth of Christ, was recently found on the sea bottom off the Greek island of Antikythera. It had been aboard a ship which sank, and its discovery came as a surprise to scholars since history recorded no such complex devices for that era. The salvaged Greek computer was designed for astronomical work, showing locations of stars, predicting eclipses, and describing various cycles of heavenly bodies. Composed of dozens of gears, shafts, slip rings, and accurately inscribed plates, it was a computer in the best sense of the word and was not exceeded technically for many centuries.

The Roman engineer Vitruvius made an interesting observation when he said, “All machinery is generated by Nature and the revolution of the universe guides and controls. Our fathers took precedents from Nature—developed the comforts of life by their inventions. They rendered some things more convenient by machines and their revolutions.” Hindsight and language being what they are, today we can make a nice play on the word “revolution” as applied to the machine. The Antikythera computer was a prime example of what Vitruvius was talking about. Astronomy was such a complicated business that it was far simpler to make a model of the many motions than to diagram them or try to retain them in the mind. There were, of course, some die-hard classicists who decried the use of machines to do the work of pure reasoning.
Archytas, who probably invented the screw—or at least discovered its mechanical principle—attempted to apply such mechanical devices to the solving of geometrical problems. For this he was taken to task by purist Plato, who sought to preserve the distinct division between “mind” and “machine.” Yet the syllogistic philosophers themselves, with their major premise, minor premise, and conclusion, were unwittingly setting the stage for a different kind of computer—the logic machine. Plato would be horrified today to see crude decks of cards, or simple electromechanical contrivances, solving problems of “reason” far faster than he could; in fact, as fast as the conditions could be set into them!

------------------------------------------------------------------------

_The Mechanics of Reason_

Aristotle fathered the syllogism, or at least was first to investigate it rigorously. He defined it as a formal argument in which the conclusion follows logically from the premises. There are four common statements of this type, with S standing for the subject and P for the predicate:

    All S is P
    No S is P
    Some S is P
    Some S is not P

Thus, Aristotle might say “All men are mortal” or “No men are immortal” as his major premise. Adding an M (middle term), “Aristotle is a man,” as a minor premise, he could logically go on and conclude “Aristotle, being a man, is thus mortal.” Of course the syllogism unwisely used, as it often is, can lead to some ridiculously silly answers. “All tables have four legs. Two men have four legs. Thus, two men equal a table.” Despite the weaknesses of the syllogism, nevertheless it led eventually to the science of symbolic logic. The pathway was circuitous, even devious at times, but slowly the idea of putting thought down as letters or numbers to be logically manipulated to reach proper conclusions gained force and credence. While the Greeks did not have the final say, they did have words for the subject as they did for nearly everything else.

Let us leave the subject of pure logic for a moment and talk of another kind of computing machine, that of the mechanical doer of work. In the _Iliad_, Homer has Hephaestus, the god of natural fire and metalworking, construct twenty three-wheeled chariots which propel themselves to and fro bringing back messages and instructions from the councils of the gods. These early automatons boasted pure gold wheels, and handles of “curious cunning.” Man has apparently been a lazy cuss from the start and began straightway to dream of mechanical servants to do his chores. In an age of magic and fear of the supernatural his dreams were fraught with such machines that turned into evil monsters. The Hebrew “golem” was made in the shape of man, but without a soul, and often got out of hand. Literature has perpetuated the idea of machines running amok, as the broom in “The Sorcerer’s Apprentice,” but there have been benevolent machines too. Tik-Tok, a latter-day windup man in _The Road to Oz_, could think and talk and do many other things men could do. He was not alive, of course, but he had the saving grace of always doing just what he “was wound up to do.”

Having touched on the subject of mechanical men, let us now return to mechanical logic. Since the Greeks, many men have traveled the road of reason, but some stand out more brightly, more colorfully, than others. Such a standout was the Spanish monk Ramón Lull. Lull was born in 1232.
A court page, he rose in influence, married young, and had two children, but did not settle down to married domesticity. A wildly reckless romantic, he was given to such stunts as galloping his horse into church in pursuit of some lady who caught his eye. One such escapade led to a remorseful re-examination of himself, and dramatic conversion to Christianity. He began to write books in conventional praise of Christ, but early in his writings a preoccupation with numbers appears. His _Book of Contemplation_, for example, actually contains five books for the five wounds of the Saviour, and forty subdivisions for the days He spent in the wilderness. There are 365 chapters for daily reading, plus one for reading only in leap years! Each chapter has ten paragraphs, symbolizing the ten commandments, and three parts to each chapter. These multiplied give thirty, for the pieces of silver. Besides religious and mystical connotations, geometric terms are also used, and one interesting device is the symbolizing of words and even phrases by letters. This ties in neatly with syllogism. A sample follows:

… diversity is shown in the demonstration that the D makes of the E and the F and the G with the I and the K, therefore the H has certain scientific knowledge of Thy holy and glorious Trinity.

This was only prologue to the _Ars Magna_, the “Great Art” of Ramón Lull. In 1274, the devout pilgrim climbed Mount Palma in search of divine help in his writings. The result was the first recorded attempt to use diagrams to discover and to prove non-mathematical truths. Specifically, Lull determined that he could construct mechanical devices that would perform logic to prove the validity of God’s word. Where force, in the shape of the Crusades, had failed, Lull was convinced that logical argument would win over the infidels, and he devoted his life to the task. Renouncing his estate, including his wife and children, Lull devoted himself thenceforth solely to his Great Art.

As a result of dreams he had on Mount Palma, the basis for this work was the assumption of simple premises or principles that are unquestionable. Lull arranged these premises on rotating concentric circles. The first of these wheels of logic was called A, standing for God. Arranged about the circumference of the wheel were sixteen other letters symbolizing attributes of God. The outer wheel also contained these letters. Rotating them produced 240 two-term combinations telling many things about God and His good. Other wheels prepared sermons, advised physicians and scientists, and even tackled such stumpers as “Where does the flame go when the candle is put out?”

[Illustration: From the _Enciclopedia universal illustrada_, Barcelona, 1923 Lull’s wheel.]

Unfortunately for Lull, even divine help did not guarantee him success. He was stoned to death by infidels in Bugia, Africa, at the age of eighty-three. All his wheelspinning logic was to no avail in advancing the cause of Christianity there, and most mathematicians since have scoffed at his naïve devices as having no real merit. Far from accepting the _Ars Magna_, most scholars have been “Lulled into a secure sense of falsity,” finding it as specious as indiscriminate syllogism. Yet Lull did leave his mark, and many copies of his wheels have been made and found useful. Where various permutations of numbers or other symbols are required, such a mechanical tool is often the fastest way of pairing them up.
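Lull’s figure of 240 combinations, incidentally, is exactly what the arrangement predicts: sixteen letters on each of two rings can be matched in 16 × 15 = 240 ordered pairs once a letter is forbidden to pair with itself. The few lines of Python below check the count; the letters are merely stand-ins for Lull’s attributes, and the sketch is ours, not a reconstruction of his wheel.

    from itertools import permutations

    # Sixteen letters standing in for the sixteen attributes on the rings.
    letters = "BCDEFGHIKLMNOPQR"            # 16 stand-in letters
    # Ordered pairs, with no letter matched against itself.
    pairs = list(permutations(letters, 2))
    print(len(pairs))     # 240, the figure credited to Lull's wheels
    print(pairs[:3])      # ('B', 'C'), ('B', 'D'), ('B', 'E'), ...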
Even in the field of writing, a Lullian device was popular a few decades ago in the form of the “Plot Genii.” With this gadget the would-be author merely spun the wheels to match up various characters with interesting situations to arrive at story ideas. Other versions use cards to do the same job, and one called Plotto was used by its inventor William Wallace Cook to plot countless stories. Although these were perhaps not ideas for great literature, eager writers paid as much as $75 for the plot boiler.

Not all serious thinkers relegated Lull to the position of fanatic dreamer and gadgeteer. No less a mind than Gottfried Wilhelm von Leibnitz found much to laud in Lull’s works. The _Ars Magna_ might well lead to a universal “algebra” of all knowledge, thought Leibnitz. “If controversies were to arise,” he then mused, “there would be no more reason for philosophers to dispute than there would for accountants!” Leibnitz applied Lull’s work to formal logic, constructed tables of syllogisms from which he eliminated the false, and carried the work of the “gifted crank” a bit nearer to true symbolic logic. Leibnitz also extended the circle idea to that of overlapping circles in early attempts at logical manipulation that foreshadowed the work that John Venn would do later.

Leibnitz also saw in numbers a powerful argument for the existence of God. God, he saw as the numeral 1, and 0 was the nothingness from which He created the world. There are those, including Voltaire, whose _Candide_ satirized the notion, who question that it is the best of all possible worlds, but none can question that in the seventeenth century Leibnitz foresaw the coming power of the binary system. He also built arithmetical computers that could add and subtract, multiply and divide.

A few years earlier than Leibnitz, Blaise Pascal was also interested in computing machines. As a teen-ager working in his father’s tax office, Pascal wearied of adding the tedious figures so he built himself a gear-driven computer that would add eight columns of numbers. A tall figure in the scientific world, Pascal had fathered projective geometry at age sixteen and later established hydrodynamics as a science. To assist a gambler friend, he also developed the theory of probability which led to statistical science.

Another mathematical innovation of the century was that of placing logarithms on a stick by the Scot, John Napier. What he had done, of course, was to make an analog, or scale model of the arithmetical numbers. “Napier’s bones” quickly became what we now call slide rules, forerunners of a whole class of analog computers that solve problems by being actual models of size or quantity. Newton joined Leibnitz in contributing another valuable tool that would be used in the computer, that of the calculus.

------------------------------------------------------------------------

_The Computer in Literature_

Even as Plato had viewed with suspicion the infringement of mechanical devices on man’s domain of higher thought, other men have continued to eye the growth of “mechanisms” with mounting alarm. The scientist and inventor battled not merely technical difficulties, but the scornful satire and righteous condemnation of some of their fellow men. Jonathan Swift, the Irish satirist who took a swipe at many things that did not sit well with his views, lambasted the computing machine as a substitute for the brain. In Chapter V, Book Three, of _Gulliver’s Travels_, the good dean runs up against a scheming scientist in Laputa:

The first Professor I saw was in a very large Room, with Forty Pupils about him.
After Salutation, observing me to look earnestly upon a Frame, which took up the greatest part of both the Length and Breadth of the Room; he said, perhaps I might wonder to see him employed in a Project for improving speculative knowledge by practical and mechanical Operations. But the World would soon be sensible of its Usefulness; and he flattered himself, that a more noble exalted Thought never sprang in any other Man’s Head. Every one knew how laborious the usual Method is of attaining to Arts and Sciences; whereas by his Contrivance, the most ignorant Person at a reasonable Charge, and with a little bodily Labour, may write Books in Philosophy, Poetry, Politicks, Law, Mathematicks, and Theology, without the least Assistance from Genius or Study. He then led me to the Frame, about the Sides whereof all his Pupils stood in Ranks. It was a Twenty Foot Square, placed in the Middle of the Room. The Superfices was composed of several Bits of Wood, about the Bigness of a Dye, but some larger than others. They were all linked together by slender Wires. These Bits of Wood were covered on every Square with Papers pasted on them; and on these Papers were written all the Words of their Language in their several Moods, Tenses, and Declensions, but without any Order. The Professor then desired me to observe, for he was going to set his Engine to work. The Pupils at his Command took each the hold of an Iron Handle, whereof there were Forty fixed round the Edges of the Frame; and giving them a sudden Turn, the whole Disposition of the Words was entirely changed. He then commanded Six and Thirty of the Lads to read the several Lines softly as they appeared upon the Frame; and where they found three or four Words together that might make Part of a Sentence, they dictated to the four remaining Boys who were Scribes. This work was repeated three or four Times, and at every Turn the Engine was so contrived, that the Words shifted into new Places, as the square Bits of Wood moved upside down. Six hours a-day the young Students were employed in this Labour; and the Professor showed me several Volumes in large Folio already collected, of broken Sentences, which he intended to piece together, and out of those rich Materials to give the World a compleat Body of Art and Sciences; which however might be still improved, and much expedited, if the Publick would raise a Fund for making and employing five Hundred such Frames in _Lagado_....

Fortunately for Swift, who would have been horrified by it, he never heard Russell Maloney’s classic story, “Inflexible Logic,” about six monkeys pounding away at typewriters and re-creating the world’s great literature. _Gulliver’s Travels_ is not listed in their accomplishments.

The French Revolution prompted no less an orator than Edmund Burke to publish in 1790 his _Reflections on the Revolution in France_, in which he extols the virtues of the dying feudal order in Europe. It galled Burke that “The Age of Chivalry is gone. That of sophists, economists, and _calculators_ has succeeded, and the glory of Europe is extinguished forever.”

Seventy years later another eminent Englishman named Darwin published a book called _On the Origin of Species_ that in the eyes of many readers did little to glorify man himself. Samuel Butler, better known for his novel, _The Way of All Flesh_, wrote too of the mechanical being, and was one of the first to point out just what sort of future Darwin was suggesting.
In the satirical _Erewhon_, he described the machines of this mysterious land in some of the most prophetic writing that has been done on the subject. It was almost a hundred years ago that Butler wrote the first version, called “Darwin Among the Machines,” but the words ring like those of a 1962 worrier over the electronic brain. Butler’s character warns: There is no security against the ultimate development of mechanical consciousness in the fact of machines possessing little consciousness now. Reflect upon the extraordinary advance which machines have made during the last few hundred years, and note how slowly the animal and vegetable kingdoms are advancing. The more highly organized machines are creatures not so much of yesterday, as of the last five minutes, so to speak, in comparison with past time. Do not let me be misunderstood as living in fear of any actually existing machine; there is probably no known machine which is more than a prototype of future mechanical life. The present machines are to the future as the early Saurians to man ... what I fear is the extraordinary rapidity with which they are becoming something very different to what they are at present. Butler envisioned the day when the present rude cries with which machines call out to one another will have been developed to a speech as intricate as our own. After all, “... take man’s vaunted power of calculation. Have we not engines which can do all manner of sums more quickly and correctly than we can? What prizeman in Hypothetics at any of our Colleges of Unreason can compare with some of these machines in their own line?” Noting another difference in man and his creation, Butler says, ... Our sum-engines never drop a figure, nor our looms a stitch; the machine is brisk and active, when the man is weary, it is clear-headed and collected, when the man is stupid and dull, it needs no slumber.... May not man himself become a sort of parasite upon the machines? An affectionate machine-tickling aphid? It can be answered that even though machines should hear never so well and speak never so wisely, they will still always do the one or the other for our advantage, not their own; that man will be the ruling spirit and the machine the servant.... This is all very well. But the servant glides by imperceptible approaches into the master, and we have come to such a pass that, even now, man must suffer terribly on ceasing to benefit the machines. If all machines were to be annihilated ... man should be left as it were naked upon a desert island, we should become extinct in six weeks. Is it not plain that the machines are gaining ground upon us, when we reflect on the increasing number of those who are bound down to them as slaves, and of those who devote their whole souls to the advancement of the mechanical kingdom? Butler considers the argument that machines at least cannot copulate, since they have no reproductive system. “If this be taken to mean that they cannot marry, and that we are never likely to see a fertile union between two vapor-engines with the young ones playing about the door of the shed, however greatly we might desire to do so, I will readily grant it. [But] surely if a machine is able to reproduce another machine systematically, we may say that it has a reproductive system.” Butler repeats his main theme. “... his [man’s] organization never advanced with anything like the rapidity with which that of the machine is advancing. 
This is the most alarming feature of the case, and I must be pardoned for insisting on it so frequently.” Then there is a startlingly clear vision of the machines “regarded as a part of man’s own physical nature, being really nothing but extra-corporeal limbs. Man ... as a machinate mammal.” This was feared as leading to eventual weakness of man until we finally found “man himself being nothing but soul and mechanism, an intelligent but passionless principle of mechanical action.” And so the Erewhonians in self-defense destroyed all inventions discovered in the preceding 271 years!

_Early Mechanical Devices_

During the nineteenth century, weaving was one of the most competitive industries in Europe, and new inventions were often closely guarded secrets. Just such a secret was the idea of Frenchman Joseph M. Jacquard, an idea that automated the loom and would later become the basis for the first modern computers. A big problem in weaving was how to control a multiplicity of flying needles to create the desired pattern in the material. There were ways of doing this, of course, but all of them were unwieldy and costly. Then Jacquard hit on a clever scheme. If he took a card and punched holes in it where he wanted the needles to be actuated, it was simple to make the needles do his bidding. To change the pattern took only another card, and cards were cheap. Patented in 1801, the Jacquard loom was soon operating by the thousands, weaving beautiful and accurate designs at a reasonable price. To show off the scope of his wonderful punched cards, Jacquard had one of his looms weave a portrait of him in silk. The job took 20,000 cards, but it was a beautiful and effective testimonial. And, fatefully, a copy of the silk portrait would later find its way into the hands of a man who would do much more with the oddly punched cards. Several decades earlier, a Hungarian named Wolfgang von Kempelen had decided that machines could play games as well as work in factories. So von Kempelen built himself a chess-playing machine—later famous as the Maelzel Chess Automaton—with which he toured Europe. The inventor and his machine played a great game, but they didn’t play fair. Hidden in the innards of the Maelzel Automaton was a second human player, but this disillusioning truth was not known for some time. Thus von Kempelen doubtless spurred other inventors to the task, and in a short while machines would actually begin to play the royal game. For instance, a Spaniard named L. Torres y Quevedo built a chess-playing machine in 1914. This device played a fair “end game” using several pieces, and its inventor predicted future work in this direction using more advanced machines. Charles Babbage was an English scientist with a burning desire for accuracy. When some mathematical tables prepared for the Astronomical Society proved to be full of errors, he angrily determined to build a machine that would do the job with no mistakes. Of course calculating machines had been built before; but the machine Babbage had in mind was different. In fact, he called it a “difference engine” because it was based on the difference tables of the squares of numbers. The first of the “giant computers,” it was to have hundreds of gears and shafts, ratchets and counters. Any arithmetic problem could be set into it, and when the proper cranks were turned, out would come an answer—the right answer because the machine could not make a mistake.
After doing some preliminary work on his difference engine, Babbage set about interesting the government in the project; though he was fairly well-to-do, he realized it would cost more money than he could afford to sink into it himself. Babbage was a respected scientist, Lucasian Professor of Mathematics at Cambridge, and because of his reputation and the promise of the machine, the Chancellor of the Exchequer promised to underwrite the project. For four years Babbage and his mechanics toiled. Instead of completing his original idea, the scientist had succeeded only in designing a far more complicated machine, one which, when finished, would weigh about two tons. Because the parts he needed were advanced beyond the state of the art of metalworking, Babbage was forced to design and build them himself. In the process he decided that industry was being run all wrong, and took time out to write a book. It was an excellent book, a sort of forerunner to the modern science of operations research, and Babbage’s machine shop was doing wonders for the metalworking art. Undaunted by the lack of progress toward a concrete result, Babbage was thinking bigger and bigger. He was going to scrap the difference engine, or rather put it in a museum, and build a far better computer—an “analytical engine.” If Jacquard’s punched cards could control the needles on a loom, they could also operate the gears and other parts of a calculating machine. This new engine would be one that could not only add, subtract, multiply, and divide; it would be designed to control itself. And as the answers started to come out, they would be fed back to do more complex problems with no further work on the operator’s part. “Having the machine eat its own tail!” Babbage called this sophisticated bit of programming. This mechanical cannibalism was the root of the “feedback” principle widely used in machines today. Echoing Watt’s steam governor, it prophesied the coming control of machines by the machines themselves. Besides this innovation, the machine would have a “store,” or memory, of one thousand fifty-digit numbers that it could draw on, and it would actually exercise judgment in selection of the proper numbers. And as if that weren’t enough, it would print out the correct answers automatically on specially engraved copper plates!

[Illustration: _Space Technology Laboratories_ “As soon as an Analytical Engine exists, it will necessarily guide the future course of science. Whenever any result is sought by its aid, the question will then arise—by what course of calculation can these results be arrived at by the machine in the shortest time?” Charles Babbage—_Passages from the Life of a Philosopher_, 1864. ]

It was a wonderful dream; a dream that might have become an actuality in Babbage’s own time if machine technology had been as advanced as his ideas. But for Babbage it remained only a dream, a dream that never did work successfully. The government spent £17,000, a huge sum for that day and time, and bowed out. Babbage fumed and then put his own money into the machine. His mechanics left him and became leaders in the machine-tool field, having trained in Babbage’s workshops. In despair, he gave up on the analytical engine and designed another difference engine. An early model of this one would work to five accurate places, but Babbage had his eyes on a much better goal—twenty-place accuracy. A lesser man would have aimed more realistically and perhaps delivered workable computers to the mathematicians and businessmen of the day.
There is a legend that his son did finish one of the simpler machines and that it was used in actuarial accounting for many years. But Babbage himself died in 1871 unaware of how much he had done for the computer technology that would begin to flower a few short decades later. Singlehandedly he had given the computer art the idea of programming and of sequential control, a memory in addition to the arithmetic unit he called a “mill,” and even an automatic readout such as is now standard on modern computers. Truly, the modern computer was “Babbage’s dream come true.”

_Symbolic Logic_

Concurrently with the great strides being made with mechanical computers that could handle mathematics, much work was also being done on the formalizing of logic. As hinted vaguely in the syllogisms of the early philosophers, thinking did seem amenable to being diagrammed, much like grammar. Augustus De Morgan devised numerical logic systems, and George Boole set up the logic system that has come to be known as Boolean algebra, in which reasoning is reduced to positive or negative terms that can be manipulated algebraically to give valid answers. John Venn put the idea of logic into pictures, and simple pictures at that. His symbology looks for all the world like the three interlocking rings of a well-known ale. These rings stand for the subject, midterm, and predicate of the older Aristotelian syllogism. By shading the various circles according to the major and minor premises, the user of Venn circles can see the logical result by inspection. Implicit in the scheme is the possibility of a mechanical or electrical analogy to this visual method, and it was not long until mathematicians began at least on the mechanical kind. Among these early logic mechanizers, surprisingly, was Lewis Carroll, who of course was mathematician Charles L. Dodgson before he became a writer. Carroll, who was a far busier man than most of us ever guess, marketed a “Game of Logic,” with a board and colored cardboard counters that handled problems like the following: All teetotalers like sugar. No nightingale drinks wine. By arranging the counters on Carroll’s game board so that: All M are X, and No Y is not-M, we learn that No Y is not-X! This tells the initiate logician that no nightingale dislikes sugar—a handy piece of information for bird-fancier and sugar-broker alike.

[Illustration: Lewis Carroll’s “Symbolic Logic.” ]

Charles, the third Earl Stanhope, was only slightly less controversial than his prime minister, William Pitt. Scientifically he was far out too, writing books on electrical theory, inventing steamboats, microscopes, and printing presses among an odd variety of projects; he also became interested in mechanical logic and designed the “Stanhope Demonstrator,” a contrivance like a checkerboard with sliding panels. By properly manipulating the demonstrator he could solve such problems as: Eight of ten children are bright. Four of these children are boys. What are the minimum and maximum number of bright boys? A simple sliding of scales on the Stanhope Demonstrator shows that at least two of the boys must be bright and that as many as four may be. This clever device could also work out probability problems such as how many heads and tails will come up in so many tosses of a coin. In 1869 William S. Jevons, an English economist and expert logician, built a logic machine. His was not the first, of course, but it had a unique distinction in that it solved problems faster than the human brain could!
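Both of these parlor problems, Carroll’s and Stanhope’s, yield to sheer brute force in a modern programming language. The sketch below is ours, of course—no Victorian machine ran code—and it simply tries every case; M, X, and Y are the three terms of the syllogism, and the Stanhope figures come from the problem above:

    # Carroll's syllogism, checked by trying all eight combinations of
    #   M = is a teetotaler, X = likes sugar, Y = is a nightingale.
    from itertools import product

    for M, X, Y in product([False, True], repeat=3):
        all_m_are_x = (not M) or X       # All teetotalers like sugar
        no_y_is_not_m = (not Y) or M     # No nightingale drinks wine
        if all_m_are_x and no_y_is_not_m:
            assert (not Y) or X          # so no nightingale dislikes sugar

    # Stanhope's problem: 8 of 10 children are bright, and 4 are boys.
    bright, boys, total = 8, 4, 10
    print(max(0, bright + boys - total))   # at least 2 bright boys
    print(min(bright, boys))               # at most 4

The assertion never fails, which is exactly what Carroll’s counters—and Stanhope’s sliding scales—demonstrated by inspection.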
Using Boolean algebra principles, Jevons built a “logical abacus” and then even a “logical piano.” By simply pressing the keys of this machine, the user could make the answer appear on its face. It is of interest that Jevons thought his machine of no practical use, since complex logical questions seldom arose in everyday life! Life, it seems, was simpler in 1869 than it is today, and we should be grateful that Jevons pursued his work through sheer scientific interest. More sophisticated than the Jevons piano, the logic machine invented in America by Allan Marquand could handle four terms and do problems like the following: There are four schoolgirls, Anna, Bertha, Cora, and Dora. When Anna or Bertha, or both, remain home, Cora is at home. When Bertha is out, Anna is out. Whenever Cora is at home, Anna is too. What can we tell about Dora? The machine is smart enough to tell us that when Dora is at home the other three girls are either all at home or all out. The same thing is true when Dora is out. (A short sketch a few paragraphs below checks the machine’s verdict by trying all sixteen possibilities.)

_The Census Taker_

Moving from the sophistication of such logic devices, we find a tremendous advance in mechanical computers spurred by such a mundane chore as the census. The 1880 United States census required seven years for compiling; and that with only 50 million heads to reckon. It was plain to see that shortly a ten-year census would be impossible of completion unless something were done to cut the birth rate or speed the counting. Dr. Herman Hollerith was the man who did something about it, and as a result the 1890 census, with 62 million people counted, took only one-third the time of the previous tally. Hollerith, a statistician living in Buffalo, New York, may or may not have heard the old saw about statistics being able to support anything—including the statisticians—but there was a challenge in the rapid growth of population that appealed to the inventor in him and he set to work. He came up with a card punched with coded holes, a card much like that used by Jacquard on his looms, and by Babbage on the dream computer that became a nightmare. But Hollerith did not meet the fate of his predecessors. Not stoned, or doomed to die a failure, Hollerith built his card machines and contracted with the government to do the census work. “It was a good paying business,” he said. It was indeed, and his early census cards would some day be known generically as “IBM cards.” While Jacquard and Babbage of necessity used mechanical devices with their punched cards, Hollerith added the magic of electricity to his card machine, building in essence the first electrical computing machine. The punched cards were floated across a pool of mercury, and telescoping pins in the reading head dropped through the holes. As they contacted the mercury, an electrical circuit was made and another American was counted. Hollerith did not stop with census work. Sagely he felt there must be commercial applications for his machines and sold two of the leading railroads on a punched-card accounting system. His firm merged with others to become the Computing-Tabulating-Recording Company, and finally International Business Machines. The term “Hollerith Coding” is still familiar today.

[Illustration: _International Business Machines Corp._ Hollerith tabulating machine of 1890, forerunner of modern computers. ]

Edison was illuminating the world and the same electrical power was brightening the future of computing machines.
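As promised a few paragraphs back, Marquand’s schoolgirl puzzle can be settled by plain enumeration. The sketch is our modern illustration of what his machine accomplished, not a description of its works:

    # Marquand's four-term problem, solved by trying all sixteen cases.
    from itertools import product

    for anna, bertha, cora, dora in product([False, True], repeat=4):
        if ((not (anna or bertha) or cora)    # Anna or Bertha home -> Cora home
                and (bertha or not anna)      # Bertha out -> Anna out
                and (not cora or anna)):      # Cora home -> Anna home
            print(anna, bertha, cora, dora)

Only four cases survive: Anna, Bertha, and Cora all at home or all out, with Dora free to do as she pleases—just as the machine reported.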
As early as 1915 the Ford Instrument Company was producing in quantity a device known as “Range Keeper Mark I,” thought to be the first electrical-analog computer. In 1920, General Electric built a “short-circuit calculating board” that was an analog or model of the real circuits being tested. Westinghouse came up with an “alternating-current network analyzer” in 1933, and this analog computer was found to be a powerful tool for mathematics.

[Illustration: _International Business Machines Corp._ A vertical punched-card sorter used in 1908. ]

While scientists were putting the machines to work, writers continued to prophesy doom when the mechanical man took over. Mary W. Shelley’s _Frankenstein_ created a monster from a human body, a monster that in time would take his master’s name and father a long horrid line of other fictional monsters. Ambrose G. Bierce wrote of a diabolical chess-playing machine that was human enough to throttle the man who beat him at a game. But it remained for the Czech playwright Karel Čapek to give the world the name that has stuck to the mechanical man. In Čapek’s 1921 play, _R.U.R._, for Rossum’s Universal Robots, we are introduced to humanlike workers grown in vats of synthetic protoplasm. _Robota_ is a Czech word meaning compulsory service, and apparently these mechanical slaves did not take to servitude, turning on their masters and killing them. Robot is generally accepted now to mean a mobile thinking machine capable of action. Before the advent of the high-speed electronic computer it had little likelihood of stepping out of the pages of a novel or movie script. As early as 1885, Allan Marquand had proposed an electrical logic machine as an improvement over his simple mechanically operated model, but it was 1936 before such a device was actually built. In that year Benjamin Burack, a member of Chicago’s Roosevelt College psychology department, built and demonstrated his “Electrical Logic Machine.” Able to test all syllogisms, the Burack machine was unique in another respect. It was the first of the portable electrical computers. The compatibility of symbolic logic and electrical network theory was becoming evident at about this time. The idea that yes-no corresponded to on-off was beautifully simple, and in 1938 there appeared in one of the learned journals what may fairly be called a historic paper. “A Symbolic Analysis of Relay and Switching Circuits,” published in the _Transactions of the American Institute of Electrical Engineers_, was written by Claude Shannon and based on his thesis for the M.S. degree at the Massachusetts Institute of Technology a year earlier. One of its important implications was that the programming of a computer was more a logical than an arithmetical operation. Shannon had laid the groundwork for logical computer design; his work made it possible to teach the machine not only to add but also to think. Another monumental piece of work by Shannon was that on information theory, which revolutionized the science of communications. The author is now on the staff of the electronics research laboratory at M.I.T. Two enterprising Harvard undergraduates put Shannon’s ideas to work on their problems in the symbolic logic class they were taking. Called a Kalin-Burkhart machine for its builders, this electrical logic machine did indeed work, solving the students’ homework assignments and saving them much tedious paperwork.
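The correspondence Shannon formalized is easy to exhibit. In the sketch below (a modern illustration in our notation, not Shannon’s), switches wired in series pass current only when both are closed—the logical AND—while switches wired in parallel pass current when either is closed—the logical OR:

    # Shannon's insight: switching circuits obey Boolean algebra.
    def series(a, b):       # two switches in series act as AND
        return a and b

    def parallel(a, b):     # two switches in parallel act as OR
        return a or b

    for a in (False, True):
        for b in (False, True):
            print(a, b, "series:", series(a, b), "parallel:", parallel(a, b))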
Interestingly, when certain logical questions were posed for the Kalin-Burkhart machine, its circuits went into oscillation, making “a hell of a racket” in its frustration. The builders called this an example of “Russell’s paradox.” A typical logical paradox is that of the barber who shaved all men who didn’t shave themselves—who shaves the barber? Or of the condemned man permitted to make a last statement. If the statement is true, he will be beheaded; if false, he will hang. The man says, “I shall be hanged,” and thus confounds his executioners as well as logic, since if he is hanged, the statement is indeed true, and he should have been beheaded. If he is beheaded, the statement is false, and he should have been hanged instead. World War II, with its pressingly complex technological problems, spurred computer work mightily. Men like Vannevar Bush, then at the Massachusetts Institute of Technology, produced analog computers called “differential analyzers” which were useful in solving mathematics involved in design of aircraft and in ballistics problems. A computer built by General Electric for the gunsights on the World War II B-29 bomber is typical of applications of analog devices for computing and predicting, and is also an example of early airborne use of computing devices. Most computers, however, were sizable affairs. One early General Electric analog machine, described as a hundred feet long, indicates the trend toward the “giant brain” concept. Even with the sophistication attained, these computers were hardly more than extensions of mechanical forerunners. In other words, gears and cams properly proportioned and actuated gave the proper answers whether they were turned by a manual crank or an electrical motor. The digital computer, which had somehow been lost in the shuffle of interest in computers, was now appearing on the scientific horizon, however, and in this machine would flower all the gains in computers from the abacus to electrical logic machines.

_The Modern Computer_

Many men worked on the digital concept. Aiken, who built the electromechanical Mark I at Harvard, and Williams in England are representative. But two scientists at the University of Pennsylvania get the credit for the world’s first electronic digital computer, ENIAC, a 30-ton, 150-kilowatt machine using vacuum tubes and semiconductor diodes and handling discrete numbers instead of continuous values as in the analog machine. The modern computer dates from ENIAC, Electronic Numerical Integrator And Computer.

[Illustration: _Remington Rand UNIVAC_ ENIAC in operation. This was the first electronic digital computer. ]

Shannon’s work and the thinking of others in the field indicated the power of the digital, yes-no, approach. A single switch can only be on or off, but many such switches properly interconnected can do amazing things. At first these switches were electromechanical; in the Eckert-Mauchly ENIAC, completed for the government in 1946, vacuum tubes in the Eccles-Jordan “flip-flop” circuit married electronics and the computer. The progeny have been many, and their generations faster than those of man. ENIAC has been followed by BINAC and MANIAC, and even JOHNNIAC. UNIVAC and RECOMP and STRETCH and LARC and a whole host of other machines have been produced. At the start of 1962 there were some 6,000 electronic digital computers in service; by year’s end there will be 8,000. The golden age of the computer may be here, but as we have seen, it did not come overnight.
The revolution has been slow, gathering early momentum with the golden wheels of Homer’s mechanical information-seeking vehicles that brought the word from the gods. Where it goes from here depends on us, and maybe on the computer itself.

------------------------------------------------------------------------

“_Theory is the guide to practice, and practice is the ratification and life of theory._”

—John Weiss

3: How Computers Work

In the past decade or so, an amazing and confusing number of computing machines have been developed. To those of us unfamiliar with the beast, many of them do not look at all like what we imagined computers to be; others are even more awesome than the wildest science-fiction writer could dream up. On the more complex machines, lights flash, tape reels spin dizzily, and printers clatter at mile-a-minute speeds. We are aware, or perhaps just take on faith, that the electronic marvel is doing its sums at so many thousand or million per second, cranking out mathematical proofs and processing data at a rate to make mere man seem like the dullest slowpoke. Just how computers do this is pretty much of a mystery unless we are of the breed that works with them. Actually, in spite of all the blurring speed and seeming magic, the basic steps of computer operation are quite simple and generally the same for all types of machines from the modestly priced electromechanical do-it-yourself model to STRETCH, MUSE, and other ten-million-dollar computers. It might be well before we go farther to learn a few words in the lexicon of the computer, words that are becoming more and more a part of our everyday language. The following glossary is of course neither complete nor technical but it will be helpful in following through the mechanics of computer operation.

COMPUTER DICTIONARY

ACCESS TIME—Time required for computer to locate data and transfer it from one computer element to another.
ADDER—Device for forming sums in the computer.
ADDRESS—Specific location of information in computer memory.
ANALOG COMPUTER—A physical or electrical simulator which produces an analogy of the mathematical problem to be solved.
ARITHMETIC UNIT—Unit that performs arithmetical and logical operations.
BINARY CODE—Representation of numbers or other information using only one and zero, to take advantage of open and closed circuits.
BIT—A binary digit, either one or zero; used to make binary numbers.
BLOCK—Group of words handled as a unit, particularly with reference to input and output.
BUFFER—Storage device to compensate for the difference between input and operation rates.
CONTROL UNIT—Portion of the computer that controls arithmetic and logical operations and transfer of information.
DELAY LINE—Memory device to store and later reinsert information; uses physical, mechanical, or electrical techniques.
DIGITAL COMPUTER—A computer that uses discrete numbers to represent information.
FLIP-FLOP—A circuit or device which remains in either of two states until the application of a signal.
GATE—A circuit with more than one input, and an output dependent on these inputs. An AND gate’s output is energized only when all inputs are energized. An OR gate’s output is energized when one or more inputs are energized. There are also NOT-AND gates, EXCLUSIVE-OR gates, etc.
LOGICAL OPERATION—A nonarithmetical operation, i.e., decision-making, data-sorting, searching, etc.
MAGNETIC DRUM—Rotating cylinder storage device for memory unit; stores data in coded form.
MATRIX—Circuitry for transformation of digital codes from one type to another; uses wires, diodes, relays, etc.
MEMORY UNIT—That part of the computer that stores information in machine language, using electrical or magnetic techniques.
MICROSECOND—One millionth of a second.
MILLISECOND—One thousandth of a second.
NANOSECOND—One billionth of a second.
PARALLEL OPERATION—Digital computer operation in which all digits are handled simultaneously.
PROGRAMMING—Steps to be executed by computer to solve problem.
RANDOM ACCESS—A memory system that permits more nearly equal access time to all memory locations than does a nonrandom system. Magnetic core memory is a random type, compared with a tape reel memory.
REAL TIME—Computer operation simultaneous with input of information; e.g., control of a guided missile or of an assembly line.
REGISTER—Storage device for small amount of information while, or until, it is needed.
SERIAL OPERATION—Digital computer operation in which the digits are handled one at a time, in sequence.
STORAGE—Use of drums, tapes, cards, and so on to store data outside the computer proper.

_The Computer’s Parts_

Looking at computers from a distance, we are vaguely aware that they are given problems in the form of coded instructions and that through some electronic metamorphosis this problem turns into an answer that is produced at the readout end of the machine. There is an engineering technique called the “black box” concept, in which we are concerned only with input to this box and its output. We could extend this concept to “black-magic box” and apply it to the computer, but breaking the system down into its components is quite simple and much more informative. There are five components that make up a computer: input, control, arithmetic (or logic) unit, memory, and output. As machine intelligence expert Dr. W. Ross Ashby points out, we can get no more out of a brain—mechanical or human—than we put into it. So we must have an input. The kind of input depends largely on the degree of sophistication of the machine we are considering. With the abacus we set in the problem mechanically, with our fingers. Using a desk calculator we punch buttons: a more refined mechanical input. Punched cards or perforated tapes are much used input methods. As computers evolve, some of them can now “read” for themselves; the input is visual. There are also computers that understand verbal commands. Input should not be confused with the control portion of the computer’s anatomy. We feed in data, but we must also tell the computer what to do with the information. Shall it count the number of cards that fly through it, or shall it add the numbers shown on the cards, record the maximum and minimum, and print out an average? Control involves programming, a computer term that was among the first to be assimilated into ordinary language. The arithmetic unit—that part of the computer that the pioneer Babbage called his “mill”—is the nuts and bolts end of the business. Here are the gears and shafts, the electromechanical relays, or the vacuum tubes, transistors, and magnetic cores that do the addition, multiplication, and other mathematical operations. Sometimes this is called the “logic” unit, since often it manipulates the ANDS, ORS, NORS, and other conjunctives in the logical algebra of Boole and his followers. The memory unit is just that: a place where numbers, words, or other data are stored and ready to be called into use whenever needed.
There are two broad types of memory, internal and external, and they parallel the kind of memory we use ourselves. While our brain can store many, many facts, it does have a practical limit. This is why we have phone books, logarithm tables, strings around fingers, and so on. The computer likewise has its external memory that may store thousands of times the capacity of its internal memory. Babbage’s machine could remember a thousand fifty-digit numbers; today’s large computers call on millions of bits of data.

[Illustration: Conversion of problem to machine program. ]

After we have dumped in the data and told the computer what to do with them, and the arithmetic and memory have collaborated, it remains only for the computer to display the result. This is the output of the computer, and it can take many forms. If we are using a simple analog computer such as a slide rule, the answer is found under the hairline on the slide. An electronic computer in a bank prints out the results of the day’s transactions in neat type at hundreds of lines a minute. The SAGE defense computer system displays an invading bomber and plots the correct course for interceptors on a scope; a computer in a playful mood might type out its next move—King to Q7 and checkmate. With this sketchy over-all description to get us started, let us study each unit in a little more detail. It is interesting to compare these operations with those of our human computer, our brain, as we go along.

[Illustration: _Remington Rand UNIVAC_ A large computer, showing the different parts required. ]

_Input_

An early and still very popular method of getting data into the computer is the punched card. Jacquard’s clever way of weaving a pattern got into the computer business through Hollerith’s census counting machines. Today the ubiquitous IBM card can do these tasks of nose counting and weaving, and just about everything else in between. Jacquard used the punched holes to permit certain pins to slide through. Hollerith substituted the mercury electrical contact for the loom’s flying needles. Today there are many other ways of “reading” the cards. Metal base plate and springs, star wheels, even photoelectric cells are used to detect the presence or absence of the coded holes. A human who knows the code can visually extract the information; a blind man could do it by the touch system. So with the computer, there are many ways of transferring data.

[Illustration: _Remington Rand UNIVAC_ The Computer’s Basic Parts. ]

An obvious requirement of the punched card is that someone has to punch the holes in the first place. This is done with manually operated punches, power punches, and even automatic machines that handle more than a hundred cards a minute. Punched cards, which fall into the category called computer “software,” are cheap, flexible, and compatible with many types of equipment. Particularly with mathematical computations and scientific research, another type of input has become popular, that of paper tape. This in effect strings many cards together and puts them on an easily handled roll. Thus a long series of data can be punched without changing cards, and is conveniently stored for repeated use. Remember the old player-piano rolls of music? These actually formed the input for one kind of computer, a musical machine that converted coded holes to musical sounds by means of pneumatic techniques. Later in this chapter we will discuss some modern pneumatic computers.
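Whatever the detection scheme—pins, star wheels, or photocells—what any card or tape reader does is turn a pattern of holes into a coded number. A minimal sketch follows; the five-hole code in it is invented purely for illustration and is not Hollerith’s or any real tape code:

    # Reading one punched row: a hole is a 1, no hole is a 0.
    CODE = {0b00001: "A", 0b00010: "B", 0b00011: "C", 0b00100: "D"}

    def read_row(row):               # row is a string such as ". O . . O"
        value = 0
        for position in row.split():
            value = value * 2 + (1 if position == "O" else 0)
        return CODE.get(value, "?")

    print(read_row(". . . O O"))     # holes in the last two positions -> C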
More efficient than paper is magnetic tape, the same kind we use in our home recording instruments. Anyone familiar with a tape recorder knows how easy it is to edit or change something on a tape reel. This is a big advantage over punched cards or paper tapes, which are physically altered by the data stored on them and cannot be corrected. Besides this, magnetic tape can hold many more “bits” of information than paper and also lends itself to very rapid movement through the reading head of the computer. For example, standard computer tape holds seven tracks, each with hundreds of bits of information per inch. Since there are thousands of feet on a ten-inch reel, it is theoretically possible to pack 40 _million_ bits on this handful of tape! (Seven tracks at some 200 bits to the inch, over a 2,400-foot—28,800-inch—reel, comes to 7 × 200 × 28,800, or just over 40 million bits.) Since the computer usually can operate at a much higher rate of speed than we can put information onto tape, it is often the practice to have a “buffer” in the input section. This receiving station collects and stores information until it is full, then feeds it to the computer which gobbles it up with lightning speed. Keeping a fast computer continuously busy may require many different inputs. Never satisfied, computer designers pondered the problem of all the lost time entailed in laboriously preparing cards or tapes for the ravenous electronic machine. The results of this brain-searching are interesting, and they are evident in computers that actually read man-talk. Computers used in the post office and elsewhere can optically read addresses as well as stamps; banks have computers that electrically read the coded magnetic ink numbers on our checks and process thousands of times as many as human workers once did. This optical reading input is not without its problems, of course. Many computers require a special type face to be used, and the post office found that its stamp recognizer was mistaking Christmas seals for foreign stamps. Improved read heads now can read hand-printed material and will one day master our widely differing human scrawls. This is of course a boon to the “programmer” of lengthy equations who now has to translate the whole problem into machine talk before the machine can accept it. If a machine can read, why can’t it understand verbal input as well? Lazy computer engineers have pushed this idea, and the simplest input system of all is well on the way to success. Computers today can recognize numbers and a few words, and the Japanese have a typewriter that prints out the words spoken to it! These linguistic advances that electronic computers are making are great for everyone, except perhaps the glamorized programmer, a new breed of mathematical logician whose services have been demanded in the last few years.

[Illustration: Magnetic Tape - Paper Tape - IBM Card - Magnetic Ink Characters ]

_Control_

Before we feed the problem into the machine, or before we give it some “raw” data to process, we had better tell our computer what we want it to do. All the fantastic speed of our electrons will result in a meaningless merry-go-round, or perhaps a glorious machine-stalling short circuit unless the proper switches open and close at the right time. This is the job of the control unit of the computer, a unit that understands commands like “start,” “add,” “subtract,” “find the square root,” “file in Bin B,” “stop,” and so on. The key to all the computer’s parts working together in electronic harmony is its “clock.” This timekeeper in effect snaps its fingers in perfect cadence, and the switches jump at its bidding.
Since the finger-snapping takes place at rates of millions of snaps a second, the programmer must be sure he has instructed the computer properly. The ideal programmer is a rare type with a peculiarly keen brain that sometimes takes seemingly illogical steps to be logical. Programmers are likely to be men—or women, for there is no sex barrier in this new profession—who revel in symbolic logic and heuristic or “hunch” reasoning. Without a program, the computer is an impressively elaborate and frighteningly expensive contraption which cannot tell one number from another. The day may come when the mathematician can say to the machine, “Prove Fermat’s last theorem for me, please,” or the engineer simply wish aloud for a ceramic material that melts at 15,000° C. and weighs slightly less than Styrofoam. Even then the human programmer will not start drawing unemployment insurance, of course. If he is not receiving his Social Security pension by then he will simply shift to more creative work such as thinking up more problems for the machine to solve. Just as there are many jobs for the computer, so there are many kinds of programs. On a very simple, special-purpose computer, the program may be “wired-in,” or fixed, so that the computer can do that particular job and no other. On a more flexible machine, the program may still be quite simple, perhaps no more than a card entered in a desk unit by an airline ticket agent to let the computer arrange a reservation for three tourist seats on an American Airlines jet flight from Phoenix to Chicago at 8:20 A.M. four days from now. On a general-purpose machine, capable of many problems, the program may be unique, a one-of-a-kind highly complex set of instructions that will make the computer tax its huge memory and do all sorts of mental “nip-ups” before it reaches a solution. A computer that understands about sixty commands has been compared to a Siamese elephant used for teak logging; the animal has about that many words in its vocabulary. Vocabulary is an indication of computer as well as human sophistication. The trend is constantly toward less-than-elephant size, and more-than-elephant vocabulary. The programmer’s work can be divided into four basic phases: _analysis_ of the problem; _application_ or matching problem requirements with the capabilities of the right computer; _flow charting_ the various operations using symbolic diagrams; and finally, _coding_ or translating the flow chart into language the computer knows. The flow chart to some extent parallels the way our own brains solve logic problems, or at least the way they _ought_ to solve them. For example, a computer might be instructed to select the smallest of three keys. It would compare A and B, discard the larger, and then compare the smaller with C, finally selecting the proper one. This is of course such a ridiculously simple problem that few of us would bother to use the computer since it would take much longer to plot the flow chart than to select the key by simple visual inspection. But the logical principle is the same, even when the computer is to be told to analyze all the business transactions conducted by a large corporation during the year and advise a program for the next year which will show the most profit. From the symbolic flow chart, the programmer makes an operational flow chart, a detailed block diagram, and finally the program itself. Suitably coded in computer language, this program is ready for the computer’s control unit.
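Written out in a modern programming language (our sketch, of course—no 1962 machine was programmed in it), the key-selection flow chart of the example above becomes just a few lines:

    # The flow chart for picking the smallest of three keys: compare A
    # and B, discard the larger, then compare the survivor with C.
    def smallest_key(a, b, c):
        survivor = a if a <= b else b
        return survivor if survivor <= c else c

    print(smallest_key(7, 3, 5))     # -> 3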
With a problem of a complex nature, such as one involving the firing of a space vehicle, programmers soon learned they were spending hours, or even days, on a problem which the computer proceeded to zip through in minutes or seconds. It was something like working all year building an elaborate Fourth of July fireworks display, touching the match, and seeing the whole thing go up in spectacular smoke for a brief moment. Of course the end justifies the means in either case, and as soon as the computer has quit whirring, or the skyrockets faded out, the programmer gets back to work. But some short cuts were learned. Even a program for a unique problem is likely to contain many “subroutines” just like those in other problems. These are used and re-used; some computers now have libraries of programs they can draw on much as we call on things learned last week or last year. With his work completed, the programmer’s only worry is that an error might exist in it, an error that could raise havoc if not discovered. One false bit of logic in a business problem, or a slight mathematical boner in a design for a manned missile, could be catastrophic, since our technology is so complicated that the mistake might be learned only when disaster struck. So the programmer checks and rechecks his work until he is positive _he_ has not erred. How about the computer? It checks itself too—so thoroughly that there is little danger of its making a mistake. Computer designers have been very clever in this respect. One advanced technique is “majority rule” checking. Not long ago when the abacus was used even in banking, the Japanese were aware that a single accountant might make a false move and botch up the day’s tally. But if two operators worked the same problem and got the same answer, the laws of probability rule that the answer can be accepted. If the sums do not agree, though, which man is right? To check further, and save the time needed to go through the whole problem again, _three_ abacuses, or abaci, are put through their paces. Now if two answers agree, chances are they are the right solution. If all three are different, the bank had better hire new clerks!

[Illustration: _Remington Rand UNIVAC_ A word picture “flow chart” of the logical operation of selecting the proper key. ]

_Arithmetic or Logic_

Now that our computer has the two necessary ingredients of input and control, the arithmetic or logic unit can get busy. Babbage called this the “mill,” and with all the whirring gears and clanking arms his engine boasted, the term must have been accurate. Today’s computer is much quieter since in electronic switches the only moving parts are the electrons themselves and these don’t make much of a racket. Such switches have another big advantage in that they open and close at a great rate, practically the speed of light. The fastest computers use switches that act in _nanoseconds_, or billionths of a second. In one nanosecond light itself travels only a foot. The computer may be likened to someone counting on two of his fingers. Instead of the decimal or ten-base system, most computers use binary arithmetic, which has a base of two. But fingers that can be counted in billionth parts of a second can handle figures pretty fast, and the computer has learned some clever tricks that further speed things up. It can only add, but by adroit juggling it subtracts by using the complement of the desired number, a technique known to those familiar with an ordinary adding machine.
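The complement trick is worth a moment’s illustration. In this decimal sketch (ours; an actual machine would do the same thing in binary), subtraction becomes an addition followed by dropping the carry out of the top digit:

    # Subtraction by adding the ten's complement: 835 - 117, three digits.
    # (The sketch assumes the first number is the larger of the two.)
    def subtract_by_complement(a, b, digits=3):
        complement = 10 ** digits - b            # complement of 117 is 883
        return (a + complement) % 10 ** digits   # 835 + 883 = 1718 -> 718

    print(subtract_by_complement(835, 117))      # -> 718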
There are also some tricks to multiplying that allow the computer again to simply add and come up with the answer. With pencil and paper we can multiply 117 times 835 easily. Remember, though, that the computer can only add, and that it was once called a speedy imbecile. The most imbecilic computer might solve the problem by adding 117 to itself 835 times. A smarter model will reverse the procedure and handle only 117 numbers. The moron type of computer is a bit more clever and sets up the problem this way:

      835
      835
      835
      835
      835
      835
      835
     8350
    83500
    —————
    97695

A moment’s reflection will show that this is the same as adding 7 times 835, 10 times 835, and 100 times 835. And of course the computer arrives at the answer in about the time it takes us to start drawing the line under our multiplier.

[Illustration: _The Bendix Corp., Computer Division_ Assembly of printed-circuit component “packages” into computer. ]

Perhaps smarting under the unkind remarks about its mental ability, the computer has lately been trying some new approaches to the handling of complex arithmetical problems. Instead of adding long strings of numbers, it will take a guess at the result, do some smart checking, adjust its figures, and shortly arrive at the right solution. For nonarithmetical problems, the computer substitutes yes and no for 1 and 0 and blithely solves problems in logic at the same high rate of speed.

_Memory_

When we demonstrated our superiority earlier in multiplying instead of adding the numbers in the problem, we were drawing on our memory: recalling multiplication tables committed to memory when we were quite young. Babbage’s “store” in his analytical engine, you will recall, could memorize a thousand fifty-digit numbers, a feat that would tax most of us. The grandchildren of the Babbage machine can call on as many as a billion bits of information stored on tape. As you watch the reels of tape spinning, halting abruptly, and spinning again so purposefully, remember that the computer is remembering. In addition to its large memory, incidentally, a computer may also have a smaller “scratch-pad” memory to save time. Early machines used electromechanical relays or perhaps vacuum-tube “flip-flops” for memory. Punched-card files store data too. To speed up the access to information, designers tried the delay-line circuit, a device that kept information circulating in a mercury or other type of delay line. Magnetic drums and discs are also used. Magnetic tape on reels is used more than any other memory system for many practical reasons. There is one serious handicap with the tape system, however. Information on it, as on the drum, disc, file card, or delay line, is serial, that is, it is arranged in sequence. To reach a certain needed bit of data might require running through an entire reel of tape. Even though the tape moves at very high speed, time is lost while the computer’s arithmetic unit waits. For this reason the designers of the most advanced computers have gone to “random access” instead of sequential memory for part of the machine. Tiny cores of ferrite, a material with the desired magnetic properties, are threaded on wires. These become memory elements, as many as a hundred of them in an area the size of a postage stamp. Each core is at the intersection of two wires, one horizontal and one vertical. Each core thus has a unique “address” and because of the arrangement of the core matrix, any address can be reached in about the same amount of time as any other.
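How a single address singles out a single core can be put in a line or two. The sketch assumes, purely for illustration, a 100-by-100 plane in which the first half of the address names the vertical wire and the second half the horizontal one:

    # Decoding a core-matrix address into its two wire numbers.
    def select_core(address):
        vertical = address // 100    # energize this vertical wire
        horizontal = address % 100   # energize this horizontal wire
        return vertical, horizontal

    print(select_core(6564))         # -> (65, 64)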
Thus, instead of spinning the tape several hundred feet to reach address number 6,564, the computer simply closes the circuit of vertical row 65 and horizontal row 64, and there is the desired bit of information in the form of a magnetic field in the selected core. Hot on the heels of the development of random-access core memories came that of thin metallic film devices and so-called cryogenic or supercold magnetic components that do the same job as the ferrite cores but take only a fraction of the space. Some of these advanced devices also lend themselves to volume production and thus pave the way for memories with more and more information-storage capability.

[Illustration: _International Business Machines Corp._ Magnetic core plane, the computer’s memory. ]

In the realm of “blue-sky” devices, sometimes known as “journalistors,” are molecular block memories. These chunks of material will contain millions of bits of information in cubic inches of volume, and some way of three-dimensional scanning of the entire block will be developed. With such a high-volume memory, the computer of tomorrow will fit on a desk top instead of requiring rows and rows of tape-filled machines. Today, tape offers the cheapest “per bit” storage, and it is necessary to use the external or peripheral type of information storage. This is not much of a problem except for the matter of space. Since most computers are electronic, all that is required to tie the memory units to the arithmetic unit is wire connections. Douglas Aircraft ties computers in its California and North Carolina plants with 2,400 miles of telephone hookup. Sometimes even wires are not necessary. In the Los Angeles area, North American Aviation has a number of plants separated by as many as forty miles. Each plant is quite capable of using the computers in the other locations, with a stream of digits beamed by microwave radio from one to the other. Information can be transferred in this manner at rates up to 65,000 bits per second.

_Output_

Once the computer has taken the input of information, been instructed what to do, and used its arithmetic and memory, it has done the bulk of the work on the problem. But it must now reverse the procedure that took place when information flowed into it and was translated into electrical impulses and magnetic currents. It could convey the answer to another machine that spoke its language, but man would find such information unintelligible. So the computer has an output section that translates back into earth language. Babbage’s computer was to have printed out its answers on metal plates, and many computers today furnish punched cards or tape as an output. Others print the answers on sheets of paper, so rapidly that a page of this book would take little more than a second to produce! One of the greatest challenges of recent years is that of producing printing devices fast enough to exploit fully the terrific speeds of electronic computing machines. There would be little advantage in a computer that could add all the digits in all the phone books in the world in less than a minute if it took three weeks to print out the answer. Impact printers, those that actually strike keys against paper, have been improved to the point where they print more than a thousand lines of type, each with 120 characters in it, per minute. But even this is not rapid enough in some instances, and completely new kinds of printers have been developed.
One is the Charactron tube, a device combining a cathode-ray tube, something like the TV picture tube, with an interposed 64-character matrix about half an inch in diameter. Electrical impulses deflect the electron beam in the tube so that it passes through the proper matrix character and forms that image on the face of the tube. This image then is printed electrostatically on the treated paper rather than with a metal type face. With no moving parts except the paper, and of course the electrons themselves, the Charactron printer operates close to the speed of the computer itself, and produces 100,000 words a minute. This entire book could be printed out in about forty-five seconds in this manner.

[Illustration: _Minneapolis-Honeywell, Electronic Data Processing Division_ A high-speed printer is the output of this computer. It prints 900 _lines_ a minute. ]

There are many other kinds of outputs. Some are in the form of payroll checks, rushing from the printer at the rate of 10,000 an hour. Some are simply illuminated numbers and letters on the face of the computer. As mentioned earlier, the SAGE air defense computer displays the tracks of aircraft and missiles on large screens, each accurately tagged for speed, altitude, and classification. The computer may even speak its answer to us audibly. General Electric engineers have programmed computers to play music, and come up with a clever giveaway record titled “Christmas Carols in 210 Time,” à la pipe-organ solo. Some more serious musical work is now being done in taking a musical input fed to a computer, programming it for special effects including the reverberant effect of a concert hall, and having that played as the output. A more direct vocal output is the spoken word. Some computers have this capability now, with a modest vocabulary of their own and an extensive tape library to draw from. As an example, Gilfillan Radio has produced a computerized ground-control-approach system that studies the radar return of the aircraft being guided, and “tells” the pilot how to fly the landing. All the human operator does is monitor the show. The system uses the relatively simple method of selecting the correct words from a previously tape-recorded human voice. More sophisticated systems will be capable of translating code from the computer directly into an audible output. One very obvious advantage of such an automatic landing system is that the computer is never subject to a bad day, nerves, or fright. It will talk the aircraft down calmly and dispassionately, albeit somewhat mechanically. These then are the five basic parts of a computer or computer system: input, control, arithmetic-logic, memory, and output. Remember that this applies equally to simple and complex machines, and also to computers other than the more generally encountered electronic types. For while the electronic computer is regarded as the most advanced, it is not necessarily the final result of computer development. Let us consider some of the deviants, throwbacks, and mutations of the computer species.

[Illustration: _Kearfott Division, General Precision, Inc._ The tiny black box is capable of the same functions as the larger plastic laboratory model pneumatic digital computer. Packaging densities of more than 2,000 elements per cubic inch are expected. ]

_Another Kind of Computer_

We have discussed mechanical, electromechanical, electrical, and electronic computers.
There are also those which make use of quite different media for their operation: hydraulics, air pressure, and even hot gases. The pneumatic is simplest to explain, and also has its precedent in the old player-piano mentioned earlier. Just as an electric or electronic switch can be open or closed, so can a pneumatic valve. The analogy carries much further. Some of the basic electronic components used in computers are diodes, capacitors, inductors, and “flip-flop” circuits which we have talked of. Each of these, it turns out, can be approximated by pneumatic devices. The pneumatic diode is the simplest component, being merely an orifice or opening through which gas is flowing at or above the speed of sound. Under these conditions, any disturbance in pressure “upstream” of the orifice will move “downstream” through the orifice, but any such happening downstream cannot move upstream. This is analogous to the way an electronic diode works in the computer, a one-way valve effect. The electrical capacitor with its stored voltage charge plays an important part in computer circuitry. A plenum chamber, or box holding a volume of air, serves as a pneumatic capacitor. Similarly, the effect of an inductor, or coil, is achieved with a long pipe filled with moving air. The only complicated element in our pneumatic computer building blocks is the flip-flop, or bistable element. A system of tubes, orifices, and balls makes a device that assumes one position upon the application of pneumatic force, and the other upon a successive application, similar to the electronic flip-flop. Pneumatic engineers use terms like “pressure drop” and “pneumatic buffering,” comparable to voltage drop and electrical buffering. A good question at this point is just why computer designers are even considering pneumatic methods when electronic computers are doing such a fine job. There are several reasons that prompt groups like the Kearfott Division of General Precision Inc., AiResearch, IBM’s Swiss Laboratory, and the Army’s Diamond Ordnance Fuze Laboratory to develop the air-powered computers. One of these is radiation susceptibility. Diodes and transistors have an Achilles heel in that they cannot take much radiation. Thus in military applications, and in space work, electronic computers may be incapable of proper operation under exposure to fallout or cosmic rays. A pneumatic computer does not have this handicap. High temperature is another bugaboo of the electronic computer. For operation above 100° C., for instance, it is necessary to use expensive silicon semiconductor elements. The cryogenic devices we talked of require extremely low temperatures and are thus also ruled out in hot environments. The pneumatic computer, on the other hand, can actually operate on the exhaust gases of a rocket with temperatures up to 2000° F. There may be something humanlike in this ability to operate on hot air, but there are more practical reasons like simplicity, light weight, and low cost. The pneumatic computer, of course, has limitations of its own. The most serious is that of speed, and its top limit seems to be about 100 kilocycles a second. Although this sounds fast—a kilocycle being a thousand cycles, remember—it is tortoise-slow compared with the 50-megacycle speed of present electronic machines. But within its limitations the pneumatic machine can do an excellent job. 
Kearfott plans shrinking 3,000 pneumatic flip-flops and their power supply and all circuitry into a one-inch cube, and packing a medium-size general-purpose digital computer complete with memory into a case 5-1/2 inches square and an inch thick. Such a squeezing of components surely indicates _compressed_ air as a logical power supply! Going beyond the use of air as a medium, Army researchers have worked with “fluid” flip-flops capable of functioning at temperatures ranging from minus 100° to plus 7,000° F.! The limit is dictated only by the material used to contain the fluid, and would surely meet requirements for the most rigorous environment foreseeable. The fluid flip-flop operates on a different principle from its pneumatic cousin, drawing on fluid dynamics to shift from one state to the other. Fluid dynamics permits the building of switches and amplifiers that simulate electronic counterparts adequately, and the Army’s Diamond Ordnance Fuze Laboratory has built such oscillators, shift registers, and full adders, the flesh and bones of the computer. Researchers believe components can be built cheaply and that ultimately a complete fluid computer can be assembled. The X-15 is cited as an example of a good application for fluid-type computing devices. The hypersonic aircraft flies so fast it glows, and a big part of its problem is the cooling of a large amount of electronic equipment that generates additional heat to compound the difficulty. Missiles and space vehicles have similar requirements. Tomorrow’s computer may use liquid helium or a white-hot plasma jet instead of electronics or gas as a medium. It may use a medium nobody has dreamed of yet, or one tried earlier and discarded. Regardless of what it uses, it will probably work on the same basic theory and principles we’ve outlined here. And try as we may, we will get no more out of it than we put in.

[Illustration: By Herbert Goldberg © 1961 Saturday Review “Is this your trouble?” ]

------------------------------------------------------------------------

“_It is the machines that make life complicated, at the same time that they impose on it a high tempo._”

—Carl Lotus Becker

4: Computer Cousins—Analog and Digital

There are many thousands of computers in operation today—in enough different outward varieties to present a hopeless classification task to the confused onlooker. Actually there are only two basic types of computing machines, the analog and the digital. There is also a third computer, an analog-digital hybrid that makes use of the better features of each to do certain jobs more effectively. The distinction between basic types is clear-cut and may be explained in very simple terms. Again we go to the dictionary for a starting point. Webster says: “Analogue.—That which is analogous to some other thing.” Even without the terminal _ue_, the analog computer is based on the principle of analogy. It is actually a model of the problem we wish to solve. A tape measure is an analog device; so is a slide rule or the speedometer in your car. These of course are very simple analogs, but the principle of the more complex ones is the same. The analog computer, then, simulates a physical problem and deals in quantities which it can measure. Some writers feel that the analog machine is not a computer at all in the strict sense of the word, but actually a laboratory model of a physical system which may be studied and measured to learn certain implicit facts.
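The analogy idea can even be mimicked in a few lines of code. Here a slide rule is imitated—our sketch, with a three-figure reading precision that is typical of such rules rather than exact—by adding two lengths proportional to logarithms and reading the result off the scale:

    # An analog device measures rather than counts: slide-rule multiplication.
    import math

    def slide_rule_multiply(x, y):
        length = math.log10(x) + math.log10(y)  # add the two log-scale lengths
        product = 10 ** length
        magnitude = math.floor(math.log10(product))
        return round(product, 2 - magnitude)    # readable to about 3 figures

    print(slide_rule_multiply(1.17, 8.35))      # reads 9.77, not the exact 9.7695

A digital machine, counting instead of measuring, would deliver the exact product every time.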
[Illustration: _Minneapolis-Honeywell Computer Center_

A multimillion dollar aerospace computer facility. On left is an array of 16 analog computers; at right is a large digital data-processing system. The facility can perform scientific and business tasks simultaneously.
]

The dictionary also gives us a good clue to the digital computer: “Digital.—Of the fingers or digits.” A digital machine deals in digits, or discrete units, in its calculations. For instance, if we ask it to multiply 2 times 2, it answers that the product is exactly 4. A slide rule, which we have said is an analog device, might yield an answer of 3.98 or 4.02, depending on the quality of its workmanship and our eyesight. The term “discrete” describes the units used by the digital machine; an analog machine deals with “continuous” quantities. When you watch the pointer on your speedometer you see that it moves continuously from zero to as fast as you can or dare drive. The gas gauge is a graphic presentation of the amount of fuel in the tank, just as the speedometer is a picture of your car’s speed. For convenience we interpolate the numbers 10, 20, 30, 1/4, 1/2, and so on. What we do, then, is to convert from a continuous analog presentation to a digital answer with our eyes and brain. This analog-to-digital conversion is not without complications leading to speeding tickets and the inconvenience of running out of gas far from a source of supply. A little thought will reveal that even prior to computers there were two distinct types of calculating; those of measuring (analog) and of counting (digital). Unless we are statisticians, we encounter 2-1/2 men or 3-1/2 women about as frequently as we are positive that there is exactly 10 gallons of fuel in the gas tank. In fact, we generally use the singular verb with such a figure since the 10 gallons is actually an arbitrary measurement we have superimposed on a quantity of liquid. Counting and measuring, then, are different things. Because of the basic differences in the analog and digital computers, each has its relative advantages and disadvantages with respect to certain kinds of problems. Let us consider each in more detail and learn which is better suited to particular tasks. Using alphabetical protocol, we take the analog first.

_The Analog Measuring Stick_

We have mentioned the slide rule, the speedometer, and other popular examples of analog computers. There are of course many more. One beautiful example occurs in nature, if we can accept a bit of folklore. The caterpillar is thought by some to predict the severity of the winter ahead by the width of the dark band about its body. Even if we do not believe this charming relationship exists, the principle is a fine illustration of simulation, or the modeling of a system. Certainly there are reverse examples in nature not subject to any speculation at all. The rings in the trunk of a tree are accurate pictures of the weather conditions that caused them. These analogies in nature are particularly fitting, since the analog computer is at its best in representing a physical system. While we do not generally recognize such homely examples as computers, automatic record-changers, washing machines, electric watt-hour meters, and similar devices are true analogs. So of course is the clock, one of the earliest computers made use of by man. While Babbage was working with his difference engine, another Englishman, Lord Kelvin, conceived a brilliant method of predicting the height of tides in various ports.
He described this system for solving differential equations in 1876 in the _Proceedings of the Royal Society_. A working model of this “differential analyzer,” which put calculus on an automated basis, was built by Kelvin’s brother, James Thomson. Thomson used mechanical principles in producing this analog computer, whose parts were discs, balls, and cylinders.

[Illustration: _Science Materials Center_

A simple analog computer designed to be assembled and used by teen-agers. Calculo performs multiplication and division within 5 per cent accuracy, and is a useful demonstration device.
]

Early electrical analogs of circuits built around 1920 in this country have been discussed briefly in the chapter on the computer’s past. The thing that sparked their development was an engineer’s question, “Why don’t we build a little _model_ of these circuits?” Solving problems in circuitry was almost like playing with toys, using the circuit analyzers, although the toys grew to sizable proportions with hundreds of components. Some of the direct-current analog type are still operating in Schenectady, New York, and at Purdue University. A simple battery-powered electric analog gives us an excellent example of the principle of all analog machines. Using potentiometers, which vary the resistance of the circuit, we set in the problem. The answer is read out on a voltmeter. Quite simply, a known input passing through known resistances will result in a proportional voltage. All that remains is assigning values to the swing of the voltmeter needle, a process called “scaling.” For instance, we might let one volt represent 100 miles, or 50 pounds, or 90 degrees. Obviously, as soon as we have set in the problem, the answer is available on the voltmeter. It is this factor that gives the analog computer its great speed. General Electric and Westinghouse were among those building the direct-current analyzer, and the later alternating-current network type which came along in the 1930’s. The mechanical analogs were by no means forgotten, even with the success of the new electrical machines. Dr. Vannevar Bush, famous for many other things as well, started work on his analog mechanical differential analyzer in 1927 at the Massachusetts Institute of Technology. Bush drew on the pioneering work of Kelvin and other Englishmen, improving the design so that he could do tenth-order calculations. Following Bush’s lead, engineers at General Electric developed further refinements to the “Kelvin wheels,” using electrical torque amplifiers for greater accuracy. The complexity of these computers is indicated in the size of one built in the early 1940’s for the University of California. It was a giant, a hundred feet long and filled with thousands of parts. Not merely huge, it represented a significant stride ahead in that it could perform the operation of integration with respect to functions other than just time. Instead of being a “direct” analog, the new machine was an “indirect” analog, a model not of a physical thing but of the mathematics expressing it. Engineers realized that the mechanical beast, as they called it, represented something of a dinosaur in computer evolution and could not survive. Because of its size, it cost thousands of dollars merely to prepare a place for its installation. Besides, it was limited in the scope of its work. During World War II, however, it was all we had, and beast or not, it worked around the clock solving engineering problems, ballistics equations, and the like.
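The scaling idea in the battery-powered analog described above is simple enough to set down in code. The sketch below is an editor’s illustration in Python (the scale factor and the sample problem are invented, and the real machine does all of this with resistances and a needle, not with program steps):

    # "Scaling": we decree that one volt shall stand for 100 miles.
    VOLTS_PER_MILE = 1.0 / 100.0

    def set_in(miles):
        """Set a distance into the machine as a voltage, as the potentiometers do."""
        return miles * VOLTS_PER_MILE

    def read_out(volts):
        """Interpret the swing of the voltmeter needle back into miles."""
        return volts / VOLTS_PER_MILE

    # Adding two distances becomes adding two voltages; the answer is on
    # the voltmeter the instant the problem is set in.
    print(read_out(set_in(120) + set_in(230)))   # 350.0 miles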
England did work in this field, and Meccano—counterpart of the Gilbert Erector Set firm in the United States—marketed a do-it-yourself differential analyzer. The Russians too built mechanical differential analyzers as early as 1940. Electronics came to the rescue of the outsized mechanical analog computers during and after the war. Paced by firms like Reeves Instrument and Goodyear Aircraft, the electronic analog superseded the older mechanical type. There was of course a transitional period, and an example of this stage is the General Electric fire-control computer installed in the B-29. It embraced mechanical, electrical, and electronic parts to do just the sort of job ideally suited to the analog type of device: that of tracking a path through space and predicting the future position of a target so that the gunsight aims at the correct point in space for a hit. Another military analog computer was the Q-5, used by the Signal Corps to locate enemy gun installations. From the track of a projectile on a radar screen, the Q-5 did some complicated mathematics to figure backwards and pinpoint the troublesome gun. There were industrial applications as well for the analog machine. In the 1950’s, General Electric built computers to solve simultaneous linear equations for the petroleum industry. To us ultimate users, gasoline poses only one big mathematical problem—paying for a tankful. Actually, the control operations involved in processing petroleum are terribly involved, and the special analog computer had to handle twelve equations with twelve unknown quantities simultaneously. This is the sort of problem that eats up man-years of human mathematical time; even a modern digital computer has tough and expensive going, but the analog does this work rapidly and economically. Another interesting analog machine was called the Psychological Matrix Rotation Computer. This implemented an advanced technique called multiple-factor analysis, developed by Thurstone of the University of Chicago for use in certain psychological work. Multiple-factor analysis is employed in making up the attribute tests used by industry and the military services for putting the right man in the right job. An excellent method, it was too time-consuming for anything but rough approximations until the analog computer was built for it. In effect, the computer worked in twelve dimensions, correlating traits and aptitudes. It was delivered to the Adjutant General’s Office and is still being used, so Army men who wonder how their background as baker qualifies them for the typing pool may have the Psychological Matrix Rotation Computer to thank. In the early 1950’s, world tension prompted the building of another advanced analog computer, this one a jet engine simulator. Prior to its use, it took about four years to design, build, and test a new jet engine. Using the simulator, the time was pared to half that amount. It was a big computer, even though it was electronic. More than 6,000 vacuum tubes, 1,700 indicator lights, and 2,750 dials were hooked up with more than 25 miles of wire, using about 400,000 interconnections. All of this required quite a bit of electrical power, about what it would take to operate fifty kitchen ranges. But it performed in “real” time, and could keep tabs on an individual molecule of gas from the time it entered the jet intake until it was ejected out the afterburner!
Other analog computers were developed for utility companies to control the dispatching of power to various consumers in the most efficient manner. Again the principle was simply to build a model or analog of an actual physical system and use it to predict the outcome of operation of that system. From our brief skim of the history of the analog computer we can recognize several things about this type of machine. Since the analog is a simulator in most cases, we would naturally expect it to be a special-purpose machine. In other words, if we had a hundred different kinds of problems, and had to build a model of each, we would end up with a hundred special-purpose computers. It follows too that the analog computer will often be a part of the system it serves, rather than a separate piece of equipment.

[Illustration: _The Boeing Co._

Analog machine used as flight simulator for jet airliner; a means of testing before building.
]

There are general-purpose analog computers, of course, designed for solving a broad class of problems. They are usually separate units, instead of part of the system. We can further break down the general-purpose analog computer into two types; direct and indirect. A direct analog is exemplified in the tank gauge consisting of a float with a scale attached. An indirect analog, such as the General Electric monster built for the University of California mentioned earlier, can use one dependent variable, such as voltage, to represent all the variables of the prototype. Such an analog machine is useful in automatic control and automation processes. Finally, we may subdivide our direct analog computer one further step into “discrete” analogs or “continuous” analogs. The term “discrete” is the quality we have ascribed to the digital computer, and a discrete analog is indicative of the overlap that occurs between the two types. Another example of this overlap is the representation of “continuous” quantities by the “step-function” method in a digital device. As we shall see when we discuss hybrid or analog-digital computers, such overlap is as beneficial as it is necessary.

[Illustration: _General Motors Corp._

Large analog computer in rear controls car, subjecting driver to realistic bumps, pitches, and rolls, for working out suspension problems of car.
]

We are familiar now with mechanical, electromechanical, and fully electronic analogs. Early machines used rods of certain lengths, cams, gears, and levers. Fully electronic devices substitute resistors, capacitors, and inductances for these mechanical components, adding voltages instead of revolutions of shafts, and counting turns of wire in a potentiometer instead of teeth on a gear. Engineers and technicians use terms like “mixer,” “integrator,” and “rate component,” but we may consider the analog computer as composed of passive networks plus amplifiers where necessary to boost a faint signal. Some consideration of what we have been discussing will give us an indication of the advantages of the analog computer over the digital type. First and most obvious, perhaps, is that of simplicity. A digital device for recording temperature could be built; but it would hardly improve on the simplicity of the ordinary thermometer. Speed is another desirable attribute of most analog computers. Since operation is parallel, with all parts of the problem being worked on at once, the answer is reached quickly.
This is of particular importance in “on-line” application where the computer is being used to control, let us say, an automatic machining operation in a factory. Even in a high-speed electronic digital computer there is a finite lag due to the speed of electrons. This “slack” is not present in a direct analog and thus there is no loss of precious time that could mean the difference between a rejected and a perfect part from the lathe. It follows from these very advantages that there are drawbacks too. The analog computer that automatically profiles a propeller blade in a metalworking machine cannot mix paint to specifications or control the speed of a subway train unless it is a very special kind of general-purpose analog that would most likely be the size of Grand Central Station and sell for a good part of the national debt. Most analogs have one particular job they are designed for; they are specialists with all the limitations that the word implies. There is one other major disadvantage that our analog suffers by its very nature. We can tolerate the approximate answer 3.98 instead of 4, because most of us recognize the correct product of 2 times 2. But few production managers would want to use 398 rivets if it took 400 to do the job safely—neither would they want to use 402 and waste material. Put bluntly, the analog computer is less accurate than its digital cousin. It delivers answers not in discrete units, but in approximations, depending on the accuracy of its own parts and its design. Calculo, an electrical-analog computer produced for science students, has an advertised accuracy of 5 per cent at a cost of about $20. The makers frankly call it an “estimator.” This is excellent for illustrating the principles of analog machines to interested youngsters, but the students could have mathematical accuracy of 100 per cent from a digital computer called the abacus at a cost of less than a dollar. Greater accuracy in the analog computer is bought at the expense of costlier components. Up to accuracies of about 1 per cent error it is usually cheaper to build an analog device than a digital, assuming such a degree of accuracy is sufficient, of course. Analog accuracies ten times better than the 1 per cent figure, that is, errors of 0.1 per cent, are feasible, but beyond that point costs rise very sharply and the digital machine becomes increasingly attractive from a dollars and cents standpoint. Designers feel that accuracies within 0.01 per cent are pushing the barriers of practicality, and 0.001 per cent probably represents the ultimate achievable. Thus the digital computer has the decided edge in accuracy, if we make some realistic allowances. For example, the best digital machine when asked to divide 10 by 3 can never give an exact answer, but is bound to keep printing 3’s after the decimal point! There are other differences between our two types of computers, among them being the less obvious fact that it is harder to make a self-checking analog computer than it is to build the same feature into the digital. However, the most important differences are those of accuracy and flexibility. For these reasons, the digital computer today is in the ascendant, although the analog continues to have its place and many are in operation on a variety of chores. We have mentioned fire control and the B-29 gunsight computer in particular. This was a pioneer airborne computer, and proved that an analog could be built light enough for such applications. However, most fire-control computers are earthbound because of their size and complexity.
A good example is the ballistic computer necessary for the guns on a battleship. In addition to the normal problem of figuring azimuth and elevation to place a shell on target, the gun aboard ship has the additional factors of pitch, roll, and yaw to contend with. These inputs happen to be ideal for analog insertion, and a properly designed computer makes corrections instantaneously as they are fed into it. A fertile field for the analog computer from the start was that of industrial process control. Chemical plants, petroleum refineries, power generating stations, and some manufacturing processes lend themselves to control by analog computers. The simplicity and economy of the “modeling” principle, plus the instantaneous operation of the analog, made it suitable for “on-line” or “on-stream” applications. The analog computer has been described as useful in the design of engines; it also helps design the aircraft in which these engines are used, and even simulates their flight. A logical extension of this use is the training of pilots in such flight simulators. One interesting analog simulator built by Goodyear Aircraft Corporation studied the reactions of a pilot to certain flight conditions and then was able to make these reactions itself so faithfully that the pilot was unaware that the computer and not his own brain was accomplishing the task. The disciplines of geometry, calculus, differential equations, and other similar mathematics profit from the analog computer, which is able to make a model of their curves and configurations and thus greatly speed calculations. Since the analog is so closely tied to the physical rather than the mental world, it cannot cope with discrete numbers, and formal logic is not its cup of tea. To be sure, progress has been made and improvements continue to be designed into modern analog computers. Repetitive operations can now be done automatically at high speed, and the computer even has a memory. High-speed analog storage permits the machine to make sequential calculations, a job once reserved for the digital computer. But even these advances cannot offset the basic limitations the analog computer is heir to. Fewer analog machines are being built now, and many in existence do not enjoy the busy schedule of the digital machines. As the mountains of data pile up, created incidentally by computers in the first place, more computers are needed to handle and make sense of them. It is easier to interpret, store, and transmit digital information than analog; the digital computer therefore takes over this important task. Even in control systems the digital machine is gaining popularity; its tremendous speed offsets its inherent cumbersomeness and its accuracy tips the scales more in its favor. These advantages will be more apparent as we discuss the digital machine on the next pages and explain the trend toward the hybrid machine, ever becoming more useful in the computer market place. Of course, there will always be a place for the pure analog—just as there has always been for any specialist, no matter what his field.

_The Digital Counter_

The digital computer was first on the scene and it appears now that it will outnumber and perhaps outlive its analog relative. A simple computer of this type is as old as man, though it is doubtful that it has been in use that long. Proof of this claim to its pioneering are the words _digit_ and _calculi_, for finger and pebbles, respectively.
We counted “how many” before we measured “how large,” and the old Romans tallied on fingers until they ran out and then supplemented with pebbles. Perhaps the first computations more complex than simple counting of wives or flocks came about when some wag found that he could ascertain the number of sheep by counting legs and dividing by four. When it was learned that the thing worked both ways and that the number of pickled pigs’ feet was four times the number of pigs processed, arithmetic was born. The important difference between analog and digital, of course, is that the latter is a means of counting, a dealing with discrete numbers rather than measuring. This kind of computation was taxed sorely when such things as fractions and relationships like _pi_ came along, but even then man has managed to continue dealing with numbers themselves rather than quantity. Just as the slide rule is a handy symbol for the analog computer, the abacus serves us nicely to illustrate the digital type, and some schools make a practice of teaching simple arithmetic to youngsters in this manner. Our chapter on the history of the computer touched on early efforts in the digital field, though no stress was laid on the distinction between types. We might review a bit, and pick out which of the mechanical calculating devices were actually digital. The first obviously was the abacus. It was also the only one for a long time. Having discovered the principle of analogy, man leaned in that direction for many centuries, and clocks, celestial simulators, and other devices were analog in nature. Purists point out that even the counting machines of Pascal and Leibnitz were analog computers, since they dealt with the turning of shafts and gears rather than the manipulation of digits. The same reasoning has caused some debate about Babbage’s great machines in the 1800’s, although they are generally considered a digital approach to problem-solving. Perhaps logicians had as much as anyone to do with the increasing popularity of the digital trend when they pointed out the advantages of a binary or two-valued system. With the completion in 1946 by Eckert and Mauchly of the electronic marvel they dubbed ENIAC, the modern digital computer had arrived and the floodgates were opened for the thousands of descendants that have followed. For every analog computer now being built there are dozens or perhaps hundreds of digital types. Such popularity must be deserved, so let us examine the creature in an attempt to find the reason.

[Illustration: Courtesy of the _National Science Foundation_

The computer family tree. Its remarkable growth began with government-supported research, continued in the universities; and the current generation was developed primarily in private industry.
]

We said that by its nature the analog device tended to be a special-purpose computer. The digital computer, perhaps because its basic operation is so childishly simple, is best suited for general-purpose work. It is simple, consisting essentially of switches that are either on or off. Yet Leibnitz found beauty in that simplicity, and even the explanation of the universe. Proper interconnection of sufficient on-off switches makes possible the most flexible of all computers—man’s brain. By the same token, man-made computers of the digital type can do a wider variety of jobs than can the analog which seemingly is more sophisticated. A second great virtue of the digital machine is its accuracy. Even a trial machine of Babbage had a 5-place accuracy.
This is an error of only one part in ten thousand, achievable in the analog at great expense. This was of course only a preliminary model, and the English inventor planned 20-place accuracy in his dream computer. Present electronic digital computers offer 10-place accuracy as commonplace, a precision impossible of achievement in the analog. We pointed out in the discussion of analog computers that the complexity and expense of increased accuracy was in direct proportion to the degree of accuracy desired. Happily for the digital machine, the reverse is true in its case. Increasing accuracy from five to six figures requires a premium of one-fifth, or 20 per cent. But jumping from 10-place to 11-place precision costs us only 10 per cent, and from 20-place to 21-place drops to just 5 per cent. Actually, such a high degree of accuracy is not necessary in most practical applications. For example, the multiplication of 10-digit numbers may yield a 20-digit answer. If we desired, we could increase the capability of our digital computer to twenty digits and give an accuracy of one part in 10 million trillion! However, we simply “round off” the last ten digits and leave the answer in ten figures, an accuracy no analog computer can match. The significant point is that the analog can never hope to compete with digital types for accuracy. A third perhaps not as important advantage the digital machine has is its compactness. We are speaking now of later computers, and not the pioneer electromechanical giants, of course. The transistor and other small semiconductor devices supplanted the larger tubes, and magnetic cores took the place of cruder storage components. Now even more exotic devices are quietly ousting these, as magnetic films and cryotrons begin to be used in computers.

[Illustration: _Science Materials Center_

BRAINIAC, another do-it-yourself computer. This digital machine is here being programmed to solve a logic problem involving a will.
]

This drastic shrinking of size by thinking small on the part of computer designers increases the capacity of the digital computer at no sacrifice in accuracy or reliability. The analog, unfortunately, cannot make use of many of these solid-state devices. Again, the bugaboo of accuracy is the reason; let’s look further into the problem. The most accurate and reliable analog computers are mechanical in nature. We can cut gears and turn shafts and wheels to great accuracy and operate them in controlled temperature and humidity. Paradoxically, this is because mechanical components are nearer to digital presentations than are electrical switches, magnets, and electronic components. A gear can have a finite number of teeth; when we deal with electrons flowing through a wire we leave the discrete and enter the continuous world. A tiny change in voltage or current, or magnetic flux, compounded several hundred times in a complex computer, can change the final result appreciably if the errors are cumulative, that is, if they are allowed to pile up. This is what happens in the analog computer using electrical and electronic components instead of precisely machined cams and gears. The digital device, on the other hand, is not so penalized. Though it uses electronic switches, these can be so set that even an appreciable variation in current or voltage or resistance will not affect the proper operation of the switch. We can design a transistor switch, for example, to close when the current applied exceeds a certain threshold.
We do not have to concern ourselves if this excess current is large or small; the switch will be on, no more and no less. Or it will be completely off. Just as there is no such thing as being a little bit dead, there is no such thing as a partly off digital switch. So our digital computer can make use of the more advanced electronic components to become more complex, or smaller, or both. The analog must sacrifice its already marginal accuracy if it uses more electronics. The argument here is simplified, of course; there are electronic analog machines in operation. However, the problem of the “drift” of electronic devices is inherent and a limiting factor on the performance of the analog. These, then, are some of the advantages the digital computer has over its analog relative. It is more flexible in general—though there are _some_ digital machines that are more specialized than _some_ analog types; it is more accurate and apparently will remain so; and it is more amenable to miniaturization and further complexity because its designer can use less than perfect parts and produce a perfect result. In the disadvantage department the digital machine’s only drawback seems to be its childish way of solving problems. About all it knows how to do is to add 1 and 1 and come up with 2. To multiply, it performs repetitive additions, and solving a difficult equation becomes a fantastically complex problem when compared with the instantaneous solution possible in the analog machine. The digital computer redeems itself by performing its multitudinous additions at fabulous speeds. Because it must be fed digits in its input, the digital machine is not economically feasible in many applications that will probably be reserved for the analog. A digital clock or thermometer for household use would be an interesting gimmick, but hardly worth the extra trouble and expense necessary to produce. Even here, though, first glances may be wrong and in some cases it may prove worth while to convert analog inputs to digital with the reverse conversion at the output end. One example of this is the airborne digital computer which has taken over many jobs earlier done by analog devices. There is another reason for the digital machine’s ubiquitousness, a reason it does not seem proper to list as merely a relative advantage over the analog. We have described the analog computer used as an aid to psychological testing procedures, and its ability to handle a multiplicity of problems at once. This perhaps tends to obscure the fact that the digital machine by its very on-off, yes-no nature is ideally suited to the solving of problems in logic. If it achieves superiority in mathematics in spite of its seemingly moronic handling of numbers, it succeeds in logic because of this very feature. While it might seem more appropriate that music be composed by analogy, or that a chess-playing machine would likely be an analog computer, we find the digital machine in these roles. The reason may be explained by our own brains, composed of billions of neurons, each capable only of being on or off. While many philosophers build a strong case for the yes-no-maybe approach with its large areas of gray, the discipline of formal logic admits to only two states, those that can so conveniently be represented in the digital computer’s flip-flop or magnetic cores. The digital computer, then, is not merely a counting machine, but a decision-maker as well. It can decide whether something should be added, subtracted, or ignored.
Its logical manipulations can by clever circuitry be extended from AND to OR, NOT, and NOR. It thus can solve not only arithmetic, but also the problems of logic concerning foxes, goats, and cabbages, or cannibals and missionaries that give us human beings so much trouble when we encounter them. The fact that the digital computer is just such a rigorously logical and unbending machine poses problems for it in certain of its dealings with its human masters. Language ideally should be logical in its structure. In general it probably is, but man is so perverse that he has warped and twisted his communications to the point that a computer sticking strictly to book logic will hit snags almost as soon as it starts to translate human talk into other human talk, or into a logical machine command or answer. For instance, we have many words with multiple meanings which give rise to confusion unless we are schooled in subtleties. There are stories, some of them apocryphal but nonetheless pointing up the problem, of terms like “water goat” cropping up in an English-to-Russian translation. Investigation proved that the more meaningful term would have been “hydraulic ram.” In another interesting experiment, the expression, “the spirit is willing but the flesh is weak” was machine translated into Russian, and then that result in turn re-translated back into English much in the manner of the party game of “Telephone” in which an original message is whispered from one person to another and finally back to the originator. In this instance, the final version was, “The vodka is strong, but the meat is rotten.” It is a fine distinction here as to who is wrong, the computer or man and his irrational languages. Chances are that in the long run true logic will prevail, and instead of us confusing the computer, it will manage to organize our grammar into the more efficient tool it should be. With proper programming, the computer may even be able to retain sufficient humor and nuance to make talk interesting and colorful as well as utilitarian. We can see that the digital machine with its flexibility, accuracy, and powerful logical capability is the fair-haired one of the computer family. Starting with _a_ for abacus, digital computer applications run through practically the entire alphabet. Its take-over in the banking field was practically overnight; it excels as a tool for design and engineering, including the design and engineering of other computers. Aviation relies heavily on digital computers already, from the sale of tickets to the control of air traffic. Gaming theory is important not only to the Saturday night poker-player and the Las Vegas casino operator, but to military men and industrialists as well. Manufacturing plants rely more and more on digital techniques for controls. Language translation, mentioned lightly above, is a prime need at least until we all begin speaking Esperanto, Ido, or Computerese. Taxation, always with us, may at least be more smoothly handled when the computers take over. Insurance, the arrangement of music, spaceflight guidance, and education are random fields already dependent more or less on the digital computer. We will not take the time here to go thoroughly into all the jobs for which the computer has applied for work and been hired; that will be taken up in later chapters. But from even a quick glance the scope of the digital machine already should be obvious. This is why it is usually a safe assumption that the word computer today refers to the digital type.
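The gate operations mentioned above are easy to exhibit in today’s programming terms. The sketch below is an editor’s illustration in Python (the function names are ours; in the machine these operations are wired circuits, not program statements). Each gate takes inputs that are 0 or 1, off or on, and delivers a 0-or-1 output:

    def NOT(a):
        return 1 - a                   # invert: on becomes off, off becomes on

    def AND(a, b):
        return a & b                   # on only if both inputs are on

    def OR(a, b):
        return a | b                   # on if either input is on

    def NOR(a, b):
        return NOT(OR(a, b))           # an OR followed by a NOT

    # The whole truth table, as a panel of lamps might display it:
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", AND(a, b), OR(a, b), NOR(a, b))

Every decision the machine makes, to add, subtract, or ignore, is some arrangement of gates like these.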
_Hybrid Computers_

We have talked of the analog and the digital; there remains a further classification that should be covered. It is the result of a marriage of our two basic types, a result naturally hybrid. The analog-digital computer is third in order of importance, but important nonetheless.

[Illustration: _Minneapolis-Honeywell_

Nerve center of Philadelphia Electric Company’s digital computer-directed automatic economic dispatch system is this console from which power directors operate and supervise loading of generating units at minimum incremental cost.
]

Necessity, as always, mothered the invention of the analog-digital machine. We have talked of the relative merits of the two types; the analog is much faster on a complex problem such as solving simultaneous equations. The digital machine is far more accurate. As an example, the Psychological Matrix Rotator described earlier could solve its twelve equations practically instantaneously. A digital machine might take seconds—a terribly long time by computer standards. If we want an accurate high-speed differential analyzer, we must combine an analog with a digital computer. Because the two are hardly of the same species, this breeding is not an easy thing. But by careful study, designers effected the desired mating. The hybrid is not actually a new type of computer, but two different types tied together and made compatible by suitable converters. The composite consists of a high-speed general-purpose digital computer, an electronic analog computer, an analog-to-digital converter, a digital-to-analog converter, and a suitable control for these two converters. The converters are called “transducers” and have the ability of changing the continuous analog signal into discrete pulses of energy, or vice versa. Sometimes called digital differential analyzers, the hybrid computers feature the ease of programming of the analog, plus its speed, and the accuracy and much broader range of the digital machine. Bendix among others produced such machines several years ago. The National Bureau of Standards recently began development of what it calls an analog-digital differential analyzer which it expects to be from ten to a hundred times more accurate than earlier hybrid computers. The NBS analyzer will be useful in missile and aircraft design work. Despite its apparent usefulness as a compromise and happy medium between the two types, the hybrid would seem to have as limited a future as any hybrid does. Pure digital techniques may be developed that will be more efficient than the stopgap combination, and the analog-digital will fall by the wayside along the computer trail.

_Summary_

Historically, the digital computer was first on the scene. The analog came along, and for a time was the more popular for a variety of reasons. One of these was the naïve, cumbersome mode of operation the digital computer is bound to; another its early lack of speed. Both these drawbacks have been largely eliminated by advances in electronics, and apparently this is only the beginning. In a few years the technology has progressed from standard-size vacuum tubes through miniature tubes and the shrinking of other components, to semiconductors and other tinier devices, and now we have something called integrated circuitry, with molecular electronics on the horizon. These new methods promise computer elements approaching the size of the neurons in our own brains, yet with far faster speed of operation.
Such advances help the digital computer more than the analog, barring some unexpected breakthrough in the accuracy problem of the latter. Digital building blocks become ever smaller, faster, cheaper, and more reliable. Computers that fit in the palm of the hand are on the market, and are already bulky by comparison with those in the laboratory. The analog-digital hybrid most likely will not be new life for the analog, but an assimilating of its better qualities by the digital.

------------------------------------------------------------------------

“‘_What’s one and one and one and one and one and one and one and one and one and one?_’
‘_I don’t know,’ said Alice. ‘I lost count._’
‘_She can’t do Addition,’ the Red Queen interrupted._”

—Lewis Carroll

5: The Binary Boolean Bit

In this world full of “bigness,” in which astronomical numbers apply not only to the speed of light and the distance to stars but to our national debt as well, it is refreshing to recall that some lucky tribes have a mathematical system that goes, “One—two—plenty!” Such an uncluttered life at times seems highly desirable, and we can only envy those who lump all numbers from three to billions as simply “plenty.” Instead we are faced today with about as many different number systems as there are numbers, having come a long way from the dawn of counting when an even simpler method than “one—two—plenty” prevailed. Man being basically self-centered, he first thought in terms of “me,” or one. Two was not a concept, but two “ones”; likewise, three “ones” and so on. Pebbles were handy, and to represent the ten animals slain during the winter, a cave man could make ten scratches on the wall or string out that many stones. It is said that the ancient cabbies in Rome had a taximeter that dropped pebbles one by one onto a plate as the wheels turned the requisite number of revolutions. This plate of stones was presented to the passenger at the end of his ride—perhaps where we get the word “fare”! Prices have risen so much that it would take quite a bag of pebbles in the taximeter today. Using units in this manner to express a sum is called the unitary system. It is the concept that gives rise to the “if all the dollars spent in this country since such and such a time were laid end to end—” analogies. Put to practice, this might indeed have a salutary effect, but long ago man learned that it was not practical to stick to a one-for-one representation. How long it was before we stumbled onto the fact that we had a “handy” counting system attached to our wrists is not positively known, but we eventually adopted the decimal system. In some places the jump from one to ten was not made completely. The Pueblo Indians, for instance, double up one fist each time a sum of five is reached. Thus the doubled fist and two fingers on the other hand signify seven. In the mathematician’s language, this is a modulo-5 system. The decimal system is modulo-10; in other words we start over each time after reaching 10. Besides giving us the word digit to tie fingers and numbers together, our hands gave the Roman numerals V and X their shapes: graphic representations of one hand with thumb widespread, and two hands crossed, respectively. A point worth remembering is that the decimal system was chosen arbitrarily because we happen to have ten digits. There is no divine arithmetical significance in the number 10; in fact mathematicians would prefer 12, since it can be divided more ways.
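That preference is easy to check in today’s programming terms (a small editor’s illustration in Python):

    # "12 can be divided more ways": its whole-number divisors versus 10's.
    divisors = lambda n: [d for d in range(1, n + 1) if n % d == 0]
    print(divisors(10))   # [1, 2, 5, 10]
    print(divisors(12))   # [1, 2, 3, 4, 6, 12]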
The ancient Mayans reasoned that if 10 were ten times as good as 1, then surely 20 would be twice the improvement of the decimal system. So they pulled off their boots and added toes to fingers for a modulo-20 number system. Their word for 20, then, was the same as that for “the whole man” for very good reason. Other races adopted even larger base systems, the base of 60 being an example. If we look to natural reasons for the development of number systems, we might decide that the binary, or two-valued system, did not attain much prominence in naïve civilizations because there are so few one-legged, two-toed animals! Only when man built himself a machine uniquely suited to two-valued mathematics did the binary system come into its own. Numbers are merely conventions, rigorous conventions to be sure with no semantic vagueness. God did not ordain that we use the decimal system, as evidenced in the large number of other systems that work just fine. Some abacuses use the biquinary system, and there are septimal, octal, and sexagesimal systems. We can even express numbers in an ABC or XYZ notation. So a broad choice was available for the computer designer when he began to look about for the most efficient system for his new machine. Considering only the question of a radix, or base, which will permit the fewest elements to represent the desired numbers, mathematicians can show us that the most efficient base is not 10, or 12, or any other whole number, but the decidedly unround number 2.71828. The ideal model is not found in man, then, since man does not seem to have 2.71828 of anything. However, the strange-looking number does happen to be the base of the system of natural logarithms. Now a system of mathematics based on 2.71828 might make the most efficient use of the components of the computer, but it would play hob with other factors, including the men who must work with such a weird set of numbers. As is often done, a compromise was made between ideal and practical choices. Since the computer with the most potential seems to be the electronic computer, and since its operation hinges on the opening and closing of simple or sophisticated switches, a two-valued mathematical system, the binary system, was chosen. It wasn’t far from the ideal 2.71828, and there was another even more powerful reason for the choice. Logic is based on a yes-no, true-false system. Here, then, was the best of all possible number systems: the lowly, apparently far-from-sophisticated binary notation. As one writer exclaimed sadly, a concept which had been hailed as a monument to monotheism ended up in the bowels of a robot!

_The Binary System_

It is believed from ancient writings that the Chinese were aware of the binary or two-valued system of numbers as early as 3000 B.C. However, this fact was lost as the years progressed, and Leibnitz thought that he had discovered binary himself almost 5,000 years later. In an odd twist, Leibnitz apprised his friend Grimaldi, the Jesuit president of the Tribunal of Mathematics in China, of the religious significance of binary 1 and 0 as an argument against Buddhism! A legend in India also contains indications of the power of the binary system. The inventor of the game of chess was promised any award he wanted for this service to the king. The inventor asked simply that the king place a grain of wheat on the first square of the board, two on the second, and then four, eight, and so on in ascending powers of two until the sixty-four squares of the board were covered.
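The legend’s arithmetic is easy to check in today’s programming terms (an editor’s illustration in Python; the sum, not the code, is the point):

    # One grain on the first square, two on the second, four on the third,
    # doubling across all sixty-four squares of the board.
    total = sum(2 ** square for square in range(64))
    print(total)                    # 18446744073709551615
    print(total == 2 ** 64 - 1)     # True: the grains add up to 2 to the 64th power, less one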
Although the king thought his subject a fool, this amount of wheat would have covered the entire earth to a depth of about an inch! We are perhaps more familiar with the binary system than we realize. Morse code, with its dots and dashes, for example, is a two-valued system. And the power of a system with a base of two is evident when we realize that given a single one-pound weight and sufficient two-pound weights we can weigh _any_ whole-numbered amounts. At first glance, however, binary numbers seem a hopeless conglomeration of ones and zeros. This is so only because we have become conditioned to the decimal system, which was even more hopeless to us as youngsters. We may have forgotten, with the contempt of familiarity, that our number system is built on the idea of powers. In grade school we learned that starting at the right we had units, tens, hundreds, thousands, and so on. In the decimal number 111, for example, we mean 1 times 10^2, plus 1 times 10^1, plus 1. We have handled so many numbers so many times we have usually forgotten just what we are doing, and how. The binary system uses only two numbers: 1 and 0. So it is five times as simple as the decimal system. It uses powers of two rather than ten, again far simpler. Let’s take the binary number 111 and break it down just as we do a decimal number. Starting at the left, we have 1 times 2^2, plus 1 times 2^1, plus 1. This adds up to 7, and there is our answer. The decimal system is positional; this is what made it so much more effective in the simple expression of large numbers than the Roman numeral system. Binary is positional too, and for larger numbers we continue moving toward the left, increasing our power of two each time. Thus 1111 is 2^3 plus 2^2 plus 2^1 plus 1.

[Illustration: _System Development Corp._

A computer teaching machine answering a question about the binary system.
]

We are familiar with decimal numbers like 101. This means 1 hundred, no tens, and 1 unit. Likewise in binary notation 101 means one 4, no 2’s, and one 1. For all its seeming complexity, then, the binary system is actually simpler than the “easy” decimal one we are more familiar with. But despite its simplicity, the binary system is far from being inferior to the decimal system. You can prove this by doing some counting on your fingers. Normally we count, or tally, by bending down a finger for each new unit we want to record. With both hands, then, we can add up only ten units, a quite limited range. We can add a bit of sophistication, and assign a different number to each finger; thus 1, 2, 3, 4, 5, 6, 7, 8, 9, 10. Now, believe it or not, we can tally up to 55 with our hands! As each unit is counted, we raise and lower the correct finger in turn. On reaching 10, we leave that finger—thumb, actually—depressed, and start over with 1. On reaching 9, we leave it depressed, and so on. We have increased the capacity of our counting machine by 5-1/2 times without even taking off our shoes. The mathematician, by the way, would say we have a capability of not 55 but 56 numbers, since all fingers up would signify 0, which can be called a number. Thus our two hands represent to the mathematician a modulo-56 counter. This would seem to vanquish the lowly binary system for good, but let’s do a bit more counting. This time we will assign each finger a number corresponding to the powers of 2 we use in reading our binary numbers. Thus we assign the numbers 1, 2, 4, 8, 16, 32, 64, 128, 256, and 512. How many units can we count now?
Not 10, or 55, but a good bit better than that. Using binary notation, our ten digits can now record a total of 1,023 units. True, it will take a bit of dexterity, but by bending and straightening fingers to make the proper sums, when you finally have all fingers down you will have counted 1,023, or 1,024 if you are a mathematical purist. Once convinced that the binary method does have its merits, it may be a little easier to pursue a mastery of representing numbers in binary notation, difficult as it may seem at the outset. The usual way to convert is to remember, or list, the powers of 2, and start at the left side with the largest power that can be divided into the decimal number we want to convert. Suppose we want to change the number 500 into binary. First we make a chart of the positions:

    Power of 2        8    7    6    5    4    3    2    1    0
    ────────────────────────────────────────────────────────────
    Decimal Number  256  128   64   32   16    8    4    2    1
    ────────────────────────────────────────────────────────────
    Binary Number     1    1    1    1    1    0    1    0    0

Since 256 is the largest number that will go into 500, we start there, knowing that there will be nine binary digits, or “bits” in our answer. We place a 1 in that space to indicate that there is indeed an eighth power of 2 included in 500. Since 128 will go into the remainder, we put a 1 in that space also. Continuing in this manner, we find that we need 1’s until we reach the “8” space which we must skip since our remainder does not contain an 8. We mark a 1 in the 4 space, but skip the 2 and the 1. Our answer, then, in binary notation is 111110100. This number is called “pure binary.” It can also lead to pure torture for human programmers whose eyes begin to bug with this “bit chasing,” as it has come to be called. Everything is of course relative, and the ancient Roman might gladly have changed DCCCLXXXVIII to binary 1101111000, which is two digits shorter. There is a simpler way of converting that might be interesting to try out. We’ll start with our same 500. Since it is an even number, we put a 0 beneath it. Moving to the left, we divide by two and get 250. This also is an even number, so we mark down a 0 in our binary equivalent. The next division gives 125, an odd number, so we put down a 1. We continue to divide successively, marking a zero for each even remainder, and a 1 for the odd. Although it may not be obvious right away, we are merely arriving at powers of two by a process called mediation, or halving.

    Decimal   1   3   7   15   31   62   125   250   500
    ─────────────────────────────────────────────────────
    Binary    1   1   1    1    1    0     1     0     0

Obviously we can reverse this procedure to convert binary numbers to their decimal equivalents. There is an interesting extension of this process called duplication by which multiplication can be done quite simply. Let us multiply 95 times 36. We will halve our 95 as we did in the earlier example, while doubling the 36. This time when we have an even number in the left column, we will simply cancel out the corresponding number in the right column.

    95        36
    47        72
    23       144
    11       288
     5       576
     2      1152  (cancelled)
     1      2304
          ──────
           3420

This clever bit of mathematics is called Russian peasant multiplication, although it was also known to early Egyptians and many others. It permits unschooled people, with only the ability to add and divide, to do fairly complex multiplication problems. Actually it is based on our old stand-by, the binary system. What we have done is to represent the 95 “dyadically,” or by twos, and to multiply 36 successively by each of these powers as applicable.
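Both of these halving tricks go over directly into today’s programming terms. The sketch below is an editor’s illustration in Python (the function names are ours):

    def to_binary(n):
        """Convert a decimal number to binary by mediation: halve repeatedly,
        writing 1 for each odd number met and 0 for each even one."""
        bits = ""
        while n > 0:
            bits = ("1" if n % 2 else "0") + bits   # odd gives 1, even gives 0
            n //= 2                                 # halve, dropping any fraction
        return bits or "0"

    def peasant_multiply(a, b):
        """Russian peasant multiplication: halve one column, double the other,
        and add up only the rows where the halved column is odd."""
        total = 0
        while a > 0:
            if a % 2:            # odd row: keep the doubled value
                total += b
            a //= 2              # halve one column...
            b *= 2               # ...while doubling the other
        return total

    print(to_binary(500))            # 111110100
    print(peasant_multiply(95, 36))  # 3420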
We will not digress further, but leave this as an example of the tricks possible with the seemingly simple binary system. Even after we have learned to convert from the decimal numbers we are familiar with into binary notation almost by inspection, the results are admittedly unwieldy for human handling. An employee who is used to getting $105 a week would be understandably confused if the computer printed out a check for him reading $1101001. For this reason the computer programmer has reached a compromise with the machine. He speaks decimal, it speaks binary; they meet each other halfway with something called binary-coded decimal. Here’s the way it works. A little thought will show that the decimal numbers from 0 through 9 can be presented in binary using four bits. Thus:

    _Decimal_   _Binary_
        0            0
        1            1
        2           10
        3           11
        4          100
        5          101
        6          110
        7          111
        8         1000
        9         1001

In the interest of uniformity we fill in the blanks with 0’s, so that each decimal number is represented by a four-digit block, or word, of binary code. Now when the computer programmer wants to feed the number 560 into the computer in binary he breaks it into separate words of 5, 6, and 0; or 0101, 0110, and 0000. In effect, we have changed $5 words into four-bit words! The computer couldn’t care less, since it handles binary digits at the rate of millions a second; and the human is better able to keep his marbles while he works with the computer. Of course, there are some computers that are classed as pure binary machines. These work on mathematical problems, with none of the restrictions imposed by human frailty. For the computer the pure binary system is more efficient than the binary decimal compromise. The four-digit words can be made to represent not only numbers, but letters as well. When this is done it is called an alpha-numeric or alphameric code. Incidentally, it is conceivable that language could be made up of only 1’s and 0’s, or perhaps _a_’s and _b_’s would be better. All it would take would be the stringing together of enough letters to cover all the words there are. The result would be rather dull, with words like _aabbababaabbaaba_, _bbaabbaabababaaabab_, and _aaaaaaaaabaaa_; it is doubtful that the computer will make much headway with a binary alphabet for its human masters. In the early days of binary computer work, the direct conversion to binary code we have discussed was satisfactory, but soon the designers of newer machines and calculating methods began to juggle the digits around for various reasons. For one thing, a decimal 0 was represented by four binary 0’s. Electrically, this represents no signal at all in the computer’s inner workings. If trouble happened, say a loose connection, or a power failure for a split second, the word 0000 might be printed out and accepted as a valid zero when it actually meant a malfunction. So the designers got busy trying other codes than the basic binary. One clever result is the “excess-3” code. In this variation 3 is added to each decimal number before conversion. A decimal 0 is then represented by the word 0011 instead of 0000. There is, in fact, no such computer word as 0000 in excess-3 code. This eliminates the possibility of an error being taken for a 0. Excess-3 does something else too. If each digit is changed, that is, if 1’s become 0’s and 0’s become 1’s, the new word is the “9’s complement” of the original. For example, the binary code for 4 in excess-3 is 0111. Changing all the digits, we get 1000, which is decimal 5.
This is not just an interesting curiosity, but the 9’s complement of 4 (9 minus 4 is 5). Anyone familiar with an adding machine is used to performing subtraction by using complements of numbers. The computer cannot do anything but add; by using the excess-3 code it can subtract by adding. Thus, while the computer cannot subtract 0110 from 1000, it can quite handily add 1001 to 1000 to get the same result. There are many other reasons for codes, among them being the important one of checking for errors. “Casting out nines” is a well-known technique of the bookkeeper for locating mistakes in work. Certain binary codes, containing what is called a “parity bit,” have the property of self-checking, in a manner similar to casting out nines. A story is told of some pioneer computer designers who hit on the idea of another means of error checking not as effective as the code method. The idea was clever enough, it being that identical computers would do each problem and compare answers, much like the pairs of abacus-wielders in Japan’s banks. In case both computers did not come up with the same answer, a correction would be made. With high hopes, the designers fed a problem into the machines and sat back to watch. Soon enough a warning light blinked on one machine as it caught an error. But simultaneously a light blinked on the other. After that, chaos reigned until the power plugs were finally pulled. Although made of metal and wires, the computers demonstrated a remarkably human trait; each thought the other was wrong and was doing its best to change its partner’s answer! The solution, of course, was to add a third computer. Binary decimal, as we have pointed out, is a wasteful code. The decimal number 100 in binary decimal coding is 0001 0000 0000, or 12 digits. Pure binary is 1100100, or only 7 digits. By going to a binary-octal code, using eight numbers instead of ten, the words can be 3-bit instead of 4-bit. This is called an “economy” code, and finds some application. There are also “Gray,” or reflected binary, codes and many more, each serving a particular purpose. Fortunately for the designer, he can be prodigal with his use of codes. With 4-bit words, 29 _billion_ codes are available, so a number of them are still unused. Having translated our decimal numbers into code intelligible to our computer, we still have the mathematical operations to perform on it. With a little practice we can add, subtract, multiply, and divide our binary numbers quite easily, as in the examples that follow.

    Addition:            1100   (12)
                       + 0111   ( 7)
                        ─────
                        10011   (19)

    Subtraction:         1010   (10)
                       - 0010   ( 2)
                        ─────
                         1000   ( 8)

    Multiplication:      0110   ( 6)
                       × 0011   ( 3)
                        ─────
                         0110
                        0110
                       0000
                      0000
                      ──────
                        10010   (18)

    Division:   1010 ÷ 10 = 0101   (10 ÷ 2 = 5)

The rules should be obvious from these examples. Just as we add 5 and 5 to get 0 with 1 to carry, we add 1 and 1 and get 0 with 1 to carry in binary. Adding 1 and 0 gives 1, 0 and 0 gives 0. Multiplying 1 times 1 gives 1, 1 times 0 gives 0, and 0 times 0 gives 0. One divides into 1 once, and into 0 no times. Thus we can manipulate in just the manner we are accustomed to. The computer does not even need to know this much. All it is concerned with is addition: 1 plus 1 gives 0 and 1 to carry; 1 plus 0 gives 1; and 0 plus 0 gives 0. This is all it knows, and all it needs to know. We have described how it subtracts by adding complements. It can multiply by repetitive additions, or more simply, by shifting the binary number to the left.
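Before we go on to shifting, the complement trick deserves a moment in today’s programming terms. The sketch below is an editor’s illustration in Python (the function names are ours; in the machine this is wiring, not programming):

    def excess3(digit):
        """The four-bit excess-3 code for one decimal digit: the digit plus 3."""
        return format(digit + 3, "04b")

    def flip(bits):
        """Change every 1 to 0 and every 0 to 1."""
        return "".join("1" if b == "0" else "0" for b in bits)

    # Flipping any digit's code yields the code of its 9's complement:
    for d in range(10):
        assert flip(excess3(d)) == excess3(9 - d)
    print(excess3(4), flip(excess3(4)))    # 0111 1000  (the codes for 4 and for 5)

    # Subtraction by adding: 5 - 3 is computed as 5 + (9 - 3) + 1, carry dropped.
    x, y = 5, 3
    print((x + (9 - y) + 1) % 10)          # 2, just as if we had subtracted

The machine never subtracts at all; it flips bits and adds, which is the only trick its adders know.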
The simplest computer circuitry performs additions in a serial manner, that is, one operation at a time. This is obviously a slow way to do business, and by adding components so that there are enough to handle the digits in each row simultaneously, the arithmetic operation is greatly speeded. This is called parallel addition. Both operations are done by parts understandably called adders, which are further broken down into half-adders.

There are refinements to basic binary computation, of course. By using a decimal point, or perhaps a binary point, fractions can be expressed in binary code. If the position to the left of the point is taken as 2 to the zero power, then the position just to the right of the point is logically 2 to the minus one, which if you remember your mathematics you’ll recognize as one-half. Two to the minus two is then one-fourth, and so on. While we are on the subject of the decimal point, sophisticated computers do what is called “floating-point arithmetic,” in which the point can be moved back and forth at will for much more rapid arithmetical operations.

No matter how many adders we put together and how big the computer eventually gets, it is still operating in what seems an awkward fashion. It is counting its fingers, of which it has two. The trick is in the speed of this counting, so fast that one million additions a second is now a commonplace. Try that for size in your own decimally trained head and you will appreciate the computer a little more.

_The Logical Algebra_

We come now to another most important reason for the effectiveness of the digital computer; the reason that makes it the “logical” choice for not only mathematics but thinking as well. For the digital computer and logic go hand in hand.

Logic, says Webster, is “the science that deals with canons and criteria of validity in thought and demonstration.” He admits to the ironic perversion of this basic definition; for example, “artillery has been called the ‘_logic_ of kings,’” a kind of logic to make “argument useless.” Omar Khayyám had a similar thought in mind when he wrote in _The Rubáiyát_,

    The grape that can with logic absolute,
    The Two-and-Seventy Sects confute.

Other poets and writers have had much to say on the subject of logic through the years, words of tribute and words of warning. Some, like Lord Dunsany, counsel moderation even in our logic. “Logic, like whiskey,” he says, “loses its beneficial effect when taken in too large quantities.” And Oliver Wendell Holmes asks,

    Have you heard of the wonderful one-hoss shay
    That was built in such a logical way
    It ran a hundred years to the day?

The words logic and logical are much used and abused in our language, and there are all sorts of logic, including that of women, which seems to be a special case. For our purposes here it is best to stick to the primary definition in the dictionary, that of validity in thought and demonstration. Symbolic logic, a term that still has an esoteric and almost mystical connotation, is perhaps mysterious because of the strange symbology used. We are used to reasoning in words and phrases, and the notion that truth can be spelled out in algebraic or other notation is hard to accept unless we are mathematicians to begin with. We must go far back in history for the beginnings of logic.
Aristotelian logic is well known and of importance even though the old syllogisms have been found not as powerful as their inventors thought. Modern logicians have reduced the 256 possible permutations to a valid 15, and these are not as useful as the newer kind of logic that has since come into being.

Leibniz is conceded to be the father of modern symbolic logic, though he probably neither recognized what he had done nor used it effectively. He did come up with the idea of two-valued logic, and the cosmological notion of 1 and 0, or substance and nothingness. In his _Characteristica Universalis_ he was groping for a universal language for science; a second work, _Calculus Ratiocinator_, was an attempt to implement this language. Incidentally, Leibniz was not yet twenty years old when he formulated his logic system. Unfortunately it was not until two centuries later that the importance of his findings was recognized and an explanation of their potential begun. In England, Sir William Hamilton began to refine the old syllogisms, and is known for his “quantification of the predicate.” Augustus De Morgan, also an Englishman, moved from the quantification of the predicate to the formation of the thirty-two rules or propositions that result. The stage was now set for the man who has come to be known as the father of symbolic logic. His name was George Boole, inventor of Boolean algebra.

In 1854, Boole published “An Investigation of the Laws of Thought on which are Founded the Mathematical Theories of Logic and Probabilities.” In an earlier pamphlet, Boole had said, “The few who think that there is that in analysis which renders it deserving of attention for its own sake, may find it worth while to study it under a form in which every equation can be solved and every solution interpreted.” He was a mild, quiet man, though nonconformist religiously and socially, and his “Investigation” might as well have been dropped down a well for all the immediate splash it made in the scientific world. It was considered only academically interesting, and copies of it gathered dust for more than fifty years. Only in 1910 was true importance accorded Boole’s logical calculus, or “algebra” as it came to be known. Then Alfred North Whitehead and Bertrand Russell made the belated acknowledgment in their _Principia Mathematica_, and Russell has said, “Pure mathematics was discovered by Boole, in a work he called ‘The Laws of Thought.’” While his praise is undoubtedly exaggerated, it is interesting to note the way in which mathematics and thought are considered inseparable. In 1928, the first text on the new algebra was published. The work of Hilbert and Ackermann, _Mathematical Logic_, was printed first in German and then in English.

What was the nature of this new tool for better thinking that Boole had created? Its purpose was to make possible not merely precise, but _exact_ analytical thought. Historically we think in words, and these words have become fraught with semantic ditches, walls, and traps. Boole was thinking of thought, and not principally of mathematics or science, when he developed his logic algebra, and it is indicative that symbolic logic today is often taught by the philosophy department in the university. Russell had hinted at the direction in which symbolic logic would go, and it was not long before the scientist as well as the mathematician and logician did begin to make use of the new tool. One pioneer was Shannon, mentioned in the chapter on history. In 1938, Claude Shannon was a student at M.I.T.
He would later make scientific history with the treatise that established a new field called information theory; his early work was titled “A Symbolic Analysis of Relay and Switching Circuits.” In it he showed that electrical and electronic circuitry could best be described by means of Boolean logic. Shannon’s work led to great strides in improving telephone switching circuits, and it also was of much importance to the designer of digital computers.

To see why this is so, we must now look into Boolean algebra itself. As we might guess, it is based on a two-valued logic, a true-false system that exactly parallels the on-off computer switches we are familiar with. The Biblical promise “Ye shall know the truth, and the truth shall make you free” applies to our present situation. The best way to get our feet wet in the Boolean stream is to learn its so-called “truth tables.”

        _Conjunctive Boolean Operation_
        A _and_ B equals C   (A · B = C)

                A   B   C
                —   —   —
                0   0   0
                1   0   0
                0   1   0
                1   1   1

        _Disjunctive Boolean Operation_
        A _or_ B equals C   (A ∨ B = C)

                A   B   C
                —   —   —
                0   0   0
                1   0   1
                0   1   1
                1   1   1

In the truth tables, 1 symbolizes true, 0 is false. In the conjunctive AND operation, we see that only if both A and B are true is C true. In the disjunctive OR operation, if _either_ A _or_ B is true, then C is also true. From this seemingly naïve and obvious base, the entire Boolean system is built, and digital computers can perform not only complex mathematical operations, but logical ones as well, including the making of decisions on a purely logical basis.

Before going on to the few additional conditions and combinations that complete the algebra, let’s study some analogies that will make clear the AND/OR principles of operation. We can think of AND as two bridges in sequence over two rivers. We can reach our destination only if both bridges are working. However, suppose there are two parallel bridges and only one river. We can then cross if either or both of the bridges is working. A closer example is that of electrical switches. Current will flow through our AND circuit if—and only if—both switches are closed. When the switches are in parallel—an OR circuit—current will flow if either, or both, are closed.

The truth tables resemble the bridge or switch arrangements. We can proceed across the line of 1’s and 0’s in the first table only if both switches are closed. The symbol 1 means that the switch is closed, so we can cross only the bottom line. In the second table, we are told we can proceed across the line if either switch is closed. Thus we can cross lines 2, 3, and 4. We can use many symbols in our two-valued system.

               _Symbol_
          1              0
        Bridge       No Bridge
        Power        No Power
        True         False

A little imagination suggests a logic computer of sorts with one switch, a battery, and a light bulb. Suppose we turn on the switch when we drive into our garage. A light in the hallway then indicates that the car is available. By using two switches we can indicate that a second car is also in the garage; or that either of them is, simply by choosing between AND logic and OR logic. Childish as this seems, it is the principle of even our most complex thinking processes. You will remember that the brain is considered a digital computer, since neurons can only be on or off. All it takes is 10 billion neuron switches!

[Illustration: _Remington Rand UNIVAC_

  AND and OR gates in series. Switches 1 _and_ 2, plus 3 _or_ 4, are
needed to light the bulb.
]

In addition to the conjunctives AND and OR, Boolean algebra makes use of the principle of negation.
This is graphically illustrated thus:

        _Original_      _Negation_
            A               Ā
            1               0
            0               1

The negation device used in computer circuitry is called an inverter, since it changes its input from a 1 to a 0, or vice versa. The usefulness of such an element is obvious when we remember the computer trick of subtracting by adding complements. The inverter circuit used with a code like the excess-3 readily forms these complements.

Further sophistication of the basic Boolean forms leads to units other than the AND and OR gates. Possible are NOT, NOR, and exclusive-OR forms. In the latter, there is an output if one and only one input is present. The NOR circuit is interesting in that it was made possible with the introduction of the transistor; the vacuum tube does not permit this configuration.

[Illustration: _Computer Control Co._

  The functions of two binary variables.
]

Present-day symbolic logic is not the pure Boolean as presented back in 1854. Boole’s OR was the exclusive, one-and-only-one, type. Today the logician generally assumes the inclusive, either-or-both, connotation. The logic has also been amplified, using the commutative, associative, and distributive laws much like those of conventional algebra. We are indebted to De Morgan for most of this work, showing that A and B equals B and A; A and (A and B) equals (A and B) and A; and so on. While these seem intuitively true, the implications are nonetheless of great importance both in pure logic and in its practical use in circuitry.

The implication of importance is that logic applies equally well whether we are making a qualifying statement such as “A man must have strength _and_ courage to win a barehanded fight with a lion,” or wiring a defensive missile so that it will fire only if a target is within range _and_ is unfriendly.

In the early period of computer design the engineer was faced with the problem of building his own switches and gates. Today many companies offer complete “packaged” components—AND gates, OR gates, and the other configurations. This is the modular approach to building a computer, and the advantages are obvious. The designer can treat the components simply as “black boxes” that will respond in a prescribed way to certain input conditions. If he wants, the engineer can go a step further and buy a ready-built logic panel consisting of many components of different types. All he need do to form various logic circuits is to interconnect the proper components with plug-in leads.

This brings us to the point of learning what we can do with these clever gates and switches now that we have them available and know something about the way they work. We talked about the computer adder circuit earlier in this chapter. It is made up of two half-adders, remember, with perhaps an additional OR gate, flip-flop, etc. Each half-adder is composed of two AND gates and an OR gate. So we have put together several basically simple parts, and the result is a piece of equipment that will perform addition at a rate to make our heads swim.

There are other things we can do with Boolean logic besides arithmetic. A few gates will actuate a warning signal in a factory in case either of two ventilators is closed and the temperature goes up beyond a safe point; or in case both vents are closed at the same time. We can build a logic computer that will tell us when three of four assembly lines are shut down at the same time, and also which three they are.
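Readers with a computing machine of their own can wire such gates in software instead of hardware. The Python sketch below is our illustration, not a period circuit: it builds AND, OR, and NOT from their truth tables, assembles a half-adder from them, and sets up the ventilator alarm just described (the particular gate wiring chosen for the alarm is one plausible reading of the problem):

    # Boolean gates defined straight from their truth tables.

    def AND(a, b): return 1 if (a, b) == (1, 1) else 0
    def OR(a, b):  return 0 if (a, b) == (0, 0) else 1
    def NOT(a):    return 1 - a

    def half_adder(a, b):
        """One binary column: the sum is the exclusive-OR, built
        here from AND, OR, and NOT; the carry is a plain AND."""
        total = AND(OR(a, b), NOT(AND(a, b)))
        carry = AND(a, b)
        return total, carry

    for a in (0, 1):
        for b in (0, 1):
            print(a, "+", b, "->", half_adder(a, b))  # (sum, carry)

    def vent_alarm(vent1_closed, vent2_closed, temp_high):
        """Warn if either vent is closed while the temperature is too
        high, or if both vents are closed at the same time."""
        either_closed = OR(vent1_closed, vent2_closed)
        both_closed = AND(vent1_closed, vent2_closed)
        return OR(AND(either_closed, temp_high), both_closed)

    print(vent_alarm(1, 0, 1))  # 1 -- one vent shut and it is too hot

Two such half-adders, with an OR gate to combine their carries, make up the full adder described earlier in the chapter.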
[Illustration: _General Electric Co., Computer Dept._

  Electronic computers are built up of many “building blocks” like this
one.
]

Logic problems abound in puzzle books, and many of us spend sleepless nights trying to solve them in our heads. An example is the “Farnsworth Car Pool” problem. Rita Farnsworth asks her husband if someone in his car pool can drive for him tomorrow so that she may use the car. Joe Farnsworth replies, “Well, when I asked Pete if he would take my turn he said he was flying to Kansas City today, but he’d be glad to drive tomorrow if he didn’t have to stay over, and that his wife has been staying home lately and he will drive her car if she doesn’t go to work. Oscar said that since his own car is due back from the garage tomorrow he can drive it even if his wife does use hers, provided the garage gets his back to him. But if this cold of mine gets any worse I’m going to stay home even if those fellows have to walk to work, so you can certainly have the car if I don’t go to work.” This dialogue of Joe’s confuses Rita, and most of us would be in the same state.

[Illustration: _Autonetics Division, North American Aviation, Inc._

  Testing an assembled digital computer.
]

The instruction manual for BRAINIAC, a do-it-yourself computer that sells for a few dollars, gives a simple wiring diagram for solving Rita’s dilemma. Electrically the problem breaks down into three OR gates and one AND gate. All Mrs. Farnsworth has to do is set in the conditions and watch the indicator light. If it glows, she gets the car!

These are of course simple tasks; it might pay simply to hire a man to operate the vents, and to ride the bus to work when the car pool gets complicated. But even with relatively few variables, decision-making can quickly become a task requiring a digital computer operating with Boolean logic principles.

[Illustration: _Science Materials Center_

  Problem in logic reduced to electrical circuits.
]

The Smith-Jones-Robinson type of problem, in which we must find who does what and lives where, is tougher than the car pool—tough enough that it is sometimes used in aptitude tests. Lewis Carroll carried this form of logical puzzler to complicated extremes involving not just three variables but a dozen. To show how difficult such a problem is, an IBM 704 required four minutes to solve a Carroll puzzle as to whether any magistrates indulge in snuff-taking. The computer did it the easy way, without printing out a complete “truth table” for the problem—the method a man would have to use to investigate all the combinations of variables. That job would have taken 13 hours!

While the question of the use of snuff is perhaps important only to tobacconists and puzzle-makers, our technical world today does encounter similar problems which are not practical of solution without a high-speed computer. A recent hypothetical case discussed in an electronics journal illustrates this well. A missile system engineer has the problem of modifying a Nike-Ajax launching site so that it can be used by the new Nike-Hercules missile. He must put in switching equipment so that a remote control center can choose either an Ajax system or one of six Hercules systems. To complicate things, the newer Hercules can be equipped with any of three different warheads and fly either of two different missions. When someone at the control center pushes a button, the computer must know immediately which, if any, of the missiles are in acceptable condition to be fired. This doesn’t sound like too big a problem.
However, since there are twelve on-off signals to be considered, and since each has two possible states, there are 4,096 possible missile combinations. Not all these are probable, of course, but there is still sufficient variation to make it humanly impossible to check all of them and close a firing switch in the split second the control center can allow. The answer lies in putting Boolean algebra on the job, with a system of gates and inverters capable of juggling the multiplicity of combinations. Then when the word comes requesting a missile launch, the computer handles the job in microseconds without straining itself unduly.

Just as Shannon pointed out twenty-five years ago, switching philosophy can be explained best by Boolean logic, and the method can be used not only to implement a particular circuit, but also to design the circuit in the first place. A simple example of this can be shown with the easy-to-understand AND and OR gates. A technician experimenting with an AND gate finds that if he simply reverses the direction of current, he changes the gate into an OR gate. This might come as a surprise to him if he is unfamiliar with Boolean logic, but a logician with no understanding of electrical circuits could predict the result simply by studying the truth tables for AND and OR. Reversing the polarity is equivalent to changing a 1 to a 0 and vice versa. If we do this in the AND gate table, we should not be surprised to find that the result looks exactly like the OR table! It acts like it too, as the technician found out.
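The logician’s prediction is easily verified by exhaustive trial. In the short Python sketch below (our own illustration), complementing both the inputs and the output of an AND gate reproduces the OR truth table exactly; this is De Morgan’s rule in miniature:

    # De Morgan's rule checked by trying every input combination:
    # NOT(AND(NOT a, NOT b)) behaves exactly like OR(a, b).

    def AND(a, b): return a & b
    def OR(a, b):  return a | b
    def NOT(a):    return 1 - a

    for a in (0, 1):
        for b in (0, 1):
            reversed_gate = NOT(AND(NOT(a), NOT(b)))
            assert reversed_gate == OR(a, b)
            print(a, b, "->", reversed_gate)  # matches the OR table

This is the algebraic fact behind the technician’s surprise.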
Boolean logic techniques can be applied to existing circuits to improve and/or simplify them. Problems as simple as wiring a light so that it can be turned on and off from two or more locations, and those as complex as automating a factory, yield readily to the simple rules George Boole laid down more than a hundred years ago.

Watching a high-speed electronic digital computer solve mathematical problems, or operate an industrial control system with speed and accuracy impossible for human monitors, it is difficult to believe that the whole thing hinges on something as simple as switches that must be either open or closed. If Leibniz were alive, he could well take this as proof of his contention that there was cosmological significance in the concept of 1 and 0. Maybe there is, after all!

[Illustration: _Industrial Electronic Engineering & Maintenance_

  “Luckily I brought along a ‘loaner’ for you to use while I repair your
computer.”
]

------------------------------------------------------------------------

    “_Whatever that be which thinks, understands, wills, and acts, it is
    something celestial and divine._”
                                                                  —Cicero

6: The Electronic Brain

The idea of a man-made “brain” is far from being new. Back in 1851, Dr. Alfred Smee of England proposed a machine made up of logic circuits and memory devices which would be able to answer any questions it was asked. Doctor Smee was a surgeon, keenly interested in the processes of the mind. Another Britisher, H. G. Wells, wrote a book called _World Brain_ in 1938 which proposed much the same thing: a machine with all knowledge pumped into it, and capable of feeding back answers to all problems.

If it was logical to credit “human” characteristics to the machines man contrived, the next step was to endow the machine with the worst of these attributes. In works including Butler’s _Erewhon_, the diabolical aspects of an intelligent machine are discussed. The Lionel Britton play, _Brain_, produced in 1930, shows the machine gradually becoming the master of the race.

A more physical danger from the artificial brain is the natural result of giving it a body as well. We have already mentioned Čapek’s _R.U.R._ and the Ambrose Bierce story about a chess-playing robot without a built-in sense of humor, who strangles the human being who beats him at a game. With these stories as models, other writers have turned out huge quantities of work involving mechanical brains capable of all sorts of mischief. Most of these authors were not as well-grounded scientifically as the pioneering Dr. Smee, who admitted sadly that his “brain” would indeed be a giant, covering an area about the size of London!

The idea of the giant brain was given a new lease on life by the early electronic computers that began appearing in the 1940’s. These vacuum-tube and mechanical-relay machines with their rows of cabinets and countless winking lights were seized on gleefully by contemporary writers, and the “brain” stories multiplied gaudily. Many of the acts of these fictional machines were monstrous, and most of the stories were calculated to make scientists ill. Many of these gentlemen said the only correct part of the name “giant brain” was the adjective; that actually the machine was an _idiot savant_, a sort of high-speed moron. This opinion notwithstanding, the name stuck. One scholar says that while it is regrettable that such a vulgar term has become so popular, it is hardly worth while campaigning against its use.

An amusing contemporary fiction story describes an angry crowd storming a laboratory housing a “giant brain,” only to be placated by a calm, sensibly arguing scientist. With the mob dispersed, he goes back inside and reports his success to the machine. The “brain” is pleased, and issues him his next order.

“Nonsense!” scoff most computer people. A recent text on operation of the digital computer says, “Where performance comparable with that of the human brain is concerned, man need have little fear that he will ever be replaced by this machine. It cannot think in any way comparable to a human being.” Note the cautious use of “little,” however. Another authority admits that the logic machines of the monk Ramón Lull were very clever in their proof of God’s existence, but points out that the monk who invented them was far cleverer, since no computer has ever invented a monk who could prove anything at all!

The first wave of ridiculous predictions has run its course and been followed by loud refutations. Now there is a third period of calmer and more sensible approach. A growing proportion of scientists take a middle-of-the-stream attitude, weighing both sides of the case for the computer, yet some of their sober projections still read like science fiction. Cyberneticist Norbert Wiener, more scientist than fictioneer, professes to foresee computerized robots taking over from their masters, much as a Greek slave once did. Mathematician John Williams of the Rand Corporation thinks that computers can, and possibly will, become more intelligent than men. Equally reputable scientists take the opposite view. Neuro-physiologist Gerhard Werner of Cornell Medical College doubts that computers can ever match the creativity of man. He seems to share the majority view today, though many who agree will add, tongue in cheek, that perhaps we’d _better_ keep one hand on the wall plug just in case.

_Thinking Defined_

The first step in deciding whether or not the computer thinks is to define thinking.
Far from being a simple task, this definition turns out to be a slippery thing. In fact, if the computer has done no more than demand this sort of reappraisal of the human brain’s working, it has justified its existence. Webster lists meanings for “think” under two headings, for the transitive and intransitive forms of the verb. These meanings, respectively, start out with “To form in the mind,” and “To exercise the powers of judgment ... to reflect for the purpose of reaching a conclusion.” Even a fairly simple computer would seem to qualify as a thinker by these yardsticks. The storing of data in a computer memory may be analogous to forming in the mind, and manipulating numbers to find a square root certainly calls for some sort of judgment. Learning is a part of thinking, and computers are proving that they _can_ learn—or at least be taught. Recall of this learning from the memory to solve problems is also a part of the thinking process, and again the computer demonstrates this capability. One early psychological approach to the man-versus-machine debate was that of classifying living and nonliving things. In _Outline of Psychology_, the Englishman William McDougall lists seven attributes of life. Six of these describe “goal-seeking” qualities; the seventh refers to the ability to learn. In general, psychologist McDougall felt that purposive behavior was the key to the living organism. Thus any computer that is purposive—and any commercial model had better be!—is alive, in McDougall’s view. A restating of the division between man and machine is obviously in order. Dr. W. Ross Ashby, a British scientist now working at the University of Illinois, defines intelligence as “appropriate selection” and goal-seeking as the intelligent process _par excellence_, whether the selecting is done by a human being or by a machine. Ashby does split off the “non goal-seeking” processes occurring in the human brain as a distinct class: “natural” processes neither good nor bad in themselves and resulting from man’s environment and his evolution. Intelligence, to Ashby, who long ago demonstrated a mechanical “homeostat” which showed purposive behavior, is the utilization of information by highly efficient processing to achieve a high intensity of appropriate selection. Intelligent is as intelligent does, no distinction being made as to man or machine. _Humanoid_ and _artificial_ would thus be meaningless words for describing a computer. Ashby makes another important point in that the intelligence of a brain or a machine cannot exceed what has been put into it, unless we admit the workings of magic. Ashby’s beliefs are echoed in a way by scientist Oliver Selfridge of Lincoln Laboratory. Asked if a machine can think, Selfridge says, “Certainly; although the machine’s intelligence has an elusive, _unnatural_ quality.” “Think, Hell, COMPUTE!” reads the sign on the wall of a computer laboratory. But much of our thinking, perhaps some of the “natural” processes of our brains, doesn’t seem to fit into computational patterns. That part of our thinking, the part that includes looking at pretty girls, for example, will probably remain peculiar to the human brain. 
_The Human Brain_

Mundy Peale, president of Republic Aviation Corporation, addressing a committee studying the future of manned aircraft, had this to say:

    Until someone builds, for $100 or less with unskilled labor, a
    computer no larger than a grapefruit, requiring only a tenth of a
    volt of electricity, yet capable of digesting and transmitting
    incoming data in a fraction of a second and storing 10,000 times as
    much data as today’s largest computers, the pilots of today have
    nothing to worry about.

The human brain is obviously a thing of amazing complexity and fantastic ability. Packed into the volume Mr. Peale described are some 10 _billion_ neurons, the nerve cells that seem to be the key to the operation of our minds. Hooked up like some ultra-complicated switchboard, the network of interconnections stores an estimated 200,000,000,000,000,000,000 bits of information during a lifetime! By comparison, today’s most advanced computers do seem pathetically unimpressive.

We have discussed both analog and digital computers in preceding chapters. It is interesting to find that the human brain is basically a digital type, though it does have analog overtones as well. Each of the neurons is actually a switch operated by an electric current on a go/no-go, all-or-nothing basis. Thus a neuron is not partly on or partly off. If the electrical impulse exceeds a certain “threshold” value, the switch operates. Tied to the neurons are axons, the long “wires” that carry the input and output. The axons bring messages from the body’s sensors to the neurons, and carry the output to other neurons or to the muscles and other control functions. This grapefruit-size collection of electrochemical components thus stores our memories and effects the operation we call thinking.

Since brain impulses are electrical in nature, we speak of them in electrical terms. The impulses have an associated potential of 50 millivolts, that is, fifty thousandths of a volt. The entire brain dissipates about 10 watts, so that each individual neuron requires only a billionth of a watt of power. This amount is far less than that of analogous computer parts. A neuron may take a ten-thousandth of a second to respond to a stimulus. This seemingly rapid operation turns out to be far slower than that of present-day computer switches, but the brain makes up for it by being a “parallel operation” system. This means that many different connections are being made simultaneously in different branches, rather than sequentially, as a series of separate actions.

Packaging 10 billion parts in a volume the size of a grapefruit is a capability the computer designer admires wistfully. Since the brain has a volume of about 1,000 cubic centimeters, 10 million neurons fit into a space of one cubic centimeter! A trillion would fit in one cubic foot, and man-made machines with even a million components per cubic foot are news today.

Even when we are resting, with our eyes closed, a kind of stand-by current known as the alpha rhythm is measurable in our brains. This current, which has a frequency of about 10 cycles per second, changes when we see or feel something, or when we exercise the power of recall. It disappears when we sleep soundly, and is analogous to the operating current in a computer. Also, there is “power” available locally at the neurons to “amplify” weak signals sufficiently to trigger off following branches of neurons.

Philosophers have proposed two general concepts of the human brain and how it functions.
The _a priori_ theory presupposes a certain amount of “wired-in” knowledge: instincts, ideals, and so on. The other theory, that of the _tabula rasa_ or clean-slate new brain, argues that each of us organizes an essentially random net of nerves into ordered intelligence. Both theories are being investigated with computers, and as a result light is beginning to be shed on the workings of our brains.

[Illustration: _The Upjohn Company, Ezra Stoller Associates Photo_

  “A moment at a concert” is diagrammed by brain model, showing eyes,
ears, nerves, and structures analogous to brain. Picture at top
represents perception.
]

There is another division of philosophical thought in the mechanistic versus _élan vital_ argument. In other words, is the entire mind to be found in its constituent parts, or is there an intangible extra something that really breathes life into us? Whatever the correct concept, the brain does record impressions it can later recall. No one yet knows just how this is done, but several theories have been advanced. One of these describes a “chain circuit” set up in a neuron network by messages from the body’s sensors. This circuit, once started, continues to circle through the brain and is on tap whenever that particular experience needs to be recalled. The term “reverberate” is used in connection with this kind of memory, seeming to be a good scientific basis for the poetic “echoes of the past.” Reverberation circuits also provide the memory for some computers. Among other explanations of memory is that of conditioning the neurons to operate more “easily,” so that certain paths are readily traversed by brain impulses. This could be effected by local chemical changes, and such a technique too is used in computers.

However the brain accomplishes its job, it is certain that it evolved in its present form as a result of the environment its cells have had to function in for billions of years. Its prime purpose has been survival, and for this reason some argue that it is not particularly well adapted to abstract reasoning. Although the brain can do a wide variety of things, from dreaming to picking out one single voice amid the hubbub of noise at a social gathering—a phenomenon scientists have given the descriptive name of “cocktail party effect”—men like Ashby consider it a very inflexible piece of equipment, not well suited to pure logic. As a test of your brain as a logical device, consider the following problem from the Litton Industries “Problematical Recreations”:

    If Sara shouldn’t, then Wanda would. It is impossible that the
    statements “Sara should” and “Camille couldn’t” can both be true at
    the same time. If Wanda could, then Sara should and Camille could.
    Therefore Camille could.

Is this conclusion valid? If your head starts to swim, you are not alone. Very few humans solve such problems easily. Interestingly, those who do make good computer programmers.
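A digital computer settles the question by brute force, trying all eight combinations of truth values. The Python sketch below is our own illustration; we read “Wanda would” and “Wanda could” as the same proposition, as the puzzle seems to intend:

    # Brute-force check of the Litton logic problem.
    # s = "Sara should", w = "Wanda would/could", c = "Camille could"

    from itertools import product

    def implies(p, q):
        return (not p) or q

    valid = True
    for s, w, c in product([False, True], repeat=3):
        premise1 = implies(not s, w)     # if Sara shouldn't, Wanda would
        premise2 = not (s and not c)     # "Sara should" and "Camille
                                         # couldn't" not both true
        premise3 = implies(w, s and c)   # if Wanda could, Sara should
                                         # and Camille could
        if premise1 and premise2 and premise3 and not c:
            valid = False                # a counterexample, if any

    print("Camille could:", valid)       # prints: Camille could: True

No assignment makes all three premises true while Camille couldn’t, so the conclusion is valid, a result confirmed later in this chapter.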
_The Computer’s Brain_

Just as we have an anthropomorphic God, many people have done their best to endow the computer with human characteristics. Not only in fiction but also in real life, the electronic brains have been described as neurotic and frustrated on occasion, and also as being afraid and even having morning sickness! A salesman for a line of computers was asked to explain in understandable terms the difference between two computers whose specifications confused a customer. “Let’s put it this way,” the salesman said. “The 740 thinks the 690 is a moron!”

We can begin to investigate the question of computer intelligence by again looking up a definition. The word “compute” means literally to think, or reckon, with. Early computers such as counting sticks, the abacus, and the adding machine are obviously something man thinks with. Even though we may know the multiplication tables, we find it easier and _safer_ to use a mechanical device to remember and even to perform operations for us. These homely devices do not possess sufficient “intelligence” to raise any fears in our minds. The abacus, for example, displays only what we might charitably call the property of memory. It has a certain number of rows, each row with a fixed number of beads. While it is not fallible, as is the human who uses it, it is far more limited in scope. All it can ever do is help us to add or subtract, and if we are clever, to multiply, divide, do square roots, and so on.

If we are looking for purposive behavior in computing machines, it is only when we get to the adding machine that a glimmer appears. When a problem is set in and the proper button pushed, this device is compelled to go through the gear-whirring or whatever else is required to return it to a state of equilibrium with its problem solved. So far we might facetiously describe the difference in the goal-seeking characteristics of man and machine by recalling that man seeks lofty goals like climbing mountains simply because they are there, while the computer seeks its goal much like the steel ball in the pinball machine, impelled by gravity and the built-in springs and chutes of the device.

When we come to a more advanced computer, however, we begin to have difficulty in assessing characteristics. For the JOHNNIAC, built by Rand and named for John von Neumann, can prove the propositions in the _Principia Mathematica_ of Whitehead and Russell. It can also “learn” to play a mediocre game of chess.

If we investigate the workings of a digital computer, we find much to remind us of the human brain. First is the obvious similarity of on-off, yes-no operation. This implies a power source, usually electrical, and a number of two-position switches. The over-all configuration of the classic computer resembles, in principle if not in physical appearance, that of the human brain and its accessories. As we have learned, the electronic computer has an input section, a control, an arithmetic (or logic) section, a memory, and an output.

Looking into the arithmetic and memory sections, we find a number of comparisons with the brain. The computer uses power, far more than the brain. A single transistor, which forms only part of an artificial neuron, may use a tenth of a watt; the brain is ahead on this score by a factor of millions to one. Electronic switches have an advantage over the neuron in that they are much faster acting. So fast have they become that engineers have had to coin new terms like nanosecond and picosecond, for a billionth and a trillionth of a second. Thus, the computer’s individual elements are perhaps 100,000 times faster than those of the brain. There is no computer in existence with the equivalent of 10 billion neurons. One ambitious _system_ of computers does use half a million transistors, plus many other parts, but even these relatively few would not fit under a size 7-1/2 hat.
One advanced technique, using a “2-D” metal-film circuitry immersed in liquid helium for supercooling, hopefully will yield a packaging density of about 3-1/2 million parts per cubic foot, in comparison with the brain’s trillion-part density.

We have mentioned the computer memory that included the “delay line,” remindful of the “chain circuit” in the brain. Electrical impulses were converted to acoustic signals in mercury, traversed the mercury, and were reconverted to electrical impulses. Early memory storage systems were “serial” in nature, like information stored on a tape reel. To find one bit of information required searching the whole reel. Now random-access methods are being used, with memory core storage systems so wired that any one bit of information can be reached in about the same amount of time as any other. This magnetic core memory stores information as a magnetic field, again analogous to a memory theory for the human brain, except that the neuron is thought to undergo a chemical rather than a magnetic change.

[Illustration: _General Electric Co., Computer Dept._

  Tiny ferrite cores like these make up the memory of some computers.
Each core stores one “bit” of information.
]

Until recently, computers have been primarily sequential, or serially operating, machines. As pointed out earlier, the brain operates in parallel and makes up for its slower-operating individual parts in this way. Designers are now working on parallel operation for computers, an improvement that may be even more important than random-access memory.

_Bionics_

It is obvious that while there are many differences between the brain and the computer, there are also many striking similarities. These similarities have given rise to the computer-age science of “bionics.” A coinage of Major J. E. Steele of the Air Force’s Wright Air Development Center, _bionics_ means applying knowledge of biology and biological techniques to the design of electronic devices and systems. The Air Force and other groups are conducting broad research programs in this field. As an indication of the scope of bionics, Dr. Steele himself is a flight surgeon, primarily trained as a neurologist and psychiatrist, with graduate work in electronics and mathematics. Those engaged in bionics research include mathematicians, physical scientists, embryologists, philosophers, neurophysiologists, and psychologists, as well as scientists and engineers in the fields more usually associated with computers, electronics and the other engineering disciplines.

A recent report from M.I.T. is indicative of the type of work being done: “What the Frog’s Eye Tells the Frog’s Brain.” A more ambitious project is one called simply “Hand,” which is just that. Developed by Dr. Heinrich Ernst, “Hand” is a computer-controlled mechanical hand that is described as the first artificial device to possess a limited understanding of the outside world. Although it will undoubtedly have industrial and other applications, “Hand” was developed primarily as a study of the cognitive processes of man and animals.

Besides the Air Force’s formal bionics program, there are other research projects of somewhat similar nature. At Harvard, psychologists Bruner and Miller direct a Center for Cognitive Studies, and among the scientists who will contribute are computer experts. Oddly, man knows little of his own cognitive or learning process despite the centuries of study of the human mind. It has been said that we know more about Pavlov’s dog and Skinner’s pigeons than we do about ourselves, but now we are trying to find out.
Some find it logical, incidentally, that man should study the animals or the computer rather than his own mind, since they doubt that an intelligence can understand itself anyway. As an example of the importance placed on this new discipline, the University of California at Los Angeles recently originated a course in its medical school entitled “Introduction to the Function and Structure of the Nervous System,” designed to help bridge the gap between engineering and biology. In Russia, M. Livanov of the Soviet Academy’s Research Institute of Physiology in Higher Nervous Activity has used a computer coupled with an electric encephaloscope in an effort to establish the pattern of cortical connections in the brain.

While many experts argue that we should not necessarily copy the brain in designing computers, since the brain is admittedly a survival device and somewhat inflexible as a result of its conditioning, it looks already as if much benefit has come from the bionics approach.

The circuitry of early computers embodied what is called “soldered” learning. This means that the connections from certain components hook up to certain other components, so that when switches operate in a given order, built-in results follow. One early teaching device, called the Electric Questionnaire, illustrates this built-in knowledge. A card of questions and answers is slipped over pegs that are actually terminals of interconnected wires. Probes hooked to a battery are touched to a question and the supposed correct answer. If the circuit is completed, a light glows; otherwise the learner tries other answers until successful.

More sophisticated systems are those of “forced” learning and free association. Pioneer attempts at teaching a computer to “perceive” were conducted at Cornell University under contract with the Air Force to investigate a random-network theory of learning formulated by Dr. Frank Rosenblatt. Specifically, the Perceptron learns to recognize letters placed in front of its “eyes,” an array of 400 photocells.

The human brain accomplishes perception in several steps, though at a high enough rate of operation to be thought of as a continuous, almost instantaneous, act. Stimuli are received by sense organs; impulses travel to neurons and form interconnections resulting in judgment, action if necessary, and memory. The Perceptron machine functions in much the same manner.

[Illustration: _Electronics_

  Simplified version of a mammalian visual system (A) and Perceptron
simulating the biological network (B).
]

The forced-learning technique, in which Perceptron was told both when it correctly identified a letter and when it missed, was used first. Later it was found that “corrective” or reinforced teaching, which notes only errors, was more effective. After Perceptron had seen each letter fifteen times and received proper correction, it could subsequently identify all the letters correctly.

Announcement of Perceptron triggered many wild headlines and a general misconception in the public mind. Dr. Rosenblatt and the machine’s other developers wisely refuse to comment on its potential, but the number of experiments being conducted indicates wide scientific interest, and _perceptron_ has attained the prestige of an uncapitalized generic term. However, the theory of its random process has been questioned by scientists, including Theodore Kalin, one of the builders of an early electrical logic machine.
Kalin feels that intelligence presupposes a certain minimum of _a priori_ knowledge: the wired-in learning of the computer or the instincts or inherited qualities of animals. This of course echoes the thoughts of Kant, who deplored the notion as similar to all the books and papers in a library somehow arranging themselves properly on the shelves and in the filing cabinets. Indeed, the whole idea of finding human intelligence mirrored in the electronic innards of the computer has been flatly denounced at some scientific symposiums. Computers given an intelligence test at the University of Michigan “flunked,” according to researchers. Another charge is that the reaction of the brain’s neuron depends on its history, and thus cannot be compared with the computer’s switching elements. However, other researchers seem to have anticipated this weakness and are working on electronic or electrochemical neurons that also are conditioned by their input. Despite criticism, the bionics work proceeds on a broad front.

More recently a machine called Cybertron has been developed by the Raytheon Company. This more sophisticated machine is being trained to recognize sonar sounds, using the corrective technique. If Cybertron errs, the teacher pushes a “goof” button. When the machine is fully developed, Raytheon feels it will be able to recognize all typical American word sounds, using its 192 learning elements, and to type them out. Computers generally do “logical” operations. Many human problems do not seem to be logical, and can be solved only by experience, as the mathematician Gödel demonstrated some years ago. Since Cybertron solves such “alogical” problems, its builders prefer not to call it a computer, but rather a self-organizing data-processor that adapts to its environment. Among the variety of tasks that Cybertron could perform are the grading of produce and the recognition of radar signals. Raytheon foresees wide application for Cybertron as a master learner, with apprentice machines incapable of learning but able to “pick the brains” of Cybertron and thus do similar tasks.

[Illustration: _Cornell Aeronautical Laboratory_

  With the letter C in its field of view, Perceptron’s photocells at top
center are activated. Simultaneously, response units in panel at right
identify the letter correctly.
]

The assembly of machines like Perceptron and Cybertron requires elements that simulate the brain’s neuron. One such component which has evolved from bionics research is the Artron, or artificial neuron. Inside the Artron are logic gates and inhibit gates. By means of reward or punishment, the Artron learns to operate a “statistical switch” and send impulses to other Artrons or to a readout. There are two interesting parallels here besides the operation of a simulated neural net. One is the statistical approach to decisions and learning. The late John von Neumann theorized that the brain’s actions might be statistical, or based on probability. Second, the designers of the Artron see a similarity between its operation and Darwin’s theory of natural selection.

Another new component in the bionics approach is the “neuristor.” This semiconductor diode simulates the axon, the nerve fiber that connects with the neuron. Another device is the “memistor,” unique in that it uses electrochemical phenomena to function as a memory unit. A different kind of artificial neuron, called MIND, is made up of magnetic cores.
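The reward-and-punishment teaching used with the Artron, the Perceptron, and Cybertron can be suggested in miniature. The Python sketch below is our own much-simplified illustration, not a description of any actual machine: a single threshold element has its connection weights nudged only when it errs, the “corrective” method described earlier, until it tells two patterns apart.

    # A one-element learning net taught by error correction:
    # weights change only when the unit answers wrongly (a "goof").

    def fire(weights, inputs, threshold=0.5):
        """All-or-nothing response: 1 if the weighted sum of the
        inputs exceeds the threshold, otherwise 0."""
        total = sum(w * x for w, x in zip(weights, inputs))
        return 1 if total > threshold else 0

    # Two tiny "retina" patterns and the desired responses.
    lessons = [((1, 0, 1, 0), 1),   # pattern "A" should excite the unit
               ((0, 1, 0, 1), 0)]   # pattern "B" should not

    weights = [0.0, 0.0, 0.0, 0.0]
    for showing in range(10):       # repeated showings, as with Perceptron
        for inputs, wanted in lessons:
            answer = fire(weights, inputs)
            error = wanted - answer     # nonzero only on an error
            weights = [w + 0.2 * error * x
                       for w, x in zip(weights, inputs)]

    print([fire(weights, inputs) for inputs, _ in lessons])  # [1, 0]

After a few showings the element answers both lessons correctly, just as the Perceptron could identify all its letters after fifteen corrected showings of each.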
There is another plus factor in this duplication of what we think is the system used by the brain. While one neuron may not be as reliable as a vacuum tube or transistor, the complete brain is millions of times more dependable than any of its single parts. This happy end result is just the reverse of what man has come up with in his complex computer systems. For instance, individual parts in the Minuteman missile must have a reliability factor of 99.9993% so that the system will have a fair chance of working properly. Duplication of the brain’s network may well lead to electronic systems that are many times more reliable than any of their individual parts.

Bionics is apparently a fruitful approach, both for benefiting computer technology and for learning more about the human brain. As an example, consider the fact that work with the Perceptron indicated that punishment alone was more effective in the learning process than punishment and reward together. This of course does not say that such a method would work best with a human subject, but if separate tests with human beings showed a similar result, it might then be safe to infer some similarity between the human and computer brain.

One of the biggest roadblocks to implementation of a humanlike neural net is economic. Since there are some 10 billion neurons in the brain, and early electronic neurons consisted of several components including transistors, which are a bargain at $2 each, building such a computer might double our national debt. Bionics workers have been thinking dreamily in terms of something like one cent per artificial neuron. This is a ridiculously low figure, but even at that a one-tenth-brainpower computer with only a billion penny neurons would cost $10 million for those components alone!

[Illustration: _Cornell Aeronautical Laboratory_

  Random wiring network between the Mark I Perceptron’s 400 photocell
sensors and the machine’s association units.... The Mark I has ten
sensory output connections to each of its 512 association units.
]

Not yet whipped, researchers are now thinking in terms of mass-producing lattices of thin metal, in effect many thousands of elements in a microscopic space, and propagating electrochemical waves rather than an electrical current through them.

[Illustration: _Raytheon Co._

  When Cybertron doesn’t catch on to a new lesson, engineers push the
goof button to punish the machine. When it learns correctly it is
allowed to continue its studies without interruption; thus it constantly
improves its skill.
]

Other ideas include getting down to the molecular level for components. If this is achieved it will be a downhill pull, for even the human neuron consists of many molecules. Farfetched as these ideas seem, packaging densities of 100 billion parts per cubic foot are being talked of as foreseeable in less than ten years. A machine built at that density would be only about ten times as bulky as the goal, the human brain, and when this is achieved the computer will be entitled to a big head.

_The Computer as a Thinker_

About the time Johnny was having all his trouble reading, a computer named JOHNNIAC was given the basic theorems needed, and then asked to prove the propositional calculus in the _Principia Mathematica_, a task certainly over the heads of most of us. The computer waded through the job with no particular strain, and even turned in one proof more elegant than human brains had found before. When the same problems were given to an engineer unfamiliar with that branch of mathematics, his verbalized problem-solving technique paralleled that of JOHNNIAC.
Asked if he had been thinking, the engineer said he “surely thought so!”

In his “Mathematical Games” department in _Scientific American_, Martin Gardner describes a simple set of punched cards for solving the type of logic problem discussed earlier in this chapter. Using these cards and a simple digital type of manipulation, we happily learn that Camille surely could. The problem is a simple, three-premise type in two-valued logic and can be solved by any self-respecting digital computer in a split second. A few demonstrations like this give a rather disconcerting insight into our brain’s limitations and build more respect for the computer’s intelligence.

When we hear of expensive computers apparently frittering away their valuable time playing games, we may well wonder why. But games, it turns out, are an ideal testing ground for problem-solving ability and hence intelligence. Back in 1957, computer experts Simon and Newell predicted that in ten years the chess champion of the world would be a computer. Master players most likely laughed up their sleeves, and thus far the electronic machine has done no better than play a routine game against a human amateur. This, of course, is not a mean achievement. Wise heads are supposed to have responded to the prediction with “So what?”

[Illustration: Photo at left from _Organization of the Cerebral
Cortex_, by D. A. Sholl, J. Wiley and Sons. Right, _General Electric
Research Laboratory_

  Photo at right shows a “crossed-film cryotron” shift register—an
advanced computer element. The separation of active crossovers shown is
comparable to the separation of nerve cells in the section of cat brain
shown at left.
]

Alex Bernstein of IBM worked out a program for the 704 computer in which the machine looks ahead four moves before each of its plays. Even this limited look-ahead requires 2,800 calculations, and the 704 takes eight minutes deliberating. Occasionally it makes a move the experts rate as masterful.

Chess is a far more complex game even than it appears to those of us on the sidelines. In an average game there are forty moves, and each has about thirty possibilities. So far this sounds innocuous, but mathematics shows that there are thus some 10^{120} possible lines of play in any one game. This number is a 1 followed by 120 zeros, and to underline its size it has been estimated that even if a million games a second were played, the possibilities would not be exhausted in our lifetime! Obviously human chess wizards do not investigate all possible moves. Instead they use heuristic reasoning, or hunch playing, to cut corners. The JOHNNIAC computer is investigating such approaches to computer chess-playing, in a movement away from rigorously programmed routines or “algorithms.” Algorithms are formulas or equations such as the quadratic equation used in finding roots. If indeed the computer does dethrone the human chess champ by 1967, it will be exceedingly hard to argue that the machine is not thinking.

The word “heuristic” comes from the Greek _heuriskein_, meaning to discover or invent. An example of what it is and how important it is can be seen in the recent disproving of a famous conjecture made by the mathematician Euler some 180 years ago. Euler was interested in the properties of so-called “magic squares” in which letters are arranged vertically and horizontally.
While it is possible to arrange the letters _a_, _b_, _c_, _d_, and _e_ in such a square so that all are present in each row and in different order, Euler didn’t think such was the case with a square having six units on a side. He tried it, visualizing officers of different rank arranged in rows. Convinced that it would not work, he extended his educated guess to squares having units of ten, fourteen, and other even numbers not divisible by four. He didn’t actually prove his conjecture, because the amount of paperwork made it practically impossible. In 1901 a mathematician did try all the possible configurations of the square of six units and found that Euler was indeed correct. It was assumed that ten was impossible too, until 1958, when three American mathematicians spoiled Euler’s theory by finding workable magic squares having ten units per side. They did not do this by exhausting all the possibilities, for such a chore would have been humanly impossible. In fact, a computer labored for 100 hours and completed only a tiny fraction of the job. The square-seekers concluded that it would take even the high-speed computer upwards of a century to do the job, so instead they used hunches, or inspired guesses, working out a heuristic for the task. The point of importance is that not only man, but the computer as well, despite its fantastic speed, must learn to use heuristic reasoning rather than blindly plowing through all possible solutions. There are just too many numbers!

Computers play other games too, from tick-tack-toe and Nim, which they play flawlessly, to Go and checkers. Dr. Arthur Samuel of IBM has taught the 704 computer to play checkers well enough to beat him regularly, though Dr. Samuel, scientist that he is, admits he is not a great checker-player. He has used two types of learning in the program: “rote” and “generalization.” So far these have been used separately, while human players use both types of learning in a game.

American scientists visiting Russia recently reported that the Russians, like some of us, were amazed to hear that computer time was allotted to the mere playing of games. The real goal in all this game-playing is to learn how to do other, more important things. Gaming is being applied to war strategy and to business management. Corporation executives are playing games with computers that simulate the operation of their firms, both to improve methods and to learn about themselves and their employees. A General Problem-Solver computer is being developed too, one which can solve problems like that of the cannibals and the missionaries and then do mathematical equations and other types of thinking. As was pointed out, when the computer’s method of solving a problem is compared with the protocol used by a person (obtained by having him think aloud as he goes through the problem), it is seen that both use pretty much the same tricks and short cuts.

As the computer keeps closing the gap, we can push the goal back by redefining our terms. This is much like dangling a carrot on a stick, and with the computer doggedly taking the part of the donkey, it is a pretty good technological flail. By making the true test of intelligence something like artistic creativity, we can rule out the machine unless it can write poetry, compose music, or paint a picture. So far the computer has done the first two, and the last poses no particular problem, though debugging the machine might be a messy operation.
True, the machine’s poetry is only about beatnik level:

                              CHILDREN

    Sob suddenly, the bongos are moving.
    Or could we find that tall child?
    And dividing honestly was like praying badly,
    And while the boy is obese, all blast could climb.
    First you become oblong,
    To weep is unctious, to move is poor.

This masterpiece, produced by a computer in the Librascope Laboratory for Automata Research, is not as obscure as an Eliot or a Nostradamus. Computer music has not yet brought audiences to their feet in Carnegie Hall. The machine’s detractors may well claim that it has produced nothing truly great; nothing worthy of an Einstein or Keats or Vermeer. But then, how many of us have?

There is yet another way we can ban the computer from membership in our human society. While human beings occasionally think they are machines, and Dr. Bruno Bettelheim has documented a case history of “Joey” who was so convinced that he was a machine that he had to keep himself plugged in to stay alive, no machine has yet demonstrated that it is consciously aware of itself, as human beings are. Machines are, hopefully, objective. Consciousness seems to be subjective in the extreme; indeed, some feel that it is a thing one of us cannot hope to convey as intelligence to another and thus has no scientific importance. It is also noted that the thinking and learning processes can be carried out with no need for consciousness of what we are doing. An example given is that of the cyclist who learns, without being “aware” of the fact, that to turn his machine left he must first make a slight swing to the right in order to keep from falling outward during his left turn. This observation in itself is not final proof of the pudding, of course, unless we are aiming only to make a mechanical bike-rider, but many of our other actions are carried out more or less mechanically without calling attention to themselves. Just as certainly, however, the thing called consciousness plays a vital role in human thinking. Perhaps the machine must learn to do this before it can be truly creative.

Although we have described some fairly “exotic” devices, it should be remembered that the computers in use outside of the laboratory today are fairly old-fashioned second-generation models. They have progressed from vacuum tubes or mechanical relays to “solid-state” components. When Artrons and neuristors and memistors and other more sophisticated parts are standard, we can look for a vast increase in the brain power of computers. The Gilfillan radar ground-controlled-approach system for aircraft that “sees” the plane on the radar scope, computes the proper path for it to follow, and then selects the right voice commands from a stored-tape memory seems to be thinking and acting already. The addition of eyes and ears plus limbs and locomotion to the computer, foreseen now in the photocell eyes of Perceptron, the ears of Cybertron, and the dexterity of Mobot and Hand, will move the computer from mere brain to robot.

Some people profess to worry about what will happen when the computer itself realizes that it is thinking, calling to mind the apocryphal story of the machine that was asked if there was a God. After brief cogitation, it said, “_Now_ there is.” To offset such a chilling possibility, it is comforting to recall the post-office electronic brain that mistook the Christmas seals on packages for foreign stamps, and the Army computer that ordered millions of dollars worth of supplies that weren’t needed. Or perhaps it isn’t comforting, at that!
The question of whether or not a computer actually thinks is still a controversy, though not as much so as it was a few years ago. The computer looks and acts as if it is thinking, but the true scientist prefers to reserve judgment in the spirit of one shown a black sheep some distance away. “_This_ side is black,” he admitted, “but let’s investigate further.”

------------------------------------------------------------------------

“_For forms of government let fools contest, That which is best administered is best._” —Pope

7: Uncle Sam’s Computers

The modern electronic version of the computer is about fifteen years old, and like most teen-agers, it is a precocious child. To list all the applications in which it has made a place for itself would take several pages and an inclusive listing from Airlines to Zoology. There are hundreds of different types, priced from less than one hundred dollars to more than ten million dollars. The latter are so expensive that outright purchase is not usually possible for users. Rental or leasing arrangements are therefore available, and there is a growing number of computer centers to which the customer can take or send his work and have it done. There are also do-it-yourself computer facilities, much like those for laundry, dry cleaning, and so forth, as well as installations in trailers that move from place to place. Most require a source of conventional electric power, but there are some portable models that operate on batteries.

Scanning the list of jobs the computer now does, it would seem impossible to classify the varied tasks. Since many machines are versatile, general-purpose types, it is even more difficult to definitely categorize the computer. Dr. John R. Pierce, an expert at the Bell Telephone Laboratories, describes some of the chores done by a digital computer in a typical session at Bell:

    Check parts of a computer program used in connection with machine
    methods for processing manufacturing information.

    Process and analyze data on telephone transmission which have been
    transmitted to the laboratories by teletypewriter and automatically
    punched on cards for computer processing.

    Solve a partial differential equation.

    Compute details of the earth’s magnetic field.

    Check part of a program used to handle programming cards.

    Fit curves to data by translating numerical information into graphs.

    Locate an error in a program designed to process psychological data.

These “simple” problems required but three minutes of the computer’s time. A larger task, something like solving 350 mathematical logic theorems from _Principia Mathematica_, takes a bit longer—eight and a half minutes, to be exact.

Despite this versatility, it is generally possible to break the computer’s capabilities down into broad classifications. First we can say that it does either simple data-processing or scientific computations. Each of these can then be further subdivided ad infinitum. Examples will be seen as we describe uses of computers on the following pages. Since the government was the first user of computers, beginning back in 1890 with Hollerith’s punched-card machines, we would do well to see what other work it has put the computer to in the years that have elapsed.

_The Computer in Washington_

An inventory of electronic computers installed in the Federal Government by the end of 1961 totaled 800, with 200 more on order. These figures are exclusive of those for tactical and classified use by the Department of Defense.
Some 45,000 people are engaged in computer operations in the government, and a total expenditure of about $1.5 billion is estimated. An indication of the importance accorded the computer by Washington is the Interagency Data Processing Committee, concerned with questions of sharing of computers in geographic areas, setting up of a “library” of applications, and assurance of continued computer operation in the event of attack or other emergency. Users of computers, in addition to the Department of Defense, are the Atomic Energy Commission, Department of Commerce, National Aeronautics and Space Administration, Federal Aviation Agency, Post Office Department, and others for a total of 43 agencies. The Peace Corps, for example, recently announced that it would acquire a computer for use in its work.

Red tape is not the only output from Washington, D.C. Not long ago the Hoover Commission estimated that our Federal Government also produces 25 _billion_ pieces of paper each year! Someone else converted this already impressive statistic into the more startling information that, placed end to end, these papers would reach the moon four times—in triplicate, of course! Data-processing, then, the handling of information, would seem to be the major part of the computer’s work for Uncle Sam.

The Census Bureau was the first government user of the computer, and it continues to handle its work in this way. In 1951 the government procured a UNIVAC I to take over this onerous chore from its predecessors. Beginning with the 1950 census, the computer has been in operation practically twenty-four hours a day, seven days a week. In its first ten years it performed more than 510 billion mathematical operations in keeping pace with our exploding population. We are producing more than paperwork, it seems. The 1950 census required four years to process. With newer computers the 1960 count will take only half as long despite the population explosion.

Information-handling computers make possible another important phase of the government’s work. In 1936, machines began to process Social Security records, which are becoming a monumental pile of paperwork themselves with close to 100 million accounts that must be kept up to date. Social Security numbers recently turned up in government computers handling another job—that of income-tax bookkeeping. The U.S. Commissioner of Internal Revenue, Mortimer Caplin, put a pilot system of computer accounting of tax records into operation in January of 1962 in the Atlanta region. In 1963, the Philadelphia Center will follow suit, and by 1966 all income-tax accounting will whiz through tape reels into computers. The figures on tax greenbacks laid end to end are not available, but it is known that 400 miles of magnetic tape will be needed to hold all the records.

The new system will make it tough on the income-tax chiseler. Caplin points out that not only the withholding-tax information from the employer, and forms from the employee, but also dividend statements and other supplementary income information will funnel automatically into John Doe’s portion of the tape. If John is moonlighting, holding down a second job he might forget to mention, the computer will spot it and charge a tax on it. The apprehended tax-dodger may well call the computer an infernal revenue machine. There are of course many other ways the computer is helping out in the complex problems of government, both Federal and local.
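The record-matching that catches the moonlighter is simple in outline, and a brief sketch may make it concrete. The Python fragment below is a toy version only; the names, amounts, and record layout are invented for illustration, and no actual Internal Revenue format is implied.

    # Toy sketch of income-tax matching by Social Security number.
    # Each report: (Social Security number, source, amount in dollars).
    reports = [
        ("123-45-6789", "employer withholding form", 5200),
        ("123-45-6789", "dividend statement", 300),
        ("123-45-6789", "second employer", 900),   # the moonlighting job
    ]
    declared = {"123-45-6789": 5500}               # what John Doe reported

    totals = {}
    for ssn, source, amount in reports:
        totals[ssn] = totals.get(ssn, 0) + amount

    for ssn, total in totals.items():
        gap = total - declared.get(ssn, 0)
        if gap > 0:
            print(ssn, "failed to mention", gap, "dollars of income")

Everything funnels into one account keyed by the taxpayer’s number; the machine merely adds and compares, but it never forgets to do so.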
The computer has already figured in national elections, making predictions well in advance as to the outcome. Now the machines are being used in the actual voting procedure. In 1952 the UNIVAC computer predicted Eisenhower’s victory within two hours after the first polls closed. In the early days of computer predictions, the men using them were overly cautious and afraid to accept the machine’s word. Techniques and confidence have improved with practice, and in 1960 IBM’s RAMAC predicted victory for Kennedy at 8:12 P.M. election night.

To make accurate predictions, the computer is given information from preceding elections. In 1960 it was fed the results of the 1956, 1952, 1948, and 1928 (because of religious considerations) elections. Forecasters were able to ask the computer such questions as, “How is Nixon doing compared with Eisenhower’s showing in 1952?” “How is Kennedy doing compared with Al Smith back in 1928?” “Is labor voting as a bloc?” and “How solid is the South?” The computer is now an accepted part of network equipment for election reporting. ABC used the Remington Rand UNIVAC; CBS, IBM RAMAC and other machines; and NBC the RCA 501.

[Illustration: _International Business Machines Corp._

Computers are used to predict the results of elections. ]

In addition to forecasting results, computers are beginning to do other election work. Los Angeles County experimented with a computer method of counting votes in 1960. Greene County, Ohio, used punched cards for ballots for 50,000 voters in a pioneering computer voting system. The cards were processed in a UNIVAC computer at Dayton Air Force Depot. A bolder suggestion is that of political scientist R. M. Goldman of Michigan State University: actual voting by telephone-operated computer!

To solve another problem area in voting, the use of computers was recently proposed at a state legislative hearing in Boston. Redistricting, the bugaboo that led to “gerrymandering,” might well be done by “unbiased” computers which would arrive at an optimum redistricting plan. These unbiased results would be “beyond politics and in the best interests of the voters and the State,” according to the computer expert who proposed the plan.

Moving from voting to a more complicated problem, that of urban renewal, the University of Washington is conducting a survey under federal grant on the extent of deterioration and the causes of decay in Spokane residential, commercial, and industrial areas. The IBM 709 computer makes possible an accurate and extensive survey expected to shed light on areas of arrested development, and on the amount of tax revenue lost because of existing blight.

_Electronic Legal Eagle_

Some writers see the clearest evidence of the victory of the computer—if indeed we admit to there ever having been any real battle—in the admission by the legal profession that it must begin to chart the legal seas of the computer age. In 1961 the American Law Institute and the American Bar Association, feeling that the computer will cast its “automated shadow on every phase of society,” conducted a joint three-day seminar in Washington, D.C.
Titled “Legal Problems in the Use of Electronic Data Processing in Business, Industry and Law,” the seminar discussed “function and operation of computers and their impact on tort, tax, corporation, labor, contract, banking, sales, antitrust, patent and copyright law, as well as on the law of evidence and trial practice.” Lawyer Roy Freed of the Philadelphia Bar, in a booklet called “A Lawyer’s Guide Through the Computer Maze,” describes the working of the machine and then poses some challenging legal questions. What duty does the company acting as a computer service organization have to preserve the confidential nature of the data it processes for its customers? Can business records placed on magnetic tape be used in evidence, or must the original records be preserved? How long can corporate management lag behind others in their industry in adopting machine data-processing systems before they expose themselves to a mismanagement charge? To what extent should the manufacturer of a complex product that has a potential for causing harm try to minimize his liability as a maker by anticipating design defects through simulated operation on the computer? Other legal experts asked other questions. If an electronically processed check is bounced erroneously, who is responsible? If a noncomputerized railroad has a train wreck, can the road be sued on the premise that the accident would not have occurred with modern traffic controls? Or if the reverse occurs, can an anticomputer claimant win a suit against the machine? Applications of the computer in patent law may lead to more thorough search in addition to higher speed. This could well clear another bottleneck by issuing fewer and faster patents. But copyright violation problems lie in the possibility of making copies of tapes or other media suitable for the computer’s use. The altering or falsification of computer data also poses a tricky legal problem; there is already a precedent in the Wall Street man who juggled the punched cards on the computer to his own advantage. Perhaps there was one question none of the lawyers present had the courage to bring up: what if the day comes when the court itself is a computer, and the case is presented to it as a stack of cards, or a prepunched or magnetized tape? Such a mechanized justice was fancifully depicted on a television thriller by Ray Bradbury. _Computers in Khaki_ Despite the low IQ it has been accused of, it was inevitable that the computer be drafted. In the 40’s we were desperate. Included in government use of computers are military research, development, and tactical and strategic methods. World War II was a different kind of war, a complicated, electronic war that required advanced methods of operation. At Eastertime in 1942, IBM answered an urgent call from Washington and gathered all available data-processing machines for use by the military. Punched cards kept track of allotments, insurance, and the logistics of running a war. Mobile computing machines operated close to the front lines, and were important enough that a captured German officer was carrying urgent orders to bring in one of these units. [Illustration: _Motorola, Inc., Military Electronics Division_ Technician checks circuitry of airborne digital computer. ] Besides the mundane effort of mere data-processing, wartime computers did important cloak-and-dagger work as well. A report came in from Allied intelligence that the Germans were working on a frightening new development—an electrically powered cannon. 
If it were successful we would need some kind of counterweapon. But the dike was leaking in a hundred other places too, and there was not time or equipment to do everything it seemed we might have to do. The answer was to feed some complex mathematics to an IBM computer called the Automatic Sequence Controlled Calculator—mathematics describing the new cannon. The computer cogitated briefly and decided that the Germans were on the wrong track; that the gun would not work. We therefore ignored the threat, letting the Germans waste their valuable time going down the blind alley, and turned our efforts elsewhere.

We have said that World War II was a different kind of war. One new development to bear out this difference was called “Operations Research”—the reduction of any problem to mathematical formulas and the investigation of these formulas rather than a conventional, intuitive approach. The technique was pioneered in England, spread to the United States, and is now one of the most powerful tools not only of the military but also of government and business. The computer has made operations research a more powerful technique by permitting the analysis of thousands or millions of possibilities in hours instead of lifetimes.

Back in the days of bows and arrows, the soldier had no need for a computer. Even the rifleman required little more than a simple sight and maybe a bit of Kentucky windage. With the coming of long-range artillery, computers became desirable, and now we have moved into an age of warfare that would be impossible without high-speed computing machines.

In 1948 IBM introduced a computer known as SSEC for Selective Sequence Electronic Calculator. This machine was put to work on a problem for the Los Alamos Atomic Energy Laboratory, a problem called “Hippo.” Hippo was as unwieldy as its name, requiring some nine million involved mathematical operations that would have taken about 1,500 man-years of skilled time. That many mathematicians or that much time was not available, of course, and SSEC clicked through the job in 150 hours by itself. Another computer, the MANIAC, designed by John von Neumann, is credited with beating the Russians to the punch with the hydrogen bomb.

As an outgrowth of operations research, the simulation of war games has become an important part of military work. A number of firms, including System Development Corporation, Technical Operations, Inc., and others, devote much of their time to “playing games” to work out the optimum strategy and tactics for war in case we find it necessary again. It is perhaps not paradoxical that war be considered a game. As William Cowper said, “War’s a game, which were their subjects wise, Kings would not play at.” The game of chess, conversely, stems from war and its tactics. Indeed, the term checkmate, for victory, comes from the Persian words _shah mat_, the king is dead.

Through the years many war games have been developed, games which eliminate the physical conflict but preserve the intellectual maneuvering necessary for waging “war.” John von Neumann was one of the more recent to turn his great genius to this subject in the development of his “minimax” theory. This is an outline of a situation in which consequences of decisions depend on the actions of an opponent. We have seen that the computer, though not yet world champion, can play chess; the minimax theory is more grist for its electronic mill.

Tech-ops operates the Combat Operations Research Group for the U.S. Continental Army Command at Fort Monroe, Virginia.
Among the games played here with computers are SYNTAC, in which field-experienced officers evaluate new weapons and tactics, and AUTOTAG, a computer simulation of tank-antitank combat. Other projects of this firm include air battle simulations, ship loading and other logistics problems, fallout studies, and defense against missile attacks. The beauty of such schemes is that we will not make the mistake of the Germans with their electric cannon. When the computer blinks “Tilt” or an equivalent, the engineers may have red faces, but no huge amount of time or money will have been spent before they sigh, “Back to the old drawing board!”

[Illustration: _Aeronutronic Division, Ford Motor Co._

ARTOC (Army Tactical Operations Central) uses computer techniques for battlefield display and communications to aid field commanders. ]

At Picatinny Arsenal, computers evaluate ammunition by simulating as many as a thousand battles per item. Design and management studies for projects like Nike-Zeus and Davy Crockett are also conducted at Picatinny. A mobile computer, called MOBIDIC, is designed for field combat use and has been moved in three 30-foot trailers to a location with the Seventh Army in Europe. There it handles requisitions for rockets, guided missiles, electronic equipment, and other items. MOBIDIC is just part of the Army’s FIELDATA family of computers that includes helicopter-transported equipment to provide field commanders with fast and accurate data on which to base their risk decisions. Another concept is ARTOC, for Army Tactical Operations Central, an inflatable command post in which computers receive and process information for display on large screens. This is a project of Aeronutronic.

In 1961 an IBM 7090 computer was installed at Ispra, Italy, for use by the European Atomic Energy Community (EURATOM). The computer would have as its duties the cataloging of technical information on atomic energy, the translation of technical publications, and use in basic research on solutions of Boltzmann equations and other advanced physics used in atomic work. In this country, the National Science Foundation has acknowledged the importance of the computer in scientific investigation by underwriting costs for such equipment for research centers in need of them.

[Illustration: _International Business Machines Corp._

Command post of SAGE, the most complex computer application to date. ]

_In the Air_

Beyond the realm of war gaming, the computer also forms the heart of the hardware that such simulation and studies develop. SAGE is an example of this, a complex warning system that protects our country from attack. The acronym SAGE is a more dignified and impressive name than the words it stands for—Semi-Automatic Ground Environment, an environment that by 1965 will have cost $61 billion! SAGE is not a single installation, but a vast complex of centers feeding information from Ballistic Missile Early Warning System radar and airborne radar, from ships, Texas towers, and ground-based radar, and from weather stations into a central control. This control sends the proper signals to defensive rockets, missiles, and aircraft for action against an invader. It does this with one hand, while with the other it keeps tabs on normal military and commercial air traffic. The System Development Corporation designed and IBM built the SAGE computer, a computer already old-fashioned since it uses vacuum tubes instead of the newer transistor devices.
Despite this shortcoming, it does a fantastic job of tracking all the aircraft and missiles in its ken, labeling them for speed, heading, and altitude, as well as the vital information of friend or foe, and continuously planning a defense. Since it can monitor civilian traffic as well, SAGE may one day take over control of that too. Thus the money spent will yield a bonus in addition to the protection SAGE has already afforded in its military role.

The Air Force uses airborne computers by the thousands. Indeed, the need for small lightweight computers for applications in aircraft led to early work in the miniaturization of components that made possible tiny computers for missile and space use. Small digital computers were built for “drone” aircraft navigation; now more advanced computers provide “air data,” air-speed, altitude, flight attitude, pressure, and other information.

Other Air Force computers, used in BMEWS radar, take the place of human observers. These smart computers can recognize radar tracks that are potential missile trajectories, discriminate among these tracks to select hostile trajectories, and project them to impact points and times. Called MIPS, for Missile Impact Predictor Set, the computer takes over from its human forerunner who just can’t seem to perform the 200,000 operations a second required to do the job.

Another space-tracking computer called SPADATS has been installed at NORAD Combat Operations Center at Colorado Springs. This computer has the assignment of around-the-clock cataloging of all man-made objects in space, a sizable and growing task.

At Vandenberg Air Force Base, the Air Force maintains an EDP, or Electronic Data Processing, project with a more earth-bound job of cataloging. Started back in 1957, this project has as its primary task the efficient allocation of manpower for the global Strategic Air Command team. At nearby Edwards Air Force Base, an IBM 7090 computer is helping to develop the Dyna-Soar manned space glider. This computer is also doing work for the X-15 program, and research on fuels, lunar probes, rocket nozzles, and nose cones. At Tinker Air Force Base in Oklahoma, a new system of keeping track of jet engine parts, so that they go back on the proper engine, uses a recorder “gun” wired to a central control computer. Engine parts are metal-tagged with coded letters which the recorder “reads” and transmits to the computer for filing.

Computers play a big part in the “largest and most sophisticated logistic data and message communications system in the world.” Delivered to the Air Force in January of 1962, “Comlognet” connects 450 different air bases and other installations. This system started out modestly, handling about 10 million punched cards a day, and is heralded as only a forerunner of an automatic system which will some day take care of the complete interflow of data among widely separated military and civilian locations.

Besides being part of complex navigation and bombing systems, computers help the Air Force to score the results of practice bombing missions. Computers control the launching of Sidewinder missiles from aircraft and also permit accurate “toss-bombing” of nuclear payloads from fighter bombers. These computers do all the mathematics and let the pilot approach his target from any direction, at any speed and altitude. The new Skybolt ballistic missile, launched from the B-52 bomber, has its own guidance computer, which is actually a digital differential analyzer, a hybrid device like that described in an earlier chapter.
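The impact-projecting chore credited to MIPS a moment ago can be suggested, very crudely, with schoolbook physics. The Python sketch below assumes a drag-free ballistic object over flat ground, far simpler than anything in BMEWS, and every track number in it is invented; it shows only the flavor of the arithmetic behind those 200,000 operations a second.

    import math

    G = 32.2  # gravitational acceleration, feet per second squared

    def impact(x0, z0, vx, vz):
        """Project a drag-free ballistic track to ground impact.
        Positions in feet, velocities in feet per second."""
        # Solve z0 + vz*t - G*t*t/2 = 0 for the positive root t.
        t = (vz + math.sqrt(vz * vz + 2.0 * G * z0)) / G
        return x0 + vx * t, t    # downrange impact point (ft), time (sec)

    # A hypothetical radar track: 100,000 feet up and descending.
    where, when = impact(x0=0.0, z0=100_000.0, vx=8_000.0, vz=-2_000.0)
    print(f"impact in {when:.0f} sec, {where / 5280:.0f} miles downrange")

A real predictor must also fit the track from noisy radar returns and allow for drag and the curve of the earth, which is where the furious arithmetic comes in.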
One of the largest single computers in the Air Force is that called Finder. Using 70,000 transistors, it does analytical work on electronic countermeasures.

Today there are some 110,000 aircraft flying the skies in this country, about double the number ten years ago. Not only the quantity but also the speed of aircraft has increased, and the job of the aircraft controller has become a nightmare. With the lives of air travelers in his hands, this overworked FAA employee has until recently used the same equipment that served in the days of 180 miles-per-hour piston-engine transports. We have discussed some examples of the computer as a director of air traffic; the automatic ground-controlled-approach system that lands planes in bad weather without human help is one, the mighty SAGE defense system is another. SAGE may one day take over commercial air traffic; in the meantime, the Federal Aviation Agency relies heavily on smaller computers in locations all over the country. Originally, general-purpose business computers were put to work processing the vast quantities of data needed to keep traffic flowing along the airways. New, special designs, including those of the Librascope Division of General Precision, Inc., are being added as they become available. Remington Rand UNIVAC is also working on the problem, and UNIVAC equipment has been tested on Strategic Air Command round-robin flights. It has posted as many as eighty in-flight fixes for one mission, a feat that the unaided human controller can only gasp at.

Obviously, control of aircraft cannot be turned over pell-mell from human to computer. The FAA is proceeding cautiously, and a recent report from an industry fact-finding board recommended a “Project Beacon” approach which will continue to rely heavily on radar plus human controllers. But when the problems of communication between man and machine are worked out, no human being can keep track of so many aircraft so accurately, or compute alterations in course to prevent collision and ensure an optimum use of air space as can the computer.

------------------------------------------------------------------------

_On the Sea_

The Navy uses computers too. At the David Taylor Model Basin in Maryland, a UNIVAC LARC is busy doing design work on ship hulls. Other computers mounted in completed Navy vessels perform navigation and gun-ranging functions. At New London, Connecticut, a Minneapolis-Honeywell computer simulates full-scale naval battles. Radar and sonar screens in mock submarine command posts show the maneuvering of many ships in realistic simulations. Polaris submarines depend on special computers to launch their missiles, and the missiles themselves mount tiny computers that navigate Polaris to its target. Another computer task was the “sea testing” of the nuclear submarine “Sea Wolf” before it was launched!

[Illustration: Photo courtesy of _Litton Systems, Inc._

Airborne computer-indicator system in Hawkeye naval aircraft. This equipment performs tasks of surveillance, tracking, command and control. ]

Computers are being used by the Navy in a project that has tremendous implications not only for military but for civilian use as well. Mark Twain to the contrary, a lot of people have tried to do something about the weather, among them an Englishman named Richardson. Back in 1922 he came up with the idea of predicting the weather for a good-sized chunk of England. Basically his ambitious scheme was sound.
Drawing on weather stations for the data, he determined to produce a 24-hour forecast. Unfortunately for Mr. Richardson, the English, and the world in general, the mathematics required was so complicated that he labored for three months on that first prediction. By then it had lost much of its value—and it was also wrong!

The only solution that Richardson could think of was to enlist the aid of about 60,000 helpers who would be packed into a huge stadium. Each of these people would be given data upon which to perform some mathematical operation, and then pass on to the next person in line. Pages would transfer results from one section of the stadium to another, and a “conductor,” armed with a megaphone undoubtedly along with his baton, would “direct” the weather symphony, or perhaps cacophony. As he lifted his baton, the helpers were to calculate like crazy; when he lowered it they would pass the result along. What Richardson had invented, of course, was the first large-scale computer, a serial computer with human components. For a number of reasons, this colossal machine was never completed. It was obviously much easier to simply damn the weatherman.

Actually, Richardson had stumbled onto something big. He had brought into being the idea of “numerical weather prediction.” It is known that weather is caused by the movement of air and variations in its pressure. Basically it is simple, knowing pressure conditions yesterday and today, to project a line or extrapolate the conditions for tomorrow. If we know the conditions tomorrow, we can then predict or forecast the temperature, precipitation, and winds.

[Illustration: _U.S. Navy_

Weather map prepared and printed out by computer gives data in graphical form. Enlarged view of weather “picture” (above) shows how it is formed by printed digits representing the pressure at reporting stations. ]

There was even the mathematics to make this possible in Richardson’s day: the so-called “primitive equations” of the pioneer mathematician Euler. These are six partial differential equations involving velocity, pressure, density, temperature, and so on. But though the principle is simple, the practical application is hopelessly involved—unless you have a stadium filled with 60,000 willing mathematicians or a fast computer of some other type.

In 1950 the stage was finally set for the implementation of numerical weather prediction. First, electronic computers were available. Second, and importantly, meteorologist C. G. Rossby had worked some magic with the original primitive equations and reduced them to a single neat equation with only four terms. The new tool is called the Rossby equation. Meteorologists and mathematicians at Princeton’s Institute for Advanced Study decided to combine the Rossby equation, the MANIAC computer, and some money available from the Office of Naval Research. The result was JNWPU, Joint Numerical Weather Prediction Unit, later to become NANWEP, for Navy Numerical Weather Problems Group. It is too bad that pioneer Richardson did not live to see the exploitation of his dream.

What NANWEP does is to take the meteorological data from some 3,000 reporting stations, compare them with those existing yesterday, and print out a weather map for the Northern Hemisphere for tomorrow. Because there are so many more stations reporting than the handful that Richardson used, the number of computations has risen to the astronomical total of about 300,000,000.
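The project-a-line idea at the heart of the scheme can be caricatured in a few lines. In the Python fragment below every station name and pressure (in millibars) is invented, and each station is treated alone; the real method couples the whole grid of stations through the Rossby equation, but the extrapolation is the same in spirit.

    # Toy numerical "forecast": tomorrow = today + (today - yesterday).
    # Station pressures in millibars; all values invented.
    yesterday = {"Boston": 1012.0, "Norfolk": 1008.0, "Miami": 1015.0}
    today     = {"Boston": 1009.0, "Norfolk": 1010.0, "Miami": 1014.0}

    tomorrow = {name: 2 * today[name] - yesterday[name] for name in today}
    print(tomorrow)  # {'Boston': 1006.0, 'Norfolk': 1012.0, 'Miami': 1013.0}

Multiply that one subtraction and addition by thousands of stations, several levels of the atmosphere, and many short time steps, and the total reaches the 300,000,000 computations just mentioned.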
Despite this, a Control Data Corporation 1604 digital computer does the job in a good bit less time than the three months it took Richardson. NANWEP prints out its weather maps 40 minutes from the time all data are in. Teletype reports come in from the thousands of weather stations; these are punched on tape and fed to the 1604. Since the information includes geographical position in addition to meteorological data, the computer prints out numbers that form a map of weather coming up. Although the meteorologist adds some clarifying lines by connecting points of equal pressure, the “raw” map with its distinctive shaded areas is meaningful even to the layman. Further refinements are in the offing. As many as 10,000 weather stations may eventually report to the central computer, which may also learn to accept the teletype information directly with no need for the intermediate step of punching a tape. Although it will be a long time before a positive forecast, exact in every detail, is possible, NANWEP already has lifted weather prediction from the educated guesswork of the older meteorologists to truly scientific forecasting. It turns out that numerical weather prediction brings with it some bonuses. NANWEP can predict the action of ocean waves three days in advance, in addition to its regular wind, temperature, and precipitation information. So it is now being put to work preparing optimum routes for ships. Here’s the way it would work. A ship sailing from California to Japan requests the best routes for the voyage. Initially the computer is given the ship’s characteristics and told how it will perform in various sea conditions. It then integrates this information with the predicted sea conditions for the first day’s leg, and plots several different courses. Distances the ship would travel on each of these courses are plotted, and a curve is drawn to connect them. Now the computer repeats the process for the next day, so that each of the tentative courses branches out with its own alternates. The process is repeated for each of the five days of the voyage. Then the computer works backward, picking the best route for the entire voyage, and gives the course to be followed for optimum time. If that isn’t sufficiently informative for the captain, he can request and receive not one but three courses: one for the fastest trip regardless of sea condition, another for the fastest trip with waves of only a certain height, and finally a course for the fastest trip through calm water! The advantages of such a service are immediately obvious and give a hint at many other applications of the technique to air travel, truck transport, and so on. NANWEP is ground-based, of course. There are also airborne weather computers like those of the U.S. Weather Bureau’s National Severe Storm Research Aircraft Project. The Weather Bureau has jumped its computer budget from $1.5 to $2.5 million to extend this and other projects. The compact airborne computers ride along in DC-6 and B-57 aircraft to monitor hurricanes off Florida and tornadoes in the Great Plains area. The computers gather forty different kinds of information and convert it to digital form at thousands of characters a second. Such monitoring of violent weather by means of computers suggests an intriguing use of the machine. Man has long considered the prospect of going the step beyond weather recording and prediction to actually changing or even creating his own weather. 
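Before passing on to the weather-changers, the work-backward course selection just described deserves a sketch, for it is a form of what programmers now call dynamic programming. In the Python fragment below the lattice of three “lanes” per day, the hour figures, and the three-day voyage are all invented; only the scheme of working backward from the final day reflects the NANWEP procedure described above.

    # Skeleton of the work-backward route calculation; all data invented.
    # HOURS[day][a][b]: sailing time from lane a to lane b on that day,
    # reflecting the forecast sea conditions along each leg.
    DAYS, LANES = 3, 3
    HOURS = [
        [[24, 30, 40], [28, 25, 33], [40, 30, 26]],   # day 1 forecast
        [[26, 27, 38], [30, 24, 30], [42, 31, 25]],   # day 2 forecast
        [[25, 29, 41], [29, 26, 32], [39, 30, 27]],   # day 3 forecast
    ]

    best = [0.0] * LANES      # cost-to-go beyond the last day is zero
    plan = []
    for day in reversed(range(DAYS)):     # work backward, day by day
        moves = [min(range(LANES), key=lambda b: HOURS[day][a][b] + best[b])
                 for a in range(LANES)]
        best = [HOURS[day][a][moves[a]] + best[moves[a]]
                for a in range(LANES)]
        plan = [moves] + plan

    lane = 0                              # departure lane
    print("total sailing time:", best[lane], "hours")
    for day in range(DAYS):
        lane = plan[day][lane]
        print("day", day + 1, "- steer for lane", lane)

The captain’s three-course choice is the same calculation rerun with different leg costs; barring legs with waves above a set height, for instance, amounts to striking those entries from the table.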
Man has done a few rather startling things of this kind, admittedly on a small scale but with tremendous implications. Cloud-seeding experiments are examples: attempts both to induce precipitation and to create or destroy storms. These experiments, though inconclusive, have led to results—including precipitation of lawsuits and ill feeling. Meteorologists attempted to divert a hurricane along the Atlantic coast line once, apparently with results. But the storm swerved too far and the weathermen incurred the justifiable wrath of those living in the area affected. Why not simulate such an experiment in the computer? Besides being safer, it is also far cheaper. In the long run, we may do something about the weather at that.

_Computers in Space_

There are many points in history when seemingly fortuitous happenings take place. The invention of the printing press appears to have occurred at a fork in the road as literature flowered. The discovery of gasoline and the automobile went hand in hand. So it is with the electronic computer and the spacecraft. Is the computer here because it was needed for such an application, or did it actually cause the advent of space flight? Our conclusions must depend on our belief or disbelief in such things as causality. A realistic view might be merely to applaud and appreciate the confluence of two important streams of thought to make a river that will one day flow to the other planets and finally out of the solar system entirely.

Putting even something so unsophisticated as a brick into orbit would require the plotting of an exact trajectory handily done only by a computer. Sending the Mercury capsule aloft obviously requires a more refined aiming system, and its re-entry into the atmosphere demands a nicety of calculation measured in a fraction of a degree. The same is true for the Russian achievements in sending a space vehicle around the moon, and manned capsules in prolonged orbit. Such navigation can be planned and carried out only by the sophisticated mathematics of a computer. Dr. Wernher von Braun has said that any effective space-vehicle firing program would be impossible without computers and computing techniques.

Not long ago, the mariner could leisurely brace himself on the deck of his vessel and take a noon sight with his sextant. It mattered little if it took him some time to work out the computations; his ship traveled at only a few knots and in only two dimensions. Today the space capsule or missile moves as far in a single minute as a ship might in an entire day, and it moves not across the practically flat surface of the sea but through three-dimensional space in which that third degree of freedom is of vital importance. Not only must the navigation be done with fantastic precision, it must be done in “real time” to be of any value. This is true whether the mathematics is being done by a Mercury capsule or one of our antimissile missiles. Just as Richardson’s weather prediction three months after the fact was of little use, the trajectory of an invading missile will avail us nothing if it takes us thirty minutes to compute. The problem by then, for the survivors, will be one of fallout and not blast. For this reason a computer is aboard practically every space vehicle that leaves the earth. The Atlas and Titan, the Minuteman and Polaris, all are controlled by tiny digital computers in their innards, supplemented by more complex machines on the ground.
These ground computers calculate the trajectory, then monitor the missile to correct its course if necessary. Complex as these functions seem, they are childishly simple by comparison with the kind of calculations that are necessary for lunar or planetary flight. A mathematician who knew his astronomy could work out the figures necessary to launch a space craft on its flight to Venus, but he would have to start some time before launching day. In fact, it would take forty generations of mathematicians to do the job. The trip itself would consume about four months. At the Jet Propulsion Laboratories of the California Institute of Technology, this 800-year project is planned and flown in thirty seconds by an IBM 7090 computer. For example, the computer tells us that if we had blasted off bright and early on August 17, 1962, we could make it to the Clouded Planet at 10:09 A.M., December 9. The curved trip through space would cover 32,687,000 miles.

The computer, then, not only can perform in real time but can even shrink time. The Venus trip is simulated daily at the Jet Propulsion Laboratories, and tapes stored in the computer cabinets also bear the names Moon, Mars, Saturn, Jupiter, and so on. When the day comes to make the actual voyage, the odds are good that because of what scientists have learned from the computer the trip will go as smoothly as all the simulations. Ahead of the planetary voyages, which are still some time off, lunar soft landings will be among the first to demonstrate the accuracy of simulations now being made by General Dynamics, whose Atlas-Centaur will put the lunar rover Surveyor on the moon shortly. Apollo, the three-man lunar spaceship, won’t be far behind.

Not long ago a computer was put to work to see if it could pare down the costs of the Atlas and Thor rocket engines. We have to have such defensive weapons, but the cheaper we can make them the more we can afford. The economy program worked, reducing costs more than a third.

_Summary_

The computer is on the Washington payroll to stay, and it may well move up the hierarchical ladder there. It was not a comedian but an M.I.T. professor who recently suggested that the computer will replace the bureaucrat. Contending that the computer is inherently more flexible than the bureaucrat, Professor John McCarthy told an Institute of Radio Engineers meeting that the machines will not regiment us. “On the contrary, I think we can expect a great deal more politeness from machines than we have gotten from humans,” he said. His views were debated by other panelists, but the gauntlet seems to have been flung. With a party affiliation, the computer may well run for president someday!

[Illustration: Lichty, © _Field Enterprises, Inc._

“It IS human, men!... Besides solving our problems of global strategy, it’s also beginning to jot down its memoirs!” ]

------------------------------------------------------------------------

“_Business may not be the noblest pursuit, but it is true that men are bringing to it some of the qualities which actuate the explorer, scientist, artist: the zest, the open-mindedness, even the disinterestedness, with which the scientific investigator explores some field of research._” —Earnest Elmo Calkins

8: The Computer in Business and Industry

The government, of course, is not the only user of the electronic computer. Business is faced with the same problems as government, plus others perhaps, and can use the same techniques in planning, producing, merchandising, and keeping track of its products.
To General Electric goes the distinction of first installing the large-scale electronic computer for its business-data processing. This was done quite recently, in 1954. Commenting on the milestone, the _Harvard Business Review_ said in part:

    The revolution starts this summer at General Electric Company’s new
    Appliance Park near Louisville, Kentucky. The management planning
    behind the acquisition of the first UNIVAC to be used in business
    may eventually be recorded by historians as the foundation of the
    second industrial revolution; just as Jacquard’s automatic loom in
    1801 or Frederick W. Taylor’s studies of the principles of
    scientific management a hundred years later marked turning points
    in business history.

It is early yet for comment from historians, but the growth of the business computers from the pioneer UNIVAC bears out the theme of the _Harvard Business Review_ suggestion. In 1961 there were 6,000 large electronic computers in use; General Electric alone has more than 100. One big reason for this is the fact that government is not alone in its output of paperwork. It has been estimated that one-sixth of our Gross National Product, or about $85 billion, is devoted to paper-handling. In the time it takes to read this chapter, for example, Americans are writing 4 million checks, and this is only a small part of the paperwork involved in the banking business.

[Illustration: _General Electric Co., Computer Dept._

First National Bank of Arizona personnel operate sorters during initial operation of a new GE-210 computer-controlled data-processing system. The sorters process bank checks at the rate of 750 per minute as printer (foreground) prints bank statements at 900 lines per minute. ]

Wholesale banks have been called fiscal intelligence agencies, doing business by the truckload, and measuring the morning mail by the ton. Yet this information is dealt with not only in volume, but in precise and accurate detail. If a client asks about the rating of a customer who has just ordered several million dollars worth of goods, the bank may be called on to furnish this information in a very short time, even though the customer resides halfway around the world. Since they deal in figures, it is logical that banks were among the first businesses to be computerized. Many of us are aware of those stylized numbers now on the bottom of most of our checks, and vaguely conscious that through some mysterious juggling by computers called ERMA and other such names banks balance our accounts at electronic speed.

Insurance companies were next in line as computer candidates. Like banking, insurance is believed to have been available to Babylonian merchants thousands of years ago. In those days there were fewer people, and probably claims were fewer; the abacus was the only computer needed to keep pace. But since insurance was introduced on the North American continent, coincidentally in the same state, Pennsylvania, as banking, it has been threatened with drowning in a sea of its own policies. The computer is ideally suited for doing the work of the insurance business. There is no question today that the computer has taken over from the insurance clerk. One firm installed computers in 1953 and since then has doubled its accounts and tripled dollar volume, without hiring the 250 additional people who normally would have been required for such an increase.
Eight outlying offices have been closed, yet service is better and faster, agents’ commissions are paid twice a month instead of only once, and actuarial computations that once took six months are now done in a week.

A computerized world is of course not without its problems. The computer system is so efficient, in fact, that the same outcry is going up from labor as was heard in the days of the first industrial revolution. It has been said with some truth that automation upgrades jobs, and not the workers themselves. The change-over from quill pen to pushbutton console will take some time and cause some pain, but in the end our gain will be as great a stride as we have made since the days of the introduction of the first factories with their more efficient production methods. Surely the business worker already has been freed from the tedium of adding columns of figures and much filing, and given pleasanter work in exchange.

_The Shopper’s Friend_

After banking and insurance, which businesses yield to the lure of high-speed automatic data-processing? Department stores are dabbling, and supermarkets too are beginning to use the computer. The A & P stores are studying such a system, as is the Liggett Drug Company. At first the computer looked attractive as an inventory and ordering tool; now it is headed in the direction of automating the actual shopping operation.

In Paris, a retail grocer made merchandising history by displaying more than 3,000 different items in a floor space of only 230 square feet. The trick is in a punched-card system that automatically registers and prices any item the buyer selects. At the check stand the card is run through a computer which figures the bill and orders the groceries, which are automatically selected from the warehouse and delivered in a cart to the purchaser at the door!

A similar automatic supermart system is being pushed by Solartron—John Brown, Ltd., in England. The computerized scheme works much like the French one. The shopper inserts a card in the slot beside the item she wants and a punch marks it in alpha-numeric code for item and price. If more than one item is desired the card is reinserted. With each punch, the machine slices off a bit of the edge of the card so that it slides deeper into the slot next time. At the cashier’s station, the card is placed in a computer. Fifteen seconds after she has paid, the shopper’s groceries are delivered to her. Besides the saving in time for the shopper, there is a saving for the grocer in floor space and also the elimination of losses through shoplifting.

About the only thing that might seem to be against the new system is the psychology of the large display, which motivation researchers tell us stimulates volume buying. With this factor in mind, an official of Thompson Ramo Wooldridge, Inc., has suggested retaining the large stocks on display, but coding them with fluorescent paint of certain wavelengths to correspond to price. The shopper fills her cart even as in the conventional store, but at the mechanized checkstand an electronic eye on the computer scans and prices the items while they are being automatically packaged. The doubting Thomases say of this system that the packager will probably put the eggs on the bottom, along with the tomatoes and ice cream!

The advertising journal _Commercial Art_ comments sadly on this accepted fact of automation in the market place:

    The checkout clerk is doomed, that last survivor of human warmth in
    most of today’s supermarkets.
    His eventual executioner will be the electronic computer, of course.
    Pilot systems using computers for automatic checkouts are already
    drawing a bead on the jovial little man in the green smock.
    Eventually even he will disappear from the faceless canyons of our
    sleek supermarkets.

But the writer finds a ray of hope in the conclusion of his editorial.

    Skilfully designed packages can strike an emotional chord in the
    consumer, can create strong brand preferences even in the absence
    of product differences. Supermarkets can give the appearance of
    being a friendly, “human” place to shop even if the only humans
    visible are the customers.

Completing the rout of conventional merchandising by the computer-oriented system is a plan to automate even the trading stamp. American Premium Systems, Inc., a Texas firm, is developing a plan in which the customer receives a coded plastic card instead of a stamp book. When he makes a purchase, a card is punched with the number of credits he has earned. By means of a centralized computer, an IBM 1401 in this instance, records are kept continuously, and when the customer has accrued 1,500 points he receives a premium automatically. The obvious advantage here is to the customer, who is spared the messy task of licking thousands of evil-tasting multicolored stamps, and the danger of losing the book before redemption. But the storekeeper profits too. He does not risk the loss or theft of stamps, nor does he buy stamps for people who are not going to save them. The complete system will call for an IBM 7074 and represents an outlay of about $3 million to service some 6 million customer accounts.

Before leaving the area of merchandising, it might be well to mention inventory management in general and the effect of the computer upon it. Applying what is known as “conceptual order analysis,” one marketer who is using computers in his business talks of “warehousing without bricks or mortar.” With a confidence born of actual testing, his firm expects one day to have _no_ inventory except that on his production lines or in transit to a customer. This revolutionary idea is based on practically instantaneous inventorying, production ordering, and delivery scheduling. While the warehouse without bricks or mortar is not yet a fact, research discloses many manufacturers who have already cut their standing inventories, from small amounts to as much as 50 per cent, while maintaining customer service levels. This was done using what by now are “standard” electronic information-handling methods. The implication here is of the computer not merely as a data-handler, but as a business organizer and planner as well.

_Electronic Ticker Tape_

The stock market lends itself to the use of high-speed data-processing, even though a Wall Street man achieved notoriety some time back as the first embezzler to use computer techniques. Admittedly it is harder to track down the hand in the till when it pushes buttons and leaves no telltale fingerprints or handwriting, but computerization continues despite this possible drawback. The firm involved has since added digital computers to one of its offices for faster service. The American Stock Exchange installed $3 million worth of new processing equipment to provide instantaneous automatic reports on open, high, low, close, bid, asked, and volume-to-the-moment figures.
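Those “to-the-moment” figures involve bookkeeping of the simplest kind, and a sketch shows how little each trade demands. The ticker symbol, prices, and volumes in the Python fragment below are invented, and no particular exchange’s machinery is implied.

    # Running open-high-low-close and volume from a stream of trades.
    trades = [("XYZ", 50.0, 200), ("XYZ", 50.5, 100), ("XYZ", 49.8, 300)]

    book = {}
    for symbol, price, shares in trades:
        rec = book.setdefault(symbol, {"open": price, "high": price,
                                       "low": price, "close": price,
                                       "volume": 0})
        rec["high"] = max(rec["high"], price)
        rec["low"] = min(rec["low"], price)
        rec["close"] = price               # last price seen so far
        rec["volume"] += shares

    print(book["XYZ"])
    # {'open': 50.0, 'high': 50.5, 'low': 49.8, 'close': 49.8, 'volume': 600}

A few comparisons and an addition per trade is trivial work; it is the millions of trades, and the insistence on answers to the moment, that call for the machine.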
[Illustration: _International Business Machines Corp._

On the floor of the New York Stock Exchange, representatives of Thomson & McKinnon and IBM discuss a model of the computing system which will speed transactions from the offices of the brokerage houses in 41 cities to the New York and American stock exchanges. ]

The stock market’s need for the computer lies in the usual two factors: tremendous paperwork and increasing pressure for speed. Trading of stock amounts to about three-fourths of a billion shares in a year, and occasionally 3 million shares a day change hands. A major brokerage house has confirmations to handle on thousands of trades, dividends to credit to nearly half a million active accounts, and security position and cash balances to compute for each customer. The increasing amount of business, plus the demand for more speed and accuracy, make the computer the only solution.

Simply reporting the results of the day’s trading in the newspapers is a monumental task. The Associated Press is installing a system based on an IBM 1620 computer, in which ticker information will also be fed into the computer for sorting, comparison, tabulation, and storage. At the correct time, the machine will print out the format for publication in the press at the rate of 4,500 words a minute. With a memory of 20 million characters and a capacity for 600,000 logical decisions each minute, the computer keeps up with stock information practically as fast as it is received, and even a late ticker will not mean a missed newspaper deadline. Associated Press expects to be able to transmit the stock-market results to its papers within fifteen seconds after the ticker closes.

Not just in the United States but in Japan as well, the computer is invading the stock market. The abacus is out, and now the exchange in Tokyo is using an advanced UNIVAC solid-state computer to process transactions.

_Versatile Executive_

It is this high-volume capacity, speed, and accuracy that makes the computer a welcome new employee in most business operations. An example is the Johnson’s Wax system linking its facilities for rapid management reaction to changing conditions. Headquarters is linked to twenty-three warehouses and sales offices, and today’s work is based on yesterday’s inventory instead of last month’s. Computers schedule hotel reservations and handle accounts payable and receivable for the hotel industry. Auto-parking, now a $500 million a year business, leans ever more heavily on computers for ticket-issuing, car-counting, traffic direction, charge-figuring, and collection. The freeway too has its computers, though there have been minor setbacks like that on the New Jersey Turnpike where an automatic toll-card dispenser was mistaken by slow-thinking people for a collector and its works were jammed with coins and battered by abuse when no change was forthcoming! Man will take some educating as the machine finds wider employment.

The computer has been seen in the publishing business primarily as a tool for searching lists and printing addresses. Now it is beginning to take over more important duties such as typesetting. The new daily _Arizona Journal_ is the first newspaper to make use of this technique. From use in other businesses, the computer has grown to fostering a business of its own.
Examples are the production of payroll checks by specialty firms, and the safeguarding against bad checks with such services as Telecredit, a computer-run system that spots bad checks upon interrogation from its member stores. In Waterbury, Connecticut, a computer helps home-buyers and realtors by listing all available homes in the area. Three reports are produced: a total listing, a listing by style, and a listing by price. Bell Telephone in New York uses a computer system to deliver its 9 million directories to subscribers in the city and suburbs. The rapid system permits changing of delivery orders even while the books are at the printers. A computer method of making sausage recipes is now available to all packers. Remington Rand developed this application at its UNIVAC Center on the campus of Southern Methodist University.

_Communication_

Communication is a vital part of all business, and the digital computer finds another application here. A technique known as adaptive control was recently presented at a symposium by scientists from IBM. Special-purpose computers integrated into communication networks would make possible the “time-sharing” of channels and cut costs per message sharply. Another digital computer, an inexpensive “decision threshold” device, is being pushed as a means of reducing errors in the transmission of messages. These logical uses of the computer were presaged in the 30’s, when Shannon wrote his pioneering circuit-logic paper, and in the late 40’s, with his work on information theory.

TV Station KNXT in Los Angeles uses a digital computer to control the complicated switching necessary during station breaks. This electronic juggling of live shows, commercials, and network programming is called TASCON, for Television Automatic Sequence Control. It can be programmed hours before use, and then needs only the push of a button instead of frantic manual switching that occasionally throws the human operator.

Not just the mechanics of transmitting the commercials on TV, but even the billing and other accounting functions are a major computer project. To handle close to $700 million a year in payments, an IBM 7090 computer is being used. There are more than 5,000 TV stations in the country, with billings dependent on a complicated structure of 180 different rates. As a result, there is an undesirable lag in payment. Putting records on tape and feeding them to the computer is expected to clear up the trouble and provide a bonus in the form of advising stations on discount rates for programming on a current basis.

The computer isn’t content with skirting the edges of the advertising game, of course. A heated battle is going on now in this industry over the growing use of the computer to plan campaigns and actually evaluate ads, a task held by some to be the exclusive domain of the human adman with his high creative ability. The Industrial Advertising Research Institute triggered the fight by using a computer to study 1,130 advertisements appearing in the industrial journal _Machine Design_ and select the best black-and-white and the best color ads. While diehards snorted ridicule, the computer made its choices. IARI then compared its selections with those made by two of the largest and most experienced rating firms. On color ads, the computer scored 66 per cent, rating two out of three ads practically the same as the human selectors. With black-and-white it did even better, scoring 71 per cent.
Its detractors, assuming of course that the human raters were infallible, gloated that the computer was a flop, that it could pick only the average ads accurately and fell down on excellent and poor ones. The agency of Batten, Barton, Durstine & Osborn thought otherwise and is using the computer in its advertising. As a tool for media selection and scheduling, BBDO likened the computer to a power shovel instead of a spade. The new method makes it possible to compare thousands of combinations a second.

Another firm, the Simulmatics Corporation, agrees with BBDO. The computer, it says, will permit advertising campaigns far more effective than those waged at present, since the most efficient campaign may be too complex to be devised without artificial aid. The key to the Simulmatics system is the “media mix model,” in which a hypothetical campaign can be tried out in advance in the computer.

Young & Rubicam differs hotly with computer advocates. A spokesman leveled a low blow at the computer, suggesting that it will have difficulties with motivational research based on Freudian analyses! The firm says no way has yet been found to transpose “Viennese fatuities” into Arabic numerals. It deplores the turning of a media-planner into a rubber stamp as media selection becomes an automatic reiteration which “those with an abacus could pipe to a stale and sterile tune.” The battle rages, but the outcome seems to be a foregone conclusion. Either the computer will sway Madison Avenue from Viennese fatuities, or it will learn about sex.

_Industry_

We have discussed the computer in business; perhaps it would be well to stress that this includes industry as well. The computer not only functions in the bank and brokerage house, insurance office, and supermart, but also is found increasingly in jobs with oil refineries, chemical plants, surveying teams, knitting mills (a likely application when we remember Jacquard), and steel mills. As automation takes over factories, it brings the computer with it to plan and operate the new production methods.

Transportation too is making good use of the computer. Freight-handling in the United States, Canada, England, and the U.S.S.R. is using machine techniques. Our high-speed airplanes are already more aimed than flown, and less and less seen and seen from. Mach-3 aircraft are on the drawing boards now, aircraft that will fly at three times the speed of sound or about 2,000 miles per hour. An airliner taking off from London must already be cleared to land in New York. So authorities on both sides of the ocean are concerned. In England, giant computers like the Ferranti Apollo and others are on order. There is talk in that country too of integrating military and commercial aviation into one traffic control system. In the next ten years the sky population may double again, in addition to flying faster, further crowding the airlanes and particularly the space adjacent to airports. The only solution to this aerial traffic jam lies in the electronic computer.

Not as spectacular as air traffic control, but important nonetheless, is the job of planning the route an airliner will fly. United Air Lines uses a Bendix G-15 to select flight plans for its big DC-8’s. In a manner similar to the NANWEP course-planning described for surface vessels, the computer examines a number of possible routes for the big transports, considering distance flown, wind, temperature, weight and fuel requirements, and time schedules.
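Stripped to essentials, the selection amounts to costing each tentative route and keeping the cheapest. The sketch below shows the idea in a modern programming language; the routes, winds, and single-figure fuel rule are invented stand-ins for the many factors the G-15 actually weighs:

    # Score each tentative route and keep the cheapest.
    # Routes, winds, and the one-line cost rule are invented stand-ins.
    routes = [
        {"name": "great circle", "miles": 2475, "tailwind_mph": 20},
        {"name": "southern",     "miles": 2550, "tailwind_mph": 85},
        {"name": "northern",     "miles": 2500, "tailwind_mph": 45},
    ]

    CRUISE_MPH = 550         # still-air speed of the transport
    GALLONS_PER_HOUR = 1500  # fuel burn, assumed constant here

    def fuel_burned(route):
        ground_speed = CRUISE_MPH + route["tailwind_mph"]
        hours_aloft = route["miles"] / ground_speed
        return hours_aloft * GALLONS_PER_HOUR

    best = min(routes, key=fuel_burned)
    print(best["name"], round(fuel_burned(best)), "gallons")

The real computer weighs far more than fuel, of course, but the examine-every-candidate-and-keep-the-best cycle is the same.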
This flight-planning was originally done by manual computation and required an hour to work out details for only one possible flight plan. The computer method was demanded because of the increased speed of the big jets and their sensitivity to weather conditions en route. The computer examines a number of tentative plans in minutes and selects the one which will make the optimum use of winds aloft, temperatures, weather, and so on. If weather changes en route require it, the pilot can call the planning center no matter where he is and request that the computer work out a new flight plan.

Once the optimum flight plan has been figured, an electronic computer in the aircraft itself may one day assure that the desired flight path is actually flown. The ASN-24 computer, developed by Librascope, Incorporated, and the Air Force, weighs only thirty-one pounds, yet performs more than 20 million computation steps in a six-hour flight. The electronic navigator, with information from Doppler equipment and other navigation aids, evaluates which is the best “fix,” weighing for example the relative accuracies of a Loran fix and a dead-reckoning fix. The computer even shoots celestial fixes and plots the results! Obviously faster than its human monitor, the electronic navigation computer solves navigation problems with an error as small as one part in 32 million.

A broader use of the computer in aircraft is proposed by the Convair Division of General Dynamics. Because today’s airplane is far more complicated than those ten years ago, and those ten years hence will extend this trend, the firm feels that checkout of the aircraft will require electronic computers. While adding about 3 per cent to the total cost of the plane, such equipment could perform a variety of functions including maintenance analysis and would add an hour a day to the profit-making flight time.

There would be no profit for the airlines with the best flight planning and in-flight control in the world if there were no passengers aboard; the “traffic problem” extends from the sky to the ticket counter. For this reason most airlines have already recruited the computer for another important job—that of ticket reservation clerk. An example, recently installed by United Air Lines, is the “Instamatic,” a giant, far-flung system weighing 150 tons and requiring 12,000 miles of circuits. Instamatic cost $16 million and can handle 540,000 reservations in a single day. So complex is the computer system that it requires 40,000 printed-circuit boards, 500,000 transistors, and 2,000,000 ferrite memory cores. But it gets the job done, and any one of 3,000 agents all over the country can confirm space on any flight, anytime, within seconds!

There are other systems used by competing lines, systems called Sabre, Teleflite, and so on. But Remington Rand UNIVAC has proposed an over-all system that will make any of them look like a child’s do-it-yourself walkie-talkie. The UNIVAC plan is for a single interline reservation system, used by all twenty-four domestic airlines. Called AID, for Airline Interline Development, the new scheme would cost the airlines only 12 cents per message, and could be tied in with foreign carriers for international bookings.

[Illustration: _Remington Rand UNIVAC_ Console for airlines reservation system permits pushbutton booking of space. ]

Present methods of reservations among airlines require from less than a minute for easy bookings to several hours for the tough ones.
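However the request reaches the machine, the bookkeeping behind a confirmation is simple: check the space remaining, record the sale, and pass word to the boarding point. A minimal sketch, with invented flight numbers and capacities:

    # Central seat inventory; flight numbers and capacities are invented.
    seats_left = {"Flight 21": 2, "Flight 88": 0}

    def book(flight, passenger):
        if seats_left.get(flight, 0) > 0:
            seats_left[flight] -= 1
            # A real center would now teletype the boarding point.
            return "CONFIRMED %s ON %s" % (passenger, flight)
        return "SORRY, %s IS FULL" % flight

    print(book("Flight 21", "SMITH"))   # CONFIRMED SMITH ON Flight 21
    print(book("Flight 88", "JONES"))   # SORRY, Flight 88 IS FULL

The difficulty is not the arithmetic but doing it for thousands of agents at once, within seconds, against a single up-to-date record of every flight.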
The AID system uses a dial phone, with direct lines to a central computer in Chicago. The response to the dialed request is an immediate voice answer. If space is available, the computer also stores all the needed information for the reservation and transmits a teletype message to the boarding point of the proper airline.

To go back another step, the aircraft on which the computer confirms seat space was most likely built with the help of another computer. A typical production system is that used by Lockheed in its Marietta, Georgia, plant. There an IBM 305 RAMAC computer keeps track of 45,000 parts orders continuously. The result is better and faster operation, and a saving to Lockheed of $3,500 a month. In California, Lockheed is using a computerized data acquisition system called EDGE, for Electronic Data Gathering Equipment, that feeds production information directly into a computer memory for analysis and action orders. Remote reporting stations can be operated by production-line workers and will relay production data to the central computer. Although the Lockheed EDGE system will cost more than $600,000 a year, officials feel that it will save the company three times that at the outset, and perhaps more when wider use is made of its potential. An interesting feature is the tying together of Lockheed’s widely separated plants at Sunnyvale, Palmdale, and Van Nuys, California.

North American Aviation links its complex of plants in the Los Angeles area by microwave, even bouncing beams of data from reflectors atop Oat Mountain where there is no direct line-of-sight path between the different locations. Douglas Aircraft maintains a data link between California and Charlotte, North Carolina, to permit use of computers over a distance of 2,400 miles.

The airlines are also using computer inventory systems to control their stock of spare parts. Material costs represent 60 per cent of airline revenue and are rising; some larger carriers have investments of as much as $75 million in spare parts. It takes the computer to control the flow of repairable parts through the shop efficiently, schedule the removal of those requiring periodic checks, spot high-use items, and so on. As an example of the complexity a large airline faces in its maintenance, TWA stocks 8,000 different replaceable items. When such parts are needed, they must be on hand _where_ they are needed, but overstocking can lead to financial ruin. To match increasing competition, airlines find it necessary to resort to the laws of probability and other sophisticated statistical techniques in stocking parts. Fed such equations, the computer can match ten to twelve man-years of work in three hours, and mean the difference between an oversupply of parts in New York with outages in Los Angeles, and properly balanced stocks.

The ramifications of the computer in the airplane industry are far-reaching. For example, Boeing has recorded the lessons it learned on its Bomarc missile program in computers so that it can retain and apply them on its Minuteman and Dyna-Soar programs. The computer will thus keep track of men and their projects and warn them of previous mistakes. Modern management techniques such as PERT and PEP, favored by the government, make good use of the computer.

The McDonnell Aircraft Corporation is primarily a builder of planes and space vehicles, but it has found itself in the computer business too as a data-processing center.
Installing computers for its own engineering and business uses, McDonnell soon began selling computer time in off hours to banks and other businesses. It now has a computer valuation of about $10 million and operates around the clock.

_The Designing Computer_

It seems strange that the computer was a bookkeeper and clerk for years before anyone seriously considered that it might be an engineer as well, yet the men who themselves designed the computer were loath to use it in their other work. Part of this resistance stems from the high premium placed on the creativity of research and design work. The engineer uses science in his work, to be sure, but he professes to use it as an artist, or with the personal touch of, say, a brewmaster.

There is another possible reason for the lag in computer use by the men who should appreciate its ability the most. In the early days of the computer, it clacked away all week figuring payrolls, and perhaps writing checks. That’s what it was ordered for, and that’s where the money was—in the businessman’s application of the computer. To be sure, the military was using the computer for other purposes, but the average scientist or engineer not employed by Uncle Sam had access to an electronic computer only on Sunday, if at all, when the big machine had done its primary work and could take a breathing spell.

To further compound excuses for the foot-dragging engineer, there was a difference between the needs of payroll computation and those of scientific mathematical calculation. Commercial computers are designed for a high rate of input and output, with relatively slow arithmetic going on inside. The engineer, on the other hand, might need only several minutes of computer time, but it could take him a couple of days to put the problem into a form the machine could digest.

Slowly, however, enough engineers fought the battle of translation and forewent Sunday pursuits like church, picnics, and golf to learn haltingly how to use the electronic monster. It took courage, in addition to sacrifice, because the computer was pooh-poohed by some sharp scientific brains as an _idiot savant_ at best. Behind the inertia there could have been a touch of concern too—concern that the machine just might not be as stupid as everybody kept saying it was.

Heavy industry made use of the machines. The steel plants, petroleum and chemical plants, and even the designers of highways were among the early users of computer techniques. There was of course good reason for this phenomenon. Faced with problems involving many variables and requiring statistical and probabilistic approaches, these people could make the best use of machines designed for repetitive computations. The refiner with a new plant in mind could simulate it in the computer and get an idea of how, or if, it would work before building his pilot plant. Today the notion of dispensing with even the pilot plant is getting serious consideration. One program used by a gasoline producer analyzed thirty-seven variables and thirty-seven restrictions, a matrix that could never be evaluated by ordinary methods. Textile fiber research is another example, with thread tests run on dozens of samples and averaged statistically for valid conclusions. B. F. Goodrich put the computer to work in its laboratories at such tasks as multiple-regression studies of past production of processes like polymerization and the running of a batch of new material on the computer. These applications were accomplishing a two-fold benefit.
First, years were being telescoped into weeks or even days; second, _complete_ investigation rather than sketchy sampling was possible. Optimum solutions took the place of the guesswork once necessary because of the lack of sufficient brainpower to run down all the possibilities.

Still there were scientists and designers in other fields who shook their heads loftily and said, “Not for me, thanks.” The computer was but a diligent clerk, they held, relieving the engineer of some onerous chores. It could do nothing really creative; that must be left to man and his brain.

By now many industrial firms had purchased or rented computers for the technical people so that they would not have to fight for a place in line at the payroll computer. Civil engineering agencies, perhaps a hundred strong, used computers to design bridges and plan and lay out highways. Designers at the Tudor Engineering Company of San Francisco put the firm’s Bendix G-15D to work planning the highway that Contra Costa County will need in 1980. Almost all of our fifty states now use computers in their highway departments. In 1960, Georgia solved more than a thousand highway bridge design problems in its computers. Besides doing the work faster and cheaper, the computer produces a safer product. For example, if substandard materials are programmed in, the computer will print out a warning or even stop working altogether so that the error can be corrected.

Steel companies, like Jones & Laughlin, use computers not only to run production mills, but also as research tools. Three hours of operation of a new furnace can be simulated in the computer in thirty seconds. Tracing the steel back to its ore, the computer is used again. The Bureau of Mines has used the machines for several years; they are helpful in problems ranging from open-pit operation and ore grading to drill-core data logging, reserve calculations, and process control.

[Illustration: _General Electric Co., Computer Dept._ Computer operation of Jones & Laughlin steel mill. ]

Gradually, then, the resistance was worn down. Grudgingly at first, and accepting the computer only as an assiduous moron, engineers in other fields put it to work. Complex machine operations like gear-shaping were planned and carried out by computers that even punched out tapes for controlling the production tools. Optics designers switched from desk calculators to electronic computers. Mechanical engineers in jobs from ultrasonic vibrators to tractor design became users of computers. Mass spectrometry, heat-exchanger design, and waterworks design joined the jobs the computer could do. The computer had figured in plotting trajectories for missiles, and in the production of aircraft; engineers found it could design them too. Back in 1945, an analysis of twenty-one different flight conditions at each of twelve stations of an airplane fuselage took 33 days and cost more than $17,000. Today, by using a high-speed computer instead of a desk calculator, the analysis is completed in a day and a half, at a cost of $200!

The last of the diehards seemed to be the electronics people themselves. A survey conducted by a technical journal in the field showed that in 1960 many designers were not using computers in their work. Admitting that the computer was a whiz just about everywhere else, the electronics engineer still could say, “The machine is great on paperwork, but I do _creative_ work.
The computer can’t help me.” Other reasons were that computers were expensive, took much time to program, and were helpful only with major design problems.

Fortunately, not all designers feel that way, and progress is being made to put the computer to work in the electronics field. It is helping in the design of components (Bendix saves ten man-hours in computing a tenth-order polynomial and associated data) and of networks (Lenkurt Electric saves close to 250 engineering hours a week in filter network design). Bell Telephone uses the computer approach in circuit analysis, and Westinghouse in the design of radar circuitry. It is interesting that as we move up the design scale, closer to what the engineer once considered the domain of human creativity, the computer still is of great value. In systems design it is harder at the outset to pin down the saving in time and the improvement in the system (the latter is perhaps hard to admit!) but firms using computers report savings in this field too.

One interesting job given the computer was that of designing the magnetic ink characters to be used in its own “reading” applications. This project, conducted by Stanford Research Institute, is typical of the questions we have begun to ask the computer about its needs and ways to improve it. A larger scale application of this idea is that of letting the computer design itself. Bell Telephone Laboratories developed such a system, called BLADES, for Bell Laboratories Automatic Design System, to design a computer used in the Nike-Zeus antimissile defense system.

A wag once noted that the computer would one day give birth to an electronic baby. His prophecy came true perhaps more quickly than he anticipated, but there is one basic difference in that the progeny is not necessarily a smaller machine. The giant LARC, for instance, was designed by lesser computers. As A. M. Turing has pointed out, it is theoretically possible for a simple computer to produce a more complex one. This idea is borne out in nature, of course, and man is somewhat advanced over the amoeba. Thus the implication in the computer-designed computer is far more than merely the time and money saved, although this was certainly a considerable amount. The BLADES system in twenty-five minutes produced information for building a subassembly, a job that required four weeks of manual computation.

Notable improvements in the general-purpose computer are doing much to further its use as a technical tool. Present machines do jobs as varied as the following: personnel records, inventorying, pattern determination, missile system checkout, power-plant control, system simulation, navigation, ballistic trajectory computations, and so on. Special computers are now provided for the engineer as well; among these is the Stromberg-Carlson S-C 4020 microfilm recorder. Engineering specifications are put into the computer and the machine can then produce on request mechanical drawings as required by the engineer. Data stored in the memory is displayed on a Charactron tube. There is little resistance to this type of computer, since the engineer can say it is doing work below his level of ability! Of course, the draftsman may take a dim view of computers that can do mechanical drawing.

[Illustration: _Bell Telephone Laboratories_ Engineer checks design information for first computer built from complete information furnished by another computer. Shown is a subassembly of the computer, which will be used in the Army’s Nike-Zeus antimissile defense system.
]

After a rather hard-to-explain slow start, then, the computer is now well established as a scientific and engineering tool. Blue-sky schemes describe systems in which the engineer simply discusses his problem with the machine, giving specifications and the desired piece of equipment. The machine talks back, rejecting certain proposed inputs and suggesting alternatives, and finally comes up with the finished design for the engineer’s approval. If he laughs overly loud at this possibility, the engineer may be trying to cover up his real feelings. At any rate the computer has added a thinking cap to its wardrobe of eyeshade and work gloves.

_Digital Doctor_

Medical electronics is a fairly well-known new field of science, but the part being played in medicine by the computer is surprising to those of us not close to this work. Indicative of the use of the computer by medical scientists is a study of infant death rates being conducted by the American Medical Research Foundation. Under the direction of Dr. Sydney Kane, this research uses a UNIVAC computer and in 1961 had already processed information on 50,000 births in ninety participating hospitals. Punched-card data include the mother’s age, maternal complications, type of delivery, anesthetics used, and other pertinent information. Dr. Kane believes that analysis by the computer of this information may determine causes of deaths, after-birth pathological conditions, and incapacity of babies to reach viability. A reduction in infant mortality of perhaps 12,000 to 14,000 annually is believed possible as a result of the studies.

Another killer of mankind, cancer, is being battled by the computer. Researchers at the University of Philadelphia, supported in part by the American Cancer Society, are programming electronic computers to act as cancer cells! The complexity of the problem is seen in the fact that several man-years of work and 500 hours of computer programming have barely scratched the surface of the problem. A third of a million molecules make up the genes in a human cell, and the actions of these tiny components take place many times faster than even the high-speed computer can operate. Despite the problems, some answers to tough chemical questions about the cancer cells are being found by using the computer, which is of course thousands of times faster than manual computation.

If you were discharged from a hospital in 1962, there is a chance that your records are being analyzed by a computer at Ann Arbor, Michigan, as part of the work of the Commission on Professional and Hospital Activity. Information on 2-1/2 million patients from thirty-four states will be processed by a Honeywell 400 computer to evaluate diagnostic and hospital care and to compare the performance of the various institutions.

In the first phase of a computerized medical literature analysis and retrieval system for the National Library of Medicine, the U.S. Public Health Service contracted with General Electric for a system called MEDLARS, for MEDical Literature Analysis and Retrieval System. MEDLARS will process several hundred thousand pieces of medical information each year.

New York University’s College of Engineering has formed a biomedical computing section to provide computer service for medical researchers. Using an IBM 650 and a Control Data Corporation 1604, the computer section has already done important work, including prediction of coronary diseases in men under forty.
The success of computers in these small-scale applications to the problems of medicine has prompted the urging of a national biomedical computer system. It is estimated that as yet only about 5 per cent of medical research projects are using computer techniques, but that within ten years the figure will jump to between 50 and 75 per cent.

An intriguing possibility is the use of the computer as a diagnostic tool. Small office machines, costing perhaps only $50, have been suggested, not by quacks or science-fiction writers, but by scientists like Vladimir Zworykin of the Rockefeller Institute for Medical Research. Zworykin is the man who fathered the iconoscope and kinescope that made television possible. The simple diagnostic computer he proposes would use information compiled by a large electronic computer which might eventually catalog the symptoms of as many as 10,000 diseases. Using an RCA 501 computer, a pilot project of this technique has already gathered symptoms of 100 hematological diseases.

Another use of the computer is in the HIPO system. Despite its frightening acronymic name, this is merely a plan for the automated dispensing of the right medicine at the right time to the right patient, thus speeding recoveries and preventing the occasional tragic results of wrong dosage. More exotic is a computer called the Heikolator, which is designed to substitute for the human brain in transmitting messages to paralyzed limbs that could otherwise not function.

The simulation of body parts by the computer for study is already taking place. Some researchers treat the flow of blood through arteries as similar to the flow of water through a rubber tube, analyze these physical actions, and use them in computer simulation of the human system. The Air Force uses a computer to simulate the physical chemistry of the entire respiratory and circulatory systems, a task that keeps track of no less than fifty-three interdependent variables. Dr. Kinsey of the Kresge Eye Institute in Detroit is directing computer work concerning the physiology of the eye. According to Kinsey it was previously impossible to approximate the actual composition of cell substances secreted from the blood into the eye.

Even those whose eyes no longer serve them are being benefited by computer research. The Battelle Memorial Institute in Columbus, Ohio, uses an IBM computer to develop reading devices for the blind. These complicated readers use a digital computer to convert patterns of printed letters into musical tones. Further sophistication could lead to an output of verbalized words. Interestingly, it is thought that the research will also yield applications of use in banking, postal service, and other commercial fields.

Russia is also aware of the importance of the computer in the medical field. A neurophysiologist reported after a trip to Russia that the Soviet Union is training its brightest medical students in the use of the computer. Such a philosophy is agreed to by medical spokesmen in this country, who state that no other field can make better use of the computer’s abilities. Among advanced Russian work with computers in the biomedical field is a study of the effects on human perception of changes in sound and color. Visionary ideas like those of radio transmitters implanted in patients to beam messages to a central computer for continuous monitoring and diagnosis are beginning to take on the appearance of distinct possibilities.
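The diagnostic machine Zworykin proposes is at heart a matching device: compare the patient’s symptoms against the cataloged symptoms of each disease and report the closest fits. A toy sketch of such matching (the catalog entries are invented placeholders, not medical data):

    # Rank catalog diseases by how many of the patient's symptoms they share.
    # The entries below are invented placeholders, not medical data.
    catalog = {
        "disease A": {"fever", "fatigue", "pallor"},
        "disease B": {"fever", "rash"},
        "disease C": {"fatigue", "bruising", "pallor"},
    }

    def rank(symptoms):
        scores = [(len(symptoms & signs), name)
                  for name, signs in catalog.items()]
        return sorted(scores, reverse=True)   # best match first

    patient = {"fatigue", "pallor", "bruising"}
    for score, name in rank(patient):
        print(name, "matches", score, "symptoms")

A working catalog would also weight each symptom by how strongly it points to a disease, but the table-lookup character of the job is plain.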
Some are beginning to wonder if after it has learned a good bedside manner, the computer may even ask for a scalpel and a TV series.

_Music_

The computer has proved itself qualified in a number of fields and professions, but what of the more artistic ones? Not long ago RCA demonstrated an electronic computer as an aid to the musical composer. Based on random probability, this machine is no tongue-in-cheek gadget but has already produced its own compositions based on the style of Stephen Foster. Instead of throwing up their hands in shocked horror, modern composers like Aaron Copland welcome the music “synthesizer” with open arms. Bemoaning only the price of such a computer—about $150,000—Copland looks to the day when the composer will feed in a few rough ideas and have the machine produce a fully orchestrated piece. The orchestration, incidentally, will include sounds no present instruments can produce. “Imagine what will happen when every combination of eighty-eight keys is played,” Copland suggests.

Many traditionalists profess to shudder at the thought of a machine producing music, but mathematical compositions are no novelty. Even random music was “composed” by Mozart, whose “A Musical Dice Game” is chance music with a particularly descriptive title, and Dr. John Pierce of Bell Laboratories has extended such work.

[Illustration: Taken from “_Illiac Suite_,” by L. A. Hiller and L. M. Isaacson, copyrighted 1957, by _Theodore Presser Co._ Used by permission. Random chromatic music produced by ILLIAC computer resembles the compositions of some extreme modern composers. ]

In 1955, Lejaren A. Hiller, Jr., and L. M. Isaacson began to program the ILLIAC computer at the University of Illinois to compose music. The computer actually published its work, including “Illiac Suite for String Quartet,” Copyright 1957, New Music Editions, done in the style of Palestrina. All music lies somewhere between the complete randomness of, say, the hissing of electrons in vacuum tubes and the orderliness of a sustained tone. No less a master than Stravinsky has called composition “the great technique of selection,” and the computer can be taught to select in about any degree we desire. Hiller describes the process, in which the machine is given fourteen notes representing two octaves of the C-major scale, and restricted to “first-species counterpoint.” By means of this screening technique, the computer “composed” by a trial-and-error procedure that may be analogous to that of the human musician. Each note was examined against the criteria assigned; if it passed, it was stored in memory; if not, another was tried. If after fifty trials no right note was found, the “composition” was abandoned, much as might be done by a human composer who has written himself into a corner, and a new start was made. In an hour of such work, ILLIAC produced several hundred short melodies—a gold mine for a Tin Pan Alley tunesmith! It was then told to produce two-voice counterpoint for the basic melodies. “Illiac Suite” is compared, by its programmers at least, with the modern music of Bartok.

Purists whose sensibilities are offended by the very notion of computer music point out that music is subjective—a means of conveying emotion from the heart of the composer to that of the listener. Be that as it may, the composition itself is objective and can be rigorously analyzed mathematically, before or after the fact.
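The trial-and-error screening Hiller describes translates almost directly into a program: propose a note at random, test it against the rules, keep it or try again, and abandon the attempt after fifty failures. A sketch under a deliberately simple rule (no leap larger than a third), standing in for the actual counterpoint tests:

    import random

    SCALE = list(range(14))   # two octaves of the C-major scale
    MAX_TRIALS = 50           # ILLIAC's give-up point

    def acceptable(melody, note):
        # Stand-in rule: move by a step or a third from the last note.
        # ILLIAC's real tests enforced first-species counterpoint.
        return not melody or abs(note - melody[-1]) <= 2

    def compose(length):
        melody = []
        while len(melody) < length:
            for _ in range(MAX_TRIALS):
                note = random.choice(SCALE)
                if acceptable(melody, note):
                    melody.append(note)   # passed the tests: store it
                    break
            else:
                return None   # fifty failures: abandon this attempt
        return melody

    tune = None
    while tune is None:       # like the composer, make a fresh start
        tune = compose(12)
    print(tune)

The rule here is far cruder than ILLIAC’s, but the store-or-retry cycle, and the fresh start after fifty failures, are exactly the procedure Hiller describes.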
From a technical standpoint there seems to be only one question about this new music—who composed it, the programmer or the computer?

An interesting sidelight to computer music is its use to test the acoustics of as yet unbuilt auditoriums. Bell Telephone Laboratories has devised such a machine in its Acoustical and Visual Research Department. The specifications of the new auditorium are fed into the computer, followed by music recorded on tape. The computer’s output is then this music as it will sound in the new hall. Critical experts listen and decide if the auditorium acoustics are all right, or if some redesign is in order.

_The Machine at Play_

The computer’s game-playing ability in chess and other games has been described. It is getting into the act in other fields, spectator sports as well. Baseball calls on the computer to plan season strategy and predict winners. When Roger Maris began his home-run string, an IBM 1401 predicted that he had 55 chances in 100 of beating Ruth’s record. Workers at M.I.T. have developed a computer program that answers questions like “Did the Red Sox ever win six games in a row?” and “Did every American League team play at least once in each park in every month?”

An IBM RAMAC computer is handling the management of New York’s Aqueduct race track, and promises to do a better job than the human bosses, thus saving money for the owners and the State of New York Tax Commission. The Fifteenth Annual Powderpuff Derby, the all-women transcontinental air race, was scored by a Royal Precision LGP-30 computer, and sports car enthusiasts have built their own “rally” computers to gauge their progress. The Winter Olympics at Innsbruck, Austria, will be scored by IBM’s RAMAC, and even bowling gets an assist from the computer in the form of a scoring device added to the automatic pin-setter, bad news to scorekeepers who fudge to boost their points. An IBM 704 has proved a handy tool for blackjack players with a system for winning 99 per cent of the time, and rumor has it that a Los Angeles manufacturer plans to market a computer weighing only two pounds and costing $5, for horse-players. Showing that the computer can be programmed with tact is the demonstrator that tells a man’s age correctly if he answers ten questions but announces only that a woman is over twenty-one.

Proof that the computer has invaded just about every occupation there is comes to light in the news that a Frankfurt travel agency uses a computer called Zuse L23 as an agent. The traveler simply fills out a six-question form, and in a few seconds Zuse picks the ideal vacation from a choice of 500. Computers, it seems, are already telling us where to go.

_Business Outlook_

The computer revolution promises to reach clear to the top of the business structure, rather than find its level somewhere in middle management. The book _Management Games_ lists more than 30,000 business executives who have taken part in electronic computer management “games” in some hundred different versions. The first widely used such game was developed in 1956 by the American Management Association. While such games are for educational purposes, their logical extension is the actual conduct of business by a programmed computer. In his book _Industrial Dynamics_, Dr. J. W. Forrester points out that a high-speed digital computer can be used in analyzing as many as 2,000 variables such as costs, wages, sales, and employment.
This is obviously so far beyond human capability that the advantage of computer analysis becomes evident. A corollary benefit is the speed inherent in the computer, which makes it possible to test a new policy or manufacturing program in hours right in the computer, rather than waiting for months or years of actual implementation and possible failure. For these reasons another expert has predicted that most businesses will be using computer simulations of their organizations by 1966. Regardless of the timetable, it is clear that the computer has jumped into business with both its binary digits and will become an increasingly powerful factor.

[Illustration: Lichty, © _Field Enterprises, Inc._ “Our new ‘brain’ recognizes the human factor, doctor!... After feeding it the symptoms, it gives the diagnosis and treatment.... But YOU set the fee!” ]

------------------------------------------------------------------------

“_Men have become the tools of their tools._” —Thoreau

9: The Computer and Automation

In his movie _Modern Times_, Charlie Chaplin long ago portrayed the terrible plight of the workman in the modern factory. Now that the machine is about to take over completely and relieve man of this machinelike existence, it is perhaps time for Charlie to make another movie pointing up this new injustice of civilization or machine’s inhumanity to man. The machine, it seems, is damned if it does and damned if it doesn’t. For some strange reason, few of us become alarmed at the news of a computer solving complex mathematics, translating a book, or processing millions of checks daily, but the idea of a computer controlling a factory stimulates union reprisals, editorials in the press against automation, and much general breast-beating and soul-searching. Perversely we do not seem to mind the computer’s thinking as much as we do its overt action.

It is well to keep sight of the fact that automation is no new revolution, but the latest development in the garden variety of industrial revolution that began a couple of centuries ago in England. Mechanization was the first step in that revolution, mechanization being the application of power to supplement the muscles of men. Mass production came along as the second step at the turn of this century. It was simply an organization of mechanized production for faster, more efficient output. Automation is the latest logical extension of the two earlier steps, made possible by rapid information handling and control. Recent layoffs in industry have triggered demonstrations, and even television programs, indicating that we suspect automation of having a rather cold heart. The computer is the heart of automation.

[Illustration: _Remington Rand UNIVAC_ Control operations require “real-time” computers that perform calculations and make necessary decisions practically instantaneously. ]

None of these steps is as clear-cut or separate as it may seem without some digging into history and an analysis of what we find. For example, while we generally consider that the loom was simply mechanized during the dawn of industrial revolution, the seeds of computer control were sown by Jacquard with punched-card programming of the needles in his loom. Neither is it sufficient to say that the present spectacle of automated pushbutton machines producing many commodities is no different from the introduction of mass-produced tractors. Tractors, after all, displaced horses; the computer-controlled factory is displacing men who don’t always want to be put out to pasture.
Automation is radically changing our lives. It is to be hoped that intelligent and humane planning will facilitate an orderly adjustment to this change. Certainly workers now toil in safer and pleasanter surroundings. It is reported that smashed toes and feet, hernia, eye trouble, and similar occupational accidents have all but disappeared in automated automobile plants. Unfortunately other occupational hazards are reportedly taking the place of these, and the psychological trauma induced by removal of the worker’s direct contact with his craft has given more than one worker stomach ulcers.

Let us investigate this transfer of contact from man to computer-controlled machine. A paper presented at the First Congress of the International Federation of Automatic Control, held in Moscow in 1960, uses as its introductory sentence, “Automatic control always involves computing.” The writer then points out that historically the computing device was analog in nature and tied so closely with the measuring and control elements as to be indistinguishable as an actual computer. In more recent history, however, the trend has been to separate the computer. With this trend has come another important change: the use of the digital computer in automatic control.

One of the first papers to describe this separate computer function is “Instrument Engineering, Its Growth and Its Promise,” by Brown, Campbell, and Marcy, published in 1949. “Naturally,” the authors state, “a computer will be used to control the process.” Not a shop foreman or an engineer, but a computer. Watt’s “flyball” governor pioneered the field; more recent and more obvious examples of control by computers include ships guided by “Iron Mike” and airplanes flown by the automatic pilot. These were analog devices, and the first use of a digital computer as a control was in 1952, quite recently in our history. This airborne digital control computer was built by Hughes and was called “Digitac.”

Since most industries have been in existence for many years, far antedating aviation, electronics, and the modern computer, the general incorporation of such control has been difficult both because of the physical problem of altering existing machines and the mental phenomenon of inertia. Factory management understandably is slow to adopt a revolutionary technique, and most control systems now in use in industry are still analog in nature. However, where new plants are built from the ground up for computer control, the results are impressive. Designed by United Engineering, the Great Lakes 80-inch hot strip mill automatically processes 25-ton slabs of steel. More than 1,000 variables are controlled, and 200 analog signals and 100 digital computer-generated signals are used in the process. The steel sheets are shot out of the rolls at some 45 miles an hour, or about 66 feet a second! A human supervisor would have a difficult job just watching the several hundred signals related to thickness, temperature, quality, and so on, much less trying to think what to do if he noticed something out of specifications. This would be roughly analogous to an editor trying to proofread a newspaper as it flashes by on the press and making corrections back in the linotype room before any typographical errors were printed. The new computer-controlled mill has an output of about 450,000 tons a month, twice that of the next largest in operation.
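The watching itself is easy to state as a program, even if doing it at sixty-six feet of steel a second is not: compare each signal against its specification limits and flag any that stray. A minimal sketch, with invented signal names and limits:

    # Flag any mill signal that has drifted outside its specification.
    # Names and limits are invented for illustration.
    limits = {
        "thickness_in": (0.098, 0.102),
        "temperature_f": (1550, 1650),
        "speed_fps": (60, 70),
    }

    readings = {"thickness_in": 0.103, "temperature_f": 1602, "speed_fps": 66}

    for name, value in readings.items():
        low, high = limits[name]
        if not low <= value <= high:
            print(f"OUT OF SPEC: {name} = {value} (allowed {low}-{high})")
            # A control computer would now compute and apply a correction.

Run over a thousand signals many times a second, this simple comparison is the tireless watchfulness no human supervisor can match.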
American control experts who attended the Moscow conference brought back the information that Russian effort in computer control is greater than that in the United States, and that the Russians are more aware of what we are doing in the field than we are of their progress. Their implementation of modern computer control may be made easier because their industries are newer and do not represent such a long-established and expensive investment in hard-to-modify existing equipment.

Basically, at least, computer control is simple and can be compared to the feedback principle that describes many physical systems including the workings of our own bodies. In practice, the computer can be put in charge of producing something, and by sampling the output of its work can constantly make corrections or improvements that are desired. This is of course an extreme simplification, and the control engineer speaks of “on-line” operation, of adaptive systems that adjust to a changing environment, of predictive control, and so on. One vital requirement of the computer involved in a control process, obviously, is that it cannot take its time about its computations. The control computer is definitely operating “on the line”; that is, in real time, or perhaps even looking ahead by a certain amount so that it can not only keep up with production but also predict forthcoming changes and make corrections in time to be of use.

The human process controller is stuck with methods like those of the cook who mixes up his recipe with a spoonful of this, and three pinches of that, sniffs or tastes the batter subjectively, and may end up with a masterpiece or a flop. Computer control processes the same batter through the pipes at a thousand gallons a minute and catches infinitesimal variations in time to correct them before the hotcakes are baked. In effect it makes hindsight into foresight by compressing time far more than man could hope to do.

Early applications of the computer in industrial processes were simply those of data “loggers,” or monitors. It was still up to the human operator to interpret what the computer observed and recorded, and to throw the switch, close the valve, or push the panic button as the case demanded. Actual computer control, the “closing of the loop” as the engineers call it, is the logical next step. This replaces the human operator, or at least relegates him to the role of monitor.

The Great Lakes hot-rolling steel mill has been mentioned as an example of complete computer control. In Hayange, France, the first European completely automated steel-beam mill is slated to go into operation late in 1962. The Jones & Laughlin Steel Corporation in this country uses a digital computer system to control continuous annealing in its Aliquippa, Pennsylvania, plant, and is evaluating an RCA computer-controlled tin-plating line operating at 3,000 feet a minute. Newer computer-control applications in the offing include sintering and other metal production operations.

[Illustration: _Minneapolis-Honeywell_ Boston ice cream makers, H. P. Hood & Sons, use computer to make pushbutton ice cream. Analog computer thinks out recipes, punches them on cards to operate valves. ]

To those of us who consume it, ice cream may not seem a likely candidate for computer control. However, the firm of H. P.
Hood & Sons uses computer control in its blending operation, finding it 20,000 times as fast as, and more accurate than, human operation, since computer controls hold mixes within one-tenth of 1 per cent accuracy. Automation is a significant breakthrough in this industry, whose history goes back 110 years, and in baking, which is a little older. The Sara Lee bakeries use the computer too in assembling the ingredients for their goodies. To bake such cakes, Mother will have to get herself a computer.

Minneapolis-Honeywell furnished the computer for the ice-cream control; this same company delivered a system for the Celanese Corporation of America’s multimillion-dollar acetyl manufacturing plant at Bay City, Texas. The new plant produces a petrochemical used in plastics, paint, synthetic rubber, dye, fibers, and other products. Going “on-stream” in 1962, the Celanese plant will produce half a billion pounds of chemicals annually.

Russia has been mentioned as active in industrial computer control. A case in point is the soda plant at Slavyansk in the Donets Basin, which was recently test-operated for a continuous period of 48 hours by computer. An unusual feature of this test was that the computer was in Kiev, almost 400 miles away. A wire link between the two cities permitted monitoring and control of the plant from Kiev in what the Russians claim as the first remote automatic operation of such a plant. Other Soviet achievements include two large-scale automatically controlled installations. In oil-field operation at Tataria, gas and oil outputs from many wells are monitored and controlled from a central station, dropping the work force required from 600 to 100. The other installation controls irrigation servicing 9,000 acres. A desktop control handles the pumping of water from the Syr Darya River through underground pipes, and distribution to Uzbekistan cotton fields. The Russians have also designed an automatic distillation unit for the Hungarians. With an annual capacity of a million tons, the unit was installed in the large Szoeny refinery and scheduled for operation by 1962.

Refineries in the United States are also employing automatic controls in their operations. Phillips Petroleum installed a digital computer control system in its Sweeney, Texas, plant to achieve maximum efficiency in its thermal cracking process. In the first step of an experimental program, Phillips, working with Autonetics computer engineers, used a digital computer to plan optimum furnace operation. An initial 10 per cent improvement was achieved in this way, and a further 6 per cent gain resulted when a digital computer was installed on-line to operate the cracking furnace.

The Standard Oil Company of California is using an IBM 7090 in San Francisco to control its catalytic or “cat” cracking plant in El Segundo, some 450 miles away. The need for computer speed and accuracy is shown by the conditions under which the cracking plant must operate: continuously, with no shutdowns except for repair. Each day, two million gallons of petroleum is mixed in the cracker with the catalyst, a metallic clay. The mixing takes place at incandescent heat of 1,000° F., and the resulting inferno faces operators with more than a hundred changing factors to keep track of, a job feasible only with computer help.

Another use of computer control in the petroleum industry is that of automatic gasoline blending, as done by the Gulf Oil Corporation. A completely electronic system is in operation at Santa Fe Springs, California.
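Closing the loop, whether on a cracking furnace or a blending manifold, always comes down to the same cycle: measure the product, compare it with the target, and feed back a correction in proportion to the error. A sketch of the bare principle; the gain figure and the one-line stand-in for the analyzer are invented:

    # Proportional feedback: nudge the valve until the blend hits its target.
    # The plant response and the gain below are invented for illustration.
    target_octane = 93.0
    valve = 0.50        # fraction of high-octane stock in the blend
    GAIN = 0.02

    def measure(valve):
        # Stand-in for the analyzer: octane rises with the valve setting.
        return 85.0 + 12.0 * valve

    for step in range(8):
        error = target_octane - measure(valve)
        valve += GAIN * error          # correct in proportion to the error
        print(f"step {step}: valve {valve:.3f}, octane {measure(valve):.2f}")

Working controllers add refinements, limits on valve travel and terms that anticipate trends among them, but this measure-compare-correct cycle is the whole idea of the closed loop.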
Gulf’s blending system automatically delivers the prescribed quantities of gasoline for the desired blend. In case of error or malfunction of equipment, the control alerts the human supervisor with warning lights and an audible alarm. If he does not take proper action, the control system automatically shuts itself off.

From the time the war-inspired industry of synthetic rubber production began in 1940 until very recently, production was almost entirely a manual operation. Then in 1961 Goodyear Tire & Rubber introduced computer control into the process at its Plioflex plant in Houston, Texas. Goodyear expects the new system to increase its “throughput” and also to improve the quality of the product through tighter, smoother control of the complicated operation. Other chemical processors using computer control in their plants include Dow Chemical, DuPont, Monsanto, Union Carbide, Sun Oil, and The Texas Company.

Adept at controlling the flow of material through pipes, the computer can also control the flow of electricity through wires. An example of this application is the use of digital computers in electric-utility load-control stations. A typical installation is that of the Philadelphia Electric Company in Philadelphia, the first to be installed. Serving 3-1/2 million customers, the utility relies on a Minneapolis-Honeywell computer to control automatically and continuously the big turbine generators that supply electric power for the large industrial area. The memory of the computer stores data about the generators, transmission-line losses, operating costs, and so on. Besides controlling the production of power for most economy, the computer in its spare time performs billing operations for the exchange of power carried on with the Pennsylvania-New Jersey-Maryland Interconnection, the Delaware Power & Light Company, and the Atlantic City Electric Company. Other utilities using computer control are the Riverside Power Station of the Gulf States Utilities Company, Southern California Edison, and the Louisiana Power & Light Company’s Little Gypsy station in New Orleans.

Another industry that makes use of a continuous flow of material is now being fitted for computer control, and as a result papermakers may soon have a better product to sell. IBM has delivered a 1710 computer to Potlatch Forests, Inc., in Idaho for control of a paperboard machine 500 feet long. Papermaking up to now has been more art than science because of the difficulty of controlling recipes. With the computer, Potlatch expects to make better paper, have less reject material, and spend less time in changing from one product run to another.

Showing that automatic control can work just about anywhere, the English firm of Cliffe Hill Granite Company in Markfield, Leicestershire, controls its grading and batching of granite aggregate from a central location. Besides rock-crushers, cement plants in the United States like the Riverside Cement Company use computer control.

Thus far most of the computer control operations we have discussed are in the continuous-processing fields of chemicals or other uniform materials. The computer is making headway in the machine shop too, although its work is less likely to be noticed there, since the control panel is less impressive than the large machine tool it is directing. Aptly called APT, for Automatically Programmed Tools, the new technique is the brainchild of M.I.T. engineer Douglas Ross.
Automatic control eliminates the need for drill jigs and other special setup tools and results in cheaper, faster, and more accurate machine work.

[Illustration: _International Business Machines Corp._ Controlled by instructions generated by IBM’s AUTOPROMPT, a Pratt & Whitney Numeric-Keller continuous-path milling machine shapes a raw aluminum block (upper left) into the saddle-shaped piece shown at right. The surface is a portion of a geometric shape called a hyperbolic paraboloid. ]

A coded tape, generated by a computer, controls the milling machine, drill press, or shaper more accurately than the human machinist could. In effect, the computer studies a blueprint and punches out instructions on tape that tell the machine what it is to do, how much of it, and for how long. Huge shaping and contouring machines munch chunks of metal from blanks to form them into complex three-dimensional shapes. Remington Rand UNIVAC and IBM are among the companies producing computers for this purpose. The trend is to simpler, more flexible control so that even small shops can avail themselves of the new technique. In a typical example of the savings possible with “numerical” tape control, these were the comparative costs:

                      _Conventional_    _Tape Control_
    Tooling                $755              $45
    Setup time            15 min.           15 min.
    Work time             15 min.           11 min.
    Cost per part          $2.96            $1.81

[Illustration: _Control Engineering_ Operation of computer-controlled freight yard in England. ]

Beyond the automated single- or multipurpose tool is the completely computer-controlled assembly line. Complete automation of products like automobiles may be some distance off, but there is nothing basically unworkable about the idea. Simpler things will be made first, and to promote thinking along these lines, Westinghouse set up an automatic assembly line for paperweights. An operator typed the initials of manufacturing department managers on a computer, which transferred the instructions to a milling machine. The machine cut the initials in aluminum blocks which were then automatically finished, painted, and packaged for shipment as completed paperweights.

Another firm, Daystrom, Inc., is designing a computer control system for assembly lines which will adjust itself for the “best” product as an output. President Tom Jones described the principle in which the computer will begin production, then move valves, switches, or other controls a small amount. Measuring the finished product, it will decide if the change is in the right direction, and proceed accordingly. Once it finds the optimum point, it will lock in this position and settle down to business.

An excellent example of the computerized assembly line is the Western Electric Company carbon resistor production line at its Winston-Salem plant. A digital computer with a 4,096-word memory is used for the programming, setup, and feedback control of the eleven-station line. It can accept a month’s scheduling requirements for high-quality deposited-carbon resistors in four power ratings and almost any desired resistance values. Production rate is 1,200 units per hour. The computer keeps track of the resistors as they are fabricated, rejecting those out of specification and adjusting the process controls as necessary. Operations include heating, deposition of carbon, contact sputtering, welding, grooving, and inspecting.

_The Robots_

Most of these automated factory operations are doing men’s work, but it is only when we see the robot in the shape of ourselves that cold chills invade our spines.
Children’s Christmas toys lately have included mechanical men who stride or roll across the floor and speak, act, and even “think” in more or less humanoid fashion, some of them hurling weapons in a rather frightening manner. There is an industrial robot in operation today which may recall the dread of Frankenstein, though its most worried watchers are perhaps union officials. Called Unimate, this factory worker has a single arm equipped with wrist and hand. It can swing horizontally through 220 degrees and vertically through 60 degrees, and can extend its arm from 3 feet to 7 feet at the rate of 2-1/2 feet a second. Without a stepladder, it can reach from the floor to a point nearly 9 feet above it. Unimate can pick up 75 pounds, and its 4-inch fingers can clamp together on an iron bar or a tool with a force of up to 300 pounds. The robot weighs close to a ton and a half, but can be moved from job to job on a fork-lift truck. Its designers have turned up a hundred different jobs that Unimate could do, including material loading, packaging, welding, spray painting, assembly work, and so on. The robot has a memory and can retain the 16,000 “bits” of information necessary for 200 operations. To teach it a new task, it is only necessary to “help” it manually through each step one time. Unimate can be instructed to wait for an external signal during its task, such as the opening of a press or a furnace door. Advantages of a robot are many and obvious. Pretty girls passing by will not distract it, nor will it require time for lunch or coffee breaks, or trips to the washroom. If necessary it will work around the clock without asking for double power for overtime. High temperatures, noxious gases, flying sparks, or dangerous liquids will not be a severe hazard, and Unimate never gets tired or forgets what it is doing. But Unimate has some drawbacks that are just as obvious. It can’t tell one color from another, and thus might paint parts the wrong color and never know the difference. It is not readily movable, and not very flexible either. It costs $25,000, and will need about $1,300 in maintenance a year. Some industry spokesmen say that this is far too much, and that Unimate has a long way to go before it puts any people out of work. Others say it is a step in the right direction, and this is probably a fair evaluation. Apparently United States Industries, Inc., whose AutoTutor teaching machines are pacing the field, has taken another step in the right direction with its “TransfeRobot 200.” This mechanical assembly-line worker is an “off-the-shelf” item, and is currently in use by about fifty manufacturers. TransfeRobot uses its own electronic brain, coupled with a variety of magnetic, mechanical, or even pneumatic fingers, to pick up, position, insert, remove, and do other necessary operations on small parts. Besides these capabilities, TransfeRobot controls secondary operations such as drilling, embossing, stamping, welding, and sealing. It is now busy building things like clocks, typewriters, automobile steering assemblies, and electrical parts. No one-job worker, it can be re-programmed for other operations when a new product is needed, or quickly switched to another assembly line if necessary. Billed as a new hand for industry, TransfeRobot obviously has its foot in the door already. United States Industries estimates current yearly sales of its small automation equipment at about $3 million.

[Illustration: _Massachusetts Institute of Technology_ Dr.
Heinrich Ernst, Swiss graduate student at MIT, watches his computer-controlled “hand” pick up a block and drop it in the box. ]

The robots in Čapek’s play _R.U.R._ looked like their human makers, but scientist Claude Shannon is more realistic. “These robots will probably be something squarish and on wheels, so they can move around and not hurt anybody and not get hurt themselves. They won’t look like the tin-can mechanical men in comic strips. But you’ll want them about man-size, so their hands will come out at table-top or assembly-line level.” Since Professor Shannon is the man who sparked the implementation of symbolic logic in computers, his ideas are not crackpot, and the Massachusetts Institute of Technology’s Hand project is a good start toward a real robot. Dr. Heinrich Ernst, a young Swiss, developed Hand with help from Shannon. Controlled by a digital computer, the hand moves about and exercises judgment as it encounters objects. Such research will make true robots of the remotely manipulated machines we have become familiar with in nuclear power experiments, underwater exploration, and so forth. Hughes Aircraft’s “Mobot” is a good example, and it is obvious that the robot’s bones, muscles, and nerves are available. All they need is the brain to match. While we wait fearfully for more robots which look the way we think robots should, the machine quietly takes over control of more and bigger projects. The computer does a variety of tasks, from the simple one of cutting rolling-mill stock into optimum lengths to minimize waste, to that of running an electronic freight yard in which cars are classified and made up automatically. The computer in this application not only measures the car and weighs it, but also computes its rollability. Using radar as its eyes, the computer gauges the speed and distance between cars as they are being made up and regulates their speed to prevent damaging bumps. To the chagrin of veteran human switchmen, the computer system has proved it can “hump” cars—send them coasting to a standing car for coupling—without the occasional resounding crash caused by excessive speed. About all that is holding up similarly automated subway trains in the United States is approval from the union. Soviet Russia claims she already has computer-run subways and even ships. The latter application took place on the oil tanker _Engineer Pustoshkin_ plying the Caspian Sea. The main complaint of the director of this research work, P. Strumpe, is that ships are not yet designed for computer control; they will change for the better, he feels, when their designers realize the error of their ways.

[Illustration: _Hughes Aircraft Company_

Mobot Mark II, carrying a Geiger counter in its “hands,” demonstrates how it can substitute for men in dangerously radiated areas. ]

Minneapolis-Honeywell in this country is working toward the complete automation of buildings, pointing out that they are as much machines as structures. A 33-story skyscraper in Houston will use a central computer to check 400 points automatically and continuously. Temperature and humidity will be monitored, as well as doors and windows. Presence of smoke and fire will be automatically detected, and all mechanical equipment will be monitored and controlled. Equipped with cost figures, the central computer will literally “run” the building for optimum efficiency and economy.
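At bottom such supervision is a continuous scanning loop: poll each point, compare the reading against its allowed range, and alert the operator to any deviation. The sketch below, in the Python language, is a minimal illustration only; the point names, their limits, and the read_sensor and alarm routines are all invented for the example.

    # A minimal sketch of a supervisory scanning loop. The point names,
    # the limits, and the read_sensor and alarm routines are invented.
    POINTS = {
        'lobby temperature (deg F)': (68, 75),
        'boiler pressure (psi)': (0, 120),
        'smoke detector, floor 12': (0, 0),   # any nonzero reading means smoke
    }

    def scan(read_sensor, alarm):
        """Make one pass over every monitored point."""
        for name, (low, high) in POINTS.items():
            value = read_sensor(name)
            if not low <= value <= high:
                alarm(name, value)            # out of range: alert the operator

    # One pass with made-up readings:
    readings = {'lobby temperature (deg F)': 71,
                'boiler pressure (psi)': 133,
                'smoke detector, floor 12': 0}
    scan(readings.get, lambda name, value: print('ALARM:', name, '=', value))

In service the scan would repeat continuously, and the same readings would feed the cost figures by which the computer runs the building economically.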
Harvard University has a central control for seventy-six campus buildings, and in Denver work is being done toward a central control for a number of large buildings. It is fitting that the automation of buildings be carried forward, since historically it was in the home that the self-regulation of machines was pioneered, with the thermostat-controlled furnace.

[Illustration: _Robodyne Division, U.S. Industries, Inc._

TransfeRobot assembly-line worker installs clockwork parts with speed and precision. ]

In this country our traffic is crying for some kind of control, and New York is already using punched-card programming to control part of the city’s traffic. The Federal administration is studying a bold proposal from RCA, Bendix, General Motors, and Westinghouse for an automatically controlled highway. The reason? Traffic is getting to be too much for the human brain to deal with. A better one has to be found, and the computer is applying for the job. The coming of automation has been likened to a tidal wave. It is useless to shovel against it, and the job would seem to be to find suitable life preservers to keep us afloat as it sweeps in over the world. One approach is that of a nonprofit foundation set up to study the impact of automation on workers. This group, a joint organization of United States Industries, Inc., and the International Association of Machinists, has already come up with a scheme for collecting “dues” from the machines, in annual amounts from $25 to $1,000, depending on the work output of the machine. A key project of the foundation is a study of effective retraining of workers to fit them for jobs in the new, computerized factory. Such studies may well have to be extended from the assembly line to the white-collar worker and executive as well. The computer can wear many different kinds of hats!

------------------------------------------------------------------------

Teaching Machine Age

Lilyn E. Carlton in _Saturday Review_

“_In the good old-fashioned school days,
Days of the golden rule,
Teacher said, ‘Good morning, class,’
And so she started school._

_Alas! How different things are now,
The school day can’t begin
Till someone finds the socket
And plugs the teacher in._”

10: The Academic Computer

It was inevitable that the computer invade, or perhaps “infiltrate” is the better word, our education system. Mark I and ENIAC were university-born and -bred, and early research work was done by many institutions using computers. A logical development was to teach formal courses in using the computer. While application of the machine in mathematical and scientific work came first, its usefulness to business, and the need to train executives in such use of the computer, was soon recognized. As an example, one of two computers installed by U.C.L.A. in 1957 was used exclusively for training engineering executives as well as undergraduates in engineering economy. Early courses were aimed at those already in industry, in an attempt to catch them up with the technology of computer-oriented systems in business and science. As special courses, many of these carried a high tuition fee. Next came the teaching of professors and deans of engineering institutions in techniques of computer education for undergraduates. Today the computer is being taught to many students in many schools.
New York University has a $3 million computer at its Courant Institute of Mathematical Sciences, being used by students in basic and applied research on projects ranging from the design of bridges to the analysis of voting patterns in Congress. M.I.T. recently added a digital computer to teach its students the operation of electronic data-processing equipment. Another computer is used in more sophisticated work including speech analysis, the study of bioelectrical signals, and the simulation of automata as in the “Hand” project. At the computing center of the University of Michigan a second generation of computers is being installed. Students in some one hundred different courses use these computers, programming them with a language developed at the University and called MAD, for Michigan Algorithm Decoder. These are typical examples of perhaps two hundred schools using computers. That knowledge of computer techniques is essential for the engineering graduate is evident in the fact that in one recent class of such students at Purdue, 1,600 used the computer during the term. Less well known is the integration of computer courses into secondary education. The Royal McBee Corporation teaches a special course on the computer to youngsters at Staples High in Westport, Connecticut. At the end of the first four-week session it was found that the students, fifteen to seventeen years old, had learned faster than adults. At New York’s St. Vincent Ferrer Catholic High School, 400 girls participated in a similar project conducted by Royal McBee. Other high schools are following suit, and computers are expected to appear in significant numbers in high schools before the end of 1962. Textbooks on computers, written for high-school students, are available. As an example of the ability of young people in this field, David Malin of Walter Johnson High School in Rockville, Maryland, read his own paper, on the use of computers to simulate human thought processes, before science experts attending the 1961 Eastern Joint Computer Conference held in Washington, D.C. The use of the computer in the classroom encompasses not only colleges and high schools, but extends even to prisons. Twenty inmates of a Pennsylvania state institution attended a pilot program teaching computer techniques with a UNIVAC machine.

[Illustration: _Datamation_

Seventeen-year-old David Malin who presented a paper on computers at the Eastern Joint Computer Conference in 1961. ]

The United States is not alone in placing importance on the computer in schools. Our Department of Commerce has published details of Russian work in this direction, noting that it began in 1955 and places high priority on the training of specialists in computer research, machine translation, automation, and so on. The Department of Commerce feels that these courses, taught at the graduate, undergraduate, and even high-school level, are of high quality.

_Teaching Machines_

Thus far we have talked of the computer only as a tool to be studied and not as an aid to learning in itself. In just a few years, however, the “teaching machine” has become familiar in the press and controversial from a number of standpoints, including those of being a “dehumanizer” of the process of teaching and a threat to the apple business! Actually, the computer has functioned for some time outside the classroom as a teaching machine.
Early applications of analog computers as flight simulators were true “teaching machines,” although perhaps the act was not as obvious as classroom use of a computer to teach the three R’s. Even today, there are those who insist that such use of the computer by the military or industry offers more potential than an academic teaching machine. Assembly workers have been taught by programmed audiovisual machines such as Hughes Aircraft’s Videosonic trainer, and the government has taught many technicians by computer techniques. A shrewd observer, however, noting that the computer is called stupid, bluntly points out that any untaught student is in the same category, and that perhaps it takes one to teach one. A strong motivation for looking to the machine as a public teaching tool is the desperation occasioned by the growing shortage of teachers. If the teaching machine could take over even some of the simpler chores of the classroom, early advocates said, it would be worth the effort. Formal study of machine methods of teaching has a history of forty years or more. In the 20’s, Sidney Pressey designed and built automatic teaching—or more precisely, testing—machines at Ohio State University. These were simply multiple-choice questions so mechanized as to be answered by the push of a button rather than with a pencil mark. A right answer advanced the machine to the next question, while an error required the student to try again. Pressey wisely realized the value in his machines; the student could proceed at his own pace, and his learning was also stimulated by immediate recognition of achievement. To further reinforce this learning, some of the teaching machines dispensed candy for a correct answer. Using this criterion, it would seem that brighter students could be recognized by their weight. Unfortunately, Pressey’s teaching machines did not make a very big splash in the academic world, because of a combination of factors. The machines themselves had limitations in that they did not present material to be learned but were more of the nature of _a posteriori_ testing devices. Too, educators were loath to adopt the mechanized teachers for a variety of reasons, including skepticism, inertia, and economics. However, machine scoring of multiple-choice tests marked with special current-conducting pencils became commonplace. Another researcher, B. F. Skinner, commenced work on a different kind of teaching machine thirty years ago at Harvard. Basically his method consists of giving the subject small bits—not computer “bits,” but the coincidence is interesting—of learning at a time, and reinforcing these bits strongly and immediately. Skinner insists that actual “recall” of information is more important than multiple-choice “recognition,” and he asks for an answer rather than a choice. Called “operant reinforcement,” the technique has been used not only on man, but on apes, monkeys, rats, dogs, and, surprisingly, pigeons. During World War II, Dr. Skinner conducted “Project Pigeon” for the military. In this unusual training program, the feathered students were taught to peck at certain targets in return for which they received food as a reward. This combination of apt pupils and advanced teaching methods produced pigeons who could play ping-pong. This was in the early days of missile guidance, and the pigeons next went into training as a homing system for these new weapons! To make guidance more reliable, not one but three pigeons were to be carried in the nose of the device.
Lenses in the missile projected an image before each pigeon, who dutifully pecked at his “target.” If the target was in the center of the cross hairs, the missile would continue on its course; if off to one side, the pecking would actuate corrective maneuvers. As Project “Orcon,” for Organic Control, this work was carried on for some time after the end of the war. Fortunately for the birds, however, more sophisticated, inorganic guidance systems were developed. The implications of the pigeon studies in time led to a new teaching method for human beings. Shortly after Skinner released a paper on his work in operant reinforcement with the pigeons, many workers in the teaching field began to move in this direction. For several years Skinner and James Holland have been using machines of this type to teach some sections of a course in human behavior to students at Radcliffe and Harvard. Rheem Califone manufactures the DIDAK machine to Skinner’s specifications. To the reasons advanced by those who see teacher shortages looming, Skinner adds the argument that a machine can often teach better. Too much time, he feels, has been spent on details that are not basic to the problem. Better salaries for teachers, more teachers, and more schools do not in themselves improve the actual teaching. Operant reinforcement, Skinner contends, _does_ get at the root of the problem and, in addition to relieving the teacher of a heavy burden, the teaching machine achieves better results in some phases of teaching. It also solves another problem that plagues the educator today. It is well known that not all of us can learn at the same rate. Since it is economically and culturally impossible except in rare cases to teach children in groups of equal ability, a compromise speed must be established. This is fine for the “average” child, of whom there may actually be none in the classroom; it penalizes the fast student, and the slow student perhaps even more. The teaching machine, its proponents feel, takes care of this difficulty and lets each proceed at his own rate. Since speed in itself is no sure indicator of intelligence, the slow child, left to learn as he can, may reach heights not before dreamed possible for him. Many educators agree that automated teaching is past due. James D. Finn, Professor of Education at the University of Southern California, deplores the lack of modern technology in teaching. “Technology during the period from 1900 to 1950 only washed lightly on the shores of instruction,” he says. “The cake of custom proved to be too tough and the mass production state, at least 100 years behind industry, was not entered except here and there on little isolated islands.” [Illustration: _Educational Science Division,_ _U.S. Industries, Inc._ AutoTutor teaching machine has programs for teaching many subjects. ] These little isolated islands are now getting bigger and closer together. The Air Force has for some time trained technicians at Keesler Field with U.S. Industries AutoTutor machines, and also uses them at the Wright Air Development Center. The Post Office Department has purchased fifty-five U.S. Industries’ Digiflex trainers. Following this lead, public education is beginning to use teaching machines. San Francisco has an electronic computer version that not only teaches, tests, and coaches, but even sounds an alarm if the student tries to “goof off” on any of the problems. 
The designers of the machine selected a sure-fire intellectual acronym, PLATO, for Programmed Logic for Automatic Teaching Operations. The System Development Corporation, the operations firm that designed the SAGE computer, calls its computer-controlled classroom teacher simply CLASS. This machine uses a Bendix G-15 computer to teach twenty youngsters at a time. To show the awareness of the publishers of texts and other educational material, firms like Book of Knowledge, Encyclopedia Britannica Films, and TMI Grolier are in the “teaching machine” business, and the McGraw-Hill Book Company and Thompson Ramo Wooldridge, Inc., have teamed to produce computerized teaching machines and the programs for them. Other publishers using “programming” techniques in their books include Harcourt-Brace with its 2600 series (for the 2,600 programmed steps the student must negotiate), Prentice-Hall, and D. C. Heath. Entirely new firms like Learning, Incorporated, are now producing “programs” on many subjects for teaching machines. Subjects available in teaching machine form include algebra, mathematics, trigonometry, slide rule fundamentals, electronics, calculus, analytical geometry, plane geometry, probability theory, electricity, Russian, German, Spanish, Hebrew, spelling, music fundamentals, management science, and even Goren’s bridge for beginners. While many of these teaching machines are simply textbooks programmed for faster learning, the question of converting such material into computer-handled presentation is merely one of economics. For example, a Doubleday TutorText book costs only a few dollars; an automatic AutoTutor Mark II costs $1,250 because of its complex searching facility that requires several thousand branching responses. However, the AutoTutor is faster and more effective and will operate twenty-four hours a day if necessary. With sufficient demand the machine may be the cheaper in the long run. The System Development Corporation feels that its general concept of automated group education will be feasible in the near future despite the high cost of advanced electronic digital computers. It cites pilot studies being conducted by the State of California on data-processing for a number of schools through a central facility. Using this same approach, a single central computer could serve several schools with auxiliary lower-priced equipment. Even a moderately large computer used in this way could teach a thousand or more students simultaneously and _individually_, the Corporation feels. After school hours, the computer can handle administrative tasks.

[Illustration: _System Development Corp._

The CLASS facility incorporates an administrative area, hallway, combined observation and counseling area, and a large classroom area divided by a folding wall. ]

In the CLASS system developed by the System Development Corporation, the “branching” concept is used. In a typical lesson program, if the student immediately answers that America was discovered by Christopher Columbus, he will be told he is correct and will then be branched to the next item. If he answers Leif Ericson, the computer takes time out to enlighten the pupil on that score. Next, it reinforces the correct date in the student’s mind before asking another question. Although it would seem that a lucky student could progress through the programmed lesson on guesswork alone, the inexorable laws of probability rule this out. He cannot complete the lesson until he has soaked up all the information it is intended to impart.
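The branching scheme itself comes down to a few lines of program logic. The Python sketch below is a miniature invented for illustration, modeled on the Columbus item just described: a correct answer earns reinforcement and a branch forward, a recognized wrong answer branches to remedial material before a retry, and every response is logged, just as CLASS keeps a record of each student’s progress.

    # One item of a branching lesson, modeled on the Columbus example.
    # The texts are invented; get_response stands for the student's keyboard.
    LESSON = [
        {'question': 'Who discovered America?',
         'answer': 'Columbus',
         'remedial': {'Ericson': 'Leif Ericson reached Vinland about A.D. 1000, '
                                 'but the voyage asked about here came in 1492.'},
         'reinforce': 'Correct. Remember the date: 1492.'},
    ]

    def run_lesson(lesson, get_response):
        record = []                            # complete log of responses
        for item in lesson:
            while True:
                response = get_response(item['question'])
                record.append((item['question'], response))
                if response == item['answer']:
                    print(item['reinforce'])
                    break                      # branch forward to the next item
                # A wrong answer branches to its remedial material, then retries.
                print(item['remedial'].get(response, 'No. Try again.'))
        return record

    # record = run_lesson(LESSON, get_response=input)   # interactive use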
The student can soak it all up without an error, in a very short time, or he can learn by the trial-and-error process, whichever is better suited to his speed and mental ability. Making up the program for the teaching machine is a difficult task and requires the services of technical expert, psychologist, and programmer. An English-like language is used in preparing a CLASS program for the computer. Put on magnetic tape, the program goes into the memory of the computer and is called out by proper responses from the student as he progresses through the lesson.

[Illustration: _System Development Corp._

Students in CLASS are learning French in a group mode of automated instruction. ]

Complex as the programming is, entries from the student’s control are processed into the computer in about one-tenth of a second, and an answer is flashed back in about the same amount of time. Remember that the CLASS computer is handling twenty students at a time, and that in addition to teaching it is keeping a complete record of how each student fared at each step of the lesson. It is obvious that the binary or yes-no logic of the computer ties in with the concept put forth by Skinner and others of presenting small bits of information at a time. We can use the game of 20 Questions as a good analogy. Even getting only simple yes-no answers, skilled players can elicit an amazing amount of information in often far less than the permitted number of questions. Since each answer can cut the field of possibilities in half, twenty questions can in principle single out one item among more than a million. Thus even complex subjects can be broken down into simple questions answerable by discrete choices from the student. The automated group education system of the System Development Corporation is made up of the following components: a digital computer to control and select the material presented and to analyze responses, a magnetic tape storage unit, a typewriter for printing out data analysis, a slide projector and screen for presenting educational materials, and individual desks with keyboards for the students’ responses. We have pointed out that even though it is possible to break down educational material into multiple-choice or yes-no answers to which are assigned intrinsic values, the ideal system permits answers on a linear scale. In other words, instead of picking the answer he considers most nearly correct, the student writes his own. Some experts feel that the advances being made in optical scanning, or “reading” techniques for computers, will result in linear programming of the teaching machines within the next ten years. Such a development will do much to alleviate the complaint that the machine exerts a rigid mechanizing effect on the teaching process. While fear of displacement motivates some teachers to distrust the machine, an honest belief that the human touch is necessary in the schoolroom is also a large factor against acceptance. Yet these same wary teachers generally use flash cards, flip charts, and other mechanical aids with no qualms. The electronic computer is a logical extension of audiovisual techniques, and in time the teacher will come to accept it for what it is. The human teacher will continue to be an indispensable element in education, but he must recognize that as our technology becomes more complex he will need more and more help. In 1960 there were about 44 million students in our classrooms, and about 135,000 too few teachers. By 1965 it is estimated there will be 48 million students and a shortage of 250,000 teachers. Parallel with this development is the rapidly growing need for college graduates.
One large industrial firm which employs 150,000 people hires only 300 college graduates a year at present, but will need 7,000 when it automates its plants. The pressure of need thus is forcing our educational system to make use of the most efficient means of educating our students. Beyond simply taking its place with other aids, however, the computer will make great changes in our basic concepts of teaching, according to Dr. Skinner. He asks the question “Are the students who learn in spite of a confusing presentation of a subject better for the experience, or were they better students at the outset?” He advances this argument to say that perhaps “easy” learning is actually the best; that we would do well to analyze the behavior called thinking and then produce it according to these specifications. The traditional teacher finds the prospect alarming and questions the soundness of minimizing failure and maximizing success. Other psychologists do not yet definitely agree with Skinner’s contention that recall rather than recognition is the desired method. Neither is it certain that exposure to a number of incorrect choices will not result in the student remembering wrong answers. And of course the division between rote learning and creativity is an important consideration. The answers may well lie in the computer, which when properly programmed is about the most logical device we have available to us. Thus the machine may determine the best teaching methods and then use them to teach us. Regardless of these as yet unanswered questions, however, the future of the teaching machine seems to be assured. One authority has predicted that it will be a $100 million market by 1965. An intriguing use of computer techniques in teaching is being investigated by Corrigan Communications, which scores students answering questions on telecourses. This work is being done with a course in medicine, and with the rapid growth of educational television the implications of combining it with teaching machine techniques are of great importance. Classroom teaching is not the only educational application for the teaching machine. A computer-controlled library is an interesting thought, with the patron requesting information from a central computer and having it presented instantaneously on a viewing screen in front of him. Such a system could conceivably have access to a national library hookup, constantly updated with new material. Such a service would also be available for use during school study hall, or by the teacher during class. Visitors to the World’s Fair in Seattle previewed the computerized information center of the future. Called Library 21, it is considered a prototype of the next century’s core libraries, which will be linked to smaller branches by communications networks. Many computers were displayed, tied in with teaching machines, language laboratories, and information from the Great Books, tailored to the individual questioner’s sex, personality, and mental level. Also shown was a photo process that reduces a 400-page book to the size of a postage stamp for storage. With this kind of progress, we can in the foreseeable future request and receive up-to-date information on any branch of human knowledge anywhere—in language we can understand. Another computer application sure to come is that of handling correspondence courses.
The teaching of extension courses in the home, through television and some sort of response link, has been mentioned, and it is not impossible that the school as a physical plant may one day no longer be necessary.

[Illustration: _International Business Machines Corp._

This system supplies legal information in minutes, with insertion of punched-card query (top). Using inquiry words, computer prints citations of statutes (middle); then, on request, full text (below). ]

Since the computer itself does not “teach,” but merely acts as a go-between for the man who prepared the lesson or program and the student who learns, it would seem that some of our teachers may become programmers. The System Development Corporation has broken the teaching machine program into three phases: experimenting with the effects of many variables on teaching machine effectiveness, developing a simplified teaching machine, and finally, analyzing the educational system to find where and how the machine fits. Research is still in the first phase, that of experiment. But it is known that some programs produced so far show better results than conventional teaching methods, and also that teaching machines can teach any subject involving factual information. Thus it is evident they will be useful in schools and also in industry and military training programs.

_Language_

If man is to use the computer to teach himself, he must be able to converse with it. In the early days of computers it was said with a good deal of justification that the machine was not only stupid but decidedly insular as well. In other words, man spoke to it in its own language or not at all. A host of different languages, or “compilers” as they are often called, were constructed, and their originators beat the drums for them. With tongues like ALGY, ALGOL, COBOL, FACT, FLOWMATIC, FORTRAN, INTERCOM, IT, JOVIAL, LOGLAN, MAD, PICE, and PROLAN, to name a few, the computer has become a tower of Babel, and a programmer’s talents must include linguistics. One language called ALGOL, for Algorithmic Language, had pretty smooth sailing, since it consists of algebraic and arithmetic notation. Out of the welter of business languages a compromise Common Business Oriented Language, or COBOL, evolved. What COBOL does for programming computer problems is best shown by comparing it with the strings of numerical codes once given directly to the machine. The sample below shows COBOL’s English-like statements:

[Illustration:

SUBTRACT QUANTITY-SOLD FROM BALANCE-ON-HAND.
IF BALANCE-ON-HAND IS NOT LESS THAN REORDER-LEVEL
    THEN GO TO BALANCE-OK
    ELSE COMPUTE QUANTITY-TO-BUY = TOTAL-SALES-3-MOS/3. ]

Recommended by a task force for the Department of Defense, industry, and other branches of the government, COBOL nevertheless has had a tough fight for acceptance, and there is still argument and confusion on the language scene. New tongues continue to proliferate, some given birth by ALGOL and COBOL themselves. Examples of this generation are GECOM, BALGOL, and TABSOL. One worthy attempt at a sort of machine Esperanto is called a pun-inviting UNCOL, for Universal Computer-Oriented Language, and seems to be a try for the computer’s vote.
One harried machine-language user has suggested formation of an “ALGOLICS Anonymous” group for others of his ilk, while another partisan accuses his colleagues in Arizona of creating a new language while “maddened by the scent of saguaro blossoms.” It was recently stated that perhaps by the time a decision is ultimately reached as to which will be the general language, there will be no need of it because by then the computer will have learned to read and write, and perhaps to listen and to speak as well. Recent developments bear out the contention. Although it has used intermediate techniques, the computer has proved it can do a lot with our language in some of the tasks it has been given. Among these is the preparation of a Bible concordance, listing principal words, frequency of appearance, and where they are found. The computer tackled the same job on the poems of Matthew Arnold. For this chore, Professor Stephen Maxfield Parrish of Cornell worked with three colleagues and two technicians to program an IBM 704 data-processing system. In addition to compiling the list of more than 10,000 words used most often by Arnold, the computer arranged them alphabetically and also compiled an appendix listing the number of times each word appeared. To complete the job, the computer itself printed the 965-page volume. The Dead Sea Scrolls and the works of St. Thomas Aquinas have also been turned over to the computer for preparation of analytical indexes and concordances. At Columbia University, graduate student James McDonough gave an IBM 650 the job of sleuthing the author of _The Iliad_ and _The Odyssey_. Since the computer can detect metric-pattern differences otherwise practically undiscoverable, McDonough felt that the machine could prove if Homer had written both poems, or if he had help on either. Thus far he is sure the entire _Iliad_ is the work of one man, after computer analysis of its 112,000 words. The project is part of his doctoral thesis. A recent article in a technical journal used a title suggested by an RCA 501, and suspicion is strong that the machines themselves are guilty of burning midnight kilowatts to produce the acronyms that abound in the industry. The computer is even beginning to prove its worth as an abstracter. Other literary jobs the computer has done include the production of a book of fares for the International Air Transport Association. The computer compiled and then printed out this 420-page book which gives shortest operating distances between 1,600 cities of the world. Now newspapers are beginning to use computers to do the work of typesetting. These excursions into the written language of human beings, plus its experience as a poet and in translation from language to language, have undoubtedly brought the computer a long way from its former provincialism. As pointed out, computer work with human language generally is not accomplished without intermediate steps. For example, in one of the concordances mentioned, although the computer required only an hour to breeze through the work, a programmer had spent weeks putting it in the proper shape. What is needed is a converter which will do the work directly, and this is exactly what firms like Digitronics supply to the industry. 
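What such a converter does can be shown in miniature: each printed character is mapped directly onto the bit pattern the computer stores. In the Python sketch below the six-bit code table is invented for the illustration; a real converter would use whatever internal character code its machine required.

    # Each typed character is converted directly to a stored bit pattern.
    # The six-bit code assignments below are invented for the illustration.
    SIX_BIT_CODE = {' ': 0b000000, '1': 0b000001, '2': 0b000010,
                    'A': 0b010001, 'B': 0b010010, 'C': 0b010011}

    def convert(text):
        """Encode a line of text as six-bit binary character codes."""
        return [format(SIX_BIT_CODE[ch], '06b') for ch in text.upper()]

    print(convert('CAB 12'))
    # ['010011', '010001', '010010', '000000', '000001', '000010']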
Digitronics, a computer-age Berlitz school, has produced converters for Merrill Lynch, Pierce, Fenner & Smith for use in billing its stock-market customers, for Wear-Ever as an order-taking machine, for _Reader’s Digest_ for mailing-list work, and for Schering Corporation for rat-reaction studies in drug research, to mention a few. The importance of such converters is obvious. Prior to their use it was necessary to key English manually into the correct code, a costly and time-consuming business. Converters are not cheap, of course, but they operate so rapidly that they pay for themselves in short order. Merrill Lynch’s machine cost $120,000, but paid back two-thirds of that amount in savings the first year. There is another important implication in converter operation. It can get computer language out of English—or Japanese, or even Swahili if the need arises. A more recent Digitronics converter handles information in English or Japanese. If the computer has its language problems, man has them also, to the _n_th degree. There are about 3,000 tongues in use today; mercifully, scientific reports are published in only about 35 of these. Even so, at least half the treatises published in the world cannot be read by half the world’s scientists. Unfortunately, UNESCO estimates that while 50 per cent of Russian scientists read English, less than 1 per cent of United States scientists return the compliment! The ramifications of these facts we will take up a little later on; for now it will be sufficient to consider the language barrier not only to science but also to culture and the international exchange of good will that can lead to and preserve peace. Esperanto, Ido, and other tongues have been tried as common languages. One recent arrival on the scientific scene, called Interlingua, seems to have considerable merit. It is used in international medical congresses, with text totaling 300,000 words in the proceedings of one of these. But a truly universal language is, like prosperity, always just around the corner. Even the scientific community, recognizing the many benefits that would accrue, can no more adopt Interlingua or another such language than it can settle on the metric system of measurement. Our integration problems are not those of race, color, and creed only. Before Sputnik our interest in foreign technical literature was not as keen as it has been since. One immediate result of the satellite launching by the Russians was the amendment of U.S. Public Law 480 to permit money from the sale of American farm products abroad to be used for translation of foreign technical literature. We are vitally concerned with Russia, but have also arranged for thousands of pages of scientific literature from Poland, Yugoslavia, and Israel. Communist China is beginning to produce scientific reports too, and Japanese capability in such fields as electronics is evident in the fact that the revolutionary “tunnel diode” was invented by Esaki in Japan. It is understandable that we should be concerned with the output of Russian literature, and much attention has been given to the Russian-English translator developed by IBM for the Air Force. It is estimated that the Russians publish a billion words a year, and that about one-third of this output is technical in nature. Conventional translating techniques, in addition to being tedious for the translators, are hopelessly slow, rendering only about 80 million words a year. Thus we are falling behind twelve years each year: at 80 million words a year, the untranslated balance of a single year’s billion-word output would itself take more than eleven years to work off!
Outside of a moratorium on writing, the only solution is faster translation. The Air Force translator was a phenomenal achievement. Based on a photoscopic memory—a glass disc 10 inches in diameter capable of storing a 55,000-word Russian-English dictionary in binary code—the system used a “one-to-one” method of translation. The initial result was translation of Russian, at a rate of about 40 words per minute, into an often terribly scrambled and confusing English. The speed was limited not by the memory or the computer itself but by the input, which had to be prepared on tape by a typist. Subsequently a scanning system capable of 2,400 words a minute upped the speed considerably. Impressive as the translator was, its impact was dulled after a short time when it was found that a second “translation” was required of the resulting pidgin English, particularly when the content was highly technical. As a result, work is being done on more sophisticated translation techniques. Making use of predictive analysis, and of “lexical buffers” which store all the words in a sentence for syntactical analysis before final printout, scientists have improved the translation a great deal. In effect, the computer studies the structure of the sentence, determining whether modifiers belong with subject or object, and checking for the most probable grammatical form of each word as indicated by other words in the sentence. The advanced nature of this method of translation requires the help of linguistics experts. Among these is Dr. Sydney Lamb of the University of California at Berkeley, who is developing a computer program for analysis of the structure of any language. One early result of this study was the realization that not enough is actually known of language structure and that we must backtrack and build a foundation before proceeding with computer translation techniques. Dr. Lamb’s procedure is to feed English text into the computer and let it search for situations in which a certain word tends to be preceded or followed by other words or groups of words. The machine then tries to produce the grammatical structure, not necessarily correctly. The researcher must help the machine by giving it millions of words to analyze contextually. What the computer is doing in hours is reproducing the evolution of language and grammar that not only took place over thousands of years, but is subject to emotion, faulty logic, and other inaccuracies as well. Also working on the translation problem are the National Bureau of Standards, the Army’s Office of Research and Development, and others. The Army expects to have by 1962 a computer analysis that will handle 95 per cent of the sentences likely to be encountered in translating Russian into English, and to examine foreign technical literature at least as far as the abstract stage. Difficult as the task seems, workers in the field are optimistic and feel that it will be feasible to translate all languages, even the Oriental tongues, which seem to present the greatest syntactical barriers. An indication of success is the announcement by Machine Translations Inc. of a new technique making possible contextual translation at the rate of 60,000 words an hour, a rate challenging the ability of even someone coached in speed-reading! The remaining problem, that of doing the actual reading and evaluation after translation, has been brought up. This considerable task too may be solved by the computer.
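The distance between the “one-to-one” method and the newer syntactical analysis can be suggested with a toy program. In the Python sketch below, the two-word dictionary and the single grammar rule are inventions for illustration only: word-for-word substitution reproduces the scrambled pidgin, while buffering the whole sentence lets even one crude rule supply English word order.

    # Word-for-word translation versus a buffered, syntax-aware pass.
    # The two-entry dictionary and the single grammar rule are invented.
    DICTIONARY = {'kniga': ('book', 'noun'), 'krasnaya': ('red', 'adjective')}

    def one_to_one(sentence):
        """Substitute word for word, keeping the source word order."""
        return ' '.join(DICTIONARY[word][0] for word in sentence.split())

    def buffered(sentence):
        """Hold the whole sentence, then apply one crude syntactic rule."""
        words = [DICTIONARY[word] for word in sentence.split()]
        # Russian can say "book red" where English wants "the book is red".
        if (len(words) == 2 and words[0][1] == 'noun'
                and words[1][1] == 'adjective'):
            return 'the ' + words[0][0] + ' is ' + words[1][0]
        return ' '.join(word[0] for word in words)

    print(one_to_one('kniga krasnaya'))   # book red  (scrambled pidgin)
    print(buffered('kniga krasnaya'))     # the book is red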
As for that reading and evaluation, the machines have already displayed a limited ability to perform the task of abstracting, thus eliminating at the outset much material not relevant to the task at hand. Another bonus the computer may give us is the ideal international and technical language for composing reports and papers in the first place. A logical question that comes up in the discussion of printed language translation is that of another kind of translation, from verbal input to print, or vice versa. And finally from verbal Russian to verbal English. The speed limitation here, of course, is human ability to accept a verbal input or to deliver an output. Within this framework, however, the computer is ready to demonstrate its great capability. A recent article in _Scientific American_ asks in its first sentence if a computer can think. The answer to this old chestnut, the authors say, is certainly yes. They then proceed to show that, having passed this test, the computer must now learn to perceive if it is to be considered a truly intelligent machine. A computer that can read for itself, rather than requiring human help, would seem to be perceptive and thus qualify as intelligent. Even early computers such as adding machines printed out their answers. All the designers have to do is reverse this process so that printed human language is also the machine’s input. One of the first successful implementations of a printed input was the use of magnetic ink characters in the Magnetic Ink Character Recognition (MICR) system developed by General Electric. This technique called for the printing of information on checks with special magnetic inks. Processed through high-speed “readers,” the ink characters cause electrical currents the computer can interpret and translate into binary digits. Close on the heels of the magnetic ink readers came those that use the principle of optical scanning, analogous to the method man uses in reading. This breakthrough came in 1961, and was effected by several different firms, such as Farrington Electronics, National Cash Register, Philco, and others, including firms in Canada and England. We read a page of printed or written material with such ease that we do not realize the complex way our brains perform this miracle, and the optical scanner that “reads” for the computer requires a fantastically advanced technology. As the material to be read comes into the field of the scanner, it is illuminated so that its image is distinct enough for the optical system to pick up and project onto a disc spinning at 10,000 revolutions per minute. In the disc are tiny slits which pass a certain amount of the reflected light onto a fixed plate containing more slits. Light which succeeds in getting through this second series of slits activates a photoelectric cell which converts the light into proportionate electrical impulses. Because the scanned material is moving linearly and the rotating disc is moving transversely to this motion, the character is scanned in two directions for recognition. Operating with great precision and speed, the scanner reads at the rate of 240 characters a second. National Cash Register claims a potential reading rate for its scanner of 11,000 characters per second, a value not reached in practice only because of the difficulty of mechanically handling documents at this speed. Used in post-office mail sorting, billing, and other similar reading operations, optical scanners generally show a perfect score for accuracy.
Badly printed characters are rejected, to be deciphered by a human supervisor. It is the optical scanner that increased the speed of the Russian-English translating computer from 40 to 2,400 words per minute. In post-office work, the Farrington scanner sorts mail at better than 9,000 pieces an hour, rejecting all handwritten addresses. Since most mail—85 per cent, the Post Office Department estimates—is typed or printed, the electronic sorter relieves human sorters of most of their task. Mail is automatically routed to proper bins or chutes as fast as it is read. The electronic readers have not been without their problems. A drug firm in England had so much difficulty with one that it returned it to the manufacturer. We have mentioned the one that was confused by Christmas seals it took for foreign postage stamps. And as yet it is difficult for most machines to read anything but printed material. An attempt to develop a machine with a more general reading ability, one which recognizes not only material in which exact criteria are met, but even rough approximations, uses the _gestalt_ or all-at-once pattern principle. Using a dilating circular scanning method, the “line drawing pattern recognizer” may make it possible to read characters of varying sizes, handwritten material, and material not necessarily oriented in a certain direction. A developmental model recognizes geometric figures regardless of size or rotation and can count the number of objects in its scope. Such experimental work incidentally yields much information on just how the eye and brain perform the deceptively simple tasks of recognition. The year 1970 had once been thought a target date for machine recognition of handwritten material, but researchers at Bell Telephone Laboratories have already announced a device that reads cursive human writing with an accuracy of 90 per cent. The computer, a backward child, learned to write long before it could read, and does so at rates incomprehensible to those of us who type at the blinding speed of 50 to 60 words a minute. A character-generator called VIDIAC comes close to keeping up with the brain of a high-speed digital computer and has a potential speed of 250,000 characters, or about 50,000 words, per _second_. It does this, incidentally, by means of the good old binary, 1-0 technique. To add to its virtuosity, it has a repertoire of some 300 characters. Researchers elsewhere are working on the problems to be met in a machine for reading and printing out 1,000,000 characters per second! None of us can talk or listen at much over 250 words a minute, even though we may convince ourselves we read several thousand words in that period of time. A simple test of ability to hear is to play a record or tape at double speed or faster. Our brains just won’t take it. For high-speed applications, then, verbalized input or output for computers is interesting in theory only. However, there are occasions when it would be nice to talk to the computer and have it talk back. In the early, difficult days of computer development, say when Babbage was working on his analytical engine, the designer probably often spoke to his machine. He would have been stunned to hear a response, of course, but today such a thing is becoming commonplace. IBM has a computer called “Shoebox,” a term both descriptive of size and refreshing in that it is not formed of initial capitals from an ad writer’s blurb. You can speak figures to Shoebox, tell it what you want done with them, and it gets busy.
This is admittedly a baby computer, and it has a vocabulary of just 16 words. But it takes only 31 transistors to achieve that vocabulary, and jumping the number of transistors to a mere 2,000 would increase its word count to 1,000, which is the number required for Basic English. The Russians are working in the field of speech recognition too, as are the Japanese. The latter are developing an ambitious machine which will not only accept voice instructions, but also answer in kind. To make a true speech synthesizer, the Japanese think they will need a computer about 5,000 times as fast as any present-day type, so for a while it would seem that we will struggle along with “canned” words appropriately selected from tape memory. We have mentioned the use of such a tape voice in the computerized ground-controlled-approach landing system for aircraft, and the airline reservation system called Unicall, in which a central computer answers a dialed request for space in less than three seconds—not with flashing lights or a printed message but in a loud clear voice. It must pain the computer to answer at the snail-like human speed of 150 words a minute, so it salves its conscience by handling 2,100 inputs without getting flustered. The writer’s dream, a typewriter that has a microphone instead of keys and clacks away merrily while you talk into it, is a dream no longer. Scientists at Japan’s Kyoto University have developed a computer that does just this. An early experimental model could handle a hundred Japanese monosyllables, but once the breakthrough was made, the Japanese quickly pushed the design to the point where the “Sonotype” can handle any language. At the same time, Bell Telephone Laboratories works on the problem from the other end and has come up with a system for a typewriter that talks. Not far behind these exotic uses of digital computer techniques are such things as automatic translation of telephone or other conversations.

_Information Retrieval_

It has been estimated that some 445 trillion words are spoken in each 16-hour day by the world’s inhabitants, making ours a noisy planet indeed. To bear out the “noisy” connotation, someone else has reckoned that only about 1 per cent of the sounds we make are real information. The rest are extraneous, incidentally telling us the sex of the speaker, whether or not he has a cold, the state of his upper plate, and so on. It is perhaps a blessing that most of these trillions of words vanish almost as soon as they are spoken. The printed word, however, isn’t so transient; it not only hangs around but piles up as well. The pile is ever deeper, technical writings alone being enough to fill seven 24-volume encyclopedias each day, according to one source. As with our speech, perhaps only 1 per cent of this outpouring of print is of real importance, but this does not necessarily make what some have called the Information Explosion any less difficult to cope with. The letters IR once stood for infra-red, but in the last year or so they have been appropriated by the words “information retrieval,” one of the biggest bugaboos on the scientific horizon. It amounts to saving ourselves from drowning in the fallout from typewriters all over the earth. There are those cool heads who decry the pushing of the panic button, professing to see no runaway increase in literature, but only a steady 8 per cent or so each year. The button-pushers see it differently, and they can document a pretty strong case.
The technical community is suffering an embarrassment of riches in the publications field. While the latest doubling in the output of technical literature took about twelve years, the next such increase is expected in half that time. Perhaps the strongest indication that IR is a big problem is the obvious fact that nobody really knows just how much has been, is being, or will be written. For instance, one authority claims technical material is being amassed at the rate of 2,000 pages a minute, which would result in far more than the seven sets of encyclopedias mentioned earlier. No one seems to know for sure how many technical journals there are in the world; it can be “pinpointed” somewhere between 50,000 and 100,000. Selecting one set of figures at random, we learn that in 1960 alone 1,300,000 different technical articles were published in 60,000 journals. Of course there were also 60,000 books on technical subjects, plus many thousands of technical reports that did not make the formal journals, but still might contain the vital bit of information without which a breakthrough will be put off, or a war lost. Our research expenses in the United States ran about $13 billion in 1960, and the guess is they will more than double by 1970. An important part of research should be done in the library, of course, lest our scientist spend his life re-inventing the wheel, as the saying goes. Specific examples back up the saying. For instance, a scientific project costing $250,000 was completed a few days before an engineer came across practically the identical work in a report in the library. This was a Russian report, incidentally, titled “The Application of Boolean Matrix Algebra to the Analysis and Synthesis of Relay Contact Networks.” In another, happier case, information retrieval saved Esso Research & Engineering Co. a month of work and many thousands of dollars when an alert—or lucky—literature searcher came across a Swedish scientist’s monograph detailing Esso’s proposed exploration. Another literature search obviated tests of more than a hundred chemical compounds. Unfortunately not all researchers do or can search the literature in all cases. There is even a tongue-in-cheek law which governs this phenomenon: Mooers’ Law states, “An information system will tend not to be used whenever it is more painful for a customer to have information than for him not to have it.” As a result, it has been said that if a research project costs less than $100,000 it is cheaper to go ahead with it than to conduct a rigorous search of the literature. Tongue in cheek or not, this state of affairs points up the need for a usable information retrieval system. _Fortune_ magazine reports that 10 per cent of research and development expense could be saved by such a system, and 10 per cent in 1960, remember, would have amounted to $1.3 billion. Thus the prediction that IR will be a $100 million business in 1965 does not seem out of line. The Center for Documentation at Western Reserve University spends about $6-1/2 simply in acquiring and storing a single article in its files. In 1958 it could search only thirty abstracts of these articles in an hour and realized that more speed was vital if the Center was to be of value. As a result, a GE 225 computer IR system was substituted.
Now researchers go through the entire store of literature—about 50,000 documents in 1960—in thirty-five minutes, answering up to fifty questions for “customers.”

[Illustration: _International Business Machines Corp._ The document file of this WALNUT information retrieval system contains the equivalent of 3,000 books. A punched-card inquiry system locates the desired filmstrip for viewing or photographic reproduction. ]

[Illustration: _International Business Machines Corp._ This image converter of the WALNUT system optically reduces and transfers microfilm to filmstrips for storage. Each strip contains 99 document images. As a document image is transferred from microfilm to filmstrip, the image converter simultaneously assigns image file addresses and punches these addresses into punched cards controlling the conversion process. ]

The key to information retrieval lies in efficient abstracting. It has been customary to let people do this task in the past because there was no other way of getting it done. Unfortunately, man does not do a completely objective job of either preparing or using the abstract, and the result is a two-ended guessing game that wastes time and loses facts in the process. A machine abstracting system, devised by H. Peter Luhn of IBM, picks the words that appear most often in an article and uses them as keys to reduce it to a usable, concise abstract. A satisfactory solution seems near and will be a big step toward a completely computerized IR system.

For several years there has been a running battle between the computer IR enthusiast and the die-hard “librarian” type who claims that information retrieval is not amenable to anything but the human touch. It is true that adapting the computer to the task of information retrieval did not prove as simple as was hoped. But detractors are in much the same fix as the man with a shovel trying to build a dike against an angry rising sea, who scoffs at the scoop-shovel operator having trouble starting his engine. The wise thing to do is drop the shovel and help the machine. There will be a marriage of both types of retrieval, but Verner Clapp, president of the Washington, D.C., Council on Library Resources, stated at an IR symposium that computers offer the best chance of keeping up with the flood of information.

One sophisticated approach to IR uses symbolic logic, the forte of the digital computer. In a typical _reductio ad logic_, the following request for information:

    An article in English concerning aircraft or spacecraft, written
    neither before 1937 nor after 1957; should deal with laboratory tests
    leading to conclusions on an adhesive used to bond metal to rubber or
    plastic; the adhesive must not become brittle with age, must not
    absorb plasticizer from the rubber adherent, and must have a
    peel-strength of 20 lbs/in; it must have at least one of these
    properties—no appreciable solution in fuel and no absorption of
    solvent.

becomes the logical statement: KKaVbcPdeCfg, and KAhiKKKNjNklSmn. Armed with this symbolic abbreviation, the computer can dig quickly into its memory file and come up with the sought-for article or articles. (A rough idea of how a machine can check such a string appears just below.)

It has been suggested that the abstracting technique be applied at the opposite end of the cycle with a vengeance amounting to birth control of new articles. A Lockheed Electronics engineer proposes a technical library that not only accepts new material, but also rejects any that is _not_ new.
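To return for a moment to the symbolic request above: in the prefix notation of symbolic logic, K, A, and N conventionally stand for “and,” “or,” and “not,” with the remaining capital letters denoting further connectives, and each lowercase letter standing for one yes-or-no clause of the request. Checking a document against such a string is then a small mechanical job. The routine below, sketched in a modern programming language with made-up letter meanings, illustrates the principle only; it is not the system described in the text:

    # A sketch only: testing a prefix ("Polish") logic string against a
    # document's file entry. The letter assignments here are invented.
    BINARY = {
        "K": lambda p, q: p and q,        # K = conjunction (and)
        "A": lambda p, q: p or q,         # A = alternation (or)
        "C": lambda p, q: (not p) or q,   # C = conditional (implies)
    }

    def matches(query, facts):
        symbols = iter(query)
        def term():
            s = next(symbols)
            if s == "N":                  # N = negation, takes one operand
                return not term()
            if s in BINARY:               # K, A, C each take two operands
                return BINARY[s](term(), term())
            return facts[s]               # lowercase: a stored yes/no fact
        return term()

    # "KaNb" asks: clause a AND NOT clause b.
    print(matches("KaNb", {"a": True, "b": False}))   # True: document fits
    print(matches("KaNb", {"a": True, "b": True}))    # False: rejected

One pass over the string settles the whole request, and the thirty-five-minute search of 50,000 documents mentioned earlier is essentially this test repeated against every entry in the file.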
With such a rejection library, of course, we may be skirting danger of the kind risked by human birth-control exponents—that of unwittingly depriving the world of a president, or of a powerful scientific finding. Perhaps the screening, the function of “garbage disposal,” as one blunt worker puts it, should be left as an after-the-fact measure.

Despite early setbacks, the computer is making progress in the job of information retrieval. Improvements of 300 per cent in efficiency are cited for this new application over the last several years. Operation HAYSTAQ, a Patent Office project in the chemical patent section accounting for one-fifth of all patents, showed a 50 per cent improvement in search speed and 100 per cent in accuracy as a result of using automated methods. Desk-size computer systems with solid-state circuits are being offered for information retrieval. The number of scientific information centers in this country, starting with one in 1830, reached 59 in 1940 and now stands at 144. Significantly, of 2,000 scientists and engineers working at these centers, 381 are computer people. Some representative information retrieval applications making good use of computer techniques are the selection of the seven astronauts for the Mercury Project from thousands of jet pilots, Procter & Gamble’s Technical Information Service, demonstration of an electronic law library to the American Bar Association, and Food Machinery and Chemical Corporation’s Central Research Laboratory. The National Science Foundation, the National Bureau of Standards, and the U.S. Patent Office are among the government agencies in addition to the military services that are interested in electronic information retrieval.

_Summary_

The impact of the computer on education, language and communication, and the handling of information is obviously already strongly felt. These inroads will widen, and progress will be hastened in the years ahead of us. Perhaps most important is the assignment to the machine of functions closer to the roots of all these things. Rather than simply read or translate language, for example, the computer seems destined to improve on it. The same applies to the process of teaching and to the storage and retrieval of data. The electronic computer has shown that it is not a passive piece of equipment, but active and dynamic in nature. It will soon be as much a part of the classroom and library as books; one day it may take the place of books themselves.

[Illustration: Lichty, © _Field Enterprises, Inc._ “How come they spend over a million on our new school, Miss Finch, and then forget to put in computer machines?” ]

------------------------------------------------------------------------

“_’Tis one and the same Nature that rolls on her course, and whoever has sufficiently considered the present state of things might certainly conclude as to both the future and the past._” —Montaigne

11: The Road Ahead

In Book One of “Cosette,” the second volume of _Les Miserables_, Victor Hugo writes, “Would you realize what Revolution is, call it Progress; and would you realize what Progress is, call it Tomorrow.” His definitions apply well to what has been termed by some the computer revolution and by others simply the natural evolution of species. The computer has a past and a present, differentiated mainly by the slope of the line plotting progress against time. Its future, which blurs somewhat with the present, will obviously be characterized by a line approaching the vertical.
The intelligent machine has been postulated for years, first by the scientist, then by the science-fiction writer, and now again by the scientist. Norbert Wiener of cybernetics fame, Ashby and his homeostat, Grey Walter and his mechanical turtles, A. M. Turing, John von Neumann, and others, have recently been joined by men like Ramo, Samuel, Newell, _et al._, who, if not actually beating the drums for machine intelligence, do more than admit to the possibility. For each such pro there are cons, of course, from sincere, intelligent authorities who in effect holler “Get a horse!” at those who say the computer is coming. The Royal Society in England met its stiffest opposition from otherwise intelligent people who deplored naturalism in any form. Perhaps such detractors are a necessary goad, a part of progress. At any rate, science survived the Nicholas Gimcrack jibes of the Popes and Addisons and Swifts. Darwin was more right than Butler, though the latter probably made more money from his work. Today, we find a parallel situation in that there are those who refuse to accept the computer as an intelligent machine, though it is interesting to watch these objectors regroup and draw another line the machine dare not go past. The writers of science and pseudo-science have often been accused of fantasy and blue-sky dreams. A case in point in the electronics field is the so-called “journalistor” or marvelous successor to the transistor. Such riding off in all directions with each new laboratory experiment may be justified in that it prods the scientist who must keep up with the press and his advertising department! This theory apparently works, and now it seems that the most startling and fantastic stories come not from writers, but from the scientists themselves. In 1960 the Western Joint Computer Conference was held in San Francisco, and one session was devoted to the fanciful design and use of a computer with the problem-solving capability of an intelligent man and the speed and capacity of a high-speed data-processor. It was proposed to use “tunnel-effect tetrodes” with a switching time of one ten-billionth of a second as the logic and storage elements. These would be fabricated of thin-film materials by electron beam micromachining, and 100 billion of them could be packed into a cubic inch volume. With these tiny components and new circuit modes a supercomputer could be built, stored with information, and programmed to solve what one of the participants called the most difficult problem the human being faces today—that of bargaining. This computer has not yet been built; it won’t be for some time. But design and fabrication are moving in that direction on a number of fronts. One of these fronts is that of hardware, the components used in building up the computer circuitry. In a decade we moved from vacuum tubes to transistors to thin-film devices. Examples of shrinkage on a gross scale are shown in the use of a single ferrite core to replace some twenty conventional (relatively speaking!) components. Memory circuits once were mechanical relays or tube circuits. Briefly they were transistorized, and then ferrite cores. Magnetic thin-film circuits have now been developed, making random-access storage almost as compact as the sequential tape reel. As circuits grow smaller the major problem is manipulating them, or even seeing them, and a sneeze can be disastrous in today’s electronics plant. One early journalistor was the molecular circuit. 
Many scientists and engineers working in the field scoffed at or derided such a visionary scheme. But the industry has indeed progressed into the integrated-circuit technology—a sort of halfway point—and is now on the fringe of actual functional block techniques in which the individual components are not discernible. Electronic switching and other action at the molecular level is close to reality, and hardheaded scientists now speak calmly of using a homogeneous block of material as a memory, scanning its three dimensions with the speed of light to locate any one or more of billions of bits of data in a few inches of volume. Writing on the head of a pin was a prophetic bit of showmanship, and pinhead-size computers will not necessarily have pinhead mentalities.

This progress toward a seemingly hopeless goal takes on an inexorable quality when the writings of von Neumann are compared with the state of the art today. Starting out much faster but much larger than similar elements of the brain, computer components have been made even faster while simultaneously shrinking dramatically toward the dimensions necessary to produce quantitative equivalence. It happens that these goals work out well together, the one helping the other. Circuitry is now at the point where speed is ultimately dependent on that limiter of all physical activity, the speed of light, or of electrons through a conductor. Only by putting elements closer together can speed be increased; thus one quality is not achieved at the sacrifice of the other, as the short computation below shows.

[Illustration: _International Business Machines Corp._ This experimental “memory plane” consists of 135 cryotron devices built up in a 19-layer “sandwich.” Produced automatically, it is an example of continued shrinking of computer elements. ]

As an example of the progress being made toward speeding up computers, speakers at the recent Winter General Meeting of the American Institute of Electrical Engineers described a coming generation of “gigacycle” computers now on the drawing boards. Present electronic machines operate at speeds in the megacycle range, with 50 million cycles per second representing the most advanced state of the art. Giga means billion; thus the new round of computers will run at a thousand times the basic megacycle rate, some twenty times the speed of the most advanced machines now operating. Among the firms who plan such ultraspeed computers are RCA, IBM, and Sperry Rand Corporation. To achieve such a great increase in speed requires faster electronic switches. Transistors have been improved, and more exotic devices such as tunnel diodes, thin-film cryotrons, magnetic thin-films, parametrons, and traveling-wave tubes are now coming into use. Much of the development work is being supported by the U.S. Bureau of Ships. Operational gigacycle computers are expected within two years!

Not just the brickmaker, but the architect too has been busy in the job of optimizing the computer. The science of bionics and the study of symbolic logic lead to better ways of doing things. The computer itself comes up with improvements for its next generation, making one part do the work of five, and eliminating the need for whole sections of circuitry. Most computers have a fixed “clock”; that is, they operate at a certain cyclic rate. Now appearing on the scene are “asynchronous” computers which don’t stand around waiting when one job is done, as their predecessors did.
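That light-speed limit is easy to put in numbers. A signal can cover at best about 30 billion centimeters a second, so dividing by the clock rate gives the farthest it can travel within one cycle. The short computation below, an illustration in a modern programming language rather than anything from the period, makes the comparison:

    # How far a signal can travel in one clock period, at the speed of light.
    # (Signals in real wiring are somewhat slower still.)
    SPEED_OF_LIGHT_CM_PER_SEC = 3e10

    for rate in (1e6, 50e6, 1e9):   # megacycle, 50-megacycle, gigacycle
        reach_cm = SPEED_OF_LIGHT_CM_PER_SEC / rate
        print(f"{rate:12,.0f} cycles/sec: {reach_cm:10,.0f} cm per cycle")

At a megacycle a signal can roam 300 meters between ticks of the clock; at a gigacycle it has only about a foot. A gigacycle machine whose elements lie more than a foot of wiring apart cannot finish a step within a cycle, which is why shrinking the circuitry and speeding it up are one and the same battle.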
One advanced notion is the “growing” of complex electronic circuitry, in which a completed amplifier, or array of amplifiers, is pulled from the crystal furnace much the way material for transistors is now grown. Pooh-poohed at first as ridiculous, the notion has been tried experimentally. Since a computer is basically a multiplicity of simple units, the idea is not far off at that. It is conceivable that crystal structure can be exploited to produce millions of molecules of the proper material properly aligned for the desired electronic action. With this shrinking come the benefits of small size, low power consumption, low cost, and perhaps lower maintenance. The computer will be cheap enough for applications not now economically feasible. As this happens, what will the computer do for us tomorrow?

A figure of 7 per cent is estimated for the amount of paperwork the computer has taken over in the business world. Computer men are eyeing a market some five times that amount. It does not take a vivid imagination to decide that such a percentage is perhaps conservative in the extreme. Computer sales themselves promise to show a fourfold increase in the five-year period from 1960 to 1965, and in the past predictions have been exceeded many times. As population grows and business expands in physical size and complexity, it is obvious that the computer and its data-processing ability will be called upon more and more. There is another factor, that of the internationalizing of business. Despite temporary setbacks of war, protective tariffs, insular tendencies, and the like, in the long run we will live in one integrated world shrunk by data links that can get information from here to there and back again so fast it will be like conversing with someone across the room. Already planners are talking of worldwide computerized systems.

As a mathematical whiz, the computer will relieve us of our money worries. Coupled with the credit card, perhaps issued to us at birth, a central computer will permit us to make purchases anywhere in the world and to credit our account with wages and other income. If we try to overdraw, it may even flash a warning light as fast as we put the card in the slot! This project interests General Dynamics researchers.

Of more importance than merely doing bookkeeping is the impact the computer will have on the planning and running of businesses. Although surveys find that every person thinks computer application reaches only to the level just below his own in the management structure, pure logic should ultimately win out over man’s emotional frailties at all levels. Operations research, implemented by the computer, will make for more efficient businesses. Decisions will increasingly be made not by vice-presidents but by digital computers. At first we will have to gather the necessary information for these electronic oracles, but in time they will take over this function themselves.

Business is tied closely to education, and we have had a hint of the place the computer will make for itself in education. What so little need for laborious learning will do to our motivation to learn will be interesting to watch. But then, is modern man a weaker being because he kills a tiger with a high-powered rifle instead of club or bare hands—or has no need to kill the tiger in the first place? After having proved itself as a patent searcher, the computer is sure to excel as inventor. It will invade the artistic field; computers have already produced pleasing patterns of light.
Music has felt the effect of the computer; the trend will continue. Some day not far off the hi-fi enthusiast will turn on his set and hear original compositions one after the other, turned out by the computer in as regular or random a form as the hearer chooses to set the controls. Each composition will bring the thrill of a new, fresh experience, unless we choose to go back in the computer’s memory for the old music.

The computer will do far more in the home than dream up random music for listening pleasure. The recorded telephone answerer will give way to one that can speak for us, making appointments and so on, and remembering to bring us up to date when we get home. A small computer plugged into the wall may do other things, like selecting menus and making food purchases for next week, planning our vacations, and helping the youngsters with their homework. It is even suggested that the computer may provide us with child-guidance help, plus psychological counsel for ourselves and medical diagnoses for the entire family. The entire house might be computerized, able to run itself without human help—even after its people are gone, as in the grimly prophetic story by Ray Bradbury in which a neat self-controlled home parts its curtains in the morning, a mechanical sweeper runs about gathering up dust, and the air conditioning, lighting, and entertainment carry on as usual, all oblivious to the fact that one side of the house is blackened by the blast of a bomb.

Perhaps guarding against that eventuality is the most important job the computer can do. Applications of computing power to government have already been described, along with hints of the sure path from simple tasks like the census, the income tax, and Peace Corps work to decision-making for the president. Just as logic is put to work in optimizing business, it can be used to plan and run a taut ship of state. At first such an electronic cabinet member will be given all available information, which it will evaluate so as to be ready to make suggestions on policy or emergency action. There is more reason for its going beyond this status to become an active agent than there is against it. Government has already become so complex that perhaps a human brain, or a collection of them, cannot be depended on to make the best possible decision. As communications and transportation are speeded up, the problem is compounded. Where once a commander-in-chief could weigh the situation for days before he had to commit himself and his country to a final choice, he may now be called upon to make such a far-reaching decision in minutes—perhaps minutes from the time he is awakened from a sound sleep.

The strongest opposition to this delegation of power is man’s own vanity. No machine can govern, even if it can think, the politician exclaims. The soldier once felt the same way; but operations research has given him more confidence in the machine, and SAGE and NORAD prove to him that survival depends on the speed and accuracy of the electronic computer.

Incurable romanticism is found even among our scientific community. The National Bureau of Standards describes a computer called ADAM, for Absolutely Divine Automatic Machine. But the scientists also know that ADAM, or man, needs help. Rather than consider the machine a tool, or even an extension of man’s mind, some are now concerned with a kind of marriage of man and machine in which each plays a significant part. Dr.
Simon Ramo, executive vice president of Thompson Ramo Wooldridge, Inc., has termed this mating of the minds “intellectronics.” The key to this combination of man’s intellect and that of electronics is closer rapport between the team members.

[Illustration: _Department of Defense_ Computer use in defense is typified in this BIRDIE system of the United States Army. ]

The man-machine concept has grown into a science called, for the present at least, “synnoetics,” a coinage from the Greek words _syn_ and _noe_, meaning “together” and “perceive.” This science is defined as treating the properties of composite systems, consisting of configurations of persons, mechanisms, plant or animal organisms, and automata, whose main attribute is that their ability to invent, to create, and to reason—their mental power—is greater than the mental power of their components.

We get a not-too-fanciful look into the future in a paper by Dr. Louis Fein published in the summer 1961 issue of _American Scientist_, titled “Computer-related Sciences (Synnoetics) at a University in 1975.” Dr. Fein is an authority on computers, as builder of RAYDAC in 1952, and as founder and president of the Computer Control Company. The paper ostensibly is being given to alumni some years hence by the university president. Dr. Fein tells us that students in the Department of Synnoetics study the formal languages used in communication between the elements of a synnoetic system, operations research, game theory, information storage, organization and retrieval, and automatic programming. One important study is that of error, called Hamartiology, from the Greek word meaning “to miss the mark.” The speaker tells us that this field was variously called cybernetics, information science, and finally computer-related science before being formally changed to the present synnoetics.
A list of the courses available to undergraduates includes:

    Von Neumann Machines and Turing Machines
    Elements of Automatic Programming
    Theory, Design, and Construction of Compilers
    Algorithms: Theory, Design, and Applications
    Foundations of the Science of Models
    The Theory, Design, and Application of Non-Numeric Models
    Heuristics
    Self-Programming Computers
    Advice Giving—Man to Machine and Machine to Man
    Simulation: Principles and Techniques
    Pattern Recognition and Learning by Automata
    The Grammar, Syntax, and Use of Formal Languages for Communication
      Between Machine and Machine and Between Man and Man
    Man-Automaton Systems: Their Organization, Use, and Control
    Problem-Solving: an Analysis of the Relationship Between the
      Problem-Solver, the Problem, and the Means for Solution
    Measurements of the Fundamental Characteristics of the Elements of
      Synnoetic Systems

Of course, synnoetics spills over into the other schools, as shown in the following typical courses taught:

    Botany Department
      Machine-Guided Taxonomy in Botany
    Business School
      Synnoetic “Business Executives”
    Engineering School
      Theory of Error and Equipment Reliability
      Design of Analog and Digital Computers
    Humanities Department
      Theory of Creative Processes in the Fine Arts
    Law School
      Patent and Precedence Searches with Computers
      The Effect of Automata on the Legislative and Judicial Process
    Mathematics Department
      The Theory of Graphs and the Organization of Automata
    Medical School
      Computer-Aided Medical Diagnosis and Prescription for Treatment
    Philosophy
      The Relationships between Models and the Phenomena That Are Modeled
    Psychology Department
      Studies in Intuition and Intellect of Synnoetic Systems
      Simulation in the Behavioral Sciences
    Sociology Department
      Synnoetics in Modern Society

The speaker proudly refers to the achievement of the faculty mediator and a computer in settling the “famous” strike of 1970. He simply got both sides first to agree that each would benefit by concentrating attention—not on arguing and finally settling the issues one at a time—but on arguing and finally settling on a program for an automaton. This program would evaluate the thousands of alternative settlements and would recommend a small class of settlements each of which was nearly optimum for both sides; a rough idea of how such a sifting might work appears in the sketch below. The automaton took only 30 minutes to produce the new contract last year. It would have taken one year to do this manually, and even then it would have been done less exhaustively. Agreeing on the program took one week.

Of course, the speaker adds, you have already heard that in many areas where people bargain or try to reach optimum decisions, in the World Nations Organization, in the World Court, and in local, federal, and world legislative bodies, serious consideration is now being given to convincing opposing factions to agree on a program, with the understanding that, once they have agreed, the contract or legislation or judgment or decision produced with the program will be accepted as optimum for both sides. Automata may also be provided to judges and juries to advise them of the effects of such factors as weight of evidence on verdicts in civil cases.

Dr. Fein makes an excellent case for the usefulness of the science of synnoetics; the main point of challenge to his paper might be that its date is too conservatively distant. Of interest to us here is the idea of man and machine working in harmony for the good of both.
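Dr. Fein gives no details of the strike-settling program, but the notion of recommending a small class of settlements, each nearly optimum for both sides, suggests a mechanical test: keep only those candidate settlements that no other candidate beats for labor and for management at once. The sketch below, in a modern programming language with invented names and scores, is offered purely as an illustration of that idea, not as his method:

    # A toy sifting of candidate settlements, for illustration only.
    # Each candidate carries a score for each side; we keep a candidate
    # unless some other candidate is at least as good for both sides and
    # strictly better for one (i.e., we keep the undominated class).
    def nearly_optimum(settlements):
        keep = []
        for s in settlements:
            dominated = any(
                o["labor"] >= s["labor"] and o["mgmt"] >= s["mgmt"]
                and (o["labor"] > s["labor"] or o["mgmt"] > s["mgmt"])
                for o in settlements
            )
            if not dominated:
                keep.append(s)
        return keep

    candidates = [
        {"name": "A", "labor": 7, "mgmt": 3},
        {"name": "B", "labor": 5, "mgmt": 6},
        {"name": "C", "labor": 4, "mgmt": 5},   # dropped: B beats it for both
        {"name": "D", "labor": 2, "mgmt": 8},
    ]
    print([s["name"] for s in nearly_optimum(candidates)])   # ['A', 'B', 'D']

With thousands of drafted settlements instead of four, the sifting is the same mechanical work, just the sort of exhaustive comparison the thirty-minute automaton could do and a year of manual bargaining could not.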
Another paper, “The Coming Technological Society,” presented by Dr. Simon Ramo at the University of California at Los Angeles, May 1, 1961, also discusses the possible results of man-machine cooperation during the remainder of the twentieth century. He lists more than a dozen specific and important applications for intellectronics in the decades immediately ahead of us. Law, medicine, engineering, libraries, money, and banking are among these. Pointing out that man is as unsuited for “putting little marks on pieces of paper” as he was for building pyramids with his own muscles, he suggests that our thumbprints and electronic scanners will take care of all accounting. Tongue in cheek, he does say that there will continue to be risks associated with life; for instance, a transistor burning out in Kansas City may accidentally wipe out someone’s fortune in Philadelphia. The making of reservations is onerous busywork man should not have to waste his valuable time on, and the control of moving things too is better left to the machine for the different reason that man’s unaided brain cannot cope with complex and high-speed traffic arteries, be they in space or on Los Angeles freeways. Business and military management will continue to be aided by the electronic machine.

But beyond all these benefits are those more important ones to our brains, our society, and culture. Teaching machines, says Dr. Ramo, can make education ten times more effective, thus increasing our intellect. And this improved intellect, multiplied by the electronic machine into intellectronic brainpower, is the secret of success in the world ahead. Instead of an automated, robotlike regimented world that some predict, Ramo sees greater democracy resulting. Using the thumbprint again, and the speed of electronics, government of our country will be truly by the people as they make their feelings known daily if necessary. Intellectronic legislation will extend beyond a single country’s boundaries in international cooperation. It will smash the language and communication barriers. It will permit and implement not only global prediction of weather, but global control as well. Because of the rapid handling of vast amounts of information, man can form more accurate and more logical concepts that will lead to better relations throughout the world. Summing up, Dr. Ramo points out that intellectronics benefits not only the technical man but social man as well:

    The real bottleneck to progress, to a safe, orderly, and happy
    transition to the coming technological age, lies in the severe
    disparity between scientific and sociological advance. Having
    discussed technology, with emphasis on the future extension of man’s
    intellect, we should ask: Will intellectronics aid in removing the
    imbalance? Will technology, properly used, make possible a correction
    of the very imbalance which causes technology to be in the lead? I
    believe that the challenging intellectual task of accelerating social
    progress is for the human mind and not his less intellectual partner.
    But perhaps there is hope. If the machines do more of the routine,
    everyday, intellectual tasks and insure the success of the material
    operation of the world, man’s work will be elevated to the higher
    mental domains. He will have the time, the intellectual stature, and
    hence the inclination to solve the world’s social problems. We must
    believe he has the capability.

[Illustration: _Thompson Ramo Wooldridge, Inc._ Information in many forms can be displayed with “polymorphic” data-processing systems.
]

Antedating synnoetics and intellectronics is another idea of such a relationship. In his book _The World, The Flesh and the Devil_, J. D. Bernal considers man’s replacement of various of his body’s parts with mechanical substitutes until the only organic remains would be his brain. This is a sort of wrong-end-to synnoetics, but in 1929 when the book was published there was already plenty of raw material for such a notion: wooden legs, hooks or claws for hands, and metal plates for bone, for example, with the artificial heart already under development. More recently we have seen the artificial kidney in use, along with other man-made organs. We have also added electronic gear to our organic components, for example the “pacemaker” implanted in many laggard hearts to keep them beating in proper cadence, plastic plumbing, and the like. There is a word for this sort of part-organic, part-mechanical man: the name “cyborg,” for cybernetic organism, was proposed by two New York doctors. Their technical definition of cyborg is “an exogenously extended organizational complex functioning as a homeostatic system.”

There is of course strong precedent in nature for the idea of such a beneficial combination: symbiosis, the co-existence or close union of two dissimilar organisms. The shark and his buddy, the pilot fish, are examples, as are man and the many parasites to which he is host. The idea of man being part of machine harks back to youthful rides in soapbox racers, and later experiences driving cars or flying aircraft. The pilot who flew “by the seat of his pants” in the early days easily felt himself part of the machine. As planes—and cars—grew bigger and more complex, this “one-manship” became more remote and harder to identify. The jet transport pilot may well have the feeling of handling a train when he applies force to his controls and must wait for it to be amplified through a servo system before it finally acts on the air stream. In the space age the man-machine combination not only survives but also flourishes. Arthur C. Clarke writes in a science-fiction story of a legless space man who serves well and happily in the weightlessness of his orbiting satellite station.

We have two stages of development, then, not necessarily sequential: man working with the machine and man as part of the machine. Several writers have suggested a third stage in which the machine gradually supplants the weaker human being much as other forms eased out the dinosaur of old. Olaf Stapledon’s book _Last and First Men_ describes immortal and quite literal giant brains. Many writers believe that these “brains” will not be man’s, but those of the machine, since frail humanity cannot survive in its increasingly hostile environment. Arthur C. Clarke is most articulate in describing what he calls the evolutionary cycle from man to machine. As the discovery of tools by pre-man created man, so man’s invention of thinking machines set about the workings that will make _him_ extinct. Clarke theorizes that this breakthrough by man may well be his last, and that his machines will “think” him off the face of the earth!

[Illustration: _Hughes Aircraft Company_ Withstanding underwater pressures, at depths too great for human divers, a Mobot vehicle demonstrates in this artist’s concept how it can perform salvage and rescue operations at the bottom of the ocean.
] As we move into a technology that embraces communication at a distance of millions of miles, survival under death-dealing radiation, and travel at fantastic speeds, man’s natural equipment falters and he must rely on the machine both as muscle and brain. Intelligence arose from life but does not necessarily need life, in the sense we think of it, to continue. Thus the extension of man’s intellect by electronics as hailed by Dr. Ramo will lead ultimately to our extinction. Clarke feels that the man-machine partnership we have entered, while mutually benevolent, is doomed to instability and that man with his human shortcomings will fall by the wayside, perhaps in space, which may well be the machine’s true medium. What will remain will be the intelligent machine, reduced as time goes on to “pure” intelligence free to roam where it will and do what it wants, a matterless state of affairs that even Clarke modestly disclaims the imagination to speculate upon. Before writing man off as a lost cause, we should investigate a strong argument against such a take-over by the machine. Man stands apart from other creatures in his consciousness of himself. He alone seems to have the ability to ponder his fate, to reflect, and to write books about his thoughts and dreams. Lesser animals apparently take what comes, do what they have to do, and get through this life with a minimum of changing their environment and themselves. Thus far the machines man has built do not seem to be conscious of themselves. While “rational beings,” perhaps, they do not have the “ability to laugh” or otherwise show conscious awareness of their fate. A term applied to primitive mechanical beings is “plugsuckers.” They learn to seek out a wall socket or other form of energy and nourish themselves much as animals must do. Just where man himself switched from plugsucking and began to rewire his own world is a fuzzy demarcation, but he seems to have accomplished this. Consciousness is subjective in the extreme, and thus far only in fiction have computers paused to reflect and consider what they have done and its effect on them. However, the machine-builder, if not yet the machine itself, is aware of this consciousness problem. The Hoffman Electronics Corporation recently published an advertisement in the form of a science-fiction story by A. E. Van Vogt. The hero is a defense vehicle, patrolling the Pacific more effectively because it thinks it is king of the Philippine Deep. Its name is Itself, and it has a built-in alter ego. Hoffman admits it has not produced a real Itself—yet, but points out calmly that the company’s business _is_ the conversion of scientific fiction to scientific fact. It has been suggested that mechanical consciousness may evolve when the computer begins to reproduce itself, a startling conception blessed in theory by logicians and mathematicians, as well as philosophers. A crude self-replicating model has been built by scientists—a toy train that reproduces itself by coupling together the proper cars to copy the parent train, a whimsical reflection of Samuel Butler’s baby engines playing about the roundhouse door. Self-reproducing machines may depend on a basic “cell” containing a blueprint of what it should look like when complete, which simply hunts around for the proper parts and assembles itself. In the process it may even make an improvement or two. Having finished, it will make a carbon copy of its blueprint and start another “baby” machine on the way. 
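The blueprint scheme is simple enough to act out in a few lines of a modern programming language. The sketch below is a toy, not a report of any real machine: a “cell” works down its parts list, draws matching parts from a common pool, and endows the finished offspring with a carbon copy of the list:

    # A toy model of blueprint-driven self-reproduction, for illustration only.
    def replicate(blueprint, parts_pool):
        """Assemble one offspring machine from the pool, if the parts allow."""
        gathered = []
        for part in blueprint:
            if part not in parts_pool:
                return None                # a missing part halts construction
            parts_pool.remove(part)
            gathered.append(part)
        return {"parts": gathered, "blueprint": list(blueprint)}  # carbon copy

    pool = ["arm", "arm", "motor", "memory", "motor", "memory", "arm"]
    parent = {"parts": ["arm", "motor", "memory"],
              "blueprint": ["arm", "motor", "memory"]}
    child = replicate(parent["blueprint"], pool)
    grandchild = replicate(child["blueprint"], pool)   # the copy breeds true
    print(child == grandchild)                         # True

The improvement-making mentioned above would enter as a deliberate change to the copied blueprint between one generation and the next; everything else in the cycle is the same mechanical gathering and copying.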
Writers on this subject—some under the guise of science-fiction—wonder at what point the _machines_ will begin to wonder about how _they_ came to be. Will they produce philosophic or religious literature, or will this step in evolution prove that consciousness was a bad mutation, like seven fingers or three heads, and drop it from the list of instructions? Clarke admits that the take-over by the machines is centuries off; meantime we can enjoy a golden age of intellectronic partnership with the machine. Linus Pauling, pointing out that knowledge of molecular structure has taken away the mystery of life, hopes that a “molecular theory of thinking” will be developed and so improve man that he may remake his thoughts and his world. Mathematician John Williams believes that existing human intelligence can preserve its distinction only by withdrawing from competition with the machine and defining human intelligence rigorously enough to exclude that of the machines. He suggests using the computer not just for a molecular theory of thinking, but also in the science of genetics to _design_ our children! Whatever lies ahead, it seems obvious that one of the most important things the computer can help us think about is the computer itself. It is a big part of our future. ------------------------------------------------------------------------ Index Abacus, 5, 21, 22, 60, 85, 129, 178, 181 Abstracting computer, 245, 248 Accuracy analog computer, 82 digital computer, 87 Ackerman, 110 ADAM computer, 258 Adaptive principle, 205 Adders, 107, 108, 115 Adding machine, 129 Addition, computer, 106 Address, computer, 63 Advertising, use of computer, 180 AID, 183, 184 AIEE, 254 Aiken, 46 Air Force, 6, 132, 133, 151, 160, 182, 225 Airborne computer, 90, 154, 158, 162 AiResearch Mfg. Co., 69 Airline reservations, computer, 58, 183, 184 Algebra, Boolean, 8, 110, 119 Alpha rhythm, 126 Alphanumeric code, 104 American Premium Systems, Inc., 175 Analog computer, 21, 45, 72, 74, 80, 125, 203 direct, 76, 79 direct-current, 76 discrete, 80 indirect, 76, 79 mechanical differential analyzer, 76 scaling, 76 Analytical engine, 36, 37 AND gate, 112, 113, 117, 119 Antikythera computer, 25 Apollo computer, 182 space vehicle, 169 Applications, digital computer, 92 _A priori_ concept, 126, 135 APT computer, 209 Aquinas, St. Thomas, 235 Arabic numbers, 23 Archytas, 25 Arithmetic unit, computer, 51, 60 Aristotle, 26 Aristotelian logic, 109 _Arizona Journal_, 179 Army, U. S., 21, 78, 146, 259 _Ars Magna_, 28, 29 ARTOC, 157 Artron, 136 Ashby, W. Ross, 51, 124, 128, 251 ASC computer, 155 Associated Press computer system, 177 Asynchronous computer, 255 Athena computer, 52 Atlas missile, 4, 168 Atlas-Centaur missile, 169 Atomic Energy Commission, U. S., 149 Automatic control, 80, 203 pilot, 203 Automation, 26, 80, 173, 181, 201, 202, 203, 211, 217 Automaton, 26 Auto-parking, use of computer, 178 Autonetics, 207 AUTOPROMPT computer, 210 AUTOTAG, 156 AutoTutor teaching machine, 213, 225 B-29, 45, 77, 82 Babbage, 5, 35, 37, 41, 51 Babylonian arithmetic, 23 Ballistic computer, 83 Banking, 1, 172, 173 Bar Association, American, 152, 249 Battelle Memorial Institute, 195 Batten, Barton, Durstine, & Osborn, 180 Bell Telephone Laboratories, 4, 147, 241 Bendix Corp., 182, 190, 218 Bendix G-15 computer, 183, 188 Bernal, J. 
D., 264 Bernstein, Alex, 141 Bettelheim, Bruno, 144 BIAX memory units, 10 Bierce, Ambrose, 43, 121 BINAC computer, 7, 47 Binary, 98 digit, 55, 104 notation, 101, 103 pure, 102, 104 system, 85, 97, 99 variables, 114 Bionics, 7, 132, 135, 255 BIRDIE, 259 Birds, counting, 18 Bit, 55, 104 “Black box” concept, 50, 115 BLADES system, 191 Block diagram, 58 BMEWS, 159 Boeing Airplane Co., 186 Boltzmann equation, 158 Bomarc missile, 186 _Book of Contemplation_, 27 Book of Knowledge, 6, 226 Boole, George, 38, 110 Boolean algebra, 38, 110, 119 Bradbury, Ray, 153, 257 _Brain_, 121 Brain computer, 128, 129, 130 human, 87, 125, 128 BRAINIAC computer, 88, 117 Britton, Lionel, 121 Buffer computer, 55 lexical, 238 Buildings, automation of, 217 Burack, Benjamin, 44 Bureau of Mines, U. S., 189 Bureau of Ships, U. S., 255 Burke, Edmund, 32 Burkhart, William, 45 Bush, Vannevar, 13, 45, 76 Business, computer in, 171 Business management, use of computer, 12, 143 Butler, Samuel, 32, 33, 121, 252, 268 CALCULO computer, 75 _Calculus Ratiocinator_, 109 Calendars as computers, 24 California Institute of Technology, 169 Cancer Society, American, 193 _Candide_, 30 Capek, Karel, 43, 121, 215 Caplin, Mortimer, 150 Carroll, Lewis, 38, 118 CDC 1604 computer, 165 Celanese Corp. of America, 207 Celestial simulator, 85 Census, 41 Census Bureau, U. S., 149 Chain circuit, 127 _Characteristica Universalis_, 109 Charactron tube, 66 Checkers (game), 8, 143 Checking, computer, 60 Checkout computer, 183 Chemical Corp., 249 Chess, 8, 9, 16, 35, 99, 142, 156 Circuit chain, 127 delay-line, 63 flip-flop, 63, 115 molecular, 9, 253 printed, 62 reverberation, 128 Clapp, Verner, 248 Clarke, Arthur C., 265 CLASS teaching machine system, 226-228 Clock, 20, 24, 56, 85 COBOL language, 234 Code, computer binary-coded decimal, 103, 106 binary-octal, 106 economy, 106 excess-3, 105, 114 “Gray,” 106 reflected binary, 106 self-checking, 105 Color computer, 4 _Commercial Art_, 175 Commission on Professional and Hospital Activity, 194 Communication, use of computers, 179 Computer ADAM, 258 addition, 106 airborne, 90, 154, 158, 162 analog, 21, 45, 72, 74, 80, 125, 203 direct, 76, 79 direct-current, 76 discrete, 80 indirect, 76, 79 mechanical differential analyzer, 76 scaling, 76 Antikythera, 25 Apollo, 182 space vehicle, 169 applications, digital, 92 ASCC, 155 asynchronous, 255 Athena, 52 ballistic, 83 Bendix G-15, 183, 188 BINAC, 7, 47 BRAINIAC, 88, 117 CALCULO, 75 CLASS, 226-228 code, binary-coded decimal, 103, 106 color, 4 definition, 129 dictionary, 49, 50 difference engine, 5, 35 digital, 18, 45, 73, 84, 125, 203 division, 107 do-it-yourself, 75, 88, 117, 147 electrical-analog, 75 electronic, 1, 46, 122, 151 ENIAC, 7, 40, 46, 85, 215 ERMA, 173 family tree, 86 FINDER system, 161 flow chart, 58, 59 GE 210, 172 GE 225, 245 general-purpose, 54, 81, 191 gigacycle, 254 “Hand,” 132, 214, 215 household, 15, 257 hybrid, 80, 84, 92 ILLIAC, 197 input, 51, 54, 125 JOHNNIAC, 11, 47, 129, 140, 142 language, 233 LARC, 47, 162, 191 LGP-30, 198 limitations, 89 MANIAC, 47, 156, 165 Memex, 13 mill, 38, 51, 60 MIPS, 159 MOBIDIC, 157 MUSE, 48 music, 11, 92, 196, 257 on-line, 81, 205 on-stream, 83, 207 output, 51, 65, 125 parts, 50, 52, 53 problem-solving, 140, 143 Psychological Matrix Rotation, 78, 94 Q-5, 77 RAMAC, 150, 151, 198, 199 Range Keeper Mark I, 42 RAYDAC, 260 RCA 501, 151 “real-time,” 78, 168, 202, 205 RECOMP, 47 revolution, 251 Sabre, 183 SAGE, 3, 12, 37, 53, 158, 159, 226, 259 sequential, 126 “Shoebox,” 242 “software,” 54 spaceborne, 167 special-purpose, 
79 SSEC, 155, 156 Stone Age, 21 store, 36, 62 STRETCH, 47, 48 subtraction, 106 testing, 117 UNIVAC, 47, 149, 151, 171, 221 VIDIAC, character-generator, 242 Zuse L23, 199 Computer Control Co., 260 Conjunctive operation, 37, 51, 110 Consciousness, 144, 145, 267 Continuous analog computer, 80 Continuous digital computer, 80 Continuous quantity, 73 Control, computer, 51, 56 Control Data Corp., 194 Conversion analog-to-digital, 74 digital-to-analog, 74 Converters, 94 Cook, William W., 29 Copland, Aaron, 11, 196 Cornell Medical College, 123 Cornell University, 133 Corrigan Communications, 231 Council on Library Resources, 248 Counting Australian, 20 birds, 18 boards, 20 digital, 84 machines, 20 man, 19 modulo-, 97, 101 Credit card, 13, 256 Cryogenics, 70 components, 63 Cryotron, 9, 88, 141, 254, 255 Cybertron, 135, 139 Cyborg, 265 Daedalus, 18 Darwin, Charles, 32, 137, 252 Data link, 14, 185, 256 logger, 205 processing, 22, 171, 264 recording media, 57 Daystrom, Inc., 211 Dead Sea Scrolls, 235 Decimal system, 19 Decision-making, 91 Defense, use of computer, 259 Delay-line circuit, 63 DeMorgan, Augustus, 38, 110, 115 Department of Commerce, U. S., 149, 221 Department of Defense, U. S., 148, 234 Design, use of computer, 14, 172, 186, 268 Desk calculator, 51 Diagnostic use of computer, 194 Diamond Ordnance Fuze Laboratory, U. S. Army, 69 Dictionary, computer, 49, 50 DIDAK teaching machine, 224 Difference engine, 5, 35 Digiflex trainer, 225 Digital computer, 18, 45, 73, 84, 125, 203 Digital differential analyzer, 94 Digitronics, 236 Discrete quantity, 73 Disjunctive operation, 110 Division, computer, 107 Dodgson, Charles L., 38 Do-it-yourself computer, 75, 88, 117, 147 Douglas Aircraft Co., 65 Dow Chemical Corp., 208 Du Pont Corp., 208 Dunsany, Lord, 108 Eccles-Jordan circuit, 47 Eckert, J. Presper, 47, 85 EDGE computer system, 185 Education, use of computers, 219 _Elan vital_, 127 Election, use of computers, 150 Electric Questionnaire, 133 Electric utilities, use of computers, 93, 208 Electrical-analog computer, 75 Electrical logic machine, 44 Electronic computers, 1, 46, 122, 151 Elephant, compared with computer, 56 Encyclopedia Britannica, 6, 226 ENIAC computer, 7, 40, 46, 85, 215 _Erewhon_, 32, 121 ERMA computer, 173 Ernst, Heinrich, 132, 215 Euler, 142, 143, 163 EURATOM, 158 Family tree, computer, 86 Farnsworth Car Pool logic problem, 116, 118 Farrington Electronics, Inc., 240 Federal Aviation Authority, 149, 161 Federal Government, 148 Feedback principle, 36, 204 Fein, Louis, 260 Fermat’s theorem, 56 Ferranti, Ltd., 182 Ferrite cores, 9, 63, 131, 253 FIELDATA computer family, 157 FINDER computer system, 161 Finn, James D., 224 Flexibility of digital computer, 89 Flight simulator, 83 Flip-flop circuit, 47, 63, 115 fluid, 70 Floating-point arithmetic, 108 Flow chart, computer, 58, 59 Flyball governor, 36, 203 Fluid computer, 70 Food Machinery Corp., 249 Ford Instrument Co., 42 Forrester, J. 
W., 199 _Fortune_, 245 _Frankenstein_, 42, 212 Freed, Roy, 152 Free learning, 7 Freight trains controlled by computer, 211 Game-playing, 8, 12, 143 Gaming theory, 92 Gardner, Martin, 140 GE 210 computer, 172 GE 225 computer, 245 General Dynamics Corp., 169, 183, 256 General Electric Co., 10, 45, 67, 76, 77, 79, 171, 172, 240 General Motors Corp., 218 General Precision, Inc., 69 General-purpose computer, 81, 85, 191 _Gestalt_ principle, 241 _Giant Brain_, 121 Gigacycle computer, 254 Gilfillan Radio, 67 Glenn, John, 3 Go (game), 143 Goal-seeking behavior, 124 Gödel, Kurt, 135 “Golem,” 27 Goodrich Tire & Rubber Co., 188 Goodyear Aircraft Corp., 77 Tire & Rubber Co., 208 Goren, Charles, 226 Government, 258, 263 Greek numbers, 23 Grieg, 11 Grimaldi, 99 _Gulliver’s Travels_, 30 Half-adder, 107, 115 Hamilton, Sir William, 109 “Hand” computer, 132, 214, 215 Handwriting reader, 241 Harcourt-Brace, 226 _Harvard Business Review_, 171, 172 Harvard University, 132, 217, 224 Hawkeye aircraft computer, 162 Heath, D. C., and Co., 226 HAYSTAQ, 249 Heikolator computer, 195 Hero, 18 Heuristics, 56, 142 High-school computer training, 15, 220 High-temperature susceptibility, 69 Hilbert, David, 110 Hiller, Lejaren A., Jr., 197 Hindu numbers, 23 HIPO system, 195 Hippo problem, 155 Hoffman Electronics Corp., 267 Holland, James, 224 Hollerith coding, 42 Hollerith, Herman, 2, 41, 54, 148 Holmes, Oliver Wendell, 109 Homeostat, 124 Homer, 26, 47 Hood, H. P. & Sons, 206 Hoover Commission, 149 Hourglass, 24 Household computer, 15, 257 Hughes Aircraft Co., 203, 215, 222 Hugo, Victor, 251 Hybrid computer, 80, 84, 92 IBM cards, 41 IBM 704 computer, 8 IBM 1401 computer, 175 IBM 1620 computer, 177 IBM 7074 computer, 175 Icarus, 18 Ice cream, computer-made, 206 ILLIAC computer, 197 “Illiac Suite,” 196, 197 _Iliad_, 26, 235 India, chess legend, 99 Industrial Advertising Research Institute, 180 Industrial revolution, 173 Industry, 181 “Inflexible Logic,” 32 Information explosion, 245 Information retrieval, 14, 243, 246, 247 Input, computer, 51, 54, 125 Instamatic computer system, 183 Insurance, use of computer, 92, 173 Intellectronics, 258, 262 Intelligence, 124, 135 Interagency Data Processing Committee, 148 Internal Revenue Department, U. S., 150 International Air Transport Association, 235 International Association of Machinists, 218 International Business Machines Corp., 69, 237, 247, 255 Interlingua, 237 Inventory, 176, 185 Inverter, 114, 119 IRE, 170 Isaacson, L. 
M., 197 Jacquard, Joseph M., 4, 34, 41, 54, 202, 242 Jet engine simulator, 78 Jet Propulsion Laboratory, 169 Jevons, William S., 40 JOHNNIAC computer, 11, 47, 129, 140, 142 Johnson’s Wax Co., 178 Jones & Laughlin Steel Corp., 188, 189, 205 Journalistor, 64, 252 Kalin, Theodore, 45, 135 Kalin-Burkhart machine, 45 Kane, Sydney, 193 Kant, Immanuel, 135 Kelvin, Lord, 75 Kelvin wheels, 76 Khayyám, Omar, 108 KNXT, television station, 179 Kresge Eye Institute, 195 Kyoto University, 243 Lamb, Sydney, 238 Language, computer, 233 LARC computer, 47, 162, 191 Law, 232 Law Institute, American, 152 Learning, 123 forced, 133, 134 free, 7, 133 reinforced, 134 soldered, 7, 133 Learning, Inc., 226 Leibnitz, Gottfried, 24, 29, 85, 99, 109, 120 Lenkurt Electric Co., Inc., 190 LGP-30 computer, 198 Library, use of computers, 231 Limitations of computers, 89 Lincoln Laboratory, 124 Lindgren, Astrid, 3 Literature, computers in, 30 Litton Industries, 128 Livanov, M., 133 Lockheed Aircraft Corp., 185, 248 Logarithms, 30 Logic, 38, 90, 108, 229 Aristotelian, 109 Farnsworth problem, 116, 118 mathematical, 110 symbolic, 38, 109, 110, 115, 248, 255 unit, 60 Logical algebra, 40, 108 piano, 40 Loom, Jacquard, 34 Loy, W. D., 23 Luhn, H. P., 247 Lull, Ramon, 27, 28, 122 Lull’s wheel, 28 _Machine Design_, 180 Machine shop, use of computers, 209 Machine Translations, Inc., 239 MAD, computer language, 220 Maelzel chess automaton, 35 Magic squares, 142 Magnetic cores, 64 Magnetic disc, 63 Magnetic drum, 63 Magnetic films, 88, 255 Magnetic ink, 3, 240 Magnetic tape, 55 Majority rule checking, 60 Malin, David, 220, 221 Maloney, Russell, 32 Management games, 199 MANIAC computer, 47, 156, 165 Man-machine relationship, 258 Mark I computer, 46, 219 Marquand, Allan, 40, 44 Matsuzake, Kiyoshi, 21 Mauchly, John, 47, 85 Mayans, 24, 97 McCarthy, John, 170 McDonnell Aircraft Corp., 186 McDonough, James, 235 McDougall, W., 124 McGraw-Hill Book Co., 226 Mechanical-relay, 122 Mediation principle, 102 Medical diagnosis, 257 Medical Research Foundation, American, 193 Medical use of computers, 193 MEDLARS system, 194 Memex computer, 13 “Memistor,” 137 Memory computer, 51, 63, 254 BIAX, 10 MIND, 137 molecular, 64 scratch-pad, 63 unit, 62 Mercury space capsule, 168, 249 Merrill Lynch, Pierce, Fenner & Smith, 236 Michigan State University, 151 MICR, 240 “Mill,” computer, 38, 51, 60 MIND memory unit, 137 Minneapolis-Honeywell Co., 162, 206, 208, 216 Minimax theory, 156 Minuteman missile, 4, 137, 168 MIPS computer, 159 MIT, 44, 169, 209, 215, 220 Mobot, 145, 215, 216, 266 MOBIDIC computer, 157 Modeling principle, 83 Modular approach, 115, 116 Molecular block memory, 64 Molecular circuit, 9, 253 Molecular electronics, 9 Monsanto Chemical Corp., 208 “Mooer’s” Law, 245 Morse code, 99 Mozart, 11, 197 Multiplication computer, 61, 107 Russian peasant, 103 MUSE computer, 48 Music, 11, 92, 196, 257 Nanosecond, 61 NANWEP, 165 Napier, John, 30 “Napier’s bones,” 30 National Library of Medicine, 194 NASA, 149 National Bureau of Standards, 94, 239, 249, 258 National Cash Register Co., The, 240 National Science Foundation, 158, 249 Navigation, use of computer, 182 Navy, U. 
S., 162 Negation principle, 113, 114 Neuristor, 137 Neurons human, 91, 125, 128, 135 artificial, 136, 138 Newell, Allen, 141, 251 Newton, Isaac, 30 New York University, 194, 220 Nike missile, 119, 157, 191 Nim (game), 8, 143 NORAD, 3, 160, 258 North American Aviation Corp., 65, 185 Numbers cuneiform, 23 Arabic, 23 Babylonian, 23 binary, 55 discrete, 73 Greek, 23 Hindu, 23 pure binary, 102, 104 Roman, 23, 97 Numerical control, 210 Numerical weather prediction, 163 _Odyssey_, 235 Ohio State University, 222 On-line computers, 81, 205 On-stream computers, 83, 207 _On the Origin of Species_, 32 Operant reinforcement, 223 Operations research, 36, 155, 256 Optical scanning, 240 OR gate, 112, 113, 117, 119 _Outline of Psychology_, 124 Output, computer, 51, 65, 125 Packaging density, 9, 140 Paper tape, 54 Papermaking, 209 Paradox, 45 Parallel addition, 107 Parallel operation, 126 Parametron, 255 Parity bit checking, 105 Parrish, Stephen Maxfield, 235 Pascal, Blaise, 30, 85 Patent Office, U. S., 249 Pauling, Linus, 7, 268 Pavlov, 133 Peace Corps, 149, 258 Peale, Mundy, 125 PEP system, 186 Perceptron, 7, 8, 134, 135 PERT system, 186 Petroleum industry, 208 Philadelphia Electric Co., 208 Philco Corp., 240 Phillips Petroleum Co., 207 Phonetic typewriter, 56 Picatinny Arsenal, 157 Pierce, John R., 147, 197 Pitt, William, 39 Plato, 25, 30 PLATO computer system, 25, 226 Player-piano, 54, 68 “Plot Genii,” 29 Plotto, 29 Pneumatic buffering, 69 Pneumatic capacitor, 69 Pneumatic computer, 54, 68, 69 Pneumatic diode, 69 Pneumatic flip-flop, 69 Pneumatic inductor, 69 Poetry computer, 144 Polaris missile, 4, 162, 168 Polymorphic data-processing, 264 Post office, 55, 149, 225 Potentiometers, 76 Predictive analysis, 238 Predictive control, 205 Prentice-Hall, Inc., 226 President, 16, 258 Pressey, Sydney, 222 Prices, computer, 5, 48, 147 Primitive equations, 163 _Principia Mathematica_, 110 Printed-circuit, 62 Printers, 65, 66 Prison, use of computers, 221 Problem-solving computer, 140, 143 Process control, 83 Procter & Gamble, 249 Program, 52, 226 Programmer, 55, 56, 103, 104, 128, 233 Programming, 36, 55 Psychological Matrix Rotation Computer, 78, 94 Public Health Service, U. S., 194 Pueblo Indians, 97 Punched cards, 2, 34, 41, 42, 43, 54 Purdue University, 76, 220 Pure binary, 102, 104 Q-5 computer, 77 Radcliffe College, 224 Radiation effects, 69 RAMAC computer, 150, 151, 198, 199 Ramo, Simon, 258, 262 Rand Corp., 11, 129 Random-access memory, 63, 131 Random net, 136 Range Keeper Mark I computer, 42 RAYDAC computer, 260 Raytheon Co., 135, 136 RCA, 205, 218, 255 RCA 501 computer, 151 Reading, by computer, 3, 55, 229 _Reader’s Digest_, 236 Real estate, 179 “Real-time” computers, 78, 168, 202, 205 RECOMP computer, 47 Reeves Instrument Co., 77 Republic Aviation Corp., 125 Reservations, airline, 3 Reverberation circuit, 128 Revolution, computer, 251 Rheem Califone, 224 Richardson, L. F., 163 _Road to Oz, The_, 27 Robot, 44, 212 Rockefeller Institute for Medical Research, 194 Roman numerals, 23, 97 Rosenblatt, Frank, 133, 135 Ross, Douglas, 209 Rossby, C. 
G., 165 Royal McBee Corp., 220 Royal Society, 251 _Rubáiyát_, 108 R.U.R., 44, 121 Russia, 11, 77, 133, 143, 195, 207, 215, 221, 236, 242 Russian peasant multiplication, 103 Russell, Bertrand, 110, 111, 130 Sabre computer, 183 SAC, 160, 161 SAGE computer, 3, 12, 37, 53, 158, 159, 226, 259 Samuel, Arthur, 251 Sara Lee Bakeries, 206 Sausage making by computer, 179 Scaling, analog computer, 76 _Scientific American_, 140, 239 “Sea Wolf” testing by computer, 162 Second industrial revolution, 171 Self-reproducing machines, 33, 268 Selfridge, Oliver, 124 Sequential computers, 126 Sex and numbers, 19 Shannon, Claude, 44, 110, 215 Shelley, Mary W., 42 “Shoebox” computer, 242 Sidewinder missile, 160 Signal Corps, U. S. A., 77 Simon, Herbert, 141 Simulator, 79, 169, 187, 189 Simulmatics Corp., 181 Simultaneous linear equations, 77 Skinner, B. F., 133, 223, 230 Skybolt missile, 160 Slide rule, 7, 85 Smee, Alfred, 121 Social Security, 149 “Software,” computer, 54 Solartron-John Brown, Ltd., 174 Sonotype, 243 _Son pan_, 23 “Sorcerer’s Apprentice,” 27 _Soroban_, 5, 22 Southern Methodist University, 179 Spaceborne computers, 167 Space flight, 3, 92 SPADATS system, 160 Special-purpose computers, 79 Speech computer, 242 Sperry Rand Corp., 255 Sports, use of computers, 198 SSEC computer, 155, 156 Standard Oil Co. of California, 207 Stanhope, Earl of, 39 demonstrator, 40 Stapledon, Olaf, 265 Steel mill, 189, 204 Steele, J. E., 132 Stock Exchange, American, 176 Stock market, 176, 177 Stone Age computer, 21 “Store” computer, 36, 62 Stravinsky, Igor, 197 STRETCH computer, 47, 48 Stromberg-Carlson, 191 “Subroutine” computer program, 59 Subtraction, computer, 106 Sumerian cuneiform, 23 Sundial, 24 Sun Oil Co., 208 Supermarket, use of computers, 13, 174 Surveyor space vehicle, 169 Swift, Jonathan, 30 Switch, statistical, 137 Syllogism, 26, 109 Symbiosis, 265 Symbolic logic, 38, 109, 110, 115, 248, 255 Synnoetics, 260 SYNTAC, 150 Synthetic rubber production, 208 System Development Corp., 156, 220, 226 Szoeny refinery, 207 _Tabula rasa_, 126 Tallies, 20 Tape memory, 64 magnetic, 54 paper, 54 TASCON, 180 Taylor, Frederick W., 171 Teaching machines, 6, 100, 222, 225 AutoTutor, 213, 225 CLASS, 226, 227, 228 DIDAK, 224 Digiflex, 225 PLATO, 226 Pressey, S., 222 Skinner, B. F., 133, 223, 230 Videosonic, 222 Technical Information Service, 249 Technical Operations, Inc., 150 Telecredit, 179 Teleflite, 183 Testing computers, 117 Texas Company, The, 208 Thinking, 123 molecular theory of, 268 Thompson Ramo Wooldridge, Inc., 226, 258 Thomson, James, 75 Thomson & McKinnon, 177 Thor missile, 167 Tick-tack-toe, 8, 143 Tik-Tok, 27 Titan missile, 168 TMI Grolier, 226 Torres y Quevedo, L., 35 Trading stamps with computers, 175 Traffic control, 218 Trains, 215 Transcontinental & Western Air Lines, 186 TransfeRobot, 4, 213, 217 Transportation, 181 Transistors, 9, 87, 144 Translation computer, 91, 92, 237 Traveling-wave tube, 255 Truth tables, 110, 112 Tunnel diode, 255 Turing, A.
TutorText, 226
UNESCO, 236
Unimate, 4, 212, 213
Union Carbide, 208
Unitary system, 97
Unicall, 243
United Air Lines, 182, 183
United States Industries, Inc., 213, 218, 225
UNIVAC computer, 47, 149, 151, 171, 221
University of California, 76
University of California at Berkeley, 238
University of California at Los Angeles, 133, 219
University of Illinois, 78
University of London, 8
University of Michigan, 135, 220
University of Pennsylvania, 46
University of Philadelphia, 193
University of Southern California, 225
University of Washington, 152
Upjohn Co., The, 127
Vacuum tubes, 9, 63, 114, 122
van Vogt, A. E., 267
Venn, John, 29, 38
Videosonic trainer, 222
VIDIAC character-generator, 242
Vitruvius, 25
Vocal computer, 67
Voltaire, 29
Voltmeter, 76
von Braun, Wernher, 168
von Kempelen, Wolfgang, 35
von Neumann, John, 130, 137, 156, 251, 253
Wall Street, 6, 176
Walter, Grey, 251
Walnut information retrieval system, 246, 247
War strategy, 143
Water clock, 24
Watt, James, 36, 203
_Way of All Flesh, The_, 32
Wearever Aluminum Co., 236
Weather Bureau, U. S., 166
Weather map, 164
Weather prediction, 15, 163
Wells, H. G., 13, 121
Werner, Gerhard, 123
Western Electric Co., 212
Western Reserve University, 245
Westinghouse Corp., 76, 211, 218
Whitehead, A. N., 110, 130
Wiener, Norbert, 123, 251
Williams, John, 268
Wood, Tom, 21
_World Brain_, 13
Wright Brothers, 18
X-15 aircraft, 71, 160
Young & Rubicam, 181
Zero, concept of, 24
Zuse L23 computer, 199
Zworykin, Vladimir, 194

------------------------------------------------------------------------

● Transcriber’s Notes:

  ○ New original cover art included with this eBook is granted to the
    public domain.

  ○ The example of binary division on page 107 could not be drawn
    accurately with HTML characters.

  ○ Some formulas and tables that could not be replicated well in HTML
    were replaced by page images from the printed book.

  ○ Missing or obscured punctuation was silently corrected.

  ○ Typographical errors were silently corrected.

  ○ Inconsistent spelling and hyphenation were made consistent only
    when a predominant form was found in this book.

  ○ Text that was in italics is enclosed by underscores (_italics_);
    text that was in bold is enclosed by “equal” signs (=bold=).

  ○ The use of a caret (^) before a letter, or letters, shows that the
    following letter or letters were intended to be a superscript, as
    in 10^{th} Century.

  ○ Superscripts are used to indicate numbers raised to a power. In
    this plain-text document they are represented by characters like
    this: “P^3” or “10^{18}”, i.e., P cubed or 10 to the 18th power.

  ○ Variables in formulas sometimes use subscripts, which look like
    this: “A_{0}”. This would be read “A sub 0”.

*** END OF THE PROJECT GUTENBERG EBOOK COMPUTERS—THE MACHINES WE THINK WITH ***