
A History of Modern Computing

cmalek has sent us a review of Paul E. Ceruzzi's A History of Modern Computing, which delves into the past of the machines we use today. To read more about your computer's ancestors, click below.
A History of Modern Computing
author Paul E. Ceruzzi
pages 398
publisher MIT Press, 1998
rating 8/10
reviewer Chris Malek
ISBN 0-262-03255-4
summary A thorough treatment of the history of computing in the US; a must-read for any computer enthusiast.
In 1951, the Eckert-Mauchly Computer Corporation brought into the world the UNIVAC, an event which marks the real beginning of our computer and information age. This was not the first "computer," of course; the ENIAC (which Eckert and Mauchly had built during the Second World War) was the first general-purpose electronic computer, and such earlier machines as the Mark I, the Zuse Z machines, and the ABC had all preceded it. UNIVAC, however, was created in a fundamentally different spirit than its predecessors, which were used exclusively by government-funded military agencies and scientists: it was meant to be sold as a commodity.

In 1948, hearing of Eckert and Mauchly's plans, Howard Aiken (builder of Harvard's Mark I) said that computers would never be a marketable product, since in the U.S. only a handful of them would find use. He was wrong: by 1953, businesses that could afford one, as well as government agencies, were lining up for them, and when they got them, the machines changed the way they processed data. IBM, seeing in UNIVAC an early threat to its punched-card tabulator business, responded by announcing the IBM 701 in 1952, and the modern computer age and computer industry had taken their first tentative steps.

In A History of Modern Computing, Paul E. Ceruzzi, Curator of the Department of Space History at the National Air and Space Museum, weaves the fascinating tale of computing in the United States between 1945 and 1995, and "woven" is an apt word for this book. He takes the standpoint that a technology cannot be viewed in isolation: it must be taken as one participant in a complex system. He calls this philosophy "social construction": technology evolves as the result of pressures from many interacting forces in society, and in turn causes the society itself to evolve, thus changing the evolutionary pressures on the technology.

It is the evolution of this system, not merely the technology in it, that Ceruzzi is concerned with in A History of Modern Computing. As we watch the computer change from scientific instrument to commercial product, and see the emergence of first mainframes, then minicomputers, and finally the personal computer, Ceruzzi shows how this development was affected by many forces, of which the following are but a few: IBM, Digital, GE, and the Seven Dwarfs; NASA, the military, and other government agencies; the Cold War, the space race, and the 60's counterculture. We also see how these are all, in turn, affected by computers.

The influence of NASA's Manned Space Program in the 1960's is illustrative of this interaction at work. At the time, computing was done exclusively by batch processing -- a series of jobs run in succession, without human intervention. Computers were simply too costly for most sites to allow users direct interactive access -- a typical system rented for $20,000 to $40,000 per month, with a purchase price in the millions. The Manned Space Program, with its essentially unlimited budgets and its end-of-the-decade deadline to put a man on the Moon, was one place at which real-time, interactive computing was not only cost-feasible but necessary (in order to quickly determine, for example, whether the orbit resulting from a launch would be stable, or whether the mission should be aborted).

Working with several generations of IBM mainframes, and with the help of IBM engineers, NASA evolved its own software, a real-time system called the Mercury Monitor, into a powerful real-time extension of the IBM System/360 operating system, which was soon adopted by other commercial installations. By the early 1970's, this modified OS had become a fully supported IBM product. Most importantly,

"These modifications of [the IBM/360 OS] could not have happened without the unique nature of the Apollo mission .... Such modifications were not even permitted by IBM for most other customers, who typically leased and did not own equipment. NASA's modifications did show that a large commercial mainframe could operate in other than batch mode." (p 124)

The book is well illustrated, with many images of computers and people, and these illustrations add much to the text. It is aimed at a general audience, and the prose reads easily. At 312 pages of text, Ceruzzi manages to pack in a satisfying level of detail without overwhelming the reader. It is not a highly technical book; those seeking details about how each computer worked will be disappointed. Ceruzzi does not shirk the technical aspects: he is simply more interested in the impact of a technology than in its workings. For those so inclined, the book is well footnoted, and the footnotes are well worth reading. It also has an extensive bibliography.

One drawback that some may see is that this is a history of computing in the United States; even though there was work being done in other countries, notably England and Japan, it is only touched upon briefly. He does warn you that he's going to do this, however. And because I know it's going to come up: no, there is no mention of Linux, or Linus Torvalds, or Richard Stallman, or free software. He does have a whole chapter on UNIX and networked computers, however.

Ceruzzi also emphasizes real-world practical applications of ideas, and the role of university-based computer science research is largely left out. He is also a bit brief and somewhat vague about the years leading up to the creation of UNIVAC. For example, while he does mention the Mark I and Howard Aiken, he fails to mention that it was in fact IBM that constructed the machine under Aiken's direction, and that the computer was therefore not new territory for IBM when it announced the 701. Finally, for a book published in 1998, the fact that only 8 pages are spent on the Internet and its implications is a bit odd.

In total, however, A History of Modern Computing serves as a worthy companion to previously published books (such as Williams's A History of Computing Technology, 1985, and Goldstine's The Computer from Pascal to von Neumann, 1972), and will be enjoyed by anyone interested in learning how computing in the United States arrived at its current state.

Pick this book up at Amazon.

Table of Contents

Introduction: Defining "Computer"
1. The Advent of Commercial Computing, 1945-1956
2. Computing Comes of Age, 1956-1964
3. The Early History of Software, 1952-1968
4. From Mainframe to Minicomputer, 1959-1969
5. The Go-Go Years and the System/360, 1961-1975
6. The Chip and Its Impact, 1965-1975
7. The Personal Computer, 1972-1977
8. Augmenting Human Intellect, 1975-1985
9. Workstations, UNIX, and the Net, 1981-1995
Conclusion: The Digitization of the World Picture
Notes
Bibliography
Index

This discussion has been archived. No new comments can be posted.

  • Looks like an interesting book.
    Might be able to _FINALLY_ settle some bar bets I have made!

    PimpSmurf
  • by Signal 11 ( 7608 ) on Thursday October 14, 1999 @05:01AM (#1614051)
    They needed to write a whole book on the history of computing? I can sum it up in one paragraph!

    195? - ENIAC invented.

    1967 - while under the influence of illicit drugs, some college kids get together and create UNIX.

    197? - Apple creates the first PC.

    1980 - some nerd-boy founds a company out in Redmond WA - begins selling crummy OSs to the world.

    1985 - aforementioned crummy OSs have taken over the world.

    1991 - The aforementioned college kids finally sober up and release Linux.

    1999 - Aforementioned company goes nuts as Linux (UNIX!) starts taking back its fair share.

    2003 - Microsoft collapses. Sadly, nobody cares.

    2008 - Aforementioned college kids go out and get drunk again.. linux falters.

    2009 - *BSD guys take over the world.

    --

  • This book also makes a nice companion to an earlier book of Ceruzzi's titled _Reckoners_: while the new book gives a nice picture of the organizations which developed computers as business tools, _Reckoners_ provides profiles of the people who worked on some of the early projects. The most interesting chapter is about Konrad Zuse constructing his computer in his parents' living room out of cast-off telephone parts and used movie film.
  • In this splendid review (of a very intriguing book) there is a brief comment about how odd it seems that a book published in 1998 has so little to say about the Internet. Let me contribute a little understanding from years in the publishing business.

    Publishing is a time-consuming process--it is typical for a book manuscript to take 18 months or more to go from "manuscript to bound books." With some publishers the cycle takes even longer--and many commercial publishers will also time the release dates of books in order to make sure that they bring out new titles every month. They will sequence the books so that the titles they expect to be blockbusters appear in late fall (as Illiad's UserFriendly: The Comic Strip is appearing on O'Reilly's October list), and position the weaker titles in the early spring (the kiss of death in publishing is a February pub date).

    Most books don't happen overnight--in this case the author is working full time as an historian at the Smithsonian. So he didn't write this in three months or even six--it is more likely the result of two or three years of work.

    All of which means that the original outline for the book could very easily have been done as early as 1994 or 1995--when practically nobody, certainly nobody in a "general market audience," would have heard of the Web.

    If you're wondering how all those third-party manuals seem to hit the shelves in no time, rest assured that computers have revolutionized some publishers as well. Computer books aren't the same thing as traditional nonfiction--they are much more like products. The publisher lines up a series of writers, assembles a bunch of chapters together, and packages the product for sale. Having been through the process a couple of times, as well as having done real books, I can tell you that there's little comparison.

    I'll buy the book.

  • The Soul of a New Machine (Tracy Kidder) should be required reading for everyone who really wants to see an example of the "how," instead of just a historical view (this was actually required for a course at my college this past year). A great account of Data General's efforts to build, what else, a new machine. Well worth the $11 8^)

  • You forgot

    1939 - John Atanasoff and Clifford Berry build first electronic computer.

    Sperry Rand's patent on ENIAC was voided in federal court because it derived from Atanasoff's invention.

    It's really quite a shame that the history books ignore Atanasoff. It probably has as much to do with the fact that he was from Iowa as anything else. In my mind he is the father of modern computing.

  • I read this this past summer - it's a pretty good account of what's happened: plenty of pictures, anecdotes, and even some humor. It isn't too technical either, so most people should be able to understand it.

    8.5/10

  • Should be called A History of Modern US Computing
  • Nice to see the old myth about the ENIAC being the first computer is still being perpetuated.

    Chris
    Chris Wareham
  • I'm sure his parents were really happy when this guy (Konrad Zuse) [tu-berlin.de] built the first freely programmable computer (the Z1, ready in 1938), which completely filled up their whole living room. [tu-berlin.de]

    BTW: his parents did sponsor him. No gov. funding.



    --
  • That's the stupidest comment I've ever heard. I guess by that logic Edward Gibbon's "The Decline and Fall of the Roman Empire" is out of date -- I mean, after all, it was written in 1776, and Italian history continued after that...

    Any historical text is valid (and "up-to-date") as long as the points and conclusions it draws are still valid.

    Dan Sniderman
    Consultant
    BA - History 1984 (I guess I'm out-of-date now too!)
  • Wasn't it David Atanasoff? It was Clifford Berry, though -- ABC, wot?
  • Agreed. This is just a great book. I re-read it last year (it was also required reading in one of my college courses).

    Not only does the book present a gripping account of the engineering problems that were encountered and the various solutions that were developed, it also gives a great look into the personalities involved. This is a very human story, with all its great blunders and triumphs.

    My favorite part of the story is the description of the hiring/interviewing procedure for the project. New grads should pay attention to what these guys were looking for. You might be surprised.

    --

  • You know it'll get marked down when you start a comment with:

    This isn't a flame BUT...

    Did anyone watch (and recall) the recent TV series on Bletchley Park and the Enigma codebreakers? Didn't one of the later episodes mention a computer that was built before ENIAC?

    If I'm wrong, then please correct me, but I was led to believe (by the TV series) that ENIAC was the second electronic computer...
  • My (thank the gods) ex-wife collects first editions and has a history textbook entitled "A Complete History of The United States" printed in 1876. You, having a degree in history, could guess that there are many items in the book that are inaccurate or appear to be in contrast to "modern" history books (anyone read a history of history writing?).

    But since there actually is no time, as stated in the article earlier in the continuum of the is-now-and-always-will-be Slashdot marked with the same date stamp as this post, all history is always current -- if, indeed, "history" and "current" have any meaning at all.

  • by jd ( 1658 ) <imipak@ y a hoo.com> on Thursday October 14, 1999 @06:39AM (#1614069) Homepage Journal
    I can't condemn the authors for writing a book that will sell. Let's face it, that's why people write books.

    Mentioning the ABC computer, on which ENIAC was based, would alienate a lot of people. Too many lecturers and too many courses have money riding on promoting ENIAC as the "first" modern computer.

    Talking about Colossus - a computer that blew the socks off ENIAC and (in a recent benchtest) was shown to be faster than a Pentium II for code-cracking - would devastate US pride. A British computer from the 1940's, superior to a modern American high-tech system? That's one hell of an ego-basher.

    Then, there's the Manchester Mark 1. ENIAC was not truly a stored-program computer, but the MM1 (also known as The Baby) -was-. The MM1 also used Williams-tube (CRT) memory (the first computer to do so) and stored both program and data in memory at all times. It was also a binary computer (many early computers were decimal), and had a 32-bit architecture.

    All in all, ENIAC was just one more computer in a LONG line of both British and American inventions that revolutionised and shaped the modern computer age. Its significance is vastly overblown, and it has no real importance in the scheme of things beyond its publicity value.

  • by Anonymous Coward
    Right, the ENIAC was preceded by the ABC (Atanasoff-Berry Computer), built at Iowa State from the late 1930's to the early 1940's. While not general purpose (it was built solely to solve systems of linear equations in physics), it was the first digitally controlled, digital computing machine. It had rolling drums with condensers for memory, and 1/6 of a second (per second) for control signals. Mauchly visited Atanasoff before building the ENIAC to see the ABC. I just finished writing a paper on this stuff for a grad computer architecture class. It turned into a book report (in a grad class!) but it was interesting to some degree.
    There was also the Colossus in Britain that was used to decode German communications during WWII -- which Alan Turing worked on, BTW -- but it wasn't digital in the ways that Atanasoff's machine was (both in computation and control).
    Anyway...
  • Sadly correct. I didn't see any mention of Colossus, the Manchester Mk 1, the work done by Alan Turing, Ferranti, ICL, or any other major British name from the early days of computing. These probably had as much impact as any of the people mentioned in the extract, if not more.

    (The entire modern computer architecture is based on Turing's design, and the entire notion of computable problems is based on Turing's theoretical work.)

  • Talking about Colossus - a computer that blew the socks off ENIAC and (in a recent benchtest) was shown to be faster than a Pentium II for code-cracking - would devastate US pride. A British computer from the 1940's, superior to a modern American high-tech system? That's one hell of an ego-basher.

    Do you have a link for this? I've heard about Colossus being faster than a PII, but have never seen anything concrete.

    One data point I do have is comparing Turing's earlier Bombe with a P100 [geocities.com]. The link mentions a P100 doing in 8 minutes what a Bombe did in 900 minutes (15 hours).

    If we assume that Colossus was twice as fast as a PII, and the generic PII is four times as fast as a P100, then Colossus was roughly 900 times faster than Turing's Bombe. Damn, that really blows Moore's law away.
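
    Spelled out, as a quick back-of-the-envelope sketch (the 900- and 8-minute figures come from the link above; the PII and Colossus multipliers are the assumptions just stated, not measurements):

        # speedup estimate; the two multipliers below are assumptions, not measurements
        bombe_minutes = 900                           # one run on Turing's Bombe (from the link)
        p100_minutes = 8                              # the same run on a Pentium 100 (from the link)
        p100_vs_bombe = bombe_minutes / p100_minutes  # ~112.5x
        pii_vs_p100 = 4                               # assumed: a generic PII is 4x a P100
        colossus_vs_pii = 2                           # assumed: Colossus is 2x a PII
        print(p100_vs_bombe * pii_vs_p100 * colossus_vs_pii)  # 900.0 -> roughly 900x overall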

    George

  • Not off the top of my head, but I'll see if I can dig it up. When I find it, I'll post the link.

    In the meantime, I -do- remember seeing the article in the New Scientist, so if they have any kind of online search facility and have their older articles still online, it should be possible to locate it.

  • These self-proclaimed histories of computing always seem to jump from the Apple II (which they mention only because it was the first computer designed for non-wireheads) right to the PC. They totally ignore the plethora of 80s computers that were far more prevalent than the PC was during that era. The Commodores, the Ataris, the TIs, the CoCos, the Amigas. What of them? Do they have no place in history? Are they to be stricken from every obelisk?
  • I did a little research (which I should have done before posting, but hey, why would I not want to shoot off my mouth) and realized that Turing's Bombes decrypted Enigma, while Colossus decrypted Lorenz -- two entirely different encryption machines.

    I also found a link [fu-berlin.de] stating that Colossus does outperform a modern Pentium. The link is to a presentation, given in July of 1998, comparing Colossus with the Pentium.

    They claim the performance is because of the parallelism of the Colossus.

    George
  • by Anonymous Coward
    >Nice to see the old myth about the ENIAC being the first computer is still being perpetuated.

    Yes... But it would be bad for the American ego if an American book were to publish that the first computer was not American but German...

    The Z3 [tu-berlin.de]

    Another source [epemag.com]

    It could also be disputed that the UNIVAC was the first commercial computer. The company of Konrad Zuse delivered a Z4 system to the ETH in Zurich in 1950. Data can be found here: Zuse KG [epemag.com]

    Maybe that book needs a rewrite...

  • The Soul of a New Machine is one of those rare books that can convey the feeling a techie gets when totally immersed in an interesting problem. What's more, it does so in a way a layperson can easily grok. As I read, I felt a profound sense of empathy with the engineers as they struggled to build the machine of their dreams. When they conquered a problem, I felt joy. When they burnt out from the stress, I felt loss. I recommend this book to anyone who wonders what being a techie is really about.
  • And I'd bet dollars to donuts (though dollars are worthless to me... ;-) that even if they did, there'd be no mention of the good old BBC Micro.

    "When I were a lad, all round 'ere used to be Beebs..."
  • The Brits get us colonials back in their own British Museum of Science & Industry [nmsi.ac.uk] in London, which was (and may still be) running an exhibit on the history of computers this summer when I visited. It mentions various developments in America here and there, but you'd think that computers were invented in Britain if that's all you saw.

    Alan Turing [st-and.ac.uk] devised most of the theoretical basis for computers in mathematics, but all the modern computers that we use are called Von Neumann [st-and.ac.uk] machines for a reason.

  • There were several machines built using relays, but "The ENIAC is the first large-scale machine to make use of electronic circuits for general operations..." (p. 194, High Speed Computing Devices, by Engineering Research Associates, Inc., McGraw-Hill Book Co., 1950)
  • Atanasoff: Forgotten Father of the Computer
    by Clark R. Mollenhoff

    The First Electronic Computer: The Atanasoff Story
    by Alice R. & Arthur W. Burks (1989)

    Before the Computer: IBM, NCR, Burroughs, and Remington Rand and the Industry They Created, 1885-1956
    by James W. Cortada (1993)

    Building IBM: Shaping an Industry and Its Technology
    by Emerson W. Pugh (1995)

    Computer: A History of the Information Machine
    by Martin Campbell-Kelly & Wm. Aspray (1997)

    The Computer Comes of Age
    by Rene Moreau (1986)

    The Computer from Pascal to Von Neumann
    by Herman H. Goldstine (reprinted 1993)

    John von Neumann and the Origins of Modern Computing
    by William Aspray (1991)

    Engines of the Mind: The Evolution of the Computer from Mainframes to Microprocessors
    by Joel N. Shurkin (1996)

    ENIAC: The Triumphs and Tragedies of the World's First Computer
    by Scott McCartney (1999)

    From Memex to Hypertext: Vannevar Bush and the Mind's Machines
    by James M. Nyce, Paul Kahn (eds.) & Vannevar Bush (1992)

    Great Men and Women of Computing
    by Donald D. Spencer (2nd ed. 1999)

    A History of Computing Technology
    by Michael R. Williams (2nd ed. 1997)

    A History of Modern Computing
    by Paul E. Ceruzzi (1998)

    History of Personal Workstations
    by Adele Goldberg (ed.) (1988)

    History of Scientific Computing
    by Stephen G. Nash (ed.) (1990)

    Leo: The Incredible Story of the World's First Business Computer
    by David Caminer (ed.) (1997)

    Makin' Numbers: Howard Aiken and the Computer
    by I. Bernard Cohen (ed.) (1999)

    Out of Their Minds: The Lives and Discoveries of 15 Great Computer Scientists
    by Cathy A. Lazere, Dennis Elliott Shasha (1998)

    Remembering the Future: Interviews From Personal Computing World
    by Wendy M. Grossman (1997)

    The Timetable of Computers: A Chronology of the Most Important People and Events in the History of Computers
    by Donald D. Spencer (2nd ed. 1999)

    Transforming Computer Technology: Information Processing for the Pentagon 1962-1986
    by Arthur L. Norberg, Judy E. O'Neill, & Kerry Freedman (1996)

    Turing and the Computer: The Big Idea
    by Paul Strathern (1999)

    When Computers Went to Sea: The Digitization of the United States Navy
    by David L. Boslaugh (1999)

    Ada, the Enchantress of Numbers: A Selection from the Letters of Lord Byron's Daughter and Her Description of the First Computer
    by Betty A. Toole (ed.) (1998)

    A.M. Turing's ACE Report of 1946 and Other Papers (Charles Babbage Institute Reprint Series for the History of Computing, Vol. 10)
    by Alan Turing, et al. (1986)

    A Bibliographic Guide to the History of Computing, Computers, and the Information Processing Industry
    by James W. Cortada (1990)

    A Bibliographic Guide to Computer Applications, 1950-1990
    by James W. Cortada (1996)

    Business Builders in Computers
    by Nathan Aeseng (1999)

  • Glory and Failure: The Difference Engines of Johann Muller, Charles Babbage, and Georg and Edvard Scheutz (History of Computing)
    by Michael Lindgren, Craig G. McKay (translator) (1990)

  • The Computer - My Life
    by Konrad Zuse (1993, translation of 2nd ed., Springer-Verlag, 1986)



  • I used to dream of a Beeb, all we had was a wind-up pocket calculator, with no keyboard, no display, and no elastic.


    But it was a calculator to us


    Monty Python weekend and all ...

    Steve

  • I doubt many modern computers use a Von Neumann architecture. :) If anything, it's maybe a mix of Von Neumann and Turing architectures, as (in practice) both are really sub-optimal designs.

    However, on the more abstract level, all modern computers are Turing Machines. There is not a single program in existence which cannot run on the abstract TM, and vice versa, for any modern machine with enough memory.

  • One of my favorites has been

    Dorf, Richard C.
    Introduction to Computers and Computer Science. (San Francisco: Boyd & Fraser Publishing Co.) 1972

    It provides an interesting discussion of the early computers and includes many interesting pictures. My favorites are the pics of components, which demonstrate that organization is a virtue, and the photographs of IBM installations, complete with attractive models working intelligently at the consoles. The photos demonstrate the IBM "component" dress code, subtly revealing the corporate culture at the itty-bitty machine company.
