Introduction

“An Electronic Brain!” That’s what a reporter on live national television called the UNIVAC—right before it predicted the results of the 1952 presidential election. It was a risky publicity stunt, and no one, not even the engineers who had built the computer, knew what it would do. Beamed into homes nationwide on black-and-white TVs, the broadcast was the first time the public had seen a computer in action. The computers that came before were gigantic, noisy machines, built during World War II and hidden away in top-secret laboratories. The UNIVAC (short for UNIVersal Automatic Computer) was new—built for offices instead of battle—and it would now have to prove itself. The American public watched the UNIVAC’s giant blinking console and rows of spinning magnetic tape while it did its calculations. Contrary to the polls, the UNIVAC called a landslide victory for Dwight D. Eisenhower. To everyone’s surprise, it was correct! It was sensational! The American audience was enthralled—it was sci-fi in real life! This TV event propelled computers into pop culture and the public imagination.
Computers have come a long way since 1952! We now have access to the entire sum of human knowledge with devices that fit in the palms of our hands. The inventions that have gotten us to this point are part of a technological journey that starts as far back as the Stone Age. Before we dive into the history of computers, let’s define what a computer is.
A computer is a machine that stores, retrieves, and processes data by following a set of instructions.
At its core, a computer is a tool that expands the capacity of the human mind. We are all familiar with the idea that tools enhance our physical abilities to do more work—a hammer helps our arm strike a nail. A computer is a tool that enhances our mental abilities. Computers help us solve complicated math equations, store and sort vast libraries of information, and even help us find our new favorite restaurant!
The internet, combined with the creation of the World Wide Web in 1990, turned computers into media machines. The web is an integral part of the global economy and, for many, an extension of personal identity. Computers have become so integrated into our lives that, in 2011, the United Nations declared access to the internet a human right.
Billions of people now carry smartphones that are 100,000 times more powerful than the computer that flew the first astronauts to the moon. But a computer in nearly every pocket hasn’t always been the norm. For most of history, computing machines were used by very few people—scientists doing research, governments managing bureaucracies, militaries fighting wars, and large corporations looking to maximize their profits. Early computers were incredibly expensive and physically massive, and they required specialized technical knowledge to use. It wasn’t until the personal computer revolution of the 1970s that the power of computing became more accessible to the average person.
This book will highlight the milestones of computer history and explore the idea that technological knowledge is power. It will not teach you how to code or go into the weeds of computer science. Instead, it focuses on the intent, purpose, and impact of the people and machines that changed our world. While this book profiles innovators in technology, there is no such thing as a “lone genius.” Computers emerged from the work of thousands of people and from a social climate that prioritized the funding of scientific advancement. The study of computer history is the study of humanity.