Computers have become such an integral part of our lives that it’s difficult to imagine a time when they weren’t in our homes, workplaces, or back pockets. Yet computers as we know them today are still remarkably young inventions.
Computing devices have been around since the invention of the abacus some 5,000 years ago. But it wasn’t until 1944 that humans built one of the first large-scale digital computers.
That computer, the Mark I, was massive, weighing about five tons, and it was designed purely to perform calculations. Even so, the Mark I sparked what would become a steady evolution in computer development.
The following is a brief history of how computers evolved from room-sized machines into the pocket-sized wonders we use to play games, stream multimedia, and crunch numbers.
Five-Ton Calculators: The First Generation Of Computers
The development of computers as we know them began between 1940 and 1956. Howard Aiken presented the original concept of the IBM Automatic Sequence Controlled Calculator (ASCC), or Mark I, in 1937; IBM built the machine, and it went into operation at Harvard University in 1944.
The Mark I was a general-purpose electromechanical computer, and it was put to work supporting the war effort during World War II.
Other first-generation computers used magnetic drums for memory and vacuum tubes as switches and amplifiers. It’s largely because of these bulky vacuum tubes that early computers took up so much space.
Other early computers include the ENIAC, or Electronic Numerical Integrator and Computer, the first electronic general-purpose computer. Unveiled in 1946, it was designed by John Mauchly and J. Presper Eckert and programmed by a team that included Betty Jennings.
Ida Rhodes co-designed the C-10 programming language in 1950 for the UNIVAC I, the computer system that would later be used to tabulate the U.S. census.
Shrinking Down To Size
Between the 1950s and 1970s, computers began to shrink to manageable sizes as engineers replaced vacuum tubes with transistors and magnetic drum storage with magnetic-core memory.
Between 1964 and 1971, the emphasis shifted to speed as engineers began building computers around integrated circuits. These circuits packed large numbers of miniaturized transistors onto a single silicon chip, which allowed computers to shrink even further.
The Development Of The Modern Computer
In 1971, manufacturing advances made it possible to put thousands of transistors onto a single silicon chip. This gave way to the Intel 4004, the first microprocessor to be commercially available to the public.
Tim Berners-Lee developed the HyperText Markup Language (HTML) in 1990, giving rise to the World Wide Web. Improvements in graphics, memory, processing speed, and chip design fueled the development of computers as we know them today.
Where Do We Go From Here?
Technology is always evolving. In 2016, researchers at the University of Maryland created the first reprogrammable quantum computer. And in 2017, the Defense Advanced Research Projects Agency (DARPA) launched a program exploring how molecules could be used to store and process information.
As technology evolves and improves, engineers will keep finding ways to make computers smaller, faster, and more efficient. In the years ahead, we can expect continued advances in the fields of artificial intelligence, quantum computing, and nanotechnology.