Gears, Tubes, Transistors
The Computer Corner

January 7, 2024

by Charles Miller

Last week I mentioned that recent advances in quantum computing are making the next generational jump in computing a little less hypothetical. Be warned that the following history of the last 200 years of computers is a huge oversimplification, but here goes...

Many historians agree that the story of the modern computer begins in the 1820s with Charles Babbage. His "difference engine" was purely mechanical, cranked by hand, though later designs could possibly have been powered by steam. It could tabulate and interpolate functions by working from a small set of polynomial coefficients, using the method of finite differences so that every result required only addition. Babbage's machines were never finished; in 1842 the British government abandoned funding for the project. The Science Museum in London later built a difference engine from Babbage's plans, completing it in 1991: it weighs five tons, and it works. The difference engine was many times faster than computing with pencil and paper.
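For the technically curious, here is a minimal sketch, in modern Python rather than brass gears, of the method of finite differences the engine mechanized. The polynomial is my own arbitrary example; the point is that once the first few values are seeded, every further value falls out of additions alone, which is exactly what a hand-cranked machine can do.

```python
# A toy illustration (mine, not Babbage's) of the method of finite
# differences. For a degree-2 polynomial the second differences are
# constant, so after seeding three values we need only additions.

def seed_differences(values):
    """Build the leading entries of the difference table from the
    first few tabulated values."""
    rows = [values[:]]
    while len(rows[-1]) > 1:
        prev = rows[-1]
        rows.append([b - a for a, b in zip(prev, prev[1:])])
    return [row[0] for row in rows]   # p(0), delta p, delta^2 p, ...

def tabulate(state, steps):
    """Produce successive values using additions only, as the
    difference engine did mechanically."""
    results = []
    for _ in range(steps):
        results.append(state[0])
        for i in range(len(state) - 1):
            state[i] += state[i + 1]  # each register adds its neighbor
    return results

p = lambda x: 2 * x * x + 3 * x + 5           # an arbitrary example
state = seed_differences([p(0), p(1), p(2)])  # seeds: [5, 5, 4]
print(tabulate(state, 6))                     # [5, 10, 19, 32, 49, 70]
```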

World War II provided the impetus to create the first programmable, electronic, digital computer. Named Colossus, it was used by British code breakers to help in the cryptanalysis of intercepted Nazi communications. The various models of Colossus used between 1,600 and 2,400 vacuum tubes to perform Boolean and counting operations, largely replacing mechanical gears with much faster electronics. Colossus was a generation ahead of mechanical calculators such as the difference engine.

In the 1950s the ability to mass-produce transistors led to what is often called the "second-generation" electronic computer, built from solid-state transistors instead of vacuum tubes. The Massachusetts Institute of Technology's TX-0 (Transistorized Experimental computer zero) had 3,600 Philco transistors. Once again, this was a generational leap forward in computing speed.

Then in the late 1960s the integrated circuit (IC) led to the "third-generation" computer and the next great leap forward. An IC (also known as a microchip or silicon chip) is a set of electronic circuits incorporating thousands of transistors on a small chip of silicon. This third generation is where computing technology remains today. Modern silicon chips can have not just thousands but billions of transistors on one chip that is physically much smaller than older designs. Today's computers are hundreds of times faster than those of just a few decades ago, but they still trace their lineage to the 1960s.

What is now on the horizon is a new generation of computers that some say could be millions to trillions of times faster than any supercomputer existing today. As this brief history outlines, we have already moved from metal gears to vacuum tubes to transistors. Quantum computers use quantum bits, or qubits, which process information very differently. How this is possible is a real challenge to explain. Decades ago the American theoretical physicist Richard Feynman said of the quantum electrodynamics work that won him the Nobel Prize, "If it were possible to describe it in a few sentences, it wouldn't have been worth a Nobel Prize." That is also how I explain it.
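I won't pretend to succeed where Feynman declined to try, but a tiny taste is possible. The sketch below is a classical toy simulation in Python, purely illustrative (simulating qubits on an ordinary computer is exactly the hard work quantum hardware avoids). It shows the one idea that fits in a few lines: a qubit's state is a pair of amplitudes rather than a plain 0 or 1, and a gate can put it into an equal superposition of both.

```python
import random

# A classical toy model of a single qubit (illustrative only).
# Its state is two amplitudes (a, b) for |0> and |1>,
# with |a|^2 + |b|^2 = 1.

def hadamard(state):
    """The Hadamard gate: sends a definite |0> into an equal
    superposition of |0> and |1>."""
    a, b = state
    s = 2 ** -0.5
    return (s * (a + b), s * (a - b))

def measure(state):
    """One measurement of a freshly prepared qubit: 0 with
    probability |a|^2, otherwise 1."""
    a, _ = state
    return 0 if random.random() < abs(a) ** 2 else 1

qubit = hadamard((1.0, 0.0))    # prepare |0>, then apply Hadamard
samples = [measure(qubit) for _ in range(10_000)]
print(sum(samples) / len(samples))   # about 0.5: half 0s, half 1s
```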

**************

Charles Miller is a freelance computer consultant with decades of IT experience and a Texan with a lifetime love for Mexico. The opinions expressed are his own. He may be contacted at 415-101-8528 or by email at FAQ8 (at) SMAguru.com.

**************