Evolution of Computing

Early Computing

Early computers were mechanical, using gears, levers, and other moving parts to perform calculations. The most famous example is Charles Babbage's Difference Engine.

Later, vacuum tubes were used to build the first electronic computers, which were still large, slow, and unreliable.

Transistors

[Figure: a NOT gate built from a single transistor]

Transistors were invented at Bell Labs in 1947. They are the basis of modern computing.

A transistor is a switch that can be turned on or off by applying a small voltage to its control terminal. Transistors are the building blocks of logic gates.
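As a rough sketch of the idea (not how real circuits are designed or simulated), the Python snippet below models a transistor as a voltage-controlled switch and wires such switches into NOT and NAND gates. The class and function names are invented for this illustration.

```python
# Toy model: a transistor as a voltage-controlled switch, plus a NOT gate
# (one transistor with a pull-up) and a NAND gate (two transistors in series).
# This is an illustration of the concept, not a circuit simulator.

class Transistor:
    """Idealized switch: conducts only when the control (gate) input is high."""

    def __init__(self):
        self.gate = False  # control input voltage, modeled as a boolean

    def conducts(self) -> bool:
        return self.gate


def not_gate(input_high: bool) -> bool:
    """NOT gate: when the transistor conducts, the output is pulled low;
    otherwise a pull-up holds the output high."""
    t = Transistor()
    t.gate = input_high
    return not t.conducts()


def nand_gate(a: bool, b: bool) -> bool:
    """NAND gate: two transistors in series pull the output low only
    when both inputs are high."""
    t1, t2 = Transistor(), Transistor()
    t1.gate, t2.gate = a, b
    return not (t1.conducts() and t2.conducts())


if __name__ == "__main__":
    for a in (False, True):
        print(f"NOT {a} -> {not_gate(a)}")
    for a in (False, True):
        for b in (False, True):
            print(f"{a} NAND {b} -> {nand_gate(a, b)}")
```

NAND is worth showing alongside NOT because any other logic gate can be built by combining NAND gates.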

An integrated circuit, or IC, is a collection of transistors and other components fabricated on a single chip. These are often referred to as "chips" or "microchips".

Moore's Law

Moore's Law is an observation made by Gordon Moore, co-founder of Intel, in 1965. It states that:

The number of transistors on a microchip doubles approximately every two years.

Note the specifics: the claim is about how many transistors can be packed onto a chip (transistor density), not directly about speed or performance.
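To make the doubling concrete, here is a back-of-the-envelope Python sketch. The starting point (the Intel 4004's roughly 2,300 transistors in 1971) and the strict two-year doubling period are simplifying assumptions used only for illustration.

```python
# Back-of-the-envelope projection under Moore's Law:
# N(t) = N0 * 2 ** (t / 2), i.e. the count doubles roughly every two years.
# Starting figures are approximate and used purely for illustration.

START_YEAR = 1971
START_TRANSISTORS = 2_300       # Intel 4004, approximately
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> float:
    """Transistor count predicted by doubling every two years."""
    elapsed = year - START_YEAR
    return START_TRANSISTORS * 2 ** (elapsed / DOUBLING_PERIOD_YEARS)

if __name__ == "__main__":
    for year in (1971, 1981, 1991, 2001, 2011, 2021):
        print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Running this gives counts in the tens of billions by the 2020s, which is roughly the order of magnitude of today's largest commercial chips.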

Continued Growth?

Kurzweil's Commentaries

Ray Kurzweil has observed that the rate of technological change is itself accelerating. This is a broader claim than Moore's Law, which is specific to transistor density.

Kurzweil's well-known charts track how much computation a dollar buys over time. The curve bends upward (exponential growth) rather than rising in a straight line (linear growth).
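To see why the difference between linear and exponential growth matters, the short sketch below compares the two. The starting value and growth rates are made up; only the shapes of the curves matter.

```python
# Illustration of "exponential, not linear".
# Numbers are arbitrary; the point is how quickly the two curves diverge.

def linear(start: float, per_year: float, years: int) -> float:
    """Grows by a fixed amount each year (a straight line)."""
    return start + per_year * years

def exponential(start: float, doubling_years: float, years: int) -> float:
    """Doubles every `doubling_years` years (a curve that bends upward)."""
    return start * 2 ** (years / doubling_years)

if __name__ == "__main__":
    # Hypothetical "computations per dollar", starting at 1 unit.
    for years in (0, 10, 20, 30, 40):
        lin = linear(1.0, per_year=1.0, years=years)
        exp = exponential(1.0, doubling_years=2.0, years=years)
        print(f"after {years:2d} years: linear ~{lin:,.0f}, exponential ~{exp:,.0f}")
```

After 40 years the linear curve has grown by a factor of about 40, while the doubling curve has grown by a factor of about a million.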

Kurzweil and others believe that as long as there is a demand for more computing power, the technology will continue to evolve to meet that demand.

Alternatives to Von Neumann Architecture

GPUs

Specialized Processors

Alternatives to Transistor-Based Computing

Quantum Computing

Organic Computing