Discover how far back the seeds of computing lie and who is most responsible for the development of the computer.
Computers developed from calculating machines. One of the earliest mechanical devices for calculating, still widely used today, is the abacus -- a frame carrying parallel rods on which beads or counters are strung. Herodotus, the Greek historian who lived in the fifth century B.C.E., mentions the use of the abacus in Egypt.
In 1617, John Napier (1550-1617) invented "Napier's Rods" -- ivory rods marked with multiples of numbers, used to simplify multiplication.
In the middle of the same century, Blaise Pascal (1623-1662) produced a simple mechanism for adding and subtracting.
In 1694, Gottfried Wilhelm Leibniz (1646-1716) invented a stepped-drum machine that performed multiplication by repeated addition.
In 1823, the English visionary Charles Babbage (1792-1871) persuaded the British government to finance a "difference engine" for calculating mathematical tables. He later conceived the far more ambitious "analytical engine" -- a machine that could undertake any kind of calculation. It would have been driven by steam, but the most important innovation was that the entire program of operations was stored on punched cards. Babbage's machine was never completed and would not have worked if it had been. The precision required was far beyond the capabilities of the engineers of the time, and in any case, rods, levers, and cogs move too slowly for really rapid calculation. Only electrical signals, which travel at near the speed of light, are fast enough. Although he never built a working computer, Babbage thought out many of the basic principles that guide modern computers.
Based on the concepts of British mathematician Alan M. Turing (1912-1954), the earliest programmable electronic computer was the 1,500-valve "Colossus," proposed by Max Newman (1897-1985), built by T. H. Flowers, and used by the British government in 1943 to crack German teleprinter ciphers generated by the Lorenz machine.