Originally, the term "computer" referred to a person who performed numerical calculations under the direction of a mathematician, often with the aid of mechanical calculating devices such as the abacus. Examples of early calculating devices, the first ancestors of the computer, include the abacus and the Antikythera mechanism, an ancient Greek device for calculating the movements of planets, dating from about 87 BCE. The end of the Middle Ages saw a reinvigoration of European mathematics and engineering, and Wilhelm Schickard was the first of a number of European engineers to construct a mechanical calculator, with his 1623 device.
Charles Babbage was the first to conceptualize and design a fully programmable computer as early as 1820, but due to a combination of the limits of the technology of the time, limited finances, and an inability to resist tinkering with his design, the device was never actually constructed in his lifetime. A number of technologies that would later prove useful in computing, such as the punch card and the vacuum tube, had appeared by the end of the 19th century, and large-scale automated data processing using punch cards was performed by tabulating machines designed by Herman Hollerith.
During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated, special-purpose analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. These became increasingly rare after the development of the programmable digital computer.
A succession of steadily more powerful and flexible computing devices was constructed in the 1930s and 1940s, gradually adding the key features of modern computers, such as the use of digital electronics (largely pioneered by Claude Shannon in 1937) and more flexible programmability. Defining one point along this road as "the first computer" is exceedingly difficult. Notable achievements include the Atanasoff-Berry Computer (1937), a special-purpose machine that used valve-driven (vacuum tube) computation, binary numbers, and regenerative memory; the secret British Colossus computer (1944), which had limited programmability but demonstrated that a device using thousands of valves could be made reliable and reprogrammed electronically; the American ENIAC (1946), the first really general-purpose machine, though it still used the decimal system and had an inflexible architecture that meant reprogramming it essentially required rewiring; and Konrad Zuse's Z machines, with the electromechanical Z3 (1941) being the first working machine featuring automatic binary arithmetic and feasible programmability.
The team that developed ENIAC, recognizing its flaws, came up with a far more flexible and elegant design, which became known as the stored-program architecture and is the basis from which virtually all modern computers derive. A number of projects to develop computers based on the stored-program architecture began in the late 1940s; the first of these were completed in Britain. The first to be up and running was the Small-Scale Experimental Machine, but the EDSAC was perhaps the first practical version to be developed.
Valve-driven computer designs remained in use throughout the 1950s but were eventually replaced in the 1960s by transistor-based computers, which were smaller, faster, cheaper, and much more reliable, allowing them to be produced commercially. By the 1970s, the adoption of integrated circuit technology had enabled computers to be produced at a low enough cost that individuals could own a personal computer of the type familiar today.