A computer is an electronic machine that works by sequentially reading a series of instructions, which make it perform logical and arithmetic operations on binary numbers.
On power-up, a computer executes, one after another, instructions that read, manipulate, and then write a set of data accessible to it. Tests and conditional jumps allow the next instruction to vary, so the machine can act differently depending on the data or the needs of the moment.
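This fetch-and-execute cycle with conditional jumps can be sketched as a toy interpreter. The instruction set below (LOAD, ADD, JNZ, HALT) and the single register are invented for illustration, not any real machine's instructions:

```python
# Toy machine: a program counter (pc) steps through a list of
# instructions; a conditional jump ("JNZ") changes pc when the
# tested register is non-zero, so execution can loop or branch.

def run_program(program):
    regs = {"A": 0}          # a single register, for illustration
    pc = 0                   # program counter
    while pc < len(program):
        op, *args = program[pc]
        if op == "LOAD":     # LOAD value into register A
            regs["A"] = args[0]
            pc += 1
        elif op == "ADD":    # add value to register A
            regs["A"] += args[0]
            pc += 1
        elif op == "JNZ":    # jump to target if A != 0
            pc = args[0] if regs["A"] != 0 else pc + 1
        elif op == "HALT":
            break
    return regs["A"]

# Count down from 3 to 0 using a conditional jump (a loop).
program = [
    ("LOAD", 3),     # 0: A = 3
    ("ADD", -1),     # 1: A = A - 1
    ("JNZ", 1),      # 2: if A != 0, jump back to instruction 1
    ("HALT",),       # 3: stop
]
print(run_program(program))  # -> 0
```

The same data (register A) both drives the computation and decides, via the jump, which instruction runs next.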
The data to be manipulated is obtained either by reading memory or by reading interface components (peripherals) that convert external physical quantities into binary values (the movement of a mouse, a key pressed on a keyboard, temperature, speed, compression …). Once used or processed, the data is written either to memory or to components that can convert a binary value into a physical action (writing to a printer or a monitor, accelerating or braking a vehicle, changing the temperature of a furnace …).
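One common way to organize this reading and writing is memory-mapped I/O, where device registers appear at fixed addresses alongside ordinary memory. A minimal sketch, with the addresses, the sensor reading, and the furnace device all invented for illustration:

```python
# Hypothetical memory map: addresses 0-99 are RAM, address 100 is a
# temperature sensor (input), address 101 drives a furnace setpoint
# (output). All values here are invented for illustration.

class Bus:
    SENSOR_ADDR = 100
    FURNACE_ADDR = 101

    def __init__(self):
        self.ram = [0] * 100
        self.furnace_setpoint = 0

    def read(self, addr):
        if addr == self.SENSOR_ADDR:
            return 23                  # pretend the sensor reports 23 degrees
        return self.ram[addr]

    def write(self, addr, value):
        if addr == self.FURNACE_ADDR:
            self.furnace_setpoint = value   # binary value becomes a physical action
        else:
            self.ram[addr] = value

bus = Bus()
temp = bus.read(Bus.SENSOR_ADDR)       # input: physical quantity read as a number
bus.write(0, temp)                     # store it in ordinary memory
bus.write(Bus.FURNACE_ADDR, temp + 2)  # output: number drives the furnace
print(bus.furnace_setpoint)            # -> 25
```

To the processor, reading a sensor or writing to a device looks exactly like reading or writing memory; only the address differs.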
The computer can also respond to interruptions, which allow it to run a specific response program for each one and then resume sequential execution of the interrupted program.
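The interrupt mechanism can be sketched as a loop that, between instructions, checks for a pending interrupt, saves its place, runs the handler, and resumes. The program, handler, and "timer" interrupt below are invented for illustration:

```python
# Simplified model: after each instruction the processor checks for a
# pending interrupt, saves the program counter, runs the corresponding
# handler, then resumes the interrupted program where it left off.

def run_with_interrupts(program, handlers, pending):
    log = []                       # record of everything executed
    pc = 0
    while pc < len(program):
        log.append(program[pc])    # "execute" the current instruction
        pc += 1
        if pending:                # interrupt check between instructions
            irq = pending.pop(0)
            saved_pc = pc          # save where we were
            log.extend(handlers[irq])  # run the specific response program
            pc = saved_pc          # resume the interrupted program
    return log

program = ["step1", "step2", "step3"]
handlers = {"timer": ["tick"]}
pending = ["timer"]                # one interrupt raised at the start

print(run_with_interrupts(program, handlers, pending))
# -> ['step1', 'tick', 'step2', 'step3']
```

The key point is the saved program counter: the handler runs to completion, and the original program continues as if nothing had happened.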
The current technology of computers dates from the mid-twentieth century. Computers can be classified according to several criteria (field of application, size, or architecture).
Calculating machines played a vital role in the development of computers, for two quite independent reasons.
* On the one hand, for their origins: it was during the development of an automatic printing calculating machine, around 1834, that Babbage began to imagine his Analytical Engine, the forerunner of computers. It was a calculating machine programmed by reading punched cards (inspired by the Jacquard loom), with one card reader for data and another for programs, with memories, a central computing unit, and a printer, and it would inspire the development of the first computers a century later (Howard Aiken referred extensively to Babbage's machines when he convinced IBM to build the Harvard Mark I in 1937), which led to the mainframes of the 1960s.
* On the other hand, for their spread, through the commercialization in 1971 of the first microprocessor, the Intel 4004, invented during the development of an electronic calculating machine for the Japanese company Busicom. The microprocessor triggered the explosion of personal computing from 1975 onward, and it lies at the heart of all computers today, regardless of their size or function (although only 2% of the microprocessors produced each year are used as computer central processing units; the other 98% go into cars, household appliances, watches, cameras …).