How to Make a Computer



Computer

A computer is a machine that can be programmed to automatically carry out sequences of arithmetic or logical operations (computation). Modern digital electronic computers can run programs, which are generalised sets of such operations. These programs enable computers to perform a wide variety of tasks. A computer system is a complete, working computer that includes the hardware, the operating system (its main software), and the peripheral devices needed for full operation. The term may also refer to a group of connected computers that operate as a unit, such as a computer network or a computer cluster.
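To make that definition a little more concrete, here is a minimal sketch of a "program" as a sequence of arithmetic and logical operations applied one after another. The function name and the toy program below are invented purely for illustration.

    # A "program" is just an ordered list of operations; the machine
    # applies each one in turn to the current value.
    def run(program, value):
        for operation in program:
            value = operation(value)
        return value

    # A toy program: double the input, add three, then test whether
    # the result is even (a logical operation).
    program = [
        lambda x: x * 2,        # arithmetic: multiply by 2
        lambda x: x + 3,        # arithmetic: add 3
        lambda x: x % 2 == 0,   # logical: is the result even?
    ]

    print(run(program, 5))      # 5 -> 10 -> 13 -> False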







Computers and computing devices from several eras: a mechanical difference engine (a design dating to around 1820), early vacuum-tube computers such as Colossus and ENIAC, the IBM Summit supercomputer, the Nintendo GameCube video game console, and the LYF Water 2 smartphone.
Computers are used as control systems in a wide variety of industrial and consumer products. These range from simple special-purpose devices such as microwave ovens and remote controls, to factory equipment such as industrial robots and computer-aided design systems, to general-purpose devices such as personal computers and portable devices such as smartphones. Computers also power the Internet, which links billions of computers and their users.

The earliest computers were meant only to perform calculations. Simple manual instruments like the abacus have helped people carry out calculations since ancient times. Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding the patterns woven by looms. In the early 20th century, more sophisticated electrical machines performed specialised analogue calculations. The first digital electronic calculating machines were developed during World War II. The first semiconductor transistors of the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit chip technologies of the late 1950s, which in turn led to the microprocessor and the microcomputer revolution of the 1970s. Since then, the speed, power, and versatility of computers have increased dramatically, with transistor counts rising at a rapid pace.
Moore's law predicted that this pace would continue, and the resulting growth in computing power helped drive the Digital Revolution of the late 20th and early 21st centuries.

A modern computer normally consists of at least one processing element, typically a central processing unit (CPU) in the form of a microprocessor, together with some type of computer memory, typically semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, and so on), output devices (monitors, printers, and so on), and input/output devices that perform both functions (for example, the touchscreens of the 2000s). Peripheral devices allow information to be retrieved from an external source, and they let the results of operations be saved and retrieved.
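The paragraph above describes the classic fetch-decode-execute cycle. The sketch below is a deliberately simplified illustration of that idea, not a description of any real instruction set: a control unit steps through instructions held in memory, an arithmetic step updates a register, a conditional jump lets stored data change the order of operations, and a print instruction stands in for an output device. The opcode names (LOAD, ADD, JUMP_IF_NONNEG, PRINT, HALT) are invented for this example.

    def run(memory):
        acc = 0   # accumulator register (holds intermediate results)
        pc = 0    # program counter, advanced by the control unit
        while True:
            opcode, operand = memory[pc]          # fetch the next instruction
            pc += 1
            if opcode == "LOAD":                  # decode and execute
                acc = operand
            elif opcode == "ADD":                 # arithmetic operation
                acc += operand
            elif opcode == "JUMP_IF_NONNEG":      # control flow depends on data
                if acc >= 0:
                    pc = operand
            elif opcode == "PRINT":               # stands in for an output device
                print(acc)
            elif opcode == "HALT":
                return

    # A tiny program held in "memory": count down from 3 to 0.
    program = [
        ("LOAD", 3),
        ("PRINT", None),            # instruction 1: start of the loop body
        ("ADD", -1),
        ("JUMP_IF_NONNEG", 1),      # repeat while the accumulator is >= 0
        ("HALT", None),
    ]

    run(program)                    # prints 3, 2, 1, 0

Real processors work on the same principle, only with binary-encoded instructions, many registers, and far more elaborate control and memory hardware.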
