Cards (26)

  • The electronic computer is one of the most important developments of the twentieth century. Like the industrial revolution of the nineteenth century, the computer and the information and communication technology built upon it have drastically changed business, culture, government and science, and have touched nearly every aspect of our lives.
  • “We’re changing the World with Technology” - Bill Gates
  • Analog computers use continuous physical magnitudes to represent quantitative information. At first they represented quantities with mechanical components (see differential analyzer and integrator), but after World War II voltages were used; by the 1960s digital computers had largely replaced them. Nonetheless, analog computers, and some hybrid digital-analog systems, continued in use through the 1960s in tasks such as aircraft and spaceflight simulation.
  • One advantage of analog computation is that it may be relatively simple to design and build an analog computer to solve a single problem.
  • Another advantage is that analog computers can frequently represent and solve a problem in “real time”; that is, the computation proceeds at the same rate as the system being modeled by it.
  • Their main disadvantages are that analog representations are limited in precision (typically a few decimal places, and fewer in complex mechanisms) and that general-purpose devices are expensive and not easily programmed.
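    A purely illustrative Python sketch (an assumption of these notes, not from the source): an analog integrator solved equations such as dy/dt = -y continuously, while a digital machine approximates the same answer in small discrete steps. The step size and starting value below are arbitrary choices for the example.

        # Illustrative only: a digital, step-by-step integration of dy/dt = -y,
        # the kind of equation an analog integrator solved continuously.
        def integrate(y0: float, dt: float, t_end: float) -> float:
            """Euler integration of dy/dt = -y starting from y0."""
            y, t = y0, 0.0
            while t < t_end:
                y += -y * dt   # each small step approximates the continuous change
                t += dt
            return y

        if __name__ == "__main__":
            # With y0 = 1.0 the exact answer at t = 1 is e**-1, about 0.3679.
            print(integrate(y0=1.0, dt=0.001, t_end=1.0))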
  • In contrast to analog computers, digital computers represent information in discrete form, generally as sequences of 0s and 1s (binary digits, or bits).
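    As a quick illustration of discrete representation (example values chosen arbitrarily, not from the source), the Python snippet below shows a number and a text character reduced to the bit patterns a digital computer actually stores.

        # Everything a digital computer stores is ultimately a sequence of bits (0s and 1s).
        number = 77
        print(bin(number))                 # 0b1001101 - the integer 77 as binary digits

        letter = "A"
        print(format(ord(letter), "08b"))  # 01000001 - the character 'A' as an 8-bit pattern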
  • The modern era of digital computers began in the late 1930s and early 1940s in the United States, Britain, and Germany. The first devices used switches operated by electromagnets (relays). Their programs were stored on punched paper tape or cards, and they had limited internal data storage.
  • These computers came to be called mainframes, though the term did not become common until smaller computers were built.
  • Mainframe computers were characterized by having (for their time) large storage capabilities, fast components, and powerful computational abilities.
  • They were highly reliable, and, because they frequently served vital needs in an organization, they were sometimes designed with redundant components that let them survive partial failures. Because they were complex systems, they were operated by a staff of systems programmers, who alone had access to the computer. Other users submitted “batch jobs” to be run one at a time on the mainframe.
  • The most powerful computers of the day have typically been called supercomputers. They have historically been very expensive and their use limited to high-priority computations for government-sponsored research, such as nuclear simulations and weather modeling.
  • More recently, the design of costly, special-purpose processors for supercomputers has been supplanted by the use of large arrays of commodity processors (from several dozen to over 8,000) operating in parallel over a high-speed communications network.
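    The idea of many ordinary processors sharing one job can be sketched with Python's standard multiprocessing module; this toy example (an illustration of the principle only, not how any particular supercomputer is programmed) splits a large sum across a pool of worker processes.

        # Toy illustration of parallel processing: split one big job across several
        # commodity processor cores instead of one special-purpose processor.
        from multiprocessing import Pool

        def partial_sum(bounds):
            lo, hi = bounds
            return sum(range(lo, hi))

        if __name__ == "__main__":
            chunks = [(0, 250_000), (250_000, 500_000),
                      (500_000, 750_000), (750_000, 1_000_000)]
            with Pool(processes=4) as pool:
                pieces = pool.map(partial_sum, chunks)  # each chunk runs on its own worker
            print(sum(pieces))  # same result as sum(range(1_000_000)), computed in parallel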
  • Although minicomputers date to the early 1950s, the term was introduced in the mid-1960s. Relatively small and inexpensive, minicomputers were typically used in a single department of an organization and often dedicated to one task or shared by a small group.
  • Minicomputers generally had limited computational power, but they had excellent compatibility with various laboratory and industrial devices for collecting and inputting data.
  • One of the most important manufacturers of minicomputers was Digital Equipment Corporation (DEC) with its Programmed Data Processor (PDP). In 1960 DEC’s PDP-1 sold for $120,000. Five years later its PDP-8 cost $18,000 and became the first widely used minicomputer, with more than 50,000 sold. The DEC PDP-11, introduced in 1970, came in a variety of models, small and cheap enough to control a single manufacturing process and large enough for shared use in university computer centres; more than 650,000 were sold. However, the microcomputer overtook this market in the 1980s.
  • A microcomputer is a small computer built around a microprocessor integrated circuit, or chip. Whereas the early minicomputers replaced vacuum tubes with discrete transistors, microcomputers (and later minicomputers as well) used microprocessors that integrated thousands or millions of transistors on a single chip.
  • In 1971 the Intel Corporation produced the first microprocessor, the Intel 4004, which was powerful enough to function as a computer although it was produced for use in a Japanese-made calculator.
  • In 1975 the first personal computer, the Altair, used a successor chip, the Intel 8080 microprocessor.
  • Another class of computer is the embedded processor. These are small computers that use simple microprocessors to control electrical and mechanical functions. They generally do not have to do elaborate computations or be extremely fast, nor do they have to have great “input-output” capability, and so they can be inexpensive.
  • Embedded processors help to control aircraft and industrial automation, and they are common in automobiles and in both large and small household appliances. One particular type, the digital signal processor (DSP), has become as prevalent as the microprocessor. DSPs are used in wireless telephones, digital telephone and cable modems, and some stereo equipment.
  • The First Generation
    • The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms.
    • They were very expensive to operate; in addition to using a great deal of electricity, they generated a lot of heat, which was often the cause of malfunctions.
    • First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time.
    • Input was based on punched cards and paper tape, and output was displayed on printouts.
  • The Second Generation
    • Transistors replaced vacuum tubes.
    • One transistor replaced the equivalent of 40 vacuum tubes.
    • Still generated a great deal of heat that could damage the computer.
    • Moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words (see the toy sketch after this list).
    • Still relied on punched cards for input and printouts for output.
    • These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.
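    To make the machine-language/assembly contrast concrete, here is a toy sketch built around a made-up (hypothetical) instruction set: the "assembler" simply translates word-like mnemonics into the numeric codes the machine executes. All mnemonics and opcodes are invented for illustration.

        # Hypothetical toy instruction set, invented purely for illustration.
        # First-generation programmers wrote the numeric machine words directly;
        # second-generation assembly languages let them write word-like mnemonics instead.
        OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011, "HALT": 0b1111}

        def assemble(program):
            """Translate assembly mnemonics into 8-bit machine words."""
            words = []
            for line in program:
                parts = line.split()
                mnemonic = parts[0]
                operand = int(parts[1]) if len(parts) > 1 else 0
                words.append((OPCODES[mnemonic] << 4) | operand)  # 4-bit opcode, 4-bit operand
            return words

        source = ["LOAD 5", "ADD 3", "STORE 7", "HALT"]
        for line, word in zip(source, assemble(source)):
            print(f"{line:<8} -> {word:08b}")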
  • The Third Generation
    • The development of the integrated circuit was the hallmark of the third generation of computers.
    • Transistors were miniaturized and placed on silicon chips (made of semiconductor material), which drastically increased the speed and efficiency of computers.
    • These computers could carry out instructions in billionths of a second (nanoseconds).
    • Users interacted with third generation computers through keyboards and monitors and interfaced with an operating system.
    • Computers for the first time became accessible to a mass audience.
  • The Fourth Generation
    • The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip.
    • These smaller computers could be linked together to form networks, which eventually led to the development of the Internet.
    • Development of GUIs, the mouse and handheld devices.
    • Laid the groundwork for computing based on Artificial Intelligence (AI).
    • Parallel processing and superconductors are helping to make artificial intelligence a reality.
    • There are some applications, such as voice recognition, that are being used today.
  • Like minicomputers, early microcomputers had relatively limited storage and data-handling capabilities, but these have grown as storage technology has improved alongside processing power.