Chapter Outline

LECTURE OUTLINE: HISTORY OF THE COMPUTER
  1. Introduction
    1. Today's cell phones, tablet PCs, and Personal Digital Assistants (PDAs) represent a sharp contrast to technology of the past.
    2. Digital cameras, scanners, multimedia projectors, and the World Wide Web have become part of instructional technology in schools where teachers have resources and funds to use them with students.
    3. Some instructional technology remains unused or underused in schools, as a result of a lack of training, funding, or both.
  2. Early Times
    1. Primitive humans used their fingers to count, which is why ten became the basis of our number system today.
    2. As time passed, people used rocks, notched wood, and stones for counting, storing, and record keeping.
    3. The abacus is the earliest known device for manipulating data; beads in a wooden frame are moved to keep track of numbers and place values. A primitive abacus found near Babylon indicates that calculation aids existed around 3000 B.C. The abacus is the only early calculating device still in use today.
  3. Technology Pioneers and Their Calculating Devices
    1. Several pre-twentieth-century inventors pioneered calculating devices that preceded the computer.
      1. John Napier invented Napier's Rods or Bones in 1617. Users could multiply large numbers by manipulating rods.
      2. Blaise Pascal built a calculating machine in 1642 that could add and subtract.
      3. Baron Gottfried Wilhelm von Leibniz's 1694 calculating machine was called the Stepped Reckoner. This machine could add, subtract, multiply, and divide, and it used cylinders instead of gears to perform calculations. His primary contribution, however, was not his machine but his never-completed system of binary mathematics, which counts using only the digits 0 and 1.
      4. George Boole devised a system of logic based on the binary system, known as Boolean algebra. Inventors in the 1930s used this binary system, and it became the standard internal logic of today's digital computers (a short illustration appears after the outline).
      5. Joseph Marie Jacquard used punched cards in 1790 to create patterns on fabric woven on a loom, and this device was the forerunner of the keypunch machine.
    2. All of these early mechanical devices could perform only arithmetic. Other individuals contributed to the development of true computers.
      1. Charles Babbage is known as the father of computers. In 1835 Babbage designed a machine called the Analytical Engine that provided for printed output, a control unit, and an information storage unit. Its store (memory) and mill (the processing unit for arithmetical calculations) were key achievements in computer history.
      2. Augusta Ada Byron, Countess of Lovelace, is considered the first computer programmer. She wrote a demonstration program for the Analytical Engine and raised money for Babbage's invention.
      3. Herman Hollerith developed the Tabulating Machine in 1887 for punched-card data processing, which was used in the 1890 census. Hollerith later organized his own company, which became International Business Machines, or IBM, in 1924.
      4. In the early 20th century, the Census Bureau bought a machine designed by James Powers to replace Hollerith's machines. Powers' company later merged with two others to form the conglomerate Unisys. Their machines primarily served business; scientific computing needs remained unmet.
    3. The age of the modern computer began in 1944.
      1. Howard Aiken headed a group of scientists at Harvard, backed by IBM, that built the Mark I in 1943, the first electromechanical computer; the project helped make IBM a giant in computer technology. The Mark I was large, noisy, heavy, and hot, and its results were printed on an electric typewriter. Aiken built an illustrious program for computer scientists at Harvard. The term debug came from an incident in 1945 when a moth flew into a Mark II computer and had to be removed: the system had to be "debugged."
      2. John Atanasoff and Clifford Berry designed and built the first electronic digital computer in 1939. Their contribution has only recently been recognized.
      3. John Mauchly visited Atanasoff and then used much of what he learned in his work with J. Presper Eckert at the Moore School of Electrical Engineering at the University of Pennsylvania, but never mentioned the source of his ideas. In 1946, Mauchly and Eckert completed an operational electronic digital computer called the ENIAC (Electronic Numerical Integrator and Calculator), derived from the unpatented work of Atanasoff. It was tremendous in size, heavy, and hot, and it used 18,000 vacuum tubes to conduct electricity. ENIAC operated at a rate 500 times faster than an electromechanical computer of the day.
      4. John von Neumann helped the Moore team address ENIAC's memory and programming limitations through a machine called the EDVAC. Von Neumann became known as the architect of the stored-program concept, although not without protest from the Moore group. The EDVAC was smaller, simpler, and faster than the ENIAC. The EDSAC, developed at Cambridge University, and the EDVAC were the first computers to use binary notation.
      5. The era of commercial computers began in 1951, with the arrival of the UNIVAC, and computers were manufactured on a large scale for the first time.
  4. Generations of Computers
    1. Each stage of technological advancement is known as a generation. The first generation of computers began in the 1940s and extended into the 1950s. Bulky, expensive vacuum tubes were used to conduct electricity, and pioneering work was done in magnetic storage. The UNIVAC used a mercury delay line, which relied on ultrasonic pulses.
    2. The second generation of computers began in the mid-1950s when the transistor replaced the vacuum tube. The transistor, an electronically operated switch, conducted electricity more efficiently than the vacuum tube and was less noisy, hot, heavy, and expensive.
    3. The third generation of computers began in 1964 with the introduction of the IBM 360, the computer that pioneered the use of integrated circuits on a chip. Hundreds of transistors could be installed on a single silicon chip. Computers became smaller, faster, and more accessible to small companies. Video display terminals replaced punched cards for inputting data, and both main and auxiliary memory were developed. In the 1970s, development of large-scale integration (LSI) of circuits led to smaller computers, handheld calculators, and embedded computers in TVs and cameras. Personal computers such as the Apple and IBM PC became widespread, with a wide variety of software.
    4. The fourth generation of computers began in the 1970s and extended into the 1990s as a result of the development of microprocessor technology. Artificial intelligence, wireless networks, parallel processing, the Internet, and virtual reality represent some of the advancements that would later become widespread. Edward Hoff of Intel designed the microprocessor in 1968. In 1975, Bill Gates and Paul Allen founded Microsoft. Steve Wozniak and Steve Jobs introduced the Apple II in 1977. In 1981, the IBM PC entered the personal computing field.
    5. The fifth generation of computers started in the mid-1990s and continues to the present; it has brought incredibly fast computer chips, voice recognition, natural and foreign language translation, fiber-optic networks, optical disks, touch screens, handwriting recognition software, and wearable computers. This generation is based on logical inference and the use of artificial intelligence (AI), which imitates human reasoning. Today, computers and printers communicate over wireless networks, and machines use parallel processing. Flat-panel displays, increasing use of the Internet and wireless communication devices, video conferencing, 3-D, virtual reality, and interactive gaming and educational technology are all available.
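
The following is a brief, hypothetical Python sketch, not part of the textbook, illustrating the two ideas credited above to Leibniz and Boole: counting with only the digits 0 and 1, and combining those values with the Boolean operators AND, OR, and NOT. The function name to_binary is illustrative only.

# A minimal sketch of binary counting (Leibniz) and Boolean logic (Boole).

def to_binary(n):
    """Return the binary digits of a non-negative integer as a string."""
    if n == 0:
        return "0"
    digits = ""
    while n > 0:
        digits = str(n % 2) + digits   # remainder 0 or 1 is the next binary digit
        n //= 2
    return digits

# Counting using only the digits 0 and 1:
for value in range(6):
    print(value, "->", to_binary(value))   # prints 0, 1, 10, 11, 100, 101

# Boolean algebra on the two binary values True (1) and False (0):
a, b = True, False
print(a and b)   # False: AND is 1 only when both inputs are 1
print(a or b)    # True:  OR is 1 when at least one input is 1
print(not a)     # False: NOT inverts its input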






