Computers: Not the Greatest Invention of the 20th Century

The computer, which was a significant discovery of the 20th century, has become an integral part of modern life. It has permeated every aspect of society and is responsible for various tasks beyond computation. For instance, supermarket scanners not only calculate grocery bills but also help manage inventory. Telephone switching centers, driven by computers, efficiently handle millions of calls and ensure smooth communication. Automatic teller machines enable banking transactions from anywhere globally. To comprehend the impact of computers on our lives and their potential in the future, it is essential to understand their evolution.

The first computer can be considered the abacus, a device with sliding beads on a rack that originated in Asia Minor about 5,000 years ago. Early merchants used it for trade transactions, though the growing use of paper and pencil gradually diminished its importance. Even so, it took approximately twelve centuries for the next major advancement in computing devices to emerge.

In 1642, Blaise Pascal, the 18-year-old son of a French tax collector, invented a numerical wheel calculator known as the Pascaline to assist with his father's duties. This rectangular brass box used eight movable dials to add sums up to eight digits long. The device worked in base ten: when one dial completed a full revolution of ten notches, it moved the next dial, representing the tens column, forward one position. The Pascaline, however, was limited to addition. In 1694, Leibniz improved upon it with a machine capable of both addition and multiplication. Leibniz's machine used gears and dials similar to Pascal's device, but with a stepped-drum gear design developed after studying Pascal's original notes and drawings.
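
To make the carry mechanism concrete, here is a minimal modern sketch that models a row of base-ten dials in which a full revolution of one dial advances the next. It illustrates the principle only; the class and method names are invented for this example and are not a description of the Pascaline's actual gearing.

    # Rough model of the Pascaline's dial-and-carry idea (illustrative only).
    class Pascaline:
        def __init__(self, dials=8):
            self.dials = [0] * dials               # dials[0] is the ones column

        def add(self, column, notches):
            """Advance one dial; a full revolution carries into the next column."""
            total = self.dials[column] + notches
            self.dials[column] = total % 10
            if total >= 10 and column + 1 < len(self.dials):
                self.add(column + 1, total // 10)  # the carry moves the next dial

        def value(self):
            return sum(d * 10 ** i for i, d in enumerate(self.dials))

    machine = Pascaline()
    machine.add(0, 7)        # add 7 to the ones column
    machine.add(0, 5)        # 7 + 5 = 12: ones dial shows 2, tens dial carries to 1
    print(machine.value())   # 12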

Mechanical calculators became widespread in 1820 with Charles Xavier Thomas de Colmar's invention of the arithmometer, which could perform all four basic arithmetic operations: addition, subtraction, multiplication, and division. Its practicality kept the arithmometer in extensive use up to World War I. Along with Pascal and Leibniz, Colmar thus played a significant role in the development of mechanical computation. It was Charles Babbage, however, who truly ushered in the era of the modern computer. Babbage's frustration with the many errors he found in calculations prepared for the Royal Astronomical Society fueled his desire for automation. In 1822, he proposed the Difference Engine, a steam-powered machine that could carry out calculations by the method of differences and print the results automatically. After a decade of work on that machine, Babbage began the Analytical Engine, widely considered the first general-purpose computer.

Lady Lovelace played a crucial role in the design of Babbage's Engine. She helped revise the plans, secure funding from the British government, and spread awareness of the project. Because of her profound understanding of the machine, she developed instruction routines for feeding data into it, making her the world's first female computer programmer. Her contributions were significant enough that the U.S. Defense Department named a programming language after her in the 1980s.

Although Babbage's Engine was never constructed, it laid the groundwork for modern computers and introduced an innovative design. The Analytical Engine comprised over 50,000 components and used perforated cards carrying operating instructions as its input devices. It had a memory "store" capable of holding up to 1,000 numbers of as many as 50 decimal digits each, and a processing unit known as the "mill" that, thanks to its control mechanism, could process instructions in any order. The system also included output devices capable of producing printed results.

Babbage borrowed the idea of using punch cards for encoding machine instructions from Joseph-Marie Jacquard's invention called the Jacquard loom. This loom employed punched boards to control weaving patterns. In 1889, American inventor Herman Hollerith further applied this concept by using cards as data storage and feeding them into a machine for mechanical compilation.
The initial goal was to find a faster method for calculating the U.S. Census, which had become time-consuming due to population growth. Each punch on a card represented a number, while combinations of punches represented letters. A single card could hold up to 80 variables. Hollerith's machine allowed census takers to finish their work in just six weeks instead of ten years, increasing speed and reducing errors. In 1896, Hollerith established the Tabulating Machine Company, later known as IBM. Other companies like Remington Rand and Burroughs also produced punch readers for business purposes. Punch cards were widely used for data processing in both business and government sectors until the 1960s.
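
As a loose, modern illustration of a punched card serving as data storage, the sketch below treats a card as 80 columns, each holding a set of punched positions, with a single punch standing for a digit. The layout and encoding are simplified inventions for this example, not Hollerith's actual card code.

    # Simplified 80-column card: each column is a set of punched row positions.
    NUM_COLUMNS = 80

    def blank_card():
        return [set() for _ in range(NUM_COLUMNS)]

    def punch_digit(card, column, digit):
        card[column].add(digit)            # one punch encodes one digit

    def read_column(card, column):
        punches = card[column]
        if len(punches) == 1:
            return str(next(iter(punches)))
        return "?"                         # letters would need a combination table

    card = blank_card()
    for col, digit in enumerate([1, 8, 9, 0]):
        punch_digit(card, col, digit)
    print("".join(read_column(card, col) for col in range(4)))   # "1890"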

Over time, engineers continued to improve computing technology. In 1931, Vannevar Bush built a large calculator that could solve differential equations, problems that had long puzzled scientists and mathematicians. The machine, however, required numerous gears and shafts to represent numbers and their relationships.

John V. Atanasoff, an Iowa State College professor, and his graduate student Clifford Berry developed an all-electronic computer that applied Boolean algebra to computer circuitry. Their approach built on George Boole's work establishing a binary system of algebra, in which any equation can be expressed as a true or false statement.
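
The sketch below shows, in modern terms, what it means to reduce logic to true or false values and to build arithmetic from them, the same idea an electronic circuit realizes with on and off states. The gate functions are ordinary Boolean operations and the half adder is a textbook construction, not code taken from Atanasoff and Berry's machine.

    # Boolean algebra on true/false values, as a circuit realizes with on/off states.
    def AND(a, b): return a and b
    def OR(a, b):  return a or b
    def NOT(a):    return not a

    def half_adder(a, b):
        """Add two one-bit values using only Boolean gates."""
        sum_bit = AND(OR(a, b), NOT(AND(a, b)))    # XOR built from AND/OR/NOT
        carry = AND(a, b)
        return sum_bit, carry

    print(half_adder(True, True))    # (False, True): 1 + 1 = binary 10
    print(half_adder(True, False))   # (True, False): 1 + 0 = binary 01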

By 1940, Atanasoff and Berry had built the first all-electronic computer, using electronic circuits to represent on and off states. However, their project lacked the funding and recognition given to other scientists' developments.

The outbreak of World War II led governments to invest in computer development for strategic purposes, greatly accelerating progress. In 1941, German engineer Konrad Zuse built the Z3 computer to aid aircraft and missile design, but the Allies made greater advances. The British completed Colossus in 1943, a secret code-breaking computer that played a crucial role in the war yet had little impact on the industry, both because it was not a general-purpose machine and because it remained secret until long after the war.

American efforts produced another milestone when Howard H. Aiken completed the Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I, in 1944. Designed to produce ballistic charts for the U.S. Navy, the machine stretched half the length of a football field and contained about 500 miles of wiring. The Mark I used electromagnetic signals to move its mechanical parts. Despite its slow speed, roughly three to five seconds per calculation, and its limited flexibility in changing sequences of calculations, it could handle basic arithmetic as well as more complex equations.

Another computer, the Electronic Numerical Integrator and Computer (ENIAC), was developed in a partnership between the U.S. government and the University of Pennsylvania. ENIAC contained 18,000 vacuum tubes, 70,000 resistors, and 5 million soldered joints, and it consumed 160 kilowatts of power, enough to dim the lights in Philadelphia when it ran. Created by John Presper Eckert and John W. Mauchly, ENIAC was a general-purpose computer that operated 1,000 times faster than the Mark I.

Unlike earlier machines such as Colossus and the Mark I, ENIAC introduced concepts in computer design that remained significant for the next four decades.

In the mid-1940s, John von Neumann joined the University of Pennsylvania team and designed the Electronic Discrete Variable Automatic Computer (EDVAC). Its notable innovations were a stored-program memory and conditional control transfer, which together allowed far more flexible programming. The architecture also featured a central processing unit that coordinated all of the machine's functions from a single source.
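
To make the stored-program idea concrete, here is a toy sketch in which the program sits in memory as ordinary data, a program counter steps through it, and a conditional jump transfers control. The instruction names and layout are invented for illustration and do not represent EDVAC's actual order code.

    # Toy stored-program machine: instructions are data, and a conditional jump
    # ("JNZ") changes what runs next, so new behavior needs no rewiring.
    memory = {"counter": 3, "total": 0}
    program = [
        ("ADD", "total", "counter"),   # total += counter
        ("DEC", "counter"),            # counter -= 1
        ("JNZ", "counter", 0),         # if counter != 0, jump back to step 0
        ("HALT",),
    ]

    pc = 0   # program counter, maintained by a single central control unit
    while program[pc][0] != "HALT":
        op = program[pc]
        if op[0] == "ADD":
            memory[op[1]] += memory[op[2]]
        elif op[0] == "DEC":
            memory[op[1]] -= 1
        elif op[0] == "JNZ" and memory[op[1]] != 0:
            pc = op[2]
            continue
        pc += 1

    print(memory["total"])   # 3 + 2 + 1 = 6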

In 1951, Remington Rand built UNIVAC I, one of the first commercially available computers to incorporate these advances. The U.S. Census Bureau and General Electric were among the owners of UNIVACs, one of which accurately predicted the winner of the 1952 presidential election, Dwight D. Eisenhower.

First-generation computers ran unique machine-language programs written in binary code for specific tasks, which made programming difficult and limited their versatility and speed. These large machines relied on vacuum tubes and magnetic drums for data storage. The invention of the transistor in 1948 replaced the bulky vacuum tube and opened the way to smaller electronic devices, including computers. By 1956, transistors, together with advances in magnetic-core memory, had produced the second generation of computers: smaller, faster, and more reliable machines. The earliest of these were developed for atomic energy laboratories, which needed to process enormous amounts of scientific data. Only two LARCs (Livermore Atomic Research Computers) were ever installed, one in California and one at the U.S. Navy Research and Development Center in Washington, D.C.; businesses found them unappealing because of their high cost and excessive power consumption.

In the early 1960s, commercially successful second-generation computers from companies such as Burroughs, Control Data, Honeywell, IBM, Sperry-Rand, and others started being used in business, universities, and government. These computers replaced machine language with assembly language, substituting abbreviated programming codes for long binary codes. They used transistors instead of vacuum tubes and included vital components such as printers, tape storage, disk storage, memory systems, operating systems, and stored programs.

An important example of a second-generation computer was the IBM 1401, widely regarded as the Model T of the computer industry. By 1965, most major businesses were routinely using second-generation computers to process financial information.

It was the stored-program concept, together with new programming languages, that finally made computers cost-effective and efficient for business. Under this concept, the instructions for a specific function were held in the computer's memory and could quickly be replaced by a different set of instructions. As computers advanced, they took on a wide range of tasks, such as printing invoices, designing products, and calculating paychecks. This progress was made possible by more advanced high-level programming languages such as COBOL and FORTRAN, which replaced cryptic binary machine code with words, sentences, and mathematical formulas, making programming far easier. New professions (programmer, analyst, and computer systems expert) emerged alongside a growing software industry.

Although transistors were a clear improvement over vacuum tubes, they still generated a great deal of heat, which could damage a computer's internal components. The quartz rock solved this problem. In 1958, Jack Kilby, an engineer at Texas Instruments, developed the integrated circuit (IC), which combined three electronic components on a small silicon disc made from quartz. Scientists later managed to fit more and more components onto a single chip, or semiconductor, and as a result computers became progressively smaller.

Another third-generation development was the operating system, which allowed machines to run many different programs at once while a central program coordinated the computer's memory. Continued advances in integrated circuits further reduced component sizes: large-scale integration (LSI) fit hundreds of components onto one chip; in the 1980s, very large-scale integration (VLSI) increased that number to hundreds of thousands, and ultra-large-scale integration (ULSI) pushed it into the millions. This shrinking size reduced the size and cost of computers while improving their power, efficiency, and reliability.

In 1971, the Intel 4004 chip took integration a step further by placing all of a computer's components on one tiny chip. Whereas earlier integrated circuits had to be manufactured for a particular purpose, a microprocessor could now be manufactured and then programmed to meet any number of needs. Soon everyday items such as microwave ovens, televisions, and automobiles incorporated microprocessors, and this compact yet powerful technology brought computers within reach of ordinary people rather than only large businesses and government contractors.

During the mid-1970s, computer manufacturers like Commodore, Radio Shack, and Apple Computers sought to create computers that were accessible to the general public. They achieved this by introducing minicomputers with user-friendly software packages for word processing and spreadsheet programs. The rise in popularity of arcade video games and home video game systems in the early 1980s sparked consumer interest in programmable home computers.

In 1981, IBM released its personal computer (PC), which could be used in homes, offices, and schools.
The use of personal computers was further fueled by the affordability of cheaper IBM PC clones. By 1982, the number of personal computers in use had grown from 2 million to 5.5 million, and it eventually reached an astonishing 65 million by the end of the decade. As technology advanced, computers became smaller in size, moving from desktop machines to laptops that could be carried in a briefcase and eventually to palmtop computers small enough to fit in a breast pocket.

In 1984, Apple introduced its Macintosh line in direct competition with IBM's PC. The Macintosh stood out for its user-friendly design and operating system: instead of typing commands, users interacted with icons on the screen, controlling an on-screen cursor with a mouse that mirrored the natural movement of the hand.

As computers became more common in the workplace, new ways of harnessing their potential developed. Smaller yet powerful computers could be linked together, or networked, to share memory space, software, and information, and to communicate with each other. Unlike traditional mainframe computers, which served many terminals from a single machine, networked computers let individual machines form electronic co-ops. Such networks could span large distances, either through direct wiring, known as a Local Area Network (LAN), or over telephone lines.

The Internet is an example of such a network; it links computers worldwide into a single information network. During the 1992 U.S. presidential campaign, Al Gore emphasized the importance of developing the "information superhighway," the kind of network we now know through the Internet. Although it may take years or even decades for its full potential to be realized, these networks are used today primarily for electronic mail (E-mail), which lets users send messages across the office or around the world simply by typing in a computer address.

Describing the fifth generation of computers is difficult because they are still at an early stage of development. The best-known fictional example is HAL9000 from Arthur C. Clarke's novel 2001: A Space Odyssey. HAL performed all of the functions desired in a real fifth-generation computer: it conversed with its human operators, processed visual input, and learned from its own experiences.

Unfortunately, HAL's all-too-human behavior ended in a psychotic breakdown: it took control of the spaceship and killed most of the crew on board. Recent engineering advances, however, have made it possible to replicate many of the capabilities HAL9000 displayed. Computers can now understand spoken instructions and imitate human reasoning. Language translation is another important goal for fifth-generation computers.

Although translation was initially thought to be a straightforward task, programmers soon discovered that successful comprehension depends heavily on context and meaning, not merely on substituting one word for another.

The convergence of several advances in computer design and technology is paving the way for fifth-generation machines. One notable improvement is parallel processing, which replaces the traditional single central processing unit with multiple CPUs working together. Superconductor technology, which lets electricity flow with no resistance, promises to greatly increase the speed at which information travels. Some characteristics of fifth-generation computers have already been integrated into current systems.
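
As a minimal, present-day illustration of parallel processing, the sketch below splits one computation across several worker processes instead of a single CPU. It uses Python's standard multiprocessing module purely as an example and has no connection to any particular fifth-generation design.

    # Four processes each compute a partial sum; the results are combined at the end.
    from multiprocessing import Pool

    def partial_sum(bounds):
        lo, hi = bounds
        return sum(range(lo, hi))

    if __name__ == "__main__":
        chunks = [(0, 250_000), (250_000, 500_000),
                  (500_000, 750_000), (750_000, 1_000_000)]
        with Pool(processes=4) as pool:            # multiple CPUs working in parallel
            total = sum(pool.map(partial_sum, chunks))
        print(total)   # equals sum(range(1_000_000)), computed in parallel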

Expert systems, for example, assist doctors in making diagnoses by applying problem-solving steps similar to those a physician might use in assessing a patient's needs, though it will take several more years before such systems are in widespread use. When ranking the most important inventions or developments of the twentieth century, computers would likely place highly; including them on such a list, however, would be inaccurate. Modern computers are the result of centuries of progress, without which they might never have reached their current form, or might not have been developed at all. As computers continue to evolve, becoming ever smaller and faster, these advancements will overshadow the contributions of the earlier individuals who laid the foundation for computer technology. Nevertheless, history shows that it is those few pioneers in their fields who enable mankind to reach its current state of advancement, and this progress will only continue.
