History of Computers

The abacus is historically recognized as the most significant early computing instrument, having been in use and renowned for more than 2,000 years. The astrolabe, another computing tool, was employed roughly 2,000 years ago for navigation. Blaise Pascal is widely credited with building the first "digital calculating machine" in 1642. It performed only the addition of numbers entered by means of dials and was intended to help Pascal's father, a tax collector. In 1671, Gottfried Wilhelm von Leibniz designed a machine, finally built in 1694, that could not only add but, by successive adding and shifting, also multiply. Leibniz invented a distinctive mechanism known as the "stepped gear" to introduce the addend digits, and this mechanism remains in use today.
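
The passage above notes that Leibniz's machine multiplied by successive adding and shifting. The mechanical details are beyond the scope of this essay, but the arithmetic idea can be sketched in a few lines of Python; the function name and the decimal framing below are illustrative assumptions, not details taken from the historical machine.

    def shift_and_add_multiply(multiplicand: int, multiplier: int) -> int:
        """Multiply two non-negative integers by repeated addition and shifting,
        the same arithmetic scheme Leibniz's calculator mechanized."""
        product = 0
        shift = 0  # how many decimal places the multiplicand is shifted left
        while multiplier > 0:
            digit = multiplier % 10            # lowest remaining multiplier digit
            for _ in range(digit):             # 'digit' successive additions
                product += multiplicand * 10 ** shift
            multiplier //= 10                  # move on to the next digit
            shift += 1                         # shift one decimal place left
        return product

    assert shift_and_add_multiply(123, 45) == 123 * 45

Each digit costs at most nine additions plus one shift, which is why a mechanism for adding and shifting is enough to multiply.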

The initial models created by Leibniz and Pascal were not widely utilized, remaining more as curiosities until Thomas of Colmar (Charles Xavier Thomas) constructed the first commercially successful mechanical calculator in 1820. This device could perform addition, subtraction, multiplication, and division. Other inventors then produced a succession of improved "desk-top" mechanical calculators, and by approximately 1890 these included several built-in operations: accumulating partial results, storing and reintroducing past results, and printing of results. Each operation, however, still had to be initiated manually. These advancements were tailored primarily to the needs of commercial users, with little emphasis on the requirements of the scientific community.

Babbage

Concurrently with Thomas of Colmar's development of the desktop calculator, Charles Babbage commenced a series of highly innovative computer developments in Cambridge, England.

In 1812, Babbage realized that many lengthy computations, particularly those required for mathematical tables, involved repetitive routine operations. From this, he deduced the possibility of automating these operations. He began the design of an automatic mechanical calculating machine, which he named a "difference engine". By 1822, he had built a small functional model to demonstrate its capabilities. With financial support from the British government, Babbage initiated the construction of a full-size difference engine in 1823. This machine was intended to be steam-powered, fully automated (including the printing of resulting tables), and operated by a fixed instruction program. While limited in flexibility and applicability, the difference engine represented a significant conceptual advance. Babbage continued his work on it for a decade, but in 1833 he shifted his focus to a "better idea" – the development of what would today be called a general-purpose, fully program-controlled, automatic mechanical digital computer.
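
The essay does not explain how a difference engine actually produced its tables, but the principle of finite differences it relied on can be illustrated briefly. The following Python sketch is a simplified illustration of that idea, not a description of Babbage's design: once a polynomial's initial differences are set, every further table entry is obtained by additions alone.

    def difference_table(initial_differences, steps):
        """Tabulate a polynomial at 0, 1, 2, ... using only additions.

        initial_differences: [f(0), first difference, second difference, ...]
        """
        diffs = list(initial_differences)
        values = []
        for _ in range(steps):
            values.append(diffs[0])
            # Add each difference into the entry above it to advance one step.
            for i in range(len(diffs) - 1):
                diffs[i] += diffs[i + 1]
        return values

    # f(x) = x**2: f(0) = 0, first difference 1, constant second difference 2.
    assert difference_table([0, 1, 2], 6) == [0, 1, 4, 9, 16, 25]

Because only addition is required, the scheme suits a purely mechanical machine driven by a fixed instruction program.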

Babbage referred to his invention as an "analytical engine"; the features aimed for in this design demonstrated remarkable foresight, although their true significance was not fully acknowledged until more than a century later. The blueprints for the analytical engine outlined a parallel decimal computer that would work with numbers (referred to as words) consisting of 50 decimal digits. Additionally, it would have a memory capacity capable of storing 1,000 such numbers. This machine would have incorporated all the essential operations required by a modern general-purpose computer, including the crucial ability for "conditional control transfer", allowing instructions to be executed in any desired order rather than just numerically sequential. The analytical engine was intended to utilize punched cards, similar to those employed in a Jacquard loom, which could be read into the machine from various reading stations. It was designed to function automatically, powered by steam and requiring only one operator. Unfortunately, Babbage's computers were never completed.

His failure is often attributed to the lack of precision machining techniques available at the time. Another theory suggests that Babbage was working on a problem whose solution was not urgently needed in 1840. Following Babbage's work, there was a temporary decline in interest in automatic digital computers. Between 1850 and 1900, however, significant progress was made in mathematical physics. It became clear that differential equations could characterize the majority of observable dynamic phenomena, making readily available methods for their solution, and for other calculus problems, a necessity. Additionally, the availability of steam power sparked advances in manufacturing, transportation, and commerce, resulting in a period of notable engineering accomplishments.

The designing of railroads and the construction of steamships, textile mills, and bridges required differential calculus to determine quantities such as centers of gravity, centers of buoyancy, moments of inertia, and stress distributions. Additionally, the evaluation of the power output of a steam engine required practical mathematical integration. As a result, there was a strong need for a machine capable of quickly performing repetitive calculations.

Hollerith

A significant advancement in automating computation came with the introduction of punched cards, which were first used for computing by Herman Hollerith and James Powers in 1890 at the U.S. Census Bureau.

Devices were created to automatically read punched-card information without human intervention, reducing reading errors, increasing workflow, and allowing punched cards to be used as a memory store. Different problems could be stored on separate batches of cards. This caught the attention of commercial interests, leading to improved punch-card business machine systems by companies like IBM, Remington-Rand, and Burroughs. These systems utilized electromechanical devices powered by electricity to generate mechanical motion. They were capable of automatically feeding in a set number of cards, performing operations like addition, multiplication, and sorting, and producing punched cards with results.

The machines were slow, processing around 50 to 250 cards per minute, with each card holding up to 80 decimal numbers. Despite their slowness, punched cards represented a significant advancement at the time.

Automatic Digital Computers

During the late 1930s, punched-card machine techniques were reliable and widely used. Various research groups worked to develop automatic digital computers. Howard Hathaway Aiken led an IBM team that constructed a promising machine using standard electromechanical parts. This machine, known as the Harvard Mark I, could handle 23-decimal-place numbers (words) and perform all four arithmetic operations.

Additionally, the Mark I computer had special built-in programs, also known as subroutines, for handling logarithms and trigonometric functions. Originally, the Mark I was controlled by prepunched paper tape, which could not be reversed, so automatic "transfer of control" instructions could not be programmed. Output from the Mark I was generated on a card punch and an electric typewriter. Although it used IBM rotating counter wheels as well as electromagnetic relays, the Mark I was classified as a relay computer. It was slow, requiring 3 to 5 seconds for a multiplication, but it was fully automatic and capable of completing extensive computations. The Mark I was the first of a series of computers designed and constructed under Aiken's direction.

Electronic Digital Computers

The military had an urgent requirement for computing capability during World War II, particularly for new weapons systems that lacked necessary data such as trajectory tables. To address this need, J. Presper Eckert, John W. Mauchly, and their team at the Moore School of Electrical Engineering at the University of Pennsylvania made the decision in 1942 to construct a high-speed electronic computer.

The machine known as ENIAC, short for Electronic Numerical Integrator and Computer (or Calculator), had a numerical word size of 10 decimal digits. It was capable of multiplying two numbers at a rate of 300 products per second by retrieving values from a multiplication table stored in its memory. Despite being challenging to operate, ENIAC was significantly faster than the previous generation of relay computers. ENIAC utilized 18,000 standard vacuum tubes, occupied 167.3 m² (1,800 ft²) of floor space, and consumed approximately 180,000 watts of electrical power.

The ENIAC, an early electronic digital computer, featured punched-card input and output. It had a multiplier, a combined divider and square rooter, and 20 adders that used "ring counters" for quick-access storage. To execute a program, separate units of the ENIAC were connected together to create a specific flow of computations. This required rewiring the connections for each new problem, as well as setting function tables and switches. While this manual instruction technique was inconvenient, the ENIAC was efficient for its designated programs. It is widely recognized as the first successful high-speed electronic digital computer (EDC) and was in use from 1946 to 1955. A controversy arose in 1971 regarding the patentability of ENIAC's basic digital concepts, with claims made by another U.S. entity.

Physicist John V. Atanasoff had already used similar ideas in a vacuum-tube device he created at Iowa State College in the 1930s. In 1973, the court ruled in favor of the company that utilized the Atanasoff claim.

The Modern "Stored Program" EDC

Inspired by the success of ENIAC, mathematician John von Neumann conducted a theoretical study in 1945 on computation. His study showed that a computer could have a simple and fixed physical structure yet still be capable of effectively executing any type of computation through proper programmed control, without requiring changes to the hardware. Von Neumann's contributions revolutionized the organization and construction of practical fast computers. These ideas, referred to as the stored-program technique, became essential for future generations of high-speed digital computers.

The stored-program technique incorporates multiple elements of computer design and function, enabling high-speed operation. While the specifics are not included here, considering the implications of performing 1,000 arithmetic operations per second provides insight. If each instruction in a job program were only used once in sequential order, no human programmer could generate enough instructions to keep the computer occupied. Therefore, subroutines are utilized within the job program, allowing for repetition based on computation progress. In addition, it would be advantageous to modify instructions during a computation to alter their behavior. Von Neumann addressed these needs by introducing conditional control transfer as a special machine instruction, enabling interruption and reinitiation of the program sequence at any point. Furthermore, all instruction programs and data are stored in the same memory unit, permitting arithmetic modification of instructions just like data.
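
To make the stored-program idea concrete, here is a minimal sketch, in Python, of a machine in which instructions and data occupy the same memory and a conditional transfer of control is available. The instruction format and operation names are invented for illustration; they are not taken from von Neumann's design.

    def run(memory):
        """Interpret a tiny stored program.  Instructions (tuples) and data
        (plain integers) sit in the same list, so one memory holds both."""
        pc = 0                                   # program counter
        while True:
            op, a, b = memory[pc]                # fetch the instruction at pc
            if op == "add":                      # memory[a] = memory[a] + memory[b]
                memory[a] += memory[b]
            elif op == "jump_if_positive":       # conditional control transfer
                if memory[a] > 0:
                    pc = b
                    continue
            elif op == "halt":
                return memory
            pc += 1

    # Program: sum the integers 5, 4, 3, 2, 1 by looping through two instructions.
    memory = [
        ("add", 5, 4),                # 0: total += n
        ("add", 4, 6),                # 1: n += -1
        ("jump_if_positive", 4, 0),   # 2: if n > 0, transfer control back to 0
        ("halt", 0, 0),               # 3: stop
        5,                            # 4: n
        0,                            # 5: total
        -1,                           # 6: the constant -1
    ]
    assert run(memory)[5] == 15

Because the loop body is stored once and reentered through the conditional jump, a handful of instructions can drive an arbitrarily long computation, which is the point made above about subroutines and repetition; and since instructions live in ordinary memory cells, they could themselves be modified arithmetically during the run.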

Computing and programming were enhanced in terms of speed, flexibility, and efficiency by utilizing subroutines to perform extensive computational tasks. Instead of repetitively reprogramming frequently used subroutines for every new problem, they could be stored in libraries and loaded into memory when necessary. Consequently, a significant portion of a program could be constructed using these subroutine libraries. The computer memory itself acted as a workspace where various parts of a lengthy computation were stored, processed incrementally, and ultimately combined to produce the final results. The computer control functioned as a coordinator for the entire process. Once the benefits of these techniques were recognized, they became widely adopted as standard practice.

In 1947, the first generation of modern programmed electronic computers to take advantage of these improvements began to appear. This group included computers with random access memory (RAM), a type of memory designed to give nearly constant access to any particular piece of information. These machines had input and output devices using punched cards or punched tape, and their RAMs had a capacity of 1,000 words with an access time of 0.5 microseconds (0.5 × 10^-6 sec). Some of them could perform multiplications in 2 to 4 microseconds. In terms of size, they were much smaller than ENIAC: some were about the size of a grand piano and required 2,500 small electron tubes, significantly fewer than the earlier machines.

The first-generation stored-program computers, such as EDVAC and UNIVAC, required significant maintenance and had a reliability rate of approximately 70% to 80%. Despite their limitations, these computers remained in service for 8 to 12 years. They were initially programmed in machine language, but by the mid-1950s more advanced programming languages were beginning to emerge.

In the 1950s, two significant engineering discoveries changed the image of the field. The magnetic-core memory and the transistor circuit element brought enhanced reliability and increased capability, and these discoveries quickly found their way into new models of digital computers. By the early 1960s, commercially available machines had RAM capacities ranging from 8,000 to 64,000 words and access times of 2 or 3 microseconds.

These expensive machines were commonly found in large computer centers operated by industry, government, and private laboratories. They required a significant investment to purchase or rent and were even more costly to operate because of expanding programming costs. To make the most of their capabilities, modes of operation were developed that allowed many users to share access to them. One such mode was batch processing, in which problems were prepared in advance and stored on inexpensive storage media such as magnetic drums, magnetic-disk packs, or magnetic tapes. When one problem was completed, the computer would transfer the program and results to one of these peripheral storage units and move on to a new problem. Another way these fast and powerful machines were shared was through time-sharing, in which the computer processed many waiting jobs in rapid succession, cycling through them quickly enough that each job made steady progress and every customer was kept satisfied.

Such modes of operation necessitated complex "executive" programs to manage the administration of the different tasks.
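
The batch and time-sharing modes described above amount to a scheduling policy enforced by the executive program. As a rough illustration only (the job list, time slice, and function name are invented, not taken from any historical system), a round-robin time-sharing loop can be sketched like this:

    from collections import deque

    def time_share(jobs, slice_size):
        """Run each job a little at a time, cycling through the waiting jobs.

        jobs: list of (name, work_units) pairs; slice_size: units per turn.
        Returns the order in which time slices were granted."""
        queue = deque(jobs)
        log = []
        while queue:
            name, remaining = queue.popleft()
            log.append(name)                      # this job gets the processor
            remaining -= slice_size
            if remaining > 0:
                queue.append((name, remaining))   # not finished: back of the line
        return log

    # Three jobs of unequal length each make steady progress.
    print(time_share([("payroll", 3), ("billing", 1), ("inventory", 2)], 1))
    # ['payroll', 'billing', 'inventory', 'payroll', 'inventory', 'payroll']

The design choice is the one the essay describes: rather than letting one long job monopolize the machine, the executive grants every waiting job frequent short turns so all of them advance.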

Advancements in the 1960s

In the 1960s, efforts to create the fastest and most capable computers culminated in the LARC machine, built by the Sperry-Rand Corporation for the Livermore Radiation Laboratories of the University of California, and the Stretch computer created by IBM. The LARC had a core memory of 98,000 words and a multiplication time of 10 microseconds. The Stretch had multiple levels of memory, with slower access for the larger-capacity levels; the fastest access time was under 1 microsecond and the total capacity was around 100 million words. During this time, the major computer manufacturers began offering a range of computer capabilities and costs, along with various peripheral equipment: input means such as consoles and card feeders; output means such as page printers, cathode-ray-tube displays, and graphing devices; and optional storage devices such as magnetic tape and magnetic disks for file storage. These technologies found extensive use in business for applications like accounting, payroll, inventory control, supply ordering, and billing.

Central processing units (CPUs) for these purposes did not require high arithmetic speed and were mainly used for accessing large amounts of file records and keeping them updated. Many computer systems were designed for more modest uses, like tracking patient records, medications, and treatments in hospitals. They were also utilized in automated library systems such as MEDLARS and the Chemical Abstracts system, which now stores computer records of almost all known chemical compounds.

Advancements in the Later Years

In the 1970s, there was a shift away from highly powerful centralized computational centers towards a wider range of applications for more affordable computer systems. Nowadays, most continuous-process manufacturing sectors such as petroleum refining and electrical-power distribution systems rely on computers with relatively lower capabilities to control and regulate their operations.

In the 1960s, the difficulty of programming application problems was an obstacle to the self-sufficiency of medium-sized computer installations. Advances in programming languages have since removed these barriers; there are now application languages focused on controlling various manufacturing processes, operating machine tools, and performing other tasks. Additionally, a revolution in computer hardware occurred through the miniaturization of computer logic circuits and the manufacture of components using large-scale integration (LSI) techniques. In the 1950s, it was realized that reducing the size of electronic digital computer circuits and parts would enhance speed and efficiency, improving overall performance once suitable manufacturing methods became available. Around 1960, significant progress was made in photoprinting conductive circuit boards to eliminate wiring, enabling resistors and capacitors to be incorporated into the circuitry by photographic means (see printed circuit).

During the 1970s, vacuum deposition of transistors became widespread, resulting in the availability of entire assemblies like adders, shifting registers, and counters on small "chips." This era also saw the introduction of programmable minicomputers by various companies, even those new to the computer industry. These minicomputers were accompanied by software packages. As the trend of size reduction continued, personal computers were introduced. These programmable machines were small and affordable enough for individuals to purchase and use. Successful personal computers were launched by companies like Apple Computer and Radio Shack. The development of these small computers rapidly expanded, propelled in part by the popularity of computer games. In the 1980s, very large-scale integration (VLSI) emerged as a common practice, enabling the placement of hundreds of thousands of transistors on a single chip.

During the 1990s, the Japanese government had a plan to create a new generation of supercomputers, called the fifth generation, that would utilize new technologies in large-scale integration. However, this project was ultimately abandoned. At the same time, the success of personal computers drove rapid advances in microprocessor technology, competition in the computer industry increased, and computing power became more affordable. Microprocessors began to take on additional roles such as process control, testing, monitoring, and diagnostic tasks. These changes forced the entire computer industry to make significant adjustments during the 1990s.

Long-established and recent leaders in the industry were downsizing their workforce, closing factories, and divesting subsidiaries. Concurrently, competition in the hardware sector heightened, and personal computer manufacturers multiplied, along with specialty companies specializing in specific areas of production, distribution, or customer service. Computers are progressively shrinking in size to enhance usability in office environments, schools, and residences. While programming productivity has not kept pace with this advancement, software has become a significant expense for many systems. Nevertheless, techniques like object-oriented programming have been devised to mitigate this challenge. As a whole, the computer industry continues to witness remarkable expansion.

As computer and telecommunications technologies continue to merge, various applications such as computer networking, electronic mail, and electronic publishing have become more advanced. However, the Internet has experienced the most significant growth in recent years, impacting software manufacture and other related fields.
