Dawn of the Digital Age

The beginning of the Digital Age is upon us.

The history of computers can be traced back to the birth of the abacus, which occurred approximately two thousand years ago. An abacus is a wooden device consisting of two horizontal wires with beads strung on them. By manipulating these beads according to memorized "programming" rules, users were able to solve various arithmetic problems.

Blaise Pascal is usually credited with building the first digital computer in 1642. The device, created to help his father, a tax collector, used dials for entering numbers. In 1694, Gottfried Wilhelm von Leibniz built a machine that could add directly and, with some of its components repositioned, multiply. Leibniz also invented a unique stepped-gear mechanism for entering the digits of the number to be added, a mechanism that remained in use in mechanical calculators well into the twentieth century.

The prototypes created by Pascal and Leibniz were not widely used and were regarded as little more than curiosities until, over a century later, Charles Xavier Thomas invented the first commercially practical mechanical calculator. Thomas's calculator could add, subtract, multiply, and divide. Numerous refinements to desktop calculators followed, including features such as accumulating partial results, storing and automatically recalling past results (memory functions), and printing results. These improvements, however, catered primarily to commercial users rather than to the scientific community.

While desktop calculators were being refined, a series of remarkable developments in computing was under way in Cambridge, England. As early as 1812, Charles Babbage, who would later become Lucasian Professor of Mathematics at Cambridge, recognized that lengthy calculations, especially those required for constructing mathematical tables, consisted of repetitive, predictable actions. This led him to believe that it should be possible to automate them.

Babbage developed an automatic mechanical calculating machine called the Difference Engine. He had a working model by 1822 and received financial assistance from the British government to begin fabrication in 1823. The Difference Engine was intended to be steam powered, fully automatic, and capable of printing tables. It operated based on a fixed instruction program. Despite its limited adaptability and applicability, the Difference Engine represented a significant advancement. Babbage dedicated the next decade to its development, but in 1833, he became interested in what he believed was a superior concept. He envisioned a general-purpose, fully program-controlled, automatic mechanical digital computer called the Analytical Engine. Although ahead of its time, the revolutionary design of the Analytical Engine would not be fully recognized until a century later.
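
To make the principle concrete, the following minimal Python sketch shows the method of finite differences that the Difference Engine mechanized: once a polynomial's initial values and differences are seeded, every further table entry is produced by repeated addition alone. The function and the sample polynomial (x² + x + 41, one Babbage reportedly used in demonstrations) are illustrative, not a reconstruction of the engine.

```python
# Tabulating a polynomial by the method of finite differences: after
# the initial values are seeded, only additions are needed -- the
# repetitive, predictable actions Babbage set out to mechanize.

def tabulate(f, start, count, order=3):
    # Seed: the first (order + 1) values, then successive differences.
    work = [f(start + i) for i in range(order + 1)]
    diffs = [work[0]]
    for _ in range(order):
        work = [b - a for a, b in zip(work, work[1:])]
        diffs.append(work[0])
    # The "engine" part: generate the table by repeated addition only.
    table = []
    for _ in range(count):
        table.append(diffs[0])
        for i in range(order):
            diffs[i] += diffs[i + 1]
    return table

# x^2 + x + 41, a polynomial Babbage reportedly used in demonstrations
print(tabulate(lambda x: x * x + x + 41, 0, 8))
# [41, 43, 47, 53, 61, 71, 83, 97]
```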

The plans called for a decimal machine operating on numbers of fifty decimal digits, with a memory capacity of one thousand such numbers. It was to have all the operations of a modern general-purpose computer, including a conditional control transfer capability that allowed commands to be executed in an order determined by the computation itself, not merely the sequence in which they were written. The Analytical Engine was to read its instructions from punched cards, similar to those used in the Jacquard loom, fed in from various reading stations. The machine was to operate automatically, powered by steam, and to require only one attendant.

Babbage's computers were never completed, and several theories have been offered for his failure. Most attribute it to the lack of precision machining techniques of his day; others speculate that he was addressing a problem for which there was little demand in 1840. After Babbage's efforts there was a temporary decline in interest in automatic digital computers.

Between 1850 and 1900, significant progress was made in mathematical physics. It was found that most observable dynamic phenomena could be described by differential equations, meaning that most natural events could be quantified or modelled with equations; this, in turn, spurred the search for simpler and faster methods of computation.

Additionally, the introduction of steam power resulted in the growth of manufacturing, transportation, and commerce, leading to significant advancements in engineering. Differential calculus was necessary for designing railroads, steamships, textile mills, and bridges as it helped calculate crucial factors like center of gravity, center of buoyancy, moment of inertia, and stress distributions. Mathematical integration was also used to assess the power output of steam engines. Consequently, there emerged a pressing demand for a machine capable of efficiently executing numerous repetitive calculations.
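
As a purely illustrative example of the kind of repetitive calculation involved, the sketch below applies the trapezoidal rule to a hypothetical steam-engine indicator diagram (pressure versus volume) to estimate the work done per stroke; the data points are invented for the example.

```python
# Estimating work per stroke from an indicator diagram (pressure vs.
# volume) with the trapezoidal rule -- the repetitive numerical
# integration described in the text. All data below are hypothetical.

volumes = [0.10, 0.15, 0.20, 0.25, 0.30]          # cubic metres
pressures = [400e3, 310e3, 250e3, 210e3, 180e3]   # pascals

work = 0.0
for i in range(len(volumes) - 1):
    dv = volumes[i + 1] - volumes[i]
    work += 0.5 * (pressures[i] + pressures[i + 1]) * dv  # trapezoid
print(f"Estimated work per stroke: {work / 1000:.2f} kJ")  # 53.00 kJ
```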

The development of punched-card machines by Herman Hollerith and, later, James Powers, both working for the U.S. Census Bureau (Hollerith's machines were first used for the 1890 census), was a significant advance in automated computing. Their devices could read information punched into cards without human assistance, which greatly reduced reading errors, increased workflow, and allowed stacks of punched cards to serve as a virtually unlimited memory store. In addition, complex equations and data could be kept on separate stacks of cards and accessed as required.
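
The sketch below illustrates the idea of machine-readable cards, assuming the simplest numeric Hollerith convention, in which a hole punched in row n of a column encodes the digit n; alphabetic zone punches and real card dimensions are deliberately omitted.

```python
# Reading a numeric punched card without human help: model each of a
# card's columns as the set of punched rows; in plain Hollerith digit
# coding, a hole in row n means the digit n. (Zone punches omitted.)

def read_card(columns):
    digits = []
    for punched in columns:
        if len(punched) != 1:
            raise ValueError("not a plain numeric column")
        digits.append(next(iter(punched)))
    return digits

card = [{1}, {8}, {9}, {0}]    # four columns punched 1, 8, 9, 0
print(read_card(card))         # [1, 8, 9, 0]
```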

Commercial companies quickly recognized these benefits and pioneered the development of more advanced punched-card machines, among them International Business Machines (IBM), Remington Rand, Burroughs, and other corporations. These machines were electromechanical: electrical power produced mechanical motion. They offered automated card feeding as well as addition, multiplication, and sorting, and they could produce punched result cards as output.

Compared with modern machines, these computers were slow, typically processing 50 to 220 cards per minute, each card holding about 80 decimal digits or characters. Nevertheless, punched cards were an enormous step forward, providing a means of input/output and of large-scale memory storage. For more than fifty years after their first use, punched-card machines carried the bulk of the world's business computing and a substantial share of its scientific computing.

During World War II the military had a pressing need for computing capacity, especially for producing ballistic trajectory tables and other essential data for new weapons. To meet this need, J. Presper Eckert, John W. Mauchly, and their team at the Moore School of Electrical Engineering of the University of Pennsylvania set out to build a high-speed electronic computer. The machine became known as ENIAC (Electronic Numerical Integrator And Computer).

ENIAC's numerical "word" was ten decimal digits, and by using a multiplication table stored in its memory it could multiply two such numbers at a rate of three hundred per second. ENIAC was therefore roughly a thousand times faster than the relay computers that preceded it.
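
A rough sense of table-lookup multiplication can be had from the Python sketch below, which multiplies decimal numbers digit by digit using a precomputed ten-by-ten product table, so that the procedure itself involves only lookups, shifts, and additions. This is an illustration of the idea, not a model of ENIAC's actual circuitry.

```python
# Multiplying with a stored multiplication table: every digit product
# is a lookup in a 10x10 table, so the rest of the work reduces to
# shifting by place value and adding. Not a model of ENIAC hardware.

TABLE = [[a * b for b in range(10)] for a in range(10)]  # built once

def multiply(x, y):
    xd = [int(d) for d in str(x)][::-1]   # least significant digit first
    yd = [int(d) for d in str(y)][::-1]
    total = 0
    for i, dx in enumerate(xd):
        for j, dy in enumerate(yd):
            total += TABLE[dx][dy] * 10 ** (i + j)
    return total

assert multiply(1234567890, 987654) == 1234567890 * 987654
```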

ENIAC used 18,000 vacuum tubes, occupied about 1,800 square feet of floor space, and consumed around 180,000 watts of electrical power. It had punched-card input and output, one multiplier, one divider/square-rooter, and twenty adders built from decimal ring counters, which served both as adders and as quick-access read-write register storage. The executable instructions of a program were held in separate "units" of ENIAC, which were plugged together to form a "route" for the flow of information.

Redoing these connections, as well as resetting the function tables and switches, was necessary after each computation. This "wire-your-own" technique was inconvenient and made ENIAC programmable only in a limited sense, yet the machine was efficient at running the specific programs for which it had been designed.
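
The decade ring counters mentioned above can be sketched in a few lines of Python: a ring of ten states stores one decimal digit, each incoming pulse advances it one position, and wrapping past nine sends a carry pulse to the next ring. The classes and pulse-per-unit addition below are a simplification for illustration, assuming nothing about ENIAC's actual electronics.

```python
# The idea behind ENIAC's decade ring counters: a ring of ten states
# holds one decimal digit; a pulse advances the ring, and wrapping
# past 9 carries into the next ring. A simplification, not circuitry.

class DecadeRing:
    def __init__(self):
        self.digit = 0                    # which of ten positions is on

    def pulse(self):
        """Advance one position; return True when a carry is produced."""
        self.digit = (self.digit + 1) % 10
        return self.digit == 0

def add(rings, n):
    """Add n to a little-endian list of rings, one pulse per unit."""
    for _ in range(n):
        carry, i = True, 0
        while carry and i < len(rings):
            carry = rings[i].pulse()
            i += 1

accumulator = [DecadeRing() for _ in range(3)]   # three-digit register
add(accumulator, 275)
print([r.digit for r in accumulator])            # [5, 7, 2], i.e. 275
```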

ENIAC is generally recognized as the first successful high-speed electronic digital computer (EDC), and it was in productive use from 1946 to 1955. In 1971, however, a controversy developed over the patentability of ENIAC's basic digital concepts: it was claimed that another physicist, John V. Atanasoff, had already used the same ideas in a simpler vacuum-tube device. In 1939, Atanasoff and Clifford Berry of Iowa State College had completed a prototype digital machine that could store data and perform binary addition and subtraction; their work toward a more advanced model was halted by the outbreak of World War II. In 1973 the courts found in favor of the company defending Atanasoff's claim.

In 1945 the mathematician John von Neumann, intrigued by the success of ENIAC, undertook a study of computation. He concluded that a computer could have a simple, fixed physical structure and yet carry out any kind of computation by means of properly programmed control, without any need to modify the hardware itself.

Von Neumann contributed a new understanding of how practical, fast computers should be organized and built. These ideas, usually referred to as the stored-program technique, became essential to the generations of high-speed digital computers that followed and were universally adopted.

The stored-program technique embraces several features of computer design and operation that together make high speed possible. Consider what it means to perform a thousand operations per second: if each instruction in a job program were used only once, in consecutive order, no human programmer could generate enough instructions to keep the computer continually occupied.

Parts of a job program (subroutines) must therefore be arranged so that they can be used repeatedly, in an order that depends on the variables of the computation. It is also useful to be able to alter instructions during a computation so that they behave differently on different passes. Von Neumann met these needs by making conditional control transfer a standard type of machine instruction, so that the program sequence could be interrupted and resumed at any point, and by storing instruction programs and data together in the same memory unit. Instructions could then be modified arithmetically, just like data, whenever necessary.
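
A toy interpreter makes these two ideas concrete: in the sketch below, instructions and data share one memory, and a conditional transfer instruction lets the same few instructions be reused as a loop. The instruction set is invented for illustration (arithmetic self-modification of instructions is omitted for brevity).

```python
# A toy stored-program machine: instructions and data live in one
# memory, and a conditional control transfer lets the same code run
# repeatedly. The instruction set is invented for this illustration.

memory = [
    ("load", 8),                  # 0: acc <- mem[8]
    ("add", 9),                   # 1: acc <- acc + mem[9]
    ("store", 8),                 # 2: mem[8] <- acc
    ("jump_if_less", 8, 10, 0),   # 3: if mem[8] < mem[10], go to 0
    ("halt",),                    # 4: stop
    None, None, None,             # 5-7: unused
    0,                            # 8: running total (data)
    5,                            # 9: increment (data)
    25,                           # 10: limit (data)
]

acc, pc = 0, 0
while memory[pc][0] != "halt":
    op = memory[pc]
    if op[0] == "load":
        acc, pc = memory[op[1]], pc + 1
    elif op[0] == "add":
        acc, pc = acc + memory[op[1]], pc + 1
    elif op[0] == "store":
        memory[op[1]], pc = acc, pc + 1
    elif op[0] == "jump_if_less":
        pc = op[3] if memory[op[1]] < memory[op[2]] else pc + 1

print(memory[8])  # 25: the same four instructions ran five times
```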

Thanks to these techniques, computing and programming experienced significant improvements in terms of speed, flexibility, and efficiency. Instead of having to reprogram frequently used subroutines for each new program, they could now be stored in "libraries" and loaded into memory only when necessary. This allowed for a substantial portion of a program to be constructed using the subroutine library.

The all-purpose computer memory became the assembly place where all parts of a long computation were stored, worked on piece by piece, and put together to form the final results; the computer control served merely as a "messenger" for the overall process. As soon as the advantages of these techniques became clear, they became standard practice.

The first generation of modern programmed electronic computers to take advantage of these improvements appeared in 1947. These machines used random access memory (RAM), a memory designed to give nearly constant access to any particular piece of information. They had punched-card or punched-tape input/output and RAM capacities of one thousand words, with access times of about one-half microsecond; some could perform multiplications in two to four microseconds. Physically they were much smaller than ENIAC, some being about the size of a grand piano and using perhaps 2,500 electron tubes, far fewer than ENIAC required. These first-generation stored-program computers required considerable maintenance, attained a reliability of operation of around seventy to eighty percent, and were used for eight to twelve years. Usually programmed in machine language (ML), they benefited by the mid-1950s from advances in several aspects of programming. This group of machines included EDVAC and UNIVAC, the latter becoming the first commercially available computer.

Early in the 1950s, two important engineering discoveries, magnetic core memory and the transistor circuit element, transformed the electronic computer field. These advances improved the reliability and capability of hardware, and digital computers quickly incorporated them; by the 1960s, commercially available machines offered RAM capacities of up to sixty-four thousand words, with access times of two to three microseconds. These machines were expensive to purchase or rent, and expensive to program; they were therefore found mostly in large computer centers operated by industry, government, and private laboratories that could afford the cost and the support personnel. This situation led to modes of operation that allowed the powerful machines to be shared.

Batch processing is a mode in which problems are prepared and held on a storage medium like magnetic drums, magnetic disk packs, or magnetic tapes for computation. Once a problem is finished, the computer dumps the entire problem, including the program and results, onto one of these peripheral storage units before moving on to a new problem.
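
A minimal sketch of batch operation, with invented jobs standing in for prepared problems and a list standing in for peripheral storage:

```python
# Batch processing: jobs queue up, each runs to completion with no
# interleaving, and the finished problem is dumped to "storage"
# before the next begins. The jobs and storage here are stand-ins.

from collections import deque

batch = deque([
    ("payroll", lambda: sum(range(100))),
    ("inventory", lambda: 7 * 6),
])

storage = []                        # stands in for drum, disk, or tape
while batch:
    name, run = batch.popleft()
    result = run()                  # run to completion, uninterrupted
    storage.append((name, result))  # dump the results, move on
print(storage)  # [('payroll', 4950), ('inventory', 42)]
```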

Another method of accessing these fast, powerful machines was called time-sharing. In time-sharing, the computer processes many jobs in rapid succession, each job running in turn, so that every "customer" appears to be served independently. This mode of operation required complex executive programs to manage the scheduling of the different tasks.
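
By contrast with batch operation, the round-robin sketch below gives each job a short slice in turn, with Python generators standing in for interruptible jobs and the loop playing the role of the executive program; all names are invented for the example.

```python
# Time-sharing: the machine cycles rapidly among jobs, giving each a
# short slice so every "customer" sees steady progress. Generators
# stand in for interruptible jobs; the loop is a toy executive.

def job(name, steps):
    for i in range(1, steps + 1):
        yield f"{name}: step {i} of {steps}"

ready = [job("alice", 3), job("bob", 2), job("carol", 4)]
while ready:
    running, ready = ready, []
    for j in running:
        try:
            print(next(j))     # one time slice for this job
            ready.append(j)    # not finished: back of the line
        except StopIteration:
            pass               # job complete; drop it
```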

During the 1960s, the pursuit of the fastest and most powerful computers reached a milestone with the completion of the LARC machine, commissioned by the Livermore Radiation Laboratories of the University of California and built by the Sperry-Rand Corporation, and of IBM's Stretch computer. The LARC had a core memory of 98,000 words and could multiply in ten microseconds. Stretch was provided with several levels of memory, the slower-access levels offering the larger capacities; its fastest access time was less than one microsecond, and its total capacity was on the order of 100 million words.

During this time, computer manufacturers started providing a variety of options and price ranges, along with additional features like consoles, card feeders, page printers, cathode-ray-tube displays, and graphing devices. These machines were extensively utilized by businesses for tasks such as accounting, payroll, inventory control, supply ordering, and billing.

The CPUs used for these purposes did not need high arithmetic speed; they were employed chiefly for rapid access to large files and for keeping those files accurate. Most computer systems were marketed for such modest applications: hospitals used them to manage patient records, medications, and treatments, and they served libraries, as in the National Medical Library retrieval system, and the Chemical Abstracts System, whose computer records now encompass nearly all known chemical compounds.

The trend in the 1970s was to shift from powerful, single-purpose computers to more versatile and affordable computer systems. Many industries, including petroleum refining and electrical power distribution, started using smaller computers to control and regulate their operations.

During the 1960s, applying medium-sized on-site computers to such problems was hampered by programming difficulties, but advances in application programming languages removed these obstacles, enabling computers to control manufacturing processes and machine tools. Meanwhile, a revolution in computer hardware was under way, reducing the size of computer-logic circuitry through large-scale integration (LSI) techniques. In the 1950s, researchers had recognized that scaling down electronic digital circuits and parts would increase speed and efficiency and improve overall performance. Around 1960, photo printing of conductive circuit boards eliminated hand wiring and allowed resistors and capacitors to be fabricated as part of the circuitry. In the 1970s, vacuum-deposited transistors became standard practice, making complete assemblies, including adders, shift registers, and counters, available on tiny "chips."

During the 1980s, there was a growing trend of very large scale integration (VLSI) where a single chip would contain hundreds of thousands of transistors. In addition, various companies started offering programmable minicomputers along with software packages. This trend of "shrinking" technology continued with the introduction of personal computers (PCs) that were not only small in size but also affordable enough for individuals to buy and use.

In the late 1970s, companies such as Apple Computer and Radio Shack introduced highly successful PCs, helped along by the popularity of computer video games. Intel and Motorola were locked in intense competition in the production of semiconductor chips, while Japanese firms excelled in the field of memory chips. By the late 1980s, some personal computers were driven by microprocessors capable of handling roughly four million instructions per second.

Microprocessors with read-only memory (ROM) now have the ability to perform a greater range of process-control, testing, monitoring, and diagnosing functions. These functions include tasks such as automobile ignition systems, automobile-engine diagnosis, and production-line inspection duties.

During the 1970s and 1980s, Cray Research and Control Data Inc. dominated the supercomputer field. Then, in the early 1980s, the Japanese government announced an ambitious plan to develop a new generation of supercomputers. This "fifth" generation was to incorporate new technologies such as very large-scale integration and new programming languages, and among its expected achievements was advanced artificial intelligence, including voice recognition.

Despite the significant advances in hardware, progress in software has not kept pace; because programming productivity has grown so slowly, software has become the principal cost of many systems. New programming techniques, such as object-oriented programming, have been devised to help meet this challenge. Even so, the cost per calculation continues to fall rapidly, and the convenience and efficiency of computers can only be expected to improve in the near future.
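
A small sketch of the object-oriented style, with invented classes: data and the code that operates on it are bundled together, and a shared interface lets new kinds of object be added without changing existing callers.

```python
# Object-oriented programming in miniature: each class bundles its
# data with the code that works on it, and callers depend only on
# the shared interface. The classes are invented for illustration.

import math

class Shape:
    def area(self):
        raise NotImplementedError

class Circle(Shape):
    def __init__(self, radius):
        self.radius = radius
    def area(self):
        return math.pi * self.radius ** 2

class Rectangle(Shape):
    def __init__(self, width, height):
        self.width, self.height = width, height
    def area(self):
        return self.width * self.height

# New shapes can be added without touching this loop.
for shape in (Circle(1.0), Rectangle(2.0, 3.0)):
    print(type(shape).__name__, shape.area())
```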

The computer field is still growing rapidly, with applications such as computer networking, computer mail, and electronic publishing advancing quickly. Continuous progress in technology yields ever cheaper and more powerful computers, which suggests that computers or terminals will soon be found in most, if not all, homes, offices, and schools.
