Business Processes and Information Systems Essay Example

Define IT infrastructure from both a technology and a service perspective. Which services does IT infrastructure comprise? IT infrastructure consists of the set of physical devices and software applications that are required to operate the entire enterprise. It is also a set of firmwide services budgeted by management and comprising both human and technical capabilities. These services include computing platforms used to provide computing services that connect employees, customers, and suppliers into a coherent digital environment, including large mainframes, desktop and laptop computers, PDAs, and internet appliances.

Telecommunications services that provide data, voice, and video connectivity to employees, customers, and suppliers. Data management services that store and manage corporate data and provide capabilities for analyzing the data. Applications software services that provide enterprise-wide capabilities such as enterprise resource planning, customer relationship management, supply chain management, and knowledge management systems that are shared by all business units. Physical facilities management services that develop and manage the physical installations required for computing, telecommunications, and data management services.

IT management services that plan and develop the infrastructure, coordinate IT services with the business units, manage accounting for IT expenditures, and provide project management services. IT education services that provide training in system use to employees and offer managers training in how to plan for and manage IT investments. IT research and development services that provide the firm with research on potential future IT projects and investments that could help the firm differentiate itself in the marketplace. List each of the eras in IT infrastructure evolution, and describe their distinguishing characteristics. Electronic Accounting Machine Period (1930 to 1950): specialized machines that could sort computer cards into bins, accumulate totals, and print reports. Software programs were hardwired into circuit boards, and they could be changed by altering the wired connections on a patch board.

There were no programmers, and a human machine operator was the operating system, controlling all system resources. General Purpose Mainframe and Minicomputer Period (1959 to present): commercial all-electronic vacuum tube computers appeared with the introduction of the UNIVAC computers and the IBM 700 series. This era produced the first commercial computers with powerful operating systems that could provide time sharing, multitasking, and virtual memory. Mainframe computers eventually became powerful enough to support thousands of online remote terminals connected to the centralized mainframe using proprietary communication protocols and proprietary data lines. The mainframe period was a period of highly centralized computing under the control of professional programmers and systems operators, with most elements of infrastructure provided by a single vendor, the manufacturer of the hardware and the software. This pattern began to change with the introduction of minicomputers by Digital Equipment Corporation (DEC) in 1965.

DEC offered powerful machines at lower prices than IBM mainframes, making possible decentralized computing, customized to the specific needs of individual departments or business units rather than time sharing on a single huge mainframe. Personal Computer Period (1981 to present): the first personal computers appeared in the 1970s, but personal computing was not widely adopted by Americans until the IBM PC in 1981. The PC first used the DOS operating system, a text-based command language, and later the Microsoft Windows operating system; the Wintel PC (Windows operating system with an Intel processor) became the standard desktop personal computer. About 95% of the world's estimated one billion computers use the Wintel standard.

The proliferation of PCs launched a spate of personal desktop productivity software tools, including word processors, spreadsheets, electronic presentation software, and small data management programs, that were valuable to both home and corporate users. Client/Server Period (1983 to present): in this era, desktop or laptop computers called clients are networked to powerful server computers that provide the client computers with a variety of services and capabilities. The computer processing work is split between these two types of machines. The client is the user's point of entry, while the server typically processes and stores shared data, serves up Web pages, or manages network activities.

This arrangement is called a two-tier client/server architecture. The term server refers to both the software application and the physical computer on which the network software runs. The server could be a mainframe, but today server computers are typically more powerful versions of personal computers. In a multitiered client/server architecture, the work of the entire network is balanced over several different levels of servers, depending on the kind of service being requested.

For example, a Web server, a computer that uses Web server software to house Web pages for a Web site, will serve a client in response to a request for service. If a client requests access to a corporate system (a product list or price information, for instance), the request is passed along to an application server. An application server handles all application operations between a user and an organization's back-end business systems.
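To make the two-tier split concrete, here is a minimal sketch in Python; the port number, product list, and wire protocol are hypothetical illustrations, not from the text. The client is the user's point of entry, and the server stores and serves the shared data.

```python
import socket
import socketserver
import threading

# Shared data held by the server tier; contents are a made-up example.
PRODUCTS = {"widget": "9.99", "gadget": "24.50"}

class PriceHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # Server side of the two-tier split: look up shared data and
        # send the result back to the requesting client.
        product = self.rfile.readline().strip().decode()
        self.wfile.write(f"{PRODUCTS.get(product, 'unknown')}\n".encode())

# Start the server tier, then let a client make one request against it.
server = socketserver.TCPServer(("localhost", 9090), PriceHandler)
worker = threading.Thread(target=server.handle_request)
worker.start()

# Client side: the user's point of entry into the system.
with socket.create_connection(("localhost", 9090)) as conn:
    conn.sendall(b"widget\n")
    print("price:", conn.makefile().readline().strip())  # price: 9.99

worker.join()
server.server_close()
```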

Client/server computing enables businesses to distribute computing work across a series of smaller, inexpensive machines that cost much less than minicomputers or centralized mainframe systems. The result is an explosion in computing power and applications throughout the firm.

What is Moore's Law and the Law of Mass Digital Storage? What aspects of infrastructure change do they help explain? Moore's Law is based on the observation that, since the first microprocessor chip was introduced in 1959, the number of components on a chip with the smallest manufacturing costs per component had doubled each year. There are three variations of Moore's Law: the power of microprocessors doubles every 18 months; computing power doubles every 18 months; and the price of computing falls by half every 18 months. Nanotechnology uses individual atoms and molecules to create computer chips and other devices that are thousands of times smaller than current technologies permit.
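The doubling arithmetic behind these three variations can be sketched in a few lines of Python; the ten-year horizon is just an illustrative choice.

```python
# Doubling every 18 months means a growth factor of 2 ** (months / 18).
def growth_factor(months: float, doubling_period: float = 18.0) -> float:
    return 2 ** (months / doubling_period)

years = 10  # illustrative horizon
factor = growth_factor(years * 12)
print(f"after {years} years, computing power grows about {factor:.0f}x")
# The price variation is the same curve inverted:
print(f"and the cost of a unit of computing falls to about 1/{factor:.0f}")
```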

The Law of Mass Digital Storage is the second technology driver of IT infrastructure change. The world produces as much as 5 exabytes (a billion gigabytes) of unique information per year, and the amount of digital information is roughly doubling every year. Almost all of this growth involves magnetic storage of digital data; printed documents account for only .003 percent of the annual growth. 5) How do network economics, declining communication costs, and technology standards affect IT infrastructure and the use of computers? Network Economics- Moore's Law and the Law of Mass Digital Storage help us understand why computing resources are now so readily available. The economics of networks and the growth of the internet provide some answers.

Robert Metcalfe, inventor of Ethernet local area network technology, claimed that the value or power of a network grows as a function of the number of network members. As the number of members in a network grows linearly, the value of the entire system grows exponentially and continues to grow as members increase. Demand for information technology has been driven by the social and business value of digital networks, which rapidly multiply the number of actual and potential links among network members.
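A quick numerical sketch of Metcalfe's claim: the number of potential links among n members is n(n-1)/2, so a proxy for the network's value grows far faster than its membership.

```python
# Members grow linearly; potential links (a proxy for network value)
# grow as n * (n - 1) / 2.
def potential_links(n: int) -> int:
    return n * (n - 1) // 2

for members in (10, 100, 1_000, 10_000):
    print(f"{members:>6} members -> {potential_links(members):>12,} potential links")
```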

Declining Communication Costs- the fourth driver of infrastructure change is the rapid decline in the cost of communication. An estimated 1.1 billion people worldwide now have internet access; 5% of private sector firms use email, and 100% of public sector organizations use it.

Communication costs are falling, and utilization of communication and computing facilities is increasing. To take advantage of this business value, firms must greatly expand their internet connections, including wireless connectivity, and greatly expand the power of their client/server networks, desktop clients, and mobile computing devices. Standards and Network Effects- today's infrastructure and internet computing would be impossible without agreements among manufacturers and widespread consumer acceptance of technology standards. Technology standards are specifications that establish the compatibility of products and their ability to communicate in a network. Standards unleash powerful economies of scale and result in price declines as manufacturers focus on products built to a single standard. Without these economies of scale, computing would be far more expensive.

The Wintel PC became the standard desktop and mobile client computing platform. List and describe the components of IT infrastructure that firms need to manage. Operating Systems Platform- at the client level, 95% of PCs and 45% of handheld devices use some form of Microsoft Windows operating system. In contrast, in the server marketplace, more than 85% of corporate servers in the US use some form of Linux, an inexpensive and robust open-source relative of Unix. Microsoft Windows is capable but is not generally used for networks with more than 300 client computers.

Linux and Unix can run on many different types of processors. The major providers of Unix are IBM, HP, and Sun. Enterprise and Other Software Applications- this software is the largest single component of IT infrastructure spending after telecommunications. A large portion of the software budget of medium and large companies is spent on enterprise system software. The largest providers of enterprise application software are SAP and Oracle. Data Management and Storage- enterprise data management software is responsible for organizing and managing the firm's data so that it can be efficiently accessed and used.

The leading database software providers are IBM (DB2), Oracle, Microsoft (SQL Server), and Sybase (Adaptive Server Enterprise). A growing new entrant is MySQL, a Linux open-source relational database product available for free on the internet and supported by HP. The physical data storage market is dominated by EMC Corporation.
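At toy scale, the job such data management software performs can be sketched with Python's built-in sqlite3 module standing in for the commercial products named above; the table and figures are hypothetical.

```python
import sqlite3

# In-memory database standing in for an enterprise DBMS such as DB2 or Oracle.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [("Acme", 1200.00), ("Globex", 450.50), ("Acme", 300.00)],
)
# "Efficiently accessed and used": ad hoc analysis of the firm's data via SQL.
for customer, total in conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer ORDER BY customer"
):
    print(customer, total)
conn.close()
```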

Storage area networks (SANs) connect multiple storage devices on a separate high-speed network dedicated to storage. The SAN creates a large central pool of storage that can be rapidly accessed and shared by multiple servers. The market for digital data storage devices has been growing at more than 15% annually over the last five years. Networking/Telecommunications Platforms- telecommunications services consist of telecommunications, cable, and telephone company charges for voice lines and internet access; these are counted as part of the firm's infrastructure. Telecommunications platforms are provided by telecommunications/telephone services companies that offer voice and data connectivity, wide area networking, and internet access.

Internet Platform- internet platforms overlap with, and must relate to, the firm's general network infrastructure and hardware and software platforms. Internet-related infrastructure expenditures include the hardware, software, and management services needed to support a firm's Web site (including Web hosting services) and for intranets and extranets. Web hosting services maintain a large Web server, or series of servers, and provide fee-paying subscribers with space to maintain their Web sites. This market is growing by 10% annually. The internet revolution led to an increase in server computers, with many firms collecting thousands of small servers to run their internet operations.

Since then there has been a steady push toward server consolidation, reducing the number of server computers by increasing the size and power of each. Consulting and System Integration Services- firms implementing a new IT infrastructure must make significant changes in business processes and procedures, training and education, and software integration, and consulting services help them with these tasks. Software integration means ensuring the new infrastructure works with the firm's older, so-called legacy systems and ensuring the new elements of the infrastructure work with one another. A legacy system is generally an older transaction processing system created for mainframe computers that continues to be used to avoid the high cost of replacement or redesign.

In the past, most firms relied on their accounting firms for this work, because they were the only ones who understood a company's business processes and had the expertise to change its software. 7) Compare grid computing and edge computing. Grid computing involves connecting geographically remote computers into a single network to create a virtual supercomputer by combining the computational power of all computers on the grid. Grid computing requires software programs to control and allocate resources on the grid, such as the open-source software provided by the Globus Alliance or by private providers. The business case for using grid computing involves cost savings, speed of computation, and agility.

For example, in processing seismic data for oil exploration, a grid adjusts to accommodate the fluctuating data volumes that are typical of such seasonal work while improving output quality and helping scientists pinpoint problems in finding new oil supplies.
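The grid idea can be sketched at toy scale in Python, with the standard multiprocessing module standing in for real grid middleware such as the Globus software, and a made-up numeric workload standing in for seismic processing.

```python
from multiprocessing import Pool

def process_chunk(chunk):
    # Stand-in for the heavy per-node computation a real grid would run.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::8] for i in range(8)]  # divide the work eight ways
    with Pool(processes=8) as grid:          # the "virtual supercomputer"
        partials = grid.map(process_chunk, chunks)
    print("combined result:", sum(partials))
```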

Edge computing is a multitier, load-balancing scheme for Web-based applications in which significant parts of Web site content, logic, and processing are performed by smaller, less expensive servers located near the user in order to increase responsiveness and resilience while lowering technology costs. Like grid computing and on-demand computing, edge computing is a technique for using the internet to share the workload experienced by a firm across many computers located remotely on the network. In an edge computing platform, edge servers initially process requests from the user's client computer. The edge servers deliver presentation components to the client, such as static Web page content, reusable code fragments, and interactive elements gathered on forms, while database and business logic elements are delivered by the enterprise computing platform.
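A minimal sketch of that division of labor follows; the paths, cache contents, and function names are hypothetical.

```python
# The edge tier holds presentation components near the user; anything that
# needs business logic or database access goes to the enterprise platform.
STATIC_CACHE = {"/index.html": "<h1>Welcome</h1>"}

def enterprise_platform(path: str) -> str:
    # Stand-in for the back-end tier serving database and logic elements.
    return f"dynamic response for {path}"

def edge_server(path: str) -> str:
    if path in STATIC_CACHE:
        return STATIC_CACHE[path]       # answered near the user
    return enterprise_platform(path)    # passed along to the enterprise tier

print(edge_server("/index.html"))       # served from the edge cache
print(edge_server("/prices?sku=42"))    # served by the enterprise platform
```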

How can businesses benefit from on-demand computing? From autonomic computing? On-demand computing refers to firms offloading peak demand for computing power to remote, large-scale data processing centers. In this manner, firms can reduce their investment in IT infrastructure by investing just enough to handle average processing loads and paying for only as much additional computing power as the market demands. Another term for on-demand computing is utility computing, which suggests that firms purchase computing power from central computing utilities and pay only for the amount of computing power they use, much as they would pay for electricity. In addition to lowering the cost of owning hardware resources, on-demand computing gives firms greater agility in their use of technology and greatly reduces the risk of over-investing in IT infrastructure.
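The economics can be illustrated with made-up numbers: size owned capacity for the average load and rent the difference at peaks, rather than owning peak capacity that sits idle most of the time.

```python
# All figures are hypothetical, chosen only to illustrate the trade-off.
OWNED_COST = 5_000   # annual cost of an owned server
RENTED_COST = 700    # annual-equivalent cost of renting one server for peaks

average_load, peak_load = 40, 100  # servers required

own_peak_capacity = peak_load * OWNED_COST
own_average_rent_peaks = (average_load * OWNED_COST
                          + (peak_load - average_load) * RENTED_COST)
print(f"own peak capacity:       ${own_peak_capacity:,}")
print(f"own average, rent peaks: ${own_average_rent_peaks:,}")
```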

On-demand computing shifts firms from having a fixed infrastructure to a flexible one, some of it owned by the firm and some of it rented. Autonomic Computing- this is an industry-wide effort to develop systems that can configure themselves, optimize and tune themselves, heal themselves when broken, and protect themselves from outside intruders and self-destruction. For example, a PC could recognize that it has been invaded by a virus and eradicate the virus, or alternatively turn its workload over to another processor and shut itself down before the virus destroys any files. Some of these capabilities are already available on desktop computers, such as firewalls and antivirus programs. A firewall is hardware or software placed between an organization's internal network and external networks to prevent outsiders from invading private networks.
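In miniature, a firewall is a rule check applied to traffic crossing the network boundary; the sketch below uses hypothetical addresses, ports, and rules.

```python
# Hypothetical rules: allow inbound web traffic, block a known bad host.
ALLOWED_PORTS = {80, 443}
BLOCKED_SOURCES = {"203.0.113.66"}

def permit(source_ip: str, dest_port: int) -> bool:
    # Traffic from the external network is dropped unless a rule allows it.
    if source_ip in BLOCKED_SOURCES:
        return False
    return dest_port in ALLOWED_PORTS

print(permit("198.51.100.7", 443))  # True: allowed web traffic
print(permit("203.0.113.66", 443))  # False: blocked source address
print(permit("198.51.100.7", 23))   # False: telnet is not allowed in
```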

Antivirus and firewall programs can be updated automatically as the need arises by connecting to an online virus protection service such as McAfee. How can businesses benefit from virtualization and multi-core processors? Virtualization is the process of presenting a set of computing resources so that they can all be accessed in ways that are not restricted by physical configuration or geographic location. Virtualization decreases power consumption and curbs hardware proliferation. Server virtualization enables companies to run more than one operating system at the same time on a single machine. Most servers run at just 10 to 15% of capacity, and virtualization can boost server utilization rates to 70% or higher. Higher utilization rates translate into fewer computers required to process the same amount of work.
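Those utilization figures imply a simple consolidation calculation; the fleet size below is a made-up example, while the utilization rates come from the text.

```python
import math

physical_servers = 100     # hypothetical starting fleet
utilization_before = 0.12  # mid-range of the 10-15% cited above
utilization_after = 0.70   # achievable with server virtualization

total_work = physical_servers * utilization_before       # work in "server units"
servers_after = math.ceil(total_work / utilization_after)
print(f"{physical_servers} servers consolidate to about {servers_after}")
# -> 100 servers consolidate to about 18
```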

Server virtualization software runs between the operating system and the hardware, masking server resources, including the number and identity of physical servers, processors, and operating systems, from server users. In addition to reducing hardware and power expenditures, virtualization allows businesses to run their legacy applications on older versions of an operating system on the same server as newer applications. A multicore processor is an integrated circuit that contains two or more processors. In the past, the speed of processors was increased by raising their frequency, from a few megahertz for early chips to today's chips, which operate at gigahertz frequencies.
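A small sketch of the payoff: the same workload finishes sooner when split across cores rather than run serially on one. The workload and core count are made-up examples, and the measured speedup will vary by machine.

```python
import time
from multiprocessing import Pool

def busy(n: int) -> int:
    # CPU-bound stand-in workload.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 4

    start = time.perf_counter()
    for n in jobs:                    # one core, one job at a time
        busy(n)
    serial = time.perf_counter() - start

    start = time.perf_counter()
    with Pool(processes=4) as cores:  # one job per core
        cores.map(busy, jobs)
    parallel = time.perf_counter() - start

    print(f"serial: {serial:.2f}s  parallel: {parallel:.2f}s")
```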
