Intel And The Microprocessor Industry Business Essay Example
  • Pages: 8 (2151 words)
  • Published: September 10, 2017
  • Type: Research Paper

The impact of information technology on our economy and lifestyle is unquestionable, with the ICT sector leading the way as the most influential field. It should be noted that these advancements are not solely credited to desktop computers, but rather to the innovation and skillful management at the microprocessor level. Intel stands out as an exemplary company in this respect.

The original discoverers of the microchip, Intel has maintained its dominance in this industry. We will examine Intel's development over time, closely observing the techniques adopted by the company's leadership to sustain their success in a sector where many competitors have failed. We will also explore their current approach to competition and their future plans, and evaluate whether these techniques are adequate to ensure the company's longevity well into the twenty-first century.

Background History: Intel has consistently held a strong position in the strategic market of IT. The emergence of microchips sparked a flurry of activity in a sector that was previously overlooked due to the technological limitations of the time.

Despite experiencing fluctuations due to competitors and changing customer demands, Intel Corporation, established in 1968 by Robert Noyce and Gordon Moore, revolutionized the semiconductor industry with the creation of the microprocessor. It is worth noting that Noyce is credited with co-inventing the integrated circuit during his time at Fairchild. Later joined by Andy Grove, Intel shifted its focus from the memory business to microprocessors used worldwide in computer systems. Key dates include:
- 1968: Founding of N M Electronics (later renamed Intel Corporation) by Noyce and Moore
- 1970: Development of dynamic RAM (DRAM)
- 1971: Introduction of the world's first microprocessor and becoming a publicly traded company
- 1974: Introduction of the first general-purpose microprocessor
- 1993: Launch of the fifth-generation chip, Pentium
- 1997: Introduction of the Pentium II microprocessor and addition to the Dow Jones Industrial Average
- 2000: Launch of the first Intel 1-gigahertz processor
- 2010: Announcement that Intel systems would be based on the new Intel Atom platform

However, the growth of chip performance has gradually slowed over time. During the 1990s, chip performance increased by 60 percent yearly; from 2000 to 2004 this growth rate dropped to only 40 percent. This is in contrast to Moore's Law, which predicts that the transistor count on chips will double every two years. It remains uncertain whether this law, or the techniques used in producing Intel microprocessors, have reached their limits. Nevertheless, efforts are being made to explore alternative options.
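The figures above can be put in perspective with simple arithmetic: a doubling of transistor count every two years corresponds to a compound growth rate of roughly 41 percent per year, so the 60-percent annual performance gains of the 1990s outpaced Moore's Law, while the 40-percent rate of 2000 to 2004 fell back to about its level. A minimal sketch (the growth percentages are the essay's figures, not measured data):

```python
# Compare the compound annual growth rate implied by Moore's Law
# (transistor count doubling every two years) with the chip-performance
# growth rates cited for the 1990s and for 2000-2004.

def cagr_from_doubling(years_to_double: float) -> float:
    """Compound annual growth rate implied by a fixed doubling period."""
    return 2 ** (1 / years_to_double) - 1

def growth_over(years: int, annual_rate: float) -> float:
    """Total multiplicative growth after `years` at a constant annual rate."""
    return (1 + annual_rate) ** years

moore = cagr_from_doubling(2.0)  # ~0.414, i.e. ~41% per year
print(f"Moore's Law annual rate: {moore:.1%}")

# Performance-growth figures from the essay (illustrative only)
print(f"1990s at 60%/yr over 10 years: x{growth_over(10, 0.60):.0f}")
print(f"2000-2004 at 40%/yr over 4 years: x{growth_over(4, 0.40):.1f}")
```

The comparison shows why the 1990s felt like an era of runaway improvement: at 60 percent per year, performance multiplies more than a hundredfold in a decade, whereas Moore's doubling cadence alone accounts for only about a thirty-twofold gain.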

Furthermore, in 2000 Intel introduced the Itanium processor, its first processor aimed at powerful internet servers for the early twenty-first century. It featured a new architecture enabling it to function as a high-performance computing platform with advanced multimedia capabilities and the extensive memory capacity required by robust internet servers.

Intel's continuous development of more powerful processors and expansion into other technology areas solidified its position as a key player in the information economy. The progression from Core i3 to Core i5 to Core i7 does not represent disruptive innovation; that came instead with the introduction of the Atom processor. Despite being less technologically advanced, the Atom processor satisfies several important criteria:

  • It is cheaper.
  • It caters to an entirely different market that may have been overlooked by major players.
  • It can enter a completely new market: mobile but affiliated.
The cloud-computing sector is the most suitable industry where this usage can be justified. It demands speed, efficiency, and compactness, making it ideal for the Atom processor. In this area, users are highly mobile and only require access to a central storage database and processing unit, without having to carry their work with them. Apple and Google played significant roles in creating this market through their operating systems and iPods, potentially disrupting the larger high-end players. Those players, however, observed the situation closely and quickly filled the gap created by this innovative technology.

    This not only keeps them in the game but also opens up a potential new market that can enhance productivity. This is where the advantage of open innovation comes into play. By collaborating with their partners, Intel has assisted both major players in the industry, such as IBM (data-centre builder), and smaller players like Lenovo (system builder), in transitioning towards the disruptive innovation of cloud computing. This helps them maintain their leadership in the processor and IT industry as a whole. But what exactly is meant by "sustaining innovation" and "disruptive innovation"? In sustaining innovation, a company defends its market through successive process and product improvements that enhance the value provided to the existing market.

    The debut of a new product or service (new value) into a new market causes uproar among previously established participants and their offerings. This can be due to the creation of a new market that leads to widespread adoption, displacing the old norm.

    Schumpeter attributes this to the entrance of innovative entrepreneurs. Clayton Christensen further explains that this impact does not necessarily rely on technology, but rather on the strategy employed by the innovation. Generally, disruptive innovations are technologically straightforward, made up of off-the-shelf components assembled in a simpler product architecture than previous approaches. They offer less of what established-market customers desire and are therefore rarely adopted there initially.

    According to Christensen (1997, p. 15), such innovations offer a distinct set of properties that are valued only in emerging markets that are distant from, and unimportant to, the mainstream. The established participant only realizes or pays attention to the impact of this innovation when it is too late. This is when their company encounters what Andrew Grove called strategic inflection points.

    The established participant does not immediately embrace the dynamics of the new products because the benefits they offer may not be as good as those of established products. However, this becomes irrelevant as the new product is quickly adopted and improved upon. This has varying consequences, such as economic sabotage and market recession, which can lead to the collapse of entire markets and economic damage.

    According to Schumpeter, the disruption described above is short-lived and leads to a boost in the later period. In the 1980s, Intel was affected by two major inflection points. The first was the penetration of the Japanese. Thurman discussed the aggressive commoditization and mass production of memory chips by Japanese companies. According to him, the Japanese always focused on improving product delivery and marketing. By commoditizing, they ensured that the origin of the product (whether US-based or Japanese-produced) was irrelevant.

    Improved and advanced techniques not only enhanced the quality of the end product and ensured large-scale production from their mills (Thurman, 1989), but also reduced costs to a point where competitors had no choice but to withdraw due to losses incurred by their smaller mills (Saxenian, 1994). These advancements were made in the DRAM, static RAM, and EPROM (Erasable Programmable Read-Only Memory) markets, driving out all major players. Despite efforts to improve product quality, production capacity, and promotional strategies, the advance of the Japanese could not be stopped.

    Those who survived either transitioned into alternative business paths within the existing memory chip industry, in a process termed radical paradigm displacement (Tidd, 2010, p. 24), or pursued radical product innovation in the semiconductor industry. Intel, for its part, ventured into the microprocessor chip market. Those who failed to adapt eventually went under. Intel's introduction of the microprocessor disrupted an industry that had been predominantly governed by vacuum tubes and mainframe computers.

    Due to its smaller size, simplicity, and faster production process, the product quickly became popular in the market. In order to reduce costs even further, developers switched to large-scale production, which resulted in a slight improvement in sales and profits (Andrew, 1996). This led to the growth of a previously non-existent minicomputer industry. Initially, the product was only accessible to individuals who couldn't afford mainframe systems, so the market was largely ignored by most companies except for IBM. However, eventually even the minicomputer market was replaced by personal computers as low-income individuals adopted the innovation.

    Compaq emerged as a major beneficiary during this period, achieving the remarkable feat of becoming the fastest Fortune 500 company to cross the $1 billion mark (Andrew, 1996). The minicomputer had various other impacts on the market during that time. As it was produced by smaller independent companies, it shifted the market structure from vertical to horizontal. In order to survive, most established mainframe computer manufacturers quickly adapted to this change.

    Those who resisted the change were swept away by creative destruction. In a paper by Nicola De Liso and Giovanni Filatrella, this outcome is identified as common when two different technologies offer the same function or service, as was evident with the vacuum tube and the semiconductor platform. The old technology is enhanced in response to the introduction of the new one; the new technology then undergoes further refinement to exceed the value of the old.

    This process continues until one surrenders and the other becomes dominant (De Liso and Filatrella, 2007). The concept of open innovation was developed by Chesbrough. It refers to a structure or attitude that enables the flow of knowledge or information. Those involved may be colleagues, partners, allies, competitors, or even customers; the means may be formal or informal; and the information exchanged may relate to current work or future trends.

    Chesbrough categorized it into two aspects: inside-out and outside-in. The outside-in approach involves gathering ideas from various sources and developing them to create value. These sources range from end customers and users of products, to suppliers in the supply chain, complementors in the industry, and even competitors with similar products or shared knowledge.

    During the pre-microprocessor era, the engineering industry followed a free and open innovation approach in which individuals would patent and publish their new findings or inventions. This allowed others to copy, modify, improve, and mass-produce these innovations, a process known as second-sourcing (Andrew, 1997, p. 69).

    However, IBM made a bold move by refusing to follow this norm: they stopped freely sharing their knowledge and demanded payment for their engineering. Because their competitors refused to pay, IBM became the sole provider and achieved a strong monopoly in the market for many years. This dominance eventually provoked the competition into developing their own versions of the product (Andrew, 1997, p. 70).

    What were the benefits?

    • They set a standard template for the personal-computer industry.
    • Additionally, because the platforms all looked the same, software developers could write packages that worked across essentially similar platforms.

    Although the industry has diluted this concept, Apple is a newer company that has not fully adopted open innovation, or has limited its acceptance of it. Apple products operate under a vertical system of production in which the company is responsible for everything from design to OS and applications. However, they have produced modified microprocessors based on the ARM RISC architecture. Their level of openness lies in how they interact with customers: while most research is done in-house and they hold patents on their intellectual property, they still seek feedback from end customers on user compatibility.

    The consequence is the release of an OS platform that can work with other proprietary applications without enlistment. This means they have created their own market while also allowing their customers to use third-party applications on a closed product.

    The idea of "riding two horses" comes from the difficulty of choosing which direction to take in technology. In the engineering industry, decisions must be made on whether to stick with the old platform or migrate to a new invention. This choice can be influenced by the options given to end users by competing industries. The chosen option becomes dominant in the market and improves over time, while the other becomes obsolete. However, both rival platforms or processes can coexist for an indefinite period. Various economic and technological reasons for maintaining technological continuity have been studied, including the issue of network externalities, as explored by Katz and Shapiro (1986). Arthur (1994) examined competing technologies and concluded that if there are increasing returns to adoption, the market becomes locked in to an inferior choice once acceptance of that inferior technology passes a certain point; if not, the previous technology remains the dominant platform while the new technology stagnates. Intel encountered this lock-in effect when they had to decide between the newer and faster Reduced Instruction Set Computing (RISC) and the slower, older Complex Instruction Set Computing (CISC), in which they had already invested significant development and production resources.
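Arthur's lock-in argument can be illustrated with a toy simulation (a sketch of my own under simple assumptions, not Arthur's actual model): each new adopter picks whichever of two technologies offers the higher payoff, where payoff grows with the installed base. With increasing returns, a run of lucky early adoptions can tip the market permanently toward one technology, even an intrinsically inferior one.

```python
import random

def simulate(steps, base_payoff, returns, seed):
    """Toy Arthur-style lock-in model: each adopter picks the technology
    with the higher payoff, where payoff = intrinsic value + increasing
    returns proportional to installed base. Gaussian noise stands in
    for idiosyncratic adopter preferences."""
    random.seed(seed)
    installed = [0, 0]
    for _ in range(steps):
        payoffs = [base_payoff[i] + returns * installed[i] + random.gauss(0, 1)
                   for i in (0, 1)]
        installed[payoffs.index(max(payoffs))] += 1
    return installed

# Technology 0 is intrinsically inferior (lower base payoff), yet with
# increasing returns a lucky early run can still lock the market in on it.
outcomes = [simulate(500, (0.9, 1.0), 0.05, seed) for seed in range(20)]
inferior_wins = sum(a > b for a, b in outcomes)
print(f"inferior technology dominated in {inferior_wins}/20 runs")
```

Setting `returns` to zero removes the lock-in: adoption then tracks intrinsic quality plus noise, and the better technology reliably wins — which mirrors Arthur's conclusion that increasing returns, not quality, decide the outcome once the tipping point is passed.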

    Their technical team was torn apart over which to take. Eventually, Andrew Grove made the decision: CISC won the day because it was compatible with most software, and the industry wasn't ready for the unknown (Andrew, 1996, p. 103).
