Over the last five decades, the world has witnessed unprecedented advances in computer technology, leading to the incorporation of processors into a wide range of products including vehicles, consumer electronics, kitchen appliances, and even clothing. In addition, modern smartphones and tablets contain powerful processors that support computationally heavy tasks such as image and video editing. Even so, companies including Intel and AMD are investing heavily in research with the aim of producing even smaller microprocessors in the near future.
To learn more, check out the infographic below, created by the New Jersey Institute of Technology's Online Master of Science in Computer Science degree program.
An Overview of Microchip Tech
Microchip technology cannot be discussed without mentioning Moore's Law, postulated by Gordon Moore in 1965. Moore observed that the number of transistors that could fit on a square inch of integrated circuit would double annually for at least a decade. Since then, Moore's Law has become a golden rule in the electronics industry. Yet even Moore, a brilliant scientist who received a PhD in chemistry from the California Institute of Technology and conducted postdoctoral research at Johns Hopkins University's Applied Physics Laboratory, could not have foreseen the full implications of his observation.
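Moore's prediction describes simple exponential growth: a quantity that doubles every fixed period. A minimal sketch in Python (the starting count of 64 transistors is hypothetical, chosen only to illustrate the arithmetic):

```python
def projected_transistors(start_count: float, years: float,
                          doubling_period_years: float = 1.0) -> float:
    """Project a transistor count that doubles every fixed period."""
    return start_count * 2 ** (years / doubling_period_years)

# Doubling annually for a decade multiplies the count by 2**10 = 1024.
print(projected_transistors(64, 10))  # 65536.0
```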
Although transistors in the 1960s were comparatively large (as wide as a cotton fiber) and expensive (roughly $8 apiece in today's dollars), this did not deter Intel from developing and launching a commercial microprocessor, the 4004, in 1971 that packed 2,300 transistors. IBM leveraged microprocessor technology to release its personal computer on August 12, 1981, and Apple followed with the Macintosh on January 24, 1984. While these computers were useful, they remained largely the preserve of large businesses, government agencies, research facilities, and universities. This began to change on March 12, 1989, when Tim Berners-Lee initiated the process that would lead to the creation of the World Wide Web.
Modern Microchip Technology
Modern microprocessors are powerful because they contain billions of transistors. For instance, Intel's Skylake processor contains about 1.75 billion transistors, making it roughly 400,000 times more powerful than 1960s chips. Remarkably, packing billions of transistors onto a chip the size of a fingernail now works out to only a tiny fraction of a cent per transistor. These microprocessors have fueled the mobile device revolution that has swept the world over the last decade: 86 percent of Americans aged 18 to 29 and 83 percent of 30- to 49-year-olds own a smartphone, and some individuals own multiple mobile devices.
The Future of Moore’s Law
In recent years, microprocessor manufacturers have struggled to keep up with Moore's Law. Intel reckons that the number of transistors on microchips is now doubling every two and a half years rather than every year. As a result, some industry insiders and market analysts are urging manufacturers to move beyond Moore's Law and find new ways of increasing the computing capacity of microchips, since the electronics industry has nearly exhausted the capabilities of chip-making materials like silicon. To avoid technological stagnation, players in the electronics sector are exploring improvements in areas such as organic hardware, cloud computing, and software engineering.
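The practical effect of this slowdown is easy to quantify: under the original annual doubling, a decade yields a 1,024-fold increase, while a doubling every two and a half years yields only a 16-fold increase over the same span. A quick sketch:

```python
def growth_factor(years: float, doubling_period_years: float) -> float:
    """How many times a transistor count multiplies over a span of years."""
    return 2 ** (years / doubling_period_years)

print(growth_factor(10, 1.0))  # 1024.0 -- original annual doubling
print(growth_factor(10, 2.5))  # 16.0   -- the slower modern cadence
```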
Cloud Computing and Software Innovations
Cloud computing and software are the two leading fronts that computer scientists are exploring to boost processing capability. On the software innovation front, deep learning has taken the lead, especially at tech companies and startups. Because deep learning is loosely modeled on the way the human brain works, it enables developers to build faster, more resilient, and smarter algorithms. Google has deployed the technology in diverse areas including self-driving cars, search, medical research, and natural language processing (NLP). Researchers have also developed software tools and platforms that can automate computer-programming tasks; to improve computational efficiency, such software is typically run on clusters of specialized microchips. Mobile devices, including smartphones and tablets, rely on deep learning for motion sensing, mobile payments, and satellite positioning.
Microsoft is also a leader in this field thanks to Catapult, a field-programmable gate array (FPGA) platform whose chips can be reconfigured to perform different tasks in a matter of milliseconds. These advances mean that computers and mobile devices can deploy resources other than raw processor speed to complete tasks like search and route calculation.
In 2015, the global cloud computing industry expanded by 28 percent, pushing revenues to an estimated $110 billion. Separately, market research firm TBR projects that cloud computing revenues will grow from $80 billion in 2015 to $167 billion in 2020. Morgan Stanley, meanwhile, expects cloud computing to account for 30 percent of Microsoft's total revenue by 2018.
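The TBR forecast implies a steady compound annual growth rate, which is straightforward to back out from the start and end figures:

```python
def implied_cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by start and end values."""
    return (end / start) ** (1 / years) - 1

rate = implied_cagr(80e9, 167e9, 5)  # TBR's 2015 -> 2020 revenue forecast
print(f"{rate:.1%}")  # about 15.9% per year
```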
Progress in microchip development is increasingly hampered by the laws of physics. Strategies that computer scientists are investigating to circumvent this obstacle include processors that mimic the human brain, organic materials, and programmable micro-hardware.
In the future, quantum computing could substantially ramp up the processing speed of chips. Intel has said it plans to shrink its process nodes to 14, 10, 7, and eventually 5 nanometers by 2020, meaning its processors will be able to support ever larger numbers of transistors. An emerging field called spintronics promises to revolutionize the electronics industry with chips based on atomic-level components. Another promising strategy is replacing silicon with materials such as gallium nitride (GaN), graphene, and gel, which research suggests can outperform silicon. The manufacturing process itself can be improved with extreme ultraviolet (EUV) lithography, which uses wavelengths far shorter than those of visible light; this matters because the smaller the space occupied by individual components, the more powerful the microchip. IBM, for its part, has been working on a 65 mW neuromorphic chip that mimics the functioning of brain neurons.
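The payoff from shrinking process nodes follows from simple geometry: if every feature shrinks linearly, the number of transistors that fit in a fixed area grows with the square of the shrink factor. An idealized sketch (modern node names no longer map exactly to physical feature sizes, so treat this as a first-order approximation):

```python
def ideal_density_gain(old_node_nm: float, new_node_nm: float) -> float:
    """Relative transistor density under ideal linear feature scaling."""
    return (old_node_nm / new_node_nm) ** 2

print(ideal_density_gain(14, 7))  # 4.0 -- halving feature size quadruples density
```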
Over the last half-century, microchips, or microprocessors, have fueled the greatest technological revolution in human history. However, the looming demise of Moore's Law has spurred computer scientists and researchers to look for new ways of sustaining growth in processing power. Promising solutions include cloud computing, deep learning, quantum computing, extreme ultraviolet lithography, and chips that mimic the functioning of the brain.