Moore’s Law: Its History, Future, and Competing/Complementary Technologies

Sanjay Basu, PhD
7 min read · Mar 20, 2023


Copyright: Sanjay Basu

Update: Gordon E. Moore, a co-founder and former chairman of Intel Corporation, died on Friday, March 24, at his home in Hawaii. He was 94. Intel, the California semiconductor chip maker, helped give Silicon Valley its name and achieved the kind of industrial dominance once held by the giant American railroad and steel companies of another age.

I was traveling to a few universities these past weeks on both sides of the pond. I met quite a few young graduates with humanities backgrounds who have a keen interest in AI ethics and AI safety. With AI becoming omnipresent after the recent release of ChatGPT and other large language models, they want to participate in and contribute to this new future. I was talking about the infrastructure powering this phenomenal growth, which led us to semiconductors, and I realized that they did not necessarily know about Moore’s Law and its contributions to society. This is a humble effort to shed some light on Moore’s Law, its contributions, some competing/complementary laws and, finally, a short biography of Gordon Moore.

Introduction

Moore’s Law has been the driving force behind the semiconductor industry’s growth for over half a century. Named after Gordon Moore, co-founder of Intel, it refers to the observation that the number of transistors on a microchip doubles approximately every two years, resulting in an exponential increase in computing power. In this post, we will delve into the history of Moore’s Law, consider its future prospects, and explore competing or complementary laws that may come into play.

The History of Moore’s Law

The Beginning (1965)

In 1965, Gordon Moore published a paper in Electronics Magazine in which he predicted the future growth of the semiconductor industry. He observed that the number of transistors on an integrated circuit was doubling every year and forecasted that this trend would continue for at least the next decade. This prediction became known as Moore’s Law.

The 1970s: Refinement and Adoption

Moore’s prediction held true throughout the 1970s. In 1975, he updated his prediction, stating that the number of transistors on a chip would double every two years instead of one. This updated version of Moore’s Law became widely accepted and adopted as an industry target, and companies started using it to plan their research and development strategies.
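The two-year cadence is easy to express as a simple exponential. Here is a minimal Python sketch; the 1971 Intel 4004 figure of roughly 2,300 transistors is a commonly cited starting point, used here purely for illustration:

```python
def moores_law(n0: float, year0: int, year: int, doubling_period: float = 2.0) -> float:
    """Project transistor count, assuming one doubling every `doubling_period` years."""
    return n0 * 2 ** ((year - year0) / doubling_period)

# Intel 4004 (1971): roughly 2,300 transistors.
# A strict two-year doubling gives 15 doublings by 2001,
# i.e. about 2,300 * 2**15 ~ 75 million transistors.
projection_2001 = moores_law(2_300, 1971, 2001)
```

Real products deviated from this idealized curve in any given year, but over decades the industry tracked it remarkably closely.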

The 1990s: Continued Success and Challenges

The 1990s saw the growth of personal computers and the Internet, which were both heavily influenced by the ongoing development of semiconductor technology. However, it also became clear that the pace of Moore’s Law was facing challenges. Manufacturing techniques were reaching their physical limits, and new approaches were required to maintain the pace of innovation.

In the 2000s and 2010s, Moore’s Law continued to be a driving force in the semiconductor industry, but it faced increasing challenges due to technological and physical limitations. Here’s an overview of the key developments during these decades and the early 2020s:

The 2000s: Challenges to Dennard Scaling and the Rise of Multicore Processors

During the 2000s, Dennard Scaling, which had been a complementary principle to Moore’s Law, started to falter. Transistor leakage and heat dissipation became critical issues as transistors continued to shrink in size. In response to these challenges, the industry shifted focus towards multicore processor architectures. Instead of relying solely on transistor density, chipmakers began to develop processors with multiple cores on a single chip, allowing for parallel processing and improved performance.

The 2010s: FinFET and Heterogeneous Computing

In the 2010s, the semiconductor industry introduced FinFET (Fin Field-Effect Transistor) technology. This 3D transistor architecture helped to mitigate some of the challenges associated with the miniaturization of transistors, enabling further scaling of integrated circuits. Additionally, heterogeneous computing became more prominent, with specialized processors such as GPUs and TPUs being used alongside CPUs to accelerate specific workloads, such as AI and machine learning.

The early 2020s: Slowing Pace and Emerging Technologies

By the early 2020s, the pace of Moore’s Law started to slow down. The limitations of traditional silicon-based technology became increasingly apparent, and the cost of maintaining the pace of development skyrocketed. However, new materials and approaches emerged to address these challenges, such as:

  1. Extreme ultraviolet lithography (EUV): A new technique for creating smaller transistors, allowing for further scaling of integrated circuits.
  2. 3D integration: Stacking multiple layers of integrated circuits on top of each other to increase the density of transistors without necessarily shrinking their size.
  3. New materials: Research on materials such as graphene and carbon nanotubes has the potential to overcome the physical limitations of silicon-based technology.
  4. Quantum computing: Quantum computers leverage the principles of quantum mechanics to perform complex calculations much faster than classical computers, potentially revolutionizing computing.

The Future of Moore’s Law

In recent years, the semiconductor industry has openly acknowledged this slowdown. The future of Moore’s Law depends on the ability of researchers to develop new materials, manufacturing techniques, and computing architectures that can continue to push the boundaries of computing power.

The developments in the early 2020s suggest that while the pace of Moore’s Law is slowing down, the semiconductor industry is adapting and embracing new technologies and approaches to continue pushing the boundaries of computing power.

Moore’s Law has significantly impacted many aspects of technology, society, and everyday life: the exponential growth in computing power it describes helped make personal computers, smartphones, and the Internet affordable and ubiquitous.

Competing and Complementary Laws

Several competing and complementary laws have been proposed alongside Moore’s Law. Some of the most notable include:

Dennard Scaling

Dennard Scaling, named after IBM researcher Robert Dennard, states that as transistors shrink in size, their power consumption per unit area remains constant. This allowed chips to become denser and faster without running hotter. However, Dennard Scaling has been faltering since the early 2000s due to issues like transistor leakage, requiring a shift in focus towards other approaches, such as multicore architectures.
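The arithmetic behind Dennard’s observation can be sketched in a few lines. Dynamic switching power is roughly C·V²·f; under ideal scaling by a factor k, capacitance and voltage shrink by 1/k while frequency rises by k, so per-transistor power falls by k², exactly as area does, leaving power density unchanged. This is an idealized model (the function names are mine, and leakage current is deliberately ignored):

```python
def dynamic_power(c: float, v: float, f: float) -> float:
    """Approximate switching power of a transistor: P ~ C * V^2 * f."""
    return c * v * v * f

def dennard_scale(c: float, v: float, f: float, area: float, k: float):
    """Ideal Dennard scaling by k: C and V shrink by 1/k, f grows by k, area by 1/k^2."""
    return dynamic_power(c / k, v / k, f * k), area / (k * k)

p0, a0 = dynamic_power(1.0, 1.0, 1.0), 1.0
p1, a1 = dennard_scale(1.0, 1.0, 1.0, area=1.0, k=2.0)
# Power density p1 / a1 equals p0 / a0: the chip gets faster, not hotter.
```

Once voltage could no longer scale down (leakage grows sharply at low thresholds), this identity broke, which is exactly the faltering the industry hit in the early 2000s.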

Koomey’s Law

Koomey’s Law, named after Jonathan Koomey, an energy and environmental researcher, states that the energy efficiency of computing (measured in computations per kilowatt-hour) doubles approximately every 1.57 years. As energy efficiency becomes an increasingly important aspect of modern computing, Koomey’s Law could become a valuable metric for the industry.
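Koomey’s 1.57-year doubling period compounds quickly, as a small sketch with illustrative numbers shows:

```python
def koomey_efficiency(e0: float, years: float, doubling_period: float = 1.57) -> float:
    """Computations per kWh after `years`, doubling every `doubling_period` years."""
    return e0 * 2 ** (years / doubling_period)

# Over a decade, 10 / 1.57 ~ 6.4 doublings: roughly an 80x efficiency gain.
decade_gain = koomey_efficiency(1.0, 10.0)
```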

Neven’s Law

Named after Hartmut Neven, a scientist at Google, Neven’s Law applies to quantum computing. It posits that quantum computers are gaining computational power at a doubly exponential rate, far outpacing Moore’s Law’s singly exponential curve. If this trend continues, it could revolutionize computing, offering massive leaps in computational power.

3D Integration

3D integration is a complementary approach to Moore’s Law that involves stacking multiple layers of integrated circuits on top of one another. This increases the density of transistors without necessarily shrinking their size, potentially allowing for continued scaling and increased computing power.

Gordon Moore: A Life Shaping the Semiconductor Industry

Early Life and Education

Gordon Earle Moore was born on January 3, 1929, in San Francisco, California. Raised in nearby Pescadero, Moore developed an early interest in science and chemistry. After finishing high school, he attended San Jose State University and later transferred to the University of California, Berkeley, where he received a Bachelor of Science degree in chemistry in 1950. Driven by his passion for science, Moore pursued graduate studies at the California Institute of Technology (Caltech) and obtained a Ph.D. in chemistry with a minor in physics in 1954.

Career and Accomplishments

Fairchild Semiconductor

Following his academic pursuits, Gordon Moore joined the Shockley Semiconductor Laboratory, working under Nobel laureate William Shockley. However, he soon left the company with seven other colleagues (known as the “Traitorous Eight”) due to differences with Shockley’s management style. In 1957, the group founded Fairchild Semiconductor, a company that would go on to play a significant role in shaping the future of the semiconductor industry. At Fairchild, Moore served as the director of research and development and was part of the team that developed the first commercially practical integrated circuits.

Intel Corporation

In 1968, Moore, along with his colleague Robert Noyce, left Fairchild Semiconductor to establish Intel Corporation. The company quickly became a leader in the development and manufacturing of memory chips and microprocessors. As an executive and later CEO of Intel, Moore led the company to develop innovative technologies that would become integral to the digital revolution, such as the first commercially available microprocessor (the Intel 4004) and the x86 series of microprocessors.

Moore’s Law

Gordon Moore’s most significant contribution to the world of technology came in the form of a prediction, now known as Moore’s Law. In 1965, he published a paper in Electronics Magazine, where he observed that the number of transistors on an integrated circuit was doubling approximately every year. He predicted that this trend would continue for at least a decade. Moore’s Law, as it came to be known, soon became a guiding principle for the semiconductor industry, driving research, development, and innovation.

Philanthropy and Personal Life

Gordon Moore also made significant contributions in the realm of philanthropy. In 2000, he and his wife, Betty, established the Gordon and Betty Moore Foundation, which supports environmental conservation, scientific research, and patient care. The foundation has donated billions of dollars to various causes, reflecting Moore’s commitment to creating a better world through science and technology.

Moore married Betty Whitaker in 1950, and they had two sons. Despite his enormous success, Moore was known for his humility and unassuming demeanor.

Legacy

Gordon Moore’s vision and relentless pursuit of innovation have left an indelible mark on the world of technology. His prediction, Moore’s Law, has been a driving force behind the exponential growth of computing power over the past half-century, profoundly impacting industries, economies, and everyday life. In recognition of his numerous accomplishments, Moore received various honors, including the National Medal of Technology and Innovation, the IEEE Medal of Honor, and the Presidential Medal of Freedom.
