Posted by mgianfagna on November 30, 2021
By Mike Gianfagna
Democratize is an unusual word. It doesn’t appear in normal conversation very much and is often used to describe a major trend. I started with a Google search for the term. The first hit I got was “to make democratic.” Using a word to define itself is, well, annoying. I ignored that one.
The next hit was “make (something) accessible to everyone.” I liked that one a lot better. It got to the heart of what I think this word means, at least in my world. I’m a big fan of alliteration, so democratized design borders on poetry for me. But what exactly does that mean? It turns out, the answer depends on when you ask the question. Let’s explore.
Back in the dawn of time (from a chip perspective), custom chip design and the innovation potential it unlocked were reserved for those companies with the resources to do it all. This was the so-called era of integrated device manufacturers, or IDMs. These large, monolithic companies had the vast design resources required for custom chips. They built any reusable IP that was needed, and they operated the mask lab and fabrication facility required to build those custom chips. An incredible investment all-in.
But the opportunity for product differentiation was vast. Companies like TI, RCA, General Electric, IBM, Bell Labs, and Sony were some of the names that dominated the landscape. They all built highly differentiated products based on the custom chips they designed and built. Life was good if you were part of an IDM. In the pre-EDA days, these companies also wrote the software needed to design and manufacture custom chips – everything from simulation and layout to fracturing those layouts into trapezoids to make photomasks.
Chapter One lasted from about the 1950s to the early 1980s.
Beginning in the 1980s, two magical events occurred. First, ASIC companies were born. Early pioneers such as LSI Logic (LSI) and VLSI Technology Inc. (VTI) came on the scene. These companies had the resources to design and build custom chips, just like the IDMs. But there was a twist. The focus was exclusively on building custom chips for others. The new entrants enabled those companies lacking the vast resources to build a custom chip to finally access the technology. LSI and VTI built custom chips for one customer and typically one application. Only that customer could buy that chip. And so, the application-specific integrated circuit, or ASIC, era began.
Tools to assist with the design and manufacturing of custom chips also started to become available around this time. New EDA companies, like Daisy Systems, Mentor Graphics, and Valid Logic got an enthusiastic welcome from the new breed of ASIC customer. Lots of new chip startups were funded, and the market exploded with many new and highly differentiated products.
Chapter Two lasted from the 1980s to the early 2000s.
All was going well on the product innovation front, and then deep submicron happened. The complexity of semiconductor manufacturing went through the roof. Three-dimensional transistors, lots more routing layers, and feature sizes so small they defied the limitations of the wavelength of light itself. Circuit effects that were already subtle became subtler still. Process corners multiplied just to keep track of it all. And some manufacturing effects started to move in the opposite direction, appearing to defy the laws of physics.
All this made it very time-consuming and very expensive to build an ASIC, even with a well-built-out infrastructure available. ASIC design starts fell, leaving participation once again to the rich companies, or to those fortunate enough to own a large share of a very large market.
Chapter Three lasted from the early 2000s until about 10 years ago.
This last chapter takes us to the present day. Custom chips can still be quite vexing to build. Innovations such as extreme ultraviolet lithography have helped somewhat with the wavelength of light problem. But the cost of this technology limits its use, and the most advanced processes have their own set of counter-intuitive challenges.
Something else has happened, however, that is changing the landscape. It’s a series of events. Up to about 10 years ago, chips defined the market and software was written to run on those chips. Over the past 10 years that has started to invert. More and more, software is defining markets and chips are designed to run the software. This is how almost every artificial intelligence (AI) feature works, from the self-driving car to Amazon Alexa. These markets are quite large as well. That certainly helps defray design costs. So does an infusion of many new and well-funded companies into the chip business. Consider that Amazon, Google, Apple, and Facebook didn’t design chips 10 years ago. Today, they dominate the chip landscape. I touched on some of this in my prior blog post.
The methods being used to deliver technology are also evolving. Beyond a new, smaller, faster, and lower power chip, there are now multi-die design strategies as well as some very innovative ways to marry software and chips.
I don’t know when Chapter Four will end, but I know there are exciting times ahead. There is so much more to say about these developments, and we will say it soon. So, keep watching.
For now, I’ll leave you with the simple observation that the universe is cyclic in nature, so this was all quite predictable.