Navigating the Era of Autonomous Design for Powerful Compute Hardware

Synopsys Editorial Staff

Jan 11, 2023 / 4 min read

The impact of artificial intelligence (AI) on humankind is being felt worldwide: it is improving processes, optimizing decision-making, and doing the work of humans in a fraction of the time. While some are cautious about what the future may bring, many solutions we take for granted today in areas like medical research, smartphone photography, and voice assistance wouldn’t have been possible without AI, and specifically machine learning (ML).

Predictably, there are many conferences dedicated to AI technology and its impact. A leading event Synopsys participates in is the AI Hardware Summit, an annual conference dedicated to the enabling technology that AI algorithms run on. It is also the place to learn about advances on the horizon and the obstacles standing in the way.

Our President and COO, Sassine Ghazi, gave the closing keynote on Day 1 of the most recent summit, during which he illuminated the opportunities, challenges, and potential solutions that can make AI deployments ubiquitous. Read on to learn more about new strategies the industry can adopt to increase and hasten AI’s impact.

Power Challenges in AI Hardware

The Balancing Act Between Complexity and Energy

After an inspirational opening touching on areas like AI use for COVID vaccine development, cancer research, and supercomputer-driven digital twins of Earth to predict future climate patterns, Ghazi cited a few eye-opening statistics and their implications.

He pointed out that data continues to grow exponentially. Before 2018, data was mostly generated by humans interacting with applications, but in the past four years, machines have begun to generate much more useful data. Looking ahead to 2025, global data volume is expected to grow from roughly 30 zettabytes to 160 zettabytes. The challenge going forward is that as data increases, the models that consume this data will also grow in complexity.

Ghazi went on to explain that for AI to mimic the human brain, we must look toward context-aware transformer models. However, these models come at a price: they push the limits of chip design and, in turn, raise concerns about energy use and the environment. For example, running a transformer model for computer vision or natural language processing to just 13 percent completion consumes as much energy as an average household does over a year.
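
To put that statistic in perspective, here is a rough back-of-envelope sketch. The household electricity figure and the linear extrapolation to a full run are illustrative assumptions, not numbers from the talk.

```python
# Rough scale of the energy claim above (illustrative only).
# Assumptions: an average household uses ~10,500 kWh of electricity per year,
# and energy scales roughly linearly with training progress.
household_kwh_per_year = 10_500          # assumed annual household electricity use
fraction_of_run = 0.13                   # the model run is only 13% complete

energy_at_13_percent_kwh = household_kwh_per_year                  # per the claim above
full_run_estimate_kwh = energy_at_13_percent_kwh / fraction_of_run

print(f"Energy at 13% completion: ~{energy_at_13_percent_kwh:,} kWh")
print(f"Naive full-run estimate:  ~{full_run_estimate_kwh:,.0f} kWh")
# Under these assumptions, a complete run lands around 80,000 kWh.
```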

He pointed out that, as an industry, we need to reiterate that AI applications are vastly net positive, even if they add to carbon dioxide (CO2) emissions. As well as helping to solve complex virus puzzles and many other problems that improve human lives, AI will play a key role in helping the world manage and lower CO2 emissions, including the indirect Scope 3 emissions defined and detailed by the US EPA.

He also discussed the growing talent shortage and offered novel ways to manage it, again using AI/ML. With the right deployment of the technology, senior design teams can perform at a much higher level of efficiency, while junior design teams, supported by AI tools, can perform at the level of today’s senior teams.

The conclusion of his talk highlighted these benefits, as well as ways to address design complexity and the carbon footprint implications of AI models. It turns out AI is both a problem and a solution.

AI is Powering the Way Forward in Chip Design

According to Ghazi, AI itself can improve existing chip design processes and turn traditional chip design flows into autonomous design instruments. Many of the tasks faced by hardware engineers require examining vast amounts of information to identify the optimal approach for a given SoC design, the kind of work that keeps teams of experienced designers busy for weeks. By using reinforcement learning algorithms, the same class of algorithms used to beat master chess players, these tasks can be completed far faster, and the results are typically superior to those delivered by the experienced design team. This frees those teams to focus on what they do best: innovation for the next big thing.
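
To make the idea more concrete, the toy sketch below mimics the trial-feedback-improve loop described above. It is not Synopsys tooling or a production flow; the design knobs, the reward function, and the epsilon-greedy explore/exploit strategy (a simplification of full reinforcement learning) are all assumptions chosen for illustration.

```python
# Toy sketch: learning-style search over chip design parameters.
# Illustrative only; the design space, reward model, and search strategy
# are made up, not a description of any production tool.
import random

# Hypothetical knobs a design flow might expose (values are made up).
DESIGN_SPACE = {
    "clock_target_ghz": [1.0, 1.2, 1.5],
    "vt_mix": ["hvt_heavy", "balanced", "lvt_heavy"],
    "placement_effort": ["medium", "high"],
}

def evaluate(config):
    """Stand-in for a synthesis/place-and-route run that returns a reward.
    Here: penalize power-hungry choices, reward a faster clock target."""
    power = {"hvt_heavy": 1.0, "balanced": 1.3, "lvt_heavy": 1.7}[config["vt_mix"]]
    power *= 1.2 if config["placement_effort"] == "high" else 1.0
    speed_bonus = config["clock_target_ghz"]
    return speed_bonus - 0.5 * power + random.gauss(0, 0.05)  # noisy quality-of-results signal

def random_config():
    return {k: random.choice(v) for k, v in DESIGN_SPACE.items()}

best_config, best_reward = None, float("-inf")
epsilon = 0.3  # fraction of purely exploratory trials

for trial in range(50):
    if best_config is None or random.random() < epsilon:
        candidate = random_config()            # explore a fresh configuration
    else:
        candidate = dict(best_config)          # exploit: perturb the best so far
        knob = random.choice(list(DESIGN_SPACE))
        candidate[knob] = random.choice(DESIGN_SPACE[knob])
    reward = evaluate(candidate)
    if reward > best_reward:
        best_config, best_reward = candidate, reward

print("Best configuration found:", best_config, "reward:", round(best_reward, 3))
```

In a real flow, evaluate() would be a full synthesis or place-and-route run and the search policy would be far more sophisticated, but the feedback loop is the same: try a configuration, measure the outcome, and bias the next trial toward what worked.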

[Image: AI-Driven Chip Design Impact]

Ghazi went on to explain that AI-driven and guided decision-making can also improve energy consumption. Drawing on real numbers from many customer chip design projects, he noted that an average chip-level power reduction of 8% has been achievable so far. While that number may seem small out of context, every milliwatt counts toward the broader goal of reversing AI’s growing energy consumption and reaching net zero. If all data center ICs were optimized for power using AI, this could mean 7.8 trillion watt-hours (7.8 TWh) less energy spent without any compromise on speed.
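
As a rough sanity check on the scale of that number, the sketch below works backward from the 8% average reduction cited above. The data center IC energy baseline (and treating the savings as annual) are illustrative assumptions; only the 8% figure and the roughly 7.8 trillion watt-hour outcome come from the keynote.

```python
# Back-of-envelope check of the data center savings figure (illustrative only).
avg_power_reduction = 0.08        # 8% average chip-level power reduction (from the keynote)
assumed_ic_energy_twh = 97.5      # assumed annual energy used by data center ICs, in TWh

savings_twh = assumed_ic_energy_twh * avg_power_reduction
savings_wh = savings_twh * 1e12   # 1 TWh = 1e12 Wh

print(f"Estimated savings: {savings_twh:.1f} TWh, i.e. {savings_wh:.1e} Wh")
# -> 7.8 TWh, or roughly 7.8 trillion watt-hours
```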

[Image: AI Chip Energy Consumption]

By tapping into these new possibilities and opportunities, Ghazi believes the industry can reach a 1,000x improvement in AI compute performance, helping to lead the way for future innovation.

He concluded that AI is the only way forward to improve existing processes and turn traditional chip design flows into autonomous design instruments. His insights and proof points delivered a hopeful and exciting conclusion to the first day of the AI Hardware Summit.
