Posted by mike demler on November 27, 2007
As design complexity has continued to increase with the ongoing advances prescribed by Moore’s Law, it has become generally accepted that verification consumes more of a project’s time than the design itself. In forecasting (logic) designer productivity at future process nodes, the International Technology Roadmap for Semiconductors 2006 Update (ITRS) estimates that verification currently consumes an average of 70% of project effort. Verification can mean many different things, but here I am referring primarily to functional verification.
For analog designers, my experience has been that verification is a continuous process that runs as a design develops: starting at a very low level, in some cases an individual transistor, progressing up to the completion of a block design, and then to the assembly of multiple blocks into a complete chip. Analog designers are more likely to think in terms of simulation rather than verification, and in terms of a tool rather than a process or methodology. That is the case at least up to design review time. The design review is well understood, and it can be a very stressful process! In the end, though, whether it’s called verification or simulation, the objective is to confirm that the design meets its specifications over as complete a range of operating conditions as possible. Many sleepless nights can be spent worrying that something was missed before a critical design review.
For digital designers, there is Verification Methodology. Books are written on the topic, many seminars are given, and an entire industry ecosystem has grown up to support verification. Why the difference? One of the more popular books on the subject, the Verification Methodology Manual for SystemVerilog (VMM), sums it up nicely: “verification methodologies have evolved alongside the design abstraction and kept pace with the complexities of the designs being implemented”. Digital design has evolved as logic synthesis took over from transistor-level design, and hardware description languages evolved into verification languages.
Analog designers, though their productivity has increased tremendously with progressively more powerful tools, must still design at the same level of abstraction as when Moore’s Law was first conceived. So what of analog verification? Is there a need for a more rigorous process and methodology now that analog designs have gotten more complex? Is there a need for an analog verification language? I think the 2006 ITRS report is actually behind the times here, saying that “In the future, mixed-signal systems will become a more relevant fraction of all silicon developments, bringing the development of proper verification methodologies in this arena to a critical level”. The consumer electronics segment is the major driver for the semiconductor industry today, and all consumer electronics are analog/mixed-signal. That future is now. Is functional verification of mixed-signal systems critical to you? Should we be building a better bridge from analog to digital verification?
In upcoming posts I will discuss some of the concepts and terminology in the VMM digital verification methodology, as a vehicle to facilitate open discussion on how different (or similar) some of the concepts may be from what we are familiar with in analog. We do tend to develop our own languages for the work we do, so maybe a little cross-translation can help.
Analog… meet digital. Digital… meet analog.