Q&A: How ST optimized their validation flow and decreased turn-around time by 8X using Synopsys Custom Explorer
Posted by Hélène Thibiéroz on November 1st, 2012
You may have watched the previous video I posted on CustomExplorer Ultra (if not, it is not too late :)). A very interesting feature of this tool is its waveform comparison capability.
This utility allows you to compare two sets of simulation runs in batch and produce a text report of the differences. In a few words, you define a simple rules file that controls the comparisons: which signals are to be compared and with what tolerances. Using sample-based comparison techniques, CXU compares golden-to-target simulation results and reports the validation results. You might say there is nothing really revolutionary here... so let me give you some finer details.
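To make the idea concrete, here is a minimal Python sketch of what a sample-based golden-to-target comparison with absolute and relative tolerances boils down to. This is a generic illustration of the technique, not CXU's actual rules-file syntax or implementation; the function name and tolerance defaults are my own.

```python
# Generic sketch of sample-based waveform comparison with absolute and
# relative tolerances (illustrative only -- not CXU's rules language).

def compare_waveforms(golden, target, abs_tol=1e-6, rel_tol=0.01):
    """Compare two equally sampled waveforms.

    Returns a list of (sample_index, golden_value, target_value)
    tuples for every sample that falls outside tolerance.
    """
    mismatches = []
    for i, (g, t) in enumerate(zip(golden, target)):
        # A sample passes if it is within EITHER the absolute or the
        # relative tolerance of the golden value.
        if abs(t - g) > max(abs_tol, rel_tol * abs(g)):
            mismatches.append((i, g, t))
    return mismatches

# Example: 1% relative tolerance; the third sample deviates by 5%.
golden = [1.0, 2.0, 3.0]
target = [1.0, 2.001, 3.15]
print(compare_waveforms(golden, target))  # -> [(2, 3.0, 3.15)]
```

A real rules file would additionally select which signals to compare and could apply different tolerances per signal class (e.g. voltages vs. currents), as described later in this post.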
Because CXU is a verification platform supporting multiple testbenches as well as multiple simulators, you can use this utility in several ways:
- In a multi-testbench configuration, you can compare your master set of data to any number of other tests. Similarly, in a multi-sweep configuration (parametric, corner, etc.), you can validate all your runs against a golden set of data. If you are a verification engineer, this eliminates the vast majority of manual “eyeballing” of analog or digital signals. Some of our users have reported reducing a full week of manual effort to 15 minutes when analyzing 100 analog waveforms.
- Because CXU supports multiple Synopsys and third-party simulators as well as data formats, you can compare a golden set of data (of course HSPICE!) to runs from different simulators. This feature is especially useful for CAD engineers dealing with multiple vendors and tools.
- Another interesting point for mixed-signal designers: you can validate your results against a golden set of data across both analog and digital domains. For example, you can compare analog to analog, analog to digital, or digital to digital. Similarly, using the CXU hierarchy editor, you can toggle views at the leaf level (for example, schematic versus Verilog-AMS) and compare results to your golden reference, allowing you to easily calibrate your behavioral models.
This interview post covers a fourth application :). Here, Branimir Ivetić and Vani Priya from Technology R&D – Smart Power Technology at STMicroelectronics talk about a unique flow they have developed using CXU Waveform Compare to validate their Smart Power device models on FastSPICE simulators, and how it has increased their confidence in Synopsys FastSPICE simulators.
Branimir Ivetić was born in 1979 in Sarajevo (Bosnia and Herzegovina) and moved to Italy with his family in 1994. He received his degree in Electronics Engineering from the Polytechnic University of Milan in 2003, while completing an apprenticeship in STMicroelectronics’ Flash Wireless Division. In 2004 he joined the Technology R&D department for Smart Power technologies, where he is currently part of the Design Enablement group as a Senior AMS Verification engineer.
Vani Priya holds a Bachelor’s degree in Electronics and Telecommunication Engineering and has been working for STMicroelectronics India since 2007 in the Technology R&D Smart Power Technology, Design Enablement group.
Q- What are your current job responsibilities and which technologies are you currently working with?
Branimir – My main job responsibilities include the development, deployment, maintenance and support of AMS verification methodologies for the ST Smart Power design community worldwide. The primary target of these methodologies is to guarantee first-silicon success for our Smart Power ASICs and applications. Some of these methodologies are:
- Pre- and post-layout simulation with SPICE and FastSPICE simulators;
- A/D co-simulation for efficient full-chip verification;
- Static and dynamic analog ERC (Electrical Rules Checking);
- Mixed-mode simulation and HDL-AMS modeling.
These methodologies are developed and supported for all of ST’s Smart Power technologies. In particular, our most advanced Smart Power technology is a 0.13um High-Density BCD.
Vani – My primary responsibilities include:
- BCD silicon device model validation on FastSPICE simulators;
- Continuous enhancement and support of FastSPICE simulators (HSIM, XA, etc.) for their efficient usage among the design community;
- Evaluation and deployment of specific AMS verification methodologies, like the ones based on HSIM Circuit Check, Verilog-A macro-models, etc.
Q- Branimir, please tell our readers about Smart Power Technology. What kinds of applications are typically targeted on Smart Power Technologies?
Branimir – In general terms, a Smart Power process technology is considerably more complicated than a lithographically corresponding pure CMOS technology, because these two process families are conceived for quite different final applications. The complexity gap shows up in the availability of more (and more complex) elementary devices, more operating voltages (spanning from a typical low voltage of 1.8V or 5V up to a maximum of 65V, 90V or 100V) and logically different blocks that all need to coexist on the same silicon die of a Smart Power ASIC.
The native name given to the Smart Power technologies was BCD (Bipolar-CMOS-DMOS). In fact, those technologies allow the integration of control logic functions (CMOS based), analog circuits for signal conversion/management (both CMOS and Bipolar based) and power management blocks (based on power DMOS structures) on the same chip.
On the other hand, the main differentiator of ST Smart Power technologies from competitors’ is the completeness of our portfolio. We offer High-Voltage, High-Power, High-Density and SOI BCD technologies, covering all lithographies (with products running from 2.0um to 0.13um) and all ST business segments. Some typical BCD applications are:
- Lighting applications and motor drivers in general;
- Motherboard DC-DC converters, HDD Power Combo, power line drivers and modems;
- Bio-medical applications like 2D or 3D ultrasound echography;
- Control and signal conditioning ASICs for proprietary MEMS sensors;
- Airbag, gasoline direct injection and ESP/ABS control, car battery monitoring;
- Car radio and consumer audio amplifiers, automotive and consumer electronics power management, etc.
Q – Branimir, how key is Device Model Validation on FastSPICE to your team?
Branimir – As I have just explained, such a wide range of applications, together with the high complexity typical of Smart Power technologies, requires efficient AMS verification methodologies supported by proven FastSPICE simulators.
The first mandatory step in supporting a FastSPICE simulator for a complex technology is silicon device model validation. In fact, finding an inaccuracy in the interpretation of device models during design is very expensive: fixing the device models in the simulator is not a trivial task, and may mean not using the FastSPICE simulator for one or two months, possibly disrupting delivery targets. To avoid such issues, we at ST – “Technology R&D Smart Power Technology” – consider it absolutely necessary to fully validate the correct support of our Smart Power device models by HSIM and XA, the simulators we officially adopt in our design flows. So the activity of device model validation, even if transparent to our design community, is really important, and one of the key factors allowing us to guarantee a high quality for our AMS verification methodologies.
The silicon device model validation flow consists of accurate simulations of dedicated sets of small test cases with three simulators (our golden SPICE, HSIM and XA), after which the resultant waveforms are compared.
Q- Vani, please give us more insight into your SNUG paper, which discusses this flow. What are the main advantages of this flow versus the earlier approach?
Vani – First, about the flow: each Smart Power technology has around 50 devices, with on average six test cases per device for device characterization. So we first generate around 300 (50 x 6) test cases for each FastSPICE simulator adopted by our design team. These test cases include AC, DC and transient simulations, according to the device characteristic that needs to be validated. Next, these simulations are submitted to the compute farm. Finally, the resultant waveforms from these 300 simulations are compared on a waveform viewer, visually taking into account the permissible absolute and relative tolerances.
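The testcase-generation step of such a flow can be sketched as a simple template expansion: one netlist deck per (device, analysis) pair, which is then submitted to the farm. The device names, analysis list and SPICE cards below are hypothetical placeholders, not ST's actual decks; this only illustrates the 50-devices-times-6-testcases bookkeeping described above.

```python
# Hypothetical sketch of batch testcase generation for device-model
# validation: one netlist deck per (device, analysis) pair.
# Device names, analyses and analysis cards are made up for illustration.

DEVICES = ["nmos_lv", "pdmos_65v"]             # ~50 devices in the real flow
ANALYSES = ["dc", "ac", "tran_pulse",
            "tran_ramp", "cv", "noise"]        # ~6 test cases per device

TEMPLATE = """* {analysis} characterization of {device}
.include models.lib
X1 d g s b {device}
.{card}
.end
"""

# Map each analysis to a (hypothetical) SPICE analysis card.
CARDS = {"dc": "dc Vg 0 5 0.01", "ac": "ac dec 10 1 1e9"}

def generate_decks(devices, analyses):
    """Return {filename: netlist_text} for every device/analysis pair."""
    decks = {}
    for dev in devices:
        for an in analyses:
            card = CARDS.get(an, "tran 1n 1u")  # default to a transient card
            decks[f"{dev}_{an}.sp"] = TEMPLATE.format(
                analysis=an, device=dev, card=card)
    return decks

decks = generate_decks(DEVICES, ANALYSES)
print(len(decks))  # 2 devices x 6 analyses = 12 decks
```

In the real flow this expansion runs once per supported simulator (golden SPICE, HSIM, XA), and each generated deck becomes one compute-farm job whose output waveforms feed the comparison step.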
One can imagine that performing all these steps manually is highly time-consuming and prone to human error. For these reasons we automated the validation flow, from test case generation through to plot comparison using CX Waveform Compare, and today it runs without any human intervention.
The main point that attracted us to CX Waveform Compare is its ability to read plot files from different simulators and in different formats. Moreover, we found its command language to be very powerful and well thought out (with all the bells and whistles). For example, one can set different tolerances for voltage and current waveforms, ignore certain regions of the waveforms, account for shifts along the X-axis, etc.
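Two of the rule features mentioned here, ignoring a region of the waveform and compensating an X-axis shift, can be illustrated with a short Python sketch. Again this is a generic toy model of the technique, not CX Waveform Compare's rules language; the function and its parameters are invented for illustration.

```python
# Illustrative sketch (not CXU's rules language) of two comparison rules:
# skipping an ignored time window, and realigning the target waveform by
# a whole number of samples along the X-axis.

def mismatch_count(golden, target, times, tol, ignore=(None, None), shift=0):
    """Count samples outside tolerance.

    ignore -- (t_lo, t_hi) time window to skip, or (None, None)
    shift  -- integer sample offset applied to the target waveform
    """
    t_lo, t_hi = ignore
    count = 0
    for i, t in enumerate(times):
        j = i + shift
        if j < 0 or j >= len(target):
            continue                       # shifted sample out of range
        if t_lo is not None and t_lo <= t <= t_hi:
            continue                       # inside the ignored region
        if abs(target[j] - golden[i]) > tol:
            count += 1
    return count

times  = [0.0, 1.0, 2.0, 3.0]
golden = [0.0, 1.0, 2.0, 3.0]
target = [9.9, 0.0, 1.0, 2.0]   # golden trace delayed by one sample

print(mismatch_count(golden, target, times, tol=0.1))           # 4 without realignment
print(mismatch_count(golden, target, times, tol=0.1, shift=1))  # 0 after shift=1
```

With `ignore=(0.0, 0.0)` the spurious first sample alone would be skipped instead; in practice such windows are used to mask start-up transients that are not meaningful for model validation.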
The validation flow has been in production now for over a year. With quite accurate rules settings for CX Waveform Compare, the time to compare two plot files with about 10 waveforms each has been reduced to a few seconds, and this in turn has reduced our validation turn-around time – including debugging of reported mismatches – from 8 weeks to 1 week.
As a pilot project, we have also run this flow on a medium-sized AMS IP with about 100 signals selected for comparison, and we were satisfied with the run time.
Thanks to Branimir and Vani for sharing their insights. For more information on their work, you can access the STMicroelectronics India SNUG paper at: