Posted by Alex Seibulescu on October 31, 2011
Unless you’ve completely isolated yourself from the wisdom of Verification and EDA pundits, you must have heard at some point that High Level Verification and High Level Synthesis are the way of the future. This has been true for at least the past 10 years and will most likely still be true for the next 10. The value is obviously there, but there are a myriad of t’s to cross and i’s to dot before some form of high-level flow becomes a viable alternative to RTL verification. In the meantime, we still need to find a way to make the current tried-and-true verification flows more efficient. So, as I always do when I grapple with existential verification topics, I paid a visit to my friend Coverage and, as always, he had an answer for me: High Level Coverage.
The idea is pretty straightforward. Modern verification tools (hint, hint) allow you to mix a SystemVerilog testbench with a DUT assembled from high-level SystemC or C++ models, many of which can be easily obtained off-the-shelf. Gradually, the high-level models can be swapped out for their more timing- and power-accurate RTL equivalents until we’re completely back in the RTL world. This appears to be an increasingly popular approach to dealing with the ever growing verification task we’re all aware of.

However, what is not yet widely recognized is the ability to shift a part of the RTL coverage closure task to the higher design abstraction level. Let me explain. Some verification tools (more hint, hint) allow you to tap into the internal signals of the high-level models and sample them in SystemVerilog functional covergroups or properties. This opens up the opportunity to develop and debug both the stimulus and the coverage side of the testbench in a much more efficient setting. Tests can be developed and graded based on the collected coverage, constraints can be relaxed or tightened, and the functional coverage model can be refined before any RTL is available. The setup can then be re-used when the high-level models are replaced with RTL, thereby bringing a significant boost in overall productivity. Invariably, some changes will be required to extend the transaction-level stimulus to a more finely controlled one, and new coverage targets may be added to test details not available in the virtual models, but a significant part of the testbench and coverage model development and debugging time has been shifted to an earlier stage where productivity can be orders of magnitude higher.
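To make the idea concrete, here is a minimal sketch of what sampling a high-level model’s internal state in a SystemVerilog covergroup might look like. The instance path, signal name, and bin values are all illustrative assumptions, not references to any particular tool or model:

```systemverilog
// Hypothetical sketch: a SystemVerilog covergroup sampling an internal
// occupancy counter inside a high-level (e.g. SystemC) FIFO model.
// "dut.fifo_model.fill_level" is an assumed hierarchical path into the
// high-level model, exposed by a tool that supports such mixed-language
// signal access.
covergroup fifo_level_cg @(posedge clk);
  coverpoint dut.fifo_model.fill_level {
    bins empty = {0};
    bins low   = {[1:3]};
    bins high  = {[4:6]};
    bins full  = {7};
  }
endgroup

fifo_level_cg fifo_cg = new();
```

When the high-level model is later swapped for its RTL equivalent, the same covergroup can in principle be retargeted at the corresponding RTL signal, which is exactly the reuse this flow is after.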
Some people never get old and learn to adapt to the ever changing times. My friend Coverage is definitely one of them.
"Coverage is by now pervasive in most verification flows but has, in the modest opinion of this blogger, yet to reach its full potential. Although I have spent most of my 18 years in EDA (ouch!) on the R&D side, I have always been a good listener to our customers' concerns. My hope is that this blog will be an informal venue for all of us to explore how to push the benefits of Coverage and related methodologies to new levels" —Alex Seibulescu