Coverage is My Friend


Silicon Billionaires

There are many of them these days. They used to be multi-millionaires, but you know how these things go. Now, before you get too excited about the latest scoop on the yacht-sailing, private-jet-flying, Davos-skiing crowd, remember that I don’t typically hang out with Larry, Steve, or Mark; I hang out with my friend Coverage. So this is not about the folks on the cover of Forbes Magazine, it’s about our favorite chips and the billion transistors they pack these days. More precisely, it’s about verifying that a billion switches, turning on and off, somehow harmoniously join forces to perform billions of instructions per second, transfer the proverbial Library of Congress across the country in minutes, or make your picture on the iPad look better than reality. How do we make sure that these ever-growing billionaires do what they’re supposed to without collapsing under the enormous amount of verification data they generate?

Armed with traditional wisdom, I went and told my friend Coverage that the only way forward is to raise the level of abstraction at which we describe the functionality of the billionaires. “Think about what happened when we moved from gate level to RTL,” I said. As often happens lately, his answer took me by surprise. “People have been talking about transaction-level modeling, SystemC, C++ and so on for a long time,” he said, “but the bulk of verification is still done at RTL.” We went back and forth trying to figure out whether this is because of timing and power concerns or something else entirely, but we eventually agreed that, whatever the reason, things are what they are. Maybe they’ll change one day, but what do we do in the meantime?

And then it clicked. Rather than fighting with the abstraction level of the design description, how about raising the abstraction level of the verification data generated at RTL? Synthesize coverage data at the level of the verification plan, build infrastructure to track higher-level transactions via smart log files, build tools to analyze protocols, and so on (a small sketch of the idea follows below). Verification can continue to be done at RTL, but the analysis of the generated data should be raised to a level where it can be dissected and comprehended much more easily. But wait! Aren’t we simply shifting a highly complex hardware modeling problem to an equally challenging tool development problem? Maybe, but challenging tool development is what we like to do 😉 My friend Coverage and I suddenly felt like a billion is not such a large number anymore.
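To make that a little more concrete, here is a minimal sketch of what “raising the abstraction level of the verification data” might look like in practice: scrape transaction records out of an RTL simulation log and roll them up against verification-plan items instead of staring at signal-level dumps. The log format, the plan item names, and the helper functions below are all made up for illustration; nothing here is a real tool or a prescribed flow.

```python
import re
from collections import Counter

# Hypothetical log format a testbench monitor might emit, e.g.:
#   "[1234 ns] TXN axi_write addr=0x1000 len=4 resp=OKAY"
TXN_RE = re.compile(r"\[\s*(\d+)\s*ns\]\s+TXN\s+(\w+)\s+(.*)")

# Hypothetical verification-plan items, each mapped to a predicate
# over a parsed transaction record.
VPLAN = {
    "axi.write.single_beat": lambda t: t.get("kind") == "axi_write" and t.get("len") == 1,
    "axi.write.burst":       lambda t: t.get("kind") == "axi_write" and t.get("len", 0) > 1,
    "axi.read.any":          lambda t: t.get("kind") == "axi_read",
    "axi.error_response":    lambda t: t.get("resp") not in (None, "OKAY"),
}

def parse_txn(line):
    """Turn one log line into a transaction dict, or None if it isn't a TXN line."""
    m = TXN_RE.match(line.strip())
    if not m:
        return None
    txn = {"time_ns": int(m.group(1)), "kind": m.group(2)}
    for key, val in re.findall(r"(\w+)=(\S+)", m.group(3)):
        try:
            txn[key] = int(val, 0)   # handles decimal and 0x... values
        except ValueError:
            txn[key] = val           # keep non-numeric fields as strings
    return txn

def plan_coverage(log_lines):
    """Count how many observed transactions satisfy each verification-plan item."""
    hits = Counter({item: 0 for item in VPLAN})
    for line in log_lines:
        txn = parse_txn(line)
        if txn is None:
            continue
        for item, predicate in VPLAN.items():
            if predicate(txn):
                hits[item] += 1
    return hits

if __name__ == "__main__":
    sample_log = [
        "[1200 ns] TXN axi_write addr=0x1000 len=1 resp=OKAY",
        "[1560 ns] TXN axi_write addr=0x2000 len=4 resp=OKAY",
        "[1980 ns] TXN axi_read  addr=0x2000 len=4 resp=SLVERR",
    ]
    for item, count in sorted(plan_coverage(sample_log).items()):
        print(f"{item:25s} {count}")
```

The point of the sketch is only the shape of the flow: the simulation still runs at RTL, but what you review afterward is a handful of plan-level counts (including the items that were never hit) rather than a billion toggling signals.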
