A View from the Top: A Virtual Prototyping Blog

 

(Not) Trusting Statistics (not generated by myself)


Happy New Year! Are we really in 2009 already? It feels somewhat unreal … I started my pre-spring cleaning at home and found some old statistics which reminded me of a couple of posts on statistics I have seen recently in the Blogosphere. We probably all have heard of Mark Twain’s “Lies, damned lies and statistics”. My variation of it is my golden rule to only trust statistics which I had a chance to influence myself 🙂 (there are probably more cynical ways of putting this).

Recently, Grant Martin was trying to bust a couple of myths related to numbers. First, Grant took on the “70% goes to verification” rule. He later asked for data to validate that the 80/20 rule holds for development as well, i.e., that 20% of the effort can determine 80% of the development properties, such as power, cost etc.

Well, trust us Germans. Some of us measure everything. No, I won’t disclose my actual weight and height of 10 years ago, but one related item I ran across was a set of measured data on development efforts. It was collected between 1991 and 1994 across four projects in the video encoding and decoding domain.

Anyway, at the time we – a team at the Technical University of Berlin working for the German Telekom on HDTV Motion Estimation chipsets – were also measuring our development efforts very precisely, i.e., how much effort each part of the development consumed. All this was driven by Christian von Reventlow, who wrote his Ph.D. thesis on the topic of complexity measurement for hardware development projects. I picked up some of his work later. The objective was to gather enough data to eventually find out which complexity measures correlate well with actual development efforts for hardware blocks. We came up with a hardware equivalent of the well-known function points used in the software world.

[Figure: Development effort across four projects]

So back to Grant’s question: at least for the verification piece I can offer some harder data, albeit somewhat dated. The figure shows the effort distribution across four chip development projects. The first four elements are architecture definition, specification, RTL development and schematic entry. Yup, we are talking 1994; we did not use synthesis yet. After the bar come the verification tasks: manual code inspections, validation by simulation and validation against a software reference (used in only one chip). If we look at this as design vs. verification, then the verification effort, even for these relatively small chips – they were all between 90K and 200K gates – was 54%, 44%, 46% and 45%.

Is 70% for verification out of the question? Not really. Scaling the above data up to more complex designs can easily add 20%. In addition, the data above was dominated by design and verification of the IP blocks themselves. If one now takes into account the system integration, i.e., connecting the IP blocks to make up a system or a system on chip, then verifying that integration will easily add some more verification complexity.
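A quick back-of-the-envelope sketch of the argument above. The measured shares come from the figure; the +20 percentage points for design-complexity scaling is the rough adjustment mentioned in the text, not a measured number, and the integration overhead would come on top of it:

```python
# Measured verification share per project (from the four-project figure).
measured = [0.54, 0.44, 0.46, 0.45]

# Average verification share across the four ~90K-200K gate chips.
avg = sum(measured) / len(measured)

# Rough adjustment (an assumption, not data): scaling to more complex
# designs can easily add ~20 percentage points of verification effort,
# before even counting system-integration verification.
scaled = avg + 0.20

print(f"average measured verification share: {avg:.0%}")
print(f"scaled estimate for complex designs: {scaled:.0%}")
```

The scaled estimate lands in the high sixties, which is why the 70% figure does not look like a myth once integration verification is added on top.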

Bottom line: this data seems to confirm that 70% effort for verification is a realistic number. My real interest today, however, lies on the software side. I had better get on from here to proving that it is not a myth that software development effort increasingly dominates hardware development effort …
