A View from the Top: A Virtual Prototyping Blog


Good Reasons to Drop Old Habits? Change is Hard!

I am frequently involved in discussions about the adoption of system-level technologies. System-level design in EDA and embedded software are always intertwined, as software is the main factor that changes when going beyond RTL. Given that system-level design technologies expand beyond the traditional realm of hardware, their adoption is non-trivial for project teams. The overall situation reminds me more and more of Malcolm Gladwell’s “Outliers”: for success, several factors have to fall into place together, and not all of them are in our control.

In “Outliers”, Malcolm Gladwell describes “The Story of Success” as it relates to people. He argues very convincingly that several things need to fall into place. Among them is being born at the right time of year, which makes a child more mature and therefore more likely to be chosen for special training in sports. Another is being there at exactly the right time, when a technology like computers is needed. In addition, the right background is crucial for developing a skill – parents supporting training, educational opportunities, or the availability of excess compute time on a mainframe.

The same is true in principle for technology adoption, and most certainly for system-level design technologies. The time for adoption needs to be right. Being too early means you simply may not have enough breathing room to get to the mainstream, which almost certainly means that you will fall into the chasm instead of crossing it. Having the right technology, combined with the right environment for it to be adopted, is crucial. The actual user need has to be there too. If users can get by with traditional techniques, then the reasons to change methodologies, i.e. to drop old habits, are not good enough. Finally, even when the time is right, the technology exists, and the need is there, users still need to know about the solution. That’s where the power of marketing comes in.

So where are we with system-level technologies? Let’s look back 15 years. As some of you know, I came to the US in 1997 to run technical marketing for the Felix Initiative at Cadence. The resulting product, VCC, has been described – together with Synopsys’ Behavioral Compiler – as one of the two system-level trailblazer projects in “ESL Design and Verification: A Prescription for Electronic System Level Methodology (Systems on Silicon)” (Martin, Bailey, Piziali). Were we too early? Probably. Leading customers told us, though, that they needed function-architecture co-design as implemented in VCC at the time, and they documented that need with their tool purchases. Was the technology right? Well, it worked for the early adopter designs, and most of its concepts – like transaction-based design – now find themselves in technologies like SystemC TLM 2.0. Were users getting by with traditional techniques? Absolutely! It was getting harder, but why abandon a methodology that still works? Did users know about it? Well, we marketed it quite a bit, and knowledge about system-level design approaches was growing.

So what was missing? The technical barrier to adopting the technology was simply too high. In the case of VCC we were addressing hardware and software designers with a combined methodology. We had developed techniques to abstract hardware (processors, busses, peripherals, etc.) and to abstract software (RTOSs, drivers and the actual functionality), and we could even create the implementations from the complete executable system-level specification. Heck, at DAC in 2001 we were demoing a flow in which the customer told us during the demo whether a task should run in hardware or software; we then automatically re-built the design and showed the resulting hardware layout together with the implemented software image.

The results were awesome: we achieved much faster development times and more robust designs that met their specifications. Too bad that software and hardware developers had to completely change the way they were doing things. Also, the complete ecosystem of processor, IP, RTOS and middleware providers had to supply models to make it all work.

In the spirit of fellow blogger Steve Leibson’s law that “it takes 10 years for any disruptive technology to become pervasive in the design community”, I am tempted to formulate my own “Schirrmeister’s Law”: “The ability to adopt a new design technology is inversely proportional to the number of changes it requires”. Sounds trivial. Still, it is something I learned the hard way during the VCC days and something we often overlook today.
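As a back-of-the-envelope illustration, the law could be written as A = k / N, where A is a relative adoption score, N the number of changes a technology demands of its users, and k a scaling constant. A minimal sketch in Python – the constant and the change counts below are purely hypothetical, not measured data:

```python
# Toy model of "Schirrmeister's Law": adoption ability is inversely
# proportional to the number of changes a technology requires.
# The constant k and the change counts are hypothetical illustrations.

def adoption_ability(num_changes, k=1.0):
    """Return a relative adoption score: A = k / N."""
    if num_changes < 1:
        raise ValueError("a new technology requires at least one change")
    return k / num_changes

# Hypothetical comparison: VCC asked hardware designers, software
# designers, and the model ecosystem to change; a virtual prototype
# asks only the hardware side to add a modeling step.
changes = {"VCC": 3, "Virtual Prototyping": 1}
scores = {name: adoption_ability(n) for name, n in changes.items()}
print(scores)  # fewer required changes yield a higher relative score
```

The interesting (and hard) part, of course, is measuring N: what counts as a “change”, and are all changes equally costly?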

If I could just quantify and measure it. That is hard, but here is an example: almost 15 years later, Virtual Prototyping is going through its own adoption. How many changes does it require? Well, compared to VCC it is much more adoptable, given that software developers do not need to abstract anything. The real software – namely .elf files – runs on abstracted hardware. The number of changes – or additional steps – is limited to the hardware developer.

So what about the other adoption criteria? Is the time right? Well, software development has probably never gotten as much attention from the EDA side as it has these days, and our customers are expressing their concerns in every meeting I am at. Do we have the right technology combined with the right environment for it to be adopted? The successes seem to speak for themselves – we have real users achieving real results. Is the user need there, or do users get by with traditional techniques? We recently did a survey of about 800 embedded software developers. One of the questions we asked was “What target execution environment are you using for your embedded software development?”. The results are shown on the left. The top four answers point to hardware-based techniques, so those seem to work. Probing for limitations of those techniques, users confirmed issues I have written about in this blog before – like the ability to control and debug the execution. Finally, we asked whether software developers are aware of virtual prototyping, to check whether target users know about the technology. There is definitely room for marketing effort here, as the awareness was lower than we had hoped.

Bottom line: the adoption of system-level technologies like Virtual Prototyping remains a hot topic. Certainly the industry has lowered the barriers to adoption, but there is still some distance to go. The number of changes we ask users to make in order to adopt new technologies is often still significant enough to make them shy away from adoption, even if it means mortgaging the future …

I will think about how to measure my postulated “Schirrmeister’s Law” above with quantifiable data. As always, I welcome your thoughts.
