A View from the Top: A Virtual Prototyping Blog

 

Inflexibility? That’s so 2009 :)

With the year coming to an end faster than I can really comprehend (have you started on your Christmas wish list yet?), I am looking back at what I said would be important going into 2010. In my Electronic Design column's forecast, "2010 Will Change The Balance In Verification," I suggested that software development would change how verification is done. Well, looking back I can confirm that this is happening, albeit not only in the way I had suggested.

I have previously quoted Janick Bergeron, my fellow blogging colleague on verification, saying that the "solution to the verification problem lies in the design process". 2010 validates for me that this is really becoming a reality. Simply put, design is changing, and one of the main reasons is verification. To enable successful chip and system design, while also enabling the different participants in the design chain from IP providers through semiconductor vendors and integrators to OEMs, three capabilities are required. Developers need to be enabled to

  • re-use as many blocks for chip design as possible and have tools to create new, differentiating blocks as fast as possible
  • assess whether blocks (new and re-used) will work correctly when integrated
  • prototype the “chip to be” for early software development, hardware-software integration and verification

The last item is successfully addressed today by virtual platforms and FPGA prototypes, depending on the point in the design flow at which the project team or their customers decide to enable software developers and verification. The integration aspect is addressed by IP-XACT-based tools at the RT level (for RTL assembly) and by architecture design tools at the transaction level (to understand bus bandwidth, the impact of mapping software onto multiple processors, etc.). To efficiently enable blocks, we have a thriving implementation IP licensing market and various tool options for high-level block design and synthesis, which brings me back to the topic of verification.

The main reason for high-level design of blocks is not a faster path to implementation, better coding efficiency, or re-use and connections to system-level models. The main reason for high-level design of blocks is verification!

I have already used the picture on the left in a previous post on high-level synthesis. It shows the various options a user has to implement a block once the idea for the block is born. The options range from dedicated hardware implementation – read "inflexible" – to full software implementation – read "flexible". There are various grades of flexibility in between.

Especially in consumer applications, flexibility has become a key issue today. In order to meet changing standards and to support different versions of algorithms with the same intent (like encoding and decoding video or audio), mixed hardware/software implementations are very attractive. Two of the implementation options have a profound impact on verification and are worth singling out, as verification actually seems to be the most important driver for their adoption:

  • High-Level Synthesis (HLS) for fixed hardware blocks is on a path to real adoption, especially for data-path-oriented designs, but also for control. Besides the advantage of getting to the actual implementation faster, verification also changes in a way that makes HLS attractive. The basic idea is that more verification can be done at higher levels of abstraction, i.e. TLM, with the path to implementation then automated (a small sketch of this idea follows after this list). This is the same reason the industry switched digital design from the gate level to the RT level quite some time ago. In my first project I entered the blocks of an FFT chip at the gate level, but already verified them in RTL. If logic synthesis had existed then the way it does today, I would not have touched the gate level at all.
  • High-Level Synthesis for custom processors has emerged as an attractive option to get from idea to implementation. Flexibility of the implementation is key here. In contrast to a fixed hardware block and its high-level synthesis, the main functionality runs in software on a custom processor, which is automatically synthesized from a high-level architecture description language. This is a great approach for implementing standards which have not yet been finalized, as well as algorithms which are similar in nature (like different video standards). It turns out that verification is an even bigger motivator to adopt this approach, because it eliminates the fundamental shortcoming of having to verify hardware blocks: prior to taping out the design for ASIC implementation, the design's functionality has to be verified in full using hardware-oriented test benches, which for a video algorithm, for example, include all variations of bit rates, frame sizes, etc. Custom processors, or Application-Specific Instruction-set Processors (ASIPs), offer a unique, fundamentally different alternative for verification. Given that the actual functionality – which drives the majority of the verification effort – is not implemented in hardware but has moved into software, the correctness of the custom processor hardware itself can be verified independently of the function it executes. As a result of this separation of structural and functional verification, the two can also be separated in time. Specifically, once the correctness of the custom processor execution is verified, the implementation can proceed while functional verification is done fully in parallel in software (see the second sketch after this list).
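
To make the first bullet concrete, here is a minimal sketch of what "verifying at a higher level of abstraction" can look like: an untimed C++ description of a small FIR filter, checked functionally on the host before any RTL exists. This is a hypothetical example, not tied to any particular HLS tool; an HLS flow would take a function like this as input and generate the implementation.

```cpp
// fir.cpp - a minimal, untimed C++ sketch of an HLS-style block.
// The block's behavior is captured and verified at this abstraction
// level; an HLS tool would then generate the RTL from it.
// All names here are hypothetical, not tied to any specific tool.
#include <array>
#include <cstdint>
#include <cstdio>

constexpr int TAPS = 4;

// High-level description of a fixed-point FIR filter. An HLS tool
// would map the delay line and the loops onto hardware resources.
int32_t fir(int16_t sample, const std::array<int16_t, TAPS>& coeff) {
    static std::array<int16_t, TAPS> shift_reg{}; // delay line
    for (int i = TAPS - 1; i > 0; --i)            // shift in new sample
        shift_reg[i] = shift_reg[i - 1];
    shift_reg[0] = sample;
    int32_t acc = 0;                              // multiply-accumulate
    for (int i = 0; i < TAPS; ++i)
        acc += static_cast<int32_t>(shift_reg[i]) * coeff[i];
    return acc;
}

int main() {
    const std::array<int16_t, TAPS> coeff = {1, 2, 2, 1};
    // Functional verification happens here, at the untimed level:
    // feed an impulse and check that the coefficients come back out.
    for (int n = 0; n < TAPS; ++n) {
        int32_t y = fir(n == 0 ? 1 : 0, coeff);
        std::printf("y[%d] = %d (expected %d)\n",
                    n, static_cast<int>(y), static_cast<int>(coeff[n]));
    }
    return 0;
}
```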
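
And for the second bullet, a hypothetical sketch of the separation the ASIP approach enables: the kernel below is plain software, so its functional verification runs on the host, fully in parallel with implementation, while the custom processor that will eventually execute it is verified structurally against its instruction set, independent of this particular function.

```cpp
// kernel.cpp - sketch of the ASIP verification split (hypothetical
// example). On an ASIP this function would be cross-compiled for the
// custom instruction set; the processor's RTL is verified structurally
// (does it execute its ISA correctly?), while this function is verified
// functionally on the host - no hardware test bench required.
#include <cstdint>
#include <cstdio>

// A small signal-processing kernel: saturating 16-bit addition.
int16_t saturating_add(int16_t a, int16_t b) {
    int32_t sum = static_cast<int32_t>(a) + b;
    if (sum > INT16_MAX) return INT16_MAX;
    if (sum < INT16_MIN) return INT16_MIN;
    return static_cast<int16_t>(sum);
}

int main() {
    // Host-side functional checks, runnable long before tape-out.
    struct { int16_t a, b, expect; } tests[] = {
        {  1000,   2000,  3000     },
        {  30000,  10000, INT16_MAX }, // positive overflow saturates
        { -30000, -10000, INT16_MIN }, // negative overflow saturates
    };
    for (auto& t : tests)
        std::printf("%d + %d -> %d (expected %d)\n",
                    t.a, t.b, saturating_add(t.a, t.b), t.expect);
    return 0;
}
```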

Well, all this happened faster in 2010 than I had expected, and it is really driven by verification. To close this post and come back to my original 2010 prediction on verification: I was correct in principle. We have more customers using embedded software and directed tests written in software to verify hardware blocks. It is now the second most adopted use model for virtual prototypes (after early software development). The adoption of high-level design for blocks – custom processors and fixed hardware alike – is further changing the verification dynamic, especially for block verification.

It will be interesting to see how this will play out in 2011. Interesting times!
