Posted by Michael Posner on September 13, 2013
Over the last two weeks I’ve been traveling in China and Taiwan talking to FPGA-based prototyping users at two Synopsys User Group (SNUG) events. FPGA-based prototypes in these regions are used to validate all types of designs, from smaller RTL blocks to video cores and mobile application processors. The engineers face the same challenges as engineers in the rest of the world, mainly in the area of converting ASIC RTL into FPGA images.
This is what the Great Wall of China looks like over a busy weekend…
I was a little surprised to find so many engineers manually converting gated clocks, for example, rather than using vendor tools such as Synopsys’ Certify, which does this automatically. After learning which tools were in use, it was no surprise to hear the complaint that synthesis for huge FPGA devices such as the Xilinx Virtex-7 2000T takes a very long time. Almost all the engineers use the FPGA vendor tools, which do a great job but lack key capabilities targeted at FPGA-based prototypers. The engineers found it hard to believe that there are tools available that not only perform automated gated-clock conversion but also include “fast” prototyping modes that reduce the time to an operational prototype. This is why I titled this blog “Mind the Gap”: I felt there was a knowledge gap. The engineers were very intelligent but lacked the industry knowledge of which tools were available to them. I am sure it will not be long before these engineers expand their flows and make better use of the tools at their disposal.
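To make the gated-clock problem concrete, here is a minimal RTL sketch of the kind of transformation that tools like Certify automate. The signal names (`clk`, `en`, `d`, `q`) are illustrative, not taken from any specific design: an ASIC-style AND-gated clock is glitch-prone and routes poorly on FPGA fabric, so the gate is rewritten as a synchronous clock enable on the global clock.

```verilog
// ASIC-style gated clock: the register is clocked by clk AND en.
// On an FPGA this forces the gated net onto local routing and
// risks glitches, so it must be converted.
assign gclk = clk & en;

always @(posedge gclk)
    q <= d;

// FPGA-friendly equivalent after gated-clock conversion: the
// register stays on the global clock and the gate term becomes
// a clock enable, which maps to the flip-flop's CE pin.
always @(posedge clk)
    if (en)
        q <= d;
```

Doing this by hand across thousands of registers is exactly the tedious, error-prone work the engineers I spoke with were taking on themselves.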
At the SNUG events we took the opportunity to show and tell with the HAPS-70 systems. Below is a picture from SNUG Taiwan of me holding the HAPS-70 S48, our 48 million ASIC gate capacity system.
Yes, business travel is hard work, but someone has to do it…
We were also demonstrating the Embedded Vision Development System. For more information on this solution, including a video, follow this link: http://www.synopsys.com/Systems/BlockDesign/ProcessorDev/Pages/Videos.aspx
The demonstration was running live on a HAPS-62 system. The demo design operates at close to real-time speed, and it was pretty cool to see the live images.
A webcam is used to capture video, which is streamed onto the HAPS-62, where the application-specific processor created by Synopsys’ Processor Designer operates on the image. The mode shown above was an outline-capture function. Of course, now that I can see my own outline, it highlights just how big my ears are. Why did no one tell me this?!