Posted by Alex Seibulescu on February 15th, 2011
After a surprisingly long period of sunny skies, the clouds have returned to the Bay Area. Now, although some say that predicting when a chip will be ready for tape-out is akin to forecasting the next storm, I have not decided to subtly shift the topic of this blog to the exciting world of meteorology. Instead, I wanted to explore the nexus of coverage, the kind my friend is so enthused about, and cloud computing. If you haven't yet heard about cloud computing, you're probably reading this blog by accident, but that's OK; my friend and I are highly socially predisposed, and we're always happy to meet new people. There are many significantly better descriptions of what cloud computing really is, but I will offer my own to keep things simple. In my admittedly irrelevant opinion, cloud computing is a combination of hardware and software resources provided as a service to some end consumer. The "cloud" part comes from the fact that you don't really have to know or care where these services reside or come from, much like actual clouds: you don't know where they come from, where they'll go next, or how high up they are, only whether they provide shade, rain, that sort of thing. In any case, cloud computing is part of our new reality, so I asked my friend whether he thinks there is any potential symbiosis between the lofty clouds and our daily challenges. The answer came with lightning speed and in a thunderous voice.
Designs are notoriously getting larger at a rapid pace, and that becomes uniquely alarming if you wear verification engineering shoes. One way to cope with the formidable task of verifying them is to collect all sorts of coverage data and use it as a guide for assessing verification completeness, identifying areas to focus verification on, and so forth. Naturally, this translates into overhead in both simulation performance and disk usage, as all this voluminous data needs to be generated and stored. Although a strong case could be made for using the cloud to address the performance flavor of overhead, my friend vigorously homed in on the second aspect: storage. If your situation is like that of many other organizations in the semiconductor business, you may also have discovered that disk storage has become an increasingly important slice of your overall IT infrastructure cost. What fraction of the total pie this represents depends, of course, on many factors and will vary from site to site, but the bottom line is that it is a problem today, it will be a bigger problem tomorrow, and cloud computing may be one way to tackle it. Picking Amazon as an example, my friend pointed out that even the middle-of-the-pack "m1.large" standard cloud instance comes with 850GB of temporary local storage. That's a lot of disk space included at no extra cost as part of doing business in the clouds. One could easily imagine running a regression in the cloud, collecting coverage from each test, merging it all out there, and permanently storing or downloading only the merged database for further processing. The temporary data for each regression test, the biggest component of the peak disk usage, can be kept on the local "free" cloud storage and then discarded after merging.
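To make the merge-then-discard idea concrete, here is a minimal sketch of that regression flow. Everything in it is illustrative: real coverage databases and the merge step are simulator-specific (a vendor tool would do the merging), so per-test coverage is modeled here simply as a set of covered bins written to scratch files, and `run_regression` and `merge_coverage` are hypothetical names, not any actual tool's API. The point is the shape of the flow: the bulky per-test data lives only on temporary local storage and is deleted once the merged result exists.

```python
# Sketch of the cloud regression flow described above (all names hypothetical):
# per-test coverage lands on temporary local disk, only the merged result survives.
import shutil
import tempfile
from pathlib import Path

def merge_coverage(per_test_dbs):
    """Merge per-test coverage by taking the union of covered bins."""
    merged = set()
    for db in per_test_dbs:
        merged |= db
    return merged

def run_regression(tests):
    """Run each test, merge coverage, discard the temporary per-test data."""
    # Stand-in for the instance's "free" local storage, e.g. the 850GB
    # that comes with an m1.large.
    scratch = Path(tempfile.mkdtemp(prefix="regress_"))
    try:
        per_test_dbs = []
        for name, covered_bins in tests.items():
            # Stand-in for running a simulation that writes its coverage DB.
            db_file = scratch / f"{name}.cov"
            db_file.write_text("\n".join(sorted(covered_bins)))
            per_test_dbs.append(set(db_file.read_text().splitlines()))
        # Only the merged database would be stored permanently or downloaded.
        return merge_coverage(per_test_dbs)
    finally:
        # The biggest component of peak disk usage is thrown away after merging.
        shutil.rmtree(scratch)

tests = {
    "t1": {"fifo.full", "fifo.empty"},
    "t2": {"fifo.empty", "arb.grant0"},
}
merged = run_regression(tests)
print(sorted(merged))  # prints ['arb.grant0', 'fifo.empty', 'fifo.full']
```

In practice the merge would be done by the simulator vendor's coverage tooling rather than a set union, but the economics are the same: peak disk usage stays on disposable cloud-local storage, and only the small merged database crosses the wire.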
Identifying an opportunity and taking advantage of it are of course not synonymous. Simulation vendors need to provide integrated cloud solutions and verification teams need to unleash some innovative thinking to fully take advantage of them, but the possibilities are tantalizing. Just ask my friend. He’s seen a bright light behind the cloud coverage and is convinced that getting to the clouds is, for once, not rocket science!