The first formal verification of a processor?

It was 1992 and I was about to leave Rockwell-Collins and move to Austin, Texas to work for Computational Logic Inc., a formal verification research company. My co-worker Bud and I were chatting while walking back from the cafeteria.

“So, what will you be doing for this new company?”

Bud was a long-time Collins employee, self-taught in many things, with a healthy dose of skepticism about “the latest thing.”

Continue reading “The first formal verification of a processor?”

SoC design cost model usage and implementation

In the last post we created a cost model of the SoC design process, modeling development stages separately and composing them for the overall design flow. Each stage models a few basic components: baseline development work, iterations (e.g., debug loop), hand-offs to and from other stages, and bug reporting.
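The per-stage components above can be sketched in a few lines of code. This is a minimal illustration, not the model from the post itself, and all names and numbers are placeholders:

```python
# Sketch of a single-stage cost model (hypothetical names and figures;
# the actual model in the post may differ). Costs are person-days.
from dataclasses import dataclass


@dataclass
class Stage:
    baseline: float          # baseline development work
    iteration_cost: float    # cost of one pass through the debug loop
    iterations: int          # expected number of debug iterations
    handoff_cost: float      # hand-offs to and from other stages
    bug_report_cost: float   # cost of reporting one bug
    bugs_reported: int       # bugs reported out of this stage

    def cost(self) -> float:
        return (self.baseline
                + self.iterations * self.iteration_cost
                + self.handoff_cost
                + self.bugs_reported * self.bug_report_cost)


def flow_cost(stages):
    """Compose per-stage costs into an overall design-flow cost."""
    return sum(s.cost() for s in stages)


# Example: an RTL stage and a verification stage with made-up numbers.
rtl = Stage(baseline=100, iteration_cost=2, iterations=30,
            handoff_cost=10, bug_report_cost=0.5, bugs_reported=40)
verif = Stage(baseline=120, iteration_cost=3, iterations=50,
              handoff_cost=10, bug_report_cost=0.5, bugs_reported=60)
print(flow_cost([rtl, verif]))  # prints 500.0
```

Composing stages this way lets each stage be parameterized independently before summing into a flow-level estimate, which is the structure the multi-stage discussion below builds on.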

This post discusses using a single-stage model and a multi-stage model, along with implementation details, with examples drawn from the RTL and verification stages.
Continue reading “SoC design cost model usage and implementation”

Building a Cost Model for an SoC Design Flow

“How much is verification going to cost?”

The last post started with this simple question and looked at empirical studies on the cost of bugs. This time we will approach it from the opposite direction and create a cost model for the System-on-Chip development process, including the cost of bugs.

A cost model parameterized with a project’s data is more than a theoretical framework – it is a management tool: it supports methodology and tool tradeoffs, identifies opportunities for improvement, and aligns with the efficiency metrics described in the Project Metrics post.
Continue reading “Building a Cost Model for an SoC Design Flow”

The cost of bugs

It all started with a simple question: “How much is verification going to cost?”

This is an important question since verification is a major part of System-on-Chip development efforts – 56% according to a recent study (the 2012 Wilson Research Group study sponsored by Mentor Graphics). The same study shows that verification engineers spend 35% of their time in debug.

Understanding the cost of bugs seems like a good place to start looking for answers.
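Combining the two figures from the study gives a rough sense of the stakes. A back-of-the-envelope calculation (using only the percentages cited above):

```python
# How much of total SoC development effort goes to verification debug?
# Figures from the 2012 Wilson Research Group study cited above.
verification_share = 0.56  # verification's share of total development effort
debug_share = 0.35         # share of verification time spent in debug

debug_fraction_of_total = verification_share * debug_share
print(f"{debug_fraction_of_total:.0%}")  # prints 20%
```

In other words, on the order of one fifth of total development effort is spent debugging – which is why the cost of bugs is worth modeling carefully.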
Continue reading “The cost of bugs”

Big Data Analytics for Verification?

Big Data has been getting lots of attention recently, and for good reason – the capability to store and analyze large data sets presents opportunities we have never had before.

I have used ad hoc tools to analyze “Medium Data” for instruction set optimization, company-wide debug tool usage, and random fails in a compute farm. It is easy to see the potential for larger data sets and more powerful analytic tools.
Continue reading “Big Data Analytics for Verification?”

Project Metrics

Why are our projects predictably unpredictable?

The 2012 Wilson Research Group Functional Verification Study shows that around 67% of our projects are late, continuing a trend going back at least ten years.

As an industry we are pretty bad at estimating how much time it will take to complete a design unless we have previously done something very similar. Yet many projects involve substantial changes: implementing new protocols, incorporating new versions of IPs, increasing the number of CPU cores, implementing new power management schemes, etc.

We know there are going to be surprises, but spotting problems early enough to take corrective action is difficult: many projects use design teams located around the world with different tools, methodologies, and even design goals. Getting timely, accurate information is a challenging problem in itself.

This analysis is reinforced by a recent design management survey showing that the top requirement from project management was global visibility.

Continue reading “Project Metrics”