The pitfalls of mixing formal and simulation: Where trouble starts
By Mark Eslinger, Joe Hupcey and Nicolae Tusinschi (Siemens EDA)
EDN (May 23, 2022)
The most effective functional verification environments employ multiple analysis technologies, where the strengths of each reinforce the others to help ensure that the device under test (DUT) behaves as specified. However, this creates an inherent challenge: properly comparing and combining the results from each source to give a succinct, accurate picture of the verification effort's true status.
The most common problem we see arises when design engineers want to merge formal analysis results with the RTL code and functional coverage results from their UVM testbench, yet they don't fully understand what formal coverage is providing. Hence, we will start on the familiar ground of simulation-generated code and functional coverage before defining formal coverage.
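To ground the simulation side, here is a minimal sketch of what simulation functional coverage looks like in SystemVerilog; the DUT (a 16-entry FIFO), the module, and all signal and bin names are hypothetical, chosen only for illustration:

    // Hypothetical 16-entry FIFO: collect functional coverage on its
    // status flags and fill level, sampled on every rising clock edge.
    module fifo_cov_example (
      input logic       clk,
      input logic       full,
      input logic       empty,
      input logic [3:0] level
    );
      covergroup fifo_cg @(posedge clk);
        cp_level : coverpoint level {
          bins empty_lvl  = {0};        // FIFO drained
          bins mid_lvls[] = {[1:14]};   // one bin per intermediate fill level
          bins full_lvl   = {15};       // FIFO at capacity
        }
        cp_flags : cross full, empty;   // flag combinations seen during simulation
      endcovergroup

      fifo_cg cg = new();               // instantiate so sampling begins
    endmodule

The simulator reports which bins the stimulus actually exercised; those per-bin hit counts are the functional coverage results that engineers later attempt to merge with formal results.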