The pitfalls of mixing formal and simulation: Where trouble starts
By Mark Eslinger, Joe Hupcey and Nicolae Tusinschi (Siemens EDA)
EDN (May 23, 2022)
The most effective functional verification environments employ multiple analysis technologies, where the strengths of each reinforce the others to help ensure that the device under test (DUT) behaves as specified. However, this creates an inherent challenge: properly comparing and combining the results from each source to give a succinct, accurate picture of the verification effort's true status.
The most common problem we see is design engineers wanting to merge the results of formal analysis with the RTL code and functional coverage results from their UVM testbench without fully understanding what formal coverage actually provides. Hence, we will start on the familiar ground of simulation-generated code and functional coverage before defining formal coverage.
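As a concrete reference point for the discussion that follows, the sketch below shows the kind of functional coverage a UVM testbench typically collects: a SystemVerilog covergroup sampled on each observed transaction. This is a minimal illustrative example only; the class, field, and bin names (fifo_item, opcode, fifo_level, and so on) are assumptions for this sketch and are not taken from the article.

```systemverilog
// Minimal sketch of simulation functional coverage in a UVM environment.
// Assumes a hypothetical transaction class fifo_item with fields
// `opcode` and `fifo_level`; names are illustrative only.
class fifo_coverage extends uvm_subscriber #(fifo_item);
  `uvm_component_utils(fifo_coverage)

  fifo_item tr;  // most recently observed transaction

  covergroup fifo_cg;
    // Which operations were exercised by the testbench
    cp_opcode : coverpoint tr.opcode;

    // How deeply the FIFO was filled when the operation occurred
    cp_depth : coverpoint tr.fifo_level {
      bins empty  = {0};
      bins middle = {[1:14]};
      bins full   = {15};
    }

    // Corner cases, e.g. a write attempted while full
    opcode_x_depth : cross cp_opcode, cp_depth;
  endgroup

  function new(string name, uvm_component parent);
    super.new(name, parent);
    fifo_cg = new();
  endfunction

  // Called by the monitor's analysis port for every observed transaction
  function void write(fifo_item t);
    tr = t;
    fifo_cg.sample();
  endfunction
endclass
```

In simulation, coverage like this is accumulated only for the stimulus that actually reached the DUT, which is precisely why merging it with formal results requires care: the two technologies count "covered" in different ways.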