Embedded SOC takes new codesign tricks
By Thomas Bollaert, Codesign Application Specialist, Summit Design, Beaverton, Ore., EE Times
June 15, 1999 (11:52 a.m. EST)
One of the most challenging tasks in embedded designs involving system-on-chip technologies is cosynthesis. Many approaches have been attempted. Some consisted of mapping the specification onto a user-constrained architectural template; others tried modifying a full software model by migrating functional units into hardware. Still others tried the inverse, migrating the system from a full hardware implementation to a mix of hardware and software; some even studied how to automatically generate a dedicated engine, or processor, for the specified application.
Because all of these practices are presented as codesign, confusion about definitions and expectations has increased. The result is that codesign has the image of a tool-based process that lies somewhere between a leading-edge technology and black magic, where pressing a button transforms a state chart into a million-gate system packed with 4 Mbytes of embedded software.
Codesign has to be redefined as the combined specification and implementation of hardware and software. It should be seen not as a tool-driven process but as a design methodology that must support concurrent engineering between hardware and software developers where partially incomplete or variable specifications exist. And it must enable tight collaboration among all players: system architects, hardware and software designers, and marketing. Above all else, the goal of codesign is not to optimize the design itself but first to optimize the design process. Out of this focus the other goals of the codesign methodology emerge: better design productivity, better optimization of the system and overall improved product quality.
Codesign encompasses several disciplines: cospecification, cosimulation, codevelopment and co-verification. The first is the combined specification of the system; the second is functional validation and assessment of product constraints such as cost, performance and power; the third is the refinement of hardware and software functionality, achieved manually or with cosynthesis; and the fourth is the combined simulation of hardware and software.
Codevelopment can be seen as the concurrent process of hardware and software design. Once the system architecture is committed, this design process should be carried out in the respective software environments (Tornado, Prism) and hardware environments (Visual HDL). This is necessary because the two domains are very different and have very different needs and constraints. But to achieve concurrent engineering, methodological rules, such as codesign reviews, should be established.
The usual V curve illustrating a project life cycle with design and validation branches applies both to hardware and software design. Codesign is a design methodology in the sense that it is about bridging the design branches of the hardware V curve and the software V curve at all phases of the design process. As codevelopment is about combined hardware and software development, cosynthesis definitely has a role in this discipline, facilitating model transformation along the design axis.
The recent DATE '99 conference in Munich saw the emergence of C-to-HDL synthesis tools (C-Level Design, Frontier Design). This forges a new link between C-based cospecification environments and a codevelopment methodology. The disciplines can already be bridged today with co-verification.
In design processes where the hardware and software specifications are frozen, there is usually little interaction between the two design teams. Until recently, the first moment when software and hardware designers would test the combination of their respective developments was on the prototype board. For the past couple of years, co-verification systems have helped make this moment happen sooner by providing a virtual prototype of the system for joint development, debug and verification of the hardware and software.
This approach provides great benefits to both design teams. A concrete example of co-verification in action is the test and validation of software diagnostics. The system need not be an embedded one and does not even have to contain a microcontroller or a DSP. It could be a PCI-based peripheral board intended for PCs.
Co-verification can be used to test all diagnostics and drivers against a virtual (HDL) model of the board. While executing, the software generates cycles on a PCI master agent connected to the rest of the design. Such an environment helps to verify the hardware-software interactions in the system. Any discussion about joint verification is also about joint development and codesign. However, this stage is only the back end of the codesign flow, as it can be exercised only when the architecture is committed and a significant amount of hardware and software development is already available for testing.
As cospecification and codevelopment enable hardware and software designers to move down the design branches of the V curves concurrently, cosimulation provides the necessary integration and communication links to ensure design coherency with the original specification. Cosimulation differs from co-verification in that the latter is the verification of the embedded software running in an instruction-set simulator (ISS) against a virtual prototype of the hardware target running in an HDL simulator. More generically, cosimulation is the combined simulation of various virtual components, usually described in the C language.
Basically, co-verification addresses the needs of hardware developers to get software-originated stimuli to HDL hardware models. It may also benefit software developers working on the lowest levels of hardware driver routines, where the intent is to verify that the software-hardware interface works correctly. However, for the majority of the software development process, co-verification is too slow. In cosimulation, a higher-abstraction approach can be taken to raise simulation speed to the point where the simulation environment becomes a usable tool for the rest of the software developers. Rather than simulating the software against low-abstraction-level HDL models, higher-abstraction-level functional models can be written in C and simulated in a purely software simulation environment. A further increase in abstraction can be taken on the software side by executing the software in host-code mode instead of in an ISS. As long as the software is written in C or C++, it can be compiled with the development host's native C compiler and executed in a host-code simulation environment. Even in small-to-medium-size applications (several hundred kilobytes), developing the actual high-level application code becomes the primary issue.
Then there are common software bugs: uninitialized variables and pointers, a loop that runs once too often, incorrect function calls. These matters can be addressed more easily in a host-code simulation environment with high-abstraction-level hardware models.
In user-interface-intensive applications, such as a GSM handset, it is important to verify that the application code produces the expected result on the handset's LCD with correct real-time response. To do so, virtual models of the display and keyboard must allow the developer to stimulate the application and observe its responses. These models need not be in HDL; in fact, the software developer is much better off if they aren't. They can be pure data-flow abstraction models, such as simulated hardware registers that the developer can stimulate, monitor and visualize, and use to generate simulated interrupts.
Validating the man-machine interface of the device before the prototype exists can avoid painful on-target debug, validate predictable product behavior with the marketing team, and let the hardware and software design teams share results and stimuli for coherency.
Such cosimulation can be accomplished with a tool such as ESIM, where embedded-system developers have reported virtual simulation performance approaching real-time system speeds, overall development-time savings of up to 33 percent and significant design-quality improvements. Ideally, as hardware development progresses, the abstract C models can be swapped for HDL models of the peripherals when those become available. An environment that allows mixing of C and HDL models provides the trade-off between very-high-performance simulation and implementation-level detail.