Complex SoCs breed new design strategies
By Chappell Brown, EE Times
December 22, 2000 (3:28 p.m. EST)
The task of building effective system-on-chip (SoC) devices is nudging electronics engineers, whether circuit designers or test and verification specialists, out of their comfortable niches. Trained as specialists, designers are being asked to confront the diverse circuit types that can now coexist on the same chip, essentially forcing them to become generalists. Today, it is no longer enough to be good at, say, logic design, without also having a good grasp of memory types, programmable logic or mixed-signal design. And the various methods of blending software with hardware, along with the thorny issues of test and verification, get thrown into the mix as well.
This week's System Design focuses on those SoC issues. Contributors are grappling with the larger issues of how to blend technologies, or in some cases, how to avoid having to blend technologies with new tools and design approaches.
"All logic suppliers, driven by the economics of integration, are moving toward SoC devices," said Yankin Tanurhan, director of Embedded FPGA at Actel Corp. (Sunnyvale, Calif.). But while integration has always brought many advantages, Tanurhan noted that at today's complexity scales, enabled by deep-submicron processes, the design and mask-set costs of those devices can skyrocket.
Actel is introducing an embeddable programmable gate array core that incorporates fast static RAM, offering a standard design that can meet diverse ASIC-style objectives.
With the increasing variety of circuit types on the same chip, the whole area of test and verification is undergoing a revolution. The earlier strategy of building a chip and then testing it has gone out the window, according to Yervant Zorian, chief technology adviser at LogicVision Inc. (San Jose, Calif.). As a result, test and verification is being replaced by a more dynamic co-test and co-verification strategy.