By Darryl Koivisto, Mirabilis Design Inc. Embedded.com Nov 18 2005 (2:00 PM)
Two “laws” have been much on the minds of embedded systems developers faced with bringing higher performance to designs under strict power limits: Moore’s Law and the law of diminishing returns.
In the mid-1960s, Gordon Moore observed that the component count on integrated circuits doubled roughly every 12 months, a figure he later revised to a slower 24 months. Since then, others have tried to fit his prediction more closely to actual results by splitting the difference at 18 months.
This extrapolation has taken on the force of law over the years, despite evidence that as we move toward smaller and smaller geometries, the law of diminishing returns is taking its toll: the tendency for continued application of effort or skill toward a particular goal to decline in effectiveness after a certain level of result has been achieved.
In the face of this, to maintain performance improvements in designs severely constrained in power and space, developers have moved away from the well-understood and well-tested single-processor model toward system, board, and circuit designs with two, three, or four processors operating in parallel, in the hope of balancing power and performance more efficiently.
But as we make this move, it is important to keep yet another “law” – Gene Amdahl’s – in mind, as a guide to where we can go with this approach, where it is most useful and where it is not.