By Tomas Hedqvist, IAR Systems
Embedded.com (05/27/10, 09:54:00 AM EDT)
Power debugging is beginning to appear as a concept in the embedded industry, but what does it mean? In this article we will look at what is driving this need in embedded systems, and at how software can be optimized to minimize an application’s power consumption.
Today, many embedded systems are powered by a battery or via a signal cable. We see them in almost every market segment: medical, consumer electronics, home automation and many more. A common design goal for all these systems is low power consumption. Convenience, environmental awareness and cost are all contributing factors in choosing products that do not require us to constantly change the battery.
The design goals of low power consumption and long battery lifetime have traditionally been the domain of hardware developers. As more battery-powered applications come under microcontroller control, hardware manufacturers offer products with features and characteristics that contribute to lower power consumption and hence longer battery lifetime.
But in an active system, power consumption depends not only on the design of the hardware, but also on how it is used. And how it is used is, of course, determined by the system software.
Software developers, on the other hand, have always aimed to build applications that are as efficient as possible and use as little memory as possible. In low-power systems, power consumption is a third dimension that needs to be taken into account. However, the lack of proper tools has prevented power consumption from becoming an integrated part of the software development process.
Recently, IAR Systems announced a technology that provides just what has been missing: integrating the system's power consumption into the development of software for embedded systems. This approach is called 'power debugging'.