AI will be increasingly important in EDA, reducing design costs and supporting engineers

Artificial intelligence (AI) is having one of its periodic days in the sun, and dominates the conversation at almost any industry event. The Design Automation Conference (DAC 2023) was no exception, with AI seen by the semiconductor community as both an opportunity and a challenge. 

An opportunity, of course, because AI requires so many chips, from the huge and complex systems-on-chip that will power AI engines and models, to the semiconductors that will be embedded in every device to bring AI to every application.

The complexity of the chips fuels demand for a wide variety of IP, but this is where some of the challenges are seen. Integrating many blocks of sophisticated IP to form an AI system-on-chip – which may also integrate yet more functionality such as 5G – is a long process, and it requires very advanced skills. There may be hundreds of IP blocks that need to be tested and integrated, with the results recalibrated every time one of the blocks is changed or enhanced. Identifying the cause of a fault or failure may take many engineer-weeks. 

This is true of other chip applications too, of course, including 5G. Engineers with the required skills are in short supply in many markets, and that shortage is worsened by two factors – the number of AI-focused chip start-ups that are now competing for talent, and the increasingly long design cycle for a complex chip, which will consume a growing number of engineer hours before it is ready. 

At DAC, Alberto Sangiovanni-Vincentelli, of the University of California, Berkeley, said in a presentation: “The scarce resource of the future is talent. Everyone and his brother wants to study AI. But we don’t have the people to design the chips to implement that AI.”

DAC buzzed with discussion about how to address the skills gap in electronic design and manufacturing. Some of the ideas were conventional – making electronic engineering more attractive to young people at school and college level, for instance. But of course, another option is to use AI itself, to help or even replace the engineers. 

Some attendees were positive about this development, arguing that AI could reduce the time needed to develop new chips by taking on some of the tasks of a design assistant.

Others, of course, believe such an approach could eventually threaten jobs altogether, especially if the skills shortage eases in future; adopting AI also entails disruption to tried-and-tested processes and organizational structures.

But, at least with the current state of AI technology, replacing engineers is fanciful. Where AI excels is in rapidly extracting actionable insights from huge quantities of data, such as that generated by EDA tools; those insights can support engineers and make their design and verification tasks quicker and less onerous.

An example is Thalia’s AMALIA software platform, an IP re-use platform for analog and mixed-signal ICs that allows designers to re-use IP blocks quickly and optimise existing IP for new applications. The suite of tools is designed to free up engineers’ time for complex and high-value tasks by automating key processes. Two of the AMALIA tools, the Technology Analyzer and Circuit Porting, use unique AI algorithms; used together, they typically result in up to 70% of IP blocks needing minimal or no changes before re-use. This saves a significant amount of engineering time because not every block needs to be manually checked and verified.

This example shows how AI is already being incorporated into design automation toolsets to boost efficiency and improve commercial outcomes. In other words, AI can be a valuable way to support engineers and reduce the time needed to produce and test complex chips – including those that will, themselves, enable AI processing and applications in future.