As the world faces an acute semiconductor (chip) shortage, a team of Google researchers is working on designing next-generation artificial-intelligence (AI) chips, and has created an AI model that allows chip design to be performed by artificial agents with more experience than any human designer.
The new AI method utilises past experience to become better and faster at solving new instances of the problem.
“Our method was used to design the next generation of Google’s artificial intelligence (AI) accelerators, and has the potential to save thousands of hours of human effort for each new generation,” the team wrote in a paper published in the scientific journal Nature.
“Finally, we believe that more powerful AI-designed hardware will fuel advances in AI, creating a symbiotic relationship between the two fields”, they noted.
In about six hours, the model could generate a design that optimises the placement of different components on the chip.
To achieve this, the Google team fed a dataset of 10,000 chip layouts to a machine-learning model, which was then refined with reinforcement learning.
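For readers unfamiliar with placement problems, the toy Python sketch below illustrates the underlying idea: an agent tries candidate layouts, scores each by how much wiring it needs, and keeps the best. The component names, grid size, and random-search stand-in for the paper’s learned policy are illustrative assumptions, not details from Google’s system.

```python
import random

# Toy illustration of layout placement (NOT Google's actual method).
# Components are placed on a small grid; a layout is scored by the total
# Manhattan wire length between connected components -- lower is better.

GRID = 8                                  # hypothetical 8x8 placement grid
COMPONENTS = ["cpu", "cache", "io", "dsp"]  # made-up component names
NETS = [("cpu", "cache"), ("cpu", "io"), ("cache", "dsp")]  # connectivity

def wirelength(placement):
    # Sum of Manhattan distances between every pair of connected components.
    return sum(abs(placement[a][0] - placement[b][0]) +
               abs(placement[a][1] - placement[b][1])
               for a, b in NETS)

def random_placement():
    # Assign each component a distinct cell on the grid.
    cells = random.sample([(x, y) for x in range(GRID) for y in range(GRID)],
                          len(COMPONENTS))
    return dict(zip(COMPONENTS, cells))

# Random search stands in here for the paper's reinforcement-learning agent:
# sample many layouts and keep the one with the shortest total wiring.
best = min((random_placement() for _ in range(5000)), key=wirelength)
print(best, "wirelength:", wirelength(best))
```

The real system replaces the random sampling with a trained policy network that places components one at a time, but the objective, finding a layout that minimises a cost over the chip, is the same in spirit.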
“Our RL (reinforcement learning) agent generates chip layouts in just a few hours, whereas human experts can take months,” Anna Goldie, a research scientist at Google Brain who took part in the research, said in a tweet.
“These superhuman AI-generated layouts were used in Google’s latest AI accelerator (TPU-v5)!” she added.
Google has used the model to design its next generation of tensor processing units (TPUs), which run in the company’s data centres to enhance the performance of various AI applications.
Chip floor-planning is the engineering task of designing the physical layout of a computer chip.
Despite five decades of research, chip floor-planning has defied automation, requiring months of intense effort by physical design engineers to produce manufacturable layouts.
“In under six hours, our method automatically generates chip floor plans that are superior or comparable to those produced by humans in all key metrics, including power consumption, performance and chip area,” according to the Google AI team.
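Comparing layouts across several metrics at once can be pictured as a weighted cost function, sketched below in Python. The weights and the numbers are assumptions chosen for demonstration, not figures from the paper.

```python
# Hypothetical composite score in the spirit of the metrics Google cites
# (power, performance, area); weights and values here are illustrative.
def layout_score(power_w, delay_ns, area_mm2,
                 w_power=0.4, w_delay=0.4, w_area=0.2):
    # Lower raw values are better on every metric, so a lower score wins.
    return w_power * power_w + w_delay * delay_ns + w_area * area_mm2

human_layout = layout_score(power_w=1.8, delay_ns=2.1, area_mm2=25.0)
agent_layout = layout_score(power_w=1.7, delay_ns=2.0, area_mm2=24.5)
print("agent layout beats human layout:", agent_layout < human_layout)
```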