AMD intends to equip its Epyc processors from 2023 onward with an AI inference engine based on an FPGA from its subsidiary Xilinx. That is hardly surprising given the patents the company recently filed for stacking machine-learning accelerators.

It is not yet known which stacking technology and interconnect will be used. Placing the accelerator on top of the I/O die seems logical: that die dissipates less heat than the compute dies, so the stacked chip would face fewer thermal limits. AMD is working on an accelerator port that could support multiple packaging methods.

This also opens up opportunities to add other chips, such as GPUs or ASICs, and to offer a more modular processor lineup. Customers could then benefit from custom chips tailored to their applications.

While discussing the company's financial results, Victor Peng, head of AMD's Adaptive and Embedded Computing group, said that Xilinx's scalable AI engine is already being used for image recognition and a variety of applications in embedded systems and automobiles. More will be announced at the Financial Analyst Day on June 9, CEO Lisa Su said.

Source: Tom’s Hardware

Source: Hardware Info

