Intel on Monday provided a handful of new details on a chip for artificial intelligence (AI) computing that it plans to introduce in 2025 as it shifts strategy to compete against Nvidia and Advanced Micro Devices.
At a supercomputing conference in Germany on Monday, Intel said its forthcoming “Falcon Shores” chip will have 288GB of memory and support 8-bit floating point computation. Those technical specifications are important as the AI models behind services like ChatGPT have exploded in size, and businesses are looking for more powerful chips to run them.
The details are also among the first to trickle out as Intel carries out a strategy shift to catch up to Nvidia, which leads the market in chips for AI, and AMD, which is expected to challenge Nvidia’s position with a chip called the MI300.
Intel, by contrast, has essentially no market share after its would-be Nvidia competitor, a chip called Ponte Vecchio, suffered years of delays.
Intel on Monday said it has nearly completed shipments for Argonne National Lab’s Aurora supercomputer based on Ponte Vecchio, which Intel claims has better performance than Nvidia’s latest AI chip, the H100.
But Intel’s Falcon Shores follow-on chip won’t come to market until 2025, by which time Nvidia will likely have another chip of its own out.
Jeff McVeigh, interim head of Intel’s accelerated computing systems and graphics group, said the company is taking time to rework the chip after giving up its prior strategy of combining graphics processing units (GPUs) with its central processing units (CPUs).
“While we aspire to have the best CPU and the best GPU in the market, it was hard to say that one vendor at one time was going to have the best combination of those,” McVeigh told Reuters. “If you have discrete offerings, that allows you at the platform level to choose both between the ratio as well as the vendors.”
© Thomson Reuters 2023