Post by account_disabled on Sept 14, 2023 2:57:03 GMT -8
According to a senior AMD executive, AMD is considering deeper integration of AI across its Ryzen product line. But there's still one element missing: the client applications and operating systems that will actually take advantage of it.
ⓒ Playground AI
AMD launched the Ryzen 7000 series, which includes the Ryzen 7040HS, last January, and announced the Ryzen 7040U, a low-power version of the 7040HS, in early May. The two processors are the first chips equipped with Ryzen AI XDNA hardware, one of the first dedicated AI logic blocks for PCs.
However, AI so far has been a service that runs almost entirely in the cloud, with a few exceptions such as Stable Diffusion. This is precisely why AMD's bet is risky. Why invest expensive silicon in an IPU (Inference Processing Unit) for functions that hardly anyone can yet use properly on a PC?
This was one of the questions PCWorld asked David McAfee, vice president and general manager of AMD's client channel business. According to McAfee, we're on the verge of a series of announcements and events that will help clarify what's happening overall in terms of AI processing. This is just the tip of the iceberg.
It's not clear whether McAfee is talking about Google I/O, the Microsoft Build developer conference being held later this month, or something else entirely. Either way, AMD appears to be taking a more incremental approach to AI than expected.
“AI on PC is not as complicated as you think”
Of course, with sufficient storage and memory, you can run AI models on Ryzen CPUs or Radeon GPUs. However, McAfee explained that Ryzen CPUs and Radeon GPUs are too heavy a hammer for this type of computing.
AMD sees AI in PCs as small, lightweight, frequently triggered tasks running on AI processors called IPUs. AMD has been experimenting with and optimizing AI technology for PCs for quite some time: SenseMI, an umbrella of several technologies for Ryzen processors, adjusts clock speeds through Precision Boost 2, mXFR, and Smart Prefetch. The IPU can take these technologies to another level.
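AMD's framing — small, frequently triggered tasks on a power-efficient IPU, heavyweight one-off jobs on the CPU or GPU — can be sketched as a simple dispatch heuristic. The device names, task fields, and size threshold below are hypothetical illustrations for this article, not AMD's actual API or numbers:

```python
# Hypothetical sketch of AMD's stated framing: small, frequently
# triggered inference tasks go to a power-efficient IPU; large or
# one-time workloads go to the GPU. Names and thresholds are made up.
from dataclasses import dataclass

@dataclass
class InferenceTask:
    name: str
    model_mb: int    # rough model footprint in megabytes (illustrative)
    frequent: bool   # triggered repeatedly, e.g. background noise suppression

def pick_engine(task: InferenceTask, ipu_budget_mb: int = 512) -> str:
    """Return which engine a task would run on (illustrative heuristic)."""
    if task.frequent and task.model_mb <= ipu_budget_mb:
        return "IPU"   # lightweight, continuously running workstream
    return "GPU"       # heavyweight or one-time event

tasks = [
    InferenceTask("background-blur", 40, frequent=True),
    InferenceTask("stable-diffusion", 4000, frequent=False),
]
for t in tasks:
    print(t.name, "->", pick_engine(t))
```

The point of the sketch is only the split McAfee describes: frequent, lightweight workstreams belong on the dedicated engine, while a one-off Stable Diffusion render stays on the GPU.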
“The IPU, and future IPUs, are more a combination of specialized engines that perform specific types of computing in a power-efficient manner,” McAfee said. “They are integrated in some ways with the memory subsystem and the rest of the processor. Workloads running on IPUs are expected to become a more frequent type of compute on user platforms — if not continuously running workstreams — rather than one-time events.”
AMD considers the IPU to be similar to a video decoder. In the past, video decoding could be forced through on Ryzen CPU cores, but it took massive computing power to deliver a pleasant experience. “Alternatively, we could do this using a relatively small engine that is part of the chip design and extremely efficient,” McAfee said.
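The video-decoder analogy can be put in back-of-envelope numbers: a fixed-function block does the same job at a fraction of the power. All figures below are invented for illustration, not AMD measurements:

```python
# Back-of-envelope illustration of the video-decoder analogy:
# a dedicated engine completes the same task far more efficiently
# than general-purpose cores. All numbers are hypothetical.
def energy_joules(power_watts: float, seconds: float) -> float:
    """Energy consumed by running at a given power for a given time."""
    return power_watts * seconds

clip_seconds = 60.0
cpu_decode = energy_joules(25.0, clip_seconds)   # software decode on CPU cores
asic_decode = energy_joules(0.5, clip_seconds)   # fixed-function decode block

print(f"CPU software decode:   {cpu_decode:.0f} J")
print(f"Fixed-function decode: {asic_decode:.0f} J")
print(f"Efficiency ratio:      {cpu_decode / asic_decode:.0f}x")
```

The exact ratio is irrelevant; the argument is that the same logic applies to small, recurring AI tasks.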
This means that, at least for now, the Ryzen AI IPU is not a separate card and has no memory subsystem of its own. Generative AI models such as Stable Diffusion run on GPUs using dedicated video RAM, but when asked about the concept of “AI RAM,” McAfee was dismissive: “That method would be very expensive.”
Ryzen’s AI future
The relationship between XDNA and Ryzen AI is like the relationship between RDNA and Radeon in that the first term defines the architecture and the second term defines the brand. AMD secured AI capabilities through the acquisition of Xilinx, but did not disclose exact details. McAfee acknowledged that AMD and its competitors will need to define Ryzen AI's capabilities using language that consumers can understand, such as core counts and clock speeds that help define the CPU.
McAfee said, “Along with the IPU, there is something called an architecture generation. What we integrate into this year’s products and what we integrate into future products may be different architecture generations.”