Samsung extends In-Memory processing power

Samsung Electronics has used this year's Hot Chips conference to showcase its latest advancements in processing-in-memory (PIM) technology.

Samsung demonstrated the first successful integration of its PIM-enabled High Bandwidth Memory (HBM-PIM) into a commercialised accelerator system, and broadened PIM applications to embrace DRAM modules and mobile memory, accelerating the move toward the convergence of memory and logic.

HBM-PIM builds an AI processing function into the memory itself to enhance high-speed data processing in supercomputers and AI applications. It has been tested in the Xilinx Virtex UltraScale+ (Alveo) AI accelerator, where it delivered an almost 2.5x system performance gain as well as more than a 60% cut in energy consumption.
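At a conceptual level, PIM places small compute units next to the memory banks so that full operands no longer have to cross the bus to the host before being reduced. The toy Python sketch below illustrates that idea only; the class and method names (PlainBank, PIMBank, mac) are invented for the example and are not Samsung's interface.

```python
# Conceptual sketch, not Samsung's API: contrast a conventional path,
# which moves the whole operand out of memory, with a PIM-style path
# where a small unit beside the bank does the multiply-accumulate and
# returns only a single result.

class PlainBank:
    """A memory bank that can only be read; compute happens elsewhere."""
    def __init__(self, data):
        self.data = data

    def read_all(self):
        return list(self.data)          # whole operand crosses the bus


class PIMBank:
    """Hypothetical bank with a tiny ALU next to the cell array."""
    def __init__(self, data):
        self.data = data

    def mac(self, weights):
        # multiply-accumulate performed inside the memory device;
        # only one scalar leaves the bank
        return sum(d * w for d, w in zip(self.data, weights))


def conventional_dot(bank, weights):
    values = bank.read_all()            # large transfer: len(data) words
    return sum(v * w for v, w in zip(values, weights))


def pim_dot(bank, weights):
    return bank.mac(weights)            # small transfer: one word back


if __name__ == "__main__":
    data = [1.0, 2.0, 3.0, 4.0]
    weights = [0.5, 0.5, 0.5, 0.5]
    assert conventional_dot(PlainBank(data), weights) == pim_dot(PIMBank(data), weights)
    print("Same result, far less data moved on the PIM path.")
```

The point of the sketch is only the difference in traffic: the conventional path ships every element to the host, while the in-memory path ships a single value back.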

“HBM-PIM is the industry’s first AI-tailored memory solution being tested in customer AI-accelerator systems, demonstrating tremendous commercial potential,” said Nam Sung Kim, senior vice president of DRAM Product & Technology at Samsung Electronics. “Through standardization of the technology, applications will become numerous, expanding into HBM3 for next-generation supercomputers and AI applications, and even into mobile memory for on-device AI as well as for memory modules used in data centres.”

The Acceleration DIMM (AXDIMM) brings processing to the DRAM module itself, minimizing large data movement between the CPU and DRAM to boost the energy efficiency of AI accelerator systems.

With an AI engine built inside the buffer chip, the AXDIMM can perform parallel processing of multiple memory ranks (sets of DRAM chips) instead of accessing just one rank at a time, greatly enhancing system performance and efficiency.
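As a rough illustration of that rank-parallel idea, the following Python snippet models a host touching one rank at a time versus a buffer-chip engine that reduces every rank concurrently and returns only small partial results. The function names and the thread-pool model are assumptions made for the sketch, not the AXDIMM's actual buffer-chip behaviour.

```python
# Conceptual sketch with hypothetical names, not AXDIMM firmware:
# model "one rank at a time" versus "all ranks in parallel" access.

from concurrent.futures import ThreadPoolExecutor

def rank_partial_sum(rank_data, weights):
    """Work a per-rank engine could do locally (e.g. a weighted reduce)."""
    return sum(d * w for d, w in zip(rank_data, weights))

def host_one_rank_at_a_time(ranks, weights):
    # Conventional DIMM: the host reads and reduces each rank sequentially.
    return sum(rank_partial_sum(r, weights) for r in ranks)

def buffer_chip_all_ranks(ranks, weights):
    # AXDIMM-style idea: partial sums are produced for all ranks
    # concurrently, and only the small per-rank results reach the host.
    with ThreadPoolExecutor(max_workers=len(ranks)) as pool:
        partials = list(pool.map(lambda r: rank_partial_sum(r, weights), ranks))
    return sum(partials)

ranks = [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10, 11, 12]]
weights = [1, 1, 1]
assert host_one_rank_at_a_time(ranks, weights) == buffer_chip_all_ranks(ranks, weights)
```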

Since the module retains the traditional DIMM form factor, the AXDIMM facilitates drop-in replacement without requiring system modifications.

Currently being tested on customer servers, the AXDIMM can offer approximately twice the performance in AI-based recommendation applications and a 40% decrease in system-wide energy usage.

Samsung’s LPDDR5-PIM mobile memory technology can provide independent AI capabilities without data centre connectivity. Simulation tests have shown that the LPDDR5-PIM can more than double performance while reducing energy usage by over 60% when used in applications such as voice recognition, translation and chatbots.

Samsung plans to expand its AI memory portfolio by working with other industry leaders to complete standardization of the PIM platform in the first half of 2022.