News from the ITA network

6 July 2023

South Korea

SAMSUNG ACCELERATES FORAY INTO FASTEST AI CHIP MARKET

South Korea’s Samsung Electronics Co. is speeding up its efforts to penetrate deeper into the high bandwidth memory 3 (HBM3) market, an area it has neglected relative to other high-performance chips because of the segment’s tiny share of the overall memory chip market. But the advent of generative AI services such as ChatGPT is driving the world’s top memory chipmaker to ramp up production of HBM chips, which boast faster data-processing speeds and lower energy consumption than conventional DRAMs.

Samsung has recently shipped customers samples of HBM3 products with a 16-gigabyte (GB) memory capacity and the lowest energy consumption of their kind, according to industry sources on Monday. Currently, 16 GB is the maximum memory capacity for existing HBM3 products, which process data at a speed of 6.4 GB per second, the industry’s fastest. Samsung also delivered samples of a 12-layer 24 GB HBM3, a fourth-generation HBM chip and the industry’s slimmest of its kind; its smaller domestic rival SK Hynix Inc. unveiled the same type of model, a world first, last April. Samsung is now ready to mass produce both types of HBM3, the sources said. In the second half of the year, it will launch an advanced HBM3 memory with higher performance and capacity.

An HBM is a product made by vertically stacking DRAM chips. It is mainly used in graphics processing units (GPUs) that power generative AI platforms such as ChatGPT.

Samsung is understood to have begun shipping HBM3 products to its major customers. AMD Inc.’s recently announced MI 300 accelerated processing units are embedded with Samsung’s HBM3 memory. AMD is a fabless US semiconductor company, and its MI 300 chips are used to power supercomputers. The Aurora supercomputer, developed jointly by Intel Corp. and the Argonne National Laboratory, is said to be equipped with Samsung chips, according to the sources.

IN EARLY STAGES OF GROWTH

HBM was not high on Samsung’s priority list. It instead focused on mobile chips and high-performance computing technologies designed to improve data-processing capability and the performance of complex calculations. The HBM market is still in its early stages of growth, accounting for less than 1% of the DRAM market. But Samsung’s aggressive foray into the sector is expected to shake up the HBM market and give a leg up to the sluggish DRAM market, industry watchers said.

The HBM market is forecast to grow at an average annual rate of 45% or more between this year and 2025, in tandem with AI market growth, according to TrendForce. SK Hynix controls half of the HBM market worldwide, trailed by Samsung with a 40% share and Micron Technology Inc. with 10%, the Taiwan-based research firm said.

In 2021, Samsung developed HBM-PIM (processing-in-memory), integrated with an AI accelerator. It improves the generative performance of an AI application by 3.4 times compared with an HBM-powered GPU accelerator, according to Samsung Electronics. Samsung also unveiled CXL DRAMs, which have greater data-processing capacity than conventional DRAMs and can thus prevent a memory bottleneck, i.e. a slowdown in text generation and data movement, in AI supercomputers. (ICE SEOUL)


News source: THE KOREA ECONOMIC DAILY