Samsung reveals brand-new memory chip with ‘highest-capacity to date’ for AI



The Samsung logo displayed on a glass door at the company's Seocho building in Seoul on July 7, 2022. Samsung Electronics has begun applying for tax breaks for 11 potential chip plants in Texas, representing investments of about $192 billion, according to documents filed with Texas officials.

Jung Yeon-je | AFP | Getty Images

Samsung Electronics on Tuesday said it has developed a new high-bandwidth memory chip with the "highest capacity to date" in the industry.

The South Korean chip giant said the HBM3E 12H "raises both performance and capacity by more than 50%."

"The industry's AI service providers are increasingly requiring HBM with higher capacity, and our new HBM3E 12H product has been designed to answer that need," said Yongcheol Bae, executive vice president of memory product planning at Samsung Electronics.

"This new memory solution forms part of our drive toward developing core technologies for high-stack HBM and providing technological leadership for the high-capacity HBM market in the AI era," said Bae.

Samsung Electronics is the world's largest maker of dynamic random-access memory chips, which are used in consumer devices such as smartphones and computers.

Generative AI models such as OpenAI's ChatGPT require large numbers of high-performance memory chips. Such chips enable generative AI models to remember details from past conversations and user preferences in order to generate humanlike responses.

The AI boom continues to fuel chipmakers. U.S. chip designer Nvidia posted a 265% jump in fourth fiscal quarter revenue thanks to surging demand for its graphics processing units, thousands of which are used to run and train ChatGPT.

During a call with analysts, Nvidia CEO Jensen Huang said the company may not be able to sustain this level of growth or sales for the whole year.

"As AI applications grow exponentially, the HBM3E 12H is expected to be an optimal solution for future systems that require more memory. Its higher performance and capacity will especially allow customers to manage their resources more flexibly and reduce total cost of ownership for datacenters," said Samsung Electronics.

Samsung said it has begun sampling the chip to customers, and mass production of the HBM3E 12H is planned for the first half of 2024.

"I assume the news will be positive for Samsung's share price," SK Kim, executive director of Daiwa Securities, told CNBC.

"Samsung was behind SK Hynix in HBM3 for Nvidia last year. Also, Micron announced mass production of 24GB 8L HBM3E yesterday. I assume it will secure leadership in higher layer (12L) based higher density (36GB) HBM3E product for Nvidia," said Kim.

In September, Samsung secured a deal to supply Nvidia with its high-bandwidth memory 3 chips, according to a Korea Economic Daily report, which cited anonymous industry sources.

The report also said that SK Hynix, South Korea's second-largest memory chipmaker, was leading the high-performance memory chip market. SK Hynix was previously known as the sole mass producer of HBM3 chips supplied to Nvidia, the report said.

Samsung said the HBM3E 12H has a 12-layer stack, but uses advanced thermal compression non-conductive film, which allows the 12-layer products to have the same height specification as 8-layer ones to meet current HBM package requirements. The result is a chip that packs more processing power without increasing its physical footprint.

"Samsung has continued to lower the thickness of its NCF material and achieved the industry's smallest gap between chips at seven micrometers (µm), while also eliminating voids between layers," said Samsung. "These efforts result in enhanced vertical density by over 20% compared to its HBM3 8H product."