
Micron starts mass production of its memory chips for use in Nvidia's AI semiconductors



Micron Technology has started mass production of its high-bandwidth memory semiconductors for use in Nvidia’s latest chip for artificial intelligence, sending its shares up more than 4% before the bell on Monday.

The HBM3E (High Bandwidth Memory 3E) will consume 30% less power than rival offerings, Micron said, and could help tap into soaring demand for chips that power generative AI applications.


Nvidia will use the chip in its next-generation H200 graphics processing units, which are expected to start shipping in the second quarter and to overtake the current H100 chip that has powered a massive surge in revenue at the chip designer.

Demand for high-bandwidth memory (HBM) chips for use in AI, a market led by Nvidia supplier SK Hynix, has also raised investor hopes that Micron will be able to weather a slow recovery in its other markets.

HBM is one of Micron’s most profitable products, in part because of the technical complexity involved in its construction.

The company had previously said it expects “several hundred million” dollars of HBM revenue in fiscal 2024 and continued growth in 2025.
