Techno Blender

Samsung to mass-manufacture memory for AI this year and beat SK Hynix

Last updated: June 26th, 2023 at 12:57 UTC+02:00

Samsung believes memory semiconductors will lead the charge in AI supercomputing before the end of the decade, with memory chips eventually outshining Nvidia GPUs in AI server applications. A couple of months ago, Kye Hyun Kyung said Samsung will make sure “memory semiconductor-centered supercomputers can come out by 2028.” Now, reports say Samsung is preparing to mass-produce high-bandwidth memory (HBM) chips for AI applications this year.

According to Korean media reports, Samsung plans to mass-manufacture HBM chips for AI in the second half of 2023, aiming to catch up with SK Hynix, which quickly took the lead in the AI memory semiconductor market.

SK Hynix held roughly 50% of the HBM market in 2022, while Samsung held around 40%, according to TrendForce (via The Korea Times). Micron accounted for the remaining 10%. The HBM market is still small, however, making up only about 1% of the entire DRAM segment.

Nevertheless, demand for HBM solutions is expected to grow as the AI market expands, and Samsung now intends to catch up with SK Hynix and mass-produce its HBM3 chips in anticipation of these changes. Whether or not “AI” has become a buzzword, AI servers are becoming more widespread, and high-bandwidth memory solutions are gaining traction.

Samsung’s HBM3 solution vertically stacks multiple DRAM chips and is available in 16GB and 24GB capacities, reaching speeds of up to 6.4Gbps per pin.
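As a rough sanity check, the per-stack bandwidth implied by that per-pin rate can be computed, assuming the standard 1024-bit HBM interface width (the interface width is an assumption; the article cites only the per-pin rate):

```python
# Back-of-envelope HBM3 bandwidth per stack.
# Assumption: standard 1024-bit (1024-pin) HBM interface width,
# which is not stated in the article itself.
pins_per_stack = 1024        # assumed HBM interface width, in pins
gbps_per_pin = 6.4           # per-pin data rate cited in the article

total_gbps = pins_per_stack * gbps_per_pin   # aggregate rate in Gbit/s
total_gb_per_s = total_gbps / 8              # convert bits to bytes

print(f"{total_gb_per_s:.1f} GB/s per stack")  # about 819.2 GB/s
```

That figure is per stack; AI accelerators typically mount several HBM stacks around the processor die, multiplying the total memory bandwidth accordingly.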

