
Samsung announces 16Gb GDDR6 mass production

by Mark Tyson on 18 January 2018, 11:31

Tags: Samsung (005935.KS)

Quick Link: HEXUS.net/qadpxz


Samsung has announced that it has started mass production of the industry's first 16Gb (2GB) GDDR6 memory chips. Production is aimed at makers of advanced graphics products such as graphics cards and gaming devices, as well as automotive, networking and AI systems.

These new chips are fabricated on a 10nm-class process and run at 1.35V. Compare that to Samsung's previous GDDR5 chips, which were built on a 20nm process and were available in a maximum 8Gb (1GB) density. In another comparison, Samsung notes that the new solution performs at an 18Gbps pin speed with data transfers of 72GB/s, which represents a more than two-fold increase over 8Gb GDDR5 with its 8Gbps pin speed.
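That 72GB/s figure follows from the pin speed and the 32-bit interface of a single GDDR6 chip (a standard chip width, not stated in the announcement). A quick back-of-the-envelope check in Python:

    # Per-chip bandwidth from pin speed, assuming the standard 32-bit
    # interface of a GDDR5/GDDR6 chip (width not stated in the article).
    PINS = 32

    def chip_bandwidth_gbs(pin_speed_gbps, pins=PINS):
        # 8 bits per byte
        return pin_speed_gbps * pins / 8

    gddr6 = chip_bandwidth_gbs(18)  # 72.0 GB/s, matching Samsung's figure
    gddr5 = chip_bandwidth_gbs(8)   # 32.0 GB/s for 8Gbps GDDR5
    print(f"{gddr6:.0f} GB/s vs {gddr5:.0f} GB/s: {gddr6 / gddr5:.2f}x")

The 2.25x result is the "more than two-fold increase" Samsung quotes.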

"Beginning with this early production of the industry’s first 16Gb GDDR6, we will offer a comprehensive graphics DRAM line-up, with the highest performance and densities, in a very timely manner," said Jinman Han, senior vice president, Memory Product Planning & Application Engineering at Samsung Electronics. "By introducing next-generation GDDR6 products, we will strengthen our presence in the gaming and graphics card markets and accommodate the growing need for advanced graphics memory in automotive and network systems."

We've talked about the speed and density advantages above, but there's another proud feather in GDDR6's cap - power efficiency. Samsung says that its new GDDR6 chips include "an innovative, low-power circuit design," and run at 1.35V to lower energy consumption by approximately 35 per cent compared to the widely used GDDR5 at 1.55V. Last but not least, there are production efficiency gains at Samsung's chip factories: a claimed 30 per cent manufacturing productivity gain compared to the lower density 20nm 8Gb GDDR5.
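As a rough check on that 35 per cent figure: dynamic CMOS power scales roughly with the square of the supply voltage, so the voltage drop alone doesn't account for the whole saving (a minimal sketch; the V-squared rule ignores frequency changes and leakage):

    # Rough first-order check: dynamic CMOS power scales with V^2
    # (ignoring frequency changes and static leakage).
    v_gddr5, v_gddr6 = 1.55, 1.35

    voltage_only_saving = 1 - (v_gddr6 / v_gddr5) ** 2
    print(f"Saving from voltage alone: {voltage_only_saving:.0%}")  # ~24%

The remaining gap to the claimed ~35 per cent would come from the "innovative, low-power circuit design" rather than the voltage reduction alone.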

As mentioned in the intro, GDDR6 is expected to be important in graphics products. Samsung reckons many components and devices aimed at PC enthusiasts will benefit from the improvements in density, performance and energy efficiency. The new GDDR6 chips will also be widely used in burgeoning computing segments like 8K video, VR and AR.

Only a week ago Samsung started mass production of 8GB HBM2 Aquabolt chips, some of which will, again, be taken up by PC graphics card makers.



HEXUS Forums :: 11 Comments

3dcandy
Looks like this will triumph compared to HBM2 if the price is right

Not really; it targets a completely different market, and doesn't address any of the issues that make HBM a better fit in some circumstances. For instance - the high bit rate is going to mean higher energy consumption, the narrow bus means you're still a LONG way behind the per chip/stack bandwidth of HBM2, and the chip is only 2GB, while HBM2 can already go up to 8GB per stack.

It's a direct replacement for GDDR5, not a competitor to HBM2.
3dcandy
I was being very general, Jim. What I mean is that products, if the price is right, will use GDDR6, as HBM2 is too expensive right now. I'm guessing redesigning a product that already uses GDDR5 to move to GDDR6 will be easy. Cue the rebrands!

That would miss the point.

Twice the bandwidth per pin and twice the capacity means you can ship with only four memory chips and still hit the magic 8GB needed to charge top dollar. Fewer chips mean fewer PCB layers, which drives costs down. However, I think HBM has shown us that raw bandwidth isn't all that matters. I have to wonder if a 1070 with double-rate memory but a 128-bit bus would suffer from having fewer channels, which I presume can operate independently (i.e. I presume they are unganged).
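To put rough numbers on that comparison (the 128-bit GDDR6 card is hypothetical; the GTX 1070's 256-bit, 8Gbps GDDR5 setup is its real specification):

    # Hypothetical 128-bit GDDR6 card vs the GTX 1070's real
    # 256-bit, 8Gbps GDDR5 configuration.
    def bus_bandwidth_gbs(bus_width_bits, pin_speed_gbps):
        return bus_width_bits * pin_speed_gbps / 8

    gtx1070 = bus_bandwidth_gbs(256, 8)      # 256 GB/s from eight 1GB chips
    gddr6_card = bus_bandwidth_gbs(128, 18)  # 288 GB/s from four chips

    chips = 8 // 2  # 16Gb chips are 2GB each, so four chips reach 8GB
    print(f"GTX 1070: {gtx1070:.0f} GB/s; 128-bit GDDR6: {gddr6_card:.0f} GB/s from {chips} chips")

So four GDDR6 chips could slightly exceed a 1070's memory bandwidth while halving the chip count, which is the commenter's point about PCB cost.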
Both consoles could have waited to implement this in their hardware. Think about it: it would actually have made them more suitable for true 4K.