AMD Confirms Radeon Instinct MI100 ‘Arcturus’ Discrete GPU Accelerator With CDNA Architecture In 2H 2020

During a Dell EMC presentation, AMD's CTO, Mark Papermaster, confirmed that the company will introduce its next-generation, CDNA architecture-based Radeon Instinct MI100 accelerator in the second half of 2020.

AMD's Radeon Instinct MI100 CDNA Architecture Based Discrete GPU Accelerator Arriving in 2H 2020

The AMD Radeon Instinct MI100, internally referred to as 'Arcturus', will be a next-generation HPC part featuring an enhanced version of the 7nm Vega architecture. AMD had never officially mentioned the accelerator until now. The GPU appears to be the top HPC part for 2020 in AMD's first-generation CDNA portfolio, and Mark confirmed that the discrete GPU will be introduced in the second half of 2020.

Mark Papermaster confirmed MI100 Discrete GPU accelerator for 2H 2020 pic.twitter.com/P6KTrm0B2S

— Hassan Mujtaba (@hms1193) June 17, 2020

The following is Mark Papermaster's quote from the Q&A session:

Like our multi-generational commitment to the Zen roadmap in x86 CPU, we have done the same with our DNA architectures for GPU - rDNA for gaming and visualization, and cDNA for compute & AI. The rDNA is driving gain in AMD share for graphics and deployed in the upcoming Sony and Microsoft new game consoles, and for cDNA you will see the MI100 discrete GPU both 2nd half of 2020.

The ROCm software stack creates an alternative for GPU compute with easy portability and enabling competition. - AMD CTO, Mark Papermaster

[Image: AMD CDNA architecture-based Radeon Instinct 'Arcturus' GPU]

Based on what we have learned from various prototype leaks, the Radeon Instinct MI100 'Arcturus' GPU will come in several variants. The flagship goes into the D34303 SKU, which makes use of the XL variant of the chip. The information for this part comes from a test board, so the final specifications will likely differ, but here are the key points:

- Based on the Arcturus XL GPU
- Test board has a TDP of 200W
- Up to 32 GB of HBM2 memory
- HBM2 memory clocks reported between 1000-1200 MHz

[Image: Radeon CDNA roadmap]

The Radeon Instinct MI100 test board has a TDP of 200W and is based on the XL variant of AMD's Arcturus GPU. The card also features 32 GB of HBM2 memory with pin speeds of 1.0-1.2 GHz. The MI60, in comparison, has 64 CUs and a 300W TDP, with a reported base clock of 1200 MHz and memory running at 1.0 GHz over a 4096-bit bus interface, pumping out 1 TB/s of bandwidth. There is a good chance that the final Arcturus design will feature Samsung's latest HBM2E 'Flashbolt' memory, which offers 3.2 Gbps pin speeds for roughly 1.6 TB/s of bandwidth on the same 4096-bit bus.
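To put those memory figures in context, peak HBM2 bandwidth is just the per-pin data rate multiplied by the bus width. Here is a minimal sketch of that arithmetic; the function name and sample values are illustrative, not AMD specifications:

```python
def hbm_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

# HBM2 is double data rate, so a 1.0 GHz memory clock moves 2.0 Gbps per pin.
# MI60-class configuration: 2.0 Gbps pins on a 4096-bit bus -> 1024 GB/s (the ~1 TB/s quoted above).
print(hbm_bandwidth_gb_s(2.0, 4096))  # 1024.0

# Samsung HBM2E 'Flashbolt' at 3.2 Gbps on the same 4096-bit bus -> ~1.6 TB/s.
print(hbm_bandwidth_gb_s(3.2, 4096))  # 1638.4
```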

AMD Radeon Instinct Accelerators

| Accelerator Name | AMD Instinct MI400 | AMD Instinct MI300X | AMD Instinct MI300A | AMD Instinct MI250X | AMD Instinct MI250 | AMD Instinct MI210 | AMD Instinct MI100 | AMD Radeon Instinct MI60 | AMD Radeon Instinct MI50 | AMD Radeon Instinct MI25 | AMD Radeon Instinct MI8 | AMD Radeon Instinct MI6 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| CPU Architecture | Zen 5 (Exascale APU) | N/A | Zen 4 (Exascale APU) | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A |
| GPU Architecture | CDNA 4 | Aqua Vanjaram (CDNA 3) | Aqua Vanjaram (CDNA 3) | Aldebaran (CDNA 2) | Aldebaran (CDNA 2) | Aldebaran (CDNA 2) | Arcturus (CDNA 1) | Vega 20 | Vega 20 | Vega 10 | Fiji XT | Polaris 10 |
| GPU Process Node | 4nm | 5nm+6nm | 5nm+6nm | 6nm | 6nm | 6nm | 7nm FinFET | 7nm FinFET | 7nm FinFET | 14nm FinFET | 28nm | 14nm FinFET |
| GPU Chiplets | TBD | 8 (MCM) | 8 (MCM) | 2 (MCM), 1 (Per Die) | 2 (MCM), 1 (Per Die) | 2 (MCM), 1 (Per Die) | 1 (Monolithic) | 1 (Monolithic) | 1 (Monolithic) | 1 (Monolithic) | 1 (Monolithic) | 1 (Monolithic) |
| GPU Cores | TBD | 19,456 | 14,592 | 14,080 | 13,312 | 6,656 | 7,680 | 4,096 | 3,840 | 4,096 | 4,096 | 2,304 |
| GPU Clock Speed | TBD | 2100 MHz | 2100 MHz | 1700 MHz | 1700 MHz | 1700 MHz | 1500 MHz | 1800 MHz | 1725 MHz | 1500 MHz | 1000 MHz | 1237 MHz |
| INT8 Compute | TBD | 2614 TOPS | 1961 TOPS | 383 TOPS | 362 TOPS | 181 TOPS | 92.3 TOPS | N/A | N/A | N/A | N/A | N/A |
| FP16 Compute | TBD | 1.3 PFLOPs | 980.6 TFLOPs | 383 TFLOPs | 362 TFLOPs | 181 TFLOPs | 185 TFLOPs | 29.5 TFLOPs | 26.5 TFLOPs | 24.6 TFLOPs | 8.2 TFLOPs | 5.7 TFLOPs |
| FP32 Compute | TBD | 163.4 TFLOPs | 122.6 TFLOPs | 95.7 TFLOPs | 90.5 TFLOPs | 45.3 TFLOPs | 23.1 TFLOPs | 14.7 TFLOPs | 13.3 TFLOPs | 12.3 TFLOPs | 8.2 TFLOPs | 5.7 TFLOPs |
| FP64 Compute | TBD | 81.7 TFLOPs | 61.3 TFLOPs | 47.9 TFLOPs | 45.3 TFLOPs | 22.6 TFLOPs | 11.5 TFLOPs | 7.4 TFLOPs | 6.6 TFLOPs | 768 GFLOPs | 512 GFLOPs | 384 GFLOPs |
| VRAM | TBD | 192 GB HBM3 | 128 GB HBM3 | 128 GB HBM2e | 128 GB HBM2e | 64 GB HBM2e | 32 GB HBM2 | 32 GB HBM2 | 16 GB HBM2 | 16 GB HBM2 | 4 GB HBM1 | 16 GB GDDR5 |
| Infinity Cache | TBD | 256 MB | 256 MB | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A |
| Memory Clock | TBD | 5.2 Gbps | 5.2 Gbps | 3.2 Gbps | 3.2 Gbps | 3.2 Gbps | 1200 MHz | 1000 MHz | 1000 MHz | 945 MHz | 500 MHz | 1750 MHz |
| Memory Bus | TBD | 8192-bit | 8192-bit | 8192-bit | 8192-bit | 4096-bit | 4096-bit | 4096-bit | 4096-bit | 2048-bit | 4096-bit | 256-bit |
| Memory Bandwidth | TBD | 5.3 TB/s | 5.3 TB/s | 3.2 TB/s | 3.2 TB/s | 1.6 TB/s | 1.23 TB/s | 1 TB/s | 1 TB/s | 484 GB/s | 512 GB/s | 224 GB/s |
| Form Factor | TBD | OAM | APU SH5 Socket | OAM | OAM | Dual Slot Card | Dual Slot, Full Length | Dual Slot, Full Length | Dual Slot, Full Length | Dual Slot, Full Length | Dual Slot, Half Length | Single Slot, Full Length |
| Cooling | TBD | Passive Cooling | Passive Cooling | Passive Cooling | Passive Cooling | Passive Cooling | Passive Cooling | Passive Cooling | Passive Cooling | Passive Cooling | Passive Cooling | Passive Cooling |
| TDP (Max) | TBD | 750W | 760W | 560W | 500W | 300W | 300W | 300W | 300W | 300W | 175W | 150W |
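The FP32 figures in the table follow from the shader count and clock speed using the conventional peak-throughput formula of two FP32 operations (one fused multiply-add) per shader per clock. A quick sanity check of that relationship, using values taken from the table; note that the simple formula only lines up with the Vega and CDNA 1 era rows, since later CDNA parts also count dual-issue FP32:

```python
def peak_fp32_tflops(cores: int, clock_mhz: float) -> float:
    """Peak FP32 throughput: shaders * clock (MHz) * 2 ops per clock (FMA), in TFLOPs."""
    return cores * clock_mhz * 2 / 1e6

# Values taken from the table above.
print(round(peak_fp32_tflops(7680, 1500), 1))  # 23.0 -> ~23.1 TFLOPs listed for the MI100
print(round(peak_fp32_tflops(4096, 1800), 1))  # 14.7 -> MI60
print(round(peak_fp32_tflops(3840, 1725), 1))  # 13.2 -> ~13.3 TFLOPs listed for the MI50
```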
