AMD is all set to unveil its next-generation CDNA GPU-based Instinct MI100 accelerator on the 16th of November, as per tech outlet Aroged. The information comes from leaked documents that are part of the embargoed datasheets for its next-generation data center and HPC accelerator lineup.
AMD Instinct MI100 With First-Generation CDNA Architecture Launches on 16th November - Aims To Tackle NVIDIA's A100 In The Data Center Segment With Fastest Double Precision Power
The AMD Instinct MI100 accelerator was confirmed by AMD's CTO, Mark Papermaster, almost 5 months ago. Back then, Mark stated that they would be introducing the CDNA-based Instinct GPU in the second half of 2020. As we approach the end of the year, it looks like AMD is now all set to launch the most powerful data center GPU it has ever built under the leadership of RTG's new chief, David Wang.

AMD's Radeon Instinct MI100 will be utilizing the CDNA architecture, which is entirely different from the RDNA architecture that gamers will have access to later this month. The CDNA architecture has been designed specifically for the HPC segment and will be pitted against NVIDIA's Ampere A100 and similar accelerator cards.
Based on what we have learned from various prototype leaks, the Radeon Instinct MI100 'Arcturus' GPU will feature several variants. The flagship part carries the D34303 SKU and makes use of the XL variant of the GPU. The info for this part is based on a test board, so the final specifications will likely differ, but here are the key points:
- Based on Arcturus GPU (1st Gen CDNA)
- Test board has a TDP of 200W (final variants ~300-350W)
- Up to 32 GB HBM2e memory
Specifications previously leaked by AdoredTV suggest that the AMD Instinct MI100 will feature 34 TFLOPs of FP32 compute per GPU. Each Radeon Instinct MI100 GPU will have a TDP of 300W and carry 32 GB of HBM2e memory, which should pump out 1.225 TB/s of total bandwidth.
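The leaked bandwidth figure lines up with a simple back-of-the-envelope calculation: peak HBM bandwidth is just the bus width times the per-pin data rate, divided by 8 bits per byte. A minimal sketch, assuming the leaked 4096-bit HBM2e bus at roughly 2.4 Gbps per pin (neither value is an official AMD spec):

```python
def hbm_bandwidth_tbps(bus_width_bits: int, pin_speed_gbps: float) -> float:
    """Peak memory bandwidth in TB/s.

    (bus width in bits * per-pin rate in Gbps) / 8 bits per byte,
    then / 1000 to go from GB/s to TB/s.
    """
    return bus_width_bits * pin_speed_gbps / 8 / 1000

# Leaked MI100 figures: 4096-bit bus, ~2.4 Gbps effective per pin
print(hbm_bandwidth_tbps(4096, 2.4))  # ~1.2288 TB/s, close to the leaked 1.225 TB/s
```

The small gap between ~1.23 TB/s and the leaked 1.225 TB/s would simply reflect a slightly lower final pin speed.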

As per the embargo, the AMD Instinct MI100 HPC accelerator will be unveiled on 16th November at 8 AM CDT.
AMD Radeon Instinct Accelerators
| Accelerator Name | AMD Instinct MI400 | AMD Instinct MI300X | AMD Instinct MI300A | AMD Instinct MI250X | AMD Instinct MI250 | AMD Instinct MI210 | AMD Instinct MI100 | AMD Radeon Instinct MI60 | AMD Radeon Instinct MI50 | AMD Radeon Instinct MI25 | AMD Radeon Instinct MI8 | AMD Radeon Instinct MI6 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| CPU Architecture | Zen 5 (Exascale APU) | N/A | Zen 4 (Exascale APU) | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A |
| GPU Architecture | CDNA 4 | Aqua Vanjaram (CDNA 3) | Aqua Vanjaram (CDNA 3) | Aldebaran (CDNA 2) | Aldebaran (CDNA 2) | Aldebaran (CDNA 2) | Arcturus (CDNA 1) | Vega 20 | Vega 20 | Vega 10 | Fiji XT | Polaris 10 |
| GPU Process Node | 4nm | 5nm+6nm | 5nm+6nm | 6nm | 6nm | 6nm | 7nm FinFET | 7nm FinFET | 7nm FinFET | 14nm FinFET | 28nm | 14nm FinFET |
| GPU Chiplets | TBD | 8 (MCM) | 8 (MCM) | 2 (MCM) 1 (Per Die) | 2 (MCM) 1 (Per Die) | 2 (MCM) 1 (Per Die) | 1 (Monolithic) | 1 (Monolithic) | 1 (Monolithic) | 1 (Monolithic) | 1 (Monolithic) | 1 (Monolithic) |
| GPU Cores | TBD | 19,456 | 14,592 | 14,080 | 13,312 | 6656 | 7680 | 4096 | 3840 | 4096 | 4096 | 2304 |
| GPU Clock Speed | TBD | 2100 MHz | 2100 MHz | 1700 MHz | 1700 MHz | 1700 MHz | 1500 MHz | 1800 MHz | 1725 MHz | 1500 MHz | 1000 MHz | 1237 MHz |
| INT8 Compute | TBD | 2614 TOPS | 1961 TOPS | 383 TOPS | 362 TOPS | 181 TOPS | 92.3 TOPS | N/A | N/A | N/A | N/A | N/A |
| FP16 Compute | TBD | 1.3 PFLOPs | 980.6 TFLOPs | 383 TFLOPs | 362 TFLOPs | 181 TFLOPs | 185 TFLOPs | 29.5 TFLOPs | 26.5 TFLOPs | 24.6 TFLOPs | 8.2 TFLOPs | 5.7 TFLOPs |
| FP32 Compute | TBD | 163.4 TFLOPs | 122.6 TFLOPs | 95.7 TFLOPs | 90.5 TFLOPs | 45.3 TFLOPs | 23.1 TFLOPs | 14.7 TFLOPs | 13.3 TFLOPs | 12.3 TFLOPs | 8.2 TFLOPs | 5.7 TFLOPs |
| FP64 Compute | TBD | 81.7 TFLOPs | 61.3 TFLOPs | 47.9 TFLOPs | 45.3 TFLOPs | 22.6 TFLOPs | 11.5 TFLOPs | 7.4 TFLOPs | 6.6 TFLOPs | 768 GFLOPs | 512 GFLOPs | 384 GFLOPs |
| VRAM | TBD | 192 GB HBM3 | 128 GB HBM3 | 128 GB HBM2e | 128 GB HBM2e | 64 GB HBM2e | 32 GB HBM2 | 32 GB HBM2 | 16 GB HBM2 | 16 GB HBM2 | 4 GB HBM1 | 16 GB GDDR5 |
| Infinity Cache | TBD | 256 MB | 256 MB | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A |
| Memory Clock | TBD | 5.2 Gbps | 5.2 Gbps | 3.2 Gbps | 3.2 Gbps | 3.2 Gbps | 1200 MHz | 1000 MHz | 1000 MHz | 945 MHz | 500 MHz | 1750 MHz |
| Memory Bus | TBD | 8192-bit | 8192-bit | 8192-bit | 8192-bit | 4096-bit | 4096-bit | 4096-bit | 4096-bit | 2048-bit | 4096-bit | 256-bit |
| Memory Bandwidth | TBD | 5.3 TB/s | 5.3 TB/s | 3.2 TB/s | 3.2 TB/s | 1.6 TB/s | 1.23 TB/s | 1 TB/s | 1 TB/s | 484 GB/s | 512 GB/s | 224 GB/s |
| Form Factor | TBD | OAM | APU SH5 Socket | OAM | OAM | Dual Slot Card | Dual Slot, Full Length | Dual Slot, Full Length | Dual Slot, Full Length | Dual Slot, Full Length | Dual Slot, Half Length | Single Slot, Full Length |
| Cooling | TBD | Passive Cooling | Passive Cooling | Passive Cooling | Passive Cooling | Passive Cooling | Passive Cooling | Passive Cooling | Passive Cooling | Passive Cooling | Passive Cooling | Passive Cooling |
| TDP (Max) | TBD | 750W | 760W | 560W | 500W | 300W | 300W | 300W | 300W | 300W | 175W | 150W |
