Intel's upcoming Emerald Rapids-SP Xeon CPUs launching in Q4 2023 are rumored to get a major boost in L3 cache while offering up to 64 cores.
Intel Reportedly Boosts Emerald Rapids Xeon CPU Cache By Almost 3x, Up To 64 Cores & 320 MB L3 Pool
Intel's Emerald Rapids-SP Xeon CPU family will be based on a mature 'Intel 7' node. You can think of it as a 2nd Gen 'Intel 7' node, which should yield slightly higher efficiency. Emerald Rapids is expected to use the Raptor Cove core architecture, an optimized variant of the Golden Cove core that is said to deliver a 5-10% IPC improvement over Golden Cove. It will also pack up to 64 cores and 128 threads, a small core bump over the 56 cores and 112 threads featured on Sapphire Rapids chips.

Now, as per HXL (@9550pro), it seems the Emerald Rapids-SP Xeon CPUs will top out at 64 cores and be available in 1S/2S server configurations. The 4S-8S platforms will have to wait until the next-generation Granite Rapids-SP Xeon chips for an upgrade. With that said, one key area expected to see a huge boost on Emerald Rapids-SP Xeon CPUs is the L3 cache. The Emerald Rapids-SP CPUs are reported to pack up to 320 MB of L3 cache, 2.84x more than the 112.5 MB of L3 cache featured on the top Sapphire Rapids-SP chip, the Xeon 8490H.
Raptor Lake-S→Raptor Lake-S Refresh
Raptor Lake-HX→Raptor Lake-HX Refresh
Emerald Rapids (Only 1S-2S) 64C 320 MB L3 Cache
— HXL (@9550pro) March 17, 2023
The interesting part is that this is just the L3 cache. While it is slightly lower than the 384 MB of L3 cache featured on AMD's top EPYC 9654 chip (which rises to 480 MB once the L2 cache is combined), the Emerald Rapids-SP CPU can come close if we combine its plausible L2 pool with the L3 cache. The current Sapphire Rapids-SP CPUs feature 2 MB of L2 cache per core, which equals 120 MB across 60 cores. Assuming Intel keeps the same per-core L2 cache for Emerald Rapids-SP, a 64-core chip would get 128 MB of L2 cache, which combined with the L3 cache equals 448 MB. That's only about 7% lower than AMD's top EPYC Genoa chip.
- Intel Emerald Rapids-SP (64-Core SKU) - 320 MB L3 + 128 MB L2 = 448 MB Total Cache
- AMD EPYC Genoa (96-Core SKU) - 384 MB L3 + 96 MB L2 = 480 MB Total Cache
- Intel Sapphire Rapids-SP (60-Core SKU) - 112.5 MB L3 + 120 MB L2 = 232.5 MB Total Cache
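As a quick sanity check, the cache totals above can be reproduced with a few lines of arithmetic. Note that the per-core L2 figures (2 MB/core for Intel, 1 MB/core for AMD's Zen 4) are assumptions carried over from the currently shipping parts, not confirmed Emerald Rapids specs:

```python
# Rough sketch of the cache math reported above.
# Assumptions: 2 MB L2 per core for Intel (as on Sapphire Rapids)
# and 1 MB L2 per core for AMD Zen 4.

def total_cache_mb(cores: int, l2_per_core_mb: float, l3_mb: float) -> float:
    """Combined L2 + L3 pool in MB."""
    return cores * l2_per_core_mb + l3_mb

emerald_rapids = total_cache_mb(64, 2.0, 320)   # 448 MB
epyc_genoa     = total_cache_mb(96, 1.0, 384)   # 480 MB

deficit = (epyc_genoa - emerald_rapids) / epyc_genoa
print(f"Emerald Rapids: {emerald_rapids:.1f} MB")
print(f"EPYC Genoa:     {epyc_genoa:.1f} MB")
print(f"Deficit:        {deficit:.1%}")
```

The deficit works out to roughly 6.7%, which is where the "about 7% lower" figure comes from.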
Intel's Emerald Rapids Xeon-SP CPUs will more or less match the core counts of the existing EPYC Milan and Rome parts, but Genoa and Bergamo offer up to 50% and 2x core/thread count increases over Emerald Rapids, respectively, and both will be available in full volume within 2023.

As for the platform details, the Eagle Stream ecosystem will support 125-350W TDP SKUs on Socket E (LGA 4677), enabling drop-in compatibility with Sapphire Rapids-SP. The HPC and data center segment will have immense scalability options ranging from 1S and 2S up to 4S, 8S, and even more sockets (via xNC support) for increased compute and core densities. The chips will come with the latest accelerators, including:
- Intel Data Streaming Accelerator
- Intel QuickAssist Technology
- Intel Dynamic Load Balancer
- Intel Advanced Matrix Extensions
- Intel In-Memory Analytics Accelerator
Besides that, the platform will enable support for faster DDR5-5600 (1DPC) while retaining DDR5-4800 (2DPC). The 8-channel DDR5 memory platform will allow up to two DIMMs per channel for a total of 16 DIMMs per socket, and each socket can support up to 24 Gb DRAM densities. Crow Pass Persistent Memory support ("Crystal Ridge 3.0") is also listed, but with Optane canned, that no longer seems to be the case. There will be four x24 UPI 2.0 links running at up to 20 GT/s transfer rates.
As far as PCIe lanes are concerned, the Intel Emerald Rapids Xeon CPUs will feature up to 80 Gen 5 PCIe lanes per CPU in addition to PCIe 4.0 lanes from the North Bridge. The platform will support the bifurcation of x16, x8, x4, and x2 (Gen 4) and will also support Shared Virtual Memory & Scalable IO Virtualization. The Emmitsburg PCH will offer 20 PCIe 3.0 lanes, 1G Ethernet for Manageability, and an x8 DMI connection rated at PCIe 3.0 speeds. For security, the platform will offer:
- Intel Trust Domain Extensions
- Intel SGX With Integrity
- TME-MK (128 Keys)
- Platform Firmware Resilience (PFR) with Peripheral Device Attestation
- Hardware Enforced Execution Controls
- Intel VT-Redirect Protection (Formerly HLAT)
- Intel Control-Flow Enforcement Technology (CET)
- VM Denial of Service Prevention
Intel reaffirmed that its Emerald Rapids CPUs are sampling and that it has completed the first power-on with top customers.
Next-Gen Intel Xeon vs AMD EPYC Generational CPU Comparison (Preliminary):
| CPU Name | Process Node / Architecture | Cores / Threads | Cache | DDR Memory / Speed / Capacities | PCIe Gen / Lanes | TDPs | Platform | Launch |
|---|---|---|---|---|---|---|---|---|
| Intel Diamond Rapids | Intel 3 / Lion Cove? | TBD | TBD | DDR5-7200 / 4 TB? | PCIe Gen 6.0 / 128? | Up To 425W | Birch Stream | 2025? |
| AMD EPYC Turin | 3nm / Zen 5 | 128 / 256? | TBD | DDR5-6000 / 8 TB? | PCIe Gen 6.0 / TBD | Up To 600W | SP5 | 2024-2025? |
| Intel Granite Rapids | Intel 3 / Redwood Cove | TBD | TBD | DDR5-6400 / 4 TB? | PCIe Gen 5.0 / 128? | Up To 500W | Birch Stream | 2024 |
| Intel Sierra Forest | Intel 3 / Crestmont | 144 / 144 | TBD | DDR5-6400 / 4 TB? | PCIe Gen 5.0 / 128? | Up To 500W | Birch Stream | 2024 |
| AMD EPYC Bergamo | 5nm / Zen 4C | 128 / 256 | 512 MB L3? | DDR5-5600 / 6 TB? | PCIe Gen 5.0 / TBD? | Up To 400W | SP5 | 2023 |
| Intel Emerald Rapids | Intel 7 / Raptor Cove | 64 / 128 | 320 MB L3? | DDR5-5200 / 4 TB? | PCIe Gen 5.0 / 80 | Up To 375W | Eagle Stream | 2023 |
| Intel Sapphire Rapids | Intel 7 / Golden Cove | 56 / 112 | 105 MB L3 | DDR5-4800 / 4 TB | PCIe Gen 5.0 / 80 | Up To 350W | Eagle Stream | 2023 |
| AMD EPYC Genoa | 5nm / Zen 4 | 96 / 192 | 384 MB L3? | DDR5-5200 / 4 TB? | PCIe Gen 5.0 / 128 | Up To 400W | SP5 | 2022 |