Last week, we reviewed the Radeon RX 6700 XT, AMD's new midrange entry into the red-hot video-card market, and we found in our testing that the new graphics card had some issues with games based on the DirectX 11 (DX11) graphics API. As a close-priced competitor to the GeForce RTX 3070 from Nvidia (at $479 versus $499, for the companies' respective reference cards), the Radeon RX 6700 XT card performed quite competitively in modern AAA titles based on DirectX 12 (DX12). But its DX11 performance echoed an issue we first observed when testing the RX 6800 XT back in November of last year: In some games, it wasn't quite up to snuff.
This cried out for further investigation. The Radeon RX 6700 XT, like the RX 6800 cards before it, is based on AMD's latest video-card architecture, dubbed RDNA 2, the common thread among them. So, we rolled out an AMD-based testbed and employed both SAM ("Smart Access Memory," the company's own name for Resizable BAR, much more about which at the link), as well as the AMD Radeon Software overclocking suite to try to bridge the gap in DX11. In a nutshell, SAM is frame-rate acceleration tech that makes it easier for your CPU and GPU to talk to one another, in theory increasing frame rates in games with lots of textures or high-quality assets that need to be loaded in quickly.
What did we see? Well, first, some background.
RDNA 2 and DX11: Two Acronyms Apart
We first noticed the shortfalls that RDNA 2-based AMD GPUs were having with DX11 during our benchmarking of the Radeon RX 6800 XT back in November of last year. In the course of our video card testing, we always run a suite of what we call "legacy" (or older) game titles: Tomb Raider (the 2013 version), Bioshock: Infinite, Sleeping Dogs, and Hitman: Absolution. These AAA games from the early and mid-2010s help us see how well a given card handles older games that modern video-card drivers likely have not been explicitly optimized for.
During that review process, we had some back-and-forth with AMD. The company asked about the details of our benchmarking process and the specifics of the testbed PCs we were using, and it also sent us a fresh second set of the initial RDNA 2 cards (the Radeon RX 6800 XT and the Radeon RX 6800) for us to retest, to try to troubleshoot the roughly 20% performance shortfall of the RX 6800 XT vs. the Nvidia GeForce RTX 3080 Founders Edition that we saw in our legacy tests of Hitman: Absolution, Bioshock: Infinite, and Sleeping Dogs.
When we tested the Radeon RX 6700 XT, we saw the same issue when comparing it with its like-priced GeForce rival, the RTX 3070, and so we had another chat with AMD. The company was more candid about the issue of optimizing older games for the new RDNA 2 architecture. According to a statement issued to us:
“AMD strives to provide the best performance to gamers with all of our products. We prioritize optimizations that benefit the widest group of gamers, ensuring those on the AMD RDNA 2 architecture will enjoy strong performance in more modern titles as well as more broadly played popular titles using both modern and legacy APIs.”
Essentially, this means that AMD's engineers will prioritize certain games over others when it comes to which get optimized first. During a call prior to the RX 6700 XT launch, the company clarified that it relies on data from services like Steam Charts to determine which games are most popular (and therefore receive the most attention) while the drivers are being worked on.
Continuing the quote from above, in regard to legacy titles, AMD said...
“Performance uplift will vary on a game-to-game basis. We expect to see an amazing performance lift on RDNA 2 on a wide variety of games. However, you may experience lower performance uplift compared to first-generation RDNA GPUs in a small subset of CPU-bound legacy titles.”
So until AMD provides a tool that gamers can use to see which titles have or haven't been optimized for RDNA 2, it's up to benchmarkers like us to see what's up. Here's why it matters.
Who Needs DirectX 11 Performance Anymore?
Now, of course, most people will choose DX12 if presented with the option. So who is this problem really relevant for?
For legacy gamers like myself, of course. Throughout most of my gaming career, I've been the "Wait for the Steam Sale!" type. Only seldom will I pay full price for a AAA release at the time of launch. And luckily, for people like me, over the past few years entirely new subscription models of PC-game distribution have sprung up, along the lines of Netflix's "Pay once a month, and get the whole library" way of doing business.
Some services, like UPlay and EA Play, offer access to those publishers' complete back catalogs, along with their current lineup of AAA titles (including DX11 titles like Assassin's Creed: Odyssey). But other services, like Xbox Game Pass, have taken things a step further. Microsoft regularly does deals with developers from around the world, indie to AAA, to feature their games on the Xbox Game Pass service for a limited amount of time.
With 18 million subscribers and growing, it's clear that this model is popular with gamers, while also providing the primary argument for why DX11 performance still matters in 2021: Much like you'd catch the occasional 1990s show coming to Netflix as a surprise ("Hey, they have Fresh Prince of Bel-Air now!"), Xbox Game Pass contains a new surprise batch of current and legacy AAA games every week. This model means you might end up playing games you never planned to. To that end, you should have a graphics card that can keep up with whatever era of PC gaming you want to visit the next time you boot up your rig.
Raising the BAR...Er, the SAM: Let's Jump Into Testing
So then, onward to the setup and benchmarking portion of our tests. You can see the details of our usual Intel video card testing rig at our explainer How We Test Graphics Cards. For our AMD-based test setup, we installed AMD's Ryzen 9 5900X into an MSI MEG X570 Ace AM4 motherboard (our standard test platform for latest-generation Ryzen CPUs) and populated two of the DIMM slots with 16GB of memory set at 3,400MHz.
We deployed this X570-chipset-based testbed to give AMD the best possible chance to push its own GPUs to their limit (with regard to compatibility and SAM integration). Note that Intel has also enabled Resizable BAR/SAM on specific Z490, Z590, and X299 motherboards, in the event you have an Intel processor and want to try to replicate our tests for yourself with a recent Radeon card of your own. We used an NZXT Kraken Z63 280mm closed-loop liquid cooling solution, with fan profiles set to the defaults of our motherboard's BIOS settings.
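If you want to confirm that Resizable BAR/SAM is actually active on your own Linux-based rig before benchmarking, one rough check is the size of the GPU's BAR 0 aperture, which the kernel exposes in sysfs. Here's a minimal sketch under stated assumptions: the PCI address and the sample values in the comments are hypothetical, and the rule of thumb is that with SAM enabled, BAR 0 typically spans the card's full VRAM (12GB on the RX 6700 XT) rather than the traditional 256MB window.

```python
# Hedged sketch: estimate a GPU's BAR 0 size on Linux by parsing the first
# line of /sys/bus/pci/devices/<pci-address>/resource. Each line holds
# "start end flags" as hex fields; size = end - start + 1 for a populated BAR.
def bar0_size_bytes(resource_text: str) -> int:
    start, end, _flags = (int(field, 16) for field in resource_text.splitlines()[0].split())
    return (end - start + 1) if end else 0


# Hypothetical usage (the PCI address 0000:0a:00.0 is an example, not a
# real recommendation -- find yours with `lspci | grep VGA`):
# with open("/sys/bus/pci/devices/0000:0a:00.0/resource") as f:
#     print(bar0_size_bytes(f.read()) // 1024**3, "GiB")
```

A result in the single-digit-gigabyte range or higher suggests the large BAR is active; 256MB suggests it isn't.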
We put the stock Radeon RX 6700 XT up against its closest competitor in the space, the Nvidia GeForce RTX 3070 Founders Edition, both to gauge the scope of the DX11 shortfalls and to provide a baseline to which an overclocked RX 6700 XT, also backed by SAM technology, can aspire. Eagle-eyed readers will also notice a new game here. We included Wolfenstein: Youngblood in our re-testing of the RX 6700 XT, since it was both recently released on Xbox Game Pass and also utilizes DX11 as its primary graphics API.
So, with the testbed details and our Wolfenstein disclaimer out of the way, let's dive into the first portion of our testing. In these runs, we used games that had toggles between DX11 and DX12 (Shadow of the Tomb Raider and F1 2020, for example), or tested games like Sleeping Dogs and Wolfenstein: Youngblood, which are natively DX11 since they were released before DX12 became ubiquitous.
DirectX 11 Testing: Stock Settings
Let's see how the new AMD-based configuration stacked up against the original DX11 results we saw in our Radeon RX 6700 XT review. You'll see DX11 and DX12 results for the games that are switchable, and just DX11 results for the games with no DX12 option.
If you've already read our reviews of either the AMD Radeon RX 6700 XT or the RX 6800 XT, there's not a whole lot new to see here (aside from the Wolfenstein results, which tell the same story). In every benchmark we ran on a game with a DX11/DX12 toggle, the Radeon RX 6700 XT posted DX11 frame-rate results that were, on average, around 20% slower than the same test run in DX12. Furthermore, we tested the RTX 3070 Founders Edition in DX11, and saw that the card remained almost rock-solid between the two APIs, with no major variance to be seen when switching from one API to the other.
The DX11 shortfall on the RDNA 2 cards was most apparent in our Shadow of the Tomb Raider and F1 2020 testing runs at 1080p and 1440p, where we saw performance losses ranging between 15% and 34% depending on the resolution being tested. The problem repeated itself throughout the rest of our legacy-game tests, and it was especially apparent when you compare results from games like Sleeping Dogs, where the RX 6700 XT scored the same frame rate at 1080p as the RTX 3070 did at 1440p (156 frames per second, in each case).
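For clarity, the percentage losses we cite are simple relative deltas between the DX12 run and the DX11 run of the same test. A quick sketch of that arithmetic, using illustrative frame rates rather than our measured results:

```python
def pct_loss(dx12_fps: float, dx11_fps: float) -> float:
    """Percent frame-rate loss when moving from DX12 to DX11 in the same test."""
    return (dx12_fps - dx11_fps) / dx12_fps * 100


# Illustrative numbers only: a 100fps DX12 run that drops to 80fps in DX11
# works out to a 20% loss, in line with the averages discussed above.
print(round(pct_loss(100, 80)))  # 20
```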
Turn to results in games like Total War: Warhammer II, however (a core benchmark for many sites), and we see more consistent performance between the DX11 and DX12 runs. Given this game's popularity in benchmarks, it's possible that AMD's driver optimization strategy might involve singling out games with popular benchmark routines to optimize, while leaving older legacy titles out of the conversation. But we don't have evidence of that.
Recommended by Our Editors
DirectX 11 Testing: Overclock and SAM
Next, to give the AMD Radeon RX 6700 XT the best possible chance to make up for the gap in DX11, we enabled Smart Access Memory (SAM) on our MSI MEG X570 Ace motherboard, then used the AMD Radeon Software to apply an overclocking profile of 150MHz on the GPU, and 275MHz on the VRAM. (The latter is the same profile we achieved stable results with during our initial review of the card.)
Alas, all this tuning couldn't make up for RDNA 2's driver shortfalls with some of these games. Throughout our testing, we found that the combination of SAM and overclocking either did nothing or made things worse, with games scoring either identically to the stock-settings run or slightly slower, depending on the run in question.
Admittedly we did see a few tiny bumps in performance (156fps to 159fps in Sleeping Dogs, whoa!), but nothing near enough to make up for the performance disparity caused by AMD's uneven driver optimization.
Now, in the company's defense, we didn't benchmark any of the games that were showcased during its original marketing push for SAM...though this was by design. The company has shown that it knows which games are benchmarked by the major testing and tech sites, and it will target those specifically for driver optimizations. So the only way to get a real-world temperature check is to push outside those boundaries, into titles that aren't already in AMD's crosshairs before we have a chance to test them.
Can AMD Save AMD From AMD?
So, were we able to use ancillary AMD tech—an AMD platform, SAM tech, and Radeon overclocking—to get AMD's at-stock tech up to GeForce speeds?
Not exactly. Of the eight games we tested during this deep dive into Radeon performance, six showed the RX 6700 XT averaging around 20% slower frame rates than the RTX 3070 Founders Edition, depending on the game being run. These results reaffirm what we already knew from AMD's statement regarding the situation, though seeing it in practice is another thing entirely.
In the course of our tests, we found that the overall performance of AMD's RDNA 2-based GPUs is dependent on a variety of factors. These include the DirectX API you're playing on, the motherboard you're using (an AMD SAM-enabled one, or otherwise), as well as the amount of overclock applied to the card. (The AMD reference version of the RX 6700 XT we used comes at reference-stock settings, but some third parties will sell overclocked-from-the-box versions.) And though GPUs based on RDNA 2 work fine on DX11, RDNA 2 owners can expect an average frame-rate loss of 20% on certain DX11 titles, with some extreme cases cutting over 40% from the peak frame rates we saw versus the GeForce RTX 3070 Founders Edition.
Are we overstating the issue? It depends on who you are, what you play, and what you play on. If you're a casual gamer playing on a non-gaming, 60Hz monitor, a card like the Radeon RX 6700 or RX 6800 series will probably run most older games at much faster than 60fps at your desired gaming resolution, and you may not care about (or even be able to see) the potential loss in frame rates in older games. But if you're someone like me who's been waiting years to play a AAA game at high refresh rates (or just wants to replay one with a more powerful system this time around), then AMD's RDNA 2 graphics cards may not be the right pick just yet.
Still, more than four months have passed since we first observed this problem at the launch of the Radeon RX 6800 XT, the first RDNA 2 card, and the same games still show the same behind-the-pack performance. We hope that by the time we review the next new RDNA 2 card, AMD's engineering team will have taken a harder look at DX11 performance. But until then, if you play a lot of older titles (and who wouldn't, with so many on offer these days), you're probably going to be better off opting for a card in Nvidia's GeForce RTX 30 Series lineup instead...if you can find one, of course.