Monday, June 8, 2020

AMD Declares 4GB of GPU VRAM ‘Not Enough’ for Today’s Games

AMD is arguing that 4GB GPUs are effectively obsolete, based on performance data showing a distinct advantage for GPUs with more than 4GB of VRAM.

The company’s argument boils down to the following slide:

Image by AMD

Independent reviews back up the idea that anyone buying the 4GB RX 5500 XT is leaving performance on the table. The average gap is small, only about 5 percent for 1440p averages and 9 percent for 1440p minimums according to TechSpot, but the modest average obscures an important point: the titles with the largest gaps between the 4GB and 8GB versions of the Radeon RX 5500 XT tend to be newer games. TechSpot reports that Call of Duty: Modern Warfare runs 22 percent faster on the 8GB card than on the 4GB model, a figure that matches AMD's own data.

It's sometimes difficult for consumers to tell how much VRAM matters at the low end of the market. Manufacturers commonly offer GPUs with more VRAM at a higher price, but customers have no interest in paying for VRAM their GPU isn't powerful enough to use effectively. If your card can't run the detail levels required to fill 4GB of VRAM, buying 8GB gains you nothing. Manufacturers are happy to sell you a higher-VRAM card because the premium they charge for the extra memory is much larger than what that memory costs them.
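To put rough numbers on what actually fills a VRAM budget, here's a back-of-the-envelope sketch in Python. Every figure in it is an illustrative assumption, not measured data from any real game; actual engines mix texture sizes, compression formats, and streaming strategies.

# Back-of-the-envelope VRAM math. All figures are illustrative
# assumptions, not measurements from any real game.

BYTES_PER_TEXEL = 4                            # uncompressed RGBA8

tex_4k = 4096 * 4096 * BYTES_PER_TEXEL         # one 4K texture: 64 MiB raw
tex_4k_bc = tex_4k // 4                        # ~16 MiB with 4:1 block compression

render_target = 2560 * 1440 * BYTES_PER_TEXEL  # one 1440p buffer: ~14 MiB

# A high-detail scene that streams a couple hundred compressed 4K
# textures plus a dozen render targets is already crowding 4GB:
scene_bytes = 200 * tex_4k_bc + 12 * render_target
print(f"Illustrative scene footprint: {scene_bytes / 2**30:.1f} GiB")

The exact total doesn't matter; the point is that texture quality settings, far more than raw GPU horsepower, determine whether a buffer this size ever gets used.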

When the Xbox Series X and PlayStation 5 arrive later this year, default console RAM capacity will jump to 16GB. GPUs don’t have to match that — consoles share RAM between CPU and GPU, while PCs have dedicated memory for each — but the gap between 4GB and 8GB cards is only going to widen from here.

The reason your total VRAM capacity is so important is that PCs use separate memory pools for CPU and GPU. When your GPU can’t find data in VRAM, it has to fetch it across the PCIe bus. PCIe 4.0 bandwidth is fast, but it doesn’t hold a candle to GDDR6 or HBM, and the interface isn’t optimized for low latency. This is why having less VRAM than required is so terrible for performance — you can see the system hitch as it tries to load a map or even a visual effect.
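The scale of that penalty is easy to sketch from published peak figures. The short Python snippet below compares the theoretical bandwidth of a full PCIe 4.0 x16 link against a 5500 XT-class 128-bit GDDR6 configuration; both numbers are nominal peaks, and real-world throughput is lower on each side.

# Nominal peak bandwidths; real-world throughput is lower on both sides.

# PCIe 4.0: 16 GT/s per lane with 128b/130b encoding, per direction.
pcie4_per_lane = 16 * (128 / 130) / 8          # ~1.97 GB/s per lane
pcie4_x16 = pcie4_per_lane * 16                # ~31.5 GB/s for a full slot

# RX 5500 XT-class GDDR6: 14 Gbps effective per pin on a 128-bit bus.
gddr6 = 14 * 128 / 8                           # 224 GB/s

print(f"PCIe 4.0 x16: {pcie4_x16:.1f} GB/s")
print(f"128-bit GDDR6 @ 14 Gbps: {gddr6:.0f} GB/s")
print(f"On-card VRAM is ~{gddr6 / pcie4_x16:.0f}x faster")  # ~7x

And that roughly sevenfold gap is only the bandwidth side; the latency of each round trip across the bus is what produces the visible hitching.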

One good reason to believe the transition from 4GB to 8GB is upon us is that the 5500 XT turns in very playable frame rates in the same titles where it winds up VRAM-bound. If a GPU scored 10 frames per second with 4GB of RAM and 15fps with 8GB, that would be a 50 percent improvement, but a useless one as far as the end user is concerned, since neither frame rate is playable. That's not what's happening here: the 8GB card is raising frame rates that were already playable.

If you already have a 4GB GPU, there's no reason to panic; games aren't going to suddenly and collectively fall off a cliff. Remember that the games in question are being tested at very high detail levels, and that certain detail settings put far more pressure on VRAM than others. It will still be possible to play future titles on 4GB cards by reducing detail and possibly resolution settings, and while that carries obvious tradeoffs, it beats being locked out of a game entirely.

One point I considered was whether AMD is making this argument because it ships more RAM at lower price points than its competitor does. I'm sure that's part of the reason, but AMD has actually been making this argument for several years now. Going all the way back to Polaris, AMD has argued that its larger VRAM buffers are an asset against Nvidia's tendency to offer narrower memory buses and less bandwidth. What's changed, in this case, is the games themselves. We're starting to see real pressure on the 4GB memory point, and it's not as if games are going to collectively start using less VRAM.

Now, does this mean no 4GB GPU can be a good buy? Of course not. If you want to play Stardew Valley and Undertale in 1080p, you don’t exactly need much GPU horsepower.

We're at the point where 8GB GPUs are demonstrating real performance advantages over 4GB cards, but the gains are still small on average and confined to certain titles. A 4GB GPU may still be a smart purchase depending on the use case and the gaming performance you require, but it's clear that 8GB will be needed soon. Buying an 8GB budget card today, ahead of that transition, will likely pay off in better performance down the road.

One last point: different GPUs from different manufacturers can use VRAM differently. Don't conclude that because AMD sees this kind of jump from a 4GB-to-8GB shift, the same is automatically true for Nvidia. I'm not saying it isn't true, but we can't assume it is without testing the same games on 4GB and 8GB versions of the same Nvidia card.


Source: ExtremeTech https://ift.tt/3hfVpGW
