General Gaming Article
In Case You Missed It - January 24-30 Edition Posted: 30 Jan 2016 05:00 AM PST
Benchmarked: Rise of the Tomb Raider Posted: 30 Jan 2016 02:03 AM PST

Who knew raiding tombs could be so difficult? You have to love Lara and her single-minded focus on getting whatever she wants. And if you're like us, you're also jealous that she has all the money and equipment needed to jet-set around the globe to all sorts of exotic locales. I have to be honest, though: I've been camping in the snow plenty of times, and a tiny campfire in the middle of a blizzard would not be enough to keep me warm. Which is why it's more fun to run around as Ms. Croft in a virtual world where rain, snow, falling rocks, wild animals, and gunshot wounds won't faze me. Should you care to join us in this pastime, you might want to know if your rig is up to the task at hand. As the latest installment in the long-running franchise, and the second title since 2013's Tomb Raider reboot, Rise of the Tomb Raider ($54 for PC download) ups the ante on graphics requirements yet again. It's definitely not the most demanding game on the block, but if you want to crank every dial to maximum and run at a high resolution, you're inevitably going to come up short. We've posted our optimization guide detailing all of the settings and what they do, along with some recommendations on what sort of settings you should use to get 60+ fps on our latest computer builds. Now, it's time to dig through a larger assortment of hardware and provide some concrete benchmarks. Rather than trying to determine what settings we should use to hit playable frame rates, this time we're using the same settings on a large collection of graphics cards—and in some cases, processors—to see which cards can reach the summit, and which will plunge into a spiky trap of destruction.
A scenic overlook

Before we get to the benchmarks, let's take a quick detour off the beaten path to look at the scenery. As discussed in our optimization guide, Rise of the Tomb Raider comes equipped with five presets (plus "Custom") for graphics quality. We'll just bypass the Lowest preset, as frankly it looks pretty awful—though if we could travel back in time, our 2005 selves would likely be impressed. In fact, even for moderate systems, you hopefully won't need to stoop down to the Low preset, which again has a pretty noticeable drop in fidelity. Our focus will primarily be on the top three presets: Medium, High, and Very High. But if you're curious, here's what the five presets look like at 1080p: Even the Very High preset doesn't actually represent the maximum image quality—you can still enable things like SSAA, along with higher quality shadows and hair. The penalty for going from Very High to Maximum (minus SSAA) looks to be around 25 percent, however, and the minor improvements in image quality generally aren't worth the trouble. In fact, even looking at the Very High vs. High vs. Medium screenshots, you might wonder if the drop in frame rates is worth the slightly better visuals. Bottom line: you shouldn't feel bad if you have to start at the Medium preset and tweak from there, as Rise of the Tomb Raider still looks quite nice. Something else to mention is the game's use of HBAO+ for ambient occlusion, as opposed to the more pedestrian SSAO. Nvidia developed HBAO+, but unlike some previous titles, you can use the setting with both Nvidia and AMD GPUs. The catch is that it's optimized for Nvidia hardware, resulting in a larger hit to frame rates on AMD cards; for this reason, we've elected to test with HBAO+ turned off (using the "On" setting for ambient occlusion), even at the Very High preset.
If you want to pixel hunt, there are differences between the two modes, and HBAO+ looks better, but in motion we feel most gamers are unlikely to notice or even appreciate the finer nuances of HBAO+.

Check Your Equipment

So what sort of settings will we test, and what hardware are we using? We've settled on the following five configurations, along with limited testing at the Low preset for a few specific cases that we'll get to later:
If you happen to be familiar with the last Tomb Raider (2013), you might think you have a good idea of what to expect. At the highest quality settings, the patterns are pretty similar, but the 2013 release happened to scale very well to lower-performance hardware. Sure, it looked pretty awful at the lower quality settings, but a single fast GPU could reach into the hundreds of fps. Rise of the Tomb Raider is not so forgiving, as we'll see in a moment. Fast graphics cards may have to opt for High or even Medium presets (with tweaking), while moderate hardware may struggle even at the Low preset. Ouch. Don't say we didn't warn you! For our test platform, we're using one system for all the discrete graphics cards, but we'll check out a few integrated graphics solutions later and update the article. For the time being, here's our standard GPU test system:
We've also included results for two 980 Ti cards in SLI for some of the more demanding settings. On the AMD side, we tried to test a pair of R9 290X GPUs in CrossFire, but things didn't go so well, as one of our GPUs has gone belly up. Le sigh. Since that's the only pair of AMD GPUs we currently have for CrossFire testing, we don't have any results right now, but you'll see in a moment that there are other items that AMD needs to address. A few final items before we depart. First, we're running the latest graphics drivers for both AMD and Nvidia GPUs. However, Rise of the Tomb Raider is an Nvidia title ("The Way It's Meant To Be Played") and Nvidia released a Game Ready driver a couple of days ago. AMD, meanwhile, reports that it is working on an optimized driver, but for the time being it's not ready—it may come out next week, or perhaps later in the month, and we'll see about testing and updating our findings when that happens. But let's be clear: If you're a gamer eagerly awaiting a new release, Nvidia's approach to drivers is far better; you get to play the game at launch with what should be a reasonably optimized experience. It may not always be perfect (see Batman: Arkham Knight), but more often than not, having a driver tuned for a new game helps a lot. The second item we want to note is the choice of CPUs. Rather than trying to test multiple systems—which would be ideal if you want to know exactly how a particular configuration performs—we're electing to simulate slower processors using our i7-5930K. The Gigabyte BIOS allows us to disable cores and Hyper-Threading, and while the larger L3 cache is still a factor, at least we can get some idea of how mainstream parts like the i5-4690K and i3-4350 perform. Besides, testing every desirable configuration is a rabbit hole with no end in sight—we would have to look at the A10-7850K, A8-7650K, FX-8350, FX-6300, and more to really check out the CPU side of the equation.
The good news is that most games are far more dependent on GPU performance than CPU/APU performance, so our three test CPUs should at least give a good idea of what to expect.

Feeling Testy?

One final item to discuss before we get to the pretty graphs [Ed: I like pretty graphs!] is the benchmarking procedure. Rise of the Tomb Raider doesn't have a built-in benchmark mode, unlike its predecessor, which means we need to explore alternative means of benchmarking. This is good and bad—good because it's a better real-world look at the game's true performance, but bad because it's far more time consuming and it makes it hard for others to compare results with our numbers. So let's talk about what we're doing for our test sequence. We use a save at the start of the Soviet Installation level, except we've already played through the level and taken care of all the enemies. This makes the test sequence more consistent, as engaging hostiles in a benchmark inevitably leads to increased variability. We follow the same path each time, as closely as possible; each test run takes about 52 seconds. While we're running the test path, we use FRAPS to log frame rates, which we then analyze to find the average as well as 97th percentile performance. If you'd like to test your own rig using our benchmark, you can download our save file and put a copy in the appropriate folder (default is C:\Program Files (x86)\Steam\Userdata\[Unique Steam ID]\391220\remote)—don't forget to back up your own save first, if you have one. As for the benchmark run itself, just follow our path shown in the video below. Then feel free to share your results in the comments—and if you want to calculate the 97th percentile, you'll have to do that by opening the CSV file in Excel (or some other spreadsheet program).
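If you'd rather script that than fire up a spreadsheet, here's a minimal sketch of the same calculation in Python. It assumes you've already pulled a plain list of per-frame render times in milliseconds out of the log—FRAPS' actual CSV layout has more columns than this, so treat the input here as a placeholder:

```python
# Minimal sketch: average and 97th-percentile fps from a list of
# per-frame render times in milliseconds. The input format is an
# assumption; adapt the parsing to your actual FRAPS CSV.

def fps_stats(frametimes_ms):
    """Return (average fps, 97th-percentile fps) for one test run.

    The 97th-percentile figure converts the 97th-percentile frame
    time back to fps, i.e. the frame rate the run stays at or above
    for 97 percent of frames.
    """
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    ordered = sorted(frametimes_ms)  # ascending: fastest frames first
    idx = min(n - 1, int(round(0.97 * (n - 1))))
    p97_fps = 1000.0 / ordered[idx]
    return avg_fps, p97_fps

# Example: a run of ~60fps frames (16.7ms) with a few 33.3ms hitches.
times = [16.7] * 97 + [33.3] * 3
avg, p97 = fps_stats(times)  # avg ≈ 58.1 fps, p97 ≈ 59.9 fps
```

Note that different tools pick slightly different percentile conventions (nearest-rank versus interpolation), so expect small differences from Excel's PERCENTILE function.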
The game is afoot

We'll start our benchmarks with a look at graphics cards, all running on a 4.2GHz i7-5930K. This is a beefy rig, so rest assured we're making our best effort to hit high frame rates—if the CPU is a bottleneck here, there aren't many faster CPUs around (short of additional overclocking). As a reminder, we're testing at 3840x2160 using the High preset, with anti-aliasing set to FXAA.
4K can be punishing even in the best of situations, and Rise of the Tomb Raider is certainly not going to be a best-case scenario, particularly so soon after launch. We might see some performance improvements over the coming weeks, but we're far from breaking 60 fps at the Very High setting, and even using the High preset we're still coming up short on the fastest current GPUs. A single 980 Ti at stock breaks the 30 fps mark, but not by much, with lows still dipping into the high 20s; it's still playable, for the most part, but it's not an ideal solution. The GTX 980 is about 20 percent off the pace set by its big brother, averaging exactly 30 fps but with frequent dips into the low-to-mid 20s. AMD's R9 Fury X meanwhile falls between those two in average fps, but the minimums are substantially worse, often dipping into the low teens. Lack of VRAM may be part of the problem here, as a single R9 390 actually manages slightly better 97th percentile results, but drivers are almost certainly a big part of the problem. Looking at the potential for multiple GPUs to help, dual 980 Ti cards in SLI do get us well into the playable range, particularly if you're running a G-Sync display. (Our particular test display is an Acer XB280HK, if you're wondering.) Even so, we're still well short of 60 fps, and minimum frame rates end up slightly worse than with a single GPU—nothing new there. We've encountered plenty of new releases that fail to scale at all with SLI/CrossFire at launch, so getting even a 33 percent boost from the second GPU at launch is pretty decent; hopefully we'll see even greater gains in the coming weeks.
Dropping down to 2560x1440, we also move to the Very High preset (minus HBAO+, which ends up knocking about 10 percent off Nvidia GPUs and 15-20 percent off AMD GPUs). The net result is that we're rendering less than half as many pixels, but they're rendered at a higher quality, and performance ends up only improving by a moderate 30-50 percent. To hit acceptable frame rates at QHD, many GPUs will still need to run at the High or even Medium preset. As for the cards, 980 Ti SLI easily averages more than 60 fps—so pairing it with a 40-144Hz QHD display like the Acer XB270HU would make for an awesome experience. Minimum frame rates are still a bit choppy at times, however, falling just below 40 fps. A single 980 Ti boasts higher 97th percentile scores, but it still falls short of 60 fps averages; again, G-Sync would be a boon here. The Fury X does a bit better at this setting, with a clear win over the GTX 980, but it's well short of 60 fps and would benefit from a FreeSync panel like the Asus MG279Q. (Yes, we're very bullish on FreeSync/G-Sync right now, precisely for games like this where averaging more than 60 fps can be a bit difficult even with higher-end hardware.) Moving down the list, we can see that the 4GB VRAM cards are still struggling—look at the 390 vs. the 290X, where the minimum fps drops quite a bit thanks to texture thrashing on the 290X. For some reason the Fury X doesn't appear to have as much difficulty, perhaps because of its faster HBM, or perhaps due to some other architectural/driver differences. Basically, most graphics cards will need to reduce some of the quality settings to handle QHD at smooth frame rates.
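For reference, the raw pixel arithmetic behind that resolution change is easy to verify—this is just a back-of-the-envelope count, not a performance model:

```python
# Back-of-the-envelope pixel counts for 4K versus QHD.
uhd = 3840 * 2160  # 8,294,400 pixels
qhd = 2560 * 1440  # 3,686,400 pixels
ratio = uhd / qhd  # 2.25x as many pixels at 4K
```

Performance rarely scales linearly with pixel count, and the heavier Very High shaders eat into the savings, which is why the observed gain lands well below 2.25x.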
1080p with Very High settings finally allows several single-GPU configurations to run at or slightly above 60 fps. Interestingly, the GTX 980 passes the Fury X now, placing second on our charts (not counting the SLI setup). AMD's minimum fps continues to be a problem, with a lot more dips and stutters than Nvidia's cards. We won't say too much else here, as we test multiple settings at 1080p, but the Very High setting still requires at least a GTX 980 to run really well. Users with FreeSync or G-Sync displays, meanwhile, could get by with an R9 390, GTX 970, or better.
1080p with High settings finally allows the $300 GPUs to hit 60 fps averages, with the GTX 970 now holding a slight lead over the R9 390. In fact, AMD is still encountering problems, this time with apparent CPU limitations, as the GTX 970 also delivers a better overall experience than the Fury X. The 390 and 290X end up with the same average fps, but the additional VRAM on the 390 gives a huge boost to minimum frame rates. Now granted, Rise of the Tomb Raider looks quite nice, even at the High preset—in many ways, it's similar to the previous iteration's Ultimate preset. It's still a bit surprising to see so many GPUs struggling to reach playable frame rates at this setting, but we expect further driver tuning will help.
Our final GPU chart is a sobering look at graphics requirements. Here we've turned off a lot of the high visual quality settings (though PureHair remains enabled), and yet the $150-$200 cards continue to fall well short of the 60 fps mark. Considering a GTX 950 is pretty similar to a notebook GTX 970M in performance, for the time being only the fastest notebooks are going to handle Rise of the Tomb Raider without resorting to Low-to-Medium quality. Ouch. Speaking of Low quality testing, we did run a few of the cards at 1920x1080 Low (FXAA). We won't generate graphs for these results, as the visual hit is pretty severe, but the GTX 950 managed 58.8 fps average, with a 48.9 fps 97th percentile score. That at least beats the R9 285 (for now), which scored 54.8/36.2 average/97th percentile. The R9 380 meanwhile crested the 60 fps mark with 67.4 fps, and a 37.4 for the 97th percentile. So even at Low quality, several otherwise capable GPUs are failing to hit 60 fps. Double Ouch. Perhaps not surprising given the game's TWIMTBP branding, Nvidia comes out on top in most of our tests. The R9 390 at least is a decent match for the GTX 970, and the R9 380 4GB beats the GTX 950 2GB, but if you happen to run AMD hardware, we'd suggest holding off for a driver update before raiding this particular tomb. And as a final little nugget of information, we did do some limited testing of Intel's HD 530 Graphics on an i7-6700K. It's not pretty, though not primarily in the sense of rendering errors. There were a few minor rendering glitches, but the biggest problem is frame rates. Even at 1280x720 with the Lowest preset, average frame rates on HD 530 Graphics failed to break 30 fps, and in fact they're closer to 20 fps than 30: 21.6 fps average and 15.9 fps for the 97th percentile. So Intel processor graphics solutions other than Iris can basically forget about running Rise of the Tomb Raider, unless a driver update from Intel improves the situation.
Brain taxidermy

Terrible puns aside, we also wanted to look at how a few of the GPUs scale with lesser CPUs. As noted earlier, we're using a single CPU to simulate two other CPUs. It's not going to be exact, but it should be close enough for our purposes. If a game is predominantly GPU limited—which was the case with 2013's Tomb Raider reboot—then any decent CPU will prove sufficient. We've tested the 1080p Very High and Medium presets this round, and we're looking at the 980 Ti and Fury X at the top of the GPU totem pole (basically, removing GPU limits from the equation as much as possible), with the R9 380 and GTX 950 representing mainstream parts. Yikes! Have we mentioned yet that AMD needs to work on tuning their drivers for this particular title? The i7-5930K and i5-4690K are at least somewhat close, and the R9 380 doesn't do too badly with the i3-4350, but the Fury X takes a swan dive when paired with the dual-core processor. That's not encouraging, and hopefully it isn't too difficult to fix. Then again, we doubt many users are looking at running a Fury X with a budget Core i3 processor. Nvidia for their part shows far more reasonable scaling. The 6-core i7-5930K wins out overall, which it should considering it's also running a higher clock speed, but the i5-4690K isn't far off, and neither is the i3-4350. There's a bit more choppiness with the Core i3 configuration, but overall you should be fine with any single Nvidia GPU matched with any recent Core i3 or higher Intel CPU. We had hoped to check out AMD APUs/CPUs as well, but time is not on our side—perhaps we'll check that aspect once drivers are up to snuff. Dropping to Medium quality puts more of a burden on the CPU, at least for the faster GPUs. Average frame rates on Nvidia remain relatively consistent, but minimums show a clear progression when moving from i3 to i5 to i7 parts.
You could still use a 980 Ti with a Core i3 and not worry much, but again we expect most people plunking down $650 on a GPU will have at least a Core i5 processor, and more likely a Core i7. The R9 380 again has very stable results, but the Fury X is seriously handicapped by the dual-core processor—and Hyper-Threading doesn't appear to help. This is one of those items that DirectX 12 should help alleviate, as the CPU bottleneck will be reduced, but these CPU charts certainly paint AMD's drivers in a less than flattering light. Now we just need to wait and see how long it takes for AMD to rectify the situation.

Tracking the Divine Source

We've had plenty to say about performance and drivers, but the short summary right now is that Nvidia has a clear lead. That's not really a shock, given the Nvidia branding and Game Ready driver, but in an ideal world we'd see "Game Ready" drivers from all contenders on every major launch. With AMD's new Crimson drivers and a stated increased focus on all things Radeon (from the Radeon Technology Group), things are getting a bit better, but AMD isn't out of the woods yet. Interestingly, despite the Nvidia branding, AMD definitely had at least some influence on Rise of the Tomb Raider. This is the first game to come out using Eidos' new PureHair library. AMD had Eidos present at their RTG Summit last December, and one of the points of their presentation was how Eidos was able to take AMD's open source TressFX and modify it as they saw fit. The result is PureHair, which is supposedly optimized to work even better for things like animating Lara's ponytail. Considering how much time you'll spend looking at Lara's head, it's a far more noticeable graphics effect than HBAO+ in our opinion. As for Rise of the Tomb Raider, the game is treading familiar ground in terms of Lara Croft and her spelunking activities, but the reboot definitely helped to breathe new life into the series.
There have been several other good games that overlap with Tomb Raider in a variety of ways—the hunting and crafting of the Far Cry series, for instance, definitely gives a feeling of déjà vu—but that's not a bad thing. PC Gamer scored Rise of the Tomb Raider at 83, and their full review is definitely worth a read if you haven't checked it out. This isn't a series known for innovation (other than the original game back in 1996, perhaps), but it's still good fun, and the graphics are better than ever. Sometimes, that's all you really want, and we can think of far worse ways to spend a weekend.