General Gaming Article


Doctor: GPU Backplates, Excessive Power, Multiple Monitors

Posted: 03 Aug 2015 02:49 PM PDT

This article was published in the September 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

Updating Storage

Doc, I own a Samsung EVO 250GB SSD, which Maximum PC recently identified as having new firmware to alleviate memory/performance loss. After upgrading my Samsung Magician suite, the software detected the update, but failed to install it. After a little research, it appears the AMD chipset (990FX) on my Asus motherboard and the Samsung firmware utility are at odds. Supposedly, there is a newer AMD SATA controller driver, but I'm unable to locate it. It isn't on Asus's website or AMD's. I'm at a loss. I wrote to Asus and its customer service told me to flash my motherboard's firmware to version 2501, which I already have. –Steve M

The Doctor Responds: It sounds like you've done a fair bit of research already, Steve, so hopefully this gets you to where you need to be. Head over to http://bit.ly/1eiYpgr, scroll to the bottom, and on the right-hand side you should see "AMD Chipset, AHCI, USB 3.0 and RAID drivers." Choose either the 64- or 32-bit package, and away you go.

Four for One

Love your column! I have a few questions for you. You helped with my decision to buy an EVGA GeForce GTX 980 SC because it vents out the back of the computer. It didn't come with a backplate, but you can buy one from EVGA. Will this extend the life of the card? The 980 is also G-Sync-capable. It sounds great on paper, but can our eyes really tell the difference between 60 and 144Hz? I have a Core i7-2600K at 4GHz+, with 8GB (4x 2GB) of DDR3 running in dual-channel mode on an Asus P8Z77-V Deluxe motherboard. Would it be better to have two sticks instead?

Finally, I've been using SSDs for a few years. My system boots from a 256GB Patriot Wildfire and I love the speed. But it's really temperamental with my favorite game, World of Tanks. Last time there was a 6GB update, it wouldn't install and I'd lose the download. I installed the game to my WD spinner and haven't had a problem since. My brother was having a similar problem, solved by a mechanical disk. Is it the SSD or WOT causing these issues? – Bart Cubbage

The Doctor Responds: Always happy to help, Bart. That backplate won't necessarily make your card last longer. It does help draw heat away from the memory modules on the back of the PCB. However, in the Doc's conversations with Nvidia, it's apparent the plate's purpose is largely aesthetic. It can actually be a detriment to cooling performance for folks with two cards in SLI right next to each other. Fortunately, your board's PCIe x16 slots are three spaces apart. If you like the way it looks, you have the Doc's blessing to buy it.

As far as G-Sync goes, the technology improves several aspects of gaming. In short, you'll see the biggest benefit from G-Sync at frame rates under 60. If you also own a high-refresh rate monitor, G-Sync will continue to eliminate the stuttering that happens with VSync turned on (and the tearing between frames with it off)—the effect is just less pronounced.

The Doc wouldn't recommend changing out your memory just to consolidate, unless you're upgrading to 16GB or more. Fewer modules can help with overclocking and/or more aggressive memory timings. You aren't going to see a difference in the real world.

Finally, there's no technical reason why World of Tanks wouldn't run on your SSD—to the contrary, it appears an SSD may improve its performance. Is your drive full to the point that there wasn't enough free capacity to download, extract and install the patch?

Storage Setup

Doc, I built a computer a couple of years ago. The case turned out to be defective, with its USB 3.0 interface shorted. When I tried to plug a thumb drive into it, everything shut down. The third-party controller was fried, and I've had to live with it since. Z77 boards are getting hard to find, so I ordered an ASRock model from Newegg before they are gone. I'm still deciding on a case. The question: Should I set the BIOS to AHCI before I install the boot drive? It's a SanDisk 120GB SSD. Any pitfalls to look out for during the swap? –Bill Kirchmeyer

The Doctor Responds: If you're going from one Z77-based board to another, Bill, the swap should be fairly easy. Typically, these operations go sideways when the chipset vendors are different, old graphics drivers aren't properly cleared off before switching from one vendor to another, or storage is misconfigured. Maintaining the same hardware means the requisite software will mostly be resident. Third-party controllers might vary between the mobos. However, don't expect any of them to be show-stoppers. Simply install their driver packages once you're back up and running.

Is your SSD currently set up to use AHCI? If so, that's the route to go. There are ways to toggle between IDE and AHCI, forcing Windows to load one driver or the other at start-up. But you won't need to if you start and end with this more modern mode.

Barrier to Entry

My HP 210 G1 needs some help! HP says an SSD has not been "qualified" for it, so adding one will void the warranty. Is there usually something in a laptop like mine that prevents installing an SSD? I'm willing to accept some risk, but I don't want to screw things up. –Art Hudson

The Doctor Responds: Many notebook vendors do employ whitelists to restrict the add-in cards their systems support (typically Wi-Fi cards). However, the Doc has never encountered an issue with SATA-based storage. You're going to be pulling out a 320 or 500GB disk, according to HP's datasheet, and replacing it entirely. If your existing partition fits on the SSD, consider a model that comes bundled with cloning software. Else, you'll need to start over with a fresh installation of Windows—not always a bad thing, but more labor-intensive.

Right-Sizing Your PSU

Hey Doc! I'm currently saving for a new PC. I've purchased an NZXT H440 chassis and Corsair RM750 PSU. Next, I plan on buying an Intel Core i5-4690K, a 256GB SSD, a 1TB hard drive, and a GeForce GTX 970. Is the 750W supply too much? If so, will it damage my system? Thanks! –Benji Smith

The Doctor Responds: Although you don't need 750W of output to power those parts, you certainly won't hurt anything by using the RM750, Benji. Think of the supply's rating as a ceiling; anywhere under 750W, it'll take good care of you. In fact, because the RM750 includes a feature called Zero RPM Fan Mode, there's a good chance that, most of the time, it won't get hot enough to require active cooling, meaning it'll operate silently.

Let's say you're running a theoretical application that taxes all of your hardware simultaneously. You'll be well under 300W of power consumption. Add a second GeForce GTX 970 in SLI if you want. Overclock your CPU and GPUs within the limits of air cooling. That 750W PSU still has more than enough headroom.
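For a rough sense of why 750W is overkill here, a minimal power-budget sketch follows (Python, purely illustrative; the per-part wattages are approximate TDP/board-power figures assumed for the sake of the example, not measurements):

```python
# Rough power budget for the build described above. Per-part figures are
# approximate TDP/board-power values (assumptions), and summing TDPs tends
# to overstate real combined draw, so actual consumption sits lower still.
parts_watts = {
    "Core i5-4690K (88W TDP)": 88,
    "GeForce GTX 970 (~145W board power)": 145,
    "Motherboard, RAM, and fans (rough allowance)": 40,
    "256GB SSD + 1TB hard drive": 12,
}

total = sum(parts_watts.values())
psu_rating = 750

print(f"Estimated worst-case draw: ~{total}W")               # ~285W
print(f"Headroom on a {psu_rating}W unit: ~{psu_rating - total}W")
print(f"Load as a share of the rating: {100 * total / psu_rating:.0f}%")
```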

Output Alphabet Soup

Dear Doctor, I got myself into a pickle when I bought a new video card: EVGA's GeForce GTX 980 Superclocked. The card supports up to four monitors across one HDMI, one DVI, and three DisplayPort connectors. I'm currently using two Asus VS229H-P monitors, each equipped with one VGA, one DVI, and one HDMI connector, but no DisplayPort. They're attached using DisplayPort-to-HDMI male adapter cables. I want to add a third and possibly fourth monitor, but want to know which is better: DisplayPort, DVI, or HDMI. I cannot find any tests to guide my choice. Would you stay with the current setup, change the cables to DisplayPort-to-DVI, or buy new monitors that support DisplayPort and use that interface from end to end? Please help! –Jerry Franco

The Doctor Responds: If you like your monitors, there's no reason to replace them. In fact, four of those VS229H-Ps would look nice lined up on a desk. You will, however, have to work around the different outputs offered by EVGA's card and the displays' inputs.

In this specific application, there's no real reason to favor DVI, HDMI, or DisplayPort. All three interfaces support the panel's native 1920x1080 resolution at 60Hz. HDMI will carry an audio signal to the 3.5mm mini-jack, if you plan to use headphones. So, feel free to keep your existing screens hooked up the way they are. Add a third using the DVI cable that Asus bundles with its VS229H-P. And, should you decide to try a fourth, go HDMI to HDMI or connect a third DisplayPort cable adapting to either DVI or HDMI.

The Age-Old Question

Hey Doc, I'm looking for a relatively inexpensive upgrade to my current GPU, but I'm not sure if it'd be better to just start a new rig. I currently have a Core i7-3770 running Windows 7 64-bit with 32GB of RAM on an Asus Sabertooth P67 mobo. The whole configuration is backed by a 750W PSU. My current graphics card is a Radeon HD 6870, and it's definitely the low performer in my build. I had a second card running in CrossFire, but was having heat problems due to limited placement options for my PC. Is there a decent upgrade for me? –Berek Marcus

The Doctor Responds: Relatively inexpensive is… well, relative. What the Doc will say, though, is that you're rocking a powerful platform. A meaningful overhaul would be pricey. Assuming you're already enjoying the benefits of an SSD, replacing the 6870 would be the next logical upgrade. It sounds like you're especially sensitive to thermals, Berek. If the sub-$200 range sounds reasonable, Nvidia's GeForce GTX 960 would serve up a notable speed-up in your favorite games. Its 120W TDP is quite a bit lower than the 6870's, too.

Submit your questions to: doctor@maximumpc.com

Doctor: GTA Performance, $300 Graphics Cards, Thunderbolt Pains

Posted: 03 Aug 2015 02:32 PM PDT

This article was published in the August 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

Dual Graphics?

Greetings Doc. My rig consists of an ASRock FM2A88X-ITX+, an AMD A10-7850K APU, two Crucial 240GB SSDs in RAID 1, and 8GB of Kingston HyperX DDR3 at 2400 MT/s. My question is which would be a better graphics solution: install a Radeon R7 250 in the PCIe slot and enable AMD's Dual Graphics tech, or forget the on-die graphics and install a Radeon R9 (or something comparable) in the PCIe slot? —Rick Stephenson

The Doctor Responds: If you're using a Mini-ITX mobo, Rick, the Doc can't help but wonder if your small form factor chassis can accommodate a dual-slot PCI Express card. Assuming it can, though, and that you have plenty of room (and airflow) in your case, the Doc would almost certainly bypass the on-die graphics and spring for something like a Radeon R9 270. You should be able to find one for around $150.

Why go that route when you already have a Radeon R7-class engine with 512 shader cores in your APU? As a rule, one graphics processor is better for compatibility than two or more. This is doubly true for AMD's Dual Graphics, which relies on up-to-date CrossFire profiles for increased performance. Even then, higher benchmark results aren't always indicative of a better experience with Dual Graphics. The Radeon R9 270, on the other hand, is still relatively affordable, gives you more processing power than the APU and R7 250 combined, and will almost assuredly support the newest games before a Dual Graphics configuration. Plus, it's well-balanced with the two Steamroller-based modules in your A10-7850K—paying more for an even faster card may not be worthwhile if it's bottlenecked by the APU.

GTA V Frame Rates

I started out by writing a long letter explaining the workarounds I've tried. But I'll distill it down to this: my frame rates in GTA V are horrible—15 to 20fps. I've done everything short of nuking my machine and reinstalling Windows. Given my system specs, I expect to see 30-40fps nominally. I'm not running at some insane resolution; I'm at 1920x1080.

Before pulling the trigger, there's one item I wanted to run by you to see if it could be the culprit. My motherboard (GA-Z68XP-UD3-iSSD) has "Dual PCI Express 2.0" printed on the board just above the PCIe slot. GPU-Z also shows "PCIe 16x 2.0." However, the Gigabyte website says the PCIe slot is 3.0-capable. Since third-gen PCIe offers nearly double the throughput of 2.0, could this be a bottleneck causing the frame rate to suffer? If I have to get a new board, so be it. I just want to know what's going on. —Michael Guimond

The Doctor Responds: Performance issues can be very frustrating to troubleshoot, Michael. But the Doc will focus on your PCIe question so you can move on to other possible causes before venting your frustrations on the denizens of Los Santos. Gigabyte's Z68XP-UD3-iSSD may support PCIe 3.0 transfer rates through its x16 slot, but because the controller it communicates with is built into your CPU, that component needs to support PCIe 3.0 as well. Unfortunately, Intel's Core i5-2500K is limited to PCIe 2.0, so you'd need something like a Core i5-3570K (or another Ivy Bridge-based CPU) for third-gen throughput. If you upgrade, be sure to flash the latest BIOS before removing your i5-2500K.

The good news is that PCIe bandwidth probably isn't your problem. Even a modern card like the GTX 970 doesn't move enough information over the PCI Express bus to saturate an 8GB/s link. You might want to try overclocking your i5 to see if that helps, though. GTA is notoriously processor-bound, so higher clock rates could improve performance. Also, make sure you have the latest patch. Rockstar recently published a handful of updates that made a big difference.
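If you want to sanity-check that 8GB/s figure, a minimal back-of-the-envelope sketch follows (Python, purely illustrative): PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, which works out to roughly 500MB/s of usable bandwidth per lane in each direction.

```python
# Back-of-the-envelope check of the 8GB/s figure for a PCIe 2.0 x16 link.
GT_PER_S = 5.0               # PCIe 2.0 signaling rate per lane (gigatransfers/s)
ENCODING = 8 / 10            # 8b/10b line coding overhead
LANES = 16

per_lane_mb_s = GT_PER_S * ENCODING * 1000 / 8   # ~500 MB/s usable per lane
link_gb_s = per_lane_mb_s * LANES / 1000         # ~8.0 GB/s per direction

print(f"PCIe 2.0 x{LANES}: ~{per_lane_mb_s:.0f}MB/s per lane, ~{link_gb_s:.1f}GB/s total")
```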

Shopping for Balance

Hey Doc, I received a Mini-ITX case (Cooler Master Elite 130) from a buddy and decided to build a system with it. I dropped in a Gigabyte GA-Z97N-WIFI motherboard with a quad-core Intel Core i5-4590 at 3.3 GHz. It's liquid-cooled by a Cooler Master Seidon 120V, and I maxed out the RAM using a G.Skill Ripjaws X Series 16GB kit. I have a Samsung 850 EVO 250GB SSD for the OS and a Western Digital Blue WD10EZEX 1TB for data that I repurposed from an old machine. The whole setup is powered by a Cooler Master G550M 550W PSU.

The issue I'm having is that the integrated graphics just isn't doing it for me. Can you suggest a graphics card that will work with my system and really kick butt? I have about $300 to spend and have been looking at the Radeon R9 290 and GeForce GTX 900-series. I should have enough available power to support either, but I'm not sure if one card offers an advantage over the other. I trust you and Maximum PC to give me an honest and unbiased recommendation. —Steve

The Doctor Responds: It's a privilege to have your trust, Steve. That's an otherwise highly capable PC you have, and your Cooler Master Elite 130 does in fact have room for any high-end graphics card you want to install (even though it's a Mini-ITX enclosure).

The Radeon R9 290 is a great card at around $250. Stepping up to a GeForce GTX 970 would be even better (it's faster, uses less power, and generates less heat). But that would put you over your budget. If 3D performance is your top priority, nothing beats AMD's R9 290 for less than $300.

Office Uproar

My company's IT group just forced me to "upgrade" to Office 2013. Every time they do these upgrades, everyone spends months figuring out how to undo the changes Microsoft made to its latest version. One I can't figure out is how to move the star icon for Favorites from the right-hand side of the screen back to the left, where I'm used to it being. Can you help? Also, is there a place we can send messages to Microsoft, telling them to stop making all these stupid changes?

The Doctor Responds: The Doc must assume this update also involved a new version of Internet Explorer, which now has its "View favorites, feeds, and history" tab in the upper right-hand corner between Home and Settings. There are a couple of ways to access this. First, you can turn on the Favorites toolbar by clicking "View > Toolbars > Favorites bar." This will put any link you've specifically added to the Favorites bar right under the Menu bar across the top. Or, if you'd like to enable the Explorer bar, which is the pane on the left-hand side of the screen with all of your favorites, click "View > Explorer bars > Favorites." Of course, if the Doc is misunderstanding the favorites to which you're referring, please feel free to send clarification. If you'd like to submit feedback, check out connect.microsoft.com. The company publishes a list of software currently accepting bugs and suggestions. IE is on the list, but Office is not.

Thunderbolt Anger

I look forward to your column every month and finally have a problem that requires the Doctor's attention.

For some time, I've wondered why Windows is not embracing Thunderbolt tech. While some might covet its video capabilities, I've always been interested in Thunderbolt's data transfer speed. When Asus introduced its X99-based mobos, I thought it was time to give Thunderbolt a try. One of the features flaunted was that the X99 Deluxe worked with its ThunderboltEX II PCIe cards. I bought the X99 Deluxe, a ThunderboltEX II/DUAL, a Core i7-5930K CPU, and the rest of the hardware necessary for my new build. The build was uneventful, until it was time to install the Thunderbolt card. Although I'd read that the X99 Deluxe and the ThunderboltEX II/DUAL were compatible, it turns out the card has a nine-pin header, while the mobo has a five-pin Thunderbolt header. The cable that came with the card had nine pins on both ends and simply wouldn't work.

I quickly found out the ThunderboltEX II cards were initially intended for the Z87-series motherboards, which had nine-pin Thunderbolt headers. Apparently, some later versions shipped with a nine-to-five-pin adapter cable that allowed them to be used with the Z97 and X99 mobos. I began an exchange with Asus to get the correct cable, leading to my most frustrating tech support experience ever!

First, I was told the two weren't compatible and that I should get a new motherboard. Then I was told to call the Asus eStore (I did; they didn't know what I was talking about). Next came the advice to order a newer version of the motherboard that came with the cable (no X99 boards do). How about looking on eBay? Or try a third-party Thunderbolt card with a five-pin cable.

At this point I've had it with Asus. But I'm stuck and would like to know if there's any other way to get the card to work with the X99 Deluxe, or if a third-party card is available that will work with the five-pin header. Thanks for all you do. –Tony Paradowski

The Doctor Responds: This isn't the first complaint the Doc has fielded about Asus's Thunderbolt products, Tony. But you're right—that mobo and the ThunderboltEX II/DUAL should work together.

Company representatives say that customers with the older kit who contact the Asus service department are eligible to receive an adapter. Indeed, they claim many of these have already shipped out. The Doc knows some folks over at Asus who should be able to rectify the situation for you. Expect an email follow-up from one of them in the days to come.

Submit your questions to: doctor@maximumpc.com

Intel SSD 750 1.2TB Review

Posted: 03 Aug 2015 12:52 PM PDT

at a glance

(+) Flash
Superb performance; simple to use; huge capacity.

(-) Too Much Cash
Extremely expensive; 2.5-inch version needs rare connections.

This article was published in the July 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

Is this the second coming for SSDs?

Wipe the slate clean. Forget the SATA interface and forget the AHCI protocol. This is high bandwidth NAND Flash-specific storage of the highest order. This is solid state all grown up and forging its own path.

Since the first consumer SSDs hit our PCs, they've been piggybacking on high-latency mechanical drive tech in terms of their interfaces and protocols. In the beginning of the SSD revolution, running up against the 600MB/s limit of the SATA 6Gb/s interface wasn't much of a problem. But our SSDs quickly became more trustworthy and more capable, and suddenly they were bumping their heads against the limits.

Along came PCIe-based drives, but they were mostly still smaller SATA drives connected to a PCIe RAID controller. Then came actual PCIe interfaces specifically designed for SSDs. The new wave of Flash storage had begun. The M.2 socket and the still mostly unused SATA Express arrived with the Z97 and X99 chipsets.

But breaking clear of the bandwidth limitations of the SATA interface is only one strand in unlocking the true potential of solid-state storage. The other is all about getting around the legacy setup. The AHCI protocol was introduced when SSDs were a mere twinkle in an old USB stick's eye, and has been almost inextricably linked to high latency spinning platters.

That setup still works fine for mechanical drives, but the legacy commands in AHCI still have to be run through, even when an SSD is in place. The drive then has to go through each legacy command, even if they have no relevance to high-speed SSDs. That wastes an awful lot of CPU cycles. When Intel introduced its enterprise-level DC P3700 NVMe drives, it used the Linux AHCI stack as an example of the wasted time and power that goes into running an SSD across the legacy protocol. Using a quad-core i5, the Linux AHCI stack costs around 27,000 CPU cycles per I/O, meaning it needs a full 10 Sandy Bridge-level cores to drive one million IOPS (input/output operations per second). With the streamlined NVMe Linux stack, that's cut to around a third, needing only 10,000 CPU cycles per I/O to reach the magic million figure.
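A minimal sketch of that arithmetic follows (Python, purely illustrative; the cycles-per-I/O figures come from Intel's example above, while the ~2.7GHz per-core clock is an assumed round figure for a Sandy Bridge-class core):

```python
# How many cores does each storage stack need to sustain one million IOPS?
TARGET_IOPS = 1_000_000
CORE_HZ = 2.7e9   # assumed clock for a Sandy Bridge-class core

for stack, cycles_per_io in [("AHCI (Linux stack)", 27_000), ("NVMe (Linux stack)", 10_000)]:
    cores_needed = TARGET_IOPS * cycles_per_io / CORE_HZ
    print(f"{stack}: ~{cores_needed:.1f} cores to sustain {TARGET_IOPS:,} IOPS")
# AHCI works out to ~10 cores, NVMe to ~3.7 -- roughly a third of the CPU time.
```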

The Gigabyte Era

That enterprise drive is important, not just for providing an example of how powerful the NVMe protocol can be. It also forms the basis of this first consumer NVMe drive. The bespoke controller at the heart of the P3700 is the same 18-channel controller as the one that makes the SSD 750 so speedy. That controller is kept honest with a hefty 2GB of DDR3 cache on this 1.2TB version.

Its top speed? The peak sequential numbers from ATTO are unprecedented. The reads were almost 2.7GB/s and the writes were 1.3GB/s. The scores on the harder AS SSD benchmark were lower, but 2.1GB/s and 1.2GB/s respectively isn't bad. We've shifted from talking in MB/s to GB/s here…

There are some interesting scores when we get to the 4K random read/write performance. At 38MB/s, the 4K random read speed isn't amazing, but the write performance of 200MB/s is stunning, nearly twice as quick as the closest competing drive. Compare all that with the fastest PCIe drive we've previously tested, the Samsung XP941, and you can see where the AHCI protocol is holding things back. Its 4K write performance is a quarter of what the SSD 750 can achieve.

There we have it then. The 750 is the fastest SSD we've ever tested, and by a comfortable margin. But it's also one of the most expensive. The problem with being a brand new technology, and being first to market, is that it will always command a high price tag. That's especially true given there's still no NVMe competition. Couple that with the fact it's Intel, and prices were always going to be sky-high. This is like the Extreme Edition CPU of solid-state drives.

But there's a certain justification. Intel has had to invest heavily in the NVMe ecosystem, working to ensure all the mobo makers are able to get BIOS support into their Z97 and X99 boards. The SSD makers who follow up with inevitably cheaper NVMe-based SSDs won't have that R&D cost.

There's a little bit of the Titan X about the SSD 750. It shares its basic core silicon with its pro-class brethren, the P3700, so it delivers similar levels of performance. But the 1.6TB enterprise drive is well over $5,000, while this high-capacity, high-performance drive is just $1,030. Looking at the cost per GB makes it look less extortionate. When Intel released the X25-M, the first real performance SSD (for all its subsequent faults), it cost $3.72 per GB. This 1.2TB drive delivers a far lower 86c/GB.
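As a quick sanity check of those per-gigabyte figures, a minimal sketch follows (Python, purely illustrative; the X25-M figure is taken straight from the text, while the SSD 750 figure is derived from its price and capacity):

```python
# Cost-per-gigabyte comparison from the review, recomputed.
X25M_DOLLARS_PER_GB = 3.72        # figure quoted in the text
SSD750_PRICE, SSD750_GB = 1030, 1200

ssd750_per_gb = SSD750_PRICE / SSD750_GB
print(f"SSD 750: ${ssd750_per_gb:.2f}/GB (~{ssd750_per_gb * 100:.0f} cents)")
print(f"Roughly {X25M_DOLLARS_PER_GB / ssd750_per_gb:.1f}x cheaper per GB than the X25-M")
```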

And prices will tumble. This is the birth of NVMe. The ecosystem is there now, so you can bet NVMe-based SSDs will be appearing from everyone by the end of the year. It'll go crazy once Marvell and SandForce's controllers are out there. June's Computex is likely to be built upon a sea of NVMe drives. Plus, we doubt this is as fast as an NVMe drive can go. The 750 sure looks lightning-fast, but we'll know more once competing NVMe controllers are out in the wild. For now though, this is as quick as it gets.

$1,030, www.intel.com

Benchmarks

Intel SSD 750 1.2TB
Samsung XP941 512GB
Samsung 850 Pro 512GB
ATTO Read/Write Speed (MB/s)
2,671/1,322
1,060/997
561/532
AS SSD Read/Write Speed (MB/s)
2,136/1,175
1,055/885
528/500
AS SSD 4K Read/Write Speed (MB/s)
38/200
23/49
37/107
5GB File Compression (seconds)
69
73
73
30GB Folder Transfer (seconds)
120
132
169
PCM8 Consistency (index)
4,906
4,849
4,751

Best scores are bolded. Our testing platform is a stock-clocked Intel Core i7-4770K on an Asus ROG Maximus VII Hero Z97 motherboard with 8GB Corsair Dominator DDR3 at 1,600MHz.

Specifications
Capacity
1.2TB
Memory
Intel 128Gb 20nm MLC NAND
Controller
Intel CH29AE41AB0 (18-channel)
Interface
PCIe 3.0 x4 - NVMe
Cache
2GB DDR3
Warranty
5 years

Acer XG270HU Review

Posted: 03 Aug 2015 12:49 PM PDT

at a glance

(+) Yup
FreeSync; high refresh rate; cheaper than the Asus RoG Swift.

(-) Nope
Still TN, and not a brilliant TN; ghosting with FreeSync; fairly pricey.

This article was published in the July 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

Our first FreeSync monitor touches down, but can it give the pricey Asus Swift a run for its money?

Acer's new XG270HU is the first monitor we've tested that offers AMD's frame-syncing technology, FreeSync. It's a high-end, high-refresh-rate panel that ought to deliver the best possible FreeSync experience at a fraction of the cost of something like Asus's range-topping Republic of Gamers Swift PG278Q screen.

In case you're wondering what FreeSync is, it's essentially a technology that leverages the Adaptive Sync part of the DisplayPort 1.2a protocol to enable AMD graphics cards to speak to compatible monitors and display frames from the GPU when they're ready.

Nvidia's G-Sync and AMD's FreeSync are both designed to do the same thing, but the basic difference is that the green team requires the monitor manufacturers to license the G-Sync tech and install a specific hardware module in their displays, to sync with GeForce GPUs. AMD's tech is based on the newly established DisplayPort 1.2a Adaptive Sync protocols, and as such needs no new hardware, nor licenses. Because, y'know, it's free syncing.

Initially, we thought the Acer had grabbed the same AU Optronics panel the Asus Swift is using and, given that it's almost $300 cheaper, that would be a major feather in the cap for both Acer and AMD's FreeSync initiative. A quick specs check would seem to corroborate that initial thought—both are native 2560x1440 panels; both use TN tech with a 1ms response time; and both run at a 144Hz refresh rate. But there's one key spec that separates the two. The Swift is a true 8-bit screen, while the XG270HU uses Frame Rate Control (FRC) to enhance a 6-bit panel.

Putting the monitors cheek-by-jowl, you can see they're most definitely not the same. Aside from possibly a slightly improved viewing angle for the Acer panel, it loses out to the Swift in terms of image quality. Sometimes only by a touch, but sometimes by a wider margin. The contrast and gradients on the Acer are only slightly behind the expensive Asus monitor, but it's a long way behind when it comes to the black reproduction. It's also not so hot on the whites either, but both being TN panels, neither are exactly paragons in that regard.

Stand and Be Counted

The addition of the G-Sync license and hardware isn't the only reason the Swift is more expensive. The Acer is just making do with a limited tilt stand—there's no height adjustment, which is a bit of a miss considering TN's paucity of vertical viewing angle.

There's also a noticeable amount of ghosting on the panel when you're using FreeSync, too, something Nvidia claims isn't present with G-Sync panels because its hardware is set up specifically for each panel to operate optimally. With FreeSync, AMD says it's up to the panel manufacturers to ensure dynamic refresh rates and pixel persistence is tuned for each screen.

Our first experience of FreeSync isn't brilliant, then. But it wasn't with G-Sync, either. This is a new technology, so the early screens aren't going to be perfect, and the XG270HU most certainly isn't that. At $500 it's also not really toeing the line that FreeSync isn't adding a price premium—you can pick up a 27-inch IPS 1440p panel for around $100 less.

Still, FreeSync is great if you're already running a compatible AMD graphics card. Games certainly run much smoother than they do with V-Sync alone, and this Acer panel doesn't need any messing around to get that up and running. We do, however, still balk at this price for a TN screen, especially when IPS and MVA Adaptive Sync-capable screens are on their way.

$500, www.acer.com

Specifications
Screen Size
27-inch
Native Resolution
2560x1440
Panel Technology
TN
Refresh Rate
144Hz
Response Time
1ms
Inputs
DisplayPort, HDMI, DVI

Samsung SM951 Review

Posted: 03 Aug 2015 12:44 PM PDT

at a glance

(+) Light Speed
The fastest M.2-based drive we've tested; certainly the fastest 256GB consumer drive around; bootable with the right mobo.


(-) Stalled
Still expensive for a 256GB drive (though it's only $40 more than its predecessor); tricky to find without buying it pre-installed; not an NVMe drive.

This article was published in the July 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

Another blisteringly fast, but slightly easier-to-find, Samsung M.2 PCI Express SSD

A little while back, we got our hands on Samsung's 512GB XP941 M.2 PCIe SSD—a rare creature in the wild, but one worth tracking down for its stunning performance. Well, stunning that is, if you could run it at its full x4 PCIe speed. But now we have our hands on its successor, the slightly easier-to-obtain SM951.

This M.2 drive first saw the light of day at Samsung's annual shindig, the SSD Global Summit, which took place in South Korea last July. When it was first announced, the company claimed it would be the world's first NVMe SSD for the PC, but that idea got kicked out of the window by the powers that be, as the drive only supports the good old AHCI architecture. (Samsung has just announced it has started mass production of the NVMe version of the drive, called the SM951-NVMe.)

Like the XP941, the SM951 is handled by the Korean firm's OEM branch, which is why getting hold of one without buying a device with it pre-installed is a bit tricky. The drive comes in three capacities, the entry level 128GB, the 256GB drive (our review sample), and the flagship 512GB unit. There's no 1TB drive, as per the original announcement, and all are the M.2 2280 format (22mm wide, 80mm long).

The XP941 used a 3-core Samsung S4LNO53X01 controller with a PCI Express 2.0 x4 interface. The SM951, meanwhile, uses a Samsung S4LN058A01 controller, again a 3-core chip, but with a PCI Express 3.0 x4 interface. So, what does moving from PCIe 2.0 to 3.0 do? Well, in theory it doubles the bandwidth—you only have to look at the quoted sequential read/write performance figures of the drives to see the advantage of the 3.0 interface. The 256GB XP941 is quoted at up to 1,080MB/s reads and 800MB/s writes, while the 256GB SM951 surpasses that by some distance with figures of 2,150MB/s reads and 1,200MB/s writes. But while this drive has hugely fast sequential speeds, it remains well short of the 3.2GB/s maximum bandwidth of the 3.0 x4 bus.
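A quick, illustrative check of that doubling (Python; these are theoretical payload rates only, and real-world ceilings sit lower once packet and protocol overhead are factored in, which is where figures like the 3.2GB/s quoted above come from):

```python
# Theoretical one-way payload bandwidth of PCIe 2.0 x4 vs PCIe 3.0 x4.
LANES = 4

def usable_gb_s(gt_per_s, encoding_efficiency, lanes=LANES):
    """Theoretical one-way payload bandwidth of a PCIe link in GB/s."""
    return gt_per_s * encoding_efficiency * lanes / 8

pcie2 = usable_gb_s(5.0, 8 / 10)      # PCIe 2.0, 8b/10b encoding    -> ~2.0 GB/s
pcie3 = usable_gb_s(8.0, 128 / 130)   # PCIe 3.0, 128b/130b encoding -> ~3.9 GB/s

print(f"PCIe 2.0 x{LANES}: ~{pcie2:.1f}GB/s | PCIe 3.0 x{LANES}: ~{pcie3:.1f}GB/s "
      f"(~{pcie3 / pcie2:.1f}x)")
```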

Boots 'N' All

Surprisingly, given what Samsung is doing with 3D NAND, the SM951 uses old-fashioned 2D planar 10nm-class MLC NAND—in all probability, the same 19nm 64Gb MLC NAND used in the XP941. Also added to the SM951 is support for the PCIe L1.2 power state, which can be looked at as the PCIe equivalent of the DevSleep mode supported by some standard SSDs. Another advantage of the SM951 over the previous model is that it's bootable (with the right motherboard, that is), so make sure you do your homework before buying either drive or motherboard.

To test the drive at its full x4 PCIe potential, we used an ASRock Z97 Extreme6, one of the few Z97 boards that can run these drives at full speed, thanks to its extra M.2 x4 port. Alternatively, there's Intel's X99 chipset, or you could use an adapter card, such as the Lycom DT-120.

It takes some believing, but even the quoted figures for the sequential reads seemed conservative when the SM951 was tested with the ATTO benchmark, the drive producing a score of 2,253MB/s. The write performance, however, was bang on the money at 1,272MB/s. When it comes to handling the small 4K files of everyday use, the SM951 once again shows the advantage of its extra bandwidth, with an AS SSD 4K read speed of 44.59MB/s against the XP941's 29.65MB/s. The biggest improvement, though, comes in the write figures: the XP941 gave up a score of 73.05MB/s, totally eclipsed by the 132.31MB/s produced by the SM951.

These figures add up to make the SM951 an incredible SSD. In fact, the only thing holding us back is the promise of the imminent NVMe version.

$260, www.samsung.com

Benchmarks

Samsung SM951 256GB
Samsung XP941 512GB
ATTO Sequential Read (MB/s)
2,253
1,091
ATTO Sequential Write (MB/s)
1,272
1,003
AS SSD Sequential Read (MB/s)
1,989
1,144
AS SSD Sequential Write (MB/s)
1,184
900
AS SSD 4K Read (MB/s)
44.59
29.65
AS SSD 4K Write (MB/s)
132.31
73.05

Best scores are bolded.

Specifications
Capacity
256GB
NAND Type
Samsung 19nm Toggle MLC
M.2 Format
2280
Quoted Sequential Read Speed
Up to 2,150MB/s
Quoted Sequential Write Speed
Up to 1,200MB/s

Silverstone Tundra TD02-E Review

Posted: 03 Aug 2015 12:41 PM PDT

at a glance

(+) Commodore 64
Solid cooling and acoustics; improved dimensions over the original TD02.

(-) Atari Jaguar
Could perform better for the money.

This article was published in the July 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

This cooling ain't fooling around

Last year, we reviewed Silverstone's original TD02 closed-loop CPU cooler (CLC). It used a 240mm radiator with integrated tubing that carried heat away from the pump-and-heatsink combo sitting on top of the CPU. Enthusiasts like CLCs because they can allow higher CPU overclocking than a conventional air cooler. They also take some weight off the mobo, though modern backplate designs tend to make that a non-issue. But the TD02 wasn't perfect. Its radiator was thick, so it couldn't fit in some cases. The tubes were also a bit undersized, so it could have trouble getting heat to the rad.

We liked the aluminum pump housing of the original, and the white tubing and fan blades. But we were apparently in the minority. When you unbox the TD02-E, you'll see black rubber tubing and fan blades, and no white accents on the rad. Silverstone has gone to the dark side, though there's a shell with a carbon fiber-style pattern on it. Plus, the firm's snowflake logo is still present, glowing blue on top of the pump.

The radiator is 27mm thick instead of the usual 25mm, but that's unlikely to be a problem. It still uses a dense soldered fin array, which creates better contact than non-soldered fins, and the higher density increases the cooler's effective surface area. You need proportionally stronger fans to compensate for the thicker array, but that's definitely not a problem for Silverstone, a company known for designing its own fans for a variety of roles. The TD02-E fans' blades are highly angled for better penetration, whereas regular case fans will have flatter blades, because they don't need to focus their airflow.

Feeling the Heat

Anyone who's installed a CLC before will find no surprises. Silverstone uses a metal backplate, which is a nice touch and adds to the cooler's durability. Users of LGA 2011 mobos get a set of mounting screws, and the cooler drops right in and is secured by four metal fasteners. You can get it all in with a Phillips screwdriver. Silverstone also includes an adapter to connect both fans to one fan header, which seems a no-brainer but sometimes isn't provided.

The TD02-E operates quietly, yet it idles this hex-core Intel Core i7-3960X just 10 C above room temperature. When we put the nightmare-level load on this chip, the temperature rises to 71.6 C, which is higher than we'd like. But the cooler stays nice and quiet despite the CPU being overclocked to about 4.2GHz at this point. Once that test was done, we instructed the "AI Suite" software of our Asus Rampage IV Extreme motherboard to set the fans to "Turbo" mode. Temps dipped to 70 C. That's still a few degrees higher than we'd like, but respectable when you consider how much heat the CPU is generating.

Unfortunately, the CLC market is packed with competitive entries from the likes of Cooler Master and Corsair, which will get your chip cooler for about the same price and with the same noise level. And of course, if your case can fit a 280mm radiator instead, this Silverstone unit pales in comparison. So we can't crown the TD02-E as the new king. In fact, our tests indicated the original performed a bit better, probably thanks to its thicker rad. But if the TD02-E can keep a hex-core CPU at 72 C when clocked to 4.2GHz, it should be more than enough for a chip that uses the smaller LGA 1150/1155 socket.

$100, www.silverstonetek.com

Benchmarks

Tundra TD02-E
Cooler Master Nepton 280L
Tundra TD02
Corsair H100i
Ambient Air
23.4/23.4
22.1/22.4
19.8/19.8
20.3/20.5
Idle Temperature
33.1/33
33.2/30
29.8/29.5
30.7/29.3
Load Temperature
71.6/70
64.5/63.3
65.8/63.0
67.1/61
Load Minus Ambient
48.2/46.6
42.4/40.9
46.0/43.2
46.8/40.5
Street Price
$100
$120
Discontinued
$100

Best scores are bolded. First number is Quiet mode, second is Performance. All temperatures in degrees Celsius. All tests performed with an Intel Core i7-3960X at 4.2GHz, on an Asus Rampage IV Extreme motherboard in a Corsair 900D with stock fans set to Standard.
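The Load Minus Ambient row is simply the load temperature with the room temperature subtracted, which makes coolers tested at different ambients comparable. A minimal sketch of that arithmetic, using the quiet-mode figures from the table (Python, purely illustrative):

```python
# Quiet-mode load and ambient temperatures from the table above (degrees C).
results = {
    "Tundra TD02-E":             {"ambient": 23.4, "load": 71.6},
    "Cooler Master Nepton 280L": {"ambient": 22.1, "load": 64.5},
    "Tundra TD02":               {"ambient": 19.8, "load": 65.8},
    "Corsair H100i":             {"ambient": 20.3, "load": 67.1},
}

for cooler, t in results.items():
    # Subtracting ambient normalizes for the different room temperatures.
    print(f"{cooler}: {t['load'] - t['ambient']:.1f} C over ambient")
```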

Specifications
Radiator Dimensions (H x D x W)
11 x 2.1 x 4.9 inches
Stock Fans
2x 12cm PWM
Socket Support
LGA775/1150/1155/1156/1366/2011/2011-3
AM2/AM2+/AM3/AM3+
FM1/FM2
Additional Fan Support
2x 12cm

Audio-Technica ATH-PDG1 Review

Posted: 03 Aug 2015 12:34 PM PDT

at a glance

(+) Spacious
Open-back design is rare, but the soundstage on offer here makes a good case for a revival.

(-) Fiscally Voracious
We're all for performance over frills, but the construction materials and presentation don't convey $200.

This article was published in the August 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

Gaming cans from the big boys of studio sound

Step into any recording studio and you'll see a raft of Audio-Technica kit. Indeed, the Japanese firm is beyond proving itself in the pro audio sector. But it's a different story in the PC enthusiast market. After an uncertain first outing in the form of its ATH-AG1 gaming headset, A-T's still in away-game territory when it comes to gamers.

That first-gen gaming headset had two major drawbacks: It was unreasonably expensive and it favored a paddle support system instead of the more comfortable headband. Everything else was pretty much on point. As you'd expect from such a prestigious brand, sound quality was precise, refused to distort at high volumes, and featured an impressive stereo spread. That's an important preface to the arrival of these cans: a second generation of gaming audio priced below its predecessor, and featuring a traditional headband design.

The PDG1 has a new sibling in A-T's second-gen gaming headset range, the ATH-PG1, the latter offering a closed-cup design in contrast to this model's "open air" dynamic drivers. What's the difference? There's no clear superior; both have their pros and cons. Open-back headphones let in more external noise and leak more of your audio out into the room, and they're not as good at producing a powerful low-end as closed-back cups. However, they're generally better at creating a wider soundstage, giving the impression that the audio's coming from all around you. Closed-back cans tend to sound like the audio's coming from inside your head.

We generally prefer closed-back headsets. However, the ATH-PDG1 makes a very convincing case for itself. Stereo spread is, indeed, very impressive. We're not talking about digital surround—these cans are getting no help from a soundcard or firmware layer to create cinematic Dolby whizzbangery. It's all about how those 40mm drivers behave organically based on their engineering. As such, they're capable of mighty impressive clarity and spread, easily equal to the first-gen ATH-AG1 in all aspects other than low-end oomph.

Thrills, Not Frills

They're much more comfortable, too. At 225g, these are legit featherweight headphones that bear no disconcerting squeaks or creaks when adjusted to fit. You can even flip each earcup to face outward, DJ style, which proves useless in a gaming setting, but serves as a reminder of A-T's prevalence beyond the gaming sector.

We wouldn't call the cushioning on the headband generous, but it doesn't need to be when the unit's so light. The only potential for discomfort is the short distance between the padded contact point and the speaker surface of each earcup, which can leave your ear pressing against the hard, flat speaker surface a little. We didn't find this enough of an issue to force us to take a breather, though.

Lessons have clearly been learned in the field of ergonomics, then. But pricing remains slightly problematic. These new cans are positioned at $200. And for that price, this is something of a no-frills package. No carry case, no swappable contact pads, no braided cable. The inline remote's distinctly economy class, offering a slightly apologetic mic mute on/off switch and volume scroller. There is a smartphone cable, but that's literally all she wrote when it comes to added extras.

That creates a dilemma. The ATH-PDG1 does do the fundamentals exceptionally well. Even the mic's worthy of mention for its adjustability and clarity. We've berated plenty of pricey headsets for their inclusion of meaningless distractions like customizable LEDs, so it's great to see a product focusing on doing the important things well. However, it doesn't do those things demonstrably better than other models—step forward Kingston's HyperX Cloud and Cloud II. Nevertheless, it's an encouraging step for A-T, placing its gaming products among the ranks of the best, just not quite at the apex.

$200, audio-technica.com

Specifications
Driver Size
40mm
Frequency Response
20Hz–20kHz
Weight
225g
Cable Length
3.2m
Connection Type
Single mini stereo jack plug (3.5mm)
Mic
Omnidirectional condenser

Raijintek Triton AIO Review

Posted: 03 Aug 2015 12:19 PM PDT

at a glance

(+) Titan
Good value; funky coolant dyes; straightforward to install; expandability.

(-) Rain Check
Hoses kink quite easily; installation instructions could be better for less-experienced users; non-PWM fans.

This article was published in the August 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

Hard to pronounce, easy to install

Another day, another company you may not have heard of before. Only established in 2013, Raijintek is yet another marriage between German designers and Taiwanese manufacturers, producing a range of fans, cases, and coolers. One of its latest products is its first liquid cooler, the Triton All-In-One.

The first thing you'll notice, releasing the Triton from its box, is the quality of the components. They're not quite up to the same standard as other AIOs we've looked at lately, such as Corsair's Hydro H110i. But a glance at the Triton's price tag might explain why—it's significantly cheaper.

The pump unit is a beast, quite a bit taller than the one on the Corsair Hydro H110i, for example, and runs almost silently. The unit is made from clear acrylic with a chrome-plated copper base, and uses a graphite pipe and ceramic axis bearing. It runs at 3,000rpm and can shift 120l/hr. For those who like things to be lit up, the pump unit is illuminated by a pair of white LEDs. As part of the extras, you get three dyes—red, green, and blue—to color the coolant, if transparent liquid just isn't your thing.

Getting Kinky

After dealing with some impressive manuals that came with the last few liquid coolers we've reviewed, it was a shock to see the basic fitting instructions Raijintek supplies. But truth be told, the design has been so well thought-out that more detailed instructions simply aren't needed. Having said that, first-timers might find a bit more hand-holding reassuring.

One thing to take care of is the hoses. These don't have anti-kink coils fitted, so care must be taken not to flatten them. What they do have, however, is standard compression fittings, meaning the cooling loop can be customized with off-the-shelf components. So, although it's marketed as an All-In-One, it can be expanded, by adding in a GPU water block, for example.

The Triton comes with a pair of 120mm fans (rated between 1,000 and 2,600rpm), but surprisingly these aren't PWM fans. Instead, they're controlled by a rheostat, which means having to get your hands inside the case to turn the fans up or down as required. It's a pain. If you want to use the Triton for an overclocked system that doesn't scream the house down, and have a preference for cooling fans you can monitor, it's worth tracking down the cheaper Core Edition version, which comes without fans.

At the Core i7-4770K's stock speeds, the Triton performs remarkably well, keeping the idle processor at 21°C with hardly a sound. Without any PWM fans, it's up to you to adjust the rheostat to get the right balance of performance and noise, which is a job you could do without. To test at standard clock speeds, we adjusted the rheostat to around the middle position to get the best balance. At stock speeds with a 100 percent load, the cooler still performs pretty well at 62°C, though it does trail behind the much more expensive Fractal Design Kelvin S24.

For overclocking performance, we took no chances. Turning the fans up to full speed and at full load, the Triton once again fell behind the Kelvin S24, but still kept the processor at under 90°C. However, with two 2,600rpm fans running full tilt, the only way to describe it is LOUD. The Triton performs much better than its price tag implies. Even so, by not having PWM fans, Raijintek isn't doing itself any favors.

$100, www.raijintek.com

Benchmarks

Raijintek Triton
Fractal Design Kelvin S24
Reference CPU 100% Load (°C)
62
49
Reference Peak-to-Idle (sec)
197
166
Overclocked CPU 100% Load (°C)
85
76
Overclocked Peak-to-Idle (sec)
209
182

Best scores are bolded.

Specifications
Socket Support
Intel and AMD
Water Block
Nickel-plated copper cold plate
Fans
2x 120mm (1,000 - 2,600rpm)
Radiator Dimensions (W x H x D)
275 x 120 x 32mm
Warranty
Two years

Turtle Beach Grip 500 Review

Posted: 03 Aug 2015 12:13 PM PDT

at a glance

(+) Gripper
Lovely soft finish; decent buttons; customizable.

(-) Kipper
Can be slippery; lots of competition at this price point.

This article was published in the August 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

Like a kitten in your hand, but no pussycat when you get gaming

Turtle Beach is really onto something with the soft coating it's dipped the Grip 500 in. Its molecules spun and woven to feel like the firmest marshmallow beneath the hand, the mouse kisses your fingertips like a nymph newly spawned from a pool of inky black water, begging for protection from a pursuing satyr.

Other things we'd like to see dipped in this stuff include the door handles on lingerie shops, those massive KFC containers you should never try and finish on your own, and most glorious of all, the insides of our socks, so we could wiggle our toes against it all day and get a little shiver of excitement each time.

Back to the mouse. You get seven buttons, all programmable with commands or macro sequences, but the layout isn't really suitable for left-handers. You could probably get away with using it in the left hand at a pinch, but the position of the buttons that fall so nicely under the right thumb would lead to some awkward ring-finger gymnastics, were you to try it for long gaming sessions. You can click the main buttons all the way back at the mouse's mid-point, useful should you have particularly short fingers.

The body of the mouse is broad, sitting high in the palm for those who prefer it that way, while a tapered shape accommodates claw grippers. It fits perfectly in our giant-sized hands, but could be too big if you're somewhat more lightly built. Button layout is nicely thought-out, as long as you're prepared to use your thumb, with a good amount of travel once you've pressed down. It never feels like you're going to push the switch all the way through the body of the mouse, which is comforting.

The main switches are Omrons, the ubiquitous choice of the gaming mouse, and the infrared garnet eye of an Avago laser sensor gleams invisibly underneath. There are the now-traditional grilles either side of the cable to make the mouse look like a sports car, and the lighting can be adjusted with an app, which also handles the button programming. Five color-coded profiles can be copied to the mouse's onboard memory, ready to be switched between on the fly, if you can remember which is which and can be bothered to set them up.

Too Soft?

The wheel, where so many mice fall down, is nicely notched and sports a grippy tyre. It feels very light, however, putting up just barely enough resistance to being spun. We'd have liked to see a bit more presence and weight to it, so we could be certain we'd rotated it rather than just brushed against it. It can be assigned different functions using the PC app, which is nice to see, though we lack the imagination to think of too many situations in which it would actually be useful.

The soft-touch coating could, sadly, prove to be the Grip 500's major drawback. If you regularly game in a sauna, or perhaps one of the hotter parts of the country, you may build up a tiny bit of perspiration. Should this happen, the mouse is going to get quite slippery, the problem being that there are no ridges to hold your fingers on the buttons or grippy textures on the body. This is especially true on the left of the body where your thumb rests. Come down from the buttons too hard and you risk sliding off and thumbplanting the desk underneath, possibly causing a nasty bruise.

There's also the price. For about $15 less, you could get yourself a Razer DeathAdder or Logitech G502, either of which is more satisfying to use than this otherwise commendable, comfortable effort from Turtle Beach.

$70, www.turtlebeach.com

Specifications
Sensor
Laser
Max Sensitivity
9,800dpi
Programmable Buttons
Seven
Cord Length
2m
Connector
USB

Turtle Beach Impact 500 Review

Posted: 03 Aug 2015 12:11 PM PDT

at a glance

(+) Impact
Great build quality; lovely to use; nice clicky action.

(-) Detract
Wobbly on its feet; bit basic; a touch expensive.

This article was published in the August 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

The Tyrion Lannister of the mechanical keyboard world. Only without the wenches. Or wine. Or wit

Turtle Beach is perhaps better known for the gaming headsets it's been making for many years, but it's increasing its holdings in the peripherals market with a range of keyboards and associated pointing devices. They've all got something in common: They're as black as a panther at midnight and are sturdily built, with the kinds of features you'd expect to see from more established desktop manufacturers, such as Logitech.

The Impact 500 is a mechanical keyboard with the number pad amputated, a kind of keyboard carbine for the sort of close-quarters typing that means waiting until you see the whites of your enemies' eyes before you unleash a barrage of trash talk. There's even a removable mini-USB cable (that's nicely braided), so you can make a swift escape from a player more skilled at insults than you. Lol.

The soft coating we waxed lyrical about on the Grip 500 mouse is present, and makes a bit more sense on a keyboard. It's still gorgeous, but the keys here are smooth-topped rather than rubber-coated, and feature the kind of contouring that the mouse was so sorely lacking. Even the sweatiest finger will have a hard time falling off, and there are deep gullies between the keys to channel your pouring bodily fluids away.

Cherry Blue switches lurk beneath the keys, with a click that activates quite high up in the button's travel. It is, as we've come to expect, a positive action—there's no doubt whether you've pressed a key or not—and there's plenty of travel once you've pressed it. Whether you allow the key to bottom out or snap your finger straight back up again, it'll still register. There's six-key rollover with anti-ghosting (you can press six at once and it can distinguish them) for those finger-twisting combos.

Mechanical Minimalist

The steel-reinforced frame means it's a sturdy unit, its compact dimensions perhaps adding even more stiffness than a wider chassis would. There's little wobble in the keys themselves, just the tiniest movement if you rattle one about, and there are some decent feet under the keyboard, too, meaning it sticks, gecko-like, to your desk as you thrash from side to side. Extend the little feet that raise the back by about half an inch, however, and a disturbing degree of slippage creeps in. The feet are capped with hard plastic rather than grippy rubber, an oversight on the manufacturer's part, as the front rubber pads aren't enough on their own.

What you don't get are any programmable macro keys (boo) or Las Vegas-like lighting systems (yay), which means there's no application or special drivers to install before you can use it. It also keeps the desktop footprint small. Perhaps they could have gone further—Print Screen has its uses, but when did you last use Scroll Lock or Pause Break? An FN button next to the right Alt key gives access to media controls found on the central F keys, and an indicator lamp on the F9 key lets you know when the Windows key lock is engaged, but that's about as complicated as it gets.

Unless you really need a number pad, this is a great mechanical keyboard. It may lack frills, but its build quality is excellent, it does the fundamentals well, and is a joy to use in a keyboard-intensive game. It's only $20 less than Corsair's K65 RGB though. How much do you like the colored lights?

$130, www.turtlebeach.com

Specifications
Switch Type
Cherry MX Blue
Cord
Detachable
Connector
USB
Anti-ghosting
Yes
Backlit
No
Keys
88
Rollover
Six keys

Thrustmaster Leather 28 GT Wheel Add-On Review

Posted: 03 Aug 2015 12:09 PM PDT

at a glance

(+) Wheely Hot
Top quality materials and build; plenty of controls; multi-system compatibility.

(-) Wheely Not
Only worth it if you've got a leather fetish; quick release system is anything but.

This article was published in the August 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

Hand-stitched leather hasn't looked this good since Desperate Housewives

Attracting a member of the opposite sex into your bedroom can be a tricky business, so the last thing you need is having certain unappealing items left lying around to potentially dampen the mood. Posters of scantily clad celebrities don't tend to go down well, while the sight of a fake steering wheel clamped in front of your monitor could be seduction suicide.

Fear not though, as this wheel may as well be an aphrodisiac, exuding quality from its hand-stitched leather and 2mm-thick brushed-metal center panel. Six buttons and a D-pad provide plenty of control, plus there's a manettino dial for a taster of what life would be like behind the wheel of a Ferrari 458. Thrustmaster's dial is more show than go though, with an air of a budget 1990s stereo, rather than Maranello's finest.

Otherwise, you'll be hard-pressed to find any quality flaws. Despite weighing just 1kg, the wheel is fitted with substantial 13cm-long metal paddle shifters, so there's no excuse for missing a gear shift when you need to unleash your inner Dominic Toretto for a fast getaway. The buttons are rated to withstand over 10 million hits, and the 28cm-diameter wheel gets an inner steel hoop for strength and better force feedback transmission.

Trouble is, you can't just fling this wheel about in mid-air and expect things to happen. You'll need a T300, T500, or TX-series racing system to use it, and since these already include a wheel, you're going to end up with a pair of wheels for a single force feedback unit and pedal set. That's bad feng shui, and your bank balance isn't going to be happy either, with compatible setups starting at around $300.

At least you'll be able to attach and remove your leather-bound beauty in no time, courtesy of Thrustmaster's Quick Release system, right? Well, if this is the designer's idea of a quick release, their partner must have a serious problem with RSI. Rather than a simple locking clip mechanism, such as you might find attaching an F1 car's steering wheel, here you'll need to tighten a large plastic nut instead. No big deal, if only the nut didn't nestle in the annoyingly tight gap between the back of the wheel and the motor unit.

Hell for Leather

But once everything is ready and you can put pedal to metal (well, plastic), it all comes good. Whichever racing system you mount the Leather 28 GT to, TM's excellent force feedback motor brings it to life. There's a colossal 1,080-degree max rotation, plus dual belt drive for near-silent force feedback, and contactless magnetic position sensing for max rotation accuracy and increased longevity. The result is superbly strong and communicative feedback, with none of the usual gear grinding that lesser wheels can produce.

Such advanced internals do need to be kept cool by means of a fan, and unfortunately this is far from silent. It's certainly not going to overshadow your racing, but it's loud enough to partly undo all the effort that went into making the motor drive so smooth and quiet.

Of course, the TM Leather 28 GT isn't to blame there, but it's not off the hook. This is undoubtedly a great wheel, but it's also essentially the same as what's included with Thrustmaster's T300 Ferrari GTE setup, only you get to grip leather rather than rubber. If you could spec your own motor/wheel/pedal combo, then it'd make sense, but splashing this much cash to get a second wheel purely for a different texture hardly seems like the best route to driving nirvana.

$150, www.thrustmaster.com

Specifications
Diameter: 28cm
Weight: 1kg
Requires: T300 RS/T300 Ferrari GTE; T500 RS/Ferrari F1 Wheel Integral T500; TX Racing Wheel Ferrari 458 Italia Edition

Kingston HyperX Savage 240GB Review

Posted: 03 Aug 2015 12:03 PM PDT

at a glance

(+) Sophisticated
Speedy Phison controller; good value; impressive real-world performance.

(-) Savage
Stupid name; limiting interface.

This article was published in the September 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

Performance soothes the savage beast

Tech naming conventions are becoming a parody of themselves. Here, we've got a solid-state drive that Kingston wants to describe as "savage." As well as "hyper." With an extraneous "X." We've seen more savage things at retiree quilting clubs.

But, compared to some of the SSDs we've seen from Kingston, we'll accept that this latest drive has some call to use the HyperX label. It's impressively speedy, and not just when compared to some of its more pedestrian stablemates. Put up against Samsung's 850 EVO, resplendent with its 3D memory and in-house MGX memory controller, this Kingston effort manages to keep pace, and, in some instances, actually outperform it.

We've got to say (while refusing to acknowledge its barbaric subtitle anymore), the thing that really gives this latest HyperX the performance that Kingston has been craving so badly is that eight-channel Phison PS3110-S10 memory controller. The last HyperX SSD we tested was still using the aging SandForce controller, effectively making it a real-world irrelevance, especially when the likes of Crucial and its half-terabyte MX100 drives were making fools of just about the entire industry.

It's not the first Kingston drive to pack a Phison controller. The SSDNow 310 was rocking an older version, one that definitely wasn't going to set the storage world alight, but in the relatively affordable, high-capacity end of the solid-state storage market, it kinda made sense. Speed in that sector is less important than capacious data stores.

Fly, You Fool

For the HyperX brand, though, pace is most definitely paramount. And here, the current Phison controller, coupled with the 19nm Toshiba MLC NAND, delivers performance that almost puts it ahead of even the 512GB Samsung 850 Pro. Compared with the similarly priced 250GB 850 EVO, this Kingston takes the plaudits in terms of the peak and average sequential benchmarks. It's impressive stuff.

It's a little behind on the random 4K read/write speeds, but not by much. It's certainly up there with the main contenders in the SATA 6Gb/s world. But yes, we're still talking SATA. NVMe drives remain rarer than a non-cynical tech journalist, and both the Samsung 850 drives and this latest Kingston are bumping their heads against the limits of the interface. Much as SSDs have been doing since the SATA 6Gb/s interface met NAND flash.
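To put some rough numbers on that ceiling: a SATA 6Gb/s link with the standard 8b/10b line encoding tops out at roughly 600MB/s of payload before protocol overhead, which is why every modern SATA SSD bunches up in the mid-500s. A minimal sketch of the arithmetic (the encoding figure is standard SATA 3.0 behavior, not something pulled from this review):

# Rough ceiling for a SATA 6Gb/s link. SATA 3.0 uses 8b/10b encoding,
# so only 8 of every 10 bits on the wire carry data, and protocol
# overhead eats a little more on top of this.
LINK_RATE_BPS = 6_000_000_000   # 6Gb/s line rate
ENCODING_EFFICIENCY = 8 / 10    # 8b/10b line code

ceiling_mb_s = LINK_RATE_BPS * ENCODING_EFFICIENCY / 8 / 1_000_000
print(f"Theoretical SATA 6Gb/s ceiling: ~{ceiling_mb_s:.0f} MB/s")
# ~600 MB/s, which is why this drive and the 850 EVO both top out
# in the mid-500s for sequential transfers.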

Where the Kingston drive does show a bit more of a tangible improvement over the competing Samsung drive is in the real-world file compression and data transfer tests. Indeed, it's almost 20 seconds quicker on our hefty 30GB Steam folder transfer than the 3D memory-laden 250GB Samsung EVO.

But it's still slower than the larger 500GB Samsung, with its greater parallel performance, though we don't know whether the eight-channel Phison controller will deliver the standard speed hike that now comes with higher capacity. We'll need to get our hands on the big brother of this drive to figure that out.

What we can say is that Kingston's latest HyperX SSD does finally offer genuinely competitive storage performance, and in a world hobbled by the SATA interface, that's pretty much all you can ask for. A drop in price would be nice, but considering this drive's just 54 cents per GB, it's not exactly terrible value.

But would we recommend the Kingston over the Samsung? Probably not. Performance is there in spades, but the consistency tests show it's a little more prone to performance spikes than the 850 EVO when really being pushed. And then there's the warranty. There's still a lingering longevity fear with solid state, and the Samsung's five-year warranty, as opposed to the Kingston's three years, definitely inspires confidence. But it sure is close.

$130, Kingston

Benchmarks
Kingston HyperX Savage 240GB vs. Samsung 850 EVO 250GB
Peak Sequential Read/Write (MB/s): 558/537 vs. 553/532
Average Sequential Read/Write (MB/s): 500/481 vs. 495/485
Random 4K Read/Write (MB/s): 37/95 vs. 39/107
5GB File Compression (secs): 76 vs. 82
30GB Folder Transfer (secs): 218 vs. 236

Best scores are bolded. Tests were carried out on an Asus ROG Z97 Hero motherboard and a stock-clocked Core i7-4790K, with 8GB DDR3 at 1,600MHz.
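Those folder-transfer times also convert easily into effective throughput, which makes the gap a little more tangible. A quick sketch, assuming the 30GB test folder is roughly 30,000MB (the exact size isn't stated here):

# Effective throughput on the 30GB Steam folder transfer.
# Assumes "30GB" is roughly 30,000MB; the real folder size may differ.
FOLDER_MB = 30_000
times_secs = {"HyperX Savage 240GB": 218, "850 EVO 250GB": 236}

for drive, secs in times_secs.items():
    print(f"{drive}: ~{FOLDER_MB / secs:.0f} MB/s effective")
# Roughly 138 MB/s vs. 127 MB/s; real-world copies involve far more than
# straight sequential reads, hence the distance from the SATA ceiling.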

Specifications
Memory Controller: Phison PS3110-S10
Memory: Toshiba 19nm MLC
Capacity: 240GB
Form Factor: 2.5-inch
Connection: SATA 6Gb/s
Warranty: Three years

Acer V17 Nitro Black Edition Review

Posted: 03 Aug 2015 11:58 AM PDT

This article was published in the September 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

at a glance

(+) V8 Engines
Nice IPS display; good keyboard; runs quiet.

(-) V8 Juice
Underpowered for 17.3-inch; poor trackpad; coil whine.

This mainstream gaming notebook could use a shot of N2O

A name like V17 Nitro Black Edition inspires images of fast cars and powerful engines, and Acer, not normally a company associated with high-performance systems, seems to be trying to change its image. And indeed, on the surface, the V17 packs some decent hardware, along with a gimmicky 3D RealSense camera. But while some parts are good choices, others leave us wanting.

The V Nitro series is available in two sizes, the 15.6-inch V15 and the 17.3-inch V17 we're reviewing. And that's where we first encounter some anomalies. Despite having a larger chassis, the V17 is only available with a 1080p display, whereas the smaller V15 can be had with 1080p or 4K IPS displays. The battery is also a fairly small three-cell, 52Wh unit, which hurts battery life. The core hardware is otherwise the same, so the only reason to go with the larger laptop is if you prefer lower-DPI displays. There is some good news—for gaming purposes, 1080p is a much better fit for the graphics card. But that brings us to the next concern: Acer's choice of Nvidia's GTX 960M.

There's a big gap in performance and features between the GTX 960M and 965M. The 965M uses a trimmed-down GM204 GPU, while the 960M uses Nvidia's first generation of Maxwell hardware, GM107. That means certain DirectX feature level 12.1 options are missing, along with VXGI (Voxel Global Illumination). The bigger concern is that the 965M offers 30–40 percent more performance compared to the 960M. In many games, that will be the difference between running high versus medium quality at 1080p.

Even though we think the GTX 965M would have been a better choice, the V17 still has decent performance; it surpassed our zero-point Alienware 14 in every test with the exception of battery life. Three hours of video playback is quite short, even for a mainstream gaming laptop, and if you happen to fire up a game, the V17 will only last about 45 minutes. Turning on Battery Boost can help, at the cost of frame rates, but getting more than an hour of unplugged gaming will prove difficult.

Middle of the Road

What really separates great laptops from the merely average isn't just performance-oriented insides, it's the entire experience, and this is where the V17 falls short. The keyboard is quite good, we've no complaints there, but the clickpad is another matter. It's unresponsive and lacks precision, and while any gaming on this laptop will almost certainly be done with an external mouse, there's no reason the integrated trackpad shouldn't work well. It's 2015—getting the trackpad right should be a simple matter.

That's not the only flaw we encountered. There's a phenomenon known as "coil whine" that typically occurs when a small capacitor or coil begins to vibrate and emit a high-pitched noise. We've seen this on GPUs, motherboards, and network adapters, but in the case of the Acer V17, our test laptop exhibits the problem when the GTX 960M fires up. Do anything without engaging the discrete GPU and it's fine, but as soon as a game starts running, we hear a high-pitched noise. This problem is usually unit-specific, so while this particular V17 has coil whine, another V17 may not, but it does speak to issues with quality control.

The V17 is decent overall, and the size and modest performance allows it to run without making a lot of noise. The price is attractive, especially now you can find it for $200 under the MSRP, but there are many competitors with higher-quality materials that simply look better, and they don't fail the touchpad test. If you like large displays and are after a mainstream gaming notebook, you could do far worse for the price.

$1,499, Acer

Specifications
CPU: Intel Core i7-4720HQ
RAM: 2x 8GB DDR3L-1600
GPU: GeForce GTX 960M 4GB
Display: 17.3-inch, 1920x1080 IPS
Storage: 256GB SSD, 1TB HDD, DVD-RW
Connectivity: HDMI, Ethernet, SD reader, 2x USB 3.0, 2x USB 2.0, 802.11ac, Bluetooth 4.0
Dimensions: 16.7 x 11.5 x 1 inches
Weight (Lap / Carry): 6.61 / 7.89 lb
Benchmarks
Zero-Point vs. Acer V17 Nitro Black Edition
Stitch.Efx 2.0 (sec): 962 vs. 913 (5%)
ProShow Producer 5 (sec): 1,629 vs. 1,537 (6%)
x264 HD 5.0 2nd pass (fps): 13.5 vs. 14.36 (6%)
BioShock Infinite 1080p (fps): 36 vs. 57 (58%)
Metro: Last Light 1080p (fps): 30 vs. 50.8 (67%)
3DMark 11 Performance: 4,170 vs. 5,664 (36%)
Battery Life (1080p video, mins): 234 vs. 172 (-26%)

Our zero-point notebook is an Alienware 14 with a 2.4GHz Intel Core i7-4700MQ, 16GB DDR3-1600, 256GB mSATA SSD, 750GB 5,400rpm HDD, GeForce GTX 765M, and Windows 7 Home Premium 64-bit. BioShock Infinite tested at 1920x1080 at Ultra DX11 settings; Metro: Last Light tested at 1920x1080 at DX11 medium quality settings with PhysX disabled.
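For anyone puzzling over the percentages, they simply compare each result against the zero-point, with the comparison inverted for tests where a lower number is better. A minimal sketch of that arithmetic (the rounding here is ours, so a figure may land a point or two off the table):

# Percent deltas versus the zero-point notebook. Higher-is-better scores
# are divided by the zero-point; lower-is-better times are inverted first.
def delta_vs_zero_point(zero_point, result, lower_is_better=False):
    ratio = zero_point / result if lower_is_better else result / zero_point
    return round((ratio - 1) * 100)

print(delta_vs_zero_point(962, 913, lower_is_better=True))  # ~5   (Stitch.Efx 2.0)
print(delta_vs_zero_point(36, 57))                          # ~58  (BioShock Infinite)
print(delta_vs_zero_point(234, 172))                        # ~-26 (battery life)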

MSI Cubi Review

Posted: 03 Aug 2015 10:23 AM PDT

This article was published in the September 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

at a glance

(+) Cube
Huge amount of expandability; relatively cheap alternative to Intel's NUC; looks gorgeous.

(-) Noob
Can be a bit challenging to get to work on certain monitors; SATA ribbon port feels flimsy.

Fantastically small home theater PC in a box

If the MSI Wind Box was really just a tub of hot air, then the Cubi is a hurricane. A very good-looking hurricane. Sleek and small, with a clean design, our review sample is a treasure trove of golden hardware. It's everything the Wind Box should've been.

Around 1mm taller than Intel's latest NUC, the Cubi's slightly sloped, edgy physique gives it a much more arresting and elegant look than its rival. It's almost a shame that it's VESA-mountable, meaning most will be slapped to the back of a TV, as the pristine, glossy-white chassis really does look quite at home on our desk.

Clearly, because of the size of this thing, it's not a powerhouse when it comes to performance. With a Cinebench score of 199 (almost double what its gusty cousin can produce), it loses out to the Intel NUC5i3RYK we tested in our June issue, but only by 20 points. However, it makes up for this by running as silently as the grave, idling at around 50 degrees Celsius and peaking at 70. Even under load, it's almost impossible to hear any noise from the Cubi without physically pressing your ear against the little box of joy.

The storage situation is also far better than its little brother's. Our review sample came with 120GB of internal flash storage by default (107GB left over after a quick install of Windows 10). But with access to an additional 2.5-inch drive, via an mSATA adapter, the Cubi can accommodate up to 1TB of additional SSD space, if you require it. This does increase the height of the Cubi by 12mm or so, but all in all, it's completely worth it for that expandability.

Well-Connected

Speaking of expandability, it also includes two SoDIMM slots, allowing up to a total of 16GB of RAM. Couple that with a Broadwell Core i5 and you have yourself one seriously boast-worthy work or home theater PC, one that you can fit into the palm of your hand, no less, and one that runs on an impressive 65W power supply.

I/O is handled by a positive menagerie of connectivity, with the back featuring two USB 3.0 ports, Ethernet, Mini DisplayPort, HDMI, and power. The front also houses a four-pole input for headphone and microphone combos, plus an additional two USB 3.0 ports. It even comes with Wireless AC and Bluetooth 4.0, a must in today's age of high-speed gaming and streaming.

And did we mention this thing streams 4K? Because yep, although we guess that's not as impressive as we'd all like to think nowadays, there are still a lot of people out there using laptops that stutter playing 720p videos on YouTube. It's an impressive PC rolling in at $280 for the barebones Core i3 model, or $350 for the Core i5 version.

The MSI Cubi comes in at a very attractive price point and performs admirably in competition against the Intel NUC and its own little brother, the Wind Box. For price to performance, you're not going to get any better any time soon.

All up, MSI seems well on the way to cracking the algorithm that will hopefully deliver the perfect, all-round Steam, office, and home-streaming box. It's not 100 percent there yet, but it certainly looks promising for the future of micro-PCs.

$280, MSI

Benchmarks
MSI Cubi vs. Gigabyte Brix S
Cinebench R15 (index): 199 vs. 252
PCMark 8 (index): 1,781 vs. 2,049
Idle Temperature (°C): 49 vs. 53
Load Temperature (°C): 70 vs. 69
Idle Power Draw (W): 9 vs. 10
Load Power Draw (W): 20 vs. 24
1080p Video Power Draw (W): 18 vs. 19

Best scores are bolded.

Specifications
SKU: Cubi
CPU: Intel Core i3-5005U 2GHz
Memory: Up to 16GB DDR3L 1,600MHz
Storage: 1x 2.5-inch, 1x mSATA
Network: 802.11ac Wi-Fi card, Gigabit LAN
Video: 1x HDMI, 1x Mini DisplayPort
I/O: 4x USB 3.0, 1x 3.5mm headphone/mic
Dimensions (D x W x H): 115 x 111 x 35mm

Phanteks Enthoo Evolv ATX Review

Posted: 03 Aug 2015 09:43 AM PDT

at a glance

(+) Titanium
Crisp; good-looking; stunning build quality; superb water-cooling support; feature-packed.


(-) It's Gray
Perforated PSU floor; front I/O could be better.

This article was published in the September 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

A feature-packed modder's dream

So, why is this Evolv ATX so much better than the ITX version, or the original Phanteks Evolv, released way back in September 2014? Quite simply, it's the build quality. This isn't a cheap case. Indeed, many might wince at paying $130 for a chassis. But then you unpack it, take a look at what's included, and see the stunning build quality that's been achieved. You quickly begin to appreciate what Phanteks has managed at such a low (yes, low) price point.

The case itself is made up entirely of solid 3mm-thick aluminum panels situated on top of an all-steel chassis. That gives it a hefty, solid dependability, and it also makes a lovely "ding" noise when you flick it. The sound, admittedly, isn't a vital case-buying prerequisite, but it's nice to hear, all the same. There's little-to-no flex in any of the panels (including the windowed side panel), all of which have sound-dampening foam on the joins, preventing any excess noise from vibration.

The case is incredibly modular. Indeed, it's a modder's dream. Supporting up to seven 3.5-inch drives (five with the included brackets) and four 2.5-inch drives (two included), it's neatly partitioned into two separate compartments for your power supply and motherboard. There's also plenty of room in the back for cable management, plus it includes some of Phanteks's Velcro cable-tidy straps and a multitude of tie-down points.

The chassis also has a vast array of features, including a sliding radiator bracket in the top, which can easily be removed for quick installation of all-in-one coolers or radiators; hinged and removable side-panel doors; plus support for a refrigerated truckload of water-cooling components. And the goodies keep on coming—Phanteks also provides several places to mount a pump for water cooling, thanks to the included pump bracket.

The case itself doesn't feature a 5.25-inch bay for an optical drive, though that's something we don't really see as a necessity anymore. Besides, in a world of streaming and USB solutions, it's not that expensive to simply go out and buy an external drive for the rare times that you actually find yourself needing to burn something onto a disc.

Premium Passion

As far as negatives go, the front I/O consists of two USB 3.0 ports and a headphone and microphone jack. It would've been nice to see an additional two USB ports here, especially for an ATX case of this size. Additionally, while the perforated floor separating the power supply from the motherboard allows for ample airflow, it would've looked a lot cleaner aesthetically if the panel were solid.

All in all though, this chassis is definitely worth buying for those looking to invest in a premium case with plenty of room to maneuver. It's incredibly clean and has an air of professionalism about it, with a kind of poised flair. Think Jaguar F-Type. Sophisticated, yet elegant. Put simply, we love this case.

$130, Phanteks

Specifications
Form Factor: E-ATX (up to 264mm), ATX, microATX, Mini-ITX
Dimensions & Weight: 235 x 495 x 510mm; 10.2kg
Cooling: Front: 3x 120mm/2x 140mm (2x 140mm included); Top: 3x 120mm/2x 140mm; Rear: 120/140mm (1x 140mm included)
CPU Cooler Clearance: 194mm
Graphics Card Max Length: 420mm (no HDD brackets); 300mm (with HDD brackets)
Storage Support: 4x 2.5-inch SSD mounts; 7x 3.5-inch HDD mounts

Samsung S34E790C Review

Posted: 03 Aug 2015 09:37 AM PDT

at a glance

(+) Bending the Rules
Spectacular wrap-around in-game visuals; VA panel packs plenty of contrast and black levels.


(-) Just Bent
Curved geometry looks weird in Windows; poorly calibrated colors; expensive against competition.

This article was published in the September 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

Curved but not completely crazy

This is nonsense. This, we'd argue, would be any sensible person's default initial reaction to the new Samsung 34-inch übertron with a curved LCD panel. Like stereoscopic 3D displays or 200Hz TV sets, the technology of curved LCD screens is real enough. But surely it only exists as a marketing tool, not because it enhances the viewing experience?

Curved screens certainly qualify for gimmick status in the HDTV market. After all, the only way a curved screen can possibly work is when you're positioned in the perfect place. In other words, you need to be sitting exactly in the sweet spot of the circle of which the screen curve forms a segment, and that is rarely the case with a TV. You're probably too far away and sitting at least slightly off-center, especially if more than one person is watching.

For a PC monitor, it's perhaps not such a crazy idea. Using a PC is much more likely to be a solitary affair and you'll indeed be sitting dead center in front of the screen. So, it's just a question of tweaking the distance to the screen and its height to hit the right spot.

More Than a Gimmick?

The problem is, even perfectly positioned, the Windows desktop still looks very weird with a curved screen. No matter where you sit, the geometry of the Taskbar looks badly bent. Actually, it had us eyeing the bottom and top bezels like a master craftsman, trying to work out the shape of the panel. It is indeed perfectly straight along each side, which we knew. But the visual weirdness has you doubting what you instinctively know to be true.

Get in-game, however, and things suddenly look a lot more clever. Without screen elements like the Taskbar constantly reminding you of the shortcomings of a curved panel, you can simply enjoy the subtle wrap-around feeling. Admittedly, it's a little difficult to pick apart the curved aspect of this panel from its overall proportions and dramatic widescreen format. With 3440x1440 pixels and a 21:9 aspect ratio, this screen would be spectacular even without the curve.

However, we've seen 34-inch 3440x1440 monitors with flat, rather than curved, panels before. They're stunning things, but there's definitely a sense that the far edges of the screen are a long way off and viewed at a conspicuously oblique angle.

Overall, then, the benefit of a curved panel is tangible, if subtle, in games. If it's something you appreciate, the next question is whether this Samsung effort is a good example of the breed. The stand is adjustable for both height and tilt, which is particularly important for a curved screen where your viewing position is so critical.

Then there's the image quality. At default settings, the color temperature is very cold, blue, and subdued. Weirdly, we had to enable the "game" mode in the menu to get colors that looked more vibrant, but it still didn't look terribly natural.

A quick perusal of the spec sheet reveals the probable explanation. The S34E790C has a VA rather than IPS or TN panel. Except at the very high end, VA panels tend to have slightly wonky color balance. And so it is here. That said, VA also delivers fantastic contrast and black levels.

Factor in the punitive pricing and this Samsung is a marginal proposition. There are 40-inch 4K monitors that can be had for less, for instance. More to the point, so can 34-inch curved monitors with IPS panels, such as Dell's UltraSharp U3415W. Even at the same price, the Dell is probably where our money would go. For less money, it's a no-brainer.

$1,190, Samsung

Specifications
Panel Size: 34-inch curved
Panel Type: VA
Native Resolution: 3440x1440
Refresh Rate: 60Hz
Inputs: DisplayPort, 2x HDMI
Stand: Tilt, height (no VESA)

Retail Skylake-S Processors Appear Online Ahead of Launch

Posted: 03 Aug 2015 08:42 AM PDT

Fancy new box art

Unless there are any last minute changes, Intel will formally introduce Skylake-S (desktop variant of Skylake) on August 5 at Gamescom. The launch will presumably include the immediate availability of Core i7-6700K and Core i5-6600K CPUs, though at least one vendor has jumped the gun.

Both the 6700K and 6600K have been listed online by the same seller on Amazon's Australian storefront and on eBay Australia, WCCFTech reports. The listings include shots of the retail boxes (not renders), and while the final specs haven't been announced and can't yet be considered official, the boxes pretty much confirm what we already know via prior rumors. Just don't pay any attention to the obviously inflated price tags ($650 for the 6700K and $470 for the 6600K).

These are both unlocked processors with 91W TDPs. The flagship Core i7-6700K brings four cores and eight threads to the Skylake-S party. It has a base clockspeed of 4GHz, boost clockspeed of 4.2GHz, 8MB of L3 cache, and support for DDR4-2133 and DDR3L-1600 memory.

Intel's Core i5-6600K is also a quad-core part, but with base and boost clockspeeds of 3.5GHz and 3.9GHz, respectively, and no Hyper Threading support. It has less cache (6MB) while supporting the same memory as the 6700K.

Skylake will require a new socket (LGA 1151) and represents a "tock" in Intel's "tick-tock" cadence, meaning it's a new architecture as opposed to a die shrink with various optimizations.

The original plan called for Cannonlake (tick) to succeed Skylake in 2016, though manufacturing difficulties in getting to 10nm led to Intel's decision to delay the launch until 2017. To fill the gap, Intel will introduce a chip design manufactured on its current 14nm process called Kaby Lake, thus breaking Intel's traditional tick-tock cycle.

Follow Paul on Google+, Twitter, and Facebook

Windows 10 vs. 8.1 vs. 7: Performance

Posted: 03 Aug 2015 06:14 AM PDT

We compared graphics performance between Windows 7, 8.1 and 10 with our standard benchmarks. The results? Meh.

We were in the midst of running benchmarks and writing the stories for our Budget Gamer, Midrange, and Turbo Blueprints builds when Windows 10 launched on Wednesday. So we figured, "Hey, while we're at it, let's take a look at how Windows 10 benches in comparison to Windows 7 and 8.1."

We took our Turbo build (look for the full write-up on that build on Monday) and put it through the standard benchmarks we use for Build Its and system comparisons. The Turbo is a bit of a beast, sporting a 5930K, a GTX 980 Ti, and 16GB of DDR4 RAM, all living on an MSI X99S SLI mobo. The main storage is an M.2 Kingston HyperX Predator 480GB SSD that boasts read speeds of 1,400MB/s and write speeds of 1,000MB/s.

More: See our full review of Windows 10.

We've been testing systems using Windows 8.1 for quite some time, so we hunted down our copy of Windows 7 and ran our benches. All went well with Windows 7, so we turned to Windows 10 to continue the test.

Installation Woes

There's something to note here about Windows 10: It didn't like that we had our SSD set to PCI mode instead of SATA mode. While Windows 7 and 8.1 installed to the M.2 SSD in PCI mode just fine, the Windows 10 installer choked on two attempts.

First, we started an upgrade from Windows 7 at around 5 p.m. before we all left the office (in an attempt to "save time" so we wouldn't have to install our benches all over again). When we came back the next day, the installer had only made it to 32 percent. So, we tried a fresh install, starting at 8:45 a.m. At 4:30 p.m., the system still hadn't booted to the desktop, and we watched in vain as the little dots spun in circles, mocking us. It was only when we switched the M.2 over to SATA mode that all worked as it should. 

After the installation, we were able to switch the M.2 back to PCI mode without any issues. We should also bring up that this particular motherboard doesn't have any Windows 10 drivers available for it yet. We're maintaining a list of motherboard drivers to make upgrades easier, so your mileage may vary.

The Tests

Our regular readers may already know what we use, but here's a refresher. We use four main graphics benchmarks when we compare systems. The benchmarks are meant as stress tests on a system's graphics hardware. While we run more in-depth benchmarks at varying resolutions when we compare or review GPUs, our system tests stick to 1440p and above.

For our benchmarks, we use Batman: Arkham City (GOTY), Middle-earth: Shadow of Mordor, and Tomb Raider. We run Batman at 1440p with all settings turned up, VSync off, and PhysX off, while we run Mordor and Tomb Raider at maximum settings at 2160p. We also score systems with 3DMark Fire Strike set to Ultra.

Let's take a look at how the Microsoft OSes fared.

Win 10 vs 8.1 vs 7 Game Benchmarks

As you can see, the differences in score between the three versions are negligible. We got a slight bump up in Batman in Windows 10, but five frames per second isn't exactly mind-blowing. Let's have a look at Fire Strike Ultra.

Windows 10 vs 8.1 vs 7 Fire Strike Benchmark

Fire Strike Ultra does show us that there's a jump in score from Windows 7 to 8.1 of nearly 100 points. However, the difference in scores from Windows 8.1 to 10 is a measly six points. Six. That's not enough for us to say that Windows 10 offers any definite advantages in graphics performance yet. However, DirectX 12 may change this in the future.

There are several other features that gamers may want to consider when deciding whether to upgrade. The ability to stream Xbox One games to the PC may be a selling point, while others may like the Start Menu over Windows 8.1's Start screen.

For now, Windows 10 is still dripping wet from its birth. It's wailing from having its butt spanked, so we know it's working, but it's not yet doing any better than its older sibling, Windows 8.1. After all, Windows 8.1's driver offerings are much more mature, so this can be expected in the early months.

We'll be running more in-depth tests to see if these initial findings hold true across more hardware and games, and we'll be watching as Windows 10 grows up.

Asus R9 Fury Strix Review

Posted: 03 Aug 2015 12:00 AM PDT

At A Glance

(+) Furies: High performance; good cooling; more efficient than stock Fury; less expensive than Fury X.

(-) Furries: Still expensive; requires lots of space; less efficient than GTX 980; drivers could use tuning.

Asus Strix R9 Fury, aka Rocky II

It's not a stretch to say that the launch of AMD's Fury X was a disappointment on many levels. Initially billed as the fastest of all current GPUs, by the time the Fury X actually launched, driver updates and Nvidia's GTX 980 Ti were enough to spoil the party. And let's also be clear on that "launch" business: even one month after the initial release, it's difficult/impossible to find any Radeon Fury X cards in stock—and if you do happen to find one, don't be surprised if it's currently priced well above MSRP. Some of the delay could potentially be blamed on AMD "reworking" the liquid cooling solution to minimize pump noise, but that shouldn't affect the vanilla Fury that was officially announced last week…except AMD didn't provide any hardware samples, leaving it up to the manufacturers of Fury cards (currently Sapphire and Asus) to provide samples.

Looking back from our lofty hindsight perch, it appears the Fury line launched before there was sufficient supply. Couple that with apparent production problems and drivers that still appear to need a bit of tuning, and you have to wonder why AMD didn't just wait another month or two. Oh, that's right: Windows 10. It's not clear how many people are planning to buy a new system for Windows 10. Technically, if you have a PC that's running Windows 7 or 8.1 without performance issues, it should handle Windows 10 just fine. Anyway, maybe AMD wanted to get Fury out the door before the new Windows 10 PCs launched, even if that meant launching with a rather limited supply.

Or perhaps the cards are just so popular that no one can manage to keep them in stock? After all, even if it's not the absolute fastest GPU, Fury X is still very powerful, and there are many gamers who still play for Team Red. Unfortunately for AMD, competitor Nvidia doesn't appear to have any issues with supply of their 980 Ti, which has been readily available since launch (though it did sell out the first week). It doesn't help that the 980 Ti also overclocks better and uses slightly less power than the Fury X. But the vanilla Fury that we're looking at today is a different story (assuming inventory can keep up with demand once the cards hit retail).

The air-cooled Fury is in many ways to the Fury X what the 980 Ti is to the Titan X. The "X" graphics cards are only available as reference designs, ostensibly to allow for better quality control. For the board manufacturers, what that really means is there is little to no way to differentiate other than through warranty and support. All Fury X and Titan X cards will use the same board, the same cooler, and have the same clock speeds, so they are all effectively interchangeable. (The exception to this is the EVGA Hybrid Titan X, with liquid cooling and a 30 percent price premium.) Meanwhile with the "affordable" alternatives, Fury and 980 Ti, the add-in board partners are free to let their imaginations run wild. That typically means some experimentation with clock speeds, voltages, and cooling. And the Asus Strix R9 Fury is happy to provide all three. Let's start with a look at specifications, comparing the R9 Fury X with the "stock" R9 Fury:

AMD Fury X/Fury/390X Specs
Card R9 Fury X R9 Fury R9 390X
GPU Fiji Fiji Hawaii (Grenada)
GCN / DX Version 1.2 1.2 1.1
Lithography 28nm 28nm 28nm
Transistor Count (Billions) 8.9 8.9 6.2
Compute Units 64 56 44
Shaders 4,096 3,584 2,816
Texture Units 256 224 176
ROPs 64 64 64
Core Clock (MHz) 1,050 1,000 1,050
Memory Capacity 4GB 4GB 8GB
Memory Clock (MHz) 1,000 1,000 1,500
Bus Width (bits) 4096 4096 512
Memory Bandwidth (GB/s) 512 512 384
TDP (Watts) 275 275 275
Price $649 $549 $429

In some areas, the Fury is an exact match for the Fury X: they both have 4GB HBM (High Bandwidth Memory) offering 512GB/s of bandwidth, they both have 275W TDP, and they both have 64 ROPs. That last item in particular seems like it might be holding back Fury X performance—not that AMD really had any room to add additional ROPs, as the 8.9 billion transistors coupled with the silicon interposer for HBM mean the Fiji GPU is effectively as big as it can be, but we do have to wonder what Fury could have done with 96 ROPs.

The main difference between the two GPUs is the number of compute units (CUs). Fury X has the full 64 CUs, giving it 256 texture units and 4,096 shaders; Fury, by comparison, has 56 CUs, with 224 texture units and 3,584 shaders. It also has a core clock 50MHz lower, though that shouldn't prove too big a deficit. On paper then, Fury X offers the same amount of bandwidth, and 14 percent more compute resources. Combined with its clock speed advantage, Fury X should be up to 20 percent faster than the vanilla Fury, but in cases where the GPUs are limited by the ROPs and/or bandwidth, we may see a performance gap of five percent or less.
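Those paper figures fall straight out of the table above: the shader and texture unit counts give the 14 percent, and multiplying in the clock difference gets you to roughly 20 percent. A minimal sketch of the arithmetic; it's a theoretical comparison only, since ROP- or bandwidth-bound workloads won't scale this way:

# Theoretical Fury X vs. Fury comparison using the figures in the table above.
fury_x = {"shaders": 4096, "core_mhz": 1050}
fury   = {"shaders": 3584, "core_mhz": 1000}

compute_ratio = fury_x["shaders"] / fury["shaders"]    # ~1.14 -> 14% more shaders/texture units
clock_ratio   = fury_x["core_mhz"] / fury["core_mhz"]  # 1.05  -> 5% higher core clock
combined      = compute_ratio * clock_ratio            # ~1.20 -> up to 20% on paper

print(f"Compute: +{(compute_ratio - 1) * 100:.0f}%  "
      f"Clock: +{(clock_ratio - 1) * 100:.0f}%  "
      f"Combined: +{(combined - 1) * 100:.0f}%")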

The Asus Strix R9 Fury further muddies the waters with its design and specifications. It has the same clock speeds and other features as the reference R9 Fury, but Asus has chosen to limit the GPU (ASIC) power by default to 216W, though you can boost this via the Asus GPU Tweak utility. This doesn't appear to affect stock performance, but certain stress-testing applications, e.g., Furmark, will clock down a bit. Peak power use, on the other hand, is lower than the Fury X and 980 Ti, and the card runs quietly and doesn't get too hot, both admirable qualities. As far as clock speeds, by default the Strix is clocked at the usual 1,000MHz of Fury, but Asus does provide a 1,020MHz OC option via GPU Tweak—which also bumps up the power limit by 10 percent. The OC is so small, however, that it's more of a check-box item than a serious factory overclock.

Standard R9 290X shown on top of the Strix R9 Fury for size comparison.

The Asus Strix Fury is definitely made for windowed cases, with an eye-catching aesthetic and a pulsating Strix logo. However, the triple-fan cooler makes it substantially larger than the Fury X, albeit without a CLC taking up additional space. Those with larger cases won't have any trouble installing the Strix Fury, 390X, or any other large GPU; mATX and smaller form factors are a different matter. The card is a full 12 inches long, plus the fan shroud is taller than a "normal" graphics card, so you'll definitely want to ensure your case is large enough before purchasing—or just buy a fat new case to house your GPU(s). Besides the large custom cooler, Asus has a reinforced frame and metal backplate that makes for a sturdier graphics card, and they've also used a 12-phase power delivery mechanism with "Super Alloy Power II" capacitors, chokes, and MOSFETs. The result of these high-end features is that the Strix Fury also carries a higher MSRP of $579, though we'll have to see where street prices land over the coming weeks.

Back for Another Round

We're running the same collection of tests as in our 980 Ti and Fury X reviews, with the less demanding settings detailed in the Fury X review. TL;DR: we turned off the Advanced Graphics settings in Grand Theft Auto V, as well as disabling 4xMSAA at 4K; for The Witcher 3, we've disabled HairWorks at all resolutions. The remaining games are run mostly at maximum quality, except for SSAA and a few other items. Batman: Arkham Origins is maxed out with 4xMSAA but no PhysX, Hitman: Absolution runs at Ultra with 4xMSAA, Metro: Last Light maxes out all settings but leaves off SSAA and Advanced PhysX, Middle-Earth: Shadow of Mordor uses the Ultra preset, Tomb Raider runs the Ultimate preset, and Unigine Heaven 4.0 uses Ultra quality with Extreme tessellation. All of the graphics cards have been tested in our standardized Haswell-E testbed.

All of the latest GPUs have been tested with the latest publicly available drivers, AMD Catalyst 15.7 or Nvidia 353.30. The AMD 15.7 drivers are effectively identical to the 15.15 launch drivers for the Fury and 300 series, as well as the 15.6 beta drivers previously used on the 200 series and earlier GPUs, but all AMD GPUs are now using the same drivers. Note also that we are now reporting 97th percentile minimum frame rates, as discussed previously.
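The review doesn't spell out exactly how that 97th percentile minimum is derived, but the usual approach is to sort the recorded frame rates and report the value that 97 percent of samples stay above, so a handful of one-off dips don't define the "minimum." A sketch of that idea, offered as an illustration rather than the exact method behind these numbers:

# One common way to compute a 97th percentile minimum frame rate: ignore
# the worst 3 percent of samples and report the slowest of what remains.
def percentile_minimum_fps(fps_samples, percentile=97):
    ordered = sorted(fps_samples)                 # slowest samples first
    cutoff = int(len(ordered) * (1 - percentile / 100))
    return ordered[cutoff]

samples = [20, 25, 30] + [60] * 97                # three brief dips in 100 made-up samples
print(percentile_minimum_fps(samples))            # 60 -- the outlier dips are discarded
print(min(samples))                               # 20 -- a plain minimum would report the worst dip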

R9 Fury > GTX 980

Benchmark charts: eight-game average, Batman: Arkham Origins, GTA V, Hitman: Absolution, Metro: Last Light, Shadow of Mordor, Tomb Raider, Unigine Heaven, The Witcher 3, and 3DMark Fire Strike results for the Asus Strix Fury.

Fury X had a difficult time with 980 Ti, ultimately failing to lay claim to the super heavyweight title. Dropping down a weight class to the mere heavyweights changes the story. Here, the $549 R9 Fury takes a clear lead over the $499 GTX 980, though there are still cases where Nvidia leads. The R9 Fury has significant leads in Batman: Arkham Origins, Hitman: Absolution, and Shadow of Mordor; meanwhile, GTX 980 leads in Metro: Last Light, as well as claiming quite a few victories at 1080p. Overall, the two GPUs are in a dead heat at 1080p, but the Fury leads by eight percent at 1440p and a convincing 13 percent at 4K. And this time there's no caveat about 6GB vs. 4GB VRAM.

Considering the pricing, however, it's not too much of a stretch to point to the GTX 980 Ti as an even faster GPU. It might cost $100 more than the baseline Fury cards, but it's only $70 more than the Asus Strix Fury. What's more, that 12 percent increase in GPU cost will deliver on average 11 percent more performance at 4K, 15 percent more at 1440p, and 20 percent better frame rates at 1080p. And that's before overclocking. In that sense, the story is similar to the earlier Fury X. Speaking of which, the more expensive AMD GPU only ends up outperforming the air cooled Fury by 6–8 percent on average.
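Put plainly, that trade works out as follows (a trivial sketch using the prices and averages quoted above):

# Price versus performance, using the figures quoted in this review.
strix_fury_price, gtx_980_ti_price = 579, 649
extra_cost_pct = (gtx_980_ti_price / strix_fury_price - 1) * 100   # ~12% more money

perf_gain_pct = {"4K": 11, "1440p": 15, "1080p": 20}               # 980 Ti over the Fury, per the review's averages
for resolution, gain in perf_gain_pct.items():
    print(f"{resolution}: +{gain}% performance for +{extra_cost_pct:.0f}% cost")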

Looking at just the minimum frame rates, the lead tends to be quite a bit smaller. On average, the 980 actually delivers minimum frame rates equal to or better than the Fury's, which again suggests that further tuning of the Fury drivers would prove beneficial. Unigine Heaven in particular has issues with AMD GPUs right now, but since that's more of a graphics test than an actual game, we place less stock in those results. Removing Heaven from the averages, we find that the Fury and 980 are essentially tied for minimum FPS.

Float Like a Butterfly

Another month hasn't radically altered the performance ladder in the GPU world, but now AMD has filled the gap between the R9 390X and the R9 Fury X. What's more, the Asus Strix Fury fills that gap while using substantially less power as an added bonus. The 390X still packs a lot of performance, but peak power use is over 120W higher than the GTX 980, and performance is still lower on average. The Strix Fury dances to a different beat and manages to surpass the GTX 980 while only using 50W more power under load, and slightly less power than the 980 Ti and Titan X. It's not a better fighter in every discipline, but it packs a punch and helps keep the pressure on Nvidia.

There are still shortcomings with AMD's Fury line, unfortunately. One area we still haven't been able to fully explore is overclocking. The Asus Strix Fury managed to run at 1,070MHz core (seven percent overclock) and 1,080MHz HBM (eight percent overclock), but without the ability to tweak voltages, that's all we can get from most Fiji-based cards right now. Once the utilities are updated, we might be able to improve the situation… or additional voltage may not even help. Nvidia's Maxwell 2.0 GPUs meanwhile continue to deliver substantial overclocks (20 percent or more over stock clocks) and better power efficiency.

For those backing the green corner, we haven't seen anything that would convince them to change sides, but short of the second coming that's almost a given. Those rooting for Team Red, on the other hand, can now point to improved performance and near-parity, which is a welcome change. The fence sitters still need to decide how much they're willing to spend—or if they even need to spend anything at all. The improved performance from GM200 and Fiji can be substantial, but so is the price, and it's entirely possible to have a great time gaming on a PC even with lesser hardware. And even 1080p at high quality is generally a superior graphical experience to what you'll get from console gaming. But if you're looking to run 4K resolutions at maximum quality, most single GPUs will still struggle.

Looking forward to next year, the real change is going to be TSMC's 16nm FinFET process, not to mention that both AMD and Nvidia will be looking at HBM 2.0, which will double the bandwidth and memory capacity of HBM 1.0. After four years, GPUs will finally move on from 28nm, and the combination of 16nm FinFET and HBM 2.0 has the potential to make even today's best GPUs look rather tame. Windows 10 has just launched as well, and by next year we should also have a better idea of what DirectX 12 really means for gaming.

In the meantime, the AMD R9 Fury basically delivers linear price/performance scaling over the GTX 980, which is quite good at the top of the performance ladder. It can handle just about anything you might want to throw at it, up to and including 4K gaming (though generally not at maximum quality). The Fury ends up being nearly as fast as the Fury X for $100 less, making it a slightly better value proposition, and there's still hope that driver improvements can further increase performance.

Asus, for their part, offers a potent card in the Strix R9 Fury, for those with the budget and a sufficiently large case. It runs quietly and stays cool even under pressure, and Asus has really put a lot of engineering prowess into the design, allowing them to reduce power requirements by about 30W compared to other Fury designs. Is that enough to warrant the $30 price premium over other R9 Fury cards? For some users, certainly, but whatever Fury card you choose will still offer a healthy amount of performance.

Follow Jarred on Twitter.

MMORPG News


General: UPDATE: Final Fantasy XII Overhaul? Seems Not, Sorry to Say

Posted: 03 Aug 2015 08:00 AM PDT

UPDATE: Final Fantasy XII Overhaul? Seems Not, Sorry to Say

Rumors have been swirling, and were seemingly partially confirmed, that Final Fantasy XII will be getting either reworked/updated or completely remade, following a Distant Worlds concert in Pittsburgh, Pennsylvania this past weekend. During the concert, composer Arnie Roth stated that the game would be getting 'remade', though it is unclear whether this is indeed a fact, or whether it will be a reworking of the beloved title.

General: Gunjack - New VR Arcade Shooter Revealed by CCP

Posted: 03 Aug 2015 07:36 AM PDT

Gunjack - New VR Arcade Shooter Revealed by CCP

Set in the larger EVE universe, CCP has announced a new arcade shooter called Gunjack. The game is being developed for the Samsung Gear VR platform and is based around "mining operations and pirates". Check out the trailer below and let us know what you think!

Lord of the Rings Online: List of Several US & EU Worlds Closing in Early '16 Revealed

Posted: 03 Aug 2015 07:29 AM PDT

List of Several US & EU Worlds Closing in Early '16 Revealed

The Lord of the Rings Online site has been updated with a list of both North American and European worlds that will be closed beginning in early 2016. As of this time, no new characters can be created on these worlds and players will begin the transfer process by opening the test world for character copy.

Rift: 'We're Excited About the Future'

Posted: 03 Aug 2015 07:23 AM PDT

"Ocho" from the Rift Community Team has posted on the game's official forums in response to players who are concerned about its future longevity. Ocho challenges the notion of doom by revealing that the team is growing, the population is strong and that there are concrete plans for the future.

General: Top 5 Reasons You Should Not Play an MMO Right Now

Posted: 02 Aug 2015 01:06 PM PDT

Top 5 Reasons You Should Not Play an MMO Right Now

You probably read articles on our website because you like MMOs. As a matter of fact, I'd be willing to bet that you play them regularly. You might even have read something on this site today related to one of your favorite MMOs. I'm here to tell you that there are other things you could be doing with your spare time. As great as MMOs can be - and they can be really, really awesome - there are other things worth enjoying as well.

General: Felspire: CBT Announced For August 3rd

Posted: 01 Aug 2015 10:42 AM PDT

Felspire: CBT Announced For August 3rd

37Games has announced that fantasy MMORPG Felspire will begin Closed Beta on August 3rd. Players can take on the role of a mage, archer or a warrior and have access to advanced classes.
