General Gaming Article

Intel Pledges Support for FreeSync, Where Does That Leave G-Sync?

Posted: 28 Aug 2015 06:20 PM PDT

Whose ship will sync?

Asus Monitor

Looking for something to ponder over the weekend? Try this on for size: With Intel vowing support for the VESA-backed Adaptive-Sync standard, which is what AMD's FreeSync technology utilizes, where does that leave Nvidia's competing (and proprietary) G-Sync technology?

Before you answer that question, let's back up a moment. Intel Fellow and Chief Graphics Software Architect David Blythe told TechReport last week that Intel plans to add Adaptive-Sync technology to future processors with integrated graphics. What that essentially means is that, in time, Intel CPUs with integrated graphics will play nice with displays that support AMD's FreeSync technology.

In an article titled "Intel May Have Just Killed Nvidia's G-Sync," Timothy Green of The Motley Fool called this a "big win for FreeSync" and wrote that "G-Sync technology may be a lost cause." (Note that Green owns stock in Nvidia and The Motley Fool "owns and recommends Intel").

Green supports his argument by pointing out that even though Nvidia owns the biggest share of the discrete graphics card market, Intel's integrated graphics account for 75 percent of the graphics market as a whole.

"G-Sync certainly isn't dead yet, but a scenario where both G-Sync and FreeSync are widely used now that Intel has backed FreeSync is difficult to imagine," Green writes.

Not so fast. What's missing from a cursory glance at the graphics market share breakdown is how many people are using integrated graphics. Both my primary laptop and desktop are equipped with Intel CPUs with integrated graphics, but both are running Nvidia GPUs -- incidentally, my laptop (Asus ROG G751JY-DB72) is one of a handful of models that support G-Sync.

It's likely the number of integrated graphics users is still higher -- mainstream users tend to outnumber enthusiasts -- but I doubt it's as lopsided as the market share breakdown suggests. That's point number one.

Point number two is that the gamers and enthusiasts who turn their noses up at integrated graphics solutions are the ones more likely to be interested in FreeSync and G-Sync technologies. Speaking for myself, I don't care whether the graphics in my CPU support either technology, because I play games with a dedicated GPU (or two).

That said, it's an interesting development on a number of levels. For one, Intel and AMD are rivals in the chip business, so for Intel to back a standard that AMD uses and actively promotes (under its own FreeSync branding) is no small thing.

Secondly, you can bet that once integrated graphics solutions are in place that support FreeSync, AMD will go nuts touting its technology. AMD will talk up the affordability of FreeSync, since unlike G-Sync, it's not proprietary and doesn't require any licensing fees or special hardware modules -- that's a bonus for monitor makers and consumers alike.

Finally, even though integrated graphics solutions are weak alternatives to dedicated graphics, they're getting faster with each new generation. Gaming on integrated graphics is actually feasible, depending on which solution we're talking about, and that's only going to improve with time. Should the day come when integrated graphics truly rival discrete solutions, it would be a boon for FreeSync.

What do you think about all this -- should Nvidia be worried that Intel threw its weight behind Adaptive-Sync/FreeSync, or is this only a minor victory for AMD?

Follow Paul on Google+, Twitter, and Facebook

Chrome to Kill Automatic Flash Playback

Posted: 28 Aug 2015 05:31 PM PDT

Flash Plug-in Now Integrated with Chrome

Earlier this year, Google added a setting in Chrome that allowed users to switch off specific plugin content, such as Flash-based ads, to help speed up page loads and reduce power consumption. Chrome users could access this setting by clicking the menu button, choosing "Settings," "Show Advanced Settings," and then the "Content Settings" button within the "Privacy" section. From there, users could set Chrome to run all plugin content, choose for themselves what content runs freely on web pages, or let Chrome detect and run only "important" plugin content.

Back in June, Google offered advertisers three methods for getting ads in front of Chrome users without using Flash: letting Google's AdWords automatically convert Flash to HTML5, creating an ad from scratch using HTML5 tools supplied by Google, or uploading their own HTML5 ads produced without Google-supplied tools.

Now Google is reporting that Flash ads will be paused by default starting September 1. According to Google AdWords, the company's online ad network, "most" Flash ads will be converted to HTML5 automatically. For those that aren't, advertisers will need to identify the Flash ads that can't be converted and rebuild them as HTML5 using third-party tools.

"If you already have Flash ads uploaded to AdWords, we highly recommend that you create new image ads instead," Google states. "Eligible Flash campaigns are automatically converted to HTML5 when you upload them in AdWords, AdWords Editor, and many third-party tools."

Many websites use Flash to "wow" potential customers, at times forcing visitors to wait while the full Flash-laden site loads. That's annoying enough, but what's worse is that many advertisers embed audio in Flash ads, which can be hard to locate among open tabs. To remedy that, Google added an icon to Chrome that shows up whenever a tab is playing unwanted audio.

Web surfers may agree that the use of Flash has become dated and gotten out of hand. Over the years, hackers have taken advantage of the security flaws found in Flash to steal identities and install malware, hence the push to move away from Adobe's cash cow toward the supposedly safer HTML5 technology.

With all that said, is Google canning the use of all Flash media altogether? Not yet. Chrome will "pause" all minor Flash files, allowing the visitor to click and play the Flash elements if desired. Again, the change will begin to roll out next week.

NVIDIA Shield Set-Top Box on Google Play

Posted: 28 Aug 2015 12:43 PM PDT

Nvidia Shield Stand

Nvidia's Chris Daniel updated the company's blog with news that the Shield Android TV set-top-box can now be purchased on Google Play in North America. The device is one of three Shield gadgets that Nvidia has released over the years, starting with the Shield handheld (2013) followed by the Shield tablet (2014) and the Shield set-top-box (2015). The other two devices are not for sale on Google's storefront.

The specifications show that the Shield set-top-box sports an Nvidia Tegra X1 SoC, 3 GB of RAM, a 256-core GPU, 16 GB of storage (500 GB for the "Pro" model), dual-band Wireless AC and Bluetooth 4.1/BLE connectivity, and Android 5.1 "Lollipop." There's also Gigabit Ethernet, HDMI 2.0, two USB 3.0 ports, a microSD card slot, a microUSB 2.0 port, and an IR receiver.

With this device, Nvidia aims to support 4K Ultra HD playback at 60 frames per second when connected to a compatible Ultra HD TV. The specs also show that the box supports Dolby 7.1 and 5.1 surround sound via pass-through over HDMI. Playback audio is up to 24-bit/192 kHz over HDMI and USB, the specs reveal.

Despite the box's movie "theater" roots, Nvidia's set-top-box also aims to bring HD Android gaming to the living room. The box comes packed with an Nvidia Shield controller, letting owners play high-definition games that can't be played on other Android devices, such as Doom 3 BFG Edition, War Thunder, and Brawl.

And like the other two Shield gadgets, the set-top-box can stream select PC games from a gaming rig with a "Kepler" or better GPU. Nvidia also serves up over 50 PC games that can be streamed via Nvidia Grid. These games include Batman: Arkham Origins, Ultra Street Fighter IV, The Witcher 2: Assassins of Kings, Borderlands, Dead Island, and tons more.

"More than 135 high-quality games are available in our curated SHIELD Hub app, and another 300+ on Google Play. More arrive all the time," Daniel says.

The Shield set-top-box comes with one controller and 16 GB of storage for a meager $199.99. Want the massive 500 GB version instead? Be prepared to cough up $299.99.

Newegg Daily Deals: LG 34UB67-B UltraWide Monitor, PNY 128GB Micro SDXC Card, and More!

Posted: 28 Aug 2015 10:53 AM PDT

LG Ultrawide

Top Deal:

Future generations will look back and giggle at the multi-monitor setups of yesteryear. They'll listen as old folks tell of the bezels and messes of cables that came with running two or three monitors at a time, but they'll never know the struggle. That's because they'll be rocking a monitor akin to today's top deal, an LG 34UB67-B 34-Inch UltraWide for $450 with free shipping (normally $500 - use coupon code: [EMCANW72]; additional $50 mail-in rebate). That's 34 inches of 21:9 bliss, and yes, it's an IPS panel too!

Other Deals:

Dell U2414H Black 23.8-inch 8ms HDMI Widescreen IPS Monitor for $210 with free shipping (normally $240 - use coupon code: [EMCAWNW23])

LG 27MP36HQ-B Black 27-inch 5ms HDMI Widescreen IPS Monitor for $200 with free shipping (normally $230; free 128GB solid state drive w/ purchase, limited offer)

Samsung 850 Pro Series MZ-7KE128BW 2.5-inch 128GB SATA III 3-D Vertical Internal Solid State Drive (SSD) for $82 with free shipping (normally $95 - use coupon code: [EMCAWNW37])

PNY High Performance 128GB microSDXC Flash Card for $50 with $1 shipping (normally $80 - use coupon code: [EMCAWNW47])

Asus Rolls Out TUF Sabertooth Z170 Mark 1 Motherboard

Posted: 28 Aug 2015 10:41 AM PDT

One TUF cookie

Asus TUF Sabertooth Z170 Mark 1

If Mad Max were to build a system around Skylake, we have no doubt he'd opt for the new Asus TUF Sabertooth Z170 Mark 1. True to its name, this "TUF" board is littered with features designed for durability, such as port-protecting Dust Defenders, military-grade components, ESD guards, and so forth.

Let's start with the basics. This is a socket LGA1151 motherboard built around Intel's Z170 Express chipset. It has four DIMM slots supporting up to 64GB of DDR4-2400 RAM, three PCI-Express x16 slots, three PCI-Express x1 slots, six SATA 6Gbps ports, two SATA Express ports, six USB 3.0 ports, eight USB 2.0 ports, a single USB 3.1 Gen 2 Type-C port, a single USB 3.1 Gen 2 Type-A port, 8-channel audio, and GbE LAN.

The board boasts an 8+4 power phase design and has armor all over the place, both to keep things cool and for rigidity. There's also a reinforced backplate, a dozen fan connectors, and an onboard processor that monitors temp sensors and fan speeds.

With the introduction of this model, Asus now offers a dozen Z170 Express chipset motherboards. This is likely to be one of the more expensive ones, though Asus hasn't yet revealed pricing or said when it will be available.

Follow Paul on Google+, Twitter, and Facebook

Don't Expect to See Chrome Notifications in Windows 10's Action Center

Posted: 28 Aug 2015 08:44 AM PDT

Valid explanation or bogus excuse?

Chrome

You would think that adding support in Chrome for notifications to appear in the Action Center for Windows 10 would be an easy "Yes" for Google, but you'd also be wrong. As it turns out, Google has no intention of using the Action Center, at least not anytime soon.

The revelation came in response to a feature request on Google's Chromium support forum.

"Thanks for the input and ideas! We've discussed this quite a bit and decided not to integrate with the system level notification at this time," a Chromium forum moderator posted. "It would create a weird state where Chrome behaves differently on Win 10 than on Win 7/8 and developers of extensions/websites wouldn't know what they design for. Maybe we can revisit it in a few years when most users are on Win 10."

A smiley face was added to the end of the post for good measure, though it didn't prevent users from frowning on Google's decision.

"What kind of a stupid reason is that? Developers of extensions/websites do NOT NEED to know what they design for. Notifications are notifications. Where they show up on the host OS is not a concern for the extensions/websites developers," one of the forum users posted in response.

Several others chimed in with contempt and confusion at Google's "weird decision" to not use the Action Center in Windows 10.

"You're basically electing to clutter the user's interface by using a separate notification system for Chrome, where you could be unifying it with the system. Wasn't the whole point of the Chrome running in the background thing unifying it with the system?," another user posted.

The reaction to Google's decision prompted a followup response by the moderator speaking on Google's behalf. The shortened version is that "on Win 10, using the native notification system would mean that all notification could show briefly before disappearing but they could also not show, depending on a user setting. All notifications would show as coming from Chrome. They would not be actionable, and so on."

Former Maximum PC contributing writer and current PCWorld senior editor Brad Chacos thinks the decision has more to do with an ongoing feud between Google and Microsoft, and less to do with technical reasons. Chacos points out that Google and Microsoft have a contentious history when it comes to feature support, such as the time Microsoft's native Calendar app removed Google Calendar support in Windows 8.1, or when Microsoft released its own version of YouTube for Windows Phone to counter Google's refusal to release apps for its services on the competing platform.

What do you think -- does Google have a valid reason for not supporting Chrome notifications in Windows 10's Action Center, or are users once again getting caught in the crosshairs of a feud between two tech titans?

Follow Paul on Google+, Twitter, and Facebook

Valve and HTC's Vive VR Headset to See "Limited" Launch in 2015

Posted: 28 Aug 2015 08:15 AM PDT

Drats!

HTC Vive

Swap out your mechanical keyboard for a crappy membrane plank before reading any further. It's just a precaution, should you decide to Hulk-smash your keyboard in frustration at what we're about to tell you.

Remember the Vive VR system being developed by Valve and HTC we told you about earlier this year? It's the one that impressed our own Jimmy Thang, who had a chance to demo early versions on two separate occasions (lucky dog!) and called it "the best VR experience I've ever had. And this is coming from a guy who has tried nearly all of the VR headsets out there, including Oculus VR's newest Crescent Bay prototype. This is the closest thing to a modern-day holodeck we have at the moment."

High praise, and you weren't supposed to have to take his word for it -- you were supposed to be able to buy one in time for the holiday shopping season. That will still be true for some of you, though by and large, most early VR adopters will have to wait until 2016 to experience the Vive. That's because Valve revealed that HTC will only make a "limited quantity" of Vive VR headsets available for its 2015 launch.

"Later this year, HTC will offer the first commercial Vive units via a limited quantity of community and developer systems, with larger quantities shipping in calendar Q1 2016," Valve said at PAX Prime, according to GamesIndustry.biz.

We know, it's a first-world problem and all that jazz. It's also disappointing because we've seen firsthand how awesome and promising the Vive VR is. While Valve didn't say exactly how limited the initial launch will be, it appears the focus has now shifted to next year.

That's also a bummer for Valve and HTC, as the delay negates the head start the Vive would have had over the first consumer version of the Oculus Rift, which is also supposed to ship in the first quarter of next year.

Follow Paul on Google+, Twitter, and Facebook

Memory Myths: How Much RAM Is Enough?

Posted: 28 Aug 2015 12:00 AM PDT

This article was published in the September 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

It's tempting to always get the biggest and fastest memory kit, but how much RAM do you actually need?

Component lifespans are usually pretty easy to track. Processors get higher clock speeds, more cores, and smaller silicon; graphics cards get better clocks, more transistors, and bigger heatsinks; and storage gets bigger and cheaper.

Memory is another component that's constantly evolving: faster speeds, bigger quantities, more channels. Conventional wisdom suggests that adding faster and larger amounts of memory will allow games and applications to run faster, but that's not always the case, which is why we've examined this murky situation.

The Memory Landscape

Computer memory is currently divided into two main types: DDR3 and DDR4. The former is older, having debuted back in 2007, while the latter only hit the mainstream recently, with Intel's X99 platform in 2014, and more recently with Skylake's Z170 platform.

They both work using the same principle: DRAM chips store data that the computer needs immediately, but it's lost when it's no longer useful or the PC is turned off. It's governed by several common attributes: Larger amounts mean more data can be stored, and higher MHz ratings mean the memory runs at a faster speed, so data moves in and out more rapidly.

The newer standard, DDR4, has several advantages over DDR3. It runs at a higher frequency, so it's able to process tasks at a faster rate: DDR3 is generally clocked between 1,333MHz and 2,400MHz, while DDR4 ranges from 2,400MHz to 3,200MHz and beyond. It's possible to blur these lines with overclocking, but, for the most part, DDR4 is faster. It balances those better speeds with more efficient power consumption, and its chips have double the internal memory banks, faster burst access, and higher data transfer rates.

DDR3 and DDR4 memory work with different motherboards and chipsets. DDR3 memory is compatible with nearly every motherboard and socket type you're able to buy right now, while DDR4 memory is only compatible with boards that use Intel's X99 chipset and LGA2011-v3 processor socket, or the new Z170 boards with DDR4 slots. (Note that some Z170 boards support DDR3 instead of DDR4.)

DDR4, however, does have one downside: increased latency. Newer DDR4 2,133MHz memory has a latency rating of CL15, which means it takes 14.06ns to perform a read, while CL11 DDR3 1,600MHz memory reads at 13.75ns. That's a tiny margin, and DDR4 negates this disadvantage with its generally higher clock speeds. Nevertheless, if you'd like to keep an eye out, look for CAS ratings, which indicate latency -- lower is better.
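If you want to run those numbers yourself, the conversion is simple: a CAS rating counts memory clock cycles, and because DDR transfers data on both edges of the clock, one clock period in nanoseconds is 2,000 divided by the quoted MT/s rate. Here's a quick sketch of the calculation (our own illustration, using the figures quoted above):

    def cas_latency_ns(cas_cycles, transfer_rate_mts):
        # DDR moves data twice per clock, so the memory clock in MHz is half
        # the quoted transfer rate; one clock period is 2000 / MT/s nanoseconds.
        return cas_cycles * 2000.0 / transfer_rate_mts

    print(cas_latency_ns(15, 2133))  # DDR4-2133 CL15 -> ~14.06 ns
    print(cas_latency_ns(11, 1600))  # DDR3-1600 CL11 -> 13.75 ns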

No matter which memory you buy, you'll have to deal with channels. Dual- and quad-channel setups are the most popular, and they improve performance by allowing the memory controller to send and receive data over multiple channels simultaneously, thereby improving bandwidth. It's possible to run memory in single-channel mode, but there'll be a performance decrease if you run a single stick of memory rather than two or four.
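Channel scaling is easy to estimate, too: each channel is 64 bits wide, so it moves 8 bytes per transfer, and peak theoretical bandwidth grows linearly with both the transfer rate and the channel count. A back-of-the-envelope sketch (again, our own illustration rather than anything from our benchmark suite):

    def peak_bandwidth_gbs(transfer_rate_mts, channels, bus_width_bits=64):
        # transfers per second x bytes per transfer x number of channels
        bytes_per_sec = transfer_rate_mts * 1e6 * (bus_width_bits / 8) * channels
        return bytes_per_sec / 1e9

    print(peak_bandwidth_gbs(1600, 1))  # single-channel DDR3-1600: 12.8 GB/s
    print(peak_bandwidth_gbs(1600, 2))  # dual-channel DDR3-1600: 25.6 GB/s
    print(peak_bandwidth_gbs(2400, 4))  # quad-channel DDR4-2400: 76.8 GB/s

Real-world results fall well short of these theoretical ceilings, as our SiSoft Sandra numbers later in this article show, but the scaling relationship holds.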


Different locations of the key notch (on the insertion edge of each DIMM) prevent a DDR3 or DDR4 stick from being installed in an incompatible board or platform.

The Changing PC Landscape

The variety of different specifications means that prices vary wildly. The cheapest 16GB DDR3 kits made from two 8GB sticks currently cost about $90, but the most expensive can cost more than $300. It's a similar story with DDR4, where dual- and quad-channel kits also vary by huge amounts when it comes to price. But these will generally be more expensive than their DDR3 equivalents.

Manufacturers claim that the increased speeds and better features provided by pricier memory will make a dramatic difference to performance, but we're not so sure, so we set up some test rigs to find out just how much memory you really need. Both test rigs use MSI motherboards. One uses Intel's Z97 chipset with a Core i7-4770K processor, while the other is an X99 rig with a Core i7-5820K chip. Both have their operating systems installed on a Samsung 850 Evo SSD, and both use an Nvidia GeForce GTX 980 graphics card.

We've already mentioned the different processors and chipsets that work with DDR3 and DDR4, but there's more to choosing components than just making sure your new gear is compatible on paper. Intel's Haswell architecture is behind the bulk of its current desktop processors, and it supports up to 32GB of dual-channel memory. It's used for chips that range from cheap Celerons and Pentiums to more expensive Core i5s and i7s, and these desktop Haswell chips all plug in to the LGA1150 socket. Most Haswell-based processors are deployed with mobos that have Intel's H87, Z97, and Z87 chipsets. When it comes to memory support, they're all impressive. They handle four slots that accommodate two sets of dual-channel memory, and most full-size ATX boards also support 32GB or 64GB of memory at high speeds.

Intel has further developed its architecture with Haswell-E. Chips that use this system also use the LGA2011 socket and X99 chipset, which means that support for DDR4 is included. That in turn means support for faster memory speeds when compared to DDR3, and the X99 platform is quad-channel.

AMD's processors and APUs, meanwhile, use the Piledriver architecture. Its memory controller was given a speed boost over the previous generation of AMD hardware, but memory support ultimately still isn't as good on this side of the fence. All of AMD's current chips support DDR3 memory; however, some of them are restricted to 1,600MHz or 1,866MHz memory, while only a handful officially top out at 2,133MHz (though some enthusiast mobos allow you to overclock the RAM to higher speeds). Like Intel's mainstream platforms (LGA1150/1151), these boards support dual-channel memory.

Don't Forget Your Mother

Processors and chipsets aren't the only bits of your PC that need to be checked before shelling out for new memory—motherboards are also vital. You'll need to make sure a board has the right number of slots, and also check what amount and speed of memory it can accept: It's no good dropping a few hundred bucks on a 32GB 3,000MHz kit if your motherboard taps out at 16GB and 2,666MHz.

There are nuances to be examined, then, but for the most part, the memory landscape is heartening. No matter what processor, chipset, or motherboard you use, you'll be able to equip a rig with plenty of high-end memory at decent speeds. That's good for PC building, but it's not necessarily great news for companies that rely on flogging expensive, high-end kits.

Future developments from Intel and AMD will only improve the situation. Intel's newly launched architecture, Skylake, supports DDR4 across all of its full-fat desktop chips, but it's also backward-compatible with DDR3, which adds a huge amount of versatility. (Again, pay attention to the number and type of memory slots, as you cannot use DDR3 sticks in a DDR4 slot or vice versa.) We also expect to see improvements to the memory controller and support for larger amounts of memory running at faster speeds.

AMD isn't standing still, either. Its next proper desktop architecture is called Zen, and it'll offer full DDR4 support to bring the company's chips alongside Intel.


The X99 chipset introduced DDR4 to the high-end consumer market, bringing with it faster clock speeds and better power efficiency, but with increased latency.

DDR3 Memory

The first set of DDR3 benchmarks we locked and loaded were PCMark 8's Home, Creative, and Work tests—a trio of suites that simulate the kind of low-intensity tasks that take place on many systems, from web browsing and video chatting to word processing and spreadsheets.

Our first tests deployed the bare minimum of sluggish DDR3: 8GB of RAM clocked at 1,333MHz. With this RAM, the rig returned scores of 5,170, 6,794, and 5,234 points, in the Home, Creative, and Work tests, respectively. However, with 8GB of 1,600MHz memory deployed, the scores barely improved, with the Creative run only jumping to 6,852.

There wasn't even much of a difference in these tests when we installed 16GB of 1,866MHz memory: In those three benchmarks, the machine scored 5,270, 6,961, and 5,225. The biggest leap came in the Creative test, which suggests more memory helps with photo editing and other trickier tasks, but it's hardly a game-changing jump in performance.

We saw similarly modest gains in other photo-related applications. GigaPan Stitch knits together a group of high-resolution photos, and our test image took four minutes and 12 seconds to complete in a rig with 8GB of 1,333MHz memory. That only improved by 11 seconds when we doubled the RAM and upped its speed to 1,866MHz.

Other application benchmarks saw similarly modest impacts. A Cinebench R15 CPU test with two 4GB, 1,600MHz sticks returned a result of 703; doubling the memory and improving its speed to 1,866MHz only improved that figure to 751.

We only saw big improvements in a few benchmarks when running DDR3 tests. In PCMark Vantage, our 8GB 1,600MHz rig scored 18,313 points, but doubling the memory and running it at 1,866MHz saw that result jump by almost 3,000 points—a significant increase.

Indeed, our theoretical tests indicate that improving memory amounts and speeds does make a difference, but that these gains don't generally translate to real-world tests.

In SiSoft Sandra's multithreaded bandwidth test, our 2x 4GB 1,333MHz setup scored 16.57GB/s, but doubling the memory and improving its speed to 1,866MHz saw that result jump to 23.33GB/s. There was a decent jump in single-threaded bandwidth, and cache bandwidth also improved significantly when faster memory was added in larger amounts.
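If you'd like a ballpark figure for your own machine without installing SiSoft Sandra, a large-array copy gives a crude approximation of sustained memory bandwidth. This is only a rough sketch -- it isn't Sandra's methodology, it needs Python with NumPy installed, and interpreter overhead makes it a lower bound:

    import time
    import numpy as np

    N = 256 * 1024 * 1024 // 8          # 256MB worth of float64 values
    src = np.ones(N)
    dst = np.empty_like(src)

    loops = 10
    start = time.perf_counter()
    for _ in range(loops):
        np.copyto(dst, src)             # streams the whole array through memory
    elapsed = time.perf_counter() - start

    # Each pass reads 256MB from src and writes 256MB to dst
    gb_moved = loops * 2 * src.nbytes / 1e9
    print("~%.1f GB/s effective copy bandwidth" % (gb_moved / elapsed))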

The leap from two to four memory sticks doesn't often have much of an impact on our application tests, either. In Cinebench R15's OpenGL test, a machine with two 4GB 1,600MHz sticks scored 111 frames per second, with this score only jumping to 117fps with four 4GB 1,600MHz sticks installed.

When running applications using DDR3, then, the differences between slow and fast memory often aren't huge—and, as long as you've got 8GB of memory installed, then you're going to have enough to get most stuff done in real-world situations.

There was a noticeable performance difference between our rig with 1,333MHz and 1,600MHz memory installed, but, once beyond that 1,600MHz speed, the gaps between different memory speeds narrowed rapidly. We ran the GeekBench single-core benchmark on 1,600MHz memory, and then again with 2,800MHz memory, but the result only improved by around 100 points.

The benchmarks demonstrate that there are performance gains to be had by installing more memory at faster speeds, but those gains are only noticeable in high-end applications. For most of us, 8GB or 16GB of 1,866MHz memory will be more than enough.

DDR3 and Gaming

We tested a variety of games using our DDR3 rig, but only found sporadic improvements. In Metro: Last Light, a machine with two 4GB 1,333MHz sticks averaged 126fps, but improving to a pair of 8GB 1,866MHz DIMMs saw that result jump to 144fps. In both Bioshock Infinite and Batman: Arkham Origins, though, the improvements were far less impressive—a few frames better in the minimum frame rate benchmark, and only a gain of 2fps to the average rate.

There also wasn't much of a difference in any of our Unigine Heaven 4.0 tests. In all of our DDR3 tests—ranging from a system with two 4GB 1,333MHz sticks to a machine with four 8GB 1,600MHz DIMMs—the benchmark's average frame rate hovered between 63.4fps and 66.8fps. Those configurations didn't differ much in 3DMark's Fire Strike test either: in the same range of memory setups, our results only jumped between 11,607 points and 11,635 points, well within the margin of error.


Whether you've got DDR3 or DDR4, upping the size or speed of your memory makes little difference to Bioshock Infinite. Crucial's Ballistix sticks, left, look good, but also come with handy extras, such as integrated thermal sensors.

The DDR4 Difference

Newer DDR4 memory operates with faster speeds, better channel support, and Intel's latest enthusiast chipset and controller, so we expected our tests to reveal bigger performance disparities. Our initial tests, though, appeared to follow the blueprint already set out by the older DDR3 sticks.

In the Cinebench R15 CPU test, a machine with two 4GB 2,400MHz sticks scored 1,143 points. Doubling the memory and increasing its speed to 3,000MHz, however, only saw that result jump to 1,190. The X264 video encoding test followed a similar pattern. Our more modest rig ran through its two tests at 205 frames per second and 68 frames per second, but increasing the memory's speed to 3,300MHz saw those results only inch forward to 211fps and 73fps -- hardly a jump that'll make a big real-world difference.

GigaPan Stitch's photo-editing tool only saw a couple of seconds' worth of improvement with its memory sped up, while Geekbench exhibited similarly small gains: Our first DDR4 rig scored 22,165 points, but doubling the memory to 8GB, running at 2,666MHz, only saw the score jump to 22,849.

It's a shame because, as with DDR3, theoretical tests illustrated that improving speeds and amounts did make a difference. When we had two 4GB 2,400MHz sticks installed, our test rig delivered 15GB/s and 28.58GB/s of single- and multi-threaded bandwidth, with those numbers jumping to 17GB/s and 32GB/s with those same sticks clocked to 3,300MHz.

Those same benchmarks illustrated how DDR4 copes with quad-channel and larger amounts of memory: Our machine with two 8GB sticks may have delivered 32GB/s of multi-threaded bandwidth, but doubling the memory (and channels) saw that figure leap to 45GB/s. Quad-channel delivered impressive numbers throughout our benchmarks, then, but those figures weren't always translated to real-world tests. So, we'd say that it's not a vital addition to your PC, unless you're keen on buying a Haswell-E system to run intensive work applications or the most demanding games.

DDR4 and Gaming

We saw a big jump in just one of our gaming benchmarks, Metro: Last Light, while testing with DDR3. However, updated DDR4 memory proved even less dramatic. Improving the amount and speed of memory saw our Metro: Last Light results jump by a mere couple of frames, and our biggest improvements in Bioshock Infinite and Batman also only saw increases of a frame or two, no matter the amount or speed of DDR4.

We'll let Unigine Heaven have the last word. Our rig averaged 62.7fps with two 4GB 2,400MHz sticks installed, but this only improved to 64.2fps once we installed four 8GB 2,666MHz DIMMs.

There's no doubt about the pure, naked speed of DDR4, but it looks like we're at the point, for gaming especially, where any 8GB dual- or quad-channel configuration will be ample. Memory simply isn't the bottleneck in gaming. Processors and graphics cards are the components that are more likely to be holding back your frame rates.

Benchmarks

Memory kits tested:

DDR3
(1) 8GB (2x 4GB) 1,600MHz AMD AE34G1609U2
(2) 8GB (2x 4GB) 1,333MHz Corsair CMV8GX3M2A1333C9
(3) 16GB (2x 8GB) 1,600MHz Corsair CML16GX3M2A1600C9
(4) 16GB (2x 8GB) 1,866MHz Crucial Ballistix Sport XT

DDR4
(5) 8GB (2x 4GB) 2,400MHz Kingston HX424C15FBK4/32
(6) 16GB (2x 8GB) 2,666MHz Crucial Ballistix BLE2C8G4D26AFEA

Benchmark                               (1)        (2)        (3)        (4)        (5)        (6)
Cinebench R15 (index)                   703        738        751        721        740        747
PCMark 8 (index)                        5,161      5,170      5,228      5,270      5,320      5,364
PCMark Vantage (index)                  18,313     19,718     20,944     20,427     26,541     28,088
3DMark Fire Strike (index)              11,624     11,607     11,608     11,635     11,820     11,920
X264 v4.0 (fps)                         66.6       66.1       69.3       69.4       68.6       72.6
SiSoft Sandra memory bandwidth (GB/s)   21.1       16.6       20.6       23.3       28.6       31.8
GigaPan Stitch (secs)                   248        254        242        241        245        244
Bioshock Infinite (min/avg fps)         11/109     11/115.9   11.6/115.3 13.2/117.6 33.2/122.6 44.5/124.7
Batman: Arkham Origins (min/avg fps)    100/135    99/137     103/137    106/139    104/136    104/133
Unigine Heaven (min/avg fps)            26.5/64.1  25.2/63.4  26.8/63.6  26.2/63.6  28.6/62.7  27.8/62.7

What Memory Do You Really Need?

It's tempting to buy the fastest and largest memory kit you can afford when putting together a new build, but, as many of our benchmarks illustrate, aiming for the top of the tech tree is actually an unnecessary extravagance when it comes to memory.

The story is the same whether you're creating a PC using DDR3 or DDR4. A decent amount like 8GB or 16GB running at a reasonable speed will be enough to handle most tasks you throw its way, whether it's for work or gaming. You'll still see occasional benefits if you buy larger and faster kits, sure, but they'll be less significant—so, it's only worth looking toward these kits if you're a true enthusiast who wants the best parts available, or if you're running unusually demanding software and need to wring every last bit of performance from your PC.

Quad-channel kits, meanwhile, are great if you're using applications that'll truly take advantage of DDR4's improved architecture, such as encoding or rendering, but most people won't feel the benefit. It's no surprise, then, that it's only available with expensive X99-based CPUs (as well as the earlier X79-based platforms, which used quad-channel DDR3).

The majority of PC users, even enthusiasts, just don't need to cough up for the priciest kits around, and that's definitely no bad thing. Memory, processor, and chipset developments have leveled the playing field, which means it's one less component to worry about when putting together a new PC.

The Aesthetics of Memory

Memory manufacturers try to sell expensive kits on the basis of their size or speed, but that's not the only advantage that comes from spending big on a high-end set of DIMMs—many of them are also designed to look better than their cheaper, plain-looking alternatives.

Corsair's Dominator Platinum range sits at the top of the firm's product stack, and some of its key benefits are about the visuals. Corsair boasts of its industrial design and LED lighting—the top metal bar can be upgraded with different attachments, the LEDs can be changed, and the box has a good-looking fan kit that can sit on top of the sticks to provide extra cooling.

Expensive memory kits like this don't just have aesthetic advantages -- Corsair's Dominator Platinum chips are hand-sorted, and the DIMMs have improved monitoring hardware and better heatsinks. But there's no denying that visuals play a key part when it comes to high-end memory.

Other firms offer similarly high-end extras. Crucial's Ballistix memory sticks have attractive aluminum heatsinks alongside practical extras like integrated thermal sensors, while Kingston's HyperX Predator and Beast products have good-looking exteriors, but are chosen specifically to provide the best performance.

Kits like this bring practical and visual improvements to the table, then, but they're not always necessary. If you're building a midrange rig, or want to put together a machine without a window in its case panel, they're simply overkill.

Pentiums, Celerons, and APUs: Buying Memory for a Budget PC

Our tests have examined the effect of different memory on high-end machines, but if you're building a budget rig, then different considerations should come to the fore at checkout time.

For starters, don't splash out on an expensive, fast memory kit if you're going to be constructing a PC built around one of Intel's Haswell-based Celeron or Pentium chips, as most of these only support DDR3 that runs at 1,333MHz. That's slow enough to cause a performance hit in many tests, but on a low-end rig, it's unlikely you'll be running the sort of applications that'll suffer with lesser speeds.


Don't bother with high-end memory if you're running an A10-7850K or Pentium K.

AMD's APUs are more accepting of faster memory, but you'll still need to pay attention to speeds. A couple of its cheapest parts only handle 1,333MHz or 1,600MHz DDR3, but most can support 1,866MHz sticks. It's the same on the CPU side, with FX chips mostly supporting 1,866MHz parts.

There's one other main consideration when putting together a budget machine: the motherboard. Budget boards don't often support the extreme speeds offered by pricier components, and many—especially at smaller form factors—only have two slots, rather than four. That's fine if you're building a system you don't intend to upgrade, but it can prove restrictive if you want to add more memory later.

MMORPG News

Guild Wars 2: Heart of Thorns to Launch October 23rd

Posted: 28 Aug 2015 11:32 AM PDT

Heart of Thorns to Launch October 23rd

ArenaNet tweeted out this morning that the Guild Wars 2 expansion Heart of Thorns will be launching on October 23rd. Game editions run from $49.99 to $99.99.

General: Pillars of Eternity - The White March Packs a Punch

Posted: 27 Aug 2015 01:15 PM PDT

Pillars of Eternity - The White March Packs a Punch

This week, a few of us at the MMORPG offices went hands-on with Pillars of Eternity's first expansion, The White March - Part 1. Read on for what Chris thought of his first two hours in the game.

General: Is There a Better Alternative to Questing?

Posted: 27 Aug 2015 01:09 PM PDT

Is There a Better Alternative to Questing?

Games like EverQuest and World of Warcraft have given us so much, providing the conceptual framework for the majority of MMOs that have been released over the past decade plus. Most contemporary MMORPGs, although distinct in their own right, emulate one or more tropes from their predecessors, which in turn were inspired by other video games and pen-and-paper RPGs.

Armored Warfare: Developer Q&A Special Edition Published

Posted: 28 Aug 2015 04:47 AM PDT

Developer Q&A Special Edition Published

My.com and Obsidian Entertainment have published the latest Developer Q&A, this time a Special Edition. Answers are categorized in several different groups, including Territory Wars & Battalion Activities, an extensive section on Game Mechanics, and Miscellaneous.

Gigantic: Closed Beta Begins

Posted: 28 Aug 2015 04:38 AM PDT

Closed Beta Begins

Motiga has announced that Gigantic has entered closed beta testing, with interested players invited to sign up as potential testers. The beta is open to Windows 10 and Xbox One players, with French, German, English, and Spanish language support.

EVE Online: Alliance Tournament XIII Comes to a Close This Weekend

Posted: 28 Aug 2015 03:18 AM PDT

Alliance Tournament XIII Comes to a Close This Weekend

EVE Online fans will want to be on hand during this weekend's live stream of the final action in the Alliance Tournament XIII. Alliances have been in the preparation stages for months, complete with practice runs and fleet assembly. Teams are vying for bragging rights, tournament ships, reward medals and PLEX.

General: Mobile Game, Rune Story, to Launch in the West Soon

Posted: 28 Aug 2015 03:10 AM PDT

Mobile Game, Rune Story, to Launch in the West Soon

COLOPL has announced that its mobile city-building game with "classic RPG" elements, Rune Story, will be launching in the West in the near future. Already in soft launch in several worldwide locations, Rune Story brings guild functionality, cooperative battles, customized play styles and more to mobile gamers.

Gloria Victis: A Living, Open World

Posted: 27 Aug 2015 02:53 PM PDT

A Living, Open World

Creating total immersion has been one of the fundamental goals of Gloria Victis since the very beginning of development. That implies the need for a truly open world - one without loading screens or invisible barriers at every turn. A world that is as believable and alive as possible.

Guild Wars 2: Heart of Thorns PAX Reveal Leaked [SPOILERS]

Posted: 27 Aug 2015 02:42 PM PDT

Heart of Thorns PAX Reveal Leaked [SPOILERS]

It appears that Saturday's Guild Wars 2: Heart of Thorns reveal has been leaked by IGN. While the Twitter post has been removed, a trailer is still live and savvy Reddit users have detailed all of the leaked information. WARNING: If you do not wish to know the details of the PAX reveal, DO NOT CONTINUE!

Armored Warfare: Early Access 5 Launching Soon with the New Base Feature

Posted: 27 Aug 2015 12:20 PM PDT

Early Access 5 Launching Soon with the New Base Feature

Obsidian Entertainment has announced that Early Access Test 5 will begin on September 3rd and run through September 20th. During the latest event, players will get a look at their Private Military Company (PMC) and gain the ability to build and upgrade multiple structures.

ARK: Survival Evolved: Free Steam Weekend Begins Today

Posted: 27 Aug 2015 12:15 PM PDT

Free Steam Weekend Begins Today

Sitting on the fence about ARK: Survival Evolved? Then get ready to take off to a remote, dinosaur-infested world from today, Thursday, August 27th, through Sunday, August 30th, during ARK's free Steam weekend.

General: MUD: A Living Fantasy World

Posted: 27 Aug 2015 10:22 AM PDT

MUD: A Living Fantasy World

MUD is a brand new MMO from indie studio Pure Bang Games, based in Baltimore, Maryland. Built upon the ideals that made Multi User Dungeons the precursor to what we now know as MMORPGs, MUD seeks to bring players into a world that reacts to their choices. We spoke with project lead Benjamin Walsh about MUD, its many systems, and when we can expect to get our hands on the game.

Crowfall: Resolving Crowd Control Issues

Posted: 26 Aug 2015 12:19 PM PDT

Resolving Crowd Control Issues

I don't know about you, but when I hear about a new MMO announcement, my mind usually jumps straight to questions about the game's setting, character development, and long-term developer support. These are the big-ticket items that can make or break a new title for me, but the more specific game mechanics can sometimes be overlooked... until it's too late.

Dragomon Hunter: Full Class List Revealed

Posted: 27 Aug 2015 10:54 AM PDT

Full Class List Revealed

Aeria Games has announced the full class lineup for Dragomon Hunter, the anime-styled monster-hunting game. In addition to new information posted to the Dragomon site, the team has put together a handy video to give fans a look at each class. Check it out!

Otherland: Steam Early Access to Begin on September 10th

Posted: 27 Aug 2015 10:36 AM PDT

Steam Early Access to Begin on September 10th

After a short delay, the Otherland team has let fans know that Steam early access will kick off on September 10th. The team is using some extra time to "address community feedback" that was given during recent closed beta events.

Neverwinter: Elemental Evil Module to Launch for Xbox One on Sept. 8th

Posted: 26 Aug 2015 03:04 PM PDT

Elemental Evil Module to Launch for Xbox One on Sept. 8th

Xbox One Neverwinter players will finally get their chance to experience the Elemental Evil module beginning on September 8th. Elemental Evil will bring content from three other expansions as well, including Curse of Icewind Dale, Shadowmantle, and Fury of the Feywild.

Star Wars: The Old Republic: Companions to Remain Available

Posted: 27 Aug 2015 09:42 AM PDT

Companions to Remain Available

The Star Wars: The Old Republic forum has been updated with a new post to reassure players that their existing companions will remain available to them outside of the story Chapters in the Knights of the Fallen Empire expansion. During Chapters, a specific group of characters will be accompanying players.

Dragon Age: Inquisition: The Descent: Finally, Dwarves

Posted: 26 Aug 2015 12:03 PM PDT

The Descent: Finally, Dwarves

The original launch version of Dragon Age: Inquisition had much of what fans of the franchise were looking for. There were the open zones and the interesting and chatty group of companions. Inquisition had huge plot-shaking fights with evil dragons, and small sweet moments with friends around the card table. There was one area, though, where the game fell short: Dwarves. Uh, no pun intended.
