General Gaming Article


Gamers Petition for GeForce GTX 970 Refund Over Error in Specs

Posted: 27 Jan 2015 12:07 PM PST


Internal miscommunication at Nvidia led to confusion over the GTX 970's specs

Sometimes the tech world can be like a geek version of a soap opera, and this is one of those times. The main characters in this case are Nvidia and the GeForce GTX 970. If you're looking for a quick summary of events, it's this: Gamers noticed a slowdown in performance when games tried to access more than 3.5GB of memory on the GTX 970. That led Nvidia to explain a new memory architecture in the GTX 970, along with clarifying specs that differ from those originally reported. In light of all this, there's a petition floating around demanding a refund for anyone who purchased a GTX 970, but to really understand what's going on, a deeper explanation is necessary.

This all began a week ago when users on various forums started investigating a memory issue with the GTX 970. At a glance, it seemed the card was only using 3.5GB of its 4GB of GDDR5 memory. On closer inspection, it was discovered that a serious performance drop could occur when accessing that final 0.5GB of VRAM, something that isn't an issue on the GTX 980.

To clarify what was happening, Nvidia issued the following statement:

"The GeForce GTX 970 is equipped with 4GB of dedicated graphics memory. However the 970 has a different configuration of SMs than the 980, and fewer crossbar resources to the memory system," Nvidia said. "To optimally manage memory traffic in this configuration, we segment graphics memory into a 3.5GB section and a 0.5GB section. The GPU has higher priority access to the 3.5GB section. When a game needs less than 3.5GB of video memory per draw command then it will only access the first partition, and 3rd party applications that measure memory usage will report 3.5GB of memory in use on GTX 970, but may report more for GTX 980 if there is more memory used by other commands. When a game requires more than 3.5GB of memory then we use both segments.

"We understand there have been some questions about how the GTX 970 will perform when it accesses the 0.5GB memory segment. The best way to test that is to look at game performance. Compare a GTX 980 to a 970 on a game that uses less than 3.5GB. Then turn up the settings so the game needs more than 3.5GB and compare 980 and 970 performance again."

Nvidia's Senior VP of GPU Engineering, Jonah Alben, spoke with PC Perspective and broke things down even further with quite a few technical details. He also offered a helpful diagram, seen below.

Nvidia GeForce GTX 970 Diagram

As you can see in the diagram, there are 13 enabled SMMs, each with 128 CUDA cores, for a total of 1,664. There are also three that are grayed out -- they've been disabled from the full GM204 found on the GTX 980. But what's really important is the memory system, which is connected to the SMMs through a crossbar interface.

"That interface has 8 total ports to connect to collections of L2 cache and memory controllers, all of which are utilized in a GTX 980. With a GTX 970 though, only 7 of those ports are enabled, taking one of the combination L2 cache / ROP units along with it. However, the 32-bit memory controller segment remains," PC Perspective writes.

There are a couple of takeaways here. First, the GTX 970 has fewer ROPs and less L2 cache than the GTX 980, even though it was originally reported otherwise. Why? Nvidia blames the gaffe on two things: an error in the reviewer's guide -- usually a PDF (or actual paper) containing detailed info on a product that manufacturers send to reviewers prior to launch -- and a misunderstanding between the engineering team and the technical PR team about how the architecture actually functions.

Bottom line: the GTX 970 has 56 ROPs and 1,792KB of L2 cache instead of the 64 ROPs and 2,048KB of L2 cache in the GTX 980. The math follows from the disabled port -- each of the eight L2/ROP partitions contributes 8 ROPs and 256KB of cache, and the 970 has only seven of them active.

That's actually not as big a deal as it sounds, since the SMMs are the true bottleneck, not the ROPs.

"A quick note about the GTX 980 here: it uses a 1KB memory access stride to walk across the memory bus from left to right, able to hit all 4GB in this capacity," PC Perspective writes. "But the GTX 970 and its altered design has to do things differently. If you walked across the memory interface in the exact same way, over the same 4GB capacity, the 7th crossbar port would tend to always get twice as many requests as the other port (because it has two memories attached). In the short term that could be ok due to queuing in the memory path. But in the long term if the 7th port is fully busy, and is getting twice as many requests as the other port, then the other six must be only half busy, to match with the 2:1 ratio. So the overall bandwidth would be roughly half of peak. This would cause dramatic underutilization and would prevent optimal performance and efficiency for the GPU."

There are a LOT more details to digest, and rather than continue to quote bits and pieces, we suggest you read PC Perspective's detailed report. If after doing so you come to the conclusion that it's much ado about nothing, great, there's nothing more to see here. However, if you fall on the other side of the fence and feel duped, you can check out and sign the petition at Change.org.

Our take? It's an unfortunate situation Nvidia created for itself, and gamers have a right to be angry over the misreported specs. At the same time, the impact on real-world performance appears negligible, at least for now -- this could become a bigger issue as higher-resolution gameplay grows more common. Even so, the GTX 970 remains a great card for the price.


An Inside Look at How Logitech Designs Its Gaming Mice

Posted: 27 Jan 2015 11:35 AM PST

The science and testing behind Logitech's gaming mice

This is part two of our in-depth tour of Logitech's facilities in Switzerland. This article focuses on how Logitech designs and develops its gaming mice. For an inside look at how the company is attempting to reinvent the mechanical keyboard, click here.

While Logitech is generally seen as a peripheral manufacturer, the company views itself as a technology company. In an attempt to show PC gamers that it uses cutting-edge design methodologies, Logitech invited us to its headquarters in Lausanne, Switzerland to show us how it designs and tests its gaming mice.

Logitech explains how its G402 mouse uses two sensors


Logitech G402 Hyperion Fury
The company's most interesting mouse today is arguably the G402 Hyperion Fury, which it claims is "the world's fastest gaming mouse." Logitech boasts that the G402 can move a blistering 12.5 meters per second. To achieve this, Logitech says it uses a combination of two sensors. At slow-to-moderate speeds, the mouse uses a traditional optical sensor. Optical sensors are arguably the most common sensors in gaming mice; they use high-speed cameras to take blazing-fast images of the surface the mouse rests on, then overlap those images to create a movement map.

While the cameras in Logitech's optical sensors are orders of magnitude faster than the point-and-shoot cameras at your local camera store (think roughly 12,000 shots a second), the company says even they exhibit detectable lag when you're trying to move a mouse at 12.5 meters per second. Therefore, beyond a certain speed threshold, the G402 switches over to an accelerometer/gyroscope solution. A small ARM processor handles the switch on the fly, and Logitech claims the handoff adds less than a millisecond of delay. While a gyroscope isn't the most accurate sensor at low speeds, Logitech says it excels during quick bursts of movement, so the G402 uses a hybrid solution that aims to leverage both sensors' strengths to achieve its speed.
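
To make the hybrid idea concrete, here's a hypothetical sketch of a speed-based cutover. Logitech hasn't published its firmware, so the threshold and names below are illustrative placeholders, not the company's actual logic:

    # Hypothetical sketch of a G402-style sensor cutover (illustrative only).
    SWITCH_SPEED_M_PER_S = 2.0  # made-up cutover point, not an official figure

    def fused_motion(optical_delta, imu_delta, speed_m_per_s):
        """Report optical motion at low speed, where camera-based tracking
        is most accurate; hand off to the accelerometer/gyro estimate once
        speed outruns the optical sensor's frame rate."""
        if speed_m_per_s < SWITCH_SPEED_M_PER_S:
            return optical_delta
        return imu_delta

The real trick, per Logitech, is making that handoff in under a millisecond so the cursor never hitches mid-swipe.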

An in-depth interview with Logitech's mouse expert Chris Pate

Logitech G302 Daedalus Prime
While this hybrid sensor seems advantageous for the end user, we were surprised to hear that the company's even newer G302 Daedalus Prime opts instead for a more traditional optical solution. Logitech told us the hybrid setup was left out because the G302 was designed to be a smaller, lighter MOBA mouse, and housing two sensors along with the G402's ARM processor wasn't practical in such a compact form factor. This isn't to say the G302 doesn't have its own element of uniqueness, however.

Logitech says its mice are good for at least 20 million clicks

Because MOBAs like League of Legends and DOTA 2 involve tons of clicking, the Daedalus Prime is largely focused on eliminating the travel between the mouse's buttons and the microswitches that activate commands. The G302 does this by separating the left and right mouse buttons from the body of the mouse (Logitech says most mice use a monolithic design) and resting them directly on top of the microswitches, so there is no air travel between button and switch at all. In place of that travel, Logitech designed a new metal spring tensioning system that sits between the button and the switch. When we asked Logitech if this could add unwanted tension, which could theoretically create microscopic amounts of lag in and of itself, the company assured us that it doesn't -- rather, it aids in a consistent clicking experience.

A Logitech contraption that measures mouse accuracy


Logitech G602
One of the best-selling mice Logitech currently offers is the wireless G602. According to Logitech, when you look at the mouse industry as a whole, wireless mice outsell wired ones. That might not be true for gaming specifically, but with the G602, Logitech worked to overcome many of gamers' fears about going wireless.

The most obvious concern for gamers is lag. According to Logitech, lag on the G602 is imperceptible. The company ran an experiment in which it asked a group of gamers whether they could detect any noticeable lag while using its wireless gaming mouse. They said it felt laggier than a traditional wired mouse. But when Logitech attached a faux cable (which did nothing), the same users said the mouse felt much more responsive -- the placebo effect at play, Logitech asserts. According to Logitech, the G602 is capable of delivering a two-millisecond response time, while most people can only detect latency of four milliseconds and beyond. According to its own studies, some people can't even perceive 40 milliseconds of lag.

Logitech has a special shielded room that blocks all outside wireless signals, which it uses to detect dead zones in its wireless mice.

Logitech claims it could have gotten the G602's response time under two milliseconds, but at the cost of battery life, which is the true obstacle for a wireless gaming mouse. By scaling back to two milliseconds, Logitech says it was able to get much more battery life out of the G602, which it asserts can deliver 250 hours of use on a single charge. How is the company able to achieve those figures? Logitech says it designed the G602 with battery life in mind and created a sensor specifically for wireless gaming. The G602 also uses Logitech's proprietary USB interface. When we asked why it didn't use Bluetooth, the company explained that the response rate of a Bluetooth device is at the mercy of the host (computer). The G602, by contrast, uses a 1,000Hz polling rate over USB.
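
Those figures are easy to sanity-check with simple arithmetic (ours, not Logitech's):

    # Polling rate vs. report interval.
    for hz in (125, 500, 1000):
        print(f"{hz:>5} Hz polling -> one report every {1000 / hz:.0f} ms")
    # At 1,000Hz the host gets a report every 1 ms, which fits comfortably
    # inside the ~2 ms end-to-end response time Logitech quotes for the G602.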

Logitech proving that there is no added acceleration to its mice.

Another interesting thing we learned from Logitech is that no mouse sensor is 100 percent accurate. You might see that claim used to market mice from other vendors, but Logitech asserts such claims are simply false.

Another question we had pertained to laser mice. Several years ago, laser mice were quite popular because they tracked on a wider range of surfaces than optical mice. While laser mice aren't terrible, optical mice have one key advantage over them, and that comes down to accuracy variance, more commonly referred to as "mouse acceleration." Mouse acceleration is undesirable for gaming and generally equates to an inconsistent movement experience. According to Logitech, laser mice exhibit roughly five to six percent variance, making for an inconsistent experience, compared to an optical sensor's roughly one percent.
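
To put those percentages in perspective, here's a rough, illustrative calculation of worst-case cursor error over a single 10cm swipe (our own example; real-world error depends on surface and speed):

    # What tracking variance can mean over a 10 cm swipe.
    swipe_mm = 100.0
    for sensor, variance in (("laser", 0.055), ("optical", 0.01)):
        print(f"{sensor:>7}: up to ~{swipe_mm * variance:.1f} mm of drift per swipe")
    # A 5-6 percent variance can land the cursor ~5 mm from where the same
    # hand motion put it last time; optical keeps that closer to 1 mm.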

One final interesting tidbit that we learned is that many gamers prefer braided cables on their mice, but Logitech's data shows that more pros actually prefer plastic cables as they tend to offer more flexibility. So if you want to play like a pro, you might want to consider ditching the braided cable.

For more pictures and information from the event, check out our image gallery below. 

Intel Teases First NUC Desktop with Core i7 Broadwell CPU

Posted: 27 Jan 2015 09:49 AM PST

New frontier for the NUC

We were intrigued by the NUC's potential when it first came out -- here was a tiny box with fairly respectable hardware inside, powerful enough to serve as a secondary PC or, for the right person, a primary system. There have been several follow-up models since then, but the best is yet to come: Intel has updated its NUC product page with a new model that will be the first to feature a Core i7 processor.

Not a lot of details are available on the Core i7 model (NUC5i7RYH), which is one of several new NUCs based on the chip maker's 5th Generation Core processor (14nm Broadwell) line. According to the listing, it will feature a Core i7 part and 2.5-inch drive support, and should mosey into retail sometime in the second quarter of this year.

The updated NUC site also lists six other Broadwell-based systems, half of them sporting Core i5 processors (one with a Core i5-5300U vPro chip and two with Core i5-5250U CPUs) while the other half come equipped with Core i3 chips (Core i3-5010U).

What they all have in common is support for up to 16GB of RAM, 2.5-inch and M.2 SSD storage support, four USB 3.0 ports, and 802.11ac Wi-Fi. If you need HDMI output, only the Core i5 models will oblige (and potentially the forthcoming Core i7 model).

You can find out more details on each one here.


Crucial Ballistix Elite RAM Now Available in DDR4 Memory Kits

Posted: 27 Jan 2015 09:01 AM PST

Another memory option for Intel X99 platforms

The number of DDR4 memory kits is growing and will continue to do so as more people build (or buy) systems based on Intel's X99 chipset. One of the newest is Crucial's Ballistix Elite line, now available in DDR4 form as a single 4GB module and in 8GB (2x4GB) and 16GB (4x4GB) kits (Crucial says a 32GB kit is also available, though it's not listed on the company's web store yet). Since every configuration uses essentially the same 4GB module, the performance ratings are identical across the board.

Crucial's 4GB DDR4 Ballistix Elite module is rated at DDR4-2666 (PC4-21300), which Crucial calls an "introductory" speed -- we take that to mean there should be some overclocking headroom, especially since the Ballistix Elite series is aimed at "extreme enthusiasts, gamers, and overclockers." The sticks also support Intel XMP 2.0 profiles, feature a custom-designed black PCB with anodized aluminum heat spreaders, and sport 16-17-17 timings at 1.2V.

If you do plan to overclock, you might want to take advantage of Crucial's exclusive Ballistix Memory Overview Display utility, otherwise known as M.O.D. You can use M.O.D. to read information from the modules, including real-time temperatures from the integrated thermal sensor, voltages, and more.

Pricing on Crucial's website breaks down as follows:

  • 4GB Ballistix Elite DDR4-2666: $95
  • 8GB (2x4GB) Ballistix Elite DDR4-2666: $190
  • 16GB (4x4GB) Ballistix Elite DDR4-2666: $380

Newegg also carries the kits, though they're in pre-order form. Pricing looks like this:

  • 4GB Ballistix Elite DDR4-2666: $100 (out of stock)
  • 8GB (single stick) Ballistix Elite DDR4-2666: $220 (releases March 10, 2015)
  • 8GB (2x4GB) Ballistix Elite DDR4-2666: $200 (releases February 6, 2015)
  • 16GB (2x8GB) Ballistix Elite DDR4-2666: $352 (releases March 10, 2015)
  • 16GB (4x4GB) Ballistix Elite DDR4-2666: $380 (releases February 6, 2015)
  • 32GB (4x8GB) Ballistix Elite DDR4-2666: $704 (releases March 10, 2015)

Shipping charges range from $1 to $3, depending on the kit.


EVGA Breeds New Torq X5 and X3 Mice for Gamers

Posted: 27 Jan 2015 08:17 AM PST

Built from the ground up for gaming

Quick, what's the first thing you think of when you hear "EVGA?" Most people would probably say graphics cards, followed by power supplies (or vice versa). Motherboards would also be an acceptable answer, as would Shield. But gaming mice? That's the type of last-place answer that goes unanswered on Family Feud, yet it also describes EVGA's newest products. Specifically, EVGA just announced two new Torq series rodents, the X5 and X3, both designed from scratch for "hardcore gamers."

There are actually four different models -- Torq X3, X3L, X5, and X5L. The "L" denotes a laser sensor, while the non-L models use optical sensors. Here's a better look at how they break down:

  • Torq X5L: Laser 8200 dpi, RGB LED, Omron 20m switches, 1000Hz polling rate
  • Torq X5: Optical 6400 dpi, RGB LED, Omron 20m switches, 1000Hz polling rate
  • Torq X3L: Laser 5000 dpi, RGB LED, Omron 10m switches, 1000Hz polling rate
  • Torq X3: Optical 4000 dpi, red LED, Omron 10m switches, 1000Hz polling rate

All four variants offer five profiles and eight buttons, and are ambidextrous in design, measuring 1.53 (H) by 4.64 (L) by 2.59 (W) inches. They also sport on-the-fly DPI adjustment and work with EVGA's Unleash software.

The Torq X5L ($60), Torq X5 ($50), and Torq X3 ($40) are all available now direct from EVGA; the Torq X3L ($40) is a Best Buy exclusive, and also available now.


Nifty Infographic Explains Inner Workings of a Hard Drive

Posted: 27 Jan 2015 07:47 AM PST

Virtual autopsy of a hard disk drive

You probably already have at least a basic understanding of how a mechanical hard disk drive (HDD) works, but have you ever tried to explain it to someone less savvy? It's a little more difficult than it seems -- there's a lot going on inside a hard drive. This is where infographics can come in handy, and eBuyer just sent us a rather neat one that takes a look at the various parts inside your typical HDD.

Compared to more complex parts like CPUs and GPUs, hard drives are relatively easy to understand and there might not be anything new for you in the infographic. However, if you've taken someone under your wing and recently introduced them to the wonderful world of PCs, this is one of those things you'll want to share with them.

The infographic covers the various internal bits, such as the printed circuit board (PCB), shock mount, actuator, read/write heads, spindle, and so forth. There's also a history lesson sprinkled in.

"They may be getting smaller, thinner, and lighter every year, but that's certainly not how hard disks started out. Back in 1956, IBM's RAMAC 305 system used 50 platters, originally called 'fixed disks' or 'Winchesters', that were 61cm wide and housed in a unit bigger than a pair of fridges!," the infographic explains. "All this just to store a trifling 5MB of data for the inconceivable cost of more than $400,000 in modern dollars."

It also offers up some definitions, such as seek time being the time between the CPU's request for a file and the point at which the first byte is delivered.

Give it a look, and if you know someone that's new to PCs, pass it along.

Click for the full infographic

