General Gaming Article

Best Cheap Graphics Card

Posted: 12 Aug 2014 02:43 PM PDT

Six entry-level cards battle for budget-board bragging rights

The video-card game is a lot like Hollywood. Movies like My Left Foot and The Artist take home the Oscars every year, but movies like Grown Ups 2 and Transformers 3 pull in all the cash. It's the same with GPUs, in that everyone loves to talk about $1,000 cards, but the actual bread-and-butter of the market is made up of models that cost between $100 and $150. These are not GPUs for 4K gaming, obviously, but they can provide a surprisingly pleasant 1080p gaming experience, and run cool and quiet, too.

This arena has been so hot that AMD and Nvidia have recently released no fewer than six cards aimed at budget buyers. Four of these cards are from AMD, and Nvidia launched two models care of its all-new Maxwell architecture, so we decided to pit them against one another in an old-fashioned GPU roundup. All of these cards use either a single six-pin PCIe connector or none at all, so you don't even need a burly power supply to run them, just a little bit of scratch and the desire to get your game on. Let's dive in and see who rules the roost!

Nvidia's Maxwell changes the game

Budget GPUs have always been low-power components, and usually need just a single six-pin PCIe power connector to run them. After all, a budget GPU goes into a budget build, and those PCs typically don't come with the 600W-or-higher power supplies that provide dual six- or eight-pin PCIe connectors. Since many budget PSUs don't have PCIe connectors, most of these cards come with Molex adapters in case you don't have one. The typical thermal design power (TDP) of these cards is around 110 watts, though that number varies from card to card. For comparison, the Radeon R9 290X has a TDP of roughly 300 watts, and Nvidia's flagship card, the GTX 780 Ti, has a TDP of 250W, so these budget cards don't have a lot of juice to work with. Therefore, efficiency is key, as the GPUs need to make the most out of the teeny, tiny bit of wattage they are allotted. In 2012, AMD and Nvidia released GPUs based on all-new 28nm architectures named GCN and Kepler, respectively, and though Nvidia held a decisive advantage in the efficiency battle, it's taken things to the next level with its new ultra-low-power Maxwell GPUs that were released in February 2014.
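To put those numbers in perspective, here's a quick back-of-the-envelope sketch in Python. The 75W figures are the standard limits for a PCIe x16 slot and a single six-pin connector; the TDPs are the ones cited above, and the "typical budget card" figure is the rough 110W estimate, not a measured value.

```python
# Rough power-budget math for budget GPUs (illustrative only).
PCIE_SLOT_W = 75   # max power a PCIe x16 slot can deliver per spec
SIX_PIN_W = 75     # max power from one six-pin PCIe connector per spec

cards = {
    "typical budget card": 110,   # rough estimate from the text above
    "Radeon R9 290X": 300,
    "GeForce GTX 780 Ti": 250,
}

budget = PCIE_SLOT_W + SIX_PIN_W  # 150W ceiling with slot + one six-pin
for name, tdp in cards.items():
    verdict = "fits" if tdp <= budget else "needs more connectors"
    print(f"{name}: {tdp}W TDP vs {budget}W available -- {verdict}")
```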

Beginning with the GTX 750 Ti and the GTX 750, Nvidia is embarking on a whole new course for its GPUs, centered around maximum power efficiency. The goal with the earlier Kepler architecture was better performance per watt than its predecessor, Fermi, and it succeeded; Nvidia has taken that same philosophy even further with Maxwell, which was designed to be twice as efficient as Kepler while providing 25 percent more performance.

Maxwell offers far greater power savings by using more granular clock gating, which allows it to shut down unused graphics units.

Achieving more performance for the same model or SKU from one generation to the next is a tough enough challenge, but to do so while cutting power consumption in half is an even trickier gambit, especially considering the Maxwell GPUs are being fabricated on the same 28nm process Nvidia used for Kepler. We always expect more performance for less power when moving from one process to the next, such as 32nm to 28nm or 22nm to 14nm, but to do so on the same process is an amazing achievement indeed. Though Nvidia used many technological advances to reduce power consumption, the main structural change was to how the individual CUDA cores inside the Graphics Processing Clusters (GPCs) are organized and controlled. In Kepler, each GPC contained individual processing units, named SMX units, and each unit featured a piece of control logic that handled scheduling for 192 CUDA cores, a major increase from the 32 cores per block found in Fermi. In Maxwell, Nvidia has gone back to 32 CUDA cores per block, but puts four blocks into each unit, which are now called SM units. If you're confused, the simple version is this—rather than one piece of logic controlling 192 cores, Maxwell has a piece of logic for each cluster of 32 cores, and there are four clusters per unit, for a total of 128 cores per unit. Nvidia has therefore reduced the number of cores per unit by 64, from 192 to 128, which helps save energy. Also, since each piece of control logic only has to pay attention to 32 cores instead of 192, it can run them more efficiently, which also saves energy.
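To make the reorganization concrete, here is the arithmetic from the paragraph above as a tiny Python sketch; the constant names are ours, not Nvidia's.

```python
# Scheduling granularity: Kepler vs. Maxwell (numbers from the text above).
KEPLER_CORES_PER_SMX = 192        # one scheduler block drives all 192 CUDA cores
MAXWELL_CORES_PER_CLUSTER = 32    # each cluster gets its own control logic
MAXWELL_CLUSTERS_PER_SM = 4

maxwell_cores_per_sm = MAXWELL_CORES_PER_CLUSTER * MAXWELL_CLUSTERS_PER_SM  # 128

print(f"Kepler SMX:  {KEPLER_CORES_PER_SMX} cores per scheduling block")
print(f"Maxwell SM:  {maxwell_cores_per_sm} cores total, split into "
      f"{MAXWELL_CLUSTERS_PER_SM} independently scheduled clusters of "
      f"{MAXWELL_CORES_PER_CLUSTER} cores each")
```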

The benefit to all this energy-saving is the GTX 750 cards don't need external power, so they can be dropped into pretty much any PC on the market without upgrading the power supply. That makes it a great upgrade for any pre-built POS you have lying around the house.

Gigabyte GTX 750 Ti WindForce

Nvidia's new Maxwell cards run surprisingly cool and quiet in stock trim, and that's with a fan no larger than an oversized Ritz cracker, so you can guess what happens when you throw a mid-sized WindForce cooler onto one of them. Yep, it's so quiet and cool you have to check with your fingers to see if it's even running. This bad boy ran at 45 C under load, making it the coolest-running card we've ever tested, so kudos to Nvidia and Gigabyte on holding it down (the temps, that is). This board comes off the factory line with a very mild overclock of just 13MHz (why even bother, seriously), and its boost clock has been massaged up to 1,111MHz from 1,085MHz, but as always, this is just a starting point for your overclocking adventures. The memory is kept at reference speeds, however, running at an effective 5,400MHz. The board sports 2GB of GDDR5 memory and uses a custom design for its blue-colored PCB. It features two 80mm fans and an 8mm copper heat pipe. Most interesting is that the board requires a six-pin PCIe connector, unlike the reference design, which does not.

The WindForce cooler is overkill, but we like it that way.

In testing, the GTX 750 Ti WindForce was neck-and-neck with the Nvidia reference design, proving that Nvidia did a pretty good job with this card, and that its cooling requirements don't really warrant such an outlandish cooler. Still, we'll take it, and we loved that it was totally silent at all times. Overclocking potential is higher, of course, but since the reference design overclocked to 1,270MHz or so, we don't think you should expect moon-shot overclocking records. Still, this card was rock solid, whisper quiet, and extremely cool.

Gigabyte GTX 750 Ti WindForce


Score: 9

$160 (street), www.gigabyte.us

MSI GeForce GTX 750 Gaming

Much like Gigabyte's GTX 750 Ti WindForce card, the MSI GTX 750 Gaming is a low-power board with a massive Twin Frozr cooler attached to it for truly exceptional cooling performance. The only downside is that the formerly waifish GPU has been transformed into a full-size card, measuring more than nine inches long. Unlike the Gigabyte card, though, this GPU eschews the six-pin PCIe connector, as it's just a 55W board, and since the PCIe slot delivers up to 75W, it doesn't even need the juice. Despite this card's entry-level billing, MSI has fitted it with "military-class" components for better overclocking and improved stability. It uses twin heat pipes and dual 100mm fans to keep it cool, as well. It also includes a switch that lets you boot from a backup BIOS in case you run into overclocking issues.

MSI's Twin Frozr cooling apparatus transforms this svelte GPU into a full-sized card.

Speaking of which, this board lives up to its name and has a beefy overclock right out of the box, running at 1,085MHz base clock and 1,163MHz boost clock. It features 1GB of GDDR5 RAM on a 128-bit interface.

The Twin Frozr cooler handles the miniscule amount of heat coming out of this board with aplomb—we were able to press our finger forcibly on the heatsink under load and felt almost no warmth, sort of like when we give Gordon a hug when he arrives at the office. As the only GTX 750 in this test, it showed it could run our entire test suite at decent frame rates, but it traded barbs with the slightly less expensive Radeon R7 260X. On paper, both the GTX 750 and the R7 260X are about $119, but rising prices from either increased demand or low supply have placed both cards in the $150 range, making it a dead heat. Still, it's a very good option for those who want an Nvidia GPU and its ecosystem but can't afford the Ti model.

MSI GeForce GTX 750 Gaming


Score: 8

$140, www.msi.com

Sapphire Radeon R7 265 Dual-X

The Sapphire Radeon R7 265 is the odds-on favorite in this roundup, due to its impressive specs and the fact that it consumes more than twice the power of the Nvidia cards. Sure, it's an unfair advantage, but hate the game, not the player. This board is essentially a rebadged Radeon HD 7850, which is a Pitcairn part, and it slides right in between the $120 R7 260X and the $180ish R7 270. This card actually has the same clock speeds as the R7 270, but features fewer streaming processors for reduced shader performance. It has the same 2GB of memory, same 925MHz boost clock, same 256-bit memory bus, and so on. At 150W, its TDP is very high—or at least it seems high, given that the GTX 750 Ti costs the exact same $150 and is sitting at just 60W. Unlike the lower-priced R7 260X Bonaire part, though, the R7 265 is older silicon and thus does not support TrueAudio and XDMA CrossFire (bridgeless CrossFire, basically). However, it will support the Mantle API, someday.

Sapphire's R7 265 is the third card in this roundup to use a two-fan cooling apparatus.

The Sapphire card delivered the goods in testing, boasting top scores in many benchmarks and coming in as the only GPU in this roundup to hit the magical 60fps in any test, courtesy of a blistering turn in Call of Duty: Ghosts, where it hit 67fps at 1080p on Ultra settings. That's damned impressive, as was its ability to run at 49fps in Battlefield 4, though the GTX 750 Ti was just a few frames behind it. Overall, though, this card cleaned up, taking first place in seven out of nine benchmarks. If that isn't a Kick Ass performance, we don't know what is. The Dual-X cooler kept temps and noise in check, too, making this the go-to GPU for those with small boxes or small monitors.

Sapphire Radeon R7 265 Dual-X


Score: 9 (Kick Ass)

$150 (MSRP), www.sapphiretech.com

AMD Radeon R7 260X

The Radeon R7 260X was originally AMD's go-to card for 1080p gaming on a budget. It's the only card in the company's sub-$200 lineup that supports all the next-gen features that appeared in its Hawaii-based flagship boards, including TrueAudio, XDMA CrossFire, Mantle (as in, it worked at launch), and the ability to drive up to three displays—all from this tiny $120 GPU. Not bad. In its previous life, this GPU was known as the Radeon HD 7790, aka Bonaire, and it was our favorite "budget" GPU when pitted against the Nvidia GTX 650 Ti Boost, thanks to its decent performance and amazing at-the-time game bundles. It features a 128-bit memory bus, 896 Stream Processors, 2GB of RAM (up from 1GB on the previous card), and a healthy boost clock of 1,100MHz. TDP is just 115W, so it slots right in between the Nvidia cards and the higher-end R7 265 board. Essentially, this is an HD 7790 with more RAM and support for TrueAudio, which we have yet to experience.

This $120 card supports Mantle, TrueAudio, and CrossFire.

In testing, the R7 260X delivered passable performance, staking out the middle ground between the faster R7 265 and the much slower R7 250 cards. It ran at about 30fps in tests like Crysis 3 and Tomb Raider, but hit 51fps on CoD: Ghosts and 40fps on Battlefield 4, so it's certainly got enough horsepower to run the latest games on max settings. The fact that it supports all the latest technology from AMD is what bolsters this card's credentials, though. And the fact that it can run Mantle with no problems is a big plus for Battlefield 4 players. We like this card a lot, just like we enjoyed the HD 7790. While it's not the fastest card in the bunch, it's certainly far from the slowest.

AMD Radeon R7 260X


Score: 8

$120, www.amd.com


MSI Radeon R7 250 OC

In every competition, there must be one card that represents the lower end of the spectrum, and in this particular roundup, it's this little guy from MSI. Sure, it's been overclocked a smidge and has a big-boy 2GB of memory, but this GPU is otherwise outgunned, plain and simple. For starters, it has just 384 Stream Processors, which is the lowest number we've ever seen on a modern GPU, so it's already severely handicapped right out of the gate. Board power is a decent 65W, but a glance at the Nvidia GTX 750's specs shows this card is clearly outmatched. One other major problem, at least for those of us with big monitors, is that we couldn't get it to run our desktop at 2560x1600 out of the box, as it has only a single-link DVI connector instead of dual-link. On the plus side, it doesn't require an auxiliary power connector and costs just $100, so it's a very inexpensive board and would make a great upgrade from integrated graphics for someone on a strict budget.

Some R7 250 cards include 1GB of RAM, but this MSI board sports 2GB.

That said, we actually felt bad for this card during testing. The sight of it limping along at 9 frames per second in Heaven 4.0 was tough to watch, and it didn't do much better on our other tests, either. Its best score was in Call of Duty: Ghosts, where it hit a barely playable 22fps. In all of our other tests, it was somewhere between 10 and 20 frames per second on high settings, which is simply not playable. We'd love to say something positive about the card though, so we'll note that it probably runs fine at medium settings and has a lot of great reviews on Newegg from people running at 1280x720 or 1680x1050 resolution.

MSI Radeon R7 250 OC


Score: 6

$90, http://us.msi.com

PowerColor Radeon R7 250X

The PowerColor Radeon R7 250X represents a mild bump in specs from the R7 250, as you would expect given its naming convention. It is outfitted with just 1GB of RAM, however, and a decent 1,000MHz boost clock. It packs 640 Stream Processors, placing it above the regular R7 250 but about mid-pack in this group. Its 1GB of memory runs on the same 128-bit memory bus as most other cards in this roundup, so it's a bit constrained in its memory bandwidth, and we saw the effects of that in our testing. It supports DirectX 11.2, though, and has a dual-link DVI connector. It even supports CrossFire with an APU, but not with another PCIe GPU—or at least that's our understanding of it, since it says it supports CrossFire but doesn't have a connector on top of the card.

The R7 250X is a rebadged HD 7770, made for cash-strapped gamers.

When we put the X card to the test, it fared only a smidgen better than the non-X version. It was able to hit 27 and 28 frames per second in Battlefield 4 and CoD: Ghosts, respectively, and 34fps in Batman: Arkham Origins, but in the rest of the games in our test suite, its performance was simply not what we would call playable. Much like the R7 250 from MSI, this card can't handle 1080p with all settings maxed out, so this GPU is bringing up the rear in this crowd. Since it's priced "under $100," we won't be too harsh on it, as it seems like a fairly solid option for those on a very tight budget, and we'd definitely take it over the vanilla R7 250. We weren't able to see "street" pricing for this card, as it had not been released at press time, but our guess is that even though it's one of the slowest in this bunch, it will likely be the go-to card under $100.

PowerColor Radeon R7 250X


Score: 7

$100, www.powercolor.com

Should you take the red pill or the green pill?

Both companies offer proprietary technologies to lure you into their "ecosystems," so let's take a look at what each has to offer.

Nvidia's Offerings

G-Sync

Nvidia's G-Sync technology is arguably one of the strongest cards in Nvidia's hand, as it eliminates tearing in video games caused by the display's refresh rate being out of sync with the frame rate of the GPU. The silicon syncs the refresh rate with the cycle of frames rendered by the GPU, so movement onscreen looks buttery smooth at all times, even below 30fps. The only downside is you must have a G-Sync monitor, so that limits your selection quite a bit.

Regular driver releases

People love to say Nvidia has "better drivers" than AMD, and though the notion of "better" is debatable, it certainly releases them much more frequently than AMD. That's not to say AMD is a slouch—especially now that it releases a new "beta" build each month—but Nvidia seems to be paying more attention to driver support than AMD.

GeForce Experience and ShadowPlay

Nvidia's GeForce Experience software will automatically optimize any supported games you have installed, and also lets you stream to Twitch as well as capture in-game footage via ShadowPlay. It's a really slick piece of software, and though we don't need a software program to tell us "hey, max out all settings," we do love ShadowPlay.

PhysX

Nvidia's proprietary PhysX software allows game developers to include billowing smoke, exploding particles, cloth simulation, flowing liquids, and more, but there's just one problem—very few games utilize it. Even worse, the ones that do utilize it do so in a way that is simply not that impressive, with one exception: Borderlands 2.

AMD's Offerings

Mantle and TrueAudio

AMD is hoping that Mantle and TrueAudio become the must-have "killer technology" it offers over Nvidia, but at this early stage, it's difficult to say with certainty if that will ever happen. Mantle is a lower-level API that allows developers to optimize a game specifically targeted at AMD hardware, allowing for improved performance.

TressFX

This is proprietary physics technology similar to Nvidia's PhysX in that it only appears in certain games and does very specific things. Thus far, we've only seen it used once—for Lara Croft's hair in Tomb Raider. Instead of a blocky ponytail, her mane is flowing and gets blown around by the wind. It looks cool but is by no means a must-have item on your shopping list, just like Nvidia's PhysX.

Gaming Evolved by Raptr

This software package is for Radeon users only, and does several things. First, it will automatically optimize supported games you have installed, and it also connects you to a huge community of gamers across all platforms, including PC and console. You can see who is playing what, track achievements, chat with friends, and also broadcast to Twitch.tv, too. AMD also has a "rewards" program that doles out points for using the software, and you can exchange those points for gear, games, swag, and more.

Currency mining

AMD cards are better for currency mining than Nvidia cards, and their dominance here is not in question. The most basic reason is that the algorithms used in currency mining favor the GCN architecture, so much so that AMD cards are often up to five times faster at these operations than their Nvidia equivalents. In fact, the mining craze has pushed demand for these cards so high that there's now a supply shortage.

All the cards, side by side

Specifications
Card | MSI GeForce GTX 750 Gaming | Gigabyte GeForce GTX 750 Ti | GeForce GTX 650 Ti Boost* | GeForce GTX 660* | MSI Radeon R7 250 | PowerColor Radeon R7 250X | AMD Radeon R7 260X | Sapphire Radeon R7 265
Price | $120 | $150 | $160 | $210 | $90 | $100 | $120 | $150
Code-name | Maxwell | Maxwell | Kepler | Kepler | Oland | Cape Verde | Bonaire | Curacao
Processing cores | 512 | 640 | 768 | 960 | 384 | 640 | 896 | 1,024
ROP units | 16 | 16 | 24 | 24 | 8 | 16 | 16 | 32
Texture units | 32 | 40 | 64 | 80 | 24 | 40 | 56 | 64
Memory | 2GB | 2GB | 2GB | 2GB | 1GB | 1GB | 2GB | 2GB
Memory speed | 1,350MHz | 1,350MHz | 1,500MHz | 1,500MHz | 1,500MHz | 1,125MHz | 1,500MHz | 1,400MHz
Memory bus | 128-bit | 128-bit | 192-bit | 192-bit | 128-bit | 128-bit | 128-bit | 256-bit
Base clock | 1,020MHz | 1,020MHz | 980MHz | 980MHz | N/A | N/A | N/A | N/A
Boost clock | 1,085MHz | 1,085MHz | 1,033MHz | 1,033MHz | 1,050MHz | 1,000MHz | 1,000MHz | 925MHz
PCI Express version | 3 | 3 | 3 | 3 | 3 | 3 | 3 | 3
Transistor count | 1.87 billion | 1.87 billion | 2.54 billion | 2.54 billion | 1.04 billion | 1.04 billion | 2.08 billion | 2.8 billion
Power connectors | N/A | N/A | 1x six-pin | 1x six-pin | N/A | 1x six-pin | 1x six-pin | 1x six-pin
TDP | 54W | 60W | 134W | 140W | 65W | 80W | 115W | 150W
Fab process | 28nm | 28nm | 28nm | 28nm | 28nm | 28nm | 28nm | 28nm
Multi-card support | No | No | Yes | Yes | No | Yes | Yes | Yes
Outputs | DVI, VGA, HDMI | 2x DVI, 2x HDMI | 2x DVI, HDMI, DisplayPort | 2x DVI, HDMI, DisplayPort | DVI-S, VGA, HDMI | DVI, VGA, HDMI | 2x DVI, HDMI, DisplayPort | 2x DVI, HDMI, DisplayPort

* Provided for reference purposes.
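If you want to turn the memory columns above into something directly comparable, peak memory bandwidth is just the effective data rate times the bus width. Here's a rough sketch, assuming the usual 4x effective multiplier for GDDR5 on the clocks listed in the table:

```python
# Peak memory bandwidth = effective rate (MT/s) * bus width (bytes), in GB/s.
# Assumes GDDR5's typical 4x effective multiplier on the listed memory clock.
def bandwidth_gbs(mem_clock_mhz, bus_bits, multiplier=4):
    return mem_clock_mhz * multiplier * (bus_bits // 8) / 1000

print(bandwidth_gbs(1350, 128))  # GTX 750 / 750 Ti: ~86.4 GB/s
print(bandwidth_gbs(1500, 128))  # Radeon R7 260X:   ~96.0 GB/s
print(bandwidth_gbs(1400, 256))  # Radeon R7 265:    ~179.2 GB/s
```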

How we tested

We lowered our requirements, but not too much

We normally test all of our video cards on our standardized test bed, which has now been in operation for a year and a half, with only a few changes along the way. In fact, the only major change we've made to it in the last year was swapping the X79 motherboard and case. The motherboard had endured several hundred video-card insertions, which is well beyond the design specs. The case had also become bent to the point where the video cards were drooping slightly. Some, shall we say, "overzealous" overclocking also caused the motherboard to begin behaving unpredictably. Regardless, it's a top-tier rig with an Intel Core i7-3960X Extreme processor, 16GB of DDR3 memory, an Asus Rampage IV Extreme motherboard, Crucial M500 SSD, and Windows 8 64-bit Enterprise.

For the AMD video cards, we loaded Catalyst driver 14.1 Beta 1.6, as that was the latest driver, and for the Nvidia cards, we used the 334.89 WHQL driver that was released just before testing began. We originally planned to run the cards at our normal "midrange GPU" settings, which is 1920x1080 resolution with maximum settings and 4X AA enabled, but after testing began, we realized we needed to back off those settings just a tad. Instead of dialing it down to medium settings, though, as that would run counter to everything we stand for as a magazine, we left the settings on "high" across the board, but disabled AA. These settings were a bit much for the lower-end cards, but rather than lower our settings once again, we decided to stand fast at 1080p with high settings, since we figured that's where you want to be gaming and you deserve to know if some of the less-expensive cards can handle that type of action.

Mantle Reviewed

A word about AMD's Mantle API

AMD's Mantle API is a competitor to DirectX, optimized specifically for AMD's GCN hardware. In theory, it should allow for better performance since its code knows exactly what hardware it's talking to, as opposed to DirectX's "works with any card" approach. The Mantle API should be able to give all GCN 1.0 and later AMD cards quite a boost in games that support it. However, AMD points out that Mantle will only show benefits in scenarios that are CPU-bound, not GPU-bound, so if your GPU is already working as hard as it can, Mantle isn't going to help it. However, if your GPU is always waiting around for instructions from an overloaded CPU, then Mantle can offer respectable gains.

To test it out, we ran Battlefield 4 on an older Ivy Bridge quad-core, non-Hyper-Threaded Core i5-3470 test bench with the R7 260X GPU at 1920x1080 and 4X AA enabled. As of press time, there are only two games that support Mantle—Battlefield 4 and an RTS demo on Steam named Star Swarm. In Battlefield 4, we were able to achieve 36fps using DirectX, and 44fps using Mantle, which is a healthy increase and a very respectable showing for a $120 video card. The benefit was much smaller in Star Swarm, however, showing a negligible increase of just two frames per second.

Enabling Mantle in Battlefield 4 does provide performance boosts for most configs.

We then moved to a much beefier test bench running a six-core, Hyper-Threaded Core i7-3960X and a Radeon R9 290X, and we saw an increase in Star Swarm of 100 percent, going from 25fps with DirectX to 51fps using Mantle in a timed demo. We got a decent bump in Battlefield 4, too, going from 84 fps using DirectX to 98 fps in Mantle.
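For reference, here's how those DirectX-to-Mantle gains work out in percentage terms, using the frame rates quoted above; this is just arithmetic on our published numbers, not additional test data.

```python
# Percent improvement from switching DirectX -> Mantle (fps pairs from our tests).
results = {
    "Battlefield 4, Core i5-3470 + R7 260X": (36, 44),
    "Battlefield 4, Core i7-3960X + R9 290X": (84, 98),
    "Star Swarm, Core i7-3960X + R9 290X": (25, 51),
}

for config, (dx_fps, mantle_fps) in results.items():
    gain = (mantle_fps - dx_fps) / dx_fps * 100
    print(f"{config}: {dx_fps} -> {mantle_fps} fps ({gain:.0f}% faster)")
```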

Overall, Mantle is legit, but it's kind of like PhysX or TressFX in that it's nice to have when it's supported, and does provide a boost, but it isn't something we'd count on being available in most games.

Final Thoughts

If cost is an issue, you've got options

Testing the cards for this feature was an enlightening experience. We don't usually dabble in GPU waters that are this shallow, so we really had no idea what to expect from all the cards assembled. To be honest, if we were given a shot of sodium pentothal, we'd have to admit that given these cards' price points, we had low expectations but thought they'd all at least be able to handle 1920x1080 gaming. As spoiled gamers used to running 2K or 4K resolution, 1080p seems like child's play to us. But we found out that running that resolution at maximum settings is a bridge too far for any GPU that costs less than $120 or so. The $150 models are the sweet spot, though, and are able to game extremely well at 1080 resolution, meaning the barrier to entry for "sweet gaming" has been lowered by $100, thanks to these new GPUs from AMD and Nvidia.

Therefore, the summary of our results is that if you have $150 to spend on a GPU, you should buy the Sapphire Radeon R7 265, as it's the best card for gaming at this price point, end of discussion. OK, thanks for reading.

Oh, are you still here? OK, here's some more detail. In our testing, the Sapphire R7 265 was hands-down the fastest GPU at its price point—by a non-trivial margin in many tests—and is superior to the GTX 750 Ti from Nvidia. It was also the only GPU to come close to the magical 60fps we desire in every game, making it pretty much the only card in this crowd that came close to satisfying our sky-high demands. The Nvidia GTX 750 Ti card was a close second, though, and provides a totally passable experience at 1080p with all settings maxed. Nvidia's trump card is that it consumes less than half the power of the R7 265 and runs 10 C cooler, but we doubt most gamers will care except in severely PSU-constrained systems.

Moving down one notch to the $120 cards, the GTX 750 and R7 260X trade blows quite well, so there's no clear winner. Pick your ecosystem and get your game on, because these cards are totally decent, and delivered playable frame rates in every test we ran.

The bottom rung, consisting of the R7 250 and R7 250X, was not playable at 1080p at max settings, so avoid those cards. They are probably good for 1680x1050 gaming at medium settings or something in that ballpark, but in our world, that is a no-man's land filled with shattered dreams and sad pixels.

Benchmarks
Benchmark | Nvidia GTX 750 Ti (reference) | Gigabyte GTX 750 Ti | MSI GTX 750 Gaming | Sapphire Radeon R7 265 | AMD Radeon R7 260X | PowerColor Radeon R7 250X | MSI Radeon R7 250 OC
Driver | 334.89 | 334.89 | 334.89 | 14.1 v1.6 | 14.1 v1.6 | 14.1 v1.6 | 14.1 v1.6
3DMark Fire Strike | 3,960 | 3,974 | 3,558 | 4,686 | 3,832 | 2,806 | 1,524
Unigine Heaven 4.0 (fps) | 30 | 30 | 25 | 29 | 23 | 17 | 9
Crysis 3 (fps) | 27 | 25 | 21 | 32 | 26 | 16 | 10
Far Cry 3 (fps) | 40 | 40 | 34 | 40 | 34 | 16 | 14
Tomb Raider (fps) | 30 | 30 | 26 | 36 | 31 | 20 | 12
CoD: Ghosts (fps) | 51 | 49 | 42 | 67 | 51 | 28 | 22
Battlefield 4 (fps) | 45 | 45 | 32 | 49 | 40 | 27 | 14
Batman: Arkham Origins (fps) | 74 | 71 | 61 | 55 | 43 | 34 | 18
Assassin's Creed: Black Flag (fps) | 33 | 33 | 29 | 39 | 21 | 21 | 14

Best scores are bolded. Our test bed is a 3.33GHz Core i7-3960X Extreme Edition in an Asus Rampage IV Extreme motherboard with 16GB of DDR3/1600 and a Thermaltake ToughPower 1,050W PSU. The OS is 64-bit Windows 8. All games are run at 1920x1080 with no AA except for the 3DMark tests.

Nvidia Tegra K1 Claims Fame as First 64-Bit ARM Chip for Android

Posted: 12 Aug 2014 10:32 AM PDT

Android enters the 64-bit ARM era

Say hello to "Denver," the codename for Nvidia's 64-bit Tegra K1 System-on-Chip (SoC), which also happens to be the first 64-bit ARM processor for Android. The new version of Nvidia's Tegra K1 SoC pairs the company's Kepler architecture-based GPU with its own custom-designed, 64-bit, dual-core "Project Denver" CPU, which Nvidia says is fully ARMv8 architecture compatible.

So, what's special about this chip besides a 64-bit instruction set? Nvidia designed Denver to offer the highest single-core CPU throughput and industry-leading dual-core performance. Each Denver core (and there are two) sports a 7-way superscalar microarchitecture and includes a 128KB 4-way L1 instruction cache, a 64KB 4-way L1 data cache, and a 2MB 16-way L2 cache that services both cores.

Using a process called Dynamic Code Optimization, Denver optimizes frequently used software routines at runtime into dense, highly tuned microcode-equivalent routines stored in a dedicated 128MB main-memory based optimization cache. This allows for faster access and execution, which translates into faster performance, in part because it lessens the need to re-optimize the software routine.
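As a rough mental model only, and emphatically not Nvidia's actual implementation, the hot-routine promotion idea can be sketched like this; the threshold and function names are invented for illustration.

```python
# Conceptual sketch of a dynamic-optimization cache: run a routine in its
# plain form until it proves hot, then store and reuse an optimized version.
OPT_THRESHOLD = 1000   # invented promotion threshold, purely illustrative
optimized = {}         # stands in for the main-memory optimization cache
hits = {}

def run(name, plain_fn, optimize_fn):
    """Execute a routine, promoting it to the optimization cache once hot."""
    if name in optimized:
        return optimized[name]()                 # fast path: reuse tuned routine
    hits[name] = hits.get(name, 0) + 1
    if hits[name] >= OPT_THRESHOLD:
        optimized[name] = optimize_fn(plain_fn)  # translate once, reuse afterward
    return plain_fn()                            # slow path: unoptimized routine
```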

Denver will also benefit Android platforms with new low latency power-state transitions. This is in addition to extensive power-gating and dynamic voltage and clock scaling routines based on workloads. The end result is more efficient power usage, which allows Denver's performance to rival even some mainstream PC-class CPUs at significantly reduced power consumption, Nvidia says.

If you want to dig even further into the architecture, you can get more details here.

Follow Paul on Google+, Twitter, and Facebook

Gartner Predicts 5.2 Million Chromebook Sales in 2014

Posted: 12 Aug 2014 10:00 AM PDT

Chromebooks are big in the education sector

We've pointed out before how Chromebooks are some of the best-selling laptops on Amazon, and though these cloud-based systems aren't as capable as their Windows-based counterparts, they're having no trouble finding an audience, particularly in education circles. In fact, market research firm Gartner forecasts 5.2 million Chromebook sales by the end of the year, which would translate into a 79 percent jump compared to 2013.

That's just the tip of the cloud -- by 2017, Gartner sees Chromebook sales nearly tripling to 14.4 million units. Not too shabby for what some power users consider a glorified web browsing machine, though to be fair, Chromebooks are capable of much more than just surfing the Internet.

"Competition in the Chromebook market is intensifying as more vendors launch Chromebooks, with eight models in the market in 2014," said Isabelle Durand, principal analyst at Gartner. "Now that the PC market is no longer growing strongly, vendors are searching for new business opportunities. They launched Chromebooks to revive interest in sub-$300 portable PCs once the netbook bubble had burst."

Unlike netbooks, however, Chromebooks are most prevalent in the education sector, which accounted for nearly 85 percent of Chromebook sales last year. And of the 2.9 million Chromebooks sold during 2013, 82 percent were sold in North America, Gartner says.

Despite just 15 percent of Chromebooks sold in 2013 landing outside the education sector, Gartner insists they will carve out a place in businesses for specific workers, such as staff in banking, hotel receptions, and so forth.

"So far, businesses have looked at Chromebooks, but not bought many," Durand added. "By adopting Chromebooks and cloud computing, businesses can benefit; they can shift their focus from managing devices to managing something much more important — their data."

Follow Paul on Google+, Twitter, and Facebook

Maingear's Nvidia Battlebox Titan Z PC Line Starts at $2,999

Posted: 12 Aug 2014 09:04 AM PDT

Undercutting the competition

Nvidia has partnered with various system builders to equip their current rigs with at least one GeForce Titan Z graphics card and rebrand them as Battleboxes, Maingear being one of them. Unlike ones we've already seen, however, Maingear's Battlebox Titan Z PCs are a little more cost friendly, with the least expensive model (Vybe) starting at $2,999. Battlebox configurations are also available on Maingear's F131 and Shift starting at $3,199 and $3,499, respectively.

"The new Maingear Battlebox Titan Z PCs offer a great value with optimal performance with the Titan Z card, if someone has been waiting to get PC their Titan Z pc, any of these PC are great options," said Wallace Santos, CEO and founder of Maingear.

As the Titan Z is capable of gaming at a 4K resolution, you can add an Asus 28-inch 4K monitor to each configuration. Doing so would bring the starting price up to $3,698 for the Vybe, $3,898 for the F131, and $4,198 for the Shift. With or without the monitor, each system comes with an extended two year warranty.

Bear in mind that these are starting prices. Plenty of upgrades are available for each model. You can check them all out here.

Follow Paul on Google+, Twitter, and Facebook

Alienware Alpha PC Game Console is Now Available to Pre-Order in the U.S.

Posted: 12 Aug 2014 07:49 AM PDT

Configurations start at $549

In an alternate reality, Dell is launching its Alienware Alpha PC game console as its official Steam Machine. In this world, however, Valve threw a wrench into every OEM's plans by delaying the launch of its Steam Machine platform until next year, as it wanted more time to tweak its Steam Controller. Fair enough, though not all system builders are willing to put their PC console plans on hold. Enter Alienware, which today announced it's taking pre-orders for its Alpha console.

"Gamers can now secure their gateway to the entire Steam library on a system that was engineered to provide an immersive PC gaming experience, custom-tailored for the living room," Alienware said. "The Alienware Alpha merges the open ecosystem and flexibility of PC gaming with the ease-of-use and intuitive interface of consoles. This enables gamers to choose what and how they want to play, whether it's a competitive online FPS with the bundled Microsoft Xbox 360 wireless controller for Windows, or having their friends bring their controllers of choice for a fragfest in the newest indie side scroller."

Systems start at $549, though you can configure a higher priced machine with more potent hardware. That includes the availability of Intel Core i7 processor options, 8GB of RAM, 1TB of storage, 802.11ac Wi-Fi, and Nvidia Maxwell-based GPU options. All systems come with a Microsoft Xbox 360 controller to boot.

As an added bonus, Alienware Alpha machines will also come bundled with Payday 2, Magicka, Magicka: Dungeons and Daemons DLC, and Gauntlet Helm.

Alienware Alpha systems will ship in November, just as previously promised.

Follow Paul on Google+, Twitter, and Facebook

Lenovo Launches Professional Grade ThinkStation P Series Workstations

Posted: 12 Aug 2014 06:16 AM PDT

Rebuilt from the ground up

Lenovo is using its time at SIGGRAPH 2014 in Vancouver, British Columbia to launch its new ThinkStation P Series of desktop workstations. The new Lenovo P900, P700, and P500 workstations join the entry-level ThinkStation P300 announced in May of this year and have been completely retooled compared to Lenovo's previous generation of workstations. According to Victor Rios, Vice President and General Manager of Lenovo's Workstation division, these are the "best designed" workstations the company has ever built.

"We built on the advancements delivered in our 30 Series workstations and engineered the P Series with new levels of innovation based on extensive customer feedback," Rios said in a statement. "The result for users is optimum performance, outstanding reliability, and unparalleled usability that is unlike anything they have experienced."

The P-Series boasts tool-less upgrades, intuitive red touch points that guide users' hands to quick and easy component changes, integrated handles, QR codes, and a diagnostic USB port that allows you to plug in an Android-based tablet or smartphone for system analysis.

On the ultra high-end, the ThinkStation P900 comes with a 1,300W power supply, Intel Xeon processor options with support for dual CPUs, up to 512GB of DDR4 memory spread across 16 DIMM slots, up to 14 storage devices (8 internal + 2 external + 4 M.2), up to four GPUs, support for 12Gbps SAS and up to 32Gbps PCIe, 10 expansion slots, FireWire, four USB 3.0 ports, a serial port, up to a 29-in-1 card reader, RAID support, and dual GbE ports.

The P700 Series is slightly subdued compared to the P900 with up to an 850W PSU, up to 384GB of DDR4 memory (12 DIMM slots), up to 12 storage devices, up to 3 GPUs, and 7 expansion slots.

Slightly lower is the P500, which is similar to the P700 but with 8 DIMM slots supporting up to 256GB of DDR4 memory, single Xeon processor support, room for 11 storage devices, support for two GPUs, and a single GbE port.

Lenovo says its new P Series systems will be available later this fall. No word yet on price.

Follow Paul on Google+, Twitter, and Facebook

Newegg Daily Deals: Asus Z97-C Motherboard, G.Skill Ares Series 8GB (2x4GB) DDR3-1600, and More!

Posted: 12 Aug 2014 06:11 AM PDT

Top Deal:

Talk is cheap, and if you've been talking about building a new system for several months now, you'll have to bite the bullet and part with some green, though perhaps not as much as you think. To build a respectable system without spending a fortune, keep checking Newegg's Daily Deals, including today's for an Asus Z97-C LGA 1150 Motherboard for $120 with free shipping (normally $140 - use coupon code: [EMCPBPC38]). It has four DIMM slots with support for up to 32GB of DDR3-3200 (OC, of course), four SATA 6Gbps ports, a single M.2 port, USB 3.0 connectivity, and plenty more features.

Other Deals:

G.Skill Ares Series 8GB (2 x 4GB) 240-Pin DDR3 1600 (PC3 12800) Desktop Memory for $77 with free shipping (normally $85 - use coupon code: [EMCPBPC42])

LG 27-inch 5ms HDMI Widescreen LED Backlight LCD Monitor IPS 200 for $200 with free shipping

OCZ Technology 120GB Vertex 460 2.5-inch SSD for $80 with free shipping

Intel Core i5-3570K Ivy Bridge Quad-Core 3.4GHz (3.8GHz Turbo) LGA 1155 77W Desktop Processor for $230 with free shipping

MMORPG News

ArcheAge: Looking Forward - Korean Patch 1.7

Posted: 10 Aug 2014 05:58 PM PDT

Looking Forward - Korean Patch 1.7

There are no brakes on the ArcheAge update train. XLGAMES released Patch 1.7 for the Korean version of ArcheAge last month and the features it added are pretty exciting. Let's take a look at the changes.

Heroes of the Storm: Blizzard Headed to PAX Prime

Posted: 10 Aug 2014 05:02 PM PDT

Blizzard Headed to PAX Prime

Fans of Blizzard's Heroes of the Storm who will be attending this year's PAX Prime will be able to get in on some firsthand action as the game will be taking center stage at the convention.

General: Elgato Game Capture Immortalizes Your Play

Posted: 11 Aug 2014 04:57 PM PDT

Elgato Game Capture Immortalizes Your Play

Elgato has announced the release of its new gameplay recording device called "Game Capture". With Game Capture, players can record and stream both Xbox and PlayStation gaming action.

Guild Wars 2: How Can ArenaNet Get Ex-Players Back Into The Game?

Posted: 09 Aug 2014 07:35 PM PDT

How Can ArenaNet Get Ex-Players Back Into The Game?

There are two types of MMO fans that companies make significant efforts to satisfy. Obviously, players who are currently active in the game get attention, while people who have never tried the game are the main targets of advertising and promotional campaigns.

War Thunder: Battle Rating Explained by the Development Team

Posted: 11 Aug 2014 10:54 AM PDT

Battle Rating Explained by the Development Team

Gaijin's developers have started a new series of videos where they cull questions from the official War Thunder site, social media, and Reddit. In this first video, the team takes on the question of Battle Rating, explaining it in very close detail. Check it out!

General: Dead Island 2: First Trailer Released with Gameplay Footage

Posted: 11 Aug 2014 10:45 AM PDT

Dead Island 2: First Trailer Released with Gameplay Footage

Deep Silver & Yager have released the first ever game play footage trailer from the upcoming Dead Island 2. The trailer features pre-alpha footage and takes place in sunny California. Be warned: It is packed with Dead Island's trademarked bloody action. Be forewarned! Check it out and let us know what you think!

World of Speed: Gamescom Trailer Shows Off Teamwork

Posted: 11 Aug 2014 10:37 AM PDT

Gamescom Trailer Shows Off Teamwork

Slightly Mad Studios has released the Gamescom 2014 trailer for World of Speed. In it, the team emphasizes team-based gameplay and the acquisition and completion of objectives. Check it out!

Divergence Online: More Customizable Than Ever

Posted: 11 Aug 2014 10:19 AM PDT

More Customizable Than Ever

We have received word from the Divergence Online team that players are being invited to come check out the overhauled character system in the game. The team is excited to let players make characters in any way they wish: "Gone are the days of picking a body type and choosing one face from a dozen. You can mold virtually everything about your character you wish."

General: Draenor Cinematic and Date

Posted: 09 Aug 2014 07:22 PM PDT

Draenor Cinematic and Date

This week we talk about the latest earnings call and see what's in store for WoW's 10th anniversary.
