General Gaming Article

MSI Radeon R9 270 Gaming OC Review

Posted: 30 Jul 2014 03:39 PM PDT

No surprises here, just a solid 1080p card

MSI offers two flavors of the midrange Radeon R9 270 GPU, formerly known as the Radeon HD 7870 GHz Edition: a standard model and one with an "X" after its name. The X model has slightly higher core and boost clocks, but otherwise the two cards are the same, both based on AMD's Pitcairn GCN core, a 28nm part that debuted in 2012.

Don't bother with the R9 270X—the non-X version shown here is just fine.

The card in front of you is the MSI R9 270 Gaming model, which is a stock R9 270 with a mild overclock, hence the word "Gaming" in its moniker. It has an MSRP of $180, while the X version is roughly $20 more, though street prices are higher due to the mining craze and short supply. For those who are prone to guffawing at a card that is merely rebadged and price-dropped, this is par for the course and actually good news for gamers. That's because both Nvidia and AMD refine their manufacturing processes over time, so by the time a GPU gets a rebadge, it's often able to run at higher clocks with better efficiency for a much lower price. The bottom line is that this card once had a $350 price tag and now costs less than $200, so there's very little to complain about.

To rehash the specs, this is a card with a base clock of 900MHz and a boost clock of 975MHz, which is 50MHz higher than the reference board. It has 2GB of GDDR5 memory that runs at 5.6GHz, and 1,280 stream processors. Since this is not new silicon, the card does not offer support for TrueAudio, but as it's a Graphics Core Next (GCN) card, it does support AMD's new Mantle API (at press time, BF4 was not optimized for Mantle with the R9 270, but AMD said it's "being investigated"). As a midrange GPU, the R9 270 has a low-ish TDP of 150W, and therefore requires only a single six-pin PCIe connector for power—an advantage over the 270X, which requires two six-pin connectors. Interestingly, the R9 270 doesn't have a direct competitor from Nvidia since it costs just a bit over $200, so it sits right in between the $250 GTX 760 and the $150 GTX 650 Ti (the Ti Boost is out of stock everywhere, but costs around $175). The GTX 660 is about the same price, but that card is ancient, so we compared the R9 270 to the more-expensive GTX 760.

Overall, we had a pleasant testing experience with the MSI R9 270 card. It was quiet and cool—never getting hotter than 60 C—and was totally stable. It ran the grueling new Star Swarm demo over a weekend with nary a hiccup, and we were also able to overclock it to a 1,140MHz boost clock, which netted a 10 percent bump in performance. Basically, we found its performance exactly in line with its price, in that it was a bit slower than the more-expensive GTX 760 in all the games we tested aside from Tomb Raider, which is an AMD game.

In the end, there's nothing wrong with the MSI R9 270 Gaming OC and we have no problem recommending it. However, we'd still go with the GTX 760 just because it is quite a bit faster in many games, and only costs $30 more. If Mantle support is important to you, though, feel free to pull the trigger.

$220 (street), www.msi.com

Note: This review was originally featured in the April 2014 issue of the magazine.

D-Link DGL-5500 Review

Posted: 30 Jul 2014 03:22 PM PDT

A router built specifically for gamers

When it comes to PC parts and accessories, all roads eventually lead to gamers. Intel and AMD both sell unlocked processors so gamers can more easily overclock their rigs for a few extra frames per second; pro gamer Johnathan "Fatal1ty" Wendel has endorsed everything from motherboards to power supplies; there's gaming RAM; and of course, a whole assortment of accessories designed to give you an edge when smoking your friends on the virtual battlefield. Up until now, one of the few items missing from the list was an 802.11ac wireless router.

The new Mac Pro stole its design from this router—true story.

D-Link gets credit for tying up that loose end with the DGL-5500, a dual-band AC1300 wireless router built specifically for gamers. What makes the DGL-5500 different from all the other 802.11ac models, including D-Link's own DIR-868L (reviewed in our February issue), is the inclusion of Qualcomm's StreamBoost technology.

Whereas the majority of modern routers rely on simple quality of service (QoS) rules to prioritize network packets, StreamBoost examines what applications are running and how much actual bandwidth each one needs. It also manages latency because a laggy gamer is a dead gamer. The question is, does it work as advertised?

For the most part, StreamBoost lives up to the hype. We consistently saw lower pings in online games when connected to the DGL-5500 versus our zero-point router, the Asus RT-AC66U. External factors beyond our control also affect ping, so it's hard to offer an apples-to-apples comparison, but to give one example, our ping averaged around 42ms in Battlefield 4 when using Asus's router. When switching to D-Link's model and turning on StreamBoost, our pings hovered around 19ms. After firing up Netflix on a second PC and initiating file downloads on two other systems, the ping stayed around 22–24ms—that's impressive.
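
If you want to run this kind of before-and-after comparison on your own network, a latency logger is easy to script. Below is a minimal sketch in Python (our illustration, not the tooling used for this review); it times TCP connections to a server of your choosing, which tracks round-trip latency closely enough for router-versus-router comparisons.

```python
# Minimal latency logger (illustrative sketch; the host is a placeholder).
# TCP connect time approximates round-trip latency well enough to compare
# two routers under the same load.
import socket
import statistics
import time

def rtt_ms(host: str, port: int = 443) -> float:
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=2):
        pass  # we only care how long the handshake took
    return (time.perf_counter() - start) * 1000

samples = [rtt_ms("example.com") for _ in range(20)]
print(f"avg {statistics.mean(samples):.1f}ms, min {min(samples):.1f}ms")
```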

In our evaluation of D-Link's DIR-868L, we said the fugly web-based interface could use a major overhaul, and that's what we got with the DGL-5500. It's much better looking than before and far less complicated to navigate, though it's also painfully slow when switching between menus. The UI is also heavily biased toward StreamBoost—if you disable the feature, you lose access to the My Network map, which provides a graphical view of all active devices and how much bandwidth each one is consuming.

The DGL-5500 outpaced our zero-point router in 802.11n mode on the 2.4GHz band in our three indoor tests. It also did better at picking out uncluttered channels on its own—we use inSSIDer ($20, www.inssider.com) to identify the best channel(s) for testing. However, the RT-AC66U boasts better range and faster transfers in 802.11ac mode on the 5GHz band. It's worth pointing out that the DGL-5500 lacks beamforming, which concentrates the wireless signal at connected devices for longer range and better speeds.

There are other shortcomings, as well—you can't configure Guest networks, the single USB 2.0 port doesn't support printer sharing, and the combined speed of both channels is capped at AC1300 rather than AC1750 as it is with D-Link's DIR-868L. While StreamBoost is a step forward, the router is a step backward in other areas. Note to D-Link: Gamers care about this stuff, too.

$140 (street), www.d-link.com

Ask the Doctor: Too Much GPU, WiFi Upgrades, Disabling SkyDrive, and more

Posted: 30 Jul 2014 12:46 PM PDT

The Doctor tackles too much GPU, Wi-Fi upgrades, disabling SkyDrive, and more

From Integrated to Top-Shelf

After almost 30 years developing software on stock PCs, I finally performed my first build from the pages of Maximum PC. I scoured many issues, planned a build during a long weekend, and it's been purring along for 18 months.

I have a Core i5-3570K on an Asus P8Z77-V board, with 16GB RAM, two 128GB SSDs, a 3TB backup drive, and an 850W PSU in an NZXT Phantom 410 chassis. Now I'm thinking of adding a graphics card. I don't do a lot with graphics and so I've managed with on-board, but I might do more. The 780 Ti sounds very cool. Will it work well in this system? Will overall performance improve? Apart from a Hyper 212 CPU cooler, I'm only using the Phantom's stock fans… will I need more cooling?

- David Kates

The Doctor Responds:

Yes, performance will certainly improve—that's one of the best graphics cards on the market, period, and it's going to be faster than what you get with your integrated graphics by a factor of four or more. But if you're not gaming or doing much graphically intensive work—and since you've gotten along just fine with integrated graphics until now, that seems to be the case—then the Doc thinks the 780 Ti might be a little bit overkill.

The 780 Ti is a great card, but may be overkill for some. 

There's nothing wrong with overkill, and your rig can certainly handle the 780 Ti, but if you're just getting into applications that need more graphical oomph, the Doc suggests starting with something cheaper, like the GTX 760. It's $250, or roughly a third the price of the 780 Ti, but it will give you enough headroom for gaming on high settings on a 1080p panel. If you're not going to be gaming and just want a little more graphical muscle for everyday tasks, you'll be fine with something even less expensive.

Hard Drive Recognition Issues

I recently bought a WD Green 3TB hard drive for backup purposes. I have it in a Thermaltake BlacX USB 3.0 cradle outside of the PC. I have tried a hundred different things I found online, from different partitioning software to updating drivers and the BIOS, but I can't get the PC to recognize any more than 746GB. From what I saw online, lots of others are having the same problem. I run a Phenom II X6 1090T in an MSI 890FXA-GD65 motherboard with BIOS version 18.9, a Radeon HD 7970, and 16GB of RAM. Any ideas would be appreciated.

- Gayle Curry

The Doctor Responds:

It looks like the problem is the Thermaltake BlacX cradle not correctly supporting Advanced Format drives like the WD Green series, which use 4KB sectors instead of 512B sectors. Depending on the model, Thermaltake lists its BlacX docks as supporting "up to 2TB" or "up to 4TB" drives, but it depends on the USB controller inside the cradle and its firmware. Depending on your particular model, you may be able to update the firmware for the USB controller, but Thermaltake's website is a little wonky—we couldn't find firmware update tools for the BlacX 5G, for example.
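
As for why Windows reports 746GB specifically, the arithmetic is telling: a bridge that addresses sectors with 32 bits tops out at 2.2TB, and capacity above that ceiling wraps around. Here's a quick sketch of the math (our illustration of the failure mode, not WD's or Thermaltake's official diagnosis).

```python
# Why a "3TB" drive can show up as ~746GB behind an old USB bridge
# (a sketch of the wraparound arithmetic, assuming 512-byte addressing).
max_sectors = 2 ** 32                   # 32-bit LBA sector limit
ceiling = max_sectors * 512             # ~2.2TB addressing ceiling in bytes
drive = 3 * 10 ** 12                    # "3TB" in decimal marketing bytes
leftover = drive - ceiling              # what remains after the wraparound
print(f"{leftover / 2 ** 30:.0f} GiB")  # ~746 GiB, the size Windows reports
```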

WD's support page indicates that you may be able to get the drive working if you attach it directly to your motherboard's SATA ports, initialize it with a GPT partition table in Windows Disk Management, and format it there, then remove it and put it back in the USB dock.

Upgrading to Wireless-AC

I'm a Maximum PC subscriber. I have a Dell Inspiron 15R-5520. It has an Intel Centrino Wireless-N 2230.

I just purchased the Netgear Nighthawk R7000 router. I would like to upgrade the wireless adapter. I contacted Dell to no avail, and couldn't even find an email address for Intel. Do you think the "Intel Network 7260 HMWG WiFi Wireless-AC 7260 H/T Dual Band 2x2 AC + Bluetooth HMC" is compatible with the Inspiron 15R? At present I can't use the 5.0GHz band.

- Carlos H Castillo

The Doctor Responds:

It looks like you ought to be able to put that Intel Wireless-AC card into your Inspiron 15R. The Inspiron takes a half-height mini-PCIe wireless card, like the 7260, and the slot is user-accessible. At least, for a generous interpretation of "user-accessible": you have to take apart most of the laptop to get to it. Dell's service manual for your model is at http://bit.ly/1crLreZ, and we found a YouTube tutorial for your model as well at http://bit.ly/Kedtkr. Be sure to download the appropriate drivers from Intel's support site.

Some laptop manufacturers use a whitelist in the BIOS to restrict which wireless cards you can use with the laptop, but your particular model doesn't seem to be one of those. Your model seems to have two antennas, and the Intel 7260 has two antenna leads, so everything looks good there as well. Dell representatives answering other people's questions online seem to indicate that the 15R 5520 can accommodate 5GHz-capable adapters, too.

The standard caveat applies—neither Dell nor Intel supports or recommends Wi-Fi card installation by anyone but a professional technician, and there's always the risk that it won't work. But if it does, well, you'll have legendary Wi-Fi speed on that Inspiron—at least when you're at home.

Mysterious Files

I am preparing to upgrade from Windows 8 Pro to Windows 8.1 Pro. While cleaning up my system and removing unnecessary files, I noticed a string of files in my C: folder with names like bdlog.txt, BDR-im or -bz, and so forth. I have attached a copy of the file names, but I have not been able to identify what program they may belong to. I was hoping you might have an answer as to how I can identify them so I don't remove something I need. As they occupy over 8GB of space, it would be nice to know. So far my internet searches have not found anything, and trying various programs on my system to open them has not worked, although one file did have a note stating it could be opened with a bootloader program.

- Eli Cohen

The Doctor Responds:

Those files are normal, provided you're using BitDefender antivirus. Bdlog.txt is a normal activity log, and the various bdr-im and bdr-bz files are related to the Rescue Mode bootable Linux environment. Apparently, BitDefender just dumps all this stuff into the root of the C: drive rather than into a Program Files\BitDefender folder. So it's just annoying, not malicious.

Goodbye, SkyDrive

I have been reading about the new Windows and deciding whether to make the move. I have no use for a touchscreen operating system, so it was a relief to see it is now possible to launch Windows directly into the desktop mode. However, I still have a concern. It appears that when Windows 8.1 is installed, it automatically sets itself up to save files to SkyDrive by default. This gives me great concern because I do not want to use any cloud servers. Is it possible to use Windows 8.1 without using a cloud server?

- Bruce Noren

The Doctor Responds:

Yes, you can disable SkyDrive, but Microsoft doesn't make it obvious. Once you've installed Windows 8.1, run gpedit.msc, either by typing it into the search bar on the Start screen or by hitting Win+R to bring up the Run command. Go to Local Computer Policy, then Computer Configuration > Administrative Templates > Windows Components > SkyDrive. You'll see options to save documents and settings to the local PC by default, instead of SkyDrive, but it sounds like you want something more drastic. The second option is "Prevent the usage of SkyDrive for file storage." That's the one you want. Double-click it and, in the box that pops up, select "Enabled." This is somewhat counterintuitive; you're not enabling SkyDrive, you're enabling the policy that disables SkyDrive. Thanks, Microsoft!
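
If you'd rather script the change than click through gpedit, the policy is backed by a registry value. Here's a minimal sketch; to the best of our knowledge the setting writes DisableFileSync under the SkyDrive policies key, but treat the exact names as assumptions, and run it from an elevated prompt.

```python
# Sketch: enable the "Prevent the usage of SkyDrive for file storage"
# policy via the registry (key and value names are our assumption).
# Must be run as administrator on Windows 8.1.
import winreg

KEY = r"SOFTWARE\Policies\Microsoft\Windows\Skydrive"
with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
    winreg.SetValueEx(key, "DisableFileSync", 0, winreg.REG_DWORD, 1)
print("SkyDrive policy set; sign out or reboot for it to take effect.")
```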

Quad SLI and Quake Wars

I have a weird problem. I play Enemy Territory: Quake Wars a lot. I originally built a computer with a Core i7-2600 CPU and dual GeForce GTX 560 Ti graphics cards. I got about 200 frames per second. I upgraded to a GeForce GTX 690 and only got about 220fps, which was substantially less of a jump than I expected. Recently I got a bonus at work and added a second GTX 690 and ran them in quad SLI. The frame rates dropped to 30–40 with a huge amount of stuttering. WTF? I have upgraded the drivers and changed every possible setting to no avail. Out of frustration, I disabled quad SLI so the game was running on only one of the four GPUs, and was stunned to see the frame rate at 380–400. What gives? I have seen you write that many games don't "scale" to quad SLI, but why does this game run so much better the less silicon I give it?

- John Plosay

The Doctor Responds:

Quake Wars is the culprit here, not your GPUs. Everything we found relating to that game and how it responds to SLI and CrossFire backs up your experience: in some cases it runs slower instead of faster. That's very likely because you're not running a resolution anywhere near demanding enough to need one high-end GPU, much less four. Remember, the game is over six years old, was based on a dated engine when it came out, and just isn't engineered to scale with today's levels of GPU power. The fact that you only received a 20fps boost after going from GTX 560 Ti SLI to a GTX 690 indicates there's a CPU bottleneck, which is common at lower resolutions. Therefore, it's not surprising that adding more GPU horsepower to the equation didn't help things. As for why it got much worse, we can only imagine the extra demand placed on the CPU to handle traffic for four GPUs caused it to be somewhat overloaded, and since you were already CPU-bound, it just made things much, much worse.
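
To put rough numbers on that reasoning, consider a toy frame-time model (the milliseconds below are invented for illustration, not measured): each frame takes as long as the slower of the CPU and GPU stages, plus a little coordination overhead for every extra GPU.

```python
# Toy model of a CPU-bound game under multi-GPU rendering.
# cpu_ms/gpu_ms are made-up stage times; overhead_ms is the per-extra-GPU
# cost of coordinating alternate-frame rendering.
def fps(cpu_ms: float, gpu_ms: float, gpus: int, overhead_ms: float = 0.5) -> float:
    frame_ms = max(cpu_ms, gpu_ms / gpus) + overhead_ms * (gpus - 1)
    return 1000 / frame_ms

for gpus in (1, 2, 4):
    print(f"{gpus} GPU(s): {fps(cpu_ms=2.5, gpu_ms=4.0, gpus=gpus):.0f} fps")
```

Once the CPU stage dominates, extra GPUs stop adding frames and the coordination overhead starts subtracting them, which is the pattern you saw.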

Our advice to resolve this issue would be to remove the second GPU and sell it on eBay. With the money from that transaction, you can acquire an LCD that runs at a higher resolution, such as 2560x1440 on a 27" panel or 2560x1600 on a 30" panel. Get one of those and you will be all set to run all of the latest games at full resolution with just a single GTX 690, as it's still one of the most powerful cards available.

Google Releases 64-bit Chrome Browser for Windows to Beta Channel

Posted: 30 Jul 2014 11:50 AM PDT

64-bit Chrome creeps closer to a stable release

Here's a bit of good news if you've been wanting to experiment with Google's Chrome browser in 64-bit form but weren't so keen on installing an ultra-early build that might be riddled with buggy code. Google just added 64-bit Chrome to the Beta Channel for Windows 7 and 8, giving curious early adopters a more stable release to play with. It's probably not a good idea to use it for mission-critical applications, but it should be in pretty good shape at this point.

You can download the installer from Google's Beta download pages. Be warned that the new version will replace the existing version you have installed, though it will also preserve all your settings and bookmarks, so there's no need to uninstall Chrome before hitting up the new release, Google says.

In theory, the 64-bit build should speed up page loads and offer other benefits on the backend, especially if you're a power user with multiple tabs open at any given time. However, you may or may not notice a real-world difference, depending on your setup and your browsing habits.

Column: Think You're Innocent?

Posted: 30 Jul 2014 11:40 AM PDT

The dark side of the RICO anti-racketeering laws

What do the Mafia and web forums have in common, besides colorful insults and threats, and tortured misuse of English? As of December, they are both subject to the RICO anti-racketeering laws, creating harsh, life-ruining penalties for even minor participants.

RICO is a federal law designed to get the bosses of organized crime, who may not commit crimes themselves. But to do that, legislators had to make even somewhat loose association with anything perceived to be organized and criminal into a fairly easy-to-prove crime. Early on, RICO began creeping over our rights. It was used against political activists, such as pro-life demonstrators, and has expanded to where it's being used right now against Ecuadorian activists trying to get Chevron to clean up a toxic spill the company has already been convicted of causing. But RICO dates from the 1970s, a time when you really had to organize things to make something happen. Now you can put up a wiki, a forum, or a tracker, go to bed, and wake up to a new criminal business, political movement, or vast collection of My Little Pony trivia by morning.

David Ray Camez, an Arizona teenager who was getting into carding and fake IDs when he got collared buying a fake driver's license from a Secret Service agent, was convicted again in December under RICO as a racketeer for using the website Carder.su. Under this interpretation, Americans who set up or use websites that facilitate illegal activity could be found guilty of every crime on that website. Combine that with the theory that every terms-of-service violation is a potential CFAA felony, and everyone who uses a computer can go to jail as a felon. Like the world's worst lottery, all they have to do is pick you for guilt. Good luck, Citizen!

Best Buy Sees "Crashing" Tablet Sales

Posted: 30 Jul 2014 11:27 AM PDT

Tablet fad is slowing down

Best Buy's computer section looks decidedly different today than it did a couple of years ago. Gone are the aisles filled with desktop machines, which are now relegated to a small section off to the side (if present at all), replaced by mobile devices, including rows and rows of tablets. You can't fault Best Buy for following the money trail, and just as tablets took over the floor space when everyone wanted one, look for PCs to take some of that territory back. Why? Best Buy CEO Hubert Joly says tablets are now crashing.

Joly made the statement to Re/Code in an interview, adding that he's seeing a revival in the PC business at Best Buy due in part to Microsoft's decision to stop supporting Windows XP. In addition, two-in-one devices that combine both a tablet and laptop are gaining favor among Best Buy's shoppers.

"The tablets boomed and now are crashing. The volume has really gone down in the last several months. But I think the laptop has something of a revival because it's becoming more versatile," Joly said. "So, with the two-in-ones, you have the opportunity to have both a tablet and laptop, and that's appealing to students in particular. So you have an evolution. The boundaries are not as well defined as they used to be."

Interestingly, Joly blamed some of the declining PC sales on the "enormous" deflation in the Windows market. He points out that you can find laptops selling for $300 that used to cost $1,000. The solution? More innovation at the high end of Windows, Joly says.

Column: More Moore's Law Anxiety

Posted: 30 Jul 2014 11:23 AM PDT

Is Moore's Law over?

Will the hand-wringing over Moore's Law never stop? Intel's announcement that its next-generation 14nm process will be delayed a couple of months triggered yet another round of fretting over the fate of this widely misunderstood "law."

Much of the panic is because Intel's "tick-tock" strategy has indeed operated like clockwork, chiming a smaller geometry every two years. Slippage is common at other companies, but not at Intel. So when the world's largest semiconductor vendor stops the clock for a few months, hearts begin palpitating.

Especially when Intel admits that disappointing yields prompted the delay. The factory's initial 14nm wafers have more defects than expected, ruining too much of each batch. If too many chips aren't usable, Intel can't earn the profits needed to pay for the factory.

But worry not—Moore's Law isn't dead yet. First, remember that it's merely an observation, not a scientific principle. In 1965, Gordon Moore observed that affordable integrated circuits were growing about twice as dense every 12 months. In 1975, slower progress persuaded him to lengthen the period to 24 months. (The oft-quoted 18-month cycle was someone else's idea.) So the law isn't carved in stone, much less in silicon.

Besides, the industry hasn't kept pace for years. The law predicts that an affordable chip in 2014 should integrate nearly 38 billion components. (Moore counted resistors and capacitors, not just transistors.) The densest microprocessors today have more than 5 billion transistors, and the densest memory chips have 4 billion transistors and about 8 billion total components. So the curve is definitely flattening, but it's still a long way from flat.
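
For the curious, the projection is simple compound doubling, as the sketch below shows (the 1975 baseline of roughly 65,000 components is our assumption; a somewhat smaller starting point yields the "nearly 38 billion" figure above).

```python
# Moore's Law as compound doubling: one doubling every 24 months.
base_year, base_components = 1975, 65_000   # assumed 1975 baseline
year = 2014
doublings = (year - base_year) / 2
projected = base_components * 2 ** doublings
print(f"{projected / 1e9:.0f} billion components")  # ~48 billion
```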

Intel's three-month blip is a reminder that process technology is becoming fiendishly complex, even for the world's best engineers. If future delays stretch into a whole year, we'll know we have crossed the watershed.

Maingear's Spark Lights Up the Small Form Factor Gaming Scene, Now Available

Posted: 30 Jul 2014 10:52 AM PDT

A tiny but spunky gaming PC

Maingear first introduced its Spark system back in January during the Consumer Electronics Show (CES) held in Las Vegas. At the time, the Spark was supposed to be Maingear's eventual Steam Machine. Valve threw a wrench in those plans by delaying the whole Steam Machine initiative until next year, but that hasn't stopped Maingear from forging ahead in the small form factor gaming department. On the contrary, Maingear just launched its Spark gaming system to the public.

Let's talk dimensions. Somewhat similar in form to Intel's NUC, Maingear's Spark measures a scant 4.5 inches (W) x 4.23 inches (D) x 2.34 inches (H) and weighs 0.89 pounds, making it the smallest, lightest, and most versatile gaming PC Maingear has ever offered, the company said.

As for specs, Spark still boasts AMD on the inside -- an AMD A8-5557M APU clocked at 2.1GHz (3.1GHz via Turbo) with AMD Radeon R9 M275X graphics. It also sports a pair of SO-DIMM DDR3-1600/1333 slots, an mSATA slot (supporting up to 512GB), a 2.5-inch drive tray, GbE LAN, 802.11ac Wi-Fi, Bluetooth 4.0, HDMI output, mini DisplayPort, four USB 3.0 ports, and a Kensington lock slot.

Pricing starts at $699, which includes 4GB of DDR3-1600 RAM and a 500GB WD Blue 7200 RPM hard drive. That doesn't include Windows -- pricing starts at $120 for Windows 7 Home Premium 64-bit (Windows 8 options are also available).

You can configure a Spark system now.

Microsoft Surface Pro 3 Earns Major Kudos From DisplayMate

Posted: 30 Jul 2014 10:25 AM PDT

Surface Pro 3 impresses DisplayMate with excellent color accuracy

The Surface Pro 3 is arguably the first tablet capable of replacing a notebook, which is something Microsoft is quite fond of saying, but did you know that it has one of the best displays of any mobile device? That's not Microsoft making the claim; it's Dr. Raymond M. Soneira, president of DisplayMate Technologies Corporation. Dr. Soneira put the Surface Pro 3 through its paces and came away quite impressed with the tablet's panel.

"Based on our extensive Lab tests and measurements on the display for the Surface Pro 3, Microsoft has produced an excellent professional grade high performance display for Windows. In fact, the Surface Pro 3 has one of the very best and most accurate displays available on any mobile platform and OS," Dr. Soneira said. "It joins near the top of a small set of tablets that have excellent top tier displays – ideal for professionals that need a very accurate high performance display for their work, and for consumers that want and appreciate a really nice and beautiful display."

That's high praise from one of the most respected display gurus in the business. The Surface Pro 3 earned it by reproducing some of the most accurate on-screen colors of any tablet or smartphone ever measured by DisplayMate.

It's also only the second display to earn "Very Good to Excellent" ratings in all categories, save for Brightness Decrease. Amazon's Kindle Fire HDX 8.9 was the other, though the Surface Pro 3 is more accurately calibrated, with the best "Absolute Color Accuracy" DisplayMate has ever measured, Dr. Soneira says.


Watch Dogs Graphical Analysis: Stock vs Worse Mod Take 2

Posted: 30 Jul 2014 10:10 AM PDT

We compare stock Watch Dogs against TheWorse Mod 1.0

Our last attempt at comparing the stock graphics of Watch Dogs received some fair criticism. Coupling SweetFX and TheWorse Mod led to noticeably darker graphics that threw off our results. You all spoke, and we listened. Here's our second take on TheWorse Mod, which is now in its final incarnation: version 1.0.

First, a bit of background. Gamers were understandably upset when Watch Dogs launched in May with Uplay issues, lackluster performance, and graphics that simply didn't match up to the versions demoed and displayed at E3.

TheWorse, MaLDo, and VAAS stepped in to create TheWorse Mod. Working off of some initial modding work by kadzait24, the team set out to add the stunning 'E3 effects' back into the game while also including extra visual flair and improvements. We've done the testing, but we'll let you, the readers, be the judge.

Testing Methodology

We've redone our testing of Watch Dogs with comparable scenes rendered in the bone-stock version of the game as well as a modded version with TheWorse Mod (with MaLDo's textures) installed. We've skipped SweetFX this time around to ensure that gamma isn't an issue.

Our capture rig is a bit less powerful this time around, with an AMD FX-6300 processor alongside a Radeon R9 280X and 8GB of RAM. It's still capable of running the game at close to 30fps on maxed-out settings, with and without the mod.

In-game graphics settings are exactly the same as last time. Check out the specific settings below:

Watch Dogs Settings

Breaking Down the Video

Aiden stands in an alleyway outside of a house (click the image for a comparison GIF). 

This scene showcases the immense difference in ambiance between stock Watch Dogs and the modded version. The splitscreen effect works great here because the difference in coloring is immediately obvious. Everything is shaded in a more stylized hue that covers up the game's unseemly bits while accentuating the beautiful shadows, lighting, and character models. 

This safehouse shot features very little movement aside from a shifting skyline (click the image for an animated GIF).

What's great about this area is that it demonstrates that TheWorse Mod isn't simply a basic color filter layered onto the game. In some areas, the world has a khaki hue that makes everything feel dirty and dusty. In this scene, all of the colors are more muted and mellow, with the blue sky unblemished. There's not really a noticeable improvement in this shot, but that's because TheWorse Mod really works best with movement. The effects of the mod are best demonstrated in-game or through video.

Cars pass by on a random street in Chicago.

Watch Dogs looks good, but it falls short of greatness in its stock incarnation. Everything looks a bit too plain and normal. This isn't even close to the stunning scenes we were treated to in the game's E3 trailers.

Aiden stands on a sidewalk as people hurry past.

This particular segment looks best in-game or in the video because of the swaying trees and the numerous pedestrians. There's a clear color cast that makes the world come alive, along with some great-looking grass and a well-worn sidewalk. The way the light filters through the scene and casts shadows makes this look a hell of a lot more like what we expected.

Conclusion

We're in love with TheWorse Mod. It does a great job of making Watch Dogs feel more cinematic. The game's stock graphics look stark, plain, and fairly artificial, whereas the modded graphics make the game look more mystical, interesting, and, in a way, more believable. A lot of the beauty of TheWorse Mod comes from artistic manipulation of depth of field and fog. In fact, we found that in some areas these effects were applied a bit too liberally and detracted from visibility. Minor quibbles aside, the mod is a much-needed facelift for an otherwise good game.

As for performance, frame rates with or without the mod were nearly indistinguishable. Some users report better performance on some machines without the added texture pack, but your mileage may vary.

That's our take. What do you think? Does TheWorse Mod make the game look more like its E3 trailers?
