General Gaming Article

Valve to Demo SteamVR and New Living Room Devices at GDC

Posted: 23 Feb 2015 03:56 PM PST

The fight for the living room just got bigger

Unlike most developers and publishers, which inundate consumers with trailers, screenshots, developer diaries, and press releases to keep their games in the public eye, Valve tends to stay quiet. So while we knew the company was working on its Steam Machines and perfecting its Steam Controller, those devices apparently weren't the only projects in the pipeline. At GDC, Valve will unveil a selection of new living room devices and a previously unannounced SteamVR hardware system while demonstrating its Steam Machines and finalized Steam Controller.

"Steam is bringing the best games and user-generated content to exciting new destinations," reads a post asking for VR content creators to sign up. "At GDC 2015, we'll be giving demos of the refined Steam Controller, new living room devices, and a previously-unannounced SteamVR hardware system."

No further details about the new devices were provided, though Valve is asking VR content creators to sign up for scheduled VR demos during GDC, which runs March 4-6. We've known for a while that Valve had been dabbling in VR and had even shared its research with Oculus VR, but at the time the company said it had no interest in releasing its own VR hardware. Valve also shared a few statistics about Steam: the service now offers around 4,500 games, hosts 400 million pieces of user-generated content from the Steam Community, and counts over 125 million active accounts worldwide.

As for what the new living room devices might be, we couldn't even begin to guess (perhaps a SteamTV at this rate). Luckily we are scheduled to meet with Valve at GDC and will report back very soon.

What do you think Valve will reveal at the event exactly? Let us know in the comments section below!

Follow Sean on Google+, Twitter, and Facebook

In-Game Graphics Settings Explained

Posted: 23 Feb 2015 01:53 PM PST

A guide to interpreting game settings

PC gamers have been fiddling with graphics settings since the dawn of time, but it takes a special kind of know-how to understand what each of those settings actually does. It doesn't help that there aren't any standard naming conventions, which means that options like "model" and "object" quality are usually one and the same. To help clarify, we've rooted out what all of the most common graphics settings actually do.

Keep in mind that the names of settings can vary between games, so use your best judgment before making too many changes. If you aren't interested in fiddling with sliders, stick to a utility like Nvidia's GeForce Experience, which optimizes settings for you.

Anti-Aliasing 


Nvidia's demonstration of anti-aliasing.

Anti-aliasing, or "AA," is a setting that most gamers are probably familiar with. Although there are different forms of anti-aliasing, they all attempt to smooth over the jagged edges of objects—or "jaggies" as they're affectionately called. These visual artifacts are a consequence of the very nature of presenting an image on a screen. Individual pixels are assigned colors, and these combinations of colors—rendered as individual objects—have jagged edges. Anti-aliasing creates what our eyes perceive as a smooth line in their place. 


It's hard to see at this smaller size, but the edges of the car and the gun are noticeably jagged; click the image to enlarge it, then zoom in with Ctrl + middle mouse scroll.

The concept itself is easy to understand, but appreciating the different forms of anti-aliasing is a bit more complicated. Nvidia breaks it down into two major types: supersampling and multisampling. The simplest method, supersampling (SSAA, sometimes called FSAA), involves rendering the scene at oversized dimensions and scaling it back down. With a native resolution of 1920x1080, four samples would mean the GPU renders the scene at 3840x2160 before bringing it back down to its original size. Multisampling (MSAA) is a bit different: rather than supersampling the entire scene, it applies the extra samples only where they matter most, along the edges of polygons. This saves precious processing power, sacrificing minor details in exchange for better performance.
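To make the supersampling math concrete, here's a minimal sketch of the resolve step, where each 2x2 block of the oversized render is averaged into one final pixel. The function name and the tiny 4x4 "image" of brightness values are our own inventions for illustration; real drivers use more sophisticated resolve filters than a plain box average.

```python
def downsample_2x(hi_res):
    """Average each 2x2 block of the oversized render into one pixel."""
    h, w = len(hi_res), len(hi_res[0])
    return [
        [
            (hi_res[y][x] + hi_res[y][x + 1] +
             hi_res[y + 1][x] + hi_res[y + 1][x + 1]) / 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A hard black/white edge rendered at double resolution becomes a soft
# gradient at native resolution -- the smoothing our eyes read as a clean line.
hi = [[0, 0, 255, 255],
      [0, 255, 255, 255],
      [0, 0, 255, 255],
      [0, 0, 0, 255]]
print(downsample_2x(hi))  # [[63.75, 255.0], [0.0, 191.25]]
```

Note the cost implied by the math: 4x supersampling means rendering four times as many pixels before any of this averaging even begins.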

This is a resource-intensive setting that is most important at lower resolutions. As the number of pixels—and with it, the resolution—increases, the jagged edges of objects become less obvious. In fact, applying excessive anti-aliasing to higher resolutions can have a catastrophic effect on performance because of the multiplicative nature of anti-aliasing—rendering a 3840x2160 scene is hard enough without supersampling it to 7680x4320 or higher. 

Anisotropic Filtering


Notice the lack of track marks on the ground to the right in the image with basic bilinear filtering.

Anisotropic filtering is anti-aliasing's little brother. While AA smooths out jagged edges, anisotropic filtering adds detail to what would otherwise be blurry, faraway objects. To save resources, distant objects are rendered with lower-resolution textures; these less-detailed surfaces can eventually become blurry when viewed at an angle. 

Texture filtering solves this problem by sharpening the detail in those distant textures. Basic isotropic filtering samples textures with a square pattern, which works poorly for surfaces viewed at an oblique angle. Anisotropic filtering steps in with rectangular or trapezoidal sampling patterns that better match the skewed perspective.
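As an illustration of the underlying idea, here's a hypothetical sketch of mip-level selection. The function name and the texels-per-pixel input are inventions for this example; real GPUs derive the level from screen-space texture-coordinate derivatives, and anisotropic filtering additionally takes extra samples along the elongated axis instead of simply dropping to a blurrier mip.

```python
import math

def mip_level(texels_per_pixel):
    """Pick a mipmap level: 0 is the full-resolution texture,
    and each successive level halves the resolution."""
    return max(0, math.floor(math.log2(max(texels_per_pixel, 1))))

print(mip_level(1))  # 0: up close, one texel per pixel -> full detail
print(mip_level(8))  # 3: far away, 8 texels collapse into one pixel -> blurrier mip
```

The blurriness the article describes comes from those higher mip levels; anisotropic filtering at 2x-16x spends extra samples to pull detail back out of them on angled surfaces.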

Just like anti-aliasing, anisotropic filtering can be resource-intensive, so this is a setting you'll want to pay particular attention to. Raising the setting from values like 1x or 2x increases the detail in distant textures and thus uses more processing power. 

Texture Quality

Arma3 Textures

The change in this picture is pretty drastic with thicker grass and a vastly improved facade on the building.

Texture quality is a hugely important setting because almost all of the objects and models visible in game are textured. Think of it as a sort of virtual wallpaper that gives otherwise featureless objects a more familiar face—grass, snow, walls, etc. Lower-resolution textures look blurry and lack detail. Increasing the texture quality will drastically improve the look of any game. Unlike model quality, raising texture quality relies more on VRAM than it does on your GPU's processing power.  
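A rough back-of-the-envelope calculation shows why texture quality leans on VRAM. This sketch assumes uncompressed 32-bit RGBA textures and a full mipmap chain (which adds about a third); real games use compressed formats that shrink these numbers considerably.

```python
def texture_vram_mb(width, height, bytes_per_pixel=4, mipmaps=True):
    """Rough VRAM cost of one uncompressed texture, in MB."""
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base  # mip chain adds ~1/3
    return total / (1024 * 1024)

# Doubling texture resolution quadruples the memory cost, which is
# why "Ultra" texture settings demand so much VRAM.
print(round(texture_vram_mb(1024, 1024), 1))  # 5.3 (MB)
print(round(texture_vram_mb(2048, 2048), 1))  # 21.3 (MB)
```

Multiply that per-texture cost across the hundreds of textures in a scene and the VRAM appetite of high texture settings becomes obvious.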

Lighting Quality

Battlefield 4

The difference between the ultra and low lighting is pretty drastic, with ultra effects being smoother and more realistic.

Adjusting the lighting quality setting affects the number of light sources and their effects on the environment. It's an incredibly complicated topic. Fortunately, raising or lowering lighting quality usually has a fairly obvious effect on the game. Low settings usually reduce light to basic points and can cause weird reflections (see the Battlefield 4 screenshot above). Unfortunately, raising lighting quality has a drastic effect on performance because of the complex calculations required behind the scenes to light the scene realistically.

Shadow Quality

Watch Dogs

The shadows in this scene aren't particularly complex, but the shift from low to ultra adds more detailed shadows to objects in the distance.

This setting is fairly self-explanatory. Adjust it to control the quality of rendered shadows. Going from on to off has an obvious effect, although not all games support completely disabling shadows. Moving between levels of shadow quality has a more subtle effect, with shadows disappearing from smaller objects in the distance. The edges of shadows become smoother and less pronounced as you approach "High" or "Ultra" levels. Increased shadow quality also means that shadows will better resemble the detailed shape of the object casting them. These effects are fairly taxing because of the inherent relationship between light and shadows: the placement and size of each shadow has to be calculated.

Vertical Synchronization (VSync)

Vsync is a holdover from the era of CRT monitors, but it's still sometimes necessary for LCD monitors. To put it simply, vsync synchronizes your monitor and your graphics card to eliminate tearing effects. Without it, your video card is free to render frames as soon as it's able, which means that it might very well be presenting a scene that hasn't yet been fully updated on your screen. This tearing—imagine a photo literally torn in half and reattached slightly askew—usually happens when your frame rate far exceeds the refresh rate of your monitor. Unless you've got a particularly capable monitor, your refresh rate is probably capped at 60Hz, which means that you'd ideally want a constant 60 frames per second. 

Unfortunately, vsync isn't without its downsides. Particularly astute gamers might notice a bit of added latency while moving the mouse cursor or entering keyboard commands. There's also the performance cost associated with synchronization, which means that if you're barely averaging 60 frames a second, you'll probably be just fine keeping vsync off.
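A toy render loop illustrates the trade-off. With vsync on, the loop waits for the next (simulated) refresh before presenting, which both caps the frame rate at the refresh rate and introduces the small latency mentioned above. The function and constants here are our own illustration, not any real graphics API.

```python
import time

REFRESH_HZ = 60
FRAME_TIME = 1.0 / REFRESH_HZ  # one refresh every ~16.7ms at 60Hz

def render_loop(frames, vsync=True):
    """Simulate presenting `frames` frames; return the achieved fps."""
    start = time.perf_counter()
    next_refresh = start
    for _ in range(frames):
        # ... rendering would happen here (instant in this toy example) ...
        if vsync:
            # Wait for the next refresh before presenting the frame.
            # This wait is the source of both the fps cap and the latency.
            next_refresh += FRAME_TIME
            delay = next_refresh - time.perf_counter()
            if delay > 0:
                time.sleep(delay)
    return frames / (time.perf_counter() - start)

print(f"vsync on:  ~{render_loop(30):.0f} fps")  # capped near 60
print(f"vsync off: ~{render_loop(30, vsync=False):.0f} fps")  # unthrottled
```

Without the wait, frames are presented the instant they finish, which is exactly when a real display can end up showing parts of two different frames at once.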

Learn More

We'll be adding more explanations to this guide over time. In the meantime, there is a wealth of resources available for anyone interested in diving deeper into the world of computer graphics. Head on over to Tweakguides.com (especially its Gamer's Graphics & Display Settings Guide) or check out the folks over at r/buildapc, who have created a pretty comprehensive game settings guide.

Which settings do you usually turn on and off? Tell us in the comments!

Newegg Daily Deals: Asus 12X Blu-ray Burner, Seagate Backup Plus Slim 5TB USB Drive, and More!

Posted: 23 Feb 2015 01:25 PM PST


Top Deal:

HD-DVD lost the high-definition format war, and there was a collective cry about how much more expensive Blu-ray was. We all shook an angry fist at Sony, then got over it. But if you didn't, instead choosing to hold a grudge and refusing to purchase Blu-ray hardware until it dipped into affordable territory, then you should check out today's top deal -- it's for an Asus 12X Blu-ray Burner for $60 with free shipping (normally $80 - use coupon code: [EMCANNA23]; additional $20 mail-in rebate). Take advantage of that rebate and it knocks the price down to $40.

Other Deals:

Seagate Backup Plus Slim 1TB Silver USB 3.0 Portable Hard Drive for $60 with free shipping (normally $65 - use coupon code: [EMCANNA33])

Seagate Backup Plus 5TB Black USB 3.0 External Hard Drive for $140 with free shipping (normally $170 - use coupon code: [EMCANNA34])

G.Skill Sniper Series 16GB (2x8GB) SDRAM DDR3 2400 Desktop Memory for $115 with free shipping (normally $130 - use coupon code: [EMCANNA32])

G.Skill Sniper 8GB (2x4GB) 240-Pin DDR3 1600 Desktop Memory for $60 with free shipping (normally $70 - use coupon code: [EMCANNA28])

Microsoft Bing Predicted Oscar Results with Surprising Accuracy

Posted: 23 Feb 2015 10:12 AM PST

Just adding to its resume

It's becoming increasingly difficult not to be impressed (or downright spooked) by the prediction engine behind Microsoft's Bing search engine. For its latest feat, Bing successfully predicted most of the top winners at the 2015 Oscars, correctly identifying 20 of the 24 results for an 84 percent success rate. That alone is impressive, though it's just another notch in Bing's belt when it comes to predicting outcomes.

Tapping into Bing's prediction engine, Cortana impressed during the World Cup last year by correctly picking 15 out of 16 knockout matches, including Germany's win over Argentina in the final. Its only trip-up was picking Brazil to beat the Netherlands in a rather meaningless third-place match.

David Rothschild, an economist in Microsoft Research's New York City lab, developed the prediction model that seems to work so well. It certainly did at the 2015 Oscars, correctly predicting the winner for Best Picture (Birdman). The four categories it missed were:

  • Original Screenplay: Picked The Grand Budapest Hotel instead of Birdman
  • Animated Feature: Picked How to Train Your Dragon 2 instead of Big Hero 6
  • Original Score: Picked The Theory of Everything instead of The Grand Budapest Hotel
  • Film Editing: Picked Boyhood instead of Whiplash

This isn't the first time the prediction engine has done well at the Oscars. Last year it correctly picked 21 out of 24 Oscar winners, and in 2013, it identified 19 of 24.

How is it possible? One thing Rothschild takes into account is public opinion, which he assumes will put pressure on voters. However, he says public opinion is only loosely related to the winners because of the sentimentality involved.

"Prediction markets follow a select group of people who have high levels of information on what voters will do and are willing to wager real-money on the outcomes," Rothschild says. "And, prediction market-based forecasts have been incredibly accurate."

Follow Paul on Google+, Twitter, and Facebook

Lenovo Faces Class Action Lawsuit Over Superfish

Posted: 23 Feb 2015 09:19 AM PST

No big surprise

Lenovo has been in damage-control mode ever since news broke that it was installing a careless piece of adware called Superfish on consumer laptops and desktops, but the court of public opinion isn't the only place it has some explaining to do. According to reports, a class-action lawsuit claiming "fraudulent" business practices was filed against Lenovo and Superfish at the end of last week.

Let's backtrack a moment. Superfish came under scrutiny for a number of reasons, not the least of which is that some users complained it would install on their systems upon first boot even if they declined the software. Furthermore, attempts to uninstall the software would leave behind a dangerous root certificate, which is the real issue.

New Information

According to Ars Technica, a company called Komodia is behind the dubious technology that allows Superfish to do what it does, which is hijack web searches in order to serve up ads. It uses a fake SSL certificate to do that, essentially a man-in-the-middle attack, leaving users susceptible to hackers. Komodia bundles a password-protected private encryption key to prevent hackers from creating websites to spy on users, but it took Errata Security CEO Rob Graham all of three hours to discover that the password is "komodia." Try not to give yourself a nosebleed from the obligatory facepalm.

As time goes on, the list of applications that use the same SSL-hijacking technology as Superfish is rapidly growing. Facebook's security team alone has identified over a dozen applications other than Superfish using the same Komodia library.

"Initial open source research of these applications reveals a lot of adware forum posts and complaints from people. All of these applications can be found in VirusTotal and other online virus databases with their associated Komodia DLL's. We can't say for certain what the intentions of these applications are, but none appear to explain why they intercept SSL traffic or what they do with data," Facebook says.

Back to the Lawsuit

While the full extent of Komodia's "redirection SDK" continues to be investigated, Lenovo and Superfish are the two high-profile companies bearing the brunt of the criticism. In the lawsuit, plaintiff Jessica Bennett claims her laptop was damaged by Superfish, which she refers to as "spyware" in court documents, and that Lenovo and Superfish invaded her privacy, PCWorld reports.

The lawsuit is seeking unspecified damages from the two companies.

Removal Tool

Lenovo last week provided instructions on how to manually remove Superfish, including the root certificate that likes to stick around. In an updated statement over the weekend, Lenovo tells us it has now released an automated tool that will completely remove Superfish. You can find the tool (along with its source code) here.

Follow Paul on Google+, Twitter, and Facebook

Samsung Promises Another Fix for 840 EVO SSD Performance Issues

Posted: 23 Feb 2015 05:39 AM PST

The first fix, issued in October, turned out to be a dud

The Samsung 840 Evo launched to some rave reviews in 2013. We gave it a "kick ass" 9 out of 10 and hailed it as "the fastest SSD we have ever tested by a sizable margin." Unfortunately, some of that luster has since worn off, with a large number of 840 Evo owners reporting a serious decline in the read performance of drives with several months' worth of data on them. As for the firmware update and Performance Restoration Software that the company released in October to address the issue, they were apparently of very little help, as the problem has resurfaced like a recrudescent cancer.

This leaves the South Korean firm with no other choice but to issue another fix, which it says is just around the corner.

"In October, Samsung released a tool to address a slowdown in 840 EVO Sequential Read speeds reported by a small number of users after not using their drive for an extended period of time. This tool effectively and immediately returned the drive's performance to normal levels," the company told AnandTech in an email. "We understand that some users are experiencing the slowdown again. While we continue to look into the issue, Samsung will release an updated version of the Samsung SSD Magician software in March that will include a performance restoration tool."

Although the company has yet to state the cause of this current problem, it could very well have something to do with the original issue, which was caused by the manner in which Samsung's NAND management algorithms handled NAND cell charge decay.

Follow Pulkit on Google+

Nvidia Slapped with Lawsuit Over GTX 970 Performance and Specifications

Posted: 23 Feb 2015 03:46 AM PST

Gigabyte also tagged in proposed class-action lawsuit

The furor over the GTX 970's specs refuses to die down. What was until recently a public relations debacle is now threatening to snowball into a costly lawsuit, with a class-action complaint filed Thursday by Cass County, Michigan, resident Andrew Ostrowski against Nvidia and Gigabyte for engaging "in a scheme to mislead consumers nationwide about the characteristics, qualities and benefits of the GTX 970."

Before we go any further, here's a quick recap: In late January, many people began complaining about performance issues in games once VRAM usage hit the 3.5GB mark. This prompted Nvidia to clarify that the 970's total memory is divided into a 3.5GB segment and a 0.5GB segment, with the comparatively slower second partition only being used when a game needs more than 3.5GB of memory. To make matters worse, the company also disclosed that the card actually has fewer ROPs (raster operations pipelines) and a smaller L2 cache than advertised, a gaffe it attributes to internal miscommunication that led to an error in the reviewer's guide. It bears mentioning, however, that the impact on real-world performance appears to be minimal, at least for now.

The proposed lawsuit alleges that Nvidia deliberately avoided disclosing the discrepancy, lest it have an adverse impact on sales and ruin what eventually turned out to be an annus mirabilis of sorts for the company.

"In other words, Nvidia's record profits were driven in part by the sale of the company's flagship GTX 970 GPUs, which is likely why it did not want to disclose the material limitations at issue herein until after it had made millions of dollars in sales of such products," reads the complaint, adding that the two defendants still persist in making some of the misleading claims in their advertising and marketing literature.

First and foremost, Ostrowski wants the court to issue an order granting the case official class-action status, and to appoint him and his counsel to represent the class. Once that is out of the way, he would like to see the court award, among other things, disgorgement, restitution and "an order that defendants engage in a corrective advertising or full refund campaign."

Follow Pulkit on Google+

Symantec Confirms Antivirus Update Was Behind Internet Explorer Crashes

Posted: 23 Feb 2015 01:06 AM PST

AV vendor inadvertently crippled millions of Internet Explorer installations

On Friday, a thread popped up on the Norton Community forum from a user complaining that a Norton Internet Security (NIS) antivirus update had broken Internet Explorer on Windows 7 Pro 64-bit. It soon swelled to multiple pages as droves of other users running Internet Explorer 9 and up on Windows Vista and later confirmed as much. Needless to say, they were all very angry that an antivirus update, of all things, rendered a key piece of software completely unusable (see what we did there?), in some cases forcing them to uninstall NIS.

To Symantec's credit, it did not take the company long to identify the culprit and roll out a fix. By the eighth page of the thread, around five hours after the original post, reports started coming in that a fresh update had fixed the issue.

Here's what Symantec had to say: "Based on our analysis, the issue was caused by a corrupt file in the virus definition set. Symantec recreated a snapshot of the same definition package as 20150221.001 and released it through our LiveUpdate servers."

According to the company, only 32-bit versions of the world's second most-used browser were affected by the faulty update.

Follow Pulkit on Google+

Intel Sees Moore’s Law Continuing Beyond 10nm Chips

Posted: 22 Feb 2015 10:54 PM PST


Company believes move to 7nm possible without EUV

We are coming up on the 50th anniversary of Moore's law, a 1965 prediction by Intel co-founder Gordon Moore that the number of transistors on an (economical) integrated circuit would continue to double every 12 months until at least 1975, at which point he revised the rate of "circuit density-doubling" to 24 months. The prediction has held up rather well since then. But with all due respect to its remarkable longevity and massive impact on technology, the many physical limitations to transistor scaling at smaller nodes have led many to conclude the famous axiom is on borrowed time. Intel, however, looks determined to soldier on with Moore's law beyond the 10nm node.
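The arithmetic behind the axiom is easy to sketch. Using Moore's revised 24-month doubling cadence and the roughly 2,300 transistors of Intel's 1971-era 4004 as a starting point (round figures for illustration, not exact die data):

```python
def projected_transistors(start_count, start_year, target_year,
                          doubling_months=24):
    """Project a transistor count forward at one doubling per period."""
    doublings = (target_year - start_year) * 12 / doubling_months
    return start_count * 2 ** doublings

# ~2,300 transistors in 1971, doubled every 24 months, lands at roughly
# 9.6 billion by 2015 -- the same order of magnitude as the biggest
# chips actually shipping around Moore's law's 50th birthday.
print(f"{projected_transistors(2300, 1971, 2015):,.0f}")  # 9,646,899,200
```

That exponential curve is exactly what becomes hard to sustain once the physical limits Bohr discusses below come into play.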

That's according to comments made by Mark Bohr, senior fellow for logic technology development at Intel, during a call with reporters to preview the company's agenda for the 2015 IEEE International Solid-State Circuits Conference (ISSCC) in San Francisco.

Although three of the five papers the company is scheduled to present at ISSCC deal with existing 14nm process technology, Bohr will take part in a panel discussion on the move beyond 10nm and the many challenges it poses.

Per the existing roadmap, the company expects to move to 10nm in 2016 and to 7nm in 2018.

"I still believe we can do 7nm without EUV [Extreme Ultraviolet Lithography] and deliver improved cost per transistor. I'm not going to say exactly how, because our competitors watch what we do closely," Bohr said.

He did drop a few hints, though: "We have published papers on III-V [three-five] devices, so that's one example [of the new materials Intel could use to move to 7nm], but introducing any new technology will be about balancing performance against manufacturability [sic]."

Bohr's comments are significant because EUV, long considered the best bet to replace current 193nm lithography and extend Moore's law beyond 10nm, isn't ready for prime time. In fact, it hasn't been for over a decade now.

"Scaling does continue to provide lower cost per transistor, and it is Intel's view that cost reduction is needed to justify new generations of process technology," he said, adding that it is crucial to recognize the importance of heterogeneous integration.

"Going forward, heterogeneous integration will become increasingly important, but we may not be able to do it all on one chip, so you will see more use of SoC solutions such as 2.5D integration, where two are mounted side by side on a substrate, or full 3D integration, stacking chips on top of each other, each one tuned for a different [manufacturing] process to perform different functions."

Follow Pulkit on Google+
