General Gaming Article

VIA Agrees SYSMark 2012 Not Useful, Quits Too

Posted: 22 Jun 2011 05:47 PM PDT

AMD resigning in a huff over alleged bias in SYSmark 2012 is one thing, but now two other vendors have publicly confirmed that they too have quit BAPCo, the benchmarking consortium behind the test.

On Wednesday, officials with VIA confirmed reports that the company had quit and said its reasons were similar to AMD's.

"VIA today confirmed reports that we have tendered our resignation to BAPCo," Richard Brown, a spokesman for VIA, told Maximum PC late Wednesday. "We strongly believe that the benchmarking applications tests developed for SYSmark 2012 and EEcoMark 2.0 do not accurately reflect real world PC usage scenarios and workloads and therefore feel we can no longer remain as a member of the organization." 

Echoing AMD's statement, VIA said it wanted more transparency.

"We hope that the industry can adopt a much more open and transparent process for developing fair and objective benchmarks that accurately measure real world PC performance and are committed to working with companies that share our vision."

VIA is familiar to most consumers for its successful run of chipsets in the Pentium III and Athlon days, but it has since exited the chipset business for Intel and AMD CPUs to concentrate on its own processors, such as the VIA C7, Nano X2, and Eden X2. All are low-power parts that pale in comparison to the speed of Intel's processors, or even AMD's.

Still, VIA taking its ball home along with AMD as well as Nvidia can't help the perception that something is amiss with the new SYSMark 2012 benchmark. Nvidia officials confirmed on Tuesday that the company quit BAPCo's board too, but would not elaborate on why it left.

A BAPCo spokesman has denied AMD's claims of bias, saying AMD was involved in the process all along and even created the method used to adopt the workloads.


Chrome to Natively Support Skype-like Features

Posted: 22 Jun 2011 04:44 PM PDT

Skype may have eventually gone to Microsoft, but that would have never happened had Redmond's cloud-obsessed rival Google not dropped the idea of acquiring the popular VoIP service in 2009. The Internet behemoth came very close to making a bid but backed out at the last moment.

According to Wesley Chan, an investment partner at Google Ventures, the data-intensive nature of Skype's underlying peer-to-peer technology turned out to be the deal breaker. Needless to say, the Big G has absolutely no regrets about not acquiring Skype's "old technology" as its own efforts seem to be coming along nicely. It has now announced plans to add Skype-like real-time communication (RTC) features into Chrome using its open-source WebRTC initiative.

"Our goal is to enable Chrome with Real-Time Communications (RTC) capabilities via simple Javascript APIs," Henrik Andreasson, a Google programmer, wrote on Friday. "We are working hard to provide full RTC support in Chrome all the way from WebKit down to the native audio and video parts. When we are done, any web developer shall be able to create RTC  applications, like the Google Talk client in Gmail, without using any plugins but only WebRTC components that runs in the sandbox. "

Besides Google, the WebRTC project also has the backing of Mozilla and Opera. It will eliminate the dependence of RTC applications like Google Voice and Video Chat on pesky browser plugins. Browsers from all three vendors are expected to begin shipping with WebRTC support very soon.

The Eyefinity Field Manual: Your Guide to Multi-Monitor Bliss

Posted: 22 Jun 2011 03:22 PM PDT

When ATI launched its 5000 series graphics cards back in September 2009, they were more than just performance marvels; they ushered in a new technology that promised PC gamers an experience unlike any other: true, no-compromise, multi-monitor gaming.

Sure, some ambitious PC-centric titles such as Supreme Commander had dabbled with multiple display support in the past, but these limited attempts offered little more than a convenient way to separate the mini map from the action. Eyefinity, by comparison, promised to bring multi-display gaming to hundreds of titles that were never optimized to support it. This would turn out to be both its biggest strength and its greatest obstacle. It was an ambitious and somewhat buggy undertaking when it was revealed back in 2009, but have a year and a half of driver releases improved the situation?

Having lived with an Eyefinity setup now for the past twelve months, I feel uniquely qualified to talk about not just how the technology has evolved, but if it was worth the cost.

Does multi-monitor gaming in 2011 finally live up to all the marketing hype? Hit the jump to find out.

Eyefinity in 2011: What's Improved, What Hasn't

AMD fans may not care to admit it, but the Catalyst driver suite hasn't always been the industry's golden boy when it comes to reliability. ATI has a proud history of impressive hardware under its belt, but the software team has had a lot to prove over the past few years. If you're one of the few still holding a grudge, let me reassure you: despite the odd hiccup here and there, I can now comfortably state that a modern release of the Catalyst Control Center feels every bit as stable, and as capable, as its ForceWare competitor.

Eyefinity support arrived with the launch of the 5000 series GPUs in Catalyst 9.9, and most of the caveats that existed back then have since been smoothed out. Of the three biggest obstacles we saw at launch, all but one have been addressed by the driver team.

CROSSFIRE SUPPORT

When AMD launched Eyefinity without native support for Crossfire, we were left scratching our heads. Considering that older titles such as Crysis can still make our graphics cards weep, how would a single GPU hold up while being forced to pump out three times as many pixels as before? The answer: not very well. Performance was acceptable in Source engine games, but let's face it, 15fps in Crysis just doesn't cut it.

Luckily, with the release of Catalyst 9.12 in December 2009, this obstacle was not just overcome; Crossfire scaling has improved dramatically ever since. Practically every subsequent driver release has brought better scaling to the table, and nobody benefits from this more than an Eyefinity gamer. A 15 percent improvement in your favorite game might go unnoticed on a single display, but it makes all the difference in the world when you're trying to push over 6 million pixels at once.
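For a sense of scale, the pixel math behind these claims is easy to check. A quick illustrative snippet (not anything AMD ships):

```python
# Pixel counts for a single 1080p panel vs. a 3x1 landscape Eyefinity group.
def pixel_count(width, height):
    return width * height

single = pixel_count(1920, 1080)   # 2,073,600 pixels
triple = pixel_count(5760, 1080)   # 6,220,800 pixels -- the "over 6 million"
print(triple // single)            # 3x the work for the GPU(s)
```

That 3x jump in raster work is why Crossfire scaling gains that feel marginal on one screen become so valuable here.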

BEZEL COMPENSATION

The single most common complaint about Eyefinity setups concerns the thick plastic frames that house a modern-day LCD. Bezel thickness was never much of a concern for display makers before Eyefinity came along, and unfortunately not much has changed. Eyefinity users are still very much in the minority, and simply don't buy enough screens to influence the market. AMD has worked with Samsung to release the thinnest-bezel offering we've seen so far, but at $1,900 for a three-monitor configuration and $3,100 for six, the vast majority of Eyefinity hopefuls will still need to look at thicker-bezel alternatives.

AMD, aware that this was a sticking point for many users, introduced a pretty innovative fix in Catalyst 10.3 to help address the issue. Driver-level bezel correction allows you to compensate for the dead space between your LCDs, converting that area into a blind spot rather than a hard stop. Prior to the introduction of bezel correction, textures would end on one monitor and abruptly start on the next, creating a very disjointed image. Bezel correction works by creating a custom resolution for your configuration and broadcasting it as the native setting for all your apps.
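To make the idea concrete, here's a rough sketch of how a bezel-compensated resolution could be derived. The function and the numbers (a 24-inch panel with a ~531mm active area and 20mm bezels) are illustrative assumptions, not AMD's actual algorithm:

```python
# Sketch: derive a bezel-corrected virtual width (illustration only,
# not AMD's real implementation).
def bezel_corrected_width(panel_w_px, panel_w_mm, bezel_mm, monitors=3):
    """Virtual desktop width once the dead space between panels is
    mapped to 'hidden' pixels at the panels' native pixel pitch."""
    px_per_mm = panel_w_px / panel_w_mm   # pixel pitch of the panel
    junctions = monitors - 1              # gaps between adjacent panels
    # Each junction spans two bezels (right edge of one panel, left of the next).
    hidden_px = junctions * 2 * bezel_mm * px_per_mm
    return monitors * panel_w_px + round(hidden_px)

# Assumed example: 24" 1080p panels, ~531mm active area, 20mm bezels.
print(bezel_corrected_width(1920, 531, 20))   # 6049, vs. a naive 5760
```

The hidden pixels are still rendered; they simply fall "behind" the bezels, which is what keeps a straight line straight as it crosses the gap.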

Bezel correction is an optional step when setting up a new Eyefinity group in the Catalyst control center, so make sure you don't miss it. You might not notice it in a racing game when looking at the grass or sky, but it will stick out like a sore thumb when you're looking at a face that has been incorrectly split in two by a 4-inch bezel.

DISPLAY PORT BLUES

When Eyefinity initially launched, the requirement for one monitor to be DisplayPort ready dramatically limited your options. In 2011 DisplayPort monitors still command a slight premium, but it's nowhere near as bad as it was. With a little effort I was able to find several options starting at just $250. They may not be the best panels on the market, but it's a great way to get your feet wet. The alternative is to use an active DisplayPort adapter, and luckily, these too have seen a considerable drop in price. Back in 2009 these dongles could run you $100 or more; now perfectly acceptable options are in the $25-$30 range.

The Nvidia solution for 2D Surround requires users to purchase two graphics cards in SLI. It's not as elegant a solution, to be sure, but if you were going to buy two cards either way, it certainly isn't a deal breaker.

Do I Really Need a Multi-Monitor Gaming Setup?

Eyefinity really is luxury gaming taken to excess. Does anyone really need to run their favorite games at a resolution 12 times higher than an Xbox 360? Probably not. Is it worth every penny if you have the money to burn? Well that really depends on your favorite genre.

FIRST PERSON SHOOTERS

Have you ever considered buying a $60 mouse pad because a fancy display ad told you it would improve your accuracy? If you answered yes, and you haven't considered Eyefinity yet, you've been looking in all the wrong places. By increasing the number of pixels you can see at any given time, your advantage on the battlefield increases dramatically. I cannot begin to tell you how many times I've seen foes blindly run past me in the distance, completely unaware of what's going on directly beside them.

Aside from the obvious competitive advantages, Eyefinity also creates a sense of immersion that is simply unmatched. Having so much additional screen space to monitor can be a bit unnerving at first, particularly for single display veterans, but over time you will eventually train your eyes to scan these monitors effortlessly using your peripheral vision. It takes a healthy dose of both patience and practice, but once you get the hang of it you'll wonder how you ever lived without it.

On the downside, some FPS titles suffer from an uncorrectable phenomenon often called Fisheye, but we'll touch more on that later.

Portrait or Landscape?  LANDSCAPE
Best Examples: Battlefield: Bad Company 2, Dead Space, Source engine games including Team Fortress 2, Half-Life 2, Left 4 Dead 2.

THIRD PERSON ACTION GAMES & MASSIVE MULTIPLAYER ONLINE RPGS

Most of the advantages I listed above for the first-person shooter junkies also apply here, but with one key difference: the third-person viewpoint is ideally suited to Eyefinity, without exception. Rather than the extra displays being a source of distraction, it just works. Open-world games such as Just Cause 2 offer a slightly better experience than indoor titles such as Batman: Arkham Asylum - but only because the walls of a high-security prison are somewhat less captivating than a tropical wonderland.

The vast majority of MMOs I tested also looked great in Eyefinity. The extra screen real estate comes in super handy if you prefer to leave your status windows open. Eve Online, for example, is a gorgeous game, but on a single monitor it's sometimes hard to even find your ship under the sea of required scanner windows.

Portrait or Landscape? LANDSCAPE
Best Examples (Action): Assassin's Creed II & Brotherhood, Batman Arkham Asylum, Just Cause 2, Splinter Cell Conviction
Best Examples (RPG/MMOs): World of Warcraft, Eve Online, Rift, Dragon Age

TOP DOWN STRATEGY GAMES

Any decent strategy gamer will tell you that awareness of your surroundings should always be priority #1, so by that logic, more pixels should offer a decisive advantage, right? The answer is an undeniable yes; some developers know this, and have gone to great lengths to level the playing field.

The latest releases of big-budget games such as Dawn of War, Shogun, and even Civilization work flawlessly with Eyefinity, while others, such as StarCraft II, have gone out of their way to disable it. Some resourceful gamers have found ways to mod Eyefinity support back in, but you won't catch me logging into a multiplayer lobby with a modified client. Blizzard isn't known for having much patience in that regard.

Blizzard aside, I would describe the strategy genre's embrace of widescreen gaming as mixed at best. When it works, the extra screen real estate can be invaluable for providing early warning of an incoming enemy attack, but since most strategy games use some variation on fog of war, this isn't always an advantage, even if the game natively supports it.

Another key example is Civilization V. While the game looks simply fantastic in Eyefinity, multiple monitors really don't add much to the experience beyond the wow factor. Since the vast majority of your attention is focused on the center of the screen, multiple monitors can actually be somewhat annoying, since you need to turn your head to view the controls locked on the far-left and far-right monitors. If you love strategy games, you're probably better off spending your hard-earned cash on a single 30" screen with a native 2560 x 1600 resolution.

If you do use Eyefinity, however, I would highly recommend a portrait configuration. Unless you make a habit of building your bases out in the middle of the map, you typically don't need to see as far east or west. Worse yet, if you end up building on the edge of a map, an entire monitor can end up stuck staring into a black abyss. Portrait-mode Eyefinity also goes some way toward addressing the Civilization problem mentioned above, with controls placed so far outside your normal field of view.

Portrait or Landscape?  PORTRAIT
Best Examples: Dawn of War II & Retribution, Total War Shogun 2, Civilization V

RACING & SIMULATION GAMES

If you find AMD showing off Eyefinity in the field, more often than not you're going to see them demonstrating a racing or flight sim. The reason is simple: people who drive cars are used to looking out their side windows and seeing the landscape whipping by.

Some might argue that Eyefinity was designed with simulation games in mind, but it's also the one genre that gains next to no competitive advantage from the extra real estate. Eyefinity can turn almost any racing sim into an immersive experience that you have to see to believe, but the advantages end there. At the end of the day, the road ahead is all that matters, making an Eyefinity setup less helpful to your lap times than a set of fancy rims. It sure adds a lot of bling, but don't expect it to help you win the race.

Portrait or Landscape? LANDSCAPE
Best Examples: Need for Speed Shift, Burnout Paradise, Dirt 2, H.A.W.X 1 & 2


Picking Monitors & Hardware

One of the most common questions people ask me about multi-monitor gaming is where to start. If you've got a Radeon 5000 or 6000 series GPU, or two GeForce 400 or 500 series boards, you're already halfway there.

PICKING THE GRAPHICS CARD

When picking from the AMD camp the most expensive single GPU you can afford will always be your best option. If you have any money left over, buy two. If you're an Nvidia fan, simply buy a pair of whatever you can comfortably afford. This might sound like an overly simple answer to a complicated question, but trust me when I say you'll need all the performance you can get.

Another important consideration is video memory. 1GB might sound like more than enough for a graphics card, but it disappears quickly at resolutions of 5760 x 1080 or higher. Always opt for the GPU with as much onboard memory as possible.
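As a rough illustration of why (back-of-the-envelope numbers only; real-world usage is dominated by textures and render targets, which also scale with resolution):

```python
# Back-of-the-envelope framebuffer math (illustration only -- actual VRAM
# usage is dominated by textures, render targets, and AA buffers).
def framebuffer_mb(w, h, bytes_per_px=4, buffers=3):
    """Triple-buffered 32-bit color, no antialiasing."""
    return w * h * bytes_per_px * buffers / 2**20

print(framebuffer_mb(1920, 1080))   # ~23.7 MB for a single 1080p display
print(framebuffer_mb(5760, 1080))   # ~71.2 MB for a 3x1 Eyefinity group
```

Those numbers look small in isolation, but every resolution-dependent buffer in a game triples the same way, which is how 1GB evaporates.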

PICKING THE MONITORS

Picking the right monitors is one of the most difficult challenges, and leaves you with some tough choices. The natural temptation is to simply mix and match, and while you can certainly take this route to help keep the cost down, I'm going to list a few best practices.

Requirements:
All monitors must share the same resolution and refresh rate, and at least one must connect via DisplayPort (unless you have an active DisplayPort adapter).

RECOMMENDATIONS:

  1. Buy the same brand/model/year whenever possible.
    When I took the Eyefinity plunge over a year ago, I mixed a two-year-old Dell UltraSharp 2407WFP with the closest match I could find at the time, the Dell U2410. This made financial sense back then, but despite my best efforts I was never able to fully match the colors between the two models, since the newest UltraSharp panels are all IPS. I lived with it for almost a year before finally giving up and standardizing on the U2410 for all three.
  2. Stick with the same size.
    If you are going to mix and match old & new displays, stick with the same physical size if at all possible. If you have a 24" 1080p panel in the center, try your best to find a matching set. It's not required, but it's definitely ideal.
  3. Three vs. six monitor setups.
    If you're flush with cash you might be tempted to opt for the full six-display Eyefinity experience (what Maximum PC reader wouldn't be?!). But trust me when I say diminishing returns set in quickly after the first three. If you're a typical user who sits less than three feet away from your displays, anything more than three monitors can be pretty overwhelming. In addition to the visual overload, you're also forcing your system to crunch over 12 million pixels at once. The performance toll for a six-monitor configuration can be pretty extreme.
  4. If you're mixing and matching old and new monitors, you'll want to calibrate.
    Software solutions such as ColorWizzard ($50) work well, but hardware devices such as Datacolor's Spyder 3 ($169) work best. Mismatched colors might not be something you'll notice on the desktop, but when the walls in your favorite game change color as your eyes scan across the screen, it can be pretty distracting.

HOW TO CONFIGURE MONITOR GROUPS

Setting up an Eyefinity group using the Catalyst Control Center has gotten much easier since launch, but it's still not completely intuitive, even for advanced users. Rather than walk you through it step-by-step, I would encourage you to check out AMD's interactive tutorial.

Common Problems With Eyefinity

FISHEYE & FOV

When developers set to work on a modern game, they typically do so with the expectation that the vast majority of users will experience that content on a widescreen display. This generally means an aspect ratio of 16:9 (1080p, 720p) or one of the other accepted standards such as 4:3 or 16:10. Eyefinity throws these ratios right out the window. Once activated, three separate 1080p displays in landscape mode are presented to the operating system as a single, massively wide panel with an aspect ratio of 16:3 - even wider if you're using bezel correction.
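The 16:3 figure falls straight out of the arithmetic; a quick sketch for the curious:

```python
# Reduce a resolution to its simplest aspect ratio.
from math import gcd

def aspect(w, h):
    g = gcd(w, h)
    return w // g, h // g

print(aspect(1920, 1080))   # (16, 9)  -- one 1080p panel
print(aspect(5760, 1080))   # (16, 3)  -- 3x1 landscape Eyefinity group
```

Triple the width at the same height and the ratio triples too, which is exactly the shape game engines were never designed around.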

The end result of such a hugely disproportionate aspect ratio varies greatly depending on the title, but first-person shooters in particular are prone to image distortion that worsens the further you get from the center. You'll often hear this referred to as "Fisheye," though I've also heard it called tunnel vision. Forward-thinking developers such as Valve have included the ability to adjust the FOV (Field of View) to help compensate for this problem, and it makes a huge difference if you're sensitive to this type of distortion. If - like me - you only use the side displays for peripheral vision, not being able to adjust this setting isn't a complete deal breaker, but the option is certainly nice to have.

When you lower the FOV, objects away from the center appear stretched and magnified; as you raise it, the view takes in more of the scene and objects become more compressed. Valve makes this setting accessible with a handy slider under the graphics settings, but many other titles hide it in .cfg and .ini files. If you Google your favorite game with the word "Eyefinity" appended, you will typically get directed to the Widescreen Gaming Forum. This community has done a fantastic job of rounding up fixes for almost every title, and is a great starting point when looking for fixes.

The WSGF community has also released an FOV calculation tool that is second to none.
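If you'd rather see the math such a tool automates, the standard "Hor+" relationship between vertical FOV and aspect ratio looks like this (a sketch of the common trigonometric formula, not the WSGF tool's actual source):

```python
# Hor+ FOV scaling: horizontal FOV grows with aspect ratio while
# vertical FOV stays fixed.
import math

def horizontal_fov(vertical_fov_deg, aspect):
    half_v = math.radians(vertical_fov_deg) / 2
    return math.degrees(2 * math.atan(math.tan(half_v) * aspect))

v = 73.74  # the vertical FOV that yields the classic 90-degree hFOV at 4:3
print(horizontal_fov(v, 4 / 3))    # ~90 degrees on a 4:3 display
print(horizontal_fov(v, 16 / 9))   # ~106.3 degrees on one widescreen panel
print(horizontal_fov(v, 48 / 9))   # ~151.9 degrees across a 3x1 Eyefinity group
```

Note how the angle grows much more slowly than the aspect ratio: tripling the width doesn't triple the field of view, which is why the side monitors mostly add peripheral awareness rather than a wildly wider vista.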

STRETCHED MENUS & DISTORTED CUT SCENES

Eyefinity works in an impressively large percentage of games - both older and modern titles - but you'll need to come to terms with the fact that some will never stretch out properly no matter how hard you try. High-profile games such as The Witcher or Mass Effect 1 & 2 are great examples of titles where the developer made engine-level design decisions that make running Eyefinity a complete nightmare.

The most common mistake I've seen so far is developers coding the size of the main menu options to scale with the width of the display. The best-case scenario is an ugly UI; more often than not, it renders the interface completely unusable. Another common Eyefinity issue is cut scenes or in-game scripted sequences rendering incorrectly. Bulletstorm, for example, allows you to adjust the FOV by binding a key in the .cfg files, which works great. Scripted sequences, however, ignore this override, leading to an abnormally high number of unintended crotch shots between missions, which works... not so great.

Final Impressions

Multi-monitor gaming in 2011 is not only alive and well, but more affordable than ever before. It's easy to go overboard once you start looking at high-end displays paired with AMD's latest and greatest in Crossfire, but don't feel pressured. Even a modest investment in a pair of matching side monitors will drastically improve the sense of immersion you derive from your gaming sessions. Assuming you already have a modern GPU, a $500 investment is about all it takes to get started.

This might sound like a hands-down ringing endorsement, and while I do firmly believe multiple monitors are a worthwhile upgrade for any hardcore gamer, the setup still isn't for the faint of heart. If rooting through the file system in search of .ini files, or researching FOV fixes for hours on end, doesn't sound like your idea of a good time, then this might not be for you. An impressively large percentage of titles work with next to no effort at all, but you'll only remember the ones that don't. Once you've sampled true ultra-widescreen gaming, it can be painful being forced back down to a single display to play one of the more stubborn titles.

Wide Screen Gaming Resources

AMD Eyefinity Tech Demo & More Information
Delphium's Field of View & Aspect Ratio Calculator
Widescreen Fixer (Unofficial Eyefinity patch for all COD titles + Bioshock, Battlefield 2, Halo, Unreal 3, and more)
Widescreen Gaming Forum

LulzSec Franchise Opens in Brazil, Takes Down Government Sites

Posted: 22 Jun 2011 02:57 PM PDT

If you thought one LulzSec was one too many, get ready for your worst nightmare. A new Brazilian faction of the now-infamous hacker group has begun its own attacks on government sites in Brazil. "Our Brazilian unit is making progress. Well done @LulzSecBrazil, brothers!" LulzSec proper tweeted.

LulzSec Brazil reportedly brought down the Brazilian government's main site, as well as the site of the President's office. The successful attacks were announced on Twitter with the traditional LulzSec call of "Tango Down." LulzSec Brazil has promised more mayhem to follow today, but we have yet to hear of any further significant attacks. The Brazilian arm of Anonymous has also been making some noise in the last 24 hours, possibly in response to LulzSec's attacks.

Both Anonymous and LulzSec in Brazil seem to be protesting, in their own ways, the lack of transparency in the nation's government. Brazil is often cited as lacking sufficient protections for free speech. We have to wonder if more LulzSec franchises will pop up around the globe.

Google Hiring Product Manager for "Games at Google"

Posted: 22 Jun 2011 02:43 PM PDT


Google has its proverbial fingers in a lot of pies, but one space it has yet to really investigate is gaming. If a new job posting is to be taken seriously, the Big G is about to change that. The company is looking for a product manager for a product called Games at Google. Can't really get more clear than that.

There aren't a lot of solid details in the job posting, just that the applicant has to be (and we're paraphrasing) a genius with an advanced degree and technical expertise. The position is said to include "game distribution and discovery, player identity, game mechanics, and more." The term "social gaming" is also used.

There are a few ways this could go. Google may be developing an Android alternative to Apple's Game Center and the cross-platform OpenFeint. Similarly, it could be building a gaming-centric back end for the Chrome Web Store. What do you think Google is up to?

Google Hits Unique Visitor Milestone

Posted: 22 Jun 2011 11:34 AM PDT

In its "Don't Be Evil" quest to become the entire Internet, Google hit a milestone in May that no other website has ever reached. Just when you thought the company couldn't possibly attract new visitors because everybody and his sister already uses the service - no one searches the Web anymore, after all, they Google it - the Internet giant became the first website ever to draw 1 billion unique visitors in a month.

It looks like the China fracas didn't faze users in other countries. So how does the world's biggest website keep drawing in new eyeballs? It's all about the extra services. The Wall Street Journal says that Gmail and YouTube both contributed significantly to Google's total.

Google's billion-visitor breakthrough came on the back of an 8.4 percent increase in visitors over the past year. Those 1 billion visitors spent about 200 billion minutes on Google's sites, or just over three hours per person. Microsoft had an even bigger viewership gain over the past year, with a 15 percent increase in users. The Redmond-based company nearly cracked the billion-visitor mark itself, but fell just short at 905 million. When comScore first began tracking Internet traffic in 2006, Microsoft actually held an edge over Google of around 45 million visitors.

Fired IT Manager Responds With Porn

Posted: 22 Jun 2011 11:05 AM PDT

Hell hath no fury like a sysadmin scorned. Just ask Baltimore Substance Abuse Systems Inc.: after the organization fired 52-year-old Walter Powell in 2009, the IT manager went on a hack attack against his former employer, breaking into the network and installing keyloggers on company computers. With his former CEO's password firmly in hand, he unleashed his coup de grâce. And yes, it involves porn, as many humorous computer stories do.

Powell used his ill-gotten access to seize control of his former boss's computer while the CEO was in the middle of a board meeting. Smack dab in the middle of a PowerPoint presentation, Powell replaced the original slideshow with a slideshow of hot and steamy pornographic images on the room's 64-inch television, according to Naked Security.

The Associated Press reports that Powell "pleaded guilty to two counts of unlawful access to a computer causing a malfunction and a count of possessing a pass code without authorization." Earlier this week, Powell was sentenced to three years of probation, 100 hours of community service, and two years in the slam, but the judge on the case suspended the jail sentence.

Computers Play Second Fiddle To Mobile Devices On Wi-Fi Networks

Posted: 22 Jun 2011 10:29 AM PDT

Little brothers are like your own portable punching bag: name calling, insulting and rubbing your smaller sibling's face in the dirt are all typical big brother pastimes. As any bigger brother can tell you, though, it sucks when your little brother gets big enough to fight back and punch you in the eye. The days of us big brother PC-types mocking younger technologies like smartphones and tablets may be coming to an end if a recent report is any indication: more people access Wi-Fi Internet using mobile devices than traditional computers.

"Well, duh," you might be thinking, but GigaOm says it's actually the first time this has ever happened, citing a report from cloud networking provider Meraki. In 2010, desktop operating systems like Windows and OS X claimed the lion's share of the market: 64 percent of the total Wi-Fi pie. Android and iOS devices combined accounted for only a third of all Wi-Fi access.

Those numbers shifted dramatically in 2011. Wi-Fi usage for Mac OS X, Windows 7/Vista, and XP all fell roughly 50 percent apiece, to a cumulative total of 36 percent for desktop operating systems. Google and Apple were all too happy to pick up what the desktops dropped: every mobile device in the study saw decent usage gains. Now, mobile OSes can call themselves kings of the Wi-Fi roost, sitting atop a healthy 58 percent chunk of the market. The iPhone alone accounts for almost a third of all Wi-Fi usage.

Before you think the traditional PC's sky is falling, realize the limitations of the report: Meraki's study took place in restaurants and only covered about 100,000 devices, a small fraction of the total number of devices connecting to the Internet via Wi-Fi. There's no reason to think those numbers wouldn't scale up, though. Plus, traditional desktop PCs are unaccounted for, since people rarely drag their rigs down to Burger King. If anything, we think Meraki's report suggests that users are ditching laptops in favor of smaller tablets and smartphones, and what's wrong with that? We keep a smartphone in our pocket, too.

PowerColor Pumps Double Barreled Radeon "HD6870X2" Graphics Card

Posted: 22 Jun 2011 09:06 AM PDT

PowerColor today said it "aims to blow gamers' minds" with its very first dual-GPU solution based on AMD's Barts XT graphics engine, the PowerColor HD6870X2. As the name implies, this dual-GPU graphics card sports two 6870 graphics chips under its dual-fan cooling apparatus. That equates to 2,240 stream processing units and 4.03 teraFLOPS of computing power.

The GPU cores cruise along at 900MHz, paired with 2GB of GDDR5 memory clocked at 1050MHz (4.2Gbps effective). To keep instability from rearing its ugly head, PowerColor's cooling solution employs 'Heat Pipe Direct Touch (HDT)' technology, with six flattened heat pipes sitting directly over the GPUs. This, PowerColor claims, dissipates heat 50 times better than a regular copper base.
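As a sanity check on those clocks, the implied memory bandwidth per GPU works out as follows, assuming each chip keeps the stock HD 6870's 256-bit memory bus (a reasonable assumption for a Barts XT part, though PowerColor's spec sheet is the final word):

```python
# Memory bandwidth implied by the quoted GDDR5 data rate (per GPU),
# assuming a stock 256-bit bus as on the reference HD 6870.
def bandwidth_gb_s(effective_rate_gbps, bus_width_bits):
    return effective_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(4.2, 256))   # 134.4 GB/s per GPU
```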

No official word on price, though according to news and rumor site Fudzilla, PowerColor plans to sell the card for around $450.

Image Credit: PowerColor

PlayStation 3 Modder Short on Funds, Expects Jail Sentence

Posted: 22 Jun 2011 08:33 AM PDT

George "Geohot" Hotz received a mountain of criticism for backing down on his scuffle with Sony and settling out of court, in part because so many donated money to his legal battle. Hotz has since made amends by donating leftover legal defense money to the EFF, but many are still furious he didn't fight this thing to the end. Given what might go down with another PS3 modder -- Alexander "graf_chokolo" Egorenkov -- Hotz might have made the right move after all.

Authorities raided Egorenkov's home in February after Sony sued him for hacking the PS3 in an effort to restore OtherOS (Linux) support. According to Kotaku, Sony at last check was seeking €1,000,000 (about $1.44 million), and while Egorenkov tried to fight the good fight with donations, the well is apparently drying up and he's now fearing the worst.

"Hi guys, no money left anymore. Going to jail soon probably because I cannot pay court costs," Egorenkov said on his website. "But I'm ready to stand up for everything I said and go to jail for that too. It's not important to win, more important is to show them that we are ready to fight, that they cannot scare me off easily. Yeah, I'm ready to go to jail for my beliefs and my principles."

Egorenkov hasn't actually been sentenced yet, but if he does end up going to prison, it may be Sony that ultimately pays the bigger price, in the form of yet another PR hit.
