General Gaming Article

Intel's Ivy Bridge: The Maximum PC Review

Posted: 24 Apr 2012 11:24 AM PDT

The world's smallest-process consumer CPU comes out in a big way

You are, no doubt, quite familiar with Intel's CPU-release "cadence" of tick-tock by now. If not, the short story is that every tock brings a major breakthrough, while ticks are decent upgrades but nothing to Twitter home about.

That's not necessarily the case with Intel's latest tick, the Ivy Bridge CPU. Sure, the performance enhancements on the x86 side of the aisle won't exactly knock you on your tuchus, but they're still decent. The upgrades to the graphics core, however, make Ivy Bridge more noteworthy.

As we know, Intel found religion through graphics and has been progressively improving on it ever since. The Clarkdale CPUs moved graphics directly into the CPU package, and Sandy Bridge CPUs moved graphics directly onto the CPU die itself. With Ivy Bridge, Intel says it outdid itself by doubling the graphics performance of Sandy Bridge.

If you're ready to write off Ivy Bridge as an incremental chip that you, the enthusiast, don't give a damn about, you're wrong. There's a lot more to Ivy Bridge that makes it the default CPU for an enthusiast who doesn't want to jump into the bigger, pricier LGA2011 socket. Don't believe us? Read on to find out why you want this CPU instead of Sandy Bridge.

Meet the Ivy Bridge Lineup


World's First Chip with '3D' Transistors

Despite its revolutionary tri-gate design, Ivy Bridge doesn't do much to advance x86

We've long dubbed Intel the "Master of the Fab." The company's prowess in chip fabrication is the envy of the world. Yeah, there was that little thing with the Pentium 4, which hit the process wall like a freight train, but for the most part, Intel's mastery of chip fabrication has always made its new CPUs a tour de force of technology that makes you wonder if the company doesn't have a crashed flying saucer hidden at 2200 Mission College Boulevard.

With Ivy Bridge, Intel again amazes with the world's first use of tri-gate, or 3D, transistors. Also called FinFETs, for fin field-effect transistors, these 3D transistors literally rise up off the die to dramatically reduce power consumption while increasing performance.

In a traditional planar transistor, current flows on a flat surface like a river. A gate, which ostensibly controls that flow, lies across the top of that river with contact only along a small surface. With a finFET, or 3D tri-gate, the flow of power spans a fin that juts from the surface. Instead of just contacting the surface along one dimension, the gate encircles it and makes contact on three sides.

Intel says this gives it far greater control of power and enables it to drive the signal harder while adding only a small amount to the build cost. Despite sharing its architectural underpinnings with Sandy Bridge, Ivy Bridge should provide better performance while consuming significantly less power than an equivalent SB processor. So far, that seems to be panning out. A typical performance Sandy Bridge chip, such as the 3.4GHz Core i7-2600K, is rated at 95 watts; the new 3.5GHz Core i7-3770K is rated at 77 watts, despite being the higher-performing processor. The promise of tri-gate should pay even more dividends at lower power thresholds. Right now, Intel is only detailing its quad-core parts. Dual-core CPUs haven't been announced yet, but we'll be curious to see how aggressively Ivy Bridge performs in notebooks.

Ivy Bridge isn't just a process story, though. It's about keeping the chains moving. If, after all this investment in 3D transistors, the damned CPU isn't any faster, no one would care if it were made out of the purest shimmering samite. Fortunately, that isn't the case, which you can see in the chart on the third page. But first, let's break it down two ways: Even Intel says Ivy Bridge isn't a big step forward for pure x86 performance, as it's largely a die shrink of the Sandy Bridge core. The cache remains the same and the base clocks are similar. Where Ivy Bridge appears to have an edge in x86 performance is in its lower power envelope. As you know, Intel essentially overclocks, or "Turbo Boosts," the chip based on how much power it's eating and how hot it's running. So if a chip can run cooler and consume less power than its counterpart, it can run at a higher turbo clock for longer.
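The basic idea is easy to sketch as a toy feedback loop. The Python below is purely illustrative, and every number in it is invented for demonstration; it is emphatically not Intel's actual Turbo Boost algorithm. The point is simply that a die burning fewer watts per clock step sustains the same turbo ceiling inside a smaller power envelope:

```python
# Toy model of Turbo Boost-style clock scaling -- NOT Intel's actual
# algorithm; every figure below is invented for illustration only.

def max_turbo_mhz(base_ratio, watts_per_100mhz, tdp_watts, bclk_mhz=100):
    """Step the core ratio up while the estimated power draw
    stays inside the chip's power envelope (TDP)."""
    ratio = base_ratio
    while (ratio + 1) * watts_per_100mhz <= tdp_watts:
        ratio += 1
    return ratio * bclk_mhz

# A die that burns fewer watts per 100MHz step (think 22nm tri-gate
# vs. 32nm planar) hits the same clock from a much smaller budget.
print(max_turbo_mhz(34, 2.5, 95))  # 3800 -- from a 95W budget
print(max_turbo_mhz(35, 2.0, 77))  # 3800 -- same clock from 77W
```

Run cooler, and the loop simply keeps stepping up for longer before it hits the wall.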

Where Intel seems to have put most of its focus this time is on the GPU side. In fact, Intel says it has achieved roughly a doubling of graphics performance over the Sandy Bridge processors. The improvement is good enough that the company says Ivy Bridge processors are capable of playing 100 games out of the box, where Sandy Bridge could only play 50. Detailed info about Ivy Bridge's graphics capabilities is on the second page, but suffice it to say, it's obviously better. Is it enough to forgo a discrete GPU?

For certain uses, such as an HTPC or all-in-one PC that won't be used primarily as a gaming machine, yes. Of course, notebook users will also be pleased to get more graphics performance from the newer Ivy Bridge parts.

Overclocking

With Ivy Bridge, Intel maintains the "K" versions that it introduced with its Lynnfield procs and continued with Sandy Bridge. Like Sandy Bridge, Ivy Bridge isn't hugely tolerant of bclock, or base clock, overclocking. Intel says the most you should expect is a 7 percent bclock nudge before things go sideways. Instead, overclocking will continue to rely on upping the Turbo Boost or clock ratios. Intel has enhanced Ivy Bridge a bit by increasing the maximum core ratio from 59 on Sandy Bridge to 63, and Ivy Bridge now lets you change the core ratios in real time. Graphics support a greater range for overclocking, too, and Ivy Bridge will let you run the RAM up to DDR3/2667 through overclocks (DDR3/1600 is the official speed).
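For the arithmetic-minded, here's how those limits combine, assuming the platform's standard 100MHz base clock. The helper function and the 45x example ratio are ours, purely for illustration:

```python
# Core frequency = base clock (bclk) x core ratio. The limits used here
# come from the text: ~7 percent bclock headroom, max core ratio of 63.

def core_clock_mhz(bclk_mhz, ratio):
    return bclk_mhz * ratio

print(core_clock_mhz(100, 35))  # 3500 -- stock Core i7-3770K
print(core_clock_mhz(100, 63))  # 6300 -- ratio-only ceiling on Ivy Bridge
print(core_clock_mhz(107, 45))  # 4815 -- 7 percent bclock nudge, 45x ratio
```

In practice you'll run out of voltage and cooling long before you run out of multiplier, which is why the bump from 59 to 63 is more about headroom than anything you'll actually hit.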

Compatibility

We've long railed against Intel for releasing new sockets with new CPUs (remember the short-lived Socket LGA1156, Socket 423, and the numerous LGA775 versions?), but the company has stepped up to the plate for the Sandy Bridge-to-Ivy Bridge transition. As Intel promised, most LGA1155 boards will support Ivy Bridge procs if the board maker updates the BIOS to support the chip. However, not all chipsets will make the Ivy Bridge cut. Intel has intentionally left out support for the business chipsets Q65, Q67, and B65 while supporting the consumer H61, H67, P67, and Z68. Why leave some out? Intel believes the days of an IT shop getting down and dirty and upgrading processors in an office-drone PC are long gone, so there's just no reason to expend the resources on unnecessary support. Besides getting the latest core technology from Intel, switching to Ivy Bridge on older 6-series boards should also give you PCIe 3.0 support on some slots.

Ivy Bridge vs. Sandy Bridge: Beneath the Surface

The 22nm-based Ivy Bridge processor is considerably smaller than its predecessor. It has nearly 400 million more transistors yet is about 25 percent smaller. What's more interesting, however, is how much real estate is dedicated to each task on the new Ivy Bridge vs. Sandy Bridge. These die shots (not to scale) show that the almost 2x performance bump in graphics comes at the price of die space. Intel, however, dismisses any criticism regarding how much emphasis it placed on graphics over x86 functionality, saying that just because more space appears to be expended on graphics doesn't mean it's more important. Um, OK.

Sandy Bridge

Ivy Bridge
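For the curious, those proportions check out if you plug in Intel's published die figures: roughly 1.4 billion transistors on a 160mm² die for quad-core Ivy Bridge versus about 995 million on 216mm² for Sandy Bridge. Treat those as our assumed inputs for this quick back-of-the-envelope check:

```python
# Published quad-core die figures (assumed inputs for this sketch).
sandy = {"transistors": 995e6, "area_mm2": 216}
ivy   = {"transistors": 1.4e9, "area_mm2": 160}

extra_transistors = ivy["transistors"] - sandy["transistors"]
shrink_pct = (1 - ivy["area_mm2"] / sandy["area_mm2"]) * 100
density_gain = (ivy["transistors"] / ivy["area_mm2"]) / (
    sandy["transistors"] / sandy["area_mm2"])

print(f"{extra_transistors / 1e6:.0f}M more transistors")  # 405M
print(f"{shrink_pct:.0f}% smaller die")                    # 26%
print(f"{density_gain:.1f}x the transistor density")       # 1.9x
```

Nearly double the transistor density is exactly the kind of jump you'd expect from a 32nm-to-22nm shrink.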

Next up: The 7-series chipset and graphics benchmarks!

7-series Chipset Brings Few Changes 

 Gigabyte's GA-Z77X-UD5H features out-of-the-box Ivy Bridge support and USB 3.0 ports powered by Intel!

Conspiracy theorists, unite: If you're one of the tin-foil hat wearers (this means you, Nathan Edwards) who was absolutely certain Intel was trying to sandbag USB 3.0 in order to push Thunderbolt, the new Z77 chipset puts your suspicions to rest. The Z77, you see, finally brings native USB 3.0 support to the world of Intel. Why all the fuss over native support? First, it cuts the cost of a board, slightly, since the board maker has one less chip to supply. Generally, performance and compatibility of integrated USB 3.0 tends to be better, too. Finally, native support means USB 3.0 in just about every new PC going forward. That means more devices and lower costs, which, as Admiral Kirk says, is better for me, better for you, and (pause) better for them.

Native USB 3.0 won't extend to all ports on a motherboard, though. The Intel PCH supports up to four USB 3.0 ports, so on motherboards that offer more than that, it'll be a mix of USB 2.0, Intel USB 3.0, and third-party USB 3.0 support. The Gigabyte GA-Z77X-UD5H that we used, for example, had four USB 3.0 ports on the back with an additional three USB 3.0 headers, accomplished by combining the Intel chipset's support with a discrete controller from VIA.

Beyond USB 3.0, the 7-series chipset is a fairly incremental update. SATA support, for example, is the same weak-sauce mix of two SATA 6Gb/s and four SATA 3Gb/s ports. When we asked, critically, why not make all the ports 6Gb/s, Intel threw it back in our face by saying that backward compatibility with the 6-series boards was important to keep costs down on the 7-series boards. And since we're always whining about backward compatibility, isn't that important? Well, yes—but this is the last time, Intel. The 5-series, 6-series, and now 7-series have all shared the same SATA 6Gb/s configuration, so we better not see the 8-series with it, too.

Other key differences between 7-series and 6-series are support for three displays using Ivy Bridge's graphics chip, and of course, support for both Sandy Bridge and Ivy Bridge chips out of the box. Is there a performance difference? Frankly, no. For our tests, we used a Gigabyte GA-Z77X-UD5H board, first with the Core i7-3770K, which we then swapped out for a Core i7-2600K. We then re-ran our benchmarks and compared them to our Z68/2600K numbers. The difference? Nada, other than the weird, unexplainable bogies we had with a couple of benchmarks. The two, frankly, are essentially indistinguishable. Even the Intel USB 3.0 support didn't prove to be superior to any of the discrete USB 3.0 chipsets we've seen. So if you're considering whether to move from Z68 just to upgrade, we don't recommend it. However, if you're building a new box on an Ivy Bridge processor, we'd build on Z77 just to have the latest chipset.

Ivy Bridge Graphics

It's what everyone's been waiting for. Does Intel deliver?

We're all Charlie Browns when it comes to Intel graphics. Intel, of course, is Lucy, pulling the graphics eye-candy football away after promising that this time will be different. Once again, Intel is promising that this generation of the GPU built into the upcoming Ivy Bridge 22nm CPU will be different. Honest!

Several years ago, Intel promised to speed up its graphics core by 10x per generation—and that 10x speedup would start with Ivy Bridge. With Ivy Bridge almost upon us, it's worth diving into its internal architecture to understand what's really changed.

Based on what we know about DirectX 11 compute shaders and the OpenCL 1.1 implementation, it looks like Intel's new GPU is getting a pretty robust set of compute-capable shaders. That's an encouraging sign, as is support for hardware tessellation.

Those are the gross differences. Internally, the GPU has been redesigned from the ground up and partitioned into five domains. The global asset area takes care of geometry: it houses the vertex, geometry, and hull shaders, plus the tessellator, and setup also lives in this section. The resulting output is fed through the thread dispatch engine to the execution units (EUs), which do a lot of the heavy lifting. After the EUs are done, the render section takes over.

Intel's hardware tessellation engine is fixed-function, but can accept different cues for setting the overall level of tessellation. The EUs have been beefed up, with each EU offering 2x the performance per watt of Sandy Bridge. The higher-end HD 4000 has 16 EUs, up from 12 on Sandy Bridge's GPU. Intel also added an L3 cache to the GPU, which improves overall throughput, since data doesn't need to be fed to the GPU from the ring bus as frequently. This also saves on overall power.

One of the key performance-enhancing features is co-issue of instructions to execution units. Sandy Bridge supported this on some operations, but Ivy Bridge extends this to many more operations. 

How does this affect actual performance with PC games? We ran a few tests on very early drivers. What we saw was definitely encouraging.

Even with early drivers, we're seeing about a 25 percent or better increase with 3D games. You'll still need to sacrifice some detail levels, but you'll get acceptable performance in all but the most bleeding-edge games. Titles like StarCraft II, Civilization V, Modern Warfare 3, and Portal 2 will probably run fine, if you're willing to dial back resolution, turn off AA, and run at medium or lower detail levels. It's probably best to steer away from highly demanding titles, though, such as Deus Ex: Human Revolution or The Witcher 2.

Note that 3DMark 11 actually runs, giving clear evidence that Ivy Bridge is indeed DirectX 11 compliant. The score isn't big, but the fact that there's a score at all is encouraging. As with Sandy Bridge, Ivy Bridge includes a dedicated, fixed-function video encoder. Intel is claiming a nearly 2x encode advantage over Sandy Bridge, but that will depend on the application and workload. We saw only about a 5 percent gain over Sandy Bridge when encoding an HD video file for iPhone using CyberLink's Media Espresso 6.5 (295 seconds for Ivy Bridge versus Sandy Bridge's 311 seconds). Encoding performance is likely to be better with stereoscopic content, for example.
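For the record, here's how that gain works out from the two reported times; this is just a quick sanity check on the numbers above, nothing more:

```python
# Encode times from the Media Espresso 6.5 test above.
sandy_seconds = 311  # Sandy Bridge (Core i7-2600K)
ivy_seconds = 295    # Ivy Bridge (Core i7-3770K)

# Same job in less time; expressed as throughput, Ivy Bridge is
# (311 / 295 - 1) ~= 5.4 percent faster.
gain_pct = (sandy_seconds / ivy_seconds - 1) * 100
print(f"{gain_pct:.1f}% faster")  # 5.4% faster
```

A far cry from the claimed 2x, at least in this particular workload.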

Finally, the new GPU, in conjunction with motherboards using Intel's 7-series chipsets, will support up to three simultaneous displays. As with Sandy Bridge, DVI support will be limited to single link only, but that will only affect a handful of users with older 30-inch monitors. Full bandwidth support for very high resolutions will be available through DisplayPort 1.2 or HDMI 1.4a.

Overall, Ivy Bridge's graphics are clearly better. Desktop users who are regular PC gamers will definitely want to stick with their favorite discrete graphics card, but owners of Ivy Bridge ultrabooks might be able to get a reasonable gaming fix now—provided the unit is built with the HD 4000. It's unlikely that the HD 2500 will be much use for gaming. –Loyd Case

Next up: The benchmarks!

Ivy Bridge vs. the Benchmarks 

New kid proves itself to be the new standard bearer

For our testing, we used a Gigabyte GA-Z77X-UD5H motherboard using the new Z77 "Panther Point" chipset. To this, we added a 3.5GHz Core i7-3770K and installed a fresh copy of 64-bit Windows 7 Professional along with 8GB of DDR3/1600, a GeForce GTX 580 card, and a 150GB Western Digital Raptor. For benchmarks, we reached for the same set of mostly CPU-dependent benchmarks that we've used to review the last few rounds of processors.

For direct comparisons, we decided to pit the new 3770K against the Core i7-2600K and Core i7-3820. Why not the Core i7-2700K, which is clocked the same as the Core i7-3770K? First, there's but a 100MHz difference between the Core i7-2600K and the new Core i7-3770K, and both are priced the same. The Core i7-2700K has always been a bit of an odd duck to us: you pay $25 over a 2600K and really only get an extra 100MHz. Why bother? Obviously, the LGA2011 Core i7-3820 can't be tested in the same board as the Core i7-3770K, so we used our old standby: the Asus P9X79 Deluxe.

For reference, we also included in our chart the performance numbers of the Core i7-3960X, AMD's octo-core FX-8150, and the classic Core i7-990X "Gulftown." While the last two platforms also had to use different motherboards, we tried to normalize as much as possible by clocking the RAM the same and using the same graphics cards and drivers.

The test suite includes everything from 3D modeling tests, to video editing and video transcoding, to several synthetic benchmarks and a few gaming tests with the resolutions cranked down low enough to take the graphics card out of the equation. 

While we included six-core and eight-core processors in the chart, this is really about Intel's quads. Three scenarios come up: Do you buy a Sandy Bridge or Ivy Bridge for your new build? Should you upgrade from your Sandy Bridge to Ivy Bridge? Should you just bypass Ivy Bridge for Sandy Bridge-E or a hexa-core chip?


Let's dig into the numbers

When we look at all three quad cores, it's clear that Ivy Bridge has a performance advantage over the Sandy Bridge part in just about every benchmark. Across the board, we generally saw gains of 5 to 15 percent in favor of the Ivy Bridge. In fact, the only places where Ivy Bridge was slower were 3DMark's GPU test and Dirt 2. Why? Frankly, we don't know. We actually expected those scores to be fairly close, with Ivy Bridge slightly ahead of the pack, but for baffling reasons it was slower in these tests. Even more baffling, an exact duplicate of our configuration at Gigabyte HQ put the numbers where they should have been. What's going on? We're not sure; we swapped every component possible in an attempt to find the gremlin but could not root it out.

Despite these two anomalies, it's pretty clear that Ivy Bridge is faster than the similarly priced Sandy Bridge part. The real shocker was its competitiveness with the Core i7-3820 in some benchmarks. We thought the Core i7-3820's base clock advantage of 200MHz and quad-channel memory would put it in front, but that wasn't always the case. In some benchmarks, the Core i7-3770K was ahead by a small but measurable margin of 3 to 6 percent.

One interesting benchmark to examine here is the Cinebench 10 Single Core test, where we have Cinebench 10 render using only a single core instead of across all cores. This is probably the best indication of how efficient Ivy Bridge's cores are against the different generations of chips here: Sandy Bridge, Westmere, and Bulldozer. It's just no contest. Ivy Bridge's core is about 15 percent faster than Sandy Bridge's, 9 percent faster than Sandy Bridge-E's, 34 percent faster than Westmere's, and an incredible 73 percent faster than Bulldozer's. Don't think that gives Ivy Bridge a definitive edge over the big boys, though. Despite each core being faster, more cores still matter if your application uses them. Even the ancient Core i7-990X has an edge over the Core i7-3770K in many of our multithreaded benchmarks. We will be honest, though—the margin isn't as great as we would have expected.

But let's get back to our questions: Do you buy a Sandy Bridge or Ivy Bridge for your new build? This one's easy. Ivy Bridge, my friend. With the price of 2600K and 3770K exactly the same, there's really no reason to buy a 2600K unless you're limited by your motherboard's support for it. 

Should you upgrade from your Sandy Bridge to an Ivy Bridge? No. It would be foolish to think that just because Ivy Bridge is here your Sandy Bridge chip is a piece of junk. The only reason we could see upgrading is if you're coming from a lower-end, limited Sandy Bridge chip or need better integrated graphics, but otherwise, Sandy Bridge has plenty of life left in it.

Should you just bypass Ivy Bridge for Sandy Bridge-E or a hexa-core chip? We can't answer that one for you; it comes down to your computing needs. While we think Ivy Bridge is a hell of a chip, it's not faster than a hexa-core, even an older one, on thread-heavy tasks like 3D rendering and modeling, video encoding, and other content-creation jobs. We still recommend that if you compute for a living running thread-heavy tasks, it's worth the stretch for a hexa-core chip such as the Core i7-3960X or Core i7-3930K. All that aside, we think the Core i7-3770K is the new king of the midrange. Yes, it's hard to muster the same enthusiasm we had when the Core i7-2600K first arrived and wiped the floor with all other CPUs, but you shouldn't discount Ivy Bridge. It's fast, it's cheap, and it's cool. What more could you ask for?

Sophos: 20 Percent Of Macs Hide A Chlamydia-Like Risk For Windows PCs

Posted: 24 Apr 2012 10:52 AM PDT

The Flashback botnet scare may have thrust Macs' supposed invulnerability to viruses under a microscope, but Sophos decided it wanted some numbers to go along with the heaping of hype. So the company studied feedback from 100,000 Apple computers with Sophos antivirus installed and, surprisingly, discovered that the Macs were fairly teeming with malware. Before you start laughing, consider this: the vast majority of the malware found didn't affect OS X at all. It targeted Windows PCs.

Only 2.7 percent of the infected Macs contained malware that was harmful to Apple computers, Sophos reports in both a press release and a post on its Naked Security blog. However, a whopping 20 percent of Macs -- that's one in five, if math isn't your strong point -- were riddled with "one or more instances of Windows malware." Most of the Mac-targeting bugs were either Flashback or fake antivirus scams, while the top Windows malware found on Macs turns PCs into spam factories. 

Yes, Macs can often transmit that malware to PCs.

Sophos says that some of the PC malware infected the Macs as far back as 2007 and could have been easily removed at any point had the owners installed an antivirus program rather than buying into the whole "Macs don't get viruses" thing. Sophos' Graham Cluley also says PC malware on Macs is a lot like Chlamydia:

Just like malware on your computer, Chlamydia commonly shows no obvious symptoms. But left undetected Chlamydia can cause serious problems, such as infertility… The good news is that Chlamydia is easy to treat. And, if it isn't too tacky to make a parallel, so is malware on Macs.

Cluley then went on to plug Sophos' free antivirus product for Mac users. Keep that in mind while you're contemplating these numbers. The study also drew its sample from 100,000 Macs that "recently" installed Sophos antivirus, which means the stats could be skewed somewhat, as you aren't likely to install a new antivirus program unless you're worried that either a) your Mac is infected or b) the Flashback boogieman is going to get you.

NoFan's New All-Copper CPU Heatsink Cooler Is Pretty, Big And Whisper Quiet

Posted: 24 Apr 2012 10:32 AM PDT

Sometimes, you don't want to hear about a CPU's manufacturing process, or its cores, or the strength of its integrated graphics. Kidding! Of course you want to hear about all that. What you don't want to hear is the sound of a heavy-duty fan trying to keep your heavy-duty proc from getting hot under the collar. Enter this amazing all-copper beaut of a heatsink from Nofan. It's massive, it's purdy, and it's silent.

Engadget pointed us towards FanlessTech.com, the first site to find the copper behemoth. (Check out the FanlessTech link for even more pics.) The Nofan CR-95C heatsink clocks in at roughly 7.09 inches by 5.83 inches and weighs 2.25 lbs. That massive size and stylized "IcePipe" design help it keep CPUs with a TDP rating of up to 100W cool and running calmly; that's not quite enough for a Core i7 with a beefy overclock, but more than capable of handling any stock Trinity or Ivy Bridge processor with ease.

FanlessTech doesn't expect mass shipments until June, but Britain's QuietPC.com expects a batch of 50 in tomorrow and is accepting preorders now for $107.50 a pop. It may not fit the needs of hardcore system builders (unless you have a big ol' HTPC with a side window, that is), but we thought you should bask in the CR-95C's beautiful glory nonetheless.

Image credit: QuietPC.com

AMD Launches Energy-Efficient Radeon 7000M Mobile GPU Series

Posted: 24 Apr 2012 10:11 AM PDT

AMD's Radeon 7000 series GPUs have officially been out for, what, just over four months now? Time sure flies! But even though you've been able to shove next-gen Radeon cards into a desktop build for over a third of a year, laptop users haven't been quite as lucky, as mobile variants hadn't been announced -- until today. This morning, AMD announced the Radeon 7000M series with three new GPUs built around the 28nm manufacturing process.

The chart above lays out most of the pertinent specs for the enthusiast-grade 7900M ("Wimbledon"), the mainstream 7800M ("Heathrow") and the Ultrabook-friendly 7700M ("Chelsea"), the new 28nm GPUs. The other entries in the 7000M lineup are based off the older 40nm process.

The Verge reports the flagship 7970M hits 70fps in Skyrim at 1920x1200 resolution, blowing away the 49.3fps put up by the Nvidia 675M. Keep in mind, however, that the 675M is basically a rebranded 580M and isn't based on the Kepler architecture. The 7870M and 7770M hit 41.4fps and 36.9fps, respectively. AMD claims the mobile GPUs are powerful enough to run up to six monitors using Eyefinity.

Energy efficiency is oh-so-important in these slim-and-light notebook days; the 7000M series lays off the juice thanks to AMD's new "Enduro" switching technology, which behaves like Nvidia's Optimus technology, swapping the graphical load between integrated and discrete graphics depending on need. Radeon 7000M series cards can also shut down unused portions of the GPU to save even more energy and disable the GPU entirely when the integrated graphics processor is handling the workload.

Google Drive Goes Live With 5GB Of Free Cloud Storage

Posted: 24 Apr 2012 10:02 AM PDT

After years of rumors, whispers and supposed false starts -- and a week of anticipatory service upgrades from competitors like Dropbox and SkyDrive -- Google Drive is finally here. Yep, Google's getting into the increasingly crowded cloud storage game and it's bringing wallet-friendly price points and a bevy of features swiped from Google Docs and others.

Watch the video above to get the basic details, then wander over here to claim your own 5GB of free cloud storage space. Upgrade options range from $2.50/mo for a 20GB upgrade all the way up to $800/mo for 16TB worth of Google's servers; $5/mo nets you 100GB of space. (Note: I just edited those numbers, which changed between the initial posting of this article and now.)
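Working out the cost per gigabyte of the three paid tiers shows how steeply the per-GB price drops as you scale up (sizes and prices as quoted above; the per-GB framing is ours):

```python
# Google Drive paid tiers as quoted above: (monthly USD, gigabytes).
plans = [(2.50, 20), (5.00, 100), (800.00, 16 * 1024)]

for price_usd, gb in plans:
    cents_per_gb = price_usd * 100 / gb
    print(f"{gb}GB at ${price_usd:.2f}/mo -> {cents_per_gb:.1f} cents/GB")
```

The 100GB tier is less than half the per-gigabyte price of the 20GB tier, and the monster 16TB tier works out cheapest of all.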

Files stored in GDrive are available for easy sharing via Gmail or Google+; each file also has sharing options for individual users. Apps are already available for PCs, Macs and Android devices, with iOS support "coming soon." There are also robust file-searching capabilities in place. I just installed the desktop PC client, and it supports drag-and-drop functionality.

Once you sign up for Google Drive, it basically absorbs your Google Docs. In fact, "Docs" gets replaced by "Drive" in the black nav bar at the top of Google sites. All of the features available in Docs are also available in Drive, right down to the awesome real-time collaboration mode. Google Drive tracks file revisions for up to a month, though you can opt to have it track a file's revisions eternally if you so desire.

One thing I don't like off the bat is that opening a file stored as a converted GDocs document in the GDrive PC client boots up your browser and opens Google Docs, rather than opening the file in a local word processing program. Not only is it irritating, it introduces a decent amount of lag time between clicking on the file and having it actually open, at least on my Core i5 notebook. It's understandable, though, as GDocs converts files to a new GDocs format for editing. (I guess I'm grumping more about Docs than GDrive, here.) Standard .doc files and the like that haven't been converted to Google's proprietary GDocs format open locally in Word or LibreOffice just fine, though.

While SkyDrive and Dropbox's file support is fairly limited, Google says GDrive's file support chops are comparatively beefy: Open over 30 file types right in your browser—including HD video, Adobe Illustrator and Photoshop—even if you don't have the program installed on your computer. Nifty!

It's free, so go check it out if you want. The lack of Linux support may bum out some, but Google Docs fanatics especially will find a lot to like in Google Drive.

Dropbox Adds 'Get Link' Feature for Easier Sharing

Posted: 24 Apr 2012 08:45 AM PDT

Freemium cloud storage service Dropbox today announced "a whole new way" of sharing files, which it says makes it ridiculously "easy to share your stuff from the web, your computer, or mobile device." To be honest, though, the said feature is far from being novel (perhaps Dropbox is happy about beating Leonardo da Vinci to the punch). While unprecedented it most definitely isn't, you're likely to find it very useful. Hit the jump for more.

Sharing files via their URLs has long been the bread and butter of the cyberlocker industry, but Dropbox has taken its own sweet time rolling it out to all of its 50 million-plus users. Hitherto, this feature was only available in beta.

"Anyone with the link gets access to a snazzy page where they can view (but not edit) your stuff," wrote Dropbox's Jon Ying in a blog post announcing the new feature. "Our gallery pages give your photos, videos, and even docs the gorgeous, full-browser view they deserve. This means that people who follow your link can see pictures, look at presentations, and watch home videos without having to download and open them separately."

Making a link for a file or folder is as easy as right-clicking and selecting "Get link." On mobile, you simply open the file and press the link icon in the bottom-left corner.

New Screenshots and System Specs Emerge for Max Payne 3

Posted: 24 Apr 2012 07:27 AM PDT

The third installment in the Max Payne series is set to ship for PC on May 29th in North America and June 1st in Europe, but is your system ready? To help you determine that, Rockstar Games has coughed up a list of system specifications, including hardware and software, with what appears to be both minimum and recommended configurations (the list's layout is a bit vague).

Max Payne 3 will run on 32-bit and 64-bit flavors of Windows XP Service Pack 3, Windows Vista Service Pack 2, and Windows 7 Service Pack 1. As for the hardware, you'll need at least an Intel dual-core 2.4GHz or AMD dual-core 2.6GHz processor, 2GB of RAM, Nvidia 8600GT 512MB VRAM or Radeon HD 3400 512MB VRAM, 100 percent DirectX 9 compatible soundcard, and 35GB of hard drive space.

On the upper end, Rockstar Games recommends or supports an Intel Core i7 3930K or AMD FX8150 processor, 16GB of RAM, Nvidia GeForce GTX 680 or Radeon HD 7970 graphics card, and DirectX 9 compatible soundcard supporting Dolby Digital Live.

"Developed in parallel with the game's console versions, Max Payne 3 for PC supports DirectX11 including tessellation, as well as a number of additional advanced graphics options and is optimized to run across a wide range of PC setups," Rockstar Games said.

A Special Edition SKU will be available exclusively through GameStop in the U.S. (and throughout Europe from select retailers). Alternately, you can pre-purchase the digital download version and receive pre-order bonus content such as character and weapon packs.

Image Credit: Rockstar Games

Mark Your Calendar, Windows 8 Release Preview Arriving First Week of June

Posted: 24 Apr 2012 06:48 AM PDT

Microsoft's next generation desktop operating system, Windows 8, inches closer to release with each passing day. In fact, barring any last minute snags and/or delays, Microsoft will make available the Release Preview of Windows 8 in early June. How early? Within the first week, which is less than seven weeks away. What this tells us is that Windows 8 is nearly ready for prime time.

It's important to remember when playing with the Consumer Preview of Windows 8 that it's basically beta code, an unfinished build that won't fully resemble the final product. But come June, the Release Preview will likely be the last sneak peek before Windows 8 goes gold, and for the most part, what you see is what you're going to get, presumably in October.

Microsoft announced the June release window during a Windows 8 Developer Day event in Japan and posted a picture to its @BuildWindows8 Twitter account. One of the people who commented noticed that Microsoft used an old Windows logo on the announcement slide, though it was probably just a simple oversight.

There are still tons of questions Windows 8 (and Microsoft) need to answer; however, we do know that Microsoft is paring down the number of versions consumers will have to wade through. There will be two main versions for x86 users, Windows 8 and Windows 8 Pro, and a separate version for ARM users, Windows RT.

Image Credit: Microsoft

Vint Cerf, Al Gore Among First Inductees into Internet Hall of Fame

Posted: 24 Apr 2012 06:27 AM PDT

The Internet Society announced last month the creation of an annual Internet Hall of Fame program to honor leaders and luminaries who have made significant contributions to the development and advancement of the global Internet, and on Monday, the group inducted more than 30 people during an awards ceremony in Geneva, Switzerland. Linus Torvalds, the father of Linux, and Vint Cerf, the father of the Internet, are both among the inaugural class, and so are a few surprise names.

Perhaps the biggest surprise is Al Gore, though not for inventing the Internet. Rather, he was honored for being a "key proponent of sponsoring legislation that funded the expansion of and greater public access to the Internet." According to his bio on the Internet Hall of Fame's website, Gore was one of the first government officials to foresee what kind of impact the Internet could have beyond the realm of academia.

"There are extraordinary people around the world who have helped to make the Internet a global platform for innovation and communication, spurring economic development and social progress," the Internet Society said in a statement. "This program will honor individuals who have pushed the boundaries to bring the benefits of a global Internet to life and to make it an essential resource used by billions of people. We look forward to recognizing the achievements of these outstanding leaders."

The Internet Society convened an Advisory Board made up of some pretty big names to vote on the inductees for the 2012 class. Jimmy Wales, co-founder of Wikipedia, and Chris Anderson, Editor-in-Chief of Wired, are among the dozen board members.

You can view a list and bios of all 33 inaugural inductees here.

G.Skill Hurls New TridentX DDR3 Memory Kits at Ivy Bridge

Posted: 24 Apr 2012 05:53 AM PDT

With Intel having finally and officially launched its much anticipated Ivy Bridge platform yesterday, the floodgates have been opened for a new generation of parts and accessories designed to play nice with the Santa Clara chip maker's 3rd generation Core processors. Among the companies rushing out such gear is G.Skill, maker of high-performance system memory like the new TridentX DDR3 series.

Anyone is welcome to pick up a TridentX kit and run with it, but the new sticks are really designed for overclocking enthusiasts. To prove it, G.Skill posted a couple of screenshots showing a 16GB DDR3-2800 TridentX memory kit running at 3,320MHz and a 32GB DDR3-2666 kit running at 2,933MHz.

TridentX memory kits range in stock frequency from 2,400MHz to 2,800MHz in 8GB to 32GB capacities. Each kit sports a removable top fin for added flexibility when installing the RAM into cramped systems with large CPU coolers.

No word yet on price or availability.

Image Credit: G.Skill
