GDC 2015: Nvidia Shield and GRID Posted: 05 Mar 2015 08:28 PM PST

Does anyone buy CDs or Blu-ray discs anymore? You can stream so much stuff for a few bucks a month that it's hard to make an argument for physical media these days. Music and movies have nearly leapfrogged the downloading phase that PC games have been in for a decade, since the dawn of Steam. Now Nvidia is making a push for streaming games, too, and its new Shield console is central to that effort. We sat down today for a talk presented by Eric Young, an engineering manager at Nvidia, who gave us some more details about how the Shield handles streaming from the company's cloud-based service, dubbed GRID.

Nvidia GRID has been streaming major PC titles to the Shield Tablet and Shield Portable for nine months now, so its existence is probably not news to our readers. In case it is, Nvidia pitches it like this: you can buy a game and be playing it in less than a minute, you can play it on mobile devices that would otherwise choke, games update themselves without you needing to download a patch, and it solves a lot of problems with piracy. The secret sauce is that the game runs in the cloud itself, and a video of the gameplay is shot down the series of tubes to your device. You move your mouse, and you see the mouse move in-game, "in half of the blink of an eye," according to company CEO Jensen Huang. If you're close enough to the GRID server farm, Nvidia says that your latency can be as low as 150 milliseconds. That's low enough to play a racing game like Grid 2, which Nvidia demonstrated live at the Shield console press event on March 3rd.

Another critical element is hardware-accelerated video encoding and decoding. The latest Nvidia video cards have this, and so does the Shield console. It can handle both H.264 and H.265. If your Internet connection can sustain bandwidth in the 15-25Mbps range, you can get 60 frames per second at 1080p. Eric Young says that encoding a frame takes only 10ms, and the bundled controller's Wi-Fi Direct connection keeps input latency to 10ms as well. The console takes 5ms to send video to your display, which has its own processing latency of a few milliseconds (this last step varies widely from one TV to another). We tally up the whole pipeline below. Users of the Shield Tablet or Shield Portable will be limited to 60 FPS at 720p, unless they plug in an Ethernet cable via an OTG adapter.

The Shield also uses its hardware acceleration to play 4K movies and TV shows, where available. Netflix has just started streaming in 4K, so the Shield is ready out of the box. Developers who want to integrate their games into GRID will be getting access to the GRID Link SDK "very soon." We suspect that it will coincide with the company's GPU Technology Conference happening later this month.
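For the curious, those latency numbers are easy to sanity-check. Below is a back-of-the-envelope budget in Python: the encode, controller, and display figures are the ones Nvidia quoted above, while the network round trip and server-side render time are our own assumptions, added purely to illustrate how a session could land near the 150ms figure.

```python
# Back-of-the-envelope GRID latency budget. The controller, encode, and
# display figures are the ones Nvidia quoted at GDC; the network and
# game-render entries are our own illustrative assumptions.

stages_ms = {
    "controller (Wi-Fi Direct)": 10,   # quoted: input latency to console
    "network round trip": 30,          # assumption: depends on distance to GRID farm
    "game render on server": 17,       # assumption: one frame at 60 fps
    "server-side encode": 10,          # quoted: ~10ms to encode a frame
    "console to display": 5,           # quoted: 5ms to send video out
    "display processing": 10,          # varies widely from one TV to another
}

total = sum(stages_ms.values())
for stage, ms in stages_ms.items():
    print(f"{stage:<28} {ms:>4} ms")
print(f"{'total (one round trip)':<28} {total:>4} ms")
```

Notice that the quoted stages alone add up to well under 150ms, so on a good connection most of the budget presumably goes to network transit and jitter buffering, which is why proximity to the GRID server farm matters so much.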
GDC 2015: Xbox Live Comes to Windows 10 Posted: 05 Mar 2015 07:01 PM PST

Microsoft aims for a unified experience

If you've heard of Games for Windows Live (GFWL), then you're probably familiar with some of its troubles. The difficulties some users had with fundamental things like logging in and updating the GFWL client could produce some epic tales of woe. GFWL was deactivated last year, and with it went its online matchmaking system, meaning that games that used this service to create multiplayer sessions either no longer had multiplayer or had to plug into something else, such as Steamworks. With the next big version of Windows coming out this year, Microsoft wants to give it another shot, and thankfully they're using a different set of tools and also introducing some interesting new features. We sat down for a lecture on the subject, conducted by Microsoft engineers Vijay Gajjala and Brian Tyler.

First of all, the Xbox Live service on Windows 10 runs native desktop PC code, in a Windows 10 app, rather than being a port. Windows tablets and phones running Windows 10 all get their own native apps, which in theory should mean smoother operation. Microsoft is also talking about single sign-on, so you can use one Microsoft account across all Win10 devices to log into XBL as well as into the device itself. When asked about this service's availability for earlier versions of Windows, or for Android or Mac OS X devices, the company had no firm plans to share. Don't hold your breath.

When you log into XBL on Windows 10, you get the complete XBL experience: your profile page, achievements, leaderboards, messages, activity feeds, game clips, privacy settings, basically everything that's exposed to an Xbox One user. There's a "Game DVR" where you can record gameplay and upload it to be viewed on other devices logged into XBL. (And in case you haven't heard, you can stream Xbox One games to your PC and play them there.) As far as Microsoft is concerned, there is no difference between what Xbox One users can do and see and what PC users can do and see when logged into XBL. Microsoft is also adding a Twitter-like feature where users can be followed, rather than requiring an accepted friend request to see another XBL player's activity. This can be disabled in your privacy settings, if you wish.

Windows 10 developers who want to use XBL now get access to a fancy telemetry system. In a nutshell, a dev can flag certain player activities like acquiring a specific weapon, loading a particular level, or completing a lap in a racing game, and have those events uploaded to a "stats engine" in Microsoft's cloud. This is used to handle achievements, but the data can also be aggregated and analyzed to better understand how people are playing the game. So you could answer questions like, "Is this boss fight too hard?" or "Is this secret room a little *too* secret?" This data feed is an optional service that developers can subscribe to or ignore; we've sketched the general pattern below. Devs also get access to the Windows Dev Center, a desktop client where they can do things like add a challenge to the game (a limited-time achievement) or create a leaderboard for a specific activity, like number of bad guys killed or number of explosions caused. These things can be staged on a test server to catch problems before they're published live.
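Microsoft didn't show code in the session, so here is a minimal sketch of the flag-an-event, upload-to-a-stats-engine pattern described above. To be clear, everything in it is hypothetical: the `StatsClient` class, the endpoint URL, and the event fields are our inventions for illustration, not the actual Xbox Live SDK.

```python
# Hypothetical illustration of the "flag an event, upload to a stats
# engine" pattern from the session. None of these names come from the
# real Xbox Live SDK; this is a generic telemetry sketch.
import json
import time
import urllib.request

class StatsClient:
    def __init__(self, endpoint, title_id, batch_size=50):
        self.endpoint = endpoint      # assumption: a cloud ingestion URL
        self.title_id = title_id
        self.batch_size = batch_size
        self.pending = []

    def flag_event(self, player_id, event_name, **fields):
        """Record one gameplay event, e.g. a weapon pickup or lap time."""
        self.pending.append({
            "titleId": self.title_id,
            "playerId": player_id,
            "event": event_name,
            "timestamp": time.time(),
            "fields": fields,
        })
        if len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self):
        """Upload the batch; the stats engine aggregates it server-side."""
        if not self.pending:
            return
        body = json.dumps(self.pending).encode()
        req = urllib.request.Request(
            self.endpoint, data=body,
            headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req)   # synchronous for brevity only
        self.pending = []

# Usage: answer "is this boss fight too hard?" by logging attempts.
stats = StatsClient("https://example.com/stats/ingest", title_id="MyGame")
stats.flag_event("player123", "boss_attempt", boss="lava_golem", won=False)
print(len(stats.pending), "event(s) queued")
```

Aggregated server-side, events like a `boss_attempt` carrying a `won` flag are exactly what would let a developer compute a per-boss failure rate and decide whether a fight needs tuning.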
In a separate but closely linked GDC session conducted by Ferdinand Schober, a software development engineer also at Microsoft, we gleaned some details about cross-platform multiplayer. Session data is stored in Microsoft's cloud, so it will be difficult to misplace. As with GFWL, Microsoft will handle matchmaking on its own servers, and developers can fine-tune the player skill ranges to search for, as well as how long the matchmaker searches before expanding its search range another notch (we sketch that expanding search below). As on the Xbox One, invites and join-in-progress are handled with a universal UI, rather than game-customized interfaces or notifications.

This stuff isn't available in the Windows 10 Technical Preview yet, but it's never too soon to familiarize yourself with Microsoft's next flagship operating system. Win10 will be free for Windows 7 and 8 users anyway, during the first 12 months of its availability (after which you pay a one-time fee like usual, not a subscription fee as some had feared). We're looking forward to getting our hands on it.
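Back to matchmaking for a moment: that expanding search is simple to picture in code. Here's a minimal Python sketch, assuming a single numeric skill rating per player; the window size, step, and tick count are placeholder values of our own, not Microsoft's actual matchmaking parameters.

```python
# Minimal sketch of skill-window matchmaking that widens over time, as
# described in the session. The ratings, step size, and tick count are
# illustrative assumptions, not Microsoft's actual parameters.

def find_match(my_skill, candidates, initial_range=50, step=50,
               max_ticks=6):
    """Search for an opponent within a skill window, expanding the
    window one notch per tick until someone qualifies or we give up."""
    window = initial_range
    for tick in range(max_ticks):
        matches = [c for c in candidates
                   if abs(c["skill"] - my_skill) <= window]
        if matches:
            # Prefer the closest skill match inside the window.
            return min(matches, key=lambda c: abs(c["skill"] - my_skill))
        window += step   # nobody close enough yet; widen the search
    return None          # matchmaking timed out

pool = [{"name": "alice", "skill": 1340},
        {"name": "bob", "skill": 1105},
        {"name": "carol", "skill": 980}]
print(find_match(1000, pool))   # finds carol without widening
print(find_match(1500, pool))   # widens until alice qualifies
```

The design tradeoff is the one Schober described: a tight initial window gives fairer matches, while the widening step and timeout control how long players wait when the pool is thin.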
Nvidia GeForce GTX Titan X: We Touched It Posted: 05 Mar 2015 06:00 PM PST

A quick peek into the future

In the land of video cards, Nvidia's GTX Titan is generally considered the king. The original gangster came out in February 2013, followed by the Titan Black a year later, each sporting an unprecedented 6GB of RAM, 7 billion transistors, and more shader processors than you could shake a stick at (eventually tipping the scales at 2880). Nvidia capped it off in March 2014 with the Titan Z, which put two Titan Black GPUs on one card. It's now been nearly a year since we've seen activity from the company on the super-premium end.

But Nvidia hasn't been idle. Today we got up close and personal with its obsidian brick of magic, the GTX Titan X. How close? This close:

Unfortunately, we were forced to double-pinky swear that we wouldn't give you any specifics about the card just yet, other than the fact that it's got 12GB of RAM, eight billion transistors, and is probably the fastest video card on Earth. But we can confirm that it was running several live demos on the show floor of the Game Developers Conference this week, conducted by Epic, Valve, and Crytek. This is obviously not going to be a paper launch -- the card is already here. The Titan X is just waiting in the wings until it can get a proper introduction at Nvidia's GPU Technology Conference, which starts on March 17th. In the meantime, we took some nifty photos for you. Hope you brought a bib for the drool!
NVIDIA Shield vs. Razer Forge TV: Hands On Posted: 05 Mar 2015 01:32 PM PST

One of these devices comes out ahead, far ahead

One of the biggest launches to come out of this year's GDC was NVIDIA's Shield console. Showing the device off to a packed audience, CEO Jensen Huang demonstrated a console that combines cloud streaming with local Android-based entertainment. Out of all the Android TV style devices that have been announced, the Shield is the most interesting. Shield can do several things: play Android games, play triple-A Android games made for Shield, handle your online media needs, and stream from NVIDIA's Grid streaming service. Grid has been in the making for several years, and NVIDIA hopes to be the first to deliver a playable, lag-free experience. At launch, NVIDIA will have roughly 50 playable titles, all of which should be recent PC hits. NVIDIA's vision is to deliver all games, at maximum graphics settings, without requiring a high-end gaming rig.

Backtrack to CES 2015 and you have Razer's Forge TV, a device that's meant to let you stream all your games to the living room, lag-free. Forge TV is also an Android device, but it doesn't have the power that NVIDIA's Shield has. For reference, Forge TV is equipped with an Adreno 420 GPU, while the Shield's graphics duties are handled by NVIDIA's own Tegra X1, which is based on its current flagship Maxwell architecture. Specs aside, the Shield can do everything the Forge TV can do, and much more. Shield is also 4K ready, while Forge TV is not.

We got our hands on both at GDC. What's our impression? Well, to put it simply, the Shield is where it's at. We tried both local content made for the Shield as well as streamed content. The real deal, though, is what's possible when NVIDIA's Grid grows as a platform. Trying out several games, from platformers to big titles like Saints Row IV and Batman: Arkham Origins, we honestly couldn't detect any indication that the games were actually being streamed from Grid. It was impressive. Local games were equally impressive, and were of much higher quality than your typical library of Android games.

NVIDIA's Shield playing Saints Row IV over Grid

Many who watched NVIDIA's keynote over Twitch's livestream indicated that Shield was dropping frames, but I think that had to do with the setup and the stream rather than the Shield itself. During the keynote, it appeared that frames were dropped during gameplay of The Witcher 3: Wild Hunt, but NVIDIA told us that the game build was buggy and the dropped frames were due to the game, not the Shield.

Moving over to the Forge TV felt like a downgrade. Granted, NVIDIA has quite a lot more invested in Shield than Razer does in its own platform, but at the end of the day, both products are vying for your attention. The games on Forge TV are nowhere near as crisp and bold as on the Shield, and the titles aren't big hitters, but this has a lot to do with the two companies' ability to negotiate deals with publishers. I'm sure that as these devices become more popular, better titles will be released. However, the Maxwell-based Tegra X1 is an order of magnitude more capable and powerful than the GPU inside the Forge TV, and NVIDIA's ability to get native content on the Shield to demonstrate its GPU capabilities is worth noticing. The Shield has a huge lead in graphics, compared not only to the Forge TV, but to basically any other Android device.
Razer's Forge TV playing Asphalt 8: Airborne

Streaming-wise, both the Shield and the Forge TV are capable of streaming all your local PC gaming content to the living room. However, to get local streaming working on the Forge TV, you'll have to invest another $40 in Razer's Cortex: Stream, a proprietary solution that handles encoding duties. Razer informed us that it was not demoing Cortex: Stream on the show floor, which seems odd to us, since it is arguably Forge TV's most touted feature.

Even accounting for the $40 added cost of Cortex: Stream on the Forge TV, NVIDIA's Shield costs more from the get-go, launching at $199, and Grid still hasn't received formal pricing, though NVIDIA indicated it would be "similar" to Netflix. NVIDIA's Grid streaming service provides a significant advantage to the Shield, removing the hardware requirements of a PC and, in fact, removing the need for a PC, period. You can literally get a Shield as your primary gaming system, if gaming and entertainment are all you're after. We're looking forward to testing Shield and Grid in different networking situations, so look out for a report on that later. For now, Grid works, and it works really well. We'll have to see how it performs in the wild.

The question that needs to be asked now is: why not just hook up a small form-factor PC to the TV and play all your games in their native glory? Well, consider convenience and price. Both NVIDIA's Shield and Razer's Forge TV cost significantly less than a PC capable of playing the latest games. And if you already have a PC, you can use the local streaming features of both systems to play at 1080p. NVIDIA's Shield is capable of native 4K output, but we didn't get to confirm whether the Shield can render games playably at 4K, and you can probably forget Grid gaming at 4K. Your network and Internet setup aside, the Shield represents an extremely attractive option. If both the Shield and Forge TV were available right now, I'd put my money on the Shield.

[March 5, 2015 @ 18:17 PST: Edited for clarity]
Newegg Daily Deals: Microsoft Office Home and Student 2013 Key, Patriot Ignite 480GB SSD, and More! Posted: 05 Mar 2015 11:26 AM PST
First Look: Logitech G303 Daedalus Apex Performance Edition Posted: 05 Mar 2015 11:15 AM PST

Same compact G302 chassis, but with new and improved sensor

Logitech recently came by the Lab to show off its new premium gaming mouse, the G303 Daedalus Apex. If you're thinking it looks just like the G302 Daedalus Prime, that's because it uses the same lightweight and portable body, which weighs 87 grams and measures 11.5x6.5x3.7 cm. Logitech says this is the enthusiast version of the G302, thus the "Performance Edition" moniker.

The main feature that allowed the G302 to stand out was that it was originally designed as a MOBA mouse and had a new metal spring tensioning system. This system is guaranteed for at least 20 million clicks, which Logitech says is equivalent to a pro gamer practicing 10 hours a day, every day, for two years. More importantly, however, the spring mechanism eliminates the travel time between pressing a button and activating the command, ensuring a speedy, consistent clicking experience. The G303 maintains that system along with the G302's five DPI settings, but the Apex also has a few new tricks up its sleeve.

The biggest addition to this Daedalus is the PMW3366 sensor, which Logitech uses in the bigger G502. While it isn't as fast as the G402's sensor, an optical/gyroscope hybrid solution that can track movement of up to 12.5 meters per second, Logitech considers the PMW3366 to be its most accurate sensor. Logitech says this makes it about 2-3 pixels more accurate than the G302, and while the company admits that this isn't a monumental improvement, it says the sensor should amount to a slightly more responsive and accurate feel for the end user. Logitech also asserts that the sensor mitigates unwanted mouse acceleration and adds zero smoothing. The Apex offers a DPI range of 200 to 12,000. In addition, the sensor is much faster than the G302's, going from a cap of 120 inches per second to 300 inches per second. Logitech says this is fast enough for any real-world use, and it's able to achieve this speed via the sensor's clock-tuning ability, which also helps prevent degradation of speed over time. This essentially extends the life of the mouse. To top it off, the sensor also features surface tuning, which matches the mouse's parameters to your desk surface for a consistent tracking experience. All of this sits on top of a 32-bit ARM processor.

Beyond the sensor improvements, Logitech is also jumping on the RGB train (RGB… it's so hot right now). Some of you have clamored for more color options out of Logitech rather than the company's default blue hue, and your voices were heard loud and clear. The G303 will feature 16.8 million colors (you can count them all to be sure), and you'll be able to adjust the brightness or even have the LEDs pulsate, or you could just turn off the lights entirely if they don't tickle your fancy.

Wireless mouse fans may be disappointed to hear that it uses a cable, and a braided one at that, but Logitech says it went out of its way to make the cable more flexible than the average braided solution, so that you get the flexibility of a plastic wire with the durability of a braided one.

You'll be able to get your hands on the G303 today for $70. Expect a full review of it sometime in the near future.
Acer Speeds Up Chromebox CXI Line with Intel Core i3 Models Posted: 05 Mar 2015 06:01 AM PST

A faster Chromebox

There are plenty of mini PC options in the Windows space, but does anyone remember that Chromeboxes exist? For those who care, Acer is expanding its Chromebox CXI line with a couple of new models that have been upgraded with Intel's 4th Generation Core i3-4030U dual-core processor clocked at 1.9GHz (3MB cache, Hyper-Threading support, 15W TDP), a speedy replacement for the Celeron chips that power the existing models. Acer's primary targets are users in education and small and medium-sized businesses (SMB), along with any consumers who are into Google's Chrome ecosystem. For those who can benefit from a Chrome OS system, the Core i3 upgrade offers enough speed to work on multiple projects at the same time, Acer says.

There are two models: the CXI-i34GKM with 4GB of DDR3-1600 RAM, and the CXI-i38GKM with 8GB of memory. Both sport 16GB of internal storage upgradeable via microSD (up to 32GB), HDMI and DisplayPort outputs, 802.11n Wi-Fi, Bluetooth 4.0, GbE LAN, four USB 3.0 ports, and of course Google's Chrome OS. The systems measure 6.51 by 5.12 by 1.3 inches and are VESA mountable.

Upon first boot, you're thrust into Google's ecosystem and signed into its services. The Chromeboxes come with certain web apps already installed, and there are now over 30,000 additional apps, themes, and extensions available in the Chrome Web Store. Should things go wrong and/or there's a need to wipe the system clean and start from scratch, both Chromeboxes have a "Powerwash" option that enables IT to quickly reset them to their original factory states with the touch of a button.

The 4GB and 8GB models are available now for $350 and $400, respectively. Just as with Chromebooks, there are Windows-based alternatives that cost about the same, which will make these a tough sell considering their limitations.
GDC 2015: Virtual and Augmented Reality Roundtable Posted: 04 Mar 2015 11:43 PM PST

At GDC today, a number of VR and AR developers gathered in a casual forum moderated by Chris Pruett, who does developer relations for Oculus VR. What followed was an interesting jam session as creative minds shared their ideas, triumphs, and frustrations with virtual platforms. Pruett stated at the beginning that he was not there as a representative of Oculus, and in fact he was not the session's originally planned leader.

He began by taking an informal survey of the room. By a show of hands, he estimated that about 70% of the people there were actively developing, though only two people raised their hands when he asked if anyone had been working on VR or AR for five years or more. Most of the room also expected to release their product within eight months. Interestingly, not everyone was in it to make money. A few were in it for public feedback, which they planned to use to iterate on development, possibly toward a retail product, but not necessarily.

They all agreed that motion sickness was a prevalent problem, and they discussed the ways they were combating it in software (as opposed to the device itself using a motion-sensing camera to keep the user's head correctly oriented). There was a consensus around creating a virtual copy of the user's body within the world, but it had to synchronize with the user's movement, or else the disorientation and nausea would be even worse than without the copy. Creating a goggle-like frame around the edges of the user's vision also helped (such as when a movie camera simulates looking through binoculars). Limiting navigation in the world and instead sending the content to the user was also beneficial, as was establishing a visual horizon and a virtual floor beneath the user's feet.

The general consensus was that VR and AR development feels like the early days of 3D gaming in the mid-1990s: developers are still learning how the technique works (and doesn't work), and implementing hacks to create certain illusions when the hardware can't handle the processing requirements of doing it for real. Pruett mentioned a trick he'd figured out with mirrors. Ordinarily, mirrors are a problem in a virtual space, because they force the GPU to render a reflected 3D space in addition to what it was already handling, which can kill performance. His workaround was to lower the refresh rate of the mirror's reflection to 30Hz, which looked fine as long as there wasn't a lot of movement in the scene; we sketch the idea below.

The attendees largely did not mention their names or what they were working on, but conversation did start to flow once people let their guard down a bit. Some hesitation is understandable, since in many cases these people are working on products that aren't ready for the public eye yet, and they may be using clever ideas that they'd prefer to keep to themselves. But the takeaway from this roundtable is that VR and AR could benefit quite a lot from developers sharing ideas and discoveries with one another. Because while devices like the Oculus Rift are amazing technology, they'll become historical curiosities without compelling content to drive them forward. In an environment as collaborative as software development, two heads are better than one.
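Pruett's mirror trick is a classic render-to-texture optimization: draw the reflection into an offscreen texture, but refresh it only every other frame, so the expensive second scene pass runs at half rate. Here's a small Python sketch of the frame-skip logic under that interpretation; the `StubRenderer` and all of its method names are our own placeholders, not any particular engine's API.

```python
# Sketch of the half-rate mirror trick Pruett described: render the
# reflected scene into an offscreen texture only every other frame, so
# the mirror effectively updates at 30Hz while the main view runs at
# 60Hz. The renderer here is a do-nothing stub; every method name on
# it is our invention, standing in for a real engine API.

class StubRenderer:
    """Placeholder for a real engine's rendering interface."""
    def create_render_target(self, w, h): return object()
    def set_render_target(self, target): pass
    def draw_scene(self, scene, camera): pass       # the costly pass
    def draw_textured_quad(self, texture): pass

class MirrorSurface:
    def __init__(self, renderer):
        self.renderer = renderer
        self.reflection_tex = renderer.create_render_target(1024, 1024)
        self.frame_count = 0

    def update(self, scene, mirror_camera):
        # Re-render the reflection only on even frames; odd frames
        # reuse the stale texture, which looks fine when the scene
        # isn't moving much (the tradeoff Pruett described).
        if self.frame_count % 2 == 0:
            self.renderer.set_render_target(self.reflection_tex)
            self.renderer.draw_scene(scene, mirror_camera)
            self.renderer.set_render_target(None)   # back to the screen
        self.frame_count += 1

    def draw(self):
        # The mirror quad itself still draws every frame; only the
        # texture it samples is refreshed at half rate.
        self.renderer.draw_textured_quad(self.reflection_tex)

mirror = MirrorSurface(StubRenderer())
for frame in range(4):            # simulate four frames at 60Hz
    mirror.update(scene=None, mirror_camera=None)
    mirror.draw()
```

The cost savings come from skipping the reflected-scene pass, and the penalty is a reflection that lags the world by up to a frame, which is exactly why the trick holds up best in mostly static scenes.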