Newegg Daily Deals: Samsung 850 Evo 500GB SSD, Windows 10 Home 64-bit OEM, and More!
Posted: 27 Aug 2015 10:28 AM PDT
LG Rolly Keyboard for Mobile Devices Folds into a Stick
Posted: 27 Aug 2015 09:43 AM PDT

Walk softly and carry a keyboard stick

LG plans to unveil at IFA 2015 in Berlin what it claims is the industry's first solid rollable wireless portable keyboard. The keyboard, which bears the unfortunate name of "Rolly," folds up along its four rows into a stick that's supposed to fit easily in a pocket, purse, or briefcase.

The Rolly is a tenkeyless (TKL) plank, meaning it doesn't feature a dedicated number pad. It has high-contrast keys, a mobile device stand (two arms fold out to support smartphones and tablets), and is made of impact-resistant polycarbonate and ABS plastic. According to LG, the Rolly offers typists satisfying tactile feedback not found on flexible silicone keyboards. To put some numbers to LG's claim, the Rolly boasts a 17mm key pitch versus the 18mm pitch found on many desktop keyboards.

Though it's intended for mobile gadgets like smartphones and tablets, it connects via Bluetooth 3.0, so you could use it with a laptop or 2-in-1 PC if you wanted to. You can pair the Rolly with up to two devices at the same time and switch between them with a key press. LG says a single AAA battery provides enough juice to power the keyboard for up to three months of "average use."

The Rolly will launch in the U.S. in September, followed by releases in Europe, Latin America, and Asia in the fourth quarter. No word yet on price.

Follow Paul on Google+, Twitter, and Facebook
Dell Upgrades Alienware X51 with Skylake and Liquid Cooling Option
Posted: 27 Aug 2015 06:48 AM PDT

Abducting Skylake

It was three years ago to the day that we posted our review of Dell's console-sized Alienware X51 gaming PC to the website, a system we called "impressively powerful for its size." That was true at the time, though in the fast-moving world of technology, the system's Core i5-2320 CPU paired with a GeForce GTX 555 graphics card is nothing to get excited over today.

Of course, there have been updates to the Alienware X51 since then, and as of right now, you can configure one with a Skylake processor inside. It's one of a handful of upgrades Dell announced today. Along with Skylake come DDR4 memory options. You also have access to new M.2 PCI-E SSDs, two USB 3.1 ports (paired with two USB 3.0 and two USB 2.0 ports), and of course Windows 10, which launched to the public on July 29. In addition to unlocked 6th Generation Core i5 and Core i7 processor options and an upgraded foundation, you can opt for a custom liquid-cooling solution for lower noise and potentially better overclocking performance, Dell says.

There's also support for the Alienware Graphics Amplifier, which might be the most intriguing upgrade option of the bunch. Likely due to its size and thermal restrictions, the Alienware X51's pre-installed graphics card options are rather sparse and uninspiring, topping out at a GeForce GTX 960. But with the Alienware Graphics Amplifier plugged into the X51, you can use a higher-end graphics card like the Titan X, which Dell says will more than double the performance of the highest-end X51 configuration. The Alienware Graphics Amplifier is really intended for laptops, though in this case, the pitch from Dell is that you can use it to open the door to 4K gaming.

Dell's retooled Alienware X51 is available now.

Follow Paul on Google+, Twitter, and Facebook
AMD R9 Nano Revealed
Posted: 27 Aug 2015 05:00 AM PDT

AMD R9 Nano: Small but Powerful

The initial launch of AMD's Fiji architecture has been a bit rough: The R9 Fury X failed to claim the performance crown from the GTX 980 Ti, though it puts up a good fight. Stepping down a notch is the R9 Fury, an air-cooled take on Fiji with eight of the Compute Units (CUs) disabled, but priced $100 lower. When AMD first demonstrated the Fury X cards, the company also talked about a "Fury Nano," which has now been officially christened the R9 Nano. We always knew the Nano would use the Fiji core and that it would target a lower power envelope, which led to rampant speculation on how it would be configured and where it would be priced. It turns out the R9 Nano is both better and worse than we expected. This is best illustrated by jumping straight into the specs table comparing AMD's current high-end GPUs:

AMD High-End GPU Specs

| Card | R9 Fury X | R9 Nano | R9 Fury | R9 390X |
| GPU | Fiji | Fiji | Fiji | Hawaii (Grenada) |
| GCN / DX Version | 1.2 | 1.2 | 1.2 | 1.1 |
| Lithography | 28nm | 28nm | 28nm | 28nm |
| Transistor Count (Billions) | 8.9 | 8.9 | 8.9 | 6.2 |
| Compute Units | 64 | 64 | 56 | 44 |
| Shaders | 4,096 | 4,096 | 3,584 | 2,816 |
| Texture Units | 256 | 256 | 224 | 176 |
| ROPs | 64 | 64 | 64 | 64 |
| Core Clock (MHz) | 1,050 | Up to 1,000 | 1,000 | 1,050 |
| Memory Capacity | 4GB | 4GB | 4GB | 8GB |
| Memory Clock (MHz) | 1,000 | 1,000 | 1,000 | 1,500 |
| Bus Width (bits) | 4,096 | 4,096 | 4,096 | 512 |
| Memory Bandwidth (GB/s) | 512 | 512 | 512 | 384 |
| TDP (Watts) | 275 | 175 | 275 | 275 |
| Price | $649 | $649 | $549 | $429 |

Surprised? Yeah, so are we! It turns out that the R9 Nano is a fully enabled Fiji GPU, just like the Fury X. It has the same 64 CUs, 4,096 shaders, 256 texture units, and 64 ROPs. The only differences are in the core clock and TDP, along with the cooling solution.

Here's where things get a bit muddy: AMD lists the GPU clock as "up to 1,000MHz," but with the 175W TDP you should fully expect the card to run at lower clocks in certain workloads. It sounds as though demanding workloads (e.g., Furmark) may push the clocks as low as 600MHz, while most games will run at 850–950MHz. Overall, AMD is claiming a 30 percent improvement in performance over the R9 290X, which is interesting, as our own testing of the Fury X averaged 34 percent faster than the 290X. Best case, at 4K our testing has the Fury X outperforming the 290X by 40 percent. In other words, with a moderate 50–200MHz drop in GPU clocks, AMD has been able to reduce TDP by over 35 percent. This shouldn't come as too much of a surprise, however, as the Asus Strix R9 Fury already took a similar tack with its 216W TDP.

It's also worth noting that it will be possible to overclock the R9 Nano and increase the power target in order to improve performance. AMD has thermal protection on the Nano that kicks in at 85C, but otherwise the card will adjust its clocks based on power use. AMD informed us that clock speeds are updated at the microsecond level, so the performance tuning happens very quickly and should be seamless to the end user. Here are some of the slides from the presentation.

What will likely surprise a lot of people is the target price. Instead of being a "lesser" Fiji implementation with a lower price point, AMD is going for power efficiency and a price point equal to the Fury X.
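For those keeping score, the efficiency math is simple enough to check yourself. Here's a quick back-of-the-envelope sketch in Python (the script is ours, not AMD's, and uses only the figures from the table above plus AMD's claimed 30 percent gain over the 290X):

```python
# Back-of-the-envelope check using the spec-table numbers above.
# All figures come from the table and AMD's stated claims; the math is ours.

tdp_290x = 275.0   # watts (R9 290X)
tdp_nano = 175.0   # watts (R9 Nano)
perf_gain = 1.30   # AMD's claimed performance vs. R9 290X (+30%)

# TDP reduction versus the 275W cards
tdp_cut = 1 - tdp_nano / tdp_290x
print(f"TDP reduction: {tdp_cut:.0%}")              # ~36%, i.e. "over 35 percent"

# Relative performance per watt versus the R9 290X
perf_per_watt = perf_gain / (tdp_nano / tdp_290x)
print(f"Perf per watt vs. 290X: {perf_per_watt:.2f}x")  # ~2x

# HBM bandwidth from the table: 4,096-bit bus at an effective 1Gbps per pin
print(f"HBM bandwidth: {4096 * 1e9 / 8 / 1e9:.0f} GB/s")  # 512 GB/s
```

That roughly 2x performance-per-watt figure falls straight out of AMD's own numbers, which helps explain why efficiency, rather than peak performance, is the Nano's pitch.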
Basically, these are the same chip and even the same card, with only the cooling really having changed; it's just that the Nano is tuned for a lower TDP while the Fury X pushes maximum performance. AMD is big on comparisons with the 290X, showing a 40 percent reduction in board length, (up to) 30 percent higher performance, a 20C drop in target operating temperature (75C vs. 95C), 30 percent lower power requirements, and a 16dB drop in noise levels (versus the reference blower 290X, which definitely wasn't a quiet card). As you would expect, much of this is made possible by the use of HBM.

So, what does AMD want users to do with all of this compact goodness? One target market is high-performance mini-ITX systems. While there are certainly mITX cases that have housed high-performance graphics cards in the past (Falcon Northwest's Tiki comes to mind), there's a minimum size requirement in order to house a 10.5-inch graphics card. The R9 Nano provides the ability to go with a smaller chassis, or you could have a similar-sized chassis with more space for storage. We've seen small GPUs like the GTX 960 and GTX 970 already, so this isn't inherently a huge change, but the Nano should deliver a healthy improvement in frame rates over a GTX 970… at roughly twice the cost. It's an interesting tactic, and we'll have to see how well it succeeds.

AMD showed off a prototype system at E3 called Project Quantum, which consisted of a Fury X GPU and a mini-ITX motherboard in a sexy-looking custom chassis. Supposedly, some of these had dual Fury X GPUs, except now it's looking like they were more likely dual Nano GPUs. Only twelve Project Quantum systems were created, but AMD is still looking for someone to take the design and turn it into a retail product. And that's basically where the R9 Nano should succeed: custom builds where cramming a lot of performance into the smallest space possible is the primary concern. If you're just after raw performance, a larger desktop is easier to build and service, and likely cheaper at the end of the day, but it's not nearly as eye catching.

Unfortunately, even if you really want to buy an R9 Nano, you can't do so just yet. Officially, the R9 Nano will go on sale "the week of September 7." That could mean as early as September 7 or as late as September 11, so we're about two weeks out from the retail launch. Meanwhile, the R9 Fury X is still a bit difficult to find in stock, though Newegg lists at least one model at $670. If things go as planned, the R9 Nano should launch with a decent inventory, but based on the Fury X, it may not stay in stock for the first several weeks. We'll have our full review on the official launch date in a couple of weeks.

Follow Jarred on Twitter.
Logitech Announces G633 and G933 Artemis Spectrum Gaming Headsets
Posted: 27 Aug 2015 01:34 AM PDT

Just a day before the start of PAX Prime in Seattle, peripheral maker Logitech unveiled the company's latest entry into the gaming audio arena. At Logitech's audio lab in Camas, Washington, we got a hands-on with the two units that are billed as the successors to the G930 headset.

Like the G930 before them, both the G633 and G933 (which we've heard have gone by the code names Ripley and Newt, though we don't know which is which) feature 7.1 Dolby surround. Additionally, these headphones support DTS Headphone:X 7.1 surround, which promises highly accurate positional audio in games and movies. Users can hot-switch between Dolby and DTS surround using the Logitech Gaming Software configuration tool.

The G633 uses either a wired USB or 3.5mm connection, while the G933 uses a 2.4GHz wireless USB mix adapter. If the user prefers, the G933 can use a wired connection as well. The G933's wireless range is advertised at 15 meters. Some Logitech employees told us it may reach up to 24 meters in perfect conditions, but the headset is rated at 15 to preserve quality in most conditions. Like other Logitech gaming headsets, the boom mic swings up and out of the way when the user doesn't need to issue voice commands to teammates.

The G633 features both USB and 3.5mm connections and can use both at once. In addition to Windows PCs, the G633 and G933 are compatible with the PlayStation 4 and Xbox One. Users can connect two different audio sources with the G633's USB and 3.5mm connections, while the G933 allows mixing of up to three sources when using the wireless mix adapter. Each model features programmable G-keys on the left ear cup, right next to a volume wheel, mic toggle, and input switch. Both headsets also feature programmable RGB lighting that can be controlled with Logitech Gaming Software.

The G633's G-keys and controls.

During our visit to Logitech's lab, senior acoustical systems engineer Tracy Wick showed off the Pro-G driver that Logitech developed as the heart of the headset. Logitech says the drivers deliver accurate highs and deep lows that are meant to compete with high-end consumer headsets. The driver uses a proprietary textile mesh membrane that offers audiophile-level sound, Logitech said.

The Pro-G driver and membrane material.

With our first hands-on, we found the cans to be of moderate weight and good construction. We were also impressed by the DTS surround demo in Logitech's reference theater room. While we liked what we heard in the hands-on demos, demos usually show off best-case scenarios. We got a G633 headset for review, so expect a more in-depth look soon.

The G633 will retail for $149, while the G933 will go for a cool $199. The G633 will be available in September, and the G933 will follow in October.
Fast Forward: Coming Down to Earth From the Cloud
Posted: 27 Aug 2015 12:00 AM PDT

This article was published in the August 2015 issue of Maximum PC. For more trusted reviews and feature stories, subscribe here.

May you never know the agony of losing vital data files. Some people have been driven nearly to suicide when their only working copy of an unpublished novel, dissertation, business plan, or photo archive has been lost. It may have been corrupted by a failing hard drive, maliciously encrypted by ransomware, or destroyed by a fire or natural disaster. For some victims, hard-drive recovery services are the last resort, no matter the high cost.

With so many cloud-storage providers these days, it's tempting to think we've entered a heavenly age in which we no longer need to maintain local backups. Just upload the stuff to a remote server and rely on the data-center angels to keep it safe. Unfortunately, the cloud's lining isn't always silver.

One problem is that cloud providers occasionally evaporate. A few years ago, some professional photographers lost their life's work when a cloud service went bankrupt so suddenly that there wasn't time to save the thousands of high-resolution files. That danger can be avoided by using the cloud storage now offered by large, stable companies like Amazon, Apple, Google, and Microsoft. Another strategy is to store your files in multiple clouds. It's more trouble, but most providers offer a few gigabytes for free, so it's cost effective if your archive is relatively small.

Nevertheless, local backups are still desirable. Some archives are so large that the initial upload would monopolize even a fast Internet connection for weeks. My largest archive contains thousands of scanned family photos and documents dating to the 1830s, and my personal photo archive is nearly as large and growing fast. Even the incremental updates can get unwieldy, especially after a busy day of scanning or photographing.

Another problem is that many cloud services automatically synchronize all your files across all your connected devices. It's supposed to be a trendy feature, but it's a drag. I use different computers for my work and personal pursuits, and I don't want my business files and personal files shared on both. Nor do I want everything shared with a notebook, tablet, or smartphone that has less local storage and is more prone to loss or theft. Then too, automatic file sharing can clog a home Internet connection. Usually there are workarounds—Google Drive won't replicate files in subfolders—but some services are mysterious about defeating such features.

The cloud shouldn't be the Chosen One when backing up.

For these reasons, cloud storage is no substitute for local backups. The catch is you've got to be smart. Remember, it's not really a backup unless it's stored offsite. If your backup is a second hard drive in your PC, it's susceptible to all the same hazards that endanger the main drive, including power surges, spilled coffee, and ransomware. A disconnected portable hard drive stored in a desk drawer is slightly safer but is still subject to fires, floods, burglaries, or anything else that endangers your home.

My strategy is to rotate portable hard drives between my home office and a safe-deposit box. Even this precaution may not save you from a major disaster like a flood, earthquake, hurricane, or whatever act of God afflicts your region. Periodically stashing a drive at a relative's or friend's house will preserve most of your valuable files.
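If you want to script the rotate-a-drive routine, here's a minimal sketch in Python (ours, not the column's; the folder paths and drive letter are placeholders you'd change to match your own setup):

```python
# Minimal local-backup sketch: mirror an archive folder to a portable drive
# and stamp the copy so you know when that drive was last refreshed before
# rotating it offsite. Paths below are placeholders, not recommendations.
import shutil
from datetime import date
from pathlib import Path

SOURCE = Path.home() / "Archive"          # the files you can't afford to lose
DEST = Path("E:/") / "Archive-backup"     # the portable drive currently at home

def backup():
    if not SOURCE.exists():
        raise SystemExit(f"Source folder not found: {SOURCE}")
    # Mirror the archive; dirs_exist_ok lets repeat runs refresh the copy in place.
    shutil.copytree(SOURCE, DEST, dirs_exist_ok=True)
    # Leave a marker noting when this drive was last updated.
    (DEST / "LAST_BACKUP.txt").write_text(date.today().isoformat())
    print(f"Backed up {SOURCE} to {DEST}")

if __name__ == "__main__":
    backup()
```

A dedicated sync tool (or rsync/robocopy) handles incremental updates more efficiently, but the principle is the same: copy, stamp the date, and rotate the drive offsite.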
Of course, cloud storage is the ideal insurance against big calamities, but it shouldn't be your only backup.

Tom Halfhill was formerly a senior editor for Byte magazine and is now an analyst for Microprocessor Report.