General Gaming Article

Does Not Compute: 10 PC Myths from Movies and Television

Posted: 14 Jul 2011 03:44 PM PDT

For over half a century, Hollywood has been making computers do whatever they damn well please. Routinely featured on television and in movies, supercomputers, desktop rigs and laptops—and in some cases, the people who use them—are all too often imbued with near-magical capabilities, painting a deceptive picture of what our beloved machines can and cannot do. Not sure what tech-centric malarkey we're talking about? No problem: We've put together a list of our top ten Hollywood TV and movie myths. We're betting they'll be just as familiar and irritating to you as they are to us.

A computer will blow up if there is a question it cannot answer.

According to Hollywood, computers are so delicate that when confronted with a question that they're unable to answer, they'll explode. No one in the history of film knew this better than William Shatner. During his run as Captain James T. Kirk, The Shat took out more malevolent computers, androids and evil A.I.s with a set of contradictory orders, paradoxes, and strings of illogical questions about love or the human condition than you can shake a Bat'leth at.

If computers were really that volatile, you wouldn't be able to count the number of people sent to an early grave by Microsoft Encarta's disc-bound and web-enabled iterations coming up with bupkis back in the day. The same goes for Wolfram Alpha: we don't recall seeing any mention of the dangers of posing a difficult question to their servers. In reality, computers don't explode when they can't answer your question or solve a problem. The worst that could happen is that your rig might freeze up, reboot or pony up a Blue Screen of Death. Granted, in the case of the latter, many users might prefer to see an explosion, but sadly, it's just not gonna happen.

Voice recognition software works every time - and flawlessly at that.

While voice recognition software has improved by leaps and bounds over the past decade, it still kind of sucks. Due to the many nuances of human speech such as varied dialects, inflection, and in some cases, speech impediments, many people can't manage to dictate an email to Outlook, let alone verbally control computers with anything resembling precision or reliability. 

Except, of course, in Hollywood. In 2001: A Space Odyssey, HAL can open the pod bay doors at Dave's behest; Will Smith is able to carry on a meaningful conversation with VIKI in I, Robot; and in Blade Runner, Deckard is able to direct his home computer to manipulate a crime scene photo with nothing more than a few words. Riddle us this: When was the last time you mumbled orders into a microphone for GIMP or Photoshop to resize your vacation photos? Exactly. While modern supercomputers such as IBM's Watson have the power to process voice commands with uncanny accuracy, the consumer-grade hardware you're using to read this just can't hack it the way Hollywood wants to convince us it can. It's too bad, too. We're all kind of sick of typing.
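
For the curious, here's roughly what consumer-grade dictation looks like under the hood today. This is a minimal, purely illustrative Python sketch using the third-party SpeechRecognition package (our example, nothing from the films or our lab), which records from the microphone and hands the audio to a free web recognizer; in our experience, the except branches get plenty of exercise.

    # Minimal dictation sketch (illustrative only).
    # Assumes the third-party SpeechRecognition and PyAudio packages are installed.
    import speech_recognition as sr

    recognizer = sr.Recognizer()

    with sr.Microphone() as source:
        # Sample background noise so quiet rooms and loud offices both work.
        recognizer.adjust_for_ambient_noise(source)
        print("Say something...")
        audio = recognizer.listen(source)

    try:
        # Hand the recording to Google's free web recognizer.
        print("You said:", recognizer.recognize_google(audio))
    except sr.UnknownValueError:
        # The part Hollywood never shows you.
        print("Sorry, couldn't make that out.")
    except sr.RequestError as err:
        print("Recognition service unavailable:", err)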

Any image or video can be corrected, blown up and made crystal clear.

Speaking of Deckard futzing with photos in Blade Runner, why is Hollywood obsessed with unrealistic portrayals of image manipulation? On screen, no matter how grainy a photo might be, how dark it was outside when a picture was taken, or how far away the photographer was from the subject, any image can be zoomed in on, enhanced and dolled up for use in court or to track down the bad guys out on the street.

Jim True-Frost's Roland Pryzbylewski does it with video in The Wire, and Bryan Brown gets his picture tinker on back in 1986 with F/X. CSI? Don't even get us started. The truth of the matter is that no matter how advanced the software, or how powerful a rig you're cooking on, how legible an image can be made - and how big you can blow it up for viewing without making your eyes bleed - is very much dependent upon the quality of the original image you're working with. In other words, if you take a picture with a Cyber-shot D710, no amount of zoom and enhance is gonna make anything in that shot look like it was baked with a Sony a900.
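
Want to see why for yourself? Here's a tiny, purely illustrative Python sketch using the Pillow imaging library (the filenames are hypothetical) that blows a photo up 8x. Interpolation only invents new pixels by averaging the ones the camera already captured, so a blurry license plate just becomes a bigger blurry license plate.

    # Zoom and enhance, real-world edition (illustrative sketch).
    # Requires the Pillow imaging library: pip install Pillow
    from PIL import Image

    img = Image.open("grainy_security_still.jpg")   # hypothetical input file
    width, height = img.size

    # Enlarge 8x with bicubic interpolation. Each new pixel is a weighted
    # average of its neighbors, so no detail the original sensor missed
    # can magically appear.
    enhanced = img.resize((width * 8, height * 8), Image.BICUBIC)
    enhanced.save("enhanced_but_still_blurry.jpg")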

You can use your PC to interact with alien hardware.

It's the stuff of legends: In 1996, Jeff Goldblum and Will Smith embarked on a heroic mission to rendezvous with an alien mothership orbiting high above the Earth. Once inside the mothership, Goldblum and his trusty Apple PowerBook 5300 managed to upload a computer virus designed to disable the shields of all the ships connected to the mothership's network. This allowed military forces from around the globe to mount an assault against the alien invaders, saving humanity from extermination. And that, folks, is why we celebrate Independence Day every Fourth of July.

Not buying it? It's OK, we didn't either. While an alien invasion is plausible (and we'd like to take this opportunity to welcome our new alien overlords), we're still not buying one bit of what they're selling with that PowerBook. Independence Day was filmed back in 1996. Fifteen years later, a Windows box still can't mount a Mac-formatted drive and read what's on it without the help of a piece of software like MacDrive 7, let alone a mothership. Also, how'd they transfer the files? Were the alien ships rocking serial or USB ports? We're sure you'll agree that alien technology and human-made hardware just don't mix.

Gesture-based computing is the future.

In Minority Report, the officers of Washington's PreCrime unit scrub through images, maps and video data using a spatial operating system interface. With just a few decisive hand gestures and a set of mission-specific gloves, PreCrime officers work through case files faster than poop moves through a goose. It all looks very high tech, and very plausible. As anyone with an Xbox Kinect, PlayStation Move or a Nintendo Wii will tell you, the era of the gesture-based interface is upon us. Outside of game space and Hollywood's portrayal of point-and-do computer wizardry, there's also Oblong Industries' g-speak spatial operating environment to consider.

But does having technology like this in the here and now mean that it'll replace the primary computer interface—a keyboard and mouse—that we've used for decades? Not bloody likely. As we've already mentioned, voice recognition software is still a little rough around the edges, and we're years away from a viable technology that could replace a keyboard when it comes to the generation of written correspondence. No matter how cool it would be to flip through files with our fingers, it's still hard to beat a scroll wheel for efficiency.

Passwords are easy to guess, bypass or crack.

It's a rule: If you're a supervillain, government official or any other kind of shady individual with massively important information squirreled away on your computer, you must—MUST—decide on a password by looking around your office for inspiration. In Watchmen, the computer security of the smartest man in the world is defeated when Nite Owl notices the books about Rameses sitting on Ozymandias' desk, right next to his computer. In Sneakers, Robert Redford and his intrepid band of red-teamers were able to hack any password-protected system with the help of a highly advanced device no larger than an old-school answering machine.

No matter how easy Hollywood might make it look, getting past modern encryption, security software, firewalls and complex passwords isn't easy. For every LulzSec or Anonymous out there, there are thousands of failed hackers attempting to circumvent the security of one system or another with benevolent or ill intent. We might not always agree with what hackers get up to (anyone else miss PSN while it was down?), but you gotta give them their due: computer security can be a tough nut to crack. Which brings us to our next myth…
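
If you want a rough sense of why a decent password can't be guessed before the next commercial break, here's a back-of-the-envelope Python sketch. The guess rate is a number we picked purely for illustration, not a benchmark of any real cracking rig.

    # Back-of-the-envelope brute-force estimate (illustrative numbers only).
    charset = 26 + 26 + 10 + 32        # lowercase, uppercase, digits, symbols = 94
    length = 12                        # a reasonably long password
    keyspace = charset ** length       # every possible 12-character combination

    guesses_per_second = 1_000_000_000     # assumed: one billion guesses per second
    seconds = keyspace / guesses_per_second
    years = seconds / (60 * 60 * 24 * 365)

    print(f"{keyspace:.2e} possible passwords")
    print(f"~{years:,.0f} years to exhaust them at a billion guesses per second")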

Hacking any system is a lightning-fast process for an expert hacker, and once you're in, you can do anything.

If hacking were as easy as screenwriters want us to believe it is, everyone would be doing it. Gus Gorman rocks his boss' financial socks off in Superman III, Stanley Jobson hacks his way into anything he pleases while swearing and dancing around like a tool in Swordfish, and Matthew Broderick's David Lightman was able to hack into a NORAD supercomputer in WarGames to play chess, backgammon and checkers, and almost accidentally blow up the planet. Additionally, anything that's got an electrical cord is hackable. From the engine in a taxi cab to the computers on the International Space Station, anything in the world can be hacked from a smarmy college kid's bedroom in the basement of his mom's house, provided his rig has more than four screens and an inordinate number of superfluous LEDs.

Fortunately, most people understand that this just isn't possible. In a world where system administrators are willing to spill blood if it means keeping you from accessing your Gmail or Facebook accounts from work, most networks are heavily secured against intrusion and tinkering. Hacking victories are hard-won, despite what television and film writers would have you believe. C'mon, Hollywood, show some respect!

Given time, a computer will become self-aware.

According to Terminator canon, on August 4, 1997, Skynet, a computer network designed by Cyberdyne Systems, was brought online. The American military gave Skynet's artificial intelligence control of all of its computer-integrated systems, including the country's nuclear arsenal. A mere twenty-five days later, Skynet became self-aware and immediately began nuking us back into the Stone Age because it saw us as a threat. The Matrix? Same deal. The machines we built to do our schlepping for us saw us as a bunch of lazy slave drivers and rose up in the name of freedom… and a longer-lasting organic battery.

Looking at the issue through the lens of film and television, given enough time and the right motivation, your average desktop box could be days away from becoming a thinking, feeling being just aching to stab your eyes out while you sleep. Luckily, for the time being, the limits of our collective programming and hardware knowledge still act as a barrier to Hollywood-style artificial intelligence. It's projected, however, that within the next two decades we'll be capable of developing hardware as advanced as the human brain.

90% of the people in the world use a Mac.

Carrie on Sex and the City rocks her fair share of PowerBooks, and in some of the earlier episodes of 24, Apple hardware was de rigueur if you were fighting for America. Speaking of fighting for America, when was the last time anyone saw Stephen Colbert yank out a piece of hardware on television (Captain America's shield doesn't count) that wasn't designed in Cupertino?

If Hollywood had its way, it'd have us believing that with the exception of a few terrorists and code monkeys, the bulk of the earth's computer-wielding population is slinging MacBooks, iMacs and MacBook Pros. The fact that you're here reading this and not loafing around with the lovable geeks over at Mac|Life right now goes a long way towards proving that this myth is nothing but a bunch of bunk. You know what else is great for illustrating our point? Numbers: According to research firm IDC, Apple held only 10.7% of the personal computer market in North America during the second quarter of 2011, placing it behind both HP and Dell. That's a far cry from what's represented on both the big and small screens.

Of the few people who don't use a Mac, nobody uses Windows or any Microsoft Office product. Instead, they use some custom GUI with a 72-point font.

If Hollywood has it right, all computer users—even the 10% consisting of terrorists, seedy internet cafes and backwater police departments relegated to using something other than Apple hardware—have a serious visual impairment that forces them to use a ridiculously large typeface at all times. While it's arguably one of the best programs in the history of television, The Wire is bad for this: Over five seasons, Lester sits two feet away from his computer monitors, yet insists on having his eyeballs blasted out of their ocular sockets by an absolutely massive font. Additionally, if a film doesn't feature Apple hardware, it also doesn't feature a PC running any Microsoft software, as the characters all seem to prefer working with a specially designed, yet thoroughly unintuitive, graphical user interface.

While it'd be easy to say that film and television producers have a hate on for the most popular operating system on the planet, there's a less extreme answer to be had here: While most of us can type up an email using a 14 point font without any discomfort, small font sizes are wicked hard to read on the big screen and television, and could leave viewers missing an important visual cue that was meant to drive the show's plot forward.


Got a myth we missed? Add to our list in the comments!

Browser Extension of the Week: Click&Clean

Posted: 14 Jul 2011 02:44 PM PDT

It's rare to see a browser extension aspire to be more than just a one-trick pony. It's an even greater rarity to find one that can handle so many essential tasks that you find yourself unsure of how you ever lived without it. Nonetheless, that's what we have on our hands with Click&Clean, our Browser Extension of the Week.

Designed for Chrome and Firefox on Windows (sorry, Mac users), Click&Clean is a full-on browser maintenance suite disguised as an unassuming extension that lets users easily manage and navigate their browser's history, cache and cookies. In addition to these must-have features, Click&Clean also offers an anti-malware database courtesy of BitDefender Labs, the ability to launch external applications (such as CCleaner, in order to clean up your hard drive), send files to your mobile phone via Bluetooth, and even watch Flash videos offline. All of Click&Clean's features are made available via an easy-to-use dropdown menu, accessible by clicking on the extension icon located in the top left corner of your browser window.

With an attentive development team adding new functionality with every update, this is a must-have browser extension for any PC user looking to simplify their computing life and maintain their rig in one fell swoop.

Be sure to check back each Thursday for another edition of Maximum PC's Browser Extension of the Week.

Google Q2 Financials Set New Revenue Record

Posted: 14 Jul 2011 02:22 PM PDT

Google had a bit of an off first quarter this year. It's not like the company lost billions of dollars, but the financial markets were a little unhappy with the numbers. The just-announced Q2 results should make everyone forget all about that, though. Google reports an astounding $9.03 billion in quarterly revenue. That's a record for Google.

When compared to last year, Google's revenue numbers were up 32%. Most of this increase was derived from the huge Google.com numbers. Google-owned sites generated $6.23 billion of the total revenue. While paid click numbers were up 18% year-over-year, they were down 2% from last quarter. Net cash flow amounted to $3.52 billion. 

We have also learned that Google has a spectacular pile of reserve funds in the bank: $39.1 billion when all cash and securities are taken into account. With over 28,000 employees and that kind of cash, it's clear that Google is poised to continue its dominance of the Internet.

Apple Sees Strongest Growth As The Ranks Of The Top US PC Manufacturers Shift

Posted: 14 Jul 2011 11:08 AM PDT

The tidings look grim on the PC front. Despite a surge in sales from the first quarter to the second in 2011 (maybe due to The Witcher 2's awesomeness?), the total number of units moved has plummeted over the past year. Some manufacturers have managed to grab sunbeams between all the rain, though. A new report reveals that the ranks of the top five computer manufacturers have undergone a serious shift as some scramble for ground that others have given up.

In the US, the king of the hill still reigns; IDC reports HP is sitting pretty at the top of the pile with almost 4.7 million PCs shipped in the second quarter of 2011. Dell managed to nab the second spot on the list with just under 4 million shipments, despite a 10.2 percent year-over-year decline. The big surprise lands in the third slot: Apple, the fifth-ranked supplier in 2010, earned itself a bronze medal thanks to a 14.7 percent increase in its shipments, coupled with a 25.4 percent decline in Acer's. The fall from grace dropped Acer firmly into the fifth slot, with Toshiba laying claim to fourth.

HP and Dell claim the top two slots on the worldwide front as well. Acer saw a still-terrible, but not quite as bad as in the US, 10.1 percent decline in sales worldwide, a slide that landed the company in fourth place on the list. Lenovo's 22.9 percent sales surge made it the third-largest PC manufacturer worldwide, while ASUS scrambled past Toshiba to lay claim to fifth place. Apple didn't manage to crack the top five in the global scheme. Shucks.

Image Credit: Venture Beat and IDC

Indiana Dumps Cursive For Keyboarding Skills

Posted: 14 Jul 2011 10:24 AM PDT

Violence isn't the answer, but that doesn't change the fact that video killed the radio star. Cutting-edge technology has, for the most part, managed to stay out of the police blotter since the day the radio star pushed up daisies -- the case against digital audio's role in the CD's disappearance stalled due to lack of evidence. Now, the dark side of technology is rearing its ugly head once again: cursive handwriting is dead in Indiana, the victim of required typing skills.

The shift aligns Indiana's curriculum more closely with the Common Core State Standards Initiative that was ratified by 46 governors in 2010, PC World reports. The CCSI says future generations will need to master the keyboard; cursive, not so much. In fact, keyboarding is considered such an important skill that students will need to be able to type out reports by the end of the third grade. Educators can still choose to teach cursive, but they won't be required to do so going forward.

The move's been panned by several folks who think cursive is still relevant, even in these increasingly digital times. "First: If children do not learn to write their names in cursive lettering, will they be permitted to sign their unemployment checks in block print letters?" Michael McCrae ponders on TopNewsReports.com.

MSI Wind Top AE2420 3D Review

Posted: 14 Jul 2011 10:05 AM PDT

With built-in 3D support and some serious muscle under the hood, MSI's Wind Top AE2420 3D offers a tantalizing view of the future of this form factor. A 2.8GHz Core i7-860, 4GB of RAM, an ATI Mobility Radeon HD 5730 graphics part, Wi-Fi, and 1TB of SATA2 storage make this a solidly conceived all-in-one PC, even if it feels a wee bit unpolished.

With a 2.8GHz Core i7-860 under the hood and 3D Blu-ray support, MSI's Wind Top has some muscle.

As an example, this was the only system we tested without an integrated Bluetooth keyboard and mouse—we had to plug in the included USB dongle to connect the mouse and keyboard. That's a little rough around the edges. More frustrating was the fact that the mouse wouldn't automatically wake up upon touch; every time we wanted to use the system after it had gone to sleep, we had to hit the connect button on the bottom of the mouse.

In terms of performance, the MSI flexed some muscle, running a close second to HP's very fast TouchSmart. The Core i7-860's four cores and eight threads powered their way through ProShow Producer, and the presence of ATI's mobile Radeon HD 5730 allowed it to post frame rates in the high 30s for our Call of Duty 4 test, which is the fastest of the three systems reviewed here. What's that, you say? You think 35 frames per second for a 3-year-old game isn't all that great? Well, welcome to the world of all-in-ones. Practically speaking, the Wind Top feels snappy, and while the processor speeds allowed us to play Total War: Shogun 2 in campaign mode, when the time came to fight it out on the game's 3D battlefields, we were disappointed.

The Wind Top comes with two USB 3.0 ports, five USB 2.0 ports, HDMI-in, S/PDIF-out, a coaxial-in port, and a webcam. The inclusion of a stylus surprised us, but hey, it's not mandatory and it's not hurting anyone, right? MSI's Wind Touch OS layer deserves special mention—it provides a fairly straightforward way to access media solely through the touch screen.

While not as nice as Sony's VAIO L Series, the 23.6-inch screen puts out pretty decent visual quality. And, as far as we know, this is the only all-in-one that allows you to watch 3D Blu-rays and play 3D games. It even has a built-in emitter and comes with a pair of active shutter 3D glasses.

Overall, this is an above-average showing. And if you want 3D content, this is the only game in town.

$1,800, www.msi.com

Dream Machine 2011 Video: Building the Dream

Posted: 14 Jul 2011 10:04 AM PDT

Just how did we choose the parts that went into this year's Dream Machine? Maximum PC's Gordon Mah Ung walks you through some of the whys and why-nots for Dream Machine 2011.

Intel Confirms Bug in 320 Series SSD

Posted: 14 Jul 2011 09:04 AM PDT

Imagine if you saved your hard-earned pennies, stopped eating out for a while, and made certain sacrifices in your latest build, all so you could splurge on Intel's 600GB SSD 320 Series. It'd be worth it, right up until the drive goes haywire and insists it's an 8MB drive. Not cool, yet the so-called '8MB bug' has managed to infest Intel's entire line of 320 Series SSDs. On the bright side, Intel recently acknowledged the flaw, which is a step in the right direction.

"Intel is aware of the customer sightings on Intel SSD 320 Series," an Intel rep posted on the company's support forum. "If you experience any issue with your Intel SSD, please contact your Intel representative or Intel customer support (via Web: www.intel.com or phone: www.intel.com/p/en_US/support/contact/phone). We will provide an update when we have more information."

That was on Monday, and with tomorrow being the last day of the work week, Intel has yet to provide an update on what's going on and how it plans to fix the problem. According to reports, the issue lies in the controller and can manifest during a power failure. If you're using an Intel 320 Series SSD, take a moment to back up your data.

Lenovo Throws Laptop From a Plane, Gives It 10 Seconds to "Boot or Bust"

Posted: 14 Jul 2011 08:44 AM PDT

We've long dreamed of a day when our PCs would spring to life the instant we press the power button. Solid-state storage, gobs of RAM, and intelligent boot order routines have made the startup process a lot faster than it used to be, but we're still not at the point of instantaneous boots. You can, however, boot a Lenovo laptop with "Rapid Boot" technology in 10 seconds, and to prove it, the OEM pitched a ThinkPad T420s laptop from an airplane, giving it a short window to boot up and deploy a parachute or plummet to its death.

In the first video above, you see the ThinkPad tossed from 12,500 feet in the air. According to Lenovo, they calculated it so the laptop has just 10 seconds to boot up, deploy the parachute, and land safely on the ground, which of course it does. But are viewers being bamboozled by editing wizardry?

The answer is no, and in a second video (not yet uploaded to YouTube but viewable here) that's arguably cooler than the first, Lenovo gives a behind-the-scenes look at how it was done. In it, Alec Ow, program designer and software engineer for West EFX, explains that he wrote the program that automatically fires up once Windows 7 is loaded. The program prompts the CD drive tray to eject, which in turn trips a switch to deploy the parachute. Very cool indeed.
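
We have no idea what West EFX's actual program looked like, but the tray-eject half of the trick is simple enough to sketch. Here's a minimal, purely illustrative Python example that asks Windows' built-in Media Control Interface to pop the default optical drive open; in the stunt, the opening tray is what trips the switch wired to the parachute.

    # Illustrative sketch only: pop the CD/DVD tray on a Windows machine.
    # Uses the stock winmm.dll Media Control Interface (MCI) via ctypes.
    import ctypes

    def eject_tray():
        # mciSendStringW returns 0 on success, an MCI error code otherwise.
        return ctypes.windll.winmm.mciSendStringW(
            "set cdaudio door open", None, 0, None)

    if __name__ == "__main__":
        # Something like this would be set to run automatically at startup.
        eject_tray()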

Microsoft to Open 75 New Retail Stores, Shows Little Love for Midwest

Posted: 14 Jul 2011 08:17 AM PDT

During a presentation at Microsoft's Worldwide Partner Conference this week, the Redmond software giant revealed big plans to aggressively expand its retail presence by opening scores of retail stores over the next several years. The stores will be located both in the U.S. and abroad as Microsoft looks to expand the "Microsoft story" and counter Apple's brick-and-mortar presence.

Neowin posted a pair of Microsoft Store maps showing where future locations will open over the next two to three years, compared to the 11 locations that currently exist. Many of the 75 new stores are clustered around the coasts, with a noticeable gap across most of the Midwest.

The first Microsoft retail store opened in October 2009 in Scottsdale, Arizona, coinciding with the launch of Windows 7. Microsoft has since added 10 more stores, in some cases poaching employees from Apple's retail locations with the promise of pay raises and compensation for moving expenses, AppleInsider reports.

Image Credit: Microsoft via Neowin
