For over half a century, Hollywood has been making computers do whatever they damn well please. Routinely featured on television and in movies, supercomputers, desktop rigs and laptops—and in some cases, the people who use them—are all too often imbued with near-magical capabilities, painting a deceptive picture of what our beloved machines can and cannot do. Not sure what tech-centric malarkey we're talking about? No problem: We've put together a list of our top ten Hollywood TV and movie myths. We're betting they'll be just as familiar and irritating to you as they are to us.
A computer will blow up if there is a question it cannot answer.
According to Hollywood, computers are so delicate that when confronted with a question that they're unable to answer, they'll explode. No one in the history of film knew this better than William Shatner. During his run as Captain James T. Kirk, The Shat took out more malevolent computers, androids and evil A.I.s with a set of contradictory orders, paradoxes, and strings of illogical questions about love or the human condition than you can shake a Bat'leth at.
If computers were really that volatile, you wouldn't be able to count the number of people who'd have been sent to an early grave back in the day, when Microsoft Encarta's disc-bound and web-enabled iterations came up with bupkis. The same goes for Wolfram Alpha: We don't recall seeing any mention of the dangers of posing a difficult question to their servers. In reality, computers don't explode when they can't answer your question or solve a problem. The worst that could happen is that your rig might freeze up, reboot or pony up a Blue Screen of Death. Granted, in the case of the latter, many users might prefer to see an explosion, but sadly, it's just not gonna happen.
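For the curious, here's what an unanswerable question actually looks like to a machine. This is just a minimal Python sketch of our own (the "knowledge base" and the question are made up for illustration), but note the conspicuous absence of a self-destruct sequence:

```python
# A minimal sketch of what actually happens when a machine meets a question
# it can't answer. The knowledge base below is made up for illustration.

def answer(question: str) -> str:
    """Pretend knowledge base with exactly one answer in it."""
    known = {"what is 2 + 2?": "4"}
    if question.lower() not in known:
        # No detonation sequence here, just a garden-variety exception.
        raise ValueError(f"cannot compute an answer to: {question!r}")
    return known[question.lower()]

if __name__ == "__main__":
    try:
        print(answer("What is love?"))
    except ValueError as err:
        # Worst case in real life: an error, a hang or a Blue Screen of Death.
        print(f"The machine survives and simply reports: {err}")
```

The program complains, prints an error and carries on, which is considerably less cinematic than anything the Enterprise ever ran into.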
Voice recognition software works every time - and flawlessly at that.
While voice recognition software has improved by leaps and bounds over the past decade, it still kind of sucks. Due to the many nuances of human speech, such as varied dialects, inflection and, in some cases, speech impediments, many people can't manage to dictate an email to Outlook, let alone verbally control a computer with anything resembling precision or reliability.
Except, of course, in Hollywood. In 2001: A Space Odyssey, HAL can open the pod bay doors at Dave's behest; Will Smith is able to carry on a meaningful conversation with VIKI in I, Robot; and in Blade Runner, Deckard is able to direct his home computer to manipulate a crime scene photo with nothing more than a few words. Riddle us this: When was the last time you mumbled orders into a microphone for GIMP or Photoshop to resize your vacation photos? Exactly. While modern supercomputers such as IBM's Watson have the power to process voice commands with uncanny accuracy, the consumer-grade hardware you're using to read this just can't hack it the way Hollywood wants to convince us it can. It's too bad, too. We're all kind of sick of typing.
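To give you a taste of what consumer-grade voice control really involves, here's a rough sketch built on the third-party SpeechRecognition package for Python (our choice for illustration, not anything featured in these films). Notice how much of the code exists purely to cope with the machine not understanding you:

```python
# A rough sketch of consumer-grade voice control using the third-party
# SpeechRecognition package (pip install SpeechRecognition pyaudio).
# The point: real code has to handle "sorry, didn't catch that" - a case
# Hollywood's flawless voice interfaces never seem to hit.
import speech_recognition as sr

recognizer = sr.Recognizer()

try:
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)  # dialects and noise matter
        print("Say something like 'open the pod bay doors'...")
        audio = recognizer.listen(source, timeout=5)
    command = recognizer.recognize_google(audio)  # ships audio off to Google's API
    print(f"Heard: {command}")
except sr.UnknownValueError:
    print("I'm sorry, Dave. I couldn't understand that.")
except (sr.RequestError, sr.WaitTimeoutError, OSError) as err:
    print(f"Voice recognition failed outright: {err}")
```

HAL never once had to say "could you repeat that?" Real software does, constantly.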
Any image or video can be corrected, blown up and made crystal clear.
Speaking of Deckard futzing with photos in Blade Runner, why is Hollywood so obsessed with unrealistic portrayals of image manipulation? On screen, no matter how grainy a photo might be, how dark it was outside when the picture was taken, or how far away the photographer was from the subject, any image can be zoomed in on, enhanced and dolled up for use in court or to track down the bad guys out on the street.
Jim True-Frost's Roland Pryzbylewski does it with video in The Wire, and Bryan Brown gets his picture-tinkering on back in 1986 with F/X. CSI? Don't even get us started. The truth of the matter is that no matter how advanced the software, or how powerful a rig you're cooking on, how legible an image can be made - and how big you can blow it up for viewing without making your eyes bleed - is very much dependent upon the quality of the original image you're working with. In other words, if you take a picture with a Cybershot D710, no amount of zoom and enhance is gonna make anything in that shot look like it was baked with a Sony a900.
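If you'd like to see why "zoom and enhance" is a dead end, here's a minimal sketch using the Pillow imaging library (the file name and crop coordinates are hypothetical). Upscaling only interpolates between the pixels you already have; it can't conjure up detail the sensor never captured:

```python
# A minimal sketch (using the Pillow library, not anything from CSI) of why
# "zoom and enhance" can't work: resampling spreads the existing pixels over
# a larger canvas, it never recovers detail that was never captured.
from PIL import Image

# 'surveillance.jpg' is a hypothetical low-resolution source image.
original = Image.open("surveillance.jpg")

# Crop a tiny 40x40-pixel region - say, a license plate in the distance.
plate = original.crop((300, 200, 340, 240))

# "Enhance" it to 800x800. Lanczos is about as good as resampling gets,
# and the result is still just a smoother blur of the same 1,600 pixels.
enhanced = plate.resize((800, 800), Image.LANCZOS)
enhanced.save("enhanced_plate.png")

print(f"Started with {plate.size[0] * plate.size[1]} real pixels; "
      f"no amount of resampling adds more information.")
```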
You can use your PC to interact with alien hardware.
It's the stuff of legends: In 1996, Jeff Goldblum and Will Smith embarked on a heroic mission to rendezvous with an alien mothership orbiting high above the earth. Once inside the mothership, Goldblum and his trusty Apple PowerBook 5300 managed to upload a computer virus designed to disable the shields of all of the ships connected to the mothership's network. This allowed military forces from around the globe to mount an assault against the alien invaders, saving humanity from extermination. And that, folks, is why we celebrate Independence Day every Fourth of July.
Not buying it? It's OK, we didn't either. While an alien invasion is plausible (and we'd like to take this opportunity to welcome our new alien overlords), we're still not buying one bit of what they're selling with that PowerBook. Independence Day was filmed back in 1996. Fifteen years later, a Windows box still can't mount a Mac-formatted drive and read what's on it without the help of a piece of software like MacDrive 7, let alone a mothership. Also, how'd they transfer the files? Were the alien ships rocking serial or USB ports? We're sure you'll agree that alien technology and human-made hardware just don't mix.
Gesture-based computing is the future.
In Minority Report, the officers of Washington's PreCrime Unit trundle through images, maps and video data using a spatial operating system interface. With just a few decisive hand gestures and a set of mission-specific gloves, PreCrime officers are able to work through case files faster than poop moves through a goose. It all looks very high tech, and very plausible. As anyone with an Xbox Kinect, PlayStation Move or a Nintendo Wii will tell you, the era of the gesture-based interface is upon us. Outside of game space and Hollywood's portrayal of point-and-do computer wizardry, there's also Oblong Industries' g-speak spatial operating system to consider.
But does having technology like this in the here and now mean that it'll replace the primary computer interface—a keyboard and mouse—that we've used for decades? Not bloody likely. As we've already mentioned, voice recognition software is still a little rough around the edges, and we're years away from a viable technology that could replace a keyboard when it comes to the generation of written correspondence. No matter how cool it would be to flip through files with our fingers, it's still hard to beat a scroll wheel for efficiency.
Passwords are easy to guess, bypass or crack.
It's a rule: If you're a super villain, government official or any other kind of shady individual with massively important information squirrelled away on your computer, you must—MUST—decide on a password by looking around your office for inspiration. In Watchmen, the smartest man in the world's computer security is broken when Nite Owl notices a book about Rameses sitting on Ozymandias' desk, right next to his computer. In Sneakers, Robert Redford and his intrepid band of red-teamers were able to hack any password-protected system with the help of a highly advanced device no larger than an old-school answering machine.
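Before we get into why that's nonsense, here's a quick back-of-the-envelope sketch of what guessing a genuinely complex password actually involves. The guess rate below is purely our assumption for the sake of the math, not a benchmark of any real cracking rig:

```python
# A back-of-the-envelope sketch of why a complex password isn't guessed in
# the five seconds Hollywood allows. The guess rate is an assumption made
# for illustration, not a measurement of real cracking hardware.
import string

charset = len(string.ascii_letters + string.digits + string.punctuation)  # 94 symbols
length = 12                                   # a reasonably complex password
keyspace = charset ** length                  # every possible combination

guesses_per_second = 1_000_000_000            # assume a billion guesses per second
seconds_per_year = 60 * 60 * 24 * 365

worst_case_years = keyspace / guesses_per_second / seconds_per_year
print(f"Keyspace: {keyspace:.2e} combinations")
print(f"Worst-case brute force: about {worst_case_years:,.0f} years")
```

Under those assumptions, the brute-force worst case lands on the order of 15 million years. Good luck finding that on a Post-it note.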
No matter how easy Hollywood might make it look, getting past modern encryption, security software, firewalls and complex passwords isn't easy. For every LulzSec or Anonymous out there, there are thousands of failed hackers attempting to circumvent the security of one system or another with benevolent or ill intent. We might not always agree with what hackers get up to (anyone else miss PSN while it was down?), but you gotta give them their due: computer security can be a tough nut to crack. Which brings us to our next myth…
Hacking any system is a lightning-fast process for an expert hacker, and once you're in, you can do anything.
If hacking were as easy as screenwriters want us to believe it is, everyone would be doing it. Gus Gorman rocks his boss' financial socks off in Superman III, Stanley Jobson hacks his way into anything he pleases while swearing and dancing around like a tool in Swordfish, and Matthew Broderick's David Lightman is able to hack into a NORAD supercomputer in WarGames to play chess, backgammon and checkers, and almost accidentally blow up the planet. Additionally, anything that's got an electrical cord is hackable. From the engine in a taxi cab to the computers on the International Space Station, anything in the world can be hacked from a smarmy college kid's bedroom in the basement of his mom's house, provided his rig has more than four screens and an inordinate number of superfluous LEDs.
Fortunately, most people understand that this just isn't possible. In a world where system administrators are willing to spill blood if it means keeping you from accessing your Gmail or Facebook accounts from work, most networks are heavily secured against intrusion and tinkering. Hacking victories are hard-won, despite what television and film writers would have you believe. C'mon, Hollywood, show some respect!
Given time, a computer will become self-aware.
According to Terminator canon, on August 4th, 1997, Skynet, a computer network designed by Cyberdyne Systems, was brought online. The American military gave Skynet's artificial intelligence control of all of its computer-integrated systems, including the country's nuclear arsenal. A mere twenty-five days later, Skynet became self-aware and immediately began nuking us back into the Stone Age, as it saw us as a threat. The Matrix? Same deal. The machines we built to do our schlepping for us saw us as a bunch of lazy slave drivers and rose up in the name of freedom… and a longer-lasting organic battery.
Viewed through the lens of film and television, given enough time and the right motivation, your average desktop box could be days away from becoming a thinking, feeling being just aching to stab your eyes out while you sleep. Luckily, for the time being, the limits of our collective programming and hardware knowledge still act as a barrier to Hollywood-style artificial intelligence. It's projected, however, that within the next two decades we'll be capable of developing hardware as advanced as the human brain.
90% of the people in the world use a Mac.
Carrie on Sex and the City rocks her fair share of PowerBooks, and in some of the earlier episodes of 24, Apple hardware was de rigueur if you were fighting for America. Speaking of fighting for America, when was the last time anyone saw Stephen Colbert yank out a piece of hardware on television (Captain America's shield doesn't count) that wasn't designed in Cupertino?
If Hollywood had its way, it'd have us believing that, with the exception of a few terrorists and code monkeys, the bulk of the earth's computer-wielding population is slinging MacBooks, iMacs and MacBook Pros. The fact that you're here reading this and not loafing around with the loveable geeks over at Mac|Life right now goes a long way towards proving that this myth is nothing but a bunch of bunk. You know what else is great for illustrating our point? Numbers: According to research firm IDC, Apple held just 10.7% of the personal computer market in North America during its second financial quarter of 2011, placing it behind both HP and Dell. That's a far cry from what's represented on both the big and small screens.
Of the few people who don't use a Mac, nobody uses Windows or any Microsoft Office product. Instead, they'll use some custom GUI with a 72-point font.
If Hollywood has it right, all computer users—even the 10% consisting of terrorists, seedy internet cafes and backwater police departments relegated to using something other than Apple hardware—have a serious visual impairment that forces them to use a ridiculously large typeface at all times. While it's arguably one of the best programs in the history of television, The Wire is bad for this: Over five seasons, Lester sits two feet away from his computer monitors, yet insists on having his eyeballs blasted out of their ocular sockets by an absolutely massive font. Additionally, if a film doesn't feature Apple hardware, it also doesn't feature a PC running any Microsoft software, as the characters all seem to prefer working with a specially designed, yet thoroughly unintuitive, graphical user interface.
While it'd be easy to say that film and television producers have a hate on for the most popular operating system on the planet, there's a less extreme answer to be had here: While most of us can type up an email using a 14 point font without any discomfort, small font sizes are wicked hard to read on the big screen and television, and could leave viewers missing an important visual cue that was meant to drive the show's plot forward.
Got a myth we missed? Add to our list in the comments!