Summary: User interfaces in film are more exciting than they are realistic, and heroes have far too easy a time using foreign systems.

The way Hollywood depicts usability could fill many a blooper reel. Here are 10 of the most egregious mistakes made by moviemakers.

1. The Hero Can Immediately Use Any UI

Break into a company — possibly in a foreign country or on an alien planet — and step up to the computer. How long does it take you to figure out the UI and use the new applications for the first time? Less than a minute if you're a movie star.

The fact that all user interfaces are walk-up-and-use is probably the single most unrealistic aspect of how movies depict computers. In reality, we know all too well that even the smartest users have plenty of problems using even the best designs, let alone the degraded usability typically found in in-house MIS systems, industrial control rooms, or military systems.

2. Time Travelers Can Use Current Designs

An even worse flaw is the assumption that time travelers from the past could use today's computer systems. In fact, they'd have no grasp of modern technology's basic concepts, and so would be far more stumped than the novice users we observe in user testing. Even someone who's never used Excel at least understands the general idea of computers and screens.

You might think that people coming from the future would have an easier time using our current systems, given their supposedly superior knowledge. Not true. Like our travelers from the past, they'd lack the conceptual model needed to make sense of the display options. For example, someone who's never seen a command line or typed a command would have a much harder time using DOS than someone who grew up in the DOS era.

If you were transported back in time to the Napoleonic Wars and made captain of a British frigate, you'd have no clue how to sail the ship: you couldn't use a sextant, and you wouldn't know the names of the different sails, so you couldn't order the sailors to rig the masts appropriately. Even so, the sailing scenario would be easier than a visitor from the year 2207 trying to operate a current computer: sailing ships still exist, and you probably know some of the basic concepts from watching pirate movies. In contrast, it's highly unlikely that anyone from 2207 will have ever seen a Windows Vista screen.

3. The 3D UI

In Minority Report, the characters operate a complex information space by gesturing wildly in the space in front of their screens. As Tog found when filming Starfire, it's very tiring to keep your arms in the air while using a computer. Gestures do have their place, but not as the primary user interface for office systems.

Many user interfaces designed for the movies feature gestural input and 3D data visualizations. Immersive environments and fly-through navigation look good, and allow for more dramatic interaction than clicking on a linear list of 10 items. But, despite being a staple of computer conference demos for decades, 3D almost never makes it into shipping products. The reason? 2D works better than 3D for the vast majority of practical things that users want to do.

3D is for demos. 2D is for work.

4. Integration is Easy, Data Interoperates

In movieland, users have no trouble connecting different computer systems. Macintosh users live in a world of PCs without ever noticing it (and there were disproportionately more Macs than PCs in films a decade ago, when Apple had the bigger product-placement budget).

In the show 24, Jack Bauer calls his office to get plans and schematics for various buildings. Once these files have been transferred from outside sources to the agency's mainframe, Jack asks to have them downloaded to his PDA. And — miracle of miracles — the files are readable without any workarounds. (And download is far faster than is currently possible on the U.S.'s miserable mobile networks.)

(See also the sidebar about excessive interoperability in Independence Day.)

5. Access Denied / Access Granted

Countless scenes involve unauthorized access to some system. Invariably, several passwords are tried, resulting in a giant "Access Denied" dialog box. Finally, a few seconds before disaster strikes, the hero enters the correct password and is greeted by an equally huge "Access Granted" dialog box.

A better user interface would proceed directly to the application's home screen as soon as the user has correctly logged in. After all, you design for authorized users. There's no reason to delay them with a special confirmation that yes, they did indeed enter their own passwords correctly.
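The design point above can be sketched in a few lines. This is a minimal illustration, not a real authentication system: the names `check_password`, `handle_login`, and the hypothetical credential store are all invented for the example. The idea is simply that a successful login returns the home screen directly, with no "Access Granted" confirmation step, and a failed login quietly re-shows the login form rather than a giant "Access Denied" banner.

```python
# Sketch of "log in, go straight to work": no confirmation dialogs.
# All names here are illustrative, not from any real framework.

HOME = "home_screen"
LOGIN_FORM = "login_form"

# Hypothetical credential store, stand-in for real authentication.
_USERS = {"alice": "correct horse battery staple"}

def check_password(username: str, password: str) -> bool:
    """Return True when the credentials match (toy check for the sketch)."""
    return _USERS.get(username) == password

def handle_login(username: str, password: str) -> str:
    """Return the next screen to show after a login attempt.

    Success routes directly to the application's home screen -- no
    "Access Granted" dialog. Failure simply re-presents the login form.
    """
    if check_password(username, password):
        return HOME        # authorized users get on with their task
    return LOGIN_FORM      # no dramatic full-screen denial, just retry
```

Note that the failure branch deliberately gives the same quiet response regardless of why the login failed, which also matches real-world security practice of not revealing which part of the credentials was wrong.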

6. Big Fonts

In addition to the immense font used for "Access Denied" messages, most computer screens in the movies feature big, easily readable text. In real life, users often suffer under tiny text and websites that add insult to injury by not letting users resize the words.

Large text is an obvious concession to the viewing experience: moviegoers must be able to see what's on the screen. Still, enlarging the information that much makes for an unrealistic UI.

7. Star Trek's Talking Computer

The voice-operated computer in Star Trek is an even more egregious example of designing an audience interface rather than a user interface. Spoken commands and spoken responses make it easy for the audience to follow the action, but it's a very inefficient way of controlling a complex system.

In predictions about computing's future, voice interaction is a perennial favorite — it probably even beats 3D, which is the other top contender for most over-hyped UI technology. While voice has its place, it's even less suitable than 3D for most everyday interactions because it's a less data-rich channel and it's harder to specify something in words than to choose it on a graphical display.

8. Remote Manipulators (Waldo Controls)

In Tomorrow Never Dies, James Bond drives his BMW from the back seat with an Ericsson mobile phone that works as the car's remote control. And 007 drives fast, while also evading bad guys.

In practice, there's a reason we use steering wheels to drive cars instead of joysticks, touchpads, or push-buttons. The steering wheel is an excellent input device for fast and accurate specification of directionality.

Many other films feature different types of remote controls, which always work with high speed and accuracy despite input devices that are suboptimal for the task. Designing good input devices is a tricky human factors problem, and you can't substitute devices willy-nilly and retain the same performance. A foot pedal, for example, is not as good as a mouse for text editing, because you can't move your legs as accurately as your hands and fingers.

9. You've Got Mail is Always Good News

In the movies, checking your mail is a matter of picking out the one or two messages that are important to the plot. No information pollution or swamp of spam. No ever-changing client requests in the face of impending deadlines. And you never overlook information because a message's subject line violated the email usability guidelines.

10. "This is Unix, It's Easy"

In the film Jurassic Park, a 12-year-old girl has to use the park's security system to keep everyone from being eaten by dinosaurs. She walks up to the control terminal and utters the immortal words, "This is a Unix system. I know this." And proceeds to (temporarily) save the day.

Leaving aside the plausibility of a 12-year-old knowing Unix, simply knowing Unix is not enough to immediately use any application running on the system. Yes, she could probably have used vi on the security terminal. But the specialized security system would have required some learning time — significant learning time if it were built on Unix, which has notoriously inconsistent user interface design and thus makes it harder to transfer skills from one application to the next.

Do the Usability Bloopers Matter?

Does it matter that most films offer such an unrealistic depiction of usability? Mainly, no. A movie's purpose is entertainment, not task performance. So, go ahead and employ user interfaces and interaction techniques that are entertaining and would never work in the real world.

Films are littered with so many other unrealistic plot details: you'd imagine, for example, that the ability to shoot straight might actually be a primary job requirement of Imperial Stormtroopers.

In the movie context, unrealistic usability is only to be expected. Still, I see two real problems with it:

Research funding and management expectations are subtly biased by the incessant emphasis on unrealistic UI design such as voice, 3D, avatars, and AI. When you see something work as part of a coherent and exciting story, you start wanting it. You even start believing in it. After all, we've seen 3D and voice so often that we've developed an implicit belief in their usefulness.

Users blame themselves when they can't use technology. This phenomenon is bad enough already; it's made worse by the prevalence of scenes in which people walk up to random computers and start using them immediately. We need people to start demanding easier designs and blaming the technology when it's too hard to use. Movies make this change in attitudes more difficult.
