Have you ever read a work of fiction where the main character gets frustrated by an interface? Have you ever played a non-puzzle video game where your character had trouble figuring out which buttons activate which doors, traps, or lasers?

Unless you've been reading much more boring stuff than I have (or playing a lot of Myst), you should have noticed that this kind of thing just doesn't happen in fiction. Reality is a dark and desolate place where we have to deal with modal dialogs that interrupt us while we're trying to get work done; where confusing, disorganized software with poorly designed and outright wrong interfaces is considered normal. In video games and books that happen to have computers in them, those computers are easy for everyone to use. You just have to suspend your disbelief that such a user experience is possible.

The utopian nature of such fictional usability aside, we can learn a lot from how fictional interfaces are presumed to work, and we can try to bring them into reality to see whether they really are that good.

Remember the computer system in Star Trek: The Next Generation? That thing understood requests made in plain language (most of the time) and formed responses based on those requests. Unless the plot dictated otherwise, the Enterprise computer simplified the lives of everyone on board. The crew just talked to the computer and it responded. They didn't have to deal with keyboards or mice or mystifying error messages.

A system like that is incredibly hard to build. But we're getting there: IBM has built a system called Watson that can process natural-language questions and produce spoken answers in a somewhat intelligent manner. It's no ship's computer, but it's a big step in the right direction if we're ever going to do things the Picard way. Watson may not be doing anything other than trivia for now, but the language-processing software it employs will be applicable everywhere.
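
To ground the idea, here's a minimal sketch of a plain-language command interface. It's a toy keyword matcher written for this post, not how Watson or the Enterprise computer actually works, and the intents and phrases in it are invented purely for illustration.

```python
import re

# Toy plain-language command matcher. Real systems (Watson included) rely on
# statistical language models rather than keyword lookup; the intents below
# are made up for illustration only.
INTENTS = {
    "lights_on":  {"turn", "lights", "on"},
    "lights_off": {"turn", "lights", "off"},
    "play_music": {"play", "music"},
}

def interpret(request: str) -> str:
    """Return the intent whose keywords best overlap the request."""
    words = set(re.findall(r"[a-z]+", request.lower()))
    best_intent, best_score = "unknown", 0
    for intent, keywords in INTENTS.items():
        score = len(words & keywords)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent

print(interpret("Computer, turn the lights on please"))  # -> lights_on
print(interpret("Could you play some music?"))           # -> play_music
```

A real system swaps the keyword table for models trained on enormous amounts of language, but the shape of the interaction stays the same: plain words in, intended action out.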

Another great fictional interface is the hologram. I don't mean an advanced system like a holodeck, but rather smaller holograms projected into real space to create what's called augmented reality. We're already building a few AR systems (as can be seen in the Wikipedia article just linked), but they're nowhere near the level they need to be.

In the game Dead Space 2, holograms are an everyday part of life. They replace all of the physical interfaces we have today, so no one needs to be in a specific location to access information networks. Today, we need to be in front of a monitor to see output, and we need to be in front of input devices to interact. Using a computer comes with a locality dependency.

Cell phones and other small devices mitigate this dependency by letting you carry a computer around wherever one is needed. The locality is still there, though, since you have to pull out your phone. There's a waiting period, a transition between viewing the world normally and viewing it through a tool. And the small computer is not as capable as its larger, decidedly less mobile counterparts.

Augmented reality could bring us information no matter where we are, regardless of our proximity to a computer. It would allow us to view content on virtual screens (when merely two-dimensional information is accessed) and to blur the line between what exists physically and what exists solely as information. A lot of this information relates directly to the physical world anyway, such as map directions. In a sense, a GPS device is already an augmented reality device, in that it augments what you can see and hear.
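
To make that concrete, here's a minimal sketch of the kind of calculation such an overlay depends on: the standard formula for the compass bearing from the user's GPS position to a destination, which an AR display could render as an arrow floating over the street. The coordinates and output are illustrative, not any particular device's implementation.

```python
import math

def bearing_degrees(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees (0-360).

    An AR overlay could use this to draw a direction arrow on top of the
    user's view. Inputs are latitude/longitude in decimal degrees.
    """
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

# Example: rough bearing from central London toward Paris (about 148 degrees)
print(round(bearing_degrees(51.5074, -0.1278, 48.8566, 2.3522)))
```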

But a GPS device is much too clunky for the world of fiction: it requires you to look at it to use it. Another system, this time from Neuromancer, is closer to the mark. There, some people have implants that superimpose images on top of what they perceive biologically. A system like this can display overlays onto reality so its user never has to look away from what he's doing in order to gain more information. It's already right there.

I'm moving steadily away from what is currently possible and into a futuristic world where people are cyborgs. That's where I think we should be going to make life simpler. We'll start with interfaces that respond to what is meant rather than what is done, as Watson begins to. Then we'll move beyond that to integrate computers into everything around us, so that we don't have to be somewhere specific to access information. Further into the future still, we'll be able to use tools like glasses with internal projector screens, or indeed biological augmentation, to allow for a personalized user experience.

And that's just one set of ideas from various sources. It's possible that the future of interaction design will be entirely different, something no author could have ever imagined. Or maybe we'll be stuck with what we have now forever. Whatever the future holds, we should always strive to make things easier for our users by taking small steps or large leaps in the right direction. Fictional interfaces are a great source of inspiration, since any software application, and indeed any invention, exists first as a fictional idea in its creator's head before it is brought into reality.