My wife bought me this beautiful toy for my birthday (and this was very hard for her, for reasons that I cannot post here):
For those who don’t recognize it, and think this is a regular laptop or some kind of netbook, guess again. It is an Asus Transformer Prime, with an added docking station that comes with a keyboard, touchpad and battery. And let me tell you, this is a cooooool toy (another one of those where you don’t understand why you didn’t buy one before). The screen is awesome, response time is very fast and the user interface is pretty slick. But one of the most interesting things I discovered is that I haven’t used the touchpad as an input interface, not even once (except to test that it works). When your hands are so close to the screen and you already have to take them off the keyboard to do something, touching the screen directly seems like the correct thing to do. No need to find where the mouse pointer was last time, move it to the new location and click. Just touch where you want and there you go.
It is a bit far-fetched to throw the mouse away, especially on the desktop, where the screen is a lot farther from you than on a little pad like the Transformer, but it is certainly a change in the way we interact with a computer. The solution for desktops is to have the computer follow your eyes and listen to your voice. For example, if you are reading a web page and you want to follow a link, just look at it and say “follow”. We will never again have to take our hands off the keyboard (there goes the need to memorize all the vi shortcuts :-) ). And these interfaces are not so far in the future; we are actually pretty close to getting there. These are cool times to live in.