This article first appeared in the St. Louis Beacon, March 7, 2011 - When Steve Jobs announced the original iPad only a year ago, and when he announced its successor, the iPad 2, last week, he failed to mention one of the device's most innovative features.
The feature and its effects have implications across a broad range of areas, from (most obviously) personal computing and how we use computer interfaces all the way to (most broadly) how we find, digest, and think about information.
I'm talking about how the iPad works. Turn it on, and you're presented with a grid of icons, each representing an application. Tap one, and it expands to fill the screen. The entire screen. There are no menus surrounding the application to let you know what other applications you have, whether you have mail, or even, sometimes, what time it is. (This design actually debuted with the iPhone four years ago, but I don't think its full implications were realized until the iPad arrived.)
The iPad becomes the application. Choose a reading app like iBooks or Kindle, and the iPad becomes an eReader. Choose a game, and the iPad becomes a gaming device. Choose a recipe program, and the iPad becomes a cookbook. As you engage with one, the iPad gives no hint it could be any of the others.
Which leads me to my next point: Will the iPad (and devices like it) change the way we think?
Right now, on my computer, I have open the document in which I'm typing this column. Across the top left of the screen is a menu belonging to the program. The top right holds 22 icons as well as a clock. The icons mostly let me know a program is running, but some also convey information about that program. Down the left side of the screen is a dock of icons, mostly applications but a few folders and minimized windows too. And my trash, which needs to be emptied.
But wait, there's more! This screen is one of nine "virtual desktops" I have; the others have various programs running in them. Additionally, a notification program periodically displays messages in the bottom corner of my screen to alert me whenever something I might want to know about happens.
Could I tone down the distractions on my computer? Of course. Some programs have full-screen modes to do just that. But that isn't the point. In the relatively short time that computers have existed, most of us have bought into the notion that because they can do anything, they should do everything, all at the same time if possible. "Multitasking" became a euphemism for "being distracted." At some point, someone decided that if a program didn't remind you it was running by displaying an icon or a bar or a light or something, you'd forget it even existed. And then it became a "king of the hill" game in which programs and new emails and updates and chat messages all elbow each other like commuters on a subway train at rush hour.
Contrast that with the iPad. In the one year the iPad has been around, a new form of attention has taken hold -- or rather, seen a rebirth. When you're reading a book -- an actual, paper book -- there's no menu running along the side to remind you what other books are on your bookshelf. When you're watching a movie at the theater, there's no bar running across the top of the screen to tell you a new email has arrived. When you're cooking in the kitchen with your generations-old cookbook, it doesn't ding when your printer driver needs an update. The iPad has the computer's versatility, but it builds in a focus on one thing at a time.
Are there times when it's annoying not to have the computer's versatility? Sure. Sometimes I'd like to be able to look something up on the Web while writing an email or write a blog post while reading something in another app. More often, though, I find forced single-tasking to be a feature, not a bug.