Using your finger instead of a mouse or pointer represents a revolution in the way we use -- and design for -- the web. iPads, iPhones and other devices using multi-touch technology are on the verge of changing the web more dramatically than CSS3 or HTML5. Also: why the boundary between the web and mobile is about to disappear.
Thanks, Shannon… I wonder if HFI and other UX leaders are preparing for the multi-touch age. I think it's going to be a whole new ballgame, especially as user expectations start pushing the boundaries of multi-touch.
I think it's going to open up whole new worlds of interactivity. Think about keyboard controls right now and how efficient they are -- and they're only possible because we have five digits on each hand to manipulate the keys. With multitouch, there will potentially be five to ten points of contact on any screen at any time, plus the physics and mechanics behind that. How do fingers intuitively move when grasping, turning, etc.? Not only are we all going to have to go back and revisit some biology, but there could be some mathematical modeling involved too. How do you wireframe all the potential "moves" that one, two, or three (or more) fingers will have on an object? I am looking forward to it.
Great points all. It's really so intuitive to grasp and twirl, but showing that action in a wireframe…gearing the design to simple, reflexive interaction introduces whole new levels of complexity to the design process.
Yes, indeed! An interesting exercise might be to imagine, or find, a tool/object/widget with multitouch capabilities (modeled on something in iWork, perhaps) and then work backwards to create a wireframe of what that object's interaction would look like. I think I'll do that, actually…
Apple's redesigned touch-enabled iWork office suite may seem like an afterthought, but more than anything else on the iPad it's indicative of how we'll use computers in the future.
It's easy to write off iWork's inclusion on the iPad as a minor perk for business types only, but don't. The suite's fully redesigned touch interfaces actually reveal more about Apple's vision of the future of computing than any other element of the new tablet. Here's why.
I used each iWork app yesterday, and while I couldn't spend enough time with them to come to a definitive conclusion, they definitely surprised me. Text-input issues aside, each appeared more than capable of offering an experience similar to, if not much improved over, that of its desktop counterpart. And for that, all credit is due to multitouch.
In Pages, one of word processing's most arduous tasks -- formatting text cleanly and easily around graphical elements -- has been made orders of magnitude easier with touch. Once tapped, pictures and charts can be moved, resized, rotated and masked with finger swipes, pinches and twists, as the text instantly and naturally wraps around them. Once a graphical element is touched, a contextual box can be summoned to the surface with another tap, offering options unique to that element, such as its layering position, size, and the like. Again, my time with the app was brief, but the potential available once clicks and drags are replaced by our natural inclination to touch and interact with our fingers was immediately apparent.
Keynote provides a similar interface for composing presentation layouts, which are more graphically intensive and thus even better served by touch. Added to the mix is an intuitive way to rearrange slides individually or in batches with taps and swipes. And while spreadsheets may be the runt of the litter, one thing touch certainly improves is navigating to or selecting multiple cells in a document: tap, and you're there.
P.S. I don't know if you saw Avatar, but there was a really neat moment in which one of the scientists is looking at data on a huge (clear) screen; he reaches out, "grabs" it with his hand, and transfers it to a tablet-type device… it was awesome.