iPhone: Not touchy feely

There’s an interesting trade-off presented by the iPhone. While the phone can do more, and its interface is fluid, in some ways it widens the gulf between human and computer. When you touch it, it doesn’t touch you back. That may prove to be a good thing. It may prove that what we think we need we don’t really need. The trade-offs may pay off.

Jason at 37signals.com makes a very interesting point about the iPhone, one of the points we will all ponder before we actually get our hands on it and find out how it will or won’t work for us.
Jason had this to say in his post:

“When you touch it, it doesn’t touch you back.
But we’ve certainly lost the tactile feedback humans are used to when dealing with things that are right in front of us. Now the connection is simulated. Rich textures have been replaced with androgynous glass.
How can you dial the iPhone without looking at it? How can you reach in your pocket and press “1” for voicemail? How can you orient yourself with the interface without seeing it? With a traditional phone or device with buttons you can feel your way around it. You can find the bumps, the humps, the cut lines, the shapes, the sizes. You can find your way around in the dark. Not with the iPhone.
I don’t know if this is better or worse. We won’t know until we try it (and oh man I can’t wait to try it). I just think it’s really interesting. It’s a pretty big deal. The implications are far reaching. The iPhone demands your attention; it forces you to look at it. We’re lucky it’s beautiful.”

David Pogue, who reviewed the iPhone for the NY Times, did mention that typing on the iPhone was difficult precisely because one doesn’t get that tactile feedback, and that he made a lot of mistakes.

It’ll be interesting to see if Apple makes any effort to make the iPhone accessible to blind people; that might address some of the issues raised here, but I think in general the iPhone will just force people to let go of their old tactile habits and focus on the entirely visual.

Voice recognition is one solution to these problems, though I am not entirely convinced it is a viable option, as I haven’t been comfortable talking into these devices so far. Maybe, as James Wheare commented on the post, “Apple can always simulate tactility where they eliminate it. The iPod’s scroll wheel clicks by default when you turn it even though it’s a sensor. The Mighty Mouse also has clicky buttons even though it operates with a pressure sensor. It’s a question of separating affordances from operational mechanics.”

The other way to look at this is to ask how often we actually dial without looking at the phone, say while driving. To be honest, I don’t, so maybe it ain’t that bad after all. I guess the world will be a safer place: if you’re doing something that requires so much of your attention that you can’t look down at your phone long enough to dial it, then you probably shouldn’t be dialing it in the first place.

This just seems like something too obvious for Apple to overlook. Let me know your thoughts.

Source