TUAW had a post on a new city-guide-style website targeted at iPhone and iPod Touch users, Schmap. Schmap does the basic city guide stuff quite well, I might add, but the thing that got my attention was the interface.
When you view information about a location while holding the device upright, you get the typical details such as phone number, address, etc. However, when you turn the device 90° into a landscape orientation, the website switches into a split-screen display, with a map of the location on the left and information on the right. Mobile Safari can feed data on device orientation to the website you’re viewing, and this is the first developer I’ve seen really use this information in a big way.
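I don’t know exactly how Schmap wires this up, but a minimal sketch of the general technique in Mobile Safari might look like the following. The element class names and layout modes here are my own invented placeholders; the real parts are `window.orientation` (which reports 0 or 180 in portrait, 90 or -90 in landscape) and the `orientationchange` event.

```javascript
// Map the orientation angle reported by Mobile Safari to a layout mode.
// 0 / 180 = portrait, 90 / -90 = landscape.
function layoutFor(orientationAngle) {
  return Math.abs(orientationAngle) === 90 ? "landscape-split" : "portrait-list";
}

// In the page itself, one would hook the event roughly like this
// (class names are hypothetical; CSS would do the actual split-screen):
//
// window.addEventListener("orientationchange", function () {
//   document.body.className = layoutFor(window.orientation);
// });
```

The heavy lifting is then done in CSS: one class shows the single-column detail view, the other shows the two-pane map-plus-info layout.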
This got me thinking about the future of device interfaces. We’re no longer just looking at button presses for application interaction. We’ve got so many more forms of input at our disposal now: device orientation, touch, multi-touch, sound, ambient light, geographic location, acceleration… and these aren’t limited to the iPhone/Touch. MacBook Pros have all of these senses too, even though they’re not all natively reported by the operating system, yet.
I think that in the very near future we will be flooded with devices that have new and natural forms of input. I’m still amused that when I’m trying a new application on my phone and can’t figure out how to do something, shaking the device is actually a logical thing to try.