According to recent rumors and speculation, the iPhone 5 or iPhone 4S will come with Apple’s faster A5 chip, 1GB of RAM, an improved 8-megapixel camera and a Qualcomm Gobi baseband chip that supports both CDMA and GSM networks. But the next-generation iPhone’s biggest selling point will be a new software feature called Assistant, powered by Siri’s artificial intelligence and assistant technology and Nuance’s speech recognition.
Folks at 9to5Mac have given us some details about how the new Assistant feature would work:
One of the key elements of Assistant is the conversation view. The system will actually speak back and forth with the user to gain the most information in order to provide the best results. The user essentially can hold a conversation with their iPhone like it is another human being. For example, if a user is making a meeting with me, they will say “setup meeting with Mark” and the first “bubble” of the conversation thread will say that. After that, the system will speak back: “which e-mail address should Mark be notified at, work or personal?” This question will both be spoken out loud by the iPhone Assistant and shown as a new “bubble” in the conversation thread. The user will then respond with the email address they want to notify me at, and the appointment will be made. The iPhone will even show a quick glance at a calendar view to confirm the appointment. If the Assistant was sending an SMS, as another example, a mini SMS view would appear so the user has a quick glance at the SMS thread.
MacRumors now reports that they’ve received similar details from their sources and are confident that Apple will introduce the Assistant feature at the Let’s Talk iPhone event, and that it will be exclusively available on the iPhone 5/iPhone 4S.
They’ve gone a step further and commissioned Jan-Michael Cart, who has created a number of iOS concepts, to create a mockup of what the actual Assistant interface will look like based on the information they’ve received. Arnold Kim of MacRumors briefly explains how it will work:
After receiving spoken commands, the Assistant shows you back the recognized text and then takes the next step. This could involve sending a text message (with confirmation) or pulling data from Wolfram Alpha. The feature is said to be one of the major differentiators for the next generation iPhone.
You can check out the mockup video of the rumored iOS 5 Assistant feature below:
We can’t wait to check out the new iPhone’s personal assistant feature, which could be as revolutionary as the iPhone’s multi-touch interface and would give users another intuitive way to interact with the iPhone. We hope that Apple also provides developers with APIs to help them integrate their apps with the Assistant feature.
It looks like the day when we can tell our iPhone to give us directions to an empty parking spot in the area isn’t too far away, thanks to a combination of iOS 5’s Assistant feature, turn-by-turn navigation apps like Navigon’s MobileNavigator and TomTom, and apps like SFpark that allow iPhone users in San Francisco to find an empty parking spot. Voice recognition technology deeply integrated into iOS 5 will also make such use cases a lot safer, as users won’t have to focus on the iPhone’s screen while driving.
What do you think about the rumored Assistant feature? Please share your views in the comments below.