Apple was granted an interesting patent this week that shows how the company plans to rival the sheer amount of personal information Google holds about its users.
The patent, titled “Journaling on mobile devices,” describes a system by which Apple would use the wide variety of sensors embedded in iOS devices to log events and their associated metadata to the cloud. An event could be a phone call, an app download, a network toggle or even a customised event registered by a third-party app.
Here’s what all this collected information will be used for:
The information collected by the journaling subsystem can be stored with timestamps and location information (e.g., latitude, longitude, altitude) in a journal database. [The database can then be] queried by the user or a program using a variety of queries, such as time span queries, event queries, etc.
A web service, or a local application program, or both, can be provided that allows users to manage the journal database, such as defining categories for the journal, specifying desired location based services (LBS) or editing journal entries. In some implementations, the journal database can be shared with others and used to target location based services and the delivery of content to user.
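The patent doesn’t publish any code, of course, but the journal database it describes — timestamped, geotagged events answering “time span queries, event queries, etc.” — is simple enough to sketch. Here’s a minimal illustration in Python using SQLite; every table, column and function name here is our own invention, not Apple’s:

```python
import sqlite3

# Hypothetical sketch of the patent's journal database: events stored
# with timestamps and location info, queryable by time span or type.
def create_journal(path=":memory:"):
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE journal (
        event_type TEXT,      -- e.g. 'phone_call', 'app_download'
        timestamp  REAL,      -- Unix epoch seconds
        latitude   REAL,
        longitude  REAL,
        altitude   REAL,
        metadata   TEXT)""")
    return db

def log_event(db, event_type, timestamp, lat, lon, alt=0.0, metadata=""):
    """Store one journal entry with its timestamp and location."""
    db.execute("INSERT INTO journal VALUES (?, ?, ?, ?, ?, ?)",
               (event_type, timestamp, lat, lon, alt, metadata))

def time_span_query(db, start, end):
    """All events whose timestamp falls within [start, end]."""
    return db.execute("SELECT event_type, metadata FROM journal "
                      "WHERE timestamp BETWEEN ? AND ? ORDER BY timestamp",
                      (start, end)).fetchall()

def event_query(db, event_type):
    """All events of one type, newest first."""
    return db.execute("SELECT timestamp, metadata FROM journal "
                      "WHERE event_type = ? ORDER BY timestamp DESC",
                      (event_type,)).fetchall()
```

A Siri-style question like “what song was I listening to yesterday?” reduces, under this sketch, to a time-span query filtered by event type — which is presumably why the patent calls out those two query families by name.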
The implications of such a system would be huge for Siri. This data, combined with Siri’s natural-language understanding, would tremendously increase its utility, which is currently constrained by the scant personal data Apple holds about its users. Google, on the other hand, already takes advantage of the huge amounts of personal data it has learnt about its users over the years across many of its apps.
If this system does work in conjunction with Siri, you could construct queries like “What song was I listening to yesterday when I was in the subway?” or “Reopen all the webpages I had open when I left home this morning” and get usable results.
Apple also talks about opening up this journaling system to developers via an API, so that diverse event information can flow into the journaling database:
The journaling subsystem can be accessed by client applications through an Application Programming Interface (API). The applications can identify events and send related event data through program calls to the journaling subsystem to be stored and indexed either locally on the mobile device or on a network. In some applications, the user can specify the event data to be collected and the journaling subsystem automatically collects the specified information when the journaling subsystem is activated by the user or another application. The journaling subsystem can be activated based on a schedule or in response to a trigger event.
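Apple’s actual API is not public, so the shape below is pure guesswork on our part. Still, the patent’s description — client apps making program calls to send event data, with collection happening only when the subsystem has been activated by the user, another application, a schedule or a trigger — suggests something like this hypothetical Python sketch (all class and method names are ours):

```python
import time

class JournalingSubsystem:
    """Hypothetical sketch of the patent's journaling API: client apps
    register events via program calls; collection only happens while
    the subsystem is activated."""

    def __init__(self):
        self.active = False
        self.entries = []

    def activate(self):
        # Per the patent, activation may come from the user, another
        # application, a schedule, or a trigger event.
        self.active = True

    def deactivate(self):
        self.active = False

    def log_event(self, event_type, **event_data):
        """The program call a client app would make to journal an event.
        Returns True if the event was recorded, False if journaling is off."""
        if not self.active:
            return False
        self.entries.append({"type": event_type,
                             "timestamp": time.time(),
                             **event_data})
        return True
```

The interesting design point is the activation gate: third-party apps can fire events all they like, but nothing lands in the journal unless journaling has been switched on — which is exactly the kind of opt-in control we argue for below.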
As you might have realised by reading through this patent description, Apple is venturing into Google’s territory of search, data harvesting and data mining. While that may be good for features, it also brings privacy concerns, as we saw during Apple’s infamous Locationgate incident, and all the more so if Apple opens the system up to developers.
(Apple also keeps the language in the application broad enough to include advertising, another space Google specialises in.)
We think Apple needs to take these steps to learn as much about its users as possible, since that data holds the key to new features as we move into the “age of context,” as Robert Scoble likes to call it. Apple can pull off building such features into its products as long as it’s upfront about the privacy tradeoffs involved and lets users opt in or out as they wish.
As with every patent, this might never see the light of day, but we’re pretty excited to know that engineers at Apple are at least envisioning something this ambitious for future versions of iOS.
Here’s the entire patent, in case you’re interested.