An in-depth look at iOS 7’s parallax effect

The volume knob in iOS 6 adjusted its reflection as the device was tilted, using data from the gyroscope and accelerometer. This sort of realism in UI was previously unheard of, but Apple had a lot more planned for future versions of iOS.

Apple’s vision for realism in interfaces finally arrived with iOS 7, where UI elements live in layers at different depths along the z-axis and react to motion differently, creating a parallax effect: objects closer to the eye appear to move faster than those farther away.

Macworld takes an in-depth look at this parallax effect:

Working in concert with a few additional sensors called accelerometers, the gyroscope allows the device to understand the changes in its relative position with a reasonably good level of accuracy.

Starting with a good estimate of a fixed initial position, which is given by the fact that most people will tend to raise the phone to eye level when they use it, iOS can use these inputs to determine the angle between the surface of the screen and our visual plane as they move relative to each other.

From there, the math required to provide the illusion of depth is fairly straightforward; all the software has to do is to organize its content in an arbitrary set of planes, and then move them relative to each other, based on their apparent distance from the eye. That results in a realistic depth-perception effect.
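The geometry described above can be sketched in a few lines of Swift. This is purely illustrative and not Apple’s actual implementation; the function name and the assumed 300-point viewing distance are made up for the example:

```swift
import Foundation

// Illustrative parallax math: for a given device tilt, a layer shifts
// on screen by an amount that shrinks as its depth behind the screen
// grows, so nearer layers appear to move faster than farther ones.
func parallaxOffset(tilt: Double, depth: Double) -> Double {
    let eyeDistance = 300.0  // assumed eye-to-screen distance, in points
    // A tilt of `tilt` radians shifts the line of sight by roughly
    // eyeDistance * tan(tilt); scale that by the layer's relative depth.
    return eyeDistance * tan(tilt) * (eyeDistance / (eyeDistance + depth))
}
```

For the same tilt, a layer at depth 0 (the frontmost plane) translates more than one 50 points deeper, which is exactly the cue that produces the illusion of depth.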

You’ll see parallax not just on the home screen or in Apple apps, but even in third-party applications, since Apple’s opening up those APIs as a part of the iOS 7 SDK.
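In the iOS 7 SDK, this is exposed through UIKit’s motion-effect API (`UIInterpolatingMotionEffect`). A minimal sketch of how a third-party app might add the effect to one of its views; the view, image name, and ±20-point offsets are illustrative assumptions:

```swift
import UIKit

// Hypothetical view to receive the parallax effect.
let icon = UIImageView(image: UIImage(named: "icon"))

// Shift the view horizontally as the device tilts left/right.
let horizontal = UIInterpolatingMotionEffect(keyPath: "center.x",
                                             type: .tiltAlongHorizontalAxis)
horizontal.minimumRelativeValue = -20
horizontal.maximumRelativeValue = 20

// Shift the view vertically as the device tilts forward/back.
let vertical = UIInterpolatingMotionEffect(keyPath: "center.y",
                                           type: .tiltAlongVerticalAxis)
vertical.minimumRelativeValue = -20
vertical.maximumRelativeValue = 20

// Group the two effects and attach them; UIKit drives the animation
// from the motion sensors automatically.
let group = UIMotionEffectGroup()
group.motionEffects = [horizontal, vertical]
icon.addMotionEffect(group)
```

Larger relative offsets make a layer read as closer to the eye, so stacking views with different offsets recreates the layered depth of the home screen.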

Apple clearly hasn’t given up on realism in interfaces (often called skeuomorphism) with iOS 7; it has simply moved on from fake leather and linen textures to 3D UI elements.

Tell us what you think about Apple’s choice in the comments below. Is parallax gimmicky, or does it subtly help the user understand the layered organisation of information?