You Can Even Finger Paint With iOS 11 and ARKit


We’ve put together a collection of demonstrations showcasing the power of iOS 11 and ARKit, and since then we’ve seen the software used to enhance walking navigation instructions and to show what a popular A-Ha music video looks like in augmented reality.

But for those who want to get really creative when ARKit launches, there will be at least one way to do that. The latest demonstration, originally published on the site Toptal, shows what it will look like to finger paint in augmented reality and then turn those creations into digital 3D models.

Thanks to the Vision framework in iOS 11, combined with ARKit, the software makes it feel as though the person creating the art is holding a pen, with the digital ink rendered as realistically as possible. Once the drawing is finished, the creator can “pull” it out into 3D space, effectively turning it into a 3D object.

“One of the cool libraries that Apple introduced in iOS 11 is Vision Framework. It provides some computer vision techniques in a pretty handy and efficient way. In particular, we are going to use the object tracking technique. Object tracking works as follows: First, we provide it with an image and coordinates of a square within the image boundaries for the object we want to track. After that we call some function to initialize tracking. Finally, we feed in a new image in which the position of that object changed and the analysis result of the previous operation. Given that, it will return for us the object’s new location.”
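For context, a minimal sketch of that object-tracking loop in Swift might look like the following. The class name ThumbTracker and the surrounding structure are assumptions for illustration, not the Toptal author’s actual code; only the Vision types it uses (VNDetectedObjectObservation, VNTrackObjectRequest, VNSequenceRequestHandler) come from the framework described in the quote.

```swift
import Vision
import CoreGraphics

// Hypothetical helper illustrating the tracking flow described above.
final class ThumbTracker {
    // VNSequenceRequestHandler keeps state between frames, which is what
    // lets Vision follow an observation from one camera image to the next.
    private let sequenceHandler = VNSequenceRequestHandler()
    private var lastObservation: VNDetectedObjectObservation?

    // Step 1: seed the tracker with a bounding box (in normalized 0–1
    // coordinates) around the object to track in the first frame.
    func startTracking(region: CGRect) {
        lastObservation = VNDetectedObjectObservation(boundingBox: region)
    }

    // Steps 2–3: feed each new camera frame together with the previous
    // observation; Vision returns the object's new location.
    func track(in pixelBuffer: CVPixelBuffer) -> CGRect? {
        guard let observation = lastObservation else { return nil }

        let request = VNTrackObjectRequest(detectedObjectObservation: observation)
        request.trackingLevel = .accurate

        do {
            try sequenceHandler.perform([request], on: pixelBuffer)
        } catch {
            print("Tracking failed: \(error)")
            return nil
        }

        guard let newObservation = request.results?.first
                as? VNDetectedObjectObservation else { return nil }

        // Remember this result so the next frame can build on it.
        lastObservation = newObservation
        return newObservation.boundingBox
    }
}
```

Each returned bounding box gives the tracked object’s position in that frame, which in a demo like this one would presumably be projected into the ARKit scene to leave the trail of digital ink behind the user’s finger.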

You can find the full write-up of how the demonstration was put together through the source link below, and you can watch it in action in the video as well.

Are you looking forward to ARKit’s arrival?

[via 9to5Mac; Toptal]

