Reality Composer helps anyone create AR experiences and produce new content. Once a scene is finished, you can integrate it into an app with Xcode or export it to AR Quick Look. The app lets you create animations and interactions that enhance your 3D content.
Reality Composer Features and Highlights
Built-in AR Library
Import your own USDZ files or take advantage of the hundreds of ready-to-use virtual objects in the built-in AR library. This library harnesses the power of procedural content generation for a variety of assets, so you can customize a virtual object’s size, style, and more.
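At runtime, a USDZ asset like those in the library can be loaded with RealityKit. A minimal sketch, assuming a file named `chair.usdz` is bundled with the app (the asset name is hypothetical):

```swift
import RealityKit

// Load a bundled USDZ model; loadModel(named:) throws
// if the file is missing or cannot be parsed.
let chair = try ModelEntity.loadModel(named: "chair")

// The library's procedural assets let you adjust size in the app;
// the code-level equivalent is scaling the loaded entity.
chair.scale = [0.5, 0.5, 0.5]
```

Scaling to half size here is purely illustrative; any uniform or per-axis scale works.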
Animations and Audio
Add animations that let you move, scale, and add emphasis like a “wiggle” or “spin” to virtual objects. You can choose for actions to happen when a user taps an object, comes into close proximity with it, or activates some other trigger. You can also take advantage of spatial audio to add a new level of realism to your AR scene.
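Comparable behaviors can also be wired up in code with RealityKit's gesture, animation, and audio APIs. A sketch under the assumption that `arView` is an existing `ARView` and `model` a loaded `ModelEntity` (the audio file name "chime" is hypothetical):

```swift
import RealityKit
import UIKit

func addInteractions(to model: ModelEntity, in arView: ARView) throws {
    // Collision shapes are required before gestures can hit-test the entity.
    model.generateCollisionShapes(recursive: true)

    // Let the user drag and rotate the object directly.
    arView.installGestures([.translation, .rotation], for: model)

    // Animate the entity 10 cm upward over one second,
    // similar to a "move" behavior authored in Reality Composer.
    var target = model.transform
    target.translation.y += 0.1
    model.move(to: target, relativeTo: model.parent, duration: 1.0)

    // Spatial audio attached to the entity, so the sound
    // appears to come from the object's position.
    let chime = try AudioFileResource.load(named: "chime")
    model.playAudio(chime)
}
```

Because the audio resource is played on the entity itself, RealityKit spatializes it automatically as the object or camera moves.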
Record and Play
With Reality Composer for iOS, you can record sensor and camera data in the location where the AR experience will take place, then replay it later on your iOS device while building your app.
Powered by RealityKit
Utilizing the latest Metal features to get the most out of the GPU, RealityKit takes full advantage of CPU caches and multiple cores to deliver incredibly fluid visuals and physics simulations. And because it automatically scales the performance of an AR experience to each iOS device, you only need to build a single AR experience.
Easy to use yet incredibly powerful, RealityKit exposes its full feature set through Swift’s rich language features, so you can build AR experiences quickly and without boilerplate code. Feel free to check out Apple’s Reality Composer website for more details.
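As an illustration of how little setup a basic scene needs, here is a minimal RealityKit sketch that anchors a box to a horizontal plane (entity names and sizes are illustrative):

```swift
import RealityKit

// Create a view and anchor a small cube to the first horizontal plane found.
let arView = ARView(frame: .zero)
let anchor = AnchorEntity(plane: .horizontal)

let box = ModelEntity(
    mesh: .generateBox(size: 0.1),  // 10 cm cube
    materials: [SimpleMaterial(color: .blue, isMetallic: false)]
)

anchor.addChild(box)
arView.scene.addAnchor(anchor)
```

Plane detection, anchoring, rendering, and lighting all come from the framework; the app code only declares what belongs in the scene.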