ARKit (Part 1) — First Experience

Introduction to ARKit

Augmented Reality (AR) is a technology that computes the position and orientation of a camera in real time and overlays images, videos, or 3D models onto the camera feed. The goal is to superimpose the virtual world onto the real world on screen and enable interaction between the two. Apple introduced ARKit at WWDC 2017, making it much easier for developers to implement AR features. This post walks through building a simple demo to experience ARKit firsthand.

A few things to note: ARKit requires iOS 11 or later and Xcode 9. There are also hardware requirements: the device must have an A9 processor or newer (iPhone SE, iPhone 6s / 6s Plus and later, or an iPad released in 2017 or later).

Creating the Project

Create a new Xcode project and select the Augmented Reality App template. After filling in the project information, confirm that the Content Technology option is set to SceneKit. SceneKit renders 3D graphics on iOS devices and provides the 3D content rendering for ARKit; SpriteKit is the counterpart for rendering 2D graphics.

Build and run the project. On first launch it will prompt for camera permission. Once granted, you should see a small airplane on screen. Simple, right? Being an iOS developer can be quite a joy sometimes (I must have forgotten the pain of adapting for iPhone X). This is Apple's built-in sample app; we will now build our own demo step by step. Behind the scenes, the template starts an `ARSession` on the `ARSCNView` when the view appears.

Drawing a Cube

Next, let's use SceneKit to draw a 3D cube. An `SCNScene` is a container for all 3D content; it can hold multiple 3D objects at different positions, angles, and sizes. To add content to the virtual scene, you first create a geometry, wrap it in a node, and add the node to the scene. Replace the template's sample scene with code that creates a cube node. Coordinates in ARKit roughly correspond to meters, so a small box geometry yields a correspondingly small cube in the room.
The coordinate system shared by ARKit and SceneKit works as follows: the camera faces the negative Z direction, which is why the cube is placed at Z = -0.5. When the ARSession starts, the computed camera position is initialized to (0, 0, 0). Run the project now and you should see a cube floating in the air.

The edges of the cube are not very distinct because no lighting has been added. Enabling the scene view's automatic default lighting (`sceneView.autoenablesDefaultLighting = true`) makes the cube much easier to recognize. Try walking around to observe it from different angles.

References: Apple ARKit by Example

Code: All code from this article can be found on my GitHub: .