Augmented Reality With ARKit: Feature Points and Horizontal Plane Detection

By now, you may have used an augmented reality app on your iPhone, with virtual objects that appear lifelike and blend in well with the features of the environment. In this series, you’ll learn how to bring this to your own iOS app, using ambient light detection and horizontal plane detection to improve your augmented reality experience. This tutorial focuses on displaying horizontal planes and feature points in ARKit.

Often, when using augmented reality, you want to place your virtual object on a flat surface such as a table, a desk, or even the ground. In order to do this accurately, it’s important to detect these surfaces before you place anything. Once a plane is detected, ARKit uses a series of dots (feature points) to anchor your virtual object to it, and the object remains there even as you move your device around.

For this tutorial, you’ll need some basic knowledge of Swift, Xcode, and ARKit. If you haven’t used ARKit in the past, you may find it helpful to read my previous tutorial, which is specifically aimed at helping beginners make their first augmented reality application:

  • Augmented Reality
    Code Your First Augmented Reality App With ARKit
    Vardhan Agrawal

Getting Started

Xcode Version

Before we begin, make sure you have the latest version of Xcode installed on your Mac. This is very important because ARKit is only available in Xcode 9 or newer. You can check your version by opening Xcode and choosing Xcode > About Xcode from the menu bar.

If your version of Xcode is older than Xcode 9, you can go to the Mac App Store and update it for free. If you don’t already have Xcode, you can also download and install it for free.

Sample Project

New Project

After you’ve made sure you have the right version of Xcode, you’ll need to make a new Xcode project. 

Go ahead and open Xcode and click Create a new Xcode project.

Figure 1 Create an Xcode Project

You may be used to making a Single View Application, but for this tutorial, you will need to choose an Augmented Reality App. Then click Next.

Figure 2 Select Augmented Reality Template

Gaming Frameworks

You can name your project anything you like, but I will be naming mine Plane Detection. You will also notice that there is an option at the bottom where you can select from SceneKit, SpriteKit, and Metal. 

These are all Apple’s gaming frameworks, and for the purposes of this tutorial, we’ll be using SceneKit because we’ll be working with 3D objects.

Go ahead and select SceneKit if it isn’t already selected. Your screen should look something like this:

Figure 3 Name your Project

Preparing to Test

Connecting an iPhone

Since the Xcode Simulator doesn’t have a camera, you’ll need to plug in your iPhone. Unfortunately, if you don’t have an iPhone, you’ll need to borrow one to be able to follow along with this tutorial (and for any other camera-related apps). If you already have an iPhone connected to Xcode, you can skip ahead to the next step.

A nifty new feature in Xcode 9 is that you can wirelessly debug your app on a device, so let’s take the time to set that up now:

In the top menu bar, choose Window > Devices and Simulators. In the window that appears, make sure that Devices is selected at the top.

Now, plug in your device using a Lightning cable. This should make your device appear in the left pane of the Devices and Simulators window. Simply click your device, and check the Connect via Network box.

Figure 4 Devices and Simulators

You will now be able to wirelessly debug on this iPhone for all future apps.

Complete Setup

Now your setup is complete. You should have a working ARKit app, and you can test it on the iPhone that you just connected. In the upper left of Xcode, next to the Run and Stop buttons, select your device from the device dropdown (the same menu that lists the simulators). I’ve selected Vardhan’s iPhone, but you need to select your specific device.

Now you’re done creating your starter project, and you should see a virtual spaceship appear in your world once you click Run. Here’s what it should look like:

Figure 5 Run the Sample App

In Theory

Before we actually begin programming this app, it’s important to understand how ARKit actually detects these planes. In this tutorial, we’ll explore two main concepts: feature points and horizontal planes. In short, augmented reality on your iPhone works with a process called Visual Inertial Odometry (VIO), which takes the data from your cameras and internal sensors to gain a 3D understanding of a scene.

Feature Points

So, what exactly is a feature point? Every image naturally has its own unique features. A flower, for example, has distinctive shapes, and a carpet has distinctive textures.

These points represent notable features detected in the camera image. Their positions in 3D world coordinate space are extrapolated as part of the image analysis that ARKit performs in order to accurately track the device’s position, orientation, and movement. Taken together, these points loosely correlate to the contours of real-world objects in view of the camera. — Apple

As Apple’s documentation explains, feature points help your device (and ARKit) get a sense of the depth and “realism” of your world, making augmented reality much more accurate. They are also used to give your virtual objects something to anchor to, so the objects know where to remain as you move your device around.

Horizontal Planes

Similar to feature points, horizontal planes help your app get a sense of its surroundings. Not surprisingly, feature points and horizontal planes are very closely coupled in that these planes couldn’t be detected without feature points. 

Using your iPhone’s built-in sensors, the camera (of course), and a combination of these feature points, ARKit can detect various planes in your scene. These calculations and estimations are performed every frame, and multiple planes can be detected at once.

When you run a world-tracking AR session whose planeDetection option is enabled, the session automatically adds to its list of anchors an ARPlaneAnchor object for each flat surface ARKit detects with the back-facing camera. Each plane anchor provides information about the estimated position and shape of the surface. — Apple

In Code

Great! You’ve now got yourself a working ARKit app. Your goal for this app is to detect a horizontal plane and visualize it, along with the feature points (the virtual dots that ARKit places in the scene).

Starter Code

You now have a solid understanding of what feature points and horizontal planes are, and you’re now ready to start programming them into an app. If you want more background on feature points and horizontal planes, I would recommend reading Apple’s Documentation.

Preparing Your Project

If you open your ViewController.swift file, you’ll notice that Apple has some starter code already set up for you (which renders a spaceship). We won’t be covering what everything means in this tutorial, but if you’d like to have a line-by-line explanation, feel free to check out my other tutorial on ARKit.

  • Augmented Reality
    Code Your First Augmented Reality App With ARKit

Since a lot of this code is boilerplate, we’ll keep it, but the spaceship has to go, so let’s remove the code that handles its placement. In your ViewController.swift file, within your viewDidLoad() method, remove the following lines of code:
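
In the stock Augmented Reality App template, the spaceship comes from the lines that load the ship.scn asset and assign it to the scene view. Depending on your Xcode version they may differ slightly, but they should look something like this:

    // Create a new scene containing the sample spaceship
    let scene = SCNScene(named: "art.scnassets/ship.scn")!

    // Set the scene to the view
    sceneView.scene = scene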

Feature Points

Enabling Debugging Options

Your Xcode project is now ready to work on. Our first step is to add visible feature points. You probably won’t make them visible in production apps, but they’re a wonderful feature for debugging augmented reality apps.

This step is pretty simple, and it can be done in one line. In your viewDidLoad() method, add the following line of code:
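
That line sets the scene view’s debugOptions property. Using the showFeaturePoints option from ARSCNDebugOptions, it should look something like this:

    // Draw the feature points ARKit detects as it analyzes the scene
    sceneView.debugOptions = [ARSCNDebugOptions.showFeaturePoints]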

This line configures the debug options with an array of options. You can enable other debugging features by adding them to this array, but for now, we just need to display our feature points. Here’s what it should look like when you run your app:

Figure 6 Feature Points

As you can see, all the detected feature points are visible. If you don’t see them, try moving your iPhone around, and you should see yellow dots appear. Part of the reason why they don’t appear immediately is that ARKit is still detecting the scene.

Configuration

Now, we’re ready for plane detection. If you take a look at your viewWillAppear() method, you’ll see the following two lines of code. Let’s take a moment to learn what they mean as they will be relevant for plane detection:
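
In the template, they look something like this (the configuration constant name comes from the template itself):

    // Create a session configuration
    let configuration = ARWorldTrackingConfiguration()

    // Run the view's session
    sceneView.session.run(configuration)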

Here, an instance of the ARWorldTrackingConfiguration class is created. This class is responsible for tracking the position and orientation of your device in the real world. After that, we run the configuration on our current sceneView session. If you’re interested in learning more about this, you can visit Apple’s Documentation about it. For now, though, you’re ready to continue to the next step.

Now, let’s enable plane detection on our ARWorldTrackingConfiguration. Below the line which creates the configuration, you’ll need to insert the following line:
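
Using the configuration constant from above, the line looks like this:

    // Ask ARKit to detect horizontal planes in the scene
    configuration.planeDetection = .horizontal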

Excellent! Now, horizontal plane detection is enabled, but we can’t see it because it’s happening under the hood, meaning that the iPhone knows where the plane is, but we can’t see what it thinks.

Plane Detection

Checking for Anchors

Before we can start visualizing our detected planes, we’ll need to find out when and where the planes are being detected. This can be easily done with a delegate method which is provided to us by Apple.

Start off by declaring a method from the ARSCNViewDelegate protocol; as you’ll see, the ViewController class already conforms to this delegate in the starter code. Paste the following delegate method into your ViewController class:
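
The method in question is renderer(_:didAdd:for:), and an empty implementation looks like this:

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        // We'll visualize detected planes here.
    }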

This method is called when a new ARAnchor is added to the scene along with a corresponding node; in our case, these anchors will be our detected horizontal planes.

Tells the delegate that a SceneKit node corresponding to a new AR anchor has been added to the scene. — Apple

Plane Geometry

For most apps, though, you’ll want the user to be able to tell where your app thinks it spots horizontal planes, so we’ll be learning how to actually display these planes to the user, using the node and the anchor provided in the delegate method we just implemented.

Our first question is: is the ARAnchor really a plane, or is it something we don’t want? We can check this using type-casting and optional binding. Put this line of code into your delegate method:
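
A minimal sketch of that check, shadowing the anchor parameter so the code that follows can keep referring to it as anchor:

    // Proceed only if the new anchor is a detected plane
    if let anchor = anchor as? ARPlaneAnchor {
        // The plane visualization code from the next steps goes here.
    }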

You may have noticed that the anchor parameter has the type ARAnchor. If we want to know whether this anchor is a plane, we can attempt to type-cast it as an ARPlaneAnchor. If that succeeds, we know that a plane has been detected.

Inside the optional binding statement, let’s add code to create an SCNPlane with the dimensions of the anchor we received from the delegate method. Write out the following two lines of code:
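
A sketch of those two lines (plane is just a readable constant name; the anchor’s extent values are Floats, so they are converted to CGFloat for SCNPlane):

    // Create a plane geometry matching the anchor's estimated width and length
    let plane = SCNPlane(width: CGFloat(anchor.extent.x), height: CGFloat(anchor.extent.z))

    // Give the plane a translucent red material so it's visible in the scene
    plane.firstMaterial?.diffuse.contents = UIColor.red.withAlphaComponent(0.5)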

The first line here creates a two-dimensional SCNPlane, which takes two parameters in its constructor: a width and a height. We pass in anchor.extent.x and anchor.extent.z, which, as their names suggest, represent the width and length of the detected plane. Notice that we’ve omitted the y-value of the anchor’s extent: the plane we’re detecting is flat, and the x and z axes run along its surface.

The next line of code applies a translucent red hue to these detected planes, and you can see through these planes because the alpha value is set to 0.5. Of course, you can use any colors or appearances you wish, as long as you’ll be able to see them.

Creating the Node

We’re not done yet; we still need to create the node to add to our view. Though our plane is a two-dimensional object, we still need to represent it in 3D with x, y, and z coordinates. These coordinates are relative to the node ARKit created for the detected plane anchor, not to the sceneView. Enter the following lines of code:
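
A sketch of those lines, matching the description below (planeNode is just a readable name for the new node):

    // Wrap the plane geometry in a node
    let planeNode = SCNNode(geometry: plane)

    // Center the node on the plane anchor; y stays at 0
    planeNode.position = SCNVector3(x: anchor.center.x, y: 0, z: anchor.center.z)

    // Rotate the plane so it lies flat instead of standing upright
    planeNode.eulerAngles.x = -Float.pi / 2

    // Attach it to the node ARKit created for this anchor
    node.addChildNode(planeNode)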

We first create an SCNNode with the plane geometry we created in the previous step, by passing the geometry as a parameter to SCNNode’s constructor.

Next, we position the plane node directly at the center of the plane anchor, using an SCNVector3 representation of x, y, and z coordinates in three-dimensional space.

Also, in the next line, you’ll see that the x value of our eulerAngles is being set to -π/2. The property eulerAngles represents the pitch, yaw, and roll of the SCNNode. According to Apple’s documentation, the x value represents the pitch (or rotation about the x-axis), and we want ours to be flat against the plane (not sticking upright).

Last, the newly created node is added as a child of the node ARKit created for the detected plane, so that it’s visible to the user at the plane’s location.

Conclusion

Great job! You should now be able to see horizontal planes (and, of course, the feature points) when you run your application on your iPhone. If you don’t, make sure you move your phone around slowly for it to scan the surface, and try increasing the alpha of the color we set earlier. Here’s what mine looks like:

Figure 7 Horizontal Planes and Feature Points

You now know how to detect flat surfaces such as tables or desks and display them to the user as horizontal planes. In addition to this, you’ve been able to see how ARKit views the world with feature points. While you’re here, be sure to check out some of our other ARKit courses and tutorials here on Envato Tuts+!

  • Augmented Reality
    Code Your First Augmented Reality App With ARKit
    Vardhan Agrawal
  • iOS
    Get Started With Augmented Reality for iOS
    Markus Mühlberger
  • Augmented Reality
    Code a Measuring App With ARKit: Objects and Shadows
    Vardhan Agrawal
