Check out the other tutorials that are part of this series.
We’ve blown minds at networking events and gotten rid of work jitters at the range. Through those adventures, and some hard-hitting Googling, I found a few foundational tools needed to build an augmented reality application for iOS devices using Unity’s AR Foundation. With the release of ARKit 3, we have even better ways to build our groundbreaking games. In this tutorial, I’ll go over those foundational tools for Image Tracking to get your thrusters going so you can blast off into the ARverse.
The complete Unity project for this tutorial is available here. All images and models used in this tutorial have been created by me.
This tutorial was implemented in:

- Xcode 10.1
- Unity 2019.3.0a6 Personal

In it, we’ll cover:

- Setup Unity Project
- Install AR Foundation & ARKit Package
- AR Game Configuration
- Image Tracking
*Building to Android with ARCore is outside the scope of this tutorial; we encourage you to research how to do so if you would like to.
Setup Unity Project
In my Create an AR Business Card tutorial, I had you download the ARKit 2.0 plugin from Bitbucket because the plugin had been deprecated from the Asset Store. This means that new purchases of the package are not allowed, and only users who already purchased or downloaded the package before it was deprecated are allowed to download it. This is where Unity’s AR Foundation package comes into play: it provides a more efficient AR ecosystem for cross-platform builds.
To get started, open Unity and click New to create a new project. Let’s title it ARFundamentals. Select where you would like to save your project, and in the Template dropdown select 3D.
With your project open, navigate to Build Settings from the main menu: File > Build Settings. In the dialog box that appears, select iOS from the list of platforms on the left and click Switch Platform at the bottom of the window. Switching platforms sets the build target of our current project to iOS, which means that when we build our project, Unity will create an Xcode project.
Next, you’ll need to enter the bundle identifier for your game. A bundle identifier is a string that identifies an app. It’s written in reverse-DNS style, following the format com.yourCompanyName.yourGameName. Allowed characters are alphanumeric characters, periods, and hyphens. You’ll need to change the bundle identifier from the default setting in order to build to your desired platform.
It is important to note that once you have registered a bundle identifier to a Personal Team in Xcode, the same bundle identifier cannot be registered to another Apple Developer Program team in the future. This means that while you are testing your game using a free Apple ID and a Personal Team, you should choose a bundle identifier that is for testing only – you won’t be able to use the same bundle identifier to release the game. The way some developers choose to do this is to add “Test” to the end of whatever bundle identifier they were going to use – for example, com.yourCompanyName.yourGameNameTest.
With that said, to change the bundle identifier, open the Player Settings in the Inspector panel by going to Edit in the main menu: Edit > Project Settings > Player. Expand the section at the bottom called Other Settings and enter your bundle identifier in the Bundle Identifier field.
You’ll also need to turn on the Requires ARKit support checkbox. Note that when you do this, Unity fills in the Camera Usage Description field with the message “Required for augmented reality support.” This can be any value you want; it just cannot be blank. You’ll also notice a warning that iOS 11.0 or newer is required for ARKit support, so you’ll need to increase the version number in the Target minimum iOS Version field.
The final change you’ll have to make is to the architecture. ARKit only works with ARM64 processors, so you’ll need to select ARM64 from the Architecture drop-down menu.
Install AR Foundation & ARKit Package
Once you have completely set up your project for an iOS game, you will need to install the AR Foundation and ARKit packages. To do so, open Unity’s Package Manager by clicking Window > Package Manager. In the dialog box that appears, the AR Foundation and ARKit packages are not readily available. To see all packages available for install, click the Advanced button; this opens a drop-down menu that allows you to show preview packages. Preview packages are not verified to work with Unity and might be unstable, and they are not supported in production environments.
Select the AR Foundation package from the list of packages. The package information appears in the details pane. In the top right-hand corner of that pane, select the version to install (in our case, version 1.0 preview 27) and click the Install button. Do the same for the ARKit Plugin.
ARFoundation 2.2 provides interfaces for ARKit 3 features, but only Unity’s ARKit XR Plugin 2.2 package contains support for these features and requires Xcode 11 beta and iOS 13 beta. Unity’s ARKit XR Plugin 2.2 is not backward compatible with previous versions of Xcode or iOS. Unity’s ARKit XR Plugin 2.1 will work with the latest ARFoundation (it just doesn’t implement the ARKit 3 features).
While Xcode 11 & iOS 13 are in beta, we will continue to maintain both the 2.2 and 2.1 versions of the packages. The same is also true for Unity’s ARKit Face Tracking package 1.1: it requires Xcode 11 beta and iOS 13 beta. This distinction is temporary. Once iOS 13 is no longer in beta, the ARKit package is expected to work with all versions of Xcode 9+ and iOS 11+.
AR Game Configuration
Now, if you right-click in your Scene Hierarchy, you will notice a new section called XR in the menu that appears. From that menu, add an AR Session. The AR Session controls the lifecycle of an AR experience, enabling or disabling AR on the target platform, and it can live on any GameObject.
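To make the lifecycle idea concrete, here is a minimal sketch (assuming the ARFoundation 2.x C# API) of pausing and resuming the AR experience from a script by toggling the ARSession component; the class and field names here are my own illustrative choices:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative helper: toggles the AR session at runtime.
public class SessionToggle : MonoBehaviour
{
    [SerializeField] ARSession session; // assign the AR Session object in the Inspector

    // Disabling the ARSession component pauses the session;
    // re-enabling it resumes the AR experience.
    public void SetSessionActive(bool active)
    {
        session.enabled = active;
    }
}
```

You might hook this up to a UI button to pause AR while a menu is open.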
Next, you’ll add an AR Session Origin object by right-clicking in the Scene Hierarchy and navigating to XR > AR Session Origin. The AR Session Origin object transforms trackable features (such as planar surfaces and feature points) into their final position, orientation, and scale in the Unity scene. Because AR devices provide their data in “session space”, an unscaled space relative to the beginning of the AR session, the AR Session Origin performs the appropriate transformation into Unity space. Notice that the AR Session Origin object has its own AR Camera, so you no longer need the Main Camera that was included when you created your project. To remove it, right-click on it and select Delete.
Go ahead and expand the AR Session Origin in the Hierarchy, select the AR Camera nested under it, and in the Inspector pane tag it as MainCamera.
In my first tutorial, Create an AR Business Card, I implemented the logic of the XR Image Tracking Subsystem. This subsystem attempts to detect two-dimensional images in the environment that have previously been stored in a library of reference images. Such a two-dimensional image is called a Reference Image, and a set of Reference Images is referred to as a Reference Image Library. When you start an image tracking subsystem, you need to provide it with a Reference Image Library so it knows what to look for. This process is made easier with Unity’s AR Foundation ecosystem.
So, after we have set the scene for our AR game, we’ll want to create our very own Reference Image Library by clicking Assets > Create > XR > Reference Image Library. In your project’s Assets folder, a new Reference Image Library object will be created. Click on that object and, in the Inspector panel, add your Reference Images. In my case, I’m going to use the Zenva Academy logo.
For iOS builds, we need to set the specific size of each Reference Image. This is the physical size you expect the image to have in the real world; when this option is enabled, the dimensions must be greater than zero.
With the reference library complete, we’ll need to add the AR Tracked Image Manager script to our AR Session Origin. This script is a manager that uses the reference library to recognize and track 2D images in the physical environment. To add the script, select the AR Session Origin game object in the Scene Hierarchy, click Add Component in the Inspector panel, and search for AR Tracked Image Manager.
Once the script has been added, you’ll see three variables that need to be populated: Reference Library, Max Number of Moving Images, and Tracked Image Prefab. Reference Library is the XRReferenceImageLibrary to use during image detection; this is the library of images that will be detected and/or tracked in the physical environment. Max Number of Moving Images is the maximum number of moving images to track in real time; support may vary between devices and providers, and you can check for support at runtime with .SubsystemDescriptor.supportsMovingImages. Tracked Image Prefab is the assigned AR object; if not null, this prefab is instantiated for each detected image.
Drag and drop the Reference Library you just created into the designated variable field. For Max Number of Moving Images, enter the number of images you have added to the library; in my case, I’m just going to put one. For the Tracked Image Prefab, drag and drop the 3D object you wish to appear when your device tracks the 2D image.
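If you’d rather wire these fields up in code than in the Inspector, here is a minimal sketch (assuming the ARFoundation 2.x C# API; the class name and serialized fields are my own illustrative choices) that sets the same three properties on the manager:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative setup script: mirrors the three Inspector fields in code.
[RequireComponent(typeof(ARTrackedImageManager))]
public class TrackedImageSetup : MonoBehaviour
{
    [SerializeField] XRReferenceImageLibrary library; // the Reference Image Library asset
    [SerializeField] GameObject trackedPrefab;        // 3D object spawned per detected image

    void Awake()
    {
        var manager = GetComponent<ARTrackedImageManager>();
        manager.referenceLibrary = library;           // Reference Library
        manager.maxNumberOfMovingImages = 1;          // Max Number of Moving Images
        manager.trackedImagePrefab = trackedPrefab;   // Tracked Image Prefab
    }
}
```

Configuring the library in Awake, before the manager is enabled, avoids the subsystem starting without a library to search for.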
Next, you will need to add the AR Input Manager script to the AR Session. This script manages the lifetime of the Input Subsystem; we can add it to any GameObject in the scene to make the device’s pose information available, and that input is read by the Tracked Pose Driver. To add this script, select AR Session in the Scene Hierarchy, click Add Component in the Inspector panel, and search for AR Input Manager.
Boom, bam, thank you, ma’am! It’s done. Now, if you build and run your program on your iOS device, you will find that when your device’s camera tracks the Zenva logo, the designated game object appears above it. If you move the logo within the physical space, your device’s camera tracks the logo as it moves.
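If you want to react to detection and movement in your own scripts, the manager raises an event whenever tracked images are added, updated, or removed. A minimal sketch (assuming the ARFoundation 2.x C# API; the class name below is my own illustrative choice):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative listener: logs image tracking events from the manager.
[RequireComponent(typeof(ARTrackedImageManager))]
public class ImageTrackingLogger : MonoBehaviour
{
    ARTrackedImageManager manager;

    void OnEnable()
    {
        manager = GetComponent<ARTrackedImageManager>();
        manager.trackedImagesChanged += OnTrackedImagesChanged;
    }

    void OnDisable()
    {
        manager.trackedImagesChanged -= OnTrackedImagesChanged;
    }

    void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var image in args.added)
            Debug.Log($"Detected: {image.referenceImage.name}");
        foreach (var image in args.updated)
            Debug.Log($"Tracking {image.referenceImage.name} at {image.transform.position}");
    }
}
```

Attach this to the AR Session Origin alongside the AR Tracked Image Manager to watch the tracking events in the console.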
In this tutorial we learned to:
- Initialize a Unity project for Augmented Reality builds
- Install AR Foundation and ARKit XR packages
- Do fundamental scene setup for Augmented Reality games in Unity
- Use the Image Reference Library for image tracking
And there you have it: a simple image tracking program that you can show friends to blow their minds at your splendor. We’ve boosted our thrusters for hyperdrive capabilities and outfitted our spacesuits for interdimensional travel. All that’s left to do is go further into the ARverse, like the courageous pioneers Neil Armstrong and Buzz Aldrin, and create more!
In Part 2, we’ll take it one step further by implementing logic to make the AR object disappear when the reference image is no longer being tracked by your iOS device (similar to Vuforia’s image reference logic). I’ll also show you how to implement Vuforia’s image target engine. The engine detects and tracks features that are naturally found in the image itself by comparing those natural features against a known target resource database. Once the Image Target is detected, Vuforia Engine will track the image as long as it is at least partially in the camera’s field of view; when it is out of view, the AR interactions end.
“Just remember there is only one corner of the ARverse you can be certain of improving, and that’s your own code.” – Morgan H. McKie