“The Turtle” developer tutorial: markerless app made with Unity (SLAM)

Paula

Update: Track, save and share AR experiences with SDK 8

Since the launch of Wikitude’s SLAM technology, “The Turtle” has been the #1 tutorial request we’ve received. So who better than the developers themselves to show you how they made turtles float around the streets of Michigan?

This post is a guest post by YETi CGI, a tech design company that has developed projects for companies like Disney, Mattel, National Geographic, and more.

The Turtle, as we affectionately call it, is a markerless AR concept demo we built on the new Wikitude SDK as part of our own research into 3D mobile AR. In broad strokes, the app causes your phone to see a majestic sea turtle swimming about the room, getting believably close and distant depending on where it is in virtual space.

Building the AR experience

The demo’s effect hinges on the turtle itself, right? Both its look and its feel contribute to the sense of it swimming in the same room the user occupies. To convey this, the first step is to give the turtle some context, which is where Wikitude’s SDK comes in. Its SLAM algorithm defines the virtual space based on the physical one, gathering data from the room and mapping it into a Unity scene.

Because of what Wikitude provides, our coding needs were actually pretty simple. Once the Unity scene is created, the camera performs a raycast onto the ground plane, giving the program most of the variables it needs to run. Aided by the virtual map of the physical room and the raycast data, the prefabbed object (which then takes on the turtle as its identity) is positioned at a comfortable distance directly in front of the user.

When the scene is set to its active state and tracking is running, the turtle is positioned directly in front of the viewer:

if (_isTracking)
{
    // cast a ray out from the center of the screen and find where it hits the ground plane
    Ray cameraRay = Camera.main.ScreenPointToRay(new Vector3(Screen.width / 2, Screen.height / 2, 0));
    RaycastHit hit;
    if (ground.Raycast(cameraRay, out hit, 30))
    {
        // move the holder (and the turtle parented to it) to that point
        objectHolder.transform.position = hit.point;
    }
}
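
For a fuller picture, here is how that fragment might be wrapped into a component. The ground collider, the objectHolder reference, and the way _isTracking gets set are our assumptions for illustration (in practice the flag would follow the SDK’s tracking-state callback, whose exact name depends on the SDK version), so treat this as a minimal sketch rather than the app’s actual source.

using UnityEngine;

public class TurtlePlacer : MonoBehaviour
{
    [SerializeField] private Collider ground;          // collider on the tracked ground plane (assumed setup)
    [SerializeField] private GameObject objectHolder;  // empty parent that holds the turtle prefab instance

    private bool _isTracking;

    // Hypothetical hook: wire this to the tracker's state-change event
    // so the flag mirrors whether instant tracking is currently running.
    public void SetTrackingState(bool isTracking)
    {
        _isTracking = isTracking;
    }

    // Runs when the placement state becomes active and drops the turtle
    // onto the ground plane in front of the viewer.
    private void OnEnable()
    {
        if (!_isTracking) return;

        // cast a ray out from the center of the screen
        Ray cameraRay = Camera.main.ScreenPointToRay(new Vector3(Screen.width / 2f, Screen.height / 2f, 0f));
        RaycastHit hit;

        // place the holder (and the turtle inside it) where the ray meets the ground
        if (ground.Raycast(cameraRay, out hit, 30f))
        {
            objectHolder.transform.position = hit.point;
        }
    }
}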

The turtle’s animation is mostly provided by the art team, but it only consists of a looped reel of images. The baked-in animation doesn’t actually include the circles the turtle swims in, though it’s worth noting that it could have been accomplished that way. Our approach was to use a free plugin, called DOTween, to drive the turtle’s circular motion. We think it worked nicely.
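
The post doesn’t show the tweening code, but a circular swim with DOTween could look roughly like the sketch below. The turtle reference, the radius, and the lap duration are placeholder values of ours, not numbers from the project.

using UnityEngine;
using DG.Tweening; // free DOTween plugin

public class TurtleSwim : MonoBehaviour
{
    [SerializeField] private Transform turtle;    // turtle instance playing its looped swim animation
    [SerializeField] private float radius = 2f;   // placeholder: radius of the circle in meters
    [SerializeField] private float lapTime = 12f; // placeholder: seconds per full lap

    private void Start()
    {
        // build a ring of waypoints around the turtle's starting position
        const int points = 8;
        Vector3 center = turtle.position;
        Vector3[] path = new Vector3[points];
        for (int i = 0; i < points; i++)
        {
            float angle = i * Mathf.PI * 2f / points;
            path[i] = center + new Vector3(Mathf.Cos(angle), 0f, Mathf.Sin(angle)) * radius;
        }

        // tween along the closed path forever, keeping the turtle facing its direction of travel
        turtle.DOPath(path, lapTime, PathType.CatmullRom)
              .SetOptions(true)   // close the path into a loop
              .SetLookAt(0.01f)   // orient the turtle along the path
              .SetEase(Ease.Linear)
              .SetLoops(-1);
    }
}

Keeping the circling in code rather than baking it into the animation makes the radius and speed easy to tweak per scene.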

Did you know you can now use Unity’s live preview with the Wikitude SDK? See more

Again, Wikitude’s SDK accommodates the project beautifully: most of the code our team wrote handles positioning and camera input. Because we built in Unity, Wikitude could provide us with prefabs (pre-built assets) that let us drag and drop ready-to-go elements, giving us the freedom to work on the design and UX without first having to reinvent the wheel.

The outcome

How cool is it to track the world around us? We’re pleased with how it turned out, and we’re going to continue learning about the applications of markerless 3D AR. At YETi we’ve been active in the scene for some time, and having tools like Wikitude’s SDK show up is a huge encouragement.

That’s it! Hope you enjoyed this tutorial. Below are some relevant links to get you started with Instant Tracking. Let us know in the comments below which tutorials we should make next.

Get started with the Wikitude SDK for Unity:


Download Unity SDK
Set up guide
Instant Tracking info
Other apps using Instant Tracking

Interested in creating an AR project of your own?
Talk to one of our specialists and learn how to get started.

Contact The Wikitude Team
