
How to build an AR app like Pokémon Go in three simple steps

If you haven’t been living under a rock, you’ve heard about the Pokémon Go app. The game has already picked up more users than Twitter and helped push Nintendo’s market capitalization over $69 billion in 2020 (and counting!). The Pokémon Go app is also a great example of a location-based (LBS) game and of geo-based augmented reality.

Much of that success is due to the phenomenal fan base Pokémon already had in place. Another ingredient is the clever use of smartphone technology, including very easy-to-use AR. And here’s the great news for developers: all the tools needed to build your own location-based AR game are already out there. We’re going to show you how to use them in three simple steps.

OK, it’s a little more complicated than that, but once we break it down to fundamentals, you’ll see how simple it really is to build your own kind of Pokémon Go app.

Ready? Keep on reading!


Step 1: Set up a basic Geo-based infrastructure

Setting up the server infrastructure for such a game requires some work. You need to manage user accounts, arrange a smart distribution of Pokémon characters (or objects of your choice), and add some contextually aware components to your system.

For simplicity, let’s assume you have already created the server backend required to fetch the five closest creatures for any given user position. Once you know where the user is, you get a list of nearby creatures in JSON format, which is easy to parse in your JavaScript code.
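For illustration, a response from such a backend might look like the snippet below. The endpoint and field names are purely hypothetical placeholders for this article; your own API will look different.

// hypothetical response of GET /creatures/nearby?lat=48.2082&lon=16.3738&limit=5
[
    { "id": 1, "name": "creature-green", "latitude": 48.2085, "longitude": 16.3741, "altitude": 171.0 },
    { "id": 2, "name": "creature-blue",  "latitude": 48.2079, "longitude": 16.3729, "altitude": 171.0 },
    { "id": 3, "name": "creature-red",   "latitude": 48.2091, "longitude": 16.3750, "altitude": 172.5 }
]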

Placing objects (such as Pokémon) around users based on their GPS position was actually one of the very first features of the Wikitude SDK, supported since 2012.


Step 2: Build a geo-AR app

The Wikitude SDK JavaScript API also offers features to fetch the user’s location, and place videos, 2D and 3D content in geo-spaces. You can query content from a remote origin to create a personalized Geo-AR scene on the fly.

The Wikitude JavaScript API offers so-called AR.GeoObjects, which allow you to place virtual objects in the real world. A callback function informs you about user movements, providing latitude, longitude, and altitude values. In the following sample, the user’s location is stored for later use, using Wikitude’s JS SDK.

Note that this code must run inside a Wikitude AR-View, the central component of the Wikitude JavaScript SDK.

AR.context.onLocationChanged = function(lat, lon, alt, accuracy) {
    // store the user's location so you have access to it at any time
    World.userLocation = { "latitude": lat, "longitude": lon, "altitude": alt };
};

The following function uses a set of coordinates to create an AR.GeoObject out of a 3D model (in Wikitude’s .wt3 file format) and an AR.GeoLocation.

createModelAtLocation: function (location) {

    // place the model slightly to the east of the given position
    World.modelGeoLocation = new AR.GeoLocation(
        location.latitude,
        location.longitude + 0.0005,
        AR.CONST.UNKNOWN_ALTITUDE);

    // load the 3D model from a relative path or URL
    World.model = new AR.Model(World.PATH_MODEL_WT3, {
        // fired when the 3D model has loaded successfully
        onLoaded: function() {
            // attach the model to the geo location as an AR.GeoObject
            World.GeoObject = new AR.GeoObject(World.modelGeoLocation, {
                drawables: {
                    cam: [World.model]
                },
                onEnterFieldOfVision: function() {
                    console.log('model visible');
                    World.modelVisible = true;
                },
                onExitFieldOfVision: function() {
                    console.log('model no longer visible');
                    World.modelVisible = false;
                },
                onClick: function() {
                    console.log('model clicked');
                }
            });
        },
        onError: function(err) {
            console.error('unexpected error occurred ' + err);
        }
    });
}
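
To tie Step 1 and Step 2 together, you could ask your backend for the creatures closest to the stored World.userLocation and create one geo object per result. The following is only a rough sketch: it assumes the hypothetical /creatures/nearby endpoint and response fields from Step 1, that createModelAtLocation lives on the same World object as above, and that standard browser APIs such as XMLHttpRequest are available inside the AR-View.

// sketch: fetch nearby creatures (Step 1 backend) and place each one in the AR scene
World.requestNearbyCreatures = function () {
    var request = new XMLHttpRequest();
    var url = "https://example.com/creatures/nearby" +
        "?lat=" + World.userLocation.latitude +
        "&lon=" + World.userLocation.longitude + "&limit=5";
    request.open("GET", url, true);
    request.onload = function () {
        var creatures = JSON.parse(request.responseText);
        creatures.forEach(function (creature) {
            World.createModelAtLocation({
                latitude: creature.latitude,
                longitude: creature.longitude
            });
        });
    };
    request.send();
};

Note that in a real game you would keep one AR.GeoObject per creature instead of the single World.GeoObject slot used in the sample above.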

Step 3: Make the rules of your game

Now that you can place virtual 3D models in the real world, you can get as creative as you like and define the rules of your augmented reality game.
Here are some examples of what’s possible:

    • Similar to the Pokémon Go app, you may require the user to be no more than 10 meters (about 30 ft.) away from a 3D model, otherwise the player may not collect or even see it (see the distance-check sketch right after this list). On top of that, you could also use altitude values to make sure a user has to get to the top of a building or to other unique locations.
    • Mix up the content! In addition to 3D models, you could also use videos, text, icons, or buttons and place them in any geo-location you like.
    • In addition to geo AR, you could also use 2D image recognition and tracking: use signs, billboards, print, walls, or any other 2D surfaces as “stages” for interesting and cool AR augmentations that are “glued” onto those surfaces, as seen in this video.
    • If you want to take your game to the next level, use the environment around your user to extend the play with markerless AR or even include physical toys with object tracking to create immersive hybrid play experiences.
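
As an example of the first rule, here is a minimal sketch of such a proximity check. It relies on the World.userLocation object stored in Step 2 and uses a plain haversine calculation to stay self-contained (the Wikitude JS API also offers built-in distance helpers you may prefer in practice).

// sketch: only let the player collect a creature within 10 meters of their position
// World.userLocation is filled by the onLocationChanged callback from Step 2
World.canCollect = function (creatureLocation) {
    var toRadians = function (deg) { return deg * Math.PI / 180; };
    var earthRadius = 6371000; // meters

    var dLat = toRadians(creatureLocation.latitude - World.userLocation.latitude);
    var dLon = toRadians(creatureLocation.longitude - World.userLocation.longitude);
    var a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
            Math.cos(toRadians(World.userLocation.latitude)) *
            Math.cos(toRadians(creatureLocation.latitude)) *
            Math.sin(dLon / 2) * Math.sin(dLon / 2);
    var distance = 2 * earthRadius * Math.asin(Math.sqrt(a));

    return distance <= 10; // 10-meter collection radius
};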

Get an app template!

Need an app template to get you started? No problem, just download the Wikitude SDK package, which includes an application called “SDK Examples”. One of the many AR templates you will find in there is called “3D Model At Geo Location”. Simply use this as a starting point for your code.

You can also download Wikitude’s sample app (included in the SDK download package) for examples of how to build your own geo-AR experiences. Also, have a look at this helpful video tutorial made by a developer from our community:

Additionally, check out this integration tutorial by LBAR, a third-party plugin built on top of Wikitude’s comprehensive Unity plugin that offers location-based AR for Unity:

That’s it! We hope you will build great geo-based AR apps with our SDK.

Wikitude provides a wide selection of development frameworks. Developers can choose between the Native API, the JavaScript API, or any of the supported extensions and plugins. In addition to the JavaScript API for Android and iOS, extensions based on it include Cordova (PhoneGap), Xamarin, and Flutter. These extensions offer the same features as the JS API, such as location-based AR, image recognition, object tracking, and instant tracking.

In case you have any questions, don’t hesitate to contact us via the Wikitude forum. We have a broad network of developers who can help you create the next big AR app!






“The Turtle” developer tutorial: markerless app made with Unity (SLAM)

Update: Track, save and share AR experiences with SDK 8

Since the launch of Wikitude’s SLAM technology, “The Turtle” has been the #1 tutorial request we’ve received. So who better than the developers themselves to show you how they made turtles float around the streets of Michigan?

This post is a guest post by YETi CGI, a tech design company that has developed projects for companies like Disney, Mattel, National Geographic, and more.

The Turtle, as we affectionately call it, is a markerless AR concept demo we built on the new Wikitude SDK as part of our own research into 3D mobile AR. In broad strokes, the app causes your phone to see a majestic sea turtle swimming about the room, getting believably close and distant depending on where it is in virtual space.

Building the AR experience

The demo’s effect hinges on the turtle itself, right? Both its look and its feel contribute to the sense of it swimming in the same room the user occupies. To convey this, the first step is to give the turtle some context, which is where Wikitude’s SDK comes in. Its SLAM algorithm defines the virtual space based on the physical one, gathering data from the room and mapping it into a Unity scene.

Because of what Wikitude provides, our coding needs were actually pretty simple. Once the Unity scene is created the camera performs a raycast onto the ground, giving the program most of the variables it needs to run. Aided by the virtual map of the physical room and the raycast data, the prefabbed object (which is then given the turtle as its identity) is positioned at a comfortable distance directly in front of the user.

When the scene is set to its active state and tracking is running, the turtle is positioned correctly in front of the viewer:

if (_isTracking)
{
    // cast a ray from the center of the screen and find where it hits the ground plane
    Ray cameraRay = Camera.main.ScreenPointToRay(new Vector3(Screen.width / 2, Screen.height / 2, 0));
    RaycastHit hit;
    if (ground.Raycast(cameraRay, out hit, 30))
    {
        // place the object holder (and the turtle it contains) at that point
        objectHolder.transform.position = hit.point;
    }
}

The turtle’s animation is mostly provided by the art team, but it only consists of a looped reel of images. The baked-in animation doesn’t actually include the circles the turtle swims in, though it’s worth noting that it could have been accomplished that way. Our approach was to use a free plugin called DOTween to assign the turtle an animation state. We think it worked nicely.
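
As a rough illustration of that approach, the snippet below uses DOTween to move the turtle along a closed, looping path so it appears to swim in circles. The waypoints, duration, and the turtle field are assumptions made for this sketch, not the exact values used in the demo.

using DG.Tweening;
using UnityEngine;

public class TurtleSwim : MonoBehaviour
{
    // the turtle object positioned by the raycast code above (hypothetical reference)
    public Transform turtle;

    void Start()
    {
        // a rough circle of waypoints around the turtle's starting position
        Vector3 center = turtle.position;
        Vector3[] path = new Vector3[]
        {
            center + new Vector3( 1f, 0f,  0f),
            center + new Vector3( 0f, 0f,  1f),
            center + new Vector3(-1f, 0f,  0f),
            center + new Vector3( 0f, 0f, -1f)
        };

        // tween along the closed path forever, facing the direction of travel
        turtle.DOPath(path, 12f, PathType.CatmullRom)
              .SetOptions(true)      // close the path so the loop is seamless
              .SetLookAt(0.01f)      // orient the turtle along the path
              .SetEase(Ease.Linear)
              .SetLoops(-1);
    }
}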

Did you know you can now use Unity’s live preview with the Wikitude SDK?

Again, Wikitude’s SDK accommodates the project beautifully: most of the code written by our team determines position and camera input. By using Unity, Wikitude was able to provide us with prefabs (pre-built assets) that allowed us to drag and drop ready-to-go elements, giving us the freedom to work on the design and UX without having to reinvent the wheel first.

The outcome

How cool is it to track the world around us? We’re pleased with how it turned out, and we’re going to continue learning about the applications of markerless 3D AR. At YETi we’ve been active in the scene for some time, and having tools like Wikitude’s SDK show up is a huge encouragement.

That’s it! We hope you enjoyed this tutorial. Below are some relevant links to get you started with Instant Tracking. Let us know in the comments below which tutorials we should make next.

Get started with the Wikitude SDK for Unity:

    • Download Unity SDK
    • Set up guide
    • Instant Tracking info
    • Other apps using Instant Tracking




