
Instant Tracking: best practices and tracked environment guidelines

Unlike Image and Object Recognition, which rely on pre-mapped targets to trigger the display of digitally augmented elements, Instant Tracking is markerless. Instead of requiring a marker, it tracks features of the physical environment itself to overlay AR content. SLAM-based Instant Tracking is therefore highly dependent on the characteristics of the physical scene in which the AR experience takes place.

After discussing Image Target guidelines, in this post we will briefly talk about how Instant Tracking works, sharing best use practices and the characteristics that make for a good tracking environment.

Instant Tracking: best use practices and environment guidelines

The ability to track arbitrary environments without the need for a marker is a trait of the Instant Tracking algorithm that enables very specific use cases. A classic application is furniture product placement, as implemented in this 3D Model on Plane example. But how does it work?

The Instant Tracking algorithm works in two distinct states:

  • Initialization State: the origin of the tracking procedure is defined by pointing the device and aligning an indicator. The user must actively confirm when the alignment is satisfactory before transitioning to the tracking state.
  • Tracking State: in this state the environment is being tracked continuously, allowing augmentations to be placed within the scene.
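In the Wikitude JavaScript API, these two states map onto an InstantTracker object. The snippet below is a minimal sketch assuming the JavaScript API of SDK 6 and later; property and callback names may differ slightly between versions.

```js
// Minimal sketch of the two-state flow (assumed Wikitude JavaScript API).
// deviceHeight (in meters) helps the engine estimate the scale of the scene.
var tracker = new AR.InstantTracker({
    deviceHeight: 1.0,
    onChangedState: function(state) {
        // Called when the tracker switches between initialization and tracking.
    }
});

// An InstantTrackable carries the drawables; during initialization the
// "cam" drawable acts as the alignment indicator the user positions.
var trackable = new AR.InstantTrackable(tracker, {
    drawables: {
        cam: new AR.Circle(0.1, { style: { fillColor: '#FFA500' } })
    }
});

// Once the user confirms the indicator is aligned, switch to tracking.
function onUserConfirmedAlignment() {
    tracker.changeState(AR.InstantTrackerState.TRACKING);
}
```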

Ideal Scene

For best results during the initialization state, a good tracking environment must be used. Since instant tracking creates a point cloud of the scene, it relies on detecting capturable features. For that reason, an ideal scene is one that contains distinguishable elements and good lighting.

[Image: an ideal Instant Tracking scene, with distinguishable elements and good lighting]

A structured floor and/or carpet works best for detecting ground planes, while plain, featureless surfaces can be a problem.

Scene Mapping

To get a good map of the scene, different viewing angles and perspectives are necessary.

[Image: mapping the scene from different viewing angles and perspectives]

For best results, take a step back after starting initialization and move left and right around the scene.

Augmentation Placement

Typically, one can walk 90 degrees to the left and 90 degrees to the right of a 3D model placed in the scene (180 degrees in total). In some cases, however, a full 360-degree walk-around is possible. It depends on the trackability of the environment and the performance of the device being used.

Learn more about setting up an ideal sample scene for Instant Tracking with the Wikitude SDK in Unity by watching this Unity video tutorial.

Following these guidelines should produce great Instant Tracking results. For maximum performance, update your Wikitude SDK to the latest available version and make sure to use supported devices during the AR experience.

To continue reading the Wikitude AR Guide Series, access the Image Target installment.

For more AR information, access the Wikitude Forum to browse through various AR topics discussed by active developers worldwide. Should you have any further questions, please contact studio@wikitude.com for extra support.







Instant Tracking: Augmented Reality Use Cases and How-to

Instant Tracking augmented reality technology makes it possible for AR applications to overlay interactive digital content onto physical surfaces without requiring the use of a predefined marker to kick off the AR experience.

To better understand how Instant Tracking works and what you can create with it, continue reading to review the following topics:

  • Use Cases
  • Introduction
  • Instant Targets
  • SMART – Seamless AR Tracking with ARKit and ARCore
  • Download AR SDK (free trial links)
  • How-to: sample instructions

Instant Tracking Augmented Reality Use Cases

The video below contains several segments of augmented reality use cases using Instant Tracking AR technology.

As seen in the video, Instant Tracking technology can be used for various applications: retail, journalism, marketing campaigns, furniture placement, museums – or simply for fun, like the majestic sea turtle swimming about in the air.

Instant Tracking AR Technology Introduction


Unlike Object & Scene Tracking – covered in the first article of the Wikitude AR-technology series – Instant Tracking does not need to recognize a predefined target before it can start the tracking procedure.

Instead, it initializes by tracking the physical environment itself. This markerless augmented reality is possible thanks to SLAM – Simultaneous Localization and Mapping technology.

SLAM is a technique computer vision systems use to gather visual data from the physical world (usually in the form of tracked points). Devices then use this visual input to understand and appropriately interact with the environment.

To achieve this, the algorithm behind Instant Tracking works in two distinct states:

  • The initialization state: the end user is required to define the origin of the tracking procedure by pointing the device to align an indicator. Once the user confirms the alignment is satisfactory, a transition to the tracking state takes place.
  • The tracking state: the environment is being continuously tracked, allowing augmentations to be properly placed within the physical scene.

This environment tracking capability enables very specific use cases, like the ones demonstrated in the video above.

Towards the end of this article, we will share instructions on how to create a furniture placement sample app that will help you understand and explore the full potential of Instant Tracking technology.

But first, let’s talk about Instant Targets and SMART, two important Instant Tracking AR features.

Instant Targets

Instant Targets is a feature within AR Instant Tracking which allows end users to save and load their AR sessions.

This means important digital notes, directions, visual augmentations – and the whole AR experience itself – can be accessed and experienced by multiple users across devices and operating systems (iOS, Android, and UWP) at different points in time.

This makes sharing and revisiting the AR experience easy and meaningful. Instant Targets also allows users to load, edit, and resave the AR experience on the fly. Very practical, especially for remote assistance and maintenance use cases.
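In code, saving and reloading a session boils down to a pair of tracker calls. The sketch below follows the saveCurrentInstantTarget / loadExistingInstantTarget calls as exposed by the JavaScript API around SDK 8; the file URL is a placeholder and exact signatures may vary by version.

```js
// Hedged sketch: persisting an Instant Tracking session (assumed SDK 8 API).
var sessionUrl = "file:///sdcard/Wikitude/session.wto"; // placeholder path

// Save the currently tracked scene so it can be reloaded later.
tracker.saveCurrentInstantTarget(sessionUrl, function() {
    // Session saved; share the file across devices or store it for later.
}, function(error) {
    // Handle the save error.
});

// Later, or on another device: load the saved scene and resume tracking.
var savedTarget = new AR.TargetCollectionResource(sessionUrl);
tracker.loadExistingInstantTarget(savedTarget, function() {
    // Augmentations reattach to the previously mapped scene.
}, function(error) {
    // Handle the load error.
});
```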

While Instant Targets helps users share AR experiences, SMART greatly expands device AR capability.

SMART – Seamless AR Tracking – with ARKit and ARCore

SMART is a seamless API within Instant Tracking which integrates ARKit, ARCore and Wikitude’s SLAM engine in a single cross-platform AR SDK.

With it, developers do not have to deal with specific ARKit/ARCore code and can create their projects in JavaScript, Unity, Xamarin, or Cordova. SMART works by dynamically identifying the end user's device and deciding whether ARKit, ARCore, or Wikitude's SLAM engine should be used in each particular case.

One of its biggest advantages, apart from not having to maintain platform-specific code during development, is expanded compatibility with a wider range of devices available on the market.
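To see which engine will serve a given device, the SDK exposes an availability check. The following is a hedged sketch based on how the JavaScript API surfaced this around SDK 7.2+; the AR.hardware.smart identifiers are an assumption and may differ by version.

```js
// Hedged sketch: checking whether ARKit/ARCore-assisted tracking is available
// (AR.hardware.smart names assumed from the JavaScript API around SDK 7.2+).
AR.hardware.smart.onPlatformAssistedTrackingAvailabilityChanged = function(availability) {
    switch (availability) {
        case AR.hardware.smart.SmartAvailability.SUPPORTED:
            // ARKit or ARCore will run under the hood.
            break;
        case AR.hardware.smart.SmartAvailability.UNSUPPORTED:
            // The SDK falls back to Wikitude's own SLAM engine.
            break;
    }
};

// Trigger the asynchronous check; the result arrives via the callback above.
AR.hardware.smart.isPlatformAssistedTrackingSupported();
```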

Wikitude AR SDK download (free trial)

To create an Instant Tracking experience yourself, download a free trial of the Wikitude SDK – and follow the instructions listed in the Sample section below.

Wikitude SDK for Android

Wikitude SDK for iOS

Wikitude SDK for Windows

Wikitude SDK for Unity

Wikitude SDK for Cordova

Wikitude SDK for Xamarin

How-to: Instant Tracking Sample and Instructions

Access the Wikitude documentation of your preferred platform to follow instructions on how to create an Instant Tracking sample experience.

The instructions start with a simple implementation for basic understanding, then add 3D models and preliminary interaction, working up to the final fully fledged furniture placement use case.
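As a flavor of what the sample builds toward, here is a hedged JavaScript sketch of attaching a 3D model to an instant-tracking session; the asset path, scale values, and gesture handler are illustrative placeholders, not the official sample's code.

```js
// Hypothetical furniture-placement sketch (assumed Wikitude JavaScript API).
var tracker = new AR.InstantTracker({ deviceHeight: 1.0 });

// A .wt3 model exported for the Wikitude SDK; path and scale are placeholders.
var chairModel = new AR.Model("assets/chair.wt3", {
    scale: { x: 0.05, y: 0.05, z: 0.05 },
    onDragChanged: function(x, y) {
        // Let the user reposition the chair on the tracked surface.
    }
});

// Attach the model to the tracked scene via an InstantTrackable.
var trackable = new AR.InstantTrackable(tracker, {
    drawables: { cam: chairModel }
});
```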

Create an Instant Tracking AR experience

Learning how to work with Instant Tracking technology is a must in the modern AR world. It not only allows AR projects to go beyond targeted locations, images, and objects, but also enables AR experiences to happen anywhere, anytime, across different devices and platforms.

For commercial purposes, access our store to choose your package or contact our team to discuss which license is the best match for your AR project.


Get SMART – Seamless AR Tracking with ARCore and ARKit

Simultaneous localization and mapping (SLAM) has been one of the most impactful technologies of recent years. It opened up a wider spectrum of augmented reality experiences that go beyond image and object targets by instantly tracking real-world scenes and placing digital layers in everyday environments.

With the release of Wikitude’s markerless AR feature, followed by Apple’s ARKit and Google’s ARCore, lots of new eyes turned to augmented reality. The impressive markerless apps developed in the past year spoke for themselves: SLAM is here to stay.

Now that the dust has settled, an inconvenient truth arises: dealing with different SDKs and APIs for different platforms is cumbersome, time-consuming and not cost-efficient.

Determined to spare developers the hardship of dealing with multiple APIs, our team created SMART – Wikitude's 'Seamless AR Tracking' feature for the Wikitude SDK.

Track, save and share: download the new Wikitude SDK 8.

SMART – Wikitude’s ‘Seamless AR Tracking’

SMART is a seamless API which integrates ARKit and ARCore on top of Wikitude's SLAM engine in a single cross-platform augmented reality SDK for any device. This new SDK feature does more than simply fuse augmented reality platforms: it wraps them all in Wikitude's intuitive API. This means developers won't have to bother with specific ARKit/ARCore code. The Wikitude SDK dynamically identifies the end user's device and decides whether ARKit, ARCore, or Wikitude's SLAM should be used in each particular case.

The SMART feature is part of SDK 7.2 (and later versions) and makes ARKit and ARCore accessible to developers working with JavaScript, Unity, Xamarin, Titanium, PhoneGap, and Cordova. If you are just getting started with your AR development journey, don't miss our blog post covering all the development tools and extensions supported by Wikitude.

A free trial of SMART is now available for download.

Expanded Device Compatibility

ARKit and ARCore surely contribute to making AR more readily available for developers, but up to now, both SDKs have restricted device compatibility.

SMART ensures the delivery of the best possible augmented reality experience on a wider range of devices, covering 92.6% of iOS devices and about 95% of all suitable Android devices available on the market. This makes Wikitude the first toolkit on the market to truly democratize markerless AR for developers.

No matter if you are tracking horizontal or vertical planes, SMART is able to place 3D objects, videos, buttons, widgets and more using Wikitude’s SLAM engine.

For those interested in a wider variety of augmented reality experiences, check out our SDK’s geo-location, image recognition and tracking, multi-target recognition, object recognition and more with ample device coverage (smartphones, smart glasses, and tablets) and multiple platform deployment (iOS, Android).

Are you ready to get SMART?

  • Download the SDK of your choice
  • Read all about SMART (part of Wikitude’s Instant Tracking feature)
  • Get the setup guide
  • Use Wikitude’s sample app to try your project

We can't wait to hear your feedback! Please let us know via Twitter, Facebook, or the comments below how you like SMART. Share the good news using #SMART & #wikitude.


Wikitude SDK 6 – A technical insight

Update (August 2017): Object recognition, multi-target tracking and SLAM: Track the world with SDK 7

In this blog post, Wikitude CTO Philipp Nagele shares some insights into the technical background of SDK 6 and the changes that go along with the new version of Wikitude’s augmented reality SDK.

The planning and work for SDK 6 reach back to 2015. While working on the previous release, 5.3, to add support for Android 7.0 and iOS 10, we already had a clear plan for what the next major release of our SDK should include. It is very gratifying that we can now lift the curtain on the scope and details of what we think is the most comprehensive release in the history of Wikitude's augmented reality SDK.

While Instant Tracking is without doubt the highlight feature of this release, many more changes are included that are worth mentioning. In this blog post, I'll summarize the noteworthy changes and share some additional information and insights.

Pushing the boundaries of image recognition

Most of our customers choose the Wikitude SDK for the ability to recognize and track images for various purposes. The engine that powers it has been refined over the years and, already with SDK 5, reached a level of performance (both in speed and reliability) that puts it at the top of augmented reality SDKs today.

With Wikitude SDK 6, our computer vision engine took another major step forward. In more detail, the .wtc file format now uses a different approach to generating search indexes, which improves the recognition rate. Measured on the Stanford MVS data set, SDK 5 delivered a recognition rate of around 86%, while SDK 6 now recognizes 94 out of 100 images correctly. Moreover, the recognition rate stays above 90% regardless of the size of the .wtc file. So, no matter whether your .wtc file includes 50 or 1,000 images, users will successfully recognize your target images.

Another development we have been working on lately is embracing the power of genetic algorithms to optimize our computer vision algorithms. Several thousand experiments and many hours on our servers led to an optimized configuration of our algorithms. The result is a 2D image tracking engine that tracks targets in many more sceneries and under more varied light conditions. You can get a first impression of the increased robustness in this direct comparison between the 2D engine in SDK 5 and SDK 6. The footage is unedited and shows a setup with several challenging factors:

  • Low-light condition (single spot light source)
  • Several occluding objects
  • Strong shadows further occluding the images
  • Busy scenery in general
  • Reflections and refractions

The improvements in SDK 6 make the 2D engine more robust while keeping the same performance in terms of battery consumption and speed.

Instant Tracking – SLAM in practice

You might have seen our previous announcements of advancements in not only recognizing and tracking two-dimensional images, but also working with entire maps of three-dimensional scenes. It is no secret that Wikitude has been working on several initiatives in the area of 3D tracking.

For the first time, Wikitude SDK 6 includes a generally available feature based on a 3D computer vision engine developed in-house over the past two years. Instant Tracking uses a SLAM approach to track the device's surroundings and localize the device. In contrast to image recognition, Instant Tracking does not recognize previously recorded items but instantaneously tracks the user's surroundings. While the user keeps moving, the engine extends the recorded map of the scenery. If tracking is lost, the engine immediately tries to re-localize and start tracking again, with no need for user input.

Instant Tracking is true markerless tracking: no reference image or marker is needed. The initialization phase is instant and does not require a special initialization movement or pattern (e.g. the translation movement PTAM requires).

The same engine used for Instant Tracking powers Extended Tracking in the background. Extended Tracking uses an image from the 2D computer vision engine as an initialization starting point instead of an arbitrary surface, as in the case of Instant Tracking. After a 2D image has been recognized, the 3D computer vision engine starts recording the environment and stays in tracking mode even when the user is no longer viewing the image.

For details on how to get started with Instant Tracking, visit our documentation, see our sample app (included in the download package), and watch the instruction video to learn how to track the environment. The SDK 6 markerless augmented reality feature is available for Unity, JavaScript, Native, extensions, and smart glasses (Epson and ODG).

Putting the Pieces Together – Wikitude SDK and API

What is the strength of the Wikitude SDK? It is much more than a collection of computer vision algorithms. When planning a new release, the product and engineering team aim to create a cross-platform SDK that is highly usable. We try to think of use-cases for our technology, then identify missing features. So it should come as no surprise that Wikitude SDK 6 is packed with changes and new features beyond the computer vision features.

The most obvious and noticeable change, especially for your users, is FullHD rendering of the camera image. Previously, Wikitude rendered the camera stream in Standard Definition (SD) quality, which was perfect back in 2012 when the Wikitude SDK hit the market. Since then, device manufacturers have introduced Retina displays and pixel densities beyond what the eye can distinguish. An image rendered in VGA resolution on this kind of display just doesn't look right anymore. In Wikitude SDK 6, developers can now choose between SD, HD, or FullHD rendering of the camera stream.

Additionally, on some devices, users can now enjoy a smoother rendering experience, as the rendering frequency can be increased to 60fps. For Android, these improvements are based on new support for the Android Camera2 API, which, since Android 5.0, is the successor to the previous camera API (technically, more than 60% of Android devices as of 1/1/2017 should run the Camera2 API). It allows fine-grained control of and access to the camera and its capabilities. While the API and the idea behind it are a welcome improvement, implementations of the Camera2 API vary across Android vendors. Different implementations of an API are never a good thing, so support for the new camera features is limited to participating Android devices.

“Positioning” was another feature needed to allow users to interact with augmentations, and it is ideal for placing virtual objects in unknown environments. With Wikitude SDK 6, developers now have a consistent way to react to and work with multi-touch gestures. Dragging, panning, rotating – the most commonly used gestures on touch devices are now captured by the SDK and exposed in easy-to-understand callbacks. This feature has been implemented in a way that lets you use it in combination with any drawable in any of the different modes of the Wikitude SDK – be it Geo AR, Image Recognition, or Instant Tracking.
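As a hedged illustration, gesture callbacks attach directly to a drawable's definition in the JavaScript API; the callback names below match SDK 6's gesture set, while the argument lists are simplified and the asset path is a placeholder.

```js
// Sketch: SDK 6 multi-touch gesture callbacks on a drawable
// (argument lists simplified; see the documentation for full signatures).
var overlay = new AR.ImageDrawable(new AR.ImageResource("assets/sticker.png"), 1.0, {
    onDragChanged: function(relativeX, relativeY) {
        // Follow the user's finger while the drawable is dragged.
    },
    onRotationChanged: function(angleInDegrees) {
        // Two-finger rotation gesture.
    },
    onScaleChanged: function(scale) {
        // Pinch gesture for resizing.
    }
});
```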

The new tracking technology (Instant Tracking) led us to another change, which developers will encounter quite quickly when using SDK 6. Our previously used scheme of ClientTracker and CloudTracker no longer fit an SDK with a growing number of tracker types. SDK 6 introduces a different tracking scheme with more intuitive naming. For now, you will encounter ImageTracker, with various resources (local or cloud-based), and InstantTracker, with more tracker types coming soon. We are introducing this change now in SDK 6 while keeping it fully backward compatible with the SDK 5 API, though parts of that API are now deprecated. The SDK comes with an extensive migration guide for all platforms, detailing the changes.
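In practice, the renaming looks roughly like this in the JavaScript API (the .wtc path is a placeholder):

```js
// SDK 6 naming: an ImageTracker is built from a TargetCollectionResource
// (local or cloud-based) instead of the old ClientTracker/CloudTracker pair.
var targetCollection = new AR.TargetCollectionResource("assets/targets.wtc");
var imageTracker = new AR.ImageTracker(targetCollection, {
    onTargetsLoaded: function() {
        // Ready to recognize the targets in the collection.
    }
});

// Instant Tracking gets its own dedicated tracker type.
var instantTracker = new AR.InstantTracker();
```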

Last, I don't want to miss the opportunity to talk about two minor changes that I think can have a great impact on several augmented reality experiences. Both relate to visualizing drawables. The first change affects the way 2D drawables are rendered when they are attached to geo-locations. So far, 2D drawables attached to a geo-location have always been aligned to the user. Now, developers can align drawables however they wish (e.g. facing north), and the drawables will stay that way. The second change also affects 2D drawables: the new SDK 6 API unifies how 2D and 3D drawables are positioned, which adds the ability to position 2D drawables along the z-axis.

Naturally, all of our official extensions are compatible with the newest features. The Titanium (Titanium 6.0) and Unity (Unity3D 5.5) extensions now support the latest releases of their development environments, and x86 builds are now available in Unity.

The release comes with cross-platform samples (e.g. gestures are demonstrated in a Snapchat-like photo-booth experience) and documentation for each of the new features, so you can immediately start working with the new release.

Start developing with Wikitude SDK 6

Getting started with Wikitude’s new SLAM-based SDK is super easy! Here’s how:

  1. Download SDK 6 and sample app
  2. Check out our documentation
  3. Select your license plan
  4. Got questions? Our developers are here for you! 

We are excited to see what you will build with our new SDK. Let us know what your favorite feature is via Twitter and Facebook using the hashtags #SDK6 and #Wikitude!