AR features

Instant Tracking: Augmented Reality Use Cases And How-to

Instant Tracking augmented reality technology makes it possible for AR applications to overlay interactive digital content onto physical surfaces without requiring a predefined marker to kick off the AR experience.

To better understand how Instant Tracking works and what is possible to create with it, continue reading to review the following topics:

  • Use Cases
  • Introduction
  • Instant Targets
  • SMART – Seamless AR Tracking with ARKit and ARCore
  • Download AR SDK (free trial links)
  • How-to: sample instructions

Instant Tracking Augmented Reality Use Cases

The video below contains several segments of augmented reality use cases using Instant Tracking AR technology.

As seen in the video, Instant Tracking technology can be used for various applications: retail, journalism, marketing campaigns, furniture placement, museums – or simply for fun, like the majestic sea turtle swimming about in the air.

Instant Tracking AR Technology Introduction

Unlike Object & Scene Tracking – covered in the first article of the Wikitude AR-technology series – Instant Tracking does not need to recognize a predefined target before it can start tracking.

Instead, it initializes by tracking the physical environment itself. This markerless augmented reality is possible thanks to SLAM – Simultaneous Localization and Mapping technology.

SLAM is a computer-vision technique that maps the physical world from visual input (usually in the form of tracked points) while simultaneously locating the device within it. Devices then use this visual input to understand and interact appropriately with the environment.

To achieve this, the algorithm behind Instant Tracking works in two distinct states:

  • The initialization state: the end user is required to define the origin of the tracking procedure by pointing the device to align an indicator. Once the user confirms the alignment is satisfactory, a transition to the tracking state takes place.
  • The tracking state: the environment is being continuously tracked, allowing augmentations to be properly placed within the physical scene.
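The two states above can be pictured as a minimal state machine. The sketch below is an illustrative model only, not the Wikitude API:

```javascript
// Illustrative model of Instant Tracking's two states (not the SDK API):
// the tracker starts in initialization, where the user aligns the origin
// indicator, and switches to tracking once the alignment is confirmed.
class InstantTrackingStateMachine {
  constructor() {
    this.state = "INITIALIZING";
  }
  // Called when the user confirms the indicator placement.
  confirmAlignment() {
    if (this.state === "INITIALIZING") {
      this.state = "TRACKING"; // origin is fixed; environment tracking begins
    }
    return this.state;
  }
  // Returning to initialization discards the current origin.
  reset() {
    this.state = "INITIALIZING";
  }
}
```

Once in the tracking state, augmentations stay anchored relative to the origin the user defined during initialization.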

This environment tracking capability enables very specific use cases, like the ones demonstrated in the video above.

Towards the end of this article, we will share instructions on how to create a furniture placement sample app that will help you understand and explore the full potential of Instant Tracking technology.

But first, let’s talk about Instant Targets and SMART, two important Instant Tracking AR features.

Instant Targets

Instant Targets is a feature within Instant Tracking that allows end users to save and load their AR sessions.

This means important digital notes, directions, visual augmentations – and the whole AR experience itself – can be accessed and experienced by multiple users across devices and operating systems (iOS, Android, and UWP) at different points in time.

This makes sharing and revisiting the AR experience easy and meaningful. Instant Targets also allows users to load, edit, and resave the AR experience on the fly. Very practical, especially for remote assistance and maintenance use cases.
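Conceptually, saving and loading an Instant Target is a serialization round trip of the tracked session. The sketch below models that idea with plain JSON; the function names and session shape are assumptions for illustration, not the actual SDK API:

```javascript
// Conceptual sketch of Instant Targets (names are illustrative, not the
// real Wikitude API): a session – the tracked map plus its augmentations –
// is serialized so another user, device, or OS can restore it later.
function saveInstantTarget(session) {
  return JSON.stringify({
    map: session.map,                     // tracked environment data
    augmentations: session.augmentations, // notes, directions, 3D content
  });
}

function loadInstantTarget(serialized) {
  // Restores a previously saved session; it can then be edited and re-saved.
  return JSON.parse(serialized);
}
```

The "edit and re-save on the fly" workflow mentioned above is then just load, modify, save again.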

While Instant Targets helps users share AR experiences, SMART greatly expands device AR capabilities.

SMART – Seamless AR Tracking – with ARKit and ARCore

SMART is a seamless API within Instant Tracking which integrates ARKit, ARCore and Wikitude’s SLAM engine in a single cross-platform AR SDK.

With it, developers do not have to deal with ARKit/ARCore-specific code and can create their projects in JavaScript, Unity, Xamarin, or Cordova. SMART works by dynamically identifying the end user’s device and deciding which tracking engine should be used in each particular case.

One of the biggest advantages, apart from not having to maintain platform-specific code during development, is expanded compatibility with a wider range of devices on the market.
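SMART’s engine selection can be pictured as a simple capability check. This is a hypothetical sketch of the decision, not Wikitude’s internal logic:

```javascript
// Hypothetical sketch of SMART-style engine selection (not Wikitude's
// internal code): prefer the platform framework when the device supports
// it, and fall back to Wikitude's own SLAM engine otherwise, so older
// devices remain compatible.
function selectTrackingEngine(device) {
  if (device.platform === "ios" && device.supportsARKit) {
    return "ARKit";
  }
  if (device.platform === "android" && device.supportsARCore) {
    return "ARCore";
  }
  return "Wikitude SLAM"; // cross-platform fallback
}
```

The fallback branch is what widens device compatibility: phones without ARKit/ARCore support still get a working Instant Tracking experience.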

Wikitude AR SDK download (free trial)

To create an Instant Tracking experience yourself, download a free trial of the Wikitude SDK – and follow the instructions listed in the Sample section below.

Wikitude SDK for Android

Wikitude SDK for iOS

Wikitude SDK for Windows

Wikitude SDK for Unity

Wikitude SDK for Cordova

Wikitude SDK for Xamarin

How-to: Instant Tracking Sample and Instructions

Access the Wikitude documentation of your preferred platform to follow instructions on how to create an Instant Tracking sample experience.

The instructions start with a simple implementation for basic understanding, then add 3D models and preliminary interaction, working up to the final fully fledged furniture placement example.

Create an Instant Tracking AR experience

Learning how to work with Instant Tracking technology is a must in the modern AR world. It not only allows AR projects to go beyond predefined locations, images, and objects, but also enables AR experiences to happen anywhere, anytime, across different devices and platforms.

For commercial purposes, access our store to choose your package or contact our team to discuss which license is the best match for your AR project.

Dev to Dev

Wikitude SDK 7: A developer insight

Wikitude SDK 7 includes a long list of changes and additions to our augmented reality SDK. In this blog post, we will go through the modifications in more detail and what they offer for developers and users.

As you will see, SDK 7 has 3 key areas of improvement: Object Recognition and Tracking based on SLAM, multiple image recognition and enhancements for iOS developers.

Bring your objects into your augmented reality scene

Let’s get started with the biggest addition in this release: Object Recognition and Tracking for augmented reality. With this, we introduce a new tracker type alongside our existing Image and Instant Tracking. The Object Tracker in the SDK gives you the ability to recognize and track arbitrarily shaped objects. The idea behind it is very similar to our Image Tracker, but instead of recognizing images and planar surfaces, the Object Tracker can work with three-dimensional structures and objects (tools, toys, machinery…). As you may have noticed, we don’t claim that the Object Tracker can work on any kind of object. There are some restrictions you should be aware of, and types of objects that work a lot better. The SDK 7 documentation has a separate chapter on that.

In short – objects should be well structured and their surfaces well textured to play nicely with object recognition. API-wise, the Object Tracker is set up the same way as the Image Tracker.

The Object Tracker works according to the same principle as the Image Tracker: it detects a pre-recorded reference of the object (the reference is actually a pre-recorded SLAM map). Once detected in the camera, the object is continuously tracked. While providing references for Image Targets is straightforward (an image upload), creating a reference for an object is a little more complex.

Scalable generation of object references

We decided to go for an approach that is scalable and usable for many users. This ruled out a recording application used to capture your object, which would also require each object to be physically present. Considering this, we went for server-side generation of Object Targets (sometimes also referred to as maps). Studio Manager, our web tool for converting Image Targets, has been adapted to convert regular video files into Object Targets. You will find a new project type in Studio Manager that will produce Object Targets for you. Here’s a tutorial on how to successfully record objects.

After you have uploaded your video, the backend will try to find the best possible Object Target in several computation runs. We can utilize the power of the server to run computationally intensive algorithms and reach a more accurate result than a pure on-device solution that has to operate in real time. Have a look at the chapter “How to create an Object Target” in the SDK 7 documentation for a deeper understanding of the process. Server-side generation also gives us the ability to roll out improvements to the recording process without the need for a new SDK version.

Rendering Upgrade: Working with occlusion models

When moving from Image Targets to Object Targets, requirements for rendering change as well. When the object has a solid body with different sides, it is particularly important to reflect that when rendering the augmentations. SDK 7 introduces a new type in the JavaScript API called AR.Occluder, which can take any shape. It acts as an occlusion model in the 3D rendering engine, so you can hide augmentations or make them a lot more realistic. For your convenience, the occluder can either use standard pre-defined geometric shapes or take the form of any 3D model/shape (in wt3 format). Object Tracking is not the only beneficiary: occluders can, of course, be used in combination with Image Targets as well – think of an Image Target on your wrist used for trying on watches. For a proper result, parts of the watch need to be hidden behind your actual arm.
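The occlusion principle itself boils down to a depth comparison: at a given screen position, an augmentation is hidden wherever the occluder geometry sits closer to the camera. A toy sketch of that idea (conceptual only, not the SDK’s renderer):

```javascript
// Toy occlusion check (conceptual, not the Wikitude renderer): a fragment
// of an augmentation stays visible unless the occluder is closer to the
// camera at the same screen position. Depths are distances from the
// camera; smaller means closer.
function visibleFragments(fragments, occluderDepthAt) {
  return fragments.filter((f) => occluderDepthAt(f.x, f.y) >= f.depth);
}

// Example: an occluder covering the screen region x < 5 at depth 1
// (think of the arm in front of parts of the watch).
const occluderDepthAt = (x, y) => (x < 5 ? 1 : Infinity);
const fragments = [
  { x: 2, y: 0, depth: 2 }, // behind the occluder – hidden
  { x: 7, y: 0, depth: 2 }, // outside the occluder – visible
];
```

In the real engine this test happens per pixel in the depth buffer; the occluder writes depth but no color, which is what makes augmentations disappear behind real objects.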

Updated SLAM engine enhancing Instant Tracking

Object Recognition and Tracking is based on the same SLAM engine that powers Instant Tracking and Extended Tracking. To make Object Recognition work, we upgraded the SLAM engine with several improvements, changes to the algorithm and bug fixes to the engine itself. This means SDK 7 carries an entirely revamped SLAM engine. You as a developer and your users will notice that in several ways:

1. Higher degree of accuracy in Instant Tracking and Extended Tracking
2. More stable tracking when it comes to rotation
3. Less memory consumption
4. Less power consumption

All in all, that means that devices running 32-bit CPUs (ARMv7 architecture) will see a performance boost and perform considerably better.

Instant Tracking also comes with two new API additions. Setting trackingPlaneOrientation on the InstantTracker lets you freely define which kind of plane the Instant Tracker should start on (wall, floor, ramp…). The other addition is the hit testing API, which lets you query the depth value of any given screen point (x, y). It returns the 3D coordinates of the corresponding point in the currently tracked scene, which is useful for placing augmentations at the correct depth. The SDK returns an estimate based on the surrounding tracked points. The video below gives you an idea of how the hit testing API can be used.
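The hit testing idea can be approximated with a nearest-neighbour lookup over the tracked feature points. This is an illustrative sketch of the estimation, not the SDK’s actual implementation:

```javascript
// Illustrative hit test (not the SDK's estimator): approximate the 3D
// point for a screen coordinate by returning the world position of the
// tracked feature point nearest to it in screen space.
function hitTest(screenPoint, trackedPoints) {
  let best = null;
  let bestDist = Infinity;
  for (const p of trackedPoints) {
    const dx = p.screen.x - screenPoint.x;
    const dy = p.screen.y - screenPoint.y;
    const dist = dx * dx + dy * dy; // squared distance is enough to compare
    if (dist < bestDist) {
      bestDist = dist;
      best = p.world;
    }
  }
  return best; // { x, y, z } in the tracked scene, or null if no points
}
```

A tap handler would feed the touch coordinates into such a query and place the augmentation at the returned 3D position.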

1, 2, 3… Multiple targets now available

Additionally, our computer vision experts worked hard to make our image CV engine even better. The most noticeable change is the ability to recognize and track multiple images at the same time in the camera frame. The engine can detect multiple different images, as well as multiple duplicates of the same image (e.g. for counting purposes). Images can overlap or even superimpose each other. The SDK does not have a hard-coded limit on the number of images it can track – only the processing power of the phone puts a restriction on it. With modern smartphones it is easily possible to track 8 or more images.

Furthermore, the SDK offers developers the ability to get more information about the targets in relation to each other. APIs will tell you how far apart targets are and how they are oriented towards each other. Callbacks let developers react to changes in the relationship between targets. Developers can define the maximum number of targets, so the application does not waste power searching for further targets. The image below gives you an idea of how this feature can look for a simple interactive card game.
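The target-relation APIs reduce to simple geometry over the targets’ poses. A sketch of the distance part, with assumed position shapes (this is not the actual callback API):

```javascript
// Sketch of the "how far apart are two targets" idea. The position shape
// {x, y, z} is an assumption for illustration; the real SDK exposes such
// information through its multi-target callbacks.
function targetDistance(a, b) {
  const dx = a.x - b.x;
  const dy = a.y - b.y;
  const dz = a.z - b.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// Capping the number of simultaneous targets avoids wasting power
// searching for more than the experience needs.
function trackTargets(detected, maxTargets) {
  return detected.slice(0, maxTargets);
}
```

In a card game, for example, comparing pairwise distances against a threshold is enough to trigger an interaction when two cards are placed next to each other.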

Boosting recognition to unparalleled distances

All developers and users who require images to be recognized from far away in their augmented reality scene should take a look at the extended range recognition feature included in SDK 7. By using more information from the camera frame, SDK 7 triples the recognition distance compared to previous SDK versions. This means that an A4/US-letter sized target can be detected from 2.4 meters/8 feet. Put differently, images that cover only 1% of the screen area can still be accurately recognized and a valid pose successfully calculated. The SDK enables this mode automatically for capable devices (auto-mode); alternatively, developers can manually enable or disable it. When testing the feature against competing SDKs, we did not find any other implementation delivering this kind of recognition distance. All in all, this means easier handling for your users and more successfully recognized images.

Bitcode, Swift, Metal – iOS developers rejoice

This brings us to a chapter dedicated to iOS developers, as SDK 7 brings several changes and additions for this group. First of all, the Wikitude SDK now requires iOS 9 or later, which shouldn’t be a big hurdle for the majority of apps (currently nearly 95% of devices meet this requirement). With SDK 7, iOS developers can now build apps including the Wikitude SDK using the bitcode option. Apps built with bitcode have the benefit of being smaller, as only the version necessary for the actual device architecture (armv7, armv7s, arm64) is delivered to the user rather than a fat binary including all architectures.
As a more than welcome side effect of restructuring our build dependencies to be compatible with bitcode, the Wikitude SDK can now also run in the iOS simulator. You still won’t see a camera image from your webcam in the simulator, but you can use pre-recorded movie files as camera input.

In SDK 6.1 we introduced support for OpenGL ES 3 as a graphics API. SDK 7 now also lets you use Metal as your rendering API in Native and Unity projects. Speaking of new things, Wikitude SDK 7 also includes an extensive Swift sample explaining how to integrate the Wikitude SDK into a Swift application. Note that the API itself is still an Obj-C API, but the sample makes it a lot clearer how to use it within a Swift environment.

We haven’t forgotten Android

Android developers will be happy to hear that the Android version of Wikitude SDK 7 uses a different sensor implementation for Geo AR experiences. The result is smoother and more accurate tracking when displaying geo-related content. For Android, we are also following the trend and raising the minimum Android version slightly, to Android 4.4 or later, which still covers at least 90% of Android devices.

We hope you can put SDK 7 and its additions to good use in your AR project. We love to hear from you and we are keen to receive suggestions on how to make the Wikitude SDK even more useful to you!

Start developing with Wikitude SDK 7

Getting started with SDK 7 has never been easier! Here’s how:

Help us spread the news on Twitter, Facebook and Linkedin using the hashtag #SDK7 and #Wikitude.


Wikitude at AWE USA 2017: Auggie Awards, talks and more

The highly anticipated AWE USA 2017 has come and gone and now that the dust has settled it is safe to say…it was awesome! 

The largest augmented and virtual reality exposition and conference in the world is growing stronger. This year’s event, which gathered 4700 attendees, gave the AR+VR community an excellent chance to exchange knowledge, share news, demonstrate technologies and, of course, to have some interactive AR+VR fun.

Wikitude has participated in every AWE USA event so far, and here is how the Augmented World Expo 2017 unfolded for us.

Partnership announcement with Lenovo NBD

With a great kick-start at the AWE USA 2017 Press Conference, Wikitude CEO Martin Herdina talked about our recent 1 billion app installs achievement as well as some practical applications of our markerless tracking technology launched previously this year. Additionally, he also spoke about the importance of partnership with industry leaders before formally announcing our collaboration with Lenovo New Vision.

Lenovo NBD is launching an Augmented Human Cloud, powered by Wikitude’s intelligent recognition engine and Markerless SLAM technology, and their COO, Oscar Gu, says that “the goal of the AH Cloud is to reduce the AR applications development term from two weeks to two hours”. Impressive stuff!

Wikitude: winner of 2017 Auggie Awards for ‘Best Developer Tool’

Wikitude had the honor of stepping on stage once more, but this time for a different reason: to receive an Auggie Award. Wikitude’s SLAM SDK was recognized as the ‘Best Developer Tool’ and the award was accepted with pride and will certainly be an incentive for Wikitude to keep innovating, pushing boundaries and evolving in the powerful realm of AR.

Wikitude and Walmart explore AR in retail

Martin Herdina took the stage once again to speak about the value of AR in retail, tackling a consumer perspective. He covered market facts, tendencies, and statistics followed by several interesting use cases varying from pre-shopping and onsite tools, home shopping through product visualization as well as brand engagement.

The AR retail oriented talk was finished off by Walmart’s Systems Analyst, Steven Lewis, who shared the AR introduction process experienced by Walmart’s internal development team as well as a practical use case utilized for modular (shelf) set up. If you are interested in Walmart’s Journey into AR // How AR Creates Real Value this AWE Consumer Track talk is for you.

What’s next with Wikitude

For the developer crowd, our CTO Philipp Nagele prepared a speech about the company’s background, followed by an in-depth view of present and future Wikitude developments as well as what the next version of the Wikitude SDK will offer augmented reality developers. On top of that, he was also throwing chocolates around. If you are curious, watch What’s Next with Wikitude to see this great AWE Develop Track talk in its entirety.

In between talks, press announcements and demos, we had a chance to connect with some amazing people and give them a sneak peek of what’s to come. If you didn’t make it to AWE, stay tuned to hear first hand some exciting news in just a few weeks!


Join Wikitude at AWE 2017

Augmented and Virtual Reality enthusiasts…unite!

If you happen to be one of the lucky 5,000 attendees expected to visit the 8th annual edition of AWE USA, the largest AR+VR event in the world, you are in for a treat.

The three-day conference, which starts today – May 31st – is being held at the Santa Clara Convention Center in California. Apart from exploring the “Superpowers to Change the World” theme and showcasing 250+ speakers, organizations, and startups, AWE 2017 will also introduce an amusing and highly interactive 20,000 m² AR+VR experience center, also known as the “AWE Playground”.

Visitors will be able to explore over 100,000 m² of exposition ground and connect with a total of 200 innovative exhibitors, including Wikitude, which is excited to be, once again, an AWE participant, silver sponsor, featured speaker, and Auggie Award finalist.

Follow Wikitude at AWE USA 2017 (Booth #634)

Featured Talks

“Walmart’s Journey into AR // How Augmented Reality Creates Real Value in Retail” 
For those interested in learning about key success factors in AR-powered service in Retail and the benefits that arise from innovative augmented reality use, this one is for you. Wikitude CEO Martin Herdina shares the stage with Walmart’s Systems Analyst, Steven Lewis, on June 1st from 12:15 pm to 12:30 pm, Room J (Consumer Track). 

“What’s Next with Wikitude”  

Wikitude CTO Philipp Nagele will present an in-depth look into the company’s recent developments and talk about what the next version of the Wikitude SDK will offer augmented reality developers. Don’t miss the session happening today (May 31st) from 1:30 pm -2:15 pm, Room 209/210.

Press Conference

Want to hear the BIG news? Wikitude is among the selected group of companies speaking at AWE’s press conference this year. Join the most influential tech journalists in the industry on Thursday – Jun 1st, at the Main Stage starting from 9:30 am to hear the latest and greatest news on augmented reality, virtual and mixed reality. Companies joining the AWE press conference are:

Shadow Creator

Auggie Awards

You voted and we got there! With the highest public voting count, we are proud to have been announced as an Auggie Award Finalist.

Wikitude’s SLAM (Simultaneous Localization and Mapping) SDK is competing in the Best Developer Tools category and winners will be announced on June 1st. Thanks to our strong community for the support!

Booth 634 is the place to be

Last but not least, check out Wikitude’s recent developments and meet some of the creative minds behind our tech at booth 634 in the Tools Pavilion. Wikitude will be demonstrating its latest technology advancements, including its most popular feature, Instant Tracking.

Be WOWed by our secret Magic Wand demo, see the ‘whole world’ change in front of your eyes, and don’t forget to take your freebie home: an exclusive Superhero Hyperphoto powered by LifePrint.


The Washington Post launches augmented reality series powered by Wikitude

You might have heard the big news: The Washington Post announced the beginning of its augmented reality journey. Powered by Wikitude, the renowned American daily newspaper launched an interactive AR series to creatively engage readers and transform storytelling.

The AR-enhanced series, initially planned to be divided into six installments, allows readers to explore first-hand some of the world’s most iconic buildings. But how?

The first story revolves around Elbphilharmonie, Hamburg’s world famous concert hall known for its refined acoustic capabilities. Users of the Washington Post Classic iPhone app now have the ability to experience, from the comfort of their own home, what it’s like to gaze upon the highly advanced acoustic panels in action.

When users point their phone at their ceiling, an animated projection of the acoustic panel layout is prompted to demonstrate how its impeccable sound is absorbed, transmitted, reflected and ultimately produced. A chance to “see what perfect sound looks like” – in the Post’s own words.

To create this experience, The Washington Post utilized Wikitude’s SLAM (Simultaneous Localization and Mapping), launched earlier this year with SDK 6. This technology enables any Android or iOS device, including smart glasses, to instantly track the user’s environment and layer interactive AR content into the real world without the need for markers.

The Washington Post’s head of product, Joey Marburger, is confident they are on the right track – “We think [AR will] be more widely adopted — you can really see it bubbling up — and we wanted to be at the forefront of that so by the time it takes off, we’re really good storytellers there”.

Learn more about The Washington Post’s new AR series in the original article, and get started with Wikitude’s SLAM technology today.


Wikitude is among the top 1% rated startups by Early Metrics

Wikitude is among the top 1% rated startups, according to European-based rating agency Early Metrics. As a pioneer in the augmented reality industry, Wikitude was awarded 83 out of 100 points, placing the company in Early Metrics’ prestigious club of five-star startups.

Early Metrics’ ratings, which focus on key non-financial metrics, provide an independent assessment of a venture’s growth potential. The ratings support decision-makers, such as investors and corporations, in identifying and understanding in depth the most innovative startups across Europe.

Wikitude is the world’s leading independent AR technology provider with a robust ecosystem of over 100,000 registered developers and 20,000 published apps covering a wide variety of industries and use-cases. Its fully in-house developed SDK enables enterprises, agencies, and developers to create powerful AR solutions for mobile devices and smart glasses that delight users and provide tangible ROI.

With the addition of SLAM technology in January 2017, Wikitude’s SDK is the AR market’s most comprehensive developer tool with a combination of location-based, image-based recognition and tracking, and 3D tracking capabilities. Wikitude’s SLAM-based markerless tracking is the most versatile 3D tracking system available for mobile today.

Martin Herdina, CEO at Wikitude, says: “We are delighted by this award. Over $2 billion of investments went into AR/VR in 2016. Being among the top 1% of startups rated by an independent third party shows that we are headed in the right direction in a super exciting market segment. You can expect even more exciting news from Wikitude this year”.

About Wikitude®
Wikitude is the world’s mobile augmented reality (AR) pioneer and leading AR technology provider for smartphones, tablets and digital eyewear on both iOS and Android. Its fully in-house developed AR technology is available through its SDK, Cloud Recognition and Studio products enabling brands, agencies and developers to achieve their AR goals. Wikitude® is a registered trademark of Wikitude GmbH, for more information please visit:

About Early Metrics
Early Metrics is the pan-European rating agency for startups and innovative SMEs, analyzing non-financial metrics to assess their growth potential. Ratings are free for entrepreneurs and provide them with a third-party assessment supporting their growth development. Established in London, Paris, and Tel Aviv, Early Metrics works on behalf of private and institutional investors as well as corporate ventures and business units. To get rated or to access rating reports:


Digital agencies

Roomle makes furniture shopping so much more fun

Let’s face it – despite the bright colors, clever Swedish design, and exceedingly happy salespeople, few of us would actually choose to spend an entire Saturday at IKEA. It’s a chore! But good news for you – we now practically live in Back to the Future, and augmented reality is, in this case, going to help you spend less time on life administration by streamlining the time you spend planning, designing, and purchasing furniture for your home.

It’s not only one of the best use cases of AR, it’s also one of the most obvious: planning and designing interior spaces using easy-to-understand visuals – while you stand in the space you’re planning. Of course, IKEA already tried it back in 2013 – but the technology has advanced significantly. Now start-up app Roomle is making the process even easier – using Wikitude’s SLAM 3D tracking.

Check out the short demo below:

The benefits are clear, for everyone involved: less hassle, less travel, quicker and more intuitive understanding of how a space will look and feel. Customers love it because it makes their lives easier; retailers love it because it means more sales, and less overhead on showrooms and stores. And those are just the big benefits – here’s a few more:

  • Real-time supply with up-to-date and individually relevant product information
  • Visualization of residential environments and interior architecture
  • Interactive interface creates strong brand connection
  • The personalization factor is enhanced by the unique usability
  • Simpler presentation of complex products
  • Products ‘stick’ in the consumer’s memory and are recognized more quickly

So what makes Roomle the AR room design app of the future? The stuff behind the scenes. It’s got an incredibly simple user interface – users can jump into the app and start designing rooms and spaces intuitively. At home, they can simply select a product from the catalogue and use their phone’s camera to see it in live space. Key here is our SLAM 3D markerless tracking tech – without ‘seeing’ the room, the app wouldn’t be able to place the object in the room for you to see.

Using Roomle is this easy

Screenshot of the Roomle app, with a 3D white chair overlaid on the floor
Roomle is even more impressive in the hands of a trained professional (that’s a nice way of saying ‘salesperson’!). It turns an iPad into a custom furniture showroom. Sales staff can pick furniture from the brand catalog, configure it according to the customer’s preferences, and demonstrate the result in convincing 3D or augmented reality views, live in any room. See the longer explanation of how Roomle works here.

So now that we’ve arrived at the future, what’s the future of the future? Good question – for one, we can imagine one-click ordering (à la Amazon) combined with the flat-packing genius of IKEA to facilitate home shopping even further: take a picture, pick your product, click ‘purchase’, and it shows up at your door a day later. What follows? Pre-fabbed house construction – calculate the price of a new floor, painting a room, or installing an addition to your home.

If you’ve been thinking about making some changes around the house, but the hassle of getting out the measuring tape, doing the research, and going shopping has been holding you back – wait no more, give Roomle a try!

Roomle is powered by Wikitude. Get started with the Wikitude SDK today!

SDK releases

Here comes SDK 6.1

Update (August 2017): Object recognition, multi-target tracking and SLAM: Track the world with SDK 7

When we launched Wikitude SDK 6 nearly 2 months ago, we were excited to see developers jump onto markerless SLAM tracking and creating fascinating augmented and mixed reality experiences. Today we are proudly releasing an update for SDK 6 with stability updates and a few new features. Check out what’s new:

Support for OpenGL ES 3.x Graphics API:

Over the past weeks, we have seen that many augmented reality projects are based on more modern graphics API than OpenGL ES 2.0. Developers using Wikitude SDK 6.1 can now make use of OpenGL ES 3.x. Support for Metal Graphics API is currently being worked on and a new iOS rendering API will be included in our next release.

Improved stability for image tracking:

This release comes with an updated computer-vision engine for image tracking, which delivers smoother AR experiences. Particularly when holding the device still or using larger augmentations, developers will notice more stable tracking and little to no jittering. See the video below for a performance comparison.

Reworked communication from JavaScript API to native code:

An essential part of the JavaScript API is the ability to communicate with parts of the app that are not involved in the augmented reality experience as such, often written in Obj-C or Java. This communication has been based on a custom URL protocol to send and receive data. In Wikitude SDK 6.1 we are introducing a different approach to the communication between JavaScript API and native code based on exchanging JSONObjects directly.
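The difference between the two approaches can be sketched as follows. The URL scheme and handler name here are assumptions for illustration, not the exact SDK interfaces:

```javascript
// Sketch of the two JS-to-native communication styles (scheme and handler
// names are illustrative assumptions, not the exact Wikitude interfaces).

// Old style: data squeezed into a custom URL that the native side (Obj-C
// or Java) must intercept and parse back apart.
function sendViaUrlProtocol(action, payload) {
  return (
    "architectsdk://" + action + "?json=" + encodeURIComponent(JSON.stringify(payload))
  );
}

// New style (SDK 6.1): hand a JSON object to the native bridge directly,
// with no URL encoding or string parsing in between.
function sendViaJsonObject(bridge, payload) {
  bridge.onJSONObjectReceived(payload);
}
```

Exchanging objects directly removes the encode/parse round trip, which is both less error-prone and easier to extend with structured data.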

Several stability updates:

SDK 6.1 comes with many stability updates and improvements – the most noticeable being the fix of a nasty bug that prevented 2D and 3D augmentations from being rendered separately. With this fix, the z-order of augmentations is properly respected. Additionally, developers can now use the ADE.js script again to debug experiences in the browser.

For a full list of improvements and fixes, make sure to check out the release notes for all supported platforms and extensions:

For customers with an active subscription, the update is free – depending on your current license key, it might be necessary to issue a new key. Please reach out via email for any license key related issues.

All other existing customers can try out Wikitude’s cross-platform SDK 6.1 for free and purchase an upgrade for Wikitude SDK 6.1 anytime.

The update SDK package is available on our download page for all supported platforms, extensions and operating systems. If you have any questions feel free to reach out to our developers via the Wikitude forum.

SDK releases

Wikitude SDK 6: See beyond reality with SLAM

Update (August 2017): Object recognition, multi-target tracking and SLAM: Track the world with SDK 7

Introducing SDK 6, Wikitude’s powerful SLAM solution for Augmented Reality apps

We are excited to announce the latest version of our augmented reality SDK, powered by the all-new Wikitude 3D SLAM engine.

SDK 6 combines top-notch image recognition and tracking, improved geo-location AR, and new SLAM-based 3D tracking technology, making it the world’s most comprehensive augmented reality SDK for mobile, tablets, and smart glasses.

With this release, Wikitude empowers the 400,000+ AR developers worldwide to transform their cross-platform apps into full-stack augmented reality experiences. Here is an overview of the new features: 

Wikitude 3D tracking technology – Go markerless with Instant Tracking

Ditch the markers! With SDK 6, Wikitude introduces its SLAM-based 3D engine to the world. Our all-in-one solution is robust, accurate and 100% in-house developed.

Instant Tracking is the first feature using Wikitude’s 3D tracking technology. It allows developers to easily map environments and display augmented reality content without the need for target images (markers). This feature works in both indoor and outdoor environments, and is suitable for a wide range of industries including medical, architecture, gaming, industrial machinery, real estate, marketing, and more.

For details on how to get started with Instant Tracking, visit our documentation, try our sample app (included in the download package), and watch the instruction video to learn how to track the environment. SDK 6’s markerless augmented reality feature is available for Unity, JavaScript, Native, extensions, and smart glasses (Epson and ODG).

Gestures – Play with augmentations like never before

The new “gestures” feature allows developers to freely place and interact with multiple augmentations in both marker-based and markerless AR experiences. Unlike other SDKs, Wikitude now enables full control of AR content with support for multi-touch gestures: drag, rotate, zoom, and pan augmentations to resize and reposition them. This feature was designed with the end user in mind, making AR experiences more dynamic, intuitive, and fun.
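The geometry behind such multi-touch gestures is straightforward. The following sketch shows how two touch points, sampled before and after a two-finger move, can be turned into scale and rotation deltas (illustrative math only, not the SDK's internal implementation):

```javascript
// Compute scale and rotation deltas from two touch points, sampled
// before and after a two-finger gesture (illustrative math only).
function gestureDelta(before, after) {
  const dist = (p) => Math.hypot(p[1].x - p[0].x, p[1].y - p[0].y);
  const angle = (p) => Math.atan2(p[1].y - p[0].y, p[1].x - p[0].x);
  return {
    scale: dist(after) / dist(before),     // pinch factor
    rotation: angle(after) - angle(before) // twist angle in radians
  };
}

// Fingers move apart to twice the distance and rotate 90 degrees.
const d = gestureDelta(
  [{ x: 0, y: 0 }, { x: 1, y: 0 }],
  [{ x: 0, y: 0 }, { x: 0, y: 2 }]
);
```

Applying `scale` to a drawable's size and `rotation` to its orientation yields the familiar pinch-to-zoom and twist-to-rotate behavior.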

Robust AR everywhere – track targets in rough conditions

Low-light conditions, shadows, noisy backgrounds, shiny surfaces… SDK 6’s new computer vision engine is prepared to deliver AR in any environment. This means increased performance, unprecedented accuracy (92% recognition rate), and faster recognition speed for target collections.

Developers can also benefit from a new file format optimized for best performance in rough conditions. For more information on the new file format and compatibilities, please see our documentation.

Additional features

Improved Extended Tracking – first introduced in SDK 5, Extended Tracking allows developers to extend experiences beyond targets. Once the target image is recognized, users can continue the AR experience by freely moving their devices without needing to keep a marker in the camera view. Extended Tracking now shares the same SLAM algorithm as Wikitude’s Instant Tracking feature, providing more robust performance.

Advanced camera options – AR has never looked as realistic as with this sharp new feature. Enjoy high-definition camera rendering on new devices and 60 fps for smoother AR experiences. SDK 6 comes with two camera setting options that give developers full control of their AR experiences: choose auto mode, or select your preferred rendering quality from standard definition (SD) to full high definition (FullHD). Moreover, developers now have extended control over the focus behaviour of the camera.

Positioning – Let your augmentations fly. The Positioning feature enables free positioning of augmentations on targets in any dimension. Content associated with a geo-location can now also be freely positioned in any direction.

Start developing with Wikitude SDK 6 across platforms

Getting started with Wikitude’s new SLAM-based SDK is super easy! Here’s how:

  1. Download SDK and sample app 
  2. Check out our documentation
  3. Select your license plan
  4. Got questions? Our developers are here for you! 

If you’re already working with Wikitude, check out our new and simpler pricing plan. We can’t wait to see your projects using SDK 6! 

Help us spread the news on Twitter, Facebook and LinkedIn using the hashtags #SDK6 and #Wikitude.

SDK releases

Wikitude SDK 6 – A technical insight

Update (August 2017): Object recognition, multi-target tracking and SLAM: Track the world with SDK 7

In this blog post, Wikitude CTO Philipp Nagele shares some insights into the technical background of SDK 6 and the changes that go along with the new version of Wikitude’s augmented reality SDK.

The planning and work for SDK 6 reach far back into 2015. While working on the previous release 5.3 to add support for Android 7.0 and iOS 10, we already had a clear plan on what the next major release of our SDK should include. It is very gratifying that we now can lift the curtain on the scope and details of what we think is the most comprehensive release in the history of Wikitude’s augmented reality SDK.

While Instant Tracking is without doubt the highlight feature of this release, there are many more changes included that are worth mentioning. In this blog post, I’ll summarize the noteworthy changes and share some additional insights.

Pushing the boundaries of image recognition

Most of our customers choose the Wikitude SDK for the ability to recognize and track images for various purposes. The engine that powers it has been refined over the years, and, already with SDK 5, reached a level of performance (both in speed and reliability) that puts it at the top of augmented reality SDKs today.
With Wikitude SDK 6, our computer vision engine took another major step forward. In more detail, the .wtc file format now uses a different approach to generating search indexes, which improves the recognition rate. Measured on the Stanford MVS data set, SDK 5 delivered a recognition rate of around 86%, while SDK 6 now recognizes 94 out of 100 images correctly. Moreover, the recognition rate stays above 90% independent of the size of the .wtc file. So, no matter whether your .wtc file includes 50 or 1,000 images, users will successfully recognize your target images.
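The quoted figures are plain hit ratios; a tiny helper (not part of the SDK) makes the calculation explicit:

```javascript
// Recognition rate: correctly recognized targets divided by total trials.
function recognitionRate(results) {
  const hits = results.filter(Boolean).length;
  return hits / results.length;
}

// 94 correct recognitions out of 100 trials, as quoted for SDK 6.
const trials = Array.from({ length: 100 }, (_, i) => i < 94);
const rate = recognitionRate(trials); // 0.94, i.e. 94%
```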

Another development we have been working on lately is embracing the power of genetic algorithms to optimize our computer vision algorithms. Several thousand experiments and many hours on our servers led to an optimized configuration of our algorithms. The result is a 2D image tracking engine that tracks targets in many more sceneries and lighting conditions. You can get a first impression of the increased robustness in this direct comparison between the 2D engine in SDK 5 and SDK 6. The footage is unedited and shows a setup with several challenging factors:

  • Low-light condition (single spot light source)
  • Several occluding objects
  • Strong shadows further occluding the images
  • Busy scenery in general
  • Reflections and refractions

The improvements in SDK 6 make the 2D engine more robust while keeping the same performance in terms of battery consumption and speed.
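The genetic-algorithm tuning mentioned above can be pictured with a toy example: evolve a single numeric parameter against a fitness function. Everything here (the fitness function, population size, mutation scheme) is hypothetical and only illustrates the general technique, not Wikitude's actual tuning pipeline.

```javascript
// Toy genetic algorithm: evolve one numeric parameter to maximize a
// fitness function (a synthetic stand-in for tracking robustness).
function evolve(fitness, generations, populationSize) {
  // Start from random candidates in [0, 10).
  let pop = Array.from({ length: populationSize }, () => Math.random() * 10);
  for (let g = 0; g < generations; g++) {
    pop.sort((a, b) => fitness(b) - fitness(a));      // rank by fitness
    const parents = pop.slice(0, populationSize / 2); // selection
    const children = parents.map((p) => p + (Math.random() - 0.5)); // mutation
    pop = parents.concat(children);                   // elitism: parents survive
  }
  pop.sort((a, b) => fitness(b) - fitness(a));
  return pop[0]; // best parameter found
}

// Synthetic fitness peaking at a parameter value of 7.
const fitness = (x) => -Math.abs(x - 7);
const best = evolve(fitness, 50, 20);
```

In the real setting, evaluating fitness means running the tracking engine over test footage with a candidate configuration; the loop structure stays the same.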

Instant Tracking – SLAM in practice

You might have seen our previous announcements of advancements in not only recognizing and tracking two-dimensional images, but also working with entire maps of three-dimensional scenes. It is no secret that Wikitude has been working on several initiatives in the area of 3D tracking.

For the first time, Wikitude SDK 6 includes a generally available feature based on a 3D computer vision engine that has been developed in-house over the past two years. Instant Tracking uses a SLAM approach to track the device’s surroundings and localize the device within them. In contrast to image recognition, Instant Tracking does not recognize previously recorded items; it instantaneously tracks the user’s surroundings. While the user keeps moving, the engine extends the recorded map of the scenery. If tracking is lost, the engine immediately tries to re-localize and start tracking again, with no need for user input.

Instant Tracking is true markerless tracking: no reference image or marker is needed. The initialization phase is instant and does not require a special initialization movement or pattern (e.g. the translation movement PTAM requires).
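The lifecycle described above – initialize on a user-confirmed origin, track, and re-localize automatically when tracking is lost – can be modeled as a small state machine. The following sketch is purely illustrative and does not mirror the SDK's internals:

```javascript
// Minimal state machine mirroring the Instant Tracking lifecycle
// described above (illustrative only, not the SDK's internals).
const State = {
  INITIALIZING: "initializing",
  TRACKING: "tracking",
  RELOCALIZING: "relocalizing"
};

function createInstantTracker() {
  let state = State.INITIALIZING;
  return {
    get state() { return state; },
    // The user confirms the alignment indicator: tracking starts.
    confirmOrigin() {
      if (state === State.INITIALIZING) state = State.TRACKING;
    },
    // Per-frame update: if the recorded map is lost, re-localize
    // automatically; once it is found again, resume tracking.
    onFrame(mapVisible) {
      if (state === State.TRACKING && !mapVisible) state = State.RELOCALIZING;
      else if (state === State.RELOCALIZING && mapVisible) state = State.TRACKING;
    }
  };
}

const tracker = createInstantTracker();
tracker.confirmOrigin(); // initialization -> tracking
tracker.onFrame(false);  // map lost -> re-localizing, no user input needed
tracker.onFrame(true);   // map found again -> tracking resumes
```

Note that re-localization returns to tracking without any user action; only the initial origin confirmation involves the user.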

The same engine used for Instant Tracking runs in the background for Extended Tracking. Extended Tracking uses an image from the 2D computer vision engine as its initialization starting point, instead of an arbitrary surface as in the case of Instant Tracking. After a 2D image has been recognized, the 3D computer vision engine starts recording the environment and stays in tracking mode even when the user is no longer viewing the image.

For details on how to get started with Instant Tracking, visit our documentation, see our sample app (included in the download package), and watch the instruction video to learn how to track the environment. SDK 6’s markerless augmented reality feature is available for Unity, JavaScript, Native, extensions, and smart glasses (Epson and ODG).

Putting the Pieces Together – Wikitude SDK and API

What is the strength of the Wikitude SDK? It is much more than a collection of computer vision algorithms. When planning a new release, the product and engineering teams aim to create a cross-platform SDK that is highly usable. We try to think of use cases for our technology, then identify missing features. So it should come as no surprise that Wikitude SDK 6 is packed with changes and new features beyond computer vision.

The most obvious and noticeable change, especially for your users, is FullHD rendering of the camera image. Previously, Wikitude rendered the camera stream in standard definition (SD) quality, which was perfect back in 2012 when the Wikitude SDK hit the market. Since then, device manufacturers have introduced Retina displays and pixel densities beyond what the eye can distinguish. An image rendered in VGA resolution on such a display just doesn’t look right anymore. In Wikitude SDK 6, developers can now choose between SD, HD, or FullHD rendering of the camera stream.
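Selecting a rendering quality with a graceful fallback might look like the following helper. The function name and the capability list are hypothetical; the actual SDK exposes its own camera settings API:

```javascript
// Pick the highest rendering quality a device supports, falling back
// toward SD (hypothetical helper, not the actual Wikitude settings API).
const QUALITIES = ["SD", "HD", "FullHD"]; // ascending order

function pickRenderingQuality(requested, supported) {
  for (let i = QUALITIES.indexOf(requested); i >= 0; i--) {
    if (supported.includes(QUALITIES[i])) return QUALITIES[i];
  }
  return "SD"; // safe default every device can handle
}

// A device without FullHD support falls back to HD.
const chosen = pickRenderingQuality("FullHD", ["SD", "HD"]); // "HD"
```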

Additionally, on some devices, users can now enjoy a smoother rendering experience, as the rendering frequency can be increased to 60 fps. For Android, these improvements are based on new support for the Android Camera2 API, which, since Android 5.0, is the successor to the previous camera API (technically, more than 60% of Android devices as of 1/1/2017 should run the Camera2 API). It allows fine-grained control of and access to the camera and its capabilities. While the API and the idea behind it are a welcome improvement, implementations of the Camera2 API vary considerably across Android vendors. Different implementations of an API are never a good thing, so support for the new camera features is limited to suitable Android devices.

“Positioning” was another feature needed to allow users to interact with augmentations; it is ideal for placing virtual objects in unknown environments. With Wikitude SDK 6, developers now have a consistent way to react to and work with multi-touch gestures. Dragging, panning, rotating – the most commonly used gestures on touch devices are now captured by the SDK and exposed in easy-to-understand callbacks. This feature has been implemented in such a way that you can use it in combination with any drawable in any of the different modes of the Wikitude SDK – be it Geo AR, Image Recognition, or Instant Tracking.

The new tracking technology (Instant Tracking) led us to another change which developers will encounter quite quickly when using SDK 6. Our previously used scheme of ClientTracker and CloudTracker no longer fit an SDK with a growing number of tracker types. SDK 6 introduces a different tracking scheme with more intuitive naming. For now, you will encounter ImageTracker with various resources (local or cloud-based) and InstantTracker, with more tracker types coming soon. We are introducing this change now in SDK 6 while keeping it fully backward compatible with the SDK 5 API, though parts of the SDK 5 API are now deprecated. The SDK comes with an extensive migration guide for all platforms, detailing the changes.

Last, I don’t want to miss the opportunity to mention two minor changes that I think can have a great impact on several augmented reality experiences. Both relate to visualizing drawables. The first change affects the way 2D drawables are rendered when attached to geo-locations. So far, 2D drawables have always been aligned to face the user when attached to a geo-location. Now, developers can align drawables as they wish (e.g. facing North), and the drawables will keep that alignment. The second change also affects 2D drawables: the new SDK 6 API unifies how 2D and 3D drawables are positioned, adding the ability to position 2D drawables along the z-axis.

Naturally, all of our official extensions are compatible with the newest features. The Titanium (Titanium 6.0) and Unity (Unity3D 5.5) extensions now support the latest releases of their development environments, and x86 builds are now available in Unity.

The release comes with cross-platform samples (e.g. gestures are demonstrated in a Snapchat-like photo-booth experience) and documentation for each of the new features, so you can start working with the new release immediately.

Start developing with Wikitude SDK 6

Getting started with Wikitude’s new SLAM-based SDK is super easy! Here’s how:

  1. Download SDK 6 and sample app
  2. Check out our documentation
  3. Select your license plan
  4. Got questions? Our developers are here for you! 

We are excited to see what you will build with our new SDK. Let us know what your favorite feature is via Twitter and Facebook using the hashtags #SDK6 and #Wikitude!