Scene Recognition and Tracking: Augmented Reality Use Cases and How-to


From large industrial machinery to showrooms, scene tracking enables the creation of continuous and persistent AR experiences for areas and large-scale objects.

As the complexity of augmented reality use cases grows, computer vision technology evolves to fulfill new requirements and expand the understanding of the world around us.

Catalyzed by COVID-19, AR experiences that digitally enhance outdoor environments, homes, and workspaces have gained significant momentum, opening the possibility for people to connect remotely and digitally enhance their surroundings.

A crucial functionality in augmented reality and spatial computing is Scene Tracking – a unique feature that can be used to track pre-determined environments and large-scale objects.

By identifying reference points and features in your chosen scene or area, augmented reality content can be displayed and accessed on a wide variety of phones, tablets, and AR smartglasses.


Scene Tracking AR feature and supported targets

Scene Tracking, sometimes referred to in the market as Area Targets, empowers a wide variety of use cases: maintenance and remote assistance, training and onboarding, and digitalization of the visitor/user experience in the context of retail, tourism, museums, gaming, and more.

To trigger this marker-based AR, the device must detect a known target; the object or scene therefore needs to be mapped in advance.

Structures that can be mapped as AR targets include but are not limited to:

  • Exhibition booths and showrooms
  • Selected environments in retail stores, rooms, and apartments
  • Large, complex machinery on factory floors
  • Building façades
  • Sections of indoor spaces
  • Museums
  • Squares, fountains, feature-rich courtyards

Scene Tracking Use Cases

This AR feature recognizes and tracks larger structures that go beyond small-sized objects, such as area targets and machinery. Digital content can be added in the form of annotations, videos, step-by-step instructions, links, directions, text, 3D augmentations, and more.

Enterprise and Industrial Setting

Scene Tracking can help digitalize workspaces by providing frontline workers access to immersive workflows with real-time information on the industrial setting. On-demand AR instructions and assistance on the factory floor help staff work faster and more safely.

Navigation on the factory floor can help teams involved in multiple procedures to work more intuitively. In addition, step-by-step guidance and on-the-fly virtual notes can be attached to machines, facilitating knowledge transfer and streamlining communication across shifts.

AR-enabled training and onboarding can help companies save time and money when welcoming a new workforce, guiding technicians through tasks and helping them connect with remote experts via video calls.

AR-Enabled User Experience

Scene Tracking is a powerful tool to connect people to places. Hotels, museums and retail stores can attract visitors by digitally enhancing the exterior of the building or store façade with a digital sneak peek at what’s inside. Promotion, gamification, and additional AR touchpoints incentivize visitors to come inside.

Tourist destinations can offer information on demand in multiple languages, so visitors can, for example, learn about historical sites, monuments, squares, fountains, and more.

Object Targets with scenes: how to create a 3D target map reference

To build Object Targets based on scenes, it is necessary to create a pre-recorded map of the object that will then be used to trigger the AR experience.

The map creation workflow is simple:

  • Collect images* or 3D models of the object or scene (best practice guide)
  • Convert images into a Wikitude Object Target Collection (.wto) using Studio Editor
  • Use the .wto file in your AR app project
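Once generated, the .wto file is loaded by a tracker in the app. The sketch below illustrates this last step using the Wikitude JavaScript API as documented for SDK 8-era releases; class names, callbacks, and file paths here (such as `assets/showroom.wto` and the overlay model) are illustrative assumptions and may differ in your SDK version:

```javascript
// Sketch only: requires the Wikitude SDK runtime (the AR namespace is
// provided by the SDK, not by a standalone JavaScript engine).
// File names below are placeholders for your own project assets.
var targetCollection = new AR.TargetCollectionResource("assets/showroom.wto", {
    onError: function (error) {
        // The .wto could not be loaded (wrong path, corrupt file, etc.)
        alert("Could not load target collection: " + error);
    }
});

var tracker = new AR.ObjectTracker(targetCollection, {
    onTargetsLoaded: function () {
        // The pre-recorded scene map is ready; recognition can begin.
    },
    onError: function (error) {
        alert("Tracker error: " + error);
    }
});

// A 3D augmentation to anchor to the scene (placeholder asset).
var overlay = new AR.Model("assets/annotation.wt3", {
    scale: { x: 0.1, y: 0.1, z: 0.1 }
});

// "*" matches any target in the collection; use a specific target
// name if the .wto file contains several mapped scenes.
var trackable = new AR.ObjectTrackable(tracker, "*", {
    drawables: { cam: [overlay] },
    onObjectRecognized: function () {
        // Scene detected: content attached to the map becomes visible.
    },
    onObjectLost: function () {
        // Scene lost: content is hidden until re-detection.
    }
});
```

In practice this script runs inside an AR "world" hosted by the SDK's native view on the device; consult the documentation for your platform and SDK version for the exact setup.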

For optimal performance, scanned spaces should have little to no variation compared to the 3D map generated for the target. Maps can be extended or altered to reflect changes in the environment (learn more): developers can add images taken against different backgrounds, or images covering additional areas of the object, to increase recognition accuracy.

For detailed instructions, access the how to create Object Targets section of the Wikitude Documentation.

*Keep in mind that the new and improved object mapping process (SDK 8 and up) uses images as source material. Previous SDK versions used video-based material instead.

Get Started with Scene Recognition and Tracking

To start creating your AR projects with a free Wikitude SDK trial key, create a Wikitude account and download the platform of your choice.

This account also gives you access to Studio Editor, our web-based tool that allows you to generate, manage, and publish your AR experiences without coding.

For commercial purposes, access our store to choose your package or contact our team to discuss which license is the best match for your AR project.
