Wikitude Product and Services End-of-Life Announcement

Well over a decade ago, we launched the Wikitude AR SDK. It enabled mobile developers to easily create or integrate augmented reality into their apps with the platforms they used and loved. Since then, more than 40,000 mobile apps have been created with Wikitude. These apps have reached millions of users and significantly shifted the way developers, consumers, workers, gamers and enthusiasts use AR.

As we look to the future of extended reality and focus on paving the way to outstanding headworn experiences, we need to make room for the new and bring Wikitude to its End-of-Life.

We want to say a wholehearted thank you to all of you – our community, customers and partners – for your trust throughout all these years!

The End-of-Sales (EOS) will take place on 21 September 2023, which means that no new purchases or subscription renewals will be possible from that date onward. The End-of-Life (EOL) will occur 12 months later, on 21 September 2024, when all Wikitude services will be shut down and stored data will be deleted.

Wikitude End Of Life timeline

It’s important to note that your SDK license keys will still work after the EOL date. However, since the software will no longer receive updates or technical support, it will become increasingly vulnerable to OS incompatibilities and security risks. Wikitude Studio, Cloud and Studio API access will be fully disabled on 21 September 2024, and the associated data will be deleted.

Technical support will be provided until the EOL date only to active subscribers, for the duration of their active subscription. Kindly note that we will not be able to assist with your migration efforts, and support will be reduced to a bare minimum.

Thank you for an incredible decade of building augmented reality for mobile – we invite you to join us in building the future of Extended Reality. We are here to answer any questions you might have.

The Wikitude Team

Find out more about what is happening next

You probably have many questions, so below we answer some of them:

Until when can I make a license purchase?

Off-the-shelf and enterprise purchases will be disabled on 21 September 2023. From that date, users will no longer be able to place new orders.

What happens with my upcoming subscription renewal?

Off-the-shelf and enterprise renewals will be disabled on 21 September 2023. From that date, users will no longer be able to renew a subscription. Should you wish to, you can already disable your upcoming renewal in your account (see the dedicated FAQ). All subscriptions that are still active on the EOS date will be blocked from renewal by our system.

Until when can I create a Wikitude user account?

Account registrations will be disabled on 21 September 2023. From that date, users will no longer be able to open an account. Account login and password reset will remain possible until 21 September 2024.

Until when can I generate an SDK license key for my purchase/renewal?

License keys can be generated until 21 September 2024 for all licenses purchased by 21 September 2023.

What does “available only for the duration of the subscription period” mean?

Technical support, Cloud and Studio API access will continue within the scope of your subscription type until the end of your subscription year. For example: If your current subscription started on 10 October 2022, you would have access until 9 October 2023. If your current subscription started on 6 June 2023, you would have access until 5 June 2024.
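
The date arithmetic above amounts to: access ends one year after the subscription start, minus one day. A minimal sketch of that rule (illustrative only; the binding dates are those on your invoice):

```python
from datetime import date, timedelta

def subscription_end(start: date) -> date:
    """Last day of access: one year after the start date, minus one day.

    Illustrative helper only; actual entitlement dates come from the invoice.
    """
    try:
        anniversary = start.replace(year=start.year + 1)
    except ValueError:  # a 29 February start in a non-leap year
        anniversary = start.replace(year=start.year + 1, day=28)
    return anniversary - timedelta(days=1)

print(subscription_end(date(2022, 10, 10)))  # 2023-10-09
print(subscription_end(date(2023, 6, 6)))    # 2024-06-05
```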

How will I get support as an active subscriber?

We are continuing technical support for all customers, but only for the duration of their active subscription. Kindly note that Wikitude technical support is being reduced to a bare minimum; therefore only requests associated with an active subscription, identified by invoice number, will be considered.
Support tickets should be opened via the Wikitude forum, using the email address used for the purchase. When logged in to the forum, you will see the link “New support ticket” on the home page. This takes you to a form where you can create a support ticket. Please make sure your subject line follows the recommended format, as your support request might otherwise be disregarded. The format is “INV-2345 / the issue line”. For example: “INV-73527 / Support Website Navigation Broken”.
Should you be entitled to technical support but unable to find the “New support ticket” link in the forum, please send us an email and make sure to include your invoice number.

Can I get technical support if I am not an active subscriber?

We are continuing technical support only for customers with an active subscription, for the duration of that subscription. Kindly note that Wikitude technical support is being reduced to a bare minimum; therefore only requests associated with an active subscription will be considered.

What will happen with my commercial SDK license key?

Unless stated otherwise in your custom agreement, commercial SDK licenses are valid for the app’s lifetime, even for subscriptions that are not renewed. Therefore, you can continue using the license key after the EOL date.
However, you will not have access to any other Wikitude services or software updates. Keep in mind that updates from Android, iOS, and Windows might affect the functionality of older Wikitude SDK versions and even lead to breaking changes.
App ID changes will not be accepted.
Please check this FAQ to see which SDK version your key supports.

What will happen with my trial key?

Free trial keys are disabled 45 days after the day of generation. Please check this FAQ to see which SDK version your key supports.

My Cloud and/or Studio API renewal was due soon after the 21st of September, but I still need some time. What should I do?

Renewals will be disabled starting 21 September 2023, and some of our customers’ subscriptions are due to renew soon after this date. We have therefore decided to extend access in those few cases, without requiring an actual renewal. The affected customers will be informed individually about the extension conditions and timeframe.

How do I access download packages if there is no public download page?

Starting 21 September 2023, the download packages can be accessed from the “My Account” page when you are logged in, via the Download button (in the upper part of the Overview section).

Will Wikitude assist with any migration efforts?

Kindly note that Wikitude support is being reduced to a bare minimum, therefore we will not be able to allocate resources to migration initiatives.

Are there any Wikitude SDK updates planned before the EOL?

There will be no new features and no feature advancements. Should you, as an active subscriber, see any breaking changes after updating to iOS 17 or Android 14, please let us know via a support ticket and we will look into it.
Please check this FAQ to see which SDK version your key is supporting.

What will happen with my Cloud data?

Wikitude Cloud will be maintained until the announced EOL date, and access is granted in combination with an active Cloud subscription. On 21 September 2024, the Cloud service will be completely shut down and all associated data will become unavailable. Your personal data will be deleted according to our privacy policy. Should you want to exercise your data subject rights, you can submit your requests here.
The raw images you used for your cloud collections cannot be downloaded from Wikitude Studio, as we only make them accessible in our proprietary data formats. To extend your app’s functionality with the Wikitude SDK, you can consider implementing offline recognition or migrating your image targets to your own cloud servers instead of Wikitude’s cloud-based image recognition service. In both cases you will need to work with the image WTCs or ZIPs, which you can download from Wikitude Studio until 21 September 2024.

What will happen with my Cloud access?

Cloud access will continue within the scope of your subscription type until the end of your subscription year. For example, if your current subscription started on 6 June 2023, you would have Cloud access until 5 June 2024.
Renewals will be disabled starting 21 September 2023, and some of our customers’ subscriptions were due to renew soon after this date. We have therefore decided to extend access in those few cases, without requiring an actual renewal. The affected customers will be informed individually about the extension conditions and timeframe.

What will happen to my Studio projects?

Wikitude Studio will be maintained until the announced EOL date. Until then, you can continue using its services. On 21 September 2024, the Studio will be disabled, and all associated data will become unavailable, including the hosting functionality and hosted projects. Your personal data will be deleted according to our privacy policy. Should you want to exercise your data subject rights, you can submit your requests here.
Please make sure to export all the projects you might still need in your app before this date. The raw images you used for your WTCs, ZIPs, and WTOs cannot be downloaded from Wikitude Studio, as we only make them available in our proprietary data formats.

What will happen to my Studio API access?

Studio API access will continue within the scope of your subscription type until the end of your subscription year. For example, if your current subscription started on 6 June 2023, you would have Studio API access until 5 June 2024.
Renewals will be disabled starting 21 September 2023, and some of our customers’ subscriptions were due to renew soon after this date. We have therefore decided to extend access in those few cases, without requiring an actual renewal. The affected customers will be informed individually about the extension conditions and timeframe.

What do I do if I still want to launch a project and I did not manage to buy a license key before the EOS?

We are ending sales on 21 September 2023, and from that point on it will not be possible to get new license keys for new projects. We are giving users a short window, between 12 September 2023 (first announcement date) and 21 September 2023 (end-of-sales date), to make any purchases needed for projects they have been working on recently.
Should you already know that you might need a license key for a new project in the near future, you can consider purchasing a license in this time frame and generating the license key later, when the app IDs are known.
Please be advised that your SDK license keys will still work after the EOL date. However, since the software will no longer receive updates or technical support, it will become increasingly vulnerable to OS incompatibilities and security risks. Wikitude Studio, Cloud and Studio API access will be fully disabled on 21 September 2024, and the associated data will be deleted.

Until when will the documentation and forum be available?

Until 21 September 2024. The download packages contain the documentation relevant to each SDK version, so we invite you to save a copy for future reference if necessary.

What happens with the Educational licenses?

The academic program is also ending, and we will not provide any new educational licenses after 20 September 2023. The validity and use rights remain as agreed for Edu licenses created until that date.

What happens with the Premium Partner Program?

The Premium Partner Program is also entering a wind-down phase and is closed to new partners as of 13 September 2023.

How can I reach you should I have additional questions or need further details?

Please contact us via

AR features

Markerless AR: how and where to use it

Markerless AR functionality allows developers to create digital applications that overlay interactive augmentations on physical surfaces, without the need for a marker.

We can all agree that computer vision is a key part of the future of augmented reality, mobile or otherwise. That’s why we’ve been working so hard on our Instant Tracking over the past years. If you are not yet familiar with this feature, Instant Tracking creates a seamless digital recreation of real-world things anytime, anywhere, without the user having to scan any image.

Instant Tracking is also the first feature using Wikitude’s Simultaneous Localization and Mapping (SLAM) technology. SLAM identifies the user’s precise location within an unknown environment by simultaneously mapping the area during the Instant Tracking AR experience.

This allows developers to easily map environments and display augmented reality content without the need for target images or objects (markers). Wikitude’s SLAM markerless augmented reality tracking is one of the most versatile cross-platform 3D-tracking systems available for mobile.

Our SDK also offers its own SLAM Instant Tracking technology which can be dynamically connected to ARKit and ARCore (Wikitude SMART). SMART is a seamless API which integrates ARKit, ARCore and Wikitude’s SLAM engine in a single cross-platform AR SDK.

This feature helps developers create their projects in JavaScript, Unity, Xamarin, PhoneGap, or Flutter without the need to deal with platform-specific ARKit/ARCore code. SMART dynamically identifies the end user’s device and decides whether ARKit, ARCore or Wikitude SLAM should be used in each particular case.
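
The device-dependent selection described above can be pictured as a small decision function. This is a sketch of the idea only, not Wikitude’s actual SMART implementation; the function and flag names are made up:

```python
def pick_tracking_engine(platform: str, arkit_ok: bool, arcore_ok: bool) -> str:
    """Choose a tracking back end the way a SMART-like layer might.

    Hypothetical logic: prefer the platform's native engine when the
    device supports it, otherwise fall back to Wikitude's own SLAM.
    """
    if platform == "ios" and arkit_ok:
        return "ARKit"
    if platform == "android" and arcore_ok:
        return "ARCore"
    return "Wikitude SLAM"

print(pick_tracking_engine("ios", arkit_ok=True, arcore_ok=False))       # ARKit
print(pick_tracking_engine("android", arkit_ok=False, arcore_ok=False))  # Wikitude SLAM
```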

Here are some interesting use cases for Wikitude’s markerless augmented reality feature:

Where you need to grab someone’s attention, immediately

Getting someone to look at your product is the first step of a good marketing strategy. For both marketing and retail implementations, augmented reality offers immense opportunity to do that. It’s new, easy to understand, and impossible to ignore.

Do you know when most people alive today first saw the concept of augmented reality (although they probably didn’t know it then)? In this scene with Michael J. Fox in Back to the Future Part II.

Maybe it’s not as slick and refined as today’s visual effects, but back in 1989 it was certainly surprising – and attention-grabbing. That’s part of the way AR still works today, especially over the next couple of years as widespread adoption continues to grow. The most important thing to remember? If you truly surprise someone, they’ll be sure to tell everyone they know all about it.

The potential here for retail outlets (online and in the physical world) is clear – customers can interact directly with the product and come as close as they can to touching and feeling it without having it in their actual hands.

Even more opportunity exists in gaming and entertainment – see, for example, apps that give sports fans an opportunity to collect crypto tokens, earn reward points and unlock experiences with their favorite sports clubs.

When you need to add one small piece of information

AR is at its best when it does just what it says: augment. AR can turn your phone into a diagnostic tool of unparalleled power – perceptive and reactive, hooked into the infinite database of the world wide web.

Adding a few, small, easy to understand bits of information to a real scene can simply help our mind process information so much more quickly – and clearly. Here’s a great example where an automobile roadside assistance service can help a customer diagnose a problem – without actually being anywhere on the roadside.

The opportunities here are endless – factory floor managers, warehouse workers, assembly-line technicians – anyone who needs real-time information, in a real-world setting. It’s a huge technological leap forward for the enterprise – just like when touchscreen mobile devices with third-party apps first appeared.

Where you need to show a physical object in relation to other objects

There’s a reason this idea keeps coming up – it solves a real-world problem, instantly, today.
Architecture, interior design – any creative profession that works in real world spaces can take advantage of augmented reality.

From visualizing artworks to virtually fitting furniture in your living room, the benefit here is clear – we understand how a potential real-world space will look and function so much better when we can actually see the objects we’re thinking of putting there, while we are there.

This last bit is why mobile AR is so important – if we want to make AR a practical technology, we have to be able to use it where we live, work, build and play, and we don’t want to drag a computer (or at least, a computer larger than your smartphone) everywhere we go to do it.

Here’s an example of placing designer clothing in a real-world setting, done by ARe app and powered by Wikitude:

By opening up endless opportunities to showcase products of any size (from industrial machines to cars and jewellery), markerless AR enables a new level of shopping experience that can take place directly on the customer’s mobile device at any time. Options such as a 360-degree product viewer, custom feature annotations and 24/7 access allow customers to configure and compare products, communicate with merchants and shop in the comfort of their homes.

So be creative in your AR applications – and do something surprising. Developers all over the world are already using Wikitude technology to build AR apps that grab attention and customers – and it’s already making their lives easier.

Markerless AR infographic

Want to dig in deeper? We’ve collected a few of our favorite use cases in the infographic, and a list of apps already using the technology in this YouTube playlist. Have a look and see what inspires you to make something inspiring!

Looking to get started with Markerless AR?

Interested in creating an AR project of your own?
Talk to one of our specialists and learn how to get started.

Contact The Wikitude Team
Dev to Dev

APIs: scaling up AR capabilities

An API (Application Programming Interface) allows applications to communicate. Serving as an access point to an app, an API enables users to access a database, request and retrieve information, or even alter data in other applications.

In this article, we introduce you to the functionality of Wikitude Studio and explain how you can benefit from using the Studio API in your AR app.

Introduction to Wikitude Studio API

Wikitude Studio is an easy-to-use, drag-and-drop, web-based AR content manager. Using Studio, you can easily create two types of targets – image targets (2D items) and object targets (3D items) – to further augment in your JS-based app. On top of that, you can add simple augmentations to test your targets and their position.

Not sure if the image target quality is good enough? Use the rating indicating the quality of an image target as a guide. Wikitude Studio API also enables conversion of image target collections to cloud archives and their management, making it possible to work with cloud-based recognition instead of on-app recognition. You can create and host your AR projects in Studio and link them directly to your app without exporting and pushing app updates.  

What does the Studio API do? 

Studio API allows you to access all the functionality mentioned above without logging in to Wikitude Studio. You can have your app or system programmatically communicate with the engine behind Wikitude Studio. The keyword here is “programmatically,” meaning the flow enables simplified app development, design, and administration and provides more flexibility. In practical terms, it allows users to quickly scale up and integrate AR capabilities into existing architecture. 
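
For illustration, here is what “programmatically” can look like in practice. The base URL, endpoint path, and header name below are placeholders, not the real Studio API contract; the point is that target management becomes an ordinary HTTP call that a backend can script:

```python
import json
import urllib.request

BASE_URL = "https://api.example.com"  # placeholder, not the real Studio API host
API_TOKEN = "your-api-token"          # placeholder credential

def build_create_collection_request(name: str) -> urllib.request.Request:
    """Assemble (without sending) a request that would create a target collection.

    The path and header below are illustrative assumptions; consult the
    archived Studio API documentation for the actual contract.
    """
    body = json.dumps({"name": name}).encode("utf-8")
    return urllib.request.Request(
        url=f"{BASE_URL}/targetCollection",
        data=body,
        headers={"Content-Type": "application/json", "X-Api-Token": API_TOKEN},
        method="POST",
    )

req = build_create_collection_request("postcards-2023")
print(req.get_method(), req.full_url)  # POST https://api.example.com/targetCollection
```

Sending the prepared request (e.g. with `urllib.request.urlopen`) and iterating over uploaded images is then a plain scripting task, which is exactly what makes the flow easy to integrate into an existing CMS.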

How can Studio API benefit your business: use cases

Now, let’s see some real situations where Studio API can come in handy.

  • Create the project for each of the targets your customers upload in your CMS

Studio API can be integrated into your own CMS, making it easier to maintain collections and content automatically. Say you run a photo printing service and an accompanying app. The end customer can upload pictures and add digital content associated with each photo: a video, a song, an animation, or a GIF. By scanning a printed image with the app, the customer can access an AR experience that enhances a memory or a moment captured in the photo.

Creating the image targets, assigning augmentations to the targets, and publishing content can be managed programmatically, enabling you to design the user interaction the way you want to. 

Similar functionality could be used by a postcard service, corporate merchandise producers, and other services. 

  • Easily manage image targets and have your app make updates in the background  

When working with fast-changing content, numerous images, and heavy augmentations, we discourage storing your targets and augmentations in the app. Offline recognition will force you to redeploy the app frequently and make the size of the app massive. That’s where Wikitude cloud-based recognition comes to the rescue.

Imagine a publisher issuing analog books and magazines with an extra AR layer. Such a service can have one app giving access to all the AR experiences associated with each printed item. As new books and magazines are published, the publisher simply adds fresh digital content to the server programmatically, making it available to users in the app via cloud-based recognition.

Wikitude cloud-based recognition lets you work with a target collection containing up to 50,000 images. Otherwise, you are limited to 1,000 target images per active target collection, and only one collection can be active at a time. That flow can lead to slower recognition, and the end user will need to switch between collections manually. The functionality can be extended to many other fields, such as education, tourism, art, and culture.
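
To make those limits concrete, the numbers from the text can be plugged into a small sketch (the 1,000 and 50,000 limits are from this article; the helper itself is hypothetical):

```python
import math

ON_DEVICE_LIMIT = 1_000   # max targets per active on-device collection
CLOUD_LIMIT = 50_000      # max targets per cloud collection

def collections_needed(target_count: int, per_collection: int) -> int:
    """How many collections a catalogue of targets must be split across."""
    return math.ceil(target_count / per_collection)

# A 12,000-image catalogue needs 12 on-device collections (only one of
# which can be active at a time), but fits into a single cloud collection.
print(collections_needed(12_000, ON_DEVICE_LIMIT))  # 12
print(collections_needed(12_000, CLOUD_LIMIT))      # 1
```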

  • Integrate the AR functionality with already existing architecture and automatically grab data from that closed system 

The Studio API can also be used for 3D items. If you have the 3D models and material files of a machine, a robot, or part of an assembly line, you can use that information to render images of that specific machinery. The Studio engine will automatically process those images via the API to create object targets, while the API will help position augmentations.

Such an AR experience lets employees detect and precisely localize malfunctions on the production line by grabbing data from other parts of the system, such as live sensors, measurements, and machinery history. The factory can leverage its existing training material or repair specifications and overlay AR instructions on the machines, reducing the time required to identify and fix issues.

Wikitude Plugins API 

The Plugins API allows the Wikitude SDK to be extended with third-party functionality. It enables users to implement external tracking algorithms (such as image, instant, and object tracking) to work in conjunction with the Wikitude SDK algorithms. Additionally, you can feed camera frame data and sensor data into the Wikitude SDK, which will take care of the processing and rendering. Compatible plugins are written in C++, Java, or Objective-C and can communicate with JavaScript, Native, and Unity plugins. Please note that we currently don’t provide support for extension SDKs such as Cordova, Xamarin, and Flutter.

  • Integrate with OCR and code readers 

What else can you achieve? The Plugins API can trigger AR content via a QR code and barcode reader, or add text recognition. Our client Anyline’s text recognition API allows apps to read text, gift card codes, bank slips, numbers, energy meters, and much more. The company’s solutions have been used by Red Bull Mobile, PepsiCo, and The World Food Program.

Anyline barcode reader

  • Build a remote assistance app by leveraging Wikitude’s instant tracking

Typically, our engine is set up to recognize targets in the camera feed. With the Plugins API, you can set specific images as input rather than grabbing them from the camera feed. Where does that come in handy? It is specific to remote support implementations, where one user’s screen needs to be broadcast to another user. Scope AR used this functionality when launching WorkLink Remote Assistance, their AR remote assistance tool. They required a markerless tracking provider to complement the plugin they created, and we were happy to support it with our technology.

  • Augment the human body 

Another use case we’ve often encountered is adding face, hand, or body detection. To use it, you need a library specialized in one of those detection functions and plug Wikitude into it. Wikitude then takes over the processing and rendering of AR content. Watch our detailed face detection sample to learn more.

Connecting with a face tracking library via the Plugins API is not the only way to create this type of AR experience in combination with Wikitude. Alternatively, you can access face and body tracking from ARKit or ARCore via AR Bridge in our Unity Expert Edition SDK.

Wikitude AR Bridge

As you already know, an API (Application Programming Interface) allows applications to communicate with one another. Wikitude’s AR Bridge, part of the Unity Expert Edition SDK, has similar functionality: it provides access to the tracking facilities of other native platforms, e.g., ARKit and ARCore. The AR Bridge enhances Wikitude’s image and object tracking by tapping into external tracking capabilities. There are two options:

  • Internal: a direct communication with ARKit and ARCore maintained by Wikitude; at the moment, it offers basic positional tracking (no plane detection or more advanced tracking);
  • Plugin: allows an indirect connection to any tracking facility, pre-existing or written by developers. As an example, we provide integration with Unity’s AR Foundation plugin.

We provide a ready-made plugin for Unity’s AR Foundation that developers can use immediately. The plugin uses AR Bridge to inform the SDK about tracking. All current and future AR Foundation features work similarly to what we referred to as 3rd party libraries in the Plugins API context.  

However, plugins can be developed by anyone, not just by Wikitude. For example, imagine a company building glasses with its own tracking system and integrating with Wikitude. Since it is not using ARKit or ARCore, the internal AR Bridge will not apply by default. Instead, the company can create its own plugin and have this custom solution work fast inside our SDK.

Ready to start building today? Access our store to choose your package or contact our team to discuss which license is the best match for your AR project.


Augmenting the future: interview with Martin Herdina

Martin Herdina talks about Wikitude joining the Qualcomm family, growing together with the developer community, and why the future of augmented reality is headworn.

Running a start-up takes strong vision, grit, and persistence. Running an augmented reality start-up? Double that and mix in a profound belief in an extraordinary team that can accomplish anything.

It started with a vision

Thirteen years ago, we set out on a mission to pioneer the augmented reality industry. As a team of engineers, researchers, product and business people from all walks of life, we came together under Wikitude’s roof to pursue our curiosity and see what happens if we take another step toward our vision.

Our belief has always been that AR will drastically shape the future of how we consume information, and we worked hard to make that vision a reality.

A fair share of wins (some smaller, some larger) in the market showed us that we are on the right path (even though things have been tough sometimes). Wikitude spearheaded the industry when we launched the world’s first mobile AR app. We’ve created tools that have become the go-to technology for developers worldwide.

Using our AR SDK, Wikitude customers and developers have applied augmented reality across industry verticals, creating thousands of apps and use cases. Through the community’s tireless efforts, our vision of augmented reality has been taking shape!

The ultimate dream

But the final frontier was still ahead – not only making augmented reality accessible for everyone but turning it into the most natural experience that hardware can allow. Since 2013, when Wikitude started supporting wearable devices, we’ve been dreaming of headworn AR.

While smartphones serve as an important step, smart glasses would completely remove the friction of looking down on the small screen.

And this is where Qualcomm steps in. The company plays a special role in the XR ecosystem, having continuously shown interest in the XR segment and invested in the next generation of chips and reference hardware. We’ve been working together since 2019, integrating our AR SDK into the Snapdragon® XR Platform and showing a glimpse of what the next generation of spatial computing will look like.

Now that augmented reality hardware and technology have advanced to the point where both are gaining commercial traction, we are excited to join forces and accelerate the enablement of custom experiences powering the next generation of AR glasses. It’s a very exciting journey ahead, one where together we’ll set the stage for a thriving AR ecosystem and mass-market adoption.

The future of AR is headworn

For years, we’ve been tailoring our SDK to support a number of headworn devices to enable flawless tracking and help users discover the potential headworn AR experiences can bring.

Why headworn? We believe it provides the basis for experiencing the true immersiveness that augmented reality is all about – something no smartphone can ever bring. Using AR headsets, users can see the augmented world around them the same way they experience the real world. See-through displays allow a wide field of view while your hands stay free and you can move around, collaborate, work and play with immersive experiences.

The absence of friction headworn AR can provide will pave the way to the metaverse, where we will eventually interact and socialize, just like we do in the real world (plus the endless opportunities the digital universe can bring).

Driving adoption
While expectations for AR hardware grow and the industry slowly gets to the point of meeting customer expectations, we believe the world won’t switch to all-in-one AR devices in the near future. Instead, we are leaning into the approach Qualcomm Technologies takes: connecting a lightweight, ultra-low-power viewer device with advanced rendering to the smartphone.

Powered by 5G, this is a pragmatic step toward enabling headworn AR tomorrow and making the innovation accessible to everyone who can’t wait to experience it.

What’s next?
Having become part of the Qualcomm family, Wikitude will continue doing what we do best: working on our cutting-edge AR SDK and growing a thriving developer community. Our expertise in well-designed AR experiences and robust tools, our strong knowledge of our developer audience, and Qualcomm Technologies’ XR innovation will help strengthen the XR sector and accelerate the enablement of custom AR experiences as the toolkit of choice for headworn AR glasses.

United in the horizontal-platform approach, we share the vision of running a platform for headworn AR that will open up endless opportunities. And Wikitude developers will be the first to make a difference and start creating and experimenting with the new tools.

Introducing Snapdragon Spaces

Today we are unveiling a new beginning: Snapdragon Spaces XR Developer Platform. This developer-first platform is tailored to remove friction for developers and unlock the full potential of wearable immersive AR.

Backed by Wikitude’s 9th-generation AR technology and Qualcomm Technologies’ leadership in the XR ecosystem, the Snapdragon Spaces XR Developer Platform paves the way to a new frontier of spatial computing and empowers developers to create experiences for AR glasses that transform the spaces around us.

Learn more about the Snapdragon Spaces XR Developer Platform to stay in the know.

Snapdragon and Snapdragon Spaces are products of Qualcomm Technologies, Inc. and/or its subsidiaries.

Dev to Dev

How to apply UX design principles in augmented reality

If you are a UX/UI designer who builds user experiences in digital environments, chances are you will be working with augmented reality sooner than you think. As AR applications rapidly break into the mainstream, making the user feel in control of a product becomes even more critical in user experience design.

This article breaks down the role that user experience design principles play in augmented reality application development, with a specific focus on UI design.

The article is based on a presentation by our senior software engineer, Gökhan Özdemir, for the “UX for AR” webinar. Watch the full recording here.

What is UX design for augmented reality? 

User Experience Design, or UX, is the process of designing a product that considers the needs of the users and makes the user flow as seamless and intuitive as possible. Good UX always starts with putting the user at the center of the design process. It also relies on the principles of human psychology and empathy.

Now, what about UX for AR?  In augmented reality apps, success means offering a great user experience through a seamless blend of hardware and software. 

Augmented reality experiences are overlaid on the real environment, so the user experience is spatial and highly contextual. It makes designing UX for AR more challenging as designers need to think through spatial experiences. Getting it wrong can mean users have a less than stellar experience – and no one wants that. 

Getting started

UX design can be tricky. Designing for a new technology that is only just gaining traction? Even trickier! Let’s explore the role of user experience (UX) design in AR applications: how to think through your user experience as a designer and navigate the technical decisions when creating an AR app.

You will learn how to create a compelling user experience for your AR application that considers the physical space and natural human interaction. 

Five pillars of UX design for augmented reality

Users prefer to interact with the elements of an interface naturally, without being constantly reminded of what the interface contains. This differs from the traditional user experience (UX) of conventional websites and mail applications. UX for augmented reality (also known as 3D user interface design) emphasizes interaction and visual interest above all else: users want to enter the blended space and stay immersed, not be distracted by elements that break the illusion.

Our five common UX design pillars for AR will help you define the considerations you’ll need to make when designing your UI and experiences for virtual objects.

Kick-off your design process by considering these criteria:

  • Environment
  • Movement
  • Onboarding
  • Interaction
  • UI (User Interface)

While the first two pillars (environment and movement) are crucial specifically when designing for AR, the last three (onboarding, interaction, and UI) are equally crucial for both 3D and traditional 2D screen-space UI.


Environment

As augmented reality experiences are spatial and always interconnected with the real world, the environment plays a key role in the design process. It can be broken down into the four most common categories of space, defined by their distance from the user.

Image source: Wikipedia

Examples of AR in the intimate space include face filters (like Snapchat or Instagram), hand tracking, or hand augmentations if you use a head-mounted AR display. 

Moving to personal space, augmented reality experiences might feature real objects, people, or the area around you. In the video below, you can see a learning-focused AR experience that uses educational models to bring chemistry concepts to life through an interactive digital layer.

AR experience in personal space

Other examples of augmented reality in personal space are popular table-top and card games and augmented packaging. Think augmented pizza boxes or collectible cards with augmented characters that interact with each other.

Next up is the social space. If you pan the camera further away, you will target an area that can be occupied by other people, unlike the personal space, where you have more privacy. This space segment is widely used for multiplayer AR games and for augmenting objects at scale, from furniture to monuments and buildings.

In many cases, AR experiences in public space are anchored to specific locations with enough area to place an augmentation, or to sites that should be tracked in AR. The mumok AR experience in Vienna is a perfect example of AR in public space: the entire building is tracked using the Wikitude Scene Tracking feature.

mumok AR


Movement

The success of any new product or service depends directly on how well it fits users’ habits, both physically and psychologically. Movement is the next UX design pillar. When you design the experience, you will want to use the area around the user most of the time.

As smartphones and head-worn devices give a limited view of the environment, the designer’s primary task is to guide the user. By including navigation elements on the screen, you will be steering the user’s gaze, helping them get around and move through the experience.

While visually guiding the user, it’s essential not to force them to move in specific directions. Doing so might lead to unwanted hiccups in the experience or even cause accidents.


Onboarding

The next pillar we are going to explore is user onboarding. Creating user-friendly and engaging augmented reality experiences can be a challenge. It’s not enough to just put some markers around your location or overlay some information on top of an image. You need to understand what the user is looking at and how they are using it. When creating AR experiences, keep in mind that the most important thing for your users is not accuracy but usability.

Another factor to consider is that different devices have various technical limitations in supporting AR features. Markerless AR, for instance, requires the user to move the device around so that computer vision algorithms can detect feature points across multiple poses and calculate surfaces.

The scanning process takes no time on newer devices with a built-in LiDAR sensor (like the iPad Pro). But on other devices, your users might appreciate a comprehensive onboarding UI: a pop-up menu or instructions should guide the user through the steps needed to successfully launch and run the AR experience.

To launch a tracking algorithm, you might want to use a sketched silhouette of the desired object to hint at its shape and pose, prompting the user to align the view with the real object. Read more about the Alignment Initializer feature in our documentation.

Alignment initialization

Taking onboarding offline, physical methods like signage are sometimes used to communicate the AR app, provide a QR code for quick download, and mark the exact standpoint for an optimal experience.


Interaction

Once the AR experience is launched, we transition to another UX design staple: interaction. During this phase, your user will benefit from intuitive and responsive interaction. When designing for touch, you will most likely be using these common gestures and prompts:

  • Tap to select
  • Drag starting from the center of the object to translate
  • Drag starting from the edge of the object to rotate
  • Pinch to scale

Responsive interaction means taking into account the distance from the desired object to the camera, which defines how easy or difficult it is for the user to interact with it. To facilitate interaction with objects placed farther away, consider enlarging the object’s bounding sphere so that hit detection is less dependent on the distance to the camera.
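One way to implement that idea, sketched here under illustrative assumptions (plain ray-sphere math, no SDK-specific API), is to grow the world-space hit radius with the object’s distance from the camera so the hit area keeps a roughly constant size on screen:

```typescript
type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;

// Grow the hit radius linearly with distance beyond a reference distance,
// keeping the on-screen size of the hit area roughly constant.
function hitRadius(baseRadius: number, distance: number, refDistance = 1): number {
  return baseRadius * Math.max(1, distance / refDistance);
}

// Classic ray-sphere test: does the touch ray pass within `radius` of `center`?
function rayHitsSphere(origin: Vec3, dir: Vec3, center: Vec3, radius: number): boolean {
  const oc = sub(center, origin);
  const t = Math.max(0, dot(oc, dir) / dot(dir, dir)); // closest approach along the ray
  const closest: Vec3 = { x: origin.x + dir.x * t, y: origin.y + dir.y * t, z: origin.z + dir.z * t };
  const d = sub(center, closest);
  return dot(d, d) <= radius * radius;
}
```

An object 4 m away with a 10 cm base radius would then accept touches within a 40 cm sphere, which feels roughly as easy to hit as the same object at arm’s length.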

Minimizing finger input might also be a good idea, especially when designing for tablet users. As most tablets are held with two hands, UI or interaction elements placed in the middle of the screen will be very hard to reach. Instead, use gaze input: trigger intros, interactions, or buttons in the augmented space when the user looks at them long enough. You might know this from VR, where you don’t have any controllers and experiences are mostly driven by gaze.
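Gaze-driven triggers like this are commonly built as a dwell timer: the action fires once the gaze has rested on a target long enough. The sketch below uses illustrative names and an 800 ms default dwell time; it is not tied to any particular SDK.

```typescript
// Sketch of dwell-based gaze selection: a target fires once the gaze has
// stayed on it for the dwell time. Names and timing are illustrative.
class DwellSelector {
  private target: string | null = null;
  private enteredAt = 0;

  constructor(private dwellMs = 800) {}

  // Call every frame with the id of the currently gazed target (or null)
  // and the current time in ms; returns the id once, when selection fires.
  update(gazedId: string | null, nowMs: number): string | null {
    if (gazedId !== this.target) {   // gaze moved to a new target: restart timer
      this.target = gazedId;
      this.enteredAt = nowMs;
      return null;
    }
    if (gazedId !== null && nowMs - this.enteredAt >= this.dwellMs) {
      this.target = null;            // reset so the selection fires only once
      return gazedId;
    }
    return null;
  }
}
```

Feeding `update` each frame with the gazed target id (typically from a raycast through the view center) yields the selected id exactly once per dwell, which is easy to pair with a circular progress indicator so the user sees the trigger coming.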

Consider using accessibility features, especially if you are designing for a broader audience. This way, you let the user rotate or reset the position of an augmentation instead of walking around it.

UI (User Interface)

The final pillar we want to highlight is UI, which consists of augmented space and traditional screen space. Depending on the use case, you will use them interchangeably. While UI in the augmented space boosts immersion, as the user perceives it as part of the experience, screen-space UI is sometimes easier to read and interact with.

Designing with humans in mind

AR can improve people’s lives simply by allowing them to experience something that wasn’t possible before. Applying UX principles to AR can help designers create experiences that are clear, integrate easily into daily life, and create powerful emotional responses.

The guidelines we’ve shared aren’t magic bullets, but they do provide fundamental guidance on where designers should focus their attention when crafting an experience for a user of any age.

What is your take on using UX principles when designing AR experiences? Let us know via social media (Twitter, Facebook, and LinkedIn) and tag @wikitude to join the conversation.

3D, Dev to Dev

Creating 3D content for augmented reality

Content is constantly changing. Designed for TVs and handheld devices in the early 2000s, it now transcends the 2D realm and spills into the world around us. 3D augmented reality content needs to be as immersive as VR advocates ever dreamed, minus the isolation from the outside world.

The more AR becomes part of our lives, the higher the need for content to adapt to the 3D world. It means the content needs to be realistic, spatial, and engaging. And while there are thousands of apps online, most companies are still figuring out what compelling content looks like in AR.

In this post, we’re diving into the role of content in ​​augmented reality, the challenges the industry faces, and the future of spatial content. 

Augmented reality content basics

Augmented reality content is the computer-generated input used to enhance parts of a user’s physical world through a mobile phone, tablet, or smart glasses. It can be user-generated (think of social media face filters) or professionally produced by designers working for brands and specialized agencies.

AR content often comes as 3D models but can also come in image, video, or audio format.

Whether you are using AR to buy a new IKEA sofa or play a game, the quality of the content you see in the app will make (or break) the AR experience.

Image source: IKEA

The role of 3D content in augmented reality experiences

Among the thousands of AR apps on the market today, the most successful ones have one thing in common: high-quality, engaging AR content. Fail to deliver that, and your project risks joining the astonishing 99.9% of apps that flop or become irrelevant in the app stores.

Content is the heart of ​​augmented reality. It ensures users have a reason to keep coming back. 

Users might be thrilled to scan an augmented wine bottle a few times and share the experience with friends. But how many times can we expect them to go back and watch the same video? 

Companies must see AR content as a critical component of long-term, well-thought-through digital strategies to ensure app longevity. It means constantly delivering fresh, contextual, and personalized content. 

Easier said than done. From high production costs to a scarcity of skilled professionals, building AR content at scale is one of the biggest challenges companies face, and it blocks them from keeping their apps relevant in the long run.

Challenges of building 3D content for augmented reality

3D models need to be perfect digital twins of the real world. Combined with other rendering elements (e.g., animation, audio, and physics), they make up AR’s most used type of content and add an immersive layer to the user experience.

What the user doesn’t see is the relatively complex process of creating such realistic visual assets. Their production can range from detailed manual modeling, to reusing computer-aided design data, to a photogrammetry-based creation process.

Size limits, file formats, and the total size of the application are just some of the many requirements developers need to understand to build great AR experiences. In addition, the lack of industry standards for AR content and a limited qualified workforce impose significant challenges on the industry.

Building 3D assets: 3D modeling versus 3D scanning

Before we jump into the technicalities of creating content for AR, there are some basic concepts we need to clarify.

3D modeling vs. 3D scanning

3D modeling and 3D scanning are two ways of building 3D assets for augmented reality. 

3D modeling uses computer graphics to create a 3D representation of any object or surface. This technology is beneficial when used to recreate physical objects because “it does not require physical contact with the object since everything is done by the computer” (Skywell Software). Therefore, 3D modeling becomes ideal for creating virtual objects, scenes, and characters that don’t exist in the real world (think of Pokémons and other fantasy AR games).

3D scanning uses real-world objects and scenes as a base for the production of AR assets. Using this method, the content creators don’t craft the model from scratch using a program. Instead, they scan the object using one of two different methods: photogrammetry or scanning through a 3D scanner device (LiDAR or similar). 


The main difference between the two is how they capture the data of the object. While photogrammetry uses images captured by regular smartphones, smart glasses, or tablets, scanning requires special devices equipped with depth sensors to map the object. 

This makes photogrammetry more accessible to the broader developer crowd when creating AR content, as no special equipment is required. On the flip side, 3D scanners are more reliable.

Using either of the two approaches, a point cloud can be extracted and applied in the AR experience. You can read more on the advantages of each method in the 3D point cloud section below.

Ultimately, you can decide between 3D modeling and 3D scanning by assessing whether the physical object is available to scan. If the selected AR target object is not available, then 3D modeling is the way to go.

How is 3D content created for augmented reality?

There are plenty of AR content creation tools available on the market. Some are simple drag-and-drop tools that don’t require coding skills. Others are much more complex and target experienced professionals.

Here’s an overview of the different possibilities:

Image source: DevTeam.Space

3D point cloud: In AR, a point cloud is a virtual representation of the geometry of real-world objects using a collection of points. Generated via photogrammetry software or 3D scanners, these points are captured based on the external surfaces of objects.

Because photogrammetry allows gathering 3D information from 2D images, this method makes content creation more accessible and overcomes the ownership issues often faced with third-party 3D models. As a result, anyone can create 3D models by simply recording or scanning a real object. 3D scanners (for example, LiDAR-enabled devices) are gradually becoming more available on the market and provide more detailed point clouds thanks to their depth sensors.

Commercial tools such as Wikitude Studio, Apple Object Capture, and Amazon Sumerian are examples of photogrammetry-based programs.

AR Object Target Transformation in Wikitude Studio Editor

CAD (Computer-Aided Design): CAD models are commonly the first step to prototyping physical goods, bringing a first product view to life in the digital world. Assisted by software applications, AR developers can repurpose legacy CAD models for augmented reality-based solutions. Existing CAD data can then be used as the input method to create digital representations of the object or environment to be augmented.

Once uploaded into the selected program, CAD data is converted to a format AR apps can use on phones, tablets, and smart glasses. CAD models typically provide accurate information about the object, maximizing the potential for a reliable AR experience. While prevalent in the industrial sector, CAD-based AR experiences are progressively gaining popularity in consumer-facing apps.

Games and computer graphics: authoring tools such as Blender, 3ds Max, and Maya are popular 3D design applications used by AR content creators. Unity, Unreal Engine, and even Apple’s Reality Composer are great tools to assemble the pieces of content and make them work together for augmented reality.

Other 3D models: beyond CAD, other popular 3D model formats can be leveraged to power augmented reality solutions, for example glTF 2.0, FBX, and OBJ. Compatible file formats will depend on the program used to build the augmented reality experience.

On the one hand, this wide variety of 3D asset formats has opened the doors for creators in many areas to put their existing models to work for AR. On the other hand, it creates confusion among developers, fueling the debate around the need for standardization in the AR industry and the creation of alternative tools that are intuitive and code-free.

What’s next for AR content creation?

With increased interest in augmented reality, we will see more tools emerge that help create content, overcome workforce scarcity, and deliver actual value through the technology.

To facilitate content creation, AR companies invest in building platforms that don’t require technical skills (therefore closing the workforce gaps) to help brands optimize the AR content creation process. 

An example is Apple’s latest release, RealityKit 2. This new framework includes the much-awaited Object Capture feature, which allows developers to snap photos of real-world objects and create 3D models using photogrammetry.

But if Apple’s announcement gives you déjà vu, you are not wrong. Last year, the AR media went crazy about an app that lets you copy and paste the real world with your phone using augmented reality.  

The topic of interoperability of experiences across platforms and devices is equally important. The ability to code an AR app once and deploy it across several devices and operating systems helps companies bring their projects to market as fast as possible.

The final and most crucial aspect is understanding how 3D content in augmented reality can deliver value to its users. That means setting clear goals for the AR project, understanding how it fits into your digital strategy, and having deep knowledge of your customer.

What are some of the trends you see in AR content creation? Let us know via social media (Twitter, Facebook, and LinkedIn) and tag @wikitude to join the conversation.


How augmented reality helps to change the game for gender equality

Read how augmented reality helps increase the visibility of female achievements and contributes to gender equality – one app at a time. 

The Whole Story App

In the U.S., less than 8 percent of public statues represent women. In the UK, a mere 2.7 percent of statues are of historical, non-royal women.

Y&R took matters into their own hands to celebrate historical women around the world. The Whole Story global movement was initiated to build a bridge between technology and public spaces using Wikitude’s augmented reality (AR) technology to highlight powerful women who have made a difference throughout history.

Developed by Current Studios, the app used location-based AR to show augmented female statues alongside existing male figures. Users could locate statues on a map, learn more about each woman’s contributions, and share their learnings with friends and family.

Users could view 23 virtual statues in New York City and another 13 throughout the world. The project encouraged people to create and submit more statues of iconic women around the world, in the hope of one day being present on all continents. Susan B. Anthony, Florence Nightingale, Nina Simone, Marie Curie, and Maria Tallchief are examples of AR statues you could find when browsing the app’s geo-AR map.

The Whole Story app by Current Studios

There are many, many untold stories of women throughout the world. What’s really meaningful about this tool is that it gives people the chance to tell their own stories.
Catherine Patterson | Director of Innovation | Y&R

Building historical monuments takes a long time, requires lengthy approval processes, and is highly costly. For this reason, the global communications firm chose to use augmented reality. “We don’t want to wait for statues to be built, so we took it into our hands,” says Shelley Diamond, chief client officer at Y&R.

The Whole Story app is a great example of how technology can inspire action in real life. In 2020, Netflix followed the same logic and honored the oft-forgotten real-life sisters of famous men. The streaming pioneer planted multiple statues across the UK honoring women whose achievements have been overshadowed by their famous brothers. Among those, the public could see statues of Charles Dickens’ sister Frances Dickens; Mozart’s sister Maria Anna Mozart; and Princess Helena Victoria, sister of King Edward VII.

Creating new realities

Technology creates numerous opportunities to reduce gender inequality: not only in history but also in our present.

Female employees make up between 28 percent (Microsoft) and 42 percent (Amazon) of the workforce at major tech companies.

Albeit “slow adopters” of gender equality practices, technology companies worldwide demonstrate increasing efforts in hiring and empowering women in the workplace.

The augmented reality and virtual reality industries seem particularly keen on growing the number of female leaders in the field. Among the most outstanding personalities, globally recognized thought leader, futurist, and XR evangelist Cathy Hackl paves the way.

On the organizational level, Women in XR (WXR) and Women in Immersive Tech Europe are examples of directed efforts to bring more gender diversity to the augmented reality industry. Both organizations aim to elevate women leaders and advance equality in emerging technologies. In 2018, AWE hit a major milestone by featuring over 100 female speakers.

At Wikitude, we take pride in having a nearly 40 percent female team in 2021, and growing. A few years ago, we also announced our very first C-level female leader: Nicola Radacher.

We strive to celebrate women’s leadership in augmented reality and tech every day throughout the year. Use the tag @wikitude to share #WomentoFollow who inspire the XR and tech community via Twitter and LinkedIn.


Augmented Reality for students and educators: get started with AR NOW!

There’s never been a better time for students and educators to take a closer look at augmented reality technology to future-proof their careers.

Learn why and how to get started with Augmented Reality for students and educators!

Augmented reality technology

As explained in our introductory AR technology 101 guide, augmented reality is when computer-generated elements (graphics, 3D animations, videos, etc.) are digitally layered on top of a user’s view of the real world. 

As seen in the reels below, users can scan images, objects, and locations to reveal augmented experiences or point their devices to see augmentations seamlessly appear as if they were part of the physical environment.

Most of the use cases above were created by experienced independent developers and AR agencies for commercial purposes.

However, with the authoring tools and technical documentation available today (more on these below), anyone can create AR experiences, including teachers and students, no matter their level of expertise.

Why augmented reality technology and why NOW?

Timing is everything. And yes, AR has been around for a while and had a spark of fame back in 2016 when Pokémon Go was launched – (side note, don’t miss this tutorial: how to build an app like Pokémon Go in three simple steps). 

Lately, however, tech giants, enterprises, and Fortune 500 companies have been keeping a keen eye on the technology – and for good reason! AR growth predictions are incredibly promising, and the technology is already starting to capture a significant chunk of the market.

It is the dawn of AR. The technology is spreading fast and NOW is the perfect time to explore AR with some hands-on action. Students and educators worldwide are taking a closer look at augmented reality tools and the opportunities this technology brings.

The sooner you start, the more prepared you will be to deliver when the AR market is at full blast.

We can help you get started NOW!

The Wikitude Academy: free Augmented Reality tools for students and educators

Back in 2012, Wikitude started a program to support academics interested in learning more about AR. As AR technology providers keen on advancing the technology, we help by offering free access to our augmented reality platform and tools. The program was a success then and continues to thrive to this very day.

We receive many requests and grant multiple educational AR licenses on a daily basis to students and teachers around the globe.

Teachers reach out to the Wikitude Academy when they are interested in expanding their knowledge, or to start enhancing their teaching methods with the benefits of AR: adding digital augmentations (audio, video, 3D models, instructions) to books, machines, lab equipment, and more.

Our student requests, on the other hand, come from all backgrounds: high-school students creating independent projects, university students working on assignments, graduation projects, or master’s theses, and apprentices seeking to learn more about AR during short-term technology courses.

How does the Wikitude Academy work?

Easy! Eligible students and educators can apply by filling out the application form on the Wikitude Academy page. The Wikitude team analyses each application individually to review the project and eligibility requirements. The final decision is communicated by email.

To be eligible to apply, students must be actively enrolled in (and professors currently employed by) an accredited academic institution.

The Wikitude EDU AR SDK allows users to create one AR app that can be deployed on Android, iOS, and Windows. Further details, requirements, and restrictions can be found on the Academy page linked above.

Wikitude AR SDK: augmented reality software development kit

Academics with coding knowledge (or code learning in progress), can choose to work with our main product, the Wikitude AR SDK.

The Wikitude SDK allows users to develop AR apps that can recognize, track and augment images, objects, scenes, and geographical locations. 

In the EDU version, students can work with the Native or JavaScript API, or choose the Unity, Cordova, Xamarin, or Flutter extensions to create cross-platform AR experiences for smartphones and tablets across Android, iOS, and Windows. Please note that digital eyewear projects are reserved for commercial licenses only.

Wikitude Studio Editor: create AR apps without code

Whether you have coding experience or not, you might want to check out the Wikitude Studio Editor.

Wikitude Studio Editor is a web-based authoring tool used for creating augmented reality experiences. 

With useful features and an intuitive interface, it should help you accomplish AR tasks without any particular technical skills. Studio Editor lets you add augmentations to your targets in a drag-and-drop manner, as well as edit and remove them. Augmented projects can be exported for further use or shared for consumption.

Wikitude documentation and AR tech articles

For support and guidance in exploring augmented reality, students and teachers are invited to navigate through our extensive documentation and forum threads. 

Additionally, the augmented reality articles below will give readers access to technical details and sample app instructions:

Download Augmented Reality SDK trial for students and educators

Are you a student or educator and want to give our augmented reality tools a try before sending your official EDU application? Access our download page to choose your platform, view all plugin options, and find other dev tools. After the download, you will be directed to a registration page to get started.

Apply for a free Wikitude EDU SDK NOW!

AR is inspiring, challenging, useful, innovative, and growing fast. The earlier you start, the better.

Dare to create the next big AR use case? Click the button below to start your augmented reality journey.

Digital agencies

5 tips to help pitch AR for your next project

Tips for agencies and developers to successfully pitch AR to clients and potential customers.

You get it – AR is incredibly cool, and most definitely the wave of the future – but your client might still be wondering if it’s time to ride the AR wave.

Have you pinpointed an excellent use-case for AR within a client’s project? We want to give you a few tips and suggest tools you can use to help get them on board. As a bonus, you will have an additional product for your portfolio you can really be proud of!

1. Explain the added value

Let’s talk about flash – and not the Adobe kind. Sizzle, wow-factor, attention-grabber, whatever you want to call it, AR has it.

It not only lets you see more of the world around you – it skyrockets user engagement with interactive content. A customized AR-experience is one of the most attention-grabbing features an app can offer.

For a little extra inspiration for your speech, check out 7 ways to use augmented reality in marketing today.

2. Demonstrate return on investment (ROI)

You’re a business, and they’re a business – the extra spending has to be justified.

The easiest way to do that? Show them a clear example of how AR can be linked directly to revenue – like in this video below from Takondi, one of Wikitude’s premium partners. Watch to see how easily AR can be used to implement mobile commerce.

There are a bunch of ways augmented reality can help businesses make more money. Here are a few of our favorites:

  • e-commerce – directly link real-world items to purchases
  • In-game purchases in an AR environment – you’ve seen the success of Pokémon Go – and remember virtual products have an excellent margin
  • Location-based deals – let users explore top deals near them (and guide them there)
  • Time-sensitive offers – reach people at the right time with the right offer
  • Augmented shopping – make every print material your user’s check-out button
  • Enhanced product content – offer enhanced multimedia about products using 2D and 3D recognition
  • Premium apps – offer an entirely in-app shopping experience!

3. Show the future of AR with facts

When Google, Apple, Facebook, and Microsoft start heavily investing in augmented reality, it is safe to assume the tech is not only on the rise but on the verge of something great.

To get an idea of the current predictions: Digi-Capital’s long-term virtual and augmented reality forecast expects the AR/VR market to reach around $65 billion in revenue by 2024. And worldwide spending on AR/VR products and services throughout the 2019–2023 forecast period should achieve a five-year compound annual growth rate (CAGR) of 77.0% (IDC).

Access this AR facts and predictions article for more data. And for real-world examples, note that “Pokémon Go, once a viral sensation all over the globe, hasn’t fallen off the map. In fact, the augmented reality game is earning more money than it ever has before. According to mobile analytics firm Sensor Tower, Pokémon Go had a record year in 2019, taking in an estimated $900 million through in-app purchases.” – via The Verge.

Image: Sensor Tower

4. Highlight the simplicity of the tech

Excuse us while we toot our own horn, but making AR easy is what we do. The Wikitude AR SDK is one of the most versatile tools available for developing mobile AR. Want to build your own Pokémon-Go-like app? You can do it with Wikitude in three easy steps.

5. …and most importantly, show a demo

Seeing is believing. So why not show your clients the great things they can do with AR? It’s a lot easier to want something you can see right in front of you. So the best and most important advice when you’re pitching AR? Show them a demo. Here are a few tips for doing that!

  • It’s best done live – find an AR app, either your own or from another company, and show it during an in-person meeting. If that means bringing props (like a magazine or product), bring them along!
  • Undersell, overshow – a good AR project speaks for itself. Rather than building up expectations, casually throw it out there – “Oh, have I shown you this cool trick?” *whips out phone*
  • Make sure it’s going to work! Do you need decent cellular data or Wi-Fi? Nothing impresses less than stalled technology. Check your connection before you move forward.
  • And speaking of demos – remember, you can always use the Wikitude trial license to win over your client!

Develop powerful AR apps with the new Wikitude SDK 8.10

The latest Wikitude AR platform release includes updated support for Epson Moverio smart glasses, a new Flutter sample app, and stability improvements.

The Wikitude AR platform goes through regular quality assurance, maintenance, and development cycles to ensure you have the level of quality you need to create high-performing augmented reality experiences.

SDK 8.10 contains all the latest platform maintenance and stability improvements; brings our optimized Epson Moverio AR SDK up to date; and includes the new, highly requested, Flutter sample app.

Epson Moverio

SDK 8.10 support for BT-300 and BT-350 Epson Moverio smart glasses

Starting off with the development that many of you have been waiting for: updated Epson Moverio Wikitude AR SDK.

The Epson Moverio devices are used by enterprises and consumers worldwide to deliver hands-free augmented reality experiences. Wikitude SDK 8.10 offers a fully optimized AR SDK for Moverio BT-300 and BT-350 smart glasses.

The Wikitude AR platform is adapted to make the best out of the unique features of both devices, ensuring optimal performance in a variety of environments and use cases. Among these customizations are:

  • Intel SSE optimization: providing the best processing power and performance for both devices;
  • Optimization for stereoscopic view: enabling full 3D see-through (side-by-side view) support of Moverio smart glasses;
  • Personal calibration: enabling perfect alignment between the real world and AR content.

Watch the video to view an Epson Moverio hands-free supported remote assistance use case combined with Wikitude Object Recognition and Tracking technology:

Access the link below to give the Epson Moverio AR SDK a try (download redirects to the signup page for a free trial). Subscription license users are entitled to this free update.  


Sample app for Flutter. Another request from our awesome AR community

Wikitude was the very first AR platform to offer official support for Flutter. For those of you who are unfamiliar with it, Flutter is an open-source UI toolkit created by Google, used to develop natively compiled applications for iOS and Android from a single codebase.

The Wikitude documentation, with SDK 8.10, introduces a new Flutter sample app to help you add augmented reality technology to your projects.

The Flutter Plugin is based on our JavaScript API and includes the Wikitude AR library/framework, sample app, and documentation.

AR SDK Performance and Stability Enhancements

The Wikitude SDK is regularly inspected by our quality assurance team and optimized by our technical team so you can have access to high-performing tools that are always up to date.

As updates are frequent and important for app maintenance and device compatibility, we recommend choosing our subscription license, which includes one year of SDK update releases.

Download Wikitude SDK 8.10

Active Wikitude SDK subscribers are entitled to all SDK version updates released throughout their term. Follow the links below to update your SDK:

New to Wikitude? Download a free Wikitude SDK 8.10 trial version for testing purposes and contact our team to discuss upgrade possibilities.

To explore all SDK options, including smart glasses, plugins, and other dev tools, please access our download page:

Interested in creating an AR project of your own? Access our store to choose your package or contact our team to discuss your specific AR requirements in detail.