Categories
AR features

Markerless AR: how and where to use it

Markerless AR functionality allows developers to create digital applications that overlay interactive augmentations on physical surfaces, without the need for a marker.

We can all agree that computer vision is a key part of the future of augmented reality, mobile or not. That’s why we’ve been working so hard on our Instant Tracking over the past few years. If you are not yet familiar with this feature, Instant Tracking maps the user’s real-world surroundings in real time so digital content can be placed anytime, anywhere, without the user having to scan any image.

Instant Tracking is also the first feature using Wikitude’s Simultaneous Localization and Mapping (SLAM) technology. SLAM identifies the user’s precise location within an unknown environment by simultaneously mapping the area during the Instant Tracking AR experience.

This allows developers to easily map environments and display augmented reality content without the need for target images or objects (markers). Wikitude’s SLAM markerless augmented reality tracking is one of the most versatile cross-platform 3D-tracking systems available for mobile.

Our SDK also offers its own SLAM Instant Tracking technology which can be dynamically connected to ARKit and ARCore (Wikitude SMART). SMART is a seamless API which integrates ARKit, ARCore and Wikitude’s SLAM engine in a single cross-platform AR SDK.

This feature helps developers create their projects in JavaScript, Unity, Xamarin, PhoneGap, or Flutter without the need to deal with platform-specific ARKit/ARCore code. SMART dynamically identifies the end user’s device and decides whether ARKit, ARCore, or Wikitude SLAM should be used in each particular case.
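
For experiences written against the JavaScript API, checking which engine SMART will pick can look roughly like the sketch below. The call name is recalled from the SMART documentation and should be treated as an assumption; SMART makes the choice automatically either way.

  // Hedged sketch: checking which tracking backend SMART will use.
  // Assumption: isPlatformAssistedTrackingSupported() is the JS API call
  // for this check – verify the exact name in your SDK version's docs.
  AR.hardware.smart.isPlatformAssistedTrackingSupported().then(function (supported) {
      // SMART has already made the decision; this check is only useful if the
      // experience wants to adapt its content or UI to the active backend.
      console.log(supported ? "ARKit/ARCore will be used" : "Wikitude SLAM will be used");
  });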

Here are some interesting use cases for Wikitude’s markerless augmented reality feature:

Where you need to grab someone’s attention, immediately

Getting someone to look at your product is the first step of a good marketing strategy. For both marketing and retail implementations, augmented reality offers immense opportunity to do that. It’s new, easy to understand, and impossible to ignore.

Do you know when most people first saw the concept of augmented reality (although they probably didn’t know it at the time)? In this scene with Michael J. Fox in Back to the Future II.

Maybe it’s not as slick and refined as today’s visual effects, but back in 1989, it was certainly surprising – and attention-grabbing. That’s part of the way AR still works today – especially for the next couple of years as widespread adoption continues to grow. The most important thing to remember? If you truly surprise someone, they’ll be sure to tell everyone they know all about it.

The potential here for retail outlets (online and in the physical world) is clear – customers can interact directly with the product and come as close as they can to touching and feeling it without having it in their actual hands.

Even more opportunity exists in gaming and entertainment – check out how Socios.com gives sports fans an opportunity to collect crypto tokens, earn reward points, and unlock experiences with their favorite sports clubs.

When you need to add one small piece of information

AR is at its best when it does just what it says: augment. AR can turn your phone into a diagnostic tool of unparalleled power – perceptive and reactive, hooked into the infinite database of the world wide web.

Adding a few small, easy-to-understand bits of information to a real scene helps our minds process it much more quickly – and more clearly. Here’s a great example where an automobile roadside assistance service can help a customer diagnose a problem – without actually being anywhere on the roadside.

The opportunities here are endless – factory floor managers, warehouse workers, assembly-line technicians – anyone who needs real-time information, in a real-world setting. It’s a huge technological leap forward for the enterprise – just like when touchscreen mobile devices with third-party apps first appeared.

Where you need to show a physical object in relation to other objects

There’s a reason this idea keeps coming up – it solves a real-world problem, instantly, today.
Architecture, interior design – any creative profession that works in real world spaces can take advantage of augmented reality.

From visualizing artworks to virtually fitting furniture in your living room, the benefit here is clear – we can understand how a potential real-world space will look and function so much better when we can actually see the objects we’re thinking of putting there – while we are there.

This last bit is why mobile AR is so important – if we want to make AR a practical technology, we have to be able to use it where we live, work, build and play, and we don’t want to drag a computer (or at least, a computer larger than our smartphones) everywhere we go to do it.

Here’s an example of placing designer clothing in a real-world setting, created with the ARe app and powered by Wikitude:

Opening up endless opportunities to showcase products of any size (from industrial machines to cars and jewellery), markerless AR enables a new level of shopping experience that can take place directly on the customer’s mobile device at any time. Options such as a 360-degree product viewer, custom feature annotations and 24/7 access allow customers to configure and compare products, communicate with merchants and shop from the comfort of their homes.

So be creative in your AR applications – and do something surprising. Developers all over the world are already using Wikitude technology to build AR apps that grab attention and customers – and it’s already making their lives easier.

Markerless AR infographic

Want to dig in deeper? We’ve collected a few of our favorite use cases in the infographic below and a list of apps already using the technology in this YouTube playlist. Have a look and see what inspires you to make something inspiring!

Looking to get started with Markerless AR?

Interested in creating an AR project of your own?
Talk to one of our specialists and learn how to get started.

Contact The Wikitude Team
Categories
Dev to Dev SDK releases

Power your Flutter app with augmented reality

Wikitude SDK 9.9 unlocks the power of augmented reality for Flutter 2.2

Wikitude was the very first AR platform to offer official support for Flutter. And now you can bring even more powerful augmented reality features to your Flutter-based apps. 

Our latest release, SDK 9.9, delivers location-based AR, image and object tracking, markerless AR, and a wide range of features allowing Flutter developers to augment the world around them in just a few hours. 

Wikitude’s Flutter Plugin is based on the JavaScript API and comes with the complete package: comprehensive AR library/framework, sample app, and documentation.
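
Because the plugin wraps the JavaScript API, the AR experience itself is a regular ARchitect world written in JavaScript and loaded by the Flutter widget. A minimal image-tracking sketch might look like the following; the target collection file, target name, and overlay image are placeholder assets for illustration.

  // Minimal ARchitect world a Flutter app could load via the Wikitude plugin.
  // Assumptions: assets/tracker.wtc contains a target named "pageOne" and
  // assets/overlay.png is bundled with the experience.
  var targetCollection = new AR.TargetCollectionResource("assets/tracker.wtc");

  var tracker = new AR.ImageTracker(targetCollection, {
      onTargetsLoaded: function () { console.log("targets ready"); }
  });

  // A flat overlay, 1.0 unit high relative to the target image.
  var overlay = new AR.ImageDrawable(new AR.ImageResource("assets/overlay.png"), 1.0);

  // Show the overlay whenever "pageOne" is recognized in the camera.
  new AR.ImageTrackable(tracker, "pageOne", {
      drawables: { cam: overlay }
  });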

In combination with the Wikitude SDK, Flutter’s 2.2 release brings more options for polish and optimization, iOS performance improvements, Android deferred components, and more.

For those hearing about Flutter for the first time, it is Google’s open-source UI development framework used to build natively compiled mobile apps for iOS and Android.

The ease of use and constant improvements in this framework make Flutter one of the most popular development tools in the community, surpassing an astonishing 2 million users in just 3 years.

Get started with augmented reality for your Flutter-based application. 

Curious what else is new in the Wikitude SDK 9.9? Head over to our release notes to learn more.

Download Wikitude SDK
Categories
SDK releases

Object Tracking 30% faster: Download SDK 9.5

Wikitude SDK 9.5 is out now, delivering unparalleled Object Tracking experiences.

Please welcome our last release of the year. SDK 9.5 brings support for Android 11, two new AR samples, and significant improvements in our computer vision engine for developers to enjoy in both the Professional and Expert editions of the Wikitude SDK.

You can test SDK 9.5 by downloading your SDK of choice here.  

Do you have a subscription license? Download our latest release to update your platform free of charge.

Wikitude SDK 9.5 highlights

30% faster Object Tracking

With this release, the Wikitude computer vision engine gets a significant upgrade, resulting in 30% faster object tracking.

This upgrade delivers unparalleled AR experiences based on real-world objects and scenes.

The latest Object Tracking update delivers:

  • 30% speed improvements for object tracking
  • More accurate tracking under challenging conditions (poor lighting, noisy environments, etc.)
  • Faster initial recognition of objects
  • Overall performance improvements for recognition and tracking

Object & Scene Tracking covers a wide variety of use cases, from supporting maintenance and remote assistance solutions to augmenting museum pieces, enhancing consumer products like toys, and much more.

Watch our latest Unity tutorial on how to work with multiple object tracking to get started.

Developers can create object tracking AR experiences using images or 3D models as an input method (such as CAD, glTF 2.0 and more).

Check out our interactive AR tracking guide to see the best feature for your specific use case.

New Wikitude samples for Expert Edition

The Wikitude sample app gets an upgrade with SDK 9.5. Unity developers can hit the ground running with two new samples:

  • Advanced rendering making use of ARKit 4 and ARCore advanced functionality

With this sample, you will be able to use ARKit, ARCore, and Wikitude capabilities in a single development framework, making the best of Unity’s AR Foundation. With advanced rendering, AR experiences get more immersive and realistic – leveraging people occlusion, motion capture, scene geometry, and more.

  • Multiple extended images

Bring image tracking experiences to a whole new level.

This new sample enables creating more interactive AR experiences by allowing targets to interact with each other and perform specific actions according to the developer’s needs. It’s ideal for games, sales material, marketing campaigns, and enterprise use cases such as training and documentation.

For full details on this release, check out our change log.

How to get SDK 9.5?

Ready to start developing? Wikitude makes it easy. Select an SDK, get a free trial, and start with your next augmented reality experience today.

Ready to launch your project? Seize the chance to take advantage of our Cyber Week (until 4.12.2020). Use code cyberweek15 to get 15% OFF your Wikitude SDK via our online store.

New to Wikitude? Welcome! We have a free Wikitude trial version for testing purposes. 

If you’re already a user and like what you’re testing, reach out to our team to discuss your upgrade to SDK 9.5. 

Download Now

Categories
SDK releases

New in: Explore Wikitude SDK 9.3

Update: SDK 9.3.1 is now available including iOS 14 compatibility.

August flew by, and a new release of the Wikitude SDK is already available. Today we’re bringing you version 9.3 of our AR toolkit, the last update for the summer. 

For AR developers, this means more stability for experiences based on 3D models as an input method, lots of improvements for Unity Editor, NatCorder support, and expansion of our Alignment Initialization feature for Professional Edition (PE). 

You can test SDK 9.3 by downloading your SDK of choice here.

Shout out to our awesome community for all the feedback and interaction via the forum this month. Want to have your say? Share your feedback about this release and what you’d like us to improve in the next SDK. 

Got a subscription license? Download Wikitude SDK 9.3 to update your platform free of charge.

What’s new in the Wikitude SDK 9.3?

Tracking improvements for 3D model-based AR experiences (Expert Edition)

In our last update, we introduced 3D Models as a new input method for developers working with Object Tracking. This means you can use glTF 2.0 files, CAD models, and other 3D model formats exclusively or in addition to images to build AR experiences based on all kinds of physical objects.

SDK 9.3 delivers a smoother tracking behavior and better alignment with real objects. 

In addition to significant stability improvements, we are rolling out a new SDK mode called ‘Assisted Tracking’ to complement Object Tracking. This mode gathers complementary information from native AR frameworks (ARCore, ARKit) to provide further tracking stability in conditions that are typically rough for the engine, such as heavy shaking or erratic movements.

With this update, the Wikitude SDK gives you everything you need to build robust AR for the widest variety of objects possible: from buildings to industrial machines, toys, home appliances, and more.

As we bring this new method of building AR experiences to the finish line (public release), you can request early access by applying to our beta program. All you need to do is provide details on what kind of object you’re looking to augment, your project goal, and company details.

Stay tuned for what’s to come! These initial steps prepare our technology for future improvements, an expanded range of trackable objects, and further enhancements to our Object Tracking feature moving forward.

APPLY FOR 3D MODEL OBJECT TRACKING BETA TESTING

3D Model Object Tracking benefits at a glance:

  • More Object Target Input methods available
  • Input source materials are interchangeable and combinable (images + 3D models)
  • Broader range of recognizable and trackable physical objects
  • Improved recognition and tracking performance

Find out which method (3D Model or Image) is best for your specific use case by using our AR Tracking Guide.

Alignment Initializer now available for Professional Edition

Our Alignment Initializer feature is now available in the Wikitude Professional Edition and has received a performance boost in the Expert Edition.

Based on initial experiments by some of our early beta testers, SDK 9.3 has received additional improvements in initialization behavior. In particular, the new release features improved accuracy when initialization is correctly triggered. This means that the alignment initialization process has become more selective, which guarantees a significantly improved customer experience. With a smoother and more intuitive process, developers can expect more reliability when tracking objects that may look alike.

When pointing a device at an object to recognize it, the improved Alignment Initialization guides the user towards a pre-defined viewpoint, rapidly kicking off the AR experience.

You’ll benefit from more stability when tracking objects and less jitter for Object Tracking experiences based on 3D Models (Expert Edition) and images (Professional and Expert Editions). 

Improved experience for Unity developers

The Wikitude Unity plugin makes it easy to create cross-platform apps with immersive augmented reality functionalities within a single development environment. 

Thanks to the feedback of our devoted community in the forum and customer discussions, we were able to identify and improve several features within the Unity Editor, delivering a smoother development experience for Wikitude users.

SDK 9.3 brings the following improvements:

  • Point Cloud preview is now properly destroyed upon unloading of a scene 
  • Fixed Live Preview webcam augmentation pose
  • Removed duplicate event setter in all tracker inspectors (Expert Edition only)

Additionally, our team took the opportunity to include compatibility for the popular NatCorder, a lightweight API that allows you to record videos in just a few steps in Unity. 

Developers can now also enjoy recording MP4 videos, animated GIF images, JPEG image sequences, and WAV audio files, as well as recording any texture (or anything that can be rendered to texture) or any pixel data.

Support for iOS 14

It’s official: Apple has released its Golden Master to third-party developers, and Wikitude developers get iOS 14 support right away. SDK 9.3.1 includes full compatibility for iOS JavaScript, iOS Native, Unity (Professional and Expert Editions), Flutter, Cordova and Xamarin.

How do I get SDK 9.3?

It’s easy! Just choose your SDK, get a free trial, and start on your next augmented reality experience today.

New to Wikitude? Welcome! We have a free Wikitude trial version for testing purposes. 

If you’re already a user and like what you’re testing, reach out to our team to discuss your upgrade to SDK 9.3. 

Wikitude Download Page

Categories
Digital agencies

5 tips to help pitch AR for your next project

Tips for agencies and developers to successfully pitch AR to clients and potential customers.

You get it – AR is incredibly cool, and most definitely the wave of the future – but your client might still be wondering if it’s time to ride the AR wave.

Have you pinpointed an excellent use-case for AR within a client’s project? We want to give you a few tips and suggest tools you can use to help get them on board. As a bonus, you will have an additional product for your portfolio you can really be proud of!

1. Explain the added value

Let’s talk about flash – and not the Adobe kind. Sizzle, wow-factor, attention-grabber, whatever you want to call it, AR has it.

It not only lets you see more of the world around you – it skyrockets user engagement with interactive content. A customized AR-experience is one of the most attention-grabbing features an app can offer.

For a little extra inspiration for your speech, check out 7 ways to use augmented reality in marketing today.

2. Demonstrate return on investment (ROI)

You’re a business, and they’re a business – the extra spending has to be justified.

The easiest way to do that? Show them a clear example of how AR can be linked directly to revenue – like in this video below from Takondi, one of Wikitude’s premium partners. Watch to see how easily AR can be used to implement mobile commerce.

There are a bunch of ways augmented reality can help businesses make more money. Here are a few of our favorites:

  • e-commerce – directly link real-world items to purchases
  • In-game purchases in an AR environment – you’ve seen the success of Pokémon Go – and remember virtual products have an excellent margin
  • Location-based deals – let users explore top deals near them (and guide them there)
  • Time-sensitive offers – reach people at the right time with the right offer
  • Augmented shopping – make every print material your user’s check out button
  • Offer enhanced multimedia about products using 2D and 3D recognition
  • Premium apps – offer an entirely in-app shopping experience!

3. Show the future of AR with facts

When Google, Apple, Facebook, and Microsoft start heavily investing in augmented reality, it is safe to assume the tech is not only on the rise but on the verge of something great.

To get an idea of the current predictions: Digi-Capital’s long-term virtual and augmented reality forecast has the AR/VR market reaching around $65 billion in revenue by 2024. And worldwide spending on AR/VR products and services throughout the 2019-2023 forecast period should achieve a five-year compound annual growth rate (CAGR) of 77.0% (IDC).

Access this AR facts and predictions article for more data. And for real-world examples, note that “Pokémon Go, once a viral sensation all over the globe, hasn’t fallen off the map. In fact, the augmented reality game is earning more money than it ever has before. According to mobile analytics firm Sensor Tower, Pokémon Go had a record year in 2019, taking in an estimated $900 million through in-app purchases.” – via the Verge.

Image: Sensor Tower

4. Highlight the simplicity of the tech

Excuse us while we toot our own horn, but, making AR easy is what we do. The Wikitude AR SDK is one of the most versatile tools available for developing mobile AR. Want to build your own Pokémon-Go-like app? You can do it with Wikitude in three easy steps.

5. …and most importantly, show a demo

Seeing is believing. So why not show your clients the great things they can do with AR? It’s a lot easier to want something you can see right in front of you. So the best and most important advice when you’re pitching AR? Show them a demo. Here are a few tips for doing that!

  • It’s best done live – find an AR app, either your own or from another company, and show it during an in-person meeting. If you need props (like a magazine or product), bring them with you!
  • Undersell, overshow – a good AR project speaks for itself. Rather than building up expectations, casually throw it out there – “Oh, have I shown you this cool trick?” *whips out phone*
  • Make sure it’s going to work! Do you need decent cellular data or Wi-Fi? Nothing impresses less than stalled technology. Check your connection before you move forward.
  • And speaking of demos – remember, you can always use the Wikitude trial license to win over your client!
Categories
SDK releases

Wikitude SDK 8.5: New Image Recognition Features And Improvements

Image Recognition just got better! Wikitude developers are now able to recognize even more types of targets and can enjoy new SDK features and performance updates.

As a result of our latest product improvement cycle, we are proud to present Wikitude SDK 8.5. The update offers even more stability to the platform and introduces two brand new Image Recognition features:

  • Transparent Areas in Image Targets
  • Image Targets at Runtime

New Image Recognition Features and Improvements

Wikitude Image Recognition is one of the best performing AR technologies on the market. But don’t take our word for it – we confidently invite any interested party to download a free trial and let the product speak for itself.

And, even though Wikitude has been offering Image Recognition since 2012, we are still eagerly committed to pushing the boundaries of what’s possible with this AR technology. Today we are excited to introduce two new Image Recognition features.

Transparent Areas in Image Targets

The new Transparent Area feature has been included to support those special types of image targets that do not fit the typical rectilinear image shape.

Examples of such images containing transparent areas beyond the main outlines include tattoos, stickers, logos, images with cutouts and basically any image file containing parts with alpha channel transparency.

Details at a glance:

  • Images with transparent areas can now be used as Target Images
  • Transparent areas will be ignored during the recognition and tracking process
  • Works with Image Targets at Runtime, not with WTC files
Image: SDK 8.4 (without transparency support – WTC) vs. SDK 8.5 (with transparency support)

Image Targets at Runtime

Traditionally, Image Recognition requires developers to create an Image Target collection (WTC file) prior to the AR experience, which is then used to detect and track images. As efficient and well-performing as this process may be – and it will continue to exist as is – developers now have an additional option for creating Image Targets.

With the new Image Targets at Runtime functionality, developers can create Image Targets on the fly, no need for preprocessing.

Details at a glance:

  • Use regular images (png, jpg, …) directly with the SDK as target images
  • Multiple images can be bundled into a zip file
  • No need for WTC file creation
  • Working with WTC files remains as is

Being able to define Image Targets at runtime means users have the ability to easily create AR experiences on the spot. Ideal for quick and convenient testing during the development phase.
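
In the JavaScript API, a runtime target collection could be set up roughly as sketched below. This assumes the target collection resource accepts a zip of plain images (or a single png/jpg) in place of a .wtc file, as described above; verify the exact loading mechanism and callback names in the SDK 8.5 documentation.

  // Hedged sketch: plain images used as targets at runtime (no WTC file).
  // Assumption: AR.TargetCollectionResource accepts a zip of png/jpg files.
  var runtimeTargets = new AR.TargetCollectionResource("assets/product-photos.zip");

  var tracker = new AR.ImageTracker(runtimeTargets, {
      onTargetsLoaded: function () { console.log("targets generated on the device"); },
      onError: function (message) { console.log("target creation failed: " + message); }
  });

  // A png with alpha transparency can also serve as a single runtime target;
  // its transparent areas are ignored during recognition and tracking.
  // var stickerTarget = new AR.TargetCollectionResource("assets/sticker.png");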

Also practical for spontaneous use cases on the go: demos during networking, dynamic art projects, in a classroom environment, adding AR to game cards, and more.

As mentioned above, Runtime Targets give Wikitude developers yet another target creation option. However, the traditional WTC file target collection methods continue to offer advanced functionalities, such as target image performance ratings and manual height value input fields. Ideal for AR experiences that require fine tuning and precise input for even more accurate distance to target calculation and proper wearable device calibration.

Image Recognition Performance Improvements

SDK 8.5 is a big release for Image Recognition AR technology. Alongside the new features presented above, the update is bringing significant performance advancements. Not only are image targets being recognized faster, but they are also being recognized from even further away. Check the details below:

Recognition distance

Image Targets can now be recognized from more than three meters away. To be more precise, we tested an A4 / US-letter sized image target and SDK 8.5 was able to recognize it from 312cm away. That is a whopping +40% increase in distance from previous SDK versions.

Regardless of the target size, our performance tests show that, with SDK 8.5, image targets can be recognized even when they occupy a mere 1% of the device screen area. In other words, this update is ideal for use cases in which users do not have the target image within arm’s reach.

Duplicate Targets

With growing demand for multiple and duplicate target Image Recognition, our team focused on improving duplicate target handling in this release. Wikitude developers will notice faster recognition when it comes to duplicate targets.

Recognition Speed and Tracking Stability

As a pioneer among AR technology providers, Wikitude takes pride in the quality and performance of its Image Recognition technology. And thanks to the natural advancements in device processing capacity, camera optics and related technology in general, we are able to expand the performance and capability of our own AR technology along the way. With this 8.5 release, expect substantial improvements in image recognition speed and tracking stability.

Update to the iOS sample project – now based on Swift 4.2

Apple has been progressing the Swift programming language quite fast since its inception in 2014. Nearly every year a new major version has been released. As part of the Wikitude SDK, we ship sample applications that show how to integrate the Wikitude SDK and how to use the APIs to create AR experiences. The Swift-based sample project had become a little rusty and was still based on Swift 3 – in SDK 8.5 the project is now based on Swift 4.2 and is compatible with the latest Xcode version.

AR SDK enhancements, fixes, and stability improvements

In parallel to the new components and innovative solutions that are frequently being added to the Wikitude AR feature set, our augmented reality SDK is also constantly being submitted to rigorous quality tests to ensure that Wikitude developers have access to the finest and most comprehensive AR tools on the market.

Wikitude SDK 8.5 includes a series of fixes and stability improvements. Please review the release notes for your platform for an in-depth report.

Download Wikitude SDK 8.5

Active Wikitude SDK subscribers are entitled to any and all SDK version updates that are released throughout their term. All other parties are invited, however, to download a free Wikitude SDK 8.5 trial version for testing purposes.

Access our download page to explore all options or click on the direct links below to begin downloading now. Once the download is complete you will be automatically redirected to the signup/login page.

Wikitude AR SDK for Android
Download Wikitude SDK 8.5 for Android JavaScript API
Download Wikitude SDK 8.5 for Android Native API

Wikitude AR SDK for iOS
Download Wikitude SDK 8.5 for iOS JavaScript API
Download Wikitude SDK 8.5 for iOS Native API

Wikitude AR SDK for Windows
Download Wikitude SDK 8.5 for UWP Native API

Wikitude AR SDK for Unity
Download Wikitude SDK 8.5 for Unity

Wikitude AR SDK for Cordova
Download Wikitude SDK 8.5 for Cordova

Wikitude AR SDK for Xamarin
Download Wikitude SDK 8.5 for Xamarin

Interested in keeping your app running smoothly and always compatible with operating system and device updates? Send us an email to learn how you can get 1+ year of SDK upgrades through our increasingly popular subscription program.

New to Wikitude? Access our store to choose your package or contact our team to discuss your specific AR requirements in detail.

Attending AWE USA 2019 in Santa Clara, CA? Make sure to visit Booth 419 to check out cool SDK 8.5 demos and meet the Wikitude team. CTO Phil Nagele will also be talking about scanning and using 3D objects in AR experiences on mobile and smart glasses at the developer track on May 30. More details and promo discount code here. Don’t miss out!

Categories
SDK releases

Wikitude SDK 8.4: Plane Detection, Image Recognition improvements, fixes and more

Wikitude is continuously working to improve its SDK so that you can have a smooth and comprehensive AR-developing experience.

Check the latest product enhancements, support updates, fixes and all that has changed with the Wikitude SDK 8.4 release.

Plane Detection

After being introduced last September as an experimental AR feature, Plane Detection went through important tests and tweaks and is now ready for its official debut.  

With Plane Detection, you can create AR experiences that accurately anchor digital content to surfaces at any orientation:

  • Horizontal Up (floor, carpet, table)
  • Horizontal Down (ceiling)
  • Vertical (walls, doors)
  • Arbitrary (ramps)

This AR feature not only expands use-case possibilities but also increases the accuracy of environment understanding – essential for AR experiences that are triggered without the use of pre-mapped targets (SLAM markerless technology).

Plane detection is available for Native API and Unity, with support for JavaScript coming later.

Image Recognition improvements

As the most commonly used Wikitude AR feature, our developer community tends to have a soft spot for Image Recognition. So we try to take extra good care of it.

Our latest developments have led to increased recognition speed and tracking performance. Already in this release, Wikitude AR developers can expect faster recognition and higher performance in challenging conditions such as cluttered environments and variable lighting, with even more optimization coming soon. Stay tuned!

New feature: “isDeviceSupportedAPI” in Native API

Working with the right AR tools means, among other things, having your apps run on the multitude of different devices in your user base. Not every device may be fully compatible with your AR features of choice, but luckily Wikitude’s AR features run on the vast majority of smart devices in circulation today.

When in doubt, use the “isDeviceSupportedAPI” function to easily and quickly check if certain AR features run on a specific smart device or not.

The “isDeviceSupportedAPI” has been available in the JavaScript API for some time now. The same API is now also available in the Native API and helps you save time when checking device compatibility.
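
On the JavaScript/Cordova side, where the check has been available for a while, usage looks roughly like the sketch below; the feature identifiers and callback order are assumptions for illustration, so check your platform’s plugin documentation for the exact strings.

  // Hedged sketch (Cordova plugin): checking device support before starting AR.
  // Assumption: feature identifiers are plain strings like the ones below.
  var requiredFeatures = ["image_tracking", "instant_tracking"];

  WikitudePlugin.isDeviceSupported(
      function () {
          // All requested features are supported – safe to load the AR world.
          console.log("AR features supported, loading experience");
      },
      function (error) {
          // Fall back to a non-AR screen instead of failing at runtime.
          console.log("AR not supported on this device: " + error);
      },
      requiredFeatures
  );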

Wikitude SDK 8.4 alterations*

  • End-of-Support: Titanium Module
    • Wikitude will open source the Titanium Module on Github in the upcoming weeks.
  • End-of-Support: x86 Intel architecture for Android SDK
    • x86 support for UWP-based devices, Epson BT-350, and Vuzix M300 will remain intact.
  • New minimum iOS requirement: iPhone 5s
  • New minimum Unity requirement: Unity 2017.4 LTS
  • Unity 2017.4: builds Android 64-bit by default

*access the 2019 SDK Clean-up article to view the full list of announced SDK alterations.

AR platform enhancements, fixes, and stability improvements

Wikitude SDK 8.4 contains many fixes and stability improvements as a result of our ongoing quality assurance projects that help keep our AR platform finely tuned, accurate and reliable.

Please review the release notes for your platform for an in-depth report.

Download Wikitude SDK 8.4

Wikitude customers with active subscriptions are entitled to this and all upcoming SDK updates launched during their term. Don’t have an SDK subscription? Contact our team to get one.

SDK 8.4 is available for various supported platforms, extensions, and operating systems, letting you create powerful cross-platform AR experiences for smartphones, tablets and digital eyewear across Android, iOS and Windows.

New to Wikitude? Contact our team to discuss your specific AR requirements in detail.

Categories
SDK releases

Wikitude Expands Its AR SDK to Epson Moverio BT-300 and BT-350 Smart Glasses

Epson, a pioneer in the technology industry, is leading the way in visual communications, wearable products, drone accessories and industrial solutions. A great part of that leadership is due to its innovative Moverio smart eyewear line. Supported by Wikitude since 2014, the devices have been used by enterprises and consumers worldwide to deliver hands-free augmented reality experiences.

Today, Wikitude is excited to expand the accessibility of its AR technology even further by launching a fully optimized Wikitude SDK for Epson’s newest devices: the Moverio BT-300 and BT-350 smart glasses.

Both Moverio smart glasses include motion-tracking sensors and feature dual (binocular) displays optimal for side-by-side 3D content. These AR eyewear pieces are also lighter, have a high-resolution camera, improved processing capability and an advanced Si-OLED display that is superior in contrast and transparency, making digital content blend into the real world much more realistically.

Following a detailed development process, the Wikitude AR SDK has been fully adapted to make the most of the unique features of both devices, ensuring optimal performance in a variety of environments and use cases. Among these customizations are:

  • Intel SSE optimization: ensuring best processing power and performance for both devices;
  • Optimization for stereoscopic view: enabling full 3D see-through (side-by-side view) support of Moverio smart glasses;
  • Personal calibration: enabling perfect alignment between the real world and AR content.

As of now, developers can download the new Wikitude SDK for Epson to create augmented reality solutions including object recognition, instant tracking (markerless SLAM), image recognition and tracking, location-based AR and more features.

Smart glasses are starting to change the way people interact with the world by fusing digital elements into our everyday lives. Epson’s BT-350 paired with Wikitude’s augmented reality SDK enables the design of innovative visitor experiences at museums, art galleries, exhibitions and even retail stores, going beyond what phones, tablets and audio guides have to offer. By offering smart glasses on their premises, such venues are already helping thousands of users experience brands and tours in a more fun, educational and engaging way. Check out a few examples at Epson’s case studies page for inspiration.

On the consumer side, Wikitude paired with Epson’s BT-300 ensures immersive AR experiences for entertainment and gaming using marker-based or markerless tracking. Make the world your virtual playground!

The increasing smart glasses adoption in the enterprise sector proves the crucial role this technology is playing to take preventive, corrective and predictive MRO processes to the next level. Wikitude’s powerful SDK features combined with Epson’s BT-350 smart glasses make the ideal combination for developing AR that is suitable for remote assistance, maintenance, and training.

Enjoy a seamless integration of digital content with the world by trying the fully optimized Wikitude SDK for the Moverio BT-300 and the Moverio BT-350.

Download Free Trial

Categories
Dev to Dev

Wikitude SDK 7: A developer insight

Wikitude SDK 7 includes a long list of changes and additions to our augmented reality SDK. In this blog post, we will go through the modifications in more detail and what they offer for developers and users.

As you will see, SDK 7 has 3 key areas of improvement: Object Recognition and Tracking based on SLAM, multiple image recognition and enhancements for iOS developers.

Bring your objects into your augmented reality scene

Let’s get started with the biggest addition in this release: Object Recognition and Tracking for augmented reality. With this, we introduce a new tracker type alongside our existing Image and Instant Tracking. The Object Tracker in the SDK gives you the possibility to recognize and track arbitrarily shaped objects. The idea behind it is very similar to our Image Tracker, but instead of recognizing images and planar surfaces, the Object Tracker can work with three-dimensional structures and objects (tools, toys, machinery…). As you may have noticed, we don’t claim that the Object Tracker can work on any kind of object. There are some restrictions you should be aware of, and some types of objects work a lot better than others. The SDK 7 documentation has a separate chapter on that.

In short – objects should be well structured and the surface should be well textured to play nicely with object recognition. API-wise, the Object Tracker is set up the same way as the Image Tracker.
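
In the JavaScript API, that setup could look roughly like the sketch below; the .wto target file, the annotation model, and the callback names are placeholder assumptions for illustration.

  // Hedged sketch: Object Tracker set up the same way as the Image Tracker.
  // Assumption: assets/machine.wto is an Object Target generated in Studio
  // Manager from an uploaded video of the physical object (see next section).
  var objectTargets = new AR.TargetCollectionResource("assets/machine.wto");

  var objectTracker = new AR.ObjectTracker(objectTargets, {
      onTargetsLoaded: function () { console.log("object target loaded"); }
  });

  // A 3D annotation shown while the object is tracked.
  var annotation = new AR.Model("assets/annotation.wt3", {
      scale: { x: 0.1, y: 0.1, z: 0.1 }
  });

  new AR.ObjectTrackable(objectTracker, "*", {
      drawables: { cam: annotation },
      onObjectRecognized: function () { console.log("object recognized"); },
      onObjectLost: function () { console.log("object lost"); }
  });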

The Object Tracker works according to the same principle as the Image Tracker. It detects pre-recorded references of the object (the reference is actually a pre-recorded SLAM map). Once detected in the camera, the object is continuously tracked. While providing references for Image Targets is straightforward (image upload), creating a reference for an object is a little more complex.

Scalable generation of object references

We decided to go for an approach that is scalable and usable for many users. This ruled out a recording application that would be used to capture your object, which would also require each object to be physically present. Considering this, we went for server-side generation of Object Targets (sometimes also referred to as maps). Studio Manager, our web tool for converting Image Targets, has been adapted to convert regular video files into Object Targets. You will find a new project type in Studio Manager that will produce Object Targets for you. Here’s a tutorial on how to successfully record objects.

https://www.youtube.com/watch?v=eY8B2A_OYF8

After you have uploaded your video, the backend will try to find the best possible Object Target over several computation runs. We can utilize the power of the server to run computationally intensive algorithms and arrive at a more accurate result than a pure on-device solution that has to operate in real time. Have a look at the chapter “How to create an Object Target” in the SDK 7 documentation for a deeper understanding of the process. It also gives us the ability to roll out improvements to the recording process without the need for a new SDK version.

Rendering Upgrade: Working with occlusion models

When moving from Image Targets to Object Targets, requirements for rendering change as well. When the object has a solid body with different sides, it is particularly important to reflect that when rendering the augmentations. SDK 7 introduces a new type in the JavaScript API called AR.Occluder that can take any shape. It acts as an occlusion model in the 3D rendering engine, so you can hide augmentations or make them a lot more realistic. For your convenience, the occluder can either use standard pre-defined geometric shapes or take the form of any 3D model/shape (in wt3 format). And not only Object Tracking benefits from this: occluders can, of course, be used in combination with Image Targets as well – think of an Image Target on your wrist that is used for trying on watches. For a proper result, parts of the watch need to be hidden behind your actual arm.
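
A hedged sketch of that watch example, assuming AR.Occluder accepts a wt3 model URI and options the same way AR.Model does; the asset names are placeholders.

  // Hedged sketch: an occluder shaped like the arm hides the parts of the
  // watch that should disappear behind it.
  var wristTargets = new AR.TargetCollectionResource("assets/wrist.wtc");
  var wristTracker = new AR.ImageTracker(wristTargets);

  // Assumption: AR.Occluder takes a wt3 URI and options like AR.Model does.
  var armOccluder = new AR.Occluder("assets/arm.wt3", {
      scale: { x: 0.05, y: 0.05, z: 0.05 }
  });

  var watchModel = new AR.Model("assets/watch.wt3", {
      scale: { x: 0.05, y: 0.05, z: 0.05 }
  });

  // Both drawables are attached to the wrist Image Target; the occluder only
  // contributes depth, so the watch is clipped where the arm sits in front.
  new AR.ImageTrackable(wristTracker, "wrist", {
      drawables: { cam: [armOccluder, watchModel] }
  });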

Updated SLAM engine enhancing Instant Tracking

Object Recognition and Tracking is based on the same SLAM engine that powers Instant Tracking and Extended Tracking. To make Object Recognition work, we upgraded the SLAM engine with several improvements, changes to the algorithm and bug fixes to the engine itself. This means SDK 7 carries an entirely revamped SLAM engine. You as a developer and your users will notice that in several ways:

1. Higher degree of accuracy in Instant Tracking and Extended Tracking
2. More stable tracking when it comes to rotation
3. Less memory consumption
4. Less power consumption

All in all, that means that devices running 32-bit CPUs (ARMv7 architecture) will see a performance boost and perform considerably better.

Instant Tracking also comes with two new API additions. Setting trackingPlaneOrientation on the InstantTracker lets you freely define on which kind of plane the Instant Tracker should start (wall, floor, ramp…). The other addition is the hit testing API, which lets you query the depth of any given screen point (x, y). It returns the 3D coordinates of the corresponding point in the currently tracked scene, which is useful for placing augmentations at the correct depth. The SDK returns an estimate based on the surrounding tracked points. The video below gives you an idea of how the hit testing API can be used.
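
A hedged sketch of both additions follows; the orientation value and especially the hit-test method name are assumptions based on the description above, so consult the SDK 7 JavaScript reference for the exact API.

  // Hedged sketch: choosing the starting plane and querying scene depth.
  var instantTracker = new AR.InstantTracker({
      // Assumption: orientation is given in degrees (0 = floor, 90 = wall).
      trackingPlaneOrientation: 0
  });

  // On a screen tap, ask the tracker for the 3D point behind that pixel and
  // place an augmentation there. The method name below is an assumption.
  function onScreenTapped(touchX, touchY) {
      instantTracker.convertScreenCoordinateToPointCloudCoordinate(touchX, touchY,
          function (x, y, z) {
              // Estimated 3D coordinates in the currently tracked scene.
              placeAugmentationAt(x, y, z); // hypothetical helper in the experience
          });
  }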

1, 2, 3… Multiple targets now available

Additionally, our computer vision experts worked hard to make our image CV engine even better. The most noticeable change is the ability to recognize and track multiple images at the same time in the camera frame. The engine can detect multiple different images, as well as multiple duplicates of the same image in the camera (e.g. for counting purposes). Images can overlap or even superimpose each other. The SDK does not have a hard-coded limit on the number of images it can track – only the processing power of the phone puts a restriction on it. With modern smartphones, it is easily possible to track 8 or more images.

Furthermore, the SDK offers developers the ability to get more information about the targets in relation to each other. APIs will tell you how far apart targets are and how they are oriented towards each other. Callbacks let developers react to changes in the relationship between targets. Developers can also define the maximum number of targets, so the application does not waste power searching for further targets. The image below gives you an idea of how this feature can look for a simple interactive card game.
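
A hedged sketch of how such a card game could use the relation APIs; the method and option names (getDistanceTo, maximumNumberOfConcurrentlyTrackableTargets) are recalled from the SDK 7 JavaScript API and should be verified against the reference.

  // Hedged sketch: reacting to the distance between two tracked cards.
  var cardTargets = new AR.TargetCollectionResource("assets/cards.wtc");

  var cardTracker = new AR.ImageTracker(cardTargets, {
      // Assumption: cap concurrent targets so the engine stops searching early.
      maximumNumberOfConcurrentlyTrackableTargets: 2
  });

  var cardOne = null;

  new AR.ImageTrackable(cardTracker, "card_one", {
      onImageRecognized: function (target) { cardOne = target; }
  });

  new AR.ImageTrackable(cardTracker, "card_two", {
      onImageRecognized: function (target) {
          if (cardOne) {
              // e.g. trigger a combo animation when the cards are moved together.
              console.log("distance to card one: " + target.getDistanceTo(cardOne));
          }
      }
  });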

Boosting recognition to unparalleled distances

All developers and users who require images to be recognized from a far distance in their augmented reality scene should take a look at the extended range recognition feature included in SDK 7. By using more information from the camera frame, SDK 7 triples the recognition distance compared to previous SDK versions. This means that an A4/US-letter sized target can be detected from 2.4 meters/8 feet. Put differently, images that cover just 1% of the screen area can still be accurately recognized and a valid pose successfully calculated. The SDK enables this mode automatically on devices capable of the feature (auto mode); alternatively, developers can manually enable or disable the function. When testing the feature and comparing it to competing SDKs, we did not find any other implementation delivering this kind of recognition distance. All in all, this means easier handling for your users and more successfully recognized images.
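
A hedged sketch of the manual override; the option and constant names are assumptions based on the auto/on/off behavior described above.

  // Hedged sketch: forcing extended range recognition on instead of auto mode.
  var posterTargets = new AR.TargetCollectionResource("assets/posters.wtc");

  var posterTracker = new AR.ImageTracker(posterTargets, {
      // Assumption: the option and constant are named roughly like this;
      // by default the SDK picks auto mode on capable devices.
      extendedRangeRecognition: AR.CONST.IMAGE_RECOGNITION_RANGE_EXTENSION.ON
  });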

Bitcode, Swift, Metal – iOS developers rejoice

This brings us to a chapter dedicated to iOS developers, as SDK 7 brings several changes and additions for this group. First of all, the Wikitude SDK now requires iOS 9 or later, which shouldn’t be a big hurdle for the majority of apps (currently nearly 95% of devices meet this requirement). With SDK 7, iOS developers can now build apps including the Wikitude SDK using the bitcode option. Apps built with bitcode have the benefit of being smaller, as only the version necessary for the actual device architecture (armv7, armv7s, armv8) is delivered to the user rather than a fat binary including all architectures.
As a more than welcome side effect of restructuring our build dependencies to be compatible with bitcode, the Wikitude SDK can now also run in the iOS simulator. You still won’t see a camera image in the simulator from your webcam, but you can work with pre-recorded movie files as input for the simulator.

In SDK 6.1 we introduced support for OpenGL ES 3 as a graphics API. SDK 7 now also lets you use Metal as your rendering API in Native and Unity projects. Speaking of new things, Wikitude SDK 7 also includes an extensive Swift sample explaining how to integrate the Wikitude SDK in a Swift application. Note that the API itself is still an Obj-C API, but the sample makes it a lot clearer how to use the API within a Swift environment.

We haven’t forgotten Android

Android developers will be happy to hear that the Android version of Wikitude SDK 7 makes use of a different sensor implementation for Geo AR experiences. The result is smoother and more accurate tracking when displaying geo-related content. For Android, we are also following the trend and raising the minimum version slightly by requiring Android 4.4 or later, which still covers at least 90% of Android devices.

We hope you can put SDK 7 and its additions to good use in your AR projects. We’d love to hear from you and are keen to receive suggestions on how to make the Wikitude SDK even more useful to you!

Start developing with Wikitude SDK 7

Getting started with SDK 7 has never been easier! Here’s how:

Help us spread the news on Twitter, Facebook and Linkedin using the hashtag #SDK7 and #Wikitude.

Categories
News

Wikitude at AWE USA 2017: Auggie Awards, talks and more

The highly anticipated AWE USA 2017 has come and gone and now that the dust has settled it is safe to say…it was awesome! 

The largest augmented and virtual reality exposition and conference in the world is growing stronger. This year’s event, which gathered 4700 attendees, gave the AR+VR community an excellent chance to exchange knowledge, share news, demonstrate technologies and, of course, to have some interactive AR+VR fun.

Wikitude has participated in every AWE USA event so far, and here is how the Augmented World Expo 2017 unfolded for us.

Partnership announcement with Lenovo NBD

With a great kick-start at the AWE USA 2017 Press Conference, Wikitude CEO Martin Herdina talked about our recent 1 billion app installs achievement as well as some practical applications of our markerless tracking technology launched earlier this year. Additionally, he spoke about the importance of partnerships with industry leaders before formally announcing our collaboration with Lenovo New Vision.

Lenovo NBD is launching an Augmented Human Cloud, powered by Wikitude’s intelligent recognition engine and Markerless SLAM technology, and their COO, Oscar Gu, says that “the goal of the AH Cloud is to reduce the AR applications development term from two weeks to two hours”. Impressive stuff!

Wikitude: winner of 2017 Auggie Awards for ‘Best Developer Tool’

Wikitude had the honor of stepping on stage once more, but this time for a different reason: to receive an Auggie Award. Wikitude’s SLAM SDK was recognized as the ‘Best Developer Tool’ and the award was accepted with pride and will certainly be an incentive for Wikitude to keep innovating, pushing boundaries and evolving in the powerful realm of AR.


Wikitude and Walmart explore AR in retail

Martin Herdina took the stage once again to speak about the value of AR in retail from a consumer perspective. He covered market facts, tendencies, and statistics, followed by several interesting use cases ranging from pre-shopping and onsite tools to home shopping through product visualization, as well as brand engagement.

The AR retail-oriented talk was finished off by Walmart’s Systems Analyst, Steven Lewis, who shared the AR introduction process experienced by Walmart’s internal development team as well as a practical use case for modular (shelf) setup. If you are interested in Walmart’s Journey into AR // How AR Creates Real Value, this AWE Consumer Track talk is for you.

What’s next with Wikitude

For the developer crowd, our CTO Philipp Nagele prepared a speech about the company’s background, followed by an in-depth view of present and future Wikitude developments as well as what the next version of the Wikitude SDK will offer augmented reality developers. On top of that, he was also throwing chocolates around. If you are curious, watch What’s Next with Wikitude to see this great AWE Develop Track talk in its entirety.


In between talks, press announcements and demos, we had a chance to connect with some amazing people and give them a sneak peek of what’s to come. If you didn’t make it to AWE, stay tuned to hear first hand some exciting news in just a few weeks!