
Creating 3D content for augmented reality

Content is constantly changing. Designed for TVs and devices in the early 2000s, it now transcends the 2D realm and comes into the world around us. 3D augmented reality content needs to be as immersive as VR advocates ever dreamed, minus the isolation from the outside world.

The more AR becomes part of our lives, the greater the need for content that adapts to the 3D world. That means content needs to be realistic, spatial, and engaging. And while there are thousands of apps online, most companies are still figuring out what compelling content looks like in AR.

In this post, we’re diving into the role of content in augmented reality, the challenges the industry faces, and the future of spatial content.

Augmented reality content basics

Augmented reality content is the computer-generated input used to enhance parts of users’ physical world through mobile phones, tablets, or smart glasses. It can be user-generated (think of social media face filters) or professionally produced by designers working for brands and specialized agencies.

AR content most often comes as 3D models, but it can also take the form of visuals, video, or audio.

Whether you are using AR to buy a new IKEA sofa or play a game, the quality of the content you see in the app will make (or break) the AR experience.

Image source: IKEA

The role of 3D content in augmented reality experiences

Among the thousands of AR apps on the market today, the most successful ones have one thing in common: high-quality, engaging AR content. Fail to deliver that, and your project risks joining the astonishing 99.9% of apps that flop or become irrelevant in the app stores.

Content is the heart of augmented reality. It ensures users have a reason to keep coming back.

Users might be thrilled to scan an augmented wine bottle a few times and share the experience with friends. But how many times can we expect them to go back and watch the same video? 

Companies must treat AR content as a critical component of a long-term, well-thought-through digital strategy to ensure app longevity. That means constantly delivering fresh, contextual, and personalized content.

Easier said than done. From high production costs to a scarcity of skilled professionals, building AR content at scale is one of the biggest challenges companies face, and it often keeps their apps from staying relevant in the long run.

Challenges of building 3D content for augmented reality

3D models aim to be faithful digital twins of the real world. Combined with other rendering elements (e.g. animation, audio, and physics), they make up AR’s most used type of content and add an immersive layer to the user experience.

What the user doesn’t see is the relatively complex process of creating such realistic visual assets. Their production can range from detailed manual modeling and the reuse of computer-aided design data to photogrammetry-based capture.

Size limits, file formats, and the total size of the application are just some of the many requirements developers need to understand to build great AR experiences. In addition, the lack of industry standards for AR content and the limited pool of qualified professionals pose significant challenges for the industry.

Building 3D assets: 3D modeling versus 3D scanning

Before we jump into the technicalities of creating content for AR, there are some basic concepts we need to clarify.

3D modeling vs. 3D scanning

3D modeling and 3D scanning are two ways of building 3D assets for augmented reality. 

3D modeling uses computer graphics to create a 3D representation of any object or surface. This technology is beneficial when used to recreate physical objects because “it does not require physical contact with the object since everything is done by the computer” (Skywell Software). That makes 3D modeling ideal for creating virtual objects, scenes, and characters that don’t exist in the real world (think of Pokémon and other fantasy AR games).

3D scanning uses real-world objects and scenes as a base for the production of AR assets. Using this method, the content creators don’t craft the model from scratch using a program. Instead, they scan the object using one of two different methods: photogrammetry or scanning through a 3D scanner device (LiDAR or similar). 

GIF source: Apple.com

The main difference between the two is how they capture the data of the object. While photogrammetry uses images captured by regular smartphones, smart glasses, or tablets, scanning requires special devices equipped with depth sensors to map the object. 

This makes photogrammetry more accessible to the broader developer crowd when creating AR content, as no special equipment is required. On the flip side, 3D scanners deliver more reliable results.

With either approach, a point cloud can be extracted and then applied in the AR experience. You can read more about the advantages of each method in the 3D point cloud section below.

Ultimately, you can decide between 3D modeling and 3D scanning by assessing whether you have access to the physical object you want to capture. If the selected AR target object is not available, 3D modeling is the way to go.

How is 3D content created for augmented reality?

There are plenty of AR content creation tools available on the market. Some are simple drag-and-drop editors that don’t require coding skills. Others are much more complex and target experienced professionals.

Here’s an overview of the different possibilities:

Image source: DevTeam.Space


3D point cloud: In AR, a point cloud is a virtual representation of the geometry of real-world objects using a collection of points. Generated via photogrammetry software or 3D scanners, these points are captured based on the external surfaces of objects.

Because photogrammetry allows gathering 3D information from 2D images, this method makes content creation more accessible. It overcomes the ownership issues often faced with 3D models: anyone can create a 3D model by simply recording or scanning the real object. 3D scanners (for example, LiDAR-enabled devices) are gradually becoming more widely available and provide more detailed point clouds thanks to their depth sensors.
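For developers who like to see the data structure, here is a minimal, illustrative Unity C# sketch of how a point cloud (essentially a plain list of 3D points) could be previewed in a scene. The PointCloudPreview class and the assumption that the points were already exported by a photogrammetry tool or scanner are made up for this example; it is not tied to any specific product.

using UnityEngine;
using UnityEngine.Rendering;

// Illustrative only: previews a point cloud as a Unity mesh rendered with point topology.
// The points array is assumed to come from a photogrammetry or LiDAR export (e.g. a parsed PLY/XYZ file).
public class PointCloudPreview : MonoBehaviour
{
    public Vector3[] points;

    void Start()
    {
        var mesh = new Mesh { indexFormat = IndexFormat.UInt32 };
        mesh.vertices = points;

        // Each vertex gets its own index so the mesh is drawn as individual points.
        var indices = new int[points.Length];
        for (int i = 0; i < points.Length; i++) indices[i] = i;
        mesh.SetIndices(indices, MeshTopology.Points, 0);

        gameObject.AddComponent<MeshFilter>().mesh = mesh;
        gameObject.AddComponent<MeshRenderer>().material = new Material(Shader.Find("Unlit/Color"));
    }
}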

Commercial tools such as Wikitude Studio, Apple Object Capture, and Amazon Sumerian are examples of photogrammetry-based programs.

AR Object Target Transformation in Wikitude Studio Editor

CAD (Computer-Aided Design): CAD models are commonly the first step in prototyping physical goods, bringing a first view of the product to life in the digital world. Assisted by software applications, AR developers can repurpose legacy CAD models for augmented reality-based solutions. Existing CAD data can then be used as the input to create digital representations of the object or environment to be augmented.

Once uploaded into the selected program, CAD data is converted to a format AR can use on phones, tablets, and smart glasses. CAD models typically provide accurate information about the object, maximizing the potential for a reliable AR experience. Although prevalent in the industrial sector, CAD-based AR experiences are progressively gaining popularity in consumer-facing apps.

Games, computer graphics: authoring software tools such as Blender, 3ds Max, and Maya are popular 3D design applications used by AR content creators. Unity, Unreal Engine, and even Apple’s Reality Composer are great tools to assemble the pieces of content and make them work together for augmented reality.

Other 3D models: beyond CAD, other popular 3D model formats can be leveraged to power augmented reality solutions, for example glTF 2.0, FBX, and OBJ. Which file formats are compatible depends on the program used to build the augmented reality experience.

On the one hand, this wide variety of 3D asset formats has opened the door for creators from many fields to put their existing models to work for AR. On the other hand, it creates confusion among developers, fueling the debate around the need for standardization in the AR industry and the creation of alternative tools that are intuitive and code-free.

What’s next for AR content creation?

With increased interest in augmented reality, we will see more tools emerging that help to create content, overcome workforce scarcity and deliver actual value through the technology. 

To facilitate content creation, AR companies are investing in platforms that don’t require technical skills (thereby closing the workforce gap) and help brands streamline the AR content creation process.

An example is Apple’s latest release, RealityKit 2. This new framework includes the much-awaited Object Capture feature, which allows developers to snap photos of real-world objects and create 3D models using photogrammetry.

But if Apple’s announcement gives you déjà vu, you are not wrong. Last year, the AR media went crazy about an app that lets you copy and paste the real world with your phone using augmented reality.  

The topic of interoperability of experiences across platforms and devices is equally important. The ability to code an AR app once and deploy it across several devices and operating systems helps companies bring their projects to market as fast as possible.

The final and most crucial aspect is understanding how 3D content in augmented reality can deliver value to its users. That means setting clear goals for the AR project, understanding how it fits into your digital strategy, and having a deep knowledge of your customer.

What are some of the trends you see in AR content creation? Let us know via social media (Twitter, Facebook, and LinkedIn) and tag @wikitude to join the conversation.


“The Turtle” developer tutorial: markerless app made with Unity (SLAM)

Update: Track, save and share AR experiences with SDK 8

Since the launch of Wikitude’s SLAM technology, “The Turtle” has been the #1 tutorial request we’ve received. So who better than the developers themselves to show you how they made turtles float around the streets of Michigan?

This post is a guest post by YETi CGI, a tech design company that has developed projects for companies like Disney, Mattel, National Geographic, and more.

The Turtle, as we affectionately call it, is a markerless AR concept demo we built on the new Wikitude SDK as part of our own research into 3D mobile AR. In broad strokes, the app causes your phone to see a majestic sea turtle swimming about the room, getting believably close and distant depending on where it is in virtual space.

Building the AR experience

The demo’s effect hinges on the turtle itself, right? Both its look and its feel contribute to the sense of it swimming in the same room the user occupies. In order to convey this, the first step is to give the turtle some context, which is where Wikitude’s SDK comes in. Its SLAM algorithm defines the virtual space based on the physical one, gathering data from the room and mapping it into a Unity scene.

Because of what Wikitude provides, our coding needs were actually pretty simple. Once the Unity scene is created the camera performs a raycast onto the ground, giving the program most of the variables it needs to run. Aided by the virtual map of the physical room and the raycast data, the prefabbed object (which is then given the turtle as its identity) is positioned at a comfortable distance directly in front of the user.

When the scene is set to its active state and tracking is running, we position the turtle directly in front of the viewer:

if (_isTracking)
{
    //cast ray out from center of screen, get position it hits the ground plane
    Ray cameraRay = Camera.main.ScreenPointToRay(new Vector3(Screen.width / 2, Screen.height / 2, 0));
    RaycastHit hit;
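    // 'ground' is assumed to be a Collider on the detected ground plane;
    // Collider.Raycast only tests against that single collider, up to 30 units away.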
    if (ground.Raycast(cameraRay, out hit, 30))
    {
        objectHolder.transform.position = hit.point;
    }
}

The turtle’s animation is mostly provided by the art team, but it only consists of a looped reel of images. The baked-in animation doesn’t actually include the circles the turtle swims in, though it’s worth noting that it could have been done that way. Our approach was to use a free plugin called DOTween to drive the turtle’s motion. We think it worked nicely.
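As a rough illustration of that idea, here is a minimal DOTween sketch (assuming the free DOTween plugin is imported into the project); the SwimInCircles class, the field names, and the waypoint values are invented for the example and are not part of the original demo.

using UnityEngine;
using DG.Tweening; // free DOTween plugin

// Illustrative sketch: loops a model along a smooth circular path so it appears to swim in circles.
public class SwimInCircles : MonoBehaviour
{
    public Transform turtle;        // the instantiated turtle model (example field name)
    public float radius = 1.5f;
    public float lapDuration = 10f;

    void Start()
    {
        // Sample a circle around the starting position to use as path waypoints.
        Vector3 center = turtle.position;
        var waypoints = new Vector3[12];
        for (int i = 0; i < waypoints.Length; i++)
        {
            float angle = i * Mathf.PI * 2f / waypoints.Length;
            waypoints[i] = center + new Vector3(Mathf.Cos(angle), 0f, Mathf.Sin(angle)) * radius;
        }

        // CatmullRom smooths the path, SetOptions(true) closes the loop, SetLookAt keeps
        // the model facing its direction of travel, and SetLoops(-1) repeats forever.
        turtle.DOPath(waypoints, lapDuration, PathType.CatmullRom)
              .SetOptions(true)
              .SetLookAt(0.01f)
              .SetEase(Ease.Linear)
              .SetLoops(-1);
    }
}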

Did you know you can now use Unity’s live preview with the Wikitude SDK? See more

Again, Wikitude’s SDK accommodates the project beautifully: most of the code written by our team handles positioning and camera input. Because we used Unity, Wikitude was able to provide us with prefabs (pre-built assets) that let us drag and drop ready-to-go elements, giving us the freedom to work on the design and UX without having to first reinvent the wheel.

The outcome

How cool is it to track the world around us? We’re pleased with how it turned out, and we’re going to continue learning about the applications of markerless 3D AR. At YETi we’ve been active in the scene for some time, and to have tools like Wikitude’s SDK showing up is a huge encouragement.

That’s it! Hope you enjoyed this tutorial. Below are some relevant links to get you started with Instant Tracking. Let us know in the comments below which tutorials we should make next.

Get started with the SDK for Unity:

 

Download Unity SDK
Set up guide
Instant Tracking info
Other apps using Instant tracking







A beginner’s guide to augmented reality with Unity

The Wikitude Academy has been successfully supporting students, professors, and academic institutions since 2012. In short, this awesome initiative has been giving free access to the full feature set of the Wikitude SDK EDU to eligible applicants all around the globe.

Due to the success of the program and popular demand, the Wikitude Academy has now partnered up with award-winning international professor & best-selling author Dr. Penny de Byl to offer online augmented reality courses.

If you are a mobile app creator, a game designer/developer, or an AR/Unity enthusiast looking to expand your skill set, this course is for you. Presenting:

A Beginner’s Guide to Augmented Reality with Unity, featuring Mobile AR Applications with Wikitude using ARKit & ARCore for iOS and Android.

Hosted on the online learning platform Udemy and designed for AR beginners, the course ranges from examining AR’s earliest origins to understanding the mashup of computerized environments with the real world. The topics covered in the course include:

  • Projecting Virtual Objects over the live camera feed
  • 2D Image Recognition
  • 3D Object and Scene Recognition
  • 3D Scene Recognition
  • QR and Barcode Detection
  • Image Tracking, and
  • Placing virtual interactable objects and animations into a real scene

Like what you see? Then we suggest you act fast.

With a small investment, enrolled students have full lifetime access to 53 lectures, 10.5 hours of on-demand video, 6 articles, 28 downloadable resources, a certificate of completion, and more:

  • All students enrolled in this course are entitled to a free Wikitude SDK EDU license.

Enroll now and learn how to create your own AR app from scratch with Unity and Wikitude.

“Dr. Penny introduces augmented reality techniques using her internationally acclaimed holistic teaching style and expertise from over 25 years of teaching, research, and work in games and computer graphics. Throughout the course, you will follow along with hands-on workshops designed to teach you the fundamental techniques used for designing and developing augmented reality mobile applications.”

To read the full description of the course and sign up, please access A Beginner’s Guide to Augmented Reality with Unity.







Migrating from Moodstocks: here’s how to keep your app online

The augmented reality industry is on fire these past weeks:

  • Snapchat has quietly introduced a whole new world of augmented reality right in our hands
  • Pokémon Go’s AR game added $7.5 billion to Nintendo’s market value
  • and just last week Moodstocks, the French image recognition startup, was bought by Google

As pioneers in the field, Wikitude couldn’t be more excited about all the attention (and actual usage) augmented reality has gotten recently. In fact, Google searches for the term “augmented reality” increased by 417% in the past week.

Google 'augmented reality' search results

About a year ago, the industry experienced similar hype when Apple bought Metaio. Back then, our team helped hundreds of developers and agencies migrate their apps to Wikitude. This time around, it won’t be different: we’re here to help Moodstocks-based apps keep running smoothly and make the switch in just a few steps!

Image Recognition and Tracking, Geo-based AR, Cloud Recognition Services, and a content management system: Wikitude offers a full stack of products to build incredible AR apps in no time!

Wikitude SDK

The Wikitude SDK is the core of our augmented reality solution set. It provides developers with a powerful SLAM rendering engine for mobile apps and smart glasses.

The latest version of the Wikitude SDK includes all the features (and more) to keep your Moodstocks app amazing your users. Here’s a list of all features:

  • Geo-based AR
  • Image recognition and tracking
  • Object recognition and tracking
  • Instant tracking
  • Extended image tracking
  • 3D modeling and presentation layers
  • 3D tracking for Unity SDK

This means you can overlay digital content on 2D images on planar surfaces (magazines, catalogues, billboards, TV/computer screens) and non-planar surfaces (product packaging, images placed on objects, etc.), stick the augmentation to the user’s screen, and even include location-based AR (just like the Pokémon Go app does).

Why Wikitude?

  • You can build your own white-label app from scratch or include the SDK in an existing app.
  • We have a sample app which you can use as the base of your project and to test all features included in the SDK.
  • Our set up guides are quick and easy!
  • Choose among several development platforms and programming languages to build your app, such as JavaScript, Unity, Xamarin, PhoneGap, Titanium, and native iOS and Android.
  • With Wikitude you have the freedom to choose between On-Device Recognition (offline, aka client recognition), Cloud Image Recognition (online), or a ‘hybrid’ (online-offline) alternative.
  • Using the Wikitude Cloud recognition will allow you to add, replace or remove images without republishing your app.
  • Manage your AR content with the Wikitude Studio/Target manager.

And a few more things:

  • Stable technology: we are pioneers in the AR industry, first launching our AR SDK in 2009;
  • 100% in-house API: our technology is robust, fast and precise, fully developed by our highly qualified R&D team;
  • Fast and scalable: recognize 2D targets in less than a second (current speed: 0.5 seconds). You can upload up to 50,000 images to our cloud, but if you need more, just let us know;
  • Trusted by global brands: Wikitude is the largest independent AR SDK provider in the market. MasterCard, Johnson & Johnson, Volkswagen, Cisco, SAP, Konica Minolta, and 20th Century Fox are some of the hundreds of global brands using our technology;
  • Smart glasses: our SDK is optimized for the industry’s top AR smart glasses for hands-free experiences;
  • Best support team: we hear this every day and are proud of it! We’re here for you.

How to migrate from Moodstocks to Wikitude:

Don’t be shy, if you have any questions or need help, reach out to info@wikitude.com anytime!

Download Free Trial


Build the next Pokémon Go with Wikitude’s SDK 5.2

Update (August 2017): Object recognition, multi-target tracking and SLAM: Track the world with SDK 7

Are you working on building the next Pokémon Go? Wikitude can help make that a whole lot easier! Geo location, image recognition, and on-screen interactives – all right here for the taking. The next version of our industry-leading augmented reality SDK is out – and it’s got a few upgrades we think developers need to know about.

So if you’ve got the next smash-hit AR app in mind, hop on board – we’ve got the tools you need to build it.

Build your own Pokémon Go!

The Wikitude app was the first publicly available app that used a location-based approach to augmented reality, back in 2008. If you are ready to build the next Pokémon Go app, all you have to do is check out our Geo-based AR feature included in the Wikitude SDK.


Image credits: The Verge

With a few lines of code you can build awesome AR games that will augment not only Snorlax, Bulbasaur, and Pikachu, but any 3D model of your favorite creature, along with videos, augmented buttons, HTML widgets, and more!

Here’s how to build an app like Pokémon Go with Wikitude:

(Don’t be shy, if you have any questions or need help, reach out to info@wikitude.com anytime!)

New “Camera control feature” (Input Plugins)

This new feature (Camera Input Plugins) allows developers to feed the Wikitude SDK with their own input images and manage the camera stream on their own, making the SDK more flexible about where it receives camera images from.
It’s an extension of our existing Plugins API feature, introduced with SDK 5. The SDK comes with an extensive “Custom Camera” sample that demonstrates the feature with a custom-rendered camera image using a shader for a scanning effect.

Camera Input Plugins

This new camera input feature not only lets you provide input from different cameras to the SDK, it also doesn’t “occupy” the camera while the SDK is running, so other tasks can use the camera stream at the same time.
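To make the general idea concrete, here is a rough Unity C# sketch of managing your own camera stream. It uses only standard Unity APIs, and the PushFrameToSdk method is a hypothetical stand-in for however an input plugin would hand frames to the SDK; it is not an actual Wikitude call.

using UnityEngine;

// Rough illustration of managing your own camera stream in Unity.
// PushFrameToSdk is a hypothetical placeholder for however an input plugin
// would hand pixel data to the AR SDK; it is not a Wikitude API call.
public class CustomCameraFeed : MonoBehaviour
{
    WebCamTexture cam;
    Color32[] buffer;

    void Start()
    {
        cam = new WebCamTexture(1280, 720, 30);
        cam.Play();
    }

    void Update()
    {
        if (cam == null || !cam.didUpdateThisFrame) return;

        if (buffer == null || buffer.Length != cam.width * cam.height)
            buffer = new Color32[cam.width * cam.height];

        cam.GetPixels32(buffer);                     // copy the current frame
        PushFrameToSdk(buffer, cam.width, cam.height);
    }

    void PushFrameToSdk(Color32[] pixels, int width, int height)
    {
        // Hypothetical: forward the frame (plus any custom processing, such as
        // a scanning-effect shader pass) to the SDK's input plugin here.
    }
}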

Making Unity better (and easier)

Our most popular plugin was updated to make product visualization even better!

We made some essential changes to the camera prefab (setup), which has been simplified in its structure. The hierarchy it previously had is now a single GameObject, making it easier to combine the prefab with other Unity objects and features, like the physics engine. Best of all, your 3D content will no longer stand alone in your AR experiences!

https://www.youtube.com/watch?v=7VEkSZEO-jU

You will be able to insert shadows, change textures on 3D models, make objects interact with each other, and drag and drop several objects into the AR experience.
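For example, here is a minimal Unity C# sketch (generic Unity APIs only, nothing Wikitude-specific) of giving an augmented model a collider and rigidbody so it can collide with and rest on other objects in the scene.

using UnityEngine;

// Minimal sketch: lets an augmented 3D model take part in Unity physics
// once it has been placed in the AR scene (generic Unity, not Wikitude-specific).
public class MakePhysical : MonoBehaviour
{
    void Start()
    {
        // Give the model a collider and a rigidbody so it can collide with,
        // and rest on, other objects such as a plane representing the floor.
        if (GetComponent<Collider>() == null)
            gameObject.AddComponent<BoxCollider>();

        var body = gameObject.AddComponent<Rigidbody>();
        body.mass = 1f;
        body.useGravity = true;
    }
}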

So whether you are building interactive augmented reality catalogues, like IKEA’s, or awesome AR games, get started today with our updated Unity extension.

Download SDK 7.0


Wikitude Cloud Recognition: Scalable and 10% Faster

Over the past few days we quietly rolled out an upgrade to our cloud services for all customers. The update focused on bringing the components and infrastructure behind our Cloud Services up to the latest versions. The database infrastructure played a crucial role here: we upgraded to the latest MongoDB version.

Besides all the security and stability fixes that come with it, this upgrade also resulted in an average performance improvement of 10%, meaning the cloud services can handle more parallel requests and traffic.

Additionally, we fixed a few UI glitches and bugs – particularly when uploading a large number of images at once (>300). Wikitude Studio also has an updated duplicate check that better recognizes when you upload an identical image.

Generating WTC files

Wikitude Studio is the easiest and fastest way to generate WTC files, the file format recognized by the Wikitude SDK. Whether you are working with your own CMS or with a large number of images directly in the cloud, you can use this service to automate the generation of WTC files in just a few clicks.

If you still haven’t tried Wikitude Studio, including Cloud Recognition and the web-based editor, get a free trial today from your developer license page. For optimal performance, Wikitude has servers distributed across the globe: Europe, the Americas, and China. To learn more, check out our documentation section.

 

Get Free Trial

 


Unity Plugin Update – 3D tracking, new sample app and more!

Update (August 2017): A developer insight into Wikitude SDK 7 – Object recognition, multi-target tracking and SLAM-based instant tracking

The Wikitude development team has just released an update to our Unity plugin!

The Unity plugin has been Wikitude’s #1 plugin since its first release, and this upgrade adds a number of cool features on top of our Native SDK, including the long-awaited 3D tracking and a new sample app!

So here it is, the new Unity plugin 1.2.1-1.1.0 (read: Plugin version 1.1.0 including Wikitude SDK 1.2.1) is now available for download and free trial to our developer community!

What’s new?

  • Upgrade to Wikitude SDK (Native API) 1.2.1
  • 3D Tracking
  • Extended Tracking quality indicator
  • Camera Controls (back/front camera)
  • Callbacks are now Unity Events
  • New Unity Plugin Reference included
  • Improved and extended documentation
  • New Unity based sample app for Android and iOS
  • Several bug fixes and stability improvements

Note that, due to these changes, the upgrade requires a few migration steps. A step-by-step migration guide is available to help you upgrade your Unity-based apps!

The update is free for all existing Unity Plugin users and available for Android and iOS apps.

To see the full release notes and set up guide check out the Unity documentation section.

Download Unity SDK

 


Using Unity to boost your AR apps

There’s a reason mobile developers love to integrate Unity in their apps: it’s portable, agile, and in some cases free.

While known mostly as a gaming engine, Unity is also frequently employed by developers looking to model virtual, and now augmented, three-dimensional spaces. The finer visual details, like texture or reflection mapping on 3D virtual objects, make Unity-based apps some of the most immersive experiences available.

For the millions of developers using Unity, Wikitude’s Unity plugin is exciting – because it allows them to combine detailed, textured virtual worlds with real ones. The plugin integrates Wikitude’s computer vision engine into anything you build with Unity, which allows you to augment interactive 3D content in real time on your mobile device.

Unity is all about supporting the growing Virtual and Augmented Reality markets! The company has invested in a number of specific features out of the box, such as head tracking and an appropriate field of view, to ease the development of VR/AR apps. 


“With the increasing demand for augmented reality technology among our Unity app developer community, we are excited to see Wikitude offering a free plugin to connect our technology platforms.” 

JC Cimetiere, Sr. Director Product Marketing of Unity.

If you want to see the Wikitude SDK + Unity plugin in action, you’ve got the perfect chance to do so at the Vision VR/AR Summit in Los Angeles next week, 10-11 February. Wikitude CMO Andy Gstoll will be on hand to check out the keynote presentation from Alex McDowell, designer, storyteller, and creative director behind Unity’s new 5D Global Studio.

Download Unity SDK

So what can you build with Wikitude and Unity? Find out by downloading the Wikitude SDK and Unity plugin – both available as free trials. The possibilities are endless and we can’t wait to see your apps!


Introducing: Wikitude SDK 5

Update (August 2017): Object recognition, multi-target tracking and SLAM: Track the world with SDK 7

We would like to share the details of version 5 of the cross-platform Wikitude SDK with you.
We have been working on this release for some time now and consider this our most ambitious release since launching the first version of the Wikitude SDK more than three years ago.

Continue to read the details and you will easily understand why.

For our developers the Wikitude SDK 5 brings increased flexibility when it comes to choosing your development environment.

Besides the existing options to work with Wikitude’s JavaScript API and the well-established extensions for Cordova, Titanium, and Xamarin, developers are now able to embed augmented reality features using the new Wikitude Native API for Android and iOS. The Native API gives access to all computer-vision-related features, like 2D markerless image recognition and tracking and 2D cloud markerless image recognition and tracking.

Speaking of 2D image tracking, the new SDK extends – literally – its functionality here as well. Extended Image Tracking, available in both the Native and JavaScript APIs, is a new tracking mode that keeps tracking the image even when the original target can no longer be seen by the camera.

The Native API is also the base for the new Unity3D plugin for the Wikitude SDK. With the Unity3D plugin, developers are able to add 2D target images to their Unity3D-based applications.

Starting with this SDK release, developers are able to create and use custom plugins for the Wikitude SDK. Plugins under this framework can receive a shared camera frame plus additional information about recognized images – like the pose and distance. Plugins can be written in C++, Java, or ObjC and can communicate with your augmented reality experience.

Furthermore, the Wikitude SDK 5 brings full compatibility with Android Studio (an intermediate set-up guide is already available).


Extended Image Tracking

The Extended Image Tracking option is an advanced mode for 2D markerless tracking that will continue to track your target image even when it can no longer be seen by the camera. Users scan your target image as usual, but can then leave the target and continue to move around, with the entire 3D scene still being tracked.
Extended Tracking is the first release of Wikitude’s new 3D Tracking engine and supplements Wikitude’s 2D image tracking capabilities. The new mode is great for larger 3D model scenery, or for smaller image targets on larger surfaces, where the user can move around more freely.

Native API for Wikitude SDK

For all developers who want to use the Wikitude SDK at its core, Wikitude is branching off its computer vision core technology. The Native API contains the full computer vision engine of the Wikitude SDK but can be integrated using the native programming languages for Android and iOS (Java, ObjC).

 

The Native API features:

  • Plugin Framework
  • 2D Image Recognition and Tracking (Offline)
  • 2D Cloud Recognition and Tracking (Online)
  • 2D Extended Image Tracking

Unity3D Plugin for Wikitude SDK

Based on the new Native API, Wikitude offers a plugin for Unity3D so you can integrate Wikitude’s computer vision engine into a game or application fully based on Unity3D. This means you can work with target images and image recognition in your Unity3D app and benefit from the full feature set of the Unity3D development environment. Combining the power of Wikitude’s SDK with the advanced capabilities of Unity3D makes this an unbeatable duo.

Plugin Framework

The new Plugin Framework allows you to extend the Wikitude SDK with third-party functionality. Plugins for the Wikitude SDK have access to the camera frames and to information about recognized images (pose, distance). This is perfect for additional functionality that also requires camera images. Plugins are written in C++, Java, or ObjC and can communicate with both the JavaScript API and the Native API.

The SDK includes two samples for plugins:

  • Barcode and QR Scanner
  • Face Detection

Full Android Studio compatibility

Android Studio is becoming more and more the preferred IDE for developing Android apps.
While Wikitude SDK 4.1 could already run in Android Studio, Wikitude SDK 5.0 has now been optimized to work nicely with it.

  • Updated library format .aar
  • Sample App for Android Studio

Availability

Update (August 2017): The SDK 7.0 official release is now available for download. Customers with a valid Wikitude subscription license will receive future updates for free upon release.

Oh, there is one more thing…3D Tracking!

Wikitude will publicly release SLAM-based 3D tracking capabilities soon! Please check wikitude.com/SLAM for details. Here is a quick video demo to give you a glimpse of what’s coming.