
3D Tracking for large-scale scenes

Update (August 2017): Object recognition, multi-target tracking and SLAM: Track the world with SDK 7

As part of our mission to keep augmenting the world around us, our tech team has been extending the capabilities of our 3D tracking. We started simple with the first public beta version of this feature a few months back, recognizing and tracking small-scale spaces. As things progress very quickly, Wikitude’s computer vision team is taking a step forward to track and map bigger environments, or “large-scale” scenes as we like to call them.

In this post we will provide insight into Wikitude’s 3D tracking technology, share our next moves on the 3D tracking road, and provide a hands-on demo of our large-scale feature, the WikiWings app for Android.

The Wikitude cross-platform 3D Tracking technology

Tracking objects and environments in 3D is a complex task, particularly when it is done without depth sensors, using just the single camera found in the majority of smartphones today. We humans are actually at an advantage, as we are equipped with two eyes (cameras) for sensing depth and understanding the third dimension.

The Wikitude team has been working tirelessly over the past three years with the aim of creating a common base for recognizing and tracking three-dimensional objects, structures, and spaces. The market requirements range from being able to recognize objects a few centimeters in size up to positioning a device in a sequence of rooms and corridors stretching several hundred meters. There is no single computer vision algorithm out there that can support this broad set of requirements and use cases – yet. As the pioneer in the mobile augmented reality industry with a razor-sharp focus on technology, we will continue to address the demand for varying “flavours” of 3D recognition and tracking. The result of this approach is a strong common core for 3D tracking, which serves as the common base for a number of use cases.

The Wikitude 3D Tracking engine

In the past weeks, another building block of our 3D tracking has evolved, and today we are excited to share with you the second part of our journey to map and track the world around us.

From “small-scale” to “large-scale”

Wikitude’s first 3D tracking beta, released a couple of months ago, was the initial public release for mapping and tracking objects and environments on a small scale, such as an office desk, as previously described in our blog. We have gathered feedback from our developer community in the past weeks and worked on an updated version of our small-scale 3D tracking. Interested developers can request our updated (December 1st, 2015) version of the beta on our 3D tracking feature page.

As a next step, we are preparing our next public releases for tracking and mapping larger indoor spaces to navigate users, display augmentations in rooms or show points of interest inside buildings.

To demonstrate the basics of our large-scale 3D tracking feature, our development team took the Wikitude office as an initial test environment. The video below is a first hands-on example of the current capabilities of our SLAM-based 3D tracking applied to indoor navigation and localization.

The first step of the above demo is to identify objects and physical structures in the room that provide key feature points to be tracked. As the user moves around, feature points are captured and become the basis for the map forming on the device (see the box in the bottom right corner of the device screen in the video above). Once the algorithm has tracked key features of our office, it’s time to augment the scene! In the technical demo, we demonstrate a simple augmentation of an animated 3D model using our Native iOS SDK.

Large scale in action: use cases with the Wikitude 3D tracking

The demand for technology that “understands” and enhances the larger spaces around us, inside shopping malls, public buildings, airports, train stations and more, is tremendous. We often wonder about the shortest way to a departure gate in an airport or train station, where to get the best deals in a large shopping mall, how to find the nearest Starbucks, or how to locate a piece of machinery on a complex industrial site; even indoor gaming is frequently requested. Here are some of the many use cases where Wikitude 3D tracking can be applied.

Gaming

One of the coolest applications of our large-scale 3D tracking is the ability to make rooms highly interactive. This feature allows users to hunt flying dragons in their living rooms, fight creatures in their kitchens or follow an alien through a shopping mall. Any room can become the scene of your game!

 



Architecture

What if we could see a design idea for a building structure in real time? Or plan industrial spaces before a single brick is moved? Architects can use Wikitude’s large-scale 3D tracking to display their plans on their clients’ tablets, helping them easily visualise what things will look like upon completion of the project.

 



Indoor navigation (proof of concept)

Tracking and mapping indoor spaces enables powerful indoor navigation. Locating deals in the maze of a big shopping center or leading passengers to their boarding gates inside an airport are only the beginning.

 

 

Check out WikiWings, Wikitude’s large-scale demo app

The Wikitude large-scale capabilities will be available in our SDK soon; in the meantime, you can get an early taste by downloading the WikiWings demo app for Android.


(We have already shot quite a number of dragons at our office ;)

Update (August 2017): Start developing with Wikitude SDK 7

Getting started with SDK 7 has never been easier! Here’s how:

Help us spread the news on Twitter, Facebook and Linkedin using the hashtag #Wikitude.


Wikitude 3D Tracking (Instant Tracking)

2020 Update: Wikitude SDK now supports 3D tracking, Object tracking and 3D Model as input method (CAD). Learn more.

With this post, we are opening a new chapter on Wikitude’s journey towards augmenting the world! We are happy to share today the first version of the all new Wikitude 3D tracking technology. For our team, augmenting rooms, spaces, and objects around us is a natural progression after mastering augmentations on 2D surfaces. Clearly, tracking in 3D is a much more complex task, as algorithms must be optimized for a variety of use cases and different conditions. With this release of our 3D tracking technology, developers will be able to map areas and objects of a rather small scale and place 3D content into the scene.

This is the first step of a sequence of releases Wikitude will roll out as our SLAM based 3D recognition and tracking technology evolves. The 3D tracking (instant tracking) feature is now available as a free trial and packaged in our SDK PRO products. This feature is currently available for the SDK 5 Native APIs only.

How does Wikitude 3D tracking work?

The Wikitude SDK tracks 3D scenes by identifying feature points of objects and environments. Once it has identified a feature-rich environment, the SDK maps the scene, displaying a point cloud over the detected feature points.
As an example of how Wikitude 3D tracking works in a small scene, we will use the scenario of an office table. The richer the scene is in feature points, the better the mapping and tracking will be.
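To illustrate why feature-rich scenes map and track better, here is a deliberately tiny Python sketch. It is an illustration only, not Wikitude’s actual algorithm: it scores two small grayscale patches by counting pixels with a strong local intensity gradient, a rough stand-in for how detectable feature points a scene offers.

```python
# Toy illustration (NOT Wikitude's actual algorithm): score a grayscale
# patch by counting pixels whose local intensity gradient is strong.
# Feature-rich (textured) scenes yield many such points; flat surfaces few.

def feature_point_count(image, threshold=50):
    """Count pixels with a strong forward-difference intensity gradient."""
    h, w = len(image), len(image[0])
    count = 0
    for y in range(h - 1):
        for x in range(w - 1):
            gx = image[y][x + 1] - image[y][x]  # horizontal gradient
            gy = image[y + 1][x] - image[y][x]  # vertical gradient
            if gx * gx + gy * gy > threshold ** 2:
                count += 1
    return count

# A flat, textureless patch (like a plain white wall)...
flat = [[200] * 8 for _ in range(8)]
# ...versus a high-contrast checkerboard texture.
textured = [[0 if (x + y) % 2 else 255 for x in range(8)] for y in range(8)]

print(feature_point_count(flat))      # no strong gradients on a flat wall
print(feature_point_count(textured))  # many candidate feature points
```

A plain wall produces no candidate points at all, which is exactly why low-feature environments are the hard case for any visual tracker.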


In order to track and map the scene the following steps should be taken:

  1. Launch the Wikitude sample app, which is included in the Native SDK (iOS and Android) download package
  2. Record a tracking map by slowly moving the device from one side to the other of the scene, covering the whole area
  3. 3D point clouds will appear on the screen capturing key feature points of the scenario
  4. Save the tracking map
  5. Load the map in your augmented reality experience to relocalize the scene and visualize the augmentation in real time.
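The steps above can be sketched in code. Note that the class and method names below (`TrackingMapRecorder`, `record_frame`, `save`, `load`) are hypothetical stand-ins for illustration, not the actual Wikitude Native SDK API; the real calls are covered in the SDK documentation.

```python
# Hypothetical sketch of the record/save/load workflow described above.
# TrackingMapRecorder and its methods are illustrative stand-ins, NOT the
# actual Wikitude Native SDK API.

class TrackingMapRecorder:
    def __init__(self):
        self.points = []  # 3D point cloud accumulated while recording

    def record_frame(self, feature_points):
        """Steps 2-3: sweep the device across the scene, capturing key points."""
        self.points.extend(feature_points)

    def save(self, path):
        """Step 4: persist the tracking map to disk."""
        with open(path, "w") as f:
            for x, y, z in self.points:
                f.write(f"{x} {y} {z}\n")

    @staticmethod
    def load(path):
        """Step 5: load the map so the scene can be relocalized later."""
        with open(path) as f:
            return [tuple(float(v) for v in line.split()) for line in f]

recorder = TrackingMapRecorder()
recorder.record_frame([(0.1, 0.2, 1.5), (0.4, 0.1, 1.2)])  # left of the desk
recorder.record_frame([(0.9, 0.3, 1.4)])                   # right of the desk
recorder.save("office_desk.map")

tracking_map = TrackingMapRecorder.load("office_desk.map")
print(len(tracking_map))  # all captured points restored from the saved map
```

The key design point is the split between a one-time recording session and any number of later sessions that simply load the saved map to relocalize the scene.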


The video below demonstrates the steps described above.

https://youtu.be/GWs2KK-Pv0Q

Important note: It is not possible to use both 2D and 3D tracking within one experience. If you use 3D tracking, recognition and tracking of 2D markers will not be available.

Developers can now try Wikitude 3D tracking together with the SDK 5 free trial. A trial key generated on October 15th, 2015 or later is required; license keys issued before that date will show an “Unlicensed feature” warning when you try to use 3D tracking. If you generated a license key earlier than October 15th, just revisit the license page and a new key will be generated automatically.

The Wikitude 3D Tracking is included in the Native SDK download packages for iOS and Android. In our documentation section you can find all details of this new cross-platform technology and follow the step-by-step set up guide to get started (Android, iOS). We can’t wait to get your feedback and are happy to answer questions you have at sales@wikitude.com or in our developer forum.

And before we close this post, here is a sneak peek of what is coming next! Subscribe to our newsletter and stay tuned to our developments on our dedicated SLAM page.

Update 2017: 3D Instant Tracking now available for Multiple Platforms and Development Frameworks

The Wikitude SDK is available for both Android and iOS devices, as well as a number of leading augmented reality smart glasses.
Developers can choose from a wide selection of AR development frameworks, including Native API, JavaScript API or any of the supported extensions and plugins available.

Among the extensions based on Wikitude’s JavaScript API are Cordova (PhoneGap), Xamarin and Titanium. These extensions include the same features available in the JS API, such as location-based AR, image recognition and tracking and SLAM.
Unity is the sole plugin based on the Wikitude Native SDK and includes image recognition and tracking, SLAM, as well as a plugin API, which allows you to connect the Wikitude SDK to third-party libraries.


We humans see the world in 3D – now Wikitude does, too.

This week is very exciting as the Augmented World Expo will open its doors to “everything AR”. The event in Santa Clara, California, is definitely the most important gathering of the year for us, which is why we have chosen to share more details of our SDK product roadmap with you today.

3D computer vision coming in September

In addition to the feature-packed Wikitude SDK 5.0 announced a couple of weeks ago, we are announcing today that we will add 3D recognition and tracking capabilities to our SDK package in September. To make the waiting a bit more pleasant, we have produced this video to show you our focus on rooms, spaces and environments, both indoors and outdoors.


Rooms, Spaces, Environments – Indoor and Outdoor

As stated in my interview with Tom Emrich last month, it is a natural progression for Wikitude to move from 2D image recognition to 3D recognition and tracking of the real world. Our 2D capabilities are very mature and have been perfected to a high degree of performance and reliability, much to the appreciation of our customers building solutions based on augmenting 2D surfaces, such as in the advertising and printing industries. But the world is more than 2D surfaces; it consists of 3D spaces, rooms and environments, literally everywhere we look. As the video shows, our technology recognizes, maps and “understands” rooms to, for example, place digital furniture in them or navigate through an unknown building. It allows you to create immersive entertainment in outdoor spaces, which has the potential to reinvent gameplay and the production of mixed reality film. Beyond the consumption of digital content in the real world, Wikitude’s 3D tracking will allow you to create your very own digitally augmented spaces. You can come up with your wildest ideas on how to design the world around you, whether it is visualizing your private house on a plot of land or perhaps a new factory, to give an example of a more enterprise-oriented application.

Our focus on technology and technology only

Computer vision is Wikitude’s DNA. Our dedicated team works day and night to build powerful features for our augmented reality SDK. We do this for developers, visionaries and innovation managers with a dream, project or concept to build valuable apps for consumers, enterprises or both. Our mission is to build technology enabling YOU to “augment the world” so we can all see more than we normally could.

Meet us this week

Are you near San Francisco this week? If you want to learn more about Wikitude’s technology and speak to our team directly, come join us for these two events:

Augmented World Expo from 8-10 June in Santa Clara, our booth number is #70, right at the entrance of the exhibition area.

Sign up to “Learn about Wikitude’s AR SDK and tools” at this meetup in San Francisco on June 10th at 6:30pm.

I am looking forward to seeing you all in California this week, hope you can make it!

Andy Gstoll – Wikitude CMO


Shaping the future of technology with SLAM (simultaneous localization and mapping)

Update (August 2017): Object recognition, multi-target tracking and SLAM: Track the world with SDK 7

Wikitude’s SLAM (simultaneous localization and mapping) SDK is ready for download!

The world is not flat, and as our technology begins to spill out from behind the screen into the physical world, it is increasingly important that it interacts with that world in three dimensions. To do this, we are waking up our technology, equipping it with sensors that give it the ability to feel out its surroundings. But seeing the world as we do is only half the solution.

One secret ingredient driving the future of a 3D technological world is a computational problem called SLAM.

What is SLAM?

SLAM, or simultaneous localization and mapping, is a series of complex computations and algorithms that use sensor data to construct a map of an unknown environment while simultaneously using that map to determine the device’s own location. This is, of course, a chicken-and-egg problem: for SLAM to work, the technology must build a map of its surroundings while orienting itself within that very map, refining both as it goes. The concept of SLAM has been around since the late 80s, but we are just starting to see some of the ways this powerful mapping process will enable the future of technology. SLAM is a key driver behind unmanned vehicles and drones, self-driving cars, robotics, and augmented reality applications.
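The chicken-and-egg nature of SLAM can be shown with a deliberately tiny 1D sketch, an illustration only, far removed from any production algorithm: the robot places landmarks on its map using its own pose estimate, and re-observing a mapped landmark in turn corrects that pose estimate.

```python
# Tiny 1D illustration of the SLAM chicken-and-egg loop (NOT a production
# algorithm): the pose estimate places landmarks on the map, and re-observed
# landmarks correct the pose estimate.

def run_slam(odometry, measurements):
    """odometry[t]: reported movement at step t.
    measurements[t]: list of (landmark_id, measured_distance) pairs."""
    pose = 0.0
    landmarks = {}  # landmark_id -> estimated position on the 1D line
    for move, observations in zip(odometry, measurements):
        pose += move  # predict pose from (possibly drifting) odometry
        for lid, dist in observations:
            if lid not in landmarks:
                # mapping: place an unseen landmark using the current pose
                landmarks[lid] = pose + dist
            else:
                # localization: a known landmark pulls the pose back
                pose = 0.5 * pose + 0.5 * (landmarks[lid] - dist)
    return pose, landmarks

# Robot starts at 0 and truly moves +1.0 per step, but odometry drifts.
# It ranges landmark "A" (truly at 5.0) at every step.
odometry = [1.0, 1.2, 0.9]                      # drifting motion reports
measurements = [[("A", 4.0)], [("A", 3.0)], [("A", 2.0)]]
pose, landmarks = run_slam(odometry, measurements)
print(round(pose, 2), landmarks)  # pose corrected back despite the drift
```

Even with drifting odometry, the repeated landmark observations pull the pose estimate back to the true position of 3.0, which is the essence of the loop real SLAM systems run in three dimensions with thousands of feature points.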

“As we are only at the very beginning of augmenting the physical world around us, visual SLAM is currently very well suited for tracking in unknown environments, rooms and spaces,” explains Andy Gstoll, CMO at Wikitude. “The technology continuously scans and “learns” about the environment it is in allowing you to augment it with useful and value-adding digital content depending on your location within this space.”

Prototype of Google’s self-driving car. Source: Google

SLAM use-cases

Google’s self-driving car is a good example of technology making use of SLAM. A project under Google X, Google’s experimental “moonshot” division, the driverless car, void of a steering wheel or pedals, uses high-definition, inch-precision mapping to navigate. More specifically, it relies on a range finder mounted on the top of the car, which emits a laser beam to create detailed 3D maps of its environment. It then combines these maps with existing maps of the world to drive itself autonomously.

From the roads to the skies, drones are also using SLAM to make sense of the world around them in order to add value. A great example of this is Skycall, a concept from the MIT research group Senseable City Laboratory. Skycall employs a drone to help students navigate around the MIT campus. The concept sees a student call a drone using a smartphone application and tell it where they want to go on campus. The drone then asks the student to “follow me” and guides them to the location. Using a combination of auto-pilot, GPS, sonar sensing and Wi-Fi connectivity, the drone is able to sense its environment in order to guide users along pre-defined paths or to specific destinations requested by the user.

Track the World with Wikitude Augmented Reality SDK

Here at Wikitude, we are using SLAM in our product lineup to further the capabilities of augmented reality in the real world, which will open up new possibilities for the use of AR in large-scale and outdoor environments. From architectural projects to Hollywood film productions, SLAM technology will enable a variety of industries to position complex 3D models in tracked scenes, ensuring their complete visualisation and optimal positioning in the environment. In the demo below, we show you how SLAM was used to help us augment a 3D model of a church steeple that had been destroyed in World War II, allowing users to see what it looked like before the war.

“As the leader in augmented reality technology, it is a natural process for us to expand from 2D to 3D AR solutions as our mission is to augment the world, not only magazines and billboards. Visual SLAM and our very unique approach, you may call it our “secret sauce”, allow us to provide the state of the art AR technology our rapidly growing developer customer base demands,” says Gstoll.

The use of SLAM in our augmented reality product line will be the focus of our talk and demo at this year’s Augmented World Expo which takes place in Silicon Valley June 8-10. We encourage you to come by our booth to see this technology in action!

Start developing with an award-winning AR SDK

Getting started with SDK 7 has never been easier! Here’s how:


– – – –

This post was written by Tom Emrich, co-producer of the sixth annual Augmented World Expo (AWE). AWE takes place June 8-10 at the Santa Clara Convention Center in California. The largest event of its kind, AWE brings together over 3,000 professionals and 200 participating companies to showcase augmented and virtual reality, wearable technology and the Internet of Things.



Wikitude’s 3D recognition in a large scale outdoor environment

As announced at CES in Las Vegas earlier this year, Wikitude has added Simultaneous Localization And Mapping (SLAM) technology to our product lineup. This is a tremendous step forward, both for our technology proposition and for the more creative and powerful use cases it makes available to you.

Today we would like to share a demo video showing an “outside and wide area” scenario. Here, we are augmenting a 3D model of a church steeple as it appeared before World War II. After the steeples were destroyed during the war, they were never rebuilt to their original design due to a lack of funds for restoration. Wikitude’s 3D recognition and tracking makes it possible to place the digital steeple exactly where it originally stood, providing an experience as close as possible to what it was like before the war.


Please excuse the reflections you see on the tablet’s screen in the video, we did not want to modify or enhance the video in post production as this is meant to be an authentic technology demo without any cuts of the footage.

We hope by sharing this video with you, we will inspire you for your own future AR projects and apps. If you are interested in receiving updates on our 3D recognition and tracking, stay up to date with our latest developments by subscribing to our newsletter.


Wikitude reveals SLAM technology at CES in Las Vegas

After a long period of confidentiality, Wikitude is proud to reveal a long held secret and is making a SLAM dunk at CES 2015 in Las Vegas, literally.

Our skilled team of software engineers have finally given us the GO to spread the word about our Simultaneous Localization And Mapping (SLAM) technology. If you’re not already familiar with it, SLAM essentially does two things at the same time. On one hand, it scans a 3D scene or any real-life environment, allowing the device capturing the data to localize itself. On the other, it simultaneously maps this environment, allowing digital content to be augmented into the scene.

As demonstrated in the video below, the core Wikitude engine is now capable of augmenting a 3D model in real time while simultaneously keeping track of its position in relation to the surroundings. The algorithm scans and “understands” the basketball court scene in order to augment the 3D model of the red Lamborghini next to the physical Mercedes. The second augmentation is the scoreboard on the plain white wall above the court. The scoreboard remains mounted to the wall in a stable position even in a very “low-feature environment”, that is, even when only very few details of the basketball players and basket are within the device’s field of vision.

[responsive_vid]

As you can imagine, the possibilities for SLAM use cases are endless in both the enterprise and consumer spaces. We’re truly excited to finally announce and demonstrate our application of this technology to you today. We’ll continue advancing this technology, and our R&D team is working round the clock to make 3D object and environment recognition part of our SDK. We expect to incorporate SLAM technology and make it available as a product soon.