Documentation

Object Tracking

Introduction to Object Tracking

Object Recognition and Tracking extends the capabilities of the Wikitude SDK to recognize and track arbitrary objects for augmented reality experiences. The feature is based on Wikitude's SLAM engine, which is also used for Instant Tracking. Object Tracking lets you detect and track objects that you pre-defined. Suitable objects include

  • Toys
  • Monuments and statues
  • Industrial objects
  • Tools
  • Household supplies

Objects are recognized most reliably if they do not consist of many moving or dynamic parts.

Simple Object Tracking

The Simple Object Tracking sample is meant to give you a rough idea of how object tracking works with the Wikitude Native SDK. We track a toy fire truck and add a stroked cube as well as an occluder cube to demonstrate its use.

Object Tracking means that instead of tracking an image we track three-dimensional objects, using information that has been extracted from a video of that object. This is why, instead of a Wikitude Target Collection (.wtc), we use a .wto file (Wikitude Object Target Collection) in this sample.

To use object tracking with the Wikitude Native SDK, we have to make our UIViewController conform to the WTObjectTrackerDelegate protocol, and we need a WTObjectTracker member. We also add a StrokedCube and an OccluderCube as members.

@interface WTSimpleObjectTrackerViewController () <WTWikitudeNativeSDKDelegate, WTExternalOpenGLESRenderingProtocol, WTObjectTrackerDelegate>
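
/* A minimal sketch of the members used throughout this sample. The property
   names are taken from the code below; the exact types of the rendering
   helper classes (StrokedCube, OccluderCube) are assumptions based on how
   they are used here. */
@property (nonatomic, assign) BOOL                        isTracking;       // backs the _isTracking ivar used below

@property (nonatomic, strong) WTWikitudeNativeSDK        *wikitudeSDK;
@property (nonatomic, strong) WTTargetCollectionResource *targetCollectionResource;
@property (nonatomic, strong) WTObjectTracker            *objectTracker;

@property (nonatomic, strong) StrokedCube                *renderableCube;   // orange stroked cube drawn around the truck
@property (nonatomic, strong) OccluderCube               *occluderCube;     // invisible cube that hides geometry behind it

@property (nonatomic, strong) WTTargetImageView          *targetImageView;  // hint image of the fire truck

@end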

In -viewDidLoad, default-initialize the StrokedCube and the OccluderCube, and set up a WTTargetImageView with the image of the fire truck.
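
A minimal sketch of that setup, assuming the helper classes provide plain -init initializers; the WTTargetImageView initializer shown here is an assumption and may differ in your project:

- (void)viewDidLoad
{
    [super viewDidLoad];

    /* ... Wikitude Native SDK and OpenGL ES rendering setup as in the other samples ... */

    // Default-initialize the rendering helpers that are drawn in the render block
    self.renderableCube = [[StrokedCube alloc] init];
    self.occluderCube = [[OccluderCube alloc] init];

    // Show the fire truck image as a hint; it is hidden again in
    // -objectTracker:didRecognizeObject: (the initializer is an assumption)
    self.targetImageView = [[WTTargetImageView alloc] initWithFrame:self.view.bounds];
    [self.view addSubview:self.targetImageView];
}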

When we start the Wikitude Native SDK in -viewDidAppear:, we create a WTTargetCollectionResource from firetruck.wto and hand it over to the WTTrackerManager's -createObjectTrackerFromTargetCollectionResource:delegate: method to create our object tracker.

__weak typeof(self) weakSelf = self;

NSURL *objectTargetCollectionResourceURL = [[NSBundle mainBundle] URLForResource:@"firetruck" withExtension:@"wto" subdirectory:@"Assets"];
self.targetCollectionResource = [self.wikitudeSDK.trackerManager createTargetCollectionResourceFromURL:objectTargetCollectionResourceURL completion:^(BOOL success, NSError * _Nullable error) {
    if ( !success )
    {
        NSLog(@"Failed to load URL resource. Reason: %@", [error localizedDescription]);
    }
    else
    {
        /* Create the object tracker from the loaded .wto resource; this view controller acts as its delegate */
        weakSelf.objectTracker = [weakSelf.wikitudeSDK.trackerManager createObjectTrackerFromTargetCollectionResource:weakSelf.targetCollectionResource delegate:weakSelf];
    }
}];

In the -renderBlock method we add self.occluderCube and self.renderableCube to the render call so that they are drawn while the application is running.

[self.occluderCube drawInContext:[self.renderer internalContext]];
[self.renderableCube drawInContext:[self.renderer internalContext]];
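
A minimal sketch of how these calls might sit inside the render block; the guard on _isTracking is an assumption so that the cubes are only drawn while an object is being tracked, and drawing the occluder first lets it fill the depth buffer before the stroked cube is rendered:

if ( _isTracking )
{
    // Draw the occluder first so that fragments of the stroked cube lying
    // behind it are discarded, hiding the cube's far sides.
    [self.occluderCube drawInContext:[self.renderer internalContext]];
    [self.renderableCube drawInContext:[self.renderer internalContext]];
}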

The object tracker is now created, but its delegate (our UIViewController) is not functional yet, because the WTObjectTrackerDelegate callback methods have not been implemented. We implement them next.

-objectTracker:didRecognizeObject: gets called when an object is recognized. We then set _isTracking to YES, hide the WTTargetImageView, and log the name of the recognized object to the console.

- (void)objectTracker:(WTObjectTracker *)objectTracker didRecognizeObject:(WTObjectTarget *)recognizedObject
{
    NSLog(@"recognized object '%@'", [recognizedObject name]);
    _isTracking = YES;
    [self.targetImageView hide:YES];
}

The majority of the work is done in -objectTracker:didTrackObject:. Here we update the scale, translation, and matrices of the occluder cube and the stroked cube while the fire truck is being tracked.

- (void)objectTracker:(WTObjectTracker *)objectTracker didTrackObject:(WTObjectTarget *)trackedObject
{
    self.occluderCube.yTranslation = 0.5f;

    self.occluderCube.xScale = trackedObject.xScale;
    self.occluderCube.yScale = trackedObject.yScale;
    self.occluderCube.zScale = trackedObject.zScale;

    [self.occluderCube setProjectionMatrix:trackedObject.projection];
    [self.occluderCube setModelViewMatrix:trackedObject.modelView];

    self.renderableCube.yTranslation = 0.5f;

    self.renderableCube.xScale = trackedObject.xScale;
    self.renderableCube.yScale = trackedObject.yScale;
    self.renderableCube.zScale = trackedObject.zScale;

    [self.renderableCube setProjectionMatrix:trackedObject.projection];
    [self.renderableCube setModelViewMatrix:trackedObject.modelView];
}

The next callback method to implement is -objectTracker:didLoseObject:, which logs the name of the lost object to the console and sets _isTracking back to NO.

- (void)objectTracker:(WTObjectTracker *)objectTracker didLoseObject:(WTObjectTarget *)lostObject
{
    NSLog(@"lost object '%@'", [lostObject name]);
    _isTracking = NO;
}

Finally, similar to the other samples, we implement -objectTrackerDidLoadTargets: and -objectTracker:didFailToLoadTargets: by simply logging what happened to the console.

- (void)objectTrackerDidLoadTargets:(WTObjectTracker *)objectTracker
{
    NSLog(@"Object tracker loaded");
}

- (void)objectTracker:(WTObjectTracker *)objectTracker didFailToLoadTargets:(NSError *)error
{
    NSLog(@"Unable to load object tracker. Reason: %@", [error localizedDescription]);
}

If you add all this to your code, an orange cube will be displayed around the fire truck when you look at it. Since the occluder cube is always at the same position as the stroked cube, you cannot see the sides of the cube that are farther away from the camera, because the occluder blocks the view.

Simple Object Tracking