Documentation

Instant Tracking

The following sections detail the instant tracking feature of the Wikitude JavaScript SDK by introducing a simple example experience and building two further experiences on top of it. The initial example represents a minimal implementation, showcasing the simplicity the Wikitude JavaScript SDK provides. Its immediate successor adds 3D model augmentations and preliminary interaction, which are expanded into a fully fledged use case in the final example.

We recommend working through the examples in order, as each section builds on the previous one without repeating information.

The three example sections are preceded by a brief introduction to the instant tracking algorithm.

  1. Introduction
  2. Basic Instant Tracking
  3. 3D Model on Plane
  4. Interactivity

Introduction (1/4)

Instant tracking is an algorithm that, contrary to those previously introduced in the Wikitude SDK, does not aim to recognize a predefined target and start tracking thereafter, but instead starts tracking immediately in an arbitrary environment. This enables very specific use cases to be implemented. One such use case, a furniture product visualization app, is implemented in our final example.

The algorithm works in two distinct states, the first of which is the initialization state. In this state the user is required to define the origin of the tracking procedure by pointing the device, thereby aligning an on-screen indicator. Once the user finds the alignment satisfactory and actively confirms it, a transition to the tracking state is performed. In this state, the environment is tracked continuously, which allows augmentations to be placed within the scene.

Initialization state

Tracking state

The instant tracking algorithm requires another input value to be provided in the initialization state. Specifically, the height of the tracking device is required in order to accurately adjust the scale of augmentations within the scene. To this end, the three examples feature a range input element that allows the height to be set in meters.

Basic Instant Tracking (2/4)

The Basic Instant Tracking example provides a minimal implementation of the instant tracking algorithm. It introduces the two essential classes AR.InstantTracker and AR.InstantTrackable. If you have already worked with image tracking, this pattern should be familiar, as image tracking follows the same structure with its AR.ImageTracker and AR.ImageTrackable.

An AR.InstantTracker can, minimally, be instantiated without any parameters.

this.tracker = new AR.InstantTracker();

Optionally, it accepts an initial device height (deviceHeight) as well as a callback function to be invoked whenever a transition between states occurs (onChangedState).

this.tracker = new AR.InstantTracker({
    onChangedState:  function onChangedStateFn(state) {
    },
    deviceHeight: 1.0
});

An AR.InstantTrackable can, minimally, be instantiated with just the previously generated tracker instance, although supplying drawables to be rendered in both the initialization state and the tracking state is advisable for any practical use case. Therefore two AR.ImageDrawable instances and, correspondingly, two AR.ImageResource instances are generated and supplied as well.

var crossHairsRedImage = new AR.ImageResource("assets/crosshairs_red.png");
var crossHairsRedDrawable = new AR.ImageDrawable(crossHairsRedImage, 1.0);

var crossHairsBlueImage = new AR.ImageResource("assets/crosshairs_blue.png");
var crossHairsBlueDrawable = new AR.ImageDrawable(crossHairsBlueImage, 1.0);

this.instantTrackable = new AR.InstantTrackable(this.tracker, {
    drawables: {
        cam: crossHairsBlueDrawable,
        initialization: crossHairsRedDrawable
    }
});

The only additional change required is a means to transition from one state to the other. For this task we provide the changeTrackerState function, which we conveniently call on a button click. The AR.InstantTrackerState defines the two values used to identify each state.

changeTrackerState: function changeTrackerStateFn() {
    if (this.tracker.state === AR.InstantTrackerState.INITIALIZING) {
        this.tracker.state = AR.InstantTrackerState.TRACKING;
    } else {
        this.tracker.state = AR.InstantTrackerState.INITIALIZING;
    }
}
<input id="tracking-start-stop-button" type="image" src="assets/buttons/start.png" onclick="World.changeTrackerState()"/>

Lastly, we provide the changeTrackingHeight function to set the deviceHeight property of the AR.InstantTracker and connect it to our range input element. While this change is, strictly speaking, not required, we strongly recommend that every application supply the device height accurately, by this or some other means, so that the Wikitude SDK can provide an accurate scale.

changeTrackingHeight: function changeTrackingHeightFn(height) {
    this.tracker.deviceHeight = parseFloat(height);
}
<input id="tracking-height-slider" type="range" min="0.1" value="1.0" max="2.0" step="0.1" onchange="World.changeTrackingHeight(value)">

The example outlined in this section renders a red crosshair image augmentation as its indicator while in the initialization state and a corresponding blue crosshair image augmentation while in the tracking state. While the example is quite trivial, we believe it serves well to familiarize the reader with the core concepts of instant tracking. Furthermore, we would like to highlight the simplicity of the example application: with just under 20 lines of JavaScript code, this sample is fully functional. The alterations we introduce in the following sections are of similar simplicity.

Basic Instant Tracking initialization state

Basic Instant Tracking tracking state

3D Model on Plane (3/4)

In this section, the example application implemented previously is amended to demonstrate user interaction and the use of more sophisticated augmentations.

Firstly, we extend the application by allowing 3D model augmentations to be placed within the scene by simply clicking on the screen. Internally, the ray defined by this touch is intersected with the instant tracking plane, yielding an intersection position that can trivially be applied to the model's translate property. When such a click occurs, the onTrackingPlaneClick callback of the AR.InstantTrackable is invoked and the intersection position coordinates are supplied as separate parameters.

this.instantTrackable = new AR.InstantTrackable(this.tracker, {
    drawables: {
        cam: crossHairsBlueDrawable,
        initialization: crossHairsRedDrawable
    },
    onTrackingPlaneClick: function onTrackingPlaneClickFn(xpos, ypos) {
        World.addModel(xpos, ypos);
    }
});

The addModel function instantiates an AR.Model and sets its initial scale, translate and rotate properties. Note that the translate property is directly set to the intersection coordinates passed into the onTrackingPlaneClick callback. To add some visual variety, the rotation about the Z-axis is randomized.

addModel: function addModelFn(xpos, ypos) {
    if (World.isTracking()) {
        var model = new AR.Model("assets/models/couch.wt3", {
            scale: {
                x: 0.045,
                y: 0.045,
                z: 0.045
            },
            translate: {
                x: xpos,
                y: ypos
            },
            rotate: {
                z: Math.random() * 360.0
            }
        });

        allCurrentModels.push(model);
        this.instantTrackable.drawables.addCamDrawable(model);
    }
}

The isTracking function simply checks whether the tracker is in the tracking state to limit the user interaction thereto.

isTracking: function isTrackingFn() {
    return (this.tracker.state === AR.InstantTrackerState.TRACKING);
}

The example further includes functionality to reset the generated models, which is omitted as it does not directly pertain to instant tracking.
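Although the reset functionality is omitted from the walkthrough, it could be sketched roughly as follows. This is a hypothetical reconstruction, not the sample's actual code: allCurrentModels is the array populated by addModel, and we assume each AR.Model exposes a destroy function that releases its resources.

```javascript
// Hypothetical sketch of the omitted reset functionality.
// allCurrentModels collects every AR.Model created by addModel;
// destroy() is assumed to release the resources held by a model.
var allCurrentModels = [];

function resetModels() {
    allCurrentModels.forEach(function(model) {
        model.destroy();
    });
    allCurrentModels = [];
}
```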

The scene being empty

The scene after several click inputs

Interactivity (4/4)

Lastly, we further extend the example application to allow several different AR.Model augmentations to be placed and introduce gestures that allow alteration of previously placed augmentations.

To begin with, we add several buttons, one for each available model, as depicted by the very first pair of images at the beginning of this page. As their HTML definition is straightforward, it is omitted here; be aware of their presence, however, as they are referred to below. More importantly, we initially set up event listeners for the touchstart event on each of the buttons, which we use to set the requestedModel property. This property indicates which model to instantiate.
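For orientation, the button markup might look roughly as follows. The element ids are the ones referenced in setupEventListeners; the asset paths are purely illustrative assumptions.

```html
<input id="tracking-model-button-clock" type="image" src="assets/buttons/clock.png"/>
<input id="tracking-model-button-couch" type="image" src="assets/buttons/couch.png"/>
<input id="tracking-model-button-chair" type="image" src="assets/buttons/chair.png"/>
<input id="tracking-model-button-table" type="image" src="assets/buttons/table.png"/>
<input id="tracking-model-button-trainer" type="image" src="assets/buttons/trainer.png"/>
```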

setupEventListeners: function setupEventListenersFn() {
    document.getElementById("tracking-model-button-clock").addEventListener('touchstart', function(ev){
        World.requestedModel = 0;
    }, false);
    document.getElementById("tracking-model-button-couch").addEventListener('touchstart', function(ev){
        World.requestedModel = 1;
    }, false);
    document.getElementById("tracking-model-button-chair").addEventListener('touchstart', function(ev){
        World.requestedModel = 2;
    }, false);
    document.getElementById("tracking-model-button-table").addEventListener('touchstart', function(ev){
        World.requestedModel = 3;
    }, false);
    document.getElementById("tracking-model-button-trainer").addEventListener('touchstart', function(ev){
        World.requestedModel = 4;
    }, false);
},

In order to instantiate models, we further implement the onTrackingPlaneDragBegan, onTrackingPlaneDragChanged and onTrackingPlaneDragEnded callbacks of the AR.InstantTrackable. The began and ended callbacks are invoked when a one-finger drag is initiated or the finger is lifted, respectively; the changed callback is invoked periodically for as long as the gesture continues. As with the onTrackingPlaneClick callback, they are supplied the intersection positions of the touch ray and the instant tracking plane.

this.instantTrackable = new AR.InstantTrackable(this.tracker, {
    drawables: {
        cam: crossHairsBlueDrawable,
        initialization: crossHairsRedDrawable
    },
    onTrackingPlaneDragBegan: function onTrackingPlaneDragBeganFn(xPos, yPos) {
        World.updatePlaneDrag(xPos, yPos);
    },
    onTrackingPlaneDragChanged: function onTrackingPlaneDragChangedFn(xPos, yPos) {
        World.updatePlaneDrag(xPos, yPos);
    },
    onTrackingPlaneDragEnded: function onTrackingPlaneDragEndedFn(xPos, yPos) {
        World.updatePlaneDrag(xPos, yPos);
        World.initialDrag = false;
    }
});

We simply forward the intersection positions to the updatePlaneDrag function, which checks the requestedModel property that may have been set previously by the touchstart handler of one of our buttons. If it has been set, the model with that index is created at the supplied intersection position using the addModel function. Subsequent onTrackingPlaneDragChanged calls update the position, allowing our AR.Model instances to be created by dragging them from the buttons created previously.
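The sample's actual updatePlaneDrag implementation is not reproduced here, but its logic can be sketched as a self-contained function roughly like the following. The World object and its addModel are simplified stand-ins for the sample's real code, and the convention that a requestedModel value of -1 denotes "no button touched" is an assumption of this sketch.

```javascript
// Simplified, self-contained sketch of the updatePlaneDrag logic;
// the actual sample's implementation may differ.
var World = {
    requestedModel: -1,   // -1 means no model button has been touched
    initialDrag: false,   // true while a newly created model is dragged
    lastAddedModel: null,

    updatePlaneDrag: function updatePlaneDragFn(xPos, yPos) {
        if (this.requestedModel >= 0) {
            // a button was touched: create the requested model at the
            // current plane intersection and start dragging it
            this.addModel(this.requestedModel, xPos, yPos);
            this.requestedModel = -1;
            this.initialDrag = true;
        } else if (this.initialDrag && this.lastAddedModel !== null) {
            // subsequent drag updates move the newly created model
            this.lastAddedModel.translate = { x: xPos, y: yPos };
        }
    },

    // stand-in for the AR.Model creation shown below
    addModel: function addModelFn(pathIndex, xpos, ypos) {
        this.lastAddedModel = {
            pathIndex: pathIndex,
            translate: { x: xpos, y: ypos }
        };
    }
};
```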

addModel: function addModelFn(pathIndex, xpos, ypos) {
    if (World.isTracking()) {
        var modelIndex = rotationValues.length;
        World.addModelValues();

        var model = new AR.Model(World.modelPaths[pathIndex], {
            scale: {
                x: defaultScaleValue,
                y: defaultScaleValue,
                z: defaultScaleValue
            },
            translate: {
                x: xpos,
                y: ypos
            },
            onDragChanged: function(relativeX, relativeY, intersectionX, intersectionY) {
                this.translate = {x:intersectionX, y:intersectionY};
            },
            onRotationChanged: function(angleInDegrees) {
                this.rotate.z = rotationValues[modelIndex] - angleInDegrees;
            },
            onRotationEnded: function(angleInDegrees) {
                rotationValues[modelIndex] = this.rotate.z;
            },
            onScaleChanged: function(scale) {
                var scaleValue = scaleValues[modelIndex] * scale;
                this.scale = {x: scaleValue, y: scaleValue, z: scaleValue};
            },
            onScaleEnded: function(scale) {
                scaleValues[modelIndex] = this.scale.x;
            }
        });

        allCurrentModels.push(model);
        lastAddedModel = model;
        this.instantTrackable.drawables.addCamDrawable(model);
    }
}

The addModel function is very similar to the one presented in the previous example, although some additions have been made. Specifically, gesture callbacks have been added to the AR.Model. These callbacks follow the invocation pattern of the onClick callback: if the AR.Drawable hit by the touch implements the corresponding gesture callback, it is invoked; if not, the trackable is considered next, and eventually the AR.Context object.

For AR.Drawables belonging to an AR.InstantTrackable, the drag gesture callbacks receive the tracking plane intersection coordinates in addition to the relative coordinates. This provides a means of translating objects in the instant tracking scene by simply setting the AR.Drawable's translate property to the intersection coordinates received. We recommend setting entire transformation properties (rotate, scale, translate) rather than their individual components, to minimize the number of callbacks from JavaScript to the native OS environment. The rotation and scale gesture callbacks operate as demonstrated by the dedicated gesture sample, to which the interested reader is kindly referred.
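To make the recommendation concrete, the following self-contained stub counts assignments to a translate property. The counting setter is merely a stand-in for the SDK's JavaScript-to-native bridge and is not part of the Wikitude API.

```javascript
// Stand-in drawable whose translate setter counts assignments,
// mimicking the assumed cost of a JavaScript-to-native call.
function makeDrawable() {
    var calls = 0;
    var translate = { x: 0, y: 0 };
    var drawable = {};
    Object.defineProperty(drawable, "translate", {
        get: function() { return translate; },
        set: function(value) { translate = value; calls++; }
    });
    Object.defineProperty(drawable, "nativeCalls", {
        get: function() { return calls; }
    });
    return drawable;
}

var drawable = makeDrawable();

// recommended: one whole-property assignment, one counted call
drawable.translate = { x: 1.0, y: 2.0 };

// discouraged: per-component writes reach the object through the
// getter and never trigger the setter in this stub; in the SDK, each
// such component update is assumed to add bridge traffic of its own
drawable.translate.x = 3.0;
drawable.translate.y = 4.0;
```

Fewer property assignments mean fewer crossings of the JavaScript-to-native boundary, which is the motivation behind the recommendation above.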

One more intricacy to consider is disabling the drag gesture while two-finger gestures are active in order to prevent counterintuitive transformation behavior. This can be achieved by implementing the AR.context.on2FingerGestureStarted callback and setting a flag therein.

AR.context.on2FingerGestureStarted = function() {
    oneFingerGestureAllowed = false;
};

The onDragChanged callback is adapted to consider this flag and only updates the translate property when allowed to do so.

onDragChanged: function(relativeX, relativeY, intersectionX, intersectionY) {
    if (oneFingerGestureAllowed) {
        this.translate = {x:intersectionX, y:intersectionY};
    }
}

The flag is reset in the next onDragBegan callback invocation to re-enable one-finger dragging.

onDragBegan: function(x, y) {
    oneFingerGestureAllowed = true;
}

The changes outlined finally enable the initial use case of a furniture product visualization application to be implemented. Again, there are aspects of the example app that have not been covered, but they do not directly relate to the instant tracking feature and can easily be understood from the source code of the example application.

A miniature living room scene on the floor of the Wikitude offices