Documentation

Instant Tracking

The following sections detail the InstantTracking feature of the Wikitude JavaScript SDK by introducing a simple example experience and building two succeeding experiences based on it. The initial example represents a minimal implementation, showcasing the simplicity the Wikitude JavaScript SDK provides. Its immediate successor adds 3D model augmentations and preliminary interaction, which will be expanded into a fully fledged use case in the final example.

We recommend working through the examples in order, as the individual sections aim to avoid redundancy.

SMART - Seamless AR Tracking

SMART is a seamless API that integrates ARKit, ARCore and Wikitude’s SLAM engine in a single, cross-platform augmented reality SDK for any device. It ensures the delivery of the best possible augmented reality experience on a wide range of devices, covering 92.6% of iOS devices and about 35% of Android devices on the market.

To enable ARCore with Wikitude, follow the instructions in the Enable ARCore section of the ARCore documentation. This setup can also be seen in the examples app.

SMART is enabled by default but can be disabled by setting the smartEnabled option when creating an AR.InstantTracker. This behaviour cannot be changed at runtime.

new AR.InstantTracker({
    smartEnabled: false
});

To check whether the device supports platform assisted tracking with SMART (tracking with ARCore), AR.hardware.smart.isPlatformAssistedTrackingSupported must be used. ARCore has a mechanism to download and install the required ARCore companion app automatically. Initiating this procedure with the Wikitude JavaScript SDK is a two-step process: calling AR.hardware.smart.isPlatformAssistedTrackingSupported and, if applicable, creating an AR.InstantTracker with the SMART flag set to true. The first call starts a continuous query to determine whether the current device supports ARCore. It repeatedly reports the current status through the AR.hardware.smart.onPlatformAssistedTrackingAvailabilityChanged callback function. This callback provides one of the following enumeration constants as an input parameter.

AR.hardware.smart.SmartAvailability.INDETERMINATE_QUERY_FAILED
    The query failed for some reason. Try again or create the tracker to run without ARCore. The callback will not be invoked again.

AR.hardware.smart.SmartAvailability.CHECKING_QUERY_ONGOING
    The query is currently ongoing. No action is required. The callback will be invoked again.

AR.hardware.smart.SmartAvailability.UNSUPPORTED
    The device does not support ARCore. Create the tracker to run without ARCore. The callback will not be invoked again.

AR.hardware.smart.SmartAvailability.SUPPORTED_UPDATE_REQUIRED
    The device does support ARCore, but the companion app needs to be installed or updated. Create the tracker to start the installation process. The callback will be invoked again.

AR.hardware.smart.SmartAvailability.SUPPORTED
    The device does support ARCore and the current version of the companion app is already installed. Create the tracker to run with ARCore. The callback will not be invoked again.

Translated into JavaScript code, the table results in the following snippet.

AR.hardware.smart.onPlatformAssistedTrackingAvailabilityChanged = function(availability) {
    switch(availability) {
        case AR.hardware.smart.SmartAvailability.INDETERMINATE_QUERY_FAILED:
            /* query failed for some reason; try again or accept the fact. */
            World.showUserInstructions("Could not determine if platform assisted tracking is supported.<br>Running without platform assisted tracking (ARKit or ARCore).");
            World.createOverlays();
            break;
        case AR.hardware.smart.SmartAvailability.CHECKING_QUERY_ONGOING:
            /* query currently ongoing; be patient and do nothing or inform the user about the ongoing process */
            break;
        case AR.hardware.smart.SmartAvailability.UNSUPPORTED:
            /* not supported, create the scene now without platform assisted tracking enabled */
            World.showUserInstructions("Running without platform assisted tracking (ARKit or ARCore).");
            World.createOverlays();
            break;
        case AR.hardware.smart.SmartAvailability.SUPPORTED_UPDATE_REQUIRED:
        case AR.hardware.smart.SmartAvailability.SUPPORTED:
            /* supported, create the scene now with platform assisted tracking enabled
             *
             * SUPPORTED_UPDATE_REQUIRED may be followed by SUPPORTED, make sure not to
             * create the scene twice
             */
            World.platformAssistedTrackingSupported = true;
            if (!World.createOverlaysCalled) {
                World.showUserInstructions("Running with platform assisted tracking (ARKit or ARCore).<br>Move your phone around until the crosshair turns green, which is when you can start tracking.");
                World.createOverlays();
                World.createOverlaysCalled = true;
            }
            break;
    }
};

Be careful not to create the tracker twice when an installation of the ARCore companion app is required. You may receive the AR.hardware.smart.SmartAvailability.SUPPORTED_UPDATE_REQUIRED constant followed by the AR.hardware.smart.SmartAvailability.SUPPORTED constant, but will want to create the tracker only once. The snippet presented avoids creating the tracker twice by checking the createOverlaysCalled flag.
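Stripped of the SDK specifics, the guard boils down to a small pattern. The names in the sketch below are illustrative stand-ins, not part of the Wikitude API.

```javascript
// SDK-independent sketch of the "create once" guard. SUPPORTED_UPDATE_REQUIRED
// may be followed by SUPPORTED, so the callback can fire with a "create now"
// status more than once; the flag ensures a single creation.
var SmartAvailability = {
    SUPPORTED_UPDATE_REQUIRED: "supportedUpdateRequired",
    SUPPORTED: "supported"
};

var creations = 0;
var trackerCreated = false;

function createTracker() {
    // stand-in for creating the AR.InstantTracker and the scene overlays
    creations++;
}

function onAvailabilityChanged(availability) {
    if (availability === SmartAvailability.SUPPORTED_UPDATE_REQUIRED ||
        availability === SmartAvailability.SUPPORTED) {
        if (!trackerCreated) {
            trackerCreated = true;
            createTracker();
        }
    }
}

// the callback may legitimately fire twice with a "create now" status
onAvailabilityChanged(SmartAvailability.SUPPORTED_UPDATE_REQUIRED);
onAvailabilityChanged(SmartAvailability.SUPPORTED);
```

Despite the callback firing twice, createTracker runs exactly once.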

SMART provides improved tracking capabilities at the expense of control. Because of this, some Wikitude SDK features are not available when platform tracking capabilities are used by enabling SMART.

Feature                          SMART ON (platform assisted tracking supported)    SMART OFF
Improved Tracking                x
Plane Orientation                                                                   x
Camera Control                                                                      x
Save and Load Instant Targets                                                       x
Plane Detection                  x

Introduction

Instant tracking is an algorithm that, contrary to those previously introduced in the Wikitude SDK, does not aim to recognize a predefined target and start the tracking procedure thereafter, but immediately starts tracking in an arbitrary environment. This enables very specific use cases to be implemented. One such use case, a furniture product visualization app, is implemented by our final example.

The algorithm works in two distinct states; the first of which is the initialization state. In this state the user is required to define the origin of the tracking procedure by simply pointing the device, thereby aligning an indicator. Once the user finds the alignment satisfactory (which the user needs to actively confirm), a transition to the tracking state is performed. In this state, the environment is tracked continuously, which allows augmentations to be placed within the scene.

Switching from the initialization state to the tracking state may not always be immediately possible when platform assisted tracking is active; it requires a detected plane as a precondition, which may take several seconds. Should an attempt be made to switch the state before the underlying algorithms are ready, an error will be raised through the error callback.
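This precondition can be expressed as a small guard around the state switch. In the sketch below, the tracker literal is a plain stand-in for an AR.InstantTracker; canStartTracking is the SDK property discussed in the Basic Instant Tracking section.

```javascript
// Sketch: only switch to tracking once the platform algorithm reports a
// detected plane via canStartTracking. The tracker object is a stand-in.
var InstantTrackerState = { INITIALIZING: "initializing", TRACKING: "tracking" };

function tryStartTracking(tracker) {
    if (tracker.state === InstantTrackerState.INITIALIZING && tracker.canStartTracking) {
        tracker.state = InstantTrackerState.TRACKING;
        return true;
    }
    // not ready yet; with the real SDK, forcing the switch anyway would
    // instead raise an error through the error callback
    return false;
}

var tracker = { state: InstantTrackerState.INITIALIZING, canStartTracking: false };
var early = tryStartTracking(tracker);   // no plane found yet, switch refused
tracker.canStartTracking = true;
var started = tryStartTracking(tracker); // plane available, switch succeeds
```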

Initialization state

Tracking state

The instant tracking algorithm requires another input value to be provided in the initialization state. Specifically, the height of the tracking device above ground is required in order to accurately adjust the scale of augmentations within the scene. To this end, the three examples feature a range input element that allows the height to be set in meters.

When platform assisted tracking is active, the device height above ground input parameter, and consequently the corresponding range input element, only affects the scale in the initialization state; the scale for the tracking state is automatically determined by the platform tracking algorithm. Setting this value correctly is still advised, as an incorrect value will cause a discrepancy of scale when switching between states.

During the initialization, another parameter can be set which influences the alignment of the instant tracking ground plane. This ground plane is represented by the initialization indicator and can be rotated in order to start instant tracking at e.g. a wall instead of the floor. Please refer to the API reference for detailed information.
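As an illustration, the orientation can be supplied when constructing the tracker. The snippet below assumes the trackingPlaneOrientation option described in the API reference, with the angle given in degrees; please verify the exact name and semantics there.

```javascript
new AR.InstantTracker({
    // rotate the initialization ground plane; 90 degrees to target e.g. a wall
    trackingPlaneOrientation: 90.0
});
```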

When platform assisted tracking is active, the tracking plane orientation may not be altered. Attempting to do so has no side effects other than raising an error through the error callback.

Basic Instant Tracking

The Basic Instant Tracking example provides a minimal implementation of the instant tracking algorithm. It introduces the two essential classes AR.InstantTracker and AR.InstantTrackable. If you are already familiar with image tracking, this pattern should seem familiar, as image tracking uses the same structure with its AR.ImageTracker and AR.ImageTrackable.

An AR.InstantTracker can, minimally, be instantiated without any parameters.

this.tracker = new AR.InstantTracker();

It does, however, allow an initial height to be specified (deviceHeight), as well as a callback function to be invoked when a transition between states occurs (onChangedState).

this.tracker = new AR.InstantTracker({
    onChangedState:  function onChangedStateFn(state) {
    },
    deviceHeight: 1.0
});

An AR.InstantTrackable can, minimally, be instantiated with just the previously generated tracker instance, although supplying drawables to be rendered in both the initialization state and the tracking state is advisable for any practical use case. Therefore two AR.ImageDrawable instances and, correspondingly, two AR.ImageResource instances are generated and supplied as well.

var crossHairsRedImage = new AR.ImageResource("assets/crosshairs_red.png");
this.crossHairsRedDrawable = new AR.ImageDrawable(crossHairsRedImage, 1.0);

var crossHairsBlueImage = new AR.ImageResource("assets/crosshairs_blue.png");
this.crossHairsBlueDrawable = new AR.ImageDrawable(crossHairsBlueImage, 1.0);

this.instantTrackable = new AR.InstantTrackable(this.tracker, {
    drawables: {
        cam: World.crossHairsBlueDrawable,
        initialization: World.crossHairsRedDrawable
    },
});

When using SMART and platform assisted tracking is supported, it is only possible to start tracking once ARKit or ARCore has detected a plane. Because of this, it is advisable to add a third AR.ImageDrawable to indicate when it is possible to switch from initialization to tracking, which can be checked with AR.InstantTracker.canStartTracking.

var crossHairsGreenImage = new AR.ImageResource("assets/crosshairs_green.png");
this.crossHairsGreenDrawable = new AR.ImageDrawable(crossHairsGreenImage, 1.0);

setInterval(
    function() {
        if (World.tracker.canStartTracking) {
            World.instantTrackable.drawables.initialization = [World.crossHairsGreenDrawable];
        } else {
            World.instantTrackable.drawables.initialization = [World.crossHairsRedDrawable];
        }
    },
    1000
);

The only additional change required is a means to transition from one state to the other. For this task we provide the changeTrackerState function, which we conveniently call on a button click. The AR.InstantTrackerState enumeration defines the two values used to identify each state.

changeTrackerState: function changeTrackerStateFn() {
    if (this.tracker.state === AR.InstantTrackerState.INITIALIZING) {
        this.tracker.state = AR.InstantTrackerState.TRACKING;
    } else {
        this.tracker.state = AR.InstantTrackerState.INITIALIZING;
    }
}
<input id="tracking-start-stop-button" type="image" src="assets/buttons/start.png" onclick="World.changeTrackerState()"/>

Lastly, we provide the changeTrackingHeight function to set the deviceHeight property of the AR.InstantTracker and connect it to our range input element. While this change is, strictly speaking, not required, we strongly recommend that every application supply the device height accurately, by this or another method, so the Wikitude SDK can provide an accurate scale.

changeTrackingHeight: function changeTrackingHeightFn(height) {
    this.tracker.deviceHeight = parseFloat(height);
}
<input id="tracking-height-slider" type="range" min="0.1" value="1.0" max="2.0" step="0.1" onchange="World.changeTrackingHeight(value)">

The example outlined in this section renders a red crosshair image augmentation as its indicator while in the initialization state and a corresponding blue crosshair image augmentation while in the tracking state. While the example is quite trivial, we believe it serves well to familiarize the reader with the core concepts of instant tracking. Furthermore, we would like to highlight the simplicity of the example application: with just under 20 lines of JavaScript code, this sample is fully functional. The alterations we introduce in the following sections are of similar simplicity.

Basic Instant Tracking initialization state

Basic Instant Tracking tracking state

3D Model on Plane

In this section, the example application implemented previously is amended to demonstrate user interaction and the use of more sophisticated augmentations.

Firstly, we extend the application to allow 3D model augmentations to be placed within the scene by simply clicking on the screen. Internally, the ray defined by this touch is intersected with the instant tracking plane, yielding an intersection position that can trivially be applied to the model's transform properties. When this happens, the onTrackingPlaneClick callback of the AR.InstantTrackable is invoked and the intersection position coordinates are supplied as separate parameters.

this.instantTrackable = new AR.InstantTrackable(this.tracker, {
    drawables: {
        cam: crossHairsBlueDrawable,
        initialization: crossHairsRedDrawable
    },
    onTrackingPlaneClick: function onTrackingPlaneClickFn(xpos, ypos) {
        World.addModel(xpos, ypos);
    }
});

The addModel function instantiates an AR.Model and sets its initial scale, translate and rotate properties. Note that the translate property is directly set to the intersection coordinates passed into the onTrackingPlaneClick callback. To add some visual variety, the rotation about the Z-axis is randomized.

addModel: function addModelFn(xpos, ypos) {
    if (World.isTracking()) {
        var model = new AR.Model("assets/models/couch.wt3", {
            scale: {
                x: 0.045,
                y: 0.045,
                z: 0.045
            },
            translate: {
                x: xpos,
                y: ypos
            },
            rotate: {
                z: Math.random() * 360.0
            },
        })

        allCurrentModels.push(model);
        this.instantTrackable.drawables.addCamDrawable(model);
    }
}

The isTracking function simply checks whether the tracker is in the tracking state to limit the user interaction thereto.

isTracking: function isTrackingFn() {
    return (this.tracker.state === AR.InstantTrackerState.TRACKING);
}

The example further includes functionality to reset the generated models, which is omitted here as it does not directly pertain to instant tracking.
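For completeness, such a reset can be sketched as follows. The helper is not part of the SDK; the stub objects stand in for AR.Model instances tracked in the sample's allCurrentModels list, and in the real sample a removeCamDrawable call on the trackable's drawables would accompany the destroy call.

```javascript
// Hypothetical reset helper: destroy every placed model and clear the list.
function makeModelStub() {
    // stand-in for an AR.Model; destroy() frees its native resources
    return { destroyed: false, destroy: function () { this.destroyed = true; } };
}

var allCurrentModels = [makeModelStub(), makeModelStub()];
var firstModel = allCurrentModels[0];

function resetModels() {
    allCurrentModels.forEach(function (model) {
        model.destroy();
    });
    allCurrentModels = [];
}

resetModels();
```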

The scene being empty

The scene after several click inputs

Interactivity

Lastly, we further extend the example application to allow several different AR.Model augmentations to be placed and introduce gestures that allow alteration of previously placed augmentations.

To begin with, we add several buttons, one for each available model, as depicted by the very first pair of images at the beginning of this page. As their HTML definition is straightforward, it is omitted here, but be aware of their presence for future reference. More importantly, we initially set up event listeners for the touchstart event on each of the buttons, which we use to set the requestedModel property. This property indicates which model to instantiate.

setupEventListeners: function setupEventListenersFn() {
    document.getElementById("tracking-model-button-couch").addEventListener('touchstart', function( /*ev*/ ) {
        World.requestedModel = 0;
    }, false);
    document.getElementById("tracking-model-button-plant").addEventListener('touchstart', function( /*ev*/ ) {
        World.requestedModel = 1;
    }, false);
    document.getElementById("tracking-model-button-cabinet").addEventListener('touchstart', function( /*ev*/ ) {
        World.requestedModel = 2;
    }, false);
    document.getElementById("tracking-model-button-table").addEventListener('touchstart', function( /*ev*/ ) {
        World.requestedModel = 3;
    }, false);
},

In order to instantiate models, we further implement the onTrackingPlaneDragBegan, onTrackingPlaneDragChanged and onTrackingPlaneDragEnded callbacks of the AR.InstantTrackable. The began and ended callbacks are invoked when a one-finger drag is initiated or lifted respectively; the changed callback is invoked periodically as long as the gesture continues. As with the onTrackingPlaneClick callback, they are supplied the intersection positions of the touch ray and the instant tracking plane.

this.instantTrackable = new AR.InstantTrackable(this.tracker, {
    drawables: {
        cam: crossHairsBlueDrawable,
        initialization: crossHairsRedDrawable
    },
    onTrackingPlaneDragBegan: function onTrackingPlaneDragBeganFn(xPos, yPos) {
        World.updatePlaneDrag(xPos, yPos);
    },
    onTrackingPlaneDragChanged: function onTrackingPlaneDragChangedFn(xPos, yPos) {
        World.updatePlaneDrag(xPos, yPos);
    },
    onTrackingPlaneDragEnded: function onTrackingPlaneDragEndedFn(xPos, yPos) {
        World.updatePlaneDrag(xPos, yPos);
        World.initialDrag = false;
    }
});

We simply forward the intersection positions to the updatePlaneDrag function, which checks the requestedModel property that might have been set previously by the touchstart handler of one of our buttons. If so, a model of that ID is created at the supplied intersection position using the addModel function. Subsequent onTrackingPlaneDragChanged calls update the position, allowing our AR.Model instances to be created by dragging them from the buttons created previously.
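The updatePlaneDrag function itself is not shown in this excerpt; one possible shape, assumed from the described behaviour, is sketched below with a minimal stand-in for the sample's World object so the logic is self-contained.

```javascript
// Assumed sketch of updatePlaneDrag: on the first drag update after a button
// touch, requestedModel selects which model to instantiate; subsequent
// updates move the model that was just added.
var World = {
    requestedModel: -1,
    lastAddedModel: null,
    addModel: function (pathIndex, xpos, ypos) {
        // stand-in for the sample's addModel, which creates an AR.Model
        this.lastAddedModel = { pathIndex: pathIndex, translate: { x: xpos, y: ypos } };
    },
    updatePlaneDrag: function (xpos, ypos) {
        if (this.requestedModel >= 0) {
            this.addModel(this.requestedModel, xpos, ypos);
            this.requestedModel = -1;
        } else if (this.lastAddedModel !== null) {
            this.lastAddedModel.translate = { x: xpos, y: ypos };
        }
    }
};

World.requestedModel = 2;        // set by a button's touchstart handler
World.updatePlaneDrag(0.1, 0.2); // first update creates the model
World.updatePlaneDrag(0.3, 0.4); // later updates drag it along the plane
```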

addModel: function addModelFn(pathIndex, xpos, ypos) {
    if (World.isTracking()) {
        var modelIndex = rotationValues.length;
        World.addModelValues();

        var model = new AR.Model(World.modelPaths[pathIndex], {
            scale: {
                x: defaultScaleValue,
                y: defaultScaleValue,
                z: defaultScaleValue
            },
            translate: {
                x: xpos,
                y: ypos
            },
            onDragChanged: function(relativeX, relativeY, intersectionX, intersectionY) {
                this.translate = {x:intersectionX, y:intersectionY};
            },
            onRotationChanged: function(angleInDegrees) {
                this.rotate.z = rotationValues[modelIndex] - angleInDegrees;
            },
            onRotationEnded: function(angleInDegrees) {
               rotationValues[modelIndex] = this.rotate.z
            },
            onScaleChanged: function(scale) {
                var scaleValue = scaleValues[modelIndex] * scale;
                this.scale = {x: scaleValue, y: scaleValue, z: scaleValue};
            },
            onScaleEnded: function(scale) {
                scaleValues[modelIndex] = this.scale.x;
            }
        })

        allCurrentModels.push(model);
        lastAddedModel = model;
        this.instantTrackable.drawables.addCamDrawable(model);
    }
}

The addModel function is very similar to the one presented in the previous example; however, some additions have been made. Specifically, gesture callbacks have been added to the AR.Model. These callbacks follow the invocation pattern of the onClick callback: should a class derived from AR.Drawable be hit by the touch, its gesture callbacks are invoked if implemented; if not, the trackable is considered next, and eventually the AR.Context object. For AR.Drawables belonging to an AR.InstantTrackable, the drag gesture callbacks receive the tracking plane intersection coordinates in addition to the relative coordinates. This provides a means of translating objects in the instant tracking scene by simply setting the AR.Drawable's translate property to the intersection coordinates received. We recommend setting entire transformation properties (rotate, scale, translate) rather than their individual components to minimize the necessary callbacks from JavaScript to the native OS environment. The rotation and scale gesture callbacks operate as demonstrated by the dedicated gesture sample, to which the interested reader is kindly referred.

One more intricacy to consider is disabling the drag gesture while two-finger gestures are active, in order to prevent counterintuitive transformation behaviour. This can be achieved by implementing the AR.context.on2FingerGestureStarted callback and setting a flag therein.

AR.context.on2FingerGestureStarted = function() {
    oneFingerGestureAllowed = false;
}

The onDragChanged callback is adapted to consider this flag and only update the translate property when allowed to do so.

onDragChanged: function(relativeX, relativeY, intersectionX, intersectionY) {
    if (oneFingerGestureAllowed) {
        this.translate = {x:intersectionX, y:intersectionY};
    }
}

The flag is reset in the next onDragBegan callback invocation to re-enable the drag gesture.

onDragBegan: function(x, y) {
    oneFingerGestureAllowed = true;
}

The changes outlined finally enable the initial use case of a furniture product visualization application to be implemented. Again, there are aspects of the example app that have not been covered, but they do not directly relate to the instant tracking feature and can easily be understood from the source code of the example application.

A miniature living room scene on the floor of the Wikitude offices

Scene Interaction

The instant tracking feature further allows for 3D points to be queried from the underlying point cloud structure. This section is concerned with showcasing this feature based on the corresponding sample of the sample application.

To utilize this feature, a 2D input position on the screen is required. To acquire this position, we use the AR.context.onScreenClick event and attach a function receiving the click coordinates as its input. The coordinates received can be supplied unaltered to the convertScreenCoordinateToPointCloudCoordinate function of an AR.InstantTrackable. In addition to the input coordinates, the function requires two more functions that are invoked upon completion of the query: one upon success, meaning a 3D position could be found for the input coordinate, and one upon failure, meaning such a position could not be found. In the successful case, the resulting 3D position is provided as three separate parameters. These can be used to set the translation of any AR.Drawable directly; an AR.Circle in this case. This feature is only available in the tracking state.

AR.context.onScreenClick = function(touchLocation) {
    if ( World.tracker.state === AR.InstantTrackerState.TRACKING ) {
        World.instantTrackable.convertScreenCoordinateToPointCloudCoordinate(touchLocation.x, touchLocation.y, function(x, y, z) {
            var circle = new AR.Circle(0.1, {
                translate: {
                    x: x,
                    y: y,
                    z: z
                },
                style: {
                    fillColor: '#FF8C0A'
                }
            });
            World.instantTrackable.drawables.addCamDrawable(circle);
        }, function () {
            alert('nothing hit. try selecting another scene location');
        });
    } else {
        alert('Scene information is only available during tracking. Please click the start button to start tracking.');
    }
}

Finally, running the sample allows circle augmentations to be placed on screen touch when in tracking state.

Orange circles generated on the floor of the Wikitude offices using the scene picking feature

Persistent Instant Targets

The save and load instant targets feature allows AR experiences to be persistently accessed by multiple users across devices and operating systems. Furthermore, instant targets can be expanded on the fly. This section showcases this feature based on the corresponding samples of the sample application. This feature is not available when platform assisted tracking is enabled.

Save Instant Target

To save an instant target, there has to be an active InstantTracker in the tracking state, and the directories of the provided path have to exist.

To get a valid path from the platform, AR.platform.sendJSONObject and architectView.callJavascript are used. The example shows one possible method of saving and loading augmentations together with the instant target. This is not part of the SDK functionality and is best implemented in a custom way for each app.

saveCurrentInstantTarget: function () {
    var augmentations = [];

    allCurrentModels.forEach(function (model) {
        augmentations.push({
            uri: model.uri,
            translate: model.translate,
            rotate: model.rotate,
            scale: model.scale
        });
    });

    if (this.tracker.state === AR.InstantTrackerState.TRACKING) {
        AR.platform.sendJSONObject({
            action: "save_current_instant_target",
            augmentations: JSON.stringify(augmentations)
        });
    } else {
        alert("Save instant target is only available while tracking.")
    }
}

saveCurrentInstantTargetToUrl is called from the platform code with a URL, which saveCurrentInstantTarget then uses as the path to save to.

saveCurrentInstantTargetToUrl: function (url) {
    this.tracker.saveCurrentInstantTarget(url, function () {
        alert("Saving was successful");
    }, function (error) {
        alert("Saving failed: " + error);
    })
}

Load Instant Target

To load an instant target, there has to be an active tracker and a previously saved instant target.

To get a valid path from the platform, AR.platform.sendJSONObject and architectView.callJavascript are used.

loadExistingInstantTarget: function () {
    AR.platform.sendJSONObject({
        action: "load_existing_instant_target"
    });
},

loadExistingInstantTargetFromUrl is called from the platform code with a URL, which is used to create a TargetCollectionResource that is then passed to loadExistingInstantTarget. The platform code additionally supplies the previously saved augmentations as a JSON string. In the example, when loading the instant target succeeds, these augmentations are recreated.

loadExistingInstantTargetFromUrl: function (url, augmentationsJSON) {
    var mapResource = new AR.TargetCollectionResource(url);
    this.tracker.loadExistingInstantTarget(mapResource, function () {
        var augmentations = JSON.parse(augmentationsJSON);

        World.instantTrackable.drawables.removeCamDrawable(World.drawables);
        World.drawables.forEach(function (drawable) {
            drawable.destroy();
        });
        World.drawables = [];
        augmentations.forEach(function (model) {
            var modelIndex = rotationValues.length;

            rotationValues[modelIndex] = model.rotate.z;
            scaleValues[modelIndex] = model.scale.x;

            World.drawables.push(new AR.Model(model.uri, {
                translate: model.translate,
                rotate: model.rotate,
                scale: model.scale,
                onDragBegan: function() {
                    oneFingerGestureAllowed = true;
                },
                onDragChanged: function(relativeX, relativeY, intersectionX, intersectionY) {
                    if (oneFingerGestureAllowed) {
                        // We recommend setting the entire translate property rather than
                        // its individual components, as the latter would cause several
                        // calls to native, which can potentially lead to performance
                        // issues on older devices. The same applies to the rotate and
                        // scale properties.
                        this.translate = {x:intersectionX, y:intersectionY};
                    }
                },
                onRotationChanged: function(angleInDegrees) {
                    this.rotate.z = rotationValues[modelIndex] - angleInDegrees;
                },
                onRotationEnded: function() {
                    rotationValues[modelIndex] = this.rotate.z
                },
                onScaleChanged: function(scale) {
                    var scaleValue = scaleValues[modelIndex] * scale;
                    this.scale = {x: scaleValue, y: scaleValue, z: scaleValue};
                },
                onScaleEnded: function() {
                    scaleValues[modelIndex] = this.scale.x;
                }
            }))
        });
        World.instantTrackable.drawables.addCamDrawable(World.drawables);
    }, function (error) {
        alert("Loading failed: " + error);
    }, {
        expansionPolicy: AR.CONST.INSTANT_TARGET_EXPANSION_POLICY.ALLOW_EXPANSION
    })
}

Setting path for Instant Targets

First, paths for the saved instant target and its augmentations are set.

private final File instantTargetSaveFile;
private final File savedAugmentationsFile;

public SaveLoadInstantTargetExtension(final Activity activity, final ArchitectView architectView) {
    super(activity, architectView);
    instantTargetSaveFile = new File(activity.getExternalFilesDir(null), "SavedInstantTarget.wto");
    savedAugmentationsFile = new File(activity.getExternalFilesDir(null), "SavedAugmentations.json");
}

To receive JSON objects from JavaScript, an ArchitectJavaScriptInterfaceListener is registered.

@Override
public void onCreate() {
    architectView.addArchitectJavaScriptInterfaceListener(this);
}

@Override
public void onDestroy() {
    architectView.removeArchitectJavaScriptInterfaceListener(this);
}

When a JSON object is received, it is analyzed and the path for storing or loading the instant target is sent back to JavaScript using architectView.callJavascript.

@Override
public void onJSONObjectReceived(final JSONObject jsonObject) {
    try {
        switch (jsonObject.getString("action")) {
            case "save_current_instant_target":
                saveAugmentations(jsonObject.getString("augmentations"));
                saveCurrentInstantTarget();
                break;
            case "load_existing_instant_target":
                loadExistingInstantTarget();
                break;
        }
    } catch (JSONException e) {
        activity.runOnUiThread(new Runnable() {
            @Override
            public void run() {
                Toast.makeText(activity, R.string.error_parsing_json, Toast.LENGTH_LONG).show();
            }
        });
    }
}

private void saveAugmentations(String data) {
    try (FileOutputStream stream = new FileOutputStream(savedAugmentationsFile)) {
        stream.write(data.getBytes());
    } catch (IOException e) {
        Log.e(TAG, "Could not save augmentations.", e);
    }
}

private String loadAugmentations() {
    int length = (int) savedAugmentationsFile.length();

    byte[] bytes = new byte[length];
    String jsonString;

    try (FileInputStream in = new FileInputStream(savedAugmentationsFile)) {
        in.read(bytes);
        jsonString = new String(bytes);
    } catch (IOException e) {
        jsonString = "";
    }

    return JSONObject.quote(jsonString);
}

private void loadExistingInstantTarget() {
    architectView.callJavascript(String.format("World.loadExistingInstantTargetFromUrl(\"%s\", %s)", instantTargetSaveFile.getAbsolutePath(), loadAugmentations()));
}

private void saveCurrentInstantTarget() {
    architectView.callJavascript(String.format("World.saveCurrentInstantTargetToUrl(\"%s\")", instantTargetSaveFile.getAbsolutePath()));
}