Documentation

Setup Guide Epson

Project Setup for Android Studio

  • Create a new Android Application Project. (A working SampleProject, in which all of these steps have already been done, is bundled with this SDK.)

  • Copy the file Library/wikitudesdk.aar into the libs folder of your module. (project root/app/libs)

  • Open build.gradle from your module, add the wikitudesdk.aar as a dependency, and tell Gradle to search the libs folder, as shown in the code below.

  • If you already purchased a license, please set the applicationId to the package name you provided us with.

You will have to download the correct Epson SDK for your device and add it as a dependency. The Epson SDKs can be downloaded from the Epson website and are also included in the libs folder of the Wikitude Epson package.

dependencies {
    implementation files('libs/BT300Ctrl.jar')

    implementation (name: 'wikitudesdk', ext: 'aar')
}

repositories {
    flatDir {
        dirs 'libs'
    }
}

Optionally, you can include the Wikitude SDK via our custom Maven repository instead.

  • Add https://cdn.wikitude.com/sdk/maven as a Maven repository to your project. This is usually done in the top-level build.gradle.
buildscript {
    repositories {
        jcenter()
        google()
        maven {
            url 'https://cdn.wikitude.com/sdk/maven'
        }
    }
    ...
}

allprojects {
    repositories {
        jcenter()
        google()
        maven {
            url 'https://cdn.wikitude.com/sdk/maven'
        }
    }
}
  • Open build.gradle from your module and add com.wikitude:js-moverio-bt300:9.0.0 as a dependency, as in the code below.
android {
    ...

    defaultConfig {
        applicationId "xxxx"
    }
}

dependencies {
    implementation files('libs/BT300Ctrl.jar')
    implementation 'com.wikitude:js-moverio-bt300:9.0.0'

    implementation 'com.android.support:appcompat-v7:21.0.3'
}
  • Add the following permissions to your AndroidManifest.xml
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.ACCESS_GPS" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-feature android:name="android.hardware.camera" android:required="true" />
<uses-feature android:name="android.hardware.location" android:required="true" />
<uses-feature android:name="android.hardware.sensor.accelerometer" android:required="true" />
<uses-feature android:name="android.hardware.sensor.compass" android:required="true" />
<uses-feature android:glEsVersion="0x00020000" android:required="true" />
  • The activity holding the AR-View (called wearableArchitectView in the following) must have android:configChanges="screenSize|orientation" set in the AndroidManifest.xml. For example, this could look like:
<activity android:name="com.yourcompany.yourapp.YourArActivity"
   android:configChanges="screenSize|orientation"/>

AR View in Activity

Keep in mind that the Wikitude SDK is not a native Android SDK as you know it from other SDKs. The basic concept is to add a wearableArchitectView to your project and notify it about lifecycle events. The wearableArchitectView creates a camera surface and handles sensor events. The experience itself, sometimes referred to as an ARchitect World, is implemented in JavaScript and packaged in your application's asset folder (as in this project) or hosted on your own server. The experiences are written in HTML and JavaScript and call methods in Wikitude's AR namespace (e.g. AR.GeoObject).

You have to include

 <script src="https://wikitude.com/libs/architect.js"></script>

in your HTML files to use the AR namespace; the wearableArchitectView will then handle them properly. To test an ARchitect World in a desktop browser, include the ade.js tool instead to avoid JavaScript errors and to get a development console.

It is recommended to handle your augmented reality experience in a separate Activity. Declare the wearableArchitectView inside a layout XML file, e.g. add the following inside the parent FrameLayout's tags.

<com.wikitude.architect.WearableArchitectView android:id="@+id/wearableArchitectView"
   android:layout_width="fill_parent" android:layout_height="fill_parent"/>

The WearableArchitectView creates a camera surface, so ensure that the camera is properly released in case you are using it somewhere else in your application. Besides a camera (front- or back-facing), the WearableArchitectView also makes use of compass and accelerometer values, requires OpenGL ES 2.0, and needs at least Android 4.0. WearableArchitectView.isDeviceSupported(Context context) checks whether the current device has all the required hardware and software in place.

Note: Make the AR-View accessible only on supported devices.
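
For example, a launcher Activity can guard the AR Activity behind this check. The following is a minimal sketch; YourArActivity is the placeholder Activity name from the manifest example above, and the Toast message is illustrative.

// Requires android.content.Intent, android.widget.Toast and com.wikitude.architect.WearableArchitectView.
if ( WearableArchitectView.isDeviceSupported( this ) ) {
    // Device has the required hardware and software: open the AR Activity.
    startActivity( new Intent( this, YourArActivity.class ) );
} else {
    // Fall back to whatever hint fits your app.
    Toast.makeText( this, "This device does not support the AR view.", Toast.LENGTH_LONG ).show();
}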

It is very important to notify the WearableArchitectView about lifecycle events of the Activity. Call the wearableArchitectView's onCreate(), onPostCreate(), onPause(), onResume(), and onDestroy() inside the corresponding lifecycle methods of your Activity. Best practice is to define a member variable for the wearableArchitectView in your Activity: set it right after setContentView in the Activity's onCreate(), and access the wearableArchitectView via this member variable later on.

this.wearableArchitectView = (WearableArchitectView)this.findViewById( R.id.wearableArchitectView );
final ArchitectStartupConfiguration config = new ArchitectStartupConfiguration();
config.setLicenseKey( /* license key */ );
this.wearableArchitectView.onCreate( config );
Note: Since Android 6.0 (API level 23) you need to make sure your app has been granted the camera and external storage runtime permissions before calling wearableArchitectView.onCreate( config ).
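
A minimal sketch of such a permission check using the Android support library's permission helpers follows; the request code and the helper method names (hasArPermissions, requestArPermissions) are illustrative, adjust them to your app.

// Requires android.Manifest, android.content.pm.PackageManager,
// android.support.v4.app.ActivityCompat and android.support.v4.content.ContextCompat.
private static final int AR_PERMISSIONS_REQUEST = 1; // arbitrary request code

private boolean hasArPermissions() {
    return ContextCompat.checkSelfPermission( this, Manifest.permission.CAMERA ) == PackageManager.PERMISSION_GRANTED
        && ContextCompat.checkSelfPermission( this, Manifest.permission.WRITE_EXTERNAL_STORAGE ) == PackageManager.PERMISSION_GRANTED;
}

private void requestArPermissions() {
    ActivityCompat.requestPermissions( this,
        new String[] { Manifest.permission.CAMERA, Manifest.permission.WRITE_EXTERNAL_STORAGE },
        AR_PERMISSIONS_REQUEST );
}

// Call wearableArchitectView.onCreate( config ) only once hasArPermissions() returns true,
// e.g. after onRequestPermissionsResult() reports that all permissions were granted.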

Activity's onPostCreate() is the best place to load the AR experience.

this.wearableArchitectView.onPostCreate();
this.wearableArchitectView.load( "YOUR-AR-URL" );
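
The remaining lifecycle callbacks from the list above are forwarded in the same way. A minimal sketch, showing only the forwarding calls; any additional setup or teardown your Activity needs is omitted:

@Override
protected void onResume() {
    super.onResume();
    this.wearableArchitectView.onResume();   // resumes camera and sensor handling
}

@Override
protected void onPause() {
    super.onPause();
    this.wearableArchitectView.onPause();    // releases camera and sensors
}

@Override
protected void onDestroy() {
    super.onDestroy();
    this.wearableArchitectView.onDestroy();  // tears down the AR view
}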

On the first start of an AR-View the Wikitude SDK looks for a calibration. By default the calibration is stored on the external storage; the path can also be set manually by calling wearableArchitectView.onPostCreate('calibration path'). It is recommended to use the default calibration path, so that users who have other apps powered by the Wikitude SDK installed can reuse the same calibration. More about the calibration and how it can be customized can be found here.

The wearableArchitectView.load() argument is the path to the HTML file that defines your AR experience. It can be relative to the asset folder root or a web URL (starting with http:// or https://). E.g. wearableArchitectView.load('arexperience.html') opens the HTML file in your project's assets folder, whereas wearableArchitectView.load('http://your-server.com/arexperience.html') loads the file from a server.

Note: You can only pass arguments to the HTML file when loading it via a URL. The following will not work: wearableArchitectView.load('arexperience.html?myarg=1')
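
Passing arguments does work when the experience is loaded from a web URL, for example (the server URL is a placeholder):

// Query parameters are supported because the file is loaded via a web URL.
wearableArchitectView.load( "http://your-server.com/arexperience.html?myarg=1" );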

Location

Location management is important in location-based augmented reality applications. Depending on the use case, the location is obtained via GPS or the network and may be updated every second or only once in a while. Although the SDKExamples project provides a basic implementation of a LocationProvider, it is by far not the best location strategy available on Android.

Please use your own, more advanced location strategy implementation if you have special requirements.
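
As a starting point, the sketch below forwards GPS fixes from Android's LocationManager to the AR view. It assumes the WearableArchitectView exposes the same setLocation( latitude, longitude, altitude, accuracy ) method as the standard ArchitectView; the update interval and minimum distance are illustrative.

// Requires android.content.Context, android.location.Location, android.location.LocationListener,
// android.location.LocationManager and android.os.Bundle.
final LocationManager locationManager = (LocationManager) getSystemService( Context.LOCATION_SERVICE );
final LocationListener locationListener = new LocationListener() {
    @Override
    public void onLocationChanged( Location location ) {
        if ( wearableArchitectView != null ) {
            // Forward the fix so geo-based AR objects are positioned correctly (assumed setLocation signature).
            wearableArchitectView.setLocation( location.getLatitude(), location.getLongitude(),
                    location.getAltitude(), location.getAccuracy() );
        }
    }
    @Override public void onStatusChanged( String provider, int status, Bundle extras ) { }
    @Override public void onProviderEnabled( String provider ) { }
    @Override public void onProviderDisabled( String provider ) { }
};
// Request an update at most every second / every meter; tune this to your use case.
locationManager.requestLocationUpdates( LocationManager.GPS_PROVIDER, 1000, 1, locationListener );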

Stereoscopic Rendering (2D/3D Mode)

The Wikitude SDK provides two different stereoscopic rendering modes:

  • Wikitude SDK Stereoscopic Mode
  • Hardware Stereoscopic Mode

The Wikitude SDK stereoscopic mode splits the view and renders a calibrated image separately for both eyes. It can be enabled by using wearableArchitectView.setStereoscopic3dRenderingEnabled (true for 3D mode or false for 2D mode).

The hardware stereoscopic mode of the Epson Moverio stretches a single image over both displays; it can also be triggered via the hardware buttons or the Epson Moverio SDK. In this mode the image is split in half: the left half is displayed to the left eye and the right half to the right eye. In 3D mode the image for each eye is therefore half the full resolution (640x720px on the BT-300/350) and stretched to fit the display. This means that if you want to show normal web content without AR objects, you have to reduce its width to half the size. AR objects, however, are rendered by the Wikitude Epson SDK itself, so their width does not have to be changed; this is handled internally. The hardware stereoscopic mode can be enabled with wearableArchitectView.setStereoscopic3dDisplayModeEnabled (true for 3D mode or false for 2D mode).

In most cases it makes sense to set both stereoscopic modes to the same value.
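
For example, both modes can be toggled together; a minimal sketch, where the flag would typically come from a user setting or a hardware button handler (the flag name is illustrative):

final boolean enable3d = true; // hypothetical flag from your settings or button handler
wearableArchitectView.setStereoscopic3dRenderingEnabled( enable3d );    // Wikitude SDK split rendering
wearableArchitectView.setStereoscopic3dDisplayModeEnabled( enable3d );  // Epson hardware 3D display mode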