Worldengine SDK user guide

SDK setup

Worldengine integration uses the Unity Package Manager (UPM) system to make the SDK easy to integrate and update.

  1. Add the mobi.lab.scp UPM package folder to the Packages folder of the Unity project. It is a standard UPM package, so any other standard UPM installation method also works.
  2. Add a scoped registry for Microsoft Azure to the manifest.json file located in the project's Packages folder.

    Adding scoped registry from manifest.json
    { 
    	"dependencies": { /.../ }, 
    	"scopedRegistries": [
    		{
    			"name": "Azure Mixed Reality Services",
    			"url": "https://api.bintray.com/npm/microsoft/AzureMixedReality-NPM",
    			"scopes": [
    				"com.microsoft.azure.spatial-anchors-sdk"
    			]
    		}
    	]
    }
  3. Optional: If assembly definition files are used in the project, add a MobiLab.Worldengine.Runtime reference to the assembly definition of your scripts folder (a minimal example follows this list).
  4. Set the project build target to Android or iOS.
  5. You should now be able to initialize Worldengine when running in the editor and receive responses from the server. You may skip ahead to the Initialization section of the guide to verify that the SDK is working.
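
A minimal assembly definition for the scripts folder mentioned in step 3 could look like the sketch below. The assembly name "MyProject.Scripts" is a placeholder; only the MobiLab.Worldengine.Runtime reference name comes from the SDK.

Example scripts assembly definition (.asmdef)
{
	"name": "MyProject.Scripts",
	"references": [
		"MobiLab.Worldengine.Runtime"
	]
}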

iOS build setup

Unity creates iOS builds by first creating an Xcode project based on the Unity project. For this step to work, some settings need to be configured in the Unity project settings.

  1. In XR Plug-in Management > iOS enable ARKit Plug-in Provider.
  2. In Player > Other Settings set Camera Usage Description. This is the message presented to the user when camera access is requested (e.g. "Camera is used for augmented reality").
  3. In Player > Other Settings set Target minimum iOS Version to "11.0". This is required by the ARKit plug-in.
  4. In Player > Other Settings set Architecture to ARM64. This is required by the ARKit plug-in.

Android build setup

Unity creates Android builds by first creating a Gradle project based on the Unity project. For this step to work, some settings need to be configured in the Unity project settings.

  1. In XR Plug-in Management > Android enable ARCore Plug-in Provider.
  2. In Player > Other Settings > Graphics APIs remove Vulkan. It is enabled by default in a new Unity project and is not supported by ARCore.
  3. In Player > Publishing Settings enable Custom Main Manifest and Custom Main Gradle Template.
  4. In Player > Other Settings set Minimum API Level to 24.
  5. Add Azure dependencies to the mainTemplate.gradle dependencies block.

    dependencies {
    	implementation fileTree(dir: 'libs', include: ['*.jar'])
    	implementation('com.squareup.okhttp3:okhttp:[3.11.0]')
    	implementation('com.microsoft.appcenter:appcenter-analytics:[1.10.0]')
    **DEPS**}
    
    
  6. Add the following permissions to the AndroidManifest.xml.

    <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
    <uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION"/>
    <uses-permission android:name="android.permission.ACCESS_WIFI_STATE"/>
    <uses-permission android:name="android.permission.CHANGE_WIFI_STATE"/>

Starting Augmented Reality

The following steps should be taken to locate spatial anchors and attached content:

  1. Connect and download data from your Worldengine account through the AugmentedReality facade.
  2. Optional: Download or update assets from your Worldengine account through the AugmentedReality facade.
  3. Start an augmented reality session through the AugmentedReality facade.
  4. React to OnAdded callbacks on AnchorTriggerGhost, ARElementGhost or AssetGhost and instantiate content. Ghosts are positioned Unity GameObjects that are kept in sync with the real world and have no visual output. Visible user content can be placed as a child object of a ghost or made to actively follow its transform.

Initialization

Worldengine classes are located in the MobiLab.Worldengine namespace.

using statement
using MobiLab.Worldengine;

Most Worldengine features can be accessed through the AugmentedReality facade class. The async keyword is used so that the asynchronous calls can be awaited in sequence.

Starting AR Session
async void Start()
{
	AugmentedReality.Init();

	//Password and username used by the Content Manager development version
	var user = "string";
	var pass = "string";
	//1. Connect and download data
	await AugmentedReality.DownloadRemoteData(user, pass);
	//2. Download assets. Assets are cached locally per user account. Only new or updated assets will be downloaded
	await AugmentedReality.UpdateAssets();
	//3. Start the AR session with the preferred Unity camera object
	var arCamera = Camera.main;
	await AugmentedReality.StartAnchorSession(arCamera);
}
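
For reference, the same calls wrapped in a complete MonoBehaviour with basic error handling might look like the sketch below. The class name ArSessionStarter and the try/catch block are illustrative additions, not part of the SDK; only the AugmentedReality calls shown above are assumed.

Initialization with error handling
using System;
using UnityEngine;
using MobiLab.Worldengine;

public class ArSessionStarter : MonoBehaviour
{
	async void Start()
	{
		try
		{
			AugmentedReality.Init();
			//Connect and download data, then update locally cached assets
			await AugmentedReality.DownloadRemoteData("user", "pass");
			await AugmentedReality.UpdateAssets();
			//Start the AR session with the main camera
			await AugmentedReality.StartAnchorSession(Camera.main);
		}
		catch (Exception e)
		{
			//Log failures (e.g. no network or invalid credentials) so setup problems are visible
			Debug.LogException(e);
		}
	}
}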

Rendering content

When the augmented reality session locates spatial anchors and content, callbacks are sent to the client application through the Ghost classes. The events are invoked with an AnchorTriggerGhost/ARElementGhost/AssetGhost parameter that holds all relevant metadata for that anchor/element/asset. The client can start listening to the events at any time in the application lifecycle. When attaching to the events after the session has already started, content that was located earlier can be found in the Active list of each ghost class.

Ghost events
using UnityEngine;
using MobiLab.Worldengine;

public class ContentRenderingSystem : MonoBehaviour
{

	//Content to be shown to the user
	public GameObject contentPrefab;

	private void OnEnable()
	{
		//Called when an asset is located
		AssetGhost.OnAdded += AssetGhost_OnAdded;
		//See also ARElementGhost.OnAdded and AnchorTriggerGhost.OnAdded
	}

	private void OnDisable()
	{
		AssetGhost.OnAdded -= AssetGhost_OnAdded;
	}

	private void AssetGhost_OnAdded(AssetGhost ghost)
	{
		//Add content as child of a ghost to keep it in sync
		var content = Instantiate(contentPrefab, ghost.Transform);

		//if AugmentedReality.UpdateAssets() has been called, then the asset file can be accessed from ghost.FilePath
		Debug.Log($"AssetGhost {ghost.Title} has asset of type {ghost.AssetType} in file {ghost.FilePath}");
	}

}
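
Content that was located before a subscriber attached does not raise OnAdded again, so a late subscriber can process the existing entries itself. A minimal sketch of an OnEnable that does this, assuming the Active list mentioned above is exposed as a static AssetGhost.Active collection (the exact member may differ):

Handling already located content
private void OnEnable()
{
	AssetGhost.OnAdded += AssetGhost_OnAdded;

	//Process assets that were located before this component subscribed
	foreach (var ghost in AssetGhost.Active)
	{
		AssetGhost_OnAdded(ghost);
	}
}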

Callbacks:

  • AnchorTriggerGhost.OnAdded - An anchor is located.
  • AnchorTriggerGhost.OnRemoved - An anchor is removed from the system.
  • ARElementGhost.OnAdded - An anchor with attached ARElements is located. This is called for each element.
  • ARElementGhost.OnRemoved - An anchor with attached ARElements is removed. This is called for each element.
  • AssetGhost.OnAdded - An ARElement with attached assets is located. This is called for each asset.
  • AssetGhost.OnRemoved - An ARElement with attached assets is removed. This is called for each asset.
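
The OnRemoved callbacks are the natural place to clean up content that was instantiated in OnAdded. A minimal sketch, assuming OnRemoved delivers the same AssetGhost instance that was passed to OnAdded; the tracking dictionary and class name are illustrative, not part of the SDK.

Cleaning up removed content
using System.Collections.Generic;
using UnityEngine;
using MobiLab.Worldengine;

public class ContentLifecycleSystem : MonoBehaviour
{
	//Content to be shown to the user
	public GameObject contentPrefab;

	//Tracks the content instance spawned for each located asset
	private readonly Dictionary<AssetGhost, GameObject> spawned = new Dictionary<AssetGhost, GameObject>();

	private void OnEnable()
	{
		AssetGhost.OnAdded += AssetGhost_OnAdded;
		AssetGhost.OnRemoved += AssetGhost_OnRemoved;
	}

	private void OnDisable()
	{
		AssetGhost.OnAdded -= AssetGhost_OnAdded;
		AssetGhost.OnRemoved -= AssetGhost_OnRemoved;
	}

	private void AssetGhost_OnAdded(AssetGhost ghost)
	{
		//Keep the spawned content in sync by parenting it to the ghost
		spawned[ghost] = Instantiate(contentPrefab, ghost.Transform);
	}

	private void AssetGhost_OnRemoved(AssetGhost ghost)
	{
		//Destroy the content spawned for this ghost when its asset leaves the system
		if (spawned.TryGetValue(ghost, out var content))
		{
			Destroy(content);
			spawned.Remove(ghost);
		}
	}
}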

Permissions

The SDK handles requesting permissions from the user when a session is started, but for the requests to work your project must also declare them in the project settings. Access to Wi-Fi, location services and the camera is required to relocate and position the content.

Android requires the following permissions to be added to the AndroidManifest.xml:

<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION"/>
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE"/>
<uses-permission android:name="android.permission.CHANGE_WIFI_STATE"/>

The iOS Xcode project requires the following plist additions:

  • NSBluetoothPeripheralUsageDescription
  • NSLocationWhenInUseUsageDescription
  • NSBluetoothAlwaysUsageDescription

iOS requires the Access WiFi Information capability to be added to the Xcode project.
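
For reference, a minimal Info.plist fragment declaring the usage description keys listed above could look like this; the description strings are placeholders and should describe your app's actual usage.

<key>NSBluetoothPeripheralUsageDescription</key>
<string>Bluetooth is used to improve positioning</string>
<key>NSLocationWhenInUseUsageDescription</key>
<string>Location is used to find nearby augmented reality content</string>
<key>NSBluetoothAlwaysUsageDescription</key>
<string>Bluetooth is used to improve positioning</string>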