8i Unity Plugin Documentation

Welcome to the 8i Unity Plugin Docs.

The 8i Unity Plugin allows developers to add 8i’s fully volumetric video to any Unity experience. It’s like embedding traditional 2D video content, except you get to walk around it as it plays!

About

Download

Download the 8i Unity Plugin. You should have received a download link from 8i after purchasing Stage; if not, ask your 8i contact for one.

Install

Extract the ‘8i’ folder from the zip file into your project’s Assets directory.

Update

Close the Unity Editor and delete the ‘8i’ folder from your project, then install the new version as described above.

It is required to fully close the Unity Editor, as the Editor locks the native DLLs and prevents them from being overwritten or deleted.

Support

Supported Unity Versions

  • 2018.4 (LTS)
  • 2019.4 (LTS)
  • 2020.3 (LTS)

Note

For more details on each platform see the individual pages in Platform Support

Quick Start

Download and Installation

Download the 8i Unity Plugin. You should have received a download link from 8i after purchasing Stage; if not, ask your 8i contact for one.

Once downloaded, extract the ‘8i’ folder from the zip file into your project’s Assets directory.

Creating a HvrActor

Before beginning, ensure that the HvrActor, HvrRender and HvrLight components are available through the Unity Component menus. If they are not, check the installation instructions or the Troubleshooting section of this documentation.

  1. Create a HvrActor by right-clicking in the Hierarchy and selecting ‘8i/Create HvrActor’
  2. Drag and drop one of the folders contained under ‘8i/examples/assets/hvr/’ onto the ‘Reference’ slot of the HvrActor
  3. Select the Main Camera in the scene
  4. Add the ‘HvrRender’ component
  5. Optionally, if you need lighting at the expense of speed: select the ‘Directional Light’ in the scene, then add the ‘HvrLight’ component

The HvrActor will now render in both the Scene and Game views.

Creating a PatActor

The easiest way to create a PAT playback object is to instantiate the PatActor prefab located under 8i/examples/assets/prefabs. The prefab combines the required PatActor, PatRender, and PatAudioRender components, and is scaled to match Unity’s expected units. The prefab still requires some in-code linking of its components; a full example scene is provided at 8i/examples/scenes/Example-PAT-streaming.unity

  • The example scene uses a helper PatStreamPlayer script. The script takes in an input AssetUrl, and upon starting the scene, instantiates the PatActor prefab, links the renderers to the PatActor so they render its data, and then initializes the PatActor instance with the AssetUrl.
  • It can pre-download or directly stream the content at the provided AssetUrl, depending on the DownloadAsset flag. An example scene with the Download toggle enabled also exists as Example-PAT-download.
  • Finally, it shows example usage of the IEightiAsset control interface (which can be used for both Hvr and Pat assets), by enabling looping and beginning playback of the content.

It is important to note that the PatStreamPlayer script is a good starting example, but it does not exercise every part of the PAT Unity API. See the Components page for more information on the Pat components and for more advanced usage of the PatAsset interface, such as manual and automatic bitrate switching.
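
The streaming flow described above can be sketched as follows. This is a hedged illustration, not the actual PatStreamPlayer source: the AssetUrl and DownloadAsset names come from the description above, but all method and property signatures are hypothetical; refer to the Example-PAT-streaming scene for the real implementation.

```csharp
using UnityEngine;

public class MinimalPatPlayer : MonoBehaviour
{
    public GameObject patActorPrefab; // the PatActor prefab under 8i/examples/assets/prefabs
    public string assetUrl;           // URL of the PAT content
    public bool downloadAsset;        // pre-download instead of streaming directly

    void Start()
    {
        // The prefab bundles the PatActor, PatRender and PatAudioRender components
        GameObject instance = Instantiate(patActorPrefab);

        // IEightiAsset is the shared control interface for both Hvr and Pat assets
        IEightiAsset asset = instance.GetComponentInChildren<IEightiAsset>();

        // Hypothetical initialization call; see PatStreamPlayer for the real one
        // asset.Initialize(assetUrl, downloadAsset);

        asset.Loop = true;  // hypothetical property name
        asset.Play();
    }
}
```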

Creating a build

  1. Follow the ‘Creating a HvrActor’ steps above.
  2. Save the scene.
  3. Open the Build Settings window under ‘File/Build Settings…’.
  4. Click the ‘Add Open Scenes’ button on the right of the window, or drag in the scene that was just saved from the ‘Project’ window of the Unity Editor.
  5. Click ‘Build and Run’, follow the onscreen prompts, and select your build location.

Unity will now compile the project and create a build.

If it does not, check the Unity Editor console or the Troubleshooting section of this documentation.

Building for VR

The 8i Unity Plugin fully supports VR rendering.

To enable VR for your Unity Project

  1. Open the ‘Player Settings’ window from ‘Edit/Project Settings/Player’.
  2. Tick the ‘Virtual Reality Supported’ checkbox.
  3. From the new list, make sure the target headset you wish to use is listed. If it isn’t, add it by clicking the + icon.
  4. Ensure your main camera has a HvrRender component attached to it

If you encounter any issues getting VR working in your Unity Project, please consult the Unity Manual [Link to Unity Docs]

Note: Android-based VR headsets like Google Daydream and Oculus Go are also supported. You may want to consult their own documentation on how to build an app in Unity.

Building for AR

To get started with ARKit or ARCore, we provide a set of tutorials here: Tutorials

Platform Support

Android

Support

Architecture
  • ARMv7, ARM64
Graphics API
  • OpenGLES3

Requirements

CPU
  • Requires ARMv7 or ARM64
  • ARMv7 Requires NEON (supported by almost all phones that are fast enough to run HVR content)
GPU
  • Requires OpenGLES3
RAM
  • Recommended >512MB ( Requirements vary based on application and data quality )
OS
  • KitKat 4.4 – 4.4.4
  • Lollipop 5.0 – 5.1.1
  • Marshmallow 6.0 – 6.0.1
  • Nougat 7.0 – 7.1.2
  • Oreo 8.0 – 8.1
  • Newer Versions
Min Spec
  • API Level 19 ( KitKat )

Building

In order to include HVR data with your build, a custom build step must be used. This step will scan the scenes in your project and copy any required HVR data into the project’s StreamingAssets folder.

This menu can be found under the 8i/Android drop down menu at the top of the Unity Editor window.

Custom Build Menu

Build

This step will copy any HVR data that is required by the project into the StreamingAssets directory, and then build the project to a specified location.

Once the build step is complete, the files that were copied to the StreamingAssets directory will be cleared in order to reduce project bloat.

Build and Run

Same as the ‘Build’ step, but will attempt to run the build on any attached devices once complete.

Prepare Assets for Build

This is a helper utility which will copy any required HVR data to the project’s StreamingAssets folder, but will not build the project.

Once this step is complete, the project can be built again via the standard Unity build process.

There is no need to run the “Prepare Assets for Build” option again unless new HVR data has been added to or removed from the build.

Unpacking Hvr Data

In order to load HVR Data on Android, the data must be available on the device’s local storage in a folder with read access.

This section can be ignored if your project does not need to load data off local storage, or you use another way of bundling and extracting HVR data.

When creating a build, Unity will by default package any files placed within your project’s StreamingAssets folder into a compressed file. This compressed file cannot be read by the 8i Unity Plugin. However, the plugin provides support for extracting HVR and HVRS data from this file onto the device’s local storage.

This feature is available through the “HvrUnpackingScene” scene included with the plugin.

“HvrUnpackingScene” uses the “LoadingSceneManager” component to unpack the data from the built APK/OBB file. When opened, this scene will scan the running build and extract any files that need to be exported. Once complete, it will automatically load the next scene. This component can easily be modified if you want to customize how the data is unpacked.

This scene can be found at: “../8i/core/interface/platforms/android/scenes/”.

Note

It is strongly recommended to enable ‘Split Application Binary’ (Read More) to avoid the size limit imposed by Google Play. Packing everything into one APK is possible, but not recommended.

Google Play requirements

The Google Play Store imposes a size limit of 100MB for APK files uploaded for distribution. To allow larger data to be shipped with an APK, Android supports OBB files, also known as Application Expansion Files (Expansion Files)

Unity also supports these files via a player setting called “Split Application Binary”. This option exports many of the project resources to an OBB rather than packaging them with the APK. (Split Application Binary)
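
The setting above can also be toggled from an editor script using Unity’s real PlayerSettings API. A minimal sketch; the menu path is arbitrary, and the script must live under an Editor/ folder:

```csharp
using UnityEditor;

// Editor utility: enables "Split Application Binary", equivalent to ticking
// the option under Player Settings > Publishing Settings for Android.
public static class AndroidObbSettings
{
    [MenuItem("Build/Enable Split Application Binary")]
    public static void EnableObb()
    {
        PlayerSettings.Android.useAPKExpansionFiles = true;
    }
}
```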

Google Play on 64-bit apps

Starting August 1, 2019, Google Play only accepts apps that support 64-bit architectures. This means you must use a Unity version that supports Android 64-bit builds. Only Unity 2018.3 and newer versions have good ARM64 support (with the exception of 2017.4, which back-ported ARM64 support), so you should stick with those versions.

To build a 64-bit app, you also need to change the Unity scripting backend from Mono to IL2CPP. See here for more information.
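
Both changes can be made from an editor script using Unity’s PlayerSettings API (available in Unity 2018.1+). A minimal sketch; the menu path is arbitrary, and the script must live under an Editor/ folder:

```csharp
using UnityEditor;

// Editor utility: switches the Android scripting backend to IL2CPP and
// enables ARM64 alongside ARMv7, matching Google Play's 64-bit requirement.
public static class Android64BitSettings
{
    [MenuItem("Build/Configure 64-bit Android")]
    public static void Configure()
    {
        PlayerSettings.SetScriptingBackend(BuildTargetGroup.Android,
                                           ScriptingImplementation.IL2CPP);
        PlayerSettings.Android.targetArchitectures =
            AndroidArchitecture.ARMv7 | AndroidArchitecture.ARM64;
    }
}
```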

iOS

Support

Architecture
  • ARM64
Graphics API
  • OpenGLES3
  • Metal

Requirements

  • A Mac is required to generate the XCode project
  • XCode should always be updated to the latest version
CPU
  • Requires ARM64
GPU
  • Requires support for OpenGLES3 or Metal
RAM
  • Recommended >512MB ( Requirements vary based on application and data quality )
OS
  • iOS 11 or newer
Min Spec
  • iPhone 7

Building

There are no special required steps when creating a build. The project can be built using the default Unity Build menu.

Linux

Support

Architecture
  • x64
Graphics API
  • OpenGlCore

Requirements

CPU
  • Intel and AMD
  • Requires 64bit
GPU
  • ATi and Nvidia
  • Requires OpenGL 4.1 or D3D 11.1
RAM
  • Recommended >2GB ( Requirements vary based on application and data quality )
OS
  • Ubuntu 16.04
  • Ubuntu 17.10
Min Spec
  • Intel Core2 Quad CPU Q6600 @ 2.4 GHz
  • 4GB Ram
  • Nvidia 970 or equivalent ( For VR )
  • Integrated graphics have been tested on modern i5 & i7 CPUs, but will render slowly

Building

There are no special required steps when creating a build. The project can be built using the default Unity Build menu.

MacOS

Support

Architecture
  • x64
Graphics API
  • OpenGlCore
  • Metal

Requirements

  • A Mac is required to generate the XCode project
  • XCode 9.3 is the minimum required version
CPU
  • Intel
  • AMD
  • 64bit Architecture
GPU
  • ATi
  • Nvidia
  • Support for OpenGL
RAM
  • Recommended >2GB
OS
  • El Capitan
  • Sierra
  • High Sierra
  • Mojave
Min Spec
  • El Capitan
  • MacBook Pro (Retina, 13-inch, Mid 2014)
  • 2.6GHz dual-core Intel Core i5 processor
  • 8GB of 1600MHz DDR3L onboard memory
  • Intel Iris Graphics
  • https://support.apple.com/kb/sp703

Building

DLL initialisation

You must add 8i/core/platforms/common/scenes/HvrPluginInit.scene as the first scene in your build. The 8i->Project Tips menu can help you with this.

Windows

Support

Architecture
  • x64
Graphics API
  • DirectX 11
  • OpenGlCore

Requirements

CPU
  • Intel
  • AMD
  • 64bit Architecture
GPU
  • ATi
  • Nvidia
  • Support for OpenGL 4.1 or D3D 11.1
RAM
  • Recommended >2GB
OS
  • Windows 7
  • Windows 8
  • Windows 8.1
  • Windows 10
Min Spec
  • Windows 7
  • Intel Core2 Quad CPU Q6600 @ 2.4 GHz
  • 4GB Ram
  • Nvidia 970 or equivalent ( For VR )
  • Integrated graphics have been tested on modern i5 & i7 CPUs, but will render slowly

Building

DLL initialisation

You must add 8i/core/platforms/common/scenes/HvrPluginInit.scene as the first scene in your build. The 8i->Project Tips menu can help you with this.

Components

IEighti shared interface

HVR and PAT codec playback can be controlled via the common IEighti interfaces, so that one set of controls can be used for both codecs. However, the instantiation of each codec’s Actors/Assets differs.

IEightiAsset

This script provides the common set of playback controls between PatAsset and HvrAsset components.
It can be used to Play/Pause/Seek and control things like looping.

IEightiActor

This component provides a shared target for accessing the IEightiAsset interface from either PatActor or HvrActor components.
This type can be used to hold codec agnostic references to playback Actors.
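
The interfaces above allow codec-agnostic playback control. A minimal sketch, assuming the controls described above (Play, Pause, and an asset accessor on the actor); the exact member names are hypothetical and should be checked against the plugin source:

```csharp
using UnityEngine;

// Toggles playback of whichever codec Actor (Hvr or Pat) shares this GameObject.
public class PlaybackToggle : MonoBehaviour
{
    bool playing;

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space))
        {
            // Unity's GetComponent can resolve interface types, so this works
            // for both HvrActor and PatActor via the shared IEightiActor.
            IEightiActor actor = GetComponent<IEightiActor>();
            IEightiAsset asset = actor.Asset;   // hypothetical accessor
            if (playing) asset.Pause(); else asset.Play();
            playing = !playing;
        }
    }
}
```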

HVR Components

All HVR components included in this plugin can be found within the ‘Component’ menu at the top of the Unity Editor window, or in the ‘Add Component’ menu on any GameObject.

Unity’s manual demonstrates how components are used: https://docs.unity3d.com/Manual/UsingComponents.html

Main Components

HvrActor

This component provides the ability to create and play HVR data within Unity.

It offers control over how the data is played and how it is rendered.

Parameters

Data
  • Mode – Reference: Drag and drop a file or folder from the Project window onto this slot. Any referenced data is included when a build is created.
  • Mode – Path: A direct path to a file, folder or network location. Any data specified this way will not be included in a build.
  • Play (Bool): Should the Asset this HvrActor creates start playing immediately?
  • Loop (Bool): Should the Asset this HvrActor creates loop?
  • Seek Time (Float): The time the Asset should seek to once it is created.

Renderer
  • Material (Object): The material the HvrActor will render with if using the ‘Standard’ RenderMethod.
  • Render Method – FastCubes: Renders the actor using cubes.
  • Render Method – CorrectCubes: Identical results to FastCubes, but has correct depth and z-sorting for cubes on all platforms.
  • Render Method – PointSprite: Renders the actor with screen-aligned squares.
  • Render Method – PointSpriteDepth: Only renders depth; color is ignored.
  • Render Method – PointBlend: Renders the actor with smooth points which soften the look of the actor.
  • Use Lighting (Bool): Should this HvrActor be influenced by HvrLights?
  • Receive Shadows (Bool): Should this HvrActor receive shadows?
  • Cast Shadows (Bool): Should this HvrActor cast shadows?

Options
  • Use Screenspace Quad (Bool): Should this HvrActor render using a screenspace quad? This option is exposed in order to handle cases where a ShaderSubroutine has moved points outside the bounds of the HvrActor. To reduce the amount of overdraw, it is recommended to leave this option disabled unless needed.
  • Use Occlusion Culling (Bool): Should this HvrActor use Unity’s Occlusion Culling system to check whether the object is visible when rendering? This is represented as a sphere around the HvrActor.
  • Occlusion Culling Radius Multiplier (Float): Increases the radius of the Occlusion Culling sphere.

HvrRender

In order to render HvrActors a HvrRender component must be attached to a Camera.

There are two different render modes that the HvrRender component can be set to.

Standard

This mode renders using Unity’s standard rendering loop and allows for custom materials, lighting and effects. It attempts to fit invisibly into Unity’s render loop, allowing HvrActors to be customized like any other mesh rendering.

Direct

Renders directly into the Unity framebuffer. This mode is generally faster than Standard, but comes with the trade off that custom materials, lighting and post effects are not supported.
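
Attaching HvrRender from script can be sketched as follows. This assumes HvrRender is a standard MonoBehaviour component (as its Component menu integration suggests); it is an illustrative sketch, not taken from the plugin:

```csharp
using UnityEngine;

// Ensures the main camera has a HvrRender component so HvrActors are rendered.
public class EnsureHvrRender : MonoBehaviour
{
    void Awake()
    {
        Camera cam = Camera.main;
        if (cam != null && cam.GetComponent<HvrRender>() == null)
            cam.gameObject.AddComponent<HvrRender>();
    }
}
```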

Parameters
  • Mode – Standard: Renders HvrActors within Unity’s render loop. This allows custom materials and effects to affect how a HvrActor is rendered.
  • Mode – Direct: Renders directly into Unity’s framebuffer. Using this option will ignore all custom materials and effects that are currently set on a HvrActor.

Note

ShaderSubroutine effects are compatible with both ‘Standard’ and ‘Direct’ modes.

HvrLight

HvrLight enables HvrActors to be lit by and cast shadows from Unity Lights.

Note

HvrLight only works when HvrRender’s rendering mode is set to ‘Standard’. In ‘Direct’ rendering mode, HvrActors will not be affected by this component.

Limitations
  • Directional Lights are only supported in Unity 2017.2 and greater
  • Directional Lights shadow casting is not supported in the Unity Editor’s Scene View
  • HvrLights under point light will have shadow artifacts prior to Unity 5.6
  • Some mobile devices will have limited shadow casting support

Effects

HvrActor3DMask

This component allows you to mask sections of a HvrActor.

It works by using HvrActor3DMaskObject to mark areas which should be affected by the masking.

This component stores an array of HvrActor3DMaskObject. Each HvrActor3DMaskObject is applied in its order within the array. For example, the head of a HvrActor can be isolated by creating a ‘Subtractive’ mask which covers the entire HvrActor, then creating an ‘Additive’ mask which only surrounds the head. Placing the ‘Subtractive’ mask first and the ‘Additive’ mask second will cause only the head to be visible.

Note

This effect is done in 2D. This means masking out a hand in front of a face will remove the hand, but the face will not be visible. Rotate the camera within the included example scene to see a demonstration of this.

Parameters
  • Objects (Array): An array of HvrActor3DMaskObject components.

HvrActor3DMaskObject

Used by HvrActor3DMask to mark areas to be affected when masking a HvrActor

The mask object can be either a sphere or a box. It can be additive or subtractive.

Parameters
  • Options – Sphere: Mask type will be a sphere.
  • Options – Box: Mask type will be a box.
  • Additive (Bool): Is this mask additive or subtractive?

HvrColorGrading

This component allows color grading to be applied to the rendering of a HvrActor.

Adjusting the values and sliders of this component will affect the look of the HvrActor it is attached to.

Note

The ‘Direct’ render mode available on HvrRender is not compatible with this component.

HvrShaderSubroutine

This component allows custom shaders to affect the HvrActor during the native plugin rendering step. This allows for complex effects to be written to affect the color, position and size of each voxel.

A HvrShaderSubroutine component can be thought of like Unity’s material system: a shader exposes values that can be changed, which alters how a mesh renders.

In the case of HvrShaderSubroutines, the shader subroutine code is executed as a post-process step during the native rendering of the voxels.

Multiple HvrShaderSubroutine components can be added to a HvrActor in order to apply multiple post-processing effects. The order in which each subroutine executes is based on the order of the components attached to the HvrActor’s GameObject, from top to bottom. This is known as a stack.

Note

Not all examples in the below documentation have been written for all of the supported shader languages.

You may be required to modify the shader code for your specific target platform.

Writing Shader Subroutines

Unlike Unity’s ShaderLab shaders, shader subroutines are not automatically converted to work for every BuildTarget. Shader subroutines must be written for each Graphics API that the effect needs to work on.

The current supported shader languages are GLSL, HLSL and Metal.

Graphics API Shader Language Language Reference
Direct3D11 HLSL Windows HLSL Shader Reference
OpenGLCore GLSL OpenGL Reference
OpenGLES3 GLSL OpenGLES Reference
Metal Metal Shading Language Apple Metal Shader Specification

Code Blocks

For convenience, a single shader subroutine file can contain all of the different shader languages. When the file is loaded, only the relevant sections are compiled.

BEGIN and END lines are used to specify a block of code for different languages.

In addition, each type of shader subroutine corresponds to one shader stage: either the vertex or fragment (pixel) shader stage. A code block will have a _VERTEX or _FRAGMENT suffix depending on the types of shader subroutines it contains. The available types of shader subroutines are discussed later in this page.

For example:

BEGIN_GLSL_VERTEX
    # Code
END_GLSL_VERTEX

BEGIN_GLSL_FRAGMENT
    # Code
END_GLSL_FRAGMENT

BEGIN_HLSL_VERTEX
    # Code
END_HLSL_VERTEX

BEGIN_METAL_VERTEX
    # Code
END_METAL_VERTEX

Syntax and Structure

Shader subroutines can control several properties of HvrActor rendering.

A code block consists of a collection of subroutines. Each subroutine controls one feature of rendering, such as the object-space position of the vertex, the vertex or fragment colour, or the scale factor applied to the voxel.

A subroutine is implemented as a function in the relevant shader language. The feature (position, colour, etc.) controlled by the subroutine is declared using an output semantic. Each shader subroutine may have several input parameters, which have values provided by the HVR Renderer, based on specified input semantics.

Input and output semantics are declared using an extended shader language syntax, which is similar to the way that HLSL uses semantics. The HVR Renderer parses the extended syntax, processes the semantic declarations and outputs standard shader code.

An example is given below, demonstrating modification of the vertex colour, object-space position and voxel scale. Note that the names of all subroutines need to be declared in a comma-separated list in parentheses after the code block BEGIN tag.

BEGIN_GLSL_VERTEX(CalcVertexColour, CalcVertexPosition, CalcVertexScale)

    // Control colour
    vec4 CalcVertexColour(vec4 colour : VERTEX_COLOUR, vec4 oPos : OPOS) : VERTEX_COLOUR
    {
        // Example: This makes the colour darker.
        // The 'oPos' parameter is not used in this example,
        // but could be included in the calculation, i.e. to modify
        // the colour based on the position.
        return vec4(0.5 * colour.rgb, colour.a);
    }

    // Control position
    vec4 CalcVertexPosition(vec4 oPos : OPOS) : OPOS
    {
        // Example: This moves the vertex vertically upwards.
        return vec4(oPos.x, oPos.y + 1.0, oPos.z, oPos.w);
    }

    // Control scale
    float CalcVertexScale(float scale : VOXEL_SCALE) : VOXEL_SCALE
    {
        // Example: This doubles the vertex size.
        return 2.0 * scale;
    }

END_GLSL_VERTEX

In the example above, the CalcVertexColour subroutine has VERTEX_COLOUR specified as its output semantic, so the HVR Renderer uses its return value for the output vertex colour. Its input semantics are VERTEX_COLOUR and OPOS, so the parameters corresponding to these semantics will be filled in by the HVR Renderer with the original vertex colour and position.

The other two shader subroutines (CalcVertexPosition and CalcVertexScale) work similarly.

A list of semantics and their functionality is given below. Each of these can be used as either input or output semantics. Each subroutine must be declared in the appropriate code block (VERTEX or FRAGMENT) based on the shader stage of its output semantic.

Semantic Type Shader Stage Description
OPOS vec4 / float4 Vertex Object-space coordinates of the current vertex.
VERTEX_COLOUR vec4 / float4 Vertex The colour of the current vertex.
FRAGMENT_COLOUR vec4 / float4 Fragment The colour of the current fragment (pixel).
VOXEL_SCALE float Vertex A scaling factor used to modify the size of the voxel (1.0 = original scale).

Important note: Each subroutine must be declared after the BEGIN tag in the code block header. This takes the form of a comma-separated list of function names in parentheses: for example, BEGIN_GLSL_VERTEX(CalcVertexColour, CalcVertexPosition, CalcVertexScale) in the example above. If a subroutine is not declared, it will be ignored by the HVR Renderer.

Helper or utility functions without input or output semantics should not be declared in the code block header.

Parameters

Most shader subroutines are likely to need parameters provided by the application; for example, the current time, in effects that are dynamic or animated. These correspond to uniform variables in GLSL and constant buffers in HLSL.

The name of each variable, struct or cbuffer should be prefixed by <ID> (this is discussed in the ‘Shader Subroutine Stacks’ section).

In GLSL, shader parameters can be declared as global uniform variables:

uniform float _<ID>CurrentTime;

Similarly, in HLSL, global cbuffers can be declared:

cbuffer _<ID>ShaderParams
{
    float _<ID>CurrentTime;
}

In the Metal shading language, shader inputs cannot be declared as global variables. Instead, a struct of parameters must be defined; this can then be declared as a parameter to a shader subroutine using the special semantic SHADER_UNIFORMS. For example:

struct _<ID>ShaderParams
{
    float _<ID>CurrentTime;
};

float4 CalcVertexColour(float4 colour : VERTEX_COLOUR, _<ID>ShaderParams uniforms : SHADER_UNIFORMS) : VERTEX_COLOUR
{
    return float4(colour.rgb * (sin(uniforms._<ID>CurrentTime) * 0.5 + 0.5), colour.a);
}

For Metal, textures should be declared in a separate struct and used with the SHADER_TEXTURES semantic. For example:

struct _<ID>ShaderTextures
{
    texture2d<float> _<ID>RGLookupTable;
};

float4 CalcVertexColour(float4 colour : VERTEX_COLOUR, _<ID>ShaderTextures textures : SHADER_TEXTURES) : VERTEX_COLOUR
{
    // Uses the R and G components of the original colour as texture coordinates.
    constexpr sampler textureSampler(mag_filter::linear, min_filter::linear);
    return textures._<ID>RGLookupTable.sample(textureSampler, colour.rg);
}

Stacks

In order to support shader subroutine stacks, it is required to prefix all custom parameters and methods with “<ID>” (without the quote marks).

This is necessary because, when a shader subroutine stack is created, all of the shaders in the stack are compiled into one large shader. If more than one of those shaders has a parameter with the same name, the parameter’s value cannot be set differently for each shader. For example, if two shaders both had a parameter named “colour”, trying to set each to a different value would affect both parameters in the combined shader.

In order to address this, a unique ID is generated for each file and is used when the shader subroutine stack is created. This ID is used to replace the “<ID>” prefix and ensures that each shader has unique parameter and method names.

This example demonstrates how to write a shader which is compatible with shader subroutine stacks.

BEGIN_GLSL_VERTEX(CalcVertexColour)

    uniform float _<ID>Saturation;

    float <ID>Luminance(vec3 c)
    {
        return dot(c, vec3(0.22, 0.707, 0.071));
    }

    vec4 CalcVertexColour(vec4 colour : VERTEX_COLOUR, vec4 oPos : OPOS) : VERTEX_COLOUR
    {
        float luminance = <ID>Luminance(colour.rgb);
        colour.r = mix(colour.r, luminance, _<ID>Saturation);
        colour.g = mix(colour.g, luminance, _<ID>Saturation);
        colour.b = mix(colour.b, luminance, _<ID>Saturation);
        return colour;
    }

END_GLSL_VERTEX

Examples

Example 1

Set all voxels to be blue

BEGIN_GLSL_VERTEX(SetVertexColour)
    vec4 SetVertexColour(vec4 colour : VERTEX_COLOUR, vec4 oPos : OPOS) : VERTEX_COLOUR
    {
        colour.rgb = vec3(0, 0, 1);
        return colour;
    }
END_GLSL_VERTEX

BEGIN_HLSL_VERTEX(SetVertexColour)
    float4 SetVertexColour(float4 colour : VERTEX_COLOUR, float4 oPos : OPOS) : VERTEX_COLOUR
    {
        colour.rgb = float3(0, 0, 1);
        return colour;
    }
END_HLSL_VERTEX

BEGIN_METAL_VERTEX(SetVertexColour)
    float4 SetVertexColour(float4 colour : VERTEX_COLOUR) : VERTEX_COLOUR
    {
        colour.rgb = float3(0, 0, 1);
        return colour;
    }
END_METAL_VERTEX

Example 2

Offset the position of all vertices vertically

BEGIN_GLSL_VERTEX(SetVertexPosition)
    vec4 SetVertexPosition(vec4 oPos : OPOS) : OPOS
    {
        if (oPos.y > 100)
            oPos.y += 30;
        return oPos;
    }
END_GLSL_VERTEX

BEGIN_HLSL_VERTEX(SetVertexPosition)
    float4 SetVertexPosition(float4 oPos : OPOS) : OPOS
    {
        if (oPos.y > 100)
            oPos.y += 30;
        return oPos;
    }
END_HLSL_VERTEX

BEGIN_METAL_VERTEX(SetVertexPosition)
    float4 SetVertexPosition(float4 oPos : OPOS) : OPOS
    {
        if (oPos.y > 100)
            oPos.y += 30;
        return oPos;
    }
END_METAL_VERTEX

Example 3

The following sets the color of all voxels to be blue, and sets their scale to 0 if they are below 1m in the data’s object space.

BEGIN_GLSL_VERTEX(SetVertexColour, SetVertexScale)

    float SetVertexScale(float scale : VOXEL_SCALE, vec4 oPos : OPOS) : VOXEL_SCALE
    {
        if (oPos.y < 100)
            return 0;
        return scale;
    }

    vec4 SetVertexColour(vec4 colour : VERTEX_COLOUR, vec4 oPos : OPOS) : VERTEX_COLOUR
    {
        colour.rgb = vec3(0, 0, 1);
        return colour;
    }
END_GLSL_VERTEX

BEGIN_HLSL_VERTEX(SetVertexColour, SetVertexScale)

    float SetVertexScale(float scale : VOXEL_SCALE, float4 oPos : OPOS) : VOXEL_SCALE
    {
        if (oPos.y < 100)
            return 0;
        return scale;
    }

    float4 SetVertexColour(float4 colour : VERTEX_COLOUR, float4 oPos : OPOS) : VERTEX_COLOUR
    {
        colour.rgb = float3(0, 0, 1);
        return colour;
    }

END_HLSL_VERTEX

BEGIN_METAL_VERTEX(SetVertexColour, SetVertexScale)

    float SetVertexScale(float scale : VOXEL_SCALE, float4 oPos : OPOS) : VOXEL_SCALE
    {
        if (oPos.y < 100)
            return 0;
        return scale;
    }

    float4 SetVertexColour(float4 colour : VERTEX_COLOUR) : VERTEX_COLOUR
    {
        colour.rgb = float3(0, 0, 1);
        return colour;
    }

END_METAL_VERTEX

Utilities

HvrActorTrigger

This simple component can be attached to a GameObject which has a Collider, with a HvrActor assigned as its target. When the Collider is hit by any collidable object, the HvrActor will be triggered to play.

Parameter Type Function
actor HvrActor The HvrActor which will be triggered by this component when something collides with it
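
The component’s behavior can be approximated in a few lines. This sketch assumes the target actor exposes playback via the IEightiAsset interface described earlier; the member names are illustrative, not the component’s actual source:

```csharp
using UnityEngine;

// Requires a Collider on the same GameObject, mirroring HvrActorTrigger's setup.
[RequireComponent(typeof(Collider))]
public class PlayOnCollision : MonoBehaviour
{
    public HvrActor actor;  // the actor to trigger

    void OnCollisionEnter(Collision collision)
    {
        // Hypothetical: fetch the shared playback interface and start playback
        IEightiAsset asset = actor.GetComponent<IEightiAsset>();
        asset.Play();
    }
}
```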

HvrActorAnimationSync

This component allows an Animation clip to be synced with a HvrActor’s playback.

The Animation will match the HvrActor’s current time. Syncing the HvrActor to the Animation’s time is not supported.

Parameters
Parameter Type Function
HvrActor HvrActor The HvrActor which will be used to sync the animation
TargetAnimation Animation An Animation component which will be synced to the HvrActor’s Asset

HvrActorAudioSourceSync

This component allows an AudioSource’s AudioClip to be synced to the playback of a HvrActor’s asset.

The HvrActorAudioSourceSync does not take an AudioClip itself as a value.

It expects the AudioSource to have an AudioClip assigned. Once you have assigned an AudioSource and a HvrActor, you will not need to manage this component: if the actor is playing, the audio source will play.

If the AudioClip assigned to the AudioSource is shorter than the HvrActor’s asset duration, it will play for as long as it can; if it is longer, it will stop once the HvrActor stops playing.

The AudioClip will match the HvrActor’s current time. Syncing the HvrActor to the AudioClips’s time is not supported.

Parameters
Parameter Type Function
Actor HvrActor A HvrActor which will be synced with
Audio Source AudioSource An AudioSource component which has an AudioClip assigned to it
Offset Float A time in seconds which this component will use to offset the sync time. This can be useful for situations where the audio clip start time does not match the Asset.

HvrActorProjectorShadow

This component can be used to align a Projector component based on the bounds of an HvrActor. This is useful for creating blob shadows.

Parameters
Parameter Type Function
Actor HvrActor The HvrActor which has an Asset. The bounds of this Asset will be used to drive the Projector component’s size
Projector Projector The Projector Component which will cast a blob shadow texture
Size Multiplier Float Used to increase or decrease the scale of the blob shadow texture

HvrDataBuildInclude and HvrDataReference

The HvrDataBuildInclude component and HvrDataReference can be used to include files and folders from your Project which are not assigned to HvrActors.

During the build process the 8i Unity Plugin will scan any scenes that are to be built and check their components to find any HvrActors which are using the ‘Reference’ Data Mode. It will also search for HvrDataBuildInclude components which have a HvrDataReference assigned. If any are found, the referenced data will be copied to the build folder so the built application can load it.

Using this component is helpful when your project creates HvrActors at runtime and you do not want to manually copy data from your project to your build.

Note

When building for Android the build process described above is not automatic and requires a custom build process. Please see the Android Platform page for more information about this.

HvrDataBuildInclude

The HvrDataBuildInclude is a component which a HvrDataReference can be assigned to.

To use this component, attach it to a GameObject within your scene and assign a HvrDataReference to the ‘Data Reference’ slot.

Parameter | Type | Function
Data Reference | Object | An Object slot which a HvrDataReference can be assigned to
HvrDataReference

The HvrDataReference contains a list of references to files or folders in your project.

It can be created either from the ‘Assets’ menu at the top of the screen, or from the ‘Create’ context menu when right-clicking in your Project window, by navigating to ‘Create/8i/HvrDataReference’.

To add a reference to the List

  • Click the ‘Plus’ icon in the bottom right.
  • Drag and Drop a file or folder from your project into the ‘Data’ slot OR click the circle icon to open a window to search your entire project.

Because the references are created using Unity’s project GUID system, moving referenced files and folders will not break the reference.

Parameter | Type | Function
Data | List | An expandable list of GUID references to files or folders in the project

PAT Components

What is PAT

PAT is the latest codec developed by 8i. It is a mesh-based codec, in which the final asset output is represented as a sequence of textured meshes.

Output from PAT

The PAT codec produces several outputs. It’s important to understand that the outputs are simply a different encapsulation around the same output data. The available outputs are:

  • MPD - This format extends the well known MPEG-DASH and HLS standards. This format should be used for all applications that require streaming and playback that is longer than a few seconds. More info about an MPD structure can be found in the MPD structure document
  • glTF - This output can be used in standard editors/players and social platforms. glTF is a widely adopted format, particularly suitable for the web. However, it performs poorly for longer content where progressive streaming is required.
  • FBX - This output is primarily used for cases where the output needs to be edited using tools like Maya. After editing, this format can be converted to any of the other formats.

Difference from HVR

The main difference between HVR and PAT is how the underlying 3D asset is represented. HVR represents the 3D asset as a point cloud, whereas PAT represents it as a textured mesh. Effectively, PAT decouples the mesh (geometry) resolution from the texture (color) resolution, allowing us to scale up the asset’s perceived resolution with a smaller increase in overall asset size. To illustrate, consider an actor with a heavily textured shirt. The underlying geometry of the shirt is fairly simple, allowing us to save a lot of geometry bandwidth. Most of the complexity is on the texture side, which we can encode very effectively using standard video codecs. To represent the same detail with HVR, we would need a very high density of points, causing the file size to explode.

PAT usage with Unity

HVR uses a point cloud to represent the data. Point clouds are less standard, so Unity’s standard rendering will not produce the best output for the user. As a result, to produce the best quality render, the HVR plugin takes over the rendering of the asset. This internal rendering takes away some of the feature set that a developer would expect of a 3D asset in Unity. In contrast, PAT simply delivers a mesh and a texture per frame, giving control over rendering, 3D asset manipulation, and effects back to the developer.

The easiest way to instantiate a PAT playback object is via the example PatActor prefab.
There are example PAT scenes in 8i/examples/scenes that provide the best starting point.

Main Script Components

PatActor

This component provides the ability to create and play PAT data within Unity.
It takes in the source asset URL and contains linkage to the PatAsset and the shared IEightiAsset interface.
It is used as the source for PatRender and PatAudioRender scripts.
NOTE: Changing the below parameters must happen before the asset URL is set. Otherwise, the changes are ignored.
  • encryptionKey, encryptionKid: This pair should be set together when necessary. The fields should remain null when unused.
  • enableTextureHwDecode: Controls hardware texture decode enablement (Android only), hardware decode is enabled by default.
  • enableAudio: Controls audio playback enablement, enabled by default.
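
The ordering rule above can be sketched as a MonoBehaviour: configure the parameters first, then set the asset URL. PatActor, its parameter names, and SetAssetDataUrl are taken from this documentation; the sample URL and exact member casing are assumptions:

```csharp
using UnityEngine;

public class PatActorSetup : MonoBehaviour
{
    void Start()
    {
        var actor = gameObject.AddComponent<PatActor>();

        // Configure BEFORE setting the asset URL; later changes are ignored.
        actor.enableTextureHwDecode = true; // Android only; enabled by default
        actor.enableAudio = true;           // enabled by default
        actor.encryptionKey = null;         // set together with encryptionKid
        actor.encryptionKid = null;         // only for encrypted content

        // Setting the URL locks in the configuration above.
        actor.SetAssetDataUrl("https://example.com/asset/manifest.mpd"); // hypothetical URL
    }
}
```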

PatAsset

This is the main API for configuration and control of PAT playback.
It is a superset of the shared IEightiAsset interface, providing access to PAT specific API.
This is not an exhaustive API listing, but the following are of special note:
  • EnableAudio
    Controls audio playback enablement for the asset. This will automatically be called during PatActor.SetAssetDataUrl using the Unity project’s output sample rate.
    If used manually, it needs to be called prior to Create()
  • GetRepresentations

    This allows for enumeration of available PatPlayerRepresentation objects per audio/video/mesh type. Primarily, video representation is used to provide different levels of bandwidth/quality.

  • SelectRepresentation
    • Once enumerated, this API can be used on an initialized (Create() called) asset to switch to a particular representation.
    • If the switched representation is of the same fps, the switch will happen once already downloaded and cached data is used up (a few seconds). Otherwise, in the case of a different framerate, the switch happens immediately.
    • There is a special value of -1 for bitrate that can be used to set automatic adaptive bitrate switching on the specified type. This would currently only need to be set on the video representation.
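
The enumeration and switching flow might be sketched as follows. GetRepresentations, SelectRepresentation, the -1 adaptive sentinel, and the PatPlayerRepresentation properties are from this documentation; the PatStreamType enum and the Asset accessor are assumed names:

```csharp
using UnityEngine;

public static class PatRepresentationHelper
{
    // Logs the available video representations of an initialized asset
    // (Create() already called) and enables adaptive bitrate switching.
    public static void EnableAdaptiveVideo(PatActor actor)
    {
        var asset = actor.Asset; // hypothetical accessor for the PatAsset

        foreach (var rep in asset.GetRepresentations(PatStreamType.Video))
            Debug.Log($"id={rep.id} bitrate={rep.bitrate} fps={rep.fps}");

        // A bitrate of -1 requests automatic adaptive switching on this type.
        asset.SelectRepresentation(PatStreamType.Video, -1);
    }
}
```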

PatAudioRender

This component is responsible for the audio rendering within Unity. It is assigned a PatActor to use as its render source.
Once assigned to a particular actor, this runs automatically based on audio availability coming from the PatAsset.
If used outside of the PatActor prefab, the component requires that a Unity AudioSource is attached to the same object.

PatDownloader

This is a utility object used for the background download of Pat assets, signaling via its callback when complete or interrupted.
It should be used once per asset download, as the parameters are specified at construction time.
  • DownloadParams:
    • Uri: The web hosted URL that would otherwise be used in direct streaming
    • LocalDir: Local OS folder path to store the downloaded asset. Where appropriate on mobile platforms, permissions may need to be requested and granted by the user.
    • Quality: Toggle to download the highest, medium or lowest quality content representation. Usually meaning 2048, 1024 and 512 texture resolutions respectively.
    • EncryptionKey: Set to encryption key as needed, otherwise should be null.
    • EncryptionKid: Set to encryption kid as needed, otherwise should be null.
  • StartDownload:
    Begins the download, returns failure if the download is already in progress
  • StopDownload:
    Interrupts an ongoing download. The callback will return failure
  • OnAssetDownloaded callback:
    The callback is executed on a background thread, so much of the Unity API is unavailable. The implementation should simply take the local device path and pass it off to be used for PatAsset creation on the main thread.
    • success: Denotes whether the download completed successfully
    • newUri: On successful download, the returned path to the local manifest file which can be used as the input to PatActor.SetAssetDataUrl()
  • OnDownloadProgress callback:
    The callback is executed on a background thread, so much of the Unity API is unavailable. Returns the latest download progress percentage
    • progressPercentage: progress percentage value, 0-100
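
Putting the pieces above together, a download flow might look like this sketch. The DownloadParams fields and callback parameters follow the lists above, but the constructor shape and callback wiring are assumptions. Since the callbacks run on a background thread, the result is handed to the main thread before any Unity API is touched:

```csharp
using UnityEngine;

public class PatDownloadExample : MonoBehaviour
{
    PatDownloader mDownloader;
    volatile string mDownloadedUri; // handed from the callback to the main thread

    void Start()
    {
        var downloadParams = new PatDownloader.DownloadParams
        {
            Uri = "https://example.com/asset/manifest.mpd", // hypothetical URL
            LocalDir = Application.persistentDataPath,
            Quality = PatDownloader.Quality.Highest, // assumed enum name
            EncryptionKey = null,
            EncryptionKid = null,
        };
        mDownloader = new PatDownloader(downloadParams, OnAssetDownloaded, OnDownloadProgress);
        mDownloader.StartDownload();
    }

    // Background thread: record the result only, no Unity API calls here.
    void OnAssetDownloaded(bool success, string newUri)
    {
        if (success)
            mDownloadedUri = newUri;
    }

    void OnDownloadProgress(int progressPercentage) { }

    void Update()
    {
        if (mDownloadedUri != null)
        {
            // Main thread: safe to hand the local manifest path to the actor.
            GetComponent<PatActor>().SetAssetDataUrl(mDownloadedUri);
            mDownloadedUri = null;
        }
    }
}
```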

PatLiveManager

This component helps to connect a PatActor to Asgard live streams.
NOTE: You must have your own Asgard live stream URL. As testing requires a live stage to operate, there is no sample URL provided.
It takes in the Asgard live URL and monitors whether the stream is active or not.
When the stream becomes active, it will pull the current live MPD URL, assign it to the PatActor, and begin playing.
The manager does not stop playback when the stream becomes inactive. Since there is a delay between playback and the stream, this allows any remaining stream data to finish playing.
The Manager has a callback for when the stream status changes.
Parameters
  • AsgardLiveShowUrl: The URL to monitor for live stream status changes
  • CheckStatePeriod: How often to check the URL when the stream is inactive
  • onStreamActiveChanged: Callback invoked when the stream status changes
  • Active: Enable to allow the Manager to make network requests and begin playback as needed.
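
A minimal wiring sketch for these parameters; the member types (string URL, float seconds, a C# event for the callback) are assumptions:

```csharp
using UnityEngine;

public class LiveSetup : MonoBehaviour
{
    void Start()
    {
        var live = gameObject.AddComponent<PatLiveManager>();
        live.AsgardLiveShowUrl = "https://example.com/live/show"; // your own Asgard URL
        live.CheckStatePeriod = 5.0f; // seconds between polls while inactive
        live.onStreamActiveChanged += active => Debug.Log("Stream active: " + active);
        live.Active = true; // allow network requests and begin playback as needed
    }
}
```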

PatPlayerRepresentation

These objects are enumerated via the PatAsset interface, for each sub-stream of the given asset (video/mesh/audio).

Properties
  • bitrate: The maximum per-second bitrate of the sub-stream, meant to be used in PatAsset.SelectRepresentation
  • id: string describing a given representation
  • fps: framerate of the representation, when multiple framerates are present

PatRender

This component is responsible for the actual rendering within Unity. It is assigned a PatActor to use as its render source.
Once assigned to a particular actor, this runs automatically based on frame availability coming from the PatAsset.
If used outside of the PatActor prefab, the component requires that a Unity MeshFilter and MeshRenderer are attached to the same object, along with the provided PatPlayerMaterial and PatPlayerShader. The latter can be modified to provide custom rendering of the underlying texture/mesh data if desired.

Examples

PatStreamPlayer

This is an example script used by our sample scenes to instantiate a PatActor prefab and correctly initialize it to play.
Note that it runs via Unity’s Awake and Start functions, so in the case of a download, the engine will be frozen until the download completes.
Parameters
AssetUrl | URL to the cloud hosted manifest
PatActorPrefab | Should be set to the provided PatActor.prefab
DownloadAsset | Controls whether the asset should first be downloaded, or streamed directly from the web
OnTriggerSeek | On touch or space key, trigger seek to half of current playout time
OnTriggerRepSwitch | On touch or space key, trigger cycling to the next video representation (streaming only)

PatLiveAsgardPlayer

This is an example script used by our sample scene to instantiate a PatActor, set up a PatLiveManager and correctly initialize it to play.
This example demonstrates how to achieve continuous playback across multiple live stream sessions.
It uses the StreamActive state and assetInterface properties to stop playback once all the data is played through.
As long as the mLiveManager is kept Active, the example will continuously play across live stream sessions.
Parameters
  • AsgardUrl: The URL to monitor for live stream status changes, which is passed in to PatLiveManager
  • PatActorPrefab: Should be set to the provided PatActor.prefab

PAT with webplayer and 8thwall

PAT MPD assets can be streamed and played on the browser. Additional information about usage of the web player can be found here: https://8i.github.io/embeddable_webplayer/#/

Troubleshooting

Cannot create HVR components

  • Check that the plugin was fully extracted from the ‘8i Unity Plugin’ zip into your Project’s Asset folder
  • Check the Recommended Project Settings window for help ( Found under the ‘8i/Recommended Project Settings’ menu at the top of the Unity Editor )
  • If you recently updated the plugin, make sure you closed the Unity Editor before adding the new plugin folder to your project. The Unity Editor can lock the native binaries included in the plugin and block you from writing over them.
  • Make sure the Unity version is compatible with this version of the plugin.
  • Check the console to see whether there are any errors blocking Unity from compiling the plugin.

Performance is low

Use lower resolution data

It is recommended to use this table when choosing which resolution to use for your application.

Resolution | PC | Mobile | About
1,500,000 | Y | N | Not recommended for mobile
800,000 | Y | Y | High-end mobile device required
600,000 | Y | Y |
400,000 | Y | Y |

The filename of the hvr file will tell you the resolution. This number represents the density of the data and the overall quality.

The naming convention is [NAME]_[Resolution]_[FrameNumber].[Extension]

Most files follow this naming convention. If you are not sure, please send an email to your contact at 8i.

E.g. BOXER_200_tc_v01_800000_001009.hvr breaks down as:

Name | BOXER_200_tc_v01
Resolution | 800,000
Frame number | 1009
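
The naming convention can be split mechanically. A small helper, assuming the last two underscore-separated fields are always the resolution and frame number:

```csharp
using System;

public static class HvrFilename
{
    // Splits "[NAME]_[Resolution]_[FrameNumber].[Extension]" into its parts.
    public static (string name, int resolution, int frame) Parse(string filename)
    {
        string stem = filename.Substring(0, filename.LastIndexOf('.'));
        string[] parts = stem.Split('_');
        int resolution = int.Parse(parts[parts.Length - 2]);
        int frame = int.Parse(parts[parts.Length - 1]);
        string name = string.Join("_", parts, 0, parts.Length - 2);
        return (name, resolution, frame);
    }
}

// HvrFilename.Parse("BOXER_200_tc_v01_800000_001009.hvr")
//   → ("BOXER_200_tc_v01", 800000, 1009)
```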

Change the HvrActor render method

It is recommended to use the ‘FastCubes’ render method in order to improve performance.

Change the HvrRender render mode

The ‘Direct’ render mode provides the best performance and memory usage.

There are downsides of this method which are outlined in the HvrRender section of this documentation.

Mobile Performance Tips

  • Point Count

    As mobile platforms perform much slower than desktop systems it is recommended that hvr frames with point counts of 600k or less are used, with the recommended point count being around 300k.

  • Rendering Settings

    It is recommended to use the ‘Direct’ HvrRender render mode on Android as it is the best performing renderer.

  • Render Method

    It is recommended to use the ‘Point Sprite’ render method in all cases. It is the best performing render method provided.

    The current alternative, ‘Point Blend’, does not work on some older devices and is around twice as expensive to render.

HVR Actors are not rendering

Common Problems

  • The Main Camera does not have a HvrRender component attached
    A common mistake is to not attach a HvrRender component to the main rendering camera in the scene
  • My HvrActor Prefabs are not rendering
    Please see the HvrDataBuildInclude component page for more information about this problem.

Android

  • The Graphics API may not be supported

    Make sure that under Player Settings the targeted Graphics API is GLES3 and that your device supports GLES3.

  • HvrActors using PointBlend are not rendering

    The PointBlend render method does not work on some older mobile devices.

  • HvrRender fails to load Standard shaders

    There is a known issue on some devices where, when the ‘Split Application Binary’ option is enabled, the HvrRender component may not be able to load the Standard shaders. Go to ‘Edit/Project Settings/Player’ and make sure that the ‘Split Application Binary’ option is not checked.

  • Issues with Google Daydream builds

    The current release of the Unity Editor for Daydream ( 5.4.2f2-GVR13 ) will not use the AndroidManifest.xml file that is provided within the plugin. This means the hvr frames will not be extracted or read from the device’s storage after installing the build.

    It appears that this build of the Unity Editor has a bug where it will not use an AndroidManifest.xml file that is not located at this specific location: project_name/Assets/Plugins/Android/AndroidManifest.xml

    Until this is fixed within Unity, it is recommended to copy the AndroidManifest.xml file from 8i/core/platform support/android/plugins/AndroidManifest.xml to project_name/Assets/Plugins/Android/AndroidManifest.xml

  • Black screen when loading the app
    • Project was not built using the Custom Build Menu
      If your project is using the “HvrUnpackingScene”, it is required to create your build using the Custom Build Menu mentioned above.
    • Project using the “HvrUnpackingScene” and was built using the custom build menu
      Check the Editor Build Settings and ensure that there is a second scene directly after the “HvrUnpackingScene”.
  • Android failing to extract data from OBB file

    Some devices do not correctly copy the OBB file to the device when using the “Build and Run” option in Unity, and in some cases will silently fail to update the OBB when the project is built. If this occurs, the OBB file will need to be manually copied to the development device.

    So far only the Samsung Galaxy Note 5 has been observed with this issue.

Tutorials

AR

In this section, we will go through two AR solutions provided by Apple on iOS, and Google on Android, namely ARKit and ARCore.

ARKit Tutorial

Prerequisites
  • iOS devices that support ARKit and Metal and run iOS 11.3 or later, see here,
    • ARKit 2.0 requires a device with iOS 12 installed
  • A Mac running the latest macOS
  • Unity 2017.1 or later
  • Xcode 9.3 or later
    • Command line tools installed, see here.
    • ARKit 2.0 requires Xcode 10 or later
  • Get a high level idea of how AR works on iOS from Apple’s documentation.
Getting Unity ARKit plugin

Go to https://bitbucket.org/Unity-Technologies/unity-arkit-plugin and download the official Unity ARKit plugin. You might need to install Mercurial or SourceTree to grab the source.

Switch to the 1.5.1 tag if you want it to run on iOS 11 devices. Here’s how to do it in SourceTree; the steps may differ in other Mercurial clients:

_images/switch-to-1.5.1.png
Importing 8i Unity Plugin

Within the downloaded project, extract 8i Unity Plugin into the Asset folder, as stated in Quick Start section.

You should have the directory structure like this:

_images/unity-arkit-plugin-with-8i.png
Configure the Unity Project

Open the project in Unity

If you are prompted to upgrade the Unity version, click yes.

Select File > Build Settings, a build dialogue should come up. In Platform choose iOS and click Switch Platform button.

Make sure the Platform is switched to iOS, and choose one of the scenes to include in the build. In this tutorial, we checked the simplest scene, UnityARBallz.

_images/switch-to-platform-ios.png

Still in Build Settings dialogue, click Player Settings… button.

A PlayerSettings inspector should appear.

Note

It is required to disable Metal Editor Support because the 8i Unity Plugin does not yet support Metal on macOS.

Make sure Metal is listed as the first in Graphics APIs and disable Metal Editor Support

_images/turn-off-metal-editor.png
Your First 8i Hologram

For this tutorial, we will edit the UnityARBallZ scene from Unity ARKit plugin’s example.

We will change the original AR object to 8i’s hologram, so that you can place a human hologram onto the augmented world.

To open the scene, find the scene in project and double click the scene.

_images/open-unityarballz.png

You should be able to see something like this in Scene view:

_images/seeming-empty-scene.png

This scene is a barebones template of an AR app; all it does is automatically detect the environment, track the movement of the device, and tie it to the virtual Camera object.

Select menu GameObject > 8i > HvrActor. This will create a GameObject with a HvrActor component attached to it.

_images/create-8i-hvractor.png

Select the newly created HvrActor object.

There are a few options to note but for now we will just focus on the Asset/Data/Reference field.

This is the data source that 8i’s hologram engine will read from. As you can see, right now it’s empty. To specify a valid file reference, we can go to folder 8i/examples/assets/hvr, and find “president” folder:

_images/inspector-hvractor.png _images/where-is-president.png

Drag this “president” folder to Asset/Data/Reference field in Inspector panel. To make things even simpler, uncheck the Rendering/Lighting/Use Lighting checkbox:

_images/inspector-hvractor-president.png

You should be able to see the hologram has already been shown in the Scene view:

_images/sceneview-president.png
Making A Prefab

Because we want the user to be able to place the hologram wherever they touch the ground, we need to wrap this HvrActor object into a “prefab” and let our ARKit code know to use it.

Note

Prefabs are an important concept in Unity.

With HvrActor selected, drag the HvrActor object down to a folder in the Project window. Unity will automatically create a prefab for you, and you will see the name of HvrActor turn blue:

_images/drag-to-make-prefab.png

To change the ARKit code to spawn HvrActors instead of balls, find the BallMaker object in the scene and select it. Drag the newly created HvrActor prefab to BallMaker’s Inspector panel, replacing BallPrefab with HvrActor:

_images/replace-ballmaker-with-hvractor.png

Because we have stored the HvrActor in a prefab, it is now safe to delete the HvrActor in the scene. Go to the Hierarchy, right click on HvrActor, which should now have its name in blue, and choose “Delete”.

_images/delete-template-hvractor.png

Save the scene by pressing Cmd+S.

Camera Configuration

Next we need to configure the camera to let it render 8i’s hologram.

Note

This step is required or else you will only be able to view the hologram within the Unity Editor

Find the camera object in Hierarchy > CameraParent > Main Camera and select it.

_images/hierarchy-camera.png

With the Main Camera selected, in the menu choose Component > 8i > HvrRender. This should add a HvrRender component to the camera:

_images/main-camera-hvrrender.png

Save the scene by pressing Cmd+S.

Include HVR Data

Before we can build the project, there’s an extra step. Because we are using a prefab, the data will be dynamically loaded, so we need to explicitly tell Unity to include it before exporting.

First, right click in the Project window and create an asset of type HvrDataReference. You can do this through Create > 8i > HvrDataReference.

_images/create-datareference-asset.png

After creation, select the asset. Drag the president folder to its data field.

_images/drag-hvr-to-datareference.png

Now that we have created and configured the asset on disk, we need to include it in our scene. Right click in the Hierarchy window and create an empty GameObject.

_images/create-empty.png

With the empty object selected, attach a component of type HvrDataBuildInclude. You can find it in Component > 8i > HvrDataBuildInclude.

_images/attach-databuildinclude.png

Finally drag the configured HvrDataReference asset to Data Reference field.

_images/assign-data-reference.png

Save the scene.

Export and Build

That’s it! It’s time to export Xcode project and deploy it to the device.

  • From the menu File > Build Settings, click Player Settings and make sure Metal is listed first in the Inspector window.

  • Click Build and select a folder to export the project. If everything went smoothly, a Finder window should pop up showing the exported Xcode project.

  • Double click Unity-iPhone.xcodeproj and this should bring up Xcode.

  • Configure Xcode project as follows. You need to pay attention to code signing if you are new to it.

  • After configuration, hit run:

    _images/xcode-settings.png
  • Once the build is deployed and running, pick up your phone and walk around until magenta ground is shown, which marks where you can place your holograms.

  • Tap the magenta ground to see how the hologram works within the AR world.

Where to go from now on

ARCore Tutorial

Prerequisites
  • A walkthrough of Google ARCore Quickstart,
  • An ARCore enabled Android device
  • Unity 2017.4.9f1 or later with Android Build Support
    • Note Unity 2018.2 has a known bug with Android render targets, so it is not supported. See bug tracking here.
  • Android SDK 7.0 (API Level 24) or later
  • Accept all the licenses from Android SDK. For example, if you are on macOS:
    • $ cd ~/Library/Android/sdk/tools/bin
    • $ ./sdkmanager --licenses
    • Accept all the licenses.
Getting ARCore SDK for Unity

Download ARCore SDK for Unity 1.4.0 or later

Open Unity and create a new empty 3D project

_images/android-new-project.png

Select Assets > Import Package > Custom Package, use the downloaded arcore-unity-sdk-v1.4.0.unitypackage from your disk.

_images/android-import-package.png

In the import dialogue, make sure everything is selected and click Import.

_images/android-import-package-dialogue.png

Accept any API upgrades if prompted.

Importing 8i Unity Plugin

Within the newly created project, extract 8i Unity Plugin into the Asset folder, as stated in Quick Start section.

You should have the directory structure like this:

_images/android-after-8i-project-structure.png

Fix any warnings that pop up in the 8i Project Tips window, including Android Unpack Scene.

_images/android-project-tips-warning.png
Configure the Unity Project

Open the project in Unity

If you are prompted to upgrade the Unity version, click yes.

Open scene HelloAR by double clicking Assets/GoogleARCore/Examples/HelloAR/Scenes/HelloAR

_images/android-helloar-scene.png

Select File > Build Settings; a build dialogue should come up. Click the Player Settings… button. A PlayerSettings inspector should appear. In the Inspector window, find Other Settings - Metal Editor Support and uncheck it. This is important for previewing 8i’s hologram content in Unity.

_images/android-build-settings-other-settings-metal-editor.png

Still in Build Settings dialogue, in Platform choose Android and click Switch Platform button.

Make sure the Platform is switched to Android, and make sure the HelloAR scene is ticked by using Add Open Scenes.

_images/android-build-settings.png

Still in Build Settings dialogue, click Player Settings… button. A PlayerSettings inspector should appear. In the Inspector window, a few fields need to be configured:

  • Other Settings - Package Name: set to a reverse-DNS style name, e.g. com.yourcompany.arsample

  • Other Settings - Uncheck Auto Graphics API and explicitly set OpenGL ES 3 as the graphics API

  • Other Settings - Multithreaded Rendering: uncheck

  • Other Settings - Minimum API Level: set to Android 7.0 or higher. Note you need to have the right version of the Android SDK installed and configured in Unity > Preferences.

  • Other Settings - Target API Level: set to Android 7.0 or higher. Note you need to have the right version of the Android SDK installed and configured in Unity > Preferences.

    _images/android-build-settings-other-settings.png
  • XR Settings - ARCore Supported: tick on

    _images/android-build-settings-xr-settings.png
Your First 8i Hologram

For this tutorial, we will edit the HelloAR scene from Google ARCore SDK for Unity’s example.

We will change the original AR object to 8i’s hologram, so that you can place a human hologram onto the augmented world.

To open the scene, find the scene in project and double click the scene.

Select menu GameObject > 8i > HvrActor. This will create a GameObject with a HvrActor component attached to it.

_images/android-creator-hvractor.png

Select the newly created HvrActor object.

There are a few options to note but for now we will just focus on the Asset/Data/Reference field.

This is the data source that 8i’s hologram engine will read from. As you can see, right now it’s empty. To specify a valid file reference, we can go to folder 8i/examples/assets/hvr, and find “president” folder:

_images/inspector-hvractor.png _images/android-where-is-president.png

Drag this “president” folder to Asset/Data/Reference field in Inspector panel. To make things even simpler, uncheck the Rendering/Lighting/Use Lighting checkbox:

_images/inspector-hvractor-president.png

You should be able to see the hologram has already been shown in the Scene view:

_images/android-sceneview-president.png
Making A Prefab

Because we want the user to be able to place the hologram wherever they touch the ground, we need to wrap this HvrActor object into a “prefab” and let our ARCore code know to use it.

Note

Prefabs are an important concept in Unity.

With HvrActor selected, drag the HvrActor object down to a folder in the Project window. Unity will automatically create a prefab for you, and you will see the name of HvrActor turn blue:

_images/android-make-prefab.png

To change the HelloAR scene to spawn HvrActors instead of Andy Android, find the Example Controller object in the scene and select it. Drag the newly created HvrActor prefab to Example Controller’s Inspector panel, replacing Andy Plane Prefab and Andy Point Prefab with HvrActor:

_images/android-specify-prefab.png

Because we have stored the HvrActor in a prefab, it is now safe to delete the HvrActor in the scene. Go to the Hierarchy, right click on HvrActor, which should now have its name in blue, and choose “Delete”.

_images/android-delete-hvractor.png

Save the scene by pressing Cmd+S.

Camera Configuration

Next we need to configure the camera to let it render 8i’s hologram.

Note

This step is required or else you will only be able to view the hologram within the Unity Editor

Find the camera object in Hierarchy > ARCore Device > First Person Camera and select it.

_images/android-first-person-camera.png

With the First Person Camera selected, in the menu choose Component > 8i > HvrRender. This should add a HvrRender component to the camera:

_images/android-attach-hvrrender.png

Save the scene by pressing Cmd+S.

Include HVR Data

Before we can build the project, there’s an extra step. Because we are using a prefab, the data will be dynamically loaded, so we need to explicitly tell Unity to include it before exporting.

First, right click in the Project window and create an asset of type HvrDataReference. You can do this through Create > 8i > HvrDataReference.

_images/create-datareference-asset.png

After creation, select the asset. Drag the president folder to its data field.

_images/android-specify-datareference.png

Now that we have created and configured the asset on disk, we need to include it in our scene. Right click in the Hierarchy window and create an empty GameObject.

_images/android-create-empty.png

With the empty object selected, attach a component of type HvrDataBuildInclude. You can find it in Component > 8i > HvrDataBuildInclude.

_images/android-attach-hvrdatabuildinclude.png

Drag the configured HvrDataReference asset to Data Reference field.

_images/android-drag-hvrdatareference.png

Finally, choose 8i > Android > Prepare Build from the menu and click OK if a dialogue prompts. This will prepare and bake the content so it is ready to be submitted to the Android device. Note this Android-specific step must be repeated whenever the dynamically loaded 8i content changes; you don’t have to repeat it if no 8i content changed between builds.

_images/android-prepare-build.png

Save the scene.

Export and Build

That’s it! It’s time to build an APK and deploy it to the device.

  • Connect your Android phone to your development machine

  • Enable developer options and USB debugging on your Android phone. This should be done just once.

  • Menu File > Build Settings, click Player Settings.

  • Click Build And Run and select a folder to export the APK. If everything went smoothly, you should see the APK get exported and automatically deployed to the device.

    _images/android-build-and-run.png
  • Once the build is up and running, pick up your phone and walk around until magenta ground is shown, which marks where you can place your holograms.

  • Tap the white grid ground or blue dots to see how the hologram works within the AR world.

Where to go from now on

Technical Support

Slack

Please use the dedicated Slack channel given along with your purchase.