Welcome to the 8i Unity Plugin Docs.
The 8i Unity Plugin allows developers to add 8i’s fully volumetric video to any Unity experience. It’s like embedding traditional 2D video content, except you get to walk around ours as it plays!
Download the 8i Unity Plugin. You should receive a download link from 8i after purchasing Stage, or you can ask your 8i contact for one.
Extract the ‘8i’ folder from the zip file into your project’s Asset directory.
Close the Unity Editor and delete the ‘8i’ folder from your project.
It is required to fully close the Unity Editor, as the Editor locks the native DLLs and prevents them from being overwritten or deleted.
Supported Unity Versions
Warning
Unity 2018.2 is currently not compatible when building for Android. This is a bug within Unity which will be fixed in an upcoming version. The bug manifests as the HvrActor rendering over a black background, with the Unity scene not visible.
Note
For more details on each platform see the individual pages in Platform Support
Download the 8i Unity Plugin. You should receive a download link from 8i after purchasing Stage, or you can ask your 8i contact for one.
Once downloaded, extract the ‘8i’ folder from the zip file into your project’s Asset directory.
Before beginning, ensure that the HvrActor, HvrRender or HvrLight components are available through the Unity Component Menus. If they are not, check the installation instructions or the Troubleshooting section of this documentation.
The HvrActor will now render in both the Scene and Game views.
The easiest method to create a PAT playback object is to instantiate the PatActor prefab located under 8i/examples/assets/prefabs. The prefab puts together the needed PatActor, PatRender, and PatAudioRender components, and is properly scaled to match the expected Unity unit scale. The prefab still requires some in-code linking of the components; a full example scene is located at 8i/examples/scenes/Example-PAT-streaming.unity
It is important to note that the PatStreamPlayer script is a good base example of the usage, but does not utilize every component of the PAT Unity API. See the Components page for additional information on the Pat components and more advanced usage of the PatAsset interface for things like manual and automatic bitrate switching.
Unity will now compile the project and create a build.
If it does not, check the Unity Editor console or the Troubleshooting section of this documentation.
The 8i Unity Plugin fully supports VR rendering.
To enable VR for your Unity Project
If you encounter any issues getting VR working in your Unity Project, please consult the Unity Manual [Link to Unity Docs]
Note: Android-based VR headsets like Google Daydream and Oculus Go are also supported. You may want to consult their own documentation on how to build an app in Unity.
Architecture |
Graphics API |
CPU |
GPU |
RAM |
OS |
Min Spec |
In order to include HVR data with your build, a custom build step must be used. This step will scan the scenes in your project and copy any required HVR data into the project’s StreamingAssets folder.
This menu can be found under the 8i/Android drop down menu at the top of the Unity Editor window.
In order to load HVR Data on Android, the data must be available on the device’s local storage in a folder with read access.
This section can be ignored if your project does not need to load data off local storage, or if you use another way of bundling and extracting HVR data.
When creating a build, Unity will by default package any files placed within your project’s StreamingAssets folder into a compressed file. This compressed file cannot be read by the 8i Unity Plugin. However, the plugin provides support for extracting HVR and HVRS data from this file onto the device’s local storage.
This feature is available through the “HvrUnpackingScene” scene included with the plugin.
“HvrUnpackingScene” uses the “LoadingSceneManager” component to unpack the data from the built APK/OBB file. When opened, this scene will scan the running build and extract any files that need to be exported. Once complete, it will automatically load the next scene. This component can be easily modified if you want to customize how the data is unpacked for your project.
This scene can be found at: “../8i/core/interface/platforms/android/scenes/”.
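Conceptually, the unpacking step boils down to copying each HVR file out of the bundled build onto local storage, skipping files that are already present. The sketch below is illustrative Python under assumed behaviour; the real LoadingSceneManager is a Unity C# component that reads from inside the APK/OBB rather than a plain folder, and the function name and paths here are hypothetical.

```python
import os
import shutil

def unpack_hvr_data(source_dir, dest_dir, extensions=(".hvr", ".hvrs")):
    """Copy HVR data files to local storage if missing or out of date.

    Illustrative sketch only: the actual component extracts from the
    compressed APK/OBB, not from a plain directory tree.
    """
    extracted = []
    for root, _dirs, files in os.walk(source_dir):
        for name in files:
            if not name.lower().endswith(extensions):
                continue
            src = os.path.join(root, name)
            rel = os.path.relpath(src, source_dir)
            dst = os.path.join(dest_dir, rel)
            # Skip files that already exist with the same size.
            if os.path.exists(dst) and os.path.getsize(dst) == os.path.getsize(src):
                continue
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.copy2(src, dst)
            extracted.append(rel)
    return extracted
```

A second run over the same folders copies nothing, which is the behaviour you want when the scene runs on every app launch.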
Note
It is strongly recommended to enable ‘Split Application Binary’ (Read More) to avoid the size limit imposed by Google Play. Packing everything into one APK is allowed but not recommended.
The Google Play Store imposes a size limit of 100MB for APK files uploaded for distribution. To allow larger data to be shipped with an APK, Android supports OBB files, also known as Application Expansion Files (Expansion Files).
Unity also supports these files, via a player setting called “Split Application Binary”. This option will export many of the project resources to an OBB, rather than packaging them with the APK. (Split Application Binary)
Starting August 1, 2019, Google Play requires published apps to support 64-bit architectures. This means developers must choose a Unity version that supports Android 64-bit builds. Only Unity 2018.3 and newer have good ARM64 support (as an exception, 2017.4 back-ported ARM64 support), so you should stick with those versions.
To build a 64-bit app, you also need to change the Unity scripting backend from Mono to IL2CPP. See here for more information.
Architecture |
Graphics API |
CPU |
GPU |
RAM |
OS |
Min Spec |
There are no special required steps when creating a build. The project can be built using the default Unity Build menu.
Architecture |
Graphics API |
CPU |
GPU |
RAM |
OS |
Min Spec |
There are no special required steps when creating a build. The project can be built using the default Unity Build menu.
Architecture |
Graphics API |
CPU |
GPU |
RAM |
OS |
Min Spec |
Architecture |
Graphics API |
CPU |
GPU |
RAM |
OS |
Min Spec |
All HVR components included in this plugin can be found within the ‘Component’ menu at the top of the Unity Editor window, or in the ‘Add Component’ menu on any GameObject.
Unity’s manual demonstrates how components are used: https://docs.unity3d.com/Manual/UsingComponents.html
Main Components
This component provides the ability to create and play HVR data within Unity.
It offers control over how the data is played and how it is rendered.
Property | Type | Function |
Data | | |
Mode | Reference | Drag and drop a file or folder from the Project window onto this slot. Any referenced data is included when a build is created. |
| Path | A direct path to a file, folder or network location. Any data specified will not be included in a build. |
Play | Bool | Should the Asset this HvrActor creates start playing immediately? |
Loop | Bool | Should the Asset this HvrActor creates loop? |
Seek Time | Float | The time the Asset should seek to once it is created |
Renderer | ||
Material | Object | The material the HvrActor will render with if using the ‘Standard’ RenderMethod. |
Render Method | FastCubes | Renders the actor using cubes |
| CorrectCubes | Identical results to FastCubes, but has correct depth and z-sorting for cubes on all platforms |
| PointSprite | Renders the actor with screen-aligned squares |
| PointSpriteDepth | Only renders depth; color is ignored |
| PointBlend | Renders the actor with smooth points which soften the look of the actor |
Use Lighting | Bool | Should this HvrActor be influenced by HvrLights? |
Receive Shadows | Bool | Should this HvrActor receive shadows? |
Cast Shadows | Bool | Should this HvrActor cast shadows? |
Options | ||
Use Screenspace Quad | Bool | Should this HvrActor render using a Screenspace Quad? This option is exposed in order to handle cases where a ShaderSubroutine has moved points outside the bounds of the HvrActor. To reduce the amount of overdraw, it is recommended to leave this option disabled unless needed. |
Use Occlusion Culling | Bool | Should this HvrActor use Unity’s Occlusion Culling system to check whether the object is visible when rendering? This is represented as a sphere around the HvrActor. |
Occlusion Culling Radius Multiplier | Float | Increase the radius of the Occlusion Culling sphere. |
In order to render HvrActors, a HvrRender component must be attached to a Camera.
There are two different render modes that the HvrRender component can be set to.
Standard
This mode renders using Unity’s standard rendering loop and allows for custom materials, lighting and effects. It attempts to fit invisibly into Unity’s render loop, allowing HvrActors to be customized like any other mesh rendering.
Direct
Renders directly into the Unity framebuffer. This mode is generally faster than Standard, but comes with the trade off that custom materials, lighting and post effects are not supported.
Parameter | Type | Function |
Mode | Standard | Renders HvrActors within Unity’s render loop. This allows custom materials and effects to affect how an HvrActor is rendered. |
| Direct | Renders directly into Unity’s framebuffer. Using this option will ignore all custom materials and effects currently set on an HvrActor. |
Note
ShaderSubroutine effects are compatible with both ‘Standard’ and ‘Direct’ modes.
HvrLight enables HvrActors to be lit by and cast shadows from Unity Lights.
Note
HvrLight only works when HvrRender’s rendering mode is set to ‘Standard’. In ‘Direct’ rendering mode, HvrActors will not be affected by this component.
Effects
This component allows you to mask sections of an HvrActor.
It works by using HvrActor3DMaskObject components to mark areas which should be affected by the masking.
This component stores an array of HvrActor3DMaskObject. Each HvrActor3DMaskObject is applied in the order it appears within the array. For example, the head of an HvrActor can be isolated by creating a ‘Subtractive’ mask which covers the entire HvrActor, then creating an ‘Additive’ mask which only surrounds the head. Placing the ‘Subtractive’ mask first and the ‘Additive’ mask second will cause only the head to be visible.
Note
This effect is done in 2D. This means masking out a hand in front of a face will remove the hand, but the face will not be visible. Rotate the camera within the included example scene to see a demonstration of this.
Parameter | Type | Function |
Objects | Array | An array of HvrActor3DMaskObject components. |
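The ordering behaviour described above can be pictured as a fold over the mask array: each mask that contains a given point overrides the point's visibility with its additive flag, in array order. This is an illustrative Python sketch of the assumed semantics, not the plugin's actual masking code; the helper names are hypothetical.

```python
def point_visible(point, masks):
    # masks: ordered list of (contains, additive) pairs, where `contains`
    # is a predicate over a point and `additive` marks the mask type.
    # Assumed semantics: points are visible by default; each mask that
    # contains the point overrides the current visibility, in order.
    visible = True
    for contains, additive in masks:
        if contains(point):
            visible = additive
    return visible

def sphere(center, radius):
    # Build a containment predicate for a sphere mask.
    def contains(p):
        return sum((a - b) ** 2 for a, b in zip(p, center)) <= radius ** 2
    return contains

# Subtractive mask over the whole actor, then an additive mask
# around the head: only the head remains visible.
masks = [
    (sphere((0.0, 1.0, 0.0), 10.0), False),  # subtractive
    (sphere((0.0, 1.6, 0.0), 0.3), True),    # additive
]
```

Swapping the two masks in the list would hide the head again, since the subtractive mask would be applied last.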
Used by HvrActor3DMask to mark areas to be affected when masking an HvrActor.
The mask object can be either a sphere or a box. It can be additive or subtractive.
Property | Type | Function |
Options | Sphere | Mask type will be a sphere |
| Box | Mask type will be a cube |
Additive | Bool | Is this mask additive or subtractive? |
This component allows color grading to be applied to the rendering of HvrActors.
Adjusting the values and sliders of this component will affect the look of the HvrActor it is attached to.
Note
The ‘Direct’ render mode available on HvrRender is not compatible with this component.
This component allows custom shaders to affect the HvrActor during the native plugin rendering step. This allows for complex effects to be written to affect the color, position and size of each voxel.
A HvrShaderSubroutine component can be thought of as similar to Unity’s material system, where a shader has values that can be changed, which in turn changes how a mesh renders.
In the case of HvrShaderSubroutines, the shader subroutine code is executed as a post-process step during the native rendering of the voxels.
Multiple HvrShaderSubroutine components can be added to a HvrActor in order to apply multiple post-processing effects. The order in which each subroutine executes is based on the order of the components attached to the HvrActor’s GameObject, from top to bottom. This is known as a stack.
Note
Not all examples in the below documentation have been written for all of the supported shader languages.
You may be required to modify the shader code for your specific target platform.
Unlike Unity’s ShaderLab shaders, shader subroutines are not automatically converted to work for every BuildTarget. Shader subroutines must be written for each Graphics API that the effect needs to work on.
The current supported shader languages are GLSL, HLSL and Metal.
Graphics API | Shader Language | Language Reference |
---|---|---|
Direct3D11 | HLSL | Windows HLSL Shader Reference |
OpenGLCore | GLSL | OpenGL Reference |
OpenGLES3 | GLSL | OpenGLES Reference |
Metal | Metal Shading Language | Apple Metal Shader Specification |
For convenience, a single shader subroutine file can contain all of the different shader languages. When the file is loaded it will only compile the relevant sections.
BEGIN and END lines are used to specify a block of code for different languages.
In addition, each type of shader subroutine corresponds to one shader stage: either the vertex or fragment (pixel) shader stage.
A code block will have a _VERTEX or _FRAGMENT suffix depending on the types of shader subroutines it contains.
The available types of shader subroutines are discussed later in this page.
For example:
BEGIN_GLSL_VERTEX
// Code
END_GLSL_VERTEX
BEGIN_GLSL_FRAGMENT
// Code
END_GLSL_FRAGMENT
BEGIN_HLSL_VERTEX
// Code
END_HLSL_VERTEX
BEGIN_METAL_VERTEX
// Code
END_METAL_VERTEX
Shader subroutines can control several properties of HvrActor rendering.
A code block consists of a collection of subroutines. Each subroutine controls one feature of rendering, such as the object-space position of the vertex, the vertex or fragment colour, or the scale factor applied to the voxel.
A subroutine is implemented as a function in the relevant shader language. The feature (position, colour, etc.) controlled by the subroutine is declared using an output semantic. Each shader subroutine may have several input parameters, which have values provided by the HVR Renderer, based on specified input semantics.
Input and output semantics are declared using an extended shader language syntax, which is similar to the way that HLSL uses semantics. The HVR Renderer parses the extended syntax, processes the semantic declarations and outputs standard shader code.
An example is given below, demonstrating modification of the vertex colour, object-space position and voxel scale. Note that the names of all subroutines need to be declared in a comma-separated list in parentheses after the code block BEGIN tag.
BEGIN_GLSL_VERTEX(CalcVertexColour, CalcVertexPosition, CalcVertexScale)
// Control colour
vec4 CalcVertexColour(vec4 colour : VERTEX_COLOUR, vec4 oPos : OPOS) : VERTEX_COLOUR
{
// Example: This makes the colour darker.
// The 'oPos' parameter is not used in this example,
// but could be included in the calculation, i.e. to modify
// the colour based on the position.
return vec4(0.5 * colour.rgb, colour.a);
}
// Control position
vec4 CalcVertexPosition(vec4 oPos : OPOS) : OPOS
{
// Example: This moves the vertex vertically upwards.
return vec4(oPos.x, oPos.y + 1.0, oPos.z, oPos.w);
}
// Control scale
float CalcVertexScale(float scale : VOXEL_SCALE) : VOXEL_SCALE
{
// Example: This doubles the vertex size.
return 2.0 * scale;
}
END_GLSL_VERTEX
In the example above, the CalcVertexColour subroutine has VERTEX_COLOUR specified as its output semantic, so the HVR Renderer uses its return value for the output vertex colour. Its input semantics are VERTEX_COLOUR and OPOS, so the corresponding parameters will be filled in by the HVR Renderer with the original vertex colour and position.
The other two shader subroutines (CalcVertexPosition and CalcVertexScale) work similarly.
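To make the extended semantic syntax concrete, the declaration parsing can be sketched as a small regular-expression pass. This is illustrative Python only; the actual HVR Renderer parser is internal to the plugin, and the function name here is hypothetical.

```python
import re

# Matches e.g. "vec4 CalcVertexColour(vec4 colour : VERTEX_COLOUR) : VERTEX_COLOUR"
SIG_RE = re.compile(
    r"(?P<ret>\w+)\s+(?P<name>\w+)\s*\((?P<params>[^)]*)\)\s*:\s*(?P<out>\w+)"
)

def parse_subroutine_signature(line):
    """Extract the name, output semantic and input semantics of a declaration."""
    m = SIG_RE.search(line)
    if m is None:
        return None
    inputs = []
    for param in m.group("params").split(","):
        # Each parameter carries its input semantic after a colon.
        if ":" in param:
            inputs.append(param.split(":")[1].strip())
    return {"name": m.group("name"),
            "output": m.group("out"),
            "inputs": inputs}
```

After this pass, a renderer would strip the semantic annotations and emit standard shader code, wiring the declared inputs and outputs to its own variables.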
A list of semantics and their functionality is given below. Each of these can be used as either input or output semantics. Each subroutine must be declared in the appropriate code block (VERTEX or FRAGMENT) based on the shader stage of its output semantic.
Semantic | Type | Shader Stage | Description |
---|---|---|---|
OPOS | vec4 / float4 | Vertex | Object-space coordinates of the current vertex. |
VERTEX_COLOUR | vec4 / float4 | Vertex | The colour of the current vertex. |
FRAGMENT_COLOUR | vec4 / float4 | Fragment | The colour of the current fragment (pixel). |
VOXEL_SCALE | float | Vertex | A scaling factor used to modify the size of the voxel (1.0 = original scale). |
Important note: Each subroutine must be declared after the BEGIN tag in the code block header.
This takes the form of a comma-separated list of function names in parentheses: for example, BEGIN_GLSL_VERTEX(CalcVertexColour, CalcVertexPosition, CalcVertexScale)
in the example above. If a subroutine is not declared, it will be ignored by the HVR Renderer.
Helper or utility functions without input or output semantics should not be declared in the code block header.
Most shader subroutines are likely to need parameters provided by the application; for example, the current time, in effects that are dynamic or animated. These correspond to uniform variables in GLSL and constant buffers in HLSL.
The name of each variable, struct or cbuffer should be prefixed by <ID> (this is discussed in the ‘Shader Subroutine Stacks’ section).
In GLSL, shader parameters can be declared as global uniform variables:
uniform float _<ID>CurrentTime;
Similarly, in HLSL, global cbuffers can be declared:
cbuffer _<ID>ShaderParams
{
float _<ID>CurrentTime;
}
In the Metal shading language, shader inputs cannot be declared as global variables. Instead, a struct of parameters must be defined; this can then be declared as a parameter to a shader subroutine using the special semantic SHADER_UNIFORMS. For example:
struct _<ID>ShaderParams
{
float _<ID>CurrentTime;
};
float4 CalcVertexColour(float4 colour : VERTEX_COLOUR, _<ID>ShaderParams uniforms : SHADER_UNIFORMS) : VERTEX_COLOUR
{
return float4(colour.rgb * (sin(uniforms._<ID>CurrentTime) * 0.5 + 0.5), colour.a);
}
For Metal, textures should be declared in a separate struct and used with the SHADER_TEXTURES semantic. For example:
struct _<ID>ShaderTextures
{
texture2d<float> _<ID>RGLookupTable;
};
float4 CalcVertexColour(float4 colour : VERTEX_COLOUR, _<ID>ShaderTextures textures : SHADER_TEXTURES) : VERTEX_COLOUR
{
// Uses the R and G components of the original colour as texture coordinates.
constexpr sampler textureSampler(mag_filter::linear, min_filter::linear);
return textures._<ID>RGLookupTable.sample(textureSampler, colour.rg);
}
In order to support shader subroutine stacks, it is required to prefix all custom parameters and methods with “<ID>” (without the quote marks).
This is necessary because when a shader subroutine stack is created, all of the shaders in the stack are compiled into one large shader. If more than one of those shaders has a parameter with the same name, the parameter’s value cannot be set independently for each shader. For example, if two shaders both had a parameter named “colour”, attempting to set each to a different value would affect both parameters in the combined shader.
In order to address this, a unique ID is generated for each file and is used when the shader subroutine stack is created. This ID is used to replace the “<ID>” prefix and ensures that each shader has unique parameter and method names.
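The ID substitution itself can be pictured as a plain text-replacement pass over each file before the stack is concatenated and compiled. The sketch below is illustrative Python; the plugin performs the real substitution natively, and the IDs shown are made up.

```python
def apply_subroutine_id(shader_source, unique_id):
    # Replace every literal '<ID>' marker with the per-file unique ID,
    # so parameter and method names no longer collide across the stack.
    return shader_source.replace("<ID>", unique_id)

# Two subroutines in a stack, both declaring '_<ID>Saturation':
stack = [
    ("FX0_", "uniform float _<ID>Saturation;"),
    ("FX1_", "uniform float _<ID>Saturation;"),
]
combined = "\n".join(apply_subroutine_id(src, uid) for uid, src in stack)
```

After substitution, the combined shader contains `_FX0_Saturation` and `_FX1_Saturation`, which can be set independently.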
This example demonstrates how to write a shader which is compatible with shader subroutine stacks.
BEGIN_GLSL_VERTEX(CalcVertexColour)
uniform float _<ID>Saturation;
float <ID>Luminance(vec3 c)
{
    return dot(c, vec3(0.22, 0.707, 0.071));
}
vec4 CalcVertexColour(vec4 colour : VERTEX_COLOUR, vec4 oPos : OPOS) : VERTEX_COLOUR
{
    float luminance = <ID>Luminance(colour.rgb);
    colour.rgb = mix(colour.rgb, vec3(luminance), _<ID>Saturation);
    return colour;
}
END_GLSL_VERTEX
Example 1
Set all voxels to be blue
BEGIN_GLSL_VERTEX(SetVertexColour)
vec4 SetVertexColour(vec4 colour : VERTEX_COLOUR, vec4 oPos : OPOS) : VERTEX_COLOUR
{
colour.rgb = vec3(0, 0, 1);
return colour;
}
END_GLSL_VERTEX
BEGIN_HLSL_VERTEX(SetVertexColour)
float4 SetVertexColour(float4 colour : VERTEX_COLOUR, float4 oPos : OPOS) : VERTEX_COLOUR
{
colour.rgb = float3(0, 0, 1);
return colour;
}
END_HLSL_VERTEX
BEGIN_METAL_VERTEX(SetVertexColour)
float4 SetVertexColour(float4 colour : VERTEX_COLOUR) : VERTEX_COLOUR
{
colour.rgb = float3(0, 0, 1);
return colour;
}
END_METAL_VERTEX
Example 2
Offset the position of all vertices vertically
BEGIN_GLSL_VERTEX(SetVertexPosition)
vec4 SetVertexPosition(vec4 oPos : OPOS) : OPOS
{
if (oPos.y > 100)
oPos.y += 30;
return oPos;
}
END_GLSL_VERTEX
BEGIN_HLSL_VERTEX(SetVertexPosition)
float4 SetVertexPosition(float4 oPos : OPOS) : OPOS
{
if (oPos.y > 100)
oPos.y += 30;
return oPos;
}
END_HLSL_VERTEX
BEGIN_METAL_VERTEX(SetVertexPosition)
float4 SetVertexPosition(float4 oPos : OPOS) : OPOS
{
if (oPos.y > 100)
oPos.y += 30;
return oPos;
}
END_METAL_VERTEX
Example 3
The following sets the color of all voxels to be blue, and sets their scale to 0 if they are below 1m in the data’s object space.
BEGIN_GLSL_VERTEX(SetVertexColour, SetVertexScale)
float SetVertexScale(float scale : VOXEL_SCALE, vec4 oPos : OPOS) : VOXEL_SCALE
{
if (oPos.y < 100)
return 0.0;
return scale;
}
vec4 SetVertexColour(vec4 colour : VERTEX_COLOUR, vec4 oPos : OPOS) : VERTEX_COLOUR
{
colour.rgb = vec3(0, 0, 1);
return colour;
}
END_GLSL_VERTEX
BEGIN_HLSL_VERTEX(SetVertexColour, SetVertexScale)
float SetVertexScale(float scale : VOXEL_SCALE, float4 oPos : OPOS) : VOXEL_SCALE
{
if (oPos.y < 100)
return 0;
return scale;
}
float4 SetVertexColour(float4 colour : VERTEX_COLOUR, float4 oPos : OPOS) : VERTEX_COLOUR
{
colour.rgb = float3(0, 0, 1);
return colour;
}
END_HLSL_VERTEX
BEGIN_METAL_VERTEX(SetVertexColour, SetVertexScale)
float SetVertexScale(float scale : VOXEL_SCALE, float4 oPos : OPOS) : VOXEL_SCALE
{
if (oPos.y < 100)
return 0;
return scale;
}
float4 SetVertexColour(float4 colour : VERTEX_COLOUR) : VERTEX_COLOUR
{
colour.rgb = float3(0, 0, 1);
return colour;
}
END_METAL_VERTEX
Utilities
This simple component can be attached to a GameObject which has a Collider and assigned a HvrActor as a target. When the Collider is hit by any collidable object, the HvrActor will be triggered to play.
Parameter | Type | Function |
actor | HvrActor | The HvrActor which will be triggered by this component when something collides with it |
This component allows an Animation clip to be synced with a HvrActor’s playback.
The Animation will match the HvrActor’s current time. Syncing the HvrActor to the Animation’s time is not supported.
Parameter | Type | Function |
HvrActor | HvrActor | The HvrActor which will be used to sync the animation |
TargetAnimation | Animation | An Animation component which will be synced to the HvrActor’s Asset |
This component allows an AudioSource’s AudioClip to be synced to the playback of a HvrActor’s asset.
The HvrActorAudioSourceSync does not take an AudioClip itself as a value.
It expects the AudioSource to have an AudioClip assigned. Once you have assigned an AudioSource and a HvrActor, you will not need to manage this component: if the actor is playing, the audio source will be playing.
If the AudioClip assigned to the AudioSource is shorter than the HvrActor’s asset duration, it will play as long as it can; if it is longer, it will stop once the HvrActor stops playing.
The AudioClip will match the HvrActor’s current time. Syncing the HvrActor to the AudioClips’s time is not supported.
Parameter | Type | Function |
Actor | HvrActor | A HvrActor which will be synced with |
Audio Source | AudioSource | An AudioSource component which has an AudioClip assigned to it |
Offset | Float | A time in seconds which this component will use to offset the sync time. This can be useful for situations where the audio clip start time does not match the Asset. |
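The sync behaviour described above amounts to offsetting the actor's current time and checking the result against the clip's length. The sketch below is illustrative Python under assumed behaviour; the real HvrActorAudioSourceSync is a Unity C# component and may differ in detail.

```python
def synced_audio_time(actor_time, offset, clip_length):
    # Map a HvrActor's playback time to an AudioClip time.
    # Returns None when the target time falls outside the clip,
    # i.e. when the audio source should not be playing.
    t = actor_time + offset
    if t < 0 or t > clip_length:
        return None
    return t
```

For example, with a 10-second clip, an actor time of 12 seconds means the clip has already finished and the audio source should be stopped.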
This component can be used to align a Projector component based on the bounds of an HvrActor. This is useful for creating blob shadows.
Parameter | Type | Function |
Actor | HvrActor | The HvrActor which has an Asset. The bounds of this Asset will be used to drive the Projector component’s size |
Projector | Projector | The Projector Component which will cast a blob shadow texture |
Size Multiplier | Float | Used to increase or decrease the scale of the blob shadow texture |
The HvrDataBuildInclude component and HvrDataReference can be used to include files and folders from your Project which are not assigned to HvrActors.
During the build process, the 8i Unity Plugin will scan any scenes that are to be built and check their components to find any HvrActors which are using the ‘Reference’ Data Mode. It will also search for HvrDataBuildInclude components which have a HvrDataReference assigned. If any are found, the referenced data will be copied to the build folder so the built application can load it.
Using this component is helpful in situations where your project is going to create HvrActors at runtime and you do not want to manually copy data from your project to your build.
Note
When building for Android the build process described above is not automatic and requires a custom build process. Please see the Android Platform page for more information about this.
The HvrDataBuildInclude is a component which a HvrDataReference can be assigned to.
To use this component, attach it to a GameObject within your scene and assign a HvrDataReference to the ‘Data Reference’ slot.
Parameter | Type | Function |
Data Reference | Object | A Object slot which a HvrDataReference can be assigned to |
The HvrDataReference contains a list of references to files or folders from your project.
It can be created by either opening the ‘Assets’ menu at the top of the screen, or in the ‘Create’ context menu when right clicking in your Project window and navigating to ‘Create/8i/HvrDataReference’
To add a reference to the List
Because the references are created using Unity’s project GUID system, moving any referenced files and folders will not break the reference.
Parameter | Type | Function |
Data | List | An expandable list of GUID references to files or folders in the project |
What is PAT
PAT is the latest codec developed by 8i. PAT is a mesh-based codec, in which the final asset output is represented as a sequence of textured meshes.
Output from PAT
The PAT codec produces several outputs. It’s important to understand that the outputs are simply different encapsulations of the same output data. The available outputs are:
- MPD - This format extends the well known MPEG-DASH and HLS standards. This format should be used for all applications that require streaming and playback that is longer than a few seconds. More info about an MPD structure can be found in the MPD structure document
- glTF - This output can be used in standard editors/players and social platforms. glTF is a widely adopted format, particularly suitable for the web. However, it performs poorly for longer content where progressive streaming is required.
- FBX - This output is primarily used for cases where the output needs to be edited using tools like Maya. After editing, this format can be converted to any of the other formats.
Difference from HVR
The main difference between HVR and PAT is how the underlying 3D asset is represented. HVR represents the 3D asset as a point cloud, whereas PAT represents it as a textured mesh. Effectively, PAT decouples the mesh (geometry) resolution from the texture (color) resolution, allowing the perceived resolution of the asset to scale up with a smaller increase in overall asset size. To illustrate, think of an actor with a highly textured shirt. The underlying geometry of the shirt is fairly simple, which saves a lot of geometry bandwidth. Most of the complexity is on the texture side, which can be encoded very effectively using standard video codecs. To represent the same detail with HVR, a very high density of points would be needed, causing the file size to explode.
PAT usage with Unity
HVR uses a point cloud to represent the data. Point clouds are less standard, and hence Unity’s standard rendering will not produce the best output. As a result, to produce the best quality render, the HVR plugin takes over the rendering of the asset. This internal rendering takes away some of the feature set that a developer would expect with a 3D asset in Unity. In contrast, PAT simply delivers a mesh and a texture per frame, giving the control over rendering, 3D asset manipulation and effects back to the developer.
Main Script Components
This allows for enumeration of the available PatPlayerRepresentation objects per audio/video/mesh type. Primarily, video representations are used to provide different levels of bandwidth/quality.
These objects are enumerated via the PatAsset interface, for each sub-stream of the given asset (video/mesh/audio).
Examples
AssetUrl | URL to the cloud hosted manifest |
PatActorPrefab | Should be set to the provided PatActor.prefab |
DownloadAsset | Controls whether the asset should first be downloaded, or streamed directly from the web |
PAT with webplayer and 8thwall
PAT MPD assets can be streamed and played on the browser. Additional information about usage of the web player can be found here: https://8i.github.io/embeddable_webplayer/#/
The ‘Recommended Project Settings’ window can be found through the ‘8i’ menu at the top of the Unity Editor.
This window will prompt you if it detects an issue with your project settings. It can also fix common problems with a single click.
Use lower resolution data
It is recommended to use this table when choosing which resolution to use for your application.
Resolution | PC | Mobile | About |
---|---|---|---|
1,500,000 | Y | N | Not recommended for mobile |
800,000 | Y | Y | A high-end mobile device is required |
600,000 | Y | Y | |
400,000 | Y | Y |
The filename of the hvr file will tell you the resolution. This number represents the density of the data and the overall quality.
The naming convention is [NAME]_[Resolution]_[FrameNumber].[Extension]
Most files follow this naming convention. If you are not sure, please send an email to your contact at 8i.
E.g. BOXER_200_tc_v01_800000_001009.hvr will be:
Name | BOXER_200_tc_v01 |
Resolution | 800,000 |
Frame number | 1009 |
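Under the naming convention above, the resolution and frame number can be recovered by splitting the filename from the right, since the name portion may itself contain underscores. An illustrative Python sketch:

```python
import os

def parse_hvr_filename(filename):
    # [NAME]_[Resolution]_[FrameNumber].[Extension]; NAME may contain
    # underscores, so split the last two fields off from the right.
    stem, _ext = os.path.splitext(os.path.basename(filename))
    name, resolution, frame = stem.rsplit("_", 2)
    return {"name": name, "resolution": int(resolution), "frame": int(frame)}
```

If a file does not follow this convention, the split will fail or give nonsense values, in which case check with your 8i contact as noted above.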
Change the HvrActor render method
It is recommended to use the ‘FastCubes’ render method in order to improve performance.
Change the HvrRender render mode
The ‘Direct’ render mode provides the best performance and memory usage.
There are downsides of this method which are outlined in the HvrRender section of this documentation.
Mobile Performance Tips
As mobile platforms are much slower than desktop systems, it is recommended to use hvr frames with point counts of 600k or less, with the recommended point count being around 300k.
It is recommended to use the ‘Direct’ HvrRender render method on Android as it is the best performing renderer.
It is recommended to use the ‘Point Sprite’ render method in all cases. It is the best performing render method provided.
The current alternative, ‘Point Blend’ does not work on some older devices, and is around twice as expensive to render.
Common Problems
Android
Make sure that, under the Player Settings, the target Graphics API is GLES3 and that your device supports GLES3.
The PointBlend render method does not work on some older mobile devices.
There is a known issue on some devices where, when the ‘Split Application Binary’ option is enabled, the HvrRender component may not be able to load the Standard shaders. Go to ‘Edit/Project Settings/Player’ and make sure the ‘Split Application Binary’ option is not checked.
The current release of the Unity Editor for Daydream (5.4.2f2-GVR13) will not use the AndroidManifest.xml file that is provided within the plugin. This means the hvr frames will not be extracted or read from the device’s storage upon installing the build.
It appears as though this build of the Unity Editor has a bug where it will not use an AndroidManifest.xml file that is not located at this specific location: project_name/Assets/Plugins/Android/AndroidManifest.xml
Until this is fixed within Unity, it is recommended to copy the AndroidManifest file from 8i/core/platform support/android/plugins/AndroidManifest.xml to project_name/Assets/Plugins/Android/AndroidManifest.xml
Some devices do not correctly copy the OBB file to the device when using the “Build and Run” option in Unity, and in some cases Unity will silently fail to update the OBB when the project is built. If this occurs, the OBB file will need to be manually copied to the development device.
So far only the Samsung Galaxy Note 5 has been observed with this issue.
In this section, we will go through two AR solutions: ARKit, provided by Apple on iOS, and ARCore, provided by Google on Android.
Go to https://bitbucket.org/Unity-Technologies/unity-arkit-plugin and download the official Unity ARKit plugin. You might need to install Mercurial or SourceTree to grab the source.
Switch to the 1.5.1 tag if you want it to run on iOS 11 devices. Here’s how to do it in SourceTree; the steps may differ in other Mercurial clients:
Within the downloaded project, extract the 8i Unity Plugin into the Assets folder, as described in the Quick Start section.
You should have the directory structure like this:
Open the project in Unity
If you are prompted to upgrade the Unity version, click yes.
Select File > Build Settings, a build dialogue should come up. In Platform choose iOS and click Switch Platform button.
Make sure the Platform is switched to iOS, and choose one of the scenes to include in the build. In this tutorial, we check the simplest scene, UnityARBallz.
Still in Build Settings dialogue, click Player Settings… button.
A PlayerSettings inspector should appear.
Note
It is required to disable Metal Editor Support because the 8i Unity Plugin does not yet support Metal on macOS.
Make sure Metal is listed first under Graphics APIs, and disable Metal Editor Support.
For this tutorial, we will edit the UnityARBallZ scene from Unity ARKit plugin’s example.
We will change the original AR object to 8i’s hologram, so that you can place a human hologram onto the augmented world.
To open the scene, find it in the Project window and double-click it.
You should be able to see something like this in Scene view:
This scene is a barebones template of an AR app: all it does is automatically detect the environment, track the movement of the device, and tie that movement to the virtual Camera object.
Select menu GameObject > 8i > HvrActor. This will create a GameObject with a HvrActor component attached to it.
Select the newly created HvrActor object.
There are a few options to note but for now we will just focus on the Asset/Data/Reference field.
This is the data source that 8i’s hologram engine will read from. Right now it’s empty. To specify a valid file reference, go to the folder 8i/examples/assets/hvr and find the “president” folder:
Drag this “president” folder to the Asset/Data/Reference field in the Inspector panel. To make things even simpler, uncheck the Rendering/Lighting/Use Lighting checkbox:
You should be able to see the hologram has already been shown in the Scene view:
Because we want the user to be able to drop the hologram wherever they touch the ground, we need to wrap this HvrActor object into a “prefab” and let our ARKit code know to use it.
Note
Prefabs are an important concept in Unity: a prefab is a reusable asset template from which copies of an object can be instantiated at runtime.
With HvrActor still selected, drag the HvrActor object down to a folder in the Project window. Unity will automatically create a prefab for you, and the name of HvrActor will turn blue:
To change the ARKit code to spawn HvrActors instead of balls, find the BallMaker object in the scene and select it. Drag the newly created HvrActor prefab to BallMaker’s Inspector panel, replacing BallPrefab with HvrActor:
Because we have stored the HvrActor in a prefab, it is now safe to delete the HvrActor in the scene. Go to the Hierarchy, right click on HvrActor, which should now have its name in blue, and choose “Delete”.
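Under the hood, spawning a prefab like this comes down to Unity’s standard Instantiate call. A minimal, hypothetical spawner sketch (the class and field names here are our own, not the ARKit plugin’s BallMaker script):

```csharp
using UnityEngine;

public class HologramSpawner : MonoBehaviour
{
    // Assign the HvrActor prefab created above in the Inspector.
    public GameObject hvrActorPrefab;

    // Call this with a world-space position, e.g. from an AR hit test
    // against a detected plane.
    public void SpawnAt(Vector3 position)
    {
        // Create a fresh copy of the prefab at the tapped point.
        Instantiate(hvrActorPrefab, position, Quaternion.identity);
    }
}
```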
Save the scene by pressing Cmd+S.
Next we need to configure the camera to let it render 8i’s hologram.
Note
This step is required or else you will only be able to view the hologram within the Unity Editor
Find the camera object in Hierarchy > CameraParent > Main Camera and select it.
With Main Camera selected, choose Component > 8i > HvrRender from the menu; this should add a HvrRender component to the camera:
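If you prefer to attach the component from code instead of through the menu, a hedged sketch using Unity’s standard AddComponent call (HvrRender is the plugin’s component type; its namespace may differ in your plugin version):

```csharp
using UnityEngine;

public class AttachHvrRender : MonoBehaviour
{
    void Awake()
    {
        // Make sure the camera this script sits on can render HVR content.
        if (GetComponent<HvrRender>() == null)
            gameObject.AddComponent<HvrRender>();
    }
}
```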
Save the scene by pressing Cmd+S.
Before we can build the project, there’s an extra step. Because we are using a prefab, its data will be loaded dynamically, so we need to explicitly tell Unity to include the data before exporting.
First, right click in the Project window and create an asset of type HvrDataReference. You can do this through Create > 8i > HvrDataReference.
After creation, select the asset. Drag the president folder to its data field.
Now that the asset on disk is created and configured, we need to include it in our scene. Right click in the Hierarchy window and create an empty GameObject.
With the empty object selected, attach a component of type HvrDataBuildInclude. You can find it in Component > 8i > HvrDataBuildInclude.
Finally drag the configured HvrDataReference asset to Data Reference field.
Save the scene.
That’s it! It’s time to export Xcode project and deploy it to the device.
Go to File > Build Settings, click Player Settings, and make sure Metal is listed first in the Inspector window.
Click Build and select a folder to export the project. If everything went smoothly, a Finder window should pop up showing the exported Xcode project.
Double click Unity-iPhone.xcodeproj and this should bring up Xcode.
Configure the Xcode project as follows. Pay attention to code signing if you are new to it.
After configuration, hit run:
Once the build is deployed and running, pick up your phone and walk around until a magenta ground is shown, which indicates where you can place your holograms.
Tap the magenta ground to see how the hologram works within the AR world.
Download ARCore SDK for Unity 1.4.0 or later
Open Unity and create a new empty 3D project
Select Assets > Import Package > Custom Package, use the downloaded arcore-unity-sdk-v1.4.0.unitypackage from your disk.
In the import dialogue, make sure everything is selected and click Import.
Accept any API upgrades if prompted.
Within the newly created project, extract the 8i Unity Plugin into the Assets folder, as described in the Quick Start section.
You should have the directory structure like this:
Fix any warnings that pop up in the 8i Project Tips window, including Android Unpack Scene.
Open the project in Unity
If you are prompted to upgrade the Unity version, click yes.
Open scene HelloAR by double clicking Assets/GoogleARCore/Examples/HelloAR/Scenes/HelloAR
Select File > Build Settings; a build dialogue should come up. Click the Player Settings… button and a PlayerSettings inspector should appear. In the Inspector window, find Other Settings - Metal Editor Support and uncheck it. This is required for Unity to preview 8i’s hologram content.
Still in Build Settings dialogue, in Platform choose Android and click Switch Platform button.
Make sure the Platform is switched to Android, and make sure the HelloAR scene is ticked by using Add Open Scenes.
Still in Build Settings dialogue, click Player Settings… button. A PlayerSettings inspector should appear. In the Inspector window, a few fields need to be configured:
Other Settings - Package Name: set to a reverse-DNS style name, e.g. com.yourcompany.arsample
Other Settings - Uncheck Auto Graphics API and explicitly set OpenGL ES 3 as the graphics API
Other Settings - Multithreaded Rendering: uncheck
Other Settings - Minimum API Level: set to Android 7.0 or higher. Note you need to have the right version of the Android SDK installed and configured in Unity > Preferences.
Other Settings - Target API Level: set to Android 7.0 or higher.
XR Settings - ARCore Supported: tick on
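The Other Settings fields above can also be applied from an editor script using standard UnityEditor APIs. A sketch, assuming the script lives in an Editor folder and you substitute your own package name (the ARCore Supported tick still needs to be set in XR Settings):

```csharp
using UnityEditor;
using UnityEngine.Rendering;

public static class ARCoreBuildSettings
{
    [MenuItem("Tools/Configure ARCore Player Settings")]
    public static void Configure()
    {
        // Reverse-DNS package name (replace with your own).
        PlayerSettings.applicationIdentifier = "com.yourcompany.arsample";

        // Uncheck Auto Graphics API and explicitly use OpenGL ES 3.
        PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.Android, false);
        PlayerSettings.SetGraphicsAPIs(BuildTarget.Android,
            new[] { GraphicsDeviceType.OpenGLES3 });

        // Disable multithreaded rendering on Android.
        PlayerSettings.SetMobileMTRendering(BuildTargetGroup.Android, false);

        // Android 7.0 (API level 24) as minimum and target.
        PlayerSettings.Android.minSdkVersion =
            AndroidSdkVersions.AndroidApiLevel24;
        PlayerSettings.Android.targetSdkVersion =
            AndroidSdkVersions.AndroidApiLevel24;
    }
}
```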
For this tutorial, we will edit the HelloAR scene from Google ARCore SDK for Unity’s example.
We will change the original AR object to 8i’s hologram, so that you can place a human hologram onto the augmented world.
To open the scene, find the scene in project and double click the scene.
Select menu GameObject > 8i > HvrActor. This will create a GameObject with a HvrActor component attached to it.
Select the newly created HvrActor object.
There are a few options to note but for now we will just focus on the Asset/Data/Reference field.
This is the data source that 8i’s hologram engine will read from. Right now it’s empty. To specify a valid file reference, go to the folder 8i/examples/assets/hvr and find the “president” folder:
Drag this “president” folder to the Asset/Data/Reference field in the Inspector panel. To make things even simpler, uncheck the Rendering/Lighting/Use Lighting checkbox:
You should be able to see the hologram has already been shown in the Scene view:
Because we want the user to be able to drop the hologram wherever they touch the ground, we need to wrap this HvrActor object into a “prefab” and let our ARCore code know to use it.
Note
Prefabs are an important concept in Unity: a prefab is a reusable asset template from which copies of an object can be instantiated at runtime.
With HvrActor still selected, drag the HvrActor object down to a folder in the Project window. Unity will automatically create a prefab for you, and the name of HvrActor will turn blue:
To change the HelloAR scene to spawn HvrActor instead of Andy Android, find the Example Controller object in the scene and select it. Drag the newly created HvrActor prefab to Example Controller’s Inspector panel, replacing Andy Plane Prefab and Andy Point Prefab with HvrActor:
Because we have stored the HvrActor in a prefab, it is now safe to delete the HvrActor in the scene. Go to the Hierarchy, right click on HvrActor, which should now have its name in blue, and choose “Delete”.
Save the scene by pressing Cmd+S.
Next we need to configure the camera to let it render 8i’s hologram.
Note
This step is required or else you will only be able to view the hologram within the Unity Editor
Find the First Person Camera object in the Hierarchy and select it.
With First Person Camera selected, choose Component > 8i > HvrRender from the menu; this should add a HvrRender component to the camera:
Save the scene by pressing Cmd+S.
Before we can build the project, there’s an extra step. Because we are using a prefab, its data will be loaded dynamically, so we need to explicitly tell Unity to include the data before exporting.
First, right click in the Project window and create an asset of type HvrDataReference. You can do this through Create > 8i > HvrDataReference.
After creation, select the asset. Drag the president folder to its data field.
Now that the asset on disk is created and configured, we need to include it in our scene. Right click in the Hierarchy window and create an empty GameObject.
With the empty object selected, attach a component of type HvrDataBuildInclude. You can find it in Component > 8i > HvrDataBuildInclude.
Drag the configured HvrDataReference asset to Data Reference field.
Finally, choose 8i > Android > Prepare Build from the menu and click OK if a dialogue prompts. This will prepare and bake the content so it is ready to be submitted to the Android device. Note that this Android-specific step must be repeated whenever the dynamically loaded 8i content changes; you don’t need to repeat it if no 8i content changed between builds.
Save the scene.
That’s it! It’s time to build an APK and deploy it to the device.
Connect your Android phone to your development machine
Enable developer options and USB debugging on your Android phone. This should be done just once.
Menu File > Build Settings, click Player Settings.
Click Build And Run and select a folder to export the APK. If everything went smoothly, you should see the APK get exported and automatically deployed to the device.
Once the build is up and running, pick up your phone and walk around until a detected ground plane is shown, which indicates where you can place your holograms.
Tap the white grid ground or blue dots to see how the hologram works within the AR world.