OLP Day 2: Chris Pruett Unity Session

The following are my notes from a day at Oculus Launch Pad 2017 with Director of Mobile Engineering @ Oculus, Chris Pruett.

Chris Talking about Unity Workflow and Areas of OLP Interest

Unity Scene Setting: a grungy mid-80s arcade space with a main room and a games room. It's built for Rift, Vive, etc.; the details differ at the SDK level. It's unreleased, and since Chris is focused on mobile VR, it's designed to run efficiently on mobile target devices.

Loading Scene and other tricks and tips for optimizing load-in

The baseline problem is that it takes a really long time to load things. The purpose of a dedicated loading scene is basically to contain the OVR SDK utilities while holding a higher frame rate: if you sit in a scene that's doing a heavy load and move your head around, you get really bad frame drops. One other thing you can do on the topic of scene load (it takes a really long time to load a couple hundred megabytes of data on a phone): put the assets in an asset bundle. Unity 5 also loves to check the "Preload Audio Data" box in Import Settings for every audio file. To take pressure off of the game engine, uncheck "Preload Audio Data"; it's also possible to switch the audio Load Type to "Compressed In Memory".
Before the level load
  • Put scene assets in an asset bundle, show a cubemap via the OVROverlay script, load synchronously, and turn the cubemap off when the load completes (see the sketch after this list)
  • You could decide that a one-time level load is better than multiple level loads. As long as your session time is fairly long, you pay all the costs at once, and then you have a memory buffer for the experience.
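Here's a minimal sketch of that loading-screen pattern, assuming an OVROverlay component from Oculus Utilities configured as a cubemap in the Inspector; the scene name is hypothetical. While Unity stalls on the synchronous load, timewarp keeps re-rendering the overlay, so the view stays smooth:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

public class LoadingScreen : MonoBehaviour
{
    public OVROverlay cubemapOverlay;        // cubemap overlay, wired up in the Inspector
    public string heavyScene = "ArcadeMain"; // hypothetical scene name

    void Start()
    {
        DontDestroyOnLoad(transform.root.gameObject); // survive the scene swap
        cubemapOverlay.enabled = true;                // show the cubemap layer
        SceneManager.sceneLoaded += OnSceneLoaded;
        SceneManager.LoadScene(heavyScene);           // synchronous, heavy load
    }

    void OnSceneLoaded(Scene scene, LoadSceneMode mode)
    {
        cubemapOverlay.enabled = false;               // load complete: cubemap off
        SceneManager.sceneLoaded -= OnSceneLoaded;
    }
}
```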
————————————————————————————————————–

What’s notable inside OVR Utilities

There's a package called Oculus Utilities for Unity 5, which notably contains:
  • A high-level VR camera rig for the headset (e.g. LeftEyeAnchor, RightEyeAnchor, CenterEyeAnchor) – see the sketch after this list
  • A Controllers API for the higher-end hand controllers (Touch) and the lower-end hand controller (Gear VR Controller)
  • The user gets to choose the left/right-handed setting for the Gear VR Controller
  • OVRInput.cs is abstracted in a way that allows for input from any controller (e.g. LTrackedRemote or RTrackedRemote)
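A quick sketch of how those pieces get used in practice: OVRCameraRig exposes the anchors as plain Transforms (the rig reference is assumed to be assigned in the Inspector):

```csharp
using UnityEngine;

public class GazeReader : MonoBehaviour
{
    public OVRCameraRig rig; // assigned in the Inspector

    void Update()
    {
        // centerEyeAnchor sits between the two eye anchors: the head pose.
        Transform head = rig.centerEyeAnchor;
        Debug.DrawRay(head.position, head.forward * 5f, Color.green);
    }
}
```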

OVROverlay

  • Built into the Oculus SDK – it's a texture that is rendered not by the game engine but by timewarp, which is similar to asynchronous reprojection
  • Your engine renders a left and right eye buffer and submits it to the Oculus SDK
  • The basic thing it does is project images, warping the edges of the image in the right way for the specific hardware in use
  • Timewarp – tries to alleviate judder. In practice it takes a previous frame and re-shows it; it only knows about orientation information, so it won't help with the camera moving forward. It will also render overlays for you: timewarp has an opportunity to render faster than Unity, and it composites in the layers you submit, which are essentially "quads". This is particularly good if you're rendering video (see the sketch below). It was made initially for mobile, though there's now an additional buffer you have to submit on Rift. Upshot: you can get higher fidelity by pushing certain textures through timewarp.
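Since timewarp composites quad layers itself, video is a natural fit. A hedged sketch, assuming the OVROverlay fields from the 2017-era Utilities (currentOverlayType, textures); names may differ by version, and feeding the RenderTexture is up to your video player:

```csharp
using UnityEngine;

public class VideoQuadLayer : MonoBehaviour
{
    public OVROverlay overlay;       // quad-shaped overlay, set up in the Inspector
    public RenderTexture videoFrame; // your video player renders into this

    void Start()
    {
        overlay.currentOverlayType = OVROverlay.OverlayType.Overlay; // composite on top
        overlay.textures = new Texture[] { videoFrame, videoFrame }; // same frame, both eyes
    }
}
```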

VR Compositor layer

Does some texture mapping
[Screenshot: the compositor-layer texture mapping slide]

OVRInput.cs

This section could use some filling out

  • Check the public enum Button for more interesting mappings
  • Check out the public enum Controller for orientation info on both Oculus Touch and the Gear VR Controller (e.g. LTrackedRemote or RTrackedRemote), which will give you back a quaternion – see the sketch below
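A small sketch of both bullets together; these enum members and calls are in 2017-era OVRInput.cs:

```csharp
using UnityEngine;

public class RemotePointer : MonoBehaviour
{
    void Update()
    {
        // Orientation of the right tracked remote, back as a quaternion.
        transform.localRotation =
            OVRInput.GetLocalControllerRotation(OVRInput.Controller.RTrackedRemote);

        // The Button enum is abstracted across Touch and the Gear VR Controller.
        if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger))
            Debug.Log("Trigger pressed");
    }
}
```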

Potentially To Come: OculusDebugTool, right now it’s only on the Rift


Fill Cost

  • Today the eye buffers aren't rendered at the same resolution as the device; they're 1024 x 1024, about a third of the resolution of the Gear VR display
  • No one comes in fill-bound, but the buffers are 1400 x 1400
  • The way that Chris thinks about fill cost: the total number of pixels that will get touched in a computation, multiplied by the number of times each will be computed/touched (worked example below)
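As a rough worked example of that mental model: a 1024 x 1024 eye buffer is about 1.05M pixels, so two eyes come to roughly 2.1M pixels per frame, or about 126M pixels per second at 60 Hz, and that's with every pixel touched exactly once. A transparent effect layered three deep over the whole view triples the cost for those pixels.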

Draw Calls

The goal is to get the fewest number of these. Check Player Settings > Other Settings for "Static Batching" and "Dynamic Batching" and leave them checked. The rendering path is always "Forward".
  • Draw calls are organized around a mesh
  • Batches are "when you take, say, five meshes of the same material, collect them up, and issue their draw calls in succession (the real-time cost comes from loading in info about each draw)". In the Stats window this shows as the total number of draw calls (you want to keep it under 150); "Saved by Batching" refers to how many draw calls batching eliminated.
  • Static-batched objects are objects that you mark as "Static" in the Inspector on the right side, saying that the object isn't going to move or scale (see the runtime sketch after this list).
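If you'd rather combine at runtime than tick checkboxes, Unity exposes static batching as an API too; a minimal sketch, where the root object is whatever parent holds your never-moving meshes:

```csharp
using UnityEngine;

public class BatchStaticChildren : MonoBehaviour
{
    void Start()
    {
        // Statically batches every mesh under this root, equivalent to
        // marking them "Static": they must not move or scale afterwards.
        StaticBatchingUtility.Combine(gameObject);
    }
}
```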

Movie Textures

This section needs filling out… not sure what specific advice was doled out.

Optimization: Dynamic Versions of Interactive Objects

Colliders get expensive when you start to move them. So how do you build an optimized version of your app/game with interactive objects, but only pay for them on interaction events?
If you want an 'interactable', you don't want the object to be static. But for performance reasons, if you know the object will likely not be moved, for example a pool table, have two pool tables, one static and one dynamic, and the moment someone tries to flip the table, switch in the dynamic one.
Let's put this setup on steroids: now we have 2,000 objects just like this pool table. Should we still do the swap? You don't pay for inactive objects (i.e. the dynamic ones that aren't enabled in the scene), so yes, you can still use this technique of swapping in dynamic versions/instances of your objects (see the sketch below). Let's pause to consider a slightly different angle on this problem…
Say you just want your 2,000 objects to reflect color changes due to environment changes; you can keep your static batching but change the shaders to accommodate this (see the lightning example in the "Lightmap and Lightmap Index" section below). Another way is to instance the material and set it back to the original material once the changes are done.
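A sketch of the pool-table swap, assuming both versions are wired up in the Inspector and something else calls OnInteract() at the moment of the grab:

```csharp
using UnityEngine;

public class SwapOnInteract : MonoBehaviour
{
    public GameObject staticVersion;  // marked Static, batched, no Rigidbody
    public GameObject dynamicVersion; // inactive twin with Rigidbody and colliders

    public void OnInteract()
    {
        // Line the dynamic twin up with the static one, then trade places.
        dynamicVersion.transform.position = staticVersion.transform.position;
        dynamicVersion.transform.rotation = staticVersion.transform.rotation;
        staticVersion.SetActive(false);
        dynamicVersion.SetActive(true); // you only start paying for it here
    }
}
```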

Frame Debug

Use this to walk through all your draw calls. It can be very helpful for understanding how Unity draws your scene. Opaque geometry comes first, followed by transparent objects.

To open it in the Unity Editor: Window > Frame Debugger, then click "Enable" and use the list or slider to step through the draw calls one at a time.

Lightmap and Lightmap Index

  • Window > Lighting > Settings – you can bake your lighting here under "Lightmapping Settings"
  • Please fill out this section more if you have other notes
  • If you wanted to have a crack of lightning or something, the way to do that is to write your own shader that lights all surfaces, for instance by increasing the saturation of every object (see the sketch after this list)
  • In the past, Chris has found it edifying to delve into the code for Unity's shaders, such as Mobile Diffuse or Standard, which are all publicly available
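A sketch of driving that lightning effect from C#, assuming a custom shader (not shown) that reads a global "_LightningBoost" property; the property name is made up:

```csharp
using UnityEngine;

public class LightningFlash : MonoBehaviour
{
    public float decay = 4f; // how quickly the flash fades per second
    float intensity;

    public void Strike() { intensity = 1f; } // call this on a lightning crack

    void Update()
    {
        intensity = Mathf.Max(0f, intensity - decay * Time.deltaTime);
        // "_LightningBoost" is an assumed name your shader must declare.
        Shader.SetGlobalFloat("_LightningBoost", intensity);
    }
}
```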
Let’s say we want all of the assets in the scene to reflect an ominous mood; you can go into Lighting>Settings:
[Screenshot: the Lighting window in Unity 5.6.1f1]

You can barely see it, but on the far right, highlighted in the yellow box, there's a "Source" setting that is set to "Skybox". Set "Source" to Skybox and apply an ominous skybox there by dragging it from the Project window into the slot next to the word "Source".

Oculus SDK for Multiplayer

  • Rooms: once players are in the room they can share info
  • Hard-code a roomID – this helps with info transfer across multiple instances of Unity running (i.e. two different Gear VRs running the same app can share info); see the sketch below
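A hedged sketch of the hard-coded roomID trick, assuming the 2017-era Oculus Platform SDK for Unity; the exact method names and signatures may differ in your SDK version, and the room ID is a placeholder:

```csharp
using Oculus.Platform;
using Oculus.Platform.Models;
using UnityEngine;

public class JoinSharedRoom : MonoBehaviour
{
    const ulong kRoomID = 0; // placeholder: substitute your hard-coded roomID

    void Start()
    {
        Core.Initialize(); // Platform SDK entry point
        Rooms.Join(kRoomID, true).OnComplete((Message<Room> msg) =>
        {
            if (!msg.IsError)
                Debug.Log("Joined the shared room; instances can now share info");
        });
    }
}
```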
Side Notes
  • Specular – computes the same simple (Lambertian) lighting as Diffuse, plus a viewer dependent specular highlight.
  • In some cases, Unity will take meshes that share the same material and batch them (two versions: static mesh batching and dynamic batching); it works based on material pointers.
  • At playtime/buildtime Unity will load a bunch of stuff into the same static combined mesh
  • Progressive Lightmapper
  • Combined meshes can be viewed which is cool
  • Unity isn't going to batch across textures; that's why he (well, a very talented artist) made the atlas. You can try using Unity's API for atlas creation or MeshBaker for similar effects.
  • Set Pass Call – a pass within a shader (some shaders require multiple passes)
  • Unity 5.6 – single-pass stereo rendering halves your draw-call count; in practice the savings come to about a third of all of this
  • If you want to use Oculus Touch to do pointing or a thumbs-up (as in Facebook Spaces), there's a function in one of the scripts called GetNearTouch() that lets you check the sensors on the Touch controllers and toggle a hand-model point/thumbs-up on and off (see the sketch after this list)
  • Mipmaps – further reading
  • Occlusion Culling – what you're able to see or not see at any given second (e.g. Horizon Zero Dawn) – Window > Occlusion Culling
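On the GetNearTouch() note above: OVRInput exposes this capability via a NearTouch enum that reads the capacitive proximity sensors on Touch. A small sketch; the Animator parameter name is made up:

```csharp
using UnityEngine;

public class HandPoseToggle : MonoBehaviour
{
    public Animator handAnimator; // assumed to define a "Pointing" bool

    void Update()
    {
        // True while the index finger rests near/on the trigger.
        bool nearTrigger = OVRInput.Get(OVRInput.NearTouch.PrimaryIndexTrigger);
        handAnimator.SetBool("Pointing", !nearTrigger); // finger lifted = point
    }
}
```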