Reblog: Adventure at the 5th Oculus Connect Conference

oculus_connect_5_dilan.png

The following is a write-up from a friend, Kathryn Hicks, on the Danse blog. The link to the original is at the bottom.

Last week I attended the 5th Oculus Connect Conference, held at the San Jose McEnery Convention Center. This two-day conference is held annually in the fall and showcases the new virtual reality technology from Oculus. It was my second time attending, and it felt even better than the last one.

During the Keynote address, Zuckerberg announced a wireless headset that doesn’t need a cell phone or an external computer. The Quest, a standalone headset with six degrees of freedom and Touch controllers, is a potential game-changer for the VR industry. If you are familiar with the Rift and the Oculus Go, the Quest is a marriage of the two. The Quest is scheduled to come out this spring at $399, and a lot of the Rift titles will be available on it. Unfortunately, I was not able to try it, but the feedback I heard from others was positive. The tetherless aspect of the headset creates a more immersive experience and doesn’t feel confining. While the graphics capabilities of the headset are not as high as the Rift’s, they are good enough and don’t hinder the experience. Plus the optics, as well as the sound, have improved from the Oculus Go. On the downside, the Quest is reportedly top-heavy and denser than the Go, and I already find the Go more substantial than the lightweight Rift. Since the Quest’s four inside-out cameras face forward, you could potentially lose tracking if you move the controllers behind you. Hopefully, they will make these adjustments before it launches in the spring, perhaps adding tracking on the strap. I can see much potential for the Quest in eSports, education, business, medicine, engineering, set design; the list goes on. The possibilities are endless, and at that price point it could substantially increase the number of VR users, considering that the Quest will cost about the same as most gaming consoles without needing a television or a home setup.

Walking around the conference was lovely; I felt like a kid in a candy store seeing people putting their full body into the Quest. The well-orchestrated design layouts and themes of the different experiences were terrific. It was a pleasure hearing eSports commentary and cheers as competitors went head to head playing Echo Arena and Onward. Seeing the VR community connect, share laughs, smile, and have a good time warmed my heart. I enjoyed watching people play the Dead & Buried Quest experience in a large arena and seeing their digital avatars battle each other on screen. I can see more VR arenas being built specifically for the Quest, kind of like skate parks or soccer parks, but with a sports stadium vibe.

While I was at the conference, I tried a few experiences like The Void – Star Wars: Secrets of the Empire, which is a full sensory VR experience. You are an undercover Rebel fighter disguised as a Stormtrooper, and as a user you get to fully interact with your teammates and feel and smell the environment around you. It was a fantastic experience, and I would encourage others to try it at one of the nine locations.

Another experience I tried was Wolves in the Walls, a VR adaptation of Neil Gaiman’s book created by the company Fable. The audience explores parts of Lucy’s house to try and find hidden wolves in the walls. It was a more intimate experience, and Lucy’s performance felt pretty lifelike. The environments and character designs were beautifully portrayed. Overall it was an enjoyable VR experience.

I also played a multiplayer combat experience called Conjure Strike by The Strike Team. It’s an engaging multiplayer experience in which you can play as different rock-like characters with different classes, such as Elementalist, Mage Hunter, Earth Warden, and more. The multiplayer session I played was similar to a capture-the-flag game: one player has to push a box toward the other side while the opposing player tries to stop them. It was a fun experience, similar to Overwatch but in VR. The multiplayer mechanics were excellent, but some of the controls felt foreign to me. Overall it’s an engaging game that seems like it would be popular amongst most VR users.

While I didn’t get to play as many demos as I would have liked, I enjoyed the ones I experienced, especially The Void. It was the most immersive experience I tried; the few things I would change are updating the headset and enhancing the outside temperature and wind strength.

I’m looking forward to more development put toward the Quest, and I’m optimistic about the future of VR. As a team member at The Danse, I am excited to work on projects utilizing immersive technology such as virtual and augmented reality, and to work in an industry that is ever changing and improving. It’s nice coming back to the Oculus Connect Conference and seeing the community excited about the future of VR.

Design Iteration for Oculus Go

 


With six-degrees-of-freedom headsets like the Oculus Rift and HTC Vive, when working in Unreal or Unity3D, it takes only a push of the Play button to test your application in the headset.

There are advantages to seeing your scene from within your headset, such as checking how your first-person perspective is developing, checking performance metrics in the HUD, checking in on rendering weirdness, or correcting for relative spacing. However, the constraint of having to deploy by building and running to the Oculus Go each time you need to check something can lessen your appetite for quick checks like these. Besides, sometimes it’s not even necessary.

That’s why a quick way of iterating on your scene using traditional desktop inputs is nice. A typical approach is to duplicate a scene currently under construction into two versions: one called “site tour”, for example, and another called “site tour desktop”. The naming convention splits up functionality so that when you need to test something using mouse and keyboard, you quickly hop into the “site tour desktop” scene. Some example mappings include UI navigation with a pointer, or locomotion. UI navigation can be done using the left mouse button and cursor instead of shipping to the Go and using the hand controller. Locomotion can be done using the ‘w’, ‘a’, ‘s’, and ‘d’ keys, as is common to most FPS games, to move around the space, with mouse click-and-drag to move your head instead of having to teleport.
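The mouse-pointer side of that mapping can be sketched with a screen-point raycast in Unity (a minimal sketch; the Clickable component and OnClick method are hypothetical names for illustration, and your world-space UI elements are assumed to have colliders):

```csharp
using UnityEngine;

// Hypothetical marker component for world-space UI elements that respond to clicks.
public class Clickable : MonoBehaviour
{
    public void OnClick()
    {
        Debug.Log(name + " clicked");
        // e.g. trigger the same handler your Go controller pointer would.
    }
}

// In the "desktop" scene, use the mouse cursor instead of the Go's hand
// controller: cast a ray from the camera through the cursor on each click.
public class DesktopPointer : MonoBehaviour
{
    public Camera cam;

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = cam.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit))
            {
                var clickable = hit.collider.GetComponent<Clickable>();
                if (clickable != null)
                    clickable.OnClick();
            }
        }
    }
}
```

In a real project each MonoBehaviour would live in its own file named after the class; the point is just that the desktop scene can reuse the same click handlers the Go pointer drives.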

Diving deeper on the locomotion example

By throwing on headphones and using a Fly script applied to the Main Camera, you can test quickly using WASD within the Unity editor and check relevant aspects of your lighting, audio, animations, etc. without needing to wear the Go. A sample:

using UnityEngine;

// Simple fly script for desktop testing: WASD to move, hold the left mouse
// button to look around, scroll wheel to rise and fall.
public class Fly : MonoBehaviour
{
    public float lookSpeed = 2.0f;
    public float moveSpeed = 0.1f;

    private float yaw;
    private float pitch;

    void Update()
    {
        // Hold the left mouse button and drag to look around.
        if (Input.GetMouseButton(0))
        {
            yaw += Input.GetAxis("Mouse X") * lookSpeed;
            pitch += Input.GetAxis("Mouse Y") * lookSpeed;
            pitch = Mathf.Clamp(pitch, -90.0f, 90.0f);
        }

        transform.localRotation = Quaternion.AngleAxis(yaw, Vector3.up);
        transform.localRotation *= Quaternion.AngleAxis(pitch, Vector3.left);

        // WASD (or arrow keys) to move, scroll wheel for vertical movement.
        transform.position += transform.forward * moveSpeed * Input.GetAxis("Vertical");
        transform.position += transform.right * moveSpeed * Input.GetAxis("Horizontal");
        transform.position += transform.up * 3 * moveSpeed * Input.GetAxis("Mouse ScrollWheel");
    }
}

For the purposes of testing out spatial audio, I’ve noticed this works great: mimicking head movement by panning with the mouse X axis.

 

Turning to the Oculus Rift

For what it’s worth in a post that’s supposed to be about the Oculus Go design iteration loop: while currently working on an Oculus Go app, a friend and I have found swapping the project over to the Oculus Rift to be really helpful.

What this does is let you take advantage of the Oculus Rift during Play Mode in Unity, which gives way to much faster iteration time. It’s perfect for quick fixes to code and for checking the cohesion of various parts (for example, teleportation and UI).

Relationships Matter: Maximizing Retention in VR

 

Isabel Tewes
isabel@oculus.com

There are many ways to measure success, but coming from the mobile world (push-notification strategy, habit-forming retention mini-games, funnel analysis, making a real difference when multi-million userbases exist), Isabel talked about retention today.

Retention defined

When someone loves your app and comes back to it time and time again.

Make a great first impression

  • pinpoint your magic
  • get to that moment quickly
  • guide people through their first experience

Share your personality

  • create a tone and stay consistent
  • rethink your interactions
  • identify the pain points
  • design against them / take advantage of them

Create a lasting connection

  • make the right decisions early

First Contact – Bernie Yee

He focused on how VR can be really overwhelming, and how having someone acknowledge your actions can be really powerful.

The Significance of the Robot Waving – the way the robot waves to you at the beginning of the experience draws upon a universal sign: you know you’re supposed to wave back. The personality of your wave then comes out as well.

Wayfinding – the robot helps guide users through the experience, directing your attention to where you should be going.

Nudge – Nudge your users patiently and with intent

Rick and Morty – Virtual Rick-ality

Establish a tone and be consistent

Against Gravity – Rec Room

Create a safe environment that people come back to
Minimizing trolling and harassment
“Whatever you are when your [organization] is small remember you’ll only be a larger version of that”

Making friends in Rec Room
Two people become friends in Rec Room by shaking hands.

High fiving in Rec Room

Upshot: Create your values early and stick to your values ruthlessly.

 

Reflection: Bigscreen VR

Before I continue, I’ll take a step back and define any VR application that brings people together in an environment as “social VR”. What makes Bigscreen interesting is the paradox of choice. In other social applications (VRChat, AltspaceVR, High Fidelity), there are no core activities beyond being together, which makes the choice of what to do very broad. With Bigscreen, it’s a display extension or a place to watch movies with a cinema experience. Simple.
Recently I saw that they enticed Paramount (?) studios to do a premiere of Top Gun in VR. I thought that was a nice blend of social VR with a social construct we know and love, going to the movies––though I didn’t attend the screening. I’ve used Bigscreen recently, and the environments are nice, with physically based shadows and lighting. Bigscreen’s reliance on virtual displays makes it well positioned to benefit from forthcoming improvements in display clarity.
multiplayer-bigscreen-1400
One thing worth noting is that I haven’t been able to get audio to work for all people in a living room setting. When putting your content on the big screen, audio seems to play only for you. This wasn’t the case in the movie theater, where a host had no problem playing Rogue One off of Netflix for all to enjoy with sound.
Finally, when Oculus Home released its Core 2.0 update, everyone on Rift gained the ability to see and use their desktop screen in VR. SteamVR also enables desktop viewing. Viveport? Although today Oculus Home doesn’t offer social features the way Bigscreen does… this probably affects the uniqueness of their total product and must be considered for continued fundraising.

OC4 Talks: Designing for Feeling – Robin Hunicke


Philosophy of Exploration and Design

Robin opened with her concept of triple-E content (a play on AAA, disambiguated below) and extolled the value of figuring out where you want to go first.

  • Elegant, Expressive, and Emotional content (EEE)
  • She presented a 2×2 matrix with high impact, low cost as the quadrant where most content aims… the problem, she expressed, was that the matrix leaves out elegance as a focal point

Tips

  • Evolve concepts, tools, & solutions, to reduce cost & improve impact
  • Evolve ux
  • Expressive – Players Speak

Process & the Broad Applicability of EEE

Axes in her slide graphic included rational, eee, baroque, and scripted (e.g. Sims, Black ops)

  1. Test your concept like it isn’t yours
  2. Throw away ideas
  3. Find the feeling in your idea (lock in on it)
  4. This is your secret sauce
  5. Test the prototype like it isn’t yours
  6. the prototype is different than what is on paper
  7. the process is what helps
  8. Repeat

Luna

Uncertainty is surpassed only by the effort that needs to go into it

For Luna she took inspiration for the design from a paper world feel, influenced by origami, and during the process she packed her mind with fairytales

Not everyone needs to get into hands-on design influences, but she thought that making origami, exploring the concepts, and learning how the tactile quality turned out were really informative. I’ve definitely found with Project Futures: The Future of Farming that it’s really key to gain some influence from real-world knowledge and from folks who have built constructs or structures that will lend to the look and feel of the world space in the app. Namely Infarm.

One important side note Robin dropped was that none of the characters in Luna have genders.

Other Random Notes

  • Mood boards
  • Luna started out as a PC and VR title from the beginning
  • The demo and vision existed before the actual prototype (i.e. the hands
    controlling the stars)
  • Tested prototype part 2 and threw it away
  • Music is integrated into the testing process with feeling at the center, namely, “what kind of feeling is it communicating?”

Timeline
A four-year process for Luna – it started out as a drawing in a book

  • They went through a massive phase where no VR was implemented; then in November 2016 it came to life in VR (7-person team)
  • By 2017 the pieces were starting to become cohesive and informed by the feeling

Failing forward was key; it takes a lot of work.

You have to lean into the idea of interesting, different, challenging titles.

UPSHOT = diverse and inclusive teams, accepting that failure is OK, and the belief that you’re going to get there. These lead to triple-E content and successful titles.

Update: Oculus Launch Pad 2017: Future of Farming

Everything is hierarchical in the brain. In VR design, my hypothesis is that this can be really helpful for setting context for users. For example, at Virtually Live, where the VR content is “Formula E Season 2 Highlights” (meaning the one donning the headset is able to watch races), I once proposed that we borrow from the amazing UX of Realities.io and use an interactive model of Earth as the highest level of abstraction above the races (which occur all over the world). The user can spin the globe around and find a location to load into. Written abstractly, the hierarchy in this example is: Globe is a superset of Countries, Countries of Cities, and Cities of Places. I figured this would be perfect for an electric motorsport championship series that travels to famous cities each month. In the end, we went with a carousel design that was more expeditious than the globe.

The Future of Farming takes place largely in a metropolitan area, namely San Francisco. So I’ve decided that, to begin, I’ll borrow from the hierarchical plan. I want to showcase an orthographic projection of San Francisco to the user, with maybe a handful of locations highlighted as interactable. To do this I’ve set up WRLD in my project for the city landscape.

Upon selection of one of the highlighted locations with the GearVR controller, a scene will load with a focal piece of farming equipment that has made its way into the type of place (e.g. Warehouse, House, or Apartment, etc.).

A quick aside: last week I had a tough travel and work schedule to New York. Upon reading back what I wrote, I came upon a pretty bare blog post, so I decided it was better not to share. Another hurdle was the unfortunate loss of the teammate I announced two weeks prior, simply due to his prioritizing projects with budgets more appealing to him. I dwelled on this for a while, as I admired his plant modeling work a lot. With the loss of that collaborator, and weighing a few other factors, I’ve decided to pursue an art style much akin to that of Virtual Virtual Reality or Superhot: less geometry, all created in VR. I’m doing most of this via Google Blocks and a workflow of pushing created environments to Unity, which is pretty straightforward. After you have created your model in Google Blocks, visit it in an Internet browser with WebGL-friendly settings and download your model. From there, you can unzip that file and drag it into Unity under Assets > Blocks Import, a folder I recommend you create as a way of staying organized. You’ll note that Blocks imports usually consist of a .mtl file, a Materials folder, and a .obj model. In order to have your intended Google Blocks materials show through, you need to change one setting, “Material Naming”, after you’ve clicked on your .obj: change it to “By Base Texture Name”, and “Material Search” can be set to “Recursive Up”.
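If you import Blocks models regularly, those two import settings can be applied automatically with a small editor script (a minimal sketch, assuming the Assets > Blocks Import folder from above; it must live in a folder named Editor since it uses the UnityEditor API):

```csharp
using UnityEditor;

// Automatically applies the material settings described above to any model
// imported under Assets/Blocks Import, so you don't have to set them by hand.
public class BlocksImportSettings : AssetPostprocessor
{
    void OnPreprocessModel()
    {
        // Only touch models inside the Blocks Import folder.
        if (!assetPath.StartsWith("Assets/Blocks Import"))
            return;

        var importer = (ModelImporter)assetImporter;
        // "Material Naming" -> "By Base Texture Name"
        importer.materialName = ModelImporterMaterialName.BasedOnTextureName;
        // "Material Search" -> "Recursive Up"
        importer.materialSearch = ModelImporterMaterialSearch.RecursiveUp;
    }
}
```

With this in place, every .obj you drag into that folder imports with the right material settings on first import, instead of requiring a manual pass in the Inspector.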

Unity_2017_1_1f1_Personal__64bit__-_blocks2unity_unity_-_Blocks_Tutorial_-_Android__Personal___OpenGL_4_1_

 

Here’s a look at the artwork for a studio apartment in SF for the app, as viewed from above. It’s a public bedroom that I’m remixing; you can see I’ve added a triangular floor space for a kitchen, which is likely where the windowsill variety of hydroponic crop equipment will go. Modeling one such piece is going to be really fun.

 

 

View from Above

 

room

Angle View

 

 

 

In the past weeks, I’ve dedicated myself to edification on gardening and farming practices via readings, podcasts, and talking to people in business ecosystems involving food product suppliers. I learned about growing shiitake mushrooms and broccoli sprouts in the home and got hands-on with these. I learned about the technology evolution behind rice cookers and about relevant policy for farmers on the West Coast over the last dozen years. In the industry, there are a number of effective farming methods (indoor hydroponic and aeroponic) that I’m planning to draw on and can see working in some capacity in the home, as well as milieus I will highlight, such as a legitimate vertical indoor farm facility (https://techcrunch.com/2017/07/19/billionaires-make-it-rain-on-plenty-the-indoor-farming-startup/).

I have asked for help from a design consultant standpoint from someone that works at Local Bushel.

To expound on why Local Bushel is perhaps a helpful reference point: Local Bushel is a community of individuals dedicated to increasing our consumption of responsibly raised food. Their values align well with edifying me (the creator) about the marketplace whose future I want to project. Those values are:

  1. Fostering Community
  2. Being Sustainable and Responsible
  3. Providing High Quality, Fresh Ingredients

——
For interactions, I can start simple and use info cards or move between scenes based on the orientation of the user’s head, using raycasts. I’ll work in the Oculus Gear VR Controller eventually.
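That head-orientation interaction can be sketched as a forward raycast from the camera (a minimal sketch; GazeTarget and OnGazeSelect are hypothetical names for illustration, not part of any Oculus API):

```csharp
using UnityEngine;

// Hypothetical marker component for objects the user can select by looking at them,
// e.g. the highlighted city locations.
public class GazeTarget : MonoBehaviour
{
    public void OnGazeSelect()
    {
        Debug.Log(name + " selected by gaze");
        // e.g. show an info card here, or load the location's scene.
    }
}

// Attach to the camera rig: each frame, cast a ray along the head's forward
// direction and select whatever GazeTarget the user is looking at.
public class GazeSelector : MonoBehaviour
{
    public Camera head;            // the VR camera
    public float maxDistance = 20f;

    void Update()
    {
        var ray = new Ray(head.transform.position, head.transform.forward);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, maxDistance))
        {
            var target = hit.collider.GetComponent<GazeTarget>();
            if (target != null)
                target.OnGazeSelect();
        }
    }
}
```

As written this fires every frame while the user looks at a target; a real app would debounce with a dwell timer or wait for a controller click, which is where the Gear VR Controller would slot in later.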