Bare Hand Input (BHI) with Virtual Reality

User experience (UX) designers often build products that rely on common knowledge. When that works out, the result is an intuitive product. Consider a drinking straw, for example: it relies on the common knowledge of sucking and requires little explanation.¹

On one hand, this dependence on prior knowledge increases comfort, familiarity, and usability. On the other hand, get it wrong and confusion ensues.

For example, how would you pick up a pamphlet from a table? You’re probably thinking, uh, I reach out and pick it up. That action is so automatic to us that we don’t think about how complex it is for someone who hasn’t learned the behavior. Now how would you pick up a pamphlet from a table in a virtual environment with a hand controller? Do you try to mimic the way the hand reaches out and touches the pamphlet? Do you have the user touch a hand controller to the pamphlet and pull a trigger to grab it? Do you point a laser at the pamphlet and have it levitate?

You can see that mimicking hand movements with a controller quickly devolves into something a first-time user needs explained. And with multiple VR controller designs on the market (e.g. HTC Vive controllers, Oculus Touch controllers, PlayStation Move), no interaction standard has yet been established.

Evolutionary Design

In the past, we’ve interacted with virtual content in unnatural ways. Previous innovations taught us to map our thoughts to typing on a QWERTY keyboard, clicking a mouse, internalizing the button layout of a gamepad, or swiping on a trackpad. Innovation has progressed, but we still haven’t fully solved natural human interaction.

Wouldn’t it be amazing to interact with virtual environments with nothing but your hands? Using the pamphlet example, instead of pointing a controller at it, you could just grab it as you normally would. Or you could pinch and pull a virtual window to resize it, no controller required. That’s what researchers, designers, and users really want: direct manual input (or the sensation of it), and there’s no shortage of people working to figure it out. Going back to 1985, the contenders have included the NASA Ames Research Center’s DataGloves, Mark Bolas’ Fakespace Labs Pinch Gloves, Manus VR, Leap Motion, Gloveone, Samsung’s Rink, and others.

Revolutionary Design

A full survey of this space is beyond the scope of this article, but I believe the most exciting option is something called bare hand input (BHI).

Project Soli is a miniature Doppler radar paired with very sophisticated hand-gesture signal recognition. Ivan Poupyrev and his team at Google’s Advanced Technology and Projects (ATAP) division developed Soli.
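To make the idea concrete, here is a toy sketch of how a radar-based gesture recognizer could work in principle. To be clear, this is not Soli’s actual pipeline: the feature choices, gesture names, and the nearest-centroid classifier are all simplifying assumptions of mine.

```python
# Conceptual sketch of radar-based gesture recognition -- NOT Soli's real
# pipeline. Features, gesture names, and the classifier are illustrative.
import numpy as np

def range_doppler_features(frames: np.ndarray) -> np.ndarray:
    """Reduce a stack of range-Doppler maps (time x range x doppler)
    to a tiny feature vector: energy statistics plus the dominant
    Doppler bin and its spread."""
    energy = frames.mean(axis=(1, 2))            # per-frame signal energy
    doppler_profile = frames.mean(axis=(0, 1))   # averaged Doppler profile
    return np.array([energy.mean(), energy.std(),
                     float(np.argmax(doppler_profile)),
                     doppler_profile.std()])

class NearestCentroidGestures:
    """Toy classifier: one mean feature vector per gesture; a new clip
    gets the label of the closest centroid."""
    def __init__(self):
        self.centroids = {}

    def fit(self, examples):
        # examples: {"dial_turn": [clip, clip, ...], ...}
        for gesture, clips in examples.items():
            feats = [range_doppler_features(c) for c in clips]
            self.centroids[gesture] = np.mean(feats, axis=0)

    def predict(self, frames):
        f = range_doppler_features(frames)
        return min(self.centroids,
                   key=lambda g: np.linalg.norm(f - self.centroids[g]))

# Demo on random stand-in data (real input would be radar frames).
rng = np.random.default_rng(0)
clf = NearestCentroidGestures()
clf.fit({"dial_turn": [rng.random((8, 16, 32)) for _ in range(3)],
         "button_press": [rng.random((8, 16, 32)) for _ in range(3)]})
print(clf.predict(rng.random((8, 16, 32))))
```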

Google’s Soli technology is meant to be integrated into wearables, phones, computers, cars, and other IoT devices. As you think about the gestures you use in everyday activities, you will begin to notice patterns. Turning a key. Turning a shower faucet. Turning a door handle. Turning the page of a book.

The Project Soli team is focused on codifying those patterns into meaningful standards for BHI. The most exciting innovation, to me, is that repository of gestures. One example: a model uses his thumb and index finger to form a slide-adjuster gesture, perhaps to control the volume on a speaker. (I actually considered ‘touch-free input’ when deciding on a name for BHI, but since touch between fingers is useful in many applications, that seemed inappropriate.)
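To imagine what consuming such a standardized gesture might look like in application code, here is a minimal sketch of a pinch-and-slide volume control. The fingertip inputs are my own stand-ins for whatever per-frame hand data a real tracking SDK would report; nothing here is a real Soli API.

```python
# Hypothetical sketch: mapping a thumb-to-index "slider" gesture to a
# volume level. Fingertip positions are in metres and stand in for a
# real hand-tracking SDK's per-frame output.
from typing import Optional

class PinchSlider:
    PINCH_THRESHOLD = 0.02   # fingers closer than 2 cm count as touching

    def __init__(self, sensitivity: float = 2.0, volume: float = 0.5):
        self.sensitivity = sensitivity   # volume change per metre of travel
        self.volume = volume             # normalised 0..1
        self._last_x: Optional[float] = None

    def update(self, thumb_x: float, index_x: float,
               pinch_distance: float) -> float:
        """Call once per tracking frame; adjusts volume only while pinched."""
        if pinch_distance < self.PINCH_THRESHOLD:
            x = (thumb_x + index_x) / 2          # midpoint of the pinch
            if self._last_x is not None:
                delta = (x - self._last_x) * self.sensitivity
                self.volume = min(1.0, max(0.0, self.volume + delta))
            self._last_x = x
        else:
            self._last_x = None                  # pinch released: re-anchor
        return self.volume

# Sliding the pinched fingers 10 cm to the right raises volume by 0.2.
slider = PinchSlider()
slider.update(0.00, 0.01, 0.01)
print(slider.update(0.10, 0.11, 0.01))   # -> 0.7
```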

For a product example, take a look at Bixi, which uses 3D touch-free input to control a GoPro camera in tricky situations. Back to VR: BHI is interesting for Cardboard and mobile VR, which have limited controls to begin with. Generally, those controls are some variation of gaze, swipe, and click/tap. The Soli BHI scheme may prove ideal for tethered HMDs as well.

However, there are still things that BHI doesn’t make more intuitive, such as locomotion. VR products with controller input schemes currently use variations of point-and-click to teleport. Such products would need a BHI replacement, like pointing with the index finger, thumb cocked back, in the 3D direction you wish to move.
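Here’s a small sketch of the vector math such a replacement implies: treat the extended index finger as a ray and teleport to where it meets the floor plane. The hand-pose inputs are assumed to come from the tracking system; the function and names are mine, not from any particular SDK.

```python
# Sketch of the "finger gun" teleport idea: cast a ray from the index
# fingertip along the pointing direction and intersect the floor plane.
import numpy as np

def teleport_target(index_tip: np.ndarray, index_base: np.ndarray,
                    floor_y: float = 0.0):
    """Return the floor point the finger is aimed at, or None if the
    ray points at or above the horizon."""
    direction = index_tip - index_base
    direction = direction / np.linalg.norm(direction)
    if direction[1] >= 0:          # pointing upward: no floor intersection
        return None
    t = (floor_y - index_tip[1]) / direction[1]
    return index_tip + t * direction

# Example: finger aimed slightly downward and forward from ~1.4 m up
# lands about 2.8 m ahead on the floor.
print(teleport_target(np.array([0.0, 1.40, 0.0]),
                      np.array([0.0, 1.45, 0.1])))
```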

That said, this technology might fall short for premium VR content that demands realistic emulation of the sense of touch: content where you need to feel rain on your hands, the cold metal of a sword hilt, or the reverberation of a baseball bat after connecting with a ball. Understandably, some systems aim for that level of immersion. My opinion today is that we are far from people buying accessories like haptic gloves. I’d contend that for most VR use cases, good VR UX design lets makers use the natural feeling of our fingers contacting one another to mitigate the loss of the sense of touch.

I’m looking forward to getting hands-on (funny, right?) with BHI for VR, and I’d like to see what others create as well. Please tweet us your thoughts.

¹ Mike Alger on VR UX
