
Team Occa – Real World Interactions

Systems & Tools 29 January 2015 | 4.45pm

In our previous projects using the Oculus Rift, Team Occa have relied heavily on traditional ‘button press’ inputs to interact with our models; movement through the virtual environment has been largely dictated by keyboard or Xbox controller input. When we shared these demos with our colleagues at the Melbourne Arup office, many users said they’d like to be able to see their own hands or body in the model; it feels very natural to wave your arms and point at things while immersed in the Oculus environment, but unfortunately no one else can see what you’re pointing at. So this week we pursued ways to get our own hands and bodies into the Oculus virtual space.

We’re currently working with two separate devices to achieve this: the Microsoft Kinect 2 and the Leap Motion.

The Leap Motion provides a simple way to get your hands into an Oculus Rift virtual environment. Its SDK includes inbuilt functions that make it easy to track the position of your hands, recognise the gesture you’re making, and even determine the direction you’re pointing. Andrew has been working to integrate the Leap Motion into the existing ‘Speech Privacy’ project as a way to test the device’s capability. In our previous Speech Privacy demo, if you looked down while in the scene you would see the (somewhat crudely rendered) motionless body of another person. With the addition of the Leap Motion, you can now look down and see a pair of digital hands moving in sync with your own, which feels much more natural. After updating to the latest Leap Motion software, we found that the device is quite accurate at detecting changes in finger gestures, but less accurate at detecting a hand swiping in front of it.
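
For the curious, the core of this integration in Unity looks roughly like the sketch below. The Controller, Frame and Hand types come straight from the Leap C# SDK of the time; the placeholder hand object, the scale factor and the camera-rig offset are illustrative simplifications rather than our exact Speech Privacy code.

```csharp
using UnityEngine;
using Leap; // Leap Motion C# SDK (v2-era API)

// A minimal sketch: poll the Leap controller each frame and move a
// placeholder hand object to follow the user's real palm.
public class LeapHandFollower : MonoBehaviour
{
    public Transform handModel; // placeholder hand mesh (our assumption)
    private Controller controller;

    void Start()
    {
        controller = new Controller();
    }

    void Update()
    {
        Frame frame = controller.Frame(); // most recent tracking frame
        if (frame.Hands.Count == 0) return;

        Hand hand = frame.Hands[0];
        // Leap reports millimetres, right-handed, relative to the device;
        // convert to metres, flip Z for Unity, and offset from this object
        // (e.g. a mount point just below the Oculus camera rig).
        Vector palm = hand.PalmPosition;
        Vector3 local = new Vector3(palm.x, palm.y, -palm.z) * 0.001f;
        handModel.position = transform.position + transform.rotation * local;
    }
}
```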

Leap Motion hands inside a VR environment

Because of its strength in detecting finger gestures, we decided to explore pointing as a way to interact with our models and add more value for the user. The pointing is quite accurate, and we’re hoping to use it to select objects.
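
Selection by pointing can be sketched the same way: take the frontmost extended finger, cast a ray along its direction, and see what it hits. The Leap calls below are from the SDK; the scene setup, ray length and logging are our own illustrative assumptions.

```csharp
using UnityEngine;
using Leap;

// A sketch of selecting objects by pointing: raycast along the
// frontmost finger's direction and report the collider it hits.
public class LeapPointerSelect : MonoBehaviour
{
    private Controller controller = new Controller();

    void Update()
    {
        Frame frame = controller.Frame();
        if (frame.Pointables.Count == 0) return;

        Pointable pointer = frame.Pointables.Frontmost; // finger or tool
        Vector3 origin = ToUnity(pointer.TipPosition) * 0.001f; // mm to m
        Vector3 direction = ToUnity(pointer.Direction);          // unit vector

        RaycastHit hit;
        if (Physics.Raycast(transform.position + transform.rotation * origin,
                            transform.rotation * direction, out hit, 10f))
        {
            Debug.Log("Pointing at: " + hit.collider.name);
        }
    }

    // Leap's right-handed axes to Unity's left-handed axes.
    Vector3 ToUnity(Vector v)
    {
        return new Vector3(v.x, v.y, -v.z);
    }
}
```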

Using Leap Motion to point at objects

Using the Leap Motion, we can also interact with the virtual environment directly; for example, a swipe of the hand could knock the computer in the photo above right off its virtual table. This interaction, however, takes a little getting used to, as it’s hard to perceive the depth of the scene and there isn’t any haptic feedback. We’re hopeful that the depth issue can be mitigated by playing with the scale of the models in Oculus.
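
As a rough illustration of that swipe interaction: the Leap SDK’s built-in swipe gesture reports a direction and speed, which can be turned into a physics impulse. The target rigidbody and the force scaling below are hypothetical stand-ins for the real scene.

```csharp
using UnityEngine;
using Leap;

// A sketch of 'swipe to knock things over': when a swipe gesture
// completes, apply an impulse to a target rigidbody along the swipe.
public class LeapSwipeKnock : MonoBehaviour
{
    public Rigidbody target; // e.g. the virtual computer (our assumption)
    private Controller controller = new Controller();

    void Start()
    {
        // Gestures are off by default and must be enabled per type.
        controller.EnableGesture(Gesture.GestureType.TYPE_SWIPE);
    }

    void Update()
    {
        foreach (Gesture g in controller.Frame().Gestures())
        {
            if (g.Type != Gesture.GestureType.TYPE_SWIPE) continue;
            if (g.State != Gesture.GestureState.STATE_STOP) continue; // fire once

            SwipeGesture swipe = new SwipeGesture(g);
            Vector3 dir = new Vector3(swipe.Direction.x,
                                      swipe.Direction.y,
                                      -swipe.Direction.z);
            // Swipe speed is reported in mm/s; the impulse scaling is a guess.
            target.AddForce(dir * swipe.Speed * 0.01f, ForceMode.Impulse);
        }
    }
}
```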

The Kinect 2 works similarly to the Leap Motion, but on a much grander scale: your entire body can be tracked. However, the Kinect is more difficult than the Leap Motion to integrate with the Oculus. Visualising a stick-figure model of your body within an Oculus environment takes a moderate amount of wizardry, as there isn’t an existing Oculus and Kinect integration package for Unity (the game engine we’re developing on), and the Kinect-to-Unity package is severely lacking.
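
To give a feel for what’s involved on the Kinect side, the sketch below uses Microsoft’s Kinect v2 plugin for Unity to acquire the latest body frame and draw a single bone between two tracked joints; a full stick body repeats the same idea for every joint pair. The coordinate handling here is deliberately simplified.

```csharp
using UnityEngine;
using Windows.Kinect; // Microsoft's Kinect v2 plugin for Unity

// A sketch of reading Kinect 2 body data in Unity: refresh the body
// array each frame and draw one bone (right shoulder to right hand).
public class KinectStickBody : MonoBehaviour
{
    private KinectSensor sensor;
    private BodyFrameReader reader;
    private Body[] bodies;

    void Start()
    {
        sensor = KinectSensor.GetDefault();
        reader = sensor.BodyFrameSource.OpenReader();
        bodies = new Body[sensor.BodyFrameSource.BodyCount];
        sensor.Open();
    }

    void Update()
    {
        using (BodyFrame frame = reader.AcquireLatestFrame())
        {
            if (frame == null) return;
            frame.GetAndRefreshBodyData(bodies);
        }

        foreach (Body body in bodies)
        {
            if (body == null || !body.IsTracked) continue;
            Vector3 shoulder = ToUnity(body.Joints[JointType.ShoulderRight].Position);
            Vector3 hand = ToUnity(body.Joints[JointType.HandRight].Position);
            Debug.DrawLine(shoulder, hand, Color.green);
        }
    }

    void OnApplicationQuit()
    {
        reader.Dispose();
        sensor.Close();
    }

    // Kinect camera space is already in metres; flip Z for Unity's axes.
    Vector3 ToUnity(CameraSpacePoint p)
    {
        return new Vector3(p.X, p.Y, -p.Z);
    }
}
```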

A notable problem with integrating the Kinect 2 and the Oculus was that their respective head-tracking systems conflicted with each other, giving the user the surreal experience of being able to look at their own stick-body’s head. We also encountered issues because the Kinect’s default output is a mirror image of the real world; this is useful when watching your avatar on a screen in front of you, but combined with the Oculus it means your left stick-body hand moves when in reality you’re moving your right hand. With some nifty script work we were able to fix these issues, so we can now have one person in the Oculus scene comfortably looking down at their own stick-body.
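
The fixes themselves are conceptually simple, even if arriving at them wasn’t. Sketched below with our own hypothetical naming: negate the X axis to undo the Kinect’s mirroring, and skip the head and neck joints of the local wearer so the Kinect skeleton doesn’t fight the Rift’s own head tracking.

```csharp
using UnityEngine;
using Windows.Kinect;

// A sketch of the two fixes described above (naming is ours).
public static class KinectOculusFixes
{
    // The Kinect's default output is mirrored relative to the wearer;
    // negating X un-mirrors it (Z is flipped for Unity's axes as usual).
    public static Vector3 ToUnityUnmirrored(CameraSpacePoint p)
    {
        return new Vector3(-p.X, p.Y, -p.Z);
    }

    // Skip any bone touching the local wearer's head or neck, so the only
    // head motion the wearer perceives comes from the Rift's tracker.
    public static bool ShouldDrawBone(JointType a, JointType b, bool isLocalBody)
    {
        if (!isLocalBody) return true;
        return a != JointType.Head && b != JointType.Head
            && a != JointType.Neck && b != JointType.Neck;
    }
}
```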

Ash and Michael testing out the Kinect 2 in Oculus

Michael and Ash’s stick bodies (as Michael is wearing the Oculus, the camera symbol is on his ‘head’)

The view of the stick bodies from inside Oculus, looking down Michael’s arm at Ash

Our ultimate goal for the Kinect is to have two different people, each wearing an Oculus in front of a Kinect, able to see each other in the same scene. This would enable us to take clients on ‘guided tours’ inside a model – an end result that would benefit both parties. The person ‘leading’ the tour would act as an obvious marker for the other person to follow, and the person taking the tour would be able to gesture at objects they’d like more information about. Our progress so far allows us to transmit body stance data from one computer to another using Unity’s built-in networking, as sketched below. Eventually, this work could allow us to discuss a project within a virtual space with people on the other side of the world.
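
As a rough sketch of how that works with Unity’s (4.x-era) built-in networking: the machine with the Kinect pushes a few key joint positions over an RPC each frame, and the receiving machine poses a remote stick body from them. The joint selection and names below are simplified assumptions, not our exact implementation.

```csharp
using UnityEngine;

// A sketch of streaming body stance data between machines with Unity's
// legacy networking: the sender calls SendStance with Kinect-derived
// positions; ReceiveStance runs on the other machines and poses a
// remote stick body. Requires a NetworkView on the same GameObject.
public class BodyStanceSync : MonoBehaviour
{
    public Transform remoteHead, remoteLeftHand, remoteRightHand;
    private NetworkView view;

    void Start()
    {
        view = GetComponent<NetworkView>();
    }

    // Called each frame on the machine that owns the Kinect data.
    public void SendStance(Vector3 head, Vector3 leftHand, Vector3 rightHand)
    {
        view.RPC("ReceiveStance", RPCMode.Others, head, leftHand, rightHand);
    }

    [RPC]
    void ReceiveStance(Vector3 head, Vector3 leftHand, Vector3 rightHand)
    {
        remoteHead.position = head;
        remoteLeftHand.position = leftHand;
        remoteRightHand.position = rightHand;
    }
}
```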

This week some members of Team Occa visited the Melbourne Virtual Reality Meetup (MVRM). The Meetup was presented by Virtual Reality Ventures and centred on corporate uses of virtual reality; it brought attention to other areas where the Oculus could be used, beyond the engineering, architecture and data visualisation sectors that we’re already familiar with.

A particularly interesting device on show was the AUUG ‘Motion Synth’. The device itself is just a physical attachment for your iPhone (the AUUG Grip) coupled with the AUUG app. It was originally designed as a musical instrument that changes tone, vibrato, key and so on based on the height, speed and arc of the phone’s movement. However, the presenter Josh Young also demonstrated how it can be used as an interactive tool, creating an entire design in SketchUp using the same motions. The device seems quite simple to use once learned, and there’s potential to pair it with the Oculus: the movement of your hand could drive movement around the scene or interaction with your environment, and the buttons on the device would provide an easy way to select objects.

AUUG Motion Synth – http://www.auug.com/

While integrating these physical interactions with the Oculus, we have to consider which movements are most intuitive yet still comfortable. The Leap Motion offers the ability to see your own hands, but it can also be somewhat annoying and uncomfortable if you have to hold your hand up and point to move anywhere. The AUUG Motion Synth looks like it has potential for many different movements with high precision, but it could also take a while to learn – not ideal if you’re only showing a project to a client in a 15-minute demo. And to use the Kinect comfortably, a large open space is needed so that it can clearly detect the person it’s tracking.

These technologies all offer new functionality and benefits, but each has its own drawbacks. To get the best performance on any given project, we’ll need to consider which interaction style would be most effective – whether the experience of a fully immersive virtual environment with motion interaction is more beneficial than the simplicity of an Xbox controller as the main form of interaction.

Tara Morgan

Melbourne, Australia

“I am a Monash University student with a fascination with the future, and the possibilities for technology.”
