
Team Occa – Integrating Revit and Max

Systems & Tools 05 January 2015 | 12.30pm

For an introduction to our team, click here.

We’ve been hard at work this week, with the team divided into two main areas of focus:

  • Working to integrate Revit with Unity
  • Working to integrate Max with Unity

On the Revit to Unity side we have developed various tools that run in the Unity Editor and can be used with any imported model. Notably useful tools include the ‘Revit Model Tagging Tool’, ‘Revit Delete Tool’, and ‘Revit Group Tool’. The Tagging Tool lets users quickly apply tags to objects in a model from the corresponding CSV files exported from Revit. Using these tags, the Delete Tool and Group Tool can respectively delete all objects carrying a given tag, or group all objects carrying a tag to save processing power at runtime.
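
As a rough illustration of the tagging workflow, the sketch below assumes a simple “objectName,tag” CSV layout and a hypothetical RevitTag component; neither detail is confirmed above, so treat it as an outline rather than our actual implementation.

```csharp
// Hypothetical sketch of a CSV-driven tagging tool. In a real project RevitTag
// would live in a runtime script and the menu item in an Editor folder.
using System.IO;
using UnityEditor;
using UnityEngine;

public class RevitTag : MonoBehaviour
{
    public string revitTag;   // tag read from the CSV exported alongside the Revit model
}

public static class RevitModelTaggingTool
{
    [MenuItem("Revit Tools/Import Tags From CSV")]
    private static void ImportTags()
    {
        string path = EditorUtility.OpenFilePanel("Select Revit tag CSV", "", "csv");
        if (string.IsNullOrEmpty(path)) return;

        foreach (string line in File.ReadAllLines(path))
        {
            string[] cells = line.Split(',');
            if (cells.Length < 2) continue;

            // Match CSV rows to scene objects by name (assumed naming convention)
            GameObject target = GameObject.Find(cells[0].Trim());
            if (target == null) continue;

            RevitTag tag = target.GetComponent<RevitTag>();
            if (tag == null) tag = target.AddComponent<RevitTag>();
            tag.revitTag = cells[1].Trim();
        }
    }
}
```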

Revit Tools in Unity

At runtime we can reference the objects grouped with the Group Tool, and we are able to make them visible or invisible with a simple click of the corresponding GUI button.
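
At its simplest that toggle amounts to something like the sketch below, where each group created by the Group Tool is a parent GameObject and the button flips its active state; the field names and GUI layout are illustrative only.

```csharp
using UnityEngine;

// Illustrative only: toggles the visibility of a tag group created by the Group Tool.
public class GroupVisibilityToggle : MonoBehaviour
{
    public GameObject group;   // parent object produced by the Revit Group Tool

    private void OnGUI()
    {
        string label = (group.activeSelf ? "Hide " : "Show ") + group.name;
        if (GUILayout.Button(label))
        {
            group.SetActive(!group.activeSelf);   // hides/shows every object in the group
        }
    }
}
```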

Another component we have developed for our runtime GUI is a dialog box that displays the Revit data of any object in the scene when it is clicked. In the future we want to add a highlight feature, so that objects with information will be highlighted when the mouse hovers over them.
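
A minimal version of that click-to-inspect behaviour might look like the following: a ray is cast from the mouse position on click, and any Revit data found on the hit object is shown in a small GUI box. The RevitTag component and the data it stores are assumptions carried over from the earlier sketch.

```csharp
using UnityEngine;

// Illustrative click-to-inspect sketch; assumes objects carry a RevitTag-style
// component holding the data exported from Revit.
public class RevitInfoDialog : MonoBehaviour
{
    private string info;

    private void Update()
    {
        if (!Input.GetMouseButtonDown(0)) return;

        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit))
        {
            RevitTag data = hit.collider.GetComponent<RevitTag>();
            info = data != null ? data.revitTag : null;   // nothing shown if no Revit data
        }
    }

    private void OnGUI()
    {
        if (string.IsNullOrEmpty(info)) return;
        GUI.Box(new Rect(10, 10, 260, 60), "Revit data:\n" + info);
    }
}
```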

Runtime GUI in Unity

The team has also been working on Max/MSP to Unity integration this week. The highlight of this work is the Speech Privacy demonstration, which now works with the Oculus Rift, with audio provided by Arup SoundLab. The project involves passing data over a TCP connection between Unity and Max, which caused some headaches due to the data types Max’s message objects expect.
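
The Unity side of that link can be as simple as the sketch below. The host, the port, and the convention of sending space-separated values terminated by a newline (so Max can parse them as an ordinary message) are all assumptions about the patch rather than details from our actual setup.

```csharp
using System.Globalization;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Illustrative TCP sender; host, port, and the newline-terminated, space-separated
// message format are assumptions about how the Max patch expects its input.
public class MaxTcpSender : MonoBehaviour
{
    public string host = "127.0.0.1";
    public int port = 7400;

    private TcpClient client;
    private NetworkStream stream;

    private void Start()
    {
        client = new TcpClient(host, port);
        stream = client.GetStream();
    }

    // Send a named float value, e.g. "listenerDistance 2.5"
    public void Send(string name, float value)
    {
        // Everything is serialised as text so Max can interpret it as symbol + float atoms
        string message = name + " " + value.ToString("F3", CultureInfo.InvariantCulture) + "\n";
        byte[] data = Encoding.ASCII.GetBytes(message);
        stream.Write(data, 0, data.Length);
    }

    private void OnDestroy()
    {
        if (stream != null) stream.Close();
        if (client != null) client.Close();
    }
}
```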

Christmas came early for us with the arrival of a brand new Leap Motion and Kinect 2.  We’ve been looking into these different technologies to see how they could be used to interact with the environment in Oculus.

The Leap Motion allows users to interact with the Oculus environment using their own hands. Different hand gestures are recognised and trigger different actions; for example, pinching lets you pick things up in the environment. We think this might be a more intuitive way for people to interact with the Oculus environment, and we’re looking into integrating it with our models.
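
For a sense of how a pinch can map onto picking things up, the sketch below uses the Leap C# API’s per-hand PinchStrength value; the pinch threshold, the grab radius, and the way the grabbed object follows the palm are placeholder choices, not our implementation.

```csharp
using Leap;
using UnityEngine;

// Illustrative pinch-to-grab sketch using the Leap Motion C# API.
public class PinchGrab : MonoBehaviour
{
    private Controller controller = new Controller();
    private Transform grabbed;

    private void Update()
    {
        Frame frame = controller.Frame();
        if (frame.Hands.Count == 0) { grabbed = null; return; }

        Hand hand = frame.Hands[0];
        // Leap reports positions in millimetres; this crude conversion ignores the
        // offset and orientation of the Leap rig relative to the Unity camera.
        Vector palm = hand.PalmPosition;
        Vector3 palmPos = new Vector3(palm.x, palm.y, palm.z) * 0.001f;

        if (hand.PinchStrength > 0.8f)                         // fingers pinched together
        {
            if (grabbed == null)
            {
                Collider[] near = Physics.OverlapSphere(palmPos, 0.1f);
                if (near.Length > 0) grabbed = near[0].transform;
            }
            if (grabbed != null) grabbed.position = palmPos;   // carry the object
        }
        else
        {
            grabbed = null;                                    // release on un-pinch
        }
    }
}
```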

LEAP Motion with Oculus

Since our Oculus demonstration for the office last Friday, interest in Oculus has been growing, and people have really been thinking about how they could integrate it with their own work. We’ve been approached to work on a new project to help visualise a client’s site.

Tara Morgan

Melbourne, Australia

“I am a Monash University student with a fascination with the future, and the possibilities for technology.”
