Looking back on three years of development projects in Virtual and Augmented Reality at fme
Mar 23, 2018 | by Rolf Krämer | 0 Comments

As a member of the Virtual Reality team here at fme, I want to give you some insight into the last three years of our project work. One of the most important goals is to let the client not only see but experience the product. Virtual Reality (VR) and Augmented Reality (AR) are two key technologies to achieve this. Here is a short overview of our project history.

March – August 2015: The first project with Oculus Rift DK2
In March 2015 we started a VR development project for one of our clients. They wanted to present their products in a VR world: the end customer should be able to move around the product and even switch its configuration while in Virtual Reality. We used the DK2 and Unity to realize it. The main challenge was stable rendering for both eyes. The frame rate (images per second) dropped significantly, below 60 frames per second, which made users VR sick. We solved this by removing unnecessary 3D model details and integrating a better-balanced lighting system.

Lighting is one of the most expensive calculations in 3D rendering, because it determines how materials look in the scene. The result was a close-to-realistic-looking product in VR at an acceptable frame rate.
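As a rough illustration of the comfort threshold mentioned above (not code from the project; the thresholds are the ones named in the text), the target frame rate translates directly into a per-frame time budget that all rendering work, lighting included, has to fit into:

```python
def frame_budget_ms(target_fps: float) -> float:
    """Time available to render one frame at the given frame rate."""
    return 1000.0 / target_fps

def is_comfortable(frame_time_ms: float, min_fps: float = 60.0) -> bool:
    """A frame is 'comfortable' if it fits within the budget of min_fps."""
    return frame_time_ms <= frame_budget_ms(min_fps)
```

At 60 fps the budget is roughly 16.7 ms per frame; every optimization (fewer model details, cheaper lighting) is a way of fitting the scene into that budget.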

Challenge accepted and won: Realizing a great frame rate without loss of quality

December 2015 – March 2016: Let’s go to Cardboard
To show our abilities and knowledge in VR and 3D technology to our clients, we designed a Cardboard app. Inside the app, users see three motorbikes in a garage; they can change the color of the bikes and switch between three positions in the scene. The app runs on Android devices with at least the computing power of a Samsung Galaxy S6. The main challenge was accurate head tracking. Android devices expose the three-axis (X, Y, Z) gyroscope, accelerometer and magnetometer as separate sensors, and the Android API itself does not fuse their signals into a single orientation. Since exactly that result is needed for head tracking, we had to fuse the sensor data with our own algorithm. Version two of Google's Cardboard API ships such an algorithm, but unfortunately it is not stable enough to avoid drifting. Drifting is the effect that after turning your head a full 360 degrees around the Y axis, the end point no longer matches the start point.
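Our actual fusion algorithm is not reproduced here; as a minimal sketch of the general idea, a complementary filter blends the fast-but-drifting gyroscope integration with a noisy-but-drift-free absolute heading (all names and parameters below are illustrative):

```python
def complementary_yaw(yaw_deg, gyro_rate_deg_s, mag_heading_deg, dt_s, alpha=0.98):
    """One fusion step for the yaw (Y-axis) angle.

    Integrating the gyro is responsive but accumulates drift; the
    magnetometer heading is noisy but anchored to the real world.
    Blending the two keeps the estimate smooth AND drift-bounded.
    (Angle wrap-around at 360 degrees is ignored here for brevity.)
    """
    gyro_estimate = yaw_deg + gyro_rate_deg_s * dt_s
    return alpha * gyro_estimate + (1.0 - alpha) * mag_heading_deg
```

With a constant gyro bias, pure integration drifts without bound, while the filtered estimate settles near the true heading, which is exactly the drifting problem described above.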

Challenge accepted and won: Avoid drifting

May 2016: Vive and more
We then started our first HTC Vive project. The user can switch on the car's lights, rotate the car, and move freely in the scene. The challenge here was finding a suitable way to move. We solved the movement problem with a teleporter: the user selects a point in the scene and teleports to it. This is one of the most common movement concepts in VR, especially on the Vive. The controllers themselves offer many possibilities for user interaction. A special one is the touchpad on top of the controller, which precisely tracks the position and movement of the thumb. In our app, the touchpad controls the car rotation.
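At its core, such a teleporter intersects the controller's pointing ray with the floor to find the target point. The project itself was built in Unity; this flat-ground Python version is purely illustrative:

```python
def teleport_target(origin, direction, ground_y=0.0):
    """Intersect the controller's pointing ray with a horizontal ground
    plane. Returns the teleport target point, or None when the user is
    aiming at or above the horizon (no valid target)."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy >= 0.0:  # pointing level or upward: the ray never hits the floor
        return None
    t = (ground_y - oy) / dy          # ray parameter at the floor plane
    return (ox + t * dx, ground_y, oz + t * dz)
```

A real implementation would additionally raycast against scene geometry and reject unreachable spots, but the plane intersection is the essential step.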

Challenge accepted and won: A user-acceptable motion concept

May 2016: HoloLens
Another approach is Augmented Reality (AR). In AR, the virtual product is blended into the real world: the user captures the room with a smartphone, and on the display the camera stream and the virtual product are merged. The HoloLens combines VR and AR. The user wears it on the head and looks through glasses onto which the virtual world is projected. Cameras in the HoloLens measure the room and create a simple live 3D mesh representation of it. This helps the software know where the user is located and where the virtual products are positioned.
We created an app for the HoloLens that allows the user to rotate, scale and place a car in the room. In addition, the car's lights can be turned on and off. One special feature is speech control.
The challenge here was performant, accurate rendering: the model was rich in geometry and had to be reduced. The HoloLens is a standalone device and therefore has limited computing power.
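Geometry reduction can be done in many ways; the technique we used is not detailed in the text. One simple, well-known approach is vertex clustering, where all vertices falling into the same grid cell are merged into one. A minimal sketch (not the tooling from the project):

```python
from collections import defaultdict

def cluster_decimate(vertices, cell_size):
    """Crude vertex-clustering mesh simplification: snap each vertex to a
    grid cell and replace each occupied cell's vertices by their average."""
    cells = defaultdict(list)
    for v in vertices:
        key = tuple(int(c // cell_size) for c in v)
        cells[key].append(v)
    # one representative vertex (the average) per occupied cell
    return [tuple(sum(axis) / len(vs) for axis in zip(*vs))
            for vs in cells.values()]
```

Coarser cells mean fewer vertices and cheaper rendering, at the price of detail, exactly the trade-off a standalone device like the HoloLens forces.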

Challenge accepted and won: Presenting complex products with a high level of detail at good performance

August 2016 – January 2017: 360° View
A client wanted a 360° view in their app, so we implemented a 360° viewer that works on iOS, Android and Windows Phone. The challenge was building a single, simple, reusable control that works on every platform. The solution was a Xamarin control. For Android and iOS, the renderer for the view is written in OpenGL and GLSL; the Windows Phone / Windows solution is written in DirectX and HLSL. For context: GLSL is the shading language for OpenGL, HLSL the shading language for DirectX, and shading languages run their calculations directly on the graphics device (GPU). Another challenge was tracking the device's orientation. For iOS and Windows I wrote dedicated solutions, since those platforms have chips that measure the device's orientation accurately. For Android, I used a solution similar to the one in the Cardboard app described earlier.
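On all three platforms the renderer's core job is the same: map the current view direction onto texture coordinates of the equirectangular panorama image. The actual GLSL/HLSL is not reproduced here; a hedged Python sketch of that mapping:

```python
import math

def direction_to_equirect_uv(d):
    """Map a unit view direction (x, y, z) to (u, v) texture coordinates
    on an equirectangular 360-degree panorama; this mirrors what the
    GLSL/HLSL fragment shaders compute per pixel."""
    x, y, z = d
    u = 0.5 + math.atan2(x, z) / (2.0 * math.pi)            # longitude
    v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi   # latitude
    return (u, v)
```

Looking straight ahead lands in the center of the texture, looking up at its top edge; the device-orientation tracking described above supplies the direction vector every frame.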

Challenge accepted and won: Generic interface development while creating individual platform solutions for specific chip sets

That is just a small overview of what we can do in VR, AR, Mixed Reality and 3D solutions in general, on every platform. Let the future bring more 🙂

The next topics are already in the pipeline and we will keep you updated on:

  • Technical side of Oculus
  • Technical side of Vive
  • Technical side of Cardboard & Co.
  • Technical side of HoloLens
  • The next generation VR/AR Glasses
  • Mixed Reality
  • Realtime VR streaming out of the cloud