
Understanding Live VR rendering


Introduction

A little over a year ago, we gave our users the ability to output stereo cubemap images to be viewed in VR devices such as the Gear VR. With the release of our Guide to VR, this process became very popular, especially in the Architectural Design and Architectural Visualization community.
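To give a sense of how a cubemap works, here is an illustrative sketch (not V-Ray's actual implementation): every view direction maps to one of six square faces, chosen by the direction's dominant axis.

```python
# Illustrative sketch of cubemap face selection: a view direction lands
# on the face whose axis has the largest absolute component.
# This is the general idea behind cubemaps, not V-Ray's actual code.

def cubemap_face(x, y, z):
    """Return the cubemap face ('+x', '-x', '+y', ...) for a direction."""
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:          # dominant axis is x
        return '+x' if x > 0 else '-x'
    if ay >= ax and ay >= az:          # dominant axis is y
        return '+y' if y > 0 else '-y'
    return '+z' if z > 0 else '-z'     # dominant axis is z

# Looking mostly along +z lands on the +z face:
print(cubemap_face(0.1, 0.2, 1.0))  # -> +z
```

A stereo cubemap simply stores one such set of six faces per eye.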

One issue that came from this process is that the output was only truly viewable with a VR headset, making it a challenge to see how the changes made to the scene would affect the final output. For this reason, users needed a way to tie the virtual frame buffer to an output that could be seen in VR.

In the upcoming V-Ray 3.5 for 3ds Max, we introduce a way to output the Active Shade VFB to either the Oculus Rift or HTC Vive. In doing so, the user can render and view the changes in VR as they are made.


 

What do I need to make this work?

 

Currently, this only works if your VR headset is connected to the computer that is doing the rendering. The supported headsets are the Oculus Rift and devices that support OpenVR, such as the HTC Vive.

Because these renders need to be very large, it is recommended that you use GPU rendering to speed things up as much as possible. Since you also need a powerful GPU to drive the VR headset itself, it is recommended that you have at least two GPUs in your computer: one reserved for the VR output, and the rest used for rendering. If you use the same GPU for both rendering and VR output, the headset will update very slowly, creating a poor experience and possibly cybersickness.

If your computer only has one GPU, but you have access to other computers with more, Distributed Rendering can be used, as long as you exclude your own computer from rendering so that its GPU can be used for VR output.
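As a quick sanity check before committing a GPU to the headset, you can count the GPUs in the machine. The sketch below is illustrative and assumes NVIDIA hardware; it parses the output of the real `nvidia-smi -L` command, whose lines look like `GPU 0: NVIDIA GeForce ... (UUID: GPU-...)`.

```python
# Illustrative helper (assumes NVIDIA GPUs): count the GPUs in this
# machine by parsing `nvidia-smi -L`, one "GPU N: ..." line per device.
import subprocess

def count_gpus(listing=None):
    """Count GPUs from an `nvidia-smi -L` listing (queried if not given)."""
    if listing is None:
        listing = subprocess.run(
            ["nvidia-smi", "-L"], capture_output=True, text=True
        ).stdout
    return sum(1 for line in listing.splitlines() if line.startswith("GPU "))

# Example listing with two GPUs, as the article recommends:
sample = (
    "GPU 0: NVIDIA GeForce GTX 1080 (UUID: GPU-aaaa)\n"
    "GPU 1: NVIDIA GeForce GTX 1080 (UUID: GPU-bbbb)\n"
)
print(count_gpus(sample))  # -> 2
```

With two or more devices reported, one can be left unselected for rendering and dedicated to the headset.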

 

How to set it up
 

Oculus Rift

  1. If you have not already done so, install the Oculus Rift PC runtime.
  2. Connect the headset to the default graphics adapter – it will not work if connected to other graphics adapters.
  3. In the Oculus app, go to Oculus Settings -> General -> Unknown Sources and turn it on.
  4. Check that the headset is working and displays its home scene correctly.

 

HTC Vive

  1. If you have not already done so, install both Steam and SteamVR.

 

You are now ready to start rendering in 3ds Max

 

  1. In the V-Ray Production Render settings:
     - Turn off Image Filtering. If you don't, you are likely to see seams in your cubemap.
     - Under Camera Type, select Cube 6×1.
  2. In the V-Ray RT Active Shade settings:
     - For the Oculus Rift: under Stereo mode, select Oculus Rift (mono) or Oculus Rift (stereo), depending on your needs.
     - For the HTC Vive: under Stereo mode, select OpenVR (mono) or OpenVR (stereo), depending on your needs.
     - For your image output, select a 6×1 resolution. It is recommended that you choose something large enough for your display; 3000×500 is the recommended minimum.
     - For your engine type, select CUDA or OpenCL. CPU rendering is not recommended, as it will slow your performance.
     - Under Render Device Select, make sure that the GPU you are using for VR output is not selected, as that will degrade your VR performance.
     - If you wish, you can use Distributed Rendering.
     - If you wish to change the interpupillary distance, open the Advanced settings of V-Ray RT and change the eye distance.
  3. Press the Active Shade Render button:
     - The first render will start slowly, since it takes some time to load the Oculus VR library.
     - Since this is an Active Shade render, you can continue to make changes to your scene, such as moving objects, changing materials, and more, and the render will update in both the VFB and the VR output.
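The Cube 6×1 layout places the six square faces side by side, so the output width must be six times the height. A small illustrative sketch of that arithmetic, including the article's recommended minimum:

```python
# Illustrative arithmetic for the Cube 6x1 layout: six square faces in
# a row, so the image is six times wider than it is tall.

def cube6x1_output(face_size):
    """Return (width, height) of a 6x1 cubemap with square faces of face_size px."""
    return (6 * face_size, face_size)

def meets_recommended_minimum(width, height):
    """Check against the article's recommended minimum of 3000x500."""
    return width >= 3000 and height >= 500

# Faces of 500x500 px give exactly the recommended minimum output:
w, h = cube6x1_output(500)
print(w, h)                             # -> 3000 500
print(meets_recommended_minimum(w, h))  # -> True
```

Larger faces (say 1024 px, giving a 6144×1024 output) sharpen the result at the cost of render time, which is why GPU rendering is recommended above.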

 

Conclusion

 

Working in VR can be a challenge. Few applications have an interface that allows people to build in VR, and using a desktop application can be a bit of a guessing game as to what the VR output will look like. The Live VR output of V-Ray 3.5 removes several of the steps between making a change and seeing the content in VR. This in turn allows for quicker iterations and a better overall VR experience.

About the author

Christopher Nichols

Chris is a CG industry veteran and Director of Chaos Labs. He can also be heard regularly as the host of the CG Garage podcast which attracts 20,000 weekly listeners. With a background in both VFX and Design, Chris has worked for Gensler, Digital Domain, Imageworks and Method Studios. His credits include Maleficent, Oblivion and Tron: Legacy.

Originally published: December 13, 2016.
