This page explains methods for creating realistic-looking virtual objects in mixed reality.
This page covers the following topics:
- Matching VR output with video signal
- Illuminating virtual objects with real-world lighting
- Rendering shadows from virtual objects onto the real world
Matching VR output with video signal
Once you can see the environment through the video pass-through, you can begin placing virtual objects in it. There are a few things you can do to adjust a virtual object's colors so that it looks realistic within the environment.
Subscribe to the DistortedColor stream, which provides continuously updating camera parameters (white balance and brightness).
See the code example in MRExample -> DataStreamer.cpp
Next, multiply the rendered colors by those values in the VR post-process.
See the code example in MRExample -> Scene.cpp
void Scene::update(double frameTime, double deltaTime, int64_t frameCounter, double exposureEV, const std::array<double, 3>& exposureWB)
Illuminating virtual objects with real-world lighting
You can also make the lighting of the virtual object match the environment. Subscribe to the EnvironmentCubemap stream, which provides a continuously updating cubemap of the environment.
EnvironmentCubemap contains the cubemap with the following layout:
|      | top    |       |      |
| left | front  | right | back |
|      | bottom |       |      |
See the code example in MRExample -> DataStreamer.cpp
Next, use this map as a light source.
Now you have a virtual object with realistic lighting in an MR environment.
Rendering shadows from virtual objects onto the real world
If your MR application has no virtual environment geometry, your virtual objects have no surface to cast shadows onto. To work around this, model a floor for your environment in the position where the real floor is. Typically, it is enough to render a plane at floor level. Then render it with a black VR color and write the shadow intensity into the alpha channel.