Resource-intensive pre-rendered VR scenes could become a thing of the past, if Lytro’s new Volume Tracer software catches on.
Light-fields are tricky to explain, and matters aren’t helped when companies such as Magic Leap bandy the term about in the news without ever explaining what they mean. Lytro, however, takes a good stab at it, shipping products and imaging solutions that show the tech in a good light.
To that end, the company has just announced what it believes will lead to “breakthrough experiences” for computer-generated VR content: a program for lighting 3D scenes called Volume Tracer.
In a nutshell, Volume Tracer positions multiple virtual cameras at strategic viewpoints around a computer-generated 3D scene. What these cameras see provides the sample data from which the light within the scene can be computed.
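To make the idea concrete, here is a minimal sketch of that sampling concept in Python. This is purely illustrative: Lytro has not published Volume Tracer’s actual sampling strategy or API, so the function names, the spherical camera layout, and the stub renderer are all assumptions made up for this example.

```python
import math

def sample_viewpoints(center, radius, n_lat=4, n_lon=8):
    """Place hypothetical virtual cameras on a sphere around a scene.

    Illustrative only -- Volume Tracer's real camera placement is not
    public; this just shows the idea of surrounding a scene with
    strategically spaced viewpoints.
    """
    cx, cy, cz = center
    viewpoints = []
    for i in range(1, n_lat + 1):
        theta = math.pi * i / (n_lat + 1)      # polar angle
        for j in range(n_lon):
            phi = 2 * math.pi * j / n_lon      # azimuth angle
            viewpoints.append((
                cx + radius * math.sin(theta) * math.cos(phi),
                cy + radius * math.cos(theta),
                cz + radius * math.sin(theta) * math.sin(phi),
            ))
    return viewpoints

def render_light_field(viewpoints, render):
    # One offline render per viewpoint; the results become the sample
    # data from which in-between views can later be synthesised live.
    return [render(vp) for vp in viewpoints]

cams = sample_viewpoints(center=(0.0, 0.0, 0.0), radius=2.0)
# Stub renderer: a real pipeline would invoke an offline ray tracer here.
samples = render_light_field(cams, render=lambda vp: {"camera_pos": vp})
```

The point of the sketch is the division of labour: the expensive rendering happens once per fixed viewpoint, offline, and the headset only has to interpolate between those pre-computed samples at runtime.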
The whole concept is pretty opaque, but what Volume Tracer enables is easily explained. Think about the distinction between the CG imagery in a Hollywood blockbuster and what is rendered inside a VR headset.
It takes immensely powerful and expensive computers a long time to render high-quality CG scenes for movies. However, that luxury of time and computing horsepower is unavailable to a CG VR experience you interact with in the moment.
VR content makers are pushed to the limit in creating visually compelling experiences that hit a high frame rate. You need this in VR; otherwise the experience becomes a jarring, uncomfortable mess. The slicker and smoother the visuals, the better.
And so a compromise must be made. Higher visual quality, which would ordinarily require more time and computational resources to render, is sacrificed for responsiveness.
What Lytro claims Volume Tracer can do is bridge this gulf. By allowing most VR render engines (the frameworks that generate the imagery you see inside the headset) to create light-field samples, VR scenes gain cinematic lighting qualities that would otherwise be impossible to render live with conventional methods.
https://vimeo.com/236615698
The video above is Lytro’s own effort to explain the process. Beyond this, it has partnered with award-winning director and former Pixar animator Rodrigo Blaas to create a VR short film, One Morning, to show off the tech.
“[Content creators] could either render the content as a 360 video, and they get beautiful results but limited to three degrees of freedom. Or the trade-off on the opposite side is enable six degrees of freedom, but be limited to the graphic capabilities of a real-time engine. In Lytro Volume Tracer you have an uncompromising environment where you have six degrees of freedom and retain the quality of a fully rendered ray-traced environment,” explains Ariel Braunstein, Chief Product Officer at Lytro.
As a company, Lytro is betting big on VR as the future of content creation. But its bread wasn’t always buttered that way. Now some 11 years old, Lytro first concentrated on consumer-facing cameras.
These also used light-field tech, employing an array of micro lenses that took in light information from multiple directions. The resulting camera took images that could be refocused after the fact. A cool gimmick, but one that never caught on with Joe Public.
Sensing a lost cause, the company shifted its focus to VR. Its product lineup now includes two huge light-field capture cameras, one for cinema and one for VR. Lytro shot a VR music video featuring singer Bobby Halvorson using the latter.
We’ve embedded a fascinating behind-the-scenes video below. In it, you see how light-field captures can enhance live-action VR in a way not possible with standard cameras.
Source: roadtovr.com
https://vimeo.com/213266879
License: The text of "Lytro Announces Innovative VR Rendering Software" by All3DP is licensed under a Creative Commons Attribution 4.0 International License.