
As all of us who are excited about the launch of Oculus and Vive are learning, virtual reality is all about GPUs. While many PCs have enough CPU horsepower and memory to handle a VR workload, very few have GPUs that are up to even the minimum suggested specs for VR playback, let alone development. This nearly insatiable need for GPU horsepower makes VR a natural area of focus for Nvidia, as it showed at this year's GPU Technology Conference (GTC). With 37 VR-related sessions, dozens of demos, and a good portion of CEO Jen-Hsun Huang's keynote dedicated to VR, it was, along with deep learning and autonomous vehicles, one of the three biggest themes at the conference.

Look at me: Eye-catching "real world" VR experiences

[Image: Solfar's Everest VR experience comes complete with modeled snow]

Much of the excitement around VR is built using amazing demos of virtual worlds. Perhaps fittingly, virtual worlds are actually easier to portray than real worlds in VR, because it is possible to model exactly what a person would see from any point of view, and with any lighting. Creating stereo views is also relatively simple: basic stereo camera support can be added in game engines such as Unity merely by checking a box. Now, though, we're starting to see full-fidelity experiences based on modeling real-world locations. Huang showcased two of the most elaborate during his keynote: Everest VR and Mars 2030.
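To see why the stereo part is cheap for synthetic scenes, consider what the engine does behind that checkbox: it renders the same scene twice, from two camera positions separated by the distance between your eyes. Here is a minimal sketch of the idea in Python with NumPy (the head pose and IPD numbers are illustrative, not taken from any particular engine):

```python
import numpy as np

def look_at(eye, target, up):
    """Build a standard right-handed view matrix for a camera pose."""
    f = target - eye
    f = f / np.linalg.norm(f)              # forward
    r = np.cross(f, up)
    r = r / np.linalg.norm(r)              # right
    u = np.cross(r, f)                     # corrected up
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = r, u, -f
    view[:3, 3] = -view[:3, :3] @ eye      # world -> camera translation
    return view

# Illustrative head pose: standing eye height, looking straight ahead.
head = np.array([0.0, 1.7, 0.0])
target = np.array([0.0, 1.7, -5.0])
up = np.array([0.0, 1.0, 0.0])
ipd = 0.064                                # typical eye separation, metres

gaze = (target - head) / np.linalg.norm(target - head)
right = np.cross(gaze, up)
right = right / np.linalg.norm(right)

# The whole trick: the same scene rendered from two offset positions.
offset = right * ipd / 2.0
left_view = look_at(head - offset, target - offset, up)
right_view = look_at(head + offset, target + offset, up)
print(np.round(left_view, 3), np.round(right_view, 3), sep="\n\n")
```

The second view is not the hard part; producing both views at headset frame rates is, which is where the GPU demand comes from.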

Everest VR was put together by Solfar and Rfx using 108 billion pixels of actual photographs, which were turned into 10 million polygons, which in turn drive a VR experience that is essentially photorealistic. Game-like physics are used to generate drifting snow for additional "reality." What makes immersive experiences like Everest much more powerful than an ordinary 360-degree video is that the user is not limited to a particular location, but can move through the environment.

Similarly, Mars 2030 is largely based on massive numbers of photographs our spacecraft have sent back from there, allowing NASA, with help from Fusion VR, to model eight square kilometers of the planet's surface. The studio then went to work enhancing the model, including creating 1 million hand-sculpted rocks, and 3D versions of mile-long underground lava tube caves. Hoping for some star appeal, Huang brought Apple co-founder Steve Wozniak up on screen so we could see him be the first ever to experience Mars 2030. Everything went well for the first couple minutes, until Woz said he felt dizzy and needed to stop before he fell out of his chair. That was definitely awkward, and symptomatic of the "queasiness" issue that continues to intermittently trouble VR rollouts.

Making fantasies seem real: iRay VR and iRay VR Lite

Nvidia's iRay is already the ray-tracing package of choice for many of the top 3D modeling packages. This year the company will be extending it with additional capabilities to support the unique requirements of VR. Before VR, producers of computer-generated video content could choose between either time-consuming photorealistic rendering, like that used for feature films, or the real-time, plausible rendering needed for interactivity in video games. VR experiences are creating a need to achieve the best of both: realistic, immersive experiences that are high-quality, 3D, and let the user move around. That means they can't be entirely pre-rendered. Unfortunately, moving from a 3D model to a photorealistic experience is too processor-intensive to do entirely in real time. So views of the model need to be rendered in advance, typically using ray tracing to mimic lighting and reflections in a physically accurate manner. Traditional applications like movies or print only require high-resolution 2D images to be produced, but immersive VR requires the creation of an interactive experience.
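For readers who want the idea behind the buzzword: ray tracing follows a ray from each pixel into the scene, finds what it hits, and shades that point using the actual positions of lights and surfaces. The toy renderer below (a from-scratch sketch, not iRay, with a made-up one-sphere scene) shows the core loop:

```python
import numpy as np

def ray_sphere(origin, direction, center, radius):
    """Distance to the nearest hit along a unit-length ray, or None."""
    oc = origin - center
    b = 2.0 * np.dot(direction, oc)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A made-up scene: one sphere and one point light.
center, radius = np.array([0.0, 0.0, -3.0]), 1.0
light = np.array([2.0, 2.0, 0.0])
eye = np.zeros(3)
shades = ".:-=+*#%@"

w, h = 48, 24
for y in range(h):
    row = ""
    for x in range(w):
        # Fire a ray through this pixel on an image plane at z = -1
        # (the 0.5 factor roughly corrects for tall terminal characters).
        d = np.array([x / w - 0.5, (0.5 - y / h) * 0.5, -1.0])
        d /= np.linalg.norm(d)
        t = ray_sphere(eye, d, center, radius)
        if t is None:
            row += " "
        else:
            p = eye + t * d                      # hit point
            n = (p - center) / radius            # surface normal
            l = light - p
            l /= np.linalg.norm(l)
            # Lambertian shading: brightness follows the light's angle.
            row += shades[int(max(0.0, np.dot(n, l)) * (len(shades) - 1))]
    print(row)
```

Physically accurate renderers trace many bounces per pixel rather than one, which is why offline renders take so long and why, as described below, iRay VR leans on a computing cluster.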

[Image: Jen-Hsun Huang reveals updated VRWorks at Nvidia GTC 2016]

That's where iRay VR comes in. For those willing to buy, or rent time on, a high-performance computing cluster (like Nvidia's own DGX-100), iRay VR can generate a navigable model of a scene. The user can move around in the scene, and turn their head, while getting physically accurate lighting, shadows, and reflections as they move. Nvidia demoed this with an interactive VR model of its planned headquarters that was quite convincing. Unfortunately, even the viewing computer needs to be pretty massive, requiring around a 24GB frame buffer like the one in the 24GB Quadro M6000 to run.

Even with a supercomputer at hand, rendering for VR requires some compromises. Lucasfilm's Lutz Latta explained that, for example, the Millennium Falcon model used for Star Wars is made up of over 4 million polygons. That is perfect for the ultimate in cinematic realism, when it can be rendered one frame at a time, but too complex for a Star Wars VR experience. In addition to simplifying it, the studio has worked on a way to have a unified asset specification, so that models can be built once and then are available for use in a variety of different media like film and VR.

For those of us on slightly more limited budgets, iRay VR Lite will let you upload a model to Nvidia's servers, where they will generate a photorealistic 360-degree stereo view, though one that you can't walk around. For a full-fidelity experience, even the Lite version will take advantage of 6GB or more of frame buffer, but the experiences will also be viewable on low-end devices like Google's Cardboard. Nvidia expects to have iRay VR Lite available by mid-year, with iRay VR to follow.

iRay VR is only one part of Nvidia's VRWorks set of tools for VR development and delivery. Other parts of VRWorks also had updates announced at the show. In particular, VRWorks SLI will provide OpenGL support across multiple GPUs, and there is now VR support in GameWorks.

Realities.io: Immersive environments made practical

[Image: The cool thing about Realities' immersive experiences is that they are inexpensive enough to make it possible to capture large numbers of interesting scenes, not just a few mega-attractions]

Everest and Mars are both very expensive, long-development-cycle efforts, more or less the equivalent of making a feature film. That limits their creation to large organizations, and their subjects to those that are likely to attract millions. Startup Realities.io has developed a system that allows it to relatively quickly, and inexpensively, create photorealistic environments from "everyday" locations. Using around 350 photographs of a scene, it can use a process called photogrammetry to create an interactive model that the user can walk around.
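Photogrammetry pipelines of this kind typically begin by finding the same physical points across overlapping photos; those correspondences then drive camera-pose estimation and dense 3D reconstruction. A minimal sketch of that first step using OpenCV (the filenames are placeholders, and 0.75 is the conventional Lowe ratio threshold), not Realities.io's actual tooling:

```python
import cv2

# Two overlapping photos of the scene (placeholder filenames).
img1 = cv2.imread("scene_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("scene_002.jpg", cv2.IMREAD_GRAYSCALE)
assert img1 is not None and img2 is not None, "photos not found"

# Find scale-invariant keypoints and descriptors in each photo.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Match descriptors, keeping only unambiguous matches (Lowe's ratio test).
matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]
print(f"{len(good)} point correspondences between the two photos")

# A full pipeline would repeat this across all ~350 photos, estimate the
# camera poses from the correspondences (structure from motion), then
# build a dense point cloud and mesh for the VR scene.
```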

The level of detail is pretty amazing. You can stoop down and see trash on the floor, or walk over to a wall and see the brush textures in the graffiti. Realities also captures scene lighting, using light probes, so reflections change realistically as you move. If you're one of the lucky few who have a Vive, you can download Realities for free via Steam.
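A light probe is essentially a panoramic snapshot of the light arriving at a point in the scene; at render time, a shader reflects the view direction off the surface and samples the probe along the reflected ray, so reflections track the viewer. A toy sketch of that lookup, assuming an equirectangular probe image (random data here stands in for a captured panorama):

```python
import numpy as np

def sample_probe(probe, direction):
    """Sample an equirectangular light-probe image along a direction."""
    d = direction / np.linalg.norm(direction)
    u = np.arctan2(d[0], -d[2]) / (2 * np.pi) + 0.5   # longitude -> [0, 1)
    v = np.arccos(np.clip(d[1], -1.0, 1.0)) / np.pi   # latitude  -> [0, 1]
    h, w, _ = probe.shape
    return probe[min(int(v * h), h - 1), min(int(u * w), w - 1)]

# Stand-in probe: random pixels where a captured panorama would go.
probe = np.random.rand(256, 512, 3).astype(np.float32)

view = np.array([0.0, -1.0, -1.0])   # eye-to-surface direction (made up)
view /= np.linalg.norm(view)
normal = np.array([0.0, 1.0, 0.0])   # surface normal at the shaded point

# Mirror the view direction about the normal and look up the probe there.
reflected = view - 2.0 * np.dot(view, normal) * normal
print("reflected colour:", sample_probe(probe, reflected))
```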

Even more exciting, Realities founders David Finsterwalder and Daniel Sproll hope to further democratize the creation of immersive experiences by enabling others to go out and capture the images, which they can then process and produce.

Orah 4i camera: A stitch in real time

[Image: The Orah 4i is a plug-and-play solution for creating stitched 360-degree video]

A more common way of creating VR experiences is using 360-degree camera rigs. There are plenty of those on the market, ranging from consumer units with a couple of fish-eye lenses to studio-quality rigs like the ones from Samsung and Jaunt VR. However, all of them require a lot of post-processing to accurately stitch the images together. One company, Videostitch, has made a good living providing stitching solutions, but at GTC they announced they've gone one step further: their own turnkey camera + stitcher, the Orah 4i. Available for pre-order for $1,800, it has four high-quality cameras with 90-degree lenses, which feed synchronized data to a small processor box that stitches them and streams a live 360-degree image. In addition to the obvious use for event coverage, the system will also allow excellent real-time previewing for those doing high-end 360-degree production work.
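Conceptually, stitching four 90-degree views into a live sphere means deciding, for every direction in the output panorama, which camera sees it, then resampling that camera's frame and blending near the seams; doing all of that at video rate is the hard part. A simplified sketch of the direction-to-camera mapping (the rig geometry here is an assumption for illustration, not Orah's actual calibration):

```python
import numpy as np

# Four cameras at 90-degree yaw intervals (an assumed rig layout).
yaws = np.radians([0, 90, 180, 270])
cam_dirs = np.array([[np.sin(y), 0.0, -np.cos(y)] for y in yaws])

def camera_for(lon, lat):
    """Pick the camera whose optical axis best covers a panorama direction."""
    d = np.array([np.sin(lon) * np.cos(lat),
                  np.sin(lat),
                  -np.cos(lon) * np.cos(lat)])
    return int(np.argmax(cam_dirs @ d))   # largest dot product wins

# Walk across one row of the output equirectangular panorama.
width = 16
for x in range(width):
    lon = (x / width) * 2.0 * np.pi - np.pi   # -180 to +180 degrees
    print(f"panorama column {x:2d} -> camera {camera_for(lon, 0.0)}")
```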

As an aside, the Orah, like most 360-degree camera rigs, does not produce a stereo image. And because it only has one camera facing each direction, you can't really generate one after the fact. Higher-end rigs with more cameras, like Jaunt's, do allow post-processing to generate depth maps and stereo views.

However, almost all of these units, whether mono or stereo, are bundled together under the umbrella of "VR." Similarly, many "VR" experiences are static 360-degree photos that don't let you move around the scene (but may or may not be stereo). It'd be great to start to get some standardization on terminology here. In my case, I've started to use "immersive" only for experiences that are both stereo and permit movement within the scene. I think it'd also be helpful if we only used "VR" to refer to experiences with a sense of depth (e.g. stereo), but the term is too popular as a marketing tool for that to be likely to happen.

VR gets down to business

[Image: Surgeons are also using more VR technology, in theater, in planning, and even with patients, like in this example from Surgical Theater LLC]

VR at the show wasn't all about entertainment. Two of the demos I experienced were all about commercial applications. ZeroLight was showing off an amazingly realistic Audi in VR, part of a customer-focused sales experience that Audi will start rolling out later this year, complete with Vive headsets in dealer showrooms that let you configure and virtually experience your car.

WorldViz has extended standard VR functionality to make it ideal for many industrial and commercial applications. It allows users to see each other and work together on a task in a virtual environment, even if they have different brands of headsets, and it supports much larger "room scale" environments. For example, one client created a virtual hospital in a gym, so doctors could test out the work environment before it was built. One of the scenarios I ran through involved working on a helicopter rotor. It really brought home the power of the Vive's touch controllers. They are a dramatic step forward from trying to use a small remote or gaming controller to manipulate objects in 3D. I hope Oculus gets its version out soon.

Is VR right for you?

[Image: For lucky Vive owners, Steam provides a great store and play experience for supported titles]

VR at GTC was deliberately about nearly every application other than games (after all, we just had VR frenzy at GDC last month), but for most individuals, VR in 2016 will either be about gaming (I'm a racing sim fan, so my favorites available so far are Project Cars and Dirt Rally, but there are tons of others) or involve fiddling around with 360 experiences using a Gear VR or Cardboard. For those who do purchase a high-end headset for gaming, sure, the other experiences are cool, but I don't see anyone upgrading their computer and plunking down another $800 just to walk around a virtual Mars for a bit. Beyond that, I'm hoping Google announces something amazing under the Android VR banner in May that can bridge the gap between the current low-end mobile phone offerings and the current crop of gamers-and-hackers-only PC-driven headset offerings.