Unreal Engine 5 powers virtual reality for a wingsuit flight simulator

Intrepid individuals throughout history have attempted to conquer the skies using everything from pedal planes to rocket-powered jet packs.

Arguably the closest a human has ever come to non-mechanically assisted flight is wingsuit BASE jumping – also known as wingsuit flying.

This very dangerous and technically demanding sport involves BASE jumping from a high point while wearing a suit with fabric webbing between the limbs that allows the wearer to glide rather than free fall.

Requiring years of skydiving and BASE jumping experience – and with a fatality rate of 1 in 500 jumps – wingsuit BASE jumping is an activity that, until now, was out of reach for 99.9% of the population.

JUMP is the world’s first hyperreal wingsuit simulator, combining a real wingsuit, a virtual reality headset, and a mix of suspension, wind effects, and multi-sensory stimulation.

It’s the brainchild of CEO and Founder James Jensen, who was part of the team that created The VOID, one of the first free-roam virtual reality simulation companies.

Jensen assembled a team, and between 2019 and 2021 they built a prototype simulator. This led to a working facility in Bluffdale, Utah, which has now been operating for over four months and has flown over 5,000 people. “I’ve never skydived or BASE jumped,” says Jensen. “I rely on my professional athletes to tell me it’s real – they said it’s about 85% there. We’re pushing toward 100 percent.”

JUMP takes the flyer through hyper-detailed 3D landscapes of some of the world’s most breathtaking BASE jumps, including Notch Peak in the United States. To do this, the JUMP team flew a helicopter equipped with high-end cameras and spent two days capturing thousands of ultra-high-resolution images of the landscape below.

The images were processed using the latest version of RealityCapture, a photogrammetry tool that creates ultra-realistic 3D models from image sets and/or laser scans.

Reconstructing the 58,000 captured images required five supercomputers. The team also used precise data from gyroscopes and other sensors to create a custom, high-accuracy flight log. The result was an incredibly detailed digital model of the environment, comprising over eight billion polygons across 10 square miles.

The next step was to bring the massive dataset into Unreal Engine 5. “It took some support from the RealityCapture team, but in the end we developed some new tools that helped slice those huge datasets and assign the appropriate textures and materials,” says Jensen.
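JUMP’s actual slicing tools are not public, but the core idea is straightforward. The sketch below is a minimal, hypothetical C++ illustration: triangles are bucketed into grid tiles by centroid, so each tile can later be imported and textured as its own mesh. All names and the tile-size parameter are assumptions for illustration.

```cpp
// Hypothetical sketch: slice a huge photogrammetry mesh into grid tiles
// by assigning each triangle to the tile containing its centroid.
#include <array>
#include <cmath>
#include <cstdint>
#include <map>
#include <utility>
#include <vector>

struct Vec3 { double x, y, z; };
struct Triangle { std::array<Vec3, 3> v; };

using TileKey = std::pair<int32_t, int32_t>;

std::map<TileKey, std::vector<Triangle>> SliceIntoTiles(
    const std::vector<Triangle>& mesh, double tileSizeMeters)
{
    std::map<TileKey, std::vector<Triangle>> tiles;
    for (const Triangle& tri : mesh)
    {
        // Centroid of the triangle in the horizontal plane.
        const double cx = (tri.v[0].x + tri.v[1].x + tri.v[2].x) / 3.0;
        const double cy = (tri.v[0].y + tri.v[1].y + tri.v[2].y) / 3.0;
        const TileKey key{
            static_cast<int32_t>(std::floor(cx / tileSizeMeters)),
            static_cast<int32_t>(std::floor(cy / tileSizeMeters))};
        tiles[key].push_back(tri); // each tile becomes one importable mesh
    }
    return tiles;
}
```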

The team leveraged Nanite, Unreal Engine 5’s virtualized micropolygon geometry system, to handle the import and replication of multi-million-polygon meshes while maintaining a real-time frame rate without any noticeable loss of fidelity.
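Nanite is enabled per static mesh. As a rough sketch of how a batch-import tool might opt meshes into Nanite from editor C++ (UStaticMesh::NaniteSettings is real UE5 API; the helper function itself is illustrative):

```cpp
// Editor-only sketch: enable Nanite on an imported UStaticMesh, as a
// batch-import tool might do for each landscape tile.
#include "Engine/StaticMesh.h"

void EnableNanite(UStaticMesh* Mesh)
{
    if (!Mesh) return;

#if WITH_EDITOR
    Mesh->NaniteSettings.bEnabled = true; // opt the mesh into Nanite
    Mesh->PostEditChange();               // rebuild render and Nanite data
    Mesh->MarkPackageDirty();             // flag the asset for saving
#endif
}
```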

For lighting and shadows, the team harnessed the power of Lumen, Unreal Engine 5’s fully dynamic global illumination system, which allows indirect lighting to adapt on the fly to changes in direct lighting or geometry.
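In a project, Lumen is typically selected in the project settings or, equivalently, through console variables. A minimal sketch using UE5’s real r.DynamicGlobalIlluminationMethod and r.ReflectionMethod console variables (the wrapper function is illustrative):

```cpp
// Sketch: select Lumen for global illumination and reflections via
// console variables (value 1 = Lumen for both). These can also be set
// in DefaultEngine.ini or the project settings UI.
#include "HAL/IConsoleManager.h"

void EnableLumen()
{
    if (IConsoleVariable* GI = IConsoleManager::Get().FindConsoleVariable(
            TEXT("r.DynamicGlobalIlluminationMethod")))
    {
        GI->Set(1); // 1 = Lumen global illumination
    }
    if (IConsoleVariable* Refl = IConsoleManager::Get().FindConsoleVariable(
            TEXT("r.ReflectionMethod")))
    {
        Refl->Set(1); // 1 = Lumen reflections
    }
}
```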

“Because we strive for total photorealism, we rely heavily on Nanite and Lumen to bring our scenes to life,” says Jensen. “We currently have the largest dataset in Nanite at eight billion polygons – over 700 parts with 16K textures per part.”

Jensen explains that features like these are why JUMP used Unreal Engine to create the experience. “Unreal Engine is quite simply the industry leader in real-time, high-resolution simulations,” he says.

“Seeing things that used to take days – or even weeks and months – to render in video production now happen in real time is amazing. Polygon count has always been a bottleneck, and global illumination with Lumen is breathtaking to see in real time.”

The JUMP team filled the virtual environment with shrubs, trees, grass, and other objects from Quixel Megascans, a library of scans featuring high-fidelity surfaces, textures, vegetation, and other CG elements included with Unreal Engine 5.

The team also developed their own physics engine, FLIGHT, which handles all the configuration and physics of both the physical and digital worlds. Blender and Maya were used for the 3D art.
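FLIGHT itself is proprietary and undocumented, so the following is only a generic sketch of the kind of point-mass lift/drag model a wingsuit simulator might integrate each frame; every coefficient here is an illustrative assumption, not one of JUMP’s tuned values.

```cpp
// Generic point-mass wingsuit glide model (illustrative only).
#include <cmath>

struct WingsuitState
{
    double airspeedMps;   // speed along the flight path, m/s
    double glideAngleRad; // flight-path angle below horizontal, radians
};

// Advance the state by dt seconds. Assumes the flyer is already moving.
void StepFlight(WingsuitState& s, double dt)
{
    const double g = 9.81;    // gravity, m/s^2
    const double mass = 90.0; // flyer + suit, kg (illustrative)
    const double rho = 1.0;   // air density at altitude, kg/m^3 (illustrative)
    const double area = 1.4;  // wingsuit reference area, m^2 (illustrative)
    const double cl = 0.7;    // lift coefficient (illustrative)
    const double cd = 0.28;   // drag coefficient (illustrative, ~2.5:1 glide)

    const double q = 0.5 * rho * s.airspeedMps * s.airspeedMps * area;
    const double lift = q * cl;
    const double drag = q * cd;

    // Speed change along the path: gravity component minus drag.
    s.airspeedMps += (g * std::sin(s.glideAngleRad) - drag / mass) * dt;

    // Lift rotates the velocity vector back toward horizontal.
    if (s.airspeedMps > 1.0)
    {
        s.glideAngleRad +=
            (g * std::cos(s.glideAngleRad) - lift / mass) / s.airspeedMps * dt;
    }
}
```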

The result is an awe-inspiring virtual world realistic enough to make flyers believe they’re standing on the precipice of a 1,200-meter drop.

But to create the fully immersive feeling of real flight, what’s seen in the VR headset must be combined with a real wingsuit, suspension system, wind effects, and multi-sensory stimulation.

“Physical effects are key to being able to mimic reality,” says Jensen. “When you can sync the physical feel with the visuals and audio, you access a whole new dimension in VR simulations.”

Simulation haptics are triggered by events in the virtual environment. “We wrote custom code in Unreal Engine specifically for moments inside the wingsuit BASE jump experience that trigger cues for smell, wind speed, haptics, scene effects, sound effects, and physical objects,” says Jensen.
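JUMP’s cue-triggering code is not public, but in Unreal Engine terms this pattern is commonly built from overlap events. A hedged sketch: a trigger volume placed along the flight path fires wind and scent cues when the flyer’s pawn enters it. The effects-rig calls are hypothetical placeholders for JUMP’s hardware interface.

```cpp
// Hypothetical sketch of event-driven physical cues in Unreal Engine C++.
#include "Components/BoxComponent.h"
#include "GameFramework/Actor.h"
#include "CueTriggerVolume.generated.h"

UCLASS()
class ACueTriggerVolume : public AActor
{
    GENERATED_BODY()

public:
    ACueTriggerVolume()
    {
        Trigger = CreateDefaultSubobject<UBoxComponent>(TEXT("Trigger"));
        RootComponent = Trigger;
        Trigger->OnComponentBeginOverlap.AddDynamic(
            this, &ACueTriggerVolume::OnFlyerEnter);
    }

    // Target wind level for this stretch of the jump, normalized 0..1.
    UPROPERTY(EditAnywhere, Category = "Cues")
    float WindLevel = 0.5f;

    // Scent cartridge to pulse near this point (hypothetical channel index).
    UPROPERTY(EditAnywhere, Category = "Cues")
    int32 ScentChannel = 0;

private:
    UFUNCTION()
    void OnFlyerEnter(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                      UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                      bool bFromSweep, const FHitResult& SweepResult)
    {
        // A real rig would forward these values to fan, scent, and haptic
        // controllers; this log line stands in for that hardware bridge.
        UE_LOG(LogTemp, Log, TEXT("Cue fired: wind=%.2f, scent=%d"),
               WindLevel, ScentChannel);
    }

    UPROPERTY(VisibleAnywhere)
    UBoxComponent* Trigger;
};
```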

Details that give a real sense of presence include filling the wingsuit with compressed air. Once a flyer jumps off the cliff, their wingsuit inflates within seconds, while a fan blows wind at an ever-increasing speed to add to the realism of the experience, as sketched below.
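One simple way to drive that ramp-up, sketched here under assumed parameters (maximum airspeed and smoothing rate), is to map the flyer’s virtual airspeed to a normalized fan level each tick, with smoothing so the wind builds as the dive accelerates:

```cpp
// Minimal sketch, assuming a fan rig with a normalized 0..1 control input.
// MaxAirspeedMps and SmoothingPerSec are illustrative, not JUMP's values.
#include "Math/UnrealMathUtility.h"

float ComputeFanLevel(float AirspeedMps, float PrevLevel, float DeltaSeconds)
{
    const float MaxAirspeedMps = 55.f; // ~200 km/h, near wingsuit top speed
    const float SmoothingPerSec = 2.f; // how quickly the fan chases the target

    const float Target = FMath::Clamp(AirspeedMps / MaxAirspeedMps, 0.f, 1.f);
    const float Alpha = FMath::Clamp(SmoothingPerSec * DeltaSeconds, 0.f, 1.f);
    return FMath::Lerp(PrevLevel, Target, Alpha); // ease toward the target
}
```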

For now, JUMP is a location-based experience, but Jensen hints at a future in which a version of the system could work in homes around the world.

“The JUMP simulator and technology is the foundation for true full mobility in any metaverse,” he says. “A few years of location-based entertainment will inevitably lead us to a perfect virtual reality mobility product for home use.”
