Epic Games showcases the Unreal future of graphics with real-time ray tracing in Star Wars


Over at the Game Developers Conference in San Francisco this week, Epic Games hosted a State of Unreal presentation to demonstrate future improvements to its graphics engine. The biggest highlight for most people will likely be the company’s real-time ray tracing showcase, featuring a cute elevator scene from the Star Wars universe. But there was also a spookily realistic lizard-like alien called Osiris, voiced and animated by the performance of Gollum actor Andy Serkis, and a photorealistic “digital human” named Siren.

The Star Wars scene was designed to show off the various real-time light reflections and cinematic effects that can be achieved with ray tracing. Nvidia just announced real-time ray tracing, dubbed Nvidia RTX, as a feature of its next generation of graphics cards, and Epic Games is among the first to offer support for it. This rendering technique has been “a dream of the graphics and visualization industry for years,” said Nvidia senior vice president Tony Tamasi, but the hold-up until now has been the lack of graphics chips powerful enough for it. With the help of Nvidia’s RTX and Microsoft’s new DirectX Raytracing (DXR) API, Epic Games will be making real-time ray tracing available to Unreal Engine developers later this year.

The Osiris monologue is all about facial animations and how they can be mapped directly from the human actor to the digital character. Working with 3Lateral’s so-called Meta Human Framework technology for capturing actors’ facial performances, the Unreal Engine shows itself capable of producing extraordinarily lifelike (for a fictional alien, anyway) animations. The level of graphical detail and fidelity in this scene is frankly staggering, with all sorts of little twitches and convulsions happening across the alien dude’s face. And it’s all done in real time. Unlike the ray tracing demo, this proof of concept isn’t destined for any sort of immediate deployment; it serves only to show what will one day be possible.

You’ll have to look closely to tell that the Siren performance is digital rather than human. It’s another proof of concept, this time produced in partnership between Epic Games, Tencent, Vicon (who did the finger and body motion capture), Cubic Motion, and 3Lateral. The video was done by mapping the likeness of actor Bingjie Jiang onto the movements performed by actor Alexa Lee. The techniques for achieving the lifelike realism of motion are still quite demanding — including full-body motion capture rigs — but the advancement from the Unreal Engine is in streamlining the rendering process and raising the bar for visual fidelity of the final product.


wajdi1987

Otaku, Cats lover and of course the founder of www.pr0t3ch.com "When life gets hard.. just Watercool and OVERCLOCK !"

