Crytek reveals Neon Noir real-time raytracing demo

by Mark Tyson on 18 March 2019, 10:11

Tags: Crytek

Just ahead of the weekend, Crytek released a tantalising video dubbed 'Neon Noir'. The headline is that this video shows a hardware-agnostic real-time raytracing demo running at 4K 30fps on an AMD Radeon Vega 56. Crytek says that Neon Noir runs on "most mainstream, contemporary AMD and Nvidia GPUs".

In a blog post accompanying the video release, Crytek provides some background to the Neon Noir demo. It says that Neon Noir "shows how real-time mesh ray-traced reflections and refractions can deliver highly realistic visuals for games" – you can judge this for yourself by watching the video, above. Specifically, Neon Noir showcases a new advanced version of CryEngine's Total Illumination with real-time ray tracing.

The video that Crytek has shared follows a police drone's journey through a futuristic neon-lit city after a recent downpour. Complex reflections from streaked, steamy windows, broken mirrors, and the wet streets are seen as the drone surveys the city. The drone itself has various lights, such as headlights, and it even appears to have indicators.

CryEngine 5.5 will integrate this real-time raytracing technology later in 2019. Crytek says that to benefit most from it, games and systems will need to use the latest-generation graphics cards plus APIs like Vulkan and DirectX 12. Hopefully it won't be too long before users are able to download and run the Neon Noir demo to test their PCs; the greater prize, of course, will be games enabled with this technology.

HEXUS Forums :: 15 Comments

This is the raytracing people hoped for. Hope it becomes generally available to devs and supplants the Nvidia-only alternative.

I'm surprised it runs as well as implied on V56.
RTX is the new “G-Sync”…
I'd love to have seen Jensen's reaction when he saw this. Bet that was a picture.
Does anyone know if this actually uses the RTRT cores in an RTX card or will it run using traditional computation on all GPUs?

I'd love to have seen Jensen's reaction when he saw this. Bet that was a picture.

I bet it was! :lol: