For each asset in your scene you need two different UV maps: one for your material and texture layout, and a second, more important one, for the Lightmass light-baking data. I remember staying at my computer for entire days just for this time-consuming process.
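As a minimal illustration of why that second UV channel is so fussy: each lightmap texel stores baked lighting for exactly one surface point, so the lightmap UV layout must contain no overlapping faces. A toy overlap check between two UV triangles, using a separating-axis test (a generic geometric sketch, not Unreal's actual validation code), could look like:

```python
def edges(tri):
    """Edge vectors of a 2D triangle given as three (u, v) pairs."""
    return [(tri[(i + 1) % 3][0] - tri[i][0],
             tri[(i + 1) % 3][1] - tri[i][1]) for i in range(3)]

def project(tri, axis):
    """Min/max of the triangle's vertices projected onto an axis."""
    dots = [p[0] * axis[0] + p[1] * axis[1] for p in tri]
    return min(dots), max(dots)

def uv_triangles_overlap(a, b):
    """True if UV triangles a and b share interior area (SAT for convex shapes)."""
    for tri in (a, b):
        for ex, ey in edges(tri):
            axis = (-ey, ex)  # perpendicular to the edge
            amin, amax = project(a, axis)
            bmin, bmax = project(b, axis)
            if amax <= bmin or bmax <= amin:
                return False  # separating axis found -> no overlap
    return True
```

A lightmap UV validator would run this over every pair of charts; any `True` means two faces would fight over the same baked texels.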
Then there is the lighting workflow, for those who want to try to match an offline renderer (even if top-quality Unreal can't match top-quality Corona).
You have to go through dozens of tricks and options, with skillfully placed lights in your scene, to fake high-quality GI.
I remember that we also sometimes had to override Unreal's internal limitations, at the cost of huge Lightmass compile times.
This is quite outdated information right there.
Lightmap UVs can be auto-generated on import.
The custom Lightmass settings were popular for CPU Lightmass, which is a completely different tool, in both quality and speed, from the new GPU Lightmass. GPU Lightmass is better and much faster, and in the current 4.26 beta it also has denoiser options.
There are some problems with this solution, and you still need to be careful about some things. I'm not a big fan of Lightmass, but it allows some crazy-quality lighting super fast if you know what you are doing.
Here's a 10-second VLM grid bake example on trees, something incredibly difficult and slow in the old CPU Lightmass:
https://www.artstation.com/artwork/W2ZkG2
Also I disagree about quality:
https://www.behance.net/gallery/1056932 ... l%20engine
https://www.behance.net/gallery/1016424 ... l%20engine
https://www.behance.net/gallery/1055682 ... l%20engine
https://www.behance.net/gallery/9765458 ... l%20engine
I would easily take this quality in realtime over waiting even minutes for a single render, and many clients will too.
Sure, there are still many cases where you would run into issues, and I'm not saying this is the best solution right now... However, I think in a year that might be a different story, and not only for Unreal: Otoy is working on Brigade, and Blender's Eevee might get raytracing features and some GI solution. Realtime rendering will take over in a year, even if Eevee or Brigade are more like semi-realtime.
Anything speeding up a realtime renderer could probably be adapted to offline rendering tech, and vice versa. Baked light data has been part of offline renderers for a while now.
To me, Lumen is just a kind of voxel cone tracing, nothing new or extremely hard to implement.
It is a bit more complex:
"Lumen uses ray tracing to solve indirect lighting, but not triangle ray tracing," explains Daniel Wright, technical director of graphics at Epic. "Lumen traces rays against a scene representation consisting of signed distance fields, voxels and height fields. As a result, it requires no special ray tracing hardware."
To achieve fully dynamic real-time GI, Lumen has a specific hierarchy. "Lumen uses a combination of different techniques to efficiently trace rays," continues Wright. "Screen-space traces handle tiny details, mesh signed distance field traces handle medium-scale light transfer and voxel traces handle large scale light transfer."
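To make Wright's "ray tracing, but not triangle ray tracing" idea concrete, here is a toy sphere trace against an analytic signed distance field: the ray marches forward by the distance the SDF reports, which is always a safe step. This is only an illustrative sketch with a single hard-coded sphere, not Lumen's actual pipeline, which blends screen-space, mesh-SDF and voxel traces:

```python
import math

def scene_sdf(p):
    """Signed distance to a toy scene: one sphere of radius 1 at (0, 0, 5)."""
    x, y, z = p
    return math.sqrt(x * x + y * y + (z - 5.0) ** 2) - 1.0

def sphere_trace(origin, direction, max_dist=100.0, eps=1e-4):
    """March along a (normalized) ray, stepping by the SDF value each time,
    until we are within eps of a surface (hit) or exceed max_dist (miss)."""
    t = 0.0
    while t < max_dist:
        p = (origin[0] + t * direction[0],
             origin[1] + t * direction[1],
             origin[2] + t * direction[2])
        d = scene_sdf(p)
        if d < eps:
            return t  # hit: distance along the ray
        t += d        # safe step: nothing is closer than d
    return None       # miss
```

Because every query is just a distance function evaluation, this runs entirely in software with no RT hardware, which is exactly the trade-off the quote describes.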
They use voxel tracing in a limited fashion and add other methods to fully solve GI. But until they show off a mostly indirectly lit interior with multiple bounces, I remain sceptical about this. Their demo showed mostly rocks, which could hide imperfections and light leaking...
I have some doubts this can be adapted to path tracing easily, or whether it's even possible, especially if you want it in realtime...
There is also a more convincing method, called radiance probe global illumination, that you can see at work here:
This is what RTXGI or DDGI does. Not only does it run on DirectX and RT cores, it is also highly dependent on probe density, and from the results I've seen it falls apart rapidly if, for example, smaller objects are indirectly lit in a larger room. At some point it might evolve into something more powerful, but right now it's a bit complicated, and I'm also not sure how a path tracer would use probes and how that would impact quality.
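The probe-density dependence above follows directly from how these systems shade a point: irradiance is stored at a regular grid of probes and blended trilinearly, so a small object sitting between probes receives lighting that none of the probes actually "saw". A toy scalar version of that lookup (real DDGI probes store directional data plus visibility weights, omitted here for brevity) might be:

```python
def trilinear_probe_lookup(probes, spacing, p):
    """Blend the 8 probes of the grid cell containing point p.
    probes[i][j][k] holds the irradiance baked at grid point
    (i * spacing, j * spacing, k * spacing)."""
    fx, fy, fz = (c / spacing for c in p)
    i, j, k = int(fx), int(fy), int(fz)          # cell corner indices
    tx, ty, tz = fx - i, fy - j, fz - k          # fractional position in cell
    result = 0.0
    for di in (0, 1):
        for dj in (0, 1):
            for dk in (0, 1):
                w = ((tx if di else 1 - tx) *
                     (ty if dj else 1 - ty) *
                     (tz if dk else 1 - tz))     # trilinear weight
                result += w * probes[i + di][j + dj][k + dk]
    return result
```

With coarse `spacing`, the eight corner probes can all miss the lighting actually reaching a small occluded object inside the cell, which is the failure mode described above.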
All that said, the whole industry is leaving rasterization for ray tracing, and this is beneficial not only to realtime renderers but to offline renderers as well.
So far I'm seeing a lot of hybrid solutions rather than a move away from rasterization. Raytracing gives fast, high-quality shadows and proper reflections and refractions, but for GI it is horribly slow.