PhotonGI caustic cache re-factoring

Discussion related to the LuxCore functionality, implementations and API.
lacilaci
Donor
Posts: 1969
Joined: Fri May 04, 2018 5:16 am

Re: PhotonGI caustic cache re-factoring

Post by lacilaci »

Dade wrote: Mon Sep 30, 2019 1:50 pm
lacilaci wrote: Mon Sep 30, 2019 1:42 pm Actually, even if you don't use any volume, you will get very different lighting with an SDS-only cache
Very different lighting compared to what? If the only difference is useonlyforsds = 0/1 and you are keeping hybrid rendering enabled, you are rendering the normal caustics twice in one case.
1. You are looking at a pool through water in those images, so you only see SDS caustics and pretty much no normal caustics.
2. In the case where I used useonlyforsds = 0 I disabled light tracing; when useonlyforsds = 1 I also enabled light tracing.

Just try it on your own pool test scene using only the caustic cache vs. the caustic cache (SDS only) + light tracing...
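For reference, the two setups being compared can be expressed as render-config properties. This is a minimal sketch only: apart from `path.photongi.caustic.useonlyforsds`, which is quoted later in this thread, the property names (including the `path.hybridbackforward.enable` switch assumed here for light tracing) may differ between LuxCore versions.

```ini
# Case A: cache restricted to SDS paths, normal caustics via light tracing
path.photongi.caustic.enabled = 1
path.photongi.caustic.useonlyforsds = 1
path.hybridbackforward.enable = 1

# Case B: cache renders all caustics, light tracing disabled
path.photongi.caustic.enabled = 1
path.photongi.caustic.useonlyforsds = 0
path.hybridbackforward.enable = 0
```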
lacilaci
Donor

Re: PhotonGI caustic cache re-factoring

Post by lacilaci »

Here:
sds.jpg
On the left, SDS only + light tracing; on the right, light tracing off and sdsonly = 0.

Same number of samples.
epilectrolytics
Donor
Posts: 790
Joined: Thu Oct 04, 2018 6:06 am

Re: PhotonGI caustic cache re-factoring

Post by epilectrolytics »

The old caustic cache (before the SDS rework) also produced brighter caustics compared to BiDir; it seems that's still the case...
(I think I even tested that with a pool scene somewhere.)
lacilaci
Donor

Re: PhotonGI caustic cache re-factoring

Post by lacilaci »

epilectrolytics wrote: Mon Sep 30, 2019 2:48 pm The old caustic cache (before the SDS rework) also produced brighter caustics compared to BiDir; it seems that's still the case...
(I think I even tested that with a pool scene somewhere.)
I'm pretty sure the one on the right is correct; the SDS + light tracing one looks terrible and has some serious performance issues (and won't show any signs of volume scattering).
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: PhotonGI caustic cache re-factoring

Post by Dade »

lacilaci wrote: Mon Sep 30, 2019 2:37 pm Here:
sds.jpg

On the left, SDS only + light tracing; on the right, light tracing off and sdsonly = 0.
What you are observing is the importance of indirect caustic lighting (SD+S paths). It is something we speculated about a few posts back, and something that can be rendered only by using the caustic cache.

So I flipped the problem: instead of rendering only SDS paths with the cache, I render only SD paths (aka normal caustics) with hybrid rendering. This also has the additional benefit of not requiring any setting: if hybrid rendering is enabled, the cache will not be used for SD paths.

So "path.photongi.caustic.useonlyforsds" has been removed and is not required anymore; just enable/disable hybrid rendering at will.

P.S. I have the feeling you are using 100x more photons than you should. Using too many photons both requires a lot of RAM and slows down the rendering.
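Under this new behavior, the only remaining switch is the hybrid-rendering toggle itself; a hypothetical sketch (assuming `path.hybridbackforward.enable` is the relevant property, which may differ by LuxCore version):

```ini
path.photongi.caustic.enabled = 1

# Hybrid rendering on: light tracing renders SD (normal) caustics,
# the caustic cache is used only for SDS paths
path.hybridbackforward.enable = 1

# Hybrid rendering off: the caustic cache renders both SD and SDS caustics
#path.hybridbackforward.enable = 0
```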
Support LuxCoreRender project with salts and bounties
lacilaci
Donor

Re: PhotonGI caustic cache re-factoring

Post by lacilaci »

Dade wrote: Tue Oct 01, 2019 8:26 am
lacilaci wrote: Mon Sep 30, 2019 2:37 pm Here:
sds.jpg

On the left, SDS only + light tracing; on the right, light tracing off and sdsonly = 0.
What you are observing is the importance of indirect caustic lighting (SD+S paths). It is something we speculated about a few posts back, and something that can be rendered only by using the caustic cache.

So I flipped the problem: instead of rendering only SDS paths with the cache, I render only SD paths (aka normal caustics) with hybrid rendering. This also has the additional benefit of not requiring any setting: if hybrid rendering is enabled, the cache will not be used for SD paths.

So "path.photongi.caustic.useonlyforsds" has been removed and is not required anymore; just enable/disable hybrid rendering at will.

P.S. I have the feeling you are using 100x more photons than you should. Using too many photons both requires a lot of RAM and slows down the rendering.
I didn't do scientific comparisons, but I did a ton of tests with both low and high photon counts, all in a production scene with a ton of glass and a pool (both overview renderings and crazy close-ups).
This is important because I have large areas covered by roughly uniform, smooth caustics (both SDS and normal) and also a lot of sharp caustics.

I think radius reduction is useless; let me explain:

1. You start with a big radius and get darkening and big splotches that will:
A. be replaced with noisy smaller patches, so you need to wait for photons to "fill" the rendering, if you have fast radius reduction
B. be refined into nice caustics, if you have good radius reduction, but you still have to wait the same amount of time as in A

So starting with a lower (or target) radius gives a much better overview of performance, because you can actually see the amount of noise in the photons and the contribution of each update.

2. You have to use a low number of photons per spp update, because a large number of photons with a big radius kills performance drastically, but:
A. as the radius gets smaller, the number of photons produced per update is too low and you render forever
B. if you start with a low radius, you can use more photons without too much slowdown, but you now need to balance photon-generation performance against path-tracing performance

In conclusion, if I use a low number of photons (no matter the initial radius; that feature is useless and adds nothing to the render), I might end up rendering for an hour to get a super clean path-tracing result and then another 10 hours until there are enough photons in the rendering.

So I'd rather go with something like 4 million photons per 4 spp and a fixed radius of 0.0015, which gives me roughly a 1/2 samples/sec performance penalty, but photon convergence progresses at about the same rate as the path tracing, so visually I can clearly see and somewhat estimate how long it will take until the rendering is clean.

The only exception is some occasional fireflies, which however might be (mostly) solved with light tracing.

So maybe I'm missing something, but can you explain what the benefit of the radius reduction function is?
From all my testing it was useless for preview purposes (a fixed, larger radius would serve well enough),
and for final renderings it just hurts performance and makes any estimates impossible.

For example, in my scene I use 4M photons per 4 spp at a fixed 0.0015 radius and I can judge very well:
A. whether the target radius is good enough
B. whether photon tracing is progressing as well as the rendering
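The fixed-radius setup described above might look roughly like this in render-config form. This is a sketch under stated assumptions: apart from the `path.photongi.caustic.*` prefix quoted earlier in the thread, these property names are hypothetical and may differ by LuxCore version.

```ini
path.photongi.caustic.enabled = 1
path.photongi.caustic.maxsize = 4000000        # ~4M photons per update pass (hypothetical name)
path.photongi.caustic.updatespp = 4            # rebuild the cache every 4 spp (hypothetical name)
path.photongi.caustic.lookup.radius = 0.0015   # fixed lookup radius (hypothetical name)
```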
Dade
Developer

Re: PhotonGI caustic cache re-factoring

Post by Dade »

lacilaci wrote: Tue Oct 01, 2019 8:59 am I think radius reduction is useless; let me explain:
I have already written this in the PSR thread; the truth is it is only a theoretical exercise to claim "unbiasedness" (angle reduction, radius reduction, etc. are all the same stuff). But you can set the min. and max. radius to the same value and live happily.
lacilaci wrote: Tue Oct 01, 2019 8:59 am So I'd rather go with something like 4 million photons per 4 spp and a fixed radius of 0.0015
As I wrote before, try 2M photons and a 2 spp update. The fewer photons you store, the faster the cache lookup is (i.e. more samples/sec) and the less GPU memory you need. If you are rendering at high resolution, you can even use a 1 spp update.
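Combining both suggestions (equal min/max radius to disable radius reduction, plus a smaller, cheaper update step) might look like this. The property names here are assumptions in the spirit of `path.photongi.caustic.useonlyforsds`, not a verified list; check your LuxCore version's documentation.

```ini
path.photongi.caustic.maxsize = 2000000          # fewer stored photons: faster lookups, less GPU memory
path.photongi.caustic.updatespp = 2              # smaller update step
# equal min/max lookup radius = no radius reduction (hypothetical names)
path.photongi.caustic.lookup.minradius = 0.0015
path.photongi.caustic.lookup.maxradius = 0.0015
```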
epilectrolytics
Donor

Re: PhotonGI caustic cache re-factoring

Post by epilectrolytics »

Dade wrote: Tue Oct 01, 2019 9:11 am try 2M photons and a 2 spp update.
Is there any reason to ever use a higher sample step value?
In my testing with the caustic cache I always ended up with the lowest step possible and then adjusted the photon count.

Maybe spp should be fixed to a low value (one less parameter to expose).
Dade
Developer

Re: PhotonGI caustic cache re-factoring

Post by Dade »

epilectrolytics wrote: Tue Oct 01, 2019 9:23 am
Dade wrote: Tue Oct 01, 2019 9:11 am try 2M photons and a 2 spp update.
Is there any reason to ever use a higher sample step value?
In my testing with the caustic cache I always ended up with the lowest step possible and then adjusted the photon count.

Maybe spp should be fixed to a low value (one less parameter to expose).
There is a fixed overhead associated with an update (stop the GPU, update the cache, transfer all the data), so too many updates can have a negative impact on overall GPU performance. One large update costs less than two smaller ones. It is always a trade-off.
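The amortization argument can be made concrete with rough, purely illustrative numbers: if each cache update costs a fixed stall (say ~1 s for stopping the GPU, rebuilding the cache, and transferring it) and a frame needs 256 spp, a 1 spp step pays that stall 256 times while a 4 spp step pays it only 64 times, at the price of a staler cache between updates. In config terms (hypothetical property name, as above):

```ini
# larger step = fewer fixed-cost updates, but coarser cache refreshes
path.photongi.caustic.updatespp = 4
```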
epilectrolytics
Donor

Re: PhotonGI caustic cache re-factoring

Post by epilectrolytics »

Dade wrote: Tue Oct 01, 2019 9:30 am It is always a trade-off.
OK, I see.