This is the interesting part...
If you have a large pool (because yes, that's my example these days) you will need a LOT of photons... especially if it's 42 m long and you are looking very closely at a very small portion of it.
But you can only trace so many photons per spp before you hurt performance (the same goes for the initial radius). So while Dade keeps suggesting tracing a smaller number of photons more often...
I might be in a situation where tracing 2M photons per 2 spp is too slow at filling the pool with photons, but 50M photons per 10 spp on the CPU is actually fantastic; the bigger batch also delivers five times the photon density (1M vs 5M photons per spp). It is indeed a trade-off you need to test: see how much you can get away with for your scene and your hardware, and decide what you are willing to trade.
However, the problem I was complaining about (radius reduction) is plain stupid: it has NO real benefit, only hurts performance, and only confuses the user, because now you cannot tell whether you have enough photons per spp... Nobody can guess this value properly, and the "correct" value would also depend on the starting radius and the target radius, so it's just bullshit.
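For reference, here's why: the classic progressive photon mapping update (Hachisuka et al.) shrinks the radius every pass. Below is a minimal sketch under my own simplifications, not LuxCore's actual code, and with made-up variable names:

```cpp
#include <cmath>
#include <cstdio>

// Sketch of the classic progressive photon mapping radius update:
//   r_{i+1}^2 = r_i^2 * (N_i + alpha * M_i) / (N_i + M_i)
// Simplification: every traced photon is treated as gathered
// (M_i = photons per pass), just to show the shape of the curve.
int main() {
    double radius = 0.10;         // starting gather radius (scene units)
    const double alpha = 0.7;     // the "radius reduction" knob in question
    const double perPass = 4e6;   // photons traced per pass (M_i)
    double accumulated = 0.0;     // photons kept so far (N_i)

    for (int pass = 1; pass <= 10; ++pass) {
        const double ratio =
            (accumulated + alpha * perPass) / (accumulated + perPass);
        radius *= std::sqrt(ratio);
        accumulated += alpha * perPass;
        std::printf("pass %2d: radius = %.5f\n", pass, radius);
    }
    // The radius you land on depends on alpha, the starting radius AND
    // the number of passes, so no single alpha value maps cleanly to the
    // radius you actually want for a given scene.
    return 0;
}
```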
The whole bias side of this solution should be decided by the user, by defining the fixed radius needed for the given scene/purpose (plus the number of photons, per how many spp). Radius reduction is guaranteed to confuse people (general users, archviz or product viz people) and just makes this feature very unattractive.
It should be photon radius + X photons per Y spp: 3 parameters (the photon radius depending on the need, the other two on performance), something like the sketch below.
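As a hypothetical illustration only (these names are mine, not actual LuxCore properties):

```cpp
// Hypothetical three-parameter interface; names are made up
// for illustration and are not actual LuxCore properties.
struct PhotonCacheSettings {
    float photonRadius = 0.0015f;   // fixed gather radius, chosen per scene/purpose
    long  photonsPerPass = 4000000; // photons traced per pass (performance knob)
    int   sppPerPass = 4;           // samples rendered between photon passes
};
```

With a fixed radius the bias becomes a conscious, visible choice, and the other two knobs are pure performance tuning.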
I know I can set the starting and target radius to the same value to get that result... But every exposed parameter that lets people do weird shit is going to be used wrongly, and you can bet something as useless as radius reduction is going to do bad stuff...
Anyway, on today's hardware I can do 4M photons per 4 spp with so little tradeoff that in 10 min I get the 42 m pool filled with photons at a radius of 0.0015 (scene scale) - fireflies remain a problem.