Ok, with filtering things start to look very very good.
I am impressed....
Here are my observations.
Spikes in the cache are no longer recognizable as leaks/bleeding/fireflies. Even an extremely difficult situation/stress test (sun + sky + interior + a small photon pool of 100K photons + a small cache size of 100K) doesn't look like bleeding, just weird lighting overall.
Since the preprocess can be very fast even with moderate numbers, I think a lower limit for the photon pool and cache size should be set (we need to figure out where things start to fall apart and/or take too much time on ~mainstream HW). This would let users simply not care about the cache settings and would stop them from breaking the cache.
Also, a usage threshold of 4 seems very fair to large spaces and is still fast to render even on CPU (lookup radius of 25-35 cm), without requiring too many photons.
@Dade, the spikes are not really gone. You know that, but to make sure they are not visible, the photon count + cache size have to stay above a certain level.
Here is a very fast precalc setting: 1 000 000 photons + a 500 000 cache size, with a 35 cm lookup and a threshold scale of 4:
At first sight it is clean and super fast (5 min on a GTX 1060 at 4K), but there is a pretty strong blue tint, coming from the sky, in the small hallway with the plane in the center of the image. Yes, it's a big scene with a lot of trees plus an interior, and probably 2-4x more photons are needed, but I just wanted to say that the spikes are gone in the form of leaks, yet they might still show up as a weird color cast when the settings/photon counts are low.
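For completeness, here is roughly that parameter set written out as a plain Python dict. The key names are just descriptive placeholders I made up for this post, not the actual cache property names, so map them to whatever the real settings end up being called:

# Illustrative only: placeholder names, values are the ones from the test above.
photon_gi_test_settings = {
    "photon_count": 1_000_000,      # photons traced in the preprocess
    "cache_max_size": 500_000,      # maximum number of cache entries kept
    "lookup_radius_m": 0.35,        # 35 cm lookup radius
    "usage_threshold_scale": 4,     # the threshold scale of 4 mentioned above
}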
So, maybe we should start thinking about defaults and start hiding and limiting some settings.
If the preprocess takes anywhere from 1 to 4 seconds on HW like my current, outdated setup (4770K and GTX 1060), that should be the low end, with nothing lower available (to avoid user errors and funky bogus tutorials about "performance improvements" with extreme settings).
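To make the "nothing lower available" idea concrete, a simple clamp when the settings are parsed would be enough. This is only a sketch: the floor values are my guesses from the tests in this thread (the real numbers should come from profiling on mainstream HW), and the function and parameter names are hypothetical:

# Hypothetical sketch: clamp user-supplied cache settings to sane floors so
# extreme low values can't break the cache. Floor values are guesses, not
# measured minimums.
MIN_PHOTON_COUNT = 500_000
MIN_CACHE_SIZE = 250_000

def clamp_photon_gi_settings(photon_count: int, cache_size: int) -> tuple[int, int]:
    """Return (photon_count, cache_size) raised to the proposed lower limits."""
    return (max(photon_count, MIN_PHOTON_COUNT),
            max(cache_size, MIN_CACHE_SIZE))

# Example: a user trying 100K/100K (the stress test above) would silently get
# the floor values instead.
print(clamp_photon_gi_settings(100_000, 100_000))  # -> (500000, 250000)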
All in all, I'm very happy with how this turned out. The only thing left now is support for all materials, so that complex scenes can be tested.