
denoiser on gpu

Posted: Mon Oct 22, 2018 4:42 pm
by lacilaci
I want to test out BCD more. But even after the recent updates I cannot denoise renders above 4K with OpenCL.
However, running CPU-only I can denoise renders at whatever resolution I want.
So my question is: could we have an option to run the denoiser on the CPU only, so that I can render with OpenCL and still denoise high-res images?

Re: denoiser on gpu

Posted: Mon Oct 22, 2018 6:41 pm
by Dade
lacilaci wrote: Mon Oct 22, 2018 4:42 pm I want to test out BCD more. But even after the recent updates I cannot denoise renders above 4K with OpenCL.
However, running CPU-only I can denoise renders at whatever resolution I want.
So my question is: could we have an option to run the denoiser on the CPU only, so that I can render with OpenCL and still denoise high-res images?
I guess the problem you have is not in running the denoiser plugin itself (it already runs only on the CPU) but in the collection of statistics during the rendering (required later to run the denoiser). They require a LOT of GPU ram. The amount of ram used can be reduced by tuning some parameters, but the best solution, in this case, is usually to just use TILEPATHOCL.
It is pretty much the only good reason to use TILEPATH over normal PATH rendering: TILEPATH only stores a few tiles in GPU ram, so the global image resolution doesn't matter and doesn't increase the amount of GPU ram used.
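Something like this in the render config, as a sketch only (property names are from memory of LuxCore's config format and may differ in your build; check the docs):

```
# Switch to the tiled OpenCL engine so only a few tiles live in GPU ram.
renderengine.type = TILEPATHOCL
sampler.type = TILEPATHSAMPLER
# Smaller tiles reduce the per-tile GPU memory use further.
tile.size = 32
```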

Re: denoiser on gpu

Posted: Mon Oct 22, 2018 8:17 pm
by lacilaci
Dade wrote: Mon Oct 22, 2018 6:41 pm
lacilaci wrote: Mon Oct 22, 2018 4:42 pm I want to test out BCD more. But even after the recent updates I cannot denoise renders above 4K with OpenCL.
However, running CPU-only I can denoise renders at whatever resolution I want.
So my question is: could we have an option to run the denoiser on the CPU only, so that I can render with OpenCL and still denoise high-res images?
I guess the problem you have is not in running the denoiser plugin itself (it already runs only on the CPU) but in the collection of statistics during the rendering (required later to run the denoiser). They require a LOT of GPU ram. The amount of ram used can be reduced by tuning some parameters, but the best solution, in this case, is usually to just use TILEPATHOCL.
It is pretty much the only good reason to use TILEPATH over normal PATH rendering: TILEPATH only stores a few tiles in GPU ram, so the global image resolution doesn't matter and doesn't increase the amount of GPU ram used.
Denoising a 4K image with TILEPATHOCL resulted in many tiles rendering black (maybe related to the variance clamping issue you fixed recently?). I will try again once alphav5 is out.

Re: denoiser on gpu

Posted: Mon Oct 22, 2018 8:18 pm
by B.Y.O.B.
Try with an AA sample size of 1.
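In config terms that would be something like this (property name from memory, may differ in your version):

```
# One AA sample per tile pass; the tile converges over multiple passes.
tilepath.sampling.aa.size = 1
```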

Re: denoiser on gpu

Posted: Tue Oct 23, 2018 5:12 am
by lacilaci
OK, so it seems the denoiser doesn't have enough samples to work with where the black tiles appear... eventually they seem to go away.
However, if I want to use higher AA sample values, I might end up with tiles that are converged and yet still black, because the denoiser doesn't get enough samples (as in the attached image).

Re: denoiser on gpu

Posted: Tue Oct 23, 2018 7:16 am
by lacilaci
OK, I know it's probably harder than my little brain is capable of understanding. But is it somehow possible to get whatever Cycles is using for denoising to be used in LuxCore? I know TILEPATH is probably not the same as how Cycles renders, and maybe some extra passes are needed to make it work. But maybe it could be doable and worthwhile to do that?

Yesterday I tried really hard for a few hours to make that denoiser work for me, because I can't rely forever on a standalone build of the Nvidia denoiser that probably won't be maintained... Turning all the values up and down on an ongoing 4K render, and it just won't work: it can remove fireflies, but that's it.

Re: denoiser on gpu

Posted: Fri Oct 26, 2018 11:03 am
by lacilaci
Another idea...
Nvidia's denoiser, for example, can use the shading-normal pass and the albedo pass to preserve some detail (that's how I understand it).
I know there is already some data gathering going on when the denoiser is enabled; it eats a lot of memory and the results are not very friendly, especially for quick previews.
Could the denoiser be tweaked so that extra passes (something like albedo and normals) would be used as masks, so that it won't blur between objects, texture details, etc.?
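The idea of using albedo and normal passes as edge masks can be sketched as a joint (cross) bilateral filter: large differences in the feature buffers suppress smoothing across object or texture boundaries. This is a generic illustration of the technique, not BCD's or Nvidia's actual implementation; all names and parameters here are hypothetical.

```python
import numpy as np

def joint_bilateral(noisy, albedo, normal, radius=2,
                    sigma_spatial=2.0, sigma_albedo=0.1, sigma_normal=0.1):
    """Denoise `noisy` (H x W x 3) guided by albedo/normal feature buffers."""
    h, w, _ = noisy.shape
    out = np.zeros_like(noisy)
    for y in range(h):
        for x in range(w):
            wsum = 0.0
            acc = np.zeros(3)
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if not (0 <= ny < h and 0 <= nx < w):
                        continue
                    # Spatial weight: nearby pixels count more.
                    ws = np.exp(-(dx * dx + dy * dy) / (2 * sigma_spatial ** 2))
                    # Feature weights: a big albedo or normal difference
                    # (an object or texture edge) suppresses the contribution,
                    # so the filter does not blur across it.
                    da = np.sum((albedo[y, x] - albedo[ny, nx]) ** 2)
                    dn = np.sum((normal[y, x] - normal[ny, nx]) ** 2)
                    wgt = (ws * np.exp(-da / (2 * sigma_albedo ** 2))
                              * np.exp(-dn / (2 * sigma_normal ** 2)))
                    acc += wgt * noisy[ny, nx]
                    wsum += wgt
            out[y, x] = acc / wsum
    return out
```

With a sharp albedo edge between two regions, pixels on one side are averaged only with their own side, so the edge stays crisp while noise within each region is smoothed out.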