denoiser on gpu

Use this forum for general user support and related questions.
Forum rules
Please upload a testscene that allows developers to reproduce the problem, and attach some images.
lacilaci
Donor
Posts: 1969
Joined: Fri May 04, 2018 5:16 am

denoiser on gpu

Post by lacilaci »

I want to test out BCD more, but even after the recent updates I cannot denoise renders above 4K with OpenCL.
Running CPU-only, however, I can denoise a render at whatever resolution I want.
So my question is: could we have an option to run the denoiser on the CPU only, so that I can use OpenCL and still denoise high-res images?
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: denoiser on gpu

Post by Dade »

lacilaci wrote: Mon Oct 22, 2018 4:42 pm I want to test out BCD more, but even after the recent updates I cannot denoise renders above 4K with OpenCL.
Running CPU-only, however, I can denoise a render at whatever resolution I want.
So my question is: could we have an option to run the denoiser on the CPU only, so that I can use OpenCL and still denoise high-res images?
I guess the problem you have is not in running the denoiser plugin itself (it already runs only on the CPU) but in the collection of statistics during the rendering (required to then run the denoiser). They require a LOT of GPU RAM. The amount of RAM used can be reduced by tuning some parameters, but the best solution in this case is usually to just use TILEPATHOCL.
It is pretty much the only good reason to use TILEPATH over normal PATH rendering: TILEPATH stores only a few tiles in GPU RAM, so the global image resolution doesn't matter and doesn't increase the amount of GPU RAM used.
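For reference, the engine switch is a one-line change in the render config. A minimal sketch, with property names assumed from the usual LuxCore conventions (verify against the docs for your version):

```
# Sketch of a LuxCore render-config fragment (property names assumed,
# check the LuxCoreRender documentation for your version)
renderengine.type = TILEPATHOCL
# Smaller tiles mean less per-tile GPU RAM; 64 is an illustrative value
tile.size = 64
```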
lacilaci
Donor
Posts: 1969
Joined: Fri May 04, 2018 5:16 am

Re: denoiser on gpu

Post by lacilaci »

Dade wrote: Mon Oct 22, 2018 6:41 pm
I guess the problem you have is not in running the denoiser plugin itself (it already runs only on the CPU) but in the collection of statistics during the rendering (required to then run the denoiser). They require a LOT of GPU RAM. The amount of RAM used can be reduced by tuning some parameters, but the best solution in this case is usually to just use TILEPATHOCL.
It is pretty much the only good reason to use TILEPATH over normal PATH rendering: TILEPATH stores only a few tiles in GPU RAM, so the global image resolution doesn't matter and doesn't increase the amount of GPU RAM used.
Denoising a 4K image with TILEPATHOCL resulted in many tiles rendering black (maybe related to the variance clamping issue you fixed recently?). I will try again once alpha v5 is out.
B.Y.O.B.
Developer
Posts: 4146
Joined: Mon Dec 04, 2017 10:08 pm
Location: Germany

Re: denoiser on gpu

Post by B.Y.O.B. »

Try with an AA sample size of 1.
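If it helps, that corresponds to a single property in the render config. The name below is assumed from the usual TILEPATH sampling settings, so double-check it against your version:

```
# AA sample size for the TILEPATH engines (property name assumed)
tilepath.sampling.aa.size = 1
```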
lacilaci
Donor
Posts: 1969
Joined: Fri May 04, 2018 5:16 am

Re: denoiser on gpu

Post by lacilaci »

Ok, so it seems the denoiser doesn't have enough samples to work with where the black tiles appear... eventually they seem to go away.
However, if I want to use higher AA sample values, I might end up with tiles that are converged and yet still black because the denoiser doesn't get enough samples (as in the attached image).
Attachments
denoiserblacktiles.JPG
lacilaci
Donor
Posts: 1969
Joined: Fri May 04, 2018 5:16 am

Re: denoiser on gpu

Post by lacilaci »

Ok, I know it's probably harder than my little brain is capable of understanding, but is it somehow possible to use whatever Cycles uses for denoising in LuxCore? I know TILEPATH probably doesn't render the same way Cycles does, and maybe some extra passes are needed to make it work, but maybe it would be doable and worthwhile?

Yesterday I tried really hard for a few hours to make that denoiser work for me, because I can't rely forever on a standalone build of the NVIDIA denoiser that probably won't be maintained... I turned all the values up and down on an ongoing 4K rendering and it just won't work; it can remove fireflies, but that's it.
lacilaci
Donor
Posts: 1969
Joined: Fri May 04, 2018 5:16 am

Re: denoiser on gpu

Post by lacilaci »

Another idea...
The NVIDIA denoiser, for example, can use a shading normal pass and an albedo pass to preserve some details (that's how I understand it).
So I know there is already some data gathering going on when the denoiser is enabled; it eats a lot of memory and the results are not very friendly, especially for quick previews.
Could the denoiser be tweaked so that extra passes (something like albedo and normals) would be used as masks for the denoiser, so that it won't blur between objects, texture details, etc.?
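For what it's worth, film outputs along those lines can be requested in the render config. A hedged sketch, assuming channel names like ALBEDO and AVG_SHADING_NORMAL exist in the version at hand (they may only have arrived in later releases):

```
# Hypothetical film-output fragment; channel names are assumptions
film.outputs.0.type = ALBEDO
film.outputs.0.filename = albedo.exr
film.outputs.1.type = AVG_SHADING_NORMAL
film.outputs.1.filename = normal.exr
```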