
Re: CGI tech news box

Posted: Wed Apr 25, 2018 1:53 pm
by patrickawalz
Out-of-core geometry and textures in LuxCore would be a huge feature.

Re: CGI tech news box

Posted: Wed Apr 25, 2018 3:33 pm
by Dade
patrickawalz wrote: Wed Apr 25, 2018 1:53 pm Out-of-core geometry and textures in LuxCore would be a huge feature.
If you have an AMD Vega 56 or 64 GPU, it is already available: you only have to enable HBCC in the driver control panel and the GPU will be able to use more RAM than the amount physically available on the card.
The latest NVIDIA GPUs have an even more advanced feature, but I'm not sure if/how it is enabled.

No, the general idea is that GPU RAM is just a cache of CPU RAM and the hardware swaps pages in and out on demand, without any need for specific support from the software.
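In concrete terms, on the NVIDIA side this is what CUDA unified memory does: on Pascal-or-newer GPUs, and with a driver/OS combination that allows oversubscription, a managed allocation can exceed physical VRAM and the driver page-faults data in and out as the kernel touches it. A minimal sketch of the idea (sizes and names are illustrative, not LuxCore code):

Code:

    #include <cuda_runtime.h>
    #include <cstdio>

    // Touch every element of a buffer that may be larger than physical VRAM.
    __global__ void touch(float *data, size_t n) {
        size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
        if (i < n)
            data[i] += 1.f;
    }

    int main() {
        // 16 GB of managed memory: more than most 2018 GPUs have on board.
        const size_t n = (size_t)4 << 30;
        float *data = nullptr;

        // Managed memory lives in one address space shared by CPU and GPU;
        // the driver pages chunks into VRAM as the kernel touches them and
        // evicts cold pages back to system RAM. No app-level swapping code.
        if (cudaMallocManaged(&data, n * sizeof(float)) != cudaSuccess) {
            fprintf(stderr, "allocation failed\n");
            return 1;
        }

        touch<<<(unsigned)((n + 255) / 256), 256>>>(data, n);
        cudaDeviceSynchronize();

        cudaFree(data);
        return 0;
    }

This is exactly the "GPU RAM as a cache" model described above: the application allocates once and lets the hardware/driver handle residency.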

P.S. Yes, there are vendors claiming to have added out-of-core rendering to their software when they have not changed a single line of code and are just leveraging a feature of the latest GPUs :roll:

Re: CGI tech news box

Posted: Fri May 04, 2018 8:57 pm
by kintuX
NVIDIA AI Denoiser
Over the holidays, to keep me busy, I implemented a simple command-line tool around NVIDIA's new AI denoiser. Here are some examples of what it can do:

Original image: [image]

Denoised image: [image]

The code can be found here:

https://github.com/DeclanRussell/NvidiaAIDenoiser

I have also created a Windows distribution for those who wish to try it out:

Denoiser Windows v1.0.0
source: https://declanrussell.com/portfolio/nvidia-ai-denoiser/
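For context on what the linked tool drives under the hood: the AI denoiser ships as a built-in OptiX 5 post-processing stage named "DLDenoiser". A rough host-side sketch of the invocation pattern (buffer setup omitted; details follow the OptiX 5 samples and should be treated as an assumption, not the linked tool's exact code):

Code:

    #include <optixu/optixpp_namespace.h>

    // Run the OptiX 5 built-in AI denoiser over a noisy RGBA float buffer.
    void denoise(optix::Context context,
                 optix::Buffer inputBuffer,   // noisy beauty, RT_FORMAT_FLOAT4
                 optix::Buffer outputBuffer,  // denoised result, same size
                 unsigned width, unsigned height)
    {
        // "DLDenoiser" is the built-in deep-learning denoiser stage.
        optix::PostprocessingStage stage =
            context->createBuiltinPostProcessingStage("DLDenoiser");
        stage->declareVariable("input_buffer")->set(inputBuffer);
        stage->declareVariable("output_buffer")->set(outputBuffer);

        // Post-processing stages are executed through a command list.
        optix::CommandList commands = context->createCommandList();
        commands->appendPostprocessingStage(stage, width, height);
        commands->finalize();
        commands->execute();
    }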

Re: CGI tech news box

Posted: Sat May 05, 2018 2:23 pm
by Dade
kintuX wrote: Fri May 04, 2018 8:57 pm NVIDIA AI Denoiser
Sincerely, ours works a lot better :mrgreen:

[attachment: cornel.jpg]

Re: CGI tech news box

Posted: Sat May 05, 2018 6:48 pm
by kintuX
Dade wrote: Sat May 05, 2018 2:23 pm
kintuX wrote: Fri May 04, 2018 8:57 pm NVIDIA AI Denoiser
Sincerely, ours works a lot better :mrgreen:
Yup, I've been testing BCD integrated in appleseed and can't wait to use it with Lux. Also their Light Tree (many-light rendering; if I'm not mistaken, similar to LC pt. 1) is amazing. Those two additions on GPU/OCL and with BiDir will make for another great revolution in LuxCore... just marvelous :mrgreen:

PS
My intention in posting a link to NVIDIA's AI denoiser was quite the opposite... to let users/artists see its dumbness and inefficiency. Developers, and humans in general, are much smarter than the 'HeadMasters' want us to believe... not to mention that the creation here (LuxCore) is the work of a genius. ;)
I am really thankful to be a part of it.

Re: CGI tech news box

Posted: Tue May 08, 2018 7:40 am
by Dade

Re: CGI tech news box

Posted: Tue May 08, 2018 10:06 am
by Sharlybg
Seems that the second link is not correct :?:

Re: CGI tech news box

Posted: Tue May 08, 2018 10:07 am
by Dade
Sharlybg wrote: Tue May 08, 2018 10:06 am Seems that the second link is not correct :?:
Fixed.

Re: CGI tech news box

Posted: Tue May 08, 2018 7:47 pm
by patrickawalz
I actually mentioned this in the features wishlist thread, along with the improvements to glossy surfaces and the GGX microfacet distribution. The implementation is even included on the project website (for Mitsuba).
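For reference, the GGX microfacet distribution mentioned above has a compact closed form. A minimal sketch of the standard Trowbridge-Reitz/GGX NDF (textbook form, not LuxCore's or Mitsuba's actual implementation):

Code:

    #include <cmath>

    // GGX / Trowbridge-Reitz normal distribution function D(H).
    // cosThetaH = dot(N, H), surface normal vs. half vector;
    // alpha     = roughness parameter (often artist roughness squared).
    float ggxD(float cosThetaH, float alpha)
    {
        const float Pi = 3.14159265358979f;
        if (cosThetaH <= 0.f)
            return 0.f; // backfacing microfacets carry no density
        const float a2 = alpha * alpha;
        const float c2 = cosThetaH * cosThetaH;
        const float d = c2 * (a2 - 1.f) + 1.f;
        return a2 / (Pi * d * d);
    }

The long tail of this distribution is what gives GGX its characteristic soft highlight falloff compared to Beckmann.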

Re: CGI tech news box

Posted: Fri Jun 15, 2018 10:17 pm
by B.Y.O.B.
https://arnoldsupport.com/2016/05/25/ar ... ure-cache/
In short, if you have an application that will need to read pixels from many large image files, you can rely on ImageCache to manage all the resources for you. It is reasonable to access thousands of image files totalling hundreds of GB of pixels, efficiently and using a memory footprint on the order of 50 MB.
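The ImageCache referred to here is OpenImageIO's. A minimal usage sketch (OIIO 1.8-era C++ API; the file name and limits are illustrative):

Code:

    #include <OpenImageIO/imagecache.h>

    using namespace OIIO;

    int main()
    {
        // One shared cache per process; it keeps only a bounded working
        // set of image tiles resident, whatever the total pixel volume.
        ImageCache *cache = ImageCache::create(true /*shared*/);

        // Cap resident pixel data at ~50 MB, as in the quote above.
        cache->attribute("max_memory_MB", 50.0f);

        // Read a small region of the first three channels; tiles are
        // loaded (and evicted) behind this call on demand.
        float pixels[16 * 16 * 3];
        cache->get_pixels(ustring("huge_texture.exr"),
                          0 /*subimage*/, 0 /*miplevel*/,
                          0, 16,   // x range
                          0, 16,   // y range
                          0, 1,    // z range
                          0, 3,    // channel range
                          TypeDesc::FLOAT, pixels);

        ImageCache::destroy(cache);
        return 0;
    }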