Re: CGI tech news box
Posted: Wed Apr 25, 2018 1:53 pm
Out-of-core geometry and textures in LuxCore would be a huge feature.
If you have an AMD Vega 56 or 64 GPU, it is already available: you only have to enable HBCC in the driver control panel, and the GPU will be able to use more RAM than the physical amount available on the card.
source: https://declanrussell.com/portfolio/nvidia-ai-denoiser/

Over the holidays, to keep me busy, I implemented a simple command-line version of NVIDIA's new AI denoiser. Here are some examples of what it can do.
The code can be found here:
I have created a windows distribution as well for those who wish to try it out here:
Denoiser windows v1.0.0.
Yup, been testing BCD integrated into appleseed and can't wait to use it with Lux. Also their Light Tree (many-light rendering; if I'm not mistaken, similar to LC pt. 1) is amazing. Those two additions on GPU/OCL, together with BiDir, will make for another great revolution in LuxCore... just marvelous.
It seems that the second link is not correct.
Dade wrote: ↑Tue May 08, 2018 7:40 am
A Practical Extension to Microfacet Theory for the Modeling of Varying Iridescence

I actually mentioned this in the features wishlist thread, along with the improvements to glossy surfaces and the GGX microfacet distribution. An implementation is even included on the project website (for Mitsuba).
In short, if you have an application that will need to read pixels from many large image files, you can rely on ImageCache to manage all the resources for you. It is reasonable to access thousands of image files totalling hundreds of GB of pixels, efficiently and using a memory footprint on the order of 50 MB.
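The bounded-footprint behavior described above can be sketched with a toy LRU tile cache. This is plain Python illustrating the caching idea only, not the actual OpenImageIO ImageCache API; the class name, byte budget, and loader callback are all illustrative assumptions:

```python
from collections import OrderedDict

class TinyTileCache:
    """Toy LRU tile cache: access many large images while keeping
    resident pixel data under a fixed byte budget (the caching idea
    behind ImageCache, not its real API)."""

    def __init__(self, max_bytes):
        self.max_bytes = max_bytes
        self.used = 0
        self.tiles = OrderedDict()  # (filename, tile_xy) -> raw bytes

    def get_tile(self, filename, tile_xy, loader):
        key = (filename, tile_xy)
        if key in self.tiles:
            self.tiles.move_to_end(key)  # cache hit: mark most recently used
            return self.tiles[key]
        data = loader(filename, tile_xy)  # cache miss: read tile from disk
        self.tiles[key] = data
        self.used += len(data)
        while self.used > self.max_bytes and self.tiles:
            _, evicted = self.tiles.popitem(last=False)  # drop least recently used
            self.used -= len(evicted)
        return data
```

With a 2 KB budget and 1 KB tiles, requesting a third tile evicts the least recently used one, so the footprint stays bounded no matter how many files are touched.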