CGI tech news box

General project and community related discussions and offtopic threads.
patrickawalz
Supporting Users
Posts: 8
Joined: Tue Dec 05, 2017 1:45 pm

Re: CGI tech news box

Post by patrickawalz » Wed Apr 25, 2018 1:53 pm

Out-of-core geometry and textures in LuxCore would be a huge feature.

Dade
Developer
Posts: 1044
Joined: Mon Dec 04, 2017 8:36 pm

Re: CGI tech news box

Post by Dade » Wed Apr 25, 2018 3:33 pm

patrickawalz wrote:
Wed Apr 25, 2018 1:53 pm
Out-of-core geometry and textures in LuxCore would be a huge feature.
If you have an AMD Vega 56 or 64 GPU, it is already available: you only have to enable HBCC in the driver control panel and the GPU will be able to use more RAM than is physically available on the card.
The latest NVIDIA GPUs have an even more advanced feature, but I'm not sure if/how it is enabled.

No, the general idea is that GPU RAM is just a cache of CPU RAM: the hardware swaps pages in and out on demand, without any need for specific support from the software.
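
For example, here is a minimal CUDA sketch of that idea (not LuxCore code; it assumes a Pascal-or-newer NVIDIA GPU with managed-memory page migration, and the 12 GB allocation size is just an arbitrary example): the managed allocation can be larger than the physical VRAM, and the driver pages data in and out on demand while the kernel runs.

Code:

#include <cuda_runtime.h>
#include <cstdio>

// Scale every element; pages are faulted into VRAM as the kernel touches them.
__global__ void scale(float *data, size_t n, float s) {
    size_t i = (size_t)blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= s;
}

int main() {
    // 3G floats = 12 GB, possibly more than the physical VRAM (arbitrary example size).
    const size_t n = 3ULL * 1024 * 1024 * 1024;
    float *data = nullptr;
    if (cudaMallocManaged(&data, n * sizeof(float)) != cudaSuccess) {
        printf("managed allocation failed\n");
        return 1;
    }
    for (size_t i = 0; i < n; ++i) data[i] = 1.0f;   // first touched on the CPU side

    scale<<<(unsigned)((n + 255) / 256), 256>>>(data, n, 0.5f);
    cudaDeviceSynchronize();             // the driver/hardware migrated pages on demand

    printf("data[0] = %g\n", data[0]);   // pages migrate back to the CPU on access
    cudaFree(data);
    return 0;
}

(Oversubscription support depends on the OS and driver, so treat this only as an illustration of the concept.)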

P.S. Yes, there are vendors claiming to have added out-of-core rendering to their software when they haven't changed a single line of code and are just leveraging a feature of the latest GPUs :roll:

kintuX
Posts: 160
Joined: Wed Jan 10, 2018 2:37 am

Re: CGI tech news box

Post by kintuX » Fri May 04, 2018 8:57 pm

NVidia AI Denoiser
Over the holidays, to keep myself busy, I put together a simple command-line implementation of NVIDIA's new AI denoiser. Here are some examples of what it can do:

Original image: [image]

Denoised image: [image]

The code can be found here:

https://github.com/DeclanRussell/NvidiaAIDenoiser

I have created a Windows distribution as well, for those who wish to try it out, here:

Denoiser Windows v1.0.0
source: https://declanrussell.com/portfolio/nvidia-ai-denoiser/

Dade
Developer
Posts: 1044
Joined: Mon Dec 04, 2017 8:36 pm

Re: CGI tech news box

Post by Dade » Sat May 05, 2018 2:23 pm

kintuX wrote:
Fri May 04, 2018 8:57 pm
NVidia AI Denoiser
Sincerely, ours works a lot better :mrgreen:

cornel.jpg

kintuX
Posts: 160
Joined: Wed Jan 10, 2018 2:37 am

Re: CGI tech news box

Post by kintuX » Sat May 05, 2018 6:48 pm

Dade wrote:
Sat May 05, 2018 2:23 pm
kintuX wrote:
Fri May 04, 2018 8:57 pm
NVidia AI Denoiser
Sincerely, ours works a lot better :mrgreen:
Yup, I've been testing BCD integrated in AppleSeed and can't wait to use it with Lux. Also, their Light Tree (many-light rendering, similar to LC pt. 1 if I'm not mistaken) is amazing. Those two additions on GPU/OCL, and with BiDir, will make for another great revolution in LuxCore... just marvelous :mrgreen:

PS
My intention in posting the link to NVIDIA's AI denoiser was quite the opposite... to let users/artists see its dumbness and inefficiency. Developers, and humans in general, are much smarter than the 'HeadMasters' want us to believe... not to mention that the creation here (LuxCore) is the work of a genius. ;)
I'm really thankful to be a part of it.


Sharlybg
Supporting Users
Posts: 507
Joined: Mon Dec 04, 2017 10:11 pm
Location: Ivory Coast

Re: CGI tech news box

Post by Sharlybg » Tue May 08, 2018 10:06 am

Seems that the second link is not correct :?:

Dade
Developer
Posts: 1044
Joined: Mon Dec 04, 2017 8:36 pm

Re: CGI tech news box

Post by Dade » Tue May 08, 2018 10:07 am

Sharlybg wrote:
Tue May 08, 2018 10:06 am
Seems that the second link is not correct :?:
Fixed.

patrickawalz
Supporting Users
Posts: 8
Joined: Tue Dec 05, 2017 1:45 pm

Re: CGI tech news box

Post by patrickawalz » Tue May 08, 2018 7:47 pm

I actually mentioned this in the features wishlist thread, along with the improvements to glossy surfaces and the GGX microfacet distribution. The implementation is even included on the project website (for Mitsuba).
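
For readers who haven't run into it, the GGX (Trowbridge-Reitz) microfacet normal distribution mentioned above is just the short formula sketched below. This is a standalone illustration, not LuxCore or Mitsuba code; alpha stands for the usual roughness parameter.

Code:

#include <cstdio>

// GGX / Trowbridge-Reitz normal distribution function:
// D(h) = alpha^2 / (pi * ((N.H)^2 * (alpha^2 - 1) + 1)^2), valid for N.H > 0.
float ggxD(float NdotH, float alpha) {
    const float kPi = 3.14159265358979323846f;
    const float a2 = alpha * alpha;
    const float d = NdotH * NdotH * (a2 - 1.0f) + 1.0f;
    return a2 / (kPi * d * d);
}

int main() {
    // Example: a fairly rough surface (alpha = 0.3) with the half vector about 20 degrees off the normal.
    printf("D = %f\n", ggxD(0.94f, 0.3f));
    return 0;
}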
