Luxcore and memory

B.Y.O.B.
Developer
Posts: 4146
Joined: Mon Dec 04, 2017 10:08 pm
Location: Germany
Contact:

Re: Luxcore and memory

Post by B.Y.O.B. »

I just tested on my Linux machine, this time with "progressive refine" in Cycles (render the whole image at once).
RAM usage of the Blender process:

Cycles with progressive refine
After loading: 138 MiB
During render: 663 MiB
After render: 380 MiB

LuxCore
After loading: 137 MiB
During render: 3.4 GiB
After render: 365 MiB

Note: Using progressive refine absolutely destroys Cycles' performance; my CPU cores are almost idle, with only short spikes of usage.
LuxCore uses all cores at 100%. So I guess part of the RAM usage comes from duplicating the film across threads for better performance?
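To make that guess concrete, here is a minimal back-of-the-envelope sketch of how per-thread film copies would scale with thread count; the resolution, channel count and thread count below are illustrative assumptions, not LuxCore's actual Film layout:

    // Back-of-the-envelope estimate; all numbers are assumptions, not LuxCore's real Film layout.
    #include <cstdio>

    int main() {
        const unsigned width = 1920, height = 1080;
        const unsigned threads = 16;        // CPU render threads (assumed)
        const unsigned floatsPerPixel = 16; // RGB + alpha + a few AOVs, stored as floats (assumed)
        const double oneFilm = double(width) * height * floatsPerPixel * sizeof(float);
        std::printf("one film copy: %.1f MiB, %u copies: %.1f MiB\n",
                    oneFilm / (1024 * 1024), threads, threads * oneFilm / (1024 * 1024));
        return 0;
    }

With those assumed numbers that is roughly 127 MiB per copy and about 2 GiB for 16 threads, which is at least in the same ballpark as the 3.4 GiB peak above once geometry, textures and additional AOVs are added on top.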
lacilaci
Donor
Posts: 1969
Joined: Fri May 04, 2018 5:16 am

Re: Luxcore and memory

Post by lacilaci »

B.Y.O.B. wrote: Thu Aug 09, 2018 1:09 pm
I just tested on my Linux machine, this time with "progressive refine" in Cycles (render the whole image at once).
RAM usage of the Blender process:

Cycles with progressive refine
After loading: 138 MiB
During render: 663 MiB
After render: 380 MiB

LuxCore
After loading: 137 MiB
During render: 3.4 GiB
After render: 365 MiB

Note: Using progressive refine absolutely destroys Cycles' performance; my CPU cores are almost idle, with only short spikes of usage.
LuxCore uses all cores at 100%. So I guess part of the RAM usage comes from duplicating the film across threads for better performance?
Yeah, progressive is slow and almost unusable at high resolutions in Cycles. But that's still irrelevant:
you switch to tiles and get good performance and low RAM usage.
In LuxCore you can't avoid the (in your case) 5x higher memory usage. That means a relatively small personal project may not fit into a 64 GB workstation.

Like I said, in general LuxCore's performance seems pretty OK, and I bet it will get even better in the future.
The problem is that I'd be worried about starting a project relying on LuxCore, because before the scene is even complete I might not have enough RAM.

Since I wasn't really using LuxCore before this alpha, I don't know whether this has always been the case or whether it's something expected due to the alpha state of the software.
Luxart
Posts: 49
Joined: Tue Feb 13, 2018 10:14 am

Re: Luxcore and memory

Post by Luxart »

B.Y.O.B. wrote: Thu Aug 09, 2018 1:09 pm
LuxCore uses all cores at 100%. So I guess part of the RAM usage comes from duplicating the film across threads for better performance?
Yes, currently a separate film is allocated for each CPU thread.
So as the film size or thread count increases (current and future AMD CPUs have 32 and 64 threads), the memory usage increases. :(

I think the separate film for each CPU thread is there to avoid atomic operations, as they are slow.
But the GPU uses a single film (at least one per GPU device) and does not use atomic operations to store the values in the film (the denoiser still uses atomic ops).

Also, Dade mentioned in some forum topic that LuxRender is designed so that two or more threads never operate on the same pixel simultaneously (avoiding the need for atomic operations).
So why can't we have a single film for the CPU, as we now have on the GPU? :?:

Especially with the denoiser, a single film would greatly reduce memory usage. :)

I think Dade can explain whether per-thread films are really needed (or whether there are other performance issues).
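For illustration, here is a minimal sketch (not LuxCore's actual Film code) of the two strategies described above: per-thread buffers merged at the end, versus one shared buffer updated with atomic compare-and-swap:

    // Sketch only; NOT LuxCore's Film implementation.
    #include <atomic>
    #include <cstddef>
    #include <vector>

    // Strategy A: one private buffer per thread, merged once at the end.
    // No synchronization during rendering, but memory scales with the thread count.
    struct PerThreadFilm {
        std::vector<std::vector<float>> buffers; // one buffer per thread

        PerThreadFilm(std::size_t threads, std::size_t pixels)
            : buffers(threads, std::vector<float>(pixels, 0.f)) {}

        void AddSample(std::size_t thread, std::size_t pixel, float v) {
            buffers[thread][pixel] += v; // plain write, no atomics
        }

        std::vector<float> Merge() const {
            std::vector<float> out(buffers[0].size(), 0.f);
            for (const auto &b : buffers)
                for (std::size_t i = 0; i < b.size(); ++i)
                    out[i] += b[i];
            return out;
        }
    };

    // Strategy B: a single shared buffer, every sample is an atomic update.
    // Only one copy in memory, but threads writing to the same pixel contend.
    struct SharedFilm {
        std::vector<std::atomic<float>> buffer;

        explicit SharedFilm(std::size_t pixels) : buffer(pixels) {}

        void AddSample(std::size_t pixel, float v) {
            // atomic<float>::fetch_add needs C++20; otherwise use a CAS loop:
            float old = buffer[pixel].load(std::memory_order_relaxed);
            while (!buffer[pixel].compare_exchange_weak(old, old + v,
                                                        std::memory_order_relaxed)) {}
        }
    };

Strategy A trades memory for speed; the single-film approach mentioned above for the GPU avoids even the atomics by making sure no two work items touch the same pixel at the same time.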
lacilaci
Donor
Posts: 1969
Joined: Fri May 04, 2018 5:16 am

Re: Luxcore and memory

Post by lacilaci »

Luxart wrote: Fri Aug 10, 2018 6:24 am
Yes, currently a separate film is allocated for each CPU thread.
So as the film size or thread count increases (current and future AMD CPUs have 32 and 64 threads), the memory usage increases. :(
Are you saying that LuxCore's memory consumption will grow as CPU core counts increase??? :D

I really hope that this can be resolved.

This issue makes it close to impossible for 80% of commercial archviz work and for almost anything that has to be rendered for high-quality print.

I wouldn't mind some performance loss in exchange for good memory management and maybe out-of-core functionality, especially if performance in difficult scenes can be regained using indirect caching and whatever other techniques there are.
Luxart
Posts: 49
Joined: Tue Feb 13, 2018 10:14 am

Re: Luxcore and memory

Post by Luxart »

lacilaci wrote: Fri Aug 10, 2018 6:37 am
Are you saying that LuxCore's memory consumption will grow as CPU core counts increase??? :D
Yes. Not with 8 cores, but with 32 or 64 cores, combined with large film sizes and the denoiser, it is definitely going to be a memory problem. ;)
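As a rough, purely hypothetical illustration (the bytes-per-pixel figure is an assumption, not LuxCore's actual film layout): if one film copy with a handful of AOVs needs about 100 bytes per pixel, a 3840x2160 film is roughly 0.8 GiB per copy, so 32 per-thread copies come to about 25 GiB and 64 copies to about 50 GiB, before any geometry or textures are counted.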
FarbigeWelt
Donor
Posts: 1046
Joined: Sun Jul 01, 2018 12:07 pm
Location: Switzerland
Contact:

Re: Luxcore and memory

Post by FarbigeWelt »

What about tile path rendering? Does it also consume a lot of memory?
Light and Word designing Creator - www.farbigewelt.ch - aka quantenkristall || #luxcorerender
MacBook Air with M1
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: Luxcore and memory

Post by Dade »

FarbigeWelt wrote: Fri Aug 10, 2018 11:45 am
What about tile path rendering? Does it also consume a lot of memory?
No, it shouldn't: it uses just one single Film.
Support LuxCoreRender project with salts and bounties
lacilaci
Donor
Posts: 1969
Joined: Fri May 04, 2018 5:16 am

Re: Luxcore and memory

Post by lacilaci »

Dade wrote: Fri Aug 10, 2018 3:32 pm
FarbigeWelt wrote: Fri Aug 10, 2018 11:45 am
What about tile path rendering? Does it also consume a lot of memory?
No, it shouldn't: it uses just one single Film.
Sadly I can't test right now, but I'm pretty sure the memory consumption was more or less the same.
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: Luxcore and memory

Post by Dade »

lacilaci wrote: Fri Aug 10, 2018 3:37 pm
Sadly I can't test right now, but I'm pretty sure the memory consumption was more or less the same.
It cannot be; or rather, if it is, the problem is not in the Film (aka frame buffer) memory usage. Try PATH vs. TILEPATH and 256x256 vs. 2048x2048 to verify whether the problem arises from Film memory usage or not.
Support LuxCoreRender project with salts and bounties
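For anyone who wants to reproduce that comparison, a minimal sketch of the relevant render config properties could look like this (the engine names assume the CPU variants; adjust to your setup):

    renderengine.type = PATHCPU
    film.width = 256
    film.height = 256

Then repeat with renderengine.type = TILEPATHCPU and with film.width/film.height set to 2048, and compare the peak RAM of the Blender/LuxCore process across the four runs.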
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: Luxcore and memory

Post by Dade »

Side note: you can reduce Film memory usage by disabling the OpenCL image pipeline (property film.opencl.enable). However, a complex image pipeline will run a lot slower when using the CPU.
Support LuxCoreRender project with salts and bounties
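In a render config file that would be something along these lines (a sketch, assuming the property takes the usual 0/1 boolean value):

    film.opencl.enable = 0

With it disabled, the image pipeline (tone mapping and so on) runs on the CPU, which is the slowdown mentioned above for complex pipelines.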