I'm always hitting the GPU's memory limit with my renders (8 GB GTX 1070).
I also work with Octane, but I cannot switch to LuxRender at work because I always run out of memory in the middle of a project.
I've made some comparisons (also with CPU).
My first test uses meshes only:
Code:
GPU renderers (MB)       Idle      Render     Usage (Render - Idle)
Cycles 2.79               860       2636       1776
Cycles 2.80              1286       2780       1494
LuxCore                   789       2147       1358
Octane                    880       1444        564

CPU renderers (MB)       Idle      Render     Usage
LuxCore                   479.7     1421.1      941.4
Appleseed 1.01 beta2      579.5     1490.5      911 *
Cycles 2.80               523       1078.6      555.6
Cycles 2.79               731.8     1051.9      320.1

* Value taken at peak, because Appleseed keeps increasing memory until all tiles have been allocated.
As you can see, LuxCore uses more than twice as much GPU memory as Octane. On a big scene that is a big problem.
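For reference, here is one way the Idle/Render/Usage columns can be captured on an NVIDIA card: poll nvidia-smi while the render runs and keep the peak. This is just a minimal sketch under that assumption (the script and its names are illustrative, not the exact tool used for the table above):

Code:
# Sample GPU memory with nvidia-smi and report idle, render peak,
# and their difference (the "Usage" column in the table above).
import subprocess
import time

def gpu_mem_used_mb():
    """Current GPU memory usage in MB, as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])  # first GPU only

def measure(render_seconds, interval=0.5):
    idle = gpu_mem_used_mb()
    print("Idle: %d MB -- start the render now..." % idle)
    peak = idle
    deadline = time.time() + render_seconds
    while time.time() < deadline:
        peak = max(peak, gpu_mem_used_mb())
        time.sleep(interval)
    print("Render: %d MB, Usage: %d MB" % (peak, peak - idle))

if __name__ == "__main__":
    measure(render_seconds=60.0)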
These are the test scenes:
Luxcore:
https://www.dropbox.com/s/eiixmn31fti19 ... blend?dl=1
Appleseed:
https://www.dropbox.com/s/3kygy8hcbpthw ... blend?dl=1
Cycles 2.79:
https://www.dropbox.com/s/zsap2a1xq0jnh ... blend?dl=1
LightWave 2018.0.7 with Octane:
https://www.dropbox.com/s/n8bda1xm3338r ... e.zip?dl=1
The model is from the Stanford 3D Scanning Repository, of course.
Best regards