How to disable optix rendering for gtx 1070ti?
Posted: Wed Feb 16, 2022 8:59 am
by sarmath
Hi. Is it possible to use only CUDA, without OptiX, for the GTX series? Since LuxCore 2.5 I have been experiencing much lower performance from my GTX 1070 Ti on my desktop and my GTX 1660 Ti on my laptop. Can someone give an example of how to modify the LuxCore config files? Thanks.
Re: How to disable optix rendering for gtx 1070ti?
Posted: Thu Feb 17, 2022 11:08 pm
by Luximage
Can you upload an example scene where you are experiencing lower performance?
I have a GTX 1050 Ti and tested a scene with BlendLuxCore versions 2.61, 2.5 and 2.4; versions 2.61 and 2.5 took about the same time, and version 2.4 was a bit slower.
Re: How to disable optix rendering for gtx 1070ti?
Posted: Fri Feb 18, 2022 7:16 pm
by sarmath
It can be any scene from the LuxCore demos, like Danish Mood. I am getting more samples per second from my CPU (Ryzen 9 3900X) than from my GTX 1070 Ti. In Cycles you can choose pure CUDA or OptiX, and when you use OptiX on a GTX-series card you get slower render times. As Dade told me some time ago, OptiX should be disabled for the GTX series, but I think there is no option for that in BlendLuxCore. So the question is: does LuxCore use OptiX by default, so that older cards like the GTX series suffer because of it? And what can we do about it?
Re: How to disable optix rendering for gtx 1070ti?
Posted: Sat Feb 19, 2022 9:47 am
by Luximage
Is your OS Manjaro Linux?
In the LuxCore 2.1 benchmark scene, using LuxCore 2.6, what are your samples/second with only the GPU used?
Re: How to disable optix rendering for gtx 1070ti?
Posted: Sat Feb 19, 2022 2:47 pm
by sarmath
Yes, I have moved to Manjaro as I had lots of crashes on Windows. Linux is much more stable... no crashes so far.
In the LuxCore 2.1 benchmark, using only the GPU I get about 2.1M samples/second,
and on CPU only, using all 24 threads, I get 2.5M samples/second.
Something is seriously wrong here.
Cycles X renders using CUDA only are much faster than on my CPU, as they should be.
Classroom scene, GPU only: 1 min 32 s
CPU only: 3 min 37 s
Re: How to disable optix rendering for gtx 1070ti?
Posted: Sun Feb 20, 2022 8:37 pm
by Luximage
Same experience here: in Cycles X, using only the GPU is faster than using my CPU, whereas in BlendLuxCore 2.61 it is the opposite, the GPU is slower. It is strange.
I am on Windows 10; my CPU is a dual Xeon X5680 and my GPU an NVIDIA GTX 1050 Ti.
Re: How to disable optix rendering for gtx 1070ti?
Posted: Mon Feb 21, 2022 11:54 am
by sarmath
It would be good if Dade could find some time to investigate why GTX cards suffer that much...

Re: How to disable optix rendering for gtx 1070ti?
Posted: Mon Feb 21, 2022 7:20 pm
by Luximage
Yes, it would be great if Dade had a look at it.
Re: How to disable optix rendering for gtx 1070ti?
Posted: Tue Feb 22, 2022 2:46 pm
by acasta69
From a quick test I just did, it should be possible to disable OptiX by adding a "cuda.optix.devices.select" property, but this can only be done in a render config file, not from BlendLuxCore.
See, for example, the last line in the following excerpt from "render.cfg":
Code:
opencl.cpu.use = 0
opencl.gpu.use = 1
opencl.devices.select = "001"
cuda.optix.devices.select = "000"
With that last line present, OptiX is not used; remove the line and OptiX will be used.
This seems to be undocumented. I tried it and it worked for me; hopefully Dade will give more info.
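To avoid editing render.cfg by hand every time, the property could also be patched in with a small script. This is just an illustrative sketch, not part of any official LuxCore tooling: the helper name and the assumption of one "0"/"1" flag per device in the selection mask are mine, based only on the "000" example above.

```python
def disable_optix(cfg_text: str, num_devices: int = 3) -> str:
    """Return cfg_text with cuda.optix.devices.select forced to all zeros.

    Any existing cuda.optix.devices.select line is dropped first, then a
    line disabling OptiX for every device ("0" per device) is appended.
    """
    key = "cuda.optix.devices.select"
    mask = "0" * num_devices  # "0" = OptiX off for that device
    lines = [l for l in cfg_text.splitlines()
             if not l.strip().startswith(key)]
    lines.append(f'{key} = "{mask}"')
    return "\n".join(lines) + "\n"


# Example: the excerpt from acasta69's post, without the OptiX line.
cfg = (
    'opencl.cpu.use = 0\n'
    'opencl.gpu.use = 1\n'
    'opencl.devices.select = "001"\n'
)
print(disable_optix(cfg))
```

Running this prints the original three lines followed by `cuda.optix.devices.select = "000"`; the result can be saved back as render.cfg for the standalone LuxCore.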
Re: How to disable optix rendering for gtx 1070ti?
Posted: Tue Feb 22, 2022 9:17 pm
by sarmath
I have downloaded the LuxCore standalone, and with the LuxCore benchmark scene I get 4.6M samples/second, with or without the line switching off OptiX.
That is 2M samples/second more than in BlendLuxCore; basically my card is twice as fast with LuxCoreUI alone.
So I think something must be wrong with BlendLuxCore.
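For scale, here is the arithmetic behind the "twice as fast" claim, using only the figures already quoted in this thread ("M" = million samples/second):

```python
# Figures reported in the posts above, in Msamples/s.
standalone_gpu = 4.6   # LuxCoreUI standalone, GTX 1070 Ti
blend_gpu = 2.1        # BlendLuxCore, same GPU
cpu = 2.5              # Ryzen 9 3900X, 24 threads

# Standalone GPU rendering vs. BlendLuxCore GPU rendering.
print(f"standalone vs BlendLuxCore GPU: {standalone_gpu / blend_gpu:.1f}x")
# CPU vs. the GPU path inside BlendLuxCore.
print(f"CPU vs BlendLuxCore GPU:        {cpu / blend_gpu:.2f}x")
```

So in standalone the same GPU runs about 2.2x faster than through BlendLuxCore, which is why the CPU only appears to beat the GPU inside Blender.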
