How to disable optix rendering for gtx 1070ti?
Forum rules
Please upload a testscene that allows developers to reproduce the problem, and attach some images.
How to disable optix rendering for gtx 1070ti?
Hi. Is it possible to use only CUDA, without OptiX, for the GTX series? Since LuxCore 2.5 I have been experiencing much lower performance from my GTX 1070 Ti on my desktop and my GTX 1660 Ti on my laptop. Can someone give an example of how to modify the LuxCore config files? Thanks.
Windows 11 Pro, Ryzen 7 5700X, RTX 3090, 32 GB RAM
Re: How to disable optix rendering for gtx 1070ti?
Can you upload an example scene where you are experiencing lower performance?
I have a GTX 1050 Ti and tested a scene with BlendLuxCore versions 2.61, 2.5 and 2.4; versions 2.61 and 2.5 took about the same time, and version 2.4 was a bit slower.
Re: How to disable optix rendering for gtx 1070ti?
It can be any scene from the LuxCore demos, like Danish Mood. I am getting more samples per second from my CPU (Ryzen 9 3900X) than from my GTX 1070 Ti. In Cycles you can use pure CUDA or OptiX, and when you use OptiX on a GTX-series card you get slower render times. As Dade told me some time ago, OptiX should be disabled for the GTX series, but I think there is no option for it in BlendLuxCore. So the question is: does LuxCore use OptiX by default, do older cards like the GTX series suffer because of it, and what can we do about it?
Re: How to disable optix rendering for gtx 1070ti?
Is your OS Manjaro Linux?
In the LuxCore 2.1 benchmark scene, using LuxCore version 2.6, what are your samples/second with only the GPU used?
Re: How to disable optix rendering for gtx 1070ti?
Yes, I have moved to Manjaro as I had lots of crashes on Windows. Linux is much more stable... no crashes so far.
So in the LuxCore 2.1 benchmark using only the GPU I get about 2.1M samples/second.
And on CPU only, using all 24 threads, I get 2.5M samples/second.
Something is seriously wrong here.
Cycles-X renders using CUDA only are much faster than on my CPU, as they should be.
Classroom scene, GPU only: 1 min 32 sec
CPU only: 3 min 37 sec
Re: How to disable optix rendering for gtx 1070ti?
Same experience here: in Cycles X, using only the GPU is faster than using my CPU, whereas in BlendLuxCore 2.61 it is the opposite and the GPU is slower. It is strange.
I am on Windows 10; my CPU is a dual Xeon X5680 and my GPU is an NVIDIA 1050 Ti.
Re: How to disable optix rendering for gtx 1070ti?
It would be good if Dade could find some time to investigate why GTX cards suffer that much...
Re: How to disable optix rendering for gtx 1070ti?
Yes, it would be great if Dade had a look at it.
Re: How to disable optix rendering for gtx 1070ti?
From a quick test I just did, it should be possible to disable OptiX by adding a "cuda.optix.devices.select" property, but this can only be done in a render config file, not from BlendLuxCore.
See for example the last line in the following excerpt from "render.cfg":
Code: Select all
opencl.cpu.use = 0
opencl.gpu.use = 1
opencl.devices.select = "001"
cuda.optix.devices.select = "000"
With that last line, OptiX is not used. Remove the line and OptiX will be used.
It seems this is undocumented; I tried it and it worked for me. Hopefully Dade will give more info.
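Since BlendLuxCore offers no switch for this, a small script can toggle the line in an existing render.cfg. This is just a sketch of my own (the helper name and the enable/disable convention of one "0" per device are my assumptions based on the excerpt above, not a documented LuxCore API):

```python
# Toggle the apparently undocumented "cuda.optix.devices.select"
# property in a render.cfg-style properties file.
# A string of '0's (one per device) disables OptiX for every device.

def set_optix_selection(cfg_text, num_devices, enable):
    """Return cfg_text with the OptiX device-selection line updated.

    enable=False appends a selection string of '0's (OptiX off);
    enable=True removes the line so LuxCore uses its default behaviour.
    """
    key = "cuda.optix.devices.select"
    # Drop any existing selection line first, then re-add if disabling.
    lines = [l for l in cfg_text.splitlines()
             if not l.strip().startswith(key)]
    if not enable:
        lines.append('%s = "%s"' % (key, "0" * num_devices))
    return "\n".join(lines) + "\n"

cfg = ('opencl.cpu.use = 0\n'
       'opencl.gpu.use = 1\n'
       'opencl.devices.select = "001"\n')
print(set_optix_selection(cfg, 3, enable=False))
```

Run it on your render.cfg contents before launching a render from the command line; rendering from BlendLuxCore will not pick this up, as the property can only be set in the config file.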
Re: How to disable optix rendering for gtx 1070ti?
I have downloaded the LuxCore standalone, and with the LuxCore benchmark scene I get 4.6M samples/second, with or without adding the line that switches off OptiX.
That's 2M samples/second more than in BlendLuxCore. Basically, my card is twice as fast with LuxCoreUI alone.
So it must be something wrong with BlendLuxCore, I think.