Optix/RTX support

Discussion related to the LuxCore functionality, implementations and API.
acasta69
Developer
Posts: 472
Joined: Tue Jan 09, 2018 3:45 pm
Location: Italy

Re: Optix/RTX support

Post by acasta69 »

It must be as you said: after deleting and restarting everything, Optix is always found.
However, I'm still left with the first problem I described, i.e. a crash before the render starts.
I'll be away from the GTX 970 for some time; I'll give more info when I can test again.
Support LuxCoreRender project with salts and bounties

Windows 10 64 bits, i7-4770 3.4 GHz, RAM 16 GB, GTX 970 4GB v445.87
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: Optix/RTX support

Post by Dade »

acasta69 wrote: Wed Aug 05, 2020 10:01 am It must be as you said: after deleting and restarting everything, Optix is always found.
However, I'm still left with the first problem I described, i.e. a crash before the render starts.
I moved my old GTX 980 to the new PC and the scene is not crashing on Linux, but it is rendering nothing at all. I'm checking whether it is a problem on our side, because Optix is very slow in general on the GTX 980 and should probably not be used.
Support LuxCoreRender project with salts and bounties
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: Optix/RTX support

Post by Dade »

Dade wrote: Wed Aug 05, 2020 11:17 am
acasta69 wrote: Wed Aug 05, 2020 10:01 am It must be as you said: after deleting and restarting everything, Optix is always found.
However, I'm still left with the first problem I described, i.e. a crash before the render starts.
I moved my old GTX 980 to the new PC and the scene is not crashing on Linux, but it is rendering nothing at all. I'm checking whether it is a problem on our side, because Optix is very slow in general on the GTX 980 and should probably not be used.
If I disable Optix (context.cuda.optix.enable=0), the scene works fine on the GTX 980. This smells like an Optix problem on RTX-less GPUs.
Support LuxCoreRender project with salts and bounties
acasta69
Developer
Posts: 472
Joined: Tue Jan 09, 2018 3:45 pm
Location: Italy

Re: Optix/RTX support

Post by acasta69 »

Dade wrote: Wed Aug 05, 2020 11:51 am If I disable Optix (context.cuda.optix.enable=0), the scene works fine on the GTX 980. This smells like an Optix problem on RTX-less GPUs.
Thanks Dade, I'll do the same test as soon as possible, but it will take some days.
Support LuxCoreRender project with salts and bounties

Windows 10 64 bits, i7-4770 3.4 GHz, RAM 16 GB, GTX 970 4GB v445.87
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: Optix/RTX support

Post by Dade »

Dade wrote: Wed Aug 05, 2020 11:51 am If I disable Optix (context.cuda.optix.enable=0), the scene works fine on the GTX 980. This smells like an Optix problem on RTX-less GPUs.
I reworked the support for enabling/disabling/auto-selecting Optix/RTX.

First, you can check whether Optix is available by reading the "compile.LUXRAYS_ENABLE_OPTIX" property of luxcore::GetPlatformDesc().
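
A minimal C++ sketch of that check (assuming the usual luxcore::Init() bootstrap and the standard luxrays::Properties/Property accessors):

Code:

#include <iostream>
#include <luxcore/luxcore.h>

int main() {
	luxcore::Init();

	// Platform description, including the compile-time feature flags
	const luxrays::Properties platformDesc = luxcore::GetPlatformDesc();

	// True only if this LuxCore build was compiled with Optix support
	const bool optixAvailable =
			platformDesc.Get("compile.LUXRAYS_ENABLE_OPTIX").Get<bool>();

	std::cout << "Optix available: " << (optixAvailable ? "yes" : "no") << std::endl;

	return 0;
}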

Now, Optix/RTX will be used by default only on CUDA GPUs with compute capability 7.5 or better (RTX cards). A list of the CUDA compute capabilities of all NVIDIA GPUs can be found here: https://developer.nvidia.com/cuda-gpus

You can check a CUDA device's compute capability by reading the "opencl.device.x.cuda.compute.major" and "opencl.device.x.cuda.compute.minor" properties (7 and 5 for RTX GPUs) of luxcore::GetOpenCLDeviceDescs().
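
A sketch of how an application could walk the device list and pick out the RTX-capable GPUs (again assuming the standard Properties accessors such as GetAllUniqueSubNames() and IsDefined(); the cuda.compute.* entries are simply missing for non-CUDA devices):

Code:

#include <iostream>
#include <string>
#include <luxcore/luxcore.h>

int main() {
	luxcore::Init();

	// One "opencl.device.<x>.*" group of properties per device
	const luxrays::Properties devDescs = luxcore::GetOpenCLDeviceDescs();

	for (const std::string &prefix : devDescs.GetAllUniqueSubNames("opencl.device")) {
		const std::string name = devDescs.Get(prefix + ".name").Get<std::string>();

		// Only CUDA devices expose the compute capability properties
		if (!devDescs.IsDefined(prefix + ".cuda.compute.major")) {
			std::cout << name << ": not a CUDA device" << std::endl;
			continue;
		}

		const int major = devDescs.Get(prefix + ".cuda.compute.major").Get<int>();
		const int minor = devDescs.Get(prefix + ".cuda.compute.minor").Get<int>();

		// Compute capability 7.5 or better means an RTX card, where
		// Optix/RTX is auto-selected by default
		const bool isRTX = (major > 7) || ((major == 7) && (minor >= 5));
		std::cout << name << ": compute " << major << "." << minor
				<< (isRTX ? " (RTX)" : "") << std::endl;
	}

	return 0;
}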

You can force-enable/force-disable/auto-select Optix/RTX for each CUDA GPU device using the "cuda.optix.devices.select" property. It works like "opencl.devices.select": a "1"/"0"/"A" character force-enables/force-disables/auto-selects Optix/RTX for the corresponding device.

NOTE: "cuda.optix.devices.select" has a char only for each single CUDA device available (nothing for OpenCL devices) so, with 1xOpenCL CPU+2xOpenCL GPUs+2xCUDA GPUs I can use the following settings:

Code:

# 3 OpenCL devices + 2 CUDA devices
opencl.devices.select = 10011
# only 2 CUDA device settings
cuda.optix.devices.select = 10
Support for "context.cuda.optix.enable" has been removed.

I assume BlendLuxCore should now have an option for each (CUDA) device in the device list to force-enable/force-disable/auto-select Optix/RTX.

The default should be "auto-select"; I don't see any reason to change that except for testing.
Support LuxCoreRender project with salts and bounties
Odilkhan Yakubov
Posts: 208
Joined: Fri Jan 26, 2018 10:07 pm
Location: Tashkent, Uzbekistan

Re: Optix/RTX support

Post by Odilkhan Yakubov »

Hi. I finally installed the latest LuxCore v2.5 on Blender 2.83 and also used the Optix viewport denoiser. Is it possible to use the Optix denoiser not just in the viewport but also for final renders, like in Cycles?
___________________________________________________________________________
LuxCoreRender Developer for Blender
___________________________________________________________________________
B.Y.O.B.
Developer
Posts: 4146
Joined: Mon Dec 04, 2017 10:08 pm
Location: Germany

Re: Optix/RTX support

Post by B.Y.O.B. »

Odilkhan Yakubov wrote: Tue Aug 18, 2020 8:40 pm Is it possible to use the Optix denoiser not just in the viewport but also for final renders, like in Cycles?
Yes, but it's not yet implemented in the Blender addon.
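
At the LuxCore level a final-render denoiser goes through the film image pipeline, so a rough sketch could look like the one below. Note that the "OPTIX_DENOISER" plugin name and the plugin order are assumptions here, modelled on how the existing INTEL_OIDN denoiser plugin is configured; the addon would still have to expose this.

Code:

#include <luxcore/luxcore.h>

// Sketch only: film image pipeline with a GPU denoiser step.
// "OPTIX_DENOISER" is an assumed plugin name (see the note above).
luxrays::Properties DenoisedImagePipeline() {
	luxrays::Properties props;

	props <<
			luxrays::Property("film.imagepipeline.0.type")("NOP") <<
			luxrays::Property("film.imagepipeline.1.type")("OPTIX_DENOISER") <<
			luxrays::Property("film.imagepipeline.2.type")("TONEMAP_LINEAR") <<
			luxrays::Property("film.imagepipeline.3.type")("GAMMA_CORRECTION") <<
			luxrays::Property("film.imagepipeline.3.value")(2.2f);

	return props;
}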
Odilkhan Yakubov
Posts: 208
Joined: Fri Jan 26, 2018 10:07 pm
Location: Tashkent, Uzbekistan

Re: Optix/RTX support

Post by Odilkhan Yakubov »

B.Y.O.B. wrote: Tue Aug 18, 2020 8:41 pm
Odilkhan Yakubov wrote: Tue Aug 18, 2020 8:40 pm Is it possible to use the Optix denoiser not just in the viewport but also for final renders, like in Cycles?
Yes, but it's not yet implemented in the Blender addon.
So it is in your plans, am I right? 8-)
___________________________________________________________________________
LuxCoreRender Developer for Blender
___________________________________________________________________________