Roughness value and rendering speed

B.Y.O.B.
Developer
Posts: 4146
Joined: Mon Dec 04, 2017 10:08 pm
Location: Germany

Re: Roughness value and rendering speed

Post by B.Y.O.B. »

Dade wrote: Wed Aug 12, 2020 12:58 pm It starts to be pretty clear that, while Optix is supposed to work on old GTX GPUs, it is both slower than my BVH code and has problems. We will change the default behavior as soon as B.Y.O.B. is back in town and Optix will be used only on RTX GPUs.
I'm back. Is there a way in the LuxCore API to check for RTX GPUs, or should I look how to detect this from Python?
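In the absence of a dedicated API call, one possible workaround is to inspect the device name string that the driver reports. This is only a rough sketch under that assumption; `is_rtx_gpu` is a hypothetical helper, not part of the LuxCore API:

```python
def is_rtx_gpu(device_name: str) -> bool:
    """Heuristic RTX detection: NVIDIA RTX cards carry 'RTX' in their
    product name (GeForce RTX, Quadro RTX), while older GTX cards do not."""
    return "RTX" in device_name.upper()
```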
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: Roughness value and rendering speed

Post by Dade »

B.Y.O.B. wrote: Thu Aug 13, 2020 12:14 pm
Dade wrote: Wed Aug 12, 2020 12:58 pm It starts to be pretty clear that, while Optix is supposed to work on old GTX GPUs, it is both slower than my BVH code and has problems. We will change the default behavior as soon as B.Y.O.B. is back in town and Optix will be used only on RTX GPUs.
I'm back. Is there a way in the LuxCore API to check for RTX GPUs, or should I look how to detect this from Python?
My idea is to:

1) separate Optix/RTX from the Optix denoiser. The second should always be available whenever Optix is available. The first is different (see below).

2) "context.cuda.optix.enable" should now be renamed to "context.cuda.optix.rtx.enable" and have 3 settings: 1, 0 and AUTO. "AUTO" will enable RTX only on RTX GPUs.

So the idea is to change the BlendLuxCore preference setting from "Optix" to "Optix/RTX" and allow 3 settings: "Always enabled", "Always disabled" and "Automatic".

"Automatic" should be the default and should be changed pretty much only for testing.
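The proposed three-way setting could resolve to an effective on/off decision like this. A minimal sketch, assuming the property name and values from the proposal above; the helper name is hypothetical:

```python
def resolve_rtx_enable(setting: str, gpu_is_rtx: bool) -> bool:
    """Map the proposed context.cuda.optix.rtx.enable values
    ("1", "0", "AUTO") to whether RTX should actually be used."""
    if setting == "1":   # "Always enabled"
        return True
    if setting == "0":   # "Always disabled"
        return False
    # "AUTO" / "Automatic": enable RTX only on RTX GPUs
    return gpu_is_rtx
```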
Support LuxCoreRender project with salts and bounties
CodeHD
Donor
Posts: 437
Joined: Tue Dec 11, 2018 12:38 pm
Location: Germany

Re: Roughness value and rendering speed

Post by CodeHD »

Btw, on the Optix denoiser:

I'm currently on my Laptop with an old GTX 765M that doesn't support any Optix.
With the Optix denoiser enabled (current default settings), it immediately cancels viewport renders because it's not available.

Maybe the viewport denoiser selection can be included in the detection. That keeps it more user-friendly for people switching or upgrading their BlendLuxCore version.
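Such a detection could simply fall back to another denoiser (or none) when Optix is missing instead of cancelling the render. A hypothetical sketch, not BlendLuxCore's actual code:

```python
def choose_viewport_denoiser(preferred: str, optix_available: bool) -> str:
    """Fall back instead of crashing when the Optix denoiser is
    selected but the runtime reports Optix as unavailable."""
    if preferred == "OPTIX" and not optix_available:
        return "NONE"  # or fall back to another denoiser, if one exists
    return preferred
```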
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: Roughness value and rendering speed

Post by Dade »

CodeHD wrote: Thu Aug 13, 2020 1:01 pm Btw. on Optix-Denoiser:

I'm currently on my Laptop with an old GTX 765M that doesn't support any Optix.
With the Optix denoiser enabled (current default settings), it immediately cancels viewport renders because it's not available.
This should be a bug in BlendLuxCore. Does the log tell you whether Optix is available or not?

Code: Select all

[LuxRays][0.170] OpenCL support: enabled
[LuxRays][0.177] OpenCL Platform 0: Intel(R) CPU Runtime for OpenCL(TM) Applications
[LuxRays][0.177] OpenCL Platform 1: NVIDIA CUDA
[LuxRays][0.177] CUDA support: enabled
[LuxRays][0.177] CUDA support: available
[LuxRays][0.177] CUDA driver version: 11.0
[LuxRays][0.177] CUDA device count: 2
[LuxRays][0.177] Optix support: available
CodeHD
Donor
Posts: 437
Joined: Tue Dec 11, 2018 12:38 pm
Location: Germany

Re: Roughness value and rendering speed

Post by CodeHD »

It tells me it is not:

Code: Select all

[LuxRays][8.859] OpenCL support: enabled
[LuxRays][8.859] OpenCL Platform 0: NVIDIA CUDA
[LuxRays][8.859] OpenCL Platform 1: Intel(R) OpenCL
[LuxRays][8.859] CUDA support: enabled
[LuxRays][8.859] CUDA support: available
[LuxRays][8.859] CUDA driver version: 10.10
[LuxRays][8.859] CUDA device count: 1
[LuxRays][8.859] Optix support: not available

[...]

[LuxCore][9.218] [LinearToneMap] Kernels compilation time: 0ms
Traceback (most recent call last):
  File "C:\Users\Johannes\AppData\Roaming\Blender Foundation\Blender\2.83\scripts\addons\BlendLuxCore-daily\engine\base.py", line 124, in view_draw
    viewport.view_draw(self, context, depsgraph)
  File "C:\Users\Johannes\AppData\Roaming\Blender Foundation\Blender\2.83\scripts\addons\BlendLuxCore-daily\engine\viewport.py", line 221, in view_draw
    framebuffer.update(engine.session, scene)
  File "C:\Users\Johannes\AppData\Roaming\Blender Foundation\Blender\2.83\scripts\addons\BlendLuxCore-daily\draw\viewport.py", line 229, in update
    luxcore_session.GetFilm().GetOutputFloat(self._output_type, self.buffer)
RuntimeError: OptixDenoiserPlugin used while Optix is not available
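Until the add-on queries availability directly, the same information can be scraped from the LuxRays startup output shown above. A sketch assuming only the log format visible in this thread; the helper name is hypothetical:

```python
def optix_available_from_log(log_lines):
    """Find the '[LuxRays] Optix support: ...' line and distinguish
    'available' from 'not available'."""
    for line in log_lines:
        if "Optix support:" in line:
            status = line.split("Optix support:", 1)[1].strip()
            return status == "available"
    return False  # no Optix line found: assume unavailable
```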
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: Roughness value and rendering speed

Post by Dade »

CodeHD wrote: Thu Aug 13, 2020 1:44 pm It tells me it is not:

Code: Select all

[LuxRays][8.859] OpenCL support: enabled
[LuxRays][8.859] OpenCL Platform 0: NVIDIA CUDA
[LuxRays][8.859] OpenCL Platform 1: Intel(R) OpenCL
[LuxRays][8.859] CUDA support: enabled
[LuxRays][8.859] CUDA support: available
[LuxRays][8.859] CUDA driver version: 10.10
[LuxRays][8.859] CUDA device count: 1
[LuxRays][8.859] Optix support: not available

Ok, so it's a little bug in BlendLuxCore: it offers the Optix denoiser even when Optix is not available.
B.Y.O.B.
Developer
Posts: 4146
Joined: Mon Dec 04, 2017 10:08 pm
Location: Germany

Re: Roughness value and rendering speed

Post by B.Y.O.B. »

I forgot to filter out Intel devices in the film device dropdown; this should fix the issue.

Maybe it would be good if GetOpenCLDeviceDescs() included information on whether a device is Optix-capable.
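If the device descriptions carried such a flag, the dropdown could filter on it directly. A hypothetical sketch: the "optix.capable" key does not exist in the current API, it only illustrates the suggestion, and the dict-based device description is an assumption:

```python
def film_device_choices(device_descs):
    """Keep only devices usable for Optix film operations: skip Intel
    devices and anything not flagged as Optix-capable."""
    return [d["name"] for d in device_descs
            if d.get("optix.capable", False)
            and "Intel" not in d["name"]]
```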