Mac OS

Discussion related to the Engine functionality, implementations and API.
robbrown
Developer
Posts: 41
Joined: Mon Sep 03, 2018 1:04 am

Re: Mac OS

Post by robbrown » Wed Oct 03, 2018 5:53 pm

FYI/Updates:

I just uploaded a macOS release of BlendLuxCore.

I've also been doing some testing: the ATI card I was using works fine for LuxCore2.1Benchmark, but the Nvidia GTX card seems to be throwing a segfault, so I'll need to dig into that. The Nvidia card works on other render files; only specific ones throw errors.
B.Y.O.B. wrote:
Tue Oct 02, 2018 7:45 am
robbrown wrote:
Tue Oct 02, 2018 1:34 am
I did find one oddity while cutting the release, the scenes in the LuxCore repo throw an error if you load them from the luxcoreui menu:
This is a known problem, see https://github.com/LuxCoreRender/LuxCore/issues/87
Thanks for the heads-up, good to know it's not a macOS-specific thing.

mick
Posts: 60
Joined: Mon May 21, 2018 7:57 pm

Re: Mac OS

Post by mick » Thu Oct 04, 2018 3:04 pm

Am I right to expect this for the 2.1 release?

What is the status of pyLuxCore?

Dade
Developer
Posts: 1465
Joined: Mon Dec 04, 2017 8:36 pm

Re: Mac OS

Post by Dade » Thu Oct 04, 2018 3:36 pm

mick wrote:
Thu Oct 04, 2018 3:04 pm
Am I right to expect this for the 2.1 release?

What is the status of pyLuxCore?
It is already all available on v2.1alpha4: viewtopic.php?f=9&t=630
Support LuxCoreRender project with salts and bounties

mick
Posts: 60
Joined: Mon May 21, 2018 7:57 pm

Re: Mac OS

Post by mick » Thu Oct 04, 2018 7:38 pm

OK, but what about pyLuxCore? Is it also available for macOS?

B.Y.O.B.
Developer
Posts: 1738
Joined: Mon Dec 04, 2017 10:08 pm
Location: Germany

Re: Mac OS

Post by B.Y.O.B. » Thu Oct 04, 2018 7:46 pm

Yes, it's included in the archive.
Attachment: 2018-10-04_21-45-41.png

robbrown
Developer
Posts: 41
Joined: Mon Sep 03, 2018 1:04 am

Re: Mac OS

Post by robbrown » Thu Oct 11, 2018 6:36 pm

B.Y.O.B. wrote:
Wed Sep 19, 2018 9:19 pm
robbrown wrote:
Wed Sep 19, 2018 8:09 pm
OpenCL mostly works but LuxCore2.1Benchmark.zip does throw an error `PathOCLBase kernel compilation error`
The standard procedure to test whether this is an OpenCL compiler bug or a bug in the LuxCore code is to run on the CPU OpenCL device only.
However, in the Blender addon it is no longer possible to select the OpenCL CPU device: now that we have native C++ plus OpenCL, it would be confusing for users to also have an OpenCL CPU device option.

You could export to text files (select checkbox "Only write LuxCore scene" in config settings).
Then you would have to delete the OpenCL related config lines in the "render.cfg" file: https://wiki.luxcorerender.org/LuxCore_ ... er_Engines
And only write:

Code: Select all

opencl.cpu.use = 1
opencl.gpu.use = 0
This way the OpenCL CPU device is used. CPU OpenCL implementations usually have the best compilers, so if the scene works fine there, you can be pretty sure the "PathOCLBase kernel compilation error" is caused by a bug in the GPU OpenCL compiler.
I've been digging into the OpenCL-with-Nvidia problem I'm having and decided to try B.Y.O.B.'s suggestion from a while ago, when I was having a different OpenCL issue. It's weird: the LuxCoreUI output suggests it's still using the GPU despite the change to the render.cfg file. I'm assuming the results below are not desirable, and that I should be digging into why the device is still being used by LuxRays/LuxCore?

Code: Select all

[LuxCore][6.308]   opencl.cpu.use = "1"
[LuxCore][6.308]   opencl.gpu.use = "0"
...
...
...
[LuxRays][7.601] Allocating intersection device 0: GeForce GTX 980 Ti (Type = OPENCL_GPU)
[LuxRays][7.601] Allocating intersection device 1: NativeThread (Type = NATIVE_THREAD)
...
[LuxRays][7.602] Allocating intersection device 16: NativeThread (Type = NATIVE_THREAD)
[LuxCore][7.602] OpenCL devices used:
[LuxCore][7.602] [GeForce GTX 980 Ti Intersect]
[LuxCore][7.602]   Device OpenCL version: OpenCL 1.2
[LuxCore][7.602] Native devices used: 16
[LuxCore][7.602] Configuring 1 OpenCL render threads
[LuxCore][7.602] Configuring 16 native render threads
...
...
...
[LuxRays][19.873] [Device GeForce GTX 980 Ti Intersect] Gamma table buffer size: 16Kbytes
[LuxCore][19.873] [GammaCorrectionPlugin] Defined symbols: -D LUXRAYS_OPENCL_KERNEL -D SLG_OPENCL_KERNEL
[LuxCore][19.873] [GammaCorrectionPlugin] Compiling kernels
[LuxCore][19.873] [GammaCorrectionPlugin] Kernels cached
[LuxCore][19.873] [GammaCorrectionPlugin] Compiling GammaCorrectionPlugin_Apply Kernel
Abort trap: 6
Based on the crash report and some internet reading, it looks like some weird combination of the Nvidia/GeForce web drivers and macOS system libs; a lot of people seem to have run into it (Rust Create, Blackmagic Fusion, Adobe Premiere, etc.). I've noticed it doesn't always fail on GammaCorrectionPlugin, but it does regularly, so I may try to write a toy OpenCL program that crashes the same way to isolate the problem, since I haven't found such an example floating around.
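
For reference, the CPU-only test quoted above boils down to a render.cfg fragment along these lines (a sketch, not a verified config; the `renderengine.type = PATHOCL` line is an assumption about which OpenCL engine is being tested, the opencl.* properties are the ones that matter):

```ini
# Render with the OpenCL path tracer, but only on the CPU OpenCL device
renderengine.type = PATHOCL
opencl.cpu.use = 1
opencl.gpu.use = 0
```

If the scene renders cleanly with this, the GPU OpenCL compiler is the prime suspect.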

B.Y.O.B.
Developer
Posts: 1738
Joined: Mon Dec 04, 2017 10:08 pm
Location: Germany

Re: Mac OS

Post by B.Y.O.B. » Thu Oct 11, 2018 6:46 pm

Ha, funny, I just implemented a secret debug panel. Currently the only option is to do exactly what I described in the quote, but with the click of one button instead of having to edit the config.
So far it's only available in the "feature/new_math_textures" branch: https://github.com/LuxCoreRender/BlendL ... 6bad6aa97d

About your issue: it looks weird indeed. Could it be that the driver incorrectly reports the GPU as a CPU device?

Dade
Developer
Posts: 1465
Joined: Mon Dec 04, 2017 8:36 pm

Re: Mac OS

Post by Dade » Thu Oct 11, 2018 6:48 pm

robbrown wrote:
Thu Oct 11, 2018 6:36 pm
I've been digging into the OpenCL-with-Nvidia problem I'm having and decided to try B.Y.O.B.'s suggestion from a while ago, when I was having a different OpenCL issue. It's weird: the LuxCoreUI output suggests it's still using the GPU despite the change to the render.cfg file. I'm assuming the results below are not desirable, and that I should be digging into why the device is still being used by LuxRays/LuxCore?
OpenCL can be used for two tasks in LuxCore:

- rendering;

- the film image pipeline.

I assume you are using a non-OpenCL rendering engine, but the film image pipeline is still running on the GPU. The two tasks are decoupled, and each can run with or without OpenCL: they are two different settings.

B.Y.O.B.
Developer
Posts: 1738
Joined: Mon Dec 04, 2017 10:08 pm
Location: Germany
Contact:

Re: Mac OS

Post by B.Y.O.B. » Thu Oct 11, 2018 6:55 pm

You can enable/disable film OpenCL with this property:

Code: Select all

film.opencl.enable = 0
Or in BlendLuxCore in the camera settings, imagepipeline panel.
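
Putting this together with Dade's point about the two decoupled tasks, a fully OpenCL-free configuration would disable both the rendering side and the film-pipeline side; a sketch (PATHCPU assumed here as the native engine name):

```ini
# Native C++ engine: no OpenCL used for rendering
renderengine.type = PATHCPU
# Keep the film image pipeline off the GPU as well
film.opencl.enable = 0
```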

robbrown
Developer
Posts: 41
Joined: Mon Sep 03, 2018 1:04 am

Re: Mac OS

Post by robbrown » Thu Oct 11, 2018 9:18 pm

Yeah, something isn't working right; I'll keep digging.

Disabling film OpenCL seems to have changed things a bit: now it fails in the denoiser, which is also trying to use the GPU instead of the CPU. Weirdly, if I deselect the GPU and leave the CPU enabled in the LuxCore Device Settings panel in Blender before writing the scene to text, LuxCoreUI dumps a "no device selected" error:

Code: Select all

RenderSession starting error:
No OpenCL device selected or available
Done.
I would have thought the CPU would be selected, since I changed the render.cfg file. Maybe I need to adjust opencl.devices.select?

Code: Select all

opencl.devices.select="01"
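
If I understand the property correctly (worth double-checking against the LuxCore wiki), opencl.devices.select is a string with one character per enumerated OpenCL device, where "1" enables and "0" disables the corresponding device. A sketch matching the log earlier in the thread, where the GTX 980 Ti shows up first:

```ini
# One character per OpenCL device, in enumeration order:
# "0" disables, "1" enables. With the GPU enumerated first and a
# CPU OpenCL device second, this would keep only the CPU device.
opencl.devices.select = "01"
```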
