Also been doing some more testing: it looks like the ATI card I was using works fine for LuxCore2.1Benchmark, but the Nvidia GTX card seems to be throwing a segfault, so I will need to dig into that. The Nvidia card works on other rendering files; just specific ones seem to throw errors.
robbrown wrote: ↑Tue Oct 02, 2018 1:34 am
I did find one oddity while cutting the release, the scenes in the LuxCore repo throw an error if you load them from the luxcoreui menu:
robbrown wrote: ↑Wed Sep 19, 2018 8:09 pm
OpenCL mostly works but LuxCore2.1Benchmark.zip does throw an error `PathOCLBase kernel compilation error`
The standard procedure to test whether this is an OpenCL compiler bug or a bug in the LuxCore code is to run on the CPU OpenCL device only.
However, in the Blender addon it is no longer possible to select the OpenCL CPU device, because we now have native C++ plus OpenCL, and it would be confusing for users to also have an OpenCL CPU device option.
You could export to text files (select checkbox "Only write LuxCore scene" in config settings).
Then you would have to delete the OpenCL related config lines in the "render.cfg" file: https://wiki.luxcorerender.org/LuxCore_ ... er_Engines
And only write:
This way the OpenCL CPU device is used. CPU OpenCL implementations usually have the best compilers, so if the scene works fine there, you can be pretty sure that the "PathOCLBase kernel compilation error" is caused by a bug in the GPU OpenCL compiler.
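For illustration, the edited render.cfg might end up looking something like the fragment below. The property names are assumptions based on the LuxCore engine documentation, not the exact lines from the original post:

```
# Hypothetical render.cfg fragment: keep the OpenCL path engine
# but disable the GPU devices so only the OpenCL CPU device runs
renderengine.type = PATHOCL
opencl.gpu.use = 0
opencl.cpu.use = 1
```

If the scene renders cleanly with a configuration like this, the failure is most likely in the GPU vendor's OpenCL compiler rather than in the kernels themselves.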
I've been digging into the OpenCL-with-Nvidia problem I'm having and decided to try the suggestion B.Y.O.B. made a while ago, when I was having a different OpenCL issue. It's weird: the LuxCoreUI output suggests it's using the GPU despite the changed render.cfg file. I'm assuming the results below are not desirable and I should be digging into why the device is still being used in LuxRays/LuxCore?
Based on the crash report and some internet reading, it looks like some weird combination of the Nvidia/GeForce Web drivers and the macOS system libs; a lot of people seem to have run into it (Rust, BlackMagic Fusion, Adobe Premiere, etc.). I've noticed it doesn't always fail on GammaCorrectionPlugin, but it does so regularly, so I may try to make a toy OpenCL program that crashes the same way to isolate the problem, since I haven't found such an example floating around.
Ha, funny, I just implemented a secret debug panel. Currently the only option is to do exactly what I described in the quote, but with the click of one button instead of having to edit the config.
So far it's only available in the "feature/new_math_textures" branch: https://github.com/LuxCoreRender/BlendL ... 6bad6aa97d
About your issue: it does look weird. Could it be that the driver incorrectly reports the GPU as a CPU device?
robbrown wrote: ↑Thu Oct 11, 2018 6:36 pm
I've been digging into the OpenCL-with-Nvidia problem I'm having and decided to try the suggestion B.Y.O.B. made a while ago, when I was having a different OpenCL issue. It's weird: the LuxCoreUI output suggests it's using the GPU despite the changed render.cfg file. I'm assuming the results below are not desirable and I should be digging into why the device is still being used in LuxRays/LuxCore?
OpenCL can be used for 2 tasks in LuxCore:
- rendering;
- film image pipeline;
I assume you are using a non-OpenCL rendering engine, but the film image pipeline is still running on the GPU. The 2 tasks are decoupled, and you can run each with or without OpenCL: they are 2 different settings.
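A sketch of how those two settings might look in render.cfg (the `film.opencl.enable` property name is an assumption based on the LuxCore property documentation; treat this as illustrative):

```
# Sketch: the two OpenCL uses are controlled independently.
renderengine.type = PATHCPU   # rendering runs on the native C++ CPU engine
film.opencl.enable = 0        # film image pipeline also stays off the GPU
```

With only the first line set, rendering would run on the CPU while the image pipeline could still pick up a GPU device, which would match the behavior described above.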
Yeah, something isn't working right; I'll keep digging.
Disabling the film OpenCL seems to have changed the settings a bit, and now it's failing in the Denoiser, which is also trying to use the GPU instead of the CPU. Weirdly, if I deselect the GPU and leave the CPU enabled in the LuxCore Device Settings panel of Blender before writing the scene to text, LuxCoreUI dumps a "no device selected" error: