GPU render experience


Post by chafouin »

Hi there,

I've always been a LuxCore supporter, but haven't had the chance to use it in a project yet. I'd like to give it another try this time, especially because of some features that I want to use and that are missing in Cycles, namely bokeh textures and proper caustics. I'd also make use of light groups, but there are experimental Cycles builds with those, so it's not a deal breaker.

However, my experience with LuxCore GPU rendering has always been terrible. I know about the trick of lowering the viewport resolution; it helps a lot without really fixing the interactivity issues.
Worse, as soon as I start a final render (1920x1080), the Blender UI slows down, locks up, and freezes; I can't stop the render, I can't do anything on my machine... CPU rendering has been much better, but it's a shame to miss out on the GPU speed boost. Is that everyone's experience, or just mine? Can you think of anything that could be causing this?
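
For context, the resolution trick just means rendering the preview film smaller than the final output, so every pass finishes (and the film refreshes) sooner. A minimal sketch of the idea at the pyluxcore level, assuming you drive LuxCore through its Python API; the 50% factor is arbitrary:

Code: Select all

import pyluxcore

pyluxcore.Init()

# Final output resolution
width, height = 1920, 1080

# Render the interactive preview at a fraction of the final size so
# each pass completes (and the film refreshes) much faster.
preview_scale = 0.5  # lower = more responsive, but blurrier preview

props = pyluxcore.Properties()
props.Set(pyluxcore.Property("renderengine.type", "PATHOCL"))
props.Set(pyluxcore.Property("film.width", int(width * preview_scale)))
props.Set(pyluxcore.Property("film.height", int(height * preview_scale)))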

The scene is only 600k triangles for now and uses about 3 GB of VRAM, but I expect it to reach 4 million triangles pretty quickly. I have a Titan X Maxwell with 12 GB of VRAM, quite old but still pretty robust, on the latest NVIDIA drivers (456.71). I am using a recent 2.5 build (October 17th) with Blender 2.90.1. The CPU is a Ryzen 9 3900X, and the OS is Windows 10.

Another little question: my project will be a low-key (i.e. dark) interior with a very dim environment HDRI visible through windows, lit by animated emissive shapes (changing shape and color). Does it make sense to use caches, or will they only add potential flickering?

Thanks!

Re: GPU render experience

Post by DionXein »

Hello.

First of all, Lux 2.5 still doesn't officially support Blender 2.90 (I always need to recompile the CUDA kernels when using the GPU).

I think using the GPU for the viewport is still awkward. In previous topics the best solution found was to use the CPU and increase the pixel block size.
To make the Blender UI more responsive with CPU rendering, set the thread count manually (for example, I have a Xeon 2680 with 8 cores / 16 threads, so I set it to use 15 threads); see the sketch below.
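
A sketch of what I mean, using Blender's standard thread settings from Python (whether BlendLuxCore picks these up, or whether you need LuxCore's own native.threads.count property instead, is an assumption that may depend on the add-on version):

Code: Select all

import bpy

scene = bpy.context.scene

# Use a fixed thread count instead of auto-detection, leaving one
# hardware thread free so the Blender UI stays responsive.
scene.render.threads_mode = 'FIXED'
scene.render.threads = 15  # e.g. 16 hardware threads minus 1

# Rough equivalent in a raw LuxCore render config:
#   native.threads.count = 15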

About caches, B.Y.O.B, Dade, or Sharlybg can give you a better answer. (I also think there is no need for an HDRI: just use a flat color, and place images where the background would be seen through the windows to imitate one; a rough sketch follows.)
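
What I mean, sketched with pyluxcore scene properties (the light name "env" and the color values are placeholders; property names are from the LuxCore scene-file docs):

Code: Select all

import pyluxcore

pyluxcore.Init()

scene_props = pyluxcore.Properties()

# Instead of an "infinite" light with an HDR image file, use a
# constant-colored environment: cheaper to sample and noise-free.
scene_props.Set(pyluxcore.Property("scene.lights.env.type", "constantinfinite"))
scene_props.Set(pyluxcore.Property("scene.lights.env.color", [0.25, 0.28, 0.35]))
scene_props.Set(pyluxcore.Property("scene.lights.env.gain", [0.1, 0.1, 0.1]))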

Re: GPU render experience

Post by chafouin »

Thanks for your help!
DionXein wrote: Sat Oct 24, 2020 6:17 am First of all, Lux 2.5 still doesn't officially support Blender 2.90 (I always need to recompile the CUDA kernels when using the GPU).
I don't have that issue personally, but thanks for the reminder!
DionXein wrote: Sat Oct 24, 2020 6:17 am I think using the GPU for the viewport is still awkward. In previous topics the best solution found was to use the CPU and increase the pixel block size.
To make the Blender UI more responsive with CPU rendering, set the thread count manually (for example, I have a Xeon 2680 with 8 cores / 16 threads, so I set it to use 15 threads).
I kinda gave up on GPU viewport rendering; the CPU is good enough for me. It's just that regular renders really prevent me from doing anything, and it's hard to find the right halt condition as well. It's a lot of trial and error that wastes a lot of my time.
DionXein wrote: Sat Oct 24, 2020 6:17 am About caches, B.Y.O.B, Dade, or Sharlybg can give you a better answer. (I also think there is no need for an HDRI: just use a flat color, and place images where the background would be seen through the windows to imitate one.)
I think I know the answer: with animated lights and animated geometry, they won't do any good. But I'll see what they say, maybe they have a magic trick :)

Re: GPU render experience

Post by DionXein »

Also, I made a simple scene. It contains about 15M points, and I rendered two variants, with and without caches, using an RTX 2080 + CPU.
Attachments: Screenshot_5.jpg, Screenshot_6.jpg

Re: GPU render experience

Post by johannes.wilde »

chafouin wrote: Sat Oct 24, 2020 4:52 am However, my experience with LuxCore GPU rendering has always been terrible. I know about the trick of lowering the viewport resolution; it helps a lot without really fixing the interactivity issues.
Hey,
I was thinking about the same topic yesterday, and LuxCore really has gotten better at this lately. But still, if you compare it to Octane especially, or to Cycles, it feels way less smooth. I think the biggest drawback is connecting nodes in the shader editor while the viewport render (GPU) is running. This is something Octane handles very smoothly for some reason, even a lot better than Cycles.

Re: GPU render experience

Post by chafouin »

DionXein wrote: Sat Oct 24, 2020 7:08 am Also, I made a simple scene. It contains about 15M points, and I rendered two variants, with and without caches, using an RTX 2080 + CPU.
Yeah, caches definitely speed up the render, but that won't work for my project.

I tried changing the number of threads to improve the interactivity, but it doesn't make any difference; the problem is the GPU, not the CPU.

I've been testing different settings, and it seems that OpenCL is much more responsive than CUDA. It refreshes the image properly and doesn't block Blender as much. It also turns out to be the fastest (1:55, versus 2:02 for OptiX and 2:21 for CUDA).

Is the sample limit not working, though? I set the halt condition to 256 samples, but it keeps rendering up to 572. I would understand if it were waiting for the next refresh cycle, but it only stops after 2 or 3 more refreshes.
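
For reference, this is the setting I mean; at the LuxCore level it should boil down to the batch.haltspp property (a sketch, assuming the add-on maps its UI halt condition to this):

Code: Select all

import pyluxcore

config_props = pyluxcore.Properties()

# Halt after 256 samples per pixel. On the GPU each pass adds many
# samples at once, and if the condition is only checked at film
# refresh time, that would explain overshooting to ~572.
config_props.Set(pyluxcore.Property("batch.haltspp", 256))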

Re: GPU render experience

Post by chafouin »

johannes.wilde wrote: Sat Oct 24, 2020 7:25 am Hey,
I was thinking about the same topic yesterday, and LuxCore really has gotten better at this lately. But still, if you compare it to Octane especially, or to Cycles, it feels way less smooth. I think the biggest drawback is connecting nodes in the shader editor while the viewport render (GPU) is running. This is something Octane handles very smoothly for some reason, even a lot better than Cycles.
I really like Octane's output and performance, but I always found the workflow a bit more tedious than with Cycles (especially since it's hard to beat using EEVEE as a shader preview). Which is why I always used Cycles when I knew I had to deliver something on time...

It's a shame because I really like LuxCore and the amazing amount of work the devs are doing.

Re: GPU render experience

Post by DionXein »

chafouin wrote: Sat Oct 24, 2020 7:28 am I've been testing different settings, and it seems that OpenCL is much more responsive than CUDA. It refreshes the image properly and doesn't block Blender as much. It also turns out to be the fastest (1:55, versus 2:02 for OptiX and 2:21 for CUDA).
I used OptiX in this case, and it usually renders faster than CUDA and OpenCL for me.
chafouin wrote: Sat Oct 24, 2020 7:28 am Is the sample limit not working, though? I set the halt condition to 256 samples, but it keeps rendering up to 572. I would understand if it were waiting for the next refresh cycle, but it only stops after 2 or 3 more refreshes.
I think it only happens when rendering animations on the GPU, and it's obviously some kind of bug.

Re: GPU render experience

Post by chafouin »

johannes.wilde wrote: Sat Oct 24, 2020 7:25 am Hey,
I was thinking about the same topic yesterday, and LuxCore really has gotten better at this lately. But still, if you compare it to Octane especially, or to Cycles, it feels way less smooth. I think the biggest drawback is connecting nodes in the shader editor while the viewport render (GPU) is running. This is something Octane handles very smoothly for some reason, even a lot better than Cycles.
I just gave Octane another try. I wouldn't say its viewport rendering is as responsive as Cycles', at least not on my machine. However... holy moly, it's incredibly fast.
I don't get a perfectly clean image after 2 minutes of rendering with Cycles or LuxCore. The denoiser fixes that easily, but I'm afraid it could introduce flickering. Octane, meanwhile, gives me a clean image without denoising in 15 seconds, and a perfectly noiseless one in 35 seconds!

Re: GPU render experience

Post by Sharlybg »

For animations with caches, my trick is to set up and save the cache using only the static lights/objects. Afterwards, all the animated stuff gets merged with the rest; see the sketch below.
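
In property terms the trick relies on the persistent cache file. A sketch, assuming the PhotonGI cache (property names per my reading of the LuxCore docs; the file name is a placeholder):

Code: Select all

import pyluxcore

props = pyluxcore.Properties()

# Enable the PhotonGI caches and persist them to disk. Bake the file
# once from the static lights/objects only, then reuse it for every
# animation frame so the cache itself cannot flicker.
props.Set(pyluxcore.Property("path.photongi.indirect.enabled", True))
props.Set(pyluxcore.Property("path.photongi.caustic.enabled", True))
props.Set(pyluxcore.Property("path.photongi.persistent.file", "static_scene.pgi"))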

For viewport rendering I always work with the CPU. I also tend to split my workspace so that only 1/3 or 1/4 of the screen is used for the render preview while shading/texturing; that works well with the GPU too.

By the way, 2.5 has a nasty bug where Blender crashes when switching between solid view and rendered view.