
Re: GPU render experience

Posted: Sat Oct 24, 2020 8:46 am
by Sharlybg
Also, I never do asset texturing inside the targeted scene. I always have a second collection with basic lighting where I shade objects individually. That way everything gets faster, including the workflow.
Tricks from the Nvidia Fermi era :mrgreen:
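For anyone who wants to script that setup, here is a minimal sketch in Blender Python, assuming an object named "Asset" and a main collection named "MainScene" (both names are hypothetical):

Code: Select all

import bpy

scene = bpy.context.scene

# Create a separate look-dev collection with its own basic lighting,
# so assets can be shaded outside the heavy target scene.
shading = bpy.data.collections.new("Shading")
scene.collection.children.link(shading)

# Link the asset being textured into it; objects can live in several
# collections at once. (Object name "Asset" is hypothetical.)
shading.objects.link(bpy.data.objects["Asset"])

# Exclude the heavy main collection from the view layer while shading,
# so the viewport only has to render the asset and the basic lights.
# (Collection name "MainScene" is hypothetical.)
main = bpy.context.view_layer.layer_collection.children["MainScene"]
main.exclude = True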

Re: GPU render experience

Posted: Sat Oct 24, 2020 11:02 am
by Dade
DionXein wrote: Sat Oct 24, 2020 7:49 am I used OptiX in this case, and usually it renders faster than CUDA and OpenCL.
Only for RTX GPUs (where it can be a lot faster): in my tests, CUDA+OptiX is slower on old GPUs than CUDA + my BVH code. It is likely "planned obsolescence" at work (to force people to buy new RTX GPUs).

Re: GPU render experience

Posted: Sat Oct 24, 2020 12:21 pm
by johannes.wilde
DionXein wrote: Sat Oct 24, 2020 7:49 am I just gave Octane another try. I wouldn't say the viewport rendering is as reactive as Cycles, at least it isn't on my machine. However... holy moly, it's incredibly fast.
That's not what I meant. I am totally happy with BlendLuxCore's viewport performance on GPU. But I think the UI could stay more responsive and smooth when working with viewport rendering.

Re: GPU render experience

Posted: Sun Oct 25, 2020 12:39 pm
by Theo_Gottwald
I have had several discussions about the responsiveness of the LuxCore UI, especially at higher output resolutions.
I am not sure there is any hope. I guess most devs never render higher than 1920x1080 :-)

Re: GPU render experience

Posted: Sun Oct 25, 2020 4:08 pm
by DionXein
Theo_Gottwald wrote: Sun Oct 25, 2020 12:39 pm I guess most devs never render higher than 1920x1080 :-)
I'm sure they have. The main idea is to use most of the available PC resources on the image rendering itself. For calibration you should obviously use lower resolutions, skip frames in animation, etc. For the final shot, all you have to do is press one button and leave it for a while.
Simple techniques like lowering the pixel block size, setting render borders in camera view, putting models in dedicated collections, and using the CPU for the preview render view should make your workflow much smoother.
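A minimal sketch of those preview settings in Blender Python: the standard bpy properties below exist, while the BlendLuxCore viewport-device path in the last comment is an assumption.

Code: Select all

import bpy

scene = bpy.context.scene

# Calibration settings: lower resolution and skip animation frames.
scene.render.resolution_percentage = 50   # half-resolution previews
scene.frame_step = 3                      # render every 3rd frame while testing

# Restrict rendering to a region of interest (same as Ctrl+B in camera view).
scene.render.use_border = True
scene.render.border_min_x = 0.25
scene.render.border_max_x = 0.75
scene.render.border_min_y = 0.25
scene.render.border_max_y = 0.75

# Viewport preview on CPU; the exact property path below is an assumption
# about BlendLuxCore, check the add-on settings:
# scene.luxcore.viewport.device = 'CPU'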

Re: GPU render experience

Posted: Mon Oct 26, 2020 4:28 pm
by chafouin
Again, I'm not complaining about the viewport rendering, for which I use the CPU anyway, but about the final render. There should be a way to leave your PC somewhat responsive, at least enough to abort a render. I can't even do that; I have to kill Blender from the Task Manager if I use CUDA or OptiX.

Re: GPU render experience

Posted: Mon Oct 26, 2020 4:36 pm
by Theo_Gottwald
Looking at how responsive AMD ProRender is at any resolution,
I would just say "it is possible", even if the rendering would take 5% longer.
LuxRender takes very long here, longer than any other render engine.
However, the results justify it: especially caustics and interior renderings with LuxRender are amazing.
So I would give it the necessary time. But beginners will not use it if it's not responsive;
they may think "Blender has crashed". And it does feel like that.

Re: GPU render experience

Posted: Tue Oct 27, 2020 6:44 am
by DionXein
Theo_Gottwald wrote: Mon Oct 26, 2020 4:36 pm Looking at how responsive AMD ProRender is at any resolution
ProRender isn't fully unbiased; LuxCore is.

Re: GPU render experience

Posted: Tue Oct 27, 2020 8:40 am
by Theo_Gottwald
The responsiveness has to do with the general program structure.
I do not think it has to do with how an engine works internally.
On Windows we would just add a "DoEvents()" call after each internal action.
Of course this may cost a bit of time, but I doubt it would be a big issue.
Since the responsiveness decreases with the render resolution, I assume these
"DoEvents"-style calls are only made at each complete redraw of the picture.
Why not call it after a fixed number of pixels?
In that case the responsiveness would always be the same.
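A minimal sketch of that idea in Python: process_ui_events() is a hypothetical stand-in for whatever the host offers (DoEvents(), Qt's processEvents(), ...), not a LuxCore API.

Code: Select all

PIXELS_PER_YIELD = 65536  # fixed pixel budget between UI yields

def render_film(width, height, shade_pixel, process_ui_events):
    """Shade every pixel, yielding to the UI at a fixed pixel interval."""
    done = 0
    for y in range(height):
        for x in range(width):
            shade_pixel(x, y)
            done += 1
            # Yield after a fixed pixel count instead of once per complete
            # redraw, so responsiveness no longer depends on resolution.
            if done % PIXELS_PER_YIELD == 0:
                process_ui_events()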

Re: GPU render experience

Posted: Tue Oct 27, 2020 1:38 pm
by B.Y.O.B.
I have already written about the causes for hanging on high-resolution final renders here: https://github.com/LuxCoreRender/BlendL ... -667749833

Cycles and Radeon ProRender are usually used with small tiles, and they only update the pixels of one tile on each update. LuxCore, on the other hand, renders the whole film at once, and currently film updates in Blender always update the whole film at once, too.
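Schematically (plain Python, not Blender or LuxCore API), the difference between the two update strategies looks like this:

Code: Select all

def update_tile(film, x0, y0, tile_rows):
    # Cycles/ProRender style: each update copies only one small tile, so
    # the cost per update is constant no matter the film resolution.
    for dy, row in enumerate(tile_rows):
        film[y0 + dy][x0:x0 + len(row)] = row

def update_whole_film(film, pixel_rows):
    # LuxCore style: every update pushes the entire frame, so at 4K and
    # above each update has to move tens of millions of pixel values.
    for y, row in enumerate(pixel_rows):
        film[y][:] = row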

Anyway, this is getting off-topic here, as this thread is about problems with GPU viewport rendering, not about final renders at high resolutions.