GPU render experience

Sharlybg
Donor
Posts: 3101
Joined: Mon Dec 04, 2017 10:11 pm
Location: Ivory Coast

Re: GPU render experience

Post by Sharlybg »

Also, I never do asset texturing inside the targeted scene. I always have a second collection with basic lighting where I shade objects individually. That way everything gets faster, even the workflow.
Tricks from the Nvidia Fermi era :mrgreen:
Support LuxCoreRender project with salts and bounties

Portfolio : https://www.behance.net/DRAVIA
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: GPU render experience

Post by Dade »

DionXein wrote: Sat Oct 24, 2020 7:49 am I used OptiX in this case, and usually it renders faster than CUDA and OpenCL.
Only for RTX GPUs (where it can be a lot faster); in my tests, CUDA+OptiX is slower on old GPUs than CUDA + my BVH code. It is likely "planned obsolescence" at work (to force people to buy new RTX GPUs).
johannes.wilde
Posts: 68
Joined: Fri Sep 21, 2018 7:57 am

Re: GPU render experience

Post by johannes.wilde »

DionXein wrote: Sat Oct 24, 2020 7:49 am I just gave Octane another try. I wouldn't say the viewport rendering is as reactive as Cycles, at least it isn't on my machine. However... holy moly, it's incredibly fast.
That's not what I meant. I am totally happy with BlendLuxCore's viewport performance on GPU. But I think the UI could stay more responsive and smooth when working with viewport rendering.
Theo_Gottwald
Posts: 109
Joined: Fri Apr 24, 2020 12:01 pm

Re: GPU render experience

Post by Theo_Gottwald »

I have had several discussions on the responsiveness of the Luxcore UI, especially at higher output resolutions.
I am not sure there is any hope. I guess most devs never render higher than 1920x1080 :-)
Get Blender-Quickbuttons and configure over 100 Blender-Buttons individual to your needs.
Visit my YouTube-Channel: Theo's Fun Videos and watch several Blender related Videos.
Join @Dreamstime and sell your Renderings to the world.
DionXein
Posts: 81
Joined: Mon Jun 01, 2020 10:22 am

Re: GPU render experience

Post by DionXein »

Theo_Gottwald wrote: Sun Oct 25, 2020 12:39 pm I guess most devs never render higher than 1920x1080 :-)
I'm sure they do. The main idea is to use most of the available PC resources on the image rendering. And for calibration you should obviously use lower resolutions, skip frames in animation, etc. For the final shot, all you have to do is press one button and leave it alone for a while.
Simple techniques like lowering the pixel block size, setting a render border in camera view, putting models in dedicated collections, and using the CPU for the preview render view should make your workflow much smoother.
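To get a rough sense of how much those tricks save, here is a small illustrative calculation (plain Python, not BlendLuxCore code; the resolutions, border fraction, and preview scale are example numbers, not defaults):

```python
# Illustrative arithmetic: how much sampling work a render border and a
# lower preview resolution save. All numbers here are made-up examples.

def pixels_to_trace(width, height, border=1.0, resolution_scale=1.0):
    """Pixels actually sampled per pass.

    border           -- fraction of the frame area inside the render border
    resolution_scale -- preview resolution as a fraction of final (0..1)
    """
    w = int(width * resolution_scale)
    h = int(height * resolution_scale)
    return int(w * h * border)

final = pixels_to_trace(1920, 1080)            # full-frame final render
preview = pixels_to_trace(1920, 1080,
                          border=0.25,         # border over a quarter of the frame
                          resolution_scale=0.5)  # half-resolution preview
print(final, preview, final // preview)        # the preview traces ~16x fewer pixels
```

So a half-resolution preview restricted to a quarter-frame border does roughly 1/16 of the per-pass work, which is why the viewport feels so much snappier with these settings.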
chafouin
Posts: 120
Joined: Wed Jan 24, 2018 9:35 pm
Location: Los Angeles

Re: GPU render experience

Post by chafouin »

Again, I'm not complaining about the viewport rendering, for which I use the CPU anyway, but about the final render. There should be a way to leave your PC somewhat reactive, at least enough to abort a render. I can't even do that; I have to kill Blender from the task manager if I use CUDA or OptiX.
Theo_Gottwald
Posts: 109
Joined: Fri Apr 24, 2020 12:01 pm

Re: GPU render experience

Post by Theo_Gottwald »

Looking at how responsive AMD Radeon ProRender is at any resolution,
I would just say "it is possible", even if the rendering would take 5% longer.
Luxrender takes very long here, longer than any other render engine.
However, the results justify it; especially the caustics and interior renderings with Luxrender are amazing.
So I would give it the necessary time. But beginners will not use it if it is not responsive;
they may think "Blender has crashed". And it feels like that.
DionXein
Posts: 81
Joined: Mon Jun 01, 2020 10:22 am

Re: GPU render experience

Post by DionXein »

Theo_Gottwald wrote: Mon Oct 26, 2020 4:36 pm Looking at how responsive AMD Pro Render is at any resolution
ProRender isn't fully unbiased; LuxCore is.
Theo_Gottwald
Posts: 109
Joined: Fri Apr 24, 2020 12:01 pm

Re: GPU render experience

Post by Theo_Gottwald »

The responsiveness has to do with the general program structure.
I do not think it has to do with how an engine internally works.
On Windows we would just add some "DoEvents()" after each internal action.
Of course this may cost a bit of time, but I doubt it will be such a big issue.
As the responsiveness decreases with the render resolution, I assume that these
"DoEvents" calls only happen on each complete redraw of the picture.
Why not call it after a fixed number of pixels?
In that case the responsiveness would always be the same.
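The yield-every-N-pixels idea can be sketched like this. This is a sketch of the concept only, not LuxCore or Blender code; `pump_events` is a hypothetical stand-in for whatever the host UI actually calls (e.g. a Win32 DoEvents-style message pump), and the yield interval is an arbitrary example:

```python
PIXELS_PER_YIELD = 65536  # tune: smaller = more responsive, more call overhead

def pump_events():
    """Placeholder for the host UI's event processing (hypothetical)."""
    pump_events.calls += 1
pump_events.calls = 0

def update_film(width, height):
    """Copy the film to the screen, yielding to the UI at a fixed pixel interval."""
    done = 0
    for y in range(height):
        for x in range(width):
            # ... copy/tonemap one pixel here ...
            done += 1
            if done % PIXELS_PER_YIELD == 0:
                pump_events()  # interval between yields is constant,
                               # independent of the film resolution

update_film(1920, 1080)
print(pump_events.calls)  # the UI gets serviced 31 times during this update
```

A bigger film simply yields more often in total; the time between yields, which is what the user feels, stays the same.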
B.Y.O.B.
Developer
Posts: 4146
Joined: Mon Dec 04, 2017 10:08 pm
Location: Germany
Contact:

Re: GPU render experience

Post by B.Y.O.B. »

I have already written about the causes for hanging on high-resolution final renders here: https://github.com/LuxCoreRender/BlendL ... -667749833

Cycles and Radeon ProRender are usually used with small tiles, and they only update the pixels of one tile on each update. LuxCore, on the other hand, renders the whole film at once, and currently film updates in Blender always transfer the whole film at once, too.
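The size of that difference is easy to quantify. A back-of-the-envelope comparison, assuming 4 floats (32-bit RGBA) per pixel; the 32x32 tile size and the resolutions are illustrative, not what either engine necessarily uses:

```python
# Bytes pushed to Blender per update: one small tile (tiled engines)
# vs. the whole film (LuxCore). Tile size and resolutions are examples.
BYTES_PER_PIXEL = 4 * 4  # RGBA, 32-bit float per channel

def update_bytes(width, height):
    """Bytes transferred when a region of the given size is updated."""
    return width * height * BYTES_PER_PIXEL

tile = update_bytes(32, 32)            # tiled engine: one tile per update
film_1080p = update_bytes(1920, 1080)  # whole-film update at 1080p
film_4k = update_bytes(3840, 2160)     # whole-film update at 4K

print(tile, film_1080p, film_4k)
print(film_4k // tile)  # a 4K whole-film update moves 8100x more data per update
```

Under these assumptions, every film update at 4K moves thousands of times more data than a single tile update, which is consistent with the hangs people report on high-resolution final renders.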

Anyway, this is getting off-topic here, as this thread is about problems with GPU viewport rendering, not about final renders with big resolutions.