Film-only out-of-core rendering support (CUDA-only)

Discussion related to the LuxCore functionality, implementations and API.
AndreasResch
Posts: 135
Joined: Fri Jul 06, 2018 9:32 am

Re: Film-only out-of-core rendering support (CUDA-only)

Post by AndreasResch »

I fear we are talking past each other here. The discussion is about the technical possibility of using OOC on Windows. It should not matter whether tiled or progressive rendering is used; the difference would only be in the amount of memory involved.

For what it's worth, here are the stats for the scene with progressive rendering. It uses OOC as well.
Attachments
GPUMem_Cycles_Progr_01.jpg
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: Film-only out-of-core rendering support (CUDA-only)

Post by Dade »

AndreasResch wrote: Mon Jun 21, 2021 2:35 pm
It should not matter if tiled or progressive rendering is used.
The topic of this thread is the GPU memory used to store the film, and you think it doesn't matter whether you are allocating GPU memory for a full 7200x4800-pixel film or for a single 64x64-pixel tile?
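To make the size difference concrete, here is a minimal back-of-the-envelope sketch (not LuxCore code): it compares the memory needed to hold film buffers at the full 7200x4800 resolution against a single 64x64 tile. The per-pixel layout (RGBA floats) and the channel count are assumptions chosen only for illustration; the real film configuration depends on the enabled AOVs.

```cpp
// Rough sketch, not LuxCore code: film-buffer size for a full-resolution
// film vs. a single 64x64 tile. Channel layout and count are assumptions.
#include <cstdint>
#include <cstdio>

int main() {
    const uint64_t bytesPerPixelPerChannel = 4 * sizeof(float); // assumed RGBA float
    const uint64_t channelCount = 4;                            // assumed: radiance + a few AOVs

    auto filmBytes = [&](uint64_t width, uint64_t height) {
        return width * height * bytesPerPixelPerChannel * channelCount;
    };

    const uint64_t fullFilm = filmBytes(7200, 4800); // full-resolution film
    const uint64_t tile     = filmBytes(64, 64);     // one 64x64 tile

    std::printf("Full 7200x4800 film: %.1f MiB\n", fullFilm / (1024.0 * 1024.0));
    std::printf("64x64 tile:          %.1f KiB\n", tile / 1024.0);
    std::printf("Ratio: ~%llux\n", (unsigned long long)(fullFilm / tile));
    return 0;
}
```

With these assumed numbers the full film is on the order of gigabytes while a single tile fits in a few hundred kilobytes (a ratio of several thousand to one), which is why the rendering mode matters for a film-only out-of-core feature.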
Support LuxCoreRender project with salts and bounties
AndreasResch
Posts: 135
Joined: Fri Jul 06, 2018 9:32 am

Re: Film-only out-of-core rendering support (CUDA-only)

Post by AndreasResch »

Please don't straw-man me; that leads nowhere. You replied to and quoted my comments, which were an answer to the remark about OOC possibly not working on Windows. Nothing else. I have always said that FOOC works, and tiled vs. progressive rendering never entered my argument about its validity. That's it. Nothing more, nothing less.

I've said what I can observe on my PCs. If there's any value in that, fine. Otherwise, there's nothing to add.