CPU vs GPU shading

lacilaci
Donor
Posts: 1115
Joined: Fri May 04, 2018 5:16 am

CPU vs GPU shading

Post by lacilaci » Tue Sep 25, 2018 3:46 pm

Why am I seeing shading errors on GPU but not on CPU with the same settings?

Admittedly this scene is a bit of a stretch case: I built it to emulate a problem I usually see with some CAD models. The same artifact is also visible, if faintly, in the LuxCore benchmark scene from Sharlybg.

I know there are ways to work around this, but in some cases I'm afraid my only option would be to fall back to CPU rendering.
Attachments
cpu.jpg
gpu.jpg
cpugpu.blend

kintuX
Posts: 356
Joined: Wed Jan 10, 2018 2:37 am

Re: CPU vs GPU shading

Post by kintuX » Tue Sep 25, 2018 6:47 pm

IDK, might be a bug...
At first glance it looks like the min. epsilon value behaves differently on GPU vs. CPU: on GPU it has to be set about 100x higher, and then both give the same result.
The same behavior is observed in standalone.
I didn't look any further.
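For reference, here is a minimal pyluxcore sketch of that workaround for standalone, assuming an exported render config (the file name render.cfg and the 1e-3 value are placeholders; scene.epsilon.min is the render-config property controlling the minimum ray epsilon):

Code: Select all

import pyluxcore

pyluxcore.Init()

# Load the exported render config (file name is a placeholder)
props = pyluxcore.Properties("render.cfg")

# Raise the minimum ray epsilon for the GPU engines; roughly 100x the
# CPU value worked here, but the exact factor is scene-dependent
props.Set(pyluxcore.Property("scene.epsilon.min", 1e-3))

config = pyluxcore.RenderConfig(props)
session = pyluxcore.RenderSession(config)

The same override can also go directly into the .cfg file as a scene.epsilon.min = 0.001 line.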

FarbigeWelt
Donor
Posts: 663
Joined: Sun Jul 01, 2018 12:07 pm
Location: Switzerland
Contact:

Re: CPU vs GPU shading

Post by FarbigeWelt » Wed Sep 26, 2018 7:18 am

I made the same observation, but had to increase min. epsilon by a factor of 500. It depends a lot on how the edges are connected in the shape. These glitches are often seen in GPU renders after a boolean difference operation.
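Independent of the epsilon workaround, a common cleanup for boolean-generated meshes is to merge duplicate vertices and recalculate normals before rendering. A sketch using Blender's bpy operators (the 0.0001 merge threshold is a placeholder, and the boolean result is assumed to be the active object):

Code: Select all

import bpy

# Work on the active object, assumed to be the boolean result
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')

# Merge vertices that the boolean difference left overlapping
bpy.ops.mesh.remove_doubles(threshold=0.0001)

# Recalculate normals so all faces point consistently outward
bpy.ops.mesh.normals_make_consistent(inside=False)

bpy.ops.object.mode_set(mode='OBJECT')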
160.8 | 42.8 (10.7) Gfp / Windows 10 Pro, Intel i7 4770K@3.5, 32 GB | AMD R9 290x + R9 390x, 4 GB
17.3 | 19.0 ( 4.7) Gfp / macOS 10.13.6, iMac 27'', 2010, Intel i7 870@2.93, 24 GB | ATI Radeon HD 5750, 1 GB
#luxcorerender | Gfp = SFFT Gflops
