CPU vs GPU shading

lacilaci
Donor
Posts: 1969
Joined: Fri May 04, 2018 5:16 am

CPU vs GPU shading

Post by lacilaci »

Why am I seeing shading errors on GPU but not on CPU with the same settings?

Now, this scene is a bit of a stretch case: I tried to emulate a problem I usually see on some CAD models, but it is also visible (faintly) in the LuxCore benchmark scene from Sharlybg.

There are ways to solve this, I know, but in some cases I'm afraid my only option would be to use CPU rendering.
Attachments
cpu.jpg
gpu.jpg
cpugpu.blend
(719.75 KiB) Downloaded 125 times
kintuX
Posts: 809
Joined: Wed Jan 10, 2018 2:37 am

Re: CPU vs GPU shading

Post by kintuX »

IDK, might be a bug...
At first glance, it looks like the min. epsilon value behaves differently on GPU vs. CPU. For GPU it must be set 100x higher; then both give the same result.
The same behavior is observed in standalone.
I didn't look further.
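
For reference, a minimal pyluxcore sketch of the workaround described above (raising the minimum epsilon when rendering on the GPU engine in standalone). The property name scene.epsilon.min and the 100x factor follow the thread; the file name render.cfg, the concrete epsilon value and the render time are placeholders, so treat this as an illustration rather than a tested fix.

Code: Select all

import time
import pyluxcore

pyluxcore.Init()

# Load an exported standalone render configuration (placeholder file name).
# The .cfg is expected to reference the scene via its usual scene.file entry.
props = pyluxcore.Properties("render.cfg")

# Raise the minimum ray epsilon; 1e-3 is roughly 100x a typical default and
# is only meant to illustrate the factor reported above.
props.Set(pyluxcore.Property("scene.epsilon.min", 1e-3))

# Use the OpenCL path tracer so the GPU is the engine being tested.
props.Set(pyluxcore.Property("renderengine.type", "PATHOCL"))

config = pyluxcore.RenderConfig(props)
session = pyluxcore.RenderSession(config)

session.Start()
time.sleep(10)              # render for a few seconds (illustrative)
session.GetFilm().Save()    # write the film outputs configured in the .cfg
session.Stop()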
FarbigeWelt
Donor
Posts: 1046
Joined: Sun Jul 01, 2018 12:07 pm
Location: Switzerland

Re: CPU vs GPU shading

Post by FarbigeWelt »

I made the same observation, but I had to increase min. epsilon by a factor of 500. It depends a lot on how the edges are connected in the shape. These glitches are often seen with GPU rendering after a Boolean difference operation.
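
Since the artifacts correlate with how the Boolean leaves the edges connected, merging the near-coincident vertices a Boolean difference tends to create is another option besides raising min. epsilon. A minimal Blender Python sketch of that cleanup, not taken from this thread itself; the object name and merge distance are placeholders.

Code: Select all

import bpy
import bmesh

# The mesh that came out of the Boolean difference (modifier already applied).
obj = bpy.data.objects["Cube"]
me = obj.data

bm = bmesh.new()
bm.from_mesh(me)

# Weld vertices closer than the given distance (Blender's "Merge by Distance").
bmesh.ops.remove_doubles(bm, verts=bm.verts, dist=1e-4)

bm.to_mesh(me)
bm.free()
me.update()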
Light and Word designing Creator - www.farbigewelt.ch - aka quantenkristall || #luxcorerender
MacBook Air with M1