CPU vs GPU shading

Use this forum for general user support and related questions.
Posts: 456
Joined: Fri May 04, 2018 5:16 am

CPU vs GPU shading

Post by lacilaci » Tue Sep 25, 2018 3:46 pm

Why am I seeing shading errors on GPU but not on CPU with the same settings?

Now, this scene is a bit of a stretch case: I set it up to emulate a problem I usually see on some CAD models. But it is also visible, faintly, in the LuxCore benchmark scene from Sharlybg.

I know there are ways to work around this, but in some cases I'm afraid my only option would be to use CPU rendering.

Posts: 263
Joined: Wed Jan 10, 2018 2:37 am

Re: CPU vs GPU shading

Post by kintuX » Tue Sep 25, 2018 6:47 pm

IDK, might be a bug...
At first glance, it looks like the min. epsilon value behaves differently on GPU vs. CPU: on the GPU it must be set about 100x higher, and then both give the same result.
The same behavior is observed in standalone.
I didn't look further.
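For anyone hitting the same thing, the workaround above can be written into the render config. The property names below are LuxCore's epsilon settings, but the concrete values are only a sketch of "raise the minimum ~100x for the GPU path" -- tune them per scene:

```
# Min. epsilon defaults to a very small value (around 1e-5);
# the GPU path apparently needs it set roughly 100x higher.
scene.epsilon.min = 1e-3
scene.epsilon.max = 1e-1
```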

Posts: 355
Joined: Sun Jul 01, 2018 12:07 pm
Location: Switzerland

Re: CPU vs GPU shading

Post by FarbigeWelt » Wed Sep 26, 2018 7:18 am

I made the same observation, but I had to increase min. epsilon by a factor of 500. It depends a lot on how the edges are connected in the mesh. These glitches are often seen in GPU renders after a boolean difference operation.
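If the difference really does come down to floating-point precision (an assumption on my part; the GPU kernels run in single precision), here is a small plain-Python illustration, emulating 32-bit floats with struct, of how much coarser float32 spacing (ULP) is than float64, and how it grows with coordinate magnitude -- which is why a tiny epsilon tuned near the origin stops working on large scenes:

```python
import math
import struct

def f32(x):
    """Round a Python float (double) to the nearest 32-bit float."""
    return struct.unpack('f', struct.pack('f', x))[0]

def ulp32(x):
    """Spacing between positive x and the next representable 32-bit float."""
    bits = struct.unpack('I', struct.pack('f', x))[0]
    return struct.unpack('f', struct.pack('I', bits + 1))[0] - f32(x)

# ULP grows with magnitude: an epsilon that works near the origin
# can be far below float32 resolution on large coordinates.
for scale in (1.0, 100.0, 10000.0):
    print(f"{scale:>8}: float64 ulp = {math.ulp(scale):.3e}, "
          f"float32 ulp = {ulp32(scale):.3e}")
```

At magnitude 1.0 the float32 spacing is 2^-23 (about 1.2e-7) versus 2^-52 (about 2.2e-16) for float64, so an epsilon that only needs to clear double-precision noise is many orders of magnitude too small for the single-precision path.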
Microsoft Windows 10 Professional, intel i7 4770K, 32 GB, AMD R9 290x 4 GB, AMD R9 390x 8 GB
Instagrammer, please join #luxcorerender for your renderings.
