Why am I seeing shading errors on GPU but not on CPU with the same settings?
Now, this scene is a bit of a stretch case: I tried to emulate a problem I usually see on some CAD models, but it is also visible (faintly) in the LuxCore benchmark scene from Sharlybg.
I know there are ways to solve this, but in some cases I'm afraid my only option would be to use CPU rendering.
CPU vs GPU shading
- Attachments
- cpugpu.blend (719.75 KiB)
Re: CPU vs GPU shading
IDK, might be a bug...
At first glance, it looks like the min. epsilon value behaves differently on GPU vs. CPU. For GPU it must be set 100x higher; then both give the same result.
Same behavior is observed in standalone.
Didn't look further.
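For anyone who wants to try the workaround described above: in a LuxCore standalone render configuration the minimum epsilon can be raised via the `scene.epsilon.min` property. This is only a sketch; the values below are illustrative (a large increase over a typical default, in line with the 100x-500x factors mentioned in this thread), not a recommended setting:

```
# Standalone render config fragment: raise the geometric
# self-intersection epsilon so GPU results match CPU.
# Values are examples only; tune per scene scale.
scene.epsilon.min = 0.001
scene.epsilon.max = 0.1
```

Note that epsilon is tied to scene scale, so a value that fixes one scene may introduce light leaks or contact-shadow loss in another.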
- FarbigeWelt
- Posts: 469
- Joined: Sun Jul 01, 2018 12:07 pm
- Location: Switzerland
- Contact:
Re: CPU vs GPU shading
I made the same observation, but had to increase min. epsilon by a factor of 500. It depends a lot on how the edges are connected in the shape. These glitches are often seen with GPU rendering after a boolean difference operation.
160.8 | 42.8 (10.7) Gfp / Windows 10 Pro, intel i7 4770K@3.5, 32 GB | AMD R9 290x+R9 390x, 4 GB
17.3 | 19.0 ( 4.7) Gfp / macOS X 13.6, iMac 27'', 2010, intel i7 870@2.93, 24 GB | ATI Radeon HD 5750, 1 GB
#luxcorerender | Gfp = SFFT Gflops