Light tracing with cache?

General project and community related discussions and offtopic threads.
epilectrolytics
Donor
Posts: 578
Joined: Thu Oct 04, 2018 6:06 am

Light tracing with cache?

Post by epilectrolytics » Wed Sep 18, 2019 9:31 am

This is probably another stupid idea that won’t work but I’m posting it anyways :mrgreen:

While experimenting with LuxCoreUi I discovered that LightCPU is the most powerful of all render engines. A hint can be taken from the performance of light tracing within the hybrid engine: Caustics appear almost instantly while BiDir, Path+Metro or caustic cache all need their time to resolve them.

When working with LightCPU I consistently find that indirect lighting on diffuse surfaces resolves similarly fast. What takes a lot of time in path tracing is done very quickly in light tracing, even on CPU only.

The only problem with LightCPU is that it doesn’t handle specular surfaces.
Why is that?
Because we assume the camera is an infinitesimally small point that can never be hit when randomly sampling specular surfaces.

On closer inspection this is not quite correct, because hidden in it is the assumption that the camera samples screen space continuously, whereas in reality it does so step-wise according to the image resolution.
When moving to the next pixel, an angular shift is made for the camera ray cast. That means the camera de facto has a pixel-wide extension (which is not constant but depends on ray length, surface curvature etc.).
This extension can be calculated and used to make the camera visible to specular bounces from light rays.
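For illustration, that per-pixel extension could be estimated roughly like this; a minimal sketch assuming a pinhole camera and a surface roughly facing the ray, with a hypothetical function name and parameters (not LuxCore API):

```python
import math

def pixel_footprint(hit_distance, fov_deg=49.1, image_width=1920):
    """Approximate world-space width of one pixel at a given ray length.

    Assumes a pinhole camera and a surface roughly perpendicular to the
    ray; curvature and grazing angles widen this footprint further.
    """
    # Angular width of a single pixel (radians).
    pixel_angle = math.radians(fov_deg) / image_width
    # Small-angle approximation: footprint grows linearly with distance.
    return hit_distance * pixel_angle
```

Under the small-angle approximation the footprint grows linearly with ray length, which matches the note above that the extension is not constant.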

Sadly this is no solution, because the problem with a small intersectable camera is the same as with small intersectable lights in path-traced caustics: it is not efficient.
Replacing a probability of zero with a very low probability just means very slow convergence, even with Metropolis sampling.
I have rendered glass bodies with low roughness in LightCPU, which poses a similar problem: instead of being black they appear very dark and stay that way for a long time.

So this is a dead end, but realising the discrete way the camera scans the scene led to another idea.
Basically the image resolution makes the camera project a grid onto all surfaces in the scene, where the grid nodes represent pixels.

Could those projected grid nodes be collected in a cache?

Let’s assume a scene with a mirror right in front of the camera, 100 pixels high and 100 pixels wide, that shows the stuff located behind the camera.
We could now trace rays from the camera through the middle of each pixel, let them bounce off the mirror, and record where they meet the surfaces of objects in the scene, collecting the data in a cache of 100 x 100 = 10,000 entries.
An entry needs to contain the shading data of all specular bounces (reflection or refraction colours) and, of course, the pixel identity.
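The cache construction described above might be sketched like this; `trace_specular_chain` is a hypothetical stand-in for the actual ray tracing, not LuxCore code:

```python
def build_pixel_grid_cache(width, height, trace_specular_chain):
    """One cache entry per pixel: where the camera ray, after following
    all specular bounces, finally lands on a diffuse surface.

    trace_specular_chain(px, py) is expected to return the final hit
    point plus the accumulated specular throughput (reflection and
    refraction colours along the chain), or None if the ray escapes.
    """
    cache = {}
    for y in range(height):
        for x in range(width):
            # Trace through the centre of pixel (x, y).
            result = trace_specular_chain(x + 0.5, y + 0.5)
            if result is not None:
                hit_point, throughput = result
                # Key the entry by pixel identity, as described above.
                cache[(x, y)] = (hit_point, throughput)
    return cache
```

For the 100 x 100 mirror example this yields at most 10,000 entries, one per pixel.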

Now consider light tracing: when a light ray hits the wall behind the camera, normally a next event estimation is made. The result is that the camera is in line of sight but pointing the wrong way, so the ray either continues or terminates if the bounce limit was reached; in either case no render result is calculated for this point.

But with a pixel-grid cache (PGC) available, nearby entries could be gathered, and if the hit point is found to lie between two neighbouring pixel entries, we can assume indirect (specular) visibility to the camera and calculate and hand down a shading result to the respective pixel.
If a pixel entry is found nearby but no neighbouring pixel entry, that would indicate a border case (e.g. the last pixel to the right) and could be neglected.
Suddenly, in light tracing, the wall behind the camera would appear in the mirror, and the brick structure would be displayed sharply, because the pixel-grid nature guarantees sharpness!
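The lookup during light tracing could then look roughly like this; the nearest-entry search (e.g. a kd-tree over cached positions) is assumed to have happened already, and all names here are hypothetical sketches, not LuxCore code:

```python
def splat_candidate(cache, nearest_pixel, hit_point, max_dist):
    """Return the pixel to splat a shading result to, or None.

    cache maps (x, y) pixel coordinates to (hit_point, throughput);
    nearest_pixel is the pixel of the closest cache entry to hit_point.
    """
    x, y = nearest_pixel
    entry = cache.get((x, y))
    if entry is None:
        return None
    # Require a full cross of neighbouring entries around the nearest
    # one; a missing neighbour indicates a border case and is neglected.
    for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
        if (nx, ny) not in cache:
            return None
    # The hit point must actually lie close to the cached position.
    ex, ey, ez = entry[0]
    hx, hy, hz = hit_point
    dist2 = (hx - ex) ** 2 + (hy - ey) ** 2 + (hz - ez) ** 2
    if dist2 > max_dist ** 2:
        return None
    return (x, y)
```

On success the light path's contribution, multiplied by the cached specular throughput, would be splatted to the returned pixel.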

Now I wonder if this is feasible or if there are deal-breakers I have overlooked?

If the mirror were not a flat plane but a sphere, the pixel cache entries would be spread far apart; maybe there is a limit beyond which they cannot be collected and evaluated properly, when the next entry lies on a different object too far away.

A bump map on the mirror would distort the grid, including the respective entries: neighbouring entries would be spread out, and maybe distant entries jumbled closely together, which is probably difficult or impossible to handle.

A glass body forces a ray-split event, which means two and probably more grid entries per pixel, possibly leading to a very big cache and causing memory problems.

And then there is the problem with glossy materials or “specular roughness“.
LightCPU does a superior job with indirect diffuse lighting but once there are glossy materials in the mix it starts to struggle just like path tracing.

How do we define pixel-grid entries when their position becomes an area due to glossy roughness?
As long as the “pixel-position blur areas“ don’t overlap there would still be a defined correlation, but when the pixel grid gets too blurred I have no idea how to deal with it.

Probably it won’t work but I felt like I had to post this :D
MBPro 15" 16GB i7-4850HQ GT750M, MacOS 10.13.6 & Win10Pro PC 16GB Ryzen 2700X, 2 x RTX 2070

Dade
Developer
Posts: 3281
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: Light tracing with cache?

Post by Dade » Wed Sep 18, 2019 9:48 am

epilectrolytics wrote:
Wed Sep 18, 2019 9:31 am
We could now trace rays from the camera through “the middle of each pixel“, bouncing off the mirror and register where they meet the surface of objects in the scene, collecting the data in a cache of 100x100=10 000 entries.
Each entry is a point, while a pixel area can cover thousands of objects. To sample a pixel area you are going to need many cache entries per pixel, so it is 100x100x<big number> ... with all the problems connected to having a finite value for "<big number>".

Looking for the nearest cache entry among "100x100x<big number>" is also quite expensive.

Aside from all these problems, your (finite) cache can simply be replaced by tracing an eye path of random length, tracing a light path of random length and connecting them (it is BiDir without MIS). It will render all the stuff (except SDS paths) with a lot more noise than BiDir+MIS (but faster).
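The replacement described here (an eye path and a light path of random lengths, connected deterministically, i.e. BiDir without MIS) can be sketched as follows; all helpers are hypothetical stand-ins, and the geometry and BSDF terms at the connection are omitted for brevity:

```python
import random

def connect_paths(trace_eye_path, trace_light_path, visible, max_len=8):
    """Single-strategy bidirectional connection (no MIS weighting).

    trace_eye_path(n) / trace_light_path(n) trace a path of length n and
    return (endpoint, accumulated throughput); visible(a, b) is a shadow
    ray test between the two endpoints.
    """
    eye_len = random.randint(1, max_len)
    light_len = random.randint(1, max_len)
    eye_vertex, eye_throughput = trace_eye_path(eye_len)
    light_vertex, light_throughput = trace_light_path(light_len)
    # Deterministic connection between the two endpoints; without MIS
    # this single strategy carries all the variance, hence the extra
    # noise compared to BiDir+MIS.
    if not visible(eye_vertex, light_vertex):
        return 0.0
    return eye_throughput * light_throughput
```

A real implementation would also multiply in the geometry term and the BSDFs at both endpoints; this sketch only shows the structure of the connection.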

If you add direct light sampling to the eye path to reduce noise ... you end up with Hybrid back/forward rendering.
Support LuxCoreRender project with salts and bounties

epilectrolytics
Donor
Posts: 578
Joined: Thu Oct 04, 2018 6:06 am

Re: Light tracing with cache?

Post by epilectrolytics » Wed Sep 18, 2019 10:32 am

sorry double post
Last edited by epilectrolytics on Wed Sep 18, 2019 2:44 pm, edited 2 times in total.

epilectrolytics
Donor
Posts: 578
Joined: Thu Oct 04, 2018 6:06 am

Re: Light tracing with cache?

Post by epilectrolytics » Wed Sep 18, 2019 12:12 pm

Thanks for responding!
Dade wrote:
Wed Sep 18, 2019 9:48 am
Each entry is a point while a pixel area can cover thousands of objects. To sample a pixel area, you are going to need many cache entries for each pixel so it is 100x100x<big number>
Looking for the nearest cache entry among "100x100x<big number>" is also quite expensive.
The point of the pixel grid cache is that we already know where each pixel entry is.
In the example there would be entries from p(0,0) to p(99,99), each referring exactly to a point in screen space as well as in object space.
So when the nearest entry is p(47,02), we only need to look at p(46,02), p(48,02), p(47,01) and p(47,03) and confirm there is a square of pixel entries around the hit point.
If this happens in the distance with thousands of objects between the entries, no problem: those will be sampled by subsequent light rays, not within the single hit event we're dealing with now. The nearest entry always exactly defines which pixel is affected, and the hit point defines which object is affected.

The number of pixel entries only grows with the number of specular objects and the number of specular bounces (which may be a problem indeed).
Dade wrote:
Wed Sep 18, 2019 9:48 am
If you add direct light sampling to the eye path to reduce noise ... you end up with Hybrid back/forward rendering.
Yep, but my aim was to get indirect lighting included in light tracing, because LightCPU does it faster than PathCPU and even PathOCL in diffuse-only scenes.

Would a version of the Hybrid engine be possible that does caustics and indirect lighting per light tracing and restricts path tracing to the rest (first-bounce specular or glossy)?
Last edited by epilectrolytics on Wed Sep 18, 2019 2:41 pm, edited 1 time in total.

epilectrolytics
Donor
Posts: 578
Joined: Thu Oct 04, 2018 6:06 am

Re: Light tracing with cache?

Post by epilectrolytics » Wed Sep 18, 2019 12:14 pm

sorry for triple post (got called to the door while posting and messed everything up in the hurry)
original post above.
Last edited by epilectrolytics on Wed Sep 18, 2019 2:46 pm, edited 2 times in total.

lacilaci
Donor
Posts: 1637
Joined: Fri May 04, 2018 5:16 am

Re: Light tracing with cache?

Post by lacilaci » Wed Sep 18, 2019 1:12 pm

I would be really careful with caches that are directly visible...
The one thing PSR does that is very attractive is that it produces high-frequency noise even at early stages (bias), so it can be filtered out with a denoiser.

If we get spots or fireflies in the form of blotches several pixels wide, OIDN will not filter them out, and we might be forced to wait until the bias shrinks to pixel-sized fireflies or no spots at all. This means you cannot do previews: you absolutely have to wait until SDS converges, or else your renders might be polluted with smudged fireflies.

epilectrolytics
Donor
Posts: 578
Joined: Thu Oct 04, 2018 6:06 am

Re: Light tracing with cache?

Post by epilectrolytics » Wed Sep 18, 2019 2:57 pm

This cache would not be directly visible, and it contains no light-related information for lookup.
Instead it maps screen space onto object space, telling us when a diffuse bounce would be seen elsewhere in a mirror or behind glass, and directly in which pixel.

BiDir and PSR solve this by connecting eye and light paths which is time consuming.

But Dade knows best and if he says it's bollocks he's probably right though...

provisory
Posts: 224
Joined: Wed Aug 01, 2018 4:26 pm

Re: Light tracing with cache?

Post by provisory » Thu Sep 19, 2019 9:14 am

Dade wrote:
Wed Sep 18, 2019 9:48 am
... tracing an eye path of random length, trace a light path of random length and connect them (it is BiDir without MIS). It will render all stuff (but SDS paths) with a lot more noise than BiDir+MIS (but faster).

If you add direct light sampling to the eye path to reduce noise ... you end up with Hybrid back/forward rendering.
Are eye paths and light paths connected when using Add Light Tracing?
It rather seems to me that they are separate renderings which are then added together.

Dade
Developer
Posts: 3281
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: Light tracing with cache?

Post by Dade » Thu Sep 19, 2019 9:40 am

provisory wrote:
Thu Sep 19, 2019 9:14 am
Are eye paths and light paths connected when using Add Light Tracing?
Nope, but it would be useless because the same paths are better rendered by normal path tracing.
provisory wrote:
Thu Sep 19, 2019 9:14 am
It rather seems to me, that they are separate renderings which then added together.
Yes, they are just added together.
