Hybrid Back/Forward path tracing (aka BiDir without MIS)

Discussion related to the LuxCore functionality, implementations and API.

Post by provisory »

So for unbiased SDS paths the emulation of a camera/image/pixel plane would be necessary?

Post by Dade »

provisory wrote: Thu Jul 04, 2019 1:35 pm So for unbiased SDS paths the emulation of camera/image/pixels plane would be necessary?
You are forgetting the difference between what can work in theory and what works in practice, especially with 32-bit floating-point precision: the probability of hitting the ultra-tiny lens sensor is going to be so small that the end result will be just a few fireflies and nothing more.
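To put a rough number on that, here is a back-of-the-envelope sketch (my own illustration, not LuxCore code) of the chance that a uniformly sampled hemisphere direction from a surface point passes through a small lens aperture:

```python
import math

def lens_hit_probability(lens_radius, distance):
    """Small-solid-angle approximation: a disc of radius lens_radius
    seen from a point at the given distance subtends pi*r^2/d^2
    steradians; divide by the 2*pi sr of the full hemisphere."""
    solid_angle = math.pi * lens_radius ** 2 / distance ** 2
    return solid_angle / (2.0 * math.pi)

# A 1 mm lens aperture seen from 2 m away:
print(lens_hit_probability(0.001, 2.0))  # ~1.25e-07, one hit per ~8 million rays
```

At those odds the few rays that do connect carry enormous weights, which is exactly the firefly behaviour Dade describes.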

Post by epilectrolytics »

provisory wrote: Thu Jul 04, 2019 1:35 pm So for unbiased SDS paths the emulation of camera/image/pixels plane would be necessary?
That would be a poor substitute, just like the SDS paths the path tracer already renders.
I don't think unbiased SDS paths are viable in the context of hybrid rendering.

Dade has already announced that BiDirVM will be fixed later; that will be an unbiased engine capable of rendering SDS paths.
Probably slower than hybrid though, more like classic BiDir.

But even the hybrid engine is announced to be complemented with a new SDS cache; how that is supposed to work I cannot even imagine.
The caustic cache provides SDS paths only through being read (direct look-up) by the path tracer.

Anyway, LuxCoreRender is already in very good shape with PGI, Hybrid and Disney working well, and it will get even more awesome in the coming months!

Post by CodeHD »

OK, I think I now understand the point about how the DOF camera is currently implemented. I'm not quite convinced about this yet, though:
Dade wrote: Thu Jul 04, 2019 1:53 pm the probability to hit the ultra-tiny lens sensor is going to be so small that the end result will be just few fireflies and nothing more.
I did some renderings before with a 5-element lens system plus a thin matte translucent box behind it as a detector, viewed by an ortho camera from behind (the full ray path in this case was D^3 S^10 D^2). That gave correct and very good results with BiDir + Metropolis, and as only caustics arrived at the "detector", shouldn't it be the same if that was just a "detector" object? (I'm not saying it rendered super fast, and Metropolis still had its problems with this specific case, but it rendered a lot more than just a few fireflies.)
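For readers unfamiliar with the D/S shorthand: it is Heckbert-style path notation, one letter per bounce (diffuse or specular). A tiny sketch (my own illustration, not anything from LuxCore) that checks a bounce string for the problematic specular-diffuse-specular subpath:

```python
import re

def has_sds(path):
    """True if the bounce string contains a specular-diffuse-specular
    subpath, the case that plain path tracing samples very poorly."""
    return re.search(r"S+D+S+", path) is not None

print(has_sds("D" * 3 + "S" * 10 + "D" * 2))  # False: D^3 S^10 D^2 has no S..D..S
print(has_sds("DSDSD"))                        # True: classic SDS case
```

By this classification CodeHD's D^3 S^10 D^2 path is a (long) caustic rather than an SDS path, which is consistent with BiDir + Metropolis handling it well.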

Post by Dade »

For SDS paths, I'm playing with two ideas: the original PhotonGI caustic cache used only for SDS paths, or light tracing with vertex merging (something similar to BiDirVM but dedicated only to caustics, including SDS paths).

The second is an unbiased method. It is also the first one with periodic updates and cache radius reduction added... so the truth is they are about the same solution, just seen from two different starting points.
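The "cache radius reduction" part can be sketched with the usual progressive photon mapping schedule (my own sketch; alpha = 0.7 is a conventional choice, not necessarily what LuxCore uses). Shrinking the merge radius over the passes is what makes vertex merging consistent, i.e. the bias vanishes in the limit:

```python
def merge_radius(r0, passes, alpha=0.7):
    """Progressive photon mapping radius schedule:
    r_{i+1}^2 = r_i^2 * (i + alpha) / (i + 1).
    Each factor is < 1, so the radius shrinks towards zero."""
    r2 = r0 * r0
    for i in range(1, passes):
        r2 *= (i + alpha) / (i + 1)
    return r2 ** 0.5

print(merge_radius(0.1, 1))    # initial radius, no reduction on the first pass
print(merge_radius(0.1, 100))  # noticeably smaller after 100 passes
```

This is why the two views converge: a caustic cache that is periodically rebuilt with a shrinking look-up radius behaves like vertex merging run progressively.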

Post by Sharlybg »

Dade wrote: The second is an unbiased method. It is also the first one with periodic updates and cache radius reduction added... so the truth is they are about the same solution, just seen from two different starting points.
The method with fewer settings will be good if the end result stays the same. Anything that keeps artists focused on the art is welcome.
That's one of the reasons why I love the hybrid back/forward approach.

Post by FarbigeWelt »

epilectrolytics wrote: Thu Jul 04, 2019 12:59 pm
FarbigeWelt wrote: Thu Jul 04, 2019 11:25 am I still cannot follow the so-called non-traceable rays.
The renderer does not model a system with lens & sensor.
There is also no image plane.
There's only a point defining the camera position.
If there were a simulation of a lens-sensor system, there would be effects of aperture and depth of field. With the aperture wide open, depth of field would be shallow; to achieve deep depth of field (sharpness no matter how close or far an object in front of the lens is) the aperture would have to be small, almost closed. In reality, the smaller the aperture, the darker the picture at a given sensitivity.

I guess in a mathematical world the aperture can be reduced to the extent of the diameter of the simulated rays, i.e. 0, because a ray can be seen as a line which has a length but no extent in a second dimension. In reality the orifice's diameter should be at least approximately the size of the maximum detectable wavelength. If the lens is removed, one gets a pinhole camera without a real image plane, but with a projection plane at any distance from the orifice. Reducing that distance to zero means a projection plane area of zero at the location of the orifice, with a radius of zero, and finally there is this point called the camera in some ray tracers. I still don't see the advantages of this solution.

Post by acasta69 »

FarbigeWelt wrote: Fri Jul 05, 2019 6:22 am If there was the simulation of a lens-sensor system there were effects of aperture and depth of field.
...
... and finally there is this point called camera in some raytracers. I still don’t see the advantages of this solution.
I think I'm losing your point a bit... Are you suggesting that ray tracers should move towards the simulation of real camera systems, with real lenses and sensors?
Wouldn't this be computationally a lot more expensive?
The "thin lens" camera model has limitations for sure, but it is very effective.
FarbigeWelt wrote: Fri Jul 05, 2019 6:22 am In reality the orifice‘s diameter should at least have approx. the size of max. detectable wavelength.
From a DOF point of view, this would effectively be very similar to having DOF disabled. Or would you like to simulate diffraction?
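For reference, the "thin lens" model boils down to jittering the ray origin over a lens disc while keeping a chosen focal plane sharp. A minimal sketch in camera space (camera at the origin looking down +z; my own code, not the LuxCore implementation):

```python
import math
import random

def thin_lens_ray(pixel_dir, lens_radius, focal_distance):
    """Turn a pinhole ray (origin at the camera point, normalized
    direction pixel_dir) into a thin-lens ray: sample a point on the
    lens disc and aim at the in-focus point on the focal plane.
    lens_radius = 0 degenerates to the pinhole camera (infinite DOF)."""
    # the point that must stay sharp, along the original pinhole ray
    focus = tuple(focal_distance * c for c in pixel_dir)
    # rejection-sample a uniform point on the unit disc
    while True:
        lx, ly = random.uniform(-1.0, 1.0), random.uniform(-1.0, 1.0)
        if lx * lx + ly * ly <= 1.0:
            break
    origin = (lens_radius * lx, lens_radius * ly, 0.0)
    d = tuple(f - o for f, o in zip(focus, origin))
    inv_len = 1.0 / math.sqrt(sum(c * c for c in d))
    return origin, tuple(c * inv_len for c in d)

# Zero aperture: every ray starts at the camera point, nothing is blurred.
origin, direction = thin_lens_ray((0.0, 0.0, 1.0), 0.0, 5.0)
print(origin, direction)  # (0.0, 0.0, 0.0) (0.0, 0.0, 1.0)
```

Two parameters (aperture radius and focal distance) buy DOF without tracing through any glass, which is why the model is so cheap compared with a real multi-element lens.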

Post by FarbigeWelt »

acasta69 wrote: Fri Jul 05, 2019 7:42 am
FarbigeWelt wrote: Fri Jul 05, 2019 6:22 am If there was the simulation of a lens-sensor system there were effects of aperture and depth of field.
...
... and finally there is this point called camera in some raytracers. I still don’t see the advantages of this solution.
I think I'm losing your point a bit... Are you suggesting that ray tracers should move towards the simulation of real camera systems, with real lenses and sensors?
Wouldn't this be computationally a lot more expensive?
The "thin lens" camera model has limitations for sure, but it is very effective.
FarbigeWelt wrote: Fri Jul 05, 2019 6:22 am In reality the orifice‘s diameter should at least have approx. the size of max. detectable wavelength.
From a DOF point of view, this would effectively be very similar to having DOF disabled. Or would you like to simulate diffraction?
As far as I can see, you got my point. I don't mean to simulate a real optical system. In a mathematical simulation that is not required, because you do not need to correct chromatic aberrations; dispersion can be avoided by definition. In a mathematical simulation there is no limitation but floating-point precision on the shape of the lens and the sensor, i.e. the benefits exceed, in my opinion, the calculation costs. And I guess the implementation is just a small extension of the current algorithms, exactly the same for all bounces between camera and light. The extra computation should be less than increasing the depth by 2, and it would include 'perfect' depth of field.

Diffraction and interference would be pretty cool, but that is a completely different story: f(x) = y = sin(a*x + b)*c in the x,y plane, rotating around z. A geometrical light wave is already hard to imagine; the real one seems impossible due to its photon location probability density. Real light creeps around corners and passes through holes too small for its wavelength. Before diffraction, let's first implement rotation of the propagation plane and phase shift, to get nice beat and interference patterns due to the promotion and extinction of overlapping rays. (Could be something for version 3.0 ;-))
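The promotion/extinction FarbigeWelt describes is just superposition of such sine waves. A quick sketch using the same f(x) = sin(a*x + b)*c form (my own toy code, not a renderer feature):

```python
import math

def superpose(x, waves):
    """Sum of waves given as (a, b, c) triples for c*sin(a*x + b).
    Equal phase promotes (constructive interference); a phase shift
    of pi extinguishes (destructive interference)."""
    return sum(c * math.sin(a * x + b) for a, b, c in waves)

x = 0.3
print(superpose(x, [(2.0, 0.0, 1.0), (2.0, 0.0, 1.0)]))      # 2*sin(0.6): promotion
print(superpose(x, [(2.0, 0.0, 1.0), (2.0, math.pi, 1.0)]))  # ~0: extinction
```

A ray tracer would additionally need to carry phase (and polarization) along each path for this to produce real interference patterns, which is why it is indeed a completely different story.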

Post by epilectrolytics »

A probably stupid question regarding the hybrid path engine and the environment light visibility cache:
couldn't the ELVC be replaced with light tracing?

Normally light tracing (as can be seen with LightCPU) does not only do caustics but direct light too.
In the case of hybrid back/forward path tracing this is disabled, so light tracing does only caustics.
But if light tracing were enabled to do direct lighting too, and the path tracer instead had direct lighting disabled (including importance sampling, visibility maps and the cache) so that it would only do indirect light (optionally with PGI), wouldn't that be simpler and more effective?

That would mean direct light sampling is CPU-only, but it is so fast that it should not matter.
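The proposal amounts to a simple invariant: every transport class must be handled by exactly one of the two passes, so nothing is counted twice and nothing is missed. A toy check of that partition (the class names and sets are mine, not LuxCore settings):

```python
# Hypothetical division of labour in the proposed setup:
LIGHT_TRACING = {"direct", "caustics"}
PATH_TRACING = {"indirect"}

def partition_ok(classes):
    """Each transport class must be claimed by exactly one pass;
    an overlap double-counts light, a gap loses it."""
    return all((c in LIGHT_TRACING) != (c in PATH_TRACING) for c in classes)

print(partition_ok({"direct", "caustics", "indirect"}))  # True: no overlap, no gap
```

Whether the CPU-side light tracing pass can keep up with GPU path tracing on direct light is the practical question the proposal leaves open.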