Using LuxCoreRender for Radio Frequencies

Discussion related to the LuxCore functionality, implementations and API.
mattia
Posts: 5
Joined: Fri Apr 03, 2020 8:41 am

Using LuxCoreRender for Radio Frequencies

Post by mattia »

I'm a researcher interested in radio channel modeling, and I'm currently working with a custom-built ray tracer for that purpose. I would like better performance, though, e.g. GPU-accelerated ray tracing, which is not trivial to program from scratch.

I was wondering whether I could do this with pre-existing open source (optical) ray-tracing software such as Blender+LuxCoreRender. For this to work, I would need to:
  1. Define transmitters (TXs) and receivers (RXs) nodes. I guess TXs are equivalent to light sources and RXs to cameras, but I would need 360° cameras. Also, TXs and RXs should be points with no size
  2. Obtain output files containing information about every single received ray, such as traveled distance/delay, received power, angles of departure and arrival
  3. Possibly, access to the render engine programmatically based on a pre-built Blender CAD scene
  4. Ideally, create an interface (an add-on?) specifically for RF ray-tracing and make it available to the community which allows the complete workflow to stay on Blender, i.e., (i) scene creation, (ii) material settings, (iii) RX/TX positioning (possibly moving), (iv) ray-tracing computation, (v) ray visualization.
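For clarity, a hypothetical per-ray record for point 2 could look like this (the field names are mine for illustration, not an existing LuxCore output format):

```python
from dataclasses import dataclass

C = 299_792_458.0  # speed of light, m/s

@dataclass
class RayRecord:
    distance_m: float  # total travelled distance TX -> RX
    delay_s: float     # propagation delay = distance / c
    power_w: float     # received power along this path
    aod_deg: tuple     # angle of departure (azimuth, elevation)
    aoa_deg: tuple     # angle of arrival (azimuth, elevation)

def make_record(distance_m, power_w, aod_deg, aoa_deg):
    # Delay follows directly from the travelled distance.
    return RayRecord(distance_m, distance_m / C, power_w, aod_deg, aoa_deg)

r = make_record(30.0, 1e-9, (10.0, 0.0), (190.0, 0.0))
print(r.delay_s)  # ~100 ns for a 30 m path
```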
I don't think optical renderers (even physics-based ones) consider diffraction, but I can ignore that for the moment.
I have no idea whether this is even possible, so I'm asking the LuxCore community for feedback on whether these few points could be achieved more or less easily and, if so, for some references to manuals/APIs that I could study further.

Your opinion will be greatly appreciated.
User avatar
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: Using LuxCoreRender for Radio Frequencies

Post by Dade »

It may be a bit of a stretch, but anyway:
mattia wrote: Fri Apr 03, 2020 8:52 am * Define transmitters (TXs) and receivers (RXs) nodes. I guess TXs are equivalent to light sources and RXs to cameras, but I would need 360° cameras. Also, TXs and RXs should be points with no size
Point light sources and the environment camera (a 360° panoramic camera) may work:
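In SDL terms, that setup should look roughly like this (a sketch: positions are placeholders, and the exact syntax should be checked against the SDL reference):

```
# Point light as the zero-size TX
scene.lights.tx1.type = "point"
scene.lights.tx1.position = 0.0 0.0 2.0

# 360-degree environment camera as the RX
scene.camera.type = "environment"
scene.camera.lookat.orig = 5.0 0.0 1.5
```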

mattia wrote: Fri Apr 03, 2020 8:52 am * Obtain output files containing information about every single received ray, such as traveled distance/delay, received power, angles of departure and arrival
The depth AOV and HDR output may work; I assume the angle of arrival can be deduced from the 360-degree image.
mattia wrote: Fri Apr 03, 2020 8:52 am * Possibly, access to the render engine programmatically based on a pre-built Blender CAD scene
All the sources are available, but we are talking about hundreds of thousands of lines of code; it is not easy to start from zero and work on them.
mattia wrote: Fri Apr 03, 2020 8:52 am * Ideally, create an interface (an add-on?) specifically for RF ray-tracing and make it available to the community which allows the complete workflow to stay on Blender, i.e., (i) scene creation, (ii) material settings, (iii) RX/TX positioning (possibly moving), (iv) ray-tracing computation, (v) ray visualization.
Again, with all sources available it is possible but it is not easy and not a small amount of work.
mattia wrote: Fri Apr 03, 2020 8:52 am I have no idea whether this is even possible, so I'm asking the LuxCore community for feedback on whether these few points could be achieved more or less easily and, if so, for some references to manuals/APIs that I could study further.
I suggest you do some tests using just what is currently available (i.e. Blender+LuxCore) and check whether the results are roughly in the range you need.
Only after that would I start considering writing code, customizing things, etc., because it is not going to be a small endeavor.
Support LuxCoreRender project with salts and bounties
CodeHD
Donor
Posts: 437
Joined: Tue Dec 11, 2018 12:38 pm
Location: Germany

Re: Using LuxCoreRender for Radio Frequencies

Post by CodeHD »

I just had a brief look at the depth channel, since I find this an interesting topic, too.

I set up a simple test scene (attached) with a point source, a camera, and three matte planes. Two are angled up towards the light source, i.e. a single bounce to the camera; the other is angled down and receives light via a mirror.

Using hybrid path, this is the depth AOV I get (saved to EXR and displayed with Python).
depthoutput.png
The values do not make sense to me. No matter whether it is camera-to-object, light-to-object, or the total sum, the value should get bigger towards the bottom right, not smaller. The units also don't seem to reflect the distances.

Dade, can you clarify what depth AOV is supposed to show?
Attachments
depthtest.blend
(625.56 KiB) Downloaded 191 times
User avatar
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: Using LuxCoreRender for Radio Frequencies

Post by Dade »

CodeHD wrote: Fri Apr 03, 2020 1:32 pm The values do not make sense to me. No matter whether it is camera-to-object, light-to-object, or the total sum, the value should get bigger towards the bottom right, not smaller. The units also don't seem to reflect the distances.
You are using light tracing only:

Code: Select all

path.hybridbackforward.enable = 1
path.hybridbackforward.partition = 0
It doesn't make any sense (how can LuxCore estimate the distance from the eye ... if it doesn't trace eye paths at all!). It is a trick to have something similar to a caustic AOV, nothing more. Just disable light tracing, or trace at least some eye paths.
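For example, either of these (the `partition` value is presumably the fraction of work spent on standard eye-path tracing; treat the numbers as a sketch):

```
# Disable light tracing entirely...
path.hybridbackforward.enable = 0

# ...or keep it, but also trace some eye paths
path.hybridbackforward.enable = 1
path.hybridbackforward.partition = 0.8
```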
CodeHD
Donor
Posts: 437
Joined: Tue Dec 11, 2018 12:38 pm
Location: Germany

Re: Using LuxCoreRender for Radio Frequencies

Post by CodeHD »

It's the same with a 50% ratio, and also with pure path tracing; only that the third plane is not visible at all in the latter case, as expected.

Also, I was doing it with OCL, so light-tracing-only should have affected only the CPU, no?

In any case, are you saying that the depth AOV only gives the distance between the camera and the first object intersection? In that case, I don't think it would be what mattia needs.
User avatar
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: Using LuxCoreRender for Radio Frequencies

Post by Dade »

CodeHD wrote: Fri Apr 03, 2020 1:58 pm In any case, you are saying that depth-AOV only gives the distance between camera and first object intersection?
It is the very same definition as the Z-buffer, what else could it be?

P.S. it is the distance between the image plane (not the camera) and the first object intersection.
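For anyone who wants the euclidean hit distance instead: with a perspective camera, the plane-perpendicular depth can be divided by the cosine of the angle between the pixel's ray and the view axis (a sketch for a pinhole camera; this does not carry over directly to the environment camera):

```python
import math

def euclidean_hit_distance(depth, ray_dir, view_dir):
    """Convert a plane-perpendicular depth AOV value into the euclidean
    distance from the image plane to the hit point.

    depth    -- depth AOV value (perpendicular to the image plane)
    ray_dir  -- normalized direction of the pixel's ray
    view_dir -- normalized camera view axis
    """
    cos_theta = sum(r * v for r, v in zip(ray_dir, view_dir))
    return depth / cos_theta

# A ray 45 degrees off-axis hitting at depth 1.0:
d = euclidean_hit_distance(1.0, (math.sqrt(0.5), 0.0, math.sqrt(0.5)), (0.0, 0.0, 1.0))
print(d)  # ~1.414 (i.e. sqrt(2))
```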
mattia
Posts: 5
Joined: Fri Apr 03, 2020 8:41 am

Re: Using LuxCoreRender for Radio Frequencies

Post by mattia »

Thank you very much for your quick responses!
I must say that I'm not very familiar with graphics rendering lingo, and I don't have much experience with Blender.
I have only used it to generate some basic custom scenarios (rooms, OpenMap-based, etc.), and only used Object and Edit mode for this.

Anyway, if this idea turns out to make sense, I can invest significant resources in it.

Dade wrote: Fri Apr 03, 2020 11:41 am It may be a bit stretched but anyway:
mattia wrote: Fri Apr 03, 2020 8:52 am * Obtain output files containing information about every single received ray, such as traveled distance/delay, received power, angles of departure and arrival
The depth AOV and HDR output may work; I assume the angle of arrival can be deduced from the 360-degree image.
I'm not sure, but I think the depth AOV would only work for objects that can be seen directly, right? I'm also interested in rays bouncing off surfaces, but the total travelled distance starting from the light source should be tracked, not just the portion starting from the last reflection. Could this information be obtained? The total travelled distance would be needed to obtain both the total path loss and the wave's phase shift.
I guess I could indeed obtain the arrival angle by looking at the image, but I would still need to couple it with a departure angle.
Simply swapping camera and light source could give me both, but then the coupling would not be trivial (if even possible...).
Do you happen to know of a better way to obtain this information?
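For reference, both quantities follow directly from the total path length via the standard free-space relations (a pure-reflection path would additionally multiply in reflection coefficients):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def free_space_path_loss_db(distance_m, freq_hz):
    """Friis free-space path loss in dB: 20*log10(4*pi*d/lambda)."""
    lam = C / freq_hz
    return 20.0 * math.log10(4.0 * math.pi * distance_m / lam)

def phase_shift_rad(distance_m, freq_hz):
    """Carrier phase shift over the travelled distance, modulo 2*pi."""
    lam = C / freq_hz
    return (2.0 * math.pi * distance_m / lam) % (2.0 * math.pi)

# 100 m at 2.4 GHz:
print(free_space_path_loss_db(100.0, 2.4e9))  # ~80 dB
```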
Dade wrote: Fri Apr 03, 2020 11:41 am
mattia wrote: Fri Apr 03, 2020 8:52 am * Possibly, access to the render engine programmatically based on a pre-built Blender CAD scene
All the sources are available but we are talking of hundred of thousands lines of code, it is not easy to start from zero and work on them.
I was actually asking whether there exists a Blender/LuxCore API that I can access programmatically.
I'm thinking of batches of:
  1. Rendering scenarios on the department server, to which I have no graphical access and thus couldn't use Blender with mouse and keyboard
  2. Rendering a large number of scenarios with different parameters
  3. Swapping cameras/light sources in order to also obtain the interfering channels among different nodes, which would be annoying and prone to mistakes to do manually
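Since LuxCore scenes are described by plain-text properties, I imagine the swaps in point 3 could be scripted by generating scene fragments. A hypothetical helper (property names as I understand them from the SDL reference; the file handling is just a guess):

```python
def scene_properties(tx_pos, rx_pos):
    """Return LuxCore SDL lines placing a point-light TX and an
    environment-camera RX at the given (x, y, z) positions."""
    return "\n".join([
        'scene.camera.type = "environment"',
        "scene.camera.lookat.orig = {} {} {}".format(*rx_pos),
        'scene.lights.tx.type = "point"',
        "scene.lights.tx.position = {} {} {}".format(*tx_pos),
    ])

# One scene fragment per (TX, RX) pair in a parameter sweep:
placements = [((0, 0, 2), (5, 0, 1.5)), ((0, 0, 2), (10, 0, 1.5))]
for i, (tx, rx) in enumerate(placements):
    sdl = scene_properties(tx, rx)
    # with open(f"scenario_{i}.scn", "w") as f: f.write(sdl)
    print(sdl.splitlines()[1])
```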
Dade wrote: Fri Apr 03, 2020 11:41 am
mattia wrote: Fri Apr 03, 2020 8:52 am * Ideally, create an interface (an add-on?) specifically for RF ray-tracing and make it available to the community which allows the complete workflow to stay on Blender, i.e., (i) scene creation, (ii) material settings, (iii) RX/TX positioning (possibly moving), (iv) ray-tracing computation, (v) ray visualization.
Again, with all sources available it is possible but it is not easy and not a small amount of work.
I'm not saying that I would modify the core of the render engine (although I already have experience with large software projects), but rather create a sort of wrapper around it, like a Blender add-on, I guess.


Thank you, CodeHD, for joining the conversation so eagerly! You are indeed right: I am not interested in simply obtaining the distance to the first object intersection (from the camera's POV), but rather in following the full path, from light source to destination.
Actually, having the full list of reflection points along each path would be even more useful; is there a way to obtain this information directly?
CodeHD
Donor
Posts: 437
Joined: Tue Dec 11, 2018 12:38 pm
Location: Germany

Re: Using LuxCoreRender for Radio Frequencies

Post by CodeHD »

Dade wrote: Fri Apr 03, 2020 2:38 pm It is the very same definition of Z-buffer, what else could it be ?

P.S. it is the distance between the image plane (not the camera) and the first object intersection.
The question was whether it could be the integrated ray distance (as mattia asked for), but I hadn't thought far enough to realize that it would not make sense, or at least not be universal, for infinite light sources (Sun, etc.).

But even with the z-buffer-to-image-plane definition, it does not make sense that there is a left-to-right gradient in the depth.

mattia wrote: Fri Apr 03, 2020 2:47 pm I was actually asking if there exist a Blender/LuxCore API that I can access programmatically.
LuxCore has an API, and the Blender add-on builds on it as well. So you can definitely access anything programmatically that Blender can. Like Dade said, it's more a matter of how much work that will be for you...
mattia wrote: Fri Apr 03, 2020 2:47 pm I'm not sure, but I think the depth AOV would only work for objects that can be seen directly, right? I'm also interested in rays bouncing off surfaces, but the total travelled distance starting from the light source should be tracked, not just the portion starting from the last reflection. Could this information be obtained? The total travelled distance would be needed to obtain both the total path loss and the wave's phase shift.
I guess I could indeed obtain the arrival angle by looking at the image, but I would still need to couple it with a departure angle.
Simply reversing camera and light source could give me both, but then the coupling would not be trivial (if even possible..).
Do you happen to know of a better way to obtain these information?
No, as discussed above, it would not. What is your actual need to compute different wavelengths and different media? Are you looking at ionospheric refraction, or is this more about WiFi/mobile coverage etc.?
Implementing a ray distance might be one thing, but refractive indices would need to be considered as well.
mattia wrote: Fri Apr 03, 2020 2:47 pm Actually, having the full list of reflection points along its path would be even more useful, is there a way to obtain directly this information?
Same as above, you would probably need to implement this.
mattia
Posts: 5
Joined: Fri Apr 03, 2020 8:41 am

Re: Using LuxCoreRender for Radio Frequencies

Post by mattia »

CodeHD wrote: Fri Apr 03, 2020 3:38 pm
Dade wrote: Fri Apr 03, 2020 2:38 pm It is the very same definition of Z-buffer, what else could it be ?

P.S. it is the distance between the image plane (not the camera) and the first object intersection.
The question was whether it could be the integrated ray distance (as mattia asked for), but I hadn't thought far enough to realize that it would not make sense, or at least not be universal, for infinite light sources (Sun, etc.).
I'm not interested in universal or infinite light sources, just the point sources that I set up.
Sun radiation is negligible at microwave frequencies, and thus is never taken into account.
CodeHD wrote: Fri Apr 03, 2020 3:38 pm
mattia wrote: Fri Apr 03, 2020 2:47 pm I was actually asking if there exist a Blender/LuxCore API that I can access programmatically.
LuxCore has an API, the Blender addon builds on it as well. So you can definitely access anythng programatically that Blender can. Like Dade said, its more a matter of how much work that will be for you...
mattia wrote: Fri Apr 03, 2020 2:47 pm I'm not sure, but I think that depth AOV would only work for objects that can be seen directly, right? I'm also interested in rays bouncing off surfaces, but the total travelled distance starting from the light source should be tracked, not just the the portion starting from the last reflection. Could this information be obtained? Total travelled distance would be needed to obtain both total power path loss and wave phase shift.
I guess I could indeed obtain the arrival angle by looking at the image, but I would still need to couple it with a departure angle.
Simply reversing camera and light source could give me both, but then the coupling would not be trivial (if even possible..).
Do you happen to know of a better way to obtain these information?
No, as discussed above, it would not. What is your actual need to compute different wavelengths and different media? Are you looking at ionospheric refraction, or is this more about WiFi/mobile coverage etc.?
Implementing a ray distance might be one thing, but refractive indices would need to be considered as well.
The narrow-band approximation is usually made anyway, so just a single wavelength (typically on the order of millimeters up to a few centimeters, depending on the frequency band of interest).
We typically assume empty space as the main medium and basically just reflecting surfaces.
Low-frequency waves passing through materials such as walls or organic matter could probably be modeled with transparent materials, but I consider that a further complication that I don't want to deal with, at least for the moment.
I usually work on anything from indoor scenarios up to cellular/mobile coverage, so no more than a kilometer. Varying refractive indices and the ionosphere are never considered in these cases.
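For scale, the wavelengths I mean follow directly from λ = c/f:

```python
C = 299_792_458.0  # speed of light, m/s

def wavelength_mm(freq_hz):
    """Free-space wavelength in millimeters for a given carrier frequency."""
    return C / freq_hz * 1000.0

print(round(wavelength_mm(28e9), 1))   # 28 GHz (mmWave): ~10.7 mm
print(round(wavelength_mm(2.4e9), 1))  # 2.4 GHz (WiFi): ~124.9 mm
```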
CodeHD wrote: Fri Apr 03, 2020 3:38 pm
mattia wrote: Fri Apr 03, 2020 2:47 pm Actually, having the full list of reflection points along its path would be even more useful, is there a way to obtain directly this information?
Same as above, you would probably need to implement this.
I guess I'll have to take a look at the APIs then. If you have any references to manuals/APIs/quick-start guides, or anything useful when starting from scratch as I am, it will be appreciated!
CodeHD
Donor
Posts: 437
Joined: Tue Dec 11, 2018 12:38 pm
Location: Germany

Re: Using LuxCoreRender for Radio Frequencies

Post by CodeHD »

mattia wrote: Fri Apr 03, 2020 4:17 pm I guess I'll have to take a look at the APIs then. If you have any references to manuals/APIs/quick-start guides, or anything useful when starting from scratch as I am, it will be appreciated!
You could start with the wiki:

https://wiki.luxcorerender.org/LuxCoreRender_Wiki

and therein the links to the API and SDL reference manuals.
Post Reply