Thank you very much for your quick responses!
I must say that I'm not very familiar with graphics-rendering lingo and I don't have much experience with Blender.
I've only used it to generate some basic custom scenarios (rooms, OpenMap-based, etc.), and only worked in Object and Edit mode for this.
Anyway, if this idea turns out to make sense I can invest significant resources in it.
Dade wrote: ↑Fri Apr 03, 2020 11:41 am
It may be a bit stretched but anyway:
mattia wrote: ↑Fri Apr 03, 2020 8:52 am
* Obtain output files containing information about every single received ray, such as traveled distance/delay, received power, angles of departure and arrival
Depth AOV and HDR output may work, I assume the angle of arrival can be deduced by the 360 degree image.
I'm not sure, but I think that a depth AOV would only work for objects that are directly visible, right? I'm also interested in rays bouncing off surfaces, but then the total distance travelled from the light source should be tracked, not just the portion after the last reflection. Could this information be obtained? The total travelled distance is needed to compute both the total path loss and the wave's phase shift.
I guess I could indeed obtain the arrival angle by looking at the image, but I would still need to couple it with a departure angle.
Simply swapping the camera and the light source could give me both, but then the coupling would not be trivial (if it is even possible...).
Do you happen to know a better way to obtain this information?
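Just to clarify why I need the total travelled distance d (and not only the last segment): under free-space propagation, both the path loss and the carrier phase shift depend on d. A minimal sketch of my own, purely for illustration (not LuxCore code):

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def fspl_db(d_m: float, f_hz: float) -> float:
    """Free-space path loss in dB over total travelled distance d_m at frequency f_hz."""
    lam = C / f_hz  # wavelength [m]
    return 20.0 * math.log10(4.0 * math.pi * d_m / lam)

def phase_shift_rad(d_m: float, f_hz: float) -> float:
    """Carrier phase shift (mod 2*pi) accumulated over distance d_m."""
    lam = C / f_hz
    return (2.0 * math.pi * d_m / lam) % (2.0 * math.pi)

# Example: a 15 m total path at 2.4 GHz
print(fspl_db(15.0, 2.4e9))  # roughly 63.6 dB
```

This is why knowing only the distance from the last bounce to the camera would not be enough for my purposes.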
Dade wrote: ↑Fri Apr 03, 2020 11:41 am
mattia wrote: ↑Fri Apr 03, 2020 8:52 am
* Possibly, access to the render engine programmatically based on a pre-built Blender CAD scene
All the sources are available but we are talking of hundred of thousands lines of code, it is not easy to start from zero and work on them.
I was actually asking whether there exists a Blender/LuxCore API that I can access programmatically.
I'm thinking of batch use cases such as:
- Rendering scenarios on the department server, to which I have no graphical access, so I couldn't use Blender with mouse and keyboard
- Rendering a large number of scenarios with different parameters
- Possibly swapping cameras/light sources in order to also obtain the interfering channels among different nodes, which would be tedious and error-prone to do manually
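To make the batch idea concrete, this is the kind of driver script I have in mind: enumerate every ordered (TX, RX) node pair and generate one configuration per pair, then hand each to the renderer. Everything below is a sketch; the node names and the property strings are placeholders for whatever the real API exposes, not actual LuxCore syntax.

```python
import itertools

# Hypothetical node positions (x, y, z) in the scene; TX plays the light source,
# RX plays the camera.
NODES = {
    "node_a": (0.0, 0.0, 1.5),
    "node_b": (4.0, 2.0, 1.5),
    "node_c": (8.0, 5.0, 1.5),
}

def make_config(tx: str, rx: str) -> str:
    """Build a text configuration placing the light at TX and the camera at RX.

    The property names below are illustrative placeholders, not real LuxCore SDL.
    """
    tx_pos = " ".join(str(v) for v in NODES[tx])
    rx_pos = " ".join(str(v) for v in NODES[rx])
    return "\n".join([
        f"scene.lights.tx.position = {tx_pos}",
        f"scene.camera.position = {rx_pos}",
    ])

# Every ordered pair covers both the direct and all interfering channels.
configs = {
    (tx, rx): make_config(tx, rx)
    for tx, rx in itertools.permutations(NODES, 2)
}
print(len(configs))  # 3 nodes -> 6 ordered pairs
```

Each generated configuration would then be rendered headlessly on the server, which is exactly the mouse-and-keyboard-free workflow I'm after.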
Dade wrote: ↑Fri Apr 03, 2020 11:41 am
mattia wrote: ↑Fri Apr 03, 2020 8:52 am
* Ideally, create an interface (an add-on?) specifically for RF ray-tracing and make it available to the community which allows the complete workflow to stay on Blender, i.e., (i) scene creation, (ii) material settings, (iii) RX/TX positioning (possibly moving), (iv) ray-tracing computation, (v) ray visualization.
Again, with all sources available it is possible but it is not easy and not a small amount of work.
I'm not saying that I would modify the core of the render engine (although I do have experience with large software projects), but rather create a sort of wrapper around it, like a Blender add-on, I guess.
Thank you, CodeHD, for joining the conversation so eagerly! You are indeed right: I am not interested in simply obtaining the distance to the first object intersection (from the camera POV), but rather in following the full path, from light source to destination.
Actually, having the full list of reflection points along the path would be even more useful. Is there a way to obtain this information directly?
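To illustrate what I would do with such a list: given the ordered reflection points of a ray, the per-hop segment lengths (and hence the total travelled distance I mentioned above) follow directly. A toy sketch of my own, not LuxCore output:

```python
import math

def path_metrics(points):
    """Given the ordered points of a ray path (source, reflections..., receiver),
    return the per-segment lengths and the total travelled distance."""
    segments = [math.dist(p, q) for p, q in zip(points, points[1:])]
    return segments, sum(segments)

# Source at the origin, one wall bounce, then the receiver:
path = [(0.0, 0.0, 0.0), (3.0, 4.0, 0.0), (6.0, 0.0, 0.0)]
segments, total = path_metrics(path)
print(segments, total)  # two 5 m segments -> 10 m in total
```

With the reflection points available, the departure and arrival angles could also be read off the first and last segments, which would solve the camera/light coupling problem I described earlier.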