CGI tech news box

General computer graphics news and papers.
FarbigeWelt
Donor
Posts: 1046
Joined: Sun Jul 01, 2018 12:07 pm
Location: Switzerland

Re: CGI tech news box

Post by FarbigeWelt »

lacilaci wrote: Thu Dec 05, 2019 9:51 am
1) Also, those examples you took from the pdf look terrible, but maybe they're underselling this tech?

2) I'd rather see some easy to do full capture system.
Look what xrite does here:
https://www.xrite.com/categories/appear ... -ecosystem
1) Do you think so? Well, they are highly educated IT researchers at a university; they study physical effects and write their own renderer and its enhancements. They are not artists or color-perfection workflow engineers.

Also, as an answer to your latest post, have a look at this instrument setup:
(attachment: 3BFCA246-C363-4E28-8C9D-18126433E8CC.png)
This instrument illuminates the sample and captures radiance over the full 360° × 180° range of directions. They also use a dedicated beam to support interpolation, because capturing all directions at a good angular resolution would take a long time.
The open-source idea is great, but a Kickstarter project to develop this in Switzerland or Germany would need to raise several hundred thousand euros, I guess.

2) Nice marketing site, as is to be expected from a Danaher company that owns Pantone and Munsell, the world's most famous color systems for reproducible colors. I am not sure how they capture anisotropy from their flat material samples well enough to match the quality of a measured BSDF. I guess the X-Rite ecosystem for CAD costs several tens of thousands of euros. That makes sense for high-gloss PR renderings of cars or full-feature CGI movies, and maybe for fast prototyping of sports articles like running shoes.
Light and Word designing Creator - www.farbigewelt.ch - aka quantenkristall || #luxcorerender
MacBook Air with M1
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: CGI tech news box

Post by Dade »

lacilaci wrote: Thu Dec 05, 2019 10:04 am I think that it would be cool if someone would make an open source scanning solution and format
MDL SDK is open source: https://github.com/NVIDIA/MDL-SDK

And it is trivial to integrate in C++ or CUDA (though there is no OpenCL support). That was one of the interesting aspects.
Support LuxCoreRender project with salts and bounties
lacilaci
Donor
Posts: 1969
Joined: Fri May 04, 2018 5:16 am

Re: CGI tech news box

Post by lacilaci »

Dade wrote: Thu Dec 05, 2019 10:59 am
lacilaci wrote: Thu Dec 05, 2019 10:04 am I think that it would be cool if someone would make an open source scanning solution and format
MDL SDK is open source: https://github.com/NVIDIA/MDL-SDK

And it is trivial to integrate in C++ or CUDA (though there is no OpenCL support). That was one of the interesting aspects.
Why isn't it widely adopted yet?
lacilaci
Donor
Posts: 1969
Joined: Fri May 04, 2018 5:16 am

Re: CGI tech news box

Post by lacilaci »

FarbigeWelt wrote: Thu Dec 05, 2019 10:33 am
1) Do you think so? Well, they are highly educated IT researchers at a university; they study physical effects and write their own renderer and its enhancements. They are not artists or color-perfection workflow engineers.
Well that doesn't make those examples more appealing to me...

Anyway, what I think would be best, and what I hope we will see in the near future, is something in between.
So instead of a scientifically perfect and incredibly expensive capture contraption on one side, and complete guesswork with a bunch of sliders on the other, we would take a few photos and let an AI deduce the parameters, to the point where the "guesswork" is so good that the human eye can't tell the difference from an accurate, full-on capture.
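Just to make the idea concrete, here is a toy sketch of that kind of "good guesswork" (everything below, the model and the names, is invented purely for illustration; a real system would use a proper differentiable renderer or a trained network and real photos): a tiny analytic highlight model is fitted to a fake "photo" by gradient descent.

Code: Select all

# Toy illustration: fit albedo and roughness of a crude analytic highlight
# model to a fake "photo" by gradient descent. The model and all names are
# invented for illustration only.
import torch

def render_patch(albedo, roughness, n_dot_h):
    # crude diffuse + specular lobe; lower roughness gives a tighter highlight
    spec_power = 2.0 / (roughness ** 2 + 1e-4)
    return albedo + (1.0 - albedo) * n_dot_h ** spec_power

# fake "photo": the same model rendered with unknown ground-truth values
n_dot_h = torch.linspace(0.05, 1.0, 256)   # cosine samples across the highlight
photo = render_patch(torch.tensor(0.25), torch.tensor(0.15), n_dot_h)

# start from a blind guess and let the optimizer do the "guesswork"
albedo = torch.tensor(0.5, requires_grad=True)
roughness = torch.tensor(0.5, requires_grad=True)
optimizer = torch.optim.Adam([albedo, roughness], lr=0.05)

for step in range(500):
    optimizer.zero_grad()
    loss = torch.mean((render_patch(albedo, roughness, n_dot_h) - photo) ** 2)
    loss.backward()
    optimizer.step()

print(albedo.item(), roughness.item())  # should end up close to 0.25 and 0.15

The same loop scales to more parameters (anisotropy, sheen, normal maps) as long as the forward model stays differentiable; the learned approaches essentially replace or accelerate this optimization with a neural network.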
lacilaci
Donor
Posts: 1969
Joined: Fri May 04, 2018 5:16 am

Re: CGI tech news box

Post by lacilaci »

Something like this would be nice
https://luxion.atlassian.net/wiki/space ... 13637/Fuzz

It would probably also be faster than creating and exporting hair for some basic "fuzzy" objects (rugs, towels, etc.). What do you think?
Dade
Developer
Posts: 5672
Joined: Mon Dec 04, 2017 8:36 pm
Location: Italy

Re: CGI tech news box

Post by Dade »

lacilaci wrote: Sat Jan 11, 2020 2:05 pm Something like this would be nice
https://luxion.atlassian.net/wiki/space ... 13637/Fuzz

It would probably also be faster than creating and exporting hair for some basic "fuzzy" objects (rugs, towels, etc.). What do you think?
This can fit well in the new geometry pipeline where you throw in a surface and get out a surface+fur/etc.

However, like all geometry-related stuff, is it better to use the geometry pipeline or Blender directly? I mean, is doing something like "Fuzz" in Blender slow/hard/complex/etc.?
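Roughly what I mean by "surface in, surface + fur out", as a toy sketch (this is not the actual shape-plugin interface, just an illustration of the idea; all names are made up): given a triangle mesh, scatter short, slightly jittered strands over each triangle at scene-build time instead of exporting them from Blender.

Code: Select all

# Toy sketch of a "fuzz" shape: scatter short strands over a triangle mesh.
# Illustrative only; not the real LuxCore shape/plugin API.
import numpy as np

def make_fuzz(vertices, triangles, strands_per_tri=50, length=0.01, lean=0.3, seed=0):
    rng = np.random.default_rng(seed)
    strands = []
    for i0, i1, i2 in triangles:
        a, b, c = vertices[i0], vertices[i1], vertices[i2]
        normal = np.cross(b - a, c - a)
        normal /= np.linalg.norm(normal)
        # uniform barycentric samples on the triangle for the strand roots
        u = rng.random((strands_per_tri, 2))
        flip = u.sum(axis=1) > 1.0
        u[flip] = 1.0 - u[flip]
        roots = a + u[:, :1] * (b - a) + u[:, 1:] * (c - a)
        # each strand: root point plus a slightly randomized direction and length
        dirs = normal + rng.normal(scale=lean, size=(strands_per_tri, 3))
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
        tips = roots + dirs * length * rng.uniform(0.5, 1.0, (strands_per_tri, 1))
        strands.append(np.stack([roots, tips], axis=1))   # 2 control points per strand
    return np.concatenate(strands)                        # shape (N, 2, 3)

# example: fuzz on a single unit triangle
verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0]], dtype=float)
print(make_fuzz(verts, [(0, 1, 2)]).shape)   # (50, 2, 3)

Generated natively like this, the strands never have to leave the renderer, which is exactly where the export time and memory go today.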
Support LuxCoreRender project with salts and bounties
lacilaci
Donor
Posts: 1969
Joined: Fri May 04, 2018 5:16 am

Re: CGI tech news box

Post by lacilaci »

Dade wrote: Sat Jan 11, 2020 5:31 pm
lacilaci wrote: Sat Jan 11, 2020 2:05 pm Something like this would be nice
https://luxion.atlassian.net/wiki/space ... 13637/Fuzz

It would probably also be faster than creating and exporting hair for some basic "fuzzy" objects (rugs, towels, etc.). What do you think?
This can fit well in the new geometry pipeline where you throw in a surface and get out a surface+fur/etc.

However, like all geometry-related stuff, is it better to use the geometry pipeline or Blender directly? I mean, is doing something like "Fuzz" in Blender slow/hard/complex/etc.?
Not sure. While a "long hair" object is fast enough to make and export, a "fuzz"-like object, short but very dense hair, is super slow and easily runs out of memory. I would not consider this a super important feature, but I'd guess it would be faster to have LuxCore generate it at render time than to have Blender generate and export it. Keyshot should have a livestream at the end of the month explaining the advantages of this feature, but I'm not sure if any of it is applicable to LuxCore.
B.Y.O.B.
Developer
Posts: 4146
Joined: Mon Dec 04, 2017 10:08 pm
Location: Germany

Re: CGI tech news box

Post by B.Y.O.B. »

Dade wrote: Sat Jan 11, 2020 5:31 pm However, like all geometry-related stuff, is it better to use the geometry pipeline or Blender directly? I mean, is doing something like "Fuzz" in Blender slow/hard/complex/etc.?
Hair export from Blender is partly done in Python, which slows the whole thing down (and this can't be changed). There is also the conversion from Blender's data arrays to the LuxCore hair structure, plus tessellation. The latter is done in C++, but on hundreds of thousands of hair strands it still takes some time.
A native hair shape like the one suggested would eliminate both of these conversion steps, I think.
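For reference, the Python-side cost looks roughly like this (the names are invented, this is not the real exporter code): every control point of every strand is touched in a pure-Python loop before the flat arrays can be handed over to C++.

Code: Select all

# Rough illustration of the Python-side conversion cost: every strand's points
# are copied one by one into a flat array before the C++ side can build and
# tessellate the hair shape. Names are invented; this is not the real exporter.
import numpy as np

def flatten_strands(particle_systems):
    points, strand_sizes = [], []
    for strands in particle_systems:      # e.g. one list of strands per emitter
        for strand in strands:            # pure-Python loops = the slow part
            for x, y, z in strand:        # each control point, one by one
                points.extend((x, y, z))
            strand_sizes.append(len(strand))
    return np.asarray(points, dtype=np.float32), strand_sizes

# e.g. two strands with two control points each:
pts, sizes = flatten_strands([[[(0, 0, 0), (0, 0, 1)], [(1, 0, 0), (1, 0, 1)]]])
print(pts.shape, sizes)   # (12,) [2, 2]

A native fuzz/hair shape built inside the renderer would skip this loop and the subsequent conversion entirely.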
lacilaci
Donor
Posts: 1969
Joined: Fri May 04, 2018 5:16 am

Re: CGI tech news box

Post by lacilaci »

So... how does this blood magic work?
https://www.youtube.com/watch?v=M1nuviu ... mv02U-YR8o
Sharlybg
Donor
Posts: 3101
Joined: Mon Dec 04, 2017 10:11 pm
Location: Ivory Coast

Re: CGI tech news box

Post by Sharlybg »

lacilaci wrote: Wed Jan 22, 2020 12:08 pm So... how does this blood magic work?
https://www.youtube.com/watch?v=M1nuviu ... mv02U-YR8o
Sometimes this guy from FStorm looks like the Devil.
Support LuxCoreRender project with salts and bounties

Portfolio : https://www.behance.net/DRAVIA