
Re: Nvidia Turing

Posted: Wed Aug 22, 2018 9:12 am
by Sharlybg
Just to be clear: this allows the use of OptiX with open source projects, but it still requires an NVIDIA GPU since OptiX only works there; it is mandatory, not an option ... not exactly a small detail.

I'm sure NVIDIA really likes this solution :lol:
These guys are just money hunters, nothing more :|

Re: Nvidia Turing

Posted: Wed Aug 22, 2018 12:23 pm
by kintuX
Sharlybg wrote: Wed Aug 22, 2018 9:12 am
Just to be clear: this allows the use of OptiX with open source projects, but it still requires an NVIDIA GPU since OptiX only works there; it is mandatory, not an option ... not exactly a small detail.

I'm sure NVIDIA really likes this solution :lol:
These guys are just money hunters, nothing more :|
Sadly, IMHO, they seem much worse. We have a saying here: where good people build a church, the devil builds a chapel.
They have found a way to poison the source.
Also, many seats/job positions will become redundant, and blind gamers are happy to support the further decline of human ingenuity, artistry and imagination... just to please their own weak egos under the influence of pretentious tech gods. It is the same pattern you see with junkies and addicts - fanaticism. That is what saddens me: this could have been done long ago and open sourced, but that is what profit in capitalism stands for - exploitation of the weak. And the gap widens.

Now on to waiting for AMD's 7nm parts, possibly with some extras in the ray tracing and AI departments... similarly, Apple is showing quite a rise in Metal 2 performance (just from code optimization on the dev side). We'll see after things settle down.

Re: Nvidia Turing

Posted: Thu Aug 23, 2018 8:57 am
by FarbigeWelt
Nice NVIDIA demo, but it is only a demo. Some people have tested the so-called real-time ray tracing; in fact it is not feasible yet. You need at least 4 to 16 samples per pixel, and as we know that is not much. They have optimized a lot.

Let's wait for cards with more than 100 TFLOPS of 32-bit float performance and a TDP below 250 W.

Re: Nvidia Turing

Posted: Sun Sep 16, 2018 5:40 pm
by kintuX

Re: Nvidia Turing

Posted: Fri Sep 21, 2018 5:32 pm
by lacilaci
I think that people will soon "discover" the difference between ray tracing and path tracing and so on. I'm a noob, but come on, an insane number of megarays per second... for what? A shadow or a specific single-bounce reflection?
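
Just as a rough back-of-the-envelope along those lines (every number below is an assumption picked for illustration: resolution, samples per pixel, path length and the ~10 gigarays/s peak figure NVIDIA quotes), the advertised ray throughput gets divided across every sample and every bounce of every pixel:

Code:

// Back-of-the-envelope: how far does a quoted peak ray throughput go for path tracing?
// All inputs are assumptions for illustration, not measurements.
#include <cstdio>

int main() {
    const double width         = 1920.0;
    const double height        = 1080.0;
    const double spp           = 16.0;  // samples per pixel (already low for offline work)
    const double bounces       = 4.0;   // assumed average path length
    const double raysPerBounce = 2.0;   // one extension ray + one shadow ray (assumed)
    const double raysPerSecond = 10e9;  // often-quoted peak figure for Turing (assumed)

    const double raysPerFrame = width * height * spp * bounces * raysPerBounce;
    std::printf("Rays per frame: %.3g\n", raysPerFrame);
    std::printf("Frames per second at peak throughput: %.1f\n", raysPerSecond / raysPerFrame);
    return 0;
}

And that peak figure ignores shading and denoising cost entirely, so the real numbers would be lower still.
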
For some time I was relying on a render farm for like 80-90% of my renderings (Rebus, to be precise), but with 2.8 or the LuxCore alpha that's not much of an option until there are official releases. So I'm thinking of upgrading my old workstation (4770K + 1060) to some Ryzen or Threadripper with a 2070 or 2080, but with NVIDIA I keep thinking that all this ray tracing talk might not be too useful for offline rendering.
Can someone well educated tell me if I'm wrong and this is the best time to upgrade?

Re: Nvidia Turing

Posted: Fri Sep 21, 2018 8:14 pm
by Dade
lacilaci wrote: Fri Sep 21, 2018 5:32 pm Can someone well educated tell me if I'm wrong and this is the best time to upgrade?
You could just upgrade everything but the GPU and wait to see what happens. That is probably what I'm going to do.

Vulkan has an NVIDIA extension to expose the new ray tracing "hardware" (if there is any real hardware...). It is the only vaguely viable option at the moment for anyone not using CUDA.
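
For anyone curious, a minimal sketch (not LuxCore code, just an illustration) of checking whether a Vulkan physical device exposes that VK_NV_ray_tracing extension; it assumes a VkPhysicalDevice has already been selected and omits error handling:

Code:

// Check whether a device exposes the VK_NV_ray_tracing extension.
// Assumes Vulkan is initialised and 'device' is a valid VkPhysicalDevice.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool hasNvRayTracing(VkPhysicalDevice device) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(device, nullptr, &count, nullptr);

    std::vector<VkExtensionProperties> extensions(count);
    vkEnumerateDeviceExtensionProperties(device, nullptr, &count, extensions.data());

    for (const VkExtensionProperties &ext : extensions) {
        if (std::strcmp(ext.extensionName, "VK_NV_ray_tracing") == 0)
            return true;  // the driver exposes the NVIDIA ray tracing extension
    }
    return false;
}

If it is present, the actual work then goes through the extension's acceleration structure and vkCmdTraceRaysNV entry points, which is a much bigger topic.
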

Re: Nvidia Turing

Posted: Sun Sep 23, 2018 5:49 pm
by provisory

Re: Nvidia Turing

Posted: Sun Sep 23, 2018 11:11 pm
by Dade
Strange: doing the math, the price/performance ratio of the Vega 64 and the RTX 2080 Ti is about the same (something like 50% more expensive and 50% faster).

However, LuxMark v3.1 uses generic GPU computing code, not the new RTX hardware.
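
Just to spell out the ratio arithmetic with made-up placeholder numbers (not real prices or LuxMark scores): if one card costs roughly 1.5x as much and also scores roughly 1.5x higher, the points-per-euro come out the same:

Code:

// Placeholder numbers only: card B is ~50% more expensive and ~50% faster than card A.
#include <cstdio>

int main() {
    const double priceA = 500.0, scoreA = 20000.0;  // hypothetical "Vega 64"-like card
    const double priceB = 750.0, scoreB = 30000.0;  // hypothetical "RTX 2080 Ti"-like card

    std::printf("Card A: %.1f points per euro\n", scoreA / priceA);  // 40.0
    std::printf("Card B: %.1f points per euro\n", scoreB / priceB);  // 40.0
    return 0;
}
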

Re: Nvidia Turing

Posted: Mon Sep 24, 2018 9:37 pm
by kintuX
Well, here a brand new Vega 64 RX is 750€ (550€ used) vs. 1450€ for a new RTX 2080 Ti (a used GTX 1080 Ti is 550€).

Re: Nvidia Turing

Posted: Tue Sep 25, 2018 8:39 pm
by pixie
I would say Vega's competition is the 2080, not the 2080 Ti...