Nvidia's ridiculously expensive RTX graphics cards arrived full of promises, but so far they have little to actually show for themselves.
I'm currently reviewing the GeForce RTX 2080 Ti graphics processing unit. I'm not quite ready to present my findings because, well, to be honest, I don't really have much to show you.
But before I continue: I genuinely believe that the direction Nvidia is taking with its new RTX 20-series graphics cards will be a game changer. I just can't prove it to you yet.
The GeForce RTX 20-series GPUs have the same CUDA cores we had in the GTX 10 series, but they also contain Nvidia's new Tensor and RT cores. Unfortunately, there is currently no way to test, measure or even use the Tensor and RT cores. Support is coming, via the likes of Shadow of the Tomb Raider and Battlefield V, but it isn't here yet.
Right now, if you throw down the best part of NZ$2,500 for a GeForce RTX 2080 Ti, all you are looking at is about a 30% performance increase over a GTX 1080 Ti. You're paying over a grand for that extra 30%.
But if you consider what's coming, it's a different picture.
You see, the RTX 20 series, with its Tensor and RT cores, is set to turn game development on its head. You may have read about the RTX 20's real-time ray tracing (using the RT cores) and the AI capabilities that the Tensor cores bring to the table, and you may have seen the videos. Right now, it all comes across as marketing fluff.
So when I caught up with Brian Burke, Nvidia's tech PR guy, at PAX AUS in Melbourne, I didn't hold back. I asked him why Kiwis should part with such a huge amount of money for something that, right now, doesn't do very much.
I'm a big advocate for Nvidia. Their technology has driven the progress of 3D graphics since the 3dfx days. No disrespect to AMD/ATi, but they've been playing catch-up with Nvidia for over ten years. That's why I felt justified in asking Nvidia: where are the RTX demos? Why do new RTX owners have nothing, absolutely nothing, to show off the power of their new cards?
Brian's response was diplomatic, but I think Nvidia knew it might have fired the starting gun on the technology before developers had anything ready. Had it waited, it would have faced the same problem in reverse: players complaining that they had the games but not the hardware to drive them.
While the average Joe has to wait, Nvidia did show me the Star Wars ray-tracing demo running in real time. I followed the HDMI cable from the TV to the PC running the demonstration, with its GeForce RTX 2080 Ti installed, just to be sure.
The RTX 2080 Ti's ray-tracing capabilities look impressive. For developers, ray tracing is a holy grail. Most of them have become experts at faking it, using screen-space reflections, but they are likely to jump ship with very little persuasion.
All the games we play at the moment do an excellent job of simulating how beams of light bounce off objects and enter our eyes. There are compromises, though, and as players we subconsciously accept those compromises. Reflective surfaces and shadows are limited to what is visible on screen, unless they are specially coded.
With ray tracing, virtual photons shoot out of the light sources in a scene, bouncing off objects, and those that hit the camera, our virtual eye, create the image. Traditionally, this is a hugely complex task, reserved for top-end animated films, with each frame taking hours of supercomputer power.
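The idea above can be sketched in a few lines of code. This is a toy illustration of the technique, nothing like Nvidia's actual implementation: it traces in reverse (a ray out from the camera rather than photons from the light, which is how real renderers make the maths tractable), intersects a single sphere, and shades the hit point against one light. The scene and all the names are my own.

```python
import math

def normalize(v):
    # Scale a 3-vector to unit length.
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def ray_sphere(origin, direction, center, radius):
    # Distance along a unit-length ray to the nearest sphere hit, or None.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c  # direction is unit length, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(origin, direction, center, radius, light):
    # Fire a camera ray into the scene; if it hits the sphere, return a
    # simple diffuse (Lambertian) brightness toward the light, else 0.
    t = ray_sphere(origin, direction, center, radius)
    if t is None:
        return 0.0  # ray missed: background
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = normalize(tuple(h - c for h, c in zip(hit, center)))
    to_light = normalize(tuple(l - h for l, h in zip(light, hit)))
    return max(0.0, sum(n * l for n, l in zip(normal, to_light)))
```

Run `shade` once per pixel, with one ray per pixel plus bounces for reflections and shadows, and you can see why film studios needed hours per frame; and why dedicated RT cores are a big deal.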
Nvidia's RTX cards need to manage this at a minimum of thirty frames per second, and hopefully 60fps, to do a decent job in real time. That's a budget of roughly 33 milliseconds per frame at 30fps, and under 17 at 60fps.
The real challenge, however, is that developers have become so good at simulating ray-traced effects that it may be some time before we really notice the difference. That said, the Star Wars demo, and the effects shown off in the upcoming Battlefield V, really do show the power of ray tracing and the realism that can be achieved.
But not all games are going to be filled with polished, highly reflective surfaces. And even if they are, we're so used to the cheats that it may be a while before we actually notice.
At the Nvidia PAX AUS demo, I played a bit of 4A Games' upcoming Metro Exodus, with a one-click toggle to switch between RTX and non-RTX visuals. Honestly, and this may disappoint Nvidia, there wasn't much in it. Yes, the RTX-enabled visuals were better, but not NZ$2,500 better.
Developers can choose how RTX-exclusive elements are integrated into their games. We may have to wait a while for balls-to-the-wall implementations. Shadow of the Tomb Raider, for instance, will only use ray tracing for shadows.
It's the RTX cards' DLSS (Deep Learning Super-Sampling) technology, powered by those AI Tensor cores, that is likely to have the greatest impact in the near future. DLSS works by Nvidia running a game on its own supercomputers. As the game runs, the computer learns what the anti-aliasing (the removal of jagged edges) should look like.
Once the learning is done, it's packaged up as an algorithm that the RTX cards' Tensor cores can apply on the fly. The result is, in effect, anti-aliasing with very little performance impact. A game that switches to this technology gains a performance boost.
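For a feel of what DLSS is approximating, here is a sketch of classic super-sampling anti-aliasing: render the scene at a higher resolution, then average blocks of samples down to each final pixel. It's this expensive brute-force result that the trained network tries to reproduce cheaply. The code is my own illustration, not Nvidia's pipeline; `render` stands in for any function giving the scene's brightness at a point.

```python
def supersample(render, width, height, factor=4):
    # Classic super-sampling AA: sample the scene at factor x the target
    # resolution, then average each factor-by-factor block down to one
    # pixel. Smooth edges, but at factor^2 times the render cost, which
    # is the cost DLSS aims to avoid paying.
    hi = [[render(x / factor, y / factor)
           for x in range(width * factor)]
          for y in range(height * factor)]
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            block = [hi[y * factor + j][x * factor + i]
                     for j in range(factor) for i in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out
```

At `factor=4`, that's sixteen renders' worth of work per pixel. Getting a comparable image from a Tensor-core inference pass instead is where the claimed performance boost comes from.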
But right now, we have nothing that uses these RTX features. The mooted ray tracing in Shadow of the Tomb Raider and Battlefield V has yet to arrive. And I fear the worst.
For starters, ray tracing is a DirectX 12 feature. DICE's Frostbite Engine, which powers Battlefield V, is a dog when it comes to its DX12 implementation on Nvidia cards. The likes of Battlefield 1 and even Madden NFL 19 suffer from stuttering in-game and during cut-scenes under DX12. Switching back to DX11 eliminates all the stuttering.
Nvidia's GeForce RTX 20-series cards will be game changers, but not just yet. At this point in time, you're paying twice the price of a GTX 1080 Ti for about 30% extra performance.
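To put that value equation in plain numbers, using the rough figures above (twice the price for 1.3x the performance), the arithmetic looks like this:

```python
def perf_per_dollar(relative_perf, price):
    # How much relative performance each dollar buys.
    return relative_perf / price

# Rough figures from this article: a GTX 1080 Ti baseline (1.0) at about
# NZ$1,250, versus an RTX 2080 Ti at ~1.3x the performance for ~NZ$2,500.
gtx = perf_per_dollar(1.0, 1250)
rtx = perf_per_dollar(1.3, 2500)
print(rtx / gtx)  # roughly 0.65: about 35% less performance per dollar
```

In other words, on today's numbers the new card is a clear step backwards in bang-for-buck; the whole bet is on the RTX features changing that.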
VR players, however, will still appreciate that extra 30%, as it gives them a performance boost they can't even get from two 1080 Ti GPUs running together in SLI; VR games usually can't make use of two GPUs.
Nvidia's RTX technology is very exciting, but two months after launch, there is still no real justification for the huge expense of these RTX cards.
Look out for my RTX 2080 Ti / dual RTX 2080 Ti SLI review very soon.