August 20 was shaping up to be a big day in gaming for me. After some advance notice and lots of leaks and rumors, Nvidia was set to announce their next generation of graphics cards. After years of waiting, I was finally ready to buy a new card. What I saw during the presentation, however, has given me pause.
My previous Nvidia card was 2010's GTX 470. It was a power-hungry, hot, but well-performing card that lasted me until it finally gave up the ghost two years ago. Repeated "baking" attempts only brought it back for a week or so. Finally, I gave up and pulled it from my PC. Not really ready to jump into the newly announced GTX 1080, I decided to pick up a $70 AMD Radeon RX 460, which is what I'm still running today.
While my gaming is mixed between console and PC, I still enjoy PC games and the graphics and unique gameplay they usually bring. When Nvidia announced their Cologne, Germany, presentation of games and "surprises", I knew, along with everyone else, that this meant the latest generation of cards. I approached Monday thinking that I would buy the best they had to offer in hopes of keeping it going for another six to eight years.
The presentation of the new 20xx series cards was impressive. Nvidia chose to not only increase the performance of their normal CUDA cores (the hardware that renders the scene), but also to add new types of cores to the GPU. The first is a ray tracing block. Ray tracing is a computer graphics holy grail of lighting, shadows, and reflections. "Rays" are cast out into the scene like light, absorbing color, material, and angle information as they go, ultimately creating a photorealistic effect with soft shadows, true reflections, and color influence on lighting. This technique has not been used in games because of its heavy hardware requirements. The new Nvidia RTX hardware and associated software speed up this process so a first generation of ray tracing can be added to games (some 20 near-term games were announced, including Shadow of the Tomb Raider and Battlefield V). The second addition is the famed Nvidia Tensor cores. These cores run artificial intelligence applications and are used by the GPU to perform a number of tasks. One such task is increasing the resolution of the scene by intelligently determining where to fill in pixels (this is more advanced and accurate than the checkerboarding used by the PS4 Pro). Another task is anti-aliasing, which removes jagged edges from straight lines in a scene.
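To make the "rays cast out into the scene" idea concrete, here is a toy ray-casting sketch in Python. This is purely illustrative and is not Nvidia's RTX pipeline: it traces a single ray against one sphere and shades the hit point with simple diffuse (Lambertian) lighting. All function and variable names here are my own invention for the example.

```python
import math

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def normalize(v):
    n = math.sqrt(dot(v, v))
    return (v[0]/n, v[1]/n, v[2]/n)

def intersect_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None on a miss."""
    oc = (origin[0]-center[0], origin[1]-center[1], origin[2]-center[2])
    # Quadratic from substituting the ray into the sphere equation;
    # direction is assumed normalized, so the leading coefficient is 1.
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius*radius
    disc = b*b - 4.0*c
    if disc < 0:
        return None  # ray never touches the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(origin, direction, center, radius, light_dir, base_color):
    """Cast one ray; return an RGB tuple in [0, 1]."""
    t = intersect_sphere(origin, direction, center, radius)
    if t is None:
        return (0.0, 0.0, 0.0)  # missed everything: black background
    hit = tuple(origin[i] + t*direction[i] for i in range(3))
    normal = normalize(tuple(hit[i] - center[i] for i in range(3)))
    # Lambertian term: surfaces facing the light are brighter
    intensity = max(0.0, dot(normal, normalize(light_dir)))
    return tuple(channel * intensity for channel in base_color)

# One ray fired straight down the z-axis at a red sphere
color = shade(origin=(0, 0, 0), direction=(0, 0, 1),
              center=(0, 0, 5), radius=1.0,
              light_dir=(0, 0, -1), base_color=(1.0, 0.0, 0.0))
print(color)  # → (1.0, 0.0, 0.0): the sphere faces the light head-on
```

A real renderer fires one or more rays per pixel and lets them bounce, gathering reflections and indirect light at each hit, which is why the technique is so expensive and why dedicated hardware matters.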
Overall, the presented lineup of the RTX 2070, RTX 2080, and RTX 2080 Ti was an impressive show of hardware that had me ready to preorder. Then, the pricing was announced. The MSRP ranged from $500 to $1000, with the preorder for the 2080 Ti running $1200. I had hoped to get in at an already high $800 ($100 more than the 1080 Ti), so the $1200 price tag was an immediate kick in the stomach. I cannot reasonably pay the same price for a graphics card as for three 4K consoles (Jason Evangelho, Forbes contributor, and many other tech writers seem to agree). While the hardware is impressive, the upper-end price tags are much too high for tech that does not yet have games to use it. Sure, the 2080 Ti may run regular games some 20% faster than the previous generation, but it's not worth the premium to me.
So, I'm back to waiting once again. Waiting for the current 1080 Ti to drop in price, waiting for the benchmarks on the more affordable 2070, or waiting for the 2080 Ti market to saturate and the price to come down significantly. Meanwhile, I've got a number of two- and three-year-old PC games that my RX 460 runs just fine.
What is your opinion of the Nvidia announcement? Any PC gamers out there ready to pay the entry fee?