We have previously heard whispers about an Nvidia GeForce GTX XX60 graphics card that will lack raytracing features. Today those rumours gained further weight, as VideoCardz reported that it had received three separate tips about the upcoming release of an Nvidia GeForce GTX 1660 Ti.
The story goes that the new Nvidia GeForce GTX 1660 Ti will become the first graphics card based upon the Turing GPU architecture to be released into the GTX family. Seasoned readers will anticipate the significance of this change – it means that the GTX 1660 Ti, and any other Turing-based GTX efforts that follow, will lack hardware-accelerated real-time raytracing features.
According to the VideoCardz piece, the new Nvidia GeForce GTX 1660 Ti isn't just scaled back with a snip to the RT cores; the semiconductor scissors have also been taken to the CUDA cores, reducing the active count to 1536 (compared to 1920 in the RTX 2060). However, as far as we know, other important features of the TU116 GPU remain the same as in the TU106 GPU found in the RTX 2060.
Memory quantity, type and connection are expected to remain at 6GB of GDDR6 on a 192-bit bus. In this area the VideoCardz sources diverge from some others, which suggest the upcoming snipped Turing-based GTX cards will fall back to GDDR5(X) video memory.
| | GeForce GTX 1660 Ti | GeForce RTX 2060 | GeForce RTX 2070 |
|---|---|---|---|
| GPU | 12nm FF TU116 | 12nm FF TU106 | 12nm FF TU106 |
| CUDA Cores | 1536 | 1920 | 2304 |
| Memory | 6GB GDDR6 | 6GB GDDR6 | 8GB GDDR6 |
| Memory Bus | 192-bit | 192-bit | 256-bit |

Tabled info via VideoCardz
With the removal of RTX On capability and some trimming of CUDA cores, it will be interesting to see how the supposed GTX 1660 Ti will be priced, and whether anything else will join it in the gap currently occupied by the limited remaining stocks of GTX 1060 cards. Remember, there is a significant price gap between the RTX 2060 ($349) and the GTX 1050 Ti ($170).