I know it's never going to happen, but Nvidia really should spin off their consumer division - it's such a complete afterthought at this point. They're sitting on all this amazing technology but just stringing the consumer market along.
Their consumer cards are the entry point for many researchers and students though, so it pays off eventually when they become engineers working with the expensive enterprise cards.
Great point - an analogy comes to mind: MSFT (and Adobe) were totally okay with (and even encouraged) students pirating Windows/Photoshop etc in non-Western countries in the hopes they grow up and carry that knowledge in a future legal venture.
They were right.
This is the hardware equivalent of that.
Only until CUDA is replaceable, though - and who is going to do that? Intel had better follow through on their promises in this regard.
At this point, I suspect that they mostly want to be in the consumer market to attempt to keep a lid on VRAM density and throughput in that more cost-sensitive space. It's ridiculous that the RTX "4060" has 8 GB on a 128-bit bus vs. 3060's 12 GB on a 192-bit bus (both GDDR6; i.e. they didn't make up the difference with improvements in per-pin transfer rate).
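To make the bus-width point concrete, here's a quick back-of-the-envelope sketch. The per-pin rates are the commonly cited GDDR6 speeds for each card; treat them as approximate:

```python
def bandwidth_gb_s(bus_width_bits: int, per_pin_gb_s: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate (Gb/s) / 8."""
    return bus_width_bits * per_pin_gb_s / 8

# RTX 3060: 192-bit bus, GDDR6 at ~15 Gb/s per pin
print(bandwidth_gb_s(192, 15))  # -> 360.0 GB/s
# RTX 4060: 128-bit bus, GDDR6 at ~17 Gb/s per pin
print(bandwidth_gb_s(128, 17))  # -> 272.0 GB/s
```

Even with the slightly faster memory, the narrower bus leaves the 4060 well short of the 3060's peak bandwidth.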
Isn't Nvidia's AI chip based on their own GPU architecture? Let's say the current AI trend goes away. It's good to have something to fall back on, so why would you cut the consumer side off? It's all based on the same architecture, and it's not like the consumer side is costing or losing them money.
Nah. Besides, I want them to continue to diversify in case there ever is an AI winter. If that happens, at least I'll get an updated Shield TV or something.