People overlooking DLSS is one of the most annoying things when it comes to this generation of GPUs.
DLSS leverages the Tensor Cores, which account for a large share of the added transistors that make the RTX chips so huge. That's a ton of die space that could have been used for CUDA cores, but Nvidia chose to make them Tensor Cores instead. They did this for specific reasons: Tensor Cores are insanely more efficient at the workloads they're built for. Like... insanely more efficient.
The best review I've seen for DLSS so far is here:
https://www.eurogamer.net/articles/digitalfoundry-2018-09-17-nvidia-geforce-rtx-2080-ti-benchmarks-7001
The gains for DLSS over traditional anti-aliasing like TAA are huge (as Nvidia originally claimed). These gains are more dramatic at higher resolutions, so these cards are really geared towards 4K more than anything else.
Granted, not very many people have 4K setups, so RTX cards don't really make sense for most of us. Those Tensor Cores will just sit there unused (unless they also get used for ray tracing, but Nvidia hasn't been clear about that yet). And yes, the prices are crazy, but people with 4K setups generally aren't the ones who care much about price.
So basically, for 4K users, there's the 2080 Ti. For everyone else, there's Mastercard... I mean, cheap 1080 Tis.