I dunno, on the gaming side, I'm not impressed much. I bet it's all brilliant and great and awesome, IF you're running 4K monitors... And that in itself implies you already have a 1080ti or something along the very high end of last gen, so suddenly it doesn't seem so compelling. Also, NVidia pulling a No Man's Sky with the RT part, as in no game is using ray tracing at launch, and ppl/reviewers (excluding Linus' review) having absolutely no fucking clue what it stands for anyway, doesn't help. On the gaming side, I'm not sold on it like I was with the 10XX series.
The problem I see with this whole RT hype is that with no games to show it off, and the tech being exclusive to NVidia, ppl won't buy into it, and game makers won't buy into it, and it will fizzle out.
The "features" of RT are already in most games, using smart workarounds, "cheating" the system. And they look great! It's not like normal-mapping which was new tech that changed the look of games to what they are today. There isn't really a "need" for the good old tech of raytracing a Full Scene in games... But it could be a nice addition.
(BTW, strictly speaking, games already use "raytracing", in as minimalistic a way as possible. Just watch last year's CitCon panel on how SC is rendered, or the breakdown of how Doom 2016 is rendered; that's a good explainer as well.)
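To picture what that minimal "raytracing"/cheating looks like, here's a toy sketch of the screen-space-reflection trick in CUDA. This is entirely my own simplification, not code from any real engine: one shared mirror direction, naive pixel-space stepping, dummy buffers, no thickness tests. The point is just that the "ray" marches through the depth buffer and steals the color of whatever pixel it lands behind; actual scene geometry is never touched.

```cuda
// Toy screen-space reflection march: my own simplification, not any engine's code.
#include <cstdio>

#define W 640
#define H 480
#define STEPS 64

// For each pixel, step a "ray" through screen space and declare a hit when
// its depth passes behind the stored depth-buffer value; the hit pixel's
// color is reused as the reflection. No scene geometry is ever touched.
__global__ void ssrMarch(const float* depth, const float* color, float* out,
                         float dx, float dy, float dz)  // one shared reflection dir (simplification)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= W || y >= H) return;
    int idx = y * W + x;

    float px = x, py = y, pz = depth[idx];  // start on this pixel's surface
    out[idx] = 0.0f;                        // fallback: ray found nothing

    for (int i = 0; i < STEPS; ++i) {
        px += dx; py += dy; pz += dz;       // one screen-space step
        int sx = (int)px, sy = (int)py;
        if (sx < 0 || sx >= W || sy < 0 || sy >= H) return;  // marched off-screen
        if (pz >= depth[sy * W + sx]) {     // ray went behind the depth buffer:
            out[idx] = color[sy * W + sx];  // reuse that pixel as the "reflection"
            return;
        }
    }
}

int main()
{
    float *depth, *color, *out;
    size_t n = (size_t)W * H * sizeof(float);
    cudaMallocManaged(&depth, n);
    cudaMallocManaged(&color, n);
    cudaMallocManaged(&out, n);
    for (int i = 0; i < W * H; ++i) {       // dummy G-buffer, just enough to run
        depth[i] = 100.0f + (i % W) * 0.1f;
        color[i] = (float)(i % 256) / 255.0f;
    }
    dim3 block(16, 16), grid((W + 15) / 16, (H + 15) / 16);
    ssrMarch<<<grid, block>>>(depth, color, out, 1.0f, 0.0f, 0.5f);
    cudaDeviceSynchronize();
    printf("reflection sample at center pixel: %f\n", out[(H / 2) * W + W / 2]);
    cudaFree(depth); cudaFree(color); cudaFree(out);
    return 0;
}
```

And that's also the catch: it can only reflect what's already on screen, which is exactly the class of artifact real raytracing would get rid of.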
That said, I'm very interested in the new cards for use in workstations! More CUDA cores is More Better! (mostly...)
Not for the stuff most "tech-news" outlets are excited about, like editing and "rendering" (basically just encoding) in some 2D cutting software, because that just shows how little they understand the tech they report on... The actual encode runs on the CPU or on the card's fixed-function NVENC block, not on the CUDA cores, so piling on more of them barely helps. (Check out how confused Linus is every time he upgrades their "rendering" workstations with hundreds of thousands of dollars' worth of high-end graphics cards, gains a few percent in render time at best, and keeps blaming the software for not being updated for his new shiny cards. He has absolutely no clue how that part of the "rendering" process works...)
But for real GPU rendering of 3D raytraced scenes with production renderers like Redshift and Octane and the like, it's really exciting! Redshift is sooo amazingly fast (been working with it all year) even on mid-grade gaming cards like 1060s, so it's gonna be real cool to see it run on a 2080ti.
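For a feel of why these renderers scale so well with CUDA cores, here's a minimal toy kernel (mine, nothing to do with Redshift's or Octane's actual code; the camera, sphere and light are all hard-coded for the sketch): one thread per pixel, each tracing its own ray.

```cuda
// Toy "renderer": one CUDA thread per pixel, each tracing one ray against a
// single hard-coded sphere. Nothing here comes from Redshift or Octane; it
// just shows the embarrassingly parallel shape of the workload.
#include <cstdio>
#include <cmath>

#define W 1920
#define H 1080

__global__ void trace(float* image)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= W || y >= H) return;

    // Camera at the origin looking down -z; map the pixel to a view plane at z = -1
    float u = (x + 0.5f) / W * 2.0f - 1.0f;
    float v = (y + 0.5f) / H * 2.0f - 1.0f;
    float dx = u, dy = v, dz = -1.0f;
    float len = sqrtf(dx * dx + dy * dy + dz * dz);
    dx /= len; dy /= len; dz /= len;

    // Sphere at (0,0,-3) with radius 1: solve |o + t*d - c|^2 = r^2 for t
    float cx = 0.0f, cy = 0.0f, cz = -3.0f, r = 1.0f;
    float ox = -cx, oy = -cy, oz = -cz;     // ray origin minus sphere center
    float b = ox * dx + oy * dy + oz * dz;
    float c = ox * ox + oy * oy + oz * oz - r * r;
    float disc = b * b - c;

    float shade = 0.0f;
    if (disc > 0.0f) {
        float t = -b - sqrtf(disc);         // nearest intersection
        if (t > 0.0f) {
            // Vector from sphere center to hit point; dividing by r normalizes it
            float nx = t * dx - cx, ny = t * dy - cy, nz = t * dz - cz;
            float l = 0.57735f;             // hard-coded light dir, normalize(1,1,1)
            shade = fmaxf(0.0f, (nx + ny + nz) * l / r);  // simple N.L diffuse term
        }
    }
    image[y * W + x] = shade;
}

int main()
{
    float* img;
    cudaMallocManaged(&img, (size_t)W * H * sizeof(float));
    dim3 block(16, 16), grid((W + 15) / 16, (H + 15) / 16);
    trace<<<grid, block>>>(img);            // 2M+ independent rays in one launch
    cudaDeviceSynchronize();
    printf("shade at center pixel: %f\n", img[(H / 2) * W + W / 2]);
    cudaFree(img);
    return 0;
}
```

Every thread in that grid does the same independent work, so more CUDA cores means proportionally more rays in flight; a production renderer shoots thousands of bounced rays per pixel, which is exactly where the extra cores pay off.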
So far I haven't found a report on whether the RT cores will be used for these at all, but I hope they get implemented in a way that makes those renderers much faster (presumably through something like NVidia's OptiX)! Then I'll have to save up and buy some for Mooar Speeeeed! :D
tl;dr: it seems a bit pricey for just gaming if you already have a decent enough 10xx card and a 1080p screen. It might see more use in 3D work environments, but we shall see if the RT cores get implemented into 3D rendering software at all.