RTX 2080/2080 Ti BENCHMARKS ARE IN!

Montoya

Administrator
Staff member
Oct 31, 2013
9,929
54,350
3,055
RSI Handle
Montoya
When Star Citizen was being Kickstarted and Chris was blabbing on about fidelity and how many polygons the ships have, we all wondered how in the world this shit was going to run on a normal computer.

The delays in Star Citizen have had an unintended upside here: the technology needed to make this game run at an acceptable frame rate has caught up with us!

I told myself that I would only upgrade my 970 when Star Citizen/SQ42 is released, so it looks like that 2070 may be the card I have been waiting for!
 

August

Space Marshal
Officer
Donor
Aug 27, 2018
2,789
10,363
2,250
RSI Handle
August-TEST
JayzTwoCents made some good points. The 2080 basically performs at 1080 Ti levels for $100 more, and the 2080 Ti outperforms it by about 30%, but for 70% more money. Performance per dollar for the new stuff appears to be quite poor.
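Quick back-of-the-envelope check of those numbers (the prices are rough launch MSRPs from memory, $699 for the 1080 Ti as the baseline, so treat the exact dollar figures as approximate):

```python
# Rough performance-per-dollar comparison based on the figures above.
# Baseline: GTX 1080 Ti at ~$699, performance normalized to 1.0.
cards = {
    "1080 Ti": {"perf": 1.0, "price": 699},
    "2080":    {"perf": 1.0, "price": 799},   # ~1080 Ti performance, ~$100 more
    "2080 Ti": {"perf": 1.3, "price": 1199},  # ~30% faster, ~70% more money
}

baseline = cards["1080 Ti"]["perf"] / cards["1080 Ti"]["price"]
for name, c in cards.items():
    # Performance per dollar, scaled so the 1080 Ti reads 1.00
    ppd = (c["perf"] / c["price"]) / baseline
    print(f"{name}: {ppd:.2f}x the 1080 Ti's perf/$")
```

Works out to roughly 0.87x for the 2080 and 0.76x for the 2080 Ti, i.e. both new cards are a worse deal per frame than the old flagship.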
 

Printimus

Space Marshal
Officer
Donor
Dec 22, 2015
10,674
39,039
3,160
RSI Handle
Printimus
JayzTwoCents made some good points. The 2080 basically performs at 1080 Ti levels for $100 more, and the 2080 Ti outperforms it by about 30%, but for 70% more money. Performance per dollar for the new stuff appears to be quite poor.
Yeah, but when you only upgrade your components once every 2 GPU generations (4-5 years), then $200 more is nothing. I'll take the better-performing card, as I am upgrading from two 980 Tis in SLI.

For me the value is there; I don't care much about fps/$ or whatever everyone else is screaming about.
 

Radegast74

Space Marshal
Oct 8, 2016
3,002
10,660
2,900
RSI Handle
Radegast74
JayzTwoCents made some good points. The 2080 basically performs at 1080 Ti levels for $100 more, and the 2080 Ti outperforms it by about 30%, but for 70% more money. Performance per dollar for the new stuff appears to be quite poor.
That's pretty much the conclusion of this review:
https://arstechnica.com/gadgets/2018/09/nvidia-rtx-2080-and-2080-ti-review-a-tale-of-two-very-expensive-graphics-cards/

My favorite quotes:
"Remember when $700 seemed like a lot of money for a top-of-the-line GPU? Quaint times."

"Right now, the RTX 2080 is a slightly more expensive GTX 1080 Ti, and its extra $100 or so buys you a raffle ticket for Nvidia's possible improved-games future."

"Meanwhile, if you want to stretch the definition of a "consumer-grade" computer part to include something that costs over one thousand dollars, then the RTX 2080 Ti is the best consumer-grade graphics card on the market, bar none."

I think I'm skipping this generation, I'll wait for software to be available that actually takes advantage of the new tech.
 

Xist

Moderator
Staff member
Officer
Donor
Jan 16, 2016
903
2,654
1,650
RSI Handle
Xist
JayzTwoCents made some good points. The 2080 basically performs at 1080 Ti levels for $100 more, and the 2080 Ti outperforms it by about 30%, but for 70% more money. Performance per dollar for the new stuff appears to be quite poor.
At the moment that is a fair assessment.

The question is whether game companies will start to use the RTX features that really set the 20xx apart from the 10xx series. If/when we start seeing these featured in games, then the RTX cards will be worth upgrading to. However, I bet it will be at least a year before there is much reason to have any RTX capabilities in your card.

Note that Star Citizen might be one of the first games to implement RTX features. With CR being so into immersion, the shadows produced by RTX are significantly more lifelike than those produced without it.

So I agree with @Montoya: assuming SC/SQ42 is released in the next 2-3 years, the RTX card is likely the one to have. I don't think anybody needs to rush out and buy one now though. Next year it will be cheaper! And there will be a 2180. The year after that, the 2280...

Sometime in there is hopefully when we'll have SQ42. Maybe we'll find out more during CitCon.
 

Lorddarthvik

Space Marshal
Donor
Feb 22, 2016
2,750
9,511
2,860
RSI Handle
Lorddarthvik
I dunno, on the gaming side I'm not impressed much. I bet it's all brilliant and great and awesome, IF you are running 4K monitors... And that in itself implies that you already have a 1080 Ti or something along the very high end of last gen, so suddenly it doesn't seem so compelling. Also, Nvidia pulling a No Man's Sky with the RT part, as in no game is using ray tracing at launch, and ppl/reviewers (excluding Linus' review) having absolutely no fucking clue what it stands for anyway, doesn't help. On the gaming side, I'm not sold on it like I was with the 10xx series.

The problem I see with this whole RT hype is that with no games to show it off, and the tech being exclusive to Nvidia, ppl won't buy into it, game makers won't buy into it, and it will fizzle out.
The "features" of RT are already in most games, using smart workarounds that "cheat" the system. And they look great! It's not like normal mapping, which was new tech that changed the look of games to what they are today. There isn't really a "need" for the good old tech of raytracing a full scene in games... but it could be a nice addition.
(BTW, strictly speaking, games already use "raytracing", in as minimalistic a way as possible. Just watch last year's CitCon panel on how SC is rendered. Or how Doom 2016 is rendered; that's a good explainer as well.)
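If anyone's curious what that core operation actually looks like, here's a bare-bones ray/sphere hit test in Python. Just a toy to show the idea, nothing to do with any engine's actual code:

```python
import math

# Minimal ray/sphere intersection: the basic building block behind any
# ray-traced effect, whether it's a full scene or the limited tricks
# games already use.
def ray_hits_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t >= 0.
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                          # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)     # nearest intersection distance
    return t if t >= 0 else None

# A ray from the origin pointing down +z hits a unit sphere 5 units away
# at distance 4 (the sphere's near surface):
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

Full-scene ray tracing is "just" doing this millions of times per frame against every triangle, which is why it's needed dedicated hardware to be viable in real time.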

That said, I'm very interested in the new cards for use in workstations! More CUDA cores is More Better! (mostly...)
Not for the stuff that most "tech-news" outlets are excited about, like editing and "rendering" (basically just encoding) in some 2D cutting software, because that just shows how little they understand the tech they make the news about... (Check out how confused Linus is every time he upgrades their "rendering" workstations with hundreds of thousands of dollars' worth of high-end graphics cards, gains at best only a few percent in render time, and keeps blaming the software for not being updated for his new shiny cards. He has absolutely no clue how that part of the "rendering" process works...)

But for some real GPU rendering of 3D raytraced scenes using production renderers like Redshift and Octane and the like, it's really exciting! Redshift is sooo amazingly fast now (I've been working with it all year), even with some mid-grade gaming cards like 1060s; it's gonna be real cool to see it run on a 2080 Ti.
So far I haven't found a report on whether the RT cores could be used for these at all, but I hope they get implemented in a way that makes these renderers much faster! Then I'll have to save up and buy some for Mooar Speeeeed! :D

tl;dr: it seems a bit pricey for just gaming if you already have a decent enough 10xx and a 1080p screen. It might see more use in 3D-work-related environments, but we shall see if the RT cores get implemented into 3D rendering software at all.
 

Radegast74

Space Marshal
Oct 8, 2016
3,002
10,660
2,900
RSI Handle
Radegast74
Note that Star Citizen might be one of the first games to implement RTX features. With CR being so into immersion, the shadows produced by RTX are significantly more lifelike than without.
Interestingly enough, in the latter half of that Ars Technica review, they talk about one of Nvidia's demos showing off the new RTX features: an asteroid field demo!

"For another taste of that enchanted-yet-anxious feeling, we tested a fascinating "Asteroids" demo, developed internally by Nvidia to show off one other compelling RTX trick: mesh shading. Nvidia wants to offer developers a new path to putting more geometrical objects into a given real-time 3D scene. This path "eliminates CPU draw call bottlenecks and uses more efficient algorithms for producing triangles." Nvidia says this ideally works by reducing the number of unique draw calls from the CPU and letting the GPU process a simpler object list."

So yeah, SC could definitely take advantage of this, that is for sure! I'm just not sure where on the roadmap "re-write software to take advantage of RTX" is...so, I can wait.
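The draw-call point is easy to picture, by the way. Here's a toy CPU-side sketch (made-up names and numbers, not Nvidia's actual API): the traditional path issues one draw call per visible asteroid, while the mesh-shading-style path submits the whole object list in a single call and leaves the per-object work to the GPU.

```python
# Toy illustration of the draw-call difference (not real graphics API code).
asteroids = [{"id": i, "dist": i * 0.5} for i in range(10_000)]

# Traditional path: the CPU walks the scene and issues one draw call
# per object that survives CPU-side distance culling.
def traditional(scene, max_dist=1000.0):
    calls = 0
    for obj in scene:
        if obj["dist"] < max_dist:  # CPU decides visibility
            calls += 1              # one draw call per visible object
    return calls

# Mesh-shading-style path: the CPU submits the whole list once;
# culling and LOD selection happen on the GPU side instead.
def mesh_shading_style(scene):
    return 1  # a single submission, regardless of object count

print(traditional(asteroids))         # 2000 CPU draw calls
print(mesh_shading_style(asteroids))  # 1
```

That's the "eliminates CPU draw call bottlenecks" part in a nutshell: the CPU stops being the thing that has to touch every asteroid every frame.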

Picture using the new tech:
Screen Shot 2018-09-20 at 7.29.09 AM.png


Picture using the old tech...less detailed, and much slower fps:
Screen Shot 2018-09-20 at 7.28.39 AM.png
 

Xist

Moderator
Staff member
Officer
Donor
Jan 16, 2016
903
2,654
1,650
RSI Handle
Xist
So yeah, SC could definitely take advantage of this, that is for sure! I'm just not sure where on the roadmap "re-write software to take advantage of RTX" is...so, I can wait.
The risk CIG would be taking is that they'd be adding this in ONLY for people using Nvidia cards. They'd also have to keep some compatibility for AMD cards, and who knows if RTX works with Vulkan or not.

So it might be one of those things where sure they'd love to have it, but then the cost of maintaining everything else skyrockets, and it's easier just to not have it at all.

Who knows. I would definitely like to know CR's take on it.
 

Radegast74

Space Marshal
Oct 8, 2016
3,002
10,660
2,900
RSI Handle
Radegast74
The risk CIG would be taking is that they'd be adding this in ONLY for people using Nvidia cards. They'd also have to keep some compatibility for AMD cards, and who knows if RTX works with Vulkan or not.

So it might be one of those things where sure they'd love to have it, but then the cost of maintaining everything else skyrockets, and it's easier just to not have it at all.

Who knows. I would definitely like to know CR's take on it.
Excellent points! If I had my preferences, I would want CR and SC to support open standards. My dream with SC is to get rid of Windows 10 and just run it on Linux...

In completely unrelated news:
https://architosh.com/2018/09/what-matters-vulkan-graphics-api-is-worlds-first-with-formal-memory-model/

So there is definitely progress in other areas, and not locking your software into one company's hardware product is the way to go, imo!
 

Printimus

Space Marshal
Officer
Donor
Dec 22, 2015
10,674
39,039
3,160
RSI Handle
Printimus
I dunno, on the gaming side I'm not impressed much. I bet it's all brilliant and great and awesome, IF you are running 4K monitors... And that in itself implies that you already have a 1080 Ti or something along the very high end of last gen, so suddenly it doesn't seem so compelling. Also, Nvidia pulling a No Man's Sky with the RT part, as in no game is using ray tracing at launch, and ppl/reviewers (excluding Linus' review) having absolutely no fucking clue what it stands for anyway, doesn't help. On the gaming side, I'm not sold on it like I was with the 10xx series.

The problem I see with this whole RT hype is that with no games to show it off, and the tech being exclusive to Nvidia, ppl won't buy into it, game makers won't buy into it, and it will fizzle out.
The "features" of RT are already in most games, using smart workarounds that "cheat" the system. And they look great! It's not like normal mapping, which was new tech that changed the look of games to what they are today. There isn't really a "need" for the good old tech of raytracing a full scene in games... but it could be a nice addition.
(BTW, strictly speaking, games already use "raytracing", in as minimalistic a way as possible. Just watch last year's CitCon panel on how SC is rendered. Or how Doom 2016 is rendered; that's a good explainer as well.)

That said, I'm very interested in the new cards for use in workstations! More CUDA cores is More Better! (mostly...)
Not for the stuff that most "tech-news" outlets are excited about, like editing and "rendering" (basically just encoding) in some 2D cutting software, because that just shows how little they understand the tech they make the news about... (Check out how confused Linus is every time he upgrades their "rendering" workstations with hundreds of thousands of dollars' worth of high-end graphics cards, gains at best only a few percent in render time, and keeps blaming the software for not being updated for his new shiny cards. He has absolutely no clue how that part of the "rendering" process works...)

But for some real GPU rendering of 3D raytraced scenes using production renderers like Redshift and Octane and the like, it's really exciting! Redshift is sooo amazingly fast now (I've been working with it all year), even with some mid-grade gaming cards like 1060s; it's gonna be real cool to see it run on a 2080 Ti.
So far I haven't found a report on whether the RT cores could be used for these at all, but I hope they get implemented in a way that makes these renderers much faster! Then I'll have to save up and buy some for Mooar Speeeeed! :smile:

tl;dr: it seems a bit pricey for just gaming if you already have a decent enough 10xx and a 1080p screen. It might see more use in 3D-work-related environments, but we shall see if the RT cores get implemented into 3D rendering software at all.
Look at the technology of 4K screens (TVs and monitors) when they first came out. There was hardly any content to run on them, but in time content followed the tech, and now a lot of stuff is available in 4K: games, videos, pictures, etc.
 

Lorddarthvik

Space Marshal
Donor
Feb 22, 2016
2,750
9,511
2,860
RSI Handle
Lorddarthvik
Look at the technology of 4K screens (TVs and monitors) when they first came out. There was hardly any content to run on them, but in time content followed the tech, and now a lot of stuff is available in 4K: games, videos, pictures, etc.
True, and I hope it happens with this RT as well! But I fear it will follow the path of the PhysX chip and Nvidia's hair thingy (HairWorks) and such. Half of the gamers couldn't use those (AMD had no support), and games featuring them suffered heavily on other hardware because of them. This tech has the same potential to make things better, and the same potential to fail for the same reasons those did.

Btw, while 4K was not available to the public, most films were filmed (or, in the earlier days, scanned) in 4K+ resolutions! The base for the "content" was already there (as early as 1992!), but the consumer-level hardware was not. The standards to make the change to 4K were there for ages. Any manufacturer that wants to make a 4K TV can easily make one thanks to these standards. It doesn't matter if you buy a Sony or an LG TV; they will both play the same 4K content, and it will look the same.
With the way Nvidia is doing its RT stuff, we don't know how much effort it takes to implement, or whether it can be implemented in a way where the software won't suffer on AMD (or Intel) graphics cards. We don't know how much improvement it brings, or if buyers will care enough to ask for it even if it costs more.
There are a lot of questions still... Anyway, if I had to guess, I think it will be forgotten in about 2 years' time, as something better and more generic pops up in its stead.

Humblebrag: I used to work in digital restoration when I started out, and I worked on 4K digital scans (downsized from 8K) of films from around 1900. 4K is sooo last century... hahahaha
(This was one of them: https://en.wikipedia.org/wiki/The_Weavers_(1905_film) ) That film featured a woman who was born 114 years before that. That's freaking 1791!
 

CosmicTrader

Space Marshal
Officer
Donor
Oct 30, 2015
6,152
23,959
2,975
RSI Handle
CosmicTrader
Thanks for posting this info.


When Star Citizen was being kickstarted and Chris was blabbing on about fidelity and how many polygons ships have, we all wondered how in the world this shit was going to run on a normal computer.
As Chris Roberts stated in 2012: "The PC is dead. Long live the PC."
CR is always right.......
 

SpudNyk

Space Marshal
Donor
Jun 19, 2016
886
3,435
2,650
RSI Handle
spudnyk
I'm sticking with my 1080 Ti and will look at the next gen after the 2080 Ti. Hopefully Intel will have their video cards out by then (and be competitive), or AMD will step up their game. Nvidia needs the competition; then prices will come down.
 

Deroth

Space Marshal
Donor
Sep 28, 2017
1,827
6,130
2,850
RSI Handle
Deroth1
I'm sticking with my 1080 Ti and will look at the next gen after the 2080 Ti. Hopefully Intel will have their video cards out by then (and be competitive), or AMD will step up their game. Nvidia needs the competition; then prices will come down.
I have more confidence in Intel bringing their 'A Game' to the market than AMD actually reaching beyond, "buy us because we're not Nvidia, please, please, please, oh please!"
 