STAR CITIZEN BENCHMARKS! - R9 Fury X - VS - 980TI

Jul 6, 2014
754
894
550
RSI Handle
Salt_Lake
With that many dead cards, possibly you could consider it's the rest of your system and not the cards lol
Agreed, I don't see how you went through so many cards. I have had a lot of GPUs and only one issue with one dying, and that's across both AMD and Nvidia. It's usually people slacking on getting a decent PSU that causes most of these CPU/GPU deaths, or making their own custom cables with the wrong gauge wire.
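
To put rough numbers on the wrong-gauge point, here's a back-of-the-envelope sketch in C++. The 150 W cable budget, 60 cm run, and per-metre AWG resistances are my own illustrative assumptions, not figures from this thread:

    #include <cstdio>
    #include <initializer_list>

    int main() {
        // Hypothetical 8-pin PCIe cable: ~150 W at 12 V split across three 12 V wires.
        const double amps_per_wire = 150.0 / 12.0 / 3.0;  // ~4.2 A per wire
        const double length_m      = 0.6;                 // assumed 60 cm cable run
        // Copper resistance from standard AWG tables:
        // 16 AWG ~ 0.0132 ohm/m (sensible), 22 AWG ~ 0.0530 ohm/m (too thin).
        for (double ohm_per_m : {0.0132, 0.0530}) {
            const double r    = ohm_per_m * length_m * 2;           // supply wire + return wire
            const double drop = amps_per_wire * r;                  // V = I * R
            const double heat = amps_per_wire * amps_per_wire * r;  // P = I^2 * R
            std::printf("%.4f ohm/m: drop %.3f V, heat %.2f W per wire\n",
                        ohm_per_m, drop, heat);
        }
    }

The too-thin wire sags roughly a quarter of a volt under load and dumps over a watt per wire as heat inside the sleeving; on a cheap PSU with already-loose regulation, that's exactly the kind of marginal 12 V delivery being blamed here.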
 

Elsa

Captain
Aug 1, 2015
24
17
175
RSI Handle
Elsa
With that many dead cards, possibly you could consider it's the rest of your system and not the cards lol
During that time I've had 5 different mobos (Asus Sabertooth and Maximus boards) and 3 different power supplies, with an E8400, an FX-8120, a 1090T, and an i5-4690K.

It isn't uncommon to have Nvidia cards die; they don't last. But my 980 Ti dying one week in was odd.

I've been building PCs since the i486 SX days: I used to connect the bridges on Athlon T-Birds with a pencil to unlock multipliers, used to cut up heatsinks back in the day to make RAM sinks, did custom GPU cooling and volt mods, and even did the VapoChill stuff during the MadOnion glory days of PC. Nvidia cards (even back to the first GeForce) just don't last.

Most of them died after a year to a year and a half. Only the 980 Ti died quickly.
 

Star Pilgrim

Rear Admiral
Feb 24, 2015
421
145
310
RSI Handle
Star_Pilgrim
I've been building PCs since the i486 SX days: I used to connect the bridges on Athlon T-Birds with a pencil to unlock multipliers, used to cut up heatsinks back in the day to make RAM sinks, did custom GPU cooling and volt mods, and even did the VapoChill stuff during the MadOnion glory days of PC. Nvidia cards (even back to the first GeForce) just don't last.

Most of them died after a year to a year and a half. Only the 980 Ti died quickly.
And you still have not learned how to build them properly? :)

Every nitwit can overclock components; only a master can make them work properly for the long haul.
Just because it is stable for the moment does not mean you have succeeded. Components are built for a certain amount of stress and are rated accordingly. Some components can take more than others.
And it takes a master to know which ones might last for prolonged periods, and at which settings, when overclocking.

I have been building and OCing comps ever since I got my first Commodore, and PCs ever since the IBM PS/2 line was introduced in 1987 (the entry models used an 8086 processor and an ISA bus; that was the true beginning of the PC as we know it, granddaddy to everything that followed).
I've had every PC component you can think of, and CPUs ranging from IBM to Cyrix to AMD.

Today I build water-cooled PCs (I even tried a fully enclosed, mineral-oil-cooled one once) which I never overclock more than 20% past their base clocks, e.g. a 1,000 MHz base clock tops out at 1,200 MHz (history taught me).
Better to have good-performing, long-lasting components than short-lived, record-breaking monsters that cost you your wallet.
 

Krystal LeChuck

Meme Meister
Staff member
Officer
Jun 10, 2014
594
888
1,420
RSI Handle
Krystal
Nvidia cards don't last? AMD was the one with the higher failure rate; go on OC.net or Hexus or even LTT. Hell, even AMD said a focus of the 300 series was a lower failure rate. I have GTX 280s still functioning that work 24/7 for F@H. I also have a 6850 that is still working too, but it isn't in use.

https://www.pugetsystems.com/labs/articles/Video-Card-Failure-Rates-by-Generation-563/

http://forums.anandtech.com/showthread.php?t=2349587
Read the articles you posted. The first one uses ASUS DirectCU cards, which have the worst coolers in the world for VRMs. There's also a comment from the article's writer: "I'm not sure about exact VRM temperatures since we don't log that currently, but I believe we usually see about 85C being reported". Is that guy kidding me? A benchmarker that doesn't log VRM temperatures? Also, he is lying, since the only way you will see 85C on the VRMs of a 290 is if it is idling (on air cooling).

The second article contains the following title: "Do AMD graphics cards really suck?" and "Should I be worried?"

Sorry but I don't like reading shitposts.
 

NKato

Grand Admiral
Apr 25, 2014
1,202
1,207
960
RSI Handle
NKato
I had expected AMD to have 12_1 support for their latest.

No matter how much their VP says it doesn't matter, he doesn't realize the reason so many people want 12_1: product longevity.

Planned obsolescence is bullshit.

With that said, if 12_1 turns out to be nVidia exclusive, there will be such a shitstorm...
 

Krystal LeChuck

Meme Meister
Staff member
Officer
Jun 10, 2014
594
888
1,420
RSI Handle
Krystal
I had expected AMD to have 12_1 support for their latest.

No matter how much their VP says it doesn't matter, he doesn't realize the reason so many people want 12_1: product longevity.

Planned obsolescence is bullshit.

With that said, if 12_1 turns out to be nVidia exclusive, there will be such a shitstorm...
While it's true that Maxwell is the only GPU architecture that supports DirectX feature level 12_1, AMD is the only company offering full Tier 3 resource binding and asynchronous shaders for simultaneous graphics and compute. That doesn't mean AMD or Nvidia is better or worse at DX implementation; it means that certain features and capabilities of various cards are imperfectly captured by feature levels, and that calling one GPU or another "full" DX12 misses this distinction. Intel, for example, offers ROV at the 11_1 feature level, something neither AMD nor Nvidia can match.



Here you can find an explanation of all those terms and features: https://msdn.microsoft.com/en-us/library/windows/desktop/dn899119(v=vs.85).aspx
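
If anyone wants to check where their own card lands, here's a minimal C++ sketch (assuming the Windows 10 SDK headers; adapter selection and error handling trimmed) that queries those same capabilities through ID3D12Device::CheckFeatureSupport, the API that MSDN page documents:

    #include <windows.h>
    #include <d3d12.h>
    #include <cstdio>
    #pragma comment(lib, "d3d12.lib")

    int main() {
        // Create a device on the default adapter at the 11_0 baseline.
        ID3D12Device* dev = nullptr;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&dev))))
            return 1;

        // Resource binding tier (GCN reports tier 3) and ROV support.
        D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
        dev->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));
        std::printf("Resource binding tier: %d\n", opts.ResourceBindingTier);
        std::printf("ROVs supported:        %s\n", opts.ROVsSupported ? "yes" : "no");

        // Highest feature level supported out of the ones we ask about.
        D3D_FEATURE_LEVEL asked[] = { D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1 };
        D3D12_FEATURE_DATA_FEATURE_LEVELS fl = { 2, asked };
        dev->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &fl, sizeof(fl));
        std::printf("Feature level 12_1:    %s\n",
                    fl.MaxSupportedFeatureLevel >= D3D_FEATURE_LEVEL_12_1 ? "yes" : "no");

        dev->Release();
    }

That is the point being made above: a card can report feature level 12_0 and still have the top resource binding tier, so a single "supports DX12" label hides more than it tells.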
 

NKato

Grand Admiral
Apr 25, 2014
1,202
1,207
960
RSI Handle
NKato
While it's true that Maxwell is the only GPU architecture that supports DirectX feature level 12_1, AMD is the only company offering full Tier 3 resource binding and asynchronous shaders for simultaneous graphics and compute. That doesn't mean AMD or Nvidia is better or worse at DX implementation; it means that certain features and capabilities of various cards are imperfectly captured by feature levels, and that calling one GPU or another "full" DX12 misses this distinction. Intel, for example, offers ROV at the 11_1 feature level, something neither AMD nor Nvidia can match.



Here you can find an explanation of all those terms and features: https://msdn.microsoft.com/en-us/library/windows/desktop/dn899119(v=vs.85).aspx
Interesting. Here's hoping the next gen of GPUs will properly take advantage of DX12.
 

Yex

Space Marshal
Mar 15, 2015
315
490
2,350
RSI Handle
Yex
Next year's GPUs are gonna be a giant step up from this year's, from what I'm hearing.
Nah.
That's just hype from misinformation given by Nvidia. I watched the keynote on Pascal; it was stockholder bullshit.
Look at Skylake: it's the usual % increase per year regardless of the node size.
Except this year we got a bargain with the cut-down Titan X.
New HBM2 will be an expensive premium.
 

NKato

Grand Admiral
Apr 25, 2014
1,202
1,207
960
RSI Handle
NKato
Nah.
That's just hype from misinformation given by Nvidia. I watched the keynote on Pascal; it was stockholder bullshit.
Look at Skylake: it's the usual % increase per year regardless of the node size.
Except this year we got a bargain with the cut-down Titan X.
New HBM2 will be an expensive premium.
In other words, this gen of GPUs was mainly experimental.
 

Horizonz

Space Marshal
Jan 14, 2015
113
37
2,350
RSI Handle
Horizonz
HBM is still experimental; it was a test run by AMD, who just happened to be the first company with it, while Nvidia wanted to wait until HBM2.

But every year we get better and better graphics cards; next year's will be even better than what we have now, and so on.
 

Yex

Space Marshal
Mar 15, 2015
315
490
2,350
RSI Handle
Yex
HBM is still experimental; it was a test run by AMD, who just happened to be the first company with it, while Nvidia wanted to wait until HBM2.

But every year we get better and better graphics cards; next year's will be even better than what we have now, and so on.
HBM was developed heavily by AMD.
Nvidia had their own version of HBM that didn't work out, so they are going to use AMD's version instead.
HBM2 from Hynix is supposed to be reserved for AMD as a thank-you to them. Except Samsung literally just announced they'll make HBM2 as well, so Nvidia might buy it off them instead.

Who knows; as it stands, AMD will have HBM2 well before Nvidia, which could tip the advantage. All companies, even Intel, are banking on HBM, so I would say it's well tested and not experimental.
It now depends on Samsung to see how fast we all get it.

http://wccftech.com/amd-working-hynix-development-highbandwidth-3d-stacked-memory/
 

NKato

Grand Admiral
Apr 25, 2014
1,202
1,207
960
RSI Handle
NKato
HBM was developed heavily by AMD.
Nvidia had their own version of HBM that didn't work out, so they are going to use AMD's version instead.
HBM2 from Hynix is supposed to be reserved for AMD as a thank-you to them. Except Samsung literally just announced they'll make HBM2 as well, so Nvidia might buy it off them instead.

Who knows; as it stands, AMD will have HBM2 well before Nvidia, which could tip the advantage. All companies, even Intel, are banking on HBM, so I would say it's well tested and not experimental.
It now depends on Samsung to see how fast we all get it.

http://wccftech.com/amd-working-hynix-development-highbandwidth-3d-stacked-memory/
Yet another example of AMD leading the charge while everybody else just follows and profits off their larger market shares.

Now do you see why I am perfectly content with calling Nvidia a bunch of anti-competitive hacks? Had they been the only GPU game in town, you can be sure we'd be seeing $1,200 980 Ti-class cards launch five years from now instead of sooner. And they'd be perfectly content foisting their outdated technologies onto game developers, causing years of stagnation. In fact, if AMD pulls out of the GPU market, I expect PC gaming to die within two decades unless a viable competitor appears.

Monopolies are not beneficial, period.
 

Yex

Space Marshal
Mar 15, 2015
315
490
2,350
RSI Handle
Yex
Yet another example of AMD leading the charge while everybody else just follows and profits off their larger market shares.

Now do you see why I am perfectly content with calling Nvidia a bunch of anti-competitive hacks? Had they been the only GPU game in town, you can be sure we'd be seeing $1,200 980 Ti-class cards launch five years from now instead of sooner. And they'd be perfectly content foisting their outdated technologies onto game developers, causing years of stagnation. In fact, if AMD pulls out of the GPU market, I expect PC gaming to die within two decades unless a viable competitor appears.

Monopolies are not beneficial, period.
True, but the Nvidia 980 Ti was better, so ^_^
Business practices aside, they brought the benchmark scores.
And yerp, AMD is going down the tubes.
 

Arrangingstars

Space Marshal
Nov 21, 2014
779
1,048
2,500
RSI Handle
Arrangingstars
For whoever said Nvidia cards don't last: I had my Nvidia GeForce GTX 250 for 7 years, and it ran fine up until this year; I never really needed to upgrade until now, and the card still works great. My dad bought my old computer from me, and it's running like a champ with the same GTX 250 in it. Sure, I couldn't play everything on high settings, but I could play everything up until late last year. Now I have a GTX 980.
 

rogesh

Space Marshal
Oct 25, 2014
624
1,178
2,500
RSI Handle
rogesh
Well, I had 2 Nvidia cards break and none of my AMDs. Idk if I just got lucky with those AMD cards, but since the last GeForce broke I prefer AMD, and they have never let me down.
 