Some Troubling News from Nvidia

SeungRyul

Spreader of Truth / Master of Hamsters
Staff member
Donor
Oct 30, 2013
2,341
5,156
2,930
RSI Handle
Citizen404

zeddie

Space Marshal
Jan 22, 2014
628
752
2,430
RSI Handle
Test-Dummy
Happily underperforming with my AMD HD7970...

Been with AMD for quite a few generations now: HD3850, HD4850, HD5850, and now HD7970. I'll probably buy an RX 300 series or even RX 400 series card by the time Star Citizen comes out.

Need a new monitor first, though.
 

thanatos73

Space Marshal
Nov 21, 2014
1,376
1,412
2,510
RSI Handle
thanatos1973
Happily underperforming with my AMD HD7970...

Been with AMD for quite a few generations now: HD3850, HD4850, HD5850, and now HD7970. I'll probably buy an RX 300 series or even RX 400 series card by the time Star Citizen comes out.

Need a new monitor first, though.
Me too, or I'll find a way to commandeer the TV again, 'cause I'm stuck at 1600x900 on my current monitor. I am really happy with my R9 270X, though.
 

o-BHG-o

Space Marshal
Mar 25, 2014
559
448
2,320
RSI Handle
o-BHG-o
Going home early today to try to get my HD7970s working in CrossFire. I am getting black horizontal scan lines. Maybe it's the power settings, or the screen refresh rate, or the bridge, or overscan settings, or just s**t syncing and s**t outa luck.


Edit: S'all good man!

CrossFire fixed, plus a general FPS increase on a single GPU, yay!
 
Last edited:
Jul 6, 2014
754
894
550
RSI Handle
Salt_Lake
This is happening with Witcher 3's HairWorks and some of Nvidia's other GameWorks titles. GameWorks is starting to use heavy tessellation, which AMD doesn't perform so well at. The 700 series performs well, but the 900 series is much better at tessellation; I believe the quoted number was 3 times. So the 900 series will perform better with heavy tessellation use, which would give it an advantage over all other cards until they catch up on tessellation.

Now there is supposed to be a new driver for the 700 series to help with Witcher 3, but that has yet to be seen.

I am also not going to get into the whole GameWorks conspiracy that started a shitstorm on the internet. I will say that Nvidia may cripple AMD through legal means, by taking advantage of AMD's weak points. Which AMD could do as well.
 

SeungRyul

Spreader of Truth / Master of Hamsters
Staff member
Donor
Oct 30, 2013
2,341
5,156
2,930
RSI Handle
Citizen404
I am also not going to get into the whole GameWorks conspiracy that started a shitstorm on the internet. I will say that Nvidia may cripple AMD through legal means, by taking advantage of AMD's weak points. Which AMD could do as well.
That would be troubling, since it's essentially us getting fucked over no matter which company you choose. While GameWorks may not have been an underhanded plot to destroy AMD, it is in the end heavily skewing the playing field and discouraging proper competition.
 
Jul 6, 2014
754
894
550
RSI Handle
Salt_Lake
That would be troubling, since it's essentially us getting fucked over no matter which company you choose. While GameWorks may not have been an underhanded plot to destroy AMD, it is in the end heavily skewing the playing field and discouraging proper competition.
This is very true. This is where the devs need to let us control the amount of tessellation, or reduce it so it doesn't have as much impact as it did. There is a point of diminishing returns, which I think Witcher 3 overstepped. Someone dug up that old EVE video which shows that tessellation can be controlled, and it also shows one more thing: EVE, despite having the technology to do heavy tessellation, decided not to, because they knew it would be too much to handle. So again, the dev needs to be reasonable with its technologies and tools. You could make a game look extremely realistic, but if only 4x Titan X can run it at 30 FPS at 1080p, is it worth making?

Edit: if a dev does want to make a high-end game like SC does, it needs to say outright that it will be intense and your 780 won't handle it. If you expect to run it on a 6970, well, SC is going to Ramsay Snow/Bolton your card until its new name is Sansa Stark.
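[Editor's note] The diminishing-returns point above can be put in rough numbers. As an illustrative sketch (not from the thread): with integer partitioning, a quad patch at edge tessellation factor N yields on the order of 2·N² triangles, so raising the factor from 8x to 64x multiplies the triangle count by 64 while the visible gain tapers off.

```python
# Rough triangle count for a quad patch at tessellation factor N
# (~2 * N^2 with integer partitioning; exact counts vary by scheme).
def quad_patch_triangles(factor: int) -> int:
    return 2 * factor * factor

for f in (4, 8, 16, 64):
    print(f"{f}x -> {quad_patch_triangles(f)} triangles")
```

At 64x a single patch costs 64 times the triangles of the same patch at 8x, which is why an in-game slider, or the driver-level tessellation cap AMD's control panel exposes, makes such a difference.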
 
Last edited:

AntiSqueaker

Space Marshal
Apr 23, 2014
2,157
5,559
2,920
RSI Handle
Anti-Squeaker
Nvidia has a long history of being anti-competitive. Like making HairWorks closed so AMD can't release drivers that make it work right on their cards (whereas TressFX is open).

Knowing about the 3.5GB 970 issue and just trying to ignore it until people called them out on it (and then a bunch of people "upgrading" to a 980; fanboyism at its finest).

And a ton of Nvidia games (including The Witcher 3) are behind closed doors to AMD devs, so no wonder they run "better" out of the gate on Nvidia hardware: the AMD devs have to start drivers from scratch, whereas Nvidia has a HUGE head start.

Nvidia has a 70% market share, and they want to keep it that way.


With that being said, I do have a 660. But please be educated about companies and their business practices before you decide to spend $300+ on a GPU.

Edit: not saying that AMD is the perfect poster child here; they do some crappy stuff too. But I see a lot of pro-Nvidia sentiment here, and tons elsewhere, and I want to inform people about some of the scummy business practices they engage in. Nvidia makes good cards and bad decisions.
 
Last edited:

Krystal LeChuck

Meme Meister
Staff member
Officer
Jun 10, 2014
594
888
1,420
RSI Handle
Krystal
I'm just waiting for the DX12 release. There will be lots of LOLZ going around with the insane performance leap on both AMD and Nvidia hardware, but I have a very strong feeling that AMD is going to pull some crazy results, seeing how close the DX12 architecture is to Mantle.
 

EpilepticCricket

Space Marshal
Oct 20, 2014
1,403
4,905
2,160
RSI Handle
EpilepticCricket
Yeah, DX12 is a big part of why I'm holding off on rebuilding my (now dead) gaming rig. Preliminary numbers show monstrous performance jumps on existing hardware even when the cards don't fully support DX12. This write-up from pcworld.com shows just how big of a jump we're making with it. Hopefully AMD will have a strong contender that really blows the current Nvidia offerings away, but I'm not up to date enough on next-gen GPUs to speculate on that yet.

Edit: I am aware that DX12 focuses more on improving CPU performance than GPU performance.
 
Last edited:
Jul 6, 2014
754
894
550
RSI Handle
Salt_Lake
I'm just waiting for the DX12 release. There will be lots of LOLZ going around with the insane performance leap on both AMD and Nvidia hardware, but I have a very strong feeling that AMD is going to pull some crazy results, seeing how close the DX12 architecture is to Mantle.
This is a little wrong. DX12 was in development before Mantle, according to Microsoft; Tamasi (?) stated it would have taken longer to develop DX12 than Mantle. DX12 is also CPU-focused, about using the GPU better, rather than GPU-focused. I don't see it favoring AMD or Nvidia unless one is stronger than the other. Though HBM, which will probably help with talking to the CPU and giving it all the draws it asks for faster, with no bottleneck, will probably be a huge advantage.
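[Editor's note] The "CPU-focused" point can be illustrated with back-of-the-envelope arithmetic. The per-draw overheads below are made-up placeholders, not measured numbers: the sketch only shows that lowering the CPU cost of submitting one draw call raises the number of draws that fit in a 60 FPS frame budget roughly proportionally.

```python
# How many draw calls fit in one frame's CPU budget at 60 FPS.
# The per-draw overheads passed in below are illustrative placeholders.
FRAME_BUDGET_MS = 1000.0 / 60.0  # ~16.7 ms per frame

def max_draw_calls(per_draw_overhead_ms: float) -> int:
    return int(FRAME_BUDGET_MS / per_draw_overhead_ms)

print(max_draw_calls(0.01))   # heavier per-draw driver overhead
print(max_draw_calls(0.001))  # a tenth of the overhead -> ~10x the draws
```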
 

NKato

Grand Admiral
Apr 25, 2014
1,202
1,207
960
RSI Handle
NKato
http://www.reddit.com/r/pcmasterrace/comments/376jqk/kepler_performances_crippling_controversy_nvidia/

Essentially it looks like Nvidia is crippling the performance of the Kepler cards to make the current 900-series Maxwell cards look much more attractive. Much more testing needs to be done at this point, but the public consensus is that the Kepler cards are underperforming significantly.

Hopefully this is just a fluke and the issue is rectified soon :)
Oh myyyyyyyy. </George Takei>

Nvidia has been known to engage in these kinds of practices. I mean, ever since they doctored benchmark results back in 2002, I've been skeptical of the company's honesty and integrity.
 

Star Pilgrim

Rear Admiral
Feb 24, 2015
421
145
310
RSI Handle
Star_Pilgrim
http://www.reddit.com/r/pcmasterrace/comments/376jqk/kepler_performances_crippling_controversy_nvidia/

Essentially it looks like Nvidia is crippling the performance of the Kepler cards to make the current 900-series Maxwell cards look much more attractive. Much more testing needs to be done at this point, but the public consensus is that the Kepler cards are underperforming significantly.

Hopefully this is just a fluke and the issue is rectified soon :)
Same old practices they have always used throughout history.

So, nothing new then?
 

Star Pilgrim

Rear Admiral
Feb 24, 2015
421
145
310
RSI Handle
Star_Pilgrim
This is a little wrong. DX12 was in development before Mantle, according to Microsoft; Tamasi (?) stated it would have taken longer to develop DX12 than Mantle. DX12 is also CPU-focused, about using the GPU better, rather than GPU-focused. I don't see it favoring AMD or Nvidia unless one is stronger than the other. Though HBM, which will probably help with talking to the CPU and giving it all the draws it asks for faster, with no bottleneck, will probably be a huge advantage.
Naturally.
MS was continuously developing DX12.
But WITHOUT the low-level GPU improvements AMD came out with in Mantle.

DX12 was supposed to bring some improvements and come out with Windows 8.
After AMD released Mantle to the tech circles, AMD nudged MS in this direction because of their Xbox console deal. If MS was ever to beat PlayStation, it needed an edge. And you know that AMD's GPUs run in both the Xbox and the PlayStation, right?
AMD wanted to get this out to the masses, and MS bit.
They have done the same with FreeSync becoming part of the standard, since Ncrapia went proprietary (like they always do) with their G-Sync.

Nvidia lives for royalties, secret deals, and controlling the market any way they can.

Look back and put it in perspective.
AMD has consistently come out with new tech that others eventually adopted and put a twist on.
And they have consistently offered the BEST cards for the price/performance ratio, since forever.
 
Jul 6, 2014
754
894
550
RSI Handle
Salt_Lake
Naturally.
MS was continuously developing DX12.
But WITHOUT the low-level GPU improvements AMD came out with in Mantle.

DX12 was supposed to bring some improvements and come out with Windows 8.
After AMD released Mantle to the tech circles, AMD nudged MS in this direction because of their Xbox console deal. If MS was ever to beat PlayStation, it needed an edge. And you know that AMD's GPUs run in both the Xbox and the PlayStation, right?
AMD wanted to get this out to the masses, and MS bit.
They have done the same with FreeSync becoming part of the standard, since Ncrapia went proprietary (like they always do) with their G-Sync.

Nvidia lives for royalties, secret deals, and controlling the market any way they can.

Look back and put it in perspective.
AMD has consistently come out with new tech that others eventually adopted and put a twist on.
And they have consistently offered the BEST cards for the price/performance ratio, since forever.

Nvidia's G-Sync and the VESA Adaptive-Sync work in different ways; G-Sync is more flexible with its frame rate ranges. There is not enough of a margin for 99% of people to notice, though. It was Nvidia that pushed AMD to get FreeSync up and running. HBM has also been a known quantity for the Pascal cards since 2014, and those are still planned to use a newer, faster version of HBM. Both AMD and Nvidia push technology forward. Do you honestly think that if Nvidia quit the business right now, AMD would still be as innovative as they are? You should be saying no, because they would focus their research on CPU or enterprise technology. This is exactly what Nvidia did: they thought AMD was down for the count and focused on enterprise areas such as automobiles and SpaceX.

As for performance per price, that's debatable. For me, where I pay 60 cents USD per kWh, I'd rather run a lower-wattage GPU than a higher one. If Fiji is really 200W then I am going to lean that way; if it's 300W+ I will lean toward Nvidia, especially since my next build will have 4 GPUs. I am actually rooting for AMD's Zen CPU and hope they have an enthusiast category that can contend with the 5960X: DDR, L3 cache, and hopefully 64 PCIe lanes so we can run 16x/16x/16x/16x CrossFire. Would be my next build, easily. If only they would let EVGA build GPUs for them; EVGA has been such a great company for me. I will never go back to Asus after my 290X came DOA, and I just don't trust many other companies, for various reasons.
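[Editor's note] At that electricity rate the wattage gap is real money. A sketch using the post's numbers ($0.60/kWh, a 200 W vs a 300 W card) plus an assumed 4 hours of gaming per day (the usage figure is a guess, not from the post):

```python
RATE_USD_PER_KWH = 0.60  # the rate quoted in the post

def yearly_power_cost(watts: float, hours_per_day: float = 4.0) -> float:
    """Yearly electricity cost of one GPU at the given draw (assumed usage)."""
    kwh_per_year = watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * RATE_USD_PER_KWH

gap = yearly_power_cost(300) - yearly_power_cost(200)
print(f"per card: ${gap:.2f}/year; for a 4-GPU build: ${4 * gap:.2f}/year")
```

Under these assumptions the 100 W difference is roughly $88 per card per year, so a 4-GPU build quadruples it.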
 

Star Pilgrim

Rear Admiral
Feb 24, 2015
421
145
310
RSI Handle
Star_Pilgrim
As for your answer about the "debacle":
it's not a debacle, it's a fact.

Wattage usage?
Who was the wattage hog all the time before that? Nvidia, of course.

Why do you think Bitcoin miners primarily use AMD over Nvidia?
Better performance per watt.

AMD slipped once in that regard, and every Nvidia fanboy marched to the forums and made it a "fact" that AMD is and always has been a power hog, while the facts show the opposite. They pinpointed one GPU, while the full Nvidia line had been the hog in every previous release.

AMD has improved over Nvidia again.
So the battle is starting again.
It is like a bi-yearly cycle these two go through, and each time the bashing from the fanboy camps starts.
The Nvidia ones are especially toxic; that is why I hate them with a passion.

Each time a new DirectX comes out, ATI is king (for at least one year).
Then Nvidia comes out with a card that beats it (but it is at least 30% pricier).

Future trends show that DirectCompute will be used more (AMD is king here),
along with tessellation (Nvidia is king here currently).

Both competitors know DX12 will be a game-changer, and they will optimize their GPUs to take advantage of any tech that can push FPS scores higher.
 
Last edited:

WarrenPeace

Space Marshal
Jul 17, 2014
4,209
8,451
2,920
RSI Handle
Shortspark
It is like a bi-yearly cycle these two go through, and each time the bashing from the fanboy camps starts.
The Nvidia ones are especially toxic; that is why I hate them with a passion.
Those fanboys may be saying the same about AMD enthusiasts. Returning that hate does nothing to reduce the toxicity of the debate. Better that they are ignored, if they are so irritating.
 