re: PC Discussion - Gaming, Performance and Enthusiasts

Posted by Devious
Elitist
Member since Dec 2010
29422 posts
Posted on 3/13/15 at 1:13 pm to
Just get a GTX 970 and the 630W, but you could go lower on the PSU.
Posted by taylork37
Member since Mar 2010
15803 posts
Posted on 3/13/15 at 1:23 pm to
I already have the GPU ordered, but I am trying to size the PSU. I basically don't want to pay for a 750W PSU if I don't need to.
Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 3/13/15 at 2:28 pm to
Recommendations are based on the fact that the card can draw up to 300W by itself depending on voltage, overclocks, and total load. 600W would be pushing it under full CPU/GPU load but could still handle it. 650W+ is safer. This EVGA 750W is $50 after rebate: LINK
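Just to put rough numbers on that, here's a back-of-the-envelope sketch. The wattage figures besides the 300W GPU estimate are assumptions for illustration, not measurements of any particular build:

```python
# Rough PSU sizing sketch. The wattage figures are illustrative assumptions,
# not measured values for a specific build.
gpu_peak_w = 300        # worst-case GPU draw with overclock/voltage bumps, per the estimate above
cpu_peak_w = 140        # overclocked quad-core under full load (assumed)
rest_of_system_w = 75   # motherboard, RAM, drives, fans, USB devices (assumed)

total_load_w = gpu_peak_w + cpu_peak_w + rest_of_system_w
headroom = 1.25         # ~25% margin keeps the PSU in its efficient, quiet range

print(f"Estimated peak draw: {total_load_w} W")                  # ~515 W
print(f"Suggested PSU rating: {total_load_w * headroom:.0f} W")  # ~644 W
```

That's roughly why 600W is cutting it close under full CPU+GPU load and 650W+ is the safer call.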

Posted by taylork37
Member since Mar 2010
15803 posts
Posted on 3/13/15 at 2:32 pm to
Awesome. Thanks.
Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 3/14/15 at 7:44 pm to
OK, long geeky post to spark some discussion.

If these latest benchmarks have any truth to them, I may sell my 980s. But what I'll buy instead, I'm not sure.

AMD R9 390X, Nvidia GTX 980 Ti and Titan X Benchmarks Leaked


For reference, the "GM200 cut" on the chart is probably a 980 Ti, and the Fiji XT is obviously the R9-390X.



More performance charts in the link, but this one is a good enough summary for 4K. What we see here is a 390X soundly beating a 980, as expected. It also edges out the Titan X by a couple percentage points. Not super surprising, since the Titan has been kind of an oddball since its first release, as a novelty-priced gaming/workstation hybrid that isn't necessarily competing with anything else.

The problem here, assuming these benchmarks are real, is that the performance factor is more than likely measured by average frame rate. But what I want to know are frame times. Any one of those top 3 GPUs is going to be a flawless 1440P renderer, but moving beyond that, memory bandwidth and/or capacity become much more important. In another chart, they show performance comparisons at 2560x1600, and the disparity between the 4GB 980 and the 4GB 390X is smaller, indicating that memory bandwidth is playing a role at 4K. Likewise, the 980 and 980 Ti have a smaller performance difference at the lower resolution, again pointing to memory (the 980 Ti has higher memory bandwidth and more memory). The 390X and Titan X stay around the same marginal difference, suggesting that, on the surface, neither card hits a vram bottleneck at 4K.
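To illustrate what an average frame rate can hide, here's a toy sketch. The frame times are made-up numbers purely for illustration, not from any real benchmark:

```python
# Two runs with nearly the same average FPS but very different consistency.
# All frame times are made-up numbers for illustration.
smooth = [16.7] * 60                    # steady ~60 fps
stutter = [12.0] * 57 + [100.0] * 3     # mostly fast frames plus a few long hitches

def avg_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def p99_frame_time(frame_times_ms):
    ordered = sorted(frame_times_ms)
    return ordered[int(0.99 * (len(ordered) - 1))]

for name, run in [("smooth", smooth), ("stutter", stutter)]:
    print(f"{name}: {avg_fps(run):.1f} avg fps, "
          f"99th-percentile frame time {p99_frame_time(run):.1f} ms")
# Both runs report ~60 avg fps, but the second spends a few frames at 100 ms:
# exactly the hitching that an average-frame-rate chart never shows.
```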

So, say the Fiji can process frames faster than any other GPU. The memory bandwidth is obviously more than enough to handle the cookie-cutter scenes that were likely used to get consistent measurements. But the capacity is still in question. If I'm speeding down a dirt road in Far Cry 4 rendered at 4K, GPU usage in the 80%+ range, and I make a sudden sharp turn that reveals a wide view of previously unrendered landscape/foliage, how quickly can Fiji ramp up and make room in vram to buffer these frames? Will I see hitching, or ugly frame drops for a second or two? These are real issues that aren't captured in average frame-rate comparisons. Likewise, with thrice as much vram, is the full GM200 going to handle that scenario more consistently? Or is the memory bandwidth (roughly half of the 390X's) going to be the bottleneck? (Though that last question is probably more appropriate for the 6GB 980 Ti).

The question is even more important in crossfire/SLI setups, at least until/unless we really do get stacked vram support in games that need it when we get DX12. A single 980 is a great 1440P card. SLI 980s add the processing power for that extra eye candy, but what they don't do well is handle 4K textures. Average frame rate is fine, but in games like Shadow of Mordor at 4K, there's an obvious vram bottleneck. I might see great scaling at 90+% usage on each GPU, 3600MB allocated vram, 60-ish fps, then I turn around quickly, and vram allocation goes to 4000-ish MB, GPU usage drops to about 80%, with a brief frame rate drop. It was even more pronounced when I was running crossfire R9-290s. But without being able to do my own tests with similar GPUs, it's unclear whether the capacity or bandwidth is more important here. Review sites do a poor job with these types of tests ("derp, we put 8xMSAA on 8K renders and measured frame rates! Look, this one did 4fps and that one only did 2fps! 100% performance increase!").

So anyway, Coyote, when you get your Titans, I expect full frame latency and stutter reports.
Posted by Devious
Elitist
Member since Dec 2010
29422 posts
Posted on 3/14/15 at 7:46 pm to
If you want to sell a 980, let's talk.

Eta: that's all I got out of that long arse post. I'll go read it again.
This post was edited on 3/14/15 at 7:47 pm
Posted by UltimateHog
Thailand
Member since Dec 2011
69468 posts
Posted on 3/14/15 at 7:49 pm to
I just want my FreeSync monitor already
Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 3/14/15 at 7:49 pm to
Wouldn't be until June at the earliest, and as always, I try to sell them with the blocks first for convenience.

It would be nice if the 980 Ti PCB were compatible with the 980 water block so they'd be easier to sell separately, but nope. More vram, different chip.
Posted by Devious
Elitist
Member since Dec 2010
29422 posts
Posted on 3/14/15 at 7:52 pm to
So I just read the link. If that pricing speculation is close to accurate, I'm on board with a 390x.
Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 3/14/15 at 8:25 pm to
Yep, most likely. I have to consider the cost of SLI/crossfire, because I don't want to have a lonely GPU in this gigantic case (OCD). Two Titans are out of the question, and I'm guessing the 980 Ti will be $700, so unless the extra 2GB of vram makes them perfect SLI cards for 4K, I probably won't spring for those either. My decision also depends on Freesync's success, so I may just wait until the end of the year.
Posted by DoUrden
UnderDark
Member since Oct 2011
26158 posts
Posted on 3/14/15 at 8:30 pm to
I'm gonna live with my faulty 970s and ignore all this talk.
Posted by rompus
Kentucky
Member since Jan 2010
608 posts
Posted on 3/14/15 at 9:24 pm to
Here's what it boils down to for me and why I bought my 980 a couple of months ago to replace an aging 560.
Every time I have strayed away from nVidia, I was not happy. At times ATI/AMD has been the price-to-performance leader, but you have to put up with their shitty drivers and the fact that a lot of games seem to favor/optimize themselves for nVidia hardware.

When I bought the 560, it was to replace a 5770, which is a lateral move at best performance-wise, but I was tired of all the damn crashes from the AMD card. Never mind that it never really performed as well as it should have.

As far as Freesync, I hope it pans out and I hope at some point nVidia supports it too.
I have the Acer XB270H monitor and GSync is fricking awesome.
My plan for now is to stick with my current setup until 4K settles in.

As far as SLI/Crossfire, maybe it's me, but it seems like most SLI users are always complaining that this game or that game doesn't support SLI, performance isn't up to par, causes crashes or whatnot. It just seems like it is never really worth it other than to show people, "hey, I can score ******* on XXXXXX benchmark".

Posted by DoUrden
UnderDark
Member since Oct 2011
26158 posts
Posted on 3/14/15 at 9:27 pm to
I had a rough time with xfire on my 7950s; I had to stay at least two updates behind or the cards wouldn't work. With my SLI 970s I haven't had that problem, but I think one game (maybe DA:I) would disable SLI.
Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 3/14/15 at 10:23 pm to
quote:

I was tired of all the damn crashes from the AMD card


Never understood the stability complaints about AMD cards. They definitely have driver quirks, but the worst part was always that they took forever to release optimized drivers for new games. Other than being quicker to the punch, I do not find nvidia drivers to be any better (read: more stable) than AMD's WHQL releases. However, SLI, on the whole, is better than Crossfire.

quote:

a lot of games seem to favor/optimize themselves for nVidia hardware.

It's sad and irritating. NVidia's success stems from its relentless efforts to force fragmentation in the gaming development world.

quote:

As far as Freesync, I hope it pans out and I hope at some point nVidia supports it too.

To clarify, since it's not exactly spelled out by all the various review sites: Nvidia will never support freesync, since that's AMD's specific implementation of adaptive sync. The question is, if adaptive sync is just as good as gsync, will Nvidia drop the sham and stop charging vendors $200+ for their mysterious proprietary gsync chip?

quote:

As far as SLI/Crossfire, maybe it's me, but it seems like most SLI users are always complaining that this game or that game doesn't support SLI, performance isn't up to par, causes crashes or whatnot. It just seems like it is never really worth it other than to show people, "hey, I can score ******* on XXXXXX benchmark".


Yeah, a lot of it is e-peen, but it is also the most cost-effective way of upgrading or of surpassing the highest-end single-GPU solutions. It's supported in more games than not these days, and it can be forced through the control panel or third-party programs in games that don't support it (with varying results). Usually, if you're SLI/Crossfiring high-end GPUs, games that don't support it don't even need it. It is not necessarily a noob-friendly setup in terms of set-it-and-forget-it, but it has plenty of perks and can scale quite well, to the point of nearly double performance. It's only recently, as 4K becomes the new holy grail of gaming, that SLI/Crossfire becomes problematic because of the memory bandwidth/capacity bottleneck. Two years ago, 2GB was just fine, even for a lot of 1440P games. It's a 77%-ish increase in pixels from 1080P to 1440P, but then the jump from 1440P to 4K is another 125%-ish increase (quick arithmetic below), so 4GB of vram (at least at the bandwidth in existing GPUs) is just not sufficient no matter how many GPUs you throw at it. If stacked vram support becomes a reality in AAA titles w/ DX12, that will be a game changer.
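A minimal sketch of the pixel math, nothing more:

```python
# Pixel counts behind the resolution jumps mentioned above.
resolutions = {
    "1080P": (1920, 1080),
    "1440P": (2560, 1440),
    "4K":    (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(f"1080P -> 1440P: {pixels['1440P'] / pixels['1080P'] - 1:.0%} more pixels")  # ~78%
print(f"1440P -> 4K:    {pixels['4K'] / pixels['1440P'] - 1:.0%} more pixels")     # 125%
```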
This post was edited on 3/14/15 at 10:25 pm
Posted by Mr Gardoki
AL
Member since Apr 2010
27652 posts
Posted on 3/14/15 at 10:45 pm to
Having gone from a 7870 to a 770, I didn't find either card more stable. The only thing I really like more about nvidia is that the experience panel they have does a good job of notifying you about driver updates. Unless it has changed, I don't think AMD has anything like that.
Posted by DoUrden
UnderDark
Member since Oct 2011
26158 posts
Posted on 3/14/15 at 10:48 pm to
I liked AMD's control panel more; it gave you more options when OC'ing, since I am not as knowledgeable as some on here.
Posted by UltimateHog
Thailand
Member since Dec 2011
69468 posts
Posted on 3/14/15 at 10:51 pm to
quote:

The only thing I really like more about nvidia is that the experience panel they have does a good job of notifying you about driver updates. Unless it has changed, I don't think AMD has anything like that.


They definitely do.

quote:

I liked AMD's control panel more; it gave you more options when OC'ing, since I am not as knowledgeable as some on here.



Same, much more appealing too.
Posted by rompus
Kentucky
Member since Jan 2010
608 posts
Posted on 3/14/15 at 11:17 pm to
quote:

Never understood the stability complaints about AMD cards


True. It is not the cards, it is the crap drivers. Nothing more frustrating than being excited about a new AAA game on release day and not being able to play it because AMD can't get their shite together. Hell, lately, nVidia updates with "game ready" drivers a day or two before release.


quote:

$200+ for their mysterious proprietary gsync chip


So, I understand it's BS that nVidia is doing this. I understand that we should be able to buy these monitors cheaper, but that being said... it's worth every damn cent to me. Gsync is the single best upgrade I have ever done in over 20+ years of PC gaming. So, so smooth. I would do it again in a heartbeat. Yeah, I feel like nVidia's little bitch... dammit.


quote:

If stacked vram support becomes a reality in AAA titles w/ DX12, that will be a game changer.


We all hope this is the case. Maybe for once Microsoft will come through on their promises to PC gamers.
Here's to hoping DX12 is all it is stacked up to be.

Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 3/15/15 at 12:10 am to
quote:

not being able to play it because AMD can't get their shite together.


Again, this has literally never happened to me. Nvidia is faster to release drivers that increase performance in specific games, but save for crossfire support, I've never had a game that was unplayable/nonfunctional or otherwise significantly crippled due to lack of driver support.

quote:

Gsync is the single best upgrade I have ever done in over 20+ years of PC gaming.


I agree. I've seen it in person, and it's amazing, and without any context of the technology's real cost of implementation, I would've gladly forked over $800 for the ROG Swift. But when you step back and realize that NVIDIA knew about the new VESA standard (1.2a) and instead opted to develop its own proprietary chip for the sole purpose of: a) being first, and b) exclusivity to the point of preventing its own customers from benefitting from an open standard that's more financially accessible, it's just NVIDIA being NVIDIA. It wasn't about innovation. They priced it out of the mainstream to make sure one of the greatest advancements in PC gaming was only available on half a dozen monitors, all priced well over $400. A whole year of being the only game in town, and they all but sat on the technology. Variable refresh panels, in theory, could be controlled with driver implementation, and it's possible (rumored, but not proven either way) that the $200 gsync chip is little more than a hardware identifier (essentially, DRM to ensure you're using an nvidia card). It was uncovered that nvidia drivers do check for a gsync chip before that feature is activated (though that doesn't necessarily mean anything underhanded).

I thought it was a dumb move not to make the cost of entry more reasonable, but the more I think about it, it's a brilliant business move. Hardware enthusiasts don't upgrade monitors as often as GPUs. As much as I wanted that ROG Swift and would've been willing to buy it for $800, it's still a big purchase for me, and it would've dictated my GPU purchases for quite a while. Which is the whole point of that $200 g-sync chip, physx, the shield tablet, etc. Most of the extra money you pay for NVIDIA cards doesn't go to R&D. It goes to marketing: all of the time, effort, and dollars spent to ensure fragmentation and customer ignorance. Sure, it's just business, and they're successful, and admittedly I do enjoy some of the features and the way they are implemented, like ShadowPlay and DSR, and the extra eye candy that's deliberately designed to make AMD cards shite the bed. But it does more harm than good to the gaming community. Just like Apple: a successful company that has totally fricked up the tech community's definition of the word innovation.
This post was edited on 3/15/15 at 12:14 am
Posted by UltimateHog
Thailand
Member since Dec 2011
69468 posts
Posted on 3/15/15 at 3:16 am to
Have they confirmed that the 390X is going to be factory liquid cooled? I remember that was a constant in all the rumors, but just how good will it be? Better than air, I'm sure.