
re: UPDATE: New build: Monitor choice drive GPU choice or the other way around?

Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 8/10/15 at 2:56 pm to
quote:

This is my first new monitor purchase in 10 years - so - question is - should I decide on 144Hz, Ultrawide, 4k, etc., first, then get a card that drives that (as monitors tend to survive through 4 or 5 GPUs and up to 3 PCs at my place) - OR pick the best card for a budget, THEN pick the best monitor for that card?


I generally advise anyone building a gaming PC to choose their monitor first and let that guide their video card choice. Although some might decide they want 1440P and then realize an entry-level 1440P build is going to be $900+, so they back it down to 1080P. Either way you go about it (GPU first or monitor first), you know well enough that one should never be decided independent of the other.

quote:

And I'm not just thinking in terms of GSync/Freesync - but that is an example of how marrying to one brand or the other makes a big difference at this time.

As for other details - I wanted to do GPU and Monitor for about $1000. The dizzying array of choices compared to last time has me chasing my tail.


As you know, G-Sync and FreeSync do add cost. While it's a shame to do without it, you may consider forgoing adaptive sync if gaming isn't first priority. I say that because a $1000 budget for monitor and GPU basically means you have to go AMD/FreeSync if you want adaptive sync. Resolution should be a higher priority than refresh rate and adaptive sync anyway. At $1000, a decent 1440P 60Hz IPS monitor and a GTX 980 Ti would be my choice. Or an R9 Fury and a 1440P 144Hz FreeSync monitor. Either card would perform several times better than your 6850, and the 980 Ti is a bit faster than both the Fury and Fury X. But faster doesn't automatically mean it's worth it: the 980 Ti beats the Fury X at the same price, but adaptive sync is a few hundred extra on the Nvidia side, if that's a factor.

4K is great, too, but I wouldn't recommend it for gaming unless the monitor has adaptive sync. There aren't any single-GPU solutions on the market yet that can reliably handle 4K and maintain 60fps with high texture settings, so you'd be in the same boat as you are now with your 6850. Playing the next Elder Scrolls at 4K on any of today's cards would mean sub-60fps with tearing and stuttering; adaptive sync eliminates the tearing and stuttering. The great thing about 1440P is that it already looks really good at native resolution, and both AMD and Nvidia have a feature (VSR and DSR, respectively) that forces the GPU to render every frame at a higher resolution like 4K and downsample it to 1440P. The effect is nearly identical to 4K because it practically eliminates aliasing and renders both distant and nearby in-game textures at a higher resolution. You take the same framerate hit as 4K, though, so it's mostly useful in older games.
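To put a rough number on why downsampling carries the full 4K cost, here's a back-of-envelope pixel-count comparison (a sketch; real framerates also depend on shaders, memory bandwidth, etc.):

```python
# Back-of-envelope rendering cost by pixel count.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# Rendering at 4K and downsampling to 1440p still pays the full 4K cost:
ratio = pixels["4K"] / pixels["1440p"]
print(f"4K renders {ratio:.2f}x the pixels of 1440p")  # 2.25x
```

So the GPU pushes 2.25x the pixels of native 1440P, which is why the technique only makes sense where you have framerate to spare.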

quote:

So - Option "B" is to just go integrated now and wait on GPU and monitor upgrade for 2016.


As long as you don't plan on gaming (or can just game on the 6850 for now), that's not a bad option. I don't know what AMD's and Nvidia's timelines are for 2016, but they've just released their respective flagships, and it may be a year or more before we see another. Undoubtedly there will be more FreeSync/G-Sync monitors on the market by then, and maybe 4K prices will have come down a little (although they're not terrible right now if you want a TN panel).

quote:

How much of an upgrade will Intel 530 be over a 6850?


Do you mean Intel HD Graphics? The Intel 530 is an SSD. Anyway, Intel's integrated graphics would be a fairly significant downgrade from the 6850, not an upgrade. While iGPUs have come a long way, they still can't compete with even lower-end dedicated GPUs; memory bandwidth has a lot to do with it.
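To put rough numbers on that bandwidth gap (a sketch from published specs; note the iGPU also has to share system RAM with the CPU, so it gets even less in practice):

```python
# Peak memory bandwidth = transfer rate (MT/s) x bus width (bytes).
# Specs: HD 6850 uses 4 Gbps GDDR5 on a 256-bit bus; a Skylake iGPU
# leans on dual-channel DDR4-2133 (2 x 64-bit).

def bandwidth_gbs(transfers_mts, bus_width_bits):
    """Peak bandwidth in GB/s from transfer rate (MT/s) and bus width (bits)."""
    return transfers_mts * 1e6 * (bus_width_bits / 8) / 1e9

hd6850 = bandwidth_gbs(4000, 256)     # dedicated GDDR5
ddr4_dual = bandwidth_gbs(2133, 128)  # shared system DDR4

print(f"HD 6850:     {hd6850:.1f} GB/s")   # 128.0 GB/s
print(f"iGPU (DDR4): {ddr4_dual:.1f} GB/s")  # ~34.1 GB/s
```

Nearly a 4x gap in favor of a five-year-old midrange card, before the CPU takes its share.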

quote:

Will that impact how much system ram I will have to start with?

No, you shouldn't consider Intel HD Graphics a viable gaming alternative, and even so, system memory capacity wouldn't have much to do with it. Stick with 2 x 4GB or 2 x 8GB regardless.

quote:

As to the main topic - any strong recommendations for or against 144Hz, ultrawide OR 4k for general purpose computing in Q3 of 2015? I view the monitor as a 7 to 10 year investment and the card a 4 to 5, if that impacts your advice.


Here's the rundown:
144Hz: Looks terrific and smooth, most advantageous for competitive first-person shooter players. Right now your choices for 120+Hz are 1080P without adaptive sync, or 1440P with adaptive sync. As I said, I consider resolution (or pixel density, really) a higher priority than refresh rates to an extent, so I'd advise 1440P with adaptive sync if you want 144Hz.

Ultrawide: Great for productivity if you don't have the desk space for two 16:9 monitors. They'll give you a wider field of view in games, but older games (4-5 years ago and beyond) generally don't support the 21:9 aspect ratio. I'd recommend sticking with the same vertical resolution of 1440P so you aren't sacrificing pixel density. And those are NOT cheap.

4K: Again, glorious but difficult to attain 60fps without lowering settings to a point where 1440P would likely look better anyway.
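For a concrete feel of what the refresh-rate jump in that rundown buys, the frame-time arithmetic is simple:

```python
# Frame time = 1000 ms / refresh rate. Going from 60 Hz to 144 Hz
# cuts the interval between displayed frames by more than half.
for hz in (60, 120, 144):
    print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per frame")
# 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms
```

That shrinking interval is where the "terrific and smooth" feel comes from, and why competitive shooter players chase it.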


quote:

How are the last gen/second tier with 4k? Like a plain 980 or 390?



I have two 980s and I still don't do 4K in modern games. And the games will only get more demanding as time goes on.
This post was edited on 8/10/15 at 3:00 pm
Posted by Ace Midnight
Between sanity and madness
Member since Dec 2006
89742 posts
Posted on 8/10/15 at 4:18 pm to
quote:

Do you mean Intel HD graphics? Intel 530 is an SSD.


I get to teach ILike something! :wow:

They're dropping the 4 digit reference numbers and - best I can tell, the Skylake Desktop Intel HD Graphics numbers are going to be 530.

Confused? You won't be after this episode of, "WTF is going on with all the graphics numbers?"



Seriously, though - your post is a great starting point for anyone. I've looked at most of this over the past month/6 weeks or so (you were in my "New build Haswell or wait for Skylake" thread). You hit on some points I've gone over (chasing my tail, as it were), so let's narrow down:

quote:

Ultrawide: Great for productivity if you don't have the desk space for two 16:9 monitors. They'll give you a wider field of view in games, but older games (4-5 years ago and beyond) generally don't support the 21:9 aspect ratio.


Let's assume I have room for 2 24s - maybe even 2 28s and I've even considered a 32" HDTV in the past for this space, albeit wall mounted.

So looking at 2 Freesync choices:

And I could fit that LG 34" Ultrawide that has Freesync (34UM67) - do you see me using that monitor in 7 to 10 years?

Compare that to the Samsung (U28E590D) that you turned me onto - it's sitting at $499 right now on Newegg - 4k - but, what if I decide to drive that at lower resolutions with a $350 card (Say a 390) for a couple of years and see what happens with Fiji?

Now both of those displays have very narrow Freesync rates - 48 to 75 on the LG and 40 to 60 on the Samsung (albeit at 4k). What's the consequence of that (and I know you said that Freesync perhaps shouldn't be a priority based on the "general purpose" nature of the build, but I do play a lot of games, albeit few of them are graphics demanding - Total War series and Elder Scrolls being top of the mark)?
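Spelling out the windows I'm comparing (a sketch; my understanding is that when fps drifts outside the range, the panel falls back to fixed-refresh behavior and the tearing/stutter returns):

```python
# FreeSync windows for the two monitors under consideration.
# Outside the window, adaptive sync disengages (my understanding:
# the panel reverts to ordinary fixed-refresh behavior).
windows = {
    "LG 34UM67": (48, 75),
    "Samsung U28E590D": (40, 60),
}

def in_sync_range(fps, monitor):
    low, high = windows[monitor]
    return low <= fps <= high

print(in_sync_range(55, "LG 34UM67"))  # True: adaptive sync active
print(in_sync_range(35, "LG 34UM67"))  # False: below 48, sync disengages
```

With a 48 fps floor, a demanding game dipping into the 30s and 40s gets no benefit at all, which is exactly where adaptive sync would help most.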

Of course, I couldn't go back to Nvidia later, because I went with FreeSync. See - it's maddening, short of a "pick one and live with it" scenario.

Gaming isn't priority #1, but I don't want to feel like I compromised in only a year or 2. I don't mind a computer feeling old at 4 or 5 - because it is at that point.

Which is what upped me from my original $600 budget for GPU and monitor in the first place, because I felt far too confined with that budget - after taking a "no unreasonable compromise" approach, I drifted into the $700 card, $800 display territory and had to tap the brakes - you know? (I know you're a bleeding edge guy - if money were no object, I'm sure that we'd both be.)

See what I mean? Maddening.
This post was edited on 8/10/15 at 4:21 pm