
UPDATE: New build: Should monitor choice drive GPU choice, or the other way around?

Posted by Ace Midnight
Between sanity and madness
Member since Dec 2006
89493 posts
Posted on 8/10/15 at 1:39 pm
Sorry if this rambles:

I'm doing a new Skylake Z170 build - I've been all over the place (ILike clued me in on G-Sync/Freesync) - obviously, the smart budget choice is to get a last generation or second tier card (970/980 or 390) and let the 4k market develop.

The build is general purpose, but I will play whatever the next generation of Elder Scrolls is when it comes out.

This is my first new monitor purchase in 10 years - so - question is - should I decide on 144Hz, ultrawide, 4K, etc., first, then get a card that drives that (as monitors tend to survive through 4 or 5 GPUs and up to 3 PCs at my place) - OR pick the best card for a budget, THEN pick the best monitor for that card?

And I'm not just thinking in terms of GSync/Freesync - but that is an example of how marrying to one brand or the other makes a big difference at this time.

As for other details - I wanted to do GPU and Monitor for about $1000. The dizzying array of choices compared to last time has me chasing my tail.

I'm not a competitive gamer and do not demand high refresh rates - I play Skyrim, for example, on a 6850 - I get tearing and stuttering, but the game is generally playable between scene changes at moderate settings. I'm sure my FPS runs in the 40 to 60 range. I'm using a Sceptre and I can't tell you specs - but 1080p, 60Hz.

I'm intrigued by Ultrawide, but I do believe that 4k is the future. However, buying 4k now is probably neither cost-effective, nor optimal technology-wise - just as with televisions, I believe 2016 will be the year that 4k monitors and GPUs come into their own.

So - Option "B" is to just go integrated now and wait on GPU and monitor upgrade for 2016. How much of an upgrade will Intel 530 be over a 6850? Will that impact how much system ram I will have to start with?

As to the main topic - any strong recommendations for or against 144Hz, ultrawide OR 4K for general purpose computing in Q3 of 2015? I view the monitor as a 7 to 10 year investment and the card a 4 to 5, if that impacts your advice.
This post was edited on 8/12/15 at 12:26 pm
Posted by SG_Geaux
Beautiful St George
Member since Aug 2004
77946 posts
Posted on 8/10/15 at 2:33 pm to
Decide what resolution you want to play at and what your budget is and then work from there.
This post was edited on 8/10/15 at 2:34 pm
Posted by Ace Midnight
Between sanity and madness
Member since Dec 2006
89493 posts
Posted on 8/10/15 at 2:47 pm to
quote:

Decide what resolution you want to play at and what your budget is and then work from there.


Fair enough - and part of the reason I started this thread was to get some analysis.

Staying with AMD - with VSR, can I stay at 1080p, or should I go QHD/1440p and try to hit the sweet spot of value to performance?

Is Ultrawide an option - driving 40% more pixels rather than 3x or 4x with 4k?

OR, since I believe 4k is the future - just gut it up, get the best 4k within budget today and upgrade the card as necessary?

How are the last gen/second tier with 4k? Like a plain 980 or 390?

I mean, one can always wait forever for the next thing. And I know this thread is asking a lot, but $1000 - $1200 (roughly for GPU and monitor) is a lot of money to stake on tech that seems in transition.

I know I want to move up from the 24" class, but should I be looking at 34" ultrawides, 30" IPS at 144Hz, or 27/28" 4K at 60Hz?

Which is, perhaps another choice that will narrow things down, significantly.
This post was edited on 8/10/15 at 2:48 pm
Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 8/10/15 at 2:56 pm to
quote:

This is my first new monitor purchase in 10 years - so - question is - should I decide on 144Hz, ultrawide, 4K, etc., first, then get a card that drives that (as monitors tend to survive through 4 or 5 GPUs and up to 3 PCs at my place) - OR pick the best card for a budget, THEN pick the best monitor for that card?


I generally advise anyone building a gaming PC to choose their monitor first, and let that guide their video card choice. Although some might decide they want 1440P and then realize an entry-level 1440P build is going to be $900+, so they back it down to 1080P. Either way you go about it (GPU first or monitor first), you know well enough that one should never be decided independent of the other.

quote:

And I'm not just thinking in terms of GSync/Freesync - but that is an example of how marrying to one brand or the other makes a big difference at this time.

As for other details - I wanted to do GPU and Monitor for about $1000. The dizzying array of choices compared to last time has me chasing my tail.


As you know, Gsync and Freesync do add cost. While it's a shame to do without it, you may consider forgoing adaptive sync technology if gaming isn't first priority. I say that because a $1000 budget for monitor/GPU basically means you have to go AMD/Freesync if you want adaptive sync. Resolution should be higher priority than refresh rates and adaptive sync anyway. At $1000, a decent 1440P 60hz IPS monitor and a GTX 980 Ti would be my choice. Or, an R9 Fury and a 1440P 144Hz freesync monitor. Either card would perform several times better than your 6850, but the 980 Ti is a bit faster than both the Fury and Fury X. But just because it's faster doesn't mean it's worth it. While it beats the Fury X for the same price, adaptive sync is a few hundred extra on the Nvidia side, if that's a factor.

4K is great, too, but I wouldn't recommend it for gaming unless it has adaptive sync. There aren't any single-GPU solutions on the market yet that can reliably handle 4K and maintain 60fps with high texture settings. You'd be in the same boat as you are now with your 6850. Playing the next Elder Scrolls at 4K on any of today's cards would mean sub-60fps with tearing and stuttering, and adaptive sync eliminates the tearing and stuttering. The great thing about 1440P is that it already looks really good at native resolution, and AMD and Nvidia both have a feature (VSR and DSR, respectively) that lets you force the GPU to render every frame at a higher resolution (like 4K) and downsample it to 1440P. This creates an effect that's nearly identical to 4K because it practically eliminates aliasing and renders both distant and nearby in-game textures at a higher resolution. You get the same framerate hit as 4K, though, so it's mostly useful in older games.
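Just to put rough numbers on why the framerate hit matches 4K (back-of-the-envelope math only, nothing measured on real hardware):

```python
# Back-of-the-envelope pixel math for downsampling (render at 4K, display at 1440p).
# Resolutions are the standard ones for each format; purely illustrative.
resolutions = {
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# With VSR/DSR the GPU shades every frame at the 4K pixel count and then scales
# the result down to the 1440p panel, so the rendering cost tracks the 4K number.
ratio = pixels["4K"] / pixels["1440p"]
print(f"1440p native: {pixels['1440p']:,} pixels")
print(f"4K rendered : {pixels['4K']:,} pixels (~{ratio:.2f}x the shading work)")
```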

quote:

So - Option "B" is to just go integrated now and wait on GPU and monitor upgrade for 2016.


As long as you don't plan on gaming (or can just game on the 6850 for now), that's not a bad option. I don't know what AMD and Nvidia's timelines are for 2016, but they've just released their respective flagships, and it may be a year or more before we see another. Undoubtedly there will be more Freesync/G-sync monitors on the market by then, and maybe 4K prices will have come down a little (although they're not terrible right now if you want a TN panel).

quote:

How much of an upgrade will Intel 530 be over a 6850?


Do you mean Intel HD Graphics? Intel 530 is an SSD. Anyway, Intel's integrated graphics would be a fairly significant downgrade from the 6850, not an upgrade. While iGPUs have come a long way, they still can't compete with even lower-end dedicated GPUs. Memory bandwidth has a lot to do with it.

quote:

Will that impact how much system ram I will have to start with?

No, you shouldn't consider Intel HD Graphics a viable gaming alternative, and even so, system memory capacity wouldn't have much to do with it. Stick with 2 x 4GB or 2 x 8GB regardless.

quote:

As to the main topic - any strong recommendations for or against 144Hz, ultrawide OR 4K for general purpose computing in Q3 of 2015? I view the monitor as a 7 to 10 year investment and the card a 4 to 5, if that impacts your advice.


Here's the rundown:
144Hz: Looks terrific and smooth, most advantageous for competitive first-person shooter players. Right now your choices for 120+Hz are 1080P without adaptive sync, or 1440P with adaptive sync. As I said, I consider resolution (or pixel density, really) a higher priority than refresh rates to an extent, so I'd advise 1440P with adaptive sync if you want 144Hz.

Ultrawide: Great for productivity if you don't have the desk space for two 16:9 monitors. They'll give you a wider field of view in games, but older games (4-5 years ago and beyond) generally don't support the 21:9 aspect ratio. I'd recommend sticking with the same vertical resolution of 1440P so you aren't sacrificing pixel density. And those are NOT cheap.

4K: Again, glorious but difficult to attain 60fps without lowering settings to a point where 1440P would likely look better anyway.
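If it helps to see the pixel loads side by side, here's a rough comparison using typical resolutions for each class (illustrative only, not tied to specific models):

```python
# Typical resolutions for the monitor classes above; pixel load relative to 1080p.
formats = {
    "1080p 16:9": (1920, 1080),
    "1080p ultrawide": (2560, 1080),
    "1440p 16:9": (2560, 1440),
    "1440p ultrawide": (3440, 1440),
    "4K 16:9": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in formats.items():
    px = w * h
    print(f"{name:<16} {px:>9,} px  ({px / base:.2f}x the pixels of 1080p)")
```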


quote:

How are the last gen/second tier with 4k? Like a plain 980 or 390?



I have two 980s and I still don't do 4K in modern games. And the games will only get more demanding as time goes on.
This post was edited on 8/10/15 at 3:00 pm
Posted by Ace Midnight
Between sanity and madness
Member since Dec 2006
89493 posts
Posted on 8/10/15 at 4:18 pm to
quote:

Do you mean Intel HD graphics? Intel 530 is an SSD.


I get to teach ILike something! :wow:

They're dropping the 4 digit reference numbers and - best I can tell, the Skylake Desktop Intel HD Graphics numbers are going to be 530.

Confused? You won't be after this episode of "WTF is going on with all the graphics numbers?"



Seriously, though - your post is a great starting point for anyone. I've looked at most of this over the past month/6 weeks or so (you were in my "New build: Haswell or wait for Skylake" thread). You hit on some points I've gone over (chasing my tail, as it were), so let's narrow down:

quote:

Ultrawide: Great for productivity if you don't have the desk space for two 16:9 monitors. They'll give you a wider field of view in games, but older games (4-5 years ago and beyond) generally don't support the 21:9 aspect ratio.


Let's assume I have room for 2 24s - maybe even 2 28s and I've even considered a 32" HDTV in the past for this space, albeit wall mounted.

So looking at 2 Freesync choices:

And I could fit that LG 34" Ultrawide that has Freesync (34UM67) - do you see me using that monitor in 7 to 10 years?

Compare that to the Samsung (U28E590D) that you turned me onto - it's sitting at $499 right now on Newegg - 4k - but, what if I decide to drive that at lower resolutions with a $350 card (Say a 390) for a couple of years and see what happens with Fiji?

Now both of those displays have very narrow Freesync ranges - 48 to 75Hz on the LG and 40 to 60Hz on the Samsung (albeit at 4K). What's the consequence of that? (I know you said Freesync perhaps shouldn't be a priority given the "general purpose" nature of the build, but I do play a lot of games, albeit few of them graphically demanding - the Total War series and Elder Scrolls being top of the mark.)

Of course, once I go with Freesync, I can't go back to Nvidia. See - it's maddening; there's nothing but a "pick one and live with it" scenario.

Gaming isn't priority #1, but I don't want to feel like I compromised in only a year or 2. I don't mind a computer feeling old at 4 or 5 - because it is at that point.

Which is what upped me from my original $600 budget for GPU and monitor in the first place - I felt far too confined at that number. After taking a "no unreasonable compromise" approach, I drifted into $700-card, $800-display territory and had to tap the brakes - you know? (I know you're a bleeding-edge guy - if money were no object, I'm sure we'd both be.)

See what I mean? Maddening.
This post was edited on 8/10/15 at 4:21 pm
Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 8/10/15 at 5:39 pm to
quote:

I get to teach ILike something! :wow:

They're dropping the 4 digit reference numbers and - best I can tell, the Skylake Desktop Intel HD Graphics numbers are going to be 530.



Yep, that's new to me. I rarely pay attention to that stuff, but I am certainly used to the general nonsense with model numbers.

quote:

And I could fit that LG 34" Ultrawide that has Freesync (34UM67) - do you see me using that monitor in 7 to 10 years?



I think the 1080-pixel vertical resolution is going to be dated in less time than that.

quote:

Compare that to the Samsung (U28E590D) that you turned me onto - it's sitting at $499 right now on Newegg - 4k - but, what if I decide to drive that at lower resolutions with a $350 card (Say a 390) for a couple of years and see what happens with Fiji?


That's certainly an attractively priced 4K freesync monitor, and worth considering. It's tough finding data online about how well a monitor's scaler works. It's an additional reason I've not made the switch to 4K yet. Most of the games I play would need to be run at a lower resolution, and I don't want to sacrifice image clarity. I'm guessing it will do 1080P more convincingly than 1440P.

quote:

Now both of those displays have very narrow Freesync ranges - 48 to 75Hz on the LG and 40 to 60Hz on the Samsung (albeit at 4K). What's the consequence of that? (I know you said Freesync perhaps shouldn't be a priority given the "general purpose" nature of the build, but I do play a lot of games, albeit few of them graphically demanding - the Total War series and Elder Scrolls being top of the mark.)


The consequence of the narrow adaptive range is that in any instance where you experience heavy frame drops, Freesync may stop working (best case, you get tearing and stuttering) or become less effective (worst case, you get flickering). One advantage of G-sync is that it has a wider effective range - basically 30Hz up to the panel's maximum (60 or 144Hz) on every monitor. Funnily enough, the Freesync standard is spec'd for a wider range than G-sync, but the effective ranges in actual implementations have been the other way around.

Staying within that 20Hz range is not difficult in most games. Find the right settings to give you somewhere close to 60fps, and you should sail right through framedrops.
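A toy sketch of what that looks like in practice - the windows are the ones you listed for the LG and the Samsung, and the fps samples are made up purely for illustration:

```python
# Which framerates land inside a Freesync window? (fps samples are invented.)
windows = {"LG 34UM67": (48, 75), "Samsung U28E590D": (40, 60)}
fps_samples = [35, 45, 55, 65, 80]

for monitor, (low, high) in windows.items():
    print(monitor)
    for fps in fps_samples:
        if fps < low:
            status = "below range - tearing/stutter (or flicker) comes back"
        elif fps > high:
            status = "above range - normal vsync/no-vsync behavior"
        else:
            status = "inside range - Freesync active"
        print(f"  {fps:>3} fps: {status}")
```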

quote:

Of course, I can't go back to Nvidia, because I went with Freesync. See - it's maddening other than a "pick 1 and live with it scenario".


Yep, there's no way around it, and AMD and Nvidia like it that way.
Posted by Ace Midnight
Between sanity and madness
Member since Dec 2006
89493 posts
Posted on 8/10/15 at 6:05 pm to
quote:

That's certainly an attractively priced 4K freesync monitor, and worth considering. It's tough finding data online about how well a monitor's scaler works. It's an additional reason I've not made the switch to 4K yet. Most of the games I play would need to be run at a lower resolution, and I don't want to sacrifice image clarity. I'm guessing it will do 1080P more convincingly than 1440P.



I've seen subjective reports that both last year's U28D590D and the current U28E590D display 1080p, 1440p and (of course) native 4K pretty well for general use.

I'm probably leaning that way, as it appears to be a solid (albeit entry-level) 4K display that should serve more competently through the expected lifecycle than a 1080p panel - ultrawide or not. Acer has a hot 34" ultrawide Freesync monitor coming out that will probably crack $1300 or $1400, and I'm not waiting around for that kind of sticker shock.

28" looks to be the sweet spot for 4k - not crazy about TN, but at this price point, combined with Samsung doing fairly good things with TN lately - I can live with a little off-angle issues for the intended purpose of the monitor.

quote:

The consequence of the narrow adaptive range is that in any instance where you experience heavy frame drops, Freesync may stop working (best case, you get tearing and stuttering) or become less effective (worst case, you get flickering).


One of the things I'm seeing is that with Freesync you're golden inside the range - it works as designed and greatly improves output. Above the range, it's like it isn't there (you're no worse off). But below the range, a game that might otherwise still be barely playable - in the 36-38 FPS range - isn't, because of the disconnect between Freesync and the panel.

Now that I'm settling in on this display - if I want 4K and Freesync, I'm somewhere in the Fijis, right? With HBM (1) and the 4GB cap, that probably means the R9 Fury X - another maddening choice. It sits at ~$650 depending on flavor (and I lean Sapphire or MSI), it's water cooled (should be no problem in a new case purchased in 2015, right?), and it's quite expensive. The consensus so far seems to be: "Works adequately at 4K, even with 4GB - barely - but who knows if the next generation of games will stress the card beyond competent 4K functioning?"

Maddening to drop $650 on a card that very well may be deemed obsolete in 2016.

(Or I can slog along with a 390 for 2 or 3 years, then jump when the HBM2 cards are here with a refinement of the Fiji architecture, right?)

Maddening.
This post was edited on 8/10/15 at 6:07 pm
Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 8/10/15 at 6:42 pm to
No matter what you do, you'll regret your purchase in 6 months to a year. I fricking hate my 980s sometimes because the 980 Ti ended up being amazing. I can't sell and swap video cards as easily as others because of the proprietary water blocks.

The 4GB HBM won't be 4K friendly in a year or two. Neither will anything else on the market. You could always SLI/Crossfire in a couple of years, though it wouldn't be as easy with the Fury X.

quote:

(and I lean Sapphire or MSI)


Sapphire makes some of the better cards and has probably the worst reputation in the industry regarding warranty service.
This post was edited on 8/10/15 at 6:43 pm
Posted by UltimateHog
Oregon
Member since Dec 2011
65778 posts
Posted on 8/10/15 at 7:21 pm to
I have the Sapphire Fury X, and it's fantastic. It consistently stays at or under 55C, so I never have to worry about heat at all, which is great peace of mind. Just make sure you mount the radiator above the card.
Posted by jdd48
Baton Rouge
Member since Jan 2012
22071 posts
Posted on 8/10/15 at 7:26 pm to
quote:

I'm doing a new Skylake Z170 build


I'm putting together a build soon too. I want a damn Intel 750 so bad.
Posted by Ace Midnight
Between sanity and madness
Member since Dec 2006
89493 posts
Posted on 8/10/15 at 7:51 pm to
quote:

just make sure you mount the radiator above the card.



Thanks for the tip.

I'm leaning towards a Fractal Define R5 as the case - shouldn't be a problem. I was a little worried about cooling options for both a flagship GPU and (maybe) watercooling the CPU. I guess I should see what folks in the wild are doing with Skylakes first. I'm not going to push the edge of overclocking - just a modest bump over stock/turbo. The Noctua (air) may be good enough for what I intend.

Just don't know yet. Waiting for an i7-6700k in stock notice from Newegg.
This post was edited on 8/10/15 at 7:52 pm
Posted by UltimateHog
Oregon
Member since Dec 2011
65778 posts
Posted on 8/10/15 at 8:02 pm to
I already have my MSI Gaming M7 delivered and the 6600K preordered from Amazon, but still no release date or delivery estimate, so this waiting game is difficult. (And I just noticed the 6600K is in stock on Newegg.)

I'm keeping my Corsair H80i for cooling my Skylake - it doesn't need anything extra to fit, and it's served me well since I got it. I'll do modest overclocking as well, probably around 4.5GHz, and I have 3000MHz RAM ready to go.

The Fury X cooler is phenomenal, very high quality and the fan it comes with is a Scythe Gentle Typhoon. I added one of my extra fans to the front as well and it stays extremely cool averaging around 52C.
This post was edited on 8/10/15 at 9:10 pm
Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 8/10/15 at 10:38 pm to
quote:

Waiting for an i7-6700k in stock notice from Newegg.


You know, if it's helpful for your true monitor/GPU aspirations, you'll get the same life and performance (gaming-wise) from the $100 cheaper 6600K. I have an i7 myself, and it's maybe once in a blue moon that hyperthreading actually does anything worthwhile - and by worthwhile, I mean shaves a few seconds off encoding time.
Posted by Ace Midnight
Between sanity and madness
Member since Dec 2006
89493 posts
Posted on 8/10/15 at 10:42 pm to
quote:

You know, if it's helpful for your true monitor/GPU aspirations, you'll get the same life and performance (gaming-wise) from the $100 cheaper 6600K.


I haven't categorically ruled out getting a 6600k, but I'm probably spending the $100.
Posted by Ace Midnight
Between sanity and madness
Member since Dec 2006
89493 posts
Posted on 8/12/15 at 12:36 pm to
quote:

Samsung (U28E590D)


I went with this - and thanks to ILike for turning me on to it.

$499 with free shipping from Newegg - almost impossible to say no.

Still waiting on Skylake i7... Availability of R9 Fury X is also spotty.