
re: PC Discussion - Gaming, Performance and Enthusiasts

Posted by Alandial
Baton Rouge
Member since Dec 2004
2558 posts
Posted on 1/10/15 at 10:59 am to
quote:

built-in hardware ID check


This is what worries me. I am only missing .5GB of the dedicated VRAM. The i3 to i5 upgrade is more important imo. I will try to get one or the other before then, but in a "worst case scenario" (1st-world problems) I still want to play.
Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 1/10/15 at 12:07 pm to
quote:

The i3 to i5 upgrade is more important imo


No, the GPU upgrade is more important by a wide margin. You're well behind the curve there. A CPU upgrade wouldn't hurt, but if anything is going to ruin your gaming experience here, it's that 6700 series. Lucky for you, a GPU upgrade wouldn't even cost you as much. I don't know which 6700-series card you have (guessing a 6750), but even a $150 R9 270X would more than triple your performance. Fairly cheap upgrade, assuming you have the PSU to handle it (the 270X draws close to 200W). Any worthwhile CPU upgrade would require a new motherboard and a $200+ CPU. You could probably get a 280X used on eBay for under $175, which I'd recommend if it's in your budget.
Posted by Alandial
Baton Rouge
Member since Dec 2004
2558 posts
Posted on 1/10/15 at 12:30 pm to
Oh wow, that is actually great news. I was looking at the 270s. I have until May, when the game releases, so a sale will likely come along once the new cards are fixing to come out. Might be able to sneak that 280 in. I can't remember my exact PSU numbers, but I know it's over 200. Thanks for the info.
Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 1/10/15 at 12:51 pm to
quote:

I can't remember my exact PSU numbers, but I know it's over 200.


Needs to be around 450-500W for a card like that, and around 30 amps on the 12V rail. You can get away with 400W if it's a high-quality PSU. I'd recommend checking on that.
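If you want to sanity check it yourself, here's a rough sketch of the arithmetic (just an illustration; the 200W figure for a 270X-class card and the ~150W for the rest of the system are ballpark assumptions, not measured specs):

```python
# Rough PSU sanity check -- illustrative numbers only, not exact specs.

GPU_DRAW_W = 200          # approximate load draw for a 270X-class card (assumption)
REST_OF_SYSTEM_W = 150    # CPU, board, drives, fans (ballpark assumption)
HEADROOM = 1.3            # ~30% margin so the PSU isn't running flat out

def recommended_psu_watts(gpu_w: int, rest_w: int, margin: float = HEADROOM) -> int:
    """Return a rough recommended PSU wattage for the estimated load."""
    return round((gpu_w + rest_w) * margin)

def rail_supports_load(rail_amps: float, load_w: int, rail_volts: float = 12.0) -> bool:
    """Check whether the 12V rail's amperage covers the estimated load."""
    return rail_amps * rail_volts >= load_w

print(recommended_psu_watts(GPU_DRAW_W, REST_OF_SYSTEM_W))    # ~455W, in the 450-500W range
print(rail_supports_load(30, GPU_DRAW_W + REST_OF_SYSTEM_W))  # 30A * 12V = 360W >= 350W -> True
```

The 30-amp figure is just 30 x 12 = 360W available on the 12V rail, which is where nearly all of the GPU and CPU load lands.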
Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 1/10/15 at 1:10 pm to
quote:

AMD Freesync Hands-on with BenQ, Samsung & LG Monitors - CES 2015

Reddit thread on video



top comment:
quote:

The important takeaway with FreeSync is that it's actually going to be available and not necessarily unique to the most high-end gaming monitors. The problem with G-SYNC is that you only find it on prohibitively expensive displays, and the idea of retrofitting is not feasible.

I think Nvidia's insistence on only supporting their proprietary technology is going to bite them in the arse here.


There's truth to that, except that I would expect freesync monitors to still cost $300+, because panel quality is important for dynamic refresh. NVIDIA knew this was coming, and they were ahead of it by a year. If they'd flooded the market with affordable gsync monitors, the proprietary nature of gsync would've been an advantage to them instead of a hindrance.

That same poster said:

quote:

When you strive for the best graphics experience, it's no longer a single purchase that defines your experience. Now, you're tied to a monitor -- a considerable expense. If you have to choose between the $400 390 (or $500 390X?) and the $550 980, but have some bias toward Nvidia you might still go with the 980. But if you now have to choose between an $800 ROG Swift and a comparable Samsung (or any other mfg) FreeSync for $400, you might be easily persuaded.


Imagine you were an AMD user looking for an upgrade around the time the 970s/980s hit the market, and you could actually get gsync monitors in a variety of flavors for a reasonable price; more people would have 970s and gsync monitors and would be far less inclined to switch brands for their next GPU upgrade. If the ROG Swift were $500, even $600, I would've purchased one the minute I could find one in stock (but I know the gsync chip isn't the only thing driving up costs on this panel). I'd say they should start giving away gsync chips to any brand that will take them, but NVIDIA is too arrogant to do that.


Another quote from that guy:

quote:

I sincerely hope they do support it. The divide that Nvidia creates with gobbling up open technologies and making them proprietary (PhysX! It was better as an add-in card) is toxic to the PC gaming industry.
If their technology is better, it should stand on its own merit. The fact that they don't support open technologies makes it look like they don't even trust that their technology is better.


His thoughts = mine
This post was edited on 1/10/15 at 1:30 pm
Posted by Mr Gardoki
AL
Member since Apr 2010
27652 posts
Posted on 1/10/15 at 1:24 pm to
Nvidia has a pretty loyal fanboy team so it will be interesting. Right now it is all out of my range so it doesn't matter to me.
Posted by Mr Gardoki
AL
Member since Apr 2010
27652 posts
Posted on 1/10/15 at 1:25 pm to
Does anyone use PhysX anymore?
This post was edited on 1/10/15 at 1:25 pm
Posted by Alandial
Baton Rouge
Member since Dec 2004
2558 posts
Posted on 1/10/15 at 1:29 pm to



This should be ok right?
Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 1/10/15 at 1:31 pm to
Yep, that should be fine for any GPU.
Posted by Alandial
Baton Rouge
Member since Dec 2004
2558 posts
Posted on 1/10/15 at 1:33 pm to
Thanks so much for the back and forth :) Very happy about the responses. Usually when I have to upgrade something, I have to upgrade 2 other things first to upgrade the original thing lol. I think I am going to go with the 280, but will see what sales come out before A) the game releases or B) my patience hits 0.

Thank you again.
Posted by UltimateHog
Oregon
Member since Dec 2011
65825 posts
Posted on 1/10/15 at 2:08 pm to
That guy's comments were almost exactly why I linked the reddit discussion on it.
Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 1/10/15 at 2:11 pm to
quote:

Nvidia has a pretty loyal fanboy team so it will be interesting. Right now it is all out of my range so it doesn't matter to me.



Yeah, anyone who's devoted to Nvidia will be in full denial about Freesync even if it offers a 100% identical experience to gsync.

It's possible that, since none of NVIDIA's cards support DP 1.2a (and therefore can't use any form of adaptive sync), NVIDIA is counting on flagship chasers (i.e., people who just bought a 970 or 980) buying a gsync monitor instead of all-new cards AND a new monitor. And once they have the gsync monitor, it becomes even less financially logical to buy AMD for their next flagship.

Speaking of "identical experience," one thing in Linus's video stood out to me. He pointed out that on the particular demo panel in front of him, he sees tearing (adaptive sync disabling itself) at lower framerates, and it was entirely due to the panel technology (worth noting that low-Hz flickering probably looks worse, though). But it reminds me of another key difference between AMD and NVIDIA. Particularly with SLI vs. Crossfire, you'll notice that you can run an AMD Crossfire setup on even the crappiest of motherboards, as long as you have two slots with the same x16 form factor and each is connected to at least 4 PCIe lanes. That means, given enough lanes, you can run Crossfire in x16/x8, x16/x4, x8/x4, x8/x4/x4, or x4/x4 (and so on) configurations if that's all the board supports. A lot of Z-series Intel chipset boards have both PCIe 3.0 and PCIe 2.0 slots (the chipset itself provides the 2.0 lanes), so you can in fact run one AMD card in your top x16 3.0 slot and the second card in your bottom x4 PCIe 2.0 slot. Depending on the setup, Crossfire can yield very inconsistent results, especially in any form of alternate frame rendering.

On the other hand, Nvidia requires x8 minimum and allocates an equal number of lanes to each card (so if you have enough lanes for x16/x8, Nvidia will still run SLI at x8/x8). Being in control of the hardware and software environment ensures a more consistent experience across all users. It's akin to Apple, which limits hardware configurations across the board so the experience is consistent and idiot-proof... vs. Windows or Android, which can run on the shittiest or most powerful of devices, where the hardware out of the box can have a lot of control over your experience.
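To make those lane rules concrete, here's a minimal sketch of the minimums described above (just an illustration I threw together; the helper names and the list-of-slots idea are mine, not anything from the actual drivers):

```python
# Illustrative check of the lane minimums described above.
# Crossfire (at the time) only needed each x16-sized slot wired to at
# least 4 PCIe lanes; SLI required at least x8 per card.

CROSSFIRE_MIN_LANES = 4
SLI_MIN_LANES = 8

def crossfire_ok(slot_lanes: list) -> bool:
    """True if two or more populated slots each have at least 4 lanes (e.g. x16/x4)."""
    return len(slot_lanes) >= 2 and all(lanes >= CROSSFIRE_MIN_LANES for lanes in slot_lanes)

def sli_ok(slot_lanes: list) -> bool:
    """True if two or more populated slots each have at least 8 lanes (e.g. x8/x8)."""
    return len(slot_lanes) >= 2 and all(lanes >= SLI_MIN_LANES for lanes in slot_lanes)

print(crossfire_ok([16, 4]))  # True  -- top x16 3.0 slot + bottom x4 2.0 slot
print(sli_ok([16, 4]))        # False -- second slot is below the x8 minimum
print(sli_ok([8, 8]))         # True
```

The point is just that a 16/4 board passes the Crossfire check but fails the SLI one, which is why Crossfire runs on far cheaper boards.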

If monitor vendors start pushing out adaptive sync on cheap shitty panels, you're going to see a lot of inconsistent opinions on Freesync, which could damage its mass appeal (even if you and I know better). Plus, we still don't know how much individual games will depend on the driver implementation of Freesync, which is another potentially massive variable (that I hope turns out to be a non-issue).

quote:

Does anyone use phsx anymore?


I haven't played much of anything that uses it since getting the 980s. Prior to that, I was already using at least some level of PhysX in those games; the work was just offloaded to my CPU, which handles it inefficiently but was powerful enough nonetheless. PhysX is probably NVIDIA's biggest gimmick. I've seen very little innovation with it since its unveiling: fog, paper, cloth, and particles.
Posted by BoogerNuts
Lake Charles
Member since Nov 2013
856 posts
Posted on 1/10/15 at 2:35 pm to
Gonna start assembly tomorrow, just to give everything another day to set, but I'm happy with how everything came out.



Posted by bluebarracuda
Member since Oct 2011
18242 posts
Posted on 1/10/15 at 2:47 pm to
That looks great
Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 1/10/15 at 3:30 pm to
Superb. Can't wait to see the finished build.

Although, you have too much radiator for CPU-only cooling. I'd recommend omitting one until you add GPUs to the loop. All it's going to do now is add more noise and restriction.
Posted by BoogerNuts
Lake Charles
Member since Nov 2013
856 posts
Posted on 1/10/15 at 4:35 pm to
Thanks for the compliments, guys. I am aware it's a bit overkill for a CPU loop, but honestly it's mostly for aesthetics at this point. I do plan to do a GPU upgrade this year, though; this time I'll make sure there's a block for it. EK makes a block for the Vapor-X 290X but not the 290, which I thought was odd. As for the noise, it's not much of a concern; those JetFlos aren't silent by any means, whether they're on a radiator or not. Just waiting on AMD's new line to drop to see what direction I want to go with GPUs.

Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 1/10/15 at 5:06 pm to
That sucks. You'd think they'd use the same PCB. I had reference 290s, which were identical to reference 290Xs, and one of Sapphire's 290 Tri-X models. Sapphire is super annoying with their ridiculous number of variations of the same damn card.
Posted by BoogerNuts
Lake Charles
Member since Nov 2013
856 posts
Posted on 1/10/15 at 5:28 pm to
Yeah, I thought that was just silly. I have no idea why the PCB would be different on those two cards, especially considering that some 290 owners have reported their cards actually unlocked to a 290X. If they're being honest, that just makes no sense to me. Maybe they're just that proud of their cooling system that they purposely make it a PITA to develop a block for. But I will say the Vapor-X cooler works like a charm, and the cards OC like a beast.
Posted by UltimateHog
Oregon
Member since Dec 2011
65825 posts
Posted on 1/10/15 at 5:43 pm to
I love my Tri-X 290. OC's like a beast too. It's been a dream for 1440p, perfect.

Got it brand new for $219.99 on Black Friday from an eBay "store". The closest I've seen since is $270.
This post was edited on 1/10/15 at 8:25 pm
Posted by VABuckeye
Naples, FL
Member since Dec 2007
35565 posts
Posted on 1/11/15 at 11:20 am to
There's a lot of sexiness in those photos. Shame the paint and cords are Scarlet and Gray.