re: PC Discussion - Gaming, Performance and Enthusiasts

Posted by ILikeLSUToo
Central, LA
Member since Jan 2008
18018 posts
Posted on 1/10/15 at 2:11 pm to
quote:

Nvidia has a pretty loyal fanboy team so it will be interesting. Right now it is all out of my range so it doesn't matter to me.

Yeah, anyone who's devoted to NVIDIA will be in full denial about FreeSync even if it offers a 100% identical experience to G-Sync.

Since none of NVIDIA's cards support DP 1.2a (and therefore can't use any form of adaptive sync), NVIDIA may be counting on flagship chasers (i.e., people who just bought the 970 or 980) to buy a G-Sync monitor rather than all-new cards AND a new monitor. And once they have the G-Sync monitor, it becomes even less financially logical to buy AMD for their next flagship.

Speaking of "identical experience," one thing in Linus's video stood out to me: he pointed out that on the particular demo panel in front of him, he saw tearing (adaptive sync disabling itself) at lower framerates, and it was entirely due to the panel technology (worth noting that low-Hz flickering probably looks worse, though).

It also reminds me of another key difference between AMD and NVIDIA, particularly SLI vs. Crossfire. You can run an AMD Crossfire setup on even the crappiest of motherboards, as long as you have two slots with the x16 form factor and each is wired to at least 4 PCIe lanes. That means, given enough lanes, you can run Crossfire in x16/x8, x16/x4, x8/x4, x8/x4/x4, or x4/x4 (and so on) if that's all the board supports. A lot of Z-series Intel chipset boards have both PCIe 3.0 and PCIe 2.0 slots (the chipset itself provides the 2.0 lanes), so you can in fact run one AMD card in your top x16 3.0 slot and the second card in your bottom x4 PCIe 2.0 slot. Depending on the setup, Crossfire can yield very inconsistent results, especially with any form of alternate frame rendering.

On the other hand, NVIDIA is x8 minimum, and allocates an equal number of lanes to each card (so even if you have enough lanes for x16/x8, NVIDIA will still run SLI at x8/x8). Being in control of the hardware and software environment ensures a more consistent experience across all users. It's akin to Apple, which limits hardware configurations across the board so the experience is consistent and idiot-proof, vs. Windows or Android, which can run on the shittiest or most powerful of devices, where the hardware can have a big effect on your experience out of the box.
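
To make the contrast concrete, here's a toy Python sketch of the two lane policies as described above. This is just an illustration, not anything from either vendor's drivers, and the slot widths are made-up examples:

# Toy model of the multi-GPU lane rules discussed above.
AMD_MIN_LANES = 4     # Crossfire: each card just needs >= 4 PCIe lanes
NVIDIA_MIN_LANES = 8  # SLI: every card runs at the same width, minimum x8

def crossfire_ok(slot_lanes):
    """AMD: any mix of slot widths works if each has at least 4 lanes."""
    return all(lanes >= AMD_MIN_LANES for lanes in slot_lanes)

def sli_width(slot_lanes):
    """NVIDIA: all cards drop to the narrowest slot's width.
    Returns the per-card width, or None if SLI won't enable at all."""
    width = min(slot_lanes)
    return width if width >= NVIDIA_MIN_LANES else None

print(crossfire_ok([16, 4]))  # True  -- x16/x4 Crossfire is allowed
print(sli_width([16, 8]))     # 8     -- x16/x8 slots run SLI at x8/x8
print(sli_width([16, 4]))     # None  -- an x4 slot blocks SLI entirely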

If monitor vendors start pushing out adaptive sync on cheap shitty panels, you're going to see a lot of inconsistent opinions on FreeSync, which could damage its mass appeal (even if you and I know better). Plus, we still don't know how much individual games will depend on the driver implementation of FreeSync, which is another potentially massive variable (that I hope turns out to be a non-issue).
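
On the tearing-at-low-framerates point from Linus's demo, here's a rough sketch of how a variable refresh window behaves. The 40-144 Hz range is a hypothetical example, not any specific panel:

# Illustrative only: adaptive sync tracks framerate inside the panel's
# window; below the floor it disables itself and the monitor falls back
# to fixed refresh (tearing, or flicker, depending on the panel tech).
PANEL_MIN_HZ, PANEL_MAX_HZ = 40, 144  # hypothetical panel range

def refresh_for(fps):
    if fps > PANEL_MAX_HZ:
        return PANEL_MAX_HZ  # capped at the panel's maximum refresh
    if fps < PANEL_MIN_HZ:
        return None          # adaptive sync turns off below the floor
    return fps               # refresh tracks framerate 1:1

for fps in (160, 90, 30):
    print(fps, "->", refresh_for(fps))
# 160 -> 144, 90 -> 90, 30 -> None (back to fixed refresh / tearing)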

quote:

Does anyone use PhysX anymore?

I haven't played much of anything that uses it since getting the 980s. Prior to that, the games that supported it still ran some level of PhysX; the work was just offloaded to my CPU, which handles it inefficiently but was powerful enough to manage nonetheless. PhysX is probably NVIDIA's biggest gimmick. I've seen very little innovation with it since its unveiling: fog, paper, cloth, and particles.
Posted by BoogerNuts
Lake Charles
Member since Nov 2013
856 posts
Posted on 1/10/15 at 2:35 pm to
Gonna start assembly tomorrow, just to give everything another day to set, but I'm happy with how everything came out.


