re: PC Discussion - Gaming, Performance and Enthusiasts

Posted by LSUGent
Member since Jun 2011
2728 posts
Posted on 7/7/24 at 1:50 am to
quote:

That's great and all, but who is using a 4090 to game at 1080p? Any and all performance ratings using a 4090 should be at 1440 and 4K. Sorry, no 4090 owner is running 1080. I don't know if the results change a bunch, but it matters.


The chart shows a 4090 at 1080p because that is the best way to show and compare CPU performance.

At higher resolutions (especially 4K), the GPU becomes the performance bottleneck. At 1080p, a 4090's output is limited almost entirely by how powerful the CPU is, which lets benchmarkers demonstrate relative CPU performance while controlling all other variables.
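The bottleneck logic above can be sketched with a toy model (all throughput numbers below are made up for illustration, not real benchmark data): effective frame rate is roughly the minimum of what the CPU and GPU can each deliver, and the CPU's share barely changes with resolution.

```python
# Toy bottleneck model: the system runs at the speed of its slower component.
# All numbers are hypothetical, chosen only to illustrate the idea.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frames per second the system can actually deliver."""
    return min(cpu_fps, gpu_fps)

# CPU throughput is (roughly) resolution-independent; GPU throughput drops
# sharply as resolution rises.
cpu_fps = 200  # frames the CPU can prepare per second
gpu_fps = {"1080p": 400, "1440p": 250, "4K": 120}

for res, g in gpu_fps.items():
    fps = effective_fps(cpu_fps, g)
    limiter = "CPU" if cpu_fps < g else "GPU"
    print(f"{res}: {fps} fps ({limiter}-bound)")
```

In this sketch, a faster CPU would raise the 1080p number but leave the 4K number at 120 fps, which is exactly why CPU comparisons are run at 1080p: at 4K every CPU would post the same GPU-limited score.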
Posted by Joshjrn
Baton Rouge
Member since Dec 2008
29907 posts
Posted on 7/7/24 at 8:35 am to
quote:

Not sure, just know its gaming performance is about as good as the others and well ahead of the non-X3D chips.

We’ve already established that. My question was in response to your “future proofing” statement. Because the 7900x3D doesn’t use both CCDs simultaneously for games, if future games do require more than six (or eight) cores, the 7900x3D is actually less “future proofed” than the 7800x3D with eight available cores for gaming.

Basically the X900x3D chip only makes sense for someone who cares about productivity far more than gaming, but cares about gaming a little bit, but can’t afford the X950x3D, and the gaming they do is CPU bound, but not CPU bound in a way that only having six cores is a problem.

I’m not saying those people don’t exist; I’m just saying that they make up a vanishingly small percentage of the gaming population.
Posted by UltimateHog
Oregon
Member since Dec 2011
67618 posts
Posted on 7/7/24 at 2:19 pm to
I think it makes more sense than a pure 6 core offering like a 7600X3D because at least then you have those extra cores for better productivity and the cost would be similar.
Posted by Joshjrn
Baton Rouge
Member since Dec 2008
29907 posts
Posted on 7/7/24 at 2:46 pm to
quote:

I think it makes more sense than a pure 6 core offering like a 7600X3D because at least then you have those extra cores for better productivity and the cost would be similar.

I 100% agree with you that from a consumer perspective, price being equal, it (probably) makes more sense to buy a 7900x3D than a 7600x3D (the probably being that the 900x3D chips sometimes don't choose the right CCD to use, so it can cause some aggravation).

But from AMD's perspective, you're comparatively losing money if you can't sell the 7900x3D for enough more than the 7600x3D to cover the additional BOM and manufacturing costs of the second CCD.

And hey, maybe they can. I don't have the numbers. But based on the utterly abysmal sales of the 7900x3D at anything near retail, my assumption is that the juice isn't worth the squeeze.
Posted by boXerrumble
Member since Sep 2011
53964 posts
Posted on 7/7/24 at 4:09 pm to
Well let’s see what AMD does this time with the x3ds. They did talk about “tweaking” and “next steps” with the x3ds so the same amount of cache doesn’t sound “exciting”, but if there’s other benefits, then hey that might be interesting.

On the 7900x3d/9900x3d thing, I suspect it’s just purely money related and product stack related more so than “technically makes sense” related.
Posted by Joshjrn
Baton Rouge
Member since Dec 2008
29907 posts
Posted on 7/7/24 at 4:36 pm to
quote:

Well let’s see what AMD does this time with the x3ds. They did talk about “tweaking” and “next steps” with the x3ds so the same amount of cache doesn’t sound “exciting”, but if there’s other benefits, then hey that might be interesting.


And like I said, genuinely just a mild disappointment. If I do a full rebuild at the end of this year, it will almost certainly be with a 9800x3D.

I just need to decide whether I want to do that full rebuild.

ETA: And amusingly, my decision likely won't be predicated on the sexy hardware. My brain has decided that my next build needs to have a rear IO motherboard, so now I have to wait and see whether there are cases and motherboards I like enough to justify tossing my otherwise still gorgeous O11XL Silver.
This post was edited on 7/7/24 at 4:53 pm
Posted by cwil1
Member since Oct 2023
907 posts
Posted on 7/12/24 at 10:37 am to
Future proofing doesn't exist.
Posted by Carson123987
Middle Court at the Rec
Member since Jul 2011
67324 posts
Posted on 7/12/24 at 11:02 am to
Ended up grabbing the Alienware AW3225QF 4K/240. It's normally $1199, but thanks to the autistic Slickdeals commenters and their deal magic, I was able to combine a Dell sale, Alienware promo, credit card offer, and Rakuten cashback to get the monitor for a net price of $793.52. Out for delivery now
Posted by LSUGent
Member since Jun 2011
2728 posts
Posted on 7/12/24 at 11:43 am to
quote:

Ended up grabbing the Alienware AW3225QF 4K/240. It's normally $1199, but thanks to the autistic Slickdeals commenters and their deal magic, I was able to combine a Dell sale, Alienware promo, credit card offer, and Rakuten cashback to get the monitor for a net price of $793.52. Out for delivery now


Nice

I am holding out for a new 45" Ultragear ultrawide in 3840x1600 OLED to replace my current Alienware ultrawide.
Posted by Joshjrn
Baton Rouge
Member since Dec 2008
29907 posts
Posted on 7/12/24 at 12:16 pm to
Have you looked at any reviews regarding pixel density? I would potentially have concerns if I were going to be sitting at desk distance.
Posted by LSUGent
Member since Jun 2011
2728 posts
Posted on 7/12/24 at 12:57 pm to
quote:

Have you looked at any reviews regarding pixel density? I would potentially have concerns if I were going to be sitting at desk distance


The pixel density sucks on the current models because they're 1440p.

If they go to 3840x1600 on the panels the pixel density should easily be around 110ppi+.
Posted by Joshjrn
Baton Rouge
Member since Dec 2008
29907 posts
Posted on 7/12/24 at 1:08 pm to
quote:

The reason why the pixel density sucks on the current models is because they’re 1440p. If they go to 3840x1600 on the panels the pixel density should easily be around 110ppi+.

92 and change, actually. My 34” 3440x1440 is just under 110ppi, and I’m not sure I would want to go much below that at desk distance. That’s why I wanted to flag it for you, just in case.
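The figures in this exchange are easy to check: pixel density is just the diagonal resolution in pixels divided by the diagonal size in inches. A quick sketch for the two panels being discussed:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# The two panels discussed above:
print(round(ppi(3840, 1600, 45), 1))   # 45" 3840x1600 -> 92.4 ppi
print(round(ppi(3440, 1440, 34), 1))   # 34" 3440x1440 -> 109.7 ppi
```

So the proposed 45" 3840x1600 panel lands at roughly 92 ppi, noticeably below the ~110 ppi of a 34" 3440x1440 ultrawide, which is the concern being flagged.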
Posted by UltimateHog
Oregon
Member since Dec 2011
67618 posts
Posted on 7/12/24 at 2:35 pm to
Of course it does, now more than ever.
Posted by Joshjrn
Baton Rouge
Member since Dec 2008
29907 posts
Posted on 7/12/24 at 4:16 pm to
quote:

Of course it does, now more than ever.

Expound?
Posted by cwil1
Member since Oct 2023
907 posts
Posted on 7/13/24 at 6:38 am to
No, false.
Posted by Joshjrn
Baton Rouge
Member since Dec 2008
29907 posts
Posted on 7/13/24 at 9:34 am to
quote:

Future proofing doesn't exist.

I think I agree with you more than I disagree with you, but with a caveat. While I don't think "future proofing" exists in the way people generally use it, I do think there's being penny wise and pound foolish in such a way that you're almost immediately handicapped. There is also a very small subset of parts that can be future proofed, to an extent.

As an example, people who spent a ton of money making sure they had motherboards with PCIe Gen 4 half a decade ago probably replaced/are replacing that motherboard before they actually had a need, or even a use, for Gen 4. But on the flip side, when I was building my current PC three or four years ago, there was a lot of chatter about people "over buying" their PSUs, that unless you were doing some crazy XOC, you could get away with 650w and certainly didn't need more than 750w. Considering most people buy a PSU with an eye towards getting 7-10 years out of it, anyone who purchased a 650w PSU a few years ago is likely regretting the $20 or so it saved them compared to getting an 850w+ unit.

In short, if you *know* that you're keeping a part for a long time, paying attention to building in a bit of future proofing is probably a good idea. If you don't know that you'll be keeping a part for a long time, attempting to build in future proofing is generally going to be a fool's errand. But with *that* said, there's immediate idiocy. If you're building now, don't do something stupid like get 12GB of RAM. Or more relevantly, if pricing rumors are true, I don't think I would spend $300 on a six core Ryzen 9600x. Now to be clear, I'm not necessarily opposed to someone building a budget PC with a six core chip, but at the point where you're spending $300, that's not really a budget build, and I think there are smarter buys that will give you a bit more wiggle room for the future.

But to bring this full circle, the 9900x3D just doesn't really fit any reasonable future proofing criteria, and considering it's a six core chip for gaming, it's more future vulnerable than other chips in its class.
Posted by cwil1
Member since Oct 2023
907 posts
Posted on 7/13/24 at 12:19 pm to
I was just saying, you can't predict hardware or gaming's future. No one knows what games will be like in just 3 years. I recall in 2020, "8GB of VRAM will be fine for years". Didn't happen. CPUs could still only need 6 cores in 5 years, or they could need 12; we just don't know. The best approach is to build the best PC you can for your budget and enjoy it. Trying to buy parts for "future-proofing" is a fool's game. The game industry is just so fluid. Did anyone think we'd have frame generation or FSR just 5-6 years ago? Or DLSS and other upscalers? People with older cards are now hitting crazy FPS with FSR. Something unimaginable just a few years ago.
This post was edited on 7/13/24 at 12:21 pm
Posted by LSUGent
Member since Jun 2011
2728 posts
Posted on 7/13/24 at 1:08 pm to
quote:

No one knows what games will be like in just 3 years.


This is sort of not true. What I mean is that a very large part of the games industry is joined at the hip with console makers and their tech. The next-gen consoles heavily dictate how hard game devs push graphical fidelity.

The PS5 has 16GB of unified memory for the whole system. The console can't use all 16GB for games because it needs some for the OS and other processes, but because it has 16, game devs are now targeting more like 10-12GB of VRAM usage for textures, which is why older 8GB GPUs are no longer sufficient.

We will know what the baseline PC requirements will need to be once we see the specs for the next-gen PlayStation and Xbox.
Posted by cwil1
Member since Oct 2023
907 posts
Posted on 7/13/24 at 1:13 pm to
If you play at 4K, sure. But 10-12GB is still just fine for 1080p and 1440p. And it doesn't hurt to drop from ultra to high, which most of the time looks the same, with ultra only marginally better.
This post was edited on 7/13/24 at 1:14 pm
Posted by UltimateHog
Oregon
Member since Dec 2011
67618 posts
Posted on 7/13/24 at 3:11 pm to
Yes, true.