
re: PC Discussion - Gaming, Performance and Enthusiasts

Posted by bamabenny
Member since Nov 2009
15741 posts
Posted on 9/20/22 at 1:41 pm to
Big yikes. I’ll stick with my 3080FE for a while I guess
Posted by Joshjrn
Baton Rouge
Member since Dec 2008
32747 posts
Posted on 9/20/22 at 1:55 pm to
Only explanation is that Nvidia is intentionally positioning these cards to clear out 3000 series inventory. I thought I was going to struggle to hold off past release date to wait for a 4080 hybrid/aio card. Turns out, I'm not going to struggle, at all. Staggeringly disappointing day for PC gamers.

Here's hoping that AMD picks up the slack on November 3rd.
Posted by UltimateHog
Thailand
Member since Dec 2011
69399 posts
Posted on 9/20/22 at 2:15 pm to
The 7900XT has been rumored at ~$2K so I doubt it. These prices are here to stay I'm afraid but we'll see.
Posted by Joshjrn
Baton Rouge
Member since Dec 2008
32747 posts
Posted on 9/20/22 at 2:26 pm to
Halo cards are always going to be crazy. People whine about it, but it just is what it is.

Nvidia went several steps beyond that here, in my opinion.

Historically, the xx90 and xx80 use the same die, with the xx80 using a cut down version, and this is generally the 102 die. The gaming performance is usually close, with the big difference being VRAM for production. Then, the xx70 is generally one step down, using the 103 die.

Here, the 4090 is using 102. The 4080 16gb is using 103. And the 4080 12gb is using 104. So Nvidia basically created a massive gaming performance separation between the 4090 and 4080 16gb by using different dies and in spite of that, nearly doubled the MSRP of the xx80 generation over generation. And then in what I consider the most insulting move, shifted down-die again with the 4080 12gb. So instead of the xx70 being one full die removed from the xx90, the xx80 12gb is two full dies removed from the xx90. That's insanity.
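To make the die shuffle concrete, here's a quick illustrative sketch (die assignments are the ones discussed above from the launch specs; the tier numbering is just mine for counting steps):

```python
# Illustrative: count how many die tiers each card sits below the flagship 102 die.
# Die assignments per the Ampere/Ada launch specs discussed above.
DIE_TIER = {"102": 0, "103": 1, "104": 2}

cards = {
    "3090": "102", "3080": "102",       # Ampere: same die, just binned
    "4090": "102", "4080 16GB": "103",  # Ada: a full die apart
    "4080 12GB": "104",                 # two full dies below the flagship
}

for card, die in cards.items():
    print(f"{card}: {die}-class die, {DIE_TIER[die]} step(s) below flagship")
```

Run that and the 4080 12GB lands where an xx70 has historically sat, two tiers down.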

With my prattling aside, and ignoring the halo 7900xt, I believe AMD can make significant headway in market share if it puts out a 7800xt that competes with the 4080 16gb but significantly undercuts it on price. In a vacuum, I'm not sure I would think that likely, but Nvidia kinda just blew the doors off again on features, particularly with 3rd gen RT cores and DLSS 3.0, so unless AMD has a massive surprise up its sleeve, it needs to compete on price this gen more than ever.
Posted by UltimateHog
Thailand
Member since Dec 2011
69399 posts
Posted on 9/20/22 at 3:21 pm to
AMD won't lower prices when they're bringing such new, cutting-edge tech to market a year ahead of Nvidia.

Being first to market with bleeding-edge tech is never cheap. Not to mention the cost of two stacked dies versus a conventional single-die GPU.

It's just wayyyy too big of a jump in technology and power. I expect the price to reflect this.
This post was edited on 9/20/22 at 3:24 pm
Posted by Joshjrn
Baton Rouge
Member since Dec 2008
32747 posts
Posted on 9/20/22 at 4:21 pm to
I always take performance rumors with a heaping of salt. But if the outer edge of the performance rumors is true, then maybe AMD won't have to compete on price. We'll see in a couple of months.
Posted by UltimateHog
Thailand
Member since Dec 2011
69399 posts
Posted on 9/20/22 at 5:38 pm to
Yeah, I mean MCM is a big big deal for the future, and Nvidia's and Intel's next consumer GPUs are both MCM as well. Nvidia already has data center MCM cards.

There's a great article from Nvidia several years back about scaling into the future with MCM. Good read; let me find it.

MCM-GPU: Multi-Chip-Module GPUs for Continued Performance Scalability

I'm excited to see how it all performs. On paper the MCM design with infinity cache should be a match made in heaven. Throw in ~4GHz OC and now we're really talking.

But as always, what's on paper doesn't always deliver.
This post was edited on 9/20/22 at 5:46 pm
Posted by Joshjrn
Baton Rouge
Member since Dec 2008
32747 posts
Posted on 9/20/22 at 5:44 pm to
I completely agree. Hell, AMD moving into chiplets basically revolutionized the CPU landscape. It's simply a question of whether this generation bears that fruit, or whether it needs a few generations to "age like fine wine" like Zen needed.

I'm pulling for them. Something needs to give, because the Wattage Wars can't keep going like this.
Posted by UltimateHog
Thailand
Member since Dec 2011
69399 posts
Posted on 9/20/22 at 11:44 pm to
AMD RDNA 3 touts 50 per cent better perf per watt as rumours hint at 4GHz GPU


Good read on AMD maintaining commitment to power efficiency. Over 50% perf per watt gain with 5nm RDNA3.

quote:

In a blog post titled ‘Advancing Performance-Per-Watt to Benefit Gamers,’ senior vice president and product technology architect Sam Naffziger takes us on a trip down memory lane, pointing to significant efficiency gains across the last three generations of AMD Radeon graphics cards.

Most recently, Radeon RX 6000 Series GPUs are cited as having delivered a 65 per cent increase in performance-per-watt over RX 5000 Series cards built on the same 7nm process. Firing a shot across the bow of upcoming competition, Naffziger points out “graphics card power has quickly pushed up to and beyond 400 watts.” RTX 4090, as you’ve no doubt heard, is rumoured to ship with a lofty 450W TDP.

What’s of interest to the enthusiast awaiting next-gen hardware is that Naffziger reckons AMD is on track to deliver on its promise of a greater than 50 per cent increase in performance per watt with 5nm RDNA 3.

Promising “top-of-the-line gaming performance” in “cool, quiet, and energy-conscious designs,” AMD recalls building its CPU and GPU architectures from the ground-up, and a lot of those early bets have begun to pay off. RDNA 3’s efficiency is made possible, says AMD, through refinement of the adaptive power management featured in RDNA 2, as well as a new generation of AMD Infinity Cache.

Efficiency gains typically lend themselves to higher frequencies, so just how quick might a next-gen AMD GPU operate? According to the latest Twitter leaks from @9550pro, we could be looking at GPUs hitting almost 4GHz.
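For what it's worth, those claimed generational gains compound. A quick back-of-envelope check (these are AMD's marketing figures, not measurements):

```python
# Compound AMD's claimed generational perf-per-watt gains:
# RX 5000 -> RX 6000: +65%; RX 6000 -> RDNA 3: +50% (claimed minimum).
rdna2_gain = 1.65
rdna3_gain = 1.50
total = rdna2_gain * rdna3_gain
print(f"Claimed perf/watt vs RDNA 1: {total:.2f}x")  # roughly 2.5x overall
```

If the 50% figure holds, that's nearly 2.5x the perf per watt of RDNA 1 in two generations.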
This post was edited on 9/20/22 at 11:50 pm
Posted by UltimateHog
Thailand
Member since Dec 2011
69399 posts
Posted on 9/21/22 at 12:17 am to
Posted by Joshjrn
Baton Rouge
Member since Dec 2008
32747 posts
Posted on 9/21/22 at 8:11 am to
quote:

Anotha one.


About god damned time...
Posted by MetroAtlantaGatorFan
Member since Jun 2017
15598 posts
Posted on 9/21/22 at 8:36 am to
quote:

RTX 4090 is $1600
RTX 4080 16GB $1200
RTX 4080 12GB $900

Wow, talk about continuing to move the market in a hilarious direction, especially the 4080 now being in 2 tiers with a $300 difference in price (why not call the 16GB version the 4080 Ti?). RTX 3080 was $700 at launch.

Memory isn't the only difference between the 2 4080s. The 12gb one is basically a 4070. Yet they're selling it for $400 more than the 3070 launched at. And there's no crypto boom or pandemic going on. WTF is Nvidia doing?!

Posted by Joshjrn
Baton Rouge
Member since Dec 2008
32747 posts
Posted on 9/21/22 at 11:45 am to
Yep, I went on an in-depth rant a page or two ago:

quote:

Historically, the xx90 and xx80 use the same die, with the xx80 using a cut down version, and this is generally the 102 die. The gaming performance is usually close, with the big difference being VRAM for production. Then, the xx70 is generally one step down, using the 103 die.

Here, the 4090 is using 102. The 4080 16gb is using 103. And the 4080 12gb is using 104. So Nvidia basically created a massive gaming performance separation between the 4090 and 4080 16gb by using different dies and in spite of that, nearly doubled the MSRP of the xx80 generation over generation. And then in what I consider the most insulting move, shifted down-die again with the 4080 12gb. So instead of the xx70 being one full die removed from the xx90, the xx80 12gb is two full dies removed from the xx90. That's insanity.


Posted by MetroAtlantaGatorFan
Member since Jun 2017
15598 posts
Posted on 9/21/22 at 2:02 pm to
Yeah, I initially thought the only difference was 4GB of VRAM, since they've done that previously with the 1060 and 2060.
Posted by bamabenny
Member since Nov 2009
15741 posts
Posted on 9/21/22 at 10:24 pm to
What’s the 3080 die?
Posted by Joshjrn
Baton Rouge
Member since Dec 2008
32747 posts
Posted on 9/21/22 at 10:29 pm to
quote:

What’s the 3080 die?


Both the 3090 and 3080 used a GA102 die. Basically, if the silicon quality was good enough, it became a 3090. But if small areas of the die had defects, those areas would be disabled and it would be used in a 3080. That's why the gaming performance gap between a 3090 and 3080 wasn't all that significant: a 10% or so performance boost at over double the price.

Now we are talking about a 4090 potentially being 75%+ more powerful at less than 50% price increase compared to a 4080 16gb. It’s fricked.
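Run the rumored numbers (75% faster, MSRPs of $1,600 vs $1,200; rumors and list prices, not benchmarks) and the perf-per-dollar math comes out upside down:

```python
# Rough perf-per-dollar check using the rumored figures above:
# 4090 ~75% faster than the 4080 16GB; MSRPs $1,600 vs $1,200.
price_4090, price_4080 = 1600, 1200
perf_4090, perf_4080 = 1.75, 1.00  # relative gaming performance (rumored)

ppd_4090 = perf_4090 / price_4090
ppd_4080 = perf_4080 / price_4080
ratio = ppd_4090 / ppd_4080
print(f"4090 perf per dollar vs 4080 16GB: {ratio:.0%}")  # 131%
```

At those numbers the flagship is the better value per dollar, which is backwards from every previous generation.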
This post was edited on 9/21/22 at 10:30 pm
Posted by thunderbird1100
GSU Eagles fan
Member since Oct 2007
72203 posts
Posted on 9/23/22 at 8:12 am to
Something tells me AMD might be willing to deal a big blow to Nvidia here. I think Nvidia is a bit tone deaf to the current state of people's pockets over the last year+. Essentially rebadging a 70-series $500-ish card as a 12 GB 4080 for $900 is a gigantic slap in the face to people who buy mid-range GPUs.

AMD can also just join the profit party or take a bit of an opportunity to take over a huge part of market share.

If AMD can put something out that beats a 4080 12 GB for $150-$200 less, they will win a lot of people over.

Or they can make a 700-series-class card $800-$900 similar to Nvidia, their 800-series-class card $1100-$1200, and a 900-series-class card $1500-$1600, and just make us all pay up if the performance is similar to (or better than) Nvidia's cards at those prices.

Question is, do they want the profits or the market share? The market share is there for the taking in a big way as Nvidia clears out 30 series stock over the next year by overpricing their 40 series cards for now, specifically the 2 different "4080s".
This post was edited on 9/23/22 at 8:23 am
Posted by finchmeister08
Member since Mar 2011
40082 posts
Posted on 9/23/22 at 8:22 am to
AMD dropped the prices of their 6000 series cards
Posted by boXerrumble
Member since Sep 2011
54363 posts
Posted on 9/23/22 at 8:28 am to
Frankly if I was building a new PC right now, I'd probably go after a 6750XT or a 6800 for my gaming needs (or wait for AMD RX 7000 series).

I think my 2070S might be the last Nvidia card I buy for a while.
This post was edited on 9/23/22 at 8:31 am
Posted by thunderbird1100
GSU Eagles fan
Member since Oct 2007
72203 posts
Posted on 9/23/22 at 8:32 am to
I've seen a few 6900 XTs over the past month or two for $700 new. That $949 price for the 6950 XT still makes no sense considering the very small (5-10%) performance bump over a 6900 XT. Should be an $800 card at most.

$700 or less is kind of a bargain for a 6900 XT for now until the new cards hit. $600 and under for a 6800 XT is also very appealing.

I like seeing the 6700 XT at $379 now too. I remember getting my 5700 XT for $350 brand new when they came out, with a $50-off deal at Microcenter; I still have that 5700 XT. I would consider a 6700 XT for like $350 too, but it looks like I can only get $200-$225 for my 5700 XT now. My next card will be a 4K card though, since I'll be getting a 4K projector as my next big purchase. The 5700 XT is not enough in that regard, and the 6700 XT is closer but still not quite there for 4K gaming in some cases. I would imagine something like a 7700 XT shouldn't break much of a sweat.