re: Xbox One memory performance improved for production console

Posted by stout
Porte du Lafitte
Member since Sep 2006
181413 posts
Posted on 7/1/13 at 2:47 pm to
This is also pretty cool: it allows higher resolution on what you are actually looking at, so processing resources can be freed up, versus running everything at high resolution and draining your computational power.


quote:

The key idea is to display lots of detail without drowning a graphics card’s memory, by only loading in the high detail textures when the camera is pointing at them

It doesn’t sound too dissimilar to tech already in use, but Microsoft seems pretty proud of its efforts, which will only be available for Windows 8 and next-gen consoles.

“The motivation for doing something like this is to enable you to make games with unprecedented amounts of detail,” Microsoft’s Antoine Leblond said.

Demos shown to attendees included a map of the surface of Mars and the detail on a glider.

No images were shared so please enjoy this header showing off a Tuscan mosaic.
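The idea in that quote boils down to visibility-driven streaming: keep only low-detail textures resident by default, and page the high-detail versions in when the camera can actually see them. Here's a minimal sketch of that loop; the names and the crude "frustum test" are illustrative stand-ins, not any real API:

```python
from dataclasses import dataclass

@dataclass
class Tile:
    x: int
    y: int
    resident: bool = False  # is the high-detail mip currently loaded?

def visible(tile, cam_x, cam_y, radius=2):
    # Crude stand-in for a frustum test: tiles near the camera are "seen".
    return abs(tile.x - cam_x) <= radius and abs(tile.y - cam_y) <= radius

def update_residency(tiles, cam_x, cam_y):
    loads = evictions = 0
    for t in tiles:
        if visible(t, cam_x, cam_y) and not t.resident:
            t.resident = True    # real engine: page the tile into GPU memory
            loads += 1
        elif not visible(t, cam_x, cam_y) and t.resident:
            t.resident = False   # drop back to the low-detail mip
            evictions += 1
    return loads, evictions

tiles = [Tile(x, y) for x in range(16) for y in range(16)]
print(update_residency(tiles, 8, 8))  # camera at the centre: nearby tiles load
print(update_residency(tiles, 0, 0))  # camera moves: old tiles evict, new ones load
```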


LINK
This post was edited on 7/1/13 at 2:49 pm
Posted by Cs
Member since Aug 2008
10681 posts
Posted on 7/1/13 at 2:49 pm to
quote:

So how could Microsoft's own internal tech teams have underestimated the capabilities of its own hardware by such a wide margin? Well, according to sources who have been briefed by Microsoft, the original bandwidth claim derives from a pretty basic calculation - 128 bytes per block multiplied by the GPU speed of 800MHz offers up the previous max throughput of 102.4GB/s. It's believed that this calculation remains true for separate read/write operations from and to the ESRAM. However, with near-final production silicon, Microsoft techs have found that the hardware is capable of reading and writing simultaneously. Apparently, there are spare processing cycle "holes" that can be utilised for additional operations. Theoretical peak performance is one thing, but in real-life scenarios it's believed that 133GB/s throughput has been achieved with alpha transparency blending operations (FP16 x4).


If this is true, it looks like Microsoft actually lowered the clock on the GPU itself by about 50MHz. Instead of expressing this in terms of unidirectional bandwidth, they chose to express the bandwidth bidirectionally.

As the article mentions, 128 bytes multiplied by a GPU clock of 800MHz yields a (unidirectional) bandwidth of 102.4GB/s. However, if this were expressed as bidirectional bandwidth, that value would be 204.8GB/s, not 192GB/s.

Thus, a GPU downclocked by 50MHz would yield a unidirectional bandwidth of 96GB/s, not 102.4GB/s, thereby resulting in a bidirectional bandwidth of 192GB/s, not 204.8GB/s.
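The arithmetic is easy to check. A quick sketch; note the 750MHz figure is the post's inference (800MHz minus a 50MHz downclock), not a confirmed spec:

```python
BUS_BYTES = 128  # bytes per ESRAM block per cycle, per the article

for clock_mhz in (800, 750):
    uni = BUS_BYTES * clock_mhz * 1e6 / 1e9  # one-way bandwidth in GB/s
    bi = 2 * uni                             # simultaneous read + write
    print(f"{clock_mhz}MHz: {uni:.1f}GB/s unidirectional, {bi:.1f}GB/s bidirectional")

# 800MHz: 102.4GB/s one way, 204.8GB/s both ways
# 750MHz:  96.0GB/s one way, 192.0GB/s both ways <- matches the 192GB/s claim
```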

Additionally, it's important to keep in mind that this is only a 32 MB cache. As the article mentions -

quote:

But in a world where Killzone: Shadow Fall is utilising 800MB for render targets alone, how difficult will it be for developers to work with just 32MB of fast memory for similar functions?


Posted by jeff5891
Member since Aug 2011
15955 posts
Posted on 7/1/13 at 2:51 pm to
quote:

telling developers that 192GB/s is now theoretically possible


BOOM
Posted by stout
Porte du Lafitte
Member since Sep 2006
181413 posts
Posted on 7/1/13 at 2:53 pm to
quote:

Additionally, it's important to keep in mind that this is only a 32 MB cache. As the article mentions -



I knew you would say this, but what about the reverse side of this: how much is the higher latency of GDDR5 hurting linear calculations on the PS4's CPU?

Seems like you trade one for the other when using only one type of memory, versus trying to get the best of both worlds, whether it's just 32MB or not.
This post was edited on 7/1/13 at 2:54 pm
Posted by taylork37
Member since Mar 2010
15776 posts
Posted on 7/1/13 at 2:59 pm to
quote:

MS sure rustled your jimmies hard huh?


Oh yes..my jimmies are rustled.
Posted by Cs
Member since Aug 2008
10681 posts
Posted on 7/1/13 at 3:18 pm to
quote:


I knew you would say this, but what about the reverse side of this: how much is the higher latency of GDDR5 hurting linear calculations on the PS4's CPU?

Seems like you trade one for the other when using only one type of memory, versus trying to get the best of both worlds, whether it's just 32MB or not.


I've actually looked into this a bit, and it seems the absolute latency of GDDR5 isn't higher than DDR3's. It's higher in clock cycles, but each cycle is shorter thanks to the higher clock speed: the data rate goes up, so latency measured in cycles goes up with it, while the absolute latency stays about the same.
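To make that concrete, absolute latency is just cycles divided by the clock. A quick sketch; the CAS timings below are typical-looking examples, not the consoles' actual memory timings:

```python
def latency_ns(cas_cycles, command_clock_mhz):
    # Absolute latency = cycles / clock frequency
    return cas_cycles / command_clock_mhz * 1e3

# DDR3-1600 (800MHz command clock) at CL11:
print(latency_ns(11, 800))    # ~13.8ns
# GDDR5 at 5.5Gbps (1375MHz command clock) at CL19:
print(latency_ns(19, 1375))   # ~13.8ns - more cycles, same wall-clock time
```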

With that said, DDR3 is still a bit better suited to CPU-related tasks. GDDR5 is really just DDR3 with a prefetch buffer, which is where GDDR5 gets its speed advantage. However, when dealing with random access, having to constantly flush and refill that buffer makes it slightly slower in that respect.
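A toy model of that random-access penalty, purely illustrative: reads fetch a whole burst into the buffer, so sequential access reuses it while random access throws it away almost every time.

```python
import random

PREFETCH = 8  # words fetched per burst

def refills(addresses):
    # Count how often the buffered burst has to be replaced.
    buffered, count = None, 0
    for a in addresses:
        burst = a // PREFETCH
        if burst != buffered:
            buffered, count = burst, count + 1
    return count

seq = list(range(1000))
rnd = random.sample(range(1000), 1000)
print(refills(seq))  # 125 refills: sequential access amortises each burst
print(refills(rnd))  # ~990 refills: random access defeats the prefetch buffer
```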

The additional ESRAM cache is undoubtedly fast. However, the fact that it's only 32MB really does make a difference. It's akin to building a skyscraper when your materials are on the other side of the country, yet you have an empty cereal box that can transfer materials within seconds from one location to the other. The cereal box is fast, but in the context of building a multi-story concrete building, there are limits on both what it can transfer and how practical it is for the project as a whole.

Mark Cerny and the design team actually considered employing a small cache, in a design paradigm not entirely unlike what Microsoft has done with the Xbox One. Below is an image from the presentation.

[image: slide from Cerny's presentation]

By halving the memory bus and implementing a small cache of eDRAM, Cerny says they would have been able to achieve a collective bandwidth of over a terabyte per second. Yet they decided against that option for two reasons: first, the additional cache increased the overall complexity of the system, meaning developers would need to spend additional time and money adjusting their code to take advantage of it; second, despite the cache's incredibly high bandwidth, its small size seemed impractical given the sheer amount of data involved when rendering games at 720p/1080p.
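Rough numbers for the trade-off Cerny described, under stated assumptions: 176GB/s is the PS4's announced GDDR5 bandwidth, and the eDRAM figure is just the "over a terabyte per second" from the presentation.

```python
full_bus = 176.0        # GB/s: 256-bit GDDR5 at 5.5Gbps (32 bytes x 5.5GT/s)
halved_bus = full_bus / 2
edram = 1000.0          # GB/s: the small on-chip cache's claimed bandwidth

print(halved_bus + edram)  # ~1088GB/s collective peak...
# ...but only for working sets that actually fit in the tiny cache;
# everything else is stuck behind the halved 88GB/s external bus.
```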
This post was edited on 7/1/13 at 3:23 pm
Posted by taylork37
Member since Mar 2010
15776 posts
Posted on 7/1/13 at 3:22 pm to
I'll just leave this one up to the adults...my head hurts.
This post was edited on 7/1/13 at 3:23 pm
Posted by bluebarracuda
Member since Oct 2011
19317 posts
Posted on 7/1/13 at 3:27 pm to
Posted by oauron
Birmingham, AL
Member since Sep 2011
14597 posts
Posted on 7/1/13 at 3:31 pm to
Hardware and clock speed battles were never all that interesting to me. The two machines are similar enough that neither platform will hold the other back to any detriment.

Posted by stout
Porte du Lafitte
Member since Sep 2006
181413 posts
Posted on 7/1/13 at 3:32 pm to
quote:

It's akin to building a skyscraper when your materials are on the other side of the country, yet you have an empty cereal box that can transfer materials within seconds from one location to the other. The cereal box is fast, but in the context of building a multi-story concrete building, there are limits on both what it can transfer and how practical it is for the project as a whole.



I agree, which makes me wonder how devs will choose to use it. Maybe just for the really heavy lifting.

I also wonder if coupling the ESRAM with the processing they will do in the cloud is one reason they went with DDR3 instead of GDDR5. Maybe they feel confident enough in the ESRAM and the cloud, or maybe they are just cheap. Who knows? 32MB of ESRAM on a custom APU has to be costing them, though.
Posted by bluebarracuda
Member since Oct 2011
19317 posts
Posted on 7/1/13 at 3:35 pm to
quote:

Hardware and clock speed battles were never all that interesting to me. The two machines are similar enough that neither platform will hold the other back to any detriment.


This is what I've been saying. The difference in graphics will probably be the exact same as it is in this gen.
Posted by Klark Kent
Houston via BR
Member since Jan 2008
74481 posts
Posted on 7/1/13 at 3:36 pm to
quote:

I'll just leave this one up to the adults...my head hurts.


this. i tried reading the first page, but it's way over my head.

where's that tom guy who is an expert at pretty much everything technology related but won't ever give us a source or his background/resume.

i'm sure he'll be able to tell us what to believe or what is pure bullshite.

his opinion is strongly regarded around here.
Posted by taylork37
Member since Mar 2010
15776 posts
Posted on 7/1/13 at 3:38 pm to
quote:

where's that tom guy who is an expert at pretty much everything technology related but won't ever give us a source or his background/resume.


Dude, he knows quite a lot....haven't you seen his website?

LINK
Posted by oauron
Birmingham, AL
Member since Sep 2011
14597 posts
Posted on 7/1/13 at 3:39 pm to
quote:

This is what I've been saying. The difference in graphics will probably be the exact same as it is in this gen.



Which is largely not there. There are a few exceptions that benefited from being on one platform versus the other, but the systems are too alike for a lot of that to happen this time (most notably, they use the same media for games and no longer have to do extra compression for one platform).
Posted by stout
Porte du Lafitte
Member since Sep 2006
181413 posts
Posted on 7/1/13 at 3:40 pm to
quote:

where's that tom guy who is an expert at pretty much everything technology related but won't ever give us a source or his background/resume.




I found Tom's true nemesis

quote:

What are the reasons why you are buying an XboxOne this gen?

I'm certain I'm mostly alone in this one but the major selling point to me is:

Deep Azure integration and subsidized cloud resources for devs

I am a white-paper junkie and read ACM like it's my job. Designing and architecting "clouds" is kind of my thing. All this dismissal of Azure is balls. The crap we've pulled off in industry and academia with AWS, Azure, and various non-public clouds would blow most people's minds. By the end of 2014 it will be a defining feature.

Increasing hardware specs is not "next gen" to me. If the consoles were simply upgrades of their predecessors, I wouldn't buy either. I mean really, is it a "next gen" system every time I upgrade the graphics in my PC? My PC has come a long way since Voodoo cards. More cores, higher clock speeds, better energy consumption, etc. don't match my view of a noteworthy "next generation" experience. Wow, I can crank out some more polys? Shadows are going to be better? "Cool..." not "OMGWTF!" Shifting the paradigm is, though, IMO. Cloud, convergence, bio-feedback, hypervisor -- that's what I consider "next gen." Tell me the hardware has enough power that the games still look good and don't hold back the true next-gen features; otherwise I don't care.


The guy gives some background info on his expertise and why he feels this way unlike Tom.
This post was edited on 7/1/13 at 3:41 pm
Posted by SG_Geaux
Beautiful St George, LA
Member since Aug 2004
80635 posts
Posted on 7/1/13 at 3:40 pm to
quote:

Which is largely not there. There are a few exceptions that benefited from being on one platform versus the other, but the systems are too alike for a lot of that to happen this time (most notably, they use the same media for games and no longer have to do extra compression for one platform).


Mostly, people will only be able to tell the difference in side-by-side screenshots.
This post was edited on 7/1/13 at 3:41 pm
Posted by SG_Geaux
Beautiful St George, LA
Member since Aug 2004
80635 posts
Posted on 7/1/13 at 3:41 pm to
quote:

Dude, he knows quite a lot....haven't you seen his website?



Posted by taylork37
Member since Mar 2010
15776 posts
Posted on 7/1/13 at 3:42 pm to
quote:

Mostly, people will only be able to tell the difference in side-by-side screenshots.


Which is kinda sad that there are websites devoted to comparisons like this.

Lens of Truth
Posted by stout
Porte du Lafitte
Member since Sep 2006
181413 posts
Posted on 7/1/13 at 3:46 pm to
I bet $1000 I could find several posts from the people now touting the difference in hardware as NBD saying the opposite when the initial announcements were made. At that time it was perceived that the PS4 was light-years ahead of the Xbone, and it absolutely mattered to a few who were boasting about it at the time. Funny how people's stances change as info comes out.
This post was edited on 7/1/13 at 3:47 pm
Posted by taylork37
Member since Mar 2010
15776 posts
Posted on 7/1/13 at 3:52 pm to
quote:

Funny how people's stances change as info comes out.


Who knows...they were probably just trolling.