Xbox One eSRAM: “You Can Do Amazing Things With It”


Dylan Zellmer

Dylan splits time between games journalism, designing video games, and playing them. Outside of his deep involvement in the games industry, he enjoys It’s Always Sunny in Philadelphia, Shameless, A Song of Ice and Fire, fitness, and family.

  • Dirkster_Dude

    This sounds like people who actually understand the technology and might have a point about it. It is tiresome listening to people who don’t even know how to code spout off nonsense about what IS and ISN’T a good thing in terms of developing games.

    • Kevin Malone

      Come on, now. I’m pretty sure everyone on the internet is a qualified computing hardware developer. I mean, if you can copy/paste blocks of text, you’re pretty much a genius.

      • The Wolf 47

        The Xbox One was not even supposed to have eSRAM to begin with.
        They changed it at the last minute in order to compete with the PS4’s GDDR5, which caught them off guard.
        The PS4’s CPU was optimized to use the GPU as its main core, with GDDR5 RAM supporting it.
        The Xbox One design just got eSRAM thrown in at the last minute, hardly optimized at all. Had they run tests for years, they would have known about its weaknesses and opted for a different build and configuration.
        I highly doubt they were able to optimize the console for the use of eSRAM within a few months… (when we consider that these consoles were in development for years)

        • Joe

          And your evidence for this is?….. Yeah, just as I thought: jack sh**t. The eSRAM is a natural evolution of the eDRAM solution from the Xbox 360, only it’s more flexible.

          • Michael Norris

            eSRAM is garbage, no matter how you slice it. MS missed the boat on GDDR5, which would have unified the whole console and made things much easier for developers. Now developers are playing catch-up because of all this, and the fact that the GPU/CPU are lacking as well.

      • Xtreme Derp

        Being harder to code for, more expensive, and weaker is not “a good thing”. Hope that helps.

    • Wadda Umean

      But surely it would be better to have “extremely high bandwidth”, as he puts it, everywhere (PS4, 176 GB/s) rather than just in 32 MB (XB1, 109-204 GB/s depending on how well you can mix reads and writes), right?
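
      (To put rough numbers on the “mix reads and writes” point: the sketch below simply interpolates between the two figures quoted in this thread, 109 GB/s one-way and ~204 GB/s peak. The linear model is an illustrative assumption, not a description of how the actual memory controller behaves.)

```python
# Toy model of the eSRAM bandwidth range quoted in this thread:
# 109 GB/s for pure reads or pure writes, ~204 GB/s peak when reads
# and writes overlap. The linear interpolation is an illustrative
# assumption, not how the real memory controller behaves.

ONE_WAY_GBPS = 109.0        # read-only or write-only figure from the thread
COMBINED_PEAK_GBPS = 204.0  # quoted peak with ideal read/write mixing

def effective_bandwidth(overlap_fraction: float) -> float:
    """overlap_fraction: 0.0 = pure reads or pure writes, 1.0 = perfect mix."""
    overlap_fraction = max(0.0, min(1.0, overlap_fraction))
    return ONE_WAY_GBPS + overlap_fraction * (COMBINED_PEAK_GBPS - ONE_WAY_GBPS)

for f in (0.0, 0.25, 0.5, 1.0):
    print(f"{f:.2f} overlap -> {effective_bandwidth(f):.0f} GB/s")
```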

      • Vdek

        The PS4 actually has much weaker bandwidth when both the GPU and CPU try to access memory, according to Sony: only ~120 GB/s total when the CPU really needs the bandwidth. The eSRAM bandwidth is optimized for the 16 ROPs on the X1, leaving the DDR3 bandwidth intact.

        • Xtreme Derp

          WRONG

          What is it with dumb as bricks microsoft fanboys being constantly WRONG over and over again?

          • Vdek

            Wow you totally disproved what I said, good going idiot.

          • Xtreme Derp

            Wow you’re totally stupid and WRONG.

    • Xtreme Derp

      Being harder to code for, more expensive, and weaker is not “a good thing”. Hope that helps.

      • Dirkster_Dude

        The fact that I don’t agree with you doesn’t make you wrong, but I will use my 20 years of programming experience to make my own judgments in terms of “harder to code”. Now, “more expensive” I can give you. As for “weaker”, I’m assuming you mean less powerful; so far I haven’t seen any games where it matters, except on paper. Time will tell, I suppose.

  • Microsoft sux

    It just does not help you get 1080p consistently like fast GDDR5 does.

    • Gabrielsp85

      eSRAM is faster than GDDR5; even Lord Cerny said it. It’s only a matter of time before devs learn to use it properly and better tools make it easier.

      • Arnold Stallone

        There was a big article with devs about the X1 eSRAM.
        – 109 GB/s is a peak value obtained with a few lines of code written to measure it, in a lab.
        In the real world, 65-70 GB/s is what devs and games can get.

        – You can multiply it by 2 if you manage, in a perfect world, to do 50% reads and 50% writes AT THE SAME TIME.
        But that is unlikely to ever happen. Those 32 MB are so tiny that devs will often be writing data, or reading data. For the perfect world to happen, the X1 would need to be constantly writing 16 MB while reading 16 MB.
        Therefore, they need to choose: faster speed, writing 32 MB of data and reading 32 MB of data, or optimized bandwidth, with only 16 MB to play with. Choices…

        If there were 64 MB, or maybe 128 MB, maybe for an extra $20-30, the system would be fantastically fast, with that huge buffer to do plenty of things. But 32 MB you can fill with a single high-def texture.

        I imagine devs have to invent new tricks to play with those 32 MB every single day.

        But new drivers or DirectX, etc., won’t make that bandwidth faster by miracle.
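
        (A quick back-of-the-envelope check on the “a single high-def texture fills 32 MB” point above. The sizes assume uncompressed RGBA8 surfaces, which is an assumption for illustration; shipping engines compress most textures.)

```python
# Back-of-the-envelope sizes for the "32 MB fills up fast" point.
# All figures assume uncompressed 32-bit RGBA (4 bytes per pixel),
# so they are illustrative upper bounds, not typical in-game numbers.

ESRAM_MB = 32

def surface_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / (1024 * 1024)

surfaces = {
    "1080p colour target (RGBA8)": surface_mb(1920, 1080),
    "2048x2048 texture (RGBA8)":   surface_mb(2048, 2048),
    "4096x4096 texture (RGBA8)":   surface_mb(4096, 4096),
}

for name, mb in surfaces.items():
    print(f"{name}: {mb:5.1f} MB ({mb / ESRAM_MB:.0%} of eSRAM)")
```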

      • Xtreme Derp

        WRONG

        What is it with dumb as bricks microsoft fanboys being constantly WRONG over and over again?

  • angh

    They have had eSRAM for the last few years, on the last generation, so developers know exactly how to use it. The problem is, as fast as it can be, it is just a bit too small.

    • incendy

      Actually, that is incorrect: the X360 had 10 MB of eDRAM, which refreshes at much lower rates than eSRAM. The X1 uses 32 MB of eSRAM.
      As for developers being used to it, these systems are very different. The X360’s CPU was PowerPC-based, not x86, so coding techniques for performance will not carry over.


  • Gamez Rule

    Can eSRAM make the graphics more powerful to match the PS4? Nope☺

  • Fango

    I’ll believe it when I see it.

  • Xtreme Derp

    Xbox’s DDR3 and eSRAM are inferior to GDDR5 in both size and sustained bandwidth. It’s a poor design decision, period.

    • James Deng

      Size, sustained bandwidth, and general locality: the penalty for somewhat more eccentric memory access patterns is much, much less on the PS4 than the XB1, due simply to the fact that 8 GB >> 32 MB.

      • Xtreme Derp

        Yes, it’s a detriment to gaming graphics performance, there’s little or no upside to it. All the claims about “3 billion dollars” are nonsense conspiracy theories, it was simply a poor design decision favoring media and multitasking over gaming performance.

    • joe

      eSRAM is superior in sustained bandwidth but a “poor design” considering current game engines. Of course, it was designed with the future in mind and will come into its own with tiled resources and future game engines.
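
      (For anyone unfamiliar with tiled resources, the sketch below illustrates the basic idea the comment refers to: only the tiles of a large texture that are actually sampled need physical memory. The 64 KB tile size matches Direct3D tiled resources; the texture size and visible fraction are invented purely for illustration.)

```python
# Rough illustration of the tiled-resources idea: a large virtual
# texture is split into fixed-size tiles, and only the tiles actually
# sampled in a frame need to be resident in physical memory.
# The 64 KB tile size matches Direct3D tiled resources; the "visible
# fraction" is a made-up parameter purely for illustration.

TILE_BYTES = 64 * 1024

def resident_mb(full_texture_mb: float, visible_fraction: float) -> float:
    total_tiles = full_texture_mb * 1024 * 1024 / TILE_BYTES
    needed_tiles = int(total_tiles * visible_fraction) + 1
    return needed_tiles * TILE_BYTES / (1024 * 1024)

# e.g. a 256 MB virtual texture of which roughly 5% is sampled this frame
print(f"~{resident_mb(256, 0.05):.1f} MB resident instead of 256 MB")
```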

      • Xtreme Derp

        Lol no. “Future” gaming engines use deferred rendering, which has even larger framebuffers; it’s even worse for the tiny eSRAM. The PS4 can also do PRT, which is the OpenGL-side equivalent of tiled resources, and it’s not going to solve the Xbox’s memory size/width problems.

        When your framebuffer exceeds the size of the on-die RAM, you just don’t get the bandwidth advantage any more, so your fill rate will suffer. In layman’s terms, you need to cut down on things like MSAA, alpha effects, 60 FPS, or even resolution. Framerate is the first thing that gets cut, because devs prioritize graphics over 60 FPS most of the time, but when they don’t, they are left with no options and have to cut back on the rest.
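
        (Rough arithmetic behind the deferred-rendering point: the sketch below totals a hypothetical 1080p G-buffer against the 32 MB of eSRAM. The choice of render targets and formats is an assumption; real engines vary.)

```python
# Quick arithmetic for the deferred-rendering point above: a
# hypothetical G-buffer layout at 1080p versus the 32 MB of eSRAM.
# The exact set of render targets and formats varies per engine;
# this layout is an assumption chosen only to show the scale.

ESRAM_MB = 32
W, H = 1920, 1080

gbuffer_bytes_per_pixel = {
    "albedo (RGBA8)":          4,
    "normals (RGB10A2)":       4,
    "material params (RGBA8)": 4,
    "depth/stencil (D24S8)":   4,
    "HDR lighting (RGBA16F)":  8,
}

total_mb = sum(W * H * bpp for bpp in gbuffer_bytes_per_pixel.values()) / (1024 * 1024)
print(f"G-buffer at 1080p: ~{total_mb:.1f} MB vs {ESRAM_MB} MB of eSRAM")
```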

    • ^ How does it feel to suck so much?

      • Xtreme Derp

        Mentally ill, dumb-as-rocks cretin.

  • ImonadrugcalledCharlieSheen

    So let me get this right: a Microsoft employee is telling us their eSRAM is good. Let me act surprised. What they say is irrelevant; the overwhelming majority of developers have said it’s trash. But I’m sure one day it might be utilized to make things marginally better on the Xbone. Of course, by that time the PS5 will be out.

  • azz156

    I remember the arguments the Sony fanbois had with the PS2, saying “the gameplay is what matters” and completely ignoring the fact that the original Xbox was 3x faster in every way. I just finished playing Titanfall for 3 hours and I can say it looks perfectly fine, and I didn’t get any lag when playing it. But if I wanted better gfx, I would have bought it for my new gaming PC.