The Xbox One may have weaker hardware than the PlayStation 4, but according to Xbox director of development Boyd Multerer, the Xbox One is effectively a "Super-Computer Design," and it will allow developers to squeeze more performance out of the console over time.

In an interview with Total Xbox, Multerer commented on the resolution controversy often associated with the Xbox One.

“Part of it is the obvious one where everyone’s still getting to know this hardware and they’ll learn to optimise it. Part of it is less obvious, in that we focused a lot of our energy on framerate. And I think we have a consistently better framerate story that we can tell.”

Multerer then discussed how Microsoft prioritized CPU performance on the Xbox One. While that may not have translated into higher resolutions in games, he placed an emphasis on the frame rate, calling it "smooth" on the Xbox One.

The Xbox One's design is often said to be held back by bottlenecks such as the ESRAM, but according to Multerer this is not the main issue. He calls the GPU "complicated" compared to the relatively easier-to-program Xbox 360 GPU, asserting that once developers are able to utilize the hardware pipeline without stalls, they can expect improved performance from the Xbox One.

“The GPUs are really complicated beasts this time around. In the Xbox 360 era, getting the most performance out of the GPU was all about ordering the instructions coming into your shader. It was all about hand-tweaking the order to get the maximum performance. In this era, that’s important – but it’s not nearly as important as getting all your data structures right so that you’re getting maximum bandwidth usage across all the different buffers. So it’s relatively easy to get portions of the GPU to stall. You have to have it constantly being fed.”

He compares this hardware design to a "Super-Computer Design" and says we should expect "fairly large improvements in GPU output" in the near future.

What do you think about Boyd Multerer's statements? Let us know in the comments below.