The Ninth Console Generation has been a weird one. First we had the Switch releasing in the middle of the Eighth Generation as a Wii U replacement, and then years later the PlayStation 5 and Xbox Series both released during a chip shortage. While the Switch had been out for quite some time and gained quite the fanbase, the Xbox Series and PS5 struggled to win fans over.
Those who wanted the next-gen consoles found themselves hunting for them for years, while others simply had no reason to ditch their preexisting PS4 and Xbox One (and One X) consoles. Not to mention the fact that all Xbox games were now releasing on PC as well, with Sony also bringing a select few PlayStation exclusives to PC here and there. So even if the consoles were easy to come by... why buy them?
Games. That's the reason we buy any console that comes out. New consoles mean new games, and these new games are usually made to take advantage of what the new hardware offers. Sure, games are made with lower requirements as well, because not every single game needs to be "cutting edge." Why would a 2D platforming game need to take full advantage of the PS5 or Xbox? The answer? It doesn't!
Using the full capability of the hardware doesn't make a game good, and a game isn't bad for not pushing that hardware to its fullest. What stronger hardware does allow is for developers to be less limited in what they create.
The Consoles of the Past
If you look at something like the PS1, you can see that it can handle 3D graphics, voice acting, higher quality music, etc. The CDs themselves can only hold so much data, so multiple discs are required if you want to take full advantage of that, but you're also limited by what the PS1 itself can process at any given time. Too many objects in an area? The game is going to stutter and struggle to keep up. Textures that are too high quality? Again, it won't be able to handle them. Jumping to the PS2 loosens these constraints, opening games up to higher quality models and textures, more objects within the areas, more NPCs, etc., but again, there's always a limit.
The original Xbox was pretty even with the PS2 spec-wise, and the PS3 and Xbox 360 once again lifted the constraints to allow for HD graphics and games that could push past what the PS2 and Xbox allowed. Games like Metal Gear Solid 2 were groundbreaking, letting you shoot glass bottles to pieces or knock books off a bookshelf. These were little touches the PS2 could handle, but they had to be limited to avoid taking resources away from the core game. For the 360 and PS3 though? We could have hundreds of bottles just lying on the floor to destroy, and it really wouldn't matter. Those limitations were now gone. We could build a game with hundreds of high quality enemies on screen and large, detailed levels, and still include these little touches without tanking the game's performance.
And this is how the pattern continues.
Xbox One and PlayStation 4 raised the limits yet again, allowing for everything the previous generation could do, plus higher quality textures, more objects on screen at any given time, more demanding particle effects, lighting, shadows, etc., and games could (mostly) run in 1080p. The Xbox One X and PlayStation 4 Pro stepped us up a bit more, allowing for 4K and 60 fps, but being part of the same generation meant that games couldn't fully make use of their additional power. And that's the issue that is plaguing us now.
As developers' dreams continue to grow, games are becoming bigger and bigger with each passing year. Massive open worlds, more enemies, smarter AI, features like ray tracing, 4K output, trying to hold 60 fps while also making the world more detailed, etc. Developers want to do more and more with their games, and they also want to avoid impacting performance. We can make these nice-looking maps with a lot going on, but what's the point if the game doesn't want to run? You have to cut back until you find a middle ground between what you want and what you need to keep things functioning. Again, this is where a new console generation shines, as it raises that cutback bar. It's what the PlayStation 5 and Xbox Series X should have delivered, but sadly they haven't... And part of that problem is the Xbox.
What is the Series S?
The Xbox Series S is actually a good idea. It's a "budget" Xbox Series console that released at $300 instead of $500, and it can even be bought on a payment plan directly from Xbox.com. The idea is to easily replace your Xbox One with the new console without having to pay top dollar. If you buy the device, or the Series X, you simply plug it in, connect your account, and watch as your Xbox One setup transfers over to the new hardware. It uses the same UI, has access to the same features (plus more), and really feels like you've just upgraded your Xbox One's hardware. A pretty seamless transition. It's a great console for the consumer, and it's more compact than the Series X. The problem though? It's not a Series X.
Putting aside the fact that the S doesn't have a disc drive, the main issue with it is its lower specs.
Taken directly from Xbox, the Series S's Specs are as follows:
CPU. 8X Cores @ 3.6 GHz (3.4 GHz w/SMT) Custom Zen 2 CPU
GPU. 4 TFLOPS, 20 CUs @ 1.565 GHz Custom RDNA 2 GPU
Memory. 10GB GDDR6, 128 bit-wide bus
Memory Bandwidth. 8GB @ 224 GB/s, 2GB @ 56 GB/s
Gaming Resolution. 1440p
Meanwhile the Xbox Series X:
CPU. 8X Cores @ 3.8 GHz (3.66 GHz w/SMT) Custom Zen 2 CPU
GPU. 12 TFLOPS, 52 CUs @ 1.825 GHz Custom RDNA 2 GPU
Memory. 16GB GDDR6 w/320 bit-wide bus
Memory Bandwidth. 10GB @ 560 GB/s, 6GB @ 336 GB/s
Gaming Resolution. True 4K
High Dynamic Range. Up to 8K HDR
The CPU isn't too different. 3.6 GHz vs 3.8 GHz; however, the GPU goes from 4 TFLOPS to 12. That's a significant upgrade. (The 10GB vs 16GB of RAM is quite the difference also.)
Of course there's more to both of these consoles than just the numbers, but the bottom line is the fact that the Series S IS the bottom line for this generation.
The PlayStation 5 has similar specs to those of the Series X, and most modern PCs (and older high-end PCs) can meet these same requirements. This is actually quite the upgrade if you compare the X to what the original Xbox One was, or to what the original PS4 offered. The One's CPU ran at just 1.75 GHz and its GPU put out 1.31 TFLOPS, so again, both the Series S and X are an upgrade, but the X itself is a pretty big upgrade over the S.
Microsoft's Policy
Honestly, there's nothing fundamentally wrong with the Series S itself. It's a nice budget console that can play the same games as the Xbox Series X, and can use all the main features like Xbox Game Pass's cloud service. It is an Xbox Series console, and that's perfectly fine. The actual problem is with Microsoft's policy.
The thing is, if a game is to release on Xbox Series X, it HAS to work on Xbox Series S as well. Games cannot be exclusive to the X. What this means is that instead of developers having the X's specs as their cutoff, they actually have to build around the Series S. The core game cannot exceed the requirements the S sets. Yes, the X version can run at a higher resolution and have graphics settings turned up a notch, but the core game cannot require anything more than what the S can handle. If anything in a game pushes it past the S's specs, it has to be reworked or removed. Simple as that.
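To make that constraint a bit more concrete, here's a minimal, purely hypothetical sketch of how a scalable setup like that tends to be structured (none of the names or numbers below come from a real engine or from Xbox's actual tooling): presentation settings get a per-console preset, while everything the core game actually depends on shares one budget sized around the Series S.

#include <cstdint>

// Hypothetical example: presentation settings may differ per console.
enum class ConsoleTier { SeriesS, SeriesX };

struct PresentationPreset {
    std::uint32_t renderWidth;
    std::uint32_t renderHeight;
    std::uint32_t textureQuality;   // 0 = low, 1 = medium, 2 = high
    bool          rayTracedShadows;
};

// Anything gameplay actually depends on gets one shared budget,
// sized so the Series S can keep up.
struct CoreSimulationBudget {
    std::uint32_t maxActiveEnemies;
    std::uint32_t maxPhysicsObjects;
    std::uint32_t streamingBudgetMB;
};

PresentationPreset PresetFor(ConsoleTier tier) {
    switch (tier) {
        case ConsoleTier::SeriesX:
            return {3840, 2160, 2, true};   // 4K, high textures, RT shadows on
        case ConsoleTier::SeriesS:
        default:
            return {2560, 1440, 1, false};  // 1440p, medium textures, RT off
    }
}

// One set of numbers for every console, chosen so the weakest one can run it.
constexpr CoreSimulationBudget kCoreBudget = {64, 512, 6144};

The key point is that the core budget never changes between consoles. If a design idea would push it past what the S can handle, the idea gets scaled back or cut, no matter how much headroom the Series X has left over.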
This policy ultimately holds the generation back in a big way. We have a Series X and a PS5 that are capable of doing so much more, but they really can't... unless you skip the Xbox completely.
The PS5 Issue
One way developers have gotten around Microsoft's policy is to simply not release their games on Xbox. Sticking to PlayStation 5 and PC lets them use the higher-end specs as their minimum, while also pushing stronger PCs past that point. PS5 exclusives can be built around roughly the Series X's level of hardware and take full advantage of it. It's what you expect out of a new console generation, and what developers want as well. However, that's also not the best move.
As mentioned previously, the PS5 was also in very short supply, and not as many people were willing to upgrade from the PS4. Releasing games exclusively on PS5 means releasing to a much smaller user base and, in return, making less profit from those games. It's happened time and time again since the launch of this generation, and it's something that won't change unless more people buy a PS5. But why should they? And why should those who own a Series X switch? They shouldn't.
Releasing more PS4 Games
Since the Xbox One and PS4 have millions of users, it makes more sense to keep releasing games for those consoles. While Microsoft has already ended Xbox One support, Sony still embraces the PlayStation 4. Cross-generation games still need to meet the S's requirements anyway, so many developers settle on meeting the PS4's requirements instead. They can still make great games doing this, and release to MANY more users than they could by releasing only on PS5 and Xbox Series, or by limiting their games to PlayStation and PC.
And that's the situation we are currently in.
Most games that come out are actually still PS4/Xbox One games at their core. The Xbox Series and PlayStation 5 versions simply take advantage of new graphical effects and features, while the Series X and PS5 take the same effects a step further than the Series S. These games do not make full use of what the current hardware can handle, nor do they really need to. They aren't games like Ratchet & Clank: Rift Apart or Starfield that require an SSD to run, but the consoles that have SSDs do take advantage of the faster load times. They are still fun without the faster loading, though, and there's nothing in them that needs the stronger hardware to function. So in reality, they aren't truly "next gen" (current gen) games, and that's what is holding back this generation as a whole. A generation that hardly even exists.
So how do we fix it?
That's a good question! Maybe we can't? It's a really sticky situation overall. We are four years into this console generation, and many people own the Xbox Series S. Simply cutting it off, or making it so games don't have to run on it, is not a good solution. You'll have Xbox fans who bought the S angry that they're no longer receiving games (or that specific games won't work on it), and that's pretty much a lawsuit waiting to happen.
Cutting off PS4 releases also doesn't make sense, as not every game actually needs the Xbox Series' or PS5's specs to run. If it can work on last-gen hardware and hundreds of millions of people are playing on that hardware, why wouldn't you release a game on it?
There's really no good way to fix this, and it honestly might be too late.
The Ninth Generation is a mess overall. It had the capability to be so much more, but a lot of factors stacked up against it. Of course this won't stop people from enjoying their Xbox Series or PS5 consoles, but it doesn't change the reality that this generation isn't all it could've been.