With the exception of the ARM-based Nintendo Switch, the console market is entirely based on AMD CPU and GPU technology. Consoles represent a huge chip market, so isn't it strange that a major semiconductor company like Intel is a no-show?
Why Are There No Intel-Based Consoles?
While there's no current-generation Intel-based gaming console, it's worth pointing out that the first Xbox was, in fact, an Intel-based console. There is a long and interesting story behind how the first Xbox was essentially a slightly modified PC shoved into an X-shaped box. In fact, "Xbox" is a shortening of "DirectX Box," reflecting the fact that it was Microsoft's PC gaming platform squeezed into a roughly console-shaped mold. Since then, however, consoles have not featured Intel silicon.
Explaining why isn't simple, but a few key factors likely play a role. After the first Xbox, the Xbox 360 ditched the PC's x86 architecture in favor of an IBM PowerPC main processor and a GPU designed by ATI (now part of AMD). The PlayStation 3 featured the "Cell" processor, which also used PowerPC cores but added special co-processors developed in a collaboration between Sony, Toshiba, and IBM. The PS3's GPU was designed by NVIDIA, though rumor has it the original plan was for the Cell processor to handle graphics as well, which obviously didn't work out.
With the following generation, Sony adopted the x86 architecture for the first time and Microsoft's Xbox returned to it, with both consoles using an AMD "APU" (Accelerated Processing Unit), AMD's proprietary name for a CPU and GPU combined in a single package. This is a far more cost-efficient solution for a console than a separate CPU and GPU, each with its own memory.
With the latest PS5 and Xbox Series consoles, things have stayed the same, so it looks like APUs are here to stay. Intel has, of course, been making CPUs with integrated graphics for decades, but unlike AMD, which benefited from ATI's GPU technology, Intel's integrated GPUs simply didn't have the performance or features to be competitive in the console space. To me, this is the main reason an Intel chip doesn't sit at the heart of any modern console: Intel could easily compete on the CPU side of the equation, but not on the GPU side. Until now, that is.
Intel's Graphics Tech Is Finally Ready
While Intel has tried to enter the discrete graphics market before (notably with the interesting yet ill-fated Larrabee project), it's only now that it has a viable discrete GPU on the market. Intel's Arc A750 and A770 both perform in line with mainstream GPUs such as the RTX 3060, putting them roughly on par with the GPUs in current-generation consoles. This is a major milestone for Intel, and something fans of GPU technology should be excited about, since it means there's finally a genuine third competitor in the market. As the newcomer, Intel has had to be aggressive on pricing, so you can pick up an A750 for remarkably little given its performance.
So one thing we do know is that Intel's GPUs on the shelf are powerful enough to match current-generation consoles, and they even go a little beyond that thanks to better ray-tracing and AI-upscaling technology. A console with an Intel Arc GPU inside would stand shoulder-to-shoulder with a PS5 or Xbox Series X, at least on paper.
Intel's GPU Driver Issues Don't Matter for Consoles
While it's true that Intel's current GPUs are competitive in mainstream graphics performance, there's more to a GPU than its silicon. Drivers are essential to getting usable performance and stability out of the hardware, and in this area Intel has been less successful with its first generation of desktop GPUs.
Unlike NVIDIA and AMD, Intel doesn't have decades of experience developing drivers for PC games running on high-performance GPUs. This has led to a situation where new games running on DirectX 11 and 12 work just fine on Arc cards, but if you want to play games from the massive PC gaming backlog, you might run into compatibility or performance issues. Not every new game fares well either.
For example, Intel dropped the ball a little with the release of Starfield, whereas NVIDIA and AMD had driver updates ready well in advance of the game's launch. None of this matters when we're talking about a fixed hardware platform like a console. Since every game is written and tuned specifically for the console it releases on, there are, for all intents and purposes, no driver issues to speak of. So while Intel's driver team has years of work ahead of it on PC, a console with an Intel GPU should take nowhere near as long to reach an acceptable state.
The SoC We Want to See
As of this writing, there is no Intel equivalent of an AMD APU, but we can't imagine it will be long before Intel has something that packs both its performance CPU cores and Arc GPU technology into a single chip. While it may be quite a few years before Intel even has the opportunity to bid on supplying a console, the recent rise in popularity of handheld gaming PCs such as the Steam Deck and ROG Ally is another likely motivator to produce something similar.
AMD is also dominating this new handheld gaming PC space, so don't be surprised if Intel comes out with a product to rival the AMD Z1 Extreme or its successor. Either way, strong competition in every market is a win for consumers, so bring on the Intel console hardware!