When it comes to comparing graphics between the same title on consoles and PC, there is no contest: PCs look better. Period. Even console titles running in emulators on PC can look better than they do on the original hardware. And while Sony and Microsoft’s current flagships, the PS4 Pro and Xbox One X respectively, claim to handle 4K and HDR gaming, much of that talk is marketing magic.
Instead, these consoles often render below native 4K, at resolutions such as 1440p (or an upscaled 2160p), depending on how graphically intense the game gets. The reason is that consoles need to hit a target frame rate (typically 30 or 60 FPS) for consistently smooth gameplay, so developers scale down the resolution, textures, and other GPU-hungry aspects to stay within budget.
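That frame-rate target is really a time budget. A quick back-of-the-envelope sketch (the numbers here are just arithmetic, not measurements from any particular console) shows how much rendering time each frame gets at a given FPS target, and why dropping from 60 to 30 FPS frees up room for heavier effects:

```python
# Frame-time budget: at a fixed FPS target, every frame must be rendered
# within 1000 / fps milliseconds. Halving the target frame rate doubles
# the time the GPU has for each frame, which is why developers trade
# frame rate (or resolution) for visual effects.
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available per frame at a given FPS target."""
    return 1000.0 / target_fps

print(f"60 FPS budget: {frame_budget_ms(60):.1f} ms per frame")
print(f"30 FPS budget: {frame_budget_ms(30):.1f} ms per frame")
```

At 60 FPS the whole frame has to finish in about 16.7 ms; at 30 FPS, about 33.3 ms.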
At this point it might seem that there’s nothing left to do, since PC games already outperform consoles graphically by default. But it’s not that simple, especially if you have an aging PC with an older graphics card. So let’s first look at the hardware side. When it comes to graphics horsepower, your GPU determines how fast a game runs and how good it looks. Small Business Chron explains that you’ll need a dedicated GPU if you’re going to do any serious gaming.
Integrated graphics are very limited, and their lack of dedicated memory means your gameplay will suffer visually. With this in mind, swapping in a newer graphics card (even one a generation old) with at least 4GB to 6GB of GDDR memory lets the GPU hold enough high-resolution textures and will immediately deliver an improved visual experience. If you have the money to splurge, however, go for an Nvidia RTX GPU. Once you’ve seen RTX-powered games like Call of Duty: Modern Warfare make use of ray tracing effects, you’ll never want to go back to a console.
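To see why a few gigabytes of VRAM matter for textures, here is a rough sketch of how texture memory scales with resolution. It assumes uncompressed RGBA8 (4 bytes per pixel) and roughly one-third extra for the mipmap chain; real games use compressed formats, so actual numbers are smaller, but the scaling is the point:

```python
# Rough VRAM cost of a single texture: width * height * 4 bytes (RGBA8),
# plus ~1/3 overhead for the mipmap chain. Real games use block-compressed
# formats (e.g. BC7), so these are upper-bound, back-of-the-envelope figures.
def texture_mib(width: int, height: int, mipmaps: bool = True) -> float:
    """Approximate size of one uncompressed RGBA8 texture, in MiB."""
    base = width * height * 4              # 4 bytes per pixel
    total = base * 4 / 3 if mipmaps else base  # mip chain adds about 1/3
    return total / (1024 ** 2)             # bytes -> MiB

print(f"2048x2048 texture: ~{texture_mib(2048, 2048):.0f} MiB")
print(f"4096x4096 texture: ~{texture_mib(4096, 4096):.0f} MiB")
```

A single uncompressed 4K texture lands around 85 MiB with mipmaps, and a scene uses many textures at once, which is why high-resolution texture packs chew through a 4GB card quickly.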
Now let’s talk about optimizing your game’s software settings, because it’s not enough to simply set everything to “ultra”. You need to consider whether your PC’s hardware can handle all the intense graphics, from high-resolution textures to physics-based effects like the dynamic motion of hair and fur featured in The Witcher III.
This is why you should go through each game’s individual graphics settings and see how aspects such as resolution, anti-aliasing, texture quality, and the number of rendered objects on screen affect performance. In fact, tech giant HP suggests tailoring resolution and anti-aliasing settings as the first manual steps you can take to optimize how your computer handles graphics.
Anti-aliasing settings are extremely GPU-intensive, and it’s not always necessary to max them out, especially at resolutions of 1920×1080 and above, where jagged edges are far less noticeable to begin with. Crank it up to 2048×1080 (2K) or even 3840×2160 (4K) and the detail on display will be light years beyond what the PS4 and Xbox One can offer.
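A quick bit of arithmetic shows why resolution is the single biggest lever here. GPU shading work scales roughly linearly with the number of pixels (an assumption, but a reasonable first approximation), so comparing pixel counts gives a feel for the relative load:

```python
# Pixel counts at common gaming resolutions. Under the rough assumption
# that per-frame shading work scales linearly with pixel count, 4K is
# about four times the load of 1080p.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # 1080p as the reference point
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the 1080p load)")
```

That 4x jump from 1080p to 4K is also why stacking heavy anti-aliasing on top of a 4K render is usually wasted effort: you are already pushing four times the pixels, and the edges are correspondingly finer.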
Speaking of The Witcher III: despite being over four years old, it’s still one of the best games of the decade. Forbes notes that it is considered one of the most graphically impressive and most important games of our generation (play it in 4K and see what we mean). On PC, that is; consoles actually held The Witcher III back from achieving its full potential.
So why would anyone bother with a console, you ask? Consoles offer a convenient, plug-and-play alternative for both newcomers and hardcore gamers, with graphics that are perfectly respectable for the price.
However, for those who want the latest and greatest graphics horsepower and are spoiled by high resolutions and incredible graphical effects, nothing beats a PC.