Consoles may have their technically impressive titles, but on PC, games aren’t held back by ageing hardware. While that means we’re treated to some truly incredible-looking experiences, it also leads to developers pushing the very best hardware to its absolute limits. We’re talking GPU-starting-to-smoke-inside-your-tower kind of limits.
The thing is, these unhindered efforts have become cornerstones of PC gaming, creating some of the most technically impressive graphics in the history of the industry. They became benchmarks (with many still holding that reverential place) not just for players, but for developers the world over.
Crysis (2007)
How could we put together a feature on games as graphical benchmarks and not mention the elder king of GPU destruction? When Crytek dropped Crysis upon the PC gaming community in 2007, it drew a line in the virtual sand between those who could run it and those who wept in lamentation over their impotent hardware setups.
Today, we’re 11 years on from its initial release, and the original Crysis is still an absolute beast, even if you’re not running it on ultra.
That’s because the game was a trailblazer, chasing a seemingly impossible 60fps thanks to its use of the then relatively new DirectX 10 and a series of rendering techniques that worked some incredible graphical magic.
Myst (1993)
The jaw-dropping spectrum of graphics available today has made us almost immune to true virtual fidelity, but back in the ’90s our minds were still intact and in dire need of blowing.
Enter Myst, a sedate journey through a 3D world filled with unusual and obtuse puzzles. Looking back on it today, it looks so basic it’s almost laughable, but in 1993 it was a realistic world of wonder.
The game was so big it had to be shipped on a CD-ROM rather than a floppy disk (a decision that helped usher in the popularity of the format in PC gaming), and it was one of the first games to use QuickTime (but don’t hold that against it).
It was built on high-end Macintosh Quadras, with its relatively small team using a number of texture and rendering techniques to overcome the technical limits of the age.
Grand Theft Auto V (2013)
Few games have left quite the mark on the industry – and wider popular culture – as Grand Theft Auto V. Even if you don’t take into account the fact it’s made $6 billion in revenue since its launch in 2013, Rockstar’s open-world extravaganza has become an absolute leviathan running on PC.
Sure, it took 18 months for it to hit PC, but when it did, it was clear the Scottish developer had used that extra time to crush your puny specs.
Despite coming from a series that was never known for its GPU demands, GTA V can run at 200fps. It requires a reasonable 4GB of VRAM if you want it running at top settings, but you’ll need a beefy GPU if you want to utilise all the graphical shenanigans this version supports as standard.
The Elder Scrolls III: Morrowind (2002)
A feature on graphical achievement that eschews The Elder Scrolls V: Skyrim in favour of its long-in-the-tooth forebear, The Elder Scrolls III: Morrowind? Have we lost the plot? That might seem like the case, but for all its advancements in 2011, the fifth Elder Scrolls paled in comparison to the leaps and bounds made by Bethesda with the third.
While its predecessor, Daggerfall, was technically the first to use a form of 3D polygons via the XnGine engine, it was the use of a Direct3D-powered setup that supported the use of skeletal animations (for more lifelike character movement) and 32-bit textures. The project took six years (with a break in the middle) to finish, but the end result was a technical turning point for the series and action-RPGs as a whole.
Quake (1996)
It may not look like much now, compared to the dynamic lighting and photorealistic textures of modern shooters, but over 20 years ago Quake was a technical revelation that redefined the use of 3D space and polygonal models.
Designer John Romero wanted to create something that was technically and mechanically superior to 1993’s Doom, and that’s just what id Software delivered with the power of its in-house Quake engine.
The new engine dropped the use of 2.5D maps and flat textures in favour of a far more immersive 3D level design, while prerendered sprites were put to the sword as polygonal models brought Quake’s enemy and weapon design kicking and screaming into the world of true 3D. Multiplayer shooters haven’t been the same since, and with good reason.
The Witcher 3: Wild Hunt (2015)
Where The Witcher 2 was once used as a PC gaming benchmark, it seems fitting that its grand and engrossing sequel takes on that same mantle in the modern era.
Even now, a full three years on from its release, The Witcher 3: Wild Hunt is the game you load up when you want to test your GPU, CPU and new curved, 4K-ready monitor.
Even on the lowest settings, guiding Geralt of Rivia and his trusty horse Roach as he attempts to find his adopted daughter is a gorgeous experience.
StarCraft II: Wings of Liberty (2010)
StarCraft II: Wings of Liberty may have become a staple of the competitive real-time strategy scene in esports, but once upon a time it was a new title that was quite literally melting GPUs in towers and gaming laptops the world over.
The issue? The game shipped without a frame-rate limiter, which caused some processors and GPUs to overheat. A workaround was quickly found, but it’s remained a beefy means of pushing your machine to the absolute limit – especially if you love your real-time strategy with a heavy dose of science fiction. Just make sure you keep those frames under control.
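The workaround boils down to capping the render loop so that idle scenes (StarCraft II’s menus were the notorious culprit) can’t spin at thousands of frames per second. Here’s a minimal, illustrative sketch of the idea in Python – not Blizzard’s actual code, and the function name is our own invention:

```python
import time

def run_frames(fps_cap, n_frames, render=lambda: None):
    """Run a toy render loop, sleeping so we never exceed fps_cap."""
    target = 1.0 / fps_cap          # seconds per frame at the cap
    start = time.perf_counter()
    for _ in range(n_frames):
        frame_start = time.perf_counter()
        render()                    # per-frame game/render work goes here
        spare = target - (time.perf_counter() - frame_start)
        if spare > 0:
            time.sleep(spare)       # idle instead of spinning the GPU flat out
    return time.perf_counter() - start
```

The key design point is that the loop sleeps away any time left over in each frame budget, which is exactly why a limiter keeps temperatures down: the hardware rests instead of redrawing a static menu as fast as it possibly can.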
Arma 3 (2013)
You might know Arma 3 as part of the series that gave us DayZ, the fan-made zombie survival mod (which began life in Arma 2) that served as the genesis for the now super-popular battle royale genre, but once upon a time it was the hot new thing in the world of military simulation.
Back in 2013, it proved to be one of the most critically acclaimed titles on PC, thanks mainly to its realistic physics and gorgeous recreation of Greek island terrain (with the South Pacific following in its Apex expansion). It was a real test for hardware at the time, and can still be a beast to run even now, five years on from its release.
Hitman (2016)
Hitman, the 2016 episodic renaissance of Agent 47, wasn’t just a brilliant return to form for Danish developer Io Interactive, but a beautiful way to push your PC gaming machine to its limits.
Granted, if you run it at normal graphical settings, it won’t crush your GPU to dust, but cranking those settings up to ultra turns a gorgeous-looking game into something truly breathtaking.
With a decent processor (preferably with myriad cores) and DX12, you can really see every bullet casing hit the floor and the light dance off 47’s bald head as you surreptitiously bump off your target at a silky-smooth 60fps.
F.E.A.R. (2005)
Back in 2005, Monolith – the studio that would go on to re-imagine Middle-earth with Shadow of Mordor and Shadow of War – was making waves with a new game that combined John Woo-esque first-person shooter action with Ring-style horror madness.
Over a decade ago, F.E.A.R. looked stunning for its time, and came with some seriously high spec requirements as standard.
From its complex lighting model (which used per-pixel lighting, volumetrics and lightmapping) to its intricate textures and environmental builds, it looked incredible. It may not inspire that same sense of wonder now, but it’s still an impressive feat 13 years on.