
Why isn't PC gaming pushing technological boundaries?

By Mike Doolittle

Call of Juarez: Bound in Blood Screenshot

As a dyed-in-the-wool PC gamer and hardware enthusiast, I regularly visit [H]ardOCP, a website dedicated to all things PC gaming hardware. They recently did a performance evaluation of Call of Juarez: Bound in Blood using a number of modern graphics cards across various price ranges, and found, as they have with many games in the last year or two, that it performed very well even on lower-end graphics cards. They conclude:

As the prevalence of cross-platform game development increases, it seems that the PC's role as a gaming technology showcase is diminishing. Gone are the days when games were developed for the PC first, the consoles later. It is simply more profitable to focus on consoles, and then throw the PC gamers a bone (or not). There are various and numerous reasons for this, but the reason most cited by game developers and publishers is the persistence and relative ease of game piracy on the PC. It just makes sense that these businesses shift their focus in the face of unswerving opposition.

But whatever the cause, there is no doubt that PC games are getting lighter, and their hardware requirements are becoming less and less stringent. Once upon a time, PC gaming was about tweaking and modifying games to run and look better on the staggering variety of hardware in the wild. But now, almost every game we have seen in the past year has run beautifully out of the box on even the least expensive video cards.

Call of Juarez: Bound in Blood Screenshot

This is by no stretch of the imagination the first time I've heard such things. When I was active on the now-defunct Tweakguides.com forums, I debated the apparent decline in cutting-edge PC technology with PC gamers many times over. It is inarguable that in many respects, it has never been easier on the wallet to be a PC gamer. Many games do indeed perform exceptionally well across a large variety of cards; the high-end configurations seem more suited to those who want to run very high levels of anti-aliasing and/or ultra-high resolutions. My own video card configuration, a pair of nVidia GTX 260s—a reasonably high-end setup—allows me to run even the most demanding games with extremely high image quality on my 22" monitor. While ATI and nVidia are preparing to release their next-generation DirectX 11 cards this fall, I truly see no need for an upgrade, particularly since it will likely be at least a couple of years before DirectX 11 is widely used. 

But I think the reasons for this lessened pressure to buy expensive upgrades are more complex than the proliferation of multiplatform development. And I think that, despite the historical performance-per-dollar ratio we see in the video card market, games are continuing to push technological boundaries. Let's consider some of the factors.

Crysis Warhead Screenshot

1. A competitive GPU market

There is no denying that the video card market has matured greatly since its dawn in the mid-90s; it's hard to believe that ten years ago, the Voodoo3 was considered a high-performance card. nVidia and ATI had yet to dominate the picture at that point, much less form the mature, competitive market we see today.

Over the last eight years or so, nVidia and ATI have continued to attempt to outdo one another, and each generation brings more performance across a greater variety of price ranges. We've seen monster dual-GPU cards that cost over $600, and lean machines that offer surprising performance for a low price. The last year has been particularly good for ATI, which, despite still retaining a relatively small share of the GPU marketplace compared to nVidia, has leveraged its efficient, powerful GPUs to force nVidia to drop prices and offer more cards targeted at more price points.

If Intel's forthcoming Larrabee makes an impact in the GPU marketplace, we may see even more performance-per-dollar.

Crysis Warhead Screenshot

2. Reusable, highly optimized game engines

Every so often, a game comes along that really pushes graphical boundaries. Doom 3, Far Cry and Half-Life 2 did it in 2004; The Elder Scrolls IV: Oblivion did it in 2006, and Crysis did it in 2007. But these types of games are few and far between. It makes more sense for developers to reuse existing engines with minor tweaks and optimizations rather than attempt to build a cutting-edge engine from the ground up for every game. Over time, these engines can offer very impressive performance as they become increasingly optimized.

Valve's Source engine, which was introduced in 2004 with Half-Life 2, is still widely used nearly five years later. The famous Unreal Engine 3 has likewise been in use for a few years now, and provides very impressive visuals while keeping hardware demands remarkably reasonable. Currently, the most advanced engine is Crytek's CryEngine2, which was introduced in 2007's Crysis. At the time, it was so advanced that even the highest-end dual-GPU configurations could not run it at maximum settings. A year later, the expansion Crysis Warhead introduced numerous performance optimizations which, along with the increasingly competitive GPU market, allowed it to be played at high or even maximum settings without an ultra-expensive PC. And while we will undoubtedly see advanced engines trickle in occasionally, given the immense risk in game development, the use of a well-established high-performance engine will remain preferable to ground-up engine development for many developers.

Alan Wake Screenshot

3. Multiplatform development, piracy or not

There is little doubt that more developers are focusing on multiplatform development. Genres that were once solely the domain of hardcore PC gamers have branched into consoles as well—first-person shooters were the first to go, and even MMORPGs and real-time strategy games, while still well-established on the PC, have begun to trickle into the console space as well.

The exact impact of piracy on sales is unknown, as is its real extent. But piracy aside, the ease of development across multiple platforms (the Xbox and its successor are DirectX-based platforms—hence the name, which was derived from "DirectX Box" during the original's development) and the wide audience granted by multiple platforms make a singular focus on any one platform seem somewhat short-sighted. And while Microsoft and Sony are free to throw incentives at developers to keep games platform-exclusive, the PC remains an open platform for developers.

Call of Duty: World at War Screenshot

We should also keep in mind that multiplatform development is by no means some sort of new trend. It took hold with the first Xbox, and nothing has been the same since. As far back as The Elder Scrolls 3: Morrowind, big-name PC developers have been developing PC and console versions of their games simultaneously, and this has not stopped the PC versions of such games from being cutting-edge. Flexible game engines allow for scaling across a wide variety of hardware, and as we saw with Oblivion, engines can be optimized to take advantage of both consoles and a wide array of PC configurations. 

I can't help but think that there really isn't much to complain about. I'm occasionally frustrated when a developer delays or nixes a PC version of a game (as was recently the case with Alan Wake), but in most cases the wait doesn't faze me. There are still many excellent PC exclusives and multiplatform games that take fine advantage of modern PC technology. The competitive GPU marketplace, along with the rise of digital distribution platforms like Steam and the gaming-friendly features of Windows 7, is tearing down some of the entry barriers to PC gaming, which may be vital for the long-term viability of the platform. And ultimately, whether a game was developed for this platform or that often has very little bearing on its quality—and for those of us who are willing to shell out for a more customizable, higher-fidelity experience, the performance is still there. A game like Call of Duty: World at War may not be as demanding as Crysis, but when you see it in native high resolution with anti-aliasing and every whiz-bang visual effect cranked up, and play it with a 1600 dpi laser mouse and customized key maps, the console versions just seem flaccid by comparison. The PC gaming landscape is changing, but it's still going strong and, with inexpensive hardware, digital distribution and a vast catalog of games and mods, there has perhaps never been a better time to jump on board.

Category Tags
Platform(s): PC  
Articles: Editorials  


Thanks

Thank you for taking the time to write this informative article.

More cons than pros

There are definitely some pros to the current situation, with the focus on multiplatform development. It's forced the hardware manufacturers to lower their prices in order to compete more effectively with consoles. Both ATI and nVidia have dramatically dropped the prices of their cards. You can now actually buy a really well performing video card for under $200. This was unthinkable a few years ago.

But when developers don't develop at least the occasional PC-exclusive title that pushes the boundaries, then obviously there's little incentive to upgrade. Why the hardware manufacturers don't give the game developers a kick in the pants (i.e. bribe) more often, to make an exclusive PC title (that isn't another bloody MMO, RTS, or casual title), I don't understand. It's what Microsoft and Sony do. They bribe the developers to make games 360 or PS3 exclusive. We need this on the PC side, as well.

Microsoft used to do this for Windows on the PC, but now they don't seem to give one hoot about PC gaming any more. Shutting down their PC game dev divisions (the Flight Sim and Ensemble studios) and the joke that is GFWL are ample evidence of that. Microsoft is PC gaming's worst enemy, without a doubt.

Frankly, I have no cause to complain about multiplatform development, except when it exacts a price on the PC version of a title. Poorly coded ports that are full of bugs, console interface leftovers, and mediocre performance. Ports that are missing features or content. Ports that don't receive the same DLC as their console brethren. Delayed PC versions, or no PC version at all. These issues are unfortunately commonplace these days, and they annoy PC gamers like me to no end.

Alan Wake is a perfect example of this. It started off as a title that was supposed to push the boundaries of PC performance. I remember when Remedy demoed it at an Intel conference, bragging about how all the physics effects in the game were only made possible thanks to the quad core processor supplied by Intel. It was meant as a showcase title for the PC. And now suddenly it's become a 360 exclusive? For the time being, at least? WTF?

That's Microsoft deliberately withholding the PC version, in order to pimp the 360 version at the PC's expense. They've already admitted as much in the past. The manager for the European branch of Microsoft's entertainment division stated that they intentionally withhold PC versions because they know they'd sell better in Europe, for example. They'd be slitting their own throats in terms of sales for the 360 version over there. You'd think they would be happy as long as the game is played on a Microsoft platform, regardless of which one it is. But no. And it's these kinds of stunts that understandably make PC gamers angry.

The answer is rather simple: the PC is a smaller niche than the console.

When PCs led the industry in power and software development, things were catered to PC gamers. But over the last 5 if not 10 years, since the days of the Xbox, MS and the game industry itself have focused more and more on consoles, and once consoles matched the broad spectrum of mid-range rigs there was just no turning back, since consoles simply sell more titles.

I picked the PC for 2 reasons: control, meaning mouse and keyboard variety, and control mapping variety (unless it's some console hand-me-down that doesn't have the IQ to use any keyboard or a 5+ button dual-wheel mouse). That's something consoles don't quite get, as they assume any console noob grabs one kind of controller and gnaws on it, drooling all the while...

If consoles allowed you to re-map the controls outside of a game, in the console's dashboard/system menu, that would turn me into a console gamer again. Hell, if they let you use most, or even specific, branded mice and keyboards, I would be an avid console gamer... but... they don't...



Copyright 1999–2010 GameCritics.com. All rights reserved.