Just when we all thought the wacky Watch Dogs saga was finally winding down, it got a second wind last week when Guru3D user TheWorse discovered, hidden in the code of Ubisoft's divisive open-world game, a series of graphical effects used in the game's infamous E3 2012 demos. The same effects, in fact, that dropped jaws at the time and whose absence from the finished product (which suffered several delays before its eventual release last month) has been a bone of contention for gamers, as the graphical powerhouse many felt they were promised increasingly revealed itself to be far more ordinary-looking than those early reveals suggested. TheWorse knocked together a mod that re-enables many of these effects, which has taken the internet by storm over the last few days, bringing the game a great deal closer to the visual splendour of those initial demos.

But even that wasn't enough to plug the crack in the Drama Dam, with Ubisoft releasing a patch this week that, while performing some much-needed fixes (such as dealing with a particularly annoying save-corrupting bug), also happens to share the same file format as TheWorse's mod – meaning it overwrites the mod on installation. Stunningly, this has not been received warmly by gamers, prompting the following response from Ubisoft:

“The dev team is completely dedicated to getting the most out of each platform, so the notion that we would actively downgrade quality is contrary to everything we’ve set out to achieve. We test and optimize our games for each platform on which they’re released, striving for the best possible quality. The PC version does indeed contain some old, unused render settings that were deactivated for a variety of reasons, including possible impacts on visual fidelity, stability, performance and overall gameplay quality. Modders are usually creative and passionate players, and while we appreciate their enthusiasm, the mod in question (which uses these old settings) subjectively enhances the game’s visual fidelity in certain situations but can also have various negative impacts. Those could range from performance issues, to difficulty in reading the environment in order to appreciate the gameplay, to potentially making the game less enjoyable or even unstable.

Thanks for playing Watch Dogs and stay safe on the mean streets of Chicago.

-The Watch Dogs Team”

Let’s unpack this statement, shall we?

"The notion that we would actively downgrade quality is contrary to everything we've set out to achieve" – If we're talking overall quality, i.e. gameplay quality as well as graphical quality, then that's fair enough in principle, and something we'll look at more fully in a minute. In terms of graphical quality, however, there is simply no metric by which the Watch Dogs shown at E3 2012 is not vastly different from, and significantly more advanced-looking than, the game that was released a few weeks ago. The released game simply doesn't look as good or run as smoothly: just ask PC players, who have been putting up with incessant frame-rate stuttering while in vehicles. Ubisoft made a big deal of positioning this game as a genuinely 'next-gen' experience, showing off the power of the new consoles and providing graphics dramatically superior to what we're used to. So wouldn't it stand to reason that, if Ubisoft were truly committed to a consistent level of quality throughout the game's public lifespan, it would look no different now to what we saw at its announcement, or at least comparable? If something in a final product is clearly less advanced than it was in the promotion, that is a downgrade, because it's simply not as good. Again, this is just as far as graphics are concerned (plenty of people would contest that the gameplay didn't match the hype either, but that's arguably more subjective). The Ubi doth protest too much, methinks.

"The PC version does indeed contain some old, unused render settings that were deactivated for a variety of reasons, including possible impacts on visual fidelity, stability, performance and overall gameplay quality" – This is where things get more interesting, and where a grain of truth may well lie. Many gamers did report that TheWorse's mod didn't affect the game's performance (I actually found that it improved the stuttering slightly, even if the problem was still there). However, many users have also reported glitches, such as car headlights conflicting with other light sources and lighting failing to adjust when the player enters buildings, leading to incorrectly dark interiors. Some have also claimed that certain areas run much worse with the mod than others, presumably depending on how much is going on in those environments and how much graphical horsepower they demand. So there is anecdotal evidence backing up the team's claim that these settings could not be optimized to a level that ensured the game ran smoothly, and to be fair the mod is only a few days old at this point: it's very possible that more serious problems exist that simply haven't been encountered yet. So while I'm not saying we give Ubisoft a pass on this one, their reasons shouldn't be dismissed out of hand either. Take the E3 depth-of-field settings, which were so hilariously extreme that they actually did hinder visibility.

But then again, TheWorse quickly tweaked that depth of field to a better-looking and more practical level, so even if we take Ubisoft's word that these settings were left unoptimized, that isn't really the point of all this. The point is: why were these settings simply disabled instead of worked on until they were optimized? There have been a million conspiracy theories floating around on this one: they wanted parity with the console versions; Sony/Microsoft/Beelzebub Himself paid Ubi off to gimp every version except the one on their own platform; they knew they couldn't deliver graphics on that level on the consoles, so they gimped the PC to save themselves embarrassment… And as is usually the case with conspiracy theories, they're mostly complete bollocks. Ubisoft's statement, however, does reflect an omnipresent reality of game development: the technical realities of programming and the pressures of release deadlines often mean that many of the shiny features you want to include in your game, you simply don't have the time and/or money to include. That's why you generally keep mum about your fancy features in the early stages of development, unless your name's Peter Molyneux.

Watch Dogs had already been delayed eight months, and it wouldn't be surprising if the Xbox One/PS4 versions were the lead platforms. You have one of Ubisoft's biggest releases of the year in the hectic final stages of production, with the devs crunching to get the thing ready for release – it doesn't seem all that outlandish that little time would be set aside for optimizing some exclusive code for a non-lead platform. Cuts like this are made all the time and we never hear about them. The difference here is that much of what was excised was exactly what the game was sold to us on – and Ubisoft did sell Watch Dogs from the start as the big, innovative step forward that would herald the new generation, even going so far as to deny any graphical downgrade when footage of the actual game hit the net in the weeks leading up to release. Watch Dogs's graphical splendour was made a pillar of the game's promotion from the first announcement to the day of release, almost to the point of being the central one.

Ubisoft's response has a bedrock of reason to it, but treating people as irrational for expecting the experience they were sold, then interfering when fans try to restore those features, is just a bit gauche. Where was this patch for the first three weeks, when many of us couldn't play the game because of cloud-save corruptions and Uplay server crashes, and were suffering a jittery mess even when we could? Those were performance issues that had nothing to do with E3 bells and whistles, and apparently they weren't stamped out in time either. It's clear that the E3 settings went unoptimized because Ubisoft simply didn't see the PC market as lucrative enough to justify the resources, and the same goes for other issues that were plainly obvious. What Ubisoft is giving us is an explanation that's rooted in truth, but still obscured by spin.

The thing is, however, this isn't just an Ubisoft problem but an industry problem, one that has been around almost as long as gaming itself. It started with putting arcade screenshots on the back of 8-bit game boxes; then came bullshots; and then the bullshots became CG movies that presented an unrealistically polished portrayal of games. Now we have publishers passing off footage of highly advanced PC builds – even for console exclusives – that has no chance in hell of looking that good on release. It's not just Ubisoft that's been doing this: remember last year, when several publishers were called out for running PC dev builds of console games at trade shows? Does anyone really think that, if LucasArts had survived, Star Wars: 1313 would've ended up looking remotely as good as that gameplay movie that had everyone foaming at the mouth?

As the new generation continues and moves out of the awkward transitional phase every console generation goes through, we'll see devs learn to coax more out of the new hardware, and games will start to look more like their promotional buzz reels again. But that doesn't change the fact that the disparity between how games are marketed and how they actually turn out remains. At the root of the Watch Dogs saga is chicanery that has always been there, and if that is ever to change, one can only hope that cases like this will help push it along.
