Gaming's greatest u-turns

In case you missed it, EA has pulled microtransactions from Star Wars Battlefront 2 on the eve of its release, reportedly following an intervention from Disney CEO Bob Iger after a week of mounting pressure over the game's controversial loot box setup.

We'd be the first to admit we're guilty of hyperbole now and then, but this feels like one of the biggest and most humiliating gaming u-turns in living memory. It certainly puts all that Forza 7 VIP stuff into perspective. But it did get us thinking: is this gaming's greatest climbdown? What else would make the list?

The original vision for Xbox One

How could we start anywhere else? We were there at Microsoft's Redmond campus when the Xbox One was originally unveiled to the world by perma-grinning corporate robot Don Mattrick in a marquee on the lawn outside the Xbox building. Right from the start, it was clear things were going wrong. The presentation focused way too much on Xbox One's ability to interact with TV programming - part of Microsoft's thinly veiled attempt to seize control of everyone's living room - and evaded some of the issues gamers had been discussing in the months beforehand. Did Xbox One really require an internet connection to function? Were all the games built on troublesome DRM, making it impossible to trade in or lend games? Did we really have to have Kinect 2.0 to use this thing?

Unfortunately, it quickly turned out that the answer to all these questions was "yes". People did not like this answer. Microsoft put on a brave face about the backlash until E3, no doubt hoping people would start to see the long-term benefits of this new vision, and then Sony utterly humiliated them with their famous video guide to sharing games on PS4. It's worth watching again: those 21 seconds are one of the most devastating takedowns in the history of video game marketing.

Within weeks, Microsoft completely reversed its position. Xbox One would not require an internet connection at all times, and sharing games would work exactly as it always had. It wasn't long after this that smiling Don Mattrick left the company to become CEO of Zynga. Phil Spencer, who took charge of the Xbox division in 2014, instigated radical changes that put its focus back on gaming, quickly removing Kinect from Xbox One bundles to cut the price and eventually killing it off completely. The damage to Microsoft's reputation was so severe, though, that it currently languishes in a distant second place behind PlayStation 4, and may even end up third at the end of this generation if Nintendo Switch maintains its current momentum.

It's a shame, because the OG Xbox and Xbox 360 were fantastic consoles with rich game libraries that brought real innovation to console gaming. Xbox One is steadily improving now, and we hope it can give Sony a better run for its money in the years ahead.

The Mass Effect 3 ending debacle

When Mass Effect 3 launched in 2012, it was meant to be the crown jewel of a series that had broken new ground in role-playing game design. Over the course of Commander Shepard's space trilogy, BioWare had given you unprecedented control over what happened to you, your crew and the species you met along the way. Players were giddy with excitement at the prospect of seeing their ending, after investing so much of themselves into the series.

Initially it looked good. Critics seemed very impressed, noting that the whole game felt like an ending, taking you on a tour of critical locations throughout the galaxy and delivering closure around key characters and plot points. The genophage, the quarians, the Reapers - you would settle all the family's business. If anything, the game was so preoccupied with settling business from its predecessors that it didn't quite stand out as strongly on its own as either of them, but that felt like an inevitable consequence of BioWare's design. Ultimately it was probably worth it. Right?

But when gamers got to the very end of the game, the final final final showdown, and witnessed Shepard's outcome, the mood shifted. Sure, we'd got to do a bunch of narrative housekeeping along the way, but where was the huge range of outcomes that should have been possible based on the decisions we made throughout the trilogy? Where was the sense of closure? After the unbelievably epic conclusion of Mass Effect 2, where was the final boss the series deserved? And don't get us started on the plot holes. Instead, we got a handful of possible conclusions, notoriously distinguished from one another largely by a change of color scheme. Oof.

What should have been a moment of triumph quickly evolved into a PR disaster. Gamers would not let this one go - they had been led down the garden path by Mass Effect's bluster about a personalized RPG story for literally years, buying into it at every step, and now it turned out it was all hype. The thing that really frustrated people seemed to be that BioWare was so much better than this. The studio behind Knights of the Old Republic, Jade Empire and Baldur's frickin' Gate: these guys should have been able to pull this off.

In the end, BioWare agreed. While preserving the original Mass Effect 3 ending, the studio broke off from other activities and went into production on a free piece of downloadable content that installed a cinematic epilogue. The Extended Cut arrived in late June, three months after the launch of the core game, and was generally well received.

Paid mods on Steam

Given the Battlefront 2 topic at hand, the brief history of Valve's paid mod workshop feels like a cautionary tale that EA and DICE might have benefited from learning about a while ago. Essentially, back in 2015 Valve announced that it was adding a payment feature to the Skyrim section of Steam Workshop, allowing mod creators to monetize their content.

On the surface of it, that sounds reasonable enough, but on closer scrutiny there was a litany of problems. The first was the revenue split, which would see modders receive 25% of the money their mods made, while the original game's publisher and Valve split the other 75%. Even in the world of digital downloads, where the host platform often takes a sizable cut, that felt like a pretty aggressive arrangement.

There were other issues that were just as significant. Mods had always been on a spectrum from scrappy, half-working stuff to more polished efforts, and Skyrim - a game with literally thousands of mods available, many of which could be applied simultaneously to transform the game in entertaining ways - had a lot of everything.

Unfortunately, this meant that there were lots of problematic edge cases. Some mods used copyrighted material - easy to turn a blind eye to when it's free and buried in a mountain of add-ons, but more exciting for IP lawyers when it's got a price tag attached to it. Other mods would break when patches were released, meaning you could be left with something you bought that no longer worked. And with the addition of a payment system, unscrupulous sorts could rip off other mod creators, passing their lesser-known work off as their own to make a quick buck. Valve famously doesn't employ legions of staff to monitor stuff, preferring to build systems where the community self-polices nefarious activity, which made this situation even worse.

Valve tried to figure out a way forward with the feature, and Gabe Newell even did an AMA on Reddit about the whole issue, but in the end Valve removed the feature. Its statement included a sensible conclusion that will probably feel sobering to EA and DICE this morning:

"We underestimated the differences between our previously successful revenue sharing models, and the addition of paid mods to Skyrim's workshop. We understand our own game's communities pretty well, but stepping into an established, years old modding community in Skyrim was probably not the right place to start iterating. We think this made us miss the mark pretty badly, even though we believe there's a useful feature somewhere here."

In other words, you can't always take something from one context and expect it to work well in another.

Boycott Modern Warfare 2

This one may feel like a stretch at first, but it also feels very relevant today. Bear with us.

Back in 2009, Infinity Ward did something very unpopular. On the eve of the release of Call of Duty: Modern Warfare 2, one of the most eagerly anticipated first-person shooters in recent history, it revealed that the PC version would no longer use dedicated servers. Instead, the game would use a peer-to-peer technology similar to the console versions. This was a huge departure for the PC gaming community, which had always used dedicated servers - in essence, a PC in a server farm somewhere running an instance of the game that 'client' gamers connected to, ensuring consistency and stability of experience. This was considered the optimal way to facilitate multiplayer in PC games, not least because it gave the community some semblance of ownership over the experience.

We recall this episode very well for a couple of reasons. The first is that, if we remember correctly, community manager Robert Bowling announced the change live on a community podcast, leading to stunned silence from his hosts at what should have been a moment of great excitement for them. We seem to remember voices cracking and audible disbelief hanging over the rest of the conversation, although we may be hamming it up.

The second reason is that Infinity Ward stuck to its guns. The PC game shipped this way. And in response, a large number of PC gamers decided to show their distaste for the decision by organizing a boycott. They could do so visibly, too, by setting up a Steam Group called "Boycott Modern Warfare 2 (WE WANT DEDICATED SERVERS)", which attracted hundreds of thousands of members.

There was only one small problem: Steam's publicly visible groups allow you to see what their members are playing. And when the launch of the game rolled around, guess what?

It turns out it's not just game companies that end up making embarrassing u-turns.