Microtransactions are still plaguing games after being around for 18 years.

A retrospective on microtransactions in console games, absolving Bethesda despite their role as the pioneer.

April 7th, 2024.

When looking back on the history of microtransactions in console games, it's hard not to think about Bethesda. They were the trailblazers, the first to release a proper microtransaction for a console game on April 3, 2006. Sure, there had been things like dashboard and avatar pictures before, but this was the first time an actual in-game item was offered for sale. And it caused quite a stir.

At the time, I remember there was a lot of outrage over the concept of microtransactions. The idea of paying real money for virtual items seemed ridiculous, and many mocked both the game and anyone who bought it. The item in question was horse armour for the single-player game The Elder Scrolls IV: Oblivion, and it cost £1.50. But despite the backlash, it was surprisingly successful. People bought it in droves, and soon enough Bethesda and other game developers realised they could make easy money by selling cosmetic items for real money.

Around the same time, microtransactions were also becoming popular in mobile games. But this was a different situation. Most mobile games were free to play, and if you did pay for something, it usually cost less than the horse armour. Nonetheless, it was still concerning to see microtransactions becoming more prevalent in the gaming world.

One of the first consequences of the rise of microtransactions was the disappearance of unlockables in games. The days of unlocking costumes and extras by playing the game were quickly fading away. Why offer these things for free when you could sell them for real money instead?

This wasn't necessarily a huge problem in itself; game development costs have risen while game prices have remained largely the same. However, things took a turn when games started to sell in-game currency or items that made the game easier or quicker to play through.

As microtransactions became more profitable for publishers, they began to take things out of games to sell them separately later. This led to a growing concern that game designs were being altered to entice players to buy these microtransactions. Suddenly, we couldn't trust that the game we were playing was meant to be experienced in a certain way.

From horse armour to pay-to-win gaming in just a few short years, it felt like microtransactions had become a core part of the gaming industry, treated as if they were some cool new gameplay idea rather than just a way to make more money.

Big publishers like Ubisoft, Activision, and EA also started to add microtransactions after a game's release, so they wouldn't be mentioned in reviews. This was met with constant criticism, but they didn't stop. Why would they? Only a small minority of people were complaining, while the majority of players were paying without thinking about the consequences.

Despite the backlash, I don't blame Bethesda for being the first to introduce microtransactions. They were simply the first of many companies that would have arrived at the same idea. It's the gamers who made it a success. If no one had bought that horse armour 18 years ago, microtransactions might have been written off as something that would never work on consoles.

Unfortunately, the opposite happened, and now it seems like microtransactions are designed first, and then a game is created to sell them. The game you pay for is essentially just a storefront for the real profit. So, thank you to those Oblivion players who thought £1.50 for a piece of horse armour was a good idea. You've had a major influence on the history of video games.
