Graphic by Cameron Yong/HIGHLANDER

Video games are an expensive medium no matter how you look at it. Most gamers can only afford to buy a few games a year, as the barrier to entry requires owning a console, PC or compatible device capable of running the game, on top of the cost of the game itself and any peripheral hardware or software that may be necessary. If one were to do the math, the cost of even the bare minimum needed to play current-generation games adds up quickly.

This pricey nature of gaming leads consumers to want to get the most out of the games they play. The development side of the industry has likewise attempted to wring the most out of its games, padding the multibillion-dollar business with extraneous transactions and paywalls that run parallel to — or sometimes within — the games that have already been purchased. Recent events like Ubisoft's release of "Assassin's Creed: Unity" in a mostly nonfunctional state have raised questions about the presence of microtransactions in a game for which people paid full price.

Ubisoft released "Unity" to cries that it was a game marred by excessive bugs, which caused everything from the player character falling through the geometry of the world to missing facial animations that left players talking to floating eyes and teeth. What put many others on edge, however, was the presence of in-game microtransactions, something usually reserved for games with no initial purchase price that instead earn their entire profits through in-app purchases and ad placement. While these microtransactions do not directly affect the playstyle or completion of the game, they let players buy a boost in various aspects of the gameplay, such as enhanced melee strength during combat — essentially a cheat code for owners willing to pay for it. The issue was exacerbated by Ubisoft's choice to patch problems with monetary transactions before fixing the game-breaking bugs.

Another game hampered by the developer or publisher's choice to emphasize monetary gain over consumer satisfaction was "Batman: Arkham Origins" earlier this year. Upon the game's release, players were left to deal with glitches in the game engine that brought about audio drops and even blocked game progression. Though these issues were apparent, Warner Bros. Montreal released a statement that, "The team is currently working hard on the upcoming story DLC and there currently are no plans for releasing another patch to address the issues that have been reported on the forums." The development and production sides of "Origins" chose to milk consumers for more money rather than fix the problems they were experiencing with products they had already purchased.

This trend points to the fact that in recent years, priorities in the game industry have shifted toward a continuous release of content for a steadier cash flow, rather than toward pleasing consumers with finished products. Beta testing — the phase of development in which testers systematically play through a game so that faults in its code and design can be found and fixed — has been made secondary in an effort to make money more quickly. While the development end of gaming (as with any portion of the entertainment industry) carries extremely high overhead costs — sometimes running above the $100 million mark — that cost should not be deferred to the consumer, and it should certainly not come at the cost of a finished, working product.

Price points for games are firmly set for now at $60, and attempts to raise that price by too much are likely to go over about as well as a concrete airplane. It is for this reason that game companies attempt to offset the increasingly costly creation of games with DLC and microtransactions. It is here that the error in AAA-game development becomes really apparent: while developers focus on constant money-fueled spectacle, the actual gameplay suffers at the hands of deadlines and draining budgets.

From the perspective of a college student, it is necessary that people vote with their dollars to show that these sorts of actions on the part of developers are not acceptable. If purchasing trends are shown to favor games with little to no bugs instead of following a day-one fervor tied to well-known franchises, then publishers will respond by following the money. With so little money to freely spend after we pay tuition, rent and bills, it should not also fall to us to subsidize the development of AAA titles shipped in an unfinished condition. Independent games like "Super Meat Boy" and "The Stanley Parable" are perfect examples of enjoyment at a price point that doesn't exploit the consumer, and that model should be encouraged, even among companies as large as Ubisoft and Warner Bros.