TLDR: The human tendency to want more for less exists in game developers and gamers alike, and it eventually turns games into beautiful yet empty husks that leave some gamers unsatisfied no matter how realistic the graphics are or how social the experience is. Interested? Read on then…
I am no spring chicken, and age does give you perspective. So I recently took a look back at gaming over the years to figure out how we got things like MMOs, F2P, multiplayer and social gaming. It all led me to one aspect of human nature: wanting More for Less.
One of the first computer games I ever owned was SYZYGY for the Dragon 32 home computer:
As you can see, the game had a minimalistic approach to graphics because of hardware restrictions. Much later in my gaming “career” I got an ATARI 520ST and a Commodore AMIGA and enjoyed plenty of awesome games. It seems to me that most games of that era had entertaining gameplay and good graphics, at least for the era in which they were released.
What was common to all the hardware I owned before my first PC, however, was that those machines were closed in their architecture and did not support replacing the graphics hardware with something better. My Dragon, ATARI ST and AMIGA would all be set aside and replaced by a PC, because the PC enjoyed an open architecture and the older machines did not. This, I determined, indirectly led to the decline in entertainment value I derive these days from most of the games both consoles and PCs have to offer!
You will undoubtedly ask how I can make such a seemingly ludicrous claim, that constantly improving graphics may lead to a less enjoyable game experience. The answer is simple: people always want more for less. Much like console manufacturers now try to land exclusive deals with game developers to develop titles for their hardware, graphics-card makers tried to get game developers on board with their brand of video hardware. PC gamers still debate which is better, ATI or NVIDIA, and there used to be other graphics-card makers out there too.
Graphics-card makers sent prototype models of their hardware to game developers, who developed games supporting the exclusive features that video hardware offered. Games therefore had to include a lot of code that examined the graphics hardware in the PC and enabled or disabled certain features based on what was installed. Developers wanted to spend their time more efficiently and ship more titles in less time and for less money; gamers wanted to enjoy more games with less frequent changes to their hardware, and both the hardware and software industries came to the rescue.
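That kind of per-card feature gating boils down to a lookup from detected capabilities to enabled effects. Here is a minimal sketch of the pattern; the vendor names, capability flags and the `enabled_features` function are all hypothetical illustrations, not any real driver API:

```python
# Hypothetical sketch of per-hardware feature gating.
# Capability flags and card data below are made up for illustration.

def enabled_features(gpu):
    """Map a detected GPU's advertised capabilities to the render
    features a game would switch on for it."""
    caps = gpu.get("caps", set())
    return {
        "base_rendering": True,                       # always available
        "hardware_tnl": "tnl" in caps,                # hardware transform & lighting
        "bump_mapping": "env_bump" in caps,           # environment bump mapping
        "32bit_color": gpu.get("vram_mb", 0) >= 16,   # needs enough video memory
    }

# Two imaginary cards with different capability sets:
older_card = {"vendor": "VendorA", "vram_mb": 12, "caps": {"env_bump"}}
newer_card = {"vendor": "VendorB", "vram_mb": 32, "caps": {"tnl", "env_bump"}}

print(enabled_features(older_card))
print(enabled_features(newer_card))
```

The same game shipped one code path per capability, so each player saw only the effects their card supported, which is exactly the branching overhead the standard APIs mentioned next were meant to remove.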
Enter a new generation of consoles, graphics standards such as OpenGL and DirectX and, more importantly, game engines and their toolkits. To make a game you need a team of writers, artists, game designers and, last but not least, software developers. If you want to make money developing games (or any other product, really), the best way is to recycle old materials (e.g. art assets, program code, gameplay concepts) into new products, which cost less and less to produce as you continue to develop “new” games. This, of course, allows you to develop more for less. It also means a franchise can be worth its weight in gold if you manage to produce a new title every two years, or even less!
EA Sports franchises have proven this concept, as have the countless fans of the HALO franchise and other franchise-based shooters who are willing to buy new downloadable maps and items for their multiplayer matches, not to mention entirely new installments in an existing franchise. Once a franchise gains enough followers, it is more profitable to focus development and production efforts on making more “content” for that franchise than to try to break through once more with new ideas.
And then… there are MMOs, games that would have us either pay $180 a year for the right to log in and play, or pay even more for a stream of content items that is put into an item store rather than added as regular content. Some MMOs have both a subscription fee and an item store selling vanity items, increasing profits to obscene levels.
Add to this the human tendency to tolerate anything and everything while in a group, and social gaming rises to offer developers profitable avenues: sandboxes where gamers can build stuff together (e.g. Minecraft), or battlefields where gamers fight each other for… well, nothing really, other than bragging rights or territory control, which usually means, you guessed it, bragging rights.
Do you see a pattern here, dear reader? Developers are trying to cut their costs by recycling content and game systems as much as possible, and gamers are loving it because they can grind the content with their friends. All the while there isn’t really any need for new stories, new art assets or, well, anything new really, just a stream of l00t drops and bragging-rights plaques (a.k.a. Achievements) that gamers contentedly tweet and post to Facebook about.
You could of course dismiss everything I’m saying and call me an old fart who’s too self-centered to get down with the masses and grind some raid bosses for that purple gear piece that would increase my stats by 0.5% so I could grind the next boss in the next raid. You’d probably be right, too, but I leave you with this: for every awesome single-player game like BioShock Infinite, Dishonored or Ni no Kuni, there are multiple F2P MMOs and Facebook games out there that offer not rich and emotionally engaging content but a bland been-there-grinded-that experience that promises nothing but more bragging rights.
Admittedly, it’s not all bad because some developers are actually integrating in-game systems that allow gamers to create content for themselves, but more on that in my next post!
As always, thanks for reading and even more so for commenting! 🙂