Introduction

The gaming industry has come a long way since its humble beginnings more than thirty years ago. From a time when people were thrilled to see a square white block and two rectangular paddles on the screen to today, where gamers explore realistic three-dimensional worlds in high resolution with surround sound, the experience of being a gamer has changed radically.
The experience of being a game developer has changed even more. In the early 1980s, it was typical for a single programmer to work on a title for a few months, doing all the coding, drawing all the graphics, creating all the music and sound effects, and even doing the majority of the testing. Contrast this with today, when game development teams can have over a hundred full-time people, including not only dozens of programmers, artists, and level designers, but also comparably large teams for quality assurance, support, and marketing.
The next generation of consoles will only accelerate this trend. Game companies will have to hire more artists to generate more detailed content, add more programmers to optimize for more complex hardware, and commit larger budgets to promotion. What is this likely to mean for the industry?
This article makes the following predictions:
- The growing cost of developing games for next-gen platforms will push publishers to require that new titles ship on many platforms.
- Increased cross-platform development will mean less money for optimizing a new game for any particular platform.
- As a result, with the exception of in-house titles developed by the console manufacturers themselves, none of the three major platforms (Xbox 360, PS3, and Nintendo Revolution) will end up with games that look significantly different from one another, nor will any platform show a real "edge" over the others. Many games will be written to a "lowest common denominator" platform: roughly two threads running on a single CPU core and using only the GPU, as the sketch after this list illustrates.
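
To make that "lowest common denominator" concrete, here is a minimal sketch of what such a baseline might look like: one game-logic thread and one render thread sharing state, with no platform-specific use of extra cores or vector units. This is not taken from any actual engine; the names (GameState, simulate, submitToGpu) are hypothetical placeholders, and GPU submission is simulated with a print statement.

```cpp
// Hypothetical "lowest common denominator" game loop: two threads on one
// core, everything else left to the GPU. Names and behavior are placeholders.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <mutex>
#include <thread>

struct GameState {
    double playerX = 0.0;   // stand-in for the full world state
};

std::mutex stateMutex;              // guards the state shared by the two threads
GameState sharedState;
std::atomic<bool> running{true};

// Hypothetical logic update: advance the world by one fixed timestep.
void simulate(GameState& s, double dt) {
    s.playerX += 1.0 * dt;
}

// Hypothetical render submission: on a real console this would build GPU
// command buffers; here it just prints the state it was handed.
void submitToGpu(const GameState& s) {
    std::printf("render frame: playerX=%.2f\n", s.playerX);
}

void gameLogicThread() {
    const double dt = 1.0 / 60.0;   // fixed 60 Hz update
    while (running.load()) {
        {
            std::lock_guard<std::mutex> lock(stateMutex);
            simulate(sharedState, dt);
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
}

void renderThread() {
    while (running.load()) {
        GameState snapshot;
        {
            std::lock_guard<std::mutex> lock(stateMutex);
            snapshot = sharedState;   // copy, then render without holding the lock
        }
        submitToGpu(snapshot);
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
}

int main() {
    std::thread logic(gameLogicThread);
    std::thread render(renderThread);

    std::this_thread::sleep_for(std::chrono::seconds(1));  // run briefly for the demo
    running.store(false);

    logic.join();
    render.join();
    return 0;
}
```

The point of the sketch is its shape rather than its details: a title structured this way runs everywhere, but it leaves any additional cores or hardware threads on a given console idle.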