Game Developers are Unicorns

It might be my generation, or it might be that standards have dropped over the years, but the fact remains that at every step of my professional life (secondary school to college to university to work) I heard the same complaint: the class before was better, they knew more, we got further with the program, they were more committed, and so on.

Now, if someone hears that every time, they start to believe it or, as has been shown in many studies, their performance drops from being “manipulated” into believing they don’t amount to much. But there is also the other side: the upper “levels” expect the lower “levels” to do all the work so that they can just enjoy the ride. And this is especially true when moving from school to the industry, where you have to be productive from yesterday, everybody is busy shipping the next product, and no one can spare the time to mentor a newcomer. For this reason, newcomers must already have 5+ years of experience (wasn’t there “junior” in the job title?), have worked with 15 different frameworks and languages, and be familiar with the custom tools used in the studio (so palantirs do exist!).

This is a bit dramatic, but not too far from the truth. And while the industry complains that there is no talent to be found, I rarely hear of initiatives to fill the gap. A possible solution is for universities and the industry to start genuinely collaborating to cultivate talent. While there exist programs that try to do exactly that (for instance, the one I am currently enrolled in at the CDE), there aren’t enough of them, and even when there are, I still sense a certain resistance from the industry to getting on board.

Here are some general thoughts on why companies might not be so willing to take on junior developers, and on other parts of the pipeline:

– time: again, getting someone on the team who might still be in school is a serious commitment. They need to learn many different aspects of the craft while there are products that need shipping, and having someone mentor them even for a few months means reducing the output of a precious resource

– long-term view: there is no guarantee that the student will remain at the studio, so the company might invest a lot of resources in training a person who later moves to another company. While this is a legitimate concern, consider that the same happens at other companies, so you effectively “swap” talent. This is a great way to share knowledge and, in general, improve the overall quality of new entries

– open source: you can mock web developers all you want, but they understood a while ago that the best way to find new talent is to share their tools, so that people who are interested in the industry can have a go at the code and, when they interview for a similar position, already have hands-on experience with it. Web development conferences also usually share their recordings for free, so that people who couldn’t attend (hey, conferences are expensive) can still catch up on what’s happening in the industry. CppCon does that, but it’s not strictly a game developers’ conference; GDC has got better, as all talks from 2013 backwards are now free, but two years is still a significant gap in this industry. And don’t get me started on academic conferences and papers…

– which brings me to standards: as game developers, we feel that every game we develop is unique and needs new technology to come to life. While this might be true to some extent, and there has been some progress with Unreal and Crytek making their engine source code available to everyone, those engines are still heavily customised for each game. Also, the engine is only part of the pipeline, with a lot of other tools developed and maintained in house. Now, I might be naive here, but if a tool for editing animations, packing textures or compiling shaders is what gives a company its edge, there are some serious issues with the way it does business

– education: this relates to the previous two points; diploma and degree programs can be updated and reviewed only so often (I think the minimum is around 5 years), during which a lot of things will evolve industry-wise. Also, colleges and universities should provide general knowledge to their students: unless you are doing a very specific course in game development, a BSc in computer science has to cover a wide range of disciplines. Disciplines in game development are so specialised (animation, AI, rendering, audio, networking, gameplay, etc.) that they wouldn’t fit in a regular curriculum. Experts in those areas have dedicated their professional lives to them, so it would be a bit unfair to expect the same depth of knowledge from a junior

– entry level: I might sound like a broken record by now, but most companies usually want at least two years of experience even for a junior position. Now, where would someone get that experience if nobody is willing to let them gain it? True, you can build side projects to prove you understand the basics, but as we all know, coding takes years to master, so you would just be delegating to someone else the burden of training that person until they are an expert

– turnover and overtime: this might apply only to a minority of candidates, but working in the game industry is not the safest of options. With news of studios closing down making the headlines quite frequently, talented people might prefer other industries that have proven to be safer. Game developers also either boast about or denounce the high amount of overtime in the industry, which again can be off-putting for individuals who would like a life outside of work.

Unfortunately, I don’t have a solution to the problem. A more open collaboration between the industry and academia would definitely be a good step, with companies regarding training and mentoring as an investment rather than a pure cost.

Thoughts, suggestions, criticism? Let me know!