Last week I derided permacrunch – the policy of working an entire creative team 70 hours a week all the time – as “the policy of simpletons and sociopaths”. This led to some people asking, “Why is crunch even a thing?” Can’t management just plan the schedule so that the project is done on time using only 40 hour work weeks?
Sadly, I don’t think that’s a fair expectation at all. And it’s not because management is a bunch of soulless meanies who want to work our poor developers to death. (I mean, management might still be soulless meanies, but not for this reason.) The problem is twofold:
- Scheduling is hard. To accurately predict how long it will take you to create software, you’ll need to know all the problems you’ll encounter ahead of time, and how long it will take to solve them. By the time you know that, the game has probably shipped.
- There’s no upper limit on how much time you can spend making a game, and no matter how much time you give the team, some developers will always push for “just one more feature”. Not because they’re stupid or irresponsible, but because they really love games and want to make this one really good. I say this from experience. Good Robot shipped about four months later than we planned, because we had more features we wanted to add. And none of us were getting paid until the game was done. If we were willing to delay our own payday to make the game we wanted, how much easier do you think it is to push for more features when you’re not the one who will have to bear the direct financial consequences?
Scheduling isn’t just a problem in videogames. It happens in all kinds of software development. When asked “How long will it take to accomplish X?” the most common answer will be given under the assumption that when you work on X you’ll never encounter hard-to-identify bugs, that the requirements of X won’t change, and that some obscure hardware problem won’t eat up a bunch of your time. It assumes you won’t have staff turnover that requires integrating someone new to the project. It assumes that no programmers will be pulled away for “just a few days” to deal with some horrible crash or exploit in the game you just shipped, which might actually stall the whole team because everyone’s work is interconnected. It assumes that the design itself is perfect, that all systems will work as imagined on the dry-erase board, and that everyone’s artistic and technical ideas will fit neatly together in the final product.
I hear you say, “So just multiply your estimate by some number!”
Okay. What number? Two? Ten? Should I just make up some random bullshit multiplier?
Keep in mind that when you’re giving a quote like this, you are basically making a promise to get the job done for a certain price. Management is trying to figure out how much videogame they can get for their money, and how risky that endeavor will be. Massively padding out your proposed schedule doesn’t help them make informed decisions.
If you make your quote too high, the project might get canceled, or they’ll start cutting features before you even start. The problem is that management has no ability to know if your estimates are accurate or not. They can’t know if you’re being conservative, or optimistic, or if you’re just padding your schedule to make yourself look good. They have other projects they could be risking the company’s money on, and if yours sounds too expensive then maybe they will cancel your project and do something that seems less risky. Maybe they will assume you’re a bad project lead and give the job to someone else. Or maybe – and this is the really terrifying one – they will pull one of your colleagues into the meeting and ask them for a quote. Better hope they come up with the same time estimate and the same bullshit multiplier!
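To see why a single fudge factor can't rescue a schedule estimate, here's a minimal Monte Carlo sketch (all the task numbers and risk probabilities are invented for illustration, not taken from any real project). Each task usually lands near its estimate, but occasionally a hidden bug or requirement change multiplies its real cost, so the distribution of outcomes has a long right tail that no one flat multiplier captures well:

```python
import random

random.seed(42)

# Hypothetical per-task estimates in days (made-up numbers).
estimates = [5, 8, 3, 13, 21, 8, 5]

def simulate_project(estimates, trials=10000):
    """Monte Carlo: each task usually lands within +/-25% of its
    estimate, but ~15% of tasks hit an unforeseen problem (hidden
    bug, spec change, hardware weirdness) and overrun by 2x-5x."""
    totals = []
    for _ in range(trials):
        total = 0.0
        for est in estimates:
            cost = est * random.uniform(0.75, 1.25)  # normal case
            if random.random() < 0.15:               # unforeseen problem
                cost *= random.uniform(2.0, 5.0)
            total += cost
        totals.append(total)
    totals.sort()
    return totals

totals = simulate_project(estimates)
naive = sum(estimates)
print(f"naive estimate:  {naive} days")
print(f"median outcome:  {totals[len(totals) // 2]:.0f} days")
print(f"90th percentile: {totals[int(len(totals) * 0.9)]:.0f} days")
```

The gap between the median and the 90th percentile is the problem: a multiplier that covers the median looks padded, and one that covers the tail looks absurd, yet both are "honest" answers to the same question.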
But let’s talk about the biggest scheduling hazard of them all: You want to make the best game possible.
You’re about halfway done with Punch Guy 2: Punch Harder when one of the gameplay coders shows you something he’s stumbled on. It’s an emergent behavior nobody planned on during the design phase. The new “destructible environments” feature combined with the “punch guys over railings” feature and created a situation where you can punch someone through a window. Once people found out about this, they formed a little group around one of the testing computers. All afternoon there’s been a rotating group of five or six people passing the controller around, blasting dudes through windows with uppercuts and cheering. It’s the most enthusiasm the team has shown in weeks.
Everyone who tries it says, “We should add this to the game.” Moments like this are why you went into making games in the first place. It’s an obviously fun feature, it’s something your rivals making Fist Man don’t have, and as a bonus it will look great in trailers. Obviously you’re a fool if you don’t put this in. And the feature is basically done, right? So this accidental thing is given the blessing to become a real feature. You don’t need to put it on anyone’s to-do list, since people were basically already working on it in their “spare time”.
But as time goes on, it becomes clear that the feature is not actually as close to done as it seemed. Over the following weeks, a gradual trickle of new issues get added to the global to-do list:
1) There are only a couple of windows in the game where this feature can be used. We should have the level designers go in and add a few more, where possible.
2) Windows were originally supposed to be un-breakable, so right now there’s nothing but a black void outside. Level designers will need to go back to completed levels and add some sort of facade for the world outside the window. (Which artists will need to design.)
3) The AI doesn’t recognize it’s been punched out of a window, which means the victim just stands outside of the level, shouting combat taunts. This might lead to players getting stuck in areas where their goal is to defeat everyone.
4) The lead artist is driven crazy by the fact that you shatter these dirty external windows and the lighting doesn’t change in the room, even though it should let more sunlight in. She insists that we add some dynamic lights to deal with this. In the end, she’s right. It really does give the window-breaking an extra visceral kick, because it gives the player an additional way to change the environment. But it also means a little more fussing around and generally adds a bit more overhead to building levels.
5) There are a bunch of strange edge-cases where you can punch a guy out of a window, but other dudes are between the victim and the window. The victim sort of floats through them, which looks horrible. We need changes to the AI or fight logic to handle this.
6) A tester punched the final boss out of a window and ended the fight in 5 seconds. Remember that being punched out of a window is actually just being punched over a railing, which was supposed to be an insta-kill as far as the game is concerned. But we don’t want players to be able to insta-kill the final boss.
We originally dealt with this by not having any railings in the final room. But we can’t remove the windows from the final room. The cutscene animator is adamant that this window is integral to the pre-fight cutscene. It’s a major component of the lighting in the room, and one of the characters even looks out of the window and comments on something outside. Removing this window would mean re-writing this scene, re-scripting the action, re-recording the dialog, and re-designing the room layout. The artists say you should make the window unbreakable, which the gameplay designer hates because it goes against the established rules for no good reason. (As far as the player is concerned, anyway.) He thinks the window should be removed, which the art team hates because they don’t want to throw away all this work and make the scene less visually interesting just to solve an “exploit”. (“You want us to trash the room we’ve spent the most time on, because you can’t balance the fight mechanics?”)
This debate goes in circles for weeks. Eventually some bright young kid comes up with the idea of making the window “strong”. It cracks every time you slam someone into it, and on the third impact it shatters. Maneuvering the final boss to be thrown into the window three times is actually more difficult than just doing the fight normally, although it is faster. This seems like a good compromise, and even a nice side-goal for skilled players.
This solves the problem, but requires even more work from everyone.
7) The art team thinks the glass particles need to be re-worked. They were fine if this was just a random bit of scenery shattering, but now that this is a major feature and it’s going to be in cutscenes and trailers, we really need the bits of glass to glitter in the sunlight. There should be dust. And there’s a three-day debate in email where the team argues over whether it makes sense for there to be flying blood particles, or if lacerations from flying glass wouldn’t start visibly bleeding until after the body hit the ground. Suddenly everyone is citing Wikipedia, or Myth Busters, or their cousin who’s an EMT, all trying to give a definitive answer to justify their vision.
8) The really big foes – “bruisers” to the dev team – don’t look right when going through a window. They’re so tall that their head passes through the frame at the top and their arms often clip through on either side. We need to either make them immune to throws in front of a window (the gameplay guy throws a fit at this suggestion) or we need to record a couple of new animations where they… I dunno… maybe tuck in their arms and head so they fit through the window? Can the animators make that look good? (The lead animator looks like she’s going to bite someone when you mention this to her.)
9) The flying glass particles we added? Now the sound designer says his effects no longer match. His sound was originally sort of a generic “something broke” sound, but it’s totally inappropriate for this cloud of flying glass shards we have now. He wants to add a few new sounds for this, and he needs the scripts for the windows to be changed to use these new sounds.
10) In the warehouse level, guys are scripted to jump IN through the windows for the set-piece brawl. This is confusing as hell for playtesters, who all wanted to stand in front of the window and punch guys back out the moment they jumped in. This doesn’t work the way they expect, and it would be lame if it did. We need to change something so that players no longer want to do this. Eventually – after weeks of back-and-forth – we have the guys enter through some sort of chute.
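The fix for issue (3) – and the “strong window” compromise from the final-boss debate – can be sketched as game logic. This is a purely hypothetical sketch; every class and method name here is invented for illustration and doesn’t come from any real codebase:

```python
# Hypothetical sketch: treat "punched out of the playable area" as a
# defeat, so stragglers outside the level can't soft-lock a room whose
# goal is "defeat everyone". All names here are invented.

class Window:
    def __init__(self):
        self.cracks = 0  # the "strong window" in the final room cracks


class Enemy:
    def __init__(self, name, is_boss=False):
        self.name = name
        self.is_boss = is_boss
        self.alive = True

    def on_knocked_through_window(self, window):
        if self.is_boss:
            # The compromise: the boss only goes through on the third
            # impact, so the insta-kill is harder than fighting normally.
            window.cracks += 1
            if window.cracks < 3:
                return  # window cracks, boss bounces back into the room
        # Regular windows shatter on the first hit. Either way, the
        # victim counts as defeated instead of standing outside
        # shouting combat taunts.
        self.alive = False


def room_cleared(enemies):
    """A room is cleared when every enemy is defeated, including ones
    knocked out of bounds."""
    return all(not e.alive for e in enemies)
```

Even this toy version hints at the hidden cost: the window-punch feature now touches the AI’s life state, the win condition, and per-window state, which is exactly how a “free” feature fans out across the to-do list.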
Suddenly this “tiny” feature has gobbled up a couple of weeks’ worth of development time. It’s not management’s fault. Nobody here is a bad guy. Nobody is incompetent. The only way to avoid stuff like this is to make a firm rule of NEVER EXPLORE AWESOME FEATURES THAT AREN’T IN THE SPEC, and I don’t think anyone – developers, players, managers – wants that.
Games are Never Finished, Only Abandoned
And all of that was just one issue, out of the dozens that arise during development. There’s another issue – just as big and complicated – where we change it so the player can choose to push a button to kiss their girlfriend, rather than having the hero kiss her in a cutscene. That results in days of debates and arguments over player agency and which button to use. (You can’t use X, we use that for punching! But Y is for jumping, and that doesn’t seem right. Grapple? Choke? A philosophical debate ensues: which of our violent gameplay verbs is closest to “kiss”?)
There’s another issue where – after weeks of struggling – we conclude the fistfight inside the car during a high-speed chase just isn’t working. The sequence is scrapped, but then someone tries to salvage it by having the fight happen in the back of a box truck, but it’s too late in development for all the required animations, cutscenes, and dialog, and the feature is finally cut for good.
The whole project is filled with these sorts of decisions. Every week is a new challenge of “Something we did last week created unforeseen consequences this week, which can only be solved by cutting features or doing more work.”
Actually, Scheduling is Impossible
Even if the project manager is a genius who correctly predicted how long it would take to make Punch Guy 2: Punch Harder according to the original spec, the game is still behind schedule because of all of those things that were added or changed during development. And note that if the project manager had doubled their estimate so there was lots of room in the schedule, people would simply have devoured that time by adding even more stuff. (For example: Maybe we would have stuck with that fight in the moving vehicle until we got it working.)
Even in a company where all of the executives are kindly saints (please tell me where to send my resumé) and the project manager is a scheduling master (ah, so this company is apparently based in “Opposite World”), the game is still running out of time. Not because anyone was “evil”, but because the team itself wants to make the best game they can. This team enthusiastically added the window feature on their own, knowing full well that the ship date wasn’t going to move. They are stretching themselves to solve difficult problems to make the most entertaining game possible. That’s literally the reason they entered this field in the first place. That’s the part of the job they love. It’s a challenge to overcome, and they imposed it on themselves.
Crunch is simply the moment where everyone’s soaring ambitions come face-to-face with the limits of a cold, uncaring universe. Limitless ambition will always be at odds with finite resources, and the end of the project is where those two forces collide and are reconciled.
So then somebody says, “Then companies should adopt a policy of releasing a game when it’s done, like Valve, Nintendo, and Blizzard!”
That’s nice if you’re sitting atop vast cash reserves, you’ve got the best talent in the industry, and your games are guaranteed to sell. Not all companies have the luxury of open-ended development. THQ folded in 2013, and took a lot of jobs with them. Closing a company is brutally painful for hundreds of people, and you can’t really blame the managers who avoid this by restraining games to a finite budget.
And just to play devil’s advocate: Imagine you’re a corporate manager. You allocated $X million to Punch Guy for a Christmas release. And now the project lead comes to you in late summer and says he needs more time (money) to finish the game, and when you ask why he tells you about all this stuff – like the window feature – that you never agreed to and wasn’t in the design document. Pushing Punch Guy back would make the game cost more AND it will force you to pass up the massively profitable Christmas sales AND it will put Punch Guy 2 on the shelves in February, where it will compete with Shoot Guy, one of the other titles that you manage. It’s your entire job to say no to stuff like this. Besides, you’ve only got so many dollars to go around. The Punch Guy team went off-mission and now they expect to be rewarded with more money? Why should it go to the Punch Guy team and not the Shoot Guy team?
Sure, you can probably schedule a project down to the day, as long as you’re doing something very familiar. If this game is just last year’s game with new maps, then I guess we can just turn development into a conveyor belt. If we don’t want to invent new gameplay then we don’t need to allow for the R&D that new gameplay systems require.
Crunch is most likely to happen to teams that are ambitious and try to do a little more than their budget allows. (Unless we’re talking about crunch imposed as a “money-saving” effort, which we discussed last week.) It happens when people try to innovate. When they polish a game through iteration. Which, incidentally, are the very things needed to become like Valve, Nintendo, and Blizzard.
So yes, I think late-stage development crunch can be part of a healthy studio. Which is why I’m adamantly against mandatory perma-crunch.
Like a distance runner saving some energy for the last sprint, you want the team to have fuel left in the tank when it’s time for the big push. You want them to have some enthusiasm left when it’s time to start making the tough choices about which features need to be cut and which ones we can finish in time. You want them to surge forward when they see the finish line, and that can’t happen if they’ve been working 6 days a week for the last year. If the team is wearing out, then their output will slow down just when you need it most. Sane work hours and good morale give your team the ability to sprint when it looks like the ship date is at risk.
Perma-crunch is stupid and destructive, but voluntary crunch by people who are trying to push the medium is a good thing. Hard work is not bad or evil. Great things can happen when you’re willing to push. But people are not machines, and they need to save their energy for those moments when it will count the most.
137 thoughts on “This Dumb Industry: In Defense of Crunch”
Also to bear in mind: is punching someone through a window going to generate, on its own, enough extra sales to pay for the probably several person-months of work it has generated for the team? Almost certainly not. If you make a habit of doing that sort of thing you become one of those companies fondly remembered as “I used to love their games. Shame they went bust. Modern devs are all too lazy and stupid to make games like that any more.”
Clearly, the solution is to build enough of the basic building blocks and then leave it out of the finished game (but still in the code), knowing that sometime down the line a player digging through the game code will discover it and create a mod for it, and then you can praise the mod community for picking up something that you thought would be awesome but had to cut due to time constraints. ;)
Incidentally, I’d love to see this done by Bethesda – Skyrim was “Go anywhere do anything” with all the depth sucked out to make it accessible, but imagine someone digging through the code and discovering that every dungeon has an unused level modifier that you could turn on to make it into a more Morrowind-esque experience where the world really doesn’t care what level you are, the foes are as hard as they happen to be.
Look up the Skyrim Scaling Stopper mods. It’s not really built into the game files, because Bethesda thinks more people enjoy the game with difficulty scaling with level, but I prefer using a scaling stopper.
Something that was built into the game but was scrapped due to time constraints was a much more in-depth and reactive Civil War quest line. Apollo Down on Nexus found these defunct, buggy bones and resurrected the civil war the devs wanted in the game. The mod is called Civil War Overhaul if you want to check it out.
It’s a good idea in theory. What would happen, though, is #playersportal would start lambasting the studio for keeping core gameplay features from them and start harassing the female employees (especially those in accounting and HR), because clearly they are censoring everybody’s right to gameplay that wasn’t completely developed in the first place.
Well, that was certainly from left-field.
I don’t know that “enough extra sales on its own” is a useful metric. There are very few games where there is one specific feature that makes people buy the game, especially for games that aren’t sequels.
Imagine they took one weapon out of Doom. Suddenly there’s no more shotgun. All the other weapons are there, but there’s no shotgun. Does that make you not buy the game if you otherwise would have? What if they took out just the pistol? What if Starcraft was exactly the same except they just removed the Roach? How many fewer people purchase Mass Effect 2 if you remove any single party member?
When you buy a game, you buy it because it’s that game, a complete package with lots of different things all working together. For pretty much any well constructed game, there’s no single thing you can point to and say “If this specific thing wasn’t in the game, I wouldn’t have bought the game.”
Game development would be a lot easier if you could do it that way though. “Yeah, I checked the chart. Adding that Sniper Rifle will generate about 10,000 more sales, but the Grenade Launcher adds 20,000, so let’s add the Grenade Launcher instead.”
I’d go further, and say that if you decide only based on “how many copies will this sell” which features go into a game, then you should not be making a game, you should be selling gambling machines. In the end, games are an art-form, and not just a commodity like cars or shoes.
Not every feature needs to be added either. Adding a “square blocks only” option to Tetris does not improve the game at all, even if it is just an option.
The question should be: Will this feature make the game better?
Nope, games are a business. People who treat it like an art form without the benefit of a patron willing to throw money at them in a loss-making venture go out of business. I’m not personally prepared to insist that all games devs need to be prepared to work for (in effect) pathetic hourly wages in the name of producing art for people who aren’t that happy about having to pay for it.
There is a steady stream of devs who do treat their work as art and put in long hours for little reward. And then they discover the benefits of being able to afford somewhere to live and not starve to death, and stop making games, or work for decidedly business-oriented games studios. Except for the one person in 10,000 who makes enough to live off at a reasonable level.
It would be nice if people had the luxury of doing what was best for the game, but that bears no relation to the financial realities of game development at the moment.
Games are cultural products, like movies or books. What this means is that their success as a business venture is a lot more unpredictable than that of typical products. The cultural product must engage its audience successfully. So it may not be Art, but game devs must share some of the artist’s concerns…
“I’d go further, and say that if you decide only based on ‘how many copies will this sell’ which features go into a game, then you should not be making a game, you should be selling gambling machines. In the end, games are an art-form, and not just a commodity like cars or shoes.”
Cars and shoes are not commodities.
I’d reply that if you think this way then why did you demand a salary to work here? People go into the games industry to *make money* making games. The people who fund games fund them to make money making games – everyone else working on the project is making money, why should the investors get left out? If people didn’t make money making games there would be a heck of a lot fewer games and of lesser quality.
‘Making art’ is a luxury the programmers, animators, level designers can afford. Project leads *must* be able to justify every budget expenditure. Saying ‘its art’ isn’t good enough – at a minimum ‘its art and that will generate buzz to help market’.
I fail to understand why Phill and Incunabulum replied to this with statements regarding art such as :
– “People who treat it like an art form without the benefit of a patron willing to throw money at them in a loss-making venture go out of business.”
– “If people didn't make money making games there would be a heck of a lot fewer games and of lesser quality.”
– “Saying ‘its art’ isn’t good enough – at a minimum ‘its art and that will generate buzz to help market’.”
Art was not part of the discussion, but the way they introduced it in such negative terms reminds me of some economics teachers I met back in the day when I got into a “game school” here in France. There was a huge difference in how they viewed the gaming world and the business around it. In short, some were more sensible and open-minded than others.
It’s funny they post that here when Shamus, Chris and MrBtongue already talked about corporate culture and how it slowly kills games as art.
Even if you don’t take them seriously, you can easily imagine the AAA games industry nowadays as sticking a project manager onto a painter: the manager asks that a naked woman be included in the painting because it sells more, and puts tape on the painter’s mouth before he gets to say anything about how “it wasn’t HIS vision” or “it’s not what he likes or wants to do”.
– “Shh shhh, you do that or I cut the budget for the green paint and your bonus”.
– “But I’m not doing it for the money!”
– “Shhh shh”.
This is also the problem with Day One DLC. Is Javik from Mass Effect 3 worth $10 of added “fun” to the game? If not, what is a fair estimate? $5? $1? There are very few things in a game you can point to and say “this is 50% of what I like in the game right here.”
Yeah, there’s always the slightly different but equally hazy questions of “What exactly is content worth?” and its cousin “How much game is there in a Complete Game?”
DLC arguments tend to largely end up happening because people don’t agree on what features are obligated to be included in the main game and how much money an additional thing should actually cost. If you played Mass Effect 3 without Javik, and you didn’t know Javik existed, would you have gotten $60 worth of value?
(Compounded, of course, by the economics of games being weird. If I buy a burger and my friend buys a burger, that costs the company about twice as much as just making me a burger. With games, the cost is split across everybody who buys it, so $5 should possibly in theory get you more from a popular game than from an unpopular one. But then you’re getting into arguments over moral imperatives vs what the economy will allow and we’re in an even deeper mess than just the game value one)
If jumping wasn’t in Super Mario Bros., I wouldn’t have bought it.
I have a few issues with your article:
First, I feel like fold #1 of the “twofold problem” really needed some more attention than it got. I work in a field where the algorithms are specced, the hardware targets are specced, and dev responsibility is specced (and everyone – end-user, contractor, and developer alike – is contractually bound to stick to the spec or pay more time/money), but #1 still gets me on many, many projects.
Second, I don’t know how much of the dynamic you describe in detail when it comes to enthusiasm-driven feature-creep scales up from a smaller dev team like the Good Robot team to the infamous perma-crunch AAA big boys. Who really has power to suggest new features in a larger development house? Aren’t the managers (the ones with scheduling power) the only ones who would be allowed to approve new features to begin with? The cynic in me believes feature creep in a AAA studio comes less from enthusiastic developers, and more from enthusiastic marketers after focus-group testing…
Finally, coming off your last article it kind of comes off like you’re blaming devs for perma-crunch, but the way you talk at the end suggests there is a distinction between “developer-volunteered feature-creep crunch” and “management mandated crunch.” It would be helpful if you described where you’re seeing the line between these two, especially since (as I noted above) I suspect “management” is where feature creep would actually stem from.
Who has the power to suggest new features in a larger development house? Practically anyone. Getting the design team to approve it, the coders to admit that it can be done, and so on, is a lot more difficult.
I’m not sure I’ve worked in a real AAA studio, but I’ve been one step down from there.
As a programmer implementing game code, you can often implement small new features and then show them to people as a fait accompli. These will often actually make it into a game, if they don’t appear too complicated.
Yes, all new features need to be approved, usually by the lead designer working in concert with the project manager. If the project has a separate “vision carrier”, then they’ll be involved, too. But in the case where a feature has already been implemented, or just falls out seemingly “for free” as an unexpected consequence of other already-approved features (as in the “punching a guy through a window” example), then it’s very likely that it’ll be approved, since these people will usually view it as low-cost, and as a team morale booster. (And in fact, in the case of “unexpected emergent behaviours”, it often seems like it would be more work to prevent those unexpected emergent behaviours, than it would be to simply integrate them into the game!)
I also work on projects that are big (though most of the people working on them aren’t programmers), and I will admit that, quite often, I will go the extra mile to add a little usability feature here and there. However, something I accomplish by myself in a long afternoon is a far cry from what Shamus is describing.
I feel like there is a distinction when it comes to feature creep. There’s “internal creep,” where an individual developer sees something that can be made awesome for relatively little work, so you drop a few hours and maybe enlist a handful of other enthusiasts to make it happen. Then there’s “external creep,” where someone who has relatively little experience in development sees something that can be made awesome for little work, because they don’t understand the inner workings of what they’re messing with.
The thing is, I experience both forms of creep all the time, and at no point ever, not even once in six years, has “internal creep” ever forced me to miss a deadline. More importantly, it has never forced one of my coworkers to miss a deadline, because I’m not an idiot. Even though I’m not an expert in all fields surrounding my own, I can tell when I’m making more work for other people. If I’m dancing close to overshooting a deadline, I sure as hell try to avoid that, even if it means nixing a feature.
On the other hand, if I am ever behind, you can bet your ass it’s because a customer or a contractor changed the scope of my work after I began, and now I’m neck-deep in trying to make a square peg fit a round hole. That’s the equivalent to game development crap like “social capabilities for Mirror’s Edge”–people outside the development process inflicting creep on those lower on the corporate hierarchy.
For a small team, I can totally see how “external” and “internal” creep become very blurred – after all, Shamus is his own boss. If Good Robot’s lighting model needs to be modified, the asset code needs to be changed, the AI needs to be fixed, etc. etc., Shamus is the only one doing all that work. Therefore Shamus, dedicated developer that he is, works on throwing robots through windows, and deadlines get fudged because Shamus is the one who created the deadlines in the first place.
In a larger organization where those twenty hats are parceled out to twenty different people, it’s different. If I am a part of a large team, I have to look my teammates in the eye every day–I don’t want to inflict crunch on them unless it’s for a damn good feature. Peer pressure does wonders to reduce “internal” feature creep, more than people give credit for. At the same time, there are constantly features in AAA games that are clearly the brainchild of someone in marketing or misguided leadership–e.g. Social Play for Mirror’s Edge–which make it pretty obvious to me where the crunch is actually coming from.
To hear it described in the article, however, crunch is inevitable because developers have such unbridled enthusiasm for their work that they can’t see the bigger picture of what the consequences of feature creep will actually be. I’m very dubious that so large a share of the blame for crunch–especially “bad” crunch, though I’m still somewhat unclear on where the line between “okay” and “bad” crunch lies–can be laid at enthusiastic developers’ feet, but that’s the impression I’m getting from the article.
In games, that sort of thing does happen pretty often to various extents. For instance, in Assassin’s Creed the Apple’s effect actually originated as a bug and was turned into a final bossfight.
I’m not questioning whether developer-driven creep happens or not, I’m questioning how much schedule overrun it causes.
I have much more faith in developers–especially over-worked developers–to look at an emergent feature and think critically about whether the extra feature is worth the extra work. I do not trust executives or marketers to make the same assessment.
In Shamus’s example, unless the window-throwing feature was absolutely outstanding, someone on the team would kill it before it got big enough to become a ballooning cost. If instead the feature came from marketing after “extensive” focus-group testing, the feature would chug ahead regardless of sensibility. This is overwhelmingly supported in my (admittedly anecdotal) experience.
Therefore, I think it’s wrong to blame enthusiastic developers for crunch, especially on very large projects. Unbridled enthusiasm self-corrects as teams grow larger, while top-down scope creep becomes worse under the same conditions.
From what I gather, often they realize it’s going to balloon their schedule too much after it has already ballooned their schedule. Then if they are a normal company they cut it and if they are not a normal company they invoke Valve Time.
“From what you gather,” from what source?
I’m sorry for belaboring the point, but it’s really frustrating to me that everyone apparently regards developers as undisciplined children who will happily volunteer for any task–no matter how unprofitable or unproductive–because their developer brains are too simple to understand the consequences of feature creep. Like, if the smart money-people at the top weren’t reining them in, the simpletons would be kidnapped by a stranger in a white van that offered them free pizza and a chance to re-write Pac-Man from scratch for a moderate stipend.
Bunk to that, I say. If more than 10% of last-minute schedule push-backs or long-hour crunch-fests on huge, AAA projects originate from features the development team volunteered for on their own initiative, I will eat my hat.
Assorted post-mortems talking about features they pursued and eventually cut (Rise Of Legends had a faction with nomadic cities that ultimately got cut) and the Valve dev commentaries showing what happens when they don’t have to cut features to meet a schedule on account of not having committed to a release date (The gunship mine-storm and shooting down rockets were both originally bugs). Crunch itself generally happens after they officially stop adding new features in preparation for going gold, but designers spend time and effort on features that don’t make it in. Of course, in big studios big features require management approval, but the designers can be quite enthusiastic about them too and underestimate the time commitment and difficulty.
Wouldn’t the defenestration example create a Kobayashi Maru, though? Defenestration is there, as a janky emergent property, so you have three options:
1: Burn money implementing and testing the feature.
2: Burn money implementing and testing air-tight blocks on it, which is technically a feature, too, but not one for the box art or promo.
3: Remove the breaking windows and/or throwing over railings features, essentially burning the money you spent implementing them and potentially contradicting existing promotional materials.
So, yeah, burning the cash to get it the rest of the way there may seem like the easiest option, and you can try to pass it off as plussing an existing feature in the spec.
I would generally think making windows unbreakable is the cheapest option. Except for the price of not being able to punch people through windows. Which is always a crowd-pleaser.
Exactly, but then you’re writing off the cash you spent making them breakable, and spending time and money flipping those bits, and testing to make sure it actually works. What if the window being unbreakable but still a distinct object from the wall means mooks phase through it? That’s why it might be tempting to polish a janky but mostly known emergent feature. You can always tag large mooks and bosses as unthrowable.
In the given example, breakable windows was an emergent “feature” that came from a combination of “breakable objects” and “throwing mooks over railings.”
Just because you flip the bit to make windows unbreakable doesn’t mean all that code is wasted, because presumably there are already other breakable objects that use the same code. And even if the work already finished goes to waste, writing off the resources already spent on breakable windows is absolutely what you should do if the effort they require isn’t justified by the improvement in the quality or marketability of the game–otherwise, you are falling for a quintessential sunk-cost fallacy.
Either way, implementation is definitely not the cheapest option.
It causes schedule overrun fairly often, or if it doesn’t, it causes crunch instead, which is Shamus’ point: self-imposed occasional crunch to add something cool happens and is okay.
I work in business software and we add features like this all the time as we work on something else. Most of the time it’s something extra related to what was originally planned that we realize really helps the user or solves a corner case no one thought of earlier. It helps the immediate user now and makes something we can point at when we’re selling the product to another customer. Skipping it to keep the schedule would be borderline unethical or shooting ourselves in the foot. If a few weeks now saves us a month or two of support calls or complaints it’s totally worth it. But it’s hard to tell what single thing is going to be the big deal all the time.
You have to draw a line somewhere, but there are no hard and fast rules on it. That’s one of the classic mistakes of software development: trying to apply manufacturing rules to something that is not the least bit like manufacturing.
One of the things I’ve found along my career trajectory as a Software Engineer is that the more experience I get, the more accurate my project-time estimates become. Likewise, the more time I spend with a junior teammate, the better I get at assessing how much time their specific contributions to a project will take (as well as getting better at identifying the tasks to give them that are just challenging enough to help them grow but not so much that they hit a wall).
But, importantly, while this tends to work well on projects conceptually related to your area of expertise, it fails utterly the moment you step more than a hair’s width outside that scope. I do C# enterprise-level software and SQL, and I’m great at it, but I wouldn’t even know where to begin on something like a C++ based graphics engine. The relation, then, is that I could schedule the development of a DB schema down to the level of a day in many cases, but couldn’t schedule the development of a graphics engine down to anything more accurate than perhaps a quarter.
And even more importantly, there seems to be a tipping point in realization of this weakness. What I’ve seen time and again is that the difference between a Senior Dev and a PM/Lead Dev is that the latter is typically one of the former who’s developed a sense for where the knowledge boundary lies, so that in addition to assessing their own (and their team’s) programming abilities (and the scheduling of project tasks that goes with it) they are also better able to grasp how far afield the task at hand is and be able to buffer the scheduling accordingly.
And, ultimately, that would be the basis for my answer to the issue of padding the schedule; I start with a base padding multiplier (for me it’s 2.5, based on personal experience: the original estimate plus 1.0x for bug hunting and 0.5x for polishing) and adjust that based on my familiarity with the project scope. Then, to account for long-term uncertainty, I add an escalating buffer of 1 week per month to that number (that is, the first month adds 1 more week, the second month adds another 2, and so on, though I usually cap it at around 4-6 weeks of added time for any given month of the prior step’s estimate). So far, this non-precise equation has given me very good project estimates, though again, a lot of it is based around how good you are at the initial estimation of your own skills.
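The heuristic above can be sketched in a few lines. This is only my reading of the commenter’s rule of thumb; the function name, the `familiarity` knob, and the assumption that a “month” is four weeks of the padded estimate are all mine, not from the original comment.

```python
def padded_estimate(base_weeks, familiarity=1.0, weekly_cap=5):
    """Sketch of the padding heuristic described above.

    base_weeks:  raw "if nothing goes wrong" estimate, in weeks.
    familiarity: nudges the base padding up or down depending on how
                 close the work is to your area of expertise (assumed).
    weekly_cap:  cap on extra weeks added per month (the comment says
                 roughly 4-6).
    """
    # Base padding: original estimate, plus 1.0x for the bug hunt
    # and 0.5x for polishing, i.e. 2.5x total.
    padded = base_weeks * 2.5 * familiarity

    # Long-term uncertainty: month 1 adds 1 extra week, month 2 adds
    # 2 more, month 3 adds 3 more, and so on, each addition capped.
    months = int(padded // 4)  # assume a month is ~4 weeks
    for month in range(1, months + 1):
        padded += min(month, weekly_cap)

    return padded

# A 4-week raw estimate pads to 10 weeks (2.5x), which spans ~2 months,
# adding 1 + 2 more weeks of uncertainty buffer: 13 weeks total.
print(padded_estimate(4))  # -> 13.0
```

The point of writing it out is that the escalating buffer grows faster than the raw estimate, which matches the intuition that uncertainty compounds on longer projects.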
You can draw a lot of comparisons between game development and regular(?) software development but you wind up with a pretty big X factor that separates them. Games are entertainment. In a previous life one of the things I wrote was an offsite tape storage management program (yes, it’s exactly as exciting as it sounds) and at no point did I ever have to worry about whether or not it was fun.
I think the real difference is in how many different external dependencies you’re building your product on top of. How many driver features, APIs, frameworks, and development tools do you need to make it all happen?
That’s important, because every single one of those pieces of software has its idiosyncrasies. They all have intermittent bugs, documentation mistakes, constant updates, and bizarre race conditions that all have to be ironed out at the same time.
Modern games have so many interconnected systems–most of which devs didn’t actually write themselves or have direct control over, they’re just importing them and making them work–that it’s a miracle they ever work at all.
If you’re a distance runner, having energy for the last sprint means you paced yourself poorly. Ideally, you want to be barely able to keep going for the last 1/4 of the race.
Distance runs usually have a pre-defined length and course.
Depends on the distance, doesn’t it? I thought in the shorter distance (i.e. non-sprint) runs like the mile, you do want to be able to surge at the end.
Nah, you want your pacing as even as possible. You might be able to wring a little extra out at the end because it’s easier to throw everything in when you’ve only got to maintain it for a few seconds, but you want to be pushing yourself the entire time. You never want to intentionally save energy for a sprint at the end.
Ten seconds per mile is a fairly minor difference. Running twice as fast for the last ten seconds of the race is impossible unless you’ve been taking the rest of the run way, way too easy. For people who are actually decently fast in the first place, a mile run is nearly flat-out all the way. A “sprint” at the end would barely be faster than their normal pace, and would come nowhere near making up for the time lost by trying to save energy.
That’s only if you’re trying to psych out other runners. If you can reserve some energy for the late run, maybe the other runners try to compensate by running faster, and maybe they burn themselves out and you get ahead of them.
…Except that for a long-distance race, you know the distance and the conditions, and in game development (or any sufficiently complicated project, really), the remaining distance is not really well known, and you do need to keep some reserves in order to be able to react to unforeseen obstacles (whose existence is entirely foreseeable but whose nature and severity is not).
So maybe it should rather be compared to a football* tournament? You should never slack, even against opponents you consider weak, but if you play on 110% all of the time, your team will be spent before the end. You need to keep some reserves to deal with unexpected problems. If the opponent scores a lead 5 minutes before the end of the final match, your team absolutely must still have the ability to get up and mount an attack.
The thing that Alex St. John doesn’t understand (or chooses to ignore) is that there is a long-term sustainable level of output. You can go above that and increase output a lot in the short term but if you spend too much time in that state, it’ll come down to below 100%. Way below 100% if you ignore the warning signs. Think about this: One can ride a horse to death. Humans don’t die as easily from over-exertion but they can still take away long-term or permanent damage, and a marathon runner won’t be able to run another marathon on the next day because all the reserves went into the last quarter of the last race.
* as in “foot” and “ball” :) American football may be similar but I don’t know enough about that
It’s actually more true in American Football because of the clock rules. Tackles don’t freeze the clock, but incomplete passes and the ball carrier going out of bounds do, and calling a timeout freezes the clock until the start of the next play. So in a “two-minute drill” teams make a lot of long passes, burn through their stock of timeouts, and have a very realistic shot at getting in a touchdown in the final two minutes.
I just want to jump in to cautiously agree with “voluntary crunch by people who are trying to push the medium is a good thing,” with one big caveat: voluntary short-duration crunch is a good thing. Long-duration crunch is still bad for the company, whether or not it’s voluntary.
Workers who work extra hours of their own volition because they’re passionate about the project are fantastic. They’re worth their weight in gold. Workers who work extra hours all the time are likely to tire themselves out and become less valuable over time, even if nobody asked them to work those extra hours; I’ve seen this happen to people several times (and done it myself, once).
As a manager in a game development job, one of your most important functions (after hiring, results, and delivering timely feedback) is to make sure that your directs are working in a sustainable way. If you’ve got someone who’s doing fantastic work and is really excited about the project and so is always working 12 hours per day of their own volition, you need to encourage them to leave earlier, get some rest, and come back the next day feeling refreshed. Working late once in a while, or in the days leading up to a big milestone is fantastic and helpful and always appreciated. But doing it every day for weeks or months on end is just begging for burnout, probably at the exact moment when you actually need them most to be on top of their game.
As a manager, you’re not doing your job if you allow your workers to voluntarily crunch for any non-trivial length of time. Those people are long-term assets who are building value for you; it’s in your interests to keep them happy and healthy, even if they want to overwork themselves! (Also, they’re probably really nice people, and you should want to keep them happy and healthy for that reason alone! But I mean, the argument against unsustainable crunch, even when voluntary, works from a purely financial standpoint as well.)
Absolutely agree with this. If you have people who haven’t learned to pace themselves to avoid burnout, then you need to do it for them. You don’t want to be the learning experience for them where they tanked a project by burning out at the critical moment.
Voluntary or not, long crunches are deadly for productivity in the long run.
The benefit of voluntary crunches, though, is that most people can sustain them for longer because if work does not feel like work, it doesn’t exhaust the worker as much. And of course, if it’s voluntary and people realize they’re over-exerting themselves, they can cut it back, too.
The next big topic, then, is of course what “voluntary” means … there are lots of subtle ways for employers to “encourage” employees to do things they’d never be allowed to directly ask for, without ever mentioning them; simply by creating an atmosphere where certain voluntary things seem to be expected …
Like … if “everybody” (actually less than half the people, but you don’t know better) is doing “voluntary” crunches, and the boss mentions this in passing, noting that you aren’t. But don’t mind, it’s voluntary! We’re picking the dev team for next year’s game, and of course the people with the highest output will be first to get places. Unfortunately, we may not be able to offer everyone a place on that team…
The other side is that different people have different levels of endurance. For example, a good runner might be able to run a quick half marathon a day, but I’d be hard pressed to run a slow one once a week. This can be fixed with endurance training, weight loss, special diets and so on. The equivalent mental endurance to do high level, focused work for 60, 80, 120 hours a week? The only advice I see given is to just do it until you can do it, which doesn’t have great results and is basically a filter for people with natural talent, or who are medicated for existing conditions in ways that increase mental focus. So either the management who want these hours need to create training regimes, and ideally share them with universities and schools so new hires don’t start from zero, or have a filtering apprenticeship scheme so they aren’t doing severe damage to people who can’t hack it.
That’s only if we take the argument that the hours are ever really feasible seriously, though. But taking bad ideas seriously is often an efficient way to debunk them.
A big factor in this is that many people will say they weren’t under significant stress even when they are, because a) they don’t want to look weak or like they’re complaining, and b) they don’t realize how they’re getting incrementally more nervous, short-tempered, and less focused, or how various small health annoyances are getting worse … until it’s pretty far advanced.
That last bit happened to me, and I’m happy to still be mostly intact.
As I see it, there is however a thing that management can do to make things easier when crunch sets in: the less something feels like work, the more people can tolerate. The really really good people often don’t perceive what they’re doing as work which they have to do, but as the thing they’re fascinated with and allowed to do, and even get money for. This may in fact be what St. John relates to in his article. However, that attitude is very very hard to maintain if the stuff you’re told to do doesn’t square with your idea of what you should be doing, or if your manager keeps threatening people with layoffs to push them to perform… in short, if management pushes the employees rather than pulling, or motivating, them. In a friendly, encouraging environment where input is welcome, people can “crunch” for much longer than under threat.
The point where I disagree with St. John is that it was somehow the employee’s obligation to get into that state of mind and stay there. Nope. It’s management’s task to find motivated employees, and then keep them motivated. St. John demands high motivation and high output from his employees, does not want to pay much and then blames them if their motivation is spent after some time of getting nothing back from the person who profits most from their work … I’d call that an abusive relationship.
If you’ve ever seen the BBC show ‘Grand Designs’ you can really see this in effect in an industry outside of IT. People trying to make the best house they can, and even with the best builders and architects they still go overtime because the owner decided to change the front door during the project or something.
I think given another 4 decades or so, the IT industry might finally reach a point where this is under control. For example, you don’t pick a builder based solely on who can build it cheapest and fastest, and I think people in general will eventually take the same approach to IT development. You can only hear so many horror stories about bad projects before you realise that.
And at that point it’ll be the FutureTech people in their new industry where people are being put through hell because the management don’t understand how projects work yet.
Not likely. At some point, the world had built enough houses that it figured out houses are pretty similar, and that it takes roughly X days to build a house with Y features, because it has been done a hundred times before. I work in software, and I actually had this experience: I spent eight months working on a series of all-but-identical data upload tools, and by the end of it I could actually give an accurate estimate of how long it would take to build another one.
But in general, software projects are a lot more varied than houses. You’re almost never going to get a situation where this exact thing has been done one hundred times before, because if it had, then no one would need you to build the 101st program for it; they’ve already got one hundred to choose from. So every software project is going to be newish territory, and we’re back to the problem that the only way to estimate completion time is to find all the new problems and figure out how long it takes to solve them, at which point you’re already done.
Actually, regular small houses may work like that but larger projects? Representative buildings? A friend of mine is a civil engineer (Bridges, tunnels, that sort of thing), and most of the things run between 50% and 100% over budget and time. Berlin has been trying to get a new airport up and running for a decade now, and it was “almost done” five years ago. Current state: probably not done by 2017…
I’d say that AAA games are much more like large building projects than one-family ready-made homes. And even for those, there’s still a huge danger of overrunning costs, because most families who build a home still have a problem deciding whether the more expensive option is also more reliable or just more expensive…
I agree with most of this. I observe, it’s not JUST software. Large building projects are usually late. Large reorganisations are usually late (and often failures).
Are you familiar with https://en.wikipedia.org/wiki/The_Mythical_Man-Month? It says that crunch time is often not really more productive. I think that’s only partially true — if you’ve primarily done the conceptually difficult bits, and need to handle a lot of things that can only be handled right then (eg. fixing bugs after launch needs to be AS QUICK AS POSSIBLE), then you may be able to work longer.
But I’m even more sceptical than you that it’s ever a good idea. OTOH, we mostly agree there should be LESS crunch, even if we disagree where it should stop.
But I also think that if everyone plans for an N-year project with N months of crunch, and (even if they don’t say so) a firm release date, then when you come up with new bits of work during that crunch time, you CAN’T do them: unless you’re willing to release late, you HAVE to cut corners or cut features or skimp on reliability and testing. Because you usually physically can’t work more than 80 hours a week without screwing it up worse than you’re helping. And if you’re on the ball, you realise that, and have to make do with what you’ve got time for. But if you’d planned a longer schedule to begin with, with generous slack time for “future good ideas”, you’d be in a similar position, assuming you had the discipline to accept SOME good ideas that fit into the slack and reject ones that required overtime.
There are two possible reasons that doesn’t work: (1) time at the end of the project is more valuable and you CAN’T do that work earlier, or (2) people are willing to do the crunch, so they end up doing it. I expect the truth is somewhere between, but I’m not sure.
The Mythical Man-Month is a fantastic bit of reading for anybody involved with software planning and implementation, but Brooks has good things to say, actually, about ‘hustle’ (the ability to catch up when the schedule slips), and nothing that I can recall or find about crunch time, specifically. Brooks’ book is mainly concerned with the difficulties of communications overhead within project teams, IIRC.
(Possibly, you have TMMM mixed up a bit with Peopleware, which does seem to get into questions about the value of overtime. I can’t discuss that one with any confidence: I’ve owned a copy for ages, but, alas!, never quite gotten around to reading it.)
Doh! Well spotted, thank you for pointing it out. After some thought, I found what I’d thought of, from Joel on Software: http://www.joelonsoftware.com/articles/fog0000000245.html
I read that ages ago, and vividly remembered it, and then later found about half of it much more concretely explained in MMM, but muddled it all together in my mind. The extract I remembered is:
“When you think of writing code without thinking about all the steps you have to take, it always seems like it will take n time, when in reality it will probably take more like 3n time. When you do a real schedule, you add up all the tasks and realize that the project is going to take much longer than originally thought. Reality sinks in.
Inept managers try to address this by figuring out how to get people to work faster. This is not very realistic. You might be able to hire more people, but they need to get up to speed and will probably be working at 50% efficiency for several months (and dragging down the efficiency of the people who have to mentor them). Anyway, in this market, adding good programmers is going to take 6 months.
You might be able to get 10% more raw code out of people temporarily at the cost of having them burn out 100% in a year. Not a big gain, and it’s a bit like eating your seed corn.
You might be able to get 20% more raw code out of people by begging everybody to work super hard, no matter how tired they get. Boom, debugging time doubles. An idiotic move that backfires in a splendidly karmic way.
But you can never get 3n from n, ever, and if you think you can, please email me the stock ticker of your company so I can short it.”
I’m an EMT and I’d just like to say that the broken glass cuts will begin bleeding immediately.
Will the impact of the throw splatter some of the blood inside, or will it pretty much all be outside though?
Cuts from a (non-safety) glass window tend to go straight to the bone, because broken glass is incredibly sharp and usually breaking a window takes a fair amount of force. There’s a certain type of young man who likes to express themselves by hitting things, and when they get tired of brick walls (and faces) and go for a window instead, they really do make a mess; muscles, tendons, the lot.
So, to answer your question – there would be blood everywhere.
It’s discussions like this that make me glad I’m not an EMT. On many, many levels.
Crunch bars are delicious and the slander in this post makes me livid with unspeakable rage.
It’s the candles which should be suing for defamation if you ask me! Mmmm, comparatively tasty chocolate-flavoured candles…
Preach it Shamus!
I work for a software company. Nothing as flashy as games, but we have some SaaS and non-SaaS products. We used to frequently deal with scope creep, shifting release dates, questionable feature market justification, and buggy code.
About a year ago we took on a new Product and Engineering manager who said, “Let’s stop writing code and start shipping software.” He instilled a discipline around the team where nearly every time an unexpected issue came up, the answer was to adjust scope. From his point of view, the dates can’t move easily, it’s hard to spin up engineers so resources can’t move easily, therefore you move the scope to something you can do. As a result, we release on time, have less crunch, and are more predictable for the business.
That said, as a mostly SaaS software company we have the ability to easily push something into the next release, which is not so much the case for something like a video game, especially consoles.
Lots of developers don’t like this guy. From the business standpoint, he’s exactly what we need to succeed. However, some of our developers feel like the predictability tempers their creativity.
This is a tough balance.
I almost understand this — but can you define “adjust scope” for me please. Thanks!
Adjusting scope means changing what you expect the project to accomplish. The scope of a project is every feature the project includes, so adjusting scope basically means adding or removing the features you want to include.
I guess what both Shamus and you are saying: There are always unpredictable elements, so something must be adjusted. And that happens by either moving release date, adjusting the scope or crunching. Since each of these carries costs with it, the solution will have to be some compromise between them. For management it’s easiest to have people crunch (i.e. carry the risk of delays), for programmers it’s easiest to delay release.
I’ve felt the demotivating effect of adjusting scope downwards, but when it comes during crunch time (which is almost always for me…), it’s actually mostly welcome. Since I work in well-defined projects with fixed time windows, those are the only options, which is sad, because it makes me a lot less proud of my work. Fortunately, there’s always the next project.
Software as a Service. It’s basically when people run software on their servers and rent it out to companies.
SaaS is Software as a Service, the subscription model. There is one version of your software, often cloud-hosted and managed by the company itself so customers don’t need to configure a thing. In games, we call them MMOs. Much easier to offer customer service when you don’t need internal machines running every separate version of your product under service in order to replicate the scenarios given by customers with three-versions-outdated builds of your product. For business software it’s a clear win, for consumer software (e.g. Office 365 vs Microsoft Office) I’m a little iffy as buying individual versions is a one-and-done purchase which will provide a serviceable feature set for as long as you can get it to run.
Having frequent, free updates creates the same development effect (in SaaS the updates are just included in the subscription fee) – stick to the release schedule because who cares, your new feature will come out in the next build two weeks from now. You only really need to worry about pushing a feature into the product when missing the target means your work won’t see the light of day for years (or never, if the game flops and takes the studio with it).
Software as a Service. I obviously don’t know what this person works on, but all kinds of web-based forms to fill in, webmail solution things, cloud-based stuff, dropbox for example.
What the others said about Software as a Service is correct. MMOs are a great parallel.
We have a client application and a server application. Server is on our side. We push updates to our server and their client on our schedule.
The reason it’s different than games as boxed product is because if you pull or modify a feature in SaaS, you can just wait a few weeks or months, then put it in when time allows.
I work as an internal web dev for a company, building applications that are only used within our company. Our policy is that we can make scope changes while in the middle of the project, but it’s going to push back the release date. Not going to work for you? Tough bananas.
Of course, if you’re building something for commercial release, you don’t always get the luxury of telling your users that–you need them more than they need you.
(This has also taught me an incredibly useful trick–if you’re not sure how important a change is, make the user do something about it. (e.g. collecting some data you need) If it’s a serious issue, they’ll put in the effort without complaint–but somehow magically 99% of the time it’s no longer a critical issue, once you ask them to spend five minutes of their own time on it.)
I’m not completely against “crunch”; there are obviously times when working overtime is the best alternative. The problem in the games industry is that developers don’t always get compensated for their overtime (this seems illegal to me, but my understanding of American labor law is not exactly perfect), and because of that, companies are not financially punished for overworking their staff.
American labor law has “exempt” and “non-exempt” categories. (Exempt from overtime.) The exempt group includes managers, and high-paid people like programmers. I think there was just recently a move by the Obama administration to lower the ceiling for exempt employment, forcing companies to pay more people overtime.
But it won’t matter much. Look at large construction projects, which run late all over the world. They face financial penalties, and they do pay overtime. And you would think the world had learned everything there is to know about construction some time ago. But they still can’t schedule worth a damn, and customers still want the job done more or less on time. And so you have a crunch period.
Virtually all salaried workers fall into the exempt category (although the new rule, if it goes through, would put some salaried workers into the non-exempt group). And most game devs are probably salaried. There are very few legal limits on the treatment of exempt employees, unfortunately, and it is 100% legal in the US to require exempt employees to work as many hours as you like. (although, if they work less than 40 hours, you can’t cut their pay)
And FWIW, these designations aren’t determined by companies – they come from the federal government. So a company can’t arbitrarily decide a particular group is exempt so it no longer has to pay them overtime. (And, interestingly enough, a non-exempt employee can’t choose to forgo overtime either – no matter what, they MUST be paid it.)
To further complicate matters, in many countries employees typically have a contract that lays out things like numbers of hours worked, etc., but in the US, most employees don’t have contracts either.
Eh, the federal government regulates the categories, but the regulations are somewhat vague so the companies have some leeway to be creative in exempting. Some of the managers at my first employer were absurd about that.
As I understand it, a company can categorize an employee as non-exempt if they want; they just classify people as exempt whenever they can, since it’s cheaper.
As an aside, I completely forgot about the income limit changing for exempt status. IIRC, it was a pretty significant boost: something like $52,000/yr or less would be non-exempt. (I remember this because I made $52,500 when Obama proposed it.)
The current exempt ceiling amounts to about $23k! (Technically, $455/wk)
EDIT: The employer can’t choose whether an employee is exempt or non-exempt (except insofar as they can count on employees not suing them if they get it wrong). But they can choose to pay non-exempt employees on a salary basis, which might be what you’re thinking of. In that case they still have to track hours to make sure the non-exempt employee gets overtime pay when they go over 40 hours and never falls below the minimum hourly wage (because the non-exempt employee is literally not exempt from overtime and minimum wage requirements).
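The arithmetic behind that tracking requirement is simple. A rough sketch, with entirely hypothetical numbers (real rules – state law, the fluctuating-workweek method, etc. – are messier):

```python
# Sketch of weekly pay for a hypothetical NON-EXEMPT employee paid on a
# salary basis. Figures are invented for illustration only.

WEEKLY_SALARY = 800.00   # hypothetical salary covering a 40-hour week
FEDERAL_MIN_WAGE = 7.25  # federal minimum wage, $/hour

def weekly_pay(hours_worked: float) -> float:
    """Salary covers the first 40 hours; time-and-a-half after that."""
    base_rate = WEEKLY_SALARY / 40  # implied hourly rate
    overtime_hours = max(0.0, hours_worked - 40)
    pay = WEEKLY_SALARY + overtime_hours * base_rate * 1.5
    # The effective hourly rate must never dip below minimum wage.
    assert pay / max(hours_worked, 1) >= FEDERAL_MIN_WAGE
    return pay
```

So a 50-hour week pays the $800 salary plus 10 hours at 1.5× the implied $20/hour rate – which is exactly why the employer has to track hours for non-exempt staff, salary basis or not.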
The new proposed ceiling is $50k-ish, which sounds good to me because the current ceiling is absurd. I thought some compromise on that amount actually got through, but I guess it didn’t…
I was actually thinking a company could pay an exempt employee overtime, if they felt like that was needed to stay competitive in benefits or because the employee is part of a union or something.
That’s actually something that happens at my company – the other two programmers, who have been doing this for thirty years, started off as unionized technicians and were promoted to programmers from there. So even though they would qualify as exempt professionals, their continued union status gets them overtime benefits (in addition to other union benefits). And there’s nothing illegal about it: nothing in the law says you can’t pay exempt employees as if they were non-exempt.
I think the biggest move away from paying overtime lately has come from companies classifying workers as “independent contractors” rather than “employees.” No matter what position they’re hiring for, that gets employers out of paying overtime. Also, it is ridiculously easy to be classified as a “professional” – a classification that should be reserved for the likes of engineers and programmers – such that I think $10/hour play-testers get classified as “exempt” for being “professionals.”
Personally I think “exempt” status makes sense for employees whose jobs aren’t meaningfully measurable in terms of “hours worked” regardless of how much or little they’re paid. Obviously it takes time to write code, but a programmer’s job is to write a program rather than spend X hours a week writing programs. In other jobs, particularly service and retail, the time commitment is directly valuable. So it makes conceptual sense to pay programmers a fixed salary for producing results in however much time it takes them personally rather than financially incentivizing them to be less productive. It’s open to abuse by setting unreasonable timeframes, of course.
Oh true, it’s not illegal to track the time of your exempt employees and give them overtime. It’s disadvantageous to the employer, of course – that’s probably why the example that came to mind for you involves a union. An employer who just wanted to stay competitive in benefits would probably offer bonuses during crunch time. Truly paying overtime would require tracking hours, which is a pain. And of course a lot of exempt employees see not having their hours tracked as a benefit, since hour-tracking tends to entail very little internet surfing, strict start/stop times for breaks and lunch, etc.
Are you familiar with the “EA Spouse” blog post incident? I believe part of the fallout from that was EA having to reclassify hundreds of employees as non-exempt.
That whole Feature Creep section reads just like something that actually happened, with like, a couple of names changed around to protect anonymity. Quality stuff.
Particularly from what is clearly a bloody Punch Guy fanboi. FIST4EVA!!
I wouldn’t have thought there would be much cross-over in the demographics for Punch Guy and Fist Man. The latter sounds like a very different sort of game….
It has been proven to be. The same arguments were used back in the industrial revolution to protect greedy factory owners from “those entitled unions”. But once the laws protecting the workers finally came to be, productivity still continued to increase.
And before you go on to say “but a creative industry like software development can’t be compared to repetitive factory assembly”, I have two words for you: movie industry. The same thing happened to it when it was young and the studios were greedy. And did that industry crash and burn because they couldn’t exploit their actors, coat them in lead paint, and injure them in various other ways?
Yes, scheduling and management is hard. Which is why every big business pays managers to do it well for them. But if you promote your lead programmer into a position they know nothing about, of course they’ll suck at it. Similarly, you can’t just put a random manager on top just because they have done so well in auto assembly. You wouldn’t ask them to manage your skyscraper build site, so why would you have them manage your software company? But if you put the right people in the positions suited for them, they will schedule things well and you will rarely go over budget or over time, you will not require permacrunch, your employees won’t be forced to leave in a year, and you will still make a profit. Not as much profit as you can make now, but if that bothers you then you are a greedy asshole.
Productivity continued to increase because that’s what happens when you industrialize. More mechanization and automation allows a single human worker to produce a lot more. We didn’t go from needing 9 out of 10 people working on the farms to 1 out of 40 because of unions.
Meanwhile, union jobs in the private sector have been killing themselves off. Manufacturing, in particular, has taken it on the chin.
Again veering dangerously close to political discussion, but still. I want to point out that, while in some cases unions have shot themselves massively in the foot by demanding unrealistic things, in many cases – manufacturing for example – the “massive job loss” isn’t so much because of unions as because of the *lack* of unions elsewhere, and the lack of trading restrictions. A shoe made in a Bangladeshi sweatshop is cheaper than one made in the USA, even counting transportation and import fees, so companies make them over there. Does that mean American workers should accept 19th-century-style labor conditions – that we (I’m actually European, but anyway) should allow industry to pollute and work people to death? Or does it mean we should probably try to make sure the Bangladeshi earn a living wage and get to take their kids to school, like we can?
There’s a balance to be found between “abuse your workers ’till they drop dead” and “work 2 hour weeks for full pay and 100 paid vacation days a year”. In a completely free market with MegaCorp vs Individual, the individual will get trampled and used and thrown away; some form of assembly/gathering of workers to stand stronger together is necessary to find a balance. Where and how that balance is to be found is a whole different matter (I’m not actually a fan of unions in their current form, myself).
Problem is, without cheaper sources of labor elsewhere, we wouldn’t *have* cheap shoes anymore. Whatever wage increases we would get from trying to tariff away imports and unionize everyone in the US, we’d more than offset with price increases.
There’s too much magical thinking wrapped up in the idea that every job can have a “living wage” and that we can arbitrarily raise our standard of living without finding ways to increase production and reduce waste.
Meanwhile, places like Bangladesh are benefiting by having external money come in and help them industrialize. The reason people work in those sweat shops is because they’re the best deal available to them, and it’s getting them through their industrial revolution faster than everyone else did when they were the first to go through it.
Also, the idea that a collective of workers is necessary in order to get good working conditions is provably false. A very small part of the private sector in the US is unionized, and working conditions are great in areas where labor is in demand. The best thing for workers is a strong labor market – companies will compete over skilled employees. Even basic, unskilled labor can demand better wages and conditions when there isn’t a massive surplus of it.
You’re absolutely correct as long as it refers to factory workers and the like. Stuff that is reasonably foreseeable can be planned pretty well. Not working more than about 40 hours per week (for some jobs, 35!) is actually maximizing output per worker, and fixed working times make it much easier to plan output, so everybody wins.
Delivering large complicated (creative!) projects, though, on time, on budget and without overtime is still not something I’ve ever seen. So between schedule, scope and working time, something always has to give. That’s what I understand Shamus’ statement to mean. You might choose (or be forced by law) not to allow overtime but then the other two items will have to carry the risk.
Hofstadter’s Law is pretty relevant with any sort of software development: “It always takes longer than you expect, even if you account for Hofstadter’s Law”
The reference to Hofstadter led me straight to the etymology of your username! :D
Wait, is his name not just “Master” spelled backwards?
Yes indeed! (There was no great mystery, to be sure – I’d never noticed before, and just found it interesting how my brain went something like Hofstadter-recursion-strangeloop-backwards-Retsam=master! in some vanishingly small percentage of the amount of time it takes to type that.)
It’s actually “jetsam” spelled incredibly weirdly.
actually, being able to instantly kill the final boss sounds like a great feature to me. but then i hate boss fights.
I love games that let you skip boss fights, as long as you’ve set it up ahead of time/figured out the clever strategy to do it.
Reminds me of some puzzles in Portal where the devs realized there were ways to reach the exit while skipping most of the puzzle. In some cases, where it was too lame, they reworked the puzzle to avoid the problem, but in other cases, they figured it took more skill to avoid the puzzle than just solving it the normal way, so they left it in. I like that attitude.
The solution is to not change your ongoing focus to be more accommodating to this glass punching, but rather keep it as is and then make a sequel: Punch Guy 3: Pirate Defenestrator.
We need a Punch Guy – Marlow Briggs crossover: indefenestratible II defenestration
I sure hope this becomes Pyrodactyl’s next project…
I’m going to punch you in the mouth!
With my mouth.
Because I like you.
Kiss of the North Star.
Obviously, you would map “kiss” to the same button as “choke.” Seriously, is this even up for debate?
By the way, have you guys heard the rumors about a bunch of people quitting at the Punch Guy studio?! Apparently, they are forming a new studio and Kickstarting a new game called “Defenestration Guy!”
Not going with “Defenestration Dude” was a missed opportunity for those guys, really.
I feel like Bioshock Infinite is a good example of why, even if crunch can be bad, you still want deadlines on games. They started work on it in 2007 and released it in 2013, and it was kind of a mess both gameplay-wise and story-wise. You could tell things kept being pulled or added or changed. It did have good moments but also some really WTF ones. It probably would have been helped by more coherent deadlines from the start, so there wasn’t as much throwing things out the window for the next cool thing.
Just make Punch Guy vs. Shoot Guy already so it can sell like hotcakes.
As a developer, one of the most infuriating things you can come across is the armchair expert who thinks their “why don’t you just X?” is some sort of infallible, amazing idea that everyone else was just too dumb to think up.
The problem is never the ideas. It’s the implementation and the fallout of all the problems it causes.
Thanks for detailing exactly why nothing in game development is ever easy or simple.
As a retired software developer/manager I find these discussions quite entertaining. The majority of comments are aimed at the problem, while some have actual experience either in software fields or in gaming. So all in all, Shamus attracts a good crowd to his discussions. Go team.
The basic failure mode of the AAA video gaming industry is the extended crunch time. It is not a sign of a healthy business/industry that this has been an ongoing topic for decades. They violate the most basic rules of software management. Of course, IT does it every single day in the vast majority of companies, so the gaming industry is in company. Not necessarily good company, but at least they are not alone. The IT problem is somewhat different, as IT management normally doesn’t have any experience with actual SW. They are trying to check off company buzzword bingo cards while enhancing their future hiring potential and saving money by cutting staff.
Do you want to fix the problem? If yes, then hire smaller teams. Keep them. Do not fire them. Assign them to the essential software tasks common to your games. Hire at least one software manager, not some beautician that got a table top game almost published (think Jon Peters in the movie industry – do not hire that person). A real software manager with real SW industry experience.
Let out your company specific taunts/rants here.
The problem as described in the post isn’t actually the real problem. The issue is that there needs to be a design loop. Adding fun things is how games work; that is not the normal method for SW. However, in SW it is very common for new ideas to come up that need some development time, to see if they will work in the very special snowflake environment the team is in. I have read several accounts of this happening in many game teams, so it is not an unknown solution.
Also, the SW industry multiplier is not magic, nor is it bullshit. For someone outside the industry it may seem like it. However, the number is ~2.5 for most teams. The longer the team has been together, the lower the multiplier and the more accurate the estimate. One of the fun parts for me in reading about game development is that in SW this information was established in the ’60s and ’70s. And still, the game companies normally flush the team out the door – to save money. So the next project carries the unpredictable cost of building a new team up front, which may cost more than they saved. Absolute genius.
The bad news is this always takes an effort to learn. It is not just flip a switch and the light will glow. It takes a company with management that understands the situation and knows that it is solvable. Thus it is not the problem the game industry makes it out to be. However, it is not a problem they actually want to solve. At least I don’t see that happening in the gaming business as a whole. Bethesda has some teams that have very long histories, I am sure you can think of or know of others.
Remember this, games generate an amazing, lust inspiring amount of cash. It draws people that have money but no skills, or their only skill is getting other people’s money. Either way it does not bode well for controlling crunch time. The people driving this behavior are after the money, and not caring about their people/teams is evidence of that.
Side note: As a young man I did some construction work. We added on to existing buildings, and when I was first tasked with estimating a job it was very rewarding. However, the job ran long and I thought I was in trouble. Nope – my boss had multiplied my estimate by 3. This works in many businesses; we are bad at estimating. The more experience, the less is unknown, and the more accurate the estimate. That is the loop the gaming industry is not trying to learn.
As a program manager, I cannot support this comment enough. In my area of expertise, people often make decisions in a vacuum and do not consider past projects as relevant to current projects. On a technical level that may be correct, but when attempting to judge how a given team will perform, those historical projects are incredibly important.
The only time that crunch time should be your only option is if you do not know the team that is working for you. The more historical context that you have for a given team the more accurately you can project how that team will perform.
This is possibly going to be a really dull post. But point number one is both right and wrong.
Scheduling is hard. Lots of people get it wrong. But the imposition of crunch is a deliberate decision by managers to prioritise the ‘Time’ success factor. Any project – and the creation of a game is undoubtedly a project – is defined by its success in meeting set Time, Cost and Quality criteria, which can be prioritised by management to allow trade-offs. Crunch costs money and affects quality, so these are obviously being traded by management to hit a time requirement.
What Shamus sets out (you don’t know what is going to go wrong) are risks – the potential for an occurrence to negatively affect the project (in time, cost or quality). In any long-running company, many of the risks for a project will be known – you know there is a risk of a major bug occurring towards the end of development. You also know that some bugs will be found in software. Therefore, in scheduling terms you define timescales as an uncertainty bound (minimum/most likely/maximum) and take an Uncertainty Orientated Mean ((Min + (4×ML) + Max)/6). You then carry out a Monte Carlo analysis of the risks and can identify a point at which, if you ran the same project 1000 times, X% would succeed (known as the P-level). The organisation/managers can choose what level of risk they are willing to tolerate – which P level to set their project success criteria at. Things that are heavily time-bound (like the infrastructure for an Olympic Games) are set at P90 – with a cost increase to achieve this.
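That whole procedure fits in a few lines of code. A minimal sketch – the task durations are invented, and I’m using a simple triangular distribution per task rather than a full risk register:

```python
import random

# Hypothetical (min, most_likely, max) durations in days, one tuple per task.
tasks = [(10, 15, 30), (5, 8, 20), (20, 30, 60)]

def pert_mean(lo, ml, hi):
    """Uncertainty-oriented mean: (min + 4*most_likely + max) / 6."""
    return (lo + 4 * ml + hi) / 6

def simulate(n=1000, seed=42):
    """Run the project n times, drawing each task's duration from a
    triangular distribution, and return the sorted total durations."""
    rng = random.Random(seed)
    return sorted(
        sum(rng.triangular(lo, hi, ml) for lo, ml, hi in tasks)
        for _ in range(n)
    )

def p_level(totals, p):
    """The duration that p% of the simulated runs finish within."""
    return totals[int(len(totals) * p / 100) - 1]
```

With that, `p_level(simulate(), 90)` is the P90 schedule: the date you commit to if you want a 90% chance of shipping without crunch, at the cost of quoting a longer (more expensive) timescale than the naive sum of most-likely estimates.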
So, while scheduling is hard, you can manage it to avoid crunch – most of the time. That crunch has been a fixture of the games industry for so long suggests to me that this process is not being followed, or that companies are not willing to invest money to avoid it (likely the last) – it is therefore the organisations willingly hurting their staff to achieve project timescales.
My guess is that people in the games industry would argue that their projects are by nature too unique to lend themselves to this approach, and thus the method was never attempted. I certainly don’t know enough about the industry to know if it’s a valid point or not.
That’s a false argument, especially with sequels. A game is no more unique than an Olympic park, certainly not with now decades of experience as to what the risks are in game development. What we’re talking about is making estimates better and so being able to limit or remove crunch – in effect crunch being a fall back should a major risk occur. You could take the risk register from the (successful to time, cost, quality, no crunch used) software project I ran three years ago and it would be applicable to games with just a few words changed.
What I would say is that some games are cutting edge. That means they are higher risk, and so costlier. If the organisation won’t fund that, then the project will go over time, over cost, or under quality – with the result that crunch is needed to ship on time at acceptable quality. This is no different from a new aircraft (for example), and is often why projects go late or over budget (A380, etc.).
The other main reason projects fail is requirement (for games, read feature) creep. Wanting new stuff not agreed at the project outset costs time and money, and if you don’t give the project that along with the new requirements, the project will fail.
“What I would say is that some games are cutting edge. ”
I think this is a big part of why there are so many disagreements in this thread. We all have different games in mind when we talk about how development “should” work. Building GTA V – an open world game with massive levels of hand-crafted content on a scale never attempted before at this level of fidelity – is going to be crazy difficult to schedule accurately, since you’re literally doing stuff that hasn’t been done before. (I’d love to know how they MAKE content for that game.) On the other hand, a standard corridor shooter should be pretty reliable. Some titles are high risk, some are extreme risk, and some aren’t any worse than (say) application software.
Apologies for briefly going off-topic, but congratulations on the Hugo Award nomination, Shamus!
Unfortunately, it was probably due to meddling from the Rabid Puppies. This year the Rabid Puppy nomination slate included a bunch of people from the Escapist for unknown reasons (Grey Carter of Critical Miss was talking about it on Twitter), and Shamus Young was one of the people the puppies had on their nomination list.
So although I think he’s wholly worthy of the nomination, I do not believe most of his support was in good faith.
(If you have no clue what I’m talking about with the puppies thing, there’s a Guardian article about it)
I was aware of the problems regarding the Hugos, but I absolutely wasn’t aware Shamus was on the list. I can’t really see why or how the Rabid Puppies would favor him; there’s really nothing right-wing about the writing here. That said, I do hope he wins :-)
I’m pretty sure they’re trying to get a clean sweep of “No Award” by nominating people who’d otherwise be supported by their opponents.
Shamus is in good company with Neal Stephenson, Neil Gaiman, Lois McMaster Bujold, Brandon Sanderson, Andy Weir, Stephen King, Alastair Reynolds, and Jim Butcher on their slate this year.
I assume they just grabbed a bunch of recognizable names from the Escapist’s contributor list, for reasons that are extremely political and so I won’t go into but should be easy to google. The lack of any particular introspection into their choices is entirely par for the course.
It’s almost like the rabid puppies just nominate stuff they think is good – which is what they said they’d do…
They’ve never just nominated “right wing” stuff.
You’re going to claim to my face that they think Space Raptor Butt Invasion is one of the five best short stories of the year?
Sad Puppies or Rabid Puppies?
Sad Puppies might be voting mostly for things they like.
Rabid Puppies are Trolling for the Troll Gods
Well, I mean, Vox Day said he was going to “burn down the Hugo awards in their entirety,” after the 2015 awards. That statement of intent… fills me with unease.
That said – setting politics aside, congratulations to Shamus! It might have come from an unexpected place, but frankly you really do deserve a nod like this after all your insightful writing.
I’m no puppies fan, but even a stopped clock can be right twice a day. I think Shamus is definitely worthy of a Hugo, so I’m not worrying about the process that got him on the ballot.
I just hope that the puppy involvement won’t keep people from giving Shamus a chance.
Huh – I thought it was going to be Best Novel (for the Mass Effect rant). And then I thought it was going to be Best Dramatic Presentation – Long Form. For the Mass Effect rant. It has been getting pretty dramatic at times!
You are as off base on Nestle Crunch as you are on point with Capt Crunch.
To answer the very first question about time scheduling: the Planning Fallacy and how to avoid it is a solved problem in domains where there are a large body of prior comparable projects.
Look at how many of those projects had delays that were not anticipated when they were planned, and extrapolate from the trend to estimate how many unexpected delays the next project will have.
If you consistently have more unexpected delays than you planned for, plan for more of them. That includes planning for emergent features that you didn’t know you wanted.
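The extrapolation itself is mechanical once you have the history. A sketch with invented figures – taking a pessimistic percentile of past overrun ratios rather than the mean, since planning errors skew toward “late”:

```python
# Hypothetical history: (estimated_months, actual_months) per past project.
history = [(6, 9), (12, 20), (8, 11), (10, 18)]

def schedule_multiplier(history, pessimism=0.75):
    """Sort past overrun ratios (actual/estimated) and pick a
    pessimistic percentile of them as the correction factor."""
    ratios = sorted(actual / est for est, actual in history)
    idx = min(len(ratios) - 1, int(len(ratios) * pessimism))
    return ratios[idx]

def adjusted_estimate(naive_months, history):
    """Reference-class forecast: naive estimate times observed overrun."""
    return naive_months * schedule_multiplier(history)
```

The point isn’t the exact numbers; it’s that the multiplier comes from measured history, not optimism, and it automatically bakes in the emergent features and surprise delays your past projects actually had.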
“a large body of prior comparable projects”
I think this would exclude videogames. Studios don’t share their budget and scheduling figures, which is the numerical depth you’d need to make this work. So you’d have to use the figures from just your own company. And given the ever-evolving nature of the industry, the numbers wouldn’t be relevant for long.
New console generation? New graphics engine? Breaking into a new gameplay genre? That old data isn’t going to help you, because you’re going to be facing a lot more problems than usual.
Although personally, I think new engines and gameplay are projects for small R&D teams, and it’s foolish to send an entire AAA staff into the unknown like that.
But there are publishers which have plenty of developers under their umbrella, like Ubisoft and EA. At least those should be able to have comprehensive data and planning abilities.
Even then, that’s not much data. How many games so far this console generation? And that data is spread over many genres. I don’t expect the FIFA / Madden schedules to be useful for the people trying to make Mirror’s Edge.
Sure, it’s better than nothing, but probably not enough to solve this problem.
But then, EA is still reportedly doing the perma-crunch thing with high staff turnover, so I imagine they have NO idea how long anything actually takes or how a project should work when people know their jobs. In this thread we’re trying to solve the scheduling problem so we can fix the end-of-project crunch problem, and EA supports ALWAYS crunch. The first step for them isn’t gathering scheduling data, it’s figuring out how to run a development house in the first place.
Supposedly EA knows exactly what it’s doing and it works for them because they can keep hiring new people when the old ones burn out. Remember that old EA Spouse story?
“The producers even set a deadline; they gave a specific date for the end of the crunch, which was still months away from the title’s shipping date, so it seemed safe. That date came and went. And went, and went. When the next news came it was not about a reprieve; it was another acceleration: twelve hours six days a week, 9am to 10pm…
This crunch also differs from crunch time in a smaller studio in that it was not an emergency effort to save a project from failure. Every step of the way, the project remained on schedule. Crunching neither accelerated this nor slowed it down; its effect on the actual product was not measurable. The extended hours were deliberate and planned; the management knew what they were doing as they did it.”
Precisely this. It’s not that it’s impossible to plan in software industries; it’s that they are malicious. And yes, we shouldn’t assume malice when incompetence is sufficient, but stories like that one show that malice is far more likely.
They’re incompetent in their malice. They’ve been doing poorly financially for quite some time and had a CEO resign in disgrace over SimCity, which had the morass of bugs I’d expect from lengthy forced crunch.
I don’t think it’s an organisational problem there really.
It’s an accountability one.
They aren’t financially responsible for their employees’ time. Crunch is such a large problem because the compensation is inadequate. Give them a decent rate, and you’ll see the amount of crunch drop dramatically.
They’ll tie bonuses to performance indicators. If they have that money, then that can go towards an overtime rate. You get paid for working, you don’t get paid for a review score. That’s insane.
Game developers need collective bargaining and union organisations. They need legislative intervention into the practices of companies like EA. Until they have that, you’re not going to see change, because working people like slaves is a damn fine practice if you’ve always got more coming in, and it costs them nothing.
The reference class isn’t “video games”. The reference classes are “game design”, “art design”, “programming”, and every department that actually has a schedule and staff that is non-fungible between departments.
No, your new console generation doesn’t invalidate your knowledge of “how long does it take to rig a model for this many animations”, any more than it invalidates your knowledge of how long it takes to record voice acting.
It’s possible to be wrong about what the trend is, but it’s also possible to predict when someone will be done with each step except bug hunting and fixing.
For the most part, crunch isn’t being used because the current project needed a larger fraction of its time for bug-fixing than the average project did.
They just really aren’t all that comparable most of the time. Only if you’re making something that boils down to a copy or minor iteration of something you’ve already done (same engine, same game code, same toolset, etc…) is that even potentially useful, and probably only to a point.
Every time you work on a new project, even if you’re doing something somewhat similar, you are usually combining something new into it. And while you may be able to gut-check an estimate that turns out fairly accurate, that has more to do with your expertise and ability to estimate than with any previous data you can point at. There are too many moving parts changing every time you start the next project for relying on past data to be a safe call.
An experienced developer giving an estimate usually includes some padding in areas they can guess might be potential pitfalls based on their past experience, but it’s not foolproof. Very rarely can you point at some feature and say: we did X with Y there and it took A, so now that we want X with Z it will also take A. Very often it won’t; then again, sometimes it might, but experience and iteration are often the only ways to discover that.
(Also hi Decius!)
It’s really interesting (and frustrating) to read about how you (I presume) have experienced development teams. I know there is a creative aspect to this whole endeavour*, but the whole point of a project lead is to do this cost-benefit analysis, not to shunt it up to the big guy giving the money and then ask them for more time… That’s bad management.
The project lead should be managing the project better. It doesn’t matter if people want to put in a feature – you don’t start implementing things before you’ve tried to fully assess the ramifications. I know the window is an easily pickable analogy (so I respect that you’d be frustrated with people saying, “but if it was like this…”), but the project lead should have handed the whole thing to a project team of one to three people, whose job over, say, three days would be to write a report detailing how many windows are in the game, what interactions those windows currently have with story and gameplay, and what consequential art/sound and design work might need to be done to get this new feature up to spec.
Maybe this happens and then we get the sequence of events anyway – maybe someone in the industry can enlighten me on this point.
You can’t get away with haphazardly approaching goals. It doesn’t work.
*Seriously, I get really offended when people bring up this argument – as if other industries that are not considered “art” do not have this same excuse! I’m a scientist and there are a lot of creative aspects to ‘science’ whether you are in a business or academia. I’ve also worked in inward and outward facing call centres (non-science related jobs!), print media and counter sales.
My experience tells me that most jobs can be creative and have artistic license to get things done and changed for the better. This is highly dependent on the people involved though. An unquestioning, work to the minute person will not think beyond their job and just go home at the end of the shift (which is fine). However, even people who do not love their job can and will think of improvements. The reason those jobs end up not being creative is because management stops everything in their tracks.
Now, many times, this feels like a bad thing on the part of the employee. I understand why it happens, but I feel that every suggestion should be considered as a real exercise – even if you put the employee themself onto it in their spare time. However, it is important that project leads stop this sort of behaviour in its tracks precisely because it wastes resources and time and causes more crunch.
When we’re talking about an industry with huge staff turnover, bleeding talent, mass layoffs and studio closures that lack of control obviously has a huge cost and it’s very irresponsible on the part of those leads and management to continue to act in that way.
With regard to the window example, there’s also the question of: if this is the part of the game everyone enjoys the most, does that mean we should have more window parts, or does it actually mean the core game needs serious work to make it more fun?
Wait, what? Not for Free Radical, but for Mass Effect stuff? And people say bitching and moaning will get you nowhere.
The cure for most of those issues is experience, just like for any other industry.
Experienced developers are better at identifying hazards early on, better at estimating costs, and better at avoiding overextending themselves. They introduce fewer bugs, and troubleshoot and fix the ones they do make faster.
Same goes for designers, producers, and artists. More experienced people work faster, and better.
Now, the issue which is specific to the video game industry is that as a whole, it lacks experience. It has a tendency to drive experienced people away. The reason is simple: experience = time. Experienced people = older people. Older people = people who have a perspective on life where other things matter than just their job, such as starting a family.
In an industry that continually screws up project management and forces people to crunch, older, more experienced people tend to reconsider why they are bothering to work in that industry at all, when they can get a developer job with an equivalent or often better salary outside of the video game industry – one that leaves them time to have a life outside of their work.
So it’s a vicious circle. Lack of experienced people -> bad project management -> crunch time -> people are driven away as they begin getting experienced -> lack of experience.
I have seen people wondering about working at AAA studios, and I don’t think it’s entirely an AAA versus non-AAA thing. I currently work at an AAA studio, and they have made a point of trying to hire mostly experienced people as much as possible, and it’s good. We are not perfect, and we do have crunch periods from time to time, but mostly we don’t. People are very friendly and professional, and out of the half dozen or so video game studios of various sizes I have worked at, this is the best professional experience I have had so far (also counting the non-video-game companies I have worked for, actually).
That said, it is quite possible that studios that manage to become AAA studios (in other words, to earn the trust of a big publisher to produce a big budget game) are doing so in great part because they employ experienced people.
“Now, the issue which is specific to the video game industry is that as a whole, it lacks experience.”
Actually, the problem with the game industry is not a lack of experience. The problem is a lack of information sharing. Somehow, somewhere, each and every development house got it into their heads that their “information” was proprietary – despite virtually every other industry out there sharing very vital statistics.
This has led to a dearth of information on what it takes to make a game, what changes to an engine can have an effect on the bottom line, and what segments of the market are healthy or not.
This is the single biggest problem in the whole industry. A very small segment of the industry is fighting vociferously for more information sharing, but the majority is still operating under the mindset of “this information is my information; it’s probably useful to you, so I won’t share it” – despite the overwhelming evidence from other successful industries that sharing information lifts all boats like a rising tide.
The gaming industry actually has a lot of experience. What it doesn’t do is share that experience.
I’m still disappointed about Stab Guy, but they should have known it would fail in direct competition with Stabbity Death and Live by the Stab.
I didn’t comment on the last post, mostly because it came down on the “crunch is a bad thing caused by other bad things” side of the argument, but I really want to come in and say that I hate the way this topic is discussed. No matter how obviously dire the conditions at an employer are, no matter how blatantly exploitative management is, there’s always someone ready to say “well, project management is hard and expecting an accurate schedule is unrealistic”. Even the most sympathetic articles throw in a “As an artistic medium, it’s unreasonable to expect there to be no unexpected extra work”. And sure, maybe that’s true. But I don’t care. Unless you’re willing to pay developers overtime wages, any company that expects its employees to compensate for its crap project management by sacrificing personal time is relying on coercion to cover management failures. Any company that plans and schedules for employees to sacrifice their personal time for the sake of the project has a business model based on theft.
If management wants extra hours at the end of a project, they are asking you to make sacrifices for the sake of the company. Ask yourself, would the company make sacrifices for me? Appeals to artistic quality and the difficulty of proper management are excuses for exploitation.
No, the reason that the gaming industry has such a persistent crunch problem is the same reason that game developers get paid less than other programmers and the same reason that they have less job security: studio management sees developers as expendable. They don’t care about bad publicity, they don’t care that this is technically an inefficient use of resources, and they clearly don’t care if employees burn out or quit. The average consumer doesn’t know or care about any of this, the arrangement works fine for the studio, and there’s a steady stream of idealistic kids to replace those who leave. This isn’t a problem that can be fixed by the occasional appeal to decency or public shaming.
But commentary to the tune of “yes but it's reasonable in moderation” doesn't help.
You’ll notice I left out all mention of pay, simply because that branches out into another topic altogether. I never claimed they shouldn’t be paid for overtime.
“But commentary to the tune of “yes but it's reasonable in moderation” doesn't help.”
Understanding the parameters of a problem are key to solving it. If you’re looking to condemn unhelpfulness, starry-eyed silliness like “There should never be crunch, ever!” is worse than useless. It tells managers they should simply solve the problem by doing something blatantly impossible. (Large software projects that hit a predetermined date with zero crunch may exist, but they are vanishingly rare.) It sets the bar so high that (from their perspective) there’s no point in making an attempt. That sort of demand is easily dismissed.
I’m going to follow your lead and hold off on “crunch is a labor standards issue and inherently tied to issues of pay and compensation”, but while I see the argument that expecting to actually meet “starry eyed ideals” won’t ever lead to happiness, I fail to see why we can’t expect an effort. If there isn’t an expectation that zero crunch is a goal, then zero crunch will never happen.
Even beyond that, I dislike these arguments because you’re not saying “we’ll never reach perfection, so let’s strive for being good”, you’re arguing for “since we’ll never reach perfection, let’s just all agree to never speak of it and just try for tolerable”. What you described in the post wasn’t a sympathetic example of devs getting in over their heads, it’s a fundamental failure on the part of project management to do their jobs properly followed by them opportunistically taking advantage of the fact that they can force everyone else to work extra hard to compensate. “Crunch on management failure” isn’t any more reasonable than “crunch on every deadline” and only better than “always crunch, crunch forever” because that’s the policy of a blatant sociopath.
If there’s crunch, it should be a failure, and someone should be responsible for it.
Crunch time should either be scheduled at the beginning of the project, or accountable to someone who misjudged something.
It’s not always bad to misjudge how much time a project will take, but it is bad to consistently misjudge the amount of time all of your projects will take.
Discuss scheduled crunch time along with salary negotiations.
I now want a game where you go around kissing people.