The concepts surrounding batch size have been floating around my head again in recent days. My current assignment is a rather large batch of work with Silverlight 2.0, but it’s one that I’ve been able to subdivide into a number of small batches that rapidly build on each other. Thus, each of my check-ins to my private branch shows continual improvement in what I’ve made available. Unfortunately, from a throughput perspective, the value of this is exactly _zero_. Why? Because until Silverlight 2.0 Beta 2 is released, we don’t have a GoLive license from Microsoft. Thus, this environmental constraint has forced me into a situation where I have a very large batch that can’t be released until a specific point.
What does this mean? For starters, it means that our board has a VERY large balloon of work in the middle, and that overall we have near-zero flow. Our work in progress is artificially limited to three requirements, but the actual size of one of those is 10x the size of the other two.
What are the implications of a large, all-or-nothing batch of work? Probably 90% of the features within the requirement would qualify as MMFs (minimum marketable features), so they can’t really be dropped. I can’t do partial releases due to policy constraints (although we’ll be bending that a bit since it’s not RTW). I’m going to think about this, and I’ll try to remember to post what comes out of it. For now, I’ve got more code to write. Mixing methodologies a bit, I should be able to trickle out a couple more inch-pebbles tonight.