First, if you want to use an Agile framework, try to understand what that implies.
Scrum can be difficult to master, even though it is a relatively simple framework. Recognizing
this reality, we’ve made Scrumban our chosen framework for introducing Scrum to teams and organizations for the first time.
Because Scrumban seeks to minimize the disruptions associated with imposing new definitions and responsibilities upon employees, rolling out Scrum under this framework is substantially different from traditional approaches.
Rather than starting out with Scrum-specific orientation and training, we emphasize discovery of existing systems and processes, then use the framework to gradually introduce elements of Scrum as warranted by the context.
This gradual introduction can be both role-based and process-based. For instance, the “daily scrum” is a process-based change the team can begin to employ within the context of a Scrumban framework, just as new Scrum masters can be eased into their responsibilities one element at a time.
The Scrumban framework obviates the need for debate over how and when to introduce these elements. The tasks associated with ramping up a new project can be easily accommodated and managed within the Scrumban framework as a special work type. If it’s important for your environment to ensure Scrum ceremonies hold true to their Agile objectives, then Scrumban provides a ready solution.
It also makes sense to assess existing Scrum practices with an eye toward understanding their impact on performance. Larry Maccherone and his former colleagues at Rally Software have analyzed data from thousands of Scrum teams to assess how differences in the way Scrum is practiced affect performance.
This data mining exposed some interesting realities about how our choices involving Scrum practices can influence specific facets of performance. Incidentally, Maccherone’s analysis is consistent with data mined from hundreds of teams using my own Scrum project management platform.
Maccherone elected to adopt a “Software Development Performance Index” as a mode of measuring the impact of specific practices. The index is composed of measurements for the following aspects:
Productivity: Average throughput—the number of stories completed in a given period of time
Predictability: The stability of throughput over time—how much throughput values varied from the average over time for a given team
Responsiveness: The average amount of time work on each user story is “in process”
Quality: Measured as defect density
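The four components above can each be computed from ordinary team tracking data. The following is a minimal sketch of one plausible way to calculate them, not Maccherone’s actual index formula (how the components are normalized and combined is not detailed here); all variable names and values are illustrative.

```python
from statistics import mean, pstdev

# Hypothetical per-sprint data for one team (values are illustrative only).
stories_completed = [7, 9, 6, 8, 10]           # stories finished in each sprint
cycle_times_days = [3.5, 4.0, 2.5, 5.0, 3.0]   # avg days each sprint's stories were "in process"
defects_found = 12                             # defects attributed to this work
stories_total = sum(stories_completed)         # total stories delivered

# Productivity: average throughput (stories completed per time period).
productivity = mean(stories_completed)

# Predictability: stability of throughput over time -- expressed here as the
# coefficient of variation, so a lower number means more stable delivery.
predictability = pstdev(stories_completed) / productivity

# Responsiveness: average time each user story spends "in process".
responsiveness = mean(cycle_times_days)

# Quality: defect density (defects per completed story).
quality = defects_found / stories_total

print(f"productivity:   {productivity:.1f} stories/sprint")
print(f"predictability: {predictability:.2f} (CV of throughput)")
print(f"responsiveness: {responsiveness:.1f} days in process")
print(f"quality:        {quality:.2f} defects/story")
```

Even this rough version is enough to compare two teams, or one team before and after a practice change, on the same four axes the index uses.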