CASE STUDY – PART 5 – Proven, Practical Tactics For Agile IT Release Management (LESSONS LEARNED)

OVERVIEW:

This article is the last in a series of five which explain how an IT organization delivered a release management process that exceeded its management’s expectations and provided a foundation for continued success. The series includes:

  1. How did we get here – THE CONTEXT
  2. First solution steps – DEFINITIONS AND TRIAGE
  3. Intake and Release Planning – THE CORE SOLUTION
  4. Production Change Control – FINAL QUALITY CONTROL
  5. Metrics and Insights – LESSONS LEARNED

SUMMARY:

Many Information Technology organizations flounder when tasked with understanding, organizing, and implementing change, over a period of several years, to the system and application software serving their clients and end customers. This fifth article focuses on the key results of the solutions developed during the Release Management consulting engagement.

Please refer to the first article, THE CONTEXT, for a full discussion of the problem domain and organization; to the second article, DEFINITIONS AND TRIAGE, for a discussion of the get-ready steps; to the third, THE CORE SOLUTION, for details on planning releases; and to the fourth, FINAL QUALITY CONTROL, to learn how implementation quality was improved. These articles were all entitled Proven, Practical Tactics for Agile IT Release Management. Now is the time to assess: how “Agile” were we? This Release Management process was implemented in 1999, without the benefit of the thoughts and ideas published following the Agile Manifesto. We also have some basic metrics to consider and explain, and thoughts on the lessons learned along the way.

HOW AGILE WAS THIS WORK?

I willingly concede that there are experts in the Agile community who are far better qualified to render an opinion on how closely this work conforms to the principles of Agile software development and the complementary Scrum approaches to product and enterprise requirements management. Note also that this was release management, not software development per se, yet the Manifesto’s values still offer a useful yardstick. The Agile Manifesto states (Author’s Note: the specific reference for this quote is given at the end of this article): “We are uncovering ways of developing software by doing it and helping others do it. Through this work we have come to value:

  1. Individuals and interactions over processes and tools
  2. Working software over comprehensive documentation
  3. Customer collaboration over contract negotiation
  4. Responding to change over following a plan”

Our process work was steadfast, disciplined, and critical to success. Our interactions were frequent and very focused. We used plain, cheap tools, but we used them exceedingly well. As for individuals, we played to everyone’s strengths to succeed. I’d give us a B on item 1.

The Release Management process took no notice of the interim steps of software development. In fact, we stripped out tracking of interim dates, then restored only the QA start date because of its importance. The only thing we worried about was production-ready software. I’d give us an A on item 2.

On customer collaboration, we certainly improved communications about what was being worked on (and what wasn’t was equally obvious). We definitely showed the VPs that we were trying to slide their Top 5 requests in at the earliest juncture in the overall plan. The Release Management process did not operate at the level of the software’s requirements, design, and functionality; in essence, we just did a great job of clearly starting and stopping work. I’d give us a B on item 3.

This process excelled at responding to change over following a plan. Every week we would build a firm Release Schedule for six Releases, and the very next week we would rework the whole thing in response to circumstances and reality. We did that with clarity, collaboration, understanding, and a high level of communication. I’d give us an A+ on item 4.

METRICS:

I will restate my opinion that 99% of the published material regarding IT processes lacks meaningful statistical indicators. There is a lot of “crowing” about methods and tools, but not a lot of believable, concrete information. Also, keep in mind that IT headcount was held constant for the periods in question. Here is a sample of the data and metrics we collected:

  • Release Management Selected Metrics

    Metric                           Year End 1998     Year End 1999      Improvement
    Customer Satisfaction (1 to 5)        2.5               4.0              60.0%

    Metric                           6/1/98–5/31/99    6/1/99–5/31/2000   Improvement
    CRs Completed or Cancelled             97                218
    Including Project Releases            118                218              84.7%
    Major Projects Completed               10                 15              50.0%

    Metric                           As of 5/31/1999   As of 5/31/2000    Change
    Avg Age of CR Backlog                197 days          187 days        -10 days
    Size of CR Backlog                   307 CRs           297 CRs         -10 CRs
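
For readers who want to verify the Improvement column, the figures follow from simple before/after arithmetic. Below is a minimal sketch in Python (illustrative only; the original team worked with Excel and index cards) that reproduces the calculations from the table above:

    # Reproduce the Improvement column of the metrics table above.
    # All figures come straight from the table; Python is illustrative only.

    def improvement_pct(before, after):
        """Percentage improvement relative to the earlier period."""
        return (after - before) / before * 100

    metrics = {
        "Customer Satisfaction":      (2.5, 4.0),   # -> 60.0%
        "Including Project Releases": (118, 218),   # -> 84.7%
        "Major Projects Completed":   (10, 15),     # -> 50.0%
    }

    for name, (before, after) in metrics.items():
        print(f"{name}: {improvement_pct(before, after):.1f}% improvement")

    # The backlog rows report simple differences rather than percentages:
    print(f"Avg age of backlog change: {187 - 197} days")  # -10 days
    print(f"Size of backlog change:    {297 - 307} CRs")   # -10 CRs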

  • Customer Satisfaction – IT

The CIO conducted a very simple poll of the senior managers in the organization each year, asking for an overall degree of satisfaction with IT performance for the prior year. On a scale of 1 to 5, the 10 managers rated the year from Very Unsatisfactory (1) to Outstanding (5). This simple scoring did not differentiate between performance in Operations, on Projects, or on implementing Change Requests; it captured their overall satisfaction. We believe that the efforts on release management were a major factor in raising the score from the prior year. Another major win was that the IT organization passed through the Year 2000 transition without mishap.

  • Change Requests Completed or Cancelled

The consulting engagement began in early May of 1999. At that point in time, the definition of a Change Request did not include production software changes caused by major projects. Using the new definition of what Release Management considered an in-scope Change Request, the count of Change Requests completed in the earlier period was expanded so that a fair comparison could be drawn. The IT organization, using Release Management, dispatched about 85% more Change Requests over 12 months. As a parallel observation, the IT group set annual targets for completing change requests. Its goal for 1999 was 140, considered an aggressive target at the time; on a calendar-year basis, IT completed 172 Change Requests in 1999.

  • Major Projects Completed

In case readers wonder whether IT simply redirected effort toward change requests, short-changing project work, the numbers for major projects are shown above. We do not know what percentage of total resources was used year over year, as project-hour accounting was weak. Given that the Year 2000 Project was a major endeavor, I offer that it is safe to assume there was no disproportionate shift of resources favoring the Change Request results.

  • Average Age of Change Request Backlog

We also considered how well we were doing at reducing the amount of time clients waited to get their Change Requests taken care of. For the period in question, no significant change can be observed; at least the trend did not go up. Our first-hand experience was that we were getting to the high-priority requests more quickly, but we did not collect metrics on that.
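
For anyone reproducing this metric: the definition sketched below (the average number of days each still-open Change Request has been waiting, as of a snapshot date) is my reading of “average age of the CR backlog,” and the sample dates are hypothetical:

    # Sketch of the backlog-age metric: average days each still-open
    # Change Request has been waiting, as of a snapshot date.
    # The CR dates below are hypothetical; only the metric's general
    # definition is implied by the article.

    from datetime import date

    def average_backlog_age(open_dates, as_of):
        """Average age in days of the open Change Requests."""
        return sum((as_of - d).days for d in open_dates) / len(open_dates)

    open_crs = [date(1999, 8, 1), date(2000, 1, 15), date(1999, 11, 30)]
    print(f"{average_backlog_age(open_crs, date(2000, 5, 31)):.0f} days")  # 208 days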

  • Size of Change Request Backlog

In a similar vein, we kept an eye on the total number of open change requests. We saw minor fluctuations, but in general the client community kept submitting more improvement requests. There was no budgetary chargeback mechanism from the IT department to the VPs, so asking for more IT work had no direct financial consequences for them.

LESSONS LEARNED

  1. First and foremost, the direct investment made in the Release Management implementation brought better-than-expected results for the stakeholders.
  2. I was amazed and delighted to see the Wall of Index Cards morph over time into a more elaborate information radiator for the organization. One example was the addition of colored dots to the cards for Top 5 status and QC status. We also got tricky with positioning cards above and below certain horizontal lines to convey additional information, and we started to display a thermometer of completed change requests versus the annual target (it was uplifting; see the sketch after this list). There is a lot of truth to the adage that you learned everything you need to know in kindergarten.
  3. The solutions we applied were just about perfect for a collocated organization with the configuration management and QC processes in place and an organizational commitment to release management.
  4. At the end of 1999, the CIO, seeing that the process was successfully embedded, asked the consultant to do a fresh study of the commercial software market for Release Management support tools; none were found that could match the team effectiveness we achieved with cards on a wall.
  5. The role and actions of the Release Manager were very well defined, so I prepared a transition plan to bring in an internal manager for the ongoing position of Release Manager. It took over four months to locate and train a replacement for the permanent position; the first candidate chosen simply could not keep up with the pace of detailed item management required.
  6. If this problem had involved 600 Change Requests, it might not have worked at all. As long as we had fewer than 350, we could fit them on one wall and read every card from about 15 feet away. There are limits to this media/storyboard approach.
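
As mentioned in lesson 2, here is a minimal sketch of the completed-versus-target thermometer. Ours was drawn by hand on the wall; this text rendering is purely illustrative. The 1999 figures (172 completed against a target of 140) come from the METRICS section above:

    # Text "thermometer" of completed change requests versus the annual
    # target. Hand-drawn in the original; this rendering is illustrative.

    def thermometer(completed, target, width=40):
        filled = min(int(width * completed / target), width)
        bar = "#" * filled + "-" * (width - filled)
        return f"[{bar}] {completed}/{target} CRs ({completed / target:.0%} of target)"

    # 1999 calendar-year figures from the METRICS section.
    print(thermometer(172, 140))
    # [########################################] 172/140 CRs (123% of target)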

CONCLUSION / TRANSITION

There is a huge amount of value in creating a Visual Decision board that covers the whole problem for an organization, and this general finding can be applied in a low-cost manner to many problems. In my research to find an adequate software package for Release Management, nearly every product stumbled on the problem of scale: a 21″ computer monitor cannot show what a wall can. To this day, the tenth anniversary of this endeavor, only very sophisticated hardware systems and conference-room environments begin to match THE WALL and the practices we used. I smile each time a modern spy movie, “24”, or CSI: Miami dazzles the audience with sophisticated technology for manipulating index cards and puzzle pieces in a virtual space.

(c) By David W. Larsen

If you would like to contact David to discuss process improvement in your organization, email dwlarsen1946@gmail.com

David W. Larsen is a consultant in the Information Technology practices of process improvement, ecommerce, portfolio management, and program and project management. He earned his BA from Valparaiso University, his MBA at St. Mary’s College of California, and a degree in Telecommunications Management from the University of California, Berkeley. Dave has served global businesses in leading projects for application development and integration and has worked hand-in-hand with firms in the US, Canada, the Czech Republic, England, Sweden, Germany, Italy, Japan, Australia, and India.

Information on the Agile Manifesto and the Agile Alliance may be found in the Manifesto for Agile Software Development © 2001, at http://www.agilemanifesto.org
