Driving Organizational Excellence: A Comprehensive Book Review and Personal Insights on 'Accelerate - Building and Scaling High-Performing Technology Organizations'
The content here is under the Attribution 4.0 International (CC BY 4.0) license
Delivering software fast while keeping quality high is challenging (Dave Farley challenges the supposed trade-off between speed and quality, arguing that each depends on the other rather than one blocking the other). Accelerate sheds light on DevOps culture and practices through data. DevOps (even if the term was not always used) has been adopted by the industry for a while, and Accelerate dives into what makes successful teams deliver software fast and what holds them back from improving the delivery process. Besides that, the book is accessible to different profiles within the technology world (software engineers, security engineers, quality engineers, etc.).
This post follows the same structure as the book, but instead of three parts (the third covers a case study and the challenges faced) it is split into two. The first goes through the results found, and the second dives into the research process.
The companion for this post is the mind map built during the reading process; each of the sections that follow includes the piece of the mind map corresponding to the subject being discussed.
The first part of the book focuses on the insights that the data revealed to the researchers, pointing to which capabilities lead to improvements in software delivery. The premise is that business as usual is not enough to innovate and succeed; according to the authors, a Gartner survey showed that 47% of CEOs face pressure to deliver digital transformation. If CEOs face that pressure, part of it is likely to trickle down to how companies deliver software.
The following list depicts the four metrics used to build a guideline of what to measure. Early in the book the authors describe how measuring velocity has several flaws: it focuses on how fast something was delivered, it is context dependent, and teams usually game their velocity.
In other words, once teams perceive that they are being measured by how fast they deliver, they start bending the predefined rules to improve that number, which makes the resulting measurements close to meaningless.
The four metrics as described by the authors focus on the global outcome rather than velocity itself. A global outcome is harder to game, and it makes the effect on the deliverables easier to see.
- Lead time (LT): Measures how long it takes from a request until it is available for the customer to use (often meaning deployed to production).
- Deployment frequency (DF): Measures how often changes are deployed for customers; a low frequency is usually associated with painful deployments.
- Mean time to restore (MTTR): Measures how long it takes to fix something when it breaks; it is related to the next metric.
- Change fail percentage (CFP): Measures how often a change to production fails.
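The four metrics above can be computed from a team's own delivery records. Here is a minimal, hypothetical Python sketch; the record fields, sample data, and formulas are my own illustrative assumptions, not taken from the book:

```python
from datetime import datetime, timedelta

# Hypothetical deployment records: (committed_at, deployed_at, failed, restored_at)
deployments = [
    (datetime(2021, 7, 1, 9), datetime(2021, 7, 1, 15), False, None),
    (datetime(2021, 7, 2, 10), datetime(2021, 7, 2, 12), True, datetime(2021, 7, 2, 13)),
    (datetime(2021, 7, 3, 8), datetime(2021, 7, 3, 9), False, None),
    (datetime(2021, 7, 4, 11), datetime(2021, 7, 4, 18), True, datetime(2021, 7, 4, 20)),
]

# Lead time (LT): average time from the request (here, the commit) to production.
lead_time = sum(((d - c) for c, d, _, _ in deployments), timedelta()) / len(deployments)

# Deployment frequency (DF): deployments per day over the observed window.
days = (deployments[-1][1] - deployments[0][1]).days or 1
deployment_frequency = len(deployments) / days

# Change fail percentage (CFP): share of deployments that caused a failure.
failures = [(d, r) for _, d, failed, r in deployments if failed]
change_fail_percentage = 100 * len(failures) / len(deployments)

# Mean time to restore (MTTR): average time from a failed deploy to its restoration.
mttr = sum(((r - d) for d, r in failures), timedelta()) / len(failures)

print(lead_time, deployment_frequency, change_fail_percentage, mttr)
```

A real pipeline would pull these records from the CI/CD system rather than hard-code them, which is essentially what tools like Metrik (listed at the end) do.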
Personally, I liked the image that IT Revolution made with the metrics: it makes them easier to remember, and the adaptation made for this post is a welcome taste of what the book looks like.
The metrics are connected to the capabilities found by the authors, in the sense that each capability listed may impact a metric's score, positively or negatively. A prototype was published to measure those metrics automatically.
The four key metrics are connected to 24 capabilities that impact the global outcome; depending on which capability is being inspected, it might influence one or more of the metrics. The 24 capabilities found were classified into five categories:
On the second category, I can refer to a talk that discusses using microservices as a way to enable measuring and improving the four key metrics.
Accelerate introduces the results that came from the research in part 1, and in part 2 it goes deeper into the science behind them. The interesting part is why the authors decided to use surveys instead of any other method.
The authors acknowledge that surveys are often not seen as a trusted source: survey takers can lie, leading to invalid data, which it would supposedly be easier to avoid by using system logs, for example. But the authors also argue that, in that case, trust in the system producing the logs is needed as well.
In the end, there is no 100% guarantee that all the collected data, be it from surveys or logs, will be correct.
There are, however, ways to mitigate this issue, and for that the authors used statistical analysis. Another mitigation was the sheer size of the survey: in total there were around 23,000 responses, and to skew the results a lot of people would have had to lie in an orchestrated way (not impossible, but very unlikely).
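To get intuition for the sample-size argument, here is a toy simulation of my own (not the authors' analysis, and the Likert scale, lie rate, and scores are all made-up assumptions): uncoordinated dishonest answers tend to average out, so a large sample barely moves the aggregate result.

```python
import random

random.seed(42)

def observed_mean(true_score, n, lie_rate):
    """Mean of n Likert answers (1-7) where a fraction of
    respondents lie by answering uniformly at random."""
    answers = []
    for _ in range(n):
        if random.random() < lie_rate:
            answers.append(random.randint(1, 7))  # dishonest, random answer
        else:
            answers.append(true_score)            # honest answer
    return sum(answers) / n

# With 30 responses, 10% uncoordinated lying can noticeably shift the mean;
# with ~23,000 responses, it converges near the expected value (~4.9 for a true 5).
small = observed_mean(true_score=5, n=30, lie_rate=0.10)
large = observed_mean(true_score=5, n=23_000, lie_rate=0.10)
print(small, large)
```

Only coordinated lying (everyone biased in the same direction) would survive this averaging, which is exactly the scenario the authors consider very unlikely at that scale.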
Accelerate gave me a perspective on how to approach software delivery in both theory and practice. The collected data points to how to deliver software effectively in a digital era in which, each day, developers are on the front line trying to deliver as much value as possible.
Using the metrics to measure the global outcome (delivered value) rather than individual contributions reflects how important it is to work as a group: the team's interests take priority over individual goals. This is the present, and the future. I would bet that most big organizations struggling to innovate are being penalized for not having this mindset in place.
All in all, the argument for why a survey was used and how the analysis was approached statistically shows how the work was conducted: focused on data instead of biased opinions or "feelings". Some might argue that there will still be bias; even so, it is a way to minimize it.
Edit Oct 20, 2021
Edit Jul 08, 2022
> Last month, @keunwoo published a critical review of Accelerate. @nicolefv and I thought it was a good opportunity to address questions about our work and methods. Here's our response. https://t.co/fLPln1h5w5 — Jez Humble (@jezhumble) July 8, 2022
- A review of Accelerate: The Science of Lean Software and DevOps
- Response to Keunwoo Lee’s review of Accelerate
- Software Development with Feature Toggles: Practices used by Practitioners
- The Links Between Agile Practices, Interpersonal Conflict, and Perceived Productivity
- Measuring productivity in agile software development process: a scoping study
- What Makes Effective Leadership in Agile Software Development Teams?
- It’s Like Coding in the Dark
- Keynote - Chris Richardson - Microservices Patterns
- Metrik - An easy-to-use, cross-platform measurement tool that pulls data out of CD pipelines and analyzes the four key metrics for you
- N. Radziwill, "Accelerate: Building and Scaling High Performance Technology Organizations (Book Review)," Taylor & Francis, 2020. (Reviewing N. Forsgren, J. Humble, and G. Kim, Portland, OR: IT Revolution Publishing, 2018, 257 pages.)
- M. Sallin, M. Kropp, C. Anslow, J. W. Quilty, and A. Meier, “Measuring Software Delivery Performance Using the Four Key Metrics of DevOps,” in Agile Processes in Software Engineering and Extreme Programming, Cham, 2021, pp. 103–119.
- N. Forsgren, J. Humble, and G. Kim, "Accelerate: Building and Scaling High-Performing Technology Organizations," 2018 [Online]. Available at: https://www.goodreads.com/en/book/show/35747076-accelerate. [Accessed: 16-Jul-2021]
- H. Suryawirawan and C. Richardson, “#53 - Principles for Adopting Microservices Successfully - Chris Richardson,” 2021 [Online]. Available at: https://techleadjournal.dev/episodes/53. [Accessed: 30-Aug-2021]