Accelerate notes and insights
Delivering software fast while keeping quality high is challenging. Accelerate sheds light, through data, on DevOps culture and practices. DevOps (even if not always under that name) has been adopted by the industry for a while, and Accelerate dives into what makes successful teams deliver software fast, as well as what holds them back from improving the delivery process. Besides that, the book is accessible to different profiles within the technology world (software engineers, cyber security, quality engineers, etc.).
This post follows the same structure as the book, but instead of three parts (the third being a case study and the challenges faced), it is split into two. The first goes through the results found, and the second dives into the research process.
The companion for this post is the mind map built during and after the reading process; each section that follows includes the piece of the mind map corresponding to the subject being discussed.
The first part of the book focuses on the insights the data revealed to the researchers, pointing to which capabilities lead to improvements in software delivery. The premise is that business as usual is not enough to innovate and succeed; according to the authors, a Gartner survey showed that 47% of CEOs face pressure to deliver digital transformation. If CEOs face that pressure, part of it is likely to affect how their companies deliver software.
The following list depicts the four metrics used as a guideline for what to measure. Early on, the authors explain that measuring velocity has several flaws: it focuses on how fast something was delivered, it is context dependent, and teams usually game their velocity.
In other words, once teams perceive that they are being measured on how fast they deliver, they start to bend the predefined rules to improve that number, which makes the resulting measurements close to meaningless.
The four metrics described by the authors focus on the global outcome rather than velocity itself. A global outcome is harder to game, and it makes it easier to see the effect on the deliverables.
- Lead time: Measures how long it takes from a request until it is available for the customer to use (often meaning deployed to production).
- Deployment frequency: How often changes are deployed to customers; it is a proxy for how painful it is for engineers to deploy changes.
- Mean time to restore (MTTR): Measures how long it takes to restore service when something breaks; it is related to the next metric.
- Change fail percentage: How often a deployed change results in a failure.
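To make the four metrics concrete, here is a minimal sketch of how they could be computed from deployment records. The record shape (`committed`, `deployed`, `failed`, `restore_hours`) and the function name are my own assumptions for illustration, not something defined in the book:

```python
from datetime import datetime
from statistics import median

# Hypothetical deployment records: when the change was committed,
# when it reached production, whether it failed, and how long the
# restore took if it did.
deployments = [
    {"committed": datetime(2021, 7, 1, 9), "deployed": datetime(2021, 7, 1, 15),
     "failed": False, "restore_hours": 0},
    {"committed": datetime(2021, 7, 2, 10), "deployed": datetime(2021, 7, 3, 11),
     "failed": True, "restore_hours": 2},
    {"committed": datetime(2021, 7, 5, 8), "deployed": datetime(2021, 7, 5, 9),
     "failed": False, "restore_hours": 0},
]

def four_key_metrics(deployments, period_days=30):
    """Compute the four key metrics over a reporting period."""
    lead_times = [(d["deployed"] - d["committed"]).total_seconds() / 3600
                  for d in deployments]
    failures = [d for d in deployments if d["failed"]]
    return {
        # Median hours from commit to production.
        "lead_time_hours": median(lead_times),
        # Deployments per day over the period.
        "deploys_per_day": len(deployments) / period_days,
        # Mean hours to restore service after a failed change.
        "mttr_hours": (sum(f["restore_hours"] for f in failures) / len(failures))
                      if failures else 0.0,
        # Share of changes that resulted in a failure.
        "change_fail_pct": 100 * len(failures) / len(deployments),
    }

metrics = four_key_metrics(deployments)
```

In practice these numbers would come from a deployment pipeline or incident tracker rather than a hand-written list, but the arithmetic stays the same.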
The metrics are connected to the capabilities found by the authors: each capability listed may impact a metric score, positively or negatively. In total, the four key metrics are connected to 24 capabilities that impact the global outcome; depending on which capability is being inspected, it might interfere with one or more of the metrics.
The 24 capabilities are classified into five categories:
- Continuous delivery
- Architecture
- Product and process
- Lean management and monitoring
- Culture
On the architecture category, I can refer to the Tech Lead Journal episode with Chris Richardson, which talks about using microservices as a way to enable measuring and improving the four key metrics.
Accelerate introduces the results that came from the research in part 1, and in part 2 it goes deeper into the science behind them. The interesting part is how the authors decided to use surveys instead of any other method.
The authors acknowledge that surveys are usually not seen as a trusted source: survey takers can lie, leading to "invalid" data, which it would seem "easier" to avoid by using system logs, for example. But they also argue that logs require trusting the system that produced them.
In the end, there is no 100% guarantee that all the collected data, be it from surveys or logs, will be correct.
There are, however, ways to mitigate this issue, and for that the authors used statistical analysis. Another mitigation was the sheer size of the survey: in total there were around 23,000 answers, and to skew the results a lot of people would have had to lie in an orchestrated way (not impossible, but very unlikely).
Accelerate gives me a perspective on how to approach software delivery in both theory and practice. The collected data points to how to deliver software effectively in a digital era in which developers are on the front line every day, trying to deliver as much value as possible.
The metrics measure the global outcome (delivered value) rather than individual contributions, which connects to how important it is to work as a group: team interests take priority over individual goals. This is the present, and the future. Even so, I would bet that most big organizations struggling to innovate are penalized for not having this mindset in place.
All in all, the arguments used to explain why a survey was chosen, and how the analysis was approached statistically, show how seriously the work was conducted: focused on data instead of biased opinions or "feelings".
- N. Radziwill, "Accelerate: Building and Scaling High Performance Technology Organizations (Book Review). 2018 Forsgren, N., J. Humble and G. Kim. Portland, OR: IT Revolution Publishing. 257 pages." Taylor & Francis, 2020.
- N. Forsgren, J. Humble, and G. Kim, "Accelerate: Building and Scaling High-Performing Technology Organizations," 2018 [Online]. Available at: https://www.goodreads.com/en/book/show/35747076-accelerate. [Accessed: 16-Jul-2021]
- H. Suryawirawan and C. Richardson, "#53 - Principles for Adopting Microservices Successfully - Chris Richardson," 2021 [Online]. Available at: https://techleadjournal.dev/episodes/53. [Accessed: 30-Aug-2021]