SDLC Insights

Improve observability, predictability and efficiency


The buyer of a company would never complete a purchase with only a point-in-time look at the financials. The buyer rightly understands that most of the value comes not from evaluating the current state, but from understanding how that state is changing over time. Knowing a company will do $10 million in revenue this year feels very different depending on whether they did $5 million or $20 million last year. Change over time is critical context for understanding present-day performance.

In a previous post, we talked about the importance of time in understanding a complex business process like software development. You can’t understand how a process works without understanding how work flows over time. To have confidence that you truly understand a process, you need to observe it longitudinally rather than simply at a particular moment.

Unfortunately, when we think about the events that cause us to evaluate the alignment, predictability and efficiency of a software development process, we rarely have the luxury of time for that effort. “Due diligence” processes are an excellent example of this dynamic, where one organization is trying to understand the capabilities and limitations of another before taking an ownership stake or forming a partnership. In essence, the buyer is trying to understand the technical debt of the seller before it becomes the buyer’s problem. During due diligence, there is an evaluation period prior to the signing of a definitive deal, and one component of the overall evaluation is a deep dive into the technical components, or “technical due diligence.”

And yet, technical diligence almost never includes a robust understanding of change over time. The elapsed period of diligence may be measured in days, weeks, or even a small number of months, but it’s certainly not enough to observe the process of software development over time with substantial rigor.

Typical due diligence for technology is centered on a few activities:

  • Interviews with key contributors (executives, technical architects, team leaders) to understand their knowledge as well as to size up the people
  • Review of key artifacts to understand the technical decisions that have been made and the architecture of the solution with a particular focus on where those choices might limit future freedom of action
  • Code reviews of key libraries to try to understand the quality of the code being written
  • Measures of historical quality like SLA compliance to try to understand whether the solution can meet requirements in its current state

Much like the seller of a house who slaps a quick coat of paint on walls to cover up the cracks coming from the foundation, the organization undergoing diligence has every incentive to paper over the deficiencies and put a brave face on everything. The misalignment of incentives creates a motive for less than complete disclosure, and the difference in time spent observing the process creates a substantial asymmetry of information where the seller has the benefit of tremendous knowledge over time while the buyer is relying on a point-in-time analysis to try to extrapolate future value.

These dynamics make it difficult for an organization with very limited time and resources to conduct an effective and thorough analysis.

Caveat emptor, Latin for “buyer beware” or “no take backs” from my playground days, governs most of these transactions. There might be some representations and warranties for financial penalties in the case of extreme fraud or misappropriated intellectual property, but the reality is that poor technical diligence usually results in poor financial returns for the buyer, and an awful lot of headaches. (Robust diligence and the risk of asymmetric information matter even more given the tendency of a buyer to over-pay in a competitive auction, known as the “winner’s curse.”)

Process mining can level the playing field during this process. Because the mining platform will typically ingest months or years of historical process data, it allows an organization to quickly understand:

  • Which organizational priorities are garnering the most resource allocation
  • How often different teams have met their commitments to the business in the past
  • How efficiently the SDLC is operating and where bottlenecks exist

Since this analysis is done algorithmically, the output is more comprehensive and less subject to bias than traditional due diligence techniques.
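To make the idea concrete, here is a minimal sketch of the kind of longitudinal analysis described above, computed from a ticket event log. The event names, field layout, and sample data are illustrative assumptions, not any platform's actual schema:

```python
from datetime import datetime

# Hypothetical event log: (team, ticket, event, timestamp).
# Field names and events are illustrative assumptions.
EVENTS = [
    ("payments", "PAY-1", "started", "2023-01-02"),
    ("payments", "PAY-1", "shipped", "2023-01-09"),
    ("payments", "PAY-2", "started", "2023-01-03"),
    ("payments", "PAY-2", "shipped", "2023-01-24"),
    ("platform", "PLT-1", "started", "2023-01-02"),
    ("platform", "PLT-1", "shipped", "2023-01-05"),
]

# Hypothetical sprint commitments: (team, ticket, promised_by).
COMMITMENTS = [
    ("payments", "PAY-1", "2023-01-15"),
    ("payments", "PAY-2", "2023-01-15"),
    ("platform", "PLT-1", "2023-01-15"),
]

def _d(s):
    return datetime.strptime(s, "%Y-%m-%d")

def cycle_times(events):
    """Days from 'started' to 'shipped' per ticket, grouped by team."""
    starts, ships = {}, {}
    for team, ticket, event, ts in events:
        (starts if event == "started" else ships)[(team, ticket)] = _d(ts)
    out = {}
    for key, start in starts.items():
        if key in ships:
            out.setdefault(key[0], []).append((ships[key] - start).days)
    return out

def commitment_rate(events, commitments):
    """Fraction of tickets shipped on or before their committed date, per team."""
    shipped = {(t, k): _d(ts) for t, k, e, ts in events if e == "shipped"}
    met, total = {}, {}
    for team, ticket, promised_by in commitments:
        total[team] = total.get(team, 0) + 1
        done = shipped.get((team, ticket))
        if done is not None and done <= _d(promised_by):
            met[team] = met.get(team, 0) + 1
    return {team: met.get(team, 0) / n for team, n in total.items()}

print(cycle_times(EVENTS))                    # {'payments': [7, 21], 'platform': [3]}
print(commitment_rate(EVENTS, COMMITMENTS))   # {'payments': 0.5, 'platform': 1.0}
```

A real process-mining platform runs this kind of computation across years of history and thousands of work items, which is why its view of delivery performance is comprehensive in a way that interviews and spot code reviews cannot be.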

It’s tempting to say that process mining is almost like x-raying the freshly painted walls of the house to see the structural deficiencies, but the concept is even more powerful. Rather than a point-in-time x-ray, process mining is like having the ability to go back in time and observe the changes to the house over the past months or years. Whoever is performing diligence can immediately understand where the process is running efficiently and meeting the expectations of the business versus where the process might need additional attention or resources to improve.

Moreover, process mining allows the party performing diligence to compare teams within or across organizations with advanced metrics like the Sprint Performance Score or Portfolio Performance Score. When viewed in conjunction with other data points an evaluator can get a more comprehensive and objective view of the technical debt that has accumulated over time. By assessing where investment is going (across Capex or Opex, Build / Run / Maintain) you can understand the capacity of the team to ship new capabilities that may be needed to develop the business.

This can help the buyer better understand what they are buying, and what investments might be needed to realize the expected returns. The most important thing that process mining can highlight is also the trickiest to understand and solve: how are the teams of people performing? Do I have the right people? Do I need to invest in more? Do they need more or less process or tooling to be effective?

All of this creates a paradigm shift for the technical evaluation of a company. In a matter of hours, a buyer can have a detailed and rich understanding of the asset they are evaluating for purchase and how it has evolved over time. They can understand how effectively the teams are working, where bottlenecks in the process exist, and the severity of those issues compared to other organizations. This leads to a much more informed perspective on the likelihood of an asset performing well, and what sorts of investments might be necessary for improvement. If the buyer must beware, at least they can also be informed.

Erik Severinghaus

Co-Founder | Co-CEO

Erik is an entrepreneur, innovator, author, and adventurer who’s been featured in Fortune, Forbes, and The Wall St. Journal. His track record includes profitable exits from iContact (Vocus), SimpleRelevance (Rise Interactive), and SpringCM (DocuSign). Erik released his first book in 2021, Scale Your Everest, a guidebook for mental health, resilience and entrepreneurship. As an endurance athlete, Erik has conquered some of the world's tallest peaks, including Mt. Everest in 2018.