A question we received recently: “With automated methods of gathering data from software companies popping up, is there really a need for traditional technical due diligence as part of an acquisition, involving interviews and analysis, or will the data tell me enough?”
The data referred to here is gathered by running software tools on certain development-related systems within a target company to generate insights into how the development team works, the overall quality of the code, the level of technical debt, and more. If this data is gathered and interpreted correctly, does that mean standard technical due diligence no longer has a place in the investment process?
The answer is a resounding “NO”. Technical due diligence is still required, but it can be nicely augmented by tool-based data gathering and analysis.
Using data (e.g. from source control systems or the code itself) to provide insights into the target company’s R&D team is recommended, assuming suitable access to the data is granted or the target can run the analysis tools themselves. However, this does not eliminate the need for in-person conversations; rather, it provides context and information to direct deeper dives into specific risks and potentially problematic parts of the technology.
As an example, areas of the code with significant predicted technical debt or maintainability issues are ripe for deeper investigation, particularly if the identified components are critical to the application being evaluated. Probing questions about the effect on scalability, the components’ history of issues, how often those code paths execute in core use cases, the overall impact on users, and the effort and cost to address the issues may expose risks to the investment.
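For readers curious what this tool-based signal gathering can look like in practice, here is a minimal sketch (assuming local read access to the target’s git repository) that ranks files by commit churn, a rough and commonly used proxy for hotspots worth probing in interviews. The repository path is hypothetical, and real diligence tooling would combine churn with other metrics.

```python
# Minimal sketch: rank files in a git repository by commit churn
# (total lines added + deleted across history), a rough proxy for
# hotspots that may warrant deeper discussion in diligence interviews.
# Assumes local read access to the repo; REPO_PATH is hypothetical.
import subprocess
from collections import Counter

REPO_PATH = "/path/to/target-repo"  # hypothetical location

def churn_by_file(repo_path: str) -> Counter:
    # --numstat emits "added<TAB>deleted<TAB>path" per changed file;
    # --format= suppresses the commit header lines themselves.
    out = subprocess.run(
        ["git", "-C", repo_path, "log", "--numstat", "--format="],
        capture_output=True, text=True, check=True,
    ).stdout
    churn = Counter()
    for line in out.splitlines():
        parts = line.split("\t")
        if len(parts) != 3:
            continue
        added, deleted, path = parts
        if added == "-" or deleted == "-":  # binary files report "-"
            continue
        churn[path] += int(added) + int(deleted)
    return churn

if __name__ == "__main__":
    # Print the 20 highest-churn files as candidate discussion topics.
    for path, total in churn_by_file(REPO_PATH).most_common(20):
        print(f"{total:>8}  {path}")
```

Churn alone is a noisy signal; pairing it with complexity metrics and, crucially, conversations with the team about why those files change so often is what turns the numbers into insight.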
While the data is certainly useful, there are activities in technical due diligence that it does not replace:
- Talking through the architecture, its history, and its future to develop an understanding of more significant technology risks
- Learning how the code was written (e.g. practices, processes), which can give additional insight into its quality and how well development scales across a team, among other things
- Having someone qualified interpret the data and dive deeper into the riskier areas it surfaces
- Identifying improvements and value creation opportunities for the target to help inform the post-close plan
- Interviewing the team to discover, for instance, that a key third-party component, while useful for many companies, is problematic for the target’s use cases and will need replacement
- Learning more about the people and practices that built the software, with experienced eyes and ears to identify potential issues, particularly as the business scales
- Applying human judgement, weighed against business objectives, as to whether identified technical debt really needs to be addressed, and how urgently
In summary, is data gathered from the target a valuable input to a technical due diligence effort? Absolutely. Is it wise to rely solely on these metrics to make sound investment decisions? Absolutely not. The real value comes from combining both approaches to uncover investment risks and opportunities for improvement as quickly and thoroughly as possible.