Data science and private equity due diligence may not seem like a natural fit: successful data science projects usually require ample time to study the data before arriving at an answer, while private equity commercial due diligence timeframes are short. Funds may have two weeks or less between the time they and their partners gain access to the data room and the time they need to make an investment decision. With such a timing mismatch, many funds feel they have no choice but to forego analyzing the target’s data before making their decision.
At Stax, we’ve found three methods for incorporating advanced statistical modeling even into this condensed timeframe. By anticipating the questions that will be asked, building familiarity with the most important data sources, and partnering with an overseas team for round-the-clock project execution, we have been able to compress the timeframe for rigorous, analytically driven decision making.
Data science projects usually require time upfront to translate the business problem into an analytical question. It can take weeks to correctly frame the problem, determine the data needed, and work out how to structure a statistical model that answers it.
Fortunately, most questions that need to be answered within a diligence timeframe are similar to questions that have come up in previous diligence assessments. Because Stax is so familiar with these recurring questions, we have developed frameworks and clear expectations for answering them on a diligence timeline.
Because the questions are similar, the data sources are often similar as well. This is especially true when we need to gather external data. For example, for questions about how company performance is tied to the business cycle, there are standard sources for national and local economic metrics. Other common data sources include local demographics, foot traffic, and lists and locations of competitors. Familiarity with these sources, and quick access to them, saves time in data preparation.
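As an illustration, here is a minimal sketch of what gathering standard external indicators can look like, assuming the pandas-datareader package and public FRED series IDs (UNRATE for the US unemployment rate, UMCSENT for consumer sentiment); the specific series and how they map to a target’s markets are placeholders, not a fixed recipe.

```python
# A minimal sketch of pulling standard macro indicators, assuming the
# pandas-datareader package and public FRED series IDs; the specific
# series used for a given target are placeholders here.
import pandas as pd
from pandas_datareader import data as pdr

START, END = "2015-01-01", "2023-12-31"

# Example national economic metrics from FRED
indicators = {
    "unemployment_rate": "UNRATE",    # US unemployment rate
    "consumer_sentiment": "UMCSENT",  # University of Michigan consumer sentiment
}

# Download each series and align them into one monthly panel
frames = {name: pdr.DataReader(series_id, "fred", START, END)[series_id]
          for name, series_id in indicators.items()}
macro = pd.DataFrame(frames).resample("MS").mean()

print(macro.tail())
```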
As more companies use outsourced vendors for data collection, Stax has even been able to analyze a target’s internal data within a diligence timeframe. For example, e-commerce businesses often use vendors like Shopify, which store their clients’ data in a format that is familiar and easy to obtain.
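When the data arrives as a standardized vendor export, the first pass can be fast. Below is a minimal sketch, assuming a Shopify-style orders CSV with hypothetical column names (created_at, customer_id, total_price); a real export would be mapped during the data request.

```python
# A minimal sketch of a first pass over a standardized e-commerce export,
# assuming hypothetical column names (created_at, customer_id, total_price).
import pandas as pd

orders = pd.read_csv("orders_export.csv", parse_dates=["created_at"])  # hypothetical file

# Roll orders up to monthly revenue and active customers for a quick growth read
monthly = (
    orders
    .assign(month=orders["created_at"].dt.to_period("M"))
    .groupby("month")
    .agg(revenue=("total_price", "sum"),
         active_customers=("customer_id", "nunique"))
)
print(monthly.tail())
```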
Even when the internal data is not standardized, Stax has found that some analysis is still possible for focused, critical questions. Answering them quickly takes experience with data requests and with common data issues, which reduces the number of back-and-forth exchanges. We have found that our private equity clients appreciate our knowledge of how a target’s databases are likely to be structured and which filters are likely to matter when analyzing the data. It’s important to build this kind of trust quickly.
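To make that concrete, here is the shape of a focused data pull we might request, written against a hypothetical Postgres-style transactions table; the filters reflect issues that commonly distort a revenue analysis.

```python
# The shape of a focused data request, written against a hypothetical
# Postgres-style transactions table; the filters reflect common data issues
# (test accounts, refunds, incomplete orders) that distort revenue analyses.
QUERY = """
SELECT
    customer_id,
    DATE_TRUNC('month', order_date) AS order_month,
    SUM(net_amount)                 AS net_revenue
FROM transactions
WHERE status = 'completed'       -- drop cancelled and incomplete orders
  AND is_test_account = FALSE    -- drop internal test accounts
  AND net_amount > 0             -- drop refunds and zero-value rows
GROUP BY customer_id, order_month
ORDER BY order_month;
"""

# pandas.read_sql(QUERY, connection) would then return the table for analysis,
# with the database connection supplied by the target's team.
```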
A global team is critical. For years, Stax’s round-the-clock teams have built predictive models on short timeframes. Through thousands of engagements, we have developed best practices for communication and collaboration with our overseas teams. Specifically, we’ve found there are parts of the process the two teams can hand off sequentially, such as descriptive analysis, feature engineering, and testing different model structures. We budget our time so that the only overlap in these areas is spent double-checking the other team’s work for errors.
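As a sketch of those hand-off steps, assuming a prepared monthly panel with hypothetical columns (month, revenue, consumer_sentiment), one team’s engineered features become the input to the other team’s comparison of model structures.

```python
# A minimal sketch of the hand-off steps above, assuming a prepared monthly
# panel with hypothetical columns (month, revenue, consumer_sentiment).
import pandas as pd
from sklearn.linear_model import Ridge
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import TimeSeriesSplit, cross_val_score

panel = pd.read_csv("prepared_panel.csv", parse_dates=["month"])  # hypothetical file

# Feature engineering: lagged revenue and a smoothed macro indicator
panel["revenue_lag_1"] = panel["revenue"].shift(1)
panel["sentiment_3m"] = panel["consumer_sentiment"].rolling(3).mean()
panel = panel.dropna()

X = panel[["revenue_lag_1", "sentiment_3m"]]
y = panel["revenue"]

# Testing different model structures with time-aware cross-validation
cv = TimeSeriesSplit(n_splits=5)
for name, model in [("ridge", Ridge()), ("gradient boosting", GradientBoostingRegressor())]:
    scores = cross_val_score(model, X, y, cv=cv, scoring="neg_mean_absolute_error")
    print(f"{name}: mean absolute error = {-scores.mean():,.0f}")
```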
For other parts of the process, collaboration is key. When something in the predictive model requires a new, creative approach, it helps to have Team A explore, share their results, and wake up with fresh eyes to see how Team B has reacted to their results and explored the problem themselves. This type of collaboration builds a level of robustness into our predictive models that we wouldn’t get if either team worked independently.
Data science during the private equity diligence process won’t remove all the uncertainty, but it can eliminate a significant portion. At Stax, our data science teams frequently work on longer-term projects, and we are aware of the additional value of spending months on analysis before committing to, for example, selecting fifty new locations for your next retail expansion. During a diligence, we can build initial estimates for those kinds of models.
But data science in diligence is really about narrowing the range of possible answers to a critical question. For example, if we have only our intuition about whether a specific business will suffer from a downturn in the business cycle, data can confirm or push back against that intuition. We will be able to report what the numbers say, what we were and were not able to measure in the given timeframe, and how much confidence we place in the result. A valuation supported by “this business’s revenue is about 30% tied to the business cycle, and that’s based on X, Y, and Z” carries far more confidence than one supported by “we don’t think this business has much risk from the business cycle.” Even a skeptic who puts only half as much confidence in our models as we do would appreciate that we’ve put in the effort.
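To show what stands behind a statement like the 30% figure above, here is a minimal sketch on synthetic data; in practice the inputs would be the target’s revenue history and the macro series gathered earlier, and the share of variance explained is one way to express cyclical exposure.

```python
# A minimal sketch, on synthetic data, of quantifying how tied revenue is to
# the business cycle; real inputs would be the target's revenue history and
# the macro series gathered earlier.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 60  # five years of monthly observations

cycle = rng.normal(size=n)                                     # stand-in cycle indicator
revenue_growth = 0.6 * cycle + rng.normal(scale=1.0, size=n)   # synthetic revenue growth

model = sm.OLS(revenue_growth, sm.add_constant(cycle)).fit()
print(f"share of revenue variation explained by the cycle: {model.rsquared:.0%}")
print(model.conf_int())  # confidence bounds keep the uncertainty explicit
```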
Lastly, analytics during a diligence process can help plan what to do with the company after the deal closes. Sometimes, it’s as simple as knowing we’ll need to invest more in our data architecture. Other times, it’s knowing that there’s an opportunity to run one of the business’s processes a better way, even if we haven’t built out the full model. And finally, in the best cases, a predictive model built during diligence can develop into a tactical plan of how to invest in the business and create value.