
Another analytic practitioner speaks – an interview with Tracy Altman

Last year I interviewed Andrea Scarso, CEO of MoneyFarm, about analytics. That was a hugely popular post, so I thought I would continue the series this year by interviewing other analytic practitioners. The first in this series is an interview with Tracy Allison Altman, co-founder of Ugly Research. Ugly Research is developing PepperSlice, an analytics application for explaining recommendations to decision makers. She’s on Twitter @EvidenceSoup, blogs at EvidenceSoup.com, and you can read about her work at Ugly Research.

What’s your background, how did you come to be working in analytics?

I started out as an engineer, evaluating oil and gas investment decisions. This sparked my interest in data and analytical methods, so I got a Master’s (Computer Science concentration) and then a PhD in Public Policy Analysis. But after years in management roles, I was dissatisfied with the available methods for explaining outcomes and making recommendations to decision makers. I created something that worked better for me, and now I’ve founded a startup to make this method available at PepperSlice.

What are the primary kinds of analytics you build at the moment?

My work at PepperSlice is about delivering analytics to human decision makers: To support strategic or one-off tactical decisions, rather than automated ones. We emphasize the communication of analytical findings, aiming to include enough specifics for rigorous evaluation, but without smothering people in detail.
To structure decisions, I’ve developed a straightforward way to model ‘action-outcome’ pairs, which involves aggregating the analytics that establish these linkages. In the PepperSlice web application, people use analytics to explain particular outcomes – whether predicted or actual.
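The interview doesn’t specify how these ‘action-outcome’ pairs are represented, but the idea can be sketched in code. The following is a minimal illustration, assuming a pair records an action, an outcome (predicted or actual), and the aggregated analytics that establish the linkage; all names and fields here are hypothetical, not PepperSlice’s actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class ActionOutcomePair:
    """Illustrative model of an action-outcome linkage (hypothetical, not PepperSlice's API)."""
    action: str          # the action taken or under consideration
    outcome: str         # the predicted or actual result
    is_predicted: bool   # True for a forecast, False for an observed outcome
    evidence: list = field(default_factory=list)  # analytics supporting the link

    def add_evidence(self, method: str, finding: str) -> None:
        # Aggregate the analytics that establish this action-outcome linkage
        self.evidence.append({"method": method, "finding": finding})

# Hypothetical usage: explain an observed outcome with supporting analytics
pair = ActionOutcomePair(
    action="Lower the free-shipping threshold",
    outcome="Cart conversion rose 4%",
    is_predicted=False,
)
pair.add_evidence("A/B test", "Treatment group converted 4% more than control")
```

The point of the structure is the one Tracy makes: the decision (action and outcome) sits at the top level, with the analytics attached as supporting evidence rather than presented on their own.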
I’m also investigating the performance of individual analytical methods. I’ve developed a way to capture information about the tools people are using, and evaluate their value in making various types of decisions.

In your experience what are some of the top challenges for analytic professionals in terms of maximizing the business impact of what they do?

To improve business impact, analytics professionals need to establish their legacy. They need to better communicate their contributions to results. And they need to assume stronger leadership roles, by modeling analytical behavior and guiding decision analysis within their organizations.
A key challenge is the lack of an effective user interface/presentation layer between the analytics and the decision maker. It’s difficult for analytics professionals to tell a powerful value story, and equally difficult for executives to get a concise, effective explanation of important outcomes. Conventional dashboards, and other analytical interfaces, don’t look like decisions; they don’t identify actions that were (or could be) taken, or connect actions with specific outcomes. Analytics professionals could have much more impact by more clearly demonstrating their contributions to important decisions.
An associated problem for analytics professionals is figuring out how to simplify information without making it simplistic. Understandably, there’s tension between the executive asking “Can’t you just give me a number?” and the analyst who has lots to say about the data. Oftentimes the explanation is too disconnected from the data, which can impose a substantial burden on an organization: Closing that gap is my primary objective.
Decision makers need to experience and believe the data and analytics they’re given – without being data scientists or analytics experts themselves. The way I see it, there are two ways to present a decision problem: You either highlight the key points, or you show all the associated complexity. Both approaches have weaknesses, but I choose to begin with a simple, executive-level representation and let people drill down into more detail.

What have you found that helps meet these challenges? How have you evolved your approach to analytics to maximize the business impact of what you do?

I’ve developed a streamlined, structured format for presenting analytical findings to decision makers; it explicitly links actions with outcomes, and shows what analytics were applied to establish that linkage. Essentially, it’s a vehicle for analytics professionals to demonstrate value. My goal is to connect the explanation with the data, so people can see what is happening, and why – all in one user interface.
Here’s how my approach has evolved to better address business needs:

  1. I’ve learned to focus on informing the immediate decisions, the ones being made today. Those are the decisions that matter.
  2. I’ve discovered the importance of evaluating the different analytical methods. Decision makers need to know which methods are most useful for making which decisions.

How, specifically, do you develop requirements for analytic projects?

I help people evaluate the actions available to them as they seek to improve outcomes. Typically, I’ll walk them through my version of decision analysis. Requirements begin with desired outcomes: What are the decision maker’s fundamental objectives? What intermediate objectives will help them get there, and what metrics or KPIs can we use to monitor progress?
The next step is identifying actions that can influence important outcomes. How do we recognize these, and how can they be analyzed? Collective intelligence supports this discovery: Querying decision makers, interviewing subject matter experts, and performing data analytics.
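The requirements sequence described above (fundamental objectives, then intermediate objectives with metrics, then candidate actions) can be sketched as a simple structure. This is only an illustration of the ordering, with hypothetical names and example values, not a representation of Tracy’s actual method:

```python
from dataclasses import dataclass, field

@dataclass
class DecisionRequirements:
    """Illustrative capture of requirements, in the order described: outcomes first."""
    fundamental_objective: str                              # the decision maker's end goal
    intermediate_objectives: list = field(default_factory=list)
    kpis: dict = field(default_factory=dict)                # metric name -> target value
    candidate_actions: list = field(default_factory=list)   # actions that may influence outcomes

# Hypothetical walkthrough: start from the desired outcome, then work backwards
reqs = DecisionRequirements(fundamental_objective="Reduce customer churn")
reqs.intermediate_objectives.append("Improve onboarding completion")
reqs.kpis["onboarding_completion_rate"] = 0.80
reqs.candidate_actions.append("Add an in-app setup checklist")
```

Actions are deliberately the last field populated: they are only identified once the objectives and metrics that would judge them are in place.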

There’s a growing interest in rigorously modeling decisions as part of specifying the requirements for an analytics project. How do you see this approach adding value for analytics professionals?

This is a must. If analytics projects aren’t also decision modeling projects, then what are they?
Analytics professionals can add substantial value by taking the lead on decision analysis, guiding people to address this fundamental question: What’s a ‘good’ decision? What forms of knowledge are required to move forward with a decision? What’s nice-to-have, and what’s must-have?

Anything else you would like to share?

We need to remember that our purpose is to figure out how to improve outcomes, and that requires much more than hard data. It’s tempting to treat the fuzzier stuff – and the people who apply it – as second-class citizens in a data-driven world. But we do so at our peril.

Last question – what advice would you give analytic professionals to help them maximize the value they create for their organization?

  1. Establish your legacy. Model analytical behavior and decision analysis for your organization. Assume a leadership role to foster a sophisticated, decision-aware culture: One where people truly understand the difference between ‘good’ and ‘bad’ decisions. Too many professionals still don’t recognize that an unwanted outcome doesn’t necessarily signal a bad decision process, or an incompetent decision maker.
  2. Demonstrate the value of your analytics. Decision makers should know what analytics were performed, why they were done, and how they contributed.
  3. Avoid the temptation of “Ooh, shiny!” and focus on improving outcomes. First, identify what needs to be known, what variables need to be understood. Then select the analytics tools that will help you get there.

You can follow Tracy on Twitter @EvidenceSoup and read her blog at EvidenceSoup.com. For more on decision modeling for analytic projects, check out our white paper.

