Landing talent. Building momentum. Scaling. If you are building an analytical marketing capability, these three challenges can keep you up at night.

Today I introduce you to someone who has written the book on these topics: Cesar Brea. I’ve interviewed Cesar about his recently published book, Marketing and Sales Analytics: Proven Techniques and Powerful Applications from Industry Leaders.

I first met Cesar through the serendipitous chain of networking that I delight in. If there’s anyone who can get quant-fearing people to embrace analytics, and have fun and feel smart getting there, Cesar is the guy.

Cesar has a warm and self-effacing quality that belies his ferocious smarts and impressive professional pedigree. He’s also the father of triplets!

We discuss topics that many marketing analytics leaders wrestle with:

  • The right and wrong way to establish a marketing analytics capability
  • Building momentum for marketing analytics in 90-day chunks
  • Landing analytics talent with the right aptitude, attitude, and altitude
  • The dangers of “analytic Ferraris burning out their clutches”

  • Your book crystallizes best practices for guiding analytics efforts, and showcases fifteen executives leading analytics organizations — including several readers of this newsletter! What’s the main point of the book?

Success with analytics is more about getting the conditions right than about the brilliance or sophistication of any single insight, analyst, or tool. The book looks at all the conditions necessary for analytics to be valuable, from the perspectives of senior folks living in the real world, where establishing and sustaining those conditions is a daily grind.

  • You’re actually not a career analyst. Rather, you’ve held senior roles at Bain, Razorfish, Monitor, and tech startups, and now you work with major brands on marketing analytics. How does your career path inform your perspective?

My ‘angle’ on the analytics wave is the observation that there’s too much focus on the means – the ideas, the tools, the advanced degrees – and not enough on the end results. I’m much more interested in those results, and in how they should frame the approach and the related investments.

  • One of your conditions for success is “operational flexibility.” Let’s talk about building that capability for an organization. You say in your book that it is risky to “start with an under-performing operation built on spaghetti and plan for a smooth multi-year transition to a fully integrated on-premise option. That just puts too many moving parts into play, with too high an up-front, bet-on-the-come investment.” What works better?

It works better to tune investments in operational flexibility iteratively, pacing them against a steady, rising stream of results.

Focus on delivering something of business value — a tangible, quantifiable result expressed in terms of an operating or financial metric — every 90 days. This pace forces you to pursue simple infrastructure, often SaaS-based, at first. Build a string of these wins into a portfolio. The portfolio will include initiatives at different stages of maturity – some long-term, some short-term. As the portfolio grows, you can explore more sophisticated infrastructure. The key is to remember that infrastructure is a means, not an end.

  • Why 90 days?

It’s all about a high return on insight for the time invested. By bounding things time-wise, you work with what you have, and you get to value more quickly.

Plus, we’ve all experienced diminishing returns to analytic complexity. Organizing work in 90-day cycles (or even shorter, as I describe in the book) forces you to phase your work so you’re conscious of these diminishing returns.

In the world of analytics I see a lot of data and infrastructure projects that go on way longer than 90 days – sometimes two years or more. But if the CEO needs to go to the board with an update every 90 days, then others in the organization should think in the same cadence for some meaningful portion of their work.

After 90 days, if you don’t have something to show, your credibility goes down.

Marketing analytics leaders should constantly be thinking about momentum and paying the freight, just like the business leaders they serve. What is your quarterly contribution to the cause?

  • How else can analytics leaders get results faster?

The web analytics guru Avinash Kaushik has described three legs of the analytics stool: research, data analysis, and testing. The most effective thing I’ve seen is being flexible enough to move readily from one mode to another, as the questions demand. But most organizations keep these disciplines in separate silos, so one of them can keep pounding its hammer longer than necessary before the work moves on to the next. When recruiting analytics people, screen for experience partnering across all three.

  • Once the analytics underpinnings are there, how should the analytics organization scale?

A smart analytic strategy starts simple and doesn’t expand any faster than the firm’s ability to absorb the insights it generates. Your staffing efforts should recognize and follow the same pattern. You don’t want analytic Ferraris running in first and second gear all the time, burning out their clutches in the heavy traffic of your operating realities.

From a recruiting standpoint, it’s a matter of knowing what gear you’re in. Don’t hire super-specialized analytic folks until you’ve got an engine that’s realizing value from simpler opportunities. You won’t be able to take advantage of them, and they’ll get frustrated that they can’t fully apply their talents and skills. You’re better off starting with a proven “analytic marketer”: someone who has experience with analysis but is first and foremost a good, practical executive, able to span the think-act divide and communicate really well.

At the scaling stage, you have to challenge your team to do two things: embrace simplification, and automate what they are doing.

  • Won’t people be unmotivated to automate themselves out of a job? Or simply bored?

I encourage people to embrace automation since it broadens their relevance. This is about enhancing their job and increasing their altitude. You don’t want folks who are happy to spend 80% of their time performing the same manual process over and over, in perpetuity.

  • How can you tell, when interviewing someone, whether they have the practicality to get their analysis acted on?

Their stories tend to start with: “Well, we weren’t growing as fast as we needed to, so we…” Or, “We were launching a new product, and we needed to reach…” What they don’t do is start by telling me about the analysis per se: “Well, we started by building a model…”

  • You write that “Any good analyst can extract value from dirty, incomplete data, at least enough to get a sense for whether there’s value worth pursuing further.” Discuss.

Good analytic marketers aren’t afraid to formulate hypotheses about potential actions based on the data they have. Then they do some math: “What could this opportunity be worth? What would be the minimally useful next step to validate it? How much would that step cost me?” They’re thinking backwards from goals, putting their analysis in the context of value, costs, and risks, rather than treating it as a project with an arbitrary confidence threshold that demands a certain standard of data before the analysis can even begin.

  • How should hiring managers respond to the shortage of analytic talent?

Given that good analytics folks are scarce, the worst thing you can do is look for them if you haven’t addressed the other conditions I describe in the book.

Once you have addressed them, realize that a more narrowly skilled person may be more effective than a “rock star.” At one of my clients, access to data is a challenge, so we’ve traded off statistical modeling experience for good data-wrangling skills. You can’t fly a jet without fuel, and I’d take a fueled propeller plane over an unfueled jetliner any day. We figure we can always get good coaching on model-building, but if we can’t navigate the organization and scrounge the data we need, we’re nowhere.

Also, think in terms of small teams. You likely need three things: data access; modeling and analysis; and communications/organizational savvy. Teamlets that together fill those needs work well. But if you do go the team route, keep an eye on interpersonal dynamics. Team members need to embrace their complementarity rather than resent it. The former leads to synergy; the latter produces antagonism and destroys productivity.

When talent is scarce, look for people who are analytically curious but come from outside marketing. Finance people who are really curious can make great transfers into marketing analytics.

  • How about hiring a stats Ph.D. onto an analytics team?

One person I interviewed for the book points out that the Ph.D. statisticians coming out of academic programs seem well-suited to analytics roles, but the paradigms they bring rarely hold true in the real world. You’re better off finding people for whom getting data and doing statistical analysis on it has been a means to an end: a sociologist, for instance, who learned stats to apply it to their field. They will be more proactive and cause-driven, and won’t bring “classic” assumptions, like bell-curve distributions, to the party.

With highly sophisticated Ph.D. statisticians on the team, you may be tempted to push every problem out to that level of sophistication. Instead, look for people who embrace the 90-day orientation. It will keep you from over-scoping.

  • Who was the best analytical marketer you’ve ever worked with?

The best one was, surprisingly, not an analytics person! He was a line-of-business leader at FAO Schwarz, the toy company. He taught himself to write SQL because he was intensely curious about the business he was directly responsible for driving. He showed that no one was above the grungy work of querying databases, and his actions built the credibility of the analytics function.

On the more junior end, I recently worked with an analyst in financial services who wasn’t afraid to ask why we were doing a project, and who quickly absorbed the explanation once I gave it. Once he got the goal, he started proposing other questions to ask and other ways of looking at the data. It made for a very productive partnership.

  • How about the worst?

The least successful analysts I’ve known were all about skills. They knew how to build models in SAS, or knew some other tool really well, but they waited to be told what to do. Skill is necessary but insufficient for success in analytics.

  • So, what should we look for when hiring marketing analytics folks?

Being self-directed in their learning is key. Look for a demonstrated inclination to Read The Manual and keep learning. At the technical level, everything is changing fast, so the ability to learn fast is important.

Ultimately you’re looking for people who understand that they have to improve the performance of this business by X and won’t let any obstacle stand in their way. This is really about attitude: someone who wants to see stuff happen and drive things ahead.

My view is that the market is too skills-focused today, and insufficiently focused on past experiences that demonstrate these attitudes.

  • It sounds like the holy trinity here is aptitude, attitude, and altitude. How do you assess altitude?

You can be open-ended when evaluating someone more senior. With younger folks, be more structured. I’ll say, “Tell me about the business that you worked with. What customers did you serve? What products did you offer them? Which performed better? Where was the uncertainty in your planning process? How did you model things?”

Guide the interview more. Don’t ask at the outset whether they have built a linear regression model. In the book I suggest some questions I use when I interview a senior team at the beginning of an engagement, like, “So, how’s business?” In an interview with a senior candidate, I might ask a variant of this, like, “So, tell me about the business you supported.” I also describe the different answers I listen for, which tell me a lot about where someone is coming from and what that could mean for their analytic orientation.

  • You write about reactive versus proactive analytics organizations. How can a job candidate tell one from the other?

A reactive organization will talk about the analytic function, the kinds of projects it does, and the process it uses. These are important, of course. But a proactive one will talk first about the business, about its needs and challenges, about *both* the analysis *and* the actions being taken on that analysis, and about the results. Then a good analytics manager will talk about how they’ve organized and run their function to support those business realities. Good examples are how they organize to support highly distributed businesses with different characteristics, or how they manage the process for decision-support needs on different time scales.

  • What one piece of advice do you have for someone staffing up an analytics effort?

In the book I say, “Momentum is strategic.” Make sure you’ve got something to show for your efforts every 90 days. As you do so, the nature of the opportunities and the results you are achieving — and are challenged to achieve — will tell you lots about the kind of people you need.

  • Thank you! How can we get the book?

Go to: www.marketingandsalesanalytics.com

You’ll make my mother very happy! Not to mention my publisher.