How to Optimize Your Marketing Spend

There are formal methods and tools you can use to optimize marketing spend, including software from SAS, IBM and HP (among others).  The usefulness of these methods, however, depends on basic disciplines that are missing from many Marketing organizations.

In this post I’d like to propose some informal rules for marketing optimization.  These do not exclude using formal methods as well — think of them as organizing principles to put in place before you go shopping for optimization software.

(1) Ignore your agency’s “metrics”.

You use agencies to implement your Marketing campaigns, and they will be more than happy to provide you with analyses that show how much value you’re getting from the agency.  Ignore them.  Asking your agency to measure the results of the campaigns it implements is like asking Bernie Madoff to serve as the custodian for your investments.

Every agency analyst understands that the role of analytics is to make the account team look good.   This influences the analytic work product in a number of ways, from use of bogus and irrelevant metrics to cherry-picking the numbers.

Digital media agencies are very good at execution, and they should play a role in developing strategy.  But if you are serious about getting the most from your Marketing effort, you should have your own people measure campaign results, or engage an independent analytics firm to perform this task for you.

(2) Use market testing to measure every campaign.

Market testing isn’t the gold standard for campaign measurement; it’s the only standard.  The principle is straightforward: you assign marketing treatments to prospects at random, including a control group who receive no treatment.  You then measure subsequent buying behavior among members of the treatment and control groups; the campaign impact is the difference between the two.

The beauty of test marketing is that you do not need a hard link between impressions and revenue at the point of sale, nor do you need to control for other impressions or market noise.  If treatments and controls are assigned at random, any differences in buying behavior are attributable to effects of the campaign.
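To see how the arithmetic works, here is a minimal sketch in Python, using entirely hypothetical numbers: measure response in both groups, compute the lift, and check that the difference is larger than chance alone would produce (a standard two-proportion z-test).

    # A minimal sketch of randomized campaign measurement.
    # All figures below are hypothetical illustrations, not client data.
    from math import sqrt

    treated, treated_buyers = 50_000, 1_150   # received the campaign
    control, control_buyers = 50_000, 1_000   # randomly held out

    p_t = treated_buyers / treated            # treatment response rate
    p_c = control_buyers / control            # control response rate
    lift = p_t - p_c                          # campaign impact per prospect

    # Two-proportion z-test: is the lift distinguishable from noise?
    p_pool = (treated_buyers + control_buyers) / (treated + control)
    se = sqrt(p_pool * (1 - p_pool) * (1 / treated + 1 / control))
    z = lift / se                             # z > 1.96: significant at 5%

    avg_order = 80.00                         # hypothetical average order value
    incremental_revenue = lift * treated * avg_order

    print(f"lift: {lift:.4%} per prospect, z = {z:.2f}")
    print(f"estimated incremental revenue: ${incremental_revenue:,.0f}")

If treatments and controls are truly assigned at random, nothing more exotic than this is needed to establish the campaign’s impact.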

Testing takes more effort to design and implement, which is one reason your agency will object to it.  The other reason is that rigorous testing often shows that brilliant creative concepts have no impact on sales.  Agency strategists tend to see themselves as advocates for creative “branding”; they oppose metrics that expose them as gasbags.  That is precisely why you should insist on it.

(3) Kill campaigns that do not cover media costs.

Duh, you think.  Should be obvious, right?  Think again.

A couple of years ago, I reviewed the digital media campaigns for a big retailer we shall call Big Brand Stores.  Big Brand ran forty-two digital campaigns per fiscal year; stunningly, exactly one campaign — a remarketing campaign — showed incremental revenue sufficient to cover media costs.  (This analysis made no attempt to consider other costs, including creative development, site-side development, program management or, for that matter, cost of goods sold.)

There is a technical term for campaigns that do not cover media costs.  They’re called “losers”.

The client’s creative and media strategists had a number of excuses for running these campaigns, such as:

  • “We’re investing in building the brand.”
  • “We’re driving traffic into the stores.”
  • “Our revenue attribution is faulty.”

Building a brand is a worthy project; you do it by delivering great products and services over time, not by spamming everyone you can find.

It’s possible that some of the shoppers rummaging through the marked-down sweaters in your bargain basement saw your banner ad this morning.  Possible, but not likely; it’s more likely they’re there because they know exactly when you mark down sweaters every season.

Complaints about revenue attribution usually center on the “last click” versus “full-funnel” debate, a tiresome argument you can avoid by insisting on measurement through market testing.

If you can’t measure the gain, don’t do the campaign.
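Once principle (2) is in place, the rule is mechanical.  A sketch, again with hypothetical numbers:

    # Hypothetical campaign results: test-measured incremental revenue
    # versus media cost.  A campaign that cannot cover media costs is a loser.
    campaigns = [
        ("Remarketing",       410_000, 240_000),
        ("April celebrity",    55_000, 300_000),
        ("Spring banner buy",  20_000, 150_000),
    ]

    for name, incremental_revenue, media_cost in campaigns:
        verdict = "keep" if incremental_revenue > media_cost else "kill"
        print(f"{name}: {verdict}")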

(4) Stop doing one-off campaigns.

Out of Big Brand’s forty-two campaigns, thirty-nine were one-offs: campaigns run once and never again.  A few of these made strategic sense: store openings, special competitive situations and so forth.  The majority were simply “concepts”, based on the premise that the client needed to “do something” in April.

The problem with one-off campaigns is that you learn little or nothing from them.  The insight you get from market testing enables you to tune and improve one campaign with a well-defined value proposition targeted to a particular audience.  You get the most value from that insight when you repeat the campaign.  Marketing organizations stuck in the one-off trap never build the knowledge and insight needed to compete effectively.  They spend much, but learn nothing.

Allocate no more than ten percent of your Marketing spend to one-off campaigns.  Hold this as a reserve for special situations — an unexpected competitive threat, product recall or natural disaster.  Direct the rest of your budget toward ongoing programs defined by strategy.  For more on that, read the next section.

(5) Drive campaign concepts from strategy.

Instead of spending your time working with the agency to decide which celebrity endorsement to tout in April, develop ongoing programs that address key strategic business problems.  For example, among a certain segment of consumers, awareness and trial of your product may be low; for a credit card portfolio, share of revolving balances may be lagging competing cards among certain key segments.

The exact challenge depends on your business situation; what matters is that you choose initiatives that (a) can be influenced through a sustained program of marketing communications and (b) will make a material impact to your business.

Note that “getting lots of clicks in April” satisfies the former but not the latter.

This principle assumes that you have a strategic segmentation in place, because segmentation is to Marketing what maneuver is to warfare.  You simply cannot expect to succeed by attempting to appeal to all consumers in the same way.  Your choice of initiatives should also demonstrate some awareness of the customer lifecycle; for example, you don’t address current customers in the same way that you address prospective customers or former customers.

When doing this, keep the second and third principles in mind: a campaign concept is only a concept until it is tested.  A particular execution may fail market testing, but if you have chosen your initiatives well you will try again using a different approach.  Keep in mind that you learn as much from failed market tests as from successful market tests.

Dell Buys StatSoft

Dell announced this morning that it has acquired StatSoft, a privately held company that develops Statistica, a suite of software for statistics and data mining.  Terms of the sale were not announced.

Founded by academics in 1984, StatSoft has developed a loyal following at the low end of the analytics market, where it offers a reasonably priced alternative to SAS and SPSS.  The Statistica software suite includes a number of modules that support statistics, multivariate analysis, data mining, ETL, real-time scoring, quality control, process control and vertical solutions.  Relative to other statistical software packages on the market, Statistica’s support for analytic features is comprehensive.

[Screenshot: Statistica 12.0 plot window]

Statistica appeals to a core group of loyal and satisfied users.  In the most recent Rexer data mining survey, Statistica ranked eleventh overall in reported use, but ranked second in reported primary use; the product scored at the top of the list in user satisfaction.  According to Rexer’s segmentation, Statistica has the highest penetration among users who are new to data mining, rarely work with Big Data, place a high value on ease of use, and do not want to write their own code.

StatSoft supports desktop and server editions of Statistica on Windows only; that should fit well with Dell’s hardware business.  What does not make sense is Dell’s claim that this acquisition “bolsters its portfolio of Big Data Solutions”; Statistica lacks support for distributed computing, and does not run in databases or Hadoop.

2014 Predictions: Advanced Analytics

A few predictions for the coming year.

(1) Apache Spark matures as the preferred platform for advanced analytics in Hadoop.

Spark will achieve top-level project status in Apache by July; that milestone, together with inclusion in Cloudera CDH5, will validate the project’s rapid maturation.  Organizations will increasingly question the value of “point solutions” for Hadoop analytics versus Spark’s integrated platform for machine learning, streaming, graph engines and fast queries.
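For illustration only, here is a minimal PySpark sketch, written against the Spark 1.x-era MLlib API with a hypothetical file path and schema, of what “integrated platform” means in practice: one engine and one cached dataset serve both data preparation and machine learning, with no export step between tools.

    # A sketch of Spark's integrated model (Spark 1.x-era MLlib API).
    # The HDFS path and CSV layout (label first, then features) are hypothetical.
    from pyspark import SparkContext
    from pyspark.mllib.classification import LogisticRegressionWithSGD
    from pyspark.mllib.regression import LabeledPoint

    sc = SparkContext(appName="IntegratedAnalytics")

    def parse(line):
        fields = [float(x) for x in line.split(",")]
        return LabeledPoint(fields[0], fields[1:])

    # Data preparation and model training share one engine and one dataset.
    points = sc.textFile("hdfs:///data/observations.csv").map(parse).cache()
    model = LogisticRegressionWithSGD.train(points, iterations=100)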

At least one commercial software vendor will release software using Spark as a foundation.

Apache Mahout is so done that speakers at the recent Spark Summit didn’t feel the need to stick a fork in it.

(2) “Co-location” will be the latest buzzword.

Most analytic tools can connect with Hadoop, extract data and drag it across the corporate network to a server for processing; that capability is table stakes.  Few, however, can integrate directly with MapReduce for advanced analytics with little or no data movement.

YARN changes the picture by enabling integration of MapReduce and non-MapReduce applications.  In practice, that means it will be possible to stand up co-located server-based analytics (e.g. SAS) on a few nodes with expanded memory inside Hadoop.  This asymmetric architecture adds some latency (since data moves from the HDFS data nodes to the analytic nodes), but not as much as when data moves outside of Hadoop entirely.  For most analytic use cases, the cost of data movement will be more than offset by the improved performance of in-memory iterative processing.
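As a back-of-envelope illustration of that tradeoff (all numbers hypothetical):

    # Moving data to co-located analytic nodes over the cluster's internal
    # network vs. dragging it across the corporate network to an external server.
    data_gb = 500
    intra_cluster_gb_per_s = 10 / 8    # 10 GbE inside the cluster
    corporate_wan_gb_per_s = 1 / 8     # 1 GbE across the corporate network

    print(f"co-located: {data_gb / intra_cluster_gb_per_s / 60:.1f} minutes")
    print(f"external:   {data_gb / corporate_wan_gb_per_s / 60:.1f} minutes")

Either way the data moves; one way keeps the analysts waiting a few minutes, the other for roughly an hour.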

It’s no coincidence that Hortonworks’ partnership with SAS is timed to coincide with the release of HDP 2.0 and production YARN support.

[Image: SAS and HDP]

(3) Graph engines will be hot.

Not that long ago, graph engines were exotic.  No longer: a wide range of maturing applications, from fraud detection and social media analytics to national security, rely on graph engines for graph-parallel analytics.

GraphLab leads in the space, with Giraph and Tez well behind; Spark’s GraphX is still in beta.  GraphX has already achieved performance parity with Giraph and it has the advantage of integration with the other pieces of Spark.  As the category matures, analysts will increasingly see graph analysis as one more arrow in the quiver.
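To make “graph-parallel analytics” concrete, here is a toy, single-machine sketch using the networkx library and hypothetical payment data; engines like GraphLab and GraphX exist to run exactly this class of computation across a cluster.

    # PageRank over an account-to-account payment graph (hypothetical data).
    # Accounts that accumulate influence in the network are hubs worth a
    # closer look in a fraud investigation.
    import networkx as nx

    payments = [
        ("acct_a", "acct_b"), ("acct_b", "acct_c"), ("acct_c", "acct_a"),
        ("acct_d", "acct_a"), ("acct_e", "acct_a"),
    ]
    g = nx.DiGraph(payments)

    scores = nx.pagerank(g, alpha=0.85)
    for account, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{account}: {score:.3f}")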

(4) R approaches parity with SAS in the commercial job market.

R already dominates SAS in broad-based analyst surveys, but SAS still beats R in commercial job postings.  Job postings for R programmers, however, are growing rapidly, while SAS postings are declining.  New graduates decisively prefer R over SAS, and organizations increasingly recognize the value of R for “hard money” analytics.

(5) SAP emerges as the company most likely to buy SAS.

“Most likely” as in “only logical” suitor.  IBM no longer needs SAS, Oracle doesn’t think it needs SAS, and HP has too many other issues to address before taking on another acquisition.  A weak dollar favors foreign buyers, and SAS does substantial business outside the US.  SAP lacks street cred in analytics (and knows it), and is more likely than the others to agree to Jim Goodnight’s inflated price and terms.

Will a transaction take place this year?   Hard to say; valuations are peaking, but there are obstacles to sale, as I’ve noted previously.

(6) Competition heats up for “easy to use” predictive analytics.

For hard money analytics, programming tools such as SAS and R continue to dominate.  But organizations increasingly seek alternatives to SAS and SPSS for advanced analytic tools that are (a) easy to use, and (b) relatively inexpensive to deploy on a broad scale.  SAS’ JMP and Statistica are existing players, with Alteryx, Alpine and RapidMiner entering the fray.  Expect more entrants as BI vendors expand offerings to support more predictive analytics.

Vertical and horizontal solutions will be key to success in this category.  It’s not enough to have a visual interface; “ease of use” means “ease of use in context”.   It is easier to develop a killer app for one use case than for many.  Competitive forces require smaller vendors to target use cases they can dominate and pursue a niche strategy.

Comments on “SAS’ Nimble Dance”

There are many subjects in analytics more interesting than the SAS PR operation, but my Google Tracker pinged a few times this week after SAS successfully planted this article in the New York Times.  This morning I feel like shooting fish in a barrel, so here are four brief comments.

Steve Lohr writes:

    In 2009, I wrote a long piece that looked at SAS and the challenges it faced. The headline read, “At a Software Powerhouse, the Good Life Is Under Siege.”

The piece in question — which looked like an IBM plant at the time — drew great mirth at SAS, especially the part about how hard-working SAS managers check email while driving home.   That explains why some say it’s dangerous to cross Campus Drive at 5:05.

    A new version, coming in June, will be able to run entirely in remote “cloud” data centers. “It’s a complete cloud distribution, totally cloud-ready,” James Goodnight, co-founder and chief executive of SAS, said in an interview… Those clouds can be private ones operated by companies or government agencies. But SAS has its own hosted data centers, and its software now also runs on Amazon’s Web Services cloud.

Someone should explain to Mr. Lohr that the point of “Infrastructure as a service” is that you do not need special software, unless of course your software license agreement is unduly restrictive, or your software vendor has a cumbersome license key.  “Now” appears to be 2011, according to this thread from the SAS support site; running SAS on AWS is not exactly a new thing, although the thread suggests that doing so remains a bit of a science project.  SAS crowing that its software is “now ready for Cloud” reminds me of folks at Electronic Arts crowing that you can now run SimCity because they bought more servers.

    On Wednesday, SAS executives came to New York for an event at the Pierre Hotel to show off its retooled technology to customers. The code has been rewritten to run on modern hardware — so-called massively parallel computers…. SAS has developed new visual tools — so users can do data analysis with a point-and-click on a laptop, or swipe-and-tap on an iPad tablet, as SAS demonstrated this week. The goal is to broaden the base of SAS users well beyond its traditional core of SAS-trained data experts. “Democratizing data is exactly what this is about,” said James Davis, an SAS senior vice president and chief marketing officer.

“Democratizing data” may or may not be a smart strategy for SAS; time will tell.  But what about those “SAS-trained data experts”? What does this announcement mean for them?  SAS seems to be telling them that they have been working with software not designed to run on modern hardware, a point that many loyal SAS customers will be surprised to learn.

    As a private company, SAS does not report its financial results. But Mr. Goodnight said its revenue grew 5.5 percent last year, held down by weakness in Europe and a strong dollar against the euro, which reduced reported sales. Europe is about the size of the United States as a market for SAS.

Nice try, Dr. Goodnight.  In 2012 the dollar declined against the euro by about three percent, which increased the dollar-denominated value of European sales.  In any event, all software companies operate in the same currency environment and, as noted here, IBM and SAP reported double-digit growth in 2012.

Book Review: Big Data Big Analytics

Big Data Big Analytics: Emerging Business Intelligence and Analytic Trends for Today’s Businesses, by Michael Minelli, Michele Chambers and Ambiga Dhiraj.

Books on Big Data tend to fall into two categories: they are either “strategic” and written at a very high level, or they are cookbooks that tell you how to set up a Hadoop cluster.  Moreover, many of these books focus narrowly on data management — an interesting subject in its own right for those who specialize in the discipline, but yawn-inducing for managers in Sales, Marketing, Risk Management, Merchandising or Operations who have businesses to run.

Hey, we can manage petabytes of data.  Thank you very much.  Now go away.

Big Data Big Analytics appeals to business-oriented readers who want a deeper understanding of Big Data, but aren’t necessarily interested in writing MapReduce code.   Moreover, this is a book about analytics — not just how we manage data, but what we do with it and how we make it drive value.

The authors of this book — Michael Minelli, Michele Chambers and Ambiga Dhiraj — combine in-depth experience in enterprise software and data warehousing with real-world experience delivering analytics for clients.  Building on interviews with a cross-section of subject matter experts — there are 58 people listed in the acknowledgements — they map out the long-cycle trends behind the explosion of data in our economy, and the expanding tools to manage and learn from that data.  They also point to some of the key bottlenecks and constraints enterprises face as they attempt to deal with the tsunami of data, and provide sensible thinking about how to address these constraints.

Big Data Big Analytics includes rich and detailed examples of working applications.  This is refreshing; books in this category tend to push case studies to the back of the book, or focus on one or two niche applications.  This book documents the disruptive nature of Big Data analytics across numerous vertical and horizontal applications, including Consumer Products, Digital Media, Marketing, Advertising, Fraud and Risk Management, Financial Markets and Health Care.

The book includes excellent chapters describing the technology of Big Data, along with chapters on Information Management, Business Analytics and Human Factors — people, process, organization and culture.  The final chapter is a good summary of Privacy and Ethics.

The Conclusion aptly summarizes this book: it’s not how much data you have, it’s what you do with it that matters.  Big Data Big Analytics will help you get started.

Analytic Applications (Part Two): Managerial Analytics

This is the second in a four-part taxonomy of analytics based on how the analytic work product is used.  In the first post of this series, I covered Strategic Analytics, or analytics that support the C-suite.  In this post, I will cover Managerial Analytics: analytics that support middle management, including functional and regional line managers.

At this level, questions and issues are functionally focused:

  • What is the best way to manage our cash?
  • Is product XYZ performing according to expectations?
  • How effective are our marketing programs?
  • Where can we find the best opportunities for new retail outlets?

There are differences in nomenclature across functions, as well as distinct opportunities for specialized analytics (retail store location analysis, marketing mix analysis, new product forecasting), but managerial questions and issues tend to fall into three categories:

  • Measuring the results of existing entities (products, programs, stores, factories)
  • Optimizing the performance of existing entities
  • Planning and developing new entities

Measuring existing entities with reports, dashboards, drill-everywhere and the like is the sweet spot for enterprise business intelligence systems.  Such systems are highly effective when the data is timely and credible, the reports are easy to use and the system reflects a meaningful assessment framework.  This means that metrics (activity, revenue, costs, profits) reflect the goals of the business function and are standardized to enable comparison across entities.
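As a small illustration of what “standardized to enable comparison” means, consider store revenue (hypothetical figures): raw revenue flatters big stores, so normalize before ranking.

    # Hypothetical store results: raw revenue is not comparable across
    # stores of different sizes, so standardize to revenue per square foot.
    import pandas as pd

    stores = pd.DataFrame({
        "store":   ["Downtown", "Mall", "Suburb"],
        "revenue": [2_400_000, 1_800_000, 950_000],
        "sq_ft":   [40_000, 22_000, 9_000],
    })
    stores["rev_per_sqft"] = stores["revenue"] / stores["sq_ft"]
    print(stores.sort_values("rev_per_sqft", ascending=False))

In this toy example the smallest store turns out to be the best performer, a fact that raw revenue would hide.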

Given the state of BI technology, analysis teams within functions (Marketing, Underwriting, Store Operations etc.) spend a surprisingly large amount of time preparing routine reports for managers.  (For example, an insurance client asked my firm to perform an assessment of actual work performed by a group of more than one hundred SAS users.  The client was astonished to learn that 80% of the SAS usage could be done in Cognos, which the client also owned.)

In some cases, this is simply due to a lack of investment by the organization in the necessary tools and enablers, a problem that is easily fixed.  More often than not, though, the root cause is the absence of consensus within the function about what is to be measured and how performance should be compared across entities.  In organizations that lack measurement discipline, assessment is a free-for-all in which individual program and product managers seek out customized reports that show their program or product to best advantage; in this environment, every program or product is a winner and analytics lose credibility with management.  There is no technical “fix” for this problem; it takes leadership to set out clear goals for the organization and build consensus for an assessment framework.

Functional analysts often complain that they spend so much time preparing routine reports that they have little or no time to perform analytics that optimize the performance of existing entities.  Optimization technology is not new, but tends to be used more pervasively in Operational Analytics (which I will discuss in the next post in this series).   Functionally focused optimization tools for management decisions have been available for well over a decade, but adoption is limited for several reasons:

  • First, an organization stuck in the “ad hoc” trap described in the previous paragraph will never build the kind of history needed to optimize anything.
  • Second, managers at this level tend to be overly optimistic about the value of their own judgment in business decisions, and resist efforts to replace intuitive judgment with systematic and metrics-based optimization.
  • Finally, in areas such as Marketing Mix decisions, constrained optimization necessarily means choosing one entity over another for resources; this is inherently a leadership decision, so unless functional leadership understands and buys into the optimization approach it will not be used.  (A minimal sketch of such an allocation follows this list.)
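Here is the promised sketch: a minimal constrained allocation using scipy’s linprog, with hypothetical returns per dollar.  The point is not the solver but the consequence: under a fixed budget, the optimizer gives one program less so that another can get more, and that is a leadership decision.

    # Allocate a fixed budget across three programs.  Returns per dollar are
    # hypothetical (the kind of estimate a disciplined testing history provides).
    from scipy.optimize import linprog

    returns = [1.8, 1.3, 0.9]          # incremental revenue per dollar spent
    budget = 1_000_000
    bounds = [(0, 500_000)] * 3        # each program saturates at $500k

    result = linprog(
        c=[-r for r in returns],          # linprog minimizes, so negate returns
        A_ub=[[1, 1, 1]], b_ub=[budget],  # total spend within budget
        bounds=bounds,
    )
    print(result.x)                    # optimal spend; the weakest program gets $0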

Analytics for planning and developing new entities (such as programs, products or stores) usually require information from outside of the organization, and may also require skills not present in existing staff.  For both reasons, analytics for this purpose are often outsourced to providers with access to pertinent skills and data.  For analysts inside the organization, technical requirements look a lot like those for Strategic Analytics: the ability to rapidly ingest data from any source combined with a flexible and agile programming environment and functional support for a wide range of generic analytic problems.

In the next post in this series, I’ll cover Operational Analytics, defined as analytics whose purpose is to improve the efficiency or effectiveness of a business process.

Analytic Applications (Part One)

Conversations about analytics tend to get muddled because the word describes everything from a simple SQL query to climate forecasting.  There are several different ways to classify analytic methods, but in this post I propose a taxonomy of analytics based on how the results are used.

Before we can define enterprise best practices for analytics, we need to understand how they add value to the organization.  One should not lump all analytics together because, as I will show, the generic analytic applications have fundamentally different requirements for people, processes and tooling.

There are four generic analytic applications:

  • Strategic Analytics
  • Managerial Analytics
  • Operational Analytics
  • Customer-Enabling Analytics

In today’s post, I’ll address Strategic Analytics; the rest I’ll cover in subsequent posts.

Strategic Analytics directly address the needs of the C-suite.  This includes answering non-repeatable questions, performing root-cause analysis and supporting make-or-break decisions (among other things).   Some examples:

  • “How will Hurricane Sandy impact our branch banks?”
  • “Why does our top-selling SUV turn over so often?”
  • “How will a merger with XYZ Co. impact our business?”

Strategic issues are inherently not repeatable and fall outside of existing policy; otherwise the issue would be delegated.   Issues are often tinged with a sense of urgency, and a need for maximum credibility; when a strategic decision must be taken, time is of the essence, and the numbers must add up.   Answers to strategic questions frequently require data that is not readily accessible and may be outside of the organization.

Conventional business intelligence systems do not address the needs of Strategic Analytics, due to the ad hoc and sui generis nature of the questions and supporting data requirements.   This does not mean that such systems add no value to the organization; in practice, the enterprise BI system may be the first place an analyst will go to seek an answer.  But no matter how good the enterprise BI system is, it will never be sufficiently complete to provide all of the answers needed by the C-suite.

The analyst is key to the success of Strategic Analytics.  This type of work tends to attract the best and most capable analysts, who are able to work rapidly and accurately under pressure.  Backgrounds tend to be eclectic: an insurance company I’ve worked with, for example, has a strategic analysis team that includes an anthropologist, an economist, an epidemiologist and a graduate of the local community college who worked her way up in the Claims Department.

Successful strategic analysts develop domain, business and organizational expertise that lends credibility to their work.  Above all, the strategic analyst takes a skeptical approach to the data, and demonstrates the necessary drive and initiative to get answers.  This often means doing hard stuff, such as working with programming tools and granular data to get to the bottom of a problem.

More often than not, the most important contribution of the IT organization to Strategic Analytics is to stay out of the way.  Conventional IT production standards are a bug, not a feature, in this kind of work, where the sandbox environment is the production environment.  Smart IT organizations recognize this, and allow the strategic analysts some latitude in how they organize and manage data.   Dumb IT organizations try to force the strategic analysis team into a “Production” framework.  This simply inhibits agility, and encourages top executives to outsource strategic issues to outside consultants.

Analytic tooling tends to reflect the diverse backgrounds of the analysts, and can be all over the map.  Strategic analysts use SAS, R, Stata, Statistica, or whatever to do the work, and drop the results into PowerPoint.  One of the best strategy analysts I’ve ever worked with used nothing other than SQL and Excel.  Since strategic analysis teams tend to be small, there is little value in demanding use of a single tool set; moreover, most strategic analysts want to use the best tool for the job, and prefer niche tools that are optimized for a single problem.

The most important common requirement is the capability to rapidly ingest and organize data from any source and in any format.  For many organizations, this has historically meant using SAS.  (A surprisingly large number of analytic teams use SAS to ingest and organize the data, but perform the actual analysis using other tools.)  Growing data volumes, however, pose a performance challenge for the conventional SAS architecture, so analytic teams increasingly look to data warehouse appliances like IBM Netezza, to Hadoop, or to a combination of the two.

In the next post, I’ll cover Managerial Analytics, which includes analytics designed to monitor and optimize the performance of programs and products.

Recent Books on Analytics

For your Christmas gift list,  here is a brief roundup of four recently published books on analytics.

Business Intelligence in Plain Language by Jeremy Kolb (Kindle Edition only) is a straightforward and readable summary of conventional wisdom about Business Intelligence.  Unlike many guides to BI, this book devotes some time and attention to data mining.  As an overview, however, Mr. Kolb devotes too little attention to the most commonly used techniques in predictive analytics, and too much attention to more exotic methods.  There is nothing wrong with this per se, but given the author’s conventional approach to implementation it seems eccentric.  At $6.99, though, even an imperfect book is a pretty good value.

Tom Davenport’s original Harvard Business Review article Competing on Analytics is one of the ten most-read articles in HBR’s history; Google Trends shows a spike in search activity for the term “analytics” concurrent with its publication, and steady growth in interest since then.  Mr. Davenport’s latest book, Enterprise Analytics: Optimize Performance, Process, and Decisions Through Big Data, is a collection of essays by Mr. Davenport and members of the International Institute of Analytics, a commercial research organization funded in part by SAS.  (Not coincidentally, SAS is the most frequently mentioned analytics vendor in the book.)  Mr. Davenport defines enterprise analytics in the negative, e.g. not “sequestered into several small pockets of an organization — market research, or actuarial or quality management”.  Ironically, though, the best essays in this book are about narrowly focused applications, while the worst essay, The Return on Investments in Analytics, is little more than a capital budgeting primer for first-year MBA students with the word “analytics” inserted.  This book would benefit from a clearer definition of enterprise analytics, a stronger case for the value of “unsequestering” analytics from departmental silos, and more guidance on exactly how to make that happen.

Jean-Paul Isson and Jesse Harriott have hit a home run with Win with Advanced Business Analytics: Creating Business Value from Your Data, an excellent survey of the world of Business Analytics.   This book combines an overview of traditional topics in business analytics (with a practical “what works/what does not work” perspective) with timely chapters on emerging areas such as social media analytics, mobile analytics and the analysis of unstructured data.  A valuable contribution to the business library.

The “analytical leaders” featured in Wayne Eckerson’s  Secrets of Analytical Leaders: Insights from Information Insiders — Eric Colson, Dan Ingle, Tim Leonard, Amy O’Connor, Ken Rudin, Darren Taylor and Kurt Thearling — are executives who have actually done this stuff, which distinguishes them from many of those who write and speak about analytics.  The practical focus of this book is apparent from its organization — departing from the conventional wisdom of how to talk about analytics, Eckerson focuses on how to get an analytics initiative rolling, and keep it rolling.  Thus, we read about how to get executive support for an analytics program, how to gain momentum, how to hire, train and develop analysts, and so forth.  Instead of writing about “enterprise analytics” from a top-down perspective, Eckerson writes about how to deploy analytics in an enterprise — which is the real problem that executives need to solve.