Thursday, May 16, 2013

Six ways you can fail at data driven marketing

Data driven marketing is not as easy as we think.

You've heard the buzzword, and you probably think you're doing data driven marketing as we speak. But the reality of the situation is that most marketers aren't actually data driven.

Taking Peter Drucker's "What gets measured, gets managed" at face value is dangerous. Mere measurement isn't actually enough. Data driven marketers measure for the right reasons.

And so, with that in mind, here are six ways you might fail at data driven marketing.

1. You confuse metrics with goals

There's no doubt that goals are an important part of incentivizing your workforce and defining strategy. And there's no doubt that you need metrics in order to determine whether or not a goal was actually met. But metrics aren't goals, nor should they ever be thought of as such.

Metrics are more than a set of numbers that you want to maximize or minimize in order to achieve your business goals. They are the key to understanding how your business works: your customers, your employees, and your products. In marketing, metrics should be used to define and discover strategy just as much as to measure whether or not you're executing on strategy.

The secret to success with big data doesn't lie in your KPIs. It lies in hidden correlations and careful testing. It's about understanding exactly what influences your KPIs and why. You have to know what strings to pull before you can become the puppet master.
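As a rough illustration, here is a minimal sketch in Python of what hunting for those hidden correlations can look like. The weekly numbers and column names (blog_posts, paid_clicks, signups) are invented for the example; the idea is to rank candidate drivers by how strongly they move with your KPI and treat the strongest ones as hypotheses to test, not conclusions.

```python
import pandas as pd

# Hypothetical weekly marketing data -- every column name and number here is invented.
data = pd.DataFrame({
    "blog_posts":  [2, 3, 1, 4, 3, 5, 2, 4],
    "email_sends": [1, 2, 2, 1, 3, 2, 1, 3],
    "paid_clicks": [900, 1100, 800, 1300, 1200, 1500, 950, 1400],
    "signups":     [40, 55, 35, 70, 60, 85, 45, 75],  # the KPI we care about
})

# Rank each candidate driver by its correlation with the KPI.
# A strong correlation is a lead worth split testing, not proof of causation.
correlations = data.corr()["signups"].drop("signups").sort_values(ascending=False)
print(correlations)
```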

2. You don't measure during project management

Every marketing project is an opportunity to learn something new about how to optimize future projects. This goes beyond analyzing the finished product itself. The more you understand about the process of project management, the more you understand about how to tweak it for optimal results.

It should go without saying that this is nearly impossible without project management software. We personally find MS Project too taxing and Basecamp too minimalist. Our choice is Workzone, a Basecamp alternative with a wider range of features.

No piece of software is going to give you all the data you need; in fact, most isn't really designed with measurement in mind. Time tracking goes a long way. Measuring the difference between how projects were initially set up and how they evolved over time is another important task that's much easier with the right software.
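If your tool can export planned versus actual task data, a few lines of analysis will show you where projects consistently slip. The sketch below assumes a hypothetical export with planned_days and actual_days columns; it is not the actual export format of Workzone or any other tool.

```python
import pandas as pd

# Hypothetical project export -- these column names are assumptions,
# not the export format of any particular tool.
tasks = pd.DataFrame({
    "project":      ["Spring launch"] * 3 + ["Webinar series"] * 3,
    "task":         ["Copy", "Design", "QA", "Copy", "Design", "QA"],
    "planned_days": [3, 5, 2, 2, 4, 2],
    "actual_days":  [4, 8, 2, 2, 5, 3],
})

# How far did each task slip past its original plan?
tasks["overrun_days"] = tasks["actual_days"] - tasks["planned_days"]

# Average slippage by project and by type of work: which kinds of tasks
# consistently run over, and on which projects?
print(tasks.groupby("project")["overrun_days"].mean())
print(tasks.groupby("task")["overrun_days"].mean())
```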

While you can't exactly split test project management techniques, you can compare results over time. It's important to define what you think works best, and put it to the test. Sometimes a project management technique hardly matters at all; sometimes it can make a dramatic difference. You won't know unless you're always testing.

3. You don't care about statistical significance

This is a very common mistake, and an easy one to make. Just because a single split test favored one tactic over another, or one blog post has a lower bounce rate than another, this doesn't necessarily mean one was better than the other. You need to reach statistical significance before you know for sure.

Statistical significance is a tough subject to grasp without some training in statistics. Suffice it to say, if you don't have a margin of error, you can't make meaningful comparisons.
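To make this concrete, here is a minimal two-proportion z-test for a split test, written in plain Python. The conversion counts are hypothetical; the point is that the p-value, not the raw difference in rates, tells you whether the result is trustworthy.

```python
import math

def split_test_p_value(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: is the difference between variant A and B real?"""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, p_value

# Hypothetical split test: 120 conversions from 2,400 visitors vs 150 from 2,450
p_a, p_b, p = split_test_p_value(120, 2400, 150, 2450)
print(f"A: {p_a:.2%}, B: {p_b:.2%}, p-value: {p:.3f}")
if p < 0.05:
    print("The difference is statistically significant at the 95% level.")
else:
    print("Not significant yet -- keep collecting data before declaring a winner.")
```

Note that in this made-up example variant B looks better on the surface, yet the test says the evidence isn't strong enough to call a winner.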

It's not always possible to reach statistical significance. Sometimes the data is too skewed or the sample is too small. I'm not saying there's anything wrong with using intuition when it's all you have.

All I'm saying is that if you don't care about statistical significance, you're going to make a lot of unnecessary mistakes, invest in a lot of failing strategies, and make many choices that you thought were data driven, when they really weren't.

We've mentioned the importance of having analysts before. If it's within your budget, it's a good idea to have a statistician, or somebody who speaks the language, on board. If not, I highly recommend learning at least some of the basics.

4. You confuse science with rigidity

Data driven marketing is a science, and science is a creative endeavor. The naïve marketer hears that marketing is a science, and concludes that it obeys a rigid set of rules, that you should start with the facts, and that there's no room for interpretation.

The informed marketer hears that marketing is a science, and concludes that it is an evolving process, that you should start with your hypothesis, and that what you choose to measure is going to have tremendous influence on your interpretation.

Science is, at its heart, about discovery. This is the only reasonable way to think about data-driven marketing. Discoveries start with a hunch or a happy accident. They almost never start with the facts.

The data driven marketer brainstorms and harnesses their creative abilities to make educated guesses about ways to improve business. They then put these educated guesses to the test. Then they iterate to refine their hypothesis and make incremental improvements.

Data driven marketing should never be a rigid process.

5. You don't measure and test relevancy

If there's only one thing big data has to offer (and that's not the case), it's relevancy. As we move away from push marketing, we enter a world where it's possible to market only to those who are actually likely to care about our products.

But it goes deeper than that. Even those who are likely to care about your product will care about it for different reasons. You need to segment your audiences and target them separately. And this goes beyond traditional demographics like age and gender. Psychographics become increasingly important. The more you know about your customers, the fewer excuses you have to market to them the same way.
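As a sketch of what that segmentation might look like in practice (the subscriber fields, thresholds, and segment names below are all invented for illustration), here is a simple rule-based pass that splits a list by behavior and interest before any messaging decisions are made:

```python
import pandas as pd

# Hypothetical subscriber data -- every field and threshold here is illustrative.
subscribers = pd.DataFrame({
    "email":          ["a@example.com", "b@example.com", "c@example.com", "d@example.com"],
    "opens_last_90d": [12, 1, 6, 0],
    "purchases":      [2, 0, 1, 0],
    "interest":       ["pricing", "how-to", "pricing", "news"],
})

def segment(row):
    """Rough behavioral segments -- each one gets its own message, and each message gets tested."""
    if row["purchases"] > 0:
        return "customer"
    if row["opens_last_90d"] >= 5:
        return "engaged prospect"
    return "dormant"

subscribers["segment"] = subscribers.apply(segment, axis=1)
print(subscribers.groupby(["segment", "interest"]).size())
```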

Ninety-one percent of consumers have abandoned at least one brand on Facebook, and about half of them did so because they saw too many messages, which is another way of saying the messages didn't matter to them.

Irrelevant messages are unheard, ignored, or blocked. It's not enough to make assumptions about what you think is most relevant to your target audience. You need to measure and test those assumptions at every juncture.

The typical email has a 5% open rate, but you can get that as high as 95% if the message is relevant enough.


6. You don't design testable models

We mentioned before that data driven marketing is a science, and if you pay attention, scientists spend a lot of time talking about "theories" and "models." They design these models because it is their goal as scientists to make predictions about measurable phenomena.

It is your job as a data driven marketer to make predictions about KPIs and how your actions can improve them. You can't do this without a model. Models don't always need to be advanced or even very good at making predictions. They just need to be good enough to guide you in the right direction.

What is a model? It depends. A model could be a mathematical formula that predicts how many sales you'll make based on email open rates or survey results. It could be a pet theory about how consumers react to positive or negative emotions.
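Here is a deliberately simple example of the first kind: a straight-line model fitted to hypothetical past campaigns, predicting sales from email open rate. The numbers are invented; what matters is that the model commits to a prediction you can check against the next campaign.

```python
import numpy as np

# Hypothetical history: email open rate (%) and sales generated by each past campaign.
open_rates = np.array([12, 18, 22, 25, 31, 40])
sales      = np.array([80, 130, 150, 160, 210, 260])

# Fit a simple linear model: sales = slope * open_rate + intercept.
slope, intercept = np.polyfit(open_rates, sales, 1)

def predict_sales(open_rate):
    """The model's prediction -- written down *before* the next campaign runs."""
    return slope * open_rate + intercept

# The testable claim: at a 28% open rate, the model predicts roughly this many sales.
# Compare it to reality afterward; a consistently wrong prediction means the model
# needs refining.
print(f"Predicted sales at a 28% open rate: {predict_sales(28):.0f}")
```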

The most important thing a data driven model needs is the ability to be tested. An untestable model is useless. Data driven models make predictions, and those predictions can be proven wrong when the real results come in.

The data driven marketer recognizes that their model is simplistic and will probably fail under some circumstances. They accept this, and they continue to refine the model as new data comes along. This is how they build an increasingly effective model and continue to improve results.

If you don't have a testable model, you're missing the point of data driven marketing.

Conclusion

I don't believe I've broken any big "secrets" to data driven marketing with this blog post, but there are so few introductory posts on the subject that I hope I've reached some marketers who didn't realize the opportunities they've been missing. Data driven marketing is a creative discipline with a focus on designing predictive models that you can use to optimize results.

It's about improving relevancy and even improving project management. It's not about strict adherence to a set of rules or just a way to set goals. It's a continuously evolving process that's about discovery more than anything else.

Any other places where marketers fail at the data driven approach? I'd love to hear your thoughts on this subject.
