Experiences- the 7th Era of Marketing
The three layers of the experience technology stack can help align marketing and technology strategies, providing high-level goals for how each kind of technology is used to build experiences.
Here’s an example of this experience technology model:
• A large consumer hotel company has restructured the management of its websites for web experience management.
With more than 600 websites to manage across multiple devices, the company has deployed a solid backbone for web content management and measurement.
The company uses an enterprise-wide web content management solution with integrated analytics to deliver everything in a standardized format—from web visitor information for brand marketers to per-hotel revenue information for hotel staff.
This standardized content store gives them the flexibility to easily create and deliver content across web, mobile, and social channels—and remain more nimble across numerous social media.
By deploying channel-specific strategies, they can react quickly and conversationally to consumers who may have issues while on the property, and store that information globally.
Recent “wins” include the ability to immediately offer a guest a free breakfast after he or she tweets about a problem with the room.
CCM MEETS TECHNOLOGY ADOPTION: BUILT TO ADAPT
One of the biggest dangers for enterprise marketing groups today is the temptation to repeat the mistakes made during “Web 1.0.” Just as the “ebusiness team” was often separated from the core business in the late 1990s, so, too, are social and web experience teams often isolated from both corporate marketing and IT today.
CCM and the function of delivering rich digital experiences across any channel require that these teams collaborate on one cohesive strategy. As a result, the experience management technologies facilitating those processes can converge effectively to integrate and manage all three layers: core data management, engagement management, and content channel and experience management.
This new adaptive system will be staffed by flexible teams who will also be challenged to change and rearrange as the enterprise deploys new strategies. These de-siloed teams should ultimately work together in the marketing department to generate what scientists call “emergence,” where relatively simple and separate interactions develop into productive patterns, and the whole ultimately becomes much greater than the sum of its parts.
Tomorrow’s marketing and communications teams will succeed by learning to adapt—and by deploying systems of engagement that facilitate adaptation. By building to change, the marketing department builds to succeed.
WHAT’S NEXT: THE 90-DAY VISION
The First Month—Alignment with technology
• Look for ways to get insight from the technology leaders into their roadmap and build collaborative strategies. Seek to develop a strategy that enhances the technologist’s ability to determine “how” a particular technology should function, and the marketer’s ability to determine “what” process it should facilitate. Common ground might be found in a simple agenda that agrees on the following categories:
Engagement and relationship—work to create fast and agile ways to facilitate new platforms and quickly iterate ideas with the technology and marketing team. Basically, the CMO needs to understand this process intimately and be able to quickly communicate the fast-changing iterations that may be required.
Intelligence and insight—the CMO and CIO need to collaborate, hand in hand, on a methodology that is focused on learning: drawing value out of what actually happened vs. a siloed push-and-pull of data.
Shared values and exceeded expectations—the CMO and CIO must go beyond merely understanding one another’s agenda; they need to collaborate on a joint strategy where partnership is critical to the business.
• Create that joint strategy by viewing the technologists as core team members who help to manage the portfolio of experiences—or at least as a support layer that will be informed of all the story maps being created.
The Second Month—The technology stack and new technology acquisition
• Getting alignment on a technology stack and purchase strategy for customer-driven experiences can be difficult. One way to focus your efforts is to look at audience development for content-driven experiences as the goal. With that in mind, and in conjunction with your technology team, look to an approach across three layers:
Core data management—driven by the technologists; acknowledge that, once set, it will not change often.
Engagement management—a middle layer of technology that interfaces with the data; changes here are considered and infrequent. Business rules will feed the data collection.
Content and experience management—a top layer that should be as flexible and/or disposable as any media strategy. Here the changes will come fast and furious, and experimentation will be high. Once set into a core strategy—or once value has been proven by a particular experience—then perhaps it becomes more of an engagement management approach.
The Third Month and Onward—Technology enablement as a core piece of CCM
• Technology will certainly affect the CCM portfolio management strategy. Delivering rich digital experiences across many channels will require ongoing collaboration on one cohesive strategy. Experience management technologies can converge effectively to integrate and manage all three technology layers. Conduct technology reviews as part of the ongoing CCM agenda.
“Tell me how you measure me and I will tell you how I will behave. If you measure me in an illogical way…do not complain about illogical behavior.” | Eliyahu Goldratt
Before you get too excited, you need to know that this is not a book about marketing measurement. There, whew, we said it.
There are innumerable experts in the field of data measurement who derive actionable insight out of the ways that marketing and advertising are measured. As we have indicated, we are focused on putting function before form, story before expression, and meaning before action.
That said, we can safely say that measurement—especially as it pertains to content and experience-driven marketing—is in a world of hurt. The perception that content-driven marketing cannot be measured (rather than any actual results) is the most often cited reason that content marketing initiatives are cancelled.
Therefore, as we move into the new era of Experiences—as we “lean in” with our capital “M”—we must CHANGE how we approach the very idea of what measurement means.
This chapter is about changing our approach to measurement.
MARKETERS ARE WIRED TO GET MEASUREMENT WRONG
In 2008, science historian Michael Shermer coined the word “patternicity.” In his book The Believing Brain, he defines it as “the tendency to find meaningful patterns in both meaningful and meaningless noise.”86 Shermer says that humans have the tendency to “infuse the patterns with meaning, intention, and agency.” He calls this latter tendency “agenticity.”
So, as humans, we’re wired to make two types of errors that have relevance here:
• Type 1 Errors – the false positive, or seeing a pattern that doesn’t really exist.
• Type 2 Errors – the false negative, or failing to see a real pattern that actually does exist.
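To make the Type 1 trap concrete, here is a minimal sketch in Python. The data and the “dashboard rule” are entirely hypothetical: twenty metrics of pure noise are checked against a naive “did the average go up 10%?” rule, the kind of check an analytics dashboard implicitly encourages. Some metrics will look like wins even though nothing changed.

```python
import random
import statistics

random.seed(42)  # make the simulation repeatable

def looks_like_a_win(before, after, threshold=1.10):
    """Naive dashboard rule: call a metric a 'win' if its average
    rose by more than 10% -- no significance test at all."""
    return statistics.mean(after) > statistics.mean(before) * threshold

# 20 hypothetical dashboard metrics, all pure noise:
# nothing about the campaign actually changed between the two periods.
wins = 0
for _metric in range(20):
    before = [random.gauss(100, 15) for _ in range(10)]
    after = [random.gauss(100, 15) for _ in range(10)]
    if looks_like_a_win(before, after):
        wins += 1  # a Type 1 error: a "pattern" that isn't there

print(f"{wins} of 20 noise-only metrics look like wins")
```

Run it a few times with different seeds and the count of phantom “wins” moves around, which is exactly the point: the more metrics a team scans for good news, the more false positives it will find.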
As marketers, we are even MORE hardwired for not only making Type 1 errors, but for the “agenticity” that goes along with doing so.
One of the disadvantages that analytics technology has brought is its promise to reduce all of marketing down to an algorithm. The pressure to be “data-driven” has pushed marketers into treating analytics as “proof of life” for every creative strategy the team puts forward. Thus, the team is trained to see successful patterns in ANYTHING that even resembles success.
Web analytics dashboards—churning through “small data”—are billboards for Type 1 Errors. More traffic? The light is green. Never mind that the reason we have more traffic is because something we’ve just published went viral in a bad way. Fast food restaurant Chick-fil-A’s web traffic quadrupled over the span of one month in 2012 during the controversy over their CEO’s comments on the LGBT community. Is that a good thing?
How about more “time on site”? As marketers, we interpret this to mean that visitors are more engaged with our content, not that they may be having difficulty finding what they are looking for.
We see more “likes” as an indicator of success on Facebook, without even acknowledging that people actually have to “like” your page before they can comment on how much they hate you.
A “data-driven marketing” mindset has pushed many marketers into scrambling to find patterns of success that may or may not be there. This tunnel vision has marketers using data in narrow ways. This is addressed in an IBM paper titled, “From Stretched to Strengthened,” which reports on study findings that most CMOs use data to optimize transactions, as opposed to using it to glean insights to deepen their relationships with customers.87
In an interview with the authors, Gordon Evans, senior director of product marketing for SalesForce.com’s Marketing Cloud, framed this disconnect by saying:
“There’s this whole notion of being able to identify and convert people in a different kind of funnel—where you take them from strangers, to friends, to fans to advocates. The way that’s done is through engagement-focused experiences.”
Put simply, marketers have to stop looking at analytics as only a means to improve converting calls-to-action and as “proof-of-life” that a campaign was worth paying for. Instead, they must also start to look for meaning, using content and data to deepen the engagement and to improve the process of creating engagement-focused experiences.
WHAT DO ANALYTICS MEAN TO MARKETING?
If you look up the definition of “analytics” in the dictionary, you’ll find:
ANALYTICS
/Anl’itiks/
noun
plural noun: analytics
1. The systematic computational analysis of data or statistics.
2. Information resulting from the systematic analysis of data or statistics.
Two things stand out to us in that definition. Nowhere do we see the words insight, improvement, or actionable. In other words, it’s the systematic reporting of the patterns of data that has been captured. In fact, that’s exactly what the IBM study found.
But let’s be honest—this is really the definition of what analytics has become for marketers:
ANALYTICS (The marketers’ version)
/Anl’itiks/
noun
plural noun: analytics
1. The systematic computational method of making data say anything that justifies our actions.
2. Information manipulated to ensure that marketing or other departments illustrate “proof of life” to justify their existence.
3. Weapon of mass delusion.
That’s right. Delusional. This is where we are with measurement. We are right where physicist and business management thought leader Eli Goldratt said we would be. Our companies use increasing amounts of technology to define how marketing should be measured. And we, as marketers, respond accordingly with our behavior. Because we are measured in increasingly illogical ways, we start behaving just as illogically.
As an extreme example of this, we worked with a large technical infrastructure company. Their customer email database had 175,000 addresses. They invested hundreds of thousands of dollars in an email technology solution that could actually send that many emails on a weekly basis. Each time they sent an email, roughly half of the addresses (85,000) would bounce back as undeliverable, because the database was never maintained or pruned.
When we asked the marketing team why they didn’t just prune the database, and delete the 85,000 that were bouncing each time, they told us “because if management sees that our email subscription rate dropped by 50% they would cut our budget for email.”
Yes…that’s delusional.
THE MEASUREMENT PYRAMID
In Managing Content Marketing, Robert Rose and Joe Pulizzi presented the measurement pyramid, a new approach to measuring content marketing. The pyramid comprises three key levels of analytics that inform the process of creating content for business purposes:
1. Primary indicators: Goals—numbers that indicate progress toward a desired achievement. These are the numbers we actually report on.
2. Secondary indicators: Key performance indicators (KPIs)—integrated measurements of content that indicate progress toward the primary goals and help us improve the process of meeting them.
3. User indicators: Data points—data we collect that helps us, on a day-to-day basis, inform the process and alter course toward our goal.
The measurement pyramid is designed to separate the indicators that help improve the process from those we should report as part of our goals.
As a CCM governing process begins to take root, the measurement pyramid can be used to elevate and integrate measurement beyond a myopic focus on ensuring that every graph goes up and to the right. It can be a framework that lets us start to measure the efficacy of content across a multi-channel strategy and provide an incentive for cross-functional teams to align their goals. This approach can help marketers report the MEANING of numbers; it offers the insight necessary to develop more delightful experiences, as opposed to being a scorecard that helps record marketing team performance.
The key is in the reporting.
As a strategic function, marketing should report only the primary indicators. The secondary indicators exist to help improve the marketing team and content processes. And the user indicators exist only to help individuals make course corrections that fuel the achievements within their areas of responsibility.
Let’s look at an example of this.
GOALS AND INDICATORS
As we remember from Chapter 8 and the story mapping process, we may have a number of goals associated with a particular initiative. One way to think about goals is to state them in the following framework:
OBJECTIVE + TIME FRAME + CONSTRAINT
At first, establishing a constraint may seem like a bit of a ball and chain; however, these three parameters are important because they allow us to be more creative while striving to reach our goals. In short, we have three levers to pull and push. Thus, we can be more precise than just saying the goal for a content marketing initiative is to “increase qualified leads by 10%.”
Instead, we can be quite specific. For example, we can set a goal for a content-driven experience that says:
Increase qualified leads by 10% in six months with only a 5% budget increase.
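That stated goal can be captured as structured data, keeping the three levers explicit rather than buried in a sentence. A minimal sketch in Python; the class and field names are our illustration, not from the book:

```python
from dataclasses import dataclass

@dataclass
class Goal:
    """A goal stated as OBJECTIVE + TIME FRAME + CONSTRAINT."""
    objective: str   # what we want to achieve
    time_frame: str  # by when
    constraint: str  # the limit we accept while getting there

    def describe(self) -> str:
        return f"{self.objective} in {self.time_frame} with {self.constraint}"

goal = Goal(
    objective="Increase qualified leads by 10%",
    time_frame="six months",
    constraint="only a 5% budget increase",
)
print(goal.describe())
# -> Increase qualified leads by 10% in six months with only a 5% budget increase
```

Keeping the constraint as its own field is the point: it stays visible as a lever to negotiate, rather than disappearing into the objective.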
Using the measurement pyramid, we might come up with the following tiers of measurement:
GOAL: Increase qualified leads by 10% in six months with only a 5% budget increase

PRIMARY INDICATORS
• # of qualified leads
• Cost

SECONDARY INDICATORS (examples)
• Unqualified leads
• Cost per lead (CPL)
• Blog subscribers
• Email subscribers
• Downloads
• Webinar attendees
• Website visitors

USER INDICATORS (examples)
• Likes/followers
• A/B test results
• Cost per visitor (CPV)
• Video views
• Webinar registrants
The example above is simply meant to be illustrative. As you might expect, the lists of secondary indicators and user indicators could get quite long, depending on the number of goals as well as the amount of data that you’re tracking. And, yes, you probably have more than one goal that you’re trying to reach.
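The pyramid’s reporting rule can also be expressed as a simple data structure. This is a hedged sketch under our own naming, with metric names echoing the example tiers above; the key behavior is that only the primary tier is ever surfaced in a report, while the lower tiers stay internal to the team:

```python
# Hypothetical measurement pyramid: tier name -> list of indicators.
pyramid = {
    "primary": ["# of qualified leads", "Cost"],
    "secondary": ["Unqualified leads", "Cost per lead (CPL)",
                  "Blog subscribers", "Email subscribers"],
    "user": ["Likes/followers", "A/B test results",
             "Cost per visitor (CPV)"],
}

def report(pyramid: dict) -> list:
    """Report only the primary indicators; secondary and user
    indicators stay internal, feeding process improvement."""
    return pyramid["primary"]

print(report(pyramid))
# -> ['# of qualified leads', 'Cost']
```

A different functional group would carry its own "primary" list while sharing entries in the "secondary" and "user" tiers, which is how the pyramid scales across teams.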
But you can now see how this pyramid might tier within different functional groups. For example, the sales enablement team will have goals that drive qualified leads, whereas the field marketing team may have goals that drive awareness and initial downloads of content. The product marketing team may have sales revenue goals for new products. These individual goals can share integrated secondary indicators—and then the individual managers can share user indicators. The shape may resemble this:
The measurement pyramid can scale to different groups with cross-functional goals. User indicators become a pool the team can draw from to understand how to improve toward the goals.