What makes an organization’s content great?
We may think we know instinctively how to tell if a piece of content is good or bad. We know that it was created for a reason—very few organizations invest in paying writers to write for their own amusement—and we assume it was created according to our editorial and brand standards. And someone within the organization thought it was good enough to put out there on the website, so it must be okay. And if it was good when it was written, it’s probably still fine, right?
Maybe. But given the role content plays in our customer experience, perhaps we need to embrace a more formalized approach to testing and improving our content to make sure that it really is great. Great content fulfills our business goals, supports our brand, and meets our customers’ needs now, and over time, as needs change and user engagement with our brand evolves. What we need is content that performs—and we need to know what that means and how to create and sustain performance-driven, effective content.
What is performance-driven content?
In his recent post, Kevin Nichols defined it thus:
Performance-driven content is a data-driven approach to content that can ensure your content ecosystem evolves the way it should, meeting the needs of your content consumers… this method positions an organization to continuously assess how its content is performing, and from that perspective, make decisions on what to do with the content in the future.
Kevin goes on to lay out a framework and process for creating and governing content to ensure its ongoing performance. A key caveat that Kevin makes is that data is not just metrics or analytics but a whole host of inputs, including user feedback, content analysis, competitive research, and industry trends in addition to metrics and analytics.
I’d like to dive a little deeper into the “Assess” step in that framework. In the assess phase, the first step is “look in depth at current experience and/or competitive models, figure out areas for improvement and differentiators, refine project scope based on learnings.”
Tools for measuring performance
What does it mean to look in depth at current experience and/or competitive models? It means performing an audit—an assessment of the current state of your content and that of your competitors.
The content audit is a powerful tool for assessing the breadth, depth, quality, and performance of your content against your own goals and against your competitive landscape, and for ensuring the content meets the needs of your customers. To be comprehensive, a content audit not only addresses the standard measures of editorial quality, but looks at every source of information you can gather about your content. In my book Content Audits and Inventories: A Handbook, I wrote about using analytics and other measures of performance, such as customer feedback, to assess what content is performing well. Incorporating those metrics into your content audit will give you quantitative data to include with your qualitative assessment of the content.
The audit is not a one-time approach to content improvement. Ideally, regular audits are built into your content ecosystem as a critical part of ongoing analysis and optimization. Audit frequency can vary depending on the type of content you publish and how often you publish it, but you should audit at least annually.
The content audit
Before you begin your audit, you should conduct the exercise known as a content inventory. A content inventory is a quantitative assessment of what content lives within your experience. Create a detailed inventory for your website, but remember, too, all the content that exists in other channels, including print assets.
Given that many organizations produce vast amounts of content, a tool to automate the process can save time and ensure that comprehensive data is collected. I recommend my own Content Analysis Tool as an excellent resource to accomplish the task of gathering detailed data about your web content.
The CAT dashboard, with standard data and custom columns added.
After you have captured the necessary information for your inventory, prepare your audit by building on the data in the dashboard or exporting it to spreadsheet format. In addition to the data that CAT provides (including integrated Google Analytics data), you’ll want to add in columns for your own annotation and analysis.
For the purposes of assessing content performance, you will want to record audience-specific information in your audit. For example, for each content type or content focus area, who are the customer segments or personas the content is targeted to, the user task it’s related to, and the action you wish the user to take based on the content. Your inventory can also capture information about who created the content, when it was created, how often it is published, and more.
Example inventory and audit spreadsheet template from Content Strategy Alliance Best Practices
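To make those annotation columns concrete, here is a minimal sketch in Python of layering audit fields onto an inventory export. The column and field names (persona, user_task, and so on) are my own illustrative assumptions, not a prescribed CAT schema:

```python
import csv
import io

# Hypothetical inventory export (CSV); columns are illustrative assumptions.
cat_export = """url,title,page_views,bounce_rate
/cruises/alaska,Alaska Cruises,1200,0.35
/about/careers,Careers,300,0.60
"""

# Audit annotations layered on top of the inventory data.
annotations = {
    "/cruises/alaska": {
        "persona": "First-time cruiser",
        "user_task": "Compare destinations",
        "desired_action": "Request a quote",
        "owner": "Marketing",
    },
}

# Merge inventory rows with audit columns into one working record per page.
audit_rows = []
for row in csv.DictReader(io.StringIO(cat_export)):
    row.update(annotations.get(row["url"], {}))
    audit_rows.append(row)
```

From there, the merged records can be filtered, sorted, or written back out to a spreadsheet for the audit itself.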
Focus the audit
Your inventory will provide you with a lot of data about a lot of content. This can mean thousands of pages. So for the purposes of the audit, which takes the inventory and evaluates the effectiveness of that content, you will probably need to limit your scope and target your efforts. If you are auditing for content performance, start by focusing on your highest-value content. For time- and resource-strapped organizations, the most important content may be the initial or only focus of the audit, but if you have the time and resources, come up with a plan that addresses all content.
How do you determine which is most important? Not necessarily by checking your analytics and seeing what is getting the highest traffic. Checking those scores comes later. Your first task is identifying what matters most to your business and which content supports it. This is where a customer journey map can be a very useful tool. If you have mapped out the key tasks for your customers and what content is required to support their progress through the journey, you can focus in on that content for your performance audit.
To focus your audit:
1. Review your business goals and objectives as they relate to the experience you are evaluating (e.g., your website)
2. Review the most important products or key services for your business
3. Include content that is important for your overall brand, such as thought leadership, social channels, and corporate content (information for investors, job seekers, media)
4. Look at the needs and tasks of your users, as determined in your user research, user feedback, and personas
5. Review the user journeys
6. Analyze the metrics to see which content your users engage with the most (what do they view, download, share, etc.)
7. Synthesize from all these sources to come up with a list of your most important content. This should be your highest priority.
8. Repeat this exercise to determine where other areas of content fall in priority (from least to most important)
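The synthesis in step 7 can be as lightweight as a weighted score per page. The signals and weights below are purely illustrative assumptions; substitute ratings drawn from your own goals, journeys, and metrics:

```python
# Illustrative signals per page, each rated 0-5 by the audit team.
pages = {
    "/cruises/alaska": {"business_goal": 5, "user_journey": 5, "engagement": 3},
    "/about/careers":  {"business_goal": 2, "user_journey": 1, "engagement": 2},
}

# Hypothetical weights: business alignment counts most, raw engagement least.
weights = {"business_goal": 0.5, "user_journey": 0.3, "engagement": 0.2}

def priority_score(signals):
    """Weighted sum of the audit signals for one page."""
    return sum(weights[name] * value for name, value in signals.items())

# Rank pages from most to least important for the audit's first pass.
ranked = sorted(pages, key=lambda url: priority_score(pages[url]), reverse=True)
```

However you weight the inputs, the point is to make the prioritization explicit and repeatable rather than a one-off judgment call.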
Standard metrics to use to assess content performance include analytics data such as page views and time on page. If your inventory sheet includes this data, you can start to filter and sort to find what’s working well and what isn’t. For example, if a piece of content has been identified as high priority based on its relevance and importance to a key user task but it is receiving low page views and a high bounce rate, flag it for review and possible revision.
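That flagging rule translates directly into a filter over your audit rows. The thresholds and field names here are illustrative assumptions; tune them to your own traffic profile:

```python
# Each row combines audit priority with analytics data (illustrative values).
audit_rows = [
    {"url": "/cruises/alaska", "priority": "high", "page_views": 120, "bounce_rate": 0.82},
    {"url": "/cruises/caribbean", "priority": "high", "page_views": 4500, "bounce_rate": 0.30},
    {"url": "/press/2014", "priority": "low", "page_views": 40, "bounce_rate": 0.90},
]

# Hypothetical thresholds: what counts as "low traffic" and "high bounce"
# depends entirely on your own site's baseline.
MIN_VIEWS, MAX_BOUNCE = 500, 0.60

# High-priority pages that are underperforming: flag for review and revision.
flagged = [
    row["url"]
    for row in audit_rows
    if row["priority"] == "high"
    and row["page_views"] < MIN_VIEWS
    and row["bounce_rate"] > MAX_BOUNCE
]
```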
As many have said before me, while analytics data tell you the what, they don’t tell you the why. Use analytics not as the be-all, end-all indicator of content quality, but as one piece of the larger picture. Look at the data to identify the highs and lows of content engagement (page views, time on page, bounces, and exits) then use that as an initial way to scope the audit. Look at the low-performing pages—are they performing badly because they contain bad (outdated, irrelevant, poorly written) content? Or because they are misplaced in the user journey or the content is presented in a less-than-optimal format? Rather than focusing just on identifying content to kill, think about whether there is valuable, salvageable content that needs some love.
Similarly, the vast set of content that sits neither at the high nor the low end of the scale may include some of your most valuable content, even though it isn’t performing well. This is where the additional data you’ve added to your audit is critical—if you can sort your data by persona, user task, or other key performance indicator and then see that the content mapping to that scenario is not performing well, you know where to focus your improvement efforts.
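As a sketch, that persona-level sort amounts to grouping pages by persona and averaging an engagement metric. The field names and the 60-second threshold are illustrative assumptions:

```python
from collections import defaultdict

# Illustrative audit rows: persona mapping plus one engagement metric.
rows = [
    {"persona": "First-time cruiser", "time_on_page": 95},
    {"persona": "First-time cruiser", "time_on_page": 110},
    {"persona": "Repeat guest", "time_on_page": 20},
    {"persona": "Repeat guest", "time_on_page": 30},
]

# Average time on page per persona.
totals = defaultdict(list)
for row in rows:
    totals[row["persona"]].append(row["time_on_page"])
averages = {persona: sum(v) / len(v) for persona, v in totals.items()}

# Hypothetical threshold (seconds) below which a segment needs attention.
THRESHOLD = 60
underperforming = [p for p, avg in averages.items() if avg < THRESHOLD]
```

The same grouping works for any column you added to the audit: user task, content owner, or journey stage.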
User data and feedback
In addition to the hard data you get from analytics programs, be sure you’re including everything you can gather that comes from your customers: this is the soft metric data. Ideally, before you even began your audit, you immersed yourself in all the data your organization has about its customers—user research, personas, customer journey maps. If your company hasn’t invested in doing this kind of user research (or even if it has), you can do supplemental research by reviewing industry research relevant to your business and your target customer base. For example, if your business is a cruise line, you can research industry trends and demographic information in sources like Forrester Research or industry-specific publications.
Your direct customer interaction and feedback is a rich source of information as well. Use social monitoring tools to see whether your customers are engaging with your social channels by commenting, following, and sharing. If you host user-generated content such as comments on your own site, you should be closely monitoring that as well. And work with your customer service department to track the requests they are receiving for information—this can lead you to the gaps in your content and give you a clear metric for measuring success, if you are able to lessen the frequency of those calls or emails for support.
If you have the opportunity to do direct user testing, take it. This can be as formal or informal as your resources allow. But even bringing in a few family and friends, sitting them down at your website with a list of tasks to complete, and watching (even better, recording so you can share with your team) their progress and reactions can be a valuable (and sometimes painful!) lesson. For ideas about how to do low-cost user testing, check out the article Guerilla Research Tactics and Tools.
Search logs can also be a source of information about what people are looking for on your site and whether they’re finding it or not. Not only can you use that data to see whether you are actually serving up appropriate results for what people are looking for, but you can see the words they’re using when they search and feed that information back to your information architect to refine labeling and tagging.
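A first pass over a search log can be a simple count of queries plus a list of searches that returned nothing. The log format here (a query paired with a result count) is an assumption; adapt it to whatever your search platform exports:

```python
from collections import Counter

# Hypothetical search-log entries: (query, number of results returned).
log = [
    ("alaska cruise deals", 14),
    ("cancellation policy", 0),
    ("alaska cruise deals", 14),
    ("gratuities", 0),
    ("cancellation policy", 0),
    ("alaska cruise deals", 14),
]

# What people are searching for, most frequent first: the users' own words,
# useful input for your information architect's labeling and tagging.
query_counts = Counter(query for query, _ in log)

# Queries that found nothing: likely content gaps or labeling mismatches.
zero_results = sorted({query for query, n in log if n == 0})
```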
In addition to your analytics data and your customer research, don’t forget to keep an eye on what your competitors are doing. Doing a competitive content audit can help you set benchmarks against which your content can be measured over time. Just as you identified which content is key to your own organization’s business and user goals, you can identify the equivalent content on your competitors’ channels. Set up a framework for measuring your performance side-by-side with your competitors, using a scorecard to rank relative quality. You won’t have access to their analytics data, but you can evaluate content by qualitative aspects such as the tone and voice, the quality of the writing, and ease of use (set up some sample tasks, such as walking through an evaluation-to-purchase cycle for a product—how easy or hard is it compared to your own experience?). Look at their social channels to see how successful they are compared to yours and see if you can identify the strategy and tactics that might be applicable to your own business. You don’t want to simply copy what others are doing, but it’s fair game to learn from their successes.
For an excellent example of a tool for capturing and scoring a competitive analysis, take a look at the free template provided by the Content Strategy Alliance.
Creating a cycle of continuous improvement
When you’ve set up your performance-driven content framework (using Kevin Nichols’ model), identified the tools you have at your disposal to test your content, and completed your first audit, how do you use those insights to sustain performance?
The first and most important thing to remember is that this is not a one-time effort. As Kevin wrote, you need to create a closed-loop process of continuous evaluation, review, and improvement.
Your audit framework:
1. Know your business goals, including the metrics and KPIs (key performance indicators) that the business uses to measure success
2. Know your users and what they are currently doing as well as what you want them to be experiencing
3. Define the scope of your audit to make sure you’re focused on the right content
4. Evaluate the “hard” data such as analytics as well as the “soft” data such as customer feedback
5. Share your actionable insights back to stakeholders and content creators
In order to sustain the improvements you achieved by instituting a set of processes for creating and reviewing content, you need to close the loop with a governance model.
A good definition of content governance appeared right here on the GatherContent blog recently, in a post authored by Edward Baldwin: “Content Governance is the system, a set of guidelines, that determines how an organization’s content gets created and published.” Edward laid out a great set of guidelines for building governance into your content creation process. I’d like to dive a little deeper into how to formalize and operationalize a governance model.
Why govern content?
The goal of governance is to create a repeatable, accountable, visible, and predictable process for managing content. Performance-driven content has the same goal, using performance as the key factor in the process.
Content governance addresses a number of potential problems, above and beyond ensuring content performance. Lack of defined standards and holistic oversight of all channels creates risk. Ungoverned or poorly governed content can result in everything from loss of credibility or brand loyalty to loss of sales to actual legal liability.
The reigning queen of the topic of digital governance is Lisa Welchman, whose book Managing Chaos: Digital Governance by Design should be on every content strategist’s bookshelf. Lisa recommends building a digital governance model on a foundation of strategy (the objectives, policies, and standards) and a framework for who makes decisions.
Following is my own take on building that decision-making framework and the strategic and operational aspects of content governance, as they relate to managing high-performing content.
Building a governance strategy and team
In an ideal world, every content team would be staffed with enough people to easily manage a multi-step content lifecycle, customized for each publication channel, and operating on a predictable schedule. The reality, though, is that most organizations are stretched thin, and participating in a content audit takes time away from the equally pressing tasks of creating new content and developing new channels.
This is why it’s all the more important to formalize processes and identify roles, so that resource planning can take into account the time and tasks needed.
Even if your company can’t create a dedicated content governance team, you should be able to identify a set of roles and responsibilities that need to be supported. A successful governance model begins with identifying ownership of content and responsibility for decision making. A common model for documenting roles is a RACI grid—who is responsible, accountable, consulted, or informed about content decisions?
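A RACI grid is, at bottom, a mapping from content decisions to roles, which makes it easy to sanity-check. This minimal sketch uses hypothetical decisions and roles and verifies that each decision has exactly one accountable owner:

```python
# Hypothetical RACI grid: decision -> {role: "R" | "A" | "C" | "I"}.
raci = {
    "Approve new page": {
        "Content lead": "A", "Writer": "R", "Legal": "C", "Dev team": "I",
    },
    "Retire stale content": {
        "Content lead": "A", "Writer": "R", "Analytics": "C",
    },
}

def check_single_accountable(grid):
    """Return the decisions that do not have exactly one Accountable role."""
    problems = []
    for decision, roles in grid.items():
        accountable = [r for r, code in roles.items() if code == "A"]
        if len(accountable) != 1:
            problems.append(decision)
    return problems
```

An empty result from the check means every decision has a single accountable owner; anything it returns is a gap in your governance model.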
The content governance team needs to address both strategic and operational aspects of governance.
Strategic aspects of governance
Establishing your governance strategy should be similar to the process of establishing your performance-driven content strategy. Base your strategy on:
- An assessment of the current state of your content via your content audit and including all the metrics and user data described above
- An understanding of your desired future state and specific goals for achieving it
- Definition of the metrics by which you’ll assess your success at reaching that state—define your quality bar by content type and channel. Make sure your metrics are quantifiable and directly tied to quality and performance.
Communicate your strategy to all content stakeholders, including cross-organizational stakeholders such as your technical team who, for example, may be asked to build governance and workflow into your content management system or implement analytics reporting. In addition to a communication plan to do so, create a dashboard that effectively rolls up all the findings and recommendations.
Remember that, just as your content and business model evolve, so too must your governance strategy. Just as you regularly review your content against goals and metrics, review your governance strategy and implementation to be sure it’s achieving its objectives and meriting the investment in resources.
Operational aspects of governance
The operational, or tactical, aspects of governance address how the strategy will actually be carried out:
- Assessing organizational readiness
- Implementing a communication plan—the form it will take and the cadence by which it will be distributed
- Training team members on the processes (as Kathy Wagner says, “There’s no sense designing content systems without the people to make it work.”)
- Creating a schedule of content review based on the quantity and frequency with which you publish content (note: be realistic about the time and resources required)
- Establishing workflows for day-to-day governance
Some considerations in setting up a governance organization:
- Participating in governance activities may add additional responsibilities to current staff
- Additional staff may be needed, determined after roles and responsibilities are defined
Lather, rinse, repeat
If you take nothing else away from this post and its companion, let it be this: measuring the performance of your content and investing in improving it can’t be a one-time effort. It may seem like a daunting task to adopt and enforce content lifecycle management, including implementation of processes for regular auditing, establishing content standards and policies, and defining roles and responsibilities for governing content over the long term. But the return on investment is content that meets your customers’ needs, meets or exceeds your business goals, and is created and managed in a predictable, consistent way.