
5 considerations to help you make reliable content decisions

7 minute read

Marie Girard

Digital customer experience, IBM Hybrid Cloud


Have you ever been in a meeting where the group decided to publish something then left thinking “that shouldn’t have been written, let alone published!” only to find out later that everyone in the meeting thought the same thing?

Have you ever seen key information fall through the cracks of corporate silos?

Have you ever attended debrief after debrief where you dissect the same recurring problems?

If so, your organisation might be unaware of the human factors at play in collective decisions.

For a long time I thought this was just an irritating idiosyncrasy of all big organisations—until I discovered Christian Morel’s books on absurd decisions.

Morel, a former head of human resources at Renault, turned to sociology, where he studied “absurd decisions”—radical errors and how they’re made by groups of individuals.

A radical error is the result of a decision that goes against everybody’s objectives, for example:

  • Engineers who persist in using faulty joints for the Challenger shuttle
  • Pilots who collectively decide to shut down the one engine that is still working
  • Executives who persistently share illegible slides

The first book covers how radical errors happen. The second is about how organisations have improved their decision-making in high-risk environments.

Morel’s insights and experience outline some principles for what individuals and organisations should do to make the right decisions. I will describe how you can apply these principles to your content projects, but first let’s take a closer look at what happens when groups make radical errors.

Why do groups persist in making bad decisions?

The world we live in is complex and unpredictable. Complexity doesn’t fit well with linear, cause-and-effect modes of thinking that we usually associate with rational behaviour. And our individual minds are not well designed to manage this complexity. The cognitive tools we use to cope with complexity sometimes get in the way, and constrain us through a number of biases:

  • Representation errors—when there’s an elephant in the room. It’s that feeling that everything is normal when in fact it’s all going wrong. For example, when a plane crashes because the pilot relied on a faulty altimeter rather than what he could see through the windshield. Or when we decide to launch a campaign that is preposterous or even insulting in the current political context (here's a recent example).
  • Target fixation—when we take more risks as we get closer to our goal. For example, when we’re one day from the delivery of a website redesign and we decide to add that extra, unusable feature at the last minute.
  • Loss of meaning—when action becomes an objective of its own. For example, when a team decides to remove links from their content because of false positives in the link-checker reports.
  • Attention deficit—when we’re stressed or tired. For example, when an editor spends hours making corrections to a document and misses a glaring typo in the title.

The list goes on. Check out this infographic for a full picture of our cognitive biases.

Morel observes that the biases we have at the individual level are amplified when we work in groups. We often make decisions based on what we think others think. And sometimes, based on what we think others think we think. This leads to a phenomenon called “groupthink”, where everybody privately disagrees but keeps quiet.

Part of what led to the Challenger disaster was that the experts knew the joints were faulty, yet none of them questioned the decision to launch despite that knowledge.

To add to that, there are power struggles at play between managers, experts, and novices. Morel states that it always takes three people to make a decision, and when the weight of each role becomes unbalanced, there is a higher risk of the group making a bad decision.

The manager has the power to act on the decision. The expert has the power to validate the decision. The novice has the power to understand and apply the decision. All of them can either take the decision, support it, oppose it, or not be part of it.

For example:

  • A plane crashes because the copilot followed the captain’s orders; a communications department publishes leaflets with incorrect information. (The manager takes the decision, experts and novices are not part of it)
  • Marketing validates technical content that people don’t like. (The expert takes the decision, the manager follows, and novices are not part of it)
  • A nuclear plant explodes because people maintaining the plant were not experienced and had no supervision. (Novices take the decision, managers support it, experts are not part of it)

We all know, or have heard of, a manager who makes decisions without consulting anyone, but just having a technical expert in the process is no guarantee of a reliable decision. It’s the imbalance in decision-making power between the three roles that can lead to disaster.

We live in a complex world, our perception is flawed, group dynamics are riddled with traps, and decision-making power structures are often unbalanced. Everything seems to be set up for failure when people work together on a content project.

What can we do?

Thankfully, in his second book Morel presents solutions to these thorny decision-making problems. Here are 5 things you can do to ensure you make the right decisions on your content projects.

1. Start with yourself

We’re all human, and nobody’s perfect. But it can be difficult to admit that you’re fallible. All the same it’s worth taking the time to acknowledge your flaws.

For example, 80% of pilots admitted to making mistakes when tired, compared to only 30% of surgeons. Now guess which is more likely: dying in a plane crash, or dying from a post-surgery complication. So if you find yourself unable to admit that you make mistakes—beware! Good group decisions start with admitting that everyone makes mistakes. Including you. Yes, you, the content expert. Take some time to reflect on your own mistakes and the cognitive factors that may have caused them. Was it target fixation, attention deficit, a representation error? Or maybe a mix of those?

With better awareness of these factors, you reduce risks. And if you can train your teams to become more aware, you reduce risks for the whole organisation.

2. Set up imperfect but simple rules to manage uncertainty

In our complex world, we have to deal with uncertainty. There are many things we cannot predict, and when the unexpected happens, we must decide and react quickly.

There are various ways you can deal with uncertainty. For example, mountaineers have to deal with the possibility of being caught in an avalanche:

  • Solution 1: Avoid risk – but zero risk is impossible, and could you really ban all mountaineering in winter?
  • Solution 2: Manage the risk through scientific analysis and deduction – this has many limitations. If you take a sample of snow in one spot, a sample taken 5 metres away might lead you to the opposite conclusion. And by the time you get back to base with your samples, wind or fresh snow might have completely changed your conclusions.
  • Solution 3: Accept uncertainty and the risk that comes with it, and deal with it with simple rules. The rules are simple and therefore imperfect; they may be based on impressions rather than scientific evidence. But they have to be simple so that they can (and must) be used by everybody.

Switzerland went for solution three and put in place a short checklist that mountaineers complete as a group before hiking in the snow:

  • Have there been avalanches in the past 48 hours?
  • Is there accumulated snow?
  • Is there a corridor that can be identified by a novice?
  • Has there been a recent thaw?
  • Are there obstacles (trees, cliffs) that would prevent an escape?
  • Was an avalanche warning issued by the main office?
  • Is the snow unstable?

The number of avalanche deaths decreased radically with the use of that checklist. In France, where the checklist hasn’t been put in place, the numbers remain the same.

For content projects, basic templates and checklists are easy to share and help contributors make better choices, even when those choices can’t be backed with data.

For example, before you publish a piece of content, you could check these simple things (some of them could even be automated, as in the sketch after this list):

  • Have we run a spellcheck?
  • Do images have alternative text?
  • Do we have the right SEO keywords in our titles?
  • Does the piece include the 3 sections that we have defined in the template? Does the content match the purpose of these sections?
  • Has an event happened recently that would make this content irrelevant or inappropriate?
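
To make this concrete, here is a minimal, hypothetical sketch of how a couple of the checks above could be automated for an HTML draft. It assumes Python with the BeautifulSoup library and a known list of target SEO keywords; the judgement calls (template fit, timeliness) still need a human.

```python
# Hypothetical pre-publish checks for an HTML draft: a sketch, not a full QA tool.
from bs4 import BeautifulSoup


def pre_publish_checks(html: str, seo_keywords: list[str]) -> dict[str, bool]:
    """Return pass/fail results for the checks that can be automated."""
    soup = BeautifulSoup(html, "html.parser")
    images = soup.find_all("img")
    title = (soup.title.string or "") if soup.title else ""

    return {
        "all images have alt text": all(img.get("alt") for img in images),
        "title contains an SEO keyword": any(
            keyword.lower() in title.lower() for keyword in seo_keywords
        ),
    }


if __name__ == "__main__":
    draft = (
        "<html><head><title>Reliable content decisions</title></head>"
        "<body><img src='diagram.png' alt='decision flow'></body></html>"
    )
    print(pre_publish_checks(draft, ["content decisions"]))
```

Checks like these won’t replace the conversation around the checklist, but they catch the mechanical misses before tired eyes do.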

3. Deliberately cross-control communications and interactions

The way we communicate naturally, and group dynamics, can lead to misunderstandings such as implied consensus.

To counter that, some companies use standard ways of communicating and interacting that make group communication more effective.

For a one-to-one conversation, a question would be followed by a confirmation of the question, then an answer, which would be followed by a confirmation of the answer.

JOHN: Jane, what’s your take on the document’s readiness for publication?

JANE: Well, my take on the document’s readiness for publication is that we’re pretty much ready to go, but we still need to go through the readiness checklist.

JOHN: Ok, so you’re saying we can publish as soon as we’ve gone through the readiness checklist.

JANE: Yes.

A caricature of this way of communicating is the exchange between pilots and ground control – but if they didn’t communicate like that, there would be many more plane crashes! Some Japanese companies have even introduced instructions for the body language to use when raising warnings.

For a group conversation, putting a proposal to the group and asking “any objections?” would be a sure way to get an implied consensus. To make sure everybody has understood the proposal and voiced their concerns, you would instead get each participant to confirm the proposal individually.

Beyond conversation, cross-control can happen at the interaction level, where people do their tasks together and learn from each other. For content design, you can put that in place through pair-writing activities and design workshops.

Rules of effective communication and peer cross-control are not natural, so they have to be standardised and put in place deliberately if you want them to happen.

4. Bring diverse groups together around integrated processes

Our perception is flawed, and we cannot keep everything in mind. Multiply all the things you as an individual need to remember by the number of divisions in an organisation, and you get a big headache.

In a hospital, surgery involves various teams that don’t necessarily talk to each other: the surgeon, the anaesthetist, the nurses, the administrators, and so on. The number of patients who are operated on the wrong side of their body, or who leave the hospital with surgical instruments still inside them, is much larger than we care to imagine.

To address this issue, some hospitals use checklists at various checkpoints. All participants are required to fill in the checklist together.

The pre-surgery checklist includes questions such as:

  • Have we checked the patient’s identity?
  • Which side of the body are we operating on?

The post-surgery checklist includes questions such as:

  • Do we have the right count of tools?

This might sound over the top, but hospitals that use these systematic checkpoints have reduced post-surgery deaths by half.

Content projects usually don’t involve inserting instruments into bodies, but they do often require collaboration between disciplines. Customer experience and journey maps can act as checklists that bring various stakeholders together around what matters most.

An end-to-end map of the as-is customer journey will help kick-start a project. Multidisciplinary groups can identify gaps and inconsistencies along the journey and devise ways to address them.

In the middle of a content project, the group can review a plan or editorial calendar, to make sure everything is aligned with what was decided at the start.

At the end of the project, another review of the customer journey, supported by the right KPIs, will help the group assess how well the project delivered.

5. Turn your organisation into a mission-driven, learning organisation

Managers, experts, and novices all play a role in the decision-making process, and the weight an organisation gives to management power shapes how decisions get made.

Let’s take a very risky environment as an example: a nuclear submarine, where the tiniest mistake can mean disaster for the whole crew. Even though the regular military organisation is very hierarchical, on a nuclear submarine hierarchy is kept to a minimum, with relative autonomy for each crew member. When they enter the submarine, commanders and officers switch to “high-risk” mode and remove their insignia. On board there is a strong notion of mission command: everybody in the crew has a very good understanding of their mission, so decisions are taken collectively and with a shared understanding of the objectives. You would think working on a nuclear submarine is the riskiest experience you could have, yet the number of accidents is very, very small.

The other problem with hierarchy is that it prevents the organisation from learning from its mistakes.

Investigating a series of plane crashes, an airline found the problem was related to general hierarchical pressure and a name-and-blame culture. When a mistake was made, the culprit was identified and punished as an example. So people avoided drawing attention to their mistakes and looked for ways to hide issues.

The reality is things are much more complicated than that. It’s rare that a single person is responsible for an error – remember it takes 3 people to make a decision. And making an example of someone doesn’t prevent the error from repeating.

The airline put in place a mandatory, anonymous debrief platform where people could share their issues. One of the objectives was to spot near misses and address weak signals as soon as possible. The debrief reports were meant to be shared and to serve a pedagogical purpose. The airline soon became more reliable and improved its safety record.

Content projects often reveal weaknesses in organisations: the cracks in information flows that lead to bad decisions. You can rarely drive organisational change just by highlighting the problems found in content projects. But you can at least define the mission with a message architecture. Set out an overall mission for your content, its emotional undertones, and the tone and voice that should be used throughout.

A well-designed message architecture, paired with shared responsibility for content results, can help all contributors play their part in the creation and review of content. Without crashing the plane.

Sympathy for the devil’s advocate

You will find many of these five principles at work on agile projects. Agile is much more of a counter-culture than we might think, and adopting it goes beyond daily stand-up meetings and iterations.

So now might be the time to start cherishing devil’s advocates and adopt a form of counter-culture:

  • Acknowledging our own cognitive limitations
  • Accepting uncertainty and managing it with basic content delivery checklists
  • Pair-writing, and designing content through workshops
  • Taking the time to discuss content choices with diverse groups, and listening to those who challenge decisions
  • Taking errors as opportunities to learn and adjust course of action, with no shame or blame

Decision making is not something that just happens. By learning about cognitive bias and group dynamics, you can become more mindful about how decisions get made in your content projects. And dodge those content catastrophes!


About the author

Marie Girard

Marie manages digital customer experience projects for IBM Cloud business rules software. With a background in technical communication and information architecture, she develops scalable content strategies and organisational frameworks to deliver the right content at each stage of the customer experience.

She considers content and organisations as living systems. She leverages collective intelligence and design thinking methodologies to drive meaningful innovation.

Outside of IBM, she continues to investigate holistic practices through yoga, and teaches content strategy and design at Paris Diderot University.
