Behavioural Insight Brief: Applying Behavioural Insights to Government Organizations

On this page

Context
Evolution of Behavioural Insights: Influencers and Influences
Cognitive Biases
Behaviourally Informed Organizational Solutions
5-Step Method to Improve Decisions
Be Bias Aware and Prepared!
Sources

Context

Policy Horizons Canada is interested in drawing on behavioural sciences to identify the often hidden individual and group behaviours that limit the effectiveness of organizations. Until recently, much of the focus has been on better understanding and “nudging” citizens’ behaviours toward improved public outcomes. Far less work has been done by government organizations to understand and apply behavioural insights introspectively. The study of organizational behaviours can help management identify decisions that can suffer from personal or organizational biases. Looking first at individual behaviours, and then applying them to a broader organizational perspective, is particularly relevant in a government context, given that decision making and discussion in the public service often occur in groups.

Evolution of Behavioural Insights

Behavioural insights are gleaned from the broad behavioural sciences field, and more narrowly from the newer field of behavioural economics, which is continually evolving and has been subject to numerous influences, from economists, psychologists and philosophers alike. Some of the more significant influencers and influences are listed below.

The Behavioural Paradox of Government Policy

“Government agencies increasingly use behavioural irrationalities as a justification for government intervention, the paradox is that these same government policies are also subject to similar behavioural inadequacies…”
– W. Kip Viscusi and Ted Gayer

Influencers:

Adam Smith

Economist and philosopher Adam Smith is often referred to as the founding father of economics. His book The Theory of Moral Sentiments, published in 1759, explored many concepts relevant to the modern field of behavioural economics, including the psychology behind decision making, motivation and interaction. He argued that behaviour is determined by a struggle between the passions (or emotions) and the impartial spectator (which allows one to see oneself from another’s perspective). Smith’s work is particularly relevant to modern-day studies of rewards and punishment, and of loss aversion.

In the early 1900s, economics slowly moved away from its links to psychology. For example, economist Vilfredo Pareto (1848-1923) wrote, “…pure political economy has therefore a great interest in relying as little as possible on the domain of psychology.” Despite this, his writings included speculation about how people think and feel about economic choices, which in turn influenced modern-day approaches to nudging people out of inertia.

Sigmund Freud

The psychoanalyst Sigmund Freud popularized the idea of the conscious versus unconscious mind in the early 1900s, describing it as an iceberg with three levels:

  1. the conscious – what we’re fully aware of at any one time
  2. the pre-conscious – what we could become aware of quite easily if we switched our attention to it (e.g. using our memory to recall a common route instead of walking or driving on “auto pilot”)
  3. the unconscious – beliefs, patterns or subjective maps of reality we’ve pushed out of our conscious mind through repression, which drive our behaviours and continuously influence our judgements and feelings.

Initially, the field of psychology was skeptical regarding the idea of mental processes operating at an unconscious level, but it later theorized that most “information processing resides outside of consciousness for reasons of efficiency.”

Burrhus Frederic (B.F.) Skinner

Behaviourist B.F. Skinner believed that human behaviour is best understood as a function of incentives and rewards. His focus was on the influence of one’s environment as a determinant of behaviour. Skinner is best known for the “operant conditioning chamber,” a box he developed in 1930 to study animal behaviour (wherein a rat quickly learned to operate a lever to dispense rewards). In the late 1950s, scholars redirected the field of psychology back toward internal mental processes, like memory and emotion.

Herbert Simon

Herbert Simon’s research into decision making in economic organizations dates back to the 1940s. His critique of the classic (utility) concept of how rational people are expected to behave in decision-making environments led him to recommend looking to psychology for better insights into how people process information. He introduced the concept of “bounded rationality” in the 1950s, suggesting that instead of considering all possible options to arrive at a decision, people actually consider a much smaller number. Applying bounded rationality to organizational communications, Simon proposed that people who make decisions in organizations never have “complete” information, and that even if they did, they would likely pick the first acceptable option that comes to their attention.

Daniel Kahneman and Amos Tversky

Two other key influencers were Daniel Kahneman and Amos Tversky. In addition to exploring a number of biases influencing judgement and decision making, they introduced prospect theory in 1979 to explain some systematic choices most people make—choices that again contradict the strictly rational model. Prospect theory states that people make decisions based on the potential value of losses and gains rather than the final outcome, using heuristics (mental shortcuts). Heuristics usually involve focusing on one aspect of a complex problem and ignoring others, which can lead to systematic deviations from logic, probability or rational choice.

Kahneman went on to note that people’s behaviour is driven by two systems: System 1 (automatic, cue-driven, habitual thinking) and System 2 (logical, motivated, deliberate thinking). Decisions made through System 1 are quick snap judgements formed through intuition, while System 2 decisions are derived from careful and thoughtful consideration. Each type of thinking is also associated with its own cognitive biases.

Cass Sunstein and Richard Thaler

In 2008, Professors Richard Thaler and Cass Sunstein published the influential book “Nudge.” The book provides insights into how people reach decisions and outlines strategies to influence people’s decision making without significantly changing economic incentives or limiting options. It coined the term “nudges” as an application of behavioural insights to help individuals make more optimal decisions. The authors noted that “to count as a mere nudge, the intervention must be easy and cheap to avoid. Nudges are not mandates. Putting fruit at eye level counts as a nudge. Banning junk food does not.”

Cass Sunstein and Reid Hastie

In 2015, Professors Cass Sunstein and Reid Hastie published the book “Wiser: Getting Beyond Groupthink to Make Groups Smarter.” In it they outline numerous biases and behavioural problems that can occur in group decision making and offer tactics and lessons to achieve better outcomes. Their findings apply to groups in companies, school boards and government alike.

The authors note that problems occur when individual biases are not corrected at the group level, resulting in cascade effects and errors in judgement that are often amplified. Moreover, groups can become polarized—members end up adopting a more extreme version of their original position after deliberating together. The problem is especially common for groups of like-minded people.

Influences:

Noise

Daniel Kahneman, Andrew Rosenfield, Linnea Gandhi and Tom Blaser’s recent research on “noise” points to the problem of inconsistent decision making in organizations. They note that this chance variability in judgements can be extremely costly to companies, and that the problem is often invisible because people do not go through life imagining plausible alternatives to every judgement they make.

In a 2016 talk at a conference, Kahneman noted that “When people think about error, we tend to think about biases … but in fact, a lot of the errors that people make are simply noise, in the sense that it’s random, unpredictable, it cannot be explained.”

Noise can be experienced by the same person over time or by two or more people doing similar jobs. Typically, when several people are given the same information and asked to make a decision, the resulting decisions vary in ways that cannot be explained statistically. This can be seen, for example, where people have to make hundreds or thousands of judgements or decisions, such as parole decisions or tribunal rulings regarding government benefits. Professor Kahneman notes that “long experience on a job always increases people’s confidence in their judgements, but in the absence of rapid feedback, confidence is no guarantee of either accuracy or consensus.”

Cognitive Biases

Harvard Professors John Beshears and Francesca Gino note that there are two main causes of poor decision making: insufficient motivation and cognitive biases. To determine which is causing the problematic behaviour, they recommend asking two questions:

  • Is the problem caused by people’s failure to take any action at all? If so, the cause is a lack of motivation.
  • Are people taking action, but in a way that introduces systematic errors into the decision-making process? If so, the problem is rooted in cognitive biases.

Biases often play an important role in behavioural insights. The more we understand them, the more prepared we will be to make better decisions, which is essential for high-performing organizations.

Biases influence how inferences, judgements and predictions are drawn. Subtle, ingrained biases are deeply rooted in our evolution and cultural past. Even people who actively try to avoid bias will unknowingly act on subtle yet damaging stereotypes.

Cognitive biases (i.e. biases related to the mental processes involved in gaining knowledge and comprehension) are closely related to implicit/social biases (relatively automatic features of prejudiced judgement and social behaviour). Implicit bias occurs, for example, when someone consciously rejects stereotypes and supports anti-discrimination efforts, yet unconsciously holds negative associations.

Some 200 types of biases have been documented, although many overlap or are simply different names for the same tendency. Common biases include:

Blind-spot Bias

People tend to overlook how biases play a role in their own decisions. Despite their best efforts to be objective, many people fail to see their own actual degree of bias. A multi-university study, published in Management Science in 2015, showed that in a sample of more than 600 residents of the United States, more than 85 percent believed they were less biased than the average American. Only one adult out of the 661 considered themselves more biased than the average person. People’s tendency to believe they are less biased than their peers has detrimental consequences at the organizational level.

Planning Fallacy/Optimism Bias

Underestimating the amount of time needed to complete a future task is common in workplaces and is often a direct result of the optimism bias, which causes us to overestimate favourable and positive outcomes. In a group setting, overly optimistic or overly risk-averse tendencies may be exacerbated, leading to deadline or budget projections that are far off the mark.

Affinity (Like Me) Bias

Affinity bias is often referred to in the context of hiring, where interviewers and managers tend to hire (and assign new responsibilities and promotional opportunities to) those who are similar to themselves.

Bandwagon Effect

The uptake of beliefs, ideas and trends increases the more they have already been adopted by others. Individuals in meetings, for example, may be more likely to adopt views that are widely supported and echoed by others. In a group decision-making context this can lead to groupthink when the desire for harmony, and fear of standing out, overrides a realistic appraisal of alternatives. Subconscious group pressures to conform can lead to a deterioration of mental efficiency, reality testing and moral judgement.

Professors Sunstein and Hastie underline in their book “Wiser” that information signals and social pressures can cause groups to follow what others say and do and amplify, rather than correct, individual errors in judgement. An information cascade develops when people abandon their own privately held knowledge and rely instead on the publicly stated judgements or actions of their predecessors. They may fail to disclose what they know out of respect for their colleagues and frequently convince themselves their own views must be wrong.

In a deliberating group, people who are in a minority position often silence themselves, even if they have important information. Studies have also shown that less educated people, visible minorities and sometimes women carry disproportionately less weight within deliberating groups.

Confirmation Bias

People have a natural tendency to search for, focus on and evaluate information that fits in with their own thinking and preconceptions. Ironically, people who have a facility for deeper deliberation often use their cognitive powers to justify what they already believe and find reasons to dismiss contrary evidence.

Research shows that people with the most education, the highest mathematical abilities and the strongest tendencies to be reflective about their beliefs are the most likely to resist information which contradicts their prejudices.

Psychology studies have also shown that education and intelligence do not stop people’s politics from shaping their broader worldview, even when those beliefs do not match the hard evidence. Instead, the ability to weigh the facts may depend more on curiosity. This can lead to the odd situation where the people most extreme in their views are more scientifically informed than those who hold more balanced views.

Conservatism/Status Quo Bias

New evidence is more likely to be scrutinized and treated with disbelief than established information. This is closely related to status quo bias, which results in a tendency to focus on just one possible future, one objective or one option in isolation.

Future/Present Bias

Most people naturally have a tendency either to focus on long-term goals or to want immediate gratification. As entrepreneur Derek Sivers explains, always sticking to one pattern of thought is detrimental. The greatest success comes from overriding one’s natural tendency and finding a balance between the two.

Anchoring/Saliency Bias

People often judge things relative to some arbitrary reference point. For example, individuals tend to be over-reliant on the first piece of information they hear and often come back to that initial assumption even after further reflection. Anchoring is frequently used to establish a reference point (e.g. an opening bid) in negotiations. The anchoring bias is also somewhat similar to the availability heuristic and saliency bias, which lead people to make decisions on the basis of just a few facts or to overestimate the probability of events that have greater “availability” in their memory—memories that are recent, salient and vivid. This is why it is critical to be cautious when responding to fast-breaking issues. Thoughtful deliberation and clear communication, in and between groups, are all the more important when decisions are being made in an emotionally charged climate, such as one surrounding public health and safety.

Behaviourally Informed Organizational Solutions

While it is impossible to be free of biases, simply being more aware of their existence is a good starting point. There are numerous ways to harness biases and avoid decision-making pitfalls. The following are some examples of behaviourally informed organizational solutions that can be used individually, or in combination:

Blinding

Blinding keeps select information hidden to reduce bias and improve decision making. It can be particularly effective against implicit/social biases. The psychologist Robert Rosenthal, a leading methodologist, concluded that the best way to reduce the chances of bias unconsciously affecting decision processes is to mask bias-sensitive information (e.g. gender, race, age, university) for as long as possible.

The blind method has been used to counter gender bias: when musicians auditioned behind a screen, the acceptance rate of women into symphony orchestras increased. This method is also used to support more rigorous research. In medical science, both subjects and researchers are often unaware of who is in the treatment or control groups of clinical trials. Similarly, scholarly journals routinely remove authors’ names and institutions so submissions can be assessed on their scientific merit alone. The practice is also being extended to hiring in organizations such as the Treasury Board of Canada Secretariat and the Behavioural Insights unit in the UK, where identifiable information is removed early in the process (e.g. from CVs or qualifying exams).
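
To illustrate the mechanics, the following minimal Python sketch (the field names and records are hypothetical) masks bias-sensitive fields from an application before reviewers see it:

```python
# A minimal sketch of blinding; field names and records are hypothetical.
BIAS_SENSITIVE = {"name", "gender", "age", "university"}

def blind(application: dict) -> dict:
    """Return a copy of the record with bias-sensitive fields masked."""
    return {key: "[REDACTED]" if key in BIAS_SENSITIVE else value
            for key, value in application.items()}

cv = {"name": "A. Candidate", "gender": "F", "age": 41,
      "university": "U. of Example", "skills": "statistics, evaluation"}
print(blind(cv))
# Reviewers see only the merit-relevant field:
# {'name': '[REDACTED]', ..., 'skills': 'statistics, evaluation'}
```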

Arouse Emotion

Professors Beshears and Gino recognize that emotions and biases that accompany subconscious thinking often wreak havoc, but they can be tapped for productive purposes. Their recommended approach to tackling this type of thinking is to arouse emotions. For example, giving new employees the opportunity for self-reflection about their strengths and how they can apply them to their organization increases an employee’s emotional bond with the organization and improves their performance.

In the UK, tweaks to an email sent to aspiring police officers before an online entrance exam increased by 50 percent the probability that a visible-minority applicant would pass the recruitment stage. The reminder email that applicants saw before taking a test of situational judgement was friendlier in tone and prompted them to consider what becoming a police officer would mean to them and their community.

Debiasing

Training individuals to learn about and detect situations in which biases can occur in the workplace can de-bias decision making in the short and long term. In research conducted for the U.S. Air Force, a combination of games and videos was found to have significant de-biasing effects, with some trials demonstrating an elimination or muting of the targeted biases.

Find hidden information/dissenting points of view

Techniques should be applied to seek out the unique information that is often held by just one or a few group members. Leaders should make it clear they welcome new information and/or dissenting points of view. Providing a safe space for challenging discussions may include communicating and ensuring rules of engagement are respected.

People often keep silent because they receive very few benefits from disclosure. If rewarded when their group succeeds, they are far more likely to reveal what they know, reducing hidden profiles, cascades and group polarization. Incentives can be restructured to reward group success and encourage the disclosure of information. Identification with the group’s success is more likely to ensure that people will say what they know, regardless of whether it fits “the party line.”

Self-silencing

When group leaders express their views early, this can discourage disagreement and promote self-censorship. Leaders and other high-status members (influencers) can do groups a favour by indicating a willingness to hear uniquely held information. By refusing to take a firm position at the outset, they can make way for more information to emerge. A strong facilitator can help to encourage more equitable participation in group discussions.

Leaders can also force new perspectives by using “pre-mortems” to imagine a future failure, setting up role-playing initiatives, or designating devil’s advocates or red teams to challenge assumptions and decisions. Group members should be told before deliberations begin that each has a different and relevant role—or at least distinctive information to contribute.

Planning Prompts

Despite people’s best intentions, they often forget or fail to follow through. Beshears and Gino note that simple planning prompts can help employees achieve their goals. In one study, letters mailed to employees described the benefits of flu shots as well as clinic times and locations; adding a space to write down the time they planned to visit a clinic caused the target group to briefly engage in logical and deliberate thinking. Prompting them to form plans by jotting down a time, even though they were not actually scheduling an appointment, increased the number of employees who got the shots by 13 percent. Similarly, a leader can help teams follow through on resolutions by having members create clear maps that detail when and how they plan to reach their goals.

Algorithms and Systematic Analysis

Algorithms (a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer) can be used to take human fallibility out of the planning equation. For example, project management software can help predict workflow and task duration, benchmark milestones and give planners an early warning when things start to go wrong. The growth of artificial intelligence will likely see computers used more often for that purpose.

Simple statistical algorithms often produce more accurate predictions and decisions than those generated by experts, even when the experts have access to more information. Superior consistency allows even simple and imperfect algorithms to achieve greater accuracy than human professionals. Despite this, algorithm aversion is common. One interesting study suggested that the best way to get people to use imperfect algorithms (which are still better than human judgement) is to allow people the option of adjusting them slightly.
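
The following minimal Python sketch (the weights, fields and adjustment cap are hypothetical, not drawn from the study) illustrates both ideas: a simple linear scoring rule applied identically to every case, and a small bounded "human tweak" in the spirit of the adjustable algorithms described above:

```python
# A simple, consistent linear scoring rule (hypothetical weights), plus a
# bounded human adjustment, reflecting the finding that people accept
# algorithms more readily when allowed to modify their output slightly.

WEIGHTS = {"experience_years": 0.4, "test_score": 0.5, "references": 0.1}

def algorithm_score(case: dict) -> float:
    # Fixed weights applied identically to every case; this consistency is
    # the main source of the algorithm's advantage over expert judgement.
    return sum(WEIGHTS[k] * case[k] for k in WEIGHTS)

def adjusted_score(case: dict, human_tweak: float, cap: float = 5.0) -> float:
    # Allow a human override, but only within a small bounded range, so most
    # of the algorithm's consistency is preserved.
    return algorithm_score(case) + max(-cap, min(cap, human_tweak))

candidate = {"experience_years": 6, "test_score": 78, "references": 9}
print(algorithm_score(candidate))        # 42.3: the pure algorithm
print(adjusted_score(candidate, 12.0))   # 47.3: human tweak capped at +5
```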

Well-established processes, methods and data sources based on sound evidence and economic principles can be invaluable in helping to counter personal and organizational biases. Getting extremely analytical and systematic about the facts (e.g. through quantifiable metrics) not only helps to maintain objectivity, it also helps ensure consistency, which is important for combatting organizational noise.

Helpful tools in this regard include proper procedure (following the letter of the law, checklists, etc.), consistent control of (admissible) information and the systematic collection of feedback (without which you may never realize that you are making systematically biased decisions).

Noise Audit

Companies can also consider conducting a noise audit. This involves several professionals independently evaluating a few relevant cases. Kahneman stresses that the point of a noise audit is not to produce a report and notes that buy-in is easier to achieve if executives view the study as their own creation. He recommends that cases be compiled by respected team members and cover the range of problems typically encountered. To make the results relevant to everyone, all unit members should participate in the audit. A social scientist with experience in conducting rigorous behavioural experiments should offer technical supervision of the audit, but the professional unit must own the process.
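
To make the core measurement concrete, here is a minimal Python sketch (the case names and dollar figures are hypothetical): several professionals independently evaluate the same cases, and the spread of judgements on each case quantifies the noise:

```python
# Core measurement of a noise audit: several professionals independently
# evaluate the same cases; the spread of judgements quantifies the noise.
# All case names and dollar figures are hypothetical.
from statistics import mean, stdev

evaluations = {                       # case -> independent judgements ($)
    "claim_A": [52000, 61000, 48000, 70000],
    "claim_B": [15000, 14500, 16000, 15200],
}

for case, judgements in evaluations.items():
    avg = mean(judgements)
    noise = stdev(judgements) / avg   # variability relative to the mean
    print(f"{case}: mean judgement {avg:,.0f}, relative noise {noise:.0%}")
```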

Cost-benefit analysis

Sunstein and Hastie explain in their book “Wiser” that cost-benefit analysis acts as an indispensable safeguard against individual biases, group errors and noise. If group polarization leads people to extreme action, or to inertia, cost-benefit analysis can impose a reality check. The net-benefit test, for example, has become a commonly used standard, imposing a form of systematic analytical discipline across government departments. Rigorous cost-benefit analysis typically considers how an initiative may affect Canadians, organizes and systematically assesses consequences, and describes the implications of policy alternatives.
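
As a worked illustration of the net-benefit test (all figures and the 3 percent discount rate are hypothetical), each year's benefits and costs can be discounted to present value and summed; a positive total passes the test:

```python
# The net-benefit test: discount each year's (benefit - cost) to present
# value and sum. All figures and the 3% discount rate are hypothetical.

def net_present_benefit(benefits, costs, rate=0.03):
    return sum((b - c) / (1 + rate) ** year
               for year, (b, c) in enumerate(zip(benefits, costs)))

# Year 0 is an up-front cost; benefits arrive in later years.
npb = net_present_benefit(benefits=[0, 40, 60, 60], costs=[100, 10, 10, 10])
print(f"Net present benefit: {npb:.1f}")  # positive, so the test passes
```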

Broaden Thinking & Consult Outsiders and Outside Information

Natural cognitive biases in strategic decision making, such as groupthink, self-preservation, confirmation bias and overly optimistic forecasting, can be countered by bringing together diverse points of view. For government organizations, deliberating with stakeholders outside the public service will also help provide fresh, practical and candid insights.

  1. Prediction markets relate to the “wisdom of crowds.” Although they have been used for political forecasting and commercially for more than two decades, they also show great potential for drawing out diverse opinions and ensuring people’s views are not influenced by those around them. Prediction markets use a combination of gamification and futures markets to allow people to trade on the likely success of policies, ad campaign strategies, candidate viability, events and more. Prediction markets can avoid the biases that conventional research surveys are prone to (sampling bias, respondent fatigue, dubious responses to overly long or irrelevant questionnaires, etc.). In prediction markets the sample is self-selecting, as people only answer the questions that interest them, and rather than indicating what they would do, they indicate what they think the outcome will be. They decide if and how much they want to bet, and the extent to which they wish to bet can be very indicative of the outcome. Internal prediction markets can also provide insight into how organizations process information: they give employees incentives for truthful revelation and can capture changes in opinion as new information is introduced.
  2. The Delphi approach (a series of anonymous votes interspersed with deliberation) combines the benefits of individual decision making with social learning. Anonymity insulates group members from reputational pressures and reduces self-silencing of some group members.
  3. Crowdsourcing is also gaining popularity and is helpful for keeping policy up to date. Technology has made it much easier to crowdsource on a larger scale, broadening access to users and experts with diverse skills and experiences. The UK government’s open policy making toolkit cites a number of examples where crowdsourcing has helped policy makers and the public work together to generate ideas for further testing. Crowdsourcing can also take the form of large-scale online games, tournaments or nudgeathons.
  4. Consulting a number of experts, such as in the form of challenge panels, can also help ensure a well-informed decision.

Evaluations

A Mowat/KPMG study recommended that evaluation criteria be used to help public servants consider a variety of different viewpoints when drafting policy proposals (such as business cases, strategic plans or cabinet submissions). These proposals would be evaluated by the extent to which they included contributions from relevant stakeholders.

Similarly, joint evaluations or multi-evaluations can help reduce bias. For example, a manager conducting a performance review of his or her employee may wish to also consult with multiple evaluators (e.g. other managers and clients) who have had an opportunity to assess the employee’s work directly and in different circumstances.

Reference class forecasting

Kahneman and Tversky’s behavioural studies found that people tend to be overly optimistic due to overconfidence and insufficient consideration of distributional information (i.e. risk) about outcomes. This is especially evident in project management, where people frequently underestimate the costs, duration and risks of planned actions. They noted that reference class forecasting (or comparison class forecasting) can help counter the planning fallacy.

The Standish Group, a U.S.-based consulting firm, has been tracking the performance of IT projects since the mid-1990s and has found very little variation in results. In 2015, 50,000 projects worldwide were examined, including in government and the private sector. About 30 percent of these efforts succeeded—that is, they were on time, on budget and produced a payoff. Roughly half the projects ran into difficulty and nearly 20 percent failed outright. It was found that the larger and more complex the project, the higher the rate of failure.

This finding is supported by research conducted in 2012 by McKinsey & Company and the University of Oxford, which showed that half of all large ($15 million+) information technology projects massively exceeded their budgets. The researchers studied more than 5,400 projects and found that, on average, large IT projects ran 45 percent over budget. Software projects ran the highest risk of cost and schedule overruns. The findings—consistent across industries—amounted to a total cost overrun of $66 billion. One of the remedies suggested by McKinsey & Company, as well as the American Planning Association, is the use of reference class forecasting.

Kahneman and Tversky concluded that disregard of risk is perhaps the major source of error in forecasting. They recommended forecasters be less inwardly focused when planning their projects and use reference class forecasting (i.e. consider comprehensive external data and look at distributional [risk-related] information from similar completed ventures).
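
A minimal Python sketch of the idea (the overrun ratios and budget estimate are hypothetical): rather than trusting the team's inside-view estimate alone, uplift it using the distribution of actual-to-estimated cost ratios from similar completed projects:

```python
# Reference class forecasting: uplift the team's inside-view estimate using
# the distribution of actual/estimated cost ratios from similar completed
# projects. The ratios and estimate below are hypothetical.
from statistics import quantiles

past_overrun_ratios = [0.9, 1.0, 1.1, 1.2, 1.3, 1.45, 1.6, 1.8, 2.1, 2.6]

inside_view_estimate = 10_000_000     # the team's own budget estimate

# Pick a percentile matching the organization's risk tolerance; the 80th
# percentile gives a budget the project has an 80% chance of staying under.
p80_ratio = quantiles(past_overrun_ratios, n=10)[7]   # 80th percentile
print(f"P80 budget: {inside_view_estimate * p80_ratio:,.0f}")
```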

Foresight Methods

Foresight methods do not predict or forecast but instead explore alternative plausible futures (e.g. 10 years out or further), considering emerging challenges and opportunities that could arise from disruptive changes. Foresight tools help people share, explore and test their mental models about how the world is changing and what it could mean for their organization. This approach can help challenge current assumptions (e.g. in government policy planning).

Choice Architecture

Public-policy makers are increasingly using choice architecture tools to nudge people toward better decisions on issues such as tax payments, medical treatments, consumer health and wellness, and climate-change mitigation.

Choice architecture can be used to improve people’s decisions by modifying the environment in which people make their decisions. For example, designing well-marked staircases, with motivational health-related messages on the stairway walls, can encourage more people to take the stairs.

Choice architecture tools can also strategically design how information and options are presented to nudge people toward positive outcomes. Defaults such as automatic enrollment in long-term savings plans, for example, can be used to boost such things as post-secondary education enrollment or retirement savings.

The Last Mile and Randomized Controlled Trials

Combining big data analysis with behavioural insights is a best-of-both-worlds approach that has been called “the last mile.” Professor Dilip Soman notes that many organizations invest a lot of time analyzing data and then creating a strategy and developing new policies or programs based on the data. They pay far less attention to the end: the crucial “last mile,” which takes into consideration the less obvious behaviours of their target market.

Once a plan of action has been refined to take into account human behaviour such as biases, the next step, ideally, is to test the proposed approach through a systematic collection of feedback, statistical evaluation and scientific methodologies, such as randomized controlled trials.

These experimental studies assess the effectiveness of policy interventions by randomly allocating participants to a control group or to a group that receives the intervention, and then comparing the outcomes.
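
A minimal Python simulation (all data and the assumed effect size are synthetic) illustrates the design: units are randomly assigned to a control or an intervention group, and outcome rates are then compared:

```python
# A randomized controlled trial in miniature: randomly assign units to a
# control or intervention group, then compare outcome rates. All data and
# the assumed effect (base rate 0.30 lifted to 0.38) are synthetic.
import random

random.seed(1)                               # reproducible illustration

people = list(range(1000))
random.shuffle(people)                       # random assignment
treatment, control = people[:500], people[500:]

def outcome(treated: bool) -> int:
    return 1 if random.random() < (0.38 if treated else 0.30) else 0

def success_rate(group, treated: bool) -> float:
    return sum(outcome(treated) for _ in group) / len(group)

print(f"intervention: {success_rate(treatment, True):.1%}")
print(f"control:      {success_rate(control, False):.1%}")
```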

Randomized controlled trials are often considered the gold standard for clinical trials and provide solid evidence that helps cut through the behavioural biases.

Very little attention has been paid to date to the ethical implications of applying behavioural sciences to policy interventions (to be explored in a related BI in Brief paper).

5-Step Method to Improve Decisions

Professors Beshears and Gino have developed a handy tool outlining how executives can mitigate the effects of bias on decision making and motivate employees and customers to make choices that are in both the organization’s and their own best interest.

1) Understand How Decisions Are Made
Human beings have two modes of processing information and making decisions:

  • System 1 is automatic, instinctive and emotional.
  • System 2 is slow, logical and deliberate.

2) Define The Problem
Behavioural economics tools are most effective when:

  • Human behaviour is at the core of the problem.
  • People are not acting in their own best interests.
  • The problem can be narrowly defined.

3) Diagnose The Problem
To determine whether poor decision making is a result of insufficient motivation or of cognitive biases, ask two questions:

  • Is the problem caused by people’s failure to take any action at all?
  • Do people take action, but in a way that introduces systematic errors into the decision-making process?

4) Design The Solution
Use one of three levers:

  • Trigger System 1 thinking by introducing changes that arouse emotions, harness bias or simplify processes.
  • Engage System 2 thinking by using joint evaluations, creating opportunities for reflection, increasing accountability and introducing reminders and planning prompts.
  • Bypass both systems by setting defaults and building in automatic adjustments.

5) Test The Solution
Rigorously test the proposed solution to avoid costly mistakes:

  • Identify a target outcome that is specific and measurable.
  • Identify a range of possible solutions and then focus on one.
  • Introduce the change in some areas of the organization (the “treatment group”) and not others (the “control group”).

Source: John Beshears and Francesca Gino, “Leaders as Decision Architects,” Harvard Business Review, May 2015.

Be Bias Aware and Prepared!

While it is nearly impossible to be bias free, a collective effort can address the biases fostered in our organizational culture, and bias mitigation efforts can be directed toward particularly important decisions, or decisions that may be especially prone to bias.

High-performing organizations strive to bolster behavioural insight capacity, with a view to ensuring their decisions will increasingly be made on the basis of reliable evidence and better models of human behaviour.

Not only do public servants need to be “bias aware,” they need to be prepared by enshrining behavioural science tactics in their organizational processes and embedding them in everyday practice so they become an integral aspect of policy analysis and decisions.

Sources

Behavioural Insight Brief is a series of summaries intended to expand knowledge and stimulate discussion about the rapidly evolving field of behavioural insights. For more information, please send an email to: info@horizons.gc.ca

Algate, F., Gallagher, R., Hallsworth, M., Halpern, D., Nguyen, S., Ruda, S., Sanders, M., Service, O., with Gyani, A., Harper, H., Kirkman, E., Pelenur, M., and Reinhard, J. 2014. EAST: Four simple ways to apply behavioural insights. U.K. Cabinet Office and Nesta.

Almeida, S., Ciriolo, E., Lourenço, J., and Troussard, X. 2016. Behavioural insights applied to policy: European Report 2016. JRC Science Hub, European Union.

Beshears, J. and Gino, F. 2015. “Leaders as Decision Architects,” Harvard Business Review, May.

Blaser, T., Gandhi, L., Kahneman, D., and Rosenfield, A. 2016. “Noise: How to Overcome the High, Hidden Cost of Inconsistent Decision Making,” Harvard Business Review, October.

Dietvorst, B., Simmons, J., and Massey, C. 2015. “Overcoming Algorithm Aversion: People Will Use Algorithms If They Can (Even Slightly) Modify Them,” Journal of Experimental Psychology: General, Vol. 144(1), February, pp. 114-126.

Galley, A., Gold, J., and Johal, S. 2013. Public Service Transformed: Harnessing the Power of Behavioural Insights. Mowat Centre, University of Toronto.

Gayer, T. and Viscusi, W.K. 2015. “Behavioral Public Choice: The Behavioral Paradox of Government Policy,” Harvard Journal of Law and Public Policy, Vol. 38, March.

Goldacre, B., Haynes, L., Service, O., and Torgerson, D. 2013. Test, Learn, Adapt: Developing Public Policy with Randomised Control Trials. U.K. Cabinet Office.

Hastie, R. and Sunstein, C., 2015. Wiser: Getting Beyond Groupthink to Make Groups Smarter, Harvard Business Review Press.

Sunstein, C. and Thaler, R. 2008. Nudge. Yale University Press.
