A simple yes or no: challenging forced binary choice in community consultation
Forced binary choices – or dichotomous questions – must be used with care, if at all, when seeking community input.
Yes-or-no questions can make fun icebreakers or good campfire games. In a community engagement context, however, they must be approached with caution. An alternative name for this kind of question – the forced binary choice – offers one clue as to why caution is recommended.
No-one likes being forced
Sometimes you may just feel the need to know how many in a community agree with a proposal or not, like an idea or not, or prefer option A to option B. However, it’s rarely a good idea to ask a simple yes/no question because it’s rarely that simple.
Without room for a nuanced response, or even a “yes, but”, forcing respondents to choose one of two options can mean:
- A respondent thinks both options sound good, but because they can only choose one, it can be construed that they’ve rejected the other.
- A respondent thinks both options sound bad but must choose one, so the results look like an endorsement of that option.
- A respondent doesn’t understand or care about the options but chooses one anyway to move on to the next question, skewing your data.
Speaking of data, what are you planning to do with the results? If a decision has already been made internally, regardless of consultation outcomes, don’t ask the question. And however confident you may be that the numbers will support your intended action, don’t ask if you’re not prepared to abide by the outcome. It’s disingenuous, and it undermines trust in your engagement, your organisation and engagement in general.
You’ll be familiar with the concept of “closed questions”. The “closed” in this context means that you have shut down the opportunity for sharing or discovering insights: there’s no way to understand why a respondent has chosen the option they have.
Acquiescence bias, dissent bias and the limitations of binary choice
Specifically seeking a Yes/No or Agree/Disagree response (as opposed to an A/B or other set of two options) can also introduce acquiescence bias: depending on the context, respondents may opt for Yes or Agree because of a subconscious tendency to be positive or agreeable. While acquiescence bias is a common issue in market research design, dissent bias may be more of a concern in a consultation context: when a respondent feels very negatively about a project or proponent, the answer is likely to be No to everything. But whichever way the bias runs, it doesn’t serve project or engagement objectives if permitted to creep in.
Creating a forced binary choice can also incite or escalate division within a community. In land use planning, for example, considerable effort may go into drafting two options for community consultation. If forced to choose between these two, some impacted community members may start to campaign for their preferred option: we have seen this go as far as door-knocking in their neighbourhood to seek support, persuading other community members to their perspective and even threatening or intimidating supporters of the “other side”.
Limiting choice to one of two options also excludes creative responses to the proposed options: respondents can’t identify which parts of a proposal they really like, which they can live with if they have to and which they would forcefully reject, nor can they offer ideas you may have missed entirely.
Try these alternatives
Rather than two options in this context, consider providing one proposal – clearly marked “draft for consultation” – that highlights all of the aspects that are open to negotiation, and the constraints that apply.
You can also design a survey to:
- Seek the same information through an open question: Instead of, for example, “Do you like this? Yes/No”, ask “What do you like about this?” and “What don’t you like about this?”
- Use a radio-button question to create a rating scale for the example above: “What do you think of …?” with answer options from 1 to 10. (Note: do not ask “How much do you like …?”, as it introduces a further opportunity for bias by planting an assumption that the option is liked to some degree.)
- Use an emoji Likert scale for a quick, balanced check of sentiment. (A sketch of these question types follows this list.)
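By way of illustration, here is how those three question types might be defined in a simple scripted survey. This is a sketch only – the Python structure and field names are hypothetical, not any particular platform’s schema.

```python
# Hypothetical question definitions for the three alternatives above.
# The structure and field names are illustrative, not a real platform's schema.

open_questions = [
    {"type": "open_text", "prompt": "What do you like about this proposal?"},
    {"type": "open_text", "prompt": "What don't you like about this proposal?"},
]

# Neutral wording ("What do you think of ...?") avoids planting the
# assumption that the proposal is liked to some degree.
rating_question = {
    "type": "radio",
    "prompt": "What do you think of the draft proposal?",
    "options": list(range(1, 11)),  # 1 (very negative) to 10 (very positive)
}

# A balanced five-point emoji Likert scale for a quick sentiment check.
emoji_likert_question = {
    "type": "radio",
    "prompt": "How do you feel about the draft proposal?",
    "options": ["😠", "🙁", "😐", "🙂", "😀"],
}
```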
But remember, absorbing information, then weighing up options – via questions that go beyond a simple yes/no – can generate a cognitive load for respondents. Keeping surveys as short as possible is one key way to reduce survey fatigue.
So, what if you genuinely want and will listen to a simple Yes/No response?
- Provide sufficient information in the context of the question so you can garner an informed response.
- Allow respondents to opt out of a question; if they do, consider including a conditional question asking why they opted out (as sketched after this list).
- Give respondents a “don’t know/not applicable” option – sometimes it’s okay to sit on the fence, and it reminds decision-makers that community opinion isn’t all black and white.
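Here is a minimal sketch of that opt-out flow, assuming a simple scripted survey in plain Python rather than any specific platform’s skip logic (the question IDs are made up):

```python
# Minimal sketch of a yes/no question with a fence-sitting option and an
# explicit opt-out, routed with plain Python in place of platform skip logic.

ANSWER_OPTIONS = [
    "Yes",
    "No",
    "Don't know / not applicable",  # the fence-sitting option
    "Prefer not to answer",         # the explicit opt-out
]

def next_question(answer: str) -> str:
    """Return the ID of the next question, given the respondent's answer."""
    if answer == "Prefer not to answer":
        # Conditional follow-up shown only to respondents who opt out.
        return "q_why_opted_out"
    return "q_next_topic"
```

Calling next_question("Prefer not to answer") routes the respondent to the follow-up; everyone else skips straight past it.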
Sometimes it is that simple
Dichotomous questions are great when you want unambiguous data – because they aren’t open to interpretation (or the analyst’s own biases) the way that text responses can be. This kind of question is a good option when it comes to simple facts: responses are quick and easy to give, usually carrying a low cognitive load. An exception may be if respondents really have to think about their answer while completing the survey (for example, it may be easy for them to know whether or not they went to an open day, but not so easy to know whether it was worth their time attending).
Binary questions are also useful for filtering or segmenting respondents. For example, responses to “Have you viewed the display?” can determine, via skip logic in a survey, whether the respondent is then asked about the display or not. They can also be used in cross-tabulation analysis to help you identify any correlation between, in this example, viewing the display and responses to other questions. This could provide insights into the effectiveness of the display in informing responses or affecting sentiment.
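As a rough illustration, a cross-tabulation like the one described can be produced in a few lines of Python with pandas – the column names and data below are invented for the example:

```python
import pandas as pd

# Illustrative data only: one row per survey response.
responses = pd.DataFrame({
    "viewed_display":    ["Yes", "Yes", "No", "Yes", "No", "No"],
    "supports_proposal": ["Yes", "Yes", "No", "No",  "No", "Yes"],
})

# Row-percentage cross-tabulation: does viewing the display appear to be
# associated with support for the proposal?
table = pd.crosstab(
    responses["viewed_display"],
    responses["supports_proposal"],
    normalize="index",
) * 100
print(table.round(1))
```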
You can dilute the impact of acquiescence/dissent bias in this context by using more direct, construct-specific wording. Using the same example as above, instead of “Yes/No” options, offer “I have viewed it/I haven’t viewed it”. This structure puts the question further into context, so that subconscious bias is less influential. Using your survey platform’s randomise function to change the order in which options are presented will also help reduce the influence of primacy and recency biases.
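Most survey platforms offer randomisation as a built-in setting; purely as a sketch, the equivalent in plain Python looks like this:

```python
import random

# Construct-specific answer options, presented in a random order for each
# respondent to soften primacy and recency effects.
options = ["I have viewed the display", "I haven't viewed the display"]
presented = random.sample(options, k=len(options))  # shuffled copy per respondent
```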
To elicit further insights, again using skip logic, you might consider including a “why” or “why not” question following your binary choice. Use this option only if the insights will actually inform decision-making or project-management choices; again, keeping the survey as short as possible may be more important than any extra information you gather.
Reporting your data
Finally, keep in mind that even with an ideally designed question, the results show only the ideas, preferences and concerns of those who have responded. You can only say that x% of respondents agree or disagree, for example, not x% of the community.
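To make that distinction concrete, here is a worked example with hypothetical figures:

```python
# Hypothetical figures: 300 of 400 respondents agree, in a community of 10,000.
respondents, agree, community = 400, 300, 10_000

print(f"{agree / respondents:.0%} of respondents agree")              # 75%
print(f"{agree / community:.1%} of the community is known to agree")  # 3.0%
```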