Using experts 'inexpertly' can lead to policy failure
- Date:
- October 15, 2015
- Source:
- University of Melbourne
- Summary:
- Expert advice can often be compromised by human frailties -- such as an expert's current mood or personal values -- and should be treated accordingly, experts say.
In an article published in the journal Nature, Professor Mark Burgman of the University of Melbourne and Professor William Sutherland of the University of Cambridge argue that expert opinions are often considered infallible.
But expert advice or estimates are often compromised by "cognitive frailties," which include the expert's mood, values, whether they stand to gain or lose from a decision and the context in which their opinions are sought.
"Experts are typically unaware of these subjective influences," the article says.
"They are often highly credible, yet they vastly overestimate their own objectivity and the reliability of their peers."
The authors argue it is vital that the conventional approach of informing policy through expert advice -- from either individuals or panels -- be balanced with methods that reduce psychological and motivational bias.
Professor Burgman, the director of the Centre of Excellence for Biosecurity Risk Analysis (CEBRA) in the School of BioSciences at the University of Melbourne, says history shows experts often get it wrong.
Australians were once told that cane toads were not a threat to the local environment, while much of the world came to the conclusion in 2003 that Iraq had weapons of mass destruction.
"Experts must be tested, their biases minimised, their accuracy improved and their estimates validated with independent evidence," the authors write.
"Experts should be held accountable for their opinions."
Professors Burgman and Sutherland have created a framework of eight key ways to improve the advice of experts. These include using groups -- not individuals -- with diverse, carefully selected members working well within their areas of expertise.
They also caution against being bullied or starstruck by over-assertive or high-profile experts.
"Some experts are much better than others at estimation and prediction.
"However, the only way to tell a good expert from a poor one is to test them.
"Qualifications and experience don't help to tell them apart."
The researchers suggest experts should not advise decision makers directly about matters that involve values or preferences, because experts are not impartial.
To get better answers from experts, decision makers should ensure experts use structured questions and carefully designed and managed group interactions.
"The cost of ignoring these techniques -- of using experts inexpertly -- is less accurate information and so more frequent, and more serious, policy failures," write the researchers.
Eight ways to improve expert advice
- Use groups: group estimates consistently outperform those of individuals
- Choose members carefully: expertise declines dramatically outside an individual's specialisation
- Don't be starstruck: a person's age, number of publications or reputation is not a measure of their ability to estimate or predict events
- Avoid homogeneity: diverse groups tend to generate more accurate judgements
- Don't be bullied: less assertive, less self-assured people tend to make better judgements
- Weight opinions: calibrate an expert's performance with test questions (see the sketch after this list)
- Train experts: training can improve an expert's ability to estimate and predict
- Give feedback: rapid feedback tends to improve expert judgements
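To make the "weight opinions" step concrete, the sketch below pools three hypothetical experts' estimates, weighting each expert by their accuracy on calibration questions with known answers. The inverse-error weighting rule, function names and numbers are illustrative assumptions, not the scheme set out in the Nature article.

```python
# Minimal sketch of performance-weighted aggregation of expert estimates.
# Assumption: each expert first answers calibration questions with known
# answers; their weight is the inverse of their mean absolute error.
# This simple rule is illustrative only, not the authors' published protocol.

def calibration_weights(test_answers, true_values):
    """Return one normalised weight per expert from their test-question accuracy.

    test_answers: list of per-expert lists of estimates for the test questions
    true_values:  the known answers to those test questions
    """
    weights = []
    for answers in test_answers:
        mean_abs_error = sum(abs(a - t) for a, t in zip(answers, true_values)) / len(true_values)
        weights.append(1.0 / (mean_abs_error + 1e-9))  # better calibration -> larger weight
    total = sum(weights)
    return [w / total for w in weights]  # weights sum to 1


def weighted_estimate(expert_estimates, weights):
    """Combine the experts' estimates for the question of interest."""
    return sum(w * e for w, e in zip(weights, expert_estimates))


if __name__ == "__main__":
    # Three hypothetical experts answer two calibration questions (true answers: 10 and 50).
    test_answers = [[9, 52], [15, 40], [10, 49]]
    true_values = [10, 50]
    weights = calibration_weights(test_answers, true_values)

    # Their estimates for the actual question are then pooled by those weights,
    # so the better-calibrated experts count for more.
    estimates = [120, 200, 135]
    print(round(weighted_estimate(estimates, weights), 1))
```

In this toy example the least accurate expert contributes little to the pooled answer, which is the point of calibrating experts against test questions before relying on their judgements.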
Story Source:
Materials provided by University of Melbourne. Note: Content may be edited for style and length.
Journal Reference:
- William J. Sutherland, Mark Burgman. Policy advice: Use experts wisely. Nature, 2015; 526 (7573): 317 DOI: 10.1038/526317a