Ever seen a boss, a colleague or (heaven forbid) yourself behave in a way that, however seemingly rational, is incredibly obstructive? Before they’ll approve or support a project, there is always one more fact they need.
Chances are they are in the grip of a cognitive bias called the “information bias”. This is what drives us to keep gathering information about a situation long past the point where it has any bearing on the decision being made. Worse still, if you play along in the hope of eventually getting what you need, you can end up stuck chasing a question that can’t be answered, and the project will never be approved.
This was best illustrated by Baron, Beattie and Hershey (Organizational Behavior and Human Decision Processes, 42, 88–110, 1988) in their fictitious-disease diagnostic problem. In essence, a patient presents with symptoms suggestive of one of three conditions, one of which, globoma, has a probability of 80%. There is a test, the ET scan, which would definitively rule the other two diagnoses in or out but would come back positive or negative with 50/50 odds for globoma.
Even though the probability of the patient having globoma (80%) is the same before and after the ET scan, and the patient should be treated for globoma irrespective of the result, a small but significant number of subjects insisted on the test before treatment.
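The arithmetic behind this can be checked with Bayes’ rule. A minimal sketch follows, with some assumptions beyond what the paper’s summary above states: the two alternative diagnoses (labelled generically here) split the remaining 20% equally, and the ET scan always flags one of them, never the other, while flipping a coin for globoma.

```python
# Bayesian sketch of the globoma diagnostic problem.
# Assumed numbers: the two alternative diagnoses share the remaining
# 20% prior equally; the scan is certain for them and 50/50 for globoma.

PRIOR = {"globoma": 0.8, "other_A": 0.1, "other_B": 0.1}

# Probability of a "positive" ET result under each diagnosis.
LIKELIHOOD_POS = {"globoma": 0.5, "other_A": 1.0, "other_B": 0.0}

def posterior(result: str) -> dict:
    """Posterior probability of each diagnosis given the ET scan result."""
    likelihood = {
        d: (LIKELIHOOD_POS[d] if result == "positive" else 1.0 - LIKELIHOOD_POS[d])
        for d in PRIOR
    }
    unnormalised = {d: PRIOR[d] * likelihood[d] for d in PRIOR}
    total = sum(unnormalised.values())
    return {d: p / total for d, p in unnormalised.items()}

# Whichever way the scan comes back, P(globoma) is still 0.8,
# so the treatment decision cannot change.
print(posterior("positive")["globoma"])  # 0.8
print(posterior("negative")["globoma"])  # 0.8
```

A positive result rules in one alternative and a negative result the other, but in both branches the globoma posterior works out to exactly the 80% prior, which is the subjects’ mistake in a nutshell.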
For these subjects, it took considerable questioning about the usefulness of the test before they realised their mistake.
When faced with this situation, it’s probably best to approach the person after the meeting (it’s always in a meeting, right?) and ask specifically what the extra information would change about the decision, and how. Otherwise, once you’ve spent precious time and resources answering this question, the chances are there will be another.