09 October 2015

Trust in Forecasts

Welcome back. Play along with me. Pick a sport. Tomorrow is opening day. Joe and Frank, two sports gurus, are on TV forecasting the winners. You’re not up on the teams or on Joe and Frank.

Joe says there’s a 70% chance that Team A will beat Team B. Frank says there’s a 30% chance that Team A will win.

Questions: Which forecast is more accurate? Which guru would you trust more? Hint: All else being equal, most people would put more stock in the higher prediction and lean toward Team A winning. At least that’s what a recent study by researchers from Virginia Tech and the University of South Carolina demonstrated.


You can always trust Magic 8 Ball forecasts. (multiple websites)
Prediction Levels Affect Inferences

The researchers conducted eight experiments, each testing how large groups of participants reacted to forecast probabilities of 70% versus 30%.

The experiments included a basketball team winning or losing; the success of stock market investments (an initial public offering, a stock’s price); predictions of a book being published, presented numerically, pictorially (pie chart), and verbally (“very likely,” “less likely,” or “unlikely” to be published); other formats (frequencies and point spreads); and a case in which a benchmark exists.

In nearly every case, the researchers found that when the forecast was higher (70% rather than 30%), test participants inferred that the forecaster had conducted more in-depth analysis, was more confident about the prediction, and was more trustworthy. They also judged the prediction to be more accurate.

In essence, participants evaluated each forecast based only on the focal event occurring (Team A winning), not on its complement (Team A losing). A lower forecast was thus misread as the 30% forecaster being less confident that Team A would win, rather than more confident that Team A would lose.
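To make the complement point concrete, here is a minimal sketch in Python (mine, not the researchers’). It only restates the arithmetic: a 30% forecast that Team A wins is exactly a 70% forecast that Team A loses, so before the game is played neither guru’s forecast is inherently stronger or more accurate than the other’s.

```python
# Illustrative only -- not from the paper. Probabilities of complementary
# events must sum to 1, so a 30% forecast of "Team A wins" is the same
# statement as a 70% forecast of "Team A loses."

p_joe_team_a_wins = 0.70    # Joe: Team A wins with 70% probability
p_frank_team_a_wins = 0.30  # Frank: Team A wins with 30% probability

# Complement rule: P(A loses) = 1 - P(A wins)
p_joe_team_a_loses = 1 - p_joe_team_a_wins      # 0.30
p_frank_team_a_loses = 1 - p_frank_team_a_wins  # 0.70

# Both gurus are equally far from the 50/50 "no idea" point; neither
# forecast is inherently more confident before the outcome is known.
print(abs(p_joe_team_a_wins - 0.5))    # 0.2
print(abs(p_frank_team_a_wins - 0.5))  # 0.2
```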

For the experiment that tested reactions when a benchmark exists, participants were given a strong expectation that the event had about a 25% chance of occurring. They judged a slightly higher prediction (30%) to be more accurate than a slightly lower prediction (20%); however, a much higher prediction (70%) reduced their evaluation of accuracy.


Well, you can usually trust Magic 8 Ball forecasts. (multiple websites)
Wrap Up

The experiments ruled out certain alternative explanations and showed that the effects are robust and hold regardless of prediction format. Nevertheless, I would be curious to learn more about the test participants than their age and gender.

Perhaps I’m just reflecting my own reaction to the test forecasts, but I would expect anyone who has taken courses in, or dealt with, statistics or probability to treat such forecasts more objectively. I believe this points to the areas the researchers suggested for further study.

So. Have you changed your thoughts about Joe and Frank and whether Team A or B will win tomorrow? It should be a good game. Thanks for stopping by today.

P.S.

Paper on forecasting study in the Journal of Marketing Research and article on the ScienceDaily website:
journals.ama.org/doi/10.1509/jmr.12.0526
www.sciencedaily.com/releases/2015/08/150812131916.htm
