
Uncertainty, information, and disagreement of economic forecasters

Pages 796-817 | Published online: 16 May 2017
 

ABSTRACT

An information framework is proposed for studying uncertainty and disagreement of economic forecasters. This framework builds upon the mixture model of combining density forecasts through a systematic application of information theory. The framework encompasses the measures used in the literature and leads to their generalizations. The focal measure is the Jensen–Shannon divergence of the mixture, which admits Kullback–Leibler and mutual information representations. Illustrations include exploring the dynamics of the individual and aggregate uncertainty about the US inflation rate using the Survey of Professional Forecasters (SPF). We show that the normalized entropy index corrects some of the distortions caused by changes in the design of the SPF over time. Bayesian hierarchical models are used to examine the association of the inflation uncertainty with the anticipated inflation and the dispersion of point forecasts. Implementations of the information framework based on the variance and on the Dirichlet model for capturing uncertainty about the probability distribution of the economic variable are briefly discussed.
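The Jensen–Shannon divergence of the mixture and its Kullback–Leibler representation can be sketched numerically. The following minimal illustration assumes discrete histogram forecasts (all probabilities hypothetical) and checks that the entropy-difference form, H(mixture) minus the weighted average of individual entropies, equals the weighted average of the KL divergences of each forecast from the mixture:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in nats, ignoring zero cells."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def kl(p, q):
    """Kullback-Leibler divergence K(p:q) in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / q[nz]))

def js_divergence(forecasts, weights=None):
    """Jensen-Shannon divergence of the mixture of histogram forecasts:
    H(mixture) minus the weighted average of individual entropies."""
    f = np.asarray(forecasts, float)
    w = (np.full(len(f), 1.0 / len(f)) if weights is None
         else np.asarray(weights, float))
    mixture = w @ f
    return entropy(mixture) - np.sum(w * np.array([entropy(fi) for fi in f]))

# Three hypothetical forecasters' probabilities over four inflation bins
f = [[0.1, 0.6, 0.2, 0.1],
     [0.2, 0.5, 0.2, 0.1],
     [0.0, 0.3, 0.5, 0.2]]
w = np.full(3, 1 / 3)
m = w @ np.asarray(f)

# KL representation: JS equals the weighted average of K(f_i : mixture)
js_kl = np.sum(w * np.array([kl(fi, m) for fi in f]))
assert np.isclose(js_divergence(f), js_kl)
```

The equality of the two representations is what makes JS usable both as an aggregate-minus-average uncertainty gap and as a disagreement measure.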

Appendix

Disagreement based on Rényi information divergence.

The expression for the Rényi information divergence between two normal distributions $f_i = N(\mu_i, \sigma_i^2)$, $i = 1, 2$, can be written in the following form:

$$K_r(f_1:f_2) = \frac{r(\mu_1-\mu_2)^2}{2\sigma_r^2} + \frac{1}{2(r-1)}\log\frac{\sigma_2^2}{\sigma_r^2} - \frac{1}{2}\log\frac{\sigma_1^2}{\sigma_2^2},$$

which is defined when $\sigma_r^2 = (1-r)\sigma_1^2 + r\sigma_2^2 > 0$. In Example 2, $f_1 = f_{y|x}$ and $f_2 = f_y$. Noting that $\sigma_{y|x}^2 = (1-\rho^2)\sigma_y^2$, we have $\sigma_r^2 = [1+(r-1)\rho^2]\sigma_y^2 > 0$, which holds for all $r > 0$. Then

$$K_r(f_{y|X}:f_y) = \frac{r(\alpha+\beta X-\mu_y)^2}{2[1+(r-1)\rho^2]\sigma_y^2} + \frac{1}{2(r-1)}\log\frac{1}{1+(r-1)\rho^2} - \frac{1}{2}\log(1-\rho^2).$$

The last term is the mutual information (17). Letting $\alpha = \mu_y - \beta\mu_x$ and $\beta = \rho\sigma_y/\sigma_x$, and taking the expectation $E[K_r(f_{y|X}:f_y)]$, we obtain (18).
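The closed form above can be checked against a direct evaluation of the Rényi integral $\frac{1}{r-1}\log\int f_1^r f_2^{1-r}\,dy$. A minimal numerical sketch, with all parameter values hypothetical:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def renyi_normal(r, mu1, s1, mu2, s2):
    """Closed-form Renyi divergence K_r(N(mu1,s1^2):N(mu2,s2^2)),
    valid when s_r^2 = (1-r)s1^2 + r s2^2 > 0."""
    sr2 = (1 - r) * s1**2 + r * s2**2
    assert sr2 > 0
    return (r * (mu1 - mu2)**2 / (2 * sr2)
            + np.log(s2**2 / sr2) / (2 * (r - 1))
            - 0.5 * np.log(s1**2 / s2**2))

def renyi_numeric(r, mu1, s1, mu2, s2):
    """Direct evaluation of (1/(r-1)) log of the integral of f1^r f2^(1-r)."""
    integrand = lambda y: norm.pdf(y, mu1, s1)**r * norm.pdf(y, mu2, s2)**(1 - r)
    val, _ = quad(integrand, -np.inf, np.inf)
    return np.log(val) / (r - 1)

# Hypothetical parameters: r = 0.7, f1 = N(1, 1.2^2), f2 = N(0, 1.5^2)
args = (0.7, 1.0, 1.2, 0.0, 1.5)
assert np.isclose(renyi_normal(*args), renyi_numeric(*args), atol=1e-6)
```

As $r \to 1$ the closed form recovers the Kullback–Leibler divergence between the two normals, consistent with the role of $K_r$ as a one-parameter generalization.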

Proof of Proposition 1. The ME model subject to the moment constraints (27) is given by:

$$f_c(y) = C\exp\Big(-\sum_{j=1}^{J}\lambda_{c,j} y^j\Big), \qquad (31)$$

where $\lambda_{c,1},\ldots,\lambda_{c,J}$ are the Lagrange multipliers for constraints (26) and $C = C(\lambda_{c,1},\ldots,\lambda_{c,J})$ is the normalizing factor. (The ME model can be a probability vector or a continuous density.) Breaking up the log-ratio in $K(f_{y|x}:f_c)$, we have:

$$\begin{aligned}
E_g[K(f_{y|X}:f_c)] &= E_g\big[-H(f_{y|X}) - E_{y|x}(\log f_c(Y))\big]\\
&= -E_g[H(f_{y|X})] - \log C + E_g\Big[\sum_{j=1}^{J}\lambda_{c,j} E_{y|x}(Y^j)\Big]\\
&= -E_g[H(f_{y|X})] - \log C + \sum_{j=1}^{J}\lambda_{c,j} E_g[E_{y|x}(Y^j)]\\
&= -E_g[H(f_{y|X})] - \log C + \sum_{j=1}^{J}\lambda_{c,j}\mu_{c,j}.
\end{aligned}$$

The second equality is obtained using (31) and the last equality is from (27). The last result is found by noting that

$$H(f_c) = -\log C + \sum_{j=1}^{J}\lambda_{c,j}\mu_{c,j}.$$
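The entropy identity for the ME model can be verified numerically in the simplest nontrivial case. With $J = 2$ moment constraints on $E(Y)$ and $E(Y^2)$, the maximum-entropy density is a normal, which can be written in the exponential form of (31); the sketch below (constraint values hypothetical) checks the identity against the known normal entropy:

```python
import numpy as np

# Normal N(mu, s^2) as the maximum-entropy density for J = 2 moment
# constraints E(Y) = mu and E(Y^2) = mu^2 + s^2, written in the
# exponential form of (31): f_c(y) = C exp(-lam1*y - lam2*y^2)
mu, s = 1.0, 2.0                       # hypothetical constraint values
lam1 = -mu / s**2
lam2 = 1.0 / (2 * s**2)
C = np.exp(-mu**2 / (2 * s**2)) / np.sqrt(2 * np.pi * s**2)

# Entropy identity H(f_c) = -log C + sum_j lam_j * mu_j,
# with mu_1 = E(Y) = mu and mu_2 = E(Y^2) = mu^2 + s^2
H_identity = -np.log(C) + lam1 * mu + lam2 * (mu**2 + s**2)
H_normal = 0.5 * np.log(2 * np.pi * np.e * s**2)   # known normal entropy
assert np.isclose(H_identity, H_normal)
```

The same bookkeeping of $\log C$ and the Lagrange multipliers is what closes the chain of equalities in the proof above.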

Acknowledgments

We would like to thank, but not implicate, two anonymous reviewers, Kajal Lahiri, Paul Nystrom, Jeffrey Racine, Minchul Shin, and Kenneth Wallis for their valuable feedback and comments that led to improving the exposition of this paper. Shoja’s research was partially supported by an Info-Metrics Institute Summer Fellowship. Soofi’s research was supported by a Roger L. Fitzsimonds Distinguished Scholar Award.
