Abstract
When information theory is applied to the development of forecasting models, two alternative approaches are available, based either on Shannon entropy or on Kullback information gain. In this paper, a new approach is presented which combines the usually superior statistical-inference power of the Kullback procedure with the advantage of the calibrated ‘elasticity’ parameters available in the Shannon approach. Situations are discussed in which the combined approach is preferable to either of the two existing procedures, and the principles are illustrated with the help of a small numerical example.
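For reference, a minimal sketch of the two criteria referred to above, with the notation (probabilities $p_i$, prior probabilities $q_i$) assumed here rather than taken from the paper: the Shannon approach maximizes the entropy

$$H(p) = -\sum_i p_i \ln p_i ,$$

whereas the Kullback approach minimizes the information gain (relative entropy) with respect to a prior distribution $q$,

$$I(p:q) = \sum_i p_i \ln \frac{p_i}{q_i} ,$$

in both cases subject to the constraints imposed by the particular forecasting model.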
