OJS Vol. 3 No. 6A, December 2013
Subjectivity in Application of the Principle of Maximum Entropy
ABSTRACT

Complete prior statistical information is currently required in the majority of statistical evaluations of complex models. The principle of maximum entropy is often used in this context to fill in the missing pieces of available information, and is commonly claimed to be fair and objective. A rarely discussed aspect is that it relies upon testable information, which is never known but estimated, i.e. is the result of processing raw data. The subjective choice of this processing strongly affects the result. The less conventional posterior completion of information is equally accurate but computationally superior to prior completion, as much less information enters the analysis. Our recently proposed methods of lean deterministic sampling are among the very few approaches that actively promote the use of minimal, incomplete prior information. The inherent subjective character of maximum entropy distributions and the often critical implications of prior versus posterior completion of information are discussed and illustrated here, from the novel perspective of consistency, rationality, computational efficiency and realism.
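The dependence on testable information that the abstract highlights can be made concrete with Jaynes' classic dice example [3]: given only an estimated mean (a single processed statistic), the maximum-entropy distribution over the six faces is exponential in the face value, with a multiplier fixed by that mean. A minimal sketch, assuming a target mean of 4.5 and simple bisection for the Lagrange multiplier (function name and tolerances are illustrative, not from the paper):

```python
import math

def maxent_dice(target_mean, faces=tuple(range(1, 7))):
    """Maximum-entropy distribution on die faces given only a mean.

    The maxent solution with a mean constraint has the form
    p_i proportional to exp(-lam * i); bisect on lam so that the
    implied mean matches the (subjectively estimated) target mean.
    """
    def implied_mean(lam):
        w = [math.exp(-lam * f) for f in faces]
        z = sum(w)
        return sum(f * wi for f, wi in zip(faces, w)) / z

    lo, hi = -50.0, 50.0  # implied_mean is strictly decreasing in lam
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if implied_mean(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * f) for f in faces]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_dice(4.5)  # biased toward high faces, since 4.5 > 3.5
```

The point made in the abstract shows up directly here: the output distribution is determined entirely by which statistic is fed in (a mean of 4.5 rather than, say, a mean and a variance), so a different subjective choice of data processing yields a different "objective" maximum-entropy result.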


Cite this paper
J. Hessling, "Subjectivity in Application of the Principle of Maximum Entropy," Open Journal of Statistics, Vol. 3 No. 6, 2013, pp. 1-8. doi: 10.4236/ojs.2013.36A001.
References
[1]   D. S. Sivia and J. Skilling, “Data Analysis—A Bayesian Tutorial,” Oxford University Press, Oxford, 2006.

[2]   S. M. Kay, “Fundamentals of Statistical Signal Processing, Estimation Theory,” Volume 1, Prentice Hall, Upper Saddle River, 1993.

[3]   E. T. Jaynes, “Information Theory and Statistical Mechanics,” The Physical Review, Vol. 106, No. 4, 1957, pp. 620-630. http://dx.doi.org/10.1103/PhysRev.106.620

[4]   C. E. Shannon, “A Mathematical Theory of Communication,” The Bell System Technical Journal, Vol. 27, No. 3, 1948, pp. 379-423.

[5]   J. P. Hessling and T. Svensson, “Propagation of Uncertainty by Sampling on Confidence Boundaries,” International Journal for Uncertainty Quantification, Vol. 3, No. 5, 2013, pp. 421-444.

[6]   J. P. Hessling, “Deterministic Sampling for Propagating Model Covariance,” SIAM/ASA Journal on Uncertainty Quantification, Vol. 1, No. 1, 2013, pp. 297-318.

[7]   G. M. Ewing, “Calculus of Variations with Applications,” Dover, New York, 1985.

[8]   J. P. Hessling, “Identification of Complex Models,” in review, 2013.

[9]   L. Rade and B. Westergren, “Mathematics Handbook,” 2nd Edition, Studentlitteratur, Lund, 1990.

[10]   ISO GUM, “Guide to the Expression of Uncertainty in Measurement,” Technical Report, International Organization for Standardization, Geneva, 1995.

[11]   S. Julier and J. Uhlmann, “Unscented Filtering and Nonlinear Estimation,” Proceedings of the IEEE, Vol. 92, No. 3, 2004, pp. 401-422.

[12]   S. Julier, J. Uhlmann and H. Durrant-Whyte, “A New Approach for Filtering Non-Linear Systems,” American Control Conference, Seattle, 21-23 June 1995, pp. 1628-1632.

[13]   R. Y. Rubinstein and D. P. Kroese, “Simulation and the Monte Carlo Method,” 2nd Edition, John Wiley & Sons Inc., New York, 2007.

[14]   J. C. Helton and F. J. Davis, “Latin Hypercube Sampling and the Propagation of Uncertainty in Analyses of Complex Systems,” Reliability Engineering and System Safety, Vol. 81, No. 1, 2003, pp. 23-69. http://dx.doi.org/10.1016/S0951-8320(03)00058-9

[15]   T. Lovett, “Polynomial Chaos Simulation of Analog and Mixed-Signal Systems: Theory, Modeling Method, Application,” Lambert Academic Publishing, Saarbrücken, 2006.

[16]   J. P. Hessling, “Stratified Deterministic Sampling of Multivariate Statistics,” in preparation, 2013.

[17]   J. P. Hessling, “Deterministic Sampling for Quantification of Modeling Uncertainty of Signals,” In: F. P. G. Marquez and N. Zaman, Eds., Digital Filters and Signal Processing, INTECH, Rijeka, 2012.

[18]   N. Metropolis, “The Beginning of the Monte Carlo Method,” Los Alamos Science Special Issue, Vol. 15, 1987, pp. 125-130.
