Entropy—A Universal Concept in Sciences


References

[1] Feller, W. (1968) An Introduction to Probability Theory and Its Applications, Vol. I. John Wiley and Sons, New York.

[2] Boltzmann, L. (1896) Vorlesungen über Gastheorie. J. A. Barth, Leipzig.

[3] Shannon, C.E. (1948) A Mathematical Theory of Communication. The Bell System Technical Journal, 27, 379-423, 623-656.

http://dx.doi.org/10.1002/j.1538-7305.1948.tb00917.x

[4] Nyquist, H. (1924) Certain Factors Affecting Telegraph Speed. Bell System Technical Journal, 3, 324-346.

http://dx.doi.org/10.1002/j.1538-7305.1924.tb01361.x

[5] Khinchin, A.I. (1957) Mathematical Foundations of Information Theory. Dover Publications, New York.

[6] Aczél, J. and Daróczy, Z. (1975) On Measures of Information and Their Characterization. Academic Press, New York.

[7] Merzbacher, E. (1967) Quantum Mechanics. John Wiley and Sons, New York.

[8] Faddejew, D.K. (1957) Der Begriff der Entropie in der Wahrscheinlichkeitstheorie. In: Arbeiten zur Informationstheorie I. DVdW, Berlin.

[9] Watanabe, S. (1969) Knowing and Guessing. John Wiley and Sons, New York.

[10] Majerník, V. (2001) Elementary Theory of Organization. Palacky University Press, Olomouc.

[11] Haken, H. (1983) Advanced Synergetics. Springer-Verlag, Berlin.

[12] Li, K.-H. (2000) Physics of Open Systems. Physics Reports, 165, 1-101.

[13] Jaynes, E.T. (1957) Information Theory and Statistical Mechanics. Physical Review, 106, 620-630.

http://dx.doi.org/10.1103/PhysRev.106.620

[14] Jaynes, E.T. (1967) Foundations of Probability Theory and Statistical Mechanics. In: Bunge, M., Ed., Delaware Seminar in the Foundations of Physics, Springer, New York.

http://dx.doi.org/10.1007/978-3-642-86102-4_6

[15] Ang, A.H.-S. and Tang, W.H. (2004) Probability Concepts in Engineering Planning and Design, Vol. 1, 3-5.

[16] Vajda, I. (1995) Theory of Information and Statistical Decisions. Kluwer Academic Publishers, Dordrecht.

[17] Rényi, A. (1961) On Measures of Entropy and Information. Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, 547-555.

[18] Havrda, J. and Charvát, F. (1967) Quantification Method of Classification Processes. Concept of Structural α-Entropy. Kybernetika, 3, 30-35.

[19] Majerník, V., Majerníková, E. and Shpyrko, S. (2003) Uncertainty Relations Expressed by Shannon-Like Entropies. Central European Journal of Physics, 6, 363-371.

http://dx.doi.org/10.2478/s11534-008-0057-6

[20] Tsallis, C. (1988) Possible Generalization of Boltzmann-Gibbs Statistics. Journal of Statistical Physics, 52, 479-487.

http://dx.doi.org/10.1007/BF01016429

[21] Majerník, V. and Richterek, L. (1997) Entropic Uncertainty Relations. European Journal of Physics, 18, 73-81.

http://dx.doi.org/10.1088/0143-0807/18/2/005

[22] Brillouin, L. (1965) Science and Information Theory. Academic Press, New York.

[23] Fowler, T.B. (1983) The Notion of Entropy. International Journal of General Systems, 9, 143-152.

[24] Guiasu, S. and Shenitzer, A. (1985) The Principle of Maximum Entropy. The Mathematical Intelligencer, 7, 42-48.

http://dx.doi.org/10.1007/BF03023004

[25] Khinchin, A.I. (1957) Mathematical Foundations of Information Theory. Dover Publications, New York.

[26] Chaitin, G.J. (1982) Algorithmic Information Theory. John Wiley and Sons, New York.

[27] Kolmogorov, A.N. (1965) Three Approaches to the Quantitative Definition of Information. Problems of Information Transmission, 1, 1-7.