Information theory

Information theory is the mathematical study of the quantification, storage, and communication of information. The field was established and formalized by Claude Shannon in the 1940s,[1] though early contributions were made in the 1920s through the works of Harry Nyquist and Ralph Hartley. It lies at the intersection of electrical engineering, mathematics, statistics, computer science, neurobiology, and physics.[2][3]

A key measure in information theory is entropy, which quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy, less uncertainty) than identifying the outcome of a roll of a die (six equally likely outcomes). Other important measures in information theory include mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory, and information-theoretic security.
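As a worked illustration of the coin-versus-die comparison above: for a uniform distribution over n equally likely outcomes, the entropy is log2(n) bits, so a fair coin carries log2(2) = 1 bit while a fair die carries log2(6) ≈ 2.585 bits. The following is a minimal sketch in Python (the helper name shannon_entropy is illustrative, not a standard library function):

    import math

    def shannon_entropy(probabilities):
        # Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A fair coin: two equally likely outcomes -> exactly 1 bit of entropy.
    print(shannon_entropy([0.5, 0.5]))   # 1.0

    # A fair six-sided die: six equally likely outcomes -> log2(6) bits.
    print(shannon_entropy([1/6] * 6))    # 2.584962500721156

The die's higher entropy reflects the greater uncertainty about its outcome, matching the comparison made above.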

Applications of fundamental topics of information theory include source coding/data compression (e.g., for ZIP files) and channel coding/error detection and correction (e.g., for DSL). Its impact has been crucial to the success of the Voyager missions to deep space,[4] the invention of the compact disc, the feasibility of mobile phones, and the development of the Internet and artificial intelligence.[5][6][3] The theory has also found applications in other areas, including statistical inference,[7] cryptography, neurobiology,[8] perception,[9] signal processing,[2] linguistics, the evolution[10] and function[11] of molecular codes (bioinformatics), thermal physics,[12] molecular dynamics,[13] black holes, quantum computing, information retrieval, intelligence gathering, plagiarism detection,[14] pattern recognition, anomaly detection,[15] the analysis of music,[16][17] art creation,[18] imaging system design,[19] the study of outer space,[20] the dimensionality of space,[21] and epistemology.[22]

  1. ^ Schneider, Thomas D. (2006). "Claude Shannon: Biologist". IEEE Engineering in Medicine and Biology Magazine: The Quarterly Magazine of the Engineering in Medicine & Biology Society. 25 (1): 30–33. doi:10.1109/memb.2006.1578661. ISSN 0739-5175. PMC 1538977. PMID 16485389.
  2. ^ a b Cruces, Sergio; Martín-Clemente, Rubén; Samek, Wojciech (2019-07-03). "Information Theory Applications in Signal Processing". Entropy. 21 (7): 653. Bibcode:2019Entrp..21..653C. doi:10.3390/e21070653. ISSN 1099-4300. PMC 7515149. PMID 33267367.
  3. ^ a b Baleanu, D.; Balas, Valentina Emilia; Agarwal, Praveen, eds. (2023). Fractional Order Systems and Applications in Engineering. Advanced Studies in Complex Systems. London, United Kingdom: Academic Press. p. 23. ISBN 978-0-323-90953-2. OCLC 1314337815.
  4. ^ Horgan, John (2016-04-27). "Claude Shannon: Tinkerer, Prankster, and Father of Information Theory". IEEE Spectrum. Retrieved 2024-11-08.
  5. ^ Shi, Zhongzhi (2011). Advanced Artificial Intelligence. World Scientific Publishing. p. 2. doi:10.1142/7547. ISBN 978-981-4291-34-7.
  6. ^ Sinha, Sudhi; Al Huraimel, Khaled (2020-10-20). Reimagining Businesses with AI (1 ed.). Wiley. p. 4. doi:10.1002/9781119709183. ISBN 978-1-119-70915-2.
  7. ^ Burnham, K. P.; Anderson, D. R. (2002). Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach (Second ed.). New York: Springer Science. ISBN 978-0-387-95364-9.
  8. ^ Rieke, F.; Warland, D.; de Ruyter van Steveninck, R.; Bialek, W. (1997). Spikes: Exploring the Neural Code. The MIT Press. ISBN 978-0262681087.
  9. ^ Delgado-Bonal, Alfonso; Martín-Torres, Javier (2016-11-03). "Human vision is determined based on information theory". Scientific Reports. 6 (1): 36038. Bibcode:2016NatSR...636038D. doi:10.1038/srep36038. ISSN 2045-2322. PMC 5093619. PMID 27808236.
  10. ^ Huelsenbeck, J. P.; Ronquist, F.; Nielsen, R.; Bollback, J. P. (2001). "Bayesian inference of phylogeny and its impact on evolutionary biology". Science. 294 (5550): 2310–2314. Bibcode:2001Sci...294.2310H. doi:10.1126/science.1065889. PMID 11743192. S2CID 2138288.
  11. ^ Allikmets, Rando; Wasserman, Wyeth W.; Hutchinson, Amy; Smallwood, Philip; Nathans, Jeremy; Rogan, Peter K.; Schneider, Thomas D.; Dean, Michael (1998). "Organization of the ABCR gene: analysis of promoter and splice junction sequences". Gene. 215 (1): 111–122. doi:10.1016/s0378-1119(98)00269-8. PMID 9666097.
  12. ^ Jaynes, E. T. (1957). "Information Theory and Statistical Mechanics". Phys. Rev. 106 (4): 620. Bibcode:1957PhRv..106..620J. doi:10.1103/physrev.106.620. S2CID 17870175.
  13. ^ Talaat, Khaled; Cowen, Benjamin; Anderoglu, Osman (2020-10-05). "Method of information entropy for convergence assessment of molecular dynamics simulations". Journal of Applied Physics. 128 (13): 135102. Bibcode:2020JAP...128m5102T. doi:10.1063/5.0019078. OSTI 1691442. S2CID 225010720.
  14. ^ Bennett, Charles H.; Li, Ming; Ma, Bin (2003). "Chain Letters and Evolutionary Histories". Scientific American. 288 (6): 76–81. Bibcode:2003SciAm.288f..76B. doi:10.1038/scientificamerican0603-76. PMID 12764940. Archived from the original on 2007-10-07. Retrieved 2008-03-11.
  15. ^ Anderson, David R. (November 1, 2003). "Some background on why people in the empirical sciences may want to better understand the information-theoretic methods" (PDF). Archived from the original (PDF) on July 23, 2011. Retrieved 2010-06-23.
  16. ^ Loy, D. Gareth (2017), Pareyon, Gabriel; Pina-Romero, Silvia; Agustín-Aquino, Octavio A.; Lluis-Puebla, Emilio (eds.), "Music, Expectation, and Information Theory", The Musical-Mathematical Mind: Patterns and Transformations, Computational Music Science, Cham: Springer International Publishing, pp. 161–169, doi:10.1007/978-3-319-47337-6_17, ISBN 978-3-319-47337-6, retrieved 2024-09-19
  17. ^ Rocamora, Martín; Cancela, Pablo; Biscainho, Luiz (2019-04-05). "Information Theory Concepts Applied to the Analysis of Rhythm in Recorded Music with Recurrent Rhythmic Patterns". Journal of the Audio Engineering Society. 67 (4): 160–173. doi:10.17743/jaes.2019.0003.
  18. ^ Marsden, Alan (2020). "New Prospects for Information Theory in Arts Research". Leonardo. 53 (3): 274–280. doi:10.1162/leon_a_01860. ISSN 0024-094X.
  19. ^ Pinkard, Henry; Kabuli, Leyla; Markley, Eric; Chien, Tiffany; Jiao, Jiantao; Waller, Laura (2024). "Universal evaluation and design of imaging systems using information estimation". arXiv:2405.20559 [physics.optics].
  20. ^ Wing, Simon; Johnson, Jay R. (2019-02-01). "Applications of Information Theory in Solar and Space Physics". Entropy. 21 (2): 140. Bibcode:2019Entrp..21..140W. doi:10.3390/e21020140. ISSN 1099-4300. PMC 7514618. PMID 33266856.
  21. ^ Kak, Subhash (2020-11-26). "Information theory and dimensionality of space". Scientific Reports. 10 (1): 20733. doi:10.1038/s41598-020-77855-9. ISSN 2045-2322. PMC 7693271. PMID 33244156.
  22. ^ Harms, William F. (1998). "The Use of Information Theory in Epistemology". Philosophy of Science. 65 (3): 472–501. doi:10.1086/392657. ISSN 0031-8248. JSTOR 188281.
