Infomax

Infomax, or the principle of maximum information preservation, is an optimization principle for artificial neural networks and other information processing systems. It prescribes that a function that maps a set of input values x to a set of output values z(x) should be chosen or learned so as to maximize the average Shannon mutual information between x and z(x), subject to a set of specified constraints and/or noise processes. Infomax algorithms are learning algorithms that perform this optimization process. The principle was described by Linsker in 1988.{{cite journal |doi=10.1109/2.36 |author=Linsker R |title=Self-organization in a perceptual network |journal=IEEE Computer |volume=21 |issue=3 |pages=105–17 |year=1988 |s2cid=1527671 }} The objective function is called the InfoMax objective.
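For a linear unit with Gaussian input and additive Gaussian output noise, the mutual information has a closed form, and Linsker showed that maximizing it under a weight-norm constraint drives the weights toward the leading principal component of the input. A minimal sketch of this idea (the covariance matrix, noise variance, and learning rate here are illustrative choices, not values from the literature):

```python
import numpy as np

# Toy infomax: scalar linear unit z = w.x + noise, Gaussian input x.
# Then I(x; z) = 0.5 * log(1 + w' Sigma w / sigma2), and with ||w|| = 1
# the maximum is at the leading eigenvector of Sigma.
Sigma = np.array([[4.0, 0.0],
                  [0.0, 1.0]])   # assumed input covariance for the demo
sigma2 = 0.5                     # assumed output noise variance

def mutual_info(w):
    return 0.5 * np.log(1.0 + w @ Sigma @ w / sigma2)

rng = np.random.default_rng(0)
w = rng.normal(size=2)
w /= np.linalg.norm(w)
lr = 0.1
for _ in range(200):
    grad = Sigma @ w / (sigma2 + w @ Sigma @ w)  # dI/dw
    w += lr * grad
    w /= np.linalg.norm(w)       # enforce the norm constraint

# w converges to +/-[1, 0], the leading principal direction of Sigma
```

Projected gradient ascent on the norm constraint behaves like power iteration here, which is why the weight vector aligns with the dominant eigenvector.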

Because the InfoMax objective is difficult to compute exactly, a related approach trains two models that produce outputs z_1(x) and z_2(x) and maximizes the mutual information between them. This contrastive InfoMax objective is a lower bound on the InfoMax objective.{{Cite journal |last1=Becker |first1=Suzanna |last2=Hinton |first2=Geoffrey E. |date=January 1992 |title=Self-organizing neural network that discovers surfaces in random-dot stereograms |url=https://www.nature.com/articles/355161a0 |journal=Nature |language=en |volume=355 |issue=6356 |pages=161–163 |doi=10.1038/355161a0 |pmid=1729650 |bibcode=1992Natur.355..161B |issn=1476-4687|url-access=subscription }}
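Under a joint-Gaussian assumption, the mutual information between two scalar outputs reduces to a function of their correlation coefficient, I(z_1; z_2) = -0.5 log(1 - rho^2), which makes the contrastive objective cheap to estimate from samples. A minimal sketch (the common signal and noise levels below are illustrative, not from the original paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
s = rng.normal(size=n)              # common underlying signal (e.g. depth in a stereogram)
z1 = s + 0.5 * rng.normal(size=n)   # output of module 1: noisy view of s
z2 = s + 0.5 * rng.normal(size=n)   # output of module 2: independently noisy view of s

rho = np.corrcoef(z1, z2)[0, 1]
mi = -0.5 * np.log(1.0 - rho**2)    # mutual information in nats, Gaussian assumption
```

Maximizing this quantity with respect to the two modules' parameters rewards them for agreeing about the shared signal while their independent noise is not rewarded.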

Infomax, in its zero-noise limit, is related to the principle of redundancy reduction proposed for biological sensory processing by Horace Barlow in 1961,{{cite book |author=Barlow, H. |chapter=Possible principles underlying the transformations of sensory messages |editor=Rosenblith, W. |title=Sensory Communication |publisher=MIT Press |location=Cambridge MA |year=1961 |pages=217–234 }} and applied quantitatively to retinal processing by Atick and Redlich.{{cite journal |doi=10.1162/neco.1992.4.2.196 |author=Atick JJ, Redlich AN |title=What does the retina know about natural scenes? |journal=Neural Computation |volume=4 |pages=196–210 |year=1992 |issue=2 |s2cid=17515861 }}

Applications

Becker and Hinton (1992) showed that maximizing the contrastive InfoMax objective allows a neural network to learn to identify surfaces in random-dot stereograms (in one dimension).

One application of infomax is an independent component analysis (ICA) algorithm that finds independent signals by maximizing entropy. Infomax-based ICA was described by Bell and Sejnowski (1995){{cite journal |author=Bell AJ, Sejnowski TJ |date=November 1995 |title=An information-maximization approach to blind separation and blind deconvolution |journal=Neural Comput |volume=7 |issue=6 |pages=1129–59 |citeseerx=10.1.1.36.6605 |doi=10.1162/neco.1995.7.6.1129 |pmid=7584893 |s2cid=1701422}} and Nadal and Parga (1995).{{cite book |author=Nadal J.P., Parga N. |chapter= Sensory coding: information maximization and redundancy reduction |title=Neural Information Processing |editor1-first=G.|editor1-last=Burdet|editor2-first=P.|editor2-last=Combe|editor3-first=O.|editor3-last=Parodi |series=World Scientific Series in Mathematical Biology and Medicine |location=Singapore |publisher=World Scientific |volume=7 |pages=164–171 |date= 1999 |url=http://www.lps.ens.fr/~nadal/documents/proceedings/carg97/carg97.html }}
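The Bell–Sejnowski update, in the natural-gradient form later introduced by Amari, adjusts an unmixing matrix W so that the entropy of sigmoid-squashed outputs is maximized, which separates super-Gaussian sources. A minimal sketch on synthetic data (the mixing matrix, source distribution, and learning schedule are illustrative choices for the demo):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Two independent super-Gaussian (Laplace) sources, mixed by an
# arbitrary invertible matrix A; only X is observed.
S = rng.laplace(size=(2, n))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = A @ S

# Natural-gradient infomax ICA:
#   dW = lr * (I + (1 - 2*sigmoid(u)) u^T) W,  with u = W x,
# averaged over each mini-batch.
W = np.eye(2)
lr, batch = 0.01, 100
for epoch in range(30):
    for i in range(0, n, batch):
        u = W @ X[:, i:i + batch]
        y = 1.0 / (1.0 + np.exp(-u))            # sigmoid outputs
        W += lr * (np.eye(2) + (1.0 - 2.0 * y) @ u.T / batch) @ W

U = W @ X   # recovered sources, up to permutation and scale
```

As with all ICA methods, the sources are only recovered up to permutation and rescaling, so success is usually checked by correlating each recovered component against the true sources.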

References

{{reflist}}

  • {{cite journal |doi=10.1016/S0042-6989(97)00121-1 |author=Bell AJ, Sejnowski TJ |title=The "Independent Components" of Natural Scenes are Edge Filters |journal=Vision Res. |volume=37 |issue=23 |pages=3327–38 |date=December 1997 |pmid=9425547 |pmc=2882863 }}
  • {{cite journal |author=Linsker R |title=A local learning rule that enables information maximization for arbitrary input distributions |journal=Neural Computation |volume=9 |issue= 8|pages=1661–65 |year=1997 |doi=10.1162/neco.1997.9.8.1661|s2cid=42857188 }}
  • {{cite book |author=Stone, J. V. |title=Independent Component Analysis: A tutorial introduction |publisher=MIT Press |location=Cambridge MA |year=2004 |isbn=978-0-262-69315-8 }}

Category:Artificial neural networks

Category:Computational neuroscience

{{mathapplied-stub}}