Specific-information
{{Short description|State-dependent measures that converge to the mutual information}}
{{no footnotes|date=May 2011}}
In information theory, specific-information is the generic name given to a family of state-dependent measures that in expectation converge to the mutual information. There are currently three known varieties of specific information, usually referred to as the specific surprise, the specific information, and the stimulus-specific information (SSI).
The specific-information between a random variable <math>X</math> and a state <math>Y = y</math> is written as:
:<math>I(X; Y = y).</math>
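For illustration, the two measures introduced by DeWeese and Meister (1999) can be sketched in the notation above (the labels <math>I_1</math> and <math>I_2</math> are used here for convenience rather than taken from the sources): the specific surprise and the specific information of a particular state <math>y</math> are
:<math>I_1(y) = \sum_x p(x \mid y)\,\log_2\frac{p(x \mid y)}{p(x)}, \qquad I_2(y) = H(X) - H(X \mid Y = y),</math>
and both recover the mutual information when averaged over states:
:<math>\sum_y p(y)\,I_1(y) = \sum_y p(y)\,I_2(y) = I(X; Y).</math>
The stimulus-specific information of Butts (2003) weights the specific information of each state of the other variable by its conditional probability given <math>y</math>, and it likewise averages to the mutual information.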
References
- {{cite journal |pages=325–40 |doi=10.1088/0954-898X/10/4/303 |title=How to measure the information gained from one symbol |year=1999 |last1=DeWeese |first1=Michael |last2=Meister |first2=Markus |journal=Network: Computation in Neural Systems |volume=10 |issue=4 |pmid=10695762|citeseerx=10.1.1.553.8013 }}
- {{cite journal |pages=177–87 |doi=10.1088/0954-898X/14/2/301 |title=How much information is associated with a particular stimulus? |year=2003 |last1=Butts |first1=Daniel |journal=Network: Computation in Neural Systems |volume=14 |issue=2 |pmid=12790180}}