Conditional dependence

{{see also|Conditional independence}}

[[File:Conditional Dependence.jpg|thumb|Diagram illustrating conditional dependence]]

In probability theory, conditional dependence is a relationship between two or more events that are dependent when a third event occurs.<ref>Introduction to Artificial Intelligence by Sebastian Thrun and Peter Norvig, 2011 [https://www.ai-class.com/course/video/videolecture/33 "Unit 3: Conditional Dependence"]{{Dead link|date=July 2020|bot=InternetArchiveBot|fix-attempted=yes}}</ref><ref>Dirk Husmeier [http://www.bioss.sari.ac.uk/staff/dirk/papers/sbb_bnets.pdf "Introduction to Learning Bayesian Networks from Data"]{{Dead link|date=December 2023 |bot=InternetArchiveBot |fix-attempted=yes }}</ref> For example, if A and B are two events that individually increase the probability of a third event C, and do not directly affect each other, then initially (when it has not been observed whether or not the event C occurs)<ref>A. P. Dawid [http://edlab-www.cs.umass.edu/cs589/2010-lectures/conditional%20independence%20in%20statistical%20theory.pdf "Conditional Independence in Statistical Theory"] {{webarchive|url=https://web.archive.org/web/20131227164541/http://edlab-www.cs.umass.edu/cs589/2010-lectures/conditional%20independence%20in%20statistical%20theory.pdf|date=2013-12-27}}</ref><ref>[http://www.britannica.com/EBchecked/topic/477530/probability-theory/32768/Applications-of-conditional-probability#toc32769 "Probability → Applications of conditional probability → independence (equation 7)"] on Britannica</ref>

\operatorname{P}(A \mid B) = \operatorname{P}(A) \quad \text{ and } \quad \operatorname{P}(B \mid A) = \operatorname{P}(B) \quad (A \text{ and } B \text{ are independent}).

But suppose that now C is observed to occur. If event B occurs, then the probability of occurrence of the event A will decrease: its positive relation to C is less necessary as an explanation for the occurrence of C (similarly, event A occurring will decrease the probability of occurrence of B). Hence, the two events A and B are now conditionally negatively dependent on each other, because the probability of occurrence of each is negatively dependent on whether the other occurs. We have<ref>Introduction to Artificial Intelligence by Sebastian Thrun and Peter Norvig, 2011 [https://www.ai-class.com/course/video/quizquestion/60 "Unit 3: Explaining Away"]{{Dead link|date=July 2020 |bot=InternetArchiveBot |fix-attempted=yes }}</ref>

\operatorname{P}(A \mid C \text{ and } B) < \operatorname{P}(A \mid C).

Conditional dependence of A and B given C is the logical negation of conditional independence ((A \perp\!\!\!\perp B) \mid C).<ref>{{Cite book |last=Bouckaert |first=Remco R. |chapter=11. Conditional dependence in probabilistic networks |title=Selecting Models from Data, Artificial Intelligence and Statistics IV |publisher=Springer-Verlag |year=1994 |isbn=978-0-387-94281-0 |editor-last=Cheeseman |editor-first=P. |editor-last2=Oldford |editor-first2=R. W. |series=Lecture Notes in Statistics |volume=89 |pages=101–111, especially 104 |language=EN}}</ref> In conditional independence, two events (which may be dependent or not) become independent given the occurrence of a third event.<ref>A. P. Dawid [http://edlab-www.cs.umass.edu/cs589/2010-lectures/conditional%20independence%20in%20statistical%20theory.pdf "Conditional Independence in Statistical Theory"] {{webarchive|url=https://web.archive.org/web/20131227164541/http://edlab-www.cs.umass.edu/cs589/2010-lectures/conditional%20independence%20in%20statistical%20theory.pdf |date=2013-12-27 }}</ref>

Example

In essence, probability is influenced by a person's information about the possible occurrence of an event. For example, let the event A be 'I have a new phone'; event B be 'I have a new watch'; and event C be 'I am happy'; and suppose that having either a new phone or a new watch increases the probability of my being happy. Let us assume that the event C has occurred, meaning 'I am happy'. Now if another person sees my new watch, he or she will reason that my likelihood of being happy was increased by my new watch, so there is less need to attribute my happiness to a new phone.
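This "explaining away" effect can be illustrated with a small Monte Carlo sketch. The numerical probabilities below (base rates of 0.5 for phone and watch, happiness probabilities of 0.9 and 0.1) are assumptions chosen for illustration only; the source specifies no particular values.

```python
import random

random.seed(0)

def trial():
    # Assumed base rates, for illustration only: phone and watch are
    # independent, each present with probability 0.5.
    phone = random.random() < 0.5
    watch = random.random() < 0.5
    # Happiness is more likely when either a new phone or a new watch is present.
    p_happy = 0.9 if (phone or watch) else 0.1
    happy = random.random() < p_happy
    return phone, watch, happy

samples = [trial() for _ in range(100_000)]
happy_cases = [(p, w) for p, w, h in samples if h]
happy_watch_cases = [(p, w) for p, w in happy_cases if w]

p_phone_given_happy = sum(p for p, w in happy_cases) / len(happy_cases)
p_phone_given_happy_watch = sum(p for p, w in happy_watch_cases) / len(happy_watch_cases)

# Observing the watch "explains away" the happiness, lowering the
# estimated probability of a new phone.
print(p_phone_given_happy > p_phone_given_happy_watch)  # True
```

Under these assumed numbers the exact values are P(phone | happy) = 0.45/0.7 ≈ 0.643 and P(phone | happy, watch) = 0.5, so the simulated estimates exhibit the conditional dependence described above.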

To make the example more numerically specific, suppose that there are four possible states \Omega = \left\{ s_1, s_2, s_3, s_4 \right\}, given in the middle four columns of the following table, in which the occurrence of event A is signified by a 1 in row A and its non-occurrence is signified by a 0, and likewise for B and C. That is, A = \left\{ s_2, s_4 \right\}, B = \left\{ s_3, s_4 \right\}, and C = \left\{ s_2, s_3, s_4 \right\}. The probability of s_i is 1/4 for every i.

{| class="wikitable"
|-
! Event !! \operatorname{P}(s_1)=1/4 !! \operatorname{P}(s_2)=1/4 !! \operatorname{P}(s_3)=1/4 !! \operatorname{P}(s_4)=1/4 !! Probability of event
|-
! A
| 0 || 1 || 0 || 1
! \tfrac{1}{2}
|-
! B
| 0 || 0 || 1 || 1
! \tfrac{1}{2}
|-
! C
| 0 || 1 || 1 || 1
! \tfrac{3}{4}
|}
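The event probabilities above follow directly from counting states: under the uniform distribution each event's probability is the number of states it contains divided by four. A minimal Python sketch (the set names mirror the example's events):

```python
from fractions import Fraction

# Four equally likely states; each event is the set of states in which it occurs.
states = {"s1", "s2", "s3", "s4"}
A = {"s2", "s4"}          # A occurs in s2 and s4
B = {"s3", "s4"}          # B occurs in s3 and s4
C = {"s2", "s3", "s4"}    # C occurs whenever A or B occurs

def prob(event):
    """Probability of an event under the uniform distribution on the four states."""
    return Fraction(len(event & states), len(states))

print(prob(A))  # 1/2
print(prob(B))  # 1/2
print(prob(C))  # 3/4
```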

and so

{| class="wikitable"
|-
! Event !! s_1 !! s_2 !! s_3 !! s_4 !! Probability of event
|-
! A \cap B
| 0 || 0 || 0 || 1
! \tfrac{1}{4}
|-
! A \cap C
| 0 || 1 || 0 || 1
! \tfrac{1}{2}
|-
! B \cap C
| 0 || 0 || 1 || 1
! \tfrac{1}{2}
|-
! A \cap B \cap C
| 0 || 0 || 0 || 1
! \tfrac{1}{4}
|}

In this example, C occurs if and only if at least one of A or B occurs. Unconditionally (that is, without reference to C), A and B are independent of each other, because \operatorname{P}(A), the sum of the probabilities associated with a 1 in row A, is \tfrac{1}{2}, while

\operatorname{P}(A\mid B) = \operatorname{P}(A \text{ and } B) / \operatorname{P}(B) = \tfrac{1/4}{1/2} = \tfrac{1}{2} = \operatorname{P}(A).

But conditional on C having occurred (the last three columns in the table), we have

\operatorname{P}(A \mid C) = \operatorname{P}(A \text{ and } C) / \operatorname{P}(C) = \tfrac{1/2}{3/4} = \tfrac{2}{3}

while

\operatorname{P}(A \mid C \text{ and } B) = \operatorname{P}(A \text{ and } C \text{ and } B) / \operatorname{P}(C \text{ and } B) = \tfrac{1/4}{1/2} = \tfrac{1}{2} < \operatorname{P}(A \mid C).

Since in the presence of C the probability of A is affected by the presence or absence of B, A and B are mutually dependent conditional on C.

See also

  • {{annotated link|Conditional independence}}
  • {{annotated link|de Finetti's theorem}}
  • {{annotated link|Conditional expectation}}

References

{{reflist|group=note}}

{{reflist}}

Category:Independence (probability theory)