Independence vs Marginal independence














I know that, for given $x, y$, if we have
$$p(x,y)=p(x)p(y)$$

then we say $x$ and $y$ are independent.

For marginal independence, I found the definition here:




Random variable $x$ is marginally independent of random variable $y$ if:
$$p(x|y)=p(x)$$




However, I cannot see the difference between them.



The question came to my mind while I was working through a problem about Bayesian networks.
In a three-node Bayesian network, the nodes A, B, C form a common-parent structure:



    A
   / \
  B   C


A is the parent of both B and C. It is said that B and C are conditionally independent given A, and I can see why, because
$$P(B\mid A,C)=\frac{P(A,B,C)}{P(A,C)}=\frac{P(A)\,P(B\mid A)\,P(C\mid A)}{P(A)\,P(C\mid A)}=P(B\mid A),$$
using $P(A,C)=P(A)\,P(C\mid A)$.
But when the value of A is unknown, why are $B$ and $C$ not independent?
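
For concreteness, here is a minimal numerical sketch of this setup (the binary variables and the specific probability values below are assumptions chosen only for illustration); it checks both claims directly from the factorization $P(A,B,C)=P(A)\,P(B\mid A)\,P(C\mid A)$:

```python
from itertools import product

# A hypothetical binary common-parent model (all numbers made up for illustration):
# P(A, B, C) = P(A) * P(B | A) * P(C | A)
p_a = {0: 0.5, 1: 0.5}           # P(A = a)
p_b_given_a = {0: 0.9, 1: 0.2}   # P(B = 1 | A = a)
p_c_given_a = {0: 0.8, 1: 0.3}   # P(C = 1 | A = a)

def bern(p, x):
    """Probability that a Bernoulli(p) variable equals x, for x in {0, 1}."""
    return p if x == 1 else 1.0 - p

# Build the full joint distribution over (A, B, C).
joint = {(a, b, c): p_a[a] * bern(p_b_given_a[a], b) * bern(p_c_given_a[a], c)
         for a, b, c in product((0, 1), repeat=3)}

def prob(**fixed):
    """Marginal probability of the event {A=a, B=b, C=c} for whichever of a, b, c are given."""
    idx = {"a": 0, "b": 1, "c": 2}
    return sum(p for abc, p in joint.items()
               if all(abc[idx[k]] == v for k, v in fixed.items()))

# Marginal check: P(B=1, C=1) vs P(B=1) * P(C=1)  -> they differ, so B and C are dependent.
print("P(B,C)   =", round(prob(b=1, c=1), 4))
print("P(B)P(C) =", round(prob(b=1) * prob(c=1), 4))

# Conditional check: P(B=1, C=1 | A=a) vs P(B=1 | A=a) * P(C=1 | A=a)  -> equal for each a.
for a in (0, 1):
    lhs = prob(a=a, b=1, c=1) / prob(a=a)
    rhs = (prob(a=a, b=1) / prob(a=a)) * (prob(a=a, c=1) / prob(a=a))
    print(f"A={a}:  P(B,C|A) = {lhs:.4f}   P(B|A)P(C|A) = {rhs:.4f}")
```

With these made-up numbers the script prints $P(B{=}1,C{=}1)=0.39$ but $P(B{=}1)P(C{=}1)=0.3025$, while the two conditional quantities agree exactly for each value of $A$.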










  • Marginal independence is the same as independence. Conditional independence is the same thing, except that everything is computed after you condition on some event (here, A). – Jimmy R., Oct 20 '17 at 7:08












  • One can only wonder why the author sees fit to rename the plain independence property "marginal independence". This can only confuse things, especially in a teaching context. Really, I find this pedagogical choice rather detrimental. – Did, Dec 19 '18 at 19:37


















Tags: probability, independence, bayesian-network






asked Oct 20 '17 at 7:01 by Lunar_one · edited Jan 26 at 0:02 by Lerner Zhang
























2 Answers

As Jimmy R points out, marginal independence is the same as traditional independence, and conditional independence is just the independence of two random variables given the value of another random variable.



To give one example of why $B$ and $C$ are not independent when $A$ is unknown, suppose that $A$ is a random variable indicating whether a coin is biased: if $A=Y$, the coin is biased and always lands heads (magically), and if $A=N$ the coin is fair.



Now suppose $B$ and $C$ are random variables for the outcomes of two flips of this (possibly biased) coin. If we know whether the coin is biased, then $B$ and $C$ are independent (they are just two separate flips of a coin whose behaviour we know), and we know the distribution of each flip (the two distributions are the same).



However, suppose we do not know the value of $A$, so the coin may or may not be biased. If $B$ comes up tails, then the coin must be fair, i.e. $A=N$ (the biased coin could not have given tails), and so $C$ is the flip of a fair coin. Without that observation, tails would be less likely for $C$, since the coin might be the always-heads one. Hence, without knowing $A$, $B$ and $C$ are dependent on each other.



We therefore say that $B$ and $C$ are conditionally independent given $A$, but marginally dependent.
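
A small Monte Carlo sketch of this coin story makes the effect visible; note that the 50% prior probability that the coin is biased and the sample size are assumptions added here for illustration, not part of the original answer:

```python
import random

random.seed(0)

def draw():
    """One sample of (A, B, C): A = whether the coin is biased; B, C = two flips of that coin."""
    biased = random.random() < 0.5  # assumed prior: a fifty-fifty chance the coin is the always-heads one
    def flip():
        return "H" if biased or random.random() < 0.5 else "T"
    return biased, flip(), flip()

samples = [draw() for _ in range(200_000)]

# Marginally (A unobserved): seeing B = T reveals the coin is fair, which changes the forecast for C.
p_c_t = sum(c == "T" for _, _, c in samples) / len(samples)
b_t = [c for _, b, c in samples if b == "T"]
print("P(C=T)       ~", round(p_c_t, 3))                                  # about 0.25
print("P(C=T | B=T) ~", round(sum(c == "T" for c in b_t) / len(b_t), 3))  # about 0.50 -> dependent

# Conditionally on A = fair: B carries no further information about C.
fair = [(b, c) for a, b, c in samples if not a]
fair_b_t = [c for b, c in fair if b == "T"]
print("P(C=T | A=fair)      ~", round(sum(c == "T" for _, c in fair) / len(fair), 3))       # about 0.5
print("P(C=T | A=fair, B=T) ~", round(sum(c == "T" for c in fair_b_t) / len(fair_b_t), 3))  # about 0.5
```

The first pair of estimates differ (roughly 0.25 versus 0.5) while the second pair agree, which is exactly the "conditionally independent but marginally dependent" pattern described above.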






answered Jul 14 '18 at 18:19 by Alex W


You ask: "But, when the value of A is unknown, why are B and C not independent?"

The formulation of your problem points toward a solution: suppose, for contradiction, that they are independent.



If B and C were independent, then $P(B,C)=P(B)P(C)$. But the graph in your question (two children sharing a common parent) asserts $P(A,B,C)=P(A)\,P(B\mid A)\,P(C\mid A)$, so that $P(B,C)=\sum_a P(a)\,P(B\mid a)\,P(C\mid a)$, and in general this sum does not factor as $P(B)P(C)$: the shared parent couples the two children. So, except in special cases, the structure of the graph is incompatible with the independence of B and C.



Let me also give an intuitive example.
Suppose A is a person's age, B is their reading ability, and C is their height. Knowing A decouples B and C. But if A is not given, we can infer B from C, or C from B: if the person is tall, we can guess they are old enough to read fluently, and if someone's reading ability is high, we can conjecture that they are most probably not as short as a kindergarten child.
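
Here is a tiny numerical version of that story (all probability values below are invented for illustration). With the age unknown, observing that the person is tall raises the probability that they read fluently; once the age is known, the height observation adds nothing:

```python
# Toy common-parent model P(A) * P(B | A) * P(C | A), with invented numbers:
# A = age group, B = "reads fluently", C = "is tall".
p_a = {"child": 0.3, "adult": 0.7}
p_reads = {"child": 0.2, "adult": 0.95}  # P(B = reads | A)
p_tall = {"child": 0.1, "adult": 0.8}    # P(C = tall | A)

# Belief about B with A unknown, before and after observing C = tall (sum the unknown A out of the joint).
p_b = sum(p_a[a] * p_reads[a] for a in p_a)
p_b_and_tall = sum(p_a[a] * p_reads[a] * p_tall[a] for a in p_a)
p_tall_marginal = sum(p_a[a] * p_tall[a] for a in p_a)
print("P(reads)        =", round(p_b, 3))                             # 0.725
print("P(reads | tall) =", round(p_b_and_tall / p_tall_marginal, 3))  # about 0.912: tallness suggests 'adult'

# Once A is known, conditioning on height changes nothing:
# P(reads | A=adult, tall) = P(reads | A=adult) * P(tall | A=adult) / P(tall | A=adult) = P(reads | A=adult).
print("P(reads | adult)       =", p_reads["adult"])
print("P(reads | adult, tall) =", round(p_reads["adult"] * p_tall["adult"] / p_tall["adult"], 3))
```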






answered Jan 26 at 1:23 by Lerner Zhang · edited Jan 27 at 8:51
