
Advanced Probability and Statistics Math Forum
February 4th, 2019, 03:53 PM  #1 
Senior Member Joined: Jan 2017 From: Toronto Posts: 209 Thanks: 3

Question about information entropy

Let X be a random variable with values a, b, c, d and let Y be a random variable with values p, q, r, with the joint distribution shown in the following table:

Code:
       a     b     c     d
p   0.20  0.05  0.02  0.03
q   0.01  0.04  0.08  0.07
r   0.10  0.30  0.04  0.06

My answer:

$\displaystyle Ent(X, Y) = -\sum_{x \in \{a,b,c,d\}} \; \sum_{y \in \{p,q,r\}} P(X = x, Y = y) \cdot \log P(X = x, Y = y)$

Is my answer correct?

Last edited by skipjack; February 4th, 2019 at 04:24 PM.
February 4th, 2019, 03:59 PM  #2 
Senior Member Joined: Sep 2015 From: USA Posts: 2,299 Thanks: 1222 
Yes, though they probably want you to plug and chug the numbers.
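Spelling out the plug-and-chug step, here is a minimal Python sketch that sums $-P \log P$ over every cell of the table. It assumes log base 2 (entropy in bits); the question doesn't fix a base, so scale accordingly if natural log is wanted.

```python
import math

# Joint distribution P(X, Y) from the table in the question.
# Rows: Y in {p, q, r}; columns: X in {a, b, c, d}.
joint = [
    [0.20, 0.05, 0.02, 0.03],  # Y = p
    [0.01, 0.04, 0.08, 0.07],  # Y = q
    [0.10, 0.30, 0.04, 0.06],  # Y = r
]

# Sanity check: the probabilities should sum to 1.
assert abs(sum(p for row in joint for p in row) - 1.0) < 1e-9

# Joint entropy: Ent(X, Y) = -sum over all cells of P * log2(P).
entropy = -sum(p * math.log2(p) for row in joint for p in row)
print(f"Ent(X, Y) = {entropy:.4f} bits")  # about 3.04 bits
```

Every cell is strictly positive here, so there is no need to special-case $P = 0$ (where the convention $0 \log 0 = 0$ would apply).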
February 4th, 2019, 03:59 PM  #3 
Senior Member Joined: Jan 2017 From: Toronto Posts: 209 Thanks: 3 
Thanks, Master romsek.

