My Math Forum

My Math Forum (http://mymathforum.com/math-forums.php)
-   Advanced Statistics (http://mymathforum.com/advanced-statistics/)
-   -   Question about information entropy (http://mymathforum.com/advanced-statistics/345717-question-about-information-entropy.html)

zollen February 4th, 2019 02:53 PM

Question about information entropy
 
Let X be a random variable taking values a, b, c, d, and let Y be a random variable taking values p, q, r, with the joint distribution shown in the following table:

Code:

      a      b      c      d
p | 0.20 | 0.05 | 0.02 | 0.03
q | 0.01 | 0.04 | 0.08 | 0.07
r | 0.10 | 0.30 | 0.04 | 0.06

What is Ent(X, Y)?


My Answer:

$\displaystyle
Ent(X, Y) = - \sum\limits_{n \in \{a,b,c,d\}} ~\sum\limits_{m \in \{p,q,r\}} P(X=n, Y=m) \cdot \log P(X=n, Y=m)
$

Is my answer correct?

romsek February 4th, 2019 02:59 PM

Yes,

though they probably want you to plug and chug the numbers.
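Plugging the numbers in is a one-liner. Here is a minimal sketch in Python (the variable names and the choice of log base 2, giving the entropy in bits, are my own; the textbook may use natural log instead):

```python
import math

# Joint distribution P(X, Y) from the table above
# (rows: p, q, r; columns: a, b, c, d)
joint = {
    ('a', 'p'): 0.20, ('b', 'p'): 0.05, ('c', 'p'): 0.02, ('d', 'p'): 0.03,
    ('a', 'q'): 0.01, ('b', 'q'): 0.04, ('c', 'q'): 0.08, ('d', 'q'): 0.07,
    ('a', 'r'): 0.10, ('b', 'r'): 0.30, ('c', 'r'): 0.04, ('d', 'r'): 0.06,
}

# Sanity check: the joint probabilities should sum to 1
assert abs(sum(joint.values()) - 1.0) < 1e-12

# Joint entropy: Ent(X, Y) = -sum over all cells of p * log2(p)
entropy = -sum(p * math.log2(p) for p in joint.values())
print(round(entropy, 4))  # ≈ 3.04 bits
```

With this table the sum works out to about 3.04 bits; using the natural log instead of log2 would scale the answer by ln 2.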

zollen February 4th, 2019 02:59 PM

Thanks, Master romsek.



Copyright © 2019 My Math Forum. All rights reserved.