- **Advanced Statistics**
(*http://mymathforum.com/advanced-statistics/*)

  - **Question about information entropy**
    (*http://mymathforum.com/advanced-statistics/345717-question-about-information-entropy.html*)

**Question about information entropy**

Let X be a random variable with values a, b, c, d, and let Y be a random variable with values p, q, r, with the joint distribution shown in the following table:

Code: ` a b c d`

My answer:

$\displaystyle Ent(X, Y) = - \sum_{n \in \{a,b,c,d\}} \; \sum_{m \in \{p,q,r\}} P(X=n, Y=m) \cdot \log\big(P(X=n, Y=m)\big)$

Is my answer correct?

Yes, though they probably want you to plug and chug the numbers.

Thanks, Master romsek.
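As romsek notes, the remaining work is plugging the table's numbers into the joint-entropy formula. The original table did not survive in this archive, so the sketch below uses a made-up joint distribution over the same value sets purely for illustration; only the formula itself comes from the thread.

```python
import math

# Hypothetical joint distribution P(X = x, Y = y): these probabilities
# are NOT from the original post (the table was lost); they are chosen
# only so the example runs, and they sum to 1.
joint = {
    ('a', 'p'): 0.125,  ('a', 'q'): 0.0625, ('a', 'r'): 0.0625,
    ('b', 'p'): 0.125,  ('b', 'q'): 0.125,  ('b', 'r'): 0.0,
    ('c', 'p'): 0.0625, ('c', 'q'): 0.125,  ('c', 'r'): 0.0625,
    ('d', 'p'): 0.125,  ('d', 'q'): 0.0625, ('d', 'r'): 0.0625,
}

def joint_entropy(p, base=2):
    """Ent(X, Y) = -sum over all (x, y) of P(x, y) * log P(x, y).

    Cells with probability 0 contribute 0 by the usual convention
    that 0 * log 0 = 0, so they are skipped.
    """
    return -sum(v * math.log(v, base) for v in p.values() if v > 0)

print(joint_entropy(joint))  # entropy in bits (base-2 logarithm)
```

With the actual table from the thread, one would replace the dictionary values and read off the answer the same way.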


Copyright © 2019 My Math Forum. All rights reserved.