My Math Forum Question about information entropy

Advanced Probability and Statistics Math Forum

February 4th, 2019, 02:53 PM #1
Senior Member — Joined: Jan 2017, From: Toronto, Posts: 209, Thanks: 3

Question about information entropy

Let X be a random variable with values a, b, c, d and let Y be a random variable with values p, q, r, with the joint distribution shown in the following table:

Code:
         a      b      c      d
  p  | 0.2  | 0.05 | 0.02 | 0.03
  q  | 0.01 | 0.04 | 0.08 | 0.07
  r  | 0.1  | 0.3  | 0.04 | 0.06

What is Ent(X, Y)?

My answer:

$\displaystyle Ent(X, Y) = - \sum\limits_{n \in \{a,b,c,d\}} \; \sum\limits_{m \in \{p,q,r\}} P(X=n, Y=m) \cdot \log P(X=n, Y=m)$

Is my answer correct?
February 4th, 2019, 02:59 PM #2
Senior Member — Joined: Sep 2015, From: USA, Posts: 2,553, Thanks: 1403

Yes, though they probably want you to plug and chug the numbers.
February 4th, 2019, 02:59 PM #3
Senior Member — Joined: Jan 2017, From: Toronto, Posts: 209, Thanks: 3

Thanks, Master romsek.
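The reply above suggests plugging the table's numbers into the formula. A minimal sketch of that "plug and chug" step in Python (using log base 2, so the answer comes out in bits; the dictionary keys mirror the row and column labels from the question):

```python
import math

# Joint probability table P(X=n, Y=m) from the question.
# Rows indexed by Y in {p, q, r}; columns indexed by X in {a, b, c, d}.
joint = {
    ('p', 'a'): 0.2,  ('p', 'b'): 0.05, ('p', 'c'): 0.02, ('p', 'd'): 0.03,
    ('q', 'a'): 0.01, ('q', 'b'): 0.04, ('q', 'c'): 0.08, ('q', 'd'): 0.07,
    ('r', 'a'): 0.1,  ('r', 'b'): 0.3,  ('r', 'c'): 0.04, ('r', 'd'): 0.06,
}

def joint_entropy(dist, base=2):
    """Ent(X, Y) = -sum over all (m, n) of P(X=n, Y=m) * log(P(X=n, Y=m))."""
    return -sum(p * math.log(p, base) for p in dist.values() if p > 0)

# Sanity check: the table is a valid distribution (entries sum to 1).
assert abs(sum(joint.values()) - 1.0) < 1e-9

H = joint_entropy(joint)
print(f"Ent(X, Y) = {H:.4f} bits")  # ≈ 3.0400 bits
```

The `p > 0` guard skips zero-probability cells, following the usual convention 0·log 0 = 0; passing a different `base` gives the entropy in other units (e.g. nats for base e).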



