February 4th, 2019, 03:53 PM   #1
zollen (Senior Member)
Question about information entropy

Let X be a random variable with values a, b, c, d, and let Y be a random variable with values p, q, r, with the joint distribution shown in the following table:

Code:
        a      b      c      d
p |  0.20 | 0.05 | 0.02 | 0.03
q |  0.01 | 0.04 | 0.08 | 0.07
r |  0.10 | 0.30 | 0.04 | 0.06
What is Ent(X, Y)?


My Answer:

$\displaystyle
Ent(X, Y) = - \sum_{n \in \{a,\,b,\,c,\,d\}} ~ \sum_{m \in \{p,\,q,\,r\}} P(X=n, Y=m) \cdot \log\big( P(X=n, Y=m) \big)
$
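
Plugging in the table's entries (leaving the log base unspecified, as above), this expands to

$\displaystyle
Ent(X, Y) = -\big( 0.2 \log 0.2 + 0.05 \log 0.05 + 0.02 \log 0.02 + \dots + 0.04 \log 0.04 + 0.06 \log 0.06 \big)
$

with one term for each of the twelve cells of the table.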

Is my answer correct?

 
February 4th, 2019, 03:59 PM   #2
romsek (Senior Member)
Yes, though they probably want you to plug and chug the numbers.
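
A quick plug-and-chug sketch of that in Python, assuming base-2 logs (entropy in bits) since the question leaves the base unspecified; a different base only rescales the result by a constant factor:

Code:
import math

# Joint distribution P(X, Y) from the table
# (rows: Y = p, q, r; columns: X = a, b, c, d).
joint = [
    [0.20, 0.05, 0.02, 0.03],  # Y = p
    [0.01, 0.04, 0.08, 0.07],  # Y = q
    [0.10, 0.30, 0.04, 0.06],  # Y = r
]

# Ent(X, Y) = -sum over all cells of P(x, y) * log2 P(x, y)
ent_xy = -sum(p * math.log2(p) for row in joint for p in row if p > 0)
print(round(ent_xy, 4))  # roughly 3.04 bits

The if p > 0 guard skips empty cells, since 0·log 0 is taken to be 0 by convention (no zeros appear in this particular table).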
February 4th, 2019, 03:59 PM   #3
zollen (Senior Member)
Thanks, Master romsek.