My Math Forum Entropy H is continuous in p(i)

 Computer Science Forum

March 26th, 2013, 11:04 PM   #1
Newbie

Joined: Jan 2012

Posts: 12
Thanks: 0

Entropy H is continuous in p(i)

Here's something I came across in "A Mathematical Theory of Communication" by Shannon that I couldn't understand:

Quote:
 " Suppose we have a set of possible events whose probabilities of occurrence are p1, p2, ... pn. These probabilities are known but that is all we know concerning which event will occur. Can we find a measure of how much “choice” is involved in the selection of the event or of how uncertain we are of the outcome? If there is such a measure, say H(p1, p2,...pn), it is reasonable to require of it the following properties: 1. H should be continuous in the p(i). "
H is the entropy.

What does he mean by "H should be continuous in the p(i)" [where p(i) is the probability of the i-th event]? I don't get it.

March 27th, 2013, 05:16 AM   #2
Global Moderator

Joined: Nov 2006
From: UTC -5

Posts: 16,046
Thanks: 937

Math Focus: Number theory, computational mathematics, combinatorics, FOM, symbolic logic, TCS, algorithms

Re: Entropy H is continuous in p(i)

It means that if any p(i) changes by a very small amount, then H changes by only a small amount. You can use the usual epsilon-delta definition of continuity here if you prefer.
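A quick numerical sketch of what continuity means here (my own illustration, not from Shannon's paper): compute H for a distribution, nudge one of the probabilities by a tiny epsilon, and observe that H moves by only a tiny amount.

```python
import math

def entropy(probs):
    # Shannon entropy: H = -sum p_i * log2(p_i), with 0*log(0) taken as 0
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A distribution and a slightly perturbed copy (still sums to 1)
eps = 1e-6
p = [0.5, 0.3, 0.2]
q = [0.5 + eps, 0.3 - eps, 0.2]

# Continuity in the p(i): a tiny change in the inputs gives
# a correspondingly tiny change in H
print(abs(entropy(p) - entropy(q)))
```

If H were not continuous, an arbitrarily small measurement error in the probabilities could jump the uncertainty measure by a large amount, which would make it useless as a measure of "choice".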
