My Math Forum  

My Math Forum > Science Forums > Computer Science

Computer Science Forum

March 26th, 2013, 11:04 PM   #1
Joined: Jan 2012

Posts: 12
Thanks: 0

Entropy H is continuous in p(i)

Here's something I came across in "A Mathematical Theory of Communication" by Shannon that I couldn't make sense of:

"Suppose we have a set of possible events whose probabilities of occurrence are p1, p2, ..., pn. These probabilities are known but that is all we know concerning which event will occur. Can we find a measure of how much “choice” is involved in the selection of the event or of how uncertain we are of the outcome? If there is such a measure, say H(p1, p2, ..., pn), it is reasonable to require of it the following properties:

1. H should be continuous in the p(i). "
H is the entropy.

What does he mean by "H should be continuous in the p(i)" [where p(i) denotes the probability of the i-th event]? I don't follow.
iVenky is offline  
March 27th, 2013, 05:16 AM   #2
Global Moderator
CRGreathouse's Avatar
Joined: Nov 2006
From: UTC -5

Posts: 16,046
Thanks: 931

Math Focus: Number theory, computational mathematics, combinatorics, FOM, symbolic logic, TCS, algorithms
Re: Entropy H is continuous in p(i)

If any p(i) changes by a very small amount, then H changes by only a small amount — that is, H is a continuous function of the vector (p1, ..., pn). You can use the usual epsilon-delta definition here if you prefer.
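To see this numerically, here is a small sketch (my own illustration, not from Shannon's paper): nudging one distribution slightly produces only a slight change in H. The `entropy` helper below is the standard definition H = -Σ p(i) log2 p(i), with the usual convention that terms with p(i) = 0 contribute 0.

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i).
    Terms with p_i == 0 contribute 0 (the limit of p*log p as p -> 0)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

p = [0.5, 0.5]
q = [0.5001, 0.4999]  # each p_i perturbed by 1e-4

# Continuity in practice: a tiny change in the p_i gives a tiny change in H.
print(entropy(p))                     # exactly 1.0 for a fair coin
print(abs(entropy(p) - entropy(q)))   # a very small number
```

Running this, the difference in H is far smaller than the perturbation in the probabilities, which is exactly what the continuity requirement demands.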
CRGreathouse is offline  





Copyright © 2017 My Math Forum. All rights reserved.