Taking the binary logarithm - error propagation? Hi, how do I calculate the error (in my case represented by the standard deviation) of a set of data after converting the values to their binary logarithm? For example, I have 10 numerical values for which I can calculate the standard deviation. After converting these 10 values to their binary logarithm, I can either calculate the new standard deviation directly from the 10 transformed values, as I did before, or I can apply the rules of Gaussian error propagation to obtain the new error (standard deviation). The formula for the latter is shown in the attachment. The two methods give different results. Which way of calculating the standard deviation is correct, and why?
Just to make sure I understand before looking at this in detail: you have 10 samples from an underlying Gaussian distribution, $X_n \sim N(\mu_X, \sigma_X)$. You can estimate $\sigma_X$ the usual way to obtain the sample standard deviation $\hat{\sigma}_X$. Now define a new random variable $Y = \log_2(X)$. You can transform the samples to obtain $Y_n = \log_2(X_n)$, and you want to determine how to calculate $\hat{\sigma}_Y$, presumably from the $Y_n$. Is this correct?
Yes, this is correct. This is what I want to do. Actually, I can think of two ways to do it, but I don't know which is the right one.
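For what it's worth, here is a small Python sketch of the two calculations on made-up numbers. The propagation formula assumed here is the usual first-order rule $\sigma_y \approx \sigma_x / (\bar{x} \ln 2)$ for $y = \log_2 x$, which may or may not match what is in the attachment:

```python
import numpy as np

# Hypothetical sample of 10 positive values (illustrative only).
x = np.array([3.1, 2.8, 3.5, 2.9, 3.3, 3.0, 3.6, 2.7, 3.2, 3.4])

# Method 1: transform first, then take the sample standard deviation
# of the transformed values.
y = np.log2(x)
sd_direct = y.std(ddof=1)

# Method 2: first-order Gaussian error propagation for y = log2(x):
# sigma_y ~ sigma_x / (x_bar * ln 2), evaluated at the sample mean.
sd_prop = x.std(ddof=1) / (x.mean() * np.log(2))

print(sd_direct, sd_prop)  # the two estimates generally differ
```

With data like this (small relative spread), the two numbers come out close but not identical, which matches what you are seeing.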
Since the underlying distribution is Gaussian, what do you plan to do with negative values, whose binary logarithm is undefined?
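To illustrate the domain problem: log2 is only defined for positive arguments, so any non-positive samples from a Gaussian cannot be transformed. A quick check with NumPy (example values made up):

```python
import numpy as np

# A Gaussian variable can produce non-positive values; log2 cannot
# handle them: negative input gives nan, zero gives -inf.
x = np.array([4.0, 0.5, -2.0, 0.0])
with np.errstate(invalid="ignore", divide="ignore"):
    y = np.log2(x)
print(y)  # 2.0, -1.0, then nan and -inf for the invalid inputs
```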
I found a portion of a PDF file that explains what you need to do pretty well. You still have the truncation problem.
I found this page, where it is recommended not to use error propagation rules at all if the individual values can be computed, i.e. if the replicated data can be transformed directly: https://books.google.de/books?id=aIq...arithm&f=false So it seems I don't have to bother with Gaussian error propagation here.
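That recommendation can be checked numerically: error propagation is a first-order (delta-method) approximation, so it agrees with the direct transformation when the relative spread of the data is small and drifts away when it is large. A quick simulation sketch (parameters chosen arbitrarily; samples are truncated to positive values to dodge the domain problem mentioned earlier):

```python
import numpy as np

rng = np.random.default_rng(0)

def compare(mu, sigma, n=100_000):
    # Draw a large Gaussian sample, keep only positive values, and
    # compare the direct SD of log2(x) with the first-order
    # propagation estimate sigma_x / (x_bar * ln 2).
    x = rng.normal(mu, sigma, n)
    x = x[x > 0]
    direct = np.log2(x).std(ddof=1)
    propagated = x.std(ddof=1) / (x.mean() * np.log(2))
    return direct, propagated

# Small relative spread (sigma/mu = 1%): the two estimates nearly agree.
print(compare(100.0, 1.0))
# Large relative spread (sigma/mu = 30%): they diverge noticeably.
print(compare(10.0, 3.0))
```

This is consistent with the book's advice: when you have the replicates, transform them directly; the propagation formula is only the linearized approximation of that.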
Copyright © 2019 My Math Forum. All rights reserved.