My Math Forum Why use variance instead of standard deviation?


 January 11th, 2018, 10:08 AM #1 Newbie   Joined: Jan 2018 From: Sweden Posts: 1

 Why use variance instead of standard deviation?

 This drives me crazy. The question is simple:

 A manufacturing process has an expected value of 200 minutes and a standard deviation of 10 minutes. The reset process has an expected value of 30 minutes with a standard deviation of 3 minutes. Compute the probability that the sum of the manufacturing and reset times is no more than 260 minutes.

 Using the central limit theorem, I thought it would be straightforward as usual: (260 - 210)/13. However, the right answer is (260 - 210)/sqrt(100 + 9). To me this is not logical. Can someone explain why I would want to take the square root of the combined variances when I already have the individual standard deviations given?

 Best regards, John
 January 11th, 2018, 02:45 PM #2 Global Moderator   Joined: May 2007 Posts: 6,643

 It comes from a theorem of elementary probability theory: the variance of a sum of independent random variables is the sum of the variances. So Var(X + Y) = 10^2 + 3^2 = 109, and the standard deviation of the sum is sqrt(109) ≈ 10.44. Adding the standard deviations (10 + 3 = 13) gives a wrong answer, because standard deviations of independent variables do not add; variances do.
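 The rule above is easy to check numerically. Here is a minimal sketch using Python's standard library, taking the standard deviations (10 and 3) and the combined mean (210) that appear in the thread's own formulas; the normal approximation is the same one the central limit theorem argument relies on:

```python
from math import sqrt
from statistics import NormalDist

# Standard deviations from the thread: manufacturing 10 min, reset 3 min.
sd_manufacturing = 10
sd_reset = 3

# For independent random variables, variances add -- standard deviations do not.
sd_sum = sqrt(sd_manufacturing**2 + sd_reset**2)   # sqrt(100 + 9), about 10.44
sd_naive = sd_manufacturing + sd_reset             # 13 -- the incorrect approach

# Normal approximation with the combined mean of 210 used in the thread's formula.
mean_sum = 210
p_correct = NormalDist(mean_sum, sd_sum).cdf(260)
p_naive = NormalDist(mean_sum, sd_naive).cdf(260)

print(f"sd of the sum: {sd_sum:.4f} (not {sd_naive})")
print(f"P(total <= 260), correct sd:      {p_correct:.6f}")
print(f"P(total <= 260), summed sds:      {p_naive:.6f}")
```

 Because sqrt(109) < 13, the correct z-score (50/10.44 ≈ 4.79) is larger than the naive one (50/13 ≈ 3.85), so adding standard deviations understates the probability here.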


