My Math Forum  

March 6th, 2019, 12:28 PM   #1
Member
 
Joined: Feb 2018
From: Canada

Posts: 46
Thanks: 2

Quadratic forms

Suppose that $X \sim N(\mu, \Lambda)$ with det$(\Lambda) > 0$. Show that $Q=(X-\mu)'\Lambda^{-1}(X-\mu)$ has a $\chi^2(n)$ distribution where $n$ is the dimension of $X$.
Here is my proof:
Since det$(\Lambda) > 0$ then $\Lambda$ is positive definite, $\Lambda=T\Sigma T'$ where $T$ is a real orthogonal matrix and
\[ \Sigma =
\begin{bmatrix}
\lambda_{1} &0 &\cdots &0 \\
0 &\lambda_{2} &\cdots &0 \\
\vdots &\vdots &\ddots &\vdots \\
0 &0 &\cdots &\lambda_{n}
\end{bmatrix}
\]
Then $\Lambda^{-1} = T\Sigma^{-1} T'$, and we have:
\[
\begin{aligned}
Q &= (X-\mu)'\Lambda^{-1}(X-\mu) \\
&= (X-\mu)'T\Sigma^{-1} T'(X-\mu) \\
&= Y'\Sigma Y
\end{aligned} \]
where $Y=T'(X-\mu)$. Further,
\[
\begin{aligned}
Q &= Y'\Sigma Y \\
&=
\begin{bmatrix}
Y_1 & Y_2 &\cdots &Y_n
\end{bmatrix}
\begin{bmatrix}
\dfrac{1}{\lambda_{1}} &0 &\cdots &0 \\
0 &\dfrac{1}{\lambda_{2}} &\cdots &0 \\
\vdots &\vdots &\ddots &\vdots \\
0 &0 &\cdots &\dfrac{1}{\lambda_{n}}
\end{bmatrix}
\begin{bmatrix}
Y_1 \\
Y_2 \\
\vdots \\
Y_n
\end{bmatrix} \\
&= \sum^{n}_{i=1} \dfrac{Y_i^2}{\lambda_i} = \sum^{n}_{i=1}\left(\dfrac{Y_i}{\sqrt{\lambda_{i}}}\right)^2
\end{aligned} \]
Since $Y_i \sim N(0, \lambda_{i})$ and the $Y_i$ are independent, each $Y_i/\sqrt{\lambda_{i}} \sim N(0,1)$, so
\[ Q= \sum^{n}_{i=1}\left(\dfrac{Y_i}{\sqrt{\lambda_{i}}}\right)^2 \sim \chi^2(n). \]
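As a quick numerical sanity check of this claim (a minimal simulation sketch, assuming NumPy and SciPy are available; the particular $\mu$ and $\Lambda$ below are arbitrary choices):

Code:
# Monte Carlo check: Q = (X - mu)' Lambda^{-1} (X - mu) should be chi^2(n)
# when X ~ N(mu, Lambda) and Lambda is positive definite.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 3
mu = np.array([1.0, -2.0, 0.5])

A = rng.normal(size=(n, n))
Lam = A @ A.T + np.eye(n)        # arbitrary positive definite covariance
Lam_inv = np.linalg.inv(Lam)

X = rng.multivariate_normal(mu, Lam, size=100_000)
D = X - mu
Q = np.einsum('ij,jk,ik->i', D, Lam_inv, D)   # quadratic form for each sample

# Kolmogorov-Smirnov test against chi^2(n): expect a large p-value.
print(stats.kstest(Q, 'chi2', args=(n,)))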
There are two other problems where I do not know where to start; can someone give me a hint?
Problem 1) Let $X_1, X_2, \cdots, X_n$ be i.i.d. random variables with corresponding order statistics $X_{(1)}, \cdots, X_{(n)}.$
a) Show that $P(X_1=X_{(k)}) = 1/n$ for all $k=1,\cdots,n$.
b) Compute $E(X_1|X_{(1)},X_{(2)}, \cdots, X_{(n)})$.
Problem 2) Suppose that $X_1, X_2, \cdots, X_n$ are i.i.d. Bernoulli(p) random variables. What is the moment generating function of $X_1+X_2^2+X_3^3+\cdots+X_n^{n}$? Then identify the distribution.
Thank you.

Last edited by Shanonhaliwell; March 6th, 2019 at 01:03 PM.
Shanonhaliwell is offline  
 
March 6th, 2019, 03:27 PM   #2
Senior Member
 
romsek's Avatar
 
Joined: Sep 2015
From: USA

Posts: 2,533
Thanks: 1390

1a) I believe the idea is that $P[X_1 = X_{(k)}] = \dfrac{(n-1)!}{n!} = \dfrac 1 n$

Think about the ways that the $X_k$ might be sorted.

b) You're given all the order statistics, and (a) shows that $X_1$ is equally likely to be each of them. So what's the expectation?
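A minimal simulation sketch of part (a) (assuming NumPy; the exponential distribution is just an arbitrary continuous choice so ties have probability zero):

Code:
# Estimate P(X_1 = X_(k)): record the rank of X_1 within each i.i.d. sample;
# every rank should show up with frequency roughly 1/n.
import numpy as np

rng = np.random.default_rng(1)
n, trials = 5, 200_000

X = rng.exponential(size=(trials, n))       # i.i.d. continuous draws
rank_of_X1 = (X < X[:, [0]]).sum(axis=1)    # 0-based rank of X_1, i.e. k - 1

freq = np.bincount(rank_of_X1, minlength=n) / trials
print(freq)                                 # each entry should be close to 1/n = 0.2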
Thanks from Shanonhaliwell
romsek is offline  
March 6th, 2019, 06:41 PM   #3
Member
 
Joined: Feb 2018
From: Canada

Posts: 46
Thanks: 2

Thank you. I think the expectation is:
\[ E(X_1|X_{(1)},X_{(2)},\cdots,X_{(n)})=\dfrac{1}{n} \sum_{k=1}^{n}X_{(k)} \]
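One way to justify this, using the symmetry from part (a) (conditionally on the order statistics, $X_1$ is equally likely to be each of them):
\[
E\bigl(X_1 \mid X_{(1)},\cdots,X_{(n)}\bigr)
= \sum_{k=1}^{n} X_{(k)}\, P\bigl(X_1 = X_{(k)} \mid X_{(1)},\cdots,X_{(n)}\bigr)
= \dfrac{1}{n}\sum_{k=1}^{n} X_{(k)}.
\]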
How about problem 2? Any ideas?
Shanonhaliwell is offline  
March 6th, 2019, 07:55 PM   #4
SDK
Senior Member
 
Joined: Sep 2016
From: USA

Posts: 635
Thanks: 401

Math Focus: Dynamical systems, analytic function theory, numerics
Quote:
Originally Posted by Shanonhaliwell View Post
Since det$(\Lambda) > 0$ then $\Lambda$ is positive definite
The rest of your proof seems fine assuming this is true. However, I don't see why this follows. Is there some additional structure on $\Lambda$ which you haven't mentioned? In general, if a matrix has positive determinant it doesn't mean all of its eigenvalues are positive. It just means it must have an even number of negative eigenvalues.

If this is indeed a mistake, I think this approach can be fixed by using the singular value decomposition for $\Lambda$ in place of its eigenvalue decomposition in your current proof.
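For a concrete instance of this point (a small sketch, assuming NumPy; the matrix is an arbitrary example):

Code:
# A symmetric matrix with positive determinant that is not positive definite:
# it has an even number (two) of negative eigenvalues.
import numpy as np

M = np.diag([-1.0, -2.0, 3.0])
print(np.linalg.det(M))       # 6.0 > 0
print(np.linalg.eigvalsh(M))  # [-2. -1.  3.] -- not all positive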
Thanks from Shanonhaliwell
SDK is offline  
March 6th, 2019, 09:41 PM   #5
Senior Member
 
romsek's Avatar
 
Joined: Sep 2015
From: USA

Posts: 2,533
Thanks: 1390

Quote:
Originally Posted by SDK View Post
The rest of your proof seems fine assuming this is true. However, I don't see why this follows. Is there some additional structure on $\Lambda$ which you haven't mentioned? In general, if a matrix has positive determinant it doesn't mean all of its eigenvalues are positive. It just means it must have an even number of negative eigenvalues.

If this is indeed a mistake, I think this approach can be fixed by using the singular value decomposition for $\Lambda$ in place of its eigenvalue decomposition in your current proof.
$\Lambda$ is a covariance matrix and thus positive semi-definite.

Since $|\Lambda| > 0$ it's positive definite.
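Spelled out a bit more (one way to see it): positive semi-definiteness gives eigenvalues $\lambda_i \ge 0$, and
\[
\det(\Lambda) = \prod_{i=1}^{n} \lambda_i > 0
\]
rules out any $\lambda_i = 0$, so every $\lambda_i > 0$ and $\Lambda$ is positive definite.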
Thanks from SDK and Shanonhaliwell
romsek is offline  
March 7th, 2019, 03:00 AM   #6
SDK
Senior Member
 
Joined: Sep 2016
From: USA

Posts: 635
Thanks: 401

Math Focus: Dynamical systems, analytic function theory, numerics
Quote:
Originally Posted by romsek View Post
$\Lambda$ is a covariance matrix and thus positive semi-definite.

Since $|\Lambda| > 0$ it's positive definite.
This clears it up, thanks.
Thanks from Shanonhaliwell
SDK is offline  
March 7th, 2019, 08:52 AM   #7
Member
 
Joined: Feb 2018
From: Canada

Posts: 46
Thanks: 2

There is a mistake in my proof, so I hope someone can fix it for me or let me edit my post. It should be:
\[ \begin{aligned}
Q &= (X-\mu)'\Lambda^{-1}(X-\mu) \\
&= (X-\mu)'T\Sigma^{-1} T'(X-\mu) \\
&= Y'\Sigma^{-1} Y
\end{aligned} \]
so
\[ \begin{aligned}
Q &= Y'\Sigma^{-1} Y \\
&=
\begin{bmatrix}
Y_1 & Y_2 &\cdots &Y_n
\end{bmatrix}
\begin{bmatrix}
\dfrac{1}{\lambda_{1}} &0 &\cdots &0 \\
0 &\dfrac{1}{\lambda_{2}} &\cdots &0 \\
\vdots &\vdots &\ddots &\vdots \\
0 &0 &\cdots &\dfrac{1}{\lambda_{n}}
\end{bmatrix}
\begin{bmatrix}
Y_1 \\
Y_2 \\
\vdots \\
Y_n
\end{bmatrix} \\
&= \sum^{n}_{i=1} \dfrac{Y_i^2}{\lambda_i} = \sum^{n}_{i=1}\left(\dfrac{Y_i}{\sqrt{\lambda_{i}}}\right)^2
\end{aligned} \]
For the second problem, since $X_1, X_2, \cdots, X_n$ are i.i.d. Bernoulli(p) random variables, we have $X_2^2=X_2,X_3^3=X_3,\cdots,X_n^{n}=X_n$, and therefore $\sum^{n}_{i=1}X_{i}^{i}=\sum_{i=1}^{n}X_i \sim \text{Binomial}(n,p)$.
Hence the moment generating function of $X_1+X_2^2+X_3^3+\cdots+X_n^{n}$ is the moment generating function of the Binomial$(n,p)$ distribution, which is $(1-p+pe^t)^n$.
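As a sanity check of this identification (a minimal simulation sketch, assuming NumPy; $n$, $p$ and the values of $t$ below are arbitrary):

Code:
# Compare the empirical MGF of S = X_1 + X_2^2 + ... + X_n^n, with X_i i.i.d.
# Bernoulli(p), against the Binomial(n, p) MGF (1 - p + p e^t)^n.
import numpy as np

rng = np.random.default_rng(2)
n, p, trials = 6, 0.3, 200_000

X = rng.binomial(1, p, size=(trials, n))    # Bernoulli(p) draws
powers = np.arange(1, n + 1)
S = (X ** powers).sum(axis=1)               # same as X.sum(axis=1) since X is 0/1

for t in (0.5, 1.0):
    print(t, np.exp(t * S).mean(), (1 - p + p * np.exp(t)) ** n)  # should nearly agree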
Thank you @romsek and @SDK for your help.

Last edited by Shanonhaliwell; March 7th, 2019 at 09:24 AM.
Shanonhaliwell is offline  