[1] Convergence in probability

Definition: A sequence of random variables $X_{1}, X_{2},...$ converges in probability to a random variable $X$ if for every $\epsilon > 0$, $\lim_{n\rightarrow \infty} P(|X_{n}-X| > \epsilon )=0 \Leftrightarrow \lim_{n\rightarrow \infty} P(|X_{n}-X| \leq \epsilon )=1$

Think about a sequence of random variables. This sequence asymptotically approaches a certain random variable $X$ (or a constant). How do we make that precise? Fix an $\epsilon$-band around $X$: convergence in probability says the chance that $X_{n}$ falls outside this band shrinks to zero as $n$ grows. It does not say that $X_{n}$ ever actually equals $X$.
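For intuition, here is a minimal simulation sketch (assumptions: NumPy, with the sample mean of i.i.d. Uniform(0,1) draws playing the role of $X_{n}$ and the constant $\mu=0.5$ playing the role of $X$, as in the weak law of large numbers):

```python
import numpy as np

# Sketch: the sample mean of n i.i.d. Uniform(0,1) draws converges in
# probability to mu = 0.5 (weak law of large numbers).
rng = np.random.default_rng(0)
mu, eps, reps = 0.5, 0.05, 2_000

for n in [10, 100, 1_000, 4_000]:
    # Monte Carlo estimate of P(|Xbar_n - mu| > eps) over many replications.
    xbar = rng.uniform(0, 1, size=(reps, n)).mean(axis=1)
    print(n, np.mean(np.abs(xbar - mu) > eps))  # shrinks toward 0 as n grows
```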

 


[2] Almost sure convergence

 

Definition: A sequence of random variables $X_{1}, X_{2},...$ converges almost surely to a random variable $X$ if for every $\epsilon>0$, $P(\lim_{n\rightarrow \infty}|X_{n}-X| \leq \epsilon )=1$ (equivalently, $P(\lim_{n\rightarrow \infty} X_{n} = X)=1$).

Here, with probability one, the realized sequence really does converge to $X$. We cannot say in advance when this happens, but along almost every sample path there is some (path-dependent) index after which the sequence stays within any $\epsilon$-band of $X$.
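To contrast with convergence in probability, this sketch (same assumptions: NumPy and Uniform(0,1) draws) follows a single sample path of the running mean and records the last time it leaves a fixed $\epsilon$-band; the strong law of large numbers says each path eventually stays inside:

```python
import numpy as np

# Sketch: follow ONE sample path of the running mean of Uniform(0,1)
# draws. By the strong law, this path converges to mu = 0.5 almost
# surely: it eventually enters the eps-band around mu and never leaves.
rng = np.random.default_rng(1)
mu, eps, N = 0.5, 0.02, 100_000

x = rng.uniform(0, 1, size=N)
running_mean = np.cumsum(x) / np.arange(1, N + 1)
outside = np.flatnonzero(np.abs(running_mean - mu) > eps)

# The (random) index after which the path stays inside the band depends
# on the sample path; we only observe it after the fact.
print("last exit index:", outside[-1] + 1 if outside.size else "never")
```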

 

 

[3] Convergence in distribution

Definition: A sequence of random variables $X_{1}, X_{2},...$ converges in distribution to a random variable $X$ if $\lim_{n\rightarrow \infty}P(X_{n} \leq x)= P(X\leq x)=F_{X}(x)$ at all points $x$ where $F_{X}(x)$ is continuous.
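As one concrete illustration (a sketch, not from the original notes): if $U_{1},...,U_{n}$ are i.i.d. Uniform(0,1), then $X_{n}=n(1-\max_{i}U_{i})$ converges in distribution to Exponential(1), whose CDF is $F_{X}(x)=1-e^{-x}$. The code below (assuming NumPy) compares the empirical CDF of $X_{n}$ with this limit:

```python
import numpy as np

# Sketch: X_n = n * (1 - max of n Uniform(0,1) draws) converges in
# distribution to Exponential(1), whose CDF is F(x) = 1 - exp(-x).
rng = np.random.default_rng(2)
n, reps = 500, 20_000

u_max = rng.uniform(0, 1, size=(reps, n)).max(axis=1)
xn = n * (1.0 - u_max)

for x in [0.5, 1.0, 2.0]:
    empirical = np.mean(xn <= x)   # empirical CDF of X_n at x
    limit = 1.0 - np.exp(-x)       # Exponential(1) CDF at x
    print(f"x={x}: empirical={empirical:.4f}, limit={limit:.4f}")
```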

 

 

* Central Limit Theorem 

Definition: Suppose that $X_{1}, X_{2},..., X_{n}$ are i.i.d. random variables with mean $\mu$ and finite variance $\sigma^2$. Then, $Z_{n}=\frac{\sqrt{n}(\bar{X}_{n}-\mu)}{\sigma}\rightarrow Z\sim N(0,1)$ (in distribution).

Remark) The hypotheses of the CLT differ from those of the other modes of convergence: it requires $n$ i.i.d. random variables with finite mean and variance.
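A minimal sketch of the CLT (assumptions: NumPy, Exponential(1) draws so $\mu=\sigma=1$, and the standard normal CDF $\Phi$ computed via the error function):

```python
import numpy as np
from math import erf, sqrt

# Sketch: standardize the mean of n i.i.d. Exponential(1) draws
# (mu = sigma = 1) and compare P(Z_n <= z) with Phi(z).
rng = np.random.default_rng(3)
n, reps, mu, sigma = 200, 20_000, 1.0, 1.0

xbar = rng.exponential(1.0, size=(reps, n)).mean(axis=1)
zn = sqrt(n) * (xbar - mu) / sigma

phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF
for z in [-1.0, 0.0, 1.0]:
    print(f"z={z}: empirical={np.mean(zn <= z):.4f}, Phi(z)={phi(z):.4f}")
```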

Variance stabilizing transformations

When $g$ is differentiable at $\mu$, we have $\sqrt{n}(g(\bar{X}_{n})-g(\mu))\rightarrow g'(\mu)\,N(0, \sigma^2)$ in distribution, i.e. the limit is $N(0, [g'(\mu)]^{2}\sigma^{2})$ (the delta method).
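As an illustration of this statement (a sketch under assumed choices: NumPy, Uniform(0,1) data so $\mu=0.5$ and $\sigma^{2}=1/12$, and the hypothetical transformation $g(x)=x^{2}$ with $g'(\mu)=2\mu=1$), the sample variance of $\sqrt{n}(g(\bar{X}_{n})-g(\mu))$ should be close to $[g'(\mu)]^{2}\sigma^{2}$:

```python
import numpy as np

# Sketch of the delta method with g(x) = x**2 on Uniform(0,1) data:
# mu = 0.5, sigma^2 = 1/12, g'(mu) = 2*mu = 1, so the limiting
# variance of sqrt(n) * (g(Xbar_n) - g(mu)) is (g'(mu))^2 * sigma^2.
rng = np.random.default_rng(4)
n, reps = 500, 10_000
mu, sigma2 = 0.5, 1.0 / 12.0

xbar = rng.uniform(0, 1, size=(reps, n)).mean(axis=1)
t = np.sqrt(n) * (xbar**2 - mu**2)  # sqrt(n) * (g(Xbar_n) - g(mu))

print("sample variance:   ", t.var())                     # empirical
print("delta-method limit:", (2 * mu) ** 2 * sigma2)      # theoretical
```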

 

 


I'm still a student learning this myself, so if there's anything that needs correcting, please leave a comment :D

 
