Function of a random variable
-
Consider operations on a random variable X
- these operations shift/scale X
- assume X has mean μ and variance σ²
- let c be a constant used in these operations
-
Addition and subtraction
- we define the new random variable by adding c: Y = X + c
- the mean and variance of Y become
- the mean shifts by c: E(Y) = E(X) + c = μ + c
- the variance does not change: V(Y) = V(X) = σ²
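The shift rule can be checked numerically. The sketch below uses a hypothetical NumPy simulation (the distribution, sample size, and value of c are illustrative assumptions, not from the notes):

```python
import numpy as np

# Sample standing in for X, with mean mu = 5 and sd sigma = 2 (illustrative)
rng = np.random.default_rng(seed=0)
x = rng.normal(loc=5.0, scale=2.0, size=100_000)

c = 3.0
y = x + c  # Y = X + c

# The sample mean shifts by c; the sample variance is unchanged
print(y.mean() - x.mean())  # close to 3.0
print(y.var() - x.var())    # close to 0.0
```

Because every data point moves by the same amount, the spread around the mean is untouched, which is why the variance stays at σ².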
-
Multiplication and division
- we define the new random variable as Y = cX
- the mean and variance of Y become
- the mean is scaled by c: E(Y) = E(cX) = cμ
- the variance is scaled by c²: V(Y) = V(cX) = c²σ²
- note that the variance scales by the square of c, not by c itself
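The scaling rule can be verified the same way; this is a minimal NumPy sketch with illustrative, assumed parameters:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
x = rng.normal(loc=5.0, scale=2.0, size=100_000)

c = 3.0
y = c * x  # Y = cX

# Mean scales by c, variance by c^2 (exact identities for any fixed sample)
print(y.mean() / x.mean())  # close to 3.0
print(y.var() / x.var())    # close to 9.0
```
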
Linear Functions of random variables
-
Different cases when the random variables are not independent
- consider X1 and X2 that are not independent
- Y = X1 + X2
- since the mean is additive we know that
- E(Y) = E(X1 + X2) = E(X1) + E(X2) = μ1 + μ2
- the variance, however, involves squared terms, starting from
- V(Y) = E(Y²) − (E(Y))²
- expanding Y² = (X1 + X2)² and using E(X1 + X2) = μ1 + μ2 gives
- V(Y) = V(X1) + V(X2) + 2Cov(X1, X2) = σ1² + σ2² + 2Cov(X1, X2)
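The dependent-sum case can be checked numerically. In this sketch X2 is deliberately constructed from X1 so the covariance is nonzero (the construction is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(seed=2)
x1 = rng.normal(size=10_000)
x2 = 0.8 * x1 + rng.normal(size=10_000)  # X2 depends on X1

y = x1 + x2
cov = np.cov(x1, x2, bias=True)[0, 1]  # population covariance (ddof = 0)

# V(Y) = V(X1) + V(X2) + 2 Cov(X1, X2): both lines agree
print(y.var())
print(x1.var() + x2.var() + 2 * cov)
```

If X1 and X2 were independent the covariance term would vanish and the variances would simply add.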
General Linear function for Dependent random variables
-
Generally we have n random variables X1, …, Xn
- X can be viewed as a column vector whose entries are the Xi
- together these hold the data for all of the variables
- each Xi has its own mean μi and variance σi²
-
Consider n + 1 constants c0, c1, …, cn
- Y = c0 + c1X1 + c2X2 + … + cnXn
- the mean is additive, so as before each μi is scaled by its ci: E(Y) = c0 + c1μ1 + … + cnμn
- however, for the variance we must also account for the covariances Cov(Xi, Xj)
- for each pair with i < j
- the general variance becomes
- V(Y) = σY² = ∑_{i=1}^{n} ci²σi² + 2 ∑_{i=1}^{n} ∑_{j>i} ci cj Cov(Xi, Xj)
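The general variance formula can be verified against a sample. This sketch builds three correlated variables via a random linear mix (the mix, the constants, and the sample size are illustrative assumptions), then compares the double-sum formula to the directly computed variance of Y:

```python
import numpy as np

rng = np.random.default_rng(seed=3)
n = 3
# Rows are observations of (X1, X2, X3); mixing makes them correlated
z = rng.normal(size=(10_000, n))
x = z @ rng.normal(size=(n, n))

c0 = 1.0
c = np.array([2.0, -1.0, 0.5])  # c1..cn
y = c0 + x @ c                  # Y = c0 + sum_i ci * Xi

sigma = np.cov(x, rowvar=False, bias=True)  # population covariance matrix

# sum ci^2 sigma_i^2 + 2 * sum_{i<j} ci cj Cov(Xi, Xj)
var_formula = sum(c[i] ** 2 * sigma[i, i] for i in range(n)) \
    + 2 * sum(c[i] * c[j] * sigma[i, j]
              for i in range(n) for j in range(i + 1, n))

print(y.var())
print(var_formula)  # matches y.var()
```

Note that the constant c0 shifts the mean but contributes nothing to the variance, consistent with the addition rule above.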