The probability density function (PDF) of a continuous random variable X is a function:
The function fX(x) maps values x from the sample space to the real numbers.
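For example (a hypothetical density, not one used later in this section): fX(x) = 3·x² on [0, 1] is a valid PDF, since it is non-negative and integrates to one. A minimal Maxima check::

    /* verify that f(x) = 3*x^2 is a valid PDF on [0, 1] */
    integrate(3*x^2, x, 0, 1);        /* total probability: 1 */
    integrate(3*x^2, x, 1/5, 1/2);    /* P(1/5 <= X <= 1/2) = 117/1000 */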
For a continuous random variable:
The expectation of a continuous random variable is:
Properties:
The variance of a continuous random variable is:
Properties:
The cumulative distribution function (CDF) of a random variable X is:
So:
and FX(a) ≤ FX(b) for a ≤ b.
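Continuing the hypothetical fX(x) = 3·x² example from above, the CDF is obtained by integrating the PDF up to a point; the result a³ is non-decreasing on [0, 1], as the property above requires::

    /* CDF of the hypothetical PDF f(x) = 3*x^2 on [0, 1] */
    assume(a > 0, a < 1)$
    integrate(3*x^2, x, 0, a);   /* F(a) = a^3 */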
Relation between CDF and PDF:
The conditional probability of a continuous random variable is:
The conditional expectation of a continuous random variable is:
Properties:
A continuous uniform random variable has a PDF fX(x) that is non-zero only on [a, b], with fX(x) = 1 ⁄ (b − a).
Proofs:
Note
In Maxima::

    (%i4) factor((b^2+b*a+a^2)/3 - (a+b)^2/4);
                                         2
                                  (b - a)
    (%o4)                         --------
                                     12
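The mean and variance can also be computed directly from the density (a sketch; assumes a < b)::

    /* mean and variance of the uniform PDF 1/(b-a) on [a, b] */
    assume(b > a)$
    m: ratsimp(integrate(x/(b-a), x, a, b));         /* (b + a)/2 */
    factor(integrate(x^2/(b-a), x, a, b) - m^2);     /* (b - a)^2/12 */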
An exponential random variable with parameter λ is defined by the PDF:
for x ≥ 0, and zero otherwise.
Properties:
Proof:
Note
From Maxima::

    (%i15) assume(lambda>0);
    (%o15)                        [lambda > 0]
    (%i16) integrate(lambda*%e^(-lambda*x),x,0,inf);
    (%o16)                              1
    (%i17) integrate(x*lambda*%e^(-lambda*x),x,0,inf);
                                        1
    (%o17)                           ------
                                     lambda
    (%i18) integrate(x^2*lambda*%e^(-lambda*x),x,0,inf);
                                        2
    (%o18)                           -------
                                           2
                                     lambda
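The variance then follows from these integrals as E[X²] − E[X]² (a one-line continuation of the session above)::

    /* variance of the exponential distribution */
    ratsimp(2/lambda^2 - (1/lambda)^2);   /* 1/lambda^2 */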
A normal random variable with parameters μ, σ (with σ > 0) is defined by the PDF:
Properties:
If Z = X + Y and X and Y are independent normal r.v., then:
Proof:
If Y = a·X + b with a ≠ 0, then fY(y) = 1 ⁄ |a|·fX((y − b) ⁄ a).
Proof, for a > 0:
so:
For a < 0:
Combining the expressions for a ≠ 0 gives the result.
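As a concrete check of the formula (hypothetical numbers): for X uniform on [0, 1] and Y = 2·X + 3, fY(y) = 1 ⁄ 2·fX((y − 3) ⁄ 2), i.e. a constant 1 ⁄ 2 on [3, 5], which indeed integrates to one::

    /* f_Y for Y = 2*X + 3 with X uniform on [0, 1] */
    integrate(1/2, y, 3, 5);   /* 1 */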
If X has a uniform distribution with parameters c, d, then a·X + b also has a uniform distribution, with parameters a·c + b, a·d + b (for a > 0).
If X has an exponential distribution with parameter λ, then a·X also has an exponential distribution, with parameter λ ⁄ a, for a > 0 (see the Maxima check after this list).
If X has a normal distribution with parameters μ, σ², then a·X + b also has a normal distribution, with parameters a·μ + b, (a·σ)².
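A symbolic check of the exponential claim (a sketch; assumes a > 0 and λ > 0): the transformed density (1 ⁄ a)·fX(y ⁄ a) integrates to one and has mean a ⁄ λ, as an exp(λ ⁄ a) variable should::

    /* g(y) = (1/a) * f_X(y/a) for f_X(x) = lambda*%e^(-lambda*x) */
    assume(a > 0, lambda > 0)$
    g(y) := (lambda/a) * %e^(-lambda*y/a)$
    integrate(g(y), y, 0, inf);     /* 1 */
    integrate(y*g(y), y, 0, inf);   /* a/lambda */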
Proofs:
When X ∼ exp(λ) and Y = a·X, then:
When X ∼ norm(μ, σ²) and Y = a·X + b, then:
Let Y = g(X), where g is a monotonic function on the interval [a, b]. Then there is an inverse function h(Y) = X on the interval [g(a), g(b)] (if g is increasing) or on [g(b), g(a)] (if g is decreasing). In that case:
Proof. Let g be a monotonically increasing function. Thus:
and so:
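For example (a hypothetical case): X uniform on [0, 1] and Y = g(X) = X², so h(y) = √y and fY(y) = fX(h(y))·|h′(y)| = 1 ⁄ (2·√y) on [0, 1]::

    /* f_Y(y) = f_X(h(y)) * |h'(y)| with h(y) = sqrt(y), f_X = 1 on [0, 1] */
    h(y) := sqrt(y)$
    fY: diff(h(y), y);        /* 1/(2*sqrt(y)) */
    integrate(fY, y, 0, 1);   /* 1 */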
If Z = X + Y and X and Y are independent r.v., then:
Proof:
Consider Z conditioned on the event X = x:
Because of the independence of X and Y:
The joint PDF of X and Z is:
Integrating over x, we get:
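For example (a hypothetical case): for independent X, Y ∼ exp(λ), the convolution gives fZ(z) = λ²·z·e^(−λ·z), the Erlang-2 density::

    /* f_Z(z) = integral of f_X(x)*f_Y(z-x) over 0 <= x <= z */
    assume(lambda > 0, z > 0)$
    integrate(lambda*%e^(-lambda*x) * lambda*%e^(-lambda*(z-x)), x, 0, z);
    /* lambda^2 * z * %e^(-lambda*z) */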
The covariance of two random variables is:
Properties:
The covariance of two independent r.v. is zero.
Proofs:
For independent r.v. X and Y:
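A concrete check (hypothetical case): for independent X, Y uniform on [0, 1], E[X·Y] = E[X]·E[Y] = 1 ⁄ 4, so the covariance is zero::

    /* cov(X, Y) = E[X*Y] - E[X]*E[Y] for the joint PDF f(x, y) = 1 on the unit square */
    exy: integrate(integrate(x*y, x, 0, 1), y, 0, 1)$        /* 1/4 */
    exy - integrate(x, x, 0, 1) * integrate(y, y, 0, 1);     /* 0 */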
A dimensionless version of the covariance:
It is defined only when σX ≠ 0 and σY ≠ 0.
Obviously −1 ≤ ρ(X, Y) ≤ +1, and ρ(X, X) = 1.
For independent r.v., ρ(X, Y) = 0.
If |ρ(X, Y)| = 1, then X and Y are linearly dependent: Y = a·X + b, with a > 0 when ρ = 1 and a < 0 when ρ = −1.
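For example (a hypothetical case): for X uniform on [0, 1] and Y = 2·X + 1, the correlation is exactly +1, matching the linear-dependence property above::

    /* rho(X, Y) = cov(X, Y)/(sigma_X*sigma_Y) for Y = 2*X + 1, X uniform on [0, 1] */
    ex: integrate(x, x, 0, 1)$                             /* E[X] = 1/2 */
    vx: integrate(x^2, x, 0, 1) - ex^2$                    /* var(X) = 1/12 */
    cv: integrate(x*(2*x + 1), x, 0, 1) - ex*(2*ex + 1)$   /* cov = 2*var(X) = 1/6 */
    cv / (sqrt(vx) * sqrt(4*vx));                          /* 1 */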
Properties:
Proof:
.. math::

    E[E[X|Y]] = \int_Y f_Y(y) \int_X x\, f_{X|Y}(x|y)\, dx\, dy

    = \int_Y \int_X x\, f_Y(y)\, f_{X|Y}(x|y)\, dx\, dy
    = \int_Y \int_X x\, f_{X,Y}(x,y)\, dx\, dy

    = \int_X x \int_Y f_{X,Y}(x,y)\, dy\, dx = \int_X x\, f_X(x)\, dx = E[X]
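A concrete check of this property (a hypothetical joint density): for f(x, y) = x + y on the unit square, E[X] computed directly and via E[E[X|Y]] both give 7 ⁄ 12::

    /* E[X] directly from the joint density f(x, y) = x + y on the unit square */
    integrate(integrate(x*(x + y), x, 0, 1), y, 0, 1);   /* 7/12 */
    /* E[E[X|Y]]: average E[X|Y=y] over the marginal f_Y(y) */
    fy: integrate(x + y, x, 0, 1)$                       /* f_Y(y) = y + 1/2 */
    exy: integrate(x*(x + y), x, 0, 1) / fy$             /* E[X|Y=y] */
    integrate(ratsimp(exy * fy), y, 0, 1);               /* 7/12 */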
Proof:
For X ∼ norm(μX, σX²) and Y ∼ norm(μY, σY²), the random variable X + Y also has a normal distribution, with parameters:
https://en.wikipedia.org/wiki/Sum_of_normally_distributed_random_variables
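The convolution can also be checked symbolically in Maxima (a sketch; the result may need ratsimp to be recognized as the normal density with mean μX + μY and variance σX² + σY²)::

    /* convolution of two normal densities N(m1, s1^2) and N(m2, s2^2) */
    assume(s1 > 0, s2 > 0)$
    f(x, m, s) := %e^(-(x - m)^2/(2*s^2)) / (s*sqrt(2*%pi))$
    ratsimp(integrate(f(x, m1, s1) * f(z - x, m2, s2), x, -inf, inf));
    /* expected: the N(m1 + m2, s1^2 + s2^2) density in z */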
Let Y = X1 + ... + XN be a sum of a random number N of random variables, where all Xi are independent and identically distributed. Thus:
Proofs: