Probabilistic models are independent when they do not interact with each other and have no common sources of uncertainty.
$$
\begin{aligned}
P(A \cap B) &= P(A)\,P(B) && \text{iff $A$ and $B$ are independent} \\
p_{X\mid A}(x) &= p_X(x) \ \text{for all } x && \text{iff $X$ and $A$ are independent} \\
p_{X,Y}(x,y) &= p_X(x)\,p_Y(y) \ \text{for all } x,y && \text{iff $X$ and $Y$ are independent} \\
p_{X,Y,Z}(x,y,z) &= p_X(x)\,p_Y(y)\,p_Z(z) \ \text{for all } x,y,z && \text{iff $X$, $Y$ and $Z$ are independent}
\end{aligned}
$$
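The factorization test is mechanical to check on a small discrete example. A minimal sketch (my own example, not from the lecture) using two fair, unlinked coin flips, where the joint PMF should factor into the marginals at every point:

```python
from itertools import product

# Hypothetical joint PMF of two fair, unlinked coin flips X and Y in {0, 1}.
p_joint = {(x, y): 0.25 for x, y in product([0, 1], repeat=2)}

# Marginals recovered by summing out the other variable.
p_x = {x: sum(p_joint[(x, y)] for y in [0, 1]) for x in [0, 1]}
p_y = {y: sum(p_joint[(x, y)] for x in [0, 1]) for y in [0, 1]}

# X and Y are independent iff the joint factors at every (x, y).
independent = all(
    abs(p_joint[(x, y)] - p_x[x] * p_y[y]) < 1e-12
    for x, y in p_joint
)
print(independent)  # True for two fair, unlinked coins
```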
Note it's always true that
$$f_{X,Y}(x,y) = f_{X\mid Y}(x\mid y)\,f_Y(y) \qquad \text{by conditional probability}$$
But
$$f_{X\mid Y}(x\mid y)\,f_Y(y) = f_X(x)\,f_Y(y) \ \text{for all } x,y \qquad \text{iff $X$ and $Y$ are independent}$$
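The "iff" cuts both ways: under dependence, the conditional differs from the marginal. A small discrete sketch (a hypothetical perfectly linked pair, with X always equal to Y) makes the failure visible:

```python
# Hypothetical dependent pair: X always equals Y (two linked fair coins).
p_joint = {(0, 0): 0.5, (1, 1): 0.5, (0, 1): 0.0, (1, 0): 0.0}

p_x = {x: sum(p_joint[(x, y)] for y in [0, 1]) for x in [0, 1]}
p_y = {y: sum(p_joint[(x, y)] for x in [0, 1]) for y in [0, 1]}

# Conditional PMF p_{X|Y}(x|y) = p_{X,Y}(x,y) / p_Y(y).
p_x_given_y = {(x, y): p_joint[(x, y)] / p_y[y]
               for x in [0, 1] for y in [0, 1]}

print(p_x_given_y[(1, 1)])  # 1.0: knowing Y = 1 pins down X
print(p_x[1])               # 0.5: the unconditional marginal
# The two disagree, so X and Y are dependent.
```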
Expectation
In general,
$$E[g(X,Y)] \ne g(E[X], E[Y]) \qquad \text{e.g. } E[XY] \ne E[X]\,E[Y]$$
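The linked-coins pair gives a concrete counterexample (my own numbers, not from the lecture): if X = Y takes 0 or 1 with probability ½ each, then E[XY] = E[X²] = ½, while E[X]E[Y] = ¼.

```python
# X = Y, each 0 or 1 with probability 1/2 (a dependent pair).
outcomes = [(0, 0, 0.5), (1, 1, 0.5)]  # (x, y, probability)

e_x = sum(p * x for x, y, p in outcomes)       # 0.5
e_y = sum(p * y for x, y, p in outcomes)       # 0.5
e_xy = sum(p * x * y for x, y, p in outcomes)  # 0.5, since XY = X^2 = X here

print(e_xy)       # 0.5
print(e_x * e_y)  # 0.25 — not equal, so E[XY] != E[X]E[Y] in general
```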
It's however always true that
$$E[aX + b] = a\,E[X] + b \qquad \text{(linearity of expectation)}$$
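Linearity holds with no independence assumption at all. A quick check on a fair die (arbitrary values of a and b, chosen for illustration):

```python
# PMF of a fair die; check E[aX + b] = a E[X] + b directly.
pmf = {x: 1 / 6 for x in range(1, 7)}
a, b = 3.0, 2.0

e_x = sum(p * x for x, p in pmf.items())            # 3.5
lhs = sum(p * (a * x + b) for x, p in pmf.items())  # expectation of aX + b
rhs = a * e_x + b

print(abs(lhs - rhs) < 1e-12)  # True: linearity needs no independence
```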
But if $X$ and $Y$ are independent, then
$$E[XY] = E[X]\,E[Y] \quad \text{and} \quad E[g(X)\,h(Y)] = E[g(X)]\,E[h(Y)]$$
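A numeric sketch with two independent fair dice (the joint PMF factors by construction, so the product rule should hold exactly):

```python
from itertools import product

# Two independent fair dice: joint PMF built as the product of marginals.
pmf = {x: 1 / 6 for x in range(1, 7)}
joint = {(x, y): pmf[x] * pmf[y] for x, y in product(pmf, repeat=2)}

e_x = sum(pmf[x] * x for x in pmf)  # 3.5
e_xy = sum(p * x * y for (x, y), p in joint.items())

print(abs(e_xy - e_x * e_x) < 1e-9)  # True: E[XY] = E[X]E[Y] under independence
```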
Variance
In general,

$$\operatorname{var}(X + Y) \ne \operatorname{var}(X) + \operatorname{var}(Y)$$

It's however always true that
$$\operatorname{var}(aX) = a^2\operatorname{var}(X) \quad \text{and} \quad \operatorname{var}(X + a) = \operatorname{var}(X)$$
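Both identities can be checked directly against a fair die's PMF (var(X) = 35/12 for a fair die); a minimal sketch:

```python
# Fair die X: check var(aX) = a^2 var(X) and var(X + a) = var(X).
pmf = {x: 1 / 6 for x in range(1, 7)}

def var(g):
    """Variance of g(X) under the die's PMF."""
    mean = sum(p * g(x) for x, p in pmf.items())
    return sum(p * (g(x) - mean) ** 2 for x, p in pmf.items())

a = 4.0
v = var(lambda x: x)  # var(X) = 35/12
print(abs(var(lambda x: a * x) - a ** 2 * v) < 1e-9)  # True: scaling
print(abs(var(lambda x: x + a) - v) < 1e-9)           # True: shift-invariance
```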
But if $X$ and $Y$ are independent, then

$$\operatorname{var}(X + Y) = \operatorname{var}(X) + \operatorname{var}(Y)$$
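For independent X and Y, variances add: var(X + Y) = var(X) + var(Y). A quick numeric check with two independent fair dice (each has variance 35/12, so the sum should have variance 35/6):

```python
from itertools import product

# Two independent fair dice; the joint PMF factors by construction.
pmf = {x: 1 / 6 for x in range(1, 7)}
joint = {(x, y): pmf[x] * pmf[y] for x, y in product(pmf, repeat=2)}

mean_sum = sum(p * (x + y) for (x, y), p in joint.items())
var_sum = sum(p * (x + y - mean_sum) ** 2 for (x, y), p in joint.items())

mean_x = sum(p * x for x, p in pmf.items())
var_x = sum(p * (x - mean_x) ** 2 for x, p in pmf.items())

print(abs(var_sum - 2 * var_x) < 1e-9)  # True: 35/12 + 35/12 = 35/6
```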
Source: MITx 6.041x, Lecture 7.
# posted by rot13(Unafba Pune) @ 8:37 AM
