A [[Random variable]] with 2 "parts".
The same ideas carry over to random variables of even higher dimension.
# [[Discrete]] 2D random variable
## Joint probability mass function (pmf)
$$p(x,y) = P(X=x \cap Y=y) \quad \left( = P(X=x, Y=y) \right)$$
## Joint table
Like the 1D table, but an actual 2D table instead of a single line: rows for the
values of $X$, columns for the values of $Y$.
We can recover the respective 1D tables by summing the rows or columns
(depending on which variable) together.
## Marginal pmf
The marginal pmf of $X$ sums the joint pmf over all values of $Y$ (one row or column of the table):
$$p_X(x) := p(x,y_1) + \dots + p(x, y_n)$$
Similarly for $p_Y(y)$.
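A quick numerical sketch of the joint table and its marginals (the table values are my own hypothetical example, not from the note), using numpy:

```python
import numpy as np

# Hypothetical joint pmf table: entry [i, j] = P(X = x_i, Y = y_j).
# All entries are non-negative and sum to 1.
joint = np.array([
    [0.10, 0.20, 0.10],
    [0.05, 0.25, 0.30],
])

# Marginal pmf of X: sum each row over all y-values.
p_X = joint.sum(axis=1)   # [0.40, 0.60]
# Marginal pmf of Y: sum each column over all x-values.
p_Y = joint.sum(axis=0)   # [0.15, 0.45, 0.40]

print(p_X, p_Y, joint.sum())
```

Summing a marginal again (e.g. `p_X.sum()`) gives 1, as it must for a pmf.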
# [[Continuous]] 2D random variable
## Joint probability density function (pdf)
1D is integrated over the number line, so 2D is integrated over $D$.
$$P((X, Y) \in D) = \iint_D f(x,y)\,dx\,dy$$
where $D$ is any [[Borel set]] on $\mathbb{R}^2$.
We don't usually draw this graph since it is in 3D (unpleasant).
Instead, we draw the non-trivial domain (all non-zero values).
$$f(x,y) \ge 0, \qquad \int_{-\infty}^\infty \left( \int_{-\infty}^\infty f(x,y)\,dx \right) dy = 1$$
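Both the normalization and $P((X,Y) \in D)$ can be checked numerically. A sketch with a hypothetical pdf of my own choosing, $f(x,y) = x + y$ on the unit square (0 elsewhere), using scipy:

```python
from scipy.integrate import dblquad

# Hypothetical joint pdf: f(x, y) = x + y on [0,1]^2, 0 elsewhere.
f = lambda x, y: x + y

# Total mass must be 1. Note dblquad expects func(y, x):
# it integrates over y first, then x.
total, _ = dblquad(lambda y, x: f(x, y), 0, 1, 0, 1)

# P((X, Y) in D) for the triangle D = {x + y <= 1}:
# the inner y-bound depends on x, so we pass it as a function.
prob_D, _ = dblquad(lambda y, x: f(x, y), 0, 1, 0, lambda x: 1 - x)

print(total, prob_D)  # total ~ 1.0
```

For this pdf the triangle probability works out analytically to $1/3$, which the numerical integral reproduces.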
## Marginal pdf
$$f_X(x) := \int_{-\infty}^\infty f(x,y)\,dy$$
Similar for $f_Y(y)$.
The integration bounds follow the non-trivial domain, so they may depend on $x$ (or $y$).
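A sketch of computing the marginal numerically, with the same hypothetical pdf $f(x,y) = x + y$ on $[0,1]^2$ (my example, not from the note):

```python
from scipy.integrate import quad

# Hypothetical joint pdf: f(x, y) = x + y on [0,1]^2, 0 elsewhere.
f = lambda x, y: x + y

def f_X(x):
    # Marginal pdf of X: integrate y out over the non-trivial domain [0, 1].
    return quad(lambda y: f(x, y), 0, 1)[0]

# Analytically f_X(x) = x + 1/2 here.
print(f_X(0.25))
```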
# Independence
$X$ and $Y$ are independent if
$$p(x,y) = p_X(x) \cdot p_Y(y) \qquad \mathrm{(discrete)}$$
$$f(x,y) = f_X(x) \cdot f_Y(y) \qquad \mathrm{(continuous)}$$
Much the same as independence for [[Event]]s.
Check by comparing the joint pmf/pdf with the product of the marginals, or by
noticing when one variable "looks" dependent on the other. Check the intuition.
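For the discrete case, the independence check is literally a cell-by-cell comparison of the joint table against the outer product of the marginals. A sketch with hypothetical tables of my own:

```python
import numpy as np

def is_independent(joint):
    """True iff p(x, y) == p_X(x) * p_Y(y) for every cell of the table."""
    p_x = joint.sum(axis=1)               # marginal pmf of X
    p_y = joint.sum(axis=0)               # marginal pmf of Y
    return np.allclose(joint, np.outer(p_x, p_y))

# Independent by construction: every cell is p_X(x) * p_Y(y).
joint_indep = np.outer([0.4, 0.6], [0.3, 0.7])
print(is_independent(joint_indep))        # True

# All mass on the diagonal: knowing X pins down Y, so not independent.
joint_dep = np.array([[0.5, 0.0],
                      [0.0, 0.5]])
print(is_independent(joint_dep))          # False
```

The second table is the intuition check in action: seeing $X$ completely determines $Y$, so the variables "look" dependent before any arithmetic.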