Diffstat (limited to 'sannolikhet')
| -rw-r--r-- | sannolikhet/2D random variable.md | 56 |
| -rw-r--r-- | sannolikhet/Bayes' theorem.md | 3 |
| -rw-r--r-- | sannolikhet/Essential formula of probability.md | 9 |
| -rw-r--r-- | sannolikhet/Event.md | 54 |
| -rw-r--r-- | sannolikhet/Percentile.md | 5 |
| -rw-r--r-- | sannolikhet/Probability calculation techniques.md | 16 |
| -rw-r--r-- | sannolikhet/Probability.md | 3 |
| -rw-r--r-- | sannolikhet/Random variable.md | 97 |
| -rw-r--r-- | sannolikhet/Sample space.md | 10 |
9 files changed, 253 insertions, 0 deletions
diff --git a/sannolikhet/2D random variable.md b/sannolikhet/2D random variable.md
new file mode 100644
index 0000000..10c2a39
--- /dev/null
+++ b/sannolikhet/2D random variable.md
@@ -0,0 +1,56 @@
+A [[Random variable]] with 2 "parts".
+
+Works the same for variables of even higher dimension.
+
+# [[Discrete]] 2D random variable
+
+## Joint probability mass function (pmf)
+
+$$p(x,y) = P(X=x \cap Y=y) \quad \left( = P(X=x, Y=y) \right)$$
+
+## Joint table
+
+Much like the table of a 1D random variable, but a full two-dimensional table
+instead of a single row.
+
+We can get the respective 1D tables by summing rows or columns (depending on
+which variable) together.
+
+## Marginal pmf
+
+The marginal pmf $p_X(x)$ of $X$ is obtained by summing the joint pmf over all
+values of $Y$:
+
+$$p_X(x) = p(x,y_1) + ... + p(x, y_n)$$
+
+Similarly for $p_Y(y)$.
+
+# [[Continuous]] 2D random variable
+
+## Joint probability density function (pdf)
+
+In 1D we integrate over intervals on the number line; in 2D we integrate over
+a region $D$:
+
+$$P((X, Y) \in D) = \iint_D f(x,y) \, dx \, dy$$
+
+where $D$ is any [[Borel set]] on $\mathbb{R}^2$.
+
+We don't usually draw this graph since it is in 3D (unpleasant).
+
+Instead, we draw the non-trivial domain (all points where $f$ is non-zero).
+
+As in 1D, the pdf must satisfy
+
+$$f(x,y) \ge 0, \quad \int_{-\infty}^\infty \left( \int_{-\infty}^\infty f(x,y) \, dx \right) dy = 1$$
+
+## Marginal pdf
+
+The marginal pdf $f_X(x)$ of $X$ is
+
+$$f_X(x) := \int_{-\infty}^\infty f(x,y) \, dy$$
+
+Similar for $f_Y(y)$.
+
+The effective integration bounds can depend on $x$ when the non-trivial domain
+is not a rectangle.
+
+# Independence
+
+$X$ and $Y$ are independent if
+
+$$p(x,y) = p_X(x) \cdot p_Y(y) \qquad \mathrm{(discrete)}$$
+$$f(x,y) = f_X(x) \cdot f_Y(y) \qquad \mathrm{(continuous)}$$
+
+Much the same as independence for [[Event]]s.
+
+Check by multiplying the marginal pmfs/pdfs, or ask whether something "looks"
+dependent. Check the intuition.
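+
+# Example
+
+A made-up joint table (purely to illustrate the definitions above): let
+$X \in \{0,1\}$ and $Y \in \{0,1\}$ with
+
+|$p(x,y)$|$y=0$|$y=1$|
+|--|--|--|
+|$x=0$|$0.1$|$0.2$|
+|$x=1$|$0.3$|$0.4$|
+
+Summing each row gives $p_X(0) = 0.3$ and $p_X(1) = 0.7$; summing each column
+gives $p_Y(0) = 0.4$ and $p_Y(1) = 0.6$. Since
+
+$$p(0,0) = 0.1 \ne p_X(0) \cdot p_Y(0) = 0.12$$
+
+$X$ and $Y$ are dependent.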
diff --git a/sannolikhet/Bayes' theorem.md b/sannolikhet/Bayes' theorem.md
new file mode 100644
index 0000000..633c26f
--- /dev/null
+++ b/sannolikhet/Bayes' theorem.md
@@ -0,0 +1,3 @@
+$$P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)}$$
+
+Proof: expand both sides using the definition of [[Conditional probability]].
diff --git a/sannolikhet/Essential formula of probability.md b/sannolikhet/Essential formula of probability.md
new file mode 100644
index 0000000..25414fc
--- /dev/null
+++ b/sannolikhet/Essential formula of probability.md
@@ -0,0 +1,9 @@
+Also known as the "happy face formula", since we are happy whenever we get to
+use it.
+
+If all outcomes are equally likely, then
+
+$$P(A) = \frac{\mathrm{N}(A)}{\mathrm{N}(S)}$$
+
+where $\mathrm{N}(A)$ is the number of outcomes in the [[Event]] $A$ and
+$\mathrm{N}(S)$ is the number of outcomes in the [[Sample space]].
+
+We can sometimes apply this directly, but usually we need counting techniques.
diff --git a/sannolikhet/Event.md b/sannolikhet/Event.md
new file mode 100644
index 0000000..1d630f0
--- /dev/null
+++ b/sannolikhet/Event.md
@@ -0,0 +1,54 @@
+An event is any subset of the [[Sample space]].
+
+Example: Given
+$$S = \{1,2,3,4,5,6\}$$
+
+then
+$$A = \{2,4\}$$
+is an event.
+
+Can be visualized with [[Venn diagram]]s.
+
+Events happen with a [[Probability]].
+
+## Special events
+
+Some special events are the empty event ($\emptyset$) and the [[Sample space]]
+itself ($S$).
+
+# Operations
+
+Just like there are normal operations (+, \*, ..) on numbers, there exist
+operations on events. They work much like the operations on normal [[Set]]s.
+
+## Intersection
+
+$$A \cap B$$
+
+## Union
+
+$$A \cup B$$
+
+## Complement
+
+$$A'$$
+
+# Disjoint
+
+Two events are disjoint if they don't "overlap" in any way, i.e.
+$A \cap B = \emptyset$.
+
+# Independent
+
+Two events are independent if $P(A \cap B) = P(A) \cdot P(B)$.
+
+Intuition: $A$ does not affect $B$ and vice versa.
+
+$(A, B)$ independent $\Leftrightarrow$ $(A', B)$ independent, since
+$P(A' \cap B) = P(B) - P(A \cap B) = (1 - P(A)) \cdot P(B) = P(A') \cdot P(B)$
+(apply the same computation to $A'$ for the converse).
+
+For example, given a fair die roll and
+
+$$A = \{2\}, \ B = \{2, 3\}, \ C = S$$
+
+then $A$ and $B$ are dependent, but $A$ and $C$ are independent (the outcome of
+$A$ doesn't affect $C$, since $P(C) = 1$).
+
+In general, $A_1, ..., A_n$ are independent if the product rule holds for every
+subcollection:
+
+$$P(A_{i_1} \cap ... \cap A_{i_k}) = P(A_{i_1}) \cdot ... \cdot P(A_{i_k})$$
+
+for all $1 \le i_1 < ... < i_k \le n$.
diff --git a/sannolikhet/Percentile.md b/sannolikhet/Percentile.md
new file mode 100644
index 0000000..a1879e9
--- /dev/null
+++ b/sannolikhet/Percentile.md
@@ -0,0 +1,5 @@
+$c$ is called the $b$-th percentile if
+
+$$P(X \le c) = b \%$$
+
+where $P(X \le c)$ is the [[Random variable#Cumulative distribution function]],
+computed for continuous variables from the
+[[Random variable#Continuous random variable#Probability density function]].
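+
+A worked example (reusing the waiting-time pdf $f(x) = 2e^{-2x}$, $x \gt 0$,
+from [[Random variable]]): the 90th percentile $c$ solves
+
+$$P(X \le c) = \int_0^c 2e^{-2x} dx = 1 - e^{-2c} = 90 \%$$
+
+so $c = \frac{\ln 10}{2} \approx 1.15$.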
diff --git a/sannolikhet/Probability calculation techniques.md b/sannolikhet/Probability calculation techniques.md
new file mode 100644
index 0000000..0b0efdf
--- /dev/null
+++ b/sannolikhet/Probability calculation techniques.md
@@ -0,0 +1,16 @@
+A proper definition would probably (hah) use the [[Kolmogorov axioms]].
+
+Intuition: probability is "chance". However, something like 50% is not exact:
+if something has probability 50%, we don't expect it to happen exactly 50
+times out of 100 tries. Rather, for e.g. coin throws, we expect the relative
+frequency to converge:
+
+$$\lim_{n \rightarrow \infty} \frac{\mathrm{N}(\mathrm{heads})}{n} = 50 \%$$
+
+# Some rules
+
+$$P(S) = 1$$
+$$P(\emptyset) = 0$$
+$$P(A') = 1 - P(A)$$
+$$P(A_1 \cup A_2 \cup ...) = P(A_1) + P(A_2) + ... \qquad (A_i \ \mathrm{pairwise \ disjoint})$$
+$$P(A \cup B) = P(A) + P(B) - P(A \cap B)$$
+$$P(A \cup B \cup C) = P(A) + P(B) + P(C) - P(A \cap B) - P(B \cap C) - P(A \cap C) + P(A \cap B \cap C)$$
diff --git a/sannolikhet/Probability.md b/sannolikhet/Probability.md
new file mode 100644
index 0000000..90969b0
--- /dev/null
+++ b/sannolikhet/Probability.md
@@ -0,0 +1,3 @@
+In a nutshell: "What is the *probability* that x happens?"
+
+See [[TAMS42#Probability]].
\ No newline at end of file
diff --git a/sannolikhet/Random variable.md b/sannolikhet/Random variable.md
new file mode 100644
index 0000000..ba7e125
--- /dev/null
+++ b/sannolikhet/Random variable.md
@@ -0,0 +1,97 @@
+Definition: A random variable (or a distribution) is a numerical value
+associated with an [[Experiment]], whose value can change from one replicate of
+the experiment to another.
+
+A proper definition would need [[Probability space]] and [[Measurable function]]s.
+
+For example, given a fair die roll, the outcome
+
+$$X \in \{1,2,3,4,5,6\}$$
+
+is a random variable.
+
+$$Y \in [30, 260]$$
+
+is another.
+
+Two types: discrete random variables and continuous random variables.
+
+# [[Discrete]] random variable
+
+If the set of possible values is finite or countably infinite.
+
+## Probability mass function
+
+Also known as the pmf. Every discrete random variable has a corresponding pmf,
+denoted
+
+$$p(x) = P(X = x)$$
+
+## Table
+
+Every discrete random variable also has a corresponding table.
+
+|$X$|$x_1$|$x_2$|$...$|$x_n$|
+|--|--|--|--|--|
+|$p(x)$|$p(x_1)$|$p(x_2)$|$...$|$p(x_n)$|
+
+where
+
+$$p(x_i) \ge 0 \quad \forall i \in \{1,2,..,n\}$$
+$$\sum_{i=1}^n p(x_i) = 1$$
+
+# [[Continuous]] random variable
+
+The rest, e.g. some interval on the number line.
+
+## Probability density function
+
+Also known as the pdf. Every continuous random variable has a corresponding
+pdf, denoted $f(x)$, where
+
+$$\int_a^b f(x) dx = P(a \le X \le b)$$
+
+and
+
+$$f(x) \ge 0 \quad \forall x$$
+$$\int_{-\infty}^\infty f(x) dx = 1$$
+
+# Cumulative distribution function
+
+Also known as the cdf.
+
+$$F(x) = P(X \le x)$$
+
+For discrete random variables:
+
+$$F(x) = \sum_{x_i \le x} p(x_i)$$
+
+And for continuous random variables:
+
+$$F(x) = \int_{-\infty}^x f(y) dy$$
+
+Here we see that
+
+$$F'(x) = f(x)$$
+
+for continuous random variables. Compare with the
+[[Fundamental theorem of calculus]].
+
+# Examples
+
+## Waiting time (useful model)
+
+Let $X$ be the waiting time between calls in a phone center. Assume $X$ is a
+continuous random variable with pdf
+
+$$f(x) = 2e^{-2x}, \quad x \gt 0$$
+
+What is $P(X \gt 3)$?
+
+$$P(X \gt 3) = \int_3^\infty f(x) dx = \int_3^\infty 2e^{-2x} dx = \left[ -e^{-2x} \right]_3^\infty = e^{-6}$$
+
+In actuality, the full definition is
+
+$$f(x) = \begin{cases} 2e^{-2x} & x \gt 0 \\ 0 & \mathrm{otherwise} \end{cases}$$
+
+but the 0 case is usually left implicit.
diff --git a/sannolikhet/Sample space.md b/sannolikhet/Sample space.md
new file mode 100644
index 0000000..6e4130c
--- /dev/null
+++ b/sannolikhet/Sample space.md
@@ -0,0 +1,10 @@
+The sample space is the [[Set]] of all possible outcomes of an [[Experiment]] or
+a [[Trial]].
+
+Example: Throw a die and observe the upper side.
+$$S = \{1,2,3,4,5,6\}$$
+
+Example: Throw two fair dice and observe their upper sides.
+$$S = \{(1,1),(1,2), ..., (1,6),(2,1), ...,(6,5),(6,6)\}$$
+
+Subsets of the sample space are [[Event]]s.
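+
+A quick sketch (Python, not from the course material) that enumerates the
+two-dice sample space and applies the [[Essential formula of probability]] to
+the event "the sum is 7":
+
+```python
+from itertools import product
+
+# Sample space for two fair dice: all ordered pairs of upper sides.
+S = list(product(range(1, 7), repeat=2))
+
+# The event "the sum of the two dice is 7" is a subset of S.
+A = [outcome for outcome in S if sum(outcome) == 7]
+
+# Essential formula: all 36 outcomes are equally likely, so P(A) = N(A) / N(S).
+print(len(A), len(S), len(A) / len(S))  # prints: 6 36 0.16666666666666666
+```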
