0.3: Probability

    A (discrete) probability distribution \(\mathcal{D}\) over a set \(X\) of outcomes is a function \(\mathcal{D} : X\rightarrow [0,1]\) that satisfies the condition:

    \[\sum_{x\in X}\mathcal{D}(x)=1\nonumber\]

    We say that \(\mathcal{D}\) assigns probability \(\mathcal{D}(x)\) to outcome \(x\). The set \(X\) is referred to as the support of \(\mathcal{D}\).

    A special distribution is the uniform distribution over a finite set \(X\), which assigns probability \(1/|X|\) to every element of \(X\).

    Let \(\mathcal{D}\) be a probability distribution over \(X\). We write \(Pr_\mathcal{D}[A]\) to denote the probability of an event \(A\), where probabilities are according to distribution \(\mathcal{D}\). Typically the distribution \(\mathcal{D}\) is understood from context, and we omit it from the notation. Formally, an event is a subset of the support \(X\), but it is typical to write \(Pr[\text{cond}]\) where “cond” is the condition that defines an event \(A = \{x\in X \mid x \text{ satisfies condition cond}\}\). Interpreting \(A\) strictly as a set, we have \(Pr_\mathcal{D}[A]\stackrel{\text{def}}{=}\sum_{x\in A}\mathcal{D}(x)\).
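
    As a small worked example (not part of the original definitions): let \(\mathcal{D}\) be the uniform distribution over \(X = \{1, 2, 3, 4, 5, 6\}\) (a fair die roll), and let \(A\) be the event that the outcome is even, so \(A = \{2, 4, 6\}\). Then

    \[Pr[A]=\sum_{x\in A}\mathcal{D}(x)=\frac{1}{6}+\frac{1}{6}+\frac{1}{6}=\frac{1}{2}.\nonumber\]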

    The conditional probability of A given B is defined as \(Pr[A|B] \stackrel{\text{def}}{=} Pr[A \cap B]/Pr[B]\). When \(Pr[B] = 0\), we let \(Pr[A | B] = 0\) by convention, to avoid dividing by zero.

    Below are some convenient facts about probabilities:

    \[Pr[A]=Pr[A|B]Pr[B]+Pr[A|\neg B]Pr[\neg B];\nonumber\]

    \[Pr[A\cup B]\le Pr[A]+Pr[B].\tag{union bound}\]
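
    Both facts can be checked on the fair-die example above, with \(A = \{2, 4, 6\}\) and \(B = \{5, 6\}\): we have \(Pr[A\cap B] = 1/6\) and \(Pr[B] = 1/3\), so \(Pr[A|B] = 1/2\); likewise \(Pr[A|\neg B] = 1/2\) and \(Pr[\neg B] = 2/3\), so the first fact gives \(Pr[A] = \frac{1}{2}\cdot\frac{1}{3}+\frac{1}{2}\cdot\frac{2}{3} = \frac{1}{2}\), as expected. For the union bound, \(Pr[A\cup B] = 2/3 \le 1/2 + 1/3\).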

    Precise Terminology

    It is common and tempting to use the word “random” when one really means “uniformly at random.” We’ll try to develop the habit of being more precise about this distinction.

    It is also tempting to describe an outcome as either random or uniform. For example, one might want to say that “the string \(x\) is random.” But an outcome is not random; the process that generated the outcome is random. After all, there are many ways to come up with the same string \(x\), and not all of them are random. So randomness is a property of the process and not an inherent property of the result of the process.

    It’s more precise and a better mental habit to say that an outcome is “sampled or chosen randomly,” and it’s even better to be precise about what the random process was. For example, “the string \(x\) is chosen uniformly.”

    Notation in Pseudocode

    When \(\mathcal{D}\) is a probability distribution, we write “\(x\leftarrow \mathcal{D}\)” to mean that the value of \(x\) is sampled according to the distribution \(\mathcal{D}\). We overload the “\(\leftarrow\)” notation slightly, writing “\(x\leftarrow X\)” when \(X\) is a finite set to mean that \(x\) is sampled from the uniform distribution over \(X\).

    We will often discuss algorithms that make some random choices. When describing such algorithms, we will use statements like “\(x\leftarrow \mathcal{D}\)” in the pseudocode. If \(\mathcal{A}\) is an algorithm that takes input and also makes some internal random choices, then we can think of the output of \(\mathcal{A}(x)\) as a distribution — possibly a different distribution for each input \(x\). Then we write “\(y \leftarrow \mathcal{A}(x)\)” to mean the natural thing: run \(\mathcal{A}\) on input \(x\) and assign the output to \(y\). The use of the arrow “\(\leftarrow\)” rather than an assignment operator “\(:=\)” is meant to emphasize that, even when \(x\) is fixed, the output \(\mathcal{A}(x)\) is a random variable depending on internal random choices made by \(\mathcal{A}\).
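
    To make this notation concrete, the following is a minimal Python sketch (illustrative only, and not from the text). It assumes a distribution is represented as a dictionary mapping outcomes to probabilities; the helper names sample and sample_uniform, and the toy algorithm A, are hypothetical.

    ```python
    import random

    # A distribution is represented here as a dict {outcome: probability}.
    # (This concrete representation is an assumption made for illustration.)

    def sample(dist):
        """x <- D: sample an outcome according to the distribution D."""
        outcomes = list(dist.keys())
        weights = list(dist.values())
        return random.choices(outcomes, weights=weights, k=1)[0]

    def sample_uniform(finite_set):
        """x <- X: sample uniformly from the finite set X."""
        return random.choice(sorted(finite_set))

    def A(x, n=4):
        """y <- A(x): a toy randomized algorithm. Even for a fixed input x,
        its output is a random variable, due to the internal coin flips."""
        coins = [sample_uniform({0, 1}) for _ in range(n)]
        return (x + sum(coins)) % 2

    D = {"heads": 1/2, "tails": 1/2}
    print(sample(D))                           # x <- D
    print(sample_uniform({1, 2, 3, 4, 5, 6}))  # x <- {1, ..., 6}
    print([A(0) for _ in range(5)])            # repeated runs of y <- A(0) may differ
    ```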

    Asymptotics (Big-O)

    Let \(f: \mathbb{N}\rightarrow \mathbb{N}\) be a function. We characterize the asymptotic growth of \(f\) in the following ways:

    \[f(n)\space \text{is}\space O(g(n)) \stackrel{\text{def}}{\Leftrightarrow} \lim_{n \rightarrow \infty} \frac{f(n)}{g(n)} < \infty \\ \Leftrightarrow \exists c>0: \text{for all but finitely many }n:\space f(n)<c\cdot g(n)\nonumber\]

    \[f(n)\space \text{is}\space \Omega(g(n)) \stackrel{\text{def}}{\Leftrightarrow} \lim_{n \rightarrow \infty} \frac{f(n)}{g(n)} >0 \\ \Leftrightarrow \exists c>0: \text{for all but finitely many }n:\space f(n)>c\cdot g(n)\nonumber\]

    \[f(n)\space \text{is}\space \Theta(g(n)) \stackrel{\text{def}}{\Leftrightarrow} f(n) \space\text{is}\space O(g(n)) \text{ and } f(n)\space \text{is } \Omega(g(n)) \\ \Leftrightarrow 0<\lim_{n \rightarrow \infty} \frac{f(n)}{g(n)} <\infty \\ \Leftrightarrow\exists c_1,c_2>0: \text{for all but finitely many }n:\space c_1\cdot g(n)<f(n)<c_2\cdot g(n)\nonumber\]
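
    As a brief worked example (not from the text): for \(f(n) = 3n^2 + 5n\) and \(g(n) = n^2\),

    \[\lim_{n \rightarrow \infty} \frac{3n^2+5n}{n^2} = 3,\nonumber\]

    which is positive and finite, so \(f(n)\) is \(\Theta(n^2)\). Equivalently, taking \(c_1 = 3\) and \(c_2 = 4\), we have \(c_1\cdot g(n) < f(n) < c_2\cdot g(n)\) for all \(n > 5\), i.e., for all but finitely many \(n\).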