
7.2: Permutations and Combinations

Learning Objectives

  • Discusses the basics of combinations and permutations, and how to calculate the probability of certain events, such as n-bit errors in a codeword.

The lottery "game" consists of picking kk numbers from a pool of n. For example, you select 6 numbers out of 60. To win, the order in which you pick the numbers doesn't matter; you only have to choose the right set of 6 numbers. The chances of winning equal the number of different length- k sequences that can be chosen. A related, but different, problem is selecting the batting lineup for a baseball team. Now the order matters, and many more choices are possible than when order does not matter.  

Answering such questions occurs in many applications beyond games. In digital communications, for example, you might ask how many possible double-bit errors can occur in a codeword. Numbering the bit positions from 1 to N, the answer is the same as the lottery problem with k=2. Solving these kinds of problems amounts to understanding permutations, the number of ways of choosing things when order matters (as in baseball lineups), and combinations, the number of ways of choosing things when order does not matter (as in lotteries and bit errors).

Calculating permutations is easiest. If we are to pick k numbers from a pool of n, we have n choices for the first one. For the second choice, we have n-1. The number of length-two ordered sequences is therefore n(n-1). Continuing to choose until we make k choices means the number of permutations is

\[n(n-1)(n-2)...(n-k+1)\]

This result can be written in terms of factorials as

\[\frac{n!}{(n-k)!}\]

with

\[n!=n(n-1)(n-2)...1\]

For mathematical convenience, we define \(0!=1\), so that the permutation formula also holds when \(k=n\).
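To make the count concrete, here is a minimal Python sketch of the permutation formula \(\frac{n!}{(n-k)!}\). The 25-player roster used below is an illustrative assumption, not a value from the text.

```python
from math import factorial

def permutations(n: int, k: int) -> int:
    """Number of ordered length-k sequences drawn from a pool of n: n! / (n - k)!."""
    return factorial(n) // factorial(n - k)

# Ordered choices of 9 batters from a hypothetical 25-player roster.
print(permutations(25, 9))   # 741354768000
# Python 3.8+ provides the same count directly as math.perm(25, 9).
```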

When order does not matter, the number of combinations equals the number of permutations divided by the number of orderings. The number of ways a pool of k things can be ordered equals k!. Thus, once we choose the nine starters for our baseball game, we have

\[9!=362,880\]

different lineups! The symbol for the combination of k things drawn from a pool of n is

\[\binom{n}{k}\]

and equals

\[\frac{n!}{(n-k)!k!}\]
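The same kind of sketch covers combinations. The 52-card-deck example below is illustrative only, and Python's math.comb (available since Python 3.8) serves as an independent check on the factorial formula.

```python
from math import comb, factorial

def combinations(n: int, k: int) -> int:
    """Number of unordered k-element choices from a pool of n: n! / ((n - k)! k!)."""
    return factorial(n) // (factorial(n - k) * factorial(k))

# The direct formula agrees with the built-in math.comb on a few spot checks.
for n, k in [(9, 3), (10, 5), (52, 5)]:
    assert combinations(n, k) == comb(n, k)

print(combinations(52, 5))   # 2598960 possible five-card hands from a 52-card deck
```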

Exercise \(\PageIndex{1}\)

What are the chances of winning the lottery? Assume you pick 6 numbers from the numbers 1-60.

Solution

\[\binom{60}{6}=\frac{60!}{54!\,6!}=50,063,860\]

so the chances of winning are 1 in 50,063,860.
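As a quick sanity check, Python's math.comb reproduces this count (a minimal sketch, assuming Python 3.8+):

```python
from math import comb

ways = comb(60, 6)                       # number of distinct 6-number picks
print(ways)                              # 50063860, matching the solution above
print(f"chance of winning: 1 in {ways}")
```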

Combinatorials occur in interesting places. For example, Newton derived that the n-th power of a sum obeyed the formula 

\[(x+y)^{n}=\binom{n}{0}x^{n}+\binom{n}{1}x^{n-1}y+\binom{n}{2}x^{n-2}y^{2}+...+\binom{n}{n}y^{n}\]
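The expansion is easy to verify numerically. The sketch below sums the binomial terms and compares the result against direct evaluation of \((x+y)^{n}\); the particular values of x, y, and n are arbitrary choices for the check, not values from the text.

```python
from math import comb

def binomial_expansion(x: float, y: float, n: int) -> float:
    """Sum of the terms C(n, k) * x**(n - k) * y**k for k = 0..n."""
    return sum(comb(n, k) * x**(n - k) * y**k for k in range(n + 1))

# Spot check against direct evaluation of (x + y)**n.
x, y, n = 2.0, 3.0, 7
print(binomial_expansion(x, y, n), (x + y)**n)   # both print 78125.0
```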

Exercise \(\PageIndex{2}\)

What does the sum of binomial coefficients equal? In other words, what is

\[\sum_{k=0}^{n}\binom{n}{k}\]

Solution

Setting x = 1 and y = 1 in Newton's binomial theorem, the sum equals

\[(1+1)^{n}=2^{n}\]
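A short numerical check of this identity for the first few values of n:

```python
from math import comb

# Each row of binomial coefficients sums to 2**n, as the exercise states.
for n in range(10):
    assert sum(comb(n, k) for k in range(n + 1)) == 2**n
print("sum of C(n, k) over k equals 2**n for n = 0..9")
```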

A related problem is calculating the probability that exactly two bits are in error in a length-n codeword when p is the probability of any bit being in error. The probability of any particular two-bit error sequence is

\[p^{2}(1-p)^{n-2}\]

The probability of a two-bit error occurring anywhere equals this probability times the number of combinations:

\[\binom{n}{2}p^{2}(1-p)^{n-2}\]
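As a numerical illustration, the sketch below evaluates this expression for exactly k errors; the codeword length n = 7 and bit-error probability p = 0.01 are assumed values, not taken from the text.

```python
from math import comb

def prob_k_errors(n: int, k: int, p: float) -> float:
    """Probability of exactly k bit errors in a length-n codeword,
    with each bit flipping independently with probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Illustrative values only.
n, p = 7, 0.01
print(prob_k_errors(n, 2, p))   # probability of exactly two bit errors, about 2.0e-3
```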

Note that the probability of zero, one, two, or more errors occurring must be one; in other words, something must happen to the codeword! That means that we must have

\[\binom{n}{0}(1-p)^{n}+\binom{n}{1}p(1-p)^{n-1}+\binom{n}{2}p^{2}(1-p)^{n-2}+...+\binom{n}{n}p^{n}=1\]

Can you prove this?
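As a hint, substituting x = 1-p and y = p into Newton's formula gives \((p+(1-p))^{n}=1^{n}=1\). The sketch below is not a proof, just a numerical check using the same illustrative n and p as above.

```python
from math import comb

# Numerical sanity check: the probabilities of 0, 1, ..., n errors sum to one.
n, p = 7, 0.01
total = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
print(total)   # 1.0 (up to floating-point rounding)
```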

Contributor

  •  ContribEEOpenStax