
13.4: Chinese Remainder Theorem


    When doing arithmetic mod \(N\), we can sometimes use knowledge of the factors \(N=p q\) to speed things up. This section discusses the math behind these speedups.


History. In the Sunzi Suanjing, written some time around the 4th century CE, Chinese mathematician Sunzi posed an interesting puzzle involving remainders:

    "We have a number of things, but we do not know exactly how many. If we count them by threes we have two left over. If we count them by fives we have three left over. If we count them by sevens we have two left over. How many things are there?\({ }^{6}\)

Sunzi’s puzzle is the first known instance of a system of simultaneous equations involving modular arithmetic. In our notation, he is asking us to solve for \(x\) in the following system of congruences:

    \[\begin{aligned} &x \equiv_{3} 2 \\ &x \equiv_{5} 3 \\ &x \equiv_{7} 2 \end{aligned}\]

We can solve such systems of equations using what is called (in the West) the Chinese Remainder Theorem (CRT). Below is one of the simpler formulations of the Chinese Remainder Theorem, involving only two equations/moduli (unlike the example above, which has three moduli: 3, 5, and 7).

    Theorem \(13.9\) (CRT)

    Suppose \(\operatorname{gcd}(r, s)=1\). Then for all integers \(u, v\), there is a solution for \(x\) in the following system of equations:

\[\begin{aligned} &x \equiv_{r} u \\ &x \equiv_{s} v \end{aligned}\]

    Furthermore, this solution is unique modulo \(r s\).

    Proof

Since \(\operatorname{gcd}(r, s)=1\), we have by Bézout’s theorem that \(1=ar+bs\) for some integers \(a\) and \(b\). Reducing this equation mod \(r\) shows that \(b\) and \(s\) are multiplicative inverses of each other modulo \(r\) (and, likewise, \(a\) and \(r\) are inverses of each other modulo \(s\)). Now choose \(x=var+ubs\). Then,

\[x = var + ubs \equiv_{r} (va)\cdot 0 + u\left(s^{-1} s\right) \equiv_{r} u\]

    So \(x \equiv_{r} u\), as desired. Using similar reasoning \(\bmod s\), we can see that \(x \equiv_{s} v\), so \(x\) is a solution to both equations.

Now we argue that this solution is unique modulo \(rs\). Suppose \(x\) and \(x^{\prime}\) are two solutions to the system of equations, so we have:

    \[x \equiv_{r} x^{\prime} \equiv_{r} u\]

    \[x \equiv_{s} x^{\prime} \equiv_{s} v\]

Since \(x \equiv_{r} x^{\prime}\) and \(x \equiv_{s} x^{\prime}\), it must be that \(x-x^{\prime}\) is a multiple of \(r\) and a multiple of \(s\). Since \(r\) and \(s\) are relatively prime, their least common multiple is \(rs\), so \(x-x^{\prime}\) must be a multiple of \(rs\). Hence, \(x \equiv_{rs} x^{\prime}\). So any two solutions to this system of equations are congruent \(\bmod rs\).

    Example

    Sage implements the crt function to solve for \(x\) in these kinds of systems of equations. Suppose we want to solve for \(x\) :

    \[\begin{aligned} &x \equiv_{427} 42 \\ &x \equiv_{529} 123 \end{aligned}\]

    In Sage, the solution can be found as follows:
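(A sketch; Sage’s built-in crt(a, b, m, n) returns the unique solution modulo \(mn\).)

```
# Solve x ≡ 42 (mod 427) and x ≡ 123 (mod 529).
x = crt(42, 123, 427, 529)
print(x)   # 32921
```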

     

    We can check the solution:
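(A sketch, continuing with the value computed above.)

```
x = 32921
print(x % 427)   # 42
print(x % 529)   # 123
```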

     

    CRT Encodings Preserve Structure

Let’s call \((u, v) \in \mathbb{Z}_{r} \times \mathbb{Z}_{s}\) the CRT encoding of \(x \in \mathbb{Z}_{rs}\) if they satisfy the usual relationship: \[\begin{aligned} &x \equiv_{r} u \\ &x \equiv_{s} v \end{aligned}\] We can convert any \(x \in \mathbb{Z}_{rs}\) into its CRT encoding quite easily, via \(x \mapsto(x \% r, x \% s)\). The Chinese Remainder Theorem says that any \((u, v) \in \mathbb{Z}_{r} \times \mathbb{Z}_{s}\) is a valid CRT encoding of a unique \(x \in \mathbb{Z}_{rs}\); and the proof of the theorem shows how to convert from the CRT encoding into the "usual \(\mathbb{Z}_{rs}\) encoding."
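For concreteness, here is a small sketch of both conversions in Sage; the values \(r=3\), \(s=5\), \(x=11\) are just for illustration.

```
r, s = 3, 5
x = 11

# Z_{rs} -> CRT encoding: reduce mod r and mod s.
u, v = x % r, x % s        # (2, 1)

# CRT encoding -> Z_{rs}: recombine with Sage's crt function.
print(crt(u, v, r, s))     # 11
```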

    The amazing thing about these CRT encodings is that they preserve all sorts of arithmetic structure.

    Claim \(13.10\)

If \((u, v)\) is the CRT encoding of \(x\), and \(\left(u^{\prime}, v^{\prime}\right)\) is the CRT encoding of \(x^{\prime}\), then \(\left(u+u^{\prime} \% r,\ v+v^{\prime} \% s\right)\) is the CRT encoding of \(x+x^{\prime} \% rs\).

    Example

Taking \(r=3\) and \(s=5\), let’s write down the CRT encodings of every element in \(\mathbb{Z}_{15}\). In this table, every column contains \(x\) and its CRT encoding \((u, v)\):

\[\begin{array}{c|ccccccccccccccc}
x & 0 & 1 & 2 & 3 & 4 & 5 & 6 & 7 & 8 & 9 & 10 & 11 & 12 & 13 & 14 \\
\hline
u = x \% 3 & 0 & 1 & 2 & 0 & 1 & 2 & 0 & 1 & 2 & 0 & 1 & 2 & 0 & 1 & 2 \\
v = x \% 5 & 0 & 1 & 2 & 3 & 4 & 0 & 1 & 2 & 3 & 4 & 0 & 1 & 2 & 3 & 4
\end{array}\]

Now let’s focus on the columns for \(x=3\) and \(x^{\prime}=7\) and the column for their sum \(x+x^{\prime}=10\):

\[\begin{array}{c|ccc}
x & 3 & 7 & 10 \\
\hline
u = x \% 3 & 0 & 1 & 1 \\
v = x \% 5 & 3 & 2 & 0
\end{array}\]

Reading across these three columns, the top row shows a true addition expression \(3+7 \equiv_{15} 10\); the second row shows a true addition expression \(0+1 \equiv_{3} 1\); the third row shows a true addition expression \(3+2 \equiv_{5} 0\).

    This pattern holds for any \(x\) and \(x^{\prime}\), and I encourage you to try it!
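One way to try it in Sage is the following sketch, with randint supplying arbitrary test values:

```
r, s = 3, 5
x, xp = randint(0, r*s - 1), randint(0, r*s - 1)

# CRT encoding of (x + x') mod rs ...
lhs = ((x + xp) % r, (x + xp) % s)
# ... equals the componentwise sum of the two CRT encodings.
rhs = ((x % r + xp % r) % r, (x % s + xp % s) % s)
print(lhs == rhs)          # True
```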

    As if that weren’t amazing enough, the same thing holds for multiplication:

    Claim \(13.11\)

If \((u, v)\) is the CRT encoding of \(x\), and \(\left(u^{\prime}, v^{\prime}\right)\) is the CRT encoding of \(x^{\prime}\), then \(\left(u \cdot u^{\prime} \% r,\ v \cdot v^{\prime} \% s\right)\) is the CRT encoding of \(x \cdot x^{\prime} \% rs\).

    Example

Let’s return to the \(r=3, s=5\) setting for CRT and focus on the columns for \(x=6\), \(x^{\prime}=7\), and their product \(x \cdot x^{\prime} \equiv_{15} 12\):

\[\begin{array}{c|ccc}
x & 6 & 7 & 12 \\
\hline
u = x \% 3 & 0 & 1 & 0 \\
v = x \% 5 & 1 & 2 & 2
\end{array}\]

The top row shows a true multiplication expression \(6 \cdot 7 \equiv_{15} 12\); the second row shows a true multiplication expression \(0 \cdot 1 \equiv_{3} 0\); the third row shows a true multiplication expression \(1 \cdot 2 \equiv_{5} 2\).

    This pattern holds for any \(x\) and \(x^{\prime}\), and I encourage you to try it!
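The same kind of sketch works for the multiplicative version:

```
r, s = 3, 5
x, xp = randint(0, r*s - 1), randint(0, r*s - 1)

lhs = ((x * xp) % r, (x * xp) % s)
rhs = ((x % r) * (xp % r) % r, (x % s) * (xp % s) % s)
print(lhs == rhs)          # True
```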

The CRT suggests a different, perhaps more indirect, way to do things mod \(rs\). Suppose \(x\) has CRT encoding \((u, v)\) and \(x^{\prime}\) has CRT encoding \(\left(u^{\prime}, v^{\prime}\right)\), and we want to compute \(x+x^{\prime} \bmod rs\). One wild idea is to first directly compute the CRT encoding of this answer, and then convert that encoding to the normal integer representation in \(\mathbb{Z}_{rs}\).

In this case, we know that the answer \(x+x^{\prime}\) has the CRT encoding \(\left(u+u^{\prime} \% r,\ v+v^{\prime} \% s\right)\). But this is the same as \(\left(x+x^{\prime} \% r,\ x+x^{\prime} \% s\right)\); do you see why? So, to add \(x+x^{\prime} \bmod rs\), we just need to add \(x+x^{\prime} \bmod r\), and then add \(x+x^{\prime} \bmod s\). This gives us the CRT encoding of the answer we want, and we can convert that CRT encoding back into a normal \(\mathbb{Z}_{rs}\)-integer.

    The same idea works for multiplication as well, giving us the following:

CRT method for doing some operation(s) mod \(rs\)

1. Do the operation(s) you want, but mod \(r\) instead of mod \(rs\).
2. Do the operation(s) you want, but mod \(s\) instead of mod \(rs\).
3. Those two results are the CRT encoding of the final answer, so convert them back to the normal representation.
    Example

Let’s take the example \(r=3359\) and \(s=2953\), which are relatively prime (so the CRT applies). Suppose we want to compute \(3141592+6535897 \% rs\). Doing it the usual way in Sage looks like this:
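(A sketch of the direct computation; the variable names are mine.)

```
r, s = 3359, 2953
print((3141592 + 6535897) % (r*s))   # 9677489
```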

     

Doing it in the CRT way looks like this:
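(A sketch following the three steps listed above.)

```
r, s = 3359, 2953
u = (3141592 % r + 6535897 % r) % r   # step 1: add mod r
v = (3141592 % s + 6535897 % s) % s   # step 2: add mod s
print(crt(u, v, r, s))                # step 3: recombine; 9677489
```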

     

    Both methods give the same answer!

    Application to RSA

    You might be wondering what the point of all of this is. \({ }^{7}\) The CRT method seems like a very indirect and wasteful way to compute anything. This impression might be true for simple operations like addition and single multiplications. However, the CRT method is faster for exponentiation \(\bmod N\), which is the main operation in RSA!

    Example

    In Sage, we can do basic exponentiation mod \(n\) as follows:
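(A sketch using Sage’s power_mod; the specific numbers are made up.)

```
x, e, n = 12345, 65537, 999999937
# power_mod(x, e, n) computes x^e mod n; Mod(x, n)^e is an equivalent way.
print(power_mod(x, e, n))
```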

     

If we are working over an RSA modulus and know its factorization \(p \times q\), then we can use the CRT method for exponentiation mod \(pq\) as follows. We simply do the exponentiation mod \(p\) and (separately) mod \(q\), then use the crt function to convert back to \(\mathbb{Z}_{pq}\).
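(A sketch of that recipe, with small placeholder values for \(p\), \(q\), \(x\), and \(e\).)

```
p, q = 3359, 2953
x, e = 2019, 65537

# Exponentiate separately mod p and mod q; u and v are Mod-objects.
u = Mod(x, p)^e
v = Mod(x, q)^e

# Recombine the CRT encoding (u, v) into an element of Z_{pq}.
print(crt(u.lift(), v.lift(), p, q))
```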

     

    We need to use \(u.lift()\) and \(v.lift()\) to convert \(u\) and \(v\) from Mod-objects into integers, because that is what crt expects.

    We can use both methods to perform an exponentiation, and measure how long it takes with the timeit function. In this example, \(N\) is about 2000 bits long, and the difference in speed is noticeable:
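(A sketch of such a comparison at the Sage prompt; the primes here are freshly generated, so exact timings will vary from run to run.)

```
p = random_prime(2^1000)
q = random_prime(2^1000)
N = p*q
x = randint(0, N-1)
e = randint(0, N-1)

timeit('power_mod(x, e, N)')                                 # standard way
timeit('crt(power_mod(x, e, p), power_mod(x, e, q), p, q)')  # CRT way
```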

     

    And just for good measure, we can check that both approaches give the same answer:
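(Continuing the same sketch.)

```
print(power_mod(x, e, N) == crt(power_mod(x, e, p), power_mod(x, e, q), p, q))   # True
```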

     

    To understand why the CRT method is faster, it’s important to know that the cost of standard modular exponentiation over a \(k\)-bit modulus is \(O\left(k^{3}\right)\). For simplicity, let’s pretend that exponentiation takes exactly \(k^{3}\) steps. Suppose \(p\) and \(q\) are each \(k\) bits long, so that the RSA modulus \(N\) is \(2 k\) bits long. Hence, a standard exponentiation mod \(N\) takes \((2 k)^{3}=8 k^{3}\) steps.

    With the CRT method, we do an exponentiation mod \(p\) and an exponentiation mod \(q\). Each of these exponentiations takes \(k^{3}\) steps, since \(p\) and \(q\) are only \(k\) bits long. Overall, we are only doing \(2 k^{3}\) steps in this approach, which is \(4 \times\) faster than the standard exponentiation \(\bmod N\). In this simple analysis, we are not counting the cost of converting the CRT encoding back to the typical mod- \(N\) representation. But this cost is much smaller than the cost of an exponentiation (both in practice and asymptotically).

    It’s worth pointing out that this speedup can only be done for RSA signing, and not verification. In order to take advantage of the CRT method to speed up exponentiation \(\bmod N\), it’s necessary to know the prime factors \(p\) and \(q\). Only the person who knows the signing key knows these factors.


\({ }^{6}\) Chinese text is from an old manuscript of Sunzi Suanjing, but my inability to speak the language prevents me from identifying the manuscript more precisely. English translation is from Joseph Needham, Science and Civilisation in China, vol. 3: Mathematics and Sciences of the Heavens and Earth, 1959.

    \({ }^{7}\) I’m talking about the CRT method for arithmetic mod \(r s\), not life in general.


