
16.1: Curve Fitting


    A data set is, by definition, a discrete set of points. We can plot the points, but we often want to see trends, interpolate new points, and (carefully!) extrapolate for prediction. Linear regression is a powerful tool, although sometimes the data are better fit by another curve. Here we show how to do linear and quadratic regression.

    Example 16.1.1

    Consider the following data set from Example 15.1.1 in the previous chapter.

    \[F:=\begin{pmatrix}
    2\\
    4\\
    6\\
    8\\
    10\\
    12\\
    14\\
    16\\
    18\\
    20
    \end{pmatrix}\,\mathrm{kN}\nonumber\]

    \[δ:=\begin{pmatrix}
    0.82\\
    1.47\\
    2.05\\
    3.37\\
    3.75\\
    4.17\\
    5.25\\
    5.44\\
    6.62\\
    7.97
    \end{pmatrix}\,\mathrm{mm}\nonumber\]

    [Figure: scatter plot of the deflection δ versus the force F]

    Let’s find the least squares line. We consider two methods to do this.

    Method 1

    [Mathcad worksheet: Method 1 computation of the least-squares line]
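The worksheet above is in Mathcad. As an illustrative cross-check of Method 1, the same least-squares line can be reproduced in Python with NumPy (a sketch, not the book's worksheet; the array names are ours, and the units kN and mm are stripped):

```python
import numpy as np

# Data from Example 16.1.1: force F in kN, deflection delta in mm
F = np.array([2, 4, 6, 8, 10, 12, 14, 16, 18, 20], dtype=float)
delta = np.array([0.82, 1.47, 2.05, 3.37, 3.75, 4.17,
                  5.25, 5.44, 6.62, 7.97])

# Least-squares line delta ≈ m*F + b (degree-1 polynomial fit)
m, b = np.polyfit(F, delta, 1)
print(f"slope m = {m:.4f} mm/kN, intercept b = {b:.4f} mm")
```

For this data the fit comes out to roughly m ≈ 0.374 mm/kN with an intercept near zero, which matches the visual impression of a nearly proportional load–deflection relationship.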

    Method 2

    This method uses the command linfit. It is a bit awkward here, but it will be useful when we do higher-order regression. We first have to remove the units from our variables (a limitation of linfit).

    [Mathcad worksheet: Method 2 computation of the least-squares line using linfit]
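Mathcad's linfit fits the data with a linear combination of user-supplied basis functions. The same idea, sketched in NumPy, is a least-squares solve against a design matrix whose columns are the basis functions evaluated at the data; for a straight line the basis is (1, x). This is illustrative only, with unitless arrays named by us:

```python
import numpy as np

F = np.array([2, 4, 6, 8, 10, 12, 14, 16, 18, 20], dtype=float)
delta = np.array([0.82, 1.47, 2.05, 3.37, 3.75, 4.17,
                  5.25, 5.44, 6.62, 7.97])

# Design matrix G: one column per basis function, evaluated at each F
G = np.column_stack([np.ones_like(F), F])           # basis (1, x)
coeffs, *_ = np.linalg.lstsq(G, delta, rcond=None)  # [intercept, slope]
```

The payoff of this formulation is that switching to a higher-order fit only means adding columns to the design matrix, which is exactly why linfit becomes convenient in the next example.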

    Now put them together on the same plot:

    [Figure: the data and the least-squares line plotted together]

    Example 16.1.2

    Consider the following data set:

    \[Time:=\begin{pmatrix}
    0\\
    1\\
    2\\
    3\\
    4\\
    5\\
    6
    \end{pmatrix}\nonumber\]

    \[Distance2:=\begin{pmatrix}
    5\\
    19.8\\
    76.8\\
    153.3\\
    256.2\\
    394.5\\
    559.2
    \end{pmatrix}\nonumber\]

    Plotted:

    [Figure: scatter plot of Distance2 versus Time]

    The plot of the data suggests a quadratic fit, i.e. a curve \(y = a + bx + cx^2\).

    How do we find this quadratic curve?
    Here’s how we implement this in Mathcad:

    [Mathcad worksheet: quadratic fit of Distance2 versus Time using linfit]
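As an illustrative stand-in for the Mathcad worksheet, the quadratic fit can be sketched in NumPy with the basis (1, x, x²), the same structure linfit uses (array names are ours):

```python
import numpy as np

Time = np.arange(7, dtype=float)                    # 0, 1, ..., 6
Distance2 = np.array([5, 19.8, 76.8, 153.3, 256.2, 394.5, 559.2])

# Quadratic basis: columns 1, x, x^2
G = np.column_stack([np.ones_like(Time), Time, Time**2])
a, b, c = np.linalg.lstsq(G, Distance2, rcond=None)[0]

# Fitted curve y = a + b*x + c*x^2 evaluated at the data points
Qpredicted = a + b*Time + c*Time**2
```

For this data the leading coefficient comes out to roughly c ≈ 14.4, consistent with the strongly curved shape of the scatter plot.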

    Now plot them together:

    [Figure: the data and the quadratic fit plotted together]

    We can compute the R-squared value to see the correlation between Distance2 and Qpredicted.

    [Mathcad worksheet: computation of the R-squared value RR]

    A value of RR equal to 1 would be a perfect fit, so a value this close to 1 indicates an excellent fit.
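The R-squared computation itself can be sketched two equivalent ways: as the squared correlation coefficient between the data and the prediction, or as 1 minus the ratio of residual to total sum of squares (the two agree for a least-squares fit that includes an intercept). An illustrative NumPy version, with our own variable names:

```python
import numpy as np

Time = np.arange(7, dtype=float)
Distance2 = np.array([5, 19.8, 76.8, 153.3, 256.2, 394.5, 559.2])

# Refit the quadratic (np.polyfit returns coefficients highest degree first)
c2, c1, c0 = np.polyfit(Time, Distance2, 2)
Qpredicted = c0 + c1*Time + c2*Time**2

# R^2 as the squared correlation between data and prediction
RR = np.corrcoef(Distance2, Qpredicted)[0, 1] ** 2

# Equivalent: R^2 = 1 - SS_res / SS_tot
ss_res = np.sum((Distance2 - Qpredicted) ** 2)
ss_tot = np.sum((Distance2 - Distance2.mean()) ** 2)
RR_alt = 1 - ss_res / ss_tot
```

For this fit, RR comes out just under 1 (about 0.9998), so Mathcad's display rounds it to 1.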


    This page titled 16.1: Curve Fitting is shared under a CC BY-NC 3.0 license and was authored, remixed, and/or curated by Troy Siemers (APEX Calculus) via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.