
11.5: Scikit-learn (sklearn) - Supervised and Unsupervised Learning


    Introduction to Scikit-learn

    Scikit-learn (often abbreviated as sklearn) is a free software machine learning library for the Python programming language. It's built on NumPy, SciPy, and Matplotlib, and provides a wide range of supervised and unsupervised learning algorithms. Scikit-learn is known for its simplicity, efficiency, and ease of use, making it a popular choice for both beginners and experienced machine learning practitioners. The scikit-learn documentation provides a comprehensive description of this library.

Jupyter Notebook with sklearn Examples

    Key Features

    Scikit-learn offers a comprehensive set of tools for various machine learning tasks, including:

    • Classification: Identifying which category an item belongs to (e.g., spam detection, image recognition).
    • Regression: Predicting a continuous value (e.g., predicting house prices, stock trends).
    • Clustering: Grouping similar items together (e.g., customer segmentation, document analysis).
    • Dimensionality Reduction: Reducing the number of features in a dataset while preserving important information (e.g., principal component analysis).
    • Model Selection: Comparing, validating, and choosing parameters and models (e.g., cross-validation, grid search).
    • Preprocessing: Preparing data for machine learning algorithms (e.g., scaling, imputation).
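
    All of these tasks share the same estimator interface (fit, transform, predict), so several of them can be illustrated together. The short sketch below uses a small synthetic dataset, made up purely for illustration and not drawn from this chapter, to show preprocessing with StandardScaler, clustering with KMeans, and dimensionality reduction with PCA.

    # A minimal sketch of the unsupervised tools listed above:
    # preprocessing (StandardScaler), clustering (KMeans), and
    # dimensionality reduction (PCA). The data is synthetic and
    # chosen only to illustrate the API.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    # Two loose "blobs" of 3-feature samples (illustrative data only)
    X = np.vstack([
        rng.normal(loc=0.0, scale=1.0, size=(50, 3)),
        rng.normal(loc=5.0, scale=1.0, size=(50, 3)),
    ])

    # Preprocessing: standardize each feature to zero mean, unit variance
    X_scaled = StandardScaler().fit_transform(X)

    # Clustering: group the samples into two clusters
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_scaled)

    # Dimensionality reduction: project the 3 features onto 2 principal components
    X_2d = PCA(n_components=2).fit_transform(X_scaled)

    print(labels[:10])   # cluster assignment of the first 10 samples
    print(X_2d.shape)    # (100, 2)

    Because every estimator follows the same fit/transform or fit/predict pattern, swapping in a different algorithm (for example, DBSCAN in place of KMeans) usually requires changing only the line that constructs the estimator.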

    Example

    Scikit-learn Engineering Analysis Examples

    Scikit-learn (sklearn) is a powerful Python library for machine learning, widely used in various fields, including engineering analysis. Here is an example of how to perform a basic engineering analysis task, such as predicting material properties, using sklearn.

    Example: Predicting Material Hardness

    This example demonstrates predicting a material's hardness based on its composition using a linear regression model.

    Explanation:

    • Data Preparation: A sample dataset representing material composition and hardness is created. In a real-world scenario, this would be loaded from a file (e.g., CSV, Excel).
    • Feature and Target Definition: The independent variables (material composition) are defined as X, and the dependent variable (hardness) as y.
    • Data Splitting: The data is split into training and testing sets to evaluate the model's generalization performance on unseen data.
    • Model Training: A LinearRegression model from sklearn.linear_model is initialized and trained using the fit() method on the training data.
    • Prediction: The trained model predicts hardness values for the test set using the predict() method.
    • Model Evaluation: The mean squared error (MSE), computed with mean_squared_error, and its square root (RMSE) quantify the difference between predicted and actual hardness values.
    • New Predictions: The trained model can then be used to predict the hardness of new, unobserved material compositions.

    This example demonstrates a fundamental application of sklearn in engineering analysis. More complex scenarios might involve feature engineering, different machine learning algorithms (e.g., Support Vector Machines, Random Forests), and more elaborate evaluation metrics; a short Random Forest sketch follows the example output below.

    %pip install scikit-learn
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_squared_error
    import numpy as np
    
    # 1. Create a sample dataset (replace with your actual engineering data)
    data = {
        'Carbon_Content': [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0],
        'Manganese_Content': [0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.1, 1.2, 1.3, 1.4],
        'Hardness_HV': [150, 180, 210, 240, 270, 300, 330, 360, 390, 420]
    }
    df = pd.DataFrame(data)
    
    # 2. Define features (X) and target (y)
    X = df[['Carbon_Content', 'Manganese_Content']]
    y = df['Hardness_HV']
    
    # 3. Split data into training and testing sets
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
    
    # 4. Initialize and train the Linear Regression model
    model = LinearRegression()
    model.fit(X_train, y_train)
    
    # 5. Make predictions on the test set
    y_pred = model.predict(X_test)
    
    # 6. Evaluate the model's performance
    mse = mean_squared_error(y_test, y_pred)
    rmse = np.sqrt(mse)
    
    print("-" * 30)
    print(f"Mean Squared Error: {mse:0.4e}")
    print(f"Root Mean Squared Error: {rmse:0.4e}")
    
    # 7. Use the trained model for new predictions
    new_material_composition = pd.DataFrame([[0.55, 0.95]], columns=['Carbon_Content', 'Manganese_Content'])
    predicted_hardness = model.predict(new_material_composition)
    print(f"Predicted Hardness for new material: {predicted_hardness[0]:.2f} HV")
    ------------------------------
    Mean Squared Error: 1.6156e-27
    Root Mean Squared Error: 4.0194e-14
    Predicted Hardness for new material: 285.00 HV
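
    As a sketch of the "more complex scenarios" mentioned above, the same hardness data could also be modeled with a Random Forest and scored with cross-validation. The snippet below is illustrative only: it reuses the features X, the target y, and new_material_composition defined in the linear regression example, and the names rf_model, cv_scores, and rf_prediction are introduced here purely for clarity.

    # Illustrative variant: Random Forest regression with cross-validation,
    # reusing X, y, and new_material_composition from the example above.
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rf_model = RandomForestRegressor(n_estimators=100, random_state=42)

    # 5-fold cross-validation; scores are negative RMSE by sklearn convention
    cv_scores = cross_val_score(rf_model, X, y, cv=5,
                                scoring='neg_root_mean_squared_error')
    print(f"Cross-validated RMSE: {-cv_scores.mean():.2f} HV")

    # Fit on the full dataset and predict the same new composition as before
    rf_model.fit(X, y)
    rf_prediction = rf_model.predict(new_material_composition)
    print(f"Random Forest predicted hardness: {rf_prediction[0]:.2f} HV")

    With only ten samples, these scores illustrate the workflow rather than a meaningful benchmark; a tree ensemble generally needs considerably more data before it outperforms a simple linear fit.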
    

    11.5: Scikit-learn (sklearn) - Supervised and Unsupervised Learning is shared under a CC BY-SA license and was authored, remixed, and/or curated by LibreTexts.