Tuning hyperparameters is a key concept in machine learning. Hyperparameter tuning can be performed automatically using techniques such as grid search, random search, and gradient-based optimization. This guide demonstrates how to tune hyperparameters manually by performing a few tests with WSO2 Machine Learner.
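As a point of comparison with the manual approach used in this guide, the following is a minimal sketch of automatic grid search in plain Python. The `evaluate` function is a hypothetical stand-in for a full train-and-validate cycle (it is not part of WSO2 Machine Learner); in practice it would train a model with the given hyperparameters and return a validation metric such as AUC.

```python
from itertools import product

def evaluate(learning_rate, iterations):
    # Hypothetical scoring surface standing in for a real train/validate
    # cycle; it peaks at learning_rate=0.1, iterations=500 (assumption).
    return 1.0 - abs(learning_rate - 0.1) - abs(iterations - 500) / 1000.0

def grid_search(learning_rates, iteration_counts):
    """Exhaustively evaluate every combination and keep the best score."""
    best_params, best_score = None, float("-inf")
    for lr, iters in product(learning_rates, iteration_counts):
        score = evaluate(lr, iters)
        if score > best_score:
            best_params, best_score = (lr, iters), score
    return best_params, best_score

params, score = grid_search([0.001, 0.01, 0.1, 1.0], [100, 500, 1000])
print(params)  # best (learning_rate, iterations) pair on the toy surface
```

Grid search simply automates the same sweep this guide performs by hand: try each candidate value, record the metric, and keep the best combination.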


Goals of this guide

This guide uses the well-known Pima Indians Diabetes dataset and the Logistic Regression with Mini-Batch Gradient Descent algorithm to perform the analysis. The hyperparameters of this algorithm are as follows:

  • Iterations: Number of times the optimizer runs before completing the optimization process.
  • Learning Rate: Step size of the optimization algorithm.
  • Regularization Type: Type of regularization. WSO2 Machine Learner supports L1 and L2 regularization.
  • Regularization Parameter: Controls the model complexity and therefore helps to prevent overfitting.
  • SGD Data Fraction: Fraction of the training dataset used in a single iteration of the optimization algorithm.
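To make each hyperparameter concrete, here is an illustrative pure-Python sketch of logistic regression trained with mini-batch gradient descent. The parameter names mirror the list above; this is an assumption-laden teaching example, not WSO2 Machine Learner's actual implementation.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic_sgd(X, y, iterations=100, learning_rate=0.1,
                       reg_param=0.01, sgd_data_fraction=0.5, seed=0):
    """Mini-batch gradient descent for logistic regression with L2
    regularization. Parameter names mirror the hyperparameter list above."""
    rng = random.Random(seed)
    n_features = len(X[0])
    w = [0.0] * n_features
    batch_size = max(1, int(len(X) * sgd_data_fraction))
    for _ in range(iterations):                        # Iterations
        batch = rng.sample(range(len(X)), batch_size)  # SGD Data Fraction
        grad = [0.0] * n_features
        for i in batch:
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, X[i]))) - y[i]
            for j in range(n_features):
                grad[j] += err * X[i][j]
        for j in range(n_features):
            # Learning Rate scales the step; reg_param penalizes large
            # weights (L2 regularization, limiting overfitting).
            w[j] -= learning_rate * (grad[j] / batch_size + reg_param * w[j])
    return w

# Tiny separable toy set: label 1 when the feature is positive.
X = [[1.0, x] for x in (-2.0, -1.0, 1.0, 2.0)]  # bias term + one feature
y = [0, 0, 1, 1]
w = train_logistic_sgd(X, y, iterations=200, learning_rate=0.5)
print(w[1] > 0)  # the learned feature weight should be positive
```

Each iteration samples a fraction of the data, computes the gradient on that mini-batch, and takes a step scaled by the learning rate, which is exactly why those two hyperparameters interact so strongly in the experiments that follow.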

This guide pursues the following goals for finding the optimal Learning Rate and Number of Iterations, while holding the other hyperparameters in the above list constant:

  • Finding the optimal Learning Rate and Number of Iterations that maximize the Area Under Curve (AUC). For more information on the area under the ROC curve, see What you wanted to know about AUC.

  • Finding the relationship between Learning Rate and AUC.

  • Finding the relationship between the Number of Iterations and AUC.
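Since all three goals measure model quality with AUC, it may help to see how that metric is computed. The sketch below uses the rank-based (Mann-Whitney) formulation: AUC is the probability that a randomly chosen positive example receives a higher score than a randomly chosen negative one. This is a generic illustration, not WSO2 Machine Learner's internal code.

```python
def auc(scores, labels):
    """Area under the ROC curve via the rank-based (Mann-Whitney) method:
    the probability that a random positive outscores a random negative,
    counting ties as half a win."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Perfect separation yields AUC = 1.0; random guessing hovers near 0.5.
print(auc([0.9, 0.8, 0.2, 0.1], [1, 1, 0, 0]))  # 1.0
```

Repeating this computation for each candidate Learning Rate or Number of Iterations, and plotting the resulting AUC values, is the essence of the manual sweep performed in the rest of this guide.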