Training Optimal Models
Eng Teong Cheah
Microsoft MVP
Hyperparameter Tuning
Hyperparameter tuning is accomplished by training multiple models, using the same
algorithm and training data but different hyperparameter values. The resulting model
from each training run is then evaluated to determine the performance metric for which
you want to optimize (for example, accuracy), and the best-performing model is selected.
In Azure Machine Learning, you achieve this through an experiment that consists of
a hyperdrive run, which initiates a child run for each hyperparameter combination to be
tested. Each child run uses a training script with parameterized hyperparameter values to
train a model, and logs the target performance metric achieved by the trained model.
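As a rough sketch of how such an experiment is put together (assuming the Azure ML Python SDK v1, an existing Workspace object ws, an existing ScriptRunConfig named script_config, and illustrative parameter names), the hyperdrive run might be configured and submitted like this:

    from azureml.core import Experiment
    from azureml.train.hyperdrive import (HyperDriveConfig, RandomParameterSampling,
                                          PrimaryMetricGoal, choice)

    # Define values to sample for each child run (parameter names are illustrative)
    param_sampling = RandomParameterSampling({
        '--batch_size': choice(16, 32, 64),
        '--learning_rate': choice(0.01, 0.1, 1.0)
    })

    hyperdrive_config = HyperDriveConfig(
        run_config=script_config,                       # assumed existing ScriptRunConfig
        hyperparameter_sampling=param_sampling,
        primary_metric_name='Accuracy',                 # must match the metric the training script logs
        primary_metric_goal=PrimaryMetricGoal.MAXIMIZE,
        max_total_runs=20
    )

    # Submitting the config starts one child run per sampled combination
    experiment = Experiment(workspace=ws, name='hyperdrive-tuning')
    run = experiment.submit(config=hyperdrive_config)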
Defining a search space
The set of hyperparameter values tried during hyperparameter tuning is known as
the search space. The definition of the range of possible values that can be chosen
depends on the type of hyperparameter.
Discrete hyperparameters
Some hyperparameters require discrete values - in other words, you must select the value
from a particular set of possibilities. You can define a search space for a discrete
parameter using a choice from a list of explicit values, which you can define as a
Python list (choice([10,20,30])), a range (choice(range(1,10))), or an arbitrary set of
comma-separated values (choice(30,50,100)).
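For example (a minimal sketch assuming the choice expression from the Azure ML SDK v1, with illustrative values):

    from azureml.train.hyperdrive import choice

    # Three equivalent ways to express a discrete search space
    batch_size = choice([16, 32, 64])        # explicit Python list
    hidden_layers = choice(range(1, 10))     # range of integers 1-9
    learning_rate = choice(0.01, 0.1, 1.0)   # arbitrary comma-separated values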
Defining a search space
Continuous hyperparameters
Some hyperparameters are continuous - in other words, you can use any value along a
scale. To define a search space for these kinds of values, you can use any of the following
distribution types:
•normal
•uniform
•lognormal
•loguniform
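A brief sketch of each distribution type (assuming the Azure ML SDK v1 parameter expressions; the numeric arguments are illustrative):

    from azureml.train.hyperdrive import normal, uniform, lognormal, loguniform

    continuous_space = {
        '--learning_rate': normal(10, 3),      # normal distribution: mean 10, std dev 3
        '--dropout': uniform(0.05, 0.1),       # any value between 0.05 and 0.1
        '--momentum': lognormal(0, 1),         # value is exp(normal(0, 1))
        '--weight_decay': loguniform(-6, -1)   # value is exp(uniform(-6, -1))
    }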
Configuring sampling
The specific values used in a tuning run depend on the type of sampling used. Azure
Machine Learning supports grid sampling, which tries every possible combination of
discrete values; random sampling, which randomly selects values from the search space;
and Bayesian sampling, which chooses new values based on the results of previous runs.
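As a sketch (assuming the Azure ML SDK v1 sampling classes; the search space reuses the hypothetical parameters from earlier):

    from azureml.train.hyperdrive import (GridParameterSampling, RandomParameterSampling,
                                          BayesianParameterSampling, choice)

    search_space = {
        '--batch_size': choice(16, 32, 64),
        '--learning_rate': choice(0.01, 0.1, 1.0)
    }

    grid_sampling = GridParameterSampling(search_space)         # every combination (discrete values only)
    random_sampling = RandomParameterSampling(search_space)     # random picks; also supports continuous distributions
    bayesian_sampling = BayesianParameterSampling(search_space) # picks new values based on previous runs' results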
Configuring early termination
With a sufficiently large hyperparameter search space, it could take many iterations (child
runs) to try every possible combination. Typically, you set a maximum number of
iterations, but this can still leave a large number of runs that don't produce a better
model than a combination that has already been tried.
To help avoid wasting time, you can set an early termination policy that abandons runs
that are unlikely to produce a better result than previously completed runs. The policy is
evaluated at an evaluation_interval you specify, each time the target performance metric
is logged. You can also set a delay_evaluation parameter to avoid evaluating the policy
until a minimum number of iterations have been completed.
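For instance, a minimal sketch using a bandit policy (one of the early termination policies available in the Azure ML SDK v1; the slack and interval values are illustrative):

    from azureml.train.hyperdrive import BanditPolicy

    # Abandon runs whose best metric falls more than 20% (slack_factor=0.2) short of
    # the best run so far; evaluate every 2 metric-logging intervals, but only after
    # the first 5 intervals have completed
    early_termination_policy = BanditPolicy(
        slack_factor=0.2,
        evaluation_interval=2,
        delay_evaluation=5
    )

The policy is then passed to the HyperDriveConfig through its policy parameter.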
Tune Hyperparameters
https://ceteongvanness.wordpress.com/?p=441
References
Microsoft Docs