import numpy as np
from math import inf
from spotpython.fun.objectivefunctions import Analytical
from spotpython.spot import spot
10 Using sklearn Surrogates in spotpython
Besides the internal Kriging surrogate, which is used as a default by spotpython, any surrogate model from scikit-learn can be used as a surrogate in spotpython. This chapter explains how to use scikit-learn surrogates in spotpython.
10.1 Example: Branin Function with spotpython’s Internal Kriging Surrogate
10.1.1 The Objective Function Branin
The spotpython package provides several classes of objective functions. We will use an analytical objective function, i.e., a function that can be described by a (closed) formula. Here we will use the Branin function:
\[
y = a (x_2 - b x_1^2 + c x_1 - r)^2 + s (1 - t) \cos(x_1) + s,
\]
where \(a = 1\), \(b = 5.1/(4\pi^2)\), \(c = 5/\pi\), \(r = 6\), \(s = 10\), and \(t = 1/(8\pi)\).
It has three global minima: \(f(x) = 0.397887\) at \((-\pi, 12.275)\), \((\pi, 2.275)\), and \((9.42478, 2.475)\).
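To make the formula concrete, here is a minimal stand-alone NumPy sketch of the Branin function; the helper branin below is illustrative and not a spotpython function:

import numpy as np

def branin(x1, x2):
    # Branin constants as defined above
    a = 1
    b = 5.1 / (4 * np.pi**2)
    c = 5 / np.pi
    r = 6
    s = 10
    t = 1 / (8 * np.pi)
    return a * (x2 - b * x1**2 + c * x1 - r) ** 2 + s * (1 - t) * np.cos(x1) + s

# Evaluating at one of the three global minima yields approximately 0.397887
print(branin(np.pi, 2.275))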
from spotpython.fun.objectivefunctions import Analytical
fun = Analytical().fun_branin
Similar to the one-dimensional case, which was introduced in Section 7.5, we can use TensorBoard to monitor the progress of the optimization. We will use the same code, only the prefix is different:
from spotpython.utils.init import fun_control_init, design_control_init
= "04"
PREFIX = fun_control_init(
fun_control =PREFIX,
PREFIX= np.array([-5,-0]),
lower = np.array([10,15]),
upper =20,
fun_evals=inf)
max_time
= design_control_init(
design_control =10) init_size
10.1.2 Running the surrogate model based optimizer Spot
spot_2 = spot.Spot(fun=fun,
                   fun_control=fun_control,
                   design_control=design_control)
spot_2.run()
spotpython tuning: 3.8004662117718677 [######----] 55.00%
spotpython tuning: 3.8004662117718677 [######----] 60.00%
spotpython tuning: 3.159024883515257 [######----] 65.00%
spotpython tuning: 3.133916697143885 [#######---] 70.00%
spotpython tuning: 2.8926749183116236 [########--] 75.00%
spotpython tuning: 0.4190219407803557 [########--] 80.00%
spotpython tuning: 0.401871440801683 [########--] 85.00%
spotpython tuning: 0.39926034519166187 [#########-] 90.00%
spotpython tuning: 0.39926034519166187 [##########] 95.00%
spotpython tuning: 0.39926034519166187 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x156a2e7e0>
10.1.3 TensorBoard
Now we can start TensorBoard in the background with the following command:
tensorboard --logdir="./runs"
We can access the TensorBoard web server with the following URL:
http://localhost:6006/
The TensorBoard plot illustrates how spotpython can be used as a microscope for the internal mechanisms of the surrogate-based optimization process. Here, one important parameter, the learning rate \(\theta\) of the Kriging surrogate, is plotted against the number of optimization steps.
10.1.4 Print the Results
spot_2.print_results()
min y: 0.39926034519166187
x0: 3.1509546500431656
x1: 2.298567899278217
[['x0', np.float64(3.1509546500431656)], ['x1', np.float64(2.298567899278217)]]
10.1.5 Show the Progress and the Surrogate
spot_2.plot_progress(log_y=True)
spot_2.surrogate.plot()
10.2 Example: Using Surrogates From scikit-learn
- Default is the spotpython (i.e., the internal) Kriging surrogate.
- It can be called explicitly and passed to Spot.
from spotpython.build.kriging import Kriging
S_0 = Kriging(name='kriging', seed=123)
- Alternatively, models from scikit-learn can be selected, e.g., Gaussian Process, RBFs, Regression Trees, etc.
# Needed for the sklearn surrogates:
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn import linear_model
from sklearn import tree
import pandas as pd
- Here are some additional models that might be useful later:
S_Tree = DecisionTreeRegressor(random_state=0)
S_LM = linear_model.LinearRegression()
S_Ridge = linear_model.Ridge()
S_RF = RandomForestRegressor(max_depth=2, random_state=0)
10.2.1 GaussianProcessRegressor as a Surrogate
- To use a Gaussian Process model from sklearn that is similar to spotpython’s Kriging, we can proceed as follows:
kernel = 1 * RBF(length_scale=1.0, length_scale_bounds=(1e-2, 1e2))
S_GP = GaussianProcessRegressor(kernel=kernel, n_restarts_optimizer=9)
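As a quick sanity check, which is not part of the original chapter code, the sklearn surrogate can be fitted and queried directly; Spot only needs the usual fit/predict interface. The training points below are illustrative samples of the sphere function:

import numpy as np

X_train = np.array([[-1.0], [-0.5], [0.0], [0.5], [1.0]])
y_train = (X_train ** 2).ravel()

S_GP.fit(X_train, y_train)
# GaussianProcessRegressor can also return the predictive standard deviation,
# which surrogate-based optimizers use to balance exploration and exploitation
y_pred, y_std = S_GP.predict(np.array([[0.25]]), return_std=True)
print(y_pred, y_std)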
The scikit-learn GP model S_GP is selected for Spot as follows: surrogate = S_GP.
We can check the kind of surrogate model with the command isinstance:
isinstance(S_GP, GaussianProcessRegressor)
True
isinstance(S_0, Kriging)
True
- Similar to the Spot run with the internal Kriging model, we can call the run with the scikit-learn surrogate:
fun = Analytical(seed=123).fun_branin
spot_2_GP = spot.Spot(fun=fun,
                      fun_control=fun_control,
                      design_control=design_control,
                      surrogate=S_GP)
spot_2_GP.run()
spotpython tuning: 18.865129821249617 [######----] 55.00%
spotpython tuning: 4.066961682805861 [######----] 60.00%
spotpython tuning: 3.4619112320780285 [######----] 65.00%
spotpython tuning: 3.4619112320780285 [#######---] 70.00%
spotpython tuning: 1.3283123221495199 [########--] 75.00%
spotpython tuning: 0.9548698218896146 [########--] 80.00%
spotpython tuning: 0.9356616728510581 [########--] 85.00%
spotpython tuning: 0.39968125707661706 [#########-] 90.00%
spotpython tuning: 0.3983050744842078 [##########] 95.00%
spotpython tuning: 0.39821610604643354 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x15909a4b0>
spot_2_GP.plot_progress()
spot_2_GP.print_results()
min y: 0.39821610604643354
x0: 3.1496411777654334
x1: 2.272943969041002
[['x0', np.float64(3.1496411777654334)], ['x1', np.float64(2.272943969041002)]]
10.3 Example: One-dimensional Sphere Function With spotpython’s Kriging
- In this example, we will use a one-dimensional function, which allows us to visualize the optimization process. show_models=True is added to the argument list.
from spotpython.fun.objectivefunctions import Analytical
fun_control = fun_control_init(
    lower=np.array([-1]),
    upper=np.array([1]),
    fun_evals=10,
    max_time=inf,
    show_models=True,
    tolerance_x=np.sqrt(np.spacing(1)))
fun = Analytical(seed=123).fun_sphere
design_control = design_control_init(
    init_size=3)
spot_1 = spot.Spot(fun=fun,
                   fun_control=fun_control,
                   design_control=design_control)
spot_1.run()
spotpython tuning: 0.03475493366922229 [####------] 40.00%
spotpython tuning: 0.03475493366922229 [#####-----] 50.00%
spotpython tuning: 0.014602288560505551 [######----] 60.00%
spotpython tuning: 0.00020552455663660785 [#######---] 70.00%
spotpython tuning: 5.673799497313666e-08 [########--] 80.00%
spotpython tuning: 5.673799497313666e-08 [#########-] 90.00%
spotpython tuning: 5.673799497313666e-08 [##########] 100.00% Done...
10.3.1 Results
spot_1.print_results()
min y: 5.673799497313666e-08
x0: -0.00023819738657914922
[['x0', np.float64(-0.00023819738657914922)]]
spot_1.plot_progress(log_y=True)
- The method plot_model plots the final surrogate:
spot_1.plot_model()
10.4 Example: Sklearn Model GaussianProcess
- This example visualizes the search process on the GaussianProcessRegressor surrogate from sklearn.
- Therefore, surrogate = S_GP is added to the argument list.
fun = Analytical(seed=123).fun_sphere
spot_1_GP = spot.Spot(fun=fun,
                      fun_control=fun_control,
                      design_control=design_control,
                      surrogate=S_GP)
spot_1_GP.run()
spotpython tuning: 0.004925671418704527 [####------] 40.00%
spotpython tuning: 0.002612062398164981 [#####-----] 50.00%
spotpython tuning: 5.609944300870913e-07 [######----] 60.00%
spotpython tuning: 3.399776625316493e-08 [#######---] 70.00%
spotpython tuning: 2.8303204876737398e-08 [########--] 80.00%
spotpython tuning: 2.8303204876737398e-08 [#########-] 90.00%
spotpython tuning: 2.2894458385368016e-08 [##########] 100.00% Done...
spot_1_GP.print_results()
min y: 2.2894458385368016e-08
x0: 0.0001513091483862361
[['x0', np.float64(0.0001513091483862361)]]
spot_1_GP.plot_progress(log_y=True)
spot_1_GP.plot_model()
10.5 Exercises
10.5.1 1. A decision tree regressor: DecisionTreeRegressor
- Describe the surrogate model. Use the information from the scikit-learn documentation.
- Use the surrogate as the model for optimization.
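A minimal sketch for this exercise, assuming fun, fun_control, and design_control from Section 10.1 are still in scope; the variable names S_Tree and spot_tree are illustrative. The same pattern applies to Exercises 2 to 5 with the respective regressor:

from sklearn.tree import DecisionTreeRegressor
from spotpython.spot import spot

# Plug the sklearn regressor into Spot exactly like the GP surrogate above
S_Tree = DecisionTreeRegressor(random_state=0)
spot_tree = spot.Spot(fun=fun,
                      fun_control=fun_control,
                      design_control=design_control,
                      surrogate=S_Tree)
spot_tree.run()
spot_tree.print_results()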
10.5.2 2. A random forest regressor: RandomForestRegressor
- Describe the surrogate model. Use the information from the scikit-learn documentation.
- Use the surrogate as the model for optimization.
10.5.3 3. Ordinary least squares Linear Regression: LinearRegression
- Describe the surrogate model. Use the information from the scikit-learn documentation.
- Use the surrogate as the model for optimization.
10.5.4 4. Linear least squares with l2 regularization: Ridge
- Describe the surrogate model. Use the information from the scikit-learn documentation.
- Use the surrogate as the model for optimization.
10.5.5 5. Gradient Boosting: HistGradientBoostingRegressor
- Describe the surrogate model. Use the information from the scikit-learn documentation.
- Use the surrogate as the model for optimization.
10.5.6 6. Comparison of Surrogates
Use the following two objective functions:
- the 1-dim sphere function fun_sphere and
- the two-dim Branin function fun_branin
for a comparison of the performance of the five different surrogates:
- spotpython’s internal Kriging
- DecisionTreeRegressor
- RandomForestRegressor
- linear_model.LinearRegression
- linear_model.Ridge
Generate a table with the results (number of function evaluations, best function value, and best parameter vector) for each surrogate and each function as shown in Table 10.1.
| surrogate | fun | fun_evals | max_time | x_0 | min_y | Comments |
|---|---|---|---|---|---|---|
| Kriging | fun_sphere | 10 | inf | | | |
| Kriging | fun_branin | 10 | inf | | | |
| DecisionTreeRegressor | fun_sphere | 10 | inf | | | |
| … | … | … | … | | | |
| Ridge | fun_branin | 10 | inf | | | |

Table 10.1: Template for comparing the five surrogates on the two objective functions.
- Discuss the results. Which surrogate is the best for which function? Why?
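A possible skeleton for this comparison, assuming the one-dimensional fun_control and design_control from Section 10.3 and the surrogates S_0, S_Tree, S_RF, S_LM, and S_Ridge defined earlier in this chapter. The loop, the attribute names min_y and min_X (which mirror the print_results output), and the resulting DataFrame are illustrative and should be checked against the installed spotpython version:

import pandas as pd

surrogates = {
    "Kriging": S_0,
    "DecisionTreeRegressor": S_Tree,
    "RandomForestRegressor": S_RF,
    "LinearRegression": S_LM,
    "Ridge": S_Ridge,
}
rows = []
for name, S in surrogates.items():
    # The internal Kriging object is passed the same way as a sklearn model
    s = spot.Spot(fun=Analytical(seed=123).fun_sphere,
                  fun_control=fun_control,
                  design_control=design_control,
                  surrogate=S)
    s.run()
    rows.append([name, "fun_sphere", s.min_y, s.min_X])
print(pd.DataFrame(rows, columns=["surrogate", "fun", "min_y", "x_0"]))

Repeat the loop with fun_branin and the two-dimensional controls from Section 10.1 to fill the second half of Table 10.1.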
10.6 Selected Solutions
10.6.1 Solution to Exercise Section 10.5.5: Gradient Boosting
10.6.1.1 Branin: Using SPOT
import numpy as np
from math import inf
from spotpython.fun.objectivefunctions import Analytical
from spotpython.utils.init import fun_control_init, design_control_init
from spotpython.spot import spot
- The Objective Function Branin
fun = Analytical().fun_branin
PREFIX = "BRANIN"
fun_control = fun_control_init(
    PREFIX=PREFIX,
    lower=np.array([-5, 0]),
    upper=np.array([10, 15]),
    fun_evals=20,
    max_time=inf)

design_control = design_control_init(
    init_size=10)
- Running the surrogate model based optimizer Spot:
spot_2 = spot.Spot(fun=fun,
                   fun_control=fun_control,
                   design_control=design_control)
spot_2.run()
spotpython tuning: 3.1468376213815015 [######----] 55.00%
spotpython tuning: 3.1468376213815015 [######----] 60.00%
spotpython tuning: 3.1468376213815015 [######----] 65.00%
spotpython tuning: 3.1468376213815015 [#######---] 70.00%
spotpython tuning: 1.1460879999819689 [########--] 75.00%
spotpython tuning: 1.0254127943018325 [########--] 80.00%
spotpython tuning: 0.42994831006071443 [########--] 85.00%
spotpython tuning: 0.4020917650024458 [#########-] 90.00%
spotpython tuning: 0.3992153710593467 [##########] 95.00%
spotpython tuning: 0.3992153710593467 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x15c49dca0>
- Print the results
spot_2.print_results()
min y: 0.3992153710593467
x0: 3.1555383337491234
x1: 2.2840066834425232
[['x0', np.float64(3.1555383337491234)],
['x1', np.float64(2.2840066834425232)]]
- Show the optimization progress:
spot_2.plot_progress(log_y=True)
- Generate a surrogate model plot:
spot_2.surrogate.plot()
10.6.1.2 Branin: Using Surrogates From scikit-learn
- The HistGradientBoostingRegressor model from scikit-learn is selected:
# Needed for the sklearn surrogates:
from sklearn.ensemble import HistGradientBoostingRegressor
import pandas as pd
S_XGB = HistGradientBoostingRegressor()
- The scikit-learn XGB model S_XGB is selected for Spot as follows: surrogate = S_XGB.
- Similar to the Spot run with the internal Kriging model, we can call the run with the scikit-learn surrogate:
fun = Analytical(seed=123).fun_branin
spot_2_XGB = spot.Spot(fun=fun,
                       fun_control=fun_control,
                       design_control=design_control,
                       surrogate=S_XGB)
spot_2_XGB.run()
spotpython tuning: 30.69410528614059 [######----] 55.00%
spotpython tuning: 30.69410528614059 [######----] 60.00%
spotpython tuning: 30.69410528614059 [######----] 65.00%
spotpython tuning: 30.69410528614059 [#######---] 70.00%
spotpython tuning: 1.3263745845108854 [########--] 75.00%
spotpython tuning: 1.3263745845108854 [########--] 80.00%
spotpython tuning: 1.3263745845108854 [########--] 85.00%
spotpython tuning: 1.3263745845108854 [#########-] 90.00%
spotpython tuning: 1.3263745845108854 [##########] 95.00%
spotpython tuning: 1.3263745845108854 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x15c72dc10>
- Print the Results
spot_2_XGB.print_results()
min y: 1.3263745845108854
x0: -2.872730773493426
x1: 10.874313833535739
[['x0', np.float64(-2.872730773493426)],
['x1', np.float64(10.874313833535739)]]
- Show the Progress
spot_2_XGB.plot_progress(log_y=True)
- Since the sklearn model does not provide a plot method, we cannot generate a surrogate model plot directly; a manual alternative is sketched below.
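Although the built-in plot is missing, the fitted surrogate can still be visualized by predicting on a grid. A minimal matplotlib sketch, assuming spot_2_XGB.surrogate holds the fitted sklearn model and using the Branin domain from above; grid resolution and plotting choices are illustrative:

import numpy as np
import matplotlib.pyplot as plt

# Evaluate the fitted surrogate on a grid over the Branin search space
x0 = np.linspace(-5, 10, 100)
x1 = np.linspace(0, 15, 100)
X0, X1 = np.meshgrid(x0, x1)
X_grid = np.column_stack([X0.ravel(), X1.ravel()])
Y_grid = spot_2_XGB.surrogate.predict(X_grid).reshape(X0.shape)

plt.contourf(X0, X1, Y_grid, levels=30)
plt.colorbar(label="surrogate prediction")
plt.xlabel("x0")
plt.ylabel("x1")
plt.show()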
10.6.1.3 One-dimensional Sphere Function With spotpython’s Kriging
- In this example, we will use a one-dimensional function, which allows us to visualize the optimization process. show_models=True is added to the argument list.
from spotpython.fun.objectivefunctions import Analytical
fun_control = fun_control_init(
    lower=np.array([-1]),
    upper=np.array([1]),
    fun_evals=10,
    max_time=inf,
    show_models=True,
    tolerance_x=np.sqrt(np.spacing(1)))
fun = Analytical(seed=123).fun_sphere
design_control = design_control_init(
    init_size=3)
spot_1 = spot.Spot(fun=fun,
                   fun_control=fun_control,
                   design_control=design_control)
spot_1.run()
spotpython tuning: 0.03475493366922229 [####------] 40.00%
spotpython tuning: 0.03475493366922229 [#####-----] 50.00%
spotpython tuning: 0.014602288560505551 [######----] 60.00%
spotpython tuning: 0.00020552455663660785 [#######---] 70.00%
spotpython tuning: 5.673799497313666e-08 [########--] 80.00%
spotpython tuning: 5.673799497313666e-08 [#########-] 90.00%
spotpython tuning: 5.673799497313666e-08 [##########] 100.00% Done...
- Print the Results
spot_1.print_results()
min y: 5.673799497313666e-08
x0: -0.00023819738657914922
[['x0', np.float64(-0.00023819738657914922)]]
- Show the Progress
spot_1.plot_progress(log_y=True)
- The method plot_model plots the final surrogate:
spot_1.plot_model()
10.6.1.4 One-dimensional Sphere Function With Sklearn Model HistGradientBoostingRegressor
- This example visualizes the search process on the HistGradientBoostingRegressor surrogate from sklearn.
- Therefore, surrogate = S_XGB is added to the argument list.
fun_control = fun_control_init(
    lower=np.array([-1]),
    upper=np.array([1]),
    fun_evals=10,
    max_time=inf,
    show_models=True,
    tolerance_x=np.sqrt(np.spacing(1)))
fun = Analytical(seed=123).fun_sphere
design_control = design_control_init(
    init_size=3)
spot_1_XGB = spot.Spot(fun=fun,
                       fun_control=fun_control,
                       design_control=design_control,
                       surrogate=S_XGB)
spot_1_XGB.run()
spotpython tuning: 0.03475493366922229 [####------] 40.00%
spotpython tuning: 0.03475493366922229 [#####-----] 50.00%
spotpython tuning: 0.03475493366922229 [######----] 60.00%
spotpython tuning: 0.03475493366922229 [#######---] 70.00%
spotpython tuning: 0.008730885505764131 [########--] 80.00%
spotpython tuning: 0.008730885505764131 [#########-] 90.00%
spotpython tuning: 0.008730885505764131 [##########] 100.00% Done...
spot_1_XGB.print_results()
min y: 0.008730885505764131
x0: 0.09343920754032609
[['x0', np.float64(0.09343920754032609)]]
spot_1_XGB.plot_progress(log_y=True)
spot_1_XGB.plot_model()
10.7 Jupyter Notebook
- The Jupyter-Notebook of this lecture is available on GitHub in the Hyperparameter-Tuning-Cookbook Repository