10  Using sklearn Surrogates in spotpython

Besides the internal Kriging surrogate, which spotpython uses by default, any surrogate model from scikit-learn can be used in spotpython. This chapter explains how to use scikit-learn surrogates in spotpython.

import numpy as np
from math import inf
from spotpython.fun.objectivefunctions import Analytical
from spotpython.spot import spot

10.1 Example: Branin Function with spotpython’s Internal Kriging Surrogate

10.1.1 The Objective Function Branin

  • The spotpython package provides several classes of objective functions.

  • We will use an analytical objective function, i.e., a function that can be described by a (closed) formula.

  • Here we will use the Branin function:

      y = a * (x2 - b * x1**2 + c * x1 - r) ** 2 + s * (1 - t) * np.cos(x1) + s,
      where values of a, b, c, r, s and t are: a = 1, b = 5.1 / (4*pi**2),
      c = 5 / pi, r = 6, s = 10 and t = 1 / (8*pi).
  • It has three global minima:

      f(x) = 0.397887 at (-pi, 12.275), (pi, 2.275), and (9.42478, 2.475).
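A minimal standalone implementation of the Branin function with these constants (independent of spotpython's own implementation used below) confirms the value at the three minima:

```python
import numpy as np

def branin(x1, x2):
    # Branin function with the standard constants
    a = 1
    b = 5.1 / (4 * np.pi**2)
    c = 5 / np.pi
    r = 6
    s = 10
    t = 1 / (8 * np.pi)
    return a * (x2 - b * x1**2 + c * x1 - r) ** 2 + s * (1 - t) * np.cos(x1) + s

# Each of the three global minima yields approximately 0.397887:
for x1, x2 in [(-np.pi, 12.275), (np.pi, 2.275), (9.42478, 2.475)]:
    print(round(branin(x1, x2), 6))  # 0.397887
```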
from spotpython.fun.objectivefunctions import Analytical
fun = Analytical().fun_branin
TensorBoard

Similar to the one-dimensional case, which was introduced in Section 7.5, we can use TensorBoard to monitor the progress of the optimization. We use the same code; only the prefix differs:

from spotpython.utils.init import fun_control_init, design_control_init
PREFIX = "04"
fun_control = fun_control_init(
    PREFIX=PREFIX,
    lower = np.array([-5, 0]),
    upper = np.array([10,15]),
    fun_evals=20,
    max_time=inf)

design_control = design_control_init(
    init_size=10)

10.1.2 Running the Surrogate-Model-Based Optimizer Spot

spot_2 = spot.Spot(fun=fun,
                   fun_control=fun_control,
                   design_control=design_control)
spot_2.run()
spotpython tuning: 3.8004662117718677 [######----] 55.00% 
spotpython tuning: 3.8004662117718677 [######----] 60.00% 
spotpython tuning: 3.159024883515257 [######----] 65.00% 
spotpython tuning: 3.133916697143885 [#######---] 70.00% 
spotpython tuning: 2.8926749183116236 [########--] 75.00% 
spotpython tuning: 0.4190219407803557 [########--] 80.00% 
spotpython tuning: 0.401871440801683 [########--] 85.00% 
spotpython tuning: 0.39926034519166187 [#########-] 90.00% 
spotpython tuning: 0.39926034519166187 [##########] 95.00% 
spotpython tuning: 0.39926034519166187 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x156a2e7e0>

10.1.3 TensorBoard

Now we can start TensorBoard in the background with the following command:

tensorboard --logdir="./runs"

We can access the TensorBoard web server with the following URL:

http://localhost:6006/

The TensorBoard plot illustrates how spotpython can be used as a microscope for the internal mechanisms of the surrogate-based optimization process. Here, one important parameter, the activity parameter \(\theta\) of the Kriging surrogate, is plotted against the number of optimization steps.

TensorBoard visualization of the spotpython optimization process and the surrogate model.
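In Kriging, \(\theta\) controls how quickly the correlation between two points decays with their distance. A minimal sketch, assuming the common Gaussian (squared-exponential) correlation function (the exact kernel used internally may differ):

```python
import numpy as np

# Gaussian correlation: larger theta -> correlation decays faster with distance
def corr(x1, x2, theta):
    return np.exp(-theta * (x1 - x2) ** 2)

print(corr(0.0, 1.0, theta=0.1))   # ~0.905: points still strongly correlated
print(corr(0.0, 1.0, theta=10.0))  # ~4.5e-05: points nearly uncorrelated
```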

10.1.4 Show the Progress and the Surrogate

spot_2.plot_progress(log_y=True)

spot_2.surrogate.plot()

10.2 Example: Using Surrogates From scikit-learn

  • The default is spotpython's internal Kriging surrogate.
  • It can be instantiated explicitly and passed to Spot.
from spotpython.build.kriging import Kriging
S_0 = Kriging(name='kriging', seed=123)
  • Alternatively, models from scikit-learn can be selected, e.g., Gaussian Process, RBFs, Regression Trees, etc.
# Needed for the sklearn surrogates:
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn import linear_model
from sklearn import tree
import pandas as pd
  • Here are some additional models that might be useful later:
S_Tree = DecisionTreeRegressor(random_state=0)
S_LM = linear_model.LinearRegression()
S_Ridge = linear_model.Ridge()
S_RF = RandomForestRegressor(max_depth=2, random_state=0)
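In principle, any regressor that follows the scikit-learn fit/predict convention can serve as a surrogate. A minimal sketch of this interface on toy data (the data and shapes here are illustrative only):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy design points (n, d) and responses (n,), as a surrogate would see them
X = np.random.default_rng(1).uniform(-5, 10, size=(10, 2))
y = np.sum(X**2, axis=1)

# The surrogate only needs the scikit-learn fit/predict interface:
model = DecisionTreeRegressor(random_state=0)
model.fit(X, y)
y_hat = model.predict(X)
print(y_hat.shape)  # (10,)
```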

10.2.1 GaussianProcessRegressor as a Surrogate

  • To use a Gaussian process model from sklearn that is similar to spotpython’s Kriging, we can proceed as follows:
kernel = 1 * RBF(length_scale=1.0, length_scale_bounds=(1e-2, 1e2))
S_GP = GaussianProcessRegressor(kernel=kernel, n_restarts_optimizer=9)
  • The scikit-learn GP model S_GP is selected for Spot as follows:

    surrogate = S_GP

  • We can check the kind of surrogate model with isinstance:

isinstance(S_GP, GaussianProcessRegressor) 
True
isinstance(S_0, Kriging)
True
  • Similar to the Spot run with the internal Kriging model, we can call run() with the scikit-learn surrogate:
fun = Analytical(seed=123).fun_branin
spot_2_GP = spot.Spot(fun=fun,
                     fun_control=fun_control,
                     design_control=design_control,
                     surrogate = S_GP)
spot_2_GP.run()
spotpython tuning: 18.865129821249617 [######----] 55.00% 
spotpython tuning: 4.066961682805861 [######----] 60.00% 
spotpython tuning: 3.4619112320780285 [######----] 65.00% 
spotpython tuning: 3.4619112320780285 [#######---] 70.00% 
spotpython tuning: 1.3283123221495199 [########--] 75.00% 
spotpython tuning: 0.9548698218896146 [########--] 80.00% 
spotpython tuning: 0.9356616728510581 [########--] 85.00% 
spotpython tuning: 0.39968125707661706 [#########-] 90.00% 
spotpython tuning: 0.3983050744842078 [##########] 95.00% 
spotpython tuning: 0.39821610604643354 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x15909a4b0>
spot_2_GP.plot_progress()

spot_2_GP.print_results()
min y: 0.39821610604643354
x0: 3.1496411777654334
x1: 2.272943969041002
[['x0', np.float64(3.1496411777654334)], ['x1', np.float64(2.272943969041002)]]
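As a quick sanity check on the result values printed above, the tuned optimum can be compared against the known Branin minimum at (pi, 2.275):

```python
import numpy as np

# Result printed above, compared with the known minimum at (pi, 2.275)
x0, x1 = 3.1496411777654334, 2.272943969041002
print(abs(x0 - np.pi) < 0.05, abs(x1 - 2.275) < 0.05)  # True True
```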

10.3 Example: One-dimensional Sphere Function With spotpython’s Kriging

  • In this example, we will use a one-dimensional function, which allows us to visualize the optimization process.
    • show_models=True is added to the argument list.
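The objective used here is the sphere function \(f(x) = x^2\); note also that tolerance_x is set to the square root of the machine epsilon. A minimal standalone sketch:

```python
import numpy as np

def sphere(x):
    # One-dimensional sphere function: global minimum f(0) = 0
    return np.asarray(x) ** 2

print(sphere(0.5))             # 0.25
# tolerance_x below equals the square root of the machine epsilon:
print(np.sqrt(np.spacing(1)))  # ~1.49e-08
```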
from spotpython.fun.objectivefunctions import Analytical
fun_control = fun_control_init(
    lower = np.array([-1]),
    upper = np.array([1]),
    fun_evals=10,
    max_time=inf,
    show_models= True,
    tolerance_x = np.sqrt(np.spacing(1)))
fun = Analytical(seed=123).fun_sphere
design_control = design_control_init(
    init_size=3)
spot_1 = spot.Spot(fun=fun,
                    fun_control=fun_control,
                    design_control=design_control)
spot_1.run()

spotpython tuning: 0.03475493366922229 [####------] 40.00% 

spotpython tuning: 0.03475493366922229 [#####-----] 50.00% 

spotpython tuning: 0.014602288560505551 [######----] 60.00% 

spotpython tuning: 0.00020552455663660785 [#######---] 70.00% 

spotpython tuning: 5.673799497313666e-08 [########--] 80.00% 

spotpython tuning: 5.673799497313666e-08 [#########-] 90.00% 

spotpython tuning: 5.673799497313666e-08 [##########] 100.00% Done...

10.3.1 Results

spot_1.print_results()
min y: 5.673799497313666e-08
x0: -0.00023819738657914922
[['x0', np.float64(-0.00023819738657914922)]]
spot_1.plot_progress(log_y=True)

  • The method plot_model plots the final surrogate:
spot_1.plot_model()

10.4 Example: Sklearn Model GaussianProcess

  • This example visualizes the search process on the GaussianProcessRegression surrogate from sklearn.
  • Therefore, surrogate=S_GP is added to the argument list.
fun = Analytical(seed=123).fun_sphere
spot_1_GP = spot.Spot(fun=fun,
                      fun_control=fun_control,
                      design_control=design_control,
                      surrogate = S_GP)
spot_1_GP.run()

spotpython tuning: 0.004925671418704527 [####------] 40.00% 

spotpython tuning: 0.002612062398164981 [#####-----] 50.00% 

spotpython tuning: 5.609944300870913e-07 [######----] 60.00% 

spotpython tuning: 3.399776625316493e-08 [#######---] 70.00% 

spotpython tuning: 2.8303204876737398e-08 [########--] 80.00% 

spotpython tuning: 2.8303204876737398e-08 [#########-] 90.00% 

spotpython tuning: 2.2894458385368016e-08 [##########] 100.00% Done...
spot_1_GP.print_results()
min y: 2.2894458385368016e-08
x0: 0.0001513091483862361
[['x0', np.float64(0.0001513091483862361)]]
spot_1_GP.plot_progress(log_y=True)

spot_1_GP.plot_model()

10.5 Exercises

10.5.1 1. A decision tree regressor: DecisionTreeRegressor

  • Describe the surrogate model. Use the information from the scikit-learn documentation.
  • Use the surrogate as the model for optimization.

10.5.2 2. A random forest regressor: RandomForestRegressor

  • Describe the surrogate model. Use the information from the scikit-learn documentation.
  • Use the surrogate as the model for optimization.

10.5.3 3. Ordinary least squares Linear Regression: LinearRegression

  • Describe the surrogate model. Use the information from the scikit-learn documentation.
  • Use the surrogate as the model for optimization.

10.5.4 4. Linear least squares with l2 regularization: Ridge

  • Describe the surrogate model. Use the information from the scikit-learn documentation.
  • Use the surrogate as the model for optimization.

10.5.5 5. Gradient Boosting: HistGradientBoostingRegressor

  • Describe the surrogate model. Use the information from the scikit-learn documentation.
  • Use the surrogate as the model for optimization.

10.5.6 6. Comparison of Surrogates

  • Use the following two objective functions

    1. the 1-dim sphere function fun_sphere and
    2. the two-dim Branin function fun_branin:

    for a comparison of the performance of the five different surrogates:

    • spotpython’s internal Kriging
    • DecisionTreeRegressor
    • RandomForestRegressor
    • linear_model.LinearRegression
    • linear_model.Ridge.
  • Generate a table with the results (number of function evaluations, best function value, and best parameter vector) for each surrogate and each function as shown in Table 10.1.

Table 10.1: Result table

surrogate              fun         fun_evals  max_time  x_0  min_y  Comments
Kriging                fun_sphere  10         inf
Kriging                fun_branin  10         inf
DecisionTreeRegressor  fun_sphere  10         inf
Ridge                  fun_branin  10         inf
  • Discuss the results. Which surrogate is the best for which function? Why?
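A hypothetical skeleton for collecting the comparison results of Exercise 6 in a pandas DataFrame; the rows would be filled from each Spot run's print_results() output:

```python
import pandas as pd

# Empty result table with the columns of Table 10.1
results = pd.DataFrame(
    columns=["surrogate", "fun", "fun_evals", "max_time", "x_0", "min_y", "Comments"]
)
# Example row; x_0 and min_y would come from a completed Spot run
results.loc[len(results)] = ["Kriging", "fun_sphere", 10, "inf", None, None, ""]
print(results)
```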

10.6 Selected Solutions

10.6.1 Solution to Exercise Section 10.5.5: Gradient Boosting

10.6.1.1 Branin: Using SPOT

import numpy as np
from math import inf
from spotpython.fun.objectivefunctions import Analytical
from spotpython.utils.init import fun_control_init, design_control_init
from spotpython.spot import spot
  • The Objective Function Branin
fun = Analytical().fun_branin
PREFIX = "BRANIN"
fun_control = fun_control_init(
    PREFIX=PREFIX,
    lower = np.array([-5, 0]),
    upper = np.array([10,15]),
    fun_evals=20,
    max_time=inf)

design_control = design_control_init(
    init_size=10)
  • Running the surrogate-model-based optimizer Spot:
spot_2 = spot.Spot(fun=fun,
                   fun_control=fun_control,
                   design_control=design_control)
spot_2.run()
spotpython tuning: 3.1468376213815015 [######----] 55.00% 
spotpython tuning: 3.1468376213815015 [######----] 60.00% 
spotpython tuning: 3.1468376213815015 [######----] 65.00% 
spotpython tuning: 3.1468376213815015 [#######---] 70.00% 
spotpython tuning: 1.1460879999819689 [########--] 75.00% 
spotpython tuning: 1.0254127943018325 [########--] 80.00% 
spotpython tuning: 0.42994831006071443 [########--] 85.00% 
spotpython tuning: 0.4020917650024458 [#########-] 90.00% 
spotpython tuning: 0.3992153710593467 [##########] 95.00% 
spotpython tuning: 0.3992153710593467 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x15c49dca0>
  • Print the results
spot_2.print_results()
min y: 0.3992153710593467
x0: 3.1555383337491234
x1: 2.2840066834425232
[['x0', np.float64(3.1555383337491234)],
 ['x1', np.float64(2.2840066834425232)]]
  • Show the optimization progress:
spot_2.plot_progress(log_y=True)

  • Generate a surrogate model plot:
spot_2.surrogate.plot()

10.6.1.2 Branin: Using Surrogates From scikit-learn

  • The HistGradientBoostingRegressor model from scikit-learn is selected:
# Needed for the sklearn surrogates:
from sklearn.ensemble import HistGradientBoostingRegressor
import pandas as pd
S_XGB = HistGradientBoostingRegressor()
  • The scikit-learn gradient boosting model S_XGB is selected for Spot as follows: surrogate = S_XGB.
  • Similar to the Spot run with the internal Kriging model, we can call run() with the scikit-learn surrogate:
fun = Analytical(seed=123).fun_branin
spot_2_XGB = spot.Spot(fun=fun,
                     fun_control=fun_control,
                     design_control=design_control,
                     surrogate = S_XGB)
spot_2_XGB.run()
spotpython tuning: 30.69410528614059 [######----] 55.00% 
spotpython tuning: 30.69410528614059 [######----] 60.00% 
spotpython tuning: 30.69410528614059 [######----] 65.00% 
spotpython tuning: 30.69410528614059 [#######---] 70.00% 
spotpython tuning: 1.3263745845108854 [########--] 75.00% 
spotpython tuning: 1.3263745845108854 [########--] 80.00% 
spotpython tuning: 1.3263745845108854 [########--] 85.00% 
spotpython tuning: 1.3263745845108854 [#########-] 90.00% 
spotpython tuning: 1.3263745845108854 [##########] 95.00% 
spotpython tuning: 1.3263745845108854 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x15c72dc10>
  • Print the Results
spot_2_XGB.print_results()
min y: 1.3263745845108854
x0: -2.872730773493426
x1: 10.874313833535739
[['x0', np.float64(-2.872730773493426)],
 ['x1', np.float64(10.874313833535739)]]
  • Show the Progress
spot_2_XGB.plot_progress(log_y=True)

  • Since the sklearn model does not provide a plot method, we cannot generate a surrogate model plot.

10.6.1.3 One-dimensional Sphere Function With spotpython’s Kriging

  • In this example, we will use a one-dimensional function, which allows us to visualize the optimization process.
    • show_models=True is added to the argument list.
from spotpython.fun.objectivefunctions import Analytical
fun_control = fun_control_init(
    lower = np.array([-1]),
    upper = np.array([1]),
    fun_evals=10,
    max_time=inf,
    show_models= True,
    tolerance_x = np.sqrt(np.spacing(1)))
fun = Analytical(seed=123).fun_sphere
design_control = design_control_init(
    init_size=3)
spot_1 = spot.Spot(fun=fun,
                    fun_control=fun_control,
                    design_control=design_control)
spot_1.run()

spotpython tuning: 0.03475493366922229 [####------] 40.00% 

spotpython tuning: 0.03475493366922229 [#####-----] 50.00% 

spotpython tuning: 0.014602288560505551 [######----] 60.00% 

spotpython tuning: 0.00020552455663660785 [#######---] 70.00% 

spotpython tuning: 5.673799497313666e-08 [########--] 80.00% 

spotpython tuning: 5.673799497313666e-08 [#########-] 90.00% 

spotpython tuning: 5.673799497313666e-08 [##########] 100.00% Done...
  • Print the Results
spot_1.print_results()
min y: 5.673799497313666e-08
x0: -0.00023819738657914922
[['x0', np.float64(-0.00023819738657914922)]]
  • Show the Progress
spot_1.plot_progress(log_y=True)

  • The method plot_model plots the final surrogate:
spot_1.plot_model()

10.6.1.4 One-dimensional Sphere Function With Sklearn Model HistGradientBoostingRegressor

  • This example visualizes the search process on the HistGradientBoostingRegressor surrogate from sklearn.
  • Therefore, surrogate=S_XGB is added to the argument list.
fun_control = fun_control_init(
    lower = np.array([-1]),
    upper = np.array([1]),
    fun_evals=10,
    max_time=inf,
    show_models= True,
    tolerance_x = np.sqrt(np.spacing(1)))
fun = Analytical(seed=123).fun_sphere
design_control = design_control_init(
    init_size=3)
spot_1_XGB = spot.Spot(fun=fun,
                      fun_control=fun_control,
                      design_control=design_control,
                      surrogate = S_XGB)
spot_1_XGB.run()

spotpython tuning: 0.03475493366922229 [####------] 40.00% 

spotpython tuning: 0.03475493366922229 [#####-----] 50.00% 

spotpython tuning: 0.03475493366922229 [######----] 60.00% 

spotpython tuning: 0.03475493366922229 [#######---] 70.00% 

spotpython tuning: 0.008730885505764131 [########--] 80.00% 

spotpython tuning: 0.008730885505764131 [#########-] 90.00% 

spotpython tuning: 0.008730885505764131 [##########] 100.00% Done...
spot_1_XGB.print_results()
min y: 0.008730885505764131
x0: 0.09343920754032609
[['x0', np.float64(0.09343920754032609)]]
spot_1_XGB.plot_progress(log_y=True)

spot_1_XGB.plot_model()

10.7 Jupyter Notebook

Note