10  Using sklearn Surrogates in spotpython

Besides the internal Kriging surrogate, which spotpython uses by default, any surrogate model from scikit-learn can be used in spotpython. This chapter explains how to use scikit-learn surrogates in spotpython.

import numpy as np
from math import inf
from spotpython.fun.objectivefunctions import analytical
from spotpython.spot import spot
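Internally, Spot only requires that a surrogate follows the scikit-learn estimator interface, i.e., fit(X, y) and predict(X); this is an assumption inferred from the examples below, where scikit-learn regressors are passed to Spot unchanged. A minimal sketch of such an object:

import numpy as np

class ConstantSurrogate:
    """Toy surrogate with the sklearn fit/predict interface.

    It predicts the mean of the training targets everywhere; it only
    illustrates the required interface and is useless for optimization."""
    def fit(self, X, y):
        self.y_mean_ = float(np.mean(y))
        return self

    def predict(self, X):
        return np.full(len(X), self.y_mean_)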

10.1 Example: Branin Function with spotpython’s Internal Kriging Surrogate

10.1.1 The Objective Function Branin

  • The spotpython package provides several classes of objective functions.

  • We will use an analytical objective function, i.e., a function that can be described by a (closed) formula.

  • Here we will use the Branin function:

      y = a * (x2 - b * x1**2 + c * x1 - r) ** 2 + s * (1 - t) * np.cos(x1) + s,

      where the constants are a = 1, b = 5.1 / (4*pi**2), c = 5 / pi, r = 6,
      s = 10, and t = 1 / (8*pi).
  • It has three global minima:

      f(x) = 0.397887 at (-pi, 12.275), (pi, 2.275), and (9.42478, 2.475),
      as verified in the sketch below.
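A direct NumPy implementation of this formula (a plain sketch, independent of spotpython’s fun_branin) can be used to check these values:

import numpy as np

def branin(x1, x2):
    # Branin function with the standard constants given above.
    a = 1.0
    b = 5.1 / (4 * np.pi**2)
    c = 5 / np.pi
    r = 6.0
    s = 10.0
    t = 1 / (8 * np.pi)
    return a * (x2 - b * x1**2 + c * x1 - r) ** 2 + s * (1 - t) * np.cos(x1) + s

# All three global minima evaluate to approximately 0.397887:
print(branin(-np.pi, 12.275), branin(np.pi, 2.275), branin(9.42478, 2.475))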
from spotpython.fun.objectivefunctions import analytical
fun = analytical().fun_branin
TensorBoard

Similar to the one-dimensional case, which was introduced in Section 7.5, we can use TensorBoard to monitor the progress of the optimization. We will use the same code; only the prefix is different:

from spotpython.utils.init import fun_control_init, design_control_init
PREFIX = "04"
fun_control = fun_control_init(
    PREFIX=PREFIX,
    lower = np.array([-5, 0]),
    upper = np.array([10, 15]),
    fun_evals=20,
    max_time=inf)

design_control = design_control_init(
    init_size=10)

10.1.2 Running the Surrogate-Model-Based Optimizer Spot

spot_2 = spot.Spot(fun=fun,
                   fun_control=fun_control,
                   design_control=design_control)
spot_2.run()
spotpython tuning: 3.8004550038787155 [######----] 55.00% 
spotpython tuning: 3.8004550038787155 [######----] 60.00% 
spotpython tuning: 3.1588579885698627 [######----] 65.00% 
spotpython tuning: 3.1342382932317037 [#######---] 70.00% 
spotpython tuning: 2.8956615907630585 [########--] 75.00% 
spotpython tuning: 0.42052429574482275 [########--] 80.00% 
spotpython tuning: 0.4013351867835322 [########--] 85.00% 
spotpython tuning: 0.399265616254338 [#########-] 90.00% 
spotpython tuning: 0.399265616254338 [##########] 95.00% 
spotpython tuning: 0.399265616254338 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x13fcef560>

10.1.3 TensorBoard

Now we can start TensorBoard in the background with the following command:

tensorboard --logdir="./runs"

We can access the TensorBoard web server with the following URL:

http://localhost:6006/

The TensorBoard plot illustrates how spotpython can be used as a microscope for the internal mechanisms of the surrogate-based optimization process. Here, one important parameter, the activity parameter \(\theta\) of the Kriging surrogate, is plotted against the number of optimization steps.

TensorBoard visualization of the spotpython optimization process and the surrogate model.
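To inspect this parameter without TensorBoard, one can read it from the fitted surrogate after the run; a hedged sketch (it is an assumption that the internal Kriging model stores its fitted values in a theta attribute, so the access is guarded):

# Assumption: the fitted Kriging surrogate exposes a `theta` attribute.
print(getattr(spot_2.surrogate, "theta", "no theta attribute found"))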

10.1.4 Show the Progress and the Surrogate

spot_2.plot_progress(log_y=True)

spot_2.surrogate.plot()

10.2 Example: Using Surrogates From scikit-learn

  • Default is the spotpython (i.e., the internal) Kriging surrogate.
  • It can be instantiated explicitly and passed to Spot.
from spotpython.build.kriging import Kriging
S_0 = Kriging(name='kriging', seed=123)
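Passing this instance to Spot via the surrogate argument is equivalent to the default behavior (fun and the control objects are the ones defined above):

spot_explicit = spot.Spot(fun=fun,
                          fun_control=fun_control,
                          design_control=design_control,
                          surrogate=S_0)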
  • Alternatively, models from scikit-learn can be selected, e.g., Gaussian Process, RBFs, Regression Trees, etc.
# Needed for the sklearn surrogates:
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn import linear_model
from sklearn import tree
import pandas as pd
  • Here are some additional models that might be useful later:
S_Tree = DecisionTreeRegressor(random_state=0)
S_LM = linear_model.LinearRegression()
S_Ridge = linear_model.Ridge()
S_RF = RandomForestRegressor(max_depth=2, random_state=0)

10.2.1 GaussianProcessRegressor as a Surrogate

  • To use a Gaussian process model from sklearn that is similar to spotpython’s Kriging, we can proceed as follows:
kernel = 1 * RBF(length_scale=1.0, length_scale_bounds=(1e-2, 1e2))
S_GP = GaussianProcessRegressor(kernel=kernel, n_restarts_optimizer=9)
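Note that 1 * RBF(...) is scikit-learn shorthand for a product kernel with a constant factor; written explicitly:

from sklearn.gaussian_process.kernels import ConstantKernel, RBF

# Equivalent to `1 * RBF(...)`: a ConstantKernel scales the RBF kernel,
# and both the scale and the length scale are fitted during GP training.
kernel_explicit = ConstantKernel(constant_value=1.0) * RBF(
    length_scale=1.0, length_scale_bounds=(1e-2, 1e2))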
  • The scikit-learn GP model S_GP is selected for Spot as follows:

    surrogate = S_GP

  • We can check the kind of surrogate model with isinstance:

isinstance(S_GP, GaussianProcessRegressor) 
True
isinstance(S_0, Kriging)
True
  • Similar to the Spot run with the internal Kriging model, we can call the run with the scikit-learn surrogate:
fun = analytical(seed=123).fun_branin
spot_2_GP = spot.Spot(fun=fun,
                     fun_control=fun_control,
                     design_control=design_control,
                     surrogate = S_GP)
spot_2_GP.run()
spotpython tuning: 18.865120626024897 [######----] 55.00% 
spotpython tuning: 4.067035571881624 [######----] 60.00% 
spotpython tuning: 3.4619135134152215 [######----] 65.00% 
spotpython tuning: 3.4619135134152215 [#######---] 70.00% 
spotpython tuning: 1.328248854169006 [########--] 75.00% 
spotpython tuning: 0.9548473238361144 [########--] 80.00% 
spotpython tuning: 0.9362788065438252 [########--] 85.00% 
spotpython tuning: 0.40009463636571496 [#########-] 90.00% 
spotpython tuning: 0.39824274152672423 [##########] 95.00% 
spotpython tuning: 0.39824274152672423 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x368afde50>
spot_2_GP.plot_progress()

spot_2_GP.print_results()
min y: 0.39824274152672423
x0: 3.1501084171824973
x1: 2.2710556115184426
[['x0', 3.1501084171824973], ['x1', 2.2710556115184426]]
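The result is close to the known global minimum f(x) = 0.397887 at (pi, 2.275) listed in Section 10.1.1.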

10.3 Example: One-dimensional Sphere Function With spotpython’s Kriging

  • In this example, we will use a one-dimensional function (sketched below), which allows us to visualize the optimization process.
    • show_models=True is added to the argument list.
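A minimal sketch of the objective (assuming fun_sphere implements the standard sphere function f(x) = sum(x_i**2), with its global minimum 0 at the origin):

import numpy as np

def sphere(x):
    # Standard sphere function: global minimum 0 at the origin.
    return np.sum(np.square(x))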
from spotpython.fun.objectivefunctions import analytical
fun_control = fun_control_init(
    lower = np.array([-1]),
    upper = np.array([1]),
    fun_evals=10,
    max_time=inf,
    show_models= True,
    tolerance_x = np.sqrt(np.spacing(1)))
fun = analytical(seed=123).fun_sphere
design_control = design_control_init(
    init_size=3)
spot_1 = spot.Spot(fun=fun,
                    fun_control=fun_control,
                    design_control=design_control)
spot_1.run()

spotpython tuning: 0.03475493366922229 [####------] 40.00% 

spotpython tuning: 0.03475493366922229 [#####-----] 50.00% 

spotpython tuning: 0.014772665252290174 [######----] 60.00% 

spotpython tuning: 0.00020571435112063172 [#######---] 70.00% 

spotpython tuning: 5.943319903709223e-08 [########--] 80.00% 

spotpython tuning: 5.943319903709223e-08 [#########-] 90.00% 

spotpython tuning: 5.943319903709223e-08 [##########] 100.00% Done...

10.3.1 Results

spot_1.print_results()
min y: 5.943319903709223e-08
x0: -0.0002437892512747275
[['x0', -0.0002437892512747275]]
spot_1.plot_progress(log_y=True)

  • The method plot_model plots the final surrogate:
spot_1.plot_model()

10.4 Example: Sklearn Model GaussianProcessRegressor

  • This example visualizes the search process on the GaussianProcessRegressor surrogate from sklearn.
  • Therefore, surrogate = S_GP is added to the argument list.
fun = analytical(seed=123).fun_sphere
spot_1_GP = spot.Spot(fun=fun,
                      fun_control=fun_control,
                      design_control=design_control,
                      surrogate = S_GP)
spot_1_GP.run()

spotpython tuning: 0.00492567138682192 [####------] 40.00% 

spotpython tuning: 0.0026120626229886095 [#####-----] 50.00% 

spotpython tuning: 3.1732324790463464e-07 [######----] 60.00% 

spotpython tuning: 3.291512456526943e-08 [#######---] 70.00% 

spotpython tuning: 1.8087882373554217e-08 [########--] 80.00% 

spotpython tuning: 2.4792327258527864e-09 [#########-] 90.00% 

spotpython tuning: 2.4792327258527864e-09 [##########] 100.00% Done...
spot_1_GP.print_results()
min y: 2.4792327258527864e-09
x0: 4.979189417819718e-05
[['x0', 4.979189417819718e-05]]
spot_1_GP.plot_progress(log_y=True)

spot_1_GP.plot_model()

10.5 Exercises

10.5.1 1. A decision tree regressor: DecisionTreeRegressor

  • Describe the surrogate model. Use the information from the scikit-learn documentation.
  • Use the surrogate as the model for optimization.

10.5.2 2. A random forest regressor: RandomForestRegressor

  • Describe the surrogate model. Use the information from the scikit-learn documentation.
  • Use the surrogate as the model for optimization.

10.5.3 3. Ordinary least squares Linear Regression: LinearRegression

  • Describe the surrogate model. Use the information from the scikit-learn documentation.
  • Use the surrogate as the model for optimization.

10.5.4 4. Linear least squares with l2 regularization: Ridge

  • Describe the surrogate model. Use the information from the scikit-learn documentation.
  • Use the surrogate as the model for optimization.

10.5.5 5. Gradient Boosting: HistGradientBoostingRegressor

  • Describe the surrogate model. Use the information from the scikit-learn documentation.
  • Use the surrogate as the model for optimization.

10.5.6 6. Comparison of Surrogates

  • Use the following two objective functions

    1. the one-dimensional sphere function fun_sphere and
    2. the two-dimensional Branin function fun_branin

    to compare the performance of the five different surrogates:

    • spotpython’s internal Kriging
    • DecisionTreeRegressor
    • RandomForestRegressor
    • linear_model.LinearRegression
    • linear_model.Ridge.
  • Generate a table with the results (number of function evaluations, best function value, and best parameter vector) for each surrogate and each function as shown in Table 10.1.

Table 10.1: Result table

surrogate              fun         fun_evals  max_time  x_0  min_y  Comments
---------------------  ----------  ---------  --------  ---  -----  --------
Kriging                fun_sphere  10         inf
Kriging                fun_branin  10         inf
DecisionTreeRegressor  fun_sphere  10         inf
Ridge                  fun_branin  10         inf
  • Discuss the results. Which surrogate is the best for which function? Why? A scaffold for running this comparison is sketched below.
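A hedged scaffold for this comparison with the sphere function (the Branin variant only changes fun, lower, and upper; that Spot stores the best observed point and value in min_X and min_y is an assumption about the Spot API, so the access is guarded with getattr):

import numpy as np
import pandas as pd
from math import inf
from spotpython.fun.objectivefunctions import analytical
from spotpython.utils.init import fun_control_init, design_control_init
from spotpython.spot import spot
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn import linear_model

surrogates = {
    "Kriging": None,  # None selects spotpython's internal default surrogate
    "DecisionTreeRegressor": DecisionTreeRegressor(random_state=0),
    "RandomForestRegressor": RandomForestRegressor(max_depth=2, random_state=0),
    "LinearRegression": linear_model.LinearRegression(),
    "Ridge": linear_model.Ridge(),
}

rows = []
for name, S in surrogates.items():
    fun = analytical(seed=123).fun_sphere
    fun_control = fun_control_init(
        lower=np.array([-1]), upper=np.array([1]),
        fun_evals=10, max_time=inf)
    design_control = design_control_init(init_size=3)
    kwargs = {} if S is None else {"surrogate": S}
    s = spot.Spot(fun=fun, fun_control=fun_control,
                  design_control=design_control, **kwargs)
    s.run()
    # Assumption: Spot stores the best observed point/value in min_X/min_y.
    rows.append({"surrogate": name, "fun": "fun_sphere",
                 "fun_evals": 10, "max_time": "inf",
                 "x_0": getattr(s, "min_X", None),
                 "min_y": getattr(s, "min_y", None)})

print(pd.DataFrame(rows))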

10.6 Selected Solutions

10.6.1 Solution to Exercise Section 10.5.5: Gradient Boosting

10.6.1.1 Branin: Using SPOT

import numpy as np
from math import inf
from spotpython.fun.objectivefunctions import analytical
from spotpython.utils.init import fun_control_init, design_control_init
from spotpython.spot import spot
  • The Objective Function Branin
fun = analytical().fun_branin
PREFIX = "BRANIN"
fun_control = fun_control_init(
    PREFIX=PREFIX,
    lower = np.array([-5, 0]),
    upper = np.array([10, 15]),
    fun_evals=20,
    max_time=inf)

design_control = design_control_init(
    init_size=10)
  • Running the surrogate-model-based optimizer Spot:
spot_2 = spot.Spot(fun=fun,
                   fun_control=fun_control,
                   design_control=design_control)
spot_2.run()
spotpython tuning: 3.1468336273020228 [######----] 55.00% 
spotpython tuning: 3.1468336273020228 [######----] 60.00% 
spotpython tuning: 3.1468336273020228 [######----] 65.00% 
spotpython tuning: 3.1468336273020228 [#######---] 70.00% 
spotpython tuning: 1.1486878851267175 [########--] 75.00% 
spotpython tuning: 1.0238265839035492 [########--] 80.00% 
spotpython tuning: 0.42056072017865986 [########--] 85.00% 
spotpython tuning: 0.4019421180007683 [#########-] 90.00% 
spotpython tuning: 0.39920553705190365 [##########] 95.00% 
spotpython tuning: 0.39920553705190365 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x368a49730>
  • Print the results
spot_2.print_results()
min y: 0.39920553705190365
x0: 3.154894449957061
x1: 2.286298891852929
[['x0', 3.154894449957061], ['x1', 2.286298891852929]]
  • Show the optimization progress:
spot_2.plot_progress(log_y=True)

  • Generate a surrogate model plot:
spot_2.surrogate.plot()

10.6.1.2 Branin: Using Surrogates From scikit-learn

  • The HistGradientBoostingRegressor model from scikit-learn is selected:
# Needed for the sklearn surrogates:
from sklearn.ensemble import HistGradientBoostingRegressor
import pandas as pd
S_XGB = HistGradientBoostingRegressor()
  • The scikit-learn gradient boosting model S_XGB is selected for Spot as follows: surrogate = S_XGB. (Despite the variable name, HistGradientBoostingRegressor is scikit-learn’s own histogram-based gradient boosting, not XGBoost.)
  • Similar to the Spot run with the internal Kriging model, we can call the run with the scikit-learn surrogate:
fun = analytical(seed=123).fun_branin
spot_2_XGB = spot.Spot(fun=fun,
                     fun_control=fun_control,
                     design_control=design_control,
                     surrogate = S_XGB)
spot_2_XGB.run()
spotpython tuning: 30.69410528614059 [######----] 55.00% 
spotpython tuning: 30.69410528614059 [######----] 60.00% 
spotpython tuning: 30.69410528614059 [######----] 65.00% 
spotpython tuning: 30.69410528614059 [#######---] 70.00% 
spotpython tuning: 1.3263745845108854 [########--] 75.00% 
spotpython tuning: 1.3263745845108854 [########--] 80.00% 
spotpython tuning: 1.3263745845108854 [########--] 85.00% 
spotpython tuning: 1.3263745845108854 [#########-] 90.00% 
spotpython tuning: 1.3263745845108854 [##########] 95.00% 
spotpython tuning: 1.3263745845108854 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x3686befc0>
  • Print the Results
spot_2_XGB.print_results()
min y: 1.3263745845108854
x0: -2.872730773493426
x1: 10.874313833535739
[['x0', -2.872730773493426], ['x1', 10.874313833535739]]
  • Show the Progress
spot_2_XGB.plot_progress(log_y=True)

  • Since the sklearn model does not provide a plot method, we cannot generate a surrogate model plot directly; a manual alternative is sketched below.
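If a plot is desired anyway, the fitted sklearn surrogate can be evaluated on a grid and plotted manually; a minimal sketch (assuming the fitted model remains accessible as spot_2_XGB.surrogate after the run, and using the Branin bounds from above):

import numpy as np
import matplotlib.pyplot as plt

# Evaluate the fitted surrogate on a grid over the Branin search space.
x0 = np.linspace(-5, 10, 100)
x1 = np.linspace(0, 15, 100)
X0, X1 = np.meshgrid(x0, x1)
grid = np.column_stack([X0.ravel(), X1.ravel()])
Z = spot_2_XGB.surrogate.predict(grid).reshape(X0.shape)

plt.contourf(X0, X1, Z, levels=30)
plt.colorbar()
plt.xlabel("x0")
plt.ylabel("x1")
plt.title("HistGradientBoostingRegressor surrogate (manual plot)")
plt.show()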

10.6.1.3 One-dimensional Sphere Function With spotpython’s Kriging

  • In this example, we will use a one-dimensional function, which allows us to visualize the optimization process.
    • show_models=True is added to the argument list.
from spotpython.fun.objectivefunctions import analytical
fun_control = fun_control_init(
    lower = np.array([-1]),
    upper = np.array([1]),
    fun_evals=10,
    max_time=inf,
    show_models= True,
    tolerance_x = np.sqrt(np.spacing(1)))
fun = analytical(seed=123).fun_sphere
design_control = design_control_init(
    init_size=3)
spot_1 = spot.Spot(fun=fun,
                    fun_control=fun_control,
                    design_control=design_control)
spot_1.run()

spotpython tuning: 0.03475493366922229 [####------] 40.00% 

spotpython tuning: 0.03475493366922229 [#####-----] 50.00% 

spotpython tuning: 0.014772665252290174 [######----] 60.00% 

spotpython tuning: 0.00020571435112063172 [#######---] 70.00% 

spotpython tuning: 5.943319903709223e-08 [########--] 80.00% 

spotpython tuning: 5.943319903709223e-08 [#########-] 90.00% 

spotpython tuning: 5.943319903709223e-08 [##########] 100.00% Done...
  • Print the Results
spot_1.print_results()
min y: 5.943319903709223e-08
x0: -0.0002437892512747275
[['x0', -0.0002437892512747275]]
  • Show the Progress
spot_1.plot_progress(log_y=True)

  • The method plot_model plots the final surrogate:
spot_1.plot_model()

10.6.1.4 One-dimensional Sphere Function With Sklearn Model HistGradientBoostingRegressor

  • This example visualizes the search process on the HistGradientBoostingRegressor surrogate from sklearn.
  • Therefore, surrogate = S_XGB is added to the argument list.
fun_control = fun_control_init(
    lower = np.array([-1]),
    upper = np.array([1]),
    fun_evals=10,
    max_time=inf,
    show_models= True,
    tolerance_x = np.sqrt(np.spacing(1)))
fun = analytical(seed=123).fun_sphere
design_control = design_control_init(
    init_size=3)
spot_1_XGB = spot.Spot(fun=fun,
                      fun_control=fun_control,
                      design_control=design_control,
                      surrogate = S_XGB)
spot_1_XGB.run()

spotpython tuning: 0.03475493366922229 [####------] 40.00% 

spotpython tuning: 0.03475493366922229 [#####-----] 50.00% 

spotpython tuning: 0.03475493366922229 [######----] 60.00% 

spotpython tuning: 0.03475493366922229 [#######---] 70.00% 

spotpython tuning: 0.008730885505764131 [########--] 80.00% 

spotpython tuning: 0.008730885505764131 [#########-] 90.00% 

spotpython tuning: 0.008730885505764131 [##########] 100.00% Done...
spot_1_XGB.print_results()
min y: 0.008730885505764131
x0: 0.09343920754032609
[['x0', 0.09343920754032609]]
spot_1_XGB.plot_progress(log_y=True)

spot_1_XGB.plot_model()
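Note that the tree-based surrogate performs markedly worse here than Kriging (min y ≈ 8.7e-03 vs. ≈ 5.9e-08), plausibly because its piecewise-constant predictions give the search little guidance between sampled points.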

10.7 Jupyter Notebook
