12  Expected Improvement

This chapter describes, analyzes, and compares different infill criteria. An infill criterion defines how the next point \(x_{n+1}\) is selected based on the surrogate model \(S\). Expected improvement (EI) is a popular infill criterion in Bayesian optimization.
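For minimization, the expected improvement at a candidate \(x\) is \[\operatorname{EI}(x) = (y_{\min} - \mu(x))\,\Phi(z) + \sigma(x)\,\varphi(z), \qquad z = \frac{y_{\min} - \mu(x)}{\sigma(x)},\] where \(\mu\) and \(\sigma\) are the surrogate's predictive mean and standard deviation, and \(\Phi\) and \(\varphi\) are the standard normal CDF and PDF. The following is a minimal sketch of this formula (the function name and inputs are illustrative, not part of the spotpython API):

import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, y_best):
    """EI for minimization; EI is zero wherever sigma == 0."""
    mu, sigma = np.asarray(mu, dtype=float), np.asarray(sigma, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        z = (y_best - mu) / sigma
        ei = (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    return np.where(sigma > 0, ei, 0.0)

# A candidate whose predicted mean equals the current best still has
# positive EI if the prediction is uncertain:
print(expected_improvement(mu=1.0, sigma=0.5, y_best=1.0))  # 0.5 * pdf(0) ~ 0.199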

12.1 Example: Spot and the 1-dim Sphere Function

import numpy as np
from math import inf
from spotpython.fun.objectivefunctions import Analytical
from spotpython.spot import spot
from spotpython.utils.init import fun_control_init, surrogate_control_init, design_control_init
import matplotlib.pyplot as plt

12.1.1 The Objective Function: 1-dim Sphere

  • The spotpython package provides several classes of objective functions.
  • We will use an analytical objective function, i.e., a function that can be described by a (closed) formula: \[f(x) = x^2 \]
fun = Analytical().fun_sphere
  • The size of the lower bound vector determines the problem dimension.
  • Here we will use np.array([-1]), i.e., a one-dim function.
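The sphere function can be evaluated directly for a quick check (a small sketch; the (n, 1) input shape follows the conventions used by spotpython's analytical functions in the examples below):

import numpy as np
from spotpython.fun.objectivefunctions import Analytical

fun = Analytical().fun_sphere
X = np.array([[0.0], [0.5], [-1.0]])  # three 1-dim points, shape (3, 1)
print(fun(X))  # row-wise sum of squares: [0., 0.25, 1.]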
TensorBoard

Similar to the one-dimensional case, which was introduced in Section 7.5, we can use TensorBoard to monitor the progress of the optimization. We will use the same code; only the prefix is different:

from spotpython.utils.init import fun_control_init
PREFIX = "07_Y"
fun_control = fun_control_init(
    PREFIX=PREFIX,
    fun_evals = 25,
    lower = np.array([-1]),
    upper = np.array([1]),
    tolerance_x = np.sqrt(np.spacing(1)),)
design_control = design_control_init(init_size=10)
spot_1 = spot.Spot(
            fun=fun,
            fun_control=fun_control,
            design_control=design_control)
spot_1.run()
spotpython tuning: 4.960293502265715e-09 [####------] 44.00% 
spotpython tuning: 4.960293502265715e-09 [#####-----] 48.00% 
spotpython tuning: 3.91132389612039e-09 [#####-----] 52.00% 
spotpython tuning: 3.91132389612039e-09 [######----] 56.00% 
spotpython tuning: 2.8792582932753584e-09 [######----] 60.00% 
spotpython tuning: 2.8792582932753584e-09 [######----] 64.00% 
spotpython tuning: 2.8792582932753584e-09 [#######---] 68.00% 
spotpython tuning: 2.8792582932753584e-09 [#######---] 72.00% 
spotpython tuning: 2.8792582932753584e-09 [########--] 76.00% 
spotpython tuning: 2.8792582932753584e-09 [########--] 80.00% 
spotpython tuning: 5.121302337041985e-10 [########--] 84.00% 
spotpython tuning: 5.121302337041985e-10 [#########-] 88.00% 
spotpython tuning: 9.915881842777748e-12 [#########-] 92.00% 
spotpython tuning: 9.915881842777748e-12 [##########] 96.00% 
spotpython tuning: 9.915881842777748e-12 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x14a2b2f30>
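The logged data can be inspected by starting TensorBoard from the command line (a sketch, assuming the logs are written to the default runs/ folder):

tensorboard --logdir runs

and opening http://localhost:6006 in a browser.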

12.1.2 Results

spot_1.print_results()
min y: 9.915881842777748e-12
x0: -3.1489493236280808e-06
[['x0', np.float64(-3.1489493236280808e-06)]]
spot_1.plot_progress(log_y=True)

TensorBoard visualization of the spotpython optimization process and the surrogate model.

12.2 Same, but with EI as infill_criterion

By default, spotpython selects the next point by minimizing the surrogate's predicted mean (infill_criterion = "y"). Here, the experiment from Section 12.1 is repeated with expected improvement instead:

PREFIX = "07_EI_ISO"
fun_control = fun_control_init(
    PREFIX=PREFIX,
    lower = np.array([-1]),
    upper = np.array([1]),
    fun_evals = 25,
    tolerance_x = np.sqrt(np.spacing(1)),
    infill_criterion = "ei")
spot_1_ei = spot.Spot(fun=fun,
                     fun_control=fun_control)
spot_1_ei.run()
spotpython tuning: 1.0205727057090308e-08 [####------] 44.00% 
spotpython tuning: 1.0205727057090308e-08 [#####-----] 48.00% 
spotpython tuning: 1.0205727057090308e-08 [#####-----] 52.00% 
spotpython tuning: 1.0205727057090308e-08 [######----] 56.00% 
spotpython tuning: 2.2781224716456335e-14 [######----] 60.00% 
spotpython tuning: 2.2781224716456335e-14 [######----] 64.00% 
spotpython tuning: 2.2781224716456335e-14 [#######---] 68.00% 
spotpython tuning: 2.2781224716456335e-14 [#######---] 72.00% 
spotpython tuning: 2.2781224716456335e-14 [########--] 76.00% 
spotpython tuning: 2.2781224716456335e-14 [########--] 80.00% 
spotpython tuning: 2.2781224716456335e-14 [########--] 84.00% 
spotpython tuning: 2.2781224716456335e-14 [#########-] 88.00% 
spotpython tuning: 2.2781224716456335e-14 [#########-] 92.00% 
spotpython tuning: 2.2781224716456335e-14 [##########] 96.00% 
spotpython tuning: 2.2781224716456335e-14 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x14a5870e0>
spot_1_ei.plot_progress(log_y=True)

spot_1_ei.print_results()
min y: 2.2781224716456335e-14
x0: 1.5093450472458687e-07
[['x0', np.float64(1.5093450472458687e-07)]]

TensorBoard visualization of the spotpython optimization process and the surrogate model. Expected improvement, isotropic Kriging.

12.3 Non-isotropic Kriging

Isotropic Kriging uses a single activity parameter \(\theta\) for all input dimensions, whereas non-isotropic (anisotropic) Kriging fits one \(\theta_j\) per dimension, i.e., the correlation kernel \(\exp\left(-\sum_{j=1}^{k} \theta_j |x_j - x_j'|^{p_j}\right)\) can adapt to dimensions of different activity. Setting n_theta=2 in surrogate_control_init enables this for the two-dimensional sphere function:

PREFIX = "07_EI_NONISO"
fun_control = fun_control_init(
    PREFIX=PREFIX,
    lower = np.array([-1, -1]),
    upper = np.array([1, 1]),
    fun_evals = 25,
    tolerance_x = np.sqrt(np.spacing(1)),
    infill_criterion = "ei")
surrogate_control = surrogate_control_init(
    n_theta=2,
    noise=False,
    )
spot_2_ei_noniso = spot.Spot(fun=fun,
                   fun_control=fun_control,
                   surrogate_control=surrogate_control)
spot_2_ei_noniso.run()
spotpython tuning: 1.885355036033269e-05 [####------] 44.00% 
spotpython tuning: 1.885355036033269e-05 [#####-----] 48.00% 
spotpython tuning: 1.885355036033269e-05 [#####-----] 52.00% 
spotpython tuning: 1.885355036033269e-05 [######----] 56.00% 
spotpython tuning: 1.885355036033269e-05 [######----] 60.00% 
spotpython tuning: 1.885355036033269e-05 [######----] 64.00% 
spotpython tuning: 1.885355036033269e-05 [#######---] 68.00% 
spotpython tuning: 1.885355036033269e-05 [#######---] 72.00% 
spotpython tuning: 9.720933590577543e-06 [########--] 76.00% 
spotpython tuning: 8.679050871673044e-06 [########--] 80.00% 
spotpython tuning: 8.679050871673044e-06 [########--] 84.00% 
spotpython tuning: 8.679050871673044e-06 [#########-] 88.00% 
spotpython tuning: 8.679050871673044e-06 [#########-] 92.00% 
spotpython tuning: 8.679050871673044e-06 [##########] 96.00% 
spotpython tuning: 8.679050871673044e-06 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x14a6974a0>
spot_2_ei_noniso.plot_progress(log_y=True)

spot_2_ei_noniso.print_results()
min y: 8.679050871673044e-06
x0: -0.0029115430445807067
x1: -0.000449408468129777
[['x0', np.float64(-0.0029115430445807067)],
 ['x1', np.float64(-0.000449408468129777)]]
spot_2_ei_noniso.surrogate.plot()

TensorBoard visualization of the spotpython optimization process and the surrogate model. Expected improvement, non-isotropic Kriging.

12.4 Using sklearn Surrogates

12.4.1 The spot Loop

The spot loop consists of the following steps:

  1. Init: Build initial design \(X\)
  2. Evaluate initial design on real objective \(f\): \(y = f(X)\)
  3. Build surrogate: \(S = S(X,y)\)
  4. Optimize on surrogate: \(X_0 = \text{optimize}(S)\)
  5. Evaluate on real objective: \(y_0 = f(X_0)\)
  6. Impute (Infill) new points: \(X = X \cup X_0\), \(y = y \cup y_0\).
  7. Goto 3.
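The loop can be condensed into a few lines of code. The following is a self-contained sketch (sklearn GP surrogate, grid search as infill optimizer, predicted mean as infill criterion; this illustrates the steps above, it is not the spotpython implementation):

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(1)

def f(X):  # real objective: 1-dim sphere
    return (X ** 2).sum(axis=1)

X = rng.uniform(-1, 1, size=(5, 1))                     # (1) initial design
y = f(X)                                                # (2) evaluate design on f
for _ in range(10):
    S = GaussianProcessRegressor(alpha=1e-6).fit(X, y)  # (3) build surrogate
    cand = np.linspace(-1, 1, 201).reshape(-1, 1)
    X0 = cand[[np.argmin(S.predict(cand))]]             # (4) optimize on surrogate
    y0 = f(X0)                                          # (5) evaluate on f
    X, y = np.vstack((X, X0)), np.append(y, y0)         # (6) infill new point
print(y.min())                                          # (7) loop returned to (3)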

A visual representation of the spot loop is shown in the following figure:

Visual representation of the model based search with SPOT. Taken from: Bartz-Beielstein, T., and Zaefferer, M. Hyperparameter tuning approaches. In Hyperparameter Tuning for Machine and Deep Learning with R - A Practical Guide, E. Bartz, T. Bartz-Beielstein, M. Zaefferer, and O. Mersmann, Eds. Springer, 2022, ch. 4, pp. 67–114.

12.4.2 spot: The Initial Model

12.4.2.1 Example: Modifying the initial design size

This is the example “Modifying the initial design size” from Section 4.5.1 in [bart21i].

spot_ei = spot.Spot(fun=fun,
                fun_control=fun_control_init(
                lower = np.array([-1,-1]),
                upper= np.array([1,1])), 
                design_control = design_control_init(init_size=5))
spot_ei.run()
spotpython tuning: 0.13771720107579405 [####------] 40.00% 
spotpython tuning: 0.008747581912500914 [#####-----] 46.67% 
spotpython tuning: 0.002833855020194859 [#####-----] 53.33% 
spotpython tuning: 0.0008113730206162874 [######----] 60.00% 
spotpython tuning: 0.0003658334488102912 [#######---] 66.67% 
spotpython tuning: 0.000357376362957228 [#######---] 73.33% 
spotpython tuning: 0.000357376362957228 [########--] 80.00% 
spotpython tuning: 0.00032569308461158667 [#########-] 86.67% 
spotpython tuning: 0.000272252838237925 [#########-] 93.33% 
spotpython tuning: 0.0001495137205828153 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x14a60e6f0>
spot_ei.plot_progress()

np.min(spot_1.y), np.min(spot_ei.y)
(np.float64(9.915881842777748e-12), np.float64(0.0001495137205828153))

12.4.3 Init: Build Initial Design

from spotpython.design.spacefilling import SpaceFilling
from spotpython.build.kriging import Kriging
from spotpython.fun.objectivefunctions import Analytical
gen = SpaceFilling(2)
rng = np.random.RandomState(1)
lower = np.array([-5,-0])
upper = np.array([10,15])
fun = Analytical().fun_branin

X = gen.scipy_lhd(10, lower=lower, upper = upper)
print(X)
y = fun(X, fun_control=fun_control)
print(y)
[[ 8.97647221 13.41926847]
 [ 0.66946019  1.22344228]
 [ 5.23614115 13.78185824]
 [ 5.6149825  11.5851384 ]
 [-1.72963184  1.66516096]
 [-4.26945568  7.1325531 ]
 [ 1.26363761 10.17935555]
 [ 2.88779942  8.05508969]
 [-3.39111089  4.15213772]
 [ 7.30131231  5.22275244]]
[128.95676449  31.73474356 172.89678121 126.71295908  64.34349975
  70.16178611  48.71407916  31.77322887  76.91788181  30.69410529]
S = Kriging(name='kriging',  seed=123)
S.fit(X, y)
S.plot()

Designs can be made reproducible by passing a seed: identical seeds yield identical designs (compare X0 and X3 below), while different seeds yield different designs.

gen = SpaceFilling(2, seed=123)
X0 = gen.scipy_lhd(3)
gen = SpaceFilling(2, seed=345)
X1 = gen.scipy_lhd(3)
X2 = gen.scipy_lhd(3)
gen = SpaceFilling(2, seed=123)
X3 = gen.scipy_lhd(3)
X0, X1, X2, X3
(array([[0.77254938, 0.31539299],
        [0.59321338, 0.93854273],
        [0.27469803, 0.3959685 ]]),
 array([[0.78373509, 0.86811887],
        [0.06692621, 0.6058029 ],
        [0.41374778, 0.00525456]]),
 array([[0.121357  , 0.69043832],
        [0.41906219, 0.32838498],
        [0.86742658, 0.52910374]]),
 array([[0.77254938, 0.31539299],
        [0.59321338, 0.93854273],
        [0.27469803, 0.3959685 ]]))

12.4.4 Evaluate

12.4.5 Build Surrogate

12.4.6 A Simple Predictor

The code below shows how to use a simple model for prediction.

  • Assume that only two (very costly) measurements are available:

    1. f(0) = 0.5
    2. f(2) = 2.5
  • We are interested in the value at \(x_0 = 1\), i.e., \(f(x_0 = 1)\), but cannot run an additional, third experiment.

from sklearn import linear_model
X = np.array([[0], [2]])
y = np.array([0.5, 2.5])
S_lm = linear_model.LinearRegression()
S_lm = S_lm.fit(X, y)
X0 = np.array([[1]])
y0 = S_lm.predict(X0)
print(y0)
[1.5]
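The result is easy to verify by hand: the line through \((0, 0.5)\) and \((2, 2.5)\) has slope \((2.5 - 0.5)/(2 - 0) = 1\) and intercept \(0.5\), so the prediction at \(x_0 = 1\) is \(1 \cdot 1 + 0.5 = 1.5\).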
  • Central Idea:
    • Evaluation of the surrogate model S_lm is much cheaper and/or much faster than running the real-world experiment \(f\).

12.5 Gaussian Processes regression: basic introductory example

This example was taken from scikit-learn. After fitting our model, we see that the hyperparameters of the kernel have been optimized. Now, we will use our kernel to compute the mean prediction of the full dataset and plot the 95% confidence interval.

import numpy as np
import matplotlib.pyplot as plt
import math as m
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

X = np.linspace(start=0, stop=10, num=1_000).reshape(-1, 1)
y = np.squeeze(X * np.sin(X))
rng = np.random.RandomState(1)
training_indices = rng.choice(np.arange(y.size), size=6, replace=False)
X_train, y_train = X[training_indices], y[training_indices]

kernel = 1 * RBF(length_scale=1.0, length_scale_bounds=(1e-2, 1e2))
gaussian_process = GaussianProcessRegressor(kernel=kernel, n_restarts_optimizer=9)
gaussian_process.fit(X_train, y_train)
gaussian_process.kernel_

mean_prediction, std_prediction = gaussian_process.predict(X, return_std=True)

plt.plot(X, y, label=r"$f(x) = x \sin(x)$", linestyle="dotted")
plt.scatter(X_train, y_train, label="Observations")
plt.plot(X, mean_prediction, label="Mean prediction")
plt.fill_between(
    X.ravel(),
    mean_prediction - 1.96 * std_prediction,
    mean_prediction + 1.96 * std_prediction,
    alpha=0.5,
    label=r"95% confidence interval",
)
plt.legend()
plt.xlabel("$x$")
plt.ylabel("$f(x)$")
_ = plt.title("sk-learn Version: Gaussian process regression on noise-free dataset")

from spotpython.build.kriging import Kriging
import numpy as np
import matplotlib.pyplot as plt
rng = np.random.RandomState(1)
X = np.linspace(start=0, stop=10, num=1_000).reshape(-1, 1)
y = np.squeeze(X * np.sin(X))
training_indices = rng.choice(np.arange(y.size), size=6, replace=False)
X_train, y_train = X[training_indices], y[training_indices]


S = Kriging(name='kriging',  seed=123, log_level=50, cod_type="norm")
S.fit(X_train, y_train)

mean_prediction, std_prediction, ei = S.predict(X, return_val="all")

std_prediction

plt.plot(X, y, label=r"$f(x) = x \sin(x)$", linestyle="dotted")
plt.scatter(X_train, y_train, label="Observations")
plt.plot(X, mean_prediction, label="Mean prediction")
plt.fill_between(
    X.ravel(),
    mean_prediction - 1.96 * std_prediction,
    mean_prediction + 1.96 * std_prediction,
    alpha=0.5,
    label=r"95% confidence interval",
)
plt.legend()
plt.xlabel("$x$")
plt.ylabel("$f(x)$")
_ = plt.title("spotpython Version: Gaussian process regression on noise-free dataset")

12.6 The Surrogate: Using scikit-learn models

Default is the internal kriging surrogate.

S_0 = Kriging(name='kriging', seed=123)

Models from scikit-learn can be selected, e.g., Gaussian Process:

# Needed for the sklearn surrogates:
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn import linear_model
from sklearn import tree
import pandas as pd
kernel = 1 * RBF(length_scale=1.0, length_scale_bounds=(1e-2, 1e2))
S_GP = GaussianProcessRegressor(kernel=kernel, n_restarts_optimizer=9)
  • and many more:
S_Tree = DecisionTreeRegressor(random_state=0)
S_LM = linear_model.LinearRegression()
S_Ridge = linear_model.Ridge()
S_RF = RandomForestRegressor(max_depth=2, random_state=0) 
  • The scikit-learn GP model S_GP is selected.
S = S_GP
isinstance(S, GaussianProcessRegressor)
True
from spotpython.fun.objectivefunctions import Analytical
fun = Analytical().fun_branin
fun_control = fun_control_init(
    lower = np.array([-5,-0]),
    upper = np.array([10,15]),
    fun_evals = 15)    
design_control = design_control_init(init_size=5)
spot_GP = spot.Spot(fun=fun, 
                    fun_control=fun_control,
                    surrogate=S, 
                    design_control=design_control)
spot_GP.run()
spotpython tuning: 24.51465459019188 [####------] 40.00% 
spotpython tuning: 11.003092545432404 [#####-----] 46.67% 
spotpython tuning: 11.003092545432404 [#####-----] 53.33% 
spotpython tuning: 7.281405479109784 [######----] 60.00% 
spotpython tuning: 7.281405479109784 [#######---] 66.67% 
spotpython tuning: 7.281405479109784 [#######---] 73.33% 
spotpython tuning: 2.9520033012954237 [########--] 80.00% 
spotpython tuning: 2.9520033012954237 [#########-] 86.67% 
spotpython tuning: 2.1049818033904044 [#########-] 93.33% 
spotpython tuning: 1.9431597967021723 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x14a7b33e0>
spot_GP.y
array([ 69.32459936, 152.38491454, 107.92560483,  24.51465459,
        76.73500031,  86.30426863,  11.00309255,  16.11758333,
         7.28140548,  21.82343562,  10.96088904,   2.9520033 ,
         3.02912616,   2.1049818 ,   1.9431598 ])
spot_GP.plot_progress()

spot_GP.print_results()
min y: 1.9431597967021723
x0: 10.0
x1: 2.99858238342458
[['x0', np.float64(10.0)], ['x1', np.float64(2.99858238342458)]]
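The other surrogates defined above can be plugged in the same way. A sketch with the decision tree model (not run here; results will differ from the GP run, and the default infill criterion "y" is assumed, since a tree provides no predictive standard deviation):

spot_Tree = spot.Spot(fun=fun,
                      fun_control=fun_control,
                      surrogate=S_Tree,
                      design_control=design_control)
# spot_Tree.run()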

12.7 Additional Examples

# Needed for the sklearn surrogates:
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn import linear_model
from sklearn import tree
import pandas as pd
kernel = 1 * RBF(length_scale=1.0, length_scale_bounds=(1e-2, 1e2))
S_GP = GaussianProcessRegressor(kernel=kernel, n_restarts_optimizer=9)
from spotpython.build.kriging import Kriging
import numpy as np
import spotpython
from spotpython.fun.objectivefunctions import Analytical
from spotpython.spot import spot

S_K = Kriging(name='kriging',
              seed=123,
              log_level=50,
              infill_criterion = "y",
              n_theta=1,
              noise=False,
              cod_type="norm")
fun = Analytical().fun_sphere

fun_control = fun_control_init(
    lower = np.array([-1,-1]),
    upper = np.array([1,1]),
    fun_evals = 25)

spot_S_K = spot.Spot(fun=fun,
                     fun_control=fun_control,
                     surrogate=S_K,
                     design_control=design_control,
                     surrogate_control=surrogate_control)
spot_S_K.run()
spotpython tuning: 0.13771716894083716 [##--------] 24.00% 
spotpython tuning: 0.008764900158613986 [###-------] 28.00% 
spotpython tuning: 0.002831737294424899 [###-------] 32.00% 
spotpython tuning: 0.0008144336474759649 [####------] 36.00% 
spotpython tuning: 0.000363982666025624 [####------] 40.00% 
spotpython tuning: 0.0003615841465166041 [####------] 44.00% 
spotpython tuning: 0.0003590011672749327 [#####-----] 48.00% 
spotpython tuning: 0.00032913641643097483 [#####-----] 52.00% 
spotpython tuning: 0.0002791331313588125 [######----] 56.00% 
spotpython tuning: 0.00016536102611694684 [######----] 60.00% 
spotpython tuning: 1.979364042364845e-05 [######----] 64.00% 
spotpython tuning: 2.328711577671373e-06 [#######---] 68.00% 
spotpython tuning: 5.408003176528451e-07 [#######---] 72.00% 
spotpython tuning: 4.501997119079259e-07 [########--] 76.00% 
spotpython tuning: 3.902062597093855e-07 [########--] 80.00% 
spotpython tuning: 1.9521044693395355e-07 [########--] 84.00% 
spotpython tuning: 1.684568701593145e-07 [#########-] 88.00% 
spotpython tuning: 1.684568701593145e-07 [#########-] 92.00% 
spotpython tuning: 1.684568701593145e-07 [##########] 96.00% 
spotpython tuning: 1.684568701593145e-07 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x158c630b0>
spot_S_K.plot_progress(log_y=True)

spot_S_K.surrogate.plot()

spot_S_K.print_results()
min y: 1.684568701593145e-07
x0: 0.0003249898265724749
x1: 0.0002506760514762174
[['x0', np.float64(0.0003249898265724749)],
 ['x1', np.float64(0.0002506760514762174)]]

12.7.1 Optimize on Surrogate

12.7.2 Evaluate on Real Objective

12.7.3 Impute / Infill new Points

12.8 Tests

import numpy as np
from spotpython.spot import spot
from spotpython.fun.objectivefunctions import Analytical

fun_sphere = Analytical().fun_sphere

fun_control = fun_control_init(
                    lower=np.array([-1, -1]),
                    upper=np.array([1, 1]),
                    n_points = 2)
spot_1 = spot.Spot(
    fun=fun_sphere,
    fun_control=fun_control,
)

# (S-2) Initial Design:
spot_1.X = spot_1.design.scipy_lhd(
    spot_1.design_control["init_size"], lower=spot_1.lower, upper=spot_1.upper
)
print(spot_1.X)

# (S-3): Eval initial design:
spot_1.y = spot_1.fun(spot_1.X)
print(spot_1.y)

spot_1.fit_surrogate()
X0 = spot_1.suggest_new_X()
print(X0)
assert X0.size == spot_1.n_points * spot_1.k  # n_points=2 suggested points, each of dimension k=2
[[ 0.86352963  0.7892358 ]
 [-0.24407197 -0.83687436]
 [ 0.36481882  0.8375811 ]
 [ 0.415331    0.54468512]
 [-0.56395091 -0.77797854]
 [-0.90259409 -0.04899292]
 [-0.16484832  0.35724741]
 [ 0.05170659  0.07401196]
 [-0.78548145 -0.44638164]
 [ 0.64017497 -0.30363301]]
[1.36857656 0.75992983 0.83463487 0.46918172 0.92329124 0.8170764
 0.15480068 0.00815134 0.81623768 0.502017  ]
[[0.00160545 0.00421075]
 [0.00171842 0.00407727]]

12.9 EI: The Famous Schonlau Example

from spotpython.build.kriging import Kriging
import numpy as np
import matplotlib.pyplot as plt

X_train = np.array([1., 2., 3., 4., 12.]).reshape(-1,1)
y_train = np.array([0., -1.75, -2, -0.5, 5.])

S = Kriging(name='kriging',  seed=123, log_level=50, n_theta=1, noise=False, cod_type="norm")
S.fit(X_train, y_train)

X = np.linspace(start=0, stop=13, num=1000).reshape(-1, 1)
mean_prediction, std_prediction, ei = S.predict(X, return_val="all")

plt.scatter(X_train, y_train, label="Observations")
plt.plot(X, mean_prediction, label="Mean prediction")
plt.fill_between(
    X.ravel(),
    mean_prediction - 2 * std_prediction,
    mean_prediction + 2 * std_prediction,
    alpha=0.5,
    label=r"95% confidence interval",
)
plt.legend()
plt.xlabel("$x$")
plt.ylabel("$f(x)$")
_ = plt.title("Gaussian process regression on noise-free dataset")

plt.plot(X, -ei, label="Expected Improvement")  # ei is returned negated, so flip the sign
plt.legend()
plt.xlabel("$x$")
plt.ylabel("EI$(x)$")
_ = plt.title("Expected improvement on the Schonlau example")

S.log
{'negLnLike': array([1.20788205]),
 'theta': array([-0.99002508]),
 'p': [],
 'Lambda': []}

12.10 EI: The Forrester Example

The Forrester function is defined as \(f(x) = (6x - 2)^2 \sin(12x - 4)\) on \([0, 1]\); here it is evaluated with additive noise (sigma=1.0).

from spotpython.build.kriging import Kriging
import numpy as np
import matplotlib.pyplot as plt
import spotpython
from spotpython.fun.objectivefunctions import Analytical
from spotpython.spot import spot

# exact x locations are unknown:
X_train = np.array([0.0, 0.175, 0.225, 0.3, 0.35, 0.375, 0.5,1]).reshape(-1,1)

fun = Analytical().fun_forrester
fun_control = fun_control_init(
    PREFIX="07_EI_FORRESTER",
    sigma=1.0,
    seed=123,)
y_train = fun(X_train, fun_control=fun_control)

S = Kriging(name='kriging',  seed=123, log_level=50, n_theta=1, noise=False, cod_type="norm")
S.fit(X_train, y_train)

X = np.linspace(start=0, stop=1, num=1000).reshape(-1, 1)
mean_prediction, std_prediction, ei = S.predict(X, return_val="all")

plt.scatter(X_train, y_train, label="Observations")
plt.plot(X, mean_prediction, label="Mean prediction")
plt.fill_between(
    X.ravel(),
    mean_prediction - 2 * std_prediction,
    mean_prediction + 2 * std_prediction,
    alpha=0.5,
    label=r"95% confidence interval",
)
plt.legend()
plt.xlabel("$x$")
plt.ylabel("$f(x)$")
_ = plt.title("Gaussian process regression on noise-free dataset")

plt.plot(X, -ei, label="Expected Improvement")  # ei is returned negated, so flip the sign
plt.legend()
plt.xlabel("$x$")
plt.ylabel("EI$(x)$")
_ = plt.title("Expected improvement on the Forrester example")

12.11 Noise
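An interpolating Kriging model (noise=False) is forced through every observation, which is misleading when the observations are noisy. Regression Kriging instead adds a nugget \(\lambda\) to the diagonal of the correlation matrix, \[\Psi_{\lambda} = \Psi + \lambda I,\] which allows the predictor to smooth over the noise. In spotpython, noise=True activates the nugget, and the fitted value is reported as Lambda in S.log. The following examples generate noisy observations by setting sigma in fun_control_init and compare both variants.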

import numpy as np
import spotpython
from spotpython.fun.objectivefunctions import Analytical
from spotpython.spot import spot
from spotpython.design.spacefilling import SpaceFilling
from spotpython.build.kriging import Kriging
import matplotlib.pyplot as plt

gen = SpaceFilling(1)
rng = np.random.RandomState(1)
lower = np.array([-10])
upper = np.array([10])
fun = Analytical().fun_sphere
fun_control = fun_control_init(
    PREFIX="07_Y",
    sigma=2.0,
    seed=123,)
X = gen.scipy_lhd(10, lower=lower, upper = upper)
print(X)
y = fun(X, fun_control=fun_control)
print(y)
y.shape
X_train = X.reshape(-1,1)
y_train = y

S = Kriging(name='kriging',
            seed=123,
            log_level=50,
            n_theta=1,
            noise=False)
S.fit(X_train, y_train)

X_axis = np.linspace(start=-13, stop=13, num=1000).reshape(-1, 1)
mean_prediction, std_prediction, ei = S.predict(X_axis, return_val="all")

#plt.plot(X, y, label=r"$f(x) = x \sin(x)$", linestyle="dotted")
plt.scatter(X_train, y_train, label="Observations")
#plt.plot(X, ei, label="Expected Improvement")
plt.plot(X_axis, mean_prediction, label="mue")
plt.legend()
plt.xlabel("$x$")
plt.ylabel("$f(x)$")
_ = plt.title("Sphere: Gaussian process regression on noisy dataset")
[[ 0.63529627]
 [-4.10764204]
 [-0.44071975]
 [ 9.63125638]
 [-8.3518118 ]
 [-3.62418901]
 [ 4.15331   ]
 [ 3.4468512 ]
 [ 6.36049088]
 [-7.77978539]]
[-1.57464135 16.13714981  2.77008442 93.14904827 71.59322218 14.28895359
 15.9770567  12.96468767 39.82265329 59.88028242]

S.log
{'negLnLike': array([26.18505386]),
 'theta': array([-1.10547472]),
 'p': [],
 'Lambda': []}
S = Kriging(name='kriging',
            seed=123,
            log_level=50,
            n_theta=1,
            noise=True)
S.fit(X_train, y_train)

X_axis = np.linspace(start=-13, stop=13, num=1000).reshape(-1, 1)
mean_prediction, std_prediction, ei = S.predict(X_axis, return_val="all")

#plt.plot(X, y, label=r"$f(x) = x \sin(x)$", linestyle="dotted")
plt.scatter(X_train, y_train, label="Observations")
#plt.plot(X, ei, label="Expected Improvement")
plt.plot(X_axis, mean_prediction, label="mue")
plt.legend()
plt.xlabel("$x$")
plt.ylabel("$f(x)$")
_ = plt.title("Sphere: Gaussian process regression with nugget on noisy dataset")

S.log
{'negLnLike': array([21.82276721]),
 'theta': array([-2.94197609]),
 'p': [],
 'Lambda': array([4.89634062e-05])}

12.12 Cubic Function

The cubic function \(f(x) = x^3\) is evaluated with strong additive noise (sigma=10.0), again comparing interpolation (noise=False) with nugget regression (noise=True).

import numpy as np
import spotpython
from spotpython.fun.objectivefunctions import Analytical
from spotpython.spot import spot
from spotpython.design.spacefilling import SpaceFilling
from spotpython.build.kriging import Kriging
import matplotlib.pyplot as plt

gen = SpaceFilling(1)
rng = np.random.RandomState(1)
lower = np.array([-10])
upper = np.array([10])
fun = Analytical().fun_cubed
fun_control = fun_control_init(
    PREFIX="07_Y",
    sigma=10.0,
    seed=123,)

X = gen.scipy_lhd(10, lower=lower, upper = upper)
print(X)
y = fun(X, fun_control=fun_control)
print(y)
y.shape
X_train = X.reshape(-1,1)
y_train = y

S = Kriging(name='kriging',  seed=123, log_level=50, n_theta=1, noise=False)
S.fit(X_train, y_train)

X_axis = np.linspace(start=-13, stop=13, num=1000).reshape(-1, 1)
mean_prediction, std_prediction, ei = S.predict(X_axis, return_val="all")

plt.scatter(X_train, y_train, label="Observations")
#plt.plot(X, ei, label="Expected Improvement")
plt.plot(X_axis, mean_prediction, label="mue")
plt.legend()
plt.xlabel("$x$")
plt.ylabel("$f(x)$")
_ = plt.title("Cubed: Gaussian process regression on noisy dataset")
[[ 0.63529627]
 [-4.10764204]
 [-0.44071975]
 [ 9.63125638]
 [-8.3518118 ]
 [-3.62418901]
 [ 4.15331   ]
 [ 3.4468512 ]
 [ 6.36049088]
 [-7.77978539]]
[  -9.63480707  -72.98497325   12.7936499   895.34567477 -573.35961837
  -41.83176425   65.27989461   46.37081417  254.1530734  -474.09587355]

S = Kriging(name='kriging',  seed=123, log_level=0, n_theta=1, noise=True)
S.fit(X_train, y_train)

X_axis = np.linspace(start=-13, stop=13, num=1000).reshape(-1, 1)
mean_prediction, std_prediction, ei = S.predict(X_axis, return_val="all")

plt.scatter(X_train, y_train, label="Observations")
#plt.plot(X, ei, label="Expected Improvement")
plt.plot(X_axis, mean_prediction, label="mue")
plt.legend()
plt.xlabel("$x$")
plt.ylabel("$f(x)$")
_ = plt.title("Cubed: Gaussian process with nugget regression on noisy dataset")

The same comparison can be repeated for the Runge function \(f(x) = 1/(1 + x^2)\):

import numpy as np
import spotpython
from spotpython.fun.objectivefunctions import Analytical
from spotpython.spot import spot
from spotpython.design.spacefilling import SpaceFilling
from spotpython.build.kriging import Kriging
import matplotlib.pyplot as plt

gen = SpaceFilling(1)
rng = np.random.RandomState(1)
lower = np.array([-10])
upper = np.array([10])
fun = Analytical().fun_runge
fun_control = fun_control_init(
    PREFIX="07_Y",
    sigma=0.25,
    seed=123,)

X = gen.scipy_lhd(10, lower=lower, upper = upper)
print(X)
y = fun(X, fun_control=fun_control)
print(y)
y.shape
X_train = X.reshape(-1,1)
y_train = y

S = Kriging(name='kriging',  seed=123, log_level=50, n_theta=1, noise=False)
S.fit(X_train, y_train)

X_axis = np.linspace(start=-13, stop=13, num=1000).reshape(-1, 1)
mean_prediction, std_prediction, ei = S.predict(X_axis, return_val="all")

plt.scatter(X_train, y_train, label="Observations")
#plt.plot(X, ei, label="Expected Improvement")
plt.plot(X_axis, mean_prediction, label="mue")
plt.legend()
plt.xlabel("$x$")
plt.ylabel("$f(x)$")
_ = plt.title("Gaussian process regression on noisy dataset")
[[ 0.63529627]
 [-4.10764204]
 [-0.44071975]
 [ 9.63125638]
 [-8.3518118 ]
 [-3.62418901]
 [ 4.15331   ]
 [ 3.4468512 ]
 [ 6.36049088]
 [-7.77978539]]
[ 0.46517267 -0.03599548  1.15933822  0.05915901  0.24419145  0.21502359
 -0.10432134  0.21312309 -0.05502681 -0.06434374]

S = Kriging(name='kriging',
            seed=123,
            log_level=50,
            n_theta=1,
            noise=True)
S.fit(X_train, y_train)

X_axis = np.linspace(start=-13, stop=13, num=1000).reshape(-1, 1)
mean_prediction, std_prediction, ei = S.predict(X_axis, return_val="all")

plt.scatter(X_train, y_train, label="Observations")
#plt.plot(X, ei, label="Expected Improvement")
plt.plot(X_axis, mean_prediction, label="mue")
plt.legend()
plt.xlabel("$x$")
plt.ylabel("$f(x)$")
_ = plt.title("Gaussian process regression with nugget on noisy dataset")

12.13 Modifying Lambda Search Space

The nugget \(\lambda\) is itself optimized during model fitting; its search interval can be widened or narrowed via the min_Lambda and max_Lambda arguments:

S = Kriging(name='kriging',
            seed=123,
            log_level=50,
            n_theta=1,
            noise=True,
            min_Lambda=0.1,
            max_Lambda=10)
S.fit(X_train, y_train)

print(f"Lambda: {S.Lambda}")
Lambda: 0.8567558302695025
X_axis = np.linspace(start=-13, stop=13, num=1000).reshape(-1, 1)
mean_prediction, std_prediction, ei = S.predict(X_axis, return_val="all")

plt.scatter(X_train, y_train, label="Observations")
#plt.plot(X, ei, label="Expected Improvement")
plt.plot(X_axis, mean_prediction, label="mue")
plt.legend()
plt.xlabel("$x$")
plt.ylabel("$f(x)$")
_ = plt.title("Gaussian process regression with nugget on noisy dataset. Modified Lambda search space.")