import numpy as np
from spotpython.fun.objectivefunctions import Analytical
from spotpython.utils.init import fun_control_init, surrogate_control_init, design_control_init
from spotpython.spot import spot
8 Multi-dimensional Functions
This chapter illustrates how high-dimensional functions can be optimized and analyzed. For illustration, we will use the three-dimensional Sphere function, which is simple and well known. The problem dimension is \(k=3\), but the setup can easily be adapted to other, higher dimensions.
8.1 The Objective Function: 3-dim Sphere
The spotpython package provides several classes of objective functions. We will use an analytical objective function, i.e., a function that can be described by a (closed) formula: \[
f(x) = \sum_{i=1}^k x_i^2.
\]
The Sphere function is continuous, convex, and unimodal. The plot shows its two-dimensional form. The global minimum is \[ f(x) = 0, \text{ at } x = (0,0, \ldots, 0). \]
It is available as fun_sphere in the Analytical class [SOURCE].
fun = Analytical().fun_sphere
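For reference, the sphere function defined by the formula above can be written in a few lines of plain numpy, independent of spotpython:

```python
import numpy as np

def sphere(x):
    """Sum of squares; global minimum f(x*) = 0 at x* = (0, ..., 0)."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x**2))

sphere([0.0, 0.0, 0.0])  # 0.0
sphere([1.0, 2.0, 3.0])  # 14.0
```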
Here we will use problem dimension \(k=3\), which can be specified by the lower bound arrays. The size of the lower bound array determines the problem dimension. If we select -1.0 * np.ones(3), a three-dimensional function is created.
In contrast to the one-dimensional case (Section 7.5), where only one theta value was used, we will use three different theta values (one for each dimension), i.e., we set n_theta=3 in the surrogate_control. By default, spotpython sets n_theta to the problem dimension, so the n_theta parameter can be omitted in this case. More specifically, if n_theta is larger than 1 or set to the string "anisotropic", then \(k\) theta values are used, where \(k\) is the problem dimension. The meaning of "anisotropic" is explained in @sec-iso-aniso-kriging.
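The effect of using one versus \(k\) theta values can be illustrated with a Gaussian correlation kernel: with a single theta, all dimensions shrink the correlation at the same rate; with one theta per dimension, each dimension gets its own rate. The two helpers below are a self-contained sketch of this idea, not spotpython's internal Kriging code:

```python
import numpy as np

def corr_isotropic(x1, x2, theta):
    # one theta shared by all dimensions
    d = np.asarray(x1, dtype=float) - np.asarray(x2, dtype=float)
    return float(np.exp(-theta * np.sum(d**2)))

def corr_anisotropic(x1, x2, theta):
    # one theta per dimension (k values for a k-dimensional problem)
    d = np.asarray(x1, dtype=float) - np.asarray(x2, dtype=float)
    return float(np.exp(-np.sum(np.asarray(theta) * d**2)))
```

With equal theta values in every dimension, the anisotropic kernel reduces to the isotropic one; with unequal values, a unit step along a high-theta dimension lowers the correlation much more than the same step along a low-theta dimension.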
The prefix is set to "03" to distinguish the results from the one-dimensional case. Again, TensorBoard can be used to monitor the progress of the optimization.
We can also add interpretable labels to the dimensions, which will be used in the plots. Therefore, we set var_name=["Pressure", "Temp", "Lambda"] instead of the default var_name=None, which would result in the labels x_0, x_1, and x_2.
fun_control = fun_control_init(
              PREFIX="03",
              lower = -1.0*np.ones(3),
              upper = np.ones(3),
              var_name=["Pressure", "Temp", "Lambda"],
              tensorboard_log=True,
              TENSORBOARD_CLEAN=True)
surrogate_control = surrogate_control_init(n_theta=3)
spot_3 = spot.Spot(fun=fun,
                   fun_control=fun_control,
                   surrogate_control=surrogate_control)
spot_3.run()
Moving TENSORBOARD_PATH: runs/ to TENSORBOARD_PATH_OLD: runs_OLD/runs_2024_12_14_20_30_13
Created spot_tensorboard_path: runs/spot_logs/03_maans08_2024-12-14_20-30-13 for SummaryWriter()
spotpython tuning: 0.03443518849425384 [#######---] 73.33%
spotpython tuning: 0.031343410270600766 [########--] 80.00%
spotpython tuning: 0.0009628776719739201 [#########-] 86.67%
spotpython tuning: 8.551395067781694e-05 [#########-] 93.33%
spotpython tuning: 6.646694576732236e-05 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x15421dc70>
Now we can start TensorBoard in the background with the following command:
tensorboard --logdir="./runs"
and can access the TensorBoard web server with the following URL:
http://localhost:6006/
8.1.1 Results
8.1.1.1 Best Objective Function Values
The best objective function value and its corresponding input values are printed as follows:
_ = spot_3.print_results()
min y: 6.646694576732236e-05
Pressure: 0.005351119956860987
Temp: 0.001959694434893034
Lambda: 0.005830270893916998
The method plot_progress() plots current and best found solutions versus the number of iterations, as shown in Figure 8.1.
spot_3.plot_progress()
8.1.1.2 A Contour Plot
We can select two dimensions, say \(i=0\) and \(j=1\), and generate a contour plot as follows. Note that we have specified identical min_z and max_z values to generate comparable plots.
spot_3.plot_contour(i=0, j=1, min_z=0, max_z=2.25)
- In a similar manner, we can plot dimension \(i=0\) and \(j=2\):
spot_3.plot_contour(i=0, j=2, min_z=0, max_z=2.25)
- The final combination is \(i=1\) and \(j=2\):
spot_3.plot_contour(i=1, j=2, min_z=0, max_z=2.25)
- The three plots look very similar, because the fun_sphere is symmetric.
- This can also be seen from the variable importance:
_ = spot_3.print_importance()
Pressure: 95.21437451356887
Temp: 100.0
Lambda: 87.10302600165961
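spotpython derives these importance values from the fitted surrogate. One plausible way to obtain such a 0-100 scale from the \(k\) theta values of an anisotropic Kriging model is to normalize them by the largest value; the helper below is a hypothetical sketch of that idea, not necessarily spotpython's exact computation:

```python
import numpy as np

def importance_from_theta(theta):
    # hypothetical helper: scale the activity parameters so that the
    # most active dimension (largest theta) maps to 100
    theta = np.asarray(theta, dtype=float)
    return 100.0 * theta / theta.max()

importance_from_theta([1.0, 2.0, 0.5])  # array([ 50., 100.,  25.])
```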
spot_3.plot_importance()
8.1.2 TensorBoard
The second TensorBoard visualization shows the input values, i.e., \(x_0, \ldots, x_2\), plotted against the wall time.
The third TensorBoard plot illustrates how spotpython can be used as a microscope for the internal mechanisms of the surrogate-based optimization process. Here, one important parameter, the learning rate \(\theta\) of the Kriging surrogate, is plotted against the number of optimization steps.
8.1.3 Conclusion
Based on this quick analysis, we can conclude that all three dimensions are roughly equally important (as expected, because the sphere function is known to be symmetric).
8.2 Exercises
Exercise 8.1 (The Three Dimensional fun_cubed) The spotpython package provides several classes of objective functions. We will use the fun_cubed function in the Analytical class [SOURCE]. The input dimension is 3. The search range is \(-1 \leq x \leq 1\) for all dimensions.
Tasks:
- Generate contour plots.
- Calculate the variable importance.
- Discuss the variable importance:
  - Are all variables equally important?
  - If not:
    - Which is the most important variable?
    - Which is the least important variable?
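As a starting point, fun_cubed can be sketched in plain numpy. The sum-of-cubes form used here is an assumption, but it is consistent with Solution 8.1 below, where the minimum \(-3\) on \([-1, 1]^3\) is found at \((-1, -1, -1)\):

```python
import numpy as np

def cubed(x):
    # assumed form of fun_cubed: sum of cubes; on [-1, 1]^3 the
    # minimum is -3, attained at x = (-1, -1, -1)
    return float(np.sum(np.asarray(x, dtype=float)**3))

cubed([-1.0, -1.0, -1.0])  # -3.0
```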
Exercise 8.2 (The Ten Dimensional fun_wing_wt)
- The input dimension is 10. The search range is \(0 \leq x \leq 1\) for all dimensions.
- Calculate the variable importance.
- Discuss the variable importance:
  - Are all variables equally important?
  - If not:
    - Which is the most important variable?
    - Which is the least important variable?
- Generate contour plots for the three most important variables. Do they confirm your selection?
Exercise 8.3 (The Three Dimensional fun_runge)
- The input dimension is 3. The search range is \(-5 \leq x \leq 5\) for all dimensions.
- Generate contour plots.
- Calculate the variable importance.
- Discuss the variable importance:
  - Are all variables equally important?
  - If not:
    - Which is the most important variable?
    - Which is the least important variable?
Exercise 8.4 (The Three Dimensional fun_linear)
- The input dimension is 3. The search range is \(-5 \leq x \leq 5\) for all dimensions.
- Generate contour plots.
- Calculate the variable importance.
- Discuss the variable importance:
  - Are all variables equally important?
  - If not:
    - Which is the most important variable?
    - Which is the least important variable?
Exercise 8.5 (The Two Dimensional Rosenbrock Function fun_rosen)
- The input dimension is 2. The search range is \(-5 \leq x \leq 10\) for all dimensions.
- See Rosenbrock function and Rosenbrock Function for details.
- Generate contour plots.
- Calculate the variable importance.
- Discuss the variable importance:
  - Are all variables equally important?
  - If not:
    - Which is the most important variable?
    - Which is the least important variable?
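For orientation, the two-dimensional Rosenbrock function has the well-known closed form \(f(x) = (1 - x_0)^2 + 100 (x_1 - x_0^2)^2\), with its global minimum \(f(1, 1) = 0\) lying inside the given search range. A plain numpy version:

```python
import numpy as np

def rosen(x):
    """2-dim Rosenbrock: f(x) = (1 - x0)^2 + 100 * (x1 - x0^2)^2."""
    x = np.asarray(x, dtype=float)
    return float((1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2)

rosen([1.0, 1.0])  # 0.0
```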
8.3 Selected Solutions
Solution 8.1 (Solution to Exercise 8.1: The Three-dimensional Cubed Function fun_cubed). We instantiate the fun_cubed function from the Analytical class.
from spotpython.fun.objectivefunctions import Analytical
fun_cubed = Analytical().fun_cubed
- Here we will use problem dimension \(k=3\), which can be specified by the lower bound arrays. The size of the lower bound array determines the problem dimension. If we select -1.0 * np.ones(3), a three-dimensional function is created.
- In contrast to the one-dimensional case, where only one theta value was used, we will use three different theta values (one for each dimension), i.e., we can set n_theta=3 in the surrogate_control. However, this is not necessary, because by default, n_theta is set to the number of dimensions.
- The prefix is set to "cubed" to distinguish the results from the one-dimensional case.
- We set fun_evals=20 to limit the number of function evaluations to 20 for this example.
- The size of the initial design is set to 10 by default. It can be changed by setting init_size=10 via design_control_init in the design_control dictionary.
- Again, TensorBoard can be used to monitor the progress of the optimization.
- We can also add interpretable labels to the dimensions, which will be used in the plots. Therefore, we set var_name=["Pressure", "Temp", "Lambda"] instead of the default var_name=None, which would result in the labels x_0, x_1, and x_2.
Here is the link to the documentation of the fun_control_init function: [DOC]. The documentation of the design_control_init function can be found here: [DOC].
The setup can be done as follows:
fun_control = fun_control_init(
              PREFIX="cubed",
              fun_evals=20,
              lower = -1.0*np.ones(3),
              upper = np.ones(3),
              var_name=["Pressure", "Temp", "Lambda"],
              tensorboard_log=True,
              TENSORBOARD_CLEAN=True
              )
surrogate_control = surrogate_control_init(n_theta=3)
design_control = design_control_init(init_size=10)
Moving TENSORBOARD_PATH: runs/ to TENSORBOARD_PATH_OLD: runs_OLD/runs_2024_12_14_20_30_15
Created spot_tensorboard_path: runs/spot_logs/cubed_maans08_2024-12-14_20-30-15 for SummaryWriter()
- After the setup, we can pass the dictionaries to the Spot class and run the optimization process.
spot_cubed = spot.Spot(fun=fun_cubed,
                       fun_control=fun_control,
                       design_control=design_control,
                       surrogate_control=surrogate_control)
spot_cubed.run()
spotpython tuning: -1.4616833740603914 [######----] 55.00%
spotpython tuning: -1.4616833740603914 [######----] 60.00%
spotpython tuning: -2.0535965272858117 [######----] 65.00%
spotpython tuning: -2.0535965272858117 [#######---] 70.00%
spotpython tuning: -2.0535965272858117 [########--] 75.00%
spotpython tuning: -2.0535965272858117 [########--] 80.00%
spotpython tuning: -2.0535965272858117 [########--] 85.00%
spotpython tuning: -2.0906487259161515 [#########-] 90.00%
spotpython tuning: -2.0906487259161515 [##########] 95.00%
spotpython tuning: -3.0 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x156e90c20>
- Results
_ = spot_cubed.print_results()
min y: -3.0
Pressure: -1.0
Temp: -1.0
Lambda: -1.0
spot_cubed.plot_progress()
- Contour Plots
We can select two dimensions, say \(i=0\) and \(j=1\), and generate a contour plot as follows. We can specify identical min_z and max_z values to generate comparable plots. The default values are min_z=None and max_z=None, which will be replaced by the minimum and maximum values of the objective function.
min_z = -3
max_z = 1
spot_cubed.plot_contour(i=0, j=1, min_z=min_z, max_z=max_z)
- In a similar manner, we can plot dimension \(i=0\) and \(j=2\):
spot_cubed.plot_contour(i=0, j=2, min_z=min_z, max_z=max_z)
- The final combination is \(i=1\) and \(j=2\):
spot_cubed.plot_contour(i=1, j=2, min_z=min_z, max_z=max_z)
- The variable importance can be printed and visualized as follows:
_ = spot_cubed.print_importance()
Pressure: 100.0
Temp: 83.62271900424608
Lambda: 97.98572230390421
spot_cubed.plot_importance()
Solution 8.2 (Solution to Exercise 8.5: The Two-dimensional Rosenbrock Function fun_rosen).
import numpy as np
from spotpython.fun.objectivefunctions import Analytical
from spotpython.utils.init import fun_control_init, surrogate_control_init
from spotpython.spot import spot
- The Objective Function: 2-dim fun_rosen
The spotpython package provides several classes of objective functions. We will use the fun_rosen function in the Analytical class [SOURCE].
fun_rosen = Analytical().fun_rosen
- Here we will use problem dimension \(k=2\), which can be specified by the lower bound arrays.
- The size of the lower bound array determines the problem dimension. If we select -5.0 * np.ones(2), a two-dimensional function is created.
- In contrast to the one-dimensional case, where only one theta value is used, we will use \(k\) different theta values (one for each dimension), i.e., we set n_theta=2 in the surrogate_control.
- The prefix is set to "ROSEN".
- Again, TensorBoard can be used to monitor the progress of the optimization.
fun_control = fun_control_init(
              PREFIX="ROSEN",
              lower = -5.0*np.ones(2),
              upper = 10*np.ones(2),
              fun_evals=25)
surrogate_control = surrogate_control_init(n_theta=2)
spot_rosen = spot.Spot(fun=fun_rosen,
                       fun_control=fun_control,
                       surrogate_control=surrogate_control)
spot_rosen.run()
spotpython tuning: 90.78737194937058 [####------] 44.00%
spotpython tuning: 1.0172240416576994 [#####-----] 48.00%
spotpython tuning: 1.0172240416576994 [#####-----] 52.00%
spotpython tuning: 1.0172240416576994 [######----] 56.00%
spotpython tuning: 1.0172240416576994 [######----] 60.00%
spotpython tuning: 1.0172240416576994 [######----] 64.00%
spotpython tuning: 1.0172240416576994 [#######---] 68.00%
spotpython tuning: 1.0172240416576994 [#######---] 72.00%
spotpython tuning: 1.0172240416576994 [########--] 76.00%
spotpython tuning: 1.0172240416576994 [########--] 80.00%
spotpython tuning: 0.9204178748278081 [########--] 84.00%
spotpython tuning: 0.9204178748278081 [#########-] 88.00%
spotpython tuning: 0.9204178748278081 [#########-] 92.00%
spotpython tuning: 0.9204178748278081 [##########] 96.00%
spotpython tuning: 0.7259403018002056 [##########] 100.00% Done...
<spotpython.spot.spot.Spot at 0x156d4af00>
Now we can start TensorBoard in the background with the following command:
tensorboard --logdir="./runs"
and can access the TensorBoard web server with the following URL:
http://localhost:6006/
- Results
_ = spot_rosen.print_results()
min y: 0.7259403018002056
x0: 0.16580025802317414
x1: 0.08230860213240618
spot_rosen.plot_progress(log_y=True)
- A Contour Plot: We can select two dimensions, say \(i=0\) and \(j=1\), and generate a contour plot as follows.
- Note: For higher dimensions, it might be useful to have identical min_z and max_z values to generate comparable plots. The default values are min_z=None and max_z=None, which will be replaced by the minimum and maximum values of the objective function.
min_z = None
max_z = None
spot_rosen.plot_contour(i=0, j=1, min_z=min_z, max_z=max_z)
- The variable importance can be calculated as follows:
_ = spot_rosen.print_importance()
x0: 99.99999999999999
x1: 1.2430550048669098
spot_rosen.plot_importance()
8.4 Jupyter Notebook
- The Jupyter-Notebook of this lecture is available on GitHub in the Hyperparameter-Tuning-Cookbook Repository