Central orchestrator for surrogate-model-based optimization: constructor, optimize(), and result inspection.
SpotOptim is the main entry point for all optimization in spotoptim. You create an instance with your objective function and bounds, call optimize(), and get back a scipy-compatible OptimizeResult.
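The example that produced the output below is not shown here; the following is a minimal sketch of such a call, where the 2D sphere objective, the bounds, and the seed are assumptions chosen for illustration, not taken from the original example:

```python
import numpy as np
from spotoptim import SpotOptim

# Assumed objective: 2D sphere, minimum at the origin.
def sphere(X):
    X = np.atleast_2d(X)
    return np.sum(X**2, axis=1)

opt = SpotOptim(
    fun=sphere,
    bounds=[(-5.0, 5.0), (-5.0, 5.0)],
    max_iter=20,  # total budget, including the initial design
    seed=0,       # assumed, for reproducibility
)
result = opt.optimize()
print(f"Best x     : {result.x}")
print(f"Best f(x)  : {result.fun:.6f}")
print(f"Evaluations: {result.nfev}")  # nfev is the scipy-style evaluation count
```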
Best x : [-0.00016718 0.00071419]
Best f(x) : 0.000001
Evaluations: 20
Only three ingredients are required: a callable fun, a list of bounds, and a budget via max_iter (the total number of evaluations, including the initial design). By default, the optimizer builds a Kriging surrogate and uses predicted-value acquisition (acquisition="y").
Key Constructor Parameters
The constructor accepts many parameters. The most commonly used, all of which appear in the examples on this page, are:
- fun: the objective function, called with a 2D array of candidate points
- bounds: a list of (lower, upper) tuples, one per variable, in natural scale
- max_iter: total evaluation budget, including the initial design
- n_initial: size of the initial design
- var_type, var_name, var_trans: per-variable types, names, and transformations
- acquisition: the acquisition function ("y", "ei", or "pi")
- seed: random seed for reproducibility
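The output below presumably comes from an example mixing variable types; since that example is missing here, the following is a hypothetical reconstruction, where the objective, bounds, budget, and seed are all assumptions:

```python
import numpy as np
from spotoptim import SpotOptim

# Assumed mixed-type objective: a float, an integer, and a factor variable.
def mixed_obj(X):
    X = np.atleast_2d(X)
    return X[:, 0]**2 + (X[:, 1] - 3)**2 + X[:, 2]**2

opt = SpotOptim(
    fun=mixed_obj,
    bounds=[(-1.0, 1.0), (0, 5), (0, 2)],
    var_type=["float", "int", "factor"],  # per-variable types
    max_iter=30,  # assumed budget
    seed=0,       # assumed
)
result = opt.optimize()
print(f"Best x   : {result.x}")
print(f"Best f(x): {result.fun:.6f}")
```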
Best x : [-9.98879117e-04 3.00000000e+00 0.00000000e+00]
Best f(x) : 0.000001
When var_type is not provided, all variables default to "float".
Variable Transformations
Use var_trans to apply log-transformations to parameters that span several orders of magnitude (e.g., learning rates):
```python
import numpy as np
from spotoptim import SpotOptim

def obj_with_lr(X):
    X = np.atleast_2d(X)
    lr = X[:, 0]
    weight = X[:, 1]
    return (lr - 0.001)**2 + (weight - 0.5)**2

opt = SpotOptim(
    fun=obj_with_lr,
    bounds=[(1e-5, 1e-1), (0.0, 1.0)],
    var_type=["float", "float"],
    var_name=["lr", "weight"],
    var_trans=["log10", None],
    max_iter=20,
    n_initial=10,
    seed=0,
)
result = opt.optimize()
print(f"Best lr     : {result.x[0]:.6f}")
print(f"Best weight : {result.x[1]:.4f}")
print(f"Best f(x)   : {result.fun:.8f}")
```
Best lr : 0.000010
Best weight : 0.4981
Best f(x) : 0.00000477
With var_trans=["log10", None], the first variable is optimized in log10 space internally. The bounds are specified in natural (original) scale — here 1e-5 to 1e-1 for the learning rate.
Choosing an Acquisition Function
The acquisition parameter controls how the optimizer selects the next evaluation point:
"y" (default) — minimize the surrogate’s predicted value (pure exploitation)
"ei" — Expected Improvement: balances exploitation and exploration
"pi" — Probability of Improvement: probability of beating the current best
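To illustrate what these criteria compute, independently of spotoptim's internals, here are the standard closed forms for EI and PI under a Gaussian surrogate prediction N(mu, sigma²) when minimizing; the inputs mu, sigma, and f_best are illustrative values, not taken from any example above:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """Closed-form EI for minimization: E[max(f_best - f, 0)] under N(mu, sigma^2)."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    with np.errstate(divide="ignore", invalid="ignore"):
        z = (f_best - mu) / sigma
        ei = (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    # Where sigma == 0 the prediction is deterministic; EI reduces to max(f_best - mu, 0).
    return np.where(sigma > 0, ei, np.maximum(f_best - mu, 0.0))

def probability_of_improvement(mu, sigma, f_best):
    """PI for minimization: probability that the prediction beats the current best."""
    z = (f_best - np.asarray(mu, float)) / np.asarray(sigma, float)
    return norm.cdf(z)

mu = np.array([0.5, 0.2, 0.9])     # predicted means at three candidates
sigma = np.array([0.1, 0.3, 0.05]) # predictive standard deviations
f_best = 0.4                       # current best observed value
print(expected_improvement(mu, sigma, f_best))
print(probability_of_improvement(mu, sigma, f_best))
```

Note how the second candidate, whose mean already beats f_best and whose uncertainty is largest, scores highest on both criteria, while "y" alone would simply pick the lowest mean.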
See Surrogate Models for Kriging options, MLPSurrogate, and how to plug in sklearn estimators.
Noisy Optimization
For noisy objective functions, use repeats_initial and repeats_surrogate to re-evaluate each design point multiple times. Add ocba_delta to enable Optimal Computing Budget Allocation, which intelligently allocates extra evaluations to the most promising points.
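A sketch of such a setup is shown below; the noisy sphere objective and all parameter values are illustrative assumptions, only the parameter names (repeats_initial, repeats_surrogate, ocba_delta) come from the text above:

```python
import numpy as np
from spotoptim import SpotOptim

rng = np.random.default_rng(0)

# Assumed objective: a sphere function with additive Gaussian noise.
def noisy_sphere(X):
    X = np.atleast_2d(X)
    return np.sum(X**2, axis=1) + rng.normal(0.0, 0.1, size=X.shape[0])

opt = SpotOptim(
    fun=noisy_sphere,
    bounds=[(-5.0, 5.0), (-5.0, 5.0)],
    repeats_initial=3,    # evaluate each initial design point 3 times (assumed value)
    repeats_surrogate=2,  # evaluate each proposed point twice (assumed value)
    ocba_delta=5,         # extra evaluations per iteration, allocated via OCBA (assumed value)
    max_iter=80,
    seed=0,
)
result = opt.optimize()
```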
The optimizer stops when either max_iter or max_time is reached, whichever comes first.
Restarts
When the optimizer gets stuck in a local minimum, automatic restarts can help. Set restart_after_n to trigger a restart after that many consecutive iterations without improvement:
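A sketch of this usage follows; the Rastrigin objective (a standard multimodal test function) and the specific parameter values are assumptions for illustration:

```python
import numpy as np
from spotoptim import SpotOptim

# Assumed objective: Rastrigin, a test function with many local minima.
def rastrigin(X):
    X = np.atleast_2d(X)
    return 10 * X.shape[1] + np.sum(X**2 - 10 * np.cos(2 * np.pi * X), axis=1)

opt = SpotOptim(
    fun=rastrigin,
    bounds=[(-5.12, 5.12), (-5.12, 5.12)],
    restart_after_n=10,  # restart after 10 iterations without improvement (assumed value)
    max_iter=100,
    seed=0,
)
result = opt.optimize()
```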