mo.pareto

Functions

| Name | Description |
|------|-------------|
| is_pareto_efficient | Find the Pareto-efficient points from a set of points. |
| mo_pareto_optx_plot | Visualizes the Pareto-optimal points in the input space for each pair of inputs. |
| mo_xy_contour | Generates contour plots for every pair of input variables x_i and x_j. |
| mo_xy_surface | Generates surface plots for every pair of input variables x_i and x_j. |

is_pareto_efficient

mo.pareto.is_pareto_efficient(costs, minimize=True)

Find the Pareto-efficient points from a set of points.

A point is Pareto-efficient if no other point exists that is better in all objectives. This function assumes that lower values are preferred for each objective when minimize=True, and higher values are preferred when minimize=False.

Parameters

| Name | Type | Description | Default |
|------|------|-------------|---------|
| costs | np.ndarray | An (N, M) array-like object of points, where N is the number of points and M is the number of objectives. | required |
| minimize | bool | If True, the function finds Pareto-efficient points assuming lower values are better. If False, it assumes higher values are better. Defaults to True. | True |

Returns

| Type | Description |
|------|-------------|
| np.ndarray | A boolean mask of length N, where True indicates that the corresponding point is Pareto-efficient. |

Examples

>>> import numpy as np
>>> from spotoptim.mo.pareto import is_pareto_efficient
>>> points = np.array([[1, 2], [2, 1], [1.5, 1.5], [3, 3]])
>>> pareto_mask = is_pareto_efficient(points, minimize=True)
>>> print(pareto_mask)
[ True  True  True False]
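For reference, the dominance rule stated above (a point is kept unless some other point is at least as good in every objective and strictly better in at least one) can be sketched in plain NumPy. This is an illustrative O(N²) version, not the library's implementation, and `pareto_mask_sketch` is a hypothetical helper name:

```python
import numpy as np

def pareto_mask_sketch(costs, minimize=True):
    """Illustrative O(N^2) Pareto mask; not the spotoptim implementation."""
    costs = np.asarray(costs, dtype=float)
    if not minimize:
        costs = -costs  # maximization reduces to minimization by negation
    n = costs.shape[0]
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        # Point j dominates point i if it is <= in every objective
        # and strictly < in at least one.
        dominates_i = np.all(costs <= costs[i], axis=1) & np.any(costs < costs[i], axis=1)
        mask[i] = not dominates_i.any()
    return mask

pts = np.array([[1, 2], [2, 1], [1.5, 1.5], [3, 3]])
print(pareto_mask_sketch(pts))  # [ True  True  True False]
```

On the example from the docstring this reproduces the mask above: `[3, 3]` is dominated by `[1, 2]`, while the three remaining points are mutually non-dominated.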

mo_pareto_optx_plot

mo.pareto.mo_pareto_optx_plot(
    X,
    Y,
    minimize=True,
    feature_names=None,
    target_names=None,
    **kwargs,
)

Visualizes the Pareto-optimal points in the input space for each pair of inputs x_i and x_j (with i < j) and each objective f_k.

Plots are placed on a grid where rows correspond to input pairs and columns correspond to objectives.

Parameters

| Name | Type | Description | Default |
|------|------|-------------|---------|
| X | np.ndarray | An (N, D) array of input points, where N is the number of points and D is the number of variables (dimensions). | required |
| Y | np.ndarray | An (N, M) array of objective values, where N is the number of points and M is the number of objectives. | required |
| minimize | bool | If True, assumes minimization of objectives. Defaults to True. | True |
| feature_names | list | List of names for the input variables. Defaults to None. | None |
| target_names | list | List of names for the objectives. Defaults to None. | None |
| **kwargs | Any | Additional arguments passed to plt.subplots (e.g., figsize). | {} |

Returns

| Type | Description |
|------|-------------|
| None | The function displays the plots and returns None. |

Examples

>>> import numpy as np
>>> from spotoptim.mo.pareto import mo_pareto_optx_plot
>>> X = np.array([[1, 2], [3, 4], [5, 6]])
>>> Y = np.array([[1, 2], [3, 4], [5, 6]])
>>> mo_pareto_optx_plot(X, Y)
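The grid layout described above (rows for input pairs, columns for objectives) can be sketched with `itertools.combinations`; the helper name `plot_grid_layout` is hypothetical and only illustrates how panels map to (input pair, objective) combinations:

```python
from itertools import combinations

def plot_grid_layout(n_features, n_objectives):
    """Map each subplot position (row, col) to its (input pair, objective)."""
    pairs = list(combinations(range(n_features), 2))  # all (i, j) with i < j
    layout = {}
    for row, (i, j) in enumerate(pairs):
        for col in range(n_objectives):
            layout[(row, col)] = ((i, j), col)
    return pairs, layout

pairs, layout = plot_grid_layout(n_features=3, n_objectives=2)
# 3 features give 3 pairs -> 3 rows; 2 objectives -> 2 columns (6 panels)
```

For D input variables and M objectives this yields D·(D−1)/2 rows and M columns of panels.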

mo_xy_contour

mo.pareto.mo_xy_contour(
    models,
    bounds,
    target_names=None,
    feature_names=None,
    resolution=50,
    feature_pairs=None,
    **kwargs,
)

Generates contour plots for every pair of input variables x_i and x_j (with i < j) and for each objective f_k.

Parameters

| Name | Type | Description | Default |
|------|------|-------------|---------|
| models | list | List of trained models (one per objective). | required |
| bounds | list | List of tuples (min, max) for each input variable. | required |
| target_names | list | List of names for the objectives. Defaults to None. | None |
| feature_names | list | List of names for the input variables. Defaults to None. | None |
| resolution | int | Grid resolution for the contour plot. Defaults to 50. | 50 |
| feature_pairs | list | List of tuples (i, j) specifying which feature pairs to plot. If None, all combinations are plotted. Defaults to None. | None |
| **kwargs | Any | Additional keyword arguments passed to plt.subplots (e.g., figsize). | {} |

Returns

| Type | Description |
|------|-------------|
| None | The function displays the plots and returns None. |

Examples

>>> from sklearn.ensemble import RandomForestRegressor
>>> from spotoptim.mo.pareto import mo_xy_contour
>>> import numpy as np
>>> # Train dummy models
>>> X = np.random.rand(10, 2)
>>> y1 = X[:, 0] + X[:, 1]
>>> y2 = X[:, 0] * X[:, 1]
>>> m1 = RandomForestRegressor().fit(X, y1)
>>> m2 = RandomForestRegressor().fit(X, y2)
>>> # Plot
>>> mo_xy_contour([m1, m2], bounds=[(0, 1), (0, 1)], target_names=["Sum", "Prod"])
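A contour panel for a pair (x_i, x_j) is typically built by evaluating the model on a dense grid while the remaining inputs are held fixed. The sketch below illustrates this with a stand-in model exposing a scikit-learn-style `predict`; holding the other inputs at the midpoint of their bounds is an assumption for illustration, not necessarily what `mo_xy_contour` does internally:

```python
import numpy as np

class SumModel:
    """Stand-in for a trained regressor with a scikit-learn-style predict()."""
    def predict(self, X):
        return X[:, 0] + X[:, 1]  # plays the role of an objective f(x)

def contour_grid(model, bounds, i, j, resolution=50):
    """Evaluate model on a (resolution x resolution) grid over (x_i, x_j),
    holding every other input at the midpoint of its bounds."""
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    xi = np.linspace(lo[i], hi[i], resolution)
    xj = np.linspace(lo[j], hi[j], resolution)
    XI, XJ = np.meshgrid(xi, xj)
    # Start every input at its midpoint, then overwrite columns i and j.
    X = np.tile((lo + hi) / 2.0, (resolution * resolution, 1))
    X[:, i] = XI.ravel()
    X[:, j] = XJ.ravel()
    Z = model.predict(X).reshape(resolution, resolution)
    return XI, XJ, Z  # ready for plt.contourf(XI, XJ, Z)

XI, XJ, Z = contour_grid(SumModel(), bounds=[(0, 1), (0, 1)], i=0, j=1)
```

The same grid construction applies to the surface plots below; only the final matplotlib call differs (`contourf` vs. `plot_surface`).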

mo_xy_surface

mo.pareto.mo_xy_surface(
    models,
    bounds,
    target_names=None,
    feature_names=None,
    resolution=50,
    feature_pairs=None,
    **kwargs,
)

Generates surface plots for every pair of input variables x_i and x_j (with i < j) and for each objective f_k.

Parameters

| Name | Type | Description | Default |
|------|------|-------------|---------|
| models | list | List of trained models (one per objective). | required |
| bounds | list | List of tuples (min, max) for each input variable. | required |
| target_names | list | List of names for the objectives. Defaults to None. | None |
| feature_names | list | List of names for the input variables. Defaults to None. | None |
| resolution | int | Grid resolution for the surface plot. Defaults to 50. | 50 |
| feature_pairs | list | List of tuples (i, j) specifying which feature pairs to plot. If None, all combinations are plotted. Defaults to None. | None |
| **kwargs | Any | Additional keyword arguments passed to plt.subplots (e.g., figsize). | {} |

Returns

| Type | Description |
|------|-------------|
| None | The function displays the plots and returns None. |

Examples

>>> from sklearn.ensemble import RandomForestRegressor
>>> from spotoptim.mo.pareto import mo_xy_surface
>>> import numpy as np
>>> # Train dummy models
>>> X = np.random.rand(10, 2)
>>> y1 = X[:, 0] + X[:, 1]
>>> y2 = X[:, 0] * X[:, 1]
>>> m1 = RandomForestRegressor().fit(X, y1)
>>> m2 = RandomForestRegressor().fit(X, y2)
>>> # Plot
>>> mo_xy_surface([m1, m2], bounds=[(0, 1), (0, 1)], target_names=["Sum", "Prod"])