Single Objective Optimization#
SO Solver#
- class udao.optimization.soo.so_solver.SOSolver#
Bases: ABC
- abstract solve(problem: SOProblem, seed: int | None = None) Tuple[float, Dict[str, float]] #
Solve a single-objective optimization problem
- Parameters:
problem (SOProblem) – Single-objective optimization problem to solve
seed (Optional[int], optional) – Random seed, by default None
- Returns:
A tuple of the objective value and the variables that optimize the objective
- Return type:
Tuple[float, Dict[str, float]]
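Example: a minimal sketch of a concrete solver implementing this interface. The SOProblem attribute accesses (problem.variables, problem.objective.function) are illustrative assumptions, not the documented API:

    from typing import Dict, Optional, Tuple

    from udao.optimization.soo.so_solver import SOSolver

    class MidpointSolver(SOSolver):
        """Toy solver: evaluates the objective at a single fixed point."""

        def solve(
            self, problem, seed: Optional[int] = None
        ) -> Tuple[float, Dict[str, float]]:
            # Illustrative: pick the midpoint of each (assumed numeric) variable range.
            point = {
                name: (var.lower + var.upper) / 2  # assumed variable attributes
                for name, var in problem.variables.items()
            }
            value = float(problem.objective.function(point))  # assumed call convention
            return value, point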
Sampler Solver#
- class udao.optimization.soo.sampler_solver.SamplerSolver(device: device | None = None)#
Bases: SOSolver, ABC
- filter_on_constraints(input_vars: Dict[str, ndarray], problem: SOProblem) Dict[str, ndarray] #
Keep only input variables that don’t violate constraints
- Parameters:
input_vars (Dict[str, np.ndarray]) – input variables, one array of sampled values per variable name
problem (SOProblem) – single-objective optimization problem whose constraints are checked
- Returns:
The subset of input variables that satisfies all constraints
- Return type:
Dict[str, np.ndarray]
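For intuition, a plain-NumPy sketch of this kind of filtering; the vectorized constraint call and the lower/upper attributes on Constraint are assumptions:

    import numpy as np

    def filter_feasible(input_vars, constraints):
        """Keep only the rows (samples) for which every constraint holds."""
        n_samples = len(next(iter(input_vars.values())))
        mask = np.ones(n_samples, dtype=bool)
        for c in constraints:
            values = c.function(input_vars)  # assumed: vectorized evaluation
            if getattr(c, "lower", None) is not None:
                mask &= values >= c.lower
            if getattr(c, "upper", None) is not None:
                mask &= values <= c.upper
        return {name: arr[mask] for name, arr in input_vars.items()}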
- solve(problem: SOProblem, seed: int | None = None) Tuple[float, Dict[str, float]] #
Solve a single-objective optimization problem
- Parameters:
problem (SOProblem) – Single-objective optimization problem to solve
seed (Optional[int], optional) – Random seed, by default None
- Returns:
A tuple of the objective value and a point (variable assignment) that satisfies the constraints and optimizes the objective
- Return type:
Tuple[float, Dict[str, float]]
- Raises:
NoSolutionError – If no feasible solution is found
Grid Search Solver#
- class udao.optimization.soo.grid_search_solver.GridSearchSolver(params: Params)#
Bases: SamplerSolver
Solves an SOO problem by grid search over the variable domains
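A hedged usage sketch; the Params field name (n_grids_per_var) is an assumption about how the grid density is configured, and `problem` stands for an SOProblem built elsewhere:

    from udao.optimization.soo.grid_search_solver import GridSearchSolver

    # 10 grid points for each of two variables (field name assumed)
    solver = GridSearchSolver(GridSearchSolver.Params(n_grids_per_var=[10, 10]))
    value, best_vars = solver.solve(problem, seed=0)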
Random Sampler Solver#
- class udao.optimization.soo.random_sampler_solver.RandomSamplerSolver(params: Params)#
Bases: SamplerSolver
Solves an SOO problem by random sampling over the variable domains
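Analogously, a hedged sketch; the Params field name (n_samples_per_param) is an assumption:

    from udao.optimization.soo.random_sampler_solver import RandomSamplerSolver

    # draw 1000 random samples per variable (field name assumed)
    solver = RandomSamplerSolver(RandomSamplerSolver.Params(n_samples_per_param=1000))
    value, best_vars = solver.solve(problem, seed=0)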
MOGD#
- class udao.optimization.soo.mogd.MOGD(params: Params)#
Bases: SOSolver
MOGD solver for single-objective optimization.
Performs gradient descent on the input variables, minimizing the combination of an objective loss and a constraint loss.
- class Params(learning_rate: float, max_iters: int, patience: int, multistart: int, objective_stress: float = 10.0, constraint_stress: float = 100000.0, strict_rounding: bool = False, batch_size: int = 1, device: Optional[torch.device] = <factory>, dtype: torch.dtype = torch.float32)#
Bases: object
- batch_size: int = 1#
batch size for gradient descent
- constraint_stress: float = 100000.0#
stress term for constraint functions
- device: device | None#
device on which to perform torch operations; by default, the available device
- dtype: dtype = torch.float32#
type of the tensors
- learning_rate: float#
learning rate of Adam optimizer applied to input variables
- max_iters: int#
maximum number of iterations for a single local search
- multistart: int#
number of random starts for gradient descent
- objective_stress: float = 10.0#
stress term for objective functions
- patience: int#
maximum number of iterations without improvement
- strict_rounding: bool = False#
whether to strictly round integer variables at each iteration
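Example: constructing the solver from the parameters documented above (values are illustrative, not recommended defaults):

    import torch as th

    from udao.optimization.soo.mogd import MOGD

    mogd = MOGD(
        MOGD.Params(
            learning_rate=0.1,      # Adam step size on the input variables
            max_iters=100,          # cap on iterations per local search
            patience=20,            # stop after 20 iterations without improvement
            multistart=4,           # random restarts for gradient descent
            objective_stress=10.0,
            constraint_stress=1e5,
            strict_rounding=False,
            batch_size=16,
            device=th.device("cpu"),
            dtype=th.float32,
        )
    )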
- constraints_loss(constraint_values: List[Tensor], constraints: Sequence[Constraint]) Tensor #
Compute the aggregated loss over the values of each constraint function.
- Parameters:
constraint_values (List[th.Tensor]) – values of each constraint function
constraints (Sequence[co.Constraint]) – constraint functions
- Returns:
Aggregated loss over the constraint values
- Return type:
th.Tensor
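A PyTorch sketch of a stress-style penalty for one constraint, in the spirit described above; the exact udao formulation may differ:

    import torch as th

    def one_constraint_loss(values: th.Tensor, lower, upper, stress: float) -> th.Tensor:
        """Zero inside the bounds; squared violation plus a stress term outside."""
        loss = th.zeros_like(values)
        if upper is not None:
            over = th.relu(values - upper)
            loss = loss + th.where(over > 0, over**2 + stress, th.zeros_like(over))
        if lower is not None:
            under = th.relu(lower - values)
            loss = loss + th.where(under > 0, under**2 + stress, th.zeros_like(under))
        return loss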
- get_meshed_categorical_vars(variables: Dict[str, Variable]) ndarray | None #
Get combinations of all categorical (binary, enum) variables
- Parameters:
variables (Dict[str, co.Variable]) – Variables to be optimized
- Returns:
Combinations of all categorical variables of shape (n_samples, n_vars)
- Return type:
Optional[np.ndarray]
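A sketch of building such a mesh with itertools; it assumes categorical variables expose their admissible values via a `values` attribute:

    import itertools
    from typing import Dict, Optional

    import numpy as np

    def mesh_categorical(variables: Dict[str, object]) -> Optional[np.ndarray]:
        """Cartesian product of the value sets of all categorical variables."""
        choices = [v.values for v in variables.values() if hasattr(v, "values")]
        if not choices:
            return None
        # one row per combination, one column per categorical variable
        return np.array(list(itertools.product(*choices)))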
- objective_loss(objective_value: Tensor, objective: Objective) Tensor #
Compute the objective loss for a given objective value:
- if no bounds are specified, use the squared objective value
- if both bounds are specified, use the squared normalized objective value when it lies within the bounds; otherwise, add a stress term to the squared distance to the middle of the bounds
- Parameters:
objective_value (th.Tensor) – Tensor of objective values
objective (co.Objective) – Objective function
- Returns:
Tensor of objective losses
- Return type:
th.Tensor
- Raises:
NotImplementedError – If only one bound is specified for the objective
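The rule above, transcribed into a PyTorch sketch (normalization details in udao may differ):

    import torch as th

    def objective_loss(value: th.Tensor, lower=None, upper=None, stress: float = 10.0) -> th.Tensor:
        if lower is None and upper is None:
            return value**2                          # unbounded: squared value
        if lower is None or upper is None:
            raise NotImplementedError("a single bound is not supported")
        norm = (value - lower) / (upper - lower)     # normalize into [0, 1]
        middle = (lower + upper) / 2
        within = (value >= lower) & (value <= upper)
        return th.where(within, norm**2, (value - middle) ** 2 + stress)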
- solve(problem: SOProblem, seed: int | None = None) Tuple[float, Dict[str, float]] #
Solve a single-objective optimization problem
- Parameters:
problem (SOProblem) – Single-objective optimization problem to solve
seed (Optional[int], optional) – Random seed, by default None
- Returns:
A tuple of the objective value and the variables that optimize the objective
- Return type:
Tuple[float, Dict[str, float]]
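A hedged end-to-end sketch, reusing the `mogd` instance constructed above; the concepts import path and the Objective/FloatVariable/SOProblem signatures are assumptions made to keep the example concrete, and `cost_model` is a placeholder for any callable predictive model:

    from udao.optimization.concepts import (  # assumed path
        FloatVariable, Objective, SOProblem,
    )

    problem = SOProblem(  # assumed signature
        objective=Objective("cost", minimize=True, function=cost_model),
        variables={"x1": FloatVariable(0.0, 1.0), "x2": FloatVariable(0.0, 1.0)},
        constraints=[],
    )
    value, best_vars = mogd.solve(problem, seed=42)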
- static within_objective_bounds(obj_value: float, objective: Objective) bool #
Check whether an objective value lies within the objective's bounds.
- Parameters:
obj_value (float) – objective value to check
objective (Objective) – objective carrying optional lower and upper bounds
- Returns:
True if the value is within the bounds, False otherwise
- Return type:
bool
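Its logic amounts to a simple bounds check; a sketch, assuming the bounds live on the Objective as optional lower/upper attributes:

    def within_bounds(obj_value: float, lower=None, upper=None) -> bool:
        """True iff the value lies inside the (closed) interval [lower, upper]."""
        if lower is not None and obj_value < lower:
            return False
        if upper is not None and obj_value > upper:
            return False
        return True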