
Constraints

Why Constraints Matter in Hyperparameter Optimization and Simulations

When performing hyperparameter optimization or running large-scale simulations, constraints allow you to embed domain knowledge directly into the search space. Instead of blindly exploring all possible combinations of parameters, constraints restrict the optimization process to feasible or meaningful regions — which improves efficiency, avoids invalid configurations, and reflects real-world limitations.

What Are Constraints?

In the context of hyperparameter optimization, constraints are mathematical conditions that must be satisfied by the parameter values during the search. A common form of constraint is a linear inequality such as \( a \cdot x + b \cdot y \leq c \), where \(x\) and \(y\) are tunable parameters (e.g., learning rate, number of layers), and \(a\), \(b\), and \(c\) are fixed constants. These expressions define a subspace in which the optimizer is allowed to operate.

Why Use Constraints?

Constraints are useful for several reasons: they let you encode domain knowledge directly into the search space, keep the optimizer from wasting evaluations on invalid or infeasible configurations, and make the results reflect real-world limits.

Machine Learning Examples

For instance (an illustrative case), a constraint can keep the combination of batch size and model size within the available GPU memory, or require that one architectural parameter never exceeds another.

Simulation Examples

In simulations, constraints can enforce physical feasibility, for example (again illustrative) that material fractions sum to at most 1, or that an inner radius stays smaller than an outer one.

Mathematical Form

In general, constraints can be written as \(a_1 \cdot x_1 + a_2 \cdot x_2 + \dots + a_n \cdot x_n \leq c\), or \(x_1 \leq x_2\), where \(x_1, x_2, \dots, x_n\) are parameters and \(a_i\) and \(c\) are constants (i.e., integers or floats).
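The second form simply orders two parameters relative to each other. For example (with illustrative parameter names not taken from this page), you could require that a lower bound on the network width never exceeds the upper bound:
$$ \text{min\_width} \leq \text{max\_width} $$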

Using Constraints in OmniOpt2

OmniOpt2 allows you to specify constraints in two ways: through the graphical user interface (GUI) or via the command-line interface (CLI).

1. Using the GUI

In the GUI, you can enter a list of constraints directly. This is done through an input field where you can specify each constraint in one of the following forms (where, as before, \( x \) and \( y \) are parameters, and \( a, b, c \) are integer or float constants):
$$ a \cdot x + b \cdot y + \dots \leq c, $$
or:
$$ x \leq y. $$
For example, you might specify a constraint like:
$$ 3 \cdot \text{learning\_rate} + 2 \cdot \text{batch\_size} \leq 100 $$
This constraint restricts the optimizer to combinations where the weighted sum of the learning rate and the batch size stays at or below 100; it does not cap either parameter individually. To enter constraints in the GUI, see the screenshot below for guidance:
Constraints GUI

2. Using the CLI

In the CLI, constraints can be added using the --experiment_constraints argument. You need to encode each constraint in base64 format. Here’s an example:
--experiment_constraints $(echo "50 * learning_rate + 20 * batch_size >= 1000" | base64 -w0) $(echo "100 * learning_rate + 200 * num_layers >= 500" | base64 -w0)
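Because the inline command substitutions get hard to read with several constraints, the encoding can also be done step by step. The following is a minimal shell sketch; the omniopt invocation and the "..." for its remaining arguments are placeholders, and only --experiment_constraints and base64 -w0 are taken from the example above:

# Encode each constraint once and reuse the variables:
C1=$(echo "50 * learning_rate + 20 * batch_size >= 1000" | base64 -w0)
C2=$(echo "100 * learning_rate + 200 * num_layers >= 500" | base64 -w0)
omniopt ... --experiment_constraints "$C1" "$C2"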

Possible comparison operators

Only two comparison operators are supported: <= and >=. No other operators are possible in Ax, the optimization framework underlying OmniOpt2.
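One practical consequence is that an exact equality cannot be stated directly, but it can be approximated by combining both operators on the same expression. A sketch, with a hypothetical parameter x not taken from this page:

# Pin x to 5 by constraining it from both sides:
--experiment_constraints $(echo "1 * x <= 5" | base64 -w0) $(echo "1 * x >= 5" | base64 -w0)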

Constraints in Continued Jobs

Unless the option --disable_previous_job_constraint is set, the specified constraints are carried over to continued jobs as well. They can be overridden, though, by passing a new --experiment_constraints; this discards all old constraints and uses only the new ones.
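A sketch of how overriding might look when continuing a run; the --continue_previous_job flag and the run path are assumptions based on other parts of the OmniOpt2 documentation, and only the constraint flags are taken from this page:

# Continue a previous run but replace its old constraints with a new one:
omniopt --continue_previous_job runs/my_experiment/0 \
  --experiment_constraints $(echo "10 * learning_rate + 5 * num_layers <= 200" | base64 -w0)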