
comet_ml.Optimizer

The Optimizer class. Used to perform a search for minimum or maximum loss for a given set of parameters. Also used for a grid or random sweep of parameter space.

Note that any keyword argument not in the following will be passed on to the Experiment constructor. For example, you can pass project_name and logging arguments by listing them here.

Parameters:

  • config (str | dict, default: None ) –

    Optional if COMET_OPTIMIZER_ID is configured; otherwise either a config dictionary, an optimizer id, or a config filename.

  • trials (int, default: None ) –

    Number of trials per parameter set to test.

  • verbose (int, default: 1 ) –

    Verbosity level where 0 means no output, and 1 (or greater) means to show more detail.

  • experiment_class (str | callable, default: 'Experiment' ) –

    Class to use (for example, OfflineExperiment).

Example
import comet_ml

comet_ml.login()

# Assume COMET_OPTIMIZER_ID is configured:
opt = comet_ml.Optimizer()

# An optimizer config dictionary:
opt = comet_ml.Optimizer({"algorithm": "bayes", ...})

# An optimizer id:
opt = comet_ml.Optimizer("73745463463")

# The filename of an optimizer config file:
opt = comet_ml.Optimizer("/tmp/mlhacker/optimizer.config")
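For reference, a complete config dictionary might look like the following sketch. The parameter and spec field names mirror those shown in the status() output later on this page; the specific values and the sweep name are illustrative assumptions, not defaults:

```python
# Illustrative optimizer config (values are assumptions, not defaults):
config = {
    "algorithm": "bayes",      # "grid", "random", or "bayes"
    "name": "my-search",       # hypothetical sweep name
    "spec": {
        "metric": "loss",      # metric the optimizer tracks
        "maxCombo": 0,         # 0 means no limit on combinations
    },
    "parameters": {
        "x": {"type": "integer", "min": 0, "max": 20, "scalingType": "uniform"},
    },
    "trials": 1,               # trials per parameter set
}
```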

To pass arguments to the Experiment constructor, pass them into the opt.get_experiments() call, like so:

import comet_ml

comet_ml.login()
opt = comet_ml.Optimizer("/tmp/mlhacker/optimizer.config")

for experiment in opt.get_experiments(
    project_name="my-project",
    auto_metric_logging=False,
):
    loss = fit(model)
    experiment.log_metric("loss", loss)
    experiment.end()

Functions

__init__

__init__(
    config=None,
    trials=None,
    verbose=1,
    experiment_class="Experiment",
    api_key=None,
    **kwargs
)

The Optimizer constructor.

Parameters:

  • config (str | dict, default: None ) –

    Can be an optimizer config id, an optimizer config dictionary, or an optimizer config filename.

  • trials (int, default: None ) –

    Number of trials per parameter value set.

  • verbose (int, default: 1 ) –

    Level of detail to show; 0 means quiet.

  • experiment_class (str | callable, default: 'Experiment' ) –

    Supported values are "Experiment" (the default) to use online Experiments or "OfflineExperiment" to use offline Experiments. It can also be a callable (a function or a method) that returns an instance of Experiment, OfflineExperiment, ExistingExperiment or ExistingOfflineExperiment.

See above for examples.
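As a sketch, a callable passed as experiment_class might look like the hypothetical factory below. It is assumed that the Optimizer invokes the callable once per experiment, forwarding keyword arguments, and expects an Experiment-like object back:

```python
def make_experiment(**kwargs):
    # Hypothetical factory: returns an offline experiment for every
    # trial in the sweep, forwarding any keyword arguments it receives.
    import comet_ml  # assumed installed and configured
    return comet_ml.OfflineExperiment(**kwargs)

# Usage sketch (config is assumed to be a valid optimizer config):
# opt = comet_ml.Optimizer(config, experiment_class=make_experiment)
```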

end

end(experiment)

Optimizer.end() is called at the end of an experiment. Usually you would not call this manually; it is called automatically when the experiment ends.

get_experiments

get_experiments(**kwargs)

Optimizer.get_experiments() will iterate over all possible experiments for this sweep or search, one at a time. All experiments will have a unique set of parameter values (unless performing multiple trials per parameter set).

Example
import comet_ml

comet_ml.login()
opt = comet_ml.Optimizer({"algorithm": "bayes", ...})

for experiment in opt.get_experiments():
    loss = fit(x, y)
    experiment.log_metric("loss", loss)

get_id

get_id()

Get the id of this optimizer.

Example
import comet_ml

comet_ml.login()
opt = comet_ml.Optimizer({"algorithm": "bayes", ...})

opt.get_id()

get_parameters

get_parameters()

Optimizer.get_parameters() will iterate over all possible parameters for this sweep or search. All parameter combinations will be emitted once (unless performing multiple trials per parameter set).

Example
import comet_ml

comet_ml.login()
opt = comet_ml.Optimizer({"algorithm": "bayes", ...})

for parameters in opt.get_parameters():
    experiment = comet_ml.Experiment()
    experiment.log_parameters(parameters)
    loss = fit(x, y)
    experiment.log_metric("loss", loss)

next

next(**kwargs: Dict[str, str]) -> Optional[CometExperiment]

Optimizer.next() will return the next experiment for this sweep or search. All experiments will have a unique set of parameter values (unless performing multiple trials per parameter set).

Normally, you would not call this directly; use the generator Optimizer.get_experiments() instead.

Parameters:

  • kwargs (Any, default: {} ) –

    Any keyword argument will be passed to the Experiment class for creation. The API key is passed directly.

Example
import comet_ml

comet_ml.login()
opt = comet_ml.Optimizer({"algorithm": "bayes", ...})

experiment = opt.next()
if experiment is not None:
    loss = fit(x, y)
    experiment.log_metric("loss", loss)

next_data

next_data()

Optimizer.next_data() will return the next parameters in the Optimizer sweep.

Normally, you would not call this directly; use the generator Optimizer.get_parameters() instead.

Example
import comet_ml

comet_ml.login()
opt = comet_ml.Optimizer({"algorithm": "bayes", ...})
parameters = opt.next_data()

status

status()

Get the status from the optimizer server for this optimizer.

Example

Running the code sample:

import comet_ml

comet_ml.login()
opt = comet_ml.Optimizer({"algorithm": "bayes", ...})

opt.status()

will return a dict like the following:

{
    'algorithm': 'grid',
    'comboCount': 32,
    'configSpaceSize': 10,
    'endTime': None,
    'id': 'c20b90ecad694e71bdb5702778eb8ac7',
    'lastUpdateTime': None,
    'maxCombo': 0,
    'name': 'c20b90ecad694e71bdb5702778eb8ac7',
    'parameters': {
        'x': {
            'max': 20,
            'min': 0,
            'scalingType': 'uniform',
            'type': 'integer'
        }
    },
    'retryCount': 0,
    'spec': {
        'gridSize': 10,
        'maxCombo': 0,
        'metric': 'loss',
        'minSampleSize': 100,
        'randomize': False,
        'retryLimit': 20,
        'seed': 2997936454
    },
    'startTime': 1558793292216,
    'state': {
        'sequence_i': 0,
        'sequence_retry': 11,
        'sequence_trial': 0,
        'total': 32,
    },
    'status': 'running',
    'trials': 1,
    'version': '1.0.0'
}
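The state block can be used to track sweep progress. A minimal sketch reading the values shown above, assuming sequence_i indexes the current parameter set and total is the sweep size:

```python
# A status dict trimmed to the fields used below (values copied from
# the example output above):
status = {
    "status": "running",
    "state": {
        "sequence_i": 0,
        "sequence_retry": 11,
        "sequence_trial": 0,
        "total": 32,
    },
}

# Assumption: sequence_i indexes the current parameter set out of total.
done = status["state"]["sequence_i"]
total = status["state"]["total"]
print(f"{status['status']}: {done}/{total} parameter sets")
```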
Dec. 17, 2024