
Recommended workflow for tracking a tuning job on a sklearn model #2295

@krishnashanker-amt

Description


Hello!

I'm tuning a sklearn model (by providing a training script to `sagemaker.sklearn.estimator.SKLearn`) with `sagemaker.tuner.HyperparameterTuner` (mostly following the logic shown here).
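For context, here is a minimal sketch of the setup I'm describing. The entry point name, IAM role, S3 path, hyperparameter name, and metric regex are all placeholders, not my actual values:

```python
# Sketch only — script name, role, S3 path, and metric regex are placeholders.
from sagemaker.sklearn.estimator import SKLearn
from sagemaker.tuner import HyperparameterTuner, ContinuousParameter

estimator = SKLearn(
    entry_point="train.py",           # placeholder training script
    framework_version="0.23-1",
    instance_type="ml.m5.large",
    role="my-sagemaker-role",         # placeholder IAM role ARN
)

tuner = HyperparameterTuner(
    estimator=estimator,
    objective_metric_name="test_metric",
    hyperparameter_ranges={"alpha": ContinuousParameter(0.01, 1.0)},
    # The tuner scrapes metrics from the training logs via regex.
    metric_definitions=[{"Name": "test_metric",
                         "Regex": "test_metric: ([0-9\\.]+)"}],
    max_jobs=4,
    max_parallel_jobs=2,
)

# Unlike Estimator.fit(), HyperparameterTuner.fit() accepts no
# experiment_config argument, which is the crux of my question.
tuner.fit({"train": "s3://my-bucket/train"})  # placeholder S3 input
```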

I'm not sure how to track a tuning job set up this way using SageMaker Experiments (I know we can track a job in the console). I'm able to link all trials created by the tuning job to an experiment AFTER the tuning job completes, using the logic followed here. The `fit()` of `HyperparameterTuner` does not support `experiment_config`. I've also tried setting up a tracker in the training script (using `my_tracker = Tracker.load()`). This approach creates a trial component for each training job, but `my_tracker.log_metric(metric_name='test_metric', value=0.4452)` does not work (no metric is logged in the trial component by the tracker).
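For reference, this is roughly what I tried inside the training script. One thing I'm unsure about is whether the tracker needs to be explicitly closed to flush its buffered writes; the sketch below uses it as a context manager for that reason. This assumes the `smexperiments` package is installed in the training container:

```python
# Sketch of in-script tracking — assumes the smexperiments package is
# available inside the SageMaker training container.
from smexperiments.tracker import Tracker

# Inside a SageMaker training job, Tracker.load() resolves the trial
# component associated with that job from the job environment.
with Tracker.load() as my_tracker:
    # ... training logic ...
    my_tracker.log_metric(metric_name="test_metric", value=0.4452)
# Exiting the `with` block closes the tracker; my understanding is that
# metrics are buffered and only flushed on close, so skipping this step
# could explain why no metric appears on the trial component.
```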

What am I missing? How can I track a tuning job with custom metrics using sagemaker-python-sdk and SageMaker Experiments?
