The hyperparameters used for the fine-tuning job.

interface Hyperparameters {
    batch_size?: number | "auto";
    learning_rate_multiplier?: number | "auto";
    n_epochs?: number | "auto";
}

Properties

batch_size?: number | "auto"

Number of examples in each batch. A larger batch size means that model parameters are updated less frequently, but each update has lower variance.

learning_rate_multiplier?: number | "auto"

Scaling factor for the learning rate. A smaller learning rate may be useful to avoid overfitting.

n_epochs?: number | "auto"

The number of epochs to train the model for. An epoch refers to one full cycle through the training dataset.
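As a sketch of how these properties fit together, the example below builds a Hyperparameters value mixing explicit numbers with "auto" (which leaves the choice to the service). The object shape matches the interface above; passing it to a fine-tuning job creation call is assumed usage, so check your SDK's signature.

```typescript
// Shape from the interface above; every property is optional.
interface Hyperparameters {
  batch_size?: number | "auto";
  learning_rate_multiplier?: number | "auto";
  n_epochs?: number | "auto";
}

// "auto" defers the choice to the service; a number overrides it.
const hyperparameters: Hyperparameters = {
  batch_size: "auto",            // let the service pick the batch size
  learning_rate_multiplier: 0.1, // smaller multiplier to reduce overfitting risk
  n_epochs: 3,                   // three full passes over the training dataset
};

// Hypothetical usage when creating a fine-tuning job (verify against your SDK):
// await client.fineTuning.jobs.create({
//   model: "...",
//   training_file: "...",
//   hyperparameters,
// });

console.log(hyperparameters); // { batch_size: 'auto', learning_rate_multiplier: 0.1, n_epochs: 3 }
```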