Releases: mlr-org/mlr3tuning
mlr3tuning 1.4.0
- feat: Resample stages from `CallbackResample` are now available in `CallbackBatchTuning` and `CallbackAsyncTuning`.
- fix: The `$predict_type` was written to the model even when the `AutoTuner` was not trained.
- feat: Internal tuned values are now visible in logs.
- BREAKING CHANGE: Remove internal search space argument.
- BREAKING CHANGE: The mlr3 ecosystem now has a base logger named `mlr3`. The `mlr3/bbotk` logger is a child of the `mlr3` logger and is used for logging messages from the `bbotk` and `mlr3tuning` packages.
- feat: Classes are now printed with the `cli` package.
mlr3tuning 1.3.0
- feat: Save `ArchiveAsyncTuning` to a `data.table` with `ArchiveAsyncTuningFrozen`.
- perf: Save models on the worker only when requested in `ObjectiveTuningAsync`.
mlr3tuning 1.2.1
- refactor: Only pass `extra` to `$assign_result()`.
mlr3tuning 1.2.0
- feat: Add new callback `clbk("mlr3tuning.one_se_rule")` that selects the hyperparameter configuration with the smallest feature set within one standard error of the best.
- feat: Add new stages `on_tuning_result_begin` and `on_result_begin` to `CallbackAsyncTuning` and `CallbackBatchTuning`.
- refactor: Rename stage `on_result` to `on_result_end` in `CallbackAsyncTuning` and `CallbackBatchTuning`.
- docs: Extend the `CallbackAsyncTuning` and `CallbackBatchTuning` documentation.
- compatibility: mlr3 0.22.0
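As a minimal sketch, the new callback can be attached when constructing a tuning instance. The task, learner, search space, and termination settings below are illustrative choices, not part of the release notes, and assume `ti()` accepts a `callbacks` argument as in recent mlr3tuning releases:

```r
library(mlr3)
library(mlr3tuning)

# Tuning instance with the one-standard-error-rule callback attached;
# after optimization it selects the configuration with the smallest
# feature set within one standard error of the best score.
instance = ti(
  task = tsk("sonar"),
  learner = lrn("classif.rpart", cp = to_tune(1e-4, 1e-1, logscale = TRUE)),
  resampling = rsmp("cv", folds = 3),
  measures = msr("classif.ce"),
  terminator = trm("evals", n_evals = 20),
  callbacks = clbk("mlr3tuning.one_se_rule")
)

tnr("random_search")$optimize(instance)
```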
mlr3tuning 1.1.0
- fix: The `as_data_table()` functions no longer unnest the `x_domain` column by default.
- fix: `to_tune(internal = TRUE)` now also works if non-internal tuning parameters require an `.extra_trafo`.
- feat: It is now possible to pass an `internal_search_space` manually. This allows using parameter transformations on the primary search space in combination with internal hyperparameter tuning.
- refactor: The `Tuner` now passes extra information of the result in the `extra` parameter.
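A hedged sketch of what the combination described above can look like: the primary search space for `eta` uses a log transformation, while `nrounds` is tuned internally via early stopping. The learner, parameter names, and settings are illustrative assumptions (requiring `mlr3learners` and `xgboost`), not taken from the release notes:

```r
library(mlr3)
library(mlr3tuning)
library(mlr3learners)

# eta is tuned on a log-transformed primary search space, while
# nrounds is tuned internally through early stopping on the test split.
learner = lrn("classif.xgboost",
  eta = to_tune(1e-4, 1, logscale = TRUE),
  nrounds = to_tune(upper = 500, internal = TRUE),
  early_stopping_rounds = 10,
  validate = "test"
)

instance = tune(
  tuner = tnr("random_search"),
  task = tsk("sonar"),
  learner = learner,
  resampling = rsmp("cv", folds = 3),
  term_evals = 10
)
```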
mlr3tuning 1.0.2
- refactor: Extract internal tuned values in instance.
mlr3tuning 1.0.1
- refactor: Replace internal tuning callback.
- perf: Delete intermediate `BenchmarkResult` in `ObjectiveTuningBatch` after optimization.
mlr3tuning 1.0.0
- feat: Introduce asynchronous optimization with the `TunerAsync` and `TuningInstanceAsync*` classes.
- BREAKING CHANGE: The `Tuner` class is `TunerBatch` now.
- BREAKING CHANGE: The `TuningInstanceSingleCrit` and `TuningInstanceMultiCrit` classes are `TuningInstanceBatchSingleCrit` and `TuningInstanceBatchMultiCrit` now.
- BREAKING CHANGE: The `CallbackTuning` class is `CallbackBatchTuning` now.
- BREAKING CHANGE: The `ContextEval` class is `ContextBatchTuning` now.
- refactor: Remove hotstarting from batch optimization due to low performance.
- refactor: The option `evaluate_default` is a callback now.
mlr3tuning 0.20.0
- compatibility: Work with new paradox version 1.0.0
- fix: `TunerIrace` failed with logical parameters and dependencies.
mlr3tuning 0.19.2
- refactor: Change thread limits.