
Conversation

@pat-s
Member

@pat-s pat-s commented Aug 14, 2018

Supersedes #2267

To do

  • Write tutorial section and explain the difference to "blocking"

Idea

Use fully predefined indices via the blocking argument in the task for resampling.

Single resampling

4 blocking levels -> 4 folds.

Nested Resampling

Outer: 4 blocking levels -> 4 folds
Inner: 3 blocking levels -> 3 folds

Only compatible with 1 repetition, hence only with "CV".

To use predefined indices in "RepCV", one should use the already existing "blocking" implementation. This differs in that the number of blocking levels does not also define the number of folds, so more combinations than just the number of folds can be generated.
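
For illustration, a single resampling with predefined folds could be set up roughly like this (just a sketch with made-up data; the data frame and the `spatial.block` factor are hypothetical):

```r
library(mlr)

# toy data plus a factor that defines the predefined groups (hypothetical)
df = data.frame(
  x1 = rnorm(40), x2 = rnorm(40),
  y = factor(sample(c("a", "b"), 40, replace = TRUE)),
  spatial.block = factor(rep(paste0("block", 1:4), each = 10))
)

# hand the grouping factor to the task via 'blocking'
task = makeClassifTask(data = df[, c("x1", "x2", "y")], target = "y",
  blocking = df$spatial.block)

# 4 blocking levels -> 4 folds, each fold corresponding to exactly one level
rdesc = makeResampleDesc("CV", iters = 4, fixed = TRUE)
r = resample(makeLearner("classif.rpart"), task, rdesc)
```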

Implementation

The user needs to initialize this special resampling by setting fixed = TRUE in makeResampleDesc(). Otherwise, the "blocking" implementation will be used.

As mentioned, the approach uses the factor variable supplied to the task via the blocking argument. I know this can cause confusion between "blocking" and "grouping", but we need to differentiate the two approaches somehow.
On the other hand, we do not need a new task argument that takes a factor vector.

In a nested setting, a possible workflow would look as follows:

inner = makeResampleDesc("CV", iters = 4, fixed = TRUE)
outer = makeResampleDesc("CV", iters = 5, fixed = TRUE)
tune_wrapper = makeTuneWrapper(lrn, resampling = inner, par.set = ps,
 control = ctrl, show.info = FALSE)

p = resample(tune_wrapper, ct, outer, show.info = FALSE,
 extract = getTuneResult)

So rather than doing a random sampling, we use the predefined indices specified in "blocking".
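
To double-check which observations end up in which fold, the chosen indices can be inspected afterwards (a sketch reusing `p` from the snippet above):

```r
# outer test indices: one fold per blocking level
str(getResamplingIndices(p))

# inner train/test indices of each tuning run
# (available here because resample() was called with extract = getTuneResult)
str(getResamplingIndices(p, inner = TRUE))
```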

The function is smart enough to also deal with a slight misspecification by issuing a warning:

inner = makeResampleDesc("CV", iters = 5, fixed = TRUE)
outer = makeResampleDesc("CV", iters = 5, fixed = TRUE)

"iters (5) is not equal to length of blocking levels (4)!

If inner > outer, an error will be thrown.

By design, the inner fold count always needs to be one less than the outer count.

Users can also combine fixed indices in the outer loop with random sampling in the inner loop:

inner = makeResampleDesc("CV", iters = 5)
outer = makeResampleDesc("CV", iters = 5, fixed = TRUE)
tune_wrapper = makeTuneWrapper(lrn, resampling = inner, par.set = ps,
  control = ctrl, show.info = FALSE)
expect_success(resample(tune_wrapper, ct, outer, show.info = FALSE,
  extract = getTuneResult))

To explicitly avoid clashes between "fixed" and "blocking" when a "blocking" factor was given in the task, I had to add a small helper argument.
To use "blocking" in single "CV", the user now needs to enable it explicitly via makeResampleDesc("CV", iters = 5, blocking.cv = TRUE).
But I suspect people would always use "blocking" with "RepCV" anyway?

Just for clarification: this PR changes nothing about the existing "blocking" implementation besides the need to explicitly trigger it when using "CV".
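
To summarize the two modes for a task that carries a "blocking" factor, the resample descriptions would be created like this (a sketch; the iters values are only illustrative):

```r
# classic "blocking": observations of one level always stay together, but a fold
# may combine several levels, so iters is not tied to the level count;
# in plain "CV" this now has to be requested explicitly
rdesc_blocking = makeResampleDesc("CV", iters = 2, blocking.cv = TRUE)

# "fixed": every blocking level becomes exactly one fold;
# iters is adjusted if it does not match the number of levels
rdesc_fixed = makeResampleDesc("CV", iters = 4, fixed = TRUE)
```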

@larskotthoff
Member

Thanks.

What exactly does fixed = TRUE do here? I didn't understand the difference between your first and second example.

Since the number of iterations is fixed by the number of levels, why not make this the automatic choice instead of asking the user to specify it again?

@pat-s
Member Author

pat-s commented Aug 14, 2018

What exactly does fixed = TRUE do here? I didn't understand the difference between your first and second example.

fixed = TRUE triggers the usage of the indices specified in "blocking". Otherwise, a "normal" random sampling CV is applied.

Since the number of iterations is fixed by the number of levels, why not make this the automatic choice instead of asking the user to specify it again?

I didn't do this at first because I had problems distinguishing between inner and outer, and it was easier to get this information from the user (inner should always be one less than outer).
However, since the function can now adapt correctly when both are set to the maximum number of levels, I could just hardcode this. Thanks!

(Another reason was that for a long time I did not use "fixed" and so I had no flag telling me whether I was in a "blocking", "fixed" or "normal" setting.)

@pat-s
Member Author

pat-s commented Aug 16, 2018

Vignette update added. Please review using the netlify preview: https://deploy-preview-2412--nervous-hopper-4136be.netlify.com/articles/resample.html

R/ResampleDesc.R Outdated
#' else it will be a fraction of the total training indices. IE for 100 training sets and a value of .2, the increment
#' of the resampling indices will be 20. Default is \dQuote{horizon} which gives mutually exclusive chunks
#' of test indices.}
#' \item{fixed (`logical(1)`)}{Whether indices supplied via argument 'blocking' in the task should be used in resampling. Default is `FALSE`.
Member

This wording suggests that blocking is ignored in resampling, and the documentation for blocking says the opposite.

Member Author

Made it clearer now. Please take another look :)

R/ResampleDesc.R Outdated
#' of the resampling indices will be 20. Default is \dQuote{horizon} which gives mutually exclusive chunks
#' of test indices.}
#' \item{fixed (`logical(1)`)}{Whether indices supplied via argument 'blocking' in the task should be used in resampling. Default is `FALSE`.
#' 'grouping' only works with 'CV' and the supplied indices must match the number of observations.}
Member

Where does the grouping come from?

Member Author

Leftover, should be fixed

R/ResampleDesc.R Outdated
#' of test indices.}
#' \item{fixed (`logical(1)`)}{Whether indices supplied via argument 'blocking' in the task should be used in resampling. Default is `FALSE`.
#' 'grouping' only works with 'CV' and the supplied indices must match the number of observations.}
#' \item{blocking.cv (`logical(1)`)}{Should 'blocking' be used in 'CV'? Default to `FALSE`}
Member

It sounds like this does the same thing as fixed.

Member Author

I added more detail to the doc and a link to the (not yet existing) tutorial page.

if (length(blocking)) {
# 'fixed' only exists by default for 'CV' -> is.null(desc$grouping)
# only use this way of blocking of 'fixed = FALSE' -> is.null(desc$grouping)
if(is.null(desc$fixed)) {
Member

Space after if (isn't this supposed to be checked automatically?).

Member

Also I would give fixed a default value in makeResampleDesc so that this can never be NULL to remove the additional check.

Member Author

Set a default value.

} else {
fixed = TRUE
}
}
Member

With default values for fixed and blocking.cv this could be simply fixed = desc$fixed; blocking.cv = desc$blocking.cv.

Member Author

Yes this simplifies things. I've set defaults in makeResampleDesc().

}

if (desc$iters != length(levels(task$blocking))) {
desc$iters = length(levels(task$blocking))
Member

Why is the number of iterations being adjusted here? There should be a warning.

Member Author

Two reasons:

  1. If the user supplies a number of iterations in makeResampleDesc() that differs from the number of levels used for fixed, the function will error.
  2. In the inner call, the function can adapt by automatically dropping one level. So always having iters = length(levels(task$blocking)) is the safest setting for the function to work with.

I added the explanation as a comment.

desc$iters = length(levels(task$blocking))
}
levs = levels(task$blocking)
n_levels = length(levels(task$blocking))
Member

Or just length(levs).

Member Author

Yes, thanks.

test.inds = lapply(inst$test.inds, function(i) which(task$blocking %in% levs[i]))

# Nested resampling: We need to create a list with length(levels) first.
# Then one fold will be length(0) because we are missing one factor level because we are in the inner level
Member

What happens if the number of outer folds is less than the number of levels (or a simple train/test split) and more than one factor level is missing?

Member Author

In

if (desc$iters != length(levels(task$blocking))) {
desc$iters = length(levels(task$blocking))
we make sure that the number of folds always equals the number of levels.

Member

Ah ok. Could you add information on what the number of levels was and what it was set to in the warning please?

Member Author

The problem we have here is the following:

  • The warning is triggered for both the inner and the outer level
  • Determining which level we are on (inner/outer) is tricky
  • The inner level first resets to the factor levels of the outer level and is then further adjusted. So we actually get a false positive for the inner level even if it's set correctly (e.g. outer = 5, inner = 4).

Even if this may not sound logical, I would even vote for removing the warning completely. Users who set fixed = TRUE usually know what they want and what they need to set.
It's counterproductive if the warning is raised even though the specification is correct (e.g. outer = 5, inner = 4).

Not sure if this reasoning is easy to follow here. Let me know if I should explain it again in more detail.

I would propose to mention the adjustment in the tutorial (and in the help page?).

Member

But at some point you can figure out whether there are enough levels for the folds, right? So there shouldn't need to be any false positives?

Member Author

Further down in the function I have the final number of levels. But at that point I have no information about whether the level count was adjusted or not. And even if I added a flag, I would be missing the original level count since desc$iters is reassigned.

I think implementing a robust warning for both inner and outer is not worth the effort.
I would prefer to note it in the help page (in the details) and in the tutorial.

Something like "Setting iters with fixed = TRUE has no effect. iters will be set to length(blocking.levels) in the outer level and length(blocking.levels) - 1 in the inner level".

Member

Ok.

Member Author

Thanks for discussing this, Lars! It's not a solution I am completely happy with either, and a problem I have thought a lot about.

I'll make the required changes soon, including the tutorial page. Finally getting this done.

p = resample(lrn, ct, rdesc)$pred

# check if all test.inds are unique
expect_length(unique(unlist(p$instance$test.inds, use.names = FALSE)), 150)
Member

Should also check whether the right observations are together.
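
For example, something along these lines (just a sketch; it assumes `ct` is the task carrying the blocking factor and `p` is the resample result from above):

```r
# every blocking level should end up entirely within a single test fold
for (fold in p$instance$test.inds) {
  levs = unique(ct$blocking[fold])
  expect_true(all(which(ct$blocking %in% levs) %in% fold))
}
```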

Member Author

Done.

@pat-s
Member Author

pat-s commented Sep 17, 2018

@larskotthoff
Sorry for the delay, I was on vacation and busy with some other stuff.

Hope you still know what's going on here :)

Once the technical part is approved, I'll update the tutorial section.

```{r}
str(getResamplingIndices(p, inner = TRUE))
```

Member

Could you please mention here that the number of inner folds is automatically adjusted based on the available levels?

Member Author

Yes, as mentioned, I will write a new section comparing blocking and grouping and explain what happens.

Merge branch 'factor-cv' of github.com:mlr-org/mlr into factor-cv

Merge branch 'master' into factor-cv

@pat-s
Member Author

pat-s commented Sep 27, 2018

@larskotthoff updated help page and tutorial - please take a look.

Remember that you can use the netlify preview of the docs (https://deploy-preview-2412--nervous-hopper-4136be.netlify.com/articles/tutorial/resample.html) once the pkgdown files have been deployed by Travis.

@larskotthoff
Member

Thanks, merging.

@larskotthoff larskotthoff merged commit 3958ef1 into master Sep 27, 2018
@larskotthoff larskotthoff deleted the factor-cv branch September 27, 2018 17:34
vrodriguezf pushed a commit to vrodriguezf/mlr that referenced this pull request Jan 16, 2021
* add 'grouping' option

* account for grouping and blocking

* add tests

* use fixed instead of grouping

* hardcode iters when fixed = TRUE

* update tests and rename

* fix doc of makeResampleDesc

* also apply args blocking.cv and fixed to RepCV to fix blocking tests

* update tests

* update resample::blocking

* update documentation of arg 'fixed'

* grouping -> fixed

* set defaults for fixed and blocking.cv in makeResampleDesc

* fixed -> grouping

* explain why are doing a hard levels reset

* simplify code

* hand over new default args of makeResampleDesc

* update tests

* fix test expectations (handled by seed?)

* fixed and blocking.cv are official params and not just items

* fix doc

* Deploy from Travis build 12597 [ci skip]

Build URL: https://travis-ci.org/mlr-org/mlr/builds/432449836
Commit: af8a40d

* update docs from master

* update docs

* Deploy from Travis build 12603 [ci skip]

Build URL: https://travis-ci.org/mlr-org/mlr/builds/432635288
Commit: 312763e

* Deploy from Travis build 12602 [ci skip]

Build URL: https://travis-ci.org/mlr-org/mlr/builds/432635052
Commit: 7dc7606

* update vignette

* update help page

* update NEWS

* Deploy from Travis build 12624 [ci skip]

Build URL: https://travis-ci.org/mlr-org/mlr/builds/434150919
Commit: 87e1195

* Deploy from Travis build 12622 [ci skip]

Build URL: https://travis-ci.org/mlr-org/mlr/builds/434149614
Commit: 9f6ab3e

* Deploy from Travis build 12621 [ci skip]

Build URL: https://travis-ci.org/mlr-org/mlr/builds/434148851
Commit: 50f51a4