pone.0245157.s005.docx (12.53 kB)

Hyperparameters of XGBoost models.

journal contribution
posted on 2021-01-19, 18:27, authored by William P. T. M. van Doorn, Patricia M. Stassen, Hella F. Borggreve, Maaike J. Schalkwijk, Judith Stoffers, Otto Bekers, Steven J. R. Meex

Hyperparameters were chosen based on theoretical reasoning rather than hyperparameter tuning; this was done to prevent overfitting on hyperparameters given the small sample size. The “base_score”, “missing”, “reg_alpha”, “reg_lambda”, and “subsample” parameters were left at the default values provided by the XGBoost interface. The “max_depth”, “max_delta_step”, and “n_estimators” values are those we use internally for this kind of machine learning model. During the study, hyperparameters were never adjusted to improve performance on our validation dataset. An illustrative configuration is sketched below the caption.

(DOCX)
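For illustration, the sketch below shows how such a fixed configuration might look using the XGBoost scikit-learn interface (XGBClassifier). This is a minimal sketch under stated assumptions, not the authors' code: the numeric values for max_depth, max_delta_step, and n_estimators are placeholders for illustration; the actual values are those listed in the supplementary file.

from xgboost import XGBClassifier

# Fixed hyperparameters, set once up front and never tuned against the
# validation dataset (values for the last three parameters are illustrative).
model = XGBClassifier(
    # Left at the defaults provided by the XGBoost interface:
    base_score=0.5,         # initial prediction score (default)
    missing=float("nan"),   # NaN entries are treated as missing (default)
    reg_alpha=0.0,          # L1 regularization term (default)
    reg_lambda=1.0,         # L2 regularization term (default)
    subsample=1.0,          # use all rows for each tree (default)
    # Chosen by theoretical reasoning rather than tuning (illustrative values):
    max_depth=3,
    max_delta_step=1,
    n_estimators=100,
)

# Fit once; hyperparameters are not adjusted afterwards:
# model.fit(X_train, y_train)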
