AutoML-GP5-LightAutoML-OOFs-and-Test-preds
@kaggle.alexryzhkov_automl_gp5_lightautoml_oofs_and_test_preds
TLDR: OOF and test predictions from 24 base regression and classification models (GBDT + NN), plus their blend.
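As a rough illustration of how OOF predictions from several base models can be blended (the weights and the per-model predictions below are hypothetical, not the actual ones from this dataset):

```python
import numpy as np

# Hypothetical sketch: each column holds one base model's OOF predictions,
# and the blend is a weighted average. In practice the weights would be
# tuned to maximize the OOF CV score.
oof_preds = np.array([
    [1.0, 1.2, 0.9],   # three models' predictions for sample 0
    [2.0, 1.8, 2.2],   # three models' predictions for sample 1
])
weights = np.array([0.5, 0.3, 0.2])  # assumed blend weights, summing to 1

blend = oof_preds @ weights          # weighted average per sample
```

The same weights learned on the OOF predictions are then applied to the test predictions.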
We trained every model (CatBoost and LGBM for regression; DenseLight and FT-Transformer for both regression and classification) with both the original target and a clipped target (all price values above 500k clipped, inside the training folds only), on both the original dataset and a version augmented with the kagglex data (added to the train folds only). Thanks to @lashfire for the augmentation data.
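The key detail above is that both the target clipping and the kagglex augmentation touch only the training folds, never the validation folds, so the OOF scores stay honest. A minimal sketch of that CV loop, using synthetic data and placeholder model fitting (names and shapes here are assumptions, not the authors' exact pipeline):

```python
import numpy as np
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = rng.uniform(10_000, 1_000_000, size=100)   # synthetic "price" target
X_aug = rng.normal(size=(20, 5))               # stand-in for the kagglex rows
y_aug = rng.uniform(10_000, 500_000, size=20)

CLIP = 500_000  # prices above 500k are clipped in the training fold only

for train_idx, valid_idx in KFold(n_splits=10, shuffle=True,
                                  random_state=42).split(X):
    # Augmentation rows are appended to the training fold only.
    X_tr = np.vstack([X[train_idx], X_aug])
    # Clipping is applied to the original training targets only;
    # the augmentation targets are already below the threshold here.
    y_tr = np.concatenate([np.clip(y[train_idx], None, CLIP), y_aug])
    # The validation fold is left completely untouched.
    X_va, y_va = X[valid_idx], y[valid_idx]
    # ... fit CatBoost / LGBM / DenseLight / FT-Transformer on (X_tr, y_tr)
    # and predict on X_va to fill the OOF prediction array ...
```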
Our final ensemble, evaluated with 10-fold CV, is a blend of these base models.