Currently FLAML does several things for imbalanced data. I'm supportive of adding a warning about imbalance.
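A minimal sketch of what such a check could look like: detect the imbalance, warn the user before training, and pass 'balanced' sample weights to FLAML's fit. The 0.2 threshold, the synthetic dataset, and the assumption that `sample_weight` passed as a fit keyword argument is forwarded to the underlying estimators are all illustrative, not FLAML's actual behavior.

```python
import warnings

import numpy as np
from flaml import AutoML
from sklearn.datasets import make_classification
from sklearn.utils.class_weight import compute_sample_weight

# Synthetic imbalanced dataset (roughly 95% / 5% class split) for illustration.
X_train, y_train = make_classification(
    n_samples=2000, n_classes=2, weights=[0.95, 0.05], random_state=0
)

def warn_if_imbalanced(y, ratio_threshold=0.2):
    """Warn when the rarest class is much smaller than the largest one.

    The 0.2 threshold is an illustrative choice, not FLAML's actual logic.
    """
    _, counts = np.unique(y, return_counts=True)
    ratio = counts.min() / counts.max()
    if ratio < ratio_threshold:
        warnings.warn(
            f"Class imbalance detected (minority/majority ratio = {ratio:.2f}); "
            "consider class weights, sample weights, or resampling."
        )
    return ratio

warn_if_imbalanced(y_train)

# 'balanced' sample weights give every class the same total weight.
weights = compute_sample_weight(class_weight="balanced", y=y_train)

automl = AutoML()
# Assumption: sample_weight passed here is forwarded to each estimator's fit.
automl.fit(
    X_train,
    y_train,
    task="classification",
    time_budget=30,
    sample_weight=weights,
)
```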
How does FLAML handle imbalanced data (an unequal distribution of target classes in a classification task)? RandomForest has a `class_weight` argument, xgboost has `sample_weight`, and LGBM has `class_weight` as well. Should we scan for class imbalance first and, if it's detected, set those arguments to 'balanced'? It would be better to throw a warning and let the user know about class imbalance before training. It would also be nice if imbalanced-learn were applied automatically to preprocess the dataset when imbalance is detected. Thoughts?
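For reference, a hedged sketch of the options mentioned above: `class_weight='balanced'` for RandomForest and LightGBM, per-row `sample_weight` for xgboost, and an imbalanced-learn oversampling step. The synthetic dataset is a placeholder for illustration only.

```python
from imblearn.over_sampling import RandomOverSampler
from lightgbm import LGBMClassifier
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.utils.class_weight import compute_sample_weight
from xgboost import XGBClassifier

# Synthetic imbalanced dataset (roughly 90% / 10% class split) for illustration.
X_train, y_train = make_classification(
    n_samples=2000, n_classes=2, weights=[0.9, 0.1], random_state=0
)

# RandomForest and LightGBM expose class_weight directly.
rf = RandomForestClassifier(class_weight="balanced").fit(X_train, y_train)
lgbm = LGBMClassifier(class_weight="balanced").fit(X_train, y_train)

# xgboost's sklearn wrapper takes per-row weights at fit time instead.
xgb = XGBClassifier()
xgb.fit(X_train, y_train, sample_weight=compute_sample_weight("balanced", y_train))

# Alternative: rebalance the training set up front with imbalanced-learn,
# then train on the resampled data.
X_res, y_res = RandomOverSampler(random_state=0).fit_resample(X_train, y_train)
rf_resampled = RandomForestClassifier().fit(X_res, y_res)
```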