
Unbalanced-dataset-Classification

Classification on unbalanced datasets using boosting techniques (AdaBoost.M2, SMOTEBoost, RUSBoost, and others).
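
As a rough, hypothetical sketch (not the repository's actual code), the compared classifiers could be assembled with scikit-learn and imbalanced-learn along the following lines. Note the assumptions: scikit-learn's `AdaBoostClassifier` implements SAMME rather than AdaBoost.M2, imbalanced-learn ships no SMOTEBoost, so a SMOTE-then-AdaBoost pipeline stands in as an approximation (true SMOTEBoost re-samples inside every boosting round), and RandomBalanceBoost is omitted because no off-the-shelf implementation is assumed here.

```python
# Hypothetical setup of the compared classifiers (not the repository's code).
from imblearn.ensemble import RUSBoostClassifier
from imblearn.pipeline import Pipeline
from imblearn.over_sampling import SMOTE
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.svm import SVC

classifiers = {
    # scikit-learn's AdaBoost is SAMME-based, not AdaBoost.M2.
    "AdaBoost": AdaBoostClassifier(n_estimators=50, random_state=0),
    # RUSBoost: random under-sampling of the majority class in each boosting round.
    "RUSBoost": RUSBoostClassifier(n_estimators=50, random_state=0),
    # Approximation of SMOTEBoost: oversample once with SMOTE, then boost.
    "SMOTEBoost (approx.)": Pipeline([
        ("smote", SMOTE(random_state=0)),
        ("ada", AdaBoostClassifier(n_estimators=50, random_state=0)),
    ]),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "SVM": SVC(kernel="rbf", class_weight="balanced", random_state=0),
}
```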

Below are the detailed results:


Average classifier precision:

  • AdaBoost: 0.77
  • RUSBoost: 0.82
  • SMOTEBoost: 0.66
  • RandomBalanceBoost: 0.60
  • Random Forest: 0.95
  • SVM: 1.00


  • Best performing method based on average classifier precision: SVM

  • Best performing ensemble classifier: Random Forest; runner-up (second best): RUSBoost


Taha Samavati - Analysis of final results
