
Pre-trained weights #66

Closed
LiGuo12 opened this issue Sep 23, 2024 · 3 comments

Comments

@LiGuo12

LiGuo12 commented Sep 23, 2024

Hi all,

Would it be possible to release your best weights? Our team would use them for research rather than commercial purposes, and we would cite your work appropriately in our paper.

@optmai
Collaborator

optmai commented Sep 24, 2024

What task are you referring to?

@LiGuo12
Author

LiGuo12 commented Sep 24, 2024

> What task are you referring to?

Hi, we are interested in the Deep AUC Maximization (DAM) model from the paper "Large-scale Robust Deep AUC Maximization: A New Surrogate Loss and Empirical Studies on Medical Image Classification", which achieves 0.93 AUC on CheXpert.

@optmai
Collaborator

optmai commented Sep 25, 2024

Thank you for your interest! That result is actually an ensemble of several models trained using deep AUC maximization. You can follow the tutorial here https://github.com/Optimization-AI/LibAUC/blob/main/examples/05_Optimizing_AUROC_Loss_with_DenseNet121_on_CheXpert.ipynb to train a model, then use the different losses/optimizers available in our library to train several models and combine them into an ensemble. The original models that achieved 0.93 AUC on CheXpert are no longer available.
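For readers following this advice: the ensembling step itself is typically just an average of each trained model's predicted probabilities. Below is a minimal sketch of that step; the function name and the three example prediction arrays are hypothetical, standing in for outputs of models trained with different LibAUC losses/optimizers.

```python
import numpy as np

def ensemble_predictions(prob_list):
    """Average per-model predicted probabilities into one ensemble score.

    prob_list: list of 1-D arrays, one per trained model, each holding
    predicted probabilities for the same samples in the same order.
    """
    probs = np.stack(prob_list, axis=0)  # shape: (n_models, n_samples)
    return probs.mean(axis=0)            # shape: (n_samples,)

# Hypothetical outputs from three models trained with different losses/optimizers
model_a = np.array([0.90, 0.20, 0.70])
model_b = np.array([0.80, 0.30, 0.60])
model_c = np.array([0.85, 0.10, 0.80])

ensemble = ensemble_predictions([model_a, model_b, model_c])
print(ensemble)  # [0.85 0.2  0.7 ]
```

The ensemble scores can then be fed to a standard AUC metric (e.g. scikit-learn's `roc_auc_score`) against the ground-truth labels.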

optmai closed this as completed Sep 25, 2024