Recently I came across some papers on molecular contrastive learning, and it was my great pleasure to find a paper written by your team, Molecular Contrastive Learning of Representations via Graph Neural Networks. This paper has benefited me a lot. However, when I use the pre-trained model you provided for downstream tasks with the default configuration file config_finetune.yaml, the model's performance never matches the results reported in the paper. Could you provide the hyperparameter configuration files required for the downstream tasks on each dataset?
Hi @Shimmer8001, I finetuned and found results similar to the paper, with only a minor decrease. I think some random seeds need to be set to replicate exactly the same results.
Besides, did you try to pre-train the model on a larger or different dataset to improve the finetuning results?
Hi @danielkaifeng, I can't reproduce the results shown in the paper either. I suspect the hyperparameters and random seed I set are not suitable. Could you share the hyperparameters and random seed you used? Thank you very much.
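For anyone else hitting this, a common first step is pinning every source of randomness before finetuning. This is only a generic PyTorch sketch (the helper name `set_seed` and the seed value are my own, not from the MolCLR codebase), but it covers the usual culprits behind run-to-run variance:

```python
import random

import numpy as np
import torch


def set_seed(seed: int = 42) -> None:
    """Fix the common sources of randomness for reproducible finetuning runs."""
    random.seed(seed)              # Python's built-in RNG
    np.random.seed(seed)           # NumPy RNG (e.g. dataset splits)
    torch.manual_seed(seed)        # PyTorch CPU RNG
    torch.cuda.manual_seed_all(seed)  # all CUDA devices (no-op without a GPU)
    # Trade some speed for deterministic cuDNN convolution kernels.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False


# Call once before building the dataloaders and the model.
set_seed(42)
```

Even with seeds pinned, some CUDA ops are nondeterministic, and scaffold splits plus hyperparameters (learning rate, batch size) usually matter more for matching the reported numbers.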