Getting-started with deel-lip #78
base: master
Conversation
Thanks @Sharing-Sam-Work for your contribution!
Very good introductory notebooks! I believe they make things clearer and smoother for inexperienced users. These tutorials showcase the main tools of deel-lip: the custom layers and the specific losses.
I suggested some improvements. Feel free to take them into account or not; we can discuss them later on.
We show two cases. In the first case, we use `deel-lip`'s `TauCategoricalCrossentropy` from the `losses` submodule. In the second case, we use another loss function from `deel-lip`: `MulticlassHKR`.

In particular, we will show how these functions can be parametrized to increase the robustness of our predictive models. We will also see that, generally, there is a compromise between the robustness and the accuracy of our models (i.e. better robustness generally comes at the price of a decrease in performance).
Before talking about the two cases, I think we can emphasize the importance of the loss when training 1-Lipschitz networks, especially the fact that there is a trade-off between accuracy and robustness and that all our deel-lip losses provide hyper-parameters to tweak this trade-off. The user should understand here that training Lipschitz-constrained networks requires using our losses for better control of the trade-off.
Done.
Changed the content:
🎮 Control over the accuracy-robustness trade-off with `deel-lip`'s loss functions
When training 1-Lipschitz networks, one will see that there is a compromise between the robustness and the accuracy of the models. In simple terms, achieving stronger robustness often involves sacrificing some performance.
In this section, we will show the pivotal role of `deel-lip`'s loss functions in training 1-Lipschitz networks. Each of these functions comes with its own set of hyperparameters, enabling you to precisely navigate and adjust the balance between accuracy and robustness.
We show two cases. In the first case, we use `deel-lip`'s `TauCategoricalCrossentropy` from the `losses` submodule. In the second case, we use another loss function from `deel-lip`: `MulticlassHKR`.
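To make the role of the temperature hyperparameter concrete, here is a minimal pure-Python sketch of the temperature-scaled cross-entropy idea behind `TauCategoricalCrossentropy`. This is an illustration only, not the library's implementation (the actual deel-lip losses are TensorFlow-based); the function name `tau_cce` and the example logits are made up for this sketch.

```python
import math

def tau_cce(logits, true_idx, tau=1.0):
    # Sketch of a temperature-scaled categorical cross-entropy:
    #   L = -(1/tau) * log softmax(tau * z)[y]
    # Larger tau sharpens the softmax, so a correctly classified
    # example with a margin incurs a smaller loss.
    scaled = [tau * z for z in logits]
    m = max(scaled)  # stabilize the log-sum-exp computation
    log_sum_exp = m + math.log(sum(math.exp(s - m) for s in scaled))
    return (log_sum_exp - scaled[true_idx]) / tau

# A correctly classified example: higher tau yields a smaller loss.
# Tuning tau is one way such a hyperparameter shifts the
# accuracy/robustness balance during training.
print(tau_cce([2.0, 0.5, -1.0], 0, tau=1.0))   # ~0.241
print(tau_cce([2.0, 0.5, -1.0], 0, tau=10.0))  # much smaller
```

`MulticlassHKR` follows the same spirit with different knobs: its hyperparameters (a hinge-margin term weighted against a Kantorovich-Rubinstein term) likewise let the user tune the trade-off.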
Thank you @Sharing-Sam-Work for this great work 👍
My suggestion is to merge this PR in a temporary branch here in this repo so we can take care of this without requiring @Sharing-Sam-Work to do it.
…class' k parameter in Getting-Started 1 and changed the hyper-parameter of the HKR loss function for Getting-Started 2)
71e714b to 8aaafbd
Getting started
I have taken into account the comments.
….md and index.md of the Google Colab link. Added title to the getting started notebooks in the README.md file.
I have created two new tutorial notebooks specifically tailored for professionals in industrial sectors who want to swiftly grasp the practical applications of the package. These tutorials offer a concise introduction to utilizing the package for producing and training robust 1-Lipschitz deep learning models.
The focus here is on practical implementation rather than delving deeply into theoretical aspects. We provide practical suggestions to enhance user-friendliness and usability, making these tutorials ideal for those aiming for a practical working knowledge of the library and its functionalities, rather than an exhaustive theoretical exploration.
I have tested the changes with `tox -e py310-lint`.
I have also visualized the changes with `mkdocs serve`.
If the changes are validated, the Google Colab links will need to be changed in docs/index.md so that they point to the deel-lip repository instead of mine (Sharing-Sam-Work), as in the row below:
| Getting started 1 - Creating a 1-Lipschitz neural network | |