
Removing equinox experimental #71

Merged
merged 7 commits into from
Oct 6, 2023

Conversation


@stepp1 stepp1 commented May 25, 2023

This PR updates Equinox BatchNorm, as it is no longer an experimental feature.

Fixes #70

@paganpasta paganpasta changed the base branch from main to dev May 25, 2023 22:39
@paganpasta
Owner

Thanks for the PR!
It seems that a few pre-commit checks are failing. Let me know if you'll be able to fix them and push.
Otherwise, I'll try to pick this up as soon as I can.

@stepp1
Author

stepp1 commented May 25, 2023

Hi there!
There were other ways of calling BatchNorm that I didn't catch at first, so I updated the PR. Now, when running the pre-commit hook, everything seems OK.

However, when running pytest I'm getting a bunch of errors (100% of tests fail!).

@paganpasta
Owner

OK, I'll take a look into why they might be failing.

@stepp1
Author

stepp1 commented May 29, 2023

Hi there! I was looking through the errors, and the culprit might be the changes in how stateful ops are now handled.
I'm not very familiar with Equinox, but I'll keep looking.

Edit: I think I narrowed it down: the new BatchNorm call breaks inside a ConvNormActivation, because the method now requires both x and state when called. Maybe @patrick-kidger can help here.

@patrick-kidger
Contributor

The new BatchNorm has a different API to the previous experimental version. (Incidentally the other normalisation layers -- LayerNorm, GroupNorm -- also support this new API if needed, so they are interchangeable.)

It should be called as output, state = norm_layer(input, state); see the stateful tutorial.

I think this should be straightforward enough -- simply thread the additional state through.
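The "thread the additional state through" pattern can be illustrated without Equinox itself. Below is a minimal plain-Python sketch: ToyBatchNorm and ToyConvNormActivation are hypothetical stand-ins (not Equinox classes) that only mimic the call convention output, state = layer(x, state) described above, including a container forwarding state to its stateful sub-layer.

```python
# Plain-Python sketch of the stateful-call pattern described above.
# ToyBatchNorm / ToyConvNormActivation are hypothetical stand-ins for
# eqx.nn.BatchNorm and a ConvNormActivation container; they only mimic
# the convention `output, state = layer(x, state)`.

class ToyBatchNorm:
    """Stand-in for a stateful norm layer: returns output plus new state."""

    def __init__(self, momentum=0.9):
        self.momentum = momentum

    def __call__(self, x, state):
        batch_mean = sum(x) / len(x)
        # Update the running statistic functionally: return a *new* state
        # rather than mutating the layer in place.
        new_mean = (self.momentum * state["running_mean"]
                    + (1 - self.momentum) * batch_mean)
        out = [v - batch_mean for v in x]
        return out, {"running_mean": new_mean}


class ToyConvNormActivation:
    """Container that threads the state through its stateful member."""

    def __init__(self):
        self.norm = ToyBatchNorm()

    def __call__(self, x, state):
        # Conv and activation are stateless here; only the norm layer
        # consumes and produces state, which the container passes along.
        x, state = self.norm(x, state)
        return x, state


layer = ToyConvNormActivation()
state = {"running_mean": 0.0}
out, state = layer([1.0, 2.0, 3.0], state)
```

The fix for the failing tests follows the same shape: every call site that previously did x = norm(x) must accept and return the state pair instead.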

@hlzl hlzl mentioned this pull request Sep 20, 2023
@paganpasta paganpasta merged commit 94b091b into paganpasta:dev Oct 6, 2023
0 of 2 checks passed
Successfully merging this pull request may close these issues.

BatchNorm is no longer experimental