
[Fix] TF Backend Regularization fix. #1781

Merged (7 commits) on Jun 20, 2024
Conversation

@agniv-the-marker (Contributor) commented on Jun 20, 2024:

TF Regularization

Per the docs, the argument to tf.keras.regularizers.l1/l2 is named l1/l2, not l as in the current version. Without this fix, the following breaks:

net = dde.maps.PFNN(
    network,
    "swish",
    "Glorot normal",
    regularization=["l2", 1e-8]
)

Specifically it gives you the error:

TypeError: L1.__init__() got an unexpected keyword argument 'l'

As of right now, the following works as a workaround instead:

net = dde.maps.PFNN(
    network,
    "swish",
    "Glorot normal",
    regularization=["l1+l2", 1e-8, 1e-8]
)

Merging this PR would allow each regularizer to be used on its own.
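The fix can be sketched as follows. Note that `regularizer_kwargs` is a hypothetical helper, not DeepXDE's actual code; it only illustrates mapping a DeepXDE-style `regularization` spec to the keyword arguments that `tf.keras.regularizers.L1`, `L2`, and `L1L2` expect (`l1=`/`l2=` rather than a generic `l=`):

```python
def regularizer_kwargs(spec):
    """Map a spec like ["l2", 1e-8] to a (class name, kwargs) pair
    for tf.keras.regularizers. Sketch only, not the library code."""
    name = spec[0].lower()  # accept "L2" or "l2" with one check
    factors = spec[1:]
    if name == "l1":
        return "L1", {"l1": factors[0]}
    if name == "l2":
        return "L2", {"l2": factors[0]}
    if name in ("l1l2", "l1+l2"):
        return "L1L2", {"l1": factors[0], "l2": factors[1]}
    raise ValueError(f"Unknown regularization name: {spec[0]}")
```

With this mapping, `["l2", 1e-8]` would construct `tf.keras.regularizers.L2(l2=1e-8)` instead of passing the unexpected keyword `l` that triggers the `TypeError` above.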

@agniv-the-marker agniv-the-marker changed the title Update regularizers.py Small updates for TF backend. Jun 20, 2024
@lululxvi (Owner) commented:
Split the regularization and model save into two PRs.

Capitalized the names and added support for passing the regularizer name in either lower- or uppercase.
Using .lower() instead of adding multiple checks, for readability.
Will split into a new commit
Remove comments
@agniv-the-marker agniv-the-marker changed the title Small updates for TF backend. [Fix] TF Backend Regularization fix. Jun 20, 2024
Linter said so
@lululxvi lululxvi merged commit 737c2f8 into lululxvi:master Jun 20, 2024
11 checks passed
g-w1 pushed a commit to g-w1/deepxde that referenced this pull request Jun 26, 2024