Commit: Update KLD docs
fchollet committed Apr 15, 2024
1 parent 61bbff5 commit 6503b6d
Showing 2 changed files with 12 additions and 0 deletions.
8 changes: 8 additions & 0 deletions keras/losses/losses.py
@@ -342,6 +342,10 @@ class KLDivergence(LossFunctionWrapper):
loss = y_true * log(y_true / y_pred)
```
`y_true` and `y_pred` are expected to be probability
distributions, with values between 0 and 1. They will get
clipped to the `[0, 1]` range.
Args:
reduction: Type of reduction to apply to the loss. In almost all cases
this should be `"sum_over_batch_size"`.
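The added docstring lines describe both the `[0, 1]` clipping and the pointwise formula. As an illustration (not part of the commit, and independent of the actual Keras implementation), a minimal pure-Python sketch of that computation for one pair of distributions might look like this; the `eps` floor is an assumption added here so `log` stays finite:

```python
import math

def kl_divergence(y_true, y_pred, eps=1e-7):
    """KL divergence of two discrete distributions, given as equal-length
    lists of probabilities: sum of y_true * log(y_true / y_pred)."""
    def _clip(v):
        # Clip into [eps, 1] so the log and the ratio stay finite,
        # mirroring the documented clipping to the [0, 1] range.
        return min(max(v, eps), 1.0)

    return sum(_clip(t) * math.log(_clip(t) / _clip(p))
               for t, p in zip(y_true, y_pred))

# Identical distributions give exactly zero divergence.
kl_divergence([0.5, 0.5], [0.5, 0.5])  # → 0.0
```

Clipping `y_true` as well as `y_pred` means a true probability of exactly 0 contributes a tiny negative term rather than `0 * log(0)`, which is one common way to keep the sum well defined.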
@@ -1443,6 +1447,10 @@ def kl_divergence(y_true, y_pred):
loss = y_true * log(y_true / y_pred)
```
`y_true` and `y_pred` are expected to be probability
distributions, with values between 0 and 1. They will get
clipped to the `[0, 1]` range.
Args:
y_true: Tensor of true targets.
y_pred: Tensor of predicted targets.
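The functional form above is documented per tensor of targets; with the default `"sum_over_batch_size"` reduction mentioned earlier, the per-sample values are averaged over the batch. A hedged pure-Python sketch of that two-step computation (hypothetical helper, not the Keras code; the `eps` clip floor is an assumption):

```python
import math

def kl_divergence_batch(y_true, y_pred, eps=1e-7):
    """Per-sample KL divergence, then the mean over the batch
    (the 'sum_over_batch_size' reduction)."""
    def _clip(v):
        # Keep probabilities in [eps, 1] so log() is finite.
        return min(max(v, eps), 1.0)

    per_sample = [
        sum(_clip(t) * math.log(_clip(t) / _clip(p)) for t, p in zip(ts, ps))
        for ts, ps in zip(y_true, y_pred)
    ]
    return sum(per_sample) / len(per_sample)

# A batch of two identical distribution pairs averages to zero loss.
kl_divergence_batch([[0.5, 0.5], [0.5, 0.5]],
                    [[0.5, 0.5], [0.5, 0.5]])  # → 0.0
```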
4 changes: 4 additions & 0 deletions keras/metrics/probabilistic_metrics.py
@@ -18,6 +18,10 @@ class KLDivergence(reduction_metrics.MeanMetricWrapper):
metric = y_true * log(y_true / y_pred)
```
`y_true` and `y_pred` are expected to be probability
distributions, with values between 0 and 1. They will get
clipped to the `[0, 1]` range.
Args:
name: (Optional) string name of the metric instance.
dtype: (Optional) data type of the metric result.
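Unlike the loss, the metric class wraps the same formula in a `MeanMetricWrapper`, i.e. it accumulates a running mean across `update_state` calls. A rough stand-in (hypothetical class, not the Keras implementation; the `eps` clip floor is an assumption) showing that stateful behavior:

```python
import math

class KLDivergenceMetric:
    """Streaming mean of per-sample KL divergence, sketching what a
    MeanMetricWrapper around the formula accumulates."""

    def __init__(self, eps=1e-7):
        self.eps = eps
        self.total = 0.0   # running sum of per-sample KL values
        self.count = 0     # number of samples seen so far

    def _clip(self, v):
        # Keep probabilities in [eps, 1] so log() is finite.
        return min(max(v, self.eps), 1.0)

    def update_state(self, y_true, y_pred):
        # y_true / y_pred: batches of distributions (lists of lists).
        for ts, ps in zip(y_true, y_pred):
            self.total += sum(self._clip(t) * math.log(self._clip(t) / self._clip(p))
                              for t, p in zip(ts, ps))
            self.count += 1

    def result(self):
        return self.total / self.count if self.count else 0.0

# Usage: update with a batch, then read the running mean.
m = KLDivergenceMetric()
m.update_state([[0.5, 0.5]], [[0.5, 0.5]])
m.result()  # → 0.0
```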
