Skip softmax BF16 test for ROCm (#21162)
### Description

Skip the softmax BF16 test on ROCm, because BFloat16 is not supported by
MIOpen, and `torch.cuda.is_available()` also returns `True` on ROCm, so it
cannot be used to detect a real CUDA backend. Checking `torch.version.cuda`
instead distinguishes CUDA builds from ROCm builds.
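For context, a minimal sketch of the distinction this fix relies on, assuming a recent PyTorch build. The helper name `is_cuda_backend` is hypothetical and not part of this commit; `torch.version.cuda` is a version string on CUDA builds and `None` on ROCm builds, where `torch.version.hip` is set instead.

```python
import torch

def is_cuda_backend() -> bool:
    # torch.cuda.is_available() returns True on both CUDA and ROCm builds,
    # so it cannot tell the two backends apart by itself.
    # torch.version.cuda is None on ROCm builds, which makes it a reliable
    # check for an actual CUDA backend.
    return torch.cuda.is_available() and torch.version.cuda is not None

if __name__ == "__main__":
    print("CUDA backend:", is_cuda_backend())
    print("torch.version.cuda:", torch.version.cuda)
    print("torch.version.hip:", getattr(torch.version, "hip", None))
```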
mindest committed Jun 26, 2024
1 parent 41ad83f commit e2abba1
Showing 1 changed file with 3 additions and 3 deletions.
@@ -148,8 +148,8 @@ def test_onnx_ops(self):

     @unittest.skipIf(not torch.cuda.is_bf16_supported(), "Test requires CUDA and BF16 support")
     def test_softmax_bf16_large(self):
-        if not torch.cuda.is_available():
-            # only test bf16 on cuda
+        if torch.version.cuda is None:
+            # Only run this test when CUDA is available, as on ROCm BF16 is not supported by MIOpen.
             return

         class Model(torch.nn.Module):
@@ -175,7 +175,7 @@ def forward(self, input):
         data_ort.requires_grad = True
         ort_res = ort_model(input=data_ort)
         ort_res.backward(gradient=init_grad)
-        # compara result
+        # compare result
         torch.testing.assert_close(data_torch.grad, data_ort.grad, rtol=1e-5, atol=1e-4)

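An equivalent way to express the same skip, shown here only for illustration, is a `unittest.skipIf` decorator like the `is_bf16_supported` one already on the test. The class and reason string below are hypothetical; the commit itself keeps the check as an early return inside the test body.

```python
import unittest

import torch

class SoftmaxBF16Test(unittest.TestCase):
    # Hypothetical decorator form of the same check; the actual commit uses an
    # early return inside the test body instead.
    @unittest.skipIf(
        torch.version.cuda is None,
        "Requires a CUDA build; MIOpen on ROCm does not support BFloat16",
    )
    def test_softmax_bf16_large(self):
        self.assertTrue(torch.cuda.is_bf16_supported())
```

One practical difference: a decorator's condition is evaluated when the class is defined, while the in-body return is evaluated at test run time and reports the test as passed rather than skipped.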

