div_ only supports scalar multiplication #5

Open
rawatenator opened this issue Apr 25, 2018 · 1 comment

@rawatenator

I'm going to look into this in detail now, but I got the following error while running the ImageNet ipynb:


RuntimeError                              Traceback (most recent call last)
<ipython-input-17-c4f537643c1f> in <module>()
      8 prob_outputs_dog = Variable(torch.zeros(1,1000)) ; prob_outputs_dog.data[:,dog_id] += 1
      9 
---> 10 prob_inputs_cat = eb.excitation_backprop(model, inputs, prob_outputs_cat, contrastive=False)
     11 prob_inputs_dog = eb.excitation_backprop(model, inputs, prob_outputs_dog, contrastive=False)

/home/rrawat/coderepo/excitationbp/excitationbp/utils.py in excitation_backprop(model, inputs, prob_outputs, contrastive, target_layer)
     45     if not contrastive:
     46         outputs = model(inputs)
---> 47         return torch.autograd.grad(top_h_, target_h_, grad_outputs=prob_outputs)[0]
     48 
     49     pos_evidence = torch.autograd.grad(top_h_, contr_h_, grad_outputs=prob_outputs.clone())[0]

/home/rrawat/anaconda2/lib/python2.7/site-packages/torch/autograd/__init__.pyc in grad(outputs, inputs, grad_outputs, retain_graph, create_graph, only_inputs)
    151     return Variable._execution_engine.run_backward(
    152         outputs, grad_outputs, retain_graph,
--> 153         inputs, only_inputs)
    154 
    155 if not torch._C._autograd_init():

/home/rrawat/anaconda2/lib/python2.7/site-packages/torch/autograd/function.pyc in apply(self, *args)
     89 
     90     def apply(self, *args):
---> 91         return self._forward_cls.backward(self, *args)
     92 
     93 

/home/rrawat/coderepo/excitationbp/excitationbp/functions/eb_linear.pyc in backward(ctx, grad_output)
     28 
     29         input.data = input.data - input.data.min() if input.data.min() < 0 else input.data
---> 30         grad_output /= input.mm(weight.t()).abs() + 1e-10 # normalize
     31         ### stop EB-SPECIFIC CODE  ###
     32 

/home/rrawat/anaconda2/lib/python2.7/site-packages/torch/autograd/variable.pyc in __idiv__(self, other)
    847 
    848     def __idiv__(self, other):
--> 849         return self.div_(other)
    850 
    851     def __pow__(self, other):

/home/rrawat/anaconda2/lib/python2.7/site-packages/torch/autograd/variable.pyc in div_(self, other)
    357         if not isinstance(other, Variable) and not torch.is_tensor(other):
    358             return DivConstant.apply(self, other, True)
--> 359         raise RuntimeError("div_ only supports scalar multiplication")
    360 
    361     def pow(self, other):

RuntimeError: div_ only supports scalar multiplication
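
At a first glance, the failing line in eb_linear.py is an in-place division of grad_output (a Variable) by another Variable, which my PyTorch build apparently refuses. Just a guess for now, but here is a minimal sketch of the pattern plus an out-of-place rewrite that avoids div_ entirely (the tensors below are only illustrative stand-ins, not the repo's actual code):

```python
import torch
from torch.autograd import Variable

# Illustrative shapes only; these stand in for grad_output and for the
# input.mm(weight.t()).abs() + 1e-10 normalizer from eb_linear.py.
grad_output = Variable(torch.rand(1, 1000))
norm = Variable(torch.rand(1, 1000)) + 1e-10

# grad_output /= norm               # calls div_ and raises on my build
grad_output = grad_output / norm    # out-of-place division, no div_ involved
```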

@greydanus
Owner

Interesting. I just tried this with PyTorch 0.3 and the latest commit of excitationbp, and I wasn't able to reproduce your error.

I did push a small two-line update to this repo about an hour ago, but I'm not sure how that could have caused the error you are getting.
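
If you get a chance, could you check which PyTorch version you have installed? This may just be a version mismatch, since the in-place division in eb_linear.py works for me on 0.3, and older builds appear to reject in-place division of a Variable by another Variable:

```python
import torch
print(torch.__version__)  # I tested against 0.3
```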
