
Level 6 features #21

Open
efyphil opened this issue Jun 24, 2019 · 3 comments
Comments

@efyphil

efyphil commented Jun 24, 2019

I tried to extract features and got very interesting results.
Screenshot from 2019-06-24 15-03-25
As you can see, the level 6 features were constant. I used different pictures and checked all 196 feature maps, and I always got the same result. I also checked different weights from different implementations of PWC-Net. Can you please explain this result?
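A quick way to verify this observation numerically is to stack the level-6 feature maps extracted from several different input images and check which channels show zero variation across images. The sketch below uses synthetic tensors (the `feats` array and the `tol` threshold are illustrative, not taken from the actual network):

```python
import numpy as np

def constant_channels(feats, tol=1e-6):
    """Given feature maps stacked over different input images
    (shape: [num_images, channels, H, W]), return indices of channels
    whose activations do not vary across images."""
    # std over the image axis, then the max spread over spatial positions
    per_channel_spread = feats.std(axis=0).max(axis=(1, 2))
    return np.where(per_channel_spread < tol)[0]

# Synthetic example: 196 channels, channel 0 forced to be input-independent
rng = np.random.default_rng(0)
feats = rng.standard_normal((5, 196, 6, 8))
feats[:, 0] = 0.5  # same activation regardless of the "input image"

print(constant_channels(feats))  # only the constant channel is flagged
```

If the real level-6 features behave as described, this check would flag most (or all) of the 196 channels.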

@lemon-PiC

Did you find the answer? I have the same question about the 6th feature pyramid level.

@efyphil

efyphil commented Dec 18, 2019

> Did you find the answer? I have the same question about the 6th feature pyramid level.

No. Maybe you can try training the network yourself without the last layer in the encoder.

@Animadversio

Doing some gradient analysis, I found that the gradient vanishing at the last correlation layer (or encoding layer) is quite strong...
My intuition is that the correlation output usually has a much smaller scale than the feature tensor, so the activation could diminish, and so does the gradient to the last decoder and encoder layers.
(Note that there is no direct connection from the flow map to the 6th-level feature tensor; all of their gradient comes through the correlation layer, which seems to act like a correlation bottleneck.)
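The scale argument can be illustrated with a minimal numpy sketch. In PWC-Net-style cost volumes, the correlation at each displacement is the channel-wise mean of the product of two feature tensors; for roughly independent, unit-scale features, that mean shrinks like 1/sqrt(C). The tensors below are synthetic stand-ins, not actual network activations:

```python
import numpy as np

rng = np.random.default_rng(0)
C, H, W = 196, 6, 8                 # level 6: 196 channels, small spatial size
f1 = rng.standard_normal((C, H, W))
f2 = rng.standard_normal((C, H, W))

# Zero-displacement correlation: channel-wise mean of the product
corr = (f1 * f2).mean(axis=0)

print(f1.std())    # feature scale: ~1
print(corr.std())  # correlation scale: ~1/sqrt(196), roughly 14x smaller
```

Since gradients to the level-6 features flow back through this small-magnitude correlation output, this is consistent with the vanishing-gradient observation above.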
