
Test accuracy problem #6

Open
wangyuxin87 opened this issue Mar 19, 2019 · 9 comments

Comments
@wangyuxin87

Hello author, I'd like to ask whether this TensorFlow version can reach the accuracy reported in the original paper. Is it slightly lower or slightly higher?

@wangyuxin87
Author

Looking forward to your reply!

@basaltzhang

basaltzhang commented Mar 27, 2019

I used the released ckpt in this git and tested on the ICDAR Incidental 2015 test set. The result does not seem good enough.
Recall/Precision/Hmean: 0.666/0.035/0.067
Using the prob score, at the best hmean, p/r/f1-score: 0.453/0.406/0.428
I assume the author didn't train on this training set, so I'll train on it next and report the results.
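For readers unfamiliar with the metric: Hmean here is the harmonic mean (F1) of precision and recall, which is how the ICDAR evaluation combines the two. A quick sanity check of the numbers above (the function name is illustrative, not from the repo):

```python
def hmean(precision, recall):
    """Harmonic mean (F1) of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Numbers from the comment above:
print(round(hmean(0.035, 0.666), 3))  # released ckpt result → 0.067
print(round(hmean(0.453, 0.406), 3))  # best hmean via prob score → 0.428
```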

@Shun14
Owner

Shun14 commented Mar 27, 2019

> I used the released ckpt in this git and tested on the ICDAR Incidental 2015 test set. The result does not seem good enough.
> Recall/Precision/Hmean: 0.666/0.035/0.067
> Using the prob score, at the best hmean, p/r/f1-score: 0.453/0.406/0.428
> I assume the author didn't train on this training set, so I'll train on it next and report the results.

The released model was trained on the SynthText dataset. You can train your own model based on it.

@Shun14
Owner

Shun14 commented Mar 27, 2019

> Looking forward to your reply!

Probably a bit lower, because I didn't add some of the tricks mentioned in the paper, e.g. the rules for cropping small objects during data augmentation.

@wangyuxin87
Author

OK, thanks!

@basaltzhang

The author says N in the loss function is the number of default boxes that match ground-truth boxes, but in this code it is the batch size instead.
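A minimal NumPy sketch of the difference being pointed out, assuming an SSD-style per-box loss; the array shapes and names here are illustrative, not the repo's actual code:

```python
import numpy as np

def normalized_losses(per_box_loss, match_mask):
    """per_box_loss: (batch, num_default_boxes) loss per default box.
    match_mask: same shape, 1.0 where a default box matches a ground truth."""
    total = float(np.sum(per_box_loss * match_mask))
    n_matched = float(np.sum(match_mask))          # N as defined in the paper
    batch_size = per_box_loss.shape[0]
    loss_by_matches = total / max(n_matched, 1.0)  # paper's normalization
    loss_by_batch = total / batch_size             # what the code reportedly does
    return loss_by_matches, loss_by_batch

losses = np.ones((2, 4))                      # 2 images, 4 default boxes each
mask = np.array([[1, 0, 1, 0],
                 [1, 0, 0, 0]], dtype=float)  # 3 matched boxes in total
print(normalized_losses(losses, mask))        # → (1.0, 1.5)
```

Since the number of matched boxes varies per batch and is usually much larger than the batch size, the two normalizations produce very different loss magnitudes, which may also relate to the large loss values reported later in this thread.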

@basaltzhang

Following the author's training parameters in the table, I got the following result:
p/r/f1-score: 0.773/0.697/0.733

@qingfengyy

qingfengyy commented Jun 11, 2019

> Following the author's training parameters in the table, I got the following result:
> p/r/f1-score: 0.773/0.697/0.733

Could you tell me how you got this result? Did you modify the loss function or change something else? The loss value does not decrease when training on ICDAR 2015.

@jercas

jercas commented Jul 15, 2019

> Following the author's training parameters in the table, I got the following result:
> p/r/f1-score: 0.773/0.697/0.733

> Could you tell me how you got this result? Did you modify the loss function or change something else? The loss value does not decrease when training on ICDAR 2015.

I have the same problem when I use the released model pre-trained on SynthText: the loss looks ridiculous and oscillates between 70 and 200 (sometimes even higher). Have you guys solved this problem?
