
The eval code for the calculation of ap is wrong #10

Open
happyConan opened this issue Nov 12, 2020 · 3 comments

@happyConan

In eval.py, pred_boxes, pred_classes, and pred_scores are sorted by score before AP is calculated. In the eval_ap_2d function, zip(gt_single_cls, bbox_single_cls, scores_single_cls) is used, but gt_single_cls, bbox_single_cls, and scores_single_cls have different lengths, so zip truncates to the shortest list and the low-score results are silently ignored. Is that reasonable? For example, with gt_single_cls=[[],[3],[],[3]], bbox_single_cls=[[],[3],[3],[3],[3]], and scores_single_cls=[[],[0.9],[0.8],[0.7],[0.6]], the 0.6 box is ignored, even though the 0.9, 0.8, and 0.7 boxes may all match the same ground truth. Consequently, this produces a lower AP.
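To make the truncation concrete, here is a minimal sketch using the example lists from the report (the data and the final length check are illustrative assumptions, not the repository's actual code):

```python
# Minimal sketch of the silent truncation described above, using the
# example lists from the report (hypothetical data, not repo code).
gt_single_cls = [[], [3], [], [3]]                    # 4 entries
bbox_single_cls = [[], [3], [3], [3], [3]]            # 5 entries
scores_single_cls = [[], [0.9], [0.8], [0.7], [0.6]]  # sorted descending

# zip() stops at the shortest input, so the score-0.6 prediction is
# silently dropped before AP is computed; if the 0.9/0.8/0.7 boxes all
# match the same ground truth, the dropped box may be the true positive.
for gt, box, score in zip(gt_single_cls, bbox_single_cls, scores_single_cls):
    print(gt, box, score)  # the ([3], [0.6]) pair never appears

# A defensive check (an assumption, not the repository's fix): these
# lists are expected to be index-aligned, so a length mismatch should
# fail loudly rather than being truncated away. It raises on this data.
assert len(gt_single_cls) == len(bbox_single_cls) == len(scores_single_cls), \
    "gt/box/score lists are misaligned"
```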

@VectXmy
Owner

VectXmy commented Nov 12, 2020

The eval code is from the internet. There does seem to be something wrong with it. Thanks!

@happyConan
Author

> The eval code is from the internet. There does seem to be something wrong with it. Thanks!

Thanks for your reply.

@howellma35

Is the IoU 3D?
