Hello, I set the hyper-parameters in your code to 0.1 for size_penalty, 0.05 for merge_percent, and 16 for batchSize. But on DukeMTMC-VideoReID, the highest rank-1 I get is 59.3% and the highest mAP is 47.9%. I can't reproduce the performance reported in your paper. Are my hyper-parameters set wrong?
Hi~ I met the same problem. However, I can get about 63% mAP on DukeMTMC-VideoReID, which is higher than yours but still lower than the result in the paper. Besides, the mAP on MARS is only 34.7% in my experiment, which is quite poor (43.5% in the paper). Did you try the code on MARS? What performance did you get? By the way, my hyper-parameter settings are the same as yours.
@ChildQuqu Hello,
I don't understand this function:
def linkage_calculation(self, dist, labels, penalty):
    ......
    linkages = linkages.T + linkages - linkages * np.eye(cluster_num)
    intra = linkages.diagonal()
    penalized_linkages = linkages + penalty * ((intra * np.ones_like(linkages)).T + intra).T
I can't work out the details of these lines; my best guess at what they do is sketched below.
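For what it's worth, here is my current reading of those three lines as a minimal, runnable sketch. The toy linkages matrix, cluster_num, and penalty values below are made up by me just to trace the arithmetic; in the real function they are built from dist and labels, so this is only my guess at the intent, not the authors' explanation:

import numpy as np

# Toy stand-ins (made up): in linkage_calculation these come from dist/labels.
cluster_num = 3
penalty = 0.1
# Upper-triangular pairwise linkage values between 3 clusters.
linkages = np.array([[0.2, 0.5, 0.7],
                     [0.0, 0.1, 0.4],
                     [0.0, 0.0, 0.3]])

# Symmetrize: mirror the upper triangle into the lower triangle,
# subtracting linkages * eye so the diagonal is not counted twice.
linkages = linkages.T + linkages - linkages * np.eye(cluster_num)

# The diagonal holds each cluster's linkage with itself (the "intra" term).
intra = linkages.diagonal()

# (intra * ones).T + intra builds a matrix whose (i, j) entry is intra[i] + intra[j];
# the outer .T changes nothing because that matrix is already symmetric.
# So every pairwise linkage gets penalty * (intra[i] + intra[j]) added to it.
penalized_linkages = linkages + penalty * ((intra * np.ones_like(linkages)).T + intra).T

print(penalized_linkages)

If I am reading the penalty term wrong, please correct me.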
I am a graduate student in China. Would it be convenient for you to share your email address or WeChat? I'd like to ask you about the code details. It's fine if that's inconvenient.
Best wishes!