
Reproduce the results in the paper for mTSP #2

Open
hnnam0906 opened this issue Nov 7, 2024 · 3 comments

hnnam0906 commented Nov 7, 2024

Hi @hyeonahkimm, @Leaveson,

I would like to reproduce the results for mTSP reported in Table 1 and Section B.2 (Equity-Transformer) of the paper.

As far as I understand, reproducing the results involves three steps:

  • Step 1: Train the model with the hyperparameters described in Table 5.
  • Step 2: Finetune the trained model with the finetuning parameters shown in Table 5.
  • Step 3: Run inference with an augmentation of 8.
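For context, I assume the "augmentation of 8" in step 3 refers to the standard 8-fold symmetry augmentation used for routing models on the unit square (axis swap plus reflections, as popularized by POMO); a sketch of what I understand it to do:

```python
import numpy as np

def augment_x8(coords):
    """Generate the 8 symmetric variants of unit-square coordinates
    (the usual x8 instance augmentation for routing models).

    coords: array of shape (..., 2) with values in [0, 1].
    Returns an array of shape (8, ..., 2).
    """
    x, y = coords[..., 0], coords[..., 1]
    variants = [
        (x, y), (y, x),
        (x, 1 - y), (y, 1 - x),
        (1 - x, y), (1 - y, x),
        (1 - x, 1 - y), (1 - y, 1 - x),
    ]
    return np.stack([np.stack(v, axis=-1) for v in variants], axis=0)
```

At inference time the model would be run on all eight variants of each instance and the best tour kept; please correct me if the repository does something different.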

Are the pretrained models in the repository (in the pretrained/mstp folder) the outputs of step 1, meaning I still need to run the next steps (finetuning and inference) to achieve the expected results?

How many epochs do I need to train the models? Are they 100, 4, and 0 for mtsp50, mtsp200, and mtsp500 respectively, as in the pretrained folder, or other numbers? Please advise.

Please kindly check whether my understanding is correct. Could you give more detail on the steps and point to the corresponding parts of the code, so that I can follow them clearly and obtain the results reported in the paper?

Thanks

@hnnam0906 (Author)

@hyeonahkimm, @Leaveson,

Can you kindly review the information above so that I can reproduce the results in the paper for mTSP?
Btw, how many epochs should I train and finetune for the 500-city mTSP problem?

Thanks

@Leaveson (Contributor)

Hello @hnnam0906

Thank you for your question.

Yes, the pretrained models in the Git repository (in the pretrained/mstp folder) are already the results after completing steps 1, 2, and 3, so you won’t need to run finetuning or inference separately to achieve the expected outcomes.

Regarding your overall understanding, you are correct about steps 1, 2, and 3. For the finetuning step, please note that finetuning on mtsp50 is not required. Instead, you can use the pretrained model obtained after running mtsp50 for 100 epochs. From this point, finetuning can be applied to mtsp200 and mtsp500 with 4 epochs and 0 epochs, respectively.
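In other words, the schedule is (restating the above as data; the dictionary below is illustrative, not code from the repository):

```python
# Training/finetuning schedule described in this thread (epochs per problem size).
# mtsp50 is trained from scratch; the larger sizes start from that base model.
schedule = {
    "mtsp50":  {"train_epochs": 100},                         # base model
    "mtsp200": {"base": "mtsp50", "finetune_epochs": 4},
    "mtsp500": {"base": "mtsp50", "finetune_epochs": 0},      # base model used as-is
}
```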

Thank you, and please feel free to reach out with any further questions!

Best,

hnnam0906 commented Nov 14, 2024

Hi @Leaveson, @hyeonahkimm,

I hope you have a nice day and thanks for your clarification.

If I would like to use geographical (great-circle) distance instead of Euclidean distance, the input is no longer in the range [0, 1], since latitude runs from -90 to 90 degrees and longitude from -180 to 180 degrees. Which parts would I need to modify (the inputs for training, evaluation, augmentation, etc.), and does anything in the algorithm itself need to change?

I realize that I cannot simply normalize latitude and longitude to [0, 1], since the distance is not a linear function of latitude and longitude.
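For reference, by geographical distance I mean the great-circle (haversine) distance, e.g.:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two (lat, lon) points given in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))
```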

Thanks
