
TranSPormer: a transformer for the Travelling Salesman Problem

This repository presents a proof-of-concept transformer neural network that addresses the Travelling Salesman Problem without any auto-regressive component.

Architecture overview


We rely on a cross-attention layer in which the positional encodings act as the queries, while the keys and values come from the graph node embeddings.
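
To make the mechanism concrete, below is a minimal PyTorch sketch of such a cross-attention layer. The module name, dimensions, and the use of learned positional encodings are illustrative assumptions, not the repository's actual implementation.

```python
import torch
import torch.nn as nn

class PositionalCrossAttention(nn.Module):
    """Sketch: positional encodings attend over graph node embeddings.

    Queries come from the positional encodings (one per tour position),
    while keys and values come from the node embeddings, so row i of the
    attention matrix can be read as "which node occupies position i".
    """

    def __init__(self, d_model: int = 128, n_heads: int = 8, n_nodes: int = 50):
        super().__init__()
        # One encoding per tour position (assumed learned for this sketch).
        self.pos_enc = nn.Parameter(torch.randn(n_nodes, d_model))
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, node_emb: torch.Tensor):
        # node_emb: (batch, n_nodes, d_model) graph node embeddings.
        batch = node_emb.size(0)
        queries = self.pos_enc.unsqueeze(0).expand(batch, -1, -1)
        # attn has shape (batch, n_positions, n_nodes).
        out, attn = self.cross_attn(queries, node_emb, node_emb)
        return out, attn


# Usage with random embeddings for a batch of 4 graphs of 50 nodes:
node_emb = torch.randn(4, 50, 128)
layer = PositionalCrossAttention()
out, attn = layer(node_emb)
print(out.shape, attn.shape)  # torch.Size([4, 50, 128]) torch.Size([4, 50, 50])
```

Under this reading, each row of the attention matrix is a soft assignment of a tour position to a node, which is what the before/after-training visualizations below show.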

Proposed attention matrix

An instance of an attention matrix from our model, before and after training.


Some examples of tours generated by our model that are shorter than those produced by the Christofides algorithm from NetworkX.
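
For reference, the Christofides baseline can be computed directly with NetworkX. The sketch below (random coordinates and Euclidean edge weights, not the repository's evaluation code) shows how a baseline tour length might be obtained.

```python
import itertools
import networkx as nx
import numpy as np
from networkx.algorithms.approximation import christofides

# Build a complete graph over random 2D points with Euclidean edge weights.
rng = np.random.default_rng(0)
coords = rng.random((50, 2))
G = nx.complete_graph(len(coords))
for i, j in itertools.combinations(range(len(coords)), 2):
    G[i][j]["weight"] = float(np.linalg.norm(coords[i] - coords[j]))

# Christofides returns a closed tour as a list of nodes (start node repeated at the end).
tour = christofides(G, weight="weight")
length = sum(G[u][v]["weight"] for u, v in zip(tour, tour[1:]))
print(f"Christofides tour length: {length:.3f}")
```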


Create a dataset of 10k random graphs with 50 nodes each:

> python create_dataset.py --path ./my_dataset --n 10000 --n_nodes 50
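
The internals of create_dataset.py are not reproduced here; a minimal sketch of a comparable generator, assuming each graph is just a set of node coordinates sampled uniformly in the unit square and saved with NumPy, might look like this.

```python
import argparse
import os
import numpy as np

def main():
    # Hypothetical re-implementation for illustration; the repository's script may differ.
    parser = argparse.ArgumentParser()
    parser.add_argument("--path", type=str, required=True)   # output directory
    parser.add_argument("--n", type=int, default=10000)      # number of graphs
    parser.add_argument("--n_nodes", type=int, default=50)   # nodes per graph
    args = parser.parse_args()

    os.makedirs(args.path, exist_ok=True)
    # Sample node coordinates uniformly in the unit square: shape (n, n_nodes, 2).
    coords = np.random.rand(args.n, args.n_nodes, 2).astype(np.float32)
    np.save(os.path.join(args.path, "coords.npy"), coords)

if __name__ == "__main__":
    main()
```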
