
SayamAlt/DialogSum-Summarization-using-Seq2Seq-Attention

About

Successfully developed a dialogue summarization model using a Seq2Seq architecture with Attention on the DialogSum dataset to generate concise and coherent summaries of multi-turn conversations.
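Below is a minimal sketch of what a Seq2Seq summarizer with additive (Bahdanau-style) attention can look like. It assumes a PyTorch implementation with GRU layers; all class names, layer sizes, and other hyperparameters are illustrative assumptions and are not taken from this repository's code.

```python
# Illustrative sketch only: a Seq2Seq model with additive attention for summarization.
# Module names, GRU layers, and hyperparameters are assumptions, not the repo's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hid_dim, hid_dim)  # project bidirectional state for the decoder

    def forward(self, src):
        embedded = self.embedding(src)              # (B, S, E)
        outputs, hidden = self.rnn(embedded)        # outputs: (B, S, 2H)
        # Combine final forward and backward hidden states into one decoder init state.
        hidden = torch.tanh(self.fc(torch.cat((hidden[-2], hidden[-1]), dim=1)))
        return outputs, hidden.unsqueeze(0)         # (B, S, 2H), (1, B, H)


class Attention(nn.Module):
    def __init__(self, hid_dim=256):
        super().__init__()
        self.attn = nn.Linear(3 * hid_dim, hid_dim)
        self.v = nn.Linear(hid_dim, 1, bias=False)

    def forward(self, dec_hidden, enc_outputs):
        # dec_hidden: (B, H); enc_outputs: (B, S, 2H)
        src_len = enc_outputs.size(1)
        dec_hidden = dec_hidden.unsqueeze(1).repeat(1, src_len, 1)  # (B, S, H)
        energy = torch.tanh(self.attn(torch.cat((dec_hidden, enc_outputs), dim=2)))
        scores = self.v(energy).squeeze(2)                          # (B, S)
        return F.softmax(scores, dim=1)                             # attention weights over source tokens


class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.attention = Attention(hid_dim)
        self.rnn = nn.GRU(emb_dim + 2 * hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, token, hidden, enc_outputs):
        # token: (B, 1) current target token; hidden: (1, B, H)
        embedded = self.embedding(token)                            # (B, 1, E)
        weights = self.attention(hidden[-1], enc_outputs)           # (B, S)
        context = torch.bmm(weights.unsqueeze(1), enc_outputs)      # (B, 1, 2H)
        output, hidden = self.rnn(torch.cat((embedded, context), dim=2), hidden)
        return self.out(output.squeeze(1)), hidden                  # logits (B, V)
```

In a setup like this, training typically runs the decoder one step at a time over the reference summary with teacher forcing, while inference uses greedy or beam search decoding to generate the summary tokens.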
