
kaggle-contradictory-my-dear-watson

Kaggle competition: https://www.kaggle.com/c/contradictory-my-dear-watson

This repository contains notebooks that:

  1. fine-tune a BERT model,
  2. fine-tune a RoBERTa model,
  3. add a global average pooling layer over the output hidden states of an xlm-roberta-large model and fine-tune it (a sketch follows this list).
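A minimal sketch of that pooling head, assuming a TensorFlow/Keras setup with the Hugging Face transformers library; `NUM_CLASSES`, `MAX_LEN`, and the learning rate are illustrative assumptions, not values taken from the notebooks:

```python
import tensorflow as tf
from transformers import TFXLMRobertaModel

NUM_CLASSES = 3  # entailment / neutral / contradiction
MAX_LEN = 128    # assumed sequence length, not from the repo


def build_model():
    # Pretrained encoder; last_hidden_state has shape
    # (batch, seq_len, hidden_size).
    encoder = TFXLMRobertaModel.from_pretrained("xlm-roberta-large")

    input_ids = tf.keras.Input(shape=(MAX_LEN,), dtype=tf.int32, name="input_ids")
    attention_mask = tf.keras.Input(shape=(MAX_LEN,), dtype=tf.int32, name="attention_mask")

    hidden_states = encoder(input_ids, attention_mask=attention_mask).last_hidden_state

    # Global average pooling over the sequence dimension, used here
    # instead of the usual [CLS]-token classification head.
    pooled = tf.keras.layers.GlobalAveragePooling1D()(hidden_states)
    logits = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(pooled)

    model = tf.keras.Model(inputs=[input_ids, attention_mask], outputs=logits)
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model
```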

The datasets used for fine-tuning are glue/mnli and xnli.
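For reference, a sketch of how those two datasets could be loaded with the Hugging Face datasets library; the notebooks themselves may fetch and preprocess them differently:

```python
from datasets import load_dataset

# Hypothetical loading step for the two corpora named above.
mnli = load_dataset("glue", "mnli")
xnli = load_dataset("xnli", "all_languages")

# Each example pairs a premise with a hypothesis and a label
# (0 = entailment, 1 = neutral, 2 = contradiction), matching the
# competition's label scheme.
print(mnli["train"][0])
```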
