- Tokyo Institute of Technology
- Japan
- 21:13 (UTC +09:00)
Popular repositories
- Megatron-DeepSpeed (Public): Ongoing research training transformer language models at scale, including: BERT & GPT-2. Python.
- Cutout (Public, forked from uoguelph-mlrg/Cutout): 2.56%, 15.20%, 1.30% on CIFAR10, CIFAR100, and SVHN. https://arxiv.org/abs/1708.04552. Python.
- RedPajama-Data (Public, forked from togethercomputer/RedPajama-Data): Contains code for preparing large datasets for training large language models. Python.

