Stars
Code to reproduce "Transformers Can Do Arithmetic with the Right Embeddings", McLeish et al. (NeurIPS 2024)
Cramming the training of a (BERT-type) language model into limited compute.
Code for "Transformers Solve the Limited Receptive Field for Monocular Depth Prediction"
Monocular Depth Estimation Toolbox based on MMSegmentation.
[ECCV 2020] Self-Supervised Monocular Depth Estimation: Solving the Dynamic Object Problem by Semantic Guidance
Reproduction of the CVPR 2020 paper - Self-supervised monocular trained depth estimation using self-attention and discrete disparity volume
A curated list of papers, code and resources pertaining to image composition/compositing or object insertion/addition, which aim to generate realistic composite images.
A curated list of papers, code and resources pertaining to image harmonization.
The official PyTorch implementation of the paper "SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pre-Training"
Implementation of Perceiver, General Perception with Iterative Attention, in PyTorch
Control WS2812B and many more types of digital RGB LEDs with an ESP32 over WiFi!
A PyTorch implementation of the Transformer model in "Attention Is All You Need".
A PyTorch implementation of the Transformer model from "Attention Is All You Need".
PyTorch original implementation of Cross-lingual Language Model Pretraining.
A PyTorch Implementation of Single Shot MultiBox Detector
Kaggle | 9th place single model solution for TGS Salt Identification Challenge
PyTorch implementation of Adversarially Robust Distillation (ARD)
Code for the paper "Understanding Generalization through Visualizations"
PyTorch implementation of binary neural networks