Deep learning skills from scratch: a repository of papers studied to build foundational deep learning knowledge.
| idx | Task | Keyword | Title | Author | Link | Review Link | Code Review |
|---|---|---|---|---|---|---|---|
| 1 | NLP | Attention, Transformer | Attention Is All You Need | Google | Paper | Blog | Code |
| 2 | NLP | BERT | Pre-training of Deep Bidirectional Transformers for Language Understanding | Google AI Language | Paper | Blog | |
| 3 | LLM | RAG | Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks | Facebook AI Research | Paper | Blog | |
| 4 | LLM | Agent, LLM | Augmented Language Models | Facebook AI Research | Paper | Blog | |
| 5 | LLM | MoE, SMoE | Mixtral of Experts | Mistral AI | Paper | Blog | |
| 6 | LLM | LoRA (PEFT) | Low-Rank Adaptation of Large Language Models | Microsoft | Paper | Blog | |
| 7 | LLM | HyDE | Precise Zero-Shot Dense Retrieval without Relevance Labels | Carnegie Mellon Univ | Paper | Blog | |
| 8 | ASR | Contextual ASR | CTC-Assisted LLM-Based Contextual ASR | | Paper | Blog | |
| 9 | LLM | Lost in the Middle | How Language Models Use Long Contexts | Stanford Univ | Paper | Blog | |