ner_concept.md

File metadata and controls

37 lines (31 loc) · 1.19 KB

Potential Research Concepts for the NER Task

Architecture

  • arch-rnn: Recurrent Neural Networks (LSTMs, GRU)
  • arch-cnn: Convolutional Neural Networks
  • arch-transformer: Transformer
  • arch-gnn: Graph Neural Networks (GNN)
  • arch-att: Attention Mechanism
  • arch-crf: Conditional Random Field
  • arch-semicrf: Semi-Markov Conditional Random Field (semi-CRF)
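
The arch-crf tag above refers to a linear-chain CRF layered on top of an encoder, decoded with the Viterbi algorithm. As a minimal illustration (not tied to any particular paper in this taxonomy), the decoding step can be sketched in pure Python, with emission scores per token and a transition-score table as assumed inputs:

```python
from typing import Dict, List, Tuple

def viterbi_decode(
    emissions: List[Dict[str, float]],          # per-token tag scores from an encoder
    transitions: Dict[Tuple[str, str], float],  # (prev_tag, tag) -> transition score
) -> List[str]:
    """Return the highest-scoring tag sequence under a linear-chain CRF."""
    tags = list(emissions[0].keys())
    # score[t] = best score of any path ending in tag t at the current token
    score = {t: emissions[0][t] for t in tags}
    back: List[Dict[str, str]] = []  # back-pointers for path recovery
    for emit in emissions[1:]:
        new_score, pointers = {}, {}
        for t in tags:
            prev = max(tags, key=lambda p: score[p] + transitions.get((p, t), 0.0))
            new_score[t] = score[prev] + transitions.get((prev, t), 0.0) + emit[t]
            pointers[t] = prev
        score = new_score
        back.append(pointers)
    # follow back-pointers from the best final tag
    best = max(tags, key=score.get)
    path = [best]
    for pointers in reversed(back):
        path.append(pointers[path[-1]])
    return list(reversed(path))
```

A transition score like `("O", "I-PER"): -10.0` is how the CRF discourages invalid tag sequences that a per-token classifier would happily emit; the semi-CRF variant (arch-semicrf) instead scores whole segments rather than single-token transitions.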

Training

  • train-multitask: Multi-task learning
  • train-multimodal: Multi-modal learning
  • train-auxiliary: Joint training with auxiliary tasks
  • train-transfer: Cross-domain learning, transfer learning, or domain adaptation
  • train-multiling: Bilingual or multilingual learning
  • train-active: Active learning, Bootstrapping
  • train-adver: Adversarial learning

Pre-trained Models

  • pre-train: Pre-trained, contextualized representations (e.g., BERT)

Task Setting

  • task-nested: Nested or Multi-grained NER
  • task-hetero: Heterogeneous categories
  • task-lowres: Low-resource or Zero-resource learning
  • task-weaksup: Weakly Supervised NER
  • task-disc: Discontinuous NER
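
The task-nested and task-disc settings above exist because the standard flat BIO tag scheme can encode only contiguous, non-overlapping mentions. A small sketch (assumptions: BIO tags with `B-`/`I-` prefixes, spans as end-exclusive index pairs) makes the limitation concrete:

```python
from typing import List, Tuple

def bio_to_spans(tags: List[str]) -> List[Tuple[int, int, str]]:
    """Convert a flat BIO tag sequence to (start, end, type) spans, end exclusive.

    One tag per token means flat BIO cannot represent nested mentions
    (a span inside another span) or discontinuous mentions, which is why
    those settings need span-based models or extended tag schemes."""
    spans: List[Tuple[int, int, str]] = []
    start, etype = None, None
    for i, tag in enumerate(tags + ["O"]):  # sentinel "O" flushes the final span
        # close the open span on B-, O, or a type change mid-span
        if tag.startswith("B-") or tag == "O" or (etype is not None and tag[2:] != etype):
            if start is not None:
                spans.append((start, i, etype))
                start, etype = None, None
        if tag.startswith("B-"):
            start, etype = i, tag[2:]
        elif tag.startswith("I-") and start is None:
            start, etype = i, tag[2:]  # tolerate I- without B- (common in noisy data)
    return spans
```

For "Barack Obama visited Hawaii", tags `["B-PER", "I-PER", "O", "B-LOC"]` decode to the two mentions `(0, 2, "PER")` and `(3, 4, "LOC")`; a nested mention such as "Obama" inside "Barack Obama" has no flat BIO encoding at all.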

Dataset

  • dataset: Constructing a new dataset

Interpretable Analysis

  • interpret-genz: Generalization
  • interpret-model: System-wise Analysis
  • interpret-error: Annotation Error Analysis