Forked from vboyce/Maze

Code for auto-generating maze distractors and running maze in ibex. Forked to use huggingface models and support easier use of other languages.


MrLogarithm/maze-bert

 
 


Maze-BERT

This directory adapts the original Maze code to more easily support additional languages and modern neural network architectures. Concretely, we replace the original RNN language model with a Transformer from the huggingface hub; this makes it easy to load a model for another language (or a multilingual model) simply by changing the model name in the params file.

Installation

python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
cd maze_automate
./set_up.py --gulordava

For additional details, you can also consult the original installation and usage instructions at vboyce.github.io/Maze/install.html.

Usage

source venv/bin/activate
cd maze_automate
# English
./distract.py test_input.txt output_file.txt -p params_en_bert.txt
# Korean
./distract.py test_input.txt output_file.txt -p params_ko_bert.txt

In params_ko_bert.txt, the model_path parameter specifies which huggingface model is used to compute word probabilities. dictionary_class names a class in wordfreq_distractor.py that is used to look up word frequencies.
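As a rough illustration, a params file might contain entries like the following. The key names come from the description above; the model name `klue/bert-base` is just one possible choice of Korean BERT, and the exact file syntax may differ from what is shown here:

```
model_path = klue/bert-base          # any huggingface model name
dictionary_class = wordfreq_ko_dict  # class defined in wordfreq_distractor.py
```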

Note that you will probably need to adjust the min_delta and min_abs surprisal thresholds, as the average surprisal may differ across languages and models. In particular, BERT models tend to assign lower surprisal than the RNNs used by the original code, so the thresholds used by this fork may be much smaller than those in the original code.
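As a hypothetical sketch of how these two thresholds interact (the function name and exact acceptance rule below are assumptions, not the code's actual implementation): a candidate distractor must be surprising in absolute terms (min_abs) and sufficiently more surprising than the correct word (min_delta).

```python
def acceptable_distractor(distractor_surprisal: float,
                          correct_surprisal: float,
                          min_abs: float,
                          min_delta: float) -> bool:
    """Hypothetical sketch of surprisal-threshold gating for distractors.

    A candidate passes only if it exceeds the absolute threshold (min_abs)
    and is at least min_delta more surprising than the correct word.
    """
    return (distractor_surprisal >= min_abs and
            distractor_surprisal - correct_surprisal >= min_delta)

# With a BERT model, surprisals run lower than with the original RNN,
# so both thresholds typically need to be reduced for candidates to pass.
print(acceptable_distractor(12.0, 3.0, min_abs=8.0, min_delta=5.0))  # True
print(acceptable_distractor(6.0, 3.0, min_abs=8.0, min_delta=5.0))   # False
```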

Adapting to New Languages

To adapt this code to a new language:

  1. Edit the params file to change model_path to a model which supports the desired language;
  2. Add a class wordfreq_<language_name>_dict to wordfreq_distractor.py. This should be as simple as copying one of the existing classes, changing the two-character language tag (e.g. 'ko', 'en', 'fr'), and possibly changing the regex used to filter out-of-vocabulary items; and
  3. Change dictionary_class in the params file to point to the class you just added.
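Step 2 might look like the sketch below. The class name `wordfreq_ja_dict`, the `is_valid` method, and the class layout are illustrative assumptions (the real classes in wordfreq_distractor.py will differ), but the two parts to change are the same: the two-character language tag and the regex used to filter out-of-vocabulary items.

```python
import re

class wordfreq_ja_dict:
    """Hypothetical new dictionary class, here for Japanese."""

    lang = "ja"  # two-character language tag, as in the existing 'en'/'ko' classes

    # Regex used to filter out-of-vocabulary items. Here we accept only
    # strings made of kana/kanji characters (an assumption -- adjust the
    # character ranges to the target language's script).
    valid_word = re.compile(r"^[\u3040-\u30ff\u4e00-\u9fff]+$")

    def is_valid(self, word: str) -> bool:
        return bool(self.valid_word.match(word))

d = wordfreq_ja_dict()
print(d.is_valid("ねこ"))  # kana-only word: accepted
print(d.is_valid("cat"))   # Latin script: filtered out
```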
