This repo explores the relationship between attention in transformer models and the syntactic structure of human language.
- Clone this repo
- Install the requirements (you may want to do this in a virtual environment):

  ```
  pip install -r requirements.txt
  ```

- Install the required spaCy model:

  ```
  python -m spacy download en_core_web_sm
  ```

- Run the tests:

  ```
  pytest
  ```

- Run the experiments:

  ```
  python -m attention
  ```
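
For context, the kind of comparison the experiments make can be sketched roughly as follows. This is a minimal illustration, not this repo's actual code: it assumes the Hugging Face `transformers` package and `bert-base-uncased` (neither is confirmed by this README), alongside the spaCy model installed above.

```python
# Minimal sketch (hypothetical, not this repo's code): compare where one
# transformer attention head points against spaCy's dependency parse.
import spacy
import torch
from transformers import AutoModel, AutoTokenizer

text = "the dog chased the cat"  # simple words: one wordpiece per word

nlp = spacy.load("en_core_web_sm")
doc = nlp(text)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)
with torch.no_grad():
    outputs = model(**tokenizer(text, return_tensors="pt"))

# outputs.attentions: one (batch, heads, seq, seq) tensor per layer.
attn = outputs.attentions[7][0, 5]  # layer 8, head 6 -- an arbitrary choice

# The +1 offset skips [CLS]; this alignment only holds while words and
# wordpieces line up one-to-one, as they do for this sentence.
for tok in doc:
    most_attended = attn[tok.i + 1, 1:-1].argmax().item()
    print(f"{tok.text:>8} -> attends most to {doc[most_attended].text:>8}; "
          f"dependency head is {tok.head.text}")
```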