Attention

This repo explores the relationship between attention in transformer models and the syntactic structure of human language.
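To give a concrete sense of the kind of comparison involved, here is a minimal, self-contained sketch (not the repository's actual pipeline): it extracts self-attention weights from a pretrained BERT model and a dependency parse from spaCy, then checks whether each word's most-attended token coincides with its syntactic head. The model name, the last-layer head average, and the "most-attended token" heuristic are illustrative choices, not assumptions about this repo's code.

```python
# Illustrative sketch: compare transformer attention against a spaCy
# dependency parse. The model and aggregation choices are arbitrary.
import spacy
import torch
from transformers import AutoModel, AutoTokenizer

nlp = spacy.load("en_core_web_sm")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

doc = nlp("The quick brown fox jumps over the lazy dog")
words = [t.text for t in doc]

# Tokenize the pre-split words so subwords can be mapped back to spaCy tokens.
enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
with torch.no_grad():
    out = model(**enc)

# out.attentions: one tensor per layer, shape (batch, heads, seq, seq).
# Average the heads of the last layer for a single (seq, seq) matrix.
attn = out.attentions[-1][0].mean(dim=0)
word_ids = enc.word_ids()  # maps each subword position to its word index

for token in doc:
    # Subword positions belonging to this spaCy token (special tokens map to None).
    positions = [i for i, w in enumerate(word_ids) if w == token.i]
    if not positions:
        continue
    # Which subword does this word attend to most strongly?
    top = attn[positions].mean(dim=0).argmax().item()
    attends_to_head = word_ids[top] == token.head.i
    print(f"{token.text:>8} -> head: {token.head.text:<8} matches attention: {attends_to_head}")
```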

Setup

  1. Clone this repo
  2. Install the requirements: pip install -r requirements.txt (you may want to do this in a virtual environment)
  3. Install the required spaCy model: python -m spacy download en_core_web_sm
  4. Run the tests: pytest
  5. Run the experiments: python -m attention
