Fine-Grained Visual Classification via Adaptive Attention Quantization Transformer

Framework

[Framework overview figure: p2_v4_00]

Prerequisites

The following packages are required to run the scripts:

  • Python >= 3.6
  • PyTorch == 1.8.1
  • Torchvision == 0.9.1
  • NVIDIA Apex (needed for mixed-precision training, i.e. the --fp16 flag below)
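
Apex ships no prebuilt CUDA wheels on PyPI; a common way to get it is to build from source, following the NVIDIA Apex README (this sketch assumes a matching CUDA toolchain and is not a step taken verbatim from this repository):

git clone https://github.com/NVIDIA/apex
cd apex
# Python-only build; apex.amp falls back to non-fused kernels if the CUDA extensions are absent
pip3 install -v --disable-pip-version-check --no-cache-dir ./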

Download Google pre-trained ViT models

wget https://storage.googleapis.com/vit_models/imagenet21k/{MODEL_NAME}.npz
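
For example, to fetch the ViT-B_16 backbone (the imagenet21k bucket also publishes ViT-B_32, ViT-L_16, ViT-L_32, and ViT-H_14 checkpoints):

wget https://storage.googleapis.com/vit_models/imagenet21k/ViT-B_16.npz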

Dataset

You can download the four benchmarks used below (CUB-200-2011, Stanford Dogs, NABirds, and iNat2017) from their official distribution sites.

Install required packages

Install the required Python packages with the following command:

pip3 install -r requirements.txt
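
As an optional sanity check that the pinned versions installed and a GPU is visible (plain PyTorch calls, nothing repository-specific):

python3 -c "import torch, torchvision; print(torch.__version__, torchvision.__version__, torch.cuda.is_available())"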

Train

CUB-200-2011

CUDA_VISIBLE_DEVICES=0,1,2,3 python3 -m torch.distributed.launch --nproc_per_node=4 --master_port 12345 train.py \
    --dataset CUB --img_size 400 --train_batch_size 4 --eval_batch_size 4 \
    --learning_rate 0.02 --num_steps 40000 --fp16 --low_memory \
    --eval_every 200 --name sample_run --aplly_BE
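
If you have fewer GPUs, the launcher scales down; a sketch of the same run on a single GPU (assuming the script's flags behave identically, which we have not verified):

CUDA_VISIBLE_DEVICES=0 python3 -m torch.distributed.launch --nproc_per_node=1 --master_port 12345 train.py \
    --dataset CUB --img_size 400 --train_batch_size 4 --eval_batch_size 4 \
    --learning_rate 0.02 --num_steps 40000 --fp16 --low_memory \
    --eval_every 200 --name sample_run --aplly_BE

Note the effective global batch size drops from 4 × 4 = 16 to 4, so the learning rate may need retuning.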

Stanford Dogs

CUDA_VISIBLE_DEVICES=0,1,2,3 python3 -m torch.distributed.launch --nproc_per_node=4 --master_port 12345 train.py \
    --dataset dogs --img_size 400 --train_batch_size 4 --eval_batch_size 4 \
    --learning_rate 0.003 --num_steps 10000 --fp16 --low_memory \
    --eval_every 200 --name sample_run --aplly_BE

NABirds

CUDA_VISIBLE_DEVICES=0,1,2,3 python3 -m torch.distributed.launch --nproc_per_node=4 --master_port 12345 train.py \
    --dataset nabirds --img_size 400 --train_batch_size 4 --eval_batch_size 4 \
    --learning_rate 0.02 --num_steps 60000 --fp16 --low_memory \
    --eval_every 200 --name sample_run --aplly_BE

INat2017

CUDA_VISIBLE_DEVICES=0,1,2,3 python3 -m torch.distributed.launch --nproc_per_node=4 --master_port 12345 train.py \
    --dataset INat2017 --img_size 304 --train_batch_size 16 --eval_batch_size 16 \
    --learning_rate 0.01 --num_steps 271500 --fp16 --low_memory \
    --eval_every 9050 --name sample_run
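
Note that the effective global batch size is --nproc_per_node × --train_batch_size: 4 × 16 = 64 for the INat2017 run above, versus 4 × 4 = 16 for the other datasets.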

Acknowledgments

We sincerely thank the authors of the following repositories for generously sharing their code, which was instrumental in the development of this project.
