
Official implementation of "Diff-Palm: Realistic Palmprint Generation with Polynomial Creases and Intra-Class Variation Controllable Diffusion Models" [CVPR2025]


This repository is the official PyTorch implementation of the CVPR paper "Diff-Palm: Realistic Palmprint Generation with Polynomial Creases and Intra-Class Variation Controllable Diffusion Models".

Abstract

Palmprint recognition is significantly limited by the lack of large-scale publicly available datasets. Previous methods have adopted Bézier curves to simulate the palm creases, which then serve as input for conditional GANs to generate realistic palmprints. However, without employing real data fine-tuning, the performance of the recognition model trained on these synthetic datasets would drastically decline, indicating a large gap between generated and real palmprints. This is primarily due to the utilization of an inaccurate palm crease representation and challenges in balancing intra-class variation with identity consistency. To address this, we introduce a polynomial-based palm crease representation that provides a new palm crease generation mechanism more closely aligned with the real distribution. We also propose the palm creases conditioned diffusion model with a novel intra-class variation control method. By applying our proposed $K$-step noise-sharing sampling, we are able to synthesize palmprint datasets with large intra-class variation and high identity consistency. Experimental results show that, for the first time, recognition models trained solely on our synthetic datasets, without any fine-tuning, outperform those trained on real datasets. Furthermore, our approach achieves superior recognition performance as the number of generated identities increases.
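The $K$-step noise-sharing idea from the abstract can be illustrated with a toy sketch. This is a hypothetical illustration in pure Python, using a scalar placeholder update instead of a real diffusion denoiser; it does not mirror the repository's actual sampling code:

```python
import random

def sample_with_noise_sharing(n_samples, T, K, seed=0):
    """Toy sketch of K-step noise-sharing sampling: all samples of one
    identity share the same noise for the first K of T reverse steps,
    then use independent noise for the remaining T - K steps."""
    shared_rng = random.Random(seed)
    # Noise shared by every sample of this identity (first K steps).
    shared_noise = [shared_rng.gauss(0.0, 1.0) for _ in range(K)]
    samples = []
    for i in range(n_samples):
        rng = random.Random(seed + 1 + i)  # independent per-sample noise
        noises = shared_noise + [rng.gauss(0.0, 1.0) for _ in range(T - K)]
        x = 0.0  # scalar stand-in for the latent image
        for eps in noises:
            x = 0.9 * x + 0.1 * eps  # placeholder "denoising" update
        samples.append(x)
    return samples
```

In this toy, a larger $K$ pulls the samples of one identity closer together (higher identity consistency), while a smaller $K$ leaves more room for intra-class variation; $K = T$ makes all samples identical.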

Getting Started

Clone this repository

Clone our repo to your local machine using the following command:

```
git clone https://github.com/Ukuer/Diff-Palm.git
cd Diff-Palm
```

Prerequisites

Training

The palm creases conditioned diffusion model is located in DiffModels. To train this model, follow these steps:

  • prepare the palmprint ROI images.
  • extract the palm crease images using PCEM.
  • place the paired palmprint and palm crease images in the palm and label sub-directories respectively, as follows:
    DATASETS/
    ├── palm/
    │   ├── 1.png
    │   ├── 2.png
    │   └── ...
    └── label/
        ├── 1.png
        ├── 2.png
        └── ...
    
  • modify run.sh to meet your requirements.
  • run bash run.sh to start training. Training does not terminate on its own; kill the process manually when you are done.

  • Additionally, we have found that training on plain palmprint images easily results in synthesized images with a color shift. A short discussion of this problem is in the issues. To avoid it, we apply scale.py to scale each palmprint image.

Inference

Inference is two-stage: first synthesize polynomial crease images, then generate palmprints conditioned on these creases.

Polynomial Creases

This code is in PolyCreases.

  • run syn_polypalm_mp.py to synthesize polynomial crease images.
  • Note that each synthesized image is regarded as an identity.
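To give an intuition for what a polynomial crease image looks like, here is a minimal stand-alone sketch that rasterizes a few random polynomial curves onto a black canvas. The degrees, coefficient ranges, and scaling below are illustrative assumptions, not the parameters used by syn_polypalm_mp.py:

```python
import random

def render_polynomial_creases(width=128, height=128, degree=3,
                              n_creases=3, seed=0):
    """Illustrative sketch: draw n_creases random polynomial curves as
    white (255) pixels on a black (0) image, returned as a 2D list."""
    rng = random.Random(seed)
    img = [[0] * width for _ in range(height)]
    for _ in range(n_creases):
        # Random polynomial coefficients; x is normalized to [0, 1].
        coeffs = [rng.uniform(-1.0, 1.0) for _ in range(degree + 1)]
        offset = rng.uniform(0.3, 0.7)  # vertical placement of the crease
        for px in range(width):
            x = px / (width - 1)
            y = offset + 0.2 * sum(c * x ** k for k, c in enumerate(coeffs))
            py = int(y * (height - 1))
            if 0 <= py < height:
                img[py][px] = 255
    return img
```

Each call with a different seed yields a different crease pattern, which matches the note above that each synthesized image is treated as its own identity.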

Diffusion Models

This code is in DiffModels.

  • download the pretrained weights from Google Drive or Baidu Drive.
  • place the pretrained file in ./checkpoint/diffusion-netpalm-scale-128.
  • modify sample.sh to meet your requirements.
  • run bash sample.sh to synthesize realistic palmprint datasets.

Datasets

TODO

Evaluation

TODO

Acknowledgement

Our implementation is based on the following works: PCE-Palm, guided-diffusion, ElasticFace.

Citation

@misc{jin2025diffpalm,
      title={Diff-Palm: Realistic Palmprint Generation with Polynomial Creases and Intra-Class Variation Controllable Diffusion Models}, 
      author={Jianlong Jin and Chenglong Zhao and Ruixin Zhang and Sheng Shang and Jianqing Xu and Jingyun Zhang and ShaoMing Wang and Yang Zhao and Shouhong Ding and Wei Jia and Yunsheng Wu},
      year={2025},
      eprint={2503.18312},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2503.18312}, 
}
