PAST: Pairwise Attention Swin Transformer for offline signature verification

Published in International Journal on Document Analysis and Recognition, 2025

Signature verification has shown tremendous potential as a reliable biometric in both academic research and industrial applications. With the advent of deep learning, signature verification has made remarkable progress over the past decade. Despite this progress, however, detecting subtle differences between genuine and forged signatures remains challenging, raising concerns over privacy protection and data security in signature verification systems. Recently, the tremendous success of Transformers in natural language processing has led to their extension to computer vision, resulting in significant advancements. The multi-head self-attention mechanism is considered crucial to the Transformer's success. As the name implies, its query, key, and value all originate from the same sequence, which makes it well suited to single-input tasks. Pairwise signature verification, however, treats the reference and query signature images as two equal, independent inputs. Simply merging these two independent inputs into a single sequence therefore introduces inherent problems. To tackle this issue, we present a Pairwise Attention (PA) mechanism that preserves the symmetry of the inputs. Unlike the original attention mechanism, pairwise attention enables bidirectional information exchange between the reference and query signatures without introducing any additional, assumed temporal information. Combining this mechanism with the Swin Transformer architecture, we propose the Pairwise Attention Swin Transformer (PAST). Our method fundamentally resolves the problem of spurious temporal information being introduced during input fusion, and it also performs impressively on several public datasets. Experimental results show that PAST outperforms most existing methods. In addition, we investigate the impact of background information in the CEDAR database and find that background information in the training data has a significant effect on verification performance.
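
To make the idea concrete, the sketch below shows one way a symmetric pairwise attention block could be wired up in PyTorch: reference tokens attend to query tokens and vice versa, with shared projection weights so that neither input is treated as coming "first". This is only a minimal interpretation of the abstract, not the paper's implementation; the Swin-specific details (window partitioning, relative position bias, head counts, etc.) are omitted, and all names and shapes here are assumptions.

```python
# Illustrative sketch of a symmetric pairwise (bidirectional cross-) attention
# block, based only on the description in the abstract. Not the official PAST code.
import torch
import torch.nn as nn


class PairwiseAttention(nn.Module):
    """Exchange information between reference and query token sequences.

    Both directions share the same attention weights and normalization,
    so swapping the two inputs simply swaps the two outputs: no implicit
    ordering (and hence no assumed temporal information) is introduced.
    """

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, ref: torch.Tensor, qry: torch.Tensor):
        # ref, qry: (batch, tokens, dim) token sequences of the two signatures.
        r, q = self.norm(ref), self.norm(qry)
        # Reference tokens attend to query tokens ...
        ref_out, _ = self.attn(r, q, q)
        # ... and query tokens attend to reference tokens, with shared weights.
        qry_out, _ = self.attn(q, r, r)
        # Residual connections keep each stream's own content.
        return ref + ref_out, qry + qry_out


if __name__ == "__main__":
    block = PairwiseAttention(dim=96, num_heads=4)
    ref = torch.randn(2, 49, 96)   # e.g. tokens from a hypothetical 7x7 window
    qry = torch.randn(2, 49, 96)
    r2, q2 = block(ref, qry)
    print(r2.shape, q2.shape)      # torch.Size([2, 49, 96]) for both outputs
```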


Recommended citation:

PAST: Pairwise attention swin transformer for offline signature verification, Y.-J. Xiong*, J.-X. Ren, D.-H. Zhu, X.-J. Xie, X.-H. Qiu, International Journal on Document Analysis and Recognition (IJDAR), 2025:1-13

Download Paper