Document liveness detection for identity documents is an important step in many document recognition pipelines. An objective evaluation methodology, together with a benchmarking dataset, is essential for assessing how effective current document liveness detection methods are.
The general objective of the Document Liveness Challenge 2021 (DLC 2021) is to establish an evaluation methodology and set up baselines for document image recapture detection, document photocopy detection, and document lamination detection. The benchmarking dataset used in the contest will be the first dataset containing document images representative of the problems that make document liveness detection challenging.
The DLC 2021 competition is based on the MIDV-2020 dataset (arxiv.org/abs/2107.00396) and its specially prepared extensions. The images for DLC 2021 were prepared by capturing video clips with a smartphone, splitting them into frames, and annotating the identity document quadrangle in each frame. The ground truth for each video clip contains 50+ frame annotations. The directory structure and annotation format correspond to those used in MIDV-2020.
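As an illustration, per-frame quadrangle annotations of this kind can be loaded with a few lines of Python. The JSON layout assumed below is a simplified sketch for illustration only; the authoritative schema is the one shipped with MIDV-2020.

```python
import json

def load_quads(annotation_path):
    """Load per-frame document quadrangles from a clip annotation file.

    Assumes a simplified JSON layout of the form
    {"frame_0001.jpg": [[x0, y0], [x1, y1], [x2, y2], [x3, y3]], ...};
    consult the MIDV-2020 documentation for the exact schema.
    """
    with open(annotation_path) as f:
        data = json.load(f)
    # One quadrangle (four corner points) per annotated frame.
    return {frame: [(float(x), float(y)) for x, y in quad]
            for frame, quad in data.items()}
```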
In the scope of DLC 2021, we consider “genuine” the original physical document samples that were used to capture the original MIDV-2020 dataset. Each sample was prepared by printing the template document image in color, laminating it, and cropping it with rounded corners.
The video clips will be split into frames at different frame rates, each sequence containing from 1 to 30 frames in total, to represent different acquisition times and frame processing times. The participants’ solutions will be evaluated as binary classifiers with a rejection option for each video frame sequence. The ground truth is defined by the way each video sequence was constructed.
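For intuition, a binary classifier with rejection over a frame sequence might aggregate per-frame scores as in the sketch below. The mean-score aggregation and the thresholds are illustrative assumptions, not part of the official evaluation protocol.

```python
from statistics import mean

GENUINE, FAKE, REJECT = "genuine", "fake", "reject"

def classify_sequence(frame_scores, fake_thr=0.7, genuine_thr=0.3):
    """Aggregate per-frame 'fakeness' scores (0 = genuine, 1 = fake) into a
    sequence-level decision with a rejection option.

    The thresholds and the averaging scheme are illustrative only.
    """
    if not frame_scores:
        return REJECT  # nothing to decide on
    score = mean(frame_scores)
    if score >= fake_thr:
        return FAKE
    if score <= genuine_thr:
        return GENUINE
    return REJECT  # ambiguous sequence: refuse to classify
```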
Examples of video frames for original, recaptured, photocopied, and color-copied documents are shown below.
At the start of the competition a demo dataset will be published: a small set of video sequences and annotations in the same format as the full evaluation dataset. The evaluation dataset will not be available to participants before submissions close. Each submission should be prepared as a Dockerfile that can process video sequences in batch mode and store the results in a canonical form. A Dockerfile example will be provided at the start of the competition.
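A containerized solution will typically wrap a batch loop of this shape. The directory layout, the `classify` callback, and the CSV output below are assumptions for illustration; the canonical result format will be fixed by the provided Dockerfile example.

```python
import csv
from pathlib import Path

def process_batch(input_dir, output_csv, classify):
    """Walk a directory of video-sequence folders, apply a classifier to each,
    and store one decision per sequence in a simple CSV.

    `classify` is any callable taking a list of frame paths and returning a
    decision string; the layout and output format are illustrative only.
    """
    rows = []
    for seq_dir in sorted(Path(input_dir).iterdir()):
        if seq_dir.is_dir():
            frames = sorted(seq_dir.glob("*.jpg"))
            rows.append((seq_dir.name, classify(frames)))
    with open(output_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["sequence", "decision"])
        writer.writerows(rows)
```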
The descriptions of the methods and the evaluation scores will be presented at ICMV 2021 in the special session dedicated to the challenge. A report on the competition will be published in the ICMV 2021 conference proceedings. At the end of the competition, the testing dataset, along with the program required to compute the evaluation measures, will become publicly available.
19.07.2021 (Monday) — DLC 2021 start, demo dataset publication;
16-22.08.2021 (week 5) — validation checkpoint with demo dataset leaderboard;
29.08.2021 (Sunday) — DLC 2021 submission deadline;
01.09.2021 (Wednesday) — leaderboard publication;
04.09.2021 (Saturday) — challenge papers submission deadline;
08.09.2021 (Wednesday) — DLC 2021 official finish.
If you still have any questions, please send an email: firstname.lastname@example.org
Dmitry V. Polevoy (Ph.D.), research scientist
responsible person (email@example.com)
Vladimir V. Arlazarov (Ph.D.), Head of Department
Dmitri G. Slugin, research scientist
Dmitri P. Nikolaev (Ph.D.), Head of Department
Jean-Christophe Burie, Professor
Luqman Muhammad Muzzamil (Ph.D.), research scientist
Zuheng Ming (Ph.D.), postdoctoral researcher
Registration for ICMV 2021 Document Liveness Challenge