World Academy of Science, Engineering and Technology
International Journal of Computer, Information Science and Engineering Vol:2 No:7, 2008

Automatic Authentication of Handwritten
Documents via Low Density Pixel Measurements

Abhijit Mitra, Pranab Kumar Banerjee and Cemal Ardil

Abstract— We introduce an effective approach for the automatic offline authentication of handwritten samples in which the forgeries are skillfully done, i.e., the true and forged samples appear almost alike. The subtle temporal information used in online verification is not available offline and is also hard to recover robustly. We therefore consider spatial dynamic information such as pen-tip pressure characteristics, with emphasis on the extraction of low density pixels. These points result from the ballistic rhythm of a genuine signature, which a forgery, however skillful it may be, always lacks. Ten effective features, including these low density points and a density ratio, are proposed to distinguish between a true and a forged sample. An adaptive decision criterion is also derived for better verification judgements.

Keywords— Handwritten document verification, skilled forgeries, low density pixels, adaptive decision boundary.

International Science Index 19, 2008 waset.org/publications/6001

I. INTRODUCTION

HANDWRITTEN document authentication, in general, attempts to confirm that a given written sample of a person is genuine, or equivalently, to identify a questioned sample as a forgery. Offline handwriting authentication [1]-[7] is considered more difficult than online verification [8] due to the lack of dynamic information such as the order of strokes, signing speed and acceleration. This makes it difficult to define effective global and local features for verification, and thus the field is not as developed as online detection. Nevertheless, the common factors available in offline authentication, such as slant and the relative heights of short and tall letters, have been used successfully in most of the reported work on free-hand forgery detection [7]. Skilled forgeries, however, often subclassified into traced and simulated forgeries, involve attempting to mimic the style of the writer, and can therefore be difficult to detect with only these static features. The main problem lies in designing a feature extraction method that yields stable features for genuine samples despite their inevitable variations, and salient features for forgeries even when the imitations are skillfully done. Although a genuine writer can never produce exactly the same handwriting twice, and many factors can affect signatures and other handwriting (injuries, illness, temperature, age and emotional state, as well as external factors such as alcohol and drugs [3]), an original piece of writing retains many natural personal characteristics such as cursiveness and ballistic rhythm. A forged sample, however skillful it may be, always lacks this rhythm and has poor line quality. Hence the features that provide an efficient basis for offline authentication are the non-natural characteristics such as hesitation, patching and retouching, which are generally reflected in the gray levels of the signature (and are lost if the signature is treated as a binary image). In the 1980s, Ammar et al. [6], aiming at skilled forgery detection, used such feature sets. They considered the geometric information about the letters far less informative and placed the main emphasis on high-pressure regions: the dark pixels in the signature image corresponding to high-pressure points on the writing trace. The focus there was on the ratio of the number of dark pixels to the total number of pixels in the signature image, one of the first attempts to exploit dynamic information in a static image for verification. Subsequently, many other ideas were developed for skilled forgery verification, such as the orientations of the gradient vectors at each pixel of the signature [9], a fuzzy technique that adds pseudo-dynamic information such as pen-up and pen-down events [10], curve comparison algorithms [11], and topological features such as branch points and crossing points [12]. However, to the best of our knowledge, none of these works has emphasized low pen-tip pressure points, or, in other words, low density pixel measurement, which can well serve as another basis for distinguishing genuine samples from skillful forgeries.

In this paper, we propose an effective approach to automatic handwriting authentication via low density pixel measurements with an adaptive threshold as the decision boundary. For simplicity, we deal only with handwritten signatures, although the proposed scheme extends to any handwritten sample without loss of generality. Toward this end, the low and high density pixel percentages are computed, and ten effective features are proposed along with the ratio of the above-mentioned density percentages. An adaptive decision criterion is then introduced, improving system reliability. Simulation studies also show better results than most existing schemes.

Manuscript received July 12, 2005.
A. Mitra is with the Department of Electronics and Communication Engineering, Indian Institute of Technology (IIT) Guwahati, North Guwahati 781039, India. E-mail: a.mitra@iitg.ernet.in.
P. K. Banerjee is with the Department of Electronics and Telecommunication Engineering, Jadavpur University, Kolkata - 700032, India. E-mail: pkbanj@rediffmail.com.
C. Ardil is with the Azerbaijan National Academy of Aviation, Baku, Azerbaijan. E-mail: cemalardil@gmail.com.

The paper is organized as follows. Sections II and III briefly discuss the types of forgeries and the signature data bank, respectively. The measurement of pixel densities is dealt with in detail in Section IV. The authentication process is given in Section V, where, in particular, ten different features are presented together with the deviation measured from them, followed by the proposed adaptive decision boundary. Section VI discusses the experimental results, and the paper is concluded in Section VII.


II. CLASSIFICATION OF FORGED DOCUMENTS

Forged documents are generally classified into two types, namely free-hand and skilled. These are discussed below.

Free-hand forgeries [7] are further subclassified into random and simple forgeries. When a forger simply uses an invalid signature without any prior knowledge of the authentic person's name or style of writing, it is classified as a random forgery. These are the simplest to detect because their characteristics differ globally from the genuine signature. Simple forgeries involve using the writer's name without any a priori information about the genuine signature style. A majority of forgeries are free-hand forgeries, but they are often overlooked when massive numbers of documents have to be processed.

Skilled forgeries [5], on the other hand, further subclassified into traced and simulated forgeries, closely resemble the true samples. Simulated forgeries are those in which the forger imitates the original signature style from memory, while a traced forgery is a fraudulent signature executed by actually following the outline of a genuine signature with a writing instrument. Our work is aimed at detecting these traced and simulated signatures.

III. SIGNATURE DATA BANK

The signature data bank consists of 200 genuine and 200 forgery samples. The true samples were written by 20 different persons as their own signatures, with 10 samples each. The forgeries were written by 4 different forgers simulating and tracing the original samples of those 20 persons. All samples were written with the same ball pen in a horizontally oriented limited space, in order to avoid disparities in gray level among the several genuine signatures of the same person. Each signature image is treated as a noise-free gray-level image in which a dark pixel corresponds to a high-pressure point. All samples are scanned within a limited space of 256 × 512 pixels and digitized into a matrix A = [a_ij], with each a_ij taking one of 2^8 gray levels. In the sequel, we denote the maximum and minimum density pixels of any signature by a_max and a_min, respectively.

IV. MEASUREMENT OF PIXEL DENSITY FEATURES

In this section, we develop the extraction method for low and high density pen-tip points and the respective density ratios. In order to distinguish among different density regions, two suitable threshold points are defined, one for high density pixels and the other for low density points, as discussed below.

A. Threshold Selection

From the original gray-level density information of the signature (i.e., the normalized histogram), we select a lower threshold point (T_L) and an upper threshold point (T_U) adaptively, at the gray levels which correspond to the lower and upper 1/√2 points, respectively, of the peak frequency of the same normalized histogram of the signature in question. An example of this selection procedure is given in Fig. 1, where Max represents the maximum number of pixels at any particular gray level in the given signature.

Fig. 1 An example of selecting T_L and T_U.

B. Low Density Pixel Percentage (LDPP)

Low density pixels (LDP) are those signature pixels whose gray-level values are lower than T_L. LDPs can be extracted by the relation

$$a_{ij}^{LDP} = \begin{cases} 1 & \forall\, a_{ij} \le T_L \\ 0 & \text{otherwise} \end{cases} \qquad (1)$$

where $a_{ij}^{LDP}$ represent the LD pixels. We can now define the low density pixel percentage (LDPP) of a signature as the ratio of LDPs to the binarized signature area:

$$LDPP = \frac{\sum_{i=1}^{M}\sum_{j=1}^{N} a_{ij}^{LDP}}{\sum_{i=1}^{M}\sum_{j=1}^{N} a_{ij}^{bin}} \times 100 \qquad (2)$$

with $a_{ij}^{bin}$ being the pixels of the binarized image [13].
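As an illustrative sketch (not the authors' code), the adaptive threshold selection and the density percentages above can be reconstructed as follows. It assumes the image is a list of rows of integer gray levels in which larger values mean darker marks (higher pen-tip pressure) and 0 is background; the function names are ours.

```python
from collections import Counter
from math import sqrt

def adaptive_thresholds(pixels, background=0):
    """T_L and T_U picked at the gray levels where the histogram falls to
    1/sqrt(2) of its peak frequency, below and above the peak (Sec. IV-A).
    The 1/sqrt(2) cutoff is the same for raw or normalized counts."""
    hist = Counter(p for row in pixels for p in row if p != background)
    peak = max(hist, key=hist.get)
    cutoff = hist[peak] / sqrt(2.0)
    t_low = peak
    while t_low > 0 and hist.get(t_low, 0) > cutoff:
        t_low -= 1                       # walk down to the lower 1/sqrt(2) point
    t_up = peak
    while t_up < 255 and hist.get(t_up, 0) > cutoff:
        t_up += 1                        # walk up to the upper 1/sqrt(2) point
    return t_low, t_up

def density_percentages(pixels, t_low, t_up, background=0):
    """LDPP and HDPP per eqs. (1)-(4): percentages of low/high density
    pixels over the binarized signature area."""
    area = ldp = hdp = 0
    for row in pixels:
        for p in row:
            if p == background:
                continue                 # a_ij^bin = 0: not part of the signature
            area += 1
            if p <= t_low:
                ldp += 1                 # eq. (1): a_ij <= T_L
            elif p >= t_up:
                hdp += 1                 # eq. (3): a_ij >= T_U
    return 100.0 * ldp / area, 100.0 * hdp / area
```

A real pipeline would first crop the scanned 256 × 512 frame to the signature bounding box; here the background test alone stands in for binarization.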

C. High Density Pixel Percentage (HDPP)

High density pixels (HDP) are signature points whose gray-level values are higher than T_U. HDPs can be extracted [6] by the relation

$$a_{ij}^{HDP} = \begin{cases} 1 & \forall\, a_{ij} \ge T_U \\ 0 & \text{otherwise} \end{cases} \qquad (3)$$

where $a_{ij}^{HDP}$ represent the HD pixels. Here we define the high density pixel percentage (HDPP) as the ratio of HDPs to the binarized signature area and express it as

$$HDPP = \frac{\sum_{i=1}^{M}\sum_{j=1}^{N} a_{ij}^{HDP}}{\sum_{i=1}^{M}\sum_{j=1}^{N} a_{ij}^{bin}} \times 100 \qquad (4)$$

where, in both cases, M × N denotes the total signature area. Applying the aforesaid adaptive threshold selection procedure, we obtained 22 ≤ HDPP ≤ 41, while for LDPP the range was 10 ≤ LDPP ≤ 42, with most of the forgery samples showing very poor values of LDPP. It was also observed in several signatures that almost the same HDPP values were obtained for a genuine and a forged sample, although the positions of the HDPs differed significantly. A forger, however, could never match the LDPP value obtained from a true sample. This observation is illustrated in Fig. 2 and Fig. 3: Fig. 2 shows the almost identical binarized images of a true and a forgery sample, while Fig. 3 exhibits a significant mismatch between the low density points of the same samples.

Fig. 2 Comparison of binarized images of a true and a forgery signature.

Fig. 3 Comparison of low density pixels of the same true and forgery samples.

V. VERIFICATION PROCEDURE

Finding a suitable feature set that establishes a reasonable basis for distinguishing original from forgery samples is a difficult task. We nevertheless attempt it with the following set, chosen after extensive observation of different kinds of signatures.

A. Feature Set

Ten features (φ_1 to φ_10) have been used for verification. Their selection is based on experimental observation, and they are given below.

φ_1, φ_2, φ_3: Vertical position of the peak frequency in the vertical projection of the binarized image, HDP and LDP, respectively.
φ_4, φ_5, φ_6: Horizontal position of the peak frequency in the horizontal projection of the binarized image, HDP and LDP, respectively.
φ_7: High to low density ratio, i.e., HDPP/LDPP.
φ_8: Lower threshold point computed from the pressure histogram (T_L).
φ_9: Dynamic range of the signature pixel values, i.e., a_max − a_min.
φ_10: Aspect ratio of the signature, i.e., the length/width ratio of just the signature area.

B. Deviation Measurement

The total deviation of an unknown sample is measured by

$$D_{RMS} = \sqrt{\frac{1}{N} \sum_{i=1}^{N} \frac{(\phi_i - \mu_i)^2}{\mu_{NF_i}}} \qquad (5)$$

where N is the number of features used (here N = 10) and $\mu_{NF_i}$ is a normalization factor for the ith feature, computed as

$$\mu_{NF_i} = \sqrt{\frac{1}{T} \sum_{j=1}^{T} (\phi_i(j) - \mu_i)^2} \qquad (6)$$

with T being the total number of known original samples for a particular person and μ_i the mean of φ_i over the known true samples, i.e.,

$$\mu_i = \frac{1}{T} \sum_{j=1}^{T} \phi_i(j). \qquad (7)$$
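The deviation measurement of eqs. (5)-(7) translates directly into code. The sketch below is our own illustrative reconstruction (feature extraction omitted; function names are hypothetical): it estimates the per-feature means and normalization factors from the T known genuine samples of one person, then scores an unknown feature vector.

```python
from math import sqrt

def normalization_stats(true_features):
    """Per-feature mean (eq. 7) and normalization factor (eq. 6),
    estimated from a list of feature vectors [phi_1 .. phi_N] taken
    from the T known genuine samples of one person."""
    T = len(true_features)
    N = len(true_features[0])
    means = [sum(s[i] for s in true_features) / T for i in range(N)]
    norms = [sqrt(sum((s[i] - means[i]) ** 2 for s in true_features) / T)
             for i in range(N)]
    return means, norms

def d_rms(sample, means, norms):
    """Total deviation D_RMS of an unknown sample, eq. (5): each squared
    feature deviation is scaled by that feature's normalization factor."""
    N = len(sample)
    return sqrt(sum((sample[i] - means[i]) ** 2 / norms[i]
                    for i in range(N)) / N)
```

Note that eq. (5) divides by $\mu_{NF_i}$ itself, not its square, so D_RMS is not a plain z-score average; the code keeps that form.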

The deviation parameter D_RMS is the key factor behind the verification criterion. If we now define D_Tmax as the maximum deviation over the true samples of a given signature, D_Fmin as the minimum deviation over the forgery samples of that person, and D_sep as the minimum distance [6] separating the original and forgery samples of the same person, then we can write

$$D_{sep} = D_{F_{min}} - D_{T_{max}}. \qquad (8)$$

In this case, the efficiency of the chosen feature set stems from the fact that it ensures D_Fmin > D_Tmax, i.e., D_sep > 0, in most cases.

C. Authentication Criteria

Having computed all the above parameters, we introduce a threshold value (V_th), closely related to the aforesaid deviations, for the practical purpose of verification, and the decision is taken as follows.
• If D_RMS < V_th, the input sample is considered a true sample.
• If D_RMS ≥ V_th, the input sample is judged a forgery.
The threshold parameter V_th must therefore be pre-established. After investigating a few alternatives, an adaptive decision criterion has been adopted to select V_th.

C.1 Adaptive decision criterion for selecting V_th:

In [6], a simple threshold was proposed by assigning V_th the value D_Tmax, i.e., V_thresh = D_Tmax. Such a simplified model could not converge to good results. A modified threshold was also considered to give an extra safety margin, by setting $V_{thresh} = D_{T_{max}} + \frac{D_{sep}}{\sqrt{2}}$. This model, however, could not ensure reasonable system reliability, as D_sep does not come out positive all the time. We therefore use a flexible threshold model, employing an observational formula to calculate V_th, given as

$$V_{th} = D_{T_{max}} \left(1 + \frac{c}{\sigma^2}\right) - \mu_D \qquad (9)$$

where μ_D is the mean of the deviation D_RMS computed over the set of known genuine signatures of the person in question, σ² is the variance over the same set, and the variable coefficient c, which can vary within (0, σ²], is set to the particular value (c_opt) that corresponds to the best system reliability (SR). The optimum value c_opt is found with the following adaptive formula. A parameter p, updated at every nth iteration, is initially set to p(0) = 0 and then adapted as

$$p(n+1) = p(n) + \lambda \, \mathrm{sgn}\{SR(n) - SR(n-1)\} \qquad (10)$$

where SR(n) and SR(n−1) denote the system reliability at the nth and (n−1)th iterations respectively, sgn(·) is the conventional sign function, and λ is the step size by which p(n) is incremented or decremented. This λ can be set according to the rate of change of c. With this equation, c_opt is assigned the value

$$c_{opt} = \max\{p(n) \mid \forall n\} \qquad (11)$$

and is passed to eq. (9) to set c for the final V_th value. The entire decision criterion is summarized in Table I.

TABLE I
OPERATIONAL FLOW OF THE PROPOSED ADAPTIVE DECISION CRITERION
————————————————————————
1. Initialization:
   p(0) = 0, λ = 0.01 (in this case), c = 0.
   Compute SR(0).
————————————————————————
2. Loop operation:
   For n = 1 to σ²/λ, do:
   (a) c ← c + λ.
   (b) Compute V_th.
   (c) Compute SR(n).
   (d) p(n+1) = p(n) + λ sgn{SR(n) − SR(n−1)}.
   (e) Store p(n).
   Loop complete.
————————————————————————
3. Postprocessing:
   c_opt = max{p(n) | ∀n}.
   c ← c_opt.
   Find V_th and SR with this value.
————————————————————————

Fig. 4 Relation among PCA, PCR and SR for different values of c for a particular set of genuine and forgery signatures.

VI. EXPERIMENTAL RESULTS

The verification results on the real-life set are judged collectively by the percentage of correct acceptance (PCA), the percentage of correct rejection (PCR) and the system reliability (SR). PCA is defined as the ratio of the number of accepted original samples to the number of known original samples. PCR is the ratio of the number of rejected forgery samples to the number of known forgery samples, while SR is defined as the average of these two (all three ratios are multiplied by 100 to obtain percentage values). The proposed flexible threshold, along with the simple and modified ones, was applied to the entire signature data bank and the results averaged. For the simple threshold, we obtained PCA = 92, PCR = 95, SR = 93.5. For the modified threshold, the values were PCA = 94.5, PCR = 95, SR = 94.75, and for the proposed flexible threshold with adaptive decision, PCA = 97.5, PCR = 96, SR = 96.75. Fig. 4 shows that selecting V_th in the flexible threshold case is not very critical, as SR > 95 can easily be obtained over a considerable range of c.
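The adaptive selection of V_th (Table I) together with the PCA/PCR/SR metrics can be sketched as below. This is a minimal reconstruction under our own assumptions, not the authors' implementation: it takes as input the D_RMS deviations already computed for the known genuine and forgery samples of one person, and all function names are ours.

```python
from statistics import mean, pvariance

def system_reliability(true_devs, forgery_devs, v_th):
    """PCA, PCR and SR for a candidate threshold: a sample is accepted
    as genuine when its D_RMS deviation is below V_th."""
    pca = 100.0 * sum(d < v_th for d in true_devs) / len(true_devs)
    pcr = 100.0 * sum(d >= v_th for d in forgery_devs) / len(forgery_devs)
    return pca, pcr, (pca + pcr) / 2.0

def adaptive_threshold(true_devs, forgery_devs, lam=0.01):
    """Table I sketch: sweep c over (0, sigma^2] in steps of lam,
    nudge p(n) by the sign of the change in SR (eq. 10), keep
    c_opt = max p(n) (eq. 11) and return the final V_th (eq. 9)."""
    d_tmax = max(true_devs)                 # maximum deviation of true samples
    mu_d = mean(true_devs)
    sigma2 = pvariance(true_devs)
    c, p, history = 0.0, 0.0, []
    sr_prev = system_reliability(true_devs, forgery_devs,
                                 d_tmax - mu_d)[2]        # SR(0), i.e. c = 0
    for _ in range(int(sigma2 / lam)):
        c += lam
        v_th = d_tmax * (1.0 + c / sigma2) - mu_d         # eq. (9)
        sr = system_reliability(true_devs, forgery_devs, v_th)[2]
        p += lam * (1 if sr > sr_prev else -1 if sr < sr_prev else 0)
        history.append(p)                                 # store p(n)
        sr_prev = sr
    c_opt = max(history)                                  # eq. (11)
    return d_tmax * (1.0 + c_opt / sigma2) - mu_d
```

In practice λ should divide σ² into enough steps for the sweep to be meaningful; the paper's λ = 0.01 is kept as the default here.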

VII. CONCLUSIONS

In this paper, a method for offline handwritten document authentication has been proposed for cases where the true and forgery samples are almost alike. Along with the high density pen-tip points, we have introduced the notion of low density points, which have proven effective for this problem by capturing the most important distinction with respect to simulated or traced signatures, namely the non-natural signature characteristics. The features have been chosen carefully so that the local characteristics of a forgery sample, such as lack of ballistic rhythm and poor line quality, can easily be detected by our scheme. Experimental results have also supported the effectiveness of pressure regions, especially the low density pixels. The LDPs are expected to find adequate utility in forensic applications in future studies.

REFERENCES

[1] B. Fang et al., "Offline Signature Verification by the Analysis of Cursive Strokes," Int. J. Pattern Recognition, Artificial Intelligence, vol. 15, no. 4, pp. 659-673, 2001.
[2] A. Mitra, "An Offline Verification Scheme of Skilled Handwritten Forgery Documents using Pressure Characteristics," IETE Journal of Research, vol. 50, no. 2, pp. 141-145, April 2004.
[3] J. K. Guo, D. Doermann and A. Rosenfeld, "Local Correspondence for Detecting Random Forgeries," in Proc. 4th IAPR Conf. Document Analysis, Recognition, Ulm, Germany, 1997, pp. 319-323.
[4] F. Leclerc and R. Plamondon, "Automatic Signature Verification: the state of the art - 1989-1993," Int. J. Pattern Recognition, Artificial Intelligence, vol. 8, pp. 3-20, 1994.
[5] M. Ammar, "Progress in Verification of Skillfully Simulated Handwritten Signatures," Int. J. Pattern Recognition, Artificial Intelligence, vol. 5, pp. 337-351, 1991.
[6] M. Ammar, Y. Yoshida and T. Fukumura, "A New Effective Approach for Automatic Off-line Verification of Signatures by using Pressure Features," in Proc. 8th Int. Conf. Pattern Recognition, Paris, 1986, pp. 566-569.
[7] R. N. Nagel and A. Rosenfeld, "Computer detection of freehand forgeries," IEEE Trans. Computers, vol. 26, pp. 895-905, 1977.
[8] R. Baron and R. Plamondon, "Acceleration Measurement with an Instrumented Pen for Signature Verification and Handwriting Analysis," IEEE Trans. Instrument., Measurement, vol. 38, pp. 1132-1138, 1989.
[9] R. Sabourin and R. Plamondon, "Preprocessing of handwritten signatures from image gradient analysis," in Proc. 8th Int. Conf. Pattern Recognition, Paris, 1986, pp. 576-579.
[10] C. Simon, E. Levrat, R. Sabourin and J. Bremont, "A Fuzzy Perceptron for Offline Handwritten Signature Verification," in Proc. Brazilian Symp. Document Image Analysis, 1997, pp. 261-272.
[11] F. Nouboud and R. Plamondon, "Global parameters and curves for offline signature verification," in Proc. Int. Workshop on Frontiers in Handwriting Recognition, 1994, pp. 145-155.
[12] K. Han and K. Sethi, "Signature Identification via Local Association of Features," in Proc. Int. Conf. Document Analysis and Recognition, 1995, pp. 187-190.
[13] J. R. Ullman, Pattern Recognition Techniques. New York: Crane-Russak, 1973.


  • 2. World Academy of Science, Engineering and Technology International Journal of Computer, Information Science and Engineering Vol:2 No:7, 2008 Section V, where, in particular, ten different features are given with the deviation measurement from these, followed by the proposed adaptive decision boundary. Section VI discusses about the experimental results and the paper is concluded in Section VII. International Science Index 19, 2008 waset.org/publications/6001 II. CLASSIFICATION OF FORGE DOCUMENTS Forge documents are generally classified into two types, namely, free-hand and skilled. These are discussed below. Free-hand forgeries [7] are again subclassified into random and simple forgeries. When a forger simply uses an invalid signature without any prior knowledge of the authenticated person’s name or style of writing, it is classified as random forgery. These are the simplest to detect because their characteristics differ globally from the genuine signature. Simple forgeries involve using the writer’s name without any a-priori information of the genuine signature style. A majority of forgeries are free-hand forgeries, but are often overlooked when there are massive numbers of documents to be processed. Skilled forgeries [5] on the other hand, with the further subclassification as traced and simulated forgeries, are almost alike the true samples. Simulated forgeries are those in which the forger imitates the original signature style from his/her memory, while a traced forgery is a fraudulent signature which has been executed by actually following the outline of a genuine signature with a writing instrument. Our work is aimed towards detecting these traced and simulated signatures. Fig. 1 An example of selecting TL and TU . A. 
III. SIGNATURE DATA BANK

The signature data bank consists of 200 genuine and 200 forged samples. The true samples were written by 20 different persons as their own signatures, with 10 samples each. The forgeries were written by 4 different forgers simulating and tracing the original samples of those 20 persons. All samples were written using the same ball pen in a horizontally oriented limited space in order to avoid disparities in gray level among several genuine signatures of the same person. The signature image is treated as a noise-free gray image in which a dark pixel means a high-pressure pixel. All samples are scanned within a limited space of 256 × 512 pixels and digitized into a matrix $A = [a_{ij}]$, with each $a_{ij}$ taking one of $2^8$ gray levels. In the sequel, we denote the maximum and minimum density pixels of any signature by $a_{max}$ and $a_{min}$, respectively.

IV. MEASUREMENT OF PIXEL DENSITY FEATURES

In this section, we develop the extraction method for low and high density pen-tip points and the respective density ratios. In order to distinguish among different density regions, two suitable threshold points are defined, one for high density pixels and the other for low density points, as discussed below.

A. Threshold Selection

From the original gray level density information of the signature (i.e., the normalized histogram), we select a lower threshold point ($T_L$) and an upper threshold point ($T_U$) adaptively, at the gray levels corresponding to the lower and upper $\frac{1}{\sqrt{2}}$ points, respectively, of the peak frequency of the normalized histogram of the signature in question. An example of this selection procedure is given in Fig. 1, where $Max$ represents the maximum number of pixels at any particular gray level in the given signature.

B. Low Density Pixel Percentage (LDPP)

Low density pixels (LDP) are those signature pixels whose gray level values are lower than $T_L$. LDPs can be extracted by the relation

$$a_{ij}^{LDP} = \begin{cases} 1 & \forall\, a_{ij} \le T_L \\ 0 & \text{otherwise} \end{cases} \tag{1}$$

where $a_{ij}^{LDP}$ represents the LD pixels. We can now define the low density pixel percentage (LDPP) of a signature as the ratio of LDPs to the binarized signature area:

$$LDPP = \frac{\sum_{i=1}^{M} \sum_{j=1}^{N} a_{ij}^{LDP}}{\sum_{i=1}^{M} \sum_{j=1}^{N} a_{ij}^{bin}} \times 100 \tag{2}$$

with $a_{ij}^{bin}$ being the binarized image [13] pixels.

C. High Density Pixel Percentage (HDPP)

High density pixels (HDP) are signature points whose gray level values are higher than $T_U$. HDPs can be extracted [6] by the relation

$$a_{ij}^{HDP} = \begin{cases} 1 & \forall\, a_{ij} \ge T_U \\ 0 & \text{otherwise} \end{cases} \tag{3}$$

where $a_{ij}^{HDP}$ represents the HD pixels. Here we define the high density pixel percentage (HDPP) as the ratio of HDPs to the binarized signature area:

$$HDPP = \frac{\sum_{i=1}^{M} \sum_{j=1}^{N} a_{ij}^{HDP}}{\sum_{i=1}^{M} \sum_{j=1}^{N} a_{ij}^{bin}} \times 100 \tag{4}$$

where, in both cases, $M \times N$ denotes the total signature area.

Applying the aforesaid adaptive threshold selection procedure to the density ratios, we obtained $22 \le HDPP \le 41$, whereas the LDPP range was $10 \le LDPP \le 42$, with most of the forged samples showing very poor LDPP values. It was also observed in several signatures that almost the same HDPP values were obtained from a genuine and a forged sample, although the positions of the HDPs of the two differed significantly. A forger, however, could never match the LDPP value obtained from a true sample. This observation is made clear in Fig. 2 and Fig. 3: Fig. 2 shows the almost identical binarized images of a true and a forged sample, while Fig. 3 exhibits a significant mismatch between the low density points of the same samples.

Fig. 2. Comparison of binarized images of a true and a forged signature.

Fig. 3. Comparison of low density pixels of the same true and forged samples.

V. VERIFICATION PROCEDURE

Finding a suitable feature set that establishes a reasonable basis for distinguishing original from forged samples is a difficult task. We nevertheless attempt it with the following set, chosen through extensive observation of different kinds of signatures.

A. Feature Set

Ten features ($\phi_1$ to $\phi_{10}$) have been used for verification. Their selection is based on experimental observation, and they are given below.

$\phi_1, \phi_2, \phi_3$: Vertical position of the peak frequency in the vertical projection of the binarized image, the HDP and the LDP, respectively.
$\phi_4, \phi_5, \phi_6$: Horizontal position of the peak frequency in the horizontal projection of the binarized image, the HDP and the LDP, respectively.
$\phi_7$: High to low density ratio, i.e., HDPP/LDPP.
$\phi_8$: Lower threshold point computed from the pressure histogram ($T_L$).
$\phi_9$: Dynamic range of the signature pixel values, i.e., $a_{max} - a_{min}$.
$\phi_{10}$: Aspect ratio of the signature, i.e., the length/width ratio of just the signature area.

B. Deviation Measurement

The total deviation of an unknown sample is measured by

$$D_{RMS} = \sqrt{\frac{1}{N} \sum_{i=1}^{N} \frac{(\phi_i - \mu_i)^2}{\mu_{NF_i}}} \tag{5}$$

where $N$ is the number of features used (here $N = 10$) and $\mu_{NF_i}$ is a normalization factor for the $i$th feature, computed as

$$\mu_{NF_i} = \sqrt{\frac{1}{T} \sum_{j=1}^{T} (\phi_i(j) - \mu_i)^2} \tag{6}$$

with $T$ being the total number of known original samples of a particular person and $\mu_i$ the mean of $\phi_i$ over the known true samples, i.e.,

$$\mu_i = \frac{1}{T} \sum_{j=1}^{T} \phi_i(j). \tag{7}$$

The deviation parameter $D_{RMS}$ is the key factor behind the verification criterion.
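As a concrete illustration of Sections IV and V-B, the threshold selection, the density percentages of eqs. (1)-(4) and the deviation measure of eq. (5) can be sketched in NumPy as below. This is an illustrative reconstruction, not the authors' code: it assumes 8-bit optical-density values (higher = darker = heavier pen pressure), excludes blank background pixels from the histogram and the counts, and uses a trivial stand-in for the binarization of [13].

```python
import numpy as np

def select_thresholds(img):
    """Sec. IV-A: place T_L and T_U at the gray levels nearest the lower and
    upper 1/sqrt(2) points of the histogram peak of the signature trace."""
    fg = img[img > 0]                        # assumption: 0 = blank background
    hist = np.bincount(fg.astype(np.int64), minlength=256).astype(float)
    peak = int(hist.argmax())                # "Max" in Fig. 1
    cutoff = hist[peak] / np.sqrt(2)
    t_l, t_u = peak, peak
    while t_l > 0 and hist[t_l] > cutoff:    # walk down to the lower 1/sqrt(2) point
        t_l -= 1
    while t_u < 255 and hist[t_u] > cutoff:  # walk up to the upper 1/sqrt(2) point
        t_u += 1
    return t_l, t_u

def density_percentages(img, t_l, t_u):
    """Eqs. (1)-(4): LDP/HDP counts as percentages of the binarized area."""
    binary = img > 0                         # stand-in for the binarization of [13]
    area = binary.sum()
    ldp = binary & (img <= t_l)              # eq. (1), background excluded
    hdp = binary & (img >= t_u)              # eq. (3)
    return 100.0 * ldp.sum() / area, 100.0 * hdp.sum() / area   # eqs. (2), (4)

def d_rms(phi, mu, nf):
    """Eq. (5): total deviation of a feature vector phi from the genuine-class
    means mu; nf holds the per-feature normalization factors of eq. (6)."""
    phi, mu, nf = (np.asarray(x, dtype=float) for x in (phi, mu, nf))
    return float(np.sqrt(np.mean((phi - mu) ** 2 / nf)))
```

In practice the ten features $\phi_1$ to $\phi_{10}$ would be computed from the binarized image and the LDP/HDP masks (projection peaks, HDPP/LDPP, dynamic range, aspect ratio), and `nf` would be estimated from the $T$ known genuine samples as in eq. (6).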
If we now define $D_{T_{max}}$ as the maximum deviation among the true samples of a given signature, $D_{F_{min}}$ as the minimum deviation among the forged samples of that person, and $D_{sep}$ as the minimum distance [6] separating the original and forged samples of the same person, then we can write

$$D_{sep} = D_{F_{min}} - D_{T_{max}}. \tag{8}$$

In this case, the efficiency of the chosen feature set stems from the fact that it ensures $D_{F_{min}} > D_{T_{max}}$, i.e., $D_{sep} > 0$, in most cases.

C. Authentication Criteria

Having computed all the above parameters, we introduce a threshold value ($V_{th}$), closely related to the aforesaid deviations, for the practical purpose of verification, and the decision is taken as follows.
• If $D_{RMS} < V_{th}$, the input sample is considered to be a true sample.
• If $D_{RMS} \ge V_{th}$, the input sample is judged to be a forgery.

The threshold parameter $V_{th}$ must therefore be pre-established. After investigating a few aspects, an adaptive decision criterion has been adopted to select $V_{th}$.

C.1 Adaptive decision criterion for selecting $V_{th}$: In [6], a simple threshold was proposed by assigning $V_{th}$ the value $D_{T_{max}}$, i.e., $V_{thresh} = D_{T_{max}}$. Such a simplified model could not converge to good results. A modified threshold has also been considered, adding an extra safety factor by setting $V_{thresh} = D_{T_{max}} + \frac{D_{sep}}{\sqrt{2}}$. This model, however, could not ensure reasonable system reliability either, since $D_{sep}$ does not turn out to be positive all the time. We therefore use a flexible threshold model, employing an observational formula to calculate $V_{th}$, given as

$$V_{th} = D_{T_{max}} \left(1 + \frac{c}{\sigma^2}\right) - \mu_D \tag{9}$$

where $\mu_D$ is the mean of the deviation $D_{RMS}$ computed over the set of known genuine signatures of the person in question, $\sigma^2$ is the variance of the same set, and the variable coefficient $c$, which can vary within $(0, \sigma^2]$, is set to the particular value ($c_{opt}$) that corresponds to the best system reliability (SR). The optimum value $c_{opt}$ is found with the following adaptive formula. A parameter $p$, updated at every $n$th iteration, is initially set to $p(0) = 0$ and then adapted as

$$p(n+1) = p(n) + \lambda\, \mathrm{sgn}\{SR(n) - SR(n-1)\} \tag{10}$$

where $SR(n)$ and $SR(n-1)$ denote the system reliability values at the $n$th and $(n-1)$th iterations, respectively, $\mathrm{sgn}(\cdot)$ indicates the conventional sign function, and $\lambda$ is the step size by which $p(n)$ is either incremented or decremented. This $\lambda$ can be set with respect to the rate of change of $c$. With this equation, $c_{opt}$ is assigned the value

$$c_{opt} = \max\{p(n) \mid \forall n\} \tag{11}$$

and is passed on to eq. (9) to set $c$ for the final $V_{th}$ value. The entire decision criterion is summarized in Table I.

TABLE I
OPERATIONAL FLOW OF THE PROPOSED ADAPTIVE DECISION CRITERION
1. Initialization: $p(0) = 0$, $\lambda = 0.01$ (in this case), $c = 0$. Compute $SR(0)$.
2. Loop operation: For $n = 1$ to $\sigma^2/\lambda$, do:
   (a) $c \leftarrow c + \lambda$.
   (b) Compute $V_{th}$.
   (c) Compute $SR(n)$.
   (d) $p(n+1) = p(n) + \lambda\, \mathrm{sgn}\{SR(n) - SR(n-1)\}$.
   (e) Store $p(n)$.
   Loop complete.
3. Postprocessing: $c_{opt} = \max\{p(n) \mid \forall n\}$. $c \leftarrow c_{opt}$. Find $V_{th}$ and $SR$ with this value.

Fig. 4. Relation among PCA, PCR and SR for different values of $c$ for a particular set of genuine and forged signatures.
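The decision rule and the Table I sweep can be prototyped as follows, operating on precomputed $D_{RMS}$ values for a tuning set of known genuine and forged samples. This is a sketch under stated assumptions (the function names and the evaluation of SR on a labelled set are ours), not the authors' implementation.

```python
import numpy as np

def system_reliability(v_th, dev_true, dev_forged):
    """SR for a candidate threshold: accept when D_RMS < V_th (Sec. V-C)."""
    pca = np.mean(dev_true < v_th)       # correct-acceptance rate
    pcr = np.mean(dev_forged >= v_th)    # correct-rejection rate
    return (pca + pcr) / 2

def adaptive_threshold(dev_true, dev_forged, lam=0.01):
    """Table I: sweep c over (0, sigma^2] in steps lam, track p(n) by the sign
    of the SR change (eq. (10)), take c_opt = max p(n) (eq. (11)) and return
    the corresponding V_th of eq. (9)."""
    dt_max = dev_true.max()              # D_T_max over the known true samples
    mu_d = dev_true.mean()               # mean genuine deviation
    sigma2 = dev_true.var()              # variance of the same set

    def v_of(c):                         # eq. (9)
        return dt_max * (1 + c / sigma2) - mu_d

    c, p, ps = 0.0, 0.0, []
    sr_prev = system_reliability(v_of(c), dev_true, dev_forged)   # SR(0)
    for _ in range(int(sigma2 / lam)):
        c += lam
        sr = system_reliability(v_of(c), dev_true, dev_forged)
        p += lam * np.sign(sr - sr_prev)                          # eq. (10)
        ps.append(p)
        sr_prev = sr
    c_opt = max(ps) if ps else 0.0                                # eq. (11)
    return v_of(c_opt)
```

Note that, as in Table I, $c_{opt}$ is read off the stored $p(n)$ trace rather than taken directly as the SR-maximizing $c$, and $\lambda$ serves as the common step size for both $c$ and $p$.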
VI. EXPERIMENTAL RESULTS

The verification results on the real-life set are judged collectively by the percentage of correct acceptance (PCA), the percentage of correct rejection (PCR) and the system reliability (SR). PCA is defined as the ratio of the number of accepted original samples to the number of known original samples. PCR is the ratio of the number of rejected forged samples to the number of known forged samples, while SR is defined as the average of these two (all three ratios are multiplied by 100 to obtain percentage values). The proposed flexible threshold, along with the simple and modified ones, was applied to the entire signature data bank, and the results were calculated on an average basis. For the simple threshold we obtained PCA = 92, PCR = 95 and SR = 93.5. For the modified threshold the values were PCA = 94.5, PCR = 95 and SR = 94.75, and for the proposed flexible threshold with the adaptive decision, PCA = 97.5, PCR = 96 and SR = 96.75. Fig. 4 shows that the selection of $V_{th}$ in the flexible threshold case is not very critical, as SR > 95 can easily be obtained over a considerable range of $c$.

VII. CONCLUSIONS

In this paper, a method for offline handwritten document authentication has been proposed for the case where the true and forged samples are almost alike. Along with the high density pen-tip points, we have introduced the notion of low density points, which have proved effective for this problem by capturing the most important distinction with respect to simulated or traced signatures, i.e., the non-natural signature characteristics. The features have been chosen carefully so that the local characteristics of a forged sample, such as lack of ballistic rhythm and poor line quality, can easily be detected by our scheme. The experimental results also support the effectiveness of pressure regions, especially low density pixels. The LDPs are expected to find adequate utility in forensic applications in future studies.

REFERENCES

[1] B. Fang et al., "Offline Signature Verification by the Analysis of Cursive Strokes," Int. J. Pattern Recognition and Artificial Intelligence, vol. 15, no. 4, pp. 659-673, 2001.
[2] A. Mitra, "An Offline Verification Scheme of Skilled Handwritten Forgery Documents using Pressure Characteristics," IETE Journal of Research, vol. 50, no. 2, pp. 141-145, April 2004.
[3] J. K. Guo, D. Doermann and A. Rosenfeld, "Local Correspondence for Detecting Random Forgeries," in Proc. 4th IAPR Conf. Document Analysis and Recognition, Ulm, Germany, 1997, pp. 319-323.
[4] F. Leclerc and R. Plamondon, "Automatic Signature Verification: The State of the Art, 1989-1993," Int. J. Pattern Recognition and Artificial Intelligence, vol. 8, pp. 3-20, 1994.
[5] M. Ammar, "Progress in Verification of Skillfully Simulated Handwritten Signatures," Int. J. Pattern Recognition and Artificial Intelligence, vol. 5, pp. 337-351, 1991.
[6] M. Ammar, Y. Yoshida and T. Fukumura, "A New Effective Approach for Automatic Off-line Verification of Signatures by using Pressure Features," in Proc. 8th Int. Conf. Pattern Recognition, Paris, 1986, pp. 566-569.
[7] R. N. Nagel and A. Rosenfeld, "Computer Detection of Freehand Forgeries," IEEE Trans. Computers, vol. 26, pp. 895-905, 1977.
[8] R. Baron and R. Plamondon, "Acceleration Measurement with an Instrumented Pen for Signature Verification and Handwriting Analysis," IEEE Trans. Instrumentation and Measurement, vol. 38, pp. 1132-1138, 1989.
[9] R. Sabourin and R. Plamondon, "Preprocessing of Handwritten Signatures from Image Gradient Analysis," in Proc. 8th Int. Conf. Pattern Recognition, Paris, 1986, pp. 576-579.
[10] C. Simon, E. Levrat, R. Sabourin and J. Bremont, "A Fuzzy Perceptron for Offline Handwritten Signature Verification," in Proc. Brazilian Symp. Document Image Analysis, 1997, pp. 261-272.
[11] F. Nouboud and R. Plamondon, "Global Parameters and Curves for Off-line Signature Verification," in Proc. Int. Workshop on Frontiers in Handwriting Recognition, 1994, pp. 145-155.
[12] K. Han and K. Sethi, "Signature Identification via Local Association of Features," in Proc. Int. Conf. Document Analysis and Recognition, 1995, pp. 187-190.
[13] J. R. Ullman, Pattern Recognition Techniques. New York: Crane-Russak, 1973.