Fingerprints are an early example of biometrics, the science of identifying individuals by their physical characteristics. There is no clear date at which fingerprinting was first used; some examples date from prehistory. However, some significant modern developments are described below.
Sometimes the prints are invisible, in which case they are called latent fingerprints, but there are chemical techniques such as cyanoacrylate fuming that can make them visible.
Recently the United States Federal Bureau of Investigation adopted a wavelet-based system for the efficient storage of fingerprint data, based on wavelets developed by Ingrid Daubechies.
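As a rough illustration of the underlying idea (not the FBI's actual codec), the following Python sketch uses the PyWavelets library to transform an image with a biorthogonal wavelet, discard the smallest coefficients, and reconstruct an approximation from what remains; the wavelet name, decomposition level and keep fraction are arbitrary choices made only for this example.

    # Illustrative wavelet-based compression sketch (not the FBI's WSQ codec):
    # transform the image, zero out small coefficients, reconstruct the rest.
    import numpy as np
    import pywt

    def compress(image, wavelet="bior4.4", level=3, keep_fraction=0.05):
        """Keep only the largest `keep_fraction` of wavelet coefficients."""
        coeffs = pywt.wavedec2(image, wavelet, level=level)
        arr, slices = pywt.coeffs_to_array(coeffs)
        threshold = np.quantile(np.abs(arr), 1.0 - keep_fraction)
        arr[np.abs(arr) < threshold] = 0.0   # discard small coefficients
        coeffs = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
        return pywt.waverec2(coeffs, wavelet)

    # Example: a random grayscale array stands in for a fingerprint image.
    img = np.random.rand(256, 256)
    approx = compress(img)

Because most of the image energy concentrates in a small number of wavelet coefficients, a close approximation of the ridge pattern can be stored using only a fraction of the original data.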
In the 2000s, electronic fingerprint readers were introduced for security applications such as the identification of computer users (log-in authentication). However, early devices proved vulnerable to quite simple methods of deception, such as fake fingerprints cast in gels.
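A minimal sketch of the general structure of such a log-in check is shown below, assuming a hypothetical feature extractor and a cosine-similarity score; real readers use vendor-specific templates and matchers. The score-plus-threshold structure also hints at the weakness: the matcher sees only the captured pattern, not whether a live finger produced it.

    # Sketch of threshold-based fingerprint verification (hypothetical
    # extractor; real systems use minutiae or other proprietary features).
    import numpy as np

    def extract_template(scan):
        # Stand-in feature extraction: normalise the raw scan to unit length.
        v = scan.astype(float).flatten()
        return v / np.linalg.norm(v)

    def verify(scan, enrolled_template, threshold=0.95):
        # Accept the log-in if the similarity score clears the threshold.
        score = float(np.dot(extract_template(scan), enrolled_template))
        return score >= threshold

    # Enrolment and a later log-in attempt (random arrays stand in for scans).
    enrolled = extract_template(np.random.rand(128, 128))
    print(verify(np.random.rand(128, 128), enrolled))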
There is some controversy over the uniqueness of fingerprints. Even those who accept their uniqueness sometimes argue that the techniques used to compare fingerprints are fallible.
Fingerprint analysis (or dactylography, a term used mainly in the US) is the science of using fingerprints to identify someone uniquely. Humans leave behind prints of the ridges of the skin on their fingertips when handling certain materials. The pattern of ridges is thought to be unique to each person and in practice has proved unique enough to identify the person who left the fingerprint.
Fingerprint analysis emerged in the early 20th century, when it was the first method in forensic science for unique identification. As a result of its early success, it acquired a mystique of infallibility. It has only recently been subjected to systematic analysis by investigators from outside the field.
Fingerprint examiners have certainly disagreed with one another: the case of Shirley McKie is a notable example of a disputed fingerprint identification.