Biometric Technology: A Brief History

Ever wondered what biometrics are?

Biometrics are measurable physical characteristics of a person, such as fingerprints, voiceprints, and body dimensions. A biometric system may rely on a single characteristic or combine any number of them. Biometrics were first developed to describe the physical characteristics of humans, although it was later recognized that other animals share many of the same measurable traits.

Over time, biometric authentication has grown into one of the most important concepts in modern security. Biometrics are used for access control and identification: in computer science, biometric authentication serves as a form of access and identity control, while in other settings it governs entry to secured areas and systems.

Traditional password authentication relies on long, highly personalized passwords that each user must remember and manage. If a user forgets their password, regaining access to their accounts and information can be difficult and time-consuming. With modern biometric scanners, a single person can authenticate themselves to a site uniquely, even among hundreds or thousands of other enrolled users, with no master password to remember or rotate. With fingerprint and iris recognition, one of the most common forms of personal identification becomes the credential itself, letting an individual log in to a website and securely access the information they need from any computer.

In a nutshell, biometric technology involves collecting measurements of an individual: facial geometry, iris patterns, hand geometry, fingerprints, and so forth. These measurements are fed into a computer system that converts them into a digital template. When you later present your finger (or face, or iris) for scanning, the new measurement is compared against the enrolled template, and if the two are similar enough, the system declares a match. This is also why multi-factor authentication matters with biometric security: combining a biometric with at least one additional factor, such as a PIN or one-time code, means a single stolen or spoofed credential is not enough to gain access to the information you need to protect.
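
To make the matching step concrete, here is a minimal sketch in TypeScript. It assumes, purely for illustration, that templates are fixed-length feature vectors compared by cosine similarity against a tuned threshold, and that the second factor is a one-time code; real biometric systems use far more sophisticated feature extraction and matching.

```typescript
// Minimal sketch: template matching plus a second factor.
// Assumptions (not from any specific product): templates are numeric feature
// vectors, and a cosine-similarity threshold decides whether two scans match.

const MATCH_THRESHOLD = 0.95; // assumed cutoff; real systems tune this against false accept/reject rates

function cosineSimilarity(a: number[], b: number[]): number {
  const dot = a.reduce((sum, x, i) => sum + x * b[i], 0);
  const normA = Math.sqrt(a.reduce((sum, x) => sum + x * x, 0));
  const normB = Math.sqrt(b.reduce((sum, x) => sum + x * x, 0));
  return normA && normB ? dot / (normA * normB) : 0;
}

function biometricMatches(enrolledTemplate: number[], freshScan: number[]): boolean {
  // True if the fresh scan is close enough to the template stored at enrollment.
  return cosineSimilarity(enrolledTemplate, freshScan) >= MATCH_THRESHOLD;
}

function authenticate(
  enrolledTemplate: number[],
  freshScan: number[],
  submittedCode: string,
  expectedCode: string,
): boolean {
  // Multi-factor check: the biometric match alone is not enough;
  // a second factor (here, a one-time code) must also verify.
  return biometricMatches(enrolledTemplate, freshScan) && submittedCode === expectedCode;
}

// Example: a fresh scan that closely resembles the enrolled template, plus a valid code
const enrolled = [0.12, 0.85, 0.33, 0.56];
const scan = [0.11, 0.86, 0.31, 0.57];
console.log(authenticate(enrolled, scan, "493027", "493027")); // true
```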

Modern Biometric Technology

The history of modern biometric technology begins in the 1960s, and the field has since evolved into high-tech scanners that can read biomarkers with remarkably high accuracy. An early milestone came in 1969, when the Federal Bureau of Investigation (FBI) began pushing to automate fingerprint identification, enabling the analysis and mapping of unique fingerprint patterns.

In 1975, the FBI funded prototypes of scanners that could extract fingerprint features, although digital storage costs were prohibitive at the time. The National Institute of Standards and Technology (NIST) worked on algorithms and compression, leading to the FBI's first operational matching algorithm, M40. The M40 algorithm reduced the human search by producing a smaller set of candidate images that trained, skilled human technicians could then assess. These developments helped steadily improve fingerprint technology.

Biometric science boomed in the 1990s, when the National Security Agency (NSA) established the Biometric Consortium. The Department of Defense (DoD), in partnership with the Defense Advanced Research Projects Agency (DARPA), funded commercial face recognition algorithms. In addition, Lockheed Martin was selected to build an automated fingerprint identification system for the FBI.

The history of biometric technology saw further developments in the early 2000s, such as West Virginia University establishing a bachelor's program in biometric systems and computer engineering. The International Organization for Standardization (ISO), an international non-profit body that encourages collaboration in biometrics research, also helped standardize generic biometric technologies.

United States immigration authorities also adopted biometrics to process visa applications from legitimate travelers and enhance security. Biometric data such as voice samples, fingerprints, DNA swabs, and iris images were used to screen for national security threats.

The emerging popularity of smartphones was also crucial to the development of biometric technology as we know it today. In 2013, Apple introduced Touch ID with the release of the iPhone 5S. Touch ID is a key feature on iPhones and other Apple devices that lets users unlock their devices and make purchases. When Touch ID was introduced to the public, Apple clarified that fingerprint data is stored only on the device's chip, not on iCloud or Apple servers.

After roughly 60 years of development, biometric authentication has finally been commercialized and has become easy to use every day. It gained momentum as smartphones integrated scanning sensors, and millions of Samsung and Apple customers embraced the fingerprint scanners added to their phones. After this widespread acceptance of fingerprint scanners, Apple transitioned to face recognition with the release of the iPhone X.

One thing to expect from the future of biometric technology is a greater role in data security and safety. 5G will make big data and the Internet of Things more accessible than ever, and standards bodies such as the W3C and the FIDO Alliance, through specifications like FIDO2 and WebAuthn, can help standardize and govern biometric authentication at a time when access barriers are being removed.
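
For a sense of how those standards expose biometrics to the web, here is a minimal browser-side sketch of FIDO2/WebAuthn registration in TypeScript. The relying-party name, user details, and challenge are placeholders for illustration (a real challenge must come from the site's server); the key point is that the device's built-in sensor verifies the user locally and only a public-key credential is shared with the site.

```typescript
// Browser-side sketch of WebAuthn registration using the device's platform
// authenticator (e.g., a fingerprint or face sensor). Placeholder values only.

async function registerWithPlatformBiometrics(): Promise<Credential | null> {
  const publicKey: PublicKeyCredentialCreationOptions = {
    challenge: crypto.getRandomValues(new Uint8Array(32)), // normally issued by the server
    rp: { name: "Example Site" },                          // placeholder relying party
    user: {
      id: crypto.getRandomValues(new Uint8Array(16)),      // placeholder user handle
      name: "user@example.com",
      displayName: "Example User",
    },
    pubKeyCredParams: [{ type: "public-key", alg: -7 }],   // ES256
    authenticatorSelection: {
      authenticatorAttachment: "platform", // use the device's built-in authenticator
      userVerification: "required",        // prompts for the device's biometric or PIN
    },
  };

  // The biometric itself never leaves the device; the site only receives a public-key credential.
  return navigator.credentials.create({ publicKey });
}
```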

To learn more about biometric technology, check out digital signature authentication and delegated authentication in the infographic below from LoginID.

Biometric Technology
