Difference Between Authentication And Authorization


02 Nov 2017


Authentication is the act of establishing or confirming something (or someone) as authentic, that is that claims made by or about the thing are true. Authenticating an object may mean confirming its provenance, whereas authenticating a person often consists of verifying their identity. Authentication depends upon one or more authentication factors.

In computer security, authentication is the process of attempting to verify the digital identity of the sender of a communication such as a request to log in. The sender being authenticated may be a person using a computer, a computer itself or a computer program. A blind credential, in contrast, does not establish identity at all, but only a narrow right or status of the user or program.

In a web of trust, authentication is a way to ensure that the user who attempts to perform functions in a system is in fact the user who is authorized to do so.

1.2 Difference between Authentication and Authorization:

Authorization is often thought to be identical to authentication; many widely adopted standard protocols, obligatory regulations, and even statutes are based on this assumption.

However, more precise usage describes authentication as the process of verifying a person's identity, while authorization is the process of verifying that a known person has the authority to perform a certain operation. Authentication, therefore, must precede authorization.

For example, when you show proper identification to a bank teller, you could be authenticated by the teller, and you would be authorized to access information about your bank accounts. You would not be authorized to access accounts that are not your own.
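The teller example can be sketched in code: authentication (verifying the claimed identity) must succeed before authorization (checking what that identity may do) is even considered. All names, credentials, and account numbers below are invented for illustration:

```python
# Hypothetical sketch: authentication precedes authorization.
REGISTERED_IDS = {"alice": "passport-123"}            # known customers
ACCOUNT_OWNERS = {"acct-001": "alice", "acct-002": "bob"}

def authenticate(name: str, credential: str) -> bool:
    """Verify that the claimed identity matches the presented credential."""
    return REGISTERED_IDS.get(name) == credential

def authorize(name: str, account: str) -> bool:
    """Verify that an already-authenticated person owns the account."""
    return ACCOUNT_OWNERS.get(account) == name

def access_account(name: str, credential: str, account: str) -> str:
    if not authenticate(name, credential):   # step 1: who are you?
        return "authentication failed"
    if not authorize(name, account):         # step 2: may you do this?
        return "not authorized for this account"
    return "access granted"
```

Note that Alice, once authenticated, is still refused access to Bob's account: passing the first check never implies passing the second.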

1.3 Authentication Factors:

The authentication factors for humans are generally classified into four categories:

Something the user is (e.g., fingerprint or retinal pattern, DNA sequence (there are assorted definitions of what is sufficient), voice pattern (again several definitions), signature recognition, unique bio-electric signals produced by the living body, or other biometric identifier)

Something the user has (e.g., ID card, security token, software token or cell phone)

Something the user knows (e.g., a password, a pass phrase or a personal identification number (PIN)).

Something the user does (e.g., voice recognition, signature, or gait).

1.4 e-Authentication:

e-Authentication is defined as a Web-based service that provides authentication to end users accessing (logging into) an Internet service.

e-Authentication is similar to credit card verification for e-commerce web sites. The verification is done by a dedicated service that receives the input and returns a success or failure indication.

For example, an end user wishes to enter his e-Buy or e-Trade web site. He gets the login web page and is required to enter his user ID and a password, or on more secure sites, his one-time password. The information is transmitted to the e-Authentication service as a query. If the service returns success, the end user is permitted into the e-Trade service with his privileges as a user.
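A minimal sketch of this flow, assuming a simple salted-hash credential store on the service side (the user IDs, passwords, and hashing details are illustrative, not a description of any real e-Authentication service):

```python
import hashlib

# Service-side store: user ID -> (salt, salted password hash).
# Entirely invented data for illustration.
_CREDENTIALS = {
    "trader1": ("salt42", hashlib.sha256(b"salt42" + b"s3cret").hexdigest()),
}

def e_authenticate(user_id: str, password: str) -> bool:
    """Dedicated service: receives the query, returns success or failure."""
    record = _CREDENTIALS.get(user_id)
    if record is None:
        return False
    salt, stored_hash = record
    candidate = hashlib.sha256(salt.encode() + password.encode()).hexdigest()
    return candidate == stored_hash

def login(user_id: str, password: str) -> str:
    """The web site transmits the credentials to the service as a query."""
    return "permitted" if e_authenticate(user_id, password) else "denied"
```

The web site itself never inspects the password; it only sees the success/failure indication returned by the service.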

1.5 Purpose of Authentication:

On August 8, 2001, the FFIEC agencies (the agencies) issued guidance entitled Authentication in an Electronic Banking Environment (the 2001 Guidance). The 2001 Guidance focused on risk management controls necessary to authenticate the identity of retail and commercial customers accessing Internet-based financial services.

Since 2001, there have been significant legal and technological changes with respect to the protection of customer information, increased incidents of fraud (including identity theft), and the introduction of improved authentication technologies.

This updated guidance replaces the 2001 Guidance and specifically addresses why financial institutions regulated by the agencies should conduct risk-based assessments, evaluate customer awareness programs, and develop security measures to reliably authenticate customers remotely accessing their Internet-based financial services.

Financial institutions should use this guidance when evaluating and implementing authentication systems and practices whether they are provided internally or by a service provider. Although this guidance is focused on the risks and risk management techniques associated with the Internet delivery channel, the principles are applicable to all forms of electronic banking activities.

The agencies consider single-factor authentication, as the only control mechanism, to be inadequate for high-risk transactions involving access to customer information or the movement of funds to other parties. Financial institutions offering Internet-based products and services to their customers should use effective methods to authenticate the identity of customers using those products and services.

The authentication techniques employed by the financial institution should be appropriate to the risks associated with those products and services. Account fraud and identity theft are frequently the result of single-factor (e.g., ID/password) authentication exploitation.

Where risk assessments indicate that the use of single-factor authentication is inadequate, financial institutions should implement multifactor authentication, layered security, or other controls reasonably calculated to mitigate those risks.

Consistent with the FFIEC Information Technology Examination Handbook (December 2002), financial institutions should periodically:

• Ensure that their information security program:

Identifies and assesses the risks associated with Internet-based products and services.

Identifies risk mitigation actions, including appropriate authentication strength.

Measures and evaluates customer awareness efforts.

• Adjust, as appropriate, their information security program in light of any relevant changes in technology, the sensitivity of its customer information, and internal or external threats to information.

• Implement appropriate risk mitigation strategies.

1.6 Need for Strong Authentication:

Single-factor authentication usually consists of "something you know". However, such factors are generally susceptible to attacks that can compromise the security of the application. Some of the more common attacks can be carried out at little or no cost to the perpetrator and without detection.

Attack programs are readily available over the Internet. If undetected, the perpetrator could access the information without alerting the legitimate user. This is the reason for using a strong user authentication process to protect data and systems. Strong user authentication has many benefits.

First, effective authentication provides the basis for validation of parties to the transaction and their agreement to its terms.

Second, it is a necessary element to establish authenticity of the records evidencing the electronic transaction should there ever be a dispute.

Third, it is a necessary element to establish the integrity of the records evidencing the electronic transaction. All of these elements promote the enforceability of electronic agreements.

Financial institutions should assess the adequacy of existing authentication techniques in the light of changing or newly perceived risks. According to the ICSA (International Computer Security Association), 80 per cent of system compromises originate from within the organization. The Basle Committee on Banking Supervision advises financial institutions to consider the apparent risks of offering Internet banking services based on a PIN alone. Single-factor authentication alone may not be commercially reasonable or adequate for high-risk applications and transactions.

Systems linked to open and untrusted networks such as the Internet are exposed to a greater number of individuals who may attempt to compromise the system. In the case of systems relying on a PIN alone, attackers may use automated programs to systematically generate millions of numerical combinations in order to learn a customer's access code (a brute force attack).
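A toy model of this weakness and one common mitigation: a 4-digit PIN admits only 10,000 combinations, trivially enumerable by an automated program, while a lockout counter caps how many of them an attacker can actually try. The PIN and attempt limit below are invented for illustration:

```python
def brute_force_space(pin_length: int) -> int:
    """Number of numeric combinations an attacker must try at most."""
    return 10 ** pin_length   # a 4-digit PIN has only 10,000 codes

class PinChecker:
    """Toy account that locks after a small number of failed attempts."""
    def __init__(self, pin: str, max_attempts: int = 3):
        self._pin = pin
        self._failures = 0
        self._max = max_attempts

    def try_pin(self, guess: str) -> str:
        if self._failures >= self._max:
            return "locked"                 # no more guesses accepted
        if guess == self._pin:
            self._failures = 0
            return "ok"
        self._failures += 1
        return "locked" if self._failures >= self._max else "wrong"
```

Without the lockout, an automated attacker enumerating all codes is guaranteed to succeed; with it, at most three of the 10,000 possibilities can ever be tested.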

1.7 Background of Authentication:

Financial institutions engaging in any form of Internet banking should have effective and reliable methods to authenticate customers. An effective authentication system is necessary for compliance with requirements to safeguard customer information, to prevent money laundering and terrorist financing, to reduce fraud, to inhibit identity theft, and to promote the legal enforceability of their electronic agreements and transactions. The risks of doing business with unauthorized or incorrectly identified persons in an Internet banking environment can result in financial loss and reputation damage through fraud, disclosure of customer information, corruption of data, or unenforceable agreements.

There are a variety of technologies and methodologies financial institutions can use to authenticate customers. These methods include the use of customer passwords, personal identification numbers (PINs), digital certificates using a public key infrastructure (PKI), physical devices such as smart cards, one-time passwords (OTPs), USB plug-ins or other types of "tokens", transaction profile scripts, biometric identification, and others. The level of risk protection afforded by each of these techniques varies.

The selection and use of authentication technologies and methods should depend upon the results of the financial institution’s risk assessment process. Authentication methods that depend on more than one factor are more difficult to compromise than single-factor methods. Accordingly, properly designed and implemented multifactor authentication methods are more reliable and stronger fraud deterrents.

For example, the use of a logon ID/password is single-factor authentication (i.e., something the user knows); whereas, an ATM transaction requires multifactor authentication: something the user possesses (i.e., the card) combined with something the user knows (i.e., the PIN). A multifactor authentication methodology may also include "out-of-band" controls for risk mitigation.

The success of a particular authentication method depends on more than the technology. It also depends on appropriate policies, procedures, and controls. An effective authentication method should have customer acceptance, reliable performance, scalability to accommodate growth, and interoperability with existing systems and future plans.

1.8 Risk Assessment:

The implementation of appropriate authentication methodologies should start with an assessment of the risk posed by the institution’s Internet banking systems.

The risk should be evaluated in light of the type of customer (e.g., retail or commercial), the customer's transactional capabilities (e.g., bill payment, wire transfer, loan origination), the sensitivity of customer information being communicated to both the institution and the customer, the ease of using the communication method, and the volume of transactions. Prior agency guidance has elaborated on this risk-based and "layered" approach to information security.

An effective authentication program should be implemented to ensure that controls and authentication tools are appropriate for all of the financial institution’s Internet-based products and services.

Authentication processes should be designed to maximize interoperability and should be consistent with the financial institution’s overall strategy for Internet banking and electronic commerce customer services.

The level of authentication used by a financial institution in a particular application should be appropriate to the level of risk in that application.

The method of authentication used in a specific Internet application should be appropriate and reasonable, from a business perspective, in light of the reasonably foreseeable risks in that application. Because the standards for implementing a commercially reasonable system may change over time as technology and other procedures develop, financial institutions and technology service providers should develop an ongoing process to review authentication technology and ensure appropriate changes are implemented.

The agencies consider single-factor authentication, as the only control mechanism, to be inadequate for high-risk transactions involving access to customer information or the movement of funds to other parties. Single-factor authentication tools, including passwords and PINs, have been widely used for a variety of Internet banking and electronic commerce activities, including account inquiry, bill payment, and account aggregation.

However, financial institutions should assess the adequacy of such authentication techniques in light of new or changing risks such as phishing, pharming, malware, and the evolving sophistication of compromise techniques. Where risk assessments indicate that the use of single-factor authentication is inadequate, financial institutions should implement multifactor authentication, layered security, or other controls reasonably calculated to mitigate those risks.
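One way to picture this layered, risk-based approach is a mapping from transaction risk to required controls, where a single factor satisfies only low-risk activity. The risk tiers and control names below are illustrative assumptions, not prescribed by the guidance:

```python
# Toy risk-tier table: each tier lists the minimum set of controls.
RISK_CONTROLS = {
    "low":    {"password"},                          # e.g. balance inquiry
    "medium": {"password", "device_check"},          # e.g. bill payment
    "high":   {"password", "otp", "device_check"},   # e.g. wire transfer
}

def controls_sufficient(risk: str, controls_present: set) -> bool:
    """A password alone passes only the low-risk tier."""
    required = RISK_CONTROLS[risk]
    return required.issubset(controls_present)
```

Under this model a password-only login is judged inadequate for anything above the low tier, mirroring the agencies' position on high-risk transactions.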

The risk assessment process should:

• Identify all transactions and levels of access associated with Internet-based customer products and services.

• Identify and assess the risk mitigation techniques, including authentication methodologies, employed for each transaction type and level of access.

• Include the ability to gauge the effectiveness of risk mitigation techniques for current and changing risk factors for each transaction type and level of access.

1.9 Customer Verification:

With the growth in electronic banking and commerce, financial institutions should use reliable methods of originating new customer accounts online. Moreover, customer identity verification during account origination is required and is important in reducing the risk of identity theft, fraudulent account applications, and unenforceable account agreements or transactions.

Potentially significant risks arise when a financial institution accepts new customers through the Internet or other electronic channels because of the absence of the physical cues that financial institutions traditionally use to identify persons.

One method to verify a customer’s identity is a physical presentation of a proof of identity credential such as a driver's license. Similarly, to establish the validity of a business and the authority of persons to perform transactions on its behalf, financial institutions typically review articles of incorporation, business credit reports, and board resolutions identifying officers and authorized signers, and other business credentials.

However, in an Internet banking environment, reliance on these traditional forms of paper-based verification decreases substantially. Accordingly, financial institutions need to use reliable alternative methods of verifying a customer's identity; paper-based verification, as the only control mechanism, is inadequate in this case.

2 Literature study of related work

2.1 Existing System:

Authentication methodologies are numerous and range from simple to complex. The level of security provided varies based upon both the technique used and the manner in which it is deployed. Single-factor authentication involves the use of one factor to verify customer identity.

The most common single-factor method is the use of a password. Two-factor authentication is most widely used with ATMs. To withdraw money from an ATM, the customer must present both an ATM card and a password or PIN. Multifactor authentication utilizes two or more factors to verify customer identity.

Authentication methodologies based upon multiple factors can be more difficult to compromise and should be considered for high-risk situations. The effectiveness of a particular authentication technique is dependent upon the integrity of the selected product or process and the manner in which it is implemented and managed.

Whichever authentication tool is chosen depends heavily on the type of service and the channel across which it is delivered, together with a risk assessment that the financial institution must carry out in order to ensure that the perceived risks are adequately mitigated. An effective authentication program should be implemented on an enterprise-wide basis and across all service channels (for example, Internet, telephone, and call-centre services) to ensure that controls and authentication tools are adequate. Authentication processes should be designed to maximize interoperability and should be consistent with the financial institution's overall strategy for electronic banking and e-commerce customer services.

2.2 Tokens:

Tokens are physical devices (something the person has) and may be part of a multifactor authentication scheme. Three types of tokens are discussed here: the USB token device, the smart card, and the password-generating token.

a) USB Token Device:

The USB token device is typically the size of a house key. It plugs directly into a computer’s USB port and therefore does not require the installation of any special hardware on the user’s computer. Once the USB token is recognized, the customer is prompted to enter his or her password (the second authenticating factor) in order to gain access to the computer system.

USB tokens are one-piece, injection-molded devices. USB tokens are hard to duplicate and are tamper resistant; thus, they are a relatively secure vehicle for storing sensitive data and credentials. The device has the ability to store digital certificates that can be used in a public key infrastructure (PKI) environment.

The USB token is generally considered to be user-friendly. Its small size makes it easy for the user to carry and, as noted above, it plugs into an existing USB port; thus the need for additional hardware is eliminated.

By requiring two independent elements for user authentication, this approach significantly decreases the chances of unauthorized information access and fraud.

USB Tokens are designed to securely store an individual’s digital identity (digital ID), specifically their Entrust digital certificates and keys.

Fig 2.1: USB Port

These portable tokens plug into a computer's USB port either directly or using a USB extension cable. When users attempt to log in to applications via the desktop, VPN/WLAN, or Web portal, they are prompted to enter their unique PIN. If the entered PIN matches the PIN within the Entrust USB Token, the appropriate digital credentials are passed to the network and access is granted. PINs stored on the token are encrypted for added security.

Pros of USB Tokens:

Strong security:

• Removing the USB token prevents other users from accessing the current secure session.

• The token cannot be duplicated.

• If stolen, security is not completely compromised, because the associated PIN is also required to gain access to the desktop, VPN/WLAN, or Web portal.

• PINs are encrypted by the token itself to increase the level of security.

• Storage of a user's digital certificate enables security capabilities beyond authentication alone, including digital signatures and encryption.

Reliable:

• USB tokens are durable against normal wear and tear.

• The typical lifespan of a token is approximately ten years.

• Batteries are not required.

• USB tokens can help organizations abide by new security standards and global legislation that mandate the privacy of client/patient/consumer records.

Lower total cost:

• Plug-and-play capability with USB ports minimizes helpdesk and training costs.

• USB extension cables protect against USB port failures due to wear and tear.

Cons of USB Tokens:

Requires client-side software, which may not be consistently available across platforms.

End-user education (to understand the concepts of PKI) may be a degree more difficult when compared with other two-factor authentication mechanisms.

Tokens may not be compatible across platforms (if a user needs to authenticate from multiple platforms, multiple tokens may be needed).

Having a "new store" for identities might not be as convenient (from an administrative standpoint) as simply providing an extra factor to an existing system.

b) Smart Card:

A smart card is a small, tamperproof computer. The smart card itself contains a CPU and some non-volatile storage. In most cards, some of the storage is tamperproof while the rest is accessible to any application that can talk to the card. This capability makes it possible for the card to keep some secrets, such as the private keys associated with any certificates it holds. The card itself actually performs its own cryptographic operations.

Although smart cards are often compared to hard drives, they are "secured drives with a brain"—they store and process information.

Smart cards are storage devices with the core mechanics to facilitate communication with a reader or coupler. They have file-system configurations and the ability to be partitioned into public and private spaces that can be made available or locked. They also have segregated areas for protected information, such as certificates, e-purses, and entire operating systems. In addition to traditional data storage states, such as read-only and read/write, some vendors are working with sub states best described as "add only" and "update only."

Smart cards are a key component of the public key infrastructure (PKI) because smart cards enhance software-only solutions, such as client authentication, logon, and secure email. Smart cards are a point of convergence for public key certificates and associated keys because they:

1) Provide tamper-resistant storage for protecting private keys and other forms of personal information.

2) Isolate security-critical computations involving authentication, digital signatures, and key exchange from other parts of the system that do not have a need to know.

3) Enable portability of credentials and other private information between computers at work, at home, or on the road.

Fig 2.2: Smart Cards

Fig 2.3: Smart Cards

The primary disadvantage as a consumer authentication device is that they require the installation of a hardware reader and associated software drivers on the consumer’s home computer.

c) Password-Generating Token:

A password-generating token produces a unique passcode, also known as a one-time password (OTP), each time it is used. The token ensures that the same OTP is not used consecutively. The OTP is displayed on a small screen on the token. The customer first enters his or her user name and regular password (first factor), followed by the OTP generated by the token (second factor).

The customer is authenticated if the regular password matches and the OTP generated by the token matches the password on the authentication server. A new OTP is typically generated every 60 seconds—in some systems, every 30 seconds. This very brief period is the life span of that password. OTP tokens generally last 4 to 5 years before they need to be replaced.

Password-generating tokens are secure because of the time-sensitive, synchronized nature of the authentication. The randomness, unpredictability, and uniqueness of the OTPs substantially increase the difficulty of a cyber thief capturing and using OTPs gained from keyboard logging.
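The time-synchronized scheme described above can be sketched with the standard TOTP-style construction: both the token and the authentication server derive the same 6-digit code from a shared secret and the current 60-second time window. The shared secret and timestamps below are invented for illustration:

```python
import hashlib
import hmac
import struct

def otp(secret: bytes, unix_time: int, step: int = 60, digits: int = 6) -> str:
    """Derive the one-time password for the time window containing unix_time."""
    counter = unix_time // step                      # current 60 s window
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

SECRET = b"shared-token-secret"   # provisioned into both token and server

def verify(candidate: str, unix_time: int) -> bool:
    """Server side: accept only the code for the current window."""
    return hmac.compare_digest(candidate, otp(SECRET, unix_time))
```

Because the code is a function of the time window, it expires on its own after at most 60 seconds, which is what makes a captured OTP so much less useful to a thief than a captured static password.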

2.3 Biometrics:

Biometrics are automated methods of identifying a person or verifying the identity of a person based on a physiological or behavioral characteristic. Examples of physiological characteristics include hand or finger images, facial characteristics, and iris recognition. Behavioral characteristics are traits that are learned or acquired. Dynamic signature verification, speaker verification, and keystroke dynamics are examples of behavioral characteristics.

Biometric authentication requires comparing a registered or enrolled biometric sample (biometric template or identifier) against a newly captured biometric sample (for example, a fingerprint captured during a login). During Enrollment, as shown in the picture below, a sample of the biometric trait is captured, processed by a computer, and stored for later comparison.

Biometric recognition can be used in Identification mode, where the biometric system identifies a person from the entire enrolled population by searching a database for a match based solely on the biometric.

For example, an entire database can be searched to verify a person has not applied for entitlement benefits under two different names. This is sometimes called "one-to-many" matching. A system can also be used in Verification mode, where the biometric system authenticates a person’s claimed identity from their previously enrolled pattern. This is also called "one-to-one" matching.

In most computer access or network access environments, verification mode would be used. A user enters an account, user name, or inserts a token such as a smart card, but instead of entering a password, a simple touch with a finger or a glance at a camera is enough to authenticate the user.
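The two matching modes can be sketched with toy numeric feature vectors standing in for real biometric templates: verification is a one-to-one comparison against one claimed identity, while identification is a one-to-many search of the whole enrolled database. The templates and distance threshold below are purely illustrative:

```python
ENROLLED = {                     # identity -> stored (toy) template
    "alice": [0.10, 0.80, 0.30],
    "bob":   [0.90, 0.20, 0.70],
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def verify_identity(claimed: str, sample, threshold: float = 0.2) -> bool:
    """One-to-one: compare the sample against one enrolled template."""
    template = ENROLLED.get(claimed)
    return template is not None and distance(sample, template) < threshold

def identify(sample, threshold: float = 0.2):
    """One-to-many: search the whole database for the closest match."""
    best = min(ENROLLED, key=lambda name: distance(sample, ENROLLED[name]))
    return best if distance(sample, ENROLLED[best]) < threshold else None
```

Verification only ever touches one template, which is why it scales to large populations far more easily than identification.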

2.3.1 Use of Biometrics:

Using biometrics for identifying human beings offers some unique advantages. Biometrics can be used to identify you as you. Tokens, such as smart cards, magnetic stripe cards, photo ID cards, physical keys and so forth, can be lost, stolen, duplicated, or left at home. Passwords can be forgotten, shared, or observed.

Moreover, today's fast-paced electronic world means people are asked to remember a multitude of passwords and personal identification numbers (PINs) for computer accounts, bank ATMs, e-mail accounts, wireless phones, web sites and so forth. Biometrics holds the promise of fast, easy-to-use, accurate, reliable, and less expensive authentication for a variety of applications.

There is no one "perfect" biometric that fits all needs. All biometric systems have their own advantages and disadvantages. There are, however, some common characteristics needed to make a biometric system usable. First, the biometric must be based upon a distinguishable trait. For example, for nearly a century, law enforcement has used fingerprints to identify people.

There is a great deal of scientific data supporting the idea that "no two fingerprints are alike." Technologies such as hand geometry have been used for many years and technologies such as face or iris recognition have come into widespread use. Some newer biometric methods may be just as accurate, but may require more research to establish their uniqueness.

Another key aspect is how "user-friendly" a system is. The process should be quick and easy, such as having a picture taken by a video camera, speaking into a microphone, or touching a fingerprint scanner. Low cost is important, but most implementers understand that it is not only the initial cost of the sensor or the matching software that is involved. Often, the life-cycle support cost of providing system administration and an enrollment operator can overtake the initial cost of the biometric hardware.

The advantage biometric authentication provides is the ability to require more instances of authentication in such a quick and easy manner that users are not bothered by the additional requirements. As biometric technologies mature and come into wide-scale commercial use, dealing with multiple levels of authentication or multiple instances of authentication will become less of a burden for users.

2.3.2 Biometric techniques:

Various biometric techniques and identifiers are being developed and tested; these include:

Fingerprint recognition

Face recognition

Voice recognition

Keystroke recognition

Handwriting recognition

Finger and hand geometry

Retinal scan

Iris scan

Fingerprint Recognition:

Fingerprint recognition technologies analyze global pattern schemata on the fingerprint, along with small unique marks known as minutiae, which are the ridge endings and bifurcations or branches in the fingerprint ridges. The data extracted from fingerprints are extremely dense and the density explains why fingerprints are a very reliable means of identification.

Fingerprint recognition systems store only data describing the exact fingerprint minutiae; images of actual fingerprints are not retained. Fingerprint scanners may be built into computer keyboards or pointing devices (mice), or may be stand-alone scanning devices attached to a computer.

Fingerprints are unique and complex enough to provide a robust template for authentication. Using multiple fingerprints from the same individual affords a greater degree of accuracy. Fingerprint identification technologies are among the most mature and accurate of the various biometric methods of identification.
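A toy sketch of minutiae-based matching as described above, with (x, y, type) tuples standing in for real minutiae; production matchers also handle rotation, translation, and skin distortion, all of which this illustrative version ignores:

```python
# Stored template: only minutiae points, never the fingerprint image.
TEMPLATE = {(12, 40, "ending"), (55, 23, "bifurcation"), (70, 81, "ending")}

def matches(scan, template=TEMPLATE, tol=2, min_hits=2) -> bool:
    """Accept the scan if enough minutiae align with the template."""
    hits = 0
    for (x, y, kind) in scan:
        # a scan minutia "hits" if a template minutia of the same type
        # lies within `tol` pixels in both coordinates
        if any(kind == k and abs(x - tx) <= tol and abs(y - ty) <= tol
               for (tx, ty, k) in template):
            hits += 1
    return hits >= min_hits
```

The tolerance absorbs small placement differences between scans of the same finger, while the minimum-hit count keeps a single coincidental alignment from being enough.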

Although end users should have little trouble using a fingerprint-scanning device, special hardware and software must be installed on the user’s computer. Fingerprint recognition implementation will vary according to the vendor and the degree of sophistication required. This technology is not portable since a scanning device needs to be installed on each participating user’s computer. However, fingerprint biometrics is generally considered easier to install and use than other, more complex technologies, such as iris scanning.

Enrollment can be performed either at the financial institution's customer service center or remotely by the customer after he or she has received setup instructions and passwords. According to fingerprint technology vendors, there are several scenarios for remote enrollment that provide adequate security, but for large-dollar transaction accounts, the institution should consider requiring that customers appear in person.

Fig 2.4: Fingerprint recognition

Fig 2.5: Fingerprint recognition

Fig 2.6: Fingerprint recognition.

Face Recognition:

The identification of a person by their facial image can be done in a number of different ways, such as by capturing an image of the face in the visible spectrum using an inexpensive camera or by using the infrared patterns of facial heat emission. Facial recognition systems working in visible light typically model key features from the central portion of a facial image.

Using a wide assortment of cameras, the visible light systems extract features from the captured image(s) that do not change over time while avoiding superficial features such as facial expressions or hair. Several approaches to modeling facial images in the visible spectrum are Principal Component Analysis, Local Feature Analysis, neural networks, elastic graph theory, and multi-resolution analysis.

Facial scans are only as good as the environment in which they are collected. The so-called "mug shot" environment is ideal. The best scans are produced under controlled conditions with proper lighting and proper placement of the video device. As part of a highly sensitive security environment, there may be several cameras collecting image data from different angles, producing a more exact scan.

Certain facial scanning applications also include tests for liveness, such as blinking eyes. Testing for liveness reduces the chance that the person requesting access is using a photograph of an authorized individual.

Some facial recognition systems may require a stationary or posed user in order to capture the image, though many systems use a real-time process to detect a person's head and locate the face automatically. Major benefits of facial recognition are that it is non-intrusive, hands-free, continuous, and accepted by most users.

Fig 2.7: Face recognition

Fig 2.8: Face recognition

Voice Recognition:

Speaker recognition has a history dating back some four decades, where the outputs of several analog filters were averaged over time for matching. Speaker recognition uses the acoustic features of speech that have been found to differ between individuals. These acoustic patterns reflect both anatomy (e.g., size and shape of the throat and mouth) and learned behavioral patterns (e.g., voice pitch, speaking style).

This incorporation of learned patterns into the voice templates (the latter called "voiceprints") has earned speaker recognition its classification as a "behavioral biometric." Speaker recognition systems employ three styles of spoken input: text-dependent, text-prompted, and text-independent.

Most speaker verification applications use text-dependent input, which involves selection and enrollment of one or more voice passwords. Text-prompted input is used whenever there is concern of imposters. The various technologies used to process and store voiceprints include hidden Markov models, pattern matching algorithms, neural networks, matrix representation and decision trees.
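One of the classic pattern-matching algorithms used for text-dependent verification is dynamic time warping (DTW), which aligns two renditions of the same pass-phrase spoken at different speeds. The sketch below runs on toy one-dimensional features; real systems use multidimensional acoustic features such as cepstral coefficients, so the data here is purely illustrative.

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two feature sequences:
    the minimum total cost of aligning sequence a against sequence b,
    allowing elements to be stretched or compressed in time."""
    INF = float("inf")
    n, m = len(a), len(b)
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # skip in a
                                 d[i][j - 1],      # skip in b
                                 d[i - 1][j - 1])  # match
    return d[n][m]

# Toy 1-D "acoustic features": the same password spoken twice (the
# second utterance is slower) versus an impostor's utterance.
enrolled = [0.1, 0.5, 0.9, 0.5, 0.1]
genuine  = [0.1, 0.1, 0.5, 0.9, 0.9, 0.5, 0.1]   # time-warped rendition
impostor = [0.9, 0.1, 0.9, 0.1, 0.9]
print(dtw_distance(enrolled, genuine) < dtw_distance(enrolled, impostor))  # True
```

Because DTW absorbs differences in speaking rate, the genuine rendition scores much closer to the enrolled template than the impostor's does.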

Some systems also use "anti-speaker" techniques, such as cohort models and world models. Ambient noise can impede collection of both the initial and subsequent voice samples. Performance degradation can result from changes in behavioral attributes of the voice and from enrollment using one telephone and verification on another.

Voice changes due to aging also need to be addressed by recognition systems. Many companies market speaker recognition engines, often as part of larger voice processing, control, and switching systems. Capture of the biometric is seen as non-invasive, and the technology needs little additional hardware: existing microphones and voice-transmission technology allow recognition over long distances via ordinary telephones (wireline or wireless).

Keystroke recognition:

Keystroke dynamics is the process of analyzing the way a user types at a terminal by monitoring the keyboard inputs thousands of times per second in an attempt to identify users based on habitual typing rhythm patterns.

Keystroke rhythm is a good sign of identity. Moreover, unlike other biometric systems, which may be expensive to implement, keystroke dynamics is almost free: the only hardware required is the keyboard.

Keystroke verification techniques can be classified as either static or continuous. Static verification approaches analyze keystroke characteristics only at specific times, for example during the login sequence. Static approaches provide more robust user verification than simple passwords, but do not provide continuous security: they cannot detect a substitution of the user after the initial verification. Continuous verification, by contrast, monitors the user's typing behavior throughout the course of the interaction.
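A static verification scheme of the kind described above can be sketched as follows: inter-keystroke intervals from several enrollment samples are summarized by their mean and spread, and a login attempt is accepted when its rhythm stays close to the enrolled profile. The timings, threshold, and z-score scoring rule here are illustrative assumptions, not a standard.

```python
import statistics

def enroll(samples):
    """Build a typing profile from several login-sequence timing samples.

    Each sample is a list of inter-keystroke intervals (seconds) recorded
    while the user typed a fixed string such as their password.
    """
    n = len(samples[0])
    means = [statistics.mean(s[i] for s in samples) for i in range(n)]
    stdevs = [max(statistics.stdev([s[i] for s in samples]), 1e-3)
              for i in range(n)]
    return means, stdevs

def verify(attempt, profile, threshold=2.0):
    """Static verification: accept if the attempt's rhythm is close to
    the enrolled profile (mean absolute z-score below the threshold)."""
    means, stdevs = profile
    score = sum(abs(a - m) / s
                for a, m, s in zip(attempt, means, stdevs)) / len(attempt)
    return score < threshold

# Enrollment: three samples of the user typing the same password.
profile = enroll([
    [0.12, 0.25, 0.18, 0.30],
    [0.11, 0.27, 0.17, 0.28],
    [0.13, 0.24, 0.19, 0.31],
])
print(verify([0.12, 0.26, 0.18, 0.29], profile))  # genuine rhythm -> True
print(verify([0.40, 0.05, 0.50, 0.05], profile))  # impostor rhythm -> False
```

A continuous-verification system would apply the same kind of scoring to a sliding window of keystrokes throughout the session rather than only at login.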

Hand Writing Recognition:

Handwriting recognition is the ability of a computer to receive intelligible handwritten input. The image of the written text may be sensed "off line" from a piece of paper by optical scanning (optical character recognition). Alternatively, the movements of the pen tip may be sensed "on line", for example by a pen-based computer screen surface.

Handwriting recognition principally entails optical character recognition. However, a complete handwriting recognition system also handles formatting, performs correct segmentation into characters and finds the most plausible words.

Fig 2.9: Hand Writing Recognition

Handwriting recognition is of two types:

Online Recognition.

Offline Recognition.

Online Recognition:

On-line handwriting recognition involves the automatic conversion of text as it is written on a special digitizer or PDA, where a sensor picks up the pen-tip movements X(t), Y(t) as well as pen-up/pen-down switching. That kind of data is known as digital ink and can be regarded as a dynamic representation of handwriting. The obtained signal is converted into letter codes which are usable within computer and text-processing applications.

The elements of an on-line handwriting recognition interface typically include:

A pen or stylus for the user to write with.

A touch sensitive surface, which may be integrated with, or adjacent to, an output display.

A software application which interprets the movements of the stylus across the writing surface, translating the resulting curves into digital text.

The information on strokes and trajectories is mathematically represented in an ink signal composed of a sequence of 2D points ordered by time. No matter what the handwriting surface may be, the digital ink is always plotted according to a matrix with x and y axes and a point of origin.

Fig 2.10: Online Recognition

Online data acquisition captures just the information needed, namely trajectory and strokes, to obtain a clean signal. Capturing only this essential information makes the data easier to process and supports more applications than offline-acquired data.
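The ink signal described above can be modeled directly as time-ordered (x, y, t) points grouped into strokes, with pen-up/pen-down switching captured by the stroke boundaries. The sketch below normalizes such a signal for position and scale, a common preprocessing step before recognition; the representation and normalization choices are illustrative assumptions.

```python
def normalize_ink(strokes):
    """Normalize digital ink so the position and size of the writing
    surface do not matter: translate the ink to the origin and scale it
    to unit height.

    strokes: list of strokes, each a list of (x, y, t) points ordered
    by time.
    """
    xs = [x for s in strokes for (x, y, t) in s]
    ys = [y for s in strokes for (x, y, t) in s]
    min_x, min_y = min(xs), min(ys)
    height = max(max(ys) - min_y, 1e-9)   # guard against a flat stroke
    return [[((x - min_x) / height, (y - min_y) / height, t)
             for (x, y, t) in s]
            for s in strokes]

# Two strokes of a pen trajectory as sampled by a digitizer.
ink = [
    [(100, 200, 0.00), (110, 220, 0.01), (120, 240, 0.02)],  # first stroke
    [(130, 200, 0.10), (130, 240, 0.12)],                    # second stroke
]
normalized = normalize_ink(ink)
print(normalized[0][0])  # -> (0.0, 0.0, 0.0)
```

After normalization, the same word written large or small, anywhere on the surface, yields the same coordinates, which is what lets the recognizer compare trajectories directly.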

Offline recognition:

Off-line handwriting recognition involves the automatic conversion of text in an image I(x,y) into letter codes which are usable within computer and text-processing applications. The data obtained by this form is regarded as a static representation of handwriting.

The technology is successfully used by businesses which process lots of handwritten documents, like insurance companies. The quality of recognition can be substantially increased by structuring the document (by using forms).

Off-line handwriting recognition is comparatively difficult: because different people have different handwriting styles, it is hard for a computer to recognize the writing. Postal (PIN) code digits, for example, are generally read by computer to sort incoming mail.

Fig 2.11: Offline recognition

Finger & Hand Geometry recognition:

The geometric features of the hand such as the lengths of fingers and the width of the hand are measured to identify an individual.

The hand geometry scanner looks for unique features in the structure of the hand. These unique features include the finger thickness, length, and width, the distances between finger joints, the hand’s overall bone structure, etc. It should be noted that with iris and fingerprint recognition, the primary goal is to look for extremely distinctive features.

However, this is not the case with hand geometry recognition, as it is looking for moderately unique features. Thus, hand geometry recognition would not be the biometric tool of choice for high security applications or identification purposes where iris recognition or fingerprint recognition would be, respectively.

The user first places his or her hand onto a platen. The platen has five pegs that help the user position the fingers properly, to ensure quality enrollment and verification templates. The hand geometry scanner contains a charge-coupled device (CCD) camera, as well as various reflectors and mirrors, in order to capture black-and-white pictures of the hand. Two basic types of pictures of the hand are captured:

(1) An image of the top of the hand.

(2) An image of the side of the hand.

In the enrollment phase, the user is prompted by the hand geometry scanner to place their hand on the platen three different times, so that three images can be captured and then averaged. The resulting image forms the basis for the enrollment template, which is then stored in the database of the hand geometry scanner. The enrollment phase can be accomplished in just five seconds.

In the verification phase, the user is prompted to place their hand only once on the platen. An image is captured, and forms the basis for the verification template. The verification template is compared against the enrollment template, in the exact same fashion as fingerprint recognition. The verification phase can be accomplished in just under one second.

In the enrollment and verification phases, the hand geometry scanner takes 96 measurements of the hand. The enrollment and verification templates are only 9 bytes.
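The enrollment-averaging and template-comparison steps above can be sketched as follows, with the 96 measurements simulated as numbers and a simple Euclidean distance threshold standing in for the scanner's matching rule. The threshold, noise model, and measurement ranges are all illustrative assumptions.

```python
import math
import random

MEASUREMENTS = 96  # the scanner takes 96 geometric measurements per capture

def enroll(captures):
    """Average three captures into a single enrollment template."""
    return [sum(c[i] for c in captures) / len(captures)
            for i in range(MEASUREMENTS)]

def verify(capture, template, threshold=5.0):
    """Accept if the Euclidean distance between the fresh capture and
    the enrollment template is below a tolerance threshold."""
    return math.dist(capture, template) < threshold

random.seed(1)
hand = [random.uniform(10, 100) for _ in range(MEASUREMENTS)]  # the "true" hand

def noisy():
    """Simulate one capture: the true measurements plus sensor noise."""
    return [m + random.uniform(-0.2, 0.2) for m in hand]

template = enroll([noisy(), noisy(), noisy()])   # enrollment: three placements
print(verify(noisy(), template))                 # same hand -> True
print(verify([m + 3 for m in hand], template))   # different hand -> False
```

The loose threshold reflects the point made above: hand geometry looks for moderately unique features, so matching tolerates more variation than fingerprint or iris matching would.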

Applications:

The most recognized use for this technology is in physical access entry applications, because the system is user-friendly to configure.

Another application gaining popularity for hand geometry recognition is time and attendance.

The advantage is that the use of timecards, identification badges, and social security numbers is eliminated. Also, the costly problem of "buddy punching" (associated with time clocks) is non-existent. Hand geometry recognition is also used for point of sale applications.

Hand geometry recognition is also utilized in the Immigration and Naturalization Service Passenger Accelerated Service System (INSPASS). With this system, frequent International business travelers can simply use their hand geometry to enter the United States, rather than waiting in long immigration lines at the airport.

First, hand geometry recognition, along with fingerprint recognition, has been around the longest, and as a result, it has certainly proved itself to be a viable technology.

Second, it is deemed to be one of the easiest to use and administer of all of the biometric technologies that are available today.

Third, hand geometry recognition can work in the harshest of environments, both internal and external.

In terms of an internal environment, the technology can tolerate a fair amount of rough usage from end users, especially in large factory, warehouse, and retail settings.

With respect to an external environment, the technology has proven to work in the most extreme of atmospheric climates, ranging from very hot to very cold.

Fourth, hand geometry recognition is the least susceptible to privacy rights issues, primarily because of its simple enrollment and verification procedures.

Fifth, the hand is a stable biometric whose physical characteristics are not susceptible to major biological changes (except for conditions of arthritis, swelling, or deep cuts), thus making hand geometry recognition a very reliable technology.

Retinal Scan:

A retinal scan is a biometric technique that uses the unique patterns on a person's retina to identify them.

The human retina is stable from birth to death, making it the most accurate biometric to measure. It has been possible to take a retina scan since the 1930s, when research suggested that each individual had unique retina patterns. The research was validated and we know that the blood vessels at the back of the eye have a unique pattern, from eye to eye and person to person.

A retinal scan involves the use of a low-intensity light source and coupler that are used to read the blood vessel patterns, producing very accurate biometric data. It has the highest crossover accuracy of any of the biometric collectors, estimated to be in the order of 1:10,000,000.

Some biometric identifiers, like fingerprints, can be fooled. This is not the case with a retina scan. The retina of a deceased person quickly decays and cannot be used to deceive a retinal scan. It is for this reason that retina scan technology is used for high end access control security applications.

Fig 2.12: Retinal Scan

Iris Scan:

Iris recognition is a method of biometric authentication that uses pattern recognition techniques based on high-resolution images of the irides of an individual's eyes. Iris recognition uses camera technology, with subtle infrared illumination to reduce specular reflection from the convex cornea, to create images of the detail-rich, intricate structures of the iris. These unique structures, converted into digital templates, provide mathematical representations of the iris that yield unambiguous positive identification of an individual.

Iris recognition efficacy is rarely impeded by glasses or contact lenses. Iris technology has the smallest outlier group (those who cannot use or enroll) of all biometric technologies. It is the only biometric authentication technology designed for use in a one-to-many search environment, and a key advantage of iris recognition is its stability, or template longevity: barring trauma, a single enrollment can last a lifetime.

Fig 2.13: Iris Scan

Fig 2.14: Iris Scan

The iris of the eye has been described as the ideal part of the human body for biometric identification for several reasons:

It is an internal organ that is well protected against damage and wear by a highly transparent and sensitive membrane (the cornea). This distinguishes it from fingerprints, which can be difficult to recognize after years of certain types of manual labor.

The iris is mostly flat and its geometric configuration is only controlled by two complementary muscles (the sphincter pupillae and dilator pupillae), which control the diameter of the pupil. This makes the iris shape far more predictable than, for instance, that of the face.

The iris has a fine texture that – like fingerprints – is determined randomly during embryonic gestation. Even genetically identical individuals have completely independent iris textures, whereas DNA (genetic "fingerprinting") is not unique for the about 1.5% of the human population who have a genetically identical monozygotic twin.
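The digital templates mentioned above are commonly compared with a normalized Hamming distance over the usable bits, the approach popularized by Daugman. The sketch below uses toy 16-bit templates standing in for real iris codes (which are typically around 2048 bits); the data and the simple list representation are illustrative.

```python
def hamming_distance(code_a, code_b, mask_a, mask_b):
    """Fraction of usable bits that differ between two binary iris
    templates. The masks flag bits obscured by eyelids, eyelashes, or
    reflections, which are excluded from the comparison."""
    usable = [i for i in range(len(code_a)) if mask_a[i] and mask_b[i]]
    differing = sum(1 for i in usable if code_a[i] != code_b[i])
    return differing / len(usable)

# Toy 16-bit templates; one usable bit differs between the two codes.
code_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 1, 0, 0]
code_b = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1, 0, 0]
mask = [1] * 16  # every bit usable in this toy example
score = hamming_distance(code_a, code_b, mask, mask)
print(score)  # -> 0.0625
```

Two images of the same iris produce a score near zero, while codes from different irides cluster near 0.5, which is why a single distance threshold separates matches from non-matches so cleanly.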

Cons are:

Iris scanning is a relatively new technology and is incompatible with the very substantial investment that the law enforcement and immigration authorities of some countries have already made in fingerprint recognition.

Iris recognition is very difficult to perform at a distance larger than a few meters, and difficult if the person to be identified is not cooperating by holding the head still and looking into the camera.

As with other photographic biometric technologies, iris recognition is susceptible to poor image quality, with associated failure-to-enroll rates.

As with other identification infrastructure (national residents databases, ID cards, etc.), civil rights activists have voiced concerns that iris-recognition technology might help governments to track individuals against their will.

PROPOSED SYSTEM:

The two-factor authentication method uses mobile devices as security tokens that receive a single-use password, strengthening the existing ID/password authentication and authorization process.

In the first phase, the authenticator receives application-generated requests for authentication of a specified user. When the request is received, a single-use password is generated and sent via the GSM Short Messaging Service to a GSM cell phone registered to the specified user. The single-use password has a configurable timeout (default 5 minutes).

In the second phase of the authentication process, a request is made specifying the user id and a hash of the single-use password. If both the single-use password and the user-specified password are valid, the user is authenticated.
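The two-phase protocol above can be sketched as follows. The SMS delivery step is simulated by returning the password directly, and the six-digit format, SHA-256 hash, and in-memory pending store are illustrative assumptions rather than part of the described system.

```python
import hashlib
import secrets
import time

OTP_TIMEOUT = 300  # configurable timeout; default 5 minutes
_pending = {}      # user id -> (single-use password, issue time)

def request_authentication(user_id):
    """Phase 1: generate a single-use password for the user. In the
    real system this is sent by SMS to the user's registered GSM phone;
    here it is returned directly for demonstration."""
    otp = "".join(secrets.choice("0123456789") for _ in range(6))
    _pending[user_id] = (otp, time.time())
    return otp  # stands in for the SMS delivery step

def authenticate(user_id, otp_hash):
    """Phase 2: verify the user id against the hash of the single-use
    password. The password expires after the timeout and is consumed
    on first use."""
    entry = _pending.get(user_id)
    if entry is None:
        return False
    otp, issued = entry
    del _pending[user_id]  # single-use: consume regardless of outcome
    if time.time() - issued > OTP_TIMEOUT:
        return False
    expected = hashlib.sha256(otp.encode()).hexdigest()
    return secrets.compare_digest(expected, otp_hash)

# The client receives the OTP, hashes it, and submits only the hash.
otp = request_authentication("alice")
print(authenticate("alice", hashlib.sha256(otp.encode()).hexdigest()))  # True
print(authenticate("alice", hashlib.sha256(otp.encode()).hexdigest()))  # False (consumed)
```

Submitting a hash rather than the password itself means an eavesdropper on the second request never sees the single-use password, and consuming the entry on first use prevents replay.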

Even though the application uses the client's device to receive the second factor, the client can be kept entirely out of generating the password. The approach has a few limitations: it largely prevents password hacking, but it does not protect against phishing.

The above proposal can be enhanced to protect the application against phishing as well. The proposed system is responsible both for generating the second factor and for generating a dynamic GUI through which the second factor is entered.

Benefits of Two-factor authentication:

Greatly enhances security by requiring two independent pieces of information for authentication.

Reduces the risk posed by weak user passwords that are cracked easily.

Minimizes the time administrators spend training and supporting users by providing a strong authentication process that is simple, intuitive, and automated.


