Assignment 2 - Computer Science
Part 1: Quoting
Required source: a professional journal article from the list presented in the Library section of the classroom, as explained above. Do not reuse quotations already presented in the article; your task is to find direct statements in the article and quote them yourself.
Quotation 1 – parenthetical citation
· Choose a meaningful statement of 25–39 words from the article and quote it without an introduction, placing the in-text citation after the closing quotation mark and before the sentence's final punctuation.
Quotation 2 – narrative citation
· Choose a different meaningful statement of 25–39 words from the same article and quote it properly, starting your sentence with According to or a similar introduction, and inserting the proper citation as explained in the Reading. Illustrative shapes of both citation types follow.
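For orientation, the two citation shapes look like this (the author, quote, and page number are invented purely for illustration; draw your actual quotes from your chosen article):
· Parenthetical: "A meaningful statement of twenty-five to thirty-nine words, copied word for word from the article, appears between these quotation marks" (Doe, 2020, p. 14).
· Narrative: According to Doe (2020), "a meaningful statement of twenty-five to thirty-nine words, copied word for word from the article, appears between these quotation marks" (p. 14).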
Required adjustment:
· Edit just one of your two quotes by correctly using brackets, an ellipsis, or [sic]. These techniques are explained in the Reading.
· Caution: If the original contains no error, you cannot use [sic] and must instead employ either brackets for a clarification or an ellipsis to delete words (see the examples below). Note that British English spellings are not considered errors.
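For example (again with an invented original sentence, purely to show the mechanics):
· Ellipsis, to delete words: "The system ... failed only rarely during authentication" (Doe, 2020, p. 15).
· Brackets, to clarify: "It [the fingerprint scanner] failed only rarely during authentication" (Doe, 2020, p. 15).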
Reference entry:
· Provide a full 7th edition APA-standard reference entry for this journal article.
Part 2: Paraphrasing from two other articles
Choose two (2) other journal articles from the same Library list. It is recommended that you pick articles that are relatively easy for you to understand, especially if you are rather new to the technology field. Find a section of each article that interests you and write paraphrases.
For each of your two paraphrases, separately:
· Compose a descriptive title (a phrase) in your own words.
· Write a paraphrase of 170–220 words. If it is difficult to meet the minimum length or to stay under the maximum, choose a more suitable section (or section size) from the original article.
· Do not include any quotes.
· Write the paraphrases in paragraph form (no lists).
· Include proper citation as explained in the Reading.
· Provide a full 7th edition APA-standard reference entry.
International Journal of Advanced Computer Research, Vol 10(47)
ISSN (Print): 2249-7277; ISSN (Online): 2277-7970
http://dx.doi.org/10.19101/IJACR.2019.940152

A survey of biometric approaches of authentication
Nuhu Yusuf*, Kamalu Abdullahi Marafa, Kamila Ladan Shehu, Hussaini Mamman and Mustapha Maidawa
Lecturer, Department of Management and Information Technology, Abubakar Tafawa Balewa University (ATBU), Bauchi, Nigeria
*Author for correspondence
Received: 20-December-2019; Revised: 19-March-2020; Accepted: 22-March-2020
©2020 Nuhu Yusuf et al. This is an open access article distributed under the Creative Commons Attribution (CC BY) License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Review Article

Abstract
The increasing need for better authentication against hackers has motivated the use of biometric authentication methods to guard against unauthorized access to systems. The use of human characteristics for biometrics provides authentication for different kinds of systems. However, poor-quality authentication still allows hackers to gain access to these systems. Many biometric authentication approaches have been proposed to improve authentication accuracy and other related quality measures. This survey presents state-of-the-art fingerprint and password biometric authentication approaches. Their challenges are presented and discussed in terms of biometric authentication, and the strengths and weaknesses of fingerprint and password biometric authentication are discussed and compared. The findings show that fingerprint image quality and password authentication remain active research areas where performance requires improvement. The graphical password also indicates a promising future direction for enhancing password methods.

Keywords
Biometrics, authentication, fingerprint, password, information security.

1. Introduction
Information security refers to the means of preventing unauthorized users from accessing information. Risk management is usually adopted in information security to address security challenges by minimizing risk. Risk can be minimized through administrative controls and defense mechanisms. Access control can also enforce user access rights and thereby minimize risk. Authentication is one of the techniques used in access control systems to protect against unauthorized access. Many authentication approaches have been proposed to address security challenges. These approaches often conflict with system usability and therefore require modification of traditional password techniques for better solutions. Authentication is the common method used by security experts to verify users' identities before granting access rights to a system. Access controls are enforced for all users, irrespective of the categories they belong to. Traditional authentication methods [1] were not sufficient to prevent unauthorized access, as many security breaches were reported. Therefore, advanced security methods based on human features are required.
Biometrics is a strong authentication approach based on certain human characteristics. These characteristics are distinct to each individual, and selecting one requires careful assessment of its benefits and shortcomings. Different biometric methods exist, ranging from simple passwords, fingerprints, and palm prints to more complex ones such as DNA. The fingerprint is one biometric trait that is impossible for unauthorized users to alter because it relies on the friction ridges of the finger. Palm-print methods require an image of the palm region of the hand to compare palms before granting access rights. As the most common method, however, people prefer using a password to secure their systems rather than using complex algorithms.
Biometric methods have proved capable of preventing unauthorized user access. However, a large-scale review of some of these methods is required to understand recently added contributions. A previous contribution is the review by Padma and Srinivasan [2], which focused on biometric authentication in cloud computing. Prasad et al. [3] presented fingerprint biometric authentication methods, reviewing various fingerprint recognition systems and their applications. However, there is a need to look into other biometric methods, as that paper only considers the fingerprint method. This paper presents a survey of biometric authentication methods, specifically comparing the fingerprint and password methods as those most commonly used.
2. Literature review
Biometric authentication attracts the attention of both researchers and practitioners, and it is now replacing other authentication methods such as passwords. This is because user behaviour patterns can easily be used for identification. These human characteristics cannot easily be stolen or forgotten and are therefore useful for authentication. For instance, the face, fingerprint, iris, and voice can readily identify users during authentication, so unauthorized users cannot gain access. Tekade and Shende [4] believe that biometric technology is capable of solving personal identity security issues in many critical application areas. Parkavi et al. [5] present the importance of biometrics in using multiple personal identification techniques for authenticating users. Kakkad et al. [6] present the importance of user authentication techniques for authenticating images in the cloud.
Biometric authentication was introduced to identify and control access to a system [4]. Biometrics can be based on human characteristics such as the fingerprint, face, iris, retina, and palm print [5]. Through biometric recognition, users' identities are verified based on certain measurements. To provide proper authentication, biometric authentication usually relies on fingerprint, eye-scanner, facial-recognition, hand-geometry, and password approaches. The fingerprint approach operates using fingerprint scanners such as optical, capacitive, and ultrasound scanners [7]. An optical scanner takes a photo of the finger, identifies patterns, and compiles them into codes for security identification. The eye-scanner approach provides authentication based on retina and iris scanners. The retina and iris remain with a person throughout life and as such are easily accessible. A retina scan uses light to illuminate the blood vessels of the eye; the idea is that people have distinct retinal tissue in their blood vessels. An iris scanner takes a photo of the individual's iris and uses it for authentication. The facial-recognition approach can either extract a person's face image or use skin-texture analysis for authentication. The hand-geometry approach uses palm thickness for biometric authentication, though its low accuracy [6] is a drawback. Biometric authentication provides a security process to verify user identity [4]. It has been characterized as easy to use because users can use it whenever required. Biometric authentication also makes it difficult for hackers to discover a weakness and gain access to the system [8]. However, certain limitations exist for biometric authentication, such as use by proxy and remote recovery: the actual person involved must be physically present, and people cannot authenticate on behalf of others. Additionally, some authentication methods have recovery mechanisms, but no such recovery exists for biometric authentication. Figure 1 presents some of the common biometric authentication methods.
Figure 1. Biometric authentication methods (reconstructed from the original diagram): biometric authentication branches into fingerprint (optical, capacitive, and ultrasound scanners), eye scan (retina scan and iris scan), facial recognition (face recognition and skin-texture analysis), hand geometry (palm thickness and finger length), and password (one-time password and graphical password).
3. Methods
The biometric authentication approaches most often used, in both simple and complex systems, are the fingerprint and password methods.
3.1 Fingerprint biometric authentication
The fingerprint is an important mechanism for detecting crime and preventing unauthorized access to a system. Rahmawati et al. [9] believe fingerprint technology can be combined with a digital signature to improve the security of mobile applications, specifically when sending and receiving documents. Furthermore, Kamelia et al. [10] examine the significance of the fingerprint method for taking online attendance using mobile phones; they demonstrate the possibility of integrating fingerprints with GPS via an Arduino and achieved a 1.39-second average response time. Goicoechea-Telleria et al. [11] investigate how fingerprint adoption in smartphones has become worrisome due to sensor issues. Hwang et al. [12] provide a template for achieving higher accuracy in fingerprint recognition on mobile devices. You and Wang [13] proposed a fingerprint method based on a fuzzy vault scheme. Wireless devices require fingerprints for data security; accordingly, Lin et al. [14] suggest dimensionality reduction using machine learning algorithms as an authentication solution, since dimensionality reduction supports effective decisions on data reduction. In addition, Ma et al. [15] present a multi-dimension algorithm to provide cellular network security, while Sadhukhan et al. [16] analyse the performance of clustering-based fingerprinting for smartphone devices.

Engelsma et al. [17] suggested how fingerprints can be enhanced in the future to avoid image variation resulting from fingerprint capture; they presented a universal 3D fingerprint target as an alternative for reducing image variation. Similarly, higher-resolution 3D fingerprints can also be achieved using sweat gland extraction [18], which utilizes cell positions. Valdes-Ramirez et al. [19] reviewed fingerprint features for identifying latent fingerprints based on minutiae, and Makhija et al. [20] analysed the performance of various latent fingerprint techniques, which require further improvement.
3.2 Password biometric authentication
Password authentication is the process of verifying a user's access rights through the use of a password. Users may be allowed to set up a simple text password, but simple text passwords are subject to attack. Maqbali and Mitchell [21] therefore suggested generating passwords automatically, without user involvement, in line with international standard practice for password authentication requirements.
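As background, the sketch below shows the usual server-side shape of password verification: store only a salted, slow hash and compare in constant time. It uses only the Python standard library; the iteration count and salt size are illustrative assumptions, not recommendations from the cited works.

```python
# Minimal sketch of salted password hashing and verification.
import hashlib
import hmac
import os

ITERATIONS = 200_000  # assumed work factor; choose per deployment

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest); only these are stored, never the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored)
```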
The purpose of password authentication is to let authorized users keep a secret access credential so that unauthorized users cannot gain access. Passwords should not be easy for attackers to guess, since attackers can easily gain access through weak passwords. Rahiemy et al. [22] report that a lack of password complexity is a common entry point for attackers. In addition, Tabrez and Sai [23] also believe that weak passwords always motivate attackers. Zhang et al. [24] argued that a technique can be designed so that users constantly change the password before attackers gain access; however, their technique only considers dictionary attacks, overlooking other attacks that may cause more serious damage than dictionary attacks.

Password attacks are the various techniques used to obtain a password by either guessing or stealing. One example is the dictionary attack, in which people's names, dates of birth, or lower-/uppercase letter combinations are tried repeatedly until the actual password is found. Figure 1 presents how dictionary attacks work.
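To see why weak passwords fail quickly, consider this toy dictionary attack: the attacker hashes candidates from a word list and compares them against a leaked hash. The word list is a hypothetical stand-in, and the unsalted SHA-256 is deliberately simplistic; salting and slow hashing, shown earlier, defend against exactly this.

```python
# Toy dictionary attack against an unsalted hash (illustration only).
import hashlib

WORDLIST = ["password", "letmein", "john1990", "qwerty"]  # hypothetical list

def crack(stored_hash):
    for candidate in WORDLIST:
        if hashlib.sha256(candidate.encode()).hexdigest() == stored_hash:
            return candidate  # guessed the password
    return None  # dictionary exhausted without a match

leaked = hashlib.sha256(b"letmein").hexdigest()
print(crack(leaked))  # -> letmein
```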
Erdem and Sandıkkaya [25] support the use of the one-time password (OTP); they proposed an OTP-based technique in which the provider is located in the cloud as a service and analyzes the user before granting access. Default passwords may be discovered through Trojan horses or backdoors via network traffic. Intruders also use social engineering to obtain passwords via email or other channels. Brute-force attacks are further attacks based on trial and error to obtain the password.
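The one-time-password mechanism itself can be illustrated with the standard HOTP/TOTP construction (RFC 4226/RFC 6238). This is a generic sketch of how OTPs are derived, not the specific cloud-service scheme of Erdem and Sandıkkaya [25]; the shared secret is a placeholder.

```python
# Generic HOTP/TOTP sketch: HMAC a counter (or time window), then truncate.
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Counter-based OTP per RFC 4226 (dynamic truncation)."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = int.from_bytes(mac[offset:offset + 4], "big") & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30) -> str:
    """Time-based OTP per RFC 6238: the counter is the current time window."""
    return hotp(secret, int(time.time()) // period)

print(totp(b"shared-secret"))  # a fresh 6-digit code every 30 seconds
```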
Mohamedali and Fadlalla [26] present different categories of password attacks and state the benefits and shortcomings of each; they suggest friendlier methods for addressing these attacks without compromising usability. These attacks include, among others, phishing and man-in-the-middle attacks. Zheng and Jia [27] suggest the use of separators between keystrokes to address leaked-password issues: blank spaces are inserted within the password for better authentication, and if the password with spaces corresponds to the user's input, access rights are granted. Hwang et al. [28], by contrast, proposed the use of a smart card as an authentication method instead of a general password, attempting to address password-guessing attacks with complex smart-card implementations.
4. Results
This section presents the results of the two major biometric authentication approaches, taking into consideration their strengths and limitations.

4.1 Results of the fingerprint biometric authentication approach
Table 1 presents a comparison of various fingerprint techniques. Wu and Chiu [29] present solutions to poor fingerprint quality to ensure better fingerprint recognition for authentication; their work used ridge-feature techniques that differentiate individuals and achieved almost 99% accuracy. In addition, Tang et al. [30] examine how the Hessian matrix and the short-time Fourier transform (STFT) can improve fingerprint image quality using fingerprint textures; their results indicate that processing time was reduced to 0.799 seconds. Furthermore, Liban and Hilles [31] suggest enhancing latent fingerprints to improve fingerprint quality so that reasonable processing times can be achieved. However, Koptyra and Ogiela [32] argued that higher fingerprint processing times result when enhancing with the Histogram of Oriented Gradients (HOG) technique.
Patel et al. [33] enhanced the O'Gorman filter to address the minutiae-point extraction problem; the result achieved a mean square error (MSE) of 6.698 and a peak signal-to-noise ratio (PSNR) of 39.871 (see Table 1).
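For reference, MSE and PSNR are conventionally defined as follows, where I is the reference image, K the processed image, both of size M × N, and MAX_I the maximum possible pixel value (255 for 8-bit images):

```latex
\mathrm{MSE} = \frac{1}{MN} \sum_{i=1}^{M} \sum_{j=1}^{N} \bigl( I(i,j) - K(i,j) \bigr)^{2},
\qquad
\mathrm{PSNR} = 10 \, \log_{10} \frac{\mathrm{MAX}_I^{2}}{\mathrm{MSE}} \ \text{(dB)}
```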
Similarly, Sudiro et al. [34] used an artificial neural network to address fingerprint extraction issues, achieving a 41% false acceptance rate. Kim et al. [35] also used a deep neural network to address issues arising from fingerprint collection. Cao and Jain [36] present a fingerprint synthesis technique to reduce the processing-time cost of fetching fingerprint images from a database. Nuraisha and Shidik [37] stated that fake fingerprints cause longer processing times, so normalization is required to obtain higher-accuracy results. Han et al. [38] reduce impulse noise in fingerprint images using an adaptive median filter.
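As an illustration of the general idea, a textbook adaptive median filter (not necessarily the exact variant of Han et al. [38]) grows its window per pixel until the local median is no longer an impulse value, then replaces only pixels that look like impulses:

```python
# Textbook adaptive median filter sketch for salt-and-pepper (impulse) noise.
# Expects a 2-D grayscale array; a float dtype avoids truncation on write.
import numpy as np

def adaptive_median_filter(img: np.ndarray, max_window: int = 7) -> np.ndarray:
    pad = max_window // 2
    padded = np.pad(img, pad, mode="edge")
    out = img.copy()
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            for w in range(3, max_window + 1, 2):  # 3x3, then 5x5, ...
                r = w // 2
                win = padded[y + pad - r:y + pad + r + 1,
                             x + pad - r:x + pad + r + 1]
                z_min, z_med, z_max = win.min(), np.median(win), win.max()
                if z_min < z_med < z_max:          # median is not an impulse
                    if not (z_min < img[y, x] < z_max):
                        out[y, x] = z_med          # pixel is an impulse
                    break                          # otherwise keep the pixel
            else:                                  # window grew to the limit
                out[y, x] = z_med
    return out
```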
4.2 Results of the password biometric authentication approach
Given all the security challenges of traditional passwords, Taufiq and Ogi [39] suggest improving existing password techniques to strengthen security rather than adopting other complex methods. They present a method that applies a one-time password, implemented on a Raspberry Pi, at the access-control level. Although it is difficult for attackers to replay attacks on the password because a new password is assigned each time, the network response time poses another challenge. Furthermore, Zaki et al. [40] believe that text passwords can be enhanced using different pattern keys ranging from simple to complex. Lekshmi et al. [41] suggested a neural network approach as an alternative password method, especially when integrated with fuzzy rules. Bhola et al. [42] examine how an Android device can be used to improve password methods. Scaria and Megalingam [43] present a complex method that incorporates OTPs, biometrics, and noisy passwords.
Graphical passwords have been implemented successfully to overcome the challenges of text-based passwords, but they still require improvement. Bilgi and Tugrul [44] integrated images into a password method to provide access rights; their approach provides more benefits than ordinary text-based passwords, but it does not clearly state how shoulder-surfing attacks would be minimized. Fayyadh et al. [45] present a graphical method that lets users create shapes during registration and then requires them to draw those shapes when accessing the system; their approach is quite an improvement over Bilgi and Tugrul [44]. The approach of Zhang et al. [46] is difficult to implement and can conflict with usability. A minimal sketch of the grid-selection idea follows; Table 2 shows the evaluation of password biometric authentication approaches.
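For intuition, the grid-selection idea behind many graphical passwords can be sketched as follows: the secret is an ordered sequence of grid cells, stored and checked like a salted text-password hash. The cell encoding and parameters are assumptions for illustration, not the scheme of any cited paper.

```python
# Hypothetical grid-selection graphical password: hash the ordered cell path.
import hashlib
import hmac
import os

def encode(sequence):
    """Serialize an ordered list of (row, col) cells, e.g. on a 5x5 grid."""
    return ",".join(f"{r}:{c}" for r, c in sequence).encode()

def enroll(sequence):
    salt = os.urandom(16)
    return salt, hashlib.pbkdf2_hmac("sha256", encode(sequence), salt, 100_000)

def verify(sequence, salt, stored):
    attempt = hashlib.pbkdf2_hmac("sha256", encode(sequence), salt, 100_000)
    return hmac.compare_digest(attempt, stored)

salt, stored = enroll([(0, 3), (2, 1), (4, 4)])
print(verify([(0, 3), (2, 1), (4, 4)], salt, stored))  # True
```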
Table 1. Evaluation of fingerprint biometric authentication approaches

Authors | Problem | Techniques | Metrics/results | Benefits | Limitations
Wu and Chiu (2017) [29] | Poor fingerprint quality | Ridge features | Accuracy of 99.00% and 99.09% | Successfully classified ridge features | Not suitable for large datasets
Patel et al. (2017) [33] | Minutiae point extraction | Enhanced O'Gorman filter | MSE = 6.698; PSNR = 39.871 | Better results with O'Gorman compared to Gabor | Does not show overall fingerprint performance
Cao and Jain (2018) [36] | Fingerprint image database | Fingerprint synthesis | Time: 512 × 512 image in 12 minutes | Provides a better-quality image | Does not incorporate diversity criteria in the training process
Liban and Hilles (2018) [47] | Fingerprint image quality | Enhanced latent fingerprint | RMSE = 0.023199; PSNR = 81.07826 | Improved matching accuracy | Latent fingerprint images still overlapped
Nuraisha et al. (2018) [37] | Fake fingerprints | Normalization | Accuracy of 24% | Increased accuracy of detecting fake fingerprint images | Inefficient feature extraction
Szymkowski and Saeed (2018) [48] | Fingerprint recognition | Sectorization | Accuracy of 100% | New way to reach a satisfactory level of identification accuracy | Changes in fingerprint patterns may still be present
Han et al. (2018) [38] | Filtering-window-size noise | Adaptive median filter | PSNR = 44 | Feasible for fingerprint image enhancement | Impulse noise still present
Kim et al. (2019) [35] | Collecting fingerprints | Deep neural networks | Average detection error rate = 1.57% | Generates realistic fingerprints with certain characteristics | Time-consuming
Sudiro et al. (2017) [34] | Fingerprint extraction | Simple minutiae point extraction | FAR = 41.57%; FRR = 41.13%; EER = 41.35% | Minutiae extraction improvement | Still lacks accuracy due to the high FAR value
Tang et al. (2017) [30] | Fingerprint image quality | Hessian matrix and short-time Fourier transform (STFT) | Processing time = 0.799 s | Greatly increased contrast according to structural characteristics | Low contrast between ridges still needs improvement
Table 2. Evaluation of password biometric authentication approaches

Authors | Problem | Techniques | Benefits | Limitations
Taufiq and Ogi (2018) [39] | Password leakage attacks | Raspberry Pi | Improved one-time-password mutual authentication | Runs with RSA
Zaki et al. (2018) [40] | Password authentication | Combination of pattern, key, and dummy digits | Minimizes different password attacks and usability issues | Expensive to implement
Lekshmi et al. (2018) [41] | Password authentication | Hopfield neural network with fuzzy logic | Provides better accuracy and response time | Not as easy as graphical passwords
Bhola et al. (2017) [42] | Cybercrimes | Android device and one-way function | Improved dynamic password authentication | Cannot handle authentication for multiple websites
Bilgi and Tugrul (2018) [44] | Password authentication | Shoulder-surfing-resistant graphical passwords | Faster and easier authentication processes | Shoulder-surfing problem remains
Mehrube and Nguyen (2018) [49] | Password authentication | Real-time eye tracking | A smart camera can capture and store the PIN | Incorporating the PIN identification algorithm into real time
Othman et al. (2018) [50] | Password authentication | Graphical authentication with shoulder-surfing resistance | Demonstrates robustness, security strength, and functionality | Higher exposure with a larger number of direction authentications
Fayyadh et al. (2018) [45] | Password authentication | Graphical password (2D shapes) | Effective against brute-force, dictionary, and keylogger attacks | Difficult to remember the shapes when their number grows large
Sudramurthy et al. (2017) [51] | Password authentication | Honey password | Shows that the strength of the honeyword system depends on the AES algorithm | Limited to online purchases
5. Discussion
In Section 3, the biometric authentication methods were presented. Fingerprint authentication accuracy and PSNR were observed at different levels of performance. Reasonable accuracy results have been obtained, indicating how well some of these techniques address fingerprint challenges, while the PSNR results indicate that additional improvement is required. Moreover, the techniques used are mostly enhancements of existing fingerprint techniques, and their limitations were highlighted.

Additionally, the techniques appear to solve certain problems, especially poor-quality recognition. For instance, the ridge-feature technique successfully classified and improved poor-quality fingerprints. Despite its quality in recognizing fingerprints, the ridge-feature technique is not suitable for large datasets because of difficulties in counting the number of fingerprint ridges. The O'Gorman filter technique is also limited in showing fingerprint performance when compared to other minutiae-extraction methods such as Gabor. Fingerprint synthesis does not take diversity into account when addressing the fingerprint image database. The latent fingerprint and sectorization techniques improved accuracy, but images still overlapped, while normalization yields low accuracy; this is because fake fingerprints may be difficult to detect when complex detection mechanisms are not in place. A deep neural network is time-consuming in collecting fingerprints. The STFT technique provides efficient, timely results due to the increased contrast. However, impulse noise is still present with the adaptive median filter technique.

Among the password biometric authentication methods, graphical password techniques have issues with remembering a large number of shapes, which may lead to poor authentication, although graphical password techniques do provide an effective measure against hackers. Using a Hopfield neural network with fuzzy logic can possibly eliminate this problem, as it can provide better authentication accuracy than some of the graphical password techniques. Pattern keys and dummy digits are expensive to implement compared with the Raspberry Pi-based one-time password. Real-time eye tracking can be a good authentication technique compared with a smart camera that captures and stores the PIN, which is easily altered.
6. Conclusion and future work
Biometric authentication is identification and verification that considers human characteristics to improve system security. The aim is to identify and authenticate access to any component of a system. Many biometric authentication methods are currently available; this work considers only the two most widely used, the fingerprint and password methods. Various proposed fingerprint techniques show much improvement in achieving high image quality, but fingerprint image quality still requires improvement for reliable recognition. The password method comprises text and graphical passwords. A graphical password authenticates users based on a grid-selection algorithm, which can prevent not only shoulder-surfing attacks but also other related password attacks. We also highlighted features of various biometric authentication techniques and discussed some of the strengths and challenges of biometric authentication. In general, both the fingerprint and password methods have proved effective for biometric authentication. For future work, however, both simple and complex biometric authentication methods should be considered for a better understanding.
Acknowledgment
We wish to thank the Department of Management & Information Technology, ATBU Bauchi, the Faculty of Management Science, ATBU Bauchi, and the Management of Abubakar Tafawa Balewa University, Bauchi, for their support and encouragement.
Conflicts of interest
The authors have no conflicts of interest to declare.
References
[1] Bharathi S, Sudhakar R. Biometric recognition using
finger and palm vein images. Soft Computing. 2019;
23(6):1843-55.
[2] Padma P, Srinivasan S. A survey on biometric based
authentication in cloud computing. In international
conference on inventive computation technologies
2016 (pp. 1-5). IEEE.
[3] Prasad PS, Devi BS, Reddy MJ, Gunjan VK. A survey
of fingerprint recognition systems and their
applications. In international conference on
communications and cyber physical engineering 2018
(pp. 513-20). Springer, Singapore.
[4] Tekade P, Shende P. Enhancement of security through
fused multimodal biometric system. In international
conference on computing, communication, control and
automation 2017 (pp. 1-5). IEEE.
[5] Parkavi R, Babu KC, Kumar JA. Multimodal
biometrics for user authentication. In 11th
international conference on intelligent systems and
control 2017 (pp. 501-5). IEEE.
[6] Kakkad V, Patel M, Shah M. Biometric authentication
and image encryption for image security in cloud
framework. Multiscale and Multidisciplinary
Modeling, Experiments and Design. 2019; 2(4):233-
48.
[7] Nakanishi I, Maruoka T. Biometric authentication
using evoked potentials stimulated by personal
ultrasound. In international conference on
telecommunications and signal processing (TSP) 2019
(pp. 365-8). IEEE.
[8] Vittori P. Ultimate password: is voice the best
biometric to beat hackers? Biometric Technology
Today. 2019; 2019(9):8-10.
[9] Rahmawati E, Listyasari M, Aziz AS, Sukaridhoto S,
Damastuti FA, Bachtiar MM, et al. Digital signature
on file using biometric fingerprint with fingerprint
sensor on smartphone. In international electronics
symposium on engineering technology and
applications (IES-ETA) 2017 (pp. 234-8). IEEE.
[10] Kamelia L, Hamidi EA, Darmalaksana W, Nugraha A.
Real-time online attendance system based on
fingerprint and GPS in the smartphone. In
international conference on wireless and telematics
2018 (pp. 1-4). IEEE.
[11] Goicoechea-Telleria I, Garcia-Peral A, Husseis A,
Sanchez-Reillo R. Presentation attack detection
evaluation on mobile devices: simplest approach for
capturing and lifting a latent fingerprint. In
international carnahan conference on security
technology 2018 (pp. 1-5). IEEE.
[12] Hwang D, Lee H, Bae G, Son S, Kim J. Fingerprint
template management for higher accuracy in user
authentication. In international conference on
electronics, information, and communication 2018
(pp. 1-4). IEEE.
[13] You L, Wang T. A novel fuzzy vault scheme based on
fingerprint and finger vein feature fusion. Soft
Computing. 2019; 23(11):3843-51.
[14] Lin Y, Zhu X, Zheng Z, Dou Z, Zhou R. The
individual identification method of wireless device
based on dimensionality reduction and machine
learning. The Journal of Supercomputing. 2019;
75(6):3010-27.
[15] Ma L, Jin N, Zhang Y, Xu Y. RSRP difference
elimination and motion state classification for
fingerprint-based cellular network positioning system.
…
California Management Review, 2019, Vol. 61(4), 5-14
© The Regents of the University of California 2019
DOI: 10.1177/0008125619864925
journals.sagepub.com/home/cmr
Special Issue on AI
A Brief History of Artificial Intelligence: On the Past, Present, and Future of Artificial Intelligence
Michael Haenlein1 and Andreas Kaplan2
SUMMARY
This introduction to this special issue discusses artificial intelligence (AI), commonly
defined as “a system’s ability to interpret external data correctly, to learn from such
data, and to use those learnings to achieve specific goals and tasks through flexible
adaptation.” It summarizes seven articles published in this special issue that present a
wide variety of perspectives on AI, authored by several of the world’s leading experts
and specialists in AI. It concludes by offering a comprehensive outlook on the future
of AI, drawing on micro-, meso-, and macro-perspectives.
KEYWORDS: artificial intelligence, big data, regulation, strategy, machine-based learning
The world we are living in today feels, in many ways, like a
Wonderland similar to the one that the British mathematician
Charles Lutwidge Dodgson, better known under the name Lewis
Carroll, described in his famous novels. Image recognition, smart
speakers, and self-driving cars—all of this is possible due to advances in artificial
intelligence (AI), defined as “a system’s ability to interpret external data correctly,
to learn from such data, and to use those learnings to achieve specific goals and
tasks through flexible adaptation.”1 Established as an academic discipline in the
1950s, AI remained an area of relative scientific obscurity and limited practical
interest for over half a century. Today, due to the rise of Big Data and improve-
ments in computing power, it has entered the business environment and public
conversation.
1ESCP Europe Business School, Paris, France
2ESCP Europe Business School, Berlin, Germany
AI can be classified into analytical, human-inspired, and humanized AI
depending on the types of intelligence it exhibits (cognitive, emotional, and social
intelligence) or into Artificial Narrow, General, and Super Intelligence by its evo-
lutionary stage.2 What all of these types have in common, however, is that when
AI reaches mainstream usage it is frequently no longer considered as such. This
phenomenon is described as the AI effect, which occurs when onlookers discount
the behavior of an AI program by arguing that it is not real intelligence. As the
British science fiction writer Arthur Clarke once said, “Any sufficiently advanced
technology is indistinguishable from magic.” Yet when one understands the tech-
nology, the magic disappears.
In regular intervals since the 1950s, experts have predicted that it would only take a few years until we reach Artificial General Intelligence—systems that show
behavior indistinguishable from humans in all aspects and that have cognitive,
emotional, and social intelligence. Only time will tell whether this will indeed be
the case. But to get a better grasp of what is feasible, one can look at AI from two
angles—the road already traveled and what still lies ahead of us. In this editorial,
we aim to do just that. We start by looking into the past of AI to see how far this
area has evolved using the analogy of the four seasons (spring, summer, fall, and
winter), then into the present to understand which challenges firms face today,
and finally into the future to help everyone prepare for the challenges ahead of us.
The Past: Four Seasons of AI
AI Spring: The Birth of AI
Although it is difficult to pinpoint, the roots of AI can probably be traced
back to the 1940s, specifically 1942, when the American Science Fiction writer
Isaac Asimov published his short story Runaround. The plot of Runaround—a
story about a robot developed by the engineers Gregory Powell and Mike Donovan—revolves around the Three Laws of Robotics: (1) a robot may not injure
a human being or, through inaction, allow a human being to come to harm; (2)
a robot must obey the orders given to it by human beings except where such
orders would conflict with the First Law; and (3) a robot must protect its own
existence as long as such protection does not conflict with the First or Second
Laws. Asimov’s work inspired generations of scientists in the field of robotics, AI,
and computer science—among others the American cognitive scientist Marvin
Minsky (who later co-founded the MIT AI laboratory).
At roughly the same time, but over 3,000 miles away, the English math-
ematician Alan Turing worked on much less fictional issues and developed a code
breaking machine called The Bombe for the British government, with the purpose
of deciphering the Enigma code used by the German army in the Second World
War. The Bombe, which was about 7 by 6 by 2 feet large and had a weight of about
a ton, is generally considered the first working electro-mechanical computer. The
powerful way in which The Bombe was able to break the Enigma code, a task pre-
viously impossible to even the best human mathematicians, made Turing wonder
about the intelligence of such machines. In 1950, he published his seminal article
“Computing Machinery and Intelligence”3 where he described how to create
intelligent machines and in particular how to test their intelligence. This Turing
Test is still considered today as a benchmark to identify intelligence of an artificial
system: if a human is interacting with another human and a machine and unable
to distinguish the machine from the human, then the machine is said to be
intelligent.
The term Artificial Intelligence was then officially coined about six years
later, when in 1956 Marvin Minsky and John McCarthy (a computer scientist at
Stanford) hosted the approximately eight-week-long Dartmouth Summer Research
Project on Artificial Intelligence (DSRPAI) at Dartmouth College in New Hampshire.
This workshop—which marks the beginning of the AI Spring and was funded by
the Rockefeller Foundation—reunited those who would later be considered as the
founding fathers of AI. Participants included the computer scientist Nathaniel
Rochester, who later designed the IBM 701, the first commercial scientific com-
puter, and mathematician Claude Shannon, who founded information theory.
The objective of DSRPAI was to reunite researchers from various fields in order to
create a new research area aimed at building machines able to simulate human
intelligence.
AI Summer and Winter: The Ups and Downs of AI
The Dartmouth Conference was followed by a period of nearly two
decades that saw significant success in the field of AI. An early example is the
famous ELIZA computer program, created between 1964 and 1966 by Joseph
Weizenbaum at MIT. ELIZA was a natural language processing tool able to sim-
ulate a conversation with a human and one of the first programs capable of
attempting to pass the aforementioned Turing Test.4 Another success story of the
early days of AI was the General Problem Solver program—developed by Nobel
Prize winner Herbert Simon and RAND Corporation scientists Cliff Shaw and
Allen Newell—that was able to automatically solve certain kinds of simple problems, such as the Towers of Hanoi.5 As a result of these inspiring success stories,
substantial funding was given to AI research, leading to more and more projects.
In 1970, Marvin Minsky gave an interview to Life Magazine in which he stated
that a machine with the general intelligence of an average human being could be
developed within three to eight years.
Yet, unfortunately, this was not the case. Only three years later, in 1973,
the U.S. Congress started to strongly criticize the high spending on AI research. In
the same year, the British mathematician James Lighthill published a report com-
missioned by the British Science Research Council in which he questioned the
optimistic outlook given by AI researchers. Lighthill stated that machines would
only ever reach the level of an “experienced amateur” in games such as chess and
that common-sense reasoning would always be beyond their abilities. In response,
the British government ended support for AI research in all except three universi-
ties (Edinburgh, Sussex, and Essex) and the U.S. government soon followed the
British example. This period started the AI Winter. And although the Japanese government began to heavily fund AI research in the 1980s, to which the U.S. DARPA responded with a funding increase as well, no further advances were made in the following years.
AI Fall: The Harvest
One reason for the initial lack of progress in the field of AI and the fact
that reality fell back sharply relative to expectations lies in the specific way in
which early systems such as ELIZA and the General Problem Solver tried to rep-
licate human intelligence. Specifically, they were all Expert Systems, that is, col-
lections of rules which assume that human intelligence can be formalized and
reconstructed in a top-down approach as a series of “if-then” statements.6 Expert
Systems can perform impressively well in areas that lend themselves to such for-
malization. For example, IBM’s Deep Blue chess playing program, which in 1997
was able to beat the world champion Garry Kasparov—and in the process proved one of the statements made by James Lighthill nearly 25 years earlier wrong—is such
an Expert System. Deep Blue was reportedly able to process 200 million possible
moves per second and to determine the optimal next move looking 20 moves
ahead through the use of a method called tree search.7
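As a rough illustration of the tree-search idea, the sketch below plays a toy countdown game (players alternately remove one or two tokens; taking the last token wins) with plain depth-limited minimax. The game and all parameters are invented for illustration; Deep Blue combined far deeper search with specialized hardware and hand-tuned evaluation functions.

def minimax(n, depth, maximizing):
    # n tokens remain; players alternately remove 1 or 2; taking the last wins.
    if n == 0:
        # The player who just moved took the last token and won.
        return -1 if maximizing else 1
    if depth == 0:
        return 0  # search horizon reached; treat the position as neutral
    scores = [minimax(n - m, depth - 1, not maximizing) for m in (1, 2) if m <= n]
    return max(scores) if maximizing else min(scores)

# Choose the move whose subtree scores best for the side to move.
best = max((1, 2), key=lambda m: minimax(4 - m, 8, False))
print(best)  # 1: leaving a multiple of 3 tokens puts the opponent in a lost position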
However, Expert Systems perform poorly in areas that do not lend them-
selves to such formalization. For example, an Expert System cannot be easily
trained to recognize faces or even to distinguish between a picture showing a muf-
fin and one showing a Chihuahua.8 For such tasks it is necessary that a system is
able to interpret external data correctly, to learn from such data, and to use those
learnings to achieve specific goals and tasks through flexible adaptation—charac-
teristics that define AI.9 Since Expert Systems do not possess these characteristics,
they are technically speaking not true AI. Statistical methods for achieving true AI were discussed as early as the 1940s when the Canadian psychologist Donald
Hebb developed a theory of learning known as Hebbian Learning that replicates
the process of neurons in the human brain.10 This led to the creation of research on
Artificial Neural Networks. Yet, this work stagnated in 1969 when Marvin Minsky
and Seymour Papert showed that computers did not have sufficient processing
power to handle the work required by such artificial neural networks.11
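A minimal numerical sketch of Hebb's rule, under which the connection between two units strengthens in proportion to the product of their activations; the learning rate and toy activation pattern below are assumptions for illustration only.

import numpy as np

# Hebbian learning sketch: delta_w = eta * x * y, i.e., correlated activity
# between pre- and post-synaptic units strengthens their connection.
eta = 0.1                      # assumed learning rate
x = np.array([1.0, 0.0, 1.0])  # pre-synaptic activations (toy input pattern)
w = np.zeros(3)                # initial connection weights

for _ in range(10):
    y = 1.0                    # assume the post-synaptic unit fires with this pattern
    w += eta * x * y           # Hebb's rule
print(w)  # ~[1. 0. 1.]: weights grew only where the input was active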
Artificial neural networks made a comeback in the form of Deep Learning when in 2016 AlphaGo, a program developed by Google's DeepMind, was able to beat the world champion Lee Sedol in the board game Go. Go is substantially more complex than
chess (e.g., at opening there are 20 possible moves in chess but 361 in Go) and it
was long believed that computers would never be able to beat humans in this
game. AlphaGo achieved its high performance by using a specific type of artificial
neural network called Deep Learning.12 Today artificial neural networks and Deep
Learning form the basis of most applications we know under the label of AI. They
are the basis of image recognition algorithms used by Facebook, speech recognition algorithms that fuel smart speakers, and self-driving cars. This harvest of the fruits
of past statistical advances is the period of AI Fall, which we find ourselves in today.
The Present: California Management Review Special Issue on AI
The discussion above makes it clear that AI will become as much part of
everyday life as the Internet or social media did in the past. In doing so, AI will
not only impact our personal lives but also fundamentally transform how firms
take decisions and interact with their external stakeholders (e.g., employees, cus-
tomers). The question is less whether AI will play a role in these elements and more which role it will play and, more importantly, how AI systems and humans
can (peacefully) coexist next to each other. Which decisions should rather be
taken by AI, which ones by humans, and which ones in collaboration will be
an issue all companies need to deal with in today’s world and our articles in this
special issue provide insights into this from three different angles.
First, these articles look into the relationship between firms and employees
or generally the impact of AI on the job market. In their article “Artificial
Intelligence in Human Resources Management: Challenges and a Path Forward”
Tambe, Cappelli, and Yakubovich analyze how AI changes the HR function in
firms. Human resource management is characterized by a high level of complexity
(e.g., measurement of employee performance) and relatively rare events (e.g.,
occurrence of recruiting and dismissals), which have serious consequences for
both employees and the firm. These characteristics create challenges in the data-
generation stage, the machine-learning stage, and the decision-making stage of AI
solutions. The authors analyze those challenges, provide recommendations on
when AI or humans should take the lead, and discuss how employees can be
expected to react to different strategies.
Another article that addresses this issue is “The Feeling Economy: Managing
in the Next Generation of AI” by Huang, Rust, and Maksimovic. This article takes
a broader view and analyzes the relative importance of mechanical tasks (e.g.,
repairing and maintaining equipment), thinking tasks (e.g., processing, analyzing,
and interpreting information), and feeling tasks (e.g., communicating with peo-
ple) for different job categories. Through empirical analysis, these authors show
that in the future, human employees will be increasingly occupied with feeling
tasks since thinking tasks will be taken over by AI systems in a manner similar to how mechanical tasks have been taken over by machines and robots.
Second, the articles in this special issue analyze how AI changes the inter-
nal functioning of firms, specifically group dynamics and organizational decision
making. In “Organizational Decision-Making Structures in the Age of AI,”
Shrestha, Ben-Menahem, and von Krogh develop a framework to explain under
which conditions organizational decision making should be fully delegated to AI,
hybrid (either AI as an input to human decision making or human decisions as an
input to AI systems) or aggregated (in the sense that humans and AI take deci-
sions in parallel with the optimal decision being determined by some form of vot-
ing). The question of which option should be preferred depends on the specificity
of the decision-making space, the size of the alternative set, and decision-making
speed as well as the need for interpretability and replicability.
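As one hedged reading of the aggregated option, the sketch below combines parallel human and AI decisions by simple majority vote; the voter names, decision labels, and plain-majority rule are assumptions for illustration, not the authors' framework.

from collections import Counter

# Aggregated decision making sketch: humans and AI systems decide in parallel
# and a majority vote determines the outcome.
def aggregate_decisions(votes):
    # votes: list of (voter, decision) pairs; returns winner and its vote share.
    tally = Counter(decision for _, decision in votes)
    decision, count = tally.most_common(1)[0]
    return decision, count / len(votes)

votes = [("analyst_1", "approve"), ("analyst_2", "reject"),
         ("model_a", "approve"), ("model_b", "approve")]
print(aggregate_decisions(votes))  # ('approve', 0.75)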
In a similar spirit, Metcalf, Askay, and Rosenberg present artificial swarm
intelligence as a tool to allow humans to make better decisions in “Keeping
Humans in the Loop: Pooling Knowledge through Artificial Swarm Intelligence to
Improve Business Decision Making.” By taking inspiration from decision making
in the animal world (e.g., among flocks of birds or ant colonies), these authors propose a framework to combine explicit and tacit knowledge that suffers less from biases such as herding behavior or the limitations of alternative techniques
such as surveys, crowdsourcing, or prediction markets. They show the applicabil-
ity of their method to sales forecasting and the definition of strategic priorities.
In their article “Demystifying AI: What Digital Transformation Leaders Can
Teach You,” Brock and Wangenheim take a broader perspective and investigate to
what extent firms are already using AI in their business and how leaders in AI are
different from companies lagging behind. Based on a large-scale survey, they
identify guidelines of successful AI applications that include a need for data, the
requirement to have skilled staff and in-house knowledge, the focus on improv-
ing existing business offerings using AI, the importance of having AI embedded in
the organization (while, at the same time, engaging with technology partners),
and the importance of being agile and having top-management commitment.
Finally, the articles in this special issue look into the interaction between a
firm and its customers and specifically the role of AI in marketing. In “Understanding
the Role of Artificial Intelligence in Personalized Engagement Marketing,” Kumar,
Rajan, Venkatesan, and Lecinski propose how AI can help in the automatic
machine-driven selection of products, prices, website content, and advertising
messages that fit with an individual customer’s preferences. They discuss in detail
how the associated curation of information through personalization changes
branding and customer relationship management strategies for firms in both
developed and developing economies.
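One common way such machine-driven selection is implemented (a sketch under assumed names, not necessarily the authors' method) is to score each candidate item for a customer and serve the highest-scoring one, for example via dot products of learned preference and item vectors:

import numpy as np

# Personalization sketch: rank content variants for one customer by the dot
# product of a learned preference vector and per-item feature vectors.
# All vectors and item names here are random stand-ins for illustration.
rng = np.random.default_rng(0)
items = ["banner_a", "banner_b", "banner_c", "banner_d"]
item_vecs = rng.normal(size=(4, 8))  # per-item embeddings
user_vec = rng.normal(size=8)        # one customer's preference embedding

scores = item_vecs @ user_vec        # higher score = better predicted fit
best = items[int(np.argmax(scores))]
print(best)                          # the variant to show this customer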
In a similar spirit, Overgoor, Chica, Rand, and Weishampel provide a six-
step framework on how AI can support marketing decision making in “Letting the
Computers Take Over: Using AI to Solve Marketing Problems.” This framework—
which is based on obtaining business and data understanding, data preparation
and modeling, as well as evaluation and deployment of solutions—is applied in
three case studies to problems many firms face in today’s world: how to design
influencer strategies in the context of word-of-mouth programs,13 how to select
images for digital marketing, and how to prioritize customer service in social media.
The Future: Need for Regulation
Micro-Perspective: Regulation with Respect to Algorithms and
Organizations
The fact that in the near future AI systems will increasingly be part of our
day-to-day lives raises the question of whether regulation is needed and, if so,
in which form. Although AI is in its essence objective and without prejudice, this does not mean that systems based on AI cannot be biased. In fact, due to its very
nature, any bias present in the input data used to train an AI system persists and
may even be amplified. Research has, for example, shown that the sensors used
in self-driving cars are better in detecting lighter skin tones than darker ones14
(due to the type of pictures used to train such algorithms) or that decision-sup-
port systems used by judges may be racially biased15 (since they are based on the
analysis of past rulings).
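The persistence of input bias is easy to reproduce in miniature. In the synthetic sketch below, a single-threshold classifier is tuned for overall accuracy on data dominated by one group, and the under-represented, slightly shifted group ends up with systematically worse accuracy. The data and model are deliberate oversimplifications invented for illustration.

import numpy as np

# Synthetic demonstration: training-data imbalance yields unequal error rates.
rng = np.random.default_rng(1)

def sample(n, shift):
    # One feature; positives sit above negatives, shifted per group.
    neg = rng.normal(0.0 + shift, 1.0, n)
    pos = rng.normal(2.0 + shift, 1.0, n)
    return np.concatenate([neg, pos]), np.array([0] * n + [1] * n)

Xa, ya = sample(950, 0.0)  # group A dominates the training set
Xb, yb = sample(50, 1.0)   # group B is scarce and distributed differently
X, y = np.concatenate([Xa, Xb]), np.concatenate([ya, yb])

# Fit: pick the single threshold that maximizes overall training accuracy.
thresholds = np.linspace(X.min(), X.max(), 200)
accs = [np.mean((X > th).astype(int) == y) for th in thresholds]
t = thresholds[int(np.argmax(accs))]

for name, Xg, yg in [("group A", Xa, ya), ("group B", Xb, yb)]:
    print(name, "accuracy:", round(float(np.mean((Xg > t).astype(int) == yg)), 3))
# The threshold is tuned to the majority group, so group B's accuracy lags.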
Instead of trying to regulate AI itself, the best way to avoid such errors is
probably to develop commonly accepted requirements regarding the training and
testing of AI algorithms, possibly in combination with some form of warranty,
similar to consumer and safety testing protocols used for physical products. This
would allow for stable regulation even if the technical aspects of AI systems evolve
over time. A related issue is that of firms' accountability for mistakes of their algorithms, or even the need for a moral codex for AI engineers, similar to the one lawyers or doctors swear to. What such rules cannot avoid, however, is the
deliberate hacking of AI systems, the unwanted use of such systems for micro-
targeting based on personality traits,16 or the generation of fake news.17
What makes matters even more complicated is that Deep Learning, a key
technique used by most AI systems, is inherently a black box. While it is straight-
forward to assess the quality of the output generated by such systems (e.g., the
share of correctly classified pictures), the process used for doing so remains largely
opaque. Such opacity can be intentional (e.g., if a corporation wants to keep an
algorithm secret), due to technical illiteracy or related to the scale of application
(e.g., in cases where a multitude of programmers and methods are involved).18
While this may be acceptable in some cases, it may be less so in others. For exam-
ple, few people may care how Facebook identifies who to tag in a given picture.
But when AI systems are used to make diagnostic suggestions for skin cancer
based on automatic picture analysis,19 understanding how such recommendations
have been derived becomes critical.
Meso-Perspective: Regulation with Respect to Employment
In a similar manner as the automation of manufacturing processes has
resulted in the loss of blue-collar jobs, the rising use of AI will result in less need for white-collar employees and even highly qualified professionals. As mentioned previously, image recognition tools are already outperforming physicians
in the detection of skin cancer and in the legal profession e-discovery technolo-
gies have reduced the need for large teams of lawyers and paralegals to exam-
ine millions of documents.20 Granted, significant shifts in job markets have been
observed in the past (e.g., in the context of the Industrial Revolution from 1820-
1840), but it is not obvious whether new jobs will necessarily be created in other
areas in order to accommodate those employees. This is related to both the number of possible new jobs (which may be much smaller than the number of jobs lost) and the skill level required.
Interestingly, in a similar way as fiction can be seen as the starting point of AI (remember the Runaround short story by Isaac Asimov), it can also be used to get a glimpse into what a world with more unemployment could look like. The fiction novel Snow Crash, published by the American writer Neal Stephenson, describes a world in which people spend their physical life in storage units, surrounded by technical equipment, while their actual life takes place in a three-dimensional world called the Metaverse, where they appear in the form of three-dimensional avatars. As imaginary as this scenario sounds, recent advancements in virtual reality and image processing, combined with the past success of virtual worlds21 (and the fact that higher unemployment leads to less disposable income, making alternative forms of entertainment less accessible), make this scenario far from utopian.
Regulation might again be a way to avoid such an evolution. For example,
firms could be required to spend a certain percentage of the money saved through automation on training employees for new jobs that cannot be automated. States
may also decide to limit the use of automation. In France, self-service systems
used by public administration bodies can only be accessed during regular working
hours. Or firms might restrict the number of hours worked per day to distribute
the remaining work more evenly across the workforce. All of these may be easier
to implement, at least in the short term, than the idea of a Universal Basic Income
that is usually proposed as a solution in this case.
Macro-Perspective: Regulation with Respect to Democracy and Peace
All this need for regulation necessarily leads to the question “Quis custodiet
ipsos custodes?” or “Who will guard the guards themselves?” AI can be used not
only by firms or private individuals but also by states themselves. China is cur-
rently working on a social credit system that combines surveillance, Big Data,
and AI to “allow the trustworthy to roam everywhere under heaven while mak-
ing it hard for the discredited to take a single step.”22 In an opposite move, San
Francisco recently decided to ban facial recognition technology23 and researchers
are working on solutions that act like a virtual invisibility cloak and make people
undetectable to automatic surveillance cameras.24
While China and, to a certain extent, the United States try to limit the bar-
riers for firms to use and explore AI, the European Union has taken the opposite
direction with the introduction of the General Data Protection Regulation (GDPR)
that significantly limits the way in which personal information can be stored and
processed. This will in all likelihood slow down the development of AI in the EU compared with other regions, which in turn raises the question of how to balance economic growth and personal privacy concerns. In
the end, international coordination in regulation will be needed, similar to what
has been done regarding issues such as money laundering or weapons trade. The
nature of AI makes it unlikely that a localized solution that only affects some
countries but not others will be effective in the long run.
Through the Looking Glass
Nobody knows whether AI will allow us to enhance our own intelligence,
as Raymond Kurzweil from Google thinks, or whether it will eventually lead us
into World War III, a concern raised by Elon Musk. However, everyone agrees
that it will result in unique ethical, legal, and philosophical challenges that will
need to be addressed.25 For decades, ethics has dealt with the Trolley Problem,
a thought experiment in which an imaginary person needs to choose between
inactivity which leads to the death of many and activity which leads to the death
of few.26 In a world of self-driving cars, these issues will become actual choices
that machines and, by extension, their human programmers will need to make.27
In response, calls for regulation have been numerous, including by major actors
such as Mark Zuckerberg.28
But how do we regulate a technology that is constantly evolving by itself—
and one that few experts, let alone politicians, fully understand? How do we
overcome the challenge of being sufficiently broad to allow for future evolutions
in this fast-moving world and sufficiently precise to avoid everything being con-
sidered as AI? One solution can be to follow the approach of U.S. Supreme Court
Justice Potter Stewart who in 1964 defined obscenity by saying: “I know it when
I see it.” This brings us back to the AI effect mentioned earlier, that we now
quickly tend to accept as normal was used to be seen as extraordinary. There are
today dozens of different apps that allow a user to play chess against her phone.
Playing chess against a machine—and losing with near certainty—has become a
thing not even worth mentioning. Presumably, Garry Kasparov had an entirely
different view on this matter in 1997, just a bit over 20 years ago.
Author Biographies
Michael Haenlein is the Big Data Research Center Chaired Professor and
Associate Dean of the Executive PhD Program at the ESCP Europe Business
School (email: [email protected]).
Andreas Kaplan, Professor and Dean at ESCP Europe Business School Berlin,
counts among the Top 50 Business and Management authors worldwide (email:
[email protected]).
Notes
1. Andreas M. Kaplan and Michael Haenlein, “Siri, Siri, in My Hand: Who’s the Fairest in
the Land? On the Interpretations, Illustrations, and Implications of Artificial Intelligence,”
Business Horizons, 62/1 (January/February 2019): 15-25.
2. Ibid.
3. Alan Turing, “Computing Machinery and Intelligence,” Mind, LIX/236 (1950): 433-460.
4. For those eager to try ELIZA, see: https://www.masswerk.at/elizabot/.
5. The Towers of Hanoi is a mathematical game that consists of three rods and a number of
disks of different sizes. The game starts with the disks in one stack in ascending order and
consists of moving the entire stack from one rod to another, so that at the end the ascending
order is kept intact.
6. For more details, see Kaplan and Haenlein, op. cit.
7. Murray Campbell, A. Joseph Hoane Jr., and Feng-Hsiung Hsu, “Deep Blue,” Artificial
Intelligence, 134/1-2 (January 2002): 57-83.
8. Matthew Hutson, “How Researchers Are Teaching AI to Learn Like a Child,” Science, May 24,
2018, https://www.sciencemag.org/news/2018/05/how-researchers-are-teaching-ai-learn-child.
9. Kaplan and Haenlein, op. cit.
10. Donald Olding Hebb, The Organization of Behavior: A Neuropsychological Theory (New York, NY:
John Wiley, 1949).
11. Marvin Minsky and Seymour A. Papert, Perceptrons: An Introduction to Computational Geometry
(Cambridge, MA: MIT Press, 1969).
12. David Silver, Aja Huang, Chris J. Maddison, Arthur Guez, Laurent Sifre, …
IT Service Providers and Cybersecurity Risk
There is growing evidence that information technology outsourcing
(ITO) is a major contributor to cybersecurity risk exposure. Reports of cybersecurity
incidents linked to IT providers arrive regularly. Most often ITO clients are the ones
suffering the major consequences.
Examples from government and corporate sectors abound. In 2013, QinetiQ, a defense
contractor of software used by US Special
Forces, was subject to an ongoing cybersecurity
breach that compromised much classified
research. In 2011, RSA, a cybersecurity
subcontractor of Lockheed Martin and the
Department of Defense, was breached, and the breach subsequently contributed to a cyberattack on Lockheed Martin. Even
worse is the incident with Edward Snowden, an employee
of Booz Allen Hamilton, a US National Security Agency
contractor, who has been charged with deliberately leaking
massive amounts of classified information. More recently, in
July 2019 there was Capital One's data breach allegedly due
to a former Amazon Cloud Services employee who hacked
over 100 million customers’ data hosted on Amazon’s cloud,
and in May 2019 Salesforce had a multi-hour cloud meltdown
due to a database blunder that granted users access to all
data. Similar examples involving government contractors
abound. There are also broader studies suggesting that
almost one third of cyber incidents in financial services and healthcare originate with ITO and other third-party service providers.1

While ITO continues to be popular because it improves enterprise agility and cost effectiveness, the associated cybersecurity risks have been growing and taking on urgent priority. Two trends contribute to the growing concern. One is the rising reliance on cloud-computing service providers (CSP). In cloud computing, clients' risk exposure grows as they move sensitive data to federated cloud environments that may be hosted with multiple providers and sub-providers belonging to different legal entities in various jurisdictions. Such a layered structure invariably poses greater risk. The other trend is reliance on managed security service providers, also called cybersecurity as a service. According to a 2019 Ernst & Young study, more companies are outsourcing than insourcing their cybersecurity needs. This is common among corporations as well as US government agencies seeking cost savings and access to staff with highly specialized skills. For example, in 2015, US Cyber Command outsourced $475 million worth of work covering nearly 20 cyber task areas.
Cybersecurity risk considerations remain paramount in all
forms of ITO. ITO clients may implicitly or explicitly expect
ITO and managed security service providers to assume
some responsibility for cyber risk. In reality, organizations
cannot outsource their cybersecurity liability. Reputation-wise,
they are liable for the security of their data and systems, no
matter what. More importantly, laws simply do not allow
firms to outsource regulatory responsibility. This means that
ITO clients must still actively monitor, document, and manage
their cybersecurity risk exposure. Indeed, corporations
often disclose in annual statements to shareholders
cybersecurity concerns about ITO providers, sometimes
conceding that these concerns compromise their financial
reports’ reliability and lead to re-insourcing of IT services.
What options do ITO client organizations have? This
question requires understanding key cybersecurity challenges
in ITO. These challenges point to trust in ITO providers as a key success ingredient; however, multiple ways have been suggested for establishing such trust. Our review of different
perspectives on trust shows that trust anchored in
independent cybersecurity certification and market-based
reputation mechanisms is emerging as a dominant model.
Cybersecurity Challenges in ITO
Although IT insourcing is subject to cybersecurity risks,
many of the risks are exacerbated in the ITO context because
of the following challenges.
• Quantifying Cyber Risk Exposure. ITO clients lack data
on ITO providers’ vulnerability to cyber incidents, as well
as on the frequency and damage magnitude of each type of incident. Because cyber risk also stems from ITO providers' partners along the supply chain, the nature of risk is more diverse and evolves at a rapid pace. These factors limit any attempt to reliably quantify ITO clients' cybersecurity risk exposure.
• Liability Asymmetry. ITO
providers seek to disclaim
their liability to avoid paying
damages that are
disproportionate to the revenue
received, and customers are
concerned that ITO providers
may not have the same
incentives to protect client
data and systems.
• Opaque Supply Chains. ITO involves increasingly complex,
dynamic, and non-transparent supply chains. Cloud
computing, for example, is an ecosystem with many more
points of access and higher potential for cybersecurity
failure. A transparency assessment of 25 top cloud
computing providers, based on their published information,
concludes that most offer very limited visibility into their
operations and supply chains. Lack of visibility among IT
service providers and supply chain partners constrains
ITO customers’ ability to control cybersecurity risk.
• Growing Regulatory Demands. Cybersecurity regulations
imposing disclosure and compliance requirements on
firms have been growing at a rapid pace in the United
States, United Kingdom (UK), European Union, and
elsewhere.2 Ensuring regulatory compliance becomes
daunting for ITO providers. Data and services may be
moving across supply chain partners operating in different
regulatory environments. In particular, the rapid evolution of the regulatory environment adds to the frustration, and near impossibility, for ITO and cloud computing providers of satisfying all laws applicable to global customers in different jurisdictions.
• Strategic Imperative. Many enterprises no longer view cybersecurity as an operational concern but rather as a strategic imperative. This holds equally for government bodies. They store far more data than private sector organizations, and they are major cybercrime targets that could imperil national security and citizens' trust. This reality makes cybersecurity and data privacy among the most challenging issues in ITO contract negotiations.

1 Benaroch M. and Chernobai A., "Linking Operational IT Failures to IT Control Weaknesses," Proceedings of AMCIS 2015, Puerto Rico, 2015. Vasishta N.V., Gupta M., Misra S.K., Mulgund P., and Sharman R., "Optimizing Cybersecurity Program - Evidence from Data Breaches in Healthcare," 13th Annual Symposium on Information Assurance (ASIA'18), June 2018, Albany, NY.
2 Gozman D. and Willcocks L., "The Emerging Cloud Dilemma: Balancing Innovation with Cross-Border Privacy and Outsourcing Regulations," Journal of Business Research, forthcoming 2019.
Client-Provider Trust in Managing Cybersecurity Risk
Because most cybersecurity risks inherent in ITO are not
likely to be mitigated contractually, many ITO clients are
knowingly or unknowingly accepting cybersecurity risk. A
frequently made argument is that managing cybersecurity
risks in ITO involves client-provider trust. However, there
are multiple perspectives on how to achieve such trust.
Figure 1 labels these as the decision-theoretic, transparency-based, and market-based perspectives.
Figure 1: Three Perspectives on Client-Provider Trust in ITO

1. Decision-Theoretic Perspective
This view is about ITO clients developing trust in their own decision to outsource, including what to outsource and to whom. This trust is anchored in a decision-theoretic calculation of risk exposure based on data about (1) the firm's and ITO provider's cybersecurity vulnerabilities, sources of threats, and assets subject to those threats, (2) the distributions of frequency and damage-magnitude of cybersecurity events, and (3) contract terms and their pricing in the case of purchasing cybersecurity liability insurance.
However, again, limited availability of these data restricts the ability to quantify risk and manage it using decision-theoretic strategies, including the strategy of risk transfer using cyber liability insurance. Pricing such insurance policies is challenging even for insurers. Information on risk is incomplete and the risk is fast-changing.3 Moreover, cyber insurance policies typically impose restrictive liability exclusions and conditions, leaving clients with coverage limits and considerable risk exposure. Even for the largest financial institutions, coverage limits are usually under $300 million. These challenges are exacerbated in the ITO context, in part due to blurry delineation of where ITO providers' cybersecurity responsibility starts and ends.
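To make the decision-theoretic calculation concrete, here is a minimal Monte Carlo sketch of annual loss exposure built from a frequency distribution and a damage-magnitude distribution. Every distribution and parameter below is an invented placeholder; the article's point is precisely that the data needed to estimate them are scarce.

import numpy as np

# Monte Carlo sketch of cyber risk exposure: simulate many years, drawing an
# incident count and per-incident damages each year, then summarize losses.
rng = np.random.default_rng(42)
years = 50_000
freq_per_year = 1.2             # assumed mean incident frequency (Poisson)
dmg_mu, dmg_sigma = 11.0, 1.5   # assumed lognormal damages (median ~ $60K)

counts = rng.poisson(freq_per_year, years)
losses = np.array([rng.lognormal(dmg_mu, dmg_sigma, n).sum() for n in counts])

print("expected annual loss: $", round(float(losses.mean())))
print("95th percentile year: $", round(float(np.percentile(losses, 95))))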
2. Transparency-Based Perspective
If we cannot reliably calculate cybersecurity risk exposure, one alternative is to develop visibility into ITO providers' operations as a basis for trust. Transparency of the supply chain should allow ITO clients to verify that their trust in ITO providers is not misplaced. Some hold that supply chains involving multinational companies need to be inspected down to the second, third, and fourth tiers.
However, visibility into ITO providers' operations and supply chain partners remains a challenge.4 IT executives continue to cite supply chain visibility as a very high priority. Transparency-based trust could work only if every player in the supply chain has visibility into the IT security controls of their directly connected parties and is willing to audit those parties to validate the reliability of those controls. In reality, ITO providers can share little with clients because most are not fully aware of their supply chains beyond the first tier. Moreover, many ITO clients are simply not capable of executing security audits of their IT providers. Even if these requirements are met, there is the added cost of doing business with ITO providers. More importantly, ITO clients' liability for cybersecurity risk may grow as they know more about their ITO providers' operations and supply chains.

3 Kopp E., Kaffenberger L., and Wilson C., "Cyber Risk, Market Failures, and Financial Stability," IMF Working Paper (WP/17/185), International Monetary Fund, 2017.
4 Akinrolabu O. and New S., "Can Improved Transparency Reduce Supply Chain Risks in Cloud Computing?" Operations and Supply Chain Management, Vol. 10, No. 3, 2017, pp. 130-140.
3. Market-Based Perspective
This view of trust requires market mechanisms for
establishing the reputation of ITO providers. Reputation,
or the fear of its loss, constrains opportunistic behavior
and exemplifies how markets self-regulate. Sometimes
service providers hire a trusted third-party to evaluate
and certify their quality. Examples
are Dun & Bradstreet, which
provides dependable credit
information on businesses of
all sizes, and Underwriters
Laboratories, which provides
a seal of approval on products.
Evaluation standards are often
established by regulators,
especially when market-based
reputation mechanisms and
evaluation standards are slow
to develop.
For market-based trust to work,
the key is balanced and well-designed regulations.
Whereas lack of transparency increases demand for
regulations, serious information asymmetries between
regulator and firms render regulations ineffective. For
example, over 60 percent of public firms do not disclose
their cyber incidents despite a mandate from the Securities
and Exchange Commission to disclose incidents when
they materially damage the business. Another factor that
renders ex-ante regulation ineffective is failure to design
effective evaluation standards.
Of the three perspectives on trust, the market-based
perspective is emerging as the dominant alternative in
ITO service delivery.
Financial Reporting Regulations and Certification
The accounting field has extensively studied regulatory
evaluation standards and market-based reputational
mechanisms. A prime example is the 2002 Sarbanes-Oxley
(SOX) Act, which was enacted to boost investor trust in
public firms’ financial reporting after several high-profile
corporate scandals (e.g., Enron). SOX mandates firms to
audit and disclose deficiencies in internal controls over
financial reporting, where audits are certified by trusted
public accounting firms (e.g., Deloitte, KPMG, Ernst & Young).
Given SOX regulatory requirements, sponsoring
organizations, such as the American Institute of Certified
Public Accountants (AICPA), developed evaluation standards
comprising lists of controls to audit for SOX compliance.
Secondary market data observed after revelations of
(reported) information about internal controls reflect how
shareholders and security analysts react to the new
information. It provides insights into a host of issues,
including: penalties shareholders inflict to hold firms
accountable for internal control deficiencies (e.g., drop in
equity prices, rise in cost of capital, and higher audit fees),
what types of internal controls matter most, and what role
corporate board governance plays regarding internal
control effectiveness.
Similar insights can, and are
starting to, emerge in the
cybersecurity context,
particularly regarding ITO
providers’ security controls.
Two recent studies examine IT
security control deficiencies
associated with data breaches
in healthcare and cyber
incidents in financial services
firms. Two other studies
document a favorable stock
market reaction to ITO providers
announcing investments in certification of their IT security
controls.5
There is enough evidence that market-based trust works.
Generally, it holds firms accountable to their shareholders -
shareholders trust regulatory certifications, and firms work
hard to avoid problems with their certified internal controls
that would result in punitive market reactions. This should
work for cybersecurity risk in ITO. Most ITO client firms are
not capable or willing to evaluate the IT security controls of
their ITO providers and supply chain partners, and no ITO
provider wants to be audited repeatedly and by every client
separately. What could fill the gap is market-based trust and
independent certifications of ITO providers’ IT security
controls.
Cybersecurity Regulations and Standards
As we implied earlier, regulations are operationalized and
expanded into evaluation standards by various sponsoring
entities. Sample standards for cybersecurity include SOC1/2, ISO 27001, NIST 800-53, and country-specific standards such as the UK's G-Cloud and Singapore's MTCS. All such standards seek visibility into service providers' IT security and data
5 Benaroch M. and Chernobai A., "Linking Operational IT Failures to IT Control Weaknesses," Proceedings of AMCIS 2015, Puerto Rico, 2015. Vasishta N.V., Gupta M., Misra S.K., Mulgund P., and Sharman R., "Optimizing Cybersecurity Program - Evidence from Data Breaches in Healthcare," 13th Annual Symposium on Information Assurance (ASIA'18), June 2018, Albany, NY.
privacy controls for ensuring the confidentiality, integrity,
and availability of those providers’ systems and services.
ITO providers seeking participation in specific industry
environments, such as cloud computing, are increasingly
expected to adhere to specific cybersecurity standards. If
their IT platforms achieve certifications, the platforms are
judged to have capabilities to meet specific security
requirements. Certification attests to a commitment to robust
cybersecurity management.
SOC1/2, Service Organization Control, appears to be more
widely adopted by ITO service providers, primarily because
it extends SOX.6 SOC is geared toward certifying financial
reporting controls of publicly traded service providers.
SOC1/2 is sponsored by the AICPA with public accounting
firms acting as the audit certifying bodies. SOC1/2
certification yields two common reports. The SOC1 report
informs auditors and shareholders about controls over
financial reporting. The SOC2 report informs knowledgeable
users (e.g., clients, partners, regulators) about controls for
meeting information handling objectives (security, availability,
processing integrity, confidentiality, and privacy).
The Way Ahead
The promise of market-based trust has increasing empirical
support. One study demonstrates that firms announcing
completion of IS027001 certification witness an appreciation
of their stock price.7 The same way such certifications lead
to positive market reactions that create firm value, so do
cybersecurity incidents indicating failures of certified controls
lead to punitive market reactions that destroy firm value. It
is this dual market-based mechanism that should hold ITO
providers accountable for cybersecurity risk. Similar results
are reported for Cyber Essentials Plus, a certification program
the UK government and National Cyber Security Centre
mandate of firms bidding for government contracts involving
the processing of sensitive and personal information.8
This indicates the power of large players to impose standards
that markets recognize and respect. A third study documents
a stronger negative market reaction when cyber incidents
are linked to pervasive and difficult to remediate IT security
control deficiencies.9 Apparently, markets are also sensitive to how critical different IT security controls are.
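The market-reaction findings cited above typically rest on event-study logic. The sketch below computes a cumulative abnormal return (CAR) around an announcement using a simple market model; the alpha, beta, and daily returns are invented for illustration, not data from the cited studies.

import numpy as np

# Event-study sketch: abnormal return = actual firm return minus the return
# the market model (alpha + beta * market) predicts; the CAR sums these over
# the event window around the announcement.
alpha, beta = 0.0002, 1.1  # market-model parameters fit on a pre-event window
market = np.array([0.001, -0.004, 0.002, 0.003, -0.001])  # event-window returns
firm = np.array([0.002, -0.003, 0.012, 0.006, 0.001])     # day 2 = announcement

abnormal = firm - (alpha + beta * market)
print(f"CAR over the event window: {abnormal.sum():.4f}")  # positive = favorable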
There are also obstacles to market-based trust. Chief among
them is the optionality of cybersecurity certification. Another
is the myriad of cybersecurity regulations and standards
around the world that runs the whole gamut from very strong
to non-existent. SOC1/2 is a popular standard but it is overly
focused on financial reporting. More comprehensive
standards exist but one is yet to achieve dominance and
broad market acceptance. Perhaps over time large government and industry bodies may use their economic power to impose standards that markets will adopt.
In conclusion, market-based trust can ensure ITO service provider accountability for cybersecurity risk if clients demand that ITO providers obtain suitable cybersecurity certification, and if surfaced deficiencies in certified IT security controls have punitive market implications for ITO providers.
Mr. Michel Benaroch
Michel Benaroch is Professor of Information Systems at the Lubin School of Accountancy in the Whitman School of Management, Syracuse University. He is also Associate Dean for Research and Ph.D. Programs at the Whitman School. Professor Benaroch has published extensively on a range of topics, including economics of IT, management of IT investment risk, cybersecurity impacts on organizations, and artificial intelligence applications in finance. He teaches courses on business analytics and managerial decision-making. He earned a Ph.D. in business administration from New York University, and an MBA and B.Sc. in mathematics and computer science from the Hebrew University in Jerusalem.
6 Weiss M. and Solomon M.G., Auditing IT Infrastructures for Compliance, Jones & Bartlett Learning, LLC, an Ascend Learning Company, 2016.
7 See note 6.
8 Malliouris D.D. and Simpson A.C., "The Stock Market Impact of Information Security Investments: The Case of Security Standards," Workshop on the Economics of Information Security, June 2019, Boston, MA.
9 Benaroch M., "Properties of IT Control Deficiencies at the Root of Cyber Incidents: Theoretical and Empirical Examination," Proceedings of the 12th ILAIS Conference, June 2018.
Identify the type of research used in a chosen study
Compose a 1
Optics
effect relationship becomes more difficult—as the researcher cannot enact total control of another person even in an experimental environment. Social workers serve clients in highly complex real-world environments. Clients often implement recommended inte
I think knowing more about you will allow you to be able to choose the right resources
Be 4 pages in length
soft MB-920 dumps review and documentation and high-quality listing pdf MB-920 braindumps also recommended and approved by Microsoft experts. The practical test
g
One thing you will need to do in college is learn how to find and use references. References support your ideas. College-level work must be supported by research. You are expected to do that for this paper. You will research
Elaborate on any potential confounds or ethical concerns while participating in the psychological study 20.0\% Elaboration on any potential confounds or ethical concerns while participating in the psychological study is missing. Elaboration on any potenti
3 The first thing I would do in the family’s first session is develop a genogram of the family to get an idea of all the individuals who play a major role in Linda’s life. After establishing where each member is in relation to the family
A Health in All Policies approach
Note: The requirements outlined below correspond to the grading criteria in the scoring guide. At a minimum
Chen
Read Connecting Communities and Complexity: A Case Study in Creating the Conditions for Transformational Change
Read Reflections on Cultural Humility
Read A Basic Guide to ABCD Community Organizing
Use the bolded black section and sub-section titles below to organize your paper. For each section
Losinski forwarded the article on a priority basis to Mary Scott
Losinksi wanted details on use of the ED at CGH. He asked the administrative resident