TouchID: User Authentication on Mobile Devices via Inertial-Touch Gesture Analysis

XINCHEN ZHANG, State Key Laboratory for Novel Software Technology, Nanjing University, China
YAFENG YIN∗, State Key Laboratory for Novel Software Technology, Nanjing University, China
LEI XIE, State Key Laboratory for Novel Software Technology, Nanjing University, China
HAO ZHANG, State Key Laboratory for Novel Software Technology, Nanjing University, China
ZEFAN GE, State Key Laboratory for Novel Software Technology, Nanjing University, China
SANGLU LU, State Key Laboratory for Novel Software Technology, Nanjing University, China

Due to the widespread use of mobile devices, it is essential to authenticate users on mobile devices to prevent sensitive information leakage. In this paper, we propose TouchID, which jointly uses the touch sensor and the inertial sensor for gesture analysis, to provide a touch gesture based user authentication scheme. Specifically, TouchID utilizes the touch sensor to analyze the on-screen gesture while using the inertial sensor to analyze the device's motion caused by the touch gesture, and then combines the unique features of the on-screen gesture and the device's motion for user authentication. To mitigate the intra-class difference and reduce the inter-class similarity, we propose a spatial alignment method for the sensor data and segment the touch gesture into multiple sub-gestures in the space domain, to keep the stability of the same user and enhance the discriminability of different users. To provide a uniform representation of touch gestures with different topological structures, we present a four-part based feature selection method, which classifies a touch gesture into a start node, an end node, the turning node(s), and the smooth paths, and then selects effective features from these parts based on the Fisher Score. In addition, considering the uncertainty of the user's postures, which may change the sensor data of the same touch gesture, we propose a multi-threshold kNN based model to adaptively tolerate the posture difference for user authentication. Finally, we implement TouchID on commercial smartphones and conduct extensive experiments to evaluate it. The experiment results show that TouchID achieves good performance for user authentication, i.e., a low equal error rate of 4.90%.

CCS Concepts: • Human-centered computing → Ubiquitous and mobile computing.

Additional Key Words and Phrases: User authentication, Touch gesture, Mobile devices, Inertial-touch gesture analysis

ACM Reference Format:
Xinchen Zhang, Yafeng Yin, Lei Xie, Hao Zhang, Zefan Ge, and Sanglu Lu. 2020. TouchID: User Authentication on Mobile Devices via Inertial-Touch Gesture Analysis. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 37, 4, Article 162 (December 2020), 29 pages. https://doi.org/10.1145/1122445.1122456

∗Corresponding Author

Authors' addresses: Xinchen Zhang, State Key Laboratory for Novel Software Technology, Nanjing University, China, xczhang@smail.nju.edu.cn; Yafeng Yin, State Key Laboratory for Novel Software Technology, Nanjing University, China, yafeng@nju.edu.cn; Lei Xie, State Key Laboratory for Novel Software Technology, Nanjing University, China, lxie@nju.edu.cn; Hao Zhang, State Key Laboratory for Novel Software Technology, Nanjing University, China, h.zhang@smail.nju.edu.cn; Zefan Ge, State Key Laboratory for Novel Software Technology, Nanjing University, China, zefan@smail.nju.edu.cn; Sanglu Lu, State Key Laboratory for Novel Software Technology, Nanjing University, China, sanglu@nju.edu.cn.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

© 2020 Association for Computing Machinery.
2474-9567/2020/12-ART162 $15.00
https://doi.org/10.1145/1122445.1122456
1 INTRODUCTION

With the widespread use of mobile devices (e.g., smartphones, smartwatches, and tablets) in daily life, more and more sensitive information such as photos, emails, chat messages, and bank accounts is stored on mobile devices; thus it is essential to authenticate users to prevent sensitive information leakage [22]. Traditionally, knowledge based authentication schemes were used, which mainly required the user to input predefined PIN codes or patterns for authentication. Unfortunately, these schemes are vulnerable to various attacks such as the smudge attack [2], the shoulder surfing attack [49, 54], and the password inference attack [56, 57, 62]. To overcome these issues, physiological feature based authentication schemes were proposed, which mainly utilize unique fingerprints [40, 42] and face features [14, 21] for authentication. It is convenient to stretch out a finger or show the face for authentication, so these schemes are adopted in many devices. However, capturing fingerprints or face features often requires dedicated or expensive hardware, e.g., a fingerprint sensor or a high-resolution camera. Besides, physiological feature based authentication schemes can be vulnerable to replay and spoofing attacks [13, 26, 38, 41]. For example, a fake fingerprint generated by a 3D printer can achieve an average attack success rate of 80% [26], while images, videos, or 3D face masks can be used to spoof face authentication systems [13].

In fact, both the knowledge based and the physiological feature based authentication schemes utilize "what" the user knows or "what" the user has for user authentication, while ignoring "how" the user performs during the authentication process, which is the focus of this paper. To solve this problem, behavioral biometrics based authentication schemes were proposed, which focus on the differences in user behaviors when performing gestures. Examples include using the unique vibration characteristics of finger knuckles when they are tapped [8], performing a sequence of rhythmic taps/slides on a device screen [9], using the hand's geometry in taps generated by different fingers [51], and using the inter-stroke relationship between multiple fingers in touch gestures [43]. These methods often work with common sensors embedded in many off-the-shelf devices instead of the dedicated sensors used in physiological feature based authentication schemes. However, the existing work often designed customized gestures for user authentication, and such gestures are rarely adopted on commercial mobile devices.

Different from the existing work, this paper aims to provide an online user authentication scheme, TouchID, which is expected to resist common attacks like the smudge attack [2], the shoulder surfing attack [49, 54], the password inference attack [56, 57, 62], and spoofing attacks [26, 38, 41]. In TouchID, the user performs a widely adopted graphic pattern based touch gesture with one finger on a commercial mobile device for user authentication, as shown in Fig. 1. The unlock operations are the same as those of the existing graphic pattern based unlocking method and are easy to use. The graphic pattern is common and has been integrated into many COTS devices (e.g., Android-powered smartphones, which have an 87% global market share [28]) and applications (e.g., the payment software Alipay [4] and the image management software Safe Vault [58]). Compared with the existing work [43, 47, 51] that designs customized gestures, TouchID cannot rely on specific features like the displacements between different fingers or the relationship between a series of customized gestures; thus user authentication in TouchID is more challenging. As shown in Fig. 1, when a touch gesture is performed, TouchID leverages the touch sensor and the inertial sensor to capture the on-screen gesture and the device's motion, respectively. Then, TouchID utilizes the unique biometric features of the user's finger and the touch behavior, i.e., the geometry of the on-screen gesture and the micro movement of the device, to perform user authentication in an online manner. However, to achieve the above goal, it is necessary to mitigate the intra-class difference and reduce the inter-class similarity of gestures, i.e., enhancing the stability of gestures from the same user while enhancing the discriminability of gestures from different users. Specifically, we need to solve the following three challenges to provide the user authentication scheme TouchID.
Fig. 1. A typical application scenario of TouchID.

The first challenge is how to mitigate the intra-class difference and reduce the inter-class similarity among gestures? To address this challenge, we first conduct extensive experimental studies to observe how the finger moves in a touch gesture and how the device moves in response to the gesture. We find that time differences among touch gestures can affect the intra-class difference and the inter-class similarity. Therefore, we propose a spatial alignment method to align the sensor data in the space domain, and then segment the touch gesture into multiple sub-gestures to highlight the sub-gestures that contribute more to the stability of the same user and the discriminability of different users.
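To make the idea of space-domain alignment concrete, consider resampling a trajectory at equal arc-length steps instead of equal time steps. The following is a minimal sketch under our own assumptions (the function name, the fixed number of resampled points, and the use of linear interpolation are illustrative, not the paper's implementation):

```python
import numpy as np

def resample_by_arc_length(x, y, n_points=64):
    """Resample a time-ordered touch trajectory at equal arc-length steps,
    so the same pattern drawn at different speeds maps to the same
    spatial sequence."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    seg = np.hypot(np.diff(x), np.diff(y))       # per-segment lengths
    s = np.concatenate(([0.0], np.cumsum(seg)))  # cumulative path length
    s_new = np.linspace(0.0, s[-1], n_points)    # equal spatial steps
    return np.interp(s_new, s, x), np.interp(s_new, s, y)
```

After such resampling, two instances of the same pattern drawn at different speeds yield point sequences that can be compared index by index, which is the effect that space-domain alignment aims for.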
The second challenge is how to represent touch gestures with different topological structures in a uniform way? The touch gestures corresponding to different graphic patterns have different numbers of sub-gestures, and the sub-gestures themselves can differ, which may lead to different representations of a touch gesture. To address this challenge, we propose a four-part feature selection method, which classifies a touch gesture into four parts, i.e., a start node, an end node, turning node(s), and smooth paths, regardless of the topological structure of the touch gesture. Then, we select effective features for each part based on the Fisher Score. Finally, we can represent each gesture with a feature vector consisting of the uniform feature set.
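For reference, the standard Fisher Score ranks each feature by its between-class scatter divided by its within-class scatter; a minimal sketch (our own helper, not the paper's code):

```python
import numpy as np

def fisher_score(X, y):
    """Per-feature Fisher Score. X is (samples, features); y holds class
    (user) labels. Higher scores mark features that are stable within a
    user and discriminative across users."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    mu = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in np.unique(y):
        Xc = X[y == c]
        between += len(Xc) * (Xc.mean(axis=0) - mu) ** 2
        within += len(Xc) * Xc.var(axis=0)
    return between / np.maximum(within, 1e-12)

# e.g., keep the indices of the ten highest-scoring features:
# top = np.argsort(fisher_score(X, y))[::-1][:10]
```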
The third challenge is how to tolerate the uncertainty caused by different body postures and hand postures? When performing a touch gesture, the user can sit, lie, or stand, and she/he can interact with the device with one hand or two hands; the different postures lead to inconsistency in the sensor data for the same touch gesture. To address this challenge, we design a multi-threshold kNN based model to adaptively separate the touch gestures under different postures into different clusters, and then perform user authentication in each cluster. In addition, to reduce the computation overhead of the multi-threshold kNN, we only use a small number of samples for training.
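A posture-tolerant matcher along these lines might group enrollment gestures by posture and calibrate one acceptance threshold per cluster. The sketch below is purely our illustration (the clustering granularity, the leave-one-out threshold calibration, and the percentile choice are assumptions, not the paper's model):

```python
import numpy as np

class MultiThresholdKNN:
    """Sketch of a posture-tolerant verifier: enrollment gestures are
    grouped into posture clusters, each with its own acceptance threshold
    on the mean distance to its k nearest samples."""

    def __init__(self, k=3):
        self.k = k
        self.clusters = []  # list of (samples, threshold) pairs

    def _knn_dist(self, samples, x, exclude_self=False):
        d = np.sort(np.linalg.norm(samples - x, axis=1))
        if exclude_self:
            d = d[1:]  # drop the zero distance to the sample itself
        return d[: self.k].mean()

    def enroll(self, gestures_by_posture, pct=95):
        for samples in gestures_by_posture:  # one feature matrix per posture
            samples = np.asarray(samples, dtype=float)
            loo = [self._knn_dist(samples, s, exclude_self=True) for s in samples]
            # Per-cluster threshold derived from the cluster's own spread.
            self.clusters.append((samples, np.percentile(loo, pct)))

    def verify(self, x):
        x = np.asarray(x, dtype=float)
        # Accept if any posture cluster finds the gesture close enough.
        return any(self._knn_dist(s, x) <= t for s, t in self.clusters)
```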
We make three main contributions in this paper. 1) We conduct an extensive experimental study to observe the finger's movement and the device's motion when performing a touch gesture, and then propose a spatial alignment method to align the touch gesture in the space domain and segment the gesture into sub-gestures, to enhance the stability of the same user and the discriminability of different users. 2) Based on a comprehensive analysis of the touch sensor data and the inertial sensor data, we propose a four-part feature selection method to represent touch gestures with different topological structures in a uniform way, and select effective features based on the Fisher Score by considering both the intra-class stability and the inter-class discriminability. In addition, we propose a multi-threshold kNN based model to mitigate the effect of different postures. 3) We implement TouchID on an Android-powered smartphone and conduct extensive experiments to evaluate its efficiency. The experiment results show that TouchID achieves a very low equal error rate for user authentication and outperforms the existing solutions.
2 RELATED WORK

When considering the unique features in user behaviors, a variety of gesture based user authentication schemes have been proposed. The gestures include hand gestures [8, 20], eye movements [11, 17], lip motions [37, 48], heartbeats [25, 36, 52], touch gestures [1, 7, 9, 12, 16, 31, 34, 43, 46, 47, 51, 55, 59–61], and so on. Among these gestures, touch gestures are often used to authenticate the owners of mobile devices, since users often interact with mobile devices through touch gestures. In this paper, we summarize the related work using touch gestures for user authentication on mobile devices. From the perspective of the sensors used for monitoring the touch gestures, we classify the related work into three categories, i.e., touch sensor based, inertial sensor based, and inertial-touch based user authentication.

Touch sensor based user authentication: When an on-screen gesture is performed, the touch sensor can provide the coordinates of fingertips and the sizes of touch areas, which can be used for inferring the moving directions, moving speeds, and moving trajectories of fingertips, the relative distances among multiple fingers, etc. The unique features in touch gestures can be used to differentiate users. Until now, many touch-related gestures, e.g., swiping [1, 9, 12, 16, 55, 59, 60], tapping [9, 34, 61], zooming in/out [60], and some user-defined gestures [47], have been proposed for user authentication. Frank et al. [16] investigated whether a classifier can continuously authenticate users based on single-touch swiping gestures on the screen. Chen et al. [9] required users to perform a sequence of rhythmic tapping or swiping gestures on a multi-touch mobile device, and then extracted features from the rhythmic gestures to authenticate users. Moreover, Song et al. [47] studied multi-touch swiping gestures and showed that both the hand geometry and the behavioral biometric can be recorded in these gestures and used for user authentication. Besides, some methods proposed the usage of image-based features for modeling touchscreen data. Zhao et al. [60] proposed a novel Graphic Touch Gesture Feature (GTGF) to extract identity traits from swiping and zooming in/out gestures, where the intensity values and shapes of the GTGF dynamically represent the moving trace and pressure. The existing work indicates that the touch sensor can be used for on-screen gesture authentication. However, with only the touch sensor, these methods mainly focus on the geometry features on the screen. To get more distinguishable features, they may require the user to perform gestures in a rhythm or adopt user-defined gestures.

Inertial sensor based user authentication: The inertial sensor refers to the accelerometer, the gyroscope, the magnetometer, or any combination of the three. It can measure the device's motion related to the touch gesture for user authentication. For example, Sitová et al. [46] used the accelerometer, gyroscope, and magnetometer to capture the micro hand movements and dynamic orientation changes during several tapping gestures for user authentication. Chen et al. [8] used the accelerometer in smartwatches to capture the vibration characteristics when a user taps her/his finger knuckles, and then extracted features from the vibration signals to authenticate users. Shen et al. [7] used the accelerometer, gyroscope, and magnetometer to capture behavioral traits during swiping gestures and built a one-class Markov-based decision procedure to authenticate users. Guerra-Casanova et al. [20] used the accelerometer in a mobile device to depict the behavioral characteristics when a user performs hand gestures while holding the device. The existing work indicates that the inertial sensor can be used for touch gesture based user authentication. However, different from the touch sensor, the inertial sensor mainly captures indirect sensor data of the touch gesture, i.e., it uses the sensor data corresponding to device motions to infer the characteristics of the touch gesture, and it is often used to authenticate simple or short gestures like tapping and swiping.

Inertial-touch based user authentication: When combining the touch sensor and the inertial sensor, it is possible to capture the on-screen gesture and the device's motion at the same time, thus enriching the sensor data for user authentication.
When using customized touch gestures on the smartphone for user authentication, Shahzad et al. [43] designed and chose 10 effective gestures, including 3 single-touch gestures and 7 multi-touch gestures, and then proposed GEAT, which extracts behavioral features from the touch sensor and the accelerometer for user authentication. Jain et al. [31] utilized the accelerometer, the orientation sensor, and the touch sensor to capture touch characteristics during swiping gestures, and used the modified Hausdorff distance as the classifier to authenticate users. Wang et al. [51] utilized the accelerometer, gyroscope, and touch sensor to capture the geometry of the user's hand when performing user-defined gestures that require the user to tap four times with one finger or tap once with four fingers. The existing work indicates that using the touch sensor as well as the inertial sensor can capture both the on-screen features and the motion features of the device. However, the existing work tends to introduce multi-touch gestures and user-defined gestures to capture more specific features corresponding to a user (e.g., the relations between fingers in a gesture) for better authentication.

To capture both the on-screen gesture and the device motions, we utilize the touch sensor, accelerometer, and gyroscope for user authentication. In regard to the touch gesture, the existing work mainly used customized gestures for user authentication, while such gestures are rarely used on commercial mobile devices. In this paper, the user only performs a widely adopted unlock gesture, i.e., the graphic pattern based touch gesture, with one finger for user authentication. Due to the lack of specific features like the displacements between different fingertips in multi-touch gestures and the relationship between a series of customized gestures, the user authentication in this paper, which only uses a widely adopted single-touch gesture, is more challenging.

3 OBSERVATIONS AND MODELING

In this section, we conduct extensive experiments to observe how the user performs a touch gesture and how the gesture affects the sensor data. Unless otherwise specified, the user performs a common unlock gesture on a 3 × 3 grid on a Samsung Galaxy S9 smartphone with a 5.9-inch screen, as shown in Fig. 2. Then, we use the touch sensor and the inertial sensor (i.e., accelerometer and gyroscope) to collect the sensor data for analysis. The sampling rates of the touch sensor, accelerometer, and gyroscope are set to 60 Hz, 100 Hz, and 100 Hz, respectively.

Fig. 2. Performing a touch gesture on a smartphone for user authentication.
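Since the touch stream (60 Hz) and the inertial streams (100 Hz) arrive at different rates, any analysis that combines them needs a common timeline. A minimal sketch of one way to synchronize them, linearly interpolating inertial readings at the touch timestamps (the helper is our illustration, not the paper's code):

```python
import numpy as np

def align_inertial_to_touch(touch_t, imu_t, imu_xyz):
    """Interpolate each axis of a 100 Hz inertial stream (accelerometer
    or gyroscope, shape (N, 3)) at the 60 Hz touch timestamps, giving one
    synchronized motion sample per touch point."""
    imu_xyz = np.asarray(imu_xyz, dtype=float)
    return np.column_stack([np.interp(touch_t, imu_t, imu_xyz[:, k])
                            for k in range(imu_xyz.shape[1])])
```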
3.1 Finger Movements and Device Motions during a Touch Gesture

A touch gesture can be measured with the fingertip's coordinates on the screen, the fingertip's touch sizes along the trajectory, and the device's motions over time. When the fingertip touches the screen, it generates the following data: the coordinate, the touch size, and the pressure of the fingertip. However, due to the limitation of