Track Your Foot Step: Anchor-free Indoor Localization based on Sensing Users' Foot Steps
Chang Liu†, Lei Xie†, Chuyu Wang†, Jie Wu‡, Sanglu Lu†
†State Key Laboratory for Novel Software Technology, Nanjing University, P.R. China
‡Department of Computer Information and Sciences, Temple University, USA
Email: †liuchang@dislab.nju.edu.cn, †lxie@nju.edu.cn, †wangcyu217@126.com, ‡jiewu@temple.edu, †sanglu@nju.edu.cn

Abstract—Currently, conventional indoor localization schemes mainly leverage WiFi-based or Bluetooth-based schemes to locate users in the indoor environment. These schemes require infrastructure such as WiFi APs and Bluetooth beacons to be deployed in advance to assist indoor localization. This property hinders the indoor localization schemes in that they are not scalable to situations without these infrastructures. In this paper, we propose FootStep-Tracker, an anchor-free indoor localization scheme purely based on sensing the user's footsteps. By embedding the tiny SensorTag into the user's shoes, FootStep-Tracker is able to accurately perceive the user's moving trace, including the moving direction and distance, by leveraging the accelerometers and gyroscopes. Furthermore, by detecting the user's activities such as ascending/descending the stairs and taking an elevator, FootStep-Tracker can effectively correlate these activities with specified positions such as stairs and elevators, and further determine the exact moving traces in the indoor map by leveraging the space constraints in the map. Realistic experiment results show that FootStep-Tracker is able to achieve an average localization accuracy of 1 m for indoor localization, without any infrastructure having been deployed in advance.

I. INTRODUCTION

Recently, indoor localization schemes have been widely used to support various applications such as context-aware or location-based services. Conventional localization schemes mainly leverage WiFi-based or Bluetooth-based schemes to locate users in the indoor environment. These schemes primarily require the deployment of infrastructure such as WiFi APs and Bluetooth beacons in advance to assist indoor localization. However, for a number of indoor environments, it is impossible (or rather expensive) to deploy such a large number of devices as localization infrastructure. This property hinders the indoor localization schemes in that they are not scalable to situations without these infrastructures. Therefore, it is essential to design a brand new approach for indoor localization without any requirement for infrastructure.

Recently, a few researchers have sought to leverage devices with embedded sensors, such as smart phones [1–3] and wearable bracelets, to position and track users in indoor environments. However, the previous work on positioning and tracking users has the following common limitations. First, they usually put devices like smart phones into the user's pant pocket and perceive the user's movements via the embedded sensors. They cannot accurately capture the user's movements, including the moving directions and distances, due to the inappropriate placement of sensors.

Fig. 1. The SensorTag used in FootStep-Tracker. We embed two tags into the insoles and use an Android phone to collect and process the sensor data.

Second, they conventionally estimate the moving distance by counting the foot steps, while assuming the user's step length remains a constant value.
This approach is not adaptive to the variation of the user's moving activities, since the user may sometimes walk with small steps and sometimes jog with large steps. Third, they still need to leverage anchor nodes like WiFi APs to help determine the exact position in the map. This increases their dependence on the surrounding infrastructure.

In this paper, we propose FootStep-Tracker, an anchor-free indoor localization scheme purely based on sensing the user's footsteps. Our novel solution is based on the observation that the user's moving activities can be effectively inferred from his/her footsteps by leveraging tiny sensors embedded in the shoes, such as accelerometers and gyroscopes. As shown in Fig. 1 (a), by embedding a tiny sensor like the SensorTag [4] into the user's shoes, FootStep-Tracker is able to accurately perceive the user's moving traces, including the moving direction and distance, by leveraging the accelerometers and gyroscopes. Fig. 1 (b) shows the FootStep-Tracker Android app. Furthermore, by detecting the user's activities such as ascending/descending the stairs and taking an elevator, FootStep-Tracker can effectively correlate these activities with specified positions such as the stairs and elevators, and further determine the exact moving traces in the indoor map, by leveraging the space constraints in the map.

There are several challenges in building an indoor localization scheme purely based on sensing the user's footsteps. First, it is difficult to accurately estimate the user's horizontal step movements. Since the sensors are embedded in the shoes, they actually capture the foot's movement in the air while the user is moving, and thus the user's horizontal movement cannot
be directly derived from the collected sensor data. To address this challenge, we leverage the gyroscope to measure the angle between the foot's direction of movement and the ground, and leverage the accelerometer to measure the actual movement of the foot. We then build a geometric model to estimate the horizontal movement. Second, it is difficult to accurately estimate the user's moving direction during the movement. While tracking the user's foot steps, the angle variation of the foot steps cannot be directly correlated to the user's moving direction. To address this challenge, we build a geometric model to depict the relationship between the angle variation of the foot steps and the moving direction, and further derive the user's moving direction from the measurements of the embedded sensors. Third, to realize indoor localization, it is essential to determine the exact moving traces in the indoor map. To address this challenge, we use activity sensing to effectively figure out the reference positions, such as the elevators and stairs, and further leverage the space constraints in the indoor map to filter out infeasible candidate traces, so as to fix the moving traces in the indoor map.

We advance the state of the art on positioning and tracking users from three perspectives. First, we propose an anchor-free indoor localization scheme purely based on sensing the user's footsteps, without the support of any infrastructure. Second, we propose efficient solutions to accurately estimate the moving direction and distance by only leveraging low-cost inertial sensors like the accelerometer and gyroscope. Third, we leverage activity sensing to effectively figure out the reference positions during the process of tracking the user, so as to further determine the exact moving traces in the indoor map.

II. RELATED WORK

A. Infrastructure based Indoor Localization

Infrastructure based indoor localization schemes primarily use wireless signals, such as RF and WiFi signals, to locate users or objects in the indoor environment. Several location algorithms such as Fingerprint [6] and LANDMARC [7] have been proposed and widely accepted in the academic area. Yang et al. [9] proposed Tagoram, an object localization system based on COTS RFID readers and tags. With the proposed Differential Augmented Hologram (DAH), Tagoram can recover the tag's moving trajectories and achieves millimeter-level location accuracy in tracking mobile RFID tags. Xiao et al. [10] proposed Nomloc, which dynamically adjusts the WLAN network topology via a nomadic WiFi AP to address the performance variance problem. With the proposed space-partition-based algorithm and fine-grained channel state information, Nomloc can effectively mitigate the multipath and NLOS effects.

B. Infrastructure-free based Indoor Localization

State-of-the-art infrastructure-free indoor localization schemes, especially pedestrian navigation systems, track the user by detecting the user's movement with IMU sensors, and dead reckoning is the most popular scheme, which estimates the object's current position from its previously determined position [3, 11–17]. Leppäkoski et al. [11] proposed a localization system that combines IMU sensors, WLAN signals, and an indoor map. By using an extended Kalman filter to combine the sensor data with WLAN signals and a particle filter to combine the inertial data with map information, the diverse data are fused well to improve pedestrian dead reckoning. Vidal et al.
[12] present an indoor pedestrian tracking system with the sensors on a smart phone. Combined with dead reckoning and a gait detection approach, and aided by indoor signatures such as corners, the system achieves acceptable location accuracy. Wang et al. [13] present UnLoc, which leverages identifiable signal signatures of the indoor environment, captured by the sensors or WiFi, to improve the dead-reckoning method. With UnLoc, the convergence speed of the localization system can be effectively improved. Fourati et al. [15] proposed a Complementary Filter algorithm to process the sensor data; combined with the Zero Velocity Update (ZVU), the system can locate the user with high accuracy. Rai et al. developed ZEE [3], which leverages the smart phone's built-in sensors to track the user as he travels in an indoor environment, while scanning WiFi signals simultaneously. By combining the sensors and WiFi, ZEE uses crowdsourcing to locate the user, achieving meter-level location accuracy.

Different from the previous work, in this paper we propose an anchor-free indoor localization system. By sensing the user's foot steps and utilizing the reference positions and constraints of the indoor map, FootStep-Tracker tracks the user's location without any deployment of anchor nodes.

III. SYSTEM OVERVIEW

In our system, called FootStep-Tracker, we focus on how to track the user's position based on the low-cost inertial sensors embedded inside the shoes, according to a given indoor map. Fig. 2 shows the framework of FootStep-Tracker. First, the Activity Classifier is designed to classify the user's activities into two activity groups, i.e., walking and reference activities such as ascending/descending the stairs and ascending/descending in the elevator, according to the raw sensor data of the gyroscope and accelerometer. In regard to the walking activity, we measure the moving distance based on the Step Segmentation and Step Length Estimator, and measure the moving direction based on the Moving Direction Estimator. According to the moving distance and moving direction, we reconstruct the user's moving trace relative to the starting point. Meanwhile, it is possible to derive the reference positions according to the activity sensing results from the Activity Classifier. For example, the reference positions can be the elevators if the activity of elevator ascending/descending is detected. Furthermore, by leveraging the space constraints in the indoor map to filter out infeasible candidate traces, our solution can finally determine the user's trace in the indoor map.

The components of FootStep-Tracker are as follows:

1) Activity Classifier. It extracts corresponding features from the inertial sensor data of human movement, then it estimates the user's current activities via classification
techniques such as decision trees and hidden Markov models.

Fig. 2. Framework of FootStep-Tracker. Given the sensors' data and the indoor map as input, FootStep-Tracker outputs the user's location in time.

2) Step Segmentation. In regard to the activity of walking, it splits the sequential inertial sensor data into segments, where each segment represents a complete process of a footstep during walking.

3) Step Length Estimator. It estimates the distance of each step along the horizontal line. We use a geometric model to depict the footstep movement and rotation of the foot during one step, and then project the step length in the air onto the horizontal line, by leveraging the accelerometer to estimate the step length in the air and the gyroscope to estimate the projection angle.

4) Moving Direction Estimator. It estimates the turning angle during the process of walking. We use a geometric model to depict the relationship between the angle variation of the foot steps and the moving direction, and further derive the user's moving direction from the measurements of the embedded sensors.

5) Reference Position Estimator. It estimates the reference positions in the given indoor map, such as elevators and stairs, according to the results of activity sensing. In this way, the moving trace can be fixed in the indoor map.

IV. SYSTEM DESIGN

System Deployment. FootStep-Tracker processes the data captured by the sensors embedded in the user's shoes. Without loss of generality, we use the CC2541 SensorTag [4] produced by Texas Instruments. We sample the accelerometer and gyroscope at 20 Hz, and analyze the data and present the localization result on an Android smart phone carried by the user. For the convenience of further discussion, we present the axes of the SensorTag coordinate system in Fig. 3. We denote the three-axis acceleration as a_x, a_y, a_z, and the three-axis angular velocity as g_x, g_y, g_z.

Fig. 3. Axes on SensorTag: (a) axes of accelerometer; (b) axes of gyroscope.

A. Activity Classifier

Motivation. For the purpose of estimating the moving trace and reference positions, we first need to know what the user is currently doing. In our scenario, we need to classify the user's activity into two main classes: walking and reference activities. If the user is walking, we use the sensor data to estimate the user's moving trace. If the user is performing reference activities, including ascending/descending the stairs and ascending/descending in the elevator, we use them to find the reference positions in the map. Besides, if the user is detected as standing still, we simply keep collecting sensor data.

Observation and Intuition. The acceleration a_z is strongly correlated with the six activities. That is because, when the user is standing still, the z-axis is along the vertical direction, which is opposite to the direction of gravity, and the acceleration stays constant, which differs from the periodically fluctuating acceleration of walking and climbing stairs. Moreover, when the user is moving up or down, such as ascending/descending the stairs, the foot's movement is along the vertical direction, which can be sensed well by a_z.

We first collect a_z for each activity. Fig. 4 shows the acceleration of the different activities. Fig. 4 (a) shows that when the user is standing still, a_z almost stays constant, and its amplitude equals the gravity.
Fig. 4 (b)-(d) show that when the user is walking or ascending/descending the stairs, a_z changes periodically. Fig. 4 (e) shows the process of a user ascending in the elevator. The red box in the figure shows that the acceleration reading first gets smaller than the gravity reading, then gets larger. While the elevator accelerates to gain an upward speed, the user is under the hypergravity condition and the a_z reading is smaller than the gravity reading. Then the elevator rises at a constant speed, with the user at rest relative to the elevator; meanwhile, a_z is equal to the gravity reading. Finally, the elevator slows down, the user is under the weightlessness condition, and a_z is still negative but larger than the gravity reading. Fig. 4 (f) shows the process of the elevator descending, which is the opposite of the ascending process.

Solution. To classify the user's activities, we first segment the sequential data into windows, then classify each window by a hybrid method. Generally, the human step frequency is 1 Hz to 3 Hz; that is to say, the period of a step lasts from 0.3 s to 1 s. The weightlessness and hypergravity processes in the elevator commonly last for about 2 seconds. We use a sliding window of 40 samples, which equals 2 seconds in time, to ensure that the window contains an entire step period during walking or a complete process of hypergravity and weightlessness.

We classify each window into one of eight classes, as shown in Fig. 5; Table I describes each abbreviation. Firstly, we note that UST, DST and WALK obviously have a higher variance than EHG, EWL and SS. To separate these two activity groups, we use a decision tree with a threshold on the variance of the window.
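To make the first stage of the classifier concrete, the following is a minimal sketch of the sliding-window segmentation and the variance test described above, assuming the a_z samples are available as a NumPy array at 20 Hz; the function names and the threshold value are illustrative rather than taken from the paper, which selects the bound from the measured variance distribution.

```python
import numpy as np

FS = 20                  # sampling rate in Hz, as stated in the paper
WIN = 2 * FS             # sliding-window size: 40 samples = 2 seconds
VAR_THRESHOLD = 9.0      # illustrative bound between the two activity groups

def split_windows(a_z, win=WIN, step=None):
    """Segment the sequential a_z samples into sliding windows."""
    step = step or win   # assumption: non-overlapping windows by default
    return [a_z[i:i + win] for i in range(0, len(a_z) - win + 1, step)]

def coarse_group(window):
    """First-stage decision: high variance -> {UST, DST, WALK},
    low variance -> {EHG, EWL, SS}; each group is refined by the
    second stage of the classifier."""
    if np.var(window) > VAR_THRESHOLD:
        return "UST/DST/WALK"
    return "EHG/EWL/SS"
```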
Fig. 4. Accelerometer data in the vertical direction (z-axis, which includes gravity of about −9.8 m/s²) when the user is (a) standing still, (b) walking, (c) ascending the stairs, (d) descending the stairs, (e) ascending in the elevator, and (f) descending in the elevator.

Fig. 5. The Activity Classifier: each window passes through a variance-based decision tree, then either an HMM (for UST, DST, WALK) or a second decision tree (for SS, EHG, EWL, and in turn EA, ED).

TABLE I
LABEL DESCRIPTION

Abbrev | Description              | Abbrev | Description
UST    | ascend the stairs        | EWL    | weightlessness in elevator
DST    | descend the stairs       | EA     | ascend the elevator
WALK   | walking                  | ED     | descend the elevator
EHG    | hypergravity in elevator | SS     | stand still

Fig. 6 (a) shows the CDF (Cumulative Distribution Function) plot of the two groups' window variances of a_z, which contains about 700 windows collected from three different users. The a_z almost stays constant when a user is standing still or taking an elevator; meanwhile, it has a much larger variance while the user is walking or ascending/descending stairs. Moreover, there is an obvious bound between the two groups, which can be selected as the threshold. There is no such obvious bound to classify UST, DST and WALK. However, note that the fluctuation patterns of UST, DST and WALK differ, as we have mentioned before; we use a Hidden Markov Model (HMM) for this classification. To classify EHG, EWL and SS, we also notice that the mean value of the window is different, caused by the hypergravity and weightlessness. Fig. 6 (b) shows the CDF plot of the three activities' mean values of the a_z window, which contains about 200 windows collected from five different users. Furthermore, if we estimate the user's activity as EHG and then observe an EWL, we say the user is under EA; if the user is under EWL and we then observe an EHG, we say the user is under ED.

Fig. 6. CDF of the variance and mean of a_z for the different activities: (a) variance; (b) mean.

B. Step Segmentation

To estimate the length of each step, we first need to split the raw sequential data into individual steps. Human walking is a periodic movement along the moving direction, which has a specific pattern in the sensors' readings. Since the direction of the y-axis is almost the same as the moving direction, we perform step segmentation on a_y, assisted by a_x and a_z. Fig. 7 shows the three-axis acceleration while the user is walking. Note that, after the foot touches the floor and before it lifts up, it is relatively static to the ground and the accelerometer has a constant reading, which we call the "static zone".

The red boxes in Fig. 7 show the "static zones" of the accelerometer. To avoid mistaken segmentation caused by activities that are similar to walking, such as swinging the leg, we also detect "static zones" on a_x and a_z. If the current activity is walking, Step Segmentation takes the raw data as input, segmenting a_x, a_y, a_z by "static zones", each of which contains six consecutive samples that range within 0 ± 0.5 on a_x, a_y and −9.8 ± 0.5 on a_z. We extract the window between the "static zone" windows for each axis,
and take the intersection of the three as the segmented data for the current step.

Fig. 7. The three-axis accelerometer data during walking.
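As a concrete illustration of the "static zone" rule described above, the following is a minimal sketch of the segmentation step, assuming the three acceleration axes are available as NumPy arrays; the helper names and the run-detection details are illustrative choices rather than the paper's exact procedure.

```python
import numpy as np

STATIC_LEN = 6     # six consecutive samples define a "static zone"
TOL = 0.5          # tolerance around the rest value on each axis
G = -9.8           # rest value of a_z, which includes gravity

def static_mask(samples, rest_value, tol=TOL, run=STATIC_LEN):
    """Mark samples that belong to a static zone: within `tol` of the rest
    value and part of a run of at least `run` such samples."""
    samples = np.asarray(samples, dtype=float)
    near = np.abs(samples - rest_value) <= tol
    mask = np.zeros(len(samples), dtype=bool)
    i = 0
    while i < len(samples):
        if near[i]:
            j = i
            while j < len(samples) and near[j]:
                j += 1
            if j - i >= run:
                mask[i:j] = True
            i = j
        else:
            i += 1
    return mask

def segment_steps(a_x, a_y, a_z):
    """Return one index array per step: the contiguous runs of samples that
    lie outside the static zones on all three axes (their intersection)."""
    moving = (~static_mask(a_x, 0.0) &
              ~static_mask(a_y, 0.0) &
              ~static_mask(a_z, G))
    idx = np.flatnonzero(moving)
    if idx.size == 0:
        return []
    breaks = np.flatnonzero(np.diff(idx) > 1) + 1   # gaps separate steps
    return np.split(idx, breaks)
```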
C. Step Length Estimator

Motivation. For the purpose of depicting the user's moving trace, we need the user's moving distance. Different users have different step lengths according to their figures. For a specific user, many existing step length estimation schemes are based on the assumption that the step length is invariable during a period of time. However, the user's step length may change frequently in some cases, such as walking with small steps and jogging with large steps. The Step Length Estimator estimates the length step by step, so it can sense changes of the user's stride in time.

Challenge. The step length is not exactly the length of the foot's moving trace in the air. Instead, as depicted by the red dotted line in Fig. 8, it is the moving trace's projection on the ground. Therefore, we cannot directly derive the step length by a double integral of a_y.

Observation and Intuition. Fig. 8 (a) depicts the moving process of the foot. As shown in the figure, the y-axis is not always horizontal; we project it onto the horizontal plane and denote it as the foot direction, and denote the angle between the y-axis direction and the foot direction as θ.

Fig. 8 (b) shows the sensor data corresponding to (a). As shown in Fig. 3, in the sensor's coordinate system, the forward direction is the positive direction of a_y and the anticlockwise rotation around the x-axis is the positive direction of g_x. At phase (1), the foot is relatively static to the ground, corresponding to a few zero values on a_y and g_x. At phase (2), the foot does not yet have a forward acceleration, but the heel lifts up, leading to a negative reading in g_x; as the y-axis is no longer horizontal, a_y is slightly less than zero due to gravity. We denote the time at the beginning of phase (2) as the uplift time, i.e., T_u. At phase (3), the foot starts to move forward: the instep warps upward, leading to a positive reading in g_x, and the entire foot accelerates forward, causing a positive reading in a_y. We denote the time at the beginning of phase (3) as the liftoff time, i.e., T_l. At phase (4), the foot decelerates to static, touches the ground, and the instep warps downward. We denote the time at the beginning of phase (4) as the landing time, i.e., T_d. At phase (5), the heel touches down on the ground and rests again. We denote the time at the beginning of phase (5) as the rest time, i.e., T_r.

Besides, due to toe-in and toe-out, the forward horizontal acceleration along the foot direction cannot represent that along the moving direction. Fig. 9 depicts this situation: a_x denotes the x-axis acceleration, a_m denotes the acceleration along the moving direction, and a_f is the acceleration along the horizontal foot direction. There is an angle φ between the moving direction and the foot direction. The relationship among the three accelerations a_m, a_f, a_x can be represented as Eq. (1):

a_m = a_f cos(φ) + a_x sin(φ)    (1)

Given the sensor data segmented by Step Segmentation, we first extract the critical times, including the uplift time, liftoff time, landing time and rest time. Then we estimate the step length by integrating a_m from the liftoff time to the landing time. Lastly, as we embed a sensor in both shoes, we use double-feet calibration to further reduce the error, obtaining the calibrated step length.

Fig. 9. The toe-in and toe-out situation.

Critical Time Extraction. As mentioned above, only the foot's movement in phase (3) leads to the displacement, which happens between the liftoff time and the landing time.
Besides, the angle θ changes from the uplift time to the landing time. So we extract the critical times, namely the uplift time, liftoff time, landing time and rest time. Given the data segmented by Step Segmentation, which only contains the phase (2)-(4) data, FootStep-Tracker extracts the critical times in the data sequences. At the uplift time, the heel lifts up, g_x starts to be negative, and a_y is slightly less than zero; we search backward from the beginning of the segment, taking the time when g_x starts to be negative as the uplift time. At the liftoff time, the foot just starts to move forward; we search within the segment, taking the time when a_y(t) < 0 and a_y(t + 1) > 0 as the liftoff time. At the landing time, the heel touches the ground and a_y declines to negative. At the rest time, g_x and a_y return to zero again; we extract the first time when a_y and g_x become zero.

Algorithm 1: Critical Time Extraction.
Input: Sequential data a_y, g_x; segmented data for the current step D_s
Output: Uplift time T_u, liftoff time T_l, landing time T_d, rest time T_r
1 Find T_u backward from the beginning of D_s, until the data at time t satisfies g_x(t − 1) = 0, g_x(t) < 0;
2 Find T_l backward from the beginning of D_s, until the data at time t satisfies a_y(t) < 0, a_y(t + 1) > 0;
3 Find T_d forward from the end of D_s, until the data at time t satisfies a_y(t − 1) > 0, a_y(t) < 0;
4 Find T_r forward from the end of D_s, until the data at time t satisfies that g_x(t) and a_y(t) are equal to zero;
5 return T_u, T_l, T_d, T_r;

Step Length Estimation. The red dotted line in Fig. 8 shows that the step length is not the foot's moving trace in the air, but its projection on the ground. Eq. (2) shows that the forward acceleration along the foot direction, a_f, can be calculated from a_y, a_z, and the angle θ at each time; we project a_y and a_z onto the horizontal plane and compound them as a_f:

a_f(t) = a_y(t) cos(θ(t)) + a_z(t) sin(θ(t)),  t ∈ [T_l, T_d]    (2)

Eq. (3) calculates the angle θ between the y-axis direction and the foot direction at each time. As the instep starts to roll at the uplift time, we integrate the x-axis gyroscope reading from the uplift time, obtaining the angle θ at each time t:

θ(t) = ∫_{T_u}^{t} g_x(τ) dτ,  t ∈ [T_u, T_r]    (3)
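To tie Eqs. (1)-(3) together, here is a minimal sketch of the per-step length computation, assuming the critical-time indices T_u, T_l, T_d, T_r have already been extracted (e.g., by Algorithm 1) and that the toe-in/toe-out angle φ is supplied separately; the function names, the use of NumPy, and the cumulative-sum integration are illustrative choices, and the double-feet calibration mentioned above is omitted.

```python
import numpy as np

DT = 1.0 / 20.0   # sample period at the paper's 20 Hz sampling rate

def foot_pitch(g_x, t_u, t_r, dt=DT):
    """Eq. (3): integrate the x-axis angular rate from the uplift time T_u to
    obtain the angle theta(t) between the y-axis and the horizontal foot
    direction, for t in [T_u, T_r]."""
    theta = np.zeros(len(g_x))
    theta[t_u:t_r + 1] = np.cumsum(g_x[t_u:t_r + 1]) * dt
    return theta

def step_length(a_x, a_y, a_z, g_x, t_u, t_l, t_d, t_r, phi=0.0, dt=DT):
    """Sketch of one step-length estimate.

    Eq. (2): project a_y and a_z onto the horizontal foot direction as a_f.
    Eq. (1): rotate a_f and a_x by the toe-in/toe-out angle phi to obtain a_m,
             the acceleration along the moving direction (phi = 0 assumes the
             foot points exactly along the moving direction).
    The step length is then the double integration of a_m over [T_l, T_d],
    the liftoff-to-landing interval."""
    theta = foot_pitch(g_x, t_u, t_r, dt)
    sl = slice(t_l, t_d + 1)
    a_f = a_y[sl] * np.cos(theta[sl]) + a_z[sl] * np.sin(theta[sl])  # Eq. (2)
    a_m = a_f * np.cos(phi) + a_x[sl] * np.sin(phi)                  # Eq. (1)
    v = np.cumsum(a_m) * dt   # velocity along the moving direction
    d = np.cumsum(v) * dt     # displacement along the moving direction
    return float(d[-1])
```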