Additional Training Data
The original PerfLoc training data was acquired by a person who simply walked through the four buildings in which data was collected. The test data, on the other hand, included T&E scenarios for various modes of mobility. By design, NIST made these scenarios challenging: the PerfLoc participants were expected to deal with the occurrence of several modes of mobility within a single scenario. They were expected to detect the instants at which the mobility mode changed and to correctly identify the new mobility mode. The idea was that correct detection of the mobility mode and development of models for the various mobility modes would result in better localization and tracking performance.
To achieve this lofty goal, some PerfLoc participants suggested that it would be helpful if NIST made training data available for the various mobility modes encountered in the test data. This would allow the participants to develop models for the various mobility modes, which they could use for automatic detection of the mobility mode and adaptation of the tracking algorithm to the mobility mode in effect at a given time. NIST also realized that it would be useful to provide training data for smartphone attitude estimation, given how tightly localization and tracking are coupled with attitude estimation.
In response to these requests, NIST collected two types of training data that we are making available to the PerfLoc community. We collected training data with four smartphones (OnePlus 2, LG G4, Motorola Nexus 6, Samsung Galaxy S6) attached to the arms of the person collecting the data in the same way that the original PerfLoc data was collected (see PerfLoc User Guide). The person collecting the data was a young male, different from the young male who collected the original PerfLoc data, who used his left leg as the leading leg when sidestepping. We also collected training data with a Google Pixel XL smartphone, which will be used at the PerfLoc Finalist App Demo Days at NIST in February 2018. The person collecting this data was a young female who used her right leg as the leading leg when sidestepping. She held the Google Pixel XL smartphone in her left hand while moving around in the building. This is also how the tests at the PerfLoc Finalist App Demo Days will be conducted at NIST.
The training scenarios for three mobility modes (walking backwards, sidestepping, and crawling on the floor) start with the person collecting data walking normally for a minute or two before switching to the non-walking mobility mode intended for that scenario. For example, the person may crawl on the floor for a few minutes. The person then resumes walking normally for a minute or two to complete the scenario. The training data for the three other mobility modes (going up and down a stairwell, going up and down in an elevator, and transporting smartphone(s) on a push cart) was collected while the person was moving around using that type of motion only, without walking before or afterwards. Of course, the person had to take a few steps to get into or out of the elevator or to get on or off the staircase. For attitude estimation, we collected data with the Google Pixel XL smartphone only, because it was not possible to measure the attitude of the four phones strapped to the arms of the person who collected the data with any degree of accuracy. More will be said about this shortly. The table below shows the seven types of scenarios we used for data collection and which scenario was used with which phone(s).
| Scenario ID | Mobility Mode | Simultaneous Data Collection with Four Phones (OnePlus 2, LG G4, Motorola Nexus 6, and Samsung Galaxy S6) | Data Collection with Google Pixel XL Only |
|---|---|---|---|
| AS1 | Walking Backwards | ✔ | ✔ |
| AS2 | Sidestepping | ✔ | ✔ |
| AS3 | Crawling on the Floor | ✔ | ✔ |
| AS4 | Going Up and Down a Stairwell | ✔ | ✔ |
| AS5 | Going Up and Down in an Elevator | ✔ | ✔ |
| AS6 | Transporting Smartphone(s) on a Push Cart | ✔ | ✔ |
| AS7 | Phone Attitude Estimation | -- | ✔ |
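The walk/other/walk structure of the AS1-AS3 scenarios lends itself to a simple labeling routine. The sketch below is a hypothetical illustration, assuming the two annotated switch times and the sensor timestamps are available in the same time units; it is not part of the NIST distribution.

```python
# Hypothetical sketch: label sensor samples with a mobility mode for an
# AS1-AS3 scenario, which starts and ends with normal walking and has one
# annotated segment of another mobility mode in between.

def label_mobility_mode(sample_times, switch_to_other, switch_back, other_mode):
    """Return a mode label for every sample timestamp."""
    labels = []
    for t in sample_times:
        # Samples between the two annotated switch times belong to the
        # non-walking mode; everything else is normal walking.
        if switch_to_other <= t < switch_back:
            labels.append(other_mode)
        else:
            labels.append("walking")
    return labels

labels = label_mobility_mode([0, 50, 120, 200], 40, 150, "crawling")
# first and last samples fall in the walking segments
```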
The additional training data comprises four types of data (see PerfLoc User Guide): (i) data from the built-in environmental, position, and motion sensors; (ii) timestamps at certain test points (dots) in the buildings with known x, y, z coordinates and other ground truth information; (iii) metadata describing the features of the device used to collect the data set for a particular scenario and, whenever applicable, the air pressure value at the beginning of the scenario; and (iv) a file describing the scenario, including the coordinates of each training dot used in the scenario. The protocol buffer definitions are the same as for the original PerfLoc data. We are making available only the types of data that are potentially useful in developing models for the various mobility modes. The data is well annotated: ground truth location is provided frequently; the time at which the person switches from walking to another mobility mode and the time at which the person switches back to walking in the first three scenarios are provided; and every turn the person makes, whether in the direction of motion or just a turn of the body without any change in the direction of motion (for example, when the person switches from walking to sidestepping), is documented. The file that describes each of the first six scenario types (AS1 - AS6 in the above table) is a table with the following column headings:
| Index | Change in Direction of Motion | Turn of the Body | Notes | Easting (m) | Northing (m) | Elevation (m) |
|---|---|---|---|---|---|---|
The file that describes the last scenario type (AS7 in the above table) has one more column with the heading "Attitude" before (to the left of) the "Notes" column.
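A scenario description file with these columns can be loaded with a few lines of Python. The sketch below is a hypothetical reader that assumes the table has been exported as CSV with exactly the column headings listed above; the actual file format in the NIST distribution may differ, so adjust the reader accordingly.

```python
import csv

def read_scenario(path, has_attitude=False):
    """Return a list of dot records parsed from a scenario description CSV.

    Assumes the CSV header matches the column headings listed above; the
    optional "Attitude" column is only present for AS7 files.
    """
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    dots = []
    for row in rows:
        dots.append({
            "index": int(row["Index"]),
            "direction_change": int(row["Change in Direction of Motion"]),
            "body_turn": int(row["Turn of the Body"]),
            "notes": row["Notes"],
            "attitude": row.get("Attitude") if has_attitude else None,
            # Ground-truth dot coordinates in meters
            "position": (float(row["Easting (m)"]),
                         float(row["Northing (m)"]),
                         float(row["Elevation (m)"])),
        })
    return dots
```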
(Easting, Northing, Elevation) are the coordinates of the points at which ground truth information about any change in the direction of motion, turn of the body of the person collecting the data, or any change in mobility mode is documented. We used the identifying codes shown in the following table for any "Change in Direction of Motion":
| Various Choices for "Change in Direction of Motion" | Identifying Code |
|---|---|
| No change in the direction of motion | 1 |
| 180-degree change in the direction of motion without the person who collected the data turning at all, e.g., when the person switches from walking normally to walking backwards | 2 |
| 90-degree turn to the right with respect to the direction of motion | 3 |
| 90-degree turn to the left with respect to the direction of motion | 4 |
| Making a u-turn by turning to the right | 5 |
| Making a u-turn by turning to the left | 6 |
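One natural use of these codes is to propagate a heading for the direction of motion from dot to dot. The sketch below is a minimal, hypothetical illustration of that bookkeeping, assuming headings measured in degrees clockwise from North.

```python
# Hypothetical sketch: update the heading of the direction of motion
# (degrees, 0 = North, increasing clockwise) using the
# "Change in Direction of Motion" codes from the table above.

DIRECTION_CHANGE_DEG = {
    1: 0,      # no change in the direction of motion
    2: 180,    # reversal without turning (e.g. walking -> walking backwards)
    3: 90,     # 90-degree turn to the right
    4: -90,    # 90-degree turn to the left
    5: 180,    # u-turn by turning to the right
    6: -180,   # u-turn by turning to the left
}

def apply_direction_change(heading_deg, code):
    """Return the new heading after applying one identifying code."""
    return (heading_deg + DIRECTION_CHANGE_DEG[code]) % 360
```

For example, starting out heading North (0 degrees), a code-3 turn yields a heading of East (90 degrees). Codes 5 and 6 produce the same final heading but differ in the sense of rotation, which matters when modeling gyroscope output.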
Note that at some locations we pressed the push-button switch that creates a timestamp twice in a row without moving. There is no identifying code in the above table for that situation, but one can detect its occurrence when the coordinates (Easting, Northing, and Elevation) do not change from one row of the table to the next. In these cases, it is still possible for the person collecting data to turn in place, e.g., when the person switches from walking normally to sidestepping.
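Detecting such double presses amounts to checking consecutive rows for identical coordinates. A hypothetical sketch, assuming the dot coordinates have been parsed into tuples in row order:

```python
# Hypothetical sketch: flag rows whose coordinates repeat those of the
# previous row, indicating the push-button switch was pressed twice
# at the same spot (e.g. a turn in place).

def find_double_presses(dots):
    """dots: list of (easting, northing, elevation) tuples in row order.

    Returns the indices of rows that repeat the previous row's coordinates.
    """
    return [i for i in range(1, len(dots)) if dots[i] == dots[i - 1]]

idx = find_double_presses([(0, 0, 0), (5, 0, 0), (5, 0, 0), (5, 3, 0)])
# row 2 repeats the coordinates of row 1
```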
Also note that the initial direction of motion (North, West, South, or East) is specified in the "Notes" field in the first row of the table.
We used the identifying codes shown in the following table for "Turn of the Body":
| Various Choices for "Turn of the Body" | Identifying Code |
|---|---|
| No turn of the body | 1 |
| 90-degree turn to the right (clockwise when viewed from above) | 2 |
| 90-degree turn to the left (counterclockwise when viewed from above) | 3 |
| 180-degree turn to the right (clockwise when viewed from above) | 4 |
| 180-degree turn to the left (counterclockwise when viewed from above) | 5 |
To collect attitude estimation data, we laid the Google Pixel XL phone on the floor in one of six possible ways at many locations. The ground truth attitude of the phone is provided at those instants. We placed the phone face up on the floor with the y-axis of the phone pointing in the direction of motion, 90 degrees to the right of the direction of motion, opposite to the direction of motion, or 90 degrees to the left of the direction of motion. We also placed the phone on the floor on its side with the y-axis pointing in the direction of motion and the screen facing to the right or to the left. The pictures below show these six phone attitudes, with the respective identifying code used in the file describing the attitude estimation scenario at the bottom of each image. The phone moved freely in the left hand of the person collecting data between the times at which the person placed the phone on the floor. Therefore, the challenge is to estimate the phone attitude on a continuous basis and compare the estimate with the provided ground truth attitude at the instants the phone was placed on the floor.
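Each of the six resting attitudes implies a distinct steady-state gravity signature in the phone's sensor frame. The sketch below is a hypothetical illustration using the standard Android device axes (x to the right of the screen, y toward the top of the screen, z out of the screen); the assignment of code values 1-6 to attitudes is an assumption and should be verified against the images in the actual scenario files. Note that codes 1-4 share the same gravity signature and can only be told apart using heading information, not the accelerometer alone.

```python
import math

G = 9.81  # gravity magnitude, m/s^2

# Hypothetical mapping from attitude code to the expected steady-state
# accelerometer reading (m/s^2) in the Android device frame.
EXPECTED_ACCEL = {
    # Codes 1-4 (assumed): face up on the floor, y-axis rotated in the
    # horizontal plane; gravity is along -z in all four cases, so the
    # accelerometer reads +G on z regardless of the y-axis direction.
    1: (0.0, 0.0, G),   # y-axis along the direction of motion
    2: (0.0, 0.0, G),   # y-axis 90 deg right of the direction of motion
    3: (0.0, 0.0, G),   # y-axis opposite to the direction of motion
    4: (0.0, 0.0, G),   # y-axis 90 deg left of the direction of motion
    # Codes 5-6 (assumed): on its side, y-axis along the direction of motion.
    5: (-G, 0.0, 0.0),  # screen facing right: device x-axis points down
    6: (G, 0.0, 0.0),   # screen facing left: device x-axis points up
}

def closest_attitude(accel):
    """Pick the attitude code whose expected reading is nearest to `accel`."""
    return min(EXPECTED_ACCEL,
               key=lambda c: math.dist(accel, EXPECTED_ACCEL[c]))
```

A nearest-signature check like this can only confirm the face-up versus on-side cases; resolving the direction of the y-axis among codes 1-4 requires the estimated heading at the moment the phone is set down.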
The additional training data described on this page may be downloaded here.
Any questions? Please email LocSprt@nist.gov.