.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/sensor_position_comparison_2019.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_auto_examples_sensor_position_comparison_2019.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_sensor_position_comparison_2019.py:

SensorPositionComparison2019 - Full mocap reference data set with 6 sensors per foot
=====================================================================================

We provide two versions of the dataset:

SensorPositionComparison2019Segmentation
    In this dataset, no mocap ground truth is provided and the IMU data is not cut into the individual
    gait tests.
    Instead, a single recording exists per participant, covering all tests (including failed ones) and
    the movement between the tests.
    This version can be used for stride segmentation tasks, as we hand-labeled all stride start and end
    events in these recordings.
SensorPositionComparison2019Mocap
    In this dataset, the data is cut into the individual tests.
    This means 7 data segments exist per participant.
    For each of these segments, a fully synchronised motion capture reference is provided.

For more information about the dataset, see the dataset
`documentation <https://zenodo.org/record/5747173>`_.

General information
-------------------

The dataset was recorded with NilsPod sensors by Portabiles.
Multiple sensors were attached to the feet of the participants.
For most tasks, you will only be interested in the data from one sensor position.

The data from all foot-mounted IMUs is transformed into the gaitmap coordinate system on loading.
If you want to use the data from the ankle or hip sensor, it will remain in the original coordinate
system as defined by the sensor node.
For attachment images, see the dataset `documentation <https://zenodo.org/record/5747173>`_.

Below we show the applied coordinate transformation for the instep sensor as an example.
All other sensor transformations are shown at the end of this document.

.. figure:: /images/coordinate_systems/coordinate_transform_nilspodV1_instep_fraunhofer_qualisis.svg
    :alt: coordinate system definition instep
    :figclass: align-center

.. GENERATED FROM PYTHON SOURCE LINES 39-51

.. warning:: For this example to work, you need to have a global config set containing the path to the
             dataset. Check the `README.md` for more information.

SensorPositionComparison2019Segmentation
========================================

This version of the dataset contains one recording per participant, covering all tests and the movement
between the tests.
No mocap reference is provided, just the IMU data and the stride borders labeled based on the IMU data.
By default, the data of all sensors is provided, and the data of each sensor is aligned based on the
roughly known orientation of the sensor, so that the coordinate system of the insole sensor (see dataset
documentation) can be used for all sensors.

.. GENERATED FROM PYTHON SOURCE LINES 51-60

.. code-block:: default

    from joblib import Memory

    from gaitmap_datasets.sensor_position_comparison_2019 import SensorPositionComparison2019Segmentation

    dataset = SensorPositionComparison2019Segmentation(
        memory=Memory("../.cache"),
    )
    dataset

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    SensorPositionComparison2019Segmentation [14 groups/rows]

       participant
    0         4d91
    1         5047
    2         5237
    3         54a9
    4       6dbe_2
    5         6e2e
    6         80b8
    7         8873
    8         8d60
    9         9b4b
    10        c9bb
    11        cb3d
    12        cdfc
    13        e54d
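
The dataset object is a thin wrapper around this index table, so you can also inspect the index
programmatically, e.g. to build custom subsets.
A minimal sketch, assuming the `index` attribute of the underlying `tpcp` dataset class (which returns
a regular `pandas.DataFrame`):

.. code-block:: python

    # List all participant IDs programmatically.
    # Note: the `index` attribute is an assumption based on the tpcp Dataset API.
    participant_ids = dataset.index["participant"].tolist()
    print(participant_ids)  # ['4d91', '5047', ..., 'e54d']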


.. GENERATED FROM PYTHON SOURCE LINES 61-63

We can see that we have 14 participants.
Using the dataset class, we can select any subset of participants.

.. GENERATED FROM PYTHON SOURCE LINES 63-66

.. code-block:: default

    subset = dataset.get_subset(participant=["4d91", "5047"])
    subset

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    SensorPositionComparison2019Segmentation [2 groups/rows]

      participant
    0        4d91
    1        5047


.. GENERATED FROM PYTHON SOURCE LINES 67-69

Once we have selected the data we want to work with, we can iterate the dataset object to access
individual datapoints, or simply index it as below.

.. GENERATED FROM PYTHON SOURCE LINES 69-72

.. code-block:: default

    datapoint = subset[0]
    datapoint

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    SensorPositionComparison2019Segmentation [1 groups/rows]

      participant
    0        4d91
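
As mentioned above, instead of indexing, you can also iterate the (subset of the) dataset.
A minimal sketch of such a loop (each element is a single-row dataset object, just like `subset[0]`;
the `index` attribute is again an assumption based on the tpcp Dataset API):

.. code-block:: python

    # Iterate all datapoints in the subset; each `dp` behaves like the datapoint below.
    for dp in subset:
        print(dp.index["participant"].iloc[0])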


.. GENERATED FROM PYTHON SOURCE LINES 73-75

On this datapoint, we can now access the data.
We will start with the metadata.

.. GENERATED FROM PYTHON SOURCE LINES 75-77

.. code-block:: default

    datapoint.metadata

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    {'age': 28.0,
     'bmi': 24.69,
     'height': 180.0,
     'imu_tests': {'fast_10': {'start': '2019-05-29T11:56:00.131836000',
                               'start_idx': 37914,
                               'stop': '2019-05-29T11:56:29.545898000',
                               'stop_idx': 43938},
                   'fast_20': {'start': '2019-05-29T11:59:27.661132000',
                               'start_idx': 80416,
                               'stop': '2019-05-29T11:59:55.737304000',
                               'stop_idx': 86166},
                   'long': {'start': '2019-05-29T12:00:19.760742000',
                            'start_idx': 91086,
                            'stop': '2019-05-29T12:05:26.264648000',
                            'stop_idx': 153858},
                   'normal_10': {'start': '2019-05-29T11:53:28.408203000',
                                 'start_idx': 6841,
                                 'stop': '2019-05-29T11:54:12.934570000',
                                 'stop_idx': 15960},
                   'normal_20': {'start': '2019-05-29T11:57:03.544922000',
                                 'start_idx': 50901,
                                 'stop': '2019-05-29T11:57:38.671875000',
                                 'stop_idx': 58095},
                   'slow_10': {'start': '2019-05-29T11:54:38.159179000',
                               'start_idx': 21126,
                               'stop': '2019-05-29T11:55:34.155273000',
                               'stop_idx': 32594},
                   'slow_20': {'start': '2019-05-29T11:58:09.355468000',
                               'start_idx': 64379,
                               'stop': '2019-05-29T11:59:03.520507000',
                               'stop_idx': 75472}},
     'mocap_test_start': {'fast_10': '2019-05-29, 13:55:57',
                          'fast_20': '2019-05-29, 13:59:24',
                          'long': '2019-05-29, 14:00:16',
                          'normal_10': '2019-05-29, 13:53:25',
                          'normal_20': '2019-05-29, 13:57:00',
                          'slow_10': '2019-05-29, 13:54:35',
                          'slow_20': '2019-05-29, 13:58:06'},
     'sensors': {'back': 'A515',
                 'l_ankle': 'E901',
                 'l_cavity': '9CA5',
                 'l_heel': '598E',
                 'l_insole': 'EFCC',
                 'l_instep': 'D60A',
                 'l_lateral': 'EB9E',
                 'l_medial': '5D89',
                 'r_ankle': 'C9FB',
                 'r_cavity': '9710',
                 'r_heel': '36BD',
                 'r_insole': '6807',
                 'r_instep': '922A',
                 'r_lateral': '0D14',
                 'r_medial': '220F',
                 'sync': '323C'},
     'sex': 'male',
     'shoe_size': 42,
     'weight': 80.0}

.. GENERATED FROM PYTHON SOURCE LINES 78-80

Next, we can access the synchronised data of the individual sensors.
The data is stored as a multi-column pandas DataFrame with the time as index.

.. GENERATED FROM PYTHON SOURCE LINES 80-83

.. code-block:: default

    imu_data = datapoint.data
    imu_data.head()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    ________________________________________________________________________________
    [Memory] Calling gaitmap_datasets.sensor_position_comparison_2019._dataset._get_session_and_align...
    _get_session_and_align('4d91', data_folder=PosixPath('/home/arne/Documents/repos/work/projects/sensor_position_comparison/sensor_position_main_analysis/data/raw'))
    ____________________________________________get_session_and_align - 5.3s, 0.1min

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    back l_ankle l_cavity l_heel l_insole l_instep l_lateral l_medial r_ankle r_cavity r_heel r_insole r_instep r_lateral r_medial
    acc_x acc_y acc_z gyr_x gyr_y gyr_z acc_x acc_y acc_z gyr_x gyr_y gyr_z acc_x acc_y acc_z gyr_x gyr_y gyr_z acc_x acc_y acc_z gyr_x gyr_y gyr_z acc_x acc_y acc_z gyr_x gyr_y gyr_z acc_x acc_y acc_z gyr_x gyr_y gyr_z acc_x acc_y acc_z gyr_x gyr_y gyr_z acc_x acc_y acc_z gyr_x gyr_y gyr_z acc_x acc_y acc_z gyr_x gyr_y gyr_z acc_x acc_y acc_z gyr_x gyr_y gyr_z acc_x acc_y acc_z gyr_x gyr_y gyr_z acc_x acc_y acc_z gyr_x gyr_y gyr_z acc_x acc_y acc_z gyr_x gyr_y gyr_z acc_x acc_y acc_z gyr_x gyr_y gyr_z acc_x acc_y acc_z gyr_x gyr_y gyr_z
    0.000000 9.639655 1.003389 0.792358 0.964769 -4.226855 1.145244 9.616028 1.097543 0.580359 8.893348 1.039582 -1.149213 -0.478280 0.495052 9.670171 0.486887 -2.350048 9.478837 -2.798397 1.174054 9.245719 -2.238035 -1.056516 8.550122 -1.012983 0.107313 9.697630 -0.489173 -3.509086 9.284385 -5.138431 -0.283704 8.277701 -3.924461 -3.198059 8.274648 -1.584041 2.358008 9.093908 -1.885105 -0.843765 9.354360 -1.313708 0.462278 9.647076 1.090911 -2.628315 9.487558 9.655968 -0.533029 -1.641668 0.064293 -0.995854 -0.119135 -0.195298 -0.296407 9.780394 0.342168 -0.107218 -0.237374 -3.329367 -0.335009 9.173181 -0.171504 0.083816 -0.619364 -1.680654 0.531686 9.628885 -0.029296 0.536366 -0.144435 -5.199548 0.209451 8.330069 0.010661 -0.056726 -0.213001 -0.485401 -1.077195 9.734632 0.700920 0.381665 0.026233 -2.454875 0.861965 9.510098 -0.609000 0.054679 -0.377781
    0.004883 9.658913 0.979582 0.797465 1.263272 -3.808572 0.522217 9.592170 1.025593 0.604017 9.011317 1.163295 -0.720290 -0.525866 0.472277 9.593322 -0.229196 -2.149277 9.353990 -2.893315 1.141313 9.227531 -2.227447 -1.059012 9.090357 -1.131958 0.015675 9.578977 -1.346035 -3.336291 9.280680 -5.047247 -0.242298 8.443387 -4.221019 -3.006924 8.032200 -1.466738 2.326298 9.324606 -2.683556 -0.657678 8.923062 -1.320010 0.542008 9.675786 0.353176 -2.458574 9.013681 9.636703 -0.470721 -1.679168 0.122383 -0.996680 -0.242096 -0.194860 -0.267221 9.836840 0.098813 0.074041 -0.237625 -3.361520 -0.478899 9.229526 0.133922 0.145733 -0.682751 -1.629357 0.496934 9.557363 0.025199 0.051991 -0.020901 -5.204463 0.185407 8.287773 0.245606 -0.123868 0.092788 -0.523744 -1.062473 9.772863 0.142299 -0.232900 -0.442393 -2.435560 0.809754 9.552935 -0.239553 0.174794 -0.256035
    0.009766 9.644721 0.941690 0.817528 1.448687 -4.308459 -0.258831 9.710810 1.059832 0.632045 8.412018 1.347793 -0.906480 -0.349521 0.493113 9.724612 0.246885 -2.344160 9.354793 -2.847745 1.106559 9.346008 -2.667794 -1.175937 8.375414 -1.099034 0.097475 9.659951 -0.982458 -3.333560 8.977136 -5.047621 -0.184963 8.406099 -4.044298 -2.882318 7.787892 -1.578492 2.385069 9.448548 -2.378515 -0.843877 9.225552 -1.352227 0.484959 9.661543 0.227079 -2.277953 9.132425 9.603264 -0.585479 -1.644957 0.055563 -1.179022 -0.423820 -0.218909 -0.257937 9.832040 0.166820 -0.292649 0.012160 -3.347421 -0.340068 9.211550 0.195143 0.391584 -0.740335 -1.601073 0.486773 9.533437 0.266061 -0.254717 -0.143613 -5.199455 0.214274 8.400605 0.308364 0.126203 -0.030605 -0.480560 -1.067600 9.749025 0.578217 -0.231504 0.027214 -2.382656 0.847593 9.533862 -0.424711 0.174321 -0.136325
    0.014648 9.644446 0.970887 0.855183 1.448176 -4.482166 0.417053 9.634935 1.121579 0.589781 9.071769 0.670671 -1.025639 -0.454468 0.466338 9.637213 0.553151 -2.225334 8.928221 -3.013602 1.232312 9.292753 -1.704833 -1.421502 7.939957 -1.103957 0.126482 9.721759 -1.231116 -2.913909 8.974175 -4.904418 -0.373580 8.216127 -3.870531 -3.443963 7.843597 -1.518045 2.347116 9.195006 -2.432015 -0.654928 8.748984 -1.361561 0.475349 9.661571 0.532298 -2.148109 8.712761 9.613208 -0.609454 -1.607167 -0.131440 -1.727433 -0.237600 -0.228618 -0.248485 9.827244 -0.081777 0.071465 -0.236178 -3.316683 -0.482910 9.123911 0.197406 0.205761 -0.563970 -1.657266 0.497896 9.638724 0.457502 0.227155 -0.086887 -5.223509 0.257579 8.301641 0.369600 0.250148 0.091517 -0.475689 -1.057406 9.797073 0.455013 0.139391 0.265228 -2.353629 0.771404 9.614875 -0.363227 0.050310 -0.016760
    0.019531 9.759649 0.913256 0.836195 1.265814 -4.173974 0.593066 9.616217 1.126282 0.552007 8.112835 0.793633 -1.335435 -0.469232 0.442467 9.665183 0.125151 -2.403567 9.476217 -2.761227 1.129461 9.402465 -2.487435 -1.298131 8.311991 -1.089331 0.058657 9.545938 -0.860304 -3.332403 8.977536 -5.037178 -0.323512 8.494659 -4.407556 -3.497591 7.354362 -1.454504 2.318448 9.516196 -2.678787 -0.655022 8.684410 -1.427903 0.459928 9.661788 0.477007 -2.454248 8.416942 9.666087 -0.523495 -1.603864 -0.622239 -2.147860 -0.228380 -0.286172 -0.254087 9.808381 -0.259506 0.070568 0.011377 -3.298001 -0.517798 9.285508 -0.804771 0.182991 1.712054 -1.619186 0.530274 9.556883 0.208960 -0.009893 0.221172 -5.228498 0.209488 8.245288 0.130432 0.002707 0.215289 -0.509365 -1.080655 9.801870 0.516753 -0.107439 0.146249 -2.445217 0.842934 9.529151 0.007138 0.112506 -0.255844
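
Because the columns form a two-level structure (sensor, channel), selecting the data of a single sensor
is a single indexing step.
A minimal sketch, using only accessors already shown in this example:

.. code-block:: python

    # Select all channels of one sensor; the result is a plain (time x 6) DataFrame.
    insole_data = imu_data["l_insole"]
    # Converting to numpy drops the time index, so keep the sampling rate around.
    insole_array = insole_data.to_numpy()
    sampling_rate = datapoint.sampling_rate_hz  # 204.8 Hz for the IMUs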


.. GENERATED FROM PYTHON SOURCE LINES 84-89

Finally, we provide hand-labeled stride borders for the IMU data.
All strides are labeled based on the minima in the gyr_ml signal (see the dataset documentation for
more details).
Note that we use a trailing `_` to indicate that this is data calculated based on the ground
truth/manual labels and not just the IMU data.

.. GENERATED FROM PYTHON SOURCE LINES 89-92

.. code-block:: default

    segmented_stride_labels = datapoint.segmented_stride_list_["left"]
    segmented_stride_labels.head()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

          start   end
    s_id
    426    1907  2145
    427    2145  2373
    428    2373  2598
    429    2598  2824
    430    2824  3053
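
The borders are given in IMU samples.
A minimal sketch converting them into stride durations in seconds (using the `sampling_rate_hz`
attribute that is also used in the plotting code below):

.. code-block:: python

    # Stride borders are in IMU samples; divide by the sampling rate to get seconds.
    stride_durations_s = (
        segmented_stride_labels["end"] - segmented_stride_labels["start"]
    ) / datapoint.sampling_rate_hz
    print(stride_durations_s.head())  # roughly 1.1-1.2 s per stride here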


.. GENERATED FROM PYTHON SOURCE LINES 93-96

As an alternative to the `segmented_stride_list_`, we also provide the
`segmented_stride_list_per_sensor_`, which makes it easier to directly access the stride borders for a
specific sensor (note that the strides are the same as before, as only one stride list per foot
exists).

.. GENERATED FROM PYTHON SOURCE LINES 96-100

.. code-block:: default

    segmented_stride_labels = datapoint.segmented_stride_list_per_sensor_["l_insole"]
    segmented_stride_labels.head()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

          start   end
    s_id
    426    1907  2145
    427    2145  2373
    428    2373  2598
    429    2598  2824
    430    2824  3053


.. GENERATED FROM PYTHON SOURCE LINES 101-102

Below we plot the IMU data (gyro on top, acc on bottom) and the stride borders for a small part of the
data.

.. GENERATED FROM PYTHON SOURCE LINES 102-117

.. code-block:: default

    import matplotlib.pyplot as plt

    sensor = "l_insole"
    fig, axes = plt.subplots(2, 1, sharex=True, figsize=(10, 5))
    imu_data[sensor].filter(like="gyr").plot(ax=axes[0])
    imu_data[sensor].filter(like="acc").plot(ax=axes[1])

    for i, s in datapoint.segmented_stride_list_["left"].iterrows():
        s /= datapoint.sampling_rate_hz
        axes[0].axvspan(s["start"], s["end"], alpha=0.2, color="C1")
        axes[1].axvspan(s["start"], s["end"], alpha=0.2, color="C1")

    axes[0].set_xlim(300, 350)
    fig.tight_layout()
    fig.show()

.. image-sg:: /auto_examples/images/sphx_glr_sensor_position_comparison_2019_001.png
   :alt: sensor position comparison 2019
   :srcset: /auto_examples/images/sphx_glr_sensor_position_comparison_2019_001.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 118-127

SensorPositionComparison2019Mocap
=================================

For this version of the dataset, the data is split into the individual tests.
This means 7 data segments exist per participant.
For details about the respective tests, see the dataset documentation.

For each of these segments, the fully synchronised motion capture trajectories of all markers are
provided.
Further, we provide labels for IC (initial contact) and TC (terminal contact) derived from the motion
capture data for each of the hand-labeled strides within the segments.

.. GENERATED FROM PYTHON SOURCE LINES 127-134

.. code-block:: default

    from gaitmap_datasets.sensor_position_comparison_2019 import SensorPositionComparison2019Mocap

    dataset = SensorPositionComparison2019Mocap(
        memory=Memory("../.cache"),
    )
    dataset

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    SensorPositionComparison2019Mocap [98 groups/rows]

       participant       test
    0         4d91    fast_10
    1         4d91    fast_20
    2         4d91       long
    3         4d91  normal_10
    4         4d91  normal_20
    ..         ...        ...
    93        e54d       long
    94        e54d  normal_10
    95        e54d  normal_20
    96        e54d    slow_10
    97        e54d    slow_20

    [98 rows x 2 columns]
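
Since the index now has two levels, subsets can be created based on either of them.
A minimal sketch, assuming `get_subset` accepts the `test` column in the same way as the `participant`
column used earlier:

.. code-block:: python

    # Hypothetical example: select the "normal_10" test of all participants.
    normal_walks = dataset.get_subset(test=["normal_10"])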



.. GENERATED FROM PYTHON SOURCE LINES 135-136

We can see that an individual data point of this dataset is just one of the gait tests.

.. GENERATED FROM PYTHON SOURCE LINES 136-139

.. code-block:: default

    datapoint = dataset[0]
    datapoint

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    SensorPositionComparison2019Mocap [1 groups/rows]

      participant     test
    0        4d91  fast_10


.. GENERATED FROM PYTHON SOURCE LINES 140-143

We can access the entire trajectory of the motion capture markers for this segment.
Note that we don't provide mocap-derived ground truth for any spatial parameters, but assume that they
will be calculated from the trajectory depending on the task.

.. GENERATED FROM PYTHON SOURCE LINES 143-147

.. code-block:: default

    imu_data = datapoint.data
    mocap_traj = datapoint.marker_position_
    mocap_traj.head()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    ________________________________________________________________________________
    [Memory] Calling gaitmap_datasets.sensor_position_comparison_2019.helper.get_mocap_test...
    get_mocap_test('4d91', 'fast_10', data_folder=PosixPath('/home/arne/Documents/repos/work/projects/sensor_position_comparison/sensor_position_main_analysis/data/raw'))
    /home/arne/Documents/repos/private/gaitmap-datasets/.venv/lib/python3.8/site-packages/c3d/c3d.py:1219: UserWarning: No analog data found in file.
      warnings.warn('No analog data found in file.')
    ___________________________________________________get_mocap_test - 0.1s, 0.0min

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    l_fcc r_fcc l_toe r_toe l_fm5 l_fm1 r_fm1 r_fm5 l5 l_ias r_ias
    x y z x y z x y z x y z x y z x y z x y z x y z x y z x y z x y z
    time after start [s]
    0.00 33.188286 11.408942 0.037959 33.201908 11.528133 0.038243 32.917381 11.357433 0.054776 32.933617 11.550323 0.056874 32.974258 11.329062 0.048930 32.957886 11.409232 0.051247 32.970825 11.519496 0.054074 32.994427 11.596015 0.048777 33.159939 11.451368 1.046660 32.972012 11.333252 1.027096 32.978550 11.586587 1.021398
    0.01 33.188255 11.408989 0.037928 33.201885 11.528160 0.038260 32.917343 11.358250 0.055674 32.933586 11.550290 0.056909 32.974262 11.329080 0.048941 32.957943 11.409199 0.051245 32.970863 11.519358 0.054200 32.994404 11.596023 0.048770 33.159855 11.451330 1.046640 32.971905 11.333132 1.027082 32.978527 11.586611 1.021472
    0.02 33.188271 11.408986 0.037974 33.201900 11.528152 0.038310 32.917412 11.357396 0.054874 32.933598 11.550281 0.056861 32.974266 11.329084 0.048959 32.957935 11.409245 0.051261 32.970863 11.519274 0.054225 32.994404 11.596003 0.048777 33.159752 11.451270 1.046694 32.971832 11.333097 1.027076 32.978493 11.586695 1.021515
    0.03 33.188251 11.409006 0.038017 33.201874 11.528157 0.038336 32.917309 11.358207 0.055748 32.932625 11.549203 0.054874 32.974174 11.329082 0.049010 32.957901 11.409237 0.051280 32.970787 11.519297 0.054214 32.994392 11.596045 0.048861 33.159664 11.451239 1.046714 32.971771 11.333051 1.027099 32.978401 11.586411 1.021418
    0.04 33.188221 11.408930 0.038047 33.201878 11.528141 0.038334 32.917362 11.358143 0.055821 32.932652 11.549143 0.054953 32.974186 11.329026 0.049040 32.957932 11.409176 0.051335 32.970802 11.519186 0.054318 32.994438 11.596021 0.048868 33.159637 11.451154 1.046796 32.971657 11.333029 1.027109 32.978237 11.586364 1.021336
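
As a simple example of deriving a spatial quantity from the trajectory, the sketch below computes the
total displacement of the left heel marker over the test.
This is purely illustrative and assumes the marker positions are given in meters (as the axis labels of
the plots below suggest):

.. code-block:: python

    import numpy as np

    # Illustrative only: displacement of the left heel marker (l_fcc) between the
    # first and last mocap sample of the test.
    heel = mocap_traj["l_fcc"]
    displacement_m = np.linalg.norm(heel.iloc[-1] - heel.iloc[0])
    print(f"The heel marker moved {displacement_m:.2f} m over the test")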


.. GENERATED FROM PYTHON SOURCE LINES 148-156

We plot the data of the heel marker (fcc) and the IMU data of the insole sensor together to show that
they are synchronised.
Both data streams have the correct time axis, even though they are sampled at different rates (mocap is
sampled at 100 Hz and the IMU at 204.8 Hz).
Keep that in mind when working with the data without an index (e.g. after converting to numpy arrays).
To better visualize the data, we "normalize" the mocap data by subtracting the first position.
This way we can clearly see the individual strides.

.. GENERATED FROM PYTHON SOURCE LINES 156-165

.. code-block:: default

    fig, axes = plt.subplots(2, 1, sharex=True, figsize=(10, 5))
    imu_data["l_insole"].filter(like="gyr").plot(ax=axes[0])
    mocap_traj["l_fcc"].sub(mocap_traj["l_fcc"].iloc[0]).plot(ax=axes[1])
    axes[0].set_xlim(0, 7.5)
    axes[0].set_ylabel("IMU gyr [rad/s]")
    axes[1].set_ylabel("Marker Trajectory [m]")
    fig.tight_layout()
    fig.show()

.. image-sg:: /auto_examples/images/sphx_glr_sensor_position_comparison_2019_002.png
   :alt: sensor position comparison 2019
   :srcset: /auto_examples/images/sphx_glr_sensor_position_comparison_2019_002.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 166-167

Like before, we have access to the segmented strides (however, cut to the respective test region).

.. GENERATED FROM PYTHON SOURCE LINES 167-170

.. code-block:: default

    segmented_stride_labels = datapoint.segmented_stride_list_["left"]
    segmented_stride_labels.head()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

          start   end
    s_id
    490     347   547
    491     547   739
    492     739   934
    493     934  1133
    494    1133  1363


.. GENERATED FROM PYTHON SOURCE LINES 171-176

We can also access the labels for IC and TC.
Even though we used the hand-labeled strides as regions of interest for the event detection, the start
and end labels of the mocap event strides and the hand-labeled strides are not identical.
This is because the event list is provided in samples of the motion capture data, while the
hand-labeled strides are provided in samples of the IMU data.

.. GENERATED FROM PYTHON SOURCE LINES 176-179

.. code-block:: default

    event_labels = datapoint.mocap_events_["left"]
    event_labels.head()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

          start   end    ic    tc  min_vel
    s_id
    490     169   267   209   174      236
    491     267   360   306   272      329
    492     360   456   399   366      425
    493     456   553   496   461      522
    494     553   665   595   558      633
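
The relationship between the two sample domains is simply the ratio of the two sampling rates (100 Hz
for mocap vs. 204.8 Hz for the IMU).
A minimal sketch of the underlying arithmetic, checked against the numbers shown in this example (for
actual conversions, use the method shown below):

.. code-block:: python

    # Manual mocap-sample -> IMU-sample conversion (without padding):
    # start of stride 490: mocap sample 169 -> 169 / 100 * 204.8 = 346.1 -> 346.
    mocap_rate_hz, imu_rate_hz = 100, 204.8
    imu_sample = round(169 / mocap_rate_hz * imu_rate_hz)  # -> 346, as in the table below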


.. GENERATED FROM PYTHON SOURCE LINES 180-183

To avoid errors in conversions between the two domains (mocap/IMU), we provide the
`convert_events_with_padding` method to convert the event list.
(To understand why the method is called `..._with_padding`, see the section below.)

.. GENERATED FROM PYTHON SOURCE LINES 183-186

.. code-block:: default

    event_labels_in_imu = datapoint.convert_events_with_padding(event_labels, from_time_axis="mocap", to_time_axis="imu")
    event_labels_in_imu.head()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

          start   end    ic    tc  min_vel
    s_id
    490     346   547   428   356      483
    491     547   737   627   557      674
    492     737   934   817   750      870
    493     934  1133  1016   944     1069
    494    1133  1362  1219  1143     1296


.. GENERATED FROM PYTHON SOURCE LINES 187-192

Now you can see that the start and end labels are (almost) identical.
Remaining differences are due to rounding errors.
This is not ideal, but should not affect typical analyses.

Below, we plot the segmented strides and the IC and TC labels onto the gyr_y axis and the mocap z-axis
(foot lift).

.. GENERATED FROM PYTHON SOURCE LINES 192-243

.. code-block:: default

    fig, axes = plt.subplots(2, 1, sharex=True, figsize=(10, 5))
    gyr_y = imu_data["l_insole"]["gyr_y"]
    norm_mocap_z = mocap_traj["l_fcc"].sub(mocap_traj["l_fcc"].iloc[0])["z"]
    gyr_y.plot(ax=axes[0])
    norm_mocap_z.plot(ax=axes[1])

    event_labels_in_mocap = event_labels
    event_labels_times = datapoint.convert_events_with_padding(event_labels, from_time_axis="mocap", to_time_axis="time")
    event_labels_in_imu = datapoint.convert_events_with_padding(event_labels, from_time_axis="mocap", to_time_axis="imu")

    for i, s in event_labels_times.iterrows():
        axes[0].axvspan(s["start"], s["end"], alpha=0.2, color="C1")
        axes[1].axvspan(s["start"], s["end"], alpha=0.2, color="C1")

    axes[0].scatter(
        event_labels_times["ic"],
        gyr_y.iloc[event_labels_in_imu["ic"]],
        marker="s",
        color="k",
        zorder=10,
        label="IC",
    )
    axes[0].scatter(
        event_labels_times["tc"],
        gyr_y.iloc[event_labels_in_imu["tc"]],
        marker="o",
        color="C3",
        zorder=10,
        label="TC",
    )
    axes[1].scatter(
        event_labels_times["ic"],
        norm_mocap_z.iloc[event_labels_in_mocap["ic"]],
        marker="s",
        color="k",
        zorder=10,
        label="IC",
    )
    axes[1].scatter(
        event_labels_times["tc"],
        norm_mocap_z.iloc[event_labels_in_mocap["tc"]],
        marker="o",
        color="C3",
        zorder=10,
        label="TC",
    )
    axes[0].legend()
    axes[0].set_xlim(0, 7.5)
    axes[0].set_ylabel("IMU gyr [rad/s]")
    axes[1].set_ylabel("Marker Trajectory [m]")
    fig.tight_layout()
    fig.show()

.. image-sg:: /auto_examples/images/sphx_glr_sensor_position_comparison_2019_003.png
   :alt: sensor position comparison 2019
   :srcset: /auto_examples/images/sphx_glr_sensor_position_comparison_2019_003.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 244-255

Data padding
------------

One issue that you might run into when working with the mocap version of the dataset is that the start
of the test (which is used to cut the signal) is right at the beginning of the movement.
This means that algorithms that require a certain resting period (e.g. for gravity alignment) might not
work well.
Therefore, we provide a `data_padding_s` parameter that loads the given number of seconds before and
after the actual test.
On the time axis, we assign negative time stamps to all padded values before the actual test start.
This ensures that the time axes of the IMU data and the mocap data remain aligned, even though no mocap
data exists in the padded region.

.. GENERATED FROM PYTHON SOURCE LINES 255-263

.. code-block:: default

    dataset = SensorPositionComparison2019Mocap(
        memory=Memory("../.cache"),
        data_padding_s=3,
    )
    datapoint = dataset[0]
    imu_data = datapoint.data
    mocap_traj = datapoint.marker_position_

.. GENERATED FROM PYTHON SOURCE LINES 264-266

We can see that the data is now padded with 3 seconds before and after the test; however, no mocap
samples exist in these regions.

.. GENERATED FROM PYTHON SOURCE LINES 266-274

.. code-block:: default

    fig, axes = plt.subplots(2, 1, sharex=True, figsize=(10, 5))
    imu_data["l_insole"].filter(like="gyr").plot(ax=axes[0])
    mocap_traj["l_fcc"].sub(mocap_traj["l_fcc"].iloc[0]).plot(ax=axes[1])
    axes[0].set_ylabel("IMU gyr [rad/s]")
    axes[1].set_ylabel("Marker Trajectory [m]")
    fig.tight_layout()
    fig.show()

.. image-sg:: /auto_examples/images/sphx_glr_sensor_position_comparison_2019_004.png
   :alt: sensor position comparison 2019
   :srcset: /auto_examples/images/sphx_glr_sensor_position_comparison_2019_004.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 275-283

While the time axis of the IMU data is still aligned with the mocap data, care needs to be taken when
it comes to the event data.
Only events/labels provided in IMU samples (or with a time axis) respect the padding correctly.
For example, the `segmented_stride_list_` is provided in IMU samples, so it is padded correctly.

Note: The strides that are included in the segmented stride list will not change if you increase the
padding!
This means that if you increase the padding so that strides outside the selected gait tests are part of
the signal, they will not be included in the segmented stride list.

.. GENERATED FROM PYTHON SOURCE LINES 283-286

.. code-block:: default

    segmented_stride_labels = datapoint.segmented_stride_list_["left"]
    segmented_stride_labels.head()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

          start   end
    s_id
    490     961  1161
    491    1161  1353
    492    1353  1548
    493    1548  1747
    494    1747  1977
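
You can verify the padding directly from these numbers: all IMU-sample labels are shifted by exactly
the padding duration expressed in IMU samples.
A minimal sanity check based only on values shown in this example:

.. code-block:: python

    # With 3 s of padding at 204.8 Hz, labels shift by round(3 * 204.8) = 614 samples.
    padding_samples = round(3 * 204.8)  # 614
    # Stride 490 started at sample 347 without padding and at 961 with padding.
    assert 347 + padding_samples == 961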


.. GENERATED FROM PYTHON SOURCE LINES 287-289

However, to correctly transform it to the time domain, you need to manually add the padding time.
To avoid errors, we provide the `convert_events_with_padding` method that does this for you.

.. GENERATED FROM PYTHON SOURCE LINES 289-294

.. code-block:: default

    segmented_stride_labels_time = datapoint.convert_events_with_padding(
        segmented_stride_labels, from_time_axis="imu", to_time_axis="time"
    )
    segmented_stride_labels_time.head()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

             start       end
    s_id
    490   1.694336  2.670898
    491   2.670898  3.608398
    492   3.608398  4.560547
    493   4.560547  5.532227
    494   5.532227  6.655273


.. GENERATED FROM PYTHON SOURCE LINES 295-300

Values provided in mocap samples don't have any padding applied.
However, like with the `segmented_stride_list_`, you can use `convert_events_with_padding` to
transform them to IMU samples with correct padding.

First, without padding, in mocap samples:

.. GENERATED FROM PYTHON SOURCE LINES 300-303

.. code-block:: default

    event_labels_in_mocap = datapoint.mocap_events_["left"]
    event_labels_in_mocap.head()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

          start   end    ic    tc  min_vel
    s_id
    490     169   267   209   174      236
    491     267   360   306   272      329
    492     360   456   399   366      425
    493     456   553   496   461      522
    494     553   665   595   558      633


.. GENERATED FROM PYTHON SOURCE LINES 304-305

In IMU samples with padding:

.. GENERATED FROM PYTHON SOURCE LINES 305-310

.. code-block:: default

    event_labels_in_imu = datapoint.convert_events_with_padding(
        event_labels_in_mocap, from_time_axis="mocap", to_time_axis="imu"
    )
    event_labels_in_imu.head()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

          start   end    ic    tc  min_vel
    s_id
    490     960  1161  1042   970     1097
    491    1161  1351  1241  1171     1288
    492    1351  1548  1431  1364     1484
    493    1548  1747  1630  1558     1683
    494    1747  1976  1833  1757     1910


.. GENERATED FROM PYTHON SOURCE LINES 311-313

And in time (seconds) with padding:
Note that time 0 still corresponds to the actual test start (the padded region has negative time
stamps), so the first stride still starts at about 1.69 s, matching the unpadded time axis above.

.. GENERATED FROM PYTHON SOURCE LINES 313-318

.. code-block:: default

    event_labels_times = datapoint.convert_events_with_padding(
        event_labels_in_mocap, from_time_axis="mocap", to_time_axis="time"
    )
    event_labels_times.head()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

          start   end    ic    tc  min_vel
    s_id
    490    1.69  2.67  2.09  1.74     2.36
    491    2.67  3.60  3.06  2.72     3.29
    492    3.60  4.56  3.99  3.66     4.25
    493    4.56  5.53  4.96  4.61     5.22
    494    5.53  6.65  5.95  5.58     6.33


.. GENERATED FROM PYTHON SOURCE LINES 319-341

Other coordinate transformations
================================

For reference, here are visual representations of the coordinate system transformations used for all
the foot sensors.

.. figure:: /images/coordinate_systems/coordinate_transform_nilspodV1_instep_fraunhofer_qualisis.svg
    :alt: coordinate system definition instep
    :figclass: align-center

.. figure:: /images/coordinate_systems/coordinate_transform_nilspodV1_heel_fraunhofer_qualisis.svg
    :alt: coordinate system definition heel
    :figclass: align-center

.. figure:: /images/coordinate_systems/coordinate_transform_nilspodV1_lateral_fraunhofer_qualisis.svg
    :alt: coordinate system definition lateral
    :figclass: align-center

.. figure:: /images/coordinate_systems/coordinate_transform_nilspodV1_medial_fraunhofer_qualisis.svg
    :alt: coordinate system definition medial
    :figclass: align-center

.. figure:: /images/coordinate_systems/coordinate_transform_nilspodV1_cavity_fraunhofer_qualisis.svg
    :alt: coordinate system definition cavity
    :figclass: align-center

.. figure:: /images/coordinate_systems/coordinate_transform_nilspodV1_insoles_fraunhofer_qualisis.svg
    :alt: coordinate system definition insoles
    :figclass: align-center

.. rst-class:: sphx-glr-timing

**Total running time of the script:** ( 0 minutes 16.035 seconds)

**Estimated memory usage:** 690 MB

.. _sphx_glr_download_auto_examples_sensor_position_comparison_2019.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: sensor_position_comparison_2019.py <sensor_position_comparison_2019.py>`

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: sensor_position_comparison_2019.ipynb <sensor_position_comparison_2019.ipynb>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_