MHEALTH dataset

The MHEALTH (Mobile HEALTH) dataset comprises body motion and vital signs recordings for ten volunteers of diverse profiles while performing several physical activities. Sensors placed on each subject's chest, right wrist and left ankle measure the motion experienced by these body parts, namely the acceleration, the rate of turn and the magnetic field orientation. The sensor positioned on the chest also provides 2-lead ECG measurements, which can potentially be used for basic heart monitoring, checking for various arrhythmias or studying the effects of exercise on the ECG.

The dataset is available here and at the UCI Machine Learning Repository.
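For orientation, the sketch below shows one way the per-subject log files could be loaded with Python/pandas. The file name and column order are assumptions based on the usual distribution layout (whitespace-separated files with 23 sensor channels followed by an activity label); verify them against the copy you download.

```python
import pandas as pd

# Assumed channel order (check against the dataset's README):
# chest acceleration (3), ECG leads (2), left-ankle acceleration/gyro/
# magnetometer (9), right-wrist acceleration/gyro/magnetometer (9), label (1).
COLUMNS = [
    "chest_acc_x", "chest_acc_y", "chest_acc_z",
    "ecg_lead1", "ecg_lead2",
    "ankle_acc_x", "ankle_acc_y", "ankle_acc_z",
    "ankle_gyro_x", "ankle_gyro_y", "ankle_gyro_z",
    "ankle_mag_x", "ankle_mag_y", "ankle_mag_z",
    "wrist_acc_x", "wrist_acc_y", "wrist_acc_z",
    "wrist_gyro_x", "wrist_gyro_y", "wrist_gyro_z",
    "wrist_mag_x", "wrist_mag_y", "wrist_mag_z",
    "activity_label",
]

def load_subject(path):
    """Load one subject's whitespace-separated log file as a labelled DataFrame."""
    return pd.read_csv(path, sep=r"\s+", header=None, names=COLUMNS)

df = load_subject("mHealth_subject1.log")  # illustrative file name
df = df[df["activity_label"] != 0]         # label 0 typically marks the null class
print(df["activity_label"].value_counts())
```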



Use of this dataset* in publications must be acknowledged by referencing the following publications:

Banos, O., Villalonga, C., Garcia, R., Saez, A., Damas, M., Holgado, J. A., Lee, S., Pomares, H., Rojas, I. Design, implementation and validation of a novel open framework for agile development of mobile health applications. BioMedical Engineering OnLine, vol. 14, no. S2:S6, pp. 1-20 (2015).

Banos, O., Garcia, R., Holgado, J. A., Damas, M., Pomares, H., Rojas, I., Saez, A., Villalonga, C. mHealthDroid: a novel framework for agile development of mobile health applications. Proceedings of the 6th International Work-conference on Ambient Assisted Living and Active Ageing (IWAAL 2014), Belfast, Northern Ireland, December 2-5 (2014).

REALDISP dataset

The REALDISP (REAListic sensor DISPlacement) dataset was originally collected to investigate the effects of sensor displacement on the activity recognition process in real-world settings. It builds on the concepts of ideal-placement, self-placement and induced-displacement. The ideal-placement and induced-displacement conditions represent extreme displacement variants and can thus serve as boundary conditions for recognition algorithms. In contrast, self-placement reflects a user's perception of how sensors could be attached, e.g., in a sports or lifestyle application. The dataset includes a wide range of physical activities (warm-up, cool-down and fitness exercises), sensor modalities (acceleration, rate of turn, magnetic field and quaternions) and participants (17 subjects). Apart from investigating sensor displacement, the dataset lends itself to benchmarking activity recognition techniques under ideal conditions.

The dataset is available here and at the UCI Machine Learning Repository.
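As a starting point, the sketch below loads one recording into a per-sensor array. The layout is an assumption (two timestamp columns, then 9 sensors with 13 channels each: 3 acceleration, 3 rate of turn, 3 magnetic field and 4 quaternion, followed by a label column), as is the file name; check both against the dataset documentation.

```python
import numpy as np

N_SENSORS, CH_PER_SENSOR = 9, 13  # assumed: 3 acc + 3 gyro + 3 mag + 4 quat

def load_log(path):
    """Load one REALDISP-style recording into (timestamps, sensors, labels)."""
    data = np.loadtxt(path)
    timestamps = data[:, :2]                           # sec, usec (assumed)
    sensors = data[:, 2:2 + N_SENSORS * CH_PER_SENSOR]
    labels = data[:, -1].astype(int)
    # sensors[t, s, c] = channel c of sensor s at sample t
    sensors = sensors.reshape(len(data), N_SENSORS, CH_PER_SENSOR)
    return timestamps, sensors, labels

ts, X, y = load_log("subject1_ideal.log")  # illustrative file name
print(X.shape, np.unique(y))
```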


Use of this dataset* in publications must be acknowledged by referencing the following publications:

Banos, O., Toth, M. A., Damas, M., Pomares, H., Rojas, I. Dealing with the effects of sensor displacement in wearable activity recognition. Sensors, vol. 14, no. 6, pp. 9995-10023 (2014).

Banos, O., Toth, M. A., Damas, M., Pomares, H., Rojas, I., Amft, O. A benchmark dataset to evaluate sensor displacement in activity recognition. Proceedings of the 14th International Conference on Ubiquitous Computing (Ubicomp 2012), Pittsburgh, USA, September 5-8 (2012).

Multimodal Kinect-IMU dataset

This dataset was originally collected to investigate transfer learning (see the reference below) between ambient sensing and wearable sensing systems. Nevertheless, it may also be used for gesture spotting and continuous activity recognition. It includes data for three activity recognition scenarios, namely HCI (gesture recognition), fitness (continuous recognition) and background (unrelated events). The dataset comprises the synchronized 3D coordinates of 15 body joints, measured by a vision-based skeleton tracking system (Microsoft Kinect), and the readings of 10 body-worn inertial measurement units (IMUs): acceleration, rate of turn, magnetic field and orientation (quaternions).

The dataset is available here.
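Because the dataset combines two modalities, some channel bookkeeping helps. The sketch below splits a flat sample vector into a Kinect view and an IMU view; the dimensions follow from the description above (15 joints x 3 coordinates, and 10 IMUs with 3 acceleration + 3 rate of turn + 3 magnetic field + 4 quaternion channels each), but the flat column layout itself is hypothetical, so adapt it to the actual file format.

```python
import numpy as np

N_JOINTS = 15                 # Kinect skeleton joints (3D coordinates each)
N_IMUS, IMU_CH = 10, 13       # 3 acc + 3 gyro + 3 mag + 4 quaternion per IMU
KINECT_DIM = N_JOINTS * 3     # 45 values
IMU_DIM = N_IMUS * IMU_CH     # 130 values

def split_sample(row):
    """Split one flat synchronized sample into Kinect and IMU views
    (assumes Kinect coordinates come first; hypothetical layout)."""
    row = np.asarray(row, dtype=float)
    assert row.size == KINECT_DIM + IMU_DIM
    joints = row[:KINECT_DIM].reshape(N_JOINTS, 3)   # (x, y, z) per joint
    imus = row[KINECT_DIM:].reshape(N_IMUS, IMU_CH)  # one row per IMU
    return joints, imus

joints, imus = split_sample(np.zeros(KINECT_DIM + IMU_DIM))
print(joints.shape, imus.shape)  # (15, 3) (10, 13)
```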


Use of this dataset* in publications must be acknowledged by referencing the following publication:

Banos, O., Calatroni, A., Damas, M., Pomares, H., Rojas, I., Troester, G., Sagha, H., Millan, J. del R., Chavarriaga, R., Roggen, D. Kinect=IMU? Learning MIMO Signal Mappings to Automatically Translate Activity Recognition Systems Across Sensor Modalities. Proceedings of the 16th Annual International Symposium on Wearable Computers (ISWC 2012), Newcastle, United Kingdom, June 18-22 (2012).


* I would appreciate it if you sent me an email (oresti.bl@gmail.com) reporting any publication that uses these datasets.