COVID-BEHAVE dataset

The COVID-BEHAVE dataset (measuring human behaviour during the COVID-19 pandemic) captures human behaviour holistically and longitudinally to illustrate the effects of enforced confinement. The study was conducted during the first lockdown of the COVID-19 pandemic, in which 21 healthy subjects from the Netherlands and Greece collected multimodal raw and processed data for two months from smartphone sensors, activity trackers, and responses to digital questionnaires. The data can be used to model human behaviour in a broad sense, as the dataset covers the physical, social, emotional, and cognitive domains. The dataset is not intended to support general claims about the effects of the pandemic at the population level, but rather to offer an exemplary perspective on a given group of people. Importantly, to our knowledge this is the first dataset combining passive sensing, experience sampling, and virtual assistants to study human behaviour dynamics in a prolonged lockdown situation.
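
For illustration, the Python sketch below shows one typical way of working with such multimodal data: aligning a passive sensing stream with experience-sampling responses on a common timeline. All column names, values, and scales here are synthetic stand-ins rather than the dataset's actual schema; consult the documentation on OSF for the real file layout.

    import pandas as pd

    # Synthetic stand-ins for two of the dataset's modalities: a passive
    # sensing stream (e.g., step counts from an activity tracker) and
    # experience-sampling responses. Real column names and scales differ;
    # check the OSF documentation.
    sensing = pd.DataFrame({
        "subject": ["s01", "s01", "s01"],
        "timestamp": pd.to_datetime(
            ["2020-04-01 09:00", "2020-04-01 12:00", "2020-04-01 18:00"]),
        "step_count": [1200, 3400, 6100],
    })
    ema = pd.DataFrame({
        "subject": ["s01", "s01"],
        "timestamp": pd.to_datetime(["2020-04-01 12:30", "2020-04-01 19:00"]),
        "mood": [3, 4],  # hypothetical 1-5 self-report scale
    })

    # Attach to each self-report the most recent sensor reading of the
    # same subject within the preceding two hours.
    merged = pd.merge_asof(
        ema.sort_values("timestamp"), sensing.sort_values("timestamp"),
        on="timestamp", by="subject",
        direction="backward", tolerance=pd.Timedelta("2h"),
    )
    print(merged)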

The dataset is available at the OSF Repository.


Use of this dataset in publications must be acknowledged by referencing the following publication:

Konsolakis, K., Banos, O., Cabrita, M., Hermens, H. COVID-BEHAVE dataset: measuring human behaviour during the COVID-19 pandemic. Scientific Data, pp. 1-20 (2022).

ATOPE+Breast dataset

The ATOPE+Breast dataset (ATOPE+ for patients with breast cancer) describes the daily status of 23 patients with breast cancer during a therapeutic exercise intervention, with daily measures of heart rate variability (HRV), self-reported wellness, physical activity, and sleep. In addition, the ATOPE+Breast dataset contains information about the training sessions, such as the recorded intensity, as well as demographic data, treatment details, initial evaluations of quality of life, physical activity levels, previous medical conditions, and risk factors.
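
As a pointer for working with the HRV measures, the sketch below computes RMSSD, a standard time-domain HRV summary; whether the dataset ships raw RR intervals or precomputed indices should be checked in its documentation, and the values below are synthetic.

    import numpy as np

    def rmssd(rr_ms):
        """Root mean square of successive differences of RR intervals (ms),
        a standard time-domain HRV summary."""
        diffs = np.diff(np.asarray(rr_ms, dtype=float))
        return float(np.sqrt(np.mean(diffs ** 2)))

    # Synthetic morning RR-interval series (ms); in practice these would
    # come from the dataset's daily HRV recordings.
    rr = [812, 798, 830, 845, 801, 790, 825, 810]
    print(f"RMSSD = {rmssd(rr):.1f} ms")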

The dataset is available at the Zenodo Repository.


Use of this dataset in publications must be acknowledged by referencing the following publication:

Moreno-Gutierrez, S., Postigo-Martin, P., Damas, M., Pomares, H., Banos, O., Arroyo-Morales, M., Cantarero-Villanueva, I. ATOPE+: an mHealth system to support personalized therapeutic exercise interventions in patients with cancer. IEEE Access, vol. 9, pp. 16878-16898 (2021).

CoVidAffect dataset

The CoVidAffect dataset comprises a longitudinal series of individual changes in subjective feeling (valence) and physical activation (arousal) collected during the COVID-19 lockdown in Spain. Participants across the country regularly reported these two fundamental dimensions of emotion via the project's website or through a smartphone app developed specifically for this purpose. As the lockdown was bound to affect each participant differently, depending on their particular context, the dataset also registers contextual information, such as socioeconomic status, living space, employment changes, and physical activity levels of each participant.
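
For illustration, the sketch below aggregates valence/arousal reports into daily means, the kind of countrywide mood trace this dataset supports. Column names and scales are synthetic stand-ins; take the actual ones from the dataset files.

    import pandas as pd

    # Synthetic stand-in for the longitudinal valence/arousal reports.
    reports = pd.DataFrame({
        "timestamp": pd.to_datetime(
            ["2020-03-20 10:00", "2020-03-20 18:00", "2020-03-21 11:00"]),
        "valence": [-0.2, 0.1, 0.3],  # subjective feeling
        "arousal": [0.5, 0.4, 0.2],   # physical activation
    })

    # Daily means trace the evolution of mood over the lockdown.
    daily = (reports.set_index("timestamp")[["valence", "arousal"]]
                    .resample("D").mean())
    print(daily)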

The dataset is available at the Zenodo Repository.


Use of this dataset in publications must be acknowledged by referencing the following publication:

Bailon, C., Goicoechea, C., Banos, O., Damas, M., Pomares, H., Correa, A., Sanabria, D., Perakakis, P. CoVidAffect, real-time monitoring of mood variations following the COVID-19 outbreak in Spain. Scientific Data, vol. 7, no. 365, pp. 1-10 (2020).

MHEALTH dataset

The MHEALTH (Mobile HEALTH) dataset comprises body motion and vital signs recordings of ten volunteers with diverse profiles while performing several physical activities. Sensors placed on each subject's chest, right wrist, and left ankle measure the motion experienced by the corresponding body parts, namely the acceleration, the rate of turn, and the magnetic field orientation. The sensor positioned on the chest also provides 2-lead ECG measurements, which can potentially be used for basic heart monitoring, checking for various arrhythmias, or examining the effects of exercise on the ECG.
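
For illustration, the following Python sketch loads one subject's recording, assuming the whitespace-separated mHealth_subject<N>.log layout of the UCI distribution (23 signal columns plus an activity label, sampled at 50 Hz); the file path and column names below are our own, so verify them against the dataset's README.

    import pandas as pd

    # Channel layout as described in the UCI README (verify before use):
    # 23 whitespace-separated signal columns plus an activity label,
    # sampled at 50 Hz. The column names below are our own.
    COLUMNS = [
        "acc_chest_x", "acc_chest_y", "acc_chest_z",
        "ecg_lead1", "ecg_lead2",
        "acc_ankle_x", "acc_ankle_y", "acc_ankle_z",
        "gyro_ankle_x", "gyro_ankle_y", "gyro_ankle_z",
        "mag_ankle_x", "mag_ankle_y", "mag_ankle_z",
        "acc_wrist_x", "acc_wrist_y", "acc_wrist_z",
        "gyro_wrist_x", "gyro_wrist_y", "gyro_wrist_z",
        "mag_wrist_x", "mag_wrist_y", "mag_wrist_z",
        "label",
    ]

    def load_subject(path):
        """Load one subject's log file into a labelled DataFrame."""
        df = pd.read_csv(path, sep=r"\s+", header=None, names=COLUMNS)
        df["t"] = df.index / 50.0  # time axis from the 50 Hz sampling rate
        return df

    data = load_subject("MHEALTHDATASET/mHealth_subject1.log")
    print(data["label"].value_counts().sort_index())  # label 0 = unannotated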

The dataset is available here and at the UCI Machine Learning Repository.

Note: the accompanying video illustrates the type of activities performed by the participants. Only the motion sensors are shown.


Use of this dataset in publications must be acknowledged by referencing the following publications:

Banos, O., Villalonga, C., Garcia, R., Saez, A., Damas, M., Holgado, J. A., Lee, S., Pomares, H., Rojas, I. Design, implementation and validation of a novel open framework for agile development of mobile health applications. BioMedical Engineering OnLine, vol. 14, no. S2:S6, pp. 1-20 (2015).

Banos, O., Garcia, R., Holgado, J. A., Damas, M., Pomares, H., Rojas, I., Saez, A., Villalonga, C. mHealthDroid: a novel framework for agile development of mobile health applications. 6th International Work-conference on Ambient Assisted Living and Active Ageing (IWAAL 2014), Belfast, Northern Ireland, December 2-5 (2014).

REALDISP dataset

The REALDISP (REAListic sensor DISPlacement) dataset was originally collected to investigate the effects of sensor displacement on the activity recognition process in real-world settings. It builds on the concepts of ideal-placement, self-placement, and induced-displacement. The ideal-placement and induced-displacement conditions represent extreme displacement variants and thus could serve as boundary conditions for recognition algorithms. In contrast, self-placement reflects a user's perception of how sensors could be attached, e.g., in a sports or lifestyle application. The dataset includes a wide range of physical activities (warm-up, cool-down, and fitness exercises), sensor modalities (acceleration, rate of turn, magnetic field, and quaternions), and participants (17 subjects). Apart from investigating sensor displacement, the dataset lends itself to benchmarking activity recognition techniques under ideal conditions.
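
For illustration, the sketch below loads one recording, assuming the 120-column layout given in the UCI description (two timestamp columns, nine sensors with 13 channels each, and a trailing activity label); the file name and column labels are our own, so verify them against the dataset documentation.

    import pandas as pd

    # Layout assumed from the UCI description (verify against the docs):
    # two timestamp columns (seconds, microseconds), then 9 sensors with
    # 13 channels each (3 acc, 3 gyro, 3 mag, 4 quaternion components),
    # and a trailing activity label: 120 columns in total.
    CHANNELS = ["acc_x", "acc_y", "acc_z", "gyr_x", "gyr_y", "gyr_z",
                "mag_x", "mag_y", "mag_z", "q0", "q1", "q2", "q3"]
    COLUMNS = (["t_sec", "t_usec"]
               + [f"s{s}_{c}" for s in range(1, 10) for c in CHANNELS]
               + ["label"])

    def load_recording(path):
        df = pd.read_csv(path, sep=r"\s+", header=None, names=COLUMNS)
        df["t"] = df["t_sec"] + df["t_usec"] * 1e-6  # single time axis
        return df

    # Hypothetical file name; scenarios cover ideal placement,
    # self-placement, and induced displacement.
    ideal = load_recording("realdisp/subject1_ideal.log")
    print(ideal.shape, sorted(ideal["label"].unique()))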

The dataset is available here and at the UCI Machine Learning Repository.



Use of this dataset in publications must be acknowledged by referencing the following publications:

Banos, O., Toth, M. A., Damas, M., Pomares, H., Rojas, I. Dealing with the effects of sensor displacement in wearable activity recognition. Sensors, vol. 14, no. 6, pp. 9995-10023 (2014).

Banos, O., Toth, M. A., Damas, M., Pomares, H., Rojas, I., Amft, O. A benchmark dataset to evaluate sensor displacement in activity recognition. 14th ACM International Conference on Ubiquitous Computing (Ubicomp 2012), Pittsburgh, USA, September 5-8 (2012).

Multimodal Kinect-IMU dataset

This dataset was originally collected to investigate transfer learning (see the reference below) between ambient and wearable sensing systems. Nevertheless, the dataset may also be used for gesture spotting and continuous activity recognition. It includes data for three activity recognition scenarios, namely HCI (gesture recognition), fitness (continuous recognition), and background (unrelated events). The dataset comprises the synchronized 3D coordinates of 15 body joints, measured by a vision-based skeleton tracking system (Microsoft Kinect), and the readings of 10 body-worn inertial measurement units (IMUs): acceleration, rate of turn, magnetic field, and orientation (quaternions).
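
To hint at the transfer-learning use case, the sketch below fits a linear MIMO mapping from the channels of one IMU to the 3D coordinates of one body joint. This only mirrors the idea of the reference below, not the paper's actual method, and all arrays are random stand-ins for time-aligned recordings.

    import numpy as np
    from sklearn.linear_model import Ridge

    # Random stand-ins for time-aligned recordings: 13 channels of one
    # IMU and the 3D coordinates of one Kinect joint. With the real
    # dataset, these arrays would hold synchronized samples.
    rng = np.random.default_rng(0)
    T = 1000
    imu = rng.standard_normal((T, 13))
    true_map = rng.standard_normal((13, 3))
    joint = imu @ true_map + 0.05 * rng.standard_normal((T, 3))

    # Fit the mapping on the first half and evaluate on the second.
    model = Ridge(alpha=1.0).fit(imu[:T // 2], joint[:T // 2])
    pred = model.predict(imu[T // 2:])
    rmse = float(np.sqrt(((pred - joint[T // 2:]) ** 2).mean()))
    print(f"hold-out RMSE: {rmse:.3f}")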

The dataset is available here.



Use of this dataset in publications must be acknowledged by referencing the following publication:

Banos, O., Calatroni, A., Damas, M., Pomares, H., Rojas, I., Troester, G., Sagha, H., Millan, J. del R., Chavarriaga, R., Roggen, D. Kinect=IMU? Learning MIMO Signal Mappings to Automatically Translate Activity Recognition Systems Across Sensor Modalities. 16th Annual International Symposium on Wearable Computers (ISWC 2012), Newcastle, United Kingdom, June 18-22 (2012).