Haptic Human-Human Interaction – Behavior Recognition Dataset

Koc University Human-Human Interaction Behaviour Pattern Recognition Dataset

This repository contains labelled haptic interaction data collected from human-human dyads in a joint object manipulation scenario. We conducted an experimental study to generate data that can be used to identify human-human haptic interaction patterns and to learn models capturing the salient characteristics of dyadic interactions.

To identify human interaction patterns, we developed an application in which two human subjects interact in a virtual environment through the haptic channel using SensAble Phantom Premium haptic devices. The application requires the subjects to coordinate their actions to move a rectangular object together in a 2D maze-like environment:

Figure 1. Experimental setup (human-human interaction scenario)

Based on our interpretations of user interactions, we identified a set of interaction patterns that were frequently observed in our dyadic object manipulation task, and from these patterns we proposed a taxonomy of human interaction behaviors. The first layer of the taxonomy presents a very general categorization of any physical interaction involving multiple agents: an interaction-based perspective is adopted to classify the task as harmonious, conflicting, or neutral. The second layer is concerned with the “intentions” of the agents; in this sense, it is not related to the resulting motion of the object itself, but rather defines whether the agents’ motion plans agree or not. Finally, the last layer describes interaction patterns that are commonly encountered in our task. The proposed technique allows the recognition of interaction behaviors, which constitute the leaves of the following taxonomy tree:

Figure 2. Taxonomy of interaction patterns in dyadic object manipulation

During the experiment, the subjects are presented with two different scenes in order to observe interaction patterns in both translational and rotational motion. The first scene, hereafter called the straight scene, depicts a horizontal path, whereas the second, called the bifurcated scene, presents a fork-shaped path for the users to follow. To elicit different interaction patterns, we presented the subjects with different manipulation scenarios in which conflicts between partners are artificially invoked by providing each agent with different visual information about the location of the target configuration.

Please click on the following visuals for more information on the experimental scenarios in each scene:


Straight Scene Scenarios


Video 1. Experimental Scenarios in the Straight Scene


Bifurcated Scene Scenarios


Video 2. Experimental Scenarios in the Bifurcated Scene

Downloads


40 subjects (6 female and 34 male), aged between 21 and 29, participated in our study. The subjects were randomly divided into two groups to form dyads that worked as partners during the experiment. Hence, this dataset consists of data collected from 20 human-human dyads. For each dyad, the data is recorded at 1 kHz and stored as a Matlab struct.
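
As a quick orientation, the following minimal Matlab sketch inspects one dyad's recording. Note that the file name (dyad01.mat) and all field names are assumptions for illustration only; the actual naming conventions are documented in the Readme.txt included in the archive below.

    % Minimal sketch: inspect the variables stored for one dyad.
    % NOTE: 'dyad01.mat' and all field names here are hypothetical;
    % see Readme.txt in the archive for the actual layout.
    S  = load('dyad01.mat');        % each dyad's data is a Matlab struct
    fn = fieldnames(S);             % names of the stored variables
    d  = S.(fn{1});                 % the recorded struct itself
    disp(fieldnames(d));            % list the recorded signals

    % At a 1 kHz sampling rate, trial duration follows from the sample
    % count of any recorded signal, e.g. for a hypothetical field 'force':
    % duration_s = size(d.force, 1) / 1000;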

+ Raw Interaction Data
– Readme.txt (2.33 KB) [ download ]
– Koc_haptic_HHI_data.zip (2.44 GB) [ download ]
+ Videos
– Due to their large size, the videos are not available online. However, you can reconstruct the simulated trials by running this Matlab code.
+ Labelled Feature Sets
The feature sets are stored as Matlab .mat files, each of which is a 6×1 cell array. Cells 1 to 6 respectively contain instances from classes C1 to C6, as depicted in Fig. 2 above (a minimal loading sketch follows this list).
– Combined and Individual Feature Sets (16.7 MB) [ download ]
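
For reference, here is a minimal Matlab sketch of reading one of the feature set files and counting the instances per class. The file name 'combined_features.mat' is a placeholder; only the 6×1 cell array layout described above is assumed.

    % Minimal sketch: count instances per class in a labelled feature set.
    % NOTE: the file name below is a placeholder; each .mat file holds a
    % 6x1 cell array where cell k contains the instances of class Ck.
    S  = load('combined_features.mat');  % load the .mat file into a struct
    fn = fieldnames(S);                  % recover the stored variable name
    features = S.(fn{1});                % the 6x1 cell array
    for k = 1:numel(features)
        fprintf('Class C%d: %d instances\n', k, size(features{k}, 1));
    end

Assuming each cell stores one instance per row, the classes can then be stacked, e.g. X = vertcat(features{:}), to assemble a single training matrix over all six classes.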

Please note that this dataset is available for research purposes only. If you use the dataset, please cite the following paper:

Cigil E. Madan, Ayse Kucukyilmaz, T. Metin Sezgin, and Cagatay Basdogan. Recognition of Haptic Interaction Patterns in Dyadic Joint Object Manipulation. IEEE Transactions on Haptics, vol. 8, no. 1, pp. 54-66, Jan.-March 2015. doi: 10.1109/TOH.2014.2384049 [ Bibtex ]

The experimental procedure and the details of how the data was collected can be found in the aforementioned paper. Should you have any queries, please direct them to Ayse Kucukyilmaz via e-mail: a.kucukyilmaz [at] imperial.ac.uk

The dataset was collected at Koc University (Robotics and Mechatronics Laboratory and Intelligent User Interfaces Laboratory). The copyright of the data remains with this institution.


Copyright © 2014 Koc University. All rights reserved.


This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
