A Novel Haptic Feature Set for the Classification of Interactive Motor Behaviors in Collaborative Object Transfer

Koc University Human-Human Interaction Behaviour Pattern Recognition Dataset

This repository contains raw and labelled haptic interaction data collected from human-human dyads in a joint object manipulation scenario.

In our study*, we report evidence that harmonious and conflicting behavior patterns in physical human-human interaction (pHHI) can be recognized using haptic information alone, without the need to track the manipulated object. To this end, we designed an experimental study in which two human partners physically interact with each other to manipulate an object (see Fig. 1).

*Al-saadi, Z., Sirintuna, D., Kucukyilmaz, A., Basdogan, C., 2021, “A Novel Haptic Feature Set for the Classification of Interactive Motor Behaviors in Collaborative Object Transfer”, IEEE Transactions on Haptics, DOI: 10.1109/TOH.2020.3034244

Fig. 1. (a) Two humans cooperate to manipulate an object from one location to another. (b) and (c) are the top and the front views of the manipulated object, respectively. The object coordinate frame is shown in red. The numbers 1, 2, and 3 in (b) highlight the handles, the F/T sensors, and the ArUco marker, respectively.
During the experiment, the dyad was instructed to move the object collaboratively in mid-air to visit a set of target configurations, reachable through primitive translational and rotational movements. We artificially created harmonious and conflicting behaviors by providing one of the partners with different target configurations. The role of this partner was always played by the experimenter. In contrast, the subjects’ task sequence was always constant and they were asked to: (1) lift the object up (U), (2) rotate it counterclockwise about the x-axis (R_x^+), (3) translate it along the negative z-axis onto the red target configuration marked on the table (T_z^-), (4) translate it along the y-axis to reach the black target configuration (T_y^+), (5) rotate it clockwise about the x-axis (R_x^-), and (6) put it down (D).
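For reference, the subject’s fixed task sequence can be written compactly in the R/T notation of Fig. 2. The Python sketch below is purely illustrative and is not part of the released code; the tuple encoding and the variable name are assumptions.

    # Illustrative encoding of the subject's fixed task sequence as
    # (action, axis, direction) tuples; names are assumptions, not part
    # of the dataset or the released code.
    SUBJECT_SEQUENCE = [
        ("U", None, None),   # lift the object up
        ("R", "x", "+"),     # rotate counterclockwise about the x-axis
        ("T", "z", "-"),     # translate along the negative z-axis (red target)
        ("T", "y", "+"),     # translate along the y-axis (black target)
        ("R", "x", "-"),     # rotate clockwise about the x-axis
        ("D", None, None),   # put the object down
    ]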

The experimenter, on the other hand, was given different target configurations, defining six different scenarios (Figure 2 shows the six scenarios (Sc1-Sc6) followed by the experimenter, together with a representative experimenter-subject scenario (Sc3) and the expected behaviors). Each of these six scenarios is designed to elicit one or more of the following four interaction patterns (a minimal label encoding is sketched after the list):

  1. Harmonious Translation (HT)
  2. Conflicting Translation (CT)
  3. Harmonious Rotation (HR)
  4. Conflicting Rotation (CR)
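For illustration only, these four labels can be encoded as classifier targets, for example with a small Python enum; the class and member names below are assumptions and are not part of the dataset.

    # Illustrative label encoding for the four interaction patterns; not part
    # of the dataset or the released code.
    from enum import Enum

    class InteractionPattern(Enum):
        HT = "Harmonious Translation"
        CT = "Conflicting Translation"
        HR = "Harmonious Rotation"
        CR = "Conflicting Rotation"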

Figure 2. (a.I-a.VI) Experimenter’s (E: Exp) scenarios Sc1-Sc6, (b.I) Subject’s (S: Sub) scenario, consisting of the task sequence {U, R_x^+, T_z^-, T_y^+, R_x^-, D}, where U: lift the object Up, D: put the object Down, R: Rotation, T: Translation. Superscripts and subscripts for R and T refer to the direction and axis of the transformation, respectively. For simplicity U and D are omitted from the figure, (b.II) Individual steps of actions in Experimenter’s scenario Sc3: {U, R_x^+, T_z^-, T_y^+, D}, (b.III) Expected interaction patterns generated due to the interplay between the Subject’s and the Experimenter’s actions. The red and black dots represent the intermediate and final object configurations as shown to the subject. Dashed arrows represent the partners’ intentions, whereas solid arrows represent their actions.
Utilizing the recorded haptic data (i.e., forces and torques applied by the partners on the object during the experiment), we define haptic-based features that characterize each interaction pattern. Using a random forest classifier, we verified that haptic information was sufficient to differentiate between the interaction patterns.
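As a rough, non-authoritative sketch of this classification step (not the authors’ implementation), the Python snippet below loads a labelled feature set and fits a scikit-learn random forest. The file name, the variable names "features" and "labels", and the train/test split are all assumptions.

    # Hedged sketch: train a random forest on a labelled haptic feature set.
    # File and variable names are assumptions about the data layout.
    from scipy.io import loadmat
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    mat = loadmat("labelled_features.mat")   # hypothetical file name
    X = mat["features"]                      # hypothetical: N x 30 feature matrix
    y = mat["labels"].ravel()                # hypothetical: N interaction-pattern labels

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_tr, y_tr)
    print("test accuracy:", accuracy_score(y_te, clf.predict(X_te)))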

Downloads

We conducted a study with 12 subjects (8 males and 4 females; average age 28.4 ± 7.0 years). Each dyad (i.e., experimenter-subject pair) performed 12 trials, in which each of the scenarios Sc1-Sc6 was executed twice. The data were recorded at a frequency of 1 kHz.

Raw Interaction Data

Labelled Feature Sets

The feature sets are stored as MATLAB *.mat files; each file contains 30 columns storing the haptic-based features.
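If you want to read these files outside MATLAB, a minimal Python sketch using SciPy is shown below; the file name is a placeholder, and the actual variable names inside the *.mat files may differ.

    # Inspect a feature-set file without MATLAB; the file name is a placeholder.
    from scipy.io import loadmat, whosmat

    fname = "feature_set.mat"                # placeholder name
    print(whosmat(fname))                    # list stored variables, shapes, dtypes
    mat = loadmat(fname)
    for key, value in mat.items():
        if not key.startswith("__"):         # skip MATLAB header entries
            print(key, getattr(value, "shape", type(value)))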

Animation

Based on the collected data, you can animate the experiment using the following dataset and animation code: