Online Identification of Interaction Behaviors from Haptic Data during Collaborative Object Transfer

Kucukyilmaz, Ayse and Issak, Illimar (2019) Online Identification of Interaction Behaviors from Haptic Data during Collaborative Object Transfer. IEEE Robotics and Automation Letters. p. 1. ISSN 2377-3774

Full content URL: http://doi.org/10.1109/LRA.2019.2945261

Documents

Accepted Manuscript (PDF): Kucukyilmaz-Humanoids19-OnlineClassification.pdf - Whole Document, 4MB
Item Type: Article
Item Status: Live Archive

Abstract

Joint object transfer is a complex task that is less structured and less specific than the tasks typically found in industrial settings. When two humans are involved in such a task, they cooperate through different modalities to understand the interaction states during operation and mutually adapt to one another's actions. Mutual adaptation implies that both partners can identify how well they collaborate (i.e. infer the interaction state) and act accordingly. These interaction states define whether the partners work in harmony, face conflicts, or remain passive during interaction. Understanding how two humans work together during physical interaction is important when exploring how a robotic assistant should operate under similar settings. This study is a first step toward an automatic classification mechanism that identifies the interaction state during ongoing object co-manipulation.
The classification is performed on a dataset collected from 40 subjects, paired to form 20 dyads. The dyads take part in a physical human-human interaction (pHHI) scenario in which they move an object in a haptics-enabled virtual environment to reach predefined goal configurations. We propose a sliding-window approach for feature extraction and demonstrate an online classification methodology to identify interaction patterns. We evaluate our approach using 1) a support vector machine classifier (SVMc) and 2) a Gaussian process classifier (GPc) for multi-class classification, and achieve over 80% accuracy with both classifiers when identifying general interaction types.
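The pipeline outlined in the abstract (fixed-length windows slid over the haptic time series, per-window features, and multi-class SVM and Gaussian process classifiers) could look roughly like the sketch below. This is a minimal illustration assuming scikit-learn; the window length, step size, feature statistics, channel count, and the three hypothetical interaction-state labels are assumptions for demonstration, not the authors' implementation, and synthetic data stands in for the dyadic haptic recordings.

```python
# Minimal sketch: sliding-window feature extraction over multichannel haptic
# data, followed by multi-class SVM and Gaussian process classification.
# All parameters and labels below are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline


def window_features(signal, window=100, step=25):
    """Slide a fixed-length window over a (samples x channels) time series and
    compute simple per-channel statistics (mean, std, range) for each window."""
    feats = []
    for start in range(0, len(signal) - window + 1, step):
        seg = signal[start:start + window]
        feats.append(np.concatenate([seg.mean(axis=0),
                                     seg.std(axis=0),
                                     seg.max(axis=0) - seg.min(axis=0)]))
    return np.asarray(feats)


# Synthetic stand-in for haptic channels (e.g. interaction forces, velocities),
# with one interaction-state label per window.
rng = np.random.default_rng(0)
raw = rng.normal(size=(5000, 4))          # 5000 samples, 4 haptic channels
X = window_features(raw)                  # one feature vector per window
y = rng.integers(0, 3, size=len(X))       # 3 hypothetical interaction states

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

svm_clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
gp_clf = make_pipeline(StandardScaler(), GaussianProcessClassifier(kernel=RBF()))

for name, clf in [("SVMc", svm_clf), ("GPc", gp_clf)]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", clf.score(X_te, y_te))
```

In an online setting, the same `window_features` computation would be applied to the most recent window of streaming haptic data and passed to the trained classifier at each step; the chosen step size trades off classification latency against computational load.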

Keywords: Classification, Feature Extraction, Force and Tactile Sensing, Haptics and Haptic Interfaces, Human Factors and Human-in-the-Loop, Learning and Adaptive Systems, Physical Human-Human Interaction, Physical Human-Robot Interaction, Pattern Recognition
Subjects: H Engineering > H670 Robotics and Cybernetics
G Mathematical and Computer Sciences > G760 Machine Learning
G Mathematical and Computer Sciences > G700 Artificial Intelligence
G Mathematical and Computer Sciences > G440 Human-computer Interaction
H Engineering > H671 Robotics
Divisions: College of Science > School of Computer Science
ID Code: 37631
Deposited On: 07 Oct 2019 08:35
