Robotic Intention Visualisation in Human-Robot Collaboration
Complex working environments, characterized by high-stakes tasks and the occurrence of unforeseen events, pose challenges to the introduction of robotic systems for the automation of complex procedures. The next frontier of robotics is Human-Robot Collaboration (HRC), rather than full automation (Johansen et al., 2022). A tighter integration of robots in the workplace requires innovative ideas to effectively reconfigure practices without compromising the correct execution of work routines or disrupting productivity. To sustain the practice of human teams, the design of robotic systems should follow a user-centered approach that takes into account contextual factors and the characteristics of end users.
This calls for a deeper investigation of how the features of robotic platforms are determined, looking at how robotic output is displayed at the level of the human-machine interface. The signaling of robotic intention should reflect the internal state of the technology in a transparent and unequivocal way and accommodate the contingent needs of the receiver. Conveying a correct mental model of how the technology functions would improve users' sense of control and encourage them to adopt robots as reliable co-workers.
The specific focus of Project 2.1 is the representation of robotic intentions in the design of custom Human-Machine Interfaces, enabling humans and robots to collaborate on high-stakes tasks in complex environments. The ultimate scientific purpose is the augmentation of the human workforce through a human-centered approach to automation, involving end users throughout the design process.
Research activities
This PhD addresses the lack of studies exploring robot communication in-the-wild to test the feasibility of signaling modalities in relation to situational constraints (Walker et al., 2018). Inspired by the StEER method (Pelikan, Porfirio, & Winkle, 2023), a Research-Through-(Participatory)-Design approach will be adopted. The final goal is to develop potential configurations of a user-centered and context-sensitive Human-Machine Interface and to compare their usability through mixed-methods user testing.
Principal Supervisor: Prof Markus Rittenbruch
Associated Researchers