Exploring AR-enabled human–robot collaboration (HRC) system for exploratory collaborative assembly tasks

PUBLICATION DATE: 8 August 2025
PUBLICATION AUTHORS: Wei Win Loy, Jared Donovan, Markus Rittenbruch & Müge Belek Fialho Teixeira

Augmented reality (AR)-enabled human–robot collaboration (HRC) is emerging as a critical paradigm in architectural design and fabrication, particularly for supporting real-time interaction, creative agency, and situated decision-making. As collaborative robots (cobots) become more integrated into exploratory design workflows, AR offers a means to bridge the gap between robotic precision and human intuition. This paper investigates how AR interfaces can facilitate adaptive, embodied collaboration between designers and cobots in spatially unconstrained, exploratory assembly tasks. We developed and evaluated an AR-enabled HRC system across two user studies involving architectural designers. The system allows users to preview, modify, and execute cobotic actions within a shared workspace, incorporating dynamic visual feedback and real-time spatial tracking. Drawing on principles of situated cognition and interactive fabrication, we analyse how AR supports spatial awareness, enhances user agency, and enables intuitive, adaptive interactions. The findings reveal that AR interfaces contribute to HRC through three interconnected themes: (1) improving predictive coordination by externalising cobot intentions and constraints, (2) reinforcing user agency via real-time decision-making tools, and (3) scaffolding situated learning through adaptive visual feedback. We conclude by outlining three key future directions: expanding the spatial and structural complexity of AR-HRC systems, developing more nuanced models of user-cobot interaction in design contexts, and integrating real-time structural feedback to inform user decision-making.
