ARTICLE: The Art of Mechamimicry: Designing Prototyping Tools for Human-Robot Interaction

When we think about robotics, especially in high-stakes contexts like surgery, we often imagine advanced machines, complex algorithms, and high-tech labs. But sometimes the best way to design the future is with cardboard, PVC pipes, and a bit of puppetry.

In our recent research, presented at DIS 2025, we explored how embodied, low-fidelity prototyping can help bridge the gap between technical development and the lived realities of people who work with robots.

The Challenge

Robotic-Assisted Surgery (RAS) is a complex and dynamic human–robot interaction (HRI) setting. Surgeons rely on tacit, embodied knowledge built up over years of practice. Engineers bring deep technical expertise. Human factors specialists understand the cognitive and ergonomic limits of people in high-stress environments.

But bringing these perspectives together can be challenging, especially early in the design process. Too often, technical development runs ahead of human needs, leaving systems misaligned with real-world practice.

Our Approach: The Kinematic Puppet

To address this, we developed the kinematic puppet: a modular, tangible prototyping tool that allows users to physically “puppeteer” a robot arm without needing code or expensive equipment.

  • Built with 3D-printed joints, rotary encoders, and PVC linkages, it’s reliable, reusable, and reconfigurable.
  • It integrates with virtual simulation, so physical movements can be recorded and replayed as digital twins.
  • It lowers the barrier for surgeons, engineers, and designers to experiment together, making abstract ideas concrete.

Through physical roleplay, it allows participants to test scenarios and explore ideas before committing to costly development.
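
To make the record-and-replay idea concrete, here is a minimal sketch of how encoder readings from a puppet like this could be logged and replayed as a simple planar "digital twin". The class name, link lengths, and two-link planar kinematics are illustrative assumptions, not the actual implementation described in the paper.

```python
import math
from dataclasses import dataclass, field

@dataclass
class PuppetRecorder:
    """Illustrative sketch: log joint angles sampled from a puppet's rotary
    encoders, then replay them through planar forward kinematics."""
    link_lengths: list                      # one link length per joint (metres)
    frames: list = field(default_factory=list)

    def record(self, joint_angles):
        """Store one sampled frame of joint angles (radians)."""
        self.frames.append(list(joint_angles))

    def end_effector(self, joint_angles):
        """Planar forward kinematics: accumulate joint angles along the chain."""
        x = y = theta = 0.0
        for angle, length in zip(joint_angles, self.link_lengths):
            theta += angle
            x += length * math.cos(theta)
            y += length * math.sin(theta)
        return (x, y)

    def replay(self):
        """Return the end-effector path traced by the recorded frames."""
        return [self.end_effector(f) for f in self.frames]

# Example: a two-link arm puppeteered through three poses
rec = PuppetRecorder(link_lengths=[0.3, 0.25])
rec.record([0.0, 0.0])
rec.record([math.pi / 4, 0.0])
rec.record([math.pi / 4, math.pi / 4])
path = rec.replay()  # digital-twin trace of the physical gesture
```

A real system would sample encoders continuously and drive a 3D simulation, but the core loop is the same: capture joint state, replay it through a kinematic model.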

Putting It to the Test

We trialled the kinematic puppet in a co-design workshop focused on revision hip surgery. Surgeons, engineers, and designers gathered around a low-fidelity anatomical model, simple props, and the kinematic puppet.

Through roleplay and bodystorming, participants experimented with:

  • Ergonomic tool grips (pen-style vs. drill-style).
  • Spatial layouts of surgical environments.
  • Cooperative control methods (e.g. axis locking, haptic boundaries).
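
As a sketch of what those cooperative control methods mean in practice, the snippet below shows the two ideas in their simplest form. The function names and the axis-aligned safe region are hypothetical, chosen only to illustrate axis locking and a haptic-boundary-style clamp.

```python
def apply_axis_lock(delta, locked_axes):
    """Axis locking: zero out a commanded (dx, dy, dz) step along any
    axis the surgeon has locked, so the tool cannot drift off that axis."""
    axes = ('x', 'y', 'z')
    return tuple(0.0 if a in locked_axes else d for a, d in zip(axes, delta))

def clamp_to_boundary(pos, lo, hi):
    """Haptic-boundary analogue: keep a position inside an axis-aligned
    safe box by clamping each coordinate to [lo, hi]."""
    return tuple(max(l, min(h, p)) for p, l, h in zip(pos, lo, hi))

# Lock the z axis while drilling, and keep the tool inside a safe region
step = apply_axis_lock((1.0, -0.5, 2.0), locked_axes={'z'})
pos = clamp_to_boundary((0.12, 0.30, -0.05), lo=(0, 0, 0), hi=(0.1, 0.5, 0.2))
```

In a real system the boundary would render as a resistive force rather than a hard clamp, but the control logic participants were roleplaying reduces to constraints like these.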

Crucially, the kinematic puppet and surgical props helped surface tacit knowledge: things surgeons know through embodied practice but may struggle to articulate. Combined with simulation, we could capture, replay, and analyse these scenarios for further design development.

Key Takeaways

  1. Embodiment matters: In robotics design, haptic feedback, posture, and movement constraints cannot be understood through software alone. Tangible tools make this knowledge accessible.
  2. Hybrid methods are powerful: Combining physical roleplay with digital capture bridges creative ideation and technical precision.
  3. Collaboration is essential: Designers play a key role in facilitating conversations between different stakeholders, helping translate knowledge and experience between disciplines.
  4. Low-fidelity ≠ low-value: Sometimes the simplest prototypes spark the richest insights.

Why This Matters Beyond Surgery

Although our case study focused on surgery, the approach has wider relevance. Any domain where humans and robots work together under constraints (manufacturing, logistics, aerospace) can benefit from accessible, embodied prototyping.

By lowering the technical threshold for participation, we can bring more voices into the design of future robotic systems. That means better alignment with real workflows, safer systems, and technology that truly supports human expertise.

A Call to Action

If you’re working in robotics, design, or any field where humans and machines collaborate:

  • Prototype early, prototype tangibly. Don’t wait for polished tech; start with what people can touch, move, and play with.
  • Value tacit knowledge. Invite practitioners to show you, not just tell you, how they work.
  • Think hybrid. Use both physical artefacts and digital tools to capture richer insights.

The future of robotics won’t just be written in code; it will be shaped through the creative, embodied practices that help us design with people, not just for them.

We would love to hear from others: how have you used tangible or embodied methods to explore technology design in your field?

PhD Research Spotlight: Yuan Liu on Enhancing Human-Robot Collaboration Through Augmented and Virtual Reality

Integrating collaborative robots (cobots) into human workspaces demands more than just technical precision; it requires human-centered design. PhD researcher Yuan Liu, based at Queensland University of Technology (QUT), is tackling this challenge through her project Enhancing Human Decision Making in Human-Robot Collaboration: The Role of Augmented Reality, part of the Designing Socio-Technical Robotic Systems program within the Australian Cobotics Centre.

Yuan’s research investigates the co-design and development of immersive visualisation approaches, including Augmented Reality (AR) and Virtual Reality (VR), to simulate, prototype, and evaluate human-robot collaboration (HRC) within real-world manufacturing environments. Her goal is to empower workers and decision-makers to better understand how cobots affect workflows, spatial layouts, and safety, ultimately improving acceptance and performance. 

By leveraging Extended Reality (XR) technologies, Yuan is working to enhance human decision-making before, during, and after cobot integration. Her aim is to create intuitive, interactive systems that help workers anticipate robot actions and develop AR-based design frameworks that optimise collaboration and safety. 

Her project aligns closely with the program’s mission to embed holistic design as a critical factor in the seamless integration of humans and machines. The broader goal is to improve working conditions, increase production efficiency, and foster workforce acceptance of robotic technologies. Yuan’s work contributes directly to these aims by developing a Human-Centred Design Process, AR-driven frameworks and design guidelines that place human experience at the centre of robotic system development. 

A key component of Yuan’s research is her industry placement with B&R Enclosures, where she is conducting fieldwork in their gasket room. Over the course of her PhD, she will spend 12 months on placement, collecting observational data, conducting interviews, and validating her design outcomes. This engagement ensures that her findings are relevant and transferable to industry.

The project is structured across several phases. It began with capturing 360-degree video footage of workers performing tasks in the gasket room, followed by detailed analysis of decision-making during these interactions. Yuan then conducted interviews with employees to gather self-reported insights into their decision-making processes. These findings are informing the development of an AR-based design prototype, tailored to enhance human understanding and collaboration with robots. The final phase focuses on knowledge transfer, ensuring that outcomes are shared with industry partners and the broader research community. 

Yuan’s academic journey reflects her interdisciplinary strengths. Before joining the Australian Cobotics Centre, she earned an MSc in Interactive Media from University College Cork (Ireland) in 2022, where she gained expertise in Multimedia Technologies and Human-Computer Interaction (HCI). Her academic path began with a Bachelor’s degree in Urban Planning from Southwest University (China), followed by several years of professional experience in landscape architecture and urban planning. This diverse background informs her approach to research, blending design thinking with technical innovation. 

Her work is supported by a multidisciplinary supervisory team, including Professor Glenda Caldwell, Professor Markus Rittenbruch, Associate Professor Müge Fialho Leandro Alves Teixeira, Dr Alan Burden, and Dr Matthias Guertler from UTS. She also collaborates closely with staff at B&R Enclosures, including Eric Stocker and Josiah Brooks, who facilitate access to the workplace and support her data collection efforts. 

By the end of her project in November 2026, Yuan aims to deliver a framework for understanding human behaviour and decision-making in manufacturing, a human-centred AR design approach for collaborative robotics, and guidelines for designing AR interfaces that optimise human-robot interaction. 

🔗 Read more about Yuan’s project on our website: https://www.australiancobotics.org/project/project-3-3-augmented-reality-in-collaborative-robotics/