
ARTICLE: The Art of Mechamimicry: Designing Prototyping Tools for Human-Robot Interaction

When we think about robotics, especially in high-stakes contexts like surgery, we often imagine advanced machines, complex algorithms, and high-tech labs. But sometimes the best way to design the future is with cardboard, PVC pipes, and a bit of puppetry.

In our recent research, presented at DIS 2025, we explored how embodied, low-fidelity prototyping can help bridge the gap between technical development and the lived realities of people who work with robots.

The Challenge

Robotic-Assisted Surgery (RAS) is a complex and dynamic human–robot interaction (HRI) setting. Surgeons rely on tacit, embodied knowledge built up over years of practice. Engineers bring deep technical expertise. Human factors specialists understand the cognitive and ergonomic limits of people in high-stress environments.

But bringing these perspectives together can be challenging, especially early in the design process. Too often, technical development runs ahead of human needs, leaving systems misaligned with real-world practice.

Our Approach: The Kinematic Puppet

To address this, we developed the kinematic puppet: a modular, tangible prototyping tool that allows users to physically “puppeteer” a robot arm without needing code or expensive equipment.

  • Built with 3D-printed joints, rotary encoders, and PVC linkages, it’s reliable, reusable, and reconfigurable.
  • It integrates with virtual simulation, so physical movements can be recorded and replayed as digital twins (see the sketch after this list).
  • It lowers the barrier for surgeons, engineers, and designers to experiment together, making abstract ideas concrete.
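
To make the record-and-replay idea concrete, here is a minimal sketch of how timestamped encoder readings could be captured and played back into a simulation. It is an illustration under our own assumptions: the joint count, sampling rate, and the read_encoder_angles / set_sim_joint_angles functions are hypothetical placeholders, not the puppet’s actual software.

```python
# Illustrative sketch only: record timestamped joint angles from the
# puppet's rotary encoders, then replay them in a simulated "digital twin".
# read_encoder_angles() and set_sim_joint_angles() are hypothetical
# placeholders for hardware and simulator bindings.
import time

NUM_JOINTS = 4          # assumed number of puppet joints
SAMPLE_RATE_HZ = 50     # assumed sampling rate


def read_encoder_angles():
    """Placeholder: return the current angle (in radians) of each encoder."""
    return [0.0] * NUM_JOINTS


def record(duration_s):
    """Capture a timestamped trajectory while the puppet is moved by hand."""
    trajectory, start = [], time.time()
    while time.time() - start < duration_s:
        trajectory.append((time.time() - start, read_encoder_angles()))
        time.sleep(1.0 / SAMPLE_RATE_HZ)
    return trajectory


def replay(trajectory, set_sim_joint_angles):
    """Feed the recorded angles back to a simulator at the original timing."""
    start = time.time()
    for t, angles in trajectory:
        while time.time() - start < t:
            time.sleep(0.001)
        set_sim_joint_angles(angles)  # e.g. a bridge into the virtual scene
```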

Through physical roleplay, the kinematic puppet lets participants test scenarios and explore ideas before committing to costly development.

Putting It to the Test

We trialled the kinematic puppet in a co-design workshop focused on revision hip surgery. Surgeons, engineers, and designers gathered around a low-fidelity anatomical model, simple props, and the kinematic puppet.

Through roleplay and bodystorming, participants experimented with:

  • Ergonomic tool grips (pen-style vs. drill-style).
  • Spatial layouts of surgical environments.
  • Cooperative control methods, such as axis locking and haptic boundaries (see the axis-locking sketch after this list).
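
As a rough illustration of one of these cooperative control methods, the sketch below expresses axis locking as a simple velocity projection. It is a conceptual example under our own assumptions, not the control code used in the workshop.

```python
# Illustrative sketch only: "axis locking" as a velocity projection. The
# commanded tool-tip velocity is projected onto one permitted axis, so the
# user can move the tool along that direction but not sideways.
import numpy as np


def lock_to_axis(commanded_velocity, axis):
    """Project a 3D velocity command onto a unit axis (e.g. the drill axis)."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    v = np.asarray(commanded_velocity, dtype=float)
    return np.dot(v, axis) * axis  # keep only the component along the axis


# Example: lock motion to the z-axis; sideways input is filtered out.
print(lock_to_axis([0.02, -0.01, 0.05], [0.0, 0.0, 1.0]))  # -> [0. 0. 0.05]
```

A haptic boundary could be treated in a similar spirit, for example by zeroing any velocity component that would carry the tool past a defined surface.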

Crucially, the kinematic puppet and surgical props helped surface tacit knowledge: things surgeons know through embodied practice but may struggle to articulate. Combined with simulation, we could capture, replay, and analyse these scenarios for further design development.

Key Takeaways

  1. Embodiment matters: In robotics design, haptic feedback, posture, and movement constraints cannot be understood through software alone. Tangible tools make this knowledge accessible.
  2. Hybrid methods are powerful: Combining physical roleplay with digital capture bridges creative ideation and technical precision.
  3. Collaboration is essential: Designers play a key role in facilitating conversations between different stakeholders, helping to translate knowledge and experience across disciplines.
  4. Low-fidelity ≠ low-value: Sometimes the simplest prototypes spark the richest insights.

Why This Matters Beyond Surgery

Although our case study focused on surgery, the approach has wider relevance. Any domain where humans and robots work together under constraints (manufacturing, logistics, aerospace) can benefit from accessible, embodied prototyping.

By lowering the technical threshold for participation, we can bring more voices into the design of future robotic systems. That means better alignment with real workflows, safer systems, and technology that truly supports human expertise.

A Call to Action

If you’re working in robotics, design, or any field where humans and machines collaborate:

  • Prototype early, prototype tangibly. Don’t wait for polished tech; start with what people can touch, move, and play with.
  • Value tacit knowledge. Invite practitioners to show you, not just tell you, how they work.
  • Think hybrid. Use both physical artefacts and digital tools to capture richer insights.

The future of robotics won’t just be written in code; it will be shaped through the creative, embodied practices that help us design with people, not just for them.

We would love to hear from others: how have you used tangible or embodied methods to explore technology design in your field?

PhD Research Spotlight: Yuan Liu on Enhancing Human-Robot Collaboration Through Augmented and Virtual Reality

Integrating collaborative robots (cobots) into human workspaces demands more than just technical precision; it requires human-centred design. PhD researcher Yuan Liu, based at Queensland University of Technology (QUT), is tackling this challenge through her project Enhancing Human Decision Making in Human-Robot Collaboration: The Role of Augmented Reality, part of the Designing Socio-Technical Robotic Systems program within the Australian Cobotics Centre.

Yuan’s research investigates the co-design and development of immersive visualisation approaches, including Augmented Reality (AR) and Virtual Reality (VR), to simulate, prototype, and evaluate human-robot collaboration (HRC) within real-world manufacturing environments. Her goal is to empower workers and decision-makers to better understand how cobots affect workflows, spatial layouts, and safety, ultimately improving acceptance and performance. 

By leveraging Extended Reality (XR) technologies, Yuan is working to enhance human decision-making before, during, and after cobot integration. Her aim is to create intuitive, interactive systems that help workers anticipate robot actions and develop AR-based design frameworks that optimise collaboration and safety. 

Her project aligns closely with the program’s mission to embed holistic design as a critical factor in the seamless integration of humans and machines. The broader goal is to improve working conditions, increase production efficiency, and foster workforce acceptance of robotic technologies. Yuan’s work contributes directly to these aims by developing a Human-Centred Design Process, AR-driven frameworks and design guidelines that place human experience at the centre of robotic system development. 

A key component of Yuan’s research is her industry placement with B&R Enclosures, where she is conducting fieldwork in their gasket room. Over the course of her PhD, she will spend 12 months on placement, collecting observational data, conducting interviews, and validating her design outcomes. This engagement ensures that her findings are relevant and transferable to industry.

The project is structured across several phases. It began with capturing 360-degree video footage of workers performing tasks in the gasket room, followed by detailed analysis of decision-making during these interactions. Yuan then conducted interviews with employees to gather self-reported insights into their decision-making processes. These findings are informing the development of an AR-based design prototype, tailored to enhance human understanding and collaboration with robots. The final phase focuses on knowledge transfer, ensuring that outcomes are shared with industry partners and the broader research community. 

Yuan’s academic journey reflects her interdisciplinary strengths. Before joining the Australian Cobotics Centre, she earned an MSc in Interactive Media from University College Cork (Ireland) in 2022, where she gained expertise in Multimedia Technologies and Human-Computer Interaction (HCI). Her academic path began with a Bachelor’s degree in Urban Planning from Southwest University (China), followed by several years of professional experience in landscape architecture and urban planning. This diverse background informs her approach to research, blending design thinking with technical innovation. 

Her work is supported by a multidisciplinary supervisory team, including Professor Glenda Caldwell, Professor Markus Rittenbruch, Associate Professor Müge Fialho Leandro Alves Teixeira, Dr Alan Burden, and Dr Matthias Guertler from UTS. She also collaborates closely with staff at B&R Enclosures, including Eric Stocker and Josiah Brooks, who facilitate access to the workplace and support her data collection efforts. 

By the end of her project in November 2026, Yuan aims to deliver a framework for understanding human behaviour and decision-making in manufacturing, a human-centred AR design approach for collaborative robotics, and guidelines for designing AR interfaces that optimise human-robot interaction. 

🔗 Read more about Yuan’s project on our website: https://www.australiancobotics.org/project/project-3-3-augmented-reality-in-collaborative-robotics/  

Meet our E.P.I.C. Researcher, Michelle Dunn

Michelle Dunn is a Research Program Co-lead in the Quality Assurance and Compliance Program where her research explores practical applications of robotics to improve everyday life. Her work spans manufacturing robotics and automation that make work easier, collaborative robotics that enable humans and robots to work safely side by side, and assistive technologies designed to support people in their daily activities.

We interviewed Michelle recently to find out more about why she does what she does.

Tell us a little about your role at the Australian Cobotics Centre.
What is your research focus and how does it contribute to the Centre’s mission?

I am the co-lead of Program 4 which looks at Quality Assurance and Compliance in Collaborative Robot scenarios. We focus on the outcomes of industrial and cobotic automation, ensuring that the solutions are working as intended.

What has been a highlight of your time with the Centre so far?
This could be a moment, a project, a collaboration, anything that stands out.

I particularly enjoyed the ACC 2024 Symposium in Brisbane. I think at the three-year mark everyone in the Centre was settled in, we knew each other, and we were showcasing some great solutions, so there were plenty of great conversations and ideas. I’m looking forward to the 2025 instalment.

What would you like your impact to be within the Centre and in the broader field of collaborative robotics?

Traditionally robots have focused on the three Ds – dirty, dangerous and dull jobs. Collaborative robots are designed to work with people, so we aren’t just looking at the 3-D jobs. Instead, we are looking at working with people in a broader variety of occupations. I’d like to see more cobots introduced into SMEs to assist with daily tasks and support small businesses – this is where we can make real impact.

What project or achievement are you most proud of in your career to date, and why?

What motivates me to do my job is having a direct impact on people’s lives – this is my goal in every project I’m part of. Some examples include: working as an automotive software engineer as my first job out of uni (there are cars in China that change gears based on the code that I wrote); my post-doc in vehicle crashworthiness (the value of which I unfortunately experienced firsthand); and educating literally thousands of students in Australia on how to design and build robots and become better engineers, which has flow-on effects to the whole community.

What do you find most rewarding about being part of the Australian Cobotics Centre?

The ACC is full of some of the best roboticists, engineers and researchers in Australia. It has been very rewarding to work with all these people, seeing how they think and fostering the development of the next batch of robotics thinkers. Even though we are separated across different parts of Australia, we find a way to make it work. ACC researchers are EPIC (Excellent. People-Centric. Innovative. Collaborative.)

If you could give an impromptu 1-hour talk on any topic outside your research, what would it be?

I am an avid maker and creator. I focus on textiles and, being an engineer, my creations tend to be quite mathematical and “constructed” (you should see my kids’ costumes for Book Week!). I love pushing the boundaries of existing techniques, old and new. I could definitely talk about the technology and mathematics of textile creation for hours!

ARTICLE: Project Introduction: What Is Trust, Really? Rethinking Human-Robot Interaction Through Design

While engineers often focus on functionality and reliability, real-world deployment reveals a deeper challenge: if people feel uneasy, confused, or disconnected when working with robots, adoption falters. This is especially true in manufacturing, where seamless human-robot collaboration is essential for productivity and safety. 

A new strategic project led by Dr Valeria Macalupu, postdoctoral researcher in QUT’s Human-Robot Interaction (HRI) program, aims to address this challenge by exploring how design affordances, such as physical form, materials, and expressive behaviours, can foster trust between humans and robots. Co-funded by the QUT Design Lab, the project is titled “What is trust really? Exploring Rich Interactions and Design Affordances in Human-Robot Interaction through Co-design.” 

Rather than treating trust as a by-product of performance, the project investigates how visual, tactile, and behavioural cues can signal intent, competence, and emotional safety. These cues are often overlooked in traditional engineering approaches but are critical to how people interpret and respond to robotic systems. 

The research will be developed and disseminated through a series of co-design workshops, iterative prototyping, and a public exhibition. The goal is to generate insights into how robots can be designed to feel more intuitive, approachable, and safe. 

For industry, this project offers practical design guidelines to help engineers and developers create robots that are not only functional but also intuitively acceptable to human users. In manufacturing contexts, this could mean smoother integration of collaborative robots, reduced training time, and improved worker satisfaction. 

More broadly, the project contributes to a growing body of research that sees trust not as a by-product of performance, but as a designable quality. By understanding how people interpret and respond to robots through their physical presence, this work supports the development of safer, more empathetic, and more effective robotic systems. 

Across the Centre, our postdoctoral research fellows lead a diverse range of projects, from long-term initiatives to shorter, more focused pieces of work. This latest project from Dr Macalupu exemplifies the kind of strategic, interdisciplinary work that drives innovation and impact across our programs. 

This project is jointly funded by the Australian Cobotics Centre and the QUT Design Lab.  

Read more about the project: Project 2.9: What is trust really? Exploring Rich Interactions and Design Affordances in Human-Robot Interaction through Co-design.

Get in touch with Valeria: v.chira@qut.edu.au   

Exploring Human-Robot Interaction: James Dwyer’s Seminar on Mechamimicry and Prototyping Tools

PhD researcher James Dwyer presented a seminar titled ‘The Art of Mechamimicry: Designing Prototyping Tools for Human-Robot Interaction’ for the QUT Centre for Robotics today.

Supervised by Jared Donovan, Markus Rittenbruch, Dr Valeria Macalupú, and Rafael Gomez FDIA, James explores how embodied, low-fidelity prototyping can make abstract HRI concepts tangible and accessible. Through a case study in Robotic-Assisted Surgery, he demonstrates how combining physical role-play with virtual simulation helps stakeholders externalise tacit knowledge and co-design better robotic systems.

With a background in Industrial Design and Psychology, James brings a unique perspective to collaborative robotics in both surgical and manufacturing contexts.

Read more about his project, based at QUT (Queensland University of Technology), here: https://lnkd.in/gZXCsyb9

Advancing Humanoid Robotics: Beyond Impressive Demonstrations

Humanoid robots have captured the public’s imagination with awe-inspiring demonstrations, from backflips and dances to folding laundry. These performances showcase highly sophisticated technology, but at the heart of these demos lies a crucial distinction that is often overlooked: what these robots can do in controlled environments is vastly different from what they can achieve in the real world.

At Swinburne University of Technology (SUT) and Queensland University of Technology (QUT), researchers are shifting the focus from simply impressing audiences to solving the real challenges that come with integrating humanoid robots into everyday environments. Their work aims to bridge the gap between performance-based robotics and true autonomy.

Performance vs. Perception in Humanoid Robots

Most advanced humanoid robot demonstrations rely heavily on predefined motion sequences, scripted environments, and external assistance. The robots perform complex tasks by following a series of carefully orchestrated routines, often created through motion capture, reinforcement learning, or imitation learning in simulated settings. While visually stunning, these robots lack the semantic understanding and real-time reasoning necessary for true independence.

In the real world—whether in homes, hospitals, or factories—robots need much more than impressive pre-programmed actions. They need to:

  • Understand the context of objects and their surroundings

  • Sense the environment in real-time

  • Adapt their control and actions to handle the unpredictable nature of the world

This ability to adapt and truly comprehend is one of the greatest challenges facing robotics today. It’s not enough for robots to simply perform well in rehearsed routines. They must also be able to perceive and respond to dynamic environments.

Collaboration for Real-World Applications

At SUT, the research is aimed at developing humanoid robots that can interact with the world as we do: through contextual awareness and adaptive control. The team is working to address the gap between what robots can perform and how they understand the world around them, an essential step toward making robots more practical for real-world applications.

In this exciting area of research, SUT is eager to work with industry partners to explore opportunities for research, development, and innovation. The goal is not just to impress but to create robots that can interpret and respond meaningfully to their environments, paving the way for their deployment in real-world scenarios.

As the Australian Cobotics Centre continues to explore the diverse capabilities of collaborative robots (cobots), we look forward to seeing how these advancements in humanoid robotics will contribute to the broader field. Whether in healthcare, manufacturing, or beyond, the future of humanoid robots is about intelligence, not just impressive demonstrations.

High Achiever HDR Student Award Recognition

In June 2025, Yuan Liu was honoured to receive the High Achiever HDR Student Award from the QUT Faculty of Engineering. This recognition is a testament to the dedication and hard work put into her research journey. She is deeply grateful to her principal supervisor, Professor Glenda Caldwell, for the nomination, and to Professor Markus Rittenbruch, Associate Professor Müge Belek Fialho Teixeira, Dr. Alan Burden, and Dr. Matthias Guertler for their ongoing support and guidance. This prestigious award serves as motivation for Yuan to continue advancing her research and pushing the boundaries of her work.

ARTICLE: Automating Asset Management Tasks in Factory Settings

As the Asset Management Council of Australia emphasizes, effective asset management provides organizations with insights across four critical domains: physical, information, financial, and intellectual assets. When implemented successfully, it enhances decision-making, strengthens operational resilience, and delivers long-term value. This capability serves as a key differentiator between agile, modern factories and slower, fragmented legacy operations.

The next frontier of asset management explores the shift from human-dependent monitoring to autonomous systems—where assets effectively manage themselves.

Under the Australian Cobotics Centre’s (ACC) Quality Assurance and Compliance program, researchers at the UTS:Robotics Institute are advancing this concept through the deployment of Boston Dynamics’ robotic platform, Spot. In this implementation, Spot autonomously navigates industrial environments, constructs a digital map, captures detailed imagery of equipment such as pipes, coils, and panels, and performs inspection tasks without human supervision.

Collaborative robots, or cobots, play a critical role in this evolving landscape. Designed to complement rather than replace human workers, cobots enhance manufacturing resilience by integrating human judgment with machine-level consistency and precision. Studies by [1] and [2] highlight the synergy of cobots and human teams in improving factory adaptability.

In this approach, robots such as Spot undertake detailed inspection routines, while humans execute follow-up tasks—tightening components, initiating repairs, or notifying human operators through shared digital interfaces. A supporting fleet management system, as outlined by [3], enables real-time tracking of cobot performance, usage, and maintenance status, effectively treating each cobot as a digital asset within the broader ecosystem.

In field trials, Spot demonstrated 90 minutes of continuous operation without a battery swap, covering 7.5 kilometers of factory floor—equivalent to 75,000 m². This capacity enables a single robot to replace multiple manual inspection rounds per shift in a typical Australian SME manufacturing facility.

A key innovation in the system is an integrated asset-management dashboard that connects directly to Spot’s API. By aggregating live telemetry, annotated inspection imagery, and equipment metadata, the dashboard eliminates the need for manual data entry. During initial deployments, Spot completed multi-kilometre inspection loops and uploaded over 6,000 annotated images per shift, delivering a comprehensive, timestamped view of equipment health. The dashboard’s real-time capabilities position it as a dynamic decision-support tool, advancing the transition from reactive maintenance to proactive, autonomous operations.
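
To give a sense of how such a dashboard pipeline might hang together, here is a minimal polling-loop sketch under our own assumptions. The endpoints, field names, and helper function are hypothetical; this is not the project’s actual dashboard code or the Boston Dynamics Spot API.

```python
# Illustrative sketch only: poll robot telemetry and push it, together with
# inspection-image metadata, to a dashboard back end. The URLs, field names,
# and get_latest_inspection_images() are hypothetical placeholders.
import time
import requests

ROBOT_TELEMETRY_URL = "http://robot-gateway.local/telemetry"   # assumed
DASHBOARD_INGEST_URL = "http://dashboard.local/api/ingest"     # assumed


def get_latest_inspection_images():
    """Placeholder: return metadata for images captured since the last poll."""
    return []


def poll_once():
    telemetry = requests.get(ROBOT_TELEMETRY_URL, timeout=5).json()
    payload = {
        "timestamp": time.time(),
        "battery_percent": telemetry.get("battery_percent"),
        "position": telemetry.get("position"),
        "images": get_latest_inspection_images(),
    }
    requests.post(DASHBOARD_INGEST_URL, json=payload, timeout=5)


if __name__ == "__main__":
    while True:           # a real deployment would add authentication,
        poll_once()       # retries, and error handling around these calls
        time.sleep(10)
```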

This trial represents a foundational step toward scaling autonomous survey robotics across industry. The integration of AI-driven perception, robotic mobility, and collaborative tasking is redefining the asset management paradigm.

The convergence of autonomous robotics, AI-powered vision systems, and collaborative machines signals a fundamental transformation in industrial asset management. Tasks once characterized by manual oversight are becoming intelligent, continuous processes. Initiatives such as those under the Australian Cobotics Centre offer a forward-looking model where factory systems are capable of sensing, interpreting, and responding with minimal human input—enabling safer, smarter, and more resilient operations across the manufacturing sector.

Citation: https://www.nbnco.com.au/blog/the-nbn-project/the-power-of-robotics-to-lift-digital-capability

[Image: Asset management dashboard]

[1]           J. Pizoń, M. Cioch, Ł. Kański, and E. Sánchez García, “Cobots Implementation in the Era of Industry 5.0 Using Modern Business and Management Solutions,” Adv. Sci. Technol. Res. J., vol. 16, no. 6, pp. 166–178, Dec. 2022, doi: 10.12913/22998624/156222.

[2]           A. R. Sadik and B. Urban, “An Ontology-Based Approach to Enable Knowledge Representation and Reasoning in Worker–Cobot Agile Manufacturing,” Future Internet, vol. 9, no. 4, Art. no. 4, Dec. 2017, doi: 10.3390/fi9040090.

[3]           B. I. Ismail, M. F. Khalid, R. Kandan, H. Ahmad, M. N. Mohd Mydin, and O. Hong Hoe, “Cobot Fleet Management System Using Cloud and Edge Computing,” in 2020 IEEE 7th International Conference on Engineering Technologies and Applied Sciences (ICETAS), Dec. 2020, pp. 1–5. doi: 10.1109/ICETAS51660.2020.9484266.

Celebrating Excellence in 3MT: Our Centre’s Outstanding Participants

We are thrilled to share the incredible achievements of our PhD researchers in the recent Three Minute Thesis (3MT®) competition! The 3MT is a prestigious event where researchers present their work in just three minutes, capturing the essence of their thesis in an engaging and accessible way.

Akash Hettiarachchi’s Win at the QUT Faculty of Business and Law

A massive congratulations to our PhD researcher, Akash Hettiarachchi, for winning the QUT (Queensland University of Technology) Faculty of Business and Law’s 3MT® Final! Akash’s presentation on the impact of collaborative robots (cobots) in enhancing diversity—particularly gender and generational diversity—in the manufacturing workforce was truly inspiring. This victory paves the way for Akash to advance to the Grand Final at the Graduate Research Student Showcase on 11th September at QUT. The winner of this final will represent Australia at the Asia-Pacific competition, where over 900 universities from 85 countries compete.

Akash is part of our Centre’s Human-Robot Workforce team, led by A/Prof Penny Williams, Prof Greg Hearn, and Postdoctoral Research Fellow Dr. Melinda Laundon. You can read more about Akash’s project on our website. We are incredibly proud of Akash and excited to support him in the next stage of his journey!

Recognising Other Remarkable Participants

We would also like to acknowledge the outstanding achievements of two of our researchers who participated in the 3MT competition:

  • Danial Rizvi, a PhD researcher from UTS, represented our Centre in the UTS competition. His insightful presentation was a reflection of his dedication to advancing knowledge in his field.

  • Jacob Dawson, an Associate PhD researcher, won the QUT Faculty of Engineering (FoE) 3MT® round and will join Akash in the QUT Grand Final on 11th September.

2025 HDR & Postdoc Winter Retreat: Career Pathways and Success Strategies

The 2025 HDR & Postdoc Winter Retreat brought together HDR students and early career researchers from across various disciplines, providing them with invaluable insights into shaping their future careers.

Day 1 – Defining Professional Identity and Exploring Careers Beyond Academia

Day 1 focused on preparing researchers for career pathways beyond their PhDs. Led by Associate Director for Research Training, Professor Glenda Caldwell, the day’s activities helped participants reconnect with the broader skills developed during their research journey. Workshops included:

  • Defining Professional Identity: Led by Karen Cavu (FHEA) and Glen Murphy (QUT Entrepreneurship), researchers learned how to articulate their professional identities, empowering them to recognize the value they can bring to various industries.

  • Informational Interviews & Networking: Dr Abigail Winter (SFHEA) and Karen Cavu emphasized the power of informational interviews, guiding researchers on how to leverage these conversations for career growth.

  • Industry Careers Panel: Featuring Dr Tom Williamson (Stryker), Dr Anjali Jaiprakash (Gelomics), and Dr Maria Hameed Khan (QUT Centre for Decent Work & Industry), the panel shared inspiring stories about transitioning from academia to industry, showcasing the diverse career paths available for researchers.

The day concluded with a relaxed networking session with drinks and pizza, providing an opportunity to connect and reflect on the day’s insights.

Day 2 – Research Career Success on Your Own Terms

The second day featured a full-day workshop led by Prof Inger Mewburn, also known as the Thesiswhisperer, titled “Research Career Success (on your own terms).” This workshop brought together researchers from various QUT research centres, focusing on:

  • The real career landscape for researchers today.

  • Exploring diverse career opportunities for PhD graduates.

  • Practical strategies for career development, including traditional networking and the use of cutting-edge AI tools.

Throughout the workshop, Inger gathered live feedback, which reflected the growing confidence and evolving career perspectives of the participants. The session provided both a reality check and an empowering look at the many career opportunities available to researchers.

A heartfelt thank you to everyone involved, especially Prof Inger Mewburn for leading such a transformative session. We look forward to putting these insights into action as we continue to support our researchers in their career development.