
ARTICLE: Making Cobots Ready-to-Hand: A Compliance Perspective 

Written by Katia Bourahmoune, UTS, Acting Co-Lead of the Quality Assurance and Compliance program

Heidegger describes equipment as ready-to-hand when it disappears into practice, when its use is so seamlessly integrated that it ceases to be an object of thought and becomes instead a transparent extension of action. A hammer is not noticed as a hammer when it drives a nail effectively; it is only when it splinters or slips that it becomes an object of scrutiny, unready-to-hand, its use suddenly in question. In modern manufacturing, collaborative robots (cobots) occupy an uneasy position between these two states. They promise repeatability, precision, and tireless monitoring, yet they remain machines to be supervised, audited, and monitored. In compliance and quality assurance, this human oversight of machines is necessary. After all, compliance remains the most human part of the hyper-mechanised modern manufacturing process. This is particularly evident in heavily regulated industries such as medical device manufacturing, aviation, and defence, where errors are measured not only in costs but in lives and national security.

For cobots to become ready-to-hand, they must be genuinely collaborative: partners in the task rather than peripheral machinery. While collaboration in the context of human-robot interaction is hard to define and evolves as the field advances, it is useful to frame it by the level of interaction between a human and a robot. These levels range from co-existence (shared space, individual actions), to co-operation (shared space, human-guided actions), to collaboration (shared space, joint bi-directional actions). Collaboration through this lens implies shared situational awareness, legible intent, and adaptive action: the robot exposes what it “perceives” (vision, force, …), why it is acting (constraints, goals, …), and how humans can adapt, override, or teach. Such interfaces must preserve human agency and skilled technique while reducing ergonomic and cognitive load. In practice, this means adaptive assistance that yields to expert touch, explanations of proposed actions, and workflows that keep responsibility distributed rather than displaced. When collaboration works this way, it does more than improve throughput; it establishes the preconditions for assurance to be intrinsic rather than supervisory. On this foundation, compliance happens by design: assurance embedded in action rather than appended after it. Cobots can inspect as they assemble, verify as they position, and generate audit-grade evidence as a by-product of normal operation. They can extend human judgment through continuous monitoring, allowing human inspectors to concentrate on exceptions, interpretation, and continuous improvement.

This human-robot collaboration fundamentally hinges on trust. In production, workers must believe that a cobot will act predictably and safely; in quality assurance, they must also believe that the cobot’s monitoring and record-keeping are accurate and transparent. Research on automation psychology shows the dangers of both extremes: over-trust leads to blind reliance, while under-trust leads to redundancy and disuse. The literature points to several routes to calibrated trust, including reliable and predictable performance, timely feedback, options for human override, transparent explanations of decisions, and auditable records tied to actions; here we emphasise the compliance-critical elements of legibility, traceability, and contestability. Trust, then, is not an abstract sentiment but a design commitment: when cobots make their intentions legible and their decisions contestable, human operators retain meaningful agency in the loop. This keeps human judgment engaged precisely where it adds the most value. In regulated settings, it turns assurance into a shared practice rather than a supervisory afterthought, and it reorients collaboration toward preserving and amplifying human skill rather than displacing it.

Concerns are often raised that automation “deskills” human labour, relegating workers to passive supervision. Cobots designed for compliance offer the opposite prospect. By taking on repetitive inspection tasks, cobots free human expertise for higher-order judgment: interpreting anomalies, adapting processes, and innovating in response to unforeseen conditions. The skill does not vanish; it is re-centred where it matters most. In this way, cobots do not merely preserve skill but actively sustain it, ensuring that human judgment remains the decisive element in compliance.

The Compliance and Quality Assurance program at the Australian Cobotics Centre aims to develop practical tools that specify, monitor and evaluate human–robot collaboration using multi-modality sensing and AI for assessing compliance.

When cobots are truly ready-to-hand, i.e. useful, trustworthy, and engineered for compliance-by-design, they cease to be mere machines and become true collaborators that elevate human skill while making quality an intrinsic property of every human–robot action.

 

Further reading:  

Heidegger, M. (1962). Being and time (J. Macquarrie & E. Robinson, Trans.). New York, NY: Harper & Row.

Guertler, M., Tomidei, L., Sick, N., Carmichael, M., Paul, G., Wambsganss, A., … & Hussain, S. (2023). When is a robot a cobot? Moving beyond manufacturing and arm-based cobot manipulators. Proceedings of the Design Society, 3, 3889-3898. https://doi.org/10.1017/pds.2023.390  

Hancock, P. A., Billings, D. R., & Schaefer, K. E. (2011). A meta-analysis of factors affecting trust in human-robot interaction. Human Factors, 53(5), 517–527.  https://doi.org/10.1177/0018720811417254  

Carmichael, M. (2023). Can we Unlock the Potential of Collaborative Robots?. Australian Cobotics Centre. https://www.australiancobotics.org/articles/can-we-unlock-the-potential-of-collaborative-robots/  

 

 

ARTICLE: The Art of Mechamimicry: Designing Prototyping Tools for Human-Robot Interaction

When we think about robotics, especially in high-stakes contexts like surgery, we often imagine advanced machines, complex algorithms, and high-tech labs. But sometimes the best way to design the future is with cardboard, PVC pipes, and a bit of puppetry.

In our recent research, presented at DIS 2025, we explored how embodied, low-fidelity prototyping can help bridge the gap between technical development and the lived realities of people who work with robots.

The Challenge

Robotic-Assisted Surgery (RAS) is a complex and dynamic human–robot interaction (HRI) setting. Surgeons rely on tacit, embodied knowledge built up over years of practice. Engineers bring deep technical expertise. Human factors specialists understand the cognitive and ergonomic limits of people in high-stress environments.

But bringing these perspectives together can be challenging, especially early in the design process. Too often, technical development runs ahead of human needs, leaving systems misaligned with real-world practice.

Our Approach: The Kinematic Puppet

To address this, we developed the kinematic puppet: a modular, tangible prototyping tool that allows users to physically “puppeteer” a robot arm without needing code or expensive equipment.

  • Built with 3D-printed joints, rotary encoders, and PVC linkages, it’s reliable, reusable, and reconfigurable.
  • It integrates with virtual simulation, so physical movements can be recorded and replayed as digital twins.
  • It lowers the barrier for surgeons, engineers, and designers to experiment together, making abstract ideas concrete.

Through physical roleplay, it allows participants to test scenarios and explore ideas before committing to costly development.
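For readers curious how a recorded physical manipulation can become a digital replay, here is a minimal sketch of the idea, assuming a planar arm, illustrative link lengths, and a hypothetical encoder resolution (none of these reflect the actual puppet’s specification): each encoder frame is converted to joint angles and run through simple forward kinematics to recover the arm’s pose.

```python
# Minimal sketch: replaying recorded encoder frames as a digital-twin pose.
# Link lengths, encoder resolution, and the planar joint layout are
# illustrative assumptions, not the kinematic puppet's actual build.
import numpy as np

LINK_LENGTHS_M = [0.30, 0.25, 0.15]   # assumed PVC linkage lengths
COUNTS_PER_REV = 1024                 # assumed rotary-encoder resolution

def counts_to_radians(counts: np.ndarray) -> np.ndarray:
    """Convert raw encoder counts to joint angles in radians."""
    return counts * (2.0 * np.pi / COUNTS_PER_REV)

def forward_kinematics(joint_angles) -> list:
    """Planar forward kinematics: (x, y) of each joint, base at the origin."""
    points, x, y, heading = [(0.0, 0.0)], 0.0, 0.0, 0.0
    for theta, length in zip(joint_angles, LINK_LENGTHS_M):
        heading += theta                      # angles accumulate along the chain
        x += length * np.cos(heading)
        y += length * np.sin(heading)
        points.append((x, y))
    return points

# Replay one recorded frame (illustrative raw counts) as a twin pose.
frame = np.array([256, -128, 64])
print(forward_kinematics(counts_to_radians(frame)))
```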

Putting It to the Test

We trialled the kinematic puppet in a co-design workshop focused on revision hip surgery. Surgeons, engineers, and designers gathered around a low-fidelity anatomical model, simple props, and the kinematic puppet.

Through roleplay and bodystorming, participants experimented with:

  • Ergonomic tool grips (pen-style vs. drill-style).
  • Spatial layouts of surgical environments.
  • Cooperative control methods (e.g. axis locking, haptic boundaries).

Crucially, the kinematic puppet and surgical props helped surface tacit knowledge: things surgeons know through embodied practice but may struggle to articulate. Combined with simulation, we could capture, replay, and analyse these scenarios for further design development.

Key Takeaways

  1. Embodiment matters: In robotics design, haptic feedback, posture, and movement constraints cannot be understood through software alone. Tangible tools make this knowledge accessible.
  2. Hybrid methods are powerful: Combining physical roleplay with digital capture bridges creative ideation and technical precision.
  3. Collaboration is essential: Designers play a key role in facilitating conversations between stakeholders, helping translate knowledge and experience between disciplines.
  4. Low-fidelity ≠ low-value: Sometimes the simplest prototypes spark the richest insights.

Why This Matters Beyond Surgery

Although our case study focused on surgery, the approach has wider relevance. Any domain where humans and robots work together under constraints (manufacturing, logistics, aerospace) can benefit from accessible, embodied prototyping.

By lowering the technical threshold for participation, we can bring more voices into the design of future robotic systems. That means better alignment with real workflows, safer systems, and technology that truly supports human expertise.

A Call to Action

If you’re working in robotics, design, or any field where humans and machines collaborate:

  • Prototype early, prototype tangibly. Don’t wait for polished tech; start with what people can touch, move, and play with.
  • Value tacit knowledge. Invite practitioners to show you, not just tell you, how they work.
  • Think hybrid. Use both physical artefacts and digital tools to capture richer insights.

The future of robotics won’t just be written in code; it will be shaped through the creative, embodied practices that help us design with people, not just for them.

We would love to hear from others: how have you used tangible or embodied methods to explore technology design in your field?

PhD Research Spotlight: Yuan Liu – Enhancing Human-Robot Collaboration Through Augmented and Virtual Reality

Integrating collaborative robots (cobots) into human workspaces demands more than just technical precision; it requires human-centred design. PhD researcher Yuan Liu, based at Queensland University of Technology (QUT), is tackling this challenge through her project Enhancing Human Decision Making in Human-Robot Collaboration: The Role of Augmented Reality, part of the Designing Socio-Technical Robotic Systems program within the Australian Cobotics Centre.

Yuan’s research investigates the co-design and development of immersive visualisation approaches, including Augmented Reality (AR) and Virtual Reality (VR), to simulate, prototype, and evaluate human-robot collaboration (HRC) within real-world manufacturing environments. Her goal is to empower workers and decision-makers to better understand how cobots affect workflows, spatial layouts, and safety, ultimately improving acceptance and performance. 

By leveraging Extended Reality (XR) technologies, Yuan is working to enhance human decision-making before, during, and after cobot integration. Her aim is to create intuitive, interactive systems that help workers anticipate robot actions and develop AR-based design frameworks that optimise collaboration and safety. 

Her project aligns closely with the program’s mission to embed holistic design as a critical factor in the seamless integration of humans and machines. The broader goal is to improve working conditions, increase production efficiency, and foster workforce acceptance of robotic technologies. Yuan’s work contributes directly to these aims by developing a Human-Centred Design Process, AR-driven frameworks and design guidelines that place human experience at the centre of robotic system development. 

A key component of Yuan’s research is her industry placement with B&R Enclosures, where she is conducting fieldwork in their gasket room. Over the course of her PhD, she will spend 12 months on placement, collecting observational data, conducting interviews, and validating her design outcomes. This engagement ensures that her findings are relevant and transferable to industry.

The project is structured across several phases. It began with capturing 360-degree video footage of workers performing tasks in the gasket room, followed by detailed analysis of decision-making during these interactions. Yuan then conducted interviews with employees to gather self-reported insights into their decision-making processes. These findings are informing the development of an AR-based design prototype, tailored to enhance human understanding and collaboration with robots. The final phase focuses on knowledge transfer, ensuring that outcomes are shared with industry partners and the broader research community. 

Yuan’s academic journey reflects her interdisciplinary strengths. Before joining the Australian Cobotics Centre, she earned an MSc in Interactive Media from University College Cork (Ireland) in 2022, where she gained expertise in Multimedia Technologies and Human-Computer Interaction (HCI). Her academic path began with a Bachelor’s degree in Urban Planning from Southwest University (China), followed by several years of professional experience in landscape architecture and urban planning. This diverse background informs her approach to research, blending design thinking with technical innovation. 

Her work is supported by a multidisciplinary supervisory team, including Professor Glenda Caldwell, Professor Markus Rittenbruch, Associate Professor Müge Fialho Leandro Alves Teixeira, Dr Alan Burden, and Dr Matthias Guertler from UTS. She also collaborates closely with staff at B&R Enclosures, including Eric Stocker and Josiah Brooks, who facilitate access to the workplace and support her data collection efforts. 

By the end of her project in November 2026, Yuan aims to deliver a framework for understanding human behaviour and decision-making in manufacturing, a human-centred AR design approach for collaborative robotics, and guidelines for designing AR interfaces that optimise human-robot interaction. 

🔗 Read more about Yuan’s project on our website: https://www.australiancobotics.org/project/project-3-3-augmented-reality-in-collaborative-robotics/  

PhD Research Spotlight: Zongyuan Zhang Tackles Contact Tasks with Mobile Robots


As part of the Biomimic Cobots program within the Australian Cobotics Centre, PhD researcher Zongyuan Zhang is leading a project that addresses a key challenge in manufacturing: enabling mobile robots to perform high-precision contact tasks, such as grinding, polishing, and welding, on large, arbitrarily placed workpieces in factory environments.

Zongyuan brings a diverse background in robotics to this work. He holds an M.Sc. in Robotics from the University of Birmingham, UK, where he focused on applying deep learning to manipulator force control. His experience spans control system design, mechanical structure design, and participation in a range of innovative robotics projects—including underwater photography robots, driverless racing cars, exoskeleton mechanical arms, dual-rotor aircraft, and remote-control robotic arms—some of which are now undergoing commercialisation.

His PhD project, Contact Task Execution by Robot with Non-Rigid Fixation, investigates how robots with non-rigidly fixed chassis can maintain the accuracy, stability, and adaptability required for industrial contact tasks. These tasks typically demand hybrid force/position control and high contact forces, which are complicated by the mobility and flexibility of the robot’s base.
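To make the idea of hybrid force/position control concrete, the sketch below shows one control tick that tracks a position target along the surface while regulating contact force along the surface normal. The gains, axes, and selection matrix are illustrative assumptions for the example, not the controller used in this project.

```python
import numpy as np

# One tick of a hybrid force/position law in a 3-axis task frame:
# x and y are position-controlled, z (surface normal) is force-controlled.
# Gains and the selection matrix are illustrative assumptions.
S_POS = np.diag([1.0, 1.0, 0.0])      # position-controlled axes (x, y)
S_FRC = np.eye(3) - S_POS             # force-controlled axis (z)

KP_POS, KP_F, KI_F = 40.0, 0.02, 0.5
force_integral = 0.0

def hybrid_step(x, x_des, f_meas, f_des, dt):
    """Return a Cartesian velocity command mixing position and force goals."""
    global force_integral
    v_pos = KP_POS * (x_des - x)                   # proportional position error
    f_err = f_des - f_meas                         # normal-force error (N)
    force_integral += f_err * dt
    v_frc = np.array([0.0, 0.0, KP_F * f_err + KI_F * force_integral])
    return S_POS @ v_pos + S_FRC @ v_frc

# Example: hold 10 N on the surface while sliding toward x = 0.1 m.
v_cmd = hybrid_step(np.zeros(3), np.array([0.1, 0.0, 0.0]), 8.0, 10.0, 0.01)
print(v_cmd)
```

On a rigidly fixed robot this loop is routine; a compliant or mobile base feeds its own deflections back into both the position- and force-controlled directions, which is precisely the difficulty the project investigates.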

This research contributes to the Biomimic Cobots program’s goal of developing collaborative robots that mimic human sensing, learning, and manipulation skills. It explores:

  • How a robot mimics human control to execute contact tasks like sanding and grinding.
  • How augmented mobility enables task execution in large, unconstrained spaces.
  • How minimal task-specific programming can be used to adapt to new workpieces and environments.

Zongyuan, based at QUT, is supervised by Professor Jonathan Roberts, Professor Will Browne, and Dr Chris Lehnert, and is working onsite at ARM Hub alongside industry partner, Vaulta. The project with industrial partners concerns the efficient and accurate removal of surface oxides from metallic materials, thereby enabling tighter bonding between metal components. This embedded collaboration ensures his research is conducted in real production environments and remains grounded in the practical needs of Australian manufacturers.

Recent milestones include:

  • Design and deployment of a framework for performing industrial sanding tasks using collaborative robots.
  • Use of sound as a multimodal input to improve the robustness of the sanding process and enhance the cost-efficiency of the robotic system (a minimal sketch of such an audio cue appears after this list).
  • Exploration of how humanoid robots can achieve high-precision performance in contact tasks.
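As a rough illustration of how sound can serve as a multimodal input, the sketch below flags tool-surface contact from short-window audio energy. The sample rate and threshold are hypothetical and would need calibration; the project’s actual audio pipeline is not described here.

```python
import numpy as np

# Rough illustration: treating short-window microphone energy as a cue
# that the sanding tool is in contact with the workpiece. Sample rate
# and threshold are hypothetical assumptions, calibrated per cell.
SAMPLE_RATE_HZ = 16_000
WINDOW_S = 0.02                        # 20 ms analysis window
CONTACT_RMS_THRESHOLD = 0.05           # assumed, in normalised amplitude

def is_in_contact(window: np.ndarray) -> bool:
    """True if the window's RMS energy suggests tool-surface contact."""
    rms = float(np.sqrt(np.mean(window.astype(np.float64) ** 2)))
    return rms > CONTACT_RMS_THRESHOLD

# Example on synthetic audio: quiet free running vs. louder contact noise.
rng = np.random.default_rng(0)
n = int(SAMPLE_RATE_HZ * WINDOW_S)
print(is_in_contact(rng.normal(0.0, 0.01, n)))   # False: tool in the air
print(is_in_contact(rng.normal(0.0, 0.20, n)))   # True: contact-like noise
```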

Check out our website for the latest on his project: Project 1.1 – Contact task execution by robot with non-rigid fixation.

Zongyuan pictured (centre) with ARM Hub’s Technology Lead, Dr Troy Cordie (top picture, L) and Queensland’s Deputy Premier, Minister for State Development, Infrastructure and Planning and Minister for Industrial Relations, Jarrod Bleijie MP. (R)

ARTICLE: Project Introduction: What Is Trust, Really? Rethinking Human-Robot Interaction Through Design

While engineers often focus on functionality and reliability, real-world deployment reveals a deeper challenge: if people feel uneasy, confused, or disconnected when working with robots, adoption falters. This is especially true in manufacturing, where seamless human-robot collaboration is essential for productivity and safety. 

A new strategic project led by Dr Valeria Macalupu, postdoctoral researcher in QUT’s Human-Robot Interaction (HRI) program, aims to address this challenge by exploring how design affordances, such as physical form, materials, and expressive behaviours, can foster trust between humans and robots. Co-funded by the QUT Design Lab, the project is titled “What is trust really? Exploring Rich Interactions and Design Affordances in Human-Robot Interaction through Co-design.” 

Rather than treating trust as a by-product of performance, the project investigates how visual, tactile, and behavioural cues can signal intent, competence, and emotional safety. These cues are often overlooked in traditional engineering approaches but are critical to how people interpret and respond to robotic systems. 

The research will be developed and disseminated through a series of co-design workshops, iterative prototyping, and a public exhibition. The goal is to generate insights into how robots can be designed to feel more intuitive, approachable, and safe. 

For industry, this project offers practical design guidelines to help engineers and developers create robots that are not only functional but also intuitively acceptable to human users. In manufacturing contexts, this could mean smoother integration of collaborative robots, reduced training time, and improved worker satisfaction. 

More broadly, the project contributes to a growing body of research that sees trust not as a by-product of performance, but as a designable quality. By understanding how people interpret and respond to robots through their physical presence, this work supports the development of safer, more empathetic, and more effective robotic systems. 

Across the Centre, our postdoctoral research fellows lead a diverse range of projects, from long-term initiatives to shorter, more focused pieces of work. This latest project from Dr Macalupu exemplifies the kind of strategic, interdisciplinary work that drives innovation and impact across our programs. 

This project is jointly funded by the Australian Cobotics Centre and the QUT Design Lab.  

Read more about the project: Project 2.9 – What is trust really? Exploring Rich Interactions and Design Affordances in Human-Robot Interaction through Co-design.

Get in touch with Valeria: v.chira@qut.edu.au   

ARTICLE: Automating Asset Management Tasks in Factory Settings

As the Asset Management Council of Australia emphasises, effective asset management provides organisations with insights across four critical domains: physical, information, financial, and intellectual assets. When implemented successfully, it enhances decision-making, strengthens operational resilience, and delivers long-term value. This capability serves as a key differentiator between agile, modern factories and slower, fragmented legacy operations.

The next frontier of asset management explores the shift from human-dependent monitoring to autonomous systems—where assets effectively manage themselves.

Under the Australian Cobotics Centre’s (ACC) Quality Assurance and Compliance program, researchers at the UTS:Robotics Institute are advancing this concept through the deployment of Boston Dynamics’ robotic platform, Spot. In this implementation, Spot autonomously navigates industrial environments, constructs a digital map, captures detailed imagery of equipment such as pipes, coils, and panels, and performs inspection tasks without human supervision.

Collaborative robots, or cobots, play a critical role in this evolving landscape. Designed to complement rather than replace human workers, cobots enhance manufacturing resilience by integrating human judgment with machine-level consistency and precision. Studies by [1] and [2] highlight the synergy of cobots and human teams in improving factory adaptability.

In this approach, robots such as Spot undertake detailed inspection routines and flag issues to human operators through shared digital interfaces, while humans execute follow-up tasks such as tightening components or initiating repairs. A supporting fleet management system, as outlined by [3], enables real-time tracking of cobot performance, usage, and maintenance status, effectively treating each cobot as a digital asset within the broader ecosystem.

In field trials, Spot demonstrated 90 minutes of continuous operation without a battery swap, covering 7.5 kilometres of factory floor—equivalent to 75,000 m². This capacity enables a single robot to replace multiple manual inspection rounds per shift in a typical Australian SME manufacturing facility.

A key innovation in the system is an integrated asset-management dashboard that connects directly to Spot’s API. By aggregating live telemetry, annotated inspection imagery, and equipment metadata, the dashboard eliminates the need for manual data entry. During initial deployments, Spot completed multi-kilometre inspection loops and uploaded over 6,000 annotated images per shift, delivering a comprehensive, timestamped view of equipment health. The dashboard’s real-time capabilities position it as a dynamic decision-support tool, advancing the transition from reactive maintenance to proactive, autonomous operations.
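A minimal sketch of the aggregation step such a dashboard needs is shown below: raw capture events are normalised into timestamped, per-asset records so equipment health can be rendered as a timeline. The field names and event format are hypothetical illustrations, not Spot’s actual API.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Minimal sketch of the dashboard's aggregation step: normalising raw
# capture events into timestamped, per-asset records. Field names and
# the event format are hypothetical, not Spot's actual API.

@dataclass
class InspectionRecord:
    asset_id: str            # e.g. a pipe, coil, or panel identifier
    captured_at: datetime
    image_path: str          # annotated inspection image
    battery_pct: float       # robot state at capture time

def aggregate(events: list) -> dict:
    """Group capture events by asset to build per-equipment health timelines."""
    by_asset = {}
    for event in events:
        record = InspectionRecord(
            asset_id=event["asset_id"],
            captured_at=datetime.fromtimestamp(event["ts"], tz=timezone.utc),
            image_path=event["image"],
            battery_pct=event["battery"],
        )
        by_asset.setdefault(record.asset_id, []).append(record)
    return by_asset

# Example: one annotated capture of a hypothetical pipe asset.
events = [{"asset_id": "pipe-07", "ts": 1_717_400_000,
           "image": "pipe-07_0001.jpg", "battery": 82.5}]
print(aggregate(events))
```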

This trial represents a foundational step toward scaling autonomous survey robotics across industry. The integration of AI-driven perception, robotic mobility, and collaborative tasking is redefining the asset management paradigm.

The convergence of autonomous robotics, AI-powered vision systems, and collaborative machines signals a fundamental transformation in industrial asset management. Tasks once characterised by manual oversight are becoming intelligent, continuous processes. Initiatives such as those under the Australian Cobotics Centre offer a forward-looking model where factory systems are capable of sensing, interpreting, and responding with minimal human input—enabling safer, smarter, and more resilient operations across the manufacturing sector.

Citation: https://www.nbnco.com.au/blog/the-nbn-project/the-power-of-robotics-to-lift-digital-capability

Figure: The integrated asset-management dashboard.

[1]           J. Pizoń, M. Cioch, Ł. Kański, and E. Sánchez García, “Cobots Implementation in the Era of Industry 5.0 Using Modern Business and Management Solutions,” Adv. Sci. Technol. Res. J., vol. 16, no. 6, pp. 166–178, Dec. 2022, doi: 10.12913/22998624/156222.

[2]           A. R. Sadik and B. Urban, “An Ontology-Based Approach to Enable Knowledge Representation and Reasoning in Worker–Cobot Agile Manufacturing,” Future Internet, vol. 9, no. 4, Art. no. 4, Dec. 2017, doi: 10.3390/fi9040090.

[3]           B. I. Ismail, M. F. Khalid, R. Kandan, H. Ahmad, M. N. Mohd Mydin, and O. Hong Hoe, “Cobot Fleet Management System Using Cloud and Edge Computing,” in 2020 IEEE 7th International Conference on Engineering Technologies and Applied Sciences (ICETAS), Dec. 2020, pp. 1–5. doi: 10.1109/ICETAS51660.2020.9484266.

ARTICLE: Solving manufacturing’s labour crunch: Why job quality and collaborative robots must go hand-in-hand

The Albanese Government’s Future Made in Australia agenda has committed $22.7 billion over the next decade to rebuild sovereign manufacturing capability, capture low-carbon supply-chain opportunities and lift advanced industry productivity. Yet money alone will not solve the worker shortages that have persisted since COVID-19. Even after a 4.5 per cent fall in the February quarter, there were still more than 15,000 unfilled manufacturing positions and vacancy rates remained 44 per cent higher than before the pandemic. 

Although the proportion of vacant jobs in Australia decreased to 2 per cent in March, that headline masks deep, persistent shortages in key trades and technician roles. Unless industry and government tackle the root causes, the Future Made in Australia investments risk running into a human-capital wall. 

Qualitative research conducted by the Australian Cobotics Centre’s (ACC’s) Human-Robot Workforce Research Program, presented at the 2025 AIRAANZ conference by postdoctoral researcher Dr Melinda Laundon, drew on interviews with 23 senior stakeholders across government, industry bodies, unions, and education providers. The research highlights three intertwined problems that help explain why too few workers are joining, or staying in, the manufacturing sector.

  • Earnings quality. At the sector level, manufacturing pay has struggled to keep pace with construction and transport, and is eclipsed by mining.  
  • Job security perceptions. Although views are changing to recognise that automation can make jobs safer and more interesting and increase production capacity, some workers still worry that automation may remove jobs. 
  • Working environment. Rigid shift patterns sit awkwardly beside the flexibility many Australians tasted during the pandemic. Under-investment in training may also leave employees uncertain about career progression. 

The research suggests some policy and organisational actions that may reduce labour shortages by improving job quality, attraction and retention. Stakeholders argued that raising hourly rates is necessary but not sufficient; manufacturers also need to: 

  • Provide up-skilling pathways. Investing in training for robotics, programming and digital twins both raises earnings potential and signals that workers have a future in the firm.  
  • Design human-centric technology deployments. Cobots can take over dirty, dangerous and highly repetitive tasks, reducing physical strain and freeing people for higher-value problem-solving. 
  • Embed employee voice. Involving operators in the redesign of workflows and role changes builds trust and ensures that cobots and other advanced manufacturing technologies can be implemented in a way that enhances job quality. 

These suggestions align with the government’s Industry 4.0 ambitions yet remain challenging for the small and medium enterprises (SMEs) that make up 98 per cent of Australian manufacturers. Fast turnaround and lower-cost microcredentials can be more accessible for SME owners and workers. Government and industry associations also have a role to play in promoting a manufacturing career narrative, highlighting success stories and the capacity for workers to move to tech-enabled roles with higher pay and autonomy.  

The ACC partners with manufacturers and technology providers to pilot human-robot solutions in real manufacturing contexts, drawing on expertise from design, engineering, quality assurance, and people management researchers. For governments rolling out Future Made in Australia programs and organisations considering cobot adoption, this work shows how technology adoption can lift productivity and job quality, not trade one off against the other.

More Than Machines: Why Do We Want to Build Robots That Look Like Us? 

Written by Dr Alan Burden, QUT Postdoctoral Research Fellow, Designing Socio-Technical Robotic Systems program.

A colleague recently questioned why we are building robots that look human. If other machines already perform tasks reliably, robots in human shapes reveal more about our expectations than about technical necessity. Apart from striving to fulfil sci-fi fantasies, there seems to be little logical reason for many industries to develop humanoids.

Humanoid robots are machines designed to resemble and move like humans, typically featuring an identifiable head, torso, arms, and legs that enable interaction with people, objects, and environments in human-centred ways.

In 2025, manufacturers are projected to ship approximately 18,000 humanoid robots globally [1], marking a significant step toward broader adoption. Looking ahead, Goldman Sachs forecasts that by 2035, the humanoid robot market could reach USD $38 billion (approximately AUD $57 billion), with annual shipments increasing to 1.4 million units [2]. Further into the future, Bank of America projects that by 2060, up to 3 billion humanoid robots could be in use worldwide, primarily in homes and service industries [1]. 

From Tesla’s Optimus Gen 2 [3] to Figure AI’s Figure 02 [4], the humanoid robot is no longer a figment of science fiction. These robots will walk, lift, talk, and perform factory tasks. Yet beneath the surface of innovation lies a deeper question: Will we build humanoid robots because the human form is genuinely useful, or because it reflects our own image back at us? 

In an age where industrial arm robots, wheeled and tracked platforms, and flying drones already perform industrial tasks with precision, the humanoid form can seem like an odd choice. These robots will be complex, expensive to develop, and often over-engineered for the roles they are expected to perform. 

So what explains the current fascination with building robots in our own image? 

Form vs Function: The Practical Debate 

Our world is designed around the human body. Door handles, tools, staircases, and car pedals all presume a body with arms, legs, and binocular vision. Humanoid robots will therefore adapt more easily to our environments. 

Still, there is a contradiction worth unpacking. We already have machines that operate far more efficiently without the constraints of two legs and a torso. Amazon’s warehouse bots glide on wheels, carrying shelving units with speed and precision [5]. Boston Dynamics’ Spot, a quadruped, excels at inspections and terrain navigation [6]. Agility Robotics’ Digit uses bipedal bird-like legs to move efficiently through human-centric spaces [7]. 

Humanoid robots won’t necessarily be more capable, but they may be more compatible with existing environments, especially where infrastructure redesign would be costly or disruptive. This compatibility advantage is what Stanford’s Human-Centred AI Institute describes as the affordance of embodied compatibility rather than pure efficiency [8].

The Psychological Shortcut 

People respond to humanoid forms with startling immediacy. A robot with a face, a voice, and gestures doesn’t just operate in our space – it socially occupies it. 

That connection brings both benefits and barriers. Humanoid robots will be easier to instruct, cooperate with, or trust, especially in care or customer service roles. This intuitive rapport, however, will come at a cost. We’ll also project emotions, intentions, and even moral status onto these mechanical beings. The IEEE’s Global Initiative on Ethics of Autonomous Systems [9] has warned that anthropomorphic design risks confusion over autonomy, trust, and accountability. 

A robotic arm making a mistake will seem tolerable. A robot with eyes and facial features doing the same will feel uncanny. The “uncanny valley” – a term coined by roboticist Masahiro Mori in 1970 to describe the discomfort people feel when a robot or virtual character looks almost human, but not quite [10] – will blur the line between tool and companion, worker and being.

Redefining Labour and Power 

Humanoid robots will often be pitched as general-purpose labourers: tireless, adaptable, and compliant. In some ways, they’ll echo the 19th-century industrial ideal of the perfect worker. 

But this vision raises complex questions. If these machines replace humans in repetitive or hazardous roles, how will we protect the dignity and security of displaced workers? If a robot becomes a “colleague,” what responsibilities will come with that illusion? 

The Future of Humanity Institute at Oxford [11] noted that humanoids could contribute to a shift in how we view authority and social dynamics. If robots are always obedient, will we begin to expect the same from people? Automation will soon shape not just job loss, but workplace culture and human behaviour. This connects with human-robot interaction research on anthropomorphic framing and robot deception, which cautions against uncritically assigning social roles to machines [11]. 

Who Are We Really Building? 

At its core, the humanoid robot reflects our self-image. When Boston Dynamics’ Atlas robot performed parkour in a now-iconic demonstration video [12], public fascination was less about mechanics and more about the eeriness of watching something mechanical move with such human-like agility. The video, titled Atlas – Partners in Parkour, showcased robots jumping, flipping, and vaulting through a gymnastics course, triggering admiration, unease, and a wave of social media memes drawing comparisons with the Terminator films.

This is not new. From clockwork automatons in royal courts to androids in science fiction, each era’s robots mirror its anxieties and desires. For instance, Hanson Robotics’ Sophia [13] was designed with expressive facial features to promote naturalistic interaction, yet it remains polarising, often dismissed as a novelty. Is it an advancement in social robotics or a symbol of anthropomorphic overreach?

The goal of today’s humanoids reveals our priorities. Tesla’s Optimus will be built to handle repetitive factory work. Figure AI’s humanoids will aim to integrate into warehouse workflows. These designs won’t just be technical – they will symbolise which human qualities we value and which jobs we are ready to relinquish. 

The Real Question 

As mechanical humans enter our homes and workplaces, we must ask what they will symbolise beyond their specs. Humanoid robots will reflect assumptions about work, social interaction, and human worth. When we automate tasks in human form, we choose which parts of ourselves we replicate and which we outsource. 

The most pressing questions won’t be about joint torque or facial recognition, but about how these machines reshape our relationships with technology, labour, and each other. Robots, like all tools, embody human intention. The challenge isn’t building minds like ours, but questioning why we keep giving them our face. 

References 

[1] Koetsier, J. (2025, April 30). Humanoid robot mass adoption will start in 2028, says Bank of America. Forbes. https://www.forbes.com/sites/johnkoetsier/2025/04/30/humanoid-robot-mass-adoption-will-start-in-2028-says-bank-of-america/ 

[2] Goldman Sachs. (2024, January 8). The global market for humanoid robots could reach $38 billion by 2035. https://www.goldmansachs.com/insights/articles/the-global-market-for-robots-could-reach-38-billion-by-2035 

[3] Tesla. (2023). Tesla Optimus: Our Humanoid Robot. https://www.tesla.com/AI 

[4] Figure AI. (2024). Figure 02 Robot Overview. https://www.figure.ai/ 

[5] Amazon. (2025). Facts & figures: Amazon fulfillment centers and robotics. https://www.aboutamazon.co.uk/news/innovation/bots-by-the-numbers-facts-and-figures-about-robotics-at-amazon 

[6] Boston Dynamics. (2025). Spot | Boston Dynamics. https://bostondynamics.com/products/spot/ 

[7] Agility Robotics. (2025). Digit – ROBOTS: Your Guide to the World of Robotics. https://www.agilityrobotics.com/ 

[8] Srivastava, S., Li, C., Lingelbach, M., Martín-Martín, R., Xia, F., Vainio, K., Lian, Z., Gokmen, C., Buch, S., Liu, K., Savarese, S., Gweon, H., Wu, J., & Fei-Fei, L. (2021). BEHAVIOR: Benchmark for Everyday Household Activities in Virtual, Interactive, and Ecological Environments. arXiv preprint arXiv:2108.03332. https://arxiv.org/abs/2108.03332 

[9] IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems. (2020). Ethically Aligned Design, 1st ed. https://ethicsinaction.ieee.org/ 

[10] Mori, M. (1970). The uncanny valley. Energy, 7(4), 33–35. (English translation by MacDorman & Kageki, 2012, IEEE Robotics & Automation Magazine). https://doi.org/10.1109/MRA.2012.2192811 

[11] Brundage, M., Avin, S., Clark, J., Toner, H., Eckersley, P., Garfinkel, B., … & Amodei, D. (2018). The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation. Future of Humanity Institute, University of Oxford. Retrieved from https://arxiv.org/abs/1802.07228 

[12] Boston Dynamics. (2021, August 17). Atlas | Partners in Parkour [Video]. YouTube. https://www.youtube.com/watch?v=tF4DML7FIWk 

[13] Hanson Robotics. (2025). Sophia the Robot. https://www.hansonrobotics.com/sophia/ 

 

ARTICLE: Beyond Efficiency: Ethical Considerations of Adopting Cobots

Collaborative robots, commonly referred to as “Cobots,” are among the most groundbreaking technological advancements of our time. Academics and industry experts firmly believe that Cobots have the potential to revolutionise global manufacturing. A Cobot is a context-aware robot equipped with artificial intelligence and vision capabilities, enabling it to safely coexist with both human operators and machines in the same workspace.

The adoption of Cobots in manufacturing is one of the key enablers of Industry 5.0. The concept of Industry 5.0 was first proposed by Michael Rada[i] in 2015, after it was felt that Industry 4.0, its predecessor, was unable to meet the increasing demands for personalisation and customisation of goods. Through the incorporation of highly advanced systems such as artificial intelligence, automation, the Internet of Things, and cloud computing, Industry 4.0 aimed to enhance operational efficiency and productivity by connecting the physical and virtual worlds. However, rapidly evolving global business dynamics shifted the industry paradigm from efficient production alone towards high-value mass customisation and personalisation of goods, and it was widely believed that Industry 4.0 could not address these changes. Industry 5.0 was therefore coined to address changing industrial dynamics, focusing on collaboration between advanced production systems, machines, and humans.

To reap the enormous benefits associated with this technology, its adoption necessitates careful consideration of the risks that could potentially affect the well-being of human operators who work collaboratively with Cobots.

Ethical Considerations of Adopting Cobots

Ethical considerations when adopting Cobots encompass a wide range of social factors[ii]. As defined by the British Standards Institution[iii], ethical hazards are any potential source of harm that compromises psychological, societal, and environmental well-being. While collaborative settings involving Cobots offer benefits like reducing physically demanding tasks for humans, they have also brought forth new risks and ethical considerations that demand attention during their planning and use. In the following sections, I discuss some of the ethical considerations of adopting Cobots:

Emotional Stress

Understanding potential sources of worker emotional stress can inform the design of human-Cobot interaction systems that minimise stress and enhance the overall user experience. Cobots may cause emotional stress among users for several reasons. For instance, users might feel they have less control over their work environment when Cobots are involved, especially if the Cobots operate autonomously. This can lead to feelings of anxiety and stress. Moreover, Cobots are often used for tasks that require high precision and concentration, so the pressure to perform these tasks accurately can be mentally exhausting and stressful. The constant need to monitor and interact with Cobots can trigger physiological stress responses, such as increased heart rate and tension. Organisations can consider these factors when designing and implementing Cobots.

Social Environment

By understanding potential disruptions to the social environment, manufacturers can develop strategies to mitigate workers’ concerns and create a harmonious work environment. Unless workers are involved in the design and planning of Cobot implementations, Cobots may disrupt the social harmony of the workplace in several ways, for example by raising concerns about job security among workers, or by causing anxiety and tension due to the fear of being replaced by robots. This can lead to confusion and ambiguity about job roles, causing stress and disrupting team cohesion. Furthermore, the presence of Cobots can alter social interactions in the workplace, with some workers viewing them as teammates while others see them as intruders, potentially leading to conflicts. Additionally, the increasing autonomy of Cobots raises ethical questions about decision-making and accountability.

Social Acceptance

By understanding the community factors behind social acceptance, strategies can be developed to enhance the acceptance of Cobots. Communities play a crucial role in determining the acceptance of new technologies, and several key factors are at work. Different cultures exhibit varying levels of comfort with technology; some place greater trust in, and show more enthusiasm for, technological advancements, which can lead to greater acceptance of Cobots. The opinions and behaviours of peers, family, and colleagues can significantly influence an individual’s acceptance of Cobots. Communities with higher levels of education and awareness about the benefits and functionalities of Cobots tend to accept them more readily. Government policies and incentives that promote the use of Cobots can also positively influence community acceptance: supportive regulations and funding for Cobot integration can encourage businesses and individuals to adopt this technology.

Data Collection

Firms adopting Cobots need to devise data management policies and assure workers that collected data will not be shared with or sold to third parties. Given that Cobots collect a variety of data from their safety systems, there is a risk that operator and user data could be collected, used, and sold without consent. Research indicates that many industry organisations are already interested in the potential value of this data for developing future products and services.

Addressing these ethical considerations can ensure that the adoption of Cobots contributes positively to society and aligns with our social values. By prioritising ethics, we can foster trust and acceptance of Cobots in manufacturing.

[i] https://www.linkedin.com/pulse/industry-50-from-virtual-physical-michael-rada/

[ii] https://www.centreforwhs.nsw.gov.au/__data/assets/pdf_file/0019/1128133/Work-health-and-safety-risks-and-harms-of-cobots.pdf

[iii] https://knowledge.bsigroup.com/products/robots-and-robotic-devices-guide-to-the-ethical-design-and-application-of-robots-and-robotic-systems

What Would Jim Henson Do? Roleplaying Human-Robot Collaborations Through Puppeteering

By James Dwyer and Dr Valeria Macalupu (both QUT)

 

 A tangible, adaptable and modular interface for embodied explorations of human-robot interaction concepts.

As robots become increasingly integrated into various industries, from healthcare to manufacturing, the need for intuitive and adaptable tools to design and test robotic movements has never been greater. Traditional approaches often rely on expensive simulations or complex hardware setups, which can restrict early-stage experimentation and limit participation from non-expert stakeholders. The kinematic puppet offers a refreshing alternative by combining hands-on prototyping with virtual simulation, making it easier for anyone to explore and refine robot motion. This work is particularly critical for exploring intuitive ways surgeons can collaborate with robots in the operating room, improving Robot-Assisted Surgery (RAS).

 What is the kinematic puppet?

The kinematic puppet is an innovative tool that combines physical prototyping and virtual simulation to simplify the design and testing of robot movements and human-robot interactions. The physical component is a modular puppet constructed from 3D-printed joints equipped with rotary encoders and connected by PVC linkages. This flexible and cost-effective setup allows users to customise a robot arm to suit a variety of needs by adjusting linkage lengths and joint attachments.

On the digital side, a virtual simulation environment (developed in Unreal Engine) creates a real-time digital twin of the physical puppet. This integration via Wi-Fi/UDP enables immediate visualisation and testing of HRI concepts. By bridging the gap between physical manipulation and digital analysis, the kinematic puppet makes it easier for anyone to experiment with and refine robot motion in an interactive and accessible way.
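As an illustration of this kind of link, the sketch below packs one frame of joint angles into a UDP datagram bound for the simulation host. The address, port, and packet layout are assumptions for the example, not the published implementation.

```python
import socket
import struct

# Illustration of a Wi-Fi/UDP pose link: each frame packs the puppet's
# joint angles and sends them to the simulation host. Address, port,
# and packet layout are assumptions for this example.
SIM_ADDR = ("192.168.0.20", 9000)     # hypothetical Unreal Engine listener
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_frame(joint_angles_rad: list) -> None:
    """Send one pose frame: a joint count followed by that many floats."""
    payload = struct.pack(f"<I{len(joint_angles_rad)}f",
                          len(joint_angles_rad), *joint_angles_rad)
    sock.sendto(payload, SIM_ADDR)

send_frame([0.10, -0.42, 0.73])       # one illustrative three-joint pose
```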

Figure 1. The physical and virtual components of the kinematic puppet.


How does the user interact with the puppet?

In the demonstration, users engage with the system by physically manipulating the kinematic puppet to control a digital twin of the robot arm, guiding it through a virtual cutting task. As they direct the arm’s movements, a virtual cutting tool simulates material removal in real time.

The system provides continuous feedback through both visual displays and haptic responses, creating an immersive and intuitive experience. This interactive environment challenges participants to balance precision and speed, highlighting the importance of both accuracy and efficiency in robotic tasks.

By making the abstract process of programming robotic movements tangible, the kinematic puppet empowers users to experiment and learn in a dynamic environment.

Figure 2. James showing how the kinematic puppet works.

Demonstration at HRI 2025 – An experience for HDR students.

Presenting the Kinematic Puppet at the Human-Robot Interaction Conference 2025 provided valuable insights into how our research resonates with the broader robotics community. Attendees were particularly drawn to the system’s modularity and reconfigurability and appreciated the puppetry-based approach as an intuitive method for exploring human-robot interaction concepts.

The demonstration wasn’t without challenges. Technical issues before the demo required some mildly frantic rebuilding of the code on the morning of the presentation, highlighting a common research reality: experimental prototypes often accumulate small bugs through iterative development that compound unexpectedly. It is an all-too-common challenge that reflects the messy nature of research, and one that isn’t always visible in polished publications.

Reviewer feedback highlighted potential applications we hadn’t considered, particularly around improving accessibility of research technologies. While most attendees engaged enthusiastically with the concept, some appeared to struggle to connect it to their work. It took time for me to find effective ways to explain the purpose and value of the approach—a good reminder that not every method resonates equally in a diverse field and how important it is to tailor explanations to your audience, even within a given research community.

For an HDR student, this experience underscores the importance of exposing your work to the research community early. The value lies not in positive reception, but in the process of presenting the work itself. Getting to explain my work to others forced me to articulate and refine my thinking, an opportunity that is missed when work is conducted in isolation. These interactions helped me understand how my work fits within the broader landscape and sparked new reflections on its purpose and potential applications that I might have missed otherwise.

You can read more about this demo here: https://dl.acm.org/doi/10.5555/3721488.3721764

Dwyer, J. L., Johansen, S. S., Donovan, J., Rittenbruch, M., & Gomez, R. (2025). What Would Jim Henson Do? Roleplaying Human-Robot Collaborations Through Puppeteering. In Proceedings of the 2025 ACM/IEEE International Conference on Human-Robot Interaction, Melbourne, Australia.

TL;DR

  1. Accessible Design: The kinematic puppet combines physical prototyping with virtual simulation for intuitive human-robot interaction design.
  2. Intuitive Feedback for Seamless Experience: Users control a customisable robot arm through hands-on manipulation while receiving real-time visual and haptic feedback. This novel approach supports Robot-Assisted Surgery design processes by enabling the intuitive exploration of human-robot interactions.
  3. Creative Inspiration: Inspired by film animation techniques and puppeteering, this low-cost, adaptable tool enables rapid prototyping and innovative experimentation in human-robot interaction research more broadly.
  4. Communicating Complex Research Concepts: Often requires tailoring explanations to diverse audiences. Even within a specialised community like HRI, individuals connect with ideas differently, and finding effective ways to articulate the purpose and value of novel methodological approaches is an ongoing challenge that improves with practice.
  5. Early Exposure of Research Work: Presenting research work to the community provides invaluable benefits beyond simply positive reception. The process of presenting forces the articulation and refinement of ideas, reveals how your work fits within the broader research landscape, and often uncovers applications and connections you might otherwise miss when working in isolation.