ARTICLE: From Gut Feel to Evidence: Making the Case for Technology Adoption Through Quality Economics

By Munia Ahamed, UTS PhD Researcher, Australian Cobotics Centre

Technology adoption decisions in manufacturing are often characterised by a tension between perceived opportunity and perceived risk. This is particularly true for small and medium enterprises considering investments in collaborative robotics and automated quality systems, where the upfront costs are concrete, but the returns can feel uncertain.

Research consistently identifies this uncertainty as a key barrier. A 2024 UK government review of advanced technology adoption found that financial barriers—particularly difficulties justifying investment decisions due to uncertain returns—ranked among the most frequently cited obstacles for manufacturers [1], [2], [3]. Similar patterns emerge in studies of Australian SMEs, where decision-makers report hesitancy in embracing new technologies despite recognising their potential benefits.

This article argues that one way to address this challenge is through a more rigorous application of quality cost economics—a well-established body of theory that provides frameworks for quantifying the costs of defects, rework, and quality failures. By grounding technology adoption decisions in these frameworks, manufacturers can move from intuition-based decision-making toward evidence-based investment analysis.

The Economics of Quality: Theoretical Foundations
The concept of quality costing has a long history in operations management. Joseph Juran introduced the notion of the “cost of poor quality” in his 1951 Quality Control Handbook, arguing that organisations inevitably pay for quality—either through prevention and detection, or through the consequences of failure. His work established that appraisal and failure costs are typically much higher than prevention costs, suggesting that investment in getting things right the first time yields significant returns.

Philip Crosby extended this thinking in the 1970s and 1980s with his influential argument that “quality is free”. Crosby’s position was not that quality improvement carries no cost, but rather that the price of nonconformance—scrap, rework, warranty claims, lost customers—far exceeds the price of conformance. His research suggested that well-run quality programs could yield gains of 20 to 25 percent of revenues, with the cost of nonconformance reducible by half within 12 to 18 months of systematic effort.

Armand Feigenbaum’s Prevention-Appraisal-Failure (PAF) model provides a useful taxonomy for categorising quality costs. Prevention costs include activities designed to avoid defects occurring in the first place—process design, training, quality planning. Appraisal costs cover inspection, testing, and measurement activities. Failure costs, both internal (scrap, rework) and external (warranty, returns, reputation damage), represent the consequences of quality problems that were not prevented or detected.

In practical terms, this means that spending more on preventing defects upfront usually reduces the overall cost of quality problems later. Failure costs tend to decrease faster than prevention costs increase, so the total quality cost goes down. This insight remains as relevant today as when it was first articulated, and it provides a theoretical basis for evaluating investments in quality-enhancing technologies.
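To make the mechanics concrete, the PAF categories can be combined into a toy cost model. Everything below is illustrative, including the simplifying assumption that each prevention dollar removes a fixed multiple of failure cost; real leverage ratios must come from measured baselines.

```python
# Illustrative sketch of the PAF trade-off: failure costs fall faster than
# prevention costs rise, so total quality cost drops. All figures hypothetical.

def total_quality_cost(prevention, appraisal, base_failure, reduction_per_dollar=3.0):
    """Total cost of quality under a simple linear model: each dollar of
    prevention removes `reduction_per_dollar` dollars of failure cost."""
    failure = max(base_failure - reduction_per_dollar * prevention, 0.0)
    return prevention + appraisal + failure

# Tripling prevention spend lowers the total, because avoided failure
# costs outweigh the extra prevention outlay.
before = total_quality_cost(prevention=10_000, appraisal=20_000, base_failure=150_000)
after = total_quality_cost(prevention=30_000, appraisal=20_000, base_failure=150_000)
print(before, after)
```

The linear savings assumption is the crudest possible; the point is only that the comparison becomes an explicit calculation rather than a hunch.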

The Economics of Detection Timing
A related concept concerns the timing of defect detection. The 1:10:100 rule, attributed to George Labovitz and Yu Sang Chang (1992), captures the exponential escalation of costs as defects progress through the value chain. In its simplest form, the rule suggests that addressing a problem at its source costs one unit; finding and correcting it later in the process costs ten units; and dealing with it after it reaches the customer costs one hundred units.

While the specific ratios vary by context, the underlying principle is well-supported: defects that escape early detection accumulate additional processing costs, and defects that reach customers incur costs that extend beyond direct remediation to include relationship damage, complaint handling, and potential regulatory consequences.
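As a quick illustration, the rule can be expressed as a stage-multiplier lookup. The multipliers follow the 1:10:100 pattern from the text, but the per-defect cost and defect counts are hypothetical:

```python
# The 1:10:100 rule as a cost-escalation lookup. Defect counts and the
# per-defect cost are illustrative only; real multipliers vary by context.

STAGE_MULTIPLIER = {"source": 1, "in_process": 10, "customer": 100}
UNIT_COST = 50.0  # hypothetical cost of fixing one defect at its source

def escape_cost(defects_by_stage):
    """Total remediation cost given where each defect is finally caught."""
    return sum(UNIT_COST * STAGE_MULTIPLIER[stage] * count
               for stage, count in defects_by_stage.items())

# Of 100 defects, the 25 caught late dominate the bill even though
# three-quarters were fixed cheaply at the source.
print(escape_cost({"source": 75, "in_process": 20, "customer": 5}))
```

Even with rough inputs, the calculation shows why shifting detection earlier changes the cost profile disproportionately.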

This principle has direct relevance to technology adoption decisions. Automated inspection systems and vision-guided collaborative robots do not merely accelerate quality checking—they fundamentally alter when inspection occurs. Real-time, in-process detection catches problems before they accumulate downstream costs, shifting the organisation’s quality cost profile in favourable directions.

Barriers to Evidence-Based Decision Making
If the theoretical case for quality investment is strong, why do manufacturers—particularly SMEs—struggle to act on it? The literature identifies several contributing factors.

First, quality costs are often poorly measured. While direct costs like scrap and rework may be tracked, hidden costs—inspection time, schedule disruption, expedited shipping to replace defective goods—frequently go unrecorded. Without accurate baseline data, it becomes difficult to project returns on quality-enhancing investments.

Second, uncertainty about technology performance creates decision paralysis. Studies of SME technology adoption consistently find that decision-makers hesitate when they cannot point to demonstrated results in comparable contexts. This creates a circular problem: evidence is needed to justify investment, but evidence comes from having invested.

Third, competing priorities and resource constraints mean that quality investments must compete with other demands on limited capital. In this environment, investments with uncertain or difficult-to-quantify returns tend to be deferred in favour of more immediately tangible needs.

ROI Calculators as Analytical Tools

One response to these challenges is the development of structured ROI calculators tailored to specific technology investments. When well-designed, such tools serve several functions beyond simply generating a payback estimate.

First, they impose discipline on baseline measurement. To complete the calculator, users must quantify current defect rates, rework costs, and inspection time—data that many organisations have not systematically collected. The process of gathering this information often yields insights independent of any technology decision.

Second, they make assumptions explicit. A good ROI model does not obscure uncertainty; it surfaces it. Users can see what improvement rates are assumed, what cost factors are included, and how sensitive the conclusions are to different inputs. This transparency supports more informed discussion among stakeholders.

Third, they provide a framework for comparing alternatives. By standardising how costs and benefits are categorised, calculators enable like-for-like comparison of different technology options or implementation approaches.

The value of such tools lies not in their precision—all projections involve uncertainty—but in their capacity to structure thinking and ground decisions in operational data rather than vendor claims or general optimism.

Practical Recommendations

For manufacturers seeking to apply quality cost economics and the 1:10:100 principle to their technology decisions, several practical steps can strengthen the quality of investment analysis.

Establish quality cost baselines. Before evaluating any technology investment, spend time measuring what is currently unmeasured: rework hours, scrap rates, inspection time, defect escape rates. Even approximate figures provide a foundation for analysis that intuition cannot.

Map defect origins and detection points. Understanding where in the process problems arise—and where they are currently caught—identifies the opportunities for earlier detection. The gap between origin and detection represents accumulated cost that prevention or earlier inspection could avoid.

Use sensitivity analysis. Rather than seeking a single ROI figure, explore how conclusions change under different assumptions. What defect reduction would be needed for the investment to break even? How does the payback period shift if improvement is 20% less than projected? This approach acknowledges uncertainty while still supporting decision-making.

Consider pilot implementations. Where full-scale investment feels premature, smaller-scale trials with defined metrics can generate context-specific evidence. This reduces risk while building organisational capability and confidence.

The Path Forward
The theoretical foundations for quality cost analysis are well-established, with decades of research supporting the economic logic of prevention over detection and early detection over late. What is often lacking is the practical application of these frameworks to specific technology adoption decisions.

ROI calculators, when grounded in quality economics and used as analytical tools rather than sales devices, can help bridge this gap. They provide a structured means of translating established theory into operational decision-making, replacing intuition with evidence and making the case for investment in terms that resonate with resource-constrained decision-makers.

For Australian manufacturing to remain globally competitive, we need to accelerate thoughtful adoption of collaborative robotics and quality automation. Fact-based decision tools are one contribution toward that goal.

We welcome discussion on this topic. How has your organisation approached the challenge of justifying technology investments? What frameworks or tools have proven useful?

References
[1] Make UK and RSM UK, Investment Monitor 2024: Using Data to Drive Manufacturing Productivity. London, UK: Make UK, 2024.
[2] Make UK and BDO LLP, Manufacturing Outlook: 2024 Quarter 4. London, UK: Make UK, 2024.
[3] UK Government, Invest 2035: The UK’s Modern Industrial Strategy — Green Paper. London, UK: HM Government, Oct. 2024.

 

ARTICLE: Making Cobots Ready-to-Hand: A Compliance Perspective 

Written by Katia Bourahmoune, UTS & Acting Co-Lead Quality Assurance and Compliance program

Heidegger describes equipment as ready-to-hand when it disappears into practice, when its use is so seamlessly integrated that it ceases to be an object of thought and becomes instead a transparent extension of action. A hammer is not noticed as a hammer when it drives a nail effectively; it is only when it splinters or slips that it becomes an object of scrutiny, unready-to-hand, its use called into question. In modern manufacturing, collaborative robots (cobots) occupy an uneasy position between these two states. They promise repeatability, precision, and tireless monitoring, yet they are undeniably still machines to be supervised, audited, and monitored. In compliance and quality assurance, this human oversight of machines is necessary. After all, compliance remains the most human part of the hyper-mechanised modern manufacturing process. This is particularly evident in heavily regulated industries like medical device manufacturing, aviation, and defence, where errors are measured not only in costs but in lives and national security.

For cobots to become ready-to-hand, they must be genuinely collaborative: partners in the task rather than peripheral machinery. While collaboration in the context of human-robot interaction is hard to define and evolves as the field advances, it is useful to frame it in terms of the level of interaction between a human and a robot. These levels range from co-existence (shared space, individual actions), to co-operation (shared space, human-guided actions), to collaboration (shared space, joint bi-directional actions). Collaboration through this lens implies shared situational awareness, legible intent, and adaptive action: the robot exposes what it “perceives” (vision, force, …), why it is acting (constraints, goals, …), and how humans can adapt, override, or teach it. Such interfaces must preserve human agency and skilled technique while reducing ergonomic and cognitive load. In practice, this means adaptive assistance that yields to expert touch, explanations of proposed actions, and workflows that keep responsibility distributed rather than displaced. When collaboration works this way, it does more than improve throughput; it establishes the preconditions for assurance to be intrinsic rather than supervisory. On this foundation, compliance-by-design becomes possible: assurance embedded in action, rather than appended after it. Cobots can inspect as they assemble, verify as they position, and generate audit-grade evidence as a by-product of normal operation. They can extend human judgment through continuous monitoring, allowing human inspectors to concentrate on exceptions, interpretation, and continuous improvement.

This human-robot collaboration fundamentally hinges on trust. In production, workers must believe that a cobot will act predictably and safely; in quality assurance, they must also believe that the cobot’s monitoring and record-keeping are accurate and transparent. Research on automation psychology shows the dangers of both extremes: over-trust leads to blind reliance, while under-trust leads to redundancy and disuse. The literature points to several routes to calibrated trust, including reliable and predictable performance, timely feedback, options for human override, transparent explanations of decisions, and auditable records tied to actions; here we emphasise the compliance-critical elements of legibility, traceability, and contestability. Trust, then, is not an abstract sentiment but a design commitment: when cobots make their intentions legible and their decisions contestable, human operators retain meaningful agency in the loop. This keeps human judgment engaged precisely where it adds the most value. In regulated settings, this turns assurance into a shared practice rather than a supervisory afterthought, and it reorients collaboration toward preserving and amplifying human skill rather than displacing it.

Concerns are often raised that automation “deskills” human labour, relegating workers to passive supervision. Cobots designed for compliance offer the opposite prospect. By taking on repetitive inspection tasks, cobots free human expertise for higher-order judgment: interpreting anomalies, adapting processes, and innovating in response to unforeseen conditions. The skill does not vanish; it is re-centred where it matters most. In this way, cobots do not merely preserve skill but actively sustain it, ensuring that human judgment remains the decisive element in compliance.

The Compliance and Quality Assurance program at the Australian Cobotics Centre aims to develop practical tools that specify, monitor and evaluate human–robot collaboration using multi-modality sensing and AI for assessing compliance.

When cobots are truly ready-to-hand, i.e. useful, trustworthy, and engineered for compliance-by-design, they cease to be mere machines and become true collaborators that elevate human skill while making quality an intrinsic property of every human–robot action.

 

Further reading:  

Heidegger, M. (1962). Being and time (J. Macquarrie & E. Robinson, Trans.). New York, NY: Harper & Row. 

Guertler, M., Tomidei, L., Sick, N., Carmichael, M., Paul, G., Wambsganss, A., … & Hussain, S. (2023). When is a robot a cobot? Moving beyond manufacturing and arm-based cobot manipulators. Proceedings of the Design Society, 3, 3889-3898. https://doi.org/10.1017/pds.2023.390  

Hancock, P. A., Billings, D. R., & Schaefer, K. E. (2011). A meta-analysis of factors affecting trust in human-robot interaction. Human Factors, 53(5), 517–527.  https://doi.org/10.1177/0018720811417254  

Carmichael, M. (2023). Can we Unlock the Potential of Collaborative Robots?. Australian Cobotics Centre. https://www.australiancobotics.org/articles/can-we-unlock-the-potential-of-collaborative-robots/  

 

 

PhD Research Spotlight: Zongyuan Zhang Tackles Contact Tasks with Mobile Robots

As part of the Biomimic Cobots program within the Australian Cobotics Centre, PhD researcher Zongyuan Zhang is leading a project that addresses a key challenge in manufacturing: enabling mobile robots to perform high-precision contact tasks, such as grinding, polishing, and welding, on large, arbitrarily placed workpieces in factory environments.

Zongyuan brings a diverse background in robotics to this work. He holds an M.Sc. in Robotics from the University of Birmingham, UK, where he focused on applying deep learning to manipulator force control. His experience spans control system design, mechanical structure design, and participation in a range of innovative robotics projects—including underwater photography robots, driverless racing cars, exoskeleton mechanical arms, dual-rotor aircraft, and remote-control robotic arms—some of which are now undergoing commercialisation.

His PhD project, Contact Task Execution by Robot with Non-Rigid Fixation, investigates how robots with non-rigidly fixed chassis can maintain the accuracy, stability, and adaptability required for industrial contact tasks. These tasks typically demand hybrid force/position control and high contact forces, which are complicated by the mobility and flexibility of the robot’s base.

This research contributes to the Biomimic Cobots program’s goal of developing collaborative robots that mimic human sensing, learning, and manipulation skills. It explores:

  • How a robot mimics human control to execute contact tasks like sanding and grinding.
  • How augmented mobility enables task execution in large, unconstrained spaces.
  • How minimal task-specific programming can be used to adapt to new workpieces and environments.

Zongyuan, based at QUT, is supervised by Professor Jonathan Roberts, Professor Will Browne, and Dr Chris Lehnert, and is working onsite at ARM Hub alongside industry partner, Vaulta. The project with industrial partners concerns the efficient and accurate removal of surface oxides from metallic materials, thereby enabling tighter bonding between metal components. This embedded collaboration ensures his research is conducted in real production environments and remains grounded in the practical needs of Australian manufacturers.

Recent milestones include:

  • Design and deployment of a framework for performing industrial sanding tasks using collaborative robots.
  • Utilisation of sound as a multimodal input to improve the robustness of the sanding process and enhance the cost-efficiency of the robotic system.
  • Exploration of how humanoid robots can achieve high-precision performance in contact tasks.

Check out our website for the latest on his project: Project 1.1 – Contact Task Execution by Robot with Non-Rigid Fixation.

Zongyuan pictured (centre) with ARM Hub’s Technology Lead, Dr Troy Cordie (top picture, L), and Queensland’s Deputy Premier, Minister for State Development, Infrastructure and Planning and Minister for Industrial Relations, Jarrod Bleijie MP (R).

Celebrating the Robotics & Advanced Manufacturing Centre at TAFE Queensland!

On 17th June 2025, our Centre was part of the official opening of the Robotics and Advanced Manufacturing Centre (RAMC) at TAFE Queensland’s Eagle Farm campus. This 5-Star Green Star-rated facility was officially launched by the Honourable Ros Bates MP, Minister for Finance, Trade, Employment and Training. The event also featured Minister Tim Nicholls MP and a talk from ARM Hub’s Technology Lead, Dr Troy Cordie.

As part of the opening, TAFE opened their doors for an Industry Open Day. Centre Director Prof Jonathan Roberts, Centre Manager Merryn Ballantyne, and PhD researcher Jacqueline Greentree spoke with over 150 attendees about our exciting initiatives and upcoming events.

The Open Day also included stands and demonstrations from Mynt Energy Tech, ARM Hub, Aptella, Infinispark, and HFS Design (formerly Micromelon Engineering) providing a showcase of Queensland’s collaborative innovation ecosystem.

Thank you to TAFE Queensland for inviting us to be part of the event and to Benaiah Fenby and team for organising such a wonderful day. The new TAFE Centre is a fantastic step forward in preparing Queensland’s workforce for the future of work in emerging and sustainable industries. And, as one of our Centre’s industry partners, we look forward to continuing to work together to ensure manufacturers are equipped for the next generation of advanced manufacturing.

📸 TAFE Queensland, Merryn Ballantyne, Jonathan Roberts

More Than Machines: Why Do We Want to Build Robots That Look Like Us? 

Written by Dr Alan Burden, QUT Postdoctoral Research Fellow, Designing Socio-Technical Robotics System program.

A colleague recently questioned why we are building robots that look human. If other machines already perform tasks reliably, robots in human shapes reveal more about our expectations than about technical necessity. Apart from striving to fulfil sci-fi fantasies, there seems to be little logical reason for many industries to develop humanoids. 

Humanoid robots are machines designed to resemble and move like humans, typically featuring an identifiable head, torso, arms, and legs, enabling interaction with people, objects, and environments in human-centred ways.  

In 2025, manufacturers are projected to ship approximately 18,000 humanoid robots globally [1], marking a significant step toward broader adoption. Looking ahead, Goldman Sachs forecasts that by 2035, the humanoid robot market could reach USD $38 billion (approximately AUD $57 billion), with annual shipments increasing to 1.4 million units [2]. Further into the future, Bank of America projects that by 2060, up to 3 billion humanoid robots could be in use worldwide, primarily in homes and service industries [1]. 

From Tesla’s Optimus Gen 2 [3] to Figure AI’s Figure 02 [4], the humanoid robot is no longer a figment of science fiction. These robots will walk, lift, talk, and perform factory tasks. Yet beneath the surface of innovation lies a deeper question: Will we build humanoid robots because the human form is genuinely useful, or because it reflects our own image back at us? 

In an age where industrial arm robots, wheeled and tracked platforms, and flying drones already perform industrial tasks with precision, the humanoid form can seem like an odd choice. These robots will be complex, expensive to develop, and often over-engineered for the roles they are expected to perform. 

So what explains the current fascination with building robots in our own image? 

Form vs Function: The Practical Debate 

Our world is designed around the human body. Door handles, tools, staircases, and car pedals all presume a body with arms, legs, and binocular vision. Humanoid robots will therefore adapt more easily to our environments. 

Still, there is a contradiction worth unpacking. We already have machines that operate far more efficiently without the constraints of two legs and a torso. Amazon’s warehouse bots glide on wheels, carrying shelving units with speed and precision [5]. Boston Dynamics’ Spot, a quadruped, excels at inspections and terrain navigation [6]. Agility Robotics’ Digit uses bipedal bird-like legs to move efficiently through human-centric spaces [7]. 

Humanoid robots won’t necessarily be more capable but may be more compatible with existing environments, especially where infrastructure redesign would be costly or disruptive. This compatibility advantage reflects what Stanford’s Human-Centred AI Institute describes as the affordance of embodied compatibility rather than pure efficiency [8]. 

The Psychological Shortcut 

People respond to humanoid forms with startling immediacy. A robot with a face, a voice, and gestures doesn’t just operate in our space – it socially occupies it. 

That connection brings both benefits and barriers. Humanoid robots will be easier to instruct, cooperate with, or trust, especially in care or customer service roles. This intuitive rapport, however, will come at a cost. We’ll also project emotions, intentions, and even moral status onto these mechanical beings. The IEEE’s Global Initiative on Ethics of Autonomous Systems [9] has warned that anthropomorphic design risks confusion over autonomy, trust, and accountability. 

A robotic arm making a mistake will seem tolerable. A robot with eyes and facial features doing the same will feel uncanny. The “uncanny valley”, a term coined by roboticist Masahiro Mori in 1970 to describe the discomfort people feel when a robot or virtual character looks almost human but not quite [10], will blur the line between tool and companion, worker and being. 

Redefining Labour and Power 

Humanoid robots will often be pitched as general-purpose labourers: tireless, adaptable, and compliant. In some ways, they’ll echo the 19th-century industrial ideal of the perfect worker. 

But this vision raises complex questions. If these machines replace humans in repetitive or hazardous roles, how will we protect the dignity and security of displaced workers? If a robot becomes a “colleague,” what responsibilities will come with that illusion? 

The Future of Humanity Institute at Oxford [11] noted that humanoids could contribute to a shift in how we view authority and social dynamics. If robots are always obedient, will we begin to expect the same from people? Automation will soon shape not just job loss, but workplace culture and human behaviour. This connects with human-robot interaction research on anthropomorphic framing and robot deception, which cautions against uncritically assigning social roles to machines [11]. 

Who Are We Really Building? 

At its core, the humanoid robot reflects our self-image. When Boston Dynamics’ Atlas robot performed parkour in a now-iconic demonstration video [12], public fascination was less about mechanics and more about the eeriness of watching something mechanical move with such human-like agility. The video, titled Atlas – Partners in Parkour, showcased robots jumping, flipping, and vaulting through a gymnastics course, triggering admiration, unease, and a wave of social media memes drawing comparisons with the Terminator films. 

This is not new. From clockwork automatons in royal courts to androids in science fiction, each era’s robots mirror its anxieties and desires. For instance, Hanson Robotics’ Sophia [13] was designed with expressive facial features to promote naturalistic interaction, yet remains polarising, often dismissed as a novelty. Is it an advance in social robotics or a symbol of anthropomorphic overreach? 

The goal of today’s humanoids reveals our priorities. Tesla’s Optimus will be built to handle repetitive factory work. Figure AI’s humanoids will aim to integrate into warehouse workflows. These designs won’t just be technical – they will symbolise which human qualities we value and which jobs we are ready to relinquish. 

The Real Question 

As mechanical humans enter our homes and workplaces, we must ask what they will symbolise beyond their specs. Humanoid robots will reflect assumptions about work, social interaction, and human worth. When we automate tasks in human form, we choose which parts of ourselves we replicate and which we outsource. 

The most pressing questions won’t be about joint torque or facial recognition, but about how these machines reshape our relationships with technology, labour, and each other. Robots, like all tools, embody human intention. The challenge isn’t building minds like ours, but questioning why we keep giving them our face. 

References 

[1] Koetsier, J. (2025, April 30). Humanoid robot mass adoption will start in 2028, says Bank of America. Forbes. https://www.forbes.com/sites/johnkoetsier/2025/04/30/humanoid-robot-mass-adoption-will-start-in-2028-says-bank-of-america/ 

[2] Goldman Sachs. (2024, January 8). The global market for humanoid robots could reach $38 billion by 2035. https://www.goldmansachs.com/insights/articles/the-global-market-for-robots-could-reach-38-billion-by-2035 

[3] Tesla. (2023). Tesla Optimus: Our Humanoid Robot. https://www.tesla.com/AI 

[4] Figure AI. (2024). Figure 02 Robot Overview. https://www.figure.ai/ 

[5] Amazon. (2025). Facts & figures: Amazon fulfillment centers and robotics. https://www.aboutamazon.co.uk/news/innovation/bots-by-the-numbers-facts-and-figures-about-robotics-at-amazon 

[6] Boston Dynamics. (2025). Spot | Boston Dynamics. https://bostondynamics.com/products/spot/ 

[7] Agility Robotics. (2025). Digit – ROBOTS: Your Guide to the World of Robotics. https://www.agilityrobotics.com/ 

[8] Srivastava, S., Li, C., Lingelbach, M., Martín-Martín, R., Xia, F., Vainio, K., Lian, Z., Gokmen, C., Buch, S., Liu, K., Savarese, S., Gweon, H., Wu, J., & Fei-Fei, L. (2021). BEHAVIOR: Benchmark for Everyday Household Activities in Virtual, Interactive, and Ecological Environments. arXiv preprint arXiv:2108.03332. https://arxiv.org/abs/2108.03332 

[9] IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems. (2020). Ethically Aligned Design, 1st ed. https://ethicsinaction.ieee.org/ 

[10] Mori, M. (1970). The uncanny valley. Energy, 7(4), 33–35. (English translation by MacDorman & Kageki, 2012, IEEE Robotics & Automation Magazine). https://doi.org/10.1109/MRA.2012.2192811 

[11] Brundage, M., Avin, S., Clark, J., Toner, H., Eckersley, P., Garfinkel, B., … & Amodei, D. (2018). The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation. Future of Humanity Institute, University of Oxford. Retrieved from https://arxiv.org/abs/1802.07228 

[12] Boston Dynamics. (2021, August 17). Atlas | Partners in Parkour [Video]. YouTube. https://www.youtube.com/watch?v=tF4DML7FIWk 

[13] Hanson Robotics. (2025). Sophia the Robot. https://www.hansonrobotics.com/sophia/ 

 

What Would Jim Henson Do? Roleplaying Human-Robot Collaborations Through Puppeteering

By James Dwyer and Dr Valeria Macalupu (both QUT)

 

 A tangible, adaptable and modular interface for embodied explorations of human-robot interaction concepts.

As robots become increasingly integrated into various industries, from healthcare to manufacturing, the need for intuitive and adaptable tools to design and test robotic movements has never been greater. Traditional approaches often rely on expensive simulations or complex hardware setups, which can restrict early-stage experimentation and limit participation from non-expert stakeholders. The kinematic puppet offers a refreshing alternative by combining hands-on prototyping with virtual simulation, making it easier for anyone to explore and refine robot motion. This work is particularly critical for exploring intuitive ways surgeons can collaborate with robots in the operating room, improving Robot-Assisted Surgery (RAS).

 What is the kinematic puppet?

The kinematic puppet is an innovative tool that combines physical prototyping and virtual simulation to simplify the design and testing of robot movements and human-robot interactions. The physical component is a modular puppet constructed from 3D-printed joints equipped with rotary encoders and connected by PVC linkages. This flexible and cost-effective setup allows users to customise a robot arm to suit a variety of needs by adjusting linkage lengths and joint attachments.

On the digital side, a virtual simulation environment (developed in Unreal Engine) creates a real-time digital twin of the physical puppet. This integration via Wi-Fi/UDP enables immediate visualisation and testing of HRI concepts. By bridging the gap between physical manipulation and digital analysis, the kinematic puppet makes it easier for anyone to experiment with and refine robot motion in an interactive and accessible way.
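As a rough sketch of how such a link might work, joint angles could be streamed to the simulation as JSON datagrams over UDP. The message format, port, and field names below are assumptions for illustration; the kinematic puppet’s actual protocol is not documented here.

```python
# Hypothetical sketch of a puppet-to-simulation link: joint angles read from
# the rotary encoders are streamed as JSON datagrams over UDP to a listener
# (e.g. an Unreal Engine digital twin). All names and the port are placeholders.
import json
import socket

SIM_ADDR = ("127.0.0.1", 9999)  # placeholder address of the simulation listener

def encode_pose(joint_angles_deg):
    """Pack a list of joint angles (degrees) into a UDP-sized JSON payload."""
    return json.dumps({"joints": joint_angles_deg}).encode("utf-8")

def send_pose(sock, joint_angles_deg):
    """Fire one pose frame at the simulation; UDP keeps latency low and
    tolerates the occasional dropped frame, which suits real-time mirroring."""
    sock.sendto(encode_pose(joint_angles_deg), SIM_ADDR)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_pose(sock, [0.0, 45.0, -30.0, 90.0])  # one frame of the digital twin
sock.close()
```

UDP is a natural fit for this kind of pose streaming because a stale frame is worthless: it is better to drop it than to delay newer ones, as TCP retransmission would.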

Figure 1. The physical and virtual components of the kinematic puppet.

How does the user interact with the puppet?

In the demonstration, users engage with the system by physically manipulating the kinematic puppet to control a digital twin of the robot arm, guiding it through a virtual cutting task. As they direct the arm’s movements, a virtual cutting tool simulates material removal in real time.

The system provides continuous feedback through both visual displays and haptic responses, creating an immersive and intuitive experience. This interactive environment challenges participants to balance precision and speed, highlighting the importance of both accuracy and efficiency in robotic tasks.

By making the abstract process of programming robotic movements tangible, the kinematic puppet empowers users to experiment and learn in a dynamic environment.

Figure 2. James showing how the kinematic puppet works.

Demonstration at HRI 2025 – An experience for HDR students.

Presenting the Kinematic Puppet at the Human-Robot Interaction Conference 2025 provided valuable insights into how our research resonates with the broader robotics community. Attendees were particularly drawn to the system’s modularity and reconfigurability and appreciated the puppetry-based approach as an intuitive method for exploring human-robot interaction concepts.

The demonstration wasn’t without challenges. Technical issues required some mildly frantic rebuilding of the code on the morning of the demo, highlighting a common research reality: experimental prototypes accumulate small bugs through iterative development, and these can compound unexpectedly. It is an all-too-common challenge that reflects the messy nature of research, and one that isn’t always visible in polished publications.

Reviewer feedback highlighted potential applications we hadn’t considered, particularly around improving the accessibility of research technologies. While most attendees engaged enthusiastically with the concept, some struggled to connect it to their work. It took time for me to find effective ways to explain the purpose and value of the approach: a good reminder that not every method resonates equally in a diverse field, and that it is important to tailor explanations to your audience, even within a given research community.

For an HDR student, this experience underscores the importance of exposing your work to the research community early. The value lies not in positive reception, but in the process of presenting the work itself. Getting to explain my work to others forced me to articulate and refine my thinking, an opportunity that is missed when work is conducted in isolation. These interactions helped me understand how my work fits within the broader landscape and sparked new reflections on its purpose and potential applications that I might have missed otherwise.

You can read more about this demo here: https://dl.acm.org/doi/10.5555/3721488.3721764

Dwyer, J. L., Johansen, S. S., Donovan, J., Rittenbruch, M., & Gomez, R. (2025). What Would Jim Henson Do? Roleplaying Human-Robot Collaborations Through Puppeteering. In Proceedings of the 2025 ACM/IEEE International Conference on Human-Robot Interaction, Melbourne, Australia.

TL;DR

  1. Accessible Design: The kinematic puppet combines physical prototyping with virtual simulation for intuitive human-robot interaction design.
  2. Intuitive Feedback for Seamless Experience: Users control a customisable robot arm through hands-on manipulation while receiving real-time visual and haptic feedback. This novel approach supports Robot-Assisted Surgery design processes by enabling the intuitive exploration of human-robot interactions.
  3. Creative Inspiration: Inspired by film animation techniques and puppeteering, this low-cost, adaptable tool enables rapid prototyping and innovative experimentation in human-robot interaction research more broadly.
  4. Communicating Complex Research Concepts: Explaining novel methodological approaches often requires tailoring explanations to diverse audiences. Even within a specialised community like HRI, individuals connect with ideas differently, and finding effective ways to articulate the purpose and value of a new method is an ongoing challenge that improves with practice.
  5. Early Exposure of Research Work: Presenting research work to the community provides invaluable benefits beyond simply positive reception. The process of presenting forces the articulation and refinement of ideas, reveals how your work fits within the broader research landscape, and often uncovers applications and connections you might otherwise miss when working in isolation.

2024 Industry Symposium and Cobot mini-Expo

The Australian Cobotics Centre successfully hosted its first Industry Symposium and Mini-Expo on Thursday, 5th December 2024, at the QUT Kelvin Grove campus in Brisbane. This engaging event, sponsored by ARM Hub AI Adopt Centre and the Queensland Government’s Department of Natural Resources and Mines, Manufacturing and Regional and Rural Development, brought together over 100 manufacturers, researchers, and industry professionals to explore the latest advancements in advanced manufacturing technologies and discuss the evolving role of humans in the future of manufacturing.

Event Highlights

Keynote talk

Professor Cori Stewart, CEO of ARM Hub, gave a keynote talk about the current state of Australian manufacturing and how AI can enhance productivity.

Industry talks

During the event, we heard from industry partners of the Centre about their experiences with cobots and related technology:

 

Panel Discussion: “The Future of Australian Manufacturing: From AI to Humanoids, Where Does the Human Fit?”

One of the standout moments of the event was the panel discussion facilitated by Professor Jonathan Roberts, Director of the Australian Cobotics Centre. Panelists included:

  • Dr Cornelis van Niekerk, Weld Australia
  • Associate Professor Penny Williams, QUT Centre for Decent Work and Industry and Australian Cobotics Centre
  • Dr Sue Keay, Robotics Australia Group
  • Richard (Ric) Pruss, Workr Labs

This lively discussion explored the transformative impact of technologies like AI and humanoid robots on manufacturing. The panel also addressed the importance of maintaining a human-centred approach in an increasingly automated industry, sparking thought-provoking dialogue among attendees.

Mini-Expo

The mini-expo was a highlight for many participants, offering hands-on experiences with cobotic technologies. Live research demonstrations and stands from exhibitors including DCISIV Technologies, Queensland XR Hub, ARM Hub AI Adopt Centre, and the Queensland Government’s Department of Natural Resources and Mines, Manufacturing and Regional and Rural Development showcased the potential of these innovations to enhance productivity, improve workplace safety, and support the competitiveness of Australian manufacturers.

Looking Ahead

Events like this symposium play a crucial role in strengthening industry connections and disseminating research outcomes.

For those who couldn’t attend, stay tuned for future events and opportunities to engage with the Centre’s groundbreaking work. Check out our program and project pages to learn more about our ongoing projects and upcoming initiatives.

We extend our gratitude to all speakers, panelists, and attendees who made the 2024 Industry Symposium a resounding success.

 

2024 Centre Awards

At the annual ACC Symposium, an awards evening was held, with nominations put forward by Centre members in the lead-up to the event. Our annual awards were a great way of celebrating the achievements of our people and their collaborative efforts over the past 12 months. The event was hosted by our Centre Director, Professor Jonathan Roberts.

2024 winners included:

  • Research Achievement: The Achievement award is presented to a person who has made an outstanding contribution to ACC-related research.
  • Contribution to Public Debate: The award for Best Contribution to Public Debate is awarded to an individual or team who has exhibited outstanding thought leadership on any ACC related topic and its impact on society.
  • Best Collaborative Research output: The Best Collaborative Research Output is awarded to researchers who have worked together to conduct collaborative research relevant to the ACC, as evidenced by this research output.
    • WINNER: Dr Fouad (Fred) Sukkar, for his work with THWS, Prof Tobias Kaupp, Usama Ali, and Adrian Müller
  • Best Event: The award for Best Profile-Raising Event is presented to the individual or team who best represents the aspirations of the Centre.
    • WINNER: Sparking Innovation, Weld Australia Roadshow at Swinburne 
  • Emerging Leader: The award for Emerging Leader is awarded to the individual who provides guidance and inspiration to their peers and has displayed promising leadership skills.
    • WINNER: Jasper Vermeulen, QUT PhD Researcher
  • Quiet Achiever: The Quiet Achiever award is for a person who has produced an impressive amount of research outcomes in the last 12 months or made significant research progress.
  • EPIC Centre Citizen: The award for Centre Citizen is awarded to a person who embodies the spirit of the Centre and fosters a supportive, innovative, inclusive and fun environment, as well as being an all-round achiever who displays positivity and resourcefulness.
  • Whoopsie Daisy: The Whoopsie Daisy Award honours an individual who, when faced with an unexpected challenge or mistake in their work, demonstrated resilience and transformed the situation into a valuable learning opportunity. This award celebrates their ability to adapt, grow, and turn setbacks into successes, showcasing the strength of innovation and perseverance in overcoming obstacles.
    • WINNER: Dr Alan Burden, QUT Postdoctoral Research Fellow
  • Industry Champion (individual): The Industry Champion Award is presented to a person who has demonstrated a strong commitment to engaging and collaborating with industry. This award celebrates leadership in fostering partnerships between industry and academia, driving innovation in automation, and promoting the integration of cobotics technology to enhance productivity, safety, and sustainability.
  • Industry-Research Collaboration (Team): This award is presented to a project team of industry partners and researchers who have collaborated to create impactful solutions that address real-world challenges. The team’s efforts demonstrate a successful integration of industry needs and research innovation and showcase the benefits of industry and academic collaboration.
    • WINNERS:
      • The Swinburne team and Universal Shower Base: Prof Mats Isaksson, Mariadas Roshan, Jagannatha Pyaraka, Rebecca Lowery
      • Translation projects with Stryker: Partner Investigator, Dr Tom Williamson, Dr Alan Burden, Dr Stine Johansen, James Dwyer, Jasper Vermeulen, Yuan Liu, Prof Markus Rittenbruch, Prof Glenda Caldwell, A/Prof Müge Belek Fialho Teixeira, A/Prof Jared Donovan and Dr Matthias Guertler.


Celebrating Research Excellence: Australian Cobotics Centre’s Research Showcase 2024

On 5th December 2024, the Australian Cobotics Centre proudly hosted its annual Research Showcase, celebrating the achievements of its researchers over the past 12 months. This year’s event provided an invaluable opportunity for the Centre’s HDR (Higher Degree by Research) students and early career researchers to present their research outcomes in a dynamic and engaging format through lightning talks and research demonstrations.

The showcase was hosted by Professor Glenda Caldwell, Associate Director of Research Training, who emphasised the importance of professional development and communication skills in research careers. In line with the Centre’s training and development program, researchers delivered lightning talks—a presentation format that challenged them to distill their research aims, outcomes, and impact into a three-minute talk suitable for a multidisciplinary audience.

Three-minute lightning talks

The showcase featured a diverse range of topics reflecting the Centre’s interdisciplinary focus on collaborative robotics in manufacturing and beyond. The following presentations were delivered:

  1. James Dwyer – Rethinking Our Approach: Rapid Prototyping, Fast Failures, and Facilitating Interdisciplinary Conversations in Cobotics
  2. Stine Johansen – Robotic Blended Sonification
  3. Akash Hettiarachchi – The Impact of Collaborative Robots (Cobots) on a Diversified Manufacturing Workforce
  4. Phuong Anh Tran – A Research Study on the Design of Human-Cobot Manufacturing Work
  5. Munia Ahamed – Trust Your Robot: Building Acceptance in Human-Robot Teams
  6. Nadimul Haque – Framework for Adapting Robot Skills to Novel Tasks
  7. Jagannatha Pyaraka – Minimal Key-Point Learning for Robot Skill Transfer from Videos
  8. Zongyuan Zhang – Using a Mobile Robot for Sanding: The Non-Rigid Fixation Problem
  9. Jasper Vermeulen – How Humans (Still) Play a Pivotal Role in Successful Human-Robot Teams
  10. Yuan Liu – Human Decision Making in Human-Robot Collaboration (HRC)
  11. Alan Burden – Optimising Cobot Integration with AR Simulation

Demonstrations

Rapid Robot Prototypes, presented by James Dwyer, PhD researcher, QUT, Human Robot Interaction research program

Human-Robot Collaboration (HRC) has the potential to improve work across various sectors, but it also brings changes to work practices, processes, and ways of thinking. Practical design tools are needed to connect the technical advancements in HRC with the real-world needs and expectations of end-users.

This demonstration introduces the ‘kinematic puppet,’ a novel, cost-effective, modular interface that makes it easy to explore robotic movements and interactions. By physically simulating different scenarios and controlling robot actions manually through “puppeteering”, it enables teams to test HRC concepts without needing specialised robotics skills. This hands-on approach provides actionable insights into the usability, efficiency, and ergonomic suitability of robotic platforms across different HRC scenarios and tasks.

Interactive distance field mapping and planning (IDMP) framework, presented by Dr Fouad Sukkar, Postdoctoral Research Fellow, UTS; Nadimul Haque, PhD researcher, UTS, Biomimic Cobots research program

Human-robot collaborative applications require scene representations that are kept up-to-date and facilitate safe robot motions in dynamic scenes. In this exhibition we showcase an interactive distance field mapping and planning (IDMP) framework that handles dynamic objects and collision avoidance through an efficient Gaussian Process field representation. In terms of mapping, IDMP is able to fuse point cloud data from single and multiple sensors, query the free space at any spatial resolution, and deal with moving objects without semantics. In terms of planning, IDMP allows seamless integration with gradient-based reactive planners facilitating dynamic obstacle avoidance for safe human-robot interactions. Video, code, and datasets are publicly available at https://uts-ri.github.io/IDMP.
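
The planning side of this idea depends on querying the field's distance and gradient at any point. The sketch below illustrates gradient-based reactive avoidance in general terms only: a single sphere stands in for the learned Gaussian Process field, and the step size, safety margin, and gain are invented values, not IDMP's.

```python
import math

# Minimal sketch of gradient-based reactive avoidance over a distance
# field. IDMP queries a GP field fused from point clouds; here a sphere
# stands in for that field, purely for illustration.
OBSTACLE, RADIUS = (0.5, 0.0, 0.3), 0.1

def distance_and_gradient(p):
    """Distance to the obstacle surface and its unit gradient (pointing away)."""
    d = [p[i] - OBSTACLE[i] for i in range(3)]
    norm = math.sqrt(sum(c * c for c in d))
    return norm - RADIUS, [c / norm for c in d]

def reactive_step(p, goal, step=0.02, margin=0.15, gain=0.05):
    """One planner step: attract toward the goal, and push along the
    distance-field gradient whenever clearance drops below the margin."""
    to_goal = [goal[i] - p[i] for i in range(3)]
    norm = math.sqrt(sum(c * c for c in to_goal)) or 1.0
    v = [step * c / norm for c in to_goal]
    dist, grad = distance_and_gradient(p)
    if dist < margin:
        push = gain * (margin - dist) / margin
        v = [v[i] + push * grad[i] for i in range(3)]
    return [p[i] + v[i] for i in range(3)]

# Walk a point past the obstacle, recording the worst clearance seen.
p, goal = [0.0, 0.02, 0.3], [1.0, 0.0, 0.3]
min_clearance = float("inf")
for _ in range(300):
    p = reactive_step(p, goal)
    min_clearance = min(min_clearance, distance_and_gradient(p)[0])
```

Because the repulsive term grows as clearance shrinks, the point curves around the obstacle rather than stopping, which is the behaviour a reactive planner needs when people move through the workspace.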

Real-Time Collaborative Action Tracking (ReCAT) system, presented by Yuan Liu, PhD Researcher, QUT, Designing Socio-Technical Robotic Systems research program

The Real-Time Collaborative Action Tracking (ReCAT) system demonstrates how human pose, movement, and gestures can be detected and tracked through a camera in real time. Using advanced computer vision techniques, attendees can see firsthand how a webcam captures and interprets behaviour, offering valuable insights to support the future of human-centred design in human-robot collaboration.
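
As a toy illustration of what such tracking enables downstream, the sketch below classifies a raised-hand gesture from per-frame 2D keypoints. The keypoint names, coordinates, and rule are invented for the example; ReCAT's actual pipeline, which performs the detection itself from camera frames, is not reproduced here.

```python
# Hypothetical downstream step for a pose-tracking system: given 2D
# keypoints per frame (normalised image coordinates, y growing downward),
# a simple geometric rule can label gestures such as a raised hand.
def hand_raised(keypoints):
    """True if either wrist sits above its shoulder in image coordinates."""
    return (keypoints["left_wrist"][1] < keypoints["left_shoulder"][1]
            or keypoints["right_wrist"][1] < keypoints["right_shoulder"][1])

def track_gesture(frames):
    """Label each frame's keypoints with a gesture string."""
    return ["hand_raised" if hand_raised(kp) else "neutral" for kp in frames]

# Two synthetic frames: arms down, then the right wrist lifted.
frames = [
    {"left_wrist": (0.30, 0.70), "left_shoulder": (0.35, 0.40),
     "right_wrist": (0.70, 0.70), "right_shoulder": (0.65, 0.40)},
    {"left_wrist": (0.30, 0.70), "left_shoulder": (0.35, 0.40),
     "right_wrist": (0.72, 0.25), "right_shoulder": (0.65, 0.40)},
]
labels = track_gesture(frames)
```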

XR Application for Cobot Interaction and Workflow Optimisation, presented by Dr Alan Burden, Postdoctoral Research Fellow, QUT, Designing Socio-Technical Robotic Systems research program

The demo showcases an Augmented Reality (AR) interface using a Meta Quest headset and Unreal Engine 5. The AR system employs simple visual and interactive elements to send commands to a cobot, enabling it to perform tasks like picking and placing items with precision. The AR interface highlights the 3D environment, giving users a clear, intuitive understanding of how cobots can be effectively integrated into real-world spaces.

Quality Assurance & Compliance using haptic feedback, presented by Dr Mariadas Roshan, Postdoctoral Research Fellow, Swinburne & Danial Rizvi, PhD researcher, UTS, Quality Assurance & Compliance research program

Demonstrations include a video showcasing the use of the Spot Dog as an assistant, another video illustrating how live feedback can be utilised to optimise robotic welding processes, and a live demonstration of a haptic device designed for inspecting manufacturing parts with restricted access. Gain insights into practical applications of robotics and emerging technologies aimed at improving efficiency and quality assessment in industrial environments.


The 2024 Research Showcase not only celebrated achievements but also reinforced the Centre’s commitment to supporting the professional growth of its researchers. By focusing on communication, engagement, and collaboration, the Centre is equipping its students and early career researchers with the tools they need to drive innovation in cobotics and advanced manufacturing.

We congratulate all presenters on their contributions and look forward to seeing their continued impact in the coming year.

 

Project Wrap-up: Shorts project Phase 2

Our research team from UTS and QUT has wrapped up phase 2 of the “Shorts” project with Infrabuild, which involved demonstrating steel bar removal using a lightweight collaborative robot. This was an important milestone in proving that a smaller and safer robot could carry out similar work to the current operators.

In addition, sensors placed along the bar production line in the Sydney Bar Mill have been capturing footage of short bars over several months. This comprehensive dataset of various bar types is being used to develop algorithms for automatic detection of defective short bars.
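
As a toy illustration of the kind of rule such algorithms might start from (the Centre's actual detection methods are not described here, and every number below is invented), a short bar can be flagged by measuring contiguous runs of detections in a line-scan profile:

```python
# Illustrative sketch only: a line-scan "profile" is modelled as a list
# of booleans marking where material was sensed; contiguous runs of True
# are candidate bars. All thresholds and resolutions are hypothetical.
NOMINAL_LENGTH_MM = 6000
TOLERANCE_MM = 50
MM_PER_SAMPLE = 10  # assumed sensor resolution

def bar_lengths(profile):
    """Return the length in mm of each contiguous run of detections."""
    lengths, run = [], 0
    for hit in profile:
        if hit:
            run += 1
        elif run:
            lengths.append(run * MM_PER_SAMPLE)
            run = 0
    if run:
        lengths.append(run * MM_PER_SAMPLE)
    return lengths

def flag_short_bars(profile):
    """Flag any bar shorter than nominal length minus tolerance."""
    limit = NOMINAL_LENGTH_MM - TOLERANCE_MM
    return [length for length in bar_lengths(profile) if length < limit]

# One in-spec bar (600 samples) followed by one short bar (400 samples).
profile = [True] * 600 + [False] * 20 + [True] * 400
shorts = flag_short_bars(profile)
```

A production system would of course work from camera footage and learned models rather than a clean boolean profile, but the underlying decision, comparing measured length against a specification limit, is the same.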

Members from the Centre’s Biomimic Cobots program (Program 1) visited Infrabuild’s Sydney Bar Mill in October last year to discuss findings from the study and to plan the next steps of the project.

The next phase of the project will see the integration of the short bar detection and bar removal systems. In addition, key upgrades to the sensor system are underway to improve the detection of short bars and cover a wider range of scenarios identified during the previous phase.

Another focus will be the human aspect: understanding how a collaborative robot can integrate into existing workflows and how best to meet workers’ expectations. This will be an exciting opportunity to gain insights from workers and to collaborate with other programs across the Centre.