
More Than Machines: Why Do We Want to Build Robots That Look Like Us? 

Written by Dr Alan Burden, QUT Postdoctoral Research Fellow, Designing Socio-Technical Robotics System program.

A colleague recently questioned why we are building robots that look human. If other machines already perform tasks reliably, robots in human shapes reveal more about our expectations than about technical necessity. Apart from striving to fulfil sci-fi fantasies, there seems to be little logical reason for many industries to develop humanoids.

Humanoid robots are machines designed to resemble and move like humans, typically featuring an identifiable head, torso, arms, and legs, and capable of interacting with people, objects, and environments in human-centred ways.

In 2025, manufacturers are projected to ship approximately 18,000 humanoid robots globally [1], marking a significant step toward broader adoption. Looking ahead, Goldman Sachs forecasts that by 2035, the humanoid robot market could reach USD $38 billion (approximately AUD $57 billion), with annual shipments increasing to 1.4 million units [2]. Further into the future, Bank of America projects that by 2060, up to 3 billion humanoid robots could be in use worldwide, primarily in homes and service industries [1]. 

From Tesla’s Optimus Gen 2 [3] to Figure AI’s Figure 02 [4], the humanoid robot is no longer a figment of science fiction. These robots will walk, lift, talk, and perform factory tasks. Yet beneath the surface of innovation lies a deeper question: Will we build humanoid robots because the human form is genuinely useful, or because it reflects our own image back at us? 

In an age where industrial arm robots, wheeled and tracked platforms, and flying drones already perform industrial tasks with precision, the humanoid form can seem like an odd choice. These robots will be complex, expensive to develop, and often over-engineered for the roles they are expected to perform. 

So what explains the current fascination with building robots in our own image? 

Form vs Function: The Practical Debate 

Our world is designed around the human body. Door handles, tools, staircases, and car pedals all presume a body with arms, legs, and binocular vision. Humanoid robots will therefore adapt more easily to our environments. 

Still, there is a contradiction worth unpacking. We already have machines that operate far more efficiently without the constraints of two legs and a torso. Amazon’s warehouse bots glide on wheels, carrying shelving units with speed and precision [5]. Boston Dynamics’ Spot, a quadruped, excels at inspections and terrain navigation [6]. Agility Robotics’ Digit uses bipedal bird-like legs to move efficiently through human-centric spaces [7]. 

Humanoid robots won’t necessarily be more capable, but they may be more compatible with existing environments, especially where infrastructure redesign would be costly or disruptive. This compatibility advantage is what Stanford’s Human-Centred AI Institute describes as the affordance of embodied compatibility rather than pure efficiency [8].

The Psychological Shortcut 

People respond to humanoid forms with startling immediacy. A robot with a face, a voice, and gestures doesn’t just operate in our space – it socially occupies it. 

That connection brings both benefits and barriers. Humanoid robots will be easier to instruct, cooperate with, or trust, especially in care or customer service roles. This intuitive rapport, however, will come at a cost. We’ll also project emotions, intentions, and even moral status onto these mechanical beings. The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems [9] has warned that anthropomorphic design risks confusion over autonomy, trust, and accountability.

A robotic arm making a mistake will seem tolerable. A robot with eyes and facial features doing the same will feel uncanny. The “uncanny valley”, a term coined by roboticist Masahiro Mori in 1970 to describe the discomfort people feel when a robot or virtual character looks almost human but not quite [10], will blur the line between tool and companion, worker and being.

Redefining Labour and Power 

Humanoid robots will often be pitched as general-purpose labourers: tireless, adaptable, and compliant. In some ways, they’ll echo the 19th-century industrial ideal of the perfect worker. 

But this vision raises complex questions. If these machines replace humans in repetitive or hazardous roles, how will we protect the dignity and security of displaced workers? If a robot becomes a “colleague,” what responsibilities will come with that illusion? 

The Future of Humanity Institute at Oxford [11] noted that humanoids could contribute to a shift in how we view authority and social dynamics. If robots are always obedient, will we begin to expect the same from people? Automation will soon shape not just job loss, but workplace culture and human behaviour. This connects with human-robot interaction research on anthropomorphic framing and robot deception, which cautions against uncritically assigning social roles to machines [11]. 

Who Are We Really Building? 

At its core, the humanoid robot reflects our self-image. When Boston Dynamics’ Atlas robot performed parkour in a now-iconic demonstration video [12], public fascination was less about mechanics and more about the eeriness of watching something mechanical move with such human-like agility. The video, titled Atlas – Partners in Parkour, showcased robots jumping, flipping, and vaulting through a gymnastics course, triggering admiration, unease, and a wave of social media memes drawing comparisons with the Terminator films.

This is not new. From clockwork automatons in royal courts to androids in science fiction, each era’s robots mirror its anxieties and desires. For instance, Hanson Robotics’ Sophia [13] was designed with expressive facial features to promote naturalistic interaction, yet it remains polarising, often dismissed as a novelty. Is it an advancement in social robotics or a symbol of anthropomorphic overreach?

The goal of today’s humanoids reveals our priorities. Tesla’s Optimus will be built to handle repetitive factory work. Figure AI’s humanoids will aim to integrate into warehouse workflows. These designs won’t just be technical – they will symbolise which human qualities we value and which jobs we are ready to relinquish. 

The Real Question 

As mechanical humans enter our homes and workplaces, we must ask what they will symbolise beyond their specs. Humanoid robots will reflect assumptions about work, social interaction, and human worth. When we automate tasks in human form, we choose which parts of ourselves we replicate and which we outsource. 

The most pressing questions won’t be about joint torque or facial recognition, but about how these machines reshape our relationships with technology, labour, and each other. Robots, like all tools, embody human intention. The challenge isn’t building minds like ours, but questioning why we keep giving them our face. 

References 

[1] Koetsier, J. (2025, April 30). Humanoid robot mass adoption will start in 2028, says Bank of America. Forbes. https://www.forbes.com/sites/johnkoetsier/2025/04/30/humanoid-robot-mass-adoption-will-start-in-2028-says-bank-of-america/ 

[2] Goldman Sachs. (2024, January 8). The global market for humanoid robots could reach $38 billion by 2035. https://www.goldmansachs.com/insights/articles/the-global-market-for-robots-could-reach-38-billion-by-2035 

[3] Tesla. (2023). Tesla Optimus: Our Humanoid Robot. https://www.tesla.com/AI 

[4] Figure AI. (2024). Figure 02 Robot Overview. https://www.figure.ai/ 

[5] Amazon. (2025). Facts & figures: Amazon fulfillment centers and robotics. https://www.aboutamazon.co.uk/news/innovation/bots-by-the-numbers-facts-and-figures-about-robotics-at-amazon 

[6] Boston Dynamics. (2025). Spot | Boston Dynamics. https://bostondynamics.com/products/spot/ 

[7] Agility Robotics. (2025). Digit – ROBOTS: Your Guide to the World of Robotics. https://www.agilityrobotics.com/ 

[8] Srivastava, S., Li, C., Lingelbach, M., Martín-Martín, R., Xia, F., Vainio, K., Lian, Z., Gokmen, C., Buch, S., Liu, K., Savarese, S., Gweon, H., Wu, J., & Fei-Fei, L. (2021). BEHAVIOR: Benchmark for Everyday Household Activities in Virtual, Interactive, and Ecological Environments. arXiv preprint arXiv:2108.03332. https://arxiv.org/abs/2108.03332 

[9] IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems. (2020). Ethically Aligned Design, 1st ed. https://ethicsinaction.ieee.org/ 

[10] Mori, M. (1970). The uncanny valley. Energy, 7(4), 33–35. (English translation by MacDorman & Kageki, 2012, IEEE Robotics & Automation Magazine). https://doi.org/10.1109/MRA.2012.2192811 

[11] Brundage, M., Avin, S., Clark, J., Toner, H., Eckersley, P., Garfinkel, B., … & Amodei, D. (2018). The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation. Future of Humanity Institute, University of Oxford. Retrieved from https://arxiv.org/abs/1802.07228 

[12] Boston Dynamics. (2021, August 17). Atlas | Partners in Parkour [Video]. YouTube. https://www.youtube.com/watch?v=tF4DML7FIWk 

[13] Hanson Robotics. (2025). Sophia the Robot. https://www.hansonrobotics.com/sophia/ 

 

ARTICLE: Beyond Efficiency: Ethical Considerations of Adopting Cobots

Collaborative robots, commonly referred to as “Cobots,” are among the most groundbreaking technological advancements of our time. Academics and industry experts firmly believe that Cobots have the potential to revolutionise global manufacturing. A Cobot is a context-aware robot equipped with artificial intelligence and vision capabilities, enabling it to safely coexist with both human operators and machines in the same workspace.

The adoption of Cobots in manufacturing is one of the key enablers of Industry 5.0. The concept of Industry 5.0 was first proposed by Michael Rada[i] in 2015, after it was felt that Industry 4.0, its predecessor, was unable to meet the increasing demands for personalization and customization of goods. Through the incorporation of highly advanced systems such as artificial intelligence, automated systems, the internet of things, and cloud computing, Industry 4.0 aimed to enhance operational efficiency and productivity by connecting the physical and virtual worlds. However, rapidly evolving global business dynamics shifted the industry paradigm toward not just efficient production but also high-value mass customization and personalization of goods. It was widely believed that Industry 4.0 could not address these changes. Industry 5.0 was therefore coined to address the changing industrial dynamics, focusing on collaboration between advanced production systems, machines, and humans.

To reap the enormous benefits associated with this technology, adopters must carefully consider the risks that could affect the well-being of human operators who work collaboratively with Cobots.

Ethical Considerations of Adopting Cobots

Ethical considerations when adopting Cobots encompass a wide range of social factors[ii]. As defined by the British Standards Institution[iii], ethical hazards are any potential source of harm that compromises psychological, societal, and environmental well-being. While collaborative settings involving Cobots offer benefits like reducing physically demanding tasks for humans, they have also brought forth new risks and ethical considerations that demand attention during their planning and use. In the following sections, I will discuss some of the ethical considerations of adopting Cobots:

Emotional Stress

Understanding the potential for worker emotional stress can lead to better-designed human-Cobot interaction systems that minimise stress and enhance the overall user experience. Cobots may cause emotional stress among users for several reasons. For instance, users might feel they have less control over their work environment when Cobots are involved, especially if the Cobots operate autonomously, which can lead to feelings of anxiety and stress. Moreover, Cobots are often used for tasks that require high precision and concentration, so the pressure to perform these tasks accurately can be mentally exhausting and stressful. The constant need to monitor and interact with Cobots can also trigger physiological stress responses, such as increased heart rate and tension. Organisations should consider these factors when designing and implementing Cobots.

Social Environment

By understanding potential disruptions to the social environment, manufacturers can develop strategies to mitigate workers’ concerns and create a harmonious work environment. Unless workers are involved in the design and planning of Cobot implementations, Cobots may disrupt the social harmony of the workplace in several ways, for example by raising concerns about job security among workers, or by causing anxiety and tension due to the fear of being replaced by robots. This can lead to confusion and ambiguity about job roles, causing stress and disrupting team cohesion. Furthermore, the presence of Cobots can alter social interactions in the workplace, with some workers viewing them as teammates while others see them as intruders, potentially leading to conflicts. Additionally, the increasing autonomy of Cobots raises ethical questions about decision-making and accountability.

Social Acceptance

By understanding the community factors that shape social acceptance, strategies can be developed to enhance the acceptance of Cobots. Communities play a crucial role in determining the acceptance of new technologies, and several key factors influence the acceptance of Cobots. Different cultures exhibit varying levels of comfort with and acceptance of technology; some place greater trust in, and show more enthusiasm for, technological advancements, which can lead to greater acceptance of Cobots. The opinions and behaviours of peers, family, and colleagues can significantly impact an individual’s acceptance of Cobots. Communities with higher levels of education and awareness about the benefits and functionalities of Cobots tend to accept them more readily. Government policies and incentives that promote the use of Cobots can also positively influence community acceptance. Supportive regulations and funding for Cobot integration can encourage businesses and individuals to adopt this technology.

Data Collection

Firms adopting Cobots need to devise data management policies and assure workers that collected data will not be shared with or used by third parties. Considering that Cobots collect a variety of data from their safety systems, there is a risk that operator and user data could be collected, used, and sold without consent. Research indicates that many industry organisations are already interested in the potential value of this data for developing future products and services.

Addressing these ethical considerations can ensure that the adoption of Cobots contributes positively to society and aligns with our social values. Thus, by prioritizing ethics, we can foster trust and acceptance of Cobots in manufacturing.

[i] https://www.linkedin.com/pulse/industry-50-from-virtual-physical-michael-rada/

[ii] https://www.centreforwhs.nsw.gov.au/__data/assets/pdf_file/0019/1128133/Work-health-and-safety-risks-and-harms-of-cobots.pdf

[iii] https://knowledge.bsigroup.com/products/robots-and-robotic-devices-guide-to-the-ethical-design-and-application-of-robots-and-robotic-systems

What Would Jim Henson Do? Roleplaying Human-Robot Collaborations Through Puppeteering

By James Dwyer and Dr Valeria Macalupu (both QUT)

 

 A tangible, adaptable and modular interface for embodied explorations of human-robot interaction concepts.

As robots become increasingly integrated into various industries, from healthcare to manufacturing, the need for intuitive and adaptable tools to design and test robotic movements has never been greater. Traditional approaches often rely on expensive simulations or complex hardware setups, which can restrict early-stage experimentation and limit participation from non-expert stakeholders. The kinematic puppet offers a refreshing alternative by combining hands-on prototyping with virtual simulation, making it easier for anyone to explore and refine robot motion. This work is particularly critical for exploring intuitive ways surgeons can collaborate with robots in the operating room, improving Robot-Assisted Surgery (RAS).

What is the kinematic puppet?

The kinematic puppet is an innovative tool that combines physical prototyping and virtual simulation to simplify the design and testing of robot movements and human-robot interactions. The physical component is a modular puppet constructed from 3D-printed joints equipped with rotary encoders and connected by PVC linkages. This flexible and cost-effective setup allows users to customise a robot arm to suit a variety of needs by adjusting linkage lengths and joint attachments.

On the digital side, a virtual simulation environment (developed in Unreal Engine) creates a real-time digital twin of the physical puppet. This integration via Wi-Fi/UDP enables immediate visualisation and testing of HRI concepts. By bridging the gap between physical manipulation and digital analysis, the kinematic puppet makes it easier for anyone to experiment with and refine robot motion in an interactive and accessible way.
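To make the data flow concrete, here is a minimal sketch of what streaming puppet joint angles to a simulation over UDP could look like. It is illustrative only: the encoder-reading function, the JSON message format, and the host address and port are assumptions, not the actual implementation behind the kinematic puppet.

```python
# A minimal sketch (not the authors' implementation) of streaming puppet joint
# angles to a digital twin over UDP. Encoder access is hardware-specific, so
# read_encoder_angles() is a stand-in that returns simulated values.
import json
import math
import socket
import time

SIM_HOST = "127.0.0.1"   # placeholder: address of the simulation machine
SIM_PORT = 9999          # placeholder UDP port

def read_encoder_angles(t: float, num_joints: int = 4) -> list[float]:
    """Stand-in for reading rotary encoders; returns joint angles in radians."""
    return [0.5 * math.sin(t + i) for i in range(num_joints)]

def main() -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    start = time.time()
    try:
        while True:
            angles = read_encoder_angles(time.time() - start)
            packet = json.dumps({"joint_angles": angles}).encode("utf-8")
            sock.sendto(packet, (SIM_HOST, SIM_PORT))  # the digital twin consumes this
            time.sleep(1 / 60)  # ~60 Hz update rate (assumed)
    except KeyboardInterrupt:
        sock.close()

if __name__ == "__main__":
    main()
```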

Figure 1. The physical and virtual components of the kinematic puppet.


How does the user interact with the puppet?

In the demonstration, users engage with the system by physically manipulating the kinematic puppet to control a digital twin of the robot arm, guiding it through a virtual cutting task. As they direct the arm’s movements, a virtual cutting tool simulates material removal in real time.

The system provides continuous feedback through both visual displays and haptic responses, creating an immersive and intuitive experience. This interactive environment challenges participants to balance precision and speed, highlighting the importance of both accuracy and efficiency in robotic tasks.

By making the abstract process of programming robotic movements tangible, the kinematic puppet empowers users to experiment and learn in a dynamic environment.
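As a rough illustration of how joint angles translate into tool motion on the digital side, the following planar forward-kinematics sketch computes a tool-tip position from a chain of joint angles and link lengths. The joint count and link lengths are invented; the real system models the puppet’s actual geometry inside Unreal Engine.

```python
# Illustrative only: a planar forward-kinematics sketch showing how joint angles
# from the puppet could position a virtual tool tip. Link lengths are made up.
import math

def forward_kinematics(joint_angles, link_lengths):
    """Return (x, y) of the tool tip for a planar chain of revolute joints."""
    x = y = 0.0
    heading = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        heading += angle                     # accumulate joint rotations
        x += length * math.cos(heading)      # advance along the current link
        y += length * math.sin(heading)
    return x, y

if __name__ == "__main__":
    angles = [0.3, -0.6, 0.2]        # radians, e.g. decoded from a UDP packet
    lengths = [0.30, 0.25, 0.15]     # metres of PVC linkage (assumed values)
    print(forward_kinematics(angles, lengths))
```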

Figure 2. James showing how the kinematic puppet works.

Demonstration at HRI 2025 – An experience for HDR students.

Presenting the Kinematic Puppet at the Human-Robot Interaction Conference 2025 provided valuable insights into how our research resonates with the broader robotics community. Attendees were particularly drawn to the system’s modularity and reconfigurability and appreciated the puppetry-based approach as an intuitive method for exploring human-robot interaction concepts.

The demonstration wasn’t without challenges. Technical issues before the demo required some mildly frantic rebuilding of the code solution on the morning of the presentation, highlighting a common research reality: experimental prototypes often accumulate small bugs through iterative development that compound unexpectedly. It is an all-too-common challenge that reflects the messy nature of research, and something that isn’t always visible in polished publications.

Reviewer feedback highlighted potential applications we hadn’t considered, particularly around improving accessibility of research technologies. While most attendees engaged enthusiastically with the concept, some appeared to struggle to connect it to their work. It took time for me to find effective ways to explain the purpose and value of the approach—a good reminder that not every method resonates equally in a diverse field and how important it is to tailor explanations to your audience, even within a given research community.

For an HDR student, this experience underscores the importance of exposing your work to the research community early. The value lies not in positive reception, but in the process of presenting the work itself. Getting to explain my work to others forced me to articulate and refine my thinking, an opportunity that is missed when work is conducted in isolation. These interactions helped me understand how my work fits within the broader landscape and sparked new reflections on its purpose and potential applications that I might have missed otherwise.

You can read more about this demo here: https://dl.acm.org/doi/10.5555/3721488.3721764

Dwyer, J. L., Johansen, S. S., Donovan, J., Rittenbruch, M., & Gomez, R. (2025). What Would Jim Henson Do? Roleplaying Human-Robot Collaborations Through Puppeteering. In Proceedings of the 2025 ACM/IEEE International Conference on Human-Robot Interaction, Melbourne, Australia.

TL;DR

  1. Accessible Design: The kinematic puppet combines physical prototyping with virtual simulation for intuitive human-robot interaction design.
  2. Intuitive Feedback for Seamless Experience: Users control a customisable robot arm through hands-on manipulation while receiving real-time visual and haptic feedback. This novel approach supports Robot-Assisted Surgery design processes by enabling the intuitive exploration of human-robot interactions.
  3. Creative Inspiration: Inspired by film animation techniques and puppeteering, this low-cost, adaptable tool enables rapid prototyping and innovative experimentation in human-robot interaction research more broadly.
  4. Communicating Complex Research Concepts: Often requires tailoring explanations to diverse audiences. Even within a specialised community like HRI, individuals connect with ideas differently, and finding effective ways to articulate the purpose and value of novel methodological approaches is an ongoing challenge that improves with practice.
  5. Early Exposure of Research Work: Presenting research work to the community provides invaluable benefits beyond simply positive reception. The process of presenting forces the articulation and refinement of ideas, reveals how your work fits within the broader research landscape, and often uncovers applications and connections you might otherwise miss when working in isolation.

ARTICLE: Integrating Vision-Guided Cobots into Steel Manufacturing

A cobot equipped with a laser-mounted end-effector points at a detected short bar. A green dot marks the identified short bar, providing a clear visual cue for operators.

A demonstration of vision-guided collaborative robotics has shown what the future of automation in steel product manufacturing could look like. As part of the Australian Cobotics Centre’s (ACC) Biomimic Cobots Program, this research initiative fosters university-industry partnerships to drive technological advancements.  Researchers from the Robotics Institute at UTS and the Research Engineering Facility at QUT, in collaboration with InfraBuild, deployed a custom AI-based “shorts” detection system integrated with a collaborative robot (cobot) that aims to enhance safety and maintain quality control in an active production environment.

The industry partner, InfraBuild, operates a manufacturing process that involves producing hot steel bars in various shapes and sizes. Quality control is maintained through a manual process where workers, operating in 12-hour rotating shifts, identify and remove defective short-length bars, known as “shorts”, from a conveyor. This task is both physically demanding and requires continuous focus to reduce errors and ensure workplace safety.

A key requirement for the solution was that it integrate seamlessly into existing operations without necessitating extensive modifications to plant, equipment, or processes. However, due to the wide variety of products that InfraBuild manufactures, off-the-shelf automation solutions were not suitable for accurately identifying and removing every type of bar produced. Given these requirements, a vision system consisting of various sensing modalities and a cobot were selected. This choice minimises disruption to InfraBuild’s current workflow, since cobots can operate safely alongside human workers without the need for extensive guarding, and it offers the flexibility to revert to manual operation if needed.

A major milestone was achieved during the demonstration: the “shorts” detection and cobot bar-tracking system was shown functioning in a live factory environment. The AI-based “shorts” detection system successfully detected short bars in real time. This information was communicated through a graphical user interface displaying live video streams from two cameras mounted on InfraBuild’s conveyor line. The interface also featured coloured indicators: dots marking the detected start and end of a bar, its corresponding length displayed in the centre, and the average length per run. If a short bar was detected, a red bounding box highlighted it and its length measurement changed from green to red, providing a clear visual cue for operators. The additional information provided from each production run offers valuable insights for InfraBuild’s quality assurance processes. Additionally, InfraBuild noted that the vision system alone was a valuable addition, as it would enable operators on the factory floor to more quickly identify and remove defective bars when necessary.
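For readers curious about the kind of decision logic such an interface encodes, here is a toy sketch that flags bars below a nominal length and assigns the green/red indicator described above. The nominal length, tolerance, and data structures are illustrative assumptions; the deployed system derives bar lengths from its AI vision pipeline rather than from pre-measured values.

```python
# A toy sketch of the display logic described above: each measured bar length is
# compared against a nominal length, and anything below tolerance is flagged as
# a "short" and rendered red. All numbers here are made up.
from dataclasses import dataclass

@dataclass
class BarMeasurement:
    bar_id: int
    length_m: float

def classify_bars(measurements, nominal_length_m=6.0, tolerance_m=0.1):
    results, lengths = [], []
    for m in measurements:
        is_short = m.length_m < nominal_length_m - tolerance_m
        colour = "red" if is_short else "green"
        results.append((m.bar_id, m.length_m, is_short, colour))
        lengths.append(m.length_m)
    average = sum(lengths) / len(lengths) if lengths else 0.0
    return results, average

if __name__ == "__main__":
    run = [BarMeasurement(1, 6.02), BarMeasurement(2, 5.41), BarMeasurement(3, 5.98)]
    classified, avg = classify_bars(run)
    for bar_id, length, is_short, colour in classified:
        flag = "SHORT" if is_short else "ok"
        print(f"bar {bar_id}: {length:.2f} m [{colour}] {flag}")
    print(f"average length this run: {avg:.2f} m")
```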

By leveraging real-time detections from the vision system, the cobot dynamically adjusted its actions, indicating the bars identified as “shorts” by pointing at them. A laser mounted on the cobot’s end-effector highlighted these bars, allowing staff from the ACC and InfraBuild to clearly see the identified short bar. This milestone demonstrated the adaptability of vision-guided cobots, which, unlike traditional automation systems requiring structured environments, can respond dynamically to changing conditions in manufacturing processes.

This trial serves as a proof of concept for integrating robotic vision systems into InfraBuild’s broader production lines and offers valuable insights for other SME manufacturing companies looking to implement similar cobot-enabled automation solutions. By demonstrating the potential of vision-guided cobots, this initiative represents a step toward smarter, safer, and more flexible manufacturing systems. Showcasing a live cobot system in a factory was a first and major milestone for the ACC, proving that it is possible to address challenging problems found in industry. The achievement also provides insight into the commercial viability of such technologies, marking a step for InfraBuild as it moves toward the next phase of development.

Graphical user interface of the AI-based ’shorts’ detection system. The top image displays a run with no short bars detected. In the bottom image, a short bar is identified, highlighted by a red bounding box, and its length measurement in the centre of the interface changes from green to red, providing a clear visual indicator.

 

ARTICLE: How Vocational Education and Training (VET) Looks to Meet the Skills Needs of the Advanced Manufacturing Sector

As manufacturing moves to more advanced methods of production that utilise technologies such as cobots, vocational education and training (VET) providers are under increasing pressure to develop and deliver training that meets the evolving needs of the advanced manufacturing sector. This article uses the notion of employability to present three themes emerging from my research and to unpack how skills are perceived and understood by those involved in the provision and delivery of vocational education for advanced manufacturing.

Readiness: Laying the Groundwork for Success

In courses like Electrotechnology, higher-level maths and literacy are prerequisites for success. VET providers look to support students with a range of programs, including in-class support to help bridge gaps in the literacy, numeracy, and digital abilities of new students, ensuring they are better equipped to handle complex technical training.

Teachers are critical to ensuring readiness. As industries shift in how they use and apply technology, trainers and training providers need to keep pace, but they may lack familiarity with modern technologies such as robotics and automation. Investment in teacher development is essential to ensure they can deliver training that meets the current demands of industry.

VET providers must also ensure that their training equipment and facilities reflect the new technological landscape. This can be a significant hurdle, as systemic factors related to capital expenditure for public providers often restrict the ability to invest in advanced tools and machinery, requiring support from industry partners.

Adaptation: Responding to Changing Skills Needs

Adaptation underscores the importance of providers’ ability to respond to the changing skills needs of the workforce. While VET institutions recognize the need to evolve, the process of revising training packages is often slowed by conflicting industry interests and other stakeholder agendas.

To counter this, VET providers have increasingly turned to alternative forms of training. Microcredentials have emerged as flexible solutions to upskill or reskill workers in emerging areas like autonomous technologies and robotics. These shorter, more targeted programs can be developed quickly and are designed to address specific industry needs, even if they fall outside the scope of formal qualifications. Institutions are also offering hybrid courses that combine in-person and online elements, allowing workers to access training more flexibly. This adaptability is crucial as industries face rapid technological advancements and a need for workers with specialized skills.

Collaboration: Bridging the Gap Between Education and Industry

Collaboration emphasises the importance of partnerships between educational institutions, industry, and government to effectively meet the workforce’s evolving needs, and ensure that training is relevant and up to industry standards.

New initiatives like higher apprenticeships, which combine trade qualifications with university degrees, are emerging. These programs require careful coordination between VET and university sectors to ensure that students receive the necessary support and meet the varying requirements of both systems.

Industry partnerships also extend beyond course design to include equipment sharing and resource pooling. Industry partners help to overcome capital investment limitations of VET institutions by providing the latest equipment such as cobots. This reciprocal arrangement helps both parties. Industry partners gain access to skilled workers trained on the latest equipment, while VET providers can offer students hands-on experience with tools and equipment used in workplaces.

Moving Forward

Through readiness, adaptation, and collaboration VET providers can better prepare learners for the future workforce. Ensuring that learners enter training with the right foundational skills, adapting training offerings to meet the rapidly changing technological landscape, and fostering strong collaborations with industry and higher education institutions are all key steps in skilling a workforce capable of thriving in technologically complex workplaces. Ongoing collaboration between education providers, industry, and policymakers will be key to ensuring workers have the skills necessary to succeed in the advanced manufacturing industry.

ARTICLE: Beyond the Factory Floor: Cobots as the Ultimate Growth Hack for Small and Medium-sized Enterprises (SMEs)

Automation has long been the domain of large enterprises with deep pockets and extensive resources. However, the landscape is undergoing a transformation, with collaborative robots, often referred to as cobots, leading the way in driving this change. Designed to work seamlessly alongside humans, cobots are making automation accessible, even for Small and Medium-sized Enterprises (SMEs). According to a recent report[1], the global cobot market is projected to grow from $1.2 billion in 2023 to close to $3 billion by 2028, reflecting their rising adoption across industries.

For SMEs, staying competitive often requires overcoming unique challenges such as limited budgets, smaller teams, and the need for operational agility. According to the Australian Chamber of Commerce and Industry’s 2024 Small Business Condition Survey[2], labour shortages and rising costs are among the most significant obstacles that small businesses face. While traditional automation systems can help address labour shortages, they are often rigid, complex, and prohibitively expensive, making them unsuitable for many SMEs. Enter cobots, a revolutionary solution that combines affordability, flexibility, and ease of deployment.

  • What Are Cobots, and How Do They Differ from Traditional Robots?

Collaborative robots, or cobots, are a new generation of robotic systems designed to work directly with humans in shared workspaces. Unlike traditional industrial robots, which often require physical barriers for safety, cobots are equipped with advanced sensors and programming that allow them to detect and adapt to human presence. This makes them inherently safer and more versatile in environments where people and machines need to work side by side.

  • Why Are Cobots the Perfect Growth Hack for SMEs?

Cobots have the potential to transform SMEs by providing solutions that deliver a wide range of benefits. These include:

  • Cost-Effectiveness

Cobots are significantly more affordable than traditional industrial robots. Unlike industrial robots, which require heavy structures and safety cages, cobots can often be mounted with simple tools like a G-clamp, saving on installation costs and space. In contrast, industrial robots demand extensive safety measures and infrastructure, which not only consume space but also incur additional expenses. This cost difference is critical for SMEs operating on tight budgets.

  • Flexibility and Adaptability

Unlike traditional robots, which either fully automate a task or leave it to manual labour, cobots offer a middle ground: semi-automation. This capability is invaluable for SMEs, where automating complete workflows is complex or expensive.

For example, a furniture manufacturing SME can program a cobot to assist with sanding tasks. While the cobot performs the repetitive sanding, workers can focus on more intricate assembly tasks, significantly boosting overall productivity. This shared workspace model eliminates the rigidity of traditional automation, allowing SMEs to adapt quickly to changing demands.

  • Ease of Use

Cobots are designed with user-friendliness in mind, often featuring intuitive interfaces that require minimal training. Employees without technical expertise can quickly learn to program and operate these robots, reducing downtime. Block-based programming uses a drag-and-drop interface where users create workflows by connecting pre-designed blocks that represent commands or actions. This visual approach eliminates the need for complex coding knowledge, making it ideal for SMEs that may not have dedicated robotics experts on staff. For instance, programming a cobot to pick and place items can be as simple as dragging blocks for “move,” “grip,” and “release,” and arranging them in sequence.
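To illustrate the idea, the sketch below represents such a pick-and-place sequence as a simple ordered list of blocks. The block names and the execute helper are hypothetical and do not correspond to any vendor’s actual programming interface.

```python
# A rough illustration of what a block-based pick-and-place sequence amounts to
# under the hood: an ordered list of simple commands. Block names are made up.
workflow = [
    {"block": "move", "target": "pick_position"},
    {"block": "grip"},
    {"block": "move", "target": "place_position"},
    {"block": "release"},
    {"block": "move", "target": "home"},
]

def execute(blocks):
    """Walk through the blocks in order, as the drag-and-drop editor would."""
    for step, block in enumerate(blocks, start=1):
        target = block.get("target", "")
        print(f"step {step}: {block['block']} {target}".strip())

if __name__ == "__main__":
    execute(workflow)
```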

  • Real-World Examples

Cobots have demonstrated significant value in real-world SME environments, offering practical solutions to common operational challenges. KUKA, a leading cobot manufacturer, has highlighted numerous cases[3] where SMEs have successfully implemented their collaborative robots. These include applications in quality inspection in plastics manufacturing, machine loading in metal industries, and assembly tasks in the automotive sector. Similarly, Universal Robots (UR), another leading cobot manufacturer, has documented a wide range of SME applications[4], such as palletizing in food production, welding in small-scale metal fabrication, and material handling in manufacturing environments. For example, the SME Bob’s Red Mill utilized UR cobots to automate palletizing tasks, effectively addressing labour shortages and boosting productivity. These examples illustrate how cobots are enabling SMEs to enhance their operations through flexible and scalable automation solutions tailored to their specific needs.

  • Start Small, Scale Smart!

By embracing cobots today, SMEs can secure the future of their operations and position themselves for sustained success in a world that is becoming more competitive. The key to successfully integrating cobots is to start with a focused approach by introducing them into one or two specific processes. As businesses gain confidence and expertise, they can gradually expand their use. This method helps organisations reduce risks, control costs, and tailor the technology to fit their specific requirements.

References

[1] T. Haworth, “Global cobot market exceeds $1bn in 2023, with strong growth forecast 2024-28,” Interact Analysis. Accessed: Nov. 21, 2024. [Online]. Available: https://interactanalysis.com/global-cobot-market-exceeds-1bn-in-2023-with-strong-growth-forecast-2024-28/

 [2] Australian Chamber of Commerce and Industry, Small Business Conditions Survey 2024, Australian Chamber of Commerce and Industry, Canberra, ACT, 2024. [Online]. Available: https://www.australianchamber.com.au/wp-content/uploads/2024/07/ACCI-Small-Business-Conditions-Survey-2024.pdf

[3] “Successful automation in small and medium-sized enterprises,” KUKA AG. Accessed: Nov. 21, 2024. [Online]. Available: https://www.kuka.com/en-de/company/iimagazine/2023/05/kmu-erfolgsgeschichten

[4] “Customer Success Stories – collaborative robots.” Accessed: Nov. 21, 2024. [Online]. Available: https://www.universal-robots.com/case-stories


ARTICLE: Industry 5.0 and Cobot Adoption

TL;DR

  • Industry 5.0 highlights environmental sustainability, human centricity, and resilience, pushing corporate responsibility to the social and planetary boundaries.
  • Cobots play an essential role in achieving human centricity and resilience.
  • Developing a holistic understanding of the technology is essential before adoption.
  • Allocating time for innovation is the key to sustainable growth.

Introduction

Industry 4.0, digital transformation, and smart factories with cyber-physical systems bring unprecedented capabilities for a seamlessly connected industry and improve production and business efficiency. As technology continues to advance, the vision of Industry 5.0 is within reach. Is Industry 5.0 all about cobots? This article discusses the concept of Industry 5.0 and the role of cobots and provides tips for technology adoption.

The Industry 5.0 vision

Industry 5.0 is a vision proposed by the European Commission in 2021. It envisions the industry’s next step toward becoming more environmentally sustainable, human-centric, and resilient. How can achieving success in these three aspects benefit companies and the industry?

  • Understanding planetary boundaries is essential for manufacturing as they provide guidelines for balancing industrial growth with environmental sustainability. Adopting circular processes, such as reducing waste, reusing materials, and improving energy efficiency, contributes to both environmental and operational benefits.
  • A human-centric approach prioritises workers’ needs, cultivating a thriving and innovative manufacturing environment. In Industry 5.0, technology goes beyond being a mere tool for improving production efficiency. “How can technology best support the workforce?” is the key question to ask. This vision paves the way for a future where technology enhances employee guidance and training, boosting productivity, job satisfaction, retention, and worker sustainability.
  • Geopolitical changes, natural disasters, and the recent COVID-19 pandemic have highlighted the vulnerabilities within current globalised production systems. Industry 5.0 addresses these challenges by enhancing the resilience of industrial production through the establishment of resilient strategic value chains, adaptable production capacities, and flexible business processes.

The role of cobots in Industry 5.0

Cobots, or collaborative robots, are robots equipped with advanced safety sensors and designed specifically for a secure human-robot co-working environment. Because they operate with reduced payload, speed, and force, cobots do not require the fencing and laser screening required for traditional industrial robots. Cobots can therefore provide promising solutions for achieving human centricity and resilience.

The key design principle of cobot application is for cobots to handle repetitive and hazardous work while workers can focus on complex and intelligent work. Some of the use cases are as follows:

  • Product assembly, where a cobot lifts and holds an item while workers perform jobs on the item.
  • Material transportation, where a cobot picks and places or delivers materials to the worker while the worker focuses on complex manufacturing tasks.
  • Machine tending, where a cobot loads and unloads items onto and from heavy machinery while the worker focuses on machine programming and finished goods inspection.

The characteristics of cobots also make them more flexible to deploy than traditional industrial robots. In case supply chain disruptions occur and production reconfiguration is required, cobots can be adapted quickly to fit the needs of the new production line, making the production line flexible and resilient.

Towards successful cobot adoption

Successful adoption of cobots involves much more than acquisition and integration. Like any other technology, adopting cobots requires a holistic understanding of the technology, one that goes beyond understanding the use cases and evaluating their fit with the manufacturer’s context.

To support Australian manufacturing companies, especially small to medium-sized enterprises (SMEs), in successfully adopting new technologies, current adoption practices were investigated as part of my PhD research. Based on academic literature and expert discussions, the following action items are recommended for building a holistic understanding of cobots before adoption:

  • Operational capabilities. Understand what cobots can do and which are relevant to the current and future applications. E.g. pick-and-place and welding.
  • Key areas and processes. Understand where cobots can be applied and which are relevant to the current and future applications. E.g. assembly and warehousing.
  • Key performance indicators. Clarify how adopting cobots aligns with the company’s strategy and how the outcome can be measured. These can range from production speed to job satisfaction.
  • Stakeholders. Investigate who might be affected by adopting cobots. E.g. customers and current workers.
  • Implementation capabilities. Understand what skills are required for adopting cobots, e.g. installation and programming. Clarify if the in-house engineering team has these skills, if the technology provider has the skills or provides training, or if new hires are necessary.
  • Technology dependencies. Consider prerequisite technologies, technologies that complement cobots, potential technologies that can be adopted afterwards, and their compatibility. E.g. conveyor belts, welders, and 3D printers.

As technology advances, the holistic view should expand, incorporating new capabilities as they emerge. It is therefore important to retain knowledge about cobots and related technologies within the company while continuously seeking improvement needs and refining strategies. Although manufacturers, especially SMEs, are often found to be overwhelmed by their daily activities, allocating even a small amount of time to identify improvement needs, obtain new knowledge, and scan for new opportunities is crucial to sustainable business development.

Our research will continue to develop a practical procedure model to support successful technology adoption, incorporating relevant methods and tools to guide companies from strategic planning through to identifying technology and adoption planning.

ARTICLE: Proposed guardrails for the safe and responsible use of AI

Artificial Intelligence (AI) is appearing in many aspects of our life and work, and advancements are rapid and continuous. For most of us, it has been hard to keep up. Regulations designed to protect our way of life and conditions of work have also struggled to keep pace with the development of AI in ways that can reduce the harm arising from its use, while ensuring Australia can capitalise on the possibilities that AI offers.

Recognising that Australia’s current regulatory environment has not kept pace with AI capability, and following extensive consultations, the Australian Government recently released proposed guardrails for the safe and responsible development and deployment of AI. Outlining what constitutes ‘high-risk AI’, these guardrails are put forward in the proposals paper titled Introducing mandatory guardrails for AI in high-risk settings.

The guardrails complement the previously released Voluntary AI Safety Standards and provide some guidance to developers, organisations, and individuals on how to build and use AI responsibly and safely. Unfortunately, like many technologies, even when created with the best of intentions, AI can be used in ways that are deliberately or inadvertently harmful, with negative consequences for individuals or society. For example, case examples and a substantial body of academic research have already demonstrated that AI can not only replicate existing biases but embed them in automated decisions that result in individuals being excluded or otherwise discriminated against on the basis of race or gender. This can have significant implications, especially when AI is used to automate decisions that affect the lives or livelihoods of individuals.

One situation that has been explored in academic studies is when AI is used to automate recruitment shortlisting or hiring decisions. In these cases, research has shown that without human oversight, AI training data can contain pre-existing biases that may exclude under-represented groups from the AI-compiled shortlist for a job. This has obvious implications for access to employment and an income for individuals or particular groups, and it also has implications for diversity and the associated benefits of innovation, creativity and idea generation within organisations. Organisations may also experience more direct effects arising from the malicious use of AI to expose enterprise vulnerabilities or as they are subjected to more sophisticated scams, fraud and cyber-security attacks.
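As a concrete, if simplified, illustration of how such bias can be surfaced, the sketch below compares selection rates across applicant groups and flags large disparities, loosely based on the widely used four-fifths benchmark. The figures and threshold are illustrative only; real audits of AI-assisted hiring require far more than a single statistic.

```python
# A minimal, hedged sketch of one common bias check for automated shortlisting:
# comparing selection rates across groups (an "adverse impact ratio").
def selection_rate(selected: int, applicants: int) -> float:
    return selected / applicants if applicants else 0.0

def adverse_impact_ratios(group_stats: dict[str, tuple[int, int]]) -> dict[str, float]:
    """group_stats maps group name -> (number shortlisted, number of applicants)."""
    rates = {g: selection_rate(s, a) for g, (s, a) in group_stats.items()}
    highest = max(rates.values())
    return {g: (r / highest if highest else 0.0) for g, r in rates.items()}

if __name__ == "__main__":
    stats = {"group_a": (40, 100), "group_b": (18, 100)}   # made-up numbers
    for group, ratio in adverse_impact_ratios(stats).items():
        flag = "review" if ratio < 0.8 else "ok"           # common 4/5ths benchmark
        print(f"{group}: impact ratio {ratio:.2f} -> {flag}")
```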

Taking a risk-based approach to regulation, similar to that adopted by several states in the USA and by the European Union in the EU AI Act 2024, the guardrails proposed in Australia focus on the development and deployment of AI in high-risk settings. While the Australian guardrails are still in development, the proposals paper provides a useful summary of high-risk settings identified in other countries. These include (among others):

  • biometrics used to assess behaviour, mental state or emotions;
  • AI systems used to determine access to education or employment (as in some automated recruitment systems);
  • AI systems used to determine access to public assistance or benefits; and
  • AI systems used as safety components in critical infrastructure.

Research currently being undertaken by Australian Cobotics Centre researchers suggests that some organisations in Australia are using AI for biometric identification, for recruitment, or in other ways that may be considered ‘high-risk’ under the use cases applied in other country contexts. It is therefore critical for Australian organisations to monitor the Australian Government’s Consultation Hub and ongoing work on Artificial Intelligence to keep abreast of proposed regulatory changes, and to consider how any current or planned use of AI within their organisation aligns with principles for promoting the safe and responsible use of AI in Australia.

ARTICLE: From Lab to Market (Part II): Bridging the Gap – Solutions for Effective Industry-Academic Collaboration

In today’s rapidly evolving technological landscape, the synergy between academic research and industrial innovation has never been more critical. Yet, as we explored in our previous article, significant barriers often hinder effective collaboration between these two sectors. From misaligned incentives to communication challenges, the road to fruitful partnerships is fraught with obstacles. However, where there are challenges, there are also opportunities for transformative solutions. In this article, we investigate how to overcome these barriers and foster more productive academic-industry collaborations. Here are some strategies I believe could make a significant difference:

1. Educational Outreach

  • Host Workshops and Seminars: Organize events that showcase research capabilities and potential benefits to industry partners. These can help demystify the research process and highlight its value.
  • Develop Industry-Focused Communication: Create materials that explain research in terms of business benefits, ROI, and practical applications.
  • Utilize social media: Leverage platforms like LinkedIn to share success stories, insights, and opportunities for collaboration.

2. Flexible Collaboration Models

  • Short-Term Projects: Offer opportunities for smaller, shorter-term collaborations that can serve as ‘proof of concept’ for more extensive partnerships.
  • Tiered Partnership Options: Develop a range of partnership models to suit different company sizes, budgets, and comfort levels with research collaboration.
  • Shared Resource Models: Create systems where multiple industry partners can share the costs and benefits of research initiatives.

3. Build Trust and Understanding

  • Industry Internships for Researchers: Encourage academic researchers to spend time in industry settings to better understand business needs and processes.
  • Academic Sabbaticals for Industry Professionals: Invite industry professionals to spend time in academic settings, fostering better understanding and communication.
  • Joint Advisory Boards: Establish boards with both academic and industry representation to guide research directions and collaboration strategies.

4. Address Financial Concerns

  • Highlight Long-Term ROI: Develop case studies and financial models that demonstrate the long-term return on investment for research collaborations.
  • Explore Public-Private Partnerships: Leverage government funding and initiatives designed to promote industry-academic collaborations.
  • Transparent Cost Structures: Develop clear, understandable cost structures for different types of collaborations to help businesses budget effectively.

5. Streamline Processes

  • Simplify Administrative Procedures: Work on streamlining the often-complex administrative processes involved in setting up research collaborations.
  • Dedicated Liaison Officers: Appoint individuals specifically tasked with facilitating and managing industry-academic partnerships.
  • Clear IP Agreements: Develop straightforward intellectual property agreements that protect both academic and industry interests.

The Path Forward

The future of innovation lies in the synergy between academia and industry. By working together, we can drive progress, enhance productivity, and tackle real-world challenges more effectively. It’s a journey that requires effort, understanding, and adaptability from both sides, but the potential rewards are immense.

As we move forward, I’m eager to hear from both my academic colleagues and industry professionals:

  • What challenges have you faced in establishing or maintaining industry-research collaborations?
  • What successful strategies have you employed to overcome these barriers?
  • How do you envision the future of industry-academic partnerships in your field?

As we explore these solutions, we’ll highlight the valuable contributions of organizations like the Australian Cobotics Centre. This pioneering training institution has been at the forefront of addressing the barriers between academia and industry, particularly in the field of collaborative robotics. Through its unique model of industry-led research, the Centre has been instrumental in developing practical solutions that not only advance academic knowledge but also address real-world industrial challenges. By examining the Centre’s approach, we can gain insights into effective strategies for overcoming the traditional divides between research institutions and commercial enterprises.

Let’s continue this crucial conversation in the comments below. By sharing our experiences and ideas, we can work together to build stronger, more productive bridges between the world of research and the world of industry.

ARTICLE: Accepted Papers for the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)

Australian Cobotics Centre researchers have two papers accepted for publication at the upcoming IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2024 in Abu Dhabi. IROS is one of the largest and most important robotics research conferences in the world, attracting researchers, academics, and industry professionals from around the globe.

Postdoctoral Research Fellow Dr Fouad Sukkar gave a brief summary of the two papers appearing at the conference in October this year.

Constrained Bootstrapped Learning for Few-Shot Robot Skill Adaptation, by Nadimul Haque, Fouad (Fred) Sukkar, Lukas Tanz, Marc Carmichael, and Teresa Vidal Calleja, proposes a new method for teaching robot skills via demonstration. This is often a cumbersome and time-consuming process, since a human operator must provide a demonstration for every new task. Furthermore, there will inevitably be discrepancies between how the demonstrator carries out the task and how the robot does, for example due to localisation errors, that need to be corrected for the skill to be successfully transferred. This paper tackles these two problems by proposing a learning method that facilitates fast skill adaptation to new tasks that have not been seen by the robot. We do so by training a reinforcement learning (RL) policy across a diverse set of scenarios in simulation offline, and then using a sensor feedback mechanism to quickly refine the learnt policy to a new scenario with the real robot online. Importantly, to make offline learning tractable, we utilise the Hausdorff Approximation Planner (HAP) to constrain RL exploration to promising regions of the workspace. Experiments showcase our method achieving an average success rate of 90% across various complex manipulation tasks, compared to the state-of-the-art, which achieved only 56%.
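The following schematic sketch is not the paper’s implementation, but it illustrates the two-phase structure described above: a policy parameter is first fitted offline across many simulated scenario variations, then refined online from observed feedback, with exploration clamped to a restricted region standing in for the role HAP plays in constraining the search.

```python
# Schematic only: offline fitting across simulated scenarios, then online
# refinement from feedback. The "policy" is a single offset value and the
# workspace clamp stands in for constrained exploration.
import random

WORKSPACE_LIMIT = 1.0  # stand-in for restricting exploration to a promising region

def clamp(value: float) -> float:
    return max(-WORKSPACE_LIMIT, min(WORKSPACE_LIMIT, value))

def offline_training(num_scenarios: int = 1000) -> float:
    """Fit one offset parameter across many simulated scenarios (toy offline phase)."""
    estimates = []
    for _ in range(num_scenarios):
        scenario_offset = random.uniform(-0.5, 0.5)                     # simulated task variation
        estimates.append(clamp(scenario_offset + random.gauss(0, 0.05)))  # noisy per-scenario solution
    return sum(estimates) / len(estimates)

def online_refinement(policy_offset: float, true_offset: float, steps: int = 20) -> float:
    """Nudge the policy toward the real task using observed error feedback."""
    for _ in range(steps):
        observed_error = true_offset - policy_offset + random.gauss(0, 0.01)
        policy_offset = clamp(policy_offset + 0.3 * observed_error)     # feedback correction
    return policy_offset

if __name__ == "__main__":
    policy = offline_training()
    real_task_offset = 0.32                                             # unseen task
    refined = online_refinement(policy, real_task_offset)
    print(f"offline policy: {policy:+.3f}, refined online: {refined:+.3f}")
```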

Coordinated Multi-arm 3D Printing using Reeb Decomposition, by Jayant Kumar, Fouad (Fred) Sukkar, Mickey Clemon, and Ramgopal Mettu, proposes a framework for utilising multiple robot arms to collaboratively 3D print objects. For robots to do this efficiently and minimise downtime while printing, they must have the flexibility to work closely together in a shared workspace. However, this dramatically increases problem complexity, since the arms must be coordinated so they do not collide with each other or with the partially printed object. This is in addition to the planning problem of effectively allocating parts of the object to each robot while respecting the physical dependencies of the print; for example, an arm can’t start extruding a contour until all the contours below it are printed. All these factors make effective coordination a very computationally hard problem, and we show that with bad coordination you can end up with even worse utilisation than if a single arm had carried out the same print! In this work we address this by performing a Reeb decomposition of the object model, which partitions the model into smaller, geometrically distinct components. This drastically reduces the search space over feasible toolpaths, allowing us to plan highly effective allocations to each arm using a tree search-based method. For producing fast, collision-avoiding motions we utilise the Hausdorff Approximation Planner (HAP). Our experimental setup consists of two robot arms with pellet extruders mounted on their end effectors. We evaluate our framework on 14 different objects and show that our method achieves a mean utilisation improvement of up to 132% over benchmark methods.
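As a loose illustration of the allocation problem described above (and not the paper’s Reeb-decomposition method), the toy scheduler below assigns pre-partitioned components to two arms while respecting the rule that a component can only start once the components beneath it are finished. The components, dependencies, and print times are invented.

```python
# A toy dependency-respecting scheduler: ready components are handed to
# whichever arm frees up first. Purely illustrative.
import heapq

# component -> (print_time, prerequisite components that must finish first)
components = {
    "base": (4.0, []),
    "left_wall": (3.0, ["base"]),
    "right_wall": (3.0, ["base"]),
    "roof": (2.0, ["left_wall", "right_wall"]),
}

def schedule(parts: dict, num_arms: int = 2) -> list[tuple[float, int, str]]:
    finished_at: dict[str, float] = {}
    arms = [(0.0, arm_id) for arm_id in range(num_arms)]   # (time the arm is free, id)
    heapq.heapify(arms)
    plan = []
    remaining = dict(parts)
    while remaining:
        # components whose prerequisites have all been scheduled to completion
        ready = [c for c, (_, deps) in remaining.items() if all(d in finished_at for d in deps)]
        name = min(ready, key=lambda c: remaining[c][0])    # simple greedy choice
        duration, deps = remaining.pop(name)
        free_time, arm_id = heapq.heappop(arms)
        start = max(free_time, max((finished_at[d] for d in deps), default=0.0))
        finished_at[name] = start + duration
        plan.append((start, arm_id, name))
        heapq.heappush(arms, (finished_at[name], arm_id))
    return plan

if __name__ == "__main__":
    for start, arm, name in schedule(components):
        print(f"t={start:4.1f}  arm {arm} starts {name}")
```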