
ARTICLE: Reflections from the 2023 OZCHI workshop on Empowering People in Human-Robot Collaboration

This article is written by Stine Johansen, Postdoctoral Research Fellow (Human-Robot-Interaction Program) at the Australian Cobotics Centre.


At the OzCHI 2023 conference, researchers from the Australian Cobotics Centre (QUT and UTS) and CINTEL (CSIRO) co-organised a workshop on the topic of “Empowering People in Human-Robot Collaboration: Why, How, When, and for Whom”. Our previous workshop, at the OzCHI 2022 conference, showed that interest in this area is growing among both researchers and practitioners across Oceania. In the 2022 workshop, discussions centred on human roles in human-robot collaboration, empathy for robots, approaches to designing and evaluating human-robot collaboration, and ethical considerations. With the 2023 workshop, we aimed to take a step further by (1) discussing the underlying assumptions that shape our research and (2) identifying pathways towards shared visions for future research. While it is impossible to capture all the nuances of our discussions here, I will use the limited space in this article to provide a peek into two of the topics that emerged. I hope this can serve as inspiration for anyone reflecting on the why, how, when, and who of empowering people in human-robot collaboration.

Topic 1: Robots as tools for creativity

While digital tools to support creative work continue to proliferate, there are still open questions about how that support can or should be designed. A robot might aid someone in drawing, 3D printing, milling furniture, etc., but it is up to people to ask the right kinds of questions for artistic expression and experience. Furthermore, while a robot might be able to manipulate physical materials, the processes of moulding, cutting, drawing, painting, etc., are part of an artistic conversation that artists and creative professionals have with those materials. Workshop participants proposed that further empirical studies of how creativity works could serve as a basis for designing robots that support it.

There are a number of examples out there where designers, developers, and artists explore roles that robots can play for creative work. Here are some that I have come across:

YouTuber and artist Jazza evaluated the drawing capabilities of a small desk robot by line-us. The video starts with a highly unsuccessful replication of Jazza’s drawings and moves into an interactive game session, e.g., playing hangman. It seems that replicating an artist’s drawings is a fun gimmick but perhaps does not offer much further space for creativity. (See the video here)

The humanoid robot Ai-Da paints “self”-portraits, which seems ironic given that a robot inherently does not have a self or an identity, at least from the perspective of current understandings of consciousness. The artist, Aidan Meller, states that the point of Ai-Da is to raise questions about what role people have if robots are able to replicate our work. (The Guardian published this article about Ai-Da in 2021)

By the way, on the topic of robot consciousness, our workshop panel member Associate Professor Christoph Bartneck, University of Canterbury, hosts a podcast in which the topic was discussed. You can listen to the episode here.

In a more academic direction, the MIT Media Lab has conducted research on ways that robots can help children be creative. They designed a set of games that support children either through demonstrating how to implement a creative idea or by prompting children to reflect by, e.g., asking them questions. (Read about the research here)

Topic 2: Assumptions about robots

Even though much research and development has already shown a multitude of ways that robots can perform tasks in work and everyday life, underlying assumptions about robots and people still drive these developments. The phrases we use among ourselves, participants, collaborators, industry partners, etc., to describe a design concept or how a robot could solve a problem are part of a larger storytelling. Such storytelling comes through in narratives of, e.g., robots taking jobs from workers. We might ask ourselves how we contribute to these narratives, both in public forums and in research publications.

As a side note, fiction and ‘speculation’ are increasingly utilised as tools for designing human-robot interaction. Some examples include Auger (2014), Luria et al. (2020), and Grafström et al. (2022). Speculative design is not a new method; rather, it has become a well-established approach within human-computer interaction (HCI) and interaction design, and now also human-robot interaction.

What are our visions and how can we get there?

Our shared visions for the future of human-robot collaboration are not necessarily surprising, but they are reassuring: collaborative robots should support people. There are, however, a multitude of ways that people can be supported. These range from support (1) during an actual task, e.g., heavy lifting, improving work safety, and providing effective communication, (2) by fitting into dynamic and unstructured environments, and (3) as part of the foundation for people to have a healthy and rewarding work life.

Different pathways exist towards making this a reality. Here are a few examples taken from the workshop discussion. First, while the Australasian context might present some unique challenges, we can still learn from other parts of the world, e.g., in terms of the socio-economic pressures that drive robotic development. Second, we can continuously reframe the problems we choose to prioritise. There are perhaps opportunities to move away from the framing of robots performing “dull, dirty, and dangerous” work to robots performing collaborative, inclusive, and even creative work. Third, increasingly dynamic settings require robotic interfaces that provide modular solutions. This prompts the question of how end users might use modular robotic systems, and whether this approach is best suited for certain problems and contexts. Finally, participants agreed that we increasingly need a network of researchers in this area to support each other.

In the spirit of the last point, I invite researchers and practitioners to visit the Australian Cobotics Centre at QUT, Brisbane. You are also welcome to join our public seminars, both as audience and presenter. I look forward to continuing this crucial conversation.


James Auger. 2014. Living with robots: a speculative design approach. J. Hum.-Robot Interact. 3, 1 (February 2014), 20–42.

Anna Grafström, Moa Holmgren, Simon Linge, Tomas Lagerberg, and Mohammad Obaid. 2022. A Speculative Design Approach to Investigate Interactions for an Assistant Robot Cleaner in Food Plants. In Adjunct Proceedings of the 2022 Nordic Human-Computer Interaction Conference (NordiCHI ’22). Association for Computing Machinery, New York, NY, USA, Article 50, 1–5.

Michal Luria, Ophir Sheriff, Marian Boo, Jodi Forlizzi, and Amit Zoran. 2020. Destruction, Catharsis, and Emotional Release in Human-Robot Interaction. J. Hum.-Robot Interact. 9, 4, Article 22 (December 2020), 19 pages.

Online links

Jazza trying the line-us robot:

Article about Ai-Da:

MIT Media Lab projects on child-robot interaction for creativity:

Christoph Bartneck’s podcast episode on robot consciousness:

ARTICLE: Human-Robot Collaboration in Healthcare: Challenges and Prospects

This article is written by Amir Asadi, PhD researcher at the Australian National University (ANU) and a visiting researcher at Australian Cobotics Centre. It draws upon the introduction section of a paper he co-authored with Associate Professor Elizabeth Williams from the Australian National University, Associate Professor Glenda Caldwell from the Queensland University of Technology, and Associate Professor Damith Herath from the University of Canberra.

Today’s global healthcare system faces a pressing challenge: ensuring equitable access to healthcare amidst a severe workforce shortage. The World Health Organization predicts a shortfall of 10 million healthcare workers by 2030 [1], a situation worsened by an ageing population, increasing demand for medical services, and the COVID-19 pandemic. This shortage leads to a heavy workload for existing healthcare professionals, which research indicates can severely affect patient care quality [2].

In response to the challenges caused by the shortage of healthcare professionals, technological innovations offer a viable approach to reduce the workload on healthcare workers, which could ultimately improve patient care and health service quality. Among many cutting-edge technologies suggested for healthcare, robotics has emerged as a particularly promising area. Robots can assist in a variety of tasks, ranging from surgical procedures to patient care and physical rehabilitation. This leads us to the Human-Robot Collaboration (HRC) concept, where humans and robots work together, leveraging each other’s strengths to achieve shared goals [3]. HRC focuses on augmenting human efforts with robotic assistance in a safe, flexible, and user-friendly manner, thereby enhancing the efficiency and effectiveness of tasks, operations, and workflows [4].

In healthcare, HRC aims to create a symbiotic relationship between healthcare professionals and robots to improve patient care. This approach spans a wide array of applications, including physical rehabilitation, support for the elderly and disabled, surgical assistance, and responses to COVID-19, such as patient handling and disinfection tasks. The breadth of HRC research reflects a commitment to addressing the healthcare system’s immediate and long-term needs.

Despite the clear advantages highlighted by research into HRC in healthcare, its integration has been gradual, reflecting the healthcare sector’s traditionally cautious approach towards new technologies [5]. The reasons for this slow pace of adoption are multifaceted. The first aspect encompasses general challenges associated with introducing new technologies into healthcare, such as infrastructure limitations, resistance from healthcare professionals, complex market dynamics, and regulatory barriers [6]. Following this, concerns particular to robots in healthcare, including safety issues, questions of effectiveness, public acceptance, and fears that robots may replace human caregivers, further slow the adoption process within healthcare environments [7]. The final dimension involves the distinct challenges of fostering a collaborative relationship between robots and human users. These challenges include developing intuitive interfaces for seamless human-robot collaboration, ensuring the reliability of robots in diverse healthcare scenarios, and addressing ethical considerations around autonomy and collaborative decision-making in patient care.

Together, these facets of challenges underscore the complexity of integrating HRC in healthcare settings and, therefore, necessitate a comprehensive approach that extends beyond mere technological considerations. This approach must encompass aspects such as regulatory compliance, ethical standards, stakeholder engagement, and infrastructural adaptation. To move forward and advance research in this field, it is crucial to adopt a holistic socio-technical perspective that acknowledges the complex interconnectedness between people, technology, environments, and workflows.

Furthermore, fostering a dialogue among multiple disciplines is imperative for the successful adoption of HRC in healthcare. The diversity of challenges that HRC is facing makes it crucial to bridge fields such as robotics, Human-Robot Interaction (HRI), human factors, medicine, nursing, social sciences, psychology, and ethics. By integrating insights from these diverse fields, the aim is to design and implement robotic technologies in a manner that not only addresses practical challenges but also enriches the efficiency and quality of healthcare services.

To conclude, we can safely say that while the journey to fully realise HRC’s potential in healthcare faces numerous obstacles, its effective adoption could transform healthcare delivery significantly, a process that requires both a socio-technical approach and a broad multidisciplinary dialogue.


[1]           World Health Organization (WHO), ‘Health workforce’. Accessed: Jan. 19, 2024. [Online]. Available:

[2]           D. J. Elliott, R. S. Young, J. Brice, R. Aguiar, and P. Kolm, ‘Effect of Hospitalist Workload on the Quality and Efficiency of Care’, JAMA Internal Medicine, vol. 174, no. 5, pp. 786–793, May 2014, doi: 10.1001/jamainternmed.2014.300.

[3]           J. Arents, V. Abolins, J. Judvaitis, O. Vismanis, A. Oraby, and K. Ozols, ‘Human–Robot Collaboration Trends and Safety Aspects: A Systematic Review’, Journal of Sensor and Actuator Networks, vol. 10, no. 3, Art. no. 3, Sep. 2021, doi: 10.3390/jsan10030048.

[4]           L. Lu, Z. Xie, H. Wang, L. Li, E. P. Fitts, and X. Xu, ‘Measurements of Mental Stress and Safety Awareness during Human Robot Collaboration -Review’, Proceedings of the Human Factors and Ergonomics Society Annual Meeting, vol. 66, no. 1, pp. 2273–2277, Sep. 2022, doi: 10.1177/1071181322661549.

[5]           K. Nakagawa and P. Yellowlees, ‘Inter-generational Effects of Technology: Why Millennial Physicians May Be Less at Risk for Burnout Than Baby Boomers’, Curr Psychiatry Rep, vol. 22, no. 9, p. 45, Jul. 2020, doi: 10.1007/s11920-020-01171-2.

[6]           A. B. Phillips and J. A. Merrill, ‘Innovative use of the integrative review to evaluate evidence of technology transformation in healthcare’, Journal of Biomedical Informatics, vol. 58, pp. 114–121, Dec. 2015, doi: 10.1016/j.jbi.2015.09.014.

[7]           I. Olaronke, O. Ojerinde, and R. Ikono, ‘State Of The Art: A Study of Human-Robot Interaction in Healthcare’, International Journal of Information Engineering and Electronic Business, vol. 3, pp. 43–55, May 2017, doi: 10.5815/ijieeb.2017.03.06.

ARTICLE: Navigating Augmented Reality: User Interface and UX in Cobotics

Written by Postdoctoral Research Fellow, Dr Alan Burden from the Designing Socio-technical Robotic Systems research program in the Centre.  

The rise of collaborative robots (cobots) is a game-changer for various industries. These robots are designed to work alongside humans, enhancing productivity and efficiency. However, the real challenge lies in making this human-robot interaction as seamless as possible. Augmented Reality (AR) is a technology that has the potential to revolutionise this space by overlaying digital information onto our physical environment.

The Shift in Cobot Interfaces

Traditionally, human-cobot interactions have been facilitated through screen-based interfaces or specialised hardware. While these methods are functional, they often involve a steep learning curve and can be less intuitive. Augmented Reality offers a paradigm shift. By overlaying digital guides, data, or even real-time analytics onto a workspace, AR can make interaction with cobots more straightforward and efficient. This reduces the time needed for task completion and makes the process more intuitive, reducing the need for extensive training. As we move forward, we are poised to transition from 2D digital interfaces to more immersive 3D interfaces, further enhancing the user experience.

UX Design Principles in AR

User Experience (UX) design is pivotal in making AR-based cobot interaction effective. The objective is to create interfaces that are not just visually appealing but also user-friendly and functional. This involves a deep understanding of the user’s needs, their tasks with the cobot, and the environmental factors at play. For example, an AR interface for a cobot in a medical lab would need to consider sterility and precision. At the same time, one in a manufacturing setting might focus on speed and durability. The design process should be iterative, continually involving users in testing to refine the interface.

User Journey Mapping

Mapping the user’s journey is an invaluable tool in this design process. It involves creating a visual representation of all the interaction points between the user and the cobot facilitated by the AR interface. This helps identify potential issues, bottlenecks, or areas for improvement in the interaction process. For instance, if users find it challenging to access certain information quickly, the interface can be tweaked to make that data more readily available. The ultimate aim is to make the AR interface a tool that enhances, rather than hinders, productivity and user satisfaction.

Safety and Ethics

While AR offers many advantages, it raises important ethical and safety considerations. Data privacy is a significant concern, especially when sensitive or proprietary information is displayed in a shared workspace. The AR interface must also be designed to minimise distractions that could lead to safety hazards. For example, overly flashy or intrusive graphics could divert the user’s attention from critical tasks, leading to accidents. Therefore, ethical guidelines and safety protocols must be integrated into the design process.

What’s Next?

As AR technology continues to evolve, the possibilities for its application in cobotics are virtually limitless. Future developments could include gesture-based controls, adaptive learning algorithms that tailor the interface to individual user preferences, and even real-time collaboration features that allow multiple users to interact with a single cobot. These advancements will make the interaction more seamless and open new avenues for automation and efficiency in various industries.

As we stand on the brink of a new era in human-robot collaboration, enabled by the transformative power of Augmented Reality, we must pause to consider some critical questions.

Will AR interfaces become the new standard in cobotics, making traditional interfaces obsolete?

If we integrate more advanced features like gesture controls and adaptive learning algorithms, are we also prepared to address the complex ethical and safety considerations that come with them?

These questions serve as a reminder that while technology offers immense potential for improvement and innovation, it also demands a level of responsibility and foresight. As we navigate this exciting frontier, let’s ensure our approach is technologically advanced, ethically sound, and user-centric.


ARTICLE: How to ensure quality assurance when integrating a cobot

Written by Postdoctoral Research Fellow, Dr. Anushani Bibile and Research Program Co-Lead, Dr. Michelle Dunn, both from SUT

A collaborative robot (or cobot) is designed to work side-by-side with people and can support applications including welding, pick and place, injection moulding, CNC, packaging, palletising, assembly, machine tending, and materials handling. The integration of cobots enables the delegation of many skilled human activities, with cobots able to undertake a range of repetitious tasks while offering high flexibility and increased productivity.

A collaborative robot arm is compact, occupying a smaller floorspace than a conventional robot and can offer great flexibility for ‘low-volume, high-mix’ production, or high specialisation environments.

It is easier to re-program and re-tool a cobot to undertake a range of actions, providing greater agility as well as reductions in cost of operation. As cobots are also designed to work safely side-by-side with human operators, reduced safety measures are required when compared with a conventional robot.

If you are thinking of integrating a cobot into your manufacturing process, it is important to look at the quality assurance of your system. When implementing a conventional robot, you would ensure quality assurance was satisfied during initial setup; but when you use a cobot, which can be reconfigured for different processes, you need to consider quality assurance every time you make a change. Changes to the code by non-experts will have to be checked and verified very closely, and safety always needs to be considered. Therefore, quality assurance is critical for human-cobot systems in automated processes, as it ensures that the products or services produced meet the required specifications and are safe for use.

Why are continuous quality assurance checks important for human-cobot systems?

  • Productivity: Quality assurance measures can help optimise the performance of a human-cobot system, improving productivity and reducing waste. This can include monitoring and controlling the system to ensure that it is working efficiently and identifying areas where improvements can be made.
  • Safety: Safety is a critical concern when it comes to human-cobot systems. A cobot does not need to be caged, therefore a malfunctioning or improperly programmed cobot can cause serious injury or damage to humans or equipment. Quality assurance measures help ensure that the cobot system is designed and programmed correctly, and that it is safe for use.
  • Compliance: Quality assurance measures can help ensure that a human-cobot system meets regulatory and industry standards. This can include performing audits and inspections to ensure that the system is operating within the required parameters and that all safety regulations are being followed.
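The productivity and safety checks above can be partly automated as continuous monitoring. The following is a minimal sketch of that idea; the limit values, field names, and sensor readings are hypothetical illustrations, not values taken from any standard or vendor API:

```python
# Illustrative QA monitoring sketch for a human-cobot cell.
# All limits and readings below are hypothetical examples.

from dataclasses import dataclass

@dataclass
class SafetyLimits:
    max_tcp_speed_mm_s: float   # tool-centre-point speed limit
    max_contact_force_n: float  # contact force limit
    max_payload_kg: float       # rated payload limit

def check_reading(limits: SafetyLimits, speed: float,
                  force: float, payload: float) -> list[str]:
    """Return a list of QA violations for one sensor sample."""
    violations = []
    if speed > limits.max_tcp_speed_mm_s:
        violations.append(f"TCP speed {speed} mm/s exceeds {limits.max_tcp_speed_mm_s}")
    if force > limits.max_contact_force_n:
        violations.append(f"Contact force {force} N exceeds {limits.max_contact_force_n}")
    if payload > limits.max_payload_kg:
        violations.append(f"Payload {payload} kg exceeds {limits.max_payload_kg}")
    return violations

limits = SafetyLimits(max_tcp_speed_mm_s=250.0, max_contact_force_n=140.0,
                      max_payload_kg=5.0)
print(check_reading(limits, speed=300.0, force=120.0, payload=4.5))
# → ['TCP speed 300.0 mm/s exceeds 250.0']
```

In a real deployment, such checks would run against live controller data and feed an audit log, so that every reconfiguration of the cobot leaves a verifiable QA trail.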

If proper quality assurance measures are not in place, there are potential risks associated with human-cobot systems. Some of these risks include:

  • Malfunctioning: A cobot that is not properly programmed or maintained can malfunction, causing damage or injury to humans or equipment.
  • Inaccuracy: A poorly calibrated or inaccurate cobot can produce defective products or services, leading to waste, customer dissatisfaction, and potentially legal liabilities.
  • Cybersecurity: Human-cobot systems are susceptible to cyber threats, which can lead to system failures, data breaches, and other security issues. Quality assurance measures can help ensure that the system is secure and that appropriate cybersecurity protocols are in place.

Design of safety mechanisms must meet the corresponding industrial standards which are exemplified in the figure below. First, a cobot must meet the relevant safety requirements, laws and directives for general machinery such as the European Machinery Directive (2006/42/EC). Basic safety rules and regulations (known as Type A standards) must also be met. Specific applications of a cobot system must meet type B standards. Finally, the cobots as products must meet type C standards.

Safety Assurance standards and regulations for human and machine collaboration [1]
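The A/B/C hierarchy described above can be summarised as a simple lookup. The example standards named here are ones commonly associated with each type (e.g., ISO 10218 for industrial robots); treat them as illustrative and confirm the standards that actually apply in your jurisdiction and application:

```python
# The Type A/B/C standards hierarchy sketched as a lookup table.
# Example standards are illustrative; verify applicability case by case.

STANDARD_TYPES = {
    "A": {"scope": "Basic safety principles for all machinery",
          "examples": ["ISO 12100"]},
    "B": {"scope": "Generic safety aspects or safeguards",
          "examples": ["ISO 13849-1 (safety-related control systems)"]},
    "C": {"scope": "Product-specific requirements",
          "examples": ["ISO 10218-1/-2 (industrial robots)",
                       "ISO/TS 15066 (collaborative robots, technical spec)"]},
}

def standards_for(type_code: str) -> list[str]:
    """Return the example standards recorded for a given type."""
    return STANDARD_TYPES[type_code]["examples"]

print(standards_for("A"))  # → ['ISO 12100']
```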

Finally, it is important to regularly review and update quality assurance protocols to keep pace with evolving technologies and changing workplace conditions. By remaining vigilant and proactive in preserving the quality assurance of cobots in automated processes, organisations can reap the benefits of cobot automation while minimising risks and maximising productivity.

[1] Bi, Z. M., et al. (2021). “Safety assurance mechanisms of collaborative robotic systems in manufacturing.” Robotics and Computer-Integrated Manufacturing 67.

[2] Vicentini, F. (2021). “Collaborative Robotics: A Survey.” Journal of Mechanical Design 143(4).

[3] Cobot – Wikipedia


ARTICLE: Cobots in manufacturing: Good for skill shortages and much more.

Written by Research Program Co-Lead Professor Greg Hearn and PhD Candidate Nisar Ahmed Channa both from the Human Robot Workforce research program in the Centre.  

In an era of rapidly evolving technology, almost every industry sector needs to keep pace with technological advancements to prosper and remain competitive. However, many companies struggle to develop or even adopt innovations in technologies, processes, or business models. COVID-19 is one recent example: manufacturing companies found it extremely challenging to generate an innovative response to the labour shortages caused by lockdowns and movement restrictions across many countries. As a result, many production units of large as well as small-to-medium manufacturing companies shut down for significant periods. This negatively affected global supply chains in many other sectors, because manufacturing industries provide inputs in the form of usable goods and services to many other industries. Manufacturing companies were already facing rising production costs caused by the unavailability of raw materials and increased labour costs, and the COVID-19 pandemic further fuelled these issues through disrupted global supply chains and the restricted movement of workers. Even after the pandemic, various countries still face increased labour costs and shortages of skilled labour. As a result, companies are now investing substantial financial resources to future-proof their manufacturing capability, reduce inputs, and increase outputs.

One innovative solution to these labour and skills shortages, on which academics and industry experts are working, is the adoption of collaborative robots (Cobots) in manufacturing. A Cobot is a special kind of robot, with context awareness, which can safely share a workspace with other Cobots or with human operators. Recent research suggests that Cobots can be used as alternatives to skilled human workers and can thus help offset the shortage of workers across industries. For instance, to cope with labour shortages caused by the pandemic and to meet increased demand, manufacturing companies in North America spent around 2 billion USD in 2021 to acquire 40,000 robots [i],[ii],[iii]. Similarly, rising labour costs and an ageing workforce have led to an increase in demand for Cobots in the automobile sectors of Europe and the Asia-Pacific region [iii].

Some labour economists believe that the introduction of technologies like artificial intelligence (AI) and robots increases production and efficiency in manufacturing through the displacement of jobs traditionally performed by human workers. However, under certain conditions, these technologies can create new jobs and upskill other jobs across the ecosystem of related suppliers and service providers.

In line with the priorities for Australian manufacturing formulated by the Australian Advanced Manufacturing Growth Centre [iv], we argue Cobots could be “creatively productive” for Australian manufacturing, not only because of their potential to improve production cost efficiencies but also to enhance value differentiation and potentially open up new revenue segments, including through export [v]. Efficiencies can be achieved through optimisation of human-robot workflow design; accelerating workforce acceptance of robot-driven process efficiencies; reducing human errors in automation documentation; and reducing downtime through enhanced work safety. Value differentiation can be achieved by integrating Cobots into product design for rapid prototyping; by developing autonomous quality assurance systems and better data analytics as value-adding services; by improving capabilities for just-in-time and mass-customisation products in existing markets; and by upskilling the manufacturing workforce for innovation leadership, which is in itself a value differentiator.

The fact that Cobots are designed to work alongside and close to people can help companies integrate and digitalise their business operations without losing the human aspects of the job. In many respects, Cobots are the hardware equivalent of augmented intelligence, rather than replacing people with autonomous equivalents. Cobots can supplement and improve human skills with super-strength, accuracy, and data capabilities, allowing people to do more and add more value to the production process and to the final product itself. This aids in creating strategic business value and improves efficiency, resulting in better, quicker delivery of products to customers in the market.

[i] North American companies send in the robots, even as productivity slumps | Reuters

[ii] Robots marched on in 2021, with record orders by North American firms | Reuters

[iii] Rise of The Cobots in Automotive Manufacturing | GEP


[v] Microsoft Word – Hearn et al ACRA Final Submission.docx (

ARTICLE: 6 Reasons Why We Need a Prototyping Toolkit for Designing Human-Robot Collaboration

Written by Postdoctoral Research Fellow, Dr Stine S Johansen and PhD Researcher, James Dwyer

In this short article, we will share 6 benefits of having a prototyping toolkit for designing human-robot collaboration (HRC). We will lift the curtain on our planned activities to work towards this in Program 2 of the Australian Cobotics Centre.

What type of human-robot collaboration are we talking about?

The Australian Cobotics Centre focuses on cobots in manufacturing settings. In these settings, robots are most often big and locked away in cages for safety reasons. They are useful for highly defined and repeatable tasks that require strength. In contrast, cobots are typically smaller and allow for people to safely carry out a task by handing over items to the robot or even by physically handling the robot.

Cobots address an increasing need for more adaptable robotic systems for customised and bespoke products. These types of products still require people in the manufacturing line to accommodate changes from product to product.

So, what could a prototyping toolkit look like?

Imagine a toolbox with screwdrivers, a hammer, cutters, etc. Similar to that, we already have tools in our design toolbox that work at a generic level or are appropriated to suit particular problems. But a toolkit for prototyping human-robot collaboration is still left for us to investigate. In Program 2, James Dwyer (PhD student) will contribute to our knowledge about how different prototyping tools can facilitate design processes of HRC. The goal is to develop a practical and affordable toolkit that can be used to enable designers, engineers, and end-users to work together towards human-robot collaboration in manufacturing settings and beyond.

What are the benefits of having a prototyping toolkit?

Knowing how a cobot can fit into an existing or new manufacturing setting requires substantial research. What if we had a way to make that process easier and more efficient for designers and clients as well as more accommodating for the final end-users of the cobot? This is the broad aim of a HRC prototyping toolkit. Here are 6 concrete benefits that we aim to support through our work in Program 2.

1) Accessible end-user engagement

Manufacturers often lack the expertise to define how a cobot could be used. They are, however, experts in their respective domains. Domain knowledge cannot always be documented in written reports; it also includes the tacit knowledge that workers build through years of experience. A prototyping toolkit can enable that knowledge to play a role very early in the design and development process by lowering the currently high technical barriers to understanding how a robot works. In Program 2, we rely on principles from participatory design, a design practice for producing tangible outcomes together with end-users.

2) Cost and time efficiency

Facilitating a cobot integration project can require substantial cost and time, which makes it non-viable for some manufacturers. Hardware investments require committing to a particular setup, and there are risks associated with such investments if the feasibility of the concept has not been investigated early on. It is therefore beneficial to have prototyping tools for conducting such investigations without the need for actual hardware. Prototyping tools also allow for quick and cheap iterations. Consequently, there is a need for tools that facilitate the transition from early concepts to implementation and testing.

3) Flexibility

Given the opportunity for cobots to assist in the manufacturing of customised products, there is a strong need for flexible solutions. Crucial to realising flexibility is the establishment of design processes that bridge the gap between early-stage conceptual development and technical integration. For cobots to contribute effectively to customised production, they must be grounded in a rich understanding of the work practices, production methods, and customisation requirements entailed in the manufacturing. This understanding can be developed through iterative design and a holistic approach covering conceptualisation, prototyping, and implementation. This will ensure that cobots are versatile, adaptable, and able to meet changing production needs.

4) Risk mitigation

Even though cobots are generally equipped with safety measures such as a safe stop button and sensors to detect and stop collisions with people, it is still possible to get hurt by a faulty cobot that has not been adapted to its environment. Prototyping tools allow us to mitigate this risk in two ways. First, it is possible to create virtual models of the environment and cobot, meaning that we can simulate tasks and clarify potential safety risks we might not otherwise have detected purely from prior experience and safety standards. This allows us to develop safety measures long before anyone gets hurt. Second, while engaging end-users in the design process has many benefits, people with non-technical backgrounds are not necessarily comfortable interacting with a robot – especially an unfinished robot solution. Therefore, prototyping tools can support our engagement with end-users by removing the potential fear of getting hurt.

5) Enhanced creativity

As design researchers, we often engage in generative ideation activities to address research questions. Prototypes enable us to see facets of an idea that were not previously obvious. This is sometimes referred to as ‘filtering’ (for further reading on this topic, see our list of references). It’s like putting on special glasses that highlight the specific qualities we want to explore further while still capturing the essence of the entire concept. In order to use prototypes as filters, it is necessary to have a holistic understanding of the context within which the cobot will operate and how that context can change with the introduction of the cobot. A prototyping toolkit can help give us different lenses to explore facets of the context in early prototypes, thereby becoming a creative extension for designers. This could include prototyping tools that facilitate Wizard-of-Oz methods, video prototyping, or virtual simulations.

6) Facilitating internal communication

Prototyping is an activity that allows us to both internalise and externalise ideas. In other words, prototypes enable us to internally reflect on what works and what does not, as well as to communicate ideas to team members, clients, or anyone else interacting with them. Prototypes have always played that role in design research, but given the technical barriers to quick prototyping for human-robot collaboration, there is a need to identify new ways to facilitate this role of prototypes.

We look forward to sharing our progress throughout the next few years. Please reach out to us for further discussion, questions, or other inquiries.

Further reading:

Lim, Y. K., Stolterman, E., & Tenenberg, J. (2008). The anatomy of prototypes: Prototypes as filters, prototypes as manifestations of design ideas. ACM Transactions on Computer-Human Interaction (TOCHI), 15(2), 1-27.

Wensveen, S., & Matthews, B. (2014). Prototypes and prototyping in design research. In The Routledge Companion to Design Research (pp. 262-276). Routledge.

Odom, W., Wakkary, R., Lim, Y.-K., Desjardins, A., Hengeveld, B., & Banks, R. (2016). From research prototype to research product. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI ’16), 2549-2561.

Ajaykumar, G. (2023). Supporting end-users in programming collaborative robots. Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’23), 736-738.

ARTICLE: Can we Unlock the Potential of Collaborative Robots?

Written by Dr Marc Carmichael and Louis Fernandez from the Australian Cobotics Centre.

Collaborative robots, or cobots for short, have gained significant attention in recent years due to their potential to work in close proximity and collaboration with humans. However, despite their name, there seems to be a lack of actual collaboration between humans and cobots in many, if not most, industrial settings.

The Australian Cobotics Centre aims to transform the Australian manufacturing industry through the deployment of collaborative robots, and in a recent webinar we discussed how significant benefits may be possible if more sophisticated forms of collaboration between humans and cobots can be practically achieved.

In this article we discuss this, starting with the basics of cobots, exploring the untapped potential of cobot-human collaboration, and how we hope to develop new ways of enabling humans and cobots to collaborate.

Defining Cobots and Industrial Robots:

Before we talk about the untapped potential of cobot-human collaboration, let’s start by understanding the basic differences between cobots and regular industrial robot arms.

Industrial robot arms are normally big, heavy machines that you might see in factories or other environments with repetitive and predictable jobs. Industrial robots are great at lifting heavy things quickly and accurately. However, this is also what makes them dangerous around people, which is why they must be kept separated from workers.

On the other hand, cobots are much smaller and lighter. They also have technology that lets them ‘feel’ their surroundings. These functionalities allow them to work alongside humans. On top of that, they’re easier to program than industrial robots. This allows them to be quickly put to work on different tasks and makes them good for flexible jobs.

The Current State of Cobot Collaboration:

Even though cobots are capable of working beside people, they rarely truly work with people. Feedback from experts and users, as well as the research literature, indicates that cobots are being used more like traditional industrial robots. For example, cobots are often deployed for pick-and-place or palletising jobs. These applications look much like how industrial robots work, except that cobots don’t need a protective cage around them. This raises the question: “Are we really using cobots to their full potential?”

Don’t get us wrong, using cobots as cageless industrial robots has great advantages. Not needing a cage means you have more space on your shop floor for other equipment, and you spend less time during the installation process. In addition, cobots are generally easier and faster to program compared to industrial robots. For example, cobots can be programmed by physically grabbing and moving their arm to show them where to go. This easy form of programming allows cobots to be quickly set up and deployed, a benefit for small businesses getting into automation. Plus, cobots are getting better, with some having more reach and strength to handle different jobs. As they improve, cobots and industrial robots may become harder to tell apart, and using cobots like cageless industrial robots might become common.

However, using cobots like industrial robots doesn’t make the most of what they can do. We should explore the challenges and opportunities of making cobot-human collaboration better.

Defining Levels of Human-Robot Collaboration:

Before we continue, it is important to define collaboration in the context of cobots. What collaboration means depends on the discipline, and terms are often used inconsistently or interchangeably. A classification that is becoming increasingly common, and which we personally like, is the following:

Level 0: Cell – this is the traditional approach used in industrial robots where humans are isolated from the robot, often by physical caging or fences.

Level 1: Co-existence – the human and cobot share the workspace, but work together on a task in a sequential fashion. For example, a cobot performs a packing task, with a human only entering the workspace to restock items. Sensors such as a safety area scanner are used to slow/stop the cobot when someone is in the vicinity.

Level 2: Co-operation – the human and cobot operate in a shared space, with the worker guiding or influencing cobot operation via inputs (e.g. force, speech, gesture). The cobot may adapt its motion based on measurements of the human.

Level 3: Collaboration – the human and cobot collaborate on a joint task. The cobot learns and adapts by observing humans to achieve a dynamic and supportive collaboration. Human and cobot are responsive to each other in a mutually beneficial manner, where both parties actively contribute to the task at hand.

Although the boundaries between them can be difficult to draw, these definitions help distinguish different levels of interaction and collaboration between cobots and humans.
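The four levels above form a simple ordered taxonomy. Purely as an illustration (the names and helper function below are our own sketch, not part of any standard or existing library), it can be expressed as an enumeration:

```python
from enum import IntEnum

class CollaborationLevel(IntEnum):
    """Illustrative sketch of the four human-robot collaboration levels."""
    CELL = 0           # Human isolated from the robot by physical caging or fences
    COEXISTENCE = 1    # Shared workspace, sequential tasks; robot slows/stops near people
    COOPERATION = 2    # Shared space; human guides the cobot via force, speech, or gesture
    COLLABORATION = 3  # Joint task; cobot learns and adapts by observing the human

def requires_caging(level: CollaborationLevel) -> bool:
    """Only Level 0 relies on physical separation for safety."""
    return level == CollaborationLevel.CELL
```

Encoding the levels as ordered values makes comparisons natural, e.g. `level >= CollaborationLevel.COOPERATION` to check whether a use case involves the human actively influencing the cobot.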

Exploring the Potential Gains and Barriers to Collaboration:

We would consider most cobot use cases to be Level 1 collaboration, where, beyond the cobot adapting its pre-programmed routine to the presence of a human, there is next to no real collaboration between the two. To rephrase the question raised earlier: “What are we missing out on by not pursuing Level 2 and Level 3 collaboration?”

There are some interesting and compelling proofs-of-concept by robotics researchers that demonstrate the potential to be achieved; see Further Reading for some examples. One study estimated a potential reduction in task completion time of up to 20%, suggesting significant productivity benefits can be unlocked. Unfortunately, relatively few examples of high-level collaboration have made their way into practical use.

In our program (Human-Robot Interaction) at the Australian Cobotics Centre, our goal is to increase the scope of genuine collaboration. Our efforts are focused on novel interaction approaches using multi-sensory interfaces, gesture control devices, and augmented reality, which can reduce training costs, enable rapid prototyping, and make robots safer and easier to use in production tasks.

It is our belief that addressing these challenges will lead to new methodologies for enabling rich and beneficial forms of human-robot collaboration. Combined with the work of our colleagues at the Australian Cobotics Centre, whose programs are addressing technical, social, and organisational challenges, we look forward to sharing the outcomes we achieve and are excited about the future of cobotics.

Further reading:

Michaelis, J. E., Siebert-Evenstone, A., Shaffer, D. W., & Mutlu, B. (2020). Collaborative or simply uncaged? Understanding human-cobot interactions in automation. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems.

Guertler, M., Tomidei, L., Sick, N., Carmichael, M., Paul, G., Wambsganss, A., Hernandez Moreno, V., & Hussain, S. (2023). When is a robot a cobot? Moving beyond manufacturing and arm-based cobot manipulators. Proceedings of the Design Society, 3, 3889-3898.

Kopp, T., Baumgartner, M., & Kinkel, S. (2020). Success factors for introducing industrial human-robot interaction in practice: an empirically driven framework. The International Journal of Advanced Manufacturing Technology, 112(3-4), 685-704.

Male, J. and Martinez-Hernandez, U. (2021). Collaborative architecture for human-robot assembly tasks using multimodal sensors. 2021 20th International Conference on Advanced Robotics (ICAR).

Carmichael, M. G., Aldini, S., Khonasty, R., Tran, A., Reeks, C., Liu, D., … & Dissanayake, G. (2019). The ANBOT: an intelligent robotic co-worker for industrial abrasive blasting. 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).

Zhuang, Z., Ben-Shabat, Y., Zhang, J., Gould, S., & Mahony, R. (2022). Goferbot: a visual guided human-robot collaborative assembly system. 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).

ARTICLE: Human-Robot Collaboration through Augmented Reality

Written by Dr Alan Burden, Postdoctoral Research Fellow from the Australian Cobotics Centre.

In previous articles, we delved into socio-technical systems (STS) and highlighted the importance of spatial design in shared human-robot environments. As we continue this exploration, this article will focus on technologies that show immense potential in improving the harmony between humans and cobotic systems. Our spotlight will be on augmented reality (AR), a technology poised to make human-cobot interactions more intuitive, efficient, and enjoyable. 

AR is a part of the ‘reality technologies’, often grouped under the umbrella term of extended reality (XR), which also includes virtual reality (VR) and mixed reality (MR). These technologies merge the physical and digital worlds, creating innovative environments where humans and machines interact. AR stands out because it doesn’t replace our reality, as with VR, but instead enhances our existing environment by overlaying digital information.   

AR enhances our perception of the physical world by overlaying images, sounds, or other data onto our physical environment. In cobotics, AR could serve as a communication bridge between humans and robots, facilitating a more intuitive and efficient collaboration. For example, AR can visually guide a human worker in a manufacturing plant, showing them how to operate a machine or assemble a product with the help of a robot. Similarly, AR could provide surgeons with real-time data during a robot-assisted procedure in a healthcare setting. AR offers opportunities to improve the efficiency of the task at hand and enhance the safety and effectiveness of human-robot collaboration. 

The potential of AR extends beyond communication. It also plays a crucial role in spatial design for shared human-robot spaces. AR can help visualise the optimal arrangement of a workspace, considering the movement patterns and tasks of both humans and robots, which could lead to safer, more efficient, and intuitive shared spaces. For example, in a warehouse, AR can help design a layout that minimises the risk of accidents between human workers and autonomous robots. By visualising the robots’ paths and highlighting potential danger zones, AR can contribute to a safer and more productive environment. 

However, the integration of AR into cobotics is not without challenges. Technical limitations, such as the accuracy and reliability of AR devices, can affect the effectiveness of AR applications. User acceptance is another critical factor. While AR can make human-robot collaboration more intuitive, users must adapt to a new way of working and interacting with technology. Ethical considerations, such as privacy and data security, must also be addressed. 

Despite these challenges, AR presents exciting opportunities for the future of cobotics and STS. It can make human-robot collaboration more accessible and user-friendly, opening new possibilities for automation in various industries. Moreover, as AR technology evolves, we can expect even more innovative applications that will further enhance human-robot collaboration. 

AR is a powerful tool that can significantly enhance human-robot collaboration in STS. By improving communication and contributing to the design of safer and more efficient shared spaces, AR can help us harness the full potential of cobotics. As we navigate the intersection of humans and technology, embracing tools like AR will be crucial in creating a harmonious and efficient future for human-robot collaboration. The journey towards this future is filled with challenges and exciting opportunities. As we continue to explore and innovate, we can look forward to a world where humans and robots work together seamlessly, each enhancing the capabilities of the other. 

ARTICLE: 6 Reasons Why We Need a Prototyping Toolkit for Designing Human-Robot Collaboration

In this article, Postdoctoral Research Fellow Stine Johansen and PhD Researcher James Dwyer highlight the pressing need for a prototyping toolkit to support the design process of human-robot collaboration (HRC).

As robots become increasingly integrated into industry, companies are grappling with uncertainties surrounding their implementation and task allocation. Developing a prototyping toolkit is one way to address these challenges.

By involving manufacturers and end-users early in the design process, we can harness their domain knowledge and tacit expertise to create meaningful outcomes to transform the future of manufacturing.




ARTICLE: The Human Robot Workforce research program

To implement collaborative robotics effectively in advanced manufacturing, we must address both the technological advancements required and the human and design factors associated with technological change. These areas form the focus of our research programs, each comprising several PhD projects that explore specific research questions.

Our Human Robot Workforce program is the first of our research programs in which all of the PhD researchers have begun their projects. Today, we are delving a little deeper into the program and sharing the objectives of each project within it.

Program Leads: Dr Penny Williams & Prof Greg Hearn
Program Postdoctoral Research Fellow: Dr Melinda Laundon
PhD researchers: Jacqueline Greentree, Nisar Ahmed Channa, Akash Hettiarachchi, Phuong Anh Tran
Other Chief Investigators involved: Dr Sean Gallagher
Associate Investigators: Dr Claire Mason & Dr Luca Casali
