York University Research Explores Role of Robots in Enhancing Crisis Response and Public Safety

Research at York University

Researchers at York University are exploring how artificial intelligence (AI) and robotics can play a pivotal role in improving public safety. This work aims to integrate robots into crisis response scenarios, potentially offering new ways to handle mental health emergencies, create safer environments, and provide support in situations that could escalate into violence.

In a recent paper published in the journal Applied Sciences, York University psychology PhD candidate Kathryn Pierce and her co-authors examine how robots could aid in crisis de-escalation. This research is part of a broader public safety project at York University’s Lassonde School of Engineering, where early-stage studies are underway to design and test robots capable of assisting with security and policing tasks. The aim is to make AI tools not just functional but empathetic, ultimately contributing to safer communities and campuses.

“We’re looking at how robots can use verbal and non-verbal cues to de-escalate potentially dangerous situations,” says Pierce, who is supervised by Dr. Debra Pepler, a renowned psychologist and Distinguished Research Professor in York University’s Faculty of Health. This interdisciplinary approach combines psychology and engineering to inform the development of more human-like, responsive machines.

De-escalation itself is not yet well researched, which makes it challenging to create clear guidelines for both human and robotic responses. However, Pierce’s study outlines a preliminary model suggesting that robots can be programmed with certain verbal and non-verbal communication strategies to enhance their effectiveness in crisis situations. These include how a robot maintains eye contact, approaches an individual (slowly and predictably), and speaks (in an empathetic tone). Such factors could be crucial in helping a person in distress feel more at ease, improving the likelihood of a positive resolution.

Pierce emphasizes that there are no “fixed rules” for de-escalation, even for human responders, which means programming robots to replicate this dynamic process remains a complex challenge. Nonetheless, this research sets the stage for robots to support crisis responders, enhancing safety in public spaces, including schools and universities.

With increasing concerns about mental health and safety, robots equipped with surveillance capabilities could monitor large areas, detect signs of agitation, and alert human responders before situations escalate. This could significantly improve the speed and effectiveness of interventions, preventing potentially dangerous incidents.

Robots could also accompany security personnel in corporate offices and on campuses such as York University’s, recording interactions and providing valuable footage for training purposes. This data could help refine response tactics, ensuring that both human and robotic responders are better equipped to handle future crises.

Introducing robots into crisis response services would add a new layer of complexity to safety protocols. While there is growing support for pairing mental health professionals with traditional law enforcement to improve outcomes, using robots adds another dimension that requires careful consideration. The researchers stress that more data and real-world testing are needed to understand how individuals will respond to robots in crisis situations.

In recent years, studies have shown that police officers working alongside healthcare providers use less force when dealing with individuals in mental distress. The inclusion of robots in these settings could further enhance such collaborative efforts, making interventions even less confrontational and more focused on de-escalation.

While fully autonomous crisis-response robots remain a distant prospect, Pierce and her co-authors suggest that robots’ most immediate potential lies in assisting human responders. This includes capturing video footage during interventions, which can be invaluable for training and analysis. Furthermore, robots could help identify individuals exhibiting early signs of distress, enabling preemptive interventions by trained professionals.

“I think what’s most practical would be to have engineers direct their focus on how robots can ultimately assist in de-escalation, rather than aiming for them to act independently,” says Pierce. Her comments reflect the current consensus that while AI has the potential to be an invaluable tool, human experience and empathy remain irreplaceable.

The research, titled “Considerations for Developing Robot-Assisted Crisis De-Escalation Practice,” was authored by Pierce, Dr. Debra Pepler, Michael Jenkin (a professor of electrical engineering and computer science at York University’s Lassonde School of Engineering), and Stephanie Craig (an assistant professor of psychology at the University of Guelph).

This study represents a step toward improving public safety, demonstrating how interdisciplinary collaboration between psychology and engineering can pioneer innovative approaches to crisis management. As AI technology continues to evolve, the integration of empathetic, supportive robots could become a reality, enhancing the safety and well-being of communities and campuses alike.

