New chatbot tackles language barriers in emergency departments

Every minute counts in a hospital emergency department (ED), but language barriers can turn seconds into dangerous delays.

Delivery of critical care can be delayed—not because of equipment shortages, but due to language and cultural barriers. In regions like south-western Sydney, where 55% of residents speak a language other than English at home, this is a serious challenge in triage, according to the South Western Sydney Local Health District.

Dr Padmanesan Narasimhan, an ED clinician and UNSW Sydney researcher, has lived this issue first-hand. He explains the trouble in assigning acuity scores—used to determine how urgently a patient should be treated—when language gets in the way.

“If there’s a language barrier and triage staff have difficulty understanding the person presenting to ED, people with really severe or urgent medical conditions can be assigned a lower acuity score and potentially be made to wait,” he says. Meanwhile, others might be misclassified as urgent, pushing them ahead in the queue.

He illustrates this with a scenario: “Beyond language barriers, cultural norms around stoicism might lead a patient to downplay discomfort, describing severe pain as mere ‘tiredness’. A triage nurse, unaware of these nuances, could misinterpret this, assigning a lower priority to a potentially critical condition like appendicitis.” These misinterpretations could literally cost lives.

Although interpreter services are available for NSW hospitals, their logistics don’t match the reality of emergency medicine. “In an emergency setting the interaction is only three to five minutes,” Dr Narasimhan says. “It’s really hard for a triage nurse to call the interpreter services and then engage them for translation and interpretation.”

In response, Dr Narasimhan is leading a team that includes linguistics specialists, AI engineers and emergency clinicians to develop a multilingual AI chatbot. Currently funded through an NHMRC Ideas grant, the system will listen in at ED registration desks and interpret patients’ descriptions in real time.


Dr Narasimhan explains: “So if you speak Arabic, it will be able to interpret and translate your Arabic into English. And because it has natural language processing and machine learning capabilities, it will also be able to give an appropriate triage recommendation.”

While the chatbot supports triage, a human staff member will always supervise the process. “If there is any discrepancy between the AI’s and the nurse’s triage recommendations, it will be referred immediately to the senior consultant in the emergency department,” Dr Narasimhan says.

The development plan is structured over three phases: firstly, training the system using multilingual datasets and medical terminology; secondly, simulating triage scenarios in controlled environments; and finally, trialling the system in real EDs in multicultural regions such as Western Sydney.

Dr Narasimhan emphasises that this work differs from commercial language apps: “While there are some commercial apps that aim to break down language barriers in hospital settings, we’re doing it for the public good.” He adds that if successful, the technology could be adapted for other healthcare settings—including GPs—with potential to remove major barriers for multilingual Australians.

As hospitals increasingly use mobile-first tools, social media and digital health solutions, the chatbot represents not just an advancement in tech, but a deeply human-centred approach: designed to bridge divides, inject empathy into emergency communication and ultimately, improve equitable access to vital healthcare services.
