The 100 suicide notes that taught us about creating more empathetic chatbots


While the art of conversation in machines is limited, it improves with every iteration. As tools are developed to manage more complex conversations, there will be technical and ethical challenges in how they detect and respond to sensitive human issues.

Our work involves building chatbots for a range of uses in health care. Our system, which incorporates several algorithms based on artificial intelligence (AI) and natural language processing, has been in development at the Australian e-Health Research Centre since 2014.

The system has produced several chatbot apps that are being trialled among selected groups of people, usually those with an underlying health condition or who need reliable health-related information.

They include HARLIE for Parkinson's disease and autism spectrum disorder, Edna for people receiving genetic counseling, Dolores for people living with chronic pain, and Quin for people who want to quit smoking.

Research has shown that people with underlying medical conditions are more likely to experience suicidal thoughts than the general population. We need to make sure our chatbots take this into account.
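To make the idea concrete, here is a minimal sketch of how a chatbot might flag messages that warrant escalation. This is purely illustrative and not the authors' actual system: the phrase list and function names are hypothetical, and a real deployment would rely on trained classifiers and clinical oversight rather than a keyword lookup.

```python
# Illustrative sketch only -- not the production system described above.
# Flags user messages containing simple risk phrases so the chatbot can
# escalate to a human and surface crisis resources instead of replying
# automatically.

RISK_PHRASES = [  # hypothetical, highly simplified lexicon
    "end my life",
    "kill myself",
    "no reason to live",
    "better off without me",
]

def flag_risk(message: str) -> bool:
    """Return True if the message contains a known risk phrase."""
    text = message.lower()
    return any(phrase in text for phrase in RISK_PHRASES)

def respond(message: str) -> str:
    """Route risky messages to escalation; otherwise continue normally."""
    if flag_risk(message):
        # A real system would hand off to a clinician and display
        # local crisis-line details at this point.
        return ("It sounds like you're going through a very hard time. "
                "I'm connecting you with someone who can help.")
    return "..."  # normal dialogue handling would continue here
```

In practice, keyword matching like this produces many false negatives and false positives, which is one reason systems of this kind move toward machine-learned models trained on real language about distress.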