NLP, Social Media, and Me

Language has power, and technology is offering new ways of understanding it. In our texting and emails, algorithms have started to complete our words and sentences for us. Technology like “autocomplete” is based on recognizing patterns in language. Does it always get it right? …well… I am sure we can all think of a time when autocorrect has failed us….

[Image: autocorrect fail]

 

There are also times it can be successful.

Language pattern recognition software has also given us other “handy” technologies like chatbots. I am trying not to be overly critical here, just setting the tone for the article: new technology, although exciting, is still not perfect. Language pattern recognition is already affecting our work, and we may not even know it (hopefully not causing our clients to divorce). Not only do mental health providers need to understand this technology, we also have to consider its ethical implications in our practices.

The use of “Natural Language Processing,” or NLP, is no exception. Natural language processing is a branch of machine learning that analyzes blocks of text for contextual patterns or clues. It is being used more and more often to make sense of healthcare and mental health data.

As someone who types endless progress notes, one project in particular made NLP come alive for me. One study looked at clinical notes containing keywords related to suicidal ideation and the context surrounding those keywords. The computer programs the researchers developed successfully mined the text and made accurate predictions, independently of billing codes or diagnoses.
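To make the idea concrete, here is a bare-bones sketch in Python of what keyword-based flagging of notes might look like. The keyword patterns and the sample note are made up for illustration; the actual study used far richer methods than this.

```python
import re

# Hypothetical keyword patterns -- a real system would use much richer
# term lists and contextual features than this.
SI_PATTERNS = [
    r"suicidal ideation",
    r"thoughts of (killing|hurting) (him|her|them)?sel(f|ves)",
    r"wants? to die",
    r"overdose",
]

def flag_note(note_text):
    """Return any suicide-related phrases found in a progress note."""
    hits = []
    for pattern in SI_PATTERNS:
        if re.search(pattern, note_text, flags=re.IGNORECASE):
            hits.append(pattern)
    return hits

note = "Client reports passive suicidal ideation but denies plan or intent."
print(flag_note(note))  # -> ['suicidal ideation']
```

Notice that nothing here depends on a billing code or diagnosis; the signal comes entirely from the language in the note itself.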

Crisis Text Line was also able to mine its texts to identify those at higher risk. They found that certain words appeared more frequently in messages from texters at higher risk. Here is an example showing that some words signaled higher risk than others…

 

[Image: Crisis Text Line data showing which words were associated with higher risk]

 

Machines are able to analyze text and calculate risk. This raises the question of how and when this should be used. Professionally, I like the idea of analyzing EHR data. Natural language processing holds clinical value because it has been shown to catch things I might miss. In the Crisis Text Line example above, the words “ibuprofen” and “Tylenol” stick out to me. These words could be mentioned not only as a method; suicidal individuals might also be expressing the physical pain they are in.
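For the curious, here is a toy illustration of the word-frequency idea: comparing which words show up disproportionately in one group of messages versus another. The messages below are invented, not real Crisis Text Line data, and the smoothing constant is an arbitrary choice.

```python
from collections import Counter

# Invented example messages -- NOT real Crisis Text Line data.
high_risk = [
    "i took a whole bottle of ibuprofen",
    "i have the tylenol and i am scared",
    "i just want the pain to stop",
]
lower_risk = [
    "having a rough day at school again",
    "my mom and i had a big fight",
    "feeling stressed about exams this week",
]

def word_rates(messages):
    """Fraction of all words in a set of messages accounted for by each word."""
    counts = Counter(word for msg in messages for word in msg.lower().split())
    total = sum(counts.values())
    return {word: count / total for word, count in counts.items()}

hi_rates, lo_rates = word_rates(high_risk), word_rates(lower_risk)

def ratio(word, eps=0.01):
    """How much more often a word appears in the high-risk set (lightly smoothed)."""
    return (hi_rates.get(word, 0) + eps) / (lo_rates.get(word, 0) + eps)

for word in sorted(hi_rates, key=ratio, reverse=True)[:5]:
    print(word, round(ratio(word), 2))
```

Real systems use far more data and more careful statistics, but the basic move, comparing word rates across groups, is the same.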

Some have suggested that social media platforms consider using NLP to identify suicide risk. The use of social media data to identify suicide risk is an emerging question, and ethical issues still arise. One could argue, from a public health perspective, that using NLP to flag certain themes on social media may help, and that the “greater good” outweighs whatever ethical concerns may arise. Despite this, issues of privacy and consent remain prevalent…

The use of this kind of language data raises several questions…

  • If you were able to give your EHR and social media data, how would you do it, and to whom would it be given?
  • What sort of consent is required to do this?
  • Who would own your data and what would they do with it?
  • Who would be able to have access to this data?

There is reason to be skeptical, but people are thinking critically about this. To answer these questions, researchers have begun to consider how social media text might be used in an ethical manner. Benton, Coppersmith, and Dredze (2017) noted that, in addition to working within the IRB process, additional steps need to be taken around informed consent, data protection (how data is handled, encryption, etc.), and de-identifying data, and that thought should be given to whether or not a mental health intervention is recommended.
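To give a flavor of the de-identification piece, here is a minimal sketch that scrubs a few obvious identifiers from text. It is only an illustration; real de-identification pipelines rely on validated tools and human review, and catch far more than this.

```python
import re

# A minimal de-identification pass -- real projects use validated tools and
# human review; this only catches a few obvious identifiers.
PATTERNS = {
    "PHONE": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "DATE":  re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
}

def deidentify(text):
    """Replace matched identifiers with bracketed labels."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(deidentify("Met with client on 4/12/2018, callback 555-867-5309."))
# -> "Met with client on [DATE], callback [PHONE]."
```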

From a clinical perspective, screening for suicide risk is important. My question is this: if social media data identifies someone as being at high risk for suicide, what comes next? This is where informed consent is critical. There should be some sort of “opt-in.” Social media companies entering this space should be clear about how they got the data and about the services they can recommend. Personally, I would want basic education about NLP and its potential benefits prior to signing on to an option like this.

After my consent is given, prompts should be offered during high-risk times, perhaps pointing to the National Suicide Prevention Lifeline or Crisis Text Line. Local treatment options should be presented as well.
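Roughly, the logic I am imagining looks something like the sketch below. The consent flag, risk score, threshold, and wording are all hypothetical placeholders, not any platform's actual system.

```python
# A sketch of an opt-in crisis prompt. Everything here is a placeholder;
# local treatment options could be appended to the resource list as well.
CRISIS_RESOURCES = [
    "National Suicide Prevention Lifeline: 1-800-273-8255",
    "Crisis Text Line: text HOME to 741741",
]

def maybe_prompt(user_opted_in: bool, risk_score: float, threshold: float = 0.8):
    """Only surface crisis resources if the user opted in and the score is high."""
    if not user_opted_in:
        return None  # no consent, no prompt
    if risk_score < threshold:
        return None  # below the high-risk threshold
    return ("We noticed you may be struggling. Help is available:\n"
            + "\n".join(CRISIS_RESOURCES))

print(maybe_prompt(user_opted_in=True, risk_score=0.91))
```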

Natural Language Processing can be a powerful tool to inform practice. The challenge lies in how one collects, mines, and follows up with the data. Ethical questions still abound, but mental health professionals should be at the table to discuss these issues.

 

Would love to know what others think… Please feel free to comment below or give me a shout on Twitter (@stuckonsw).


Tools For Practice Tuesday: Litesprite

Much is being made of the potential impact of video games on mental health. There are often headlines about the negative effects of video games on health, and our culture is becoming more sedentary. Video game addiction was also just classified as a mental health disorder. As with arguments about social media, I think we have to be cautious about automatically labeling video games as negative.

Litesprite is attempting to change the conversation around mental health and games. They have created a series of games to assist with anxiety and depression. With a friendly wolf as a guide, users learn basic cognitive behavioral therapy concepts and coping skills while having some fun along the way…

It was wonderful getting a demo from the founder, Swatee Surve. She walked me through the journey you take and some of the achievements you can earn. The end goal is to become a “zen master.” The design was simple, and since I work with youth, I thought it might be too simple. Much to my surprise, Swatee told me that the average user is a 40-year-old female. I found the journey engaging and helpful. As with most therapy technology, creating a shared experience between therapist and user is important. There is a way to track symptoms and progress, then discuss them along the “journey.”

So if you or someone you know is looking for an engaging piece of technology to help with anxiety and depression, check out Litesprite.

You can find the app on the App Store and Google Play.