Can Telemedicine Fix Mental Health?

As New York State has expanded who can provide telemental health/telepsychiatry services to include social workers, I have been thinking about telemedicine a lot… I mean A LOT. Technology certainly has a role in improving mental health care, but how? Electronic health records, apps, and telemedicine are already making a difference in timely and affordable access to care. They can help create positive experiences for both patients and providers. Still, mental health care has a long way to go.

There is certainly a lot of promise, but how do we get there? Barriers exist in terms of billing, time to learn the technology, understanding regulations, and reimbursement (just to name a few). As I mentioned earlier, New York State expanded its definition of telemental health services to include social workers. This led me to start thinking about how this technology can assist in my role as a care manager for youth. I consider doing home visits with complex clients a powerful intervention. Doing home- and community-based work is a privilege and an excellent way to engage. Telemedicine can serve as a bridge between these face-to-face contacts. I believe the opportunities outweigh the barriers.

Although I would never advocate for my clients to be completely reliant on video, doing a check-in when weather, illness, or time becomes a factor would be a valuable tool. Rather than having to wait another week, I could do a brief session via telemedicine. This would have meaning for both me and my clients.

The other use case is transitions from inpatient care to the community. Youth are admitted to inpatient services, often with little contact with their families and those who will care for them after discharge. In the semi-rural area where I work, families have to travel at minimum 40 minutes to the nearest facility, with one being an hour and a half away. It is best practice to involve parents and community providers in care. Having telemedicine act as a bridge between families and communities could help improve care during this critical time.

These are some of the ways we can use technology to better scale care, but many other opportunities exist. Other models I have seen include companies like WorkIt Health, which provides a mix of brick-and-mortar facilities and telemedicine for substance abuse treatment (available in Michigan and California). Another example is Kip Therapy, based in San Francisco, which provides a similar mix of tech and face-to-face care.

I really enjoyed this webinar via The American Telemedicine Association and InSight Telepsychiatry on developing a telemedicine strategy. They provided a helpful framework for developing a strategy and assessing where it might be beneficial.

A key point was thinking about implementation to grow a program “across”, “within”, and “around”. This helped me better conceptualize my goals and align them with the social work perspective: creating community partnerships and understanding the complex challenges our clients and communities face. When we think about adding technology, it’s not just about benefiting the organization but the community and systems as a whole.

Now we get to the challenging part. We figured out the “why”, but now the “how” becomes a little murky. Who pays? What does reimbursement look like? This will vary from state to state and by locality and its needs. I generated my care coordination wish list for telemental health based on my perceived needs in the community. Now the tricky part is to back this up with numbers and evidence. Will being able to fill gaps due to bad weather benefit client outcomes? Will increasing contact during the transition from inpatient to outpatient care actually improve outcomes?

This is where data and understanding the needs of a community come in. I started this post with a rather lofty question. Attempting to understand implementation strategy has led me to think about how technology can be added to the existing fiber of our mental health system. Social work’s strength in implementing telemedicine is growing from within and across. Rather than creating an entirely new service, growing within existing systems and communities would be my first step.

This is my vision for telemedicine assisting mental health care; what is yours?

Stuck On Algorithms

Algorithms are playing an important role in our daily lives. They tell us what to shop for, they decide if we get a loan, and, soon, they will shape how we make healthcare decisions. These could have significant implications for social work practice. Learning about concepts such as algorithms and artificial intelligence is part of my journey of trying to get “unstuck” about technology issues. I set out to get clarity on algorithms and how social workers can gain a voice in their design. It’s important to back up just a bit and define what they are…

I found this one minute video via BBC Learning to sum it up nicely…

This illustrates the need for algorithms to be clear, concise, and accurate. As algorithms, machine learning, and other forms of artificial intelligence take on more complex problems, this gets tricky. For social work practice, the question is not “if” algorithms will impact our practice but “when” and “how”. This post was inspired by a medical blogger, Dr. Berci Mesko, aka “The Medical Futurist”. He consistently explains how technology will affect medical care.

In a recent post he explains the medical algorithms currently approved by the Food and Drug Administration (FDA) in the United States. This provides an excellent overview of what algorithms are used for in medicine. What caught my eye was the four algorithms highlighted as relevant to psychiatry.

My enthusiasm for technology has been tempered over the last year as I learn more about algorithms and machine learning. I recently read and reviewed “Weapons of Math Destruction”, which examines the potential faults of algorithms. The book argues that algorithms determining teacher evaluations, college rankings, and criminal justice sentencing are inherently biased. Social workers should be aware of potential biases in these systems. What I struggled to find was a way to analyze these issues in a concise way.

I began to question concerns about medical algorithms, and my Twitter crew came through…

Those four algorithms for psychiatry are possible signposts. If the FDA approval is based on relative accuracy comparison by humans (example, ADHD), I have questions, but not necessarily surprised.— 𝗦𝘁𝗲𝗽𝗵𝗲𝗻 𝗖𝘂𝗺𝗺𝗶𝗻𝗴𝘀, 𝗟𝗜𝗦𝗪 🎙💻 (@spcummings) June 18, 2019

Along with (for some of these) who gets the data, what else is data used for, is there any kind of auditing…
— One Ring (doorbell) to surveil them all… (@hypervisible) June 18, 2019

Hard to say w/o more detailed breakdown, but one issue is definitely the “usual” question: what populations were used to train the algos?
— One Ring (doorbell) to surveil them all… (@hypervisible) June 18, 2019

The most helpful resource I found was provided by Dr. Laura Nissen. She found AI Blindspot, a project by the MIT Media Lab and others.

Ok that is completely fascinating and I don’t have complete answers. So far I’ve found 2 things I like that seem like promising scaffolding to decide “do I like this?” Or “ do I not like this?” Here’s one of them… https://t.co/4ELcqsBRv3— Laura Nissen, PhD, LMSW (@lauranissen) June 18, 2019

They walk you through the process of identifying potential errors in building AI and algorithms. They provide a series of cards, each giving examples of an error, along with further resources…

I found the card on “Representative Data” to best capture my initial concerns about data diversity: in healthcare, we need to make sure that diverse data sets are available. From the social work perspective, two more notions of algorithmic justice are important.

The concept of Discrimination by Proxy is a critical one. This means the algorithm may “have an adverse effect on vulnerable populations even without explicitly including protected characteristics. This often occurs when a model includes features that are correlated with these characteristics.” An example I have heard about is algorithms that decide criminal justice sentencing: factors correlated with race and socio-economic status, rather than the relevant facts of the case, may end up determining sentences.
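To make this concrete, here is a minimal, entirely hypothetical simulation of discrimination by proxy. The group labels, the “zip_score” feature, and the approval threshold are all made up for illustration; the point is only that a decision rule that never sees a protected characteristic can still produce very different outcomes for two groups when it relies on a feature correlated with that characteristic.

```python
import random

random.seed(0)

def make_person(group):
    # Hypothetical setup: group membership correlates with a neighborhood-
    # based feature ("zip_score"), e.g. through housing segregation.
    if group == "A":
        zip_score = random.gauss(70, 10)  # group A clusters around 70
    else:
        zip_score = random.gauss(50, 10)  # group B clusters around 50
    return {"group": group, "zip_score": zip_score}

people = ([make_person("A") for _ in range(1000)]
          + [make_person("B") for _ in range(1000)])

def approve(person):
    # The decision rule uses ONLY the proxy feature, never the group.
    return person["zip_score"] >= 60

def approval_rate(group):
    members = [p for p in people if p["group"] == group]
    return sum(approve(p) for p in members) / len(members)

print(f"Group A approval rate: {approval_rate('A'):.0%}")
print(f"Group B approval rate: {approval_rate('B'):.0%}")
# Group A is approved far more often than group B, even though
# the rule never looked at group membership.
```

The disparity appears even though `group` is never an input to `approve` — the correlated proxy carries it in anyway, which is exactly why “we removed the protected attribute” is not, by itself, a guarantee of fairness.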

Also important to social workers is the Right to Contest. If one of these common blindspots is found, is there a means to reconcile it? Is there enough transparency in the algorithm to fix the problem? This matters when thinking about empowering the individuals and families we serve.

As more and more decisions are made by algorithms, I found this framework helpful for thinking critically about them. I hope this overview of the issues gets you “unstuck” about algorithms too.