The platform that we studied, like much of the other mental health technology we briefly explored this summer, claims to help users with their mental health. Unlike drug interventions or well-researched therapies, however, there are no clinical trials establishing the efficacy of these platforms.
As we previously discussed, ideal measures of helpfulness would at least include mental health assessments or surveys. Without that kind of data, and without follow-up with individuals once they discontinue use of the platform, it is incredibly hard to quantify how much help is actually occurring on these platforms. This forces researchers, such as ourselves, to rely on proxy measures, and it opens the door for these platforms to claim more benefit than they can demonstrate.
If a central claim of these platforms is that they help, then the ability to measure helpfulness in a robust, vetted way should be built into platform design. That would allow users, mental health professionals, and the platforms themselves to better understand how and when these platforms should be used and what benefits they may offer.
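To make this concrete, one option would be to administer a validated instrument such as the PHQ-9 (nine items, each scored 0-3, with published severity bands) at sign-up and at regular intervals. The sketch below shows how such scores could be computed; the function names and the idea of building this into a platform are our own illustration, not a feature of any platform we studied.

```python
# Minimal sketch: scoring a PHQ-9-style depression questionnaire.
# Administering a validated instrument like this at sign-up and at
# regular intervals would give a platform a longitudinal measure of
# helpfulness. Names here are illustrative assumptions.

PHQ9_SEVERITY_BANDS = [
    (0, 4, "minimal"),
    (5, 9, "mild"),
    (10, 14, "moderate"),
    (15, 19, "moderately severe"),
    (20, 27, "severe"),
]

def score_phq9(responses: list[int]) -> tuple[int, str]:
    """Sum nine 0-3 item responses and map the total to a severity band."""
    if len(responses) != 9 or any(r not in (0, 1, 2, 3) for r in responses):
        raise ValueError("PHQ-9 requires nine responses, each scored 0-3")
    total = sum(responses)
    for low, high, label in PHQ9_SEVERITY_BANDS:
        if low <= total <= high:
            return total, label
    raise AssertionError("unreachable: totals are bounded by 0-27")

# Example: the same user at two points in time.
baseline = score_phq9([2, 2, 1, 3, 2, 1, 2, 1, 0])   # (14, "moderate")
follow_up = score_phq9([1, 1, 0, 2, 1, 0, 1, 0, 0])  # (6, "mild")
```

Comparing scores over time, rather than relying on engagement metrics, would let a platform speak to whether users are actually improving.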
Secondly, there is a strong need for more effective moderation on this platform and others like it. Research into unmoderated, unstructured peer support has found that individuals who participated in this type of peer support reported higher levels of distress than those who did not. Interestingly, this finding held true even for individuals who personally recalled having a positive experience with the online peer support group [1].
Furthermore, in our analysis of suicidal and self-harm content within our mental health dataset, we found that while the platform’s policy is to delete content that contains references to violent or unsafe acts (such as suicidal ideation and self-harm), only 2.5% of such content was actually deleted by administrators or moderators. Research into suicide contagion [2] and non-suicidal self-injury [3] has found that exposure to this type of content can distress or trigger viewers, and may even lead them toward similar behaviors.
The absence of formal supervision and guidance on online peer support platforms has been linked with higher levels of distress for both peers requiring support and peers providing it [4]. Increasing both the quantity and quality of moderation on online peer support platforms could thus improve their effectiveness and limit potential harm [1].
On a peer support platform specifically designed for users who are experiencing or have experienced mental health challenges, suicidal ideation appears often: our own analysis of the mental health subset of the data we studied approximated that at least 10% of posts discussed extreme suicidal ideation and self-injury. While users are encouraged to report this content to the platform so it can be removed, we found that the vast majority of it (97.5%) was not deleted by an administrator or moderator.
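For transparency, the arithmetic behind these figures is straightforward. The sketch below shows the shape of the calculation; the file name and the boolean columns `is_crisis_content` and `deleted_by_moderator` are hypothetical stand-ins for the labels in our actual dataset.

```python
import pandas as pd

# Minimal sketch of the proportions reported above, assuming a table of
# posts with hypothetical boolean labels for crisis content and for
# whether a moderator deleted the post.
posts = pd.read_csv("mental_health_posts.csv")

# Share of all posts discussing extreme suicidal ideation or self-injury.
crisis = posts[posts["is_crisis_content"]]
crisis_rate = len(crisis) / len(posts)

# Of that crisis content, how much was actually removed by moderators?
deletion_rate = crisis["deleted_by_moderator"].mean()

print(f"crisis content: {crisis_rate:.1%} of all posts")        # ~10% in our data
print(f"moderator-deleted: {deletion_rate:.1%} of crisis posts")  # ~2.5% in our data
```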
Research into how young adults respond when a peer discloses suicidal intent found that the response was heavily dependent on the responding peer’s own familiarity with attempted or completed suicide: peers with a personal history of suicidal ideation were less likely to inform other adults and authorities about a peer’s intent [5]. While this is purely conjecture, that research may offer a clue as to why users aren’t reporting these posts and are instead attempting to engage with the peer on their own.
One option for reversing this trend, and potentially improving interactions overall, would be to implement mandatory peer training for everyone who signs up for the platform. A 2017 study of public interest in mental health interventions found that nearly two-thirds of the surveyed population said they would be interested in a training program to learn how to give effective peer support [6]; the interest in becoming a more effective peer supporter is clearly there.
When researchers compared unguided peer chat tools to guided ones (based on the problem-solving framework used in cognitive behavioral therapy, with the peer supporter guided through a series of prompts while responding), they found that guided chats resulted in deeper discussion of troubling issues and greater personal connection. The type of training is crucially important: the same study found that existing peer support training rarely follows what mental health professionals would consider best practice [7].
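To illustrate what a guided chat might look like in practice, here is a minimal sketch in the spirit of the problem-solving structure described in [7]; the step names and prompt wording are our own illustrative assumptions, not the prompts used in the study.

```python
# Minimal sketch of a guided peer-support chat. A chat client would
# surface these supporter-facing prompts one at a time, advancing as
# the conversation progresses. Step names and wording are illustrative.

GUIDED_PROMPTS = [
    ("identify", "Ask your peer to describe the problem troubling them most right now."),
    ("explore", "Ask what they have already tried, and how it went."),
    ("brainstorm", "Invite them to list possible next steps, without judging any yet."),
    ("choose", "Help them pick one small, concrete step they feel able to take."),
    ("plan", "Ask when and how they will take that step, and what might get in the way."),
]

def next_prompt(step: int) -> str | None:
    """Return the supporter-facing guidance for this step, or None when done."""
    if step < len(GUIDED_PROMPTS):
        name, prompt = GUIDED_PROMPTS[step]
        return f"[{name}] {prompt}"
    return None

step = 0
while (prompt := next_prompt(step)) is not None:
    print(prompt)
    step += 1
```

The point of the structure is not the specific wording but that the supporter is never left improvising alone, which is exactly the gap the study identified in existing training.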
Our research suggests that peer support platforms may be better suited to building community, helping users feel heard, and offering encouragement than to handling more complicated mental health issues, and research into the benefits and limitations of peer support backs this view [8].
In fact, the paper containing the definition of peer support that is widely quoted by peer support proponents (including the platform we studied) later clarifies that peer support should be used as part of “prevention based” services, which must include “an array of services that meet a continuum of needs” [9]. Peer support, in other words, should not exist in isolation.
Furthermore, a study of unmoderated, unstructured platforms similar to the one we studied found that a strong sense of connection and community, a key component of the benefits of peer support, was something some participants felt was lacking in online platforms [1].
While the peer support platform that we studied repeatedly extols its benefits, it does not put the same level of attention and care into explaining when peer support is and isn’t appropriate, or into emphasizing that, for individuals experiencing more significant mental health struggles, peer support should be used in conjunction with other services.
A survey of young adults already receiving care from a hospital’s inpatient and outpatient departments found that 74.3% were amenable to professionals contacting them via social media to offer help or advice [10]. A randomized controlled trial of an online intervention for college students at risk of suicide found that it was the combination of personalized feedback and the option of online counseling that positively affected students’ willingness to seek treatment [11]. As already discussed, peer support should exist within the context of other sources of help, and having mental health professionals engage on these platforms, rather than peer supporters alone, may alleviate some of the burden and responsibility placed on those peers.
A 2018 study into the most effective professional responses to patients with suicidal ideation found that the most effective treatments included restricting access to means of suicide, fostering connectedness, and treating depression (through both psychotherapy and pharmacology). For youth suicidality specifically, family-based interventions were effective, and there was some evidence that follow-up actions establishing continuing care also reduce suicide risk [12].
The platform that we studied officially states that it is not a crisis service, but crisis-related content is still prevalent on it. It is thus worth noting that, of the effective treatments for decreasing suicide risk, the platform can be argued to provide only one: fostering connectedness (and, as previously mentioned, the degree to which even this is happening may be disputed [1] and should be the subject of further research). And yet this platform declared in a fundraising campaign that it had the “solution” to the teen suicide epidemic.
The issue of suicidal content is a complex one, but the platform’s existing protocol of depending on users to report extreme content and share the suicide hotline number is problematic. Simply pulling down such posts also ignores the possibility that the user is making a final plea for help.
Search for mental health apps and you’ll find dozens of companies making large claims about their products’ ability to help. These platforms are not, however, subject to oversight, nor are they held accountable for their declarations.
Behind the data we examined this summer are people, some of whom are desperate for help. And yet we know nothing about what happens to them after they log off for the last time, or how the platform has influenced them. This is an incredibly vulnerable population, yet these platforms are touted as the solution to our mental health care crisis despite the inconclusive or even nonexistent research behind their claims.
Our project this summer was an eye-opening experience for us. We hope that our research and discussion can be similarly enlightening to others and, in time, help start a much-needed dialogue about online peer support and the lack of regulation of mental health tech.
[1] Kaplan, K., Salzer, M., Solomon, P., Brusilovskiy, E., & Cousounis, P. (2011). Internet peer support for individuals with psychiatric disabilities: A randomized controlled trial. Social Science & Medicine, 72(1), 54-62. https://doi.org/10.1016/j.socscimed.2010.09.037
[2] Mueller, A., & Abrutyn, S. (2015). Suicidal Disclosures among Friends: Using Social Network Data to Understand Suicide Contagion. Journal of Health and Social Behavior, 56(1), 131-148. https://doi.org/10.1177/0022146514568793
[3] Lewis, S. P., Heath, N. L., Michal, N. J., & Duggan, J. M. (2012). Non-suicidal self-injury, youth, and the Internet: What mental health professionals need to know. Child and Adolescent Psychiatry and Mental Health, 6(1). https://doi.org/10.1186/1753-2000-6-13
[4] Álvarez-Jiménez, M., Gleeson, J. F., Bendall, S., Lederman, R., Wadley, G., Killackey, E., & McGorry, P. D. (2012). Internet-Based Interventions for Psychosis. Psychiatric Clinics of North America, 35(3), 735-747.
[5] Dunham, K. (2004). Young Adults’ Support Strategies when Peers Disclose Suicidal Intent. Suicide and Life-Threatening Behavior, 34(1), 56-65. https://doi.org/10.1521/suli.34.1.56.27773
[6] Bernecker, S. L., Banschback, K., Santorelli, G. D., & Constantino, M. J. (2017). A Web-Disseminated Self-Help and Peer Support Program Could Fill Gaps in Mental Health Care: Lessons From a Consumer Survey. JMIR Mental Health, 4(1). https://doi.org/10.2196/mental.4751
[7] O’Leary, K., Schueller, S. M., Wobbrock, J. O., & Pratt, W. (2018). “Suddenly, we got to become therapists for each other”: Designing Peer Support Chats for Mental Health. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC. https://doi.org/10.1145/3173574.3173905
[8] Repper, J., & Carter, T. (2011). A review of the literature on peer support in mental health services. Journal of Mental Health, 20(4), 392-411. https://doi.org/10.3109/09638237.2011.583947
[9] Mead, S., Hilton, D., & Curtis, L. (2001). Peer Support: A Theoretical Perspective. Psychiatric Rehabilitation Journal, 25(2), 134-141. https://doi.org/10.1037/h0095032
[10] Birnbaum, M. L., Rizvi, A. F., Correll, C. U., Kane, J. M., & Confino, J. (2017). Role of social media and the Internet in pathways to care for adolescents and young adults with psychotic disorders and non-psychotic mood disorders. Early Intervention in Psychiatry, 11(4), 290-295. https://doi.org/10.1111/eip.12237
[11] King, C. A., Eisenberg, D., Zheng, K., Czyz, E., Kramer, A., Horwitz, A., & Chermack, S. (2015). Online suicide risk screening and intervention with college students: A pilot randomized controlled trial. Journal of Consulting and Clinical Psychology, 83(3), 630-636. https://doi.org/10.1037/a0038805
[12] Rothes, I., & Henriques, M. (2018). Health Professionals Facing Suicidal Patients: What Are Their Clinical Practices? International Journal of Environmental Research and Public Health, 15(6), 1210. https://doi.org/10.3390/ijerph15061210