The Privacy and Equity Implications of Using Self-Harm Monitoring Technologies

Today, my personal labor of love is released: “The Privacy and Equity Implications of Using Self-Harm Monitoring Technologies: Recommendations for Schools.”

This report is deeply personal to me: I have depression and anxiety. If this technology had existed while I was in high school, I would almost certainly have been flagged and watched for potential self-harm because of my mental health, despite never being at risk. If someone had come to talk to me about what I was writing in my online journal, I would have felt violated and unsafe. I wouldn’t have used the internet for self-expression or to search for community – something that was vital to my mental health.

However, five years ago, my beloved oldest cousin died by suicide. He once saved my life: I jumped in the pool without floaties before I could swim, and he brought me back to the surface. I would do anything for him to still be alive.

We need to find ways to prevent these tragedies while ensuring that we aren’t deterring the very kids we are trying to help from seeking that help. We need schools to hire more counselors and other trusted adults who can help kids when they need it. If technology is used to monitor students, it needs to be evidence-based and independently reviewed, with privacy embedded throughout both the tech and its implementation by schools. We must make sure the tech doesn’t sweep so broadly that kids are afraid to use the internet to find help and community. And we need to make sure the data collected is deleted routinely, so no child or parent has to worry about it undermining their future opportunities.

Read the full report here.