I recently got asked by a parent about SnapChat. “What are the rules?” she asked. It was an interesting question, and upon reflection, each social media platform seems to have a different “culture” or set of norms. How you interact on Twitter is a bit different than how you interact and post on LinkedIn (I hope). As we talked about it some more, the parent clarified: “What if they see something wrong or something bad happening?”
She ultimately wanted to know how the platform addresses bullying. I actually knew the resources to share… I pointed her to SnapChat’s safety page and the resources it includes for parents. It’s interesting that SnapChat recognizes the need to give information to parents, treating them as a deeper part of its culture or community. Although parents may not be end users, they are certainly part of the community. Not only that, but SnapChat has thoughtfully attempted to develop safety guidance for parents.
As some of you may know, I have built a social media ethics training for social workers. Reading SnapChat’s safety and community standards compelled me to examine more. I found these efforts to address such complex issues an interesting exercise, with implications for how we define community and safety. So I dove into the community and safety standards for Twitter, Facebook, Instagram, WhatsApp, Reddit, and LinkedIn. I found some interesting themes…
(As a sidebar: if you have access to Natural Language Processing tools, I would love to talk to you about doing a slightly more scientific study of this.)
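For anyone curious what that might look like, here is a minimal sketch (in Python, using scikit-learn) of one possible starting point: comparing the overall wording of each platform’s published standards with TF-IDF and cosine similarity. The standards/*.txt file paths are placeholders for text you would collect yourself; this is an illustration of the idea, not the study itself.

# Minimal sketch: compare the language of each platform's community/safety
# standards. The text files are placeholders -- you would save or paste each
# platform's published standards into standards/<platform>.txt yourself.
from pathlib import Path

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

platforms = ["facebook", "instagram", "snapchat", "twitter", "reddit", "linkedin"]
texts = [Path(f"standards/{name}.txt").read_text() for name in platforms]

# TF-IDF highlights which words each document leans on relative to the others.
vectors = TfidfVectorizer(stop_words="english").fit_transform(texts)
similarity = cosine_similarity(vectors)

# Print pairwise similarity scores (1.0 = identical wording, 0.0 = no overlap).
for i, name in enumerate(platforms):
    for j in range(i + 1, len(platforms)):
        print(f"{name} vs {platforms[j]}: {similarity[i, j]:.2f}")

A pass like this would only show how similar the documents sound; digging into themes (parents, privacy, reporting) would take more careful coding of the text.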
Going with the question that started it all: Facebook, SnapChat, and Instagram all felt the need to address parents specifically, acknowledging that parents have different needs and questions than youth. Getting even more specific about age differences, Facebook has developed a “Youth Portal” with information and stories from the teen perspective. Speaking of addressing specific populations, I found it fascinating that Instagram includes an “About Eating Disorders” research page.
With regard to general standards, they all include language about respectful interactions. It’s hard not to laugh when LinkedIn’s community guidelines have a whole paragraph about “being nice,” but sometimes spelling out that golden-rule reminder is not a bad thing. What I found most interesting was Reddit, which is often described as the Wild West of social media yet attempts to set out community guidance for moderators; the language was powerful and a good reminder for all of us…
Healthy communities are those where participants engage in good faith, and with an assumption of good faith for their co-collaborators. It’s not appropriate to attack your own users. Communities are active, in relation to their size and purpose, and where they are not, they are open to ideas and leadership that may make them more active.
Social media platforms are intentionally attempting to define a sense of community. Through my professional use of social media, I see that sense of community daily. The headlines don’t always make room for it, but there are ways social media is attempting to build community. In the current political climate it’s hard to believe social media can be a haven for community. Reading through the standards, I started to think about whether social media community interactions are really any different from the expectations of face-to-face interactions.
Almost all of the “Safety” standards place emphasis on privacy settings. The news has certainly forced social media companies to educate users about privacy. There is more intentionality around letting people know the potential privacy risks and how to minimize them. Facebook has certainly come under fire, but has responded with a privacy checkup, which I found to be an interesting design that walks you through your privacy options. The other platforms all offer some sort of privacy dashboard or tips.
This emphasis on privacy seems less about educating users and more about platforms attempting to reduce liability. When you step back, though, these tips ensure that you are informed of your privacy rights. The more familiar you are with “privacy,” the more empowered you are to decide whether you want to continue using social media in the first place. So the emphasis on privacy is not completely self-serving for social media companies. It is important that, as members of a community, we understand our rights.
All of the platforms have some way to report unsafe behavior. Some have proactive resources, like Facebook beginning to integrate Crisis Text Line into Messenger. Facebook has also developed a “bullying prevention” resource page. SnapChat’s safety page is similarly robust. Twitter does a nice job of collecting a lot of resources about threats and terms-of-service violations in one place.
Reporting threats and safety concerns on social media is not as intuitive as calling the police when you witness something face to face. Social media companies are taking some responsibility and trying to be as proactive as possible. The challenge, again, is how different this really is from preventing threats to others or to oneself in real life. Violence and suicide prevention in “real life” communities is just about as complex as it is on social media. The key is being as proactive as possible.
Social media platforms have built a unique way of communicating. They let friends and strangers connect and converse. We convey complex concepts and emotions to others. We can agree or disagree. It all seems simple until it becomes threatening to someone else, to ourselves, or to our privacy. These are things society has struggled with for a long time. Some would argue that social media has amplified these problems. But reading these community standards, I started to think about the role social media could play in preventing them.
By being forced to develop solutions, social media companies have a unique opportunity: they have to define what it means to be a member of the social media community. It’s odd that we need to put in writing that, as a member of a community, you should be safe and respectful toward other community members… but here we are. If you are being bullied or abused, you SHOULD know where to go; however, not everyone does. If you are feeling suicidal, you SHOULD know where to turn, but not everyone does.
Reading through the community and safety standards of these platforms reaffirmed the need to define what “community” is. We should know how to lift each other up and, conversely, how to report when someone is in danger. It is easy to write off social media as the “Wild West,” but in their efforts to define community standards, these platforms might be doing a better job than many real-life communities. Social media companies have a unique opportunity to scale privacy education and violence/suicide prevention. From the social work perspective, I am glad to see them taking steps to do so.
If you feel compelled to take a look, here are the standards I reviewed…
I continue to feel social workers should be part of the social media conversation. Please check out my social work and social media ethics course: