After a bullied teen died by suicide, a grieving mother last year sued the platform where the abuse took place – Snapchat – for not doing enough to protect its younger users. Another lawsuit, related to another suicide, followed last month. In response to the first suit, Snap banned the anonymous messaging apps that had enabled the bullying and promised to revisit its policies governing what kinds of Snapchat-connected experiences could be built with its developer tools. Today, the company released the results of that review and the changes it is making.

Effective immediately for new developers building on its Snap Kit platform, Snap will ban anonymous messaging apps and will require friend finder apps to limit their use to users 18 and older. Existing developers will have 30 days to comply with the new policies.

These changes apply only to third-party apps that integrate with Snapchat; they are not meant to address other child safety concerns on Snap’s own platform.

Snap says the policy update will affect a small subset of its community of more than 1,500 developers. Only about 2% of developers will be affected by the ban on anonymous messaging apps, while another 3% will be affected by the new requirement to age-gate their apps. The company also noted that developers who remove anonymous messaging functionality from their apps can have them re-reviewed and remain Snap Kit partners.

Sendit, an app that benefited immensely from Snap’s earlier ban on the anonymous messaging apps YOLO and LMK, is among those that will need to make changes to continue working with Snapchat. Within months of that ban, Sendit had gained millions of downloads from teens who still wanted a way to post anonymous Q&As.

The appeal of anonymous social apps is undeniable, especially for young people. But over the years, such apps have proven time and again that they cannot be used responsibly and can have devastating consequences. From the early MySpace days to the teen suicides linked to Ask.fm to the sadly well-funded anonymous apps like Secret and Yik Yak (neither of which lasted), anonymity in the hands of young people has been tested repeatedly and has consistently failed. Given that history and Snapchat’s core demographic of teens and young adults, it was arguably irresponsible to allow this kind of activity on the platform in the first place.

In addition to banning anonymous messaging apps, Snap will now restrict friend finder apps to adult users 18 and older.

Friend finder apps are designed to connect Snapchat users with strangers; they encourage people to share their personal information and are a common way for child predators to reach younger, more vulnerable users. The apps are often used for dating or sexting rather than finding friends, and can be filled with porn bots. For years, law enforcement and child safety experts have warned about child predators on Snapchat and about friend-finding apps dubbed “Tinder for Teens.”

Problems with these apps persist today. For example, a study published last month by The Times details the rampant sexual abuse and racism taking place on one of these apps, Yubo.

The anonymous messaging ban and the age restriction on friend finder apps are the only two major changes to Snap’s policies today, but the company notes that developers’ apps will still have to go through a review process in which they answer questions about their use cases and demonstrate their proposed integrations. Snap also said it will conduct periodic reviews every six months to ensure that an app’s functionality has not changed in a way that would violate its policies. Any developer who intentionally tries to deceive Snap will be removed from Snap Kit and the developer platform entirely, it added.

“As a platform partnering with a wide range of developers, we want to foster an ecosystem that helps apps protect users’ security, privacy and wellbeing, while unlocking product innovation for developers and helping them grow their business,” a Snap spokesperson said of the policy updates. “We believe we can do both and will continue to regularly review our policies, monitor app compliance and work with developers to better protect the well-being of our community.”

Snap’s platform security still needs work

While the changes will affect third-party apps that integrate with Snapchat, the company has yet to address child safety concerns on its own platform through something like an age-appropriate experience for minors, similar to TikTok’s, or through the launch of its promised parental controls, which Instagram and TikTok now offer.

However, the company, whose app is rated 13+, has limited the visibility and discoverability of minors’ profiles, provides tools and reminders that encourage users to review their friend lists, requires a mutual friendship before users under 18 can exchange messages, and links to safety resources, such as mental health hotlines.

Despite those efforts, today’s changes underscore how much remains to be done on the child safety front.

Platform safety has become a top priority for social media companies across the industry as regulatory pressure mounts. For its part, Snap was called before Congress last fall to answer lawmakers’ questions about several safety issues affecting minors and young adults using the app, including the prevalence of eating disorder content and adult-oriented fare that was inappropriate for younger teenage Snapchat users but not blocked by an age gate.

Snap was also sued in January, alongside Meta, by another family who lost their child to suicide after she succumbed to pressure to send sexually explicit photos that were later leaked to her classmates. The complaint alleges that Snapchat’s failure to verify the child’s age and its use of disappearing messages contributed to her death. The suit also describes the role anonymous messages played, though it does not directly refer to the use of anonymous third-party apps.

In the same month, Snap addressed other issues with its friend recommendation feature to make it harder for drug dealers to connect with teens through the app. The issue had been the subject of an NBC News investigation that linked Snapchat to the sale of fentanyl-laced pills that killed teens and young adults in more than a dozen states.

Before that, the company faced lawsuits over its “speed filter,” which let users take photos showing how fast they were moving. The filter was linked to numerous car accidents, injuries and even deaths over the years. It was initially disabled at driving speeds and then removed entirely in 2021. (Snap declined to comment on the matter because a lawsuit is pending.)

With lawmakers finally looking to curb the Wild West days of Big Tech, when growth and engagement were consistently prioritized over user safety, Snap has been gearing up to make changes. It hired its first-ever head of platform safety, Jacqueline Beauchere, in September.

Snap CEO Evan Spiegel also said in October that the company was developing parental control tools. These tools — which would follow the launch of parental controls on TikTok and, this week, Instagram — will allow parents to see who their teens are talking to in the app.

Snap has not said whether the tools will address other parental concerns — including ways for parents to disable a child’s ability to send or receive disappearing messages, limit or require approval of friend requests, block the child from sharing photos and other media, or hide the adult-oriented (and often clickbait-y) content that features prominently in the app’s Discover section.

“We want to provide ways for parents and teens to work together to ensure their safety and well-being online — similar to the ways parents help prepare their children in real life,” said a Snap spokesperson on parental controls. “We hope these new tools will serve as a conversation starter between parents and their teens about how to be safe online.”

The company said the first set of parental controls is on track to launch this year. The developer policy changes are now live.

If you or someone you know is struggling with depression or has had thoughts of self-harm or suicide, the National Suicide Prevention Lifeline (1-800-273-8255) offers 24/7, toll-free, confidential support for people in distress, as well as best practices for professionals and resources to assist in prevention and crisis situations.

Correction, 3/17/22 2:45 PM ET: Snapchat’s speed filter was initially disabled at car speeds, but was not completely removed from the platform until 2021.
