Discord is testing parental controls that let parents monitor teens' friends and servers

New usernames aren’t the only change coming to popular chat app Discord, which is now used by 150 million people every month. The company is also testing a suite of parental controls that will allow greater supervision of Discord’s youngest users, AapkaDost has learned and Discord confirmed. In a live test running on Discord’s iOS app in the US, the company introduced a new “Family Center” feature, where parents can configure tools that let them see the names and avatars of their teen’s recently added friends, the servers the teen has joined or participated in, and the names and avatars of users the teen has directly messaged or interacted with in group chats.

However, Discord clarifies on an informational screen that, to respect teens’ privacy, parents cannot view the content of their messages or calls.

This approach, which walks a fine line between the need for parental oversight and a minor’s right to privacy, is similar to how Snapchat implemented parental controls in its own app last year. Like Discord’s system, Snapchat’s only gives parents insight into who their teen is talking to and befriending, not what they’ve typed or the media they’ve shared.

Users included in the Discord test will see the new Family Center hub linked in the app’s User Settings section, below the Privacy & Safety and Profiles sections. From there, parents can read an overview of the Family Center’s features and tap a button to get started when they’re ready to set things up.

Image Credits: Discord screenshot via Watchful.ai

Discord explains on this screen that it “built Family Center to bring you more content about how your teen is using Discord so you can work together to develop positive online behavior.” It then describes the parental controls that let parents see who their teen is chatting and making friends with, and which servers they’re joining and participating in.

Similar to TikTok’s system, parents can scan a QR code provided by the teen to place the teen’s account under their supervision.

Image Credits: Discord screenshot via Watchful.ai

The screenshots were discovered by app intelligence firm Watchful.ai. In addition, a handful of users who came across the feature earlier this year posted their own screenshots on Twitter.

We reached out to Discord for comment and shared some screenshots of the test. The company confirmed the development but offered no firm commitment on when, or if, the parental controls would actually roll out.

“We are always working to improve our platform and keep users safe, especially our younger users,” said a Discord spokesperson. “We’ll let you know if and when anything comes of this work,” they added.

Among other things, the company declined to answer our questions about the scope of the tests or its plans to offer the tools outside the US.

Image Credits: Discord screenshot via Watchful.ai

While Discord, thanks to its roots as a home for gamers, is regularly used by a younger, Gen Z audience these days, it’s often left out of the larger conversation about the harm social media use causes teens. As executives from Facebook, Twitter, Instagram, Snap, YouTube and TikTok have been made to testify before Congress on the issue, Discord has been able to sit on the sidelines.

Hoping to get ahead of expected regulation, most major social media companies have since rolled out parental control features for their apps, if they didn’t already offer such tools. YouTube and Instagram announced plans for parental controls in 2021; Instagram eventually launched them in 2022, with other Meta apps following. Snapchat also rolled out parental controls in 2022. And TikTok, which already had parental controls before the congressional scrutiny began, has been beefing them up in recent months.

But the lack of regulation at the federal level has prompted several U.S. states to pass their own laws around social media use, including new restrictions on social media apps in states like Utah and Montana, as well as broader laws to protect minors, such as California’s Age-Appropriate Design Code Act, which comes into force next year.

Discord has so far flown under the radar despite warnings from child safety experts, law enforcement and the media about the dangers the app poses to minors, amid reports that groomers and sexual predators have used the service to target children. The nonprofit National Center on Sexual Exploitation even added Discord to its “Dirty Dozen List” for what it describes as a failure to “adequately address the sexually exploitative content and activities on its platform.”

The organization specifically cites Discord’s lack of meaningful age verification technology, its inadequate moderation, and its insufficient safety settings.

Today, Discord offers an online safety center that guides users and parents on how to manage a Discord account or server safely, but it doesn’t go so far as to give parents tools to control their child’s use of the service, block them from joining servers, or stop them from communicating with unknown persons. The new parental controls won’t address the latter two concerns, but they’re at least an acknowledgment that some sort of parental oversight is needed.

This is a shift from Discord’s previous stance on the issue: the company told The Wall Street Journal in early 2021 that its philosophy was to put users first, not their parents, and said it had no plans to add such features.
