Discord has expanded its Family Centre with new tools designed to help parents and guardians oversee teen activity while maintaining user privacy. The latest update introduces detailed activity summaries and greater control over who can contact teens on the platform. These changes aim to improve safety and transparency without compromising private conversations.
A key addition is the new “Social Permissions” feature, which lets guardians choose whether teens can receive direct messages only from friends or also from users who share the same servers. This gives guardians more control over who can contact teens, but message content itself remains private. Teens can also choose to notify a guardian when they report another user, though the details of the report stay confidential to preserve autonomy and trust.
Detailed Activity Summaries and Weekly Reports
The Family Centre’s activity summary now includes new metrics such as total purchases, total call minutes, and the top users and servers from the past week. Both teens and guardians can view these insights, giving each a broader picture of recent activity. Guardians also receive weekly email updates summarizing their teen’s engagement, supporting ongoing awareness without constant check-ins.
The updated Family Centre strikes a balance between transparency and independence. Guardians can see activity patterns and server interactions, but cannot read private messages, manage friends, or directly moderate servers. This ensures oversight without crossing privacy boundaries.
Voluntary Participation and Age Verification
Participation in the Family Centre remains optional, giving teens control over whether to link their accounts. To connect, a teen shares a QR code with a guardian, who scans it to complete the pairing. Once linked, monitoring is limited to general activity data, keeping a clear boundary between safety oversight and private communication.
Additionally, Discord has tightened verification in line with evolving safety regulations. Under the UK’s Online Safety Act, only adults who verify they are 18 or older can manage a teen’s message request settings. By integrating these measures, Discord reinforces its commitment to a safer and more accountable digital environment for younger users.