Singapore – Snapchat has announced the launch of its ‘strike system’, a safety measure aimed at protecting its 13 to 17-year-old users while they browse the platform. With this addition, Snapchat aims to become a more secure messaging platform for minors.
The company is releasing a number of new features today with the clear goal of better protecting 13 to 17-year-olds from potential online risks. The features are designed to accomplish three main objectives: protect teenagers from unwanted contact by people they don’t know in real life, adapt the content viewing experience to the user’s age range, and improve the identification and removal of accounts promoting age-inappropriate content.
A strike system and detection technologies will be used to achieve this. The features are set to go live in the coming weeks, reflecting the company’s commitment to the safety and wellbeing of its young user demographic.
The new strike system is intended to combat accounts that promote content inappropriate for children. Under this system, any content flagged as inappropriate, whether detected proactively or reported, is immediately removed. Accounts that repeatedly violate these policies may be banned.
“Our commitment to making our platform safer is always on, and we will continue to build on these improvements in the coming months with additional protections for teen Snapchatters and support for parents,” Snap said in a press statement.