
How does PopJam keep kids safe?

One of the challenges we face daily at PopJam is balancing online creative play, gaming, learning and self-expression against behaviour issues. Playing together can escalate from happy feelings to hurt feelings in minutes.

For a quick glimpse of moderation, community and trust at PopJam, imagine managing millions of young people in a playground. That is what our stellar community team is focused on 24/7.

With the high volume of young people online, it’s no longer adequate to have a “reporting system” that relies only upon users. Studies show that young people are reluctant to report serious behaviour issues, online and offline. Young people fear being exposed to their peers and family, and they worry about being cut off from their social lifeline – the internet.

We therefore proactively address this reluctance as follows:

  • Advanced image and text filtering within behaviour-monitoring software
  • Experienced, vetted, professional community team members
  • Straightforward, realistic public policy
  • Efficient escalation procedures, established relationships and protocols with proper agencies
  • Regularly sharing best practices with industry professionals and organisations

Advanced Moderation, Trust and Safety

As youth site operators, we have a responsibility to use the most advanced image and text filtering and behaviour-monitoring software available to us. We license software and tools from CommunitySIFT™, teaming our experienced, professional moderators with the best moderation and social networking software commercially available. We’ll describe the system in more detail below.

SIFT™ software is an AI-based (artificial intelligence) system that monitors PopJam content in real time. We employ a ‘user trust’ system that scores each user based on their posts. If a user posts unacceptable content, they lose points and the system sends all of their content to a pre-moderated queue. Our team reviews the content, and the user’s trust score rises or falls depending on the moderator’s decision. Users earn trust (the right to post without pre-moderation) by contributing positive content that our moderators flag as ‘okay’. This is an effective system that teaches our users good digital citizenship.
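For readers curious how a points-based trust system like this can work, here is a minimal sketch. The class names, point values and threshold below are purely illustrative assumptions for explanation, not PopJam’s or SIFT’s actual implementation:

```python
# Illustrative sketch of a 'user trust' score driving pre-moderation.
# All point values and thresholds are hypothetical.

class TrustTracker:
    TRUSTED_THRESHOLD = 50  # hypothetical score at which posts go live directly

    def __init__(self, starting_score=50):
        self.score = starting_score

    def is_trusted(self):
        # Trusted users may post without pre-moderation.
        return self.score >= self.TRUSTED_THRESHOLD

    def record_review(self, approved):
        # A moderator review raises or lowers the trust score.
        if approved:
            self.score += 5   # positive content slowly rebuilds trust
        else:
            self.score -= 20  # unacceptable content costs more than good content earns


def route_post(user):
    """Decide whether a user's new post publishes immediately
    or waits in a pre-moderated queue for human review."""
    return "publish" if user.is_trusted() else "pre-moderation queue"
```

In a sketch like this, one rejected post drops the user below the threshold, so everything they post afterwards is held for review until enough approved contributions restore their score, mirroring the earn-back behaviour described above.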

PopJam also provides a user reporting tool that flags potentially problematic content to our software and moderation team within seconds.

Our Head of Digital Trust and Community has over 28 years’ experience in online public policy, behaviour, and moderation, so we have complete confidence in our standards for moderation policies. We are always happy to speak with parents on the phone or via email (as long as we are confident we are exchanging emails with an actual parent).

In the United Kingdom

Our escalation procedures work within a protocol that may require reporting to, or working with, agencies such as the NSPCC (National Society for the Prevention of Cruelty to Children) and law enforcement agencies.

We have a well-established relationship with NSPCC dating back to 2007. As of 2014, Rebecca Newton, our Head of Digital Trust and Community, established a reporting protocol with the NSPCC to report potential harm to children, such as self-harm and suicide threats.

We also have well-established relationships with law enforcement throughout the country. Ms. Newton served on the UKCCIS working group for social media and was appointed to the APPG (All-Party Parliamentary Group) for Youth and Technology in 2014. She was also appointed to the NSPCC Digital Task Force in 2016 and is a Board of Trustee member of

In the United States of America

NCMEC’s (The National Center for Missing and Exploited Children) CyberTipline is America’s central agency for reporting any potential online child exploitation issues. NCMEC works with law enforcement throughout the world and attends to reports far more rapidly and efficiently than website owners could by individually contacting each city’s law enforcement agency. We are avid supporters of NCMEC’s processes and are privileged to work with such a fine organisation, should the need arise.

Positive Online Experiences

We take our community responsibilities seriously. Our statistics collected over 7+ years show that 94% of the content and exchanges within PopJam are creative and positive. Less than 6% requires action such as non-punitive deletion (e.g., personally identifying information, selfies in pyjamas, a repost of an age-inappropriate internet meme). Just as in the offline world, the majority of our PopJammers are great digital citizens online. We prefer to focus on good digital citizenship and the benefits of the online world: the 94% should not be penalised for the 6% who misbehave. We are constantly adapting, learning, and refining our moderation processes to ensure a positive, healthy, and safe online experience for kids of all ages. The online world is a great place for kids to learn, laugh, collaborate and express themselves. We’re honoured to provide the tools and space for them to do just that.