Awareness

Communicating via Social Media Apps


Whose interest is in mind when communicating via Social Media Apps?

A recent report by the Children's Commissioner found that one in five internet users in the UK is a child. The internet is deeply embedded in the fabric of children's lives, but most online platforms are not designed with their interests or welfare in mind. Children should be able to use online platforms to play, learn and connect without experiencing harm.

Parents and children want effective and privacy-preserving age assurance on social media platforms. 90% of parents and 70% of children think that platforms should have to check their ages before they sign up for social media and messaging accounts.

We have heard of many organisations turning to popular apps as a way to maintain contact with vulnerable children. While this was understandable at the start of the pandemic, the use of digital tools such as WhatsApp and Facebook now needs urgent review. This is not news to us: as regular readers will know, we have posted on this topic before.

What’s wrong with Social Media Apps?

Most social media services host their data on servers outside the UK, while UK data protection law restricts the transfer of children's personal data overseas unless appropriate safeguards are in place. Some, such as WhatsApp, require the young person to be 16 or over to sign up for an account. For others, there are clear accepted age recommendations (see more information here). There is a need to ensure clear boundaries around use and content to minimise the risk of sensitive data being shared inappropriately. The NSPCC said:

‘Data protection is just one part of the picture: Meta’s products (including WhatsApp and Facebook) were used in more than half of grooming cases recorded by police over the last nine months.’

Safe and secure channels

We have to find ways of ensuring that children have safe and secure channels to communicate with the practitioners who are there to support them. Communication between workers and young people is important and should be properly recorded and stored.

Information security is at the heart of Mind Of My Own. 

Mind Of My Own complies with the UK General Data Protection Regulation (UK GDPR), the Data Protection Act 2018 and the Children’s Code. The purpose of this legislation is to protect the rights of individuals whose data (information) is obtained, stored, processed and disclosed. Because Mind Of My Own is required by law to comply, we must:

  • Register with the Information Commissioner’s Office (ICO)
  • Apply the six data protection principles
  • Educate and train our staff in the correct use of data

Mind Of My Own is registered with the ICO under registration number ZA217007.

Data is handled in a way that ensures appropriate security, including protection against unlawful or unauthorised processing, access, loss, destruction or damage. All information handled by Mind Of My Own is processed securely, following the procedures set out in the Mind Of My Own information security management system (ISMS). The ISMS is audited and reviewed internally every month and independently inspected every year as part of our certification to the international standard ISO 27001.