Safeguarding Children in their use of Technology
This policy has been developed to highlight the challenges arising from children's increased use of technology and from those who pose a risk to a child's online safety.
Children have access to technology, and greater access to the internet, from a young age. Research published by Ofcom in 2023, Children and Parents: Media Use and Attitudes 2023, found that:
- Almost all children aged 3-17 went online in 2022 (97%);
- Children were accessing the internet on various devices such as phones, tablets, gaming consoles or smart speakers;
- Almost eight in ten children aged 3-17 (79%) used apps or sites for messaging or voice/ video calls. However, similarly to other media activities, this varied by age, from 48% of children aged 3-4 to 98% of 12-17-year-olds.
For more information about this report, please visit Children and Parents: Media Use and Attitudes 2023.
Internet Abuse relates to four main areas of abuse of children:
- Sharing and production of abusive images of children (although these are not confined to the internet);
- A child or young person being groomed online for the purpose of Sexual Abuse or Exploitation;
- Exposure to pornographic images and other offensive material via the internet; and
- The use of the internet, and in particular social media sites, to engage children in extremist ideologies or to promote gang related violence.
The term 'Child Sexual Abuse Material' (CSAM) will be used throughout this document. This refers to any sexually explicit content that involves a child, including pictures, videos and computer-generated images.
Social networking sites are often used by perpetrators as an easy way to access children and young people for sexual abuse. In addition, radical and extremist groups may use social networking to attract children and young people into rigid and narrow ideologies that are intolerant of diversity: this is similar to the grooming process and exploits the same vulnerabilities.
The NSPCC (2024) shared that 19% of children aged 10-15 had exchanged messages on social media with people they had never met offline. The data shows that boys are slightly more likely than girls to do this, with boys reporting that it is linked to involvement in gaming. Discussions with these children show that they do not always see these people as strangers because they have a shared interest.
Internet abuse may also include online bullying. This is when a child is tormented, threatened, harassed, humiliated, embarrassed or otherwise targeted by another child using the internet and/or mobile devices. Whether on social media sites, through a mobile phone, or gaming sites, the effects can be devastating for the young person involved. There are ways to help prevent a child from being bullied online and to help them cope and stop the bullying if it does happen.
It is another form of bullying which can happen at all times of the day, with a potentially bigger audience. By its very nature, online bullying tends to involve several online bystanders and can quickly spiral out of control. Children and young people who bully others online do not need to be physically stronger, and their methods can often be hidden and subtle. Online bullying can sometimes escalate further and include direct bullying and/or physical acts of harm perpetrated offline. Where bullying occurs both online and offline, this can further increase the emotional and sometimes physical impact on the victim. Examples of this could include, but are not limited to, organising "fights", the use of intimidation and the continuation of name calling and threats on a face-to-face basis. The emotional impact of this may be significant and lead to further feelings of isolation and fear. Adults may also engage in online bullying in order to cause harm.
The Malicious Communications Act 1988 focuses on messages that are sent to cause harm or distress. Updates have been made to ensure this Act covers messages sent via technology.
For more information, please see the Preventing and Responding to Bullying and Stalking Toolkit.
In 2023, the Internet Watch Foundation (IWF) confirmed that 275,652 webpages contained child sexual abuse imagery; 287 of these webpages were hosted in the United Kingdom. Each webpage could contain multiple child sexual abuse images or videos.
Of the 275,652 webpages, 92% (254,071) were assessed as containing 'self-generated' imagery. The IWF (2024) highlights 'self-generated' child sexual abuse as an inadequate and potentially misleading term which does not encompass the full range of factors often present within this imagery, and which appears to place the blame on the victim themselves. In some cases, children are groomed, deceived or extorted into producing and sharing a sexual image or video of themselves by someone who is not physically present in the room with them. In some instances, children may also be unaware that they are being recorded and that images or videos of them are being shared by abusers.
Self-Generated Child Sexual Abuse Material is a term used when a child shares sexual, naked or semi-naked images or videos of themselves or sends sexually explicit messages. These can be sent using any device that allows media and messages to be shared.
A child is breaking the law if they:
- Take an explicit photo or video of themselves or a friend;
- Share an explicit image or video of a child, even if it's shared between children of the same age;
- Possess, download or store an explicit image or video of a child, even if the child gave their permission for it to be created.
However, if a young person is found creating or sharing images, the police can choose to record that a crime has been committed but that taking formal action is not in the public interest.
Financially motivated sexual extortion is a type of blackmail where someone tries to use intimate, naked or sexual photos or videos of children to make them do things they do not want to do. These photos or videos are sometimes taken without their knowledge. Whilst all age groups and genders experience this, a large proportion of cases involve male victims aged between 14 and 18 (NCA issues urgent warning about 'sextortion' - National Crime Agency).
With effect from 29 June 2021, section 69 of the Domestic Abuse Act 2021 extended the offence of disclosing private sexual photographs and films with intent to cause distress to include threats to disclose such material. However, 'revenge porn' refers to the sharing of material involving adults; the sharing of material involving children is recognised as Child Sexual Abuse Material.
Where there is suspected or actual evidence of anyone accessing or creating indecent images of children, this must be shared with the Police and Children's Social Care in line with the Safeguarding Referrals Procedure.
If a child is aware of sexual images of themselves being shared online, Report Remove is an online tool that can help to report the images and get them removed. Report Remove cannot remove images from social media and messaging apps that use end-to-end encryption, such as WhatsApp and Snapchat. However, if still reported, the image can be tagged so that it cannot be shared on online forums, internet pages and so on.
Online harm may also include accessing and sharing sexual images of adults, including abusive materials and pornography. For more guidance on this topic, please visit: Keeping children safe in education 2024.
Children may see extremist material online. Extremism is defined as "the promotion or advancement of an ideology based on violence, hatred or intolerance, that aims to:
1. negate or destroy the fundamental rights and freedoms of others; or
2. undermine, overturn or replace the UK's system of liberal parliamentary democracy and democratic rights; or
3. intentionally create a permissive environment for others to achieve the results in (1) or (2)".
Radicalisation is defined as the process by which people come to support terrorism and violent extremism and, in some cases, to then participate in terrorist groups. There is no obvious profile of a person likely to become involved in extremism or a single indicator of when a person might move to adopt violence in support of extremist ideas. The process of radicalisation is different for every individual and can take place over an extended period or within a very short time frame.
The process of radicalisation through grooming often occurs online. Harmful content and misinformation are increasingly being shared via social media and video sharing sites, such as YouTube and TikTok. This increase can normalise extremist views, making children and young people believe that they are commonplace. These types of sites use algorithms to keep the user engaged and will suggest content based on user profiles. Whilst this approach is used to enhance the user experience, it can inadvertently increase exposure to harmful content, which can further normalise these viewpoints for children and young people, potentially leading to radicalisation.
Locally, the following organisations are able to provide additional advice and guidance in relation to safeguarding individuals vulnerable to radicalisation and children who may be at risk through living with or being in direct contact with known extremists:
- Contact the Prevent team (Police) CTP-EM-Prevent@lincs.pnn.police.uk;
- Refer direct to CHANNEL (non-criminal space) channel@lincs.police.uk;
- General Enquiries, contact the Prevent Officer (LCC) prevent@lincolnshire.gov.uk.
For more information, please visit Supporting Children and Young People Vulnerable to Violent Extremism.
Where there are concerns about a child being groomed, exposed to pornographic material or contacted by someone inappropriately, via the internet or other ICT tools like a mobile phone, referrals should be made to the Police and to Children's Social Care in line with the Safeguarding Referrals Procedure.
The Serious Crime Act 2015 introduced an offence of 'sexual communication with a child'. This applies to an adult who communicates with a child where the communication is sexual, or is intended to elicit a sexual communication from the child, and the adult reasonably believes the child to be under 16 years of age. The Act also amended the Sexual Offences Act 2003 so that it is now an offence for an adult to arrange to meet someone under 16 having communicated with them on just one occasion; previously this required communication on at least two occasions.
All such reports should be taken seriously. Referrals will normally lead to a Strategy Discussion to determine the course of further investigation, enquiry and assessment. Any intervention should be continually under review especially if further evidence comes to light.
Due to the nature of this type of abuse and the possibility of the destruction of evidence, the referrer should discuss their concerns with the Police and Children's Social Care before raising the matter with the family. This will enable a joint decision to be made about informing the family and ensuring that the child's welfare is safeguarded.
AI-Generated Child Sexual Abuse Material describes images of Child Sexual Abuse that are partially or entirely computer-generated. They are usually produced using software which converts a text description into an image. This technology is developing rapidly, the images created can now be very realistic, and recent examples are difficult to differentiate from unaltered photographs.
Many popular, publicly available artificial intelligence tools automatically block attempts to create abusive material, but the large number of child sexual abuse images made using them that have been detected shows that individuals have found ways around this. Typically, such images (and, increasingly, videos) are made using publicly available artificial intelligence tools that can be manipulated to produce content depicting child sexual abuse.
In Lincolnshire, there have been reported cases of children accessing images of their school peers on social media and using AI 'nudification' apps to create sexually explicit deepfake images before circulating them: Children Nudification Tools and Sexually Explicit Deepfakes. Legally, this is managed via the same route as creating and sharing other Child Sexual Abuse Material.
See Artificially generated child sexual abuse images: Understanding and responding to concerns/CSA Centre.
See also How AI is being abused to create child sexual abuse material (CSAM) online (iwf.org.uk).
The act of a perpetrator grooming a victim for exploitative reasons may take place online. This might include online platforms such as gaming sites or social media.
In the event of an agency or individual professional having concerns that exploitation is taking place, the level, nature and extent of these concerns should be established. This should be done using the Child exploitation tool - LSCP and accompanying guidance. The tool helps to gather the information required to make a decision about risk and vulnerability. The screening tool can be completed in relation to a child being exploited or a person or location of concern.
For more information about how to support children when they are being exploited online, please see Child Exploitation Policies and Procedures in Local Resources.
When communicating via the internet, children tend to become less wary and talk about things far more openly than they might when communicating face to face.
Both male and female adults and some young people may use the internet to harm children. Some do this by looking at, taking and/or distributing photographs and video images on the internet of children naked, in sexual poses and/or being sexually abused.
Children should be supported to understand that when they use digital technology they should not give out personal information, particularly their name, address, school address or mobile phone number, to anyone they do not know or trust. If they have been asked for such information, they should always check with their parent or another trusted adult before providing such details. It is also important that they understand why they must take a parent or trusted adult with them if they meet someone face to face whom they have only previously met online.
Parents should be supported in setting parental controls on devices in the home using guidance found on www.internetmatters.org.
Children should be warned about the risks of taking sexually explicit pictures of themselves and sharing them on the internet or by text. The Think Before You Share campaign from the IWF aims to help children understand the harm of sharing self-generated material. It is essential, therefore, that young people understand the legal implications and the risks they are taking. The initial risk posed by Self-Generated Child Sexual Abuse Material may come from peers, friends and others in their social network who may share the images. For guidance on how to discuss this topic, please visit: Talking to your child about the risks of sharing nudes | NSPCC.
Where young people are voluntarily sending/sharing sexual images or content with one another the Police are likely to use the 'outcome 21' recording code to record that a crime has been committed but that it is not considered to be in the public interest to take criminal action against those involved. This reduces stigma and distress for children and can help to minimise the long-term impact of the situation.
The LSCP host an E-Safety eLearning course. Year five of the six-year training pathway also offers an Online Safety course focusing on how to manage risk online; this course can be accessed at any time by searching within Enable if you would like to complete it earlier than Year 5. Online safety is also covered in the Child Exploitation training package, Inter-Agency Module 4 and within the Radicalisation/Extremism eLearning module. For more information about these courses, please visit LSCP Training - LSCP.
The Stay Safe Partnership offers a range of resources and workshops for all education provisions in Lincolnshire. Further to this, the Stay Safe Partnership holds sessions for parents to build awareness of current risk trends and how to manage the associated risks. For more information, please visit Stay Safe Partnership - Lincolnshire County Council.
The LSCP and the Stay Safe Partnership offer the Junior Online Safety Officer (JOSO) scheme, aimed at Year 5 pupils, to provide them with the knowledge, skills and confidence to help spread e-safety messages throughout their school with pupils, teachers and parents. Participating schools and pupils take part in virtual sessions which can support their understanding of this subject and help them share it more widely. Places can be booked via Enable within the 'Events' area. For full details on using Enable and locating training, please visit LSCP Training - LSCP.
If professionals are concerned about parental supervision of online use, the LSCP's Neglect resources cover this area and can be found in the Local Resources.
Last Updated: October 14, 2025
v124