Ofcom issues online safety guidelines for tech and social media firms
UK media regulator Ofcom has set out new online safety measures requiring tech firms and social media sites to protect their users from illegal online content, following the Online Safety Act, which became law last month.
Under the new act, Ofcom is exercising new powers to publish draft Codes of Practice that social media, gaming, pornography, search and sharing sites will be expected to follow. Ofcom’s role will be to compel firms to tackle online harm and make their services safer. However, the regulator notes it will not be involved in making decisions about individual videos, posts, messages or accounts, or in responding to individual complaints.
The Online Safety Act focuses on ‘priority offences’ set out in the legislation, such as child abuse, grooming and encouraging suicide. Ofcom cited figures showing that 16% of children aged 11-18 have either been sent naked or half-dressed photos or been asked to share such images themselves, while 30% have received an unwanted friend or follow request.
Under the measures, tech firms will be required to assess the risk of users being harmed by illegal content on their platforms and take appropriate steps to protect them from it.
Under its draft codes, published today, Ofcom said larger and higher-risk services should ensure that:
- children are not presented with lists of suggested friends;
- children do not appear in other users’ lists of suggested friends;
- children are not visible in other users’ connection lists;
- children’s connection lists are not visible to other users;
- accounts outside a child’s connection list cannot send them direct messages; and
- children’s location information is not visible to any other users.
It also proposed that services should use ‘hash matching’ – a technology that identifies illegal images of child sexual abuse by matching them to a database of illegal images – to help detect and remove child sexual abuse material (CSAM) circulating online, and use automated tools to detect URLs that have been identified as hosting CSAM.
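To illustrate the general idea behind hash matching: neither the act nor Ofcom’s draft codes prescribe a specific implementation, and production systems typically rely on perceptual hashes (such as Microsoft’s PhotoDNA) that still match after images are resized or re-encoded. The minimal sketch below uses exact cryptographic hashing purely as an illustration, with a placeholder hash database standing in for the real lists maintained by bodies such as the Internet Watch Foundation.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known illegal images. In practice,
# databases of this kind are maintained by specialist organisations and
# hold perceptual hashes; a cryptographic hash like SHA-256 only matches
# byte-identical files, so this is illustrative only.
KNOWN_HASHES: set[str] = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        # Read in chunks so large files don't have to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_illegal(path: Path) -> bool:
    """Flag an uploaded file if its hash appears in the hash database."""
    return file_hash(path) in KNOWN_HASHES
```

A platform would run a check like this (in its perceptual-hash form) against every upload before the content is made available, removing and reporting anything that matches.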
Fraud and terrorism
Ofcom has also set out measures to combat fraud and terrorism online, such as automated detection and account verification.
Ahead of the Online Safety Act coming into force, the regulator assembled a team, led by Ofcom’s group director of online safety Gill Whitehead, to carry out research and gather evidence to inform its Codes and guidance. Next year the regulator will introduce guidelines for adult sites and publish a consultation on additional protections for children, covering harmful content promoting suicide, self-harm, eating disorders and cyberbullying.
“Regulation is here, and we’re wasting no time in setting out how we expect tech firms to protect people from illegal harm online, while upholding freedom of expression. Children have told us about the dangers they face, and we’re determined to create a safer life online for young people in particular,” said Melanie Dawes, Ofcom’s chief executive.