Ofcom claims success with Twitch measures to protect minors
UK regulator Ofcom says live-streaming platform Twitch has made changes to protect minors as a result of its intervention.
According to Ofcom, Twitch has made changes to the content available on its homepage and introduced a number of protection measures to stop children accessing harmful content. These include:

- preventing all viewers in the UK who are either logged out (and whose age is therefore unknown), or logged in and declared as under 18, from accessing content labelled with sexual themes or gambling;
- removing content tagged with sexual themes, gambling, drugs, intoxication or excessive tobacco use, and violent and graphic depictions from the Twitch homepage; and
- adding a new section to Twitch’s Guide for Parents and Educators about content classification labelling, which aims to help adults make informed decisions about the types of content children can watch on the platform.
Ofcom said it had earlier raised concerns with Twitch about the measures it had in place to protect under-18s from harmful video content.
The exchange comes ahead of the implementation of the Online Safety Act, which will impose additional duties on video-sharing platforms over and above existing requirements to take appropriate measures to prevent under-18s from accessing pornography and other harmful material.
Ofcom recently set out new rules for technology firms that it said were designed to protect children. The measures include more robust age checks that go beyond self-declaration, including the possible use of photo ID matching and facial age estimation. Ofcom has also called for platforms to design their algorithms to filter the most harmful content out of children’s feeds and to downrank other harmful content, and to put in place content moderation systems and processes that take quick action on harmful content. In addition, large search services would be required to use a ‘safe search’ setting for children, which cannot be turned off and must filter out the most harmful content.
However, the watchdog has come under criticism from parents who say that its proposals do not go far enough, arguing that more should be done to prevent platforms from publishing harmful content in the first place, rather than mitigating its impact once it has been posted.