Tech firms must ‘tame aggressive algorithms’ under Ofcom online safety rules
by Martyn Landi
May 07, 2024
Social media platforms must take action to stop their algorithms recommending harmful content to children, and put robust age-checking measures in place to protect them, Ofcom has said.
The regulator has published its draft children’s safety codes of practice, which set out how it expects online services to meet their new legal responsibilities to protect children online under the Online Safety Act.
The online safety laws require sites which can be accessed by children to protect those younger users by assessing the risks their platform poses to them and then putting in place measures to mitigate those risks.