MILAN (Reuters) - Italy's communications watchdog AGCOM has approved new rules enabling it to order online video-sharing platforms to remove "harmful content" to protect minors and consumers in the country, it said on Thursday.
The new rules will take effect from Jan. 8, 2024, and potentially affect services such as Google's YouTube (GOOGL.O), TikTok, and Meta's Instagram (META.O).
They will target videos deemed a threat to minors because they promote racial, sexual, religious or ethnic hatred, as well as those that do not adequately safeguard consumers, AGCOM said in a statement.
Under the new regulations, the authority will also be empowered to act against video platforms based in other EU member states, after first informing the relevant national authorities.
Italy's move follows the EU's approval of its Digital Services Act (DSA), passed in October 2022, which requires Big Tech to do more to fight harmful and illegal online content, especially if it is aimed at minors.