New EU Content Rules – Digital Services Act

With new EU content regulations coming into force, the world's tech giants face greater responsibility to clamp down on illegal and harmful content, along with a strengthened duty to give users more protection and choice.

Tech companies that ignore these responsibilities will face serious consequences, including possible temporary bans of their platforms and substantial fines of as much as 6% of their global turnover. The rules will require these companies to conduct risk management, undergo independent external audits, share data with authorities and researchers, and adopt a code of conduct, by as early as 25 August 2023.

The necessity for these rules stems from concerns about the trade and exchange of illegal goods, services and content online. Most notably, with the continuing growth and use of artificial intelligence, data acquired by online platforms is often misused by manipulative algorithmic systems to amplify the spread of misinformation and for other harmful purposes.[1]

The key goals of the new Digital Services Act are to better protect fundamental rights and to reduce citizens' exposure to illegal content. For providers of digital services, it will bring legal certainty and harmonised rules, making it easier to start up and scale up in Europe. For business users of digital services, it will provide a level playing field against providers of illegal content. More broadly, the new Regulation aims to establish greater democratic control and oversight over systemic platforms, and to mitigate systemic risks such as manipulation and disinformation.[2]

The Regulation will apply to three types of 'intermediary service': (1) a 'mere conduit' service, (2) a 'caching' service and (3) a 'hosting' service, each of which is more precisely defined in Article 3(g) of the Act.

All online intermediaries that offer services in the single market will have to comply with the new rules, even if they are established outside the EU. However, whilst all in-scope services will be subject to the Regulation, platforms with more than forty-five million users (approximately 10% of the population of Europe), referred to as 'very large online platforms', will have to abide by stricter rules. To avoid overwhelming small companies with these new responsibilities, their obligations will be proportionate to their size and capacity, whilst ensuring they remain accountable.

Whilst these rules will benefit the users of these platforms, the burden of hiring additional staff to meet the Regulation's requirements may prove difficult for companies that have recently undergone mass layoffs.

What happens now?

The Regulation entered into force on 16 November 2022, and online platforms had to publish their user numbers by 17 February 2023. Where a platform has more than forty-five million users, the Commission will designate the service as a 'very large online platform' or 'very large online search engine'. These services will then have four months to comply with the obligations of the new Regulation, including carrying out and providing the Commission with their first annual risk assessment. By 17 February 2024, platforms with fewer than forty-five million users will also have to comply with all the rules set out in the Act.[3]

For further information in relation to this matter, please contact Gavin Simons (Partner), or your usual AMOSS contact.


[1] ‘The Digital Services Act Package’ (European Commission, 2023) https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package accessed 26 April 2023.

[2] ‘The Digital Services Act: ensuring a safe and accountable online environment’ (European Commission, 2023) https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act-ensuring-safe-and-accountable-online-environment_en accessed 26 April 2023.

[3] supra n 1.