Author: Boyana Boyadzhieva
The Digital Services Act (DSA), which aims to make the online environment safer, fairer and more transparent, has applied in full since February this year. The new legislation creates new responsibilities for all online platforms in the EU and extends consumer rights.
Accordingly, all online platforms with users in the EU, with the exception of small and micro enterprises with fewer than 50 employees and an annual turnover below €10 million, must take measures with regard to:
- Illegal content, goods, and services: Online platforms are obliged to provide their users with means to flag illegal content, goods, and/or services. In addition, online platforms will have to work with ‘trusted flaggers’ – specialized entities, whose notices must be dealt with as a priority.
- Protection of minors: It is prohibited to target minors based on profiling or their personal data.
- Information about ads on the platform: Online platforms are obliged to provide users with detailed information about the ads they see, such as why they are being shown the ad and who paid for it.
- Prohibition of ads that target users based on sensitive data: Users’ sensitive information, such as political or religious beliefs, sexual preferences, etc., is prohibited from being used to target ads.
- Transparency: Online platforms are required to provide a statement of reasons to any user affected by a content moderation decision, such as content removal or account suspension.
- Complaints mechanism: Online platforms are obliged to provide users with access to a complaints mechanism to appeal against content moderation decisions of the platform.
- Report on moderation procedures: Online platforms are obliged to publish a report on their content moderation procedures at least once a year.
- Clear terms and conditions: Online platforms are obliged to provide their users with clear terms and conditions, including the basic parameters under which their content recommender systems work.
- Contact point: Online platforms are obliged to designate a contact point for authorities as well as for users.
Since the end of August 2023, the DSA has already applied to the 19 designated very large online platforms (VLOPs) and very large online search engines (VLOSEs), i.e. those with an average of over 45 million monthly users in the EU.
For all matters relating to the implementation and enforcement of the DSA, each Member State must designate an independent regulatory authority to act as the national coordinator for digital services.
The national Digital Services Coordinator will be responsible for the oversight and enforcement of the DSA on all platforms that are not designated as very large online platforms (VLOPs) or very large online search engines (VLOSEs). It will be the first point of contact for consumer complaints and will award the status of trusted flagger. It will also be responsible for investigating and taking action in cases of breach of the DSA. In Bulgaria, these functions will be performed by the Communications Regulation Commission.
An independent advisory group, the "European Board for Digital Services", is also being established. The Board will be consulted on the enforcement of the DSA, will advise on emerging issues, contribute to guidance and analysis, and assist in the oversight of very large online platforms and very large online search engines. The Board will also issue annual reports on systemic risks and on best practices for mitigating them.
In March 2024, the Commission intends to adopt Guidelines on risk mitigation measures for electoral processes. A public consultation on the Data Access Delegated Act is expected in April with adoption by July and entry into force in October 2024.