Digital Services Act: an update
- 8 August 2024
On Feb. 17, 2024, the Digital Services Act went into full effect: the new European law to protect users from (the power of) very large online platforms. How do things stand now, six months after the law was introduced? Does the law do what it promises? Is there still room for improvement?
The Digital Services Act
The Digital Services Act, or DSA for short, is a European law containing rules for online platforms. The larger the online platform, the stricter the rules. So-called "Very Large Online Platforms and Search Engines," of which there are around 20, must adhere to the strictest rules. These include Meta's Instagram and Facebook, but also the Dutch company Booking. The law attempts to somewhat curb the arbitrary policies of dominant platforms, and thereby better protect the users of those platforms.
For example, users are entitled to safeguards if their content is removed (such a decision must be motivated and the user may object), and advertisements may no longer be based on sensitive data such as religion or political affiliation. On our website www.yourplatformrights.eu we explain the problems that preceded the DSA and the solutions the DSA proposes. The website also provides a clear step-by-step guide for users to enforce their new rights in practice (e.g., how to switch to a non-profiling algorithm).
The European Commission
The European Commission has an important role in the DSA, as it oversees the 20+ very large online platforms. The Commission takes this task seriously and sends out a press release about its DSA work almost every week. We highlight a few of its activities for you.
- The Commission has designated more and more platforms as "Very Large Online Platforms," requiring them to adhere to the strictest rules. Nineteen were designated initially, but more have since been added, such as Shein, Temu and XNXX.
- The European Commission launched a number of proceedings and investigations into large platforms. It often takes the initiative, which we can only encourage. The Commission ordered Twitter/X to provide more information about its content moderation team. Not a bad idea, since X's transparency report revealed that the billion-dollar company has only one person doing content moderation in the Netherlands. It also launched an investigation into Meta's Instagram and Facebook, concerning deceptive ads and political content. We are worried about the addictive nature of large online platforms, and we are not the only ones: the European Commission initiated proceedings under the DSA against Meta, expressing concerns about minors on its apps, among other things because of the addictive nature of its algorithms.
- The Commission also organized a number of civil society round tables; Bits of Freedom was invited and attended one of them. It is good that the role of civil rights organizations has been taken seriously in the implementation of the DSA.
- The European Commission ruled that Meta's "pay or consent" model does not comply with the DSA's sister law, the Digital Markets Act (DMA). We are firm opponents of this revenue model and are therefore delighted that the DMA puts a stop to it.
We can conclude that the European Commission has certainly not been idle. This is good news, because it means that it is working hard to enforce the law and is taking its job as regulator seriously. We are also curious to see what concrete changes will result from the Commission's proceedings against the major platforms. The announcement of a proceeding or investigation is of course exciting and promising, but we will only really be satisfied if such a proceeding leads to changes in the toxic mechanisms of large platforms.
Situation in the Netherlands
In the Netherlands, the Authority for Consumers and Markets (ACM) supervises the DSA. To be fair, the ACM has not yet been officially designated. Therein lies the problem right away: as a result, the ACM does not yet have all its powers and is therefore lagging behind on the DSA. For example, the Authority is supposed to appoint "trusted flaggers" (entities whose reports of illegal content are given priority), but is not yet authorized to do so. However, the ACM has already published a guideline for providers of digital services (platforms). The ACM also tries to investigate and exert pressure without formal powers: for example, the Authority found that many companies do not yet comply with the DSA and/or the Platform-to-Business Regulation. Another positive: users of online platforms can complain to the ACM about online platforms.

We hope that the ACM will soon be allowed to really start working as the Dutch Digital Services Coordinator. Then a number of certification tasks can also be carried out by the regulator, for example; we are curious to see how these turn out. We also urge the ACM to do something about public awareness of the DSA. Too many platform users are not aware of their (new) rights, so it is about time someone told them.
The Dutch courts also play a role in enforcing the DSA: after all, it is a law, and platforms must abide by it. If they don't, courts can issue a ruling. And that has already happened once in the Netherlands, in a case between Twitter/X and a user of that platform. A user had noticed that his account was suddenly unfindable, without any notification from X. The judge in that case ruled, based on the DSA, that X may not simply "shadow ban" users in this way. The DSA gives users the right to be notified of a restriction on their account, with a statement of reasons, and requires that an effective point of contact be available to the user. Neither was provided, so X violated the DSA. As far as we are concerned, a good ruling: shadowbanning is unacceptable and should not be allowed anywhere.
Our work
After more than two years of lobbying on the DSA, we also saw a role for ourselves in implementing and enforcing the law.
- First, we went live with our "Stay Loud" campaign, calling on critical voices and civil society to stay loud on social media. This included our website jouwplatformrechten.nl, on which anyone can check their platform rights.
- Together with a number of other organizations, we filed a complaint against LinkedIn based on the DSA. We had found out that advertisers on LinkedIn could target their ads to groups, and that the names of those groups revealed sensitive information, such as religion, political affiliation or orientation. That, in our view, violated the DSA. The Commission took our complaint seriously and quickly issued a statement, after which LinkedIn closed the option to target those groups. We hope to file more complaints in the near future with equally successful results.
- We gladly share our position on the DSA wherever we can. For example, in May we were guests of the Dutch Parliament's Committee on Digital Affairs, where we spoke about the Dutch Implementation Act of the DSA.
What now?
The DSA is far from "finished": there will be many more investigations, proceedings and complaints until we really get everything out of this law. That is what we will be working on in the near future, and we encourage all regulators to do the same.