European Union imposes regulation on social networks and e-commerce platforms

With the Digital Services Act, the content moderation practices of major digital services will be monitored by Brussels, which can impose heavy sanctions.


This is a “historic” moment for digital regulation: the commissioner for the internal market, Thierry Breton, did not hide his satisfaction at having concluded, on Saturday, April 23, in Brussels, after sixteen hours of negotiation, a political agreement on the European Digital Services Act (DSA). Adopted one month after the Digital Markets Act, an “economic” text intended to force dominant platforms to respect their competitors, the DSA aims to reduce the risks to “society” by imposing duties on social networks such as Facebook, Instagram, Twitter and TikTok, and on online marketplaces such as Amazon and Leboncoin. “These texts are two sides of the same coin,” said Mr. Breton. The DSA should come into force in early 2023.

This regulation is also a political victory for France, which hoped to secure an agreement before the end of its presidency of the Council of the European Union (EU), scheduled for mid-June.

The text aims to update, “for the next twenty years,” the regulation of the web in force in Europe since the e-commerce directive, adopted in 2000, when Facebook did not exist and Amazon had only just opened in France. For some, this founding text left the digital giants too much freedom, because it exempts hosts from liability for content posted by third parties, as long as they have not been notified of it.

But, others retort, making the platforms liable, or forcing them to remove problematic content within twenty-four hours, would endanger freedom of expression and lead to excessive censorship: the bill proposed by deputy Laetitia Avia (LRM) was struck down by the Constitutional Council for this very reason.

To resolve this dilemma, the new European regulation imposes “obligations of means and transparency” on major services. Like banks, they will be required to periodically carry out “risk assessments” and then propose remedial measures. The areas targeted for now are the fight against illegal content (incitement to hatred, dangerous or counterfeit products, etc.), harm to electoral processes (disinformation, etc.), attacks on freedom of expression (in order to avoid over-censorship), and harm to minors and their mental health. All are tied to the Charter of Fundamental Rights.

“Compensation” for the wronged consumer

Large platforms already have content moderation policies, but from now on the resources allocated and the results obtained will be assessed by the European Commission, which can impose fines of up to 6% of their turnover, or even ban them from operating in the EU.

