The Digital Services Act (DSA) is an EU bill designed to fight fake news. It contains a series of measures aimed at imposing greater responsibility on Big Tech for illegal or harmful content circulating on their platforms, and it includes measures against online disinformation. Many of its provisions apply to services with more than 45 million users in the European Union, such as Facebook, Google's subsidiary YouTube, Twitter and TikTok.
Among the provisions are a ban on advertising targeted at minors or at users on the basis of their sex, ethnicity or sexual orientation, and a ban on deceptive techniques companies use to get people to do things they did not intend to do, such as subscriptions that are easy to sign up for but hard to cancel. Additionally, to demonstrate that they are making progress in limiting these practices, technology companies will need to carry out annual risk assessments of their platforms.
Big Tech companies will also need adequate staff to manage content moderation, because users will have the right to submit complaints in their own language. Content here means commercial advertisements as well as posts by individual users. Whoever does not respect the rules risks penalties of up to 6% of global turnover, or even a ban on operating in the EU single market in the event of repeated serious violations.
“This regulation – the EU Commission said – unique of its kind, will force platforms such as Facebook, YouTube or Twitter to moderate the content they host. The DSA is a world first in terms of digital regulation. The text enshrines the principle that what is illegal offline must also be illegal online. It aims to protect the digital space from the dissemination of illegal content and to guarantee the protection of the fundamental rights of users.”
In particular, the bill states that advertising targeted on the basis of an individual’s religion, sexual orientation or ethnicity is prohibited. Minors also cannot be the subject of targeted advertising. Confusing or deceptive user interfaces (so-called dark patterns) designed to steer users toward certain choices will be prohibited, and the EU has established that canceling a subscription must be as easy as signing up for it.
Large online platforms like Facebook will also have to make the operation of their recommendation algorithms transparent to users, for example those used to sort content in the news feed or to suggest shows on Netflix. Users must also be offered a recommendation system not based on profiling. In the case of Instagram, this would mean a chronological feed (as recently reintroduced).
Hosting services and online platforms will need to clearly explain why they have removed illegal content, as well as give users the opportunity to appeal such removals. The new regulation also establishes an obligation to “promptly” remove any illegal or harmful content circulating on their platforms (according to national and European laws) as soon as a platform becomes aware of it, and it forces social networks to suspend users who “frequently” violate the law.
The big online platforms will have to give key data to researchers to provide more insight into how online risks evolve. Online marketplaces will need to retain basic information about traders on their platforms, to help track down people who sell illegal goods or services. Big platforms will also need to introduce new strategies to tackle disinformation during crises.
The DSA also contains an important update to the directive on e-commerce platforms, adopted 20 years ago when such platforms were still in an embryonic state. Online sales sites will be obliged to verify the identity of their suppliers before offering their products; the platforms, in turn, will be audited once a year by independent bodies and placed under the supervision of the European Commission.
The political agreement on the Digital Services Act paves the way for its formal adoption in the coming weeks, and the law will probably enter into force by the end of the year, although the rules will not start to apply until 15 months later. Until now, regulators had no access to the internal mechanisms of Google, Facebook and other platforms; with the new law, companies will need to be transparent and provide information to regulators and independent researchers about their content moderation efforts.
To enforce the new rules, the European Commission will have to hire about 200 new employees, who will be paid by tech companies through a “supervisory fee” of up to 0.1% of their annual global net income.