European Parliament approves landmark tech laws
On July 4, the “Digital Services Act package” was passed with the support of EU member states. The Digital Services Act (DSA) and the Digital Markets Act (DMA) are designed to create a safer and more competitive online environment.
The strengthened Code of Practice on Disinformation will be “underpinned” by the Digital Services Act (DSA) for “Very Large Online Platforms,” defined as those with at least 45 million users in the EU. The Act requires them to take steps to minimize any systemic risks related to their services that might lead to the spread of misinformation.
Companies that violate the DSA may face fines of up to 6 percent of their annual global revenue, which may serve as an effective incentive to remain compliant with the Code. The press release on the 2022 Code also states that “The DSA will force huge platforms to put our society’s well-being first and their financial interests second.”
Linking the Code to the Act may not assure compliance by smaller companies, since they may not be held to the same standards as bigger platforms. Commentators have noted that this could lead smaller organizations to act against misinformation less vigorously.
According to the European Union, the DSA and DMA have two main objectives: to establish level playing fields to encourage innovation, growth, and competitiveness, both in the European Single Market and globally; and to create a safer digital space in which the fundamental rights of all users of digital services are protected.
The Code acknowledges the complex nature of online misinformation and is applicable to big tech platforms, social media firms, digital advertising, fact-checkers, and civil society.
The Code lays out its scope in granular detail across 9 chapters and 40 pages, outlining 44 specific and varied commitments to fighting misinformation and reinforcing that “all parties in the [disinformation] ecosystem have roles to play in preventing its spread.” The EU’s Digital Services Act envisions the Code of Practice ultimately evolving into a Code of Conduct.
Each pledge is carried out through a variety of measures, including metrics that signatories use to track, evaluate, and report their individual impact in combating misinformation. The Code further stipulates that the fight against misinformation must be balanced against the fundamental rights of EU citizens, while strengthening monitoring tools and joint research.
By establishing additional obligations for online platforms, the DSA and the DMA are both intended to limit the influence of the four big tech giants: Facebook, Google, Apple, and Amazon. Additionally, the regulations aim to “better educate and empower consumers.”
Many of the original draft’s strategies, however, were seen as incompatible with democratic principles; the first proposal, for instance, would have required social media sites to hand over user data to governments. Although the new platform restrictions are stricter than the existing ones, the final measure stops short of turning social networks and search engines into censorship tools.
The DSA package, however, does not offer a comprehensive fix for every issue consumers may encounter online. Indeed, the regulations might be viewed as granting government entities too much authority to identify and remove potentially criminal content and to gather information on anonymous speakers.
How these regulations are put into effect will depend on how social media companies perceive their DSA requirements and how EU authorities carry out the regulation’s enforcement.
According to the official definition provided by the EU, “digital services comprise a broad spectrum of online services, from straightforward websites to internet infrastructure services and online platforms.”
The DSA’s regulations largely target online intermediaries and platforms, including social networks, online marketplaces, content-sharing websites, app stores, and websites for booking travel and lodging. The package also contains rules for “gatekeeper” online platforms, which act as bottlenecks between businesses and consumers for important digital services in the Union’s internal market.
The Digital Markets Act also covers some of these services, although for different reasons and under different rules. The advantages of widespread digitization are widely acknowledged, yet digital services have also brought negative effects for Europe’s economy and society. The online trade and exchange of unlawful goods, services, and content is a major source of concern.
Online services are also increasingly abused by manipulative algorithmic systems to spread misinformation more widely and for other harmful ends. How platforms respond to these new issues will have a major influence on fundamental rights online.
There are still considerable gaps and regulatory hurdles to resolve despite a variety of focused, sector-specific actions at the EU level.
In preparing this legislative package, the European Commission sought input from a wide variety of stakeholders, including those in the corporate sector, consumers of digital services, civil society organizations, state authorities, academics, the technical community, and international organizations.
In the framework of the DSA and the DMA, these discussions identified particular challenges that could call for EU-level involvement. A political agreement was obtained on the Digital Markets Act on March 25, 2022, and on the Digital Services Act on April 23, 2022, after the Commission submitted its proposals in December 2020.
The Digital Services Act and the Digital Markets Act will be published in the Official Journal when the Digital Services Package is approved and will take effect 20 days following publication.
The DSA will apply directly across the entire EU. It will apply fifteen months after it enters into force, or from January 1, 2024, whichever is later. The obligations for very large online platforms and very large online search engines will apply four months after their designation as such.
The DMA will start to apply six months after it enters into force. Following the Commission’s designation decisions, the designated gatekeepers will have a maximum of six months to bring themselves into compliance with the Digital Markets Act’s requirements.
The DMA will apply to the companies that control entire ecosystems standing between businesses and consumers inside the EU’s internal market. These ecosystems are made up of several platform services, such as operating systems, cloud services, online marketplaces, and search engines.
These gatekeepers will be bound by a number of clearly stated obligations and prohibitions. To keep gatekeepers’ digital services contestable, these are determined by reference to the most unfair market practices, that is, behaviors that erect or reinforce barriers for other businesses.
The DMA will also establish a strong enforcement mechanism to guarantee prompt adherence to specific responsibilities.
As Thierry Breton, EU Commissioner for the Internal Market, put it: “This agreement completes our ambitious reorganization of our digital domain within the EU internal market on the economic front. We’ll go to work right away on choosing gatekeepers based on objective standards. They must adhere to their new responsibilities within six months of being named.

“The new regulations will strengthen contestability and provide more equitable circumstances for consumers and corporate users through effective enforcement, which will stimulate market innovation and choice.

“We take this shared goal seriously, and no corporation in the world can ignore the possibility of being fined up to 20 percent of their worldwide revenue if they persistently infringe the law.”
Once the DMA is in place, the European Commission will have the authority to impose penalties and fines of up to 10% of a company’s global revenue, which may increase to up to 20% of such turnover in the event of repeated violations.
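The fine caps described above are straightforward percentages of worldwide turnover. As a hypothetical illustration (the function name and structure are ours; only the 10 percent and 20 percent rates come from the DMA as summarized here), the arithmetic can be sketched as:

```python
def dma_fine_cap(global_turnover: float, repeated: bool = False) -> float:
    """Return the maximum DMA fine for a given worldwide turnover.

    Illustrative only: the DMA caps fines at 10% of global turnover,
    rising to up to 20% for repeated violations.
    """
    rate = 0.20 if repeated else 0.10
    return global_turnover * rate

# Example: a gatekeeper with EUR 100 billion in worldwide turnover
first_cap = dma_fine_cap(100e9)                  # 10% cap: EUR 10 billion
repeat_cap = dma_fine_cap(100e9, repeated=True)  # 20% cap: EUR 20 billion
```

The example shows why the repeat-violation cap matters: it doubles the maximum exposure for the same turnover.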
Repeated violations also give the Commission the right to impose any behavioral or structural remedies required to guarantee the effectiveness of the obligations, including a prohibition on subsequent acquisitions related to the violation.
The DMA also grants the Commission the authority to conduct market analyses that will guarantee that the regulatory requirements are kept current with the reality of digital markets, which is continuously changing.
Some of the significant topics covered by both the DSA and the DMA are highlighted below:
Online intermediaries and platforms like Facebook, Google, YouTube, and others would need to implement “new procedures for speedier removal” of information that is judged harmful or unlawful; what counts as unlawful may vary with the rules of each EU member state. Users will be allowed to contest these takedowns, and platforms will also need to make their takedown policies explicit.
Platforms will also need to work with “trusted flaggers” and offer a “clear method” to enable users to report information that is prohibited.
According to the DSA, big digital platforms and services now have “a duty to analyse systemic risks they cause and to conduct risk reduction studies.” This assessment will be required every year for platforms like Facebook and Google.
The risks of “dissemination of illegal content,” “adverse effects on fundamental rights,” “manipulation of services having an impact on democratic processes and public security,” “adverse effects on gender-based violence and on minors,” and “serious consequences for the physical or mental health of users” will all need to be considered by businesses.
Marketplaces like Amazon will be required by the DSA to “impose a duty of care” on vendors who use their platform to sell goods online. To make sure that consumers are fully informed, marketplaces will need to “collect and display information about the items and services sold.”
The DSA includes a new crisis mechanism provision, prompted by the war between Russia and Ukraine, that will be “initiated by the Commission on the advice of the board of national digital services coordinators.” These special measures, however, will only remain in effect for three months. The mechanism makes it “possible to examine the impact of the operations of these platforms” on the crisis, after which the Commission will determine what actions are needed to ensure that users’ fundamental rights are not violated.
The regulations also aim to forbid targeted advertising directed at children based on their personal data and to offer them tighter protection. Additionally, they call for “transparency measures for online platforms on a range of topics, including the algorithms used to offer content or items to consumers.” Finally, the package asserts that cancelling a subscription ought to be as simple as signing up.
The Digital Services Act package is being adopted by the European Union at a time when nations all over the world are enacting their own personal and non-personal data protection regulations and tightening restrictions on social media platforms.
Two significant nations in these fields are India and the United States. Below, we’ve outlined some of the key aspects of these laws (and the ones that have been suggested). The majority of user data across the globe is amassed by digital corporations in the US, but as of yet, there is no US regulation governing what data is gathered and how it is used.
The American Data Privacy and Protection Act is poised to change that status quo. More likely to become law than any previous federal privacy bill, it would be the first comprehensive national data privacy framework to enjoy support from both parties and both chambers of Congress.
The bill mandates that covered entities collect only the data they need, ensures that consumers do not have to pay for privacy, and requires covered organizations to implement privacy by design. It requires covered organizations to give consumers the option to opt out of targeted marketing, and provides children and adolescents with improved data protections. It also gives consumers the right to access, correct, delete, and transfer their data, and to withdraw consent at any time, alongside stronger accountability mechanisms for bigger platforms, stronger security for sensitive personal data, and greater transparency in how businesses gather and use data.
State governments around the nation are also attempting to enact new legislation to regulate social media services. Texas signed its social media statute HB 20, which forbids major social media platforms from filtering and barring users based on their “opinion,” into law in September 2021. In May 2022, the US Supreme Court reinstated a lower court’s injunction blocking the statute. Tech businesses are challenging the bill, arguing that it is unconstitutional because it compels private organizations to carry speech in a way the government prefers.
India does not yet have data protection legislation, and the fate of its Personal Data Protection Bill and Data Governance Framework remains uncertain.
The new Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules), published by the Central government in New Delhi, mandate that social media platforms proactively monitor content, publish regular grievance reports, appoint grievance redressal roles, and disable content in accordance with government directives. Below are some of the major elements of the regulations.
The Rules define “significant social media intermediaries” (SSMIs) as social media intermediaries whose registered users in India exceed a notified threshold. SSMIs are required to follow additional due diligence procedures, including the use of technology-based measures on a best-effort basis to identify specific types of content.
Other requirements include appointing specific personnel for compliance and, under specific circumstances, being able to identify the first originator of information on their platform.
The Rules outline a framework for the control of news and current affairs information online as well as curated audio-visual content.
All intermediaries must offer a grievance resolution process to address concerns from users or victims. Publishers are required to use a three-tiered grievance redressal procedure with various degrees of self-regulation.