Dominant internet platforms like Facebook, Amazon and Google are increasingly becoming the arena of social and legal conflicts. We are witnessing a worldwide debate about potential new rules for dominant social media platforms (a so-called new “platform regulation”). These debates are highly complex in a law-based society because they require us to resolve conflicts between fundamental rights and risk delegating essential tasks to private actors. Still, the negative effects of harmful behaviour by these actors increase the political appetite for regulation.
To navigate the upcoming debate, we want to propose, collect, and evaluate concrete policy solutions within the fundamental rights framework of the European Union. These proposals have been reviewed by a group of experts from academia and civil society. The project aims at broad acceptance of the developed positions among European civil society stakeholders. Given the complexity and novelty of the underlying problems, this proposal cannot be considered the solution to all questions in this field; instead, it aims to further the debate with a concrete proposal that also addresses enforcement processes. Importantly, this proposal does not tie enforcement to liability, as such an approach would inherently create an incentive for over-blocking on the part of platforms.
This is a policy proposal in the form and in the spirit of a request for comments. We invite everybody to participate in the discussion, to provide feedback, and to propose amendments to any of the proposals outlined below on this website: feedback@platformregulation.eu
MUST
Types of Recommendations
MUST
This word means that the proposal is an absolute requirement of the recommendation.
MUST NOT
This phrase means that the proposal is an absolute prohibition of the recommendation.
RECOMMENDED
This word means that there may exist valid reasons in particular circumstances to ignore a particular item, but the full implications must be understood and carefully weighed before choosing a different course.
NOT RECOMMENDED
This phrase means that there may exist valid reasons in particular circumstances when the particular behaviour is acceptable or even useful, but the full implications should be understood and the case carefully weighed before implementing any behaviour described with this label.
DISCUSS
Policy proposal that is worth discussion within the community and requires further evaluation.
RFC2119 | Key words for use in RFCs to Indicate Requirement Levels
MUST
Scope Limitations
These policy recommendations and discussions are limited in their scope to democratic countries with a stable rule of law and strong fundamental rights protections.
MUST
Online Platforms
By online platforms, we indicate a service that provides an intermediary function in the access to information, goods or services residing on systems or networks at the direction of users.
Definition from Conseil national du numérique, Ambition numérique, Pour une politique française européenne de la transition numérique and the Digital Millennium Copyright Act.
MUST
Relevant Platforms
By relevant platforms, we understand online or social media platforms that have significant market power in a country within the EEA and a global revenue above a certain minimum threshold.
Examples of definitions for significant market power and how a regulator should assess them can be found in telecommunications law, e.g. Article 35 of the Austrian Telecommunications Act 2003.
MUST
Dominant Platforms
By dominant platforms, we understand online or social media platforms that have significant market power in a majority of countries in the EEA and a global revenue above a certain minimum threshold.
Examples of definitions for significant market power and how a regulator should assess them can be found in telecommunications law, e.g. Article 35 of the Austrian Telecommunications Act 2003.
MUST
Sponsored Content
By sponsored content, we understand all content for which a payment has been made in order to promote it, either in general or to a specific audience.
MUST
Dark Content
By dark content, we understand all sponsored content that is not visibly published through the account of the advertiser and is displayed only to a specified target audience.
What is the Difference Between a Sponsored Post and a Paid Ad?
MUST
API Accessibility
By API accessible, we understand a computer information system that gives access to content via a unique identifier. This requires that data be downloadable in bulk, by day, week and year, and per country. New data shall be accessible via the system within a day of being published. APIs should be designed in a way that sustains independent research and long-term studies.
Facebook and Google: This is What an Effective Ad Archive API Looks Like | Facebook’s Ad Archive API is Inadequate
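To make the bulk-download requirement more concrete, the following is a minimal sketch of how a researcher might query such an interface; the endpoint, parameters and response format are hypothetical assumptions, not an existing platform API.

```python
# Hypothetical sketch of the bulk-download access described above; the base URL,
# endpoint and parameters are illustrative assumptions, not a real API.
import requests

BASE_URL = "https://api.example-platform.test/v1/archive"  # placeholder host

def download_bulk(granularity: str, date: str, country: str) -> bytes:
    """Fetch one machine-readable export, e.g. all items shown in one country on one day."""
    if granularity not in {"day", "week", "year"}:
        raise ValueError("granularity must be 'day', 'week' or 'year'")
    response = requests.get(
        f"{BASE_URL}/export",
        params={"granularity": granularity, "date": date, "country": country},
        timeout=60,
    )
    response.raise_for_status()
    return response.content  # e.g. newline-delimited JSON suited to long-term studies

# Usage (data must be available within a day of publication):
#   dump = download_bulk("day", "2019-05-01", "AT")
```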
MUST
Content Provider
By content provider, we understand the person or entity that has published or created the post with the content in question.
MUST
Political Accounts
By political accounts, we indicate those accounts run by, or acting on behalf of, political parties, associations affiliated with political parties, or politically exposed persons as defined by Article 2 of EU Directive 2006/70/EC.
MUST
Procedural Safeguards for Content Notifications
A central pitfall of the current notification and action regime is the lack of procedural safeguards for the notification procedure. Every online platform needs to present to the user easily accessible, user-friendly and contextual notification options. These options should be available without the obligation to sign in or sign up with the service itself, if the content in question is publicly available.
Notifications should offer categories of different types of violations, ranging from various classes of illegal content to legal content that might be in breach of the Terms of Service or other rules of the platform. Different notification categories should trigger different procedures, which take into account the fundamental rights of all parties in question, meaning that procedures with stricter safeguards cannot be substituted by procedures with less strict ones. For example, a notification of illegality with the possibility of legal redress cannot be circumvented by deletion of the content in question under the Terms of Service of the platform.
A valid notification should be sufficiently precise and adequately substantiated. This should include 1) the location of the content (URL); 2) the reason for the complaint (potentially including the legal basis under which the content has to be assessed); 3) evidence of the claim and, potentially, legal standing; 4) a declaration of good faith that the information provided is accurate; and 5) considerations on limitations, exceptions, and defences available to the content provider. Only for notifications of violations of personality rights or intellectual property rights is identification information of the notifier mandatory. In all other cases, identification and contact information of the notifier are optional.
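As an illustration only, the required elements of a valid notification could be captured in a simple record like the following sketch; the field names are assumptions, not a prescribed format.

```python
# Illustrative sketch of the minimum fields of a valid notification (points 1–5 above).
# Field names are hypothetical; the proposal only fixes which information must be present.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ContentNotification:
    content_url: str                                   # 1) location of the content (URL)
    reason: str                                        # 2) reason for the complaint
    legal_basis: Optional[str] = None                  #    optional legal basis for assessment
    evidence: List[str] = field(default_factory=list)  # 3) evidence of the claim / legal standing
    good_faith_declaration: bool = False               # 4) information declared accurate in good faith
    limitations_considered: str = ""                   # 5) limitations, exceptions and defences considered
    # Mandatory only for personality-rights or intellectual-property notifications:
    notifier_name: Optional[str] = None
    notifier_contact: Optional[str] = None
```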
For purposes of procedural fairness and to increase the quality of content moderation, the content provider should be informed about a notification concerning his or her content, the reason for the notification, information about the subsequent process, and possible ways to appeal or file a counter-notification. The content provider should be informed immediately once the platform has received the notification, and not just after a decision has been taken. Exceptions from this obligation to notify the content provider may apply only if sending the notification would hamper ongoing law enforcement investigations.
The possibility of counter-notification should be offered to the content provider to respond to the claim of the original notifier with evidence and arguments to the contrary. This counter-notification should be an option even before a decision by the platform is taken. Both the original notification and the counter-notification should apply the same standards in terms of declarations of good faith. The counter-notification can also be filed after the content has already been removed and can also challenge the category of the content in question.
Online platforms have to inform the parties involved in a notification about the outcome of the decision the platform has taken in their case. This communication is always sent to content providers, and to notifiers if they have provided contact details in their notification. It needs to include 1) the reasoning of the platform for why it came to this decision; 2) the circumstances under which the decision was made, and whether the decision was made by a human or an automated decision agent; and 3) information about the possibility for either party to appeal this decision with the platform, before courts, or with other entities. This communication should also be sent for counter-notifications.
Online platforms need to publish information about their procedures and time frames for intervention by interested parties, as sketched below. This information should include 1) the time before a notification is sent to the content provider; 2) the time for the content provider to respond with a counter-notification; 3) the average and maximum time for a decision by the platform per category of cases; 4) the time at which the platform will inform both parties about the result of the procedure; and 5) the time for different forms of appeal against the decision.
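A platform could publish these time frames in a simple machine-readable form; the sketch below uses purely illustrative values and field names, not proposed deadlines.

```python
# Illustrative machine-readable publication of the time frames (points 1–5 above);
# all values and key names are placeholders, not proposed deadlines.
procedure_timeframes = {
    "forward_notification_within_hours": 24,              # 1) notification sent to the content provider
    "counter_notification_window_days": 7,                # 2) time to respond with a counter-notification
    "decision_time_days": {"average": 2, "maximum": 14},  # 3) per category of cases
    "inform_parties_within_hours": 24,                    # 4) when both parties learn the result
    "appeal_window_days": 14,                             # 5) time for different forms of appeal
}
```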
These proposals draw strongly on Kuczerawy, Aleksandra, “Safeguards for Freedom of Expression in the Era of Online Gatekeeping” (September 11, 2018), Auteurs & Media, 2018 (forthcoming), which builds on the Manila Principles.
RECOMMENDED
Notification and Fair Balance for Illegal Content
The current notification-and-takedown system should be replaced by a notification-and-fair-balance procedure, which mandates different types of action depending on the content that is notified and the fundamental rights affected. To allow for transparency and sufficient oversight within the rule of law, the procedure required after a notification of illegal content (notification of illegality) that gives rise to suspicion of a criminal offence cannot be preempted by the deletion of this content based on the Terms of Service or Community Guidelines of the platform. Otherwise, the economic incentive of the dominant platform would encourage over-blocking of potentially legal content in order to avoid a more burdensome procedure.
RECOMMENDED
Enforcement via European Platform Regulator
A competent regulatory authority in the form of a European agency is tasked with the enforcement and supervision of the obligations of online platforms. The regulatory authority is tasked with ensuring compliance with rules on notification and action procedures, reporting and information requirements, advertisement transparency, service interoperability, and the platform’s legally mandated cooperation with other competent authorities. Furthermore, the regulator has to approve ex ante the Terms of Service of any dominant social media platform (including other documents relevant to content moderation and account suspension, such as Community Guidelines and Codes of Conduct). It is explicitly outside the scope of the supervision and enforcement duties of the regulator to make any decisions about the legality or permissibility of individual content or classes of content. The regulator can impose penalties of up to 2% of annual global revenue. The organisation shall follow the Common Approach of the European Union and has to publish bi-annual reports on all of its activities to fulfil its mandate.
See the example of regulatory approval of Terms of Service in § 25 of the Austrian Telecommunications Act 2003.
RECOMMENDED
Develop Minimum Standards of ToS Transparency and Accountability
The social media council and the platform regulator jointly develop minimum standards for transparency and accountability of content moderation practices. These minimum standards are based on international standards and human rights law as well as existing content regulation best practices among online platforms. The platform regulator shall take the utmost account of these standards in its assessment of the content moderation frameworks of relevant and dominant platforms.
RECOMMENDED
Develop Effective Alternatives to Content and Account Deletion
Currently, the entire public debate and policy implementation focuses on content and account deletion. Platforms should be encouraged to use such measures only as a last resort and instead explore other measures that are less invasive for freedom of expression but may have a similar effect. These may include, but are not limited to: content curation and community management, changing the operating style and design of forums away from maximising screen time, and actively setting explicit speech norms within online communities. All of these measures can contribute to reducing the need for deletion in the first place.
DISCUSS
Trusted Flaggers
Dominant and relevant social media platforms may appoint trusted flaggers within a country. Notifications from trusted flaggers are dealt with more expeditiously than others, but they are subject to the same safeguards as regular notifications. A list of all current and previous trusted flaggers has to be published by the platform. The application and revocation process, as well as the criteria for an organisation or an individual to be awarded trusted flagger status, must be made public. Governmental institutions should never be able to become trusted flaggers.
MUST NOT
Real Name Policy
For many marginalised groups, anonymity is a pre-condition for the exercise of the right to freedom of speech. Therefore, the idea of fostering effective law enforcement by obliging all account holders on social media platforms to register with their real identity would lead to a chilling effect.
Arguments against a real name policy proposal in Austria | Alternatives to real name policies
MUST
Empower Users to Take Control Over Algorithmic Curation of Information
Users must have an easily accessible option to sort the content being displayed to them. Dominant and relevant social media platforms have to offer this possibility to users. At the least, the setting should include a fully chronological timeline, but it would benefit from also including other factors that empower the user to take control of their information diet. Users can take these decisions actively for the duration of individual sessions. The concrete options the platform must offer can be evaluated by the regulator, which can issue guidance on potential additions and on the design of the feature. This obligation does not exclude the potential insertion of sponsored content.
In allowing users to see how much content is otherwise hidden from them, this measure improves the user’s understanding of algorithms. It also enables them to understand the amount of content posted by accounts they follow. Technically this option should not create an undue burden for the platform provider.
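The sorting option itself is technically simple. The sketch below, with assumed field and mode names, shows how a per-session user choice between a chronological and a relevance-ranked feed could be applied.

```python
# Minimal sketch of a user-controlled feed ordering; Post fields and mode names are assumptions.
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class Post:
    author: str
    created_at: datetime
    relevance_score: float   # the platform's opaque ranking signal
    sponsored: bool = False  # sponsored content may still be inserted

def order_feed(posts: List[Post], mode: str = "chronological") -> List[Post]:
    """Order the feed according to the option the user picked for this session."""
    if mode == "chronological":
        return sorted(posts, key=lambda p: p.created_at, reverse=True)
    if mode == "relevance":
        return sorted(posts, key=lambda p: p.relevance_score, reverse=True)
    raise ValueError(f"unknown feed mode: {mode}")
```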
NOT RECOMMENDED
Liability for Discriminatory Recommendation Algorithms
After a negatively discriminatory effect of algorithmic recommendations of a dominant platform has been proven, and after the platform has obtained knowledge that it sustains that same negative effect for a prolonged period, the platform should NOT become liable for the damages caused to the affected group. Associations of marginalised groups should NOT be able to bring class-action cases to court to establish the facts of the case and subsequently ask for damages. Under such a scheme, the discriminatory effect would have to be established by a court, with the possibility for both parties to appeal. (See the description for a detailed explanation.)
Discrimination in recommendation algorithms is incidental, not intentional, as recommendation engines are generally trained to optimise the commercial success of a platform (e.g. by maximising the total time a user spends on a site). The mathematical models employed by recommendation engines do not, in general, have representations of specific types of content that could be manipulated to give results desired by a specific group; any change of the model to incorporate such features would require a categorisation of the training data set that is unlikely to be available, nor is there any guarantee that changing the model in favour of one group would not, as an unintended consequence, hurt a different group. As such, there should not be any liability that goes beyond current anti-discrimination legislation, which protects against intentional discrimination.
DISCUSS
Scientific Access to Dominant Platforms via Committee Safeguard
Establishment of an EU committee which receives and decides on research applications from independent academic institutions that offer a benefit to society. Approval depends on the ethical, data protection and scientific standards of the research proposal. Once a proposal is approved, the dominant social media platform has to grant access to the defined data sets. An oversight board will enforce the compliance of researchers and the platform with the agreed data protection and research standards. Data provided by the dominant platform needs to be consistent and in a standardized, machine-readable format.
Social Science One: a positive attempt that stalled because there were no effective sanctions when dominant platforms refused to cooperate.
DISCUSS
Scientific Access to Dominant Platforms with Differential Privacy Safeguards
Dominant platforms should provide access to their data via a differential privacy interface to researchers selected by a committee. To protect private user data, the differential privacy measure introduces statistical noise into the output of every query.
Systems that merely randomize user data could be bypassed by requesting multiple sets of data and reverse-engineering the randomization process. The risk of large amounts of personal data being exposed would then be higher than the benefits possibly gained from publicly accessible research data.
A possible option could be to provide quarterly sets of randomized data for public research, which are randomized only once and then published as such.
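For illustration, a minimal sketch of the standard Laplace mechanism for a counting query is shown below; the parameter names are generic, and a real interface would also have to track a per-researcher privacy budget to address the repeated-query problem described above.

```python
# Sketch of a differentially private counting query using the Laplace mechanism.
# Generic illustration only; a production interface would also enforce a privacy budget.
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Return the true count plus Laplace noise calibrated to sensitivity / epsilon."""
    return true_count + laplace_noise(sensitivity / epsilon)

# Example: a researcher asking how many users saw a given item receives a noisy answer.
# print(private_count(true_count=10000, epsilon=0.5))
```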
MUST
Transparency Reports
Proportionate transparency obligations have to empower users to adequately assess the trustworthiness of platforms. Reporting obligations have to be fulfilled with a proportionate regularity and in an openly licensed, easily understandable and machine-readable format. Platforms are required to publish such reports proportionate to their size, market share, and the potential risk for users. Transparency reports need to be published on the following topics:
Current Microsoft transparency on legal information requests
Current Twitter transparency on Terms of Service enforcement
Current Facebook transparency on community standards enforcement
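A report fulfilling these obligations could, for example, be published as a simple openly licensed JSON document; the sketch below uses invented placeholder values and field names purely for illustration, loosely modelled on the kinds of reports linked above.

```python
# Illustrative stub of a machine-readable transparency report; every figure below
# is a placeholder (not real data) and the schema itself is an assumption.
import json

report = {
    "platform": "example-platform",
    "reporting_period": "2019-H1",
    "licence": "CC-BY-4.0",  # openly licensed
    "legal_information_requests": {"received": 0, "complied_with": 0},
    "terms_of_service_enforcement": {"notifications": 0, "removals": 0, "appeals_upheld": 0},
    "community_standards_enforcement": {"proactive_removals": 0, "user_reports": 0},
}
print(json.dumps(report, indent=2))  # published in an easily parsed form
```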
MUST
Rectification of Behavioural Profiles
Users must be enabled to rectify and edit their personal advertisement profile. The user can have information changed that algorithms have derived from incorrect data, as well as information that an algorithm has incorrectly inferred from correct data, without the necessity to prove the truthfulness of the request. The user interface of the platform needs to display the option for rectification close to every targeted advertisement that is based on profiling. The user interface also has to display the criteria via which the user was targeted with this particular advertisement.
Aspects of this proposal are already covered by the Right to rectification | Art. 16 GDPR. However, algorithmic assumptions about a person that cannot be disputed with facts are not covered by the right to rectification. Current Facebook Ad Preference Screen
MUST
Advertisement Archive
Dominant platforms must make an archive of sponsored content available if the content was either displayed within the European Union or paid for by an account registered within the European Union. This archive must contain all sponsored content displayed within the last several years, with full functionality, as it was displayed to the user. The additional information stored in this archive must also be provided in a machine-readable format and be accessible via an API. Additional information that needs to be supplied within the archive includes: whether the sponsored content is currently active or inactive; the start date for active content and the timespan in which the sponsored content was active for inactive content; the name and contact details of the advertiser; the total number of impressions; the exact description of the target group; and the exact amount of money paid or, while active, the estimated amount. For sponsored content that needs to be depublished due to Terms of Service violations or legal proceedings, the additional information needs to stay in the ad archive, and further information about the type of rule violation or the pending lawsuit needs to be provided. Each piece of sponsored content must contain an attached info button that links directly to the content within the Advertisement Archive.
This provides more transparency about commercial advertisement in general and, by building awareness, may also help counter public manipulation. In addition to algorithmic transparency, the possibility to understand the reason that an advertisement is shown to you may also be an important step in understanding why people see what they see online (algorithmic content composition). For more information on the current Ad Libraries see the description under “Political Advertisement Archive”. About Facebook Ads
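Purely as an illustration of the required fields, one archive entry could be represented as follows; the field names and types are assumptions, not a prescribed schema.

```python
# Illustrative record for one piece of sponsored content in the archive;
# field names are assumptions, the proposal only fixes which information is required.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ArchivedAd:
    creative: str                             # the content itself, with full functionality preserved
    active: bool                              # currently active or inactive
    start_date: str                           # ISO date for active content
    active_period: Optional[Tuple[str, str]]  # (start, end) once the content is inactive
    advertiser_name: str
    advertiser_contact: str
    impressions: int                          # total number of impressions
    target_group: str                         # exact description of the targeting
    amount_paid_eur: Optional[float]          # exact amount, or the estimate while still active
    removal_reason: Optional[str] = None      # ToS violation or pending lawsuit, if depublished
```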
MUST
Political Advertisement Archive
All political sponsored content needs to be centrally visible in a public advertisement archive. This archive must store all political sponsored content for several legislative terms. It must contain all sponsored content displayed within the last several years, with full functionality, as it was displayed to the user. The additional information stored in this archive must also be provided in a machine-readable format and be accessible via an API. Additional information that needs to be supplied within the archive includes: whether the sponsored content is currently active or inactive; the start date for active content and the timespan in which the sponsored content was active for inactive content; the name and contact details of the advertiser; the total number of impressions; the exact description of the target group; and the exact amount of money paid or, while active, the estimated amount. Following a follow-the-money approach, intermediaries have to list the ultimate client or beneficiary of the sponsored content. (Political sponsored content must be distinguishable from common sponsored content. To differentiate political sponsored content from common content, political accounts need to register with the platform and subsequently be distinguishable from common accounts.) To increase the accountability of political actors, political sponsored content needs to contain a link referencing this content in the political advertisement archive. For sponsored content that needs to be depublished due to Terms of Service violations or legal proceedings, the additional information needs to stay in the ad archive, and further information about the type of rule violation or the pending lawsuit needs to be provided. Each piece of sponsored content must contain an attached info button that links directly to the content within the Advertisement Archive.
Political Accounts: Facebook on ads related to politics or issues of national importance | Facebook authorization process for political accounts | Facebook getting started for political accounts
Political ads are not restricted to political parties, leaders, or foundations. Therefore, it is important to create a general advertisement archive. See: Advertisement Archive.
It is important to list the ultimate beneficiary because political propaganda may also be spread by dummy accounts to bypass regulation concerning political advertisement.
Providing the exact amount of money spent on a political online advertisement is also a requirement to effectively monitor political campaign regulation. EU country comparison on political campaign regulation p. 12
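The additional political-transparency fields could complement the same kind of record sketched under the general Advertisement Archive; the names below are again illustrative only.

```python
# Illustrative additional disclosure fields for political sponsored content;
# names are assumptions and would complement a general archive record.
from dataclasses import dataclass

@dataclass
class PoliticalAdDisclosure:
    registered_political_account: str  # the political account registered with the platform
    ultimate_beneficiary: str          # final client behind any intermediaries (follow-the-money)
    archive_link: str                  # link attached to the ad, pointing into the public archive
```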
MUST
Prohibition of Dark Content
Dark content provided by political accounts is generally prohibited. All content that is published by a public political account needs to be visible on the account page.
Personalised election promises foster misinformation and weaken democratic discourse. Every political message needs to be accountable and subject to public scrutiny.
RECOMMENDED
Rectifying Defamatory Content and Election Interference
A rectification of, or apology for, content on a dominant social media platform that a court has ruled to constitute election interference or defamation needs to be published by the platform on channels with an audience equivalent to the original content. Once a court has ordered the content provider to issue a rectification or apology statement according to national media or civil law, the obligation of the content provider shall extend to the dominant social media platform, which must publish this statement at the same level, and with the same parameters, via which the original content was displayed to users. The purpose of this obligation is to reach the same or an equivalent audience. To implement this obligation, the platform is not obliged to track user behaviour or retain additional information about user interactions.
See NETPEACE: “Right to digital counter statement: notification of rectification in case of identified false reports due to court decisions. In the field of false reports / honorary offences, the right to a digital counter statement should be established or expanded, according to which all those users who have been notified of a judicially established hoax or defamation must also be notified of the counter-notification. Any notifications of rectification must be sent out via all channels in which the hoax was displayed (i.e. also in the profiles of those users who shared the hoax) and to all users who interacted with the causing message (likes, comments, etc.). The obligation to correct is to be designed in a way that there is no obligation for additional tracking.”
MUST
Interoperability Obligation
The platform regulator has within its mandate the power to order, on a case-by-case basis, the provision of data transfer and service interoperability measures. Such measures can only be ordered from dominant platforms.
See Data transfer project, Data transfer project whitepaper, Right to data portability | Art. 20 GDPR, Economics of open and closed systems - switching costs p. 8 and Tim Berners-Lee | Solid | true data ownership
RECOMMENDED
Stricter Merger Control
Mergers that create a monopoly, or dominant platforms acquiring their rivals or nascent competitors, should be prohibited. Similar to the telecommunications market, mergers between, and acquisitions of, dominant platforms are subject to approval by competition authorities. The public authority shall take into account the effect of the merger on consumer choice, on the potential concentration of market power, on risks of gatekeeping roles in other markets, as well as on the concentration of user data under one centralised entity. Therefore, specific models should be created that take the peculiarities of data-based markets into account. Additionally, the “potential competition” test should be applied more consistently to prevent bigger firms from absorbing small companies, such as start-ups, that could become competitors in the future. Authorities can prohibit or allow the merger and can also attach conditions to it.
Partly based on BEUC: Shaping Competition Policy in the era of digitalisation. See also It’s Time to Break Up Facebook - NYT
RECOMMENDED
Reparation of the Harm Caused to Consumers and Competition
An effective collective enforcement mechanism empowers consumers to obtain timely redress in case of competition law infringements. While consumers are the ones ultimately affected by abusive conduct, they currently have few or no remedies at their disposal. Therefore, the scope of the Directive on Representative Actions for the protection of the collective interests of consumers should also include infringements of competition law. Further, part of the fines imposed on companies for breaches of competition law should go to projects and initiatives aimed at creating a culture of compliance and helping consumers reap the benefits of competitive markets.
Based on BEUC: Shaping Competition Policy in the era of digitalisation.
DISCUSS
Effective Assessment of Market Power in Digital Markets
The criteria upon which market power is assessed should include proxies, such as the control of data necessary for the creation and provision of services. Abuses of market power often also entail other breaches, such as of consumer law or privacy protections. Close cooperation between competent authorities is a key requirement for effective enforcement.