In brief
The Canadian government plans to introduce legislation this year to regulate social media companies, with a focus on online hate and harassment. After nine months of study and deliberations, the Canadian Commission on Democratic Expression has settled on a series of principles and recommendations that are aimed at influencing legislation.
Contents
- In depth
- Recommendation #1: A new legislated duty on platforms to act responsibly
- Recommendation #2: A new regulator to oversee and enforce the Duty to Act Responsibly
- Recommendation #3: A Social Media Council to serve as an accessible forum in reducing harms and improving democratic expression on the internet
- Recommendation #4: A world-leading transparency regime to provide the flow of necessary information to the Regulator and Social Media Council
- Recommendation #5: Avenues to enable individuals and groups to deal with complaints of harmful content expeditiously: an e-tribunal to facilitate and expedite dispute resolution, and a process for addressing complaints swiftly and lightly before they become disputes
- Recommendation #6: A mechanism to quickly remove content that presents an imminent threat to a person
- Other Considerations
- Participation
In depth
The Public Policy Forum (PPF) is a prominent independent, non-profit Canadian think tank for public-private dialogue. In April 2020, the PPF established the Canadian Commission on Democratic Expression (“Commission”) to study and provide informed advice on how to reduce harmful speech on the internet. The Commission issued a report recommending six practical steps that place responsibility for hateful and harmful content on the shoulders of technology platforms and the creators of that content. The recommendations are summarized below.
Recommendation #1: A new legislated duty on platforms to act responsibly
The Commission believes that platform companies must accede to a greater public interest by assuming responsibility for harmful content that appears within their domains. Creating a new legal standard of responsible conduct would impose an affirmative requirement on platforms, including social media companies, large messaging groups, search engines and other internet operators involved in the dissemination of user-generated and third-party content.
Recommendation #2: A new regulator to oversee and enforce the Duty to Act Responsibly
To oversee and enforce the new Duty to Act Responsibly on platforms, the Commission calls for the creation of a new regulatory body (“the Regulator”) that would move content moderation and platform governance beyond the control of private sector companies. Regulatory decisions would be made judicially, based on the rule of law and subject to a process of review. The Regulator would also be responsible for publishing and enforcing a Code of Conduct for regulated parties that underpins the Duty to Act Responsibly.
Recommendation #3: A Social Media Council to serve as an accessible forum in reducing harms and improving democratic expression on the internet
Creating an independent, stakeholder-based social media council would provide an institutional forum for platforms, civil society, citizens and other interested parties to engage in an inclusive dialogue about ongoing platform governance policies and practices. Importantly, it would perform a consultative role for the Regulator, providing broad-based input into the Code of Conduct and into how changing technology, business models and user experience affect policy.
Recommendation #4: A world-leading transparency regime to provide the flow of necessary information to the Regulator and Social Media Council
One of the central challenges faced by researchers, journalists, policy communities, social media users and, soon, regulators is that the platform ecosystem is widely regarded as opaque. Embedding significant transparency mechanisms at the core of the mandate of the Regulator and the Social Media Council would provide greater access to information and create a more publicly accountable system.
Recommendation #5: Avenues to enable individuals and groups to deal with complaints of harmful content expeditiously: an e-tribunal to facilitate and expedite dispute resolution, and a process for addressing complaints swiftly and lightly before they become disputes
The Commission believes that creating a new e-tribunal for online content disputes could rebalance the asymmetry in the digital sphere, shifting dispute resolution from private sector processes within the platform companies to a public institution dedicated to due process and transparency. An e-tribunal would provide rapid and accessible recourse for content-based dispute settlement.
Recommendation #6: A mechanism to quickly remove content that presents an imminent threat to a person
Given the instantaneous nature of the internet, the Commission recommends that the Regulator be empowered to issue cease and desist orders requiring takedown within 24 hours in cases judged to contain a “credible and imminent threat to safety”. These orders would be challengeable in court and would be an exception to the Commission’s general rule that the Regulator refrain from individual content decisions and instead address systemic issues.
Other Considerations
The Commission considered imposing reactive takedown requirements on platforms, which would require companies to remove “offending categories” of content in as little as 24 hours or face heavy fines. Despite the existence of such mechanisms in other jurisdictions, the Commission rejected this approach for fear of over-censorship.
Overall, the Commission believes that to be effective, the Regulator must have the power to impose penalties, such as significant fines and possible jail time for executives. The Commission intends for the affirmative requirements on the platforms to be developed under legislation and regulation.
Participation
Further details are available by consulting the Canadian Commission on Democratic Expression’s final report and the report of the Citizens’ Assembly on Democratic Expression.