In depth

The Public Policy Forum (PPF) is a prominent independent, non-profit Canadian think tank for public-private dialogue. In April 2020, the PPF established the Canadian Commission on Democratic Expression (“Commission”) to study and provide informed advice on how to reduce harmful speech on the internet. The Commission issued a report recommending six practical steps that place responsibility for hateful and harmful content on the shoulders of technology platforms and their creators. The recommendations are summarized below.

Recommendation #1: A new legislated duty on platforms to act responsibly

The Commission believes that platform companies must serve a greater public interest by assuming responsibility for harmful content that appears within their domains. A new legal duty to act responsibly would impose an affirmative requirement on platforms, including social media companies, large messaging groups, search engines and other internet operators involved in the dissemination of user-generated and third-party content.

Recommendation #2: A new regulator to oversee and enforce the Duty to Act Responsibly

To oversee and enforce the new Duty to Act Responsibly on platforms, the Commission calls for the creation of a new regulatory body (“the Regulator”) that would move content moderation and platform governance beyond the exclusive control of private sector companies. Regulatory decisions would be made judicially, grounded in the rule of law and subject to a process of review. The Regulator would also be responsible for publishing and enforcing a Code of Conduct for regulated parties, underpinning the Duty to Act Responsibly.

Recommendation #3: A Social Media Council to serve as an accessible forum for reducing harms and improving democratic expression on the internet

Creating an independent, stakeholder-based Social Media Council would provide an institutional forum for platforms, civil society, citizens and other interested parties to engage in an inclusive dialogue about ongoing platform governance policies and practices. Importantly, it would perform a consultative role for the Regulator, providing broad-based input into the Code of Conduct and into how changing technology, business models and user experience affect policy.

Recommendation #4: A world-leading transparency regime to ensure the flow of necessary information to the Regulator and Social Media Council

One of the central challenges faced by researchers, journalists, policy communities, social media users and, soon, regulators is that the platform ecosystem remains largely opaque. Embedding significant transparency mechanisms at the core of the mandates of the Regulator and the Social Media Council would provide greater access to information and create a more publicly accountable system.

Recommendation #5: Avenues to enable individuals and groups to deal with complaints of harmful content in an expeditious manner, including an e-tribunal to facilitate and expedite dispute resolution and a process for addressing complaints swiftly and lightly before they become disputes

The Commission believes that creating a new e-tribunal for online content disputes could rebalance the asymmetry in the digital sphere, shifting dispute resolution away from private sector processes within the platform companies and toward a public institution dedicated to due process and transparency. An e-tribunal would provide rapid and accessible recourse for content-based disputes.

Recommendation #6: A mechanism to quickly remove content that presents an imminent threat to a person

Given the instantaneous nature of the internet, the Commission recommends that the Regulator be empowered to issue cease and desist orders requiring takedown within 24 hours in cases judged to contain a “credible and imminent threat to safety”. These orders would be challengeable in court and would represent an exception to the Commission’s general rule that the Regulator refrain from individual content decisions and instead address systemic issues.

Other Considerations

The Commission considered imposing reactive takedown requirements on platforms, which would require companies to remove “offending categories” of content in as little as 24 hours or face heavy fines. Although such mechanisms exist in other jurisdictions, the Commission rejected this approach for fear of over-censorship.

Overall, the Commission believes that, to be effective, the Regulator must have the power to impose penalties such as significant fines and possible jail time for executives. The Commission intends for the affirmative requirements on the platforms to be developed under legislation and regulation.

Further information

Further details are available by consulting the Canadian Commission on Democratic Expression’s final report and the report of the Citizens’ Assembly on Democratic Expression.

Author

Theo Ling heads Baker McKenzie's Canadian Information Technology/Communications practice and is a member of the Firm's Global IP/Technology Practice Group and its Technology, Media & Telecoms and Financial Institutions Industry Groups. Theo is ranked by several legal directories, including Chambers Canada, where he is described as "a knowledgeable technology lawyer, with a practical, 'can-do' attitude who is excellent at getting things done." Named by the Financial Times as one of the Top Ten Most Innovative Lawyers in North America, Theo founded the legal industry's first global legal innovation lab focused on multidisciplinary collaboration and serves on the Firm's Global Innovation Committee.

Author

Karina Kudinova is an associate in Baker McKenzie's IP/Tech Group in the Toronto Office. Prior to joining the Firm, Karina was in-house counsel for a registered Canadian credit reporting agency.