Harms Reduction: A Six-Step Program to Protect Democratic Expression Online
Canadian Commission on Democratic Expression
January 2021
In a new report published by the Canadian Commission on Democratic Expression (CCDE), the Commission and the Citizens’ Assembly on Democratic Expression argue that the Canadian government must take urgent action to regulate social media platforms in order to adequately protect Canadians from online harms.
About the Initiative
The Canadian Commission on Democratic Expression is a three-year initiative, led by the Public Policy Forum, that aims to bring a concerted and disciplined review to the state of Canadian democracy and how it can be strengthened. The centerpiece is a small, deliberative Commission which will draw on available and original research, the insights of experts and the deliberations of a representative citizens’ assembly to assess what to do about online harms and how to buttress the public good. The Commission is designed to offer insights and policy options on an annual basis that support Canada’s democracy and social cohesion.
This initiative grew out of earlier insights into the relationship between digital technologies and Canada’s democracy, covered in the Public Policy Forum’s ground-breaking report The Shattered Mirror, its subsequent interdisciplinary research outlined in the Democracy Divided report (with UBC), and the Digital Democracy Project partnership, directed by Taylor Owen.
The initiative is stewarded by Executive Director Michel Cormier and delivered in partnership with MASS LBP co-founder Peter MacLeod, who is conducting the national citizens’ assemblies. The Centre for Media, Technology and Democracy provided the CCDE with research that informed the Commission’s deliberations.
To learn more about the initiative and how you can become involved, please visit www.ppforum.ca. The initiative will run from April 2020 to March 2023.
In light of the steep growth in social and democratic harms online, the Public Policy Forum last spring established a Commission and a Citizens’ Assembly to study and provide informed advice on how to reduce harmful speech on the internet without impairing free speech.
The report is the culmination of 20 separate Commission convenings, totalling more than 30 hours of deliberation over a nine-month period beginning in May 2020. The seven Commissioners are: Rick Anderson, Julie Caron-Malenfant, Adam Dodek, Amira Elghawaby, Jameel Jaffer, Jean La Rose and The Right Honourable Beverley McLachlin. The Commission and the Assembly undertook concurrent and complementary deliberative processes, hearing witness testimony from many of the same recognized experts and learning from the insights of an original body of research.
The Findings
Nine months on, the Commission has published its findings alongside those of the Citizens’ Assembly. The Commission has set out an integrated program of six practical steps that rejects a policy of aggressive takedown of content in favour of a citizen-centric approach that places responsibility for hateful and harmful content firmly on the shoulders of the platforms and the content’s creators. The program is intended not only to protect but also to enable democratic participation.
1. A new legislated duty on platforms to act responsibly.
2. A new regulator to oversee and enforce the Duty to Act Responsibly.
3. A Social Media Council to serve as an accessible forum for reducing harms and improving democratic expression on the internet.
4. A world-leading transparency regime to provide the flow of necessary information to the regulator and the Social Media Council.
5. Avenues that enable individuals and groups to deal with complaints of harmful content in an expeditious manner: an e-tribunal to facilitate and expedite dispute resolution, and a process for addressing complaints swiftly and lightly before they become disputes.
6. A mechanism to quickly remove content that presents an imminent threat to a person.
The Commission believes that this made-in-Canada approach to curbing the dissemination of online harms will create a standard of reasonableness for how platforms deal with harmful content in their domains, with responses modulated to the seriousness and pattern of abuse, and with consequences for failure to comply.
Read the full report in English and French at the link below.