This executive summary lays out highlights from the report Protecting Privacy Rights in the Digital Age, written by Max Bell School Master of Public Policy students Elisa Alloul, Harshini Ramesh, Imani Thomas, and Caroline Wilson as part of the 2023 Policy Lab.
Access the summary and presentation below, and read their full report here.
Economic activity has been transformed by the ability to turn ubiquitous data into valuable insights. One category of this abundant data is personal information, which provides insights into people’s characteristics, preferences, and behaviours. No matter its type or sector, every organization in society can use this information to its benefit, whether to improve operations, boost fundraising, gauge voting intentions, or sell products and services. Individuals also gain value from data-driven innovations: 67% of Canadians are willing to share their data in exchange for new products or better customer experiences.
However, as more data are collected, people’s private spheres continue to shrink. Renowned Harvard professor Shoshana Zuboff coined the term “surveillance capitalism” to describe the phenomenon of collecting excessive amounts of information from, and generating insights on, groups and individuals.
Balancing the protection of people’s privacy with data-driven economic growth is the central challenge of the Digital Age. This Policy Lab report, sponsored by Interac Corp., examines this tension in Canada and how Canadians’ privacy protections can be strengthened in the Digital Age. Specifically, it answers the following question:
“With a complex environment of digital privacy protections across various jurisdictions in Canada, what regulatory and legislative changes are needed to recognize digital privacy as a basic human right for all, and how do we ensure digital inclusion and control of data are considered as part of the solution?”
The main objectives of this report:
- To provide an understanding of Canada’s digital privacy challenges by outlining how digital transformation impacts privacy and trust in the digital economy
- To propose a human rights-based approach to digital privacy
A Snapshot of the Digital Age
Data is a key driver of economic growth in Canada. The digital economy’s nominal GDP grew 40% between 2010 and 2017, outpacing the growth rate of the overall economy. In 2019, the industry was worth $118 billion, making it nearly as valuable as Canada’s lucrative mining, oil, and gas industries ($119 billion). Both public and private organizations are investing considerably in advanced analytics, algorithms, and artificial intelligence (AI) to understand how best to collect and process data. This prospecting in the digital economy has led to data being described as “the new oil”.
Canadian Digital Privacy Challenges
Privacy in the Context of Private Sector Operations
Data analytics and other new technologies have driven, and will continue to drive, economic value and competitive advantages in business and politics. This has major implications for the protection of personal information and, by extension, people’s privacy. A fundamental tenet of Canada’s commercial privacy law is respecting privacy by obtaining consent. Despite federal and provincial privacy laws outlining responsibilities around consent and the safeguarding of personal information, businesses are not fully compliant with their privacy obligations. Small and medium-sized enterprises (SMEs) in particular struggle to balance innovation with compliance with privacy law.
Role of Trust in the Digital Era
Trust is critical to growth in the digital economy. People generally expect that information they share about themselves will be used for the purposes they have identified, and that any other information discerned about them will be used in a fair and appropriate manner. However, this has not always been the case. In a 2023 survey conducted on behalf of the Office of the Privacy Commissioner of Canada (OPC), only 39% of respondents believed that businesses respect their privacy rights, down from 45% in 2020. In addition, in a survey commissioned by Interac earlier this year, 74% of Canadians said they would like more control over their online information. Unlike data, trust is not ubiquitous in the digital economy.
Privacy Harms in the Digital Era
A breakdown of trust is not the only kind of harm that can arise in the digital era. When the digital privacy of marginalized communities is infringed upon, the digital space recreates systems of oppression that harm those communities. The overcollection of data increases the surveillance of traditionally marginalized groups, data processing can cause discrimination against entire groups, and data collection and processing together put civil and political rights at risk.
Global Policy Shifts
Recognizing this myriad of harms, jurisdictions such as the European Union (EU), California, and Quebec have moved to update their privacy protection legislation. Canada risks falling behind its trading partners, notably the EU, which requires other countries to have privacy legislation protecting Europeans that is comparable to its own. To ensure that trade can continue uninterrupted, Canada has an imperative to update its privacy laws.
Canada’s Policy Landscape
There are three privacy protections to consider under federal law:
- The Personal Information Protection and Electronic Documents Act (PIPEDA) is Canada’s privacy law governing the use of personal information in commercial contexts.
- The Office of the Privacy Commissioner (OPC) is Canada’s federal privacy oversight body, responsible for overseeing compliance with the Privacy Act (the privacy law applicable to the federal public sector) and PIPEDA.
- Privacy Law Outside of Commercial Use largely concerns the Canada Elections Act which governs the use of personal information in the electoral process as it relates to the maintenance of the Register of Electors.
The federal government is in the midst of updating its privacy regime with Bill C-27, The Digital Charter Implementation Act, which comprises three parts: The Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act, and the Artificial Intelligence and Data Act.
Both the existing protections and the current iteration of Bill C-27 fall short of fully protecting people’s privacy. First, neither adequately recognizes sensitive information or provides special requirements for handling it. Second, despite all sectors being involved in data collection and processing, political parties and not-for-profits are not covered by the law. Third, there are gaps in governing data processing, specifically the de-identification and anonymization of data. Fourth, current privacy supports are not fully equipped to assist SMEs and not-for-profit organizations. Finally, the OPC lacks the enforcement powers to fully encourage adherence to privacy laws.
Rapid developments in technology are putting people’s privacy at risk. Canada’s current regime views privacy through a commercial lens, which does not fully account for the human rights impacts of privacy infringements. Canada needs a human rights-based approach (HRBA) to privacy to strengthen protections, an approach it can justify with existing international and federal legal instruments. Taking an HRBA to privacy means recognizing two key pillars: information is inherent to personhood, and accountability and inclusion foster trust. This approach forms the foundation for two key recommendations to bolster Canada’s privacy regime:
Recommendation 1: Raise the Level of Responsibility for Private Actors
Interac should advocate for an HRBA to protecting personal information. This begins with recognizing that there is an inherent “humanness” to data; personal information is inalienable from a person and, by extension, their personhood. Legislative and regulatory changes should include stricter governance around people’s information.
1.1 Safeguarding Sensitive Information
Effectively safeguarding sensitive information begins with a clear definition within the legislation and acknowledging that not all sensitive personal information has the same level of sensitivity and potential for harm. In addition to defining sensitive information, a tiered system of sensitive information should be implemented to ensure that private actors are treating and protecting sensitive information with the highest regard possible.
1.2.1 Fill in the Governance Gaps
Bill C-27 should extend its scope to include not-for-profits, political organizations, and other similar actors who are not commercial operators but collect and process significant amounts of personal information for their operations.
1.2.2 Governing Data Processing
Bill C-27 should define the “processing” of data and expand its focus beyond collection, use, and disclosure. A new set of regulations is required to provide uniform guidance on managing personal and sensitive information when it is de-identified or anonymized.
Recommendation 2: Engineering an Environment of Trust
To build trust in the digital privacy environment, Interac should advocate for elevating community voices in a national conversation on privacy and for ensuring the OPC has the tools to enforce compliance.
2.1 Digital Privacy Dialogues Initiative
Businesses like Interac and others should adopt a Digital Privacy Dialogues corporate social responsibility initiative.
2.2 Establishing an SME and Not-for-Profit Organization Advisory Directorate
The Directorate can support SMEs and not-for-profit organizations in accessing resources and information on compliance best practices, helping these organizations adapt to and comply with new and amended laws, regulations, and policies, and reducing the risk of an investigation, fine, or breach.
2.3 Enforcing Privacy Law
Bill C-27 should empower the OPC to enforce privacy law with adequate regulatory tools, including the power to issue penalties.