Are you confident your data is secure and used ethically?
Quebec introduced stringent privacy regulations in September 2022 that will have downstream impacts across Canada in the years to come. Law 25 includes numerous provisions and penalties that go far beyond the current requirements of the federal Personal Information Protection and Electronic Documents Act (PIPEDA). Among comparable legislation, Law 25 is rivalled only by Europe's General Data Protection Regulation (GDPR), widely regarded as the most robust privacy law anywhere in the world.
Law 25 already has teeth beyond Quebec's borders, as it applies to any organization (Canadian or international) that does business in the province or stores personally identifiable information (PII) on Quebec residents. Given the trend toward greater accountability and stewardship over PII, it is only a matter of time before other provinces and the federal government follow suit by strengthening existing privacy laws.
Stricter regulations present both an obstacle and an opportunity for organizations that have come to rely on data for critical insights and market opportunities. Compliance will require significant updates to policies and procedures governing the collection, use, and transmission of data. These policies will need to withstand the numerous, often differing privacy regulations organizations face across the jurisdictions in which they do business. Organizations will also need to be more transparent about how and why they collect user data and with whom it will be shared, potentially triggering some uncomfortable conversations.
Many organizations will lament the potential financial costs of hiring new privacy officers, curtailing analytics programs that exceed the stated purposes for which data was collected, implementing additional compliance strategies, and absorbing any regulatory fines they might face.
At the same time, it’s worth noting that users are already much more aware of how much organizations value their data and the risks involved with sharing it. This is shifting the competitive edge to those entities that can demonstrate when, how, and why PII will be used, and the mechanisms they’ve put in place to protect it.
Related risks
- Privacy breaches due to lack of governance around AI and third-party solutions
- Emerging legislative impacts on the use of third-party data sets may increase non-compliance risk
- Impacts of transparency requirements on cross-channel management of user preferences and experience
- Increasing insider risks due to remote/hybrid work environments and turnover
- Regulatory non-compliance
Key questions to ask
- Are you confident that private and confidential data is kept in a secure location with sufficient controls?
- Do you understand which privacy laws apply to your organization and the related compliance requirements?
- Has your organization identified where all critical data is stored and how it moves between systems and jurisdictions?
- Do you know if your private and confidential data may have lost integrity and/or adequate segmentation in the transition to new digital systems (i.e., the cloud)?
- Is it possible for a third party to gain inappropriate access to your critical data?
- Does your organization have a policy concerning the use of AI tools such as ChatGPT and the management of confidential or personally identifiable information?
Red flags
- Inability to isolate personally identifiable information and confidential data
- No training or policy related to data governance and controls
- No policy concerning AI tools such as ChatGPT
- Minimal use of data analytics for decision support or risk assessment because the data is private and confidential
- History of data breaches