Privacy law reform chasing rapid technological advance of AI and biometrics

International efforts have focused on protecting children, says lawyer

While businesses deploy ever-more advanced technology and AI drastically accelerates the cybercrime arms race, the regulatory landscape is expanding and being fine-tuned.

Bill C-26, An Act respecting cyber security, amending the Telecommunications Act and making consequential amendments to other Acts, finished the committee stage in the House of Commons on April 19.

Bill C-26 will be Canada’s first cybersecurity law, says Brent Arnold. While Canada’s privacy laws focus on protecting individual privacy rights, cybersecurity laws are aimed at protecting organizations and the country from cyberattacks.  

Ottawa proposed the legislation in 2022, and it has had a “bumpy ride” through Parliament, says Arnold, a partner in Gowling WLG’s advocacy department who specializes in cybersecurity and commercial litigation. After the bill took a year and a half to get through two readings, he opined at a recent conference that the feds were not taking the issue seriously enough, but it has since progressed quickly through the committee stage.

C-26 has two main functions, says Arnold. It amends the Telecommunications Act, allowing the federal government to prevent companies from using certain vendors or implementing certain products in the name of national security. He says this is intended to prevent controversies like the one that occurred a few years ago about whether to allow Chinese vendors to participate in building Canada’s 5G network.  

Arnold says the bill’s second aspect is more interesting to cybersecurity lawyers. C-26 would impose a new regime on any organization dealing in critical infrastructure over which the federal government has constitutional jurisdiction.

“What that’s going to require those organizations to do is essentially prove to the federal government that they’ve got some cybersecurity plans in place.” 

He says organizations will be required to share that information with the feds, who will either approve them or order improvements. These requirements carry significant monetary and potential criminal penalties.  

Bill C-27, the Digital Charter Implementation Act, 2022, is currently before the House of Commons Standing Committee on Industry and Technology. The proposed legislation includes three acts: the Consumer Privacy Protection Act, the Artificial Intelligence and Data Act, and the Personal Information and Data Protection Tribunal Act. In addition to replacing Canada’s 20-year-old private sector privacy law, the Personal Information Protection and Electronic Documents Act (PIPEDA), Bill C-27 would create new rules concerning the development of AI technology.

C-27 would also establish a tribunal for the enforcement of the Consumer Privacy Protection Act. The Personal Information and Data Protection Tribunal would review recommendations from the federal privacy commissioner and have the power to impose monetary penalties, with an appeal process available for the review of tribunal decisions.

According to Innovation, Science and Economic Development Canada (ISED), the new AI legislation is intended to “strengthen Canadians’ trust in the development and deployment of AI systems.” The legislation would establish an AI and Data Commissioner to support the ISED minister in executing ministerial responsibilities such as monitoring compliance, ordering third-party audits, and sharing information with other governmental agencies. C-27 also includes criminal penalties for unauthorized data collection and reckless or fraudulent AI deployment.

“We’re going to have a whole new privacy regime in this country,” says Arnold. 

While the federal government attempts to keep up with AI, the full-tilt advance of the technology is also raising the temperature in the cybersecurity world, he says. AI is allowing hackers to produce more effective attacks, and law enforcement and those working to protect digital systems are using it to detect those attacks.  

“It’s putting that whole balance between the good guys and bad guys on steroids.” 

Just as AI’s rapid development is attracting privacy law scrutiny, so too is the steadily advancing area of biometrics.

Kirsten Thompson, head of Dentons’ privacy and cybersecurity group, says organizations are increasingly rolling out the technology for things like identity verification as it improves and becomes more affordable.  

One issue with regulating biometrics is the lack of a clear definition. But generally, she says, biometrics is understood to consist of a biological marker that is turned into a mathematical sequence and associated with a unique individual.

“The question, then, is: What’s the biometric? Is it the original biology? Is it the math part? Is it the correlation of the math with the individual?” 

Some common forms of biometrics are face recognition, voice recognition, fingerprinting, and gait recognition.  

Thompson says a cellphone’s face recognition is known as a “one-to-one match.” However, she says a “one-to-many match,” where the mathematical signature of one individual is compared against a database to find its parallel, is more problematic for privacy law.  
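To make the distinction concrete, here is a minimal, hypothetical sketch in Python. The embedding size, similarity threshold, and all names are illustrative assumptions, not drawn from any real biometric system: a one-to-one match verifies a fresh sample against the single template enrolled on a device, while a one-to-many match searches an entire database of templates to identify whose it is.

```python
import numpy as np

# Illustrative only: each biometric sample is reduced to a fixed-length vector
# of numbers (the "math part" Thompson describes). Values here are random.
rng = np.random.default_rng(42)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two biometric templates (closer to 1.0 = more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# One-to-one match (verification): compare a live sample against the single
# template enrolled for the person the device expects to see.
enrolled_template = rng.normal(size=128)
live_sample = enrolled_template + rng.normal(scale=0.05, size=128)  # noisy re-capture
THRESHOLD = 0.9  # assumed acceptance threshold
is_verified = cosine_similarity(live_sample, enrolled_template) >= THRESHOLD

# One-to-many match (identification): compare the same sample against a whole
# database of templates to find its best parallel -- the scenario regulators
# flag, because it requires retaining a large store of biometric templates.
database = {f"person_{i}": rng.normal(size=128) for i in range(1000)}
database["person_42"] = enrolled_template
best_id = max(database, key=lambda pid: cosine_similarity(live_sample, database[pid]))

print(f"1:1 verified: {is_verified}; 1:N best match: {best_id}")
```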

“Privacy commissioners and other regulators get a little uncomfortable with the idea of either public-sector or private-sector companies having large databases of biometrics,” says Thompson. “First of all, it creates a risk in case it ever gets out into the wild.” 

Under PIPEDA and the provincial privacy laws in Alberta and BC, she says, biometric information is considered personal information and, likely, sensitive personal information. This means there are heightened security requirements, and the necessary consent will “almost certainly” be express consent. Organizations are required to disclose why they are collecting the information and what they will do with it. In Quebec, organizations must also notify the privacy regulator if they are using biometrics for identity authentication.

Organizations must also consider the fundamental principle of privacy law – that they must use the least invasive technique possible, says Thompson. The question when rolling out a biometric tool is whether a less invasive method could meet the objective.  

“This is where there’s friction between companies and the privacy commissioner. The privacy commissioner’s draft guidance says that cost savings and convenience in the reduction of customer friction is not going to be a sufficient rationale for using biometrics. It has to be more than that, given the trade-off with privacy.” 

Children’s privacy is a focus of privacy law reform in Canada and around the world, says Max Jarvie, a partner at Davies Ward Phillips & Vineberg LLP with expertise in technology, cybersecurity, data privacy, and life sciences.

Jarvie appeared as a witness before the Standing Committee on Industry and Technology last November while it considered Bill C-27. In his own session and in the earlier sessions he had watched, concern about children’s privacy was a “recurring theme,” including from Privacy Commissioner of Canada Philippe Dufresne.

The legislation states that the personal information of minors will be considered sensitive, which “unlocks all kinds of additional obligations,” says Jarvie. He adds that the bill has several exceptions to the right of deletion, but those exceptions do not apply to the personal information of minors.  

Child protection is also a theme internationally. Last year, France enacted the Digital Majority Law, which requires social networks to obtain a parent’s express consent before a child under the age of 15 registers for the service. Once the child is registered, the social network must activate a mechanism to monitor how much time minors spend on the service and provide them with regular updates.

France also recently introduced a new law to protect children’s privacy when it comes to the online dissemination of their photos. According to an article by the law firm Bird & Bird, the Children’s Image Rights Law empowers judges to bar a parent from posting a child’s image when the parents disagree on the issue. It also allows judges to restrict the posting of children’s images where publication compromises the child’s “dignity or moral integrity,” the article said.

“That’s one focus that I think is going to be a trend to watch for this year and in coming years,” says Jarvie. 

 
