How technology is reshaping privacy

Last winter, the Royal Canadian Mounted Police faced scrutiny for its use of facial recognition technology developed by a tech startup called Clearview AI. Clearview’s software scrapes social media sites such as Facebook and Instagram for facial images and learns to recognize the faces they contain.

In January, a New York Times investigation revealed that the software had reportedly extracted more than three billion photos from public websites such as Facebook, YouTube and Instagram and poured them into a database that has been used by more than 600 law enforcement agencies in the United States, Canada and elsewhere.

In March, the federal Office of the Privacy Commissioner began an investigation into whether the RCMP’s use of Clearview AI software violated federal privacy law, and the House of Commons Standing Committee on Access to Information, Privacy and Ethics launched a probe into the implications of facial recognition technology. (The RCMP later said it would continue to use Clearview AI “in very limited and specific circumstances” and would work with Canada’s privacy commissioner on guidelines for the use of facial recognition technology.)

“AI is one of those technologies that has power for great good and great harm at the same time,” says Gillian Stacey, a partner at Davies Ward Phillips & Vineberg LLP in Toronto. “How it gets used has the potential to be discriminatory and to breach human rights.

“There are a lot of legal issues wrapped up in AI, and privacy is just one of them,” she adds. “The privacy commissioner won’t let go of this too soon.”

Enhancing privacy
As companies gather much more personal data than in the past, so, too, does their responsibility for that data grow, says Sunny Handa, national practice group leader of Blake Cassels & Graydon LLP’s technology and communications groups, from his Montreal office. “The law is changing, trying to catch up,” he says. “We’ve seen a real uptick in privacy laws in the last 20 years.”

The Office of the Privacy Commissioner of Canada is currently engaged in policy analysis on legislative reform of federal privacy laws, and it launched consultations with a submission deadline of March 13.

Examining the use of data in an artificial intelligence environment, including what goes into creating an algorithm, “makes a lot of people uneasy,” Handa says. “Because companies are developing this, they consider it proprietary, and if asked to expose this, are there liability issues involved? So, this is why it’s a consultation a lot of people are participating in.”

In May, Innovation, Science and Economic Development Canada created an advisory council on artificial intelligence and released its “Strengthening Privacy for the Digital Age” discussion paper. As part of Canada’s Digital Charter initiative, the paper outlined proposals to modernize the Personal Information Protection and Electronic Documents Act. “Strengthening Privacy for the Digital Age” focused on four key areas: i) enhancing individuals’ control; ii) enabling innovation; iii) enhancing enforcement; and iv) clarifying PIPEDA.

“Globally, what we’re seeing is influencing Canada,” says Kirsten Thompson, the national lead of the transformative technologies and data strategy group at Dentons Canada LLP in Toronto and a member of its privacy and cybersecurity group.

“The Digital Charter, at the federal level, will inform how PIPEDA will change,” she says. PIPEDA modernization is on the horizon, and B.C. and Quebec are also looking at statutory reviews of privacy legislation. “With the advent of the GDPR [General Data Protection Regulation] in the EU and UK, that has . . . become the new floor for privacy legislation,” in part because it is so comprehensive, she says.

“Most large multinational companies looked at their own legislation and said, ‘We’re just going to adopt GDPR regulations.’ From talking to various ministers, the key priority in Canada is ensuring that all privacy legislation, federally and provincially, is homogeneous,” says Thompson.

“Jurisdictions that previously had ad hoc, incomplete legislation are adopting legislation that looks a lot like GDPR.”

Stacey predicts “more overt government regulation. In Canada . . . the Privacy Commissioner has been pushing hard to modernize” privacy laws pertaining to technologies such as AI. In the United States, the California Consumer Privacy Act was enacted in 2018, and other states have since enacted legislation where there had been none before.

“It’s a global trend, I think, to actually have the government step in, recognizing that the markets are obviously not self-regulating, and give individuals privacy rights,” Stacey says.

“So, we will definitely see more regulation and heightened enforcement in Canada,” she adds, pointing to the Competition Bureau’s January announcement that it would take action against organizations that “make false or misleading statements about the type of data they collect, why they collect it and how they will use, maintain and erase it.” Statements that constitute unfair or deceptive marketing practices are subject to heavy administrative and other penalties under the Competition Act.

Canada’s privacy commissioner regularly asks Parliament for the power to levy penalties, Stacey says, and when Parliament does reform PIPEDA, it may include them. “But, in the short term, we can see the Competition Bureau stepping in to do it.”

Data, security and privacy rights are all interconnected issues. “They’re interconnected because, for some of the new technologies, like AI, what makes them work is vast amounts of data,” she says, “and quite often personal information. It’s an interconnected circle, and it’s one of the reasons the privacy commissioner has a subset focus on AI regulation.”

Protecting data
There’s a patchwork of protection for data, says Anthony de Fazekas, head of technology and innovation for Norton Rose Fulbright Canada LLP. “So, a lot of protection of data happens through contracts, whether it’s bilateral agreements or setting up a data consortium where everybody who participates by contributing or using data agrees to certain terms” regarding who will own the data and who will get to use it.

“So, having the appropriate contracts and terms in relation to data is extremely important,” de Fazekas says.

“A tech company may have very valuable data that it has created, let’s say insights or inferences, or analytics or AI technology that was trained using other data and wouldn’t exist but for it, but a lot of times the data in the enterprise is coming from . . . third-party sources.” Earlier data collection may have been done “where there wasn’t necessarily a comprehensive approach around privacy.”

And whether or not a company is subject to the GDPR, how it collects, processes and even stores data may still be subject to data localization laws, he adds. “A lot of times, as much as valuable data or products of data are being generated by the company, there isn’t really that robust an approach to evaluating data sourcing and managing what we often call data provenance.”

An underlying privacy breach could see a company blocked from commercializing those assets, in addition to facing public exposure and reputational damage, “because you’re making money from a product where you’re using sensitive personal information in an unauthorized way.”

As well, a lot of trading is being done in big data sets, including for facial recognition. “We’re seeing a lot more data being sold in various marketplaces,” which may become an area of risk for companies when that data commerce does not conform to laws and regulations.

“That’s an area which needs to get cleaned up.”

And there is “a lot of business” in cybersecurity at present, with more employees working remotely and at greater risk of being hacked. The incidence of hacking has taken off, he says, and companies are having to confront that in terms of privacy and data protection, becoming better at understanding what data they hold and who has access to it.

Adapting to the new ‘new’
Adapting can be tough, even though PIPEDA is technology-neutral legislation, meaning new technology can be folded into it, says Dentons’ Thompson. The problem, she says, is not the technologies themselves but “the wholesale change in how we conduct ourselves and how we conduct business that’s instigated by these technologies.”

So, for example, PIPEDA contemplates a direct collection of personal information in many respects and a decision-maker in some respects, Thompson says. “But many of these new technologies have different ways of operating [and] facilitate different business models that PIPEDA never contemplated. So, sometimes, it’s very, very tough to advise clients on where they fit into this scheme of things” and difficult for clients to comply.

PIPEDA, which applies to the collection, use and disclosure of personal information in the course of commercial activities, won’t necessarily apply to tech companies themselves but to their third-party customers, she adds. So, a service provider using voice technology on its telephone lines may be subject to PIPEDA, while the technology’s manufacturer will not be. “So that’s where they can get a bit of friction” in relation to PIPEDA, says Thompson.

All of this makes for a brisk business for privacy lawyers, says Handa. “It was a sleepy area for decades, and right now it’s a very hot area. And it’s going to continue to be; data is the currency of the future.”