Proposed privacy, AI legislation doesn’t limit business use of facial recognition, complain rights groups


New legislation limiting the use of facial recognition in Canada is needed according to civil liberties groups, who say proposed privacy and artificial intelligence laws now before Parliament are inadequate.

The call by the Right2YourFace Coalition comes in advance of the testimony Thursday of one member, the Canadian Civil Liberties Association, before the House of Commons Industry committee on Bill C-27, which includes the Consumer Privacy Protection Act (CPPA) and the Artificial Intelligence and Data Act (AIDA).

The CPPA would cover federally regulated businesses, as well as firms in provinces and territories that don’t have their own private sector privacy legislation. AIDA would regulate the use of “high impact” automated decision-making software.

Among its many failings, C-27 doesn’t have clear definitions and contains too many exemptions that can leave facial recognition unregulated, the coalition said in a statement Wednesday.

“Facial recognition technology is a powerful and invasive tool that is being used by actors across the public and private sectors, from law enforcement to shopping malls—and there’s little in the way of guardrails to protect us from it,” said Daniel Konikoff of the Civil Liberties Association.

“The way to pressure test a new law is to see if it will do a better job at protecting people across Canada than the old one. Exceptions in the Consumer Privacy Protection Act and public sector exclusions in the Artificial Intelligence and Data Act mean Bill C-27 fails that test, at a time when invasive facial recognition technology is gaining ground in private and public sector applications alike,” said Brenda McPhail, a member of the Right2YourFace steering committee.

In a letter to Innovation Minister François-Philippe Champagne, whose department is responsible for C-27, the coalition outlines five problems with the proposed legislation:

— the CPPA does not flag biometric information as sensitive information, and it does not define “sensitive information” at all. “This omission leaves some of our most valuable and vulnerable information — including the faces to which we must have a right — without adequate protections,” the coalition says.

The CPPA should include special provisions for sensitive information, and its definition should explicitly provide for enhanced protection of biometric data, including facial recognition images. The safest biometric data, the letter adds, is biometric data that does not exist;

— the CPPA’s exemption allowing businesses to skip notifying people that their personal information is being collected when it is done for “legitimate business purposes” is too broad, and will not protect consumers from private entities wishing to use facial-recognition technologies (FRTs);

— while AIDA covers “high impact systems,” it does not define what that term includes. “Leaving this crucial concept to be defined later in regulations leaves Canadians without meaningful basis from which to assess the impact of the Act, and FRT must be included,” says the coalition.

Note that Champagne has told the committee that the final version of AIDA would include a definition of “high impact” covering the processing of biometric information for identification without consent.

The letter acknowledges Champagne has produced some potential amendments to C-27. But, it complains, many lack concrete legislative language;

— AIDA does not apply to government institutions, including national security agencies that use AI for surveillance, and exempts private sector AI technology developed for use by those agencies. That creates “an unprecedented power imbalance,” says the coalition;

— AIDA focuses on the concept of individual harm, which excludes the impacts of FRT on communities at large.

As it stands now, the letter says, the CPPA is “unequipped to protect individuals and communities from the risks of FRT.”

The post Proposed privacy, AI legislation doesn’t limit business use of facial recognition, complain rights groups first appeared on IT World Canada.
Howard Solomon (https://www.itworldcanada.com)
Currently a freelance writer, I'm the former editor of ITWorldCanada.com and Computing Canada. An IT journalist since 1997, I've written for several of ITWC's sister publications including ITBusiness.ca and Computer Dealer News. Before that I was a staff reporter at the Calgary Herald and the Brampton (Ont.) Daily Times.

