Tech
March 22, 2024

Facing the future – A biometrics code of practice for New Zealand?

In a press release published late last year, Privacy Commissioner Michael Webster announced that the Office of the Privacy Commissioner (OPC) will release an exposure draft of a biometrics code of practice early this year (Proposed Code).

While the use and collection of biometric information is currently regulated under the Privacy Act 2020 (Privacy Act), the OPC wants to introduce stronger protections to mitigate the specific risks associated with the automated processing of biometric information.

What is biometric information?

The OPC defines “biometric information” as information about a person’s physical or behavioural characteristics (such as a person’s face, fingerprints, voice, or gait).

Because it relates directly to a person’s image and characteristics, which are tied to a person’s identity and are difficult (sometimes impossible) to change, the OPC considers biometric information to be particularly sensitive personal information.

What is biometric technology?

Biometric technology is automated technology that verifies or identifies people based on body characteristics such as fingerprints, facial patterns, retinas and voice patterns. These systems use algorithms to compare raw samples of biometric information (such as a photo or fingerprint) against biometric templates, which are digital mathematical representations of biometric information that has previously been collected. These systems include facial recognition, finger scanning and voice recognition technologies.
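To make that comparison step concrete, the sketch below shows, in very simplified form, how a system might compare a feature vector derived from a raw sample against previously enrolled templates. This is an illustrative example only, not any vendor’s actual implementation; the similarity measure, threshold values and data are hypothetical.

```python
# Illustrative sketch only: comparing a biometric "sample" against enrolled
# templates. Real systems derive these feature vectors from specialised models
# (face, fingerprint or voice); the vectors and thresholds here are made up.

from math import sqrt

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

def verify(sample: list[float], template: list[float], threshold: float = 0.9) -> bool:
    """1:1 verification - does the sample match the claimed identity's template?"""
    return cosine_similarity(sample, template) >= threshold

def identify(sample: list[float], templates: dict[str, list[float]],
             threshold: float = 0.9) -> str | None:
    """1:N identification - which enrolled person (if any) best matches the sample?"""
    best_id, best_score = None, threshold
    for person_id, template in templates.items():
        score = cosine_similarity(sample, template)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id

# Toy example with hypothetical enrolled templates.
enrolled = {"alice": [0.9, 0.1, 0.4], "bob": [0.1, 0.8, 0.5]}
print(verify([0.88, 0.12, 0.41], enrolled["alice"]))  # True  - matches the claimed identity
print(identify([0.12, 0.79, 0.52], enrolled))         # "bob" - best match above threshold
```

In practice the thresholds are tuned to trade off false matches against false rejections, which is one reason the accuracy and bias of these systems attracts regulatory attention.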

Biometric technology has transformed identity verification and security. Many of us use it daily to unlock our phones, access our digital wallets or avoid the queues at airport border control. Biometric technology can play an important role in improving policing and law enforcement outcomes (including by identifying suspects). And it has real potential benefit in commercial contexts, such as age verification to protect minors, identifying problem gamblers, and preventing retail crime to better protect customers and staff.

So, is there a problem?

The Privacy Act already generally applies to the collection and use of biometric information as personal information.

But the OPC has identified “key privacy risks” associated with biometric information:

• Unnecessary or high-risk collection and use (such as the risk of mass surveillance and profiling of individuals);

• Function and scope creep (where biometric information collected for one purpose is used for another); and

• A lack of control or knowledge about when and how biometric information is collected and used (including collecting information from people without their knowledge or involvement).

The OPC has also stated that the automated processing of biometric information poses new or increased privacy risks – specifically, that biometric technology can:

• Be used to track, monitor, or profile people in ways that are “intrusive, discriminatory or creepy”, often without their knowledge; and

• Misidentify or misclassify people, who can suffer disadvantage because of these decisions or mistakes.

The OPC has concluded that the level of risk and intrusiveness is not the same for all biometric recognition, and that there is higher risk to individuals when decision-making based on the use of biometric information is automated, removing human oversight.

A new code of practice

The Privacy Act gives the OPC the power to issue codes of practice that modify the operation of the Act and set specific rules for types of personal information. Based on public consultation, the OPC has concluded that the privacy risks associated with collecting and using biometric information in automated processing warrant a new code of practice that will vary the general obligations under the Privacy Act. This is a pivot from the OPC’s previously published position, in October 2021, that a new code of practice dealing with biometric technology was not needed.

The Proposed Code has not yet been released for consultation, but the OPC has published some information about what it will cover.

Scope of the Proposed Code

The Proposed Code will apply to all agencies regulated by the Privacy Act when they collect and use biometric information to verify, identify or classify individuals in or via automated processing.

The OPC has indicated that “automated processing” is likely to mean “using a technological system that employs an algorithm to compare or analyse biometric information”. This would include technology that automates the verification, identification, or categorisation of an individual, such as facial recognition, finger scanning or voice recognition technologies.

The Proposed Code will not apply to:

• The use of biometric information in manual processes (for example, photographs in archival collections);

• Health information (if this information is already covered by the Health Information Privacy Code 2020), genetic information or neurodata; and

• Any information that is not personal information (that is, not about an identifiable individual).

The OPC has decided that the Proposed Code will focus on three requirements:

Proportionality assessments: agencies must carefully consider whether their collection and use of biometric information in or via automated processes is proportionate to the potential privacy risks and intrusions that may occur. The OPC has previously stated its view that agencies need to be able to articulate a “strong business case” for using biometric technology in a targeted and proportionate way to meet the agency’s needs.

Additional transparency and notification requirements: mandatory obligations on agencies to be open and transparent about their collection and use of biometric information in or via automated processes (e.g., through clear signage notifying the public of its use).

Purpose limitations: restrictions on the collection and use of biometric information in or via automated processes in certain circumstances. The OPC is proposing to rule out some use cases for the automated processing of biometric information, such as targeted marketing, classifying people using prohibited grounds of discrimination, inferring someone’s emotional state, and detecting health information.

International regulation

Stricter regulation around when and how biometric information can be collected and used in or via automated processing – such as the Proposed Code – would bring New Zealand more in line with comparable jurisdictions like the EU and Australia.

For example, Article 22 of the European Union’s General Data Protection Regulation (GDPR) gives individuals the right not to be subject to a decision based solely on automated processing, including profiling, where that decision produces legal effects concerning them or similarly significantly affects them.

Under section 6 of the Australian Privacy Act 1988, both biometric information that is to be used for the purpose of automated biometric verification or biometric identification, and biometric templates, are included in the definition of “sensitive information”. In Australia, sensitive information may only be collected with consent, except in specified circumstances, whereas consent is generally not required to collect personal information that is not sensitive information.

Regulating biometric technology in New Zealand

Exactly how well a code dealing with biometric technology will work in practice in New Zealand will depend on the detail published in the exposure draft of the Proposed Code.

As the OPC has itself previously stated:

If designed well and used appropriately, biometric systems have significant benefits. These include convenience for individuals wanting to have their identity verified, efficiency for agencies seeking to identify people quickly and in large numbers, and security (because they use characteristics that cannot easily be faked, lost, or stolen). Biometric systems can also play a role in protecting privacy, by helping to guard against identity theft and fraud.

Biometric technology can also serve the legitimate purpose of detecting and preventing crime. In 2023, the Information Commissioner’s Office in the United Kingdom concluded that the live facial recognition technology provided to the retail sector by security company Facewatch served a legitimate purpose for using people’s biometric information and could be used in compliance with data protection legislation.

The ICO’s Deputy Commissioner for Regulatory Supervision stated:

Innovative solutions helping businesses prevent crime is in the public interest and a benefit to society. Data protection law recognises this, allowing personal information – in this case facial images – to be used if there is a legitimate interest, such as for the detection and prevention of crime. However, these benefits must always be balanced against the privacy rights of the individual.

The OPC has stated that it is keen to ensure that the Proposed Code is effective and workable, so it will need to balance these same concerns.

In addition to the three requirements already identified by the OPC, agencies looking to implement and use biometric technology will first need to understand, and clearly articulate in a Privacy Impact Assessment, the privacy impacts of their intended use case and how those impacts will be addressed and mitigated. For example, agencies will need to address the risk that the results of automated processing of biometric information are inaccurate or biased, such as false matches or inaccuracies linked to race or gender. Human oversight and the manual checking of results before significant decisions are made will be a critical safeguard.
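The OPC has not prescribed how that oversight should work, but the sketch below shows one possible “human in the loop” pattern under stated assumptions: automated matches are queued for manual review, and no significant decision proceeds until a person has confirmed the result. The class names, threshold and workflow are illustrative only.

```python
# Minimal sketch of a human-review safeguard for automated biometric matches.
# The threshold, field names and queue structure are illustrative assumptions,
# not requirements of the Privacy Act or the Proposed Code.

from dataclasses import dataclass, field

@dataclass
class MatchResult:
    person_id: str
    score: float                   # similarity score from the automated system
    reviewed: bool = False
    confirmed: bool | None = None  # set by a human reviewer

@dataclass
class ReviewQueue:
    pending: list[MatchResult] = field(default_factory=list)

    def submit(self, result: MatchResult, decision_threshold: float = 0.8) -> None:
        # Low-scoring results are discarded; everything else awaits human review.
        if result.score >= decision_threshold:
            self.pending.append(result)

    def review(self, result: MatchResult, confirmed: bool) -> None:
        # A trained reviewer confirms or rejects the match before any action is taken.
        result.reviewed = True
        result.confirmed = confirmed

def can_act_on(result: MatchResult) -> bool:
    """A significant decision may only proceed after human confirmation."""
    return result.reviewed and result.confirmed is True
```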

Agencies may also need to take steps to limit the personal data they collect and use in or via automated processing, and may need to ensure that vulnerable people, such as children and young people, are excluded from the automated processing of biometric information. Given the sensitive nature of biometric information, agencies can also expect specific guidance from the OPC about security and accuracy measures that should be adopted.

Next steps

The Proposed Code will be released this year for public comment. After this initial consultation, a more formal code consultation process will follow before any code is issued. If the decision is made to issue a code, we would not expect to see it finalised before late 2024.

If you have any questions in relation to the OPC’s Proposed Code, or the collection and use of biometric information within your organisation, please do not hesitate to contact us.

This article was co-authored with Ella Claridge.
