In brief

The Personal Data Protection Commission (PDPC) has issued the finalized Advisory Guidelines on the Use of Personal Data in AI Recommendation and Decision Systems ("Guidelines"). These Guidelines provide guidance on the use of personal data during three stages of AI system implementation: development, deployment (business-to-consumer) and procurement (business-to-business). In particular, the Guidelines clarify and elaborate on the application of the Consent Obligation and Notification Obligation, and their exceptions, under the Personal Data Protection Act (PDPA) to the use of personal data in AI systems.
Key takeaways

  • When developing AI systems, organizations should consider whether the business improvement and research exception may be relied upon instead of seeking consent. In addition, appropriate data protection measures must be put in place.
  • When deploying AI systems in consumer-facing products and services, such as in recommendation and decision systems, meaningful consent must be obtained. Alternatively, organizations can consider whether the legitimate interests exception may be relied upon.
  • Data intermediaries that develop or deploy bespoke AI systems for organizations controlling personal data should assist these organizations in meeting their obligations under the PDPA, such as by data mapping, data labeling and maintaining a data provenance record, which appear to be beyond the current best practices for data intermediaries.
  • Such data intermediaries are encouraged to assist organizations in meeting their notification, consent and accountability obligations, although they are not themselves subject to those obligations.

In more detail

Background

In July 2023, the PDPC carried out a public consultation on the proposed advisory guidelines on the use of personal data in AI systems (see our newsletter from July 2023), which closed on 31 August 2023.

On 1 March 2024, the PDPC issued a closing note to the public consultation to address issues raised and simultaneously issued the Guidelines.

The closing note summarizes the key matters raised by respondents and the PDPC’s responses to these matters. Among others, the PDPC stated as follows:

  • The Guidelines will assist organizations in applying exceptions to the consent obligation under the PDPA, such as the business improvement and research exceptions. Examples may be included, but are not intended to be exhaustive.
  • The PDPA will not override requirements in sectoral regulations, such as any that may apply to the financial sector.
  • Following requests for clarification, the PDPC has included guidance on the application of the legitimate interests exception.
  • The section on best practices for service providers is intended to apply to system integrators and bespoke solution providers rather than software-as-a-service companies.
  • Guidance on allocation of liability in relation to AI systems outside data protection is beyond the scope of the PDPA and would not be proper to include in the Guidelines.

Scope

The Guidelines cover situations involving the design or deployment of AI systems that interact with personal data. The Guidelines aim to (i) clarify how the PDPA applies when organizations use personal data to develop and train AI systems; and (ii) provide baseline guidance and best practices for organizations on how to be transparent about whether and how their AI systems use personal data to make recommendations, predictions or decisions.

The Guidelines cover the use of personal data in three main stages of AI system implementation, as follows:

  • Development, testing and monitoring of AI systems
  • Deployment: collecting and using personal data in AI systems (business-to-consumer)
  • Procurement: provision of bespoke AI systems (business-to-business)

Use of personal data in AI system development, testing and monitoring

Business improvement and research exceptions

The business improvement exception and research exception may be relied upon, instead of seeking consent, where appropriate. In general, the business improvement exception is relevant where the organization has developed a product, or is enhancing an existing product, and it also caters for intragroup and intracompany sharing. The research exception is relevant where the organization is conducting commercial research to advance science and engineering without a product development roadmap.

The Guidelines provide several factors for organizations to consider in determining whether to rely on the exceptions. Some pertinent factors include (but are not limited to) the following:

  • For the business improvement exception:
    • Whether using personal data for this purpose contributes toward improving the effectiveness or quality of the AI systems
    • Common industry practices or standards on how to develop, test and monitor such AI systems or ML models
  • For the research exception:
    • The use of personal data helps develop more effective methods to improve quality or performance of the AI system
    • Developing industry practices or standards for the development and deployment of AI systems or ML models

The usual restrictions and conditions for relying on the business improvement and research exceptions will apply: (i) for the business improvement exception, the organization must ensure that the business improvement purposes cannot reasonably be achieved without using the personal data in an individually identifiable form; and (ii) for the research exception, there must be a clear public benefit to using the personal data for the research purpose.

Data protection considerations

The Guidelines remind organizations that appropriate technical, process or legal controls for data protection should be included during the AI system development process. The practice of data minimization will also reduce unnecessary risks. In deciding the kinds of controls to be implemented, companies can consider (i) the types of risks that the personal data would be subject to and (ii) the sensitivity and volume of the personal data used.

Best practices may include pseudonymization or de-identification where appropriate. Where this is not possible, organizations are encouraged to conduct a data protection impact assessment. Even for anonymized data (which falls outside the scope of the PDPA), a risk of re-identification remains, and appropriate controls should therefore be implemented.

Use of personal data in AI system deployment

The Guidelines address the situation where organizations deploy AI systems that collect and use personal data to provide new functionalities or enhance product features, such as to provide recommendations to users. The consent obligation and notification obligation under the PDPA will apply.

To obtain users' informed and meaningful consent, notifications should be sufficiently detailed, but need not be overly technical. The information provided can also be "layered," such that the most relevant information is presented prominently, while further details are provided elsewhere.

The legitimate interests exception may also apply; this generally refers to any lawful interests of an organization. Several specific purposes are additionally defined to constitute "legitimate interests" under the PDPA. In particular, the Guidelines provide that the detection or prevention of illegal activity would be such a purpose. Where this exception applies, personal data may be processed without consent.

Finally, the Guidelines also encourage organizations to ensure that they have sufficiently discharged the accountability obligation. The use of AI systems should be made known to users, and the level of detail provided should be proportionate to the risks of each use case. To build consumer confidence, measures taken to safeguard personal data and ensure fairness of recommendations could also be disclosed to users preemptively, rather than only upon request.

Use of personal data in AI system procurement

The final section of the Guidelines addresses service providers, such as systems integrators, that are engaged by organizations for professional services for the development and deployment of bespoke or fully customizable AI systems. It does not apply to organizations that develop their own AI systems in-house or those that use off-the-shelf solutions.

Such service providers could constitute data intermediaries under the PDPA, and it is good practice for them to adopt measures such as data mapping, data labeling and maintaining a data provenance record. While these measures will support data intermediaries in assessing whether there has been unauthorized access to, or modification of, training data sets, they appear to go beyond the current best practices for data intermediaries.

In addition, although they are not subject to these obligations themselves, service providers are also encouraged to support their clients in meeting their notification, consent and accountability obligations. This is because these clients may rely on the technical expertise of the service providers to meet their obligations under the PDPA. This recommendation appears to be beyond the obligations currently imposed on data intermediaries pursuant to the PDPA.

* * * * *

For further information and to discuss what this might mean for you, please get in touch with your usual Baker McKenzie contact.

Author

Andy Leck is the head of the Intellectual Property and Technology (IPTech) Practice Group and a member of the Dispute Resolution Practice Group in Singapore. He is a core member of Baker McKenzie's regional IP practice and also leads the Myanmar IP Practice Group. Andy is recognised by reputable global industry and legal publications as a leader in his field. He was named on "The A-List: Singapore's Top 100 lawyers" by Asia Business Law Journal 2018. In addition, Chambers Asia Pacific notes that Andy is "a well-known IP practitioner who is highlighted for his record of handling major trade mark litigation, as well as commercial exploitation of IP rights in the media and technology sectors. He's been in the industry for a long time and has always been held in high regard. He is known to be very fair and is someone you would like to be in the trenches with you during negotiations." Furthermore, Asian Legal Business acknowledges Andy as a leading practitioner in his field and notes that he “always gives good, quick advice, [is] client-focused and has strong technical knowledge for his areas of practice.” Andy was appointed by the Intellectual Property Office of Singapore (IPOS) as an IP Adjudicator to hear disputes at IPOS for a two-year term from April 2021. He has been an appointed member of the Singapore Copyright Tribunal since May 2010 and a mediator with the WIPO Arbitration and Mediation Center. He is also appointed as a Notary Public & Commissioner for Oaths in Singapore. He previously served on the International Trademark Association’s Board of Directors and was a member of the executive committee.

Author

Ren Jun Lim is a principal with Baker McKenzie Wong & Leow. He represents local and international clients in both contentious and non-contentious intellectual property matters. He also advises on a full range of healthcare, as well as consumer goods-related legal and regulatory issues. Ren Jun co-leads Baker McKenzie Wong & Leow's Healthcare as well as Consumer Goods & Retail industry groups. He sits on the Law Society of Singapore IP Committee and on the Executive Committee of the Association of Information Security Professionals. He is also a member of the Vaccines Working Group, Singapore Association of Pharmaceutical Industries, a member of the International Trademark Association, as well as a member of the Regulatory Affairs Professionals Association. Ren Jun is ranked in the Silver tier for Individuals: Enforcement and Litigation and Individuals: Prosecution and Strategy, and a recommended lawyer for Individuals: Transactions by WTR 1000, 2020. He is also listed in Asia IP's Best 50 IP Expert, 2020, recognised as a Rising Star by Managing IP: IP Stars, 2019 and one of Singapore's 70 most influential lawyers aged 40 and under by Singapore Business Review, 2016. Ren Jun was acknowledged by WTR 1000 as a "trademark connoisseur who boasts supplementary knowledge of regulatory issues in the consumer products industry." He was also commended by clients for being "very responsive to enquiries and with a keen eye for detail, he is extremely hands-on. His meticulous and in-depth approach to strategising is key to the excellent outcomes we enjoy."

Author

Ken Chia is a member of the Firm’s IP Tech, International Commercial & Trade and Competition Practice Groups. He is regularly ranked as a leading TMT and competition lawyer by top legal directories, including Chambers Asia Pacific and Legal 500 Asia Pacific. Ken is an IAPP Certified International Privacy Professional (FIP, CIPP(A), CIPT, CIPM) and a fellow of the Chartered Institute of Arbitrators and the Singapore Institute of Arbitrators.