BILETA Special Edition

There and back again: how target market determination obligations for financial products may incentivise consumer data profiling


ABSTRACT

Increasingly precise data profiling of consumers is one of the drivers of profit for the financial industry in the digital age. However, data profiling may also harm consumers, through data breaches, unfair pricing, digital manipulation, discrimination and the exclusion of vulnerable consumers. This is particularly problematic in the financial services context because of the consequences it has for consumers’ access to financial products. The focus of this article is target market determination (TMD) obligations for financial products and their interplay with data protection rules. It asks whether financial product governance rules, which require a TMD and the distribution of products within the target market, may further incentivise data profiling of consumers by financial services providers. I analyse this issue by looking at two sets of rules applicable to financial firms in Australia: (1) the new financial product governance rules, which came into force in October 2021, and (2) the data protection rules: the Australian Privacy Act 1988 and the GDPR. If these frameworks fail to strike a balance between the (surprisingly) competing consumer protection interests in the provision of appropriate financial products and in the use of consumers’ data for digital profiling, the new rules may backfire, resulting in unintended consumer harms.

Acknowledgements

This research was partially conducted while the author worked at the Centre for Law, Markets and Regulation (CLMR), UNSW. The author would like to acknowledge funding received from the CLMR and the Santander Financial Institute, Fundación UCEIF (Spain). The author thanks CLMR UNSW student interns Leo Wu and Claire Ascoli for taking part in this research; attendees of the BILETA Conference 2021; and UNSW Law & Justice colleagues Kayleen Manwaring, Scott Donald, John Morgan and Dimity Kingsford Smith for valuable feedback. However, all errors and omissions are the author’s own.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1 Cf. five ‘categories’: the type of client, their knowledge and experience, financial situation, risk tolerance, and objectives and needs, in ESMA 2018, para. 18.

2 Cf. ESMA 2017, para. 17.

3 EU’s Proposal for a Regulation of the European Parliament and of the Council Laying down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts, 21 April 2021, (COM/2021/206 final).

4 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1 (GDPR).

5 This is why Recital (37) of the EU’s AI Act Proposal considers the application of AI to credit scoring to be ‘high-risk’.

6 As it is being attempted with the EU’s Artificial Intelligence Act.

7 This is clearly true in the context of consumers’ social media engagement: despite 70% of Australian consumers distrusting the industry’s treatment of users’ personal information (OAIC 2020, 55), more than 90% regularly use social media (ACMA 2020, 9).

8 I.e. it is impossible to say how the input and training data influenced the outcome.

9 Such as wearables, connected cars and smart home devices, as well as private and public spaces with embedded sensors (such as workplaces, shopping centres and Internet-connected bus shelters).

10 Let us consider a theoretical example: after being provided a person’s grocery shopping as input data, a model proposes a certain, high premium for this person’s health insurance. However, we cannot know whether this is because the model considers the person’s diet unhealthy and therefore predicts a higher risk of certain diseases. This is true for some ‘black box’, or opaque, AI systems: if an algorithmic decision is based on information inferred and encoded within vectors, there is a risk of unobservable discrimination (a minimal illustrative sketch follows below).
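To illustrate the point about decisions encoded in latent vectors, the following is a minimal, hypothetical sketch in Python, not drawn from the article or from any real insurer’s system: a toy premium model compresses grocery-purchase features into unnamed latent components before quoting a price, so the quote cannot be traced back to a judgement about the person’s diet. All data, feature names and model choices are illustrative assumptions only.

# Minimal, hypothetical sketch of a "black box" premium quote (illustrative only).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical weekly grocery-spend features per customer:
# [fresh_produce, processed_snacks, alcohol, tobacco, ready_meals]
X = rng.uniform(0, 100, size=(500, 5))

# Synthetic premiums loosely tied to some purchase patterns, standing in
# for whatever risk relationship an insurer's model might have learned.
y = 800 + 2.5 * X[:, 1] + 3.0 * X[:, 3] - 1.0 * X[:, 0] + rng.normal(0, 20, 500)

# The regression only ever "sees" a 2-dimensional latent vector (PCA scores),
# not the named grocery categories.
model = make_pipeline(PCA(n_components=2), LinearRegression())
model.fit(X, y)

new_customer = np.array([[10.0, 80.0, 30.0, 40.0, 60.0]])
print("Quoted premium:", round(float(model.predict(new_customer)[0]), 2))

# The learned coefficients attach to anonymous latent components, so the
# quote cannot be read back as "the diet looked unhealthy": the basis of
# the decision is encoded within the vector, as the note describes.
print("Coefficients on latent components:",
      model.named_steps["linearregression"].coef_)

The point of the sketch is that once the inputs are re-expressed as latent components, neither the consumer nor an auditor can observe which underlying attribute (here, an inferred judgement about diet) drove the outcome.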

11 Despite its name, the Australian Privacy Act is predominantly data protection, rather than privacy, legislation.

12 Privacy Act s 15.

13 Treasury Laws Amendment (Design and Distribution Obligations and Product Intervention Powers) Bill 2018 (Cth) Revised Explanatory Memorandum, para 1.5.

14 Financial firms’ conduct is regulated in the Corporations Act and the Australian Securities and Investments Commission Act 2001 (Cth) (ASIC Act); for a more detailed discussion of these rules and the use of AI analytics, see Bednarz and Manwaring (2021c).

15 Emphasis added.

16 Emphasis added.
