Biometrics and Applied Data Ethics

Session 2: Emerging Concerns for Responsible Data Analytics: Trust, Fairness, Transparency and Discrimination

Abstract: The privacy assessment framework governing uses of information about individuals is now mature, understood and applied by responsible data custodians. However, this framework is incomplete. The framework focusses assessment of data analytics projects upon responsible management of personal information about individuals. But while privacy is a key concern, it is not the only important one. A narrow focus upon privacy diverts attention away from emerging concerns as to maintenance of consumer trust, fairness of outcomes and avoidance of adverse impacts from uses of insights derived from data analytics. A building torrent of papers canvasses concerns as to social equity, corporate social responsibility, lack of ‘transparency’, ‘unaccountable algorithms’, unethical practices and the weighting of benefits for the many against detriments for a few. Often these concerns are grouped together by academic commentators under a rubric of ‘data ethics’. This commentary is often coupled with a call to action, exhorting responsible data custodians to apply ethical principles to identify and address ‘unethical’ outcomes of data analytics projects. But this discussion does not provide a practical process to assess and address these concerns. Nor do these calls take into account the project methodologies of complex data analytics projects. Some expert commentators, notably including Marty Abrams and Peter Cullen at the Information Accountability Foundation, have suggested a framework integrating ethical evaluation within now commonly accepted procedures for privacy impact assessments (PIAs). This paper suggests that although coordinated assessment of fairness, ethics and privacy is appropriate and often practical, integration into a single assessment is not practicable and will often be sub-optimal. This paper also suggests that it will often be appropriate to ensure that responsible assessment is conducted as to uses and applications of the outputs of data analytics projects (such as algorithms or insights) in circumstances where those outputs do not themselves constitute uses or disclosures of personal information that are (or should be) subject to privacy assessment. Algorithms may go into productive use in circumstances where limitations of the underlying data used to generate them are not understood and their reliability is affected by exogenous factors, where an algorithm is inherently biased, or where a particular application of an algorithm has an unfair discriminatory effect. The objective of this paper is to promote development of a framework for phased assessment of data analytics projects that encompasses ethical and fairness considerations, while not creating a dead weight of multiple detailed assessments. The framework must be targeted, agile and capable of application by responsible teams that are not ethicists. These teams should be empowered to ask, and seek answers to, sensible questions framed in plain language. The framework should draw upon learnings from the use of Human Research Ethics Committees (HRECs) in reviewing medical research projects, without inappropriately expanding the administrative, cost and time imposts of full HREC review. This paper explores a middle course, suggesting how well-considered and structured questions exploring the ethics and fairness of outputs and outcomes can be asked and retested at appropriate points through the phases of a complex data analytics project.

Mr Peter Leonard, Data Synergies

Biography: Peter Leonard is a data and technology business consultant and lawyer, and principal of Data Synergies, a new data commercialisation consultancy. Peter was a founding partner of Gilbert + Tobin. Following his retirement as a partner in 2017, he continues to assist Gilbert + Tobin as a consultant. Peter also chairs the Australian IoT (Internet of Things) Alliance’s Data Access, Use and Privacy work stream and the Law Society of New South Wales Privacy and Communications Committee. The IoT Alliance is Australia’s peak body bringing together industry and governments to address issues affecting IoT adoption and implementation. He also participates in the Australian Computer Society’s Data Taskforce, chaired by Dr Ian Oppermann, NSW Chief Data Scientist. Peter wishes to acknowledge the contributions of all Taskforce members, under Ian Oppermann’s energetic leadership, through taxing discussions within the Taskforce that canvassed many of the concepts explored in this paper.