Hello, my name is Katina Michael and I’m a professor of computing and information technology at the University of Wollongong in Australia. This chapter will focus on the role of privacy with respect to Information and Communications Technology (ICT) and the Sustainable Development Goals, and emphasize why establishing trust between stakeholders, particularly between governments and citizens, is the most important aspect of any ICT intervention.
In July 2015, the UN Human Rights Council appointed its first Special Rapporteur on the Right to Privacy. The motivation for doing so was a set of concerns related to security and surveillance, Big Data and open data, health data, and personal data processed by private corporations. The focus was really on the efficacy and proportionality of intrusive measures made possible by advances in ICT.
As governments across the world undergo digital transformation, privacy issues abound in the secure storage of citizens’ personal information. Consider this in the context of SDG 3, good health and wellbeing. Whether sensitive information pertains to one’s health status, criminal records, race or religious affiliation or home address, citizens have an expectation of privacy. But before we talk about the right to privacy as it relates to the digital age, I think it’s important to look at the history of this right and its importance as an issue of international concern.
Article 12 of the Universal Declaration of Human Rights identifies the right to privacy as a key principle that ensures freedom. It states: “No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honor and reputation.” The Universal Declaration of Human Rights was established in 1948, after the crimes committed by the Nazis in World War II were evidenced during the Nuremberg International Military Tribunal. The Reich Government kept copious and meticulous hand-written records on a variety of minority groups who were discriminated against based on race, religion, sexual orientation, mental health, and more. Now, although computing power was then in its nascent stage, relying greatly on the tabulation of punch cards, it was later discovered that the Hollerith machine was used by the Nazis to process census information that facilitated the identification of individuals who were later sent to the concentration camps and gas chambers.
I realize that this is a bleak way to start a discussion of the right to privacy, but it’s important to establish why privacy was included in the Universal Declaration of Human Rights: because people’s personal identifying information had been used against them. On the other hand, if we examine other acts of dehumanization that have taken place in the 20th and 21st centuries, they often begin with the removal of nationally recognized identification documents, like passports. If you don’t have a documented identity, you may be denied your rights as a citizen, because you can’t even prove that you are a citizen. Privacy, as it would have been defined in 1948, was basically the right to be left alone. But as ICTs began to permeate government and business, information privacy came to refer to the interest an individual has in controlling, or at least significantly influencing, the way data about themselves is handled and used. This might include sensitive information like: name, date of birth, age, sex, and address; current contact details of family and guardians; bank details; medical records; personal care issues; service records and file progress notes; individual personal plans, assessments or reports; guardianship orders; or even personal correspondence. Other information relating to ethnic or racial origin, political opinions, religious or philosophical beliefs, health, or sexual lifestyle should also be considered confidential, as it could be used against you if it falls into the wrong hands.
But in this day and age of mobile devices, social media, and online platforms that constantly leave behind digital footprints, how is it possible to maintain privacy? This is a particularly important question when websites do not disclose whether they are sharing your information with third parties. Default sharing settings or unclear terms and conditions can place individuals at risk of disclosing private information that they may prefer to keep confidential. First of all, it’s important to know who has your information. Most governments across the world store electronic records on their citizens: things such as tax records, electronic health records, even student identification records. The open data movement, which advocates for the free flow of data, sees value in making available data that has been funded by taxpayers, so it can contribute to the public good.
So governments are considering opening up some of this data so that it may be accessed by third parties who wish to create innovative services using de-identified information. De-identification aims to allow data to be used by others without the possibility of individuals being identified. Consumer data rights, basically the idea that you as a consumer should control the data collected about you, are meant to end the situation where an individual’s data is locked in with a legacy provider. These rights, in theory, offer individuals data portability between stakeholders of their choice, for example, service providers like your energy or electric company. In the context of the banking sector, the notion of an open banking framework has emerged, so that consumers will be able to access and safely transfer their banking data to trusted parties. Some non-government organizations are suspicious of the consumer data rights movement, claiming that these rights could actually be used to manipulate consumers. Take the energy company example again: by knowing exactly how much energy a consumer uses, much can be determined, including household activity inferred from the types of appliances in use and time-of-day data, for instance when someone is not at home or when someone chooses to sleep or rise. This is amplified when information about consumers is available publicly online, and big data analytics is able to draw from these various sources to make inferences about an individual’s pattern of life. This process is known as predictive profiling and may be used to sell more products.
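To make the idea of de-identification a little more concrete, here is a minimal, illustrative sketch in Python. Everything in it is hypothetical: the record fields, the salt value, and the generalization rules are invented for illustration, and real de-identification of government datasets requires far more care, since simple pseudonymization like this can still be defeated by linkage and re-identification attacks.

```python
import hashlib

# Hypothetical salt; in practice this would be a secret random value.
SALT = "example-secret-salt"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a salted one-way hash (pseudonym)."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:12]

def de_identify(record: dict) -> dict:
    """Drop or transform fields that could identify an individual."""
    return {
        "id": pseudonymize(record["name"]),         # direct identifier -> pseudonym
        "birth_year": record["dob"][:4],            # generalize full DOB to year only
        "postcode": record["postcode"][:2] + "**",  # coarsen location detail
        "energy_kwh": record["energy_kwh"],         # non-identifying measurement kept
    }

# Hypothetical consumer record, echoing the energy-company example above.
record = {"name": "Jane Citizen", "dob": "1980-06-15",
          "postcode": "2522", "energy_kwh": 412.7}
print(de_identify(record))
```

The design choice here is to keep the analytically useful measurement (energy use) while removing or coarsening the fields that point back to a person; the tension discussed above is that even coarsened data, combined with other public sources, can still support predictive profiling.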
On the flipside, this is the first time in the history of the world that we are able to gather and share information so quickly and in such quantities, and it can certainly be used for good. By harnessing the power of ICT, by crowdsourcing information from stakeholders around the world, and by gathering data from sensors embedded in smart devices, collective awareness can be used to improve people’s lives and achieve sustainable development. So what’s the bottom line? Your data is inevitably going to be collected by someone, somewhere. The question is, do you trust that your data will be used for you, and not against you, by government agencies and private corporations? In this age of data-driven decision-making, I’d say that trust is just as important as privacy, maybe even more so. You may be willing to give up control of your private data if you trust the person you’re giving control to. Without trust, explicit consent, transparency and accountability, even the most innovative ICT intervention will run into serious problems upon implementation.
Security breaches that occur from within governments, such as insider attacks by employees, or from outside attacks, can have devastating impacts on people. Problems exist particularly where privacy laws and controls are weak. Even a leading state in cybersecurity, like Singapore, can have its systems penetrated. In July 2018, Singapore had to disconnect computers at public healthcare centers from the internet after hackers compromised more than 1.5 million SingHealth patients’ personal information. Cyber attacks on national identity systems will become commonplace as more credentials are gathered and stored online. If citizen profiles make it onto the dark web, the implications of adopting emerging technologies before they have been tried and tested on large-scale populations will become apparent, and there will be a major backlash from citizens. The dark web refers to encrypted online content that is not indexed by conventional search engines; it is part of the deep web, a wider collection of content that doesn’t appear through regular internet browsing.
So what is the answer? Do we adopt new technologies to justly transform practices and reap the benefits of all this data? Or do we stick with traditional systems that have known vulnerabilities and learn to live with them? Perhaps what is of greatest importance is to treat privacy and security as functional aspects of any new system. All too often, engineers do not incorporate privacy and security by design for a product that will affect hundreds of millions of people. The long-standing myths are that we need to give up our personal privacy for public safety, and that we need to sacrifice privacy for data analytics.
Function creep, the gradual widening of the use of a technology or system beyond the purpose for which it was originally intended, is also a concern in services, such as when tax file numbers become de facto national ID numbers, or biometric systems are used retrospectively for unrelated aims. “After the fact” privacy intrusions do not grant citizens an opportunity to consent to mass-scale changes; rather, the changes are imposed on them without a consultation process. The information gathered through both public and private data may be used for good or ill, depending on the stakeholder. However, we cannot deal in “what-ifs” if we are to adhere to the ethical principles of the Universal Declaration of Human Rights.
At the present time, our laws are not keeping pace with information technology, so what may be considered legal might well be unethical. We are also witnessing transformative changes in state-society relations in many countries. Globalization and the associated range of economic, technological, social, and political developments have supported the rise of individualism, away from thinking in terms of the “public good”. So, to summarize: in this chapter we have reviewed concepts related to privacy and confidentiality in the context of human rights and the emergence of new Information and Communications Technologies and systems. Privacy and efficiency are equally important, and both should be considered in the design and implementation of any ICT system, particularly in large-scale government ICT projects rolled out to citizens.
Great emphasis needs to be placed on engaging civil society in order to develop ICT programs which are robust and trustworthy. No system is impenetrable, but we can reduce end-user vulnerability by working together to better understand the social implications of technology, being aware of the risks, and planning ahead as much as possible to ensure that ICT works for us, and not against us. Thank you.