Using a Social-Ethical Framework to Evaluate Location-Based Services



The idea for an Internet of Things has matured since its inception as a concept in 1999. People today speak openly of a Web of Things and People, and even more broadly of an Internet of Everything. As our relationships become more and more complex and enmeshed through the use of advanced technologies, we have pondered ways to simplify flows of communications, to collect meaningful data, and to use them to make timely decisions with respect to optimisation and efficiency. At their core, these flows of communications are pathways to registers of interaction, and tell the intricate story of outputs at various units of analysis: things, vehicles, animals, people, organisations, industries, even governments. In this trend toward evidence-based enquiry, data is the enabling force driving the growth of IoT infrastructure. This paper uses the case of location-based services, which are integral to IoT approaches, to demonstrate that new technologies are complex in their effects on society. Fundamental to IoT is the spatial element, and through this capability, the tracking and monitoring of everything, from the smallest nut and bolt, to the largest shipping liner, to the mapping of planet earth, and from the whereabouts of the minor to that of the prime minister. How this information is stored, who has access to it, and what they will do with it remain open and contested questions. In this case study of location-based services we concentrate on control and trust, two overarching themes that have been very much neglected, and use the outcomes of this research to inform the development of a socio-ethical conceptual framework that can be applied to minimise the unintended negative consequences of advanced technologies. We posit that it is not enough to claim objectivity through information ethics approaches alone, and present instead a socio-ethical impact framework. Sociality therefore binds together that higher ideal of praxis where the living thing (e.g. human) is the central and most valued actor of a system.


Introduction 1.1

Control 1.2

Surveillance 1.2.1

Common surveillance metaphors 1.2.2

Applying surveillance metaphors to LBS 1.2.3

‘Geoslavery’ 1.2.4

From state-based to citizen level surveillance 1.2.5

Dataveillance 1.2.6

Risks associated with dataveillance 1.2.7

Loss of control 1.2.8

Studies focussing on user requirements for control 1.2.9

Monitoring using LBS: control versus care? 1.2.10

Sousveillance 1.2.11

Sousveillance, ‘reflectionism’ and control 1.2.12

Towards überveillance 1.2.13

Implications of überveillance on control 1.2.14

Comparing the different forms of ‘veillance’ 1.2.15

Identification 1.2.16

Social sorting 1.2.17

Profiling 1.2.18

Digital personas and dossiers 1.2.19

Trust 1.3

Trust in the state 1.3.1

Balancing trust and privacy in emergency services 1.3.2

Trust-related implications of surveillance in the interest of national security 1.3.3

Need for justification and cultural sensitivity 1.3.4

Trust in corporations/LBS/IoT providers 1.3.5

Importance of identity and privacy protection to trust 1.3.6

Maintaining consumer trust 1.3.7

Trust in individuals/others 1.3.8

Consequences of workplace monitoring 1.3.9

Location-monitoring amongst friends 1.3.10

Location tracking for protection 1.3.11

LBS/IoT is a ‘double-edged sword’ 1.3.12

Discussion 1.4

The Internet of Things (IoT) and LBS: extending the discussion on control and trust 1.4.1

Control- and trust-related challenges in the IoT 1.4.2

Ethical analysis: proposing a socio-ethical conceptual framework 1.4.3

The need for objectivity 1.4.4

Difficulties associated with objectivity 1.4.5

Conclusion 1.5


Introduction 1.1

Locative technologies are a key component of the Internet of Things (IoT). Some scholars go so far as to say location is the single most important component enabling the monitoring and tracking of subjects and objects. Knowing where something or someone is can be of greater importance than knowing who they are, because once located, it or they can be found, independent of what or who they are. Location also grants us a unique position on the earth’s surface, providing one of the vital pieces of information forming the distance, speed and time matrix. A unique ID, formed around an IP address in an IoT world, presents us with the capability to label every living and non-living thing and to recollect it on demand, adding to its history over its longer-term physical lifetime. But without knowing where something is, even if we know that some maintenance action is required, we cannot be responsive. Since the introduction of electronic databases, providing accurate records for transaction processing has been a primary aim. Today, however, we are attempting to increase visibility using high-resolution geographic details, we are contextualising events through discrete and sometimes continuous sensor-based rich audio-visual data collection, and we are observing how mobile subjects and objects interact with the built environment. We are no longer satisfied with an approach that merely identifies all things; we wish to be able to recollect or activate them on demand, understand associations and affiliations, and create a digital chronicle of each thing’s history to provide insights toward sustainability.
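The pairing of a unique ID with a time-stamped location trail, enabling a thing to be recollected on demand and its chronicle extended over its lifetime, can be illustrated with a minimal sketch (the class, field names and sample identifier below are hypothetical, not drawn from any particular IoT platform):

```python
from dataclasses import dataclass, field

@dataclass
class TrackedThing:
    """A 'thing' in an IoT registry: a unique ID plus its location chronicle."""
    uid: str                                       # e.g. an IPv6 address
    history: list = field(default_factory=list)    # time-stamped (t, lat, lon) fixes

    def record_fix(self, t, lat, lon):
        """Append a location fix, extending the thing's digital chronicle."""
        self.history.append((t, lat, lon))

    def last_known_location(self):
        """Recollect the thing on demand: where was it most recently seen?"""
        return self.history[-1][1:] if self.history else None

# A registry keyed by unique ID lets any labelled thing be found again,
# independent of what or who it is.
registry = {}
bolt = TrackedThing(uid="2001:db8::17")
registry[bolt.uid] = bolt
bolt.record_fix("2015-06-01T09:00Z", -34.4054, 150.8784)
bolt.record_fix("2015-06-01T12:00Z", -34.4278, 150.8931)
```

The sketch makes the point in the text concrete: without the `history` trail, the registry can still say what a thing is, but not where to find it or where it has been.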

There is thus mounting pressure on the ethical justification for social and behavioural tracking of people and things in everyday life. Simply because we have the means to do something does not mean we should do it. We are told that through this new knowledge gained from big data we can reduce carbon emissions, eradicate poverty, grant all people equity in health services, better provision for expected food shortages, and utilise energy resources optimally; in short, make the world a better place. This utopian view might well be the vision that the tech sector wishes to adopt as an honourable marketing strategy, but the reality of thousands of years of history tells us that technology does not, of its own accord, necessarily make things better. In fact, it has often made some aspects of life, such as conflict and war, much worse through the use of modern, sophisticated techniques. We could argue that IoT will allow for care-based surveillance that will bring aid to individuals and families according to need, but the reality is that wherever people are concerned, technology may be exploited as a means of control. Control on its own is not necessarily an evil; it all depends on how the functionality of a given technology is applied. Applied negatively, the recipient of this control orientation learns distrust instead of trust, which then causes a chain reaction throughout society, especially with respect to privacy and security. We need only look at the techniques espoused by some governments in the last 200 years to acknowledge that heinous crimes against humanity (e.g. democide) have been committed with new technological armaments (Rummel, 1997) to the detriment of the citizenry.

A socio-ethical framework is proposed as a starting point for seeking to understand the social implications of location services, applicable to current and future applications within IoT infrastructure. To stop at critiquing services using solely an information ethics-based approach is to fall short. Today’s converging services and systems require a greater scope of orientation to ask more generally how society may be affected at large, not just whether information is being collected, stored, and shared appropriately. To ask questions about how location services and IoT technology will directly and indirectly change society has far greater importance for the longer term vision of person-to-person and person-to-thing interactions than simply studying various attributes in a given register.

Studies addressing the social implications of emerging technologies, such as LBS, generally reflect on the risks and ethical dilemmas resulting from the implementation of a particular technology within a given social context. While numerous approaches to ethics exist, all are inextricably linked to ideas of morality, and an ability to distinguish good conduct from bad. Ethics, in simple terms, can be considered as the “study of morality” (Quinn 2006, p. 55), where morality refers to a “system of rules for guiding human conduct and principles for evaluating those rules” (Tavani 2007, p. 32). This definition is shared by Elliot and Phillips (2004, p. 465), who regard ethics as “a set of rules, or a decision procedure, or both, intended to provide the conditions under which the greatest number of human beings can succeed in ‘flourishing’, where ‘flourishing’ is defined as living a fully human life” (O'Connor and Godar 2003, p. 248).

According to the literature, there are two prominent ethical dilemmas that emerge with respect to locating a person or thing in an Internet of Things world. First, the risk of unauthorised disclosure of one’s location which is a breach of privacy; and second the possibility of increased monitoring leading to unwarranted surveillance by institutions and individuals. The socio-ethical implications of LBS in the context of IoT can therefore be explored based on these two major factors. IoT more broadly, however, can be examined by studying numerous social and ethical dilemmas from differing perspectives. Michael et al. (2006a, pp. 1-10) propose a framework for considering the ethical challenges emerging from the use of GPS tracking and monitoring solutions in the control, convenience and care usability contexts. The authors examine these contexts in view of the four ethical dimensions of privacy, accuracy, property and accessibility (Michael et al. 2006a, pp. 4-5). Alternatively, Elliot and Phillips (2004, p. 463) discuss the social and ethical issues associated with m-commerce and wireless computing in view of the privacy and access, security and reliability challenges. The authors claim that factors such as trust and control are of great importance in the organisational context (Elliot and Phillips 2004, p. 470). Similar studies propose that the major themes regarding the social implications of LBS be summarised as control, trust, privacy and security (Perusco et al. 2006; Perusco and Michael 2007). These themes provide a conceptual framework for reviewing relevant literature in a structured fashion, given that a large number of studies are available in the respective areas.

This article, in the first instance, focusses on the control- and trust-related socio-ethical challenges arising from the deployment of LBS in the context of IoT, two themes that are yet to receive comprehensive coverage in the literature. This is followed by an examination of LBS in the context of the Internet of Things (IoT), and the ensuing ethical considerations. A socio-ethical framework is proposed as a valid starting point for addressing the social implications of LBS and delivering a conceptual framework that is applicable to current LBS use cases and future applications within an Internet of Things world.

Control 1.2

Control, according to the Oxford Dictionary (2012a), refers to “the power to influence or direct people’s behaviour or the course of events”. With respect to LBS, this theme is examined in terms of a number of important concepts, notably surveillance, dataveillance, sousveillance and überveillance scholarship.

Surveillance 1.2.1

A prevailing notion in relation to control and LBS is the idea of exerting power over individuals through various forms of surveillance. Surveillance, according to sociologist David Lyon, “is the focused, systematic and routine attention to personal details for the purposes of influence, management, protection or direction,” although Lyon admits that there are exceptions to this general definition (Lyon 2007, p. 14). Surveillance has also been described as the process of methodically monitoring the behaviour, statements, associates, actions and/or communications of an individual or individuals, and is centred on information collection (Clarke 1997; Clarke 2005, p. 9).

The act of surveillance, according to Clarke (1988; 1997) can either take the form of personal surveillance of a specific individual or mass surveillance of groups of interest. Wigan and Clarke (2006, p. 392) also introduce the categories of object surveillance of a particular item and area surveillance of a physical enclosure. Additional means of expressing the characteristics of surveillance exist. For example, the phrase “surveillance schemes” has been used to describe the various surveillance initiatives available (Clarke 2007a, p. 28). Such schemes have been demonstrated through the use of a number of mini cases or vignettes, which include, but are not limited to, baby monitoring, acute health care, staff movement monitoring, vehicle monitoring, goods monitoring, freight interchange-point monitoring, monitoring of human-attached chips, monitoring of human-embedded chips, and continuous monitoring of chips (Clarke 2007c; Clarke 2007b, pp. 47-60). The vignettes are intended to aid in understanding the desirable and undesirable social impacts resulting from respective schemes.

Common surveillance metaphors 1.2.2

In examining the theme of control with respect to LBS, it is valuable to initially refer to general surveillance scholarship to aid in understanding the link between LBS and surveillance. Surveillance literature is somewhat dominated by the use of metaphors to express the phenomenon. A prevalent metaphor is that of the panopticon, first introduced by Jeremy Bentham (Bentham and Bowring 1843), and later examined by Michel Foucault (1977). Foucault’s seminal piece Discipline and Punish traces the history of punishment, commencing with the torture of the body in the eighteenth century, through to more modern forms of punishment targeted at the soul (Foucault 1977). In particular, Foucault’s account offers commentary on the notions of surveillance, control and power through his examination of Bentham’s panopticon, which are pertinent in analysing surveillance in general and monitoring facilitated by LBS in particular. The panopticon, or “Inspection-House” (Bentham and Bowring 1843, p. 37), refers to Bentham’s design for a prison based on the essential notion of “seeing without being seen” (p. 44). The architecture of the panopticon is as follows:

“The building is circular. The apartments of the prisoners occupy the circumference. You may call them, if you please, the cells... The apartment of the inspector occupies the centre; you may call it if you please the inspector's lodge. It will be convenient in most, if not in all cases, to have a vacant space or area all round, between such centre and such circumference.  You may call it if you please the intermediate or annular area” (Bentham and Bowring 1843, pp. 40-41).

Foucault (1977, p. 200) further illustrates the main features of the inspection-house, and their subsequent implications on constant visibility:

“By the effect of backlighting, one can observe from the tower [‘lodge’], standing out precisely against the light, the small captive shadows in the cells of the periphery. They are like so many cages, so many small theatres, in which each actor is alone, perfectly individualized and constantly visible...Full lighting and the eye of a supervisor [‘inspector’] capture better than darkness, which ultimately protected. Visibility is a trap.”

While commonly conceived as ideal for the prison arrangement, the panopticon design is applicable and adaptable to a wide range of establishments, including but not limited to work sites, hospitals, schools, or any establishment in which individuals “are to be kept under inspection” (Bentham and Bowring 1843, p. 37). It has been suggested, however, that the panopticon functions as a tool for mass (as opposed to personal) surveillance in which large numbers of individuals are monitored, in an efficient sense, by a small number (Clarke 2005, p. 9). This differs from the more efficient, automated means of dataveillance (to be shortly examined). In enabling mass surveillance, the panopticon theoretically allows power to be exercised continuously even in the absence of an inspector, because individuals, conscious of their permanent visibility, come to discipline themselves. Foucault (1977, pp. 202-203) provides a succinct summary of this point:

“He who is subjected to a field of visibility, and who knows it, assumes responsibility for the constraints of power; he makes them play spontaneously upon himself; he inscribes in himself the power relation in which he simultaneously plays both roles; he becomes the principle of his own subjection.”

This self-disciplinary mechanism functions similarly, and can somewhat be paralleled, to various notions in George Orwell’s classic novel Nineteen Eighty Four (Orwell 1949), also a common reference point in surveillance literature. Nineteen Eighty Four has been particularly influential in the surveillance realm, notably due to the use of “Big Brother” as a symbol of totalitarian, state-based surveillance. Big Brother’s inescapable presence is reflected in the nature of surveillance activities. That is, that monitoring is constant and omnipresent and that “[n]othing was your own except the few cubic centimetres inside your skull” (Orwell 1949, p. 29). The oppressive authority figure of Big Brother possesses the ability to persistently monitor and control the lives of individuals, employing numerous mechanisms to exert power and control over his populace as a reminder of his unavoidable gaze.

One such mechanism is the use of telescreens as the technological solution enabling surveillance practices to be applied. The telescreens operate as a form of self-disciplinary tool by way of reinforcing the idea that citizens are under constant scrutiny (in a similar fashion to the inspector’s lodge in the panopticon metaphor). The telescreens inevitably influence behaviours, enabling the state to maintain control over actions and thoughts, and to impose appropriate punishments in the case of an offence. This is demonstrated in the following excerpt:

“It was terribly dangerous to let your thoughts wander when you were in any public place or within range of a telescreen. The smallest thing could give you away. A nervous tic, an unconscious look of anxiety, a habit of muttering to yourself – anything that carried with it the suggestion of abnormality, of having something to hide. In any case, to wear an improper expression on your face (to look incredulous when a victory was announced, for example) was itself a punishable offence” (Orwell 1949, p. 65).

The Internet of Things, with its ability to locate and determine who or what is related to one another using a multiplicity of technologies, will enable authorities in power to infer what someone is likely to do in a given context. Past behavioural patterns can, for example, reveal a likely course of action with little to no prediction required. IoT in all its glory will provide complete visibility: the question is what are the risks associated with providing that kind of capability to the state or private enterprise? In scenario analysis we can ponder how IoT in a given context will be used for good, how it will be used for bad, and a neutral case where it will have no effect whatsoever because the data stream will be ignored by the system owner. While IoT has been touted as the ultimate in providing great organisational operational returns, one can see how it can lend itself to location-based tracking and monitoring in the manner of the panopticon metaphor. Paper records and registers were used during World War II for the purposes of segregation; IoT, and especially the ability to “locate on demand”, may well be used for similar types of control purposes.

Applying surveillance metaphors to LBS 1.2.3

The aforementioned surveillance metaphors can be directly applied to the case of LBS within IoT. In the first instance, it can be perceived that the exploitation of emerging technologies, such as LBS, extends the notion of the panopticon in a manner that allows for inspection or surveillance to take place regardless of geographic boundaries or physical locations. When applying the idea of the panopticon to modern technologies, Lyon suggests that “Bentham’s panopticon gives way to the electronic superpanopticon” (Lyon 2001, p. 108). With respect to LBS, this superpanopticon is not limited to and by the physical boundaries of a particular establishment, but is rather reliant on the nature and capabilities of the mobile devices used for ‘inspection’. In an article titled “The Panopticon's Changing Geography”, Dobson and Fisher (2007) also discuss the progress and various manifestations of surveillance technology, specifically the panopticon, and the consequent implications for power relationships. From Bentham's architectural design, to the electronic panopticon depicted by Orwell, to contemporary forms of electronic surveillance including LBS and covert human tracking, Dobson and Fisher (2007, pp. 308-311) claim that all forms of watching enable continuous surveillance, either as part of their primary or secondary purpose. They compare four means of surveillance: analogue technologies as used by spies, which have unlimited geographic coverage but are very expensive to own and operate; Bentham’s original panopticon, where the geographic view was internal to a building; George Orwell’s Big Brother view, which was bound by the extent of television cables; and finally human tracking systems, limited only by the availability and granularity of cell phone towers.

A key factor in applying the panopticon metaphor to IoT is that individuals, through the use of mobile location devices and technologies, will be constantly aware of their visibility and will assume the knowledge that an ‘inspector’ may be monitoring their location and other available information remotely at any given time. Mobile location devices may similarly replace Orwell’s idea of the telescreens as Big Brother’s primary surveillance technology, resulting in a situation in which the user is aiding in the process of location data collection and thereby surveillance. This creates, as maintained by Andrejevic (2007, p. 95), a “widening ‘digital enclosure’ within which a variety of interactive devices that provide convenience and customization to users double as technologies for gathering information about them.”

‘Geoslavery’ 1.2.4

Furthermore, in extreme situations, LBS may facilitate a new form of slavery, “geoslavery”, which Dobson and Fisher (2003, pp. 47-48) reveal is “a practice in which one entity, the master, coercively or surreptitiously monitors and exerts control over the physical location of another individual, the slave. Inherent in this concept is the potential for a master to routinely control time, location, speed, and direction for each and every movement of the slave or, indeed, of many slaves simultaneously.” In their seminal work, the authors flag geoslavery as a fundamental human rights issue (Dobson and Fisher 2003, p. 49), one that has the potential to somewhat fulfil Orwell's Big Brother prophecy, differing only in relation to the sophistication of LBS in comparison to visual surveillance and also in terms of who is in control. While Orwell’s focus is on the state, Dobson and Fisher (2003, p. 51) caution that geoslavery can also be performed by individuals “to control other individuals or groups of individuals.”

From state-based to citizen level surveillance 1.2.5

Common in both Discipline and Punish and Nineteen Eighty Four is the perspective that surveillance activities are conducted at the higher level of the “establishment”; that is, institutional and/or state-based surveillance. However, it must be noted that similar notions can be applied at the consumer or citizen level. Mark Andrejevic (2007, p. 212), in his book iSpy: Surveillance and Power in the Interactive Era, terms this form of surveillance as “lateral or peer-to-peer surveillance.” This form of surveillance is characterised by “increasing public access to the means of surveillance – not just by corporations and the state, but by individuals” (Andrejevic 2007, p. 212). Similarly, Barreras and Mathur (2007, pp. 176-177) state that wireless location tracking capabilities are no longer limited to law enforcement, but are open to any interested individual. Abbas et al. (2011, pp. 20-31) further the discussion by focussing on related notions, namely the implications of covert LBS-based surveillance at the community level, where technologies typically associated with policing and law enforcement are increasingly available for use by members of the community. With further reference to LBS, Dobson and Fisher (2003, p. 51) claim that the technology empowers individuals to control other individuals or groups, while also facilitating extreme activities. For instance, child protection, partner tracking and employee monitoring can now take on extreme forms through the employment of LBS (Dobson and Fisher 2003, p. 49). According to Andrejevic (2007, p. 218), this “do-it-yourself” approach assigns the act of monitoring to citizens. In essence, higher degrees of control are granted to individuals, thereby encouraging their participation in the surveillance process (Andrejevic 2007, pp. 218-222). It is important to understand IoT in the context of this multifaceted “watching”. IoT will not only be used by organisations and government agencies; individuals in a community will also be granted access to information at fine-grained units of aggregation. This has implications at a multiplicity of levels. Forces of control will be manifold.

Dataveillance 1.2.6

The same sentiments can be applied to the related, and to an extent superseding, notion of data surveillance, commonly referred to as dataveillance. Coined by Roger Clarke in the mid-eighties, dataveillance is defined as “the systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons” (Clarke 1988). Clarke (2005, p. 9) maintains that this process is automated and therefore a relatively economical activity when compared with other forms of surveillance, in that dataveillance activities are centred on examination of the data trails of individuals. For example, traditional forms of surveillance rely on expensive visual monitoring techniques, whereas dataveillance is largely an economically efficient alternative (Clarke 1994; 2001d, p. 11). Visual behavioural monitoring (that is, traditional surveillance) is an issue, but is nonetheless overshadowed by the challenges associated with dataveillance, particularly with reference to personal and mass dataveillance (Clarke 2005, pp. 9-10). That is, personal dataveillance presents risks to the individual based primarily on the potential for the collected data/information to be incorrect or outdated, while mass dataveillance is risky in that it may generate suspicion amongst individuals (Albrecht & Michael, 2013).

Risks associated with dataveillance 1.2.7

Clarke’s early and influential work on “Information Technology and Dataveillance” recognises that information technology is accelerating the growth of dataveillance, which presents numerous benefits and risks (Clarke 1988, pp. 498, 505-507). Clarke lists advantages in terms of safety and government applications, while noting the dangers associated with both personal and mass dataveillance (Clarke 1988, pp. 505-507). These risks can indeed be extended or applied to the use of location and tracking technologies to perform dataveillance activities, resulting in what can be referred to as “dataveillance on the move” (Michael and Michael 2012). The specific risks include: the ability for behavioural patterns to be exposed and cross-matched, potentially leading to revelations that may be harmful from a political and personal perspective; a rise in the use of “circumstantial evidence”; transparency of behaviour, resulting in the misuse of information relating to an individual’s conduct; and “actual repression of the readily locatable and trackable individual” (Clarke 2001b, p. 219). Emerging from this analysis, and that concerning surveillance and related metaphors, is the significant matter of loss of control.

Loss of control 1.2.8

Michael et al. (2006a, p. 2) state, in the context of GPS tracking, that the issue of control is a leading ethical challenge given the invasive nature of this form of monitoring. The mode of control can differ depending on the context. For instance, in the business context control may take the form of directing or ‘pushing’ advertisements to a specific individual, while at the personal/individual level it could signify control in the manner of “self-direction” (Perusco et al. 2006, p. 93). Other forms of social control can also be exercised by governments and organisations (Clarke 2003b), while emerging LBS solutions intended for the consumer sector extend the notion of control to community members (Abbas et al. 2011). This is an area that has not been adequately addressed in the literature. The subsequent risks to the individual are summarised in the following passage:

“Location technologies therefore provide, to parties that have access to the data, the power to make decisions about the entity subject to the surveillance, and hence exercise control over it. Where the entity is a person, it enables those parties to make determinations, and to take action, for or against that person’s interests. These determinations and actions may be based on place(s) where the person is, or place(s) where the person has been, but also on place(s) where the person is not, or has not been” (Wigan and Clarke 2006, p. 393).

Therefore, GPS and other location devices and technologies may result in decreased levels of control from the perspective of the individual being monitored. For example, in an article based on the use of scenarios to represent the social implications associated with the implementation of LBS, Perusco and Michael (2007) demonstrate the various facets of control in relation to LBS. The discussion is generally centred on the loss of control, which can be experienced in numerous ways, such as when a device does not operate accurately, or when an individual constantly monitors a family member in an attempt to care for them (Perusco and Michael 2007, pp. 6-7, 10). The authors raise valuable ideas with respect to control, such as the need to understand the purpose of control, the notion of consent, and methods to deal with location inaccuracies, amongst others (p. 14). Perusco and Michael further assert that control has a flow-on effect on other issues, such as trust, with the authors questioning whether it is viable to control individuals given the likely risk that trust may be relinquished in the process (p. 13).

Concurrent with loss of control, the issue of pre-emptive control with respect to LBS is a delicate one, specifically in relation to suspected criminals or offenders. Perusco et al. (2006, p. 92) state that the punishment of a crime is typically proportionate to the committed offence; thus the notion of pre-emptive monitoring can be considered fundamentally flawed, given that individuals are being punished without having committed an offence. Rather, they are suspected of being a threat. According to Clarke and Wigan (2011), a person is perceived as a threat based on their “personal associations”, which can be determined using location and tracking technologies to establish the individual’s location in relation to others, and thus control them based on such details. This is where IoT fundamentally comes into play. While location information can tell us much about where an individual is at any point in time, it is IoT that will reveal the inter-relationships, the frequency of interaction, and the specifics of measurable transactions. IoT is the layer that will allow things to be scrutinised in new ways.
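The mechanics of inferring “personal associations” from location data are straightforward, which is part of what makes pre-emptive monitoring so delicate. A minimal sketch (with invented trails and a deliberately crude proximity test, not any agency’s actual method) shows how co-presence falls out of two location registers with almost no computation:

```python
def co_locations(trail_a, trail_b, radius=0.001):
    """Return time slots where two trails place their subjects near one another.

    Each trail maps a time slot to a (lat, lon) fix; 'near' means both
    coordinates differ by less than `radius` degrees (a crude proximity test
    used purely for illustration).
    """
    shared = []
    for t, (lat_a, lon_a) in trail_a.items():
        if t in trail_b:
            lat_b, lon_b = trail_b[t]
            if abs(lat_a - lat_b) < radius and abs(lon_a - lon_b) < radius:
                shared.append(t)
    return shared

# Hypothetical trails: two subjects, two time slots each.
alice = {"09:00": (-34.4054, 150.8784), "12:00": (-34.4278, 150.8931)}
bob   = {"09:00": (-34.9285, 138.6007), "12:00": (-34.4279, 150.8930)}
meetings = co_locations(alice, bob)   # ["12:00"]
```

Once such pairwise associations are computed at scale, the inter-relationships and interaction frequencies described above become a simple matter of counting, which is precisely why access to aggregated location registers carries the control risks discussed in this section.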

This calls for an evaluation of LBS solutions that can be used for covert operations. Covert monitoring using LBS is often considered a useful technique, one that promotes less opposition than overt forms of monitoring, as summarised below:

“Powerful economic and political interests are seeking to employ location and tracking technologies surreptitiously, to some degree because their effectiveness is greater that way, but mostly in order to pre-empt opposition” (Clarke 2001b, p. 221).

Covert applications of LBS are increasingly available for the monitoring and tracking of social relations such as a partner or a child (Abbas et al. 2011). Regardless of whether covert or overt, using LBS for monitoring is essentially about control, irrespective of whether the act of controlling is motivated by necessity, or for more practical or supportive purposes (Perusco et al. 2006, p. 93). 

1.2.9 Studies focussing on user requirements for control

The control dimension is also significant in studies focussing on LBS users, namely, literature concerned with user-centric design, and user adoption and acceptance of LBS and related mobile solutions. In a paper focussing on understanding user requirements for the development of LBS, Bauer et al. (2005, p. 216) report on a user’s “fear” of losing control while interacting with mobile applications and LBS that may infringe on their personal life. The authors perceive loss of control to be a security concern requiring attention, and suggest that developers attempt to relieve the apprehension associated with increased levels of personalisation through ensuring that adequate levels of control are retained (Bauer et al. 2005, p. 216). This is somewhat supported by the research of Xu and Teo (2004, pp. 793-803), in which the authors suggest that there exists a relationship between control, privacy and intention to use LBS. That is, a loss of control results in a privacy breach, which in turn impacts on a user’s intention to embrace LBS.

The aforementioned studies, however, fail to explicitly incorporate the concept of value into their analyses. Due to the lack of literature discussing the three themes of privacy, value and control, Renegar et al. (2008, pp. 1-2) present the privacy-value-control (PVC) trichotomy as a paradigm beneficial for measuring user acceptance and adoption of mobile technologies. This paradigm stipulates the need to achieve harmony amongst the concepts of privacy, value and control in order for a technology to be adopted and accepted by the consumer. However, the authors note that perceptions of privacy, value and control are dependent on a number of factors or entities, including the individual, the technology and the service provider (Renegar et al. 2008, p. 9). Consequently, Renegar et al. conclude that privacy does not obstruct the process of adoption; rather, adoption must take into account the value proposition in addition to the amount of control granted.

1.2.10 Monitoring using LBS: control versus care?

The focus of the preceding sections has been on the loss of control, the dangers of pre-emptive control, covert monitoring, and user perspectives relating to the control dimension. However, this analysis should not be restricted to the negative implications arising from the use of LBS, but rather should incorporate both the control and care applications of LBS. For instance, while discussions of surveillance and the term in general typically invoke sinister images, numerous authors warn against assuming this subjective viewpoint. Surveillance should not be considered in itself as disagreeable. Rather, “[t]he problem has been the presumptiveness of its proponents, the lack of rational evaluation, and the exaggerations and excesses that have been permitted” (Clarke 2007a, p. 42). This viewpoint is reinforced in the work of Elliot and Phillips (2004, p. 474), and can also be applied to dataveillance.

The perspective that surveillance inevitably results in negative consequences such as individuals possessing excessive amounts of control over each other should be avoided. For instance, Lyon (2001, p. 2) speaks of the dual aspects of surveillance in that “[t]he same process, surveillance – watching over – both enables and constrains, involves care and control.”  Michael et al. (2006a) reinforce such ideas in the context of GPS tracking and monitoring. The authors claim that GPS tracking has been employed for control purposes in various situations, such as policing/law enforcement, the monitoring of parolees and sex offenders, the tracking of suspected terrorists and the monitoring of employees (Michael et al. 2006a, pp. 2-3). However, the authors argue that additional contexts such as convenience and care must not be ignored, as GPS solutions may potentially simplify or enable daily tasks (convenience) or be used for healthcare or protection of vulnerable groups (care) (Michael et al. 2006a, pp. 3-4). Perusco and Michael (2005) further note that the tracking of such vulnerable groups indicates that monitoring activities are no longer limited to those convicted of a particular offence, but rather can be employed for protection and safety purposes. Table 1 provides a summary of GPS tracking and monitoring applications in the control, convenience and care contexts, adapted from Michael et al. (2006a, pp. 2-4), identifying the potentially constructive uses of GPS tracking and monitoring.

Table 1: GPS monitoring applications in the control, convenience and care contexts, adapted from Michael et al. (2006a, pp. 2-4)


It is crucial that in evaluating LBS control literature and establishing the need for LBS regulation, both the control and care perspectives are incorporated. The act of monitoring should not immediately conjure up sinister thoughts. The focus should preferably be directed to the important question of purpose or motives. Lyon (2007, p. 3) feels that purpose may exist anywhere on the broad spectrum between care and control. Therefore, as expressed by Elliot and Phillips (2004, p. 474), a crucial factor in evaluating the merit of surveillance activities and systems is determining “how they are used.” These sentiments are also applicable to dataveillance. It is helpful at this point to discuss alternative and related practices that may incorporate location information throughout the monitoring process.

1.2.11 Sousveillance

The term sousveillance, coined by Steve Mann, comes from the French terms sous which means from below, and veiller which means to watch (Mann et al. 2003, p. 332). It is primarily a form of “inverse surveillance” (Mann et al. 2003, p. 331), whereby an individual is in essence “surveilling the surveillers” (p. 332). Sousveillance is reliant on the use of wearable computing devices to capture audiovisual and sensory data (Mann 2005, p. 625). A major concern with respect to sousveillance, according to Mann (2005, p. 637), is the dissemination of the recorded data which for the purposes of this investigation, may include images of locations and corresponding geographic coordinates.

1.2.12 Sousveillance, ‘reflectionism’ and control

Relevant to the theme of control, it has been argued that sousveillance can be utilised as a form of resistance to unwarranted surveillance and control by institutions. According to Mann et al. (2003, p. 333), sousveillance is a type of reflectionism in which individuals can actively respond to bureaucratic monitoring and to an extent “neutralize surveillance”. Sousveillance can thus be employed in response to social control in that surveillance activities are reversed:

“The surveilled become sousveillers who engage social controllers (customs officials, shopkeepers, customer service personnel, security guards, etc.) by using devices that mirror those used by these social controllers” (Mann et al. 2003, p. 337).

Sousveillance differs from surveillance in that traditional surveillance activities are “centralised” and “localized”, whereas sousveillance is dispersed in nature and “delocalized” in its global coverage (Ganascia 2010, p. 496). As such, sousveillance requires new metaphors for understanding its fundamental aspects. A useful metaphor proposed by Ganascia (2010, p. 496) for describing sousveillance is the canopticon, which can be contrasted to the panopticon metaphor. At the heart of the canopticon are the following principles:

“total transparency of society, fundamental equality, which gives everybody the ability to watch – and consequently to control – everybody else, [and] total communication, which enables everyone to exchange with everyone else” (Ganascia 2010, p. 497).

This exchange may include the dissemination of location details, thus signalling the need to incorporate sousveillance into LBS regulatory discussions. A noteworthy element of sousveillance is that it shifts the ability to control from the state/institution (surveillance) to the individual. While this can initially be perceived as an empowering feature, excessive amounts of control, if unchecked, may prove detrimental. That is, control may be granted to individuals to disseminate their location (and other) information, or the information of others, without the necessary precautions in place and in an unguarded fashion. The implications of this exercise are sinister in their extreme forms. When considered within the context of IoT, sousveillance ideals are likely compromised. Yes, I can fight back against state control and big brother with sousveillance but in doing so I unleash potentially a thousand or more little brothers, each with their capacity to (mis)use the information being gathered.

1.2.13 Towards überveillance

The concepts of surveillance, dataveillance and sousveillance have been examined with respect to their association with location services in an IoT world. It is therefore valuable, at this point, to introduce the related notion of überveillance. Überveillance, a term coined by M.G. Michael in 2006, can be described as “an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices in the human body” (Michael et al. 2006b; Macquarie Dictionary 2009, p. 1094). Überveillance combines the dimensions of identification, location and time, potentially allowing for forecasting and uninterrupted real-time monitoring (Michael and Michael 2007, pp. 9-10), and in its extreme forms can be regarded as “Big Brother on the inside looking out” (p. 10).

Überveillance is considered by several authors to be the contemporary notion that will supplant surveillance. For instance, Clarke (2007a, p. 27) suggests that the concept of surveillance is somewhat outdated and that contemporary discussions be focussed on the notion of überveillance. It has further been suggested that überveillance is built on the existing notion of dataveillance. That is, “[ü]berveillance takes that which was static or discrete in the dataveillance world, and makes it constant and embedded” (Michael and Michael 2007, p. 10). The move towards überveillance thus marks the evolution from physical, visual forms of monitoring (surveillance), through to the increasingly sophisticated and ubiquitous embedded chips (überveillance) (Michael & Michael 2010; Gagnon et al. 2013). Albrecht and McIntyre (2005), who describe these embedded chips as “spychips”, focused predominantly on RFID tracking of people through retail goods and services, and devote considerable space to the Internet of Things concept. Perakslis and Wolk (2006) studied the social acceptance of RFID implants as a security method, and Perakslis later went on to incorporate überveillance into her research on behavioural motivators and personality factors in the adoption of humancentric IoT applications.

Given that überveillance is an emerging term (Michael and Michael 2007, p. 9), diverse interpretations have been proposed. For example, Clarke (2007a) offers varying definitions of the term, suggesting that überveillance can be understood as any of the following: omni-surveillance, an apocalyptic notion that “applies across all space and all time (omnipresent), and supports some organisation that is all-seeing and even all-knowing (omniscient)”, which can be achieved through the use of embedded chips for instance (p. 33); exaggerated surveillance, referring to “the extent to which surveillance is undertaken... its justification is exaggerated” (p. 34); and/or meta-, supra-, or master-surveillance, which “could involve the consolidation of multiple surveillance threads in order to develop what would be envisaged by its proponents to be superior information” (p. 38). Shay et al. (2012) acknowledge:

“The pervasive nature of sensors coupled with recent advances in data mining, networking, and storage technologies creates tools and data that, while serving the public good, also create a ubiquitous surveillance infrastructure ripe for misuse. Roger Clarke’s concept of dataveillance and M.G. Michael and Katina Michael’s more recent uberveillance serve as important milestones in awareness of the growing threat of our instrumented world.”

All of these definitions indicate direct ways in which IoT applications can also be rolled out, whether for vehicle management in heavy traffic conditions, the tracking of suspects in a criminal investigation, or even of employees in a workplace. Disturbing is the manner in which a whole host of applications, particularly in tollways and public transportation, are being used for legal purposes without the knowledge of the driver and commuter. “Tapping” token cards is not only encouraged but mandatory at most metropolitan train stations of developed countries. Little do commuters know that the data gathered by these systems can be requested by a host of government agencies without a warrant.

1.2.14 Implications of überveillance on control

Irrespective of interpretation, the subject of current scholarly debate relates to the implications of überveillance on individuals in particular, and society in general. In an article discussing the evolution of automatic identification (auto-ID) techniques, Michael and Michael (2005) present an account of the issues associated with implantable technologies in humancentric applications. The authors note the evident trend of deploying a technology into the marketplace, prior to assessing the potential consequences (Michael and Michael 2005, pp. 22-33). This reactive approach causes apprehension in view of chip implants in particular, given the inexorable nature of embedded chips, and the fact that once the chip is accepted by the body, it is impossible to remove without an invasive surgical procedure, as summarised in the following excerpt:

“[U]nless the implant is removed within a short time, the body will adopt the foreign object and tie it to tissue. At this moment, there will be no exit strategy, no contingency plan, it will be a life enslaved to upgrades, virus protection mechanisms, and inescapable intrusion” (Michael and Michael 2007, p. 18).

Other concerns relevant to this investigation have also been raised. It is indicated that “über-intrusive technologies” are likely to leave substantial impressions on individuals, families and other social relations, with the added potential of affecting psychological well-being (Michael and Michael 2007, p. 17). Apart from implications for individuals, concerns requiring remedies also emerge at the broader social level. For instance, if a state of überveillance is to be avoided, caution must be exercised in deploying technologies without due reflection on the corresponding implications. Namely, this will involve the introduction of appropriate regulatory measures, encompassing proactive consideration of the social implications of emerging technologies, with individuals assuming responsibility for promoting regulatory measures (Michael and Michael 2007, p. 20). It will also require a measured attempt to achieve some form of “balance” (Clarke 2007a, p. 43). The implications of überveillance are of particular relevance to LBS regulatory discussions, given that “overarching location tracking and monitoring is leading toward a state of überveillance” (Michael and Michael 2011, p. 2). As such, research into LBS regulation in Australia must be sensitive to both the significance of LBS to überveillance and the anticipated trajectory of the latter.

Unfortunately the same cannot be said for IoT-specific regulation. IoT is a fluid concept, and in many ways IoT is nebulous. It is made up of a host of technologies that are being integrated and are converging over time. It comprises layer upon layer of infrastructure, emerging from the first telephone lines through to today’s cloud and wireless Internet. IoT requires new protocols and new applications, but it is difficult to point to a specific technology, application or system that can be subject to some form of external oversight. Herein lie the problems of potential unauthorised disclosure of data, or even misuse of data, when government agencies require private enterprise to act upon their requests, or when private enterprises work together in sophisticated ways to exploit the consumer.

1.2.15 Comparing the different forms of ‘veillance’

Various terms ending in ‘veillance’ have been introduced throughout this paper, all of which imply and encompass the process of monitoring. Prior to delving into the dangers of this activity and the significance of LBS monitoring on control, it is helpful to compare the main features of each term. A comparison of surveillance, dataveillance, sousveillance, and überveillance is provided in Table 2.

It should be noted that with the increased use of techniques such as surveillance, dataveillance, sousveillance and überveillance, the threat of becoming a surveillance society looms. According to Ganascia (2010, p. 491), a surveillance society is one in which the data gathered from the aforementioned techniques is utilised to exert power and control over others. This results in dangers such as the potential for identification and profiling of individuals (Clarke 1997), the latter of which can be associated with social sorting (Gandy 1993).

Table 2: Comparison of the different forms of ‘veillance’

1.2.16 Identification

Identity and identification are ambiguous terms with philosophical and psychological connotations (Kodl and Lokay 2001, p. 129). Identity can be perceived as “a particular presentation of an entity, such as a role that the entity plays in particular circumstances” (Clarke and Wigan 2011). With respect to information systems, human identification specifically (as opposed to object identification) is therefore “the association of data with a particular human being” (Kodl and Lokay 2001, pp. 129-130). Kodl and Lokay (2001, pp. 131-135) claim that numerous methods exist to identify individuals prior to performing a data linkage, namely, using appearance, social interactions/behaviours, names, codes and knowledge, amongst other techniques. With respect to LBS, these identifiers significantly contribute to the dangers pertaining to surveillance, dataveillance, sousveillance and überveillance. That is, LBS can be deployed to simplify and facilitate the process of tracking and be used for the collection of profile data that can potentially be linked to an entity using a given identification scheme. In a sense, LBS in their own right become an additional form of identification feeding the IoT scheme (Michael and Michael, 2013).
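Kodl and Lokay’s characterisation of identification as “the association of data with a particular human being” can be made concrete with a minimal record-linkage sketch. The register, field names and placeholder value below are purely illustrative assumptions:

```python
def link_records(location_records, id_register):
    """Associate each pseudonymous location record with a person by
    looking up its device identifier in an identity register; records
    with no match remain pseudonymous."""
    linked = []
    for rec in location_records:
        person = id_register.get(rec["device_id"])
        linked.append({**rec, "person": person or "<unidentified>"})
    return linked
```

The sketch shows why the identifier, not the location fix itself, is the pivotal datum: once a single code is linked to a name, every past and future record carrying that code becomes attributable to the individual.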

Thus, in order to address the regulatory concerns pertaining to LBS, it is crucial to appreciate the challenges regarding the identification of individuals. Of particular importance is the recognition that once an individual has been identified, they can be subjected to varying degrees of control. As such, in any scheme that enables identification, Kodl and Lokay (2001, p. 136) note the need to balance human rights with other competing interests, particularly given that identification systems may be exploited by powerful entities for control purposes, such as by governments to exercise social control. For an historical account of identification techniques, from manual methods through to automatic identification systems including those built on LBS, see Michael and Michael (2009, pp. 43-60). Civil libertarians and concerned individuals have also asserted, since at least the 1970s, that automatic identification (auto-ID) technology “impinges on human rights, the right to privacy, and that eventually it will lead to totalitarian control of the populace” (Michael and Michael 2009, p. 364). These views are also pertinent to the notion of social sorting.

1.2.17 Social sorting

In relation to the theme of control, information derived from surveillance, dataveillance, sousveillance and überveillance techniques can also serve the purpose of social sorting, labelled by Oscar Gandy (1993, p. 1) as the “panoptic sort.” Relevant to this discussion, the information may relate to an individual’s location. In Gandy’s influential work The Panoptic Sort: A Political Economy of Personal Information, the author relies on the work of Michel Foucault and other critical theorists (refer to pp. 3-13) in examining the panoptic sort as an “antidemocratic system of control” (Gandy 1993, p. 227). According to Gandy, in this system, individuals are exposed to prejudiced forms of categorisation based on both economic and political factors (pp. 1-2). Lyon (1998, p. 94) describes the database management practices associated with social sorting, classing them as a form of consumer surveillance in which customers are grouped by “social type and location.” Such clustering forms the basis for the exclusion and marginalisation of individuals (King 2001, pp. 47-49). As a result, social sorting is presently used for profiling of individuals and in the market research realm (Bennett and Regan 2004, p. 452).
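Computationally, grouping customers by “social type and location” amounts to keyed clustering over customer records. The segment labels, field names and postcodes in the sketch below are invented for illustration:

```python
from collections import defaultdict

def panoptic_sort(customers):
    """Group customer records by (segment, postcode) -- the 'social type
    and location' clustering described by Lyon. Once formed, such groups
    can be targeted or excluded wholesale rather than individually."""
    groups = defaultdict(list)
    for c in customers:
        groups[(c["segment"], c["postcode"])].append(c["name"])
    return dict(groups)
```

The triviality of the code is itself the point: the discriminatory potential lies not in any algorithmic sophistication, but in the choice of sorting keys and in what is subsequently done to each group.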

1.2.18 Profiling

Profiling “is a technique whereby a set of characteristics of a particular class of person is inferred from past experience, and data-holdings are then searched for individuals with a close fit to that set of characteristics” (Clarke 1993). The process is centred on the creation of a profile or model related to a specific individual, based on data aggregation processes (Casal 2004, p. 108). Assorted terms have been employed in labelling this profile. For instance, the model created of an individual using the data collected through dataveillance techniques has been referred to by Clarke (1997) as “the digital persona”, and is related to the “digital dossiers” idea introduced by Solove (2004, pp. 1-7). According to Clarke (1994), the use of networked systems, namely the internet, involves communicating and exposing data and certain aspects of, at times, recognisable behaviour, both of which are utilised in the creation of a personality.
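Clarke’s definition has a direct computational reading: infer a characteristic set from past cases, then search the data-holdings for close fits. A rough sketch, with invented numeric features, tolerance and match thresholds:

```python
def build_profile(past_cases, features):
    """Infer the characteristic set: the mean of each feature over past cases."""
    return {f: sum(c[f] for c in past_cases) / len(past_cases) for f in features}

def close_fits(holdings, profile, tolerance=0.5, min_match=0.8):
    """Search the data-holdings for individuals whose features fall within
    `tolerance` of the profile on at least `min_match` of the characteristics."""
    def fit(person):
        near = sum(1 for f in profile if abs(person[f] - profile[f]) <= tolerance)
        return near / len(profile)
    return [p for p in holdings if fit(p) >= min_match]
```

Note that every individual in the holdings is scored against the profile, whether or not they have done anything: this is the computational form of the pre-emptive suspicion discussed earlier in relation to control.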

1.2.19 Digital personas and dossiers

The resulting personality is referred to as the digital persona. Similarly, digital dossiers refer to the compilation of comprehensive electronic data related to an individual, utilised in the creation of the “digital person” (Solove 2004, p. 1), also referred to as “digital biographies” (Solove 2002, p. 1086), a notion Solove (2002) discusses further. In examining the need for LBS regulation throughout the globe, a given regulatory response or framework must appreciate the ease with which (past, present and future) location information can be compiled and integrated into an individual’s digital persona or dossier. Once such information is reproduced and disseminated, the control implications are magnified.

With respect to the theme of control, an individual can exercise a limited amount of influence over their digital persona, as some aspects of creating an electronic personality may not be within their direct control. The scope of this article does not allow for reflection on the digital persona in great detail; however, Clarke (1994) offers a thorough investigation of the term, and associated notions such as the passive and active digital persona, in addition to the significance of the digital persona to dataveillance techniques such as computer matching and profiling. Significant to this research, however, is the distinction between the physical and the digital persona and the resultant implications in relation to control, as summarised in the following extract:

“The physical persona is progressively being replaced by the digital persona as the basis for social control by governments, and for consumer marketing by corporations. Even from the strictly social control and business efficiency perspectives, substantial flaws exist in this approach. In addition, major risks to individuals and society arise” (Clarke 1994).

The same sentiments apply with respect to digital dossiers. In particular, Solove (2004, p. 2) notes that individuals are unaware of the ways in which their electronic data is exploited by government and commercial entities, and “lack the power to do much about it.” It is evident that profile data is advantageous for both social control and commercial purposes (Clarke 2001d, p. 12), the latter of which is associated with market research and sorting activities, which have evolved from ideas of “containment” of mobile consumer demand to the “control” model (Arvidsson 2004, pp. 456, 458-467). The control model in particular has been strengthened, but not solely driven, by emerging technologies including LBS, as explained:

“The control paradigm thus permits a tighter and more efficient surveillance that makes use of consumer mobility rather than discarding it as complexity. This ability to follow the consumer around has been greatly strengthened by new technologies: software for data mining, barcode scans, internet tracking devices, and lately location based information from mobile phones” (Arvidsson 2004, p. 467).

Social sorting, particularly for profiling and market research purposes, thus introduces numerous concerns relating to the theme of control, one of which is the ensuing consequences relating to personal privacy. This specifically includes the privacy of location information. In sum, examining the current regulatory framework for LBS in Australia, and determining the need for LBS regulation, necessitates an appreciation of the threats associated with social sorting using information derived from LBS solutions. Additionally, the benefits and risks associated with surveillance, dataveillance, sousveillance and überveillance for control must be measured and carefully contemplated in the proposed regulatory response.

1.3 Trust

Trust is a significant theme relating to LBS, given the importance of the notion to: (a) “human existence” (Perusco et al. 2006, p. 93; Perusco and Michael 2007, p. 10), (b) relationships (Lewis and Weigert 1985, pp. 968-969), (c) intimacy and rapport within a domestic relationship (Boesen et al. 2010, p. 65), and (d) LBS success and adoption (Jorns and Quirchmayr 2010, p. 152). Trust can be defined, in general terms, as the “firm belief in the reliability, truth, or ability of someone or something” (Oxford Dictionary 2012b). A definition of trust that has been widely cited in relevant literature is “the willingness of a party to be vulnerable to the actions of another party based on the expectation that the other will perform a particular action important to the trustor, irrespective of the ability to monitor or control that other party” (Mayer et al. 1995, p. 712). Related to electronic relationships or transactions, the concept has been defined as the “confident reliance by one party on the behaviour of other parties” (Clarke 2001c, p. 291), and it has been suggested that in the electronic-commerce domain, in particular, trust is intimately associated with the disclosure of information (Metzger 2004).

In reviewing literature concerning trust, Fusco et al. (2011, p. 2) claim that trust is typically described as a dynamic concept falling into the categories of cognitive (evidence based), emotional (faith-based), and/or behavioural (conduct-based) trust. For further reading, the major sources on trust can be found in: Lewis and Weigert's (1985) sociological treatment of trust, the influential work of Mayer et al. (1995) and the authors’ updated work Schoorman et al. (2007) centred on organisational trust, Weckert’s (2000) comprehensive review of trust in the context of workplace monitoring using electronic devices, research on trust in electronic-commerce (refer to McKnight and Chervany 2001; Pavlou 2003; Kim et al. 2009) and mobile-commerce (see Siau and Shen 2003; Yeh and Li 2009), the work of Valachich (2003) that introduces and evaluates trust in terms of ubiquitous computing environments, Dwyer et al.’s (2007) article on trust and privacy issues in social networks, Yan and Holtmanns’ (2008) examination of issues associated with digital trust management, the work of Chen et al. (2008) covering the benefits and concerns of LBS usage including privacy and trust implications, and the research by Junglas and Spitzmüller (2005) that examines privacy and trust issues concerning LBS by presenting a research model that incorporates these aspects amongst others.

For the purpose of this paper, the varying definitions and categorisations are acknowledged. However, trust will be assessed in terms of the relationships dominating existing LBS/IoT scholarship which comprise the government-citizen relationship centred on trust in the state, the business-consumer relationship associated with trust in corporations/LBS providers, and the consumer-consumer relationship concerned with trust in individuals/others.

1.3.1 Trust in the state

Trust in the state broadly covers LBS solutions implemented by government, thus representing the government-citizen relationship. Dominating current debates and literature are LBS government initiatives in the form of emergency management schemes, in conjunction with national security applications utilising LBS, which depending on the nature of their implementation may impact on citizens’ trust in the state. These concerns are typically expressed as a trade-off between security and safety. At present there are very few examples of fully-fledged IoT systems to point to, although increasingly quasi-IoT systems are being deployed using wireless sensor networks of varying kinds, e.g. for bushfire management and for fisheries. These systems do not include a direct human stakeholder but are still relevant as they may trigger flow-on effects that do impact citizenry.

1.3.2 Balancing trust and privacy in emergency services

In the context of emergency management, Aloudat and Michael (2011, p. 58) maintain that the dominant theme between government and consumers in relation to emergency warning messages and systems is trust. This includes trust in the LBS services being delivered and in the government itself (Aloudat and Michael 2011, p. 71). While privacy is typically believed to be the leading issue confronting LBS, in emergency and life-threatening situations it is overwhelmed by trust-related challenges, given that users are generally willing to relinquish their privacy in the interest of survival (Aloudat and Michael 2010, p. 2). Furthermore, the success of these services is reliant on trust in the technology, the service, and the accuracy/reliability/timeliness of the emergency alert. On the whole, this success can be measured in terms of citizens’ confidence in their government’s ability to sensibly select and implement a fitting emergency service utilising enhanced LBS features. In a paper that examines the deployment of location services in Dutch public administration, van Ooijen and Nouwt (2009, p. 81) assess the impact of government-based LBS initiatives on the government-citizen relationship, recommending that governments employ care in gathering and utilising location-based data about the public, to ensure that citizens' trust in the state is not compromised.

Trust-related implications of surveillance in the interest of national security 1.3.3

Trust is also prevalent in discussions relating to national security. National security has been regarded as a priority area for many countries for over a decade, and as such has prompted the implementation of surveillance schemes by government. Wigan and Clarke (2006, p. 392) discuss the dimension of trust as a significant theme contributing to the social acceptance of a particular government surveillance initiative, which may incorporate location and tracking of individuals and objects. The implementation of surveillance systems by the state, including those incorporating LBS, can diminish the public’s confidence in the state given the potential for such mechanisms to be perceived as a form of authoritarian control. Nevertheless, a situation where national security and safety are considered to be in jeopardy may entail (partial) acceptance of various surveillance initiatives that would otherwise be perceived as objectionable. In such circumstances, trust in government plays a crucial role in determining individuals’ willingness to compromise various civil liberties. This is explained by Davis and Silver (2004, p. 35) below:

“The more people trust the federal government or law enforcement agencies, the more willing they are to allow the government leeway in fighting the domestic war on terrorism by conceding some civil liberties.”

However, in due course it is expected that such increased security measures (even if initially supported by citizens) will yield a growing gap between government and citizens, “potentially dampening citizen participation in government and with it reducing citizens’ trust in public institutions and officials” (Gould 2002, p. 77). This occurs as both the perceived degree of threat and trust in government diminish, resulting in the public’s reluctance to surrender their rights for the sake of security (Sanquist et al. 2008, p. 1126). In order to build and maintain trust, governments are required to be actively engaged in developing strategies to build confidence both in their own abilities and in the technology under consideration, and are challenged to recognise “the massive harm that surveillance measures are doing to public confidence in its institutions” (Wigan and Clarke 2006, p. 401). It has been suggested that a privacy impact assessment (PIA) aids in establishing trust between government and citizens (Clarke 2009, p. 129). Carefully considered legislation is an alternative technique for enhancing levels of trust. With respect to LBS, governments are responsible for proposing and enacting regulation that is in the best interest of citizens, incorporating citizen concerns into this process and encouraging suitable design of LBS applications, as explained in the following quotation:

“laws and regulations must be drafted always on the basis of citizens’ trust in government authorities. This means that citizens trust the government to consider the issues at stake according to the needs and wishes of its citizens. Location aware services can influence citizens’ trust in the democratic society. Poorly designed infrastructures and services for storing, processing and distributing location-based data can give rise to a strong feeling of being threatened. Whereas a good design expands the feeling of freedom and safety, both in the private and in the public sphere/domain” (Beinat et al. 2007, p. 46).

One of the biggest difficulties that will face stakeholders is identifying when current LBS systems become a part of bigger IoT initiatives. Major changes in systems will require a re-evaluation of impact assessments of different types.

Need for justification and cultural sensitivity 1.3.4

Techniques of this nature will fail to be adopted, however, if surveillance schemes lack adequate substantiation at the outset, as trust is threatened by “absence of justification for surveillance, and of controls over abuses” (Wigan and Clarke 2006, p. 389). From a government perspective, this situation may prove detrimental, as Wigan and Clarke (2006, p. 401) claim that transparency and trust are prerequisites for ensuring public confidence in the state, noting that “[t]he integrity of surveillance schemes, in transport and elsewhere, is highly fragile.” Aside from adequate justification of surveillance schemes, cultural differences associated with the given context need to be acknowledged as factors influencing the level of trust citizens hold in government. As explained by Dinev et al. (2005, p. 3) in their cross-cultural study of American and Italian Internet users' privacy and surveillance concerns, “[a]ttitudes toward government and government initiatives are related to the culture’s propensity to trust.” In comparing the two contexts, Dinev et al. claim that Americans readily accept government surveillance in exchange for increased levels of security, whereas Italians’ low levels of trust in government result in opposing viewpoints (pp. 9-10).

Trust in corporations/LBS/IoT providers 1.3.5

Trust in corporations/LBS/IoT providers emerges from the level of confidence a user places in an organisation and its respective location-based solution(s), and corresponds to the business-consumer relationship. In the context of consumer privacy, Culnan and Bies (2003, p. 327) assert that perceived trust in an organisation is closely linked to the extent to which an organisation's practices are aligned with its policies. A breach of this trust affects the likelihood of personal information disclosure in the future (Culnan and Bies 2003, p. 328), given the value of trust in sustaining lasting customer relationships (p. 337). Reducing this “trust gap” (Culnan and Bies 2003, pp. 336-337) is a defining element for organisations in achieving economic and industry success, as it may affect a consumer’s willingness to permit the use of their location data (Chen et al. 2008, p. 34). Reducing this gap requires that control over location details remain with the user, as opposed to the LBS provider or network operator (Giaglis et al. 2003, p. 82). Trust can thus emerge from a user’s perception that they are in command (Junglas and Spitzmüller 2005, p. 3).

Küpper and Treu (2010, pp. 216-217) concur with these assertions, explaining that the lack of uptake of first-generation LBS applications was chiefly a consequence of the dominant role of the network operator over location information. This situation has been somewhat rectified since the introduction of GPS-enabled devices capable of determining location information without input from the network operator, and a higher emphasis on a user-focussed model (Bellavista et al. 2008, p. 85; Küpper and Treu 2010, p. 217). Trust, however, is not exclusively concerned with a network operator’s ability to determine location information, but also with the possible misuse of location data. As such, trust has also been framed as a potential resolution to location data misappropriation, explained further by Jorns and Quirchmayr (2010, p. 152) in the following excerpt:

“The only way to completely avoid misuse is to entirely block location information, that is, to reject such services at all. Since this is not an adequate option... trust is the key to the realization of mobile applications that exchange sensitive information.”

There is much to learn from the covert and overt location tracking of large corporations on their subscribers. Increasingly, the dubious practices of retaining location information by information and communication technology giants Google, Apple and Microsoft are being reported, with only small and arguably incommensurate penalties being applied in countries in the European Union and Asia. What is disturbing in this trend is that even smaller suppliers of location-based applications are beginning to unleash unethical (but seemingly not illegal) solutions at shopping malls and other campus-based locales (Michael & Clarke 2013).

Importance of identity and privacy protection to trust 1.3.6

In delivering trusted LBS solutions, Jorns and Quirchmayr (2010, pp. 151-155) further claim that identity and privacy protection are central considerations that must be built into a given solution, proposing an LBS architecture that integrates such safeguards. That is, identity protection may involve the use of false dummies, dummy users and landmark objects, while privacy protection generally relies on decreasing the resolution of location data, employing supportive regulatory techniques and ensuring anonymity and pseudonymity (Jorns and Quirchmayr 2010, p. 152). Similarly, and with respect to online privacy, Clarke (2001c, p. 297) suggests that an adequate framework must be introduced that “features strong and comprehensive privacy laws, and systematic enforcement of those laws.” These comments, also applicable to LBS in a specific sense, were made in the context of economic rather than social relationships, referring primarily to government and corporations, but are also relevant to trust amongst social relations.
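The privacy safeguard of decreasing the resolution of location data, noted above by Jorns and Quirchmayr, can be illustrated in a few lines. The following is a minimal sketch only; the function name and coordinates are hypothetical and not drawn from the cited architecture. The idea is that coordinates are truncated to a coarse grid before disclosure, so a service learns an approximate area rather than an exact position.

```python
# Minimal sketch: privacy protection by decreasing coordinate resolution.
# Truncating to 2 decimal places coarsens a position to a grid of roughly
# 1.1 km in latitude, so the recipient learns the area, not the exact spot.

def degrade_resolution(lat: float, lon: float, decimals: int = 2) -> tuple:
    """Return coordinates truncated to the requested precision."""
    factor = 10 ** decimals
    return (int(lat * factor) / factor, int(lon * factor) / factor)

# A precise fix (hypothetical values) reduced to a coarse grid cell:
precise = (-34.425072, 150.893143)
coarse = degrade_resolution(*precise)
print(coarse)  # (-34.42, 150.89)
```

In practice such truncation is usually combined with the other safeguards named above (anonymity, pseudonymity and regulatory controls), since coarse coordinates alone can still be re-identifying over repeated observations.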

It is important to recognise that issues of trust are closely related to privacy concerns from the perspective of users. In an article titled “Trust and Transparency in Location-Based Services: Making Users Lose their Fear of Big Brother”, Böhm et al. (2004, pp. 1-3) claim that operators and service providers are charged with the difficult task of earning consumer trust, and that this may be achieved by addressing user privacy concerns and adhering to relevant legislation. Additional studies also point to the relationship between trust and privacy, claiming that trust can aid in reducing the perceived privacy risk for users. For example, Xu et al. (2005) suggest that enhancing trust can reduce the perceived privacy risk, which in turn influences a user's decision to disclose information, and that “service provider’s interventions including joining third party privacy seal programs and introducing device-based privacy enhancing features could increase consumers’ trust beliefs and mitigate their privacy risk perceptions” (Xu et al. 2005, p. 905). Chellappa and Sin (2005, pp. 188-189), in examining the link between trust and privacy, express the importance of trust-building factors, which include a consumer’s familiarity and previous experience with the organisation.

Maintaining consumer trust 1.3.7

The primary consideration in relation to trust in the business-consumer relationship is that all efforts be targeted at establishing and building trust in corporations and LBS/IoT providers. Once trust has been compromised, the damage is difficult, if not impossible, to repair, a point applicable to trust in any context. This point is explained by Kaasinen (2003, p. 77) in an interview-based study regarding user requirements in location-aware mobile applications:

“The faith that the users have in the technology, the service providers and the policy-makers should be regarded highly. Any abuse of personal data can betray that trust and it will be hard to win it back again.”

Trust in individuals/others 1.3.8

Trust in the consumer-to-consumer setting is determined by the level of confidence existing between an individual and their social relations, which may include friends, parents, other family members, employers and strangers, categories that are adapted from Levin et al. (2008, pp. 81-82). Yan and Holtmanns (2008, p. 2) express the importance of trust for social interactions, claiming that “[s]ocial trust is the product of past experiences and perceived trustworthiness.” It has been suggested that LBS monitoring can erode trust between the individual engaged in monitoring and the subject being monitored, as the very act implies that trust is lacking in a given relationship (Perusco et al. 2006, p. 93). These concerns are echoed in Michael et al. (2008). Previous studies relevant to LBS and trust generally focus on: the workplace situation, that is, trust between an employer and their employee; trust amongst ‘friends’ subscribed to a location-based social networking (LBSN) service, which may include any of the predefined categories above; and the tracking of family members, such as children, for safety and protection purposes and the associated trust implications.

Consequences of workplace monitoring 1.3.9

With respect to trust in an employer’s use of location-based applications and location data, a prevailing subject in existing literature is the impact of employee monitoring systems on staff. For example, in studying the link between electronic workplace monitoring and trust, Weckert (2000, p. 248) reported that trust is a significant issue resulting from excessive monitoring, in that monitoring may contribute to a deterioration in professional work relationships between an employer and their employee and consequently reduce or eliminate trust. Weckert’s work reveals that employers often justify electronic monitoring on the argument that the “benefits outweigh any loss of trust”, benefits which may include gains for all involved parties; notably, for the employer in the form of economic benefits, for the employee through encouraged improvements to performance and productivity, and for the customer who may experience enhanced customer service (p. 249). Chen and Ross (2005, p. 250), on the other hand, argue that an employer’s decision to monitor their subordinates may be related to a low degree of existing trust, which could be a result of unsuitable past behaviour on the part of the employee. As such, employers may perceive monitoring as necessary in order to manage employees. Alternatively, from the perspective of employees, trust-related issues materialise as a result of monitoring, which may affect job attitudes, including satisfaction and commitment, as covered in a paper by Alder et al. (2006) in the context of internet monitoring.

When applied to location monitoring of employees using LBS, the trust-related concerns expressed above are indeed warranted. Particularly, Kaupins and Minch (2005, p. 2) argue that the appropriateness of location monitoring in the workplace can be measured from either a legal or ethical perspective, which inevitably results in policy implications for the employer. The authors emphasise that location monitoring of employees can often be justified in terms of the security, productivity, reputational and protective capabilities of LBS (Kaupins and Minch 2005, p. 5). However, Kaupins and Minch (2005, pp. 5-6) continue to describe the ethical factors “limiting” location monitoring in the workplace, which entail the need for maintaining employee privacy and the restrictions associated with inaccurate information, amongst others. These factors will undoubtedly affect the degree of trust between an employer and employee.

However, the underlying concern relevant to this discussion of location monitoring in the workplace is not only the suitability of employee monitoring using LBS. While this is a valid issue, the challenge remains centred on the deeper trust-related consequences. Regardless of the technology or applications used to monitor employees, it can be concluded that a work atmosphere lacking trust results in sweeping consequences that extend beyond the workplace, expressed in the following excerpt:

“A low trust workplace environment will create the need for ever increasing amounts of monitoring which in turn will erode trust further. There is also the worry that this lack of trust may become more widespread. If there is no climate of trust at work, where most of us spend a great deal of our life, why should there be in other contexts? Some monitoring in some situations is justified, but it must be restricted by the need for trust” (Weckert 2000, p. 250).

Location-monitoring amongst friends 1.3.10

These concerns are certainly applicable to the use of LBS applications amongst other social relations. Recent literature merging the concepts of LBS, online social networking and trust is particularly focused on the use of LBSN applications amongst various categories of friends. For example, Fusco et al.'s (2010) qualitative study examines the impact of LBSN on trust amongst friends, employing a focus group methodology in achieving this aim. The authors reveal that trust may suffer as a consequence of LBSN usage in several ways: disclosure of location information and potential monitoring activities can encourage misuse of the application in order to conceal things; excessive questioning can contribute to the deterioration of trust amongst social relations; and trust may come to be placed in the application rather than in the friend (Fusco et al. 2010, p. 7). Further information relating to Fusco et al.’s study, particularly the manner in which LBSN applications adversely impact on trust, can be found in a follow-up article (Fusco et al. 2011).

Location tracking for protection 1.3.11

It has often been suggested that monitoring in familial relations can offer a justified means of protection, particularly in relation to vulnerable individuals such as Alzheimer’s or dementia sufferers and children. With specific reference to the latter, trust emerges as a central theme relating to child tracking. In an article by Boesen et al. (2010), location tracking in families is evaluated, including the manner in which LBS applications are incorporated within the familial context. The qualitative study conducted by the authors revealed that the initial decision by participants with children to use LBS was motivated by a lack of existing trust within the given relationship, with participants reporting an improvement in their children's behaviour after a period of tracking (Boesen et al. 2010, p. 70). Boesen et al., however, warn of the trust-related consequences, claiming that “daily socially-based trusting interactions are potentially replaced by technologically mediated interactions” (p. 73). A lack of trust in a child is considered detrimental to their development, and signalling to a child that they are not trusted, through the use of technology such as location monitoring applications, may result in long-term implications. The importance of trust to the growth of a child and the dangers associated with ubiquitous forms of supervision are explained in the following excerpt:

“Trust (or at least its gradual extension as the child grows) is seen as fundamental to emerging self-control and healthy development... Lack of private spaces (whether physical, personal or social) for children amidst omni-present parental oversight may also create an inhibiting dependence and fear” (Marx and Steeves 2010, p. 218).

Furthermore, location tracking of children and other individuals in the name of protection may result in undesirable and contradictory consequences relevant to trust. Barreras and Mathur (2007, p. 182), in an article that describes the advantages and disadvantages of wireless location tracking, argue that technologies originally intended to protect family members (notably children, and other social relations such as friends and employees), can impact on trust and be regarded as “unnecessary surveillance.” The outcome of such tracking and reduced levels of trust may also result in a “counterproductive” effect if the tracking capabilities are deactivated by individuals, rendering them incapable of seeking assistance in actual emergency situations (Barreras and Mathur 2007, p. 182).

LBS/IoT is a ‘double-edged sword’ 1.3.12

In summary, location monitoring and tracking by the state, corporations and individuals is often justified in terms of the benefits that can be delivered to the party responsible for monitoring/tracking and the subject being tracked. As such, Junglas and Spitzmüller (2005, p. 7) claim that location-based services can be considered a “double-edged sword” in that they can aid in the performance of tasks in one instance, but may also generate Big Brother concerns. Furthermore, Perusco and Michael (2007, p. 10) mention the linkage between trust and freedom. Accordingly, Perusco et al. (2006, p. 97) suggest a number of questions that must be considered in the context of LBS and trust: “Does the LBS context already involve a low level of trust?”; “If the LBS context involves a moderate to high level of trust, why are LBS being considered anyway?”; and “Will the use of LBS in this situation be trust-building or trust-destroying?” In answering these questions, the implications of LBS/IoT monitoring on trust must be appreciated, given they are significant, irreparable, and closely tied to what is considered the central challenge in the LBS domain, privacy.

This paper has provided comprehensive coverage of the themes of control and trust with respect to the social implications of LBS. The subsequent discussion will extend the examination to cover LBS in the context of the IoT, providing an ethical analysis and stressing the importance of a robust socio-ethical framework.

Discussion 1.4

The Internet of Things (IoT) and LBS: extending the discussion on control and trust 1.4.1

The Internet of Things (IoT) is an encompassing network of connected intelligent “things”, and is “comprised of smart machines interacting and communicating with other machines, objects, environments and infrastructures” (Freescale Semiconductor Inc. and ARM Inc. 2014, p. 1). The phrase was originally coined by Kevin Ashton in 1999, and a definitive definition is yet to be agreed upon (Ashton 2009, p. 1; Kranenburg and Bassi 2012, p. 1). Various terms are often used interchangeably, such as the Internet of Everything, the Internet of Things and People, the Web of Things and People, etc. The IoT can, however, be described in terms of its core characteristics and/or the features it encompasses. At the crux of the IoT concept is the integration of the physical and virtual worlds, and the capability for “things” within these realms to be operated remotely through the employment of intelligent or smart objects with embedded processing functionality (Mattern and Floerkemeier 2010, p. 242; Ethics Subgroup IoT 2013, p. 3). These smart objects are capable of storing historical and varied forms of data, used as the basis for future interactions and the establishment of preferences. That is, once the data is processed, it can be utilised to “command and control” things within the IoT ecosystem, ideally enhancing the everyday lives of individuals (Michael, K. et al., 2010).

According to Ashton (2009, p. 1), the IoT infrastructure should “empower computers” and exhibit less reliance on human involvement in the collection of information. It should also allow for “seamless” interactions and connections (Ethics Subgroup IoT 2013, p. 2). Potential use cases include personal/home applications, health/patient monitoring systems, and remote tracking and monitoring which may include applications such as asset tracking amongst others (Ethics Subgroup IoT 2013, p. 3).

As can be anticipated with an ecosystem of this scale, the nature of interactions with the physical/virtual worlds and the varied “things” within them will undoubtedly change, dramatically altering the state of play. In the context of this paper, the focus is ultimately on the ethical concerns emerging from the use of LBS within the IoT infrastructure, which is characterised by its ubiquitous/pervasive nature, in view of the discussion above regarding control and trust. It is valuable at this point to identify the important role of LBS in the IoT infrastructure.

While the IoT can potentially encompass a myriad of devices, the mobile phone will likely feature as a key element within the ecosystem, providing connectivity between devices (Freescale Semiconductor Inc. and ARM Inc. 2014, p. 2). In essence, smart phones can therefore be perceived as the “mediator” between users, the internet and additional “things”, as is illustrated in Mattern and Floerkemeier (2010, p. 245, see figure 2). Significantly, most mobile devices are equipped with location and spatial capabilities, providing “localization”, whereby intelligent devices “are aware of their physical location, or can be located” (Mattern and Floerkemeier 2010, p. 244). An example of an LBS application in the IoT would be indoor navigation capabilities in the absence of GPS; in effect, seamless navigation between outdoor and indoor environments.
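The idea of seamless outdoor/indoor navigation can be illustrated with a simple source-selection loop. The following is an illustrative sketch in which all names and values are hypothetical: the device tries a prioritised list of positioning sources (GPS first, then an indoor method such as Wi-Fi fingerprinting) and takes the first fix it can obtain.

```python
# Illustrative sketch (all names hypothetical): a mobile "mediator" device
# selecting between positioning sources so navigation can continue when
# GPS is unavailable, e.g. indoors.
from typing import Callable, List, Optional, Tuple

Position = Tuple[float, float]  # (latitude, longitude)
Source = Tuple[str, Callable[[], Optional[Position]]]

def locate(sources: List[Source]) -> Optional[Position]:
    """Try each positioning source in priority order; return the first fix."""
    for name, read_fix in sources:
        fix = read_fix()
        if fix is not None:
            return fix
    return None  # no source could determine a position

# Indoors: GPS yields no fix, so a Wi-Fi-based indoor estimate takes over.
gps = lambda: None                       # no satellite visibility indoors
wifi = lambda: (-34.4278, 150.8931)      # fingerprint-based indoor estimate
print(locate([("gps", gps), ("wifi", wifi)]))  # (-34.4278, 150.8931)
```

The sketch also makes the trust and control questions above concrete: whichever party controls the source list and receives the resulting fixes holds the location information, whether that is the user, the provider, or the network operator.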

Control- and trust-related challenges in the IoT 1.4.2

It may be argued that the LBS control and trust implications discussed throughout this paper (in addition to ethical challenges such as privacy and security) will carry over into the IoT environment. However, it has also been suggested that “the IoT will essentially create much richer environments in which location-based and location-aware technology can function” (Blouin 2014), and in doing so the ethical challenges will be amplified. It has further been noted that ethical issues, including trust and control amongst others, will “gain a new dimension in light of the increased complexity” in the IoT environment (Ethics Subgroup IoT 2013, p. 2).

In relation to control and the previously identified surveillance metaphors, for instance, it is predicted that there will be less reliance on Orwell's notion of Big Brother, whereby surveillance is conducted by a single entity. Rather, the concept of "some brother" will emerge. Some brother can be defined as "a heterogeneous 'mass' consisting of innumerable social actors, e.g. public sector authorities, citizens' movements and NGOs, economic players, big corporations, SMEs and citizens" (Ethics Subgroup IoT 2013, p. 16). As can be anticipated, the ethical consequences and dangers can potentially multiply in such a scenario.

Following on from this idea is the issue of a lack of transparency. The IoT will inevitably result in the merging of both the virtual and physical worlds, in addition to public and private spaces. It has been suggested that a lack of transparency regarding information access will create a sense of discomfort and will accordingly result in diminishing levels of trust (Ethics Subgroup IoT 2013, p. 8). The trust-related issues (relevant to LBS) are likely to be consistent with those discussed throughout this paper, possibly varying in intensity/severity depending on the given scenario. For example, the consequences of faulty IoT technology have the potential to be greater than those of conventional Internet services, given the integration of the physical and virtual worlds, thereby impacting on users’ trust in the IoT (Ethics Subgroup IoT 2013, p. 11). Therefore, trust considerations must primarily be examined in terms of: (a) trust in technology, and (b) trust in individuals/others.

Dealing with these (and other) challenges requires an ethical analysis in which appropriate conceptual and practical frameworks are considered. A preliminary examination is provided in the subsequent section, followed by dialogue regarding the need for objectivity in socio-ethical studies and the associated difficulties in achieving this.

Ethical analysis: proposing a socio-ethical conceptual framework 1.4.3

Research into the social and ethical implications of LBS, emerging technologies in general, and the IoT can be categorized in many ways and many frameworks can be applied. For instance, it may be regarded as a strand of “cyberethics”, defined by Tavani (2007, p. 3) as “the study of moral, legal and social issues involving cybertechnology”. Cybertechnology encompasses technological devices ranging from individual computers through to networked information and communication technologies. When considering ethical issues relating to cybertechnology and technology in general, Tavani (2007, pp. 23-24) notes that the latter should not necessarily be perceived as neutral. That is, technology may have “embedded values and biases” (Tavani 2007, p. 24), in that it may inherently provide capabilities to individuals to partake in unethical activities. This sentiment is echoed by Wakunuma and Stahl (2014, p. 393) in a paper examining the perceptions of IS professionals in relation to emerging ethical concerns.

Alternatively, research in this domain may be classed as a form of “computer ethics” or “information ethics”, which can be defined and applied using numerous approaches. While this article does not attempt to provide an in-depth account of information ethics, a number of its crucial characteristics are identified. In the first instance, the value of information ethics is in its ability to provide a conceptual framework for understanding the array of ethical challenges stemming from the introduction of new ICTs (Mathiesen 2004, p. 1). According to Floridi (1999), the question at the heart of information ethics is “what is good for an information entity and the infosphere in general?” The author continues that “more analytically, we shall say that [information ethics] determines what is morally right or wrong, what ought to be done, what the duties, the ‘oughts’ and the ‘ought nots’ of a moral agent are…” However, Capurro (2006, p. 182) disagrees, claiming that information ethics is additionally about “what is good for our bodily being-in-the-world with others in particular?” This involves contemplation of other “spheres” such as the ecological, political, economic, and cultural and is not limited to a study of the infosphere as suggested by Floridi. In this sense, the significance of context, environment and intercultural factors also becomes apparent.

Following on from these notions, there is the need for a robust ethical framework that is multi-dimensional in nature and explicitly covers the socio-ethical challenges emerging from the deployment of a given technology. This would include, but not be limited to, the control and trust issues identified throughout this paper, other concerns such as privacy and security, and any challenges that emerge as the IoT takes shape. This article proposes a broader, more robust socio-ethical conceptual framework as an appropriate means of examining and addressing ethical challenges relevant to LBS; both LBS in general and as a vital mediating component within the IoT. This framework is illustrated in Figure 1. Central to the socio-ethical framework is the contemplation of individuals as part of a broader social network or society, whilst considering the interactions amongst various elements of the overall “system”. The four themes underpinning socio-ethical studies include the investigation of what the human purpose is, what is moral, how justice is upheld, and the principles that guide the usage of a given technique. Participants; their interactions with systems; people's concerns and behavioural expectations; cultural and religious beliefs; structures, rules and norms; and fairness, personal benefits and personal harms are all areas of interest in a socio-ethical approach.

Figure 1: Proposed socio-ethical framework, in terms of the major components that require consideration


This article is intended to offer a preliminary account of the socio-ethical conceptual framework being proposed. Further research would examine and test its validity, whilst also providing a more detailed account of the various components within and how a socio-ethical assessment would be conducted based on the framework, and the range of techniques that could be applied.

The need for objectivity 1.4.4

Regardless of categorization and which conceptual framework is adopted, numerous authors stress that the focus of research and debates should not be skewed towards the unethical uses of a particular technology; rather, an objective stance should be embraced. Such objectivity must nonetheless ensure that social interests are adequately represented. That is, with respect to location and tracking technologies, Clarke (2001b, p. 220) claims that social interests have been somewhat overshadowed by the economic interests of LBS organisations. This is a situation that requires rectifying. While information technology professionals are not necessarily liable for how technology is deployed, they must nonetheless recognise its implications and be engaged in the process of introducing and promoting adequate safeguards (Clarke 1988, pp. 510-511). It has been argued that IS professionals are generally disinterested in the ethical challenges associated with emerging ICTs, and are instead concerned with the job or the technologies themselves (Wakunuma and Stahl 2014, p. 383).

This is especially the case for LBS, given that the industry and technology have developed more quickly than the corresponding scholarship on their social implications, an unfavourable situation given the potential for LBS to have profound impacts on individuals and society (Perusco et al. 2006, p. 91). In a keynote address centred on defining the emerging notion of überveillance, Clarke (2007a, p. 34) discusses the need to measure the costs and disbenefits arising from surveillance practices in general, where costs refer to financial measures and disbenefits to all non-economic impacts. This involves weighing the negatives against the potential advantages, a response that is applicable to LBS and pertinent to seeking objectivity.

1.4.5 Difficulties associated with objectivity

However, a major challenge for an impartial approach to LBS is the interplay between the constructive and the potentially damaging consequences that the technology facilitates. For instance, with specific reference to wireless technologies in a business setting, Elliot and Phillips (2004, p. 474) maintain that such systems facilitate monitoring and surveillance which can be applied in conflicting scenarios. Positive applications, according to Elliot and Phillips, include monitoring to improve effectiveness or to provide employee protection in various instances, although this view has been frequently contested. Negative uses, conversely, involve excessive monitoring, which may compromise privacy or lead to situations in which an individual is subjected to surveillance or unauthorised forms of monitoring.

Additional studies demonstrate the complexities arising from the dual, and opposing, uses of a single LBS solution. It has been illustrated that any given application, for instance parent, healthcare, employee or criminal tracking applications, can be simultaneously perceived as ethical and unethical (Michael et al. 2006a, p. 7). A closer look at the scenario involving parents tracking children, as explained by Michael et al. (2006a, p. 7), highlights that child tracking can help ensure the safety of a child on the one hand, while invading their privacy on the other. The dual and opposing uses of a single LBS solution therefore become problematic and situation-dependent, and increasingly difficult to examine objectively. Dobson and Fisher (2003, p. 50) maintain that technology cannot be perceived as either good or evil: it is not directly the cause of unethical behaviour; rather, it serves to “empower those who choose to engage in good or bad behaviour.”

This is similarly the case in relation to the IoT, as public approval of the IoT is largely centred on “the conventional dualisms of ‘security versus freedom’ and ‘comfort versus data privacy’” (Mattern and Floerkemeier 2010, p. 256). Assessing the implications of the IoT infrastructure as a whole is increasingly difficult.

A further obstacle concerns the extent to which LBS threaten the integrity of the individual. Explicitly, the risks associated with location and tracking technologies “arise from individual technologies and the trails that they generate, from compounds of multiple technologies, and from amalgamated and cross-referenced trails captured using multiple technologies and arising in multiple contexts” (Clarke 2001b, p. 218). The consequent social implications or “dangers” are thus a product of individuals being convicted, correctly or otherwise, of having committed a particular action (Clarke 2001b, p. 219). A wrongly accused individual may perceive the disbenefits arising from LBS as outweighing the benefits.
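Clarke's warning about “amalgamated and cross-referenced trails” can be made concrete with a toy sketch. All identifiers and records below are fabricated for illustration: neither dataset names a person on its own, yet joining the two trails on time and place ties a licence plate to a phone:

```python
# Toy illustration of compound, cross-referenced location trails.
# plate_scans comes from roadside scanners; phone_pings from cell
# identifiers. Every record here is invented.

plate_scans = [  # (plate, location, hour)
    ("ABC123", "bridge_9", 8),
    ("XYZ789", "bridge_9", 8),
    ("ABC123", "mall_carpark", 12),
]
phone_pings = [  # (device_id, location, hour)
    ("dev_42", "bridge_9", 8),
    ("dev_42", "mall_carpark", 12),
    ("dev_77", "bridge_9", 8),
]

def colocated(trail_a, trail_b):
    """Count, for each pair of identifiers, how often they were
    observed at the same place and time."""
    pairs = {}
    for id_a, loc_a, t_a in trail_a:
        for id_b, loc_b, t_b in trail_b:
            if (loc_a, t_a) == (loc_b, t_b):
                pairs[(id_a, id_b)] = pairs.get((id_a, id_b), 0) + 1
    return pairs

links = colocated(plate_scans, phone_pings)
# One pair co-occurs at two distinct place-times; every other pair at
# most once, so the compound trail singles out that plate-device link.
best = max(links, key=links.get)
print(best, links[best])
```

With only three observations per trail, the cross-reference already isolates one plate-device pair, which is precisely why amalgamated trails carry risks that no single dataset does.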

However, in situations where integrity is not compromised, an LBS application can be perceived as advantageous. For instance, Michael et al. (2006c, pp. 1-11) refer to the potentially beneficial uses of LBS in their paper on the Avian Flu Tracker prototype, which is intended to manage and contain the spread of the infectious disease by relying on spatial data to communicate with individuals in a defined location. The authors demonstrate that their proposed system, intended to operate on a subscription or opt-in basis, is beneficial for numerous stakeholders such as government, health organisations and citizens (Michael et al. 2006c, p. 6).

Thus, a common challenge confronting researchers with respect to the study of morals, ethics and technology is that the field of ethics is subjective. That is, what constitutes right and wrong behaviour varies depending on the beliefs of a particular individual, which are understood to be based on cultural and other factors specific to the individual in question. One such factor is an individual’s experience with the technology, as can be seen in the previous example centred on the notion of an unjust accusation. Given these subjectivities and the potential for inconsistency from one individual to the next, Tavani (2007, p. 47) asserts that there is the need for ethical theories to direct the analysis of moral issues (relating to technology), given that numerous complications or disagreements exist in examining ethics.

1.5 Conclusion

This article has provided a comprehensive review of the control- and trust-related challenges relevant to location-based services, in order to identify and describe the major social and ethical considerations within each of the themes. The relevance of the IoT to such discussions has been demonstrated, and a socio-ethical framework proposed to encourage discussion and further research into the socio-ethical implications of the IoT, with a focus on LBS and/or localisation technologies. The proposed socio-ethical conceptual framework requires further elaboration, and it is recommended that a thorough analysis, extending beyond information ethics, be conducted; this paper forms the basis for such future work. The IoT is by its very nature subject to socio-ethical dilemmas because, for the greater part, the human is removed from decision-making processes and is instead subject to a machine.


Abbas, R., Michael, K., Michael, M.G. & Aloudat, A.: Emerging Forms of Covert Surveillance Using GPS-Enabled Devices. Journal of Cases on Information Technology 13(2), 2011, 19-33.

Albrecht, K. & McIntyre, L.: Spychips: How Major Corporations and Government Plan to Track Your Every Purchase and Watch Your Every Move. Thomas Nelson, 2005.

Albrecht, K. & Michael, K.: Connected: To Everyone and Everything. IEEE Technology and Society Magazine, Winter, 2013, 31-34.

Alder, G.S., Noel, T.W. & Ambrose, M.L.: Clarifying the Effects of Internet Monitoring on Job Attitudes: The Mediating Role of Employee Trust. Information & Management, 43, 2006, 894-903.

Aloudat, A. & Michael, K.: The Socio-Ethical Considerations Surrounding Government Mandated Location-Based Services During Emergencies: An Australian Case Study, in M. Quigley (ed.), ICT Ethics and Security in the 21st Century: New Developments and Applications. IGI Global, Hershey, PA, 2010, 1-26.

Aloudat, A. & Michael, K.: Toward the Regulation of Ubiquitous Mobile Government: A case Study on Location-Based Emergency Services in Australia. Electronic Commerce Research, 11(1), 2011, 31-74.

Andrejevic, M.: ISpy: Surveillance and Power in the Interactive Era. University Press of Kansas, Lawrence, 2007.

Arvidsson, A.: On the ‘Pre-History of the Panoptic Sort’: Mobility in Market Research. Surveillance & Society, 1(4), 2004, 456-474.

Ashton, K.: That 'Internet of Things' Thing. RFID Journal, 2009.

Barreras, A. & Mathur, A.: Chapter 18. Wireless Location Tracking, in K.R. Larsen and Z.A. Voronovich (eds.), Convenient or Invasive: The Information Age. Ethica Publishing, United States, 2007, 176-186.

Bauer, H.H., Barnes, S.J., Reichardt, T. & Neumann, M.M.: Driving the Consumer Acceptance of Mobile Marketing: A Theoretical Framework and Empirical Study. Journal of Electronic Commerce Research, 6(3), 2005, 181-192.

Beinat, E., Steenbruggen, J. & Wagtendonk, A.: Location Awareness 2020: A Foresight Study on Location and Sensor Services. Vrije Universiteit, Amsterdam, 2007.

Bellavista, P., Küpper, A. & Helal, S.: Location-Based Services: Back to the Future. IEEE Pervasive Computing, 7(2), 2008, 85-89.

Bennett, C.J. & Regan, P.M.: Surveillance and Mobilities. Surveillance & Society, 1(4), 2004, 449-455.

Bentham, J. & Bowring, J.: The Works of Jeremy Bentham. Published under the Superintendence of His Executor, John Bowring, Volume IV, W. Tait, Edinburgh, 1843.

Blouin, D.: An Intro to Internet of Things. 2014.

Boesen, J., Rode, J.A. & Mancini, C.: The Domestic Panopticon: Location Tracking in Families. UbiComp’10, Copenhagen, Denmark, 2010, pp. 65-74.

Böhm, A., Leiber, T. & Reufenheuser, B.: Trust and Transparency in Location-Based Services: Making Users Lose Their Fear of Big Brother. Proceedings Mobile HCI 2004 Workshop on Location Systems Privacy and Control, Glasgow, UK, 2004, 1-4.

Capurro, R.: Towards an Ontological Foundation of Information Ethics. Ethics and Information Technology, 8, 2006, 175-186.

Casal, C.R.: Impact of Location-Aware Services on the Privacy/Security Balance. info: the Journal of Policy, Regulation and Strategy for Telecommunications, Information and Media, 6(2), 2004, 105-111.

Chellappa, R. & Sin, R.G.: Personalization Versus Privacy: An Empirical Examination of the Online Consumer’s Dilemma. Information Technology and Management, 6, 2005, 181-202.

Chen, J.V., Ross, W. & Huang, S.F.: Privacy, Trust, and Justice Considerations for Location-Based Mobile Telecommunication Services. info, 10(4), 2008, 30-45.

Chen, J.V. & Ross, W.H.: The Managerial Decision to Implement Electronic Surveillance at Work. International Journal of Organizational Analysis, 13(3), 2005, 244-268.

Clarke, R.: Information Technology and Dataveillance. Communications of the ACM, 31(5), 1988, 498-512.

Clarke, R.: Profiling: A Hidden Challenge to the Regulation of Data Surveillance. 1993.

Clarke, R.: The Digital Persona and Its Application to Data Surveillance. 1994.

Clarke, R.: Introduction to Dataveillance and Information Privacy, and Definitions of Terms. 1997.

Clarke, R.: Person Location and Person Tracking - Technologies, Risks and Policy Implications. Information Technology & People, 14(2), 2001b, 206-231.

Clarke, R.: Privacy as a Means of Engendering Trust in Cyberspace Commerce. The University of New South Wales Law Journal, 24(1), 2001c, 290-297.

Clarke, R.: While You Were Sleeping… Surveillance Technologies Arrived. Australian Quarterly, 73(1), 2001d, 10-14.

Clarke, R.: Privacy on the Move: The Impacts of Mobile Technologies on Consumers and Citizens. 2003b.

Clarke, R.: Have We Learnt to Love Big Brother? Issues, 71, June, 2005, 9-13.

Clarke, R.: What's 'Privacy'? 2006.

Clarke, R.: Chapter 3. What 'Uberveillance' Is and What to Do About It, in K. Michael and M.G. Michael (eds.), The Second Workshop on the Social Implications of National Security, University of Wollongong, Wollongong, Australia, 2007a, 27-46.

Clarke, R.: Chapter 4. Appendix to What 'Uberveillance' Is and What to Do About It: Surveillance Vignettes, in K. Michael and M.G. Michael (eds.), The Second Workshop on the Social Implications of National Security, University of Wollongong, Wollongong, Australia, 2007b, 47-60.

Clarke, R.: Surveillance Vignettes Presentation. 2007c.

Clarke, R.: Privacy Impact Assessment: Its Origins and Development. Computer Law & Security Review, 25(2), 2009, 123-135.

Clarke, R. & Wigan, M.: You Are Where You've Been: The Privacy Implications of Location and Tracking Technologies. 2011.

Culnan, M.J. & Bies, R.J.: Consumer Privacy: Balancing Economic and Justice Considerations. Journal of Social Issues, 59(2), 2003, 323-342.

Davis, D.W. & Silver, B.D.: Civil Liberties vs. Security: Public Opinion in the Context of the Terrorist Attacks on America. American Journal of Political Science, 48(1), 2004, pp. 28-46.

Dinev, T., Bellotto, M., Hart, P., Colautti, C., Russo, V. & Serra, I.: Internet Users’ Privacy Concerns and Attitudes Towards Government Surveillance – an Exploratory Study of Cross-Cultural Differences between Italy and the United States. 18th Bled eConference eIntegration in Action, Bled, Slovenia, 2005, 1-13.

Dobson, J.E. & Fisher, P.F.: Geoslavery. IEEE Technology and Society Magazine, 22(1), 2003, 47-52.

Dobson, J.E. & Fisher, P.F.: The Panopticon's Changing Geography. Geographical Review, 97(3), 2007, 307-323.

Dwyer, C., Hiltz, S.R. & Passerini, K.: Trust and Privacy Concern within Social Networking Sites: A Comparison of Facebook and Myspace. Proceedings of the Thirteenth Americas Conference on Information Systems, Keystone, Colorado, 2007, 1-12.

Elliot, G. & Phillips, N.: Mobile Commerce and Wireless Computing Systems. Pearson Education Limited, Great Britain, 2004.

Ethics Subgroup IoT: Fact Sheet - Ethics Subgroup IoT - Version 4.0, European Commission. 2013, 1-21.

Freescale Semiconductor Inc. and ARM Inc.: Whitepaper: What the Internet of Things (IoT) Needs to Become a Reality. 2014, 1-16.

Floridi, L.: Information Ethics: On the Philosophical Foundation of Computer Ethics. Ethics and Information Technology, 1, 1999, 37-56.

Foucault, M.: Discipline and Punish: The Birth of the Prison. Second Vintage Books Edition May 1995, Vintage Books: A Division of Random House Inc, New York, 1977.

Fusco, S.J., Michael, K., Aloudat, A. & Abbas, R.: Monitoring People Using Location-Based Social Networking and Its Negative Impact on Trust: An Exploratory Contextual Analysis of Five Types of “Friend” Relationships. IEEE Symposium on Technology and Society, Illinois, Chicago, 2011.

Fusco, S.J., Michael, K., Michael, M.G. & Abbas, R.: Exploring the Social Implications of Location Based Social Networking: An Inquiry into the Perceived Positive and Negative Impacts of Using LBSN between Friends. 9th International Conference on Mobile Business, Athens, Greece, IEEE, 2010, 230-237.

Gagnon, M., Jacob, J.D., Guta, A.: Treatment adherence redefined: a critical analysis of technotherapeutics. Nurs Inq. 20(1), 2013, 60-70.

Ganascia, J.G.: The Generalized Sousveillance Society. Social Science Information, 49(3), 2010, 489-507.

Gandy, O.H.: The Panoptic Sort: A Political Economy of Personal Information. Westview, Boulder, Colorado, 1993.

Giaglis, G.M., Kourouthanassis, P. & Tsamakos, A.: Chapter IV. Towards a Classification Framework for Mobile Location-Based Services, in B.E. Mennecke and T.J. Strader (eds.), Mobile Commerce: Technology, Theory and Applications. Idea Group Publishing, Hershey, US, 2003, 67-85.

Gould, J.B.: Playing with Fire: The Civil Liberties Implications of September 11th. Public Administration Review, 62, 2002, 74-79.

Jorns, O. & Quirchmayr, G.: Trust and Privacy in Location-Based Services. Elektrotechnik & Informationstechnik, 127(5), 2010, 151-155.

Junglas, I. & Spitzmüller, C.: A Research Model for Studying Privacy Concerns Pertaining to Location-Based Services. Proceedings of the 38th Hawaii International Conference on System Sciences, 2005, 1-10.

Kaasinen, E.: User Acceptance of Location-Aware Mobile Guides Based on Seven Field Studies. Behaviour & Information Technology, 24(1), 2003, 37-49.

Kaupins, G. & Minch, R.: Legal and Ethical Implications of Employee Location Monitoring. Proceedings of the 38th Hawaii International Conference on System Sciences. 2005, 1-10.

Kim, D.J., Ferrin, D.L. & Rao, H.R.: Trust and Satisfaction, Two Stepping Stones for Successful E-Commerce Relationships: A Longitudinal Exploration. Information Systems Research, 20(2), 2009, 237-257.

King, L.: Information, Society and the Panopticon. The Western Journal of Graduate Research, 10(1), 2001, 40-50.

Kodl, J. & Lokay, M.: Human Identity, Human Identification and Human Security. Proceedings of the Conference on Security and Protection of Information, Idet Brno, Czech Republic, 2001, 129-138.

Kranenburg, R.V. and Bassi, A.: IoT Challenges, Communications in Mobile Computing. 1(9), 2012, 1-5.

Küpper, A. & Treu, G.: Next Generation Location-Based Services: Merging Positioning and Web 2.0., in L.T. Yang, A.B. Waluyo, J. Ma, L. Tan and B. Srinivasan (eds.), Mobile Intelligence. John Wiley & Sons Inc, Hoboken, New Jersey, 2010, 213-236.

Levin, A., Foster, M., West, B., Nicholson, M.J., Hernandez, T. & Cukier, W.: The Next Digital Divide: Online Social Network Privacy. Ryerson University, Ted Rogers School of Management, Privacy and Cyber Crime Institute, 2008.

Lewis, J.D. & Weigert, A.: Trust as a Social Reality. Social Forces, 63(4), 1985, 967-985.

Lyon, D.: The World Wide Web of Surveillance: The Internet and Off-World Power Flows. Information, Communication & Society, 1(1), 1998, 91-105.

Lyon, D.: Surveillance Society: Monitoring Everyday Life. Open University Press, Philadelphia, PA, 2001.

Lyon, D.: Surveillance Studies: An Overview. Polity, Cambridge, 2007.

Macquarie Dictionary: 'Uberveillance', in S. Butler (ed.), Fifth Edition of the Macquarie Dictionary, Australia's National Dictionary. Sydney University, 2009, 1094.

Mann, S.: Sousveillance and Cyborglogs: A 30-Year Empirical Voyage through Ethical, Legal, and Policy Issues. Presence, 14(6), 2005, 625-646.

Mann, S., Nolan, J. & Wellman, B.: Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments. Surveillance & Society, 1(3), 2003, 331-355.

Mathiesen, K.: What is Information Ethics? Computers and Society, 32(8), 2004, 1-11.

Mattern, F. and Floerkemeier, K.: From the Internet of Computers to the Internet of Things, in Sachs, K., Petrov, I. & Guerrero, P. (eds.), From Active Data Management to Event-Based Systems and More. Springer-Verlag Berlin Heidelberg, 2010, 242-259.

Marx, G.T. & Steeves, V.: From the Beginning: Children as Subjects and Agents of Surveillance. Surveillance & Society, 7(3/4), 2010, 192-230.

Mayer, R.C., Davis, J.H. & Schoorman, F.D.: An Integrative Model of Organizational Trust. The Academy of Management Review, 20(3), 1995, 709-734.

McKnight, D.H. & Chervany, N.L.: What Trust Means in E-Commerce Customer Relationships: An Interdisciplinary Conceptual Typology. International Journal of Electronic Commerce, 6(2), 2001, 35-59.

Metzger, M.J.: Privacy, Trust, and Disclosure: Exploring Barriers to Electronic Commerce. Journal of Computer-Mediated Communication, 9(4), 2004.

Michael, K. & Clarke, R.: Location and Tracking of Mobile Devices: Überveillance Stalks the Streets. Computer Law and Security Review, 29(2), 2013, 216-228.

Michael, K., McNamee, A. & Michael, M.G.: The Emerging Ethics of Humancentric GPS Tracking and Monitoring. International Conference on Mobile Business, Copenhagen, Denmark, IEEE Computer Society, 2006a, 1-10.

Michael, K., McNamee, A., Michael, M.G., and Tootell, H.: Location-Based Intelligence – Modeling Behavior in Humans using GPS. IEEE International Symposium on Technology and Society, 2006b.

Michael, K., Stroh, B., Berry, O., Muhlbauer, A. & Nicholls, T.: The Avian Flu Tracker - a Location Service Proof of Concept. Recent Advances in Security Technology, Australian Homeland Security Research Centre, 2006c, 1-11.

Michael, K. and Michael, M.G.: Australia and the New Technologies: Towards Evidence Based Policy in Public Administration (1 ed). Wollongong, Australia: University of Wollongong, 2008, Available at:

Michael, K. & Michael, M.G.: Microchipping People: The Rise of the Electrophorus. Quadrant, 49(3), 2005, 22-33.

Michael, K. and Michael, M.G.: From Dataveillance to Überveillance (Uberveillance) and the Realpolitik of the Transparent Society (1 ed). Wollongong: University of Wollongong, 2007. Available at:

Michael, K. & Michael, M.G.: Innovative Automatic Identification and Location-Based Services: From Bar Codes to Chip Implants. IGI Global, Hershey, PA, 2009.

Michael, K. & Michael, M.G.: The Social and Behavioral Implications of Location-Based Services. Journal of Location-Based Services, 5(3/4), 2011, 1-15.

Michael, K. & Michael, M.G.: Sousveillance and Point of View Technologies in Law Enforcement: An Overview, in The Sixth Workshop on the Social Implications of National Security: Sousveillance and Point of View Technologies in Law Enforcement, University of Sydney, NSW, Australia, Feb. 2012.

Michael, K., Roussos, G., Huang, G.Q., Gadh, R., Chattopadhyay, A., Prabhu, S. and Chu, P.: Planetary-scale RFID Services in an Age of Uberveillance. Proceedings of the IEEE, 98.9, 2010, 1663-1671.

Michael, M.G. and Michael, K.: National Security: The Social Implications of the Politics of Transparency. Prometheus, 24(4), 2006, 359-364.

Michael, M.G. & Michael, K.: Towards a State of Uberveillance. IEEE Technology and Society Magazine, 29(2), 2010, 9-16.

Michael, M.G. & Michael, K. (eds): Uberveillance and the Social Implications of Microchip Implants: Emerging Technologies. Hershey, PA, IGI Global, 2013.

O'Connor, P.J. & Godar, S.H.: Chapter XIII. We Know Where You Are: The Ethics of LBS Advertising, in B.E. Mennecke and T.J. Strader (eds.), Mobile Commerce: Technology, Theory and Applications, Idea Group Publishing, Hershey, US, 2003, 245-261.

Orwell, G.: Nineteen Eighty Four. McPherson Printing Group, Maryborough, Victoria, 1949.

Oxford Dictionary: Control. Oxford University Press, 2012a.

Oxford Dictionary: Trust. Oxford University Press, 2012b.

Pavlou, P.A.: Consumer Acceptance of Electronic Commerce: Integrating Trust and Risk with the Technology Acceptance Model. International Journal of Electronic Commerce, 7(3), 2003, 69-103.

Perusco, L. & Michael, K.: Humancentric Applications of Precise Location Based Services, in IEEE International Conference on e-Business Engineering, Beijing, China, IEEE Computer Society, 2005, 409-418.

Perusco, L. & Michael, K.: Control, Trust, Privacy, and Security: Evaluating Location-Based Services. IEEE Technology and Society Magazine, 26(1), 2007, 4-16.

Perusco, L., Michael, K. & Michael, M.G.: Location-Based Services and the Privacy-Security Dichotomy, in Proceedings of the Third International Conference on Mobile Computing and Ubiquitous Networking, London, UK, Information Processing Society of Japan, 2006, 91-98.

Quinn, M.J.: Ethics for the Information Age. Second Edition, Pearson/Addison-Wesley, Boston, 2006.

Renegar, B., Michael, K. & Michael, M.G.: Privacy, Value and Control Issues in Four Mobile Business Applications, in 7th International Conference on Mobile Business (ICMB2008), Barcelona, Spain, IEEE Computer Society, 2008, 30-40.

Rozenfeld, M.: The Value of Privacy: Safeguarding Your Information in the Age of the Internet of Everything. The Institute: the IEEE News Source, 2014.

Rummel, R.J.: Death by Government. Transaction Publishers, New Brunswick, New Jersey, 1997.

Sanquist, T.F., Mahy, H. & Morris, F.: An Exploratory Risk Perception Study of Attitudes toward Homeland Security Systems. Risk Analysis, 28(4), 2008, 1125-1133.

Schoorman, F.D., Mayer, R.C. & Davis, J.H.: An Integrative Model of Organizational Trust: Past, Present, and Future. Academy of Management Review, 32(2), 2007, 344-354.

Shay, L.A., Conti, G., Larkin, D., Nelson, J.: A framework for analysis of quotidian exposure in an instrumented world. IEEE International Carnahan Conference on Security Technology (ICCST), 2012, 126-134.

Siau, K. & Shen, Z.: Building Customer Trust in Mobile Commerce. Communications of the ACM, 46(4), 2003, 91-94.

Solove, D.: Digital Dossiers and the Dissipation of Fourth Amendment Privacy. Southern California Law Review, 75, 2002, 1083-1168.

Solove, D.: The Digital Person: Technology and Privacy in the Information Age. New York University Press, New York, 2004.

Tavani, H.T.: Ethics and Technology: Ethical Issues in an Age of Information and Communication Technology. John Wiley, Hoboken, N.J., 2007.

Valacich, J.S.: Ubiquitous Trust: Evolving Trust into Ubiquitous Computing Environments. Business, Washington State University, 2003, 1-2.

van Ooijen, C. & Nouwt, S.: Power and Privacy: The Use of LBS in Dutch Public Administration, in B. van Loenen, J.W.J. Besemer and J.A. Zevenbergen (eds.), Sdi Convergence. Research, Emerging Trends, and Critical Assessment, Nederlandse Commissie voor Geodesie Netherlands Geodetic Commission 48, 2009, 75-88.

Wakunuma, K.J. and Stahl, B.C.: Tomorrow’s Ethics and Today’s Response: An Investigation into The Ways Information Systems Professionals Perceive and Address Emerging Ethical Issues. Inf Syst Front, 16, 2014, 383–397.

Weckert, J.: Trust and Monitoring in the Workplace. IEEE International Symposium on Technology and Society, 2000. University as a Bridge from Technology to Society, 2000, 245-250.

Wigan, M. & Clarke, R.: Social Impacts of Transport Surveillance. Prometheus, 24(4), 2006, 389-403.

Xu, H. & Teo, H.H.: Alleviating Consumers’ Privacy Concerns in Location-Based Services: A Psychological Control Perspective. Twenty-Fifth International Conference on Information Systems, 2004, 793-806.

Xu, H., Teo, H.H. & Tan, B.C.Y.: Predicting the Adoption of Location-Based Services: The Role of Trust and Perceived Privacy Risk. Twenty-Sixth International Conference on Information Systems, 2005, 897-910.

Yan, Z. & Holtmanns, S.: Trust Modeling and Management: From Social Trust to Digital Trust, in R. Subramanian (ed.), Computer Security, Privacy and Politics: Current Issues, Challenges and Solutions. IGI Global, 2008, 290-323.

Yeh, Y.S. & Li, Y.M.: Building Trust in M-Commerce: Contributions from Quality and Satisfaction. Online Information Review, 33(6), 2009, 1066-1086.

Citation: Roba Abbas, Katina Michael, M.G. Michael, "Using a Social-Ethical Framework to Evaluate Location-Based Services in an Internet of Things World", International Review of Information Ethics (IRIE), Dec 2014.


Honorary Fellow Dr Roba Abbas:

·         School of Information Systems and Technology, University of Wollongong, Northfields Avenue, Wollongong NSW 2522, Australia


·         Relevant publications:

o    R. Abbas, K. Michael, M.G. Michael, R. Nicholls, Sketching and validating the location-based services (LBS) regulatory framework in Australia, Computer Law and Security Review 29, No.5 (2013): 576-589.

o    R. Abbas, K. Michael, M.G. Michael, The Regulatory Considerations and Ethical Dilemmas of Location-Based Services (LBS): A Literature Review, Information Technology & People 27, No.1 (2014): 2-20.

Associate Professor Katina Michael:

·         School of Information Systems and Technology, University of Wollongong, Northfields Avenue, Wollongong NSW 2522, Australia


·         Relevant publications:

o    K. Michael, R. Clarke, Location and Tracking of Mobile Devices: Überveillance Stalks the Streets, Computer Law and Security Review 29, No.3 (2013): 216-228.

o    K. Michael, M. G. Michael, Innovative Automatic Identification and Location-Based Services: From Bar Codes to Chip Implants, IGI Global, (2009).

o    L. Perusco, K. Michael, Control, trust, privacy, and security: evaluating location-based services, IEEE Technology and Society Magazine 26, No.1 (2007): 4-16.

Honorary Associate Professor M.G. Michael

·         School of Information Systems and Technology, University of Wollongong, Northfields Avenue, Wollongong NSW 2522, Australia


·         Relevant publications:

o    M.G. Michael and K. Michael (eds) Uberveillance and the Social Implications of Microchip Implants: Emerging Technologies, Hershey: PA, IGI Global, (2013).

o    K. Michael, M. G. Michael, The Social and Behavioral Implications of Location-Based Services, Journal of Location-Based Services, Volume 5, Issue 3-4, (2011), 121-137.

o    M.G. Michael, K. Michael, Towards a State of Uberveillance, IEEE Technology and Society Magazine, 29, No.2, (2010): 9-16.

o    M. G. Michael, S. J. Fusco, K. Michael, A Research Note on Ethics in the Emerging Age of Uberveillance, Computer Communications, 31 No.6, 2008: 1192-1199.

Be Vigilant: There Are Limits to Veillance

The Computer After Me: Awareness and Self-Awareness in Autonomic Systems

Chapter 13: Be Vigilant: There Are Limits to Veillance


Katina Michael, M. G. Michael, Christine Perakslis

The following sections are included:

  • Introduction

  • From Fixed to Mobile Sensors

  • People as Sensors

  • Enter the Veillances

    • Surveillance

    • Dataveillance

    • Sousveillance

    • Überveillance

  • Colliding Principles

    • From ‘drone view’ to ‘person view’

    • Transparency and open data

    • Surveillance, listening devices and the law

    • Ethics and values

    • The unintended side effects of lifelogging

    • Pebbles and shells

    • When bad is good

    • Censorship

  • Summary and Conclusions: Mind/Body Distinction

13.1 Introduction

Be vigilant; we implore the reader. Yet, vigilance requires hard mental work (Warm et al., 2008). Humans have repeatedly shown evidence of poor performance relative to vigilance, especially when facing such factors as complex or novel data, time pressure, and information overload (Ware, 2000). For years, researchers have investigated the effects of vigilance, from its positive impact on the survival of the ground squirrel in Africa to the decrement that results in poor performance by air traffic controllers. Scholars seem to agree: fatigue has a negative bearing on vigilance.

In our society, we have become increasingly fatigued, both physically and cognitively. It has been widely documented that employees are increasingly faced with time starvation, and that consequently self-imposed sleep deprivation is one of the primary reasons for increasing fatigue, as employees forego sleep in order to complete more work (see, for example, the online publications by the Society of Human Resources and the National Sleep Foundation). Widespread access to technology exacerbates the problem, by making it possible to stay busy round the clock.

Our information-rich world, which leads to information overload and novel data, and our 24/7/365 connectivity, which leads to time pressure, both contribute to fatigue and so work against vigilance. However, the lack of vigilance, or the failure to accurately perceive, identify, or analyze bona fide threats, can lead to serious negative consequences, even a life-threatening state of affairs (Capurro, 2013).

This phenomenon, which can be termed vigilance fatigue, can be brought about by four factors:

·       Prolonged exposure to ambiguous, unspecified, and ubiquitous threat information.

·       Information overload.

·       Overwhelming pressure to maintain exceptional, error-free performance.

·       Faulty strategies for structuring informed decision-making under conditions of uncertainty and stress.

Therefore, as we ask the reader to be vigilant in this transformative – and potentially disruptive – transition toward the ‘computer after me’, we feel obligated to articulate clearly the potential threats associated with veillance. We believe we must ask the challenging and unpopular questions now. We must disclose and discuss the existence of risk, the values at stake, and the possibility of harm related to veillance. We owe it to the reader, in this world of increasing vigilance fatigue, to provide unambiguous, specified threat information and to bring it to their attention.

13.2 From Fixed to Mobile Sensors

Embedded sensors have provided us with a range of benefits and conveniences that many of us take for granted in our everyday life. We now find commonplace the auto-flushing lavatory and the auto-dispensing of soap and water for hand washing. Many of these practices are not only convenient but help to maintain health and hygiene. We even have embedded sensors in lamp-posts that can detect on-coming vehicles and are so energy efficient that they turn on as they detect movement, and then turn off again to conserve resources. However, these fixtures are static; they form basic infrastructure that often has ‘eyes’ (e.g. an image and/or motion sensor), but does not have ‘legs’.

What happens when these sensors – for identification, location, condition monitoring, point-of-view (POV) and more – become embeddable in mobile objects and begin to follow and track us everywhere we go? Our vehicles, tablets, smart phones, and even contactless smart cards are equipped to capture, synthesize, and communicate a plethora of information about our behaviors, traits, likes and dislikes, as we lug them around everywhere we go. Automatic licence plate scanners are mounted not only in streetlights or on bridges, but now also on patrol cars. These scanners snap photos of passing automobiles and store such data as plate numbers, times, and locations within massive databases (Clarke, 2009). Stores are combining the use of static fixtures with mobile devices to better understand the psychographics and demographics of their shoppers (Michael and Clarke, 2013). The combination of these monitoring tools is powerful. Cell phone identifiers are used to track the movements of customers (even if the customer is not connected to the store’s WiFi network), with the surveillance cameras collecting biometric analytics to analyze facial expressions and moods. Along with an augmented capability to customize and personalize marketing efforts, the stores can identify how long one tarries in an aisle, the customer’s reaction to a sale item, the age of the shopper, and even who did or did not walk by a certain display.

The human has now become an extension (voluntarily or involuntarily) of these location-based and affect-based technological breakthroughs; we the end-users are in fact the end-point of a complex network of networks. The devices we carry take on a life of their own, sending binary data up and down stream in the name of better connectivity, awareness, and ambient intelligence. ‘I am here’, the device continuously signals to the nearest access node, handshaking a more accurate location fix, as well as providing key behavioral indicators which can easily become predictors of future behaviors. However, it seems as if we, as a society, are rapidly demanding more and more communications technology – or so that is the idea we are being sold. Technology has its many benefits: few people are out of reach now, and communication becomes easier, more personalized, and much more flexible. Through connectivity, people’s input is garnered and responses can be felt immediately. Yet, just as Newton’s action–reaction law comes into play in the physical realm, there are reactions to consider for the human not only in the physical realm, but also in the mental, emotional, and spiritual realms (Loehr and Schwartz, 2001), when we live our lives not only in the ordinary world, but also within the digital world.

Claims have been made that our life has become so busy today that we are grasping to gain back seconds in our day. It could be asked: why should we waste time and effort by manually entering all these now-necessary passwords, when a tattoo or pill could transmit an 18-bit authentication signal for automatic logon from within our bodies? We are led to believe that individuals are demanding uninterrupted connectivity; however, research has shown that some yearn to have the freedom to ‘live off the grid’, even if for only a short span of time (Pearce and Gretzel, 2012).

A recent front cover of the US business magazine Fast Company read “Unplug. My life was crazy. So I disconnected for 25 days. You should too”. The content within the publication includes coping mechanisms of senior-level professionals who are working to mitigate the consequences of perpetual connectivity through technology. One article reveals the digital dilemmas we now face (e.g. how much should I connect?); another article provides tips on how to do a digital detox (e.g. disconnecting because of the price we pay); and yet another article outlines how to bring sanity to your crazy, wired life with eight ways the busiest connectors give themselves a break (e.g. taking time each day to exercise in a way that makes it impossible to check your phone; ditching the phone to ensure undivided attention is given to colleagues; or establishing a company ‘Shabbat’ in which it is acceptable to unplug one day a week). Baratunde Thurston, CEO and co-founder of Cultivated Wit (and considered by some to be the world’s most connected man), wrote:

I love my devices and my digital services, I love being connected to the global hive mind – but I am more aware of the price we pay: lack of depth, reduced accuracy, lower quality, impatience, selfishness, and mental exhaustion, to name but a few. In choosing to digitally enhance lives, we risk not living them.
— (Thurston, 2013, p. 77)

13.3 People as Sensors

Enter Google Glass, Autographer, Memoto, TrackStick, Fitbit, and other wearable devices that are worn like spectacles, apparel, or tied round the neck. The more pervasive innovations such as electronic tattoos, nanopatches, smart pills, and ICT implants seamlessly become a ‘part’ of the body once attached, swallowed, embedded, or injected. These technologies are purported to be lifestyle choices that can provide a myriad of conveniences and productivity gains, as well as improved health and well-being functionality. Wearables are believed to have such benefits as enhancements to self-awareness, communication, memory, sensing, recognition, and logistical skills. Common experiences can be augmented, for example when a theme park character (apparently) knows your child’s name because of a wrist strap that acts as an admissions ticket, wallet, and ID.

Gone are the days when there was a stigma around electronic bracelets being used to track those on parole; these devices are now becoming much like a fashion statement and a desirable method not only for safety and security, but also for convenience and enhanced experiences. However, one must consider that an innocuous method for convenience may prove to create ‘people as sensors’ in which information is collected from the environment using unobtrusive measures, but with the wearer – as well as those around the wearer – possibly unaware of the extent of the data collection. In addition to issues around privacy, other questions must be asked such as: what will be done with the data now and well into the future?

The metaphor of ‘people as sensors’, also referred to as Citizens as Sensors (Goodchild, 2007), is being espoused, as on-board chipsets allow an individual to look out toward another object or subject (e.g. using an image sensor), or to look inward toward oneself (e.g. measuring physiological characteristics with embedded surveillance devices). As optional prosthetic devices are incorporated into users, devices are recognized by some as becoming an extension of the person’s mind and body. New developments in ‘smart skin’ offer even more solutions. The skin can become a function of the user’s habits, personality, mood, or behavior. For example, when inserted into a shoe, the smart skin can analyze and improve the technical skill of an athlete, factors associated with body stresses related to activity, or even health issues that may result from the wearer’s use of high-heeled shoes (Papakostas et al., 2002). Simply put, human beings who function in analog are able to communicate digitally through the devices that they wear or bear. This is quite a different proposition from the typical surveillance camera that is bolted onto a wall overlooking the streetscape or mall and has a pre-defined field of view.

Fig. 13.1 People as sensors: from surveillance to uberveillance

‘People as sensors’ is far more pervasive than dash-cams used in police vehicles, and can be likened to the putting on of body-worn devices by law enforcement agencies to collect real-time data from the field (see Figure 13.1). When everyday citizens are wearing and bearing these devices, they form a collective network by contributing individual subjective (and personal) observations of themselves and their surroundings. There are advantages; the community is believed to benefit with relevant, real-time information on such issues as public safety, street damage, weather observations, traffic patterns, and even public health (cf. Chapter 12). People, using their everyday devices, can enter information into a data warehouse, which could also reduce the cost of intensive physical networks that otherwise need to be deployed. Although murky, there is also vulnerability, such as the risk of U-VGI (Un-Volunteered Geographical Information) when mass movements in a cell phone network are tracked to ascertain traffic distribution (Resch, 2013).

Consider it a type of warwalking on foot rather than wardriving.3 It seems that opt-in and opt-out features are not deemed necessary, perhaps due to the perceived anonymity of individual user identifiers. How to ‘switch off’, ‘turn off’, ‘unplug’, or select the ‘I do not consent’ feature in a practical way is a question that many have pondered, but with arguably a limited number of pragmatic solutions, if any.

With ‘citizens as sensors’ there is an opt-in for those subscribing, but issues need to be considered for those in the vicinity of the bearer who did not consent to subscribe or to be recorded. Researchers contend that even the bearer must be better educated on the potential privacy issues (Daskala, 2011). For example, user-generated information yields longitude and latitude coordinates, time and date stamps, and speed and elevation details which reveal significant aspects of a person’s everyday life, leading to insight into current and predicted behavioral patterns. Data could also be routinely intercepted (and stored indefinitely), as has been alleged in the recent National Security Agency (NSA) scandal. Even greater concerns arise from the potential use of dragnet electronic surveillance to be mined for information (now or in the future) to extract and synthesize rich heterogeneous data containing personal visual records and ‘friends lists’ of the new media. Call detail records (CDRs) may just be the tip of the iceberg.
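To make this concrete, even a handful of raw, user-generated track points can be reduced to a profile of a person's routine. The following is a minimal sketch; the coordinates are invented and the coarse grid-cell counting is purely illustrative, not any vendor's actual method:

```python
from collections import Counter

# Hypothetical track points logged by a wearable: (latitude, longitude, hour of day).
track = [
    (34.4052, 150.8784, 8),  (34.4052, 150.8784, 9),   # mornings at one place
    (34.4251, 150.8931, 13), (34.4251, 150.8931, 14),  # midday at another
    (34.4052, 150.8784, 19), (34.4053, 150.8784, 22),  # evenings back again
]

def frequent_places(points, precision=3):
    """Count visits per coarse grid cell (coordinates rounded to ~100 m).
    The most-visited cell is, very likely, where the person lives or works."""
    cells = Counter((round(lat, precision), round(lon, precision))
                    for lat, lon, _hour in points)
    return cells.most_common()

print(frequent_places(track))
```

Six anonymous points already suffice to rank one location as dominant; with weeks of data, the time-of-day column would separate ‘home’ from ‘work’ and expose deviations from routine.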

The quantified-self movement, which draws on the many data inputs of a person’s daily life, is being used for self-tracking and community building so individuals can work toward improving their daily functioning (e.g. how you look, feel, and live). Because devices can look inward toward oneself, one can mine very personal data (e.g. body mass index and heart rate) which can then be combined with the outward (e.g. the vital role of your community support network) to yield such quantifiers as a higi score defining a person with a cumulative grade (e.g. your score today out of a possible 999 points).4
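As a rough illustration of how such a cumulative grade might be assembled, consider a weighted composite of normalized inputs scaled to a 999-point ceiling. The metric names and weights below are hypothetical; higi's actual formula is proprietary and not described here:

```python
def composite_score(metrics, weights, max_score=999):
    """Combine normalized inputs (each in 0.0-1.0) into one cumulative grade.
    Purely illustrative weighting -- not the real higi formula."""
    total = sum(weights.values())
    normalized = sum(weights[k] * metrics[k] for k in weights) / total
    return round(normalized * max_score)

# Hypothetical inward (body) and outward (community) measurements, pre-normalized.
metrics = {"bmi": 0.8, "heart_rate": 0.9, "activity": 0.6, "community": 0.7}
weights = {"bmi": 2, "heart_rate": 2, "activity": 3, "community": 1}
print(composite_score(metrics, weights))
```

Whatever the real weighting, the point stands: deeply personal and social inputs are flattened into a single three-digit number that invites comparison.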

Wearables, together with other technologies, assist in the process of taking in multiple and varied data points to synthesize the person’s mental and physical performance (e.g. sleep quality), psychological states such as moods and stimulation levels (e.g. excitement), and other inputs such as food, air quality, location, and human interactions. Neurologically, information is addictive; yet, humans may make worse decisions when more information is at hand. Humans are also believed to overestimate the value of missing data, which may lead to an endless pursuit, or perhaps an overvaluing, of useless information (Bastardi and Shafir, 1998). Even more consequential, too much introspection may also reduce the quality of individuals’ decisions.

13.4 Enter the Veillances

Katina Michael and M. G. Michael (2009) made a presentation that, for the first time at a public gathering, considered surveillance, dataveillance, sousveillance, and überveillance all together. As a specialist term, veillance was first used in an important blogpost exploring equiveillance by Ian Kerr and Steve Mann (2006) in which the ‘valences of veillance’ were briefly described. But in contrast to Kerr and Mann, Michael and Michael were pondering the intensification of a state of überveillance through increasingly pervasive technologies, which can provide details from the big picture view right down to the minuscule personal details.

But what does veillance mean? And how is it understood in different contexts? What does it mean to be watched by a CCTV camera, to have one’s personal details deeply scrutinized, to watch another, to watch one­self? And so we continue by defining the four types of veillances that have received attention in recognized peer reviewed journal publications and the wider corpus of literature.

13.4.1 Surveillance

First, there is the much-embraced idea of surveillance, recognized in the early nineteenth century, from the French sur meaning ‘over’ and veiller meaning ‘to watch’. According to the Oxford English Dictionary, veiller stems from the Latin vigilare, which means ‘to keep watch’.

13.4.2 Dataveillance

Dataveillance was conceived by Clarke (1988a) as “the systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons” (although in the Oxford English Dictionary it is now defined as “the practice of monitoring the online activity of a person or group”). The term was introduced in response to government agency data matching initiatives linking taxation records and social security benefits, among other commercial data mining practices. At the time it was a powerful response to the Australia Card proposal of 1987 (Clarke, 1988b), which was never implemented by the Hawke Government, while the Howard Government’s attempts to introduce an Access Card almost two decades later in 2005 were also unsuccessful. It is remarkable that the same issues ensue today, only on a greater magnitude, with more consequences and advanced capabilities in analytics, data storage, and converging systems.
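The data matching that prompted Clarke's term can be sketched in a few lines: two agencies' personal data systems joined on a shared identifier. The records and the national-ID key below are entirely hypothetical, standing in for what a scheme like the Australia Card would have enabled:

```python
# Hypothetical records held by two separate government agencies.
tax_records = {
    "ID-1001": {"name": "J. Citizen",  "declared_income": 92000},
    "ID-1002": {"name": "A. Resident", "declared_income": 15000},
}
welfare_records = {
    "ID-1002": {"benefit": "unemployment", "annual_payment": 14000},
    "ID-1003": {"benefit": "disability",   "annual_payment": 21000},
}

def match(tax, welfare):
    """Link the two personal data systems on the shared identifier,
    producing a merged profile that neither agency held on its own."""
    return {pid: {**tax[pid], **welfare[pid]}
            for pid in tax.keys() & welfare.keys()}

linked = match(tax_records, welfare_records)
print(linked)
```

Only one identifier appears in both systems, yet the merge yields a combined income-and-benefits profile; a universal identifier would make every record joinable in this way, which is precisely what the 1987 opposition objected to.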

13.4.3 Sousveillance

Sousveillance was defined by Steve Mann in 2002, but practiced since 1995, as “the recording of an activity from the perspective of a participant in the activity”.5 However, its initial introduction into the literature came in the inaugural Surveillance and Society journal in 2003 with a meaning of ‘inverse surveillance’ as a counter to organizational surveillance (Mann et al., 2003). Mann prefers to interpret sousveillance as under-sight, which maintains integrity, contra to surveillance as over-sight (Mann, 2004a), which reduces to hypocrisy if governments responsible for surveillance pass laws to make sousveillance illegal.

Whereas dataveillance is the systematic use of personal data systems in the monitoring of people, sousveillance is the inverse of monitoring people; it is the continuous capture of personal experience (Mann, 2004b). For example, dataveillance might include the linking of someone’s tax file number with their bank account details and communications data. Sousveillance, on the other hand, is a voluntary act of logging what people might see as they move through the world. Surveillance is thus considered watching from above, whereas sousveillance is considered watching from below. In contrast, dataveillance is the monitoring of a person’s activities which presents the individual with numerous social dangers (Clarke, 1988a).

13.4.4 Überveillance

Überveillance, conceived by M. G. Michael in 2006, is defined in the Australian Law Dictionary as: “ubiquitous or pervasive electronic surveillance that is not only ‘always on’ but ‘always with you’, ultimately in the form of bodily invasive surveillance”. The Macquarie Dictionary of Australia entered the term officially in 2008 as “an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices in the human body”. Michael and Michael (2007) defined überveillance as having “to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought)”.

Überveillance is a compound word, conjoining the German über meaning ‘over’ or ‘above’ with the French veillance. The concept is very much linked to Friedrich Nietzsche’s vision of the übermensch, a man with powers beyond those of an ordinary human being, like a super-man with amplified abilities (Michael and Michael, 2010). Überveillance is analogous to big brother on the inside looking out: for example, heart, pulse, and temperature sensor readings emanating wirelessly from the body in binary bits, or even amplified eyes such as inserted contact lens ‘glass’ that might provide a visual display and access to the Internet or social networking applications.

Überveillance brings together all forms of watching from above and from below, from machines that move to those that stand still, from animals and from people, acquired involuntarily or voluntarily using obtrusive or unobtrusive devices (Michael et al., 2010). The network infrastructure underlies the ability to collect data directly from the sensor devices worn by the individual, and big data analytics ensures an interpretation of the unique behavioral traits of the individual, implying more than just predicted movement, but intent and thought (Michael and Miller, 2013).

It has been said that überveillance is that part of the veillance puzzle that brings together the sur, data, and sous to an intersecting point (Stephan et al., 2012). In überveillance, there is the ‘watching’ from above component (sur), there is the ‘collecting’ of personal data and public data for mining (data), and there is the watching from below (sous), which can draw together social networks and strangers, all coming together via wearable and implantable devices on/in the human body. Überveillance can be used for good in the practice of health for instance, but we contend that, independent of its application for non-medical purposes, it will always have an underlying control factor (Masters and Michael, 2006).

13.5 Colliding Principles

13.5.1 From ‘drone view’ to ‘person view’

It can be argued that, because a CCTV camera is monitoring activities from above, we should have the ‘counter-right’ to monitor the world around us from below. It therefore follows that if Google can record ‘street views’, then the average citizen should also be able to engage in that same act, which we may call ‘person view’. Our laws as a rule do not forbid recording the world around us (or even each other for that matter), so long as we are not encroaching on someone else’s well-being or privacy (e.g. stalking, or making material public without expressed consent). While we have Street View today, it will only be a matter of time before we have ‘drones as a service’ (DaaS) products that systematically provide even better high resolution imagery than ‘satellite views’. If we can make ‘drone view’ available on Google Maps, we could probably also make ‘person view’ available. Want to look up not only a street, but a person, if they are logged in and registered? Then search ‘John Doe’ and find the nearest camera pointing toward him, and/or emanating from him. Call it a triangulation of sorts.

13.5.2 Transparency and open data

The benefits of this kind of transparency, argue numerous scholars, are that not only will we have a perfect source of open data to work with, but that there will be less crime as people consider the repercussions of being caught doing wrong in real-time. However, this is quite an idealistic paradigm and ethically flawed. Criminals, and non-criminals for that matter, find ways around all secure processes, no matter how technologically foolproof. At that point, the technical elite might well be systematically hiding or erasing their recorded misdemeanours while no doubt keeping the innocent person under 24/7/365 watch. There are, however, varying degrees of transparency, and most of these have to do with economies of scale and/or are context-based; they have to be. In short, transparency needs to be context related.

13.5.3 Surveillance, listening devices and the law

At what point do we actually believe that in a public space our privacy is not invaded by such incremental innovations as little wearable cameras, half the size of a matchbox, worn as lifelogging devices? One could speculate that the very small size of these devices makes them unobtrusive and not easily detectable to the naked eye, meaning that they are covert in nature and blatantly break the law in some jurisdictions where they are worn and operational (Abbas et al., 2011). Some of these devices not only capture images every 30 seconds, but also record audio, making them potentially a form of unauthorized surveillance. It is also not always apparent when these devices are on or off. We must consider that the “unrestricted freedom of some may endanger the well-being, privacy, or safety of others” (Rodota and Capurro, 2005, p. 23). Where are the distinctions between the wearer’s right to capture his or her own personal experiences on the one hand (i.e. the unrestricted freedom of some), and intrusion into another’s private sphere in which he or she does not want to be recorded, and is perhaps even disturbed by the prospect of losing control over his or her privacy (i.e. endangering the well-being or privacy of others)?

13.5.4 Ethics and values

Enter ethics and values. Ethics in this debate are greatly important. They have been dangerously pushed aside, for it is ethics that determine the degree of importance, that is the value, we place on the levels of our decision-making. When is it right to take photographs and record another individual (even in a public space), and when is it wrong? Do I physically remove my wearable device when I enter a washroom, a leisure centre, a hospital, a funeral, someone else’s home, a bedroom? Do I need to ask express permission from someone to record them, even if I am a participant in a shared activity? What about unobtrusive devices that blur the line between wearables and implantables, such as miniature recording devices embedded in spectacle frames or eye sockets and possibly in the future embedded in contact lenses? Do I have to tell my future partner or prospective employer? Should I declare these during the immigration process before I enter the secure zone?

At the same time, independent of how much crowdsourced evidence is gathered for a given event, wearables and implantables are not infallible; their sensors can easily misrepresent reality through inaccurate or incomplete readings, and data can be even further misconstrued post capture (Michael and Michael, 2007). This is the limitation of an überveillance society – devices are equipped with a myriad of sensors; they are celebrated as achieving near omnipresence, but the reality is that they will never be able to achieve omniscience. Finite knowledge and imperfect awareness create much potential for inadequate or incomplete interpretations.

Some technologists believe that they need to rewrite the books on metaphysics and ontology, as a result of old and outmoded definitions in the traditional humanities. We must be wary of our increasing ‘technicized’ environment however, and continue to test ourselves on the values we hold as canonical, which go towards defining a free and autonomous human being. The protection of personal data has been deemed by the EU as an autonomous individual right.

Yet, with such pervasive data collection, how will we protect “the right of informational self-determination on each individual – including the right to remain master of the data concerning him or her” (Rodota and Capurro, 2005, p. 17)? If we rely on bio-data to drive our next move based on what our own wearable sensors tell some computer application is the right thing to do, we very well may lose a great part of our freedom and the life-force of improvisation and spontaneity. By allowing this data to drive our decisions, we make ourselves prone to algorithmic faults in software programs, among other significant problems.

13.5.5 The unintended side effects of lifelogging

Lifelogging captures continuous first-person recordings of a person’s life and can now be dynamically integrated into social networking and other applications. If lifelogging is recording your daily life with technical tools, many are unintentionally participating in a form of lifelogging by recording their lives through social networks, although, technically, data capture in social media happens in bursts (e.g. the upload of a photograph) rather than as a continuous first-person recording (Daskala, 2011). Lifelogging is believed to have such benefits as affecting how we remember, increasing productivity, reducing an individual’s sense of isolation, building social bonds, capturing memories, and enhancing communication.

Governing bodies could also derive benefit from lifelogging applications data to better understand public opinion or forecast emerging health issues for society. However, memories gathered by lifelogs can have side effects. Not every image, and not every recording you take, will be a happy one. Replaying these and other moments might be detrimental to our well-being. For example, history shows ‘looking back’ may become traumatic, such as Marina Lutz’s experience of having most of the first 16 years of her life either recorded or photographed by her father (see the short film The Marina Experience).

Researchers have discovered that personality development and mental health could also be negatively impacted by lifelogging applications. Vulnerabilities include high influence potential by others, suggestibility, weak perception of self, and a resulting low self-esteem (Daskala, 2011). There is also a risk that wearers may post undesirable or personal expressions of another person, which cause that person emotional harm due to a negative perception of himself or herself among third parties (Daskala, 2011). We have already witnessed such events in other social forums with tragic consequences such as suicides.

Lifelogging data may also create unhealthy competition, for example in gamification programs that use higi scores to compare your quality of life to others. Studies report psychological harm among those who perceive they do not meet peer expectations (Daskala, 2011); how much more so when intimate data about one’s physical, emotional, psychological, and social network is integrated, measured, and calculated to sum up quality of life in a three-digit score (Michael and Michael, 2011). Even the effect of sharing positive lifelogging data should be reconsidered. Various reports have claimed that watching other people’s lives can develop into an obsession and can incite envy, feelings of inadequacy, or feeling as if one is not accomplished enough, especially when comparing oneself to others.

13.5.6 Pebbles and shells

Perhaps lifelogs could have the opposite effect of their intended purpose, without ever denying the numerous positives. We may become wrapped up in the self, rather than in the common good, playing to a theater, and not allowing ourselves to flourish in other ways lest we are perceived as anything but normal. Such logging posted onto public Internet archival stores might well serve to promote a conflicting identity of the self, constant validation through page ranks, hit counts and likes, and other forms of electronic exhibitionism. Researchers purport that lifelogging activities are likely to lead to an over-reliance and excessive dependency on electronic devices and systems with emotionally concerning, on-going cognitive reflections as messages are posted or seen, and this could be at the expense of more important aspects of life (Daskala, 2011).

Isaac Newton gave us much to consider when he said, “I was like a boy playing on the sea-shore, and diverting myself now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me” (Brewster, 2001). Society at large must question if the measurements of Google hits, higi scores, clicks, votes, recordings, and analysis of data to quantify ‘the self’, could become a dangerously distracting exercise if left unbalanced. The aforementioned measurements, which are multi-varied and enormously insightful, may be of value – and of great enjoyment and fascination – much like Newton’s pebbles and shells. However, what is the ocean we may overlook – or ignore – as we scour the beach for pebbles and shells?

13.5.7 When bad is good

Data collection and analysis systems, such as lifelogging, may not appropriately allow for individuals to progress in self-awareness and personal development upon tempered reflection. How do we aptly measure the contradictory aspects of life such as the healing that often comes through tears, or the expending of energy (exercise) to gain energy (physical health), or the unique wonder that is realized only through the pain of self-sacrifice (e.g. veritable altruistic acts)? Harvard researchers Loehr and Schwartz (2001) provide us with further evidence of how the bad (or the unpleasant) can be good relative to personal development, through an investigation in which a key participant went by the name of ‘Richard’.

Richard was an individual progressing in self-awareness as documented during an investigation in which researchers were working to determine how executives could achieve peak performance leading to increased capacity for endurance, determination, strength, flexibility, self-control, and focus. The researchers found that executives who perform to full potential over the long term tap into energy at all levels of the ‘pyramid of performance’, which has four ascending levels of progressive capacities: physical, emotional, mental, and spiritual.

The tip of the pyramid was identified as spiritual capacity, defined by the researchers as “an energy that is released by tapping into one’s deepest values and defining a strong sense of purpose” (Loehr and Schwartz, 2001, p. 127). The spiritual capacity, above all else, was found to be the sustenance – or the fuel – of the ideal performance state (IPS); the state in which individuals ‘bring their talent and skills to full ignition and to sustain high performance over time’ (op. cit., p. 122). However, as Richard worked to realize his spiritual capacity, he experienced significant pain during a two-year period. He reported being overcome by emotion, consumed with grief, and filled with longing as he learned to affirm what mattered most in his life. The two-year battle resulted in Richard ‘tapping into a deeper sense of purpose with a new source of energy’ (op. cit., p. 128); however, one must question if technology would have properly quantified the bad as the ultimate good for Richard. Spiritual reflections on the trajectory of technology (certainly since it has now been plainly linked to teleology) are not out of place nor should they be discouraged.

13.5.8 Censorship

Beyond the veillance (the ‘watching’) of oneself, i.e. the inward gaze, is the outward veillance and watching of the other. But this point of eye (PoE) does not necessarily mean a point of view (PoV), or even a wider-angle field of view (FoV), particularly in the context of ‘glass’. Our gaze too is subjective, and who or what will connote this censorship at the time when it really matters? The outward watching too may not tell the full story, despite its rich media capability to gather both audio and video. Audio-visual accounts have their own pitfalls. We have long known how vitally important eye gaze is for all of the social primates, and particularly for humans; there will be consequences to any artificial tampering with this basic natural instinct. Hans Holbein’s famous painting The Ambassadors (1533), with its patent reference to anamorphosis, speaks volumes of the critical distinction between PoE and PoV. Take a look, if you are not already familiar with this double portrait and still life. Can you see the skull? The secret lies in the perspective and in the tilt of the head.

13.6 Summary and Conclusions: Mind/Body Distinction

In the future, corporate marketing may hire professional lifeloggers (or mobile robotic contraptions) to log other people’s lives with commercial devices. Unfortunately, because of inadequate privacy policies or a lack of harmonized legislation, we, as consumers, may find no laws that would preclude companies from this sort of ‘live to life’ hire if we do not pull the reins on the obsession to auto-photograph and audio record everything in sight. And this needs to happen right now. We have already fallen behind and are playing a risky game of catch-up. Ethics is not the overriding issue for technology companies or developers; innovation is their primary focus because, in large part, they have a fiduciary responsibility to turn a profit. We must in turn, as an informed and socially responsive community, forge together to dutifully consider the risks. At what point will we leap from tracking the mundane, which is of the body (e.g. location of GPS coordinates), toward the tracking of the mind by bringing all of these separate components together using über-analytics and an über-view? We must ask the hard questions now. We must disclose and discuss the existence of risk, the values at stake, and the possibility of harm.

It is significant that as researchers we are once more, at least in some places, speaking on the importance of the Cartesian mind/body distinction and of the catastrophic consequences should the two continue to be confused when it comes to etymological implications and ontological categories. The mind and the body are not identical, even if we are to argue from Leibniz’s Law of Identity that two things can only be identical if they share exactly the same qualities. Here as well, vigilance is enormously important, lest we forget the real distinction between machine and human.


Abbas, R., Michael, K., Michael, M. G., & Aloudat, A. (2011). Emerging Forms of Covert Surveillance Using GPS-Enabled Devices. Journal of Cases on Information Technology, 13(2), 19-33.

ACLU. (2013). You Are Being Tracked: How License Plate Readers Are Being Used to Record Americans' Movements. from

Adler, I. (2013). How Our Digital Devices Are Affecting Our Personal Relationships. 90.9 WBUR.

ALD (Ed.). (2010). Uberveillance: Oxford University Press.

Australian Privacy Foundation. (2005). Human Services Card.   Retrieved 6 June 2013, from

Bastardi, A., & Shafir, E. (1998). On the Pursuit and Misuse of Useless Information. Journal of Personality and Social Psychology, 75(1), 19-32.

Brewster, D. (2001). Memoirs of the Life, Writings, and Discoveries of Sir Isaac Newton (1855) Volume II. Ch. 27: Adamant Media Corporation.

Capurro, R. (2013). Medicine in the information and knowledge society. Conference presentation.

Carpenter, L. (2011). Marina Lutz interview: The sins of my father. The Observer   Retrieved 20 April 2013, from

Clarke, R. (1988a). Information Technology and Dataveillance. Communications of the ACM, 31(5), 498-512.

Clarke, R. (1988b). Just another piece of plastic in your wallet: the ‘Australian card’ scheme. ACM SIGCAS Computers and Society, 18(1), 7-21.

Clarke, R. (2009, 7 April 2009). The Covert Implementation of Mass Vehicle Surveillance in Australia. Paper presented at the Fourth Workshop on the Social Implications of National Security: Covert Policing, Canberra, Australia.

Clifford, S., & Hardy, Q. (2013). Attention, Shoppers: Store Is Tracking Your Cell.   Retrieved 14 July, from

Collins, L. (2008). Annals of Crime. Friend Game. Behind the online hoax that led to a girl’s suicide. The New Yorker.

DailyMail. (2013). Stores now tracking your behavior and moods through cameras.   Retrieved 6 August, from

ENISA. (2011). To log or not to log?: Risks and benefits of emerging life-logging applications. European Network and Information Security Agency   Retrieved 6 July 2013, from

FastCompany. (2013). #Unplug. Fast Company, July/August(177).

Frankel, T. C. (2012, 20 October). Megan Meier's mom is still fighting bullying.   Retrieved 4 November 2012

Friedman, R. (2012). Why Too Much Data Disables Your Decision Making. Psychology Today: Glue   Retrieved December 4, 2012, from

Goodchild, M. F. (2007). Citizens as sensors: the world of volunteered geography. GeoJournal, 69, 211–221.

Greenwald, G. (2013). NSA collecting phone records of millions of Verizon customers daily. The Guardian   Retrieved 10 August 2013, from

Hans Holbein the Younger. (1533). The Ambassadors.

Hayes, A. (2010). Uberveillance (Triquetra).   Retrieved 6 May 2013, from

HIGI. (2013). Your Score for Life.   Retrieved 29 June 2013, from

Intellitix. (2013). Reshaping the Event Horizon.   Retrieved 6 July 2013, from

Kerr, I., & Mann, S. (2006). Exploring Equiveillance. ID TRAIL MIX.

Krause. (2012). Vigilance Fatigue in Policing.   Retrieved 22 July, from

Levin, A. (2013). Waiting for Public Outrage. Paper presented at the IEEE International Symposium on Technology and Society, Toronto, Canada.

Loehr, J., & Schwartz, T. (2001). The Making of a Corporate Athlete. Harvard Business Review, January, 120-129.

Lutz, M. (2012). The Marina Experiment.   Retrieved 29 May 2013, from

Macquarie (Ed.). (2009). Uberveillance: Sydney University.

Magid, L. (2013). Wearables and Sensors Big Topics at All Things D. Forbes.

Mann, S. (2004a). Continuous lifelong capture of personal experience with EyeTap. Paper presented at the ACM International Multimedia Conference, Proceedings of the 1st ACM workshop on Continuous archival and retrieval of personal experiences (CARPE 2004), New York.

Mann, S. (2004b). Sousveillance: inverse surveillance in multimedia imaging. Paper presented at the Proceedings of the 12th annual ACM international conference on Multimedia, New York, NY, USA.

Mann, S., Nolan, J., & Wellman, B. (2003). Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments. Surveillance and Society, 1(3), 331-355.

Masters, A., & Michael, K. (2006). Lend me your arms: the use and implications of humancentric RFID. Electronic Commerce Research and Applications, 6(1), 29-39.

Michael, K. (2010). Stop social network pitfalls. Illawarra Mercury.

Michael, K. (2013a). Big Data and the Dangers of Over-Quantifying Oneself. Computer Magazine (Multimedia)   Retrieved June 7, 2013, from

Michael, K. (2013b). Snowden's Revelations Just the Tip of the Iceberg.   Retrieved 6 July 2013, from

Michael, K. (2013c). Social Implications of Wearable Computing and Augmediated Reality in Every Day Life (IEEE Symposium on Technology and Society, ISTAS13). Toronto: IEEE.

Michael, K. (2013d). Wearable computers challenge human rights. ABC Science Online.

Michael, K., & Clarke, R. (2013). Location and tracking of mobile devices: Überveillance stalks the streets. Computer Law & Security Review, 29(3), 216-228.

Michael, K., & Michael, M. G. (2009). Teaching Ethics in Wearable Computing:  the Social Implications of the New ‘Veillance’. EduPOV   Retrieved June 18, from

Michael, K., & Michael, M. G. (2012). Converging and coexisting systems towards smart surveillance. Awareness Magazine: Self-awareness in autonomic systems, June.

Michael, K., & Michael, M. G. (Eds.). (2007). From Dataveillance to Überveillance and the Realpolitik of the Transparent Society. Wollongong, NSW, Australia.

Michael, K., & Miller, K. W. (2013). Big Data: New Opportunities and New Challenges. IEEE Computer, 46(6), 22-24.

Michael, K., Roussos, G., Huang, G. Q., Gadh, R., Chattopadhyay, A., Prabhu, S., et al. (2010). Planetary-scale RFID Services in an Age of Uberveillance. Proceedings of the IEEE, 98(9), 1663-1671.

Michael, M. G., & Michael, K. (2007). Uberveillance. Paper presented at the 29th International Conference of Data Protection and Privacy Commissioners. Privacy Horizons: Terra Incognita, Location Based Tracking Workshop, Montreal, Canada.

Michael, M. G., & Michael, K. (2010). Towards a State of Uberveillance. IEEE Technology and Society Magazine, 29(2), 9-16.

Michael, M. G., & Michael, K. (2011). The Fall-Out from Emerging Technologies: on Matters of Surveillance, Social Networks and Suicide. IEEE Technology and Society Magazine, 30(3), 15-18.

mX. (2013). Hard to Swallow.   Retrieved 6 August 2013, from

Orcutt, M. (2013). Electronic “Skin” Emits Light When Pressed. MIT Tech Review.

Oxford Dictionary. (2013). Dataveillance.   Retrieved 6 May 2013, from

Oxford Dictionary. (2013). Surveillance.   Retrieved 6 May 2013, from

Papakostas, T. V., Lima, J., & Lowe, M. (2002). A Large Area Force Sensor for Smart Skin Applications. Sensors; Proceedings of IEEE, 5(3).

Pearce, P., & Gretzel, U. (2012). Tourism in technology dead zones: documenting experiential dimensions. International Journal of Tourism Sciences, 12(2), 1-20.

Pivtohead. (2013). Wearable Imaging: True point of view.   Retrieved 22 June 2013, from

Pokin, S. (2007). MySpace' hoax ends with suicide of Dardenne Prairie teen. St. Louis Post-Dispatch.

Resch, B. (2013). People as Sensors and Collective Sensing-Contextual Observations Complementing Geo-Sensor Network Measurements. Paper presented at the Progress in Location-Based Services, Lecture Notes in Geoinformation and Cartography.

Roberts, P. (1984). Information Visualization for Stock Market Ticks: Toward a New Trading Interface. Massachusetts Institute of Technology, Boston.

Rodota, S., & Capurro, R. (2005). Ethical Aspects of ICT Implants in the Human Body. The European Group on Ethics in Science and New Technologies (EGE)   Retrieved June 3, 2006, from

SHRM. (2011). from

Spence, R. (2009). Eyeborg.   Retrieved 22 June 2010, from

Stephan, K. D., Michael, K., Michael, M. G., Jacob, L., & Anesta, E. (2012). Social Implications of Technology: Past, Present, and Future. Proceedings of the IEEE, 100(13), 1752-1781.

SXSchedule. (2013). Better Measure: Health Engagement & higi Score.   Retrieved 29 June 2013, from

Thurston, B. (2013). I have left the internet. Fast Company, July/August(177), 66-78, 104-105.

Ware, C. (2000). Information Visualization: Perception for Design. San Francisco, CA: Morgan Kaufmann.

Warm, J. S., Parasuraman, R., & Matthews, G. (2008). Vigilance Requires Hard Mental Work and Is Stressful. Human Factors, 433-441.

Williams, R. B. (2012). Is Facebook Good Or Bad For Your Self-Esteem? Psychology Today: Wired for Success.

Wordnik. (2013). Sousveillance.   Retrieved 6 June 2013, from




3 Someone searching for a WiFi wireless network connection using a mobile device in a moving vehicle.



Citation: Katina Michael, M. G. Michael, and Christine Perakslis (2014) Be Vigilant: There Are Limits to Veillance. The Computer After Me: pp. 189-204. DOI: 

Uberveillance and the Social Implications of Microchip Implants: Preface

Uberveillance and the Social Implications of Microchip Implants: Emerging Technologies

In addition to common forms of spatial data such as satellite imagery and street views, emerging automatic identification research is exploring the use of microchip implants in order to further track an individual’s personal data, identity, location, and condition in real time.

Uberveillance and the Social Implications of Microchip Implants: Emerging Technologies presents case studies, literature reviews, ethnographies, and frameworks supporting the emerging technologies of RFID implants, while also highlighting the current and predicted social implications of human-centric technologies. This book is essential for professionals and researchers engaged in the development of these technologies, and provides insight and support to those inquiring into embedded micro-technologies.


Katina Michael, University of Wollongong, Australia

M.G. Michael, University of Wollongong, Australia


Uberveillance can be defined as an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices into the human body. These embedded technologies can take the form of traditional pacemakers, radio-frequency identification (RFID) tag and transponder implants, smart swallowable pills, nanotechnology patches, multi-electrode array brain implants, and even smart dust to mention but a few form factors. To an extent, head-up displays like electronic contact lenses that interface with the inner body (i.e. the eye which sits within a socket) can also be said to be embedded and contributing to the uberveillance trajectory, despite their default categorisation as body wearables.

Uberveillance has to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought). Uberveillance can be a predictive mechanism for a person’s expected behaviour, traits, likes, or dislikes based on historical fact; or it can be about real-time measurement and observation; or it can be something in between. The inherent problem with uberveillance is that facts do not always add up to truth, and predictions or interpretations based on uberveillance are not always correct, even if there is direct visual evidence available (Shih, 2013). Uberveillance is more than closed circuit television feeds, or cross-agency databases linked to national identity cards, or biometrics and ePassports used for international travel. Uberveillance is the sum total of all these types of surveillance and the deliberate integration of an individual’s personal data for the continuous tracking and monitoring of identity, location, condition, and point of view in real-time (Michael & Michael, 2010b).
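The layering described above, raw who/where/when observations with why/what/how only ever inferred on top, can be made concrete with a minimal sketch. All type and function names here are our own hypothetical illustrations, not drawn from any actual uberveillance system; the toy "dwell" inference deliberately shows how motive is a fallible derivation, not a recorded fact.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class VeillanceRecord:
    """One observation in a continuous tracking stream.

    Only the fundamental who/where/when triple (plus optional condition
    and point-of-view fields) is raw data; anything else is inference.
    """
    subject_id: str                      # who (ID)
    latitude: float                      # where (location)
    longitude: float
    timestamp: datetime                  # when (time)
    condition: Optional[str] = None      # e.g. a heart-rate band or mood tag
    point_of_view: Optional[str] = None  # e.g. a reference to first-person video

def infer_dwell(records: list[VeillanceRecord],
                threshold_s: float = 600) -> list[tuple[str, float]]:
    """Toy 'why' inference: flag consecutive observations of the same
    subject at (nearly) the same spot more than `threshold_s` seconds
    apart -- a crude dwell signal that says nothing about actual motive."""
    flagged = []
    for a, b in zip(records, records[1:]):
        if a.subject_id != b.subject_id:
            continue
        dt = (b.timestamp - a.timestamp).total_seconds()
        same_spot = (abs(a.latitude - b.latitude) < 1e-4
                     and abs(a.longitude - b.longitude) < 1e-4)
        if same_spot and dt >= threshold_s:
            flagged.append((a.subject_id, dt))
    return flagged
```

The gap between the two layers is exactly the problem the text identifies: a flagged dwell is a fact about coordinates and clocks, but any "motivation" read into it is an interpretation that may simply be wrong.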

In its ultimate form, uberveillance has to do with more than automatic identification and location-based technologies that we carry with us. It has to do with under-the-skin technology that is embedded in the body, such as microchip implants. Think of it as Big Brother on the inside looking out. It is like a black box embedded in the body which records and gathers evidence, and in this instance transmits specific measures wirelessly back to base. This implant is virtually meaningless without the hybrid network architecture that supports its functionality: making the person a walking online node. We are referring here to the lowest common denominator, the smallest unit of tracking – presently a tiny chip inside the body of a human being. But it should be stated that electronic tattoos and nano-patches that are worn on the body can also certainly be considered mechanisms for data collection in the future. Whether wearable or bearable, it is the intent and objective which remains important, the notion of “people as sensors.” The gradual emergence of the so-called human cloud, that cloud computing platform which allows for the Internetworking of human “points of view” using wearable recording technology (Nolan, 2013), will also be a major factor in the proactive profiling of individuals (Michael & Michael, 2011).


This present volume will aim to equip the general public with much needed educational information about the technological trajectory of RFID implants through exclusive primary interviews, case studies, literature reviews, ethnographies, surveys and frameworks supporting emerging technologies. It was in 1997 that bioartist Eduardo Kac (Figure 1) implanted an RFID chip in his leg in a live performance titled Time Capsule in Brazil (Michael & Michael, 2009). The following year, in an unrelated experiment, Kevin Warwick injected an implant into his left arm (Warwick, 2002; K. Michael, 2003). By 2004, the Verichip Corporation had their VeriChip product approved by the Food and Drug Administration (FDA) (Michael, Michael & Ip 2008). Since that point, there has been a great deal of misinformation and confusion surrounding the microchip implant, but also a lot of build-up on the part of the proponents of implantables.

Figure 1. 

Eduardo Kac implanting himself in his left leg with an RFID chip using an animal injector kit on 11 November 1997. Courtesy Eduardo Kac. More at

Radio-Frequency Identification (RFID) is not an inherently secure technology; in fact, it can be argued that it is just the opposite (Reynolds, 2004). So why someone would wish to implant something beneath the skin for non-medical reasons is quite puzzling, despite the touted advantages. One of the biggest issues, not commonly discussed in public forums, has to be the increasing numbers of people who are suffering from paranoid or delusional thoughts with respect to enforced implantation or implantation through stealth. We have already encountered significant problems in the health domain, where, for example, a clinical psychologist can no longer readily discount the claims of patients who identify with having been implanted or tracked and monitored using inconspicuous forms of ID. This will be especially true in the era of smart dust, almost invisible to the naked eye, which has yet to fully arrive. Civil libertarians, religious advocates, and so-named conspiracy theorists will not be the only groups to discuss the real potential of microchipping people, and for this reason, the discussion will move into the public policy forum, inclusive of all stakeholders in the value chain.

Significantly, this book will also provide researchers and professionals who are engaged in the development or implementation of emerging services with awareness of the social implications of human-centric technologies. These implications cannot be ignored by operational stakeholders, such as engineers and the scientific elite, if we hope to enact long-term beneficial change with new technologies that will have a positive impact on humanity. We cannot possess the attitude that says: let us see how far we can go with technology and we will worry about the repercussions later. To do so would be short-sighted and would ignore the importance of socio-technical sustainability. Ethics are apparently irrelevant to the engineer who is innovating in a market-driven and research-funded environment. For sure there are some notable exceptions where a middle-of-the-road approach is pursued, notably in the medical and educational contexts. Engineering ethics do, of course, exist, though they are unfortunately often denigrated and misinterpreted as discourses on “goodness” or appeals to the categorical imperative. Nevertheless, industry as a whole has a social responsibility to consumers at large: to ensure that it has considered what the misuse of its innovations might mean in varied settings and scenarios, to ensure that there are limited, if any, health effects from the adoption of particular technologies, and to ensure that adverse event reports are maintained by a centralised administrative office with recognised oversight (e.g. an independent ombudsman).

Equally, government agencies must respond with adequate legislative and regulatory controls to ensure that there are consequences for the misuse of new technologies. It is not enough, for example, for a company like Google to come out and openly “bar” applications for its Glass product, such as biometric recognition and pornography, especially when it is very aware that these are two application areas for which the device will be exploited. Google is trying to maintain its brand by stating clearly that it is not affiliated with negative uses of its product, knowing full well that the proclamation is quite meaningless, and by no means legally binding. And herein lies one of the great quandaries: few would deny that Google’s search rank and page algorithms have made us beneficiaries of some extraordinary inventiveness.

According to a survey by CAST, one in five persons has reported wanting to see a Google Glass ban (Nolan, 2013). The marketing and design approach nowadays, broadly evident across the universal corporate spectrum, therefore seems to be:

We will develop products and make money from them, no matter how detrimental they may be to society. We will push the legislative/regulatory envelope as much as we can, until someone says: Stop. You’ve gone too far! The best we can do as a developer is place a warning on the packaging, just like on cigarette notices, and if people choose to do the wrong thing our liability as a company is removed completely because we have provided the prior warning and only see beneficial uses. If our product is used for bad then that is not our problem, the criminal justice system can deal with that occurrence, and if non-users of our technology are entangled in a given controversy, then our best advice to people is to realign the asymmetry by adopting our product.


This edited volume came together over a three year period. We formed our editorial board and sent out the call for book chapters soon after the IEEE conference we hosted at the University of Wollongong, the International Symposium on Technology and Society (ISTAS) on 7-10 June 2010, sponsored by IEEE’s Society on the Social Implications of Technology (SSIT). The symposium was dedicated to emerging technologies and there were a great many papers presented from a wide range of views on the debate over the microchipping of people. It was such a highlight to see this sober conversation happening between experts coming at the debate from different perspectives, different cultural contexts, and different lifeworlds. A great deal of the spirit from that conversation has taken root in this book. The audio-visual proceedings aired on the Australian Broadcasting Corporation’s much respected 7.30 Report and received wide coverage in major media outlets. The significance is not in the press coverage but in the fact that the topic is now relevant to the everyday person. Citizens will need to make personal decisions: do I receive an implant or not? Do I carry an identifier on the surface of my skin or not? Do I succumb to 24x7 monitoring by being fully “connected” to the grid or not?

Individuals who were present at ISTAS10 and were also key contributors to this volume include keynote speakers Professor Rafael Capurro, Professor Roger Clarke, Professor Kevin Warwick, Dr Katherine Albrecht, Dr Mark Gasson, Mr Amal Graafstra, and attendees Professor Marcus Wigan, Associate Professor Darren Palmer, Dr Ian Warren, Dr Mark Burdon, and Mr William A. Herbert. Each of these presenters has been an instrumental voice in the discussion on Embedded Surveillance Devices (ESDs) in living things (animals and humans), and on tracking and monitoring technologies. They have dedicated a portion of their professional lives to investigating the possibilities and the effects of a world filled with microchips, beyond those in desktop computers and high-tech gadgetry. They have also been able to connect the practice of an Internet of Things (IoT) not only in machine-to-machine but in nested forms of machine-to-people-to-machine interactions, and considered the implications. When one is surrounded by such passionate voices, it is difficult not to be inspired onward to such an extensive work.

A further backdrop to the book is the annual workshops we began in 2006 on the Social Implications of National Security, which have had ongoing sponsorship from the Australian Research Council’s Research Network for a Secure Australia (RNSA). Following ISTAS10, we held a workshop on the “Social Implications of Location-Based Services” at the University of Wollongong’s Innovation Campus and were fortunate to have Professor Rafael Capurro, Professor Andrew Goldsmith, Professor Peter Eklund, and Associate Professor Ulrike Gretzel present their work. Worthy of note, the workshop proceedings, which are available online, have been recognised as major milestones for the Research Network in official government documentation. For example, the Department of the Prime Minister and Cabinet (PM&C), among other high profile agencies in Australia and abroad, have requested copies of the works for their libraries.

In 2012, the topic of our annual RNSA workshop was “Sousveillance and the Social Implications of Point of View Technologies in Law Enforcement,” held at the University of Sydney. Professor Kevin Haggerty keynoted that event, speaking on a theme titled “Monitoring within and beyond the Police Organisation”; he also later graciously contributed the foreword to this book, as well as presenting on biomimetics at the University of Wollongong. The workshop again acted to bring exceptional voices together to discuss audio-visual body-worn recording technologies, including Professor Roger Clarke, Professor David Lyon, Associate Professor Nick O’Brien, Associate Professor Darren Palmer, Dr Saskia Hufnagel, Dr Jann Karp, Mr Richard Kay, Mr Mark Lyell, and Mr Alexander Hayes.

In 2013, the theme of the National Security workshop was “Unmanned Aerial Vehicles - Pros and Cons in Policing, Security & Everyday Life,” held at Ryerson University in Canada. This workshop had presentations from Professor Andrew Clement, Associate Professor Avner Levin, Mr Ian Hannah, and Mr Matthew Schroyer. It was the first time in eight years that the workshop was held outside Australian borders. While drones are not greatly discussed in this volume, they demonstrate one scenario for the fulfilment of uberveillance. Case in point: the drone killing machine signifies the importance of a remote-controlled macro-to-micro view. At first, something needs to be able to scan the skies to look down on the ground, and then, when the target has been identified and tracked, it can be extinguished with ease. One need only look at the Israel Defense Forces’ pinpoint strike on Ahmed Jabari, the head of the Hamas Military Wing, to note the intrinsic link between the macro and micro levels of detail (K. Michael, 2012). How much “easier” could this kind of strike have been if the GPS chipset in the mobile phone carried by an individual communicated with a chip implant embedded in the body? RFID can be a tracking mechanism, despite the claims of some researchers that it has a read range of only 10 cm. That may well be the case for your typical wall-mounted reader, but the mobile phone can act as a continuous reader if in range, as can a set of traffic lights, lampposts, or even Wi-Fi access nodes, depending on the on-board technology and the power of the reader equipment being used. A telltale example of the potential risks can be seen in the rollout of Real ID driver’s licenses in the USA since the enactment of the REAL ID Act of 2005.

In 2013, it was also special to meet some of our book contributors for the first time at ISTAS13, held at the University of Toronto on the theme of “Wearable Computers and Augmediated Reality in Everyday Life,” among them Professor Steve Mann, Associate Professor Christine Perakslis, and Dr Ellen McGee. As so often happens when a thematic interest area brings people together from multiple disciplines, an organic group of interdisciplinary voices has begun to form. The holistic nature of this group is especially stimulating in sharing its diverse perspectives. Building upon these initial conversations, and ensuring they continue as the social shaping of technology occurs in the real world, is paramount.

As we brought together this edited volume, we struck a very fruitful collaboration with Dr Jeremy Pitt, Reader at Imperial College London, contributing a large chapter to his disturbingly wonderful edited volume entitled This Pervasive Day: The Potential and Perils of Pervasive Computing (2012). Jeremy’s book is a considered forecast of the social impact of new technologies, inspired by Ira Levin’s This Perfect Day (1970). Worthy of particular note is our participation in the session entitled “Heaven and Hell: Visions for Pervasive Adaptation” at the European Future Technologies Conference and Exhibition (Paechter, 2011). What is important to draw out from this is that pervasive computing will indeed have a divisive impact on its users: for some it will offer incredible benefits, while to others it will be debilitating in its everyday effect. We hope, similarly, to have remained objective in this edited volume, offering viewpoints from diverse positions on the topic of humancentric RFID. This remained one of our principal aims and fundamental goals.

Questioning technology’s trajectory, especially when technology no longer has a medical corrective or prosthetic application but one that is based on entertainment and convenience services, is extremely important. What happens to us when we embed a device that we cannot remove of our own accord? Is this fundamentally different to wearing or lugging something around? Without a doubt, it is! And what of those technologies that are presently being developed in laboratories across the world for microscopic forms of ID, and pinhole video capture? What will be the impact of these on our society with respect to covert surveillance? Indeed, the line between overt and covert surveillance is blurring- it becomes indistinguishable when we are surrounded by surveillance and are inside the thick fog itself. Another thing that is completely misconstrued is the supposed logic of a trade-off between privacy and convenience. There is no trade-off. The two variables cannot be discussed on equal footing – you cannot give a little of your privacy away for convenience and hope to have it still intact thereafter. No amount of monetary or value-based recompense will correct this asymmetry. We would be hoodwinking ourselves if we were to suddenly be “bought out” by such a business model. There is no consolation for privacy loss. We cannot be made to feel better after giving away a part of ourselves. It is not like scraping one’s knee against the concrete with the expectation that the scab will heal after a few days. Privacy loss is to be perpetually bleeding, perpetually exposed.

Additionally, in the writing of this book we also managed a number of special issue journals in 2010 and 2011, all of which acted to inform the direction of the edited volume as a whole. These included special issues on “RFID – A Unique Radio Innovation for the 21st Century” in the Proceedings of the IEEE (together with Rajit Gadh, George Roussos, George Q. Huang, Shiv Prabhu, and Peter Chu); “The Social Implications of Emerging Technologies” in Case Studies in Information Technology with IGI (together with Dr Roba Abbas); “The Social and Behavioral Implications of Location-Based Services” in the Journal of Location Based Services with Routledge; and “Surveillance and Uberveillance” in IEEE Technology and Society Magazine. In 2013, Katina also guest edited a volume for IEEE Computer on “Big Data: Discovery, Productivity and Policy” with Keith W. Miller. If there are any doubts about the holistic work supporting uberveillance, we hope that these internationally recognised journals, amongst others, that have been associated with our guest editorship indicate the thoroughness and robustness of our approach, and the recognition that others have generously provided to us for the incremental work we have completed.

It should also not go without notice that since 2006 the term uberveillance has been internationally embedded into dozens of graduate and undergraduate technical and non-technical courses across the globe. From the University of New South Wales and Deakin University to the University of Salford, from the University of Malta right through to the University of Texas at El Paso and Western Illinois University, we are extremely encouraged by correspondence from academics and researchers noting the term’s insertion into course outlines, chosen textbooks, lecture schedules, major assessable items, recommended readings, and research training. These citations have acted to inform and to interrogate the subjects that connect us. That our research conclusions resonate with you, without necessarily implying that you have always agreed with us, is indeed substantial.


Uberveillance and the Social Implications of Microchip Implants: Emerging Technologies follows on from a 2009 IGI Premier Reference Source book titled Automatic Identification and Location-Based Services: From Bar Codes to Chip Implants. This volume consists of six sections and eighteen chapters, with seven exclusive addendum primary interviews and panels. The strength of the volume is in its 41 author contributions. Contributors come from diverse professional and research backgrounds in the field of emerging technologies, law and social policy, including information and communication sciences, administrative sciences and management, criminology, sociology, law and regulation, philosophy, ethics and policy, government, and political science, among others. Moreover, the book will provide insights and support to everyday citizens who may be questioning the trajectory of micro and miniature technologies or the potential for humans to be embedded with electromagnetic devices. Body-wearable technologies are also directly relevant, as they will act as complementary, if not supplementary, innovations to various forms of implants.

Section 1 is titled “The Veillances,” with a specific background context of uberveillance. This section inspects the antecedents of surveillance, Roger Clarke’s dataveillance thirty years on, Steve Mann’s sousveillance, and MG Michael’s uberveillance. These three neologisms are inspected under the umbrella of the “veillances” (from the French veiller, which stems from the Latin vigilare, meaning to “keep watch”) (Oxford Dictionary, 2012).

In 2009, Katina Michael and MG Michael presented a plenary paper titled “Teaching Ethics in Wearable Computing: The Social Implications of the New ‘Veillance’” (K. Michael & Michael, 2009d). It was the first time that surveillance, dataveillance, sousveillance, and uberveillance were considered together at a public gathering. As a specialist term, it should be noted that “veillance” was first used in an important blog post exploring equiveillance by Ian Kerr and Steve Mann (2006), in which “the valences of veillance” were briefly described. In contrast to Kerr and Mann (2006), Michael and Michael (2006) were pondering the intensification of a state of uberveillance through increasingly pervasive technologies that can provide details from the big-picture view right down to minuscule personal details.

Alexander Hayes (2010) pictorialized this representation using the triquetra, also known as the trinity knot and Celtic triangle (Figure 2), and describes its application to uberveillance in the educational context in chapter 3. Hayes uses mini cases to illustrate the importance of understanding the impact of body-worn video across sectors. He concludes by warning that commercial entities should not engage in “techno-evangelism” when selling to the education sector, but should rather maintain the purposeful intent of the use of point-of-view and body-worn video recorders within the specific educational context. Hayes also emphasises the urgent need for serious discussion on the socio-ethical implications of wearable computers.

Figure 2. 

Uberveillance triquetra (Hayes, 2010). See also Michael and Michael (2007).


By 2013, K. Michael had published proceedings from the International Symposium on Technology and Society (ISTAS13) using the veillance concept as a theme, with numerous papers submitted to the conference exploring veillance perspectives (Ali & Mann, 2013; Hayes, et al., 2013; K. Michael, 2013; Minsky, et al., 2012; Paterson, 2013). Two other crucial references to veillance include “in press” papers by Michael and Michael (2013) and Michael, Michael, and Perakslis (2014). But what does veillance mean? And how is it understood in different contexts? What does it mean to be watched by a CCTV camera, to have one’s personal details deeply scrutinized, to watch another, or to watch oneself?

Dataveillance (see Interview 1.1), conceived by Roger Clarke of the Australian National University (ANU) in 1988, “is the systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons” (Clarke, 1988a). According to the Oxford Dictionary, dataveillance is summarized as “the practice of monitoring the online activity of a person or group” (Oxford Dictionary, 2013). It is hard to believe that this term was introduced a quarter of a century ago, in response to government agency data-matching initiatives linking taxation records and social security benefits, among other commercial data mining practices. At the time it was a powerful statement in response to the Australia Card proposal in 1987 (Clarke, 1988b), which was never implemented by the Hawke Government, despite the Howard Government’s attempt to introduce an Access Card almost two decades later in 2005 (Australian Privacy Foundation, 2005). The same issues ensue today, only at a more momentous magnitude, with far more consequences and advanced capabilities in analytics, data storage, and converging systems.

Sousveillance (see chapter 2), conceived by Steve Mann of the University of Toronto in 2002 but practiced since at least 1995, is the “recording of an activity from the perspective of a participant in the activity” (Wordnik, 2013). However, its initial introduction into the literature came in the inaugural publication of the Surveillance & Society journal in 2003, with a meaning of “inverse surveillance” as a counter to organizational surveillance (Mann, Nolan, & Wellman, 2003). Mann prefers to interpret sousveillance as under-sight, which maintains integrity, contra to surveillance as over-sight, which equates to hypocrisy (Mann, 2004).

Whereas dataveillance is the systematic use of personal data systems in the monitoring of people, sousveillance is the inverse of monitoring people; it is the continuous capture of personal experience. For example, dataveillance might include the linking of someone’s tax file number with their bank account details and communications data. Sousveillance on the other hand, is a voluntary act of logging what one might see around them as they move through the world. Surveillance is thus considered watching from above, whereas sousveillance is considered watching from below. In contrast, dataveillance is the monitoring of a person’s online activities, which presents the individual with numerous social dangers (Clarke, 1988a).

Uberveillance (see chapter 1), conceived by MG Michael of the University of Wollongong (UOW) in 2006, is commonly defined as “ubiquitous or pervasive electronic surveillance that is not only ‘always on’ but ‘always with you,’ ultimately in the form of bodily invasive surveillance” (ALD, 2010). The term entered the Macquarie Dictionary of Australia officially in 2008 as “an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices in the human body” (Macquarie, 2009, p. 1094). The concern over uberveillance is directly related to the misinformation, misinterpretation, and information manipulation of citizens’ data. We can strive for omnipresence through real-time remote sharing and monitoring, but we will never achieve simple omniscience (Michael & Michael, 2009).

Uberveillance is a compound word, conjoining the German über meaning over or above with the French veillance. The concept is very much linked to Friedrich Nietzsche’s vision of the Übermensch, who is a man with powers beyond those of an ordinary human being, like a super-man with amplified abilities (Honderich, 1995; M. G. Michael & Michael, 2010b). Uberveillance is analogous to embedded devices that quantify the self and measure indiscriminately. For example, heart, pulse, and temperature sensor readings emanating from the body in binary bits wirelessly, or even through amplified eyes such as inserted contact lens “glass” that might provide visual display and access to the Internet or social networking applications.

Uberveillance brings together all forms of watching from above and from below, from machines that move to those that stand still, from animals and from people, acquired involuntarily or voluntarily using obtrusive or unobtrusive devices (Figure 3) (K. Michael, et al., 2010). The network infrastructure underlies the ability to collect data direct from the sensor devices worn by the individual, and big data analytics ensures an interpretation of the unique behavioral traits of the individual implying more than just predicted movement, but intent and thought (K. Michael & Miller, 2013).

Figure 3. From surveillance to uberveillance (K. Michael, et al., 2009b)

It has been said that uberveillance is that part of the veillance puzzle that brings together the sur, data, and sous to an intersecting point (Stephan, et al., 2012). In uberveillance, there is the “watching” from above component (sur), there is the “collecting” of personal data and public data for mining (data), and there is the watching from below (sous), which can draw together social networks and strangers, all coming together via wearable and implantable devices on/in the human body. Uberveillance can be used for good, but we contend that, independent of its application for non-medical purposes, it will always have an underlying control factor of power and authority (Masters & Michael, 2005; Gagnon, et al., 2013).

Section 2 is dedicated to applications of humancentric implantables in both the medical and non-medical space. Chapter 4 is written by Kevin Warwick, professor of cybernetics at the University of Reading, and his senior research fellow, Dr Mark Gasson. In 1998, Warwick was responsible for Cyborg 1.0, and later Cyborg 2.0 in 2002. In chapter 4, Warwick and Gasson describe implants, tracking and monitoring functionality, Deep Brain Stimulation (DBS), and magnetic implants. They are pioneers in the implantables arena, but after initially investigating ID and location interactivity in a closed campus environment using humancentric RFID approaches, Warwick has begun to focus his efforts on medical solutions that can aid the disabled, teaming up with Professor Tipu Aziz, a neurosurgeon from the University of Oxford. He has also explored person-to-person interfaces using implantable devices for bi-directional functionality.

Following on from the Warwick and Gasson chapter are two interviews and a modified presentation transcript demonstrating three different kinds of RFID implant applications. Interview 4.1 is with Mr Serafin Vilaplana, the former IT Manager at the Baja Beach Club who implemented the RFID implants for club patronage in Barcelona, Spain. The RFID implants were used to attract VIP patrons, perform basic access control, and make electronic payments. Katina Michael had the opportunity to interview Serafin after being invited to attend a Women in Engineering (WIE) Conference in Spain in mid-2009 organised by the Georgia Institute of Technology. It was on this same journey that Katina Michael also met Mark Gasson for the very first time, during a one-day conference at the London School of Economics, and they discussed a variety of incremental innovations in RFID.

In late May 2009, Mr Gary Retherford, a Six Sigma black belt specialising in Security, contacted Katina to be formally interviewed after coming across the Michaels’ work on the Internet. Retherford was responsible for instituting the employee access control program using the VeriChip implantable device in 2006. Interview 4.2 presents a candid discussion between Retherford and K. Michael on the risk versus reward debate with respect to RFID implantables. While Retherford can see the potential for ID tokens being embedded in the body, Michael raises some very important matters with respect to security questions inherent in RFID. Plainly, Michael argues that if we invite technology into the body, then we are inviting a whole host of computer “connectedness” issues (e.g. viruses, denial-of-service-attacks, server outages, susceptibility to hacking) into the human body as well. Retherford believes that these are matters that can be overcome with the right technology, and predicts a time that RFID implant maintenance may well be as straightforward as visiting a Local Service Provider (LSP).

Presentation 4.3 was delivered at IEEE ISTAS10 by Mr Amal Graafstra and can be found on the Internet. This chapter presents the Do-It-Yourselfer perspective, as opposed to getting an implant that someone else uses in their operations or commercial applications. Quite possibly, the DIY culture may have an even greater influence on the diffusion of RFID implantables than the commercial arena. DIYers are usually circumspect of commercial RFID implant offerings which they cannot customise, or for which they need an implant injected into a pre-defined bodily space which they cannot physically control. Graafstra’s published interview in 2009, as well as his full-length paper on the RFID subculture with K. Michael and M.G. Michael (2010), still stand as the most informative dialogue on the motivations of DIYers. In 2012, Graafstra began his own company touting the benefits of RFID implantables within the DIY/hacking community. Notably, a footer disclaimer statement reads: “Certain things sold at the Dangerous Things Web shop are dangerous. You are purchasing, receiving, and using the items you acquired here at your own peril. You're a big boy/girl now, you can make your own decisions about how you want to use the items you purchase. If this makes you uncomfortable, or you are unable to take personal responsibility for your actions, don't order!”

Chapter 5 closes section 2, and is written by Maria Burke and Chris Speed on applications of technology with an emphasis on memory, knowledge browsing, knowledge recovery, and knowledge sharing. This chapter reports on outcomes from the Tales of Things Electronic Memory (TOTeM) large-grant research in the United Kingdom. Burke and Speed take a fresh perspective on how technology is influencing societal and organisational change by focusing on Knowledge Management (KM). The chapter does not explicitly address RFID; rather, it explores technologies already widely diffused under the broad category of tagging systems, such as quick response (QR) codes, essentially 2D barcodes. The authors also acknowledge that tagging systems rely on underlying infrastructure, such as wireless networks and the Internet more broadly, through devices we carry such as smartphones. In the context of this book, one might also read this chapter with a view to how memory aids might be used to support an ageing population, or those suffering from Alzheimer’s disease, for example.

Section 3 is about the adoption of RFID tags and transponders by various demographics. Christine Perakslis examines the willingness to adopt RFID implants in chapter 6. She looks specifically at how personality factors play a role in the acceptance of uberveillance. She reports on a preliminary study, as well as comparing outcomes from two separate studies in 2005 and 2010. In her important findings, she discusses RFID implants as lifesaving devices, their use for trackability in case of an emergency, their potential to increase safety and security, and to speed up airport checkpoints. Yet the purpose of the Perakslis study is not to identify implantable applications as such but to investigate differences between and among personality dimensions and levels of willingness toward implanting an RFID chip in the human body. Specifically, Perakslis examines the levels of willingness toward the uberveillance trajectory using the Myers Briggs Type Indicator (MBTI).

In Interview 6.1, Katina Michael converses with a 16-year-old male from Campbelltown, NSW, about tattoos, implants, and amplification. The interview is telling with respect to the prevalence of the “coolness” factor and group dynamics in youth. Though tattoos have traditionally been used to identify with an affinity group, we learn that implants would only resonate with youth if they were functional in an advanced manner, beyond mere identification purposes. This interview demonstrates the intrinsic connection between technology and the youth sub-culture, which will more than likely be among the early adopters of implantable devices, yet at the same time remain highly susceptible to peer-group pressure and brand-driven advertising.

In chapter 7, Randy Basham considers the potential for RFID chip technology use in the elderly for surveillance purposes. The chapter not only focuses on adoption of the technology but emphasises the value conflicts that RFID poses to the elderly demographic. Among these conflicts are resistance to change, technophobia, matters of informed consent, the risk of physical harm, Western religious opposition, concerns over privacy and GPS tracking, and transhumanism. Basham, who sits on the Human Services Information Technology Applications (HUSITA) board of directors, provides major insights into resistance to change with respect to humancentric RFID. It is valuable to read Basham’s article alongside the earlier interview transcript of Gary Retherford, to consider how new technologies like RFID implantables may be diffused widely into society. Minors and the elderly are particularly dependent demographics in this space and require special attention. It is pertinent to note that protests by CASPIAN, led by Katherine Albrecht, blocked the chipping of elderly patients suffering with Alzheimer’s disease in 2007 (Lewan, 2007; ABC, 2007). If one contemplates the trajectory for technology crossover in the surveillance atmosphere, one might envisage an implantable solution with a Unique Lifetime Identifier (ULI) which follows people from cradle to grave and becomes the fundamental componentry that powers human interactions.

Section 4 draws on laws, directives, regulations and standards with respect to challenges arising from the practice of uberveillance. Chapter 8 investigates how the collection of DNA profiles and samples in the United Kingdom is fast becoming uncontrolled. The National DNA Database (NDNAD) of the UK has more than 8% of the population registered with much higher proportions for minority groups, such as the Black Ethnic Minority (BEM). Author Katina Michael argues that such practices drive further adoption of what one could term, national security technologies. However, developments and innovations in this space are fraught with ethical challenges. The risks associated with familial searching as overlaid with medical research, further compounds the possibility that people may carry a microchip implant with some form of DNA identifier as linked to a Personal Health Record (PHR). This is particularly pertinent when considering the European Union (EU) decision to step up cross-border police and judicial cooperation in EU countries in criminal matters, allowing for the exchange of DNA profiles between the authorities responsible for the prevention and investigation of criminal offences (see Prüm Treaty).

Chapter 9 presents outcomes from a large Australian Research Council-funded project on the night-time economy in Australia. In this chapter, ID scanners and uberveillance are considered in light of trade-offs between privacy and crime prevention. Does instituting ID scanners prevent or minimise crime in particular hot spots, or do they simply cause a chilling effect and trigger the redistribution of crime to new areas? Darren Palmer and his co-authors demonstrate how ID scanners are becoming a normalized precondition of entry into one Australian night-time economy. They demonstrate that the implications of technological determinism amongst policy makers, police and crime prevention theories need to be critically assessed, and that the value of ID scanners needs to be reconsidered in context. In chapter 10, Jann Karp writes on global tracking systems in Australian interstate trucking. She investigates driver perspectives and attitudes on the modern practice of fleet management, and on the practice of tracking vehicles and what that means to truck drivers. Whereas chapter 9 investigates the impact of emerging technology on consumers, chapter 10 gives an employee perspective. While Palmer et al. question the effectiveness of ID scanners in pubs and clubs, Karp poses the challenging question: is locational surveillance of drivers in the trucking industry a help or a hindrance?

Chapter 11, written by Mark Burdon et al., covers legislative developments in tracking, in relation to “Do Not Track” initiatives. The chapter focuses on online behavioral profiling, in contrast to chapter 8, which focuses on DNA profiling and sampling. US legislative developments are compared with those in the European Union, New Zealand, Canada and Australia. Burdon et al. provide an excellent analysis of the problems. Recommendations for ways forward are presented in a bid for members of our communities to be able to provide meaningful and educated consent, but also for the appropriate regulation of transborder information flows. This is a substantial piece of work, and one of the most informative chapters on Do Not Track initiatives available in the literature.

Chapter 12, by Kyle Powys Whyte and his nine co-authors from Michigan State University, completes section 4 with a paper on emerging standards in the livestock industry. The chapter looks at the benefits of nanobiosensors in livestock traceability systems but does not neglect to raise the social and ethical dimensions related to standardising this industry. Whyte et al. argue that future development of nanobiosensors should include processes that engage diverse actors in ways that elicit productive dialogue on the social and ethical contexts. A number of practical recommendations are presented at the conclusion of the chapter, such as the role of “anticipatory governance” as linked to Science and Technology Studies (STS). One need only consider the findings of this priming chapter, and how these results may be applied in light of the relationship between non-humancentric RFID and humancentric RFID chipping. Indeed, the opening sentence of the chapter points to the potential: “uberveillance of humans will emerge through embedding chips within nonhumans in order to monitor humans.”

Section 5 contains the critical chapter dedicated to the health implications of microchipping living things. In chapter 13, Katherine Albrecht (2010) uncovers significant problems related to microchip-induced cancer in mice and rats. A meta-analysis of eleven clinical studies published in oncology and toxicology journals between 1996 and 2006 is examined in detail in this chapter. Albrecht goes beyond the prospective social implications of microchipping humans when she presents the physical adverse reactions to implants in animals. Albrecht concludes her chapter with solid recommendations for policy-makers, veterinarians, pet owners, and oncology researchers, among others. When the original report was first launched, Todd Lewan (2007) of the Associated Press had an article published in the Washington Post titled “Chip Implants Linked to Animal Tumors.” Albrecht is to be commended for this pioneering study, choosing to focus on health-related matters which will increasingly become relevant in the adoption of invasive and pervasive technologies.

The sixth and final section addresses the emerging socio-ethical implications of RFID tags and transponders in humans. Chapter 14 addresses some of the underlying philosophical aspects of privacy within pervasive surveillance. Alan Rubel chooses to investigate the commercial arena, penal supervision, and child surveillance in this book chapter. He asks: what is the potential for privacy loss? The intriguing and difficult question that Rubel attempts to answer is whether privacy losses (and gains) are morally salient. Rubel posits that determining whether privacy loss is morally weighty, or of sufficient moral weight to give rise to a right to privacy, requires an examination of reasons why privacy might be valuable. He describes both instrumental value and intrinsic value and presents a brief discussion on surveillance and privacy value.

Panel 14.1 is a slightly modified transcription of the debate over microchipping people recorded at IEEE ISTAS10. This distinguished panel is chaired by lawyer William Herbert. Panel members included Rafael Capurro, who was a member of the European Group on Ethics in Science and New Technologies (EGE), and who co-authored the landmark opinion published in 2005, “On the ethical aspects of ICT implants in the human body.” Capurro, who is the director of the International Center for Information Ethics, was able to provide a highly specialist ethical contribution to the panel. Mark Gasson and Amal Graafstra, both of whom are RFID implantees, introduced their respective expert testimonies. Chair of the Australian Privacy Foundation Roger Clarke and CASPIAN director Katherine Albrecht represented the privacy and civil liberties positions in the debate. The transcript demonstrates the complexity and multi-layered dimensions surrounding humancentric RFID, and the divisive nature of the issue at hand: whether or not to microchip people.

In chapter 15 we are introduced to the development of brain computer interfaces, brain machine interfaces and neuromotor prostheses. Here Ellen McGee examines sophisticated technologies that are used for more than just identification purposes. She writes of brain implants that are surgically implanted and affixed, as opposed to simple implantable devices that are injected in the arm with a small injector kit. These advanced technologies will allow for radical enhancement and augmentation. It is clear from McGee’s fascinating work that these kinds of leaps in human function and capability will cause major ethical, safety, and justice dilemmas. McGee clearly articulates the need for discourse and regulation in the broad field of neuroprosthetics. She especially emphasises the importance of privacy and autonomy. McGee concludes that there is an urgent need for debate on these issues, and questions whether or not it is wise to pursue such irreversible developments.

Ronnie Lipschutz and Rebecca Hester complement the work of McGee, going beyond these possibilities to assume that the human will assimilate into the cellular society. They proclaim, “We are the Borg!” and in doing so point to a future scenario where not only bodies are read, but minds as well. They describe “re(b)organization” as a new phenomenon occurring in our society today. Chapter 16 is strikingly challenging for this reason, and makes one speculate about what or who the driving forces behind this cyborgization process are. This chapter will also prove of special interest to those who are conversant with Cartesian theory. Lipschutz and Hester conclude by outlining the very real need for a legal framework to deal with hackers who penetrate biodata systems and alter individuals’ minds and bodies, or who may even kill a person by tampering with or reprogramming their medical device remotely.

Interview 16.1 directly alludes to this cellular society. Videographer Jordan Brown interviews Katina Michael on the notion of the “screen bubble.” What is the screen culture doing to us? Rather than looking up as we walk around, we divert our attention to the screen in the form of a smart phone, iPad, or even a digital wearable glass device. We look down increasingly, and not at each other. We peer into lifeless windows of data, rather than peer into one another’s eyes. What could this mean and what are some of the social implications of this altering of our natural gaze? The discussion between Brown and K. Michael is applicable to not just the implantables space, but to the wearables phenomenon as well.

The question of faith in a data-driven and information-saturated society is adeptly addressed by Marcus Wigan in the Epilogue. Wigan calls for a new moral imperative. He asks the very important question in context: “who are the vulnerable now?” What is the role of information ethics, and where should targeted efforts be made to address these overarching issues which affect all members of society: from children to the elderly, from the employed to the unemployed, from those in positions of power to the powerless? It is the emblematic conclusion to a book on uberveillance.


ABC. (2007). Alzheimer's patients lining up for microchip. ABCNews. Retrieved from

Albrecht, K. (2010). Microchip-induced tumors in laboratory rodents and dogs: A review of the literature 1990–2006. In Proceedings of IEEE International Symposium on Technology and Society (ISTAS10). Wollongong, Australia: IEEE.

Ali, A., & Mann, S. (2013). The inevitability of the transition from a surveillance-society to a veillance-society: Moral and economic grounding for sousveillance. In Proceedings of IEEE International Symposium on Technology and Society (ISTAS13). Toronto, Canada: IEEE.

Australian Privacy Foundation. (2005). Human services card. Australian Privacy Foundation. Retrieved 6 June 2013, from

Clarke, R. (1988a). Information technology and dataveillance. Communications of the ACM, 31(5), 498–512. doi:10.1145/42411.42413

Clarke, R. (1988b). Just another piece of plastic in your wallet: The ‘Australian card’ scheme. ACM SIGCAS Computers and Society, 18(1), 7–21. doi:10.1145/47649.47650

Gagnon, M., Jacob, J. D., & Guta, A. (2013). Treatment adherence redefined: A critical analysis of technotherapeutics. Nursing Inquiry, 20(1), 60–70. doi:10.1111/j.1440-1800.2012.00595.x

Graafstra, A. (2009). Interview 14.2: The RFID do-it-yourselfer. In K. Michael & M. G. Michael (Eds.), Innovative automatic identification and location based services: From bar codes to chip implants (pp. 427–449). Hershey, PA: IGI Global.

Graafstra, A., Michael, K., & Michael, M. G. (2010). Social-technical issues facing the humancentric RFID implantee sub-culture through the eyes of Amal Graafstra. In Proceedings of IEEE International Symposium on Technology and Society (ISTAS10). Wollongong, Australia: IEEE.

Hayes, A. (2010). Uberveillance (triquetra). Retrieved 6 May 2013, from

Hayes, A., Mann, S., Aryani, A., Sabbine, S., Blackall, L., Waugh, P., & Ridgway, S. (2013). Identity awareness of research data in veillance and social computing. In Proceedings of IEEE International Symposium on Technology and Society (ISTAS13). Toronto, Canada: IEEE.

Kerr, I., & Mann, S. (n.d.). Exploring equiveillance. ID TRAIL MIX. Retrieved 26 September 2013 from

Levin I. (1970). This perfect day: A novel. New York: Pegasus.

Lewan, T. (2007, September 8). Chip implants linked to animal tumors. Washington Post. Retrieved from

Macquarie. (2009). Uberveillance. In S. Butler (Ed.), Macquarie dictionary (5th ed.). Sydney, Australia: Sydney University.

Mann, S. (2004). Sousveillance: Inverse surveillance in multimedia imaging. In Proceedings of the 12th Annual ACM International Conference on Multimedia. New York, NY: ACM.

Mann, S., Nolan, J., & Wellman, B. (2003). Sousveillance: Inventing and using wearable computing devices for data collection in surveillance environments. Surveillance & Society, 1(3), 331–355.

Masters, A., & Michael, K. (2005). Humancentric applications of RFID implants: The usability contexts of control, convenience and care. In Proceedings of the Second IEEE International Workshop on Mobile Commerce and Services. Munich, Germany: IEEE Computer Society.

Michael, K. (2003). The automatic identification trajectory. In E. Lawrence, J. Lawrence, S. Newton, S. Dann, B. Corbitt, & T. Thanasankit (Eds.), Internet commerce: Digital models for business. Sydney, Australia: John Wiley & Sons.

Michael, K. (2012). Israel, Palestine and the benefits of waging war through Twitter. The Conversation. Retrieved 22 November 2012, from

Michael K. (2013a). High-tech lust.IEEE Technology and Society Magazine, 32(2), 4–5. 10.1109/MTS.2013.2259652

Michael, K. (Ed.). (2013b). Social implications of wearable computing and augmediated reality in every day life. In Proceedings of IEEE Symposium on Technology and Society. Toronto, Canada: IEEE.

Michael, K., McNamee, A., & Michael, M. G. (2006). The emerging ethics of humancentric GPS tracking and monitoring. In Proceedings of International Conference on Mobile Business. Copenhagen, Denmark: IEEE Computer Society.

Michael K. Michael M. G. (Eds.). (2007). From dataveillance to überveillance and the realpolitik of the transparent society. Wollongong, Australia: Academic Press.

Michael K. Michael M. G. (2009a). Innovative automatic identification and location-based services: From bar codes to chip implants. Hershey, PA: IGI Global. 10.4018/978-1-59904-795-9

Michael, K., & Michael, M. G. (2009c). Predicting the socioethical implications of implanting people with microchips. PerAda Magazine. Retrieved from

Michael, K., & Michael, M. G. (2009d). Teaching ethics in wearable computing: The social implications of the new ‘veillance’. EduPOV.Retrieved June 18, from

Michael K. Michael M. G. (2010). Implementing namebers using implantable technologies: The future prospects of person ID. In PittJ. (Ed.), This pervasive day: The potential and perils of pervasive computing (pp. 163–206). London: Imperial College London.

Michael K. Michael M. G. (2011). The social and behavioral implications of location-based services.Journal of Location-Based Services, 5(3-4), 121–137. 10.1080/17489725.2011.642820

Michael K. Michael M. G. (2013). No limits to watching?Communications of the ACM, 56(11), 26-28.10.1145/2527187

Michael K. Michael M. G. Abbas R. (2009b). From surveillance to uberveillance (Australian Research Council Discovery Grant Application). Wollongong, Australia: University of Wollongong.

Michael, K., Michael, M. G., & Ip, R. (2008). Microchip implants for humans as unique identifiers: A case study on VeriChip. In Proceedings of Conference on Ethics, Technology, and Identity. Delft, The Netherlands: Delft University of Technology.

Michael K. Michael M. G. Perakslis C. (2014). Be vigilant: There are limits to veillance. In PittJ. (Ed.), The computer after me. London: Imperial College Press.

Michael K. Miller K. W. (2013). Big data: New opportunities and new challenges.IEEE Computer, 46(6), 22–24. 10.1109/MC.2013.196

Michael K. Roussos G. Huang G. Q. Gadh R. Chattopadhyay A. Prabhu S. (2010). Planetary-scale RFID Services in an age of uberveillance.Proceedings of the IEEE, 98(9), 1663–1671. 10.1109/JPROC.2010.2050850

Michael M. G. (2000). For it is the number of a man.Bulletin of Biblical Studies, 19, 79–89.

Michael M. G. Michael K. (2009). Uberveillance: Microchipping people and the assault on privacy.Quadrant, 53(3), 85–89.

Michael M. G. Michael K. (2010). Towards a state of uberveillance.IEEE Technology and Society Magazine, 29(2), 9–16. 10.1109/MTS.2010.937024

Minsky, M. (2013). The society of intelligent veillance. In Proceedings of IEEE International Symposium on Technology and Society (ISTAS13). Toronto, Canada: IEEE.

Nolan, D. (2013, June 7). The human cloud. Monolith. Retrieved from

Oxford Dictionary. (2012). Dataveillance. Retrieved 6 May 2013, from

Paechter B. Pitt J. Serbedzijac N. Michael K. Willies J. Helgason I. (2011). Heaven and hell: Visions for pervasive adaptation. In Fet11 essence. Budapest, Hungary: Elsevier. 10.1016/j.procs.2011.12.025

Paterson, N. (2013). Veillances: Protocols & network surveillance. In Proceedings of IEEE International Symposium on Technology and Society(ISTAS13). Toronto, Canada: IEEE.

Pitt J. (Ed.). (2012). This pervasive day: The potential and perils of pervasive computing. London: Imperial College London.

Pitt J. (2014). The computer after me. London: Imperial College Press.

Reynolds, M. (2004). Despite the hype, microchip implants won't deliver security. Gartner. Retrieved 6 May 2013, from

Rodotà, S., & Capurro, R. (2005). Ethical aspects of ICT implants in the human body. Opinion of the European Group on Ethics in Science and New Technologies to the European Commission, 20.

Shih, T. K. (2013). Video forgery and motion editing. In Proceedings of International Conference on Advances in ICT for Emerging Regions. ICT.

Stephan K. D. Michael K. Michael M. G. Jacob L. Anesta E. (2012). Social implications of technology: Past, present, and future.Proceedings of the IEEE, 100(13), 1752–1781. 10.1109/JPROC.2012.2189919

(1995). Superman. InHonderichT. (Ed.), Oxford companion to philosophy. Oxford, UK: Oxford University Press.

(2010). Uberveillance. InALD (Ed.), Australian law dictionary. Oxford, UK: Oxford University Press.

Warwick K. (2002). I, cyborg. London: Century.

Wordnik. (2013). Sousveillance. Retrieved 6 June 2013, from

Perceived barriers for implanting microchips in humans


This quantitative, descriptive study investigated whether there was a relationship between the countries of residence of small business owners (N = 453) in four countries (Australia, India, the UK, and the USA) and perceived barriers to RFID (radio frequency identification) transponders being implanted into humans for employee ID. Participants were asked what they believed were the greatest barriers to instituting chip implants for access control in organizations, and had six options from which to select. Significant chi-square analyses were reported relative to respondents' countries for: 1) the perceived barrier of technological issues (χ² = 11.86, df = 3, p = .008); 2) the perceived barrier of philosophical issues (right of control over one's body) (χ² = 31.21, df = 3, p < .001); and 3) the perceived barrier of health issues (unknown risks related to implants) (χ² = 10.88, df = 3, p = .012). No significant chi-square analyses were reported with respect to countries of residence and: 1) religious issues (mark of the beast); 2) social issues (digital divide); or 3) cultural issues (incisions into the skin are taboo). Thus, the researchers concluded that there were relationships between respondents' countries and the perception of certain barriers to instituting microchips in humans.

SECTION I. Introduction

The purpose of this study was to investigate whether there were relationships between the countries of residence (Australia, India, the UK, and the USA) of small business owners and perceived barriers to instituting RFID (radio frequency identification) transponders implanted into the human body for identification and access control purposes in organizations [1]. Participants were asked what they believed were the greatest barriers to instituting chip implants for access control in organizations [2]. Participants had six options from which to select all that apply, as well as an option to specify other barriers [3]. The options for perceived barriers included:

  • technological issues: RFID is inherently an insecure technology
  • social issues: a digital divide will open between employees with implants for identification and those with legacy electronic identification
  • cultural issues: incisions into the skin are taboo
  • religious issues: mark of the beast
  • philosophical issues: right of control over one's body
  • health issues: there are unknown risks related to implants that remain in the body over the long term
  • other issues.

There were significant chi-square analyses reported relative to respondents' countries and: 1) the perceived barrier of technological issues; 2) the perceived barrier of philosophical issues (right of control over one's body); and 3) the perceived barrier of health issues (unknown risks related to implants). There were no significant chi-square analyses reported with respect to countries and religious issues (mark of the beast), social issues (digital divide), and cultural issues (incisions into the skin are taboo).

RFID implants are capable of omnipresent electronic surveillance. RFID tags or transponders can be implanted into the human body to track the who, what, where, when, and how of human life [4]. This act of embedding devices into human beings for surveillance purposes is known as uberveillance [5]. While the tiny embedded RFID chips do not have global positioning capabilities, an RFID reader (fixed or mobile) can capture time stamps and exit and entry sequences to denote when someone is coming or going and which direction they are travelling in, and can then support inferences on time, location, distance, and speed.
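The kind of inference described above can be sketched in a few lines of Python. The reader positions, field names, and the single-corridor model below are hypothetical illustrations, not part of any deployed system:

```python
from dataclasses import dataclass

@dataclass
class ReadEvent:
    """One detection of an implanted tag by a fixed RFID reader."""
    reader_id: str
    position_m: float   # hypothetical reader position along a corridor, in metres
    timestamp_s: float  # time of the read, in seconds

def infer_movement(first: ReadEvent, second: ReadEvent) -> tuple[str, float]:
    """Infer direction of travel and average speed between two reads
    of the same tag at two fixed readers."""
    dt = second.timestamp_s - first.timestamp_s
    if dt <= 0:
        raise ValueError("events must be in chronological order")
    dx = second.position_m - first.position_m
    direction = "entering" if dx > 0 else "exiting"
    return direction, abs(dx) / dt  # average speed in metres per second
```

For example, a tag read at a doorway reader and again eight seconds later at a reader ten metres inside the building implies the wearer is entering at an average of 1.25 m/s; no GPS capability in the tag itself is required.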

In this paper, the authors present a brief review of the literature, key findings from the study, and a discussion on possible implications of the findings. Professionals working in the field of emerging technologies could use these findings to better understand how countries of residence may affect perceptions of barriers in instituting chip implants in humans.

SECTION II. Review of Literature

A. Implants and Social Acceptance

In 2004, the FDA (Food and Drug Administration) of the United States approved an implantable chip for use in humans in the U.S. [6]. The implanted chip was, and is, being marketed by a variety of commercial enterprises as a potential method to detect and treat diseases, as well as a potential lifesaving device. If a person were brought to an emergency room unconscious, a scanner in the hospital doorway could read the person's unique ID on the implanted chip. The ID would then be used to unlock the personal health records (PHR) of the patient from a database [7]. Authorized health professionals would then have access to all pertinent medical information of that individual (i.e. medical history, previous surgeries, allergies, heart condition, blood type, diabetes) to care for the patient aptly. Additionally, the chip is being touted as a solution to kidnappings in Mexico (e.g. by the Xega Company), among many other uses [8].

B. Schools: RFID Tracking

A rural elementary school in California planned to implement RFID-tagged ID cards for schoolchildren; however, the American Civil Liberties Union (ACLU) successfully fought to revoke the program. The ACLU articulated veritable risks, including identity theft, or kidnapping if the system were hacked and a perpetrator gained access to the locations of schoolchildren.

However, with school districts looking to offset cuts in state funding, which are partly based on attendance figures, RFID technology provides a method to count students more accurately. In addition to increased revenues, administrators face the reality of increasing security issues; thus, more school districts are adopting RFID to track students in order to improve safety. For many years in Tokyo, students have worn mandatory RFID bracelets; they are tracked not only in the school, but also to and from school [9] [10]. In other examples, bags are fitted with GPS units.

In 2012, the Northside Independent School District in San Antonio, Texas began a pilot program to track 6.2% of its 100,000 students through RFID-tagged ID cards. Northside was not the first district in Texas; two other school districts in Houston successfully use the technology, with reported gains of hundreds of thousands of dollars in revenue due to improved attendance. The school board unanimously approved the program, but only after first debating privacy issues. Chip readers on campuses and on school buses detect a student's location, and authorized administrators have access to the information. The pilot program cost $525,000 to launch, and the district expected to gain approximately $1.7 million in the first year through higher attendance figures, as well as Medicaid reimbursements for the busing of special education students. However, students could forget or lose the cards, which would negatively affect the system [3]. One of Northside's sophomore students, Andrea Hernandez, refused to wear the RFID tag around her neck for religious reasons. Initially, the school expelled her, but when the case went to court she was reinstated, a judge ruling that her constitutional rights had been violated [11].

C. Medical Devices: RFID Implants

Recent technological developments are reaching new levels with the integration of silicon and biology; implanted devices can now interact directly with the brain [12]. Implantable devices for medical purposes are often highly beneficial to restore functions that were lost. Such current medical implants include cardiovascular pacers, cochlear and brainstem implants for patients with hearing disorders, implantable drug delivery pumps, implantable neurostimulation devices for such patients as those with urinary incontinence, chronic pain, or epilepsy, deep brain stimulation for patients with Parkinson's, and artificial chip-controlled legs [13].

D. RFID in India

Although India has been identified as a significant prospective market for RFID due to issues with the supply chain and a need for transparency, some contend that the slow adoption of RFID solutions can be traced to unskilled RFID solution providers. Inexperienced systems integrators and vendors are believed to account for failed trials, leaving companies disillusioned with the technology; such companies subsequently abandon the solutions and disparage the technology loudly and publicly. A secondary technological threat to RFID adoption is believed to be related to price competitiveness in India. In such a price-sensitive environment, RFID players are known to quote the lowest costs per tag, thereby using inferior hardware. Thus, customers perceive RFID to be inconsistent and unreliable for use in the business setting [14]. The compulsory biometrics rollout instituted by the Unique Identification Authority of India (UIDAI) stands in direct contrast to the experience of RFID (Fig. 1).

Fig. 1. Taking fingerprints for Aadhaar, a 12-digit unique number issued to all residents in India. The number is stored in a centralized database and linked to basic demographic and biometric information. The system institutes multimodal biometrics. Creative Commons: fotokannan.


E. RFID in Libraries

In 2010, researchers reported that many corporate libraries had begun deploying RFID. RFID tags are placed into books and other media and used in libraries for such purposes as automating stock verification, locating misplaced items, checking patrons in and out without human interaction, and detecting theft. In India, several deployment and implementation issues were identified: consumer privacy issues/ethical concerns, costs, a lack of standards and regulations in India (e.g. data ownership, data collection limitations), user confusion (e.g. lack of training and experience with the technology), and the immaturity of the technology (e.g. lack of accuracy, scalability, etc.) [15].

F. RFID and OEMS/Auto Component Manufacturers

In India, suppliers are not forced to conform to stringent regulations like those that exist in other countries. For example, the TREAD Act in the U.S. provided the impetus for OEMs to invest in track-and-trace solutions; failure to comply with the regulations can carry a maximum fine of $15 million and a criminal penalty of up to 15 years. Indian suppliers are not only free from such compliance regulations, but also cost conscious, with low volumes of high-value cars. It is believed that the cost of RFID solutions is not yet justified in the Indian market [16].

G. Correctional Facilities: RFID Tracking

A researcher studied a correctional facility in Cleveland, Ohio to evaluate the impact of RFID technology in deterring such misconduct as sexual assaults. The technology was considered because of its value in confirming inmate counts and perimeter controls. In addition, corrections officers can utilize such technology to check inmate locations against predetermined schedules, to detect if rival gang members are in close proximity, to classify and track the proximity of former intimate partners, to single out inmates with food allergies or health issues, and even to identify inmates who may attempt to move through the cafeteria line twice [17].

The results of the study indicated that RFID did not deter inmate misconduct, although the researchers articulated many issues that affected the results. Significant technological challenges abounded as RFID tracking was implemented at the correctional facility, including system inoperability, signal interference (e.g. “blind spots” where bracelets could not be detected), and transmission problems [18] [17].

H. Social Concerns

Social concerns plague epidermal electronics for nonmedical purposes [19]. In the United States, many states have crafted legislation to balance the potential benefits of RFID technology with the disadvantages associated with privacy and security concerns [20]. California, Georgia, Missouri, North Dakota, and Wisconsin are among states in the U.S. which have passed legislation to prohibit forced implantation of RFID in humans [21]. The “Microchip Consent Act of 2010”, which became effective on July 1, 2010 in the state of Georgia, not only stated that no person shall be required to be implanted with a microchip (regardless of a state of emergency), but also that voluntary implantation of any microchip may only be performed by a physician under the authority of the Georgia Composite Medical Board.

Through the work of Rodotà and Capurro in 2005, the European Group on Ethics in Science and New Technologies to the European Commission examined the ethical questions arising from science and new technologies. The role of the opinion was to raise awareness of the dilemmas created by both medical and non-medical implants in humans, which affect the intimate relation between bodily and psychic functions that is basic to our personal identity [22]. The opinion stated that information and communications technology implants should not be used to manipulate mental functions or to change a personal identity. Additionally, the opinion stated that principles of data protection must be applied to protect personal data embedded in implants [23]. The implants were identified in the opinion as a threat to human dignity when used for surveillance purposes, although the opinion stated that this might be justifiable for security and/or safety reasons [24].

I. Increased Levels of Willingness to Adopt: 2005–2010

Researchers continue to investigate social acceptance of the implantation of this technology into human bodies. In 2006, researchers reported that college students expressed higher levels of acceptance of the implantation of a chip within their bodies when they perceived benefits from the technology [25]. Utilizing the same questions posed in 2005 to college students attending both private and public institutions of higher education, the researchers investigated levels of willingness again in 2010 to understand whether there had been shifts in college students' willingness to implant RFID chips for various reasons [25] [26]. In both studies, students were asked: “How willing would you be to implant an RFID chip in your body as a method (to reduce identity theft, as a potential lifesaving device, to increase national security)?” A 5-point Likert-type scale was utilized, varying from “Strongly Unwilling” to “Strongly Willing”. Comparison of the 2005 results with the 2010 results revealed a clear shift: levels of willingness moved from unwillingness toward either neutrality or willingness to implant a chip in the human body to reduce identity theft, as a potential lifesaving device, and to increase national security. Between 2005 and 2010, the unwillingness (“Strongly unwilling” and “Somewhat unwilling”) of college students to implant an RFID chip into their bodies decreased by 22.4% when considering RFID implants as a method to reduce identity theft, by 19.9% when considering RFID implants as a potential lifesaving device, and by 16.3% when considering RFID implants to increase national security [26].

J. RFID Implant Study: German Tech Conference Delegates

A 2010 survey of individuals attending a technology conference conducted by BITKOM, a German information technology industry lobby group, reported 23% of 1000 respondents would be prepared to have a chip inserted under their skin for certain benefits; 72% of respondents, however, reported they would not allow implantation of a chip under any circumstances. Sixteen percent (16%) of respondents reported they would accept an implant to allow emergency services to rescue them more quickly in the event of a fire or accident [27].

K. Ask India: Are Implants a More Secure Technology?

Previously, researchers reported a significant chi-square analysis relative to countries of residence and perceptions of chip implants as a more secure technology for identification/access control in organizations. More than expected (46 vs. 19.8; adjusted residual = 7.5), participants from India responded “yes” to implants as a more secure technology. When compared against the other countries in the study, fewer residents from the UK responded “yes” than expected (9 vs. 19.8), and fewer residents from the USA responded “yes” than expected (11 vs. 20.9). In rank order, the countries contributing to this significant relationship were India, the UK, and the USA; no such differences in opinion were found for respondents from Australia [28].

Due to heightened security threats, there appears to be a surge in demand for security in India [29] [30]. A progression of mass-casualty assaults carried out by extremist Pakistani nationals against hotels and government buildings in India has brought more awareness to the potential threats against less secure establishments [30]. The government is working to institute security measures at the individual level with a form of national ID card that will house key biometric data of the individual. In local and regional settings, technological infrastructure is developing rapidly in metro and non-metro areas because of the increase in MNCs (multi-national corporations) now locating in India. Although the neighborhood “chowkidar” (human guard/watchman) was previously the more popular measure for localized security, advances in, and the reliability and availability of, security technology are believed to be affecting the adoption of electronic access security as a replacement for more traditional security measures [29] [30].

L. Prediction of Adoption of Technology

Many models have been developed and utilized to understand factors that affect the acceptance of technology, such as: the Moguls Model of Computing by Ndubisi, Gupta, and Ndubisi in 2005; Diffusion of Innovations Theory by Rogers in 1983; the Theory of Planned Behavior by Ajzen in 1991; the Model of PC Utilization attributed to Thompson, Higgins, and Howell in 1991; Protection Motivation Theory (PMT) by Rogers in 1985; and the Theory of Reasoned Action attributed to Fishbein and Ajzen in 1975, with additional revisions by the same in 1980 [31].

Researchers in Berlin, Germany investigated consumers' reactions to RFID in retail. After viewing an introductory stimulus film about RFID services in retail, participants evaluated the technology and potential privacy mechanisms. Participants were asked to rate, on a five-point Likert-type scale (ranging from “not at all sensitive” to “extremely sensitive”), their attitudes toward privacy with such statements as: “Generally, I want to disclose the least amount of data about myself,” or “To me it is irrelevant if somebody knows what I buy for my daily needs.” In the study, participants reported moderate privacy awareness and, interestingly, a moderate expectation that legal regulations would result in sufficient privacy protection. Results showed that the extent to which people value the protection of their privacy strongly influences how willing they will be to accept RFID in retail. Participants were aware of privacy problems with RFID-based services; however, if retailers articulated that they valued customers' privacy, participants appeared more likely to adopt the technology. Thus, privacy protection (and the communication of it) was found to be an essential element of RFID rollouts [32].

SECTION III. Methodology

This quantitative, descriptive study investigated whether there were relationships between countries of residence and perceived barriers to RFID chip implants in humans for identification and access control purposes in organizations. The survey took place between April 4, 2011 and April 18, 2011, and each online survey took an average of 10 minutes to complete. Participants, small business owners within four countries (Australia, India, the UK, and the USA), were asked: “As a senior executive, what do you believe are the greatest barriers in instituting chip implants for access control in organizations?” Relative to gender, 51.9% of participants were male and 48.1% were female. The age of participants ranged from 18 to 71 years; the mean age was 44 and the median age was 45. Eighty percent of the organizations surveyed had fewer than 5 employees. Table I shows the survey participants' industry sectors.

Table I. Senior executives' industry sectors


The study employed one instrument that collected key data relative to the business profile, the technologies currently utilized for identification and access control at the organization, and the senior executives' perceptions of RFID implants in humans for identification and access control in organizations. Twenty-five percent of the small business owners that participated in the survey said they had electronic ID access to their premises. Twenty percent of employee ID cards in these businesses came equipped with a photograph, and less than five percent of owners stated they had had a security breach in the 12 months preceding the study.

Descriptive statistics, including frequency counts and measures of central tendency, were run and chi-square analysis was conducted to examine if there were relationships between the respondents' countries and each of the perceived barriers in instituting microchips in humans.
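The chi-square test of independence described above can be sketched in plain Python. This is a minimal sketch of the standard Pearson statistic, not the study's actual analysis code, and any contingency counts used with it here are illustrative:

```python
def chi_square_independence(table):
    """Pearson chi-square test of independence for an r x c contingency
    table of observed counts (list of lists). Returns the statistic,
    the degrees of freedom, and the expected counts under independence."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    # expected count for cell (i, j) = row_total_i * col_total_j / n
    expected = [[r * c / n for c in col_totals] for r in row_totals]
    statistic = sum(
        (obs - exp) ** 2 / exp
        for obs_row, exp_row in zip(table, expected)
        for obs, exp in zip(obs_row, exp_row)
    )
    df = (len(table) - 1) * (len(table[0]) - 1)
    return statistic, df, expected
```

For a 4 × 2 table (four countries by selected/did-not-select a given barrier), df = (4 − 1)(2 − 1) = 3, matching the df = 3 reported for each significant analysis in this study, and the statistic is compared against the df = 3 critical value of 7.815 at α = .05.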

SECTION IV. Findings

There was a significant relationship reported relative to respondents' countries for each of three of the six choices provided in the multichotomous question: “As a senior executive, what do you believe are the greatest barriers in instituting chip implants for access control in organizations?”

A. Barrier: Technological Issues

The significant chi-square analysis (χ² = 11.86, df = 3, p = .008) indicated that there was a relationship between the respondents' countries and the perceived barrier of technological issues. Using the rule of identifying adjusted residuals greater than 2.0, examination of the adjusted residuals indicated that the relationship arose because more participants from India than expected selected “technological issues (RFID is inherently an insecure technology)” as a barrier to instituting chip implants (45 vs. 31.1; adjusted residual = 3.4).
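The adjusted-residual rule applied above can be sketched as follows: each cell's raw residual (observed minus expected) is divided by its estimated standard error, and cells exceeding ±2.0 are flagged as driving the significant result. The counts in the test usage are illustrative, not the study's data:

```python
import math

def adjusted_residuals(table):
    """Adjusted standardized residuals for an r x c contingency table.
    Cells with |residual| > 2.0 flag where observed counts depart
    meaningfully from the counts expected under independence."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    residuals = []
    for i, row in enumerate(table):
        out = []
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            # standard error of the cell residual under independence
            se = math.sqrt(
                expected * (1 - row_totals[i] / n) * (1 - col_totals[j] / n)
            )
            out.append((observed - expected) / se)
        residuals.append(out)
    return residuals
```

Under this rule, a cell such as India's 45 observed selections against an expected 31.1 yields an adjusted residual of 3.4, well past the 2.0 threshold, which is what identifies India as the country creating the relationship.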

B. Barrier: Philosophical Issues

The second significant chi-square analysis (χ² = 31.21, df = 3, p < .001) indicated that there was a relationship between the respondents' countries and the perceived barrier of philosophical issues (right of control over one's body). An examination of the adjusted residuals indicated that the relationship was mostly created by fewer participants from India than expected selecting philosophical issues as a barrier to instituting chip implants (37 vs. 61.3; adjusted residual = 5.3). In addition, more residents from Australia than expected (78 vs. 62.9; adjusted residual = 3.3) selected philosophical issues as a barrier. In rank order, the countries contributing to this significant relationship were India, followed by Australia; no such differences in opinion were found for respondents from the UK and the USA.

C. Barrier: Health Issues

The third significant chi-square analysis (χ² = 10.88, df = 3, p = .012) indicated there was a relationship between the respondents' countries and the perceived barrier of health issues (unknown risks related to implants). An examination of the adjusted residuals indicated that the relationship was mostly created by more residents of India than expected selecting health issues as a barrier to instituting chip implants (57 vs. 43.3; adjusted residual = 3.1). In addition, fewer residents from the USA than expected (36 vs. 45.7; adjusted residual = 2.1) selected health issues as a barrier. In rank order, the countries contributing to this significant relationship were India, followed by the USA; no such differences in opinion were found for respondents from Australia and the UK.

D. Barrier: Social Issues, Religious Issues, and Cultural Issues

There were no significant chi-square analyses reported with respect to respondents' countries and social issues (digital divide), religious issues (mark of the beast), and cultural issues (incisions into the skin are taboo). Thus, in this study the researchers concluded no such differences in opinion were found for respondents' countries of residence and the barriers of social issues, religious issues, and cultural issues.

E. Statistical Summary

When asked whether radio frequency identification (RFID) transponders surgically implanted beneath the skin of an employee would be a more secure technology for instituting employee identification in the organization, only eighteen percent believed so. When subsequently asked how many staff in their organization would opt for an employee ID chip implant instead of the current technology if it were available, respondents indicated that eighty percent would not opt in. These figures are consistent with an in-depth interview conducted with consultant Gary Retherford, who was responsible for the first small business adoption of RFID implants for access control in 2006 [33]–[35]. In terms of the perceived barriers to instituting an RFID implant for access control in organizations, senior executives stated the following (in order of greatest to least barriers): 61% said health issues, 55% said philosophical issues, 43% said social issues, 36% said cultural issues, 31% said religious issues, and 28% said technological issues.

F. Open-Ended Question

When senior executives were asked if they themselves would adopt an RFID transponder surgically implanted beneath the skin, the responses were summarized into three categories: no, unsure, and yes [36]. We present a representative list of these responses below, with a future study focused on providing in-depth qualitative content analysis.

1) No, I Would Not Get an RFID Implant

“No way would I. Animals are microchipped, not humans.”

“Absurd and unnecessary.”

“I absolutely would not have any such device implanted.”

“Hate it and object strongly.”

“No way.”

“No thanks.”


“Absolutely creepy and unnecessary.”

“Would not consider it.”

“I would leave the job.”

“I don't like the idea one bit. The idea is abhorrent. It is invasive both physically and psychologically. I would never endorse it.”

“Would never have it done.”

“Disagree invading my body's privacy.”

“Absolutely vehemently opposed.”

“This proposal is a total violation of human rights.”

“Yeah right!! and get sent straight to hell! not this little black duck!”

“I do not believe you should put things in your body that God did not supply you with …”

“I wouldn't permit it. This is a disgraceful suggestion. The company does not OWN the employees. Slavery was abolished in developed countries more than 100 years ago. How dare you even suggest such a thing. You should be ashamed.”

“I would sooner stick pins in my eyeballs.”

“It's just !@;#%^-Nazi's???”

2) I am Unsure about Getting an RFID Implant

“A bit overkill for identification purposes.”


“Maybe there is an issue with OH&S and personal privacy concern.”


“Only if I was paid enough to do this, $100000 minimum.”

“Unsure, seems very robotic.”

“I'm not against this type of device but I would not use it simply for business security.”

“A little skeptical.”

“A little apprehensive about it.”

3) Yes, I would Get an RFID Implant

“Ok, but I would be afraid that it could be used by outside world, say police.”


“It is a smart idea.”

“It would not be a problem for me, but I own the business so no philosophical issues for me.”

“I'd think it was pretty damn cool.”

SECTION V. Discussion: Perceived Barriers

A. Barrier: Technological Issues

The literature revealed many technological barriers for non-implantable chips; this study suggests the same barrier is also perceived for implantable chips and is likely to be related [37]. More Indian participants than expected selected technological issues (RFID is inherently an insecure technology) as a barrier to instituting chip implants for access control; no such differences of opinion were found for the other countries in the study. However, in other analyses, more Indian participants than expected answered “yes” when asked if implants are a more secure technology for instituting identification/access control in an organization. The findings appear to suggest that although Indian participants perceive RFID implants as more secure than methods such as manual processes, paper-based systems, smartcards, or biometric/RFID cards, they are also likely to view the technology as undeveloped and still too emergent. Further research is needed to substantiate this conclusion, although a review of the literature revealed that RFID solution providers are already in abundance in India, with many new companies launching at a rapid pace. Without standards and regulations, providers unskilled and uneducated in the technology deliver solutions that often do not prove successful in implementation. Customers then deem the technology inconsistent and ineffective in its current state. In addition, RFID players undercut each other, offering cheap pricing for cheap, underperforming hardware. Therefore, the preliminary conclusion of the researchers is that adoption of implants in India is likely to be inhibited not only now, but well into the future, if implementations of non-implantable RFID solutions continue to misrepresent the capabilities of the technology. It is likely that, well before accepting implantable chips, individuals in India would need to be assured of the consistency and effectiveness of RFID chip use in non-human applications.
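The “more than expected / fewer than expected” comparisons in this section reflect a chi-square contingency analysis of observed versus expected cell counts. The following is a minimal sketch in Python; the counts are hypothetical and for illustration only, not the study's data.

```python
# Chi-square test of independence on a 2x2 contingency table:
# rows = country (India, Other), columns = barrier selected or not.
# All counts below are hypothetical, for illustration only.

observed = {
    ("India", "selected"): 40, ("India", "not_selected"): 60,
    ("Other", "selected"): 50, ("Other", "not_selected"): 150,
}

rows = ["India", "Other"]
cols = ["selected", "not_selected"]

total = sum(observed.values())
row_totals = {r: sum(observed[(r, c)] for c in cols) for r in rows}
col_totals = {c: sum(observed[(r, c)] for r in rows) for c in cols}

# Expected count under independence: (row total * column total) / grand total
expected = {(r, c): row_totals[r] * col_totals[c] / total
            for r in rows for c in cols}

# Chi-square statistic and per-cell standardized residuals.
chi_sq = sum((observed[k] - expected[k]) ** 2 / expected[k] for k in observed)
residuals = {k: (observed[k] - expected[k]) / expected[k] ** 0.5 for k in observed}

# A positive residual for ("India", "selected") is what the text calls
# "more than expected" participation in that cell under independence.
print(f"chi-square = {chi_sq:.2f}")
print(f"residual (India, selected) = {residuals[('India', 'selected')]:.2f}")
```

Whether a cell counts as a meaningful deviation depends on the residual's magnitude and the test's overall significance level, which the sketch above does not adjudicate.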

B. Barrier: Philosophical Issues

Fewer Indian participants than expected selected philosophical issues (right of control over one's body) as a barrier, while more Australian participants than expected did so. The researchers concluded that this is fertile ground for future research [38]. The deep cultural assumptions of each country are likely to influence participants' responses. For example, although Indian philosophies vary, many emphasize the continuity of the soul or spirit rather than the temporary state of the flesh (the body). Further research would inform these findings through an exploration of how and why participants in India versus participants in Australia perceive their own right of control over their bodies.

C. Barrier: Health Issues

More Indian participants than expected selected health issues (unknown risks related to implants) as a barrier to instituting implants, and fewer American participants than expected did so. The researchers conclude that these results may stem from perceived successes with current usage of the technology. The literature revealed that participants from India are experiencing poor implementations of the technology. Conversely, Americans are increasingly exposed to the use of surgically implanted chips in pets (often with no choice if the pet is adopted from a shelter), with few or no health issues faced [39]. In addition, segments of the healthcare industry are advocating RFID for use in the supply chain (e.g. the blood supply) with much success. To inform these findings, further research is needed to explore how participants from each country describe the unknown risks related to implants.

SECTION VI. Conclusion

In conclusion, the authors recognize there are significant social implications relative to implanting chips in humans. Although voluntary chipping has been embraced by certain individuals, the chipping of humans is rare and remains mostly a topic of discussion and debate. Privacy and security issues abound and are not to be minimized. However, in the future we may see an increased demand for, and acceptance of, chipping, especially as the global environment intensifies. When considering the increase in natural disasters over the past two years, the rising tensions between nations, such as those faced by India with terrorism by extremists from neighboring countries, and the recent contingency plans to enact border controls to mitigate refugees fleeing failing countries in the Eurozone, the tracking of humans may once again come to the forefront, as it did post-9/11 when rescuers raced against the clock to locate survivors in the rubble.

India is of particular interest in this study; participants from this country contributed most in many of the analyses. India is categorized as a developing country (or newly industrialized country) and is the second most populous country in the world. The government of India is already utilizing national identification cards housing biometrics, although the rollout has been delayed as officials work to solve issues around cards that can be stolen or misplaced, as well as how to prevent fraudulent use after the cardholder's death. Technological infrastructure is improving in even the more remote regions of India as MNCs (multi-national corporations) locate business divisions in the country. The findings, set against the backdrop of the literature review, bring to light what seems to be an environment of people statistically more open to (and possibly ready for) the technology of implants than those in developed countries. However, ill-informed RFID players in India are selling low-quality products. There appears to be a lack of standards and insufficient knowledge of the technology among those who should know it best. Further research is necessary not only to understand the Indian perspective, but also to better understand the environment now and into the future.


1. K. Michael and M. G. Michael, "The Diffusion of RFID Implants for Access Control and ePayments: Case Study on Baja Beach Club in Barcelona, " in IEEE International Symposium on Technology and Society (ISTAS10), Wollongong, Australia, 2010, pp. 242-252.

2. K. Michael and M. G. Michael, "Implementing Namebers Using Microchip Implants: The Black Box Beneath The Skin, " in This Pervasive Day: The Potential and Perils of Pervasive Computing, J. Pitt, Ed., ed London, United Kingdom: Imperial College Press, 2012, pp. 163-203.

3. K. Michael and M. G. Michael, "The Social, Cultural, Religious and Ethical Implications of Automatic Identification, " in The Seventh International Conference on Electronic Commerce Research, Dallas, Texas, 2004, pp. 432-450.

4. M. G. Michael and K. Michael, "A note on uberveillance, " in From dataveillance to uberveillance and the realpolitik of the transparent society, K. Michael and M. G. Michael, Eds., ed Wollongong: University of Wollongong, 2006, pp. 9-25.

5. M. G. Michael and K. Michael, Eds., Uberveillance and the Social Implications of Microchip Implants (Advances in Human and Social Aspects of Technology). Hershey, PA: IGI Global, 2014.

6. J. Stokes. (2004, October 14, 2004). FDA approves implanted RFID chip for humans. Available:

7. K. Michael, et al., "Microchip Implants for Humans as Unique Identifiers: A Case Study on VeriChip, " in Conference on Ethics, Technology, and Identity, Delft, Netherlands, 2008.

8. K. Opam. (2011, August 22, 2011). RFID Implants Won't Rescue the People Kidnapped in Mexico. Available:

9. C. Swedberg. (2005, June 12, 2012). L.A. County Jail to track inmates. Available:

10. F. Vara-Orta. (2012, May 31, 2012). Students will be tracked via chips in IDs. Available:

11. Newstaff. (November 27, 2012, May 13, 2014). Texas School: Judge Overturns Student's Expulsion over RFID Chip. Available:

12. M. Gasson, "ICT implants: The invasive future of identity?, " Advances in Information and Communication Technology, vol. 262, pp. 287-295, 2008.

13. K. D. Stephan, et al., "Social Implications of Technology: Past, Present, and Future, " Proceedings of the IEEE, vol. 100, pp. 1752-1781 2012.

14. R. Kumar. (2011, June 1, 2012). India's Big RFID Adoption Challenges. Available:

15. L. Radha, "Deployment of RFID (Radio Frequency Identification) at Indian academic libraries: Issues and best practice. , " International Journal of Library and Information Science, vol. 3, pp. 34-37, 2011.

16. H. Saranga, et al. (2010, June 2, 2012). Scope for RFID Implementation in the Indian Auto Components Industry. Available: http://tejasiimb.org/articles/73.php

17. N. LaVigne, "An evaluability assessment of RFID use in correctional settings, " in Final report submitted to the National Institute of Justice, ed. Washington DC: USA, 2006.

18. R. Halberstadt and N. LaVigne, "Evaluating the use of radio frequency identification device (RFID) technology to prevent and investigate sexual assaults in a correctional setting, " The Prison Journal, vol. 91, pp. 227-249, 2011.

19. A. Masters and K. Michael, "Lend me your arms: The use and implications of humancentric RFID, " Electronic Commerce and Applications, vol. 6, pp. 29-39, 2007.

20. K. Albrecht and L. McIntyre, Spychips: How Major Corporations and Government Plan to Track Your Every Purchase and Watch Your Every Move. New York: Plume, 2006.

21. A. Friggieri, et al., "The Legal Ramifications of Microchipping People in the United States of America-A State Legislative Comparison, " in IEEE International Symposium on Technology and Society (ISTAS '09), Phoenix, Arizona, 2009.

22. G. G. Assembly. (2010, January 12, 2011). Senate Bill 235. Available: nate-5.htm

23. M. G. Michael and K. Michael, "Towards a State of Uberveillance, " IEEE Technology and Society Magazine, vol. 29, pp. 9-16, 2010.

24. S. Rodota and R. Capurro, "Opinion n020: Ethical aspects of ICT Implants in the human body, " in European Group on Ethics in Science and New Technologies (EGE), ed, 2005.

25. C. Perakslis and R. Wolk, "Social acceptance of RFID as a biometric security method, " IEEE Symposium on Technology and Society Magazine, vol. 25, pp. 34-42, 2006.

26. C. Perakslis, "Consumer Willingness to Adopt RFID Implants: Do Personality Factors Play a Role in the Acceptance of Uberveillance?, " in Uberveillance and the Social Implications of Microchip Implants, M. G. Michael and K. Michael, Eds., ed Hershey, PA: IGI Global, 2014, pp. 144-160.

27. A. Donoghue. (2010, March 2, 2010). CeBIT: Quarter Of Germans Happy To Have Chip Implants. Available:

28. R. Achille, et al., "Ethical Issues to consider for Microchip Implants in Humans, " Ethics in Biology, Engineering and Medicine vol. 3, pp. 77-91, 2012.

29. S. Das. (2009, May 1, 2012). Surveillance: Big Brothers Watching. Available: http://dqindia.ciol.commakesections.asp/09042401.asp

30. M. Krepon and N. Cohn. (2011, May 1, 2012). Crises in South Asia: Trends and Potential Consequences. Available:

31. C. Jung, Psychological types. Princeton, NJ: Princeton University Press, 1923 (1971).

32. M. Rothensee and S. Spiekermann, "Between Extreme Rejection and Cautious Acceptance Consumers' Reactions to RFID-Based IS in Retail, " Science Computer Review, vol. 26, pp. 75-86, 2008.

33. K. Michael and M. G. Michael, "The Future Prospects of Embedded Microchips in Humans as Unique Identifiers: The Risks versus the Rewards, " Media, Culture & Society, vol. 35, pp. 78-86, 2013.

34. WND. (October 2, 2006, May 13, 2014). Employees Get Microchip Implants. Available:

35. K. Michael, ", " in Uberveillance and the Social Implications of Microchip Implants, M. G. Michael and K. Michael, Eds., ed Hershey, PA: IGI Global, 2014, pp. 133-143.

36. K. Michael, et al., "Microchip Implants for Employees in the Workplace: Findings from a Multi-Country Survey of Small Business Owners, " presented at the Surveillance and/in Everyday Life: Monitoring Pasts, Presents and Futures, University of Sydney, NSW, 2012.

37. M. N. Gasson, et al., "Human ICT Implants: Technical, Legal and Ethical Considerations, " in Information Technology and Law Series vol. 23, ed: Springer, 2012, p. 184.

38. S. O. Hansson, "Implant ethics, " Journal of Med Ethics, vol. 31, pp. 519-525, 2005.

39. K. Albrecht, "Microchip-induced tumours in laboratory rodents and dogs: A review of literature, " in Uberveillance and the Social Implications of Microchip Implants, M. G. Michael and K. Michael, Eds., ed Hershey, PA: IGI Global, 2014, pp. 281-318.

Keywords: Radiofrequency identification, Implants, Educational institutions, Organizations, Access control, Australia, transponders, authorisation, microprocessor chips, radiofrequency identification, institutional microchips, perceived barriers, microchip implants, transnational study, small business owners, RFID transponders, radio frequency identification transponders, employee ID, chip implants, access control, organizations, chi-square analysis, technological issues, philosophical issues, health issues, religious issues, social issues, digital divide, cultural issues, USA, RFID, radio frequency identification, implants, microchips, uberveillance, barriers, employee identification, security, small business, Australia, India, UK

Citation: Christine Perakslis, Katina Michael, M. G. Michael, Robert Gable, "Perceived barriers for implanting microchips in humans", 2014 IEEE Conference on Norbert Wiener in the 21st Century (21CW), Date of Conference: 24-26 June 2014, Date Added to IEEE Xplore: 08 September 2014. DOI: 10.1109/NORBERT.2014.6893929

Social acceptance of location-based mobile government services for emergency management


Location-based services deployed by governments can be used to assist people in managing emergencies via their mobile handsets. The acceptance of public services in the domain of emergency management has scarcely been investigated in information systems research. The main aim of this study is to assess the viability of location-based mobile emergency services by: (i) exploring the issues related to location-based services and their nationwide utilisation for emergency management; (ii) investigating the attitudinal and behavioural implications of the services; and (iii) examining the social acceptance or rejection of the services and identifying the determinants of this acceptance or rejection. The results reveal that both attitude and perceived usefulness demonstrate good predictive power for behavioural intention. Although perceived ease of use was found not to be a predictor of attitude, the results affirm its influence on perceived usefulness. The results also demonstrate the role of trust as the most influential determinant of individual perceptions of the usefulness of the services. Further, the results indicate that only the collection of personal location information, as a perceived privacy concern, had a significant negative impact on trust. Implications and future research are also discussed.


► We investigate the public offerings of location-based services in the domain of emergency management.

► We examine the social acceptance or rejection of the services and identify the determinants of this acceptance or rejection.

► Attitude has a significant role in influencing behavioural intention towards using the services for emergency management.

► Trust is the most influential determinant of individual perception of the usefulness of the services.

1. Introduction

Emergencies and disasters have been part of our existence since the recording of history and will always be part of the continuing cycle of life and death. The 2001 terror attacks on New York City, the 2004 Indian Ocean Tsunami, the 2010 Haiti earthquake, and the 2012 Hurricane Sandy in the United States and Canada are just a few telling examples of what societies can endure. According to the United Nations’ International Strategy for Disaster Reduction Platform (2005), one of the main reasons for the loss of life in an emergency event is lack of early warning information. Therefore, in response to the lack of timely information, governments around the world have been exploring mobile phones as an additional feasible channel for disseminating information to people in emergency situations. The Short Message Service (SMS) and the Cell Broadcast Service (CBS) currently represent the feasible services that could be utilised for geo-specific emergency purposes as they can operate with almost all kinds of mobile handsets available today (Aloudat and Michael, 2010). We call such a service “location-based mobile government service for emergency management”.

Samsioe and Samsioe (2002) argued that an electronic service that has location capabilities should be able to fulfil the following three separate activities so as to be accurately defined as a location-based service (LBS): (i) estimate the location of the device; (ii) produce a service based on the estimated location; and (iii) deliver the location-enhanced service to that device. Accordingly, location-based services (LBS) for emergency management would involve the following: first, the location of the mobile handset can be estimated by using Cell-ID related technologies (Spiekermann, 2004); second, the mobile telecommunications network can produce an emergency information service, formed as an SMS or CBS, on events such as fire, flood, heavy rain, or hurricane, around the estimated location; and third, the warning message can then be sent to mobile handsets in the vicinity of the emergency to alert people.
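The three activities above can be sketched as a toy pipeline. This is a minimal illustration only, not an implementation of any carrier's system; all names (CELL_REGISTRY, Handset, the phone numbers and cell IDs) are hypothetical, and a real service would run inside the mobile network, modelled here on Cell Broadcast delivery to every handset attached to the affected cell.

```python
# A minimal sketch of the three LBS activities from Samsioe and Samsioe (2002)
# applied to emergency alerting. All identifiers are hypothetical.

from dataclasses import dataclass

# 1. Estimate the location of the device (here: a simple Cell-ID lookup).
CELL_REGISTRY = {  # hypothetical cell-id -> area-name mapping
    "cell-042": "Wollongong CBD",
    "cell-077": "Illawarra coast",
}

@dataclass
class Handset:
    number: str
    cell_id: str  # the cell tower the handset is currently attached to

def estimate_location(handset: Handset) -> str:
    return CELL_REGISTRY[handset.cell_id]

# 2. Produce a service based on the estimated location.
def produce_warning(event: str, area: str) -> str:
    return f"EMERGENCY: {event} reported near {area}. Follow local instructions."

# 3. Deliver the location-enhanced service to devices in the affected area
#    (modelled on Cell Broadcast: every handset on the affected cell receives it).
def broadcast(event: str, affected_cell: str, handsets: list[Handset]) -> list[tuple[str, str]]:
    deliveries = []
    for h in handsets:
        if h.cell_id == affected_cell:
            msg = produce_warning(event, estimate_location(h))
            deliveries.append((h.number, msg))
    return deliveries

handsets = [Handset("+61-400-000-001", "cell-042"),
            Handset("+61-400-000-002", "cell-077")]
for number, msg in broadcast("flood", "cell-042", handsets):
    print(number, "->", msg)
```

In this sketch only the handset attached to the affected cell receives the warning, which is the geo-specific property that distinguishes LBS alerting from an undifferentiated bulk SMS campaign.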

After examining the related literature, it is clear that there is a marked scarcity of theoretical and empirical research on the issues pertaining to the nationwide deployment of LBS for emergency management by governments. Furthermore, early studies have neglected the assessment of the acceptance and adoption of these services, along with their determinants, in the public domain. Accordingly, we seek to fill this gap by assessing the viability of location-based mobile government services within national emergency management arrangements, using Australia as a case study. To achieve this, we investigate the social acceptance or rejection of location-based mobile government emergency services in Australia and identify the determinants of that acceptance or rejection.

The rest of this paper is organised as follows. Section 2 reviews the existing literature on the issues related to utilising LBS for emergency management. Section 3 develops a research model that demonstrates the acceptance of the services and their determinants. Section 4 describes the research method applied in this study. Section 5 reports the data analysis conducted to test the research model and Section 6 provides a discussion of the results. The contributions and limitations of this study and directions for future research are discussed in Section 7.

2. Issues related to LBS and emergency management

2.1. Visibility of LBS as a solution for emergency management

An individual may not be aware of the possible utilisation of location-based mobile phone services for emergency management and, therefore, it could be argued that the direct advantages or disadvantages of such utilisation would not be visible to him or her (Karahanna et al., 1999; Kurnia and Chien, 2003). An early explanation of this common phenomenon came from Zajonc (1968), who defined it as the “mere exposure effect”. This describes the case where a person knows little or nothing about a phenomenon, but repeated exposure to related stimulus objects is capable of changing his or her beliefs towards the phenomenon, either positively or negatively.

One of the key attributes of the Diffusion of Innovation (DOI) Theory by Rogers (1962) is observability, which was later segmented by Moore and Benbasat (1991) into two distinct constructs of demonstrability and visibility. The interpretation of visibility surmises that an innovation may not be new, but its benefits could be unknown to the public or even to governments. This is probably the case with LBS, where the services have been available for several years, yet their general usage rates, specifically in the domain of emergency management, are still extremely limited worldwide (Frost and Sullivan research service, 2007; O’Doherty et al., 2007; Aloudat and Michael, 2011).

2.2. The quality features of location-based emergency services

Service quality is defined as “a global judgement, or attitude, relating to the superiority of the service” (Parasuraman et al., 1988, p. 16). The quality of a service is, therefore, a result of subjective understanding, evaluation, and judgement of its merits. This understanding could, unfortunately, raise several judgement-related issues regarding the desired features of a service. Such issues could easily be augmented in the world of electronic services (e-services), such as LBS, especially in the absence of widely accepted and reliable instruments to quantifiably measure the quality features of an e-service. As a direct result of the absence of “agreed-upon” e-service quality models for all kinds of e-services, researchers have been compelled to use traditional service quality scales, such as the SERVQUAL model of Parasuraman et al. (1988), to measure the quality features of e-services (Liljander et al., 2002). In these traditional models, however, the interpersonal character of the delivery has the main impact on determining the quality of the service and, therefore, such models cannot truly be applied to the paradigm of e-services (Boshoff, 2007). Several studies suggested alternative instruments to measure e-service quality; examples include Kaynama and Black (2000), and Zeithaml et al. (2000, 2002). But Boshoff (2007) strongly argued that most of these proposed instruments had flaws, since they were either too narrowly focused on a specific kind of e-service or failed to address the e-service from the perspective of the medium through which the service is provided or delivered.

In general, the quality of an e-service has been discerned as a multifaceted concept with different dimensions proposed for different service types (Zeithaml et al., 2002; Zhang and Prybutok, 2005). Unfortunately, in the context of LBS there is no existing consummate set of dimensions that can be employed to measure the quality features of the services and, subsequently, to measure their impact on an individual’s opinion about the utilisation of the services for emergency management. Therefore, defining a dimensional measurable set for location-based mobile phone emergency services would not be a straightforward task since there is almost no scholarly research regarding such a set. Nonetheless, the quality dimensions of a location-based mobile phone service that are expected to be relevant to emergency situations were adapted from Liljander et al. (2002), but were revised to accurately reflect the quality measurements of LBS in their new context (i.e. emergency management). The dimensions include reliability, responsiveness, customisation, assurance/trust, and user interface.

The interpretation of the reliability concept follows Kaynama and Black (2000), Zeithaml et al. (2002) and Yang et al. (2003) as the currency and accuracy of the service information. To be considered reliable, the LBS needs to be delivered with the best possible service information, in the best possible state, and within the promised time frame (Liljander et al., 2002).

It is reasonable to postulate that the success of a location-based mobile phone emergency service depends on the ability of the solution provider to disseminate the service information to a large number of people in a timely fashion. Because fast response to changing situations or to people’s emergent requests constitutes timely information, timeliness is closely related to responsiveness (Lee, 2005). Therefore, investigating the responsiveness of LBS is relevant in this context. In general, examining the influence of the currency, accuracy, and responsiveness quality features on public opinion is expected to provide insight into the extent to which LBS is considered sufficiently trustworthy to be utilised for emergency management.

The user interface dimension comprises factors such as aesthetics, which could not be evaluated in this exploratory research because respondents would not have access to LBS-enabled applications for emergency management. Customisation refers to the state where information is presented in a format tailored to the user. Since LBS is customised based on the location of the recipient’s mobile handset and on the type of information being sent, customisation is already intrinsic to the core features of location-based mobile phone emergency services. Therefore, the service quality dimensions that are expected to impact the acceptance or rejection of location-based mobile phone emergency services, and accordingly are investigated, include:

(1) Perceived currency: the perceived quality of presenting up-to-the-minute service information during emergencies.

(2) Perceived accuracy: the individual’s perception about the conformity of location-based mobile phone emergency service with its actual attributes of content, location, and timing.

(3) Perceived responsiveness: the individual’s perception of receiving a prompt information service in the case of an emergency (Parasuraman et al., 1988; Liljander et al., 2002; Yang et al., 2003).

2.3. Risks of utilising LBS for emergency management

Risk of varying types exists on a daily basis in human life. Koller (1988) believed that the nature of the situation determines the type of risk and its potential effects. In extreme situations such as emergencies, risk perceptions stem from the fact that the sequence of events and the magnitude of the outcome are usually unknown or cannot be totally controlled. Risky situations affect public confidence in technology used in such situations (Im et al., 2008). Uncertainty is a salient element of risk. Two distinct types of uncertainty have been differentiated by Bensaou and Venkatraman (1996): behavioural and environmental. In the context of LBS, behavioural uncertainty arises when users cannot ascertain the behavioural actions of other LBS parties, especially in extreme events. Risk perceptions may be projected here in several forms. First, a personal risk could be perceived because the LBS user may not be able to guarantee that the service provider will fulfil its expected role under extreme emergency conditions. Physical, psychological, and social risk perceptions could all be envisaged here as personal risks (Jacoby and Kaplan, 1972). Second, the decision might hold a perception of economic risk as it might lead to a monetary loss in private properties or assets. Third, a privacy risk may be perceived since there can be some concerns that the service provider would act opportunistically in emergencies in a way that would disclose valuable personal information to other parties, collect an inordinate amount of information, or use the collected information for purposes other than and beyond the emergency situation itself and without any prior consent from the LBS user.

The second type of uncertainty is environmental, which originates because emergencies, by their nature, cannot usually be predicted in their exact timing or severity. Thus, the LBS user may reasonably assume that in an extreme condition the underlying infrastructure supporting location-based mobile phone emergency services would be compromised as in any other telecommunications model. Several risk perceptions may also be projected here. First, a perception of a personal risk could originate when the user is uncertain whether or not the LBS infrastructure would cope with the emergency situation, which might lead to a potential risk to the personal safety or the safety of important others (i.e. family members, friends, or working companions). Again, physical, psychological and social risk perceptions could all be conceived here as personal risks (Jacoby and Kaplan, 1972). Second, a perception of a performance risk emanates from the possibility that the location-based emergency service may suffer or not perform as it is intended or desired. There may not be a perception of a direct personal risk to the individual’s own safety, but the idea of a service failure when it is most needed could increase concerns about service performance and resilience in emergencies. A third environmental risk could be perceived financially when there is a possibility of monetary loss of private property or assets due to service failure (Featherman and Pavlou, 2003).

2.4. Trust in LBS for emergency management

Trust has long been regarded as an important aspect of human interactions and mutual relationships. Basically, any intended interaction between two parties proactively requires an element of trust predicated on the degree of certainty in one’s expectations or beliefs of the other’s trustworthiness (Mayer et al., 1995; Li, 2008). In the “relatively” uncertain environments of e-services, including LBS (Kaasinen, 2005; Lee, 2005), uncertainty leads individuals to reason about the capabilities of LBS and its expected performance in emergency situations, which eventually brings them to either trust the service by willingly agreeing to use it or distrust the service by simply refusing to use it. In emergencies, individuals may consider the possible risks associated with LBS before using such services. Therefore, individuals are likely to trust the service and engage in a risk taking relationship if they perceive that the benefits of using LBS surpass its risks. However, if high levels of risk are perceived, then it is most likely that individuals will not have trust in the service and, therefore, will not engage in risk-taking behaviour by using it (Mayer et al., 1995). Consequently, it could be posited that trust in an LBS is a pivotal determinant of utilising the services for emergency management where great uncertainty is always present.

Trust has generally been defined as the belief that allows a party to willingly become vulnerable to the trustee after having taken the characteristics of the trustee into consideration, whether the trustee is another person, a product, a service, an institution, or a group of people (McKnight and Chervany, 2001). In our context, the definition encompasses trust in the government providing the service and trust in the technology and underlying infrastructure through which the service is provided (Carter and Bélanger, 2005). But, since willingness to use the location-based mobile phone emergency service is an indication that the person has considered the characteristics of both the service and the service provider, including any third parties, then it is highly plausible to say that investigating trust propensity in the service will provide a prediction of trust in both the service and its provider. The ability to provide such a prediction is based upon the importance of trust in the service and its underlying technologies, which has been clearly recognised before in acceptance and adoption literature (Kini and Choobineh, 1998; Kim et al., 2001). It could be argued, however, that trust should be examined with the proposition that the person knows or, at least, has a presumption of knowledge about the service, its benefits, and the potential risks associated with its utilisation. Nonetheless, it should be noted here that trust, per se, is a subjective interpretation of the actual trustworthiness of the service, given the current extremely limited utilisation of LBS in the domain of emergency management.

2.5. Privacy concerns pertaining to LBS emergency services

A classical and commonly quoted definition of privacy is that it is “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others” (Westin, 1967, p. 7).

In the context of LBS, the traditional commercial use of the services, where a high level of detail about the user’s information is regularly available to the mobile service provider, may not raise much sensitivity towards privacy from users, since the user’s explicit consent is a prerequisite for initiating the services in most cases. However, in emergencies, pertinent government departments and law enforcement agencies have the power to temporarily set aside the person’s right to privacy by not informing the person when, where, and for how long his or her personal information would be collected and/or monitored. This is based on the assumption that the consent of the person is already implied when location information is collected and/or monitored in emergency situations. Nonetheless, the idea that their personal information is perennially available to other parties, and the belief that the individual has limited or no control over the collection and/or surveillance, its extent, duration, timing, or the amount of information being collected, could raise privacy concerns.

Good intentions are generally assumed in the relation between the government and its people, as governments usually communicate with the individuals in regard to what kind of data will be collected in emergencies, the extent of the collection, and when data will be collected. However, the implications of suspending consent of the person, even temporarily, may have long-term adverse effects and negative impacts on public perception of LBS solutions in general. This also has the potential to generate debate on the right of the individual in an absolute privacy state and the power of governments to dispense with that right of privacy (Perusco et al., 2006), even when the services are suggested by the government for emergency management purposes.

Four privacy concerns have been identified by Smith et al. (1996): collection, unauthorised secondary use, errors in storage, and improper access to the collected data. These concerns can be examined when investigating privacy concerns pertaining to LBS (Junglas and Spitzmuller, 2006). Collection is defined as the concern that extensive amounts of location information or other personally identifiable information would be collected by the government when using LBS during emergencies. Unauthorised secondary use is defined as the concern that information is collected for emergency purposes using LBS, but will ultimately be used for other purposes by the government without the explicit authorisation/consent of the individual for those other uses. Errors in storage describe the concern that the procedures taken to protect against accidental or deliberate errors in storing the location information while utilising LBS are inadequate. Improper access is the concern that the stored location information is accessed by parties in the government who do not have the authority to do so.

3. Research model and hypotheses development

A special adaptation of the Theory of Reasoned Action (TRA) (Fishbein and Ajzen, 1975; Ajzen and Fishbein, 1980) has been introduced by Davis (1986, 1989) in the form of a Technology Acceptance Model (TAM). According to TRA, the actual behaviour of an individual is determined by the individual’s intention to perform that behaviour. Such intention is the result of a joint function and/or influence of the subjective norms and the individual’s attitude towards engaging in that specific behaviour. TAM postulates that the usage of a technology (i.e. the actual adoption of the technology) can be predicted as behaviour by the individual’s intention to use the technology. The individual’s intention to use can be determined by his or her attitude towards using that technology. In TAM, both the attitude and intention are postulated as the main predictors of accepting the technology. The attitude is presumed to act as a mediator between the behavioural intention and two key influential beliefs: the perceived ease of use of the technology, and its perceived usefulness. TAM posits a direct link between perceived usefulness and behavioural intention. The model also posits that the perceived usefulness of the technology is directly influenced by the perceived ease of use of that technology.

First, based on the original TAM, the following hypotheses are formulated:

H1: Intention to use location-based mobile phone emergency services is positively related to attitude towards the services.

H2: Intention to use location-based services in emergencies is positively associated with perceived usefulness of the services.

H3: Attitude towards location-based mobile phone emergency services is positively associated with perceived usefulness of these services.

H4: Attitude towards using location-based mobile phone emergency services is positively associated with perceived ease of use of the services.

H5: Perceived ease of use of location-based services has a positive impact on perceived usefulness of the services for emergency management purposes.

Due to its parsimony and predictive power, TAM has been widely applied, empirically validated, and extended in many studies related to user acceptance of information technology (see, for example, Venkatesh, 2000; Venkatesh and Davis, 2000; Pavlou, 2003; Djamasbi et al., 2010; Mouakket and Al-Hawari, 2012). However, TAM is a general model that only provides overall information about technology acceptance and usage and does not specify the determinants of perceived usefulness and perceived ease of use, the two main beliefs included in the model. Therefore, further information is needed regarding the specific factors that may affect a certain technology’s usefulness and ease of use from an individual’s perspective, as this can guide the design and development of the technology in the right direction (Mathieson, 1991). Indeed, Venkatesh and Davis (2000) suggested that the user behavioural beliefs included in TAM could be affected by external variables. TAM also theorises that the effects of external variables on intention to use are mediated by perceived usefulness and perceived ease of use (Venkatesh, 2000). As such, this research utilises visibility, perceived risk, perceived service quality, perceived privacy concerns, and trust as external factors affecting perceived usefulness and perceived ease of use in TAM.

The rest of the research hypotheses, presented in the following sections, are completely consistent with the structural formulation of TAM and do not violate in any way TAM’s grounded theory of TRA. All the hypothesised effects of the external constructs in the proposed research model would only be exhibited on the internal variables of the model (i.e. attitude and intention) through the full mediation of TAM’s internal beliefs (i.e. perceived usefulness and perceived ease of use). Any other arrangement beside those mentioned must be considered as another model, and not TAM.
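The full-mediation structure described above can be sketched as a small directed graph; the construct names below are our own shorthand labels for the model's variables, not terms taken from the survey instrument:

```python
# Hypothesised paths of the research model (H1-H13), as directed edges.
# Construct names are illustrative shorthand, not the paper's item labels.
EDGES = {
    "ease_of_use": {"usefulness", "attitude"},            # H5, H4
    "usefulness": {"attitude", "intention"},              # H3, H2
    "attitude": {"intention"},                            # H1
    "service_quality": {"usefulness"},                    # H6a-H6c
    "visibility": {"usefulness"},                         # H7
    "risk": {"usefulness"},                               # H8
    "trust": {"usefulness", "risk"},                      # H9, H10
    "privacy_concerns": {"usefulness", "trust", "risk"},  # H11-H13
}

def reaches(graph, src, dst, seen=frozenset()):
    """Depth-first check that `dst` is reachable from `src`."""
    if src == dst:
        return True
    return any(reaches(graph, nxt, dst, seen | {src})
               for nxt in graph.get(src, ()) if nxt not in seen)

# Full mediation: every external factor reaches intention, yet none has a
# direct edge into attitude or intention.
externals = ["service_quality", "visibility", "risk", "trust", "privacy_concerns"]
assert all(reaches(EDGES, e, "intention") for e in externals)
assert all(not EDGES[e] & {"attitude", "intention"} for e in externals)
```

Any added edge from an external factor directly into attitude or intention would fail the second check, which is exactly the sense in which a different arrangement "must be considered as another model, and not TAM".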

3.1. Effect of perceived service quality on perceived usefulness

It could be posited that an individual’s perception of how useful LBS are in emergencies would be highly influenced by the degree to which the individual perceives the services to be accurate, current, and responsive. The research conceptual model follows the same rationale as TAM, which postulates the perceived ease of use of a technology as a direct determinant of its perceived usefulness. Perceived ease of use is defined as the degree to which the individual believes that using LBS would be free of physical and mental effort (Davis, 1989). It is then justifiable to postulate that ease of use is directly related to technical service quality features of LBS since the individual’s evaluation of the service’s ease of use is closely associated with the convenient design of the service itself. This is perhaps why ease of use has been conceived by several researchers as one of the core dimensions of service quality (Zeithaml et al., 2002; Yang et al., 2003; Zhang and Prybutok, 2005). Building upon this and following the trails of TAM, the currency, accuracy, and responsiveness service quality constructs are theorised in the research model as direct determinants of the perceived usefulness of the location-based mobile phone emergency service. The following hypotheses are proposed:

H6a: There is a positive relationship between perceived responsiveness of the location-based mobile phone emergency service and its perceived usefulness.

H6b: There is a positive relationship between the perceived currency of the location-based mobile phone emergency service and its perceived usefulness.

H6c: There is a positive relationship between the perceived accuracy of the location-based mobile phone emergency service and its perceived usefulness.

3.2. Effect of visibility on perceived usefulness

Visibility is defined as the extent to which the actual use of location-based mobile phone emergency service is observed as a solution by the individual. Following a line of reasoning in former studies, such as Karahanna et al. (1999) and Kurnia and Chien (2003), the perception of an individual of the usefulness of the location-based mobile phone emergency service is positively related to the degree to which the service solution is visible to that individual. The following hypothesis is presented:

H7: Perceived usefulness of the location-based mobile phone emergency service increases as the visibility of the service application increases in the context of use.

3.3. Effect of perceived risk on perceived usefulness

As it is practically rational to believe that the individual would perceive different types of risk during an emergency situation, it might be quite difficult to examine each risk facet separately from the others, since they can be inextricably intertwined in such situations. Therefore, following the theoretical reasoning of Pavlou (2003), perceived risk will be investigated as a higher-order uni-dimensional concept that embraces the two types of uncertainty identified earlier, that is, behavioural and environmental.

A number of former studies have shown that public perceptions of the inherent risks in e-services can be a pivotal barrier to the acceptance of the services (Campbell and Goodstein, 2001; Featherman and Pavlou, 2003; Pavlou and Gefen, 2004; Heijden et al., 2005; Lee and Rao, 2005; Xu et al., 2005; Junglas and Spitzmuller, 2006; Horst et al., 2007). But, more importantly, in the mobile telecommunications environment people feel more vulnerable to the risks of the underlying technologies since there are always concerns about information loss or delivery failure because of the nature of the media through which information is usually delivered to them (Bahli and Benslimane, 2004).

Based on the interpretations of Pavlou and Gefen (2004) and Heijden et al. (2005), the perceived risk is defined as the individual belief as to the potential loss and the adverse consequences of using location-based mobile phone emergency services and the probability that these consequences may occur if the services solution is used for emergency management. Bearing in mind the high degree of uncertainty that is usually associated with emergency situations, it is argued that perceptions of risk would have a highly negative impact on individual perception of the usefulness of location-based mobile phone emergency services. Therefore, the following hypothesis is presented:

H8: Perceived risks from using location-based mobile phone emergency services have a negative influence on the perceived usefulness of the services.

3.4. Effect of trust on perceived usefulness

Despite the general consensus of the existence of a mutual relationship between trust and risk, the two concepts should be investigated separately when examining their impact on public acceptance of LBS since they usually show different sets of antecedents (Junglas and Spitzmuller, 2006). Trust and perceived risks are primarily essential constructs when uncertainty is present (Mayer et al., 1995). However, each has a different type of interrelationship with uncertainty. While uncertainty augments the risk perceptions of using location-based mobile phone emergency services, trust reduces the individual’s concerns regarding the possible negative consequences of using the services, thus alleviating the uncertainty around service performance. Therefore, since trust in the LBS can lessen uncertainty associated with the services, thus reducing the perceptions of risk, it is theorised that the perceived risk is negatively related to an individual’s trust in the service. This is in line with a large body of former empirical research, which supports the influence of trust on perceptions of risk (Gefen et al., 2003). In addition, by reducing uncertainty, trust is assumed to create a positive perspective regarding the usefulness of the services and provide expectations of an acceptable level of performance. Accordingly, trust is postulated to positively influence the perceived usefulness of location-based mobile phone emergency services and, therefore, the following hypotheses could be proposed:

H9: Trust in location-based mobile phone emergency services positively influences the perceived usefulness of the services.

H10: Trust in location-based mobile phone emergency services negatively impacts the risks perceived from using the services.

3.5. Effects of perceived privacy concerns on usefulness, trust, and risk

Perceived privacy concerns are expected to have a direct negative impact on the perceived usefulness of LBS. In addition, other prominent constructs of trust and perceived risks are also assumed to have mediating effects on the relationship between perceived privacy concerns and perceived usefulness since both constructs (i.e. trust and perceived risks) could be reasonably regarded as outcomes of the individual assessment of the privacy concerns (Junglas and Spitzmuller, 2006). For instance, if a person is not greatly concerned about the privacy of his or her location information, then it is most likely that that individual trusts the services, thus perceiving them to be useful. On the other hand, if the perceptions of privacy concerns are high, the individual would probably not engage in a risk-taking behaviour, due to the high levels of risks perceived, thus resulting in lower perceptions of the usefulness of the services. Building on this reasoning, the perceived privacy concerns are theorised in the research model as direct determinants of both trust and risk perceptions. While the perceived privacy concerns are postulated to have a negative impact on trust in the services, they are theorised to positively influence the perceived risks associated with using LBS.

Reductions in information privacy are generally the product of two types of activities: observing information about the person and sharing this information with others (Bridwell, 2007). Accordingly, the influences of two pertinent privacy concerns (i.e. collection and unauthorised secondary use) on individual acceptance of location-based mobile phone emergency services are proposed as the bases for the following hypotheses:

H11a: Collection as a perceived privacy concern negatively impacts the perceived usefulness of location-based mobile phone emergency services.

H11b: Unauthorised secondary use as a perceived privacy concern negatively impacts the perceived usefulness of location-based mobile phone emergency services.

H12a: Collection as a perceived privacy concern has a negative impact on trust in location-based mobile phone emergency services.

H12b: Unauthorised secondary use as a perceived privacy concern has a negative impact on trust in location-based mobile phone emergency services.

H13a: Risks perceived from using location-based mobile phone emergency services are positively associated with perceived privacy concerns about collection.

H13b: Risks perceived from using location-based mobile phone emergency services are positively associated with perceived privacy concerns about unauthorised secondary use.

Based on the proposed hypotheses, the research conceptual model is illustrated in Fig. 1.

Fig. 1. A conceptual model of location-based mobile phone emergency service acceptance.

4. Research method

4.1. Research context

Rapid proliferation of mobile platforms presents a real opportunity for the Australian Government to utilise location-based mobile services as an integral information lifeline in times of peril, especially now when Australians are becoming increasingly mobile; not only in the way they move, live and communicate, but also in the way they acquire information relevant to their whereabouts and various daily life activities. Utilising location-based services for emergency management has the potential to augment the overall levels of safety by increasing the situational awareness among people about threatening events in their immediate surrounds, thus helping to avoid unnecessary casualties, injuries or damage. The value of location-based mobile emergency services in Australia was realised after the Australian Federal, States and Territories Governments announced in 2009 their future intentions to utilise mobile services under the National Emergency Warning System (NEWS).

Location-based mobile services could help to find a solution to one of the intrinsic issues in most conventional emergency warning systems today, which usually require the recipient to be anchored to an information channel at the time information is disseminated in order to receive an alert or warning message. However, given the current lack of research, not only in Australia but also globally, in relation to understanding the various implications of a nationwide utilisation of various mobile government location-based services for personal safety and public warning purposes, this study contends that there is a pressing need for such research. The results of this study would be of high importance to government, business and society at large.

4.2. Survey questionnaire

As attitude and intention are postulated as the main predictors of social acceptance or rejection of location-based mobile government emergency services, the researchers used a survey to examine and understand public attitudes and intentions towards using the services once the services are introduced by the Australian Government for emergency management solutions in the future. A five-point Likert rating scale was used in the questionnaire part of the survey. Each set of items or questions reflects a construct in the research conceptual model. The items and the studies from which the items were adapted can be found in Appendix A. At the end of the questionnaire, an open-ended question was used to solicit general comments, opinions, and additional information from the survey participants about the services.

4.3. Survey testing

Validating and testing the survey are essential processes in empirical information systems research (Straub, 1989). The survey testing was carried out in three separate steps. First, an observational study was conducted with two persons, both with minimal knowledge about location-based services. This lack of former knowledge was necessary to calculate the average time needed for each person to become acquainted with the topic of the study and complete the survey. Second, 600 pilot surveys were randomly distributed by hand. The results of the pilot survey provided the researchers with the needed grounds for testing the survey before its large-scale deployment. Third, the internal reliability of the survey was evaluated. Reliability reflects the internal consistency of the scale items measuring the same construct, that is, the extent to which the items would yield consistent scores if the survey were redeployed on the same population. After revision, values for all measurements were higher than the common threshold value of 0.7. The evaluation results (i.e. Composite reliability and Cronbach’s alpha scores) of the internal reliability are presented in Table 1.
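As a brief illustration of the reliability test applied here, Cronbach's alpha for a multi-item construct can be computed directly from respondents' item scores; the 5-point Likert responses below are invented toy data, not figures from this survey:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for one scale; `items` is a list of per-item score
    lists, all covering the same respondents in the same order."""
    k, n = len(items), len(items[0])

    def var(xs):  # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[r] for item in items) for r in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Toy 5-point Likert responses for a 3-item construct, 6 respondents.
scores = [
    [5, 4, 4, 2, 3, 5],
    [4, 4, 5, 2, 3, 4],
    [5, 3, 4, 1, 3, 5],
]
print(round(cronbach_alpha(scores), 3))  # 0.927, above the 0.7 threshold
```

A scale would pass the test used in this study whenever the resulting alpha exceeds 0.7.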


Table 1. The internal consistency and discriminant validity of the research constructs.

4.4. Main survey

After survey testing, around 1350 surveys were distributed randomly by hand to households in the Illawarra region and the City of Wollongong, New South Wales, Australia. Participants were asked to return their copies to the researchers in a reply-paid envelope provided with the survey within three weeks. Three hundred and four completed surveys were returned, yielding an acceptable 22.52% response rate. Amongst the 304 surveys, 59 were returned with comments in their open-ended question. However, after excluding all unusable partial responses, 290 surveys remained for the statistical analysis.
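The reported figures can be checked with a line of arithmetic:

```python
distributed, returned, usable = 1350, 304, 290

response_rate = returned / distributed * 100
usable_share = usable / returned * 100

print(f"response rate: {response_rate:.2f}%")         # 22.52%
print(f"usable among returned: {usable_share:.2f}%")  # 95.39%
```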

5. Data analysis

5.1. Description of the survey population

The data of the survey subjects were summarised and reported in aggregated form to maintain anonymity and confidentiality of all respondents. Out of the 290 replies to the survey, 110 were female (37.9%) and 180 were male (62.1%). The sample showed that 43.1% (N = 125) of the respondents were between 18 and 25 years old, 21.7% (N = 63) were between 26 and 34 years old, 18.6% (N = 54) were between 35 and 44 years old, 12.4% (N = 36) were between 45 and 54 years old, 3.4% (N = 10) were between 55 and 64 years old, and only two people who were aged 65 or above completed the survey.

5.2. The partial least squares analysis results

The Smart PLS 2.0 M3 software (Ringle et al., 2005) was used to analyse the two components of the research model together: the calculation of the measurement model (i.e. the outer model) and the assessment of the structural model (i.e. the inner model) (Barclay et al., 1995).

5.2.1. The measurement model

Assessment of measurement models should examine: (1) individual item reliability, (2) internal consistency, and (3) discriminant validity (Barclay et al., 1995). To evaluate item reliability, Barclay et al. (1995) recommended accepting only items with a loading of 0.707 or more. However, Hair et al. (2006) argued that items with a factor loading of 0.5 or more are significant enough and could be retained. The measurement items of the research model were loaded heavily on their respective constructs (reported in Appendix A), with all loadings considerably above 0.5, thus demonstrating adequate reliability for all items.

Because all reliability scores are above 0.7 (i.e. Composite reliability and Cronbach’s alpha scores reported in Table 1) the internal consistency criteria are also met (Nunnally and Bernstein, 1994).

The third step in assessing the measurement model involves examining its discriminant validity, where two conditions should be met. First, the off-diagonal elements in Table 1 represent correlations of all latent variables, whereas the diagonal elements are the square roots of the average variances extracted (AVE) of the latent variables. The AVE of any latent variable should be greater than the variance shared between the latent variable and other latent variables (Barclay et al., 1995), i.e. the diagonal elements should be greater than corresponding off-diagonal elements. Data shown in Table 1 satisfy this requirement. Second, the indicators should load more highly on their respective construct than on any other construct, with all correlations being significant at least at the p ⩽ 0.05 level. Data reported in Table 2 satisfy this condition.
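The first condition, the diagonal test on Table 1, can be sketched as a small check; the loadings and the correlation below are illustrative values, not the paper's actual figures:

```python
import math

def ave(loadings):
    """Average variance extracted from standardised item loadings."""
    return sum(l * l for l in loadings) / len(loadings)

def fornell_larcker_ok(loadings, correlations):
    """True if sqrt(AVE) of every construct exceeds its correlations with
    all other constructs (diagonal > off-diagonal elements)."""
    sqrt_ave = {c: math.sqrt(ave(ls)) for c, ls in loadings.items()}
    return all(sqrt_ave[a] > abs(r) and sqrt_ave[b] > abs(r)
               for (a, b), r in correlations.items())

# Illustrative figures only: two constructs and one inter-construct correlation.
loadings = {"trust": [0.80, 0.85, 0.90], "risk": [0.75, 0.80]}
correlations = {("trust", "risk"): -0.33}
print(fornell_larcker_ok(loadings, correlations))  # True
```

A highly correlated pair of constructs with weak loadings would fail the check, signalling poor discriminant validity.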

Table 2. Cross loadings of the constructs and their items.


5.2.2. The structural model

The general aim of the structural model is to give an explanation of the theorised relationships (i.e. the hypotheses) amongst the constructs. Fig. 2 illustrates the results and also shows R2 values obtained for each endogenous variable (i.e. intention, attitude, usefulness, risk, and trust) in the structural model.

Fig. 2. The partial least squares (PLS) results of the research conceptual model.

As shown in Fig. 2, the attitude towards using location-based mobile phone emergency services (b = 0.241, p < 0.001) was a significant predictor of behavioural intention to use the services, thus supporting H1. The perceived usefulness (b = 0.444, p < 0.001) was also an influential predictor of intention, thus validating H2. Both attitude and perceived usefulness demonstrated good predictive power for intention with R2 at 0.365, indicating an explanation level of 36.5% of the variance of behavioural intention to use the services in the future. Perceived usefulness (b = 0.471, p < 0.001) was a significant predictor of attitude, thus validating H3. However, H4 was not supported since perceived ease of use did not have any significant influence on attitude. On the contrary, the effect of perceived ease of use (b = 0.273, p < 0.001) on perceived usefulness was significant, thus validating H5.

Both the perceived usefulness and perceived ease of use were able to explain more than 26% of the variance of the attitude towards using the service, while the antecedents of the perceived usefulness were able to explain more than 45% of its variance with R2 at 0.454.

The positive effect of trust on perceived usefulness (b = 0.341, p < 0.001) and its negative effect on perceived risk (b = −0.334, p < 0.001) were both significant, thus validating H9 and H10, respectively. The privacy concern of collection (b = −0.175, p < 0.05) had a significant negative impact on trust in the service, which supports H12a.

Hypotheses H6a, H6b, H6c, H7, H8, H11a, H11b, H12b, H13a and H13b were not statistically supported and are therefore rejected.

5.3. The research conceptual model “goodness-of-fit”

The “goodness-of-fit” measure provides a reasonable indication of how well the sampled data fits the conceptual model being proposed (Gefen et al., 2000). However, since there is no direct “goodness-of-fit” measure generated by the partial least squares method, the measure can be generally estimated based on the adequacy of three main indexes that include (i) construct reliability (internal consistency) being above 0.7 for all the constructs of the conceptual model, (ii) high acceptable R2, and (iii) significant path coefficients (t-statistics) between the constructs (Barclay et al., 1995; Gefen et al., 2000).

As illustrated in Table 1, all the reliability scores from two separate tests (i.e. composite reliability test and Cronbach’s alpha scores) exceeded the 0.7 threshold, indicating high internal consistency for all constructs in the research model. The R2 values of the attitude and intention constructs were above 25%, a highly acceptable prediction level in empirical research (Arlinghaus and Griffith, 1995; Gaur and Gaur, 2006). Although 10 out of the 17 path coefficients were insignificant, the path coefficients to the main predictors of social acceptance of location-based mobile phone emergency services (i.e. attitude and intention) evinced extremely high significance levels at p < 0.001, with all coefficients above the 0.2 threshold indicated by Chin (1998) as implying a very meaningful relationship. Accordingly, the goodness-of-fit for this research model is established, since the analysis of the two components of the partial least squares model, the measurement model and the structural model, has shown good results in almost all of the statistical tests performed.
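The three indexes described above amount to a simple screening rule; in the sketch below the two R2 values come from the paper, while the reliability scores and t-values are hypothetical placeholders:

```python
def goodness_of_fit_ok(reliabilities, r_squared, key_path_ts,
                       rel_min=0.7, r2_min=0.25, t_crit=1.96):
    """Check the three PLS 'goodness-of-fit' indexes: construct reliability
    above rel_min, R-squared above r2_min, and significant key paths."""
    return (all(v > rel_min for v in reliabilities.values())
            and all(v > r2_min for v in r_squared.values())
            and all(abs(t) > t_crit for t in key_path_ts.values()))

# R-squared values as reported; reliabilities and t-values are illustrative.
reliabilities = {"attitude": 0.88, "intention": 0.91, "usefulness": 0.93}
r_squared = {"attitude": 0.26, "intention": 0.365}
key_path_ts = {"usefulness -> intention": 7.1, "attitude -> intention": 3.6}
print(goodness_of_fit_ok(reliabilities, r_squared, key_path_ts))  # True
```

Any construct reliability at or below 0.7, an R2 at or below 0.25, or an insignificant key path would flip the result to False.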

6. Discussion of the findings

6.1. Perceived usefulness

The perceived usefulness of LBS for emergency management was the key driver behind the individual positive attitude towards using the services and his or her behavioural intention towards using the services in the future. The services were perceived to be highly useful despite (i) the risks that are perceived to be associated with the utilisation of this kind of electronic services, (ii) the probability of the excessive collection of personal location information by governments utilising the services, and (iii) the probability of the unauthorised secondary use of the collected information. The findings about the usefulness of LBS completely support the few earlier studies of LBS acceptance, such as Chang et al. (2006) and Junglas and Spitzmuller (2006), in which the role of usefulness was identified as a key driver of individuals’ attitudes and intentions towards using the services despite concerns about the privacy of their locational information.

Reflecting on the arguments presented earlier, the antecedents of perceived usefulness of LBS for emergency management were: perceived quality features of the service, trust in the service and service providers, the social risks perceived in utilising the service, the privacy concerns perceived with the utilisation of the service, visibility of the service application, and perceived service ease of use. These antecedents were collectively successful in explaining more than 45% of the usefulness variance of the LBS for emergency management. This high level of explanation in the service usefulness variance, standing at 45.4%, provides reasonable indicators of the issues that can be brought into focus if there is ever a pressing need by governments to improve public perception of the usefulness of LBS for emergency management, thus positively enhancing the overall social acceptance of the services.

6.2. Perceived ease of use

The findings evince weak evidence for the existence of any direct effect of perceived ease of use of LBS on the individual’s attitude towards using the services. Therefore, it could be suggested that the public, in general, are willing to accept the utilisation of LBS for emergency management regardless of how easy or difficult they are to use. Nevertheless, the findings did verify the high impact of perceived ease of use of the services on the perceived usefulness of the services, which provides a strong indication that people would perceive the services to be more useful if they were easier to use. Accordingly, there is a reasonable ground to suggest that the perceived ease of use of the services has an indirect influence on an individual’s attitude towards using the services through the mediating role of the perceived usefulness of the services.

In general, these findings can inform the design of LBS solutions. Designers will need to contrive service offerings with easy-to-use design interfaces once the services are utilised for emergency management, making the services as intuitive as possible to use during emergency situations, and comprehensible to everyone, including the young, the elderly, and the non-technologically inclined.

6.3. Visibility

In general, visibility of LBS emergency management solutions can provide the opportunity for many people to observe and judge the application of the services in the usage context, providing an effective and direct means for the individual to evaluate the usefulness of the services (Karahanna et al., 1999). However, the findings show that visibility of LBS solutions is not statistically significant in determining the perceived usefulness of the services. One rational explanation for this result is that LBS are not yet widely utilised for emergency management and, therefore, the individual cannot easily observe the application of the services in the context of emergencies. However, a highly intuitive rationale is that the specific usage context (i.e. emergency management) eliminates the importance of observing the application of the LBS by the public for these services to be judged as useful, since any means, service, or technology that is used for emergencies is perceived, by the very nature of these situations, to be useful, regardless of how visible its application is to the public.

6.4. Quality features

The investigation of the quality features of LBS emanated from the need to understand the degree of service quality anticipated by the prospective user when the service is utilised for emergency management, given that limited knowledge about the actual service quality dimensions of the service is currently available. However, the findings demonstrate the insignificant role of the perceived quality features of LBS in shaping individual perceptions of the usefulness of the services for emergency management. One can then speculate that the findings reflect uncertainty about the performance impact of LBS, in terms of accuracy, currency, and responsiveness, on the usefulness of the services, which can plausibly be attributed to the fact that the services have not yet been widely implemented for emergency management. Even though the impact of perceived service quality features on perceived usefulness was statistically insignificant, service quality features did emerge in the answers to the open-ended question as one of the important issues pertaining to a possible nationwide utilisation of the services for emergency management in Australia.

6.5. Perceived social risks

The social risks perceived from using LBS had an extremely weak impact on the perceived usefulness of the services. One explanation for this insignificant impact is that the public may perceive location-based services to be a part of the well-established mobile telecommunications networks, thus being mature enough to permit the useful delivery of safety information or warning notifications during emergency situations without any potential high risks. Taking this into consideration, the risks associated with the use of LBS for emergency management are actually part of the risks impacting the entire cellular network infrastructure and not necessarily only impacting these particular services.

6.6. Privacy concerns

Perceived privacy concerns, including the excessive collection of personal location information and the unauthorised secondary use of that information, were both posited to play determining roles in (i) diminishing individual trust in LBS, (ii) augmenting the risks perceived from using the services for emergency management, and (iii) negatively impacting the perceived usefulness of the services. However, the findings indicate that only the collection of personal location information, as a perceived privacy concern, had a significant negative impact on trust in the services, while none of the other posited effects was statistically significant.

It is of particular interest that unauthorised secondary use had no effect on trust, unlike the collection of personal location information. One reason might stem from the very nature of the act of collection itself. When location data for a location-based service are collected, collection usually occurs automatically, and the individual is typically unaware of the process (Junglas et al., 2008). The findings suggest that this automated collection, even in emergency management settings, and whether or not the individual is aware of it, signifies a lack of personal control over his or her collected data. This contributed more towards distrust in the use of LBS for emergency management than any other privacy concern.

The findings also reveal that the two privacy concerns, collection of personal location information and unauthorised secondary use, did not have any significant influence on increasing the perception of social risks from using LBS for emergency management. This indicates that there is some threshold level that must be reached in the privacy concerns hierarchy of effects before such risks are perceived (Drennan et al., 2006). Nonetheless, some respondents did perceive the privacy concerns to be important even in emergency situations, as reflected in the significant negative impact of the collection of personal location information on trust in the services. Still, it is argued that the negative impact of privacy concerns will not be enough to prevent the public from engaging in a risk-taking relationship when they perceive the benefits of utilising LBS for emergency management as surpassing the perceived risks.

Although the impacts of the collection and unauthorised secondary use on service usefulness are insignificant in statistical terms, the unexpected positive effects of the two constructs on usefulness (as illustrated in Fig. 2) imply that people are inclined to concede a degree of privacy in return for potential benefits in extreme situations such as emergencies. One explanation for this might be that people may perceive the outcome of the extensive collection of their locational data and the secondary use of that data in an emergency situation to be always in their favour when these activities (i.e. collection and secondary use) are practised by the government. The findings could also suggest that the context of emergencies is quite sufficient to produce an adverse impact on some of the “traditionally negative” aspects of information privacy concerns.

6.7. Trust

The definition of trust in LBS encompasses individual trust in the government controlling and providing the services and trust in the technology and underlying infrastructure through which the services are provided (Carter and Bélanger, 2005). The findings show the highly significant role of trust as the most influential determinant of individual perception of the usefulness of the services, suggesting that reducing uncertainty is indeed a key component in social acceptance of the services that deserves on-going attention from the government.

The findings about the significant role that trust plays strongly corroborate several previous studies about the need to investigate trust in empirical research of location-based services (Kaasinen, 2005; Junglas and Spitzmuller, 2006; Rao and Troshani, 2007).

The findings of this study also demonstrate the pronounced role of trust in ameliorating the social risks perceived to arise from using the LBS for emergency management, thus breaking down these barriers to the usefulness of the services. These particular findings suggest that besides the significant direct influence of trust on perceived usefulness of the services, trust also indirectly influences usefulness of the services through perceived risks. This validates the earlier conceptualisation of the trust-risk relationship in the research model in this paper, in which the directionality of the relationship flows from trust to perceived risks.
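One conventional way to test such an indirect trust → perceived risks → usefulness pathway for significance is the classical Sobel test, which divides the product of the two path coefficients by its approximate standard error. A sketch with hypothetical coefficients and standard errors (not the paper's actual PLS estimates):

```python
import math

# Sobel test for an indirect (mediated) effect.
# a: trust -> perceived risks path; b: perceived risks -> usefulness path.
# All numbers are hypothetical placeholders for illustration.

a, se_a = -0.45, 0.08   # trust reduces perceived risks
b, se_b = -0.30, 0.07   # perceived risks reduce usefulness

indirect = a * b  # positive: trust indirectly raises usefulness
se_indirect = math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
z = indirect / se_indirect

print(f"indirect effect = {indirect:.3f}, z = {z:.2f}")
# |z| > 1.96 would indicate significance at the 5% level
```

Note that two negative paths compose into a positive indirect effect, matching the argument that trust lifts usefulness by lowering perceived risk.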

Consequently, of greater concern to the success of an emergency service offering is that people can willingly bestow their trust on the service, trust the message provided to them by the service in the case of an emergency, and, most importantly, trust the government as the provider and controller of these services.

6.8. Analysis of the open-ended question

Amongst the 304 returned surveys, 59 included comments in the open-ended question. Twenty-three people discussed “quality” or “product reliability” features, the emphasis being that, without quality and reliability, LBS solutions for emergency management would be useless. For example, one respondent wrote: “I have some concerns about the accuracy. Sometimes it may not direct you to the right position in the shortest available path”. Another 17 people said that they look forward to seeing LBS utilised for emergency management in the near future, but were worried that their personal information would be used illegally or for other purposes. A further 11 people mentioned regulations and laws, arguing that the government should pay more attention to formulating laws and regulations surrounding the utilisation of the services if it wants to apply LBS for emergency management. The final eight answers can be viewed as general hopes that technologies such as LBS will be utilised as soon as possible for emergency management in Australia. The open answers show that people cared about the quality, privacy, laws, and regulations related to LBS. Consequently, it is highly recommended that governments take such opinions into consideration before applying LBS within emergency management arrangements.
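The reported distribution of the 59 comments can be tallied as simple category proportions; a sketch using the counts above (category labels are paraphrases of the discussion, not the study's coding scheme):

```python
# Tally of the 59 open-ended survey comments by theme,
# using the counts reported in the text.
comments = {
    "quality and reliability": 23,
    "privacy / misuse of personal information": 17,
    "laws and regulations": 11,
    "general hopes for deployment": 8,
}

total = sum(comments.values())
assert total == 59  # matches the number of returned comments

for theme, n in comments.items():
    print(f"{theme}: {n} ({n / total:.0%})")
```

Quality and reliability concerns thus account for the largest single share (roughly two in five) of the open-ended responses.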

7. Implications

This study adds to the scholarly literature in a relatively new area in which there has been little research investigating public offerings of location-based services in the domain of emergency management. Although there have been several studies on the technical feasibility of utilising LBS as advanced mobile government location-enabled applications for personal safety and public warning purposes, there is scant theoretical and empirical research concerning other aspects of the utilisation of the services in the domain of emergency management, such as the behavioural, social, technical, administrative, regulatory, and legal aspects. This is an evident gap in the current body of research, and this paper makes a significant contribution towards addressing it.

The findings of this paper also contribute to the current theories and models of acceptance by providing empirical evidence to support the retention of the attitude construct in the attitude-behaviour relationship of TAM. This is grounded upon the significant role of attitude in influencing behavioural intention towards using LBS for emergency management, thus enhancing the overall ability to predict social acceptance or rejection of these services. The findings completely validate, and are in line with, several social psychology studies in which the role of attitude as an important determinant of behavioural intention has been strongly emphasised (Ajzen, 2002; Dennis et al., 2004). The retention of attitude as one of the endogenous constructs within the nomological structure of TAM provides an additional momentum to arguments seeking to preserve the theoretical integrity of the Model and, consequently, the Model’s base theory of TRA. At the same time, this paper strongly signals the importance of examining individual attitude in acceptance research, especially when studying social acceptance of new government initiatives and services.

Although the research model was explicitly employed to predict social acceptance of location-based mobile government services for emergency management, the model can be easily viewed as a generic model that can credibly serve as a candidate model for future studies to predict acceptance of location-based services in other usage contexts, applications, scenarios, and/or settings. This is because all of the theorised constructs of the model are highly relevant to the intrinsic characteristics of LBS. Examples would include law enforcement applications of LBS, such as investigating their surveillance implications, capturing location-based evidence, and the social and ethical issues pertaining to the application of the services for counter-terrorism, arrest support, traffic violations, or riot control.

An issue that has been largely overlooked in the acceptance literature in respect to LBS is the quality features of these services, and the degree to which the perceptions of service quality actually impact on accepting the services. One of the main contributions of this study is the introduction of a highly justifiable theoretical foundation for investigating perceived quality features of LBS in the context of emergency management. Given the general lack of dedicated measurements for such quality features in the literature, it is argued that the service quality scales that were developed in this research, including accuracy, currency, and responsiveness, could be naturally adapted when researching acceptance of LBS, not only in the context of emergencies, but also in other usage contexts and settings.
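Scale development of the kind described here is conventionally accompanied by an internal-consistency check such as Cronbach's alpha (cf. Nunnally and Bernstein, 1994). A self-contained sketch, with fabricated Likert responses standing in for the study's actual data:

```python
# Cronbach's alpha as an internal-consistency check for a multi-item
# scale (e.g. a service-quality construct such as "accuracy").
# The response data below are fabricated for illustration only.

def cronbach_alpha(items):
    """items: list of per-item score lists, one list per scale item."""
    k = len(items)

    def variance(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var = sum(variance(col) for col in items)
    totals = [sum(row) for row in zip(*items)]  # per-respondent totals
    total_var = variance(totals)
    return (k / (k - 1)) * (1 - item_var / total_var)

# Three 5-point Likert items answered by six respondents (hypothetical).
scale = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [5, 4, 2, 4, 3, 5],
]
alpha = cronbach_alpha(scale)
print(f"alpha = {alpha:.2f}")  # -> alpha = 0.87
# Values above ~0.7 are conventionally considered acceptable.
```

Researchers adapting the accuracy, currency, and responsiveness scales to other contexts would typically report such a reliability coefficient per construct.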

Several opportunities for further empirical research have emerged from this study, but the most worthwhile is an examination of public opinion after national implementation and deployment of LBS for emergency management. Such a study could investigate, in the long term, how and why the determinants of acceptance change or reshape after the adoption and diffusion of the services, and whether or not the relationships between these determinants are consistent over time. This type of work reflects arguments by Karahanna et al. (1999) of the need to examine and, at the same time, differentiate between the beliefs of the individual in the pre-adoption phase (symbolic adoption), where one’s assessment leads into one’s decision to accept or reject the LBS for emergency management, and those beliefs in the post-adoption phase (actual adoption), which is marked by actual usage or take-up of the services.

Another interesting starting point for further research is the contradiction found between this study and most previous research about the influence of privacy concerns on an individual’s acceptance of LBS. Although it has been shown that the usage context of emergencies was sufficient to alleviate perceptions of privacy concerns, albeit not to a statistically significant degree, future cross-sectional comparative research taking into account several usage contexts is needed to further ascertain the role of the usage context in perceptions of location information privacy concerns.

8. Conclusions

Disasters and large-scale emergencies that have the potential to disrupt the orderly functioning of civil society are considered national security challenges today. As Australians become increasingly mobile in the way they acquire information about their whereabouts, the Australian government is contemplating the introduction of nationwide location-enabled mobile phone warning and alerting methods and techniques. Mobile government emergency applications, specifically location-based mobile phone emergency services, are presented as a valuable addition to the envisaged emergency management apparatus of the government for safeguarding people during emergencies anywhere and anytime. Indeed, governments have a responsibility to their citizens to inform and protect them against both conventional and unconventional threats, whether natural or human-made.

Given the importance of this topic in the context of Australia and the fact that very few studies worldwide have tackled the utilisation of location-based mobile services in emergency management, this study aimed to investigate the social acceptance or rejection of location-based mobile government emergency services along with their determinants. The overall results of this study indicated that Australians are willing to accept such services in emergency situations. Indeed, our results indicated that behavioural intention is a function of both attitude and perceived usefulness. Perceived ease of use, according to the results, has no direct influence on attitude. Further, the results confirmed that perceived usefulness is a strong direct predictor of attitude. Interestingly, the role of trust in determining individual perceptions of the usefulness of the services was found to be highly influential. Finally, from a privacy concerns perspective, the results indicated that the collection of personal location information is the only factor with a significant negative impact on trust.

This study is not without limitations, and these can be addressed in future research. Although the response rate of the survey was proven to be statistically adequate, a higher response rate would have provided additional confidence in the generalizability of the findings. One possible solution for future research is to employ additional surveying techniques, such as an anonymous web-based survey, alongside the traditional mail survey to potentially increase the overall response rate. Further, as this study was designed and tested in the Australian context, future comparative cross-national studies between Australia and other countries would also be quite compelling. Such studies would shed light on the role of culture and government, such as the role and influence of government administration, in creating disparities in the factors determining the acceptance or rejection of location-based emergency services. Finally, due to time constraints, we could not conduct a longitudinal study, although it may be useful here given that human behaviour is quite dynamic.

Appendix A. Measures and factor loadings of constructs∗



R. Agarwal, J. Prasad, Are individual differences germane to the acceptance of new information technologies? Decision Sciences, 30 (2) (1999), pp. 361-391

I. Ajzen, Residual effects of past on later behavior: habituation and reasoned action perspectives, Personality & Social Psychology Review (Lawrence Erlbaum Associates), 6 (2) (2002), pp. 107-122

I. Ajzen, M. Fishbein, Understanding Attitudes and Predicting Social Behavior, (first ed.), Prentice Hall, Englewood Cliffs, NJ (1980)

Aloudat, A., Michael, K., 2010. The application of location based services in national emergency warning systems: SMS, cell broadcast services and beyond. In: Proceedings of the National Security Science and Innovation, Australian Security Research Centre, Canberra, Australia, September 23, 2010, pp. 21–49.

A. Aloudat, K. Michael, The socio-ethical considerations surrounding government mandated location-based services during emergencies: an Australian case study, M. Quigley (Ed.), ICT Ethics and Security in the 21st Century: New Developments and Applications (first ed.), IGI Global, Hershey, PA (2011), pp. 129-154

S.L. Arlinghaus, D.A. Griffith, Practical Handbook of Spatial Statistics, (first ed.), CRC Press, Boca Raton, FL (1995)

B. Bahli, Y. Benslimane, An exploration of wireless computing risks: development of a risk taxonomy, Information Management & Computer Security, 12 (3) (2004), pp. 245-254

D.W. Barclay, R. Thompson, C. Higgins, The partial least squares (PLS) approach to causal modeling: personal computer adoption and use as an illustration, Technology Studies: Special Issue on Research Methodology, 2 (2) (1995), pp. 285-309

M. Bensaou, N. Venkatraman, Inter-organizational relationships and information technology: a conceptual synthesis and a research framework, European Journal of Information Systems, 5 (1996), pp. 84-91

C. Boshoff, A psychometric assessment of E–S–Qual: a scale to measure electronic service quality, Journal of Electronic Commerce Research, 8 (1) (2007), p. 101

S.A. Bridwell, The dimensions of locational privacy, H.J. Miller (Ed.), Societies and Cities in the Age of Instant Access (first ed.), Springer, Dordrecht, The Netherlands (2007), pp. 209-226

M.C. Campbell, R.C. Goodstein, The moderating effect of perceived risk on consumers’ evaluations of product incongruity: preference for the norm, Journal of Consumer Research, 28 (3) (2001), pp. 439-449

L. Carter, F. Bélanger, The utilization of e-government services: citizen trust, innovation and acceptance factors, Information Systems Journal, 15 (1) (2005), pp. 5-25

S. Chang, Y.-J. Hsieh, C.-W. Chen, C.-K. Liao, S.-T. Wang, Location-based services for tourism industry: an empirical study, Ubiquitous Intelligence and Computing, Springer, Berlin, Heidelberg (2006), pp. 1144-1153

W.W. Chin, The partial least square approach to structural equation modeling, G.A. Marcoulides (Ed.), Modern Methods for Business Research (first ed.), Lawrence Erlbaum Associates, Inc., Mahwah, NJ (1998), pp. 295-336

G.A. Churchill, A paradigm for developing better measures of marketing constructs, Journal of Marketing Research, 16 (1) (1979), pp. 64-74

Davis, F.D., 1986. A technology acceptance model for empirically testing new end-user information systems: theory and results. Doctoral Dissertation, MIT Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA, viewed 4 September 2007.

F.D. Davis, Perceived usefulness, perceived ease of use, and user acceptance of information technology, MIS Quarterly, 13 (3) (1989), pp. 318-340

Dennis, A.R., Venkatesh, V., Ramesh, V., 2004. Adoption of Collaboration Technologies: Integrating Technology Acceptance and Collaboration Technology Research. Information Systems Department, Kelley School of Business, Indiana University, 17 November 2007. <>.

S. Djamasbi, D. Strong, M. Dishaw, Affect and acceptance: examining the effects of positive mood on the technology acceptance model, Decision Support Systems, 48 (2) (2010), pp. 383-394

J. Drennan, G.S. Mort, J. Previte, Privacy, risk perception, and expert online behavior: an exploratory study of household end users, Journal of Organizational and End User Computing, 18 (1) (2006), pp. 1-22

M.S. Featherman, P.A. Pavlou, Predicting E-services adoption: a perceived risk facets perspective, International Journal of Human–Computer Studies, 59 (4) (2003), pp. 451-474

M. Fishbein, I. Ajzen, Belief, Attitude, Intention, and Behavior: An Introduction to Theory and Research, Addison-Wesley Publishing Co., Reading, Massachusetts (1975)

Frost and Sullivan Research Service, 2007. Asia Pacific Location-based Services (LBS) Markets, viewed 28 August 2007. <>.

A.S. Gaur, S.S. Gaur, Statistical Methods for Practice and Research: A Guide to Data Analysis using SPSS, (first ed.), Sage Publications, Thousand Oaks, CA (2006)

Gefen, D., Srinivasan Rao, V., Tractinsky, N., 2003. The conceptualization of trust, risk and their relationship in electronic commerce: the need for clarifications. In: Proceedings of the 36th Annual Hawaii International Conference on System Sciences, 6–9 January 2003, viewed 6 January 2009, IEEEXplore Database.

D. Gefen, D.W. Straub, M. Boudreau, Structural equation modeling and regression: guidelines for research practice, Communications of the Association for Information Systems, 4 (7) (2000), pp. 1-78

J.F. Hair, B. Black, B. Babin, R.E. Anderson, R.L. Tatham, Multivariate Data Analysis, (sixth ed.), Pearson Prentice Hall, New Jersey (2006)

Heijden, H.v.d., Ogertschnig, M., Gaast, L.v.d., 2005. Effects of context relevance and perceived risk on user acceptance of mobile information services. In: Proceedings of the 13th European Conference on Information Systems (ECIS 2005), Regensburg, Germany, May 26–28, 2005, viewed 15 September 2008, Google Scholar Database.

M. Horst, M. Kuttschreuter, J.M. Gutteling, Perceived usefulness, personal experiences, risk perception and trust as determinants of adoption of e-government services in The Netherlands, Computers in Human Behavior, 23 (4) (2007), pp. 1838-1852

I. Im, Y. Kim, H.-J. Han, The effects of perceived risk and technology type on users’ acceptance of technologies, Information & Management, 45 (1) (2008), pp. 1-9

Jacoby, J., Kaplan, L.B., 1972. The components of perceived risk. In: Proceedings of the Third Annual Conference of the Association for Consumer Research, Association for Consumer Research, Chicago, IL, November 1972, pp. 382–393.

Junglas, I., Spitzmuller, C., 2005. A research model for studying privacy concerns pertaining to location-based services. In: Proceedings of the 38th Annual Hawaii International Conference on System Sciences (HICSS’05), Hawaii, January 3–6, 2005, viewed 22 August 2007, IEEEXplore Database.

Junglas, I., Spitzmuller, C., 2006. Personality traits and privacy perceptions: an empirical study in the context of location-based services. In: Proceedings of the International Conference on Mobile Business, Copenhagen, Denmark, June 2006, viewed 14 August 2007, IEEEXplore Database, p. 11.

I.A. Junglas, N.A. Johnson, C. Spitzmüller, Personality traits and concern for privacy: an empirical study in the context of location-based services, European Journal of Information Systems, 17 (4) (2008), pp. 387-402

Kaasinen, E., 2005. User acceptance of mobile services – value, ease of use, trust and ease of adoption. Doctoral Dissertation. Tampere University of Technology, Tampere, Finland, viewed 27 July 2007.

E. Karahanna, D.W. Straub, N.L. Chervany, Information technology adoption across time: a cross-sectional comparison of pre-adoption and post-adoption beliefs, MIS Quarterly, 23 (2) (1999), pp. 183-213

S.A. Kaynama, C.I. Black, A proposal to assess the service quality of online travel agencies, Journal of Professional Services Marketing, 21 (1) (2000), pp. 63-68

Kim, D.J., Braynov, S.B., Rao, H.R., Song, Y.I., 2001. A B-to-C trust model for online exchange. In: Proceedings of the Seventh Americas Conference on Information Systems, Boston, MA, 2–5 August, viewed 03 September 2008, pp. 784–787.

Kini, A., Choobineh, J., 1998. Trust in electronic commerce: definition and theoretical considerations. In: Proceedings of the 31st Annual Hawaii International Conference on System Sciences, vol. 4, viewed 13 November 2008, IEEE Xplore Database, pp. 51–61.

M. Koller, Risk as a determinant of trust, Basic & Applied Social Psychology, 9 (4) (1988), pp. 265-276

Kurnia, S., Chien, A.-W.J., 2003. The acceptance of online grocery shopping, paper presented to the 16th Bled eCommerce Conference, Bled, Slovenia, 9–11 June.

Lee, J., Rao, H.R., 2005. Risk of Terrorism, Trust in Government, and e-Government Services: An Exploratory Study of Citizens’ Intention to use e-Government Services in a Turbulent Environment, York Centre for International and Security Studies (YCISS), 10 December 2008, <>.

T. Lee, The impact of perceptions of interactivity on customer trust and transaction intentions in mobile commerce, Journal of Electronic Commerce Research, 6 (3) (2005), pp. 165-180

P.P. Li, Toward a geocentric framework of trust: an application to organizational trust, Management and Organization Review, 4 (3) (2008), pp. 413-439

V. Liljander, A.C.R. Van-Riel, M. Pura, Customer satisfaction with e-services: the case of an on-line recruitment portal, M. Bruhn, B. Stauss (Eds.), Jahrbuch Dienstleistungsmanagement 2002 – Electronic Services (first ed.), Gabler Verlag, Wiesbaden, Germany (2002), pp. 407-432

K. Mathieson, Predicting user intentions: comparing the technology acceptance model with the theory of planned behavior, Information Systems Research, 2 (3) (1991), pp. 173-191

R.C. Mayer, J.H. Davis, F.D. Schoorman, An integrative model of organizational trust, Academy of Management Review, 20 (3) (1995), pp. 709-734

D.H. McKnight, N.L. Chervany, What trust means in e-commerce customer relationships: an interdisciplinary conceptual typology, International Journal of Electronic Commerce, 6 (2) (2001), pp. 35-59

G.C. Moore, I. Benbasat, Development of an instrument to measure the perceptions of adopting an information technology innovation, Information Systems Research, 2 (3) (1991), pp. 192-222

S. Mouakket, M.A. Al-Hawari, Investigating the factors affecting university students’ e-loyalty intention towards the Blackboard system, International Journal of Business Information Systems, 9 (3) (2012), pp. 239-260

J.C. Nunnally, I.H. Bernstein, Psychometric Theory, (third ed.), McGraw-Hill, New York (1994)

K. O’Doherty, S. Rao, M.M. Mackay, Young Australians’ perceptions of mobile phone content and information services: an analysis of the motivations behind usage, Young Consumers: Insight and Ideas for Responsible Marketers, 8 (4) (2007), pp. 257-268

A. Parasuraman, L. Berry, V. Zeithaml, SERVQUAL: a multiple-item scale for measuring service quality, Journal of Retailing, 64 (1) (1988), pp. 12-40

P.A. Pavlou, Consumer acceptance of electronic commerce: integrating trust and risk with the technology acceptance model, International Journal of Electronic Commerce, 7 (3) (2003), pp. 101-134

P.A. Pavlou, D. Gefen, Building effective online marketplaces with institution-based trust, Information Systems Research, 15 (1) (2004), pp. 37-59

Perusco, L., Michael, K., Michael, M.G., 2006. Location-based services and the privacy-security dichotomy. In: Proceedings of the Third International Conference on Mobile Computing and Ubiquitous Networking, London, 11–13 October, viewed 02 June 2007, Research Online: University of Wollongong Database, pp. 91–98.

S. Rao, I. Troshani, A conceptual framework and propositions for the acceptance of mobile services, Journal of Theoretical and Applied Electronic Commerce Research, 2 (2) (2007), pp. 61-73

Ringle, C.M., Wende, S., Will, A., 2005. SmartPLS 2.0 (M3) Beta.

E.M. Rogers, Diffusion of Innovations, (first ed.), Free Press of Glencoe, New York (1962)

J. Samsioe, A. Samsioe, Introduction to location based services: markets and technologies, R. Reichwald (Ed.), Mobile Kommunikation: Wertschöpfung, Technologien, neue Dienste, Gabler, Wiesbaden, Germany (2002), pp. 417-438

H.J. Smith, S.J. Milberg, S.J. Burke, Information privacy: measuring individuals’ concerns about organizational practices, MIS Quarterly, 20 (2) (1996), pp. 167-196

S. Spiekermann, General aspects of location-based services, J. Schiller, A. Voisard (Eds.), Location-Based Services (first ed.), Elsevier, San Francisco, CA (2004), pp. 9-26

D.W. Straub, Validating instruments in MIS research, MIS Quarterly, 13 (2) (1989), pp. 147-169

United Nations’ International Strategy for Disaster Reduction, Platform for the Promotion of Early Warning, 2005. ‘Early Warning and Disaster Reduction’, paper presented to the World Conference on Disaster Reduction, Kobe, Hyogo, Japan, 18–22 January.

Van der Heijden, H., Verhagen, T., Creemers, M., 2001. Predicting online purchase behavior: replications and tests of competing models. In: Proceedings of the 34th Annual Hawaii International Conference on System Sciences, Maui, Hawaii, 3–6 January 2001, viewed 11 November 2007, IEEEXplore Database.

V. Venkatesh, Determinants of perceived ease of use: integrating control, intrinsic motivation, and emotion into the technology acceptance model, Information Systems Research, 11 (4) (2000), pp. 342-365

V. Venkatesh, F.D. Davis, A theoretical extension of the technology acceptance model: four longitudinal field studies, Management Science, 46 (2) (2000), pp. 186-204

Xu, H., Teo, H.-H., Tan, B.C.Y., 2005. Predicting the adoption of location-based services: the role of trust and perceived privacy risk. In: Proceedings of the 26th International Conference on Information Systems, Las Vegas, USA, 31 December, viewed 20 August 2007, GoogleScholar Database, pp. 11–14.

Z. Yang, R.T. Peterson, S. Cai, Services quality dimensions of Internet retailing: an exploratory analysis, Journal of Services Marketing, 17 (7) (2003), pp. 685-700

R.B. Zajonc, Attitudinal effects of mere exposure, Journal of Personality and Social Psychology, 9 (2) (1968), pp. 1-27

Zeithaml, V.A., Parasuraman, A., Malhotra, A., 2000. A Conceptual Framework for Understanding e-Service Quality: Implications for Future Research and Managerial Practice. MSI Working Paper Series, Working Paper 00-115, Marketing Science Institute, Cambridge, MA, viewed 09 November 2007.

V.A. Zeithaml, A. Parasuraman, A. Malhotra, Service quality delivery through web sites: a critical review of extant knowledge, Academy of Marketing Science, 30 (4) (2002), p. 362

X. Zhang, V.R. Prybutok, A consumer perspective of E-service quality, IEEE Transactions on Engineering Management, 52 (4) (2005), pp. 461-477

Keywords: Location-based service, Emergency management, Social acceptance, Mobile government, Government deployment

Citation: Anas Aloudat, Katina Michael, Xi Chen, Mutaz M. Al-Debei, "Social acceptance of location-based mobile government services for emergency management", Telematics and Informatics, Vol. 31, No. 1, February 2014, Pages 153-171. DOI:

Towards the Blanket Coverage DNA Profiling and Sampling of Citizens in England, Wales, and Northern Ireland

Katina Michael, University of Wollongong, Australia



The European Court of Human Rights (ECtHR) ruling in S and Marper v United Kingdom will have major implications for the retention of Deoxyribonucleic Acid (DNA) samples, profiles, and fingerprints of innocent people stored in England, Wales, and Northern Ireland. In its attempt to develop a comprehensive National DNA Database (NDNAD) for the fight against crime, the UK Government has come under fire for its blanket-style coverage of the DNA sampling of its populace. Figures indicate that the UK Government retains a highly disproportionate number of samples when compared with other nation states in the Council of Europe (CoE), and indeed anywhere else in the world. In addition, the UK Government retains a disproportionate number of DNA profiles and samples from specific ethnic minority groups such as the Black Ethnic Minority (BEM) group. Finally, the S and Marper case demonstrates that innocent children, and innocent citizens in general, are still on the national DNA database, sometimes even without their knowledge. Despite the fact that the S and Marper case concluded with the removal of the biometric data of Mr S and Mr Marper, all other innocent subjects must still apply to their local Metropolitan Police Service to have their fingerprints or DNA removed from the register. This is not only a time-consuming process, but one that is often not feasible.


The Police and Criminal Evidence Act of 1984 (UK) (PACE) has undergone major changes since its inception. PACE and the PACE Codes of Practice provide the core framework of police powers and safeguards around stop and search, arrest, detention, investigation, identification and the interviewing of detainees (Police Home Office 2009). In December 2008, following the S and Marper European Court of Human Rights (ECtHR) judgment, PACE underwent a review and changes became effective on 31 December 2008; however, more changes, especially on the issue of the retention of fingerprints and DNA, are forthcoming. According to the Home Office, the changes expected in PACE will ensure that the “right balance between the powers of the police and the rights and freedoms of the public” is maintained (Police Home Office 2009). On reviewing the legal changes that have taken place since 1984 via a multitude of Acts, it can be said that the United Kingdom (with the exception of Scotland) has, contrary to the claims of the Home Office, experienced a significant imbalance between the powers of the police and the rights and freedoms of the public. In the last 15 years, the rights and freedoms of the public have been severely encroached upon, and police powers significantly increased. The major legislative impacts between 1984 and 2008 are reviewed briefly below and summarized in a timeline in Figure 1.

Figure 1. Changes to U.K. Legislation 1984-2008 that have Given the Police Greater Powers and have had an Impact on Fingerprint and DNA Retention (The content was taken from Genewatch UK (2009a) but adapted and turned into a timeline for the sake of readability)

Legislative Changes between 1984 and 2009

PACE was introduced in 1984, one year prior to Dr Jeffreys' discovery of DNA fingerprinting. Interestingly, PACE allowed the police to ask a doctor to take a blood sample from a suspect during the investigation of a serious crime, but only with the suspect's express consent. Thus a suspect had to volunteer or “agree” to a blood sample being taken; it could not be taken by force. Even after Jeffreys' discovery, there was limited use of blood samples for forensic analysis as tools and techniques were still in their infancy. The Single Locus Probe (SLP) technique in use in early DNA examinations had numerous limitations. While new SLP technology overcame some of these limitations, “the statistical evaluation of SLP DNA evidence brought a new set of problems, perhaps even more difficult to overcome than the preceding technical limitations” (Sullivan 1998). In sections 61-65 the original PACE classified blood samples and scrapings of cells from the inner cheek as intimate in nature. Hair samples (save for pubic hair) were the only type of non-intimate DNA sample that could be retained for forensic analysis without the permission of the suspect, and this only on account of an investigation into a serious arrestable offence. Although hair cut with scissors rarely provided a good enough sample for single locus probe (SLP) profiling, by the late 1980s PCR (polymerase chain reaction) profiling could amplify and type a single strand of hair (Home Office, 2004). This is when mass screenings of DNA samples became possible. To begin with there was great contention over the admissibility of DNA evidence in a court of law, but this changed as commonplace errors and procedural issues were rectified, newer profiling techniques were introduced, and larger databases for statistical purposes became available.

A significant moment in the fight against crime in the United Kingdom came in 1993 after a Royal Commission on Criminal Justice (Hansard 2003). The Commission was set up because there was a feeling among the community that the criminal justice system was not working well enough to convict the guilty and exonerate the innocent. Leading up to 1993, a number of high-profile miscarriages of justice had weakened the public's confidence in the criminal justice system, for example, the Birmingham Six, who had been jailed in 1974 for allegedly planting an IRA (Irish Republican Army) bomb that killed 21 people (BBC, 1991). One of the key recommendations coming from the Commission was the setting up of a national forensic DNA database. In the following year, 1994, the Criminal Justice and Public Order Act (CJPOA) introduced amendments to PACE, and in 1995 the National DNA Database (NDNAD) was launched. At first, the Association of Chief Police Officers in England, Wales and Northern Ireland believed that the system would process around 135,000 samples in the first year, but by the end of that year only one quarter of the original target had been loaded into the system due to significant procedural and technical teething problems. The expected annual rate was not reached until 1998 as police did not know how to fully exploit the new legislation (Lynch, 2008).

One of the fundamental changes heralded by the CJPOA was the reclassification of particular types of DNA samples from intimate to non-intimate. Authorities knew only too well from their limited experience with DNA since the mid-1980s that “richer” cellular samples were needed if a usable database of the size being projected was going to be possible. Saliva samples and mouth swabs became non-intimate samples, and it followed that non-intimate samples could be taken without the consent of the suspect. Furthermore, police could now conduct the procedure without the assistance of a trained doctor, and if needed by force. The sweeping changes did not stop there; the CJPOA also altered the rules regarding when a DNA sample could be taken. For the first time, DNA samples could be taken not only from people suspected of serious arrestable offences but also from those who had committed recordable offences beyond the most trivial. If a suspect was found guilty then, for the first time since the introduction of PACE, the DNA sample could be stored indefinitely. Only if a person was acquitted of a crime, or charges were dropped, would the sample data be destroyed. Further legislative changes allowed for the cross-matching of DNA profiles across the whole of the U.K. in 1996 through the Criminal Procedure and Investigations Act, and in 1997 the Criminal Evidence (Amendment) Act enabled non-intimate samples to be taken from prison inmates who had been convicted of serious offences prior to the establishment of the NDNAD.

In 1997 there was a change of government; the Labour Party came to power, and by 1999 Prime Minister Tony Blair had announced an aggressive expansion of the NDNAD to contain some 3 million profiles by 2004. It was in 2001, after the September 11 attacks, via the Prevention of Terrorism Act, that DNA profiles which entered the database came to remain there indefinitely, even if the suspect was acquitted or charges were dropped. PACE was impacted by these changes, and even volunteers who had partaken in mass screenings or dragnets and had willingly provided their DNA samples remained on the database indefinitely (Beattie, 2009). In 2003, under s. 10 of the Criminal Justice Act (amending s. 63 of PACE), those who were simply arrested or detained at a police station on suspicion of a recordable offence had their DNA sample taken. According to McCartney (2006):

This enables police to take DNA samples from almost all arrestees and preempts technological advances which are expected to see mobile DNA testing kits in the coming years (by omitting the words “in police detention”). It means that a sample (usually a cheek swab) can be taken upon “reasonable suspicion” for an offence, regardless of whether it will indicate guilt or have any possibility of use during the investigation. The law, then, is explicit: anyone who comes under police suspicion is liable to have a DNA sample taken, searched against the samples on the NDNAD, and retained. The course that an investigation takes or whether a prosecution proceeds is of little, if any, significance.

The Criminal Justice Act was yet another extension of police powers and no other nation state had the same freedom to gather and store such personal citizen information. By 2005, the Serious Organised Crime and Police Act extended the uses of the NDNAD to include the identification of deceased persons. By 2008, the Counter-Terrorism Act extended police powers to allow DNA and fingerprints to be taken from persons subject to control orders or those under secret surveillance in the interests of national security.

Numerous legal analysts have been critical of the changes that PACE has undergone since 1984. Ironically, the increase in police powers and the establishment of the NDNAD were originally intended to increase public confidence in the criminal justice system; by going too far, they have instead eroded citizen trust in the state and impinged on the rights of everyday Britons. Beattie (2009) is rather candid in her assessment of the changes, stating:

[there is] no statutory guidance for decisions about the retention of samples, no readily accessible mechanism whereby individuals can challenge the decision to retain their records (other than judicial review) and no independent oversight by a designated regulatory body.

This assessment seems to strike at the very heart of the problem. With only a judicial route at one's disposal to question current practices, an innocent citizen is left almost entirely powerless to battle against his or her own government. We can see no greater example of this than in the DNA sample storage of juveniles between the ages of ten and eighteen, “230,000 of whom were alleged to have been added following legislative changes in 2004, and of whom 24,000 were taken from ‘innocent children’ against whom no charges had been brought …” (Lynch, 2008). It is an utterly disturbing statistic, and one which rightly led to the accusation that the Labour government was compiling a database by stealth.

It now seems that PACE “1984” really did sow the seeds of an Orwellian state. According to the most recent Government statistics, 7.39 per cent of the UK population have their DNA profiles retained on the NDNAD (Beattie, 2009). This is an alarming figure when one considers that most other European states have less than 1 per cent of their population on their respective DNA databases, do not keep cellular samples but rather DNA profiles alone, and retain them only for a defined period of time (Table 1). The U.K. Government would possibly have us believe from these figures that it is dealing with an unusually high crime rate, but the reality is that the figures do not reveal the percentage of persons who have committed violent crimes as opposed to those who have committed petty crimes. Another problem with the NDNAD is that it is highly disproportionate in its recording of citizens by ethnic background. The Guardian newspaper calculated that 37 per cent of black men and 13 per cent of Asian men in the nation are contained in the NDNAD, as compared with only 9 per cent of white men (Lynch, 2008). Liberty has stated that 77 per cent of young black men had records on the NDNAD in 2006 and that black people in general were almost 4 times as likely to appear on the database as white people (Rodgers, 2009).

Table 1. Characteristics of some National DNA Databases

The National DNA Database

The U.K. National DNA Database (NDNAD) of England and Wales was launched in April 1995 at the Forensic Science Service (FSS) laboratory. It took several years for Northern Ireland to be included in the NDNAD. Before launching the official database, the FSS trialed a small-scale forensic database to ensure the validity of such a system. The FSS began developing DNA testing in 1987 and in 1995 achieved a scientific breakthrough, inventing a chemical process that enabled DNA profiling and led to the establishment of the NDNAD (FSS, 2009a). The NDNAD is the oldest and largest DNA database in the world with national legislation to foster and support its growth. The U.K. has also adopted a privatized model for forensic science services related to the criminal justice system (Lynch, 2008). This was not always the case, however, as the FSS was once an agency of the Home Office. When it became FSS Ltd. it became a profit-maximizing, government-owned company under almost exclusive contract to the Home Office for forensic services to the police.

Although legislation enabled the police to collect DNA samples, request the FSS to process them, and store DNA profiles on the NDNAD, the expected annual growth rate was not reached until the late 1990s. As one of the main strategic objectives of the NDNAD was to demonstrate a return on investment, the Home Office set out to detect more crimes and thus reduce overall crime rates in the hope of closing the justice gap (McCartney, 2006, p. 175). In April 2000, five years after the establishment of the NDNAD, the UK government announced the DNA Expansion Programme, aimed at getting all known active offenders onto the database, at the time estimated to be about 3 million people. The total government investment in the program to March 2005 stood at £240.8 million, which enabled police forces to increase the sampling of suspects, recruit additional crime scene investigators, purchase the appropriate equipment, and train more police (Home Office, 2005). Old samples from 1995 to 1999 were also able to be reanalyzed (McCartney, 2006, p. 176). A portion of the profiles were upgraded to match upgrades in the NDNAD's system software: from the standard profiling software known as SGM (Second Generation Multiplex), which had an average discrimination power of 1 in 50 million, to SGM Plus profiles, which were said to reduce the chance of an adventitious match as the size of the NDNAD inevitably increased, fuelled by the funding from the Expansion Programme.

An adventitious match is the possibility that two different people have profiles that are almost identical, producing a “false positive”, also known in statistics as an α (alpha) error. Thus an adventitious match shows a positive result for the matching of two persons (e.g. that of a crime scene sample and that of a record on the NDNAD) when in actual fact there is no match at all. In the original NDNAD the risk of an adventitious match using the original SGM profiles was calculated to be 26 per cent, but it has been claimed that since the introduction of the SGM Plus software no adventitious matches have occurred (Nuffield Council, 2007). Sir Alec Jeffreys, however, has warned publicly that the genetic profiles held by police for criminal investigations are not sophisticated enough to prevent false identifications. “Dissatisfied with the discriminatory power of SGM Plus, Jeffreys recommends that following the identification of a suspect, the authority of the match should be tested by reanalyzing the sample at six additional loci” (Lynch 2008, pp. 144-145). Reanalysis of samples (whether of volunteers, suspects, or those convicted) without consent raises additional ethical questions, however, even if it might indeed be able to exonerate a small number of individuals, if anyone at all.
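The scale of the adventitious-match risk can be illustrated with a back-of-the-envelope calculation. This is a sketch only, not the FSS's actual risk model: it assumes independent profiles and a uniform random-match probability, and uses the 1-in-50-million SGM discrimination power and a 3-million-profile database mentioned in the text.

```python
# Illustrative sketch of adventitious-match risk, assuming independent
# profiles and a uniform random-match probability (both simplifications).

def expected_chance_matches(database_size: int, match_probability: float) -> float:
    """Expected number of adventitious matches in a single one-to-many search."""
    return database_size * match_probability

def prob_at_least_one(database_size: int, match_probability: float) -> float:
    """Probability that at least one adventitious match occurs in the search."""
    return 1 - (1 - match_probability) ** database_size

p = 1 / 50_000_000   # assumed SGM average discrimination power (from the text)
n = 3_000_000        # assumed database size (from the text)

print(f"expected chance matches: {expected_chance_matches(n, p):.3f}")  # 0.060
print(f"P(at least one match):   {prob_at_least_one(n, p):.3f}")        # 0.058
```

Even under these simplifying assumptions, a database of this size searched per crime-scene profile carries a non-trivial cumulative chance of a false hit, which is the substance of Jeffreys' concern about discriminatory power.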

The FSS are aware of the small possibility of an error but believe that the 10 markers currently stored on the database are sufficient (Jha 2004). In their defense, the FSS claim that the NDNAD is simply a type of intelligence database, and ultimately one is not convicted on mere “intelligence” but on multiple sources of evidence (Koblinsky, Liotti & Oeser-Sweat 2005, p. 273). Peter Gill of the FSS responded to Jeffreys' concerns about the need to increase the number of markers for each profile by emphasizing that adventitious matches occur quite often when degraded samples are used and that the jury has to make up its mind based on numerous sources of evidence, not just DNA evidence in isolation (Jha, 2004). For Jeffreys, storing “unnecessary” personal information on the NDNAD, for instance of persons who have previously been wrongly suspected of a crime, will only act to over-represent certain ethnic minorities, which could lead to resentment among some citizen sub-groups. The other issue that Jeffreys raises is the potential to use DNA sample information at some time in the future, and the risks associated with the potential to reveal health information from those samples; he is strongly opposed to the police gaining access to that kind of information (FSS, 2009).

Some cross-sectional data from the NDNAD can provide a better feel for the size of this databank, which, per capita, stores the largest number of DNA profiles of any nation. By the end of March 2005, the Nuffield Bioethics Council reported that there were 3 million profiles stored on the NDNAD, an estimated 5.2 per cent of the U.K. population, with 40,000 to 50,000 profiles being added monthly. Specifically, the police had retained 3,072,041 criminal justice (CJ) profiles, 12,095 volunteer profiles, and 230,538 scene-of-crime (SOC) profiles (Lynch, 2008, p. 149). The increase in the loading of crime scene samples was due not just to the Expansion Programme but also to the legislative changes noted above via the Criminal Justice Act of 2003 and the Serious Organised Crime and Police Act of 2005, and to innovations in processing capabilities at the FSS. These legislative changes broadened the net of people who would now be added to the databank, in effect lowering the threshold for making it onto the NDNAD. From the perspective of the Association of Chief Police Officers this was a positive, because it meant getting offenders onto the database earlier in their criminal careers. By the end of December 2005, the NDNAD held around 3.45 million CJ and elimination profiles and 263,923 crime scene sample profiles. At that rate it was predicted that an estimated 25 per cent of the adult male population and 7 per cent of the adult female population would eventually enter the database (Williams and Johnson 2005). More sober estimates indicate that the overall number of persons to be admitted to the NDNAD would be a little over 10 per cent of the UK population (Table 2) (Jobling & Gill, 2004, p. 745).

Table 2. A NDNAD snapshot using year-end 2007 data

Current NDNAD Statistics

The most recent NDNAD statistics were made public during a parliamentary debate in October 2009 (Hansard 2009), where new figures from between 2007 and 2009 were tabled. Figure 2 is based on the data presented and shows that at the end of March 2007 there were about 151,882 DNA profiles of persons between the ages of 10 and 15 on the NDNAD, constituting about 3 per cent of all DNA profiles. There were 206,449 DNA profiles of persons aged 16 or 17, equating to about 5 per cent of all DNA profiles. Not counting children under the age of 10 whose DNA profiles are stored on the NDNAD, we can estimate that about 9 per cent of the profiles on the NDNAD are of persons under the age of 18. These numbers have the wider community, especially civil liberties groups, other interest groups and key non-government organizations (NGOs), expressing deep concern over the widening retention criteria for inclusion on the NDNAD. The matter has now gone through judicial review, and while the UK courts refused to acknowledge any right of innocents, young children, or those acquitted of a crime to have their profiles removed from the NDNAD, the European Court of Human Rights (ECtHR) ruled otherwise. S and Marper v. United Kingdom will be the focus of the next section of this paper.

Figure 2. DNA profiles on the NDNAD by age as of end March 2007

Beyond the problem of children on the NDNAD is the disproportionate number of persons of ethnic appearance other than white European who have had their DNA samples taken, analyzed, and stored indefinitely. The NDNAD does not record detailed data about one's ethnicity, but it does categorise an individual into one of six ethnic origins based on appearance: White-South European, White-North European, Asian, Black, Chinese, Japanese or South East Asian, and Middle Eastern, with a further category referred to as Unknown. At first glance the numbers in Figure 3 show that about 77 per cent of the DNA profiles on the NDNAD have come from “White-Europeans” (summing both the South and North White European categories) and only 7 per cent from “Blacks” and about 5 per cent from “Asians”. But one should not take these percentages at face value. When one analyses these numbers alongside census data, the truer picture emerges. Blacks and Asians do not make up the largest ethnic portion of the UK, and thus a figure of 7 per cent of profiles from Blacks means that more than 37 per cent of the Black male population in the UK have their DNA profile recorded on the NDNAD, and 5 per cent from “Asians” means that about 13 per cent of the Asian population have their DNA profile recorded on the NDNAD. This compares with only 9 per cent of the total White population on the NDNAD.
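The arithmetic behind this disproportionality can be sketched as follows. The figures below are rounded, illustrative assumptions rather than official NDNAD or census numbers; the point is only that a small share of database profiles can still cover a large fraction of a small subpopulation.

```python
# Hypothetical, rounded figures for illustration only: how a 7 per cent share
# of database profiles can amount to roughly 37 per cent of a small subpopulation.

def coverage_rate(db_total: int, profile_share: float, subpopulation: int) -> float:
    """Fraction of a subpopulation represented on the database."""
    return db_total * profile_share / subpopulation

db_total = 4_000_000             # assumed total profiles on the database
black_share = 0.07               # share of profiles categorised "Black" (from the text)
black_male_population = 750_000  # illustrative subpopulation estimate, not census data

print(f"{coverage_rate(db_total, black_share, black_male_population):.0%}")  # 37%
```

The same calculation with a much larger subpopulation (such as the White population) yields a far smaller coverage rate, which is why equal-looking profile shares translate into very unequal rates of inclusion.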

Figure 3. DNA profiles on the NDNAD by ethnic appearance as of end March 2007

Some groups refer to this kind of disproportionate ethnic presence on the NDNAD as institutionalized racism. Institutionalized racism can be defined as “that which, covertly or overtly, resides in the policies, procedures, operations and culture of public or private institutions - reinforcing individual prejudices and being reinforced by them in turn” (Lawrence, 1999). It is a structured and systematic form of racism built into institutions. While this researcher would not label the disproportionate ethnic representation in the NDNAD as racism, she does acknowledge that minority ethnic populations, particularly black men, do not stand to benefit from the current UK legislation; rather, the legislation has been to the detriment of minority groups. According to the National Black Police Association of the UK, black men are four times more likely to be stopped and searched than white men. They are also more likely to be arrested and released without charge, let alone convicted, and without being compensated for their ordeal. The NDNAD statistics seem to suggest that black males are more likely to offend than white males, which is a fallacy. This kind of feeling among the Black Ethnic Minority (BEM) community may provoke not only great mistrust in the UK police and the Government but also strong resentment over future life opportunities and freedoms, a concern echoed by Sir Alec Jeffreys. It also means that less competent officers may be inclined, whether mindfully or not, to draw in ethnic minorities in general because they are the “usual” suspects in crimes (Jarrett, 2006). The most up-to-date figures on the profiles that constitute the NDNAD by gender, age and ethnicity can be found in Table 3, which is an adapted version of the data tabled in Hansard 27 October 2009 Col292W.

Table 3. Most recently released NDNAD profile statistics by gender and ethnic appearance (compare 2008 and 2009). Source: Hansard 27 October 2009 Col292W.

The greatest injustice of the UK legislation related to the collection and storage of DNA samples and profiles, however, is the fact that at least 857,000 innocent people remain on the NDNAD who have not been convicted of a crime and who may never be convicted of a crime. The state of apprehension in which any one of those people must live is difficult to comprehend. For some who have been wrongly apprehended, such an ordeal would almost certainly lead to a feeling of bitterness, dislike, or even hatred toward the State and especially the UK Police. Among the approximately one million innocent people whose DNA samples have been taken are an estimated 100,000 innocent children (Action on Rights for Children 2007). What are these persons to think and feel? What does it mean for their future, or for employment opportunities requiring security checks? And how might their experience with the Police impact them later in life? Psychologists will always point out that someone treated like a criminal may retaliate as if they were one: “[b]ecause it feels like someone is punishing us by making us feel guilty, we often have an urge to retaliate against those who do” (Stosny 2008).

But beyond the psychological repercussions on the individual, stemming from what some refer to as “emotional pollution”, is the effort that a person must go through to get their details removed from the NDNAD (Geoghegan, 2009), a process that was almost impossible until the S and Marper ECtHR judgment. Since 2004, in England, Wales and Northern Ireland, records have been removed and DNA destroyed only under “exceptional circumstances” (Genewatch UK, 2009). And given that the profiles on the NDNAD belong to individual police forces, innocents whose profiles remain on the NDNAD and who wish to have them removed need to appeal to their Constabulary, although most recently ACPO have asked officers to ignore the ECtHR ruling (Travis, 2009).

At the end of March 2009, Lord West of Spithead noted that the NDNAD contained DNA profiles and linked DNA samples from approximately 4,859,934 individuals included by all police forces, of which an estimated 4,561,201 were from English and Welsh forces (more than 7 per cent of the UK population) (Hansard, 2009). This figure should be compared with those cited on 27 October 2009 in Parliament, which indicated that at the end of March 2008 there were a total of 5,056,313 profiles on the NDNAD and that for the same period in 2009 there were 5,617,112 (see Table 3). According to the latest population statistics obtained from the Office for National Statistics (2009), there are about 61.4 million people residing in the UK, which means that the NDNAD contains profiles of more than 8.36 per cent of the total population of the United Kingdom. This is a rather conservative estimate when one considers that Scotland has a different legislative requirement regarding the retention of DNA profiles.
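The tabled figures above already imply the database's rate of growth, which can be checked with simple arithmetic. Note that the per-capita share computed here uses profile counts, which include duplicate records of the same individual, so the share of distinct persons is somewhat lower.

```python
# Simple check of the databank's growth using the tabled year-end figures.

profiles_mar_2008 = 5_056_313    # end March 2008 (Hansard)
profiles_mar_2009 = 5_617_112    # end March 2009 (Hansard)
uk_population = 61_400_000       # ONS 2009 estimate cited in the text

annual_growth = profiles_mar_2009 - profiles_mar_2008
monthly_growth = annual_growth / 12

print(annual_growth)             # 560799 profiles added over the year
print(round(monthly_growth))     # 46733, within the 40,000-50,000 monthly rate
print(f"{profiles_mar_2009 / uk_population:.1%}")  # 9.1% (profiles, not distinct persons)
```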

These specifics are important because they indicate a number of things. First, the size of the UK databank is growing at over 560,000 profiles per annum, which is in keeping with the rate of 40,000 to 50,000 samples per month. Secondly, one in nine persons in England, Wales and Northern Ireland is registered on the databank. Thirdly, and more to the point, there are 507,636 DNA profiles which are of unknown persons. This either means that these samples have been collected at crime scenes and have not been identified against “known” persons, or that errors potentially exist in the NDNAD itself. Here an important complementary factor must be underscored in support of the latter claim. If we are to allege that 507,636 profiles came from scenes of crime (SOC) where the individual has not been identified since April 1995, then we also need to understand that (McCartney, 2006, p. 182):

only 5 per cent of examined crime scenes result in a successful DNA sample being loaded onto the NDNAD, and only 17 per cent of crime scenes are examined, meaning that just 0.85 per cent of all recorded crime produces a DNA sample that can be tested (NDNAD, 2003/04: 23)…

Thus it is very rare for a perpetrator of a serious crime to leave body samples behind, unless it is saliva on a cigarette butt or a drink can, or, in more violent crimes such as sexual assaults, semen or some other bodily stain. In the case of some violent crimes like sexual assault, most victims do not report to police, and are unlikely to begin doing so. Many of those who do report do so too late for DNA profiling to be an option. For those who report in time, the occurrence of sexual intercourse is often not an issue in dispute; the existence or non-existence of consent will be the critical matter, and DNA profiling can offer nothing to resolve this problem. However, in the case of serial rapes, or where there is no real doubt about the identity of the assailant, DNA profiling potentially has a great deal to offer (Freckelton, 1989, p. 29).
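The 0.85 per cent figure in the McCartney quotation above is simply the product of the two quoted rates, as a quick check shows:

```python
# Share of all recorded crime yielding a testable DNA sample: the product of
# the examination rate and the sample-recovery rate quoted from NDNAD data.

examined_rate = 0.17  # proportion of crime scenes examined
sample_rate = 0.05    # proportion of examined scenes yielding a loadable sample

overall = examined_rate * sample_rate
print(f"{overall:.2%}")  # 0.85%
```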

Of Dragnets and Mass Screenings

In cases where heinous violent crimes have occurred, often of a serial nature, local police have conducted mass DNA screenings of the population in the neighborhoods surrounding the scene of the crime (Butler, 2005, p. 449). It becomes apparent to local police that a mass DNA screening is required when it seems that the crimes have been committed by a single person nearby, given the trail of evidence left behind and other intelligence information. A DNA mass screening was used in the very first case in which DNA evidence was used to convict an individual. Mass screenings are now termed intelligence-led screens, and this subtle change in nuance as of 1999 was of great importance to how the UK perceived its use of DNA evidence in criminal cases. In a talk on DNA technology, Lynn Fereday of the FSS said in 1999 that:

[t]he screens now are a routine method of policing. This is a major way of saving police resources. What happens is that once a crime is being investigated, and DNA evidence has been found, police immediately do a scoping of who or what area they have to screen. They decide on a select area, and they then look for volunteers in that area. One of the first cases involved a murder of the young girl using STRs …The interesting thing about the mass screens is that although there seem to be some unease about continuing with them here, people are volunteering constantly. They volunteer for a reason, because they know they are innocent. They have nothing to fear, and we will end up with crime detection.

Of course, such comments come from an employee of the FSS. Examples of very early mass screenings in the UK can be found in DNA user conferences (Burton, 1999).

There is no denying that mass screenings have led to convictions of perpetrators who would have otherwise gone unnoticed, but the statement that people volunteer because they are “innocent” or they “have nothing to fear” is not entirely true.

In her landmark paper in 2006, Carole McCartney described Operation Minstead where the police profiled 1,000 black men in South London in the hunt for a serial rapist, and then requested each of them to volunteer a DNA sample. McCartney (2006, p. 180) writes:

Of those, 125 initially refused, leading to “intimidatory” letters from the police, urging re-consideration, and five were arrested, their DNA taken post-arrest and added to the NDNAD. Such actions have raised questions of legality, with arrests only lawful with 'reasonable suspicion' of an individual having committed a criminal act. If the police are to arrest on non-compliance with a DNA request, then that casts non-compliance as a crime--a step that worries civil libertarians and may lose the spirit of cooperation essential in these circumstances.

Table 4 shows an example of a prioritisation grid used to manage DNA intelligence-led screen actions. While it is an early example, and today’s practices are far more sophisticated, it does indicate why an individual approached by the police to volunteer a DNA sample might refuse to do so. Being targeted to donate a sample in a mass screen such as Operation Minstead means you are under some suspicion and fall into one of the priority areas of concern. If you are indeed innocent of a crime, you may refuse to donate a DNA sample for any number of reasons, among which could be a basic right not to be insulted, particularly by the State. A resident who lives in a mass screen prioritisation area and meets the criteria of any number of priorities might feel they are being presumed guilty, may not trust the technology to prove them innocent, or may even fear being accidentally matched to a crime they did not commit.

Table 4. A prioritisation grid to deal with DNA intelligence-led screen actions. Source: Burton (1999).

Now while the police can ask any person in the UK to volunteer a DNA sample, there is some controversy over what happens with a sample once it is analyzed and the individual is proven to be innocent. If an individual has been eliminated from enquiries, the question remains whether their DNA profile should be retained on the NDNAD. According to Genewatch (2009c):

[i]n these cases, people can consent to having their DNA used only for the inquiry, or give an additional signature if they agree to having their DNA profile added to the database. In Scotland volunteers can change their minds and ask to be removed from the Database, but this is not possible in England and Wales. However, the NDNAD Ethics Group recommended in April 2008 that volunteers should not have their DNA added to the Database at all, and their DNA should be destroyed when the case has ended. This recommendation is likely to be implemented because there is no evidence that adding volunteers' DNA to the database is helping to solve crimes.

Still, this practice has yet to be implemented categorically, and the argument remains that innocent people should be kept off the NDNAD.

Statistics presented by the Home Office will always tout suspect-to-scene matches and scene-to-scene matches, and provide the numbers of murders, rapes and car crimes in which suspects are identified, but it is very important to note that not all successful matches result in a conviction or even in an arrest (McCartney, 2006). So while the statistics might seem to indicate that the NDNAD is returning value for money, overall crime rates in the UK have not been reduced (Ministry of Justice, 2009), and the number of persons convicted using DNA evidence remains relatively moderate based on previous years’ reports. The FSS and the Government will always seek to show that the NDNAD has been an excellent evidential tool that has supported many successful prosecutions and provided important leads in unsolved “cold” cases, but no matter how one looks at it, the storage of innocent persons’ DNA profiles should not be permitted.

Where was the NDNAD Headed?

The Possibility of Blanket Coverage DNA Sampling of All Citizens

Putting the brakes on the NDNAD was not going to be easy. Several challenges had been heard in various local courts, but the applicants were unsuccessful in their attempts to have their fingerprints and DNA samples and profiles destroyed. Of course, some scientists working in the area of forensic analysis continued to dream of databases and databanks that would ideally contain the profiles of every person in the country. This was a view maintained by scientists not only within the UK but as far afield as the United States and even New Zealand, although the overwhelming feeling among this community of experts was that such a database would “obviously never be compiled” (Michaelis et al., 2008, p. 106). Still, this goodwill does not halt the potential for DNA databases to become commonplace in the future. In 2005, Koblinsky et al. (p. 290) rightly predicted that more people would find themselves on national DNA databases. They believed that it was likely:

… that legislation will be passed that will require juveniles who commit serious crimes to be included in the database. It is possible that eventually every citizen will be required to have his or her profile in a national database despite concerns about privacy issues and constitutional protections.

Such attitudes must be understood within their context. It makes sense to forensic analysts and scientifically literate commentators that a larger database would help to capture repeat offenders and thus reduce overall crime rates. Many would not debate the importance of DNA profiling for serious crimes, but there are real problems with applying DNA profiling techniques to the whole populace in a mandatory fashion. Even the Nuffield Council on Bioethics was allegedly supportive of the benefits of a universal database. According to Lynch et al. (2008, p. 154) the Council:

…[found] that while the balance of argument and evidence presented in the consultation was against the establishment of a population-wide database, it recommend[ed] that the possibility should be subject to review, given its potential contribution to public safety and the detection of crime, and its potential for reducing discriminatory practices.

In 2005, Koblinsky et al. (p. 163) wrote: “[a]s DNA analysis becomes more and more common in criminal investigations, there will come a day when millions upon millions of people will have been profiled.” Well, we no longer have to look into the future for the fulfillment of such prophecies: they are here now. There are millions upon millions of DNA samples and profiles stored in the UK alone, and the US too is now driving new initiatives on the road to mass DNA profiling (Moore, 2009). The FBI’s CODIS database holds 6.7 million profiles, and it is expected to accelerate its intake from 80,000 new entries a year to 1.2 million by 2012 (Michaelis et al., 2008, p. 105). But it may not be criminal legislation that has the greatest impact on such outlandish figures. One day it is indeed possible that the medical research field will have such an impact on society that “… every citizen’s genetic profile may be stored in a national database. There are many who are concerned about the ramifications of a government agency maintaining such records. It is essential that all DNA data can be encrypted and protected from abuse or unauthorized access” (Koblinsky et al., 2005).

Expanding databanks will clearly have an impact on civil liberties and individual privacy. And while there are those who believe such statements do a “disservice to a society suffering from a constant rise in violent crime” (Melson, 1990), the recent ECtHR ruling is proof enough that we need to reconsider the road ahead. But it is not scientists alone who are providing the impetus for even larger databanks; politicians and political commentators are also entering the debate. Former New York mayor Rudy Giuliani advocated taking DNA samples of all babies born in American hospitals. This idea would not take much to institute in practice, given that cellular samples (blood) are already taken from babies, with the permission of a parent, to test for common disorders. The same practice exists in Australia, where it is known as the Guthrie Test or, more commonly, the Heel Prick Test (Guthrie Test, 2009). Michaelis et al. (2008, pp. 100-101) comment on such a potential state of mass DNA sampling at birth but are mindful of the implications for civil liberties and privacy:

Having a databank of all American-born persons would obviously be of great benefit, not only in violent crime investigations but also in cases of missing persons, inheritance disputes, immigration cases and mass casualties such as airline crashes and terrorist acts. The obvious concerns over privacy and civil liberties, however, have caused commentators to urge caution when deciding which samples to include in the databanks.

DNA Developments and Innovations Challenging Ethical Practice

The 13-year Human Genome Project (HGP), conducted by the US Department of Energy and the National Institutes of Health, has gone a long way toward identifying all of the approximately 20,000-25,000 genes in human DNA and determining the sequences of the 3 billion chemical base pairs that make up human DNA. The project was, and still is, surrounded by a number of very challenging ethical, legal and social issues (Table 5). Points 3 and 7 in the table are of particular interest when we consider what it means for someone’s DNA sample to be taken, analyzed, and stored indefinitely in a criminal databank. What kind of psychological impact will this have on the individual, and what stigmatization will follow, first by the individual themselves and then by the community around them? This is particularly the case for minority groups. And what of the potential to “read” someone’s DNA and make judgments about their mode of behavior based on their genetic makeup? Are persons, for instance, more prone to violence because they carry particular genes? Or would generalities based on genetics affect someone’s free will and determine their future because of some preconceived statistical result?

Table 5. Societal concerns arising from the new genetics (adapted from the Human Genome Project, 2009)

Already under research are “DNA identikits” which can describe a suspect’s physical appearance from their DNA sample in the absence of an eyewitness account. At present the FSS provides an ethnic inference service (McCartney, 2006, p. 178). The FSS used this technology in 2008 to investigate the stabbing of Sally Anne Bowman in 2005, although it was not this forensic result that ultimately led the police to her perpetrator (FSS, 2009). Used to supplement ethnic inference is the red hair test, which can detect 84 per cent of redheads (McCartney, 2006, p. 181). Continued research arising from the HGP will inevitably reveal very detailed information about a person in the future. Closely related to innovations in identikits are advances in familial searching techniques. Given that members of a family share similar DNA profiles, obtaining the DNA of one individual in a family, let us say “the son”, can help to determine close matches with other persons in the immediate family, such as a sister, mother, father or first cousin. While only identical twins share exactly the same DNA, a sibling or parent shares a very close match. The technique of familial searching was also used in the Sally Anne Bowman case without success: a suspect’s DNA was taken and matched against the NDNAD, but no exact matches were returned, and a subsequent familial search did not aid the investigation either. Familial searching was first used in 2002 in a rape and murder case, when a list of 100 close matches was returned from the NDNAD to identify a perpetrator who had since died. DNA samples were first taken from the living relatives and then from the dead body of the offender, Joe Kappen.
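The intuition behind familial searching can be illustrated with a deliberately simplified sketch. The profile names, allele values and the database below are hypothetical, and real forensic systems rank candidates with likelihood ratios over many more STR loci; this toy merely counts shared alleles at three loci to show why a close relative surfaces near the top of a list of partial matches even when no exact match exists.

```python
# Illustrative sketch only (hypothetical data): familial searching ranks
# database profiles by how many alleles they share with a crime-scene
# profile across STR loci. A full sibling or parent typically shares far
# more alleles than an unrelated person, so relatives rise to the top.

def shared_alleles(profile_a, profile_b):
    """Count alleles shared between two profiles (0, 1 or 2 per locus)."""
    total = 0
    for locus, alleles_a in profile_a.items():
        remaining = list(profile_b.get(locus, ()))
        for allele in alleles_a:
            if allele in remaining:
                remaining.remove(allele)  # each allele can match only once
                total += 1
    return total

def rank_candidates(scene_profile, database):
    """Return (name, profile) pairs ordered by allele sharing, best first."""
    return sorted(database.items(),
                  key=lambda item: shared_alleles(scene_profile, item[1]),
                  reverse=True)

# Hypothetical three-locus profiles (real profiles use 10 or more loci).
scene = {"D3S1358": (15, 17), "vWA": (14, 16), "FGA": (21, 24)}
database = {
    "unrelated": {"D3S1358": (12, 13), "vWA": (18, 19), "FGA": (20, 26)},
    "sibling":   {"D3S1358": (15, 17), "vWA": (14, 15), "FGA": (21, 22)},
}

ranked = rank_candidates(scene, database)
print(ranked[0][0])  # the sibling-like profile shares the most alleles
```

The ethical tension discussed above falls directly out of this mechanism: the person returned at the top of the list has never given a sample themselves, yet becomes an investigative lead purely through a relative’s presence on the database.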

The Risks Associated with Familial Searching and Medical Research

Familial searching has very broad ethical implications. It is conducted on the premise that a rotten apple comes from a rotten tree. Put another way, the old adage goes, “tell me who your friends are and I’ll tell you who you are.” Today, we may instead be making the false connection: “tell me who your friends are and I’ll tell what gene you are”! Interestingly, this latter idea formed the title of a biology paper written by P. Morandini (2009). The point is that we return to models of reputation by association, and these cannot be relied upon to make judgments in a court of law. We learnt all too well in Australia, through the Dr Haneef case, that guilt by association, even guilt by blood-line, is dangerous to civil liberties. Considered another way, some have termed this kind of association based on DNA profiles “genetic redlining,” which can be defined as “the differentiated treatment of individuals based upon apparent or perceived human variation” (Melson, 1990, p. 189). David L. Gollaher discusses the risks of what is essentially genetic discrimination in a 1998 paper.

Perhaps the most disturbing development that may enter this field, and make things impossible to police in both the “criminal law” arena and the “medical research” field, is the deregulation and privatization of the DNA industry internationally. Future technological innovations will surely spawn the growth of this emerging industry. We have already noted the home-based DNA sampling kits available for less than 100 US dollars, which come with free DNA sample databanking. It will not be long before some citizens volunteer somebody else’s DNA instead of their own, forging consent documentation and the like. The bungle in the first ever UK DNA case shows that even the police could not imagine that Pitchfork (the offender) would have conceived of asking a friend to donate a sample on his behalf. Such cases will inevitably occur with volunteer home sampling methods, as fraudsters attempt to access the DNA samples of friends, strangers or even enemies via commonplace saliva-based sampling techniques. All you need is a pre-packed buccal swab from the DNA company providing the kits and away you go. If this seems an extreme possibility to the reader, consider the “spit kits” that have been issued to public transport drivers who have been harassed by passengers, for example by being spat at, and who can now collect the DNA samples of an alleged offender and turn them in to the appropriate authorities. No consent of the donor is required here (Lynch, 2008, p. 153).

When we consider how we as a society have come to “accept” the construction and development of such unusually large national databanks as the NDNAD in the UK, we can identify a number of driving forces. Some nations have reached this point of almost indiscriminate storage of DNA profiles primarily due to changes in policing practices and the law, government policy, and innovation in forensic science (the idea that because we can, we should), co-existing with venture capitalists backing commercial opportunities and parallel developments in the genetic medical research field. In the case of the UK, PACE was amended so extensively, and what constituted a “recordable offence” was so redefined, that non-intimate samples could be obtained without consent from individuals under investigation for the following offences (Roberts & Taylor, 2005, pp. 389-390):

unlawfully going onto the playing area at a designated football match; failing to leave licensed premises when asked to do so; taking or destroying rabbits by night; riding a pedal cycle without the owner's consent; allowing alcohol to be carried in vehicles on journeys to or from a designated sporting event.

Consider the Home Office’s August 2008 proposal to expand police powers, which included plans to set up new “short term holding facilities” (STHFs) in shopping centers to take people’s DNA and fingerprints, and which was later quashed following the S and Marper ECtHR judgment (Genewatch UK, 2009b).

This is little short of farcical. It makes little sense to take such personal data from an individual when the profile itself cannot be used for investigative purposes. There must be some other motivation behind the sampling of persons who on occasion might find themselves charged with a petty crime and punished by fine, penalty, forfeiture or imprisonment other than in a penitentiary. Why store such petty offenders’ DNA profiles indefinitely on the NDNAD? Surely the action of someone who, for instance, under the influence of alcohol refuses to leave a licensed premise when asked to do so, is not indicative of their capacity to commit a serious felony in the future. There is a grave issue of proportionality here, of treatment commensurate with the crime committed, and, on the side of the crime itself, a major issue with what constitutes a recordable offence. The original PACE wording stated a “serious arrestable offence” (Ireland, 1989, p. 80), not just any offence. As a result, policing powers were increased significantly, and the individual’s right not to incriminate himself or herself was withdrawn, in conflict with the underpinnings of the Common Law (Freckelton, 1989, p. 31):

Our legal system has traditionally eschewed forcing people to incriminate themselves by becoming the instruments of their own downfall. That principle has suffered a number of encroachments in recent years.

It is here that we need to take a step back, reassess the balance needed in a robust criminal justice system, and make the necessary changes to legislation, lest we get so far ahead that recourse becomes a near impossibility.


When one analyses the case of Mr S and Mr Marper, one realises how far short of the mark the UK Government has fallen: instead of the rights of innocent people being upheld, their fingerprint and DNA data are retained for “safe keeping”. Some have claimed that this initial boost in the number of samples was purposely engineered to make the NDNAD statistically meaningful, while others believe it was in line with the more sinister overtones of a surveillance state. One thing is certain: where the courts in England did not provide any recourse for either Mr S or Mr Marper, the European Court of Human Rights ruled by a landslide majority that both Mr S and Mr Marper should have their DNA samples destroyed and their profiles permanently deleted. One of the major issues that triggered this change in the collection of such personal and sensitive data has been the alleged 3,000 individual changes to the PACE Act. The watering down of laws that are meant to uphold justice, but are instead being used to abuse citizens’ rights, is an extremely worrying trend, and adequate solutions, despite the ECtHR ruling, are still lacking.


Action on Rights for Children. (2007). How many innocent children are being added to the national DNA database? Retrieved from

BBC. (1991). Birmingham six freed after 16 years. Retrieved from

Beattie, K. (2009). S and Marper v UK: Privacy, DNA and crime prevention. European Human Rights Law Review, 2, 231.

Burton, C. (1999). The United Kingdom national DNA database. Interpol. Retrieved from

Butler, J. M. (2005). Forensic DNA typing: Biology, technology, and genetics of STR markers. Oxford, UK: Elsevier.

Fereday, L. (1999). Technology development: DNA from fingerprints. Retrieved from

Forensic Science Service. (2009a). Analytical solutions: DNA solutions. Retrieved from

Forensic Science Service. (2009b). Sally Anne Bowman. Retrieved from

Freckelton, I. (1989). DNA profiling: Forensic science under the microscope. In J. Vernon & B. Selinger (Eds.), DNA and criminal justice (Vol. 2). Academic Press.

Genewatch UK. (2009a). A brief legal history of the NDNAD. Retrieved from

Genewatch UK. (2009b). Police and criminal evidence act (PACE) consultations. Retrieved from

Genewatch UK. (2009c). Whose DNA profiles are on the database? Retrieved from

Geoghegan, J. (2009, October 12). Criticism for police over silence on DNA database. Echo. Retrieved from

Gollaher, D. L. (1998). Genetic discrimination: Who is really at risk? Genetic Testing, 2(1), 13. doi:10.1089/gte.1998.2.1.13

Guthrie Test (Heel Prick Test). (2009). Discovery. Retrieved from

Hansard. (1993). Royal commission on criminal justice. Retrieved from

Hansard. (2009). DNA databases. Retrieved from

Hansard. (2009). Police: Databases. Retrieved from

Home Office. (2004). Coldcases to be cracked in DNA clampdown. Retrieved from

Home Office. (2005). DNA expansion programme 2000–2005: Reporting achievement. Retrieved from

Human Genome Project. (2009). Human genome project information: Ethical, legal and social issues. Retrieved from

Ireland, S. (1989). What authority should police have to detain suspects to take samples? In J. Vernon & B. Selinger (Eds.), DNA and criminal justice. Retrieved from

Jarrett, K. (2006). DNA breakthrough. National Black Police Association. Retrieved from

Jha, A. (2004, September 9). DNA fingerprinting no longer foolproof. The Guardian. Retrieved from

Jobling, M. A., & Gill, P. (2004). Encoded evidence: DNA in forensic analysis. Nature Reviews Genetics, 5(10), 745. doi:10.1038/nrg1455

Koblinsky, L., Liotti, T. F., & Oeser-Sweat, J. (Eds.). (2005). DNA: Forensic and legal applications. Hoboken, NJ: Wiley.

Lawrence, S. (1999, February 24). What is institutional racism? The Guardian. Retrieved from

Lynch, M. (2008). Truth machine: The contentious history of DNA fingerprinting. Chicago: University of Chicago Press. doi:10.7208/chicago/9780226498089.001.0001

McCartney, C. (2006). Forensic identification and criminal justice: Forensic science, justice and risk. Cullompton: Willan Publishing.

McCartney, C. (2006). The DNA expansion programme and criminal investigation. The British Journal of Criminology, 46(2), 189. doi:10.1093/bjc/azi094

Melson, K. E. (1990). Legal and ethical considerations. In L. T. Kirby (Ed.), DNA fingerprinting: An introduction. Oxford, UK: Oxford University Press.

Michaelis, R. C., Flanders, R. G., & Wulff, P. H. (2008). A litigator's guide to DNA: From the laboratory to the courtroom. Burlington, MA: Elsevier.

Ministry of Justice. (2009). Population in custody. Retrieved from

Moore, S. (2009). F.B.I. and states vastly expand DNA databases. The New York Times. Retrieved from

Morandini, P. (2009). Tell me who your friends are and I'll tell what gene you are. Retrieved from

Nuffield Council on Bioethics. (2009). Forensic use of bioinformation: Ethical issues. Retrieved from

Office for National Statistics. (2007). Mid-2006 UK, England and Wales, Scotland and Northern Ireland: 22/08/07. Retrieved from

Office for National Statistics. (2009). UK population grows to 61.4 million. Retrieved from

Parliamentary Office of Science and Technology. (2006). Postnote: The national DNA database. Retrieved from

Police Home Office. (2009). Police and criminal evidence act 1984 (PACE) and accompanying codes of practice. Retrieved from

Roberts, A., & Taylor, N. (2005). Privacy and the DNA database. European Human Rights Law Review, 4, 373.

Rodgers, M. C. (2009). Diane Abbott MP and liberty hold DNA clinic in Hackney. Liberty. Retrieved from

Stosny, S. (2008). Guilt vs. responsibility is powerlessness vs. power: Understanding emotional pollution and power. Anger in the Age of Entitlement. Retrieved from

Travis, A. (2009, August 8). Police told to ignore human rights ruling over DNA: Details of innocent people will continue to be held: Senior officers will not get new guidance for a year. The Guardian. Retrieved from

Williams, R., & Johnson, P. (2005). Inclusiveness, effectiveness and intrusiveness: Issues in the developing uses of DNA profiling in support of criminal investigations. Medical Malpractice: U.S. & International Perspectives, 545.

Key Terms and Definitions

BEM: Black Ethnic Minority group. A BEM has specific national or cultural traditions distinct from those of the majority of the population.

DNA: Deoxyribonucleic acid (DNA) is a molecule that encodes the genetic instructions used in the development and functioning of all known living organisms and many viruses.

DRAGNETS: In policing, a dragnet is any system of coordinated measures for apprehending criminals or suspects, such as widespread DNA testing, or pressuring potential offenders who have committed a given act to come forward.

ECtHR: European Court of Human Rights is a supra-national or international court established by the European Convention on Human Rights.

Familial Searching: Familial searching is a second phase step conducted by law enforcement after a search on a DNA database has returned no profile matches. Familial searching attempts to find a match of first-order relatives (e.g. sibling, parent/child) based on a partial match, granting some leads to law enforcement, as opposed to no leads.

HGP: The Human Genome Project is an international scientific research project with a primary goal of determining the sequence of chemical base pairs which make up human DNA, and of identifying and mapping the total genes of the human genome from both a physical and functional standpoint.

Mass Screenings: Occur when the police encourage people residing in a given area, or encourage people who are members of a certain group to volunteer their DNA sample. Mass screenings are supposed to save police resources in apprehending the offender(s) of a criminal activity.

NDNAD: The UK National DNA Database, set up in 1995. As of the end of 2005, it carried the profiles of around 3.1 million people; by March 2012 it contained an estimated 5,950,612 individuals. The database, which grows by 30,000 samples each month, is populated by samples recovered from crime scenes and taken from police suspects and, in England and Wales, anyone arrested and detained at a police station.

PACE: The Police and Criminal Evidence Act 1984 (PACE) (1984 c. 60) is an Act of Parliament which instituted a legislative framework for the powers of police officers in England and Wales to combat crime, as well as providing codes of practice for the exercise of those powers.

Profiling: With respect to DNA, profiling refers to the banding patterns of genetic profiles produced by electrophoresis of treated samples of DNA.

Scene of a Crime: A location where a crime took place, or another location where evidence of the crime may be found. This is the area which yields most of the physical evidence retrieved by law enforcement personnel, crime scene investigators (CSIs) or, in some circumstances, forensic scientists.

SLP: The Single Locus Probe (SLP) is a technique which was in use in early DNA examinations and has numerous limitations compared with newer, more advanced techniques.

Citation: Michael, K. (2014). Towards the Blanket Coverage DNA Profiling and Sampling of Citizens in England, Wales, and Northern Ireland. In M. Michael, & K. Michael (Eds.), Uberveillance and the Social Implications of Microchip Implants: Emerging Technologies (pp. 187-207). Hershey, PA: IGI Global. doi:10.4018/978-1-4666-4582-0.ch008