Using a Social-Ethical Framework to Evaluate Location-Based Services

Abstract


The idea for an Internet of Things has matured since its inception as a concept in 1999. People today speak openly of a Web of Things and People, and even more broadly of an Internet of Everything. As our relationships become more and more complex and enmeshed, through the use of advanced technologies, we have pondered ways to simplify flows of communications, to collect meaningful data, and to use them to make timely decisions with respect to optimisation and efficiency. At their core, these flows of communications are pathways to registers of interaction, and tell the intricate story of outputs at various units of analysis: things, vehicles, animals, people, organisations, industries, even governments. In this trend toward evidence-based enquiry, data is the enabling force driving the growth of IoT infrastructure. This paper uses the case of location-based services, which are integral to IoT approaches, to demonstrate that new technologies are complex in their effects on society. Fundamental to IoT is the spatial element, and through this capability, the tracking and monitoring of everything, from the smallest nut and bolt, to the largest shipping liner, to the mapping of planet earth, and from the whereabouts of a minor to those of a prime minister. How this information is stored, who has access to it, and what they will do with it are contested questions, and much depends on the answers. In this case study of location-based services we concentrate on control and trust, two overarching themes that have been very much neglected, and use the outcomes of this research to inform the development of a socio-ethical conceptual framework that can be applied to minimise the unintended negative consequences of advanced technologies. We posit that it is not enough to claim objectivity through information ethics approaches alone, and present instead a socio-ethical impact framework. Sociality therefore binds together that higher ideal of praxis in which the living thing (e.g. the human) is the central and most valued actor of a system.

Agenda:

Introduction 1.1

Control 1.2

Surveillance 1.2.1

Common surveillance metaphors 1.2.2

Applying surveillance metaphors to LBS 1.2.3

‘Geoslavery’ 1.2.4

From state-based to citizen level surveillance 1.2.5

Dataveillance 1.2.6

Risks associated with dataveillance 1.2.7

Loss of control 1.2.8

Studies focussing on user requirements for control 1.2.9

Monitoring using LBS: control versus care? 1.2.10

Sousveillance 1.2.11

Sousveillance, ‘reflectionism’ and control 1.2.12

Towards überveillance 1.2.13

Implications of überveillance on control 1.2.14

Comparing the different forms of ‘veillance’ 1.2.15

Identification 1.2.16

Social sorting 1.2.17

Profiling 1.2.18

Digital personas and dossiers 1.2.19

Trust 1.3

Trust in the state 1.3.1

Balancing trust and privacy in emergency services 1.3.2

Trust-related implications of surveillance in the interest of national security 1.3.3

Need for justification and cultural sensitivity 1.3.4

Trust in corporations/LBS/IoT providers 1.3.5

Importance of identity and privacy protection to trust 1.3.6

Maintaining consumer trust 1.3.7

Trust in individuals/others 1.3.8

Consequences of workplace monitoring 1.3.9

Location-monitoring amongst friends 1.3.10

Location tracking for protection 1.3.11

LBS/IoT is a ‘double-edged sword’ 1.3.12

Discussion 1.4

The Internet of Things (IoT) and LBS: extending the discussion on control and trust 1.4.1

Control- and trust-related challenges in the IoT 1.4.2

Ethical analysis: proposing a socio-ethical conceptual framework 1.4.3

The need for objectivity 1.4.4

Difficulties associated with objectivity 1.4.5

Conclusion 1.5

 

Introduction 1.1

Locative technologies are a key component of the Internet of Things (IoT). Some scholars go so far as to say it is the single most important component that enables the monitoring and tracking of subjects and objects. Knowing where something or someone is, is of greater importance than knowing who they are, because it or they can be found independent of what or who they are. Location also grants us that unique position on the earth’s surface, providing one of the vital pieces of information forming the distance, speed, time matrix. A unique ID, formed around an IP address in an IoT world, presents us with the capability to label every living and non-living thing and to recollect it, adding to its history and longer term physical lifetime. But without knowing where something is, even if we know that some maintenance action is required, we cannot be responsive. Since the introduction of electronic databases, providing accurate records for transaction processing has been a primary aim. Today, however, we are attempting to increase visibility using high resolution geographic details, we are contextualising events through discrete and sometimes continuous sensor-based rich audio-visual data collection, and we are observing how mobile subjects and objects interact with the built environment. We are no longer satisfied with an approach that merely identifies all things; we wish to be able to recollect or activate them on demand, understand their associations and affiliations, and create a digital chronicle of their histories to provide insights toward sustainability.
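As a simple illustration of the distance, speed, time matrix mentioned above, the following minimal Python sketch derives distance and average speed from two successive location fixes. The coordinates and the ten-minute sampling interval are invented for illustration; this is not drawn from any particular LBS implementation.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# Two hypothetical GPS fixes for the same tracked object, ten minutes apart.
fix_a = (-34.4278, 150.8931)   # illustrative coordinates only
fix_b = (-34.4054, 150.8784)

distance = haversine_km(*fix_a, *fix_b)
speed_kmh = distance / (10 / 60)  # distance divided by elapsed hours
print(f"{distance:.2f} km apart, average speed {speed_kmh:.1f} km/h")
```

Two timestamped positions are thus enough to infer movement, which is why location is treated here as the vital enabling datum rather than identity alone.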

There is thus undue pressure on the ethical justification for the social and behavioural tracking of people and things in everyday life. Simply because we have the means to do something does not mean we should do it. We are told that through the new knowledge gained from big data we can reduce carbon emissions, eradicate poverty, grant all people equity in health services, better provision for expected food shortages, and utilise energy resources optimally; in short, make the world a better place. This utopian view might well be the vision the tech sector wishes to adopt as an honourable marketing strategy, but the reality of thousands of years of history tells us that technology does not, of its own accord, necessarily make things better. In fact, it has often made some aspects of life, such as conflict and war, much worse through the use of modern, sophisticated techniques. We could argue that IoT will allow for care-based surveillance that will bring aid to individuals and families according to their needs, but the reality is that wherever people are concerned, technology may be exploited as a means of control. Control on its own is not necessarily an evil; it all depends on how the functionality of a given technology is applied. Applied negatively, the recipient of this control orientation learns distrust instead of trust, which then causes a chain reaction throughout society, especially with respect to privacy and security. We need only look at the techniques espoused by some governments in the last 200 years to acknowledge that heinous crimes against humanity (e.g. democide) have been committed with new technological armaments (Rummel, 1997) to the detriment of the citizenry.

A socio-ethical framework is proposed as a starting point for seeking to understand the social implications of location services, applicable to current and future applications within IoT infrastructure. To stop at critiquing services using solely an information ethics-based approach is to fall short. Today’s converging services and systems require a greater scope of orientation to ask more generally how society may be affected at large, not just whether information is being collected, stored, and shared appropriately. To ask questions about how location services and IoT technology will directly and indirectly change society has far greater importance for the longer term vision of person-to-person and person-to-thing interactions than simply studying various attributes in a given register.

Studies addressing the social implications of emerging technologies, such as LBS, generally reflect on the risks and ethical dilemmas resulting from the implementation of a particular technology within a given social context. While numerous approaches to ethics exist, all are inextricably linked to ideas of morality, and an ability to distinguish good conduct from bad. Ethics, in simple terms, can be considered as the “study of morality” (Quinn 2006, p. 55), where morality refers to a “system of rules for guiding human conduct and principles for evaluating those rules” (Tavani 2007, p. 32). This definition is shared by Elliot and Phillips (2004, p. 465), who regard ethics as “a set of rules, or a decision procedure, or both, intended to provide the conditions under which the greatest number of human beings can succeed in ‘flourishing’, where ‘flourishing’ is defined as living a fully human life” (O'Connor and Godar 2003, p. 248).

According to the literature, there are two prominent ethical dilemmas that emerge with respect to locating a person or thing in an Internet of Things world. First, the risk of unauthorised disclosure of one’s location which is a breach of privacy; and second the possibility of increased monitoring leading to unwarranted surveillance by institutions and individuals. The socio-ethical implications of LBS in the context of IoT can therefore be explored based on these two major factors. IoT more broadly, however, can be examined by studying numerous social and ethical dilemmas from differing perspectives. Michael et al. (2006a, pp. 1-10) propose a framework for considering the ethical challenges emerging from the use of GPS tracking and monitoring solutions in the control, convenience and care usability contexts. The authors examine these contexts in view of the four ethical dimensions of privacy, accuracy, property and accessibility (Michael et al. 2006a, pp. 4-5). Alternatively, Elliot and Phillips (2004, p. 463) discuss the social and ethical issues associated with m-commerce and wireless computing in view of the privacy and access, security and reliability challenges. The authors claim that factors such as trust and control are of great importance in the organisational context (Elliot and Phillips 2004, p. 470). Similar studies propose that the major themes regarding the social implications of LBS be summarised as control, trust, privacy and security (Perusco et al. 2006; Perusco and Michael 2007). These themes provide a conceptual framework for reviewing relevant literature in a structured fashion, given that a large number of studies are available in the respective areas.

This article, in the first instance, focusses on the control- and trust-related socio-ethical challenges arising from the deployment of LBS in the context of IoT, two themes that are yet to receive comprehensive coverage in the literature. This is followed by an examination of LBS in the context of the Internet of Things (IoT), and the ensuing ethical considerations. A socio-ethical framework is proposed as a valid starting point for addressing the social implications of LBS and delivering a conceptual framework that is applicable to current LBS use cases and future applications within an Internet of Things world.

Control 1.2

Control, according to the Oxford Dictionary (2012a), refers to “the power to influence or direct people’s behaviour or the course of events”. With respect to LBS, this theme is examined in terms of a number of important concepts, notably surveillance, dataveillance, sousveillance and überveillance scholarship.

Surveillance 1.2.1

A prevailing notion in relation to control and LBS is the idea of exerting power over individuals through various forms of surveillance. Surveillance, according to sociologist David Lyon, “is the focused, systematic and routine attention to personal details for the purposes of influence, management, protection and/or direction,” although Lyon admits that there are exceptions to this general definition (Lyon 2007, p. 14). Surveillance has also been described as the process of methodically monitoring the behaviour, statements, associates, actions and/or communications of an individual or individuals, and is centred on information collection (Clarke 1997; Clarke 2005, p. 9).

The act of surveillance, according to Clarke (1988; 1997) can either take the form of personal surveillance of a specific individual or mass surveillance of groups of interest. Wigan and Clarke (2006, p. 392) also introduce the categories of object surveillance of a particular item and area surveillance of a physical enclosure. Additional means of expressing the characteristics of surveillance exist. For example, the phrase “surveillance schemes” has been used to describe the various surveillance initiatives available (Clarke 2007a, p. 28). Such schemes have been demonstrated through the use of a number of mini cases or vignettes, which include, but are not limited to, baby monitoring, acute health care, staff movement monitoring, vehicle monitoring, goods monitoring, freight interchange-point monitoring, monitoring of human-attached chips, monitoring of human-embedded chips, and continuous monitoring of chips (Clarke 2007c; Clarke 2007b, pp. 47-60). The vignettes are intended to aid in understanding the desirable and undesirable social impacts resulting from respective schemes.

Common surveillance metaphors 1.2.2

In examining the theme of control with respect to LBS, it is valuable to initially refer to general surveillance scholarship to aid in understanding the link between LBS and surveillance. Surveillance literature is somewhat dominated by the use of metaphors to express the phenomenon. A prevalent metaphor is that of the panopticon, first introduced by Jeremy Bentham (Bentham and Bowring 1843), and later examined by Michel Foucault (1977). Foucault’s seminal piece Discipline and Punish traces the history of punishment, commencing with the torture of the body in the eighteenth century, through to more modern forms of punishment targeted at the soul (Foucault 1977). In particular, Foucault’s account offers commentary on the notions of surveillance, control and power through his examination of Bentham’s panopticon, which are pertinent in analysing surveillance in general and monitoring facilitated by LBS in particular. The panopticon, or “Inspection-House” (Bentham and Bowring 1843, p. 37), refers to Bentham’s design for a prison based on the essential notion of “seeing without being seen” (p. 44). The architecture of the panopticon is as follows:

“The building is circular. The apartments of the prisoners occupy the circumference. You may call them, if you please, the cells... The apartment of the inspector occupies the centre; you may call it if you please the inspector's lodge. It will be convenient in most, if not in all cases, to have a vacant space or area all round, between such centre and such circumference.  You may call it if you please the intermediate or annular area” (Bentham and Bowring 1843, pp. 40-41).

Foucault (1977, p. 200) further illustrates the main features of the inspection-house, and their subsequent implications on constant visibility:

“By the effect of backlighting, one can observe from the tower [‘lodge’], standing out precisely against the light, the small captive shadows in the cells of the periphery. They are like so many cages, so many small theatres, in which each actor is alone, perfectly individualized and constantly visible...Full lighting and the eye of a supervisor [‘inspector’] capture better than darkness, which ultimately protected. Visibility is a trap.”

While commonly conceived as ideal for the prison arrangement, the panopticon design is applicable and adaptable to a wide range of establishments, including but not limited to work sites, hospitals, schools, or any establishment in which individuals “are to be kept under inspection” (Bentham and Bowring 1843, p. 37). It has been suggested, however, that the panopticon functions as a tool for mass (as opposed to personal) surveillance in which large numbers of individuals are monitored, in an efficient sense, by a small number (Clarke 2005, p. 9). This differs from the more efficient, automated means of dataveillance (to be shortly examined). In enabling mass surveillance, the panopticon theoretically allows power to be exercised automatically: the mere prospect of being watched induces those under inspection to regulate their own behaviour. Foucault (1977, pp. 202-203) provides a succinct summary of this point:

“He who is subjected to a field of visibility, and who knows it, assumes responsibility for the constraints of power; he makes them play spontaneously upon himself; he inscribes in himself the power relation in which he simultaneously plays both roles; he becomes the principle of his own subjection.”

This self-disciplinary mechanism functions similarly to, and can to some extent be paralleled with, various notions in George Orwell’s classic novel Nineteen Eighty Four (Orwell 1949), also a common reference point in surveillance literature. Nineteen Eighty Four has been particularly influential in the surveillance realm, notably due to the use of “Big Brother” as a symbol of totalitarian, state-based surveillance. Big Brother’s inescapable presence is reflected in the nature of surveillance activities. That is, that monitoring is constant and omnipresent and that “[n]othing was your own except the few cubic centimetres inside your skull” (Orwell 1949, p. 29). The oppressive authority figure of Big Brother possesses the ability to persistently monitor and control the lives of individuals, employing numerous mechanisms to exert power and control over his populace as a reminder of his unavoidable gaze.

One such mechanism is the use of telescreens as the technological solution enabling surveillance practices to be applied. The telescreens operate as a form of self-disciplinary tool by way of reinforcing the idea that citizens are under constant scrutiny (in a similar fashion to the inspector’s lodge in the panopticon metaphor). The telescreens inevitably influence behaviours, enabling the state to maintain control over actions and thoughts, and to impose appropriate punishments in the case of an offence. This is demonstrated in the following excerpt:

“It was terribly dangerous to let your thoughts wander when you were in any public place or within range of a telescreen. The smallest thing could give you away. A nervous tic, an unconscious look of anxiety, a habit of muttering to yourself – anything that carried with it the suggestion of abnormality, of having something to hide. In any case, to wear an improper expression on your face (to look incredulous when a victory was announced, for example) was itself a punishable offence” (Orwell 1949, p. 65).

The Internet of Things, with its ability to locate and determine who or what is related to whom or what using a multiplicity of technologies, will enable authorities in power to infer what someone is likely to do in a given context. Past behavioural patterns can, for example, reveal a likely course of action with relatively no prediction required. IoT in all its glory will provide complete visibility: the question is what are the risks associated with providing that kind of capability to the state or private enterprise? In scenario analysis we can ponder how IoT in a given context will be used for good, how it will be used for bad, and a neutral case where it will have no effect whatsoever because the data stream will be ignored by the system owner. While IoT has been touted as the ultimate means of providing great operational returns to organisations, one can see how it can lend itself to location-based tracking and monitoring in the manner of the panopticon metaphor. Paper records and registers were used during World War 2 for the purposes of segregation; IoT, and especially the ability to “locate on demand”, may well be used for similar types of control.
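To illustrate how little genuine prediction is needed once past behavioural patterns are on record, consider the following minimal Python sketch. The observation log, place names and times are entirely hypothetical; the point is only that a frequency tally over past observations already yields a plausible "likely course of action".

```python
from collections import Counter, defaultdict

# Hypothetical location log: (day, hour, place) observations gathered
# from a person's device over several weeks (invented data).
observations = [
    ("Mon", 8, "gym"), ("Mon", 9, "office"), ("Tue", 8, "gym"),
    ("Tue", 9, "office"), ("Wed", 8, "cafe"), ("Wed", 9, "office"),
    ("Thu", 8, "gym"), ("Fri", 9, "office"), ("Mon", 18, "home"),
]

# Tally how often each place has been observed at each hour of the day.
history = defaultdict(Counter)
for _day, hour, place in observations:
    history[hour][place] += 1

def likely_place(hour: int) -> str:
    """Return the most frequently observed place for a given hour."""
    counts = history.get(hour)
    return counts.most_common(1)[0][0] if counts else "unknown"

print(likely_place(8))  # 'gym'    -- no sophisticated modelling required
print(likely_place(9))  # 'office'
```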

Applying surveillance metaphors to LBS 1.2.3

The aforementioned surveillance metaphors can be directly applied to the case of LBS within IoT. In the first instance, it can be perceived that the exploitation of emerging technologies, such as LBS, extends the notion of the panopticon in a manner that allows for inspection or surveillance to take place regardless of geographic boundaries or physical locations. When applying the idea of the panopticon to modern technologies, Lyon suggests that “Bentham’s panopticon gives way to the electronic superpanopticon” (Lyon 2001, p. 108). With respect to LBS, this superpanopticon is not limited to and by the physical boundaries of a particular establishment, but is rather reliant on the nature and capabilities of the mobile devices used for ‘inspection’. In an article titled “The Panopticon's Changing Geography”, Dobson and Fisher (2007) also discuss the progress and various manifestations of surveillance technology, specifically the panopticon, and the consequent implications for power relationships. From Bentham's architectural design, to the electronic panopticon depicted by Orwell, to contemporary forms of electronic surveillance including LBS and covert human tracking, Dobson and Fisher (2007, pp. 308-311) claim that all forms of watching enable continuous surveillance either as part of their primary or secondary purpose. They compare four means of surveillance: analogue technologies as used by spies, which have unlimited geographic coverage and are very expensive to own and operate; Bentham’s original panopticon, where the geographic view was internal to a building; George Orwell’s Big Brother view, which was bound by the extent of television cables; and finally human tracking systems, which are limited only by the availability and granularity of cell phone towers.

A key factor in applying the panopticon metaphor to IoT is that individuals, through the use of mobile location devices and technologies, will be constantly aware of their visibility and will assume the knowledge that an ‘inspector’ may be monitoring their location and other available information remotely at any given time. Mobile location devices may similarly replace Orwell’s idea of the telescreens as Big Brother’s primary surveillance technology, resulting in a situation in which the user is aiding in the process of location data collection and thereby surveillance. This creates, as maintained by Andrejevic (2007, p. 95), a “widening ‘digital enclosure’ within which a variety of interactive devices that provide convenience and customization to users double as technologies for gathering information about them.”

‘Geoslavery’ 1.2.4

Furthermore, in extreme situations, LBS may facilitate a new form of slavery, “geoslavery”, which Dobson and Fisher (2003, pp. 47-48) reveal is “a practice in which one entity, the master, coercively or surreptitiously monitors and exerts control over the physical location of another individual, the slave. Inherent in this concept is the potential for a master to routinely control time, location, speed, and direction for each and every movement of the slave or, indeed, of many slaves simultaneously.” In their seminal work, the authors flag geoslavery as a fundamental human rights issue (Dobson and Fisher 2003, p. 49), one that has the potential to somewhat fulfil Orwell's Big Brother prophecy, differing only in relation to the sophistication of LBS in comparison to visual surveillance and also in terms of who is in control. While Orwell’s focus is on the state, Dobson and Fisher (2003, p. 51) caution that geoslavery can also be performed by individuals “to control other individuals or groups of individuals.”

From state-based to citizen level surveillance 1.2.5

Common in both Discipline and Punish and Nineteen Eighty Four is the perspective that surveillance activities are conducted at the higher level of the “establishment”; that is, institutional and/or state-based surveillance. However, it must be noted that similar notions can be applied at the consumer or citizen level. Mark Andrejevic (2007, p. 212), in his book iSpy: Surveillance and Power in the Interactive Era, terms this form of surveillance “lateral or peer-to-peer surveillance.” This form of surveillance is characterised by “increasing public access to the means of surveillance – not just by corporations and the state, but by individuals” (Andrejevic 2007, p. 212). Similarly, Barreras and Mathur (2007, pp. 176-177) state that wireless location tracking capabilities are no longer limited to law enforcement, but are open to any interested individual. Abbas et al. (2011, pp. 20-31) further the discussion by focussing on related notions, specifically the implications of covert LBS-based surveillance at the community level, where technologies typically associated with policing and law enforcement are increasingly available for use by members of the community. With further reference to LBS, Dobson and Fisher (2003, p. 51) claim that the technology empowers individuals to control other individuals or groups, while also facilitating extreme activities. For instance, child protection, partner tracking and employee monitoring can now take on extreme forms through the employment of LBS (Dobson and Fisher 2003, p. 49). According to Andrejevic (2007, p. 218), this “do-it-yourself” approach assigns the act of monitoring to citizens. In essence, higher degrees of control are granted to individuals, thereby encouraging their participation in the surveillance process (Andrejevic 2007, pp. 218-222). It is important to understand IoT in the context of this multifaceted “watching”. IoT will not only be used by organisations and government agencies; individuals in a community will also be granted access to information at fine-grained units of aggregation. This has implications at a multiplicity of levels. Forces of control will be manifold.

Dataveillance 1.2.6

The same sentiments can be applied to the related, and to an extent superseding, notion of data surveillance, commonly referred to as dataveillance. Coined by Roger Clarke in the mid-eighties, dataveillance is defined as “the systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons” (Clarke 1988). Clarke (2005, p. 9) maintains that this process is automated and therefore a relatively economical activity when compared with other forms of surveillance, in that dataveillance activities are centred on examination of the data trails of individuals. For example, traditional forms of surveillance rely on expensive visual monitoring techniques, whereas dataveillance is largely an economically efficient alternative (Clarke 1994; 2001d, p. 11). Visual behavioural monitoring (that is, traditional surveillance) is an issue, but is nonetheless overshadowed by the challenges associated with dataveillance, particularly with reference to personal and mass dataveillance (Clarke 2005, pp. 9-10). That is, personal dataveillance presents risks to the individual based primarily on the potential for the collected data/information to be incorrect or outdated, while mass dataveillance is risky in that it may generate suspicion amongst individuals (Albrecht & Michael, 2013).

Risks associated with dataveillance 1.2.7

Clarke’s early and influential work on “Information Technology and Dataveillance” recognises that information technology is accelerating the growth of dataveillance, which presents numerous benefits and risks (Clarke 1988, pp. 498, 505-507). Clarke lists advantages in terms of safety and government applications, while noting the dangers associated with both personal and mass dataveillance (Clarke 1988, pp. 505-507). These risks can indeed be extended or applied to the use of location and tracking technologies to perform dataveillance activities, resulting in what can be referred to as “dataveillance on the move” (Michael and Michael 2012). The specific risks include: the ability for behavioural patterns to be exposed and cross-matched, potentially yielding revelations that may be harmful from a political and personal perspective; a rise in the use of “circumstantial evidence”; transparency of behaviour, resulting in the misuse of information relating to an individual’s conduct; and “actual repression of the readily locatable and trackable individual” (Clarke 2001b, p. 219). Emerging from this analysis, and that concerning surveillance and related metaphors, is the significant matter of loss of control.
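To ground the idea of cross-matching data trails, the following minimal Python sketch joins two invented data holdings, a cellular location log and a payment log, on a shared identifier. The records, field names and identifier scheme are hypothetical; the point is how easily circumstantial inferences fall out of a trivial join.

```python
from datetime import datetime, timedelta

# Hypothetical data trails held by two different organisations.
location_log = [
    {"person_id": 17, "time": "2023-05-01T19:55", "cell_tower": "CBD-04"},
    {"person_id": 17, "time": "2023-05-02T08:15", "cell_tower": "NTH-11"},
]
payment_log = [
    {"person_id": 17, "time": "2023-05-01T20:02", "merchant": "Bar X"},
]

def parse(ts: str) -> datetime:
    return datetime.fromisoformat(ts)

# A join on person_id plus a 30-minute window is enough to place an
# identified individual at a venue at a particular time: the kind of
# "circumstantial evidence" the dataveillance literature warns about.
for loc in location_log:
    for pay in payment_log:
        same_person = loc["person_id"] == pay["person_id"]
        close_in_time = abs(parse(loc["time"]) - parse(pay["time"])) <= timedelta(minutes=30)
        if same_person and close_in_time:
            print(f"Person {pay['person_id']} near {loc['cell_tower']} "
                  f"when paying at {pay['merchant']} ({pay['time']})")
```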

Loss of control 1.2.8

Michael et al. (2006a, p. 2) state, in the context of GPS tracking, that the issue of control is a leading ethical challenge given the invasive nature of this form of monitoring. The mode of control can differ depending on the context. For instance, in the business context control may take the form of directing or ‘pushing’ advertisements to a specific individual, while at the personal/individual level it could signify control in the manner of “self-direction” (Perusco et al. 2006, p. 93). Other forms of social control can also be exercised by governments and organisations (Clarke 2003b), while emerging LBS solutions intended for the consumer sector extend the notion of control to community members (Abbas et al. 2011). This is an area that has not been adequately addressed in the literature. The subsequent risks to the individual are summarised in the following passage:

“Location technologies therefore provide, to parties that have access to the data, the power to make decisions about the entity subject to the surveillance, and hence exercise control over it. Where the entity is a person, it enables those parties to make determinations, and to take action, for or against that person’s interests. These determinations and actions may be based on place(s) where the person is, or place(s) where the person has been, but also on place(s) where the person is not, or has not been” (Wigan and Clarke 2006, p. 393).

Therefore GPS and other location devices and technologies may result in decreased levels of control from the perspective of the individual being monitored. For example, in an article based on the use of scenarios to represent the social implications associated with the implementation of LBS, Perusco and Michael (2007) demonstrate the various facets of control in relation to LBS. The discussion is generally centred on the loss of control, which can be experienced in numerous ways, such as when a device does not operate accurately, or when an individual constantly monitors a family member in an attempt to care for them (Perusco and Michael 2007, pp. 6-7, 10). The authors raise valuable ideas with respect to control, such as the need to understand the purpose of control, the notion of consent, and developing methods to deal with location inaccuracies, amongst others (p. 14). Perusco and Michael further assert that control has a flow-on effect on other issues, such as trust, with the authors questioning whether it is viable to control individuals given the likely risk that trust may be relinquished in the process (p. 13).

Concurrent with loss of control, the issue of pre-emptive control with respect to LBS is a delicate one, specifically in relation to suspected criminals or offenders. Perusco et al. (2006, p. 92) state that the punishment of a crime is typically proportionate to the committed offence; thus the notion of pre-emptive monitoring can be considered fundamentally flawed given that individuals are being punished without having committed an offence. Rather, they are suspected of being a threat. According to Clarke and Wigan (2011), a person is perceived as a threat based on their “personal associations”, which can be determined using location and tracking technologies to establish the individual’s location in relation to others, and thus control them based on such details. This is where IoT fundamentally comes into play. While location information can tell us much about where an individual is at any point in time, it is IoT that will reveal inter-relationships, the frequency of interaction, and the specific application of measurable transactions. IoT is the layer that will bring things to be scrutinised in new ways.

This calls for an evaluation of LBS solutions that can be used for covert operations. Covert monitoring using LBS is often considered a useful technique, one that promotes less opposition than overt forms of monitoring, as summarised below:

“Powerful economic and political interests are seeking to employ location and tracking technologies surreptitiously, to some degree because their effectiveness is greater that way, but mostly in order to pre-empt opposition” (Clarke 2001b, p. 221).

Covert applications of LBS are increasingly available for the monitoring and tracking of social relations such as a partner or a child (Abbas et al. 2011). Regardless of whether covert or overt, using LBS for monitoring is essentially about control, irrespective of whether the act of controlling is motivated by necessity, or for more practical or supportive purposes (Perusco et al. 2006, p. 93). 

Studies focussing on user requirements for control 1.2.9

The control dimension is also significant in studies focussing on LBS users, namely, literature concerned with user-centric design, and user adoption and acceptance of LBS and related mobile solutions. In a paper focussing on understanding user requirements for the development of LBS, Bauer et al. (2005, p. 216) report on a user’s “fear” of losing control while interacting with mobile applications and LBS that may infringe on their personal life. The authors perceive loss of control to be a security concern requiring attention, and suggest that developers attempt to relieve the apprehension associated with increased levels of personalisation through ensuring that adequate levels of control are retained (Bauer et al. 2005, p. 216). This is somewhat supported by the research of Xu and Teo (2004, pp. 793-803), in which the authors suggest that there exists a relationship between control, privacy and intention to use LBS. That is, a loss of control results in a privacy breach, which in turn impacts on a user’s intention to embrace LBS.

The aforementioned studies, however, fail to explicitly incorporate the concept of value into their analyses. Due to the lack of literature discussing the three themes of privacy, value and control, Renegar et al. (2008, pp. 1-2) present the privacy-value-control (PVC) trichotomy as a paradigm beneficial for measuring user acceptance and adoption of mobile technologies. This paradigm stipulates the need to achieve harmony amongst the concepts of privacy, value and control in order for a technology to be adopted and accepted by the consumer. However, the authors note that perceptions of privacy, value and control are dependent on a number of factors or entities, including the individual, the technology and the service provider (Renegar et al. 2008, p. 9). Consequently, the outcomes of Renegar et al.’s study indicate that privacy does not obstruct the process of adoption, but rather that adoption must take into account the value proposition in addition to the amount of control granted.

Monitoring using LBS: control versus care? 1.2.10

The focus of the preceding sections has been on the loss of control, the dangers of pre-emptive control, covert monitoring, and user perspectives relating to the control dimension. However, this analysis should not be restricted to the negative implications arising from the use of LBS, but rather should incorporate both the control and care applications of LBS. For instance, while discussions of surveillance and the term in general typically invoke sinister images, numerous authors warn against assuming this subjective viewpoint. Surveillance should not be considered in itself as disagreeable. Rather, “[t]he problem has been the presumptiveness of its proponents, the lack of rational evaluation, and the exaggerations and excesses that have been permitted” (Clarke 2007a, p. 42). This viewpoint is reinforced in the work of Elliot and Phillips (2004, p. 474), and can also be applied to dataveillance.

The perspective that surveillance inevitably results in negative consequences, such as individuals possessing excessive amounts of control over each other, should be avoided. For instance, Lyon (2001, p. 2) speaks of the dual aspects of surveillance in that “[t]he same process, surveillance – watching over – both enables and constrains, involves care and control.” Michael et al. (2006a) reinforce such ideas in the context of GPS tracking and monitoring. The authors claim that GPS tracking has been employed for control purposes in various situations, such as policing/law enforcement, the monitoring of parolees and sex offenders, the tracking of suspected terrorists and the monitoring of employees (Michael et al. 2006a, pp. 2-3). However, the authors argue that additional contexts such as convenience and care must not be ignored, as GPS solutions may potentially simplify or enable daily tasks (convenience) or be used for healthcare or protection of vulnerable groups (care) (Michael et al. 2006a, pp. 3-4). Perusco and Michael (2005) further note that the tracking of such vulnerable groups indicates that monitoring activities are no longer limited to those convicted of a particular offence, but rather can be employed for protection and safety purposes. Table 1 provides a summary of GPS tracking and monitoring applications in the control, convenience and care contexts, adapted from Michael et al. (2006a, pp. 2-4), identifying the potentially constructive uses of GPS tracking and monitoring.

Table 1: GPS monitoring applications in the control, convenience and care contexts, adapted from Michael et al. (2006a, pp. 2-4)

It is crucial that in evaluating LBS control literature and establishing the need for LBS regulation, both the control and care perspectives are incorporated. The act of monitoring should not immediately conjure up sinister thoughts. The focus should preferably be directed to the important question of purpose or motives. Lyon (2007, p. 3) feels that purpose may exist anywhere on the broad spectrum between care and control. Therefore, as expressed by Elliot and Phillips (2004, p. 474), a crucial factor in evaluating the merit of surveillance activities and systems is determining “how they are used.” These sentiments are also applicable to dataveillance. It is helpful at this point to discuss alternative and related practices that may incorporate location information throughout the monitoring process.

Sousveillance 1.2.11

The term sousveillance, coined by Steve Mann, comes from the French terms sous which means from below, and veiller which means to watch (Mann et al. 2003, p. 332). It is primarily a form of “inverse surveillance” (Mann et al. 2003, p. 331), whereby an individual is in essence “surveilling the surveillers” (p. 332). Sousveillance is reliant on the use of wearable computing devices to capture audiovisual and sensory data (Mann 2005, p. 625). A major concern with respect to sousveillance, according to Mann (2005, p. 637), is the dissemination of the recorded data which for the purposes of this investigation, may include images of locations and corresponding geographic coordinates.

Sousveillance, ‘reflectionism’ and control 1.2.12

Relevant to the theme of control, it has been argued that sousveillance can be utilised as a form of resistance to unwarranted surveillance and control by institutions. According to Mann et al. (2003, p. 333), sousveillance is a type of reflectionism in which individuals can actively respond to bureaucratic monitoring and to an extent “neutralize surveillance”. Sousveillance can thus be employed in response to social control in that surveillance activities are reversed:

“The surveilled become sousveillers who engage social controllers (customs officials, shopkeepers, customer service personnel, security guards, etc.) by using devices that mirror those used by these social controllers” (Mann et al. 2003, p. 337).

Sousveillance differs from surveillance in that traditional surveillance activities are “centralised” and “localized”, whereas sousveillance is dispersed in nature and “delocalized” in its global coverage (Ganascia 2010, p. 496). As such, sousveillance requires new metaphors for understanding its fundamental aspects. A useful metaphor proposed by Ganascia (2010, p. 496) for describing sousveillance is the canopticon, which can be contrasted with the panopticon metaphor. At the heart of the canopticon are the following principles:

“total transparency of society, fundamental equality, which gives everybody the ability to watch – and consequently to control – everybody else, [and] total communication, which enables everyone to exchange with everyone else” (Ganascia 2010, p. 497).

This exchange may include the dissemination of location details, thus signalling the need to incorporate sousveillance into LBS regulatory discussions. A noteworthy element of sousveillance is that it shifts the ability to control from the state/institution (surveillance) to the individual. While this can initially be perceived as an empowering feature, excessive amounts of control, if unchecked, may prove detrimental. That is, control may be granted to individuals to disseminate their location (and other) information, or the information of others, without the necessary precautions in place and in an unguarded fashion. The implications of this exercise are sinister in their extreme forms. When considered within the context of IoT, sousveillance ideals are likely compromised. Yes, I can fight back against state control and Big Brother with sousveillance, but in doing so I potentially unleash a thousand or more little brothers, each with the capacity to (mis)use the information being gathered.

Towards überveillance 1.2.13

The concepts of surveillance, dataveillance and sousveillance have been examined with respect to their association with location services in an IoT world. It is therefore valuable, at this point, to introduce the related notion of überveillance. Überveillance, a term coined by M.G. Michael in 2006, can be described as “an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices in the human body” (Michael et al. 2006b; Macquarie Dictionary 2009, p. 1094). Überveillance combines the dimensions of identification, location and time, potentially allowing for forecasting and uninterrupted real-time monitoring (Michael and Michael 2007, pp. 9-10), and in its extreme forms can be regarded as “Big Brother on the inside looking out” (p. 10).

Überveillance is considered by several authors to be the contemporary notion that will supplant surveillance. For instance, Clarke (2007a, p. 27) suggests that the concept of surveillance is somewhat outdated and that contemporary discussions be focussed on the notion of überveillance. It has further been suggested that überveillance is built on the existing notion of dataveillance. That is, “[ü]berveillance takes that which was static or discrete in the dataveillance world, and makes it constant and embedded” (Michael and Michael 2007, p. 10). The move towards überveillance thus marks the evolution from physical, visual forms of monitoring (surveillance), through to the increasingly sophisticated and ubiquitous embedded chips (überveillance) (Michael & Michael 2010; Gagnon et al. 2013). Albrecht and McIntyre (2005) describe these embedded chips as “spychips”, focusing predominantly on the RFID tracking of people through retail goods and services, and they devote considerable space to describing the Internet of Things concept. Perakslis and Wolk (2006) studied the social acceptance of RFID implants as a security method, and Perakslis later went on to incorporate überveillance into her research on behavioural motivators and personality factors in the adoption of humancentric IoT applications.

Given that überveillance is an emerging term (Michael and Michael 2007, p. 9), diverse interpretations have been proposed. For example, Clarke (2007a) offers varying definitions of the term, suggesting that überveillance can be understood as any of the following: omni-surveillance, an apocalyptic notion that “applies across all space and all time (omnipresent), and supports some organisation that is all-seeing and even all-knowing (omniscient)”, which can be achieved through the use of embedded chips for instance (p. 33); exaggerated surveillance, referring to “the extent to which surveillance is undertaken... its justification is exaggerated” (p. 34); and/or meta-, supra-, or master-surveillance, which “could involve the consolidation of multiple surveillance threads in order to develop what would be envisaged by its proponents to be superior information” (p. 38). Shay et al. (2012) acknowledge:

“The pervasive nature of sensors coupled with recent advances in data mining, networking, and storage technologies creates tools and data that, while serving the public good, also create a ubiquitous surveillance infrastructure ripe for misuse. Roger Clarke’s concept of dataveillance and M.G. Michael and Katina Michael’s more recent uberveillance serve as important milestones in awareness of the growing threat of our instrumented world.”

All of these definitions indicate direct ways in which IoT applications can also be rolled out, whether for vehicle management in heavy traffic conditions, the tracking of suspects in a criminal investigation, or even of employees in a workplace. Disturbing is the manner in which a whole host of applications, particularly in tollways and public transportation, are being used for legal purposes without the knowledge of the driver or commuter. “Tapping” token cards is not only encouraged but mandatory at most metropolitan train stations in developed countries. Little do commuters know that the data gathered by these systems can be requested by a host of government agencies without a warrant.

Implications of überveillance on control 1.2.14

Irrespective of interpretation, the subject of current scholarly debate relates to the implications of überveillance on individuals in particular, and society in general. In an article discussing the evolution of automatic identification (auto-ID) techniques, Michael and Michael (2005) present an account of the issues associated with implantable technologies in humancentric applications. The authors note the evident trend of deploying a technology into the marketplace, prior to assessing the potential consequences (Michael and Michael 2005, pp. 22-33). This reactive approach causes apprehension in view of chip implants in particular, given the inexorable nature of embedded chips, and the fact that once the chip is accepted by the body, it is impossible to remove without an invasive surgical procedure, as summarised in the following excerpt:

“[U]nless the implant is removed within a short time, the body will adopt the foreign object and tie it to tissue. At this moment, there will be no exit strategy, no contingency plan, it will be a life enslaved to upgrades, virus protection mechanisms, and inescapable intrusion” (Michael and Michael 2007, p. 18).

Other concerns relevant to this investigation have also been raised. It is indicated that “über-intrusive technologies” are likely to leave substantial impressions on individuals, families and other social relations, with the added potential of affecting psychological well-being (Michael and Michael 2007, p. 17). Apart from implications for individuals, concerns also emerge at the broader social level that require remedies. For instance, if a state of überveillance is to be avoided, caution must be exercised in deploying technologies without due reflection of the corresponding implications. Namely, this will involve the introduction of appropriate regulatory measures, which will encompass proactive consideration of the social implications of emerging technologies and individuals assuming responsibility for promoting regulatory measures (Michael and Michael 2007, p. 20). It will also require a measured attempt to achieve some form of “balance” (Clarke 2007a, p. 43). The implications of überveillance are of particular relevance to LBS regulatory discussions, given that “overarching location tracking and monitoring is leading toward a state of überveillance” (Michael and Michael 2011, p. 2). As such, research into LBS regulation in Australia must be sensitive to both the significance of LBS to überveillance and the anticipated trajectory of the latter.

Unfortunately the same cannot be said for IoT-specific regulation. IoT is a fluid concept, and in many ways IoT is nebulous. It is made up of a host of technologies that are being integrated and are converging over time. It is layer upon layer of infrastructure that has emerged, from the inception of the first telephone lines to today’s cloud and wireless Internet. IoT requires new protocols and new applications, but it is difficult to point to a specific technology, application or system that can be subject to some form of external oversight. Herein lie the problems of potential unauthorised disclosure of data, or even misuse of data, when government agencies require private enterprise to act upon their requests, or when private enterprises work together in sophisticated ways to exploit the consumer.

Comparing the different forms of ‘veillance’ 1.2.15

Various terms ending in ‘veillance’ have been introduced throughout this paper, all of which imply and encompass the process of monitoring. Prior to delving into the dangers of this activity and the significance of LBS monitoring on control, it is helpful to compare the main features of each term. A comparison of surveillance, dataveillance, sousveillance, and überveillance is provided in Table 2.

It should be noted that with the increased use of techniques such as surveillance, dataveillance, sousveillance and überveillance, the threat of becoming a surveillance society looms. According to Ganascia (2010, p. 491), a surveillance society is one in which the data gathered from the aforementioned techniques is utilised to exert power and control over others. This results in dangers such as the potential for identification and profiling of individuals (Clarke 1997), the latter of which can be associated with social sorting (Gandy 1993).

Table 2: Comparison of the different forms of ‘veillance’

Identification 1.2.16

Identity and identification are ambiguous terms with philosophical and psychological connotations (Kodl and Lokay 2001, p. 129). Identity can be perceived as “a particular presentation of an entity, such as a role that the entity plays in particular circumstances” (Clarke and Wigan 2011). With respect to information systems, human identification specifically (as opposed to object identification) is therefore “the association of data with a particular human being” (Kodl and Lokay 2001, pp. 129-130). Kodl and Lokay (2001, pp. 131-135) claim that numerous methods exist to identify individuals prior to performing a data linkage, namely, using appearance, social interactions/behaviours, names, codes and knowledge, amongst other techniques. With respect to LBS, these identifiers significantly contribute to the dangers pertaining to surveillance, dataveillance, sousveillance and überveillance. That is, LBS can be deployed to simplify and facilitate the process of tracking, and can be used for the collection of profile data that can potentially be linked to an entity using a given identification scheme. In a sense, LBS in their own right become an additional form of identification feeding the IoT scheme (Michael and Michael, 2013).
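To illustrate how a location trail can itself act as an identifier, the following Python sketch links a "pseudonymous" trail to a named individual by matching its modal night-time fix against an assumed address register. All coordinates, names and the register itself are invented for illustration; real re-identification would of course be messier, but the principle of data linkage is the same.

```python
from collections import Counter

# Pseudonymous trail for device "A17": (hour_of_day, rounded lat/lon) fixes.
trail = [
    (23, (-34.428, 150.893)), (2, (-34.428, 150.893)), (3, (-34.428, 150.893)),
    (10, (-34.405, 150.878)), (14, (-34.405, 150.878)),
]

# Hypothetical address register mapping residential coordinates to residents.
address_register = {
    (-34.428, 150.893): "J. Citizen",
    (-34.401, 150.870): "A. Resident",
}

# The modal night-time location is a strong proxy for "home".
night_fixes = [loc for hour, loc in trail if hour >= 22 or hour <= 5]
home = Counter(night_fixes).most_common(1)[0][0]

print(address_register.get(home, "no match"))  # -> 'J. Citizen'
```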

Thus, in order to address the regulatory concerns pertaining to LBS, it is crucial to appreciate the challenges regarding the identification of individuals. Of particular importance is the recognition that once an individual has been identified, they can be subjected to varying degrees of control. As such, in any scheme that enables identification, Kodl and Lokay (2001, p. 136) note the need to balance human rights with other competing interests, particularly given that identification systems may be exploited by powerful entities for control purposes, such as by governments to exercise social control. For an historical account of identification techniques, from manual methods through to automatic identification systems including those built on LBS, see Michael and Michael (2009, pp. 43-60). Assertions by civil libertarians and concerned individuals that automatic identification (auto-ID) technology “impinges on human rights, the right to privacy, and that eventually it will lead to totalitarian control of the populace” have been put forward since at least the 1970s (Michael and Michael 2009, p. 364). These views are also pertinent to the notion of social sorting.

Social sorting 1.2.17

In relation to the theme of control, information derived from surveillance, dataveillance, sousveillance and überveillance techniques can also serve the purpose of social sorting, labelled by Oscar Gandy (1993, p. 1) as the “panoptic sort.” Relevant to this discussion, the information may relate to an individual’s location. In Gandy’s influential work The Panoptic Sort: A Political Economy of Personal Information, the author relies on the work of Michel Foucault and other critical theorists (refer to pp. 3-13) in examining the panoptic sort as an “antidemocratic system of control” (Gandy 1993, p. 227). According to Gandy, in this system, individuals are exposed to prejudiced forms of categorisation based on both economic and political factors (pp. 1-2). Lyon (1998, p. 94) describes the database management practices associated with social sorting, classing them as a form of consumer surveillance in which customers are grouped by “social type and location.” Such clustering forms the basis for the exclusion and marginalisation of individuals (King 2001, pp. 47-49). As a result, social sorting is presently used for the profiling of individuals and in the market research realm (Bennett and Regan 2004, p. 452).

Profiling 1.2.18

Profiling “is a technique whereby a set of characteristics of a particular class of person is inferred from past experience, and data-holdings are then searched for individuals with a close fit to that set of characteristics” (Clarke 1993). The process is centred on the creation of a profile or model related to a specific individual, based on data aggregation processes (Casal 2004, p. 108). Assorted terms have been employed in labelling this profile. For instance, the model created of an individual using the data collected through dataveillance techniques has been referred to by Clarke (1997) as “the digital persona”, and is related to the “digital dossiers” idea introduced by Solove (2004, pp. 1-7). According to Clarke (1994), the use of networked systems, namely the internet, involves communicating and exposing data and certain aspects of, at times, recognisable behaviour, both of which are utilised in the creation of a personality.
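As a deliberately simplified illustration of Clarke's definition above, the following Python sketch assembles a characteristic set and searches a data holding for individuals who are a close fit. The profile attributes, matching threshold and records are all hypothetical; the sketch shows only the mechanics of profile matching, not any actual system.

```python
# A characteristic set "inferred from past experience" (invented attributes).
profile = {"visits_gym": True, "night_travel": True, "suburb": "CBD"}

# A hypothetical data holding of individuals' derived attributes.
data_holding = [
    {"id": 1, "visits_gym": True,  "night_travel": True,  "suburb": "CBD"},
    {"id": 2, "visits_gym": False, "night_travel": True,  "suburb": "CBD"},
    {"id": 3, "visits_gym": True,  "night_travel": False, "suburb": "North"},
]

def fit(record: dict, profile: dict) -> float:
    """Fraction of profile characteristics that the record matches."""
    return sum(record.get(k) == v for k, v in profile.items()) / len(profile)

# Flag anyone matching at least two of the three characteristics.
matches = [r["id"] for r in data_holding if fit(r, profile) >= 2 / 3]
print(matches)  # -> [1, 2]
```

Note that individual 2 is flagged despite fitting only part of the profile, which is precisely the "close fit" logic that makes profiling prone to false positives.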

Digital personas and dossiers 1.2.19

The resulting personality is referred to as the digital persona. Similarly, digital dossiers refer to the compilation of comprehensive electronic data related to an individual, utilised in the creation of the “digital person” (Solove 2004, p. 1), also referred to as “digital biographies” (Solove 2002, p. 1086), which are discussed further by Solove (2002). In examining the need for LBS regulation throughout the globe, a given regulatory response or framework must appreciate the ease with which (past, present and future) location information can be compiled and integrated into an individual’s digital persona or dossier. Once such information is reproduced and disseminated, the control implications are magnified.

With respect to the theme of control, an individual can exercise a limited amount of influence over their digital persona, as some aspects of creating an electronic personality may not be within their direct control. The scope of this article does not allow for reflection on the digital persona in great detail; however, Clarke (1994) offers a thorough investigation of the term, and of associated notions such as the passive and active digital persona, in addition to the significance of the digital persona to dataveillance techniques such as computer matching and profiling. Of particular significance to this research is the distinction between the physical and the digital persona and the resultant implications in relation to control, as summarised in the following extract:

“The physical persona is progressively being replaced by the digital persona as the basis for social control by governments, and for consumer marketing by corporations. Even from the strictly social control and business efficiency perspectives, substantial flaws exist in this approach. In addition, major risks to individuals and society arise” (Clarke 1994).

The same sentiments apply with respect to digital dossiers. In particular, Solove (2004, p. 2) notes that individuals are unaware of the ways in which their electronic data is exploited by government and commercial entities, and “lack the power to do much about it.” It is evident that profile data is advantageous for both social control and commercial purposes (Clarke 2001d, p. 12), the latter of which is associated with market research and sorting activities, which have evolved from ideas of “containment” of mobile consumer demand to the “control” model (Arvidsson 2004, pp. 456, 458-467). The control model in particular has been strengthened, but not solely driven, by emerging technologies including LBS, as explained:

“The control paradigm thus permits a tighter and more efficient surveillance that makes use of consumer mobility rather than discarding it as complexity. This ability to follow the consumer around has been greatly strengthened by new technologies: software for data mining, barcode scans, internet tracking devices, and lately location based information from mobile phones” (Arvidsson 2004, p. 467).

Social sorting, particularly for profiling and market research purposes, thus introduces numerous concerns relating to the theme of control, one of which is the ensuing consequences relating to personal privacy. This specifically includes the privacy of location information. In sum, examining the current regulatory framework for LBS in Australia, and determining the need for LBS regulation, necessitates an appreciation of the threats associated with social sorting using information derived from LBS solutions. Additionally, the benefits and risks associated with surveillance, dataveillance, sousveillance and überveillance for control must be measured and carefully contemplated in the proposed regulatory response.

Trust 1.3

Trust is a significant theme relating to LBS, given the importance of the notion to: (a) “human existence” (Perusco et al. 2006, p. 93; Perusco and Michael 2007, p. 10), (b) relationships (Lewis and Weigert 1985, pp. 968-969), (c) intimacy and rapport within a domestic relationship (Boesen et al. 2010, p. 65), and (d) LBS success and adoption (Jorns and Quirchmayr 2010, p. 152). Trust can be defined, in general terms, as the “firm belief in the reliability, truth, or ability of someone or something” (Oxford Dictionary 2012b). A definition of trust that has been widely cited in relevant literature is “the willingness of a party to be vulnerable to the actions of another party based on the expectation that the other will perform a particular action important to the trustor, irrespective of the ability to monitor or control that other party” (Mayer et al. 1995, p. 712). Related to electronic relationships or transactions, the concept has been defined as the “confident reliance by one party on the behaviour of other parties” (Clarke 2001c, p. 291), and it has been suggested that in the electronic-commerce domain, in particular, trust is intimately associated with the disclosure of information (Metzger 2004).

In reviewing literature concerning trust, Fusco et al. (2011, p. 2) claim that trust is typically described as a dynamic concept falling into the categories of cognitive (evidence based), emotional (faith-based), and/or behavioural (conduct-based) trust. For further reading, the major sources on trust can be found in: Lewis and Weigert's (1985) sociological treatment of trust, the influential work of Mayer et al. (1995) and the authors’ updated work Schoorman et al. (2007) centred on organisational trust, Weckert’s (2000) comprehensive review of trust in the context of workplace monitoring using electronic devices, research on trust in electronic-commerce (refer to McKnight and Chervany 2001; Pavlou 2003; Kim et al. 2009) and mobile-commerce (see Siau and Shen 2003; Yeh and Li 2009), the work of Valachich (2003) that introduces and evaluates trust in terms of ubiquitous computing environments, Dwyer et al.’s (2007) article on trust and privacy issues in social networks, Yan and Holtmanns’ (2008) examination of issues associated with digital trust management, the work of Chen et al. (2008) covering the benefits and concerns of LBS usage including privacy and trust implications, and the research by Junglas and Spitzmüller (2005) that examines privacy and trust issues concerning LBS by presenting a research model that incorporates these aspects amongst others.

For the purpose of this paper, the varying definitions and categorisations are acknowledged. However, trust will be assessed in terms of the relationships dominating existing LBS/IoT scholarship which comprise the government-citizen relationship centred on trust in the state, the business-consumer relationship associated with trust in corporations/LBS providers, and the consumer-consumer relationship concerned with trust in individuals/others.

Trust in the state 1.3.1

Trust in the state broadly covers LBS solutions implemented by government, thus representing the government-citizen relationship. Dominating current debates and literature are government LBS initiatives in the form of emergency management schemes, in conjunction with national security applications utilising LBS, which, depending on the nature of their implementation, may impact on citizens’ trust in the state. These concerns are typically expressed as a trade-off between security and safety on the one hand, and privacy and civil liberties on the other. At present there are very few examples of fully-fledged IoT systems to point to, although quasi-IoT systems are increasingly being deployed using wireless sensor networks of varying kinds, e.g. for bushfire management and for fisheries. These systems do not include a direct human stakeholder but are still relevant, as they may trigger flow-on effects that do impact the citizenry.

Balancing trust and privacy in emergency services 1.3.2

In the context of emergency management, Aloudat and Michael (2011, p. 58) maintain that the dominant theme between government and consumers in relation to emergency warning messages and systems is trust. This includes trust in the LBS services being delivered and in the government itself (Aloudat and Michael 2011, p. 71). While privacy is typically believed to be the leading issue confronting LBS, in emergency and life-threatening situations it is overwhelmed by trust-related challenges, given that users are generally willing to relinquish their privacy in the interest of survival (Aloudat and Michael 2010, p. 2). Furthermore, the success of these services is reliant on trust in the technology, the service, and the accuracy/reliability/timeliness of the emergency alert. On the whole, this success can be measured in terms of citizens’ confidence in their government’s ability to sensibly select and implement a fitting emergency service utilising enhanced LBS features. In a paper that examines the deployment of location services in Dutch public administration, van Ooijen and Nouwt (2009, p. 81) assess the impact of government-based LBS initiatives on the government-citizen relationship, recommending that governments employ care in gathering and utilising location-based data about the public, to ensure that citizens' trust in the state is not compromised.
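To make the mechanics of such a service concrete, a location-based emergency warning can be thought of as a geo-targeting step followed by a delivery step. The sketch below is a simplified, hypothetical illustration (the coordinates, hazard radius and subscriber records are invented, and it is not drawn from Aloudat and Michael’s work); it uses a haversine distance check to select recipients inside a hazard zone.

```python
# Simplified, hypothetical sketch of geo-targeting an emergency warning:
# select subscribers whose last known position lies within a hazard radius.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

hazard_centre = (-34.42, 150.89)   # invented hazard location
hazard_radius_km = 10.0

subscribers = [
    {"msisdn": "+6140000001", "lat": -34.43, "lon": 150.88},
    {"msisdn": "+6140000002", "lat": -33.87, "lon": 151.21},
]

recipients = [
    s["msisdn"] for s in subscribers
    if haversine_km(s["lat"], s["lon"], *hazard_centre) <= hazard_radius_km
]

for msisdn in recipients:
    print(f"send warning to {msisdn}")   # delivery step (SMS/cell broadcast) omitted
```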

Trust-related implications of surveillance in the interest of national security 1.3.3

Trust is also prevalent in discussions relating to national security. National security has been regarded as a priority area for many countries for over a decade, and as such has prompted the implementation of surveillance schemes by government. Wigan and Clarke (2006, p. 392) discuss the dimension of trust as a significant theme contributing to the social acceptance of a particular government surveillance initiative, which may incorporate the location and tracking of individuals and objects. The implementation of surveillance systems by the state, including those incorporating LBS, can diminish the public’s confidence in the state, given the potential for such mechanisms to be perceived as a form of authoritarian control. Nevertheless, a situation where national security and safety are considered to be in jeopardy may entail (partial) acceptance of various surveillance initiatives that would otherwise be perceived as objectionable. In such circumstances, trust in government plays a crucial role in determining individuals’ willingness to compromise various civil liberties. This is explained by Davis and Silver (2004, p. 35) below:

“The more people trust the federal government or law enforcement agencies, the more willing they are to allow the government leeway in fighting the domestic war on terrorism by conceding some civil liberties.”

However, in due course it is expected that such increased security measures (even if initially supported by citizens) will yield a growing gap between government and citizens, “potentially dampening citizen participation in government and with it reducing citizens’ trust in public institutions and officials” (Gould 2002, p. 77). This occurs as the perceived degree of threat, and trust in government, diminish over time, resulting in the public’s reluctance to surrender their rights for the sake of security (Sanquist et al. 2008, p. 1126). In order to build and maintain trust, governments are required to be actively engaged in developing strategies to build confidence both in their own abilities and in the technology under consideration, and are challenged to recognise “the massive harm that surveillance measures are doing to public confidence in its institutions” (Wigan and Clarke 2006, p. 401). It has been suggested that a privacy impact assessment (PIA) aids in establishing trust between government and citizens (Clarke 2009, p. 129). Carefully considered legislation is an alternative technique for enhancing levels of trust. With respect to LBS, governments are responsible for proposing and enacting regulation that is in the best interest of citizens, incorporating citizen concerns into this process and encouraging suitable design of LBS applications, as explained in the following quotation:

“...new laws and regulations must be drafted always on the basis of citizens’ trust in government authorities. This means that citizens trust the government to consider the issues at stake according to the needs and wishes of its citizens. Location aware services can influence citizens’ trust in the democratic society. Poorly designed infrastructures and services for storing, processing and distributing location-based data can give rise to a strong feeling of being threatened. Whereas a good design expands the feeling of freedom and safety, both in the private and in the public sphere/domain” (Beinat et al. 2007, p. 46).

One of the biggest difficulties that will face stakeholders is identifying when current LBS systems become a part of bigger IoT initiatives. Major changes in systems will require a re-evaluation of impact assessments of different types.

Need for justification and cultural sensitivity 1.3.4

Techniques of this nature will fail to be embraced, however, if surveillance schemes lack adequate substantiation at the outset, as trust is threatened by “absence of justification for surveillance, and of controls over abuses” (Wigan and Clarke 2006, p. 389). From a government perspective, this situation may prove detrimental, as Wigan and Clarke (2006, p. 401) claim that transparency and trust are prerequisites for ensuring public confidence in the state, noting that “[t]he integrity of surveillance schemes, in transport and elsewhere, is highly fragile.” Aside from adequate justification of surveillance schemes, cultural differences associated with the given context need to be acknowledged as factors influencing the level of trust citizens hold in government. As explained by Dinev et al. (2005, p. 3) in their cross-cultural study of American and Italian Internet users' privacy and surveillance concerns, “[a]ttitudes toward government and government initiatives are related to the culture’s propensity to trust.” In comparing the two contexts, Dinev et al. claim that Americans readily accept government surveillance in exchange for increased levels of security, whereas Italians’ low levels of trust in government result in opposing viewpoints (pp. 9-10).

Trust in corporations/LBS/IoT providers 1.3.5

Trust in corporations/LBS/IoT providers emerges from the level of confidence a user places in an organisation and its respective location-based solution(s), and corresponds to the business-consumer relationship. In the context of consumer privacy, Culnan and Bies (2003, p. 327) assert that perceived trust in an organisation is closely linked to the extent to which its practices are aligned with its policies. A breach of this trust affects the likelihood of personal information disclosure in the future (Culnan and Bies 2003, p. 328), given the value of trust in sustaining lasting customer relationships (p. 337). Reducing this “trust gap” (Culnan and Bies 2003, pp. 336-337) is a defining element for organisations in achieving economic and industry success, as it may influence a consumer’s willingness to contemplate location data usage (Chen et al. 2008, p. 34). Reducing this gap requires that control over location details remain with the user, as opposed to the LBS provider or network operator (Giaglis et al. 2003, p. 82). Trust can thus emerge from a user’s perception that they are in command (Junglas and Spitzmüller 2005, p. 3).
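The principle that control over location details should remain with the user can be expressed as a simple design rule: no disclosure without a current, purpose-specific consent. The sketch below is a minimal illustration of such a consent gate; the class, policy fields and requesters are assumptions for illustration, not an actual provider API.

```python
# Minimal sketch of a user-held consent gate: location is only released to a
# requesting party if the user has granted consent for that party and purpose.
class ConsentRegistry:
    def __init__(self):
        self._grants = set()   # (requester, purpose) pairs approved by the user

    def grant(self, requester, purpose):
        self._grants.add((requester, purpose))

    def revoke(self, requester, purpose):
        self._grants.discard((requester, purpose))

    def allows(self, requester, purpose):
        return (requester, purpose) in self._grants

def disclose_location(consent, requester, purpose, current_position):
    if not consent.allows(requester, purpose):
        return None   # deny by default: control stays with the user
    return current_position

consent = ConsentRegistry()
consent.grant("friend_app", "social check-in")
print(disclose_location(consent, "friend_app", "social check-in", (-34.42, 150.89)))
print(disclose_location(consent, "ad_network", "profiling", (-34.42, 150.89)))  # None
```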

Küpper and Treu (2010, pp. 216-217) concur with these assertions, explaining that the lack of uptake of first-generation LBS applications was chiefly a consequence of the dominant role of the network operator over location information. This situation has been somewhat rectified since the introduction of GPS-enabled devices capable of determining location information without input from the network operator, and the greater emphasis on a user-focussed model (Bellavista et al. 2008, p. 85; Küpper and Treu 2010, p. 217). Trust, however, is not exclusively concerned with a network operator’s ability to determine location information, but also with the possible misuse of location data. As such, trust has also been framed as a potential resolution to location data misappropriation, explained further by Jorns and Quirchmayr (2010, p. 152) in the following excerpt:

“The only way to completely avoid misuse is to entirely block location information, that is, to reject such services at all. Since this is not an adequate option... trust is the key to the realization of mobile applications that exchange sensitive information.”

There is much to learn from the covert and overt location tracking of subscribers by large corporations. Increasingly, the dubious practices of retaining location information by information and communication technology giants Google, Apple and Microsoft are being reported, with only small penalties being applied in countries in the European Union and Asia. Disturbing in this trend is that even smaller suppliers of location-based applications are beginning to unleash unethical (but seemingly not illegal) solutions at shopping malls and other campus-based locales (Michael & Clarke 2013).

Importance of identity and privacy protection to trust 1.3.6

In delivering trusted LBS solutions, Jorns and Quirchmayr (2010, pp. 151-155) further claim that identity and privacy protection are central considerations that must be built into a given solution, proposing an LBS architecture that integrates such safeguards. That is, identity protection may involve the use of false dummies, dummy users and landmark objects, while privacy protection generally relies on decreasing the resolution of location data, employing supportive regulatory techniques and ensuring anonymity and pseudonymity (Jorns and Quirchmayr 2010, p. 152). Similarly, and with respect to online privacy, Clarke (2001c, p. 297) suggests that an adequate framework must be introduced that “features strong and comprehensive privacy laws, and systematic enforcement of those laws.” These comments, also applicable to LBS in a specific sense, were made in the context of economic rather than social relationships, referring primarily to government and corporations, but are also relevant to trust amongst social relations.
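Two of the safeguards mentioned by Jorns and Quirchmayr, reducing the resolution of location data and hiding the true position among dummy reports, can be sketched in a few lines. The example below is illustrative only; the rounding granularity, jitter and dummy count are arbitrary choices and are not taken from their proposed architecture.

```python
# Illustrative sketches of two LBS privacy safeguards: spatial resolution
# reduction (coarsening coordinates) and dummy-location generation.
import random

def reduce_resolution(lat, lon, decimals=2):
    """Coarsen a position to roughly 1 km granularity by rounding precision away."""
    return round(lat, decimals), round(lon, decimals)

def with_dummies(lat, lon, n_dummies=4, jitter=0.05):
    """Return the true position hidden among randomly jittered dummy positions."""
    reports = [(lat + random.uniform(-jitter, jitter),
                lon + random.uniform(-jitter, jitter)) for _ in range(n_dummies)]
    reports.append((lat, lon))
    random.shuffle(reports)   # the provider cannot tell which report is genuine
    return reports

true_position = (-34.4278, 150.8931)
print(reduce_resolution(*true_position))   # e.g. (-34.43, 150.89)
print(with_dummies(*true_position))
```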

It is important to recognise that issues of trust are closely related to privacy concerns from the perspective of users. In an article titled “Trust and Transparency in Location-Based Services: Making Users Lose their Fear of Big Brother”, Böhm et al. (2004, pp. 1-3) claim that operators and service providers are charged with the difficult task of earning consumer trust, and that this may be achieved by addressing user privacy concerns and adhering to relevant legislation. Additional studies also point to the relationship between trust and privacy, claiming that trust can aid in reducing the perceived privacy risk for users. For example, Xu et al. (2005) suggest that enhancing trust can reduce the perceived privacy risk, which in turn influences a user's decision to disclose information, and that “service provider’s interventions including joining third party privacy seal programs and introducing device-based privacy enhancing features could increase consumers’ trust beliefs and mitigate their privacy risk perceptions” (Xu et al. 2005, p. 905). Chellappa and Sin (2005, pp. 188-189), in examining the link between trust and privacy, express the importance of trust building, which includes a consumer’s familiarity and previous experience with the organisation.

Maintaining consumer trust 1.3.7

The primary consideration in relation to trust in the business-consumer relationship is that all efforts be targeted at establishing and building trust in corporations and LBS/IoT providers. Once trust has been compromised, it is very difficult to repair, a point applicable to trust in any context. This point is explained by Kaasinen (2003, p. 77) in an interview-based study regarding user requirements for location-aware mobile applications:

“The faith that the users have in the technology, the service providers and the policy-makers should be regarded highly. Any abuse of personal data can betray that trust and it will be hard to win it back again.”

Trust in individuals/others 1.3.8

Trust in the consumer-to-consumer setting is determined by the level of confidence existing between an individual and their social relations, which may include friends, parents, other family members, employers and strangers, categories that are adapted from Levin et al. (2008, pp. 81-82). Yan and Holtmanns (2008, p. 2) express the importance of trust for social interactions, claiming that “[s]ocial trust is the product of past experiences and perceived trustworthiness.” It has been suggested that LBS monitoring can erode trust between the individual engaged in monitoring and the subject being monitored, as the very act implies that trust is lacking in the relationship (Perusco et al. 2006, p. 93). These concerns are echoed in Michael et al. (2008). Previous studies relevant to LBS and trust generally focus on: the workplace situation, that is, trust between an employer and their employee; trust amongst ‘friends’ subscribed to a location-based social networking (LBSN) service, which may include any of the predefined categories above; and the tracking of family members, such as children, for safety and protection purposes, together with the related trust implications.

Consequences of workplace monitoring 1.3.9

With respect to trust in an employer’s use of location-based applications and location data, a prevailing subject in the existing literature is the impact of employee monitoring systems on staff. For example, in studying the link between electronic workplace monitoring and trust, Weckert (2000, p. 248) reported that trust is a significant issue resulting from excessive monitoring, in that monitoring may contribute to a deterioration in the professional relationship between an employer and their employee and consequently reduce or eliminate trust. Weckert’s work reveals that employers often justify electronic monitoring on the argument that the “benefits outweigh any loss of trust”, and these benefits may include gains for the involved parties: notably, for the employer in the form of economic benefits; for the employee, in encouraging improvements to performance and productivity; and for the customer, who may experience enhanced customer service (p. 249). Chen and Ross (2005, p. 250), on the other hand, argue that an employer’s decision to monitor their subordinates may be related to a low degree of existing trust, which could be a result of unsuitable past behaviour on the part of the employee. As such, employers may perceive monitoring as necessary in order to manage employees. Alternatively, from the perspective of employees, trust-related issues materialise as a result of monitoring, which may affect job attitudes, including satisfaction and dedication, as covered in a paper by Alder et al. (2006) in the context of internet monitoring.

When applied to location monitoring of employees using LBS, the trust-related concerns expressed above are indeed warranted. Particularly, Kaupins and Minch (2005, p. 2) argue that the appropriateness of location monitoring in the workplace can be measured from either a legal or ethical perspective, which inevitably results in policy implications for the employer. The authors emphasise that location monitoring of employees can often be justified in terms of the security, productivity, reputational and protective capabilities of LBS (Kaupins and Minch 2005, p. 5). However, Kaupins and Minch (2005, pp. 5-6) continue to describe the ethical factors “limiting” location monitoring in the workplace, which entail the need for maintaining employee privacy and the restrictions associated with inaccurate information, amongst others. These factors will undoubtedly affect the degree of trust between an employer and employee.

However, the underlying concern relevant to this discussion of location monitoring in the workplace is not only the suitability of employee monitoring using LBS. While this is a valid issue, the challenge remains centred on the deeper trust-related consequences. Regardless of the technology or applications used to monitor employees, it can be concluded that a work atmosphere lacking trust results in sweeping consequences that extend beyond the workplace, expressed in the following excerpt:

“A low trust workplace environment will create the need for ever increasing amounts of monitoring which in turn will erode trust further. There is also the worry that this lack of trust may become more widespread. If there is no climate of trust at work, where most of us spend a great deal of our life, why should there be in other contexts? Some monitoring in some situations is justified, but it must be restricted by the need for trust” (Weckert 2000, p. 250).

Location-monitoring amongst friends 1.3.10

These concerns are certainly applicable to the use of LBS applications amongst other social relations. Recent literature merging the concepts of LBS, online social networking and trust is particularly focused on the use of LBSN applications amongst various categories of friends. For example, Fusco et al.'s (2010) qualitative study examines the impact of LBSN on trust amongst friends, employing a focus group methodology in achieving this aim. The authors reveal that trust may suffer as a consequence of LBSN usage in several ways: disclosure of location information and potential monitoring activities can encourage misuse of the application in order to conceal things; excessive questioning can contribute to a deterioration in trust amongst social relations; and trust may come to be placed in the application rather than in the friend (Fusco et al. 2010, p. 7). Further information relating to Fusco et al.’s study, particularly the manner in which LBSN applications adversely impact on trust, can be found in a follow-up article (Fusco et al. 2011).

Location tracking for protection 1.3.11

It has often been suggested that monitoring in familial relations can offer a justified means of protection, particularly in relation to vulnerable individuals such as Alzheimer’s or dementia sufferers and children. With specific reference to the latter, trust emerges as a central theme relating to child tracking. In an article by Boesen et al. (2010), location tracking in families is evaluated, including the manner in which LBS applications are incorporated within the familial context. The qualitative study conducted by the authors revealed that the initial decision by participants with children to use LBS was motivated by a lack of existing trust within the given relationship, with participants reporting an improvement in their children's behaviour after a period of tracking (Boesen et al. 2010, p. 70). Boesen et al., however, warn of the trust-related consequences, claiming that “daily socially-based trusting interactions are potentially replaced by technologically mediated interactions” (p. 73). Lack of trust in a child is considered detrimental to their growth, and signalling to a child that they are not trusted through the use of technology, specifically location monitoring applications, may result in long-term implications. The importance of trust to the growth of a child and the dangers associated with ubiquitous forms of supervision are explained in the following excerpt:

“Trust (or at least its gradual extension as the child grows) is seen as fundamental to emerging self-control and healthy development... Lack of private spaces (whether physical, personal or social) for children amidst omni-present parental oversight may also create an inhibiting dependence and fear” (Marx and Steeves 2010, p. 218).

Furthermore, location tracking of children and other individuals in the name of protection may result in undesirable and contradictory consequences relevant to trust. Barreras and Mathur (2007, p. 182), in an article that describes the advantages and disadvantages of wireless location tracking, argue that technologies originally intended to protect family members (notably children, and other social relations such as friends and employees), can impact on trust and be regarded as “unnecessary surveillance.” The outcome of such tracking and reduced levels of trust may also result in a “counterproductive” effect if the tracking capabilities are deactivated by individuals, rendering them incapable of seeking assistance in actual emergency situations (Barreras and Mathur 2007, p. 182).

LBS/IoT is a ‘double-edged sword’ 1.3.12

In summary, location monitoring and tracking by the state, corporations and individuals is often justified in terms of the benefits that can be delivered to the party responsible for monitoring/tracking and the subject being tracked. As such, Junglas and Spitzmüller (2005, p. 7) claim that location-based services can be considered a “double-edged sword” in that they can aid in the performance of tasks in one instance, but may also generate Big Brother concerns. Furthermore, Perusco and Michael (2007, p. 10) mention the linkage between trust and freedom. As a result, Perusco et al. (2006, p. 97) suggest a number of questions that must be considered in the context of LBS and trust: “Does the LBS context already involve a low level of trust?”; “If the LBS context involves a moderate to high level of trust, why are LBS being considered anyway?”; and “Will the use of LBS in this situation be trust-building or trust-destroying?” In answering these questions, the implications of LBS/IoT monitoring on trust must be appreciated, given they are significant, irreparable, and closely tied to what is considered the central challenge in the LBS domain, privacy.

This paper has provided comprehensive coverage of the themes of control and trust with respect to the social implications of LBS. The subsequent discussion will extend the examination to cover LBS in the context of the IoT, providing an ethical analysis and stressing the importance of a robust socio-ethical framework.

Discussion 1.4

The Internet of Things (IoT) and LBS: extending the discussion on control and trust 1.4.1

The Internet of Things (IoT) is an encompassing network of connected intelligent “things”, and is “comprised of smart machines interacting and communicating with other machines, objects, environments and infrastructures” (Freescale Semiconductor Inc. and ARM Inc. 2014, p. 1). The phrase was originally coined by Kevin Ashton in 1999, and a definitive definition is yet to be agreed upon (Ashton 2009, p. 1; Kranenburg and Bassi 2012, p. 1). Various related terms are often used interchangeably, such as the Internet of Everything, the Internet of Things and People, and the Web of Things and People. The IoT can, however, be described in terms of its core characteristics and/or the features it encompasses. At the crux of the IoT concept is the integration of the physical and virtual worlds, and the capability for “things” within these realms to be operated remotely through the employment of intelligent or smart objects with embedded processing functionality (Mattern and Floerkemeier 2010, p. 242; Ethics Subgroup IoT 2013, p. 3). These smart objects are capable of storing historical and varied forms of data, used as the basis for future interactions and the establishment of preferences. That is, once the data is processed, it can be utilized to “command and control” things within the IoT ecosystem, ideally resulting in an enhancement of the everyday lives of individuals (Michael, K. et al. 2010).
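A highly simplified way to picture the storage and “command and control” loop described above is an object that keeps its historical readings, derives a preference from them, and issues a command back into its environment. The sketch below is a toy illustration only (the device, thresholds and defaults are invented), not a reference IoT implementation.

```python
# Toy sketch of an IoT "smart object": it stores historical readings, derives a
# preference from them, and issues a command back into its environment.
class SmartThermostat:
    def __init__(self):
        self.history = []          # historical data kept by the smart object

    def record(self, observed_temp, user_setpoint):
        self.history.append((observed_temp, user_setpoint))

    def preferred_setpoint(self):
        """Establish a preference from past interactions (mean of setpoints)."""
        if not self.history:
            return 21.0            # arbitrary default
        return sum(s for _, s in self.history) / len(self.history)

    def command(self, current_temp):
        """'Command and control': decide an action from the processed data."""
        target = self.preferred_setpoint()
        if current_temp < target - 0.5:
            return "heat_on"
        if current_temp > target + 0.5:
            return "cool_on"
        return "idle"

thermostat = SmartThermostat()
thermostat.record(observed_temp=18.0, user_setpoint=22.0)
thermostat.record(observed_temp=25.0, user_setpoint=21.0)
print(thermostat.command(current_temp=19.0))   # 'heat_on'
```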

According to Ashton (2009, p. 1), the IoT infrastructure should “empower computers” and exhibit less reliance on human involvement in the collection of information. It should also allow for “seamless” interactions and connections (Ethics Subgroup IoT 2013, p. 2). Potential use cases include personal/home applications, health/patient monitoring systems, and remote tracking and monitoring which may include applications such as asset tracking amongst others (Ethics Subgroup IoT 2013, p. 3).

As can be anticipated with an ecosystem of this scale, the nature of interactions with the physical/virtual worlds and the varied “things” within them will undoubtedly change, dramatically altering the state of play. In the context of this paper, the focus is ultimately on the ethical concerns emerging from the use of LBS within an IoT infrastructure that is characterized by its ubiquitous/pervasive nature, in view of the discussion above regarding control and trust. It is valuable at this point to identify the important role of LBS in the IoT infrastructure.

While the IoT can potentially encompass a myriad of devices, the mobile phone will likely feature as a key element within the ecosystem, providing connectivity between devices (Freescale Semiconductor Inc. and ARM Inc. 2014, p. 2). In essence, smart phones can therefore be perceived as the “mediator” between users, the internet and additional “things”, as is illustrated in Mattern and Floerkemeier (2010, p. 245, see figure 2). Significantly, most mobile devices are equipped with location and spatial capabilities, providing “localization”, whereby intelligent devices “are aware of their physical location, or can be located” (Mattern and Floerkemeier 2010, p. 244). An example of an LBS application in the IoT would be indoor navigation capabilities in the absence of GPS, or, in effect, seamless navigation between outdoor and indoor environments.
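Indoor positioning in the absence of GPS is commonly approximated by techniques such as Wi-Fi fingerprinting; the source does not specify a method, so the sketch below simply illustrates one widely used approach. It matches an observed set of signal strengths against a small fingerprint database using a nearest-neighbour comparison; the access point names and readings are invented.

```python
# Illustrative Wi-Fi fingerprinting sketch: estimate an indoor location by
# finding the stored fingerprint whose signal strengths best match the
# currently observed readings (values in dBm; all data invented).
fingerprints = {
    "food court":   {"ap_1": -45, "ap_2": -70, "ap_3": -80},
    "cinema foyer": {"ap_1": -75, "ap_2": -50, "ap_3": -65},
}

def distance(observed, reference):
    """Euclidean distance over the access points common to both readings."""
    shared = observed.keys() & reference.keys()
    return sum((observed[ap] - reference[ap]) ** 2 for ap in shared) ** 0.5

def locate(observed):
    return min(fingerprints, key=lambda zone: distance(observed, fingerprints[zone]))

reading = {"ap_1": -48, "ap_2": -72, "ap_3": -78}
print(locate(reading))   # 'food court'
```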

Control- and trust-related challenges in the IoT 1.4.2

It may be argued that the LBS control and trust implications discussed throughout this paper (in addition to ethical challenges such as privacy and security) will carry over into the IoT environment. However, it has also been suggested that “the IoT will essentially create much richer environments in which location-based and location-aware technology can function” (Blouin 2014), and in doing so the ethical challenges will be amplified. It has further been noted that ethical issues, including trust and control amongst others, will “gain a new dimension in light of the increased complexity” in the IoT environment (Ethics Subgroup IoT 2013, p. 2).

In relation to control and the previously identified surveillance metaphors, for instance, it is predicted that there will be less reliance on Orwell's notion of Big Brother whereby surveillance is conducted by a single entity. Rather the concept of "some brother" will emerge. Some brother can be defined as "a heterogeneous 'mass' consisting of innumerable social actors, e.g. public sector authorities, citizens' movements and NGOs, economic players, big corporations, SMEs and citizens" (Ethics Subgroup IoT 2013, p. 16). As can be anticipated, the ethical consequences and dangers can potentially multiply in such a scenario.

Following on from this idea is the issue of a lack of transparency. The IoT will inevitably result in the merging of both the virtual and physical worlds, in addition to public and private spaces. It has been suggested that a lack of transparency regarding information access will create a sense of discomfort and will accordingly result in diminishing levels of trust (Ethics Subgroup IoT 2013, p. 8). The trust-related issues (relevant to LBS) are likely to be consistent with those discussed throughout this paper, possibly varying in intensity/severity depending on the given scenario. For example, the consequences of faulty IoT technology have the potential to be greater than those of conventional Internet services, given the integration of the physical and virtual worlds, thereby impacting on users’ trust in the IoT (Ethics Subgroup IoT 2013, p. 11). Therefore, trust considerations must primarily be examined in terms of: (a) trust in technology, and (b) trust in individuals/others.

Dealing with these (and other) challenges requires an ethical analysis in which appropriate conceptual and practical frameworks are considered. A preliminary examination is provided in the subsequent section, followed by dialogue regarding the need for objectivity in socio-ethical studies and the associated difficulties in achieving this.

Ethical analysis: proposing a socio-ethical conceptual framework 1.4.3

Research into the social and ethical implications of LBS, emerging technologies in general, and the IoT can be categorized in many ways and many frameworks can be applied. For instance, it may be regarded as a strand of “cyberethics”, defined by Tavani (2007, p. 3) as “the study of moral, legal and social issues involving cybertechnology”. Cybertechnology encompasses technological devices ranging from individual computers through to networked information and communication technologies. When considering ethical issues relating to cybertechnology and technology in general, Tavani (2007, pp. 23-24) notes that the latter should not necessarily be perceived as neutral. That is, technology may have “embedded values and biases” (Tavani 2007, p. 24), in that it may inherently provide capabilities to individuals to partake in unethical activities. This sentiment is echoed by Wakunuma and Stahl (2014, p. 393) in a paper examining the perceptions of IS professionals in relation to emerging ethical concerns.

Alternatively, research in this domain may be classed as a form of “computer ethics” or “information ethics”, which can be defined and applied using numerous approaches. While this article does not attempt to provide an in-depth account of information ethics, a number of its crucial characteristics are identified. In the first instance, the value of information ethics is in its ability to provide a conceptual framework for understanding the array of ethical challenges stemming from the introduction of new ICTs (Mathiesen 2004, p. 1). According to Floridi (1999), the question at the heart of information ethics is “what is good for an information entity and the infosphere in general?” The author continues that “more analytically, we shall say that [information ethics] determines what is morally right or wrong, what ought to be done, what the duties, the ‘oughts’ and the ‘ought nots’ of a moral agent are…” However, Capurro (2006, p. 182) disagrees, claiming that information ethics is additionally about “what is good for our bodily being-in-the-world with others in particular?” This involves contemplation of other “spheres” such as the ecological, political, economic, and cultural and is not limited to a study of the infosphere as suggested by Floridi. In this sense, the significance of context, environment and intercultural factors also becomes apparent.

Following on from these notions, there is the need for a robust ethical framework that is multi-dimensional in nature and explicitly covers the socio-ethical challenges emerging from the deployment of a given technology. This would include, but not be limited to, the control and trust issues identified throughout this paper, other concerns such as privacy and security, and any challenges that emerge as the IoT takes shape. This article proposes a broader, more robust socio-ethical conceptual framework as an appropriate means of examining and addressing ethical challenges relevant to LBS; both LBS in general and as a vital mediating component within the IoT. This framework is illustrated in Figure 1. Central to the socio-ethical framework is the contemplation of individuals as part of a broader social network or society, whilst considering the interactions amongst the various elements of the overall “system”. The four themes underpinning socio-ethical studies include the investigation of what the human purpose is, what is moral, how justice is upheld, and the principles that guide the usage of a given technique. Participants; their interactions with systems; people’s concerns and behavioural expectations; cultural and religious beliefs; structures, rules and norms; and fairness, personal benefits and personal harms are all areas of interest in a socio-ethical approach.

Figure 1: Proposed socio-ethical framework, in terms of the major components that require consideration

This article is intended to offer a preliminary account of the socio-ethical conceptual framework being proposed. Further research would examine and test its validity, whilst also providing a more detailed account of the various components within and how a socio-ethical assessment would be conducted based on the framework, and the range of techniques that could be applied.

The need for objectivity 1.4.4

Regardless of categorization and which conceptual framework is adopted, numerous authors stress that the focus of research and debates should not be skewed towards the unethical uses of a particular technology; rather, an objective stance should be embraced. Such objectivity must nonetheless ensure that social interests are adequately represented. That is, with respect to location and tracking technologies, Clarke (2001b, p. 220) claims that social interests have been somewhat overshadowed by the economic interests of LBS organisations, a situation that requires rectifying. While information technology professionals are not necessarily liable for how technology is deployed, they must nonetheless recognise its implications and be engaged in the process of introducing and promoting adequate safeguards (Clarke 1988, pp. 510-511). It has been argued that IS professionals are generally disinterested in the ethical challenges associated with emerging ICTs, and are rather concerned with the job or the technologies themselves (Wakunuma and Stahl 2014, p. 383).

This is explicitly the case for LBS, given that the industry and technology have developed more quickly than the equivalent social implications scholarship and research, an unfavourable situation given the potential for LBS to have profound impacts on individuals and society (Perusco et al. 2006, p. 91). In a keynote address centred on defining the emerging notion of überveillance, Clarke (2007a, p. 34) discusses the need to measure the costs and disbenefits arising from surveillance practices in general, where costs refer to financial measures and disbenefits to all non-economic impacts. This involves weighing the negatives against the potential advantages, a response that is applicable to LBS and pertinent to seeking objectivity.

Difficulties associated with objectivity 1.4.5

However, a major challenge with respect to an impartial approach for LBS is the interplay between the constructive and the potentially damaging consequences that the technology facilitates. For instance, and with specific reference to wireless technologies in a business setting, Elliot and Phillips (2004, p. 474) maintain that such systems facilitate monitoring and surveillance which can be applied in conflicting scenarios. Positive applications, according to Elliot and Phillips, include monitoring to improve effectiveness or provide employee protection in various instances, although this view has been frequently contested. Alternatively, negative uses involve excessive monitoring, which may compromise privacy or lead to situations in which an individual is subjected to surveillance or unauthorised forms of monitoring.

Additional studies demonstrate the complexities arising from the dual, and opposing, uses of a single LBS solution. It has been illustrated that any given application, for instance parent, healthcare, employee and criminal tracking applications, can be simultaneously perceived as ethical and unethical (Michael et al. 2006a, p. 7). A closer look at the scenario involving parents tracking children, as explained by Michael et al. (2006a, p. 7), highlights that child tracking can enable the safety of a child on the one hand, while invading their privacy on the other. Therefore, the dual and opposing uses of a single LBS solution become problematic and situation-dependent, and indeed increasingly difficult to examine objectively. Dobson and Fisher (2003, p. 50) maintain that technology cannot be perceived as either good or evil in that it is not directly the cause of unethical behaviour; rather, it serves to “empower those who choose to engage in good or bad behaviour.”

This is similarly the case in relation to the IoT, as public approval of the IoT is largely centred on “the conventional dualisms of ‘security versus freedom’ and ‘comfort versus data privacy’” (Mattern and Floerkemeier 2010, p. 256). Assessing the implications of the IoT infrastructure as a whole is increasingly difficult.

An alternative obstacle is associated with the extent to which LBS threaten the integrity of the individual. Explicitly, the risks associated with location and tracking technologies “arise from individual technologies and the trails that they generate, from compounds of multiple technologies, and from amalgamated and cross-referenced trails captured using multiple technologies and arising in multiple contexts” (Clarke 2001b, p. 218). The consequent social implications or “dangers” are thus a product of individuals being convicted, correctly or otherwise, of having committed a particular action (Clarke 2001b, p. 219). A wrongly accused individual may perceive the disbenefits arising from LBS as outweighing the benefits.

However, in situations where integrity is not compromised, an LBS application can be perceived as advantageous. For instance, Michael et al. (2006c, pp. 1-11) refer to the potentially beneficial uses of LBS in their paper focusing on the Avian Flu Tracker prototype, which is intended to manage and contain the spread of the infectious disease by relying on spatial data to communicate with individuals in the defined location. The authors demonstrate that their proposed system, which is intended to operate on a subscription or opt-in basis, is beneficial for numerous stakeholders such as government, health organisations and citizens (Michael et al. 2006c, p. 6).

Thus, a common challenge confronting researchers with respect to the study of morals, ethics and technology is that the field of ethics is subjective. That is, what constitutes right and wrong behaviour varies depending on the beliefs of a particular individual, which are understood to be based on cultural and other factors specific to the individual in question. One such factor is an individual’s experience with the technology, as can be seen in the previous example centred on the notion of an unjust accusation. Given these subjectivities and the potential for inconsistency from one individual to the next, Tavani (2007, p. 47) asserts that there is the need for ethical theories to direct the analysis of moral issues (relating to technology), given that numerous complications or disagreements exist in examining ethics.

Conclusion 1.5

This article has provided a comprehensive review of the control- and trust-related challenges relevant to location-based services, in order to identify and describe the major social and ethical considerations within each of the themes. The relevance of the IoT in such discussions has been demonstrated, and a socio-ethical framework proposed to encourage discussion and further research into the socio-ethical implications of the IoT, with a focus on LBS and/or localization technologies. The proposed socio-ethical conceptual framework requires further elaboration, and it is recommended that a thorough analysis, beyond information ethics, be conducted, with this paper forming the basis for such future work. The IoT by its very nature is subject to socio-ethical dilemmas because, for the greater part, the human is removed from decision-making processes and is instead subject to a machine.

References

Abbas, R., Michael, K., Michael, M.G. & Aloudat, A.: Emerging Forms of Covert Surveillance Using GPS-Enabled Devices. Journal of Cases on Information Technology 13(2), 2011, 19-33.

Albrecht, K. & McIntyre, L.: Spychips: How Major Corporations and Government Plan to Track Your Every Purchase and Watch Your Every Move. Thomas Nelson, 2005.

Albrecht, K. & Michael, K.: Connected: To Everyone and Everything. IEEE Technology and Society Magazine, Winter, 2013, 31-34.

Alder, G.S., Noel, T.W. & Ambrose, M.L.: Clarifying the Effects of Internet Monitoring on Job Attitudes: The Mediating Role of Employee Trust. Information & Management, 43, 2006, 894-903.

Aloudat, A. & Michael, K.: The Socio-Ethical Considerations Surrounding Government Mandated Location-Based Services During Emergencies: An Australian Case Study, in M. Quigley (ed.), ICT Ethics and Security in the 21st Century: New Developments and Applications. IGI Global, Hershey, PA, 2010, 1-26.

Aloudat, A. & Michael, K.: Toward the Regulation of Ubiquitous Mobile Government: A case Study on Location-Based Emergency Services in Australia. Electronic Commerce Research, 11(1), 2011, 31-74.

Andrejevic, M.: ISpy: Surveillance and Power in the Interactive Era. University Press of Kansas, Lawrence, 2007.

Arvidsson, A.: On the ‘Pre-History of the Panoptic Sort’: Mobility in Market Research. Surveillance & Society, 1(4), 2004, 456-474.

Ashton, K.: That "Internet of Things" Thing. RFID Journal, 2009, www.rfidjournal.com/articles/pdf?4986

Barreras, A. & Mathur, A.: Chapter 18. Wireless Location Tracking, in K.R. Larsen and Z.A. Voronovich (eds.), Convenient or Invasive: The Information Age. Ethica Publishing, United States, 2007, 176-186.

Bauer, H.H., Barnes, S.J., Reichardt, T. & Neumann, M.M.: Driving the Consumer Acceptance of Mobile Marketing: A Theoretical Framework and Empirical Study. Journal of Electronic Commerce Research, 6(3), 2005, 181-192.

Beinat, E., Steenbruggen, J. & Wagtendonk, A.: Location Awareness 2020: A Foresight Study on Location and Sensor Services. Vrije Universiteit, Amsterdam, 2007, http://reference.kfupm.edu.sa/content/l/o/location_awareness_2020_2_108_86452.pdf

Bellavista, P., Küpper, A. & Helal, S.: Location-Based Services: Back to the Future. IEEE Pervasive Computing, 7(2), 2008, 85-89.

Bennett, C.J. & Regan, P.M.: Surveillance and Mobilities. Surveillance & Society, 1(4), 2004, 449-455.

Bentham, J. & Bowring, J.: The Works of Jeremy Bentham. Published under the Superintendence of His Executor, John Bowring, Volume IV, W. Tait, Edinburgh, 1843.

Blouin, D.: An Intro to Internet of Things. 2014, www.xyht.com/spatial-itgis/intro-to-internet-of-things/

Boesen, J., Rode, J.A. & Mancini, C.: The Domestic Panopticon: Location Tracking in Families. UbiComp’10, Copenhagen, Denmark, 2010, pp. 65-74.

Böhm, A., Leiber, T. & Reufenheuser, B.: Trust and Transparency in Location-Based Services: Making Users Lose Their Fear of Big Brother. Proceedings Mobile HCI 2004 Workshop on Location Systems Privacy and Control, Glasgow, UK, 2004, 1-4.

Capurro, R.: Towards an Ontological Foundation of Information Ethics. Ethics and Information Technology, 8, 2006, 175-186.

Casal, C.R.: Impact of Location-Aware Services on the Privacy/Security Balance. Info: the Journal of Policy, Regulation and Strategy for Telecommunications, Information and Media, 6(2), 2004, 105-111.

Chellappa, R. & Sin, R.G.: Personalization Versus Privacy: An Empirical Examination of the Online Consumer’s Dilemma. Information Technology and Management, 6, 2005, 181-202.

Chen, J.V., Ross, W. & Huang, S.F.: Privacy, Trust, and Justice Considerations for Location-Based Mobile Telecommunication Services. info, 10(4), 2008, 30-45.

Chen, J.V. & Ross, W.H.: The Managerial Decision to Implement Electronic Surveillance at Work. International Journal of Organizational Analysis, 13(3), 2005, 244-268.

Clarke, R.: Information Technology and Dataveillance. Communications of the ACM, 31(5), 1988, 498-512.

Clarke, R.: Profiling: A Hidden Challenge to the Regulation of Data Surveillance. 1993, http://www.rogerclarke.com/DV/PaperProfiling.html.

Clarke, R.: The Digital Persona and Its Application to Data Surveillance. 1994, http://www.rogerclarke.com/DV/DigPersona.html.

Clarke, R.: Introduction to Dataveillance and Information Privacy, and Definitions of Terms. 1997, http://www.anu.edu.au/people/Roger.Clarke/DV/Intro.html.

Clarke, R.: Person Location and Person Tracking - Technologies, Risks and Policy Implications. Information Technology & People, 14(2), 2001b, 206-231.

Clarke, R.: Privacy as a Means of Engendering Trust in Cyberspace Commerce. The University of New South Wales Law Journal, 24(1), 2001c, 290-297.

Clarke, R.: While You Were Sleeping… Surveillance Technologies Arrived. Australian Quarterly, 73(1), 2001d, 10-14.

Clarke, R.: Privacy on the Move: The Impacts of Mobile Technologies on Consumers and Citizens. 2003b, http://www.anu.edu.au/people/Roger.Clarke/DV/MPrivacy.html.

Clarke, R.: Have We Learnt to Love Big Brother? Issues, 71, June, 2005, 9-13.

Clarke, R.: What's 'Privacy'? 2006, http://www.rogerclarke.com/DV/Privacy.html.

Clarke, R.: Chapter 3. What 'Uberveillance' Is and What to Do About It, in K. Michael and M.G. Michael (eds.), The Second Workshop on the Social Implications of National Security, University of Wollongong, Wollongong, Australia, 2007a, 27-46.

Clarke, R.: Chapter 4. Appendix to What 'Uberveillance' Is and What to Do About It: Surveillance Vignettes, in K. Michael and M.G. Michael (eds.), The Second Workshop on the Social Implications of National Security, University of Wollongong, Wollongong, Australia, 2007b, 47-60.

Clarke, R.: Surveillance Vignettes Presentation. 2007c, http://www.rogerclarke.com/DV/SurvVign-071029.ppt.

Clarke, R.: Privacy Impact Assessment: Its Origins and Development. Computer Law & Security Review, 25(2), 2009, 123-135.

Clarke, R. & Wigan, M.: You Are Where You've Been: The Privacy Implications of Location and Tracking Technologies. 2011, http://www.rogerclarke.com/DV/YAWYB-CWP.html.

Culnan, M.J. & Bies, R.J.: Consumer Privacy: Balancing Economic and Justice Considerations. Journal of Social Issues, 59(2), 2003, 323-342.

Davis, D.W. & Silver, B.D.: Civil Liberties vs. Security: Public Opinion in the Context of the Terrorist Attacks on America. American Journal of Political Science, 48(1), 2004, pp. 28-46.

Dinev, T., Bellotto, M., Hart, P., Colautti, C., Russo, V. & Serra, I.: Internet Users’ Privacy Concerns and Attitudes Towards Government Surveillance – an Exploratory Study of Cross-Cultural Differences between Italy and the United States. 18th Bled eConference eIntegration in Action, Bled, Slovenia, 2005, 1-13.

Dobson, J.E. & Fisher, P.F.: Geoslavery. IEEE Technology and Society Magazine, 22(1), 2003, 47-52.

Dobson, J.E. & Fisher, P.F.: The Panopticon's Changing Geography. Geographical Review, 97(3), 2007, 307-323.

Dwyer, C., Hiltz, S.R. & Passerini, K.: Trust and Privacy Concern within Social Networking Sites: A Comparison of Facebook and Myspace. Proceedings of the Thirteenth Americas Conference on Information Systems, Keystone, Colorado, 2007, 1-12.

Elliot, G. & Phillips, N.: Mobile Commerce and Wireless Computing Systems. Pearson Education Limited, Great Britain, 2004.

Ethics Subgroup IoT: Fact Sheet - Ethics Subgroup IoT - Version 4.0, European Commission. 2013, 1-21, http://ec.europa.eu/information_society/newsroom/cf/dae/document.cfm?doc_id=1751

Freescale Semiconductor Inc. and ARM Inc.: Whitepaper: What the Internet of Things (IoT) Needs to Become a Reality. 2014, 1-16, cache.freescale.com/files/32bit/doc/white_paper/INTOTHNGSWP.pdf

Floridi, L.: Information Ethics: On the Philosophical Foundation of Computer Ethics. Ethics and Information Technology, 1, 1999, 37-56.

Foucault, M.: Discipline and Punish: The Birth of the Prison. Second Vintage Books Edition May 1995, Vintage Books: A Division of Random House Inc, New York, 1977.

Fusco, S.J., Michael, K., Aloudat, A. & Abbas, R.: Monitoring People Using Location-Based Social Networking and Its Negative Impact on Trust: An Exploratory Contextual Analysis of Five Types of “Friend” Relationships. IEEE Symposium on Technology and Society, Illinois, Chicago, 2011.

Fusco, S.J., Michael, K., Michael, M.G. & Abbas, R.: Exploring the Social Implications of Location Based Social Networking: An Inquiry into the Perceived Positive and Negative Impacts of Using LBSN between Friends. 9th International Conference on Mobile Business, Athens, Greece, IEEE, 2010, 230-237.

Gagnon, M., Jacob, J.D. & Guta, A.: Treatment Adherence Redefined: A Critical Analysis of Technotherapeutics. Nursing Inquiry, 20(1), 2013, 60-70.

Ganascia, J.G.: The Generalized Sousveillance Society. Social Science Information, 49(3), 2010, 489-507.

Gandy, O.H.: The Panoptic Sort: A Political Economy of Personal Information. Westview, Boulder, Colorado, 1993.

Giaglis, G.M., Kourouthanassis, P. & Tsamakos, A.: Chapter IV. Towards a Classification Framework for Mobile Location-Based Services, in B.E. Mennecke and T.J. Strader (eds.), Mobile Commerce: Technology, Theory and Applications. Idea Group Publishing, Hershey, US, 2003, 67-85.

Gould, J.B.: Playing with Fire: The Civil Liberties Implications of September 11th. Public Administration Review, 62, 2002, 74-79.

Jorns, O. & Quirchmayr, G.: Trust and Privacy in Location-Based Services. Elektrotechnik & Informationstechnik, 127(5), 2010, 151-155.

Junglas, I. & Spitzmüller, C.: A Research Model for Studying Privacy Concerns Pertaining to Location-Based Services. Proceedings of the 38th Hawaii International Conference on System Sciences, 2005, 1-10.

Kaasinen, E.: User Acceptance of Location-Aware Mobile Guides Based on Seven Field Studies. Behaviour & Information Technology, 24(1), 2003, 37-49.

Kaupins, G. & Minch, R.: Legal and Ethical Implications of Employee Location Monitoring. Proceedings of the 38th Hawaii International Conference on System Sciences. 2005, 1-10.

Kim, D.J., Ferrin, D.L. & Rao, H.R.: Trust and Satisfaction, Two Stepping Stones for Successful E-Commerce Relationships: A Longitudinal Exploration. Information Systems Research, 20(2), 2009, 237-257.

King, L.: Information, Society and the Panopticon. The Western Journal of Graduate Research, 10(1), 2001, 40-50.

Kodl, J. & Lokay, M.: Human Identity, Human Identification and Human Security. Proceedings of the Conference on Security and Protection of Information, Idet Brno, Czech Republic, 2001, 129-138.

Kranenburg, R.V. and Bassi, A.: IoT Challenges, Communications in Mobile Computing. 1(9), 2012, 1-5.

Küpper, A. & Treu, G.: Next Generation Location-Based Services: Merging Positioning and Web 2.0., in L.T. Yang, A.B. Waluyo, J. Ma, L. Tan and B. Srinivasan (eds.), Mobile Intelligence. John Wiley & Sons Inc, Hoboken, New Jersey, 2010, 213-236.

Levin, A., Foster, M., West, B., Nicholson, M.J., Hernandez, T. & Cukier, W.: The Next Digital Divide: Online Social Network Privacy. Ryerson University, Ted Rogers School of Management, Privacy and Cyber Crime Institute, 2008, www.ryerson.ca/tedrogersschool/privacy/Ryerson_Privacy_Institute_OSN_Report.pdf.

Lewis, J.D. & Weigert, A.: Trust as a Social Reality. Social Forces, 63(4), 1985, 967-985.

Lyon, D.: The World Wide Web of Surveillance: The Internet and Off-World Power Flows. Information, Communication & Society, 1(1), 1998, 91-105.

Lyon, D.: Surveillance Society: Monitoring Everyday Life. Open University Press, Philadelphia, PA, 2001.

Lyon, D.: Surveillance Studies: An Overview. Polity, Cambridge, 2007.

Macquarie Dictionary: 'Uberveillance', in S. Butler, Fifth Edition of the Macquarie Dictionary, Australia's National Dictionary. Sydney University, 2009, 1094.

Mann, S.: Sousveillance and Cyborglogs: A 30-Year Empirical Voyage through Ethical, Legal, and Policy Issues. Presence, 14(6), 2005, 625-646.

Mann, S., Nolan, J. & Wellman, B.: Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments. Surveillance & Society, 1(3), 2003, 331-355.

Mathiesen, K.: What is Information Ethics? Computers and Society, 32(8), 2004, 1-11.

Marx, G.T. & Steeves, V.: From the Beginning: Children as Subjects and Agents of Surveillance. Surveillance & Society, 7(3/4), 2010, 192-230.

Mattern, F. and Floerkemeier, C.: From the Internet of Computers to the Internet of Things, in Sachs, K., Petrov, I. & Guerrero, P. (eds.), From Active Data Management to Event-Based Systems and More. Springer-Verlag Berlin Heidelberg, 2010, 242-259.

Mayer, R.C., Davis, J.H. & Schoorman, F.D.: An Integrative Model of Organizational Trust. The Academy of Management Review, 20(3), 1995, 709-734.

McKnight, D.H. & Chervany, N.L.: What Trust Means in E-Commerce Customer Relationships: An Interdisciplinary Conceptual Typology. International Journal of Electronic Commerce, 6(2), 2001, 35-59.

Metzger, M.J.: Privacy, Trust, and Disclosure: Exploring Barriers to Electronic Commerce. Journal of Computer-Mediated Communication, 9(4), 2004.

Michael, K. & Clarke, R.: Location and Tracking of Mobile Devices: Überveillance Stalks the Streets. Computer Law and Security Review, 29(2), 2013, 216-228.

Michael, K., McNamee, A. & Michael, M.G.: The Emerging Ethics of Humancentric GPS Tracking and Monitoring. International Conference on Mobile Business, Copenhagen, Denmark, IEEE Computer Society, 2006a, 1-10.

Michael, K., McNamee, A., Michael, M.G., and Tootell, H.: Location-Based Intelligence – Modeling Behavior in Humans using GPS. IEEE International Symposium on Technology and Society, 2006b.

Michael, K., Stroh, B., Berry, O., Muhlbauer, A. & Nicholls, T.: The Avian Flu Tracker - a Location Service Proof of Concept. Recent Advances in Security Technology, Australian Homeland Security Research Centre, 2006, 1-11.

Michael, K. and Michael, M.G.: Australia and the New Technologies: Towards Evidence Based Policy in Public Administration (1 ed). Wollongong, Australia: University of Wollongong, 2008, Available at: http://works.bepress.com/kmichael/93

Michael, K. & Michael, M.G.: Microchipping People: The Rise of the Electrophorus. Quadrant, 49(3), 2005, 22-33.

Michael, K. and Michael, M.G.: From Dataveillance to Überveillance (Uberveillance) and the Realpolitik of the Transparent Society (1 ed). Wollongong: University of Wollongong, 2007. Available at: http://works.bepress.com/kmichael/51.

Michael, K. & Michael, M.G.: Innovative Automatic Identification and Location-Based Services: From Bar Codes to Chip Implants. IGI Global, Hershey, PA, 2009.

Michael, K. & Michael, M.G.: The Social and Behavioral Implications of Location-Based Services. Journal of Location-Based Services, 5(3/4), 2011, 1-15, http://works.bepress.com/kmichael/246.

Michael, K. & Michael, M.G.: Sousveillance and Point of View Technologies in Law Enforcement: An Overview, in The Sixth Workshop on the Social Implications of National Security: Sousveillance and Point of View Technologies in Law Enforcement, University of Sydney, NSW, Australia, Feb. 2012.

Michael, K., Roussos, G., Huang, G.Q., Gadh, R., Chattopadhyay, A., Prabhu, S. and Chu, P.: Planetary-scale RFID Services in an Age of Uberveillance. Proceedings of the IEEE, 98.9, 2010, 1663-1671.

Michael, M.G. and Michael, K.: National Security: The Social Implications of the Politics of Transparency. Prometheus, 24(4), 2006, 359-364.

Michael, M.G. & Michael, K.: Towards a State of Uberveillance. IEEE Technology and Society Magazine, 29(2), 2010, 9-16.

Michael, M.G. & Michael, K. (eds): Uberveillance and the Social Implications of Microchip Implants: Emerging Technologies. Hershey, PA, IGI Global, 2013.

O'Connor, P.J. & Godar, S.H.: Chapter XIII. We Know Where You Are: The Ethics of LBS Advertising, in B.E. Mennecke and T.J. Strader (eds.), Mobile Commerce: Technology, Theory and Applications, Idea Group Publishing, Hershey, US, 2003, 245-261.

Orwell, G.: Nineteen Eighty-Four. McPherson Printing Group, Maryborough, Victoria, 1949.

Oxford Dictionary: Control, Oxford University Press, 2012a http://oxforddictionaries.com/definition/control?q=control.

Oxford Dictionary: Trust, Oxford University Press, 2012b, http://oxforddictionaries.com/definition/trust?q=trust.

Pavlou, P.A.: Consumer Acceptance of Electronic Commerce: Integrating Trust and Risk with the Technology Acceptance Model. International Journal of Electronic Commerce, 7(3), 2003, 69-103.

Perusco, L. & Michael, K.: Humancentric Applications of Precise Location Based Services, in IEEE International Conference on e-Business Engineering, Beijing, China, IEEE Computer Society, 2005, 409-418.

Perusco, L. & Michael, K.: Control, Trust, Privacy, and Security: Evaluating Location-Based Services. IEEE Technology and Society Magazine, 26(1), 2007, 4-16.

Perusco, L., Michael, K. & Michael, M.G.: Location-Based Services and the Privacy-Security Dichotomy, in Proceedings of the Third International Conference on Mobile Computing and Ubiquitous Networking, London, UK, Information Processing Society of Japan, 2006, 91-98.

Quinn, M.J.: Ethics for the Information Age. Second Edition, Pearson/Addison-Wesley, Boston, 2006.

Renegar, B., Michael, K. & Michael, M.G.: Privacy, Value and Control Issues in Four Mobile Business Applications, in 7th International Conference on Mobile Business (ICMB2008), Barcelona, Spain, IEEE Computer Society, 2008, 30-40.

Rozenfeld, M.: The Value of Privacy: Safeguarding your information in the age of the Internet of Everything, The Institute: the IEEE News Source, 2014, http://theinstitute.ieee.org/technology-focus/technology-topic/the-value-of-privacy.

Rummel, R.J.: Death by Government. Transaction Publishers, New Brunswick, New Jersey, 1997.

Sanquist, T.F., Mahy, H. & Morris, F.: An Exploratory Risk Perception Study of Attitudes toward Homeland Security Systems. Risk Analysis, 28(4), 2008, 1125-1133.

Schoorman, F.D., Mayer, R.C. & Davis, J.H.: An Integrative Model of Organizational Trust: Past, Present, and Future. Academy of Management Review, 32(2), 2007, 344-354.

Shay, L.A., Conti, G., Larkin, D., Nelson, J.: A framework for analysis of quotidian exposure in an instrumented world. IEEE International Carnahan Conference on Security Technology (ICCST), 2012, 126-134.

Siau, K. & Shen, Z.: Building Customer Trust in Mobile Commerce. Communications of the ACM, 46(4), 2003, 91-94.

Solove, D.: Digital Dossiers and the Dissipation of Fourth Amendment Privacy. Southern California Law Review, 75, 2002, 1083-1168.

Solove, D.: The Digital Person: Technology and Privacy in the Information Age. New York University Press, New York, 2004.

Tavani, H.T.: Ethics and Technology: Ethical Issues in an Age of Information and Communication Technology. John Wiley, Hoboken, N.J., 2007.

Valacich, J.S.: Ubiquitous Trust: Evolving Trust into Ubiquitous Computing Environments. Business, Washington State University, 2003, 1-2.

van Ooijen, C. & Nouwt, S.: Power and Privacy: The Use of LBS in Dutch Public Administration, in B. van Loenen, J.W.J. Besemer and J.A. Zevenbergen (eds.), Sdi Convergence. Research, Emerging Trends, and Critical Assessment, Nederlandse Commissie voor Geodesie Netherlands Geodetic Commission 48, 2009, 75-88.

Wakunuma, K.J. and Stahl, B.C.: Tomorrow’s Ethics and Today’s Response: An Investigation into The Ways Information Systems Professionals Perceive and Address Emerging Ethical Issues. Inf Syst Front, 16, 2014, 383–397.

Weckert, J.: Trust and Monitoring in the Workplace. IEEE International Symposium on Technology and Society: University as a Bridge from Technology to Society, 2000, 245-250.

Wigan, M. & Clarke, R.: Social Impacts of Transport Surveillance. Prometheus, 24(4), 2006, 389-403.

Xu, H. & Teo, H.H.: Alleviating Consumers’ Privacy Concerns in Location-Based Services: A Psychological Control Perspective. Twenty-Fifth International Conference on Information Systems, 2004, 793-806.

Xu, H., Teo, H.H. & Tan, B.C.Y.: Predicting the Adoption of Location-Based Services: The Role of Trust and Perceived Privacy Risk. Twenty-Sixth International Conference on Information Systems, 2005, 897-910.

Yan, Z. & Holtmanns, S.: Trust Modeling and Management: From Social Trust to Digital Trust, in R. Subramanian (ed.), Computer Security, Privacy and Politics: Current Issues, Challenges and Solutions. IGI Global, 2008, 290-323.

Yeh, Y.S. & Li, Y.M.: Building Trust in M-Commerce: Contributions from Quality and Satisfaction. Online Information Review, 33(6), 2009, 1066-1086.

Citation: Roba Abbas, Katina Michael, M.G. Michael, "Using a Social-Ethical Framework to Evaluate Location-Based Services in an Internet of Things World", IRIE, International Review of Information Ethics, Dec 2014, http://www.i-r-i-e.net/. Source: http://www.i-r-i-e.net/inhalt/022/IRIE-Abbas-Michael-Michael.pdf

Author(s):

Honorary Fellow Dr Roba Abbas:

·         School of Information Systems and Technology, University of Wollongong, Northfields Avenue, Wollongong NSW 2522, Australia

·         Phone: +61 2 4221 3555, Email: roba@uow.edu.au, Web: http://www.technologyandsociety.org/members/2013/7/25/dr-roba-abbas

·         Relevant publications:

o    R. Abbas, K. Michael, M.G. Michael, R. Nicholls, Sketching and validating the location-based services (LBS) regulatory framework in Australia, Computer Law and Security Review 29, No.5 (2013): 576-589.

o    R. Abbas, K. Michael, M.G. Michael, The Regulatory Considerations and Ethical Dilemmas of Location-Based Services (LBS): A Literature Review, Information Technology & People 27, No.1 (2014): 2-20.

Associate Professor Katina Michael:

·         School of Information Systems and Technology, University of Wollongong, Northfields Avenue, Wollongong NSW 2522, Australia

·         Phone: +61 2 4221 3937, Email: katina@uow.edu.au, Web: http://ro.uow.edu.au/kmichael

·         Relevant publications:

o    K. Michael, R. Clarke, Location and Tracking of Mobile Devices: Überveillance Stalks the Streets, Computer Law and Security Review 29, No.3 (2013): 216-228.

o    K. Michael, M. G. Michael, Innovative Automatic Identification and Location-Based Services: From Bar Codes to Chip Implants, IGI Global, (2009).

o    L. Perusco, K. Michael, Control, trust, privacy, and security: evaluating location-based services, IEEE Technology and Society Magazine 26, No.1 (2007): 4-16.

Honorary Associate Professor M.G. Michael:

·         School of Information Systems and Technology, University of Wollongong, Northfields Avenue, Wollongong NSW 2522, Australia

·         Phone: +61 2 4221 3937, Email: mgm@uow.edu.au, Web: http://ro.uow.edu.au/mgmichael

·         Relevant publications:

o    M.G. Michael and K. Michael (eds), Uberveillance and the Social Implications of Microchip Implants: Emerging Technologies, Hershey, PA, IGI Global, (2013).

o    K. Michael, M. G. Michael, "The Social and Behavioral Implications of Location-Based Services", Journal of Location-Based Services, Volume 5, Issue 3-4, (2011), 121-137.

o    M.G. Michael, K. Michael, Towards a State of Uberveillance, IEEE Technology and Society Magazine, 29, No.2, (2010): 9-16.

o    M. G. Michael, S. J. Fusco, K. Michael, A Research Note on Ethics in the Emerging Age of Uberveillance, Computer Communications, 31 No.6, 2008: 1192-1199.

Be Vigilant: There Are Limits to Veillance

Chapter 13 in The Computer After Me: Awareness and Self-Awareness in Autonomic Systems

Katina Michael, M. G. Michael, Christine Perakslis

The following sections are included:

  • Introduction

  • From Fixed to Mobile Sensors

  • People as Sensors

  • Enter the Veillances

    • Surveillance

    • Dataveillance

    • Sousveillance

    • Überveillance

  • Colliding Principles

    • From ‘drone view’ to ‘person view’

    • Transparency and open data

    • Surveillance, listening devices and the law

    • Ethics and values

    • The unintended side effects of lifelogging

    • Pebbles and shells

    • When bad is good

    • Censorship

  • Summary and Conclusions: Mind/Body Distinction

13.1 Introduction

Be vigilant; we implore the reader. Yet vigilance requires hard mental work (Warm et al., 2008). Humans have repeatedly shown poor performance at sustaining vigilance, especially when facing such factors as complex or novel data, time pressure, and information overload (Ware, 2000). For years, researchers have investigated vigilance, from its positive impact on the survival of the ground squirrel in Africa to the decrements that degrade the performance of air traffic controllers. Scholars seem to agree: fatigue has a negative bearing on vigilance.

In our society, we have become increasingly fatigued, both physically and cognitively. It has been widely documented that employees are increasingly faced with time starvation, and that consequently self-imposed sleep deprivation is one of the primary reasons for increasing fatigue, as employees forego sleep in order to complete more work (see, for example, the online publications by the Society of Human Resources1 and the National Sleep Foundation2). Widespread access to technology exacerbates the problem by making it possible to stay busy round the clock.

Our information-rich world, which leads to information overload and novel data, and our 24/7/365 connectivity, which leads to time pressure, both contribute to fatigue and so work against vigilance. However, a lack of vigilance, or the failure to accurately perceive, identify, or analyze bona fide threats, can lead to serious negative consequences, even a life-threatening state of affairs (Capurro, 2013).

This phenomenon, which can be termed vigilance fatigue, can be brought about by four factors:

·       Prolonged exposure to ambiguous, unspecified, and ubiquitous threat information.

·       Information overload.

·       Overwhelming pressure to maintain exceptional, error-free performance.

·       Faulty strategies for structuring informed decision-making under conditions of uncertainty and stress.

Therefore, as we ask the reader to be vigilant in this transformative, and potentially disruptive, transition toward the ‘computer after me’, we feel obligated to articulate clearly the potential threats associated with veillance. We believe we must ask the challenging and unpopular questions now. We must disclose and discuss the existence of risk, the values at stake, and the possibility of harm related to veillance. We owe it to the reader, in this world of increasing vigilance fatigue, to provide unambiguous, specified threat information and to bring it to their attention.

13.2 From Fixed to Mobile Sensors

Embedded sensors have provided us with a range of benefits and conveniences that many of us take for granted in our everyday life. We now find commonplace the auto-flushing lavatory and the auto-dispensing of soap and water for hand washing. Many of these practices are not only convenient but help to maintain health and hygiene. We even have embedded sensors in lamp-posts that can detect oncoming vehicles and are so energy efficient that they turn on as they detect movement, and then turn off again to conserve resources. However, these fixtures are static; they form basic infrastructure that often has ‘eyes’ (e.g. an image and/or motion sensor), but does not have ‘legs’.

What happens when these sensors – for identification, location, condition monitoring, point-of-view (POV) and more – become embeddable in mobile objects and begin to follow and track us everywhere we go? Our vehicles, tablets, smart phones, and even contactless smart cards are equipped to capture, synthesize, and communicate a plethora of information about our behaviors, traits, likes and dislikes, as we lug them around everywhere we go. Automatic licence plate scanners are mounted not only in streetlights or on bridges, but now also on patrol cars. These scanners snap photos of passing automobiles and store such data as plate numbers, times, and locations within massive databases (Clarke, 2009). Stores are combining the use of static fixtures with mobile devices to better understand the psychographics and demographics of their shoppers (Michael and Clarke, 2013). The combination of these monitoring tools is powerful. Cell phone identifiers are used to track the movements of customers (even if the customer is not connected to the store’s WiFi network), with the surveillance cameras collecting biometric analytics to analyze facial expressions and moods. Along with an augmented capability to customize and personalize marketing efforts, the stores can identify how long one tarries in an aisle, the customer’s reaction to a sale item, the age of the shopper, and even who did or did not walk by a certain display.
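
To make the mechanics concrete, the sketch below (Python) estimates how long a shopper tarries in an aisle from passive sightings of a phone identifier. The sensor feed, device identifiers, zone names and timestamps are hypothetical; this is a minimal illustration of the kind of dwell-time analytics described above, not any particular vendor's system.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical passive sightings a store's sensors might log from a phone's
# probe requests: (device identifier, zone, ISO timestamp).
sightings = [
    ("aa:bb:cc:11:22:33", "aisle-7",  "2014-11-02T10:01:05"),
    ("aa:bb:cc:11:22:33", "aisle-7",  "2014-11-02T10:04:40"),
    ("aa:bb:cc:11:22:33", "checkout", "2014-11-02T10:09:12"),
]

def dwell_times(rows):
    """Estimate seconds spent per (device, zone) from first and last sighting."""
    spans = defaultdict(list)
    for device, zone, ts in rows:
        spans[(device, zone)].append(datetime.fromisoformat(ts))
    return {key: (max(times) - min(times)).total_seconds()
            for key, times in spans.items()}

print(dwell_times(sightings))
# {('aa:bb:cc:11:22:33', 'aisle-7'): 215.0, ('aa:bb:cc:11:22:33', 'checkout'): 0.0}
```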

The human has now become an extension (voluntarily or involuntarily) of these location-based and affect-based technological breakthroughs; we the end-users are in fact the end-point of a complex network of networks. The devices we carry take on a life of their own, sending binary data upstream and downstream in the name of better connectivity, awareness, and ambient intelligence. ‘I am here’, the device continuously signals to the nearest access node, handshaking a more accurate location fix, as well as providing key behavioral indicators which can easily become predictors of future behaviors. However, it seems as if we, as a society, are rapidly demanding more and more communications technology – or at least that is the idea we are being sold. Technology has its many benefits: few people are out of reach now, and communication becomes easier, more personalized, and much more flexible. Through connectivity, people’s input is garnered and responses can be felt immediately. Yet, just as Newton’s action–reaction law comes into play in the physical realm, there are reactions to consider for the human not only in the physical realm, but also in the mental, emotional, and spiritual realms (Loehr and Schwartz, 2001), when we live our lives not only in the ordinary world but also within the digital world.

Claims have been made that life has become so busy today that we are grasping to gain back seconds in our day. It could be asked: why should we waste time and effort by manually entering all these now-necessary passwords, when a tattoo or pill could transmit an 18-bit authentication signal for automatic logon from within our bodies? We are led to believe that individuals are demanding uninterrupted connectivity; however, research has shown that some yearn for the freedom to ‘live off the grid’, even if only for a short span of time (Pearce and Gretzel, 2012).

A recent front cover of the US business magazine Fast Company read “Unplug. My life was crazy. So I disconnected for 25 days. You should too”. The content within the publication includes coping mechanisms of senior-level professionals who are working to mitigate the consequences of perpetual connectivity through technology. One article reveals the digital dilemmas we now face (e.g. how much should I connect?); another article provides tips on how to do a digital detox (e.g. disconnecting because of the price we pay); and yet another article outlines how to bring sanity to your crazy, wired life with eight ways the busiest connectors give themselves a break (e.g. taking time each day to exercise in a way that makes it impossible to check your phone; ditching the phone to ensure undivided attention is given to colleagues; or establishing a company ‘Shabbat’ in which it is acceptable to unplug one day a week). Baratunde Thurston, CEO and co-founder of Cultivated Wit (and considered by some to be the world’s most connected man), wrote:

I love my devices and my digital services, I love being connected to the global hive mind – but I am more aware of the price we pay: lack of depth, reduced accuracy, lower quality, impatience, selfishness, and mental exhaustion, to name but a few. In choosing to digitally enhance lives, we risk not living them.
— (Thurston, 2013, p. 77)

13.3 People as Sensors

Enter Google Glass, Autographer, Memoto, TrackStick, Fitbit, and other wearable devices that are worn like spectacles, apparel, or tied round the neck. The more pervasive innovations such as electronic tattoos, nanopatches, smart pills, and ICT implants seamlessly become a ‘part’ of the body once attached, swallowed, embedded, or injected. These technologies are purported to be lifestyle choices that can provide a myriad of conveniences and productivity gains, as well as improved health and well-being functionality. Wearables are believed to have such benefits as enhancements to self-awareness, communication, memory, sensing, recognition, and logistical skills. Common experiences can be augmented, for example when a theme park character (apparently) knows your child’s name because of a wrist strap that acts as an admissions ticket, wallet, and ID.

Gone are the days when there was a stigma around electronic bracelets being used to track those on parole; these devices are now becoming much like a fashion statement and a desirable method not only for safety and security, but also for convenience and enhanced experiences. However, one must consider that an innocuous method for convenience may prove to create ‘people as sensors’ in which information is collected from the environment using unobtrusive measures, but with the wearer – as well as those around the wearer – possibly unaware of the extent of the data collection. In addition to issues around privacy, other questions must be asked, such as: what will be done with the data now and well into the future?

The metaphor of ‘people as sensors’, also referred to as Citizens as Sensors (Goodchild, 2007), is being espoused, as on-board chipsets allow an individual to look out toward another object or subject (e.g. using an image sensor), or to look inward toward oneself (e.g. measuring physiological characteristics with embedded surveillance devices). As optional prosthetic devices are incorporated into users, devices are recognized by some as becoming an extension of the person’s mind and body. New developments in ‘smart skin’ offer even more solutions. The skin can become a function of the user’s habits, personality, mood, or behavior. For example, when inserted into a shoe, the smart skin can analyze and improve the technical skill of an athlete, factors associated with body stresses related to activity, or even health issues that may result from the wearer’s use of high-heeled shoes (Papakostas et al., 2002). Simply put, human beings who function in analog are able to communicate digitally through the devices that they wear or bear. This is quite a different proposition from the typical surveillance camera that is bolted onto a wall overlooking the streetscape or mall and has a pre-defined field of view.

Fig. 13.1 People as sensors: from surveillance to uberveillance

‘People as sensors’ is far more pervasive than dash-cams used in police vehicles, and can be likened to the putting on of body-worn devices by law enforcement agencies to collect real-time data from the field (see Figure 13.1). When everyday citizens are wearing and bearing these devices, they form a collective network by contributing individual subjective (and personal) observations of themselves and their surroundings. There are advantages; the community is believed to benefit from relevant, real-time information on such issues as public safety, street damage, weather observations, traffic patterns, and even public health (cf. Chapter 12). People, using their everyday devices, can enter information into a data warehouse, which could also reduce the cost of intensive physical networks that otherwise need to be deployed. There are also murkier vulnerabilities, such as the risk of U-VGI (Un-Volunteered Geographical Information), for example when mass movements are tracked in a cell phone network to ascertain traffic distribution (Resch, 2013).

Consider it a type of warwalking on foot rather than wardriving.3 It seems that opt-in and opt-out features are not deemed necessary, perhaps due to the perceived anonymity of individual user identifiers. How to ‘switch off’, ‘turn off’, ‘unplug’, or select the ‘I do not consent’ feature in a practical way is a question that many have pondered, but with arguably few pragmatic solutions, if any.

With ‘citizens as sensors’ there is an opt-in for those subscribing, but issues need to be considered for those in the vicinity of the bearer who did not consent to subscribe or to be recorded. Researchers contend that even the bearer must be better educated on the potential privacy issues (Daskala, 2011). For example, user-generated information yields longitude and latitude coordinates, time and date stamps, and speed and elevation details which tell us significant aspects about a person’s everyday life, leading to insight about current and predictive behavioral patterns. Data could also be routinely intercepted (and stored indefinitely), as has been alleged in the recent National Security Agency (NSA) scandal. Even greater concerns arise from the potential use of dragnet electronic surveillance to be mined for information (now or in the future) to extract and synthesize rich heterogeneous data containing personal visual records and ‘friends lists’ of the new media. Call detail records (CDRs) may just be the tip of the iceberg.
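
As a minimal illustration of how such user-generated points translate into behavioral inference, the following sketch guesses a person's likely 'home' grid cell from nothing more than timestamped latitude/longitude fixes; the coordinates, the night-time window, and the roughly 100 m rounding are assumptions made for the example only.

```python
from collections import Counter
from datetime import datetime

# Hypothetical user-generated track points: (ISO timestamp, latitude, longitude).
points = [
    ("2014-06-01T23:45:00", -34.4278, 150.8931),
    ("2014-06-02T07:10:00", -34.4279, 150.8930),
    ("2014-06-02T09:02:00", -34.4051, 150.8784),  # a daytime location
    ("2014-06-02T23:50:00", -34.4278, 150.8932),
]

def likely_home(track):
    """Guess a 'home' cell: the most frequent ~100 m grid cell seen late at night."""
    cells = Counter()
    for ts, lat, lon in track:
        hour = datetime.fromisoformat(ts).hour
        if hour >= 22 or hour < 6:  # assumed night-time window
            cells[(round(lat, 3), round(lon, 3))] += 1
    return cells.most_common(1)[0][0] if cells else None

print(likely_home(points))  # (-34.428, 150.893)
```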

The quantified-self movement, which incorporates data from many inputs of a person’s daily life, is being used for self-tracking and community building so individuals can work toward improving their daily functioning (e.g. how you look, feel, and live). Because devices can look inward toward oneself, one can mine very personal data (e.g. body mass index and heart rate) which can then be combined with the outward (e.g. the vital role of your community support network) to yield such quantifiers as a higi score defining a person with a cumulative grade (e.g. your score today out of a possible 999 points).4
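
The following toy calculation shows how inward and outward inputs might be blended into a single three-digit quantifier of this kind; the weights, input names, and 0-to-1 normalisation are invented for illustration and are not higi's actual formula.

```python
def wellness_score(inward, outward, max_score=999):
    """Toy cumulative grade: a weighted blend of inputs already normalised to 0..1.
    The weights and input names are illustrative assumptions only."""
    weights = {"heart_rate": 0.25, "bmi": 0.25, "sleep": 0.2,
               "community": 0.15, "activity": 0.15}
    inputs = {**inward, **outward}
    blended = sum(weights[k] * inputs[k] for k in weights)
    return round(blended * max_score)

print(wellness_score(
    inward={"heart_rate": 0.8, "bmi": 0.7, "sleep": 0.6},
    outward={"community": 0.9, "activity": 0.5}))  # 704 out of a possible 999
```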

Wearables, together with other technologies, assist in the process of taking in multiple and varied data points to synthesize the person’s mental and physical performance (e.g. sleep quality), psychological states such as moods and stimulation levels (e.g. excitement), and other inputs such as food, air quality, location, and human interactions. Neurologically, information is addictive; yet humans may make worse decisions when more information is at hand. Humans are also believed to overestimate the value of missing data, which may lead to an endless pursuit, or perhaps an overvaluing, of useless information (Bastardi and Shafir, 1998). More consequentially, too much introspection may also reduce the quality of individuals’ decisions.

13.4 Enter the Veillances

Katina Michael and M. G. Michael (2009) made a presentation that, for the first time at a public gathering, considered surveillance, dataveillance, sousveillance and überveillance all together. As a specialist term, veillance was first used in an important blogpost exploring equiveillance by Ian Kerr and Steve Mann (2006) in which the ‘valences of veillance’ were briefly described. But in contrast to Kerr and Mann, Michael and Michael were pondering the intensification of a state of überveillance through increasingly pervasive technologies, which can provide details from the big-picture view right down to minuscule personal detail.

But what does veillance mean? And how is it understood in different contexts? What does it mean to be watched by a CCTV camera, to have one’s personal details deeply scrutinized, to watch another, to watch oneself? And so we continue by defining the four types of veillance that have received attention in recognized peer-reviewed journal publications and the wider corpus of literature.

13.4.1 Surveillance

First, there is the much-embraced idea of surveillance, recognized in the early nineteenth century, from the French sur meaning ‘over’ and veiller meaning ‘to watch’. According to the Oxford English Dictionary, veiller stems from the Latin vigilare, which means ‘to keep watch’.

13.4.2 Dataveillance

Dataveillance was conceived by Clarke (1988a) as “the systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons” (although in the Oxford English Dictionary it is now defined as “the practice of monitoring the online activity of a person or group”). The term was introduced in response to government agency data-matching initiatives linking taxation records and social security benefits, among other commercial data mining practices. At the time it was a powerful response to the Australia Card proposal of 1987 (Clarke, 1988b), which was never implemented by the Hawke Government, while the Howard Government’s attempt to introduce an Access Card almost two decades later in 2005 was also unsuccessful. It is remarkable that the same issues ensue today, only on a greater magnitude, with more consequences and advanced capabilities in analytics, data storage, and converging systems.
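
A minimal sketch of the kind of data matching that prompted the term: two agencies' records joined on a shared identifier and run through a simple flagging rule. All identifiers, figures, and the threshold below are fabricated for illustration.

```python
# Fabricated records from two hypothetical agencies, keyed by a shared identifier.
tax_records = {
    "TFN-001": {"name": "J. Citizen",  "declared_income": 92000},
    "TFN-002": {"name": "A. Resident", "declared_income": 54000},
}
benefit_records = {
    "TFN-002": {"benefit": "unemployment", "paid": 11000},
    "TFN-003": {"benefit": "disability",   "paid": 18000},
}

def match(tax, benefits, income_threshold=20000):
    """Flag identifiers that appear in both systems and breach a simple rule."""
    flagged = []
    for tfn, record in tax.items():
        benefit = benefits.get(tfn)
        if benefit and record["declared_income"] > income_threshold:
            flagged.append((tfn, record["name"], benefit["benefit"]))
    return flagged

print(match(tax_records, benefit_records))
# [('TFN-002', 'A. Resident', 'unemployment')]
```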

13.4.3 Sousveillance

Sousveillance was defined by Steve Mann in 2002, but practiced since 1995, as “the recording of an activity from the perspective of a participant in the activity”.5 However, its initial introduction into the literature came in the inaugural Surveillance and Society journal in 2003 with a meaning of ‘inverse surveillance’ as a counter to organizational surveillance (Mann et al., 2003). Mann prefers to interpret sousveillance as under-sight, which maintains integrity, contra to surveillance as over-sight (Mann, 2004a), which reduces to hypocrisy if governments responsible for surveillance pass laws to make sousveillance illegal.

Whereas dataveillance is the systematic use of personal data systems in the monitoring of people, sousveillance is the inverse of monitoring people; it is the continuous capture of personal experience (Mann, 2004b). For example, dataveillance might include the linking of someone’s tax file number with their bank account details and communications data. Sousveillance, on the other hand, is a voluntary act of logging what people might see as they move through the world. Surveillance is thus considered watching from above, whereas sousveillance is considered watching from below. In contrast, dataveillance is the monitoring of a person’s activities, which presents the individual with numerous social dangers (Clarke, 1988a).

13.4.4 Überveillance

Überveillance, conceived by M. G. Michael in 2006, is defined in the Australian Law Dictionary as: “ubiquitous or pervasive electronic surveillance that is not only ‘always on’ but ‘always with you’, ultimately in the form of bodily invasive surveillance”. The Macquarie Dictionary of Australia entered the term officially in 2008 as “an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices in the human body”. Michael and Michael (2007) defined überveillance as having “to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought)”.

Überveillance is a compound word, conjoining the German über meaning ‘over’ or ‘above’ with the French veillance. The concept is very much linked to Friedrich Nietzsche’s vision of the übermensch, who is a man with powers beyond those of an ordinary human being, like a super-man with amplified abilities (Michael and Michael, 2010). Überveillance is analogous to big brother on the inside looking out: for example, heart, pulse, and temperature sensor readings emanating from the body in binary bits wirelessly, or even through amplified eyes such as inserted contact lens ‘glass’ that might provide visual display and access to the Internet or social networking applications.

Überveillance brings together all forms of watching from above and from below, from machines that move to those that stand still, from animals and from people, acquired involuntarily or voluntarily using obtrusive or unobtrusive devices (Michael et al., 2010). The network infrastructure underlies the ability to collect data directly from the sensor devices worn by the individual, and big data analytics ensures an interpretation of the unique behavioral traits of the individual, implying more than just predicted movement, but intent and thought (Michael and Miller, 2013).

It has been said that überveillance is that part of the veillance puzzle that brings together the sur, data, and sous to an intersecting point (Stephan et al., 2012). In überveillance, there is the ‘watching’ from above component (sur), there is the ‘collecting’ of personal data and public data for mining (data), and there is the watching from below (sous), which can draw together social networks and strangers, all coming together via wearable and implantable devices on/in the human body. Überveillance can be used for good in the practice of health, for instance, but we contend that, independent of its application for non-medical purposes, it will always have an underlying control factor (Masters and Michael, 2006).

13.5 Colliding Principles

13.5.1 From ‘drone view’ to ‘person view’

It can be argued that, because a CCTV camera is monitoring activities from above, we should have the ‘counter-right’ to monitor the world around us from below. It therefore follows that if Google can record ‘street views’, then the average citizen should also be able to engage in that same act, which we may call ‘person view’. Our laws as a rule do not forbid recording the world around us (or even each other, for that matter), so long as we are not encroaching on someone else’s well-being or privacy (e.g. stalking, or making material public without express consent). While we have Street View today, it will only be a matter of time before we have ‘drones as a service’ (DaaS) products that systematically provide even better high-resolution imagery than ‘satellite views’. If we can make ‘drone view’ available on Google Maps, we could probably also make ‘person view’ available. Want to look up not only a street, but a person, if they are logged in and registered? Then search ‘John Doe’ and find the nearest camera pointing toward him, and/or emanating from him. Call it a triangulation of sorts.
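
If such a 'person view' lookup were ever built, its core query could be as simple as the sketch below: given a registered person's last opt-in location fix, rank candidate cameras by great-circle distance. The registry, camera names, and coordinates are hypothetical; this illustrates the triangulation idea rather than any existing service.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

# Hypothetical registry: a registered person's last opt-in fix, plus candidate cameras
# (fixed street fixtures as well as body-worn or dash-mounted devices).
person_fix = (-33.8688, 151.2093)
cameras = {
    "streetlight-12":   (-33.8691, 151.2089),
    "bodycam-officer3": (-33.8702, 151.2110),
    "dashcam-42":       (-33.8679, 151.2150),
}

nearest = min(cameras, key=lambda cam: haversine_m(*person_fix, *cameras[cam]))
print(nearest)  # streetlight-12
```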

13.5.2 Transparency and open data

The benefits of this kind of transparency, argue numerous scholars, are that not only will we have a perfect source of open data to work with, but there will be less crime as people consider the repercussions of being caught doing wrong in real time. However, this is quite an idealistic and ethically flawed paradigm. Criminals, and non-criminals for that matter, find ways around all secure processes, no matter how technologically foolproof. At that point, the technical elite might well be systematically hiding or erasing their recorded misdemeanours while no doubt keeping the innocent person under 24/7/365 watch. There are, however, varying degrees of transparency, and most of these have to do with economies of scale and/or are context-based; they have to be. In short, transparency needs to be context related.

13.5.3 Surveillance, listening devices and the law

At what point do we actually believe that in a public space our privacy is not invaded by such incremental innovations as little wearable cameras, half the size of a matchbox, worn as lifelogging devices? One could speculate that the small size of these devices makes them unobtrusive and not easily detectable to the naked eye, meaning that they are covert in nature and blatantly break the law in some jurisdictions where they are worn and operational (Abbas et al., 2011). Some of these devices not only capture images every 30 seconds, but also record audio, making them potentially a form of unauthorized surveillance. It is also not always apparent when these devices are on or off. We must consider that the “unrestricted freedom of some may endanger the well-being, privacy, or safety of others” (Rodota and Capurro, 2005, p. 23). Where are the distinctions between the wearer’s right to capture his or her own personal experiences on the one hand (i.e. the unrestricted freedom of some), and intrusion into another’s private sphere in which he or she does not want to be recorded, and is perhaps even disturbed by the prospect of losing control over his or her privacy (i.e. endangering the well-being or privacy of others)?

13.5.4 Ethics and values

Enter ethics and values. Ethics in this debate are greatly important. They have been dangerously pushed aside, for it is ethics that determine the degree of importance, that is, the value, we place on the levels of our decision-making. When is it right to take photographs and record another individual (even in a public space), and when is it wrong? Do I physically remove my wearable device when I enter a washroom, a leisure centre, a hospital, a funeral, someone else’s home, a bedroom? Do I need to ask express permission from someone to record them, even if I am a participant in a shared activity? What about unobtrusive devices that blur the line between wearables and implantables, such as miniature recording devices embedded in spectacle frames or eye sockets and possibly in the future embedded in contact lenses? Do I have to tell my future partner or prospective employer? Should I declare these during the immigration process before I enter the secure zone?

At the same time, independent of how much crowdsourced evidence is gathered for a given event, wearables and implantables are not infallible: their sensors can easily misrepresent reality through inaccurate or incomplete readings, and data can be even further misconstrued post capture (Michael and Michael, 2007). This is the limitation of an überveillance society – devices are equipped with a myriad of sensors; they are celebrated as achieving near omnipresence, but the reality is that they will never be able to achieve omniscience. Finite knowledge and imperfect awareness create much potential for inadequate or incomplete interpretations.

Some technologists believe that they need to rewrite the books on metaphysics and ontology, as a result of old and outmoded definitions in the traditional humanities. We must be wary of our increasingly ‘technicized’ environment, however, and continue to test ourselves on the values we hold as canonical, which go towards defining a free and autonomous human being. The protection of personal data has been deemed by the EU to be an autonomous individual right.

Yet, with such pervasive data collection, how will we protect “the right of informational self-determination on each individual – including the right to remain master of the data concerning him or her” (Rodota and Capurro, 2005, p. 17)? If we rely on bio-data to drive our next move based on what our own wearable sensors tell some computer application is the right thing to do, we may very well lose a great part of our freedom and the life-force of improvisation and spontaneity. By allowing this data to drive our decisions, we make ourselves prone to algorithmic faults in software programs, among other significant problems.

13.5.5 The unintended side effects of lifelogging

Lifelogging captures continuous first-person recordings of a person’s life and can now be dynamically integrated into social networking and other applications. If lifelogging is recording your daily life with technical tools, many are unintentionally participating in a form of lifelogging by recording their lives through social networks; technically, though, data capture in social media happens in bursts (e.g. the upload of a photograph), compared with the continuous capture of first-person recordings (e.g. glogger.mobi) (Daskala, 2011). Lifelogging is believed to have such benefits as affecting how we remember, increasing productivity, reducing an individual’s sense of isolation, building social bonds, capturing memories, and enhancing communication.

Governing bodies could also derive benefit from lifelogging application data to better understand public opinion or forecast emerging health issues for society. However, memories gathered by lifelogs can have side effects. Not every image or recording you take will be a happy one. Replaying these and other moments might be detrimental to our well-being. For example, history shows that ‘looking back’ may become traumatic, as in Marina Lutz’s experience of having much of the first 16 years of her life either recorded or photographed by her father (see the short film The Marina Experiment).

Researchers have discovered that personality development and mental health could also be negatively impacted by lifelogging applications. Vulnerabilities include a high potential to be influenced by others, suggestibility, a weak perception of self, and a resulting low self-esteem (Daskala, 2011). There is also a risk that wearers may post undesirable or personal expressions of another person, causing that person emotional harm due to a negative perception of himself or herself among third parties (Daskala, 2011). We have already witnessed such events in other social forums, with tragic consequences such as suicides.

Lifelogging data may also create unhealthy competition, for example in gamification programs that use higi scores to compare your quality of life to that of others. Studies report psychological harm among those who perceive they do not meet peer expectations (Daskala, 2011); how much more so when intimate data about one’s physical, emotional, psychological, and social network is integrated, measured, and calculated to sum up quality of life in a three-digit score (Michael and Michael, 2011). Even the effect of sharing positive lifelogging data should be reconsidered. Various reports have claimed that watching other people’s lives can develop into an obsession and can incite envy, feelings of inadequacy, or a feeling that one is not accomplished enough, especially when comparing oneself to others.

13.5.6 Pebbles and shells

Perhaps lifelogs could have the opposite effect of their intended purpose, without ever denying the numerous positives. We may become wrapped up in the self, rather than in the common good, playing to a theater, and not allowing ourselves to flourish in other ways lest we be perceived as anything but normal. Such logging, posted onto public Internet archival stores, might well serve to promote a conflicting identity of the self, constant validation through page ranks, hit counts and likes, and other forms of electronic exhibitionism. Researchers purport that lifelogging activities are likely to lead to an over-reliance and excessive dependency on electronic devices and systems, with emotionally concerning, ongoing cognitive reflections as messages are posted or seen, and this could be at the expense of more important aspects of life (Daskala, 2011).

Isaac Newton gave us much to consider when he said, “I was like a boy playing on the sea-shore, and diverting myself now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me” (Brewster, 2001). Society at large must question if the measurements of Google hits, higi scores, clicks, votes, recordings, and analysis of data to quantify ‘the self’ could become a dangerously distracting exercise if left unbalanced. The aforementioned measurements, which are multi-varied and enormously insightful, may be of value – and of great enjoyment and fascination – much like Newton’s pebbles and shells. However, what is the ocean we may overlook – or ignore – as we scour the beach for pebbles and shells?

13.5.7 When bad is good

Data collection and analysis systems, such as lifelogging, may not appropriately allow individuals to progress in self-awareness and personal development upon tempered reflection. How do we aptly measure the contradictory aspects of life such as the healing that often comes through tears, or the expending of energy (exercise) to gain energy (physical health), or the unique wonder that is realized only through the pain of self-sacrifice (e.g. veritable altruistic acts)? Harvard researchers Loehr and Schwartz (2001) provide us with further evidence of how the bad (or the unpleasant) can be good relative to personal development, through an investigation in which a key participant went by the name of ‘Richard’.

Richard was an individual progressing in self-awareness as documented during an investigation in which researchers were working to determine how executives could achieve peak performance leading to increased capacity for endurance, determination, strength, flexibility, self-control, and focus. The researchers found that executives who perform to full potential, for the long term, tap into energy at all levels of the ‘pyramid of performance’, which has four ascending levels of progressive capacities: physical, emotional, mental, and spiritual.

The tip of the pyramid was identified as spiritual capacity, defined by the researchers as “an energy that is released by tapping into one’s deepest values and defining a strong sense of purpose” (Loehr and Schwartz, 2001, p. 127). The spiritual capacity, above all else, was found to be the sustenance – or the fuel – of the ideal performance state (IPS); the state in which individuals ‘bring their talent and skills to full ignition and to sustain high performance over time’ (op. cit., p. 122). However, as Richard worked to realize his spiritual capacity, he experienced significant pain during a two-year period. He reported being overcome by emotion, consumed with grief, and filled with longing as he learned to affirm what mattered most in his life. The two-year battle resulted in Richard ‘tapping into a deeper sense of purpose with a new source of energy’ (op. cit., p. 128); however, one must question if technology would have properly quantified the bad as the ultimate good for Richard. Spiritual reflections on the trajectory of technology (certainly since it has now been plainly linked to teleology) are not out of place nor should they be discouraged.

13.5.8 Censorship

Beyond the veillance (the ‘watching’) of oneself, i.e. the inward gaze, is the outward veillance and watching of the other. But this point of eye (PoE) does not necessarily mean a point of view (PoV), or even a wider-angle field of view (FoV), particularly in the context of ‘glass’. Our gaze too is subjective, and who or what will connote this censorship at the time when it really matters? The outward watching too may not tell the full story, despite its rich media capability to gather both audio and video. Audio-visual accounts have their own pitfalls. We have long known how vitally important eye gaze is for all of the social primates, and particularly for humans; there will be consequences to any artificial tampering with this basic natural instinct. Hans Holbein’s famous painting The Ambassadors (1533), with its patent reference to anamorphosis, speaks volumes of the critical distinction between PoE and PoV. Take a look, if you are not already familiar with this double portrait and still life. Can you see the skull? The secret lies in the perspective and in the tilt of the head.

13.6 Summary and Conclusions: Mind/Body Distinction

In the future, corporate marketing may hire professional lifeloggers (or mobile robotic contraptions) to log other people’s lives with commercial devices. Unfortunately, because of inadequate privacy policies or a lack of harmonized legislation, we, as consumers, may find no laws that would preclude companies from this sort of ‘live to life’ hire if we do not pull the reins on the obsession to auto-photograph and audio record everything in sight. And this needs to happen right now. We have already fallen behind and are playing a risky game of catch-up. Ethics is not the overriding issue for technology companies or developers; innovation is their primary focus because, in large part, they have a fiduciary responsibility to turn a profit. We must in turn, as an informed and socially responsive community, forge together to dutifully consider the risks. At what point will we leap from tracking the mundane, which is of the body (e.g. location of GPS coordinates), toward the tracking of the mind by bringing all of these separate components together using über-analytics and an über-view? We must ask the hard questions now. We must disclose and discuss the existence of risk, the values at stake, and the possibility of harm.

It is significant that as researchers we are once more, at least in some places, speaking of the importance of the Cartesian mind/body distinction and of the catastrophic consequences should the two continue to be confused when it comes to etymological implications and ontological categories. The mind and the body are not identical, even if we are to argue from Leibniz’s Law of Identity that two things can only be identical if they at the same time share exactly the same qualities. Here as well, vigilance is enormously important, that we might not disremember the real distinction between machine and human.

References

Abbas, R., Michael, K., Michael, M. G., & Aloudat, A. (2011). Emerging Forms of Covert Surveillance Using GPS-Enabled Devices. Journal of Cases on Information Technology, 13(2), 19-33.

ACLU. (2013). You Are Being Tracked: How License Plate Readers Are Being Used to Record Americans' Movements. from http://www.aclu.org/technology-and-liberty/you-are-being-tracked-how-license-plate-readers-are-being-used-record

Adler, I. (2013). How Our Digital Devices Are Affecting Our Personal Relationships. 90.9 WBUR.

ALD (Ed.). (2010). Uberveillance. In Australian Law Dictionary: Oxford University Press.

Australian Privacy Foundation. (2005). Human Services Card.   Retrieved 6 June 2013, from http://www.privacy.org.au/Campaigns/ID_cards/HSCard.html

Bastardi, A., & Shafir, E. (1998). On the Pursuit and Misuse of Useless Information. Journal of Personality and Social Psychology, 75(1), 19-32.

Brewster, D. (2001). Memoirs of the Life, Writings, and Discoveries of Sir Isaac Newton (1855) Volume II. Ch. 27: Adamant Media Corporation.

Capurro, R. (2013). Medicine in the information and knowledge society. Conference presentation.

Carpenter, L. (2011). Marina Lutz interview: The sins of my father. The Observer   Retrieved 20 April 2013, from http://www.guardian.co.uk/artanddesign/2011/apr/17/photography-children

Clarke, R. (1988a). Information Technology and Dataveillance. Communications of the ACM, 31(5), 498-512.

Clarke, R. (1988b). Just another piece of plastic in your wallet: the `Australian card' scheme. ACM SIGCAS Computers and Society, 18(1), 7-21.

Clarke, R. (2009, 7 April 2009). The Covert Implementation of Mass Vehicle Surveillance in Australia. Paper presented at the Fourth Workshop on the Social Implications of National Security: Covert Policing, Canberra, Australia.

Clifford, S., & Hardy, Q. (2013). Attention, Shoppers: Store Is Tracking Your Cell.   Retrieved 14 July, from http://www.nytimes.com/2013/07/15/business/attention-shopper-stores-are-tracking-your-cell.html?pagewanted=all

Collins, L. (2008). Annals of Crime. Friend Game. Behind the online hoax that led to a girl’s suicide. The New Yorker.

DailyMail. (2013). Stores now tracking your behavior and moods through cameras.   Retrieved 6 August, from http://www.dailymail.co.uk/news/article-2364753/Stores-tracking-behavior-moods-cameras-cell-phones.html?ito=feeds-newsxml

ENISA. (2011). To log or not to log?: Risks and benefits of emerging life-logging applications. European Network and Information Security Agency   Retrieved 6 July 2013, from http://www.enisa.europa.eu/activities/risk-management/emerging-and-future-risk/deliverables/life-logging-risk-assessment/to-log-or-not-to-log-risks-and-benefits-of-emerging-life-logging-applications

FastCompany. (2013). #Unplug. Fast Company, July/August(177).

Frankel, T. C. (2012, 20 October). Megan Meier's mom is still fighting bullying. stltoday.com   Retrieved 4 November 2012

Friedman, R. (2012). Why Too Much Data Disables Your Decision Making. Psychology Today: Glue   Retrieved December 4, 2012, from http://www.psychologytoday.com/blog/glue/201212/why-too-much-data-disables-your-decision-making

Goodchild, M. F. (2007). Citizens as sensors: the world of volunteered geography. GeoJournal, 69, 211–221.

Greenwald, G. (2013). NSA collecting phone records of millions of Verizon customers daily. The Guardian   Retrieved 10 August 2013, from http://www.theguardian.com/world/2013/jun/06/nsa-phone-records-verizon-court-order

Hans Holbein the Younger. (1533). The Ambassadors.

Hayes, A. (2010). Uberveillance (Triquetra).   Retrieved 6 May 2013, from http://archive.org/details/Uberveillancetriquetra

HIGI. (2013). Your Score for Life.   Retrieved 29 June 2013, from https://higi.com/about/score

Intellitix. (2013). Reshaping the Event Horizon.   Retrieved 6 July 2013, from http://www.intellitix.com/intellitix/home/

Kerr, I., & Mann, S. (2006). Exploring Equiveillance. ID TRAIL MIX.

Krause. (2012). Vigilance Fatigue in Policing.   Retrieved 22 July, from http://www.fbi.gov/stats-services/publications/law-enforcement-bulletin/december-2012/vigilance-fatigue-in-policing

Levin, A. (2013). Waiting for Public Outrage. Paper presented at the IEEE International Symposium on Technology and Society, Toronto, Canada.

Loehr, J., & Schwartz, T. (2001). The Making of a Corporate Athlete. Harvard Business Review, January, 120-129.

Lutz, M. (2012). The Marina Experiment.   Retrieved 29 May 2013, from www.themarinaexperiment.com

Macquarie (Ed.). (2009). Uberveillance. In Macquarie Dictionary (5th ed.): Sydney University.

Magid, L. (2013). Wearables and Sensors Big Topics at All Things D. Forbes.

Mann, S. (2004a). Continuous lifelong capture of personal experience with EyeTap. Paper presented at the ACM International Multimedia Conference, Proceedings of the 1st ACM workshop on Continuous archival and retrieval of personal experiences (CARPE 2004), New York.

Mann, S. (2004b). Sousveillance: inverse surveillance in multimedia imaging. Paper presented at the Proceedings of the 12th annual ACM international conference on Multimedia, New York, NY, USA.

Mann, S., Nolan, J., & Wellman, B. (2003). Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments. Surveillance and Society, 1(3), 331-355.

Masters, A., & Michael, K. (2006). Lend me your arms: the use and implications of humancentric RFID. Electronic Commerce Research and Applications, 6(1), 29-39.

Michael, K. (2010). Stop social network pitfalls. Illawarra Mercury.

Michael, K. (2013a). Big Data and the Dangers of Over-Quantifying Oneself. Computer Magazine (Multimedia)   Retrieved June 7, 2013, from http://www.youtube.com/watch?v=mn_9YHV2RGQ&list=PLHJB2bhmgB7cbB-oafjt68XbzyPV46szi&index=7

Michael, K. (2013b). Snowden's Revelations Just the Tip of the Iceberg.   Retrieved 6 July 2013, from http://uberveillance.com/blog/2013/7/23/snowdens-revelations-just-the-tip-of-the-iceberg

Michael, K. (2013c). Social Implications of Wearable Computing and Augmediated Reality in Every Day Life (IEEE Symposium on Technology and Society, ISTAS13). Toronto: IEEE.

Michael, K. (2013d). Wearable computers challenge human rights. ABC Science Online.

Michael, K., & Clarke, R. (2013). Location and tracking of mobile devices: Überveillance stalks the streets. Computer Law & Security Review, 29(3), 216-228.

Michael, K., & Michael, M. G. (2009). Teaching Ethics in Wearable Computing:  the Social Implications of the New ‘Veillance’. EduPOV   Retrieved June 18, from http://www.slideshare.net/alexanderhayes/2009-aupov-main-presentation?from_search=3

Michael, K., & Michael, M. G. (2012). Converging and coexisting systems towards smart surveillance. Awareness Magazine: Self-awareness in autonomic systems, June.

Michael, K., & Michael, M. G. (Eds.). (2007). From Dataveillance to Überveillance and the Realpolitik of the Transparent Society. Wollongong, NSW, Australia.

Michael, K., & Miller, K. W. (2013). Big Data: New Opportunities and New Challenges. IEEE Computer, 46(6), 22-24.

Michael, K., Roussos, G., Huang, G. Q., Gadh, R., Chattopadhyay, A., Prabhu, S., et al. (2010). Planetary-scale RFID Services in an Age of Uberveillance. Proceedings of the IEEE, 98(9), 1663-1671.

Michael, M. G., & Michael, K. (2007). Uberveillance. Paper presented at the 29th International Conference of Data Protection and Privacy Commissioners. Privacy Horizons: Terra Incognita, Location Based Tracking Workshop, Montreal, Canada.

Michael, M. G., & Michael, K. (2010). Towards a State of Uberveillance. IEEE Technology and Society Magazine, 29(2), 9-16.

Michael, M. G., & Michael, K. (2011). The Fall-Out from Emerging Technologies: on Matters of Surveillance, Social Networks and Suicide. IEEE Technology and Society Magazine, 30(3), 15-18.

mX. (2013). Hard to Swallow.   Retrieved 6 August 2013, from http://www.mxnet.com.au/story/hard-to-swallow/story-fnh38q9o-1226659271059

Orcutt, M. (2013). Electronic “Skin” Emits Light When Pressed. MIT Tech Review.

Oxford Dictionary. (2013). Dataveillance.   Retrieved 6 May 2013, from http://oxforddictionaries.com/definition/english/surveillance

Oxford Dictionary. (2013). Surveillance. Retrieved 6 May 2013, from http://oxforddictionaries.com/definition/english/surveillance

Papakostas, T. V., Lima, J., & Lowe, M. (2002). A Large Area Force Sensor for Smart Skin Applications. Proceedings of IEEE Sensors, 5(3).

Pearce, P., & Gretzel, U. (2012). Tourism in technology dead zones: documenting experiential dimensions. International Journal of Tourism Sciences, 12(2), 1-20.

Pivtohead. (2013). Wearable Imaging: True point of view.   Retrieved 22 June 2013, from http://pivothead.com/#

Pokin, S. (2007). MySpace' hoax ends with suicide of Dardenne Prairie teen. St. Louis Post-Dispatch.

Resch, B. (2013). People as Sensors and Collective Sensing-Contextual Observations Complementing Geo-Sensor Network Measurements. Paper presented at the Progress in Location-Based Services, Lecture Notes in Geoinformation and Cartography.

Roberts, P. (1984). Information Visualization for Stock Market Ticks: Toward a New Trading Interface. Massachusetts Institute of Technology, Boston.

Rodota, S., & Capurro, R. (2005). Ethical Aspects of ICT Implants in the Human Body. The European Group on Ethics in Science and New Technologies (EGE)   Retrieved June 3, 2006, from http://ec.europa.eu/bepa/european-group-ethics/docs/avis20_en.pdf

SHRM. (2011). Retrieved from http://www.shrm.org/publications/hrnews/pages/fatiguefactors.aspx

Spence, R. (2009). Eyeborg.   Retrieved 22 June 2010, from http://eyeborg.blogspot.com.au/

Stephan, K. D., Michael, K., Michael, M. G., Jacob, L., & Anesta, E. (2012). Social Implications of Technology: Past, Present, and Future. Proceedings of the IEEE, 100(13), 1752-1781.

SXSchedule. (2013). Better Measure: Health Engagement & higi Score.   Retrieved 29 June 2013, from http://schedule.sxsw.com/2013/events/event_IAP4888

Thurston, B. (2013). I have left the internet. Fast Company, July/August(177), 66-78, 104-105.

Ware, C. (2000). Information Visualization: Perception for Design. San Francisco, CA: Morgan Kaufmann.

Warm, J. S., Parasuraman, R., & Matthews, G. (2008). Vigilance Requires Hard Mental Work and Is Stressful. Human Factors, 433-441.

Williams, R. B. (2012). Is Facebook Good Or Bad For Your Self-Esteem? Psychology Today: Wired for Success.

Wordnik. (2013). Sousveillance.   Retrieved 6 June 2013, from http://www.wordnik.com/words/sousveillance

Endnotes

1 http://www.shrm.org/

2 www.sleepfoundation.org 

3 Someone searching for a WiFi wireless network connection using a mobile device in a moving vehicle.

4 http://higi.com/about/score; http://schedule.sxsw.com

5 http://www.wordnik.com/words/sousveillance

Citation: Katina Michael, M. G. Michael, and Christine Perakslis (2014) Be Vigilant: There Are Limits to Veillance. The Computer After Me: pp. 189-204. DOI: https://doi.org/10.1142/9781783264186_0013

Advanced location-based services

This special issue of Computer Communications presents state-of-the-art research and applications in the area of location-based services (LBS). Initial location-based services entered the market around the turn of the millennium and for the greater part appeared in the form of restaurant finders and tourist guides, which never gained widespread user acceptance. The reasons for this were numerous and ranged from inaccurate localization mechanisms like Cell-ID and little creativity in the design and functions of such services, to a generally low acceptance of data services. However, in recent years, there has been an increasing market penetration of GPS-capable mobile phones and devices, which not only support high-accuracy positioning, but also allow for the execution of sophisticated location-based applications due to fast mobile data services, remarkable computational power and high-resolution color displays. Furthermore, the popularity of these devices is accompanied by the emergence of new players in the LBS market, which offer real-time mapping, points-of-interest content, navigation support, and supplementary services. LBS have also received a significant boost from federal government agency mandates in emergency services, such as in the United States of America. All these advancements are making LBS one of the most exciting areas of research and development with the potential to become one of the most pervasive and convenient services in the near future.

As it turns out, these developments lead to new and sophisticated LBSs, which are referred to as “Advanced LBSs” in this special issue. Examples include, but are not limited to, proactive services, which automatically inform their users when they enter or leave the bounds of pre-defined points of interest; community services, where members of a community mutually exchange their locations either on request or in a proactive fashion; or mobile gaming, where the geographic locations of the players become an integral part of the game. However, the realization of such Advanced LBSs is also associated with some challenges and problems, which have yet to be resolved. For example, there is a strong need for powerful middleware frameworks, architectures and protocols that support the acquisition of location data, their distribution, and processing. In the area of localization mechanisms, accuracy, reliability, and coverage of available technologies must be improved, for example, by combining several methods and enabling a seamless positioning handover between outdoor and indoor technologies. And, finally, because LBSs will significantly change the way people interact and communicate with each other, similar to the impact that mobile phones had a decade ago, solutions must be developed that allow an LBS user to safeguard their privacy with respect to real-time location reckoning, and historical location profiles.

In this special issue, we have addressed the challenges of Advanced LBSs. We received many high-quality submissions from all over the world and finally selected 13 articles. Papers were carefully reviewed and selected based on their scholarship and to provide as broad an appeal as possible across a range of research topics. We received several papers with advanced and very interesting applications, of which we selected the most relevant and novel. Five papers are devoted to middleware and architectures, which are meant to make the infrastructure transparent to application developers and therefore speed up the development process. We received many submissions related to localization schemes and algorithms, showing the importance of this aspect to location-based services and the maturity of this research topic. Three localization-related papers are included in the issue. Finally, although security, privacy and ethical issues are well-known concerns in the field of LBS, too few articles were submitted on these topics, indicating that this area requires much needed exploration. However, three interesting papers are included for your perusal. It therefore follows that location-based services can be considered ‘advanced’ in the totality of a given end-to-end offering, or in a given aspect: complex network architecture, novel application, or multi-mode end-user IP device. A summary of the accepted papers follows.

Two papers are related to LBS applications. The first paper, “Location-Based Services for Elderly and Disabled People” by Alvaro Marco et al., describes a robust, low cost, highly accurate and scalable ZigBee- and ultrasound-based positioning system that provides alarm, monitoring, navigation and leisure services to elderly and disabled people in a residence located in Zaragoza, Spain. The paper “BlueBot: Asset Tracking via Robotic Location Crawling” by Abhishek Patil et al. presents a robot-based system that combines RFID and Wi-Fi positioning technology to automatically survey assets in a facility. The proposed system, which uses off-the-shelf components, promises to automate the tedious inventory process taking place in libraries, manufacturers, distributors, and retailers of consumer goods.

Five of the selected papers deal with software middleware, architectures and APIs for advanced LBSs. The first paper, “The PoSIM Middleware for Translucent and Context-aware Integrated Management of Heterogeneous Positioning Systems” by Paolo Bellavista et al., presents middleware that integrates different positioning systems and hides them from the application developer while providing different levels of information depending on context, LBS requirements, user preferences, device characteristics, and overall system state. PoSIM provides application developers with both a high-level API that offers simplified access to positioning systems and a low-level API that provides detailed information from a specific positioning system. Sean Barbeau et al. present an update of the under-development JSR293 Java Location API for J2ME. The article describes the main features of the current API as well as the significant enhancements and new services included in the standardization effort of the expert group so far. Next, the paper “The Internet Location Services Model” by Martin Dawson presents the architecture and services being standardized by the IETF to provide location information to devices independently of any remote service provider. Hasari Celebi and Hüseyin Arslan in “Enabling Location and Environment Awareness in Cognitive Radios” propose a cognitive radio-based architecture that utilizes not only location but also environment information to support advanced LBS. Finally, Christo Laoudias et al. present “Part One: The Statistical Terminal Assisted Mobile Positioning Methodology and Architecture”. The paper describes the architecture of the STAMP system, which is meant to improve the accuracy of existing positioning systems by exploiting measurements collected at the mobile terminal side.

In the area of localization, three papers are included for your perusal. The first paper, by Yannis Markoulidakis et al., presents “Part Two: Kalman Filtering Options for Error Minimization in Statistical Terminal Assisted Mobile Positioning”, a Kalman filter-based solution to minimize the terminal position error for the STAMP system. Then, Marian Mohr et al. present “A Study of LBS Accuracy in the UK and a Novel Approach to Inferring the Positioning Technology Employed”, an empirical study of the accuracy of positioning information in the UK and a novel technique to infer the positioning technology used by the cellular operators. Finally, in “MLDS: A Flexible Location Directory Service for Tiered Sensor Networks”, Sangeeta Bhattacharya et al. present a multi-resolution location directory service that allows the realization of LBSs with wireless sensor networks. The system successfully tracks mobile agents across single and multiple sensor networks while considering accuracy and communication costs.

The final three articles are devoted to security, privacy and ethical issues, again, very important topics in the realization of advanced LBSs. In “Location Constraints in Digital Rights Management”, Adam Muhlbauer et al. describe the design and implementation of a system for creating and enforcing licences containing location constraints, which can be used to confine access to sensitive documents to a defined area. The following paper, “A TTP-Free Protocol for Location Privacy in Location-Based Services” by Agusti Solanas and Antoni Martínez-Ballesté, presents a distributed technique to progressively increase the privacy of the users when they exchange location information among untrusted parties. Finally, the paper “A Research Note on Ethics in the Emerging Age of Überveillance” by M.G. Michael et al. defines, describes and interprets the socio-ethical implications that tracking and monitoring services bring to humans because of the ability of the government and service providers to collect targeted data and conduct general surveillance on individuals. The study calls for further research to create legislation, policies and social awareness in the age of Überveillance, an emerging concept used to describe exaggerated, omnipresent electronic surveillance.

This issue of Computer Communications offers a ground-breaking view into current and future developments in Advanced Location-Based Services. The global nature of submissions indicates that location-based services is a world-wide application focus that has universal appeal both in terms of research and commercialization. This issue offers both academic and industry appeal: the former as a basis toward future research directions, and the latter toward viable commercial LBS implementations. Advanced location-based services in the longer term will be characterized by their criticality in consumer, business and government applications in the areas of banking, health, supply chain management, emergency services, and national security.

We thank Editor-in-Chief Jeremy Thompson and Co-Editor-in-Chief Mohammed Atiquzzaman for hosting this special issue. Thanks also to Lorraine McMorrow and Sandra Korver for their support overseeing the paper review and publishing processes. We also thank all the authors and anonymous reviewers for their hard and timely work.

We hope you enjoy this issue as much as we did!

Citation: Miguel A.Labrador, Katina Michael, Axel Küpper Advanced location-based services, Computer Communications, Vol. 31, No. 6, 18 April 2008, pp. 1053-1054. DOI: https://doi.org/10.1016/j.comcom.2008.01.033

Lend Me Your Arms: Use and Implications of RFID Implants

Abstract

Recent developments in the area of RFID have seen the technology expand from its role in industrial and animal tagging applications, to being implantable in humans. With a gap in literature identified between current technological development and future humancentric possibility, little has been previously known about the nature of contemporary humancentric applications. By employing usability context analyses in control, convenience and care-related application areas, we begin to piece together a cohesive view of the current development state of humancentric RFID, as detached from predictive conjecture. This is supplemented by an understanding of the market-based, social and ethical concerns which plague the technology.

1. Introduction

Over the past three decades, Radio-frequency identification (RFID) systems have evolved to become cornerstones of many complex applications. From first beginnings, RFID has been promoted as an innovation in convenience and monitoring efficiencies. Indeed, with RFID supporters predicting the growth of key medical services and security systems, manufacturers are representing the devices as ‘life-enhancing’. Though the lifestyle benefits have long been known, only recently have humans become both integral and interactive components in RFID systems. Where we once carried smart cards or embedded devices interwoven in clothing, RFID technology is now at a point where humans can safely be implanted with small transponders.

This paper aims to explore the current state of development for humancentric applications of RFID. The current state is defined by the intersection of existing development for the subjects and objects of RFID – namely humans and implants. The need for such a study has been identified by a gap in knowledge between present applications and future possibility. This study aims to move beyond forecasting and provide a cohesive examination of existing humancentric RFID applications. Analysis of future possibility is outside the scope of this study. Instead, a discussion will be provided on present applications, their feasibility, use and social implications.

2. Literature review

The literature review is organized into three main areas – control, convenience, and care. In each of these contexts, literature will be reviewed chronologically.

2.1. The context of control

A control-related humancentric application of RFID is any human use of an implanted RFID transponder that allows an implantee to have power over an aspect of their lives, or that allows a third party to have power over an implantee. Substantial literature on humancentric control applications begins in 1997 with United States patent 5629678 for a ‘Personal Tracking and Recovery System’. Though the literature scientifically describes the theoretical tracking system for recovery of RFID-implanted humans, no further evidence is available to ascertain whether it has since been developed. Questions as to feasibility of use are not necessarily answered by succeeding literature. Reports of the implantation of British soldiers [1], for example, lack the evidentiary support needed to assuage doubts. Further, many articles highlight the technological obstacles besieging humancentric RFID systems. These include GPS hardware miniaturization [2] and creating active RFID tags capable of being safely recharged from within the body. Further adding to reservation, much literature is speculative in nature. Eng [3], for example, predicts that tags will be embedded in children to advise parents of their location.

Despite concerns and conjecture, actual implementations of humancentric control applications of RFID have been identified. Both Murray [4] and Eng documented the implantation of Richard Seelig, who had tags placed in his hip and arm in response to the September 11 tragedy of 2001. This sophisticated technology was employed to provide security and control over personal identification information. Wilson [5] also provides the example of 11-year-old Danielle Duval, who has had an active chip (i.e. containing a rechargeable battery) implanted in her. Her mother believes that it is no different to tracking a stolen car, simply that it is being used for another more important application.

2.2. The context of convenience

A convenience-related humancentric application of RFID is any human use of an implanted RFID transponder that increases the ease with which tasks are performed. The first major documented experiment into the use of human-implantable RFID was within this context. Sanchez-Klein [6] and Witt [7] both report on the self-implantation of Kevin Warwick, Director of Cybernetics at the University of Reading. They describe the results of Warwick’s research: doors opened, lights switched on and computers responded to the presence of the microchip. Warwick himself gives a review of the research in his article ‘Cyborg 1.0’; however, this report is informal and contains emotive descriptions of “fantastic” experiences [8].

Woolnaugh [9], Holden [10], and Vogel [11] all published accounts of the lead-up to Warwick’s second ‘Cyborg 2.0’ experiment and, although Woolnaugh’s work involves the documentation of an interview, all three are narrative descriptions of proposed events rather than a critical analysis within definitive research frameworks. Though the commotion surrounding Warwick later died down, speculation did not, with Eng proposing a future where credit card features would be available in implanted RFID devices. The result would see commercial transactions made more convenient.

2.3. The context of care

A care-related humancentric application of RFID is any human use of an implanted RFID transponder where function is associated with medicine, health or wellbeing. In initial literature, after the Cyborg 1.0 trial, Kevin Warwick envisioned that with RFID implants paraplegics would walk [7]. Building incrementally on this notion is the work of Kobetic, Triolo and Uhlir, who documented the study of a paraplegic male who had muscular stimuli delivered via an implanted, RFID-controlled electrical stimulation system [12]. Though not allowing the mobility which Warwick dreamt of, results did include increased energy and fitness for the patient.

Outside the research sphere, much literature centers on eight volunteers who were implanted with commercial VeriChip RFID devices in 2002 trials. Murray [13], Black [14], Grossman [15] and Gengler [16] all document medical reasons behind the implantation of four subjects. Supplemented by press releases however, all reports of the trials were journalistic, rather than research-based. In contrast, non-trivial research is found in the work of Michael [17]. Her thesis uses a case study methodology, and a systems of innovation framework, to discuss the adaptation of auto-ID for medical implants.

2.4. Critical response to literature

More recent publications on humancentric RFID include the works of Masters [18], Michael and Michael [19], Perusco and Michael [20], Johnston [21], and Perakslis and Wolk [22]. Masters approaches the subject from the perspective of usability contexts, while Perusco and Michael use document analysis to categorise location services into tag, track and trace applications. Johnston uses content analysis to identify important themes in the literature, supplemented by a small-scale sample survey on the social acceptance of chip implants. Perakslis and Wolk also follow this latter methodology. Of the other (earlier) landmark studies, the majority are concerned with non-humancentric applications. Gerdeman [23], Finkinzeller [24] and Geers [25] all use case studies to investigate non-humancentric RFID, and hence our methodological precedent is set here. The bulk of the remaining literature is news-type in nature and the absence of research frameworks is evident. The few exceptions to this include Woolnaugh [9], who conducted an interview, and Murray [13] and Eng [3], who provide small case studies. In further criticism, the news articles do not demonstrate technological trajectories but speculate on utopian implementations unlikely to be achieved by incremental development in the short to medium term. Thus, any real value in these news articles can only be found in the documentation of events.

3. Research methodology

Several modes of academic inquiry were used in this study, though usability context analyses were the focal means of research. These analyses are similar to case studies as they investigate “a contemporary phenomenon within its real life context when the boundaries between phenomenon and context are not clearly evident” [26]. They also similarly use multiple sources of evidence; however, they are differentiated on the basis of the unit of analysis. In a usability context analysis methodology, units are not individuals, groups or organizations but are applications or application areas for a product, where ‘product’ is defined as “any interactive system or device designed to support the performance of users’ tasks” [27]. The results of multiple analyses are more convincing than a singular study, and the broad themes identified cover the major fields of current humancentric RFID development.

Further defining the research framework, the primary question to be answered – ‘what is the current state of application development in the field of humancentric RFID devices?’ – is justifiably exploratory. It entails investigation into contemporary technology usage and seeks to clarify boundaries within the research area. As such, this is a largely qualitative study that uses some elements of descriptive research to enhance the central usability context analyses. The usability context analyses are also supplemented by a discussion of surrounding social, legal and ethical ambiguities. By this means, the addition of a narrative analysis to the methodology ensures a thorough investigation of usage and context.

4. Usability context analysis: control

The usability context analysis for control is divided into three main sub-contexts – security, management, and social controls.

4.1. Security controls

The most basic security application involves controlling personal identification through identifying data stored on a transponder. In theory, the limit to the amount of information stored is subject only to the capacity of the embedded device or associated database. Further, because the identifier is secured within the body, its loss is near impossible, even though, as has occurred in herd animals, there are some concerns over possible dislodgement. Accordingly, the main usability drawback lies with reading the information. Implanted identification is useless if it is inaccessible.

Numerous applications have been proposed to assist individuals who depend solely on carers for support. This group consists of newly-born babies, sufferers of mental illness, persons with disabilities and the elderly. One use involves taking existing infant protection systems at birthing centres and internalizing the RFID devices worn by newborns. This would aid in identifying those who cannot identify themselves. Further, when connected to security sensors and alarms, the technology can alert staff to the “unauthorized removal of children” [28]. The South Tyneside Healthcare Trust Trial in the UK is a typical external-use example case. Early in 1995, Eagle Tracer installed an electronic tagging system at the hospital using TIRIS electronic tags and readers from Texas Instruments. Detection aerials were hidden at exit points so that if any baby was taken away without authorisation, its identity would be known and an alarm raised immediately. The trial was so successful that the hospital was considering expanding the system to include the children’s ward [29]. Notably, a number of other institutions have already begun targeting RFID applications toward adolescents. In Japan, students are being tagged in a bid to keep them safe. RFID transponders are being placed inside their backpacks and are used to advise parents when their child has arrived at school [30]. A similar practice is being conducted in California, where children are being asked to “wear” RFID tags around their necks when on school grounds [31].

Commentators are using this lack of objection to external electronic tagging for minors to highlight the idea that a national identity system based on implants is not impossible. Some believe that there will come a time when it will be common for different groups in the population to have tags implanted at birth. In Britain, chip implantation was suggested for illegal immigrants, asylum seekers and even travellers. Smet [32] argued the following, “[i]f you look to our societies, we are already registered from birth until death. Our governments know who we are and what we are. But one of the basic problems is the numbers of people in the world who are not registered, who do not have a set identity, and when people move with real or fake passports, you cannot identify them.”

4.2. Management controls

Many smart card access systems use RFID technology to associate a cardholder with access permissions to particular locations. Replacing cards with RFID implants alters the form of the ‘key’ but does not require great changes to verification systems. This is because information stored on an RFID microchip in a smart card can be stored on an implanted transponder. Readers are similarly triggered when the transponder is nearby. This application would have greatest value in ‘mission critical’ workplaces or for persons whose role hinges upon access to a particular location. The implanted access pass has the added benefit of being permanently attached to its owner.

Access provision translates easily into employee monitoring. In making the implanted RFID transponder the access pass to certain locations or resources, times of access can be recorded to ensure that the right people are in the right place at the right time. Control in this instance then moves away from ideals of permission and embraces the notion of supervision. A company’s security policy may stipulate that staff badges be secured onto clothing or that employees must wear tags woven into their uniforms. Some employers require their staff to wear RFID tags in a visible location for both identification purposes and access control [33]. In this regard, Olivetti’s “active badge” was ahead of its time when it was first launched [34].
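To make this verification-and-supervision model concrete, a minimal sketch follows (in Python, with hypothetical tag identifiers and a made-up permission table; it does not describe any deployed VeriChip or Olivetti system). The point of the sketch is that a single read event can both authorise entry and append a time-stamped record to an access log.

from datetime import datetime

# Hypothetical permission table: implant identifier -> zones the holder may enter.
PERMISSIONS = {
    "TAG-0042": {"server-room", "laboratory"},
    "TAG-0117": {"laboratory"},
}

ACCESS_LOG = []  # each entry: (timestamp, tag_id, zone, granted)

def on_tag_read(tag_id, zone):
    """Called when a reader at `zone` detects an implanted transponder."""
    granted = zone in PERMISSIONS.get(tag_id, set())
    # The same event that opens the door also records who was where, and when:
    # the shift from permission to supervision described above.
    ACCESS_LOG.append((datetime.now(), tag_id, zone, granted))
    return granted

if __name__ == "__main__":
    print(on_tag_read("TAG-0042", "server-room"))  # True: door opens, entry logged
    print(on_tag_read("TAG-0117", "server-room"))  # False: denied, attempt still logged
    for entry in ACCESS_LOG:
        print(entry)

The logging line costs nothing extra: the same infrastructure that grants permission also produces the supervision record.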

4.3. Social controls

In the military, transponders may serve as an alternative to dog tags. Using RFID, in addition to the standard name, rank and serial number, information ranging from allergies and dietary needs to shoe size can be stored. This purports to ease local administrative burdens, and can eliminate the need to carry identification documents in the field, allowing for accurate, immediate identification of prisoners of war.

Just as humancentric applications of RFID exist for those who enforce law, so too do applications exist for people who have broken it. The concept of ‘electronic jails’ for low-risk offenders is starting to be considered more seriously. In most cases, parolees wear wireless wrist or ankle bracelets and carry small boxes containing the vital tracking technology. Sweden and Australia have implemented this concept and trials are taking place in the UK, US, Netherlands and Canada. In 2002, 27 American states had tested or were using some form of satellite surveillance to monitor parolees [14]. In 2005 there were an estimated 120,000 tracked parolees in the United States alone [35]. Whilst tagging low-risk offenders is not popular in many countries, it is far more economical than the conventional jail. Social benefits are also present as there is a level of certainty involved in identifying and monitoring so-called ‘threats’ to society. In a more sinister scenario in South America, chip implants are marketed toward victims of crime rather than offenders. They are seen as a way “to identify kidnapping victims who are drugged, unconscious or dead” [36].

5. Usability context analysis: convenience

The usability context analysis for convenience is divided into three main sub-contexts – assistance, financial services and interactivity.

5.1. Assistance

Automation is the repeated control of a process through technological means. Implied in the process is a relationship, the most common of which involves linking an implantee with appropriate data. Such information in convenience contexts can, however, be extended to encompass goods or physical objects with which the implantee has an association of ownership or bailment. VeriChip, for example, a manufacturer of human-implantable RFID transponders, has developed VeriTag for use in travel. This device allows “personnel to link a VeriChip subscriber to his or her luggage… flight manifest logs and airline or law enforcement software databases” [37]. Convenience is provided for the implantee, who receives greater assurance that they and their luggage will arrive at the correct destination, and also for the transport operator, who is able to streamline processes using better identification and sorting measures.

Extending the notion of timing to periods of movement leads to applications that can locate an implantee or find an entity relative to them [38]. This includes “find me”, “find a friend”, “where am I” and “guide me to” solutions. Integrating RFID and GPS technologies with a geographic information systems (GIS) portal such as the Internet-based mapquest.com would also allow users to find destinations based on their current GPS location. The nature of this application lends itself to roadside assistance or emergency services, where the atypical circumstances surrounding the service may mean that other forms of subscriber identification are inaccessible or unavailable.
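A hedged sketch of a ‘where am I’/‘guide me to’ lookup of this kind is given below; the subscriber identifier, points of interest and coordinates are invented for illustration, and a production service would source them from a GIS portal rather than a hard-coded list.

import math

# Hypothetical points of interest: (name, latitude, longitude).
POINTS_OF_INTEREST = [
    ("Hospital", -34.4278, 150.8931),
    ("Service station", -34.4051, 150.8784),
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 coordinates, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def guide_me_to(tag_id, gps_fix):
    """Pair the implant-derived identity with a GPS fix and return the nearest destination."""
    lat, lon = gps_fix
    name, dist = min(
        ((n, haversine_km(lat, lon, plat, plon)) for n, plat, plon in POINTS_OF_INTEREST),
        key=lambda item: item[1],
    )
    return f"Subscriber {tag_id}: nearest destination is {name}, {dist:.1f} km away"

if __name__ == "__main__":
    print(guide_me_to("TAG-0042", (-34.4205, 150.8860)))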

5.2. Financial services

Over the last few decades, world economies have acknowledged the rise of the cashless society. In recent years though, alongside traditional contact cards, we have seen the emergence of alternate payment processes. In 2001, Nokia tested the use of RFID in its 5100-series phone covers, allowing the device to be used as a bank facility. RFID readers were placed at McDonalds drive-through restaurants in New York, and the consumer could pay their bill by holding their mobile phone near a reader. The reader contacted a wireless banking network and payment was deducted from a credit or debit account. Wired News noted the convenience, stating, “there is no dialing, no ATM, no fumbling for a wallet or dropped coins” [39]. These benefits would similarly exist with implanted RFID. Ramo has noted the feasibility, commenting that “in the not too distant future” money could be stored anywhere, as well as “on a chip implant under [the] skin” [40]. Forgetting your wallet would no longer be an issue.
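The deduction flow described in the Nokia trial can be sketched roughly as below; the account store, identifiers and amounts are placeholders rather than details of any actual wireless banking network, but the sequence (reader detects the tag, the linked account is checked, payment is debited) mirrors the description above and would be unchanged if the tag were implanted rather than carried.

# Hypothetical account balances keyed by the transponder's unique identifier.
ACCOUNTS = {"TAG-0042": 50.00}

class PaymentDeclined(Exception):
    pass

def charge_on_read(tag_id, amount):
    """Reader detects the transponder; the banking back-end debits the linked account."""
    balance = ACCOUNTS.get(tag_id)
    if balance is None or balance < amount:
        raise PaymentDeclined(f"cannot charge {amount:.2f} to {tag_id}")
    ACCOUNTS[tag_id] = balance - amount
    return ACCOUNTS[tag_id]

if __name__ == "__main__":
    print(charge_on_read("TAG-0042", 8.95))  # balance remaining after the drive-through order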

It is also feasible that humancentric RFID could eliminate the need to stand in line at a bank. Purely as a means of identification, the unique serial or access key stored on the RFID transponder can be used to prove identity when opening an account or making a transaction. The need to gather paper-based identification is removed and, conveniently, the same identification used to open the account is instantly available if questioned. This has similar benefits for automatic teller machines. When such intermediary transaction devices are fitted with RFID readers, RFID transponders have the ability to replace debit and credit cards. This is in line with Warwick’s prediction that implanted chips “could be used for money transfers, medical records, passports, driving licenses, and loyalty cards” [41].

5.3. Interactivity

On August 24, 1998 Professor Kevin Warwick became the first recorded human to be implanted with an RFID device. Using the transponder, Warwick was able to interact with the ‘intelligent’ building that he worked in. Over the nine days he spent implanted, doors formerly requiring smart card access automatically opened. Lights activated when Warwick entered a room and upon sensing the Professor’s presence his computer greeted him. Warwick’s ‘Project Cyborg 1.0’ experiment thus showed enormous promise for humancentric convenience applications of RFID. The concept of such stand-alone applications expands easily into the development of personal area networks (PANs) and the interactive home or office. With systems available to manage door, light and personal computer preferences based on transponder identification, further climate and environmental changes are similarly exploitable (especially considering non-humancentric versions of these applications already exist) [42].

Given the success of interacting with inanimate locations and objects, the next step is to consider whether person-to-person communication can be achieved using humancentric RFID. Such communication would conveniently eliminate the need for intermediary devices like telephones or post. Answering this question was an aim of ‘Project Cyborg 2.0’, with Warwick writing, “We’d like to send movement and emotion signals from one person to the other, possibly via the Internet” [43]. Warwick’s wife Irena was the second trial subject, being similarly fitted with an implant in her median nerve. Communicating via computer-mediated signals was met with only limited success, however. When Irena clenched her fist, for example, Professor Warwick received a shot of current through his left index finger [44]. Movement sensations were therefore effectively, though primitively, transmitted.

6. Usability context analysis: care

The usability context analysis for care is divided into three main sub-contexts – medical, biomedical and therapeutic.

6.1. Medical

As implanted transponders contain identifying information, the storage of medical records is an obvious, and perhaps fundamental, humancentric care application of RFID. Similar to other identification purposes, a primary benefit involves the RFID transponder imparting critical information when the human host is otherwise incapable of communicating. In this way, the application is “not much different in principle from devices… such as medic-alert bracelets” [16]. American corporation VeriChip markets their implantable RFID device for this purpose. Approved for distribution throughout the United States in April of 2002, it has been subject to regulation as a medical device by the Food and Drug Administration since October of the same year.

Care-related humancentric RFID devices provide unparalleled portability for medical records. Full benefit cannot be gained without proper infrastructure however. Though having medical data instantly accessible through implanted RFID lends itself to saving lives in an emergency, this cannot be achieved if reader equipment is unavailable. The problem is amplified in the early days of application rollout, as the cost of readers may not be justified until the technology is considered mainstream. Also, as most readers only work with their respective proprietary transponders, questions regarding market monopolies and support for brand names arise.

6.2. Biomedical

A biosensor is a device which “detects, records, and transmits information regarding a physiological change or the presence of various chemical or biological materials in the environment” [45]. It combines biological and electronic components to produce quantitative measurements of biological parameters, or qualitative alerts for biological change. When integrated with humancentric RFID, biosensors can transmit source information as well as biological data. The time savings in simultaneously gathering two distinct data sets are an obvious benefit. Further, combined reading of the biological source and measurement is less likely to encounter the human error linked with manually correlating data to data sources.

Implantable transponders allowing for the measurement of body temperature have been used to monitor livestock for over a decade [25]. As such, the data procurement benefits are well known. The technology does, however, give a revolutionary new facet to human care by allowing internal temperature readings to be gained, post-implantation, through non-invasive means. In 1994 Bertrand Cambou, director of technology for Motorola’s Semiconductor Products in Phoenix, predicted that by 2004 all persons would have such a microchip implanted in their body to monitor and perhaps even control their blood pressure, heart rate, and cholesterol levels [46]. Though Cambou’s predictions did not come to timely fruition, the multitude of potential applications are still feasible and include: chemotherapy treatment management; chronic infection or critical care monitoring; organ transplantation treatment management; infertility management; post-operative or medication monitoring; and response to treatment evaluation. Multiple sensors placed on an individual could even form a body area network (BAN).

An implantable RFID device for use by diabetes sufferers has been prototyped by biotechnology firm M-Biotech. The small glucose bio-transponder consists of a miniature pressure sensor and a glucose-sensitive hydrogel that swells “reversibly and to varying degrees” when changes occur in the glucose concentrations of surrounding fluids [47]. Implanted in the abdominal region, the transponder is continually read by a wireless alarm unit carried by the patient, which monitors for critical glucose levels.
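A minimal sketch of the alarm-unit logic implied by such a device follows; the thresholds and readings are purely illustrative assumptions, not M-Biotech device parameters or clinical guidance.

# Illustrative glucose thresholds in mmol/L (assumed values for the sketch only).
LOW_THRESHOLD = 4.0
HIGH_THRESHOLD = 10.0

def classify_reading(glucose_mmol_per_l):
    """Map a bio-transponder reading to the state a carried alarm unit might display."""
    if glucose_mmol_per_l < LOW_THRESHOLD:
        return "ALARM: low glucose"
    if glucose_mmol_per_l > HIGH_THRESHOLD:
        return "ALARM: high glucose"
    return "normal"

if __name__ == "__main__":
    for reading in (3.2, 5.6, 12.4):  # simulated readings relayed from the implant
        print(reading, "->", classify_reading(reading))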

6.3. Therapeutic

Implanted therapeutic devices are not new; they have been used in humans for many years. Alongside the use of artificial joints for example, radical devices such as pacemakers have become commonplace. The use of RFID with these devices however has re-introduced some novelty to the remedial solution [48]. This is because, while the therapeutic devices remain static in the body, the integration of RFID allows for interactive status readings and monitoring, through identification, of the device.

There are very few proven applications of humancentric RFID in the treatment usability sub-context at present, if one puts cochlear implants [49] and smart pills [50] aside. Further, of those applications at the proof of concept stage, benefits to the user are generally gained via an improvement to the quality of living, and not a cure for disease or disability. With applications to restore sight to the blind [51] and re-establish normal bladder function for patients with spinal injuries already in prototype form, however, some propose that real innovative benefit is only a matter of time [52]. Arguably the technology for the applications already exists. All that needs to be prototyped is a correct implementation. Thus, feasibility is perhaps a matter of technological achievement and not technological advancement.

7. Findings

The choice of control, convenience and care contexts for analysis stemmed from the emergence of separate themes in the literature review; however the context analyses themselves showed much congruence between application areas. In all contexts, identification and monitoring are core functions. For control, this functionality exists in security and in management of access to locations and resources. For convenience, identification necessarily provides assistance and monitoring supports interactivity with areas and objects. Care, as the third context, requires identification for medical purposes and highlights biological monitoring as basic functionality.

Table 1. High level benefits and costs for humancentric RFID

With standard identification and monitoring systems as a basis, it is logical that so many humancentric applications of RFID have a mass target market. Medical identification for example is not solely for the infirm because, as humans, we are all susceptible to illness. Similarly, security and convenience are generic wants. Combined with similarities between contextual innovations, mass-market appeal can lead to convergence of applications. One potential combination is in the area of transportation and driver welfare. Here the transponder of an implanted driver could be used for keyless passive entry (convenience), monitoring of health (care), location based services (convenience), roadside assistance (convenience) and, in terms of fleet management or commercial transportation, driver monitoring (control).

Despite parallels and a potential for convergence, development contexts for humancentric RFID are not equal. Instead, control is dominant. Though care can be a cause for control and medical applications are convenient, it is control which filters through other contexts as a central tenet. In convenience applications, control is in the power of automation and mass management, in the authority over environments and devices. For care applications, medical identification is a derivative of identification for security purposes and the use of biosensors or therapeutic devices extends control over well-being. Accordingly, control is the overriding theme encompassing all contexts of humancentric RFID in the current state of development [53].

Alongside the contextual themes encapsulating the usability contexts are the corresponding benefits and costs in each area (Table 1). When taking a narrow view, it is clear that many benefits of humancentric RFID are application specific. Therapeutic implants, for example, have the benefit of the remedy itself. Conversely, a general concern is that applications are broadly prone to social disadvantages, including religious objections and privacy fears.

7.1. Application quality and support for service

For humancentric RFID, application quality depends on commercial readiness. For those applications being researched, the usability context analyses suggest that the technology, and not the applications, present the largest hurdle. In his Cyborg 1.0 experiments for example, Professor Kevin Warwick kept his transponder implanted for only nine days, as a direct blow would have shattered the glass casing, irreparably damaging nerves and tissue.

Once technological difficulties are overcome and applications move from proof of concept into commercialization, market-based concerns are more relevant. Quality of data is a key issue. In VeriChip applications, users control the personal information that is accessible through their implanted transponder, though it is stored in the Global VeriChip Subscriber Registry database. The system does not appear to account for data correlation, however, and there is a risk of human error in information provision and in data entry. This indicates the need for industry standards, allowing a quality framework for humancentric RFID applications to be created and managed.

Industry standards are also relevant to support services. In humancentric applications of RFID they are especially needed, as much of the usability beyond the implanted transponder itself centers upon peripherals and their interoperability. Most proprietary RFID readers, for instance, can only read data from similarly proprietary transponders. In medical applications though, where failure to harness available technology can have dramatic results, an implantee with an incompatible, and therefore unreadable, transponder is no better off for using the application. Accordingly, for humancentric RFID to realize its promotion as ‘life-enhancing’, standards for compatibility between differently branded devices must be developed.

Lastly, the site of implantation should be standardized as even if an implanted transponder is known to exist, difficulties may arise in discerning its location. Without a common site for implantation finding an implanted RFID device can be tedious. This is disadvantageous for medical, location-based or other critical implementations where time is a decisive factor in the success of the application. It is also a disadvantage in more general terms as the lack of standards suggests that though technological capability is available, there is no social framework ready to accept it.

7.2. Commercial viability for the consumer

A humancentric application of RFID must satisfy a valid need to be considered marketable. This is especially crucial as the source of the application, the transponder, requires an invasive installation and, afterwards, cannot be easily removed. Add to this that humancentric RFID is a relatively new offering with few known long-term effects, and participation is likely to be a highly considered decision. Thus, despite many applications having a mass target market, the value of the application to the individual will determine boundaries and commercial viability.

Value is not necessarily cost-based. Indeed, with the VeriChip sold at a cost of $US200 plus a $10 per month service fee, it is not being marketed as a toy for the elite. Instead, value and application scope are assessed in terms of life enhancement. Therapeutic devices for example provide obvious remedial benefit, but the viability of a financial identification system may be limited by available infrastructure.

Arguably, commercial viability is increased by the ability of one transponder to support multiple applications. Identification applications, for example, are available in control, convenience and care usability contexts. The question arises, however, as to what occurs when different manufacturers market largely different applications. Where no real interoperability exists for humancentric RFID devices, it is likely that users must be implanted with multiple transponders from multiple providers. Further, given the power and processing constraints of multi-application transponders in the current state of development, the lack of transponder portability reflects negatively on commercial viability and suggests that each application change or upgrade may require further implantation and bodily invasion.

7.3. Commercial viability for the manufacturer

Taking VeriChip as a case study, one is led to believe that there is a commercially viable market for humancentric applications of RFID. Indeed, where the branded transponder is being sold in North and South America, and has been showcased in Europe [54], a global want for the technology is suggested. It must be recognized, however, that in the current state of development VeriChip and its parent, Applied Digital Solutions, have a monopoly over those humancentric RFID devices approved for use. As such, their statistics and market growth have not been affected by competition and there is no comparative data. The difference between a successful public relations campaign and reality is therefore hard to discern.

Interestingly, in non-humancentric commercial markets, mass rollouts of RFID have been scaled back. Problems have arisen specifically in animal applications. The original implementation of the 1996 standards, ISO 11784: ‘Radio-frequency identification of animals – Code structure’ and ISO 11785: ‘Radio-frequency identification of animals – Technical concept’, for example, was the subject of extensive complaint [55]. Not only did the standards not require unique identification codes, but they also violated the patent policy of the International Standards Organization. Even after the ISO standards were returned to the SC19 Working Group 3 for review, a general lack of acceptance equated to limited success. Moreover, moves have now been made to ban the use of implantable transponders in herd animals. In a high percentage of cases the transponder moved in the fat layer, raising concerns that it might be later consumed by humans. Further, the meat quality was degraded as animals sensing the existence of an implanted foreign object produced antibodies to ‘attack’ it [18].

8. Discussion

8.1. Personal privacy

Given its contactless nature and non-line-of-sight (nLoS) capability, RFID has the ability to automatically collect a great deal of data about an individual in a covert and unobtrusive way. Hypothetically, a transponder implanted within a human can communicate with any number of readers it may pass in any given day. This opens up a plethora of possibilities, including the ability to link data based on a unique identifier (i.e. the chip implant), to locate and track an individual over time, and to look at individual patterns of behaviour. The severity of violations to personal privacy increases as data collected for one purpose is linked with completely separate datasets gathered for another purpose. Consider the use of an implant that deducts programmed payment for road tolls as you drive through sensor-based stations. Imagine this same data, originally gathered for traffic management, now being used to detect speeding and traffic infringements, resulting in the automatic issue of a fine. Real cases with respect to GPS and fleet management have already been documented. Kumagi and Cherry [56] describe how one family was billed an “out-of-state penalty” by their rental company based on GPS data that was gathered for a completely different reason. Stanford [57] menacingly calls this type of data use “scope creep”, while Papasliotis [58] more pleasantly deems it “knowledge discovery”.
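To make the linkage risk concrete, the short sketch below reuses two reads of the same invented implant identifier collected at hypothetical toll gantries; the records, gantry spacing and speed limit are fabricated solely to show how data gathered for billing could be repurposed to infer speeding.

from datetime import datetime

# Toll gantry reads collected for billing: (tag_id, gantry, timestamp).
TOLL_READS = [
    ("TAG-0042", "gantry-A", datetime(2006, 3, 1, 8, 0, 0)),
    ("TAG-0042", "gantry-B", datetime(2006, 3, 1, 8, 10, 0)),
]

GANTRY_SPACING_KM = 25.0   # assumed distance between gantry-A and gantry-B
SPEED_LIMIT_KMH = 110.0

def infer_speed(reads):
    """Reuse billing data to estimate the average speed between two consecutive gantries."""
    (tag, _, t1), (_, _, t2) = reads
    hours = (t2 - t1).total_seconds() / 3600.0
    return tag, GANTRY_SPACING_KM / hours

if __name__ == "__main__":
    tag, speed = infer_speed(TOLL_READS)
    if speed > SPEED_LIMIT_KMH:
        print(f"{tag}: average speed {speed:.0f} km/h exceeds the limit, automatic fine issued")

Nothing in the tolling data itself changes; only the purpose of the query does, which is precisely the ‘scope creep’ Stanford describes.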

These notions of ‘every-day’ information gathering, where an implantee must submit to information gathering practices in return for access to services, offends the absolutist view of privacy and “an individual [having] the right to control the use of his information in all circumstances” [59]. Indeed, given their implantation beneath the skin, the very nature of humancentric transponders negates the individual’s ability to ‘control’ the device and what flows from it. Not only do the majority of consumers lack the technical ability to either embed or remove implants but they naturally lack the ability to know when their device is emitting data and when it is not. There is also a limited understanding of what information ‘systems’ are actually gathering. This becomes a greater danger when we note that laws in different jurisdictions provide little restraint on the data mining of commercial databases by commercial entities. In this instance, there would be little to stop RFID service providers from mining data collected from their subscribers and on-selling it to other organisations.

Moreover, even where ethical data usage is not questioned, intellectual property directives in Europe may hamper the promise of some service providers to keep consumer data private. According to Papasliotis [58] “… the proposed EU Intellectual Property (IP) Enforcement Directive includes a measure that would make it illegal for European citizens to de-activate the chips in RFID tags, on the ground that the owner of the tag has an intellectual property right in the chip. De-activating the tag could arguably be treated as an infringement of that right”.

8.2. Data security

Relevant approaches to RFID security in relation to inanimate objects have been discussed in the literature. Gao [60] summarises these methods as “killing tags at the checkout, applying a rewritable memory, physical tag memory separation, hash encryption, random access hash, and hash chains”. Transponders that are embedded within the body pose a different type of data security requirement, though. They are not placed in the body to be turned off, as doing so would circumvent the original purpose of implantation. Instead, they are required to provide a persistent and unique identifier. In the US, however, also thwarting an original purpose, a study has shown that some RFID transponders are capable of being cloned, meaning the prospect of fraud or theft may still exist [61]. One possibility, as proposed by Perakslis and Wolk [22], is the added security of saving an individual’s feature vector onboard the RFID chip. Biometrics too, however, is fraught with its own problems [62]. Despite some moves in criminal justice systems, it is still controversial to say that one’s fingerprint or facial image should be held on a public or private database.
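Of the methods Gao lists, hash chains are the easiest to sketch: the tag emits a one-way hash of its internal state and then advances that state, so successive reads appear unlinkable to an eavesdropper, while a back-end holding the original seed can still recognise the tag by replaying the chain. The toy version below (Python, SHA-256, invented seed) is a sketch of the idea only, not a description of any deployed transponder firmware.

import hashlib

def h(data):
    return hashlib.sha256(data).digest()

class HashChainTag:
    """Toy hash-chain tag: emits h(b'emit' + state), then advances state = h(state)."""

    def __init__(self, seed):
        self.state = seed

    def respond(self):
        output = h(b"emit" + self.state)
        self.state = h(self.state)  # the previous state is discarded on the tag
        return output

def identify(response, seeds, max_reads=1000):
    """Back-end replays each registered seed's chain until the response matches."""
    for tag_id, seed in seeds.items():
        state = seed
        for _ in range(max_reads):
            if h(b"emit" + state) == response:
                return tag_id
            state = h(state)
    return None

if __name__ == "__main__":
    seeds = {"TAG-0042": b"secret-seed"}     # shared only with the back-end
    tag = HashChainTag(b"secret-seed")
    r1, r2 = tag.respond(), tag.respond()    # successive responses look unrelated
    print(r1 != r2, identify(r2, seeds))     # True TAG-0042

The obvious trade-off is that the back-end must search every registered chain to identify a response, which is why hash-chain schemes are usually discussed together with the cost of scaling them to large subscriber registries.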

Unfortunately, whatever the security, researchers like Stanford believe it is a “virtual certainty” that tags and their respective systems “will be abused” by some providers [57]. Here, the main risk for consumers involves third parties gaining access to personal data without prior notice. To this end, gaining and maintaining the trust of consumers is essential to the success of the technology. Mature trust models need to be architected and implemented, but more importantly they need to be understood outside of an academic context. Though it is important that trust continues to grow as an area of study within the e-commerce arena, it will be the practical operation of oversight companies like VeriSign in these early days of global information gathering which will allow consumers to create their own standards and opinions.

Outside of clear ethical concerns regarding third-party interests in information, another temptation for service providers surrounds the use of data to target individual consumer sales in value-added services and service-sets relying on location information. Though not an extreme concern in itself, we note that any such sales will face the more immediate concern of deciding on a secure and standard location for implants. For now, live services place the implant in the left or right arm, but the problems with designating such a zone concern the possibility of exclusion. What if the consumer is an amputee or has prosthetic limbs? Surely the limited space of the human body means that certain things are possible, while others are not. Thus, recognizing the limitations of the human body, will service providers brand transponders and allow multifunctional tags for different niche services? Which party then owns the transponder? The largest service provider, the government or agency acting as an issuer, or the individual? Who is responsible for accuracy and liable for errors? And more importantly, who is liable for break-downs in communication when services are unavailable and disaster results?

8.3. Ethical considerations

Molnar and Wagner [63] ask the definitive question: “[i]s the cost of privacy and security ‘worth it’?” Stajano [64] answers by reminding us that “[t]he benefits for consumers remain largely hypothetical, while the privacy-invading threats are real”. Indeed, when we add to privacy concerns the unknown health impacts, the potential changes to cultural and social interaction, the circumvention of religious and philosophical ideals, and a potential mandatory deployment, the disadvantages of the technology seem almost burdensome. For the present, proponents of emerging humancentric RFID rebuke any negatives “under the aegis of personal and national security, enhanced working standards, reduced medical risks, protection of personal assets, and overall ease-of-living” [22]. Unless there are stringent ethical safeguards, however, there is a potential for enhanced national security to come at the cost of freedom, or for enhanced working standards to devalue the importance of employee satisfaction. The innovative nature of the technology should not be cause to excuse it from the same “judicial or procedural constraints which limit the extent to which traditional surveillance technologies are permitted to infringe privacy” [58].

Garfinkel et al. [61] provide a thorough discussion of key considerations in their paper. Though their main focus is on users of RFID systems and purchasers of products containing RFID tags, the conclusions drawn are also relevant to the greater sphere of humancentric RFID. Firstly, Garfinkel et al. stipulate that a user has the right to know if the product they have purchased contains an RFID tag. In the current climate of human transponder implant acceptance, it is safe to assume that an individual who has requested implantation knows of their implant and its location. But does the guardian of an Alzheimer’s patient or an adult schizophrenic have the right to impose an implant on the sufferer’s behalf for monitoring or medical purposes [65]?

Secondly, the user has the right to have embedded RFID tags “removed, deactivated, or destroyed” [61] at or after purchase. Applied to humancentric implantation, this point poses a number of difficulties. The user cannot remove the implant themselves without some physical harm, they have no real way of finding out whether a remaining implant has in fact been ‘deactivated’, and destroying an implant without its removal from the body implies some form of amputation. Garfinkel et al.’s third ethical consideration is that an individual should have alternatives to RFID. In the embedded scenario users should then also have the ability to opt in to new services and opt out of their current service set as they see fit. Given the nature of RFID, however, there is little to indicate the success or failure of a stipulated user-requested change, save for a receipt message that may be sent to a web client from the server. Quite possibly the user may not be aware that they have failed to opt out of a service until they receive their next billing statement.

The fourth notion involves the right to know what information is stored on the RFID transponder and whether or not this information is correct, while the fifth point is “the right to know when, where and why a RFID tag is being read” [61]. This is quite difficult to exercise, especially where unobtrusiveness is considered a goal of the RFID system. In the resultant struggle between privacy, convenience, streamlining and bureaucracy, the number of times RFID transponders are triggered in certain applications may mean that the end-user is bombarded with a very long statement of transactions.

8.4. The privacy fear and the threat of totalitarianism?

Mark Weiser, the founding father of ubiquitous computing, once said that the problem surrounding the introduction of new technologies is “often couched in terms of privacy, [but] is really one of control” [59]. Indeed, given that we do not by nature trust others to safeguard our individual privacy, in controlling technology we feel we can also control the social implications stemming from it. At its simplest, this highlights the different focus between the end result of using technology and the administration of its use. It becomes the choice between the idea that I am given privacy and the idea that I control how much privacy I have. In this regard, privacy is traded for service.

Fig. 1. The privacy-security trade-off.

What some civil libertarians fear beyond privacy exchange though is a government-driven mandatory introduction of invasive technologies based on the premise of national security. While the safety and security argument has obviously paved the way for some technologies in response to the new environment of terrorism and identity fraud [38], there is now a concern that further advancements will begin to infringe on the freedoms that security paradigms were originally designed to protect. For invasive technology like humancentric RFID, the concerns are multiplied as the automated nature of information gathering means that proximity to a reader, and not personal choice, may often be the only factor in deciding whether or not a transponder will be triggered. Though most believe that government-imposed mandatory implantation is a highly unlikely outcome of advancements in humancentric RFID, it should be recognised that a voluntary implantation scheme offers negligible benefits to a government body given the incompleteness of the associated data set. This is equally true of private enterprises that mandate the use of transponders in employees, inmates or other distinct population groups.

Where the usability context of control then becomes the realm of government organizations and private enterprise, RFID regulation is increasingly important. Not only is regulation necessary for ensuring legitimacy in control-type applications, it is also needed to prevent the perversion of convenience and care-related uses. For example, many of those implanted with RFID transponders today might consider them to be life-saving devices, and the service-oriented nature of these applications means they must clearly remain voluntary (Table 2). If the data collected by the device were also to be used for law enforcement or government surveillance purposes, however, users may think twice about employing the technology. In regulating, then, we do not want to allow unrestricted deployment and unparalleled capabilities for commercial data mining, but nor should we allow a doomsday scenario where all citizens are monitored in a techno-totalitarian state [61]. The design of any such regulations must be considered in light of the illustrated privacy/security trade-off (Fig. 1). Taking any two vertices of the government – service provider – consumer triangle, privacy or security (which can often be equated with ‘control’) will always be traded in relation to the third vertex. For example, where we combine government and service providers in terms of security regulations and the protection of national interests, the consumer is guaranteed to forgo certain amounts of privacy. Similarly, where we combine government and the consumer as a means of ensuring privacy for the individual, the service provider becomes limited in the control it holds over information gathered (if indeed it is still allowed to gather information).

Table 2. Mapping contexts to the environment

9. Conclusion

In the current state of humancentric development, stand-alone applications exist for control, convenience and care purposes, but as control is the dominant context its effects can be seen in other application areas. Applications are also influenced by power and processing confines, and as such many functions have simple bases in identification or monitoring. Application usage is made more complex, however, as the need for peripherals (including readers and information storage systems) is constrained by a lack of industry standards for interoperability. Though the technology has been deemed feasible in both research and commercially approved contexts, the market for humancentric applications of RFID is still evolving. Initial adoption of the technology has met with some success but, as research continues into humancentric applications of RFID, the market is still too niche for truly low-cost, high-quality application services. Any real assessment of the industry is further prejudiced by commercial monopoly and limited research into the long-term effects of use. Coupled with security and privacy concerns, the long-term commercial viability of humancentric applications of RFID is questionable. In the short- to medium-term, adoption of humancentric RFID technology and use of related applications will be hindered by a lack of infrastructure, a lack of standards (not only as to interoperability but also as to support for service and transponder placement), and the lack of response from developers and regulators to mounting ethical dilemmas.

References

[1] D. Icke, Has the old ID card had its chips? Soldier Magazine (2001)

[2] Applied Digital Solutions, Applied Digital Solutions Announces Working Prototype of Subdermal GPS Personal Location Device, Press Release, April 13, 2003.

[3] P. Eng, I Chip? ABC News.com, March 1, 2002.

[4] C. Murray, Injectable chip opens door to human bar code, EETimes, January 7, 2002. Available from: <http://www.eetimes.com/story/OEG20020104S0044>.

[5] J. Wilson, Girl to get tracker implant to ease parents’ fears, the guardian. Available from: <http://www.guardian.co.uk/Print/0,3858,4493297,00.html>.

[6] J. Sanchez-Klein, And Now For Something Completely Different, PC World Online, August 27, 1998. Available from: ProQuest.

[7] S. Witt, Professor Warwick Chips In, Computerworld, 33 (2) (1999), pp. 89-90

[8] K. Warwick, Cyborg 1.0, Wired Magazine 8.02, February 2000. Available from: <http://www.wired.com/wired/archive/8.02/warwick.html>.

[9] R. Woolnaugh, A man with a chip in his shoulder, Computer Weekly [Online], June 29, 2000. Available from: Expanded Academic Index.

[10] C. Holden, Hello Mr Chip, Science [Online], March 23. 2001. Available from: ProQuest.

[11] G. Vogel, Part Man, Part Computer, Science [Online], 295 (5557), February 8, 2002, p. 1020. Available from: Expanded Academic Index.

[12] R. Kobetic et al., Implanted functional electrical stimulation system for mobility in paraplegia: a follow-up case report, IEEE Transactions on Rehabilitation Engineering [Online], December, 1999. Available from: ProQuest.

[13] C. Murray, Prodigy seeks out high-tech frontiers, Electronic Engineering Times [Online], February 25, 2002. Available from: ProQuest.

[14] J. Black, Roll up your sleeve – for a chip implant, Business Week Magazine [Online], March 21, 2002. Available from: <http://www.businessweek.com/bwdaily/dnflash/mar2002/nf20020321_1025.htm>.

[15] L. Grossman, Meet The Chipsons, Time New York, 159 (10) (2002), pp. 56-57

[16] B. Gengler, Chip implants become part of you, The Australian, September 10, 2002.

[17] K. Michael, The technological trajectory of the automatic identification industry, Ph.D. Thesis, School of Information Technology and Computer Science, University of Wollongong, Australia, 2003.

[18] A. Masters, Humancentric applications of RFID, BInfoTech (Hons) Thesis, School of Information Technology and Computer Science, University of Wollongong, Australia, 2003.

[19] K. Michael, M.G. Michael, Microchipping people: the rise of the electrophorus, Quadrant, 414 (2005), pp. 22-33.

[20] L. Perusco, K. Michael, Humancentric Applications of Precise Location-Based Services, IEEE Conference on e-Business Engineering, IEEE Computer Society, Washington (2005), pp. 409–418

[21] K. Johnston, RFID transponder implants: a content analysis and survey, BInfoTech (Hons) Thesis, School of Information Technology and Computer Science, University of Wollongong, Australia, 2005.

[22] C. Perakslis, R. Wolk, Social acceptance of RFID as a biometric security method, in: Proceedings of the IEEE Symposium on Technology and Society, 2005, pp. 79–87.

[23] J. Gerdeman, Radio frequency identification application 2000, North Carolina, USA, 1995.

[24] K. Finkinzeller, RFID Handbook: Radio-Frequency Identification Fundamentals and Applications, England, 2001.

[25] R. Geers et al., Electronic Identification, Monitoring and Tracking of Animals, United Kingdom, 1997.

[26] R. Yin, The case study method as a tool for doing evaluation, Current Sociology, 40 (1) (1998), p. 123

[27] C. Thomas, N. Bevan, Usability Context Analysis: A Practical Guide, Middlesex, UK, 1996.

[28] Vxceed Technologies, RFID Technology, 2003. Available from: <http://www.vxceed.com/developers/rfid.asp>.

[29] Automatic ID News, Radio Frequency Identification (RF/ID), 1998. Available from: <http://www.autoidenews.com/technologies/concepts/rfdcintro.htm>.

[30] K. Hall, Students tagged in bid to keep them safe, The Japan Times, 2004. Available from: <http://search.japantimes.co.jp/print/news/nn10-2004/nn20041014f2.htm>.

[31] M. Wood, RFID: Bring It On, CNET.com, 2005. Available from: <http://www.cnet.com/4520-6033_1-6223038.html>.

[32] M. Hawthorne, Refugees meeting hears proposal to register every human in the world, Sydney Morning Herald [Online], 2001. Available from: <http://www.iahf.com/other/20011219.html>.

[33] D.B. Kitsz, Promises and problems of RF identification, in: R. Ames (Ed.), Perspectives on Radio Frequency Identification: What is it, Where is it going, Should I be Involved? Van Nostrand Reinhold, New York, pp. 1-19–1-27.

[34] R. Want et al., The Active Badge Location System, ACM Transactions on Information Systems, 10 (1) (1992), pp. 91-102.

[35] W. Saletan, Call my cell, Slate Magazine, May, 2005. Available from: http://slate.msn.com/id/2118117.

[36] J. Scheeres, Politician wants to get chipped, Wired News, February 15, 2002. Available from: <http://www.wired.com/news/print/0,1294,50435,00.html>.

[37] Applied Digital Solutions, Protected by VeriChip™ – Awareness Campaign Continues – VeriChip To Exhibit At Airport Security Expo in Las Vegas, Press Release, July 2, 2002.

[38] K. Michael, A. Masters, Realised applications of positioning technologies in defense intelligence, in: H. Abbass, D. Essam (Eds.), Applications of Information Systems to Homeland Security and Defense, IDG Press, pp. 167–195.

[39] L. Nadile, Call Waiting: A Cell Phone ATM, Wired News. Available from: <http://www.wired.com/news/business/0,1367,41023,00.html>.

[40] J.C. Ramo, The Big Bank Theory and what it says about the future of money, Time, April 27, 1998, pp. 46–55.

[41] S. Dennis, UK Professor Implants Chip, Turns Himself Into Cyborg, Newsbytes, 1998. Available from: <http://www.newsbytes.com/pubNews/110782.html>.

[42] Texas Instruments, Loyally Yours, TIRIS News, 1997. Available from: <http://www.ti.com/tiris/docs/manuals/RFIDNews/Tiris_NL17>.

[43] K. Warwick, Project Cyborg 2.0. Available from: <http://www.rdg.ac.uk/KevinWarwick/html/project_ cyborg_2_0.html>.

[44] W. Underhill, Merging Man and Machine, Newsweek [Online], October 14, 2002. Available from: Expanded Academic Index.

[45] T. Seneadza, Biosensors – A Nearly Invisible Sentinel, Technically Speaking, July 21, 2003. Available from: <http://tonytalkstech.com/archives/000231.php>.

[46] P.L. Harrison, The Body Binary, Popular Science, October, 1994. Available from: <http://www.newciv.org/nanomius/tech/implants>.

[47] M-Biotech: Biosensor Technology. M-Biotech Salt Lake City, 2003. Available from: <http://www.m-biotech.com/technology1.html>.

[48] IEEE, Biomimetic Systems: Implantable, Sophisticated, and Effective. IEEE Engineering in Medicine and Biology 24(5) Sept/Oct (2005).

[49] Cochlear, Nucleus 24 Cochlear Implant, 1999. Available from: <http://www.Cochlear.com/euro/nucleussystems/ci24m.html>.

[50] Sun-Sentinel, The Smart Pill, Sun-Sentinel News: The Edge, 2003. Available from: <http://www.sun-sentinel.com/graphics/news/smartpill>.

[51] J. Rizzo, J. Wyatt, Prospects for a visual prosthesis, The Neuroscientist, 3 (4) (1997). Available from: http://rleweb.mit.edu/retina/a2.page1.html

[52] G.T.A. Kovacs, The nerve chip: technology development for a chronic neural interface, Stanford University, 1997. Available from <http://guide.stanford.edu.ezproxy.uow.edu.au/publications/dev4.html>.

[53] K. Michael, A. Masters, Applications of human transponder implants in mobile commerce, in: Proceedings of the Eighth World Multiconference on Systemics, Cybernetics and Informatics, Florida, vol. 5, 2004, pp. 505–512.

[54] Applied Digital Solutions, Press Release VeriChip™ Subdermal Personal Verification Microchip To Be Featured At IDTechex Smart Tagging In Healthcare, Conference in London, April 28–29, 2003.

[55] RFID News, International Standards Organization Returns RFID Standard For Animal Use To Working Group For Major Revisions, RFID News, 2002. Available from: <http://www.rfidnews.com/returns.html>.

[56] J. Kumagi, S. Cherry, Sensors and sensibility, IEEE Spectrum, 41 (7) (2004), pp. 22-26, 28.

[57] V. Stanford, Pervasive computing goes that last hundred feet with RFID systems, IEEE Pervasive Computing, 2 (2) (2003), pp. 9-14.

[58] I.-E. Papasliotis, Information technology: mining for data and personal privacy: reflections on an impasse, in: Proceedings of the 4th International Symposium on Information and Communication Technologies, 2004, pp. 50–56.

[59] O. Günther, S. Spiekermann, Tagging the world: RFID and the perception of control, Communications of the ACM, 48 (9) (2005), p. 74.

[60] X. Gao et al., An approach to security and privacy of RFID system for supply chain, in: IEEE International Ecommerce Technology for Dynamic e-Business (2004), pp. 164-168.

[61] S.L. Garfinkel, A. Juels, R. Pappu, RFID privacy: an overview of problem and proposed solutions, IEEE Security and Privacy Magazine, 3 (3) (2005), pp. 38-43

[62] J.D. Woodward, Biometrics: privacy’s foe or privacy’s friend? Proceedings of the IEEE, 85 (9) (1997), pp. 1480-1492.

[63] D. Molnar, D. Wagner, Privacy: privacy and security in library RFID: issues, practices, and architectures, in: Proceedings of the 11th ACM Conference on Computer and Communications Security, 2004, p. 218.

[64] F. Stajano, Viewpoint: RFID is X-ray vision, Communications of the ACM, 48 (9) (2005), p. 31

[65] J.E. Dobson, P.F. Fisher, Geoslavery, IEEE Technology and Society Magazine, 22 (1) (2003), p. 47

Keywords: Radio-frequency identification, Transponders, Chip implants, Humancentric applications, Usability context analysis, Location tracking, Personal privacy, Data security, Ethics

Citation: Amelia Masters and Katina Michael, "Lend me your arms: The use and implications of humancentric RFID," Electronic Commerce Research and Applications, Vol. 6, No. 1, Spring 2007, Pages 29-39, DOI: https://doi.org/10.1016/j.elerap.2006.04.008

The Emerging Ethics of Humancentric GPS Tracking and Monitoring

Abstract

The Global Positioning System (GPS) is increasingly being adopted by private and public enterprise to track and monitor humans for location-based services (LBS). Some of these applications include personal locators for children, the elderly or those suffering from Alzheimer's or memory loss, and the monitoring of parolees for law enforcement, security or personal protection purposes. The continual miniaturization of the GPS chipset means that receivers can take the form of wristwatches, mini mobiles and bracelets, with the ability to pinpoint the longitude and latitude of a subject 24/7/365. This paper employs usability context analyses to draw out the emerging ethical concerns facing current humancentric GPS applications. The outcome of the study is the classification of current state GPS applications into the contexts of control, convenience, and care; and a preliminary ethical framework for considering the viability of GPS location-based services emphasizing privacy, accuracy, property and accessibility.

Section I

Introduction

GPS has the ability to calculate the position, time, and velocity of any GPS receiver. It does so using a process of triangulation, which works on the premise that a position can be found if the distances from three other known locations are also known. Originally conceived by the U.S. Air Force for military purposes in the 1960s, it was commercially released in 1995. In 2000, selective availability was turned off, providing consumers with the same level of accuracy as the U.S. military. Since that time, mobile business applications based on GPS and cellular network technologies have proliferated. The rate of innovation has been high, and the level of adoption has been steadily increasing, showing a great deal of promise for the small start-up companies targeting GPS solutions at families, enterprises, and security-related government initiatives. This paper is significant because in the not-too-distant future, mobile devices will have GPS chipsets on board. Yet the growth in the number of commercial offerings, while approved by government regulatory bodies, has not been matched by commensurate ethical discourse, including discussion of legalities and ownership. The aim of this paper is to explore current commercial services based on GPS technology, with a view to identifying emerging ethical concerns and developing an ethical framework.
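To illustrate the distance-based principle just described, the fragment below solves a simplified two-dimensional version of the problem in Python. It is a sketch only: real GPS receivers solve in three dimensions and must also estimate the receiver clock bias, which is why four satellites are normally used rather than three.

def trilaterate_2d(anchors, distances):
    # anchors: [(x1, y1), (x2, y2), (x3, y3)] known reference positions
    # distances: [r1, r2, r3] measured ranges to each anchor
    # Subtracting the first circle equation from the other two yields a
    # 2x2 linear system, solved here by Cramer's rule.
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-9:
        raise ValueError("anchors are collinear; position is not unique")
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

anchors = [(0, 0), (10, 0), (0, 10)]
true_pos = (3, 4)
dists = [((true_pos[0] - ax)**2 + (true_pos[1] - ay)**2) ** 0.5 for ax, ay in anchors]
print(trilaterate_2d(anchors, dists))   # approximately (3.0, 4.0)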

Section II

Background

The concept of tracking and monitoring using GPS technologies is far from novel [1]. Numerous studies and experiments have investigated the potential of GPS to record a person's movements [2], [3]. However, very few studies have attempted to explore the ethical problems of GPS tracking. The question of ethics in precise location services has been gathering traction within the research community, much of it provoked by Wal-Mart's announcement that it would implement radio-frequency identification (RFID) for itemized inventory tracking using the EPCglobal standard. More recently a whole issue of the Communications of the ACM was dedicated to RFID privacy and security concerns, while other location technologies were largely ignored. The work of Dobson and Fisher [4], Garfinkel et al. [5], Michael and Michael [6], Perusco and Michael [7], Kaupins and Minch [8], Perakslis and Wolk [9] and Stajano [10] has all indicated the need for a deeper understanding of ethics in location services. In addition, the foreseeable power of GPS working in tandem with RFID and wireless local area networks (WLANs) will bring with it a new suite of pressing concerns.

2.1. Unanswered questions

Many questions remain unanswered. Who is liable for providing an incorrect geographic reference location for an emergency services call? Does a private enterprise require the consent of an individual subscriber to track a vehicle that has been rented and is mounted with a GPS receiver? Does a government agency or the police force have the right to location information for a given subscriber when they suspect illegal activity? Do refugees or illegal immigrants have the right to refuse a government-imposed tracking device? Is the 24/7/365 monitoring of a parolee's location information ethical? What rights does a mentally ill person have to their location data and does a caregiver have the right to impose certain geographic constraints on that subscriber? And how do caregiver relationships differ from guardian/parent-to-child, or husband-to-wife contexts? And what of employer work-related location monitoring of employees? Who owns location data–the individual subscriber, the service provider, or a third party that stores the information? The answers to these questions are complex and highlight the urgent need for the development of an ethical framework and other industry guidelines.

Section III

Usability Context Analyses and Ethics


Table 1. Ethics-based conceptual approach

Ethics is defined as “[a] system of moral principles, by which human actions and proposals may be judged good or bad or right or wrong” (Macquarie Dictionary), while ‘moral’ is concerned with “right conduct or the distinction between right or wrong.” This study is aimed at exploring whether the real-time tracking and monitoring of people is morally right or wrong. It is an attempt to formulate an ethical framework by considering principles of moral behavior–something that “has always been a necessary feature of human cultures” [11], [12]. The conceptual approach used toward the building of an ethical framework is based on four main aspects: principles, purpose, morality and justice (Table 1).

A usability context analysis is focused not on a traditional case study but on a specific product innovation area. The unit of analysis is thus any interactive system or device which supports a user's task. This approach has been used successfully in the past to study controversial chip implant applications [13]. Three usability contexts will be analyzed–care, control and convenience. Each context will focus on uses of GPS tracking and monitoring applications. There is synergy between a usability context analysis methodology and an ethics-based conceptual approach, as one looks at the use, and the other at the implications of that use.

Section IV

Control

Most ethical issues are connected to the control aspect of GPS tracking, as it imposes an intrusive method of supervision. For the purposes of control, GPS has been used in law enforcement, in the tracking of parolees, sex offenders and suspected terrorists, and in employee monitoring.

4.1. Law enforcement

U.S. law specifies that a court can issue a warrant for the installation of a mobile “tracking device” if a person is suspected of committing a crime [14]. See also House Bill 115 currently being deliberated in the U.S. The term “tracking device” covers a broad spectrum of technologies, but the popularity and simplicity of GPS makes it an obvious choice. Gabriel Technologies is one company which is seeking to be the supplier of choice for the federal and homeland security markets [15]. GPS devices strapped to parolees are even being used to track gang members in U.S. cities [16].

There are documented cases in the U.S. of police discreetly planting GPS devices on suspected criminals. The William Jackson case was the first to rule that placing a GPS device on a person or their vehicle does not require a warrant, as it is the same as following them around [17]. In 2000, Jackson was found guilty of murdering his daughter after the GPS device placed on his truck showed that he had returned to his daughter's crime scene. In another case in New York, the judge ruled that police do not need a warrant to track a person on a public street, stating that the defendant “… had no expectation of privacy in the whereabouts of his vehicle on a public roadway” [18]. In San Francisco, Scott Peterson had a GPS tracking device placed on his car after being suspected of murdering his pregnant wife in 2002 [19]. His suspicious behavior led to a legal trial involving much speculation over the use of the GPS antenna (even though police had a warrant), and the accuracy of the collected data [20]. However, the judge ruled that the technology was “generally accepted and fundamentally valid” [21].

4.2. Parolees and sex offenders

Today many parolees are fitted with a small tamperproof GPS tracker worn as a bracelet or anklet. The ankle device is in the shape of a rigid plastic ring, accompanied by a small tracking box that can fit in a pocket [22]. Companies such as iSECUREtrac design GPS monitoring systems to track parolees and sex offenders, ensuring they do not commit further crimes, alerting authorities if they enter certain locations (e.g. schools, parks), and detecting whether they have left their homes where that is prohibited [23]. Some GPS units also offer the added capability of estimating how much alcohol a person has consumed by measuring perspiration levels every hour. Parolee and pedophile tracking is widespread in the United States, with an estimated 120,000 tracked parolees in 28 states [24]. However, there are over 50,000 convicted sex offenders in the US who are not tracked at all [25].
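The exclusion-zone alerting described above ultimately reduces to a point-in-radius test against each prohibited area. The Python sketch below shows the mechanism; the zone names, coordinates and radii are hypothetical, and commercial systems such as iSECUREtrac's will differ in detail.

import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres between two latitude/longitude points.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def check_exclusion_zones(fix, zones):
    # fix: (lat, lon) reported by the offender's GPS unit
    # zones: list of (name, lat, lon, radius_m) prohibited areas, e.g. schools, parks
    lat, lon = fix
    return [name for name, zlat, zlon, radius in zones
            if haversine_m(lat, lon, zlat, zlon) <= radius]

zones = [("primary school", -34.4278, 150.8931, 300),
         ("public park", -34.4301, 150.8890, 200)]
alerts = check_exclusion_zones((-34.4280, 150.8935), zones)
print(alerts)   # ['primary school'] -> notify the supervising officer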

Australian states have been trialing GPS systems and there are proposed schemes for NSW, Western Australia and Victoria [26]. In NSW there are 1,900 offenders on the Child Protection Register, but officials say it is too costly and difficult to track all of them [27]. Queensland's corrective services minister, Judy Spence, reviewed a New Zealand trial and found that for a GPS scheme to be cost-effective in Australia, there would need to be considerably more prisoners. It is interesting to note that the question of ethics was not addressed: “the cost of monitoring someone using GPS technology [is] about $1,000 cheaper than keeping them in prison” [28]. However, in Florida (USA), the estimated cost of placing tracking devices on all sex offenders is 56 million USD per annum [25]. Accounting for each person individually would cost about $100 if they were physically in prison [24]. One disadvantage of the parolee tracking process is its labor-intensive nature. A U.S. parole officer in Georgia who monitors the movements of 17 parolees has said: “… the amount of information is overwhelming … I could easily spend an hour every morning on each offender to go over the information that's there. For some of them, it's necessary. For some of them, it's not” [29]. The amount of data generated has some advantages, such as in the event that parolees are falsely accused of committing crimes at particular locations and the evidence suggests otherwise. The message from the police is clear: “[w]e know where you are, and we are watching” [30].

4.3. Suspected terrorists

A number of national laws provide for the use of a tracking device affixed to any person suspected of “activities prejudicial to security” (e.g. ASIO Act 1979). Previously, the maximum period of time a suspected terrorist could be tracked was 6 months; however, during the Council of Australian Governments (COAG) meeting on counter-terrorism it was planned to increase this period to 12 months [31].

4.4. Employee monitoring

Employees who are tracked using GPS usually travel in vehicles over long distances. Tracked workers include couriers, and bus and truck drivers. The motivation for tracking employees is linked to improving company productivity. Automated Waste Disposal Incorporated uses GPS to ensure its truck drivers do not speed and are on track to meet their delivery schedules. The company imposed GPS tracking on its employees to reduce overtime and labor costs. After implementing the GPS tracking system, the number of overtime hours dropped from 300 to 70 hours on average per week [32].

Section V

Convenience

Although GPS tracking may not be widely used for the purposes of convenience today, there are a number of commercial uses. For example, Satellite Security Systems (S3) offers vehicle tracking services to a variety of customers, including parents and suspicious spouses [33]. Clients carry a GPS device with them which transmits location data to S3 computers for further analysis. S3 tracks so many vehicles that even homeland security officials sometimes turn to them for support. GPS systems are also becoming important in delivering key business processes such as real-time sales force automation. Norwich Union uses GPS to track its 18 to 21 year old customers, charging their car insurance premiums based on the time of day they drive. The company imposes a higher tariff at peak times when there is a greater chance of having an accident [34]. Companies like Disney are riding on their family brand, targeting up to 30 million children that they classify as "tweens" (8–12 year olds) with location-based, family-centric services [35]. But this idea is not new: Japanese school children have for some years been tracked by their parents, wearing transmitters in their school backpacks, uniforms, or shoes [36]. BuddyFinder systems have also been around for some time, allowing friends and family to catch up based on their whereabouts. On another level, there are even golf GPS devices which display the layout of each hole and player locations on the course [37].
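The pay-as-you-drive idea reduces to rating each trip by the time band in which it occurs. The sketch below uses invented rates and time bands purely to show the mechanism; it does not reflect Norwich Union's actual tariff.

from datetime import datetime

# Hypothetical per-kilometre rates by time band (illustrative only)
PEAK_HOURS = range(22, 24)                            # late-night driving treated as highest risk
SHOULDER_HOURS = list(range(6, 9)) + list(range(16, 19))
RATES_PER_KM = {"peak": 0.90, "shoulder": 0.45, "off_peak": 0.20}

def band(ts: datetime) -> str:
    if ts.hour in PEAK_HOURS:
        return "peak"
    if ts.hour in SHOULDER_HOURS:
        return "shoulder"
    return "off_peak"

def monthly_premium(trips):
    # trips: list of (start_timestamp, distance_km) taken from the in-car GPS log
    return sum(RATES_PER_KM[band(start)] * km for start, km in trips)

trips = [(datetime(2006, 4, 3, 23, 15), 12.0),        # late-night trip, charged at the peak rate
         (datetime(2006, 4, 4, 10, 30), 8.5)]         # daytime trip, off-peak rate
print(round(monthly_premium(trips), 2))               # 12*0.90 + 8.5*0.20 = 12.5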

Section VI

Care

GPS satellite tracking can assist people who are responsible for the health and wellbeing of others. Two such applications include GPS for tracking dementia sufferers, and parents tracking their children.

6.1. Dementia wandering

Dementia is a symptom of a number of diseases. However, the most common forms are Alzheimer's disease, vascular dementia and dementia with Lewy bodies [38]. It currently affects five per cent of people aged over 65 years and twenty per cent of people aged over 80 years. Dementia becomes a serious problem when a patient begins to wander. Due to his/her mental state, a dementia sufferer may get lost easily and may even be injured or killed [39]. Since it is difficult to keep constant watch over a dementia sufferer, a caregiver can employ a variety of assistive technologies which notify family members automatically by phone or email if problems arise [3]. Proponents of this application emphasize that the technology grants dementia sufferers more independence and freedom, allowing them a better quality of life [40].

6.2. Parents tracking children

There are a number of GPS products available today which allow parents to track their children. One of the more popular products is the Wherifone created by Wherify Wireless. The device is about the size of a credit card and has a feature which alerts emergency services. Previously, the company offered a wristwatch tracker but discontinued production because customers wanted to be able to call their children [41]. Users can find the location of their child by logging onto the company website and viewing data on a map. Gilson's AlwaysFind GPS trackers are an alternative [42]. Another GPS tracking system, provided by TAA GPS, supports the Teen Arrive Alive program in the U.S., dedicated to addressing teenage driving safety. Parents can find the location of their teenage child for $19.99 USD a month by using the Internet or calling the locator hotline [43]. Locations are updated every two minutes so parents can keep a constant eye on their child's activities. Further on the theme of driving, the application Ezitrack allows parents in Australia to immobilize a car while it is moving. Even though the device gives a ninety-second warning before the car shuts down, officials are still concerned, saying it is dangerous, causes inconvenience, and "puts (policing) in the hands of the individual" [44]. A South Australian primary school is also using a GPS tracking system on its school bus, to monitor speed and keep track of where children get off the bus [45].

Section VII

Towards an Ethical Framework

In each usability context analysis, several GPS tracking applications were presented, raising questions about the potential ethical implications of the technology. Yet the "acceptable use" of GPS is currently undefined. Can information generated by a GPS receiver be treated the same as any other piece of information? Can data generated by a GPS for one purpose be used for another? For example, can vehicle tracking be used to track an employee, and to convict the driver of speeding?


Table 2. Ethical framework

The most significant ethical issue facing GPS tracking is that of privacy (Table 2). It can be claimed that products with the ability to track their subjects automatically impinge on the rights of the individual, even if the individual has elected to carry the device. Legal jurisdictional issues also apply, as do acts which often seemingly contradict one another. For instance, there is precedent indicating that a person can be found guilty of a crime based on GPS-generated information [46]. In one such case, the judge ruled that there were “no Fourth Amendment implications in the use of the GPS device.” A framework has been devised to encapsulate the ethical issues related to GPS tracking and monitoring. This framework is based on the information technology (IT) ethical issues framework created by Mason [47], and later updated by Turban [48]. The four main ethical issues are categorized into privacy, accuracy, property and accessibility.

7.1. Privacy

The greatest concern of GPS tracking is the amount of information that can be deduced from the analysis of a person's movements.

7.1.1. What location-specific information should an individual be required to reveal to others?

In many cases a person's location does not need to be known unless he/she does something unexpected. Parents only need to know if their child is not at school when they should be or is speeding in a vehicle. Similarly, caregivers should only be notified if a dementia patient is wandering, and parole officers only need to know if a parolee ventures outside his/her home zone. Employers too can be alerted when one of their vehicles has made an unnecessary detour.

7.1.2. What kind of surveillance can a parent use on a child?

Using a GPS device to track a child's location is becoming more and more popular. If a child is lost or kidnapped he or she has a better chance of being found. But does the child have a right to determine whether or not they are to be tracked, and until what age or for what length of time [49]? Another question is how children actually feel about being tracked [50]. Are parents replacing trust with technology [41], and developing an unhealthy relationship with their children [51]? Christy Buchanan, an associate professor of psychology, believes that: “[p]arents shouldn't fool themselves into thinking that they can keep their kids from making mistakes, which is a part of growing up and learning” [52]. Simon Davies of Privacy International believes parents may even become obsessed with tracking their children [51]. On the other hand, parents who have experienced the loss of a child see GPS as a life-saving technology, especially those who have lost children to drink-driving accidents. These parents point out that tracking is for safety, not for spying.

7.1.3. What kind of surveillance can employers use on employees?

Employers usually track their employees to reduce costs, especially labor costs and costs related to unnecessary product shrinkage. In this context, employers attempt to protect their business interests, while employees attempt to protect their privacy [53]. The two positions are in contrast, as the power is obviously in the hands of the employer. Some workers have objected to the technology due to privacy concerns [54]. Galen Monroe, a truck driver from Chicago USA, voices his concern: “[t]hese systems could be used to unfairly discipline drivers, for counting every minute that they might or might not be on or off duty and holding that against them” [32]. Lewis Maltby, president of the National Workrights Institute in New Jersey, said that the exchange of privacy for security would affect employee morale and that the next steps would probably be implants [55]. Managers, on the other hand, are more concerned that workers are doing what they are paid to do. Yet this is a shocking development when one considers that there are few, if any, laws governing workplace surveillance in countries like the U.S. and Australia [56].

7.1.4. Do police need a warrant to track a suspected criminal or terrorist?

Several cases have ruled that tracking a person with a GPS device is the same as following them on the street. However, GPS tracking is much more pervasive. First, a person is usually more aware of someone following them than of a small tracking device attached to their vehicle. Additionally, a GPS tracker can find a person's location anywhere at any time, even when trailing is not possible. Furthermore, since a tracked person's location is digitized, it can be instantly analyzed to make inferences in ways that simple observation cannot [57]. If the issuing of warrants is not compulsory, there will be no barriers for police or security personnel to place tracking devices on any individual. Warrants are essential to ensure GPS tracking devices are used justly and ethically.

7.2. Accuracy

GPS can give error readings in particular conditions. Dense forest, tall buildings, cloud cover and moisture produce inaccuracies in readings but these are considered negligible when compared to the potential for inaccuracies in resultant information processing.

7.2.1. Who is responsible for the authenticity, fidelity and accuracy of information collected?

In the event of GPS failure or enforced shutdown by the U.S. government, companies whose mission-critical applications rely on GPS technology would incur heavy losses. The U.S. government has already released plans to shut down parts of the network in a “national crisis” to prevent terrorists from using the network [58]. Consider the implications for those organizations and customers that have become reliant on the technology, for example, criminals serving their sentence from home. And who is responsible for accuracy? The U.S. government created the system but it is under no obligation to ensure accuracy. Another concern is that sixteen of the twenty-eight GPS satellites currently in orbit are beyond their design life and are likely to fail in the near future [59]. At least two satellites are failing each year and launches of new satellites are barely keeping up. This poses problems for users of the GPS system in the longer term, which is why the more accurate European Galileo initiative is critical.

7.2.2. Who is to be held accountable for errors in information, and how is the injured party compensated?

Private companies who offer GPS tracking services avoid liability by introducing product descriptions, warranties and disclaimers [60]. In California several rental car companies were wrongly fining customers for breaking their rental agreement for allegedly leaving the state. Customers were asked to pay $3000 USD for something they did not do. As a result California became the first U.S. state to prohibit the use of GPS receivers by car rental companies to track their customers [33].

7.2.3. Is GPS an appropriate tracking technology for dementia wandering?

The Project Life Saver Organisation helps locate and return wandering dementia sufferers. They believe that GPS is not suitable for tracking persons with dementia, recognizing that GPS lacks four fundamental attributes of an assistive technology: reliability, responsiveness, practicality and affordability [39].

7.2.4. How can we ensure that errors in databases, data transmissions and data processing are accidental and not intentional?

Software used to store tracking data makes it possible to edit data points in order to create false evidence. Effectively, a person could be accused of a crime he or she did not commit. For this reason it is imperative that extensive validation checks are enforced to ensure data has not been tampered with. There is also concern about the intentional and unintentional jamming of GPS signals. Safeguards and laws restricting GPS jamming need to be advocated.
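One way such validation checks could be implemented (a sketch only, assuming a server-held secret key, and not a description of any vendor's product) is to chain a message authentication code over each stored fix, so that editing, reordering or deleting points after the fact becomes detectable:

import hmac, hashlib, json

def append_fix(log, fix, key: bytes):
    # Each entry's tag covers the fix itself plus the previous tag, so altering or
    # removing any earlier point invalidates every tag that follows it.
    prev_tag = log[-1]["tag"] if log else ""
    payload = json.dumps(fix, sort_keys=True) + prev_tag
    tag = hmac.new(key, payload.encode(), hashlib.sha256).hexdigest()
    log.append({"fix": fix, "tag": tag})

def verify_log(log, key: bytes) -> bool:
    # Recompute the chain; any tampered, reordered or dropped entry breaks it.
    prev_tag = ""
    for entry in log:
        payload = json.dumps(entry["fix"], sort_keys=True) + prev_tag
        expected = hmac.new(key, payload.encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, entry["tag"]):
            return False
        prev_tag = entry["tag"]
    return True

key = b"secret held by the monitoring agency"
log = []
append_fix(log, {"t": "2006-04-01T10:00Z", "lat": -34.42, "lon": 150.89}, key)
append_fix(log, {"t": "2006-04-01T10:02Z", "lat": -34.43, "lon": 150.90}, key)
print(verify_log(log, key))          # True
log[0]["fix"]["lat"] = -34.50        # a retrospective edit to the evidence
print(verify_log(log, key))          # False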

7.3. Property

7.3.1. Who owns the information?


Table 3. The ethical possibilities

The U.S. government owns the physical satellite system but who owns the information once it is collected? If a company collects and stores location information on a person who commits a crime, are they obliged to hand it over to the police?

7.3.2. What are the just and fair prices for exchange?

It is theoretically free to access GPS, as long as you have a receiver. Free service however, does not equate to commercial satisfaction. GPS-based voice service providers incur a cost for ‘priority access’, and therefore pass this cost onto their subscribers.

7.4. Accessibility

7.4.1. Who is allowed to use the GPS service?

One of the objectives set out by the GPS policy is the provision of worldwide “positioning, navigation, and timing services” [61]. However, the GPS policy also indicates that the GPS system can be shut down in certain areas “under only the most remarkable circumstances,” like in the event of a terrorist attack [62].

7.4.2. How much should be charged for permitting accessibility to information?

US policy proclaims that the GPS service is and will continue to be “free of direct user fees” [62]. However, private companies are billing customers to use services [63]. Costs may include payment for equipment and data transmission among other fees. There is also the possibility that information can be accessed illegally by a third party for sinister purposes.

7.4.3. Who will be provided with equipment needed for accessing information?

Parolee tracking is more cost-effective than detainment, but it is impossible to have all parolees and sex offenders tracked. So who will be tracked and who will not? In previous cases, less aggressive offenders have had GPS tracking devices attached first. Where radio tag tracking methods have been used, parolees have had to pay for their own tracking devices [24].

7.4.4. Is the tracking of parolees and sex offenders justified?

The three main reasons for tracking parolees and sex offenders appear to be: to save costs, to deter further crimes, and to enable controlled rehabilitation. The cost of tracking a person is much lower than incarceration. Tracking may deter some criminals from committing a similar offence, but if they are tracked at length they may lose awareness of their GPS device. In examining New Zealand's Bill of Rights (sec 21), the N.Z. Law Society (NZLS) found that authorities had the power to impose electronic monitoring on people who had already completed their sentences. The NZLS argued that extended supervision equated to “two punishments for the same crime”, but the government argued that the main purpose of the extended supervision was preventive, not punitive [64]. Others believe that tracking parolees grants them the opportunity to spend more time with family, acting to fast-track the rehabilitation process (Table 3).

Section VIII

Conclusion

Molnar and Wagner [65] ask the definitive question “[i]s the cost of privacy and security ‘worth it’?” Stajano [10] answers by reminding us that, “[t]he benefits for consumers remain largely hypothetical, while the privacy-invading threats are real.” Indeed, when we add to privacy concerns the unknown long-term health impacts, the potential changes to cultural, social and political interactions, the circumvention of religious and philosophical ideals, and a potential mandatory deployment, then the disadvantages of the technology might seem almost burdensome. For the present, proponents of emerging LBS applications rebuke any negatives “under the aegis of personal and national security, enhanced working standards, reduced medical risks, protection of personal assets, and overall ease-of-living” [9]. Unless there are stringent ethical safeguards, however, there is a potential for enhanced national security to come at the cost of freedom, or for enhanced working standards to devalue the importance of employee satisfaction. The innovative nature of the technology should not be cause to excuse it from the same “judicial or procedural constraints which limit the extent to which traditional surveillance technologies are permitted to infringe privacy” [56]. The aim of this present research is to understand the ethical implications of current LBS applications, with a view to emphasising the need for future innovators to ethically integrate these technologies into society.

References

1. B.W. Martin, "WatchIt: A Fully Supervised Identification, Location and Tracking System", Proceedings of the IEEE International Carnahan Conference on Security Technology, 1995, pp. 306-310.

2. D. Ashbrook and T. Starner, "Using GPS to Learn Significant Locations and Predict Movement Across Multiple Users", Personal and Ubiquitous Computing, 7, 2003, pp. 275-286.

3. K. Shimizu et al., "Location System for Dementia Wandering", Proceedings of the 22nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2, 2000, pp. 1556-1559.

4. J.E. Dobson and P.F. Fisher, "Geoslavery", IEEE Technology and Society Magazine, 22(1), 2003, pp. 47-52.

5. S.L. Garfinkel et al., "RFID Privacy: An Overview of Problem and Proposed Solutions", IEEE Security and Privacy Magazine, 3(3), 2005, pp. 38-43.

6. K. Michael and M.G. Michael, "Microchipping People: the Rise of the Electrophorus", Quadrant, March, 2005, pp. 22-33.

7. L. Perusco and K. Michael, "Humancentric Applications of Precise Location-Based Services", IEEE Conference on e-Business Engineering, IEEE Computer Society, Beijing, 2005, pp. 409-418.

8. G. Kaupins and R. Minch, "Legal and Ethical Implications of Employee Location Monitoring", Proceedings of the 38th Hawaii International Conference on System Sciences, http://csdl2.computer.org/comp/proceedings/ hicss/2005/2268/05/22680133a.pdf, 2005.

9. C. Perakslis and R. Wolk, "Social Acceptance of RFID as a Biometric Security Method", Proceedings of the IEEE Symposium on Technology and Society, 2005, pp. 79-87.

10. F. Stajano, "Viewpoint: RFID Is X-ray Vision", Communications of the ACM, 48(9), 2005, pp. 31-33.

11. Honderich, T. (ed.), The Oxford Companion to Philosophy, Oxford University Press, Oxford, 1995, p. 596.

12. J. Blom et al., "Contextual and Cultural Challenges for User Mobility Research", Communications of the ACM, 48(7), 2005, pp. 37-41.

13. A. Masters and K. Michael, "Humancentric Applications of RFID Implants: the Usability Contexts of Control, Convenience and Care", The Second IEEE International Workshop on Mobile Commerce and Services, IEEE Computer Society, Munich, 19th July, 2005, pp. 32-41.

14. Legal Information Institute, http://www4.law.cornell.edu/uscode/search/ index.html, 3 August, 2005.

15. MNP, "Gabriel Technologies Corp- Teams with Jefferson Consulting to Target Federal Homeland Security Markets", Market News Publishing, 6 April, 2006.

16. Reuters, "California Gang Members to be Tracked by GPS", Reuters, 17 March, 2006.

17. K. George, "Court Will Decide If Police Need Warrant for GPS 'Tracking'" http://seattlepi.nwsource.com/local/121572_gps12.html, Seattle PI, 12 May, 2003.

18. D. McCullagh, "Snooping by Satellite", CNET News, http://news.com.com/Snooping+by+satellite/2100-1028_3-5533560.html?tag=sas. email,12 January, 2005.

19. R. Dornin, "Judge Allows GPS Evidence in Peterson Case", CNN.com, http://www.cnn.com/2004/LAW/02/17/peterson.trial/, 17 February, 2004.

20. S. Finz and M. Taylor, "Peterson Tracking Device Called Flawed- Defense Wants GPS Evidence Shut Out of Trial", San Francisco Chronicle, http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2004/02/12/ BAG7P4V69B1.DTL, 12 February, 2004.

21. MSNBC.com, "Jurors: Peterson's Stoicism Was the Final Straw", Associated Press, http://msnbc.msn.com/id/6711259/, 14 December, 2004.

22. C. Parker, "GPS Tracking: the High-Tech Ball and Chain- System Lets Authorities Monitor Offenders and Helps Free Up Jail Space", The Morning Call, 17 April, 2006.

23. Monmonier, M. Spying with Maps: Surveillance Technologies and the Future of Privacy, University of Chicago Press, USA, 2002.

24. W. Saletan, "Call My Cell", http://slate.msn.com/id/2118117/, 6 May, 2005.

25. Scarborough Country, "Tracking Sex Offenders", http://www.msnbc.msn.com/id/7589426/, 21 April, 2005.

26. M. Murphy, "Satellite Tracking Plan for Pedophiles", The Age, http://www.theage.com.au/news/national/satellite-tracking-plan-for- pedophiles/2005/08/28/1125167554234.html?oneclick=true, 29 August, 2005.

27. T. Vermeer, "Satellite Tracking for Child Abusers", http://www.sundaytelegraph.news.com.austory0,9353,16406008-28778,00.html, 28 August, 2005.

28. AAP, "Qld: Minister Rules Out GPS Tracking of Sexual Offenders", Australian Associated Press General News, 10 April, 2006.

29. C. Campos, "Georgia Tracks Parolee by GPS", The Atlanta Journal-Constitution, http://www.ajc.com/metro/content/metro/1204/0101gps. html, 1 January, 2005.

30. J. Stockweel, "Checking Regularly On Sex Offenders; Home Visits By Police Seen As 'Proactive'", Washington Post, 10 April, 2006.

31. C. Banham and M. Wilkinson, "Track and Tag - The New War On Terrorism", Sydney Morning Herald, http://www.smh.com.au/news/ national/track-and-tag-the-new-war-on-terrorism/2005/09/08/1125772641058.html, 9 September, 2005.

32. A. Geller, "Bosses Use GPS to Keep Sharp Eye On Mobile Workers", Detroit News, http://www.detnews.com/2005/technology/0501/ 01/technology46929.htm, 1 January, 2005.

33. A.E. Cha, "Satellite Tracking Finds Daily Uses", Detroit News, http://www.detnews.com/2005/technology/0501/23/A09-67089.htm, 23 January, 2005.

34. Anonymous, "Off-Peak Car Insurance Launched", The Guardian, http://www.guardian.co.uk/business/story/0,3604,1388623,00.html, 12 January, 2005.

35. L. Turner, "Disney Unveils 'Family' Mobile Service", Total Telecom, 6 April, 2006.

36. D. White, "Privacy Group: GPS Tracking of Kids is Bad", http://www.mobilemag.com/content/100/350/C7512/, 20 April, 2006.

37. StarCaddy.com, "StarCaddy Handheld GPS Yardage Tool for Golfers", http://www.starcaddy.com/index, cfm, 2005.

38. Alzheimer's Society, "Alzheimer's Society Information Sheet Assistive Technology", http://www.alzheimers.org.uk/After_diagnosis/PDF/ 437_assistivetechnolgy.pdf, August, 2005.

39. G. Saunders, "GPS and Wandering: More Questions Than Answers", http://www.projectlifesaver.org/advisories.htm, August, 2005.

40. J. Loh et al., "Technology Applied to Address Difficulties of Alzheimer Patients and Their Partners", Proceedings of the Conference on Dutch Directions in Human Computer Interaction, 18, 2005.

41. Y. Yeebo, "Spyed Kids", Newsweek, http://www.msnbc.msn. eom/id/9135838/site/newsweek/, 1 September, 2005.

42. B. Grady, "Uses for GPS Devices Branching Out", The Oaklands Tribune, 20 March, 2006.

43. ENP, "Cyber Tracker Featured on Television News Reports on Teen Driving", ENP Newswire, 23 March, 2006.

44. M. Benns, "Parent-Controlled Car Immobilizer Risky, Says Costa", The Sun-Herald, 29 May, 2005, p. 19.

45. Anonymous, "School Bus of the Future", ABC Riverland SA, http://www.abc.net.au/riverland/stories/s1449899.htm, 31 August, 2005.

46. H. Bray, "GPS Spying May Prove Irresistible to Police", Boston.com, http://vww.boston.com/business/technology/articles/2005/01/ 17/gps_spying_may_prove_irresistible_to_police/, January, 2005.

47. R.O. Mason, "Four Ethical Issues of the Information Age", MIS Quarterly, 1986, pp. 4-12.

48. Turban, E. et al., Electronic Commerce 2002: A Managerial Perspective, Prentice Hall, New Jersey.

49. S.N. Roberts, "Tracking Your Children with GPS: Do You Have the Right?", Wireless Business and Technology, http://wireless.sys-con. com/read/41433.htm, 3(12), 2003.

50. M. Williams et al., "Wearable Computing and the Geographies of Urban Childhood- Working with Children to Explore the Potential of New Technology", Proceeding of the 2003 Conference on Interaction Design and Children, 2003, pp. 111-116.

51. BBC, "Concerns over GPS Child Tracking", BBC News Online, 20 April, 2006.

52. Anonymous, "Big Mother (or Father) is Watching", Sydney Morning Herald, http://www.smh.com.au/news/technology/big-mother-or-father- is-watching/2005/09/08/1125772632570.html, 9 September, 2005.

53. J. Weckert, "Trust and Monitoring in the Workplace", IEEE International Symposium on Technology and Society, 2000, pp. 245-250.

54. T. Lepeska, "GPS Would Pinpoint Workers Too", The Commercial Appeal, 4 April, 2006.

55. P. Kitchen, "They're Watching You- Employer Surveillance of Workers and Property Extends Further Than You Think", Pittsburgh Post-Gazette, 12 March, 2006.

56. I-E. Papasliotis, "Information Technology: Mining for Data and Personal Privacy: Reflections on an Impasse", Proceedings of the 4th International Symposium on Information and Communication Technologies, 2004, pp. 50-56.

57. A. Burak and T. Sharon, "Analysing Usage of Location Based Services", CHI 2003: New Horizons, Florida, 5-10 April, 2003, pp. 970-971.

58. T. Bridis, "Bush Prepares for Possible Shutdown of GPS Network in National Crisis", Detroit News, http://www.detnews.com/2004/ technology/0412/16/technology-34633.htm, 16 December, 2004.

59. L. Bingley, "GPS Users Must Plan for Outages", IT Week, http://www.itweek.co.uk/itweek/news/2142864/gps-users-plan-outages, 27 September, 2005.

60. D. R. Sovocool, "Legal Issues For Manufacturers, System Integrators, Vendors and Service Providers", Thelen Reid & Priest LLP, http://www.thelenreid.com/articles/article/art_37_idx.htm, 17 April, 2000.

61. OSTP, "US Global Positioning System Policy", Office of Science and Technology Policy, http://www.ostp.gov/NSTC/html/pdd6.html, 29 March, 1996.

62. Spacetoday, "White House releases GPS policy", spacetoday.net, http://www.spacetoday.net/Summary/2704, 16 December, 2004.

63. D. Taggart, "Usage of Commercial Satellite Systems for Homeland Security Communications", IEEE Aerospace Conference, 2, 2003, pp. 1155-1165.

64. R. Palmer, "Safety or Liberty?", Dominion Post, 1 April, 2006.

65. D. Molnar and D. Wagner, "Privacy: Privacy and Security in Library RFID: Issues, Practices, and Architectures", Proceedings of the 11th ACM Conference on Computer and Communications Security, 2004, pp. 210-219.

Citation: Michael, K.; McNamee, A.; Michael, M.G., "The Emerging Ethics of Humancentric GPS Tracking and Monitoring", ICMB '06: International Conference on Mobile Business, 26-27 June 2006, p. 34, DOI: 10.1109/ICMB.2006.43

IEEE Keywords: Ethics, Global Positioning System, Monitoring, Humans, Senior citizens, Law enforcement, Security, Protection, Usability, Context-aware services

INSPEC: Global Positioning System, ethical aspects, law enforcement, ethics, humancentric GPS tracking, location-based services, memory loss