Location-Based Privacy, Protection, Safety, and Security

Abstract

This chapter will discuss the interrelated concepts of privacy and security with reference to location-based services, with a specific focus on the notion of location privacy protection. The latter can be defined as the extent and level of control an individual possesses over the gathering, use, and dissemination of personal information relevant to their location, whilst managing multiple interests. Location privacy in the context of wireless technologies is a significant and complex concept given the dual and opposing uses of a single LBS solution. That is, an application designed or intended for constructive uses can simultaneously be employed in contexts that violate the (location) privacy of an individual. For example, a child or employee monitoring LBS solution may offer safety and productivity gains (respectively) in one scenario, but when employed in secondary contexts may be regarded as a privacy-invasive solution. Regardless of the situation, it is valuable to initially define and examine the significance of “privacy” and “privacy protection,” prior to exploring the complexities involved.

16.1 Introduction

Privacy is often expressed as the most complex issue facing location-based services (LBS) adoption and usage [44, p. 82; 61, p. 5; 66, pp. 250–254; 69, pp. 414–415]. This is due to numerous factors such as the significance of the term in relation to human rights [65, p. 9]. According to a report by the Australian Law Reform Commission (ALRC), “privacy protection generally should take precedence over a range of other countervailing interests, such as cost and convenience” [3, p. 104]. The intricate nature of privacy is also a result of the challenges associated with accurately defining the term [13, p. 4; 74, p. 68]. That is, privacy is a difficult concept to articulate [65, p. 13], as the term is liberally and subjectively applied, and the boundaries constituting privacy protection are unclear. Additionally, privacy literature is dense, and contains varying interpretations, theories and discrepancies as to what constitutes privacy. However, as maintained by [65, p. 67], “[o]ne point on which there seems to be near-unanimous agreement is that privacy is a messy and complex subject.” Nonetheless, as asserted by [89, p. 196], privacy is fundamental to the individual due to various factors:

The intensity and complexity of life, attendant upon advancing civilization, have rendered necessary some retreat from the world, and man, under the refining influence of culture, has become more sensitive to publicity, so that solitude and privacy have become more essential to the individual.

The Oxford English Dictionary definition of security is the “state of being free from danger or threat.” A definition of security applicable to this research frames it as “a condition in which harm does not arise, despite the occurrence of threatening events; and as a set of safeguards designed to achieve that condition” [92, pp. 390–391]. Security and privacy are often confused in LBS scholarship. Elliot and Phillips [40, p. 463] warn that “[p]rivacy is not the same as security,” although the two themes are related [70, p. 14]. Similarly, Clarke [21] states that the term privacy is often used by information and communication technology professionals to describe data and data transmission security. The importance of security is substantiated by the fact that it is considered “a precondition for privacy and anonymity” [93, p. 2], and as such the two themes are intimately connected. In developing this chapter and surveying security literature relevant to LBS, it became apparent that existing scholarship is varied but nonetheless explores three key areas: (1) security of data or information, (2) personal safety and physical security, and (3) national security (security of a nation or homeland), interrelated categories adapted from [70, p. 12].

This chapter will discuss the interrelated concepts of privacy and security with reference to LBS, with a specific focus on the notion of location privacy protection. The latter can be defined as the extent and level of control an individual possesses over the gathering, use, and dissemination of personal information relevant to their location [38, p. 1; 39, p. 2; 53, p. 233], whilst managing multiple interests (as described in Sect. 16.1.1). Location privacy in the context of wireless technologies and LBS is a significant and complex concept given the dual and opposing uses of a single LBS solution. That is, an application designed or intended for constructive uses can simultaneously be employed in contexts that violate the (location) privacy of an individual. For example, a child or employee monitoring LBS solution may offer safety and productivity gains (respectively) in one scenario, but when employed in secondary contexts may be regarded as a privacy-invasive solution. Regardless of the situation, it is valuable to initially define and examine the significance of “privacy” and “privacy protection,” prior to exploring the complexities involved.

16.1.1 Privacy: A Right or an Interest?

According to Clarke [26, pp. 123–129], the notions of privacy and privacy protection have been important social issues since the 1960s. An enduring definition of privacy is the “right to be let alone” [89, p. 193]. This definition requires further consideration, as it is quite simplistic in nature and does not encompass the diverse dimensions of privacy. For further reading on the development of privacy and the varying concepts, including that of Warren and Brandeis, see [76]. Numerous scholars have attempted to provide a more workable definition of privacy than that offered by Warren and Brandeis.

For instance, [21] maintains that perceiving privacy simply as a right is problematic and narrow, and that privacy should rather be viewed as an interest or collection of interests, which encompasses a number of facets or categories. As such, privacy is defined as “the interest that individuals have in sustaining a ‘personal space’, free from interference by other people and organisations” [21, 26]. In viewing privacy as an interest, the challenge is in balancing multiple interests in the name of privacy protection. This, as Clarke [21] maintains, includes opposing interests in the form of one’s own interests, the interests of other people, and/or the interests of organizations and society. As such, Clarke refers to privacy protection as “a process of finding appropriate balances between privacy and multiple competing interests.”

16.1.2 Alternative Perspectives on Privacy

Solove’s [80] taxonomy of privacy offers a unique, legal perspective on privacy by grouping privacy challenges under the categories of information collection, information processing, information dissemination, and invasion. Refer to [80, pp. 483–558] for an in-depth overview of the taxonomy, which includes subcategories of the privacy challenges. Nissenbaum [65, pp. 1–2], on the other hand, maintains that existing scholarship generally expresses privacy in view of restricting access to, and maintaining control over, personal information. For example, Quinn [73, p. 213] insists that the central theme in privacy debates is that of access, including physical access to an individual, in addition to information access. With respect to LBS and location privacy, Küpper and Treu [53, pp. 233–234] agree with the latter, distinguishing three categories of access: (1) third-party access by intruders and law enforcement personnel/authorities, (2) unauthorized access by providers within the supply chain for malicious purposes, and (3) access by other LBS users. Nissenbaum [65, pp. 1–2] disputes the interpretation focused on access and control, noting that individuals are not interested in “simply restricting the flow of information but ensuring that it flows appropriately.” As such, Nissenbaum offers the framework of contextual integrity as a means of determining when certain systems and practices violate privacy by transforming existing information flows inappropriately [65, p. 150]. The framework serves as a possible tool that can assist in justifying the need for LBS regulation.

A primary contribution from Nissenbaum is her emphasis on the importance of context in determining the privacy-violating nature of a specific technology-based system or practice. In addition to an appreciation of context, Nissenbaum recognizes the value of perceiving technology with respect to social, economic, and political factors and interdependencies. That is, devices and systems should be considered as socio-technical units [65, pp. 5–6].

In relation to privacy, and given the importance of socio-technical systems, the complexities embedded within privacy may, therefore, arise from the fact that the term can be examined from a number of perspectives. For instance, it can be understood in terms of its philosophical, psychological, sociological, economic, and political significance [21, 26]. Alternatively, privacy theory can provide varying means of interpretation, given that available approaches draw on inspiration from multiple disciplines such as computer science and engineering, amongst others [65, p. 67]. It is also common to explore privacy through its complex dimensions.

According to Privacy International, for instance, the term comprises the aspects of information privacy, bodily privacy, privacy of communications, and territorial privacy [72]. Similarly, in providing a contemporary definition of privacy, Clarke [26] uses Maslow’s hierarchy of needs to define the various categories of privacy; that is, “privacy of the person,” “privacy of personal behavior,” “privacy of personal communications,” and “privacy of personal data.” Clarke argues that since the late 1960s the term has been confined, in a legal sense, to the last two categories. That is, privacy laws have been restricted in their focus in that they are predominantly based on the OECD fair information principles, and lack coverage of other significant categories of privacy. Therefore, the label of information privacy, typically interchangeable with data privacy, is utilized in reference to the combination of communications and data privacy [21], and is cited by [58, pp. 5–7] as a significant challenge in the information age.

16.2 Background

16.2.1 Defining Information Privacy

In Alan Westin’s prominent book Privacy and Freedom, information privacy is defined as “the right of individuals, groups and institutions to determine for themselves, when, how and to what extent information about them is communicated to others” [90, p. 7]. Information in this instance is personal information that can be linked to or identify a particular individual [33, p. 326]. For a summary of information privacy literature and theoretical frameworks, presented in tabular form, refer to [8, pp. 15–17].

16.2.2 Information Privacy Through the Privacy Calculus Perspective

For the purpose of this chapter, it is noteworthy that information privacy can be studied through differing lenses, one of which is the privacy calculus theoretical perspective. Xu et al. [95, p. 138] explain that “the calculus perspective of information privacy interprets the individual’s privacy interests as an exchange where individuals disclose their personal information in return for certain benefits.” It can be regarded as a form of “cost–benefit analysis” conducted by the individual, where privacy is likely to be (somewhat) relinquished if there is a perceived net benefit resulting from information disclosure [33, p. 327]. This perspective acknowledges the claim that privacy-related issues and concerns are not constant, but rather depend on perceptions, motivations, and conditions that are context or situation dependent [78, p. 353]. A related notion is the personalization–privacy paradox, which captures the tension between an individual’s desire to reap the benefits of personalized services and the need to divulge personal information, which may potentially threaten or invade their privacy. An article by Awad and Krishnan [8] examines this paradox, with specific reference to online customer profiling to deliver personalized services. The authors recommend that organizations work on increasing the perceived benefit and value of personalized services to ensure “the potential benefit of the service outweighs the potential risk of a privacy invasion” [8, p. 26].
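To make the cost–benefit reading of the privacy calculus concrete, the toy sketch below expresses the disclosure decision as a simple comparison of perceived benefits and perceived risks. It is a schematic illustration only; the function name, numeric weights, and zero threshold are hypothetical and are not drawn from the studies cited above.

```python
# Toy illustration of the privacy calculus as a cost-benefit comparison.
# All values and the decision threshold are hypothetical.
def willing_to_disclose(perceived_benefits: list, perceived_risks: list) -> bool:
    """Return True if the perceived net benefit of disclosing information is positive."""
    return sum(perceived_benefits) - sum(perceived_risks) > 0

# Example: personalization value and convenience weighed against profiling and misuse risks.
print(willing_to_disclose([0.6, 0.3], [0.5, 0.2]))  # True: disclose
print(willing_to_disclose([0.4], [0.5, 0.3]))       # False: withhold
```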

In the LBS context, more specifically, Xu et al. [94] build on the privacy calculus framework to investigate the personalization–privacy paradox as it pertains to overt and covert personalization in location-aware marketing. The results of the study suggest that the personalization approach (overt or covert) affects perceived privacy risk and perceived value. A complete overview of results can be found in [94, pp. 49–50]. For further information regarding the privacy calculus and the personalization–privacy paradox in the context of ubiquitous commerce applications including LBS, refer to [78]. These privacy-related frameworks and concepts are intended to be introductory, offering an appreciation of the varied perspectives on privacy and information privacy, and of the importance of context, rather than a thorough treatment of either topic. Such notions are particularly pertinent when reflecting on privacy and the role of emerging information and communication technologies (ICTs) in greater detail.

16.2.3 Emerging Technologies, m-Commerce and the Related Privacy Challenges

It has been suggested that privacy concerns have been amplified (but not driven) by the emergence and increased use of ICTs, with the driving force being the manner in which these technologies are implemented by organizations [21, 26]. In the m-commerce domain, mobile technologies are believed to heighten the threat to consumer privacy. That is, the intensity of marketing activities can potentially be increased with the availability of timely location details and, more significantly, tracking information, thereby enabling greater influence over consumer behavior [25]. The threat, however, is not solely derived from usage by organizations. Specifically, technologies originally introduced for use by government and organizational entities are now available for adoption by members of the community. For further elaboration, refer to Abbas et al. [1] and chapter 8 of Andrejevic [4]. Thus, location (information) privacy protection emerges as a substantial challenge for the government, business, and consumer sectors.

16.2.4 Defining Location (Information) Privacy

Location privacy, regarded as a subset of information privacy, has been defined and presented in various ways. Duckham [38, p. 1] believes that location privacy is “the right of individuals to control the collection, use, and communication of personal information about their location.” Küpper and Treu [53, p. 233] define location privacy as “the capability of the target person to exercise control about who may access her location information in which situation and in which level of detail.” Both definitions focus on the aspect of control, cited as a focal matter regarding location privacy [39, p. 2]. With specific reference to LBS, location privacy and related challenges are considered to be of utmost importance. For example, Perusco and Michael [70, pp. 414–415], in providing an overview of studies relating to the social implications of LBS, claim that the principal challenge is privacy.

In [61, p. 5], Michael et al. also state, with respect to GPS tracking, that privacy is the “greatest concern,” leading the authors to propose a number of questions relating to the type of location information that should be revealed to other parties, the acceptability of child tracking and employee monitoring, and the requirement for a warrant in the tracking of criminals and terrorists. Similarly, Bennett and Crowe [12, pp. 9–32] identify the privacy threats faced by various individuals, for instance those in emergency situations, mobile employees/workers, vulnerable groups (e.g., the elderly), family members (notably children and teenagers), telematics application users, rental car clients, recreational users, prisoners, and offenders. In several of these circumstances, location privacy must often be weighed against other conflicting interests, an example of which is the emergency management situation. For instance, Aloudat [2, p. 54] refers to the potential “deadlock” between privacy and security in the emergency context, noting public concerns associated with the move towards a “total surveillance society.”

16.2.5 Data or Information Security

It has been suggested that data or information security in the LBS domain involves prohibiting unauthorized access to location-based information, which is considered a prerequisite for privacy [88, p. 121]. This form of security is concerned with “implementing security measures to ensure that collected data is only accessed for the agreed-upon purpose” [46, p. 1]. It is not, however, limited to access but is also related to “unwanted tracking” and the protection of data and information from manipulation and distortion [10, p. 185]. The techniques and approaches available to prevent unauthorized access and minimize chances of manipulation include the use of “spatially aware access control systems” [34, p. 28] and security- and privacy-preserving functionality [9, p. 568]. The intricacies of these techniques are beyond the scope of this investigation. Rather, this section is restricted to coverage of the broad data and information security challenges and the resultant impact on LBS usage and adoption.
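As a purely illustrative aside, the sketch below shows what a simple spatially aware access-control check of the kind referenced above might look like. The role names, bounding-box policy, and function are hypothetical simplifications; they do not reproduce the RBAC-based system of [34] or the framework of [9].

```python
# Illustrative sketch of a spatially aware access-control check.
# Policy structure, roles, and coordinates are hypothetical.
from dataclasses import dataclass

@dataclass
class Policy:
    allowed_roles: set  # roles permitted to read location data
    min_lat: float      # bounding box of the permitted region
    max_lat: float
    min_lon: float
    max_lon: float

def access_permitted(role: str, lat: float, lon: float, policy: Policy) -> bool:
    """Grant access only if the requester holds an allowed role and the
    requested coordinates fall inside the policy's permitted region."""
    in_region = (policy.min_lat <= lat <= policy.max_lat
                 and policy.min_lon <= lon <= policy.max_lon)
    return role in policy.allowed_roles and in_region

# Example: a dispatcher may read locations only within a defined service area.
policy = Policy({"dispatcher"}, -34.6, -34.2, 150.6, 151.0)
print(access_permitted("dispatcher", -34.4, 150.9, policy))  # True
print(access_permitted("marketing", -34.4, 150.9, policy))   # False
```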

16.2.6 Impact of Data or Information Security on LBS Market Adoption

It has been suggested that data and information security is a fundamental concern influencing LBS market adoption. From a legal standpoint, security is an imperative concept, particularly in cases where location information is linked to an individual [41, p. 22]. In such situations, safeguarding location data or information has often been described as a decisive aspect impacting on user acceptance. These claims are supported in [85, p. 1], which notes that user acceptance of location- and context-aware m-business applications is closely linked to security challenges. Hence, from the perspective of organizations wishing to be “socially-responsive,” Chen et al. [19, p. 7] advise that security breaches must be avoided in the interest of economic stability:

Firms must reassure customers about how location data are used…A security lapse, with accompanying publicity in the media and possible ‘negligence’ lawsuits, may prove harmful to both sales and the financial stability of the firm.

Achieving satisfactory levels of security in location- and context-aware services, however, is a challenging task given the general issues associated with the development of security solutions; inevitable conflicts between protection and functionality; mobile-specific security challenges; inadequacy of standards to account for complex security features; and privacy and control-related issues [85, pp. 1–2]. Furthermore, developing secure LBS involves consideration of multiple factors, specifically those related to data or information accuracy, loss, abuse, unauthorized access, modification, storage, and transfer [83, p. 10]. There is the additional need to consider security issues from multiple stakeholder perspectives, in order to identify shared challenges and accurately assess their implications and the manner in which suitable security features can be integrated into LBS solutions. Numerous m-business security challenges relevant to LBS from various perspectives are listed in [85]. Data security challenges relevant to LBS are also discussed in [57, pp. 44–46].

16.3 Privacy and Security Issues

16.3.1 Access to Location Information Versus Privacy Protection

The issue of privacy in emergency situations, in particular, is delicate. For instance, Quinn [73, p. 225] remarks on the benefits of LBS in safety-related situations, with particular reference to the Enhanced 911 (E911) mandate in the US, which stipulates that the location of mobile phones be provided in emergency situations, aiding in emergency response efforts. The author goes on to identify “loss of privacy” as a consequence of this service, specifically in cases where location details are provided to third parties [73, p. 226]. Such claims imply that there may be conflicting aims in developing and utilizing LBS. Duckham [38, p. 1] explains this point, stating that the major challenge in the LBS realm is managing the competing aims of enabling improved access to location information versus allowing individuals to maintain a sufficient amount of control over such information. The latter is achieved through the deployment of techniques for location privacy protection.

16.3.2 Location Privacy Protection

It is valid at this point to discuss approaches to location privacy protection. Bennett and Grant [13, p. 7] claim that general approaches to privacy protection in the digital age may come in varied forms, including, but not limited to, privacy-enhancing technologies, self-regulation approaches, and advocacy. In terms of LBS, substantial literature is available proposing techniques for location privacy protection, at both the theoretical and practical levels. A number of these techniques are best summarized in [39, p. 13] as “regulation, privacy policies, anonymity, and obfuscation.” A review of complementary research on the topic of privacy and LBS indicates that location privacy has predominantly been examined in terms of the social challenges and trade-offs from theoretical and practical perspectives; the technological solutions available to maintain location privacy; and the need for other regulatory response(s) to address location privacy concerns. The respective streams of literature are examined further in this chapter.

16.3.3 Social Challenges and Trade-Offs

In reviewing existing literature, the social implications of LBS with respect to privacy tend to be centered on the concepts of invasion, trade-off, and interrelatedness and complexity. The first refers primarily to the perceived and actual intrusion or invasion of privacy resulting from LBS development, deployment, usage, and other aspects. The trade-off notion, in turn, signifies the weighing of privacy interests against other competing factors, notably privacy versus convenience (including personalization) and privacy versus national security. Finally, interrelatedness and complexity refer to the complicated relationship between privacy and other ethical dilemmas or themes such as control, trust, and security.

With respect to the invasion concept, Westin notes that concerns regarding invasion of privacy were amplified during the 1990s in both the social and political spheres [91, p. 444]. Concentrating specifically on LBS, [62, p. 6] provides a summary of the manner in which LBS can be perceived as privacy-invasive, claiming that GPS tracking activities can threaten or invade the privacy of the individual. According to the authors, such privacy concerns can be attributed to a number of issues regarding the process of GPS tracking. These include: (1) questionable levels of accuracy and reliability of GPS data, (2) potential to falsify the data post-collection, (3) capacity for behavioral profiling, (4) ability to reveal spatial information at varying levels of detail depending on the GIS software used, and (5) potential for tracking efforts to become futile upon extended use as an individual may become nonchalant about the exercise [62, pp. 4–5]. Other scholars examine the invasion concept in various contexts. Varied examples include [55] in relation to mobile advertising, [51] in view of monitoring employee locations, and [79] regarding privacy invasion and legislation in the United States concerning personal location information.

Current studies declare that privacy interests must often be weighed against other, possibly competing, factors, notably the need for convenience and national security. That is, various strands of LBS literature focus on addressing the trade-off between convenience and privacy protection. For instance, in a field study of mobile guide services, Kaasinen [50, p. 49] supports the need for resolving such a trade-off, arguing that “effortless use” often results in lower levels of user control and, therefore, privacy. Other scholars reflect on the trade-off between privacy and national security. In an examination of the legal, ethical, social, and technological issues associated with the widespread use of LBS, Perusco et al. [71] propose the LBS privacy–security dichotomy. The dichotomy is a means of representing the relationship between the privacy of the individual and national security concerns at the broader social level [71, pp. 91–97]. The authors claim that a balance must be achieved between both factors. They also identify the elements contributing to privacy risk and security risk, expressing the privacy risks associated with LBS to be omniscience, exposure, and corruption, claiming that the degree of danger is reduced with the removal of a specific risk [71, pp. 95–96]. The lingering question proposed by the authors is “how much privacy are we willing to trade in order to increase security?” [71, p. 96]. Whether in the interest of convenience or national security, existing studies focus on the theoretical notion of the privacy calculus. This refers to a situation in which an individual attempts to balance perceived value or benefits arising from personalized services against loss of privacy in determining whether to disclose information (refer to [8, 33, 78, 94, 95]).

The relationship between privacy and other themes is a common topic of discussion in existing literature. That is, privacy, control, security, and trust are key and interrelated themes concerning the social implications of LBS [71, pp. 97–98]. It is, therefore, suggested that privacy and the remaining social considerations be studied in light of these associations rather than as independent themes or silos of information. In particular, privacy and control literature are closely correlated, and as such the fields of surveillance and dataveillance must be flagged as crucial in discussions surrounding privacy. Additionally, there are studies which suggest that privacy issues are closely linked to notions of trust and perceived risk in the minds of users [44, 48, 49], thereby affecting a user’s decision to engage with LBS providers and technologies. It is commonly acknowledged in LBS privacy literature that resolutions will seek consensus between issues of privacy, security, control, risk, and trust, all of which must be technologically supported.

16.3.4 Personal Safety and Physical Security

LBS applications are often justified as valid means of maintaining personal safety, ensuring physical security, and generally avoiding dangerous circumstances, through solutions that can be utilized for managing emergencies, tracking children, monitoring individuals suffering from illness or disability, and preserving security in employment situations. Researchers have noted that safety and security efforts may be enhanced merely through knowledge of an individual’s whereabouts [71, p. 94], offering notable advantages for care applications [61, p. 4].

16.3.5 Applications in the Marketplace

Devices and solutions that capitalize on these facilities have thus been developed, and are now commercially available for public use. They include GPS-enabled wristwatches, bracelets, and other wearable items [59, pp. 425–426], in addition to their supportive applications that enable remote viewing or monitoring of location (and other) information. Assistive applications are one example, including technologies and solutions suited to the navigation requirements of vision-impaired or blind individuals [75, p. 104; example applications are described on pp. 104–105].

Alternative applications deliver tracking capabilities as their primary function; an example is the Australian-owned Fleetfinder PT2 Personal Tracker, which is advertised as a device capable of safeguarding children, teenagers, and the elderly [64]. These devices and applications promise “live on-demand” tracking and “a solid sense of reassurance” [15], which may be appealing for parents, carers, and individuals interested in protecting others. Advertisements and product descriptions are often emotionally charged, taking advantage of an individual’s (parent or carer) desire to maintain the safety and security of loved ones:

Your child going missing is every parent’s worst nightmare. Even if they’ve just wandered off to another part of the park the fear and panic is instant… [It] will help give you peace of mind and act as an extra set of eyes to look out for your child. It will also give them a little more freedom to play and explore safely [56].

16.3.6 Risks Versus Benefits of LBS Security and Safety Solutions

Despite such promotion and endorsement, numerous studies point to the dangers of LBS safety and security applications. Since their inception, individuals and users have voiced privacy concerns, which have been largely disregarded by proponents of the technology, chiefly vendors, given the (seemingly) voluntary nature of technology and device usage [6, p. 7]. The argument that technology adoption is optional, thereby placing the onus on the user, is weak and flawed, particularly in situations where an individual is incapable of making an informed decision regarding monitoring activities, and given covert deployment options that may render monitoring activities obligatory. The consequences arising from covert monitoring are explored in [59] (refer to pp. 430–432 for implications of covert versus overt tracking of family members) and [1]. Covert and/or mandatory overt monitoring of minors and individuals suffering from illness is particularly problematic, raising doubt and questions in relation to the necessity of consent processes, in addition to the suitability of tracking and what constitutes appropriate use.

In [59, p. 426], Mayer claims that there is a fine line between using tracking technologies, such as GPS, for safety purposes within the family context and improper use. Child tracking, for instance, has been described as a controversial area centered on the safety versus trust and privacy debate [77, p. 7]. However, the argument is not limited to issues of trust and privacy. Patel discusses the dynamics in the parent–child relationship and conveys a number of critical points in relation to wearable and embedded tracking technologies. In particular, Patel provides the legal perspective on child (teenager) monitoring [68, pp. 430–435] and other emergent issues or risks (notably linked to embedded monitoring solutions), which may be related to medical complications, psychological repercussions, and unintended or secondary use [68, pp. 444–455]. In Patel’s article, these issues are weighed against an explanation of how parental fears regarding child safety, some of which are unfounded, together with the media’s publicizing of such cases, fuel parents’ desire to monitor teenagers; according to the author, however, the decision to be monitored, particularly using embedded devices, should ultimately lie with the teenager [68, pp. 437–442].

16.3.7 Safety of “Vulnerable” Individuals

Similarly, monitoring individuals with an illness or intellectual disability, such as a person with dementia who wanders, raises a unique set of challenges in addition to the aforementioned concerns associated with consent, psychological issues, and misuse in the child or teenager tracking scenario. For instance, while dementia-wandering and other similar applications are designed to facilitate the protection and security of individuals, they can concurrently be unethical in situations where reliability and responsiveness, amongst other factors, are in question [61, p. 7]. A recent qualitative focus-group study seeking the attitudes of varied stakeholders in relation to the use of GPS for individuals with cognitive disabilities [54, p. 360] found this to be an area fraught with indecision as to the suitability of assistive technologies [54, p. 358]. The recommendations emerging from [54, pp. 361–364] indicate the need to “balance” safety with independence and privacy, to ensure that the individual suffering from dementia is involved in the decision to utilize tracking technologies, and that a consent process is in place, among other suggestions that are technical and device-related.

While much can be written about LBS applications in the personal safety and physical security categories, including their advantages and disadvantages, this discussion is limited to introductory material. Relevant to this chapter is the portrayal of the tensions arising from the use of solutions originally intended for protection and the resultant consequences, some of which are indeed inadvertent. That is, while the benefits of LBS are evident in their ability to maintain safety and security, they can also introduce risks, such as the use of LBS to cyberstalk others. In establishing the need for LBS regulation, it is, therefore, necessary to appreciate that there will always be a struggle between benefits and risks relating to LBS implementation and adoption.

16.3.8 National Security

Safety and security debates are not restricted to family situations but may also incorporate, as [59, p. 437] indicates, public safety initiatives and considerations, amongst others, that can contribute to the decline in privacy. These schemes include national security, which has been regarded as a priority area by various governments for over a decade. The Australian government affirms that the nation’s security can be compromised or threatened through various acts of “espionage, foreign interference, terrorism, politically motivated violence, border violations, cyber attack, organised crime, natural disasters and biosecurity events” [7]. Accordingly, technological approaches and solutions have been proposed and implemented to support national security efforts in Australia, and globally. Positioning technologies, specifically, have been adopted as part of government defense and security strategies, thereby facilitating increased surveillance; a detailed examination can be found in [60]. Surveillance schemes have, therefore, emerged as a result of the perceived and real threats to national security promoted by governments [92, p. 389], and according to [63, p. 2] have been legitimized as a means of ensuring national security, thereby granting governments “extraordinary powers that never could have been justified previously” [71, p. 94]. In [20, p. 216], Cho maintains that the fundamental question is “which is the greater sin—to invade privacy or to maintain surveillance for security purposes?”

16.3.9 Proportionality: National Security Versus Individual Privacy

The central theme surfacing in relevant LBS scholarship is that of proportionality; that is, measuring the prospective security benefits against the impending privacy- and freedom-related concerns. For example, [71, pp. 95–96] proposes the privacy–security dichotomy as a means of illustrating the need for balance between an individual’s privacy and a nation’s security, where the privacy and security elements within the model contain subcomponents that collectively amplify risk in a given context. A key point to note in view of this discussion is that while the implementation of LBS may enhance security levels, this will inevitably come at the cost of privacy [71, pp. 95–96] and freedom [61, p. 9].

Furthermore, forsaking privacy corresponds to relinquishing personal freedom, a consequential cost of heightened security in threatening situations. Such circumstances weaken the effects of invasive techniques and increase, to some degree, individuals’ tolerance of them [41, p. 12]. In particular, they “tilt the balance in favor of sacrificing personal freedom for the sake of public safety and security” [36, p. 50]. For example, Davis and Silver [35] report that the trade-off between civil liberties and security is often correlated with an individual’s sense of threat. In reporting on a survey of Americans after the events of September 11, 2001, the authors conclude that civil liberties are often relinquished in favor of security in high-threat circumstances [35, p. 35], in that citizens are “willing to tolerate greater limits on civil liberties” [35, p. 74]. Similarly, in a dissertation centered on the social implications of auto-ID and LBS technologies, Tootell [86] presents the Privacy, Security, and Liberty Trichotomy as a means of understanding the interaction between the three values [86, chapter 6]. Tootell concludes that a dominant value will always exist that is unique to each individual [86, pp. 162–163].

Furthermore, researchers such as Gould [45, p. 75] have found that while people generally approve of enhanced surveillance, they simultaneously harbor uncertainties regarding government monitoring. From a government standpoint, there is a commonly held but weak view that if an individual has nothing to hide, then privacy is insignificant, an argument particularly popular in relation to state-based surveillance [81, p. 746]. However, this perspective has inherent flaws, as the right to privacy should not be narrowly perceived in terms of concealment of what would be considered unfavorable activities, as discussed further in [81, pp. 764–772]. Furthermore, the “civil liberties vs. security trade-off has mainly been framed as one of protecting individual rights or civil liberties from the government as the government seeks to defend the country against a largely external enemy” [35, p. 29].

Wigan and Clarke state, in relation to national security, that “surveillance systems are being developed without any guiding philosophy that balances human rights against security concerns, and without standards or guidance in relation to social impact assessment, and privacy design features” [92, p. 400]. Solove [82, p. 362] agrees that a balance can be achieved between security and liberty, through oversight and control processes that restrict prospective uses of personal data. In the current climate, given the absence of such techniques, fears of an Orwellian society dominated by intense and excessive forms of surveillance materialize. However, Clarke [27, p. 39] proposes a set of “counterveillance” principles in response to extreme forms of surveillance introduced in the name of national security, which include:

independent evaluation of technology; a moratorium on technology deployments; open information flows; justification of proposed measures; consultation and participation; evaluation; design principles; balance; independent controls; nymity and multiple identity; and rollback.

The absence of such principles creates a situation in which extremism reigns, producing a flow-on effect with potentially dire consequences in view of privacy, but also trust and control.

16.4 Solutions

16.4.1 Technological Solutions

In discussing technology and privacy in general, Krumm [52, p. 391] notes that computation-based mechanisms can be employed both to safeguard and to invade privacy. It is, therefore, valuable to distinguish between privacy-invasive technologies (PITs) and privacy-enhancing technologies (PETs). Clarke [23] examines the conflict between PITs and PETs, which are tools that can be employed to invade and protect privacy interests respectively. Technologies can invade privacy either deliberately as part of their primary purpose, or alternatively their invasive nature may emerge in secondary uses [23; 24, p. 209]. The aspects contributing to the privacy-invasive nature of location and tracking technologies or transactions include the awareness level of the individual, whether an individual has a choice, and the capability of performing an anonymous transaction, amongst others [22]. In relation to LBS, [23] cites person-location and person-tracking systems as potential PITs that require the implementation of countermeasures, which to date have come in the form of PETs or “counter-PITs.”

Existing studies suggest that the technological solutions (i.e., counter-PITs) available to address the LBS privacy challenge are chiefly concerned with degrading the ability to pinpoint location, or alternatively masking the identity of the user. For example, [62, p. 7] suggests that “[l]evels of privacy can be controlled by incorporating intelligent systems and customizing the amount of detail in a given geographic information system”, thus enabling the ethical use of GPS tracking systems. Similarly, other authors present models that anonymize user identity through the use of pseudonyms [14], architectures and algorithms that decrease location resolution [46], and systems that introduce degrees of obfuscation [37]. Notably, scholars such as Duckham [37, p. 7] consider location privacy protection as involving multiple strategies, citing regulatory techniques and privacy policies as supplementary strategies to techniques that are more technological in nature, such as obfuscation.
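To illustrate the flavor of such counter-PITs, the sketch below pairs a simple pseudonymization step with coordinate obfuscation that degrades location resolution by a bounded random offset. It is a minimal, hypothetical example; it does not reproduce the mix-zone model of [14], the architectures of [46], or the obfuscation algorithms of [37], and the hashing scheme and noise radius are assumptions made purely for illustration.

```python
# Minimal sketch of two counter-PIT techniques: pseudonymity and spatial
# obfuscation. Hash scheme and noise radius are illustrative assumptions.
import hashlib
import math
import random

def pseudonym(user_id: str, salt: str) -> str:
    """Replace a stable identifier with a salted hash so that location
    reports cannot be trivially linked back to the user."""
    return hashlib.sha256((salt + user_id).encode()).hexdigest()[:16]

def obfuscate(lat: float, lon: float, radius_m: float = 500.0):
    """Degrade location resolution by shifting the true position a random
    distance (up to radius_m metres) in a random direction."""
    bearing = random.uniform(0, 2 * math.pi)
    distance = random.uniform(0, radius_m)
    dlat = (distance * math.cos(bearing)) / 111_320  # metres per degree of latitude
    dlon = (distance * math.sin(bearing)) / (111_320 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

# Example: report a coarse, pseudonymous position instead of the raw reading.
print(pseudonym("alice@example.com", salt="session-2024"))
print(obfuscate(-34.4278, 150.8931))  # true position shifted by up to 500 m
```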

16.4.2 Need for Additional Regulatory Responses

Clarke and Wigan [31] examine the threats posed by location and tracking technologies, particularly those relating to privacy, stating that “[t]hose technologies are now well-established, yet they lack a regulatory framework.” A suitable regulatory framework for LBS (that addresses privacy amongst other social and ethical challenges) may be built on numerous approaches, including the technical approaches described in Sect. 16.4.1. Other approaches are explored by Xu et al. [95] in their quasi-experimental survey of privacy challenges relevant to push versus pull LBS. The approaches include compensation (incentives), industry self-regulation, and government regulation strategies [95, p. 143]. According to Xu et al., these “intervention strategies” may have an impact on the privacy calculus in LBS [95, pp. 136–137]. Notably, their survey of 528 participants found that self-regulation has a considerable bearing on perceived risk for both push and pull services, whereas the effects of compensation and government regulation vary depending on the type of service. That is, compensation increases perceived benefit in the push but not the pull model and, similarly, government regulation reduces perceived privacy risk in the push-based model [95, p. 158].

It should be acknowledged that a preliminary step in seeking a solution to the privacy dilemma, addressing the identified social concerns, and proposing appropriate regulatory responses is to clearly identify and assess the privacy-invasive elements of LBS in a given context; Australia is used as the example in this instance. Possible techniques that can be employed to identify risks and implications, and consequently to develop mitigation strategies, include a Privacy Impact Assessment (PIA) and other novel models such as the framework of contextual integrity.

16.4.3 Privacy Impact Assessment (PIA)

A PIA can be defined as “a systematic process that identifies and evaluates, from the perspectives of all stakeholders, the potential effects on privacy of a project, initiative or proposed system or scheme, and includes a search for ways to avoid or mitigate negative privacy impacts” [29, 30]. The PIA tool, originally linked to technology and impact assessments [28, p. 125], is effectively a “risk management” technique that involves addressing both positive and negative impacts of a project or proposal, but with a greater focus on the latter [67, pp. 4–5].

PIAs were established and developed from 1995 to 2005, and possess a number of distinct qualities: a PIA is focused on a particular initiative, takes a forward-looking and preventative as opposed to retrospective approach, broadly considers the various aspects of privacy (i.e., privacy of the person, personal behavior, personal communication, and personal data), and is inclusive in that it accounts for the interests of relevant entities [28, pp. 124–125]. In the Australian context, the development of PIAs can be observed in the work of Clarke [30], who provides an account of PIA maturity pre-2000, post-2000, and as at 2010.

16.4.4 Framework of Contextual Integrity

The framework of contextual integrity, introduced by [65], is an alternative approach that can be employed to assess whether LBS, as a socio-technical system, violates privacy and thus contextual integrity. An overview of the framework is provided in [65, p. 14]:

The central claim is that contextual integrity captures the meaning of privacy in relation to personal information; predicts people’s reactions to new technologies because it captures what we care about when we question, protest, and resist them; and finally, offers a way to carefully evaluate these disruptive technologies. In addition, the framework yields practical, step-by-step guidelines for evaluating systems in question, which it calls the CI Decision Heuristic and the Augmented CI Decision Heuristic.

According to Nissenbaum [65], the primary phases within the framework are: (1) explanation, which entails assessing a new system or practice in view of “context-relative informational norms” [65, p. 190], (2) evaluation, which involves “comparing altered flows in relation to those that were previously entrenched” [65, p. 190], and (3) prescription, a process based on evaluation, whereby if a system or practice is deemed “morally or politically problematic,” it has grounds for resistance, redesign or being discarded [65, p. 191]. Within these phases are distinct stages: establish the prevailing context, determine key actors, ascertain what attributes are affected, establish changes in principles of transmission, and raise a red flag if there are modifications to actors, attributes, or principles of transmission [65, pp. 149–150]. A schematic sketch of this final, red-flag stage is given below.
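The sketch below renders that red-flag stage schematically, under the assumption that an information flow can be summarized by its actors, attributes, and transmission principle. The data structure and field names are hypothetical; the code is not Nissenbaum’s CI Decision Heuristic, which is considerably richer and also covers evaluation and prescription.

```python
# Schematic sketch of the "red flag" stage of the contextual integrity
# framework. Field names and example values are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class InformationFlow:
    sender: str                  # actor transmitting the information
    recipient: str               # actor receiving the information
    subject: str                 # actor the information is about
    attributes: frozenset        # types of information transmitted
    transmission_principle: str  # constraint under which the flow occurs

def red_flag(entrenched: InformationFlow, proposed: InformationFlow) -> list:
    """Return the elements of a proposed flow that depart from the
    entrenched, context-relative informational norm."""
    changes = []
    if (proposed.sender, proposed.recipient, proposed.subject) != \
            (entrenched.sender, entrenched.recipient, entrenched.subject):
        changes.append("actors")
    if proposed.attributes != entrenched.attributes:
        changes.append("attributes")
    if proposed.transmission_principle != entrenched.transmission_principle:
        changes.append("transmission principle")
    return changes  # a non-empty list flags the practice for evaluation

# Example: an LBS navigation app that begins sharing location with advertisers.
norm = InformationFlow("user device", "navigation provider", "user",
                       frozenset({"location"}), "with consent, for routing only")
new = InformationFlow("user device", "advertising network", "user",
                      frozenset({"location", "movement history"}),
                      "ongoing, for profiling")
print(red_flag(norm, new))  # ['actors', 'attributes', 'transmission principle']
```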

The framework of contextual integrity and, similarly, PIAs are relevant to this study, and may be considered as valid tools for assessing the privacy-invasive or violating nature of LBS and justifying the need for some form of regulation. This is particularly pertinent as LBS present unique privacy challenges, given their reliance on knowing the location of the target. That is, the difficulty in maintaining location privacy is amplified due to the fact that m-commerce services and mobility in general, by nature, imply knowledge of the user’s location and preferences [40, p. 463]. Therefore, it is likely that there will always be a trade-off ranging in severity. Namely, one end of the privacy continuum will demand that stringent privacy mechanisms be implemented, while the opposing end will support and justify increased surveillance practices.

16.5 Challenges

16.5.1 Relationship Between Privacy, Security, Control and Trust

A common thread in discussions relating to the privacy and security implications of LBS throughout this chapter has been the interrelatedness of themes; notably, the manner in which a particular consideration is often at odds with other concerns. The trade-off between privacy/freedom and safety/security is a particularly prevalent exchange that must be considered in the use of many ICTs [36, p. 47]. In the case of LBS, it has been observed that the need for safety and security conflicts with privacy concerns, potentially resulting in contradictory outcomes depending on the nature of implementation. For example, while LBS facilitate security and timely assistance in emergency situations, they simultaneously have the potential to threaten privacy based on the ability for LBS to be employed in tracking and profiling situations [18, p. 105]. According to Casal [18, p. 109], the conflict between privacy and security, and the lack of adequate regulatory frameworks, has a flow-on effect in that trust in ICTs is diminished. Trust is also affected in the family context, where tracking or monitoring activities result in a lack of privacy between family members [59, p. 436]. The underlying question, according to Mayer [59, p. 435], relates to the power struggle between those seeking privacy and those seeking information:

What will be the impact within families as new technologies shift the balance of power between those looking for privacy and those seeking surveillance and information?

Mayer’s [59] question alludes to the relevance of the theme of control, in that surveillance can be perceived as a form of control and influence. Therefore, it can be observed that inextricable linkages exist between several themes presented or alluded to throughout this chapter; notably privacy and security, but also the themes of control and trust. In summary, privacy protection requires security to be maintained, which in turn results in enhanced levels of control, leading to decreased levels of trust, which is a supplement to privacy [70, pp. 13–14]. The interrelatedness of themes is illustrated in Fig. 16.1.

Fig. 16.1: Relationship between control, trust, privacy, and security, after [70, p. 14]

It is thus evident that the idea of balance resurfaces, with the requirement to weigh multiple and competing themes and interests. This notion is not new with respect to location monitoring and tracking. For instance, Mayer [59, p. 437] notes, in the child tracking context, that there is the requirement to resolve numerous questions and challenges in a legal or regulatory sense, noting that “[t]he key is balancing one person’s need for privacy with another person’s need to know, but who will define this balancing point?” Issues of age, consent, and reciprocal monitoring are also significant. Existing studies on location disclosure amongst social relations afford the foundations for exploring the social and ethical challenges for LBS, whilst simultaneously appreciating technical considerations or factors. Refer to [5, 16, 32, 42, 43, 47, 62, 84, 87].

16.6 Conclusion

This chapter has provided an examination of privacy and security with respect to location-based services. There is a pressing need to ensure LBS privacy threats are not dismissed from a regulatory perspective. Dismissing them would introduce genuine dangers, such as psychological, social, cultural, scientific, economic, political, and democratic harm; dangers associated with profiling; increased visibility; publicly damaging revelations; and oppression [31]. Additionally, the privacy considerations unique to the “locational or mobile dimension” require educating the general public regarding disclosure and increased transparency on the part of providers in relation to the collection and use of location information [11, p. 15]. Thus, in response to the privacy challenges associated with LBS, and based on current scholarship, this research recognizes the need for technological solutions, in addition to commitment and adequate assessment or consideration at the social and regulatory levels. Specifically, the privacy debate involves contemplation of privacy policies and regulatory frameworks, in addition to technical approaches such as obfuscation and maintaining anonymity [37, p. 7]. That is, privacy-related technical solutions must also be allied with supportive public policy and socially acceptable regulatory structures.

For additional readings relevant to LBS and privacy, which include an adequate list of general references for further investigation, refer to: [17], on the privacy challenges arising from privacy-invasive geo-mash-ups, the inadequacy of information privacy laws, and potential solutions in the form of technological measures, social standards, and legal frameworks; [12], a report submitted to the Office of the Privacy Commissioner of Canada, focused on mobile surveillance, its privacy dangers, and legal consequences; and [57], a report to the Canadian Privacy Commissioner dealing with complementary issues associated with mobility, location technologies, and privacy.

Based on the literature presented throughout this chapter, a valid starting point in determining the privacy-invasive nature of specific LBS applications is to review and employ the available solution(s). These solutions or techniques are summarized in Table 16.1, in terms of the merits and limitations of each approach and the extent to which they offer means of overcoming or mitigating privacy-related risks. The selection of a particular technique is dependent on the context or situation in question. Once the risks are identified, it is then possible to develop and select an appropriate mitigation strategy to reduce or prevent the negative implications of utilizing certain LBS applications. This chapter is intended to provide a review of scholarship in relation to LBS privacy and security, and should be used as the basis for future research into the LBS privacy dilemma and the related regulatory debate.

Table 16.1 Summary of solutions and techniques

Solution/Technique | Merits | Limitations

Technological mechanisms

Merits:
• Provide location obfuscation and anonymity in required situations
• Myriad of solutions available depending on the level of privacy required
• In-built mechanisms requiring limited user involvement
• Unlike regulatory solutions, technological solutions encourage industry development

Limitations:
• Result in degradation of location quality/resolution

Regulatory mechanisms

Merits:
• Variety of techniques available, such as industry self-regulation and government legislation
• Can offer legal protection to individuals in defined situations/scenarios

Limitations:
• Can be limiting in terms of advancement of the LBS industry

Impact assessments, contextual frameworks, and internal policies

Merits:
• Provide a proactive approach to identifying privacy (and related) risks
• Used to develop suitable mitigation strategies
• Preventative and inclusive in nature

Limitations:
• Tend to be skewed in focus, concentrating primarily on negative implications
• Can be limiting in terms of advancement of the LBS industry

References

1. Abbas R, Michael K, Michael MG, Aloudat A (2011) Emerging forms of covert surveillance using GPS-enabled devices. J Cases Inf Technol (JCIT) 13(2):19–33

2. Aloudat A (2012) Privacy vs security in national emergencies. IEEE Technol Soc Mag Spring 2012:50–55

3. ALRC (2008) For your information: Australian privacy law and practice (ALRC Report 108). http://www.alrc.gov.au/publications/report-108. Accessed 12 Jan 2012

4. Andrejevic M (2007) iSpy: surveillance and power in the interactive era. University Press of Kansas, Lawrence

5. Anthony D, Kotz D, Henderson T (2007) Privacy in location-aware computing environments. Pervas Comput 6(4):64–72

6. Applewhite A (2002) What knows where you are? Personal safety in the early days of wireless. Pervas Comput 3(12):4–8

7. Attorney General’s Department (2012) Telecommunications interception and surveillance. http://www.ag.gov.au/Telecommunicationsinterceptionandsurveillance/Pages/default.aspx. Accessed 20 Jan 2012

8. Awad NF, Krishnan MS (2006) The personalization privacy paradox: an empirical evaluation of information transparency and the willingness to be profiled online for personalization. MIS Q 30(1):13–28

9. Ayres G, Mehmood R (2010) Locpris: a security and privacy preserving location based services development framework. In: Setchi R, Jordanov I, Howlett R, Jain L (eds) Knowledge-based and intelligent information and engineering systems, vol 6279, pp 566–575

10. Bauer HH, Barnes SJ, Reichardt T, Neumann MM (2005) Driving the consumer acceptance of mobile marketing: a theoretical framework and empirical study. J Electron Commer Res 6(3):181–192

11. Bennett CJ (2006) The mobility of surveillance: challenges for the theory and practice of privacy protection. In: Paper prepared for the 2006 Meeting of the international communications association, Dresden Germany, June 2006, pp 1–20.

12. Bennett CJ, Crowe L (2005) Location-based services and the surveillance of mobility: an analysis of privacy risks in Canada. A report to the Office of the Privacy Commissioner of Canada, under the 2004–05 Contributions Program, June 2005. http://www.colinbennett.ca/recent-publications/reports-2/

13. Bennett CJ, Grant R (1999) Introduction. In: Bennett CJ, Grant R (eds) Visions of privacy: policy choices for the digital age. University of Toronto Press, Toronto, pp 3–16.

14. Beresford AR, Stajano F (2004) Mix zones: user privacy in location-aware services. In: Proceedings of the Second IEEE Annual conference on pervasive computing and communications workshops (PERCOMW’04) pp 127–131.

15. Brickhouse Security (2012) Lok8u GPS Child Locator. http://www.brickhousesecurity.com/child-locator.html. Accessed 9 Feb 2012

16. Brown B, Taylor AS, Izadi S, Sellen A, Kaye J, Eardley R (2007) Locating family values: a field trial of the whereabouts clock. In: UbiComp ‘07 Proceedings of the 9th international conference on Ubiquitous computing, pp 354–371.

17. Burdon M (2010) Privacy invasive geo-mashups: Privacy 2.0 and the limits of first generation information privacy laws. Univ Illinois J Law Technol Policy (1):1–50

18. Casal CR (2004) Impact of location-aware services on the privacy/security balance. Info: J Policy Regul Strategy Telecommun Inf Media 6(2):105–111

19. Chen JV, Ross W, Huang SF (2008) Privacy, trust, and justice considerations for location-based mobile telecommunication services. Info 10(4):30–45

20. Cho G (2005) Geographic information science: mastering the legal issues. Wiley, Hoboken.

21. Clarke R (1997) Introduction to dataveillance and information privacy, and definitions of terms. http://www.anu.edu.au/people/Roger.Clarke/DV/Intro.html

22. Clarke R (1999) Relevant characteristics of person-location and person-tracking technologies. http://www.rogerclarke.com/DV/PLTApp.html

23. Clarke R (2001a) Introducing PITs and PETs: technologies affecting privacy. http://www.rogerclarke.com/DV/PITsPETs.html

24. Clarke R (2001) Person location and person tracking—technologies, risks and policy implications. Inf Technol People 14(2):206–231

25. Clarke R (2003b) Privacy on the move: the impacts of mobile technologies on consumers and citizens. http://www.anu.edu.au/people/Roger.Clarke/DV/MPrivacy.html

26. Clarke R (2006) What’s ‘Privacy’? http://www.rogerclarke.com/DV/Privacy.html

27. Clarke R (2007a) Chapter 3. What ‘Uberveillance’ is and what to do about it. In: Michael K, Michael MG (eds) The Second workshop on the social implications of national security (from Dataveillance to Uberveillance and the Realpolitik of the Transparent Society). University of Wollongong, IP Location-Based Services Research Program (Faculty of Informatics) and Centre for Transnational Crime Prevention (Faculty of Law), Wollongong, Australia, pp 27–46

28. Clarke R (2009) Privacy impact assessment: its origins and development. Comput Law Secur Rev 25(2):123–135

29. Clarke R (2010a) An evaluation of privacy impact assessment guidance documents. http://www.rogerclarke.com/DV/PIAG-Eval.html

30. Clarke R (2010b) Pias in Australia—A work-in-progress report. http://www.rogerclarke.com/DV/PIAsAust-11.html

31. Clarke R, Wigan M (2011) You are where you’ve been: the privacy implications of location and tracking technologies. http://www.rogerclarke.com/DV/YAWYB-CWP.html

32. Consolvo S, Smith IE, Matthews T, LaMarca A, Tabert J, Powledge P (2005) Location disclosure to social relations: why, when, & what people want to share. In: CHI 2005(April), pp 2–7, Portland, Oregon, USA, pp. 81–90

33. Culnan MJ, Bies RJ (2003) Consumer privacy: balancing economic and justice considerations. J Soc Issues 59(2):323–342

34. Damiani ML, Bertino E, Perlasca P (2007) Data security in location-aware applications: an approach based on RBAC. Int J Inf Comput Secur 1(1/2):5–38

35. Davis DW, Silver BD (2004) Civil liberties vs. security: public opinion in the context of the terrorist attacks on America. Am J Polit Sci 48(1):28–46

36. Dobson JE, Fisher PF (2003) Geoslavery. IEEE Technol Soc Mag 22(1):47–52

37. Duckham M (2008) Location privacy protection through spatial information hiding. http://www.privacy.vic.gov.au/privacy/web2.nsf/files/20th-meeting-16-july-2008-duckham-presentation/$file/pvn_07_08_duckham.pdf

38. Duckham M (2010) Moving forward: location privacy and location awareness. In: SPRINGL’10 November 2, 2010, San Jose, CA, USA, pp 1–3

39. Duckham M, Kulik L (2006) Chapter 3. Location privacy and location-aware computing. In: Drummond J, Billen R, Forrest D, Joao E (eds) Dynamic and mobile GIS: investigating change in space and time. CRC Press, Boca Raton, pp 120. http://www.geosensor.net/papers/duckham06.IGIS.pdf

40. Elliot G, Phillips N (2004) Mobile commerce and wireless computing systems. Pearson Education Limited, Great Britain 532 pp

41. FIDIS 2007, D11.5: The legal framework for location-based services in Europe. http://www.fidis.net/

42. Fusco SJ, Michael K, Aloudat A, Abbas R (2011) Monitoring people using location-based social networking and its negative impact on trust: an exploratory contextual analysis of five types of “Friend” Relationships. In: IEEE symposium on technology and society (ISTAS11), Illinois, Chicago, IEEE 2011

43. Fusco SJ, Michael K, Michael MG, Abbas R (2010) Exploring the social implications of location based social networking: an inquiry into the perceived positive and negative impacts of using LBSN between friends. In: 9th international conference on mobile business (ICMB2010), Athens, Greece, IEEE, pp 230–237

44. Giaglis GM, Kourouthanassis P, Tsamakos A (2003) Chapter IV. Towards a classification framework for mobile location-based services. In: Mennecke BE, Strader TJ (eds) Mobile commerce: technology, theory and applications. Idea Group Publishing, Hershey, US, pp 67–85

45. Gould JB (2002) Playing with fire: the civil liberties implications of September 11th. In: Public Administration Review, 62 (Special Issue: Democratic Governance in the Aftermath of September 11, 2001), pp 74–79

46. Gruteser M, Grunwald D (2003) Anonymous usage of location-based services through spatial and temporal cloaking. In: ACM/USENIX international conference on mobile systems, applications and services (MobiSys), pp 31–42

47. Iqbal MU, Lim S (2007) Chapter 16. Privacy implications of automated GPS tracking and profiling. In: Michael K, Michael MG (eds) From Dataveillance to Überveillance and the Realpolitik of the Transparent Society (Workshop on the Social Implications of National Security, 2007) University of Wollongong, IP Location-Based Services Research Program (Faculty of Informatics) and Centre for Transnational Crime Prevention (Faculty of Law), Wollongong, pp 225–240

48. Jorns O, Quirchmayr G (2010) Trust and privacy in location-based services. Elektrotechnik & Informationstechnik 127(5):151–155

49. Junglas I, Spitzmüller C (2005) A research model for studying privacy concerns pertaining to location-based services. In: Proceedings of the 38th Hawaii international conference on system sciences, pp 1–10

50. Kaasinen E (2003) User acceptance of location-aware mobile guides based on seven field studies. Behav Inf Technol 24(1):37–49

51. Kaupins G, Minch R (2005) Legal and ethical implications of employee location monitoring. In: Proceedings of the 38th Hawaii international conference on system sciences, pp 1–10

52. Krumm J (2008) A survey of computational location privacy. Pers Ubiquit Comput 13(6):391–399

53. Küpper A, Treu G (2010) Next generation location-based services: merging positioning and web 2.0. In: Yang LT, Waluyo AB, Ma J, Tan L, Srinivasan B (eds) Mobile intelligence. Wiley Inc, Hoboken, pp 213–236

54. Landau R, Werner S (2012) Ethical aspects of using GPS for tracking people with dementia: recommendations for practice. Int Psychogeriatr 24(3):358–366

55. Leppäniemi M, Karjaluoto H (2005) Factors influencing consumers’ willingness to accept mobile advertising: a conceptual model. Int. J Mobile Commun 3(3):197–213

56. Loc8tor Ltd. 2011 (2012), Loc8tor Plus. http://www.loc8tor.com/childcare/. Accessed 9 Feb 2012.

57. Lyon D, Marmura S, Peroff P (2005) Location technologies: mobility, surveillance and privacy (a report to the Office of the Privacy Commissioner of Canada under the Contributions Program). The Surveillance Project, Queen's University, Canada. www.sscqueens.org/sites/default/files/loctech.pdf

58. Mason RO (1986) Four ethical challenges in the information age. MIS Q 10(1):4–12

59. Mayer RN (2003) Technology, families, and privacy: can we know too much about our loved ones? J Consum Policy 26:419–439

60. Michael K, Masters A (2006) The advancement of positioning technologies in defense intelligence. In: Abbass H, Essam D (eds) Applications of information systems to homeland security and defense. Idea Publishing Group, United States, pp 196–220

61. Michael K, McNamee A, Michael MG (2006) The emerging ethics of humancentric GPS tracking and monitoring. International conference on mobile business. IEEE Computer Society, Copenhagen, Denmark, pp 1–10

62. Michael K, McNamee A, Michael MG, Tootell H (2006) Location-based intelligence—modeling behavior in humans using GPS. IEEE international symposium on technology and society. IEEE, New York, United States, pp 1–8

63. Michael K, Clarke R (2012) Location privacy under dire threat as Uberveillance stalks the streets. In: Precedent (Focus on Privacy/FOI), vol 108, pp 1–8 (online version) & 24–29 (original article). http://works.bepress.com/kmichael/245/

64. Neltronics 2012 (2012) Fleetfinder Pt2 Personal Tracker. http://www.fleetminder.com.au/gps-systems/fleetfinder+PT2. Accessed 9 Feb 2012

65. Nissenbaum H (2010) Privacy in context: technology, policy, and the integrity of social life. Stanford Law Books, Stanford 288 pp

66. O’Connor PJ, Godar SH (2003) Chapter XIII. We know where you are: the ethics of LBS advertising. In: Mennecke BE, Strader TJ (eds) Mobile commerce: technology, theory and applications. Idea Group Publishing, Hershey, pp 245–261

67. Office of the Victorian Privacy Commissioner 2009 (2010) Privacy impact assessments: a single guide for the Victorian public sector. www.privacy.vic.gov.au/privacy/web.nsf/content/guidelines. Accessed 3 March 2010

68. Patel DP (2004) Should teenagers get Lojacked against their will? An argument for the ratification of the United Nations convention on the rights of the child. Howard L J 47(2):429–470

69. Perusco L, Michael K (2005) Humancentric applications of precise location based services. IEEE international conference on e-business engineering. IEEE Computer Society, Beijing, China, pp 409–418

70. Perusco L, Michael K (2007) Control, trust, privacy, and security: evaluating location-based services. IEEE Technol Soc Mag 26(1):4–16

71. Perusco L, Michael K, Michael MG (2006) Location-based services and the privacy-security dichotomy. In: Proceedings of the 3rd international conference on mobile computing and ubiquitous networking, London, UK. Information Processing Society of Japan, pp. 91–98

72. Privacy International 2007, Overview of Privacy. www.privacyinternational.org/article.shtml?cmd[347]=x-347-559062. Accessed 3 Dec 2009

73. Quinn MJ (2006) Ethics for the information age, 2nd edn. Pearson/Addison-Wesley, Boston 484 pp

74. Raab CD (1999) Chapter 3. From balancing to steering: new directions for data protection. In: Bennett CJ, Grant R (eds) Visions of privacy: policy choices for the digital age. University of Toronto Press, Toronto, pp 68–93

75. Raper J, Gartner G, Karimi HA, Rizos C (2007) Applications of location-based services: a selected review. J Locat Based Serv 1(2):89–111

76. Richards NM, Solove DJ (2007) Privacy’s other path: recovering the law of confidentiality. Georgetown Law J 96:123–182

77. Schreiner K (2007) Where We At? Mobile phones bring GPS to the masses. IEEE Comput Graph Appl 2007:6–11

78. Sheng H, Fui-Hoon Nah F, Siau K (2008) An experimental study on ubiquitous commerce adoption: impact of personalization and privacy concerns. J Assoc Inf Syst 9(6):344–376

79. Smith GD (2006) Private eyes are watching you: with the implementation of the E-911 Mandate, Who will watch every move you make? Federal Commun Law J 58:705–726

80. Solove DJ (2006) A taxonomy of privacy. Univ Pennsylvania Law Rev 154(3):477–557

81. Solove DJ (2007) 'I've got nothing to hide' and other misunderstandings of privacy. San Diego Law Rev 44:745–772

82. Solove DJ (2008) Data mining and the security-liberty debate. Univ Chicago Law Rev 74:343–362

83. Steinfield C (2004) The development of location based services in mobile commerce. In: Preissl B, Bouwman H, Steinfield C (eds) E-life after the dot.com bust. www.msu.edu/~steinfie/elifelbschap.pdf, pp 1–15

84. Tang KO, Lin J, Hong J, Siewiorek DP, Sadeh N (2010) Rethinking location sharing: exploring the implications of social-driven vs. purpose-driven location sharing. In: UbiComp 2010, Sep 26–Sep 29, Copenhagen, Denmark, pp 1–10

85. Tatli EI, Stegemann D, Lucks S (2005) Security challenges of location-aware mobile business. In: The Second IEEE international workshop on mobile commerce and services, 2005. WMCS ‘05, pp 1–10

86. Tootell H (2007) The social impact of using automatic identification technologies and location-based services in national security. PhD Thesis, School of Information Systems and Technology, Informatics, University of Wollongong

87. Tsai JY, Kelley PG, Drielsma PH, Cranor LF, Hong J, Sadeh N (2009) Who’s Viewed You? the impact of feedback in a mobile location-sharing application. In: CHI 2009, April 3–9, 2009, Boston, Massachusetts, USA, pp 1–10

88. Wang S, Min J, Yi BK (2008) Location based services for mobiles: technologies and standards (Presentation). In: IEEE ICC 2008, Beijing, pp 1–123

89. Warren S, Brandeis L (1890) The right to privacy. Harvard Law Rev 4:193–220

90. Westin AF (1967) Privacy and freedom. Atheneum, New York 487 pp

91. Westin AF (2003) Social and political dimensions of privacy. J Soc Issues 59(2):431–453

92. Wigan M, Clarke R (2006) Social impacts of transport surveillance. Prometheus 24(4):389–403

93. Wright T (2004) Security, privacy and anonymity. Crossroads 11:1–8

94. Xu H, Luo X, Carroll JM, Rosson MB (2011) The personalization privacy paradox: an exploratory study of decision making process for location-aware marketing. Decis Support Syst 51(2011):42–52

95. Xu H, Teo HH, Tan BYC, Agarwal R (2009) The role of push-pull technology in privacy calculus: the case of location-based services. J Manage Inf Syst 26(3):135–173

Citation: Abbas R., Michael K., Michael M.G. (2015) "Location-Based Privacy, Protection, Safety, and Security." In: Zeadally S., Badra M. (eds) Privacy in a Digital, Networked World. Computer Communications and Networks. Springer, Cham, DOI: https://doi.org/10.1007/978-3-319-08470-1_16