Factors affecting privacy disclosure on social network sites

Abstract

The self-disclosure of personal information by users on social network sites (SNSs) plays a vital role in the self-sustainability of online social networking platforms. However, people’s levels of privacy concern increase as a direct result of the unauthorized procurement and exploitation of personal information arising from the use of social networks, which in turn discourages users from disclosing their information or encourages them to submit fake information online. After a review of the Theory of Planned Behavior (TPB) and the privacy calculus model, an integrated model is proposed to explain privacy disclosure behaviors on social network sites. The aim of this paper is thus to identify the key factors affecting users’ self-disclosure of personal information. Drawing on privacy calculus, perceived benefit was incorporated into the Theory of Planned Behavior and, after some modifications, an integrated model was specified for the context of social network sites. The constructs of information sensitivity and perceived benefit were redefined after reviewing the literature. Through a study of the constructs of privacy concern and self-disclosure, this article aims to reduce levels of privacy concern while sustaining online transactions and further stimulating the development of social network sites.

1 Introduction

The continued spate of unauthorized procurement and exploitation of personal information from web sites has increased levels of privacy concern in the online community. This has had a significant negative impact on online business transactions and online social activities. Previous studies estimate that the declining user trust caused by privacy concerns has led to heavy losses in business-to-consumer online business. Social network sites have also come under criticism because of eroding privacy policies. For example, in 2006 the ability to post content on a friend’s Wall on Facebook was met with consumer backlash; some users even protested against the feature being implemented at all. In 2008, Facebook’s Beacon social advertising system received a great number of complaints, became the target of a class action lawsuit, and was ultimately shut down in September 2009. And in 2010, Facebook Places, which enabled location sharing with friends, came under fire for its potential use as a cyberstalking tool or for other less-than-savory purposes.

Given general privacy concerns, consumers are less inclined to submit information on social network sites, at times even providing fake personal information in fields such as “date of birth” or “post code”. The self-disclosure of personal information on social network sites, after all, can provide criminals with the opportunity to commit identity theft and a host of other cybercrimes. How service providers can continue to convince users to provide personal information online despite the ever-increasing risks is therefore a significant research issue for the sustainable development of social network sites.

Previous scholarly work has mainly focused on the connection between privacy concern and information disclosure. Dwyer et al. [10] compared Facebook and MySpace in relation to privacy concerns and trust. Acquisti and Gross [1] analyzed how privacy concerns affect people’s information disclosure on Facebook. However, these studies did not take into account the factors that actually encourage users to disclose personal information on social network sites. According to the Theory of Reasoned Action (TRA) and the Theory of Planned Behavior (TPB) [2], the adoption of a behavior must be related to some benefit; in this instance, the self-disclosure of personal information must be commensurately connected with some perceived benefit arising from that disclosure. The privacy calculus model, which describes the intention to self-disclose personal information as a risk-return exchange, has been drawing more and more attention [5, 22, 30]. Some scholars explain the public disclosure of private information using the constructs of privacy concern and perceived benefit [5, 36], although most of these studies are limited to online business transactions. Compared to online business sites, social network sites mainly request demographic information such as the user’s age and gender, and users are also encouraged to use their real name rather than a pseudonym. In addition, the benefits perceived on social network sites are not discounts or free services, but the formation of social capital and community attachment. This paper combines the Theory of Planned Behavior and privacy calculus to present a new holistic model explaining the self-disclosure of personal information on social network sites.

2 Literature review

2.1 The theory of planned behavior

Fig. 1. Theory of planned behavior [2]

Since the 1980s, theories that explain people’s behavior have become more and more refined. Theories such as Social Cognitive Theory (SCT), the Theory of Reasoned Action (TRA), and the Theory of Planned Behavior (TPB) explain people’s behavioral intentions. In TPB, for example, attitude, behavior control, and subjective norm are the main factors affecting behavioral intention, which in turn directly determines the adoption of a behavior (Fig. 1).

The Theory of Planned Behavior model has also been used to explain consumers’ online preferences. Lee (2009) added perceived risk and trust to TPB to study online transactions, with acceptable results. It is therefore reasonable to conclude that, combined with the particular characteristics of social networks, the TPB model can be used to explain the self-disclosure of personal information by users of online social network sites.

2.2 The theory of privacy calculus

Defining privacy is not straightforward, despite the vast number of studies on the topic. Given the diverse contexts in which privacy is described, a universally applicable definition of privacy does not exist. In the theory of privacy calculus, “privacy” and the self-disclosure of “private” information are viewed from an economic angle. Klopfer and Rubenstein [17], for instance, regarded privacy as an entitlement that can be exchanged for something of greater value. On that basis, many studies hold that a behavior is adopted only after a risk-return calculus has been performed [6, 22]. With respect to privacy, the benefit must exceed the risk for the motive of self-disclosure to hold [33].
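In its simplest form, and under the simplifying assumption (not made explicit in the literature cited above) that perceived benefit and perceived risk can be expressed on a common utility scale, the calculus reduces to a decision rule: the net value of disclosure is U = B − R, where B is the perceived benefit and R the perceived risk, and the intention to disclose arises only when U > 0, that is, when B > R.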

3 The proposed integrated model

Combining the TPB model with privacy calculus, an integrated model is proposed (Fig. 2). In this model, privacy disclosure is determined by privacy concern and perceived benefit, two constructs drawn from the theory of privacy calculus. The constructs of behavior control, subjective norm, and the factors underlying attitude were taken from the TPB model.

Fig. 2. Integrated model

3.1 Constructs and hypotheses in the model

3.1.1 Perceived benefit

As most of the relevant studies were set in the context of online business transactions, perceived benefit was regarded as discounts or free services. Yang and Wang [36] performed an experiment that used discounts as compensation for the submission of personal information. Phelps et al. [27] found that the voluntary disclosure of personal information by consumers translated into shorter shopping times and better purchasing recommendations. Xu et al. [35] regarded more personalized service as the perceived benefit. Generally, the perceived benefits mentioned above can be summarized as forms of personalized service.

In the context of online social network sites, the self-disclosure of personal information can also yield this kind of return. For example, renren.com, a China-based social networking platform, rates users by the amount of personal information they disclose on the site using a star rating system; the higher the rating, the more services the user enjoys. However, most users who engage with online social network sites do not expect such pragmatic benefits. Most users are motivated to disclose personal information in order to interact with online communities with which they share a common interest. To these users, connecting within and between social networks is, in essence, a form of social capital.

People wish to be accepted and valued by the social networks they are members of and this can be explained in two ways.

Participation in a social network rewards members’ organizational commitment. According to Forman et al. [12], people can achieve community attachment by enrolling in and taking part in a social network. Forman et al. [12] found that people tend to make their personal information public: (a) to make it easier to be identified; and (b) to gain an attachment to relevant communities. Khan and Shaikh [16] investigated why people add large numbers of strangers as “friends” on social network sites and postulated that it had to do with gaining popularity. In organizational behavior (OB) this is known as “the sense of belonging”. Salancik [29] argued that community attachment is the dependence people have on a certain organization, and the behavior prompted by the organization of which they are a member.

Social network sites provide people with a great deal of virtual social capital. The accumulation of social capital allows people to attain rich resources, such as information linkages, organizing and cooperative abilities, and access to a web of relationships. Granovetter [13] demonstrated the value of social networks through the theory of weak ties: having more relationships and connections gives people more resources to draw on when they need a helping hand.

The introduction of online social network sites has helped some people consolidate former relationships and made it possible to establish new circles of friends. The development of these platforms has driven down the cost of communication and sharply increased the number of weak ties one can maintain [9]. More advanced Internet technologies provide diverse methods for maintaining relationships online, such as feature-rich address lists and video conferencing. Social capital, however, can only be derived when participants in the network are willing to submit fundamental personal information; that is what motivates people to disclose who they are for authentication purposes.

The accumulation of social capital and the sense of community attachment are related. A social network site offering more social capital earns users’ loyalty and attachment, providing a certain level of “stickiness” in the form of quality return visits and participation; in turn, sites with abundant loyal users can offer even more social capital to individual users and to the membership at large.

Numerous articles examine the relationship between the benefit perceived by consumers and their online behavior. For example, shoppers will continue to disclose personal information if it means they will qualify for online discounts or promotions [8, 15]. Ellison et al. [3] found perceived benefits had a positive effect on the usage of social network sites. Based on this research, this paper assumes perceived benefits have a positive effect on the self-disclosure of personal information on social network sites.

Hypothesis 1: Perceived benefit will have a positive effect on the self-disclosure of personal information on social network sites.

3.1.2 Privacy concern

Privacy concern refers to users’ concern about threats to their privacy online. This construct reflects users’ response to the perceived possibility of a privacy leak and the expected loss induced by the abuse of their privacy. According to Paine et al. [24], privacy concern is not only a reaction to the security of privacy but also a motivator for users to take care of their personal information. Milne and Culnan [21] showed that high levels of privacy concern provide the impetus for users to read at least the introduction of online privacy policies. Users with high levels of privacy concern may also refuse to submit personal information to a social network site [31] or submit false information [14]. Privacy concern has thus become one of the most important factors in studying online privacy issues.

Hypothesis 2: Privacy concerns will have a negative effect on the self-disclosure of personal information on social network sites.

3.1.3 Privacy sensitivity

Privacy sensitivity represents people’s attitudes toward revealing differing levels of personal information during the online shopping experience. Phelps et al. [28] divided personal information into three categories: (1) demographic information; (2) lifestyle and shopping information; and (3) personal financial information. Malhotra et al. [20] found that people are willing to provide less sensitive information online, but usually decline participation when faced with requests for more sensitive information. To measure privacy sensitivity, Yang and Wang [36] used matched groups in their experiment: (1) users who were asked for demographic information only; and (2) users who were asked for both demographic information and personal financial information.

Yang and Wang’s categorization of personal information is not entirely suitable for social network sites, as extensive demographic data is not usually requested and personal financial information is relevant to an even lesser extent. Compared with online shopping sites, the only information that users of social network sites are asked to submit belongs to the demographic category. Thus it can be assumed that, even though the same request to submit “essential” personal information is made on a social network site, individual users may react in different ways. This is analogous to the way in which, under the Technology Acceptance Model (TAM), different users can perceive different degrees of usefulness in the same facility. In empirical studies it is not the attributes of a given technology that matter, but respondents’ feelings toward that technology that should be measured [32]. On social network sites, users can hold differing levels of sensitivity about their personal information, while the categories of information themselves are less relevant.

Hypothesis 3: Privacy sensitivity will have a positive effect on the level of privacy concern.

3.1.4 Privacy risk

Privacy risk refers to users’ expectation of losses associated with privacy disclosure online, caused by opportunistic behavior and the misuse of personal information. The greater the losses caused by the disclosure of personal information, the greater the risk users perceive.

Empirical studies of online transactions have shown that privacy risk has a negative effect on retail transactions [23, 25]. Many scholars have confirmed a relationship between privacy concern and privacy risk. Chellappa and Sin [4] found a positive relationship between privacy concern and privacy risk without explicitly addressing causation; in their model, the level of Internet users’ privacy concern was shown to have a positive effect on privacy risk. Xu et al. [34] and Dinev and Hart [8] showed that perceived risk has a positive effect on privacy concern. The Theory of Reasoned Action holds that expected outcomes govern attitudes toward a behavior, and that expected outcomes are affected by exogenous variables. We therefore suppose that the level of privacy concern derives from perceived risk, and that perceived risk is rooted in the attributes of web sites and the networked environment.

Hypothesis 4: Privacy risk will have a positive effect on privacy concern.

3.1.5 Information control

Information control refers to the capacity people have to control the information they release online. People’s perception of information control depends on the manner in which web sites collect, store, and utilize personal information. These factors can be reduced to four points: (1) the presence of a privacy policy on the site; (2) whether users know that information is being collected; (3) whether the personal information in question is submitted voluntarily or involuntarily; and (4) the openness of the organization about how the information will be used.

We postulate that if the privacy policy of a given web site makes users feel they can maintain their privacy, it will significantly decrease their level of privacy concern. In researching online shopping, Phelps et al. [28] found that the lack of control over personal information explained 42.5 % of the variation in people’s level of privacy concern. Dinev and Hart [8] argued that it is the control people perceive over their personal information that governs the extent of self-disclosure online.

Hypothesis 5: Information control will have a negative effect on the level of privacy concern.

3.1.6 Subjective norm

As a notion derived from psychology and sociology, subjective norm has been widely used in studies exploring the factors acting on people’s attitudes toward certain behaviors. Lehikoinen et al. [18] found that social culture had a significant effect on people’s information disclosure in social networks. Lewis et al. [19] found that students were more likely to take part in social network sites where their classmates had already gained membership. Xu et al. [34] demonstrated that subjective norm acts on the degree to which people regard privacy online. All of these studies show that subjective norm influences people’s privacy disclosure online.

In contrast to previous studies, the application of subjective norm to social network sites also includes the effects of other users. Dwyer et al. [10] argued that users who willingly disclose personal information link their behavior to the trust they have established with other users in the network. If, on a given social network site, everyone tended to share real personal information with one another, then that behavior would constitute a subjective norm on that site.

Hypothesis 6: The subjective norm will have a positive effect on privacy concerns.

3.2 Model evolution

The factors in our proposed integrated model have all been used in previous studies of online privacy behavior. We reviewed empirical studies of privacy disclosure on the Internet and abstracted the main factors used. Six constructs are thought to be related to the Theory of Planned Behavior model, and the relational paths between these constructs can be integrated into a comprehensive model on the basis of those earlier studies.

Table 1 Empirical studies about the self-disclosure of personal information online

Table 1 lists highly cited articles on privacy disclosure online and the main factors mentioned in them. Thirteen articles reporting empirical studies of online privacy information disclosure are presented, and eight constructs are abstracted from those studies, six of which are used in our proposed integrated model. Trust refers to users’ sense of trust in a particular web site; trust regarding privacy online is influenced by user experience and familiarity with the site. Usually, but not always, how personal information (e.g., age and gender) is used by an online organization is set out in its privacy policy.

Table 1 is presented to show the theoretical basis underpinning the integrated model proposed in this paper. The constructs used in the model have all been taken from previous studies, and each hypothesis in the model has been supported previously in settings such as online business. The theoretical contribution of this integrated approach lies in combining these earlier conclusions and applying them to a new topic: the self-disclosure of personal information on social network sites.

Though in the original TPB model behavior control, subjective norm, and attitude are the key factors affecting behavioral intention, previous studies suggest several other factors that may affect privacy concern. In the following section, an exploratory factor analysis is conducted to choose the appropriate constructs.

Ten graduate students who were long term users of renren.com were interviewed after completing a structured pilot questionnaire. Based on the feedback of these students, questions were adapted accordingly.

4 Methodology

The data presented in this paper were collected at Nanjing University using self-administered questionnaires. The sample was drawn from senior undergraduate and graduate students; other age groups were not considered, as most users of renren.com are young people. The questionnaires were distributed by hand at the main cafeteria and auditorium of Nanjing University from October 10 to November 10, 2009. Of the 200 questionnaire responses received, 171 were valid.

Table 2 Exploratory factor analysis of constructs of privacy concern

A review of the previous literature identified several constructs that may have an impact on privacy concern: Information Control, Subjective Norm, Privacy Risk, Privacy Sensitivity, Trust, and Privacy Policy. Xu et al. [34] created a model to explain people’s privacy concern using the first four of these variables. This paper takes the same approach but differs in that it provides empirical evidence in its analysis; it is the first time that questionnaire data have been used to support this type of modeling approach.

A principal component analysis of all the named constructs showed that three components remained after extraction (Table 2). The factor loadings of the indicators of Subjective Norm, Information Control, and Privacy Risk on their corresponding components all exceeded 0.70. The indicators of Privacy Policy, Trust, and Information Control loaded highly on the same factor, suggesting a high correlation among the three constructs; most of their variance was shared and could therefore be represented by a single construct.

The indicator of information sensitivity in this study differs from previous research, as the data that social network sites request of users is usually limited to demographic details. The measurement of information sensitivity therefore relies mainly on users’ perception, and a new indicator was created in this research to meet this need.

Table 3 shows the results of a principal component analysis of Subjective Norm, Information Control, Privacy Risk, and Privacy Sensitivity. All the indicators loaded on their corresponding constructs, and no significant multicollinearity was detected. In the integrated model, Privacy Concern is therefore explained by four variables: Subjective Norm, Information Control, Privacy Risk, and Privacy Sensitivity, as depicted in Tables 3 and 4.
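As an illustration only, a principal component analysis of this kind could be reproduced along the following lines in Python. The item file, the column names, and the unrotated four-component solution are assumptions for the sketch, not the authors’ actual data or code (the original analysis was run in SPSS and may have used a rotated solution).

import numpy as np
import pandas as pd
from sklearn.decomposition import PCA

# 'questionnaire_items.csv' is a hypothetical file holding the 7-point item responses,
# one column per indicator (e.g. SN1..SN4, IC1..IC4, PR1..PR4, PS1).
items = pd.read_csv("questionnaire_items.csv")

# Standardize the items so that loadings are comparable across indicators.
z = (items - items.mean()) / items.std(ddof=0)

# Extract four components, matching the four retained constructs.
pca = PCA(n_components=4)
pca.fit(z)

# Component loadings = eigenvectors scaled by the square root of the eigenvalues.
loadings = pd.DataFrame(
    pca.components_.T * np.sqrt(pca.explained_variance_),
    index=items.columns,
    columns=["PC1", "PC2", "PC3", "PC4"],
)

# Indicators are conventionally assigned to the component on which they load most
# heavily; loadings above roughly 0.70 are treated as strong.
print(loadings.round(2))
print(pca.explained_variance_ratio_.round(3))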

As the questionnaire in this research was based on individual items used in previous research, the reliability and validity of the constructs had to be verified. Cronbach’s alpha was chosen to evaluate reliability, and the average variance extracted (AVE) was chosen to evaluate the convergent validity of the constructs. Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) were conducted using SPSS 16.0 and LISREL 8.70, from which Cronbach’s alpha and the AVE were derived (Table 4).

All of the constructs exceeded the critical Cronbach’s alpha level of 0.7, and the average variances extracted were all greater than 0.5, establishing the reliability and validity of the questionnaire.
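For readers who wish to check these criteria on their own data, both statistics can be computed directly from the standard formulas. The sketch below is illustrative; the function names and the example loadings are hypothetical, not values from this study.

import numpy as np
import pandas as pd

def cronbach_alpha(construct_items: pd.DataFrame) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the summed scale)
    k = construct_items.shape[1]
    item_variances = construct_items.var(axis=0, ddof=1)
    total_variance = construct_items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def average_variance_extracted(std_loadings) -> float:
    # AVE = mean of the squared standardized loadings of a construct's indicators
    std_loadings = np.asarray(std_loadings, dtype=float)
    return float(np.mean(std_loadings ** 2))

# Hypothetical standardized loadings for one construct (not values from this study):
print(average_variance_extracted([0.78, 0.81, 0.74, 0.69]))  # about 0.57, above the 0.5 cut-off
# A construct is usually deemed reliable when cronbach_alpha(...) > 0.7
# and convergently valid when average_variance_extracted(...) > 0.5.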

A structural equation model (SEM) was estimated using LISREL 8.70, which analyzed the relationships between variables based on the covariance matrix. The results of this analysis are shown in Fig. 3.

According to the results of the SEM, Hypotheses 1, 2, 4, and 5 were supported, while Hypotheses 3 and 6 did not pass significance testing. This means that the self-disclosure of personal information is determined by both the level of privacy concern and the level of perceived benefit. It was also found that privacy concern is indeed related to perceived risk and information control, but not to information sensitivity or subjective norm. The CFA fit indices, such as AGFI, CFI, and NNFI, were all over 0.90, which supported the construct validity of the model.
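The structural model itself was estimated in LISREL, but an equivalent specification can be sketched in lavaan-style syntax, for example with the open-source semopy package for Python. The indicator names, the file name, and the choice of package below are assumptions for illustration, not the authors’ setup.

import pandas as pd
import semopy

# Hypothetical indicator names; 'sns_survey.csv' stands in for the 171 valid responses.
df = pd.read_csv("sns_survey.csv")

model_desc = """
PrivacyConcern =~ PC1 + PC2 + PC3 + PC4 + PC5
PrivacyRisk =~ PR1 + PR2 + PR3 + PR4
InfoControl =~ IC1 + IC2 + IC3 + IC4
PerceivedBenefit =~ PB1 + PB2
Disclosure =~ ID1 + ID2
PrivacyConcern ~ PrivacyRisk + InfoControl
Disclosure ~ PrivacyConcern + PerceivedBenefit
"""

model = semopy.Model(model_desc)   # measurement part (=~) and structural part (~)
model.fit(df)
print(model.inspect())             # path coefficients, standard errors, p-values
print(semopy.calc_stats(model))    # fit indices (CFI, GFI/AGFI, TLI, RMSEA, ...)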

4.1 Mediator variable test

Table 5 Multiple hierarchical regression analysis

This section tests privacy concern as a mediator variable between information control, perceived risk, and privacy disclosure. The verification method used is multiple hierarchical regression analysis. Five equations were estimated to verify the mediating role of privacy concern, with perceived benefit treated as a control variable:

(1) privacy disclosure = β1 · perceived benefit + α

(2) privacy disclosure = β1 · perceived benefit + β2 · information control + β3 · privacy risk + α

(3) privacy disclosure = β1 · perceived benefit + β2 · information control + β3 · privacy risk + β4 · privacy concern + α

(4) privacy disclosure = β1 · perceived benefit + β2 · privacy concern + α

(5) privacy concern = β1 · information control + β2 · privacy risk + α

Table 5 presents the results of the regression analysis of the five equations in SPSS. Equation (1) shows that perceived benefit is significantly related to privacy disclosure. The results of Eq. (2) show that, after controlling for perceived benefit, information control and perceived risk remain significantly related to privacy disclosure. But after privacy concern is included in Eq. (3), information control and perceived risk are no longer related to privacy disclosure: the significance level (p-value) of information control rose from 0.049 to 0.871 and that of perceived risk rose from 0.029 to 0.791. As the relationship between privacy concern and privacy disclosure is confirmed in Eq. (4), and information control and perceived risk are shown to affect privacy concern in Eq. (5), this research demonstrates that privacy concern fully mediates the effects of information control and perceived risk on privacy disclosure.
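The five hierarchical regressions can be reproduced with ordinary least squares. The sketch below mirrors the equations above; the variable and file names are illustrative and assume that each construct has already been averaged into a single score per respondent (the original analysis was run in SPSS).

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file with one averaged score per construct for each respondent.
df = pd.read_csv("construct_scores.csv")

eq1 = smf.ols("disclosure ~ benefit", data=df).fit()
eq2 = smf.ols("disclosure ~ benefit + info_control + privacy_risk", data=df).fit()
eq3 = smf.ols("disclosure ~ benefit + info_control + privacy_risk + privacy_concern", data=df).fit()
eq4 = smf.ols("disclosure ~ benefit + privacy_concern", data=df).fit()
eq5 = smf.ols("privacy_concern ~ info_control + privacy_risk", data=df).fit()

# Full mediation is indicated when info_control and privacy_risk are significant in eq2
# but lose significance once privacy_concern enters in eq3, while privacy_concern stays
# significant in eq3 and eq4 and its antecedents are significant in eq5.
for name, m in [("eq1", eq1), ("eq2", eq2), ("eq3", eq3), ("eq4", eq4), ("eq5", eq5)]:
    print(name, m.pvalues.round(3).to_dict())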

5 Conclusion and contributions

5.1 Theoretical contribution

Following a review of previous studies, this article puts forward a model to explain people’s privacy disclosure on social network sites. It also modifies some constructs: rather than treating the expected outcomes and risks of a behavior as exogenous variables, the model focuses on variables endogenous to the adopter and on the perceived consequences of the behavior. Though each hypothesis in the integrated model had been proven previously, this is the first time these theoretical conclusions have been combined. In addition, this paper addresses an increasingly important issue: what factors affect the self-disclosure of personal information by users on social network sites. Using this modeling approach it is possible to identify the main factors affecting people’s privacy concern and privacy disclosure on social network sites. These indicators can aid the development of a sustainable Internet in which users of social network sites have fewer privacy concerns and are willing to contribute to a stable SNS environment.

Though the issue of privacy concern has been addressed before, this paper focuses on an emerging topic: the disclosure of private information on social network sites. Three theoretical innovations are proposed: (1) an integrated model; (2) the redefinition of sensitivity; and (3) the redefinition of benefits. First, the model proposed in this paper integrates the Theory of Planned Behavior and the privacy calculus model, which together interpret privacy disclosure online more comprehensively. Second, the perceived benefit in social networks differs from the profit offered by online retail stores, and defining it from the perspective of organizational behavior is more credible. Third, information sensitivity has usually been measured in experiments with control groups given different categories of information, which is not suitable for social network sites where only demographic information is requested; this paper instead attends to the perceived sensitivity people feel when asked to submit personal information to social network sites.

5.2 Practical contribution

According to the results of the data analysis, perceived risk and information control were shown to have significant effects on privacy concern, while the relationships between information sensitivity, subjective norm, and privacy concern did not pass significance testing. The structural equation modeling results showed that, compared with the unauthorized disclosure of personal information, the perceived risk caused by the invasion of privacy played a more powerful role in shaping users’ privacy concern. In practice, social network sites demand only basic demographic personal information; people can voluntarily share more detailed information through web blogs and online photo albums (such as on facebook.com and renren.com). For the time being, there is a gap between the personal information users must submit to a social network site in order to enroll and the effort those sites make to relieve privacy concerns through more robust privacy policies. In general, it is alleged that social network sites do not actively seek to protect the privacy of their users.

To change people’s attitudes toward the self-disclosure of personal information online, the entire Internet would need to become secure through a process of regulation. This requires, among other things, the cooperation of all the major traffic-generating web sites, such as search engines, micro-blogs, Internet banking sites, and online ticketing sites.

Compared to privacy concerns, perceived benefits were shown to have a stronger effect on the self-disclosure of personal information, with a t value of 11.39. This value illustrates the desire for community attachment and identification that motivates people to post personal news publicly online. This paper therefore suggests that social network sites create and deploy a greater number of online activities that entice users to participate, interact, and become engaged with one another, building even stronger relationships. This would not only increase the number of return visits and hits on the social network site, but also encourage users to share their real stories, building even greater social capital to be shared.

5.3 Limitations

There are a number of limitations in this study, each of which provides opportunities for further research. First, there are other factors that may affect people’s behavioral choices which are not contained in the integrated model presented here. For instance, the model does not include individual characteristics such as gender [11] and age [26], which have been shown to be related to privacy disclosure online.

Secondly, the sample used in this paper was collected among university students in China, and the results may not be universally applicable, as Chinese culture differs from that of other, more developed countries. It is nonetheless true that most users of social network sites are young adults; for example, by the end of 2009 about 70 % of users of Facebook.com were between 18 and 25 years old. Additionally, college students constitute a high proportion of those enrolled in online social networks, and their network behaviors are arguably representative.

Finally, this research differs from former investigations in that the relationships between information sensitivity, subjective norm, and privacy concern did not pass significance testing. This may be explained by the particular nature of social network sites; still, further verification is needed in the future.

Acknowledgements

The work was supported in part by the National Natural Science Foundation of China under Grant No. 70901039, 71171106 and 71101067, National Planning Office of Philosophy and Social Science under Grant No. 11&ZD169, NCET-11-0220 project, Jiangsu Planning Office of Social Science under Grant No. 11TQC010, Jiangsu University Philosophy and Social Science key project under Grant No. 2012ZDIXM036 and Nanjing University Innovation Team Support Project.

Appendix: Construct indicators

Constructs and measures (seven-point scales anchored with “strongly disagree” and “strongly agree”)

Privacy Concerns

  • It bothers me when renren.com asks me for this much personal information.

  • I am concerned that renren.com is collecting too much personal information about me.

  • I am concerned that unauthorized people may access my personal information.

  • I am concerned that renren.com may keep my personal information in an inaccurate manner.

  • I am concerned about submitting information to renren.com.

Privacy Risks

  • In general, it would be risky to give personal information to renren.com.

  • There would be high potential for privacy loss associated with giving personal information to renren.com.

  • Personal information could be inappropriately used by renren.com.

  • Providing renren.com with my personal information would involve many unexpected problems.

Privacy Control

  • I believe I have control over who can get access to my personal information collected by renren.com.

  • I think I have control over what personal information is released by renren.com.

  • I believe I have control over how personal information is used by renren.com.

  • I believe I can control my personal information provided to renren.com.

Subjective Norm

  • People who influence my behavior think that keeping personal information private is very important.

  • My friends believe I should care about my privacy.

  • People who are important to me think I should be careful when revealing personal information online.

Privacy Policy

  • I feel confident that privacy statements on social network websites reflect their commitment to protecting my personal information.

  • With the privacy statements on online social network sites, I believe that my personal information will be kept private and confidential.

  • I believe that the privacy statements of online social network sites are an effective way to demonstrate their commitments to privacy.

Trust

  • I feel that the privacy of my personal information is protected by renren.com.

  • I trust that renren.com will not use my personal information for any other purpose.

Perceived Benefit

  • Submitting personal information makes me easily identifiable by others.

  • Submitting personal information can help me to gain an attachment to relevant communities.

Privacy Sensitivity

  • Renren.com asks me for information that I think is sensitive.

Intention to Disclose

  • I am glad to submit personal information to renren.com.

  • I would submit personal information to renren.com.

References

1. Acquisti, A., & Gross, R. (2006). Imagined communities: awareness, information sharing, and privacy on the Facebook. Paper read at the 6th Workshop on Privacy Enhancing Technologies.

2. Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior & Human Decision Processes, 50(2), 179–211.

3. Boyd, D. M., & Ellison, N. B. (2007). Social network sites: definition, history, and scholarship. Journal of Computer-Mediated Communication, 13(1), 210–230.

4. Chellappa, R. K., & Sin, R. (2005). Personalization versus privacy: an empirical examination of the online consumer’s dilemma. Information Technology and Management, 6(2), 181–202.

5. Culnan, M. J. (2000). Protecting privacy online: is self-regulation working? Journal of Public Policy & Marketing, 19(1), 20–26.

6. Culnan, M. J., & Bies, J. R. (2003). Consumer privacy: balancing economic and justice considerations. Journal of Social Issues, 59(2), 323–342.

7. Culnan, M. J., & Armstrong, P. K. (1999). Information privacy concerns, procedural fairness, and impersonal trust: an empirical investigation. Organization Science, 10(1), 104–115.

8. Dinev, T., & Hart, P. (2004). Privacy concerns and Internet use—a model of trade-off factors. Working paper, Department of Information Technology and Operations Management, Florida Atlantic University.

9. Donath, J., & Boyd, B. (2004). Public displays of connection. BT Technology Journal, 22(4), 71–82.

10. Dwyer, C., Hiltz, S. R., & Passerini, K. (2007). Trust and privacy concern within social networking sites: a comparison of Facebook and MySpace. Paper read at the Thirteenth Americas Conference on Information Systems, Keystone, Colorado, 9–12 August 2007.

11. Fogel, J., & Nehmad, E. (2009). Internet social network communities: risk taking, trust, and privacy concerns. Computers in Human Behavior, 25, 153–160.

12. Forman, C., Ghose, A., & Wiesenfeld, B. (2008). Examining the relationship between reviews and sales: the role of reviewer identity disclosure in electronic markets. Information Systems Research, 19(3), 291–313.

13. Granovetter, M. S. (1983). The strength of weak ties: a network theory revisited. Sociological Theory, 1, 201–233.

14. Gross, R., & Acquisti, A. (2005). Information revelation and privacy in online social networks. Paper read at the 2005 ACM Workshop on Privacy in the Electronic Society.

15. Hui, K.-L., Teo, H. H., & Lee, S.-Y. T. (2007). The value of privacy assurance: an exploratory field experiment. Management Information Systems Quarterly, 31(1), 19–33.

16. Khan, J. I., & Shaikh, S. S. (2007). Computing in social networks with relationship algebra. Journal of Network and Computer Applications, 2008(31), 862–878.

17. Klopfer, P. H., & Rubenstein, D. L. (1977). The concept privacy and its biological basis. Journal of Social Issues, 33, 52–65.

18. Lehikoinen, J. T., Olsson, T., & Toivola, H. (2008). Privacy regulation in online social interaction. Paper read at ICT, Society and Human Beings 2008.

19. Lewis, K., Kaufman, J., & Christakis, N. (2008). The taste for privacy: an analysis of college student privacy settings in an online social network. Journal of Computer-Mediated Communication, 14, 79–100.

20. Malhotra, N. K., Kim, S. S., & Agarwal, J. (2004). Internet users’ information privacy concerns (IUIPC): the construct, the scale, and a causal model. Information Systems Research, 15(4), 336–355.

21. Milne, G. R., & Culnan, M. J. (2004). Strategies for reducing online privacy risks: why consumers read (or don’t read) online privacy notices. Journal of Interactive Marketing, 18(3), 15–29.

22. Milne, G. R., & Rohm, A. (2000). Consumer privacy and name removal across direct marketing channels: exploring opt-in and opt-out alternatives. Journal of Public Policy and Marketing, 19(2), 238–249.

23. Norberg, P. A., Horne, D. R., & Horne, D. A. (2007). The privacy paradox: personal information disclosure intentions versus behaviors. Journal of Consumer Affairs, 41(1), 100–126.

24. Paine, C., Reips, U.-D., Stieger, S., Joinson, A., & Buchanan, T. (2007). Internet users’ perceptions of ‘privacy concerns’ and ‘privacy actions’. International Journal of Human-Computer Studies, 65, 526–536.

25. Pavlou, P. A., & Gefen, D. (2004). Building effective online marketplaces with institution-based trust. Information Systems Research, 15(1), 37–59.

26. Pfeil, U., Arjan, R., & Zaphiris, P. (2009). Age differences in online social networking—a study of user profiles and the social capital divide among teenagers and older users in MySpace. Computers in Human Behavior, 25(3), 643–654.

27. Phelps, J., Nowak, G., & Ferrell, E. (2000). Privacy concerns and consumer willingness to provide personal information. Journal of Public Policy and Marketing, 19(1), 27–41.

28. Phelps, J. E., D’Souza, G., & Nowak, G. J. (2001). Antecedents and consequences of consumer privacy concerns: an empirical investigation. Journal of Interactive Marketing, 15(4), 2–17.

29. Salancik, G. R., & Pfeffer, J. (1977). An examination of need-satisfaction models of job attitudes. Administrative Science Quarterly, 9(22), 427–456.

30. Sheehan, K. B., & Hoy, G. M. (2000). Dimensions of privacy concern among online consumers. Journal of Public Policy and Marketing, 19(1), 62–73.

31. Sheehan, K. B., & Hoy, M. G. (1999). Flaming, complaining, abstaining: how online users respond to privacy concerns. Journal of Advertising, 28(3), 37–51.

32. Venkatesh, V., Morris, M. G., Gordon, B. D., & Davis, F. D. (2003). User acceptance of information technology: toward a unified view. Management Information Systems Quarterly, 27(3), 425–478.

33. Xu, H. (2009). Consumer responses to the introduction of privacy protection measures: an exploratory research framework. International Journal of E-Business Research, 5(2), 21–47. Special issue on the Protection of Privacy in E-Business.

34. Xu, H., Dinev, T., Smith, H. J., & Hart, P. (2008). Examining the formation of individual’s information privacy concerns: toward an integrative view. Paper read at the 29th Annual International Conference on Information Systems (ICIS), Paris, France.

35. Xu, H., Teo, H. H., Tan, B. C. Y., & Agarwal, R. (2009). The role of push-pull technology in privacy calculus: the case of location-based services. Journal of Management Information Systems, 24(3), 135–174.

36. Yang, S., & Wang, K. (2009). The influence of information sensitivity compensation on privacy concern and behavioral intention. The Data Base for Advances in Information Systems, 40(1), 38–51.

Keywords: Self-disclosure, Personal information, Perceived benefit, Theory of planned behavior, Privacy calculus

Citation: Xu, F., Michael, K., & Chen, X. (2013). Electronic Commerce Research, 13(2), 151–168. https://doi.org/10.1007/s10660-013-9111-6

Computing Ethics: No Limits to Watching?

Figure 1. Google Glass (opening art)

Little by little, the introduction of new body-worn technologies is transforming the way people interact with their environment and one another, and perhaps even with themselves. Social and environmental psychology studies of human-technology interaction pose as many questions as answers. We are learning as we go: "learning by doing" through interaction and "learning by being."9 Steve Mann calls this practice existential learning: wearers become photoborgs,3 a type of cyborg (cybernetic organism) whose primary intent is image capture from the domains of the natural and artificial.5 This approach elides the distinction between the technology and the human; they coalesce into one.

With each release greater numbers of on-board sensors can collect data about physiological characteristics, record real-time location coordinates, and use embedded cameras to "life-log" events 24x7. Such data, knowingly or unknowingly collected and bandwidth permitting, may be wirelessly sent to a private or public cloud and stored, often for public view and under a creative commons license.2 Embedded sensors on wearers can actively gather information about the world and capture details of a personal nature—ours and those of others too. These details can be minor, like embarrassing habits of less than admirable personal hygiene, or major, such as records of sexual peccadilloes or events relevant to court proceedings.

A third party might own the data gathered by these devices or the device itself. The Google Glass Terms of Service state: "...you may not resell, loan, transfer, or give your device to any other person. If you resell, loan, transfer, or give your device to any other person without Google's authorization, Google reserves the right to deactivate the device, and neither you nor the unauthorized person using the device will be entitled to any refund, product support, or product warranty."8 Personal information stored on the Internet for ease of access from anywhere at any time raises the possibility of unauthorized access. Most wearable sleep monitors indicate when you are awake, in light sleep, in deep sleep (REM), and calculate the level of efficiency reached between your rest and wake times.7 Monitors can tell adults how often they wake up during the night, the duration of sleep, time spent in bed, and times of awakening.6 Sleeping patterns convey personal details about individuals, such as insomnia or obsessive-compulsive disorder, sexual activity, workaholism, likely performance in stressful jobs, and other things.

Wearables can also look outward, reconstructing the world with location coordinates,11 current speed traveled and direction, rich high-resolution photographs, and audio capture. Wearers gather data about themselves but also heterogeneous data about fixed and mobile entities, including infrastructure, living things (such as people and animals) and non-living things (such as vehicles). This is not simply derivable information, such as the "point of interest nearest you is 'x' given your position on the Earth's surface," but can be interpreted as, "Johnny is traveling at 'x' miles per hour and is a little sluggish today on his bike ride compared to yesterday, perhaps because of his late night and his consumption of one glass too many of wine while at the nearby bar."

These devices can tell us about exceptions to everyday patterns of people in or out of our social networks. Consider the potential for government surveillance beyond the Call Detail Records that caused such controversy for the National Security Agency in 2013. Relentless data capture is uncompromising. Wearers concerned only about whether the device is working as intended might not consider the consequences of unauthorized data collection. They might feel they have purchased the device and are using it to their best ability. Will they consider feelings of fraternity with strangers who do not have access to the same technology? Or will they feel they have every right to put on their wearable device and adapt their body for convenience or personal needs such as maintaining personal security, reducing liability, and increasing life expectancy? Might wearers figure that any problems are the other persons' problems as long as the wearers believe they are not breaking the law? Whether the device is doing what it is supposed to do (for example, work properly), might occlude more meaningful questions of societal consequences from using such devices.

Bystanders are likely to be as oblivious to data collection from wearable devices as they are from data collection of private investigators using covert devices. Yet many people vehemently oppose being a subject of someone else's recording.1 The disappearing difference between covert and overt devices makes it possible for surveillance to become so ubiquitous that it is rendered "invisible." Anything right in front of us and ever present is in our "blind spot," hardly noticeable because we are enveloped by it like the fog. Novelty wears off over time, industrial/human factor design can help make things invisible to us, and we undergo conditioning. When surveillance cameras are everywhere, including on our heads and in our lapels, it is no longer surveillance. It is simply the human activity of "watching."

CCTV cameras are arguably invasive, but we do not protest their use even though people are aware of their presence. What happens when we open the floodgates to constant watching by tiny lifelogging devices? We open ourselves to not just Big Brother, but countless Little Brothers.15 Corporate or governmental compliance and transparency hide the fact that audiovisual collection of information will come at a cost. Multiple depictions of the same event can be stronger than a single view, and corruption can flourish even in a transparent environment. It can even be corruption on a grander scale. Crowdsourced sousveillance (watching from below)12 might take place for authenticity or verification, but police with the authority to subpoena data for use in a court of law as direct evidence can use the data to support their "point of view" (POV), irrespective of the fact that "point of eye" (POE) does not always capture the whole truth.a,13

The more data we generate about ourselves, our families, our peers, and even strangers, the greater the potential risk of harm to ourselves and each other. If we lose the ability to control images or statistics about personal or public behaviors how do we make the distinction between becoming a photoborg and becoming the subject matter of a photoborg? There is a stark asymmetry between those who use wearables and those that do not. There is much confusion over whether sousveillance12 is ethical or unethical. The possible perils from lifelogging devices that capture the world around them are only now getting attention.4 To what extent is it ethical to create the records of the lives of others without prior consent or cognizance? Maker or hacker communities—"prosumers" and "producers"—create personalized devices for their own consumption and become trailblazers for what is possible. But they do not speak for everyone. What is initially made to serve individual needs often is commercialized for mass consumption.

Data from others can generate a great deal of money. Consider the story of Henrietta Lacks, a poor black tobacco farmer whom scientists named "HeLa."14 According to Rebecca Skloot, HeLa's cells were "taken without her knowledge in 1951" and became a vital medical tool "for developing the polio vaccine, cloning, gene mapping, in vitro fertilization, and more."14 Until this year, when the family came to a mutually satisfactory arrangement with the NIH, HeLa cells were "bought and sold by the billions," without compensation or acknowledgment. Who profits from wearable devices? The company that owns the device or the data? The wearer? The person in the field of view? Historical evidence suggests it will likely be everyone else but the user wearing the device or the citizen in the field of view. Marcus Wigan and Roger Clarke suggest a private data commons as a potential way forward in this big data enterprise.16

Widespread diffusion and data manipulation can require more than an ordinary consumer decision about use and acceptance. Trust and adoption are key to societal conversations that will shape guidelines and regulations about what is and is not acceptable with respect to wearable computing. At what stage of the game are the "rules" to be determined and by whom?

New technologies can bring wonderful benefits, but also disputed, unintended, and sometimes hidden consequences. Technologies should aid and sustain humankind, but we cannot limit technologies to just positive applications. We should not claim benefits without admitting to the risks and costs. Wearable devices create moral and ethical challenges, especially if they are widely used. We must look beyond findings from previous studies of emerging technologies because new technologies often help create new socio-technological contexts. We cannot afford unreflective adoption. "Play" can have real repercussions. The novelty, fun, and "wow" factors wear off and we are left with the fallout. We must be vigilant in our new playhouse, and not negate the importance of moral or ethical standards alongside market values.

Philosophers have contemplated the question of technology and its impact on society. Martin Heidegger, Ivan Illich, Jacques Ellul, and those of the Frankfurt School, have argued that the worst outcome from technology gone wrong is dehumanization of the individual and the loss of dignity, resulting in a "standardized subject of brute self-preservation."10 A fundamental insight of such literature is that technology has not only to do with building: it is also a social process. Any social process resulting in unreflective adoption of technological marvels is profoundly deficient. More is not always better, and the latest is not always the greatest.

Charlie Chaplin's culturally significant film Modern Times (1936) shows the iconic Little Tramp caught up in the cogs of a giant machine. The unintended consequences of modern and efficient industrialization are clear. Chaplin's classic builds on Fritz Lang's futuristic film Metropolis (1926), which depicts a mechanized underground city in a dystopian society. Both films left indelible marks as prescient summaries of what was to follow. When technology becomes a final cause for its own sake, teleology and technology become confused. The old saw that "The person with the most toys wins," reflects this. What about the rest of us?

References

1. Abbas, R., Michael, K., Michael, M.G. and Aloudat, A. Emerging forms of covert surveillance using GPS-enabled devices. Journal of Cases on Information Technology 13, 2 (2011), 19–33.

2. Creative Commons: Attribution 2.0 Generic, n.d. http://creativecommons.org/licenses/by/2.0/.

3. Electrical and Computer Engineering, ECE1766 Final Course Project. Wearcam.org (1998) http://www.wearcam.org/students.htm.

4. ENISA. To log or not to log?: Risks and benefits of emerging life-logging applications. European Network and Information Security Agency. (2011) http://www.enisa.europa.eu/activities/risk-management/emerging-and-future-risk/deliverables/life-logging-risk-assessment/to-log-or-not-to-log-risks-and-benefits-of-emerging-life-logging-applications.

5. Gray, C. H. Cyborgs, Aufmerksamkeit and Aesthetik [transl. Cyborgs, Attention, & Aesthetics]. Kunstforum (Dec.–Jan. 1998); http://www.chrishablesgray.org/CyborgCitizen/kunst.html.

6. Henry, A. Best sleep tracking gadget or app? (2013); http://lifehacker.com/5992653/best-sleep-tracking-gadget-or-app.

7. Henry, A. Sleep time alarm clock for android watches your sleep cycles, wakes you gently (2012); http://lifehacker.com/5942519/sleep-time-alarm-clock-for-android-watches-your-sleep-cycles-wakes-you-gently.

8. Kravets, D. and Baldwin, R. Google is forbidding users from reselling, loaning glass eyewear. Wired: Gadget Lab (Apr. 17, 2013); http://www.wired.com/gadgetlab/2013/04/google-glass-resales/.

9. Mann, S. Learn by being: Thirty years of cyborg existemology. The International Handbook of Virtual Learning Environments (2006), 1571–1592.

10. Marcuse, H. Social implications of technology. Readings in the Philosophy of Technology 5, 71, D.M. Kaplan, Ed., 2009.

11. Michael, K. and Clarke, R. Location and tracking of mobile devices: Überveillance stalks the streets. Computer Law and Security Review 29, 2 (Feb. 2013), 216–228.

12. Michael, K. and Michael, M.G. Sousveillance and the social implications of point of view technologies in the law enforcement sector. In Proceedings of the 6th Workshop on the Social Implications of National Security. Sydney, NSW, Australia, 2012; http://works.bepress.com/kmichael/249.

13. Michael, K. and Miller, K.W. Big data: New opportunities and new challenges. IEEE Computer 46, 6 (2013), 22–24.

14. Skloot, R. The Immortal Life of Henrietta Lacks, Crown, New York, 2011; http://rebeccaskloot.com/the-immortal-life/.

15. Weil, J. Forget big brother. Little brother is the real threat. (Sept. 22, 2010); http://www.thenextgreatgeneration.com/2010/11/forget-big-brother-little-brother-is-the-real-threat/.

16. Wigan, M.R. and Clarke, R. Big data's big unintended consequences. Computer 46, 6 (June 2013), 46–53.

Footnotes

a. Hans Holbein's famous painting The Ambassadors (1533) with its patent reference to anamorphosis speaks volumes of the critical distinction between PoE and PoV. Take a look (http://www.nationalgallery.org.uk/paintings/hans-holbein-the-younger-the-ambassadors), if you are not already familiar with it. Can you see the skull? The secret lies in the perspective.

Acknowledgment

The authors thank Rachelle Hollander and John King for their observations and insightful comments that helped make this column more robust.

Citation: Katina Michael, MG Michael, 2013, "Computing Ethics: No Limits to Watching?" 
Communications of the ACM, Vol. 56 No. 11, Pages 26-28, DOI: 10.1145/2527187

Location-based services (LBS) regulatory framework in Australia

Abstract

Location-based services (LBS) are defined as those applications that combine the location of a mobile device associated with a given entity (individual or object) together with contextual information to offer a value-added service. LBS solutions are being deployed globally, and in some markets like Australia, without appropriate regulatory provisions in place. Recent debates in Australia have addressed the need to bridge the gap between technological developments and legal/regulatory provisions. This requires an assessment of the regulatory environment within a given social context such as Australia. The core components of such an investigation include: (a) composing a conceptual framework for analysing regulation of technologies such as LBS, one that is sensitive to public policy themes and challenges, and (b) applying this conceptual framework to the Australian setting in order to sketch and define the components of the present framework, and identify areas for improvement through a process of validation. This paper addresses these aims, demonstrating how the current regulatory framework in Australia is bound by legislation with respect to privacy, telecommunications, surveillance, and national security (that is, anti-terrorism), in addition to a set of industry guidelines for location-service providers (LSPs). The existing Australian framework, however, is lacking in its coverage and treatment of LBS and location data, and does not adequately address the themes and challenges in the defined conceptual framework.

1. Introduction

Measuring the need for LBS regulation and engaging in related dialogue requires an informed understanding of regulation and public policy in general, and of existing LBS regulatory practices and frameworks. One approach is to consider regulation in the context of government and governance (Braithwaite et al., 2007, p. 3):

Governments and governance are about providing, distributing, and regulating. Regulation can be conceived as that large subset of governance that is about steering the flow of events and behaviour, as opposed to providing and distributing.

That is, regulation is concerned with “the effects of actions, not on the actions or the means of the actions themselves” (Koops, 2006, p. 6). Various theories and approaches to regulation exist. According to the Australian Law Reform Commission (ALRC), regulatory theory (in relation to information privacy) may include principles-based and compliance- or outcomes-oriented methods (ALRC, 2008, pp. 234–40).

Public policy, on the other hand, can take on various definitions and may involve ambiguity (Bridgman and Davis, 2004, p. 3). In simple terms, public policy is “about what governments do, why, and with what consequences” (Fenna, 1998, p. 3). However, there are a variety of interpretations of the term, as summarised by Maddison and Denniss (2009, pp. 3–4) based on the work of numerous authors in the public policy sphere. Importantly, the authors state that regardless of interpretation, public policy can be viewed in one of two ways: either as “the result of authoritative choice” in which government ministers play a dominant role in decision-making, or as “the result of structured interaction” involving cooperation between players and appreciation of conflicting interests (Maddison and Denniss, 2009, p. 4). That is, regulation is a set of rules designed to govern the operation and intervention of stakeholders. This operation is often in a market setting and thus lends itself to economic analysis (Stigler, 1971). Stigler's work recognised the strong interactions of the regulated with a regulator in the implementation of regulation and its enforcement. This paper similarly argues that regulation and public policy-making processes in the technology realm rely on a process of collaboration and consultation amongst industry stakeholders. With respect to regulatory choices regarding LBS, interaction between government and industry stakeholders is necessary given that the delivery of a given solution is reliant on the involvement of a range of stakeholders such as wireless network operators and handset vendors.

For the purpose of this paper, it should be noted that regulation and public policy-related processes are complex practices that vary from one context to the next and evolve as new debates emerge whereby existing processes and regulatory mechanisms must be reassessed. This interaction is made more complex in the Australian Federal environment where the constitution determines that some aspects of LBS are legislated at a national level and some at a state level. This necessitates an appraisal of current State and Federal legislation relevant to LBS in a manner that allows the regulatory framework and existing measures to be sketched out, so that the outcomes can then be employed as the basis for future work. As such, this paper aims to develop a conceptual framework detailing how to examine LBS regulation, subsequently applying the framework to the Australian case. The outcome will be a sketch of the current LBS regulatory environment in Australia and the subsequent validation of the existing regime. An aspect of Australian law that assists this inquiry is the common approach taken by the States to their legislation. This common basis, together with a focus on Federal law, means that this paper can provide a preliminary sketch of the existing national framework.

Current literature and studies relating to the LBS regulatory environment note that suitable regulatory frameworks are essential to industry development, from the perspective of safeguarding the interests of multiple stakeholders, notably, providers and users, in addition to government entities and society as a whole. Such frameworks should ideally address the ethical dilemmas and social implications of LBS, whilst also being sensitive to the regulatory and public policy challenges associated with emerging technologies in general. Furthermore, and in light of the divergent uses of LBS, Dobson and Fischer (2003, p. 51) call for protective mechanisms that enable the “legitimate uses”, while preventing undesirable exploitation. Similarly, Smith (2006, p. 725) acknowledges the potential benefits, whilst also suggesting further legislation to safeguard personal location information. The significance of adequate regulatory provisions is two-fold. First, regulation encourages fairness and consistent rules for providers. Second, regulation functions to safeguard individuals thereby increasing their support and trust in LBS (Cuijpers and Koops, 2008, p. 881; FIDIS, 2007, p. 10).

Regardless of the potential benefits of LBS, authors such as Clarke and Wigan (2011) indicate that LBS “have far outstripped both public awareness and legal and policy attention”, a situation they claim is exceedingly risky. The consequences of a lack of regulation, specifically of tracking services and control over location histories by government, organisations and interested individuals, are great in terms of privacy in particular (Barreras and Mathur, 2007, p. 177). Cho (2005) claims that while concerned individuals are advocating regulation (p. 209), others are advancing the self-regulation movement (p. 253). Determining the most suitable response is indeed a challenge, one that requires the current regulatory environment and/or framework to be mapped out. However, it has been suggested that a single approach to regulation, such as legislation or self-regulation for instance, will fail to suffice. Xu et al. (2009, p. 163) agree that a single approach to regulating privacy in particular will not account for the interests of the diverse stakeholders comprising the LBS industry. Herbert (2006, p. 437), on the other hand, recommends an elementary reassessment of the manner in which emerging technologies, such as human tracking technologies, affect privacy as the basis for initiating a suitable legal response. In fact, the same sentiments apply for any regulatory issue associated with LBS. That is, a fundamental re-evaluation of the implications of LBS, in conjunction with an understanding of the regulatory and public policy challenges that apply, is indispensable.

The following section offers an overview of the significant themes and challenges pertaining to LBS regulation, thereby providing a conceptual framework for examining LBS regulation; Section 3 introduces the Australian framework by applying the conceptual framework drawn from Section 2; Section 4 summarises and validates the main components in the Australian framework, noting areas for future research; and Section 5 provides the concluding remarks for this paper.

2. Conceptual framework for analysing LBS regulation

It is essential that a conceptual framework for LBS regulation be built on a preliminary understanding of the regulatory and public policy challenges associated with emerging technologies such as LBS. It has been noted that regulatory challenges in the LBS domain stem from the mounting gap between technology deployment and the employment of appropriate safeguards, legal or otherwise, to govern various aspects of LBS. For instance, in relation to modern surveillance technologies, Marx (1999, p. 63) observes the increasing gap between technological potential and present measures designed to offer protection. This gap has long been attributed to a lack of response in social and political spheres (Clarke, 2001, p. 13). Relevant scholarship is generally focussed on the inability of law to reflect technological change, a perspective that Moses (2011, p. 787) feels requires adjustment, given the mutually shaping characteristics of law and technology and the belief that “[t]he law should not race ahead by anticipating technological trajectories that may never come to pass. Rather, a useful goal should be to have mechanisms in place to ensure that law is designed around the socio-technical landscape of the present or, more realistically, the recent past.”

Aside from the interaction between technology and law, the study of regulation, according to Svantesson (2011, pp. 243–245), often introduces researchers to a persistent set of themes which are centred on the claims that: (a) technological development will inevitably out-pace law-making processes, (b) legal professionals possess inadequate knowledge of technology, (c) globalisation and internationalisation necessitate consideration of multiple jurisdictions, and (d) the growth potential of technology has not been realised in domains such as e-commerce. While originally recommended for the internet regulation context, Svantesson's work is utilised as the basis for this framework in that it offers a clear summary of the themes relevant to all technological domains, including LBS. Importantly, Svantesson's work posits that a successful regulatory framework must be sensitive to numerous challenges specific to the regulation of emerging technologies. However, that sensitivity does not mean that regulation has to be technologically determinative. A well-designed principles-based regulatory regime can address each of these four issues.

Fig. 1 provides a summary of these challenges, which have been derived from the secondary literature sources cited in this paper, in addition to a summary of Svantesson's primary themes which directly impact on the challenges. The distinct challenges are discussed below which when combined form a conceptual framework upon which the existing Australian framework and regimes in other contexts can be validated.

Fig. 1. Conceptual framework for examining LBS regulation and the associated regulatory themes impacting on the framework.

2.1. The Australian policy and regulatory context

Regulation refers to the set of rules which apply to a specific environment. These rules might be prescriptive and enforced rigidly, or they may be set by agreement between the entities being regulated. A regulatory environment can be likened to the rules for playing a game. In a regulated industry, there may be legislation, subordinate legislation made under specific laws (confusingly, often called regulations) and a set of conventions, adopted by stakeholders, which form part of the rules. In the Australian context, the rules can be changed by: changing the legislation; changing the subordinate legislation; ministerial determination; regulator action; or stakeholders changing self-regulatory or co-regulatory codes.

Any changes are complicated by the Australian convention of amending legislation much more regularly than repealing and replacing it. By convention, amending legislation is named differently to the amended act. For legislation introduced since about 1990, the intention of legislation is set out in a section of the law entitled “Objects of the Act”. This section is intended to set out the principles behind the legislative framework. In the Australian context, these principles are generally designed to be general and not technology-specific. However, subordinate legislation may be technologically determinative, even if the primary legislation is not.

2.2. Technology-specific versus technology-neutral

A primary regulatory challenge exists in determining the suitability of technology-specific versus technology-neutral legislation. A popular belief in selected literature and in the government domain is the need for inclusive legislation that is broad enough to apply to present and future technologies, ensuring that laws remain up-to-date. This is the basic premise underlying the technology-neutral approach to legislation, which “appears to have three main aims: future proofing, online and offline equivalence, and encouraging the development and uptake of the regulated technology” (Reed, 2007, p. 275). This approach is often incorrectly perceived in a positive sense, disregarding the fact that “technology-neutral language” does not necessarily account for the dynamic nature of technological change (Moses, 2007, p. 270). It has been argued by Koops (2006, pp. 5–6) that the phrase technology-neutral can imply different meanings and be examined from regulatory, technological and legislative perspectives (p. 26). For a comprehensive treatment of the concept and the varying interpretations and perspectives, refer to Koops (2006) and Reed (2007). According to Australia's Privacy Commissioner, with reference to the Commonwealth Privacy Act 1988, technology-neutral legislation refers to the regulation of “information handling without referring to specific technologies”, granting flexibility and ensuring relevance as new technologies emerge (Pilgrim, 2010, p. 23).

It has been argued that parliaments “are using the spurring notion of ‘technology-neutral’ legislation as one excuse for inaction” (Clarke, 2003), resulting in a situation in which “[n]ew powers are granted through technological ambiguity rather than clear debate” (Escudero-Pascual and Hosein, 2004, p. 82). However, Pilgrim (2010, p. 24) contends that adopting this approach does not necessarily mean overlooking developments in technology. Other authors also insist that “legal regulation should define principles, functions and requirements, drawn from the experience (or anticipation) of using specific technologies, rather than provisions regulating the specific technologies themselves” (Székely et al., 2011, p. 183). Yet, Reed (2007, pp. 279–280) and Moses (2007, pp. 270–274) are sceptical of whether such legislation can be achieved in the drafting process, as it is difficult for technology-neutral language to adequately reflect the nature of technological evolution. Even if accomplished, Hosein (2001, p. 29) claims that the approach is deceitful as it may disregard critical factors unique to certain technologies.

An alternative approach calls for technology-specific legislation, which is not without its drawbacks. Several researchers maintain that seeking technology-specificity will produce issues relating to the future applicability of legislation, in that technological progress may render the law ineffectual and redundant (Koops, 2006, p. 27; Székely et al., 2011, pp. 182–183). Nonetheless, authors such as Ohm (2010) declare that there is a compelling case for technology-specific legislation. In an article titled The Argument against Technology-Neutral Surveillance Laws, Ohm explains that technology-neutral concepts are often emphatically embraced (p. 1685), thereby preventing the potential benefits of technology-specificity from being garnered, even though technology-neutrality has several flaws and its benefits can be offset by limitations, specifically in relation to surveillance (p. 1686). Ohm acknowledges that “longevity” is an advantage of technology-neutral legislation (p. 1702) but also suggests general principles that can be applied to technology-specific legislation to address issues of redundancy and to achieve a suitable degree of specificity (pp. 1702–1710). The selection of the technology-neutral versus technology-specific approach to legislation should be perceived as a choice, and technology-neutrality should not be presumed the most suitable means of regulating technology (Reed, 2007, pp. 282–283; Ohm, 2010, p. 1686).

In the Australian context, the use of technology-neutral primary legislation and technologically determinative, and more frequently amended, subordinate legislation is intended to provide both options to create an optimal regulatory environment. However, in the context of LBS, the optimisation function is complicated by the fact that there is no primary or subordinate legislation on LBS. As will be shown in this paper, LBS crosses a range of regulatory regimes, all of which are optimised for the principles set out in the primary legislation. As a result, the extent to which each approach applies to LBS has yet to be examined and cannot be so until the existing regulatory landscape has been defined and reviewed. LBS are positioned in a complex and multi-faceted regulatory environment with no single LBS regulatory framework.

2.3. Legislation versus self-regulation

An additional concern, particularly for industry, is exercising caution in the introduction of legal measures so as not to stifle the development of particular technologies or industries. The Telecommunications Act 1997 (Cth) at Section 5 states that telecommunications should be regulated in a manner that promotes the greatest practicable use of industry self-regulation consistent with the objects of the Act. However, the Privacy Act 1988 (Cth) and the Telecommunications (Interception and Access) Act 1979 (Cth) have no such reference.

In the context of LBS, the telecommunications sector stakeholders would anticipate self-regulation as the core of the regulatory environment. On the other hand, a privacy advocate would expect a regulatory approach which is strictly rules-based (legislation and subordinate legislation). This creates a potential struggle between the two forms of regulatory implementation. In the context of online privacy, Hirsch (2010, pp. 22–33) describes this struggle. Hirsch (2010, p. 3) also claims that self-regulation has been dominant to date. Self-regulation is an ideal approach for advancing the growth of the information and communications technologies (ICT) sector (Koops, 2006, p. 9). An overview of industry self-regulation theory and literature is presented in Hemphill (2004, pp. 83–84). While self-regulation can assume many forms, Gunningham and Rees (1997, pp. 364–365) differentiate between the individual and group approaches. The first refers to autonomous regulation by an individual entity and the second to collective regulation, an example of which is industry self-regulation requiring cooperation amongst entities. According to the authors, other distinctions can also be made relative to economic versus social factors, in addition to the level of government involvement in the self-regulation process, including the degree to which self-regulation is mandated (p. 365). There is the belief that self-regulation complemented by some form of government involvement is of greater value than self-regulation alone (p. 366).

The self-regulation approach is typically favoured by industry due to its ability to facilitate and adapt to market and technological developments, and may accompany government regulation, particularly in cases where gaps in the latter exist (Cleff, 2010, p. 162). The approach is frequently expressed as a fitting antidote to the limiting nature of legislative action. For example, O'Connor and Godar (2003, pp. 257–260) argue that industry self-regulation is preferable to legislation, eliminating the need for restrictive laws that hamper progress within the industry, as was the case in the telemarketing arena. The researchers also state that self-regulatory measures should be developed with sensitivity to ethical concerns, otherwise they will be perceived unfavourably by consumers (O'Connor and Godar, 2003, p. 259). Only then can self-regulation demonstrate potential and be beneficial. Theoretical benefits include “speed, flexibility, sensitivity to market circumstances and lower costs”, but in practice self-regulation generally falls short of these expectations (Gunningham and Rees, 1997, p. 366).

This shortfall is linked to criticism of self-regulation as a means of avoiding State involvement and other forms of regulation (Gunningham and Rees, 1997, p. 370; Clarke, 2003), enabling industry to achieve its goals to the detriment of the public. Furthermore, the capacity for self-regulation to address societal concerns, such as consumer privacy, is questionable due largely to the lack of transparency, and as such the approach can merely serve as an adjunct to government regulation (Cleff, 2010, p. 162). Industry self-regulation should nonetheless be considered earnestly, although an understanding of its dimensions and known restrictions is indispensable (Gunningham and Rees, 1997, p. 405). Self-regulation and industry involvement in regulatory processes may be beneficial to consumers and other stakeholders. However, validating its value when compared with legislation requires an assessment of the level of independent oversight that exists, the manner in which self-regulation is implemented, and the extent to which it complements present legislation and regulatory mechanisms.

2.4. Multiple and competing stakeholder interests

In considering the balance between rules-based regulation and self-regulation, a notable challenge emerges surrounding the importance of accounting for multiple and competing interests. That is, the challenge is how the views of multiple stakeholders can be integrated without creating “regulatory capture” (see, for example, Dal Bó, 2006) by the stakeholders with the greatest commercial or political power. This may theoretically be achieved by employing the co-regulatory approach to regulation. While the co-regulatory approach is an involved process that embodies countless complexities and facets (Hirsch, 2010, pp. 6–8, 41–46), and has been regarded as a promising means of collaboratively managing multiple interests, it is also essential to recognise that such collaboration will involve reconciling rival perspectives. From the discussion above, it is apparent that certain entities will favour particular forms of and approaches to regulation. For example, there is often opposition from the technical and scientific communities in relation to legislation, which is typically perceived as a possible impediment to the technology development process (Székely et al., 2011, p. 183). Such communities are generally in favour of self-regulation and technology-based approaches in that they ensure industry progress is not hindered. However, these sentiments are not supported by all stakeholders. The LBS industry, with its varied value chain, consists of a wide range of stakeholders and its composition is dependent on a given LBS solution.

2.5. Flexible regulatory structures

In addition to being sensitive to varying stakeholder interests, a regulatory environment must be cognisant of the rapid and/or continual changes caused by technological innovations. This may require contemplation of flexible regulatory structures. However, it is likely that a regulatory framework would have no greater level of flexibility than a standards body dealing with the same innovations. For the purpose of this discussion, flexibility simply refers to the general need for the regulatory environment to deal with constant technological change. This is an important element as the pace of technological development and usage “raises the question whether law in general manages to keep up” (Cleff, 2010, p. 161). The level of flexibility does not require the law to “keep up”. Rather, it requires the regulatory environment to be able to flex. Nonetheless, the introduction of flexible regulatory structures capable of adapting to and incorporating developments in technology remains a challenge, one which technology-neutrality and self-regulation attempt to surmount. The introduction of adaptable structures demands a nuanced understanding of the nature of emerging technologies, and related legal and ethical challenges. Székely et al. (2011, p. 183) claim this to be an issue, given that a relatively limited number of legal experts possess such knowledge, a claim supported by Svantesson (2011, p. 244).

It is within this multi-faceted and intricate regulatory environment that the need for LBS regulation in Australia must be investigated, an environment that is characterised by diverse approaches to ICT regulation and privacy that complicate regulatory debates associated with technologies such as LBS. The following section identifies the Australian regulatory framework for LBS, which is largely legislation-based but is supplemented by self-regulation. This is followed by the application of the conceptual framework drawn together in this section to the Australian case in order to validate the existing scheme. A sketch of the LBS regulatory framework in Australia has not yet been attempted, nor has the validity of such a framework been previously measured. This paper will consequently provide the foundations for further study into the need for LBS regulation in Australia.

3. LBS regulatory framework in Australia

Research into LBS regulation is very much context-dependent, as each setting will inevitably embody a distinctive approach to regulation based on numerous factors. This approach may involve a review of existing legal frameworks, for example, in addition to an assessment of the unique cultural, political, economic and other factors that define such regulatory frameworks. These differences demand an independent reflection on respective regulatory settings. To begin with, context denotes the “structured social settings with characteristics that have evolved over time (sometimes long periods of time), and are subject to a host of causes and contingencies of purpose, place, culture, historical accident, and more” (Nissenbaum, 2010, pp. 129–130). With respect to regulation and the law, context produces challenges across jurisdictions, affecting both the internationalisation of legal frameworks pertaining to LBS and the interpretation of laws within specific settings. Such issues are evident in the implementation of the European legal framework for LBS, in which Member States have integrated applicable European Union Directives in alternative ways, resulting in varied coverage and distinct difficulties in the respective nations, as demonstrated in a report by the FIDIS Consortium (FIDIS, 2007).

The importance of context to regulatory and public policy discussions is not restricted to the jurisdictional issues but is also apparent in sub-contexts. For example, Marx (1999, p. 46) identifies “setting” as being of particular importance in terms of LBS usability contexts. That is, a location-monitoring solution that aids a skier in the event of an avalanche is perceived in a different light to the same device being covertly installed in an individual's vehicle. To form the foundations for a context-based investigation of LBS regulation in Australia, the Australian regulatory framework for LBS is presented in this section.

The present regime in Australia is comprised of and dominated by a collection or patchwork of federal and state-based laws that relate – albeit to varying degrees – to diverse aspects of LBS, in addition to numerous industry-based codes that seek to protect the interests of consumers and organisations. With respect to legislation, federal laws relating to privacy (Cho, 2005; APF, 2007; ALRC, 2008; Rodrick, 2009), telecommunications (APF, 2007; Nicholls and Rowland, 2007; Nicholls and Rowland, 2008a,b; Rodrick, 2009), surveillance (APF, 2007; ALRC, 2008; Rodrick, 2009; VLRC, 2009; Attorney General's Department, 2011; Michael and Clarke, 2012) and national security/anti-terrorism (Rix, 2007; VLRC, 2009; Attorney General's Department, 2011; Michael and Clarke, 2013) apply. With respect to self-regulatory schemes, industry-based guidelines such as those developed by Communications Alliance and the Australian Mobile Telecommunications Association (AMTA) are of significance. The respective approaches are now examined in greater detail.

GIS and legal scholar George Cho is the author of Geographic Information Systems and the Law: Mapping the Legal Frontiers (1998) and Geographic Information Science: Mastering the Legal Issues (2005). Both of Cho's works analyse the legal implications of geographic information and related technologies. In the first book, Cho (1998, pp. 27–28) explains that an elementary appreciation of the legal and policy challenges associated with GIS requires disaggregation of the terms geographic, information, and systems to define issues within individual themes. The author claims that information (and data) are central to these challenges (p. 28) given their ability to “be beneficial or detrimental to individuals, groups and ultimately to society at large” (p. 31) and to symbolise various power relations (p. 130). The “double-edged” nature of GIS simultaneously grants access while also enabling abuse and invasion of privacy (p. 131), thus requiring a policy response that may be enacted through “education of the public, facilitation, regulation and the provision of incentives” (p. 166). In sketching the LBS regulatory framework throughout this paper and considering the available regulatory choices, it is crucial to be mindful of this “double-edged” nature of LBS, specifically that LBS applications and devices can enable constructive uses on the one hand and simultaneously facilitate less desirable uses on the other.

In Cho's second book (2005, pp. 17–18) he advances the discussion by outlining the intricacies characterising GIS-related policy development given the multitude of actors, the abundance of applications and the rise in m-commerce and geo- or g-commerce services. Providing introductory material relating to policy, law and the relationship between the latter and GIS, Cho maintains that policy challenges are of equivalent value to technical considerations associated with geographic information access, implementation and usage (p. 27). With respect to GPS, and tracking more specifically, the author asserts that policy debates are generally concerned with privacy and human rights violations (p. 44). The privacy threat is largely the effect of “the new inferences that may be obtained by correlating geographic information with personal information” (p. 211). In Australia, the privacy threat and its varying implications fall within the scope of a regulatory framework that has been described as “ad-hoc”, entailing approaches such as legislation and self-regulation that aim to safeguard personal and information privacy (p. 217). The framework is based on existing legal safeguards that aim to protect public and private sector handling of information in accordance with a collection of privacy principles (p. 257), notably, the Privacy Act 1988 (Cth) (see also, Privacy Amendment (Private Sector) Act 2000 (Cth); Morris, 2010). For a comprehensive listing of privacy-related legislation, including state-based laws omitted from this paper, see Clarke (2010) and APF (2007).

3.1. Privacy legislation

The Privacy Act 1988 was amended in November 2012 to introduce the Australian Privacy Principles (APP). These principles come into effect in March 2014. The APPs are a single set of principles that apply to both agencies and organisations, which are together defined as APP entities. While the APPs apply to all APP entities, in some cases, they impose specific obligations that apply only to organisations or only to agencies. The APP concerning anonymity or pseudonymity (APP 2) and cross-border disclosure (APP 8) will have an impact on LBS providers. The APPs extend the existing obligations on data collection to rebalance the rights of collectors of personal information and an individual's right to privacy. There are also stricter controls on the collection and use of sensitive information.

The Office of the Australian Information Commissioner (OAIC) offers further information about the APPs, which cover the handling of sensitive personal information (OAIC, n.d.). The Privacy Act 1988 defines ‘sensitive information’ as: “information or an opinion about an individual's: (i) racial or ethnic origin; or (ii) political opinions; or (iii) membership of a political association; (iv) religious beliefs or affiliations; or (v) philosophical beliefs; or (vi) membership of a professional or trade association; or (vii) membership of a trade union; or (viii) sexual preferences or practices; or (ix) criminal record; that is also personal information” (Part II, Section 6). Sensitive information can also encompass health and genetic information. In the context of the Privacy Act 1988, personal information refers to “information or an opinion (including information or an opinion forming part of a database), whether true or not, and whether recorded in a material form or not, about an individual whose identity is apparent, or can reasonably be ascertained, from the information or opinion” (Part II, Section 6).

It has been argued that the major dilemma in relation to LBS, location privacy and existing legislation is that the location of an individual may not necessarily be regarded as sensitive personal information. However, the obligations under the Privacy Act 1988 in respect of personal information under the APPs are relatively onerous. It has also been argued that processed LBS data presents sizeable privacy implications (Cho, 2005, p. 258).

The 2012 amendments to the Privacy Act 1988 were guided by the Australian Law Reform Commission's (ALRC, 2008) report entitled For Your Information: Australian Privacy Law and Practice. This took into account submissions such as the policy statement by the Australian Privacy Foundation (APF) on “the use of positional data relating to mobile devices as a means of locating and tracking the individuals carrying them” (APF, 2011). That is, current government policy is that the privacy legislation in Australia deals with LBS-related privacy concerns at the federal level. One state, Victoria, has attempted to address these issues through human rights legislation (Michael and Clarke, 2012, pp. 4–5).

3.2. Telecommunications legislation

Location data is not, however, only subject to privacy legislation but also falls within the scope of the Telecommunications Act 1997 (Cth) and the Telecommunications (Interception and Access) Act 1979 (Cth). These laws collectively deal with telecommunications content and data interception, disclosure and use. The Telecommunications Act 1997 prohibits the disclosure and use of telecommunications metadata and telecommunications content. This prohibition is clarified in section 275A to include location information, and a limited exemption to this prohibition for the purpose of providing “location dependent carriage services” is given in section 291A. However, there is no immunity provided for LBS which do not have a carriage component.

Relevant to this discussion, the ALRC's report outlines the interaction between the Privacy Act 1988 and the Telecommunications Act 1997, noting that both laws aim to regulate privacy and various forms of information (ALRC, 2008). The Privacy Act relates to safeguarding personal information, while Part 13 of the Telecommunications Act “regulates the use or disclosure of information or a document” (ALRC, 2008, p. 2381). The review questions whether both privacy regimes are required, outlining a number of differing stakeholder opinions (ALRC, 2008, pp. 2385–8). Furthermore, it concludes that while there is observable “merit in the promulgation of telecommunications privacy regulations under the Privacy Act to regulate the handling of personal information” (ALRC, 2008, p. 2388), “both the Telecommunications Act and the Privacy Act should continue to regulate privacy in the telecommunications industry” (p. 2389), although the interaction between the two laws requires clarification (p. 2391). It would have been feasible, had it been government policy, for the amendments to the Privacy Act made in 2012 to have included consequential amendments to other legislation such as the Telecommunications Act 1997. The absence of such amendments implies that there is no policy imperative requiring such a change.

The Telecommunications (Interception and Access) Act 1979, on the other hand, is intended “to protect the privacy of personal communications by generally prohibiting interception of those communications, subject to limited exceptions in which privacy is outweighed by other considerations”, and functions alongside Part 13 of the Telecommunications Act 1997 (Nicholls and Rowland, 2007, pp. 86–87). However, the Telecommunications (Interception and Access) Act 1979 does not have the objects of the Telecommunications Act (Nicholls and Rowland, 2008a, p. 349). Significantly, the Telecommunications (Interception and Access) Act 1979 generates three regimes for intercepting telecommunications content and data. The first deals with communications metadata (including in real time), the second with stored communications, and the third with the content of communications itself (Rodrick, 2009, pp. 376–378). The regulatory framework for this legislation can be analysed by using the European Telecommunications Standards Institute (ETSI) approach set out in TS 101 671. This standard sets out three handover interfaces that relate to (in ascending order): the relationship between the communications operator and the law enforcement agency; the request for and delivery of communications metadata; and the request for and delivery of communications content. This is depicted in Fig. 2.

Fig. 2. Interaction between communications operators and law enforcement agencies.

In Australia, there is an obligation on all communications providers (carriers and carriage service providers) to provide assistance to law enforcement agencies. Handover Interface 1 does this by legislation and, in the case of the largest carriers, by contract with law enforcement agencies. Handover Interface 2 is used for the delivery of communications metadata, and this does not require a warrant (Rodrick, 2009, p. 384). The absence of a requirement for a warrant in Australia, and the mere consideration of the target's privacy expectations in the case of real-time metadata, is unusual (Nicholls, 2012, p. 49). Communications content, whether stored or being carried across a network, is delivered over Handover Interface 3 in response to a warrant.
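To make the three-interface structure concrete, the following minimal Python sketch summarises the handover interfaces and the warrant treatment described above. It is an illustration only: the names (HandoverInterface, InterceptionRequest, AUSTRALIAN_REGIME) and the data-model shape are assumptions of this sketch, not terms drawn from TS 101 671 or from the Australian legislation.

# Illustrative sketch only: models the ETSI handover interfaces as described in
# this section, with the Australian warrant treatment noted in the text
# (metadata over HI2 without a warrant; content over HI3 under a warrant).
from dataclasses import dataclass
from enum import Enum


class HandoverInterface(Enum):
    HI1_ADMINISTRATIVE = 1  # relationship between the operator and the agency
    HI2_METADATA = 2        # request for and delivery of communications metadata
    HI3_CONTENT = 3         # request for and delivery of communications content


@dataclass
class InterceptionRequest:
    interface: HandoverInterface
    description: str
    warrant_required: bool


# Summary of the regime described above (Rodrick, 2009; Nicholls, 2012).
AUSTRALIAN_REGIME = [
    InterceptionRequest(HandoverInterface.HI1_ADMINISTRATIVE,
                        "assistance obligations via legislation and carrier contracts", False),
    InterceptionRequest(HandoverInterface.HI2_METADATA,
                        "communications metadata, including near real-time location data", False),
    InterceptionRequest(HandoverInterface.HI3_CONTENT,
                        "stored or in-transit communications content", True),
]

if __name__ == "__main__":
    for request in AUSTRALIAN_REGIME:
        status = "warrant required" if request.warrant_required else "no warrant required"
        print(f"{request.interface.name}: {request.description} ({status})")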

Relevant to LBS and this paper, it is crucial to determine the extent to which location information falls within the scope of federal telecommunications legislation, specifically the Telecommunications (Interception and Access) Act 1979. Of particular value is ascertaining whether location information constitutes telecommunications data, in which case the implications for disclosure to and access by specific agencies are great, given that such data may then be lawfully “disclosed to ASIO and law enforcement agencies without a warrant and without any independent oversight” (Rodrick, 2009, p. 391). In an article titled Regulating the Use of Telecommunications Location Data by Australian Law Enforcement Agencies, Nicholls and Rowland (2008b, p. 174) argue that telecommunications data, or the metadata relevant to communications including location details, are increasingly being provided to law enforcement agencies in the absence of a warrant. The authors also note that an oversight process is lacking, a situation that is inconsistent with European and US models (Nicholls and Rowland, 2008b, p. 181).

That is, Australia “appears to be isolated in its approach of placing the power to have location metadata supplied on a prospective basis to law enforcement agencies” (Nicholls and Rowland, 2008b, p. 181). This is exceedingly problematic given that a definition of telecommunications data is non-existent in the legislation (Nicholls and Rowland, 2008b, p. 174) and that a certain degree of ambiguity is required in incorporating future technologies (p. 179). However, this is likely to result in issues whereby the agencies seeking location data are able to independently control the definition or the type of metadata requested (Nicholls and Rowland, 2008b, p. 180). Thus, agencies are lawfully able to access location data on a prospective basis. This ability for close to real-time access of location data will facilitate “live tracking” (Nicholls and Rowland, 2008b, p. 176).

As examined earlier in reference to federal privacy legislation, the (privacy) risks are intensified with increases in accuracy and greater use of mobile devices for tracking purposes, further questioning the suitability of present telecommunications legislation in Australia, especially given the capability for telecommunications data to be accessed without a warrant and devoid of an “independent oversight” process (Rodrick, 2009, p. 404). There has been a push for more rigorous safeguards, summarised succinctly by Rodrick (2009, p. 407): “In light of the fact that prospective location information is tantamount to surveillance, access to it should be procured only via a warrant, and, as is the case with the interception and stored communications regimes, in deciding whether to issue a warrant, the issuing authority should be required to have regard to the degree to which the privacy of a person would be interfered with”.

3.3. Surveillance legislation

The use of surveillance devices is generally prohibited under the laws of the states and territories in Australia. Each state and territory prohibits the use of tracking devices and then provides an exception to the prohibition for law enforcement agencies. A tracking device is usually defined to mean “any electronic device capable of determining or monitoring the location of a person or an object or the status of an object”. That is, an LBS device would generally be prohibited under state law if it was used for surveillance. However, if the person being tracked by the device was aware of the tracking then the use of the device would not be prohibited. Example LBS tracking devices could include smart phone-based location-monitoring solutions and dedicated data logging devices that may be mounted to a particular surface or wired into a vehicle.

The state-based exceptions refer to the law enforcement agencies of that state. As a result, the Surveillance Devices Act 2004 (Cth) was introduced to provide a regime that permitted the use of surveillance devices (including tracking devices) across state and territory boundaries. The Surveillance Devices Act 2004 sets out the process through which warrants, emergency and tracking device authorisations can be obtained in relation to surveillance devices for law enforcement and other purposes (Attorney General's Department, 2011). Part 1, Section 6 of the act presents a number of definitions important for this article: “data surveillance device means any device or program capable of being used to record or monitor the input of information into, or the output of information from, a computer, but does not include an optical surveillance device… device includes instrument, apparatus and equipment… surveillance device means: (a) a data surveillance device, a listening device, an optical surveillance device or a tracking device; or (b) a device that is a combination of any 2 or more of the devices referred to in paragraph (a); or (c) a device of a kind prescribed by the regulations… tracking device means any electronic device capable of being used to determine or monitor the location of a person or an object or the status of an object.”

Michael and Clarke (2013) note that law enforcement agencies, in particular, may utilise LBS for personal and mass surveillance, which are often justified as means of maintaining security, despite the lack of an adequate judicial process in some cases.

The Victorian Law Reform Commission (VLRC) published a consultation-based report on the subject of Surveillance in Public Places (VLRC, 2010). While the report is largely state-focused, it covers many aspects relevant to this investigation and discusses limitations in current surveillance laws and the need for “modernising” existing state-based legislation (refer to Chapter 6 of the report). With specific reference to the Surveillance Devices Act 2004, the VLRC's accompanying consultation paper specifies its applicability to national security and surveillance efforts, explaining that the federal law does not seek to overrule state-based legislation (VLRC, 2009). In combination with the Telecommunications (Interception and Access) Act 1979, the Surveillance Devices Act 2004 does, nevertheless, intend to “provide enforcement and national security agencies with significant investigative tools, including the ability to obtain warrants to intercept communications, obtain access to stored communications, install and use surveillance devices, and to obtain access to telecommunications data while still protecting the privacy of individuals” (Attorney General's Department, n.d.).

While the federal Surveillance Devices Act 2004 generally requires a warrant for surveillance, Sections 37–39 of the legislation indicate the conditions or circumstances under which a warrant is not required. Explicitly, section 39 outlines the provisions in relation to tracking devices; that is: “(1) A law enforcement officer may, with the written permission of an appropriate authorising officer, use a tracking device without a warrant in the investigation of a relevant offence” and “(3) A law enforcement officer may, with the written permission of an appropriate authorising officer, use a tracking device without a warrant in the location and safe recovery of a child to whom a recovery order relates”. A tracking device can also be used by a law enforcement agency without a warrant if there is no requirement to enter premises or a vehicle (for example, by installing a magnetically mounted GPS device).

Additional rules relating to the authorisation also apply. For example, the authorisation must specify the period of validity, which should not exceed 90 days (section 39 (7)). It is clear that there are situations in which a location-enabled tracking device may be lawfully deployed, utilised and retrieved by certain law enforcement agencies. In cases where personal information has been collected using such surveillance devices, the Privacy Act 1988 will then apply.

3.4. National security and anti-terrorism legislation

Federal anti-terrorism laws also grant organisations, notably ASIO and the Australian Federal Police (AFP), the facility to conduct surveillance activities and gather information believed to be in the interest of national security. For example, the Australian Security Intelligence Organisation Act 1979 (Cth) enables ASIO to gather information considered to be of value in the deterrence of an act of terrorism (Attorney General's Department, 2011). ASIO is specifically granted the ability “to obtain a warrant to detain and question persons (who do not themselves have to be suspected of terrorism offences) in order to gather intelligence related to terrorist activity” as a form of preventative measure (Rix, 2007, p. 104). The Criminal Code Act 1995 (Cth) grants the AFP powers relating to questioning and surveillance (VLRC, 2010, p. 21). It also covers procedures relating to court orders, detention, questioning and search, and the collection of information and documents (Rix, 2007, p. 106). The implications of these pieces of legislation in particular, and the extent to which they apply to LBS, surveillance, tracking and location information, have not been sufficiently examined and remain unclear. It has previously been suggested that these laws fail to protect human rights (Rix, 2007, p. 107), and with respect to the ASIO Act, the government has “unquestioningly granted powers to national security agencies to use location technology to track citizens”, justifying surveillance as a necessary means of ensuring Australians are protected from terrorist threats (Michael and Clarke, 2012, p. 2).

3.5. Industry guidelines for location-service providers

The LBS regulatory framework in Australia is not limited to legislation, but also includes self-regulation in the form of industry guidelines. The main industry body for all telecommunications operators in Australia is Communications Alliance. Its Guideline G557:2009, Standardised Mobile Service Area and Location Indicator Register, uses a coarse LBS to identify the geographic location of calls from mobile and nomadic devices to the emergency services.

Guidelines have also been released by the Australian Mobile Telecommunications Association (AMTA). AMTA is “the peak industry body representing Australia's mobile telecommunications industry” (AMTA, n.d.). In 2010, AMTA released guidelines intended for location-service providers (LSPs) to mitigate the threats associated with misuse of passive LBS (AMTA, 2010, p. 4), which are services that do not rely on active participation by the user once initial consent has been granted (p. 5). The guidelines were developed by AMTA's working party, which comprised major stakeholders in Australia, including Nokia, Optus, Telstra and Vodafone Hutchison Australia (AMTA, 2010, p. 26), providing an example of the self-regulation approach in practice. Although AMTA's guidelines were built on the National Privacy Principles (NPPs) and other relevant legislation (AMTA, 2010, p. 5), by April 2013 they had not been updated to reflect the amendments to the Privacy Act. The guidelines document also encourages compliance with relevant Australian laws (AMTA, 2010, p. 17), including a selection of those identified throughout this paper.

In theory, industry guidelines are significant in that they are a form of self-regulation aimed at addressing regulatory concerns, such as the risks associated with LBS usage, without the need for legislative action. As such, they form a crucial component of the LBS regulatory framework. AMTA's guidelines represent the industry's effort to ensure consumer privacy protection and safety when utilising available LBS applications – yet they have not been amended to reflect legislative change in the privacy arena. It is noteworthy that the industry-based self-regulation approach has its critics. For instance, Cho (2005, p. 236) claims that while self-regulation affords flexibility to industry stakeholders and symbolises a proactive approach to privacy concerns, it may by the same token be perceived as an inadequate safeguard. In the context of AMTA's guidelines, Michael and Clarke (2012, p. 5) are similarly critical of the efficacy of self-regulation, claiming that industry guidelines and codes are typically “a political tool to avoid regulation.”

4. Discussion: validating the Australian framework

4.1. Summarising and sketching the LBS regulatory framework in Australia

This paper serves to sketch the current LBS regulatory framework in Australia, identifying the components comprising the overall framework, as summarised in Fig. 3. Section 3 demonstrated that the LBS regulatory framework in Australia is largely dominated by legal and industry-based regulatory approaches, in particular, commonwealth-based (federal) legislation and self-regulatory mechanisms applying across Australia. The extent to which each regulatory tool applies to LBS and location data was also covered.

Fig. 3. Components of the current LBS regulatory framework in Australia.

A number of issues inevitably emerge upon closer examination of the current LBS regulatory framework in Australia. With regard to privacy legislation, it was noted that (location) information derived from LBS solutions might or might not be personal information and is unlikely to be sensitive personal information; the Privacy Act may therefore not cover the data. Regarding Australian telecommunications legislation, location data may not specifically be classed as ‘telecommunications data’ in all circumstances, and the concept of a ‘location dependent carriage service’ introduces ambiguity regarding definitions. The state-based prohibition on the use of tracking devices means that the provision of LBS will require the explicit permission of the users of an LBS device. This is similarly the case with respect to surveillance legislation, in which tracking devices can be deployed for surveillance purposes, and without a warrant, in specific situations as outlined in the federal legislation. The implications of this lawful but covert deployment of tracking devices are yet to be fully explored. Correspondingly, national security legislation grants increasing powers to various agencies to monitor individuals under the guise of maintaining national security and protecting the interests of Australian citizens. The legal mechanisms that apply to LBS require further review, as they fail to adequately cover various aspects relevant to LBS and location data, and the laws are not necessarily consistent or aligned with one another. However, opportunities to implement such a review through policy have not been seized in recent legislative changes.

Similarly, industry-based guidelines are lacking in their coverage of LBS. For example, AMTA's guidelines merely cover passive LBS or those that do not require user input once initial consent is given. This is not surprising as industry bodies self-regulate a narrow group. Self-regulation is poor at involving users and other industry representatives. Supplementary to these individual issues, it is essential at this point to validate the Australian regulatory scheme in view of the conceptual framework defined earlier in this paper, in order to identify the broad challenges that surface in examining the existing framework, summarised in Table 1.

Table 1. Validation of the LBS regulatory framework in Australia.

Challenges/considerations | Validation | Areas for improvement
Technology-specific versus technology-neutral | The Australian framework is largely technology-neutral (with the exception of industry guidelines) and is not LBS-specific. | Subordinate legislation and regulation could be extended to cover the specifics of LBS and location data; this may necessitate continual review of regulatory settings as LBS solutions and underlying technologies evolve.
Legislation versus self-regulation | The existing framework draws on a combined legal and industry-based approach to regulation, which allows for both government and industry involvement; however, self-regulation is a characteristic of telecommunications and not privacy legislation. | Self-regulation is created by narrow industry groups and is lacking in its involvement of users; there could be closer collaboration between industry and government; the drawbacks of current regulation and industry-based tools identified in this paper should be addressed.
Multiple and competing stakeholder interests | Government and industry have largely established the current Australian framework for LBS; however, it lacks a stronger level of collaboration and user involvement. | Collaboration and consultation are crucial in the regulatory process to ensure stakeholder representation; users, in particular, must be encouraged to participate; individual stakeholders in government, industry and user segments should be identified and approached.
Flexible regulatory structures | Legislation in the present framework is not particularly flexible and does not easily cater for LBS solutions in the marketplace or any future developments; subordinate legislation is more flexible. | Technology-specificity is required to incorporate LBS and location data into subordinate legislation; industry-based tools should be continually developed and should be adaptable to technological developments.

4.2. Validation: extent to which the existing framework is specific to LBS

When considering the technology-specific versus technology-neutral debate in light of the LBS regulatory framework in Australia, it is evident that the current framework comprises largely technology-neutral elements. As a consequence, the framework fails to account for the specifics of LBS, and in particular for location data, generating a risk that concerns unique to LBS will be overlooked in the Australian context. Technology-neutrality also creates ambiguity in definitions, as can be seen in the Australian privacy and telecommunications legislation in particular. However, as Australian government policy has consistently adopted technology-neutral legislation, the focus of change needs to be on subordinate legislation and self-regulatory mechanisms. The absence of an appropriate regulatory environment for LBS is undesirable from the perspective of all stakeholders, particularly individuals. The existing framework requires further provisions for LBS and location data, and legal and industry-based regulatory mechanisms can therefore be expected to require continual review in a technological landscape dominated by constant developments in both underlying technologies and emerging (and novel) usability contexts.

4.3. Validation: value of existing legislative and self-regulatory mechanisms

The Australian regulatory framework for LBS demonstrates a combined approach to regulation, in which legal and industry-based mechanisms are concurrently implemented. It is often believed that the combined approach allows the specifics of a given technology to be better incorporated, especially at the industry level and via self-regulatory mechanisms. The Australian initiative led by AMTA can be perceived as a move towards increased industry involvement and representation, and an attempt to avoid unnecessarily stifling the LBS industry. The concern, however, lies in the limitations of self-regulation and the consequence that the guidelines are narrow in scope and in their coverage of the wide range of LBS solutions. In terms of legislation, the specific drawbacks of existing laws have been identified; a review of federal legislation is required to ensure its applicability to LBS and that the laws are mutually consistent. Furthermore, closer collaboration between government, industry and users would improve the legal and industry-based mechanisms in the current framework. That is, government and other stakeholders need to be involved in industry-based processes. This type of co-regulation reduces the negative impacts of self-regulation, while allowing industry to provide feedback that informs legislative processes. Importantly, consumers have an opportunity to express ‘real-world’ concerns that directly support both legislative and co-regulatory processes.

4.4. Validation: degree to which stakeholder interests are accounted for

While government and industry perspectives have been represented to some extent in the existing framework, further collaboration is required to account for the views of users. Furthermore, individual stakeholder types must be identified within the government, industry and user segments, and collaboration among individual stakeholders must be encouraged to ensure that all interests are represented in the regulatory process. In the Australian context, collaboration and consultation with a wide range of LBS value chain stakeholders is lacking, but is essential in order to incorporate multiple and competing stakeholder interests.

4.5. Validation: level of flexibility

The Australian legislative framework does not provide a flexible regulatory structure. That is, the legislation is out-dated with respect to LBS and existing provisions do not naturally enable the absorption of new LBS solutions and features. It is suggested that a higher degree of technology-specificity is required in subordinate legislation, given the unique characteristics of LBS and location data which do not always fall within the scope of current definitions. However, this approach must be carefully constructed to ensure that the chosen regulatory mechanisms are adaptable as the technology evolves. In combination with considered co-regulatory tools and guidelines that have been developed in an objective manner, this should ensure a degree of flexibility, given that regulatory systems can adapt more quickly than legislative systems.

4.6. Future research and extending the Australian framework

This paper has set the groundwork for understanding the nature and extent of the LBS regulatory framework in Australia by sketching the components of the existing scheme and defining the extent to which the respective elements apply at the federal level. It has additionally set out the regulatory and public policy context within which the framework exists and the challenges that demand a certain degree of sensitivity by presenting a conceptual framework for analysing LBS regulation. It is recommended that future studies: (a) utilise the conceptual framework as a means of measuring the validity of a given regulatory framework in a specific setting, and (b) employ the defined Australian framework as the basis for examining the need for LBS regulation in Australia and understanding the manner in which LBS regulation should be implemented.

The Australian framework presented in this paper can be further extended as part of future work. Explicit areas for prospective research include: (a) broadening the scope of the framework to account for state-based legislation and additional industry-based mechanisms, (b) encouraging a greater focus on cross-cultural comparisons by comparing the Australian case with other, more mature examples such as the European data protection regime for LBS, (c) consulting with relevant stakeholders regarding the applicability and adequacy of the Australian framework and existing regulatory measures and contrasting the results with the outcomes of the validation process presented in this paper, and (d) improving the framework based on the suggested areas for improvement.

5. Conclusion

The focus of this paper was on developing a conceptual framework for analysing LBS regulation, presenting the components of the existing Australian framework and subsequently engaging in a process of validation. The validation process indicated that the LBS regulatory framework in Australia should: (i) account more specifically for LBS and location data, (ii) better incorporate legislative, self-regulatory and co-regulatory mechanisms, (iii) encourage a higher degree of collaboration with stakeholders in the LBS value chain, and (iv) encompass a higher degree of flexibility to ensure technological developments are integrated. The benefits to be garnered from this exercise include an accurate and detailed understanding of the current framework in Australia which has allowed areas for improvement to be identified. The ensuing outcomes can be used as the basis for future research in the LBS regulation field and provide a useful starting point for determining the need for LBS regulation in Australia.

Acknowledgements

The authors wish to acknowledge the funding support of the Australian Research Council (ARC) – Discovery Grant DP0881191 titled “Toward the Regulation of the Location-Based Services Industry: Influencing Australian Government Telecommunications Policy.” The views expressed herein are those of the authors and are not necessarily those of the ARC.

References

ALRC, For your information: Australian privacy law and practice, ALRC report 108 (2008). 12 January 2012, http://www.alrc.gov.au/publications/report-108

AMTA, AMTA guidelines, 24 April 2012, www.amta.org.au/files/Location.Based.Services.Guidelines.pdf (2010)

About AMTA; n.d. 21 February 2012. http://www.amta.org.au/pages/About.AMTA.

APF, Privacy laws – Commonwealth of Australia, 20 February 2012, http://www.privacy.org.au/Resources/PLawsClth.html (2007)

APF, Location and tracking of individuals through their mobile devices, 20 February 2012, http://www.privacy.org.au/Papers/LocData.html (2011)

Attorney General's Department, Australian laws to combat terrorism, 20 February 2012, http://www.nationalsecurity.gov.au/agd/www/nationalsecurity.nsf/AllDocs/826190776D49EA90CA256FAB001BA5EA?OpenDocument (2011)

Attorney General's Department. Telecommunications interception and surveillance; n.d., 20 January 2012. http://www.ag.gov.au/Telecommunicationsinterceptionandsurveillance/Pages/default.aspx.

A. Barreras, A. Mathur, Wireless location tracking, [chapter 18], K.R. Larsen, Z.A. Voronovich (Eds.), Convenient or invasive: the information age, Ethica Publishing, United States (2007), pp. 176-186

J. Braithwaite, C. Coglianese, D. Levi-Faur, Can regulation and governance make a difference? Regulation & Governance, 1 (2007), pp. 1-7

P. Bridgman, G. Davis, The Australian policy handbook, (3rd ed.), Allen & Unwin, Crows Nest, NSW (2004), 198 p.

G. Cho, Geographic information systems and the law: Mapping the legal frontiers, John Wiley & Sons, Chichester, West Sussex (1998), 337 p.

G. Cho, Geographic information science: mastering the legal issues, John Wiley & Sons Inc, Hoboken, NJ (2005), 440 p.

R. Clarke, While you were sleeping… surveillance technologies arrived, Australian Quarterly, 73 (1) (2001), pp. 10-14

R. Clarke, Privacy on the move: the impacts of mobile technologies on consumers and citizens, http://www.anu.edu.au/people/Roger.Clarke/DV/MPrivacy.html (2003)

R. Clarke, PAIs in Australia – a work-in-progress report, http://www.rogerclarke.com/DV/PIAsAust-11.html (2010)

R. Clarke, M. Wigan, You are where you've been: the privacy implications of location and tracking technologies, http://www.rogerclarke.com/DV/YAWYB-CWP.html (2011)

E.B. Cleff, Effective approaches to regulate mobile advertising: moving towards a coordinated legal, self-regulatory and technical response, Computer Law & Security Review, 26 (2010), pp. 158-169

C. Cuijpers, B.J. Koops, How fragmentation in European law undermines consumer protection: the case of location-based services, European Law Review, 33 (2008), pp. 880-897

E. Dal Bó, Regulatory capture: a review, Oxford Review of Economic Policy, 22 (2006), pp. 203-225

J.E. Dobson, P.F. Fisher, Geoslavery, IEEE Technology and Society Magazine, 22 (1) (2003), pp. 47-52

A. Escudero-Pascual, I. Hosein, Questioning lawful access to traffic data, Communications of the ACM, 47 (3) (2004), pp. 77-83

A. Fenna, Introduction to Australian public policy, Addison Wesley Longman Australia Pty Limited, South Melbourne, Australia (1998), 454 p.

FIDISD, 11.5: the legal framework for location-based services in Europe, http://www.fidis.net/ (2007)

N. Gunningham, J. Rees, Industry self-regulation: an institutional perspective, Law & Policy, 19 (4) (1997), pp. 363-414

T.A. Hemphill, Monitoring global corporate citizenship: industry self-regulation at a crossroads, The Journal of Corporate Citizenship, 14 (Summer 2004), pp. 81-95

W. Herbert, No direction home: will the law keep pace with human tracking technology to protect individual privacy and stop geoslavery?, I/S: A Journal of Law and Policy, 2 (2) (2006), pp. 409-473

D.D. Hirsch, The law and policy of online privacy: regulation, self-regulation, or co-regulation? (2010), pp. 1-62, http://works.bepress.com/dennis_hirsch/61/

I. Hosein, The collision of regulatory convergence and divergence: updating policies of surveillance and information technology, The Southern African Journal of Information and Communication, 2 (1) (2001), pp. 18-33

B.J. Koops, Should ICT regulation be technology-neutral? [chapter 4], B.J. Koops, M. Lips, C. Prins, M. Schellekens (Eds.), Starting points for ICT regulation. Deconstructing prevalent policy one-liners, IT & law series, vol. 9, T.M.C. Asser Press, The Hague (2006), pp. 1-28, (online version). p. 77–108 (original version), http://papers.ssrn.com/sol103/papers.cfm?abstract_id=918746

S. Maddison, R. Denniss, An introduction to Australian public policy: theory and practice, Cambridge University Press, Port Melbourne, Victoria (2009), 281 p.

G.T. Marx, Ethics for the new surveillance, [chapter 2], C.J. Bennett, R. Grant (Eds.), Visions of privacy: policy choices for the digital age, University of Toronto Press, Toronto (1999), pp. 37-67

K. Michael, R. Clarke, Location privacy under dire threat as uberveillance stalks the streets, Precedent (Focus on Privacy/FOI), 108 (2012), pp. 1-8 (online version) & 24-9 (original article), http://works.bepress.com/kmichael/245/

K. Michael, R. Clarke, Location and tracking of mobile devices: uberveillance stalks the streets, Computer Law and Security Review, 29 (2) (2013), http://works.bepress.com/kmichael/305/

J.B. Morris, The privacy implications of commercial location-based services, Testimony before the House Committee on Energy and Commerce, Subcommittee on Commerce, Trade, and Consumer Protection and Subcommittee on Communications, Technology, and the Internet: 1–15, http://inews.berkeley.edu/files/CDT-MorrisLocationTestimony.pdf (2010)

L.B. Moses, Recurring dilemmas: the law's race to keep up with technological change, Journal of Law, Technology and Policy, 2007 (2) (2007), pp. 239-285

L.B. Moses, Agents of change: how the law ‘copes’ with technological change, Griffith Law Review, 20 (4) (2011), pp. 763-794

R. Nicholls, Right to privacy: telephone interception and access in Australia, Technology and Society Magazine, IEEE, 31 (2012), pp. 42-49

R. Nicholls, M. Rowland, Regulating the use of telecommunications location data by Australian law enforcement agencies, Criminal Law Journal, 32 (2008), pp. 343-350

R. Nicholls, M. Rowland, Message in a bottle: stored communications interception as practised in Australia, [chapter 7], K. Michael, M.G. Michael (Eds.), The second workshop on the social implications of national security (from Dataveillance to Uberveillance and the Realpolitik of the Transparent Society), University of Wollongong, IP Location-Based Services Research Program (Faculty of Informatics) and Centre for Transnational Crime Prevention (Faculty of Law), Wollongong, Australia (2007), pp. 83-96

R. Nicholls, M. Rowland, Regulating the use of telecommunications location data by Australian law enforcement agencies, [chapter 14], K. Michael, M.G. Michael (Eds.), The third workshop on the social implications of national security (Australia and the New Technologies: Evidence Based Policy in Public Administration), University of Wollongong, IP Location-Based Services Research Program (Faculty of Informatics), Wollongong, Australia (2008), pp. 173-184

H. Nissenbaum, Privacy in context: technology, policy, and the integrity of social life, Stanford Law Books, Stanford, California (2010), 288 p.

P.J. O'Connor, S.H. Godar, We know where you are: the ethics of LBS advertising, [chapter Xiii], B.E. Mennecke, T.J. Strader (Eds.), Mobile commerce: technology, theory and applications, Idea Group Publishing, Hershey, US (2003), pp. 245-261

OAIC. Privacy Act; n.d. 20 February 2012. http://www.privacy.gov.au/law/act#.

P. Ohm, The argument against technology-neutral surveillance laws, Texas Law Review, 88 (2010) (2010), pp. 1685-1713

T. Pilgrim, Speech to biometrics institute privacy in Australia: challenges and opportunities, 27 May 2010, Amora Hotel Jamison, Sydney. p. 1–29, www.privacy.gov.au/materials/types/download/9516/7089 (2010)

Privacy Act 1988 (Cth). Commonwealth of Australia; 2 February, 2012. http://www.comlaw.gov.au/Details/C2011C00503/Download.

Privacy Amendment (Private Sector) Act 2000 (Cth). Commonwealth of Australia; 2 February 2012. http://www.comlaw.gov.au/Details/C2004A00748/Download.

C. Reed, Taking sides on technology neutrality, SCRIPT-ed, 4 (3) (2007), pp. 263-284

M. Rix, Australia and the ‘war against terrorism’: terrorism, national security and human rights, [chapter 8], K. Michael, M.G. Michael (Eds.), The second workshop on the social implications of national security (from Dataveillance to Uberveillance and the Realpolitik of the Transparent Society), University of Wollongong, IP Location-Based Services Research Program (Faculty of Informatics) and Centre for Transnational Crime Prevention (Faculty of Law), Wollongong, Australia (2007), pp. 97-112

S. Rodrick, Accessing telecommunications data for national security and law enforcement purposes, Federal Law Review, 37 (2009), pp. 375-415

G.D. Smith, Private eyes are watching you: with the implementation of the E-911 mandate, who will watch every move you make? Federal Communications Law Journal, 58 (2006), pp. 705-726

G.J. Stigler, The theory of economic regulation, The Bell Journal of Economics and Management Science, 2 (1971), pp. 3-21

Surveillance Devices Act 2004 (Cth). Commonwealth of Australia; 2 February 2012. http://www.comlaw.gov.au/Details/C2011C00646/Download.

D. Svantesson, A legal method for solving issues of internet regulation, International Journal of Law and Information Technology, 19 (3) (2011), pp. 243-263

I. Székely, M.D. Szabó, B. Vissy, Regulating the future? Law, ethics, and emerging technologies, Journal of Information, Communication & Ethics in Society, 9 (3) (2011), pp. 180-194

Telecommunications (Interception and Access) Act 1979 (Cth). Commonwealth of Australia; 2 February 2012. http://www.comlaw.gov.au/Details/C2012C00081/Download.

Telecommunications Act 1997 (Cth). Commonwealth of Australia; 2 February 2012. http://www.comlaw.gov.au/Details/C2012C00084/Download.

The ASIO Legislation Amendment Act 2003 (Cth). Commonwealth of Australia; 2 February 2012. http://www.comlaw.gov.au/Details/C2004A01228/Download.

The Australian Security Intelligence Organisation Act 1979 (Cth). Commonwealth of Australia; 2 February 2012. http://www.comlaw.gov.au/Details/C2011C00585/Download.

VLRC, Surveillance in public places consultation paper, Victorian Law Reform Commission, Melbourne (2009), http://www.lawreform.vic.gov.au/projects/surveillance-public-places/surveillance-public-places-consultation-paper

VLRC, Surveillance in public places, final report 18, Victorian Law Reform Commission, Melbourne (2010), http://www.lawreform.vic.gov.au/projects/surveillance-public-places/surveillance-public-places-final-report

H. Xu, H.H. Teo, B.Y.C. Tan, R. Agarwal, The role of push–pull technology in privacy calculus: the case of location-based services, Journal of Management Information Systems, 26 (3) (2009), pp. 135-173

Keywords: Location-based services, Regulation, Legislation, Law, Self-regulation, Co-regulation, Industry guidelines, Privacy, Australia

Citation: Roba Abbas, Katina Michael, M.G. Michael, Rob Nicholls, Sketching and validating the location-based services (LBS) regulatory framework in Australia, Computer Law & Security Review, Vol. 29, No. 5, October 2013, pp. 576-589, DOI: https://doi.org/10.1016/j.clsr.2013.07.014

Location and Tracking of Mobile Devices

Location and Tracking of Mobile Devices: Überveillance Stalks the Streets

Review Version of 7 October 2012

Published in Computer Law & Security Review 29, 3 (June 2013) 216-228

Katina Michael and Roger Clarke **

© Katina Michael and Xamax Consultancy Pty Ltd, 2012

Available under an AEShareNet licence or a Creative Commons licence.

This document is at http://www.rogerclarke.com/DV/LTMD.html

Abstract

During the last decade, location-tracking and monitoring applications have proliferated, in mobile cellular and wireless data networks, and through self-reporting by applications running in smartphones that are equipped with onboard global positioning system (GPS) chipsets. It is now possible to locate a smartphone user not merely to a cell, but to a small area within it. Innovators have been quick to capitalise on these location-based technologies for commercial purposes, and have gained access to a great deal of sensitive personal data in the process. In addition, law enforcement agencies utilise these technologies; because they can do so inexpensively, they can track many more people. Moreover, these agencies seek the power to conduct tracking covertly, and without a judicial warrant. This article investigates the dimensions of the problem of people-tracking through the devices that they carry. Location surveillance has very serious negative implications for individuals, yet there are very limited safeguards. It is incumbent on legislatures to address these problems, through both domestic laws and multilateral processes.

1. Introduction

Personal electronic devices travel with people, are worn by them, and are, or soon will be, inside them. Those devices are increasingly capable of being located, and, by recording the succession of locations, tracked. This creates a variety of opportunities for the people concerned. It also gives rise to a wide range of opportunities for organisations, at least some of which are detrimental to the person's interests.

Commonly, the focus of discussion of this topic falls on mobile phones and tablets. It is intrinsic to the network technologies on which those devices depend that the network operator has at least some knowledge of the location of each handset. In addition, many such devices have onboard global positioning system (GPS) chipsets, and self-report their coordinates to service-providers. The scope of this paper encompasses those already-well-known forms of location and tracking, but it extends beyond them.

The paper begins by outlining the various technologies that enable location and tracking, and identifies those technologies' key attributes. The many forms of surveillance are then reviewed, in order to establish a framework within which applications of location and tracking can be characterised. Applications are described, and their implications summarised. Controls are considered, whereby potential harm to the interests of individuals can be prevented or mitigated.

2. Relevant Technologies

The technologies considered here involve a device that has the following characteristics:

  • it is conveniently portable by a human, and
  • it emits signals that:
    • enable some other device to compute the location of the device (and hence of the person), and
    • are sufficiently distinctive that the device is reliably identifiable at least among those in the vicinity, and hence the device's (and hence the person's) successive locations can be detected, and combined into a trail
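
As a rough illustration of how such sightings can be combined into a trail, the following Python sketch groups timestamped observations by a device identifier and orders them chronologically. It is a minimal sketch under assumed field names (device_id, timestamp, latitude, longitude), not a description of any network operator's actual systems.

    from collections import defaultdict

    # Minimal sketch: sightings of a distinctive signal, keyed by a device
    # identifier, are combined into a time-ordered trail of locations.
    # The tuple layout (device_id, unix_seconds, lat, lon) is an assumption.

    def build_trails(sightings):
        """Group (device_id, unix_seconds, lat, lon) sightings into per-device trails."""
        trails = defaultdict(list)
        for device_id, ts, lat, lon in sightings:
            trails[device_id].append((ts, lat, lon))
        for trail in trails.values():
            trail.sort()  # chronological order
        return dict(trails)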

The primary form-factors for mobile devices are currently clam-shape (portable PCs), thin rectangles suitable for the hand (mobile phones), and flat forms (tablets). Many other form-factors are also relevant, however. Anklets imposed on dangerous prisoners, and even as conditions of bail, carry RFID tags. Chips are carried in cards of various sizes, particularly the size of credit-cards, and used for tickets for public transport and entertainment venues, aircraft boarding-passes, toll-road payments and in some countries to carry electronic cash. Chips may conduct transactions with other devices by contact-based means, or contactless, using radio-frequency identification (RFID) or its shorter-range version near-field communication (NFC) technologies. These capabilities are in credit and debit cards in many countries. Transactions may occur with the cardholder's knowledge, with their express consent, and with an authentication step to achieve confidence that the person using the card is authorised to do so. In a variety of circumstances, however, some and even all of those safeguards are dispensed with. The electronic versions of passports that are commonly now being issued carry such a chip, and have an autonomous communications capability. The widespread issue of cards with capabilities uncontrolled by, and in many cases unknown to, the cardholder, is causing consternation among segments of the population that have become aware of the schemes.

Such chips can be readily carried in other forms, including jewellery such as finger-rings, and belt-buckles. Endo-prostheses such as replacement hips and knees and heart pacemakers can readily carry chips. A few people have voluntarily embedded chips directly into their bodies for such purposes as automated entry to premises (Michael & Michael 2009).

In order to locate and track such devices, any sufficiently distinctive signals may in principle suffice. See Raper et al. (2007a) and Mautz (2011). In practice, the signals involved are commonly those transmitted by a device in order to take advantage of wireless telecommunications networks. The scope of the relevant technologies therefore also encompasses the signals, devices that detect the signals, and the networks over which the data that the signals contain are transmitted.

In wireless networks, it is generally the case that the base station or router needs to be aware of the identities of devices that are currently within the cell. A key reason for this is to conserve limited transmission capacity by sending messages only when the targeted device is known to be in the cell. This applies to all of:

  • cellular mobile originally designed for voice telephony and extended to data (in particular those using the '3G' standards GSM/GPRS, CDMA2000 and UMTS/HSPA and the '4G' standard LTE)
  • wireless local area networks (WLANs, commonly Wifi / IEEE 802.11x - RE 2010a)
  • wireless wide area networks (WWANs, commonly WiMAX / IEEE 802.16x - RE 2010b).

Devices in such networks are uniquely identified by various means (Clarke & Wigan 2011). In cellular networks, there is generally a clear distinction between the entity (the handset) and the identity it is adopting at any given time (which is determined by the module inserted in it). Depending on the particular standards used, what is commonly referred to as 'the SIM-card' is an R-UIM, a CSIM or a USIM. These modules store an International Mobile Subscriber Identity (IMSI), which constitutes the handset's identifier. Among other things, this enables network operators to determine whether or not to provide service, and what tariff to apply to the traffic. However, cellular network protocols may also involve transmission of a code that distinguishes the handset itself, within which the module is currently inserted. A useful generic term for this is the device 'entifier' (Clarke 2009b). Under the various standards, it may be referred to as an International Mobile Equipment Identity (IMEI), ESN, or MEID.
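
The distinction drawn above between the subscriber identity held in the module (the IMSI) and the handset's entifier (the IMEI) can be made concrete with a short, hedged Python sketch. The IMSI layout (a 3-digit mobile country code, a 2- or 3-digit mobile network code, and the remaining subscriber number) and the Luhn check digit on the IMEI follow the relevant standards; the sample values below are hypothetical, and a real implementation would need a lookup table to resolve the MNC length for each country.

    # Minimal sketch: an IMSI identifies a subscription, an IMEI identifies the
    # handset itself. MNC length depends on the country (MCC); without a lookup
    # table, both possible splits are reported.

    def split_imsi(imsi: str):
        """Split a 15-digit IMSI into MCC and candidate MNC/MSIN pairs."""
        assert imsi.isdigit() and len(imsi) == 15
        return {
            "mcc": imsi[:3],
            "mnc_2digit": (imsi[3:5], imsi[5:]),   # (MNC, MSIN) if the MNC is 2 digits
            "mnc_3digit": (imsi[3:6], imsi[6:]),   # (MNC, MSIN) if the MNC is 3 digits
        }

    def imei_check_digit_ok(imei: str) -> bool:
        """Validate the 15th (Luhn) check digit of an IMEI."""
        assert imei.isdigit() and len(imei) == 15
        total = 0
        for i, ch in enumerate(imei[:14]):
            d = int(ch)
            if i % 2 == 1:       # double every second digit
                d *= 2
                if d > 9:
                    d -= 9
            total += d
        return (10 - total % 10) % 10 == int(imei[14])

    if __name__ == "__main__":
        print(split_imsi("505013412345678"))           # hypothetical IMSI (MCC 505, Australia)
        print(imei_check_digit_ok("490154203237518"))  # widely used sample IMEI -> True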

In Wifi and WiMAX networks, the device entifier may be a processor-id or, more commonly, a network interface card identifier (NIC Id). In various circumstances, other device-identifiers may be used, such as a phone-number, or an IP-address may be used as a proxy. In addition, the human using the device may be directly identified, e.g. by means of a user account-name.

A WWAN cell may cover a large area, indicatively of a 50km radius. Telephony cells may have a radius as large as 2-3 km or as little as a hundred metres. WLANs using Wifi technologies have a cell-size of less than 1 hectare, indicatively 50-100 metres radius, but in practice often constrained by environmental factors to only 10-30 metres.

The base-station or router knows the identities of devices that are within its cell, because this is a technically necessary feature of the cell's operation. Mobile devices auto-report their presence 10 times per second. Meanwhile, the locations of base-stations for cellular services are known with considerable accuracy by the telecommunications providers. And, in the case of most private Wifi services, the location of the router is mapped to c. 30-100 metre accuracy by services such as Skyhook and Google Locations, which perform what have been dubbed 'war drives' in order to maintain their databases - in Google's case in probable violation of the telecommunications interception and/or privacy laws of at least a dozen countries (EPIC 2012).

Knowing that a device is within a particular mobile phone, WiMAX or Wifi cell provides only a rough indication of location. In order to generate a more precise estimate within a cell, several techniques are used (McGuire et al. 2005). These include the following (adapted from Clarke & Wigan 2011; see also Figueiras & Frattasi 2010), with a brief numerical sketch after the list:

  • directional analysis. A single base-station may comprise multiple receivers at known locations and pointed in known directions, enabling the handset's location within the cell to be reduced to a sector within the cell, and possibly a narrow one, although without information about the distance along the sector;
  • triangulation. This involves multiple base-stations serving a single cell, at known locations some distance apart, and each with directional analysis capabilities. Particularly with three or more stations, this enables an inference that the device's location is within a small area at the intersection of the multiple directional plots;
  • signal analysis. This involves analysis of the characteristics of the signals exchanged between the handset and base-station, in order to infer the distance between them. Relevant signal characteristics include the apparent response-delay (Time Difference of Arrival - TDOA, also referred to as multilateration), and strength (Received Signal Strength Indicator - RSSI), perhaps supplemented by direction (Angle Of Arrival - AOA).
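
The following Python sketch illustrates the signal-analysis and triangulation ideas listed above: an assumed log-distance path-loss model converts received signal strength into a range estimate, and a linear least-squares step combines ranges from three stations into a position. The path-loss exponent, the 1-metre reference power, the station coordinates and the RSSI readings are illustrative assumptions, not parameters of any deployed network.

    import numpy as np

    # Minimal sketch of two techniques named above:
    #  (1) range from received signal strength via a log-distance path-loss model
    #  (2) position from three or more stations by linear least squares
    # The reference power P0 and path-loss exponent N are assumed values.

    def rssi_to_distance(rssi_dbm, p0_dbm=-40.0, n=2.7):
        """Invert the model rssi = p0 - 10*n*log10(d), with d in metres."""
        return 10 ** ((p0_dbm - rssi_dbm) / (10 * n))

    def trilaterate(stations, distances):
        """Least-squares position from station (x, y) coordinates and ranges (metres)."""
        stations = np.asarray(stations, dtype=float)
        d = np.asarray(distances, dtype=float)
        x0, y0 = stations[0]
        # Subtracting the first circle equation from the others gives a linear system.
        A = 2 * (stations[1:] - stations[0])
        b = (d[0] ** 2 - d[1:] ** 2
             + np.sum(stations[1:] ** 2, axis=1) - (x0 ** 2 + y0 ** 2))
        pos, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pos

    if __name__ == "__main__":
        stations = [(0, 0), (50, 0), (0, 50)]     # hypothetical stations, metres apart
        rssi = [-77.7, -81.2, -83.3]              # synthetic signal-strength readings
        ranges = [rssi_to_distance(r) for r in rssi]
        print("estimated ranges (m):", [round(r, 1) for r in ranges])
        print("estimated position (m):", trilaterate(stations, ranges))  # ~ (20, 15)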

The precision and reliability of these techniques varies greatly, depending on the circumstances prevailing at the time. The variability and unpredictability result in many mutually inconsistent statements by suppliers, in the general media, and even in the technical literature.

Techniques for cellular networks generally provide reasonably reliable estimates of location to within an indicative 50-100m in urban areas and some hundreds of metres elsewhere. Worse performance has been reported in some field-tests, however. For example, Dahunsi & Dwolatzky (2012) found the accuracy of GSM location in Johannesburg to be in the range 200-1400m, and highly variable, with "a huge difference between the predicted and provided accuracies by mobile location providers".

The web-site of the Skyhook Wifi-router positioning service claims 10-metre accuracy, 1-second time-to-first-fix and 99.8% reliability (SHW 2012). On the other hand, tests have resulted in far lower accuracy measures, including an average positional error of 63m in Sydney (Gallagher et al. 2009) and "median values for positional accuracy in [Las Vegas, Miami and San Diego, which] ranged from 43 to 92 metres ... [and] the replicability ... was relatively poor" (Zandbergen 2012, p. 35). Nonetheless, a recent research article suggested the feasibility of "uncooperatively and covertly detecting people 'through the wall' [by means of their WiFi transmissions]" (Chetty et al. 2012).

Another way in which a device's location may become known to other devices is through self-reporting of the device's position, most commonly by means of an inbuilt Global Positioning System (GPS) chip-set. This provides coordinates and altitude based on broadcast signals received from a network of satellites. In any particular instance, the user of the device may or may not be aware that location is being disclosed.

Despite widespread enthusiasm and a moderate level of use, GPS is subject to a number of important limitations. The signals are subject to interference from atmospheric conditions, buildings and trees, and the time to achieve a fix on enough satellites and deliver a location measure may be long. This results in variability in its practical usefulness in different circumstances, and in its accuracy and reliability. Civil-use GPS coordinates are claimed to provide accuracy within a theoretical 7.8m at a 95% confidence level (USGov 2012), but various reports suggest 15m, 20m or 30m, and sometimes 100m. It may be affected by radio interference and jamming. The original and still-dominant GPS service operated by the US Government was subject to intentional degradation in the US's national interests. This 'Selective Availability' feature still exists, although subject to a decade-long policy not to use it; and future generations of GPS satellites may no longer support it.

Hybrid schemes exist that use two or more sources in order to generate more accurate location-estimates, or to generate estimates more quickly. In particular, Assisted GPS (A-GPS) utilises data from terrestrial servers accessed over cellular networks in order to more efficiently process satellite-derived data (e.g. RE 2012).

Further categories of location and tracking technologies emerge from time to time. A current example uses means described by the present authors as 'mobile device signatures' (MDS). A device may monitor the signals emanating from a user's mobile device, without being part of the network that the user's device is communicating with. The eavesdropping device may detect particular signal characteristics that distinguish the user's mobile device from others in the vicinity. In addition, it may apply any of the various techniques mentioned above, in order to locate the device. If the signal characteristics are persistent, the eavesdropping device can track the user's mobile device, and hence the person carrying it. No formal literature on MDS has yet been located. The supplier's brief description is at PI (2010b).

The various technologies described in this section are capable of being applied to many purposes. The focus in this paper is on their application to surveillance.

3. Surveillance

The term surveillance refers to the systematic investigation or monitoring of the actions or communications of one or more persons (Clarke 2009c). Until recent times, surveillance was visual, and depended on physical proximity of an observer to the observed. The volume of surveillance conducted was kept in check by the costs involved. Surveillance aids and enhancements emerged, such as binoculars and, later, directional microphones. During the 19th century, the post was intercepted, and telephones were tapped. During the 20th century, cameras enabled transmission of image, video and sound to remote locations, and recording for future use (e.g. Parenti 2003).

With the surge in stored personal data that accompanied the application of computing to administration in the 1970s and 1980s, dataveillance emerged (Clarke 1988). Monitoring people through their digital personae rather than through physical observation of their behaviour is much more economical, and hence many more people can be subjected to it (Clarke 1994). The dataveillance epidemic made it more important than ever to clearly distinguish between personal surveillance - of an identified person who has previously come to attention - and mass surveillance - of many people, not necessarily previously identified, about some or all of whom suspicion could be generated.

Location data is of a very particular nature, and hence it has become necessary to distinguish location surveillance as a sub-set of the general category of dataveillance. There are several categories of location surveillance with different characteristics (Clarke & Wigan 2011):

  • capture of an individual's location at a point in time. Depending on the context, this may support inferences being drawn about an individual's behaviour, purpose, intention and associates
  • real-time monitoring of a succession of locations and hence of the person's direction of movement. This is far richer data, and supports much more confident inferences being drawn about an individual's behaviour, purpose, intention and associates
  • predictive tracking, by extrapolation from the person's direction of movement, enabling inferences to be drawn about near-future behaviour, purpose, intention and associates (a minimal sketch of such extrapolation follows the list)
  • retrospective tracking, on the basis of the data trail of the person's movements, enabling reconstruction of a person's behaviour, purpose, intention and associates at previous times
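
The predictive-tracking category can be illustrated with a minimal Python sketch that performs a constant-velocity extrapolation from two recent fixes. The flat-earth conversion between degrees and metres is a simplification that is only reasonable over the short distances involved in near-future prediction, and the coordinates used are hypothetical.

    import math
    from datetime import datetime, timedelta

    # Minimal sketch of 'predictive tracking': constant-velocity extrapolation
    # from two recent timestamped fixes, using a local flat-earth approximation.

    def extrapolate(fix_a, fix_b, seconds_ahead):
        """Each fix is (datetime, lat_deg, lon_deg); returns (lat, lon, speed_m_s)."""
        (t1, lat1, lon1), (t2, lat2, lon2) = fix_a, fix_b
        dt = (t2 - t1).total_seconds()
        m_per_deg_lat = 111_320.0                             # approximate
        m_per_deg_lon = 111_320.0 * math.cos(math.radians(lat2))
        v_north = (lat2 - lat1) * m_per_deg_lat / dt          # metres per second
        v_east = (lon2 - lon1) * m_per_deg_lon / dt
        lat_pred = lat2 + (v_north * seconds_ahead) / m_per_deg_lat
        lon_pred = lon2 + (v_east * seconds_ahead) / m_per_deg_lon
        return lat_pred, lon_pred, math.hypot(v_north, v_east)

    if __name__ == "__main__":
        t0 = datetime(2012, 10, 7, 9, 0, 0)
        a = (t0, -34.4054, 150.8784)                  # hypothetical fixes, walking pace
        b = (t0 + timedelta(seconds=60), -34.4049, 150.8791)
        print(extrapolate(a, b, seconds_ahead=120))   # predicted position two minutes on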

Information arising at different times, and from different forms of surveillance, can be combined, in order to offer a more complete picture of a person's activities, and enable yet more inferences to be drawn, and suspicions generated. This is the primary sense in which the term 'überveillance' is applied: "Überveillance has to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought). Überveillance can be a predictive mechanism for a person's expected behaviour, traits, likes, or dislikes; or it can be based on historical fact; or it can be something in between ... Überveillance is more than closed circuit television feeds, or cross-agency databases linked to national identity cards, or biometrics and ePassports used for international travel. Überveillance is the sum total of all these types of surveillance and the deliberate integration of an individual's personal data for the continuous tracking and monitoring of identity and location in real time" (Michael & Michael 2010. See also Michael & Michael 2007, Michael et al. 2008, Michael et al. 2010, Clarke 2010).

A comprehensive model of surveillance includes consideration of geographical scope, and of temporal scope. Such a model assists the analyst in answering key questions about surveillance: of what? for whom? by whom? why? how? where? and when? (Clarke 2009c). Distinctions are also needed based on the extent to which the subject has knowledge of surveillance activities. It may be overt or covert. If covert, it may be merely unnotified, or alternatively express measures may be undertaken in order to obfuscate, and achieve secrecy. A further element is the notion of 'sousveillance', whereby the tools of surveillance are applied, by those who are commonly watched, against those who are commonly the watchers (Mann et al. 2003).

These notions are applied in the following sections in order to establish the extent to which location and tracking of mobile devices is changing the game of surveillance, and to demonstrate that location surveillance is intruding more deeply into personal freedoms than previous forms of surveillance.

4. Applications

This section presents a typology of applications of mobile device location, as a means of narrowing down to the kinds of uses that have particularly serious privacy implications. These are commonly referred to as location-based services (LBS). One category of applications provides information services for the benefit of the mobile device's user, such as navigation aids, and search and discovery tools for the locations of particular, identified organisations and of organisations that sell particular goods and services. Users of LBS of these kinds can be reasonably assumed to be aware that they are disclosing their location. Depending on the design, the disclosures may also be limited to specific service-providers and specific purposes, and the transmissions may be secured.

Another, very different category of application is use by law enforcement agencies (LEAs). The US E-911 mandate of 1999 was nominally a public safety measure, to enable people needing emergency assistance to be quickly and efficiently located. In practice, the facility also delivered LEAs means for locating and tracking people of interest, through their mobile devices. Personal surveillance may be justified by reasonable grounds for suspicion that the subject is involved in serious crime, and may be specifically authorised by judicial warrant. Many countries have always been very loose in their control over LEAs, however, and many others have drastically weakened their controls since 2001. Hence, in any given jurisdiction and context, each and all of the controls may be lacking.

Yet worse, LEAs use mobile location and tracking for mass surveillance, without any specific grounds for suspicion about any of the many people caught up in what is essentially a dragnet-fishing operation (e.g. Mery 2009). Examples might include monitoring the area adjacent to a meeting-venue watching out for a blacklist of device-identifiers known to have been associated with activists in the past, or collecting device-identifiers for use on future occasions. In addition to netting the kinds of individuals who are of legitimate interest, the 'by-catch' inevitably includes threatened species. There are already extraordinarily wide-ranging (and to a considerable extent uncontrolled) data retention requirements in many countries.

Of further concern is the use of Automated Number Plate Recognition (ANPR) for mass surveillance purposes. This has been out of control in the UK since 2006, and has been proposed or attempted in various other countries as well (Clarke 2009a). Traffic surveillance is expressly used not only for retrospective analysis of the movements of individuals of interest to LEAs, but also as a means of generating suspicions about other people (Lewis 2008).

Beyond LEAs, many government agencies perform social control functions, and may be tempted to conduct location and tracking surveillance. Examples would include benefits-paying organisations tracking the movements of benefits-recipients about whom suspicions have arisen. It is not too far-fetched to anticipate zealous public servants concerned about fraud control imposing location surveillance on all recipients of some particularly valuable benefit, or as a security precaution on every person visiting a sensitive area (e.g. a prison, a power plant, a national park).

Various forms of social control are also exercised by private sector organisations. Some of these organisations, such as placement services for the unemployed, may be performing outsourced public sector functions. Others, such as workers' compensation providers, may be seeking to control personal insurance claimants, and similarly car-hire companies and insurance providers may wish to monitor motor vehicles' distance driven and roads used (Economist 2012).

A further privacy-invasive practice that is already common is the acquisition of location and tracking data by marketing corporations, as a by-product of the provision of location-based services, but with the data then applied to further purposes other than that for which it was intended. Some uses rely on statistical analysis of large holdings ('data mining'). Many uses are, on the other hand, very specific to the individual, and are for such purposes as direct or indirect targeting of advertisements and the sale of goods and services. Some of these applications combine location data with data from other sources, such as consumer profiling agencies, in order to build up such a substantial digital persona that the individual's behaviour is readily influenced. This takes the activity into the realms of überveillance.

All such services raise serious privacy concerns, because the data is intensive and sensitive, and attractive to organisations. Companies may gain rights in relation to the data through market power, or by trickery - such as exploitation of a self-granted right to change the Terms of Service (Clarke 2011). Once captured, the data may be re-purposed by any organisation that gains access to it, because the value is high enough that they may judge the trivial penalties that generally apply to breaches of privacy laws to be well worth the risk.

A recently-emerged, privacy-invasive practice is the application of the mobile device signature (MDS) form of tracking, in such locations as supermarkets. This is claimed by its providers to offer deep observational insights into the behaviour of customers, including dwell-times in front of displays, possibly linked with the purchaser's behaviour. This raises concerns a little different from other categories of location and tracking technologies, and is accordingly considered in greater depth in the following section.

It is noteworthy that an early review identified a wide range of LBS, which the authors classified into mobile guides, transport, gaming, assistive technology and location-based health (Raper et al. 2007b). Yet that work completely failed to notice that a vast array of applications were emergent in surveillance, law enforcement and national security, despite the existence of relevant literature from at least 1999 onwards (Clarke 2001, Michael & Masters 2006).

5. Implications

The previous sections have introduced many examples of risks to citizens and consumers arising from location surveillance. This section presents an analysis of the categories and of the degree of seriousness with which they should be viewed. The first topic addressed is the privacy of personal location data. Other dimensions of privacy are then considered, and then the specific case of MDS is examined. The treatment here is complementary to earlier articles that have looked more generally at particular applications such as location-based mobile advertising, e.g. Cleff (2007, 2010) and King & Jessen (2010). See also Art. 29 (2011).

5.1 Locational Privacy

Knowing where someone has been, knowing what they are doing right now, and being able to predict where they might go next is a powerful tool for social control and for chilling behaviour (Abbas 2011). Humans do not move around in a random manner (Song et al. 2010).

One interpretation of 'locational privacy' is that it "is the ability of an individual to move in public space with the expectation that under normal circumstances their location will not be systematically and secretly recorded for later use" (Blumberg & Eckersley 2009). A more concise definition is "the ability to control the extent to which personal location information is ... [accessible and] used by others" (van Loenen et al. 2009). Hence 'tracking privacy' is the interest an individual has in controlling information about their sequence of locations.

Location surveillance is deeply intrusive into data privacy, because it is very rich, and enables a great many inferences to be drawn (Clarke 2001, Dobson & Fisher 2003, Michael et al. 2006a, Clarke & Wigan 2011). As demonstrated by Raper et al. (2007a, pp. 32-33), most of the technical literature that considers privacy is merely concerned about it as an impediment to deployment and adoption, and how to overcome the barrier rather than how to solve the problem. Few authors adopt a positive approach to privacy-protective location technologies. The same authors' review of applications (Raper et al. 2007b) includes a single mention of privacy, and that is in relation to just one of the scores of sub-categories of application that they catalogue.

Most service-providers are cavalier in their handling of personal data, and extravagant in their claims. For example, Skyhook claims that it "respects the privacy of all users, customers, employees and partners"; but, significantly, it makes no mention of the privacy of the people whose locations, through the locations of their Wifi routers, it collects and stores (Skyhook 2012).

Consent is critical in such LBS as personal location chronicle systems, people-followers and footpath route-tracker systems that systematically collect personal location information from a device they are carrying (Collier 2011c). The data handled by such applications is highly sensitive because it can be used to conduct behavioural profiling of individuals in particular settings. The sensitivity exists even if the individuals remain 'nameless', i.e. if each identifier is a temporary or pseudo-identifier and is not linked to other records. Service-providers, and any other organisations that gain access to the data, achieve the capacity to make judgements on individuals based on their choices of, for example, which retail stores they walk into and which they do not. For example, if a subscriber visits a particular religious bookstore within a shopping mall on a weekly basis, the assumption can be reasonably made that they are in some way affiliated to that religion (Samuel 2008).

It is frequently asserted that individuals cannot have a reasonable expectation of privacy in a public space. Contrary to those assertions, however, privacy expectations always have existed in public places, and continue to exist (VLRC 2010). Tracking the movements of people as they go about their business is a breach of a fundamental expectation that people will be 'let alone'. In policing, for example, in most democratic countries, it is against the law to covertly track an individual or their vehicle without specific, prior approval in the form of a warrant. This principle has, however, been compromised in many countries since 2001. Warrantless tracking using a mobile device generally results in the evidence, which has been obtained without the proper authority, being inadmissible in a court of law (Samuel 2008). Some law enforcement agencies have argued for the abolition of the warrant process because the bureaucracy involved may mean that the suspect cannot be prosecuted for a crime they have likely committed (Ganz 2005). These issues are not new; but far from eliminating a warrant process, the appropriate response is to invest the energy in streamlining this process (Bronitt 2010).

Privacy risks arise not only from locational data of high integrity, but also from data that is or becomes associated with a person and that is inaccurate, misleading, or wrongly attributed to that individual. High levels of inaccuracy and unreliability were noted above in respect of all forms of location and tracking technologies. In the case of MDS services, claims have been made of one-to-two metre locational accuracy. This has yet to be supported by experimental test cases, however, and hence there is uncertainty about the reliability of inferences that the service-provider or the shop-owner draw. If the data is the subject of a warrant or subpoena, the data's inaccuracy could result in false accusations and even a miscarriage of justice, with the 'wrong person' finding themselves in the 'right place' at the 'right time'.

5.2 Privacy More Broadly

Privacy has multiple dimensions. One analysis, in Clarke (2006a), identifies four distinct aspects. Privacy of Personal Data, variously also 'data privacy' and 'information privacy', is the most widely-discussed dimension of the four. Individuals claim that data about themselves should not be automatically available to other individuals and organisations, and that, even where data is possessed by another party, the individual must be able to exercise a substantial degree of control over that data and its use. The last five decades have seen the application of information technologies to a vast array of abuses of data privacy. The degree of privacy-intrusiveness is a function of both the intensity and the richness of the data. Where multiple sources are combined, the impact is particularly likely to chill behaviour. An example is the correlation of video-feeds with mobile device tracking. The previous sub-section addressed that dimension.

Privacy of the Person, or 'bodily privacy', extends from freedom from torture and right to medical treatment, via compulsory immunisation and imposed treatments, to compulsory provision of samples of body fluids and body tissue, and obligations to submit to biometric measurement. Locational surveillance gives rise to concerns about personal safety. Physical privacy is directly threatened where a person who wishes to inflict harm is able to infer the present or near-future location of their target. Dramatic examples include assassins, kidnappers, 'standover merchants' and extortionists. But even people who are neither celebrities nor notorieties are subject to stalking and harassment (Fusco et al. 2012).

Privacy of Personal Communications is concerned with the need of individuals for freedom to communicate among themselves, without routine monitoring of their communications by other persons or organisations. Issues include 'mail covers', the use of directional microphones, 'bugs' and telephonic interception, with or without recording apparatus, and third-party access to email-messages. Locational surveillance thereby creates new threats to communications privacy. For example, the equivalent of 'call records' can be generated by combining the locations of two device-identifiers in order to infer that a face-to-face conversation occurred.
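
A minimal Python sketch of the inference described above: given the location trails of two device-identifiers, flag the moments at which the trails place their carriers close together in both space and time, from which a face-to-face meeting might be inferred. The 25-metre and 5-minute thresholds are arbitrary assumptions made for illustration only.

    import math

    # Minimal sketch: detect moments at which two device trails are close enough,
    # in space and time, that a face-to-face meeting might be inferred.

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres."""
        r = 6_371_000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def co_location_events(trail_a, trail_b, max_metres=25.0, max_seconds=300):
        """Trails are lists of (unix_seconds, lat, lon); returns matched timestamp pairs."""
        events = []
        for ta, lat_a, lon_a in trail_a:
            for tb, lat_b, lon_b in trail_b:
                close_in_time = abs(ta - tb) <= max_seconds
                if close_in_time and haversine_m(lat_a, lon_a, lat_b, lon_b) <= max_metres:
                    events.append((ta, tb))
        return events

    if __name__ == "__main__":
        trail_a = [(1349600000, -34.4054, 150.8784), (1349600300, -34.4050, 150.8790)]
        trail_b = [(1349600290, -34.4051, 150.8791)]
        print(co_location_events(trail_a, trail_b))   # one probable co-location event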

Privacy of Personal Behaviour encompasses 'media privacy', but particular concern arises in relation to sensitive matters such as sexual preferences and habits, political activities and religious practices. Some privacy analyses, particularly in Europe, extend this discussion to personal autonomy, liberty and the right of self-determination (e.g. King & Jessen 2010). The notion of 'private space' is vital to economic and social aspects of behaviour, is relevant in 'private places' such as the home and toilet cubicles, but is also relevant and important in 'public places', where systematic observation and the recording of images and sounds are far more intrusive than casual observation by the few people in the vicinity.

Locational surveillance gives rise to rich sets of data about individuals' activities. The knowledge, or even suspicion, that such surveillance is undertaken, chills their behaviour. The chilling factor is vital in the case of political behaviour (Clarke 2008). It is also of consequence in economic behaviour, because the inventors and innovators on whom new developments depend are commonly 'different-thinkers' and even 'deviants', who are liable to come to attention in mass surveillance dragnets, with the tendency to chill their behaviour, their interactions and their creativity.

Surveillance that generates accurate data is one form of threat. Surveillance that generates inaccurate data, or wrongly associates data with a particular person, is dangerous as well. Many inferences that arise from inaccurate data will be wrong, of course, but that won't prevent those inferences being drawn, resulting in unjustified behavioural privacy invasiveness, including unjustified association with people who are, perhaps for perfectly good reasons, themselves under suspicion.

In short, all dimensions of privacy are seriously affected by location surveillance. For deeper treatments of the topic, see Michael et al. (2006b) and Clarke & Wigan (2011).

5.3 Locational Privacy and MDS

The recent innovation of tracking by means of mobile device signatures (MDS) gives rise to some issues additional to, or different from, mainstream device-location technologies. This section accordingly considers this particular technique's implications in greater depth. Limited reliable information is currently available, and the analysis is of necessity based on supplier-published sources (PI 2010a, 2010b) and media reports (Collier 2010a, 2010b, 2010c).

A company called Path Intelligence (PI) markets an MDS service to shopping mall-owners, to enable them to better value their floorspace in terms of rental revenues, and to identify points of on-foot traffic congestion to on-sell physical advertising and marketing floorspace (PI 2010a). The company claims to detect each phone (and hence person) that enters a zone, and to capture data, including:

  • how long each device and person stays, including dwell times in front of shop windows (see the sketch after this list);
  • repeat visits by shoppers in varying frequency durations; and
  • typical route and circuit paths taken by shoppers as they go from shop to shop during a given shopping experience.
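
As a rough indication of how dwell times of the kind listed above could be derived from captured signals, the following Python sketch aggregates a stream of (pseudo-identifier, zone, timestamp) observations into per-visit dwell durations. The data layout and the 30-minute 'new visit' gap are assumptions made purely for illustration; they are not a description of Path Intelligence's proprietary system.

    from collections import defaultdict

    # Minimal sketch: derive per-visit dwell times from a stream of
    # (pseudo_id, zone, unix_seconds) observations. A device absent from a zone
    # for more than NEW_VISIT_GAP seconds is treated as starting a new visit.

    NEW_VISIT_GAP = 30 * 60

    def dwell_times(observations):
        """Return {(pseudo_id, zone): [dwell_seconds_per_visit, ...]}."""
        observations = sorted(observations, key=lambda o: (o[0], o[2]))
        visits = defaultdict(list)
        open_visit = {}   # (pseudo_id, zone) -> (visit_start, last_seen)
        for pseudo_id, zone, ts in observations:
            key = (pseudo_id, zone)
            if key in open_visit and ts - open_visit[key][1] <= NEW_VISIT_GAP:
                open_visit[key] = (open_visit[key][0], ts)     # extend current visit
            else:
                if key in open_visit:                          # close the previous visit
                    start, end = open_visit[key]
                    visits[key].append(end - start)
                open_visit[key] = (ts, ts)                     # start a new visit
        for key, (start, end) in open_visit.items():           # flush open visits
            visits[key].append(end - start)
        return dict(visits)

    if __name__ == "__main__":
        obs = [("abc123", "zone-7", 0), ("abc123", "zone-7", 240),
               ("abc123", "zone-7", 6000)]       # returns after a long absence
        print(dwell_times(obs))                  # {('abc123', 'zone-7'): [240, 0]}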

For malls, PI is able to denote such things as whether or not shoppers who shop at one establishment will also shop at another in the same mall, and whether or not people will go out of their way to visit a particular retail outlet independent of its location. For retailers, PI says it is able to provide information on conversion rates by department or even product line, and even which areas of the store might require more attention by staff during specific times of the day or week (PI 2012).

PI says that it uses "complex algorithms" to denote the geographic position of a mobile, using strategically located "proprietary equipment" in a campus setting (PI 2010a). The company states that it is conducting "data-driven analysis", but is not collecting, or at least is not disclosing, any personal information such as a name, mobile telephone number or contents of a short message service (SMS). It states that it only ever provides aggregated data at varying zone levels to the shopping mall-owners. This is presumably justified on the basis that, using MDS techniques, direct identifiers are unlikely to be available, and a pseudo-identifier needs to be assigned. There is no explicit definition of what constitutes a zone. It is clear, however, that minimally-aggregated data at the highest geographic resolution is available for purchase, and at a higher price than more highly-aggregated data.

Shoppers have no relationship with the company, and it appears unlikely that they would even be aware that data about them is being collected and used. The only disclosure appears to be that "at each of our installations our equipment is clearly visible and labelled with our logo and website address" (PI 2010a), but this is unlikely to be visible to many people, and in any case would not inform anyone who saw it.

In short, the company is generating revenue by monitoring signals from the mobile devices of people who visit a shopping mall for the purchase of goods and services. The data collection is performed without the knowledge of the person concerned (Renegar et al. 2008). The company is covertly collecting personal data and exploiting it for profit. There is no incentive or value proposition for the individual whose mobile is being tracked. No clear statement is provided about collection, storage, retention, use and disclosure of the data (Arnold 2008). Even if privacy were not a human right, this would demand statutory intervention on the public policy grounds of commercial unfairness. The company asserts that "our privacy approach has been reviewed by the [US Federal Trade Commission] FTC, which determined that they are comfortable with our practices" (PI 2010a). It makes no claims of such 'approval' anywhere else in the world.

The service could be extended beyond a mall and the individual stores within it, to, for example, associated walkways and parking areas, and surrounding areas such as government offices, entertainment zones and shopping-strips. Applications can also be readily envisaged on hospital and university campuses, and in airports and other transport hubs. From prior research, this is likely to expose the individual's place of employment, and even their residence (Michael et al. 2006). Even if only aggregated data is sold to businesses, the individual records remain available to at least the service-provider.

The scope exists to combine this form of locational surveillance with video-surveillance such as in-store CCTV, and indeed this is claimed to be already a feature of the company's offering to retail stores. To the extent that a commonly-used identifier can be established (e.g. through association with the person's payment or loyalty card at a point-of-sale), the full battery of local and externally-acquired customer transaction histories and consolidated 'public records' data can be linked to in-store behaviour (Michael & Michael 2007). Longstanding visual surveillance is intersecting with well-established data surveillance, and being augmented by locational surveillance, broadening and deepening dataveillance into what is now being referred to by some as 'smart surveillance' (Wright et al. 2010, IBM 2011).
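The linkage step described above can be illustrated with a deliberately simplified sketch: an otherwise anonymous in-store track is tied to a persistent customer identifier when a point-of-sale transaction occurs at roughly the same time and place as a tracked device. The record formats, identifiers and matching window are invented for the example, and no claim is made that any particular supplier operates in this way.

    from datetime import datetime, timedelta

    # Anonymous track: sightings of one hashed device signature near tills.
    track = [("dev-7f3a", "till-3", datetime(2012, 5, 1, 10, 42))]

    # Point-of-sale log: loyalty-card transactions carrying a persistent customer id.
    transactions = [("loyalty-991", "till-3", datetime(2012, 5, 1, 10, 43))]

    def link(track, transactions, window=timedelta(minutes=2)):
        """Associate a device signature with a customer id when a transaction occurs
        at the same till within a short time window of a sighting of that device."""
        links = []
        for device, till, seen_at in track:
            for customer, pos_till, paid_at in transactions:
                if till == pos_till and abs(paid_at - seen_at) <= window:
                    links.append((device, customer))
        return links

    print(link(track, transactions))  # [('dev-7f3a', 'loyalty-991')]

Once such a link is made, every past and future sighting of the device signature can be attributed to the named customer.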

Surreptitious collection of personal data is (with exemptions and exceptions) largely against the law, even when undertaken by law enforcement personnel. The MDS mechanism also flies in the face of telephonic interception laws. How, then, can it be in any way acceptable for a form of warrantless tracking to be undertaken by or on behalf of corporations or mainstream government agencies, of shoppers in a mall, or travellers in an airport, or commuters in a transport hub? Why should a service-provider have the right to do what a law enforcement agency cannot normally do?

6. Controls

The tenor of the discussion to date has been that location surveillance harbours enormous threats to location privacy, but also to personal safety, the freedom to communicate, freedom of movement, and freedom of behaviour. This section examines the extent to which protections exist, firstly in the form of natural or intrinsic controls, and secondly in the form of legal provisions. The existing safeguards are found to be seriously inadequate, and it is therefore necessary to also examine the prospects for major enhancements to law, in order to achieve essential protections.

6.1 Intrinsic Controls

A variety of forms of safeguard exist against harmful technologies and unreasonable applications of them. The first is economic: costly technologies tend to be self-limiting. That intrinsic control has largely evaporated, however, partly because the tools use electronics and the components are produced in high volumes at low unit cost, and partly because the advertising and marketing sectors are highly sophisticated, already hold and exploit vast quantities of personal data, and are readily geared up to exploit yet more.

Neither the oxymoronic notion of 'business ethics' nor the personal morality of executives in business and government acts as any significant brake on the behaviours of corporations and governments: both are very weak barriers, and both are readily rationalised away in the face of claims of enhanced efficiencies in, for example, marketing communications, fraud control, criminal justice and control over anti-social behaviour.

A further category of intrinsic control is 'self-regulatory' arrangements within relevant industry sectors. In 2010, for example, the Australian Mobile Telecommunications Association (AMTA) released industry guidelines to promote the privacy of people using LBS on mobile devices (AMTA 2010). The guidelines were as follows:

  1. Every LBS must be provided on an opt-in basis with a specific request from a user for the service
  2. Every LBS must comply with all relevant privacy legislation
  3. Every LBS must be designed to guard against consumers being located without their knowledge
  4. Every LBS must allow consumers to maintain full control
  5. Every LBS must enable customers to control who uses their location information and when that is appropriate, and be able to stop or suspend a service easily should they wish

The second point is a matter for parliaments, privacy oversight agencies and law enforcement agencies, and its inclusion in industry guidelines is for-information-only. The remainder, meanwhile, are at best 'aspirational', and at worst mere window-dressing. Codes of this nature are simply ignored by industry members. They are primarily a means to hold off the imposition of actual regulatory measures. Occasional short-term constraints may arise from flurries of media attention, but the 'responsible' organisations escape by suggesting that bad behaviour was limited to a few 'cowboy' organisations or was a one-time error that won't be repeated.

A case study of industry self-regulation is provided by the Biometrics Code issued by the misleadingly-named Australian industry-and-users association, the Biometrics 'Institute' (BI 2004). During the period 2009-12, the privacy advocacy organisation, the Australian Privacy Foundation (APF), submitted to the Privacy Commissioner on multiple occasions that the Code failed to meet the stipulated requirements and, under the Commissioner's own Rules, had to be de-registered. The Code never had more than five subscribers (out of a base of well over 100 members - itself only a sub-set of the organisations active in the area), all of them small organisations or consultants, and had no signatories among the major biometrics vendors or users. In addition, none of the subscribers appears to have ever provided a link to the Code on their websites or in their Privacy Policy Statements (APF 2012).

The Commissioner finally ended the farce in April 2012, citing the "low numbers of subscribers", but avoided its responsibilities by permitting the 'Institute' to "request" revocation, over two years after the APF had made the same request (OAIC 2012). The case represents an object lesson in the vacuousness of self-regulation and the business-friendliness of a captive privacy oversight agency.

If economics, morality and industry-sector politics are inadequate, perhaps competition and organisational self-interest might work. In practice, however, repeated proposals that privacy is a strategic factor for corporations and government agencies have fallen on stony ground (Clarke 1996, 2006b).

The public can endeavour to exercise countervailing power against privacy-invasive practices. On the other hand, individuals acting alone are of little or no consequence to organisations that are intent on the application of location surveillance. Moreover, consumer organisations lack funding, professionalism and reach, and only occasionally attract sufficient media attention to force any meaningful responses from organisations deploying surveillance technologies.

Individuals may have direct surveillance countermeasures available to them, but relatively few people have the combination of motivation, technical competence and persistence to overcome lethargy and the natural human desire to believe that the institutions surrounding them are benign. In addition, some government agencies, corporations and (increasingly prevalent) public-private partnerships seek to deny anonymity, pseudonymity and multiple identities, and to impose so-called 'real name' policies, for example as a solution to the imagined epidemics of cyber-bullying, hate speech and child pornography. Individuals who use cryptography and other obfuscation techniques have to overcome the endeavours of business and government to stigmatise them as criminals with 'something to hide'.

6.2 Legal Controls

It is clear that natural or intrinsic controls have been utter failures in privacy matters generally, and will be in locational privacy matters as well. That leaves legal safeguards for personal freedoms as the sole protection. There are enormous differences among domestic laws relating to location surveillance. This section accordingly limits itself to generalities and examples.

Privacy laws are (with some qualifications, mainly in Europe) very weak instruments. Even where public servants and parliaments have an actual intention to protect privacy, rather than merely to overcome public concerns by passing placebo statutes, the draft Bills are countered by strong lobbying by government agencies and industry, to the extent that measures that were originally portrayed as being privacy-protective reach the statute books as authority for privacy breaches and surveillance (Clarke 2000).

Privacy laws, once passed, are continually eroded by exceptions built into subsequent legislation, and by technological capabilities that were not contemplated when the laws were passed. In most countries, location privacy has yet to be specifically addressed in legislation. Even where it is encompassed by human rights and privacy laws, the coverage is generally imprecise and ambiguous. More direct and specific regulation may exist, however. In Australia, for example, the Telecommunications (Interception and Access) Act and the Surveillance Devices Act define and criminalise inappropriate interception and access, use, communication and publication of location information that is obtained from mobile device traffic (AG 2005). On the other hand, when Google Inc. intercepted wi-fi signals and recorded the data that they contained, the Privacy Commissioner absolved the company (Riley 2010), and the Australian Federal Police refused to prosecute despite the action - whether it was intentional, 'inadvertent' or merely plausibly deniable - being a clear breach of the criminal law (Moses 2010).

The European Union determined a decade ago that location data that is identifiable to individuals is to some extent at least subject to existing data protection laws (EU 2002). However, the wording of that so-called 'e-Privacy Directive' countenances the collection of "location data which are more precise than is necessary for the transmission of communications", without clear controls over the justification, proportionality and transparency of that collection (para. 35). In addition, the e-Privacy Directive only applies to telecommunications service providers, not to other organisations that acquire location and tracking data. King & Jessen (2010) discuss various gaps in the protective regimes in Europe.

The EU's Advisory Body (essentially a Committee of European Data Protection Commissioners) has issued an Opinion that mobile location data is generally capable of being associated with a person, and hence is personal data, and hence is subject to the EU Directive of 1995 and national laws that implement that Directive (Art. 29 2011). Consent is considered to be generally necessary, and that consent must be informed, and sufficiently granular (pp. 13-18).

It is unclear, however, to what extent this Opinion has actually caused, and will in the future cause, organisations that collect, store, use and disclose location data to change their practices. This uncertainty exists in respect of national security, law enforcement and social control agencies, which have, or which can arrange, legal authority that overrides data protection laws. It also applies to non-government organisations of all kinds, which can take advantage of exceptions, exemptions, loopholes, non-obviousness, obfuscation, unenforceability within each particular jurisdiction, and extra-jurisdictionality, to operate in ways that are in apparent breach of the Opinion.

Legal authorities for privacy-invasions are in a great many cases vague rather than precise, and in many jurisdictions power in relation to specific decisions is delegated to an LEA (in such forms as self-written 'warrants'), or even a social control agency (in the form of demand-powers), rather than requiring a decision by a judicial officer based on evidence provided by the applicant.

Citizens in many countries are subject to more or less legitimate surveillance, of various degrees and orders of granularity, by their governments, in the name of law enforcement and national security. Many Parliaments have granted powers to national security agencies to use location technology to track citizens and to intercept telecommunications. Moreover, many Parliaments have failed the public by permitting a warrant to be signed by a Minister, or even a public servant, rather than a judicial officer (Jay 1999). Worse still, it appears that these already-gross breaches of the principles of a free society are in effect being extended to the authorisation of private organisations to track the mobiles of ordinary citizens because it may lead to better services planning, or more efficient advertising and marketing (Collier 2011a).

Data protection legislation in all countries evidences massive weaknesses. There are manifold exemptions and exceptions, and there are intentional and accidental exclusions, for example through limitations in the definitions of 'identified' and 'personal data'. Even the much-vaunted European laws fail to cope with extra-territoriality and are largely ignored by US-based service-providers. They are also focussed exclusively on data, leaving large gaps in safeguards for physical, communications and behavioural privacy.

Meanwhile, a vast amount of abuse of personal data is achieved through the freedom of corporations and government agencies to pretend that Terms imposed on consumers and citizens without the scope to reject them are somehow the subject of informed and freely-given consent. For example, petrol-stations, supermarkets and many government agencies pretend that walking past signs saying 'area subject to CCTV' represents consent to gather, transmit, record, store, use and disclose data. The same approach is being adopted in relation to highly-sensitive location data, and much-vaunted data protection laws are simply subverted by the mirage of consent.

At least notices such as 'you are now being watched' or 'smile, you are being recorded' inform customers that they are under observation. On the other hand, people are generally oblivious to the fact that their mobile subscriber identity is transmitted from their mobile phone and multilaterated to yield a reasonably precise location in a shopping mall (Collier 2011a, b, c). Further, there is no meaningful sense in which they can be claimed to have consented to providing location data to a third party, in this case a location service-provider with whom they have never had contact. And the emergent combination of MDS with CCTV sources becomes a pervasive view of the person, an 'über' view, providing a set of über-analytics to - at this stage - shopping complex owners and their constituents.

What rights would employees have if such a system were instituted in an employment setting? Are workplace surveillance laws in place that would protect employees from constant monitoring? A similar problem applies to people at airports, or on hospital, university, industrial or government campuses. No social contract has been entered into between the parties, rendering the subscriber powerless.

Since the collapse of the Technology Assessment movement, technological deployment proceeds unimpeded, and public risks are addressed only after they have emerged and the clamour of concern has risen to a crescendo. A reactive force is at play, rather than proactive measures being taken to ensure avoidance or mitigation of potential privacy breaches. In Australia, for example, safeguards for location surveillance exist at best incidentally, in provisions under separate legislative regimes and in separate jurisdictions, and at worst not at all. No overarching framework exists to provide consistency among the laws. This causes confusion and inevitably results in inadequate protections (ALRC 2008).

6.3 Prospective Legal Controls

Various learned studies have been conducted, but gather dust. In Australia, the three major law reform commissions have all reported, and all have been ignored by the legislatures (NSWLRC 2005, ALRC 2008, VLRC 2010).

One critical need is for the fundamental principle to be recovered, to the effect that the handling of personal data requires either consent or legal authority. Consent is meaningless as a control over unreasonable behaviour, however, unless it satisfies a number of key conditions: it must be informed, it must be freely-given, and it must be sufficiently granular, not bundled (Clarke 2002). In a great many of the circumstances in which organisations claim to have consent to gather, store, use and disclose location data, the consumer does not appreciate the scope of the handling that the service-provider is authorising itself to perform; the Terms are imposed by the service-provider, and may even be varied or completely re-written without consultation, a period of notice, or indeed any notice at all; and consent is bundled rather than the individual being able to construct a pattern of consents and denials that suits their personal needs. Discussions all too frequently focus on the specifically-US notion of 'opt-out' (or 'presumed consent'), with consent debased to 'opt-in', and deprecated as inefficient and business-unfriendly.

Recently, some very weak proposals have been put forward, primarily in the USA. In 2011, for example, two US Senators proposed a Location Privacy Protection Bill (Cheng 2011). An organisation that collected location data from mobile or wireless data devices would have to state explicitly, in plain English, in its privacy policy what was being collected. This would represent only a partial implementation of the already very weak 2006 recommendation of the Internet Engineering Task Force's Geographic Location/Privacy (IETF GEOPRIV) working group, which decided that technical systems should include 'Fair Information Practices' (FIPs) to defend against harms associated with the use of location technologies (EPIC 2006). FIPs, however, are themselves only a highly cut-down version of effective privacy protections, and the Bill proposes only a small fraction of FIPs. It would be close to worthless to consumers, and close to legislative authorisation for highly privacy-invasive actions by organisations.

Two other US senators tabled a GPS Bill, nominally intended to "balance the needs of Americans' privacy protections with the legitimate needs of law enforcement, and maintains emergency exceptions" (Anderson 2011). The scope is very narrow - next would have to come the Wi-Fi Act, the A-GPS Act, etc. That approach is obviously unviable in the longer term as new innovations emerge. Effective legislation must have appropriate generality rather than excessive technology-specificity, and should be based on semantics not syntax. Yet worse, these Bills would provide legal authorisation for grossly privacy-invasive location and tracking. IETF engineers, and now Congressmen, want to compromise human rights and increase the imbalance of power between business and consumers.

7. Conclusions

Mobile device location technologies and their applications are enabling surveillance, and producing an enormous leap in intrusions into data privacy and into privacy of the person, privacy of personal communications, and privacy of personal behaviour.

Existing privacy laws are entirely incapable of protecting consumers and citizens against the onslaught. Even where consent is claimed, it generally fails the tests of being informed, freely-given and granular.

There is an urgent need for outcries from oversight agencies, and responses from legislatures. Individual countries can provide some degree of protection, but the extra-territorial nature of so much of the private sector, and the use of corporate havens, in particular the USA, mean that multilateral action is essential in order to overcome the excesses arising from the US laissez faire traditions.

One approach to the problem would be location privacy protection legislation, although it would need to embody the complete suite of protections rather than the mere notification that the technology breaches privacy. An alternative approach is amendment of the current privacy legislation and other anti-terrorism legislation in order to create appropriate regulatory provisions, and close the gaps that LBS providers are exploiting (Koppel 2010).

The chimeras of self-regulation, and the unenforceability of guidelines, are not safeguards. Sensitive data like location information must be subject to actual, enforced protections, with guidelines and codes no longer used as a substitute, but merely playing a supporting role. Unless substantial protections for personal location information are enacted and enforced, there will be an epidemic of unjustified, disproportionate and covert surveillance, conducted by government and business, and even by citizens (Gillespie 2009, Abbas et al. 2011).

References

Abbas R. (2011) 'The social and behavioural implications of location-based services: An observational study of users' Journal of Location Based Services, 5, 3-4 (December 2011)

Abbas R., Michael K., Michael M.G. & Aloudat A. (2011) 'Emerging forms of covert surveillance using GPS-enabled devices', Journal of Cases on Information Technology, 13, 2 (2011) 19-33

AG (2005) 'What the Government is doing: Surveillance Device Act 2004', 25 May 2005, Australian Government, at http://www.ag.gov.au/agd/www/nationalsecurity.nsf/AllDocs/9B1F97B59105AEE6CA2570C0014CAF5?OpenDocument

ALRC (2008) 'For your information: Australian privacy law and practice (ALRC Report 108)', Australian Government, 2, pp. 1409-10, http://www.alrc.gov.au/publications/report-108

AMTA (2010) 'New mobile telecommunications industry guidelines and consumer tips set benchmark for Location Based Services', Australian Mobile Telecommunications Association, 2010, at http://www.amta.org.au/articles/New.mobile.telecommunications.industry.guidelines.and.consumer.tips.set.benchmark.for.Location.Based.Services

Anderson N. (2011) 'Bipartisan bill would end government's warrantless GPS tracking', Ars Technica, June 2011, at http://arstechnica.com/tech-policy/news/2011/06/bipartisan-bill-would-end-governments-warrantless-gps-tracking.ars

APF (2012) 'Revocation of the Biometrics Industry Code' Australian Privacy Foundation, March 2012, at http://www.privacy.org.au/Papers/OAIC-BiomCodeRevoc-120321.pdf

Arnold B. (2008) 'Privacy guide', Caslon Analytics, May 2008, at http://www.caslon.com.au/privacyguide19.htm

Art. 29 (2011) 'Opinion 13/2011 on Geolocation services on smart mobile devices' Article 29 Data Protection Working Party , 881/11/EN WP 185, 16 May 2011, at http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2011/wp185_en.pdf

BI (2004) 'Privacy Code' Biometrics Institute, Sydney, April 2004, at http://web.archive.org/web/20050424120627/http://www.biometricsinstitute.org/displaycommon.cfm?an=1&subarticlenbr=8

Blumberg A.J. & Eckersley P. (2009) 'On locational privacy, and how to avoid losing it forever' Electronic Frontier Foundation, August 2009, at https://www.eff.org/wp/locational-privacy

Bronitt S. (2010) 'Regulating covert policing methods: from reactive to proactive models of admissibility', in S. Bronitt, C. Harfield and K. Michael (eds.), The Social Implications of Covert Policing, 2010, pp. 9-14

Cheng J. (2011) 'Franken's location-privacy bill would close mobile-tracking 'loopholes'', Wired, 17 June 2011, at http://www.wired.com/epicenter/2011/06/franken-location-loopholes/

Chetty K., Smith G.E. & Woodbridge K. (2012) 'Through-the-Wall Sensing of Personnel Using Passive Bistatic WiFi Radar at Standoff Distances' IEEE Transactions on Geoscience and Remote Sensing 50, 4 (April 2012) 1218-1226

Clarke R. (1988) 'Information technology and dataveillance', Communications of the ACM, 31(5), May 1988, pp498-512, at http://www.rogerclarke.com/DV/CACM88.html

Clarke R. (1994) 'The Digital Persona and its Application to Data Surveillance' The Information Society 10,2 (June 1994) 77-92, at http://www.rogerclarke.com/DV/DigPersona.html

Clarke R. (1996) 'Privacy and Dataveillance, and Organisational Strategy' Proc. I.S. Audit & Control Association (EDPAC'96), Perth, Western Australia, May 1996, at http://www.rogerclarke.com/DV/PStrat.html

Clarke R. (2000) 'Submission to the Commonwealth Attorney-General re: 'A privacy scheme for the private sector: Release of Key Provisions' of 14 December 1999' Xamax Consultancy Pty Ltd, January 2000, at http://www.anu.edu.au/people/Roger.Clarke/DV/PAPSSub0001.html

Clarke R. (2001) 'Person-Location and Person-Tracking: Technologies, Risks and Policy Implications' Information Technology & People 14, 2 (Summer 2001) 206-231, at http://www.rogerclarke.com/DV/PLT.html

Clarke R. (2002) 'e-Consent: A Critical Element of Trust in e-Business' Proc. 15th Bled Electronic Commerce Conference, Bled, Slovenia, June 2002, at http://www.rogerclarke.com/EC/eConsent.html

Clarke R. (2006a) 'What's 'Privacy'?' Xamax Consultancy Pty Ltd, August 2006, at http://www.rogerclarke.com/DV/Privacy.html

Clarke R. (2006b) 'Make Privacy a Strategic Factor - The Why and the How' Cutter IT Journal 19, 11 (October 2006), at http://www.rogerclarke.com/DV/APBD-0609.html

Clarke R. (2008) 'Dissidentity: The Political Dimension of Identity and Privacy' Identity in the Information Society 1, 1 (December, 2008) 221-228, at http://www.rogerclarke.com/DV/Dissidentity.html

Clarke R. (2009a) 'The Covert Implementation of Mass Vehicle Surveillance in Australia' Proc 4th Workshop on the Social Implications of National Security: Covert Policing, April 2009, ANU, Canberra, at http://www.rogerclarke.com/DV/ANPR-Surv.html

Clarke R. (2009b) 'A Sufficiently Rich Model of (Id)entity, Authentication and Authorisation' Proc. IDIS 2009 - The 2nd Multidisciplinary Workshop on Identity in the Information Society, LSE, 5 June 2009, at http://www.rogerclarke.com/ID/IdModel-090605.html

Clarke R. (2009c) 'A Framework for Surveillance Analysis' Xamax Consultancy Pty Ltd, August 2009, at http://www.rogerclarke.com/DV/FSA.html

Clarke R. (2010) 'What is Überveillance? (And What Should Be Done About It?)' IEEE Technology and Society 29, 2 (Summer 2010) 17-25, at http://www.rogerclarke.com/DV/RNSA07.html

Clarke R. (2011) 'The Cloudy Future of Consumer Computing' Proc. 24th Bled eConference, June 2011, at http://www.rogerclarke.com/EC/CCC.html

Clarke R. & Wigan M. (2011) 'You are where you've been: The privacy implications of location and tracking technologies' Journal of Location Based Services 5, 3-4 (December 2011) 138-155, PrePrint at http://www.rogerclarke.com/DV/YAWYB-CWP.html

Cleff E.B. (2007) 'Implementing the legal criteria of meaningful consent in the concept of mobile advertising' Computer Law & Security Review 23,2 (2007) 262-269

Cleff E.B. (2010) 'Effective approaches to regulate mobile advertising: Moving towards a coordinated legal, self-regulatory and technical response' Computer Law & Security Review 26, 2 (2010) 158-169

Collier K. (2011a) 'Stores spy on shoppers', Herald Sun, 12 October 2011, at http://www.heraldsun.com.au/news/more-news/stores-spy-on-shoppers/story-fn7x8me2-1226164244739

Collier K. (2011b) 'Shopping centres' Big Brother plan to track customers', Herald Sun, 14 October 2011, at http://www.heraldsun.com.au/news/more-news/shopping-centres-big-brother-plan-to-track-customers/story-fn7x8me2-1226166191503

Collier K. (2011c) ''Creepy' Path Intelligence retail technology tracks shoppers', news.com.au, 14 October 2011, at http://www.news.com.au/money/creepy-retail-technology-tracks-shoppers/story-e6frfmci-1226166413071

Dahunsi F. & Dwolatzky B. (2012) 'An empirical investigation of the accuracy of location-based services in South Africa' Journal of Location Based Services 6, 1 (March 2012) 22-34

Dobson J. & Fisher P. (2003) 'Geoslavery' IEEE Technology and Society 22 (2003) 47-52, cited in Raper et al. (2007)

Economist (2012) 'Vehicle data recorders - Watching your driving' The Economist' 23 June 2012, at http://www.economist.com/node/21557309

EPIC (2006) 'Privacy and human rights report 2006' Electronic Privacy Information Center, WorldLII, 2006, at http://www.worldlii.org/int/journals/EPICPrivHR/2006/PHR2006-Location.html

EPIC (2012) 'Investigations of Google Street View' Electronic Privacy Information Center, 2012, at http://epic.org/privacy/streetview/

EU (2002) 'Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications)' Official Journal L 201 , 31/07/2002 P. 0037 - 0047, European Commission, at http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32002L0058:en:HTML

Figueiras J. & Frattasi S. (2010) 'Mobile Positioning and Tracking: From Conventional to Cooperative Techniques' Wiley, 2010

Fusco S.J., Abbas R., Michael K. & Aloudat A. (2012) 'Location-Based Social Networking and its Impact on Trust in Relationships' IEEE Technology and Society Magazine 31, 2 (Summer 2012) 39-50, at http://works.bepress.com/cgi/viewcontent.cgi?article=1326&context=kmichael

Gallagher T. et al. (2009) 'Trials of commercial Wi-Fi positioning systems for indoor and urban canyons' Proc. IGNSS Symposium, 1-3 December 2009, Queensland, cited in Zandbergen (2012)

Ganz J.S. (2005) 'It's already public: why federal officers should not need warrants to use GPS vehicle tracking devices', Journal of Criminal Law and Criminology 95, 4 (Summer 2005) 1325-37

Gillespie A.A. (2009) 'Covert surveillance, human rights and the law', Irish Criminal Law Journal, 19, 3 (August 2009) 71-79

IBM (2011) 'IBM Smart Surveillance System (Previous PeopleVision Project)', IBM Research, 30 October 2011, at http://www.research.ibm.com/peoplevision/

Jay D.M. (1999) 'Use of covert surveillance obtained by search warrant', Australian Law Journal, 73, 1 (Jan 1999) 34-36

King N.J. & Jessen P.W. (2010) 'Profiling the mobile customer - Privacy concerns when behavioural advertisers target mobile phones' Computer Law & Security Review 26, 5 (2010) 455-478 and 26, 6 (2010) 595-612

Koppel A. (2010) 'Warranting a warrant: Fourth Amendment concerns raised by law enforcement's warrantless use of GPS and cellular phone tracking', University of Miami Law Review 64, 3 (April 2010) 1061-1089

Lewis P. (2008) 'Fears over privacy as police expand surveillance project' The Guardian, 15 September 2008, at http://www.guardian.co.uk/uk/2008/sep/15/civilliberties.police

McGuire M., Plataniotis K.N. & Venetsanopoulos A.N. (2005) 'Data fusion of power and time measurements for mobile terminal location' IEEE Transaction on Mobile Computing 4 (2005) 142-153, cited in Raper et al. (2007)

Mann S., Nolan J. & Wellman B. (2003) 'Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments' Surveillance & Society 1, 3 (June 2003) 331-355, at http://www.surveillance-and-society.org/articles1(3)/sousveillance.pdf

Mautz R. (2011) 'Overview of Indoor Positioning Technologies' Keynote, Proc. IPIN'2011, Guimaraes, September 2011, at http://www.geometh.ethz.ch/people/.../IPIN_Keynote_Mautz_2011.pdf

Mery D. (2009) 'The mobile phone as self-inflicted surveillance - And if you don't have one, what have you got to hide?' The Register, 10 April 2009, at http://www.theregister.co.uk/2009/04/10/mobile_phone_tracking/

Michael K. & Michael M.G. (2007) 'From Dataveillance to Überveillance and the Realpolitik of the Transparent Society' University of Wollongong, 2007, at http://works.bepress.com/kmichael/51

Michael K. & Michael M.G. (2009) 'Innovative Automatic Identification and Location-Based Services: From Bar Codes to Chip Implants' IGI Global, 2009

Michael M.G. & Michael K. (2010) 'Towards a state of uberveillance' IEEE Technology and Society Magazine 29, 2 (Summer 2010) 9-16, at http://works.bepress.com/kmichael/187

Michael K., McNamee A., Michael M.G. & Tootell H. (2006a) 'Location-Based Intelligence - Modeling Behavior in Humans using GPS' Proc. Int'l Symposium on Technology and Society, New York, 8-11 June 2006, at http://ro.uow.edu.au/cgi/viewcontent.cgi?article=1384&context=infopapers

Michael K., McNamee A. & Michael M.G. (2006b) 'The Emerging Ethics of Humancentric GPS Tracking and Monitoring' Proc. Int'l Conf. on Mobile Business, Copenhagen, Denmark IEEE Computer Society, 2006, at http://ro.uow.edu.au/cgi/viewcontent.cgi?article=1384&context=infopapers

Michael M.G., Fusco S.J. & Michael K. (2008) 'A Research Note on Ethics in the Emerging Age of Uberveillance (Überveillance)' Computer Communications, 31, 6 (2008) 1192-1199, at http://works.bepress.com/kmichael/32/

Michael K. & Masters A. (2006) 'Realized Applications of Positioning Technologies in Defense Intelligence' in Hussein Abbass H. & Essam D. (eds.) 'Applications of Information Systems to Homeland Security and Defense' Idea Group Publishing, 2006, at http://works.bepress.com/kmichael/2

Michael K., Roussos G., Huang G.Q., Gadh R., Chattopadhyay A., Prabhu S. & Chu P. (2010) 'Planetary-scale RFID services in an age of uberveillance' Proceedings of the IEEE 98, 9 (2010) 1663-1671

Moses A. (2010) 'Google escapes criminal charges for Wi-Fi snooping', The Sydney Morning Herald, 6 December 2010, at http://www.smh.com.au/technology/security/google-escapes-criminal-charges-for-wifi-snooping-20101206-18lot.html

NSWLRC (2005) 'Surveillance' Report 108 , NSW Law Reform Commission, 2005, at http://www.lawlink.nsw.gov.au/lawlink/lrc/ll_lrc.nsf/pages/LRC_r108toc

OAIC (2012) '' Office of the Australian Information Commissioner, April 2012, at http://www.comlaw.gov.au/Details/F2012L00869/Explanatory%20Statement/Text

Otterberg A.A. (2005) 'Note: GPS tracking technology: The case for revisiting Knotts and shifting the Supreme Court's theory of the public space under the Fourth Amendment', Boston College Law Review 46 (2005) 661-704

Parenti C. (2003) 'The Soft Cage: Surveillance in America From Slavery to the War on Terror'  Basic Books, 2003

PI (2010a) 'Our Commitment to Privacy', Path Intelligence, 2010, heading changed in late 2012 to 'Privacy by design', at http://www.pathintelligence.com/en/products/footpath/privacy

PI (2010b) 'FootPath Technology', Path Intelligence, 2010, at http://www.pathintelligence.com/en/products/footpath/footpath-technology

PI (2012) 'Retail' Path Intelligence, 2012, at http://www.pathintelligence.com/en/industries/retail

Raper J., Gartner G., Karimi H. & Rizos C. (2007a) 'A critical evaluation of location based services and their potential' Journal of Location Based Services 1, 1 (March 2007) 5-45

Raper J., Gartner G., Karimi H. & Rizos C. (2007b) 'Applications of location-based services: a selected review' Journal of Location Based Services 1, 2 (June 2007) 89-111

RE (2010a) 'IEEE 802.11 standards tutorial' Radio-Electronics.com, apparently of 2010, at http://www.radio-electronics.com/info/wireless/wi-fi/ieee-802-11-standards-tutorial.php

RE (2010b) 'WiMAX IEEE 802.16 technology tutorial' Radio-Electronics.com, apparently of 2010, at http://www.radio-electronics.com/info/wireless/wimax/wimax.php

RE (2012) 'Assisted GPS, A-GPS' Radio-Electronics.com, apparently of 2012, at http://www.radio-electronics.com/info/cellulartelecomms/location_services/assisted_gps.php

Renegar B.D., Michael K. & Michael M.G. (2008) 'Privacy, value and control issues in four mobile business applications' Proc. 7th Int'l Conf. on Mobile Business, 2008, pp. 30-40

Riley J. (2010) 'Gov't 'travesty' in Google privacy case', ITWire, Wednesday 3 November 2010, 20:44, at http://www.itwire.com/it-policy-news/regulation/42898-govt-travesty-in-google-privacy-case

Samuel I.J. (2008) 'Warrantless location tracking', New York University Law Review, 83 (2008) 1324-1352

SHW (2012) 'Skyhook Location Performance', at http://www.skyhookwireless.com/location-technology/performance.php

Skyhook (2012) Website Entries, including 'Frequently Asked Questions' at http://www.skyhookwireless.com/whoweare/faq.php, 'Privacy Policy' at http://www.skyhookwireless.com/whoweare/privacypolicy.php and 'Location Privacy' at http://www.skyhookwireless.com/whoweare/privacy.php

Song C., Qu Z., Blumm N. & Barabási A.-L. (2010) 'Limits of predictability in human mobility' Science 327, 5968 (2010) 1018-1021

USGov (2012) 'GPS Accuracy' National Coordination Office for Space-Based Positioning, Navigation, and Timing, February 2012, at http://www.gps.gov/systems/gps/performance/accuracy/

van Loenen B., Zevenbergen J. & de Jong J. (2009) 'Balancing Location Privacy with National Security: A Comparative Analysis of Three Countries through the Balancing Framework of the European Court Of Human Rights' Ch. 2 of Patten N.J. et al. 'National Security: Institutional Approaches', Nova Science Publishers, 2009

VLRC (2010) 'Surveillance in Public Spaces' Victorian Law Reform Commission, Final Report 18, March 2010, at http://www.lawreform.vic.gov.au/wps/wcm/connect/justlib/Law+Reform/resources/3/6/36418680438a4b4eacc0fd34222e6833/Surveillance_final_report.pdf

Wright D., Friedewald M., Gutwirth S., Langheinrich M., Mordini E., Bellanova R., De Hert P., Wadhwa K. & Bigo D. (2010) 'Sorting out smart surveillance' Computer Law & Security Review 26, 4 (2010) 343-354

Zandbergen P.A. (2012) 'Comparison of WiFi positioning on two mobile devices' Journal of Location Based Services 6, 1 (March 2012) 35-50

Acknowledgements

A preliminary version of the analysis presented in this paper appeared in the November 2011 edition of Precedent, the journal of the Lawyers Alliance. The article has been significantly upgraded as a result of comments provided by the referees and editor.

Author Affiliations

Katina Michael is an Associate Professor in the School of Information Systems and Technology at the University of Wollongong. She is the editor in chief of the IEEE Technology and Society Magazine, is on the editorial board of Computers & Security, and is a co-editor of 'Social Implications of Covert Policing' (2010). She is a Board member of the Australian Privacy Foundation and a representative of the Consumer Federation of Australia.

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in the Cyberspace Law & Policy Centre at the University of N.S.W., and a Visiting Professor in the Research School of Computer Science at the Australian National University. He is currently Chair of the Australian Privacy Foundation, and an Advisory Board member of Privacy International.

Location and tracking of mobile devices: Uberveillance stalks the streets

Abstract

During the last decade, location-tracking and monitoring applications have proliferated, in mobile cellular and wireless data networks, and through self-reporting by applications running in smartphones that are equipped with onboard global positioning system (GPS) chipsets. It is now possible to pinpoint a smartphone user's location not merely to a cell, but to a small area within it. Innovators have been quick to capitalise on these location-based technologies for commercial purposes, and have gained access to a great deal of sensitive personal data in the process. In addition, law enforcement agencies utilise these technologies, can do so inexpensively, and hence can track many more people. Moreover, these agencies seek the power to conduct tracking covertly, and without a judicial warrant. This article investigates the dimensions of the problem of people-tracking through the devices that they carry. Location surveillance has very serious negative implications for individuals, yet there are very limited safeguards. It is incumbent on legislatures to address these problems, through both domestic laws and multilateral processes.

1. Introduction

Personal electronic devices travel with people, are worn by them, and are, or soon will be, inside them. Those devices are increasingly capable of being located, and, by recording the succession of locations, tracked. This creates a variety of opportunities for the people concerned. It also gives rise to a wide range of opportunities for organisations, at least some of which are detrimental to the person's interests.

Commonly, the focus of discussion of this topic falls on mobile phones and tablets. It is intrinsic to the network technologies on which those devices depend that the network operator has at least some knowledge of the location of each handset. In addition, many such devices have onboard global positioning system (GPS) chipsets, and self-report their coordinates to service-providers. The scope of this paper encompasses those already well-known forms of location and tracking, but it extends beyond them.

The paper begins by outlining the various technologies that enable location and tracking, and identifies those technologies' key attributes. The many forms of surveillance are then reviewed, in order to establish a framework within which applications of location and tracking can be characterised. Applications are described, and their implications summarised. Controls are considered, whereby potential harm to the interests of individuals can be prevented or mitigated.

2. Relevant technologies

The technologies considered here involve a device that has the following characteristics:

• it is conveniently portable by a human, and

• it emits signals that:

  • enable some other device to compute the location of the device (and hence of the person), and

  • are sufficiently distinctive that the device is reliably identifiable, at least among those in the vicinity, and hence the device's (and hence the person's) successive locations can be detected and combined into a trail.

The primary form-factors for mobile devices are currently clam-shape (portable PCs), thin rectangles suitable for the hand (mobile phones), and flat forms (tablets). Many other form-factors are also relevant, however. Anklets imposed on dangerous prisoners, and even as conditions of bail, carry RFID tags. Chips are carried in cards of various sizes, particularly the size of credit-cards, and used for tickets for public transport and entertainment venues, aircraft boarding-passes, toll-road payments and in some countries to carry electronic cash. Chips may conduct transactions with other devices by contact-based means, or contactless, using radio-frequency identification (RFID) or its shorter-range version near-field communication (NFC) technologies. These capabilities are in credit and debit cards in many countries. Transactions may occur with the cardholder's knowledge, with their express consent, and with an authentication step to achieve confidence that the person using the card is authorised to do so. In a variety of circumstances, however, some and even all of those safeguards are dispensed with. The electronic versions of passports that are commonly now being issued carry such a chip, and have an autonomous communications capability. The widespread issue of cards with capabilities uncontrolled by, and in many cases unknown to, the cardholder, is causing consternation among segments of the population that have become aware of the schemes.

Such chips can be readily carried in other forms, including jewellery such as finger-rings, and belt-buckles. Endo-prostheses such as replacement hips and knees and heart pacemakers can readily carry chips. A few people have voluntarily embedded chips directly into their bodies for such purposes as automated entry to premises (Michael and Michael, 2009).

In order to locate and track such devices, any sufficiently distinctive signals may in principle suffice. See Raper et al. (2007a) and Mautz (2011). In practice, the signals involved are commonly those transmitted by a device in order to take advantage of wireless telecommunications networks. The scope of the relevant technologies therefore also encompasses the signals, devices that detect the signals, and the networks over which the data that the signals contain are transmitted.

In wireless networks, it is generally the case that the base-station or router needs to be aware of the identities of devices that are currently within the cell. A key reason for this is to conserve limited transmission capacity by sending messages only when the targeted device is known to be in the cell. This applies to all of:

• cellular mobile originally designed for voice telephony and extended to data (in particular those using the ‘3G’ standards GSM/GPRS, CDMA2000 and UMTS/HSPA and the ‘4G’ standard LTE)

• wireless local area networks (WLANs, commonly Wifi/IEEE 802.11x – RE, 2010a)

• wireless wide area networks (WWANs, commonly WiMAX/IEEE 802.16x – RE, 2010b).

Devices in such networks are uniquely identified by various means (Clarke and Wigan, 2011). In cellular networks, there is generally a clear distinction between the entity (the handset) and the identity it is adopting at any given time (which is determined by the module inserted in it). Depending on the particular standards used, what is commonly referred to as ‘the SIM-card’ is an R-UIM, a CSIM or a USIM. These modules store an International Mobile Subscriber Identity (IMSI), which constitutes the handset's identifier. Among other things, this enables network operators to determine whether or not to provide service, and what tariff to apply to the traffic. However, cellular network protocols may also involve transmission of a code that distinguishes the handset itself, within which the module is currently inserted. A useful generic term for this is the device ‘entifier’ (Clarke, 2009b). Under the various standards, it may be referred to as an International Mobile Equipment Identity (IMEI), ESN, or MEID.
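The distinction between identifier and entifier can be made concrete with a small sketch. On the assumption (made only for illustration) that an observer logs both codes for each handset it sees, a change of SIM changes the identifier but leaves the entifier, and hence the trail, intact:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Sighting:
        imsi: str   # identifier: the identity currently adopted (from the SIM/USIM)
        imei: str   # entifier: the handset itself
        cell: str   # cell in which the handset was observed

    log = [
        Sighting(imsi="505-01-1111", imei="35-209900-176148", cell="cell-A"),
        # The user replaces the SIM-card: the identifier changes ...
        Sighting(imsi="505-02-9999", imei="35-209900-176148", cell="cell-B"),
    ]

    # ... but grouping sightings by the entifier still yields a single trail.
    trail = [s.cell for s in log if s.imei == "35-209900-176148"]
    print(trail)  # ['cell-A', 'cell-B']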

Vendor-specific solutions also may provide additional functionality to a handset unbeknown to the end-user. For example, every mobile device manufactured by Apple has a 40-character Unique Device Identifier (UDID). This enables Apple to track its users. Not only Apple itself, but also marketers, were able to use the UDID to track devices. It has also been alleged that data emanating from these devices is routinely accessible to law enforcement agencies. Since late 2012, Apple has prevented marketers from using the UDID, but has added an Identifier for Advertisers (IFA or IDFA). This is temporary, and it can be blocked; but it is by default open for tracking, and turning it off is difficult, and is likely to result in reduced services (Edwards, 2012). In short, Apple devices are specifically designed to enable tracking of consumers by Apple, by any government agency that has authority to gain access to the data, and by all consumer-marketing corporations, although in the last case with a low-grade option available to the user to suppress tracking.

In Wifi and WiMAX networks, the device entifier may be a processor-id or more commonly a network interface card identifier (NIC Id). In various circumstances, other device-identifiers may be used, such as a phone number, or an IP-address may be used as a proxy. In addition, the human using the device may be directly identified, e.g. by means of a user-account name.

A WWAN cell may cover a large area, indicatively of a 50 km radius. Telephony cells may have a radius as large as 2–3 km or as little as a hundred metres. WLANs using Wifi technologies have a cell-size of less than 1 ha, indicatively 50–100 m radius, but in practice often constrained by environmental factors to only 10–30 m.

The base-station or router knows the identities of devices that are within its cell, because this is a technically necessary feature of the cell's operation. Mobile devices auto-report their presence 10 times per second. Meanwhile, the locations of base-stations for cellular services are known with considerable accuracy by the telecommunications providers. And, in the case of most private Wifi services, the location of the router is mapped to c. 30–100 m accuracy by services such as Skyhook and Google Locations, which perform what have been dubbed ‘war drives’ in order to maintain their databases – in Google's case in probable violation of the telecommunications interception and/or privacy laws of at least a dozen countries (EPIC, 2012).

Knowing that a device is within a particular mobile phone, WiMAX or Wifi cell provides only a rough indication of location. In order to generate a more precise estimate, within a cell, several techniques are used (McGuire et al., 2005). These include the following (adapted from Clarke and Wigan, 2011; see also Figueiras and Frattasi, 2010):

• directional analysis. A single base-station may comprise multiple receivers at known locations and pointed in known directions, enabling the handset's location within the cell to be reduced to a sector within the cell, and possibly a narrow one, although without information about the distance along the sector;

• triangulation. This involves multiple base-stations serving a single cell, at known locations some distance apart, and each with directional analysis capabilities. Particularly with three or more stations, this enables an inference that the device's location is within a small area at the intersection of the multiple directional plots;

• signal analysis. This involves analysis of the characteristics of the signals exchanged between the handset and base-station, in order to infer the distance between them. Relevant signal characteristics include the apparent response-delay (Time Difference of Arrival – TDOA, also referred to as multilateration), and strength (Received Signal Strength Indicator – RSSI), perhaps supplemented by direction (Angle Of Arrival – AOA).

The precision and reliability of these techniques varies greatly, depending on the circumstances prevailing at the time. The variability and unpredictability result in many mutually inconsistent statements by suppliers, in the general media, and even in the technical literature.
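As a rough numerical illustration of the signal-analysis and triangulation ideas listed above, the following sketch converts received signal strength into a distance estimate using a simple log-distance path-loss model, and then estimates a position from three base-stations by linearising the range equations into a least-squares problem. The path-loss constants, station coordinates and readings are invented for the example; operational systems use far more elaborate, and generally proprietary, models.

    import numpy as np

    def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exponent=2.7):
        """Log-distance path-loss model: estimated distance in metres, given an
        assumed received power at 1 m (tx_power_dbm) and a path-loss exponent."""
        return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

    def trilaterate(stations, distances):
        """Least-squares (x, y) estimate from three or more station positions and
        ranges, linearised by subtracting the first range equation from the rest."""
        (x0, y0), d0 = stations[0], distances[0]
        A, b = [], []
        for (xi, yi), di in zip(stations[1:], distances[1:]):
            A.append([2 * (xi - x0), 2 * (yi - y0)])
            b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
        solution, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
        return solution

    stations = [(0.0, 0.0), (30.0, 0.0), (0.0, 30.0)]   # known receiver positions (metres)
    rssi_readings = [-72.2, -76.9, -75.5]               # observed signal strengths (dBm)
    ranges = [rssi_to_distance(r) for r in rssi_readings]
    print(trilaterate(stations, ranges))                # roughly [10, 12] metres

Even this toy model hints at why field results vary so widely: small errors in the assumed path-loss exponent or transmit power translate into large errors in the estimated ranges.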

Techniques for cellular networks generally provide reasonably reliable estimates of location to within an indicative 50–100 m in urban areas and some hundreds of metres elsewhere. Worse performance has been reported in some field-tests, however. For example, Dahunsi and Dwolatzky (2012) found the accuracy of GSM location in Johannesburg to be in the range 200–1400 m, and highly variable, with “a huge difference between the predicted and provided accuracies by mobile location providers”.

The website of the Skyhook Wifi-router positioning service claims 10-m accuracy, 1-s time-to-first-fix and 99.8% reliability (SHW, 2012). On the other hand, tests have resulted in far lower accuracy measures, including an average positional error of 63 m in Sydney (Gallagher et al., 2009) and “median values for positional accuracy in [Las Vegas, Miami and San Diego, which] ranged from 43 to 92 metres… [and] the replicability… was relatively poor” (Zandbergen, 2012, p. 35). Nonetheless, a recent research article suggested the feasibility of “uncooperatively and covertly detecting people ‘through the wall’ [by means of their WiFi transmissions]” (Chetty et al., 2012).

Another way in which a device's location may become known to other devices is through self-reporting of the device's position, most commonly by means of an inbuilt Global Positioning System (GPS) chipset. This provides coordinates and altitude based on broadcast signals received from a network of satellites. In any particular instance, the user of the device may or may not be aware that location is being disclosed.

Despite widespread enthusiasm and a moderate level of use, GPS is subject to a number of important limitations. The signals are subject to interference from atmospheric conditions, buildings and trees, and the time to achieve a fix on enough satellites and deliver a location measure may be long. This results in variability in its practical usefulness in different circumstances, and in its accuracy and reliability. Civil-use GPS coordinates are claimed to provide accuracy within a theoretical 7.8 m at a 95% confidence level (USGov, 2012), but various reports suggest 15 m, or 20 m, or 30 m, but sometimes 100 m. It may be affected by radio interference and jamming. The original and still-dominant GPS service operated by the US Government was subject to intentional degradation in the US's national interests. This ‘Selective Availability’ feature still exists, although subject to a decade-long policy not to use it; and future generations of GPS satellites may no longer support it.

Hybrid schemes exist that use two or more sources in order to generate more accurate location-estimates, or to generate estimates more quickly. In particular, Assisted GPS (A-GPS) utilises data from terrestrial servers accessed over cellular networks in order to more efficiently process satellite-derived data (e.g. RE, 2012).

Further categories of location and tracking technologies emerge from time to time. A current example uses means described by the present authors as ‘mobile device signatures’ (MDS). A device may monitor the signals emanating from a user's mobile device, without being part of the network that the user's device is communicating with. The eavesdropping device may detect particular signal characteristics that distinguish the user's mobile device from others in the vicinity. In addition, it may apply any of the various techniques mentioned above, in order to locate the device. If the signal characteristics are persistent, the eavesdropping device can track the user's mobile device, and hence the person carrying it. No formal literature on MDS has yet been located. The supplier's brief description is at PI (2010b).
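Since no formal literature on MDS is available, any technical reconstruction is necessarily speculative. The following sketch merely illustrates the general principle stated above: if an eavesdropping receiver can extract a persistent signature from overheard transmissions and attach a location estimate to each sighting, then sightings sharing a signature can be chained into a track. The signature format, record layout and data are invented for the illustration.

    from collections import defaultdict

    # Hypothetical output of a passive receiver:
    # (persistent_signature, time_in_seconds, estimated_position_metres)
    sightings = [
        ("sig-7f3a", 0,  (12.0, 40.0)),
        ("sig-90c1", 5,  (60.0, 18.0)),
        ("sig-7f3a", 30, (14.5, 36.0)),
        ("sig-7f3a", 65, (18.0, 31.0)),
    ]

    def build_tracks(records):
        """Chain sightings that share a persistent signature into per-device tracks."""
        tracks = defaultdict(list)
        for signature, t, position in sorted(records, key=lambda r: r[1]):
            tracks[signature].append((t, position))
        return dict(tracks)

    for signature, track in build_tracks(sightings).items():
        print(signature, track)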

The various technologies described in this section are capable of being applied to many purposes. The focus in this paper is on their application to surveillance.

3. Surveillance

The term surveillance refers to the systematic investigation or monitoring of the actions or communications of one or more persons (Clarke, 2009c). Until recent times, surveillance was visual, and depended on physical proximity of an observer to the observed. The volume of surveillance conducted was kept in check by the costs involved. Surveillance aids and enhancements emerged, such as binoculars and, later, directional microphones. During the 19th century, the post was intercepted, and telephones were tapped. During the 20th century, cameras enabled transmission of image, video and sound to remote locations, and recording for future use (e.g. Parenti, 2003).

With the surge in stored personal data that accompanied the application of computing to administration in the 1970s and 1980s, dataveillance emerged (Clarke, 1988). Monitoring people through their digital personae rather than through physical observation of their behaviour is much more economical, and hence many more people can be subjected to it (Clarke, 1994). The dataveillance epidemic made it more important than ever to clearly distinguish between personal surveillance – of an identified person who has previously come to attention – and mass surveillance – of many people, not necessarily previously identified, about some or all of whom suspicion could be generated.

Location data is of a very particular nature, and hence it has become necessary to distinguish location surveillance as a sub-set of the general category of dataveillance. There are several categories of location surveillance with different characteristics (Clarke and Wigan, 2011); an illustrative sketch follows the list:

• capture of an individual's location at a point in time. Depending on the context, this may support inferences being drawn about an individual's behaviour, purpose, intention and associates

• real-time monitoring of a succession of locations and hence of the person's direction of movement. This is far richer data, and supports much more confident inferences being drawn about an individual's behaviour, purpose, intention and associates

• predictive tracking, by extrapolation from the person's direction of movement, enabling inferences to be drawn about near-future behaviour, purpose, intention and associates

• retrospective tracking, on the basis of the data trail of the person's movements, enabling reconstruction of a person's behaviour, purpose, intention and associates at previous times
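
The following minimal sketch, using hypothetical location records of the form (device identifier, timestamp, position), is intended only to underline that the four categories differ not in the data captured but in how a sequence of such records is used. The record structure and functions are illustrative assumptions, not a description of any deployed system.

```python
# A minimal sketch of the data involved in the four categories of location surveillance.
from dataclasses import dataclass

@dataclass
class LocationRecord:
    device_id: str
    t: float          # timestamp in seconds
    x: float          # position, e.g. metres east
    y: float          # position, e.g. metres north

def point_capture(track: list[LocationRecord], t: float) -> LocationRecord:
    """Location at (or nearest to) a single point in time."""
    return min(track, key=lambda r: abs(r.t - t))

def realtime_monitoring(track: list[LocationRecord]) -> list[LocationRecord]:
    """The succession of locations, ordered in time."""
    return sorted(track, key=lambda r: r.t)

def predictive_tracking(track: list[LocationRecord], horizon: float) -> tuple[float, float]:
    """Naive linear extrapolation from the last two observations."""
    a, b = realtime_monitoring(track)[-2:]
    dt = b.t - a.t
    return (b.x + (b.x - a.x) / dt * horizon, b.y + (b.y - a.y) / dt * horizon)

def retrospective_tracking(track: list[LocationRecord], t_from: float, t_to: float) -> list[LocationRecord]:
    """Reconstruction of movements within an earlier time window."""
    return [r for r in realtime_monitoring(track) if t_from <= r.t <= t_to]
```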

Information arising at different times, and from different forms of surveillance, can be combined, in order to offer a more complete picture of a person's activities, and enable yet more inferences to be drawn, and suspicions generated. This is the primary sense in which the term ‘überveillance’ is applied: “Überveillance has to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought). Überveillance can be a predictive mechanism for a person's expected behaviour, traits, likes, or dislikes; or it can be based on historical fact; or it can be something in between… Überveillance is more than closed circuit television feeds, or cross-agency databases linked to national identity cards, or biometrics and ePassports used for international travel. Überveillance is the sum total of all these types of surveillance and the deliberate integration of an individual's personal data for the continuous tracking and monitoring of identity and location in real time” (Michael and Michael, 2010; see also Michael and Michael, 2007; Michael et al., 2008, 2010; Clarke, 2010).

A comprehensive model of surveillance includes consideration of geographical scope, and of temporal scope. Such a model assists the analyst in answering key questions about surveillance: of what? for whom? by whom? why? how? where? and when? (Clarke, 2009c). Distinctions are also needed based on the extent to which the subject has knowledge of surveillance activities. It may be overt or covert. If covert, it may be merely unnotified, or alternatively express measures may be undertaken in order to obfuscate, and achieve secrecy. A further element is the notion of ‘sousveillance’, whereby the tools of surveillance are applied, by those who are commonly watched, against those who are commonly the watchers (Mann et al., 2003).

These notions are applied in the following sections in order to establish the extent to which location and tracking of mobile devices is changing the game of surveillance, and to demonstrate that location surveillance is intruding more deeply into personal freedoms than previous forms of surveillance.

4. Applications

This section presents a typology of applications of mobile device location, as a means of narrowing down to the kinds of uses that have particularly serious privacy implications. These are commonly referred to as location-based services (LBS). One category of applications provides information services for the benefit of the mobile device's user, such as navigation aids, and search and discovery tools for the locations of particular, identified organisations, or of organisations that sell particular goods and services. Users of LBS of these kinds can be reasonably assumed to be aware that they are disclosing their location. Depending on the design, the disclosures may also be limited to specific service-providers and specific purposes, and the transmissions may be secured.

Another, very different category of application is use by law enforcement agencies (LEAs). The US E-911 mandate of 1999 was nominally a public safety measure, to enable people needing emergency assistance to be quickly and efficiently located. In practice, the facility also delivered LEAs means for locating and tracking people of interest, through their mobile devices. Personal surveillance may be justified by reasonable grounds for suspicion that the subject is involved in serious crime, and may be specifically authorised by judicial warrant. Many countries have always been very loose in their control over LEAs, however, and many others have drastically weakened their controls since 2001. Hence, in any given jurisdiction and context, each and all of the controls may be lacking.

Yet worse, LEAs use mobile location and tracking for mass surveillance, without any specific grounds for suspicion about any of the many people caught up in what is essentially a dragnet-fishing operation (e.g. Mery, 2009). Examples might include monitoring the area adjacent to a meeting-venue watching out for a blacklist of device-identifiers known to have been associated with activists in the past, or collecting device-identifiers for use on future occasions. In addition to netting the kinds of individuals who are of legitimate interest, the ‘by-catch’ inevitably includes threatened species. There are already extraordinarily wide-ranging (and to a considerable extent uncontrolled) data retention requirements in many countries.

Of further concern is the use of Automated Number Plate Recognition (ANPR) for mass surveillance purposes. This has been out of control in the UK since 2006, and has been proposed or attempted in various other countries as well (Clarke, 2009a). Traffic surveillance is expressly used not only for retrospective analysis of the movements of individuals of interest to LEAs, but also as a means of generating suspicions about other people (Lewis, 2008).

Beyond LEAs, many government agencies perform social control functions, and may be tempted to conduct location and tracking surveillance. Examples would include benefits-paying organisations tracking the movements of benefits-recipients about whom suspicions have arisen. It is not too far-fetched to anticipate zealous public servants concerned about fraud control imposing location surveillance on all recipients of some particularly valuable benefit, or as a security precaution on every person visiting a sensitive area (e.g. a prison, a power plant, a national park).

Various forms of social control are also exercised by private sector organisations. Some of these organisations, such as placement services for the unemployed, may be performing outsourced public sector functions. Others, such as workers' compensation providers, may be seeking to control personal insurance claimants, and similarly car-hire companies and insurance providers may wish to monitor motor vehicles' distance driven and roads used (Economist, 2012; Michael et al., 2006b).

A further privacy-invasive practice that is already common is the acquisition of location and tracking data by marketing corporations, as a by-product of the provision of location-based services, but with the data then applied to further purposes other than that for which it was intended. Some uses rely on statistical analysis of large holdings (‘data mining’). Many uses are, on the other hand, very specific to the individual, and are for such purposes as direct or indirect targeting of advertisements and the sale of goods and services. Some of these applications combine location data with data from other sources, such as consumer profiling agencies, in order to build up such a substantial digital persona that the individual's behaviour is readily influenced. This takes the activity into the realms of überveillance.

All such services raise serious privacy concerns, because the data is intensive and sensitive, and attractive to organisations. Companies may gain rights in relation to the data through market power, or by trickery – such as exploitation of a self-granted right to change the Terms of Service (Clarke, 2011). Once captured, the data may be re-purposed by any organisation that gains access to it, because the value is high enough that they may judge the trivial penalties that generally apply to breaches of privacy laws to be well worth the risk.

A recently-emerged, privacy-invasive practice is the application of the mobile device signature (MDS) form of tracking, in such locations as supermarkets. This is claimed by its providers to offer deep observational insights into the behaviour of customers, including dwell times in front of displays, possibly linked with the purchaser's behaviour. This raises concerns a little different from other categories of location and tracking technologies, and is accordingly considered in greater depth in the following section.

It is noteworthy that an early review identified a wide range of LBS, which the authors classified into mobile guides, transport, gaming, assistive technology and location-based health (Raper et al., 2007b). Yet that work completely failed to notice that a vast array of applications were emergent in surveillance, law enforcement and national security, despite the existence of relevant literature from at least 1999 onwards (Clarke, 2001; Michael and Masters, 2006).

5. Implications

The previous sections have introduced many examples of risks to citizens and consumers arising from location surveillance. This section presents an analysis of the categories and of the degree of seriousness with which they should be viewed. The first topic addressed is the privacy of personal location data. Other dimensions of privacy are then considered, and then the specific case of MDS is examined. The treatment here is complementary to earlier articles that have looked more generally at particular applications such as location-based mobile advertising, e.g. Cleff (2007, 2010) and King and Jessen (2010). See also Art. 29 (2011).

5.1. Locational privacy

Knowing where someone has been, knowing what they are doing right now, and being able to predict where they might go next is a powerful tool for social control and for chilling behaviour (Abbas, 2011). Humans do not move around in a random manner (Song et al., 2010).

One interpretation of ‘locational privacy’ is that it “is the ability of an individual to move in public space with the expectation that under normal circumstances their location will not be systematically and secretly recorded for later use” (Blumberg and Eckersley, 2009). A more concise definition is “the ability to control the extent to which personal location information is… [accessible and] used by others” (van Loenen et al., 2009). Hence ‘tracking privacy’ is the interest an individual has in controlling information about their sequence of locations.

Location surveillance is deeply intrusive into data privacy, because it is very rich, and enables a great many inferences to be drawn (Clarke, 2001; Dobson and Fisher, 2003; Michael et al., 2006a; Clarke and Wigan, 2011). As demonstrated by Raper et al. (2007a, p. 32–3), most of the technical literature that considers privacy is merely concerned about it as an impediment to deployment and adoption, and how to overcome the barrier rather than how to solve the problem. Few authors adopt a positive approach to privacy-protective location technologies. The same authors' review of applications (Raper et al., 2007b) includes a single mention of privacy, and that is in relation to just one of the scores of sub-categories of application that they catalogue.

Most service-providers are cavalier in their handling of personal data, and extravagant in their claims. For example, Skyhook claims that it “respects the privacy of all users, customers, employees and partners”; but, significantly, it makes no mention of the privacy of the people whose locations, through the locations of their Wi-Fi routers, it collects and stores (Skyhook, 2012).

Consent is critical in such LBS as personal location chronicle systems, people-followers and footpath route-tracker systems that systematically collect personal location information from a device they are carrying (Collier, 2011c). The data handled by such applications is highly sensitive because it can be used to conduct behavioural profiling of individuals in particular settings. The sensitivity exists even if the individuals remain ‘nameless’, i.e. if each identifier is a temporary or pseudo-identifier and is not linked to other records. Service-providers, and any other organisations that gain access to the data, achieve the capacity to make judgements on individuals based on their choices of, for example, which retail stores they walk into and which they do not. For example, if a subscriber visits a particular religious bookstore within a shopping mall on a weekly basis, the assumption can be reasonably made that they are in some way affiliated to that religion (Samuel, 2008).
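
To make the nature of such inferences concrete, the following sketch works only from pseudonymous visit logs of the hypothetical form (pseudo-identifier, date, venue category); it is offered purely by way of illustration, with invented data, to show that a simple recurrence test is enough to support the kind of affiliation inference just described:

```python
# Illustrative sketch: inferring recurring attendance from pseudonymous visit logs.
from datetime import date
from collections import defaultdict

def recurring_weekly(visits: list[tuple[str, date, str]], category: str,
                     min_weeks: int = 4) -> set[str]:
    """Pseudo-identifiers seen in `category` in at least `min_weeks` distinct ISO weeks."""
    weeks_seen: dict[str, set[tuple[int, int]]] = defaultdict(set)
    for pseudo_id, d, cat in visits:
        if cat == category:
            weeks_seen[pseudo_id].add(tuple(d.isocalendar()[:2]))   # (year, week)
    return {pid for pid, weeks in weeks_seen.items() if len(weeks) >= min_weeks}

# Hypothetical data: one pseudonymous device seen weekly at a religious bookstore.
visits = [("anon-41", date(2012, 3, 5 + 7 * k), "religious bookstore") for k in range(4)]
print(recurring_weekly(visits, "religious bookstore"))   # {'anon-41'}
```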

It is frequently asserted that individuals cannot have a reasonable expectation of privacy in a public space (Otterberg, 2005). Contrary to those assertions, however, privacy expectations always have existed in public places, and continue to exist (VLRC, 2010). Tracking the movements of people as they go about their business is a breach of a fundamental expectation that people will be ‘let alone’. In policing, for example, in most democratic countries, it is against the law to covertly track an individual or their vehicle without specific, prior approval in the form of a warrant. This principle has, however, been compromised in many countries since 2001. Warrantless tracking using a mobile device generally results in the evidence, which has been obtained without the proper authority, being inadmissible in a court of law (Samuel, 2008). Some law enforcement agencies have argued for the abolition of the warrant process because the bureaucracy involved may mean that the suspect cannot be prosecuted for a crime they have likely committed (Ganz, 2005). These issues are not new; but far from eliminating a warrant process, the appropriate response is to invest the energy in streamlining this process (Bronitt, 2010).

Privacy risks arise not only from locational data of high integrity, but also from data that is or becomes associated with a person and that is inaccurate, misleading, or wrongly attributed to that individual. High levels of inaccuracy and unreliability were noted above in respect of all forms of location and tracking technologies. In the case of MDS services, claims have been made of 1–2 m locational accuracy. This has yet to be supported by experimental test cases, however, and hence there is uncertainty about the reliability of the inferences that the service-provider or the shop-owner draws. If the data is the subject of a warrant or subpoena, the data's inaccuracy could result in false accusations and even a miscarriage of justice, with the ‘wrong person’ finding themselves in the ‘right place’ at the ‘right time’.

5.2. Privacy more broadly

Privacy has multiple dimensions. One analysis, in Clarke (2006a), identifies four distinct aspects. Privacy of Personal Data, variously also ‘data privacy’ and ‘information privacy’, is the most widely discussed dimension of the four. Individuals claim that data about themselves should not be automatically available to other individuals and organisations, and that, even where data is possessed by another party, the individual must be able to exercise a substantial degree of control over that data and its use. The last five decades have seen the application of information technologies to a vast array of abuses of data privacy. The degree of privacy intrusiveness is a function of both the intensity and the richness of the data. Where multiple sources are combined, the impact is particularly likely to chill behaviour. An example is the correlation of video-feeds with mobile device tracking. The previous sub-section addressed that dimension.

Privacy of the Person, or ‘bodily privacy’, extends from freedom from torture and right to medical treatment, via compulsory immunisation and imposed treatments, to compulsory provision of samples of body fluids and body tissue, and obligations to submit to biometric measurement. Locational surveillance gives rise to concerns about personal safety. Physical privacy is directly threatened where a person who wishes to inflict harm is able to infer the present or near-future location of their target. Dramatic examples include assassins, kidnappers, ‘standover merchants’ and extortionists. But even people who are neither celebrities nor notorieties are subject to stalking and harassment (Fusco et al., 2012).

Privacy of Personal Communications is concerned with the need of individuals for freedom to communicate among themselves, without routine monitoring of their communications by other persons or organisations. Issues include ‘mail covers’, the use of directional microphones, ‘bugs’ and telephonic interception, with or without recording apparatus, and third-party access to email-messages. Locational surveillance thereby creates new threats to communications privacy. For example, the equivalent of ‘call records’ can be generated by combining the locations of two device-identifiers in order to infer that a face-to-face conversation occurred.

Privacy of Personal Behaviour encompasses ‘media privacy’, but particular concern arises in relation to sensitive matters such as sexual preferences and habits, political activities and religious practices. Some privacy analyses, particularly in Europe, extend this discussion to personal autonomy, liberty and the right of self-determination (e.g. King and Jessen, 2010). The notion of ‘private space’ is vital to economic and social aspects of behaviour, is relevant in ‘private places’ such as the home and toilet cubicles, but is also relevant and important in ‘public places’, where systematic observation and the recording of images and sounds are far more intrusive than casual observation by the few people in the vicinity.

Locational surveillance gives rise to rich sets of data about individuals' activities. The knowledge, or even suspicion, that such surveillance is undertaken, chills their behaviour. The chilling factor is vital in the case of political behaviour (Clarke, 2008). It is also of consequence in economic behaviour, because the inventors and innovators on whom new developments depend are commonly ‘different-thinkers’ and even ‘deviants’, who are liable to come to attention in mass surveillance dragnets, which tends to chill their behaviour, their interactions and their creativity.

Surveillance that generates accurate data is one form of threat. Surveillance that generates inaccurate data, or wrongly associates data with a particular person, is dangerous as well. Many inferences that arise from inaccurate data will be wrong, of course, but that won't prevent those inferences being drawn, resulting in unjustified behavioural privacy invasiveness, including unjustified association with people who are, perhaps for perfectly good reasons, themselves under suspicion.

In short, all dimensions of privacy are seriously affected by location surveillance. For deeper treatments of the topic, see Michael et al. (2006b) and Clarke and Wigan (2011).

5.3. Locational privacy and MDS

The recent innovation of tracking by means of mobile device signatures (MDS) gives rise to some issues additional to, or different from, mainstream device location technologies. This section accordingly considers this particular technique's implications in greater depth. Limited reliable information is currently available, and the analysis is of necessity based on supplier-published sources (PI, 2010a, 2010b) and media reports (Collier, 2011a,b,c).

Path Intelligence (PI) markets an MDS service to shopping mall-owners, to enable them to better value their floor space in terms of rental revenues, and to identify points of on-foot traffic congestion to on-sell physical advertising and marketing floor space (PI, 2010a). The company claims to detect each phone (and hence person) that enters a zone, and to capture data, including:

• how long each device and person stay, including dwell times in front of shop windows;

• repeat visits by shoppers in varying frequency durations; and

• typical route and circuit paths taken by shoppers as they go from shop to shop during a given shopping experience.

For malls, PI is able to determine such things as whether or not shoppers who shop at one establishment will also shop at another in the same mall, and whether or not people will go out of their way to visit a particular retail outlet independent of its location. For retailers, PI says it is able to provide information on conversion rates by department or even product line, and even which areas of the store might require more attention by staff during specific times of the day or week (PI, 2012).
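
The vendor's actual processing is not published, so the following is a hedged sketch only of how dwell times and repeat visits could in principle be derived from zone-level observations of the kind described above. The data format and threshold are assumptions made for illustration.

```python
# Hedged sketch: deriving per-zone dwell times from periodic zone-level observations.
from collections import defaultdict

def dwell_times(observations: list[tuple[str, float, str]]) -> dict[tuple[str, str], float]:
    """Total seconds each device spent in each zone.

    `observations` are (device_id, timestamp, zone) samples taken at intervals;
    consecutive samples of the same device in the same zone are merged into one dwell.
    """
    totals: dict[tuple[str, str], float] = defaultdict(float)
    by_device: dict[str, list[tuple[float, str]]] = defaultdict(list)
    for device_id, t, zone in observations:
        by_device[device_id].append((t, zone))
    for device_id, samples in by_device.items():
        samples.sort()
        for (t0, z0), (t1, z1) in zip(samples, samples[1:]):
            if z0 == z1:                       # still in the same zone
                totals[(device_id, z0)] += t1 - t0
    return totals

obs = [("dev-1", 0, "window-A"), ("dev-1", 30, "window-A"), ("dev-1", 60, "aisle-3"),
       ("dev-1", 90, "aisle-3"), ("dev-2", 10, "window-A"), ("dev-2", 40, "checkout")]
print(dwell_times(obs))   # {('dev-1', 'window-A'): 30.0, ('dev-1', 'aisle-3'): 30.0}
```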

PI says that it uses “complex algorithms” to determine the geographic position of a mobile phone, using strategically located “proprietary equipment” in a campus setting (PI, 2010a). The company states that it is conducting “data-driven analysis”, but that it is not collecting, or at least not disclosing, any personal information such as a name, mobile telephone number or the contents of short message service (SMS) messages. It states that it only ever provides aggregated data at varying zone levels to the shopping mall-owners. This is presumably justified on the basis that, using MDS techniques, direct identifiers are unlikely to be available, and a pseudo-identifier needs to be assigned. There is no explicit definition of what constitutes a zone. It is clear, however, that minimally-aggregated data at the highest geographic resolution is available for purchase, and at a higher price than more highly-aggregated data.

Shoppers have no relationship with the company, and it appears unlikely that they would even be aware that data about them is being collected and used. The only disclosure appears to be that “at each of our installations our equipment is clearly visible and labelled with our logo and website address” (PI, 2010a), but this is unlikely to be visible to many people, and in any case would not inform anyone who saw it.

In short, the company is generating revenue by monitoring signals from the mobile devices of people who visit a shopping mall for the purchase of goods and services. The data collection is performed without the knowledge of the person concerned (Renegar et al., 2008). The company is covertly collecting personal data and exploiting it for profit. There is no incentive or value proposition for the individual whose mobile is being tracked. No clear statement is provided about collection, storage, retention, use and disclosure of the data (Arnold, 2008). Even if privacy were not a human right, this would demand statutory intervention on the public policy grounds of commercial unfairness. The company asserts that “our privacy approach has been reviewed by the [US Federal Trade Commission] FTC, which determined that they are comfortable with our practices” (PI, 2010a). It makes no claims of such ‘approval’ anywhere else in the world.

The service could be extended beyond a mall and the individual stores within it, to, for example, associated walkways and parking areas, and surrounding areas such as government offices, entertainment zones and shopping-strips. Applications can also be readily envisaged on hospital and university campuses, and in airports and other transport hubs. From prior research, this is likely to expose the individual's place of employment, and even their residence (Michael et al., 2006a,b). Even if only aggregated data is sold to businesses, the individual records remain available to at least the service-provider.

The scope exists to combine this form of locational surveillance with video-surveillance such as in-store CCTV, and indeed this is claimed to be already a feature of the company's offering to retail stores. To the extent that a commonly-used identifier can be established (e.g. through association with the person's payment or loyalty card at a point-of-sale), the full battery of local and externally acquired customer transaction histories and consolidated ‘public records’ data can be linked to in-store behaviour (Michael and Michael, 2007). Longstanding visual surveillance is intersecting with well-established data surveillance, and being augmented by locational surveillance, giving breath to dataveillance, or what is now being referred to by some as ‘smart surveillance’ (Wright et al., 2010; IBM, 2011).

Surreptitious collection of personal data is (with exemptions and exceptions) largely against the law, even when undertaken by law enforcement personnel. The MDS mechanism also flies in the face of telephonic interception laws. How, then, can it be in any way acceptable for a form of warrantless tracking to be undertaken by or on behalf of corporations or mainstream government agencies, of shoppers in a mall, or travellers in an airport, or commuters in a transport hub? Why should a service-provider have the right to do what a law enforcement agency cannot normally do?

6. Controls

The tenor of the discussion to date has been that location surveillance harbours enormous threats to location privacy, but also to personal safety, the freedom to communicate, freedom of movement, and freedom of behaviour. This section examines the extent to which protections exist, firstly in the form of natural or intrinsic controls, and secondly in the form of legal provisions. The existing safeguards are found to be seriously inadequate, and it is therefore necessary to also examine the prospects for major enhancements to law, in order to achieve essential protections.

6.1. Intrinsic controls

A variety of forms of safeguard exist against harmful technologies and unreasonable applications of them. The intrinsic economic control – the cost of conducting surveillance, which once kept its volume in check – has largely evaporated, partly because the tools use electronics and the components are produced in high volumes at low unit cost. Another reason is that the advertising and marketing sectors are highly sophisticated, already hold and exploit vast quantities of personal data, and are readily geared up to exploit yet more data.

Neither the oxymoronic notion of ‘business ethics’ nor the personal morality of executives in business and government acts as any significant brake on the behaviours of corporations and governments, because they are very weak barriers, and they are readily rationalised away in the face of claims of enhanced efficiencies in, for example, marketing communications, fraud control, criminal justice and control over anti-social behaviour.

A further category of intrinsic control is ‘self-regulatory’ arrangements within relevant industry sectors. In 2010, for example, the Australian Mobile Telecommunications Association (AMTA) released industry guidelines to promote the privacy of people using LBS on mobile devices (AMTA, 2010). The guidelines were as follows:

1. Every LBS must be provided on an opt-in basis with a specific request from a user for the service

2. Every LBS must comply with all relevant privacy legislation

3. Every LBS must be designed to guard against consumers being located without their knowledge

4. Every LBS must allow consumers to maintain full control

5. Every LBS must enable customers to control who uses their location information and when that is appropriate, and be able to stop or suspend a service easily should they wish

The second point is a matter for parliaments, privacy oversight agencies and law enforcement agencies, and its inclusion in industry guidelines is for information only. The remainder, meanwhile, are at best ‘aspirational’, and at worst mere window-dressing. Codes of this nature are simply ignored by industry members. They are primarily a means to hold off the imposition of actual regulatory measures. Occasional short-term constraints may arise from flurries of media attention, but the ‘responsible’ organisations escape by suggesting that bad behaviour was limited to a few ‘cowboy’ organisations or was a one-time error that will not be repeated.

A case study of industry self-regulation is provided by the Biometrics Code issued by the misleadingly named Australian industry-and-users association, the Biometrics ‘Institute’ (BI, 2004). During the period 2009–2012, the privacy advocacy organisation, the Australian Privacy Foundation (APF), submitted to the Privacy Commissioner on multiple occasions that the Code failed to meet the stipulated requirements and, under the Commissioner's own Rules, had to be de-registered. The Code never had more than five subscribers (out of a base of well over 100 members – which was itself only a sub-set of organisations active in the area), and had no signatories among the major biometrics vendors or users, because all five subscribers were small organisations or consultants. In addition, none of the subscribers appear to have ever provided a link to the Code on their websites or in their Privacy Policy Statements (APF, 2012).

The Commissioner finally ended the farce in April 2012, citing the “low numbers of subscribers”, but avoided its responsibilities by permitting the ‘Institute’ to “request” revocation, over two years after the APF had made the same request (OAIC, 2012). The case represents an object lesson in the vacuousness of self-regulation and the business friendliness of a captive privacy oversight agency.

If economics, morality and industry sector politics are inadequate, perhaps competition and organisational self-interest might work. On the other hand, repeated proposals that privacy is a strategic factor for corporations and government agencies have fallen on stony ground (Clarke, 1996, 2006b).

The public can endeavour to exercise countervailing power against privacy-invasive practices. On the other hand, individuals acting alone are of little or no consequence to organisations that are intent on the application of location surveillance. Moreover, consumer organisations lack funding, professionalism and reach, and only occasionally attract sufficient media attention to force any meaningful responses from organisations deploying surveillance technologies.

Individuals may have direct surveillance countermeasures available to them, but relatively few people have the combination of motivation, technical competence and persistence to overcome lethargy and the natural human desire to believe that the institutions surrounding them are benign. In addition, some government agencies, corporations and (increasingly prevalent) public–private partnerships seek to deny anonymity, pseudonymity and multiple identities, and to impose so-called ‘real name’ policies, for example as a solution to the imagined epidemics of cyber-bullying, hate speech and child pornography. Individuals who use cryptography and other obfuscation techniques have to overcome the endeavours of business and government to stigmatise them as criminals with ‘something to hide’.

6.2. Legal controls

It is clear that natural or intrinsic controls have been utter failures in privacy matters generally, and will be in locational privacy matters as well. That leaves legal safeguards for personal freedoms as the sole protection. There are enormous differences among domestic laws relating to location surveillance. This section accordingly limits itself to generalities and examples.

Privacy laws are (with some qualifications, mainly in Europe) very weak instruments. Even where public servants and parliaments have an actual intention to protect privacy, rather than merely to overcome public concerns by passing placebo statutes, the draft Bills are countered by strong lobbying by government agencies and industry, to the extent that measures that were originally portrayed as being privacy-protective reach the statute books as authority for privacy breaches and surveillance (Clarke, 2000).

Privacy laws, once passed, are continually eroded by exceptions built into subsequent legislation, and by technological capabilities that were not contemplated when the laws were passed. In most countries, location privacy has yet to be specifically addressed in legislation. Even where it is encompassed by human rights and privacy laws, the coverage is generally imprecise and ambiguous. More direct and specific regulation may exist, however. In Australia, for example, the Telecommunications (Interception and Access) Act and the Surveillance Devices Act define and criminalise inappropriate interception and access, use, communication and publication of location information that is obtained from mobile device traffic (AG, 2005). On the other hand, when Google Inc. intercepted Wi-Fi signals and recorded the data that they contained, the Privacy Commissioner absolved the company (Riley, 2010), and the Australian Federal Police refused to prosecute despite the action – whether it was intentional, ‘inadvertent’ or merely plausibly deniable – being a clear breach of the criminal law (Moses, 2010; Stilgherrian, 2012).

The European Union determined a decade ago that location data that is identifiable to individuals is to some extent at least subject to existing data protection laws (EU, 2002). However, the wording of that so-called ‘e-Privacy Directive’ countenances the collection of “location data which are more precise than is necessary for the transmission of communications”, without clear controls over the justification, proportionality and transparency of that collection (para. 35). In addition, the e-Privacy Directive only applies to telecommunications service-providers, not to other organisations that acquire location and tracking data. King and Jessen (2010) discuss various gaps in the protective regimes in Europe.

The EU's Advisory Body (essentially a Committee of European Data Protection Commissioners) has issued an Opinion that mobile location data is generally capable of being associated with a person, and hence is personal data, and hence is subject to the EU Directive of 1995 and national laws that implement that Directive (Art. 29, 2011). Consent is considered to be generally necessary, and that consent must be informed, and sufficiently granular (p. 13–8).

It is unclear, however, to what extent this Opinion has actually caused, and will in the future cause, organisations that collect, store, use and disclose location data to change their practices. This uncertainty exists in respect of national security, law enforcement and social control agencies, which have, or which can arrange, legal authority that overrides data protection laws. It also applies to non-government organisations of all kinds, which can take advantage of exceptions, exemptions, loopholes, non-obviousness, obfuscation, unenforceability within each particular jurisdiction, and extra-jurisdictionality, to operate in ways that are in apparent breach of the Opinion.

Legal authorities for privacy-invasions are in a great many cases vague rather than precise, and in many jurisdictions power in relation to specific decisions is delegated to a LEA (in such forms as self-written ‘warrants’), or even a social control agency (in the form of demand-powers), rather than requiring a decision by a judicial officer based on evidence provided by the applicant.

Citizens in many countries are subject to more or less legitimate surveillance of various degrees and orders of granularity, by their government, in the name of law enforcement and national security. However, many Parliaments have granted powers to national security agencies to use location technology to track citizens and to intercept telecommunications. Moreover, many Parliaments have failed the public by permitting a warrant to be signed by a Minister, or even a public servant, rather than a judicial officer (Jay, 1999). Worse still, it appears that these already gross breaches of the principle of a free society are in effect being extended to the authorisation of a private organisation to track mobiles of ordinary citizens because it may lead to better services planning, or more efficient advertising and marketing (Collier, 2011a).

Data protection legislation in all countries evidences massive weaknesses. There are manifold exemptions and exceptions, and there are intentional and accidental exclusions, for example through limitations in the definitions of ‘identified’ and ‘personal data’. Even the much vaunted European laws fail to cope with extraterritoriality and are largely ignored by US-based service-providers. They are also focused exclusively on data, leaving large gaps in safeguards for physical, communications and behavioural privacy.

Meanwhile, a vast amount of abuse of personal data is achieved through the freedom of corporations and government agencies to pretend that Terms imposed on consumers and citizens without the scope to reject them are somehow the subject of informed and freely given consent. For example, petrol stations, supermarkets and many government agencies pretend that walking past signs saying ‘area subject to CCTV’ represents consent to gather, transmit, record, store, use and disclose data. The same approach is being adopted in relation to highly sensitive location data, and much vaunted data protection laws are simply subverted by the mirage of consent.

At least notices such as ‘you are now being watched’ or ‘smile, you are being recorded’ inform customers that they are under observation. On the other hand, people are generally oblivious to the fact that their mobile subscriber identity is transmitted from their mobile phone and multilaterated to yield a reasonably precise location in a shopping mall (Collier, 2011a,b,c). Further, there is no meaningful sense in which they can be claimed to have consented to providing location data to a third party, in this case a location service-provider with whom they have never had contact. And the emergent combination of MDS with CCTV sources becomes a pervasive view of the person, an ‘über’ view, providing a set of über-analytics to – at this stage – shopping complex owners and their constituents.
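
For readers unfamiliar with the technique, the following is a minimal sketch of multilateration under stated assumptions; it is illustrative only, and is in no way a description of any particular vendor's processing. Given the known positions of several receivers and rough range estimates to a handset (derived, for example, from signal timing or strength), a linearised least-squares solution yields an estimated position.

```python
# A minimal, illustrative multilateration sketch: estimate a 2D position from
# receiver positions and noisy range estimates, via linearised least squares.
import numpy as np

def multilaterate(anchors: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Estimate a 2D position from >= 3 receiver positions and range estimates."""
    # Subtract the first range equation from the others to remove the quadratic terms.
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (np.sum(anchors[1:] ** 2, axis=1) - ranges[1:] ** 2) \
        - (np.sum(anchors[0] ** 2) - ranges[0] ** 2)
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Hypothetical receivers at the corners of a 50 m x 40 m concourse.
anchors = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 40.0], [50.0, 40.0]])
true_position = np.array([18.0, 22.0])
rng = np.random.default_rng(1)
ranges = np.linalg.norm(anchors - true_position, axis=1) + rng.normal(0.0, 1.0, 4)
print(multilaterate(anchors, ranges))   # roughly [18, 22]
```

With receivers a few tens of metres apart and range errors of the order of a metre, estimates of roughly room-level precision appear plausible, which is broadly consistent with the kind of in-mall resolution discussed above.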

What rights do employees have if such a system were instituted in an employment setting (Michael and Rose, 2007, p. 252–3)? Are workplace surveillance laws in place that would protect employees from constant monitoring (Stern, 2007)? A similar problem applies to people at airports, or on hospital, university, industrial or government campuses. No social contract has been entered into between the parties, rendering the subscriber powerless.

Since the collapse of the Technology Assessment movement, technological deployment proceeds unimpeded, and public risks are addressed only after they have emerged and the clamour of concern has risen to a crescendo. A reactive force is at play, rather than proactive measures being taken to ensure avoidance or mitigation of potential privacy breaches (Michael et al., 2011). In Australia, for example, safeguards for location surveillance exist at best incidentally, in provisions under separate legislative regimes and in separate jurisdictions, and at worst not at all. No overarching framework exists to provide consistency among the laws. This causes confusion and inevitably results in inadequate protections (ALRC, 2008).

6.3. Prospective legal controls

Various learned studies have been conducted, but gather dust. In Australia, the three major law reform commissions have all reported, and all have been ignored by the legislatures (NSWLRC, 2005; ALRC, 2008; VLRC, 2010).

One critical need is for the fundamental principle to be recovered, to the effect that the handling of personal data requires either consent or legal authority. Consent is meaningless as a control over unreasonable behaviour, however, unless it satisfies a number of key conditions. It must be informed, it must be freely given, and it must be sufficiently granular, not bundled (Clarke, 2002). In a great many of the circumstances in which organisations are claiming to have consent to gather, store, use and disclose location data, the consumer does not appreciate the scope of handling that the service-provider is authorising itself to perform; the Terms are imposed by the service-provider and may even be varied or completely re-written without consultation, a period of notice or even any notice at all; and consent is bundled rather than the individual being able to construct a pattern of consents and denials that suits their personal needs. Discussions all too frequently focus on the specifically-US notion of ‘opt-out’ (or ‘presumed consent’), with consent debased to ‘opt-in’, and deprecated as inefficient and business-unfriendly.

Recently, some very weak proposals have been put forward, primarily in the USA. In 2011, for example, two US Senators proposed a Location Privacy Protection Bill (Cheng, 2011). An organisation that collected location data from mobile or wireless data devices would have to state explicitly in its privacy policy, in plain English, what was being collected. This would represent only a partial implementation of the already very weak 2006 recommendation of the Internet Engineering Task Force's Geographic Location/Privacy (IETF GEOPRIV) working group, which decided that technical systems should include ‘Fair Information Practices’ (FIPs) to defend against harms associated with the use of location technologies (EPIC, 2006). FIPs are, however, themselves only a highly cut-down version of effective privacy protections, and the Bill proposes only a small fraction of them. It would be close to worthless to consumers, and close to legislative authorisation for highly privacy-invasive actions by organisations.

Two other US senators tabled a GPS Bill, nominally intended to “balance the needs of Americans' privacy protections with the legitimate needs of law enforcement, and maintains emergency exceptions” (Anderson, 2011). The scope is very narrow – next would have to come the Wi-Fi Act, the A-GPS Act, etc. That approach is obviously unviable in the longer term as new innovations emerge. Effective legislation must have appropriate generality rather than excessive technology-specificity, and should be based on semantics not syntax. Yet worse, these Bills would provide legal authorisation for grossly privacy-invasive location and tracking. IETF engineers, and now Congressmen, want to compromise human rights and increase the imbalance of power between business and consumers.

7. Conclusions

Mobile device location technologies and their applications are enabling surveillance, and producing an enormous leap in intrusions into data privacy and into privacy of the person, privacy of personal communications, and privacy of personal behaviour.

Existing privacy laws are entirely incapable of protecting consumers and citizens against the onslaught. Even where consent is claimed, it generally fails the tests of being informed, freely given and granular.

There is an urgent need for outcries from oversight agencies, and responses from legislatures. Individual countries can provide some degree of protection, but the extra-territorial nature of so much of the private sector, and the use of corporate havens, in particular the USA, mean that multilateral action is essential in order to overcome the excesses arising from the US laissez faire traditions.

One approach to the problem would be location privacy protection legislation, although it would need to embody the complete suite of protections rather than the mere notification that the technology breaches privacy. An alternative approach is amendment of the current privacy legislation and other anti-terrorism legislation in order to create appropriate regulatory provisions, and close the gaps that LBS providers are exploiting (Koppel, 2010).

The chimeras of self-regulation, and the unenforceability of guidelines, are not safeguards. Sensitive data like location information must be subject to actual, enforced protections, with guidelines and codes no longer used as a substitute, but merely playing a supporting role. Unless substantial protections for personal location information are enacted and enforced, there will be an epidemic of unjustified, disproportionate and covert surveillance, conducted by government and business, and even by citizens (Gillespie, 2009; Abbas et al., 2011).

Acknowledgements

A preliminary version of the analysis presented in this paper appeared in the November 2011 edition of Precedent, the journal of the Lawyers Alliance. The article has been significantly updated as a result of comments provided by the referees and editor.

References

R. Abbas, The social and behavioural implications of location-based services: an observational study of users, Journal of Location Based Services, 5 (3–4) (December 2011)

R. Abbas, K. Michael, M.G. Michael, A. Aloudat, Emerging forms of covert surveillance using GPS-enabled devices, Journal of Cases on Information Technology, 13 (2) (2011), pp. 19-33

AG, What the government is doing: Surveillance Device Act 2004, Australian Government (25 May 2005) at http://www.ag.gov.au/agd/www/nationalsecurity.nsf/AllDocs/9B1F97B59105AEE6CA25700C0014CAF5?OpenDocument

ALRC, For your information: Australian privacy law and practice (ALRC report 108), Australian Government (2008), 2, p. 1409–10, http://www.alrc.gov.au/publications/report-108

AMTA, New mobile telecommunications industry guidelines and consumer tips set benchmark for location based services, Australian Mobile Telecommunications Association (2010) at http://www.amta.org.au/articles/New.mobile.telecommunications.industry.guidelines.and.consumer.tips.set.benchmark.for.Location.Based.Services

N. Anderson, Bipartisan bill would end government's warrantless GPS tracking, Ars Technica (June 2011) at http://arstechnica.com/tech-policy/news/2011/06/bipartisan-bill-would-end-governments-warrantless-gps-tracking.ars

APF, Revocation of the biometrics industry code, Australian Privacy Foundation (March 2012) at http://www.privacy.org.au/Papers/OAIC-BiomCodeRevoc-120321.pdf

B. Arnold, Privacy guide, Caslon Analytics (May 2008), at http://www.caslon.com.au/privacyguide19.htm

Art. 29, Opinion 13/2011 on geolocation services on smart mobile devices, Article 29 Data Protection Working Party, 881/11/EN WP 185, at http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2011/wp185_en.pdf (16 May 2011)

BI, Privacy code, Biometrics Institute, Sydney (April 2004) at http://web.archive.org/web/20050424120627/http://www.biometricsinstitute.org/displaycommon.cfm?an=1&subarticlenbr=8

A.J. Blumberg, P. Eckersley, On locational privacy, and how to avoid losing it forever, Electronic Frontier Foundation (August 2009), at https://www.eff.org/wp/locational-privacy

S. Bronitt, Regulating covert policing methods: from reactive to proactive models of admissibility, S. Bronitt, C. Harfield, K. Michael (Eds.), The social implications of covert policing (2010), pp. 9-14

J. Cheng, Franken's location-privacy bill would close mobile-tracking ‘loopholes’, Wired (17 June 2011), at http://www.wired.com/epicenter/2011/06/franken-location-loopholes/

K. Chetty, G.E. Smith, K. Woodbridge, Through-the-wall sensing of personnel using passive bistatic WiFi radar at standoff distances, IEEE Transactions on Geoscience and Remote Sensing, 50 (4) (April 2012), pp. 1218-1226

R. Clarke, Information technology and dataveillance, Communications of the ACM, 31 (5) (May 1988), pp. 498-512, at http://www.rogerclarke.com/DV/CACM88.html

R. Clarke, The digital persona and its application to data surveillance, The Information Society, 10 (2) (June 1994), pp. 77-92, at http://www.rogerclarke.com/DV/DigPersona.html

Clarke R. Privacy and dataveillance, and organisational strategy. In: Proc. I.S. Audit & Control Association (EDPAC'96), Perth, Western Australia; May 1996, at http://www.rogerclarke.com/DV/PStrat.html.

R. Clarke, Submission to the Commonwealth Attorney-General re: ‘a privacy scheme for the private sector: release of key provisions’ of 14 December 1999, Xamax Consultancy Pty Ltd (January 2000) at http://www.anu.edu.au/people/Roger.Clarke/DV/PAPSSub0001.html

R. Clarke, Person-location and person-tracking: technologies, risks and policy implications, Information Technology & People, 14 (2) (Summer 2001), pp. 206-231, at http://www.rogerclarke.com/DV/PLT.html

Clarke R. e-Consent: a critical element of trust in e-business. In: Proc. 15th Bled electronic commerce conference, Bled, Slovenia; June 2002, at http://www.rogerclarke.com/EC/eConsent.html.

R. Clarke, What's ‘privacy’? Xamax Consultancy Pty Ltd (2006a), August 2006, at http://www.rogerclarke.com/DV/Privacy.html

R. Clarke, Make privacy a strategic factor – the why and the how, Cutter IT Journal, 19 (11) (2006b), at http://www.rogerclarke.com/DV/APBD-0609.html

R. Clarke, Dissidentity: the political dimension of identity and privacy, Identity in the Information Society, 1 (1) (December 2008), pp. 221-228, at http://www.rogerclarke.com/DV/Dissidentity.html

Clarke R. The covert implementation of mass vehicle surveillance in Australia. In: Proc 4th workshop on the social implications of national security: covert policing, April 2009, ANU, Canberra; 2009a, at http://www.rogerclarke.com/DV/ANPR-Surv.html.

Clarke R. A sufficiently rich model of (id)entity, authentication and authorisation. In: Proc. IDIS 2009 – the 2nd multidisciplinary workshop on identity in the Information Society, LSE, 5 June 2009; 2009b, at http://www.rogerclarke.com/ID/IdModel-090605.html.

R. Clarke, A framework for surveillance analysis, Xamax Consultancy Pty Ltd (2009c), August 2009, at http://www.rogerclarke.com/DV/FSA.html

R. Clarke, What is überveillance? (And what should be done about it?) IEEE Technology and Society, 29 (2) (Summer 2010), pp. 17-25, at http://www.rogerclarke.com/DV/RNSA07.html

Clarke R. The cloudy future of consumer computing. In: Proc. 24th Bled eConference; June 2011, at http://www.rogerclarke.com/EC/CCC.html.

R. Clarke, M. Wigan, You are where you've been: the privacy implications of location and tracking technologies, Journal of Location Based Services, 5 (3–4) (December 2011), pp. 138-155, http://www.rogerclarke.com/DV/YAWYB-CWP.html

E.B. Cleff, Implementing the legal criteria of meaningful consent in the concept of mobile advertising, Computer Law & Security Review, 23 (2) (2007), pp. 262-269

E.B. Cleff, Effective approaches to regulate mobile advertising: moving towards a coordinated legal, self-regulatory and technical response, Computer Law & Security Review, 26 (2) (2010), pp. 158-169

K. Collier, Stores spy on shoppers, Herald Sun (2011), 12 October 2011, at http://www.heraldsun.com.au/news/more-news/stores-spy-on-shoppers/story-fn7x8me2-1226164244739

K. Collier, Shopping centres' Big Brother plan to track customers, Herald Sun (2011), 14 October 2011, at http://www.heraldsun.com.au/news/more-news/shopping-centres-big-brother-plan-to-track-customers/story-fn7x8me2-1226166191503

K. Collier, ‘Creepy’ path intelligence retail technology tracks shoppers, news.com.au (2011), 14 October 2011, at http://www.news.com.au/money/creepy-retail-technology-tracks-shoppers/story-e6frfmci-1226166413071

F. Dahunsi, B. Dwolatzky, An empirical investigation of the accuracy of location-based services in South Africa, Journal of Location Based Services, 6 (1) (March 2012), pp. 22-34

J. Dobson, P. Fisher, Geoslavery, IEEE Technology and Society, 22 (2003), pp. 47-52, cited in Raper et al. (2007)

Economist, Vehicle data recorders – watching your driving, The Economist (23 June 2012), at http://www.economist.com/node/21557309

J. Edwards, Apple has quietly started tracking iphone users again, and it's tricky to opt out, Business Insider (11 October 2012) at http://www.businessinsider.com/ifa-apples-iphone-tracking-in-ios-6-2012-10

EPIC, Privacy and human rights report 2006, Electronic Privacy Information Center, WorldLII (2006) at http://www.worldlii.org/int/journals/EPICPrivHR/2006/PHR2006-Location.html

EPIC, Investigations of Google street view, Electronic Privacy Information Center (2012), at http://epic.org/privacy/streetview/

EU, Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), Official Journal, L 201, 31/07/2002, pp. 0037-0047, European Commission, at http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32002L0058:en:HTML

J. Figueiras, S. Frattasi, Mobile positioning and tracking: from conventional to cooperative techniques, Wiley (2010)

S.J. Fusco, R. Abbas, K. Michael, A. Aloudat, Location-based social networking and its impact on trust in relationships, IEEE Technology and Society Magazine, 31 (2) (Summer 2012), pp. 39-50, at http://works.bepress.com/cgi/viewcontent.cgi?article=1326&context=kmichael

Gallagher T et al. Trials of commercial Wi-Fi positioning systems for indoor and urban canyons. In: Proc. IGNSS symposium, Queensland; 1–3 December 2009, cited in Zandbergen (2012).

J.S. Ganz, It's already public: why federal officers should not need warrants to use GPS vehicle tracking devices, Journal of Criminal Law and Criminology, 95 (4) (Summer 2005), pp. 1325-1337

A.A. Gillespie, Covert surveillance, human rights and the law, Irish Criminal Law Journal, 19 (3) (August 2009), pp. 71-79

IBM, IBM smart surveillance system (previously the PeopleVision project), IBM Research (30 October 2011), at http://www.research.ibm.com/peoplevision/

D.M. Jay, Use of covert surveillance obtained by search warrant, Australian Law Journal, 73 (1) (Jan 1999), pp. 34-36

N.J. King, P.W. Jessen, Profiling the mobile customer – privacy concerns when behavioural advertisers target mobile phones, Computer Law & Security Review, 26 (5) (2010), pp. 455-478, and 26 (6) (2010), pp. 595-612

A. Koppel, Warranting a warrant: fourth amendment concerns raised by law enforcement's warrantless use of GPS and cellular phone tracking, University of Miami Law Review, 64 (3) (April 2010), pp. 1061-1089

P. Lewis, Fears over privacy as police expand surveillance project, The Guardian (15 September 2008) at http://www.guardian.co.uk/uk/2008/sep/15/civilliberties.police

B. van Loenen, J. Zevenbergen, J. de Jong, Balancing location privacy with national security: a comparative analysis of three countries through the balancing framework of the European court of human rights, N.J. Patten, et al. (Eds.), National security: institutional approaches, Nova Science Publishers (2009), [chapter 2]

M. McGuire, K.N. Plataniotis, A.N. Venetsanopoulos, Data fusion of power and time measurements for mobile terminal location, IEEE Transactions on Mobile Computing, 4 (2005), pp. 142-153, cited in Raper et al. (2007)

S. Mann, J. Nolan, B. Wellman, Sousveillance: inventing and using wearable computing devices for data collection in surveillance environments, Surveillance & Society, 1 (3) (June 2003), pp. 331-355, at http://www.surveillance-and-society.org/articles1(3)/sousveillance.pdf

Mautz R. Overview of indoor positioning technologies. Keynote. In: Proc. IPIN'2011, Guimaraes; September 2011, at http://www.geometh.ethz.ch/people/.../IPIN_Keynote_Mautz_2011.pdf.

D. Mery, The mobile phone as self-inflicted surveillance – and if you don't have one, what have you got to hide? The Register (10 April 2009) at http://www.theregister.co.uk/2009/04/10/mobile_phone_tracking/

K. Michael, M.G. Michael, From dataveillance to überveillance and the Realpolitik of the Transparent Society, University of Wollongong (2007) at http://works.bepress.com/kmichael/51

K. Michael, M.G. Michael, Innovative automatic identification and location-based services: from bar codes to chip implants, IGI Global (2009)

M.G. Michael, K. Michael, Towards a state of uberveillance, IEEE Technology and Society Magazine, 29 (2) (Summer 2010), pp. 9-16, at http://works.bepress.com/kmichael/187

Michael K, McNamee A, Michael MG, Tootell H., Location-based intelligence – modeling behavior in humans using GPS. In: Proc. int'l symposium on technology and society, New York, 8–11 June 2006; 2006a, at http://ro.uow.edu.au/cgi/viewcontent.cgi?article=1384&context=infopapers.

Michael K, McNamee A, Michael MG. The emerging ethics of humancentric GPS tracking and monitoring. In: Proc. int'l conf. on mobile business, Copenhagen, Denmark. IEEE Computer Society; 2006b, at http://ro.uow.edu.au/cgi/viewcontent.cgi?article=1384&context=infopapers.

M.G. Michael, S.J. Fusco, K. Michael, A research note on ethics in the emerging age of uberveillance, Computer Communications, 31 (6) (2008), pp. 1192-1199, at http://works.bepress.com.ezproxy.uow.edu.au/kmichael/32/

K. Michael, A. Masters, Realized applications of positioning technologies in defense intelligence, H. Hussein Abbass, D. Essam (Eds.), Applications of information systems to homeland security and defense, Idea Group Publishing (2006), at http://works.bepress.com.ezproxy.uow.edu.au/kmichael/2

K. Michael, G. Rose, Human tracking technology in mutual legal assistance and police inter-state cooperation in international crimes, K. Michael, M.G. Michael (Eds.), From dataveillance to überveillance and the realpolitik of the transparent society. 1st ed, University of Wollongong, Wollongong (2007), pp. 241-256.

K. Michael, G. Roussos, G.Q. Huang, R. Gadh, A. Chattopadhyay, S. Prabhu, et al., Planetary-scale RFID services in an age of uberveillance, Proceedings of the IEEE, 98 (9) (2010), pp. 1663-1671

K. Michael, M.G. Michael, R. Abbas, The importance of scenarios in the prediction of the social implications of emerging technologies and services, Journal of Cases on Information Technology (JCIT) 13.2 (2011), pp. i-vii

A. Moses, Google escapes criminal charges for Wi-Fi snooping, The Sydney Morning Herald (6 December 2010) at http://www.smh.com.au/technology/security/google-escapes-criminal-charges-for-wifi-snooping-20101206-18lot.html

NSWLRC Surveillance, Report 108, NSW Law Reform Commission (2005) at http://www.lawlink.nsw.gov.au/lawlink/lrc/ll_lrc.nsf/pages/LRC_r108toc

OAIC. Office of the Australian Information Commissioner; April 2012, at http://www.comlaw.gov.au/Details/F2012L00869/Explanatory%20Statement/Text.

A.A. Otterberg, Note: GPS tracking technology: the case for revisiting Knotts and shifting the Supreme Court's theory of the public space under the fourth amendment, Boston College Law Review, 46 (2005) (2005), pp. 661-704

C. Parenti, The soft cage: surveillance in America from slavery to the war on terror, Basic Books (2003)

PI, Our commitment to privacy, Path Intelligence (2010), heading changed in late 2012 to ‘privacy by design’, at http://www.pathintelligence.com/en/products/footpath/privacy

PI, FootPath technology, Path Intelligence (2010) at http://www.pathintelligence.com/en/products/footpath/footpath-technology

PI Retail, Path Intelligence (2012), at http://www.pathintelligence.com/en/industries/retail

J. Raper, G. Gartner, H. Karimi, C. Rizos, A critical evaluation of location based services and their potential, Journal of Location Based Services, 1 (1) (2007), pp. 5-45

J. Raper, G. Gartner, H. Karimi, C. Rizos, Applications of location-based services: a selected review, Journal of Location Based Services, 1 (2) (2007), pp. 89-111

RE IEEE 802.11 standards tutorial, Radio-Electronics.com (2010), apparently of 2010, at http://www.radio-electronics.com/info/wireless/wi-fi/ieee-802-11-standards-tutorial.php

RE WiMAX IEEE 802.16 technology tutorial, Radio-Electronics.com (2010), apparently of 2010, at http://www.radio-electronics.com/info/wireless/wimax/wimax.php

RE Assisted GPS, A-GPS, Radio-Electronics.com (2012) apparently of 2012, at http://www.radio-electronics.com/info/cellulartelecomms/location_services/assisted_gps.php

Renegar BD, Michael K, Michael MG. Privacy, value and control issues in four mobile business applications. In: Proc. 7th int'l conf. on mobile business; 2008. p. 30–40.

J. Riley, Gov't ‘travesty’ in Google privacy case, ITWire, 20 (Wednesday 3 November 2010), p. 44, at http://www.itwire.com/it-policy-news/regulation/42898-govt-travesty-in-google-privacy-case

I.J. Samuel, Warrantless location tracking, New York University Law Review, 83 (2008), pp. 1324-1352

SHW, Skyhook location performance (2012), at http://www.skyhookwireless.com/location-technology/performance.php

Skyhook. (2012). Website entries, including ‘frequently asked questions’ at http://www.skyhookwireless.com/whoweare/faq.php, ‘privacy policy’ at http://www.skyhookwireless.com/whoweare/privacypolicy.php and ‘location privacy’ at http://www.skyhookwireless.com/whoweare/privacy.php.

C. Song, Z. Qu, N. Blumm, A.-L. Barabási, Limits of predictability in human mobility, Science, 327 (5968) (2010), pp. 1018-1021.

A. Stern, Man fired thanks to GPS tracking, Center Networks (31 August 2007), at http://www.centernetworks.com/man-fired-thanks-to-gps-tracking

Stilgherrian, Forget government data retention, Google has you wired, Crikey (2 October 2012), at http://www.crikey.com.au/2012/10/02/forget-government-data-retention-google-has-you-wired/

USGov, GPS accuracy, National Coordination Office for Space-Based Positioning, Navigation, and Timing (February 2012), at http://www.gps.gov/systems/gps/performance/accuracy/

VLRC, Surveillance in public spaces, Victorian Law Reform Commission (March 2010), Final report 18, at http://www.lawreform.vic.gov.au/wps/wcm/connect/justlib/Law+Reform/resources/3/6/36418680438a4b4eacc0fd34222e6833/Surveillance_final_report.pdf

D. Wright, M. Friedewald, S. Gutwirth, M. Langheinrich, E. Mordini, R. Bellanova, et al., Sorting out smart surveillance, Computer Law & Security Review, 26 (4) (2010), pp. 343-354

P.A. Zandbergen, Comparison of WiFi positioning on two mobile devices, Journal of Location Based Services, 6 (1) (March 2012), pp. 35-50

Keywords: Location-based systems (LBS), Cellular mobile, Wireless LAN, GPS, Mobile device signatures (MDS), Privacy, Surveillance, Überveillance

Citation: Katina Michael and Roger Clarke, "Location and tracking of mobile devices: Überveillance stalks the streets", Computer Law & Security Review, Vol. 29, No. 3, June 2013, pp. 216-228, DOI: https://doi.org/10.1016/j.clsr.2013.03.004

Towards a Conceptual Model of User Acceptance of Location-Based Emergency Services

Abstract

This paper investigates the introduction of location-based services by government as part of an all-hazards approach to modern emergency management solutions. Its main contribution is in exploring the determinants of an individual’s acceptance or rejection of location services. The authors put forward a conceptual model to better predict why an individual would accept or reject such services, especially with respect to emergencies. While it may be posited by government agencies that individuals would unanimously wish to accept life-saving and life-sustaining location services for their well-being, this view remains untested. The theorised determinants include: visibility of the service solution, perceived service quality features, risks perceived in using the service, trust in the service and the service provider, and perceived privacy concerns. The main concern here is to predict human behaviour, i.e. acceptance or rejection. Given that location-based services are fundamentally a set of electronic services, this paper employs the Technology Acceptance Model (TAM), a special adaptation of the Theory of Reasoned Action (TRA), as the theoretical foundation of its conceptualisation. A series of propositions is drawn from the mutual relationships between the determinants, and a conceptual model is constructed from these determinants, guided by the propositions. It is argued that the conceptual model presented offers the field of location-based services research a justifiable theoretical approach that can be exploited in further empirical research in a variety of contexts (e.g. national security).

1. Introduction

Emergency management (EM) activities have long been practiced in civil society. Such activities evolved from simple precautions and scattered procedures into more sophisticated management processes that include preparedness, protection, response, mitigation and recovery strategies (Canton, 2007). Since the twentieth century, governments have been utilising technologies such as sirens, speakers, radio, television and the internet to communicate and disseminate time-critical information to citizens about impending dangers, both during and after hazards. Over the past decade, location based services (LBS) have been implemented, or considered for implementation, by several countries to geographically deliver warnings, notifications and possibly life-saving information to people (Krishnamurthy, 2002; Weiss et al., 2006; Aloudat et al., 2007; Jagtman, 2010).

LBS take into account the pinpoint geographic position of a given device (handheld, wearable, implantable), and provide the user of the device with value-added information based on the derived locational information (Küpper, 2005; Perusco & Michael, 2007). The location information can be obtained by using various indoor and/or outdoor positioning technologies that differ in their range, coverage, precision, target market, purpose and functionality. Radio frequencies, cellular telecommunications networks and global navigation satellite systems are amongst the main access media used to determine the geographic location of a device (Michael, 2004; Perusco & Michael, 2007). The collected location information can be stored for the purpose of further processing (e.g. analysing the whereabouts of a fleet of emergency service vehicles over a period of time) or combined with other relevant information and sent back to the user in a value-added form (e.g. traffic accidents and alternative routes). The user can either initiate a request for the service, or it is triggered automatically when the device enters, leaves or comes into the vicinity of a defined geographic area.
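
The automatic trigger described above reduces, in its simplest form, to a containment test against a defined geographic area. The sketch below is a minimal illustration only, assuming a circular warning zone and the haversine great-circle distance; the coordinates, radius and function names are hypothetical and are not drawn from any particular LBS platform.

```python
# Illustrative sketch: decide whether a device's reported position falls
# inside a circular warning zone, using the haversine great-circle distance.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in km."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def inside_alert_zone(device_lat, device_lon, zone_lat, zone_lon, radius_km):
    """True if the device is within radius_km of the warning zone centre."""
    return haversine_km(device_lat, device_lon, zone_lat, zone_lon) <= radius_km

# Hypothetical example: a warning zone with a 5 km radius.
if inside_alert_zone(-34.424, 150.893, -34.410, 150.880, 5.0):
    print("Device is inside the warning area: push the alert message.")
```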

The conventional use of LBS in emergency management is to find the almost exact location of a mobile handset after an emergency call or a distress short message service (SMS). Although the accuracy of the positioning results ranges from a few metres up to several kilometres, the current objective of several governments is to require telecommunications carriers to provide the location information within accuracies of between 50 and 150 metres. This type of service is generally known as wireless E911 in North America (i.e. Canada and the United States), E112 in the European Union, and similarly, but not officially, E000 in Australia.

But even with such approximate levels of accuracy, LBS applications have the ability to create much more value when they are utilised under an all-hazards approach by government. For example, with LBS in use, government agencies pertinent to the emergency management portfolio can collaborate with telecommunications carriers in the country to disseminate rapid warnings and relevant safety messages to all active mobile handsets regarding severe weather conditions, an act of terrorism, an impending natural disaster or any other extreme event that has happened, or is about to happen, in the vicinity of these mobile handsets. For that reason, LBS solutions are viewed by different governments around the world as an extremely valuable addition to their arrangements for emergency notification purposes (Aloudat et al., 2007; Jagtman, 2010).

However, in relation to LBS and EM, almost no study has explored an individual’s acceptance of utilising the services for emergency management. One might rightly ponder whether any individual would ever forego LBS in a time of emergency. Nonetheless, despite the apparent benefits of this type of electronic service, their commercial utilisation has long raised numerous technical, social, ethical and legal issues amongst users. For example, issues have been raised ranging from the quality of the service information being provided, to the right of citizen privacy, to the legal liability for service failure or information disclosure (O’Connor & Godar, 2003; Tilson et al., 2004; Perusco et al., 2006; Perusco & Michael, 2007; Aloudat & Michael, 2011). Accordingly, the contribution of this paper is to discuss the potential determinants or drivers of a person’s acceptance or rejection of utilising location-based services for emergency management, and to propose a conceptual research model that comprises these drivers and serves as the theoretical basis needed for further empirical research.

The rest of this paper is organised as follows: Section 2 discusses the factors expected to impact on a person’s perceptions of the services and presents the theoretical propositions of the expected relationships between the factors. Section 3 introduces the conceptual model and its theoretical foundation. Section 4 outlines the steps taken to pretest the model via a pilot survey and provides the analysis results of the data collected. Section 5 concludes the paper and discusses the implications of this research work, including the theoretical contributions to the scholarly literature.

2. Determinants of acceptance or rejection

A review of the acceptance and adoption literature was conducted to identify, critically assess and then select the factors that would most likely influence individuals’ beliefs regarding the use of LBS for emergencies. This approach is justified by Taylor and Todd (1995) and by Venkatesh and Brown (2001) on the basis that there is a wealth of information systems (IS) acceptance research, which minimises the need to extract beliefs anew for each new acceptance setting. The adopted working definitions for the selected factors are summarised in Table 1.

Table 1. Summary of the constructs and their definitions

Factor | Description of the Adopted Working Definition | Based Upon
Individual’s attitude towards the use of LBS | Individual’s positive or negative feelings towards using LBS in emergencies. | Fishbein and Ajzen (1975)
Individual’s intention to use LBS | Individual’s decision to engage or not to engage in using LBS in emergencies. | Fishbein and Ajzen (1975)
Trust | The belief that allows a potential LBS user to willingly become vulnerable to the use-case outcome of LBS, having taken the characteristics of LBS into consideration, irrespective of the ability to monitor or control the services or the service provider. | Mayer et al. (1995), McKnight and Chervany (2001)
Risk as perceived by the potential user | Individual’s belief of the potential loss and the adverse consequences of using LBS in emergencies and the probability that these consequences may occur if the services are used. | Pavlou and Gefen (2004), Heijden et al. (2005)
Perceived usefulness | Individual’s perception that using LBS for managing emergencies is useful. | Davis et al. (1989)
Perceived ease of use | The degree to which the prospective user expects LBS to be free of effort. | Davis et al. (1989)
Visibility | The extent to which the actual use of LBS is observed as a solution to its potential users. | Agarwal and Prasad (1997)
Perceived service qualities | Individual’s global judgment relating to the superiority of the service. | Parasuraman et al. (1988)
Perceived currency | Prospective user’s perception of receiving up-to-the-minute service information during emergencies. | Zeithaml et al. (2000), Yang et al. (2003)
Perceived accuracy | Prospective user’s perception about the conformity of LBS with its actual attributes of content, location, and timing. | Zeithaml et al. (2000), Yang et al. (2003)
Perceived responsiveness | Prospective user’s perception of receiving a prompt LBS service during emergencies. | Parasuraman et al. (1988), Liljander et al. (2002), Yang et al. (2003)
Privacy concerns as perceived by the prospective user | Individual’s concerns regarding the level of control by others over personal identifiable information. | Stone et al. (1983)
Collection | The concern that extensive amounts of location information or other personal identifiable data will be collected when using LBS during emergencies. | Smith et al. (1996), Junglas and Spitzmuller (2005)
Unauthorised secondary use | The concern that LBS information is collected for emergency purposes but will be used for other purposes without explicit consent from the individual. | Smith et al. (1996), Junglas and Spitzmuller (2005)

A further discussion of each proposed factor and the criteria behind its selection are presented in the following sections.

2.1. The Visibility of Location-Based Emergency Services

Many individuals may not be aware of the possible utilisation of location-based services in emergency management and, therefore, it could be argued that the direct advantages and disadvantages of such utilisation are not visible to them (Pura, 2005; Chang et al., 2007). Individuals who are not aware of the existence of LBS, or basically do not know anything about the capabilities of this type of electronic service in the domain of emergency management, may not develop an appreciation, or even a depreciation, towards the services unless they are properly and repeatedly introduced (exposed) to LBS emergency management solutions. In other words, people may not be able to accurately judge the advantages or disadvantages of LBS unless the application of LBS is visible to them. It should be noted, however, that the exposure effect does not necessarily increase the perceived functionality of the services, but it can greatly enhance or degrade the perceptions of an individual about the usefulness of the services, thus influencing their acceptance or rejection of the services (Thong et al., 2004).

One of the key attributes of the Diffusion of Innovation (DOI) Theory by Rogers (1995) is the construct of observability, which is “the degree to which the results of an innovation are observable to others” (p. 16). Innovation is “an idea, practice, [technology, solution, service] or object that is perceived as new by an individual” (Rogers, 1995, p. 135). Later, observability was perceived by Moore and Benbasat (1991) as two distinct constructs of demonstrability and visibility. Demonstrability is “the tangibility of the results of using an innovation,” and visibility is “the extent to which potential adopters see the innovation as being visible in the adoption context” (Agarwal & Prasad, 1997, p. 562). Further interpretation of visibility surmises that an innovation, application, solution, technology or service may not be new but could be unknown to its prospective users. This is probably the case with LBS and their applications, where the services have been around for several years now, yet their general usage rates, especially in the context of emergency management, are still extremely limited worldwide (Frost & Sullivan, 2007; O’Doherty et al., 2007; Aloudat & Michael, 2010).

The main contribution of the DOI theory to this paper is the integration of its visibility construct in the proposed conceptual model. Visibility is defined as the extent to which the actual utilisation of LBS in EM is observed as a solution to its potential users. Considering the arguments above and following a line of reasoning in former studies, such as Karahanna et al., (1999) and Kurnia and Chien (2003), the following proposition is given:

Proposition P1: The perception of an individual of the usefulness of location-based services for emergency management is positively related to the degree to which the services as a solution are visible to him or her.

2.2. The Quality Features of Location-Based Emergency Services

A classic definition of service quality is that it is “a judgment, or attitude, relating to the superiority of the service” (Parasuraman et al., 1988, p. 16). Quality is, therefore, a result of personal subjective understanding and evaluation of the merits of the service. In the context of emergency management, individuals may not always have comprehensive knowledge about the attributes of LBS in such a context or the capabilities of the services for emergencies. Consequently, individuals may rely on indirect or inaccurate measures to judge such attributes. Therefore, there is a need to create verifiable direct measurements that present the subjective (perceived) quality in an objective way (determinable dimensions), in order to examine the impact of the quality features of LBS on people’s opinions towards utilising the services for EM.

The quality of electronic services (e-services) has been discerned by several researchers as a multifaceted concept, with different dimensions proposed for different service types (Zeithaml et al., 2002; Zhang & Prybutok, 2005). Unfortunately, in the context of LBS there is no existing consummate set of dimensions that can be employed to measure the impact of LBS quality features on people’s acceptance of the services. Nonetheless, a set by Liljander et al. (2002) can serve as a good candidate for this purpose. The set of Liljander et al. was adapted from the well-known work of Parasuraman et al. (1988), the SERVQUAL model, which they redesigned to accurately reflect the quality measurements of e-services. The dimensions of Liljander et al. (2002) include reliability, responsiveness, customisation, assurance/trust, and user interface.

Since LBS belong to the family of e-services, most of the aforementioned dimensions in Liljander et al.’s model are highly pertinent and can be utilised to the benefit of this research. In addition, as the dimensions are highly adaptable to capture new media (Liljander et al., 2002), it is expected that these dimensions would be capable of explaining people’s evaluation of the introduction of LBS into modern emergency management solutions. Moreover, the small number of these dimensions is expected to provide a parsimonious yet reliable approach to studying the impact of LBS quality features on people’s opinions, without the need to employ larger scales such as Zeithaml et al.’s (2000), which comprises eleven dimensions, making it almost impractical to employ along with other theorised constructs in any proposed conceptual model.

The interpretation of the reliability concept follows the definitions of Kaynama and Black (2000), Zeithaml et al. (2000) and Yang et al. (2003) as the accuracy and currency of the product information. For LBS to be considered reliable, the services need to be delivered in the most accurate state possible and within the promised time frame (Liljander et al., 2002). This is highly relevant to emergency situations, taking into account that individuals are most likely on the move and often in time-critical circumstances that always demand accurate and current services.

It is reasonable to postulate that the success of LBS utilisation in emergency situations depends on the ability of the government, as the provider of the service, to disseminate the service information to a large number of people in a timely fashion. Because fast response to changing situations, or to people’s emergent requests, amounts to providing timely information to them, timeliness is closely related to responsiveness (Lee, 2005). Therefore, investigating the responsiveness of LBS would also be relevant in this context.

Liljander et al.’s (2002) user interface and customisation dimensions are not explicitly pertinent to EM. The user interface dimension comprises factors such as aesthetics, which are hardly relevant to an emergency situation. Customisation refers to the state where information is presented to the user in a tailored format. This can be done both for and by the user. As LBS are already customised based on the location of the recipient and the type of information being sent to the user, customisation is an intrinsic quality of the core features of these services.

Therefore, the service quality dimensions that are expected to impact on people’s acceptance of LBS for EM include:

1. Perceived currency: the perceived quality of presenting up-to-the-minute service information during emergency situations;

2. Perceived accuracy: individual’s perception about the conformity of LBS with its actual attributes of content, location, and timing;

3. Perceived responsiveness: individual’s perception of receiving a prompt service (Parasuraman et al., 1988; Liljander et al., 2002; Yang et al., 2003).

Although perceived service quality is a representation of a person’s subjective expectations of LBS, and not necessarily a true interpretation of the actual attributes of the service, it is expected nonetheless that these perceptions would convey an accepted degree of quality the prospective user anticipates in LBS, given that only limited knowledge about the actual quality dimensions is available to them in the real world.

It could be posited that an individual’s perception of how useful LBS are in emergencies can be highly influenced by the degree to which he or she perceives the services to be accurate, current and responsive. Here, the conceptual model follows the same rationale as TAM, which postulates perceived ease of use as a direct determinant of perceived usefulness. Perceived ease of use is defined as “the degree to which an individual believes that using a particular system would be free of physical and mental effort” (Davis, 1989, p. 320). It is justifiable, therefore, to postulate that ease of use is related to the technical quality features of LBS, since an individual’s evaluation of how easy the service is to use is highly associated with the convenient design of the service itself. This explains why ease of use has been conceived by several researchers as one of the dimensions of service quality (Zeithaml et al., 2002; Yang et al., 2003; Zhang & Prybutok, 2005).

Building upon the mentioned arguments and following the trails of TAM, the LBS quality features of currency, accuracy and responsiveness are theorised in the conceptual model as direct determinants of the perceived usefulness and, accordingly, the following propositions are defined:

Proposition P2a: There is a positive relationship between the perceived currency of location-based services and the perceived usefulness of the services for emergency management;

Proposition P2b: There is a positive relationship between the perceived accuracy of location-based services and the perceived usefulness of the services for emergency management;

Proposition P2c: There is a positive relationship between the perceived responsiveness of location-based services and the perceived usefulness of the services for emergency management.

2.3. Risks Associated with Using Location-Based Emergency Services

Risk of varying types exists on a daily basis in a human’s life. In extreme situations, such as emergencies and disasters, perceptions of risk stem from the fact that the sequence of events and the magnitude of the outcome are largely unknown or cannot be totally controlled. If one takes into account that risky situations generally affect the confidence of people in technology (Im et al., 2008), then the decision of an individual to accept LBS for EM might be influenced by his or her intuition that these electronic services could easily be disrupted, since the underlying infrastructure may suffer heavily in the severe conditions usually associated with such situations, especially in large-scale disasters. A telling example is Hurricane Katrina, in 2005, which caused serious disruptions throughout New Orleans, Louisiana, and rendered inoperable almost every piece of public and private infrastructure in the city. As a result, uncertainty about the intensity of extreme situations, coupled with their unforeseeable contingencies, may have long-term implications for one’s perceptions towards the use of all technologies, including LBS, in life-threatening situations such as emergencies.

Since it is reasonable to believe that individuals would perceive different types of risk in emergencies, it might be highly difficult to examine particular facets of risk separately from one another, as they can all be inextricably intertwined. Therefore, following the theoretical justification of Pavlou (2003), perceived risk is theorised in the conceptual model as a high-order unidimensional concept.

Perceived risk is defined as the individual’s belief of the potential loss and the adverse consequences of using LBS in emergencies and the probability that these consequences may occur if the services are used. Bearing in mind the high uncertainty that is usually associated with such events, this paper puts forward the following proposition:

Proposition P3: The risks perceived in using location-based services for emergency management have a negative influence on the perceived usefulness of the services.

2.4. People’s Trust in Location-Based Emergency Services

Trust has long been regarded as an important aspect of human interactions and their mutual relationships. Basically, any intended interaction between two parties proactively requires an element of trust predicated on the degree of certainty in one’s expectations or beliefs of the other’s trustworthiness (Mayer et al., 1995; Li, 2008). Uncertainty in e-services, including LBS, leads individuals to reason about the capabilities of the services and their expected performance, which eventually brings them either to trust the services by willingly accepting to use them or to distrust the services by simply rejecting to use them. In emergencies, individuals may consider the possible risks associated with LBS before using this kind of service. Therefore, individuals are likely to trust the services and engage in a risk-taking relationship if they perceive that the benefits of LBS outweigh the risks. However, if high levels of risk are perceived, then it is most likely that individuals do not have enough trust in the services and, therefore, will not engage in a risk-taking behaviour by using them (Mayer et al., 1995). Consequently, it could be posited here that trust in LBS is a pivotal determinant of accepting the services, especially in emergency situations where great uncertainty is always present.

Trust has generally been defined as the belief that allows a person to willingly become vulnerable to the trustee after having taken the characteristics of the trustee into consideration, whether the trustee is another person, a product, a service, an institution or a group of people (McKnight & Chervany, 2001). In the context of LBS, the definition encompasses trust in the service provider (i.e. government in collaboration with telecommunications carriers) and trust in the services and their underlying infrastructure. This definition is in agreement with the generic model of trust in e-services, which encompasses two types of trust: trust in the government agency controlling and providing the service and trust in the technology and underlying infrastructure through which the service is provided (Tan & Thoen, 2001; Carter & Bélanger, 2005; Horkoff et al., 2006).

Since the willingness to use the services can be regarded as an indication that the person has taken into account the characteristics of both the services and the provider of the services, including any third parties in between, then it is highly plausible to say that investigating trust propensity in the services would provide a prediction of trust in both LBS and their provider. Some could reasonably argue that trust should be examined with the proposition that the person knows or, at least, has a presumption of knowledge about the services, their usefulness and the potential risks associated with them. Nonetheless, it should be noted here that trust is, ipso facto, a subjective interpretation of the trustworthiness of the services, given the limited knowledge of the actual usage of LBS in the domain of emergency management in the real world.

Despite the general consensus on the existence of a mutual relationship between trust and risk, the two concepts should be investigated separately when examining their impact on the acceptance of LBS, since both usually show a different set of antecedents (Junglas & Spitzmuller, 2006). Trust and perceived risks are essential constructs when uncertainty is present (Mayer et al., 1995). However, each of the two has a different type of relationship with uncertainty. While uncertainty augments the risk perceptions of LBS, trust reduces the individual’s concerns regarding the possible negative consequences of using the services, thus alleviating uncertainty around their performance (Morgan & Hunt, 1994; Nicolaou & McKnight, 2006).

Therefore, as trust in LBS lessens the uncertainty associated with the services, thus reducing the perceptions of risk, this paper theorises that perceived risk is negatively related to an individual’s trust in LBS. This is in line with a large body of previous empirical research, which supports the influence of trust on the perceptions of risk (Gefen et al., 2003). Furthermore, by reducing uncertainty, trust is assumed to create a positive perspective regarding the usefulness of the services and provide expectations of an acceptable level of performance. Accordingly, the following propositions are defined:

Proposition P4: Trust in location-based services positively influences the perceived usefulness of the services for emergency management;

Proposition P5: Trust in location-based services negatively impacts the risks perceived from using the services for emergency management.

2.5. Privacy Concerns Pertaining to Location-Based Emergency Services

In the context of LBS, privacy pertains mainly to the locational information of the person and the degree of control he or she exercises over this type of information. Location information is regarded as highly sensitive data from which, when collected over a period of time or combined with other personal information, a great deal can be inferred about a person’s movements, in turn revealing more than just one’s location. Indeed, Clarke and Wigan (2008) noted that knowing the past and present locations of a person could, amongst other things, enable the discovery of the person’s behavioural patterns in a way that could be used, for example, by governments to create a suspicion, or by the private sector to conduct target marketing.

Privacy concerns could originate when individuals become uncomfortable with the perception that there is a constant collection of their personal location information, the idea of its perennial availability to other parties, and the belief that they have incomplete control over the collection, the extent, the duration, the timing or the amount of data being collected about them.

The traditional commercial use of LBS, where a great level of detail about the service application is regularly available to the end user, may not create much sensitivity towards privacy, since in most cases the explicit consent of the user is a prerequisite for initiating these services. This is true in the markets of the United States, Europe and Australia (Gow, 2005; Code of Practice of Passive Location Services in the UK, 2006; The Australian Government: Attorney General’s Department, 2008). However, in emergencies, pertinent government departments and law enforcement agencies have the power to temporarily waive the person’s right to privacy based on the assumption that consent is already implied when collecting location information in such situations (Gow, 2005; Pura, 2005).

The implications of waiving the consent, even temporarily, may have long-term adverse effects on people’s perspectives towards the services in general. It also has the potential to raise a debate about the extent to which individuals are truly willing to relinquish their privacy in exchange for a sense of continuous security (Perusco et al., 2006). The debate could easily be augmented in the current political climate of the so-called “war on terror”, where governments have started to bestow additional powers on themselves to monitor, track, and gather personal location information in a way that never could have been justified before (Perusco & Michael, 2007). As a result, emergency management is no exception when it comes to privacy concerns.

Four privacy concerns have been identified previously by Smith et al. (1996). They are collection, unauthorised secondary use, errors in storage, and improper access of the collected data. These concerns are also pertinent to LBS (Junglas & Spitzmuller, 2006). Collection is defined as the concern that extensive amounts of location information or other personal identifiable information would be collected when using LBS for emergency management. Unauthorised secondary use is defined as the concern that LBS information is collected for the purposes of emergency management but ultimately is used for other purposes and without explicit consent from the individual. Errors in storage describe the concern that the procedures taken against accidental or deliberate errors in storing location information are inadequate. And improper access is the concern that the stored location information is accessed by parties who do not have the authority to do so.

Two particular privacy concerns, collection and unauthorised secondary use, are integrated into the conceptual model. These concerns are expected to have a direct negative impact on the perceived usefulness of LBS. The other prominent constructs of trust and perceived risk are assumed to have mediating effects on the relationship between privacy concerns and perceived usefulness, since both constructs (i.e. trust and perceived risk) could reasonably be regarded as outcomes of the individual’s assessment of the privacy concerns. For instance, if a person does not have much concern about the privacy of his or her location information, then it is most likely that he or she trusts the services, thus perceiving LBS to be beneficial and useful. On the other hand, if the perceptions of privacy concerns were high, then the individual would probably not engage in risk-taking behaviour, resulting in lower perceptions of the usefulness of the services.

Accordingly, perceived privacy concerns are theorised in the conceptual model as direct determinants of both trust and perceived risk. While perceived privacy concerns are postulated to have a negative impact on trust in the services, they are theorised to have a positive influence on the risks perceived from using location-based services for emergency management.

Considering the above-mentioned arguments, the following propositions are made:

Proposition P6a: Collection, as a perceived privacy concern, negatively impacts the perceived usefulness of location-based services for emergency management;

Proposition P6b: Unauthorised secondary use, as a perceived privacy concern, negatively impacts the perceived usefulness of location-based services for emergencies;

Proposition P7a: Collection, as a perceived privacy concern, has a negative impact on trust in location-based services;

Proposition P7b: Unauthorised secondary use, as a perceived privacy concern, has a negative impact on trust in location-based services;

Proposition P8a: The risks perceived from using location-based services for emergency management are positively associated with the perceived privacy concern of collection;

Proposition P8b: The risks perceived from using location-based services for emergency management are positively associated with the perceived privacy concern of unauthorised secondary use.

3. A Conceptual Model of Location-Based Emergency Services Acceptance

The determinants of LBS acceptance are integrated into a conceptual model that extends and builds upon the established theory of reasoned action (TRA), applied in a technology-specific adaptation as a technology acceptance model (TAM). See Figure 1.

Figure 1. The conceptual model of location-based emergency services acceptance

TAM postulates that usage or adoption behaviour is predicted by the individual’s intention to use location-based emergency services. The behavioural intention is determined by the individual’s attitude towards using the services. Both the attitude and intention are postulated as the main predictors of acceptance. The attitude, in turn, is influenced by two key beliefs: perceived ease of use and perceived usefulness of LBS. TAM also grants a basis for investigating the influence of external factors on its internal beliefs, attitude, and intention (Davis et al., 1989).

As illustrated in the model in Figure 1, a set of propositions that reflect the theoretical relationships between the determinants of acceptance are presented as arrowed lines that start from the influential factor and end at the dependent construct. The theorised factors supplement TAM’s original set and are totally in agreement with its theoretical structural formulation. That is, all of the hypothesised effects of the factors would only be exhibited on the internal constructs (i.e. attitude and intention) through the full mediation of the internal beliefs (i.e. perceived usefulness or perceived ease of use).
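
To make the structure of Figure 1 explicit, the sketch below records the theorised paths and their expected signs as a simple data structure. It is an illustrative summary only; the construct names are shorthand for the factors defined in Table 1, and the TAM core paths are included as described by Davis et al. (1989).

```python
# Illustrative sketch: the proposition set of the conceptual model expressed
# as directed paths (predictor, outcome, expected sign).
PROPOSED_PATHS = [
    ("visibility",                 "perceived_usefulness", "+"),  # P1
    ("perceived_currency",         "perceived_usefulness", "+"),  # P2a
    ("perceived_accuracy",         "perceived_usefulness", "+"),  # P2b
    ("perceived_responsiveness",   "perceived_usefulness", "+"),  # P2c
    ("perceived_risk",             "perceived_usefulness", "-"),  # P3
    ("trust",                      "perceived_usefulness", "+"),  # P4
    ("trust",                      "perceived_risk",       "-"),  # P5
    ("collection",                 "perceived_usefulness", "-"),  # P6a
    ("unauthorised_secondary_use", "perceived_usefulness", "-"),  # P6b
    ("collection",                 "trust",                "-"),  # P7a
    ("unauthorised_secondary_use", "trust",                "-"),  # P7b
    ("collection",                 "perceived_risk",       "+"),  # P8a
    ("unauthorised_secondary_use", "perceived_risk",       "+"),  # P8b
    # TAM core paths (Davis et al., 1989)
    ("perceived_usefulness",       "attitude",             "+"),
    ("perceived_ease_of_use",      "attitude",             "+"),
    ("perceived_ease_of_use",      "perceived_usefulness", "+"),
    ("attitude",                   "intention_to_use",     "+"),
]

def predictors_of(construct):
    """List the theorised direct determinants of a given construct."""
    return [(src, sign) for src, dst, sign in PROPOSED_PATHS if dst == construct]

print(predictors_of("perceived_usefulness"))
```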

4. Model Pretesting

A pilot survey was conducted in order to test the reliability of the model’s constructs. IS literature places great emphasis on the importance of the piloting stage as part of the model’s development (Baker, 1999; Teijlingen & Hundley, 2001). In essence, the pilot survey is an experimental study that aims to collect data from a small set of subjects in order to discover any defects or flaws that can be corrected, before the conceptual model is tested in a large scale survey (Baker, 1999; Zikmund, 2003).

4.1. Measurement of Constructs

To increase construct measurement reliability, most of the items in the survey, which have been tested and validated in former studies, were adapted to reflect the specific context of this research, i.e. location-based services. It should be emphasised here that the use of existing items from the literature is a completely valid approach (Churchill, 1979).

The scales of TAM’s perceived usefulness and perceived ease of use were measured based on the original scales of Davis (1989). Attitude measurement items were adopted from two studies by Agarwal and Prasad (1999) and Van der Heijden et al. (2001). Intention to use items were measured using scales adopted from Junglas and Spitzmuller (2005). Trust measurements were adopted from Mayer et al. (1995) and Junglas and Spitzmuller (2005). Perceived risk items were adopted from Pavlou and Gefen (2004), given the emphasis on emergency management. The items of the visibility construct were adopted from a study by Karahanna et al. (1999). The items of perceived privacy concerns were adopted from Smith et al. (1996) and Junglas and Spitzmuller (2005). The statements of perceived service qualities were not directly available but were operationalised based on the recommendations of Churchill (1979), who suggested that each statement should express a limited meaning, its dimensions should be kept simple and the wording should be straightforward.

4.2. Survey Design

The survey included an overview and introduction of the application of location-based services in emergency management. In addition, the survey provided the participants with four vignettes. Each vignette depicted a hypothetical scenario about the possible uses of LBS applications for managing potentially hazardous situations. The scenarios covered topics specific to emergencies, such as an impending natural disaster, a situation where a person was particularly in need of medical assistance, severe weather conditions and a national security issue. Two of the vignettes were designed to present location-based services in a favourable light, and the other two were designed to draw out the potential pitfalls and limitations of LBS in EM. Through the use of vignettes, participants were encouraged to project their true perceptions about the services while, at the same time, being involved in creating a meaning related to their potential use in these events. Creating this meaningful attachment in context was very important, as it acted to inform participant responses, given that the utilisation of LBS in EM is still in its nascent stages worldwide.

A self-administered questionnaire was used to collect data from participants. A five-point Likert rating scale was used throughout the questionnaire. The survey, which predominantly yielded quantitative results, also included one open-ended question in order to solicit more detailed responses from the participants.

4.3. The Sample of the Pilot Survey

Six hundred pilot surveys were randomly distributed by hand, in November 2008, to households’ mailboxes in the Illawarra region and the City of Wollongong, New South Wales, Australia. Participants were asked to return their copies to the researcher within three weeks in the enclosed reply-paid envelope provided with the survey.

Although this traditional approach is time-consuming and demands a lot of physical effort, it was favoured as it is more resilient to social desirability effects (Zikmund, 2003), where respondents reply in a way they think is more socially appropriate (Cook & Campbell, 1979). In addition, it is generally associated with high perceptions of anonymity, something that may not be completely assured or guaranteed by other methods of data collection, since they tend to disclose some personal information, such as name, telephone number, email address or IP address, which may cause privacy infringements (Zikmund, 2003; Michaelidou & Dibb, 2006).

The main concern was ending up with a low response rate, an issue several researchers have noted before (Yu & Cooper, 1983; Galpin, 1987; Zikmund, 2003). Indeed, a total of 35 replies were returned, yielding an extremely low response rate of 5.8%. Two incomplete replies were excluded, leaving only 33 usable surveys for the final analysis.
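
As a quick arithmetic check of the reported figures, 35 returned surveys out of 600 distributed corresponds to a raw response rate of about 5.8 percent, with 33 usable surveys remaining after excluding the two incomplete replies; the short sketch below simply reproduces that calculation.

```python
# Quick check of the reported pilot-survey response figures.
distributed = 600
returned = 35
incomplete = 2

response_rate = returned / distributed * 100
usable = returned - incomplete

print(f"Response rate: {response_rate:.1f}%")  # 5.8%
print(f"Usable surveys: {usable}")             # 33
```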

Although it is a desirable goal to end up with a high response rate in order to have more confidence in the results, and to be able to comment on the significance of the findings (Emory & Cooper, 1991; Saunders et al., 2007), it should be noted that the pilot study’s main objective is to serve as an initial test (pretest) of the conceptual model; it does not, in any way, attempt to generalise its results to a new population. Therefore, the generalisability of the findings is not an issue of contention here (Morgan & Hunt, 1994).

Nonetheless, there is much discussion in the literature of what constitutes a “good” response rate for a pilot survey and, hence, an acceptable sample size. Hunt et al. (1982), for example, stated that several researchers simply recommended a “small” sample size, while others indicated a sample size between 12 and 30 as sufficient to fulfil the requirements of the analysis. Anderson and Gerbing (1991) pretested a methodology for predicting the performance of measures in a confirmatory factor analysis with a sample size of 20. They posited the consistency of this small sample size with the general agreement between researchers that the number should be relatively small. Reynolds et al. (1993) noted that the sample size of pilot surveys is generally small when discussed in the literature, ranging from 5 to 100, depending on the goal of the study.

The main concern, however, when assessing the effect of a low response rate on the validity of the survey is the possibility of non-response bias (Cummings et al., 2001; Fowler, 2001). The bias stems from the possibility that only those in the sample population who are interested in the topic of the pilot survey would provide their responses (Fowler, 2001). Nonetheless, if non-respondents’ characteristics are systematically similar to those of the respondents, then non-response bias is not necessarily reduced by an increased response rate (Cummings et al., 2001).

Kanuk and Berenson (1975), in a comprehensive literature review of the factors influencing response rates to mail surveys, examined the significant differences between respondents and non-respondents, taking into account a broad range of personality traits, socio-economic and demographic characteristics. The researchers concluded that the only consistent difference was that respondents tend to be better educated.

Since respondents of this pilot survey were of all levels of education, as illustrated in Table 2 (for example, 7 respondents had a secondary education while 7 had postgraduate degrees, representing both the less educated and the well-educated population), it is argued that non-respondents did not differ significantly from the survey’s respondents. This suggests that non-response bias was not present and, therefore, that the low response rate is not an issue here. Thus, the pilot survey, with its low response rate and no systematic differences between respondents and non-respondents, is considered valid for the analysis.

Table 2. Respondents’ education

The traditional benchmarks in mail survey studies that positioned a 50 percent response rate as adequate and 70 percent as very good (Babbie, 1998) should be reappraised. Current trends of thinking reject these unconditional criterion levels and demand a contextual approach, where the response rate is considered in conjunction with the goal of the study, its design and the nature of its sample (Fife-Schaw, 2000; Fowler, 2001).

4.4. Reliability of the Measurements

Reliability expresses the extent to which the measures in the instrument are free of random errors, thus yielding similar, consistent results if repeated (Yin, 2003; Zikmund, 2003). Reliability reflects the internal consistency of the scale items measuring the same construct for the selected data. Hence, it is basically an evaluation of the measurement accuracy (Straub, 1989). Nunnally and Bernstein (1994) recommended the calculation of Cronbach’s alpha coefficients to assess reliability. Straub (1989) suggested an alpha value of 0.80 as the lowest accepted threshold. However, Nunnally and Bernstein (1994) stated that 0.60 is accepted for newly developed measures; otherwise, 0.70 should serve as the lowest cut-off value.
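
As an illustration of the statistic referred to above, the sketch below computes Cronbach’s alpha for a single multi-item construct from hypothetical five-point Likert responses; the data shown are invented for demonstration and are not this study’s responses.

```python
# Illustrative sketch: Cronbach's alpha for one multi-item construct
# (rows = respondents, columns = items belonging to the same scale).
import numpy as np

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_var.sum() / total_var)

# Hypothetical responses to a four-item construct from six respondents.
responses = np.array([
    [4, 4, 5, 4],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
    [2, 3, 2, 2],
    [4, 5, 4, 4],
    [3, 3, 3, 4],
])
print(round(cronbach_alpha(responses), 3))  # compare against the 0.70 cut-off
```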

The common threshold value of 0.7 was selected as the minimum acceptable level based on the recommendations of Nunnally and Bernstein (1994) and Agarwal and Karahanna (2000). The results of the analysis are presented in Table 3, revealing acceptable values for nearly all measurements except perceived accuracy, which was found to be 0.684. Accordingly, one highly complex item was excluded and the revised construct was put through another round of validation, after which a higher, acceptable coefficient of 0.724 was yielded.

Table 3. Cronbach’s alpha reliability statistics

Another reliability scale assessment, through the computation of composite reliability, was also conducted. It is similar in interpretation to Cronbach’s alpha test, but it applies the actual loadings of the items and does not assume weight equivalency among them (Chin, 1998). Moreover, Raykov (1997) showed that Cronbach’s test may underestimate the reliability of congeneric measures, leaving the researcher with lower-bound estimates of the true reliability scores. As illustrated in Table 4, the results show that all scores far exceed the recommended 0.7 threshold (Hair et al., 2006). Consequently, these results bring more confidence in the conceptual model and its constructs, as they have demonstrated high internal consistency under the evaluation of two separate reliability tests.
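
For comparison, the sketch below shows how composite reliability is typically computed from standardised item loadings, assuming uncorrelated measurement errors; the loadings shown are hypothetical and are not values reported in Table 4.

```python
# Illustrative sketch: composite reliability from standardised item loadings,
# with the error variance of each item taken as 1 - loading**2.
def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    sum_loadings = sum(loadings)
    error_var = sum(1 - l ** 2 for l in loadings)
    return sum_loadings ** 2 / (sum_loadings ** 2 + error_var)

# Hypothetical standardised loadings for a four-item construct.
print(round(composite_reliability([0.82, 0.78, 0.75, 0.80]), 3))  # ~0.867
```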

Table 4. Composite reliability statistics

5. Conclusion and Implications

Despite the large body of research that has been written to augment our understanding of the determinants of acceptance and adoption of location-based services in various usage contexts, there is a noted scarcity of theoretical and empirical studies that examine people’s acceptance of LBS in the realm of emergencies. This is clearly a gap in the current research, to which this study makes a significant contribution. This paper is a discussion of unexplored determinants in relation to user acceptance of location-based emergency services. These include the visibility of LBS applications in the context of emergency management, the privacy of individuals and their perceived concerns regarding extensive collection and unauthorised secondary use of the collected data by governments, the risks associated with using LBS for EM, trust in the services and in the service provider, and the current, accurate and responsive quality features of the services being offered for emergency management.

This paper proposed a conceptual model based on the aforementioned determinants that should serve as the theoretical basis for future empirical examination of acceptance. The model significantly extends and builds upon the theory of reasoned action, applied in a technology-specific adaptation as a technology acceptance model.

Although the conceptual model was built specifically to predict an individual’s acceptance of LBS for emergency management, the model can nonetheless be used as a generic candidate model in empirical studies to predict people’s acceptance of location-based services in other security usage contexts, applications, scenarios or settings. This is made possible due to the fact that all of the theorised factors of the model are highly relevant to the intrinsic characteristics of LBS. Examples of where the model would be deemed particularly useful include law enforcement applications, such as matters related to the surveillance implications of location-based services, and location-based evidence captures and social issues pertaining to the application of the services, such as arrest support, traffic violations or riot control.

In addition, the proposed model can be used not only to identify the predictors of acceptance but also to help the service providers to design their solutions in a way that can fairly meet the end user expectations. For instance, the model identifies perceived usefulness, perceived ease of use and perceived service quality features as expected determinants of acceptance. Once empirically tested, the impact of these factors can provide guidelines to developers of the services to accommodate the right service requirements that reflect acceptable performance standards for the potential users.

Finally, the application of location-based services in today’s society has the potential to raise concerns amongst users. These concerns could easily be augmented in highly sensitive settings, such as emergency management or counter-terrorism solutions. While this paper presents theoretical foundations, it is hoped the knowledge obtained here can be considered by governments and interested researchers towards developing more successful deployment and diffusion strategies for location-based emergency services globally. The purpose of this paper is to help channel such strategies in the right direction.

References

Agarwal, R., & Prasad, J. (1997). The role of innovation characteristics and perceived voluntariness in the acceptance of information technologies. Decision Sciences, 28(3), 557–582. doi:10.1111/j.1540-5915.1997.tb01322.x.

Aloudat, A., & Michael, K. (2010). Toward the regulation of ubiquitous mobile government: A case study on location-based emergency services in Australia. Electronic Commerce Research, 10(4).

Aloudat, A., & Michael, K. (2011). The socio-ethical considerations surrounding government mandated location-based services during emergencies: An Australian case study. In M. Quigley (Ed.), ICT ethics and security in the 21st century: New developments and applications (1st ed., pp. 129–154). Hershey, PA: IGI Global. doi:10.4018/978-1-60960-573-5.ch007.

Aloudat, A., Michael, K., & Jun, Y. (2007). Location-based services in emergency management – from government to citizens: Global case studies. In P. Mendis, J. Lai, E. Dawson, & H. Abbass (Eds.), Recent advances in security technology (pp. 190–201). Melbourne, Australia: Australian Homeland Security Research Centre.

Canton, L. G. (2007). Emergency management: Concepts and strategies for effective programs (1st ed.). Hoboken, NJ: John Wiley & Sons, Inc.

Carter, L., & Bélanger, F. (2005). The utilization of e-government services: Citizen trust, innovation and acceptance factors. Information Systems Journal, 15(1), 5–25. doi:10.1111/j.1365-2575.2005.00183.x.

Chang, S., Hsieh, Y.-J., Lee, T.-R., Liao, C.-K., & Wang, S.-T. (2007). A user study on the adoption of location based services. In Advances in web and network technologies, and information management (pp. 276–286).

Clarke, R., & Wigan, M. (2008). You are where you have been. In K. Michael, & M. G. Michael (Eds.), Australia and the new technologies: Evidence based policy in public administration (pp. 100–114). Canberra: University of Wollongong.

Code of Practice of Passive Location Services in the UK. (2006). Industry code of practice for the use of mobile phone technology to provide passive location services in the UK. Retrieved August 23, 2007, from http://www.mobilebroadbandgroup.com/documents/UKCoP_location_servs_210706v_pub_clean.pdf

Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. Management Information Systems Quarterly, 13(3), 318–340. doi:10.2307/249008.

Frost and Sullivan research service. (2007). Asia Pacific location-based services (LBS) markets. Retrieved August 28, 2007, from http://www.frost.com/prod/servlet/report-brochure.pag?id=P08D-01-00-00-00

Gefen, D., Srinivasan Rao, V., & Tractinsky, N. (2003, January 6-9). The conceptualization of trust, risk and their electronic commerce: the need for clarifications. In Proceedings of the 36th Annual Hawaii International Conference on System Sciences. IEEEXplore Database.

Gow, G. A. (2005). Pinpointing consent: Location privacy and mobile phones. In K. Nyíri (Ed.), A sense of place: The global and the local in mobile communication (pp. 139–150). Vienna, Austria: Passagen Verlag.

Horkoff, J., Yu, E., & Liu, L. (2006). Analyzing trust in technology strategies. In Proceedings of the the 2006 International Conference on Privacy, Security and Trust: Bridge the Gap Between PST Technologies and Business Services, Markham, Ontario, Canada.

Im, I., Kim, Y., & Han, H.-J. (2008). The effects of perceived risk and technology type on users’ accep­tance of technologies. Information & Management, 45(1), 1–9. doi: 10. 1016/j.im.2007.03.005.

Jagtman, H. M. (2010). Cell broadcast trials in the Netherlands: Using mobile phone technology for citizens’alarming. Reliability Engineering & System Safety, 95(1), 18–28. doi: 10. 1016/j.ress.2009.07.005.

Junglas,I., & Spitzmuller, C. (2006). Personality traits and privacy perceptions: An empirical study in the context of location-based services. In Proceedings of the International Conference on Mobile Busi­ness, Copenhagen, Denmark (pp. 11). IEEEXplore Database.

Karahanna, E., Straub, D. W., & Chervany, N. L. (1999). Information technology adoption across time: A cross-sectional comparison of pre- adoption and post-adoption beliefs. Management Information Systems Quarterly, 23(2), 183–213. doi: 10.2307/249751.

Kaynama, S. A., & Black, C. I. (2000). A proposal to assess the service quality of online travel agencies. Journal of Professional Services Marketing, 21(1), 63–68. doi: 10. 1300/J090v21n01_05.

Krishnamurthy, N. (2002, December 15-17). Using SMS to deliver location-based services. In Proceed­ings of the 2002 IEEE International Conference on Personal Wireless Communications, New Delhi, India.

Küpper, A. (2005). Location-based services: Fundamentals and operation (1st ed.). Chichester, UK: John Wiley & Sons Ltd. doi:10.1002/0470092335.

Kurnia, S., & Chien,A.-W. J. (2003, June 9-11). The acceptance of online grocery shopping. In Proceed­ings of the 16th Bled eCommerce Conference, Bled, Slovenia.

Lee, T. (2005). The impact of perceptions of interac­tivity on customer trust and transaction intentions in mobile commerce. Journal of Electronic Commerce Research, 6(3), 165–180.

Li, P. P. (2008). Toward a geocentric framework of trust: An application to organizational trust. Man­agement and Organization Review, 4(3), 413–439. doi: 10.111 1/j. 1740-8784.2008.00120.x.

Liljander, V., Van-Riel, A. C. R., & Pura, M. (2002). Customer satisfaction with e-services: The case of an on-line recruitment portal. In M. Bruhn, & B. Stauss (Eds.), Jahrbuch Dienstleistungs management 2002 – Electronic Services (1st ed., pp. 407–432). Wiesbaden, Germany: Gabler Verlag.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20(3), 709–734.

McKnight, D. H., & Chervany, N. L. (2001). What trust means in e-commerce customer relationships: An interdisciplinary conceptual typology. Interna­tional Journal of Electronic Commerce, 6(2), 3 5–59.

Michael, K. (2004). Location-based services: A ve­hicle for IT&T convergence. In K. Cheng, D. Webb, & R. Marsh (Eds.), Advances in e-engineering and digital enterprise technology (pp. 467–478). Profes­sional Engineering Pub..

Moore, G. C., & Benbasat, I. (1991). Development of an instrument to measure the perceptions of adopting an information technology innovation. Information Systems Research, 2(3), 192–222. doi:10.1287/isre.2.3. 192.

Morgan, R. M., & Hunt, S. D. (1994). The commit­ment-trust theory of relationship marketing. Journal of Marketing, 58(3), 20. doi: 10.2307/1252308 

Nicolaou, A. I., & McKnight, D.H. (2006).Perceived information quality in data exchanges: Effects on risk, trust, and intention to use. Information Systems Re­search, 17(4), 332–351. doi: 10. 1287/isre. 1060.0103.

O’Connor, P. J., & Godar, S. H. (2003). We know where you are: The ethics of LBS advertising. In B. E. Mennecke, &T. J. Strader (Eds.),Mobile commerce: Technology, theory, and applications (pp.211–222). Hershey, PA: IGI Global. doi: 10.4018/978-1-59140- 044-8.ch013.

O’Doherty, K., Rao, S., & Mackay, M. M. (2007). ‘Young Australians’ perceptions of mobile phone content and information services: An analysis of the motivations behind usage. Young Consumers: Insight and Ideas for Responsible Marketers, 8(4),257–268. doi:10.1 108/17473610710838617.

Parasuraman, A., Berry, L., & Zeithaml, V. (1988). SERVQUAL: A multiple-item scale for measuring service quality. Journal of Retailing, 64(1), 12–40.

Pavlou, P. A. (2003). Consumer acceptance of elec­tronic commerce: Integrating trust and risk with the technology acceptance model. International Journal of Electronic Commerce, 7(3), 101–134.

Perusco, L., & Michael, K. (2007). Control, trust, privacy, and security: Evaluating location-based services. IEEE Technology and Society Magazine, 4–16. doi:10.1109/MTAS.2007.335564.

Perusco, L., Michael, K., & Michael, M. G. (2006, October 11-13). Location-based services and the privacy-security dichotomy. In Proceedings of the Third International Conference on Mobile Computing and Ubiquitous Networking,London,UK(pp. 9 1-98). Research Online: University of Wollongong Database.

Pura, M. (2005). Linking perceived value and loyalty in location-based mobile services. Managing Service Quality, 15(6), 509–538. doi:10.1 108/096045205 10634005.

Rogers, E. M. (1995). Diffusion of innovations (4th ed.). New York, NY: Free Press.

Smith, H. J., Milberg, S. J., & Burke, S. J. (1996). Information privacy: Measuring individuals’ con­cerns about organizational practices’. Management Information Systems Quarterly, 20(2), 167–196. doi: 10.23 07/249477.

Tan, Y.-H., & Thoen, W. (2001). Toward a generic model of trust for electronic commerce. International Journal of Electronic Commerce, 5(2), 6 1–74.

The Australian Government: Attorney General’s Department. (2008). Privacy act 1988: Act No. 119 of 1988 as amended. the Office of Legislative Drafting and Publishing. Retrieved August 2, 2008, from http:// www.comlaw.gov.au/ComLaw/Legislation/Act­Compilation1.nsf/0/63C00ADD09B982ECCA257490002B9D57/$file/Privacy1988_WD02HYP.pdf

Thong, J. Y. L., Hong, W., & Tam, K. Y. (2004). What leads to acceptance of digital libraries? Communications of the ACM, 47(11), 78–83. doi: 10.1145/1029496.1029498.

Tilson, D., Lyytinen, K., & Baxter, R. (2004, Janu­ary 5-8). A framework for selecting a location based service (LBS) strategy and service portfolio. In Proceedings of the 3 7th Annual Hawaii International Conference on System Sciences, Big Island, HI. IEEEXplore Database.

Weiss, D., Kramer, I., Treu, G., & Kupper, A. (2006, June 26-29). Zone services -An approach for location-based data collection. In Proceedings of the 8th IEEE International Conference on E-Commerce Technology, The 3rd IEEE International Confer­ence on Enterprise Computing, E-Commerce, and E-Services, San Francisco, CA.

Yang, Z., Peterson, R. T., & Cai, S. (2003). Services quality dimensions of Internet retailing: An explor­atory analysis. Journal of Services Marketing, 17(7), 685–700. doi: 10.1108/08876040310501241.

Zeithaml, V. A., Parasuraman, A., & Malhotra, A. (2000). A conceptual framework for understanding e-service quality: Implications for future research and managerial practice. MSI Working Paper Series, (WorkingPaper00-1 15), Marketing Science Institute, Cambridge, MA.

Zeithaml, V. A., Parasuraman, A., & Malhotra, A. (2002). Service quality delivery through web sites: A critical review of extant knowledge. Academy of Marketing Science, 30(4), 362. doi: 10.1177/009207002236911.

Zhang, X., & Prybutok, V. R. (2005). A consumer perspective of e-service quality. IEEE Transactions on Engineering Management, 52(4), 461–477. doi: 10.1 109/TEM.2005.856568.

Keywords: Acceptance, Location-Based Emergency Services, Privacy, Risk, Service Quality, Technology Acceptance Model (TAM), Theory of Reasoned Action (TRA), Trust, Visibility

Citation: Anas Aloudat, Katina Michael, "Towards a Conceptual Model of User Acceptance of Location Based Emergency Services", International Journal of Ambient Computing and Intelligence, 5(2), 17-34, April-June 2013.