Reconnaissance and Social Engineering Risks as Effects of Social Networking

Author Note: This paper is a "living reference work entry". Published first in 2014, now in second edition with minor changes to original content.

… not what goes into the mouth defiles a man, but what comes out of the mouth, this defiles a man.” Matthew 15:11 (RSV)


For decades we have been concerned with how to stop viruses and worms from penetrating organizations and how to keep hackers out of organizations by luring them toward honeypots. In the mid-1990s Kevin Mitnick’s “dark-side” hacking demonstrated, and possibly even glamorized (Mitnick and Simon 2002), the need for organizations to invest in security equipment like intrusion detection systems and firewalls, at every level from perimeter to internal demilitarized zones (Mitnick and Simon 2005).

In the late 1990s, a wave of security attacks stifled worker productivity. During these unexpected outages, employees would take long breaks queuing at the coffee machine, spend time cleaning their desks, and try to look busy shuffling paper in their in- and out-trays. The downtime caused by malware hitting servers worldwide made it clear that corporations had come to rely so heavily on intranets for content and workflow management that employees were left with very little to do when they were not connected. Nowadays, everything in the service industry is online, and the requirement to be always connected is itself a known vulnerability. For example, you can cripple an organization by taking away its ability to accept electronic payments online, rendering its content management system inaccessible through denial-of-service attacks, or hacking into its webpage.

When the “Melissa” virus caught employees unaware in 1999 and was followed by another destructive worm the same year, Microsoft Office files in public folders were deleted or corrupted. At the time, anecdotal stories indicated that some people (even whole groups) lost several weeks of work after falling victim to the worm that attacked their hard drives. This led many to seek backup copies of their files, only to find that the backups had never been activated (Michael 2003).

The moral of the story is that for decades we have been preoccupied with stopping data (executables, spam, false log-in attempts, and the like) from entering the organization, when the real problem since the rise of broadband networks, 3G wireless, and more recently social media has been how to stop data from going out of the organization. While this sounds paradoxical, what matters is not the data traffic that comes into an organization but the data that goes out. We have become our own worst enemy when it comes to security in this online-everything world we live in.

In short, data leakage is responsible for most corporate damage, such as the loss of competitive information. You can secure a bucket and make it watertight, put a lid on it, even put a lock on the lid, but if that bucket has even a single tiny hole, its contents will leak out and cause spillage. Such is the dilemma of information security today – while we have become more aware of how to block out unwanted data, the greatest risk to our organization is the data that leaves it – through the network, through storage devices, via an employee’s online personal blog, even through the spoken word. It is indeed what most security experts call the “human” factor (Michael 2008).

Reconnaissance of Social Networks for Social Engineering

Social Networking

The Millennials, also known as Gen Ys, have been the subject of great discussion by commentators. If we are to believe what researchers say about Gen Ys, then it is this generation that has voluntarily gone public with private data. This generation, propelled by advancements in broadband wireless, 3G mobiles, and cloud computing, is always connected and always sharing their sentiments and cannot get enough of the new apps. They are allegedly “transparent” with most of their data exchanges. Generally, Gen Ys do not think deeply about where the information they publish is stored, and they are focused on convenience solutions that benefit them with the least amount of rework required. They tend not to like to use products like Microsoft Office and would rather work on Google Drive using Google Docs collaboratively with their peers. They are less concerned with who owns information and more concerned with accessibility and collaboration.

Gen Ys are characterized by creating circles of friends online, doing everything digitally they possibly can, and blogging to their heart’s content. In fact, Google has released a study finding that 80% of Gen Ys make up a new generation dubbed “Gen C.” Gen Cs are known as the YouTube generation and are focused on “creation, curation, connection, and community” (Google 2012). It is generally accepted in the literature that this is the generation that would rather use their personally purchased tools, devices, and equipment for work purposes, given the ease of carrying their “life” and “work” with them everywhere they go and of melding their personal hobbies, interests, and professional skill sets with their workplace seamlessly (PWC 2012). Bring your own device (BYOD) is a movement that has emerged from this mind-set. It all has to do with customization and personalization, with working with settings that have been defined by the user, and with lifelogging in a very audiovisual way. Above all, the mantra of this generation is Open-Everything. The claim made by Gen Cs is that transparency is a great force to be reckoned with when it comes to accessibility. Gen Cs allegedly define their social network and are what they share, like, blog, and retweet. This is not without risk, even though some criminologists have played down the associated privacy and security fears (David 2008).

Although online commentators regularly like to place us all into categories based on our age, most people we have spoken to through our research do not feel like any particular “generation.” Individuals like to think they are smart enough to exploit the technologies for what they need to achieve. People may choose not to embrace social networking for blogging purposes, for instance, but might see how the application can be put to good use within an institutional setting and educational framework. For this reason they might be heavy users of social networking applications like LinkedIn, Twitter, Facebook, and Google Latitude while also understanding their shortcomings and the potential implications of providing a real name, gender, and date of birth, as well as other personal particulars like geotagged photos or live streaming.

This ability to gather and interpret cyber-physical data about individuals and their behaviors is a double-edged sword when related back to a place of work. On the one hand, we have data about someone’s personal encounters that can be placed in context back to a place of employment (Dijst 2009). For instance, a social networking update might read: “In the morning, I met with Katina Michael, we spoke about putting a collaborative grant together on location-based tracking, and then I went and met Microsoft Research Labs to see if they were interested in working with us, and had lunch with (+person) (#microsoft) who is a senior software engineer.” This information is pretty innocent on its own, but it holds many details that can be used for intelligence gathering: (1) a real name, (2) a real e-mail address, (3) an identifiable position in an organization, (4) potential links to an extended social network, and (5) possibly even the real physical location of the meeting, if the individual had a location-tracking feature switched on in their mobile social network app. The underlying point here is that you might have nothing to fear by blogging or participating on social networks under your company identity, but your organization might have much to lose.
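As a rough illustration, the identifying details in an update like the one above can be pulled out with a few lines of pattern matching. The regular expressions and the sample text below are illustrative only, not a real harvesting tool:

```python
import re

EMAIL = r"[\w.+-]+@[\w-]+\.\w+"

def mine_update(text):
    """Pull identifying details out of a single status update."""
    emails = re.findall(EMAIL, text)
    scrubbed = re.sub(EMAIL, " ", text)  # so the '@' in e-mails isn't read as a mention
    return {
        "emails": emails,
        "hashtags": re.findall(r"#(\w+)", scrubbed),
        "mentions": re.findall(r"@(\w+)", scrubbed),
        # Naive pattern for capitalized two-word names, e.g., "Katina Michael".
        "names": re.findall(r"\b[A-Z][a-z]+ [A-Z][a-z]+\b", scrubbed),
    }

update = ("Met with Katina Michael about a grant on location-based tracking, "
          "then lunch with @jsmith (#microsoft); she is reachable at k.michael@example.com.")
details = mine_update(update)
print(details)
```

Even this crude sketch recovers a real name, an e-mail address, a colleague’s handle, and an organizational affiliation from a single post; at scale, such extraction feeds directly into the reconnaissance described below.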

Social Reconnaissance

Though many of us do not wish to admit it, we have from time to time conducted social reconnaissance online for any number of reasons. In the most basic of cases, you might be visiting a location you have not previously been to, and you use Google Street View to take a quick look at what the dwelling looks like for identification purposes. You might also browse the web for your own name, dubbed “ego surfing,” to see how you have been cited, quoted, and tagged in images, or generally what other people are saying about you. Businesses, too, are increasingly keeping an eye on what is being said about their brands using automatic web alerts based on hashtags, to the extent that new schemes offering business reputation insurance have begun to emerge. Now, the point here is not whether you conduct social reconnaissance on yourself, your family, your best friend, or even strangers who look enticing, but what hackers out there might learn about you, your life, and your organization by conducting both social and technical reconnaissance. Yes, indeed, if you did not know it already, there are people out there who will (1) spend all their work time looking up what you do (depending on who you are), (2) think about how the information they have gathered can be related back to your place of work, and (3) exploit that knowledge to conduct clever social engineering attacks (Hadnagy 2011).
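The automatic brand-watching alerts mentioned above reduce, at their core, to scanning a stream of posts for watched terms. A minimal sketch, using an invented brand name, might look like this:

```python
import re

# Hypothetical brand terms a company might watch for (case-insensitive).
BRAND_TERMS = re.compile(r"#?\b(acme|acmecorp)\b", re.IGNORECASE)

def flag_mentions(posts):
    """Return the posts that mention the watched brand - the core of a web-alert job."""
    return [p for p in posts if BRAND_TERMS.search(p)]

stream = [
    "Great service from #AcmeCorp today!",
    "Anyone else locked out of Acme's portal?",
    "Unrelated lunch photo",
]
hits = flag_mentions(stream)
print(hits)
```

Real alerting services add crawling, deduplication, and sentiment scoring on top, but the underlying matching is this simple, which is also why hackers can run the same kind of watch on a target organization.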

Chris Hadnagy, founder of Social-Engineer.Org, was recently quoted as saying: “[i]nformation gathering is the most important part of any engagement. I suggest spending over 50 percent of the time on information gathering… Quality information and valid names, e-mails, phone number makes the engagement have a higher chance of success. Sometimes during information gathering you can uncover serious security flaws without even having to test, testing then confirms them” (Goodchild 2012).

It is for this reason that social engineers will focus on the company website, for instance, and build their attack plan off it. Dave Kennedy, CSO of Diebold, corroborates this idea from firsthand experience: “[a] lot of times, browsing through the company website, looking through LinkedIn are valuable ways to understand the company and its structure. We’ll also pull down PDF’s, Word documents, Excel spread sheets and others from the website and extract the metadata which usually tells us which version of Adobe or Word they were using and operating system that was used” (Goodchild 2012).
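The metadata extraction Kennedy describes needs nothing exotic: a .docx file is just a zip archive, and its docProps/app.xml records the authoring application and version. The sketch below builds a minimal stand-in document (the contents are fabricated for the example) and then reads those fields back:

```python
import io
import re
import zipfile

def office_metadata(docx_bytes):
    """Read the application name and version recorded inside a .docx (a zip archive)."""
    with zipfile.ZipFile(io.BytesIO(docx_bytes)) as z:
        xml = z.read("docProps/app.xml").decode("utf-8")
    app = re.search(r"<Application>(.*?)</Application>", xml)
    ver = re.search(r"<AppVersion>(.*?)</AppVersion>", xml)
    return (app.group(1) if app else None, ver.group(1) if ver else None)

# Build a minimal stand-in document so the sketch is self-contained.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("docProps/app.xml",
               "<Properties><Application>Microsoft Office Word</Application>"
               "<AppVersion>14.0000</AppVersion></Properties>")

meta = office_metadata(buf.getvalue())
print(meta)  # ('Microsoft Office Word', '14.0000')
```

Knowing which Office or Adobe version produced a public document tells an attacker which client-side exploits are worth trying, which is exactly why such files are harvested.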

Most of us know people who do not wish to be photographed, who have painstakingly attempted to untag themselves from a variety of images on social networks, and who have tried to delete their online presence so as to be judged before an interview panel for the person they are today, not the person they were when MySpace or Facebook first came out. But what about the separate group of people who do not acknowledge any fence between their work life and home life, who accept personal e-mails on a work account, and who are vocal about everything that happens to them on a moment-by-moment basis under a disclaimer that reads: “anything you read on this page is my own personal opinion and not that of the organization I work for”? Some would say these individuals are terribly naïve and are probably not acting in accord with organizational policies. The disclaimer will help neither the company nor them. Ethical hackers, who have built large empires around their tricks of the trade since the onset of social networking, have spent the last few years trying to educate us all: “data leakage is your biggest problem, folks,” not weak perimeters! You are, in other words, your own worst enemy, because you divulge more to the online world than you can afford to.

No one is discounting that there are clear benefits in making tacit knowledge explicit by recording it in one form or another, or openly sharing our research data in a way that is conducive to ethical practices, and making things more interoperable than what they are today – but the world keeps moving so fast that for the greater part people are becoming complacent with how they store their datasets and the repercussions of their actions. But the repercussions do exist, and they are real.

Social Engineering

Expert social engineers have never relied on very sophisticated ways of penetrating security systems. It is worth paying a visit to the social engineering toolkit (SET) at www.social-engineer.org, where you can learn a great deal about ethical hacking (Palmer 2001) and pentesting (Social-Engineer.Org 2012). There, social engineering tools are categorized as physical (e.g., cameras, GPS trackers, pen recorders, and radio-frequency bug kits), computer based (e.g., common user password profilers), and phone based (e.g., caller ID spoofing). In phase 1 of a premeditated attack, social engineers are merely engaged in observing the information we each put up for grabs willingly. Beyond the information itself, subjects and objects are also under surveillance, as these might give further clues for the potential hack. Once enough information has been gathered, the social engineer moves to phase 2, which could mean dumpster diving and collecting as much hard-copy and online evidence as possible (e.g., company website info). Social networks have given social engineers a whole new avenue of investigation. In fact, social networking will keep social engineers in full-time work forever and ever unless we all get a lot smarter about how we use these applications.
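The “common user password profiler” category mentioned above can be illustrated with a small sketch in the style of such tools: it turns harvested personal facts into candidate passwords by mangling and combining them. The facts and years below are invented for the example:

```python
from itertools import product

def password_candidates(facts, years):
    """Combine harvested personal facts into likely password guesses (profiler-style)."""
    words = set()
    for fact in facts:
        # Simple mangling rules: lowercase, capitalized, and reversed forms.
        words.update({fact.lower(), fact.capitalize(), fact.lower()[::-1]})
    candidates = set(words)
    for word, year in product(words, years):
        candidates.add(f"{word}{year}")    # e.g., pet name + birth year
        candidates.add(f"{word}{year}!")   # common "add a symbol" habit
    return sorted(candidates)

# A pet's name and a first name gleaned from a public profile, plus likely years.
guesses = password_candidates(["rex", "katina"], ["1999", "99"])
print(len(guesses), guesses[:5])
```

A handful of facts and rules already yields dozens of plausible guesses, which is why details as trivial as a pet’s name or a birth year, posted publicly, materially weaken password security.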

In phase 3, the evidence gathered by the hacker is put to use as they claw their way deeper and deeper into organizational systems. It might mean having a few full names and position profiles of employees in a company and then using “hacting” (hacking and acting) skills to get more and more data. Think of social engineers building on each step, penetrating deeper and deeper into the administration of an organization. While we might think executives are the least targeted individuals, social engineers are brazen enough to “attack” the personal assistants of executives as well as operational staff. One of the problems associated with social networking is that executives casually hand over their logins and passwords to personal assistants to take care of their online reputations, making it increasingly easy to manipulate and hijack these spaces and use them as proof of a given action. When social engineers gain the level of authority they need to circumvent systems, or can use technical reconnaissance to exploit data found via social reconnaissance (or vice versa), they can access an organization’s network resources remotely, free to unleash cross-site scripting, man-in-the-middle attacks, SQL code injection, and the like.
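The SQL injection mentioned above is easy to demonstrate against a throwaway in-memory database. The sketch below (table and data fabricated) shows how concatenating attacker-controlled text into a query leaks every row, while a parameterized query treats the same input as a harmless literal:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (name TEXT, secret TEXT)")
con.execute("INSERT INTO users VALUES ('alice', 's3cret'), ('bob', 'hunter2')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: attacker-controlled text is pasted straight into the SQL.
unsafe = con.execute(
    f"SELECT secret FROM users WHERE name = '{attacker_input}'").fetchall()

# Safe: a parameterized query treats the input as a literal value, not SQL.
safe = con.execute(
    "SELECT secret FROM users WHERE name = ?", (attacker_input,)).fetchall()

print(unsafe)  # every row leaks: [('s3cret',), ('hunter2',)]
print(safe)    # []
```

The injected `' OR '1'='1` turns the WHERE clause into a condition that is always true, which is precisely the class of flaw a social engineer hopes to find once inside the network perimeter.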

Organizational Risks

We have thus come full circle on what social reconnaissance has to do with social networks. Social networking sites (SNS) provide social engineers with all the space they need to conduct their unethical hacking and their own penetration tests. You would not be the first person to admit to having accepted a “friend” invitation on LinkedIn without knowing, or even caring, who the sender is. It is just another e-mail in the inbox to clear out, and pressing accept is usually a lot easier than pressing ignore and then delete, or blocking the sender for life.

Consider the problem of police in metropolitan forces creating LinkedIn profiles and accepting friends of friends on their public social network profiles. What are the implications of this from a criminal perspective? Carrying the police analogy further, what of the personal gadgets they carry? How many police are currently carrying e-mails on personal mobile phones that, for security reasons, they should not be? Or, even worse, how many police have their Twitter, Facebook, or LinkedIn profiles always connected via their mobile phones? Police forces are rapidly introducing new policies to address these problems, but the problems still exist for mainstream employees of large, medium, and even small organizations. The theft does not have to be complex, like the stealing of software code or other intellectual property in designs and blueprints; it can be as simple as the theft of competitive information like customer lead lists in a Microsoft Access database, or payroll data stored in MYOB, or even the physical device itself.

Penetration testing done periodically can feed back into the development of a more robust information security life cycle, helping those in charge of information governance act proactively and helping employees understand the implications of their practices (Bishop 2007). Trustwave (2012) advocates four types of assessment and testing. The first is straightforward, traditional physical assessment. The second is client-side penetration testing, which validates whether every staff member is adhering to policies. The third is business intelligence testing, which investigates how employees are using social networking, location-enabled devices, and mobile blogging, to ensure that a company’s reputation is not at risk and to find out what data exists publicly about an organization. Finally, red team testing is when a diverse group of subject matter experts independently reviews security profiles and tries to penetrate a system.

No one would ever want to be the cause behind the ransacking of their organization’s online information above and beyond the web scraping technologies becoming widely available (Poggi et al. 2007). It would help if policies were enforceable within various settings but these too are difficult to monitor. How does one get the message across that while blocking unwanted traffic at the door is very important for an organization, what is even more important is noting what goes walkabout from inside the organization out? It will take some years for governance structures to adapt to this kind of thinking because the security industry and the media have previously been rightly focused on Denial of Service (DoS) attacks and botnets and the like (Papadimitriou and Garcia-Molina 2011). But it really is a chicken and egg problem – the more information we give out using social networking sites, the more we are giving impetus to DoS, DDoS, and the proliferation of botnets (Kartaltepe et al. 2010; Huber et al. 2009).


This entry may not have convinced employees to take greater care about what they publish online, on personal blogs, or in the pictures and footage they post to lifelogs or YouTube, but it should have shown that the biggest problems in security today arise from the information users post publicly in environments that rely on social networks. That information is just waiting to be harvested by people the users will probably never meet in person. Employers need to educate their staff on company policies periodically and review the policies they create no less than every 2 years. As an employer, you should also consider when your organization last performed a penetration test that took new social networking applications into account. Individuals should extend this kind of pentesting to their own online profiles and review their personal situation. Sure, you might have nothing to hide, but you might have a lot to lose.


  1. Bishop M (2007) About penetration testing. IEEE Secur Privacy 5(6):84–87

  2. David SW (2008) Cybercrime and the culture of fear. Inf Commun Soc 11(6):861–884

  3. Dijst M (2009) ICT and social networks: towards a situational perspective on the interaction between corporeal and connected presence. In: Kitamura R, Yoshii T, Yamamoto T (eds) The expanding sphere of travel behaviour research. Emerald, Bingley

  4. Goodchild J (2012) 3 tips for using the social engineering toolkit, CSOOnline- data protection. Accessed 3 Dec 2012

  5. Google (2012) Introducing Gen C: the YouTube generation. Accessed 1 Apr 2013

  6. Hadnagy C (2011) Social engineering: the art of human hacking. Wiley, Indianapolis

  7. Huber M, Kowalski S, Nohlberg M, Tjoa S (2009) Towards automating social engineering using social networking sites. In: IEEE international conference on computational science and engineering, CSE’09, Vancouver, vol 3. IEEE, Los Alamitos, pp 117–124

  8. Kartaltepe EJ, Morales JA, Xu S, Sandhu R (2010) Social network-based botnet command-and-control: emerging threats and countermeasures. In: Applied cryptography and network security. Springer, Berlin/Heidelberg, pp 511–528

  9. Michael K (2003) The battle against security attacks. In: Lawrence E, Lawrence J, Newton S, Dann S, Corbitt B, Thanasankit T (eds) Internet commerce: digital models for business. Wiley, Milton, pp 156–159. Accessed 1 Feb 2013

  10. Michael K (2008) Social and organizational aspects of information security management. In: IADIS e-Society, Algarve, 9–12 Apr 2008. Accessed 1 Feb 2013

  11. Mitnick K, Simon WL (2002) The art of deception: controlling the Human element of security. Wiley, Indianapolis

  12. Mitnick K, Simon WL (2005) The art of intrusion. Wiley, Indianapolis

  13. Palmer CC (2001) Ethical hacking. IBM Syst J 40(3):769–780

  14. Papadimitriou P, Garcia-Molina H (2011) Data leakage detection. IEEE Trans Knowl Data Eng 23(1):51–63

  15. Poggi N, Berral JL, Moreno T, Gavalda R, Torres J (2007) Automatic detection and banning of content stealing bots for e-commerce. In: NIPS 2007 workshop on machine learning in adversarial environments for computer security

  16. PWC (2012) BYOD (Bring your own device): agility through consistent delivery. Accessed 3 Dec 2012

  17. Social-Engineer.Org: Security Through Education (2012) http://www.social-engineer.org/. Accessed 3 Dec 2012

  18. Trustwave (2012) Physical security and social engineering testing. Accessed 3 Dec 2012


Footprinting; Hacker; Penetration testing; Reconnaissance; Risk; Security; Self-disclosure; Social engineering; Social media; Social reconnaissance; Vulnerabilities


Social reconnaissance: A preliminary paper-based or electronic web-based survey to gain personal information about a member or group in your community of interest. The member may be an individual friend or foe, a corporation, or the government

Social engineering: With respect to security, the art of manipulating people while purporting to be someone other than your true self, thus duping them into performing actions or divulging secret information

Data leakage: The deliberate or accidental outflow of private data from the corporation to the outside world, in a physical or virtual form

Online social networking: An online social network is a site that allows for the building of social networks among people who share common interests

Malware: The generic term for software that has a malicious purpose. Can take the form of a virus, worm, Trojan horse, and spyware

Citation: Katina Michael, "Reconnaissance and Social Engineering Risks as Effects of Social Networking", in Reda Alhajj and Jon Rokne, Encyclopedia of Social Network Analysis and Mining, 2017, pp. 1-7, DOI: 10.1007/978-1-4614-7163-9_401-1.

Computing Ethics: No Limits to Watching?

Figure 1. Google Glass (opening art)


Little by little, the introduction of new body-worn technologies is transforming the way people interact with their environment and one another, and perhaps even with themselves. Social and environmental psychology studies of human-technology interaction pose as many questions as answers. We are learning as we go: "learning by doing" through interaction and "learning by being."9 Steve Mann calls this practice existential learning: wearers become photoborgs,3 a type of cyborg (cybernetic organism) whose primary intent is image capture from the domains of the natural and artificial.5 This approach elides the distinction between the technology and the human; they coalesce into one.

With each release greater numbers of on-board sensors can collect data about physiological characteristics, record real-time location coordinates, and use embedded cameras to "life-log" events 24x7. Such data, knowingly or unknowingly collected and bandwidth permitting, may be wirelessly sent to a private or public cloud and stored, often for public view and under a creative commons license.2 Embedded sensors on wearers can actively gather information about the world and capture details of a personal nature—ours and those of others too. These details can be minor, like embarrassing habits of less than admirable personal hygiene, or major, such as records of sexual peccadilloes or events relevant to court proceedings.

A third party might own the data gathered by these devices or the device itself. The Google Glass Terms of Service state: "[You] may not resell, loan, transfer, or give your device to any other person. If you resell, loan, transfer, or give your device to any other person without Google's authorization, Google reserves the right to deactivate the device, and neither you nor the unauthorized person using the device will be entitled to any refund, product support, or product warranty."8 Personal information stored on the Internet for ease of access from anywhere at any time raises the possibility of unauthorized access. Most wearable sleep monitors indicate when you are awake, in light sleep, or in deep sleep, and calculate the level of efficiency reached between your rest and wake times.7 Monitors can tell adults how often they wake up during the night, the duration of sleep, time spent in bed, and times of awakening.6 Sleeping patterns convey personal details about individuals, such as insomnia or obsessive-compulsive disorder, sexual activity, workaholism, likely performance in stressful jobs, and other things.
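The efficiency figure such monitors report is typically just the ratio of time asleep to time in bed. A minimal sketch of that calculation, with invented numbers:

```python
def sleep_efficiency(minutes_asleep, minutes_in_bed):
    """Sleep efficiency as consumer trackers commonly report it: asleep / in bed."""
    return round(100 * minutes_asleep / minutes_in_bed, 1)

# Eight hours in bed, with 48 minutes awake across the night.
efficiency = sleep_efficiency(480 - 48, 480)
print(efficiency)  # 90.0
```

The arithmetic is trivial, yet a stream of nightly values like this, timestamped and uploaded, is exactly the kind of sensitive behavioral record the surrounding discussion is concerned with.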

Wearables can also look outward, reconstructing the world with location coordinates,11 current speed traveled and direction, rich high-resolution photographs, and audio capture. Wearers gather data about themselves but also heterogeneous data about fixed and mobile entities, including infrastructure, living things (such as people and animals) and non-living things (such as vehicles). This is not simply derivable information, such as the "point of interest nearest you is 'x' given your position on the Earth's surface," but can be interpreted as, "Johnny is traveling at 'x' miles per hour and is a little sluggish today on his bike ride compared to yesterday, perhaps because of his late night and his consumption of one glass too many of wine while at the nearby bar."
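The "Johnny is traveling at 'x' miles per hour" inference above follows directly from two timestamped location fixes. A sketch using the haversine great-circle distance (coordinates and timing invented):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3958.8

def speed_mph(fix_a, fix_b, seconds):
    """Average speed between two (lat, lon) fixes via the haversine distance."""
    (lat1, lon1), (lat2, lon2) = [tuple(map(radians, p)) for p in (fix_a, fix_b)]
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    miles = 2 * EARTH_RADIUS_MILES * asin(sqrt(a))
    return miles / (seconds / 3600.0)

# Two fixes sixty seconds apart on a bike ride (coordinates are invented).
mph = speed_mph((40.7128, -74.0060), (40.7150, -74.0060), 60)
print(round(mph, 1))  # about 9.1 mph
```

Comparing today's figure with yesterday's ("a little sluggish today") requires nothing more than keeping a history of such values, which is what lifelogging devices do by default.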

These devices can tell us about exceptions to the everyday patterns of people in or out of our social networks. Consider the potential for government surveillance beyond the Call Detail Records that caused such controversy for the National Security Agency in 2013. Relentless data capture is uncompromising. Wearers concerned only about whether the device is working as intended might not consider the consequences of unauthorized data collection. They might feel they have purchased the device and are using it to the best of their ability. Will they consider feelings of fraternity with strangers who do not have access to the same technology? Or will they feel they have every right to put on their wearable device and adapt their body for convenience or personal needs, such as maintaining personal security, reducing liability, and increasing life expectancy? Might wearers figure that any problems are the other person's problems, as long as they believe they are not breaking the law? Whether the device is doing what it is supposed to do (for example, working properly) might occlude more meaningful questions about the societal consequences of using such devices.

Bystanders are likely to be as oblivious to data collection by wearable devices as they are to data collection by private investigators using covert devices. Yet many people vehemently oppose being the subject of someone else's recording.1 The disappearing difference between covert and overt devices makes it possible for surveillance to become so ubiquitous that it is rendered "invisible." Anything right in front of us and ever present is in our "blind spot," hardly noticeable because we are enveloped by it like a fog. Novelty wears off over time, industrial and human-factors design can help make things invisible to us, and we undergo conditioning. When surveillance cameras are everywhere, including on our heads and in our lapels, it is no longer surveillance. It is simply the human activity of "watching."


CCTV cameras are arguably invasive, but we do not protest their use even though people are aware of their presence. What happens when we open the floodgates to constant watching by tiny lifelogging devices? We open ourselves to not just Big Brother, but countless Little Brothers.15 Corporate or governmental compliance and transparency hide the fact that audiovisual collection of information will come at a cost. Multiple depictions of the same event can be stronger than a single view, and corruption can flourish even in a transparent environment. It can even be corruption on a grander scale. Crowdsourced sousveillance (watching from below)12 might take place for authenticity or verification, but police with the authority to subpoena data for use in a court of law as direct evidence can use the data to support their "point of view" (POV), irrespective of the fact that "point of eye" (POE) does not always capture the whole truth.a,13

The more data we generate about ourselves, our families, our peers, and even strangers, the greater the potential risk of harm to ourselves and each other. If we lose the ability to control images or statistics about personal or public behaviors, how do we make the distinction between becoming a photoborg and becoming the subject matter of a photoborg? There is a stark asymmetry between those who use wearables and those who do not. There is much confusion over whether sousveillance12 is ethical or unethical. The possible perils from lifelogging devices that capture the world around them are only now getting attention.4 To what extent is it ethical to create records of the lives of others without their prior consent or cognizance? Maker or hacker communities—"prosumers" and "producers"—create personalized devices for their own consumption and become trailblazers for what is possible. But they do not speak for everyone. What is initially made to serve individual needs often is commercialized for mass consumption.

Data from others can generate a great deal of money. Consider the story of Henrietta Lacks, a poor black tobacco farmer whom scientists named "HeLa."14 According to Rebecca Skloot, HeLa's cells were "taken without her knowledge in 1951" and became a vital medical tool "for developing the polio vaccine, cloning, gene mapping, in vitro fertilization, and more."14 Until this year, when the family came to a mutually satisfactory arrangement with the NIH, HeLa cells were "bought and sold by the billions," without compensation or acknowledgment. Who profits from wearable devices? The company that owns the device or the data? The wearer? The person in the field of view? Historical evidence suggests it will likely be everyone else but the user wearing the device or the citizen in the field of view. Marcus Wigan and Roger Clarke suggest a private data commons as a potential way forward in this big data enterprise.16

Widespread diffusion and data manipulation can require more than an ordinary consumer decision about use and acceptance. Trust and adoption are key to societal conversations that will shape guidelines and regulations about what is and is not acceptable with respect to wearable computing. At what stage of the game are the "rules" to be determined and by whom?

New technologies can bring wonderful benefits, but also disputed, unintended, and sometimes hidden consequences. Technologies should aid and sustain humankind, but we cannot limit technologies to just positive applications. We should not claim benefits without admitting to the risks and costs. Wearable devices create moral and ethical challenges, especially if they are widely used. We must look beyond findings from previous studies of emerging technologies because new technologies often help create new socio-technological contexts. We cannot afford unreflective adoption. "Play" can have real repercussions. The novelty, fun, and "wow" factors wear off and we are left with the fallout. We must be vigilant in our new playhouse, and not negate the importance of moral or ethical standards alongside market values.

Philosophers have contemplated the question of technology and its impact on society. Martin Heidegger, Ivan Illich, Jacques Ellul, and those of the Frankfurt School have argued that the worst outcome from technology gone wrong is dehumanization of the individual and the loss of dignity, resulting in a "standardized subject of brute self-preservation."10 A fundamental insight of such literature is that technology has not only to do with building: it is also a social process. Any social process resulting in unreflective adoption of technological marvels is profoundly deficient. More is not always better, and the latest is not always the greatest.

Charlie Chaplin's culturally significant film Modern Times (1936) shows the iconic Little Tramp caught up in the cogs of a giant machine. The unintended consequences of modern and efficient industrialization are clear. Chaplin's classic builds on Fritz Lang's futuristic film Metropolis (1926), which depicts a mechanized underground city in a dystopian society. Both films left indelible marks as prescient summaries of what was to follow. When technology becomes a final cause for its own sake, teleology and technology become confused. The old saw that "the person with the most toys wins" reflects this. What about the rest of us?


References

1. Abbas, R., Michael, K., Michael, M.G. and Aloudat, A. Emerging forms of covert surveillance using GPS-enabled devices. Journal of Cases on Information Technology 13, 2 (2011), 19–33.

2. Creative Commons: Attribution 2.0 Generic, n.d.

3. Electrical and Computer Engineering, ECE1766 Final Course Project. (1998).

4. ENISA. To log or not to log?: Risks and benefits of emerging life-logging applications. European Network and Information Security Agency. (2011).

5. Gray, C.H. Cyborgs, Aufmerksamkeit and Aesthetik [transl. Cyborgs, Attention, and Aesthetics]. Kunstforum (Dec.–Jan. 1998).

6. Henry, A. Best sleep tracking gadget or app? (2013).

7. Henry, A. Sleep Time alarm clock for Android watches your sleep cycles, wakes you gently. (2012).

8. Kravets, D. and Baldwin, R. Google is forbidding users from reselling, loaning Glass eyewear. Wired: Gadget Lab (Apr. 17, 2013).

9. Mann, S. Learn by being: Thirty years of cyborg existemology. The International Handbook of Virtual Learning Environments (2006), 1571–1592.

10. Marcuse, H. Social implications of technology. In Readings in the Philosophy of Technology, D.M. Kaplan, Ed. (2009), 71.

11. Michael, K. and Clarke, R. Location and tracking of mobile devices: Überveillance stalks the streets. Computer Law and Security Review 29, 2 (Feb. 2013), 216–228.

12. Michael, K. and Michael, M.G. Sousveillance and the social implications of point of view technologies in the law enforcement sector. In Proceedings of the 6th Workshop on the Social Implications of National Security (Sydney, NSW, Australia, 2012).

13. Michael, K. and Miller, K.W. Big data: New opportunities and new challenges. IEEE Computer 46, 6 (2013), 22–24.

14. Skloot, R. The Immortal Life of Henrietta Lacks. Crown, New York, 2011.

15. Weil, J. Forget Big Brother. Little Brother is the real threat. (Sept. 22, 2010).

16. Wigan, M.R. and Clarke, R. Big data's big unintended consequences. IEEE Computer 46, 6 (June 2013), 46–53.


a. Hans Holbein's famous painting The Ambassadors (1533), with its patent reference to anamorphosis, speaks volumes of the critical distinction between POE and POV. Take a look, if you are not already familiar with it. Can you see the skull? The secret lies in the perspective.


The authors thank Rachelle Hollander and John King for their observations and insightful comments that helped make this column more robust.

Citation: Katina Michael, MG Michael, 2013, "Computing Ethics: No Limits to Watching?" 
Communications of the ACM, Vol. 56 No. 11, Pages 26-28, DOI: 10.1145/2527187

Location and tracking of mobile devices: Überveillance stalks the streets


During the last decade, location-tracking and monitoring applications have proliferated in mobile cellular and wireless data networks, and through self-reporting by applications running in smartphones that are equipped with onboard global positioning system (GPS) chipsets. It is now possible to locate a smartphone user not merely to a cell, but to a small area within it. Innovators have been quick to capitalise on these location-based technologies for commercial purposes, and have gained access to a great deal of sensitive personal data in the process. In addition, law enforcement agencies utilise these technologies, can do so inexpensively, and hence can track many more people. Moreover, these agencies seek the power to conduct tracking covertly, and without a judicial warrant. This article investigates the dimensions of the problem of people-tracking through the devices that they carry. Location surveillance has very serious negative implications for individuals, yet there are very limited safeguards. It is incumbent on legislatures to address these problems, through both domestic laws and multilateral processes.

1. Introduction

Personal electronic devices travel with people, are worn by them, and are, or soon will be, inside them. Those devices are increasingly capable of being located, and, by recording the succession of locations, tracked. This creates a variety of opportunities for the people concerned. It also gives rise to a wide range of opportunities for organisations, at least some of which are detrimental to the person's interests.

Commonly, the focus of discussion of this topic falls on mobile phones and tablets. It is intrinsic to the network technologies on which those devices depend that the network operator has at least some knowledge of the location of each handset. In addition, many such devices have onboard global positioning system (GPS) chipsets, and self-report their coordinates to service-providers. The scope of this paper encompasses those already well-known forms of location and tracking, but it extends beyond them.

The paper begins by outlining the various technologies that enable location and tracking, and identifies those technologies' key attributes. The many forms of surveillance are then reviewed, in order to establish a framework within which applications of location and tracking can be characterised. Applications are described, and their implications summarised. Controls are considered, whereby potential harm to the interests of individuals can be prevented or mitigated.

2. Relevant technologies

The technologies considered here involve a device that has the following characteristics:

• it is conveniently portable by a human, and

• it emits signals that:

• enable some other device to compute the location of the device (and hence of the person), and

• are sufficiently distinctive that the device is reliably identifiable at least among those in the vicinity, and hence the device's (and hence the person's) successive locations can be detected, and combined into a trail

The primary form-factors for mobile devices are currently clam-shape (portable PCs), thin rectangles suitable for the hand (mobile phones), and flat forms (tablets). Many other form-factors are also relevant, however. Anklets imposed on dangerous prisoners, and even as conditions of bail, carry RFID tags. Chips are carried in cards of various sizes, particularly the size of credit-cards, and used for tickets for public transport and entertainment venues, aircraft boarding-passes, toll-road payments and in some countries to carry electronic cash. Chips may conduct transactions with other devices by contact-based means, or contactless, using radio-frequency identification (RFID) or its shorter-range version near-field communication (NFC) technologies. These capabilities are in credit and debit cards in many countries. Transactions may occur with the cardholder's knowledge, with their express consent, and with an authentication step to achieve confidence that the person using the card is authorised to do so. In a variety of circumstances, however, some and even all of those safeguards are dispensed with. The electronic versions of passports that are commonly now being issued carry such a chip, and have an autonomous communications capability. The widespread issue of cards with capabilities uncontrolled by, and in many cases unknown to, the cardholder, is causing consternation among segments of the population that have become aware of the schemes.

Such chips can be readily carried in other forms, including jewellery such as finger-rings, and belt-buckles. Endo-prostheses such as replacement hips and knees and heart pacemakers can readily carry chips. A few people have voluntarily embedded chips directly into their bodies for such purposes as automated entry to premises (Michael and Michael, 2009).

In order to locate and track such devices, any sufficiently distinctive signals may in principle suffice. See Raper et al. (2007a) and Mautz (2011). In practice, the signals involved are commonly those transmitted by a device in order to take advantage of wireless telecommunications networks. The scope of the relevant technologies therefore also encompasses the signals, devices that detect the signals, and the networks over which the data that the signals contain are transmitted.

In wireless networks, it is generally the case that the base-station or router needs to be aware of the identities of devices that are currently within the cell. A key reason for this is to conserve limited transmission capacity by sending messages only when the targeted device is known to be in the cell. This applies to all of:

• cellular mobile originally designed for voice telephony and extended to data (in particular those using the ‘3G’ standards GSM/GPRS, CDMA2000 and UMTS/HSPA and the ‘4G’ standard LTE)

• wireless local area networks (WLANs, commonly Wifi/IEEE 802.11x – RE, 2010a)

• wireless wide area networks (WWANs, commonly WiMAX/IEEE 802.16x – RE, 2010b).

Devices in such networks are uniquely identified by various means (Clarke and Wigan, 2011). In cellular networks, there is generally a clear distinction between the entity (the handset) and the identity it is adopting at any given time (which is determined by the module inserted in it). Depending on the particular standards used, what is commonly referred to as ‘the SIM-card’ is an R-UIM, a CSIM or a USIM. These modules store an International Mobile Subscriber Identity (IMSI), which constitutes the identity the handset is currently adopting. Among other things, this enables network operators to determine whether or not to provide service, and what tariff to apply to the traffic. However, cellular network protocols may also involve transmission of a code that distinguishes the handset itself, within which the module is currently inserted. A useful generic term for this is the device ‘entifier’ (Clarke, 2009b). Under the various standards, it may be referred to as an International Mobile Equipment Identity (IMEI), an ESN, or an MEID.

Vendor-specific solutions also may provide additional functionality to a handset unbeknown to the end-user. For example, every mobile device manufactured by Apple has a 40-character Unique Device Identifier (UDID). This enables Apple to track its users. Not only Apple itself, but also marketers, were able to use the UDID to track devices. It has also been alleged that data emanating from these devices is routinely accessible to law enforcement agencies. Since late 2012, Apple has prevented marketers from using the UDID, but has added an Identifier for Advertisers (IFA or IDFA). This is temporary, and it can be blocked; but it is by default open for tracking, and turning it off is difficult, and is likely to result in reduced services (Edwards, 2012). In short, Apple devices are specifically designed to enable tracking of consumers by Apple, by any government agency that has authority to gain access to the data, and by all consumer-marketing corporations, although in the last case with a low-grade option available to the user to suppress tracking.

In Wifi and WiMAX networks, the device entifier may be a processor-id or more commonly a network interface card identifier (NIC Id). In various circumstances, other device-identifiers may be used, such as a phone number, or an IP-address may be used as a proxy. In addition, the human using the device may be directly identified, e.g. by means of a user-account name.

A WWAN cell may cover a large area, indicatively of a 50 km radius. Telephony cells may have a radius as large as 2–3 km or as little as a hundred metres. WLANs using Wifi technologies have a cell-size of less than 1 ha, indicatively 50–100 m radius, but in practice often constrained by environmental factors to only 10–30 m.

The base-station or router knows the identities of devices that are within its cell, because this is a technically necessary feature of the cell's operation. Mobile devices auto-report their presence 10 times per second. Meanwhile, the locations of base-stations for cellular services are known with considerable accuracy by the telecommunications providers. And, in the case of most private Wifi services, the location of the router is mapped to c. 30–100 m accuracy by services such as Skyhook and Google Locations, which perform what have been dubbed ‘war drives’ in order to maintain their databases – in Google's case in probable violation of the telecommunications interception and/or privacy laws of at least a dozen countries (EPIC, 2012).

Knowing that a device is within a particular mobile phone, WiMAX or Wifi cell provides only a rough indication of location. In order to generate a more precise estimate, within a cell, several techniques are used (McGuire et al., 2005). These include the following (adapted from Clarke and Wigan, 2011; see also Figueiras and Frattasi, 2010):

• directional analysis. A single base-station may comprise multiple receivers at known locations and pointed in known directions, enabling the handset's location within the cell to be reduced to a sector within the cell, and possibly a narrow one, although without information about the distance along the sector;

• triangulation. This involves multiple base-stations serving a single cell, at known locations some distance apart, and each with directional analysis capabilities. Particularly with three or more stations, this enables an inference that the device's location is within a small area at the intersection of the multiple directional plots;

• signal analysis. This involves analysis of the characteristics of the signals exchanged between the handset and base-station, in order to infer the distance between them. Relevant signal characteristics include the apparent response-delay (Time Difference of Arrival – TDOA, also referred to as multilateration), and strength (Received Signal Strength Indicator – RSSI), perhaps supplemented by direction (Angle Of Arrival – AOA).
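The signal-analysis technique can be sketched in code. The fragment below is a minimal illustration, not any operator's actual implementation: it assumes a log-distance path-loss model to turn RSSI readings into rough range estimates (the transmit power and path-loss exponent are invented calibration values), and then combines ranges from three base-stations at known positions by linearised least squares.

```python
import math

def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.7):
    """Invert a log-distance path-loss model: rssi = tx_power - 10*n*log10(d).
    tx_power_dbm is the expected RSSI at 1 m; both parameters here are
    illustrative assumptions, not calibrated values."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def trilaterate(stations, distances):
    """Least-squares (x, y) estimate from three or more stations at known
    positions and range estimates to each. The circle equations are
    linearised by subtracting the first from the others."""
    (x0, y0), d0 = stations[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(stations[1:], distances[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    # Solve the 2x2 normal equations (A^T A) p = A^T b directly
    a11 = sum(r[0] * r[0] for r in A)
    a12 = sum(r[0] * r[1] for r in A)
    a22 = sum(r[1] * r[1] for r in A)
    b1 = sum(r[0] * v for r, v in zip(A, b))
    b2 = sum(r[1] * v for r, v in zip(A, b))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Three base-stations at known positions (metres), hypothetical RSSI readings
stations = [(0.0, 0.0), (500.0, 0.0), (0.0, 500.0)]
readings = [-82.0, -85.0, -79.0]  # dBm
dists = [rssi_to_distance(r) for r in readings]
x, y = trilaterate(stations, dists)
```

In practice the range estimates are noisy, which is why field results diverge so widely from the idealised geometry shown here.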

The precision and reliability of these techniques varies greatly, depending on the circumstances prevailing at the time. The variability and unpredictability result in many mutually inconsistent statements by suppliers, in the general media, and even in the technical literature.

Techniques for cellular networks generally provide reasonably reliable estimates of location to within an indicative 50–100 m in urban areas and some hundreds of metres elsewhere. Worse performance has been reported in some field-tests, however. For example, Dahunsi and Dwolatzky (2012) found the accuracy of GSM location in Johannesburg to be in the range 200–1400 m, and highly variable, with “a huge difference between the predicted and provided accuracies by mobile location providers”.

The website of the Skyhook Wifi-router positioning service claims 10-m accuracy, 1-s time-to-first-fix and 99.8% reliability (SHW, 2012). On the other hand, tests have resulted in far lower accuracy measures, including an average positional error of 63 m in Sydney (Gallagher et al., 2009) and “median values for positional accuracy in [Las Vegas, Miami and San Diego, which] ranged from 43 to 92 metres… [and] the replicability… was relatively poor” (Zandbergen, 2012, p. 35). Nonetheless, a recent research article suggested the feasibility of “uncooperatively and covertly detecting people ‘through the wall’ [by means of their WiFi transmissions]” (Chetty et al., 2012).
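Positional accuracy figures of the kind reported in these field-tests are typically obtained by computing the great-circle distance between each fix the device reports and a surveyed ground-truth point. A minimal sketch, with invented coordinates for illustration:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points,
    using a spherical-earth approximation (R = 6371 km)."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical test point: surveyed ground truth vs. a Wifi-derived fix
truth = (-33.8688, 151.2093)
fix = (-33.8691, 151.2099)
error_m = haversine_m(*truth, *fix)  # positional error of this single fix
```

A field study repeats this over many fixes and reports the average or median error, which is how figures such as the 63 m Sydney result are derived.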

Another way in which a device's location may become known to other devices is through self-reporting of the device's position, most commonly by means of an inbuilt Global Positioning System (GPS) chipset. This provides coordinates and altitude based on broadcast signals received from a network of satellites. In any particular instance, the user of the device may or may not be aware that location is being disclosed.

Despite widespread enthusiasm and a moderate level of use, GPS is subject to a number of important limitations. The signals are subject to interference from atmospheric conditions, buildings and trees, and the time to achieve a fix on enough satellites and deliver a location measure may be long. This results in variability in its practical usefulness in different circumstances, and in its accuracy and reliability. Civil-use GPS coordinates are claimed to provide accuracy within a theoretical 7.8 m at a 95% confidence level (USGov, 2012), but various reports suggest 15 m, 20 m or 30 m in practice, and sometimes as much as 100 m. It may be affected by radio interference and jamming. The original and still-dominant GPS service operated by the US Government was subject to intentional degradation in the US's national interests. This ‘Selective Availability’ feature still exists, although subject to a decade-long policy not to use it; and future generations of GPS satellites may no longer support it.

Hybrid schemes exist that use two or more sources in order to generate more accurate location-estimates, or to generate estimates more quickly. In particular, Assisted GPS (A-GPS) utilises data from terrestrial servers accessed over cellular networks in order to more efficiently process satellite-derived data (e.g. RE, 2012).

Further categories of location and tracking technologies emerge from time to time. A current example uses means described by the present authors as ‘mobile device signatures’ (MDS). A device may monitor the signals emanating from a user's mobile device, without being part of the network that the user's device is communicating with. The eavesdropping device may detect particular signal characteristics that distinguish the user's mobile device from others in the vicinity. In addition, it may apply any of the various techniques mentioned above, in order to locate the device. If the signal characteristics are persistent, the eavesdropping device can track the user's mobile device, and hence the person carrying it. No formal literature on MDS has yet been located. The supplier's brief description is at PI (2010b).

The various technologies described in this section are capable of being applied to many purposes. The focus in this paper is on their application to surveillance.

3. Surveillance

The term surveillance refers to the systematic investigation or monitoring of the actions or communications of one or more persons (Clarke, 2009c). Until recent times, surveillance was visual, and depended on physical proximity of an observer to the observed. The volume of surveillance conducted was kept in check by the costs involved. Surveillance aids and enhancements emerged, such as binoculars and, later, directional microphones. During the 19th century, the post was intercepted, and telephones were tapped. During the 20th century, cameras enabled transmission of image, video and sound to remote locations, and recording for future use (e.g. Parenti, 2003).

With the surge in stored personal data that accompanied the application of computing to administration in the 1970s and 1980s, dataveillance emerged (Clarke, 1988). Monitoring people through their digital personae rather than through physical observation of their behaviour is much more economical, and hence many more people can be subjected to it (Clarke, 1994). The dataveillance epidemic made it more important than ever to clearly distinguish between personal surveillance – of an identified person who has previously come to attention – and mass surveillance – of many people, not necessarily previously identified, about some or all of whom suspicion could be generated.

Location data is of a very particular nature, and hence it has become necessary to distinguish location surveillance as a sub-set of the general category of dataveillance. There are several categories of location surveillance with different characteristics (Clarke and Wigan, 2011):

• capture of an individual's location at a point in time. Depending on the context, this may support inferences being drawn about an individual's behaviour, purpose, intention and associates

• real-time monitoring of a succession of locations and hence of the person's direction of movement. This is far richer data, and supports much more confident inferences being drawn about an individual's behaviour, purpose, intention and associates

• predictive tracking, by extrapolation from the person's direction of movement, enabling inferences to be drawn about near-future behaviour, purpose, intention and associates

• retrospective tracking, on the basis of the data trail of the person's movements, enabling reconstruction of a person's behaviour, purpose, intention and associates at previous times
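The predictive case can be illustrated with the simplest possible motion model: constant velocity extrapolated from the last two fixes in a trail. Real trackers use far richer models (road networks, habitual routines), so this is only a sketch.

```python
def predict_next(trail, steps=1):
    """Extrapolate future (x, y) positions from the last two fixes of a
    location trail, assuming constant velocity between samples."""
    (x1, y1), (x2, y2) = trail[-2], trail[-1]
    vx, vy = x2 - x1, y2 - y1  # displacement per sampling interval
    return [(x2 + vx * k, y2 + vy * k) for k in range(1, steps + 1)]

# A person moving steadily north-east, one fix per sampling interval
trail = [(0, 0), (10, 5), (20, 10), (30, 15)]
predicted = predict_next(trail, steps=2)  # → [(40, 20), (50, 25)]
```

Even this crude extrapolation shows why a succession of locations supports much stronger inferences than any single fix: the trail itself encodes direction and likely destination.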

Information arising at different times, and from different forms of surveillance, can be combined, in order to offer a more complete picture of a person's activities, and enable yet more inferences to be drawn, and suspicions generated. This is the primary sense in which the term ‘überveillance’ is applied: “Überveillance has to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought). Überveillance can be a predictive mechanism for a person's expected behaviour, traits, likes, or dislikes; or it can be based on historical fact; or it can be something in between… Überveillance is more than closed circuit television feeds, or cross-agency databases linked to national identity cards, or biometrics and ePassports used for international travel. Überveillance is the sum total of all these types of surveillance and the deliberate integration of an individual's personal data for the continuous tracking and monitoring of identity and location in real time” (Michael and Michael, 2010; see also Michael and Michael, 2007; Michael et al., 2008, 2010; Clarke, 2010).

A comprehensive model of surveillance includes consideration of geographical scope, and of temporal scope. Such a model assists the analyst in answering key questions about surveillance: of what? for whom? by whom? why? how? where? and when? (Clarke, 2009c). Distinctions are also needed based on the extent to which the subject has knowledge of surveillance activities. It may be overt or covert. If covert, it may be merely unnotified, or alternatively express measures may be undertaken in order to obfuscate, and achieve secrecy. A further element is the notion of ‘sousveillance’, whereby the tools of surveillance are applied, by those who are commonly watched, against those who are commonly the watchers (Mann et al., 2003).

These notions are applied in the following sections in order to establish the extent to which location and tracking of mobile devices is changing the game of surveillance, and to demonstrate that location surveillance is intruding more deeply into personal freedoms than previous forms of surveillance.

4. Applications

This section presents a typology of applications of mobile device location, as a means of narrowing down to the kinds of uses that have particularly serious privacy implications. These are commonly referred to as location-based services (LBS). One category of applications provides information services that are for the benefit of the mobile device's user, such as navigation aids, and search and discovery tools for the locations variously of particular, identified organisations, and of organisations that sell particular goods and services. Users of LBS of these kinds can be reasonably assumed to be aware that they are disclosing their location. Depending on the design, the disclosures may also be limited to specific service-providers and specific purposes, and the transmissions may be secured.

Another, very different category of application is use by law enforcement agencies (LEAs). The US E-911 mandate of 1999 was nominally a public safety measure, to enable people needing emergency assistance to be quickly and efficiently located. In practice, the facility also delivered LEAs means for locating and tracking people of interest, through their mobile devices. Personal surveillance may be justified by reasonable grounds for suspicion that the subject is involved in serious crime, and may be specifically authorised by judicial warrant. Many countries have always been very loose in their control over LEAs, however, and many others have drastically weakened their controls since 2001. Hence, in any given jurisdiction and context, each and all of the controls may be lacking.

Yet worse, LEAs use mobile location and tracking for mass surveillance, without any specific grounds for suspicion about any of the many people caught up in what is essentially a dragnet-fishing operation (e.g. Mery, 2009). Examples might include monitoring the area adjacent to a meeting-venue watching out for a blacklist of device-identifiers known to have been associated with activists in the past, or collecting device-identifiers for use on future occasions. In addition to netting the kinds of individuals who are of legitimate interest, the ‘by-catch’ inevitably includes threatened species. There are already extraordinarily wide-ranging (and to a considerable extent uncontrolled) data retention requirements in many countries.

Of further concern is the use of Automated Number Plate Recognition (ANPR) for mass surveillance purposes. This has been out of control in the UK since 2006, and has been proposed or attempted in various other countries as well (Clarke, 2009a). Traffic surveillance is expressly used not only for retrospective analysis of the movements of individuals of interest to LEAs, but also as a means of generating suspicions about other people (Lewis, 2008).

Beyond LEAs, many government agencies perform social control functions, and may be tempted to conduct location and tracking surveillance. Examples would include benefits-paying organisations tracking the movements of benefits-recipients about whom suspicions have arisen. It is not too far-fetched to anticipate zealous public servants concerned about fraud control imposing location surveillance on all recipients of some particularly valuable benefit, or as a security precaution on every person visiting a sensitive area (e.g. a prison, a power plant, a national park).

Various forms of social control are also exercised by private sector organisations. Some of these organisations, such as placement services for the unemployed, may be performing outsourced public sector functions. Others, such as workers' compensation providers, may be seeking to control personal insurance claimants, and similarly car-hire companies and insurance providers may wish to monitor motor vehicles' distance driven and roads used (Economist, 2012; Michael et al., 2006b).

A further privacy-invasive practice that is already common is the acquisition of location and tracking data by marketing corporations, as a by-product of the provision of location-based services, but with the data then applied to further purposes other than that for which it was intended. Some uses rely on statistical analysis of large holdings (‘data mining’). Many uses are, on the other hand, very specific to the individual, and are for such purposes as direct or indirect targeting of advertisements and the sale of goods and services. Some of these applications combine location data with data from other sources, such as consumer profiling agencies, in order to build up such a substantial digital persona that the individual's behaviour is readily influenced. This takes the activity into the realms of überveillance.

All such services raise serious privacy concerns, because the data is intensive and sensitive, and attractive to organisations. Companies may gain rights in relation to the data through market power, or by trickery – such as exploitation of a self-granted right to change the Terms of Service (Clarke, 2011). Once captured, the data may be re-purposed by any organisation that gains access to it, because the value is high enough that they may judge the trivial penalties that generally apply to breaches of privacy laws to be well worth the risk.

A recently-emerged, privacy-invasive practice is the application of the mobile device signature (MDS) form of tracking, in such locations as supermarkets. This is claimed by its providers to offer deep observational insights into the behaviour of customers, including dwell times in front of displays, possibly linked with the purchaser's behaviour. This raises concerns a little different from other categories of location and tracking technologies, and is accordingly considered in greater depth in the following section.

It is noteworthy that an early review identified a wide range of LBS, which the authors classified into mobile guides, transport, gaming, assistive technology and location-based health (Raper et al., 2007b). Yet that work completely failed to notice that a vast array of applications were emergent in surveillance, law enforcement and national security, despite the existence of relevant literature from at least 1999 onwards (Clarke, 2001; Michael and Masters, 2006).

5. Implications

The previous sections have introduced many examples of risks to citizens and consumers arising from location surveillance. This section presents an analysis of the categories and of the degree of seriousness with which they should be viewed. The first topic addressed is the privacy of personal location data. Other dimensions of privacy are then considered, and then the specific case of MDS is examined. The treatment here is complementary to earlier articles that have looked more generally at particular applications such as location-based mobile advertising, e.g. Cleff (2007, 2010) and King and Jessen (2010). See also Art. 29 (2011).

5.1. Locational privacy

Knowing where someone has been, knowing what they are doing right now, and being able to predict where they might go next is a powerful tool for social control and for chilling behaviour (Abbas, 2011). Humans do not move around in a random manner (Song et al., 2010).

One interpretation of ‘locational privacy’ is that it “is the ability of an individual to move in public space with the expectation that under normal circumstances their location will not be systematically and secretly recorded for later use” (Blumberg and Eckersley, 2009). A more concise definition is “the ability to control the extent to which personal location information is… [accessible and] used by others” (van Loenen et al., 2009). Hence ‘tracking privacy’ is the interest an individual has in controlling information about their sequence of locations.

Location surveillance is deeply intrusive into data privacy, because it is very rich, and enables a great many inferences to be drawn (Clarke, 2001; Dobson and Fisher, 2003; Michael et al., 2006a; Clarke and Wigan, 2011). As demonstrated by Raper et al. (2007a, pp. 32–33), most of the technical literature that considers privacy is merely concerned about it as an impediment to deployment and adoption, and how to overcome the barrier rather than how to solve the problem. Few authors adopt a positive approach to privacy-protective location technologies. The same authors' review of applications (Raper et al., 2007b) includes a single mention of privacy, and that is in relation to just one of the scores of sub-categories of application that they catalogue.

Most service-providers are cavalier in their handling of personal data, and extravagant in their claims. For example, Skyhook claims that it “respects the privacy of all users, customers, employees and partners”; but, significantly, it makes no mention of the privacy of the people whose locations, through the locations of their Wi-Fi routers, it collects and stores (Skyhook, 2012).

Consent is critical in such LBS as personal location chronicle systems, people-followers and footpath route-tracker systems that systematically collect personal location information from a device they are carrying (Collier, 2011c). The data handled by such applications is highly sensitive because it can be used to conduct behavioural profiling of individuals in particular settings. The sensitivity exists even if the individuals remain ‘nameless’, i.e. if each identifier is a temporary or pseudo-identifier and is not linked to other records. Service-providers, and any other organisations that gain access to the data, achieve the capacity to make judgements on individuals based on their choices of, for example, which retail stores they walk into and which they do not. For example, if a subscriber visits a particular religious bookstore within a shopping mall on a weekly basis, the assumption can be reasonably made that they are in some way affiliated to that religion (Samuel, 2008).

It is frequently asserted that individuals cannot have a reasonable expectation of privacy in a public space (Otterberg, 2005). Contrary to those assertions, however, privacy expectations always have existed in public places, and continue to exist (VLRC, 2010). Tracking the movements of people as they go about their business is a breach of a fundamental expectation that people will be ‘let alone’. In policing, for example, in most democratic countries, it is against the law to covertly track an individual or their vehicle without specific, prior approval in the form of a warrant. This principle has, however, been compromised in many countries since 2001. Warrantless tracking using a mobile device generally results in the evidence, which has been obtained without the proper authority, being inadmissible in a court of law (Samuel, 2008). Some law enforcement agencies have argued for the abolition of the warrant process because the bureaucracy involved may mean that the suspect cannot be prosecuted for a crime they have likely committed (Ganz, 2005). These issues are not new; but far from eliminating a warrant process, the appropriate response is to invest the energy in streamlining this process (Bronitt, 2010).

Privacy risks arise not only from locational data of high integrity, but also from data that is or becomes associated with a person and that is inaccurate, misleading, or wrongly attributed to that individual. High levels of inaccuracy and unreliability were noted above in respect of all forms of location and tracking technologies. In the case of MDS services, claims have been made of 1–2 m locational accuracy. This has, however, yet to be supported by experimental test cases, and hence there is uncertainty about the reliability of the inferences that the service-provider or the shop owner may draw. If the data is the subject of a warrant or subpoena, the data's inaccuracy could result in false accusations and even a miscarriage of justice, with the ‘wrong person’ finding themselves in the ‘right place’ at the ‘right time’.

5.2. Privacy more broadly

Privacy has multiple dimensions. One analysis, in Clarke (2006a), identifies four distinct aspects. Privacy of Personal Data, variously also ‘data privacy’ and ‘information privacy’, is the most widely discussed dimension of the four. Individuals claim that data about themselves should not be automatically available to other individuals and organisations, and that, even where data is possessed by another party, the individual must be able to exercise a substantial degree of control over that data and its use. The last five decades have seen the application of information technologies to a vast array of abuses of data privacy. The degree of privacy intrusiveness is a function of both the intensity and the richness of the data. Where multiple sources are combined, the impact is particularly likely to chill behaviour. An example is the correlation of video-feeds with mobile device tracking. The previous sub-section addressed that dimension.

Privacy of the Person, or ‘bodily privacy’, extends from freedom from torture and right to medical treatment, via compulsory immunisation and imposed treatments, to compulsory provision of samples of body fluids and body tissue, and obligations to submit to biometric measurement. Locational surveillance gives rise to concerns about personal safety. Physical privacy is directly threatened where a person who wishes to inflict harm is able to infer the present or near-future location of their target. Dramatic examples include assassins, kidnappers, ‘standover merchants’ and extortionists. But even people who are neither celebrities nor notorieties are subject to stalking and harassment (Fusco et al., 2012).

Privacy of Personal Communications is concerned with the need of individuals for freedom to communicate among themselves, without routine monitoring of their communications by other persons or organisations. Issues include ‘mail covers’, the use of directional microphones, ‘bugs’ and telephonic interception, with or without recording apparatus, and third-party access to email-messages. Locational surveillance thereby creates new threats to communications privacy. For example, the equivalent of ‘call records’ can be generated by combining the locations of two device-identifiers in order to infer that a face-to-face conversation occurred.
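The co-location inference described here can be illustrated with a minimal sketch. The distance and time thresholds below (10 m, 60 s) are illustrative assumptions only, not values drawn from any real system:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    R = 6371000.0  # mean Earth radius in metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def co_location_events(trace_a, trace_b, max_dist_m=10.0, max_dt_s=60.0):
    """Each trace is a list of (timestamp_s, lat, lon) observations for one
    device-identifier. Returns (time, distance_m) pairs where the two devices
    were observed close together at nearly the same moment -- the raw material
    for inferring a possible face-to-face meeting."""
    events = []
    for ta, lat_a, lon_a in trace_a:
        for tb, lat_b, lon_b in trace_b:
            if abs(ta - tb) <= max_dt_s:
                d = haversine_m(lat_a, lon_a, lat_b, lon_b)
                if d <= max_dist_m:
                    events.append((min(ta, tb), round(d, 1)))
    return events
```

The point of the sketch is that no message content is needed: two independent location streams suffice to generate the equivalent of a ‘call record’ for an in-person conversation.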

Privacy of Personal Behaviour encompasses ‘media privacy’, but particular concern arises in relation to sensitive matters such as sexual preferences and habits, political activities and religious practices. Some privacy analyses, particularly in Europe, extend this discussion to personal autonomy, liberty and the right of self-determination (e.g. King and Jessen, 2010). The notion of ‘private space’ is vital to economic and social aspects of behaviour, is relevant in ‘private places’ such as the home and toilet cubicles, but is also relevant and important in ‘public places’, where systematic observation and the recording of images and sounds are far more intrusive than casual observation by the few people in the vicinity.

Locational surveillance gives rise to rich sets of data about individuals' activities. The knowledge, or even suspicion, that such surveillance is undertaken, chills their behaviour. The chilling factor is vital in the case of political behaviour (Clarke, 2008). It is also of consequence in economic behaviour, because the inventors and innovators on whom new developments depend are commonly ‘different-thinkers’ and even ‘deviants’, who are liable to come to attention in mass surveillance dragnets, with the tendency to chill their behaviour, their interactions and their creativity.

Surveillance that generates accurate data is one form of threat. Surveillance that generates inaccurate data, or wrongly associates data with a particular person, is dangerous as well. Many inferences that arise from inaccurate data will be wrong, of course, but that will not prevent those inferences from being drawn, resulting in unjustified behavioural privacy invasiveness, including unjustified association with people who are, perhaps for perfectly good reasons, themselves under suspicion.

In short, all dimensions of privacy are seriously affected by location surveillance. For deeper treatments of the topic, see Michael et al. (2006b) and Clarke and Wigan (2011).

5.3. Locational privacy and MDS

The recent innovation of tracking by means of mobile device signatures (MDS) gives rise to some issues additional to, or different from, mainstream device location technologies. This section accordingly considers this particular technique's implications in greater depth. Limited reliable information is currently available, and the analysis is of necessity based on supplier-published sources (PI, 2010a, 2010b) and media reports (Collier, 2011a,b,c).

Path Intelligence (PI) markets an MDS service to shopping mall-owners, to enable them to better value their floor space in terms of rental revenues, and to identify points of on-foot traffic congestion to on-sell physical advertising and marketing floor space (PI, 2010a). The company claims to detect each phone (and hence person) that enters a zone, and to capture data, including:

• how long each device and person stay, including dwell times in front of shop windows;

• repeat visits by shoppers in varying frequency durations; and

• typical route and circuit paths taken by shoppers as they go from shop to shop during a given shopping experience.

For malls, PI is able to denote such things as whether or not shoppers who shop at one establishment will also shop at another in the same mall, and whether or not people will go out of their way to visit a particular retail outlet independent of its location. For retailers, PI says it is able to provide information on conversion rates by department or even product line, and even which areas of the store might require more attention by staff during specific times of the day or week (PI, 2012).
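PI's actual algorithms are proprietary and undisclosed; however, the kind of dwell-time analytics claimed above can be illustrated with a simplified sketch that derives per-zone dwell totals from a log of device sightings. The data shape (device, timestamp, zone) is an assumption for illustration:

```python
from collections import defaultdict

def dwell_times(pings):
    """pings: iterable of (device_id, timestamp_s, zone) sightings.
    Returns {(device_id, zone): total_seconds}, computed by summing the
    gaps between consecutive sightings of the same device in the same zone."""
    per_device = defaultdict(list)
    for dev, t, zone in pings:
        per_device[dev].append((t, zone))
    totals = defaultdict(float)
    for dev, obs in per_device.items():
        obs.sort()  # order each device's sightings by time
        for (t0, z0), (t1, z1) in zip(obs, obs[1:]):
            if z0 == z1:  # still in the same zone: accumulate dwell time
                totals[(dev, z0)] += t1 - t0
    return dict(totals)
```

Repeat visits and route paths follow the same pattern: once sightings are keyed by a persistent device identifier, visit frequency and zone-to-zone transition sequences are trivial aggregations over the same log, which is precisely why the persistence of the identifier is the privacy-critical element.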

PI says that it uses “complex algorithms” to denote the geographic position of a mobile phone, using strategically located “proprietary equipment” in a campus setting (PI, 2010a). The company states that it is conducting “data-driven analysis”, but is not collecting, or at least that it is not disclosing, any personal information such as a name, mobile telephone number or contents of a short message service (SMS). It states that it only ever provides aggregated data at varying zone levels to the shopping mall-owners. This is presumably justified on the basis that, using MDS techniques, direct identifiers are unlikely to be available, and a pseudo-identifier needs to be assigned. There is no explicit definition of what constitutes a zone. It is clear, however, that minimally-aggregated data at the highest geographic resolution is available for purchase, and at a higher price than more highly-aggregated data.
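The assignment of a pseudo-identifier and the release of only aggregated zone counts, as described above, can be sketched as follows. The salt, the truncated hash length and the small-cell suppression threshold are hypothetical choices for illustration; note that a stable pseudo-identifier still permits tracking of the same device across visits, so this is pseudonymisation, not anonymisation:

```python
import hashlib

def pseudonymise(device_id, salt="run-specific-secret"):
    """Replace a device identifier with a salted, truncated hash. The same
    device always maps to the same token, so longitudinal tracking remains
    possible even though the direct identifier is never stored."""
    return hashlib.sha256((salt + device_id).encode()).hexdigest()[:16]

def zone_counts(pings, min_count=5):
    """Aggregate (device_id, zone) sightings to per-zone unique-device counts,
    suppressing zones below a minimum count (a crude small-cell rule)."""
    seen = {}
    for dev, zone in pings:
        seen.setdefault(zone, set()).add(pseudonymise(dev))
    return {z: len(s) for z, s in seen.items() if len(s) >= min_count}
```

The sketch makes the commercial logic visible: the finer the zone definition and the lower the suppression threshold, the closer the ‘aggregated’ product comes to individual-level data, which is consistent with the observation that minimally-aggregated data commands a higher price.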

Shoppers have no relationship with the company, and it appears unlikely that they would even be aware that data about them is being collected and used. The only disclosure appears to be that “at each of our installations our equipment is clearly visible and labelled with our logo and website address” (PI, 2010a), but this is unlikely to be visible to many people, and in any case would not inform anyone who saw it.

In short, the company is generating revenue by monitoring signals from the mobile devices of people who visit a shopping mall for the purchase of goods and services. The data collection is performed without the knowledge of the person concerned (Renegar et al., 2008). The company is covertly collecting personal data and exploiting it for profit. There is no incentive or value proposition for the individual whose mobile is being tracked. No clear statement is provided about collection, storage, retention, use and disclosure of the data (Arnold, 2008). Even if privacy were not a human right, this would demand statutory intervention on the public policy grounds of commercial unfairness. The company asserts that “our privacy approach has been reviewed by the [US Federal Trade Commission] FTC, which determined that they are comfortable with our practices” (PI, 2010a). It makes no claims of such ‘approval’ anywhere else in the world.

The service could be extended beyond a mall and the individual stores within it, to, for example, associated walkways and parking areas, and surrounding areas such as government offices, entertainment zones and shopping-strips. Applications can also be readily envisaged on hospital and university campuses, and in airports and other transport hubs. From prior research, this is likely to expose the individual's place of employment, and even their residence (Michael et al., 2006a,b). Even if only aggregated data is sold to businesses, the individual records remain available to at least the service-provider.

The scope exists to combine this form of locational surveillance with video-surveillance such as in-store CCTV, and indeed this is claimed to be already a feature of the company's offering to retail stores. To the extent that a commonly-used identifier can be established (e.g. through association with the person's payment or loyalty card at a point-of-sale), the full battery of local and externally acquired customer transaction histories and consolidated ‘public records’ data can be linked to in-store behaviour (Michael and Michael, 2007). Longstanding visual surveillance is intersecting with well-established data surveillance, and being augmented by locational surveillance, giving breath to dataveillance, or what is now being referred to by some as ‘smart surveillance’ (Wright et al., 2010; IBM, 2011).

Surreptitious collection of personal data is (with exemptions and exceptions) largely against the law, even when undertaken by law enforcement personnel. The MDS mechanism also flies in the face of telephonic interception laws. How, then, can it be in any way acceptable for a form of warrantless tracking to be undertaken by or on behalf of corporations or mainstream government agencies, of shoppers in a mall, or travellers in an airport, or commuters in a transport hub? Why should a service-provider have the right to do what a law enforcement agency cannot normally do?

6. Controls

The tenor of the discussion to date has been that location surveillance harbours enormous threats to location privacy, but also to personal safety, the freedom to communicate, freedom of movement, and freedom of behaviour. This section examines the extent to which protections exist, firstly in the form of natural or intrinsic controls, and secondly in the form of legal provisions. The existing safeguards are found to be seriously inadequate, and it is therefore necessary to also examine the prospects for major enhancements to law, in order to achieve essential protections.

6.1. Intrinsic controls

A variety of forms of safeguard exist against harmful technologies and unreasonable applications of them. The intrinsic economic control has largely evaporated, partly because the tools use electronics and the components are produced in high volumes at low unit cost. Another reason is that the advertising and marketing sectors are highly sophisticated, already hold and exploit vast quantities of personal data, and are readily geared up to exploit yet more data.

Neither the oxymoronic notion of ‘business ethics’ nor the personal morality of executives in business and government acts as any significant brake on the behaviours of corporations and governments, because they are very weak barriers, and they are readily rationalised away in the face of claims of enhanced efficiencies in, for example, marketing communications, fraud control, criminal justice and control over anti-social behaviour.

A further category of intrinsic control is ‘self-regulatory’ arrangements within relevant industry sectors. In 2010, for example, the Australian Mobile Telecommunications Association (AMTA) released industry guidelines to promote the privacy of people using LBS on mobile devices (AMTA, 2010). The guidelines were as follows:

1. Every LBS must be provided on an opt-in basis with a specific request from a user for the service

2. Every LBS must comply with all relevant privacy legislation

3. Every LBS must be designed to guard against consumers being located without their knowledge

4. Every LBS must allow consumers to maintain full control

5. Every LBS must enable customers to control who uses their location information and when that is appropriate, and be able to stop or suspend a service easily should they wish

The second point is a matter for parliaments, privacy oversight agencies and law enforcement agencies, and its inclusion in industry guidelines is for information only. The remainder, meanwhile, are at best ‘aspirational’, and at worst mere window-dressing. Codes of this nature are simply ignored by industry members. They are primarily a means to hold off the imposition of actual regulatory measures. Occasional short-term constraints may arise from flurries of media attention, but the ‘responsible’ organisations escape by suggesting that bad behaviour was limited to a few ‘cowboy’ organisations or was a one-time error that will not be repeated.

A case study of industry self-regulation is provided by the Biometrics Code issued by the misleadingly named Australian industry-and-users association, the Biometrics ‘Institute’ (BI, 2004). During the period 2009–2012, the privacy advocacy organisation, the Australian Privacy Foundation (APF), submitted to the Privacy Commissioner on multiple occasions that the Code failed to meet the stipulated requirements and under the Commissioner's own Rules had to be de-registered. The Code never had more than five subscribers (out of a base of well over 100 members – which was itself only a sub-set of organisations active in the area), and had no signatories among the major biometrics vendors or users, because all five subscribers were small organisations or consultants. In addition, none of the subscribers appear to have ever provided a link to the Code on their websites or in their Privacy Policy Statements (APF, 2012).

The Commissioner finally ended the farce in April 2012, citing the “low numbers of subscribers”, but avoided its responsibilities by permitting the ‘Institute’ to “request” revocation, over two years after the APF had made the same request (OAIC, 2012). The case represents an object lesson in the vacuousness of self-regulation and the business friendliness of a captive privacy oversight agency.

If economics, morality and industry sector politics are inadequate, perhaps competition and organisational self-interest might work. On the other hand, repeated proposals that privacy is a strategic factor for corporations and government agencies have fallen on stony ground (Clarke, 1996, 2006b).

The public can endeavour to exercise countervailing power against privacy-invasive practices. On the other hand, individuals acting alone are of little or no consequence to organisations that are intent on the application of location surveillance. Moreover, consumer organisations lack funding, professionalism and reach, and only occasionally attract sufficient media attention to force any meaningful responses from organisations deploying surveillance technologies.

Individuals may have direct surveillance countermeasures available to them, but relatively few people have the combination of motivation, technical competence and persistence to overcome lethargy and the natural human desire to believe that the institutions surrounding them are benign. In addition, some government agencies, corporations and (increasingly prevalent) public–private partnerships seek to deny anonymity, pseudonymity and multiple identities, and to impose so-called ‘real name’ policies, for example as a solution to the imagined epidemics of cyber-bullying, hate speech and child pornography. Individuals who use cryptography and other obfuscation techniques have to overcome the endeavours of business and government to stigmatise them as criminals with ‘something to hide’.

6.2. Legal controls

It is clear that natural or intrinsic controls have been utter failures in privacy matters generally, and will be in locational privacy matters as well. That leaves legal safeguards for personal freedoms as the sole protection. There are enormous differences among domestic laws relating to location surveillance. This section accordingly limits itself to generalities and examples.

Privacy laws are (with some qualifications, mainly in Europe) very weak instruments. Even where public servants and parliaments have an actual intention to protect privacy, rather than merely to overcome public concerns by passing placebo statutes, the draft Bills are countered by strong lobbying by government agencies and industry, to the extent that measures that were originally portrayed as being privacy-protective reach the statute books as authority for privacy breaches and surveillance (Clarke, 2000).

Privacy laws, once passed, are continually eroded by exceptions built into subsequent legislation, and by technological capabilities that were not contemplated when the laws were passed. In most countries, location privacy has yet to be specifically addressed in legislation. Even where it is encompassed by human rights and privacy laws, the coverage is generally imprecise and ambiguous. More direct and specific regulation may exist, however. In Australia, for example, the Telecommunications (Interception and Access) Act and the Surveillance Devices Act define and criminalise inappropriate interception and access, use, communication and publication of location information that is obtained from mobile device traffic (AG, 2005). On the other hand, when Google Inc. intercepted wi-fi signals and recorded the data that they contained, the Privacy Commissioner absolved the company (Riley, 2010), and the Australian Federal Police refused to prosecute despite the action – whether it was intentional, ‘inadvertent’ or merely plausibly deniable – being a clear breach of the criminal law (Moses, 2010; Stilgherrian, 2012).

The European Union determined a decade ago that location data that is identifiable to individuals is to some extent at least subject to existing data protection laws (EU, 2002). However, the wording of that so-called ‘e-Privacy Directive’ countenances the collection of “location data which are more precise than is necessary for the transmission of communications”, without clear controls over the justification, proportionality and transparency of that collection (para. 35). In addition, the e-Privacy Directive only applies to telecommunications service-providers, not to other organisations that acquire location and tracking data. King and Jessen (2010) discuss various gaps in the protective regimes in Europe.

The EU's Advisory Body (essentially a Committee of European Data Protection Commissioners) has issued an Opinion that mobile location data is generally capable of being associated with a person, and hence is personal data, and hence is subject to the EU Directive of 1995 and national laws that implement that Directive (Art. 29, 2011). Consent is considered to be generally necessary, and that consent must be informed, and sufficiently granular (pp. 13–18).

It is unclear, however, to what extent this Opinion has actually caused, and will in the future cause, organisations that collect, store, use and disclose location data to change their practices. This uncertainty exists in respect of national security, law enforcement and social control agencies, which have, or which can arrange, legal authority that overrides data protection laws. It also applies to non-government organisations of all kinds, which can take advantage of exceptions, exemptions, loopholes, non-obviousness, obfuscation, unenforceability within each particular jurisdiction, and extra-jurisdictionality, to operate in ways that are in apparent breach of the Opinion.

Legal authorities for privacy-invasions are in a great many cases vague rather than precise, and in many jurisdictions power in relation to specific decisions is delegated to a law enforcement agency (LEA) (in such forms as self-written ‘warrants’), or even a social control agency (in the form of demand-powers), rather than requiring a decision by a judicial officer based on evidence provided by the applicant.

Citizens in many countries are subject to more or less legitimate surveillance of various degrees and orders of granularity, by their government, in the name of law enforcement and national security. However, many Parliaments have granted powers to national security agencies to use location technology to track citizens and to intercept telecommunications. Moreover, many Parliaments have failed the public by permitting a warrant to be signed by a Minister, or even a public servant, rather than a judicial officer (Jay, 1999). Worse still, it appears that these already gross breaches of the principle of a free society are in effect being extended to the authorisation of a private organisation to track mobiles of ordinary citizens because it may lead to better services planning, or more efficient advertising and marketing (Collier, 2011a).

Data protection legislation in all countries evidences massive weaknesses. There are manifold exemptions and exceptions, and there are intentional and accidental exclusions, for example through limitations in the definitions of ‘identified’ and ‘personal data’. Even the much vaunted European laws fail to cope with extraterritoriality and are largely ignored by US-based service-providers. They are also focused exclusively on data, leaving large gaps in safeguards for physical, communications and behavioural privacy.

Meanwhile, a vast amount of abuse of personal data is achieved through the freedom of corporations and government agencies to pretend that Terms imposed on consumers and citizens without the scope to reject them are somehow the subject of informed and freely given consent. For example, petrol stations, supermarkets and many government agencies pretend that walking past signs saying ‘area subject to CCTV’ represents consent to gather, transmit, record, store, use and disclose data. The same approach is being adopted in relation to highly sensitive location data, and much vaunted data protection laws are simply subverted by the mirage of consent.

At least notices such as ‘you are now being watched’ or ‘smile, you are being recorded’ inform customers that they are under observation. On the other hand, people are generally oblivious to the fact that their mobile subscriber identity is transmitted from their mobile phone and multilaterated to yield a reasonably precise location in a shopping mall (Collier, 2011a,b,c). Further, there is no meaningful sense in which they can be claimed to have consented to providing location data to a third party, in this case a location service-provider with whom they have never had contact. And the emergent combination of MDS with CCTV sources becomes a pervasive view of the person, an ‘über’ view, providing a set of über-analytics to – at this stage – shopping complex owners and their constituents.
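The multilateration referred to above can be illustrated with a textbook sketch. Real MDS deployments rely on proprietary signal-timing or signal-strength measurements at multiple receivers; what follows is only the final geometric step, a linearised least-squares position fix from known receiver positions and estimated ranges, assuming at least three non-collinear receivers:

```python
# Linearised 2-D multilateration: subtracting the first range equation
# (x - x0)^2 + (y - y0)^2 = d0^2 from each of the others cancels the
# quadratic terms, leaving a linear system solved via normal equations.

def multilaterate(receivers, distances):
    """receivers: [(x, y)] known sensor positions (metres, local frame);
    distances: estimated ranges from each receiver to the handset.
    Returns the least-squares (x, y) position estimate."""
    (x0, y0), d0 = receivers[0], distances[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(receivers[1:], distances[1:]):
        rows.append((2 * (xi - x0), 2 * (yi - y0)))
        rhs.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    # Normal equations for the 2-unknown system (no external libraries).
    a11 = sum(r[0] * r[0] for r in rows)
    a12 = sum(r[0] * r[1] for r in rows)
    a22 = sum(r[1] * r[1] for r in rows)
    b1 = sum(r[0] * v for r, v in zip(rows, rhs))
    b2 = sum(r[1] * v for r, v in zip(rows, rhs))
    det = a11 * a22 - a12 * a12  # non-zero when receivers are not collinear
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)
```

Three fixed receivers in a mall are thus sufficient to place a handset to within metres, which is why the passive transmission of a subscriber identity is enough to support the tracking described, without any cooperation from the handset's owner.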

What rights do employees have if such a system were instituted in an employment setting (Michael and Rose, 2007, p. 252–3)? Are workplace surveillance laws in place that would protect employees from constant monitoring (Stern, 2007)? A similar problem applies to people at airports, or on hospital, university, industrial or government campuses. No social contract has been entered into between the parties, rendering the subscriber powerless.

Since the collapse of the Technology Assessment movement, technological deployment proceeds unimpeded, and public risks are addressed only after they have emerged and the clamour of concern has risen to a crescendo. A reactive force is at play, rather than proactive measures being taken to ensure avoidance or mitigation of potential privacy breaches (Michael et al., 2011). In Australia, for example, safeguards for location surveillance exist at best incidentally, in provisions under separate legislative regimes and in separate jurisdictions, and at worst not at all. No overarching framework exists to provide consistency among the laws. This causes confusion and inevitably results in inadequate protections (ALRC, 2008).

6.3. Prospective legal controls

Various learned studies have been conducted, but gather dust. In Australia, the three major law reform commissions have all reported, and all have been ignored by the legislatures (NSWLRC, 2005; ALRC, 2008; VLRC, 2010).

One critical need is for the fundamental principle to be recovered, to the effect that the handling of personal data requires either consent or legal authority. Consent is meaningless as a control over unreasonable behaviour, however, unless it satisfies a number of key conditions. It must be informed, it must be freely given, and it must be sufficiently granular, not bundled (Clarke, 2002). In a great many of the circumstances in which organisations are claiming to have consent to gather, store, use and disclose location data, the consumer does not appreciate the scope of the handling that the service-provider is authorising itself to perform; the Terms are imposed by the service-provider and may even be varied or completely re-written without consultation, a period of notice or even any notice at all; and consent is bundled rather than the individual being able to construct a pattern of consents and denials that suit their personal needs. Discussions all too frequently focus on the specifically-US notion of ‘opt-out’ (or ‘presumed consent’), with consent debased to ‘opt-in’, and deprecated as inefficient and business-unfriendly.

Recently, some very weak proposals have been put forward, primarily in the USA. In 2011, for example, two US Senators proposed a Location Privacy Protection Bill (Cheng, 2011). An organisation that collected location data from mobile or wireless data devices would have to state explicitly in their privacy policies what was being collected, in plain English. This would represent only a partial implementation of the already very weak 2006 recommendation of the Internet Engineering Task Force for Geographic Location/Privacy (IETF GEOPRIV) working group, which decided that technical systems should include ‘Fair Information Practices’ (FIPs) to defend against harms associated with the use of location technologies (EPIC, 2006). FIPs, however, are themselves only a highly cut-down version of effective privacy protections, and the Bill proposes only a small fraction of FIPs. It would be close to worthless to consumers, and close to legislative authorisation for highly privacy-invasive actions by organisations.

Two other US senators tabled a GPS Bill, nominally intended to “balance the needs of Americans' privacy protections with the legitimate needs of law enforcement, and maintains emergency exceptions” (Anderson, 2011). The scope is very narrow – next would have to come the Wi-Fi Act, the A-GPS Act, etc. That approach is obviously unviable in the longer term as new innovations emerge. Effective legislation must have appropriate generality rather than excessive technology-specificity, and should be based on semantics not syntax. Yet worse, these Bills would provide legal authorisation for grossly privacy-invasive location and tracking. IETF engineers, and now Congressmen, want to compromise human rights and increase the imbalance of power between business and consumers.

7. Conclusions

Mobile device location technologies and their applications are enabling surveillance, and producing an enormous leap in intrusions into data privacy and into privacy of the person, privacy of personal communications, and privacy of personal behaviour.

Existing privacy laws are entirely incapable of protecting consumers and citizens against the onslaught. Even where consent is claimed, it generally fails the tests of being informed, freely given and granular.

There is an urgent need for outcries from oversight agencies, and responses from legislatures. Individual countries can provide some degree of protection, but the extra-territorial nature of so much of the private sector, and the use of corporate havens, in particular the USA, mean that multilateral action is essential in order to overcome the excesses arising from the US laissez faire traditions.

One approach to the problem would be location privacy protection legislation, although it would need to embody the complete suite of protections rather than the mere notification that the technology breaches privacy. An alternative approach is amendment of the current privacy legislation and other anti-terrorism legislation in order to create appropriate regulatory provisions, and close the gaps that LBS providers are exploiting (Koppel, 2010).

The chimeras of self-regulation, and the unenforceability of guidelines, are not safeguards. Sensitive data like location information must be subject to actual, enforced protections, with guidelines and codes no longer used as a substitute, but merely playing a supporting role. Unless substantial protections for personal location information are enacted and enforced, there will be an epidemic of unjustified, disproportionate and covert surveillance, conducted by government and business, and even by citizens (Gillespie, 2009; Abbas et al., 2011).


Acknowledgements

A preliminary version of the analysis presented in this paper appeared in the November 2011 edition of Precedent, the journal of the Lawyers Alliance. The article has been significantly updated as a result of comments provided by the referees and editor.


References

R. Abbas, The social and behavioural implications of location-based services: an observational study of users, Journal of Location Based Services, 5 (3–4) (December 2011)

R. Abbas, K. Michael, M.G. Michael, A. Aloudat, Emerging forms of covert surveillance using GPS-enabled devices, Journal of Cases on Information Technology, 13 (2) (2011), pp. 19-33

AG, What the government is doing: Surveillance Device Act 2004, Australian Government (25 May 2005) at

ALRC, For your information: Australian privacy law and practice (ALRC report 108), Australian Government (2008), vol. 2, pp. 1409-10,

AMTA, New mobile telecommunications industry guidelines and consumer tips set benchmark for location based services, Australian Mobile Telecommunications Association (2010) at

N. Anderson, Bipartisan bill would end government's warrantless GPS tracking, Ars Technica (June 2011) at

APF, Revocation of the biometrics industry code, Australian Privacy Foundation (March 2012) at

B. Arnold, Privacy guide, Caslon Analytics (May 2008), at

Art. 29, Opinion 13/2011 on geolocation services on smart mobile devices, Article 29 Data Protection Working Party, 881/11/EN WP 185 (16 May 2011), at

BI, Privacy code, Biometrics Institute, Sydney (April 2004) at

A.J. Blumberg, P. Eckersley, On locational privacy, and how to avoid losing it forever, Electronic Frontier Foundation (August 2009), at

S. Bronitt, Regulating covert policing methods: from reactive to proactive models of admissibility, S. Bronitt, C. Harfield, K. Michael (Eds.), The social implications of covert policing (2010), pp. 9-14

J. Cheng, Franken's location-privacy bill would close mobile-tracking ‘loopholes’, Wired (17 June 2011), at

K. Chetty, G.E. Smith, K. Woodbridge, Through-the-wall sensing of personnel using passive bistatic WiFi radar at standoff distances, IEEE Transactions on Geoscience and Remote Sensing, 50 (4) (April 2012), pp. 1218-1226

R. Clarke, Information technology and dataveillance, Communications of the ACM, 31 (5) (May 1988), pp. 498-512, at

R. Clarke, The digital persona and its application to data surveillance, The Information Society, 10 (2) (June 1994), pp. 77-92, at

Clarke R. Privacy and dataveillance, and organisational strategy. In: Proc. I.S. Audit & Control Association (EDPAC'96), Perth, Western Australia; May 1996, at

R. Clarke, Submission to the Commonwealth Attorney-General re: ‘a privacy scheme for the private sector: release of key provisions’ of 14 December 1999, Xamax Consultancy Pty Ltd (January 2000) at

R. Clarke, Person-location and person-tracking: technologies, risks and policy implications, Information Technology & People, 14 (2) (Summer 2001), pp. 206-231, at

Clarke R. e-Consent: a critical element of trust in e-business. In: Proc. 15th Bled electronic commerce conference, Bled, Slovenia; June 2002, at

R. Clarke, What's ‘privacy’? Xamax Consultancy Pty Ltd (2006), August 2006, at

R. Clarke, Make privacy a strategic factor – the why and the how, Cutter IT Journal, 19 (11) (2006), at

R. Clarke, Dissidentity: the political dimension of identity and privacy, Identity in the Information Society, 1 (1) (December 2008), pp. 221-228, at

Clarke R. The covert implementation of mass vehicle surveillance in Australia. In: Proc 4th workshop on the social implications of national security: covert policing, April 2009, ANU, Canberra; 2009a, at

Clarke R. A sufficiently rich model of (id)entity, authentication and authorisation. In: Proc. IDIS 2009 – the 2nd multidisciplinary workshop on identity in the Information Society, LSE, 5 June 2009; 2009b, at

R. Clarke, A framework for surveillance analysis, Xamax Consultancy Pty Ltd (2009), August 2009, at

R. Clarke, What is überveillance? (And what should be done about it?) IEEE Technology and Society, 29 (2) (Summer 2010), pp. 17-25, at

Clarke R. The cloudy future of consumer computing. In: Proc. 24th Bled eConference; June 2011, at

R. Clarke, M. Wigan, You are where you've been: the privacy implications of location and tracking technologies, Journal of Location Based Services, 5 (3–4) (December 2011), pp. 138-155,

E.B. Cleff, Implementing the legal criteria of meaningful consent in the concept of mobile advertising, Computer Law & Security Review, 23 (2) (2007), pp. 262-269

E.B. Cleff, Effective approaches to regulate mobile advertising: moving towards a coordinated legal, self-regulatory and technical response, Computer Law & Security Review, 26 (2) (2010), pp. 158-169

K. Collier, Stores spy on shoppers, Herald Sun (2011), 12 October 2011, at

K. Collier, Shopping centres' Big Brother plan to track customers, Herald Sun (2011), 14 October 2011, at

K. Collier, ‘Creepy’ path intelligence retail technology tracks shoppers, (2011), 14 October 2011, at

F. Dahunsi, B. Dwolatzky, An empirical investigation of the accuracy of location-based services in South Africa, Journal of Location Based Services, 6 (1) (March 2012), pp. 22-34

J. Dobson, P. Fisher, Geoslavery, IEEE Technology and Society, 22 (2003), pp. 47-52, cited in Raper et al. (2007)

Economist, Vehicle data recorders – watching your driving, The Economist (23 June 2012), at

J. Edwards, Apple has quietly started tracking iphone users again, and it's tricky to opt out, Business Insider (11 October 2012) at

EPIC, Privacy and human rights report 2006, Electronic Privacy Information Center, WorldLII (2006) at

EPIC, Investigations of Google street view, Electronic Privacy Information Center (2012), at

EU, Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), Official Journal, L 201, 31/07/2002, pp. 0037-0047, European Commission, at

J. Figueiras, S. Frattasi, Mobile positioning and tracking: from conventional to cooperative techniques, Wiley (2010)

S.J. Fusco, R. Abbas, K. Michael, A. Aloudat, Location-based social networking and its impact on trust in relationships, IEEE Technology and Society Magazine, 31 (2) (Summer 2012), pp. 39-50, at

Gallagher T et al. Trials of commercial Wi-Fi positioning systems for indoor and urban canyons. In: Proc. IGNSS symposium, Queensland; 1–3 December 2009, cited in Zandbergen (2012).

J.S. Ganz, It's already public: why federal officers should not need warrants to use GPS vehicle tracking devices, Journal of Criminal Law and Criminology, 95 (4) (Summer 2005), pp. 1325-1337

A.A. Gillespie, Covert surveillance, human rights and the law, Irish Criminal Law Journal, 19 (3) (August 2009), pp. 71-79

IBM, IBM smart surveillance system (previously the PeopleVision project), IBM Research (30 October 2011), at

D.M. Jay, Use of covert surveillance obtained by search warrant, Australian Law Journal, 73 (1) (Jan 1999), pp. 34-36

N.J. King, P.W. Jessen, Profiling the mobile customer – privacy concerns when behavioural advertisers target mobile phones, Computer Law & Security Review, 26 (5) (2010), pp. 455-478, and 2010; 26(6): 595–612

A. Koppel, Warranting a warrant: fourth amendment concerns raised by law enforcement's warrantless use of GPS and cellular phone tracking, University of Miami Law Review, 64 (3) (April 2010), pp. 1061-1089

P. Lewis, Fears over privacy as police expand surveillance project, The Guardian (15 September 2008) at

B. van Loenen, J. Zevenbergen, J. de Jong, Balancing location privacy with national security: a comparative analysis of three countries through the balancing framework of the European court of human rights, N.J. Patten, et al. (Eds.), National security: institutional approaches, Nova Science Publishers (2009), [chapter 2]

M. McGuire, K.N. Plataniotis, A.N. Venetsanopoulos, Data fusion of power and time measurements for mobile terminal location, IEEE Transaction on Mobile Computing, 4 (2005), pp. 142-153, cited in Raper et al. (2007)

S. Mann, J. Nolan, B. Wellman, Sousveillance: inventing and using wearable computing devices for data collection in surveillance environments, Surveillance & Society, 1 (3) (June 2003), pp. 331-355, at

Mautz R. Overview of indoor positioning technologies. Keynote. In: Proc. IPIN'2011, Guimaraes; September 2011, at

D. Mery, The mobile phone as self-inflicted surveillance – and if you don't have one, what have you got to hide? The Register (10 April 2009) at

K. Michael, M.G. Michael, From dataveillance to überveillance and the Realpolitik of the Transparent Society, University of Wollongong (2007) at

K. Michael, M.G. Michael, Innovative automatic identification and location-based services: from bar codes to chip implants, IGI Global (2009)

M.G. Michael, K. Michael, Towards a state of uberveillance, IEEE Technology and Society Magazine, 29 (2) (Summer 2010), pp. 9-16, at

Michael K, McNamee A, Michael MG, Tootell H., Location-based intelligence – modeling behavior in humans using GPS. In: Proc. int'l symposium on technology and society, New York, 8–11 June 2006; 2006a, at

Michael K, McNamee A, Michael MG. The emerging ethics of humancentric GPS tracking and monitoring. In: Proc. int'l conf. on mobile business, Copenhagen, Denmark. IEEE Computer Society; 2006b, at

M.G. Michael, S.J. Fusco, K. Michael, A research note on ethics in the emerging age of uberveillance, Computer Communications, 31 (6) (2008), pp. 1192-1199, at

K. Michael, A. Masters, Realized applications of positioning technologies in defense intelligence, H. Hussein Abbass, D. Essam (Eds.), Applications of information systems to homeland security and defense, Idea Group Publishing (2006), at

K. Michael, G. Rose, Human tracking technology in mutual legal assistance and police inter-state cooperation in international crimes, K. Michael, M.G. Michael (Eds.), From dataveillance to überveillance and the realpolitik of the transparent society. 1st ed, University of Wollongong, Wollongong (2007), pp. 241-256.

K. Michael, G. Roussos, G.Q. Huang, R. Gadh, A. Chattopadhyay, S. Prabhu, et al., Planetary-scale RFID services in an age of uberveillance, Proceedings of the IEEE, 98 (9) (2010), pp. 1663-1671

K. Michael, M.G. Michael, R. Abbas, The importance of scenarios in the prediction of the social implications of emerging technologies and services, Journal of Cases on Information Technology, 13 (2) (2011), pp. i-vii

A. Moses, Google escapes criminal charges for Wi-Fi snooping, The Sydney Morning Herald (6 December 2010) at

NSWLRC, Surveillance, Report 108, NSW Law Reform Commission (2005) at

OAIC, Office of the Australian Information Commissioner (April 2012), at

A.A. Otterberg, Note: GPS tracking technology: the case for revisiting Knotts and shifting the Supreme Court's theory of the public space under the fourth amendment, Boston College Law Review, 46 (2005), pp. 661-704

C. Parenti, The soft cage: surveillance in America from slavery to the war on terror, Basic Books (2003)

PI, Our commitment to privacy, Path Intelligence (2010), heading changed in late 2012 to ‘privacy by design’, at

PI, FootPath technology, Path Intelligence (2010) at

PI, Retail, Path Intelligence (2012), at

J. Raper, G. Gartner, H. Karimi, C. Rizos, A critical evaluation of location based services and their potential, Journal of Location Based Services, 1 (1) (2007), pp. 5-45

J. Raper, G. Gartner, H. Karimi, C. Rizos, Applications of location-based services: a selected review, Journal of Location Based Services, 1 (2) (2007), pp. 89-111

RE, IEEE 802.11 standards tutorial (2010), apparently of 2010, at

RE, WiMAX IEEE 802.16 technology tutorial (2010), apparently of 2010, at

RE, Assisted GPS, A-GPS (2012), apparently of 2012, at

Renegar BD, Michael K, Michael MG. Privacy, value and control issues in four mobile business applications. In: Proc. 7th int'l conf. on mobile business; 2008. p. 30–40.

J. Riley, Gov't ‘travesty’ in Google privacy case, ITWire, 20 (Wednesday 3 November 2010), p. 44, at

I.J. Samuel, Warrantless location tracking, New York University Law Review, 83 (2008), pp. 1324-1352

SHW, Skyhook location performance (2012), at

Skyhook, Website entries (2012), including 'frequently asked questions' at, 'privacy policy' at, and 'location privacy' at

C. Song, Z. Qu, N. Blumm, A.-L. Barabási, Limits of predictability in human mobility, Science, 327 (5968) (2010), pp. 1018-1021.

A. Stern, Man fired thanks to GPS tracking, Center Networks (31 August 2007), at

Stilgherrian, Forget government data retention, Google has you wired, Crikey (2 October 2012), at

USGov, GPS accuracy, National Coordination Office for Space-Based Positioning, Navigation, and Timing (February 2012), at

VLRC, Surveillance in public spaces, Victorian Law Reform Commission (March 2010), Final report 18, at

D. Wright, M. Friedewald, S. Gutwirth, M. Langheinrich, E. Mordini, R. Bellanova, et al., Sorting out smart surveillance, Computer Law & Security Review, 26 (4) (2010), pp. 343-354

P.A. Zandbergen, Comparison of WiFi positioning on two mobile devices, Journal of Location Based Services, 6 (1) (March 2012), pp. 35-50

Keywords: Location-based systems (LBS), Cellular mobile, Wireless LAN, GPS, Mobile device signatures (MDS), Privacy, Surveillance, Überveillance

Citation: Katina Michael and Roger Clarke, "Location and tracking of mobile devices: Überveillance stalks the streets", Computer Law & Security Review, Vol. 29, No. 3, June 2013, pp. 216-228, DOI: