My Research Programme (2002 - Now)

High-Tech Child's Play in the Cloud

Introduction

The “internet of things” mantra promotes the potential for the interconnectedness of everyone and everything [1]. The fundamental premise is that embedded sensors (including audio and image) will usher in an age of convenience, security, and quick response [2]. We have become so oblivious to the presence and placement of sensors in civil infrastructure (e.g., shopping centers and lampposts) and computing devices (e.g., laptops and smartphones) that we do not question their appearance in places of worship, restrooms, and, especially, children's toys [3].

The risk with consumer desensitization to the “sensors everywhere” paradigm is, at times, complacency, but, for the greater part, apathy. When functionality is hidden inside a black box or is wireless, consumers can underestimate the potential for harm. The adage “what you don't know won't hurt you” is not true in this context, and neither is the “I have nothing to hide” principle. Form factors can play a significant role in disarming buyers of household white goods and of gifts for minors. In context, the power of a sensor looks innocent when it is located in a children's toy, as opposed to sitting atop a mobile closed-circuit television policing unit.

Barbie is Watching

The Mattel Vidster is a digital tapeless camcorder that was marketed as a children's toy. It features a 28-mm LCD display and a 2× digital zoom, and it records 320 × 240 AVI video files encoded with the M-JPEG codec at 15 frames/s, with 22-kHz monaural sound. It also takes still photos.

An example of this shift in context is Mattel's Video Girl Barbie doll, launched in July 2010 [4]. It features a fully functional standard-definition pinhole video camera embedded in Barbie's chest, with a viewing screen on her back. The user design supports young children (Mattel targets ages six years and above) in using the “doll's-eye view” to record Barbie's point of view for up to 30 min. They can then create movies using the accompanying StoryTeller software. Video Girl comes with a (pink) USB plug-in cord for easy upload of the recorded footage. Initially, Mattel provided storage space in the cloud for video makers to share movies (http://barbie.com/videogirl), but the company later reversed course and eliminated this video-sharing capability. We have speculated that one of Mattel's reasons for doing so was the prospect of footage recorded at ground level that exposed young, carefree children at play.

The Barbie Video Girl doll—Create movies from Barbie's point-of-view with a real video camera inside the doll (the camera lens is in the necklace, and the video screen is on her back).

In his book Principles of Cybercrime, Jonathan Clough makes it clear that child pornography offenses are stipulated in Title 3, Article 9 of the Cybercrime Convention as producing, offering or making available, distributing or transmitting, procuring, or possessing child pornography [5, p. 281]. While definitions of what constitutes an offense under child pornography laws vary greatly from one country to the next, court cases worldwide are providing clear precedents for unacceptable behaviors. It is quite possible that Mattel did not wish to find itself in the precarious situation of “offering or making available” debatable imagery of young children, or of becoming a potential, albeit accidental, accessory to possession. In essence, this places the manufacturer at the mercy of those who would label it a groomer or even a procurer of child pornography, the engineer of another insidious arm of the child pornographer. Three of the acts that fall under the “making available” category of child pornography laws are publishing, making available, and showing [5, p. 287]. Mattel had evidently not thought through all the pros and cons associated with video sharing by minors. In fact, most social media websites, Facebook and Instagram included, have policies that preclude those under the age of 13 from registration and participation.

Four months after the official launch of Video Girl, the U.S. Federal Bureau of Investigation (FBI) privately issued a warning that the doll could be used to produce child pornography [6]. On 30 November 2010, in a situational information report “cybercrime alert” from its Sacramento field office, the FBI publicly announced that there was “no reported evidence that the doll had been used in any way other than intended” [7], [8]. However, the report also noted an instance in which an individual convicted of distributing child pornography had given the Barbie doll to a six-year-old girl, as well as numerous instances in which concealed video cameras had been used to record child pornography. None of this is surprising [9]. The most obvious form of possession, with respect to the Barbie, would be if the accused had the item in his or her “present manual custody.” For example, if the defendant was found holding a Video Girl Barbie doll containing child pornography images or video, then, subject to the requirement of knowledge, he or she would be in possession of those images or video. Likewise, if the doll was found in the defendant's physical control (e.g., in his or her house), even that would constitute an offense.

There are professionals who have filmed Video Girl Barbie in a sexualized manner [10], but that in itself is not an offense. Although the YouTube video that compares the camera quality of the Canon 7D to Video Girl is unlisted (only people who know the link can view it, and unlisted videos do not appear in YouTube search results), it sadly shows what distortion is possible through adult eyes, using arguably borderline “adult” humor. In the YouTube comments for the video, Naxell wrote, “[t]hat USB in the back and the leg batteries make this seem like some kind of bizarre multipurpose sex gynoid,” while Marcos Vidal wrote, “Well, think on the Barbie's use; it can spy—with Cannon 7D, it's a lot harder.” While no one is claiming that Vidal was referring to the recording of a child for duplicitous reasons, his comment certainly suggests that Barbie could be used as a covert camera. Essentially, a form of child's play is being turned into an asset of the cloud for future use and possible manipulation. This points to a fundamental issue in the new type of cybercrime—that “the advent of digital technology has transformed the way in which child pornography is produced and distributed” [5, p. 251]. In essence, child pornography can be defined as “the sexual depiction of a child under a certain age” [5, p. 255].

Marketing Mishaps

While we do not need to point to a video someone has made of Barbie and her super-power recording prowess “under the hood,” we can simply look at Mattel's poor taste in advertising strategy for the Video Girl doll as a children's toy. The key question is whether those who engineered the doll at Mattel understand that they are accountable for the purposeful user design and user experience they have created [11]. In a press release, the company stated, “Mattel products are designed with children and their best interests in mind. Many of Mattel's employees are parents themselves, and we understand the importance of child safety—it is our number one priority” [12].

The Barbie Video Girl doll is “doll vision” for ages 6 and above.

At the time of the online media content review in early 2011, one of the authors, Katina Michael, was horrified to find some disturbing ways in which Mattel had softly launched the product. In fact, the doll sold out at Wal-Mart in its first release. The other author, Alexander Hayes, purchased a Barbie Video Girl in 2010 to inform his Ph.D. research on point-of-view technologies, and he told Katina that the doll was “hideous…a manifestation of the most cruel manner in which to permeate a child's play.” Katina agreed and noted that the purchased Barbie would remain forever unopened, because the packaging itself formed part of the bigger picture they would need as a stimulus for discussion with public audiences. Katina used the packaged Barbie during her presentation at the Fourth Regional Conference on Cybercrime and International Criminal Cooperation, which was well attended by law enforcement agencies, legal personnel, and scholars of the social implications of technology [13]. The Video Girl Barbie also made further appearances at the February 2012 SINS Workshop, “Point-of-View Technologies in Law Enforcement” [14], and an invited workshop at which Katina and Alexander spoke, the 2013 INFORMA Policing Technology Conference on the theme “Bring Your Own Body-Worn Device” [15].

In July 2010, Mattel released Barbie Video Girl, a doll with a pinhole video camera in its chest enabling clips up to 30 min to be recorded.

Perhaps the most disturbing and disappointing aspect of Video Girl Barbie was the way in which the doll was marketed. On the packaging was the statement “I am a real working video camera.” This vernacular is akin to that used of adult sex workers and does not fit with the societal moral and ethical frameworks by which we protect innocent children. It is questionable why the word “working” was introduced into the phraseology. In essence, Video Girl Barbie is a photoborg [16]: she is reminiscent of Mattel's Vidster video camera toy for kids [17], cloaked in the form of a Barbie doll. Elsewhere, Mattel mentions: “Necklace is a real camera lens!” But the location of the camera on the chest looks less like a necklace and more like cleavage, accompanied by an additional statement: “This Barbie has a hidden video camera” [18]. There was also a picture of Barbie depicted on her knees, with a visual didactic stating “for easy shooting” and indicating the three steps to making a movie. The storytelling video demo scenario Mattel used involved cats at the vet and was generally in poor taste: the cat was depicted getting her heartbeat monitored in one scene, getting an X-ray in another, and then finding herself in a basket with another cat and finding love, with a heart symbol depicted above the cats' heads.

Comments varied on iJustine's video “OMG Video Girl,” which has more than 1.4 million YouTube views [19]. Here was a female adult commenting on a toy for kids. Taylor Johnson wrote, “My Favorite was the vet Barbie! Haha!” Mssjasmine commented, “That doll is kinda creepy (like a pedophile would buy that to watch little kids…ew).” Sam Speirs similarly wrote, “This ‘toy’ of yours will/could be used as a major predator trap! And I know that the idea was for the girls to have a camera [to] do stuff, but, seriously, it's a concealed camera in a popular little girl's toy…Creepy, if you ask me!” Another product reviewer of children's toys wrote: “Barbie sees everything from a whole different angle” [20]. Several “Boycott Barbie” websites could also be found in 2011, including the “Get Rid of Barbie Video Girl” Facebook page and “Boycott Porno Barbie.”

A child plays with traditional dolls. Today, we are making dolls that are connected to the cloud and use artificial intelligence to listen to questions from children and provide them answers over the Internet without human intervention. Soon, we will be asking the question “what is real?”

Perhaps the worst example of Mattel's approach to this product was its initial press release (sent to TechCrunch by the PR firm responsible), which stated: “Unsuspecting subjects won't know that Barbie is watching their every move…” [21]. The issues for Mattel to consider have much to do with corporate responsibility. Even excluding the potential for pedophiles to use this technology to cause harm, what happens if innocent users produce content that would otherwise attract criminal liability? Could the doll be used to groom and seduce victims of child pornography?

Hello? Barbie is Listening

But Mattel, like most high-tech manufacturers, has not stopped there. Convergence has become an integral part of the development cycle. If the Barbie Video Girl doll seemed amazing as a concept, then the Hello Barbie doll has outdone it. In its own words, Mattel states that Hello Barbie is “a whole new way to play with Barbie!” She differs from Barbie Video Girl in several ways. The doll still comes equipped with an array of electronics, but Hello Barbie uses speech-recognition technology to hold a conversation with a child and allows only still-photo capture. The product information page on Mattel's website reads:

Using Wi-Fi and speech-recognition technology, Hello Barbie doll can interact uniquely with each child by holding conversations, playing games, sharing stories, and even telling jokes! […] Use is simple after set up—push the doll's belt buckle to start a conversation, and release to hear her respond […] To get started, download the Hello Barbie companion app to your own smart device from your device's app store (not included). Parents must also set up a ToyTalk account and connect the doll to use the conversational features. Hello Barbie doll can remember up to three different Wi-Fi locations [22].

Thus, the doll transmits data back to a service called ToyTalk. Forbes reported that ToyTalk has terms of service and a privacy policy that allow it to “share audio recordings with third-party vendors who assist [Mattel] with speech recognition.” Customer “recordings and photos may also be used for research and development purposes, such as to improve speech recognition technology and artificial intelligence algorithms and create better entertainment experiences” [23]. There is, however, a “SafePlay” option, where parents and guardians are still “in control of their child's data and can manage this data through the ToyTalk account at any time” [22].

To manage SafePlay, parents must visit www.mattel.com/hellobarbiefaq to get more information, or call +1 888 256 0224—and every parent will certainly have time to do this [24]. “Parents must also set up a ToyTalk account and connect to use the conversational features…Use of Hello Barbie involves recording of voice data; see ToyTalk's privacy policy at http://www.toytalk.” Of course, it is not the parents who will end up downloading these apps but the children.

Continued Infiltration

This raises many questions about the trajectory of toys and everyday products that increasingly contain networked features that introduce new parameters to what was once innocent child's play, unseen and carefree. First, Samsung launched a television set that can hear household conversations [25], and now we are to believe that it is the real Barbie who is “chatting” with our children. Are we too blind to see what is occurring? Is this really play? Or is it the best way of gathering marketing data and instituting further manipulation into those too young to know that the Barbie talking to them is not real and actually a robot of sorts? Just like we were once oblivious to the fact that our typed entries in search boxes were being collated to study our habits, likes, and dislikes, we are presently oblivious to the onslaught of products that are trying to infiltrate our homes and even our minds.

A spate of products has entered the market doing exactly the same thing as Hello Barbie but targeting a variety of vertical segments—from Amazon Echo for families who allegedly need a cloud connector because they cannot spell words like cantaloupe [26], [27] to NEST's thermostat and smoke-detection capability that doubles as human activity monitoring and tracking (NEST says so openly in its promotional commercials) [28], to DropCam's reconnaissance video recordings of what happens in your household 24/7, just in case there is a perpetrator who dares to enter [29].

Cayla is Talking—And It's Not Always Pretty

Perhaps our “favorite” is the My Friend Cayla doll [30], which, like Hello Barbie, connects to the cloud. She is seemingly innocent but has shown herself to be the stuff of nightmares, akin to the horror movie Child's Play featuring the character Chucky [31]. On the Australian Cayla page, potential buyers are again greeted by a splash page with a cat on it: “I love my cat Lily. I will tell you her story.” Cayla is depicted talking to two little girls. The British Christmas best seller is effectively a Bluetooth headset dressed as a doll. With the help of a Wi-Fi connection (like Hello Barbie), she can answer a whole lot of tough questions, Amazon Echo style, and you would be surprised at her capacity [32]. But security researcher Ken Munro of Pen Test Partners put Cayla to the test and identified some major security flaws that could give perpetrators a way in. In essence, Cayla was hacked: she was made to speak from a 1,500-strong list of swear words and expletives, and her responses to questions were modified [33].

This reminds us of the 2015 article in IEEE Technology and Society Magazine by K. Albrecht and L. McIntyre on IP cameras that double as baby monitors [34]. The moral of the story is the same whether the cloud-connected device is a children's monitor, a children's toy, a desktop game for kids, a television console, a Q&A tool for households, or a plain old Wi-Fi-enabled smoke detector or thermostat: if it's connected, then it's vulnerable to security hacks and breaches of privacy [35]. Worse still, if it can talk back to you in the spoken word, then you need to think about the logic behind the process and what we are teaching our children about what is human and what is not. If these electronic products are querying the Internet for results, then don't be surprised if nonphysical autonomous software robots one day begin to spit out bizarre answers and manipulative responses based on what is out there on the Internet.

As Kate Darling said in a Berkman talk at Harvard University in 2013, “[s]o not to undermine everything that I've just said here, but I do wonder…Say McDonald's gets its hands on a whole bunch of children's toys that are social robots and interacts with the kids socially, and the toys are telling the kids…to eat more McDonald's, and the kids are responding to that. That is something that we also need to think about and talk about, when these things start to happen. They could be used for good and for evil” [36]. If only that were all they would be saying to the next generation!

Katina visited the My Friend Cayla website recently and found this message: “Due to changes in the external website which Cayla gets some information from, she is temporarily unable to answer some types of questions. Cayla can still talk about herself, do maths and spelling, and all other functions are unaffected. A free app update will be issued (for both iOS and Android users) within the next two weeks with a fix. Thank you for your understanding” [37]. Keeping our children safe and aware of the difference between virtual and real is one thing, but, if we aren't careful, we will soon welcome a future where My Friend Cayla might well be facing off against Hello Barbie in another Child's Play blockbuster.

References

1. K. Albrecht, K. Michael, "Connected to everyone and everything", IEEE Technol. Soc. Mag., vol. 32, no. 4, pp. 31-34, 2014.

2. M. G. Michael, K. Michael, C. Perakslis, "Uberveillance the Web of Things and People: What is the culmination of all this surveillance?", IEEE Consumer Electron. Mag., vol. 4, no. 2, pp. 107-113, 2015.

3. K. Michael, "Wearable computers challenge human rights", ABC Science, July 2013, [online] Available: http://www.abc.net.au/science/articles/2013/07/24/3809675.htm.

4. Barbie's video girl, Sept. 2015, [online] Available: http://service.mattel.com/us/TechnicalProductDetail.aspx?prodno=R4807&siteid=27&catid1=508.

5. J. Clough, Principles of Cybercrime, Cambridge Univ. Press, 2010.

6. A. Toor, FBI says video Barbie girl could be used for ‘child pornography production’, Dec. 2010, [online] Available: http://www.switched.com/2010/12/03/fbi-video-barbie-girl-could-be-used-for-child-pornography/.

7. FBI memo raises Barbie child pornography fears, BBC News, Dec. 2010.

8. M. Martinez, FBI: New Barbie ‘Video Girl’ doll could be used for child porn, CNN, Dec. 2010.

9. D. M. Hughes, "The use of new communications and information technologies for sexual exploitation of women and children", Hastings Women's Law J., vol. 13, pp. 127, 2002.

10. Canon 7D vs. Barbie Video Girl, Dec. 2010, [online] Available: http://www.youtube.com/watch?v=uLmgXk4RlOc.

11. A. Hayes, FBI pornography Barbie, Dec. 2010, [online] Available: http://uberveillance.com/blog/2010/12/30/fbi-pornography-barbie.html?rq=barbie.

12. S. Fox, "FBI target new Barbie as child pornography threat", LiveScience, Sept. 2015, [online] Available: http://www.livescience.com/10319-fbi-targets-barbie-child-pornography-threat.html.

13. K. Michael, "The FBI's cybercrime alert on Mattel's Barbie video girl: A possible method for the production of child pornography or just another point of view", Conf. Cybercrime and Int. Criminal Cooperation, 2011-May-19–20.

14. K. Michael, M. G. Michael, "Point of view technologies in law enforcement" in The Social Implications of National Security, Sydney Univ., 2012. 

15. K. Michael, A. Hayes, "WORKSHOP | Body worn video recorders: The socio-technical implications of gathering direct evidence", INFORMA Police Technology Forum 2013, 2013-Mar.

16. K. Michael, "Wearables and lifeblogging: The socioethical implications", IEEE Consumer Electron. Mag., vol. 4, no. 2, pp. 80, 2015.

17. Mattel's Vidster is for kids, Sept. 2015, [online] Available: http://gizmodo.com/124713/mattels-vidster-is-for-kids.

18. VideoGirl, May 2011, [online] Available: http://www.barbie.com/videogirl/.

19. OMG Video Girl!, May 2011, [online] Available: http://www.youtube.com/watch?v=kSCfbSKSxMc.

20. "TimeToPlayMag", Barbie video girl doll from Mattel, [online] Available: http://www.youtube.com/watch?v=YKqrTycSHIQ&NR=1&feature=fvwp.

21. P. Carr, Feds finally closing the net on America's most wanted Barbie (since Klaus), May 2013, [online] Available: http://techcrunch.com/2010/12/03/you-can-brush-my-hair-arrest-me-anywhere/.

22. "Hello Barbie™ Doll—Light brown hair", Mattel Shop, Sept. 2015, [online] Available: http://shop.mattel.com/product/index.jsp?productId=71355596.

23. J. Steinberg, This new toy records your children's private moments—Buyer beware, Forbes, Mar. 2015.

24. High-tech Barbie sparks privacy concerns parental backlash, ABC News, Sept. 2015.

25. N. Grimm, Samsung warns customers new Smart TVs “listen in” on users' personal conversations, ABC News, Mar. 2015.

26. Introducing Amazon Echo, Dec. 2015, [online] Available: https://www.youtube.com/watch?v=KkOCeAtKHIc.

27. Amazon Echo, Sept. 2015, [online] Available: http://www.amazon.com/gp/product/B00X4WHP5E?*Version*=1&*entries*=0.

28. L. Whitney, Google closes $3.2 billion purchase of Nest, C|NET, Feb. 2014.

29. G. Kumpara, Google and NEST acquire Dropcam for $555 million, TechCrunch, June 2014.

30. My Friend Cayla, Sept. 2015, [online] Available: http://www.myfriendcayla.com/.

31. "MovieClips Extras", Child's play behind the scenes—Making a nightmare (1988)—HD, Sept. 2015, [online] Available: https://www.youtube.com/watch?v=2EUwq9acGB8.

32. D. Moye, Talking Doll Cayla hacked to spew filthy things, Huffington Post, Sept. 2015.

33. N. Oakley, My Friend Cayla doll can be HACKED warns expert—Watch kids' toy quote 50 Shades and Hannibal, Sept. 2015, [online] Available: http://www.mirror.co.uk/news/technology-science/technology/friend-cayla-doll-can-hacked-5110112.

34. K. Albrecht, L. McIntyre, "Privacy nightmare: When baby monitors go bad", IEEE Technol. Soc. Mag., vol. 34, no. 3, pp. 14-19, 2015.

35. K. Goldberg, "Cloud Robotics Intro", Talks at Google, Sept. 2015, [online] Available: https://www.youtube.com/watch?v=IzUXT3_7tWc.

36. K. Darling, Kate Darling on near-term ethical legal and societal issues in robotics, Berkman Centre, Sept. 2015.

37. Meet Cayla, My Friend Cayla, Sept. 2015, [online] Available: http://myfriendcayla.co.uk/cayla.

Keywords: Cameras, Sensors, Consumer electronics, Motion pictures, Computer crime, YouTube, Context, social aspects of automation, cloud computing, Internet of Things, children's toys, high-tech child's play, embedded sensors, civil infrastructure, computing devices

Citation: Katina Michael, Alexander Hayes, "High-Tech Child's Play in the Cloud: Be safe and aware of the difference between virtual and real", IEEE Consumer Electronics Magazine, vol. 5, no. 1, pp. 123-128, Jan. 2016. Date of Publication: 11 December 2015. DOI: 10.1109/MCE.2015.2484878.

Digital Wearability Scenarios: Trialability on the Run

Introduction

What happens when experimental technologies are deployed into society by market leaders without much forethought about the consequences for everyday life? When state-based regulations are deliberately ignored by rapid innovation design practices, giving birth to unconventional and radical products, a whole series of impacts plays out in real life. One such example is Google's Glass product: an optical head-mounted display unit that is effectively a wearable computer. In early 2013, Google reached out to U.S. citizens, asking potential Glass users to send a Twitter message with the #IfIHadGlass hashtag to qualify for consideration; those deemed eligible could then pay US$1,500 for early adoption of the product. About 8,000 consumers in the United States were allegedly invited to purchase the Explorer edition of Glass. By April 2013, Google had opened up Glass to its “Innovation in the Open” (I/O) developer community, and by May 2014, it allowed purchases of the product from anywhere in the world.

The early adopters of the open beta product quickly became tech evangelists for the Google brand. As expected, the benefits of Glass touted by the self-professed “Glassholes” were projected as mainstream benefits to society via YouTube and Hangouts. Tech-savvy value-added service providers who stood to gain from its adoption, and citizens who wished to be seen as forward-thinking, entrepreneurial, and cool, came to almost instantaneous fame. Only a few dissenting voices were audible during the trialability phase of diffusion, with most people in society either not paying much attention to “yet another device launch” by Google or dismissing the wearers as geeks working on hip stuff. About the biggest thought people had when confronted by one of these “glasses” in reality was “What's that?” followed by “Are you recording me?” The media played an interesting role in at least highlighting some of the potential risks of the technology, but, for the most part, Glass was depicted as a next-generation technology that was here now and that even Australia's own then-Prime Minister Julia Gillard had to try out. Yep, another whiz-bang product that most of us would not dare to live without.

With apparently no limits set, users of Glass have applied the device in diverse contexts, from the operating theater in hospitals, to preschools in education, to evidence gathering in policing. Yes, it is here, right now. Google claims no responsibility for how its product is applied by individual consumers, and why should it? It's a tech company, right? Caveat emptor! But from the global to the local, Glass has received some very mixed reactions from society at large.

Scenario-Planning Approach

This article focuses on the socioethical implications of Glass-style devices in a campus setting. It uses secondary sources of evidence to inspire nine short scenarios that depict a plausible “day in the life” of a person possessing a body-worn video camera. A scenario is “an internally consistent view of what the future might turn out to be” [1]: one gleans the current state of technology to map its future trajectory [2, p. 402]. Scenarios afford us, as researchers, two distinct opportunities: 1) to anticipate possible and desirable changes to society brought about by the introduction of a new technology, known as proactivity, and 2) to prepare for action before a technology is introduced into the mainstream, known as preactivity [3, p. 8]. While change is inevitable as technology develops and diffuses into society, we should be able to assess possible strategic directions to better prepare for expected changes and, to an extent, unexpected ones. This article aims to raise awareness of the possible social, cultural, and ethical implications of body-worn video recorders. It purposefully focuses on signs of the threats and opportunities that body-worn recording devices presently raise in a campus setting such as a university [1, p. 59]. A similar approach was used successfully in [4] with respect to location-based services in 2007.

In February 2013, Katina and M.G. Michael were invited to write an opinion piece about the ethics of wearable cameras for Communications of the ACM (CACM) [5]. Upon the article's acceptance in September of the same year, the CACM editor offered the option of submitting a short video to accompany the article online as a summary of the issues addressed. After some initial correspondence on prospective scenarios with the University of Wollongong's videographer, Adam Preston from Learning, Teaching and Curriculum, it was jointly decided to simulate the Glass experience with a head-mounted GoPro camera [6] and to discuss on camera some of the themes presented in the article within a university campus setting (Figure 1). A few months prior, in June, Katina had hosted the International Symposium on Technology and Society (ISTAS13) with wearable pioneer Prof. Steve Mann [7]; ethics approval for filming the three-day international symposium with a variety of wearable recorders had been gained from the University of Wollongong's Human Research Ethics Committee (HREC) for the University of Toronto-based event. Importantly, it must be emphasized that the scenarios themselves are fictitious in terms of characters and continuity. They did not happen in the manner stated but, like a tapestry, have been woven together to tell a larger story. That story is titled “Recording on the Run.” Each scenario can be read in isolation but, when placed side by side with the other scenarios, becomes a telling narrative of what might be, with respect to societal implications, if such recording devices proliferate.

Figure 1. A GoPro device clipped to an elastane headband, ready to mount on a user. Photo courtesy of Katina Michael.

Having hired the videographer for 2 h to do the filming for CACM, we preplanned a walkthrough of the University of Wollongong's campus (Figure 2). Deniz Gokyer (Figures 3 and 4) was approached to play the protagonist GoPro wearer in the video, as he was engaged in a master's major project on wearables in the School of Information Systems and Technology. Lifelogging websites such as Glogger.mobi, which publish point-of-view (POV) video content direct from a mobile device, were also used to support claims made in the scenarios. The key question pondered at the conclusion of the scenarios is: how do we deal with the ever-increasing complexity of the global innovation environment that continues to emerge around us with seemingly no boundaries whatsoever? The scenarios are deliberately left uninterpreted by the authors to allow for debate and discussion. The primary purpose of the article is to demonstrate that body-worn recording products can have some very significant expected and unexpected side effects, in addition to conflicting with state laws and regulations and campus-based policies and guidelines.

Figure 2. (a) The making of a short video to discuss the ethical implications of wearable devices for CACM. (b) The simultaneous GoPro view emanating from the user's head-mounted device. Screenshots courtesy of Adam Preston.


Figure 3. Deniz Gokyer simulating an ATM withdrawal while wearing a GoPro. Photo courtesy of Adam Preston.


Figure 4. The aftereffect of wearing a GoPro mounted on an elastic band for 2 h. Photo courtesy of Katina Michael.


Recording on the Run

Scenario 1: The Lecture

Anthony rushed into his morning lecture on structures some 10 min late. Everyone had their heads down, taking copious notes and listening to their dedicated professor as he provided some guidance on how to prepare for the final examination, which was worth 50% of their total mark. Anthony was mad at himself for being late, but the bus driver had refused his AUD$20 note now that the Opal card was available. Prof. Markson turned to the board and began wildly writing out practice equations, knowing that he had so much to get through. Anthony made sure to keep his hands free of anything that would sidetrack him. Instead, he recorded the lecture with a GoPro on his head. Some of the girls in the back row giggled, as he probably looked rather stupid, but the laughter soon subsided and everyone got back to work, copying down Markson's examples. At one stage, Markson turned to look at what the giggles were about, made startled eye contact with Anthony, and probably thought to himself: “What's that? Whatever it is, it's not going to help him pass—nothing but calculators are allowed in exam situations.”

Anthony caught sight of Sophie, who motioned for him to come to the back row, but by then he thought it would probably be better to record from the very front, and that he would cause less disruption by just sitting where he was. Markson was a little behind the times when it came to innovation in teaching, but he was a brilliant lecturer and tutor. Anthony thought to himself that, if anyone asked for the recording, he would make sure it was available to them. The other students took note of the device firmly strapped to his head with a band but were somewhat unfazed. Anthony had always argued that recording with a GoPro was nothing more than recording with a mobile phone. He surfed a lot at Austinmer Beach, and he thought the video he took of himself on the board was just awesome, even though his girlfriend thought it was vain. It was like a motion selfie.

Scenario 2: The Restroom

It had been one long day, practically like any other, save for the fact that today Anthony had chosen to wear the GoPro on a head-mounted bandana to record his lectures. They were in the serious part of the session, and he wanted to make sure that he had every opportunity to pass. Anthony was so tired from pulling an all-nighter on assessment tasks that he didn't even realize he had walked into the restroom toward the end of his morning lecture with the device switched on and recording everything in full view. Lucky for him, no one had been accidentally caught on film while in midstream. Instead, as he walked in, he was greeted by someone who was walking out and a second guy who avoided eye contact but, while washing his hands, likely noticed the camera on Anthony's head in the mirror's reflection. The third one didn't even care but just kept on doing what he was doing, and the fourth locked eyes with the camera in rage for a while. They didn't speak, but Anthony could sense what he thought—“what the heck?” Anthony was an attractive young man who sported tattoos and always tried to look different in some way. He hated conformity. Having now watched the video back to extract the lecture material, he wondered why no one in the restroom had stopped him to punch the living daylights out of him. Anthony had thought people were getting used to the pervasiveness of cameras everywhere—not just in the street and in lecture theaters but also in restrooms and probably soon in their homes as well.

Scenario 3: The Corridor

By this time, Anthony was feeling rather hungry. In fact, he was so hungry that he was beginning to feel very weak. All of those late nights were beginning to catch up now. Sophie demanded that they go eat before the afternoon lecture. As they walked out of the main tower building, they bumped into an acquaintance from the previous session. Oxford, as he was known by his Aussie name, was always polite. The conversation went something like this. “Hello Oxford! How are you?” said Sophie. Oxford replied, “I'm fine, thank you. Good to see you guys!” Sophie quickly pointed to Anthony's head-mounted camera and said, “Oxford, can you believe how desperate Anthony has become? He's even recording his lectures with this thing now!” Oxford, who was surprised, remarked, “Oh yeah. I've never seen one of these before. Are you recording right now, Anthony?” “Yes, I am,” Anthony affirmed, “but to be honest, I completely forgot about it—I'm dreaming about food right now.” Anthony patted his tummy, which was by now making grumbling noises. “Want to come with us to the café near the gymnasium?” Anthony asked.

“He just filmed most of the structures lecture—I'm thinking, like, this might be the coolest thing that might stick,” Sophie reflected, ignoring Anthony. “No kidding,” Oxford said. “You're recording me right now? I'm not exactly thrilled about this, but ‘hi,’ for what it's worth.” Oxford waved to the camera and smiled. Sophie interjected, “Oxford, it is not like he's making a movie of you, haha!” Sophie grabbed Oxford's arm and pulled it toward her—the jab was meant to make it clear she was joking. But instead of lightening, the mood suddenly turned serious. Oxford continued, “No, I'm not very good in front of the camera…like, I don't like pictures being taken of me or even recordings of my voice. It's probably the way I was raised back home.”

Anthony told Oxford not to worry because he was not looking at him, and so, therefore, nothing but his voice was really being recorded. Little did he realize that he was breaking New South Wales law, or at least that was what he would find out later in the day when someone from security spotted him on campus. Sophie asked with curiosity, “Do you think someone should ask you if they want to record you on campus?” Oxford thought that was a no-brainer—“Of course they should ask. You're wearing this thing on your head, and there's nothing telling people passing by whether you are watching them and recording them. C'mon Anthony, you're a smart guy, you should know this stuff; you're studying engineering, aren't you? We're supposed to be the ones who think of everything before it actually happens. You might as well be a walking CCTV camera.” There was dead silence among the friends. Then Anthony blurted out, “But I'm not watching you; you just happen to be in my field of view.”

Sophie began to consider the deeper implications while Anthony was getting flustered. He wanted to eat, and they were just beginning a philosophical conversation. “C'mon Oxford, come with us, we're starving…and we can talk more at lunch, even though we should be studying.” As they walked, Sophie continued: “It's not like this is the worst form of camera that could be watching. I saw this thing on the news a couple of weeks ago. The cameras are getting tinier; you cannot even see them. The company was called OzSpy, I think, and they're importing cheap stuff from Asia, but I don't think it's legal in every state. The cameras are now embedded in USBs, wristbands, pens, keyfobs, bags, and t-shirts. How do you know you're being recorded with that kind of stuff?” Oxford was beginning to feel uneasy. Anthony felt like taking off the contraption but left it on because he was just too lazy to put the thing back in its box and then back on again in less than 2 h. Oxford confessed again: “I feel uncomfortable around cameras, and it's not because I'm doing anything wrong.” They walked quietly for a few minutes and then got to the café. Sophie pointed to the wall as they queued. “Look up there. It's not like we're not always under surveillance. What's the difference if it is on a building wall versus on someone's head?”

Anthony wished they'd change the subject because it was starting to become a little boring to him. Oxford thoughtfully replied to Sophie, “Maybe it's your culture or something, but I even wave to CCTV cameras because it's only for security to see on campus. But if someone else is recording me, I don't know how he or she will use the footage against me. I don't like that at all. I think if you're recording me to show other people, then I don't think it's okay at all.” Sophie chuckled, “Hey, Oxford, this way Anthony will never forget you even when you have finished your degree and return to Thailand in ten years; when he is rich and famous, he'll remember the good old days.” The truth was that Oxford never wanted to return to Thailand; he liked the opportunities in Australia but added, “Okay, so you will remember me and my voice forever.”

By this time, Anthony was at the front of the queue. “Guys, can we forget about this now? I need to order. Okay, Oxford, I promise to delete it if that makes you feel better.” Oxford said, “No, Anthony, you don't understand me. I don't mind if you keep this for old times' sake, but just don't put it on the Internet. I mean, don't make it public, that's all. Guys, I just remembered I have to go and return some library books so I don't get a fine. It's been nice chatting. Sorry I cannot stay for lunch. Good luck in your finals—let's catch up and do something after exams.” “Sure thing,” Sophie said. “See ya.” As Oxford left and Anthony ordered food, she exclaimed, “Your hair is going to be great on the video!” Oxford replied, “I know my hair is always great, but this jacket I am wearing is pretty old.” Oxford continued from afar, “Anthony, remind me to wear something nicer next time. Bye now.” Sophie waved as Oxford ran into the distance.

Scenario 4: Ordering at the Cafe

Anthony ordered a cappuccino and his favorite chicken and avocado toastie. The manager, who was in his 50s, asked for Anthony's name to write on the cup. “That will be $10.” Anthony handed over a note and waited for change. “And how are you today?” asked the manager. “I'm fine, thanks.” “Yeah, good,” replied the manager. “Okay, see you later, and have a good one.” Anthony muttered, “I'll try.” Next it was Sophie's turn to order. “What's up with him?” asked the café manager. “What's that thing on his head? He looks like a goose.” Sophie cracked up laughing and struck up a conversation with the manager. She was known to be friendly to everyone.

Anthony went to the service area waiting for his cappuccino and toastie. For once, the line was not a mile long. The male attendant asked Anthony, “What's with the camera?” By then, Anthony had decided that he'd play along—sick of feeling like he had to defend himself, yet again. He wasn't holding a gun after all. What was the big deal? He replied, “What's with the camera, mate? Well, I'm recording you right now.” “Oh, okay, awesome,” said the male attendant. Anthony probed, “How do you feel about that?” The male attendant answered, “Well, I don't really like it man.” “Yeah, why not?” asked Anthony, trying to figure out what all the hoo-ha was about. There were CCTV cameras crawling all over campus, and many of them were now even embedded in light fixtures.

“Hey, Josie, Josie—how do you feel about being filmed?” exclaimed the male attendant to the female barista cheekily. “I don't really mind. I always wanted to be an actress when I was little, here's my chance!” “Yeah?!” asked Anthony, in a surprised tone. “Are you filming me right now? Are you going to make me look real good?” laughed the barista in a frisky voice. Anthony smiled and, by then, Sophie had joined him at the service area, a little jealous. “What's this for?” asked Josie. She had worked on campus for a long time and was used to serving all sorts of weirdos. “No reason. I just filmed my structures class. And now, well now, I've just decided to keep the camera rolling.” Josie asked again, “Are you really filming me right now?” Anthony reaffirmed, “Yes.”

Sophie looked on in disbelief. The camera had just become the focal point for flirtation. She wasn't liking it one bit. Josie asked Anthony again, “Why are you filming?” Anthony didn't know why he blurted out what he did but he said, “Umm…to sort of get the reactions of people. Like how they act when they see someone actually recording them.” The male attendant interrupted, “You know what you should do? You should go up to him,” pointing to the manager, “and just stare at him, like just stare him in the face.” “I will, I will,” said Anthony. Egging Anthony on, the male attendant smiled, “Stand in front of the queue there, and just stare at him. He'll love it, he'll love it, trust me. You'd make his day man.” “Hey, where's my cappuccino and toastie?” demanded Anthony. The male attendant handed the food over and got Sophie's food ready too. “And this must be yours.” “Yes,” Sophie replied. The male attendant insisted: “Focus on him now, don't focus on me, all right?” “Yup, ok, see you later. Cheers.” Anthony felt a little diminished; although he was surprised that the barista talked to him for as long as she did, he wasn't about to pick a fight with an old bloke. What he was doing was harmless, he thought; he left the counter to take a seat, but considered switching off the device.

Scenario 5: Finding a Table at the Cafe

Sophie found a table with two seats left in a sunny spot and put her things down. Lack of sleep during exam time meant that everyone generally felt cold. Anthony sat down as well. At the large oblong table was a small group of three—two girls and a guy. Sophie went looking for serviettes, as they had forgotten them at the counter. As soon as Anthony pulled up a chair to sit down, one of the girls got up and said, “And you have a lovely afternoon.” Anthony replied, “Thank you, and you too.” Speechless, the other two students at the table picked up whatever was left of their drinks and left not long after. As Sophie returned, she saw the small group leaving and whispered, “Anthony, maybe you should take that thing off. You're getting quite a bit of attention. It's not good. A joke's a joke. Alright, I could cope with the classroom situation, but coming to the café and telling people you're recording? Surely, you are not, right? You're just kidding, right?” “Listen, Sophie, I'm recording you now. The battery pack lasts a while, about an hour, before it needs replacing. I'm going to have to charge the backup during the next lecture.” “Anthony,” Sophie whined, “c'mon, just turn it off.” Anthony made a show of reluctantly turning it off, although he had not. “Now put it away,” Sophie insisted. “No, I'm going to leave it on my head,” Anthony said. “I couldn't be bothered, to tell you honestly. Just don't forget to remind me to turn it back on when we are in class.” “Good,” said Sophie.

By then, two girls had asked if they could sit down at the table. “Sure,” said Sophie. The girls were known to Sophie from the Residence, but they had only ever exchanged niceties. “My name is Klara,” said one of the girls. “And my name is Cygneta,” said the other. “I'm Sophie, and this is my boyfriend Anthony. Nice to finally get to talk to you. That'd be right. Just when we should all be studying, we're procrastinating and socializing.” Anthony was happy for the change of conversation, or so he thought.

“I know what that is, Anthony! It's a GoPro,” Cygneta exclaimed. “Sophie, Sophie, I wouldn't let my man carry that thing around on campus filming all those pretty ladies.” Cygneta giggled childishly, and Klara joined her in harmony, though she knew nothing about the contraption on Anthony's head. Sophie was reminded why she had never bothered approaching Cygneta at the Residence. Those two were inseparable and always too cute—the typical creative arts and marketing students. Sophie retorted, “Well, he's not filming right now. He just filmed the lecture we were in.” Anthony made Sophie think twice. “How do you know I'm not filming right now?” Sophie said, “Because the counter on the LCD is not ticking.” Cygneta, who had used a GoPro to film her major project, knew that the LCD could be toggled not to show a counter, and shared this with the group. Sophie didn't like it one bit. It made her doubt Anthony.

Anthony proceeded to ask Klara, “How do you feel when you see someone recording you?” “Yeah, not great. I feel, like, really awkward,” confessed Klara. Then Anthony asked the million-dollar question: “What if most people wore a Google Glass on campus and freed themselves of having to carry an iPhone?” Klara at this point was really confused. “Google what?” Sophie repeated, “Google Glass” in unison with Anthony. Shaking her head from side to side, Klara said, “Nah, I'm not into that kind of marketing at all.” “But it's the perfect marketing tool to gather information,” countered Anthony. “Maybe you're going to start using it one day as well? Don't you think?” Klara looked at Sophie and Anthony and replied, “What do you mean? Sorry?” Anthony repeated, “Do you reckon you're gonna be using Google Glass in a couple of years?” Klara turned to Cygneta for advice. “What in the world is Google Glass? It sounds dangerous?” Anthony explained, “It's a pair of glasses that you can wear, but it's a computer at the same time.” Klara let out a sigh. “I had no idea that even existed, and I think I'm a good marketing student and on top of things.”

By this stage, Sophie was feeling slighted and decided to finish her food, which was now cold. Anthony, caught off guard by Klara's lack of awareness, pressed on: “So you don't reckon you'd be wearing glasses that can record and work as a phone, or a headband capable of reading brain waves?” Cygneta said, “Probably not,” and Klara agreed, “No. I like my phone just fine. At least I can choose when I want to switch it off. Who knows what could happen with these glasses? It's a bit too out there for me. That stuff's for geeks, I think. And anyway, there's nothing interesting in my life to capture—just one big boring stream of uni, work, and home.”

Sophie pointed out an interesting fact: “Hey girls, did you know that there's no law in Australia that forbids people from video recording others in public? If it's happening out on the street, then it ain't private.” Cygneta replied, “Yeah, I heard this on the news the other day; one of the ministers was caught on video heavily cursing at another minister while listening to his speech. He was waiting for his turn to give a speech of his own, apparently, and he didn't even notice someone was recording him. What an idiot!”

Sophie asked Anthony to accompany her to the bank. Lunch was almost over, and the lecture was now less than an hour away. The pair had not studied, although at the very next table was a group of six buried in books from the structures class. Klara and Cygneta went to order a meal at the café and said goodbye. Anthony reluctantly got up from the table and followed Sophie to the study group. Sophie bravely asked, “Anyone got any solutions yet to the latest practice questions?” People looked up, and the “little master,” nicknamed for his genius, said, “Not yet.” The other engineering students, mostly of Asian background, could not have cared less about the camera mounted on Anthony's head. Sophie found this disturbing and startling. She immediately thought about those little drones being developed and how men seemed to purchase such toys far more than any woman she knew. Who knows what the future would hold for humankind, she thought. Maybe the guys would end up loving their machines so much that they'd forget to spend time with real people! Sophie liked the challenge of engineering, but it was at times strange to be in a room full of guys.

The power to exclude, delete, or misrepresent an event is with the wearer and not the passive passerby.

Scenario 6: A Visit to the C.A.B. Bank

Sophie was beginning to really tire of the GoPro shenanigans. She asked Anthony to wait outside the bank since he would not take off the contraption. Sophie was being pushed to the limit. Stressed out with exams coming up and a boyfriend who seemed preoccupied with proving a point, whatever that point was, she just needed things to go smoothly at the bank. Luckily, this was the less popular bank on campus, and there was hardly anyone in it. Sophie went right up to the attendant but called out for Anthony to help her with her bag while she rummaged in her handbag for her driver's license. Anthony sat down on one of the sitting cubes and, looking up, realized that he was now in the “being recorded” position in the bank himself. One attendant left the bank smiling directly into the camera and at Anthony. He thought, “How's that for security?” Another teller leaned over the screen and asked Anthony, “Is there anything we can help you with?” Anthony said, “I'm waiting for my girlfriend,” which seemed to satisfy the teller all too easily.

It was now time for Sophie to withdraw money at the teller. Anthony really didn't mind because Sophie was always there to support him, no matter how long it took. They realized they had no more than 30 min left for a couple more errands, including a visit to the ATM and the library. There were four people in the queue at the ATM. Anthony grabbed Sophie's hand and whispered in her ear, “Sophie, do you realize something? If I were recording right now, I'd be able to see the PINs of all the people in front of us.” Sophie shushed Anthony. “You're going to get us in trouble today. Enough's enough.” “No, really, Sophie, we've got to tell security. They're worried about tiny cameras looking down and skimming devices, but what about the cameras people are wearing now?” Sophie squeezed Anthony's hand—“Anthony, you are going to get us in serious trouble. And this is not the time to be saving the world from cybercriminals.” Anthony moved away from the queue, realizing that his face was probably being recorded on CCTV. The last thing he ever wanted was to be in trouble. He instantly moved to pull the GoPro off his head; it was becoming rather hot even though it had been a cool day, and it was beginning to feel uncomfortable and heavy on his neck and back muscles. By the time he got his act together, Sophie had made her transaction and they hurried off to the library just before class.

Scenario 7: In the Library

As they rushed into the library to get some last-minute resources, Anthony and Sophie decided to split up. Sophie was going to the reserved collection to ask for access to notes that the special topics lecturer had put on closed reserve, and Anthony was going to do some last-minute bibliographic searches for the group assignment that was due in a few days. Why was it that things were always crammed into the last two weeks of the session? How on earth was any human being able to survive those kinds of demands? Anthony grabbed Sophie's bag and proceeded to the front computers. It was packed in the library because everyone was trying to do their final assignments. As Anthony hovered behind the other students, he remembered the shoulder-surfing phenomenon he had considered at the ATM. It was exactly the same. Anthony made sure not to look forward. As soon as there was an empty computer, he'd be next. He conducted some library searches standing up and then spotted two guys moving away from a sit-down desk area. Given all the stuff he was carrying, he thought he'd ask the guys nearby if they had finished. They said yes and tried to vacate the space as fast as they could, being courteous to Anthony's needs. By this time, Anthony was also sweating profusely and had begun to look stressed out.

The cameras are now embedded in USBs, wristbands, pens, keyfobs, bags, and t-shirts.

Anthony dumped his stuff on the ground, and the shorter of the two men said, “Are you wearing a camera on your head?” Anthony muttered to himself, “Oh no, not again.” Had he been able to take the device off his head effortlessly, he would have. After wearing it for over 2 h straight, it had developed an octopus-like suction to his forehead. “Yeah, yeah, it's a camera.” This camera had brought him nothing but bad luck all day. Okay, so he had taped most of the first lecture in the morning, but it had not been any good since. Sophie was angry with him over the café discussions, Oxford was not interested in being filmed without his knowledge, and Anthony's shoulders were really starting to ache and he was developing a splitting headache. “You guys would not happen to be from civil engineering?” Anthony asked in the hope that he and Sophie might get some hints for the forthcoming group assignment. “Nah, we're from commerce.” Both men walked away after saying goodbye, and Anthony was left to ponder. Time was running out quickly, so he left his things where they were and decided to go to the desk and ask for help directly.

“Hello, I am wondering if you would be willing to help me. My name is Anthony, and I am doing research on…” The librarian studied Anthony's head closely. “Umm…can I just ask what's happening here? Please tell me you are not recording this conversation,” asked the librarian politely. “What?” said Anthony, completely oblivious to the camera mounted on his head. He then came to his senses. “Oh that? That's just a GoPro. I've not got it on. See?” He brought his head nearer to the librarian, who put on her glasses. “Now, I'm looking for…” “I'm sorry, young man, I'm going to have to call down the manager on duty. You just cannot come into the library looking like that. In fact, even onto campus.”

Anthony felt like all of his worst nightmares were coming true. He felt like running, but his and Sophie's belongings were at the cubicle and, besides, the library security CCTV had been recording for the last few minutes. His parents would never forgive him if anything jeopardized his studies. Sophie was still likely photocopying in closed reserve. What would she think if she came out to be greeted by all this commotion? The manager of “The Library”—he felt a sinking feeling in the pit of his stomach. Anthony knew he had done nothing wrong, but that was not the point at this time. The librarian seemed even less informed about citizens' rights than he was, and while she was on the phone, hurriedly trying to get through to the manager, Sophie returned with her materials.

“Where are our bags? My laptop is in there, Anthony.” Anthony signaled over to the cubicle, didn't go into details, and asked Sophie to return to the desk to do some more searches while he dealt with the librarian. Surprisingly, given the time on the clock, she complied immediately. Anthony was relieved. “Look,” he said to the librarian, “I am not crazy, and I know what I am doing is legal.” She gestured to him to wait until she got off the phone. “Right-o, the manager's at lunch, so I'll have to have a chat with you myself. First and foremost, when you're taking footage of the students, you need permission and all that sort of thing. I'm just here to clarify that to you.” “Look, umm, Sue, I'm not recording right now, so I guess I can wear whatever I want and look as stupid as I want, so long as I'm not being a public nuisance.” “Young man, can I have your student ID card, please?” Anthony claimed he did not have one with him, trying to avoid returning to where Sophie was and getting hit with even more questions. Anthony instead gave the librarian his full name.

“Well, Anthony Fielding, it is against university policy to go around recording people in a public or private space,” stated the librarian firmly. Anthony, by now, had had enough. “Look, Sue, for the second time, I've not recorded anyone in the library. I did record part of my lecture today with this device. It is called a GoPro. Why hasn't anyone but me heard of it?” “Well, we have heard of Google Glass here, and we know that, for now, we don't want just anyone waltzing around filming indiscriminately. That doesn't help anyone on campus,” the librarian responded. “Okay, based on my experience today, I know you are right,” Anthony admitted. “But can you at least point me toward a library policy that clearly stipulates what we can and cannot do with cameras? And why is this kind of camera one that you're alarmed about, rather than a more flexible handheld one like this?” Anthony pulled out his iPhone 6. The librarian seemed oblivious to what Anthony was trying to argue. Meanwhile, Anthony glanced over at Sophie, half-smiling, pointing at his watch and then the exit to indicate that they would have to make a move soon.

“Look, I know you mean well. But…” Anthony was interrupted again by the librarian. “Anthony Fielding, it is very important you understand what I am about to tell you; otherwise, you might end up getting yourself in quite a bit of trouble. If you're recording students, you actually have to inform the student and ask if it's okay, because quite a lot of them are hesitant about being filmed.” Anthony retorted, “I know, I know, do unto others as you'd have them do unto you, but I already told you, I'm not recording…But which policy do you want to refer me to? I'll go and read it, I promise.” The librarian hesitated and murmured behind her computer, “Ah…I'll have to look…look…look and find it for you, but I just…I just know that…” The librarian realized the students were going to be late for a lecture. “Look, if you're right and there is no policy, assuming I've not made an error, then we need to develop one.” “Look, Sue, I don't mean to be rude, but we've already filmed in a lecture theater today. I wouldn't call a public lecture theater private in any capacity. Sure, people can have private conversations in a theater, but they shouldn't be talking about private things unless they want to actively share them during class discussion time.” “Look, that's a bit of a gray area,” the librarian answered. “I think I am going to have to ask security to come over. It's just that I don't think the safety of others is being put first. For starters, you should take that thing off.” Anthony realized that things were now serious. He attempted to take off the band, which was soaking wet with sweat given his latest predicament.

Sophie realized something was wrong when she was walking with the bags back to the information desk. “Anthony, what's happening?” Sophie had a worried look on her face. “I've been asked to wait for security,” said Anthony. “Can you please not worry and just leave for class? I won't feel so bad if you go on without me.” Sophie responded, “Anthony, I told you this thing was trouble—you should have just taken it off—oh Anthony!” “What now?” said Anthony. “Your forehead…are you okay? It's all red and wrinkly and sweaty. Are you feeling okay?” Sophie put her hand on Anthony's forehead and realized he was running a fever. “Look, is this really necessary? My boyfriend has not done anything wrong. He's taken off the device. If you want to see the lecture footage, we'll show you. But really, the guy has to pass this subject. Please can we go to the lecture theater?” The librarian was unequivocally unemotional. Anthony looked at Sophie and she nodded okay and left for class with all the bags. “Please ring me if you need anything, and I'll be here in a flash.” Sophie kissed Anthony goodbye.

Scenario 8: Security on Campus

Moments later, security arrived on the scene. Anthony challenged the security guards and emphasized that he had done nothing wrong. He was escorted back to the security office on campus, some 500 m away. At this point, he was told he was not being detained and that university security staff simply wanted to have a chat with him. Anthony became deeply concerned when several security staff greeted him at the front desk. They welcomed him inside, asked him to take a seat, and offered him a cup of coffee.

“Anthony, there has been a spate of thefts on campus of late. We'd like to ask you where you got your GoPro camera.” “Well, it was a birthday present from my older brother a few months ago,” Anthony explained. “He knows I've always made home movies from when I was a youngster, and he thought I might use it to film my own skateboarding stunts.” “Right,” said the security officer. “Could you let me take a look at the serial number at the bottom of the unit?” “Sure,” said Anthony, “and then can I go? I haven't stolen anything.” The security staff inspected the device and checked the serial number against their database, handing it back to Anthony. “Ok, you're free to go now.” “What? And I thought you were going to interrogate me for the footage I took today!”

“Look Anthony, that's a delicate issue. Yeah, under the Surveillance Devices Act, for you to be able to record somebody you need their explicit permission, which is why you'll see that wherever we've got cameras we've got signage that states you're being filmed, and even then we've got a strict policy about what we do with the recordings. We can't let anybody view it unless it's police and so on; it's really strict.” Anthony replied, “What happens when Google Glass begins to proliferate on campus? The GoPro, which will be obvious, won't be what you're looking out for, but rather Glass being misused, or covert devices.” “Look,” the security officer replied, “the way it works at universities is that we are concerned with the here and now. I can't predict what will happen in three months' time, right?” At this point, Anthony was thinking about his lecture and how he was running late yet again, though this time through no fault of his own.

“Is she with you?” asked the security manager. “Who do you mean?” questioned Anthony. “That young lady over there,” the manager replied, pointing through the screen door. “Oh, that's my girlfriend, Sophie. I reckon she was worried about me and came to see what was going on.” Sophie had her iPhone out and was recording the goings-on. Anthony just had to ask, “Am I right? Is my girlfriend allowed to do that? She isn't trespassing. The university campus is a public space for all to enjoy.” The security manager replied, “Actually, she's recording me, but she's not really allowed to do that without giving me some sort of notification. We might have cameras crawling all over this campus for student and staff safety, but our laws state that if people don't want to be recorded, then you should not be recording them. On top of this, you would probably realize that when you walk around the campus in large areas like the walkways, the cameras are actually facing the road, not facing people. So yes, you need permission for what she's doing there, or adequate signage explaining what is going on.”

Sophie put the phone down and knocked on the door. “Can I come inside?” “Of course you can,” said the security manager. “Join the party!” “Anthony, Prof. Gabriel is asking for you; otherwise, he'll count you absent and you won't get your 10% participation mark for the session. I told him I knew where you were. If we get back within 15 min, you're off the hook.” “Hang on Sophie,” Anthony continued, “I'd like to solve this problem now to avoid any future misunderstandings. After all, I'm about to enter the classroom and record it for my own learning and progress. What do you think? Is that against the law?” Anthony asked the security manager. The security manager pondered for a long while. “Look, we get lots and lots of requests asking us to investigate the filming of an individual; we take that very seriously. But there is no law against that taking place in a public space.” “Is a lecture theater a public space?” Anthony prompted. The security manager replied, “I think you should be allowed to use head-mounted display video cameras if it's obvious what you're doing, unless a bystander asks you to cease recording. The lecture rooms are open and are usually mixed with the reception areas, which makes them public areas; so if you want to gain access to the room, obviously you can, because it's a public area. You don't have to use a swipe card to get in, you see. But then there are still things that you can't do in a public area, like you can't ride a bicycle in there; or if someone is giving a lecture, you can't interrupt the lecture. That sort of thing.”

Anthony started speaking from the experience of his day. “I was queueing in front of the ATM today, and I realized that I could easily see the activities of the people in front of me and the same in the library. When I hover around somebody's computer, I can see their screen and what they're up to on the Internet. It bothered even me after my experience today; unintentionally I'm seeing someone's ATM PIN number, I'm seeing someone searching on Google about how to survive HIV, which is personal and highly sensitive private stuff. No one should be seeing that. I just wore my GoPro to record my lecture for study purposes, but these kinds of devices in everyday life must be very disturbing for the people being recorded. That's why I'm curious what would happen on campus.” The security manager interrupted, “We already have some policies in place. For example, you can make a video recording, but what are you going to do with it? Are you going to watch it yourself or are you going to e-mail it around? You can't do that using your university e-mail account. You can't download, transfer, or copy videos using university Internet, your university account, or your university e-mail account. Look it up; there are also rules about harassment…It's fairly strict and already organized in that regard. But if you're asking where the university is applying policies, you're asking the wrong people because we don't get involved in policy making. You should be talking to the legal department. We don't make the policies; we just follow the procedures. Every citizen of this nation also has to abide by state and federal laws.”

The explanation satisfied Anthony. He realized that the security manager was not the person to talk to for any further inquiries. “Thank you for taking the time to answer my questions; you've been very helpful,” Anthony said as he headed to the door to attend his class with Sophie. He did need that 10% attendance mark from Prof. Gabriel if he wanted to be in the running for a Distinction grade.

Scenario 9: Sophie's Lecture

After their last lecture together, Anthony was happy thinking he was almost done for the day and would be heading back home, but Sophie had one more hour of tutorial. Anthony walked Sophie to the classroom for her last tutorial. “C'mon Anthony, it'll only take half an hour tops. After this class, we can leave together; bear with it for just a while,” Sophie insisted. “Okay,” said Anthony; his mind was overflowing with thoughts of the final exams and the questions raised by his unique experience with the GoPro all day.

They arrived a few minutes late. Sophie quietly opened the door as Anthony walked in behind her. The lecturer caught a glimpse of Anthony with the GoPro on his head. The lecturer asked Anthony, “Are you in this class?” “No, I'm just with a friend,” replied Anthony as he was still trying to walk in and take a seat. “Okay, and you're wearing a camera?” “Yeah?!” Anthony replied, confused by the tone of the lecturer. “Take it off!” the lecturer exclaimed. “You don't have permission to wear a camera in my class!” Silence fell over the classroom. As the lecturer's tone became more aggravated, everyone stopped, trying to understand what was going on. “Ok, but it's not…” The lecturer refused to hear any explanation. “You're not supposed to interrupt my class, and you're not supposed to be wearing a camera, so please take the camera off and leave the class!”

Anthony saw no point in explaining himself and left the class. Sophie, in shock, followed Anthony outside to check up on him and make sure he was all right. “Oh Anthony, I don't know how many times I told you to take it off all day…Are you ok?” Anthony was shocked as well. “I don't understand why he got so upset.” Anthony was facing the lecture theater's glass door; it opened and the lecturer stepped out and asked, “Excuse me, are you filming inside the class?” “Professor…” Anthony tried to say he was sorry for the trouble and that he wasn't even recording. “No! Were you filming inside the class?” the lecturer asked again. “I'm sorry if I caused you trouble, professor, the camera is not even on.” The professor, angry at both of them for interrupting his class with such a silly incident, asked them to leave and returned to the lecture theater. Sophie was surprised. “He's a very nice person; I don't understand why he got so upset.” Anthony's shock turned into anger. “I thought this was a public space and I don't think there's any policy that forbids me to record the lecture! Couldn't he at least say it nicely? You get back in, I'll see you after your class, and meanwhile I'll take this darn thing off.” Anthony kissed Sophie goodbye and left for the library without the GoPro on his head.

Conclusion

Wearable computers—digital glasses, watches, headbands, armbands, and other apparel that can lifelog and record visual evidence—tell you where you are on the Earth's surface and how to navigate to your destination, alert you of your physical condition (heart and pulse rate monitors), and even inform you when you are running late to catch a plane, offering rescheduling advice. These devices are windows to others through social networking, bridges to storage centers, and, even on occasion, companions as they listen to your commands and respond like a personal assistant. Google Glass, for instance, is a wearable computer with an optical head-mounted display that acts on voice commands like “take a picture” and allows for hands-free recording. You can share what you see live with your social network, and it provides directions right in front of your eyes. Glass even syncs your deadlines with speed, distance, and time data critical to forthcoming appointments.


But Google is not alone. Microsoft was in the business of lifelogging more than a decade ago with its SenseCam device, which has since been replaced by the Autographer. Initially developed as a memory aid for those suffering from dementia, the Autographer takes a 5-megapixel picture about 2,000 times a day, and the day's images can be replayed in fast-forward mode in about 5 min. It is jam-packed with sensors that provide a context for each photo, including an accelerometer, a light sensor, a magnetometer, an infrared motion detector, and a thermometer, as well as a GPS chipset. The slim-line Narrative Clip is the latest gadget to enter the wearable space. Far less obtrusive than Glass or the Autographer, it can be pinned onto your shirt, takes a snapshot every 30 s, and is so lightweight that you quickly forget you are even wearing it.

These devices make computers part of the human interface. But what are the implications of inviting all this technology onto the body? We seem to be producing innovations at an ever-increasing rate and expect adoption to match that cycle of change. But while humans have limitations, technologies do not. We can keep developing at an incredible speed, but there are many questions about trust, privacy, security, and the effects on psychological well-being that, if left unaddressed, could carry major risks and negative societal effects. The most invasive feature of all of these wearables, however, is the image sensor that can take pictures in an outward-looking fashion.

The claim is often made that we are under surveillance by CCTV even within leisure centers and change rooms. But having a Glass device, Autographer, or Narrative Clip recording while you are in a private space, like a “public” washroom, provides all sorts of nightmare scenarios. The camera is looking outward, not at you. Those who believe that they will remember to turn off the camera, will not be tempted to keep the camera “rolling,” or will “delete” the data gathered at a later date are only kidding themselves. We can hardly delete our e-mail records, let alone the thousands of pictures or images we take each day. The recording of sensitive data might also increase criminality rather than reduce it. The power to exclude, delete, or misrepresent an event is with the wearer and not the passive passerby. There is an asymmetry here that cannot be rectified unless the passive participant becomes an active wearer themselves. And this is not only unfeasible, but we would argue undesirable. At what point do we say enough is enough?

We are challenging fundamental human rights through the thoughtless adoption of new technologies that are enslaving us to a paradigm of instantaneous reality-TV-style living. We are seduced into providing ever more of our personal selves without any concern for the protection of our personal data. Who owns the data emanating from these devices if the information is stored somewhere other than the device itself? Does that mean I lose the capacity to own my own set of histories relating to my physiological characteristics as they are sold on to third-party suppliers? Who will return my sense of self after I have given it away to someone else? We need to face up to these real and pressing matters because they have not only legal implications but implications for our humanity.

IEEE Keywords: Wearable computing, Market research, Product design, Product development, Consumer behavior, Supply and demand, Digital computers, Google, Marketing and sales

References

[1] M. Lindgren and H. Bandhold, Scenario Planning: The Link Between Future and Strategy. Basingstoke, Hampshire: Palgrave Macmillan, 2010, p. 22. 

[2] S. Inayatullah, “Humanity 3000: A comparative analysis of methodological approaches to forecasting the long-term,” Foresight, vol. 14, no. 5, pp. 401–417, 2012. 

[3] M. Godet, “The art of scenarios and strategic planning,” Technol. Forecast. Social Change, vol. 65, no. 1, pp. 3–22, 2000. 

[4] L. Perusco and K. Michael, “Control, trust, privacy, and security: Evaluating location-based services,” IEEE Technol. Soc. Mag., vol. 26, no. 1, pp. 4–16, 2007. 

[5] K. Michael and M. G. Michael. (2013). No limits to watching. Commun. ACM. [Online]. 56(11), 26–28. Available: http://cacm.acm.org/magazines/2013/11/169022-no-limits-to-watching/abstract 

[6] Y. Gokyer, K. Michael, and A. Preston. Katina Michael discusses pervasive video recording in the accompaniment to “No Limits to Watching” on ACM's Vimeo channel. [Online]. Available: http://vimeo.com/77810226 

[7] K. Michael. (2013). Social implications of wearable computing and augmediated reality in every day life. In Proc. IEEE Symposium on Technology and Society (ISTAS13), Toronto,

INSPEC: wearable computers, helmet mounted displays, innovation management, wearable computer, digital wearability scenarios, experimental technologies, market leaders, state-based regulations, innovation design practices, radical production, Google Glass product, optical head-mounted display unit

Citation: Deniz Gokyer, Katina Michael, “Digital Wearability Scenarios: Trialability on the run,” IEEE Consumer Electronics Magazine, Year: 2015, Volume: 4, Issue: 2, pp. 82-91, DOI: 10.1109/MCE.2015.2393005 

Using a Social-Ethical Framework to Evaluate Location-Based Services

Abstract


The idea for an Internet of Things has matured since its inception as a concept in 1999. People today speak openly of a Web of Things and People, and even more broadly of an Internet of Everything. As our relationships become more and more complex and enmeshed through the use of advanced technologies, we have pondered ways to simplify flows of communications, to collect meaningful data, and to use them to make timely decisions with respect to optimisation and efficiency. At their core, these flows of communications are pathways to registers of interaction, and tell the intricate story of outputs at various units of analysis: things, vehicles, animals, people, organisations, industries, even governments. In this trend toward evidence-based enquiry, data is the enabling force driving the growth of IoT infrastructure. This paper uses the case of location-based services, which are integral to IoT approaches, to demonstrate that new technologies are complex in their effects on society. Fundamental to IoT is the spatial element, and through this capability, the tracking and monitoring of everything, from the smallest nut and bolt to the largest shipping liner, to the mapping of planet earth, and from the whereabouts of the minor to that of the prime minister. How this information is stored, who has access to it, and what they will do with it is arguable, depending on the answers given. In this case study of location-based services we concentrate on control and trust, two overarching themes that have been very much neglected, and use the outcomes of this research to inform the development of a socio-ethical conceptual framework that can be applied to minimise the unintended negative consequences of advanced technologies. We posit that it is not enough to claim objectivity through information ethics approaches alone, and present instead a socio-ethical impact framework. Sociality therefore binds together that higher ideal of praxis where the living thing (e.g. human) is the central and most valued actor of a system.

Agenda:

Introduction 1.1

Control 1.2

Surveillance 1.2.1

Common surveillance metaphors 1.2.2

Applying surveillance metaphors to LBS 1.2.3

‘Geoslavery’ 1.2.4

From state-based to citizen level surveillance 1.2.5

Dataveillance 1.2.6

Risks associated with dataveillance 1.2.7

Loss of control 1.2.8

Studies focussing on user requirements for control 1.2.9

Monitoring using LBS: control versus care? 1.2.10

Sousveillance 1.2.11

Sousveillance, ‘reflectionism’ and control 1.2.12

Towards überveillance 1.2.13

Implications of überveillance on control 1.2.14

Comparing the different forms of ‘veillance’ 1.2.15

Identification 1.2.16

Social sorting 1.2.17

Profiling 1.2.18

Digital personas and dossiers 1.2.19

Trust 1.3

Trust in the state 1.3.1

Balancing trust and privacy in emergency services 1.3.2

Trust-related implications of surveillance in the interest of national security 1.3.3

Need for justification and cultural sensitivity 1.3.4

Trust in corporations/LBS/IoT providers 1.3.5

Importance of identity and privacy protection to trust 1.3.6

Maintaining consumer trust 1.3.7

Trust in individuals/others 1.3.8

Consequences of workplace monitoring 1.3.9

Location-monitoring amongst friends 1.3.10

Location tracking for protection 1.3.11

LBS/IoT is a ‘double-edged sword’ 1.3.12

Discussion 1.4

The Internet of Things (IoT) and LBS: extending the discussion on control and trust 1.4.1

Control- and trust-related challenges in the IoT 1.4.2

Ethical analysis: proposing a socio-ethical conceptual framework 1.4.3

The need for objectivity 1.4.4

Difficulties associated with objectivity 1.4.5

Conclusion 1.5

 

Introduction 1.1

Locative technologies are a key component of the Internet of Things (IoT). Some scholars go so far as to say it is the single most important component that enables the monitoring and tracking of subjects and objects. Knowing where something or someone is, is of greater importance than knowing who they are, because it or they can be found, independent of what or who they are. Location also grants us that unique position on the earth’s surface, providing one of the vital pieces of information forming the distance, speed, time matrix. A unique ID, formed around an IP address in an IoT world, presents us with the capability to label every living and non-living thing and to recollect it, adding to its history and longer-term physical lifetime. But without knowing where something is, even if we have the knowledge that an action is required toward some level of maintenance, we cannot be responsive. Since the introduction of electronic databases, providing accurate records for transaction processing has been a primary aim. Today, however, we are attempting to increase visibility using high-resolution geographic details, we are contextualizing events through discrete and sometimes continuous sensor-based rich audio-visual data collection, and we are observing how mobile subjects and objects interact with the built environment. We are no longer satisfied with an approach that says identify all things; we wish to be able to recollect or activate them on demand, understand associations and affiliations, and create a digital chronicle of their history to provide insights toward sustainability.

There is thus an undue pressure on the ethical justification for social and behavioral tracking of people and things in everyday life. Just because we have the means to do something does not mean we should do it. We are told that through this new knowledge gained from big data we can reduce carbon emissions, eradicate poverty, grant all people equity in health services, better provision for expected food shortages, and utilize energy resources optimally; in short, make the world a better place. This utopian view might well be the vision that the tech sector wishes to adopt as an honourable marketing strategy, but the reality of thousands of years of history tells us that technology does not, of its own accord, necessarily make things better. In fact, it has often made some aspects of life, such as conflict and war, much worse through the use of sophisticated modern techniques. We could argue that IoT will allow for care-based surveillance that will bring aid to individuals and families in need, but the reality is that wherever people are concerned, technology may be exploited as a means of control. Control on its own is not necessarily an evil; it all depends on how the functionality of a given technology is applied. Applied negatively, the recipient of this control orientation learns distrust instead of trust, which then causes a chain reaction throughout society, especially with respect to privacy and security. We need only look at the techniques espoused by some governments in the last 200 years to acknowledge that heinous crimes against humanity (e.g. democide) have been committed with new technological armaments (Rummel, 1997) to the detriment of the citizenry.

A socio-ethical framework is proposed as a starting point for seeking to understand the social implications of location services, applicable to current and future applications within IoT infrastructure. To stop at critiquing services using solely an information ethics-based approach is to fall short. Today’s converging services and systems require a greater scope of orientation to ask more generally how society may be affected at large, not just whether information is being collected, stored, and shared appropriately. To ask questions about how location services and IoT technology will directly and indirectly change society has far greater importance for the longer term vision of person-to-person and person-to-thing interactions than simply studying various attributes in a given register.

Studies addressing the social implications of emerging technologies, such as LBS, generally reflect on the risks and ethical dilemmas resulting from the implementation of a particular technology within a given social context. While numerous approaches to ethics exist, all are inextricably linked to ideas of morality, and an ability to distinguish good conduct from bad. Ethics, in simple terms, can be considered as the “study of morality” (Quinn 2006, p. 55), where morality refers to a “system of rules for guiding human conduct and principles for evaluating those rules” (Tavani 2007, p. 32). This definition is shared by Elliot and Phillips (2004, p. 465), who regard ethics as “a set of rules, or a decision procedure, or both, intended to provide the conditions under which the greatest number of human beings can succeed in ‘flourishing’, where ‘flourishing’ is defined as living a fully human life” (O'Connor and Godar 2003, p. 248).

According to the literature, there are two prominent ethical dilemmas that emerge with respect to locating a person or thing in an Internet of Things world: first, the risk of unauthorised disclosure of one’s location, which is a breach of privacy; and second, the possibility of increased monitoring leading to unwarranted surveillance by institutions and individuals. The socio-ethical implications of LBS in the context of IoT can therefore be explored based on these two major factors. IoT more broadly, however, can be examined by studying numerous social and ethical dilemmas from differing perspectives. Michael et al. (2006a, pp. 1-10) propose a framework for considering the ethical challenges emerging from the use of GPS tracking and monitoring solutions in the control, convenience and care usability contexts. The authors examine these contexts in view of the four ethical dimensions of privacy, accuracy, property and accessibility (Michael et al. 2006a, pp. 4-5). Alternatively, Elliot and Phillips (2004, p. 463) discuss the social and ethical issues associated with m-commerce and wireless computing in view of the privacy and access, security and reliability challenges. The authors claim that factors such as trust and control are of great importance in the organisational context (Elliot and Phillips 2004, p. 470). Similar studies propose that the major themes regarding the social implications of LBS be summarised as control, trust, privacy and security (Perusco et al. 2006; Perusco and Michael 2007). These themes provide a conceptual framework for reviewing relevant literature in a structured fashion, given that a large number of studies are available in the respective areas.

This article focusses, in the first instance, on the control- and trust-related socio-ethical challenges arising from the deployment of LBS, two themes that are yet to receive comprehensive coverage in the literature. This is followed by an examination of LBS in the context of the Internet of Things (IoT) and the ensuing ethical considerations. A socio-ethical framework is then proposed as a valid starting point for addressing the social implications of LBS, one that is applicable to current LBS use cases and future applications within an Internet of Things world.

Control 1.2

Control, according to the Oxford Dictionary (2012a), refers to “the power to influence or direct people’s behaviour or the course of events”. With respect to LBS, this theme is examined in terms of a number of important concepts, notably surveillance, dataveillance, sousveillance and überveillance scholarship.

Surveillance 1.2.1

A prevailing notion in relation to control and LBS is the idea of exerting power over individuals through various forms of surveillance. Surveillance, according to sociologist David Lyon, “is the focused, systematic and routine attention to personal details for the purposes of influence, management, protection or direction,” although Lyon admits that there are exceptions to this general definition (Lyon 2007, p. 14). Surveillance has also been described as the process of methodically monitoring the behaviour, statements, associates, actions and/or communications of an individual or individuals, and is centred on information collection (Clarke 1997; Clarke 2005, p. 9).

The act of surveillance, according to Clarke (1988; 1997) can either take the form of personal surveillance of a specific individual or mass surveillance of groups of interest. Wigan and Clarke (2006, p. 392) also introduce the categories of object surveillance of a particular item and area surveillance of a physical enclosure. Additional means of expressing the characteristics of surveillance exist. For example, the phrase “surveillance schemes” has been used to describe the various surveillance initiatives available (Clarke 2007a, p. 28). Such schemes have been demonstrated through the use of a number of mini cases or vignettes, which include, but are not limited to, baby monitoring, acute health care, staff movement monitoring, vehicle monitoring, goods monitoring, freight interchange-point monitoring, monitoring of human-attached chips, monitoring of human-embedded chips, and continuous monitoring of chips (Clarke 2007c; Clarke 2007b, pp. 47-60). The vignettes are intended to aid in understanding the desirable and undesirable social impacts resulting from respective schemes.

Common surveillance metaphors 1.2.2

In examining the theme of control with respect to LBS, it is valuable to initially refer to general surveillance scholarship to aid in understanding the link between LBS and surveillance. Surveillance literature is somewhat dominated by the use of metaphors to express the phenomenon. A prevalent metaphor is that of the panopticon, first introduced by Jeremy Bentham (Bentham and Bowring 1843), and later examined by Michel Foucault (1977). Foucault’s seminal piece Discipline and Punish traces the history of punishment, commencing with the torture of the body in the eighteenth century, through to more modern forms of punishment targeted at the soul (Foucault 1977). In particular, Foucault’s account offers commentary on the notions of surveillance, control and power through his examination of Bentham’s panopticon, which are pertinent in analysing surveillance in general and monitoring facilitated by LBS in particular. The panopticon, or “Inspection-House” (Bentham and Bowring 1843, p. 37), refers to Bentham’s design for a prison based on the essential notion of “seeing without being seen” (p. 44). The architecture of the panopticon is as follows:

“The building is circular. The apartments of the prisoners occupy the circumference. You may call them, if you please, the cells... The apartment of the inspector occupies the centre; you may call it if you please the inspector's lodge. It will be convenient in most, if not in all cases, to have a vacant space or area all round, between such centre and such circumference.  You may call it if you please the intermediate or annular area” (Bentham and Bowring 1843, pp. 40-41).

Foucault (1977, p. 200) further illustrates the main features of the inspection-house, and their subsequent implications on constant visibility:

“By the effect of backlighting, one can observe from the tower [‘lodge’], standing out precisely against the light, the small captive shadows in the cells of the periphery. They are like so many cages, so many small theatres, in which each actor is alone, perfectly individualized and constantly visible...Full lighting and the eye of a supervisor [‘inspector’] capture better than darkness, which ultimately protected. Visibility is a trap.”

While commonly conceived as ideal for the prison arrangement, the panopticon design is applicable and adaptable to a wide range of establishments, including but not limited to work sites, hospitals, schools, or any establishment in which individuals “are to be kept under inspection” (Bentham and Bowring 1843, p. 37). It has been suggested, however, that the panopticon functions as a tool for mass (as opposed to personal) surveillance in which large numbers of individuals are monitored, in an efficient sense, by a small number (Clarke 2005, p. 9). This differs from the more efficient, automated means of dataveillance (to be shortly examined). In enabling mass surveillance, the panopticon theoretically allows power to be exercised over the many by the few, while also inducing the watched to discipline themselves. Foucault (1977, pp. 202-203) provides a succinct summary of this point:

“He who is subjected to a field of visibility, and who knows it, assumes responsibility for the constraints of power; he makes them play spontaneously upon himself; he inscribes in himself the power relation in which he simultaneously plays both roles; he becomes the principle of his own subjection.”

This self-disciplinary mechanism can be paralleled with various notions in George Orwell’s classic novel Nineteen Eighty Four (Orwell 1949), also a common reference point in surveillance literature. Nineteen Eighty Four has been particularly influential in the surveillance realm, notably through the use of “Big Brother” as a symbol of totalitarian, state-based surveillance. Big Brother’s inescapable presence is reflected in the nature of surveillance activities: monitoring is constant and omnipresent, and “[n]othing was your own except the few cubic centimetres inside your skull” (Orwell 1949, p. 29). The oppressive authority figure of Big Brother persistently monitors and controls the lives of individuals, employing numerous mechanisms to exert power over the populace as a reminder of his unavoidable gaze.

One such mechanism is the telescreen, the technological solution through which surveillance is applied. The telescreens operate as a self-disciplinary tool by reinforcing the idea that citizens are under constant scrutiny (in a similar fashion to the inspector’s lodge in the panopticon metaphor). They inevitably influence behaviours, enabling the state to maintain control over actions and thoughts, and to impose appropriate punishments in the case of an offence. This is demonstrated in the following excerpt:

“It was terribly dangerous to let your thoughts wander when you were in any public place or within range of a telescreen. The smallest thing could give you away. A nervous tic, an unconscious look of anxiety, a habit of muttering to yourself – anything that carried with it the suggestion of abnormality, of having something to hide. In any case, to wear an improper expression on your face (to look incredulous when a victory was announced, for example) was itself a punishable offence” (Orwell 1949, p. 65).

The Internet of Things, with its ability to locate and determine who or what is related to whom using a multiplicity of technologies, will enable authorities in power to infer what someone is likely to do in a given context. Past behavioural patterns can, for example, reveal a likely course of action with little prediction required. IoT in all its glory will provide complete visibility; the question is what risks are associated with providing that kind of capability to the state or private enterprise. In scenario analysis we can ponder how IoT in a given context will be used for good, how it will be used for bad, and the neutral case where it has no effect whatsoever because the data stream is ignored by the system owner. While IoT has been touted as the ultimate in providing organisational and operational returns, one can see how it lends itself to location-based tracking and monitoring in the manner of the panopticon metaphor. Paper records and registers were used during World War II for the purposes of segregation; IoT, and especially the ability to “locate on demand”, may well be used for similar types of control purposes.
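The inference described above need not involve sophisticated modelling. As a purely illustrative sketch (all names, places and data here are invented for illustration), a simple frequency count over a past location trail already yields a crude prediction of a person's "likely course of action" for a given hour of day:

```python
from collections import Counter

def predict_location(sightings, hour):
    """Predict the most likely location for a given hour of day, using
    nothing more than frequency counts over past sightings.
    `sightings` is a list of (hour, location) tuples."""
    counts = Counter(loc for h, loc in sightings if h == hour)
    if not counts:
        return None  # no past observations for that hour
    return counts.most_common(1)[0][0]

# Invented data trail: (hour of day, location)
trail = [(8, "station"), (8, "station"), (9, "office"),
         (12, "cafe"), (8, "gym"), (12, "cafe"), (9, "office")]

print(predict_location(trail, 8))   # most frequent 8 am location
print(predict_location(trail, 12))  # most frequent noon location
```

The point of the sketch is that no machine learning is required: the sheer regularity of human movement makes a raw histogram of past sightings a workable predictor, which is precisely what gives aggregated IoT data streams their power.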

1.2.3 Applying surveillance metaphors to LBS

The aforementioned surveillance metaphors can be directly applied to the case of LBS within IoT. In the first instance, it can be perceived that the exploitation of emerging technologies, such as LBS, extends the notion of the panopticon in a manner that allows inspection or surveillance to take place regardless of geographic boundaries or physical locations. In applying the idea of the panopticon to modern technologies, Lyon suggests that “Bentham’s panopticon gives way to the electronic superpanopticon” (Lyon 2001, p. 108). With respect to LBS, this superpanopticon is not limited by the physical boundaries of a particular establishment, but is rather dependent on the nature and capabilities of the mobile devices used for ‘inspection’. In an article titled “The Panopticon's Changing Geography”, Dobson and Fisher (2007) also discuss the progress and various manifestations of surveillance technology, specifically the panopticon, and the consequent implications for power relationships. From Bentham's architectural design, to the electronic panopticon depicted by Orwell, to contemporary forms of electronic surveillance including LBS and covert human tracking, Dobson and Fisher (2007, pp. 308-311) claim that all forms of watching enable continuous surveillance, whether as part of their primary or secondary purpose. They compare four means of surveillance: analogue technologies as used by spies, which have unlimited geographic coverage but are very expensive to own and operate; Bentham’s original panopticon, where the geographic view was internal to a building; George Orwell’s Big Brother view, which was bound by the extent of television cables; and finally human tracking systems, which are limited only by the availability and granularity of cell phone towers.

A key factor in applying the panopticon metaphor to IoT is that individuals, through the use of mobile location devices and technologies, will be constantly aware of their visibility, knowing that an ‘inspector’ may be monitoring their location and other available information remotely at any given time. Mobile location devices may similarly replace Orwell’s telescreens as Big Brother’s primary surveillance technology, resulting in a situation in which the user aids in the process of location data collection and thereby surveillance. This creates, as maintained by Andrejevic (2007, p. 95), a “widening ‘digital enclosure’ within which a variety of interactive devices that provide convenience and customization to users double as technologies for gathering information about them.”

1.2.4 ‘Geoslavery’

Furthermore, in extreme situations, LBS may facilitate a new form of slavery, “geoslavery”, which Dobson and Fisher (2003, pp. 47-48) define as “a practice in which one entity, the master, coercively or surreptitiously monitors and exerts control over the physical location of another individual, the slave. Inherent in this concept is the potential for a master to routinely control time, location, speed, and direction for each and every movement of the slave or, indeed, of many slaves simultaneously.” In their seminal work, the authors flag geoslavery as a fundamental human rights issue (Dobson and Fisher 2003, p. 49), one that has the potential to somewhat fulfil Orwell's Big Brother prophecy, differing only in the sophistication of LBS compared with visual surveillance, and in terms of who is in control. While Orwell’s focus is on the state, Dobson and Fisher (2003, p. 51) caution that geoslavery can also be performed by individuals “to control other individuals or groups of individuals.”

1.2.5 From state-based to citizen level surveillance

Common to both Discipline and Punish and Nineteen Eighty Four is the perspective that surveillance activities are conducted at the higher level of the “establishment”; that is, institutional and/or state-based surveillance. However, similar notions can be applied at the consumer or citizen level. Mark Andrejevic (2007, p. 212), in his book iSpy: Surveillance and Power in the Interactive Era, terms this form of surveillance “lateral or peer-to-peer surveillance.” It is characterised by “increasing public access to the means of surveillance – not just by corporations and the state, but by individuals” (Andrejevic 2007, p. 212). Similarly, Barreras and Mathur (2007, pp. 176-177) state that wireless location tracking capabilities are no longer limited to law enforcement, but are open to any interested individual. Abbas et al. (2011, pp. 20-31) further the discussion by focussing on the implications of covert LBS-based surveillance at the community level, where technologies typically associated with policing and law enforcement are increasingly available to members of the community. With further reference to LBS, Dobson and Fisher (2003, p. 51) claim that the technology empowers individuals to control other individuals or groups, while also facilitating extreme activities. For instance, child protection, partner tracking and employee monitoring can now take on extreme forms through the employment of LBS (Dobson and Fisher 2003, p. 49). According to Andrejevic (2007, p. 218), this “do-it-yourself” approach assigns the act of monitoring to citizens. In essence, higher degrees of control are granted to individuals, thereby encouraging their participation in the surveillance process (Andrejevic 2007, pp. 218-222). It is important to understand IoT in the context of this multifaceted “watching”.
IoT will not only be used by organisations and government agencies; individuals in a community will also be granted access to information at low levels of aggregation. This has implications at a multiplicity of levels. Forces of control will be manifold.

1.2.6 Dataveillance

The same sentiments can be applied to the related, and to an extent superseding, notion of data surveillance, commonly referred to as dataveillance. Coined by Roger Clarke in the mid-1980s, dataveillance is defined as “the systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons” (Clarke 1988). Clarke (2005, p. 9) maintains that this process is automated and therefore relatively economical when compared with other forms of surveillance, in that dataveillance activities centre on examination of the data trails of individuals. Traditional forms of surveillance rely on expensive visual monitoring techniques, whereas dataveillance is a largely economically efficient alternative (Clarke 1994; 2001d, p. 11). Visual behavioural monitoring (that is, traditional surveillance) remains an issue, but is nonetheless overshadowed by the challenges associated with dataveillance, particularly with reference to personal and mass dataveillance (Clarke 2005, pp. 9-10). That is, personal dataveillance presents risks to the individual based primarily on the potential for the collected data/information to be incorrect or outdated, while mass dataveillance is risky in that it may generate suspicion amongst individuals (Albrecht and Michael 2013).

1.2.7 Risks associated with dataveillance

Clarke’s early and influential work on “Information Technology and Dataveillance” recognises that information technology is accelerating the growth of dataveillance, which presents numerous benefits and risks (Clarke 1988, pp. 498, 505-507). Clarke lists advantages in terms of safety and government applications, while noting the dangers associated with both personal and mass dataveillance (Clarke 1988, pp. 505-507). These risks can indeed be extended or applied to the use of location and tracking technologies to perform dataveillance activities, resulting in what can be referred to as “dataveillance on the move” (Michael and Michael 2012). The specific risks include: the ability for behavioural patterns to be exposed and cross-matched, potentially yielding revelations that may be harmful from a political and personal perspective; a rise in the use of “circumstantial evidence”; transparency of behaviour, resulting in the misuse of information relating to an individual’s conduct; and “actual repression of the readily locatable and trackable individual” (Clarke 2001b, p. 219). Emerging from this analysis, and from that concerning surveillance and related metaphors, is the significant matter of loss of control.
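The first of these risks, the cross-matching of behavioural patterns, can be illustrated with a deliberately minimal, hypothetical sketch (all people, times and places are invented): intersecting two independently collected location trails surfaces exactly the kind of “circumstantial evidence” of association that Clarke warns about, without either party ever being directly observed together.

```python
def co_locations(trail_a, trail_b):
    """Return the (time, place) pairs at which two data trails intersect.
    Each trail is a set of (time, place) tuples, e.g. collected by
    different source systems; the intersection is purely circumstantial
    evidence of association between the two people."""
    return sorted(set(trail_a) & set(trail_b))

# Invented trails from two separate collection systems
alice = {("Mon 09:00", "clinic"), ("Mon 12:00", "cafe"), ("Tue 18:00", "rally")}
bob   = {("Mon 12:00", "cafe"), ("Tue 18:00", "rally"), ("Wed 08:00", "gym")}

print(co_locations(alice, bob))  # shared (time, place) sightings
```

Note that neither trail is sensitive in isolation; it is the cross-match that manufactures a new, potentially harmful fact (repeated co-presence at a rally), which is why data linkage, not data collection alone, drives the risks listed above.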

1.2.8 Loss of control

Michael et al. (2006a, p. 2) state, in the context of GPS tracking, that the issue of control is a leading ethical challenge given the invasive nature of this form of monitoring. The mode of control can differ depending on the context. For instance, in the business context control may take the form of directing or ‘pushing’ advertisements to a specific individual, while at the personal/individual level it could signify control in the manner of “self-direction” (Perusco et al. 2006, p. 93). Other forms of social control can also be exercised by governments and organisations (Clarke 2003b), while emerging LBS solutions intended for the consumer sector extend the notion of control to community members (Abbas et al. 2011). This is an area that has not been adequately addressed in the literature. The subsequent risks to the individual are summarised in the following passage:

“Location technologies therefore provide, to parties that have access to the data, the power to make decisions about the entity subject to the surveillance, and hence exercise control over it. Where the entity is a person, it enables those parties to make determinations, and to take action, for or against that person’s interests. These determinations and actions may be based on place(s) where the person is, or place(s) where the person has been, but also on place(s) where the person is not, or has not been” (Wigan and Clarke 2006, p. 393).

Therefore, GPS and other location devices and technologies may result in decreased levels of control from the perspective of the individual being monitored. For example, in an article using scenarios to represent the social implications associated with the implementation of LBS, Perusco and Michael (2007) demonstrate the various facets of control in relation to LBS. The discussion is generally centred on the loss of control, which can be experienced in numerous ways, such as when a device does not operate accurately, or when an individual constantly monitors a family member in an attempt to care for them (Perusco and Michael 2007, pp. 6-7, 10). The authors raise valuable ideas with respect to control, such as the need to understand the purpose of control, the notion of consent, and methods for dealing with location inaccuracies, amongst others (p. 14). Perusco and Michael further assert that control has a flow-on effect on other issues, such as trust, questioning whether it is viable to control individuals given the likely risk that trust may be relinquished in the process (p. 13).

Concurrent with loss of control, the issue of pre-emptive control with respect to LBS is a delicate one, specifically in relation to suspected criminals or offenders. Perusco et al. (2006, p. 92) state that the punishment of a crime is typically proportionate to the committed offence; the notion of pre-emptive monitoring can thus be considered fundamentally flawed, given that individuals are being punished without having committed an offence. Rather, they are suspected of being a threat. According to Clarke and Wigan (2011), a person is perceived as a threat based on their “personal associations”, which can be determined using location and tracking technologies to establish the individual’s location in relation to others, and thus to control them on the basis of such details. This is where IoT fundamentally comes into play. While location information can tell us much about where an individual is at any point in time, it is IoT that will reveal inter-relationships, the frequency of interaction, and specific measurable transactions. IoT is the layer that will bring things under scrutiny in new ways.

This calls for an evaluation of LBS solutions that can be used for covert operations. Covert monitoring using LBS is often considered a useful technique, one that promotes less opposition than overt forms of monitoring, as summarised below:

“Powerful economic and political interests are seeking to employ location and tracking technologies surreptitiously, to some degree because their effectiveness is greater that way, but mostly in order to pre-empt opposition” (Clarke 2001b, p. 221).

Covert applications of LBS are increasingly available for the monitoring and tracking of social relations such as a partner or a child (Abbas et al. 2011). Whether covert or overt, using LBS for monitoring is essentially about control, irrespective of whether the act of controlling is motivated by necessity or by more practical or supportive purposes (Perusco et al. 2006, p. 93).

1.2.9 Studies focussing on user requirements for control

The control dimension is also significant in studies focussing on LBS users, namely, literature concerned with user-centric design, and user adoption and acceptance of LBS and related mobile solutions. In a paper focussing on understanding user requirements for the development of LBS, Bauer et al. (2005, p. 216) report on a user’s “fear” of losing control while interacting with mobile applications and LBS that may infringe on their personal life. The authors perceive loss of control to be a security concern requiring attention, and suggest that developers attempt to relieve the apprehension associated with increased levels of personalisation through ensuring that adequate levels of control are retained (Bauer et al. 2005, p. 216). This is somewhat supported by the research of Xu and Teo (2004, pp. 793-803), in which the authors suggest that there exists a relationship between control, privacy and intention to use LBS. That is, a loss of control results in a privacy breach, which in turn impacts on a user’s intention to embrace LBS.

The aforementioned studies, however, fail to explicitly incorporate the concept of value into their analyses. Due to the lack of literature discussing the three themes of privacy, value and control together, Renegar et al. (2008, pp. 1-2) present the privacy-value-control (PVC) trichotomy as a paradigm for measuring user acceptance and adoption of mobile technologies. This paradigm stipulates the need to achieve harmony amongst the concepts of privacy, value and control in order for a technology to be adopted and accepted by the consumer. However, the authors note that perceptions of privacy, value and control depend on a number of factors or entities, including the individual, the technology and the service provider (Renegar et al. 2008, p. 9). Consequently, Renegar et al. conclude that privacy does not obstruct the process of adoption; rather, adoption must take into account the value proposition in addition to the amount of control granted.

1.2.10 Monitoring using LBS: control versus care?

The focus of the preceding sections has been on the loss of control, the dangers of pre-emptive control, covert monitoring, and user perspectives relating to the control dimension. However, this analysis should not be restricted to the negative implications arising from the use of LBS, but rather should incorporate both the control and care applications of LBS. For instance, while discussions of surveillance and the term in general typically invoke sinister images, numerous authors warn against assuming this subjective viewpoint. Surveillance should not be considered in itself as disagreeable. Rather, “[t]he problem has been the presumptiveness of its proponents, the lack of rational evaluation, and the exaggerations and excesses that have been permitted” (Clarke 2007a, p. 42). This viewpoint is reinforced in the work of Elliot and Phillips (2004, p. 474), and can also be applied to dataveillance.

The perspective that surveillance inevitably results in negative consequences, such as individuals possessing excessive amounts of control over each other, should be avoided. For instance, Lyon (2001, p. 2) speaks of the dual aspects of surveillance in that “[t]he same process, surveillance – watching over – both enables and constrains, involves care and control.” Michael et al. (2006a) reinforce such ideas in the context of GPS tracking and monitoring. The authors claim that GPS tracking has been employed for control purposes in various situations, such as policing/law enforcement, the monitoring of parolees and sex offenders, the tracking of suspected terrorists and the monitoring of employees (Michael et al. 2006a, pp. 2-3). However, the authors argue that additional contexts such as convenience and care must not be ignored, as GPS solutions may potentially simplify or enable daily tasks (convenience) or be used for healthcare or the protection of vulnerable groups (care) (Michael et al. 2006a, pp. 3-4). Perusco and Michael (2005) further note that the tracking of such vulnerable groups indicates that monitoring activities are no longer limited to those convicted of a particular offence, but can rather be employed for protection and safety purposes. Table 1 provides a summary of GPS tracking and monitoring applications in the control, convenience and care contexts, adapted from Michael et al. (2006a, pp. 2-4), identifying the potentially constructive uses of GPS tracking and monitoring.

Table 1: GPS monitoring applications in the control, convenience and care contexts, adapted from Michael et al. (2006a, pp. 2-4)

It is crucial that in evaluating LBS control literature and establishing the need for LBS regulation, both the control and care perspectives are incorporated. The act of monitoring should not immediately conjure up sinister thoughts. The focus should preferably be directed to the important question of purpose or motives. Lyon (2007, p. 3) feels that purpose may exist anywhere on the broad spectrum between care and control. Therefore, as expressed by Elliot and Phillips (2004, p. 474), a crucial factor in evaluating the merit of surveillance activities and systems is determining “how they are used.” These sentiments are also applicable to dataveillance. It is helpful at this point to discuss alternative and related practices that may incorporate location information throughout the monitoring process.

1.2.11 Sousveillance

The term sousveillance, coined by Steve Mann, comes from the French sous, meaning “from below”, and veiller, meaning “to watch” (Mann et al. 2003, p. 332). It is primarily a form of “inverse surveillance” (Mann et al. 2003, p. 331), whereby an individual is in essence “surveilling the surveillers” (p. 332). Sousveillance relies on the use of wearable computing devices to capture audiovisual and sensory data (Mann 2005, p. 625). A major concern with respect to sousveillance, according to Mann (2005, p. 637), is the dissemination of the recorded data, which for the purposes of this investigation may include images of locations and corresponding geographic coordinates.

1.2.12 Sousveillance, ‘reflectionism’ and control

Relevant to the theme of control, it has been argued that sousveillance can be utilised as a form of resistance to unwarranted surveillance and control by institutions. According to Mann et al. (2003, p. 333), sousveillance is a type of reflectionism in which individuals can actively respond to bureaucratic monitoring and to an extent “neutralize surveillance”. Sousveillance can thus be employed in response to social control in that surveillance activities are reversed:

“The surveilled become sousveillers who engage social controllers (customs officials, shopkeepers, customer service personnel, security guards, etc.) by using devices that mirror those used by these social controllers” (Mann et al. 2003, p. 337).

Sousveillance differs from surveillance in that traditional surveillance activities are “centralised” and “localized”, whereas sousveillance is dispersed in nature and “delocalized” in its global coverage (Ganascia 2010, p. 496). As such, sousveillance requires new metaphors for understanding its fundamental aspects. A useful metaphor proposed by Ganascia (2010, p. 496) for describing sousveillance is the canopticon, which can be contrasted with the panopticon metaphor. At the heart of the canopticon are the following principles:

“total transparency of society, fundamental equality, which gives everybody the ability to watch – and consequently to control – everybody else, [and] total communication, which enables everyone to exchange with everyone else” (Ganascia 2010, p. 497).

This exchange may include the dissemination of location details, thus signalling the need to incorporate sousveillance into LBS regulatory discussions. A noteworthy element of sousveillance is that it shifts the ability to control from the state/institution (surveillance) to the individual. While this can initially be perceived as an empowering feature, excessive amounts of control, if unchecked, may prove detrimental. That is, control may be granted to individuals to disseminate their location (and other) information, or the information of others, without the necessary precautions in place and in an unguarded fashion. The implications of this exercise are sinister in their extreme forms. When considered within the context of IoT, sousveillance ideals are likely to be compromised. Yes, one can fight back against state control and Big Brother with sousveillance, but in doing so one potentially unleashes a thousand or more ‘little brothers’, each with the capacity to (mis)use the information being gathered.

1.2.13 Towards überveillance

The concepts of surveillance, dataveillance and sousveillance have been examined with respect to their association with location services in an IoT world. It is therefore valuable, at this point, to introduce the related notion of überveillance. Überveillance, a term coined by M.G. Michael in 2006, can be described as “an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices in the human body” (Michael et al. 2006b; Macquarie Dictionary 2009, p. 1094). Überveillance combines the dimensions of identification, location and time, potentially allowing for forecasting and uninterrupted real-time monitoring (Michael and Michael 2007, pp. 9-10), and in its extreme forms can be regarded as “Big Brother on the inside looking out” (p. 10).

Überveillance is considered by several authors to be the contemporary notion that will supplant surveillance. For instance, Clarke (2007a, p. 27) suggests that the concept of surveillance is somewhat outdated and that contemporary discussions should focus on the notion of überveillance. It has further been suggested that überveillance is built on the existing notion of dataveillance. That is, “[ü]berveillance takes that which was static or discrete in the dataveillance world, and makes it constant and embedded” (Michael and Michael 2007, p. 10). The move towards überveillance thus marks the evolution from physical, visual forms of monitoring (surveillance), through to increasingly sophisticated and ubiquitous embedded chips (überveillance) (Michael and Michael 2010; Gagnon et al. 2013). Albrecht and McIntyre (2005), who describe these embedded chips as “spychips”, focused predominantly on the RFID tracking of people through retail goods and services, and devote considerable space to the Internet of Things concept. Perakslis and Wolk (2006) studied the social acceptance of RFID implants as a security method, and Perakslis later went on to incorporate überveillance into her research on behavioural motivators and personality factors in the adoption of humancentric IoT applications.

Given that überveillance is an emerging term (Michael and Michael 2007, p. 9), diverse interpretations have been proposed. For example, Clarke (2007a) offers varying definitions of the term, suggesting that überveillance can be understood as any of the following: omni-surveillance, an apocalyptic notion that “applies across all space and all time (omnipresent), and supports some organisation that is all-seeing and even all-knowing (omniscient)”, which can be achieved through the use of embedded chips for instance (p. 33); exaggerated surveillance, referring to “the extent to which surveillance is undertaken... its justification is exaggerated” (p. 34); and/or meta-, supra-, or master-surveillance, which “could involve the consolidation of multiple surveillance threads in order to develop what would be envisaged by its proponents to be superior information” (p. 38). Shay et al. (2012) acknowledge:

“The pervasive nature of sensors coupled with recent advances in data mining, networking, and storage technologies creates tools and data that, while serving the public good, also create a ubiquitous surveillance infrastructure ripe for misuse. Roger Clarke’s concept of dataveillance and M.G. Michael and Katina Michael’s more recent uberveillance serve as important milestones in awareness of the growing threat of our instrumented world.”

All of these definitions indicate direct ways in which IoT applications can be rolled out, whether for vehicle management in heavy traffic conditions, the tracking of suspects in a criminal investigation, or even of employees in a workplace. Disturbing is the manner in which a whole host of applications, particularly in tollways and public transportation, are being used for legal purposes without the knowledge of the driver or commuter. “Tapping” token cards is not only encouraged but mandatory at most metropolitan train stations in developed countries. Little do commuters know that the data gathered by these systems can be requested by a host of government agencies without a warrant.

1.2.14 Implications of überveillance on control

Irrespective of interpretation, current scholarly debate concerns the implications of überveillance for individuals in particular, and society in general. In an article discussing the evolution of automatic identification (auto-ID) techniques, Michael and Michael (2005) present an account of the issues associated with implantable technologies in humancentric applications. The authors note the evident trend of deploying a technology into the marketplace prior to assessing its potential consequences (Michael and Michael 2005, pp. 22-33). This reactive approach causes particular apprehension in view of chip implants, given that once a chip is accepted by the body it is impossible to remove without an invasive surgical procedure, as summarised in the following excerpt:

“[U]nless the implant is removed within a short time, the body will adopt the foreign object and tie it to tissue. At this moment, there will be no exit strategy, no contingency plan, it will be a life enslaved to upgrades, virus protection mechanisms, and inescapable intrusion” (Michael and Michael 2007, p. 18).

Other concerns relevant to this investigation have also been raised. It is indicated that “über-intrusive technologies” are likely to leave substantial impressions on individuals, families and other social relations, with the added potential of affecting psychological well-being (Michael and Michael 2007, p. 17). Apart from implications for individuals, concerns requiring remedies also emerge at the broader social level. For instance, if a state of überveillance is to be avoided, caution must be exercised in deploying technologies without due reflection on the corresponding implications. Namely, this will involve the introduction of appropriate regulatory measures, encompassing proactive consideration of the social implications of emerging technologies, with individuals assuming responsibility for promoting regulatory measures (Michael and Michael 2007, p. 20). It will also require a measured attempt to achieve some form of “balance” (Clarke 2007a, p. 43). The implications of überveillance are of particular relevance to LBS regulatory discussions, given that “overarching location tracking and monitoring is leading toward a state of überveillance” (Michael and Michael 2011, p. 2). As such, research into LBS regulation in Australia must be sensitive to both the significance of LBS to überveillance and the anticipated trajectory of the latter.

Unfortunately the same cannot be said for IoT-specific regulation. The IoT is a fluid and in many ways nebulous concept. It is made up of a host of technologies that are being integrated and converging over time, layer upon layer of infrastructure stretching from the inception of the first telephone lines to today’s cloud and wireless Internet. The IoT requires new protocols and new applications, but it is difficult to point to a specific technology, application or system that can be made subject to some form of external oversight. Herein lie the problems of potential unauthorised disclosure of data, or even misuse of data when government agencies require private enterprise to act upon their requests, or when private enterprises work together in sophisticated ways to exploit the consumer.

Comparing the different forms of ‘veillance’ 1.2.15

Various terms ending in ‘veillance’ have been introduced throughout this paper, all of which imply and encompass the process of monitoring. Prior to delving into the dangers of this activity and the significance of LBS monitoring to the theme of control, it is helpful to compare the main features of each term. A comparison of surveillance, dataveillance, sousveillance and überveillance is provided in Table 2.

It should be noted that with the increased use of techniques such as surveillance, dataveillance, sousveillance and überveillance, the threat of becoming a surveillance society looms. According to Ganascia (2010, p. 491), a surveillance society is one in which the data gathered from the aforementioned techniques is utilised to exert power and control over others. This results in dangers such as the potential for identification and profiling of individuals (Clarke 1997), the latter of which can be associated with social sorting (Gandy 1993).

Table 2: Comparison of the different forms of ‘veillance’

Identification 1.2.16

Identity and identification are ambiguous terms with philosophical and psychological connotations (Kodl and Lokay 2001, p. 129). Identity can be perceived as “a particular presentation of an entity, such as a role that the entity plays in particular circumstances” (Clarke and Wigan 2011). With respect to information systems, human identification specifically (as opposed to object identification) is therefore “the association of data with a particular human being” (Kodl and Lokay 2001, pp. 129-130). Kodl and Lokay (2001, pp. 131-135) claim that numerous methods exist to identify individuals prior to performing a data linkage, namely, using appearance, social interactions/behaviours, names, codes and knowledge, amongst other techniques. With respect to LBS, these identifiers significantly contribute to the dangers pertaining to surveillance, dataveillance, sousveillance and überveillance. That is, LBS can be deployed to simplify and facilitate the process of tracking and be used for the collection of profile data that can potentially be linked to an entity using a given identification scheme. In a sense, LBS in their own right become an additional form of identification feeding the IoT scheme (Michael and Michael, 2013).
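Kodl and Lokay’s notion of identification as “the association of data with a particular human being” can be illustrated with a minimal sketch (all device codes, names and records below are hypothetical): otherwise anonymous location records, keyed only by a device code, become personal data the moment a register linking code to person is available.

```python
# Hypothetical illustration of human identification via data linkage:
# a device identifier links otherwise anonymous location records to a person.

# Register associating device codes with named individuals (assumed data)
register = {"dev-041": "A. Citizen", "dev-112": "B. Example"}

# Location records collected by an LBS, keyed only by device code
location_log = [
    {"device": "dev-041", "lat": -34.4278, "lon": 150.8931, "t": "09:00"},
    {"device": "dev-041", "lat": -34.4051, "lon": 150.8784, "t": "12:30"},
    {"device": "dev-112", "lat": -33.8688, "lon": 151.2093, "t": "10:15"},
]

def link_records(log, register):
    """Attach a personal identity to each location record."""
    return [
        {**rec, "person": register.get(rec["device"], "unknown")}
        for rec in log
    ]

linked = link_records(location_log, register)
```

The sketch makes the point of the passage concrete: the location log alone identifies no one, yet a single auxiliary identifier suffices to turn it into a tracking record of a named individual.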

Thus, in order to address the regulatory concerns pertaining to LBS, it is crucial to appreciate the challenges regarding the identification of individuals. Of particular importance is the recognition that once an individual has been identified, they can be subjected to varying degrees of control. As such, in any scheme that enables identification, Kodl and Lokay (2001, p. 136) note the need to balance human rights with other competing interests, particularly given that identification systems may be exploited by powerful entities for control purposes, such as by governments to exercise social control. For an historical account of identification techniques, from manual methods through to automatic identification systems including those built on LBS, see Michael and Michael (2009, pp. 43-60). Civil libertarians and concerned individuals have asserted, in views “that have been put forward since at least the 1970s”, that automatic identification (auto-ID) technology “impinges on human rights, the right to privacy, and that eventually it will lead to totalitarian control of the populace” (Michael and Michael 2009, p. 364). These views are also pertinent to the notion of social sorting.

Social sorting 1.2.17

In relation to the theme of control, information derived from surveillance, dataveillance, sousveillance and überveillance techniques can also serve the purpose of social sorting, labelled by Oscar Gandy (1993, p. 1) as the “panoptic sort.” Relevant to this discussion, the information may relate to an individual’s location. In Gandy’s influential work The Panoptic Sort: A Political Economy of Personal Information, the author relies on the work of Michel Foucault and other critical theorists (refer to pp. 3-13) in examining the panoptic sort as an “antidemocratic system of control” (Gandy 1993, p. 227). According to Gandy, in this system, individuals are exposed to prejudiced forms of categorisation based on both economic and political factors (pp. 1-2). Lyon (1998, p. 94) describes the database management practices associated with social sorting, classing them as a form of consumer surveillance, in which customers are grouped by “social type and location.” Such clustering forms the basis for the exclusion and marginalisation of individuals (King 2001, pp. 47-49). As a result, social sorting is presently used for profiling of individuals and in the market research realm (Bennett and Regan 2004, p. 452).

Profiling 1.2.18

Profiling “is a technique whereby a set of characteristics of a particular class of person is inferred from past experience, and data-holdings are then searched for individuals with a close fit to that set of characteristics” (Clarke 1993). The process is centred on the creation of a profile or model related to a specific individual, based on data aggregation processes (Casal 2004, p. 108). Assorted terms have been employed in labelling this profile. For instance, the model created of an individual using the data collected through dataveillance techniques has been referred to by Clarke (1997) as “the digital persona”, and is related to the “digital dossiers” idea introduced by Solove (2004, pp. 1-7). According to Clarke (1994), the use of networked systems, namely the internet, involves communicating and exposing data and certain aspects of, at times, recognisable behaviour, both of which are utilised in the creation of a personality.
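Clarke’s definition of profiling can be sketched as a simple matching procedure. The characteristics, records and fit threshold below are illustrative assumptions, not drawn from Clarke’s work: a profile is defined as a set of characteristics, and data holdings are then searched for records that closely fit it.

```python
# Minimal sketch of profiling per Clarke's definition: define a characteristic
# set, then search data holdings for individuals with a close fit to it.

profile = {"suburb": "inner-city", "age_band": "25-34", "visits_gym": True}

data_holdings = [
    {"id": 1, "suburb": "inner-city", "age_band": "25-34", "visits_gym": True},
    {"id": 2, "suburb": "rural", "age_band": "55-64", "visits_gym": False},
    {"id": 3, "suburb": "inner-city", "age_band": "25-34", "visits_gym": False},
]

def fit_score(record, profile):
    """Fraction of profile characteristics the record matches."""
    matches = sum(record.get(k) == v for k, v in profile.items())
    return matches / len(profile)

def close_fits(records, profile, threshold=0.66):
    """Return records whose fit to the profile meets the threshold."""
    return [r for r in records if fit_score(r, profile) >= threshold]

hits = close_fits(data_holdings, profile)
```

Even this toy version shows why the technique raises control concerns: individuals who merely resemble the profile (record 3) are swept up alongside exact matches.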

Digital personas and dossiers 1.2.19

The resulting personality is referred to as the digital persona. Similarly, digital dossiers refer to the compilation of comprehensive electronic data related to an individual, utilised in the creation of the “digital person” (Solove 2004, p. 1), also referred to as “digital biographies” (Solove 2002, p. 1086). In examining the need for LBS regulation throughout the globe, a given regulatory response or framework must appreciate the ease with which (past, present and future) location information can be compiled and integrated into an individual’s digital persona or dossier. Once such information is reproduced and disseminated, the control implications are magnified.
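The ease with which location information can be compiled into a digital dossier can be shown with a brief hypothetical sketch: timestamped sightings drawn from several sources (all names and records invented here) are merged into a single chronological record per person.

```python
from collections import defaultdict

# Hypothetical sketch: compiling location sightings from multiple sources
# into a per-person "digital dossier" (all data and source names invented).

sightings = [
    ("alice", "2012-03-01T09:00", "transit-card"),
    ("bob",   "2012-03-01T09:10", "cctv"),
    ("alice", "2012-03-01T12:45", "phone-gps"),
    ("alice", "2012-03-02T08:55", "transit-card"),
]

def compile_dossiers(sightings):
    """Group sightings by person and sort each dossier chronologically."""
    dossiers = defaultdict(list)
    for person, timestamp, source in sightings:
        dossiers[person].append((timestamp, source))
    return {p: sorted(events) for p, events in dossiers.items()}

dossiers = compile_dossiers(sightings)
```

A few lines of aggregation suffice to turn scattered observations into a movement biography, which is precisely the ease of compilation the passage argues regulators must appreciate.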

With respect to the theme of control, an individual can exercise a limited amount of influence over their digital persona, as some aspects of creating an electronic personality may not be within their direct control. The scope of this article does not allow for reflection on the digital persona in great detail; however, Clarke (1994) offers a thorough investigation of the term, and of associated notions such as the passive and active digital persona, in addition to the significance of the digital persona to dataveillance techniques such as computer matching and profiling. Of particular significance to this research is the distinction between the physical and the digital persona and the resultant implications in relation to control, as summarised in the following extract:

“The physical persona is progressively being replaced by the digital persona as the basis for social control by governments, and for consumer marketing by corporations. Even from the strictly social control and business efficiency perspectives, substantial flaws exist in this approach. In addition, major risks to individuals and society arise” (Clarke 1994).

The same sentiments apply with respect to digital dossiers. In particular, Solove (2004, p. 2) notes that individuals are unaware of the ways in which their electronic data is exploited by government and commercial entities, and “lack the power to do much about it.” It is evident that profile data is advantageous for both social control and commercial purposes (Clarke 2001d, p. 12), the latter of which is associated with market research and sorting activities, which have evolved from ideas of “containment” of mobile consumer demand to the “control” model (Arvidsson 2004, pp. 456, 458-467). The control model in particular has been strengthened, but not solely driven, by emerging technologies including LBS, as explained:

“The control paradigm thus permits a tighter and more efficient surveillance that makes use of consumer mobility rather than discarding it as complexity. This ability to follow the consumer around has been greatly strengthened by new technologies: software for data mining, barcode scans, internet tracking devices, and lately location based information from mobile phones” (Arvidsson 2004, p. 467).

Social sorting, particularly for profiling and market research purposes, thus introduces numerous concerns relating to the theme of control, one of which is the ensuing consequences relating to personal privacy. This specifically includes the privacy of location information. In sum, examining the current regulatory framework for LBS in Australia, and determining the need for LBS regulation, necessitates an appreciation of the threats associated with social sorting using information derived from LBS solutions. Additionally, the benefits and risks associated with surveillance, dataveillance, sousveillance and überveillance for control must be measured and carefully contemplated in the proposed regulatory response.

Trust 1.3

Trust is a significant theme relating to LBS, given the importance of the notion to: (a) “human existence” (Perusco et al. 2006, p. 93; Perusco and Michael 2007, p. 10), (b) relationships (Lewis and Weigert 1985, pp. 968-969), (c) intimacy and rapport within a domestic relationship (Boesen et al. 2010, p. 65), and (d) LBS success and adoption (Jorns and Quirchmayr 2010, p. 152). Trust can be defined, in general terms, as the “firm belief in the reliability, truth, or ability of someone or something” (Oxford Dictionary 2012b). A definition of trust that has been widely cited in relevant literature is “the willingness of a party to be vulnerable to the actions of another party based on the expectation that the other will perform a particular action important to the trustor, irrespective of the ability to monitor or control that other party” (Mayer et al. 1995, p. 712). Related to electronic relationships or transactions, the concept has been defined as the “confident reliance by one party on the behaviour of other parties” (Clarke 2001c, p. 291), and it has been suggested that in the electronic-commerce domain, in particular, trust is intimately associated with the disclosure of information (Metzger 2004).

In reviewing literature concerning trust, Fusco et al. (2011, p. 2) claim that trust is typically described as a dynamic concept falling into the categories of cognitive (evidence-based), emotional (faith-based), and/or behavioural (conduct-based) trust. For further reading, the major sources on trust can be found in: Lewis and Weigert’s (1985) sociological treatment of trust; the influential work of Mayer et al. (1995) and the authors’ updated work, Schoorman et al. (2007), centred on organisational trust; Weckert’s (2000) comprehensive review of trust in the context of workplace monitoring using electronic devices; research on trust in electronic commerce (refer to McKnight and Chervany 2001; Pavlou 2003; Kim et al. 2009) and mobile commerce (see Siau and Shen 2003; Yeh and Li 2009); the work of Valachich (2003) that introduces and evaluates trust in terms of ubiquitous computing environments; Dwyer et al.’s (2007) article on trust and privacy issues in social networks; Yan and Holtmanns’ (2008) examination of issues associated with digital trust management; the work of Chen et al. (2008) covering the benefits and concerns of LBS usage, including privacy and trust implications; and the research by Junglas and Spitzmüller (2005) that examines privacy and trust issues concerning LBS by presenting a research model that incorporates these aspects amongst others.

For the purpose of this paper, the varying definitions and categorisations are acknowledged. However, trust will be assessed in terms of the relationships dominating existing LBS/IoT scholarship which comprise the government-citizen relationship centred on trust in the state, the business-consumer relationship associated with trust in corporations/LBS providers, and the consumer-consumer relationship concerned with trust in individuals/others.

Trust in the state 1.3.1

Trust in the state broadly covers LBS solutions implemented by government, thus representing the government-citizen relationship. Dominating current debates and literature are LBS government initiatives in the form of emergency management schemes, in conjunction with national security applications utilising LBS, which depending on the nature of their implementation may impact on citizens’ trust in the state. These concerns are typically expressed as a trade-off between security and civil liberties. At present there are very few examples of fully-fledged IoT systems to point to, although quasi-IoT systems are increasingly being deployed using wireless sensor networks of varying kinds, e.g. for bushfire management and for fisheries. These systems do not include a direct human stakeholder but are still relevant, as they may trigger flow-on effects that do impact the citizenry.

Balancing trust and privacy in emergency services 1.3.2

In the context of emergency management, Aloudat and Michael (2011, p. 58) maintain that the dominant theme between government and consumers in relation to emergency warning messages and systems is trust. This includes trust in the LBS services being delivered and in the government itself (Aloudat and Michael 2011, p. 71). While privacy is typically believed to be the leading issue confronting LBS, in emergency and life-threatening situations it is overwhelmed by trust-related challenges, given that users are generally willing to relinquish their privacy in the interest of survival (Aloudat and Michael 2010, p. 2). Furthermore, the success of these services is reliant on trust in the technology, the service, and the accuracy/reliability/timeliness of the emergency alert. On the whole, this success can be measured in terms of citizens’ confidence in their government’s ability to sensibly select and implement a fitting emergency service utilising enhanced LBS features. In a paper that examines the deployment of location services in Dutch public administration, van Ooijen and Nouwt (2009, p. 81) assess the impact of government-based LBS initiatives on the government-citizen relationship, recommending that governments employ care in gathering and utilising location-based data about the public, to ensure that citizens' trust in the state is not compromised.

Trust-related implications of surveillance in the interest of national security 1.3.3

Trust is also prevalent in discussions relating to national security. National security has been regarded as a priority area by many countries for over a decade, and as such has prompted the implementation of surveillance schemes by government. Wigan and Clarke (2006, p. 392) discuss the dimension of trust as a significant theme contributing to the social acceptance of a particular government surveillance initiative, which may incorporate the location and tracking of individuals and objects. The implementation of surveillance systems by the state, including those incorporating LBS, can diminish the public’s confidence in the state, given the potential for such mechanisms to be perceived as a form of authoritarian control. Nevertheless, a situation where national security and safety are considered to be in jeopardy may entail (partial) acceptance of various surveillance initiatives that would otherwise be perceived as objectionable. In such circumstances, trust in government plays a crucial role in determining individuals’ willingness to compromise various civil liberties. This is explained by Davis and Silver (2004, p. 35) below:

“The more people trust the federal government or law enforcement agencies, the more willing they are to allow the government leeway in fighting the domestic war on terrorism by conceding some civil liberties.”

However, it is expected that in due course such increased security measures (even if initially supported by citizens) will yield a growing gap between government and citizens, “potentially dampening citizen participation in government and with it reducing citizens’ trust in public institutions and officials” (Gould 2002, p. 77). This is because, as the perceived degree of threat recedes and trust in government diminishes, the public becomes reluctant to surrender its rights for the sake of security (Sanquist et al. 2008, p. 1126). In order to build and maintain trust, governments are required to actively develop strategies that build confidence both in their abilities and in the technology under consideration, and are challenged to recognise “the massive harm that surveillance measures are doing to public confidence in its institutions” (Wigan and Clarke 2006, p. 401). It has been suggested that a privacy impact assessment (PIA) aids in establishing trust between government and citizens (Clarke 2009, p. 129). Carefully considered legislation is an alternative technique to enhance levels of trust. With respect to LBS, governments are responsible for proposing and enacting regulation that is in the best interest of citizens, incorporating citizen concerns into this process and encouraging suitable design of LBS applications, as explained in the following quotation:

“...new laws and regulations must be drafted always on the basis of citizens’ trust in government authorities. This means that citizens trust the government to consider the issues at stake according to the needs and wishes of its citizens. Location aware services can influence citizens’ trust in the democratic society. Poorly designed infrastructures and services for storing, processing and distributing location-based data can give rise to a strong feeling of being threatened. Whereas a good design expands the feeling of freedom and safety, both in the private and in the public sphere/domain” (Beinat et al. 2007, p. 46).

One of the biggest difficulties that will face stakeholders is identifying when current LBS systems become a part of bigger IoT initiatives. Major changes in systems will require a re-evaluation of impact assessments of different types.

Need for justification and cultural sensitivity 1.3.4

Such techniques will fail to be embraced, however, if surveillance schemes lack adequate justification at the outset, as trust is threatened by “absence of justification for surveillance, and of controls over abuses” (Wigan and Clarke 2006, p. 389). From a government perspective, this situation may prove detrimental, as Wigan and Clarke (2006, p. 401) claim that transparency and trust are prerequisites for ensuring public confidence in the state, noting that “[t]he integrity of surveillance schemes, in transport and elsewhere, is highly fragile.” Aside from adequate justification of surveillance schemes, cultural differences associated with the given context need to be acknowledged as factors influencing the level of trust citizens hold in government. As explained by Dinev et al. (2005, p. 3) in their cross-cultural study of American and Italian Internet users’ privacy and surveillance concerns, “[a]ttitudes toward government and government initiatives are related to the culture’s propensity to trust.” In comparing the two contexts, Dinev et al. claim that Americans readily accept government surveillance in exchange for increased levels of security, whereas Italians’ low levels of trust in government result in opposing viewpoints (pp. 9-10).

Trust in corporations/LBS/IoT providers 1.3.5

Trust in corporations/LBS/IoT providers emerges from the level of confidence a user places in an organisation and its respective location-based solution(s), and corresponds to the business-consumer relationship. In the context of consumer privacy, Culnan and Bies (2003, p. 327) assert that perceived trust in an organisation is closely linked to the extent to which an organisation’s practices are aligned with its policies. A breach of this trust affects the likelihood of personal information disclosure in the future (Culnan and Bies 2003, p. 328), given the value of trust in sustaining lasting customer relationships (p. 337). Reducing this “trust gap” (Culnan and Bies 2003, pp. 336-337) is a defining element for organisations in achieving economic and industry success, as it may influence a consumer’s decision to permit location data usage (Chen et al. 2008, p. 34). Reducing this gap requires that control over location details remain with the user, as opposed to the LBS provider or network operator (Giaglis et al. 2003, p. 82). Trust can thus emerge from a user’s perception that they are in command (Junglas and Spitzmüller 2005, p. 3).

Küpper and Treu (2010, pp. 216-217) concur with these assertions, explaining that the lack of uptake of first-generation LBS applications was chiefly a consequence of the dominant role of the network operator over location information. This situation has been somewhat rectified since the introduction of GPS-enabled devices capable of determining location information without input from the network operator, and the greater emphasis on a user-focussed model (Bellavista et al. 2008, p. 85; Küpper and Treu 2010, p. 217). Trust, however, is not exclusively concerned with a network operator’s ability to determine location information, but also with the possible misuse of location data. As such, trust has also been framed as a potential resolution to location data misappropriation, explained further by Jorns and Quirchmayr (2010, p. 152) in the following excerpt:

“The only way to completely avoid misuse is to entirely block location information, that is, to reject such services at all. Since this is not an adequate option... trust is the key to the realization of mobile applications that exchange sensitive information.”

There is much to learn from the covert and overt location tracking of subscribers by large corporations. The dubious practice of retaining location information by information and communication technology giants Google, Apple and Microsoft is increasingly being reported, with only small penalties being applied in countries in the European Union and Asia. More disturbing in this trend is that even smaller suppliers of location-based applications are beginning to unleash unethical (but seemingly not illegal) solutions at shopping malls and other campus-based locales (Michael & Clarke 2013).

Importance of identity and privacy protection to trust 1.3.6

In delivering trusted LBS solutions, Jorns and Quirchmayr (2010, pp. 151-155) further claim that identity and privacy protection are central considerations that must be built into a given solution, proposing an LBS architecture that integrates such safeguards. That is, identity protection may involve the use of false dummies, dummy users and landmark objects, while privacy protection generally relies on decreasing the resolution of location data, employing supportive regulatory techniques and ensuring anonymity and pseudonymity (Jorns and Quirchmayr 2010, p. 152). Similarly, and with respect to online privacy, Clarke (2001c, p. 297) suggests that an adequate framework must be introduced that “features strong and comprehensive privacy laws, and systematic enforcement of those laws.” These comments, also applicable to LBS in a specific sense, were made in the context of economic rather than social relationships, referring primarily to government and corporations, but are also relevant to trust amongst social relations.
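Two of the privacy safeguards mentioned by Jorns and Quirchmayr, decreasing the resolution of location data and hiding the true position among dummy locations, can be sketched as follows. The rounding precision, jitter magnitude and dummy count are illustrative choices, not taken from their proposed architecture:

```python
import random

def coarsen(lat, lon, decimals=2):
    """Reduce location resolution by rounding coordinates
    (2 decimal places is roughly kilometre-level precision)."""
    return round(lat, decimals), round(lon, decimals)

def with_dummies(true_location, n_dummies=3, jitter=0.5, rng=None):
    """Hide the true location among randomly offset dummy locations,
    so an observer cannot tell which report is genuine."""
    rng = rng or random.Random()
    lat, lon = true_location
    dummies = [
        (lat + rng.uniform(-jitter, jitter), lon + rng.uniform(-jitter, jitter))
        for _ in range(n_dummies)
    ]
    reports = dummies + [true_location]
    rng.shuffle(reports)
    return reports

coarse = coarsen(-34.42781, 150.89313)
reports = with_dummies(coarse, rng=random.Random(42))
```

The design choice in both functions is the same trade-off the surrounding discussion raises: each safeguard degrades the utility of the service in exchange for reducing what a provider or eavesdropper can learn about the user.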

It is important to recognise that issues of trust are closely related to privacy concerns from the perspective of users. In an article titled “Trust and Transparency in Location-Based Services: Making Users Lose their Fear of Big Brother”, Böhm et al. (2004, pp. 1-3) claim that operators and service providers are charged with the difficult task of earning consumer trust, and that this may be achieved by addressing user privacy concerns and adhering to relevant legislation. Additional studies also point to the relationship between trust and privacy, claiming that trust can aid in reducing the perceived privacy risk for users. For example, Xu et al. (2005, p. 905) suggest that enhancing trust can reduce the perceived privacy risk, which in turn influences a user’s decision to disclose information, and that “service provider’s interventions including joining third party privacy seal programs and introducing device-based privacy enhancing features could increase consumers’ trust beliefs and mitigate their privacy risk perceptions”. Chellappa and Sin (2005, pp. 188-189), in examining the link between trust and privacy, express the importance of trust-building factors, which include a consumer’s familiarity and previous experience with the organisation.

Maintaining consumer trust 1.3.7

The primary consideration in relation to trust in the business-consumer relationship is that all efforts be targeted at establishing and building trust in corporations and LBS/IoT providers. Once trust has been compromised, it is exceedingly difficult to repair, a point applicable to trust in any context. This point is explained by Kaasinen (2003, p. 77) in an interview-based study regarding user requirements in location-aware mobile applications:

“The faith that the users have in the technology, the service providers and the policy-makers should be regarded highly. Any abuse of personal data can betray that trust and it will be hard to win it back again.”

Trust in individuals/others 1.3.8

Trust in the consumer-to-consumer setting is determined by the level of confidence existing between an individual and their social relations, which may include friends, parents, other family members, employers and strangers, categories that are adapted from Levin et al. (2008, pp. 81-82). Yan and Holtmanns (2008, p. 2) express the importance of trust for social interactions, claiming that “[s]ocial trust is the product of past experiences and perceived trustworthiness.” It has been suggested that LBS monitoring can erode trust between the individual engaged in monitoring and the subject being monitored, as the very act implies that trust is lacking in a given relationship (Perusco et al. 2006, p. 93). These concerns are echoed in Michael et al. (2008). Previous studies relevant to LBS and trust generally focus on: the workplace situation, that is, trust between an employer and their employee; trust amongst ‘friends’ subscribed to a location-based social networking (LBSN) service, which may include any of the predefined categories above; and the tracking of family members, such as children, for safety and protection purposes, and the related trust implications.

Consequences of workplace monitoring 1.3.9

With respect to trust in an employer’s use of location-based applications and location data, a prevailing subject in existing literature is the impact of employee monitoring systems on staff. For example, in studying the link between electronic workplace monitoring and trust, Weckert (2000, p. 248) reported that trust is a significant issue arising from excessive monitoring, in that monitoring may contribute to the deterioration of professional work relationships between an employer and their employee and consequently reduce or eliminate trust. Weckert’s work reveals that employers often justify electronic monitoring on the argument that the “benefits outweigh any loss of trust”, and may include gains for the involved parties; notably, for the employer in the form of economic benefits, for the employee to encourage improvements in performance and productivity, and for the customer who may experience enhanced customer service (p. 249). Chen and Ross (2005, p. 250), on the other hand, argue that an employer’s decision to monitor their subordinates may be related to a low degree of existing trust, which could be a result of unsuitable past behaviour on the part of the employee. As such, employers may perceive monitoring as necessary in order to manage employees. Alternatively, from the perspective of employees, trust-related issues materialise as a result of monitoring, which may affect job attitudes, including satisfaction and dedication, as covered in a paper by Alder et al. (2006) in the context of internet monitoring.

When applied to location monitoring of employees using LBS, the trust-related concerns expressed above are indeed warranted. Particularly, Kaupins and Minch (2005, p. 2) argue that the appropriateness of location monitoring in the workplace can be measured from either a legal or ethical perspective, which inevitably results in policy implications for the employer. The authors emphasise that location monitoring of employees can often be justified in terms of the security, productivity, reputational and protective capabilities of LBS (Kaupins and Minch 2005, p. 5). However, Kaupins and Minch (2005, pp. 5-6) continue to describe the ethical factors “limiting” location monitoring in the workplace, which entail the need for maintaining employee privacy and the restrictions associated with inaccurate information, amongst others. These factors will undoubtedly affect the degree of trust between an employer and employee.

However, the underlying concern relevant to this discussion of location monitoring in the workplace is not only the suitability of employee monitoring using LBS. While this is a valid issue, the challenge remains centred on the deeper trust-related consequences. Regardless of the technology or applications used to monitor employees, it can be concluded that a work atmosphere lacking trust results in sweeping consequences that extend beyond the workplace, expressed in the following excerpt:

“A low trust workplace environment will create the need for ever increasing amounts of monitoring which in turn will erode trust further. There is also the worry that this lack of trust may become more widespread. If there is no climate of trust at work, where most of us spend a great deal of our life, why should there be in other contexts? Some monitoring in some situations is justified, but it must be restricted by the need for trust” (Weckert 2000, p. 250).

Location-monitoring amongst friends 1.3.10

These concerns are certainly applicable to the use of LBS applications amongst other social relations. Recent literature merging the concepts of LBS, online social networking and trust is particularly focused on the use of LBSN applications amongst various categories of friends. For example, Fusco et al.’s (2010) qualitative study examines the impact of LBSN on trust amongst friends, employing a focus group methodology in achieving this aim. The authors reveal that trust may suffer as a consequence of LBSN usage in several ways: disclosure of location information and potential monitoring activities can lead to misuse of the application in order to conceal things; excessive questioning can cause a deterioration in trust amongst social relations; and trust may come to be placed in the application rather than in the friend (Fusco et al. 2010, p. 7). Further information relating to Fusco et al.’s study, particularly the manner in which LBSN applications adversely impact on trust, can be found in a follow-up article (Fusco et al. 2011).

Location tracking for protection 1.3.11

It has often been suggested that monitoring in familial relations can offer a justified means of protection, particularly for vulnerable individuals such as Alzheimer's or dementia sufferers and children. With specific reference to the latter, trust emerges as a central theme in child tracking. Boesen et al. (2010) evaluate location tracking in families, including the manner in which LBS applications are incorporated within the familial context. Their qualitative study revealed that participants' initial decision to track their children stemmed from a lack of existing trust within the given relationship, with participants reporting an improvement in their children's behaviour after a period of tracking (Boesen et al. 2010, p. 70). Boesen et al., however, warn of the trust-related consequences, claiming that "daily socially-based trusting interactions are potentially replaced by technologically mediated interactions" (p. 73). Lack of trust in a child is considered detrimental to their growth, and raising a child to believe they are untrustworthy through the use of technology, specifically location-monitoring applications, may have long-term implications. The importance of trust to the growth of a child and the dangers associated with ubiquitous forms of supervision are explained in the following excerpt:

“Trust (or at least its gradual extension as the child grows) is seen as fundamental to emerging self-control and healthy development... Lack of private spaces (whether physical, personal or social) for children amidst omni-present parental oversight may also create an inhibiting dependence and fear” (Marx and Steeves 2010, p. 218).

Furthermore, location tracking of children and other individuals in the name of protection may produce undesirable and contradictory consequences for trust. Barreras and Mathur (2007, p. 182), in an article describing the advantages and disadvantages of wireless location tracking, argue that technologies originally intended to protect family members (notably children, and other social relations such as friends and employees) can erode trust and come to be regarded as "unnecessary surveillance." Reduced levels of trust may also prove "counterproductive" if individuals deactivate the tracking capabilities, rendering them incapable of seeking assistance in actual emergency situations (Barreras and Mathur 2007, p. 182).

LBS/IoT is a ‘double-edged sword’ 1.3.12

In summary, location monitoring and tracking by the state, corporations and individuals is often justified in terms of the benefits delivered both to the party responsible for the monitoring/tracking and to the subject being tracked. Junglas and Spitzmüller (2005, p. 7) accordingly claim that location-based services are a "double-edged sword": they can aid in the performance of tasks in one instance, but may also generate Big Brother concerns. Furthermore, Perusco and Michael (2007, p. 10) note the linkage between trust and freedom. Perusco et al. (2006, p. 97) therefore suggest a number of questions that must be considered in the context of LBS and trust: "Does the LBS context already involve a low level of trust?"; "If the LBS context involves a moderate to high level of trust, why are LBS being considered anyway?"; and "Will the use of LBS in this situation be trust-building or trust-destroying?" In answering these questions, the implications of LBS/IoT monitoring for trust must be appreciated: they are significant, irreparable, and closely tied to what is considered the central challenge in the LBS domain, privacy.

This paper has provided comprehensive coverage of the themes of control and trust with respect to the social implications of LBS. The subsequent discussion will extend the examination to cover LBS in the context of the IoT, providing an ethical analysis and stressing the importance of a robust socio-ethical framework.

Discussion 1.4

The Internet of Things (IoT) and LBS: extending the discussion on control and trust 1.4.1

The Internet of Things (IoT) is an encompassing network of connected intelligent "things", "comprised of smart machines interacting and communicating with other machines, objects, environments and infrastructures" (Freescale Semiconductor Inc. and ARM Inc. 2014, p. 1). The phrase was originally coined by Kevin Ashton in 1999, and a definitive definition is yet to be agreed upon (Ashton 2009, p. 1; Kranenburg and Bassi 2012, p. 1). Related terms are often used interchangeably, such as the Internet of Everything, the Internet of Things and People, and the Web of Things and People. The IoT can, however, be described in terms of its core characteristics and the features it encompasses. At the crux of the IoT concept is the integration of the physical and virtual worlds, and the capability for "things" within these realms to be operated remotely through intelligent or smart objects with embedded processing functionality (Mattern and Floerkemeier 2010, p. 242; Ethics Subgroup IoT 2013, p. 3). These smart objects can store historical and varied forms of data, used as the basis for future interactions and the establishment of preferences. That is, once the data is processed, it can be utilized to "command and control" things within the IoT ecosystem, ideally enhancing the everyday lives of individuals (Michael et al. 2010).
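The preference-forming behaviour of a smart object described above can be illustrated with a toy sketch. This example is not drawn from the cited sources; the `SmartThermostat` class and its methods are hypothetical, and real IoT devices would use far richer data and models. It simply shows the principle: stored interaction history becomes the basis for future "command and control" decisions.

```python
from collections import Counter

class SmartThermostat:
    """Toy 'smart object': records each manual temperature setting
    and uses that stored history to infer a preferred setting."""

    def __init__(self):
        self.history = []  # historical interaction data retained by the object

    def set_temperature(self, celsius: int) -> None:
        # Every user interaction is logged for later preference inference.
        self.history.append(celsius)

    def preferred_temperature(self, default: int = 20) -> int:
        # With no history, fall back to a factory default;
        # otherwise return the most frequently chosen setting.
        if not self.history:
            return default
        return Counter(self.history).most_common(1)[0][0]

t = SmartThermostat()
for c in (22, 22, 19, 22):
    t.set_temperature(c)
assert t.preferred_temperature() == 22      # learned preference
assert SmartThermostat().preferred_temperature() == 20  # no data yet
```

The same pattern, history accumulated silently and acted on automatically, is precisely what gives rise to the control and trust concerns discussed in this paper.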

According to Ashton (2009, p. 1), the IoT infrastructure should “empower computers” and exhibit less reliance on human involvement in the collection of information. It should also allow for “seamless” interactions and connections (Ethics Subgroup IoT 2013, p. 2). Potential use cases include personal/home applications, health/patient monitoring systems, and remote tracking and monitoring which may include applications such as asset tracking amongst others (Ethics Subgroup IoT 2013, p. 3).

As can be anticipated with an ecosystem of this scale, the nature of interactions with the physical/virtual worlds and the varied "things" within them will undoubtedly be affected, dramatically altering the state of play. In the context of this paper, the focus is ultimately on the ethical concerns emerging from the use of LBS within an IoT infrastructure characterized by its ubiquitous/pervasive nature, in view of the discussion above regarding control and trust. It is valuable at this point to identify the important role of LBS in the IoT infrastructure.

While the IoT can potentially encompass a myriad of devices, the mobile phone will likely feature as a key element within the ecosystem, providing connectivity between devices (Freescale Semiconductor Inc. and ARM Inc. 2014, p. 2). In essence, smart phones can therefore be perceived as the "mediator" between users, the internet and additional "things", as illustrated in Mattern and Floerkemeier (2010, p. 245, figure 2). Significantly, most mobile devices are equipped with location and spatial capabilities, providing "localization", whereby intelligent devices "are aware of their physical location, or can be located" (Mattern and Floerkemeier 2010, p. 244). An example of an LBS application in the IoT would be indoor navigation in the absence of GPS, or, in effect, seamless navigation between outdoor and indoor environments.
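The indoor/outdoor handover mentioned above can be sketched minimally. This is an illustrative assumption, not an implementation from the cited sources: the `PositionFix` type, the `best_fix` selector and the 20-metre accuracy threshold are all hypothetical, standing in for whatever positioning sources (GPS, Wi-Fi, beacons) a real device fuses.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PositionFix:
    lat: float
    lon: float
    source: str        # "gps" or "wifi" (hypothetical labels)
    accuracy_m: float  # estimated error radius in metres

def best_fix(gps: Optional[PositionFix],
             wifi: Optional[PositionFix]) -> Optional[PositionFix]:
    """Prefer a usable satellite fix outdoors; fall back to
    Wi-Fi positioning indoors, where GPS is absent or degraded."""
    if gps is not None and gps.accuracy_m <= 20:  # assumed usability threshold
        return gps
    return wifi if wifi is not None else gps

# Outdoors: a sharp GPS fix wins over a coarse Wi-Fi estimate.
outdoor = best_fix(PositionFix(-34.4, 150.9, "gps", 5.0),
                   PositionFix(-34.4, 150.9, "wifi", 30.0))
assert outdoor.source == "gps"

# Indoors: no GPS fix at all, so the Wi-Fi estimate is used.
indoor = best_fix(None, PositionFix(-34.4, 150.9, "wifi", 12.0))
assert indoor.source == "wifi"
```

The point for this paper is that such handover logic makes the device locatable continuously, across public and private spaces, which is exactly what intensifies the control and trust questions that follow.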

Control- and trust-related challenges in the IoT 1.4.2

It may be argued that the LBS control and trust implications discussed throughout this paper (in addition to ethical challenges such as privacy and security) will carry over into the IoT environment. However, it has also been suggested that "the IoT will essentially create much richer environments in which location-based and location-aware technology can function" (Blouin 2014), and in doing so the ethical challenges will be amplified. It has further been noted that ethical issues, including trust and control amongst others, will "gain a new dimension in light of the increased complexity" in the IoT environment (Ethics Subgroup IoT 2013, p. 2).

In relation to control and the previously identified surveillance metaphors, for instance, it is predicted that there will be less reliance on Orwell's notion of Big Brother, whereby surveillance is conducted by a single entity. Rather, the concept of "some brother" will emerge. Some brother can be defined as "a heterogeneous 'mass' consisting of innumerable social actors, e.g. public sector authorities, citizens' movements and NGOs, economic players, big corporations, SMEs and citizens" (Ethics Subgroup IoT 2013, p. 16). As can be anticipated, the ethical consequences and dangers can potentially multiply in such a scenario.

Following on from this idea is the issue of lack of transparency. The IoT will inevitably merge the virtual and physical worlds, in addition to public and private spaces. It has been suggested that a lack of transparency regarding information access will create a sense of discomfort and accordingly diminish levels of trust (Ethics Subgroup IoT 2013, p. 8). The trust-related issues (relevant to LBS) are likely to be consistent with those discussed throughout this paper, varying in intensity/severity depending on the given scenario. For example, the consequences of faulty IoT technology have the potential to be greater than those of conventional Internet services, given the integration of the physical and virtual worlds, thereby impacting users' trust in the IoT (Ethics Subgroup IoT 2013, p. 11). Therefore, trust considerations must primarily be examined in terms of: (a) trust in technology, and (b) trust in individuals/others.

Dealing with these (and other) challenges requires an ethical analysis in which appropriate conceptual and practical frameworks are considered. A preliminary examination is provided in the subsequent section, followed by dialogue regarding the need for objectivity in socio-ethical studies and the associated difficulties in achieving this.

Ethical analysis: proposing a socio-ethical conceptual framework 1.4.3

Research into the social and ethical implications of LBS, emerging technologies in general, and the IoT can be categorized in many ways and many frameworks can be applied. For instance, it may be regarded as a strand of "cyberethics", defined by Tavani (2007, p. 3) as "the study of moral, legal and social issues involving cybertechnology". Cybertechnology encompasses technological devices ranging from individual computers through to networked information and communication technologies. When considering ethical issues relating to cybertechnology and technology in general, Tavani (2007, pp. 23-24) notes that technology should not necessarily be perceived as neutral. That is, technology may have "embedded values and biases" (Tavani 2007, p. 24), in that it may inherently provide capabilities to individuals to partake in unethical activities. This sentiment is echoed by Wakunuma and Stahl (2014, p. 393) in a paper examining the perceptions of IS professionals in relation to emerging ethical concerns.

Alternatively, research in this domain may be classed as a form of “computer ethics” or “information ethics”, which can be defined and applied using numerous approaches. While this article does not attempt to provide an in-depth account of information ethics, a number of its crucial characteristics are identified. In the first instance, the value of information ethics is in its ability to provide a conceptual framework for understanding the array of ethical challenges stemming from the introduction of new ICTs (Mathiesen 2004, p. 1). According to Floridi (1999), the question at the heart of information ethics is “what is good for an information entity and the infosphere in general?” The author continues that “more analytically, we shall say that [information ethics] determines what is morally right or wrong, what ought to be done, what the duties, the ‘oughts’ and the ‘ought nots’ of a moral agent are…” However, Capurro (2006, p. 182) disagrees, claiming that information ethics is additionally about “what is good for our bodily being-in-the-world with others in particular?” This involves contemplation of other “spheres” such as the ecological, political, economic, and cultural and is not limited to a study of the infosphere as suggested by Floridi. In this sense, the significance of context, environment and intercultural factors also becomes apparent.

Following on from these notions, there is a need for a robust ethical framework that is multi-dimensional in nature and explicitly covers the socio-ethical challenges emerging from the deployment of a given technology. This would include, but not be limited to, the control and trust issues identified throughout this paper, other concerns such as privacy and security, and any challenges that emerge as the IoT takes shape. This article proposes a broader, more robust socio-ethical conceptual framework as an appropriate means of examining and addressing ethical challenges relevant to LBS, both LBS in general and as a vital mediating component within the IoT. This framework is illustrated in Figure 1. Central to the socio-ethical framework is the contemplation of individuals as part of a broader social network or society, whilst considering the interactions amongst various elements of the overall "system". The four themes underpinning socio-ethical studies are the investigation of what the human purpose is, what is moral, how justice is upheld, and the principles that guide the usage of a given technique. Participants; their interactions with systems; people's concerns and behavioural expectations; cultural and religious beliefs; structures, rules and norms; and fairness, personal benefits and personal harms are all areas of interest in a socio-ethical approach.

Figure 1: Proposed socio-ethical framework, in terms of the major components that require consideration

This article is intended to offer a preliminary account of the socio-ethical conceptual framework being proposed. Further research would examine and test its validity, whilst providing a more detailed account of the various components within it, how a socio-ethical assessment would be conducted based on the framework, and the range of techniques that could be applied.

The need for objectivity 1.4.4

Regardless of categorization and which conceptual framework is adopted, numerous authors stress that the focus of research and debates should not be skewed towards the unethical uses of a particular technology; rather, an objective stance should be embraced. Such objectivity must nonetheless ensure that social interests are adequately represented. With respect to location and tracking technologies, Clarke (2001b, p. 220) claims that social interests have been somewhat overshadowed by the economic interests of LBS organisations, a situation that requires rectifying. While information technology professionals are not necessarily liable for how technology is deployed, they must nonetheless recognise its implications and be engaged in the process of introducing and promoting adequate safeguards (Clarke 1988, pp. 510-511). It has been argued, however, that IS professionals are generally uninterested in the ethical challenges associated with emerging ICTs, concerned instead with the job or the technologies themselves (Wakunuma and Stahl 2014, p. 383).

This is explicitly the case for LBS, given that the industry and technology have developed more quickly than the corresponding scholarship on their social implications, an unfavourable situation given the potential for LBS to have profound impacts on individuals and society (Perusco et al. 2006, p. 91). In a keynote address centred on defining the emerging notion of überveillance, Clarke (2007a, p. 34) discusses the need to measure the costs and disbenefits arising from surveillance practices in general, where costs refer to financial measures and disbenefits to all non-economic impacts. This involves weighing the negatives against the potential advantages, a response that is applicable to LBS and pertinent to seeking objectivity.

Difficulties associated with objectivity 1.4.5

However, a major challenge with respect to an impartial approach for LBS is the interplay between the constructive and the potentially damaging consequences that the technology facilitates. For instance, and with specific reference to wireless technologies in a business setting, Elliot and Phillips (2004, p. 474) maintain that such systems facilitate monitoring and surveillance which can be applied in conflicting scenarios. Positive applications, according to Elliot and Phillips, include monitoring to improve effectiveness or provide employee protection in various instances, although this view has been frequently contested. Negative uses, alternatively, involve excessive monitoring, which may compromise privacy or subject an individual to unauthorised forms of surveillance.

Additional studies demonstrate the complexities arising from the dual, and opposing, uses of a single LBS solution. It has been illustrated that any given application, for instance parent, healthcare, employee and criminal tracking applications, can be simultaneously perceived as ethical and unethical (Michael et al. 2006a, p. 7). A closer look at the scenario involving parents tracking children, as explained by Michael et al. (2006a, p. 7), highlights that child tracking can enable the safety of a child on the one hand, while invading their privacy on the other. The dual and opposing uses of a single LBS solution thus become problematic and situation-dependent, and indeed increasingly difficult to examine objectively. Dobson and Fisher (2003, p. 50) maintain that technology cannot be perceived as either good or evil in that it is not directly the cause of unethical behaviour; rather, it serves to "empower those who choose to engage in good or bad behaviour."

This is similarly the case in relation to the IoT, as public approval of the IoT is largely centred on “the conventional dualisms of ‘security versus freedom’ and ‘comfort versus data privacy’” (Mattern and Floerkemeier 2010, p. 256). Assessing the implications of the IoT infrastructure as a whole is increasingly difficult.

An alternative obstacle is associated with the extent to which LBS threaten the integrity of the individual. Explicitly, the risks associated with location and tracking technologies "arise from individual technologies and the trails that they generate, from compounds of multiple technologies, and from amalgamated and cross-referenced trails captured using multiple technologies and arising in multiple contexts" (Clarke 2001b, p. 218). The consequent social implications or "dangers" are thus a product of individuals being convicted, correctly or otherwise, of having committed a particular action (Clarke 2001b, p. 219). A wrongly accused individual may perceive the disbenefits arising from LBS as outweighing the benefits.

However, in situations where integrity is not compromised, an LBS application can be perceived as advantageous. For instance, Michael et al. (2006c, pp. 1-11) refer to the potentially beneficial uses of LBS in their paper on the Avian Flu Tracker prototype, which is intended to manage and contain the spread of the infectious disease by relying on spatial data to communicate with individuals in a defined location. The authors demonstrate that their proposed system, intended to operate on a subscription or opt-in basis, is beneficial for numerous stakeholders such as government, health organisations and citizens (Michael et al. 2006c, p. 6).

Thus, a common challenge confronting researchers with respect to the study of morals, ethics and technology is that the field of ethics is subjective. That is, what constitutes right and wrong behaviour varies depending on the beliefs of a particular individual, which are understood to be based on cultural and other factors specific to the individual in question. One such factor is an individual’s experience with the technology, as can be seen in the previous example centred on the notion of an unjust accusation. Given these subjectivities and the potential for inconsistency from one individual to the next, Tavani (2007, p. 47) asserts that there is the need for ethical theories to direct the analysis of moral issues (relating to technology), given that numerous complications or disagreements exist in examining ethics.

Conclusion 1.5

This article has provided a comprehensive review of the control- and trust-related challenges relevant to location-based services, in order to identify and describe the major social and ethical considerations within each of the themes. The relevance of the IoT in such discussions has been demonstrated, and a socio-ethical framework proposed to encourage discussion and further research into the socio-ethical implications of the IoT, with a focus on LBS and/or localization technologies. The proposed socio-ethical conceptual framework requires further elaboration, and it is recommended that a thorough analysis, beyond information ethics, be conducted on the basis of this paper, which forms the foundation for such future work. The IoT by its very nature is subject to socio-ethical dilemmas because, for the greater part, the human is removed from decision-making processes and is instead subject to a machine.

References

Abbas, R., Michael, K., Michael, M.G. & Aloudat, A.: Emerging Forms of Covert Surveillance Using GPS-Enabled Devices. Journal of Cases on Information Technology 13(2), 2011, 19-33.

Albrecht, K. & McIntyre, L.: Spychips: How Major Corporations and Government Plan to Track Your Every Purchase and Watch Your Every Move. Thomas Nelson, 2005.

Albrecht, K. & Michael, K.: Connected: To Everyone and Everything. IEEE Technology and Society Magazine, Winter, 2013, 31-34.

Alder, G.S., Noel, T.W. & Ambrose, M.L.: Clarifying the Effects of Internet Monitoring on Job Attitudes: The Mediating Role of Employee Trust. Information & Management, 43, 2006, 894-903.

Aloudat, A. & Michael, K.: The Socio-Ethical Considerations Surrounding Government Mandated Location-Based Services During Emergencies: An Australian Case Study, in M. Quigley (ed.), ICT Ethics and Security in the 21st Century: New Developments and Applications. IGI Global, Hershey, PA, 2010, 1-26.

Aloudat, A. & Michael, K.: Toward the Regulation of Ubiquitous Mobile Government: A case Study on Location-Based Emergency Services in Australia. Electronic Commerce Research, 11(1), 2011, 31-74.

Andrejevic, M.: ISpy: Surveillance and Power in the Interactive Era. University Press of Kansas, Lawrence, 2007.

Arvidsson, A.: On the ‘Pre-History of the Panoptic Sort’: Mobility in Market Research. Surveillance & Society, 1(4), 2004, 456-474.

Ashton, K.: That "Internet of Things" Thing. RFID Journal, 2009, www.rfidjournal.com/articles/pdf?4986

Barreras, A. & Mathur, A.: Chapter 18. Wireless Location Tracking, in K.R. Larsen and Z.A. Voronovich (eds.), Convenient or Invasive: The Information Age. Ethica Publishing, United States, 2007, 176-186.

Bauer, H.H., Barnes, S.J., Reichardt, T. & Neumann, M.M.: Driving the Consumer Acceptance of Mobile Marketing: A Theoretical Framework and Empirical Study. Journal of Electronic Commerce Research, 6(3), 2005, 181-192.

Beinat, E., Steenbruggen, J. & Wagtendonk, A.: Location Awareness 2020: A Foresight Study on Location and Sensor Services. Vrije Universiteit, Amsterdam, 2007, http://reference.kfupm.edu.sa/content/l/o/location_awareness_2020_2_108_86452.pdf

Bellavista, P., Küpper, A. & Helal, S.: Location-Based Services: Back to the Future. IEEE Pervasive Computing, 7(2), 2008, 85-89.

Bennett, C.J. & Regan, P.M.: Surveillance and Mobilities. Surveillance & Society, 1(4), 2004, 449-455.

Bentham, J. & Bowring, J.: The Works of Jeremy Bentham. Published under the Superintendence of His Executor, John Bowring, Volume IV, W. Tait, Edinburgh, 1843.

Blouin, D.: An Intro to Internet of Things. 2014, www.xyht.com/spatial-itgis/intro-to-internet-of-things/

Boesen, J., Rode, J.A. & Mancini, C.: The Domestic Panopticon: Location Tracking in Families. UbiComp’10, Copenhagen, Denmark, 2010, pp. 65-74.

Böhm, A., Leiber, T. & Reufenheuser, B.: Trust and Transparency in Location-Based Services: Making Users Lose Their Fear of Big Brother. Proceedings Mobile HCI 2004 Workshop on Location Systems Privacy and Control, Glasgow, UK, 2004, 1-4.

Capurro, R.: Towards an Ontological Foundation of Information Ethics. Ethics and Information Technology, 8, 2006, 175-186.

Casal, C.R.: Impact of Location-Aware Services on the Privacy/Security Balance, Info: the Journal of Policy, Regulation and Strategy for Telecommunications. Information and Media, 6(2), 2004, 105-111.

Chellappa, R. & Sin, R.G.: Personalization Versus Privacy: An Empirical Examination of the Online Consumer’s Dilemma. Information Technology and Management, 6, 2005, 181-202.

Chen, J.V., Ross, W. & Huang, S.F.: Privacy, Trust, and Justice Considerations for Location-Based Mobile Telecommunication Services. info, 10(4), 2008, 30-45.

Chen, J.V. & Ross, W.H.: The Managerial Decision to Implement Electronic Surveillance at Work. International Journal of Organizational Analysis, 13(3), 2005, 244-268.

Clarke, R.: Information Technology and Dataveillance. Communications of the ACM, 31(5), 1988, 498-512.

Clarke, R.: Profiling: A Hidden Challenge to the Regulation of Data Surveillance. 1993, http://www.rogerclarke.com/DV/PaperProfiling.html.

Clarke, R.: The Digital Persona and Its Application to Data Surveillance. 1994, http://www.rogerclarke.com/DV/DigPersona.html.

Clarke, R.: Introduction to Dataveillance and Information Privacy, and Definitions of Terms. 1997, http://www.anu.edu.au/people/Roger.Clarke/DV/Intro.html.

Clarke, R.: Person Location and Person Tracking - Technologies, Risks and Policy Implications. Information Technology & People, 14(2), 2001b, 206-231.

Clarke, R.: Privacy as a Means of Engendering Trust in Cyberspace Commerce. The University of New South Wales Law Journal, 24(1), 2001c, 290-297.

Clarke, R.: While You Were Sleeping… Surveillance Technologies Arrived. Australian Quarterly, 73(1), 2001d, 10-14.

Clarke, R.: Privacy on the Move: The Impacts of Mobile Technologies on Consumers and Citizens. 2003b, http://www.anu.edu.au/people/Roger.Clarke/DV/MPrivacy.html.

Clarke, R.: Have We Learnt to Love Big Brother? Issues, 71, June, 2005, 9-13.

Clarke, R.: What's 'Privacy'? 2006, http://www.rogerclarke.com/DV/Privacy.html.

Clarke, R.: Chapter 3. What 'Uberveillance' Is and What to Do About It, in K. Michael and M.G. Michael (eds.), The Second Workshop on the Social Implications of National Security, University of Wollongong, Wollongong, Australia, 2007a, 27-46.

Clarke, R.: Chapter 4. Appendix to What 'Uberveillance' Is and What to Do About It: Surveillance Vignettes, in K. Michael and M.G. Michael (eds.), The Second Workshop on the Social Implications of National Security, University of Wollongong, Wollongong, Australia, 2007b, 47-60.

Clarke, R.: Surveillance Vignettes Presentation. 2007c, http://www.rogerclarke.com/DV/SurvVign-071029.ppt.

Clarke, R.: Privacy Impact Assessment: Its Origins and Development. Computer Law & Security Review, 25(2), 2009, 123-135.

Clarke, R. & Wigan, M.: You Are Where You've Been: The Privacy Implications of Location and Tracking Technologies. 2011, http://www.rogerclarke.com/DV/YAWYB-CWP.html.

Culnan, M.J. & Bies, R.J.: Consumer Privacy: Balancing Economic and Justice Considerations. Journal of Social Issues, 59(2), 2003, 323-342.

Davis, D.W. & Silver, B.D.: Civil Liberties vs. Security: Public Opinion in the Context of the Terrorist Attacks on America. American Journal of Political Science, 48(1), 2004, pp. 28-46.

Dinev, T., Bellotto, M., Hart, P., Colautti, C., Russo, V. & Serra, I.: Internet Users’ Privacy Concerns and Attitudes Towards Government Surveillance – an Exploratory Study of Cross-Cultural Differences between Italy and the United States. 18th Bled eConference eIntegration in Action, Bled, Slovenia, 2005, 1-13.

Dobson, J.E. & Fisher, P.F.: Geoslavery. IEEE Technology and Society Magazine, 22(1), 2003, 47-52.

Dobson, J.E. & Fisher, P.F.: The Panopticon's Changing Geography. Geographical Review, 97(3), 2007, 307-323.

Dwyer, C., Hiltz, S.R. & Passerini, K.: Trust and Privacy Concern within Social Networking Sites: A Comparison of Facebook and Myspace. Proceedings of the Thirteenth Americas Conference on Information Systems, Keystone, Colorado, 2007, 1-12.

Elliot, G. & Phillips, N.: Mobile Commerce and Wireless Computing Systems. Pearson Education Limited, Great Britain, 2004.

Ethics Subgroup IoT: Fact Sheet - Ethics Subgroup IoT - Version 4.0, European Commission. 2013, 1-21, http://ec.europa.eu/information_society/newsroom/cf/dae/document.cfm?doc_id=1751

Freescale Semiconductor Inc. and ARM Inc.: Whitepaper: What the Internet of Things (IoT) Needs to Become a Reality. 2014, 1-16, cache.freescale.com/files/32bit/doc/white_paper/INTOTHNGSWP.pdf

Floridi, L.: Information Ethics: On the Philosophical Foundation of Computer Ethics. Ethics and Information Technology, 1, 1999, 37-56.

Foucault, M.: Discipline and Punish: The Birth of the Prison. Second Vintage Books Edition May 1995, Vintage Books: A Division of Random House Inc, New York, 1977.

Fusco, S.J., Michael, K., Aloudat, A. & Abbas, R.: Monitoring People Using Location-Based Social Networking and Its Negative Impact on Trust: An Exploratory Contextual Analysis of Five Types of “Friend” Relationships. IEEE Symposium on Technology and Society, Illinois, Chicago, 2011.

Fusco, S.J., Michael, K., Michael, M.G. & Abbas, R.: Exploring the Social Implications of Location Based Social Networking: An Inquiry into the Perceived Positive and Negative Impacts of Using LBSN between Friends. 9th International Conference on Mobile Business, Athens, Greece, IEEE, 2010, 230-237.

Gagnon, M., Jacob, J.D., Guta, A.: Treatment adherence redefined: a critical analysis of technotherapeutics. Nurs Inq. 20(1), 2013, 60-70.

Ganascia, J.G.: The Generalized Sousveillance Society. Social Science Information, 49(3), 2010, 489-507.

Gandy, O.H.: The Panoptic Sort: A Political Economy of Personal Information. Westview, Boulder, Colorado, 1993.

Giaglis, G.M., Kourouthanassis, P. & Tsamakos, A.: Chapter IV. Towards a Classification Framework for Mobile Location-Based Services, in B.E. Mennecke and T.J. Strader (eds.), Mobile Commerce: Technology, Theory and Applications. Idea Group Publishing, Hershey, US, 2003, 67-85.

Gould, J.B.: Playing with Fire: The Civil Liberties Implications of September 11th. Public Administration Review, 62, 2002, 74-79.

Jorns, O. & Quirchmayr, G.: Trust and Privacy in Location-Based Services. Elektrotechnik & Informationstechnik, 127(5), 2010, 151-155.

Junglas, I. & Spitzmüller, C.: A Research Model for Studying Privacy Concerns Pertaining to Location-Based Services. Proceedings of the 38th Hawaii International Conference on System Sciences, 2005, 1-10.

Kaasinen, E.: User Acceptance of Location-Aware Mobile Guides Based on Seven Field Studies. Behaviour & Information Technology, 24(1), 2003, 37-49.

Kaupins, G. & Minch, R.: Legal and Ethical Implications of Employee Location Monitoring. Proceedings of the 38th Hawaii International Conference on System Sciences. 2005, 1-10.

Kim, D.J., Ferrin, D.L. & Rao, H.R.: Trust and Satisfaction, Two Stepping Stones for Successful E-Commerce Relationships: A Longitudinal Exploration. Information Systems Research, 20(2), 2009, 237-257.

King, L.: Information, Society and the Panopticon. The Western Journal of Graduate Research, 10(1), 2001, 40-50.

Kodl, J. & Lokay, M.: Human Identity, Human Identification and Human Security. Proceedings of the Conference on Security and Protection of Information, Idet Brno, Czech Republic, 2001, 129-138.

Kranenburg, R.V. and Bassi, A.: IoT Challenges, Communications in Mobile Computing. 1(9), 2012, 1-5.

Küpper, A. & Treu, G.: Next Generation Location-Based Services: Merging Positioning and Web 2.0., in L.T. Yang, A.B. Waluyo, J. Ma, L. Tan and B. Srinivasan (eds.), Mobile Intelligence. John Wiley & Sons Inc, Hoboken, New Jersey, 2010, 213-236.

Levin, A., Foster, M., West, B., Nicholson, M.J., Hernandez, T. & Cukier, W.: The Next Digital Divide: Online Social Network Privacy. Ryerson University, Ted Rogers School of Management, Privacy and Cyber Crime Institute, 2008, www.ryerson.ca/tedrogersschool/privacy/Ryerson_Privacy_Institute_OSN_Report.pdf.

Lewis, J.D. & Weigert, A.: Trust as a Social Reality. Social Forces, 63(4), 1985, 967-985.

Lyon, D.: The World Wide Web of Surveillance: The Internet and Off-World Power Flows. Information, Communication & Society, 1(1), 1998, 91-105.

Lyon, D.: Surveillance Society: Monitoring Everyday Life. Open University Press, Philadelphia, PA, 2001.

Lyon, D.: Surveillance Studies: An Overview. Polity, Cambridge, 2007.

Macquarie Dictionary: 'Uberveillance', in S. Butler, Fifth Edition of the Macquarie Dictionary, Australia's National Dictionary. Sydney University, 2009, 1094.

Mann, S.: Sousveillance and Cyborglogs: A 30-Year Empirical Voyage through Ethical, Legal, and Policy Issues. Presence, 14(6), 2005, 625-646.

Mann, S., Nolan, J. & Wellman, B.: Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments. Surveillance & Society, 1(3), 2003, 331-355.

Mathiesen, K.: What is Information Ethics? Computers and Society, 32(8), 2004, 1-11.

Mattern, F. and Floerkemeier, K.: From the Internet of Computers to the Internet of Things, in Sachs, K., Petrov, I. & Guerrero, P. (eds.), From Active Data Management to Event-Based Systems and More. Springer-Verlag Berlin Heidelberg, 2010, 242-259.

Marx, G.T. & Steeves, V.: From the Beginning: Children as Subjects and Agents of Surveillance. Surveillance & Society, 7(3/4), 2010, 192-230.

Mayer, R.C., Davis, J.H. & Schoorman, F.D.: An Integrative Model of Organizational Trust. The Academy of Management Review, 20(3), 1995, 709-734.

McKnight, D.H. & Chervany, N.L.: What Trust Means in E-Commerce Customer Relationships: An Interdisciplinary Conceptual Typology. International Journal of Electronic Commerce, 6(2), 2001, 35-59.

Metzger, M.J.: Privacy, Trust, and Disclosure: Exploring Barriers to Electronic Commerce. Journal of Computer-Mediated Communication, 9(4), 2004.

Michael, K. & Clarke, R.: Location and Tracking of Mobile Devices: Überveillance Stalks the Streets. Computer Law and Security Review, 29(2), 2013, 216-228.

Michael, K., McNamee, A. & Michael, M.G.: The Emerging Ethics of Humancentric GPS Tracking and Monitoring. International Conference on Mobile Business, Copenhagen, Denmark, IEEE Computer Society, 2006a, 1-10.

Michael, K., McNamee, A., Michael, M.G., and Tootell, H.: Location-Based Intelligence – Modeling Behavior in Humans using GPS. IEEE International Symposium on Technology and Society, 2006b.

Michael, K., Stroh, B., Berry, O., Muhlbauer, A. & Nicholls, T.: The Avian Flu Tracker - a Location Service Proof of Concept. Recent Advances in Security Technology, Australian Homeland Security Research Centre, 2006, 1-11.

Michael, K. and Michael, M.G.: Australia and the New Technologies: Towards Evidence Based Policy in Public Administration (1 ed). Wollongong, Australia: University of Wollongong, 2008, Available at: http://works.bepress.com/kmichael/93

Michael, K. & Michael, M.G.: Microchipping People: The Rise of the Electrophorus. Quadrant, 49(3), 2005, 22-33.

Michael, K. and Michael, M.G.: From Dataveillance to Überveillance (Uberveillance) and the Realpolitik of the Transparent Society (1 ed). Wollongong: University of Wollongong, 2007. Available at: http://works.bepress.com/kmichael/51.

Michael, K. & Michael, M.G.: Innovative Automatic Identification and Location-Based Services: From Bar Codes to Chip Implants. IGI Global, Hershey, PA, 2009.

Michael, K. & Michael, M.G.: The Social and Behavioral Implications of Location-Based Services. Journal of Location-Based Services, 5(3/4), 2011, 1-15, http://works.bepress.com/kmichael/246.

Michael, K. & Michael, M.G.: Sousveillance and Point of View Technologies in Law Enforcement: An Overview, in The Sixth Workshop on the Social Implications of National Security: Sousveillance and Point of View Technologies in Law Enforcement, University of Sydney, NSW, Australia, Feb. 2012.

Michael, K., Roussos, G., Huang, G.Q., Gadh, R., Chattopadhyay, A., Prabhu, S. and Chu, P.: Planetary-scale RFID Services in an Age of Uberveillance. Proceedings of the IEEE, 98.9, 2010, 1663-1671.

Michael, M.G. and Michael, K.: National Security: The Social Implications of the Politics of Transparency. Prometheus, 24(4), 2006, 359-364.

Michael, M.G. & Michael, K.: Towards a State of Uberveillance. IEEE Technology and Society Magazine, 29(2), 2010, 9-16.

Michael, M.G. & Michael, K. (eds): Uberveillance and the Social Implications of Microchip Implants: Emerging Technologies. Hershey, PA, IGI Global, 2013.

O'Connor, P.J. & Godar, S.H.: Chapter XIII. We Know Where You Are: The Ethics of LBS Advertising, in B.E. Mennecke and T.J. Strader (eds.), Mobile Commerce: Technology, Theory and Applications, Idea Group Publishing, Hershey, US, 2003, 245-261.

Orwell, G.: Nineteen Eighty-Four. McPherson Printing Group, Maryborough, Victoria, 1949.

Oxford Dictionary: Control, Oxford University Press, 2012a http://oxforddictionaries.com/definition/control?q=control.

Oxford Dictionary: Trust, Oxford University Press, 2012b, http://oxforddictionaries.com/definition/trust?q=trust.

Pavlou, P.A.: Consumer Acceptance of Electronic Commerce: Integrating Trust and Risk with the Technology Acceptance Model. International Journal of Electronic Commerce, 7(3), 2003, 69-103.

Perusco, L. & Michael, K.: Humancentric Applications of Precise Location Based Services, in IEEE International Conference on e-Business Engineering, Beijing, China, IEEE Computer Society, 2005, 409-418.

Perusco, L. & Michael, K.: Control, Trust, Privacy, and Security: Evaluating Location-Based Services. IEEE Technology and Society Magazine, 26(1), 2007, 4-16.

Perusco, L., Michael, K. & Michael, M.G.: Location-Based Services and the Privacy-Security Dichotomy, in Proceedings of the Third International Conference on Mobile Computing and Ubiquitous Networking, London, UK, Information Processing Society of Japan, 2006, 91-98.

Quinn, M.J.: Ethics for the Information Age. Second Edition, Pearson/Addison-Wesley, Boston, 2006.

Renegar, B., Michael, K. & Michael, M.G.: Privacy, Value and Control Issues in Four Mobile Business Applications, in 7th International Conference on Mobile Business (ICMB2008), Barcelona, Spain, IEEE Computer Society, 2008, 30-40.

Rozenfeld, M.: The Value of Privacy: Safeguarding your information in the age of the Internet of Everything, The Institute: the IEEE News Source, 2014, http://theinstitute.ieee.org/technology-focus/technology-topic/the-value-of-privacy.

Rummel, R.J.: Death by Government. Transaction Publishers, New Brunswick, New Jersey, 1997.

Sanquist, T.F., Mahy, H. & Morris, F.: An Exploratory Risk Perception Study of Attitudes toward Homeland Security Systems. Risk Analysis, 28(4), 2008, 1125-1133.

Schoorman, F.D., Mayer, R.C. & Davis, J.H.: An Integrative Model of Organizational Trust: Past, Present, and Future. Academy of Management Review, 32(2), 2007, 344-354.

Shay, L.A., Conti, G., Larkin, D., Nelson, J.: A framework for analysis of quotidian exposure in an instrumented world. IEEE International Carnahan Conference on Security Technology (ICCST), 2012, 126-134.

Siau, K. & Shen, Z.: Building Customer Trust in Mobile Commerce. Communications of the ACM, 46(4), 2003, 91-94.

Solove, D.: Digital Dossiers and the Dissipation of Fourth Amendment Privacy. Southern California Law Review, 75, 2002, 1083-1168.

Solove, D.: The Digital Person: Technology and Privacy in the Information Age. New York University Press, New York, 2004.

Tavani, H.T.: Ethics and Technology: Ethical Issues in an Age of Information and Communication Technology. John Wiley, Hoboken, N.J., 2007.

Valacich, J.S.: Ubiquitous Trust: Evolving Trust into Ubiquitous Computing Environments. Business, Washington State University, 2003, 1-2.

van Ooijen, C. & Nouwt, S.: Power and Privacy: The Use of LBS in Dutch Public Administration, in B. van Loenen, J.W.J. Besemer and J.A. Zevenbergen (eds.), Sdi Convergence. Research, Emerging Trends, and Critical Assessment, Nederlandse Commissie voor Geodesie Netherlands Geodetic Commission 48, 2009, 75-88.

Wakunuma, K.J. and Stahl, B.C.: Tomorrow’s Ethics and Today’s Response: An Investigation into The Ways Information Systems Professionals Perceive and Address Emerging Ethical Issues. Inf Syst Front, 16, 2014, 383–397.

Weckert, J.: Trust and Monitoring in the Workplace. IEEE International Symposium on Technology and Society, 2000. University as a Bridge from Technology to Society, 2000, 245-250.

Wigan, M. & Clarke, R.: Social Impacts of Transport Surveillance. Prometheus, 24(4), 2006, 389-403.

Xu, H. & Teo, H.H.: Alleviating Consumers’ Privacy Concerns in Location-Based Services: A Psychological Control Perspective. Twenty-Fifth International Conference on Information Systems, 2004, 793-806.

Xu, H., Teo, H.H. & Tan, B.C.Y.: Predicting the Adoption of Location-Based Services: The Role of Trust and Perceived Privacy Risk. Twenty-Sixth International Conference on Information Systems, 2005, 897-910.

Yan, Z. & Holtmanns, S.: Trust Modeling and Management: From Social Trust to Digital Trust, in R. Subramanian (ed.), Computer Security, Privacy and Politics: Current Issues, Challenges and Solutions. IGI Global, 2008, 290-323.

Yeh, Y.S. & Li, Y.M.: Building Trust in M-Commerce: Contributions from Quality and Satisfaction. Online Information Review, 33(6), 2009, 1066-1086.

Citation: Roba Abbas, Katina Michael, M.G. Michael, "Using a Social-Ethical Framework to Evaluate Location-Based Services in an Internet of Things World", IRIE, International Review of Information Ethics, http://www.i-r-i-e.net/ Source: http://www.i-r-i-e.net/inhalt/022/IRIE-Abbas-Michael-Michael.pdf Dec 2014

Author(s):

Honorary Fellow Dr Roba Abbas:

·         School of Information Systems and Technology, University of Wollongong, Northfields Avenue, Wollongong NSW 2522, Australia

·         Phone: +61 2 4221 3555, Email: roba@uow.edu.au, Web: http://www.technologyandsociety.org/members/2013/7/25/dr-roba-abbas

·         Relevant publications:

o    R. Abbas, K. Michael, M.G. Michael, R. Nicholls, Sketching and validating the location-based services (LBS) regulatory framework in Australia, Computer Law and Security Review 29, No.5 (2013): 576-589.

o    R. Abbas, K. Michael, M.G. Michael, The Regulatory Considerations and Ethical Dilemmas of Location-Based Services (LBS): A Literature Review, Information Technology & People 27, No.1 (2014): 2-20.

Associate Professor Katina Michael:

·         School of Information Systems and Technology, University of Wollongong, Northfields Avenue, Wollongong NSW 2522, Australia

·         Phone: +61 2 4221 3937, Email: katina@uow.edu.au, Web: http://ro.uow.edu.au/kmichael

·         Relevant publications:

o    K. Michael, R. Clarke, Location and Tracking of Mobile Devices: Überveillance Stalks the Streets, Computer Law and Security Review 29, No.3 (2013): 216-228.

o    K. Michael, M. G. Michael, Innovative Automatic Identification and Location-Based Services: From Bar Codes to Chip Implants, IGI Global, (2009).

o    L. Perusco, K. Michael, Control, trust, privacy, and security: evaluating location-based services, IEEE Technology and Society Magazine 26, No.1 (2007): 4-16.

Honorary Associate Professor M.G. Michael:

·         School of Information Systems and Technology, University of Wollongong, Northfields Avenue, Wollongong NSW 2522, Australia

·         Phone: +61 2 4221 3937, Email: mgm@uow.edu.au, Web: http://ro.uow.edu.au/mgmichael

·         Relevant publications:

o    M.G. Michael and K. Michael (eds) Uberveillance and the Social Implications of Microchip Implants: Emerging Technologies, Hershey: PA, IGI Global, (2013).

o    K. Michael, M. G. Michael, The Social and Behavioral Implications of Location-Based Services, Journal of Location-Based Services, Volume 5, Issue 3-4, (2011), 121-137.

o    M.G. Michael, K. Michael, Towards a State of Uberveillance, IEEE Technology and Society Magazine, 29, No.2, (2010): 9-16.

o    M. G. Michael, S. J. Fusco, K. Michael, A Research Note on Ethics in the Emerging Age of Uberveillance, Computer Communications, 31 No.6, 2008: 1192-1199.

Perceived barriers for implanting microchips in humans

Abstract

This quantitative, descriptive study investigated whether the country of residence of small business owners (N = 453) in four countries (Australia, India, the UK, and the USA) was related to perceived barriers to RFID (radio frequency identification) transponders being implanted into humans for employee ID. Participants were asked what they believed were the greatest barriers to instituting chip implants for access control in organizations. Participants had six options from which to select. There were significant chi-square analyses reported relative to respondents' countries and: 1) a perceived barrier of technological issues (X2 = 11.86, df = 3, p = .008); 2) a perceived barrier of philosophical issues (right of control over one's body) (X2 = 31.21, df = 3, p < .001); and 3) a perceived barrier of health issues (unknown risks related to implants) (X2 = 10.88, df = 3, p = .012). There were no significant chi-square analyses reported with respect to countries of residence and: 1) religious issues (mark of the beast), 2) social issues (digital divide), or 3) cultural issues (incisions into the skin are taboo). Thus, the researchers concluded that there were relationships between respondents' countries and their perceptions of barriers to instituting microchip implants.

SECTION I. Introduction

The purpose of this study was to investigate whether there were relationships between the countries of residence (Australia, India, the UK, and the USA) of small business owners and perceived barriers to instituting RFID (radio frequency identification) transponders implanted into the human body for identification and access control purposes in organizations [1]. Participants were asked what they believed were the greatest barriers to instituting chip implants for access control in organizations [2]. Participants had six options from which to select all that apply, as well as an option to specify other barriers [3]. The options for perceived barriers included:

  • technological issues: RFID is inherently an insecure technology
  • social issues: a digital divide would emerge between employees with implants for identification and those with legacy electronic identification
  • cultural issues: incisions into the skin are taboo
  • religious issues: mark of the beast
  • philosophical issues: right of control over one's body
  • health issues: there are unknown risks related to implants that remain in the body over the long term
  • other issues.

There were significant chi-square analyses reported relative to respondents' countries and: 1) the perceived barrier of technological issues; 2) the perceived barrier of philosophical issues (right of control over one's body); and 3) the perceived barrier of health issues (unknown risks related to implants). There were no significant chi-square analyses reported with respect to countries and religious issues (mark of the beast), social issues (digital divide), and cultural issues (incisions into the skin are taboo).
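The chi-square tests of independence reported above can be sketched as follows. The 4 × 2 contingency table (country by whether a given barrier was selected) is entirely hypothetical, invented only to illustrate the method; the row totals merely sum to the study's N = 453.

```python
# Illustrative only: the chi-square statistic of independence computed on a
# HYPOTHETICAL 4x2 table (country x whether a given barrier was selected).
# All cell counts are invented for this sketch; they are not the study's data.

def chi_square(observed):
    """Return the chi-square statistic and degrees of freedom for a 2-D table."""
    row_totals = [sum(r) for r in observed]
    col_totals = [sum(c) for c in zip(*observed)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(observed):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (obs - expected) ** 2 / expected
    df = (len(row_totals) - 1) * (len(col_totals) - 1)
    return stat, df

# Rows: Australia, India, UK, USA; columns: barrier selected / not selected.
table = [[40, 73], [62, 51], [35, 78], [38, 76]]
stat, df = chi_square(table)
# With df = 3, a statistic above the 7.815 critical value is significant at alpha = .05.
print(f"X2 = {stat:.2f}, df = {df}")
```

The statistic is then compared against the chi-square distribution with df = (rows − 1)(columns − 1) = 3, as in the results quoted above.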

RFID implants are capable of omnipresent electronic surveillance. RFID tags or transponders can be implanted into the human body to track the who, what, where, when, and how of human life [4]. This act of embedding devices into human beings for surveillance purposes is known as uberveillance [5]. While the tiny embedded RFID chips do not have global positioning capabilities, an RFID reader (fixed or mobile) can capture time stamps and exit and entry sequences to denote when someone is coming or going and in which direction they are travelling, and then make inferences about time, location, distance, and speed.
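The kind of inference described above can be illustrated with a minimal sketch: two fixed readers at known positions log time-stamped reads of the same tag, from which direction of travel and average speed follow. The reader names, positions, and timestamps below are hypothetical.

```python
# Hypothetical sketch: inferring direction and speed from two time-stamped
# reads of the same implant ID at fixed readers with known positions.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ReadEvent:
    tag_id: str
    reader: str
    position_m: float   # reader's position along a corridor, in metres
    timestamp: datetime

def infer_movement(first: ReadEvent, second: ReadEvent):
    """Infer direction of travel and average speed (m/s) between two reads."""
    assert first.tag_id == second.tag_id, "reads must belong to the same tag"
    displacement = second.position_m - first.position_m
    seconds = (second.timestamp - first.timestamp).total_seconds()
    direction = "entering" if displacement > 0 else "exiting"
    return direction, abs(displacement) / seconds

a = ReadEvent("tag-42", "door reader", 0.0, datetime(2011, 4, 4, 9, 0, 0))
b = ReadEvent("tag-42", "lobby reader", 12.0, datetime(2011, 4, 4, 9, 0, 8))
print(infer_movement(a, b))   # ('entering', 1.5)
```

Even without GPS, a handful of such read events is enough to reconstruct comings, goings, and pace, which is precisely the surveillance capability at issue.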

In this paper, the authors present a brief review of the literature, key findings from the study, and a discussion on possible implications of the findings. Professionals working in the field of emerging technologies could use these findings to better understand how countries of residence may affect perceptions of barriers in instituting chip implants in humans.

SECTION II. Review of Literature

A. Implants and Social Acceptance

In 2004, the FDA (Food and Drug Administration) of the United States approved an implantable chip for use in humans in the U.S. [6]. The implanted chip was, and is, being marketed by a variety of commercial enterprises as a potential method to detect and treat diseases, as well as a potential lifesaving device. If a person were brought to an emergency room unconscious, a scanner in the hospital doorway could read the person's unique ID on the implanted chip. The ID would then be used to unlock the personal health records (PHR) of the patient from a database [7]. Authorized health professionals would then have access to all pertinent medical information of that individual (i.e., medical history, previous surgeries, allergies, heart condition, blood type, diabetes) to care for the patient appropriately. Additionally, the chip is being touted as a solution to kidnappings in Mexico (e.g., by the Xega Company), among many other uses [8].
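The emergency-room workflow above can be sketched in a few lines: the implant itself carries only an opaque ID, and the medical details sit in a separate database that authorized staff unlock with that ID. The record fields, the sample ID, and the authorization check here are hypothetical illustrations, not an actual PHR system.

```python
# Toy sketch of the scan-then-lookup workflow: the implant yields an opaque
# ID; the PHR database maps that ID to medical details for authorized staff.
# All names, IDs, and fields are hypothetical.
PHR_DATABASE = {
    "ID-000123": {
        "blood_type": "O-",
        "allergies": ["penicillin"],
        "conditions": ["diabetes"],
    }
}

def lookup_patient(scanned_id: str, authorized: bool):
    """Return the patient's record only for an authorized health professional."""
    if not authorized:
        raise PermissionError("caller is not an authorized health professional")
    return PHR_DATABASE.get(scanned_id)

record = lookup_patient("ID-000123", authorized=True)
print(record["blood_type"])   # O-
```

The design point is that the chip stores no medical data itself; everything sensitive lives behind the access-controlled lookup.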

B. Schools: RFID Tracking

A rural elementary school in California planned to implement RFID-tagged ID cards for schoolchildren; however, the American Civil Liberties Union (ACLU) successfully fought to revoke the program. The ACLU articulated genuine risks, including identity theft and kidnapping, should the system be hacked and a perpetrator gain access to the locations of schoolchildren.

However, with school districts looking to offset cuts in state funding, which are partly based on attendance figures, RFID technology provides a method to count students more accurately. Beyond increased revenues, administrators face the reality of growing security issues; thus more school districts are adopting RFID to track students and improve safety. For many years in Tokyo, students have worn mandatory RFID bracelets; they are tracked not only in school, but also to and from school [9] [10]. In other examples, bags are fitted with GPS units.

In 2012, the Northside Independent School District in San Antonio, Texas began a pilot program to track 6.2% of its 100,000 students through RFID-tagged ID cards. Northside was not the first district in Texas; two other school districts in Houston successfully use the technology, with reported gains of hundreds of thousands of dollars in revenue due to improved attendance. The school board unanimously approved the program, but only after first debating privacy issues. Chip readers on campuses and on school buses detect a student's location, and authorized administrators have access to the information. The pilot program cost $525,000 to launch and was expected to yield approximately $1.7 million in the first year from higher attendance figures, as well as Medicaid reimbursements for the busing of special education students. However, students could forget or lose the cards, which would negatively affect the system [3]. One of Northside's sophomore students, Andrea Hernandez, refused to wear the RFID tag around her neck on religious grounds. Initially, the school expelled her, but when the case went to court she was reinstated, a judge ruling that her constitutional rights had been violated [11].

C. Medical Devices: RFID Implants

Recent technological developments are reaching new levels with the integration of silicon and biology; implanted devices can now interact directly with the brain [12]. Implantable devices for medical purposes are often highly beneficial to restore functions that were lost. Such current medical implants include cardiovascular pacers, cochlear and brainstem implants for patients with hearing disorders, implantable drug delivery pumps, implantable neurostimulation devices for such patients as those with urinary incontinence, chronic pain, or epilepsy, deep brain stimulation for patients with Parkinson's, and artificial chip-controlled legs [13].

D. RFID in India

Although India has been identified as a significant prospective market for RFID due to issues with the supply chain and a need for transparency, some contend that the slow adoption of RFID solutions can be traced to unskilled RFID solution providers. Inexperienced systems integrators and vendors are believed to account for failed trials, leaving companies disillusioned with the technology, abandoning solutions, and decrying the technology loudly and publicly. A secondary threat to RFID adoption is believed to relate to price competitiveness in India. In such a price-sensitive environment, RFID players are known to quote the lowest cost per tag, thereby using inferior hardware; customers consequently perceive RFID to be inconsistent and unreliable for use in the business setting [14]. The compulsory biometrics rollout instituted by the Unique Identification Authority of India (UIDAI) stands in direct contrast to the experience of RFID (Fig. 1).

Fig. 1. Taking fingerprints for Aadhaar, a 12-digit unique number issued to all residents of India. The number is stored in a centralized database and linked to basic demographic and biometric information; the system institutes multimodal biometrics. Creative commons: fotokannan.

E. RFID in Libraries

In 2010, researchers reported that many corporate libraries had begun deploying RFID. RFID tags are placed into books and other media and used in libraries for purposes such as automating stock verification, locating misplaced items, checking patrons in and out without human interaction, and detecting theft. In India, several deployment and implementation issues were identified: consumer privacy and ethical concerns, costs, a lack of standards and regulations in India (e.g., data ownership, data collection limitations), user confusion (e.g., lack of training and experience with the technology), and the immaturity of the technology (e.g., lack of accuracy, scalability, etc.) [15].

F. RFID and OEMS/Auto Component Manufacturers

In India, suppliers are not forced to conform to the stringent regulations that exist in other countries. For example, the TREAD Act in the U.S. provided the impetus for OEMs to invest in track-and-trace solutions; failure to comply with the regulations carries a maximum fine of $15 million and a criminal penalty of up to 15 years. Indian suppliers are not only free from such compliance regulations, but also cost-conscious, with low volumes of high-value cars. It is believed that the cost of RFID solutions is not yet justified in the Indian market [16].

G. Correctional Facilities: RFID Tracking

A researcher studied a correctional facility in Cleveland, Ohio to evaluate the impact of RFID technology in deterring misconduct such as sexual assault. The technology was considered because of its value in confirming inmate counts and perimeter controls. In addition, corrections officers can use such technology to check inmate locations against predetermined schedules, to detect whether rival gang members are in close proximity, to classify and track the proximity of former intimate partners, to single out inmates with food allergies or health issues, and even to identify inmates who may attempt to move through the cafeteria line twice [17].

The results of the study indicated that RFID did not deter inmate misconduct, although the researchers articulated many issues that affected the results. Significant technological challenges abounded for the correctional facility as RFID tracking was implemented, including system inoperability, signal interference (e.g., “blind spots” where bracelets could not be detected), and transmission problems [18] [17].

H. Social Concerns

Social concerns plague epidermal electronics for nonmedical purposes [19]. In the United States, many states have crafted legislation to balance the potential benefits of RFID technology with the disadvantages associated with privacy and security concerns [20]. California, Georgia, Missouri, North Dakota, and Wisconsin are among states in the U.S. which have passed legislation to prohibit forced implantation of RFID in humans [21]. The “Microchip Consent Act of 2010”, which became effective on July 1, 2010 in the state of Georgia, not only stated that no person shall be required to be implanted with a microchip (regardless of a state of emergency), but also that voluntary implantation of any microchip may only be performed by a physician under the authority of the Georgia Composite Medical Board.

Through the work of Rodotà and Capurro in 2005, the European Group on Ethics in Science and New Technologies to the European Commission examined the ethical questions arising from science and new technologies. The role of the opinion was to raise awareness of the dilemmas created by both medical and non-medical implants in humans, which affect the intimate relation between bodily and psychic functions basic to our personal identity [22]. The opinion stated that information and communications technology implants should not be used to manipulate mental functions or to change a personal identity. Additionally, the opinion stated that principles of data protection must be applied to protect personal data embedded in implants [23]. The opinion identified the implants as a threat to human dignity when used for surveillance purposes, although it noted that this might be justifiable for security and/or safety reasons [24].

I. Increased Levels of Willingness to Adopt: 2005–2010

Researchers continue to investigate the social acceptance of implanting this technology into human bodies. In 2006, researchers reported higher levels of acceptance of the implantation of a chip when college students perceived benefits from the technology [25]. Utilizing the same questions posed in 2005 to college students attending both private and public institutions of higher education, the researchers in 2010 again investigated levels of willingness to implant RFID chips, to understand whether there had been shifts among college students [25] [26]. In both studies, students were asked: “How willing would you be to implant an RFID chip in your body as a method (to reduce identity theft, as a potential lifesaving device, to increase national security)?” A 5-point Likert-type scale was used, ranging from “Strongly Unwilling” to “Strongly Willing”. Comparing the 2005 results with the 2010 results revealed shifts in the levels of willingness of college students: levels moved from unwillingness toward either neutrality or willingness to implant a chip in the human body to reduce identity theft, as a potential lifesaving device, and to increase national security. Between 2005 and 2010, the unwillingness (“Strongly unwilling” and “Somewhat unwilling”) of college students to implant an RFID chip into their bodies decreased by 22.4% when considering RFID implants as a method to reduce identity theft, by 19.9% when considering RFID implants as a potential lifesaving device, and by 16.3% when considering RFID implants to increase national security [26].
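The year-over-year comparison above amounts to summing the two “unwilling” Likert categories in each year and differencing the shares. The distributions below are invented purely to illustrate that calculation; only the method mirrors the studies, not the numbers.

```python
# Worked sketch of the 2005-vs-2010 comparison. The Likert counts below are
# INVENTED for illustration; they are not the studies' data.

LIKERT = ["strongly unwilling", "somewhat unwilling", "neutral",
          "somewhat willing", "strongly willing"]

def unwilling_share(counts):
    """Percentage of respondents in the two 'unwilling' categories."""
    return 100.0 * (counts[0] + counts[1]) / sum(counts)

counts_2005 = [220, 120, 80, 50, 30]   # hypothetical distribution
counts_2010 = [150, 110, 120, 70, 50]  # hypothetical distribution

shift = unwilling_share(counts_2005) - unwilling_share(counts_2010)
print(f"unwillingness fell by {shift:.1f} percentage points")  # 16.0 here
```

With these invented counts the unwilling share falls from 68.0% to 52.0%, a 16.0-point drop of the same order as the 16.3% to 22.4% decreases reported above.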

J. RFID Implant Study: German Tech Conference Delegates

A 2010 survey of individuals attending a technology conference, conducted by BITKOM, a German information technology industry lobby group, reported that 23% of 1,000 respondents would be prepared to have a chip inserted under their skin for certain benefits; 72% of respondents, however, reported they would not allow implantation of a chip under any circumstances. Sixteen percent (16%) of respondents reported they would accept an implant to allow emergency services to rescue them more quickly in the event of a fire or accident [27].

K. Ask India: Are Implants a More Secure Technology?

Previously, researchers reported a significant chi-square analysis relative to countries of residence and perceptions of chip implants as a more secure technology for identification/access control in organizations. More participants from India than expected (46 vs. 19.8; adjusted residual = 7.5) responded “yes” to implants as a more secure technology. When compared against the other countries in the study, fewer residents of the UK responded “yes” than expected (9 vs. 19.8), and fewer residents of the USA responded “yes” than expected (11 vs. 20.9). In rank order, the countries contributing to this significant relationship were India, the UK, and the USA; no such differences in opinion were found for respondents from Australia [28].
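The “observed vs. expected” cell comparisons above are typically made with adjusted standardized residuals, one per cell of the contingency table. In the sketch below the “yes” counts follow the figures quoted above, but the row totals are assumed (roughly equal per country, summing to N = 453, and not reported here), so the computed residual only approximates the reported +7.5.

```python
# Adjusted standardized residuals for a contingency table, using the standard
# formula (O - E) / sqrt(E * (1 - row/n) * (1 - col/n)). The "yes" counts
# mirror the figures quoted above; the row totals are ASSUMED for the sketch.
import math

def adjusted_residuals(observed):
    row_totals = [sum(r) for r in observed]
    col_totals = [sum(c) for c in zip(*observed)]
    n = sum(row_totals)
    return [
        [
            (observed[i][j] - row_totals[i] * col_totals[j] / n)
            / math.sqrt(
                (row_totals[i] * col_totals[j] / n)
                * (1 - row_totals[i] / n)
                * (1 - col_totals[j] / n)
            )
            for j in range(len(col_totals))
        ]
        for i in range(len(row_totals))
    ]

# Rows: Australia, India, UK, USA; columns: "yes" / "no" to implants as more secure.
table = [[14, 99], [46, 67], [9, 104], [11, 103]]
res = adjusted_residuals(table)
print(f"India 'yes' residual = {res[1][0]:+.1f}")  # roughly +7.4 with these assumed totals
```

Cells whose adjusted residual exceeds about |2| depart notably from independence, which is why India's large positive residual dominates the reported relationship.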

Due to heightened security threats, there appears to be a surge in demand for security in India [29] [30]. A progression of mass-casualty assaults carried out by extremist Pakistani nationals against hotels and government buildings in India has brought greater awareness of the potential threats to less secure establishments [30]. The government is working to institute security measures at the individual level with a form of national ID card that will house key biometric data of the individual. In local and regional settings, technological infrastructure is developing rapidly in metro and non-metro areas because of the increase in MNCs (multinational corporations) now locating in India. Although the neighborhood “chowkidaar” (human guard/watchman) was previously the more popular measure for localized security, advances in, and the reliability and availability of, security technology are believed to be affecting the adoption of electronic access security as a replacement for more traditional security measures [29] [30].

L. Prediction of Adoption of Technology

Many models have been developed and utilized to understand factors that affect the acceptance of technology, such as: the Moguls Model of Computing by Ndubisi, Gupta, and Ndubisi in 2005; Diffusion of Innovations Theory by Rogers in 1983; the Theory of Planned Behavior by Ajzen in 1991; the Model of PC Utilization attributed to Thompson, Higgins, and Howell in 1991; Protection Motivation Theory (PMT) by Rogers in 1985; and the Theory of Reasoned Action attributed to Fishbein & Ajzen in 1975, with additional revisions by the same in 1980 [31].

Researchers in Berlin, Germany investigated consumers' reactions to RFID in retail. After viewing an introductory stimulus film about RFID services in retail, participants evaluated the technology and potential privacy mechanisms. Participants were asked to rate, on a five-point Likert-type scale (ranging from “not at all sensitive” to “extremely sensitive”), their attitudes toward privacy with such statements as “Generally, I want to disclose the least amount of data about myself” or “To me it is irrelevant if somebody knows what I buy for my daily needs.” In the study, participants reported moderate privacy awareness and, interestingly, a moderate expectation that legal regulations would result in sufficient privacy protection. Results showed that the extent to which people value the protection of their privacy strongly influences how willing they are to accept RFID in retail. Participants were aware of privacy problems with RFID-based services; however, if retailers articulated that they valued customers' privacy, participants appeared more likely to adopt the technology. Thus, privacy protection (and the communication of it) was found to be an essential element of RFID rollouts [32].

SECTION III. Methodology

This quantitative, descriptive study investigated whether there were relationships between countries of residence and perceived barriers to RFID chip implants in humans for identification and access control purposes in organizations. The survey took place between April 4, 2011 and April 18, 2011, and took an average of 10 minutes to complete online. Participants, who were small business owners within four countries (Australia, India, the UK, and the USA), were asked: “As a senior executive, what do you believe are the greatest barriers in instituting chip implants for access control in organizations?” Relative to gender, 51.9% of participants were male and 48.1% were female. The ages of participants ranged from 18 to 71 years; the mean age was 44 and the median age was 45. Eighty percent of the organizations surveyed had fewer than 5 employees. Table I shows the survey participants' industry sectors.

Table I Senior executives' industry sector


The study employed one instrument that collected key data on the business profile, the technologies currently utilized for identification and access control at the organization, and the senior executives' perceptions of RFID implants in humans for identification and access control in organizations. Twenty-five percent of the small business owners who participated in the survey said they had electronic ID access to their premises. Twenty percent of the employee ID cards used by these small businesses came equipped with a photograph, and fewer than five percent of owners stated they had had a security breach in the 12 months preceding the study.

Descriptive statistics, including frequency counts and measures of central tendency, were run, and chi-square analyses were conducted to examine whether there were relationships between the respondents' countries and each of the perceived barriers to instituting microchips in humans.
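The chi-square and adjusted-residual procedure used throughout the Findings section can be sketched in a few lines. The counts below are hypothetical (the paper's raw contingency tables are not reproduced here); the residual formula is the standard Haberman adjusted residual, and the |residual| > 2.0 decision rule matches the one the authors apply.

```python
import numpy as np

def chi_square_with_adjusted_residuals(observed):
    """Pearson chi-square test of independence for a contingency table,
    plus the adjusted (standardised) residual for each cell. Cells with
    |adjusted residual| > 2.0 are the ones driving a significant result."""
    observed = np.asarray(observed, dtype=float)
    n = observed.sum()
    row_totals = observed.sum(axis=1, keepdims=True)
    col_totals = observed.sum(axis=0, keepdims=True)
    expected = row_totals @ col_totals / n          # E_ij = row_i * col_j / n
    chi2 = ((observed - expected) ** 2 / expected).sum()
    # Haberman's adjusted residual: (O - E) / sqrt(E * (1 - row/n) * (1 - col/n))
    adj = (observed - expected) / np.sqrt(
        expected * (1 - row_totals / n) * (1 - col_totals / n)
    )
    return chi2, expected, adj

# Hypothetical 2x2 table: "yes"/"no" counts for two countries of residence.
table = [[30, 70],
         [10, 90]]
chi2, expected, adj = chi_square_with_adjusted_residuals(table)
print(round(chi2, 2))       # 12.5
print(round(adj[0, 0], 2))  # 3.54 -> exceeds 2.0, so this cell drives the result
```

Reading the output mirrors the paper's reporting style: the first cell's observed count (30) against its expected count (20.0), with an adjusted residual above 2.0, identifies which country/response combination creates the significant relationship.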

SECTION IV. Findings

A significant relationship was reported relative to respondents' countries for three of the six choices provided in the multichotomous question: “As a senior executive, what do you believe are the greatest barriers in instituting chip implants for access control in organizations?”

A. Barrier: Technological Issues

The significant chi-square analysis indicated that there was a relationship between the respondents' countries and the perceived barrier of technological issues. Using the rule of identifying adjusted residuals greater than 2.0, examination of the adjusted residuals indicated that the relationship was created when more participants than expected from India selected “technological issues (RFID is inherently an insecure technology)” as a barrier to instituting chip implants (45 vs. 31.1; adjusted residual = 3.4).

B. Barrier: Philosophical Issues

The second significant chi-square analysis (df = 3) indicated that there was a relationship between the respondents' countries and the perceived barrier of philosophical issues (right of control over one's body). An examination of the adjusted residuals indicated that the relationship was mostly created when fewer participants than expected from India selected philosophical issues as a barrier to instituting chip implants (37 vs. 61.3; adjusted residual = 5.3). In addition, more residents of Australia than expected (78 vs. 62.9; adjusted residual = 3.3) selected philosophical issues as a barrier. In rank order, the countries contributing to this significant relationship were India, followed by Australia; no such differences in opinion were found for respondents from the UK and the USA.

C. Barrier: Health Issues

The third significant chi-square analysis indicated there was a relationship between the respondents' countries and the perceived barrier of health issues (unknown risks related to implants). An examination of the adjusted residuals indicated that the relationship was mostly created when more residents of India than expected selected health issues as a barrier to instituting chip implants (57 vs. 43.3; adjusted residual = 3.1). In addition, fewer residents of the USA than expected (36 vs. 45.7; adjusted residual = 2.1) selected health issues as a barrier. In rank order, the countries contributing to this significant relationship were India, followed by the USA; no such differences in opinion were found for respondents from Australia and the UK.

D. Barrier: Social Issues, Religious Issues, and Cultural Issues

There were no significant chi-square analyses reported with respect to respondents' countries and social issues (digital divide), religious issues (mark of the beast), and cultural issues (incisions into the skin are taboo). Thus, in this study the researchers concluded no such differences in opinion were found for respondents' countries of residence and the barriers of social issues, religious issues, and cultural issues.

E. Statistical Summary

When asked whether radiofrequency identification (RFID) transponders surgically implanted beneath the skin of an employee would be a more secure technology for instituting employee identification in the organization, only eighteen percent believed so. When subsequently asked how many staff in their organization would opt for an employee ID chip implant instead of the current technology if it were available, respondents indicated that eighty percent would not opt in. These figures are consistent with an in-depth interview conducted with consultant Gary Retherford, who was responsible for the first small-business adoption of RFID implants for access control at Citywatcher.com in 2006 [33]–[35]. In terms of the perceived barriers to instituting an RFID implant for access control in organizations, senior executives stated the following (in order of greatest to least barrier): 61% said health issues, 55% philosophical issues, 43% social issues, 36% cultural issues, 31% religious issues, and 28% technological issues.

F. Open-Ended Question

When senior executives were asked if they themselves would adopt an RFID transponder surgically implanted beneath the skin, the responses were summarized into three categories: no, unsure, and yes [36]. We present a representative list of these responses below; a future study will focus on providing in-depth qualitative content analysis.

1) No, I Would Not Get an RFID Implant

“No way would I. Animals are microchipped, not humans.”

“Absurd and unnecessary.”

“I absolutely would not have any such device implanted.”

“Hate it and object strongly.”

“No way.”

“No thanks.”

“Yuk.”

“Absolutely creepy and unnecessary.”

“Would not consider it.”

“I would leave the job.”

“I don't like the idea one bit. The idea is abhorrent. It is invasive both physically and psychologically. I would never endorse it.”

“Would never have it done.”

“Disagree invading my body's privacy.”

“Absolutely vehemently opposed.”

“This proposal is a total violation of human rights.”

“Yeah right!! and get sent straight to hell! not this little black duck!”

“I do not believe you should put things in your body that God did not supply you with …”

“I wouldn't permit it. This is a disgraceful suggestion. The company does not OWN the employees. Slavery was abolished in developed countries more than 100 years ago. How dare you even suggest such a thing. You should be ashamed.”

“I would sooner stick pins in my eyeballs.”

“It's just !@;#%^-Nazi's???”

2) I am Unsure about Getting an RFID Implant

“A bit overkill for identification purposes.”

“Uncomfortable.”

“Maybe there is an issue with OH&S and personal privacy concern.”

“Unsure.”

“Only if I was paid enough to do this, $100000 minimum.”

“Unsure, seems very robotic.”

“I'm not against this type of device but I would not use it simply for business security.”

“A little skeptical.”

“A little apprehensive about it.”

3) Yes, I would Get an RFID Implant

“Ok, but I would be afraid that it could be used by outside world, say police.”

“Sick!”

“It is a smart idea.”

“It would not be a problem for me, but I own the business so no philosophical issues for me.”

“I'd think it was pretty damn cool.”

SECTION V. Discussion: Perceived Barriers

A. Barrier: Technological Issues

The literature revealed many technological barriers for non-implantable chips; this study suggests the same barrier is also perceived for implantable chips, and the two are likely to be related [37]. More Indian participants than expected in this study selected technological issues (RFID is inherently an insecure technology) as a barrier to instituting chip implants for access control; no such differences of opinion were found for the other countries in the study. However, the literature revealed in other analyses that more Indian participants than expected answered “yes” when asked if implants are a more secure technology for instituting identification/access control in an organization. The findings appear to suggest that although Indian participants perceive RFID implants as a more secure technology when compared with other methods (manual, paper-based, smartcards, or biometric/RFID cards), they are likely to view the technology as undeveloped and still too emergent. Further research is needed to substantiate this conclusion, although a review of the literature revealed that RFID solution providers are already abundant in India, with many new companies launching at a rapid pace. Without standards and regulations, providers remain unskilled and uneducated in the technology, delivering solutions that often do not prove successful in implementation. Customers then deem the technology inconsistent and ineffective in its current state. In addition, RFID players undercut each other, offering cheap pricing for cheap, underperforming hardware. Therefore, the preliminary conclusion of the researchers is that adoption of implants in India is likely to be inhibited, not only now but well into the future, if implementations of non-implantable RFID solutions continue to misrepresent the capabilities of the technology. It is likely that, well before accepting implantable chips, individuals in India would need to be assured of the consistency and effectiveness of RFID chips in non-human applications.

B. Barrier: Philosophical Issues

Fewer Indian participants than expected selected philosophical issues (right of control over one's body) as a barrier, while more Australian participants than expected selected this as a barrier. The researchers concluded that this is fertile ground for future research [38]. The deep cultural assumptions of each country are likely to influence participants' responses. For example, although Indian philosophies vary, many emphasize the continuity of the soul or spirit rather than the temporary state of the flesh (the body). Further research could inform these findings by exploring how and why participants in India, versus participants in Australia, perceive their own right of control over the body.

C. Barrier: Health Issues

More Indian participants than expected selected health issues (unknown risks related to implants) as a barrier to instituting implants, while fewer American participants than expected selected this as a barrier. The researchers conclude that these results may stem from perceived successes with current usage of the technology. The literature revealed that participants from India are experiencing poor implementations of the technology. Conversely, Americans are increasingly exposed to the use of surgically implanted chips in pets (often with no choice if the pet is adopted from a shelter), with few or no health issues faced [39]. In addition, segments of the healthcare industry are advocating RFID for use in the supply chain (e.g., the blood supply) with much success. To inform these findings, further research is needed to explore how participants from each country describe the unknown risks related to implants.

SECTION VI. Conclusion

In conclusion, the authors recognize there are significant social implications relative to implanting chips in humans. Although voluntary chipping has been embraced by certain individuals, the chipping of humans remains rare and mostly a topic of discussion and debate. Privacy and security issues abound and are not to be minimized. However, in the future we may see an increased demand for, and acceptance of, chipping, especially as the global environment intensifies. Considering the increase in natural disasters over the past two years, the rising tensions between nations, such as those faced by India with terrorism by extremists from neighboring countries, and the recent contingency plans to enact border controls to manage refugees fleeing failing countries in the Eurozone, the tracking of humans may once again come to the forefront, as it did post-9/11 when rescuers raced against the clock to locate survivors in the rubble.

India is of particular interest in this study; participants from this country contributed most in many of the analyses. India is categorized as a developing country (or newly industrialized country) and is the second most populous country in the world. The government of India is already utilizing national identification cards housing biometrics, although the rollout has been delayed as officials work to solve issues around cards that can be stolen or misplaced, as well as how to prevent fraudulent use after a cardholder's death. Technological infrastructure is improving in even the more remote regions of India as MNCs (multi-national corporations) locate business divisions in the country. The findings, set against the backdrop of the literature review, bring to light what seems to be a population statistically more open to (and possibly ready for) the technology of implants than the developed countries in the study. However, ill-informed RFID players in India are selling low-quality products; there appears to be a lack of standards and insufficient knowledge of the technology among those who should know it best. Further research is necessary not only to understand the Indian perspective, but also to better understand the environment now and into the future.

References

1. K. Michael and M. G. Michael, "The Diffusion of RFID Implants for Access Control and ePayments: Case Study on Baja Beach Club in Barcelona, " in IEEE International Symposium on Technology and Society (ISTAS10), Wollongong, Australia, 2010, pp. 242-252.

2. K. Michael and M. G. Michael, "Implementing Namebers Using Microchip Implants: The Black Box Beneath The Skin, " in This Pervasive Day: The Potential and Perils of Pervasive Computing, J. Pitt, Ed., ed London, United Kingdom: Imperial College Press, 2012, pp. 163-203.

3. K. Michael and M. G. Michael, "The Social, Cultural, Religious and Ethical Implications of Automatic Identification, " in The Seventh International Conference on Electronic Commerce Research, Dallas, Texas, 2004, pp. 432-450.

4. M. G. Michael and K. Michael, "A note on uberveillance, " in From dataveillance to uberveillance and the realpolitik of the transparent society, K. Michael and M. G. Michael, Eds., ed Wollongong: University of Wollongong, 2006, pp. 9-25.

5. M. G. Michael and K. Michael, Eds., Uberveillance and the Social Implications of Microchip Implants (Advances in Human and Social Aspects of Technology). Hershey, PA: IGI Global, 2014.

6. J. Stokes. (2004, October 14, 2004). FDA approves implanted RFID chip for humans. Available: http://arstechnica.com/uncategorized/2004/10/4305-2/

7. K. Michael, et al., "Microchip Implants for Humans as Unique Identifiers: A Case Study on VeriChip, " in Conference on Ethics, Technology, and Identity, Delft, Netherlands, 2008.

8. K. Opam. (2011, August 22, 2011). RFID Implants Won't Rescue the People Kidnapped in Mexico. Available: http://gizmodo.com/5833237/rfid-implants-wont-work-if-youve-beenkidnapped-in-mexico

9. C. Swedberg. (2005, June 12, 2012). L.A. County Jail to track inmates. Available: http://www.rfidjournal.com/article/articleview/1601/1/1

10. F. Vara-Orta. (2012, May 31, 2012). Students will be tracked via chips in IDs. Available: http://www.mysanantonio.com/news/education/article/Students-willbe-tracked-via-chips-in-IDs-3584339.php#ixzz1vszm9Wn4

11. Newstaff. (November 27, 2012, May 13, 2014). Texas School: Judge Overturns Student's Expulsion over RFID Chip. Available: http://www.govtech.com/Texas-School-Wear-RFID-Chip-or-Get-Expelled.html

12. M. Gasson, "ICT implants: The invasive future of identity?, " Advances in Information and Communication Technology, vol. 262, pp. 287-295, 2008.

13. K. D. Stephan, et al., "Social Implications of Technology: Past, Present, and Future, " Proceedings of the IEEE, vol. 100, pp. 1752-1781 2012.

14. R. Kumar. (2011, June 1, 2012). India's Big RFID Adoption Challenges. Available: http://www.rfidjournal.com/article/articleview/8145/1/82/

15. L. Radha, "Deployment of RFID (Radio Frequency Identification) at Indian academic libraries: Issues and best practice. , " International Journal of Library and Information Science, vol. 3, pp. 34-37, 2011.

16. H. Saranga, et al. (2010, June 2, 2012). Scope for RFID Implementation in the Indian Auto Components Industry. Available: http://tejasiimb.org/articles/73.php

17. N. LaVigne, "An evaluability assessment of RFID use in correctional settings, " in Final report submitted to the National Institute of Justice, ed. Washington DC: USA, 2006.

18. R. Halberstadt and N. LaVigne, "Evaluating the use of radio frequency identification device (RFID) technology to prevent and investigate sexual assaults in a correctional setting, " The Prison Journal, vol. 91, pp. 227-249, 2011.

19. A. Masters and K. Michael, "Lend me your arms: The use and implications of humancentric RFID, " Electronic Commerce and Applications, vol. 6, pp. 29-39, 2007.

20. K. Albrecht and L. McIntyre, Spychips: How Major Corporations and Government Plan to Track Your Every Purchase and Watch Your Every Move. New York: Plume, 2006.

21. A. Friggieri, et al., "The Legal Ramifications of Microchipping People in the United States of America-A State Legislative Comparison, " in IEEE International Symposium on Technology and Society (ISTAS '09), Phoenix, Arizona, 2009.

22. G. G. Assembly. (2010, January 12, 2011). Senate Bill 235. Available: http://www1.legis.ga.gov/legis/2009-10/versions/sb235-As-passed-Senate-5.htm

23. M. G. Michael and K. Michael, "Towards a State of Uberveillance, " IEEE Technology and Society Magazine, vol. 29, pp. 9-16, 2010.

24. S. Rodota and R. Capurro, "Opinion n020: Ethical aspects of ICT Implants in the human body, " in European Group on Ethics in Science and New Technologie (EGE), ed, 2005.

25. C. Perakslis and R. Wolk, "Social acceptance of RFID as a biometric security method, " IEEE Symposium on Technology and Society Magazine, vol. 25, pp. 34-42, 2006.

26. C. Perakslis, "Consumer Willingness to Adopt RFID Implants: Do Personality Factors Play a Role in the Acceptance of Uberveillance?, " in Uberveillance and the Social Implications of Microchip Implants, M. G. Michael and K. Michael, Eds., ed Hershey, PA: IGI Global, 2014, pp. 144-160.

27. A. Donoghue. (2010, March 2, 2010). CeBIT: Quarter Of Germans Happy To Have Chip Implants. Available: http://www.techweekeurope.co.uk/news/cebit-quarter-of-germanshappy-to-have-chip-implants-5590

28. R. Achille, et al., "Ethical Issues to consider for Microchip Implants in Humans, " Ethics in Biology, Engineering and Medicine vol. 3, pp. 77-91, 2012.

29. S. Das. (2009, May 1, 2012). Surveillance: Big Brothers Watching. Available: http://dqindia.ciol.commakesections.asp/09042401.asp

30. M. Krepon and N. Cohn. (2011, May 1, 2012). Crises in South Asia: Trends and Potential Consequences. Available: http://www.stimson.org/books-reports/crises-in-south-Asia-trends-Andconsequences

31. C. Jung, Psychological types. Princeton, NJ: Princeton University Press, 1923 (1971).

32. M. Rothensee and S. Spiekermann, "Between Extreme Rejection and Cautious Acceptance Consumers' Reactions to RFID-Based IS in Retail, " Science Computer Review, vol. 26, pp. 75-86, 2008.

33. K. Michael and M. G. Michael, "The Future Prospects of Embedded Microchips in Humans as Unique Identifiers: The Risks versus the Rewards, " Media, Culture &Society, vol. 35, pp. 78-86, 2013.

34. WND. (October 2, 2006, May 13, 2014). Employees Get Microchip Implants. Available: http://www.wnd.com/2006/02/34751/

35. K. Michael, "Citywatcher.com, " in Uberveillance and the Social Implications of Microchip Implants, M. G. Michael and K. Michael, Eds., ed Hershey, PA: IGI Global, 2014, pp. 133-143.

36. K. Michael, et al., "Microchip Implants for Employees in the Workplace: Findings from a Multi-Country Survey of Small Business Owners, " presented at the Surveillance and/in Everyday Life: Monitoring Pasts, Presents and Futures, University of Sydney, NSW, 2012.

37. M. N. Gasson, et al., "Human ICT Implants: Technical, Legal and Ethical Considerations, " in Information Technology and Law Series vol. 23, ed: Springer, 2012, p. 184.

38. S. O. Hansson, "Implant ethics, " Journal of Med Ethics, vol. 31, pp. 519-525, 2005.

39. K. Albrecht, "Microchip-induced tumours in laboratory rodents and dogs: A review of literature, " in Uberveillance and the Social Implications of Microchip Implants, M. G. Michael and K. Michael, Eds., ed Hershey, PA: IGI Global, 2014, pp. 281-318.

Keywords: Radiofrequency identification, Implants, Educational institutions, Organizations, Access control, Australia, transponders, authorisation, microprocessor chips, organisational aspects, radiofrequency identification, institutional microchips, perceived barriers, microchips implant, transnational study, small business owners, RFID transponders, radio frequency identification transponders, employee ID, chip implants,access control, organizations, chi-square analysis, technological issues, philosophical issues, health issues, religious issues, social issues, digital divide, cultural issues, USA, RFID, radio frequency identification, implants, microchips, uberveillance, barriers, access control, employee identification, security, small business, Australia, India, UK

Citation: Christine Perakslis, Katina Michael, M. G. Michael, Robert Gable, "Perceived barriers for implanting microchips in humans", 2014 IEEE Conference on Norbert Wiener in the 21st Century (21CW), Date of Conference: 24-26 June 2014, Date Added to IEEE Xplore: 08 September 2014. DOI: 10.1109/NORBERT.2014.6893929

Location and Tracking of Mobile Devices: Überveillance Stalks the Streets

Review Version of 7 October 2012

Published in Computer Law & Security Review 29, 3 (June 2013) 216-228

Katina Michael and Roger Clarke **

© Katina Michael and Xamax Consultancy Pty Ltd, 2012

Available under an AEShareNet licence or a Creative Commons licence.

This document is at http://www.rogerclarke.com/DV/LTMD.html

Abstract

During the last decade, location-tracking and monitoring applications have proliferated in mobile cellular and wireless data networks, and through self-reporting by applications running on smartphones equipped with onboard global positioning system (GPS) chipsets. It is now possible to pinpoint a smartphone user's position not merely to a cell, but to a small area within it. Innovators have been quick to capitalise on these location-based technologies for commercial purposes, and have gained access to a great deal of sensitive personal data in the process. Law enforcement agencies also utilise these technologies; they can do so inexpensively and hence can track many more people. Moreover, these agencies seek the power to conduct tracking covertly, and without a judicial warrant. This article investigates the dimensions of the problem of tracking people through the devices that they carry. Location surveillance has very serious negative implications for individuals, yet there are very limited safeguards. It is incumbent on legislatures to address these problems, through both domestic laws and multilateral processes.

1. Introduction

Personal electronic devices travel with people, are worn by them, and are, or soon will be, inside them. Those devices are increasingly capable of being located, and, by recording the succession of locations, tracked. This creates a variety of opportunities for the people concerned. It also gives rise to a wide range of opportunities for organisations, at least some of which are detrimental to the person's interests.

Commonly, the focus of discussion of this topic falls on mobile phones and tablets. It is intrinsic to the network technologies on which those devices depend that the network operator has at least some knowledge of the location of each handset. In addition, many such devices have onboard global positioning system (GPS) chipsets, and self-report their coordinates to service-providers. The scope of this paper encompasses those already-well-known forms of location and tracking, but it extends beyond them.

The paper begins by outlining the various technologies that enable location and tracking, and identifies those technologies' key attributes. The many forms of surveillance are then reviewed, in order to establish a framework within which applications of location and tracking can be characterised. Applications are described, and their implications summarised. Controls are considered, whereby potential harm to the interests of individuals can be prevented or mitigated.

2. Relevant Technologies

The technologies considered here involve a device that has the following characteristics:

  • it is conveniently portable by a human, and
  • it emits signals that:
    • enable some other device to compute the location of the device (and hence of the person), and
    • are sufficiently distinctive that the device is reliably identifiable at least among those in the vicinity, and hence the device's (and hence the person's) successive locations can be detected, and combined into a trail

The primary form-factors for mobile devices are currently clam-shape (portable PCs), thin rectangles suitable for the hand (mobile phones), and flat forms (tablets). Many other form-factors are also relevant, however. Anklets imposed on dangerous prisoners, and even as conditions of bail, carry RFID tags. Chips are carried in cards of various sizes, particularly the size of credit-cards, and used for tickets for public transport and entertainment venues, aircraft boarding-passes, toll-road payments and in some countries to carry electronic cash. Chips may conduct transactions with other devices by contact-based means, or contactless, using radio-frequency identification (RFID) or its shorter-range version near-field communication (NFC) technologies. These capabilities are in credit and debit cards in many countries. Transactions may occur with the cardholder's knowledge, with their express consent, and with an authentication step to achieve confidence that the person using the card is authorised to do so. In a variety of circumstances, however, some and even all of those safeguards are dispensed with. The electronic versions of passports that are commonly now being issued carry such a chip, and have an autonomous communications capability. The widespread issue of cards with capabilities uncontrolled by, and in many cases unknown to, the cardholder, is causing consternation among segments of the population that have become aware of the schemes.

Such chips can be readily carried in other forms, including jewellery such as finger-rings, and belt-buckles. Endo-prostheses such as replacement hips and knees and heart pacemakers can readily carry chips. A few people have voluntarily embedded chips directly into their bodies for such purposes as automated entry to premises (Michael & Michael 2009).

In order to locate and track such devices, any sufficiently distinctive signals may in principle suffice. See Raper et al. (2007a) and Mautz (2011). In practice, the signals involved are commonly those transmitted by a device in order to take advantage of wireless telecommunications networks. The scope of the relevant technologies therefore also encompasses the signals, devices that detect the signals, and the networks over which the data that the signals contain are transmitted.

In wireless networks, it is generally the case that the base station or router needs to be aware of the identities of devices that are currently within the cell. A key reason for this is to conserve limited transmission capacity by sending messages only when the targeted device is known to be in the cell. This applies to all of:

  • cellular mobile originally designed for voice telephony and extended to data (in particular those using the '3G' standards GSM/GPRS, CDMA2000 and UMTS/HSPA and the '4G' standard LTE)
  • wireless local area networks (WLANs, commonly Wifi / IEEE 802.11x - RE 2010a)
  • wireless wide area networks (WWANs, commonly WiMAX / IEEE 802.16x - RE 2010b).

Devices in such networks are uniquely identified by various means (Clarke & Wigan 2011). In cellular networks, there is generally a clear distinction between the entity (the handset) and the identity it is adopting at any given time (which is determined by the module inserted in it). Depending on the particular standards used, what is commonly referred to as 'the SIM-card' is an R-UIM, a CSIM or a USIM. These modules store an International Mobile Subscriber Identity (IMSI), which constitutes the handset's identifier. Among other things, this enables network operators to determine whether or not to provide service, and what tariff to apply to the traffic. However, cellular network protocols may also involve transmission of a code that distinguishes the handset itself, within which the module is currently inserted. A useful generic term for this is the device 'entifier' (Clarke 2009b). Under the various standards, it may be referred to as an International Mobile Equipment Identity (IMEI), ESN, or MEID.
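As a concrete illustration of how an IMSI decomposes into country, network, and subscriber components (per the ITU-T E.212 numbering plan), the sketch below splits the digit string. The example IMSI is made up, and because the MNC length genuinely varies by country (2 digits in most of Europe, 3 in North America) and cannot be inferred from the digits alone, the function takes it as a parameter rather than guessing.

```python
from typing import NamedTuple

class ImsiParts(NamedTuple):
    mcc: str   # mobile country code (always 3 digits)
    mnc: str   # mobile network code (2 or 3 digits, country-dependent)
    msin: str  # mobile subscriber identification number (remainder)

def parse_imsi(imsi: str, mnc_length: int = 3) -> ImsiParts:
    """Split an IMSI per ITU-T E.212: MCC | MNC | MSIN, at most 15 digits."""
    if not (imsi.isdigit() and len(imsi) <= 15):
        raise ValueError("IMSI must be a string of up to 15 decimal digits")
    if mnc_length not in (2, 3):
        raise ValueError("MNC length is 2 or 3 digits")
    return ImsiParts(imsi[:3], imsi[3:3 + mnc_length], imsi[3 + mnc_length:])

# Hypothetical North-American-style IMSI (3-digit MNC):
parts = parse_imsi("310150123456789", mnc_length=3)
print(parts.mcc, parts.mnc, parts.msin)  # 310 150 123456789
```

The point of the sketch is simply that the IMSI identifies the subscription (via the module), which is why the text distinguishes it from the device entifier (IMEI/ESN/MEID) that identifies the handset itself.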

In Wifi and WiMAX networks, the device entifier may be a processor-id or, more commonly, a network interface card identifier (NIC Id). In various circumstances, other device-identifiers may be used, such as a phone-number, or an IP-address may be used as a proxy. In addition, the human using the device may be directly identified, e.g. by means of a user-account name.

A WWAN cell may cover a large area, indicatively a 50 km radius. Telephony cells may have a radius as large as 2-3 km or as small as a hundred metres. WLANs using Wifi technologies have a cell-size of less than 1 hectare, indicatively a 50-100 metre radius, but in practice often constrained by environmental factors to only 10-30 metres.

The base-station or router knows the identities of devices that are within its cell, because this is a technically necessary feature of the cell's operation. Mobile devices may auto-report their presence as often as 10 times per second. Meanwhile, the locations of base-stations for cellular services are known with considerable accuracy by the telecommunications providers. And, in the case of most private Wifi services, the location of the router is mapped to c. 30-100 metre accuracy by services such as Skyhook and Google Locations, which perform what have been dubbed 'war drives' in order to maintain their databases - in Google's case in probable violation of the telecommunications interception and/or privacy laws of at least a dozen countries (EPIC 2012).

Knowing that a device is within a particular mobile phone, WiMAX or Wifi cell provides only a rough indication of location. In order to generate a more precise estimate, within a cell, several techniques are used (McGuire et al. 2005). These include the following (adapted from Clarke & Wigan 2011. See also Figueiras & Frattasi 2010):

  • directional analysis. A single base-station may comprise multiple receivers at known locations and pointed in known directions, enabling the handset's location within the cell to be reduced to a sector within the cell, and possibly a narrow one, although without information about the distance along the sector;
  • triangulation. This involves multiple base-stations serving a single cell, at known locations some distance apart, and each with directional analysis capabilities. Particularly with three or more stations, this enables an inference that the device's location is within a small area at the intersection of the multiple directional plots;
  • signal analysis. This involves analysis of the characteristics of the signals exchanged between the handset and base-station, in order to infer the distance between them. Relevant signal characteristics include the apparent response-delay (Time Difference of Arrival - TDOA, also referred to as multilateration), and strength (Received Signal Strength Indicator - RSSI), perhaps supplemented by direction (Angle Of Arrival - AOA).
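
The triangulation and signal-analysis techniques above can be illustrated together: once ranges to three base-stations have been inferred (from TDOA or RSSI, say), the device's planar position follows from intersecting the three range circles. A minimal sketch, with illustrative coordinates and idealised noise-free ranges:

```python
import math

def trilaterate(stations, ranges):
    """Estimate a 2-D position from range measurements to three
    base-stations at known planar coordinates.

    Subtracting the first circle equation from the other two
    linearises the problem into a 2x2 system, solved by Cramer's rule.
    """
    (x0, y0), (x1, y1), (x2, y2) = stations
    r0, r1, r2 = ranges
    a1, b1 = 2 * (x1 - x0), 2 * (y1 - y0)
    c1 = r0**2 - r1**2 + x1**2 - x0**2 + y1**2 - y0**2
    a2, b2 = 2 * (x2 - x0), 2 * (y2 - y0)
    c2 = r0**2 - r2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Three towers 1 km apart; the device is actually at (400, 300) metres.
towers = [(0, 0), (1000, 0), (0, 1000)]
dists = [math.hypot(400 - x, 300 - y) for x, y in towers]
print(trilaterate(towers, dists))  # ≈ (400.0, 300.0)
```

With real measurements the ranges are noisy, so practical systems use more stations and a least-squares fit rather than an exact intersection.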

The precision and reliability of these techniques varies greatly, depending on the circumstances prevailing at the time. The variability and unpredictability result in many mutually inconsistent statements by suppliers, in the general media, and even in the technical literature.

Techniques for cellular networks generally provide reasonably reliable estimates of location to within an indicative 50-100m in urban areas and some hundreds of metres elsewhere. Worse performance has been reported in some field-tests, however. For example, Dahunsi & Dwolatzky (2012) found the accuracy of GSM location in Johannesburg to be in the range 200-1400m, and highly variable, with "a huge difference between the predicted and provided accuracies by mobile location providers".

The web-site of the Skyhook Wifi-router positioning service claims 10-metre accuracy, 1-second time-to-first-fix and 99.8% reliability (SHW 2012). On the other hand, tests have resulted in far lower accuracy measures, including an average positional error of 63m in Sydney (Gallagher et al. 2009) and "median values for positional accuracy in [Las Vegas, Miami and San Diego, which] ranged from 43 to 92 metres ... [and] the replicability ... was relatively poor" (Zandbergen 2012, p. 35). Nonetheless, a recent research article suggested the feasibility of "uncooperatively and covertly detecting people 'through the wall' [by means of their WiFi transmissions]" (Chetty et al. 2012).
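
Field-test accuracy figures of the kind quoted above (average, median and percentile positional errors) can be computed from paired true and estimated fixes using a great-circle distance. A minimal sketch; the function names are ours, and latitude/longitude pairs are assumed:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2)**2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
         * math.sin(dl / 2)**2)
    return 2 * r * math.asin(math.sqrt(a))

def error_stats(true_fixes, estimated_fixes):
    """Median and 95th-percentile positional error over paired fixes."""
    errs = sorted(haversine_m(t[0], t[1], e[0], e[1])
                  for t, e in zip(true_fixes, estimated_fixes))
    return errs[len(errs) // 2], errs[int(0.95 * (len(errs) - 1))]
```

For example, `haversine_m(0.0, 0.0, 0.0, 0.001)` is roughly 111 metres, a thousandth of a degree of longitude at the equator.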

Another way in which a device's location may become known to other devices is through self-reporting of the device's position, most commonly by means of an inbuilt Global Positioning System (GPS) chip-set. This provides coordinates and altitude based on broadcast signals received from a network of satellites. In any particular instance, the user of the device may or may not be aware that location is being disclosed.

Despite widespread enthusiasm and a moderate level of use, GPS is subject to a number of important limitations. The signals are subject to interference from atmospheric conditions, buildings and trees, and the time to achieve a fix on enough satellites and deliver a location measure may be long. This results in variability in its practical usefulness in different circumstances, and in its accuracy and reliability. Civil-use GPS coordinates are claimed to provide accuracy within a theoretical 7.8m at a 95% confidence level (USGov 2012), but various reports suggest practical accuracies of 15 m, 20 m or 30 m, and sometimes as poor as 100 m. It may be affected by radio interference and jamming. The original and still-dominant GPS service operated by the US Government was subject to intentional degradation in the US's national interests. This 'Selective Availability' feature still exists, although subject to a decade-long policy not to use it; and future generations of GPS satellites may no longer support it.

Hybrid schemes exist that use two or more sources in order to generate more accurate location-estimates, or to generate estimates more quickly. In particular, Assisted GPS (A-GPS) utilises data from terrestrial servers accessed over cellular networks in order to more efficiently process satellite-derived data (e.g. RE 2012).

Further categories of location and tracking technologies emerge from time to time. A current example uses means described by the present authors as 'mobile device signatures' (MDS). A device may monitor the signals emanating from a user's mobile device, without being part of the network that the user's device is communicating with. The eavesdropping device may detect particular signal characteristics that distinguish the user's mobile device from others in the vicinity. In addition, it may apply any of the various techniques mentioned above, in order to locate the device. If the signal characteristics are persistent, the eavesdropping device can track the user's mobile device, and hence the person carrying it. No formal literature on MDS has yet been located. The supplier's brief description is at PI (2010b).
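
Since no formal literature on MDS has been located, the following is necessarily speculative: if some signal characteristic is persistent, passively observed sightings can be keyed on it to reconstruct per-device trails. In this sketch the 'signature' field simply stands in for whatever proprietary characteristic an eavesdropping device actually uses, and all data is invented:

```python
from collections import defaultdict

# Hypothetical passive sightings: (seconds, signature, zone). The
# signature stands in for whatever persistent, distinguishing signal
# characteristic the (undocumented) MDS technique actually keys on.
sightings = [
    (0,   "sig-01", "entrance"),
    (60,  "sig-01", "food-court"),
    (30,  "sig-02", "entrance"),
    (120, "sig-01", "exit"),
]

# Group sightings by signature to reconstruct each device's trail,
# and hence the movements of the person carrying it.
trails = defaultdict(list)
for t, sig, zone in sorted(sightings):
    trails[sig].append(zone)

print(dict(trails))
# {'sig-01': ['entrance', 'food-court', 'exit'], 'sig-02': ['entrance']}
```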

The various technologies described in this section are capable of being applied to many purposes. The focus in this paper is on their application to surveillance.

3. Surveillance

The term surveillance refers to the systematic investigation or monitoring of the actions or communications of one or more persons (Clarke 2009c). Until recent times, surveillance was visual, and depended on physical proximity of an observer to the observed. The volume of surveillance conducted was kept in check by the costs involved. Surveillance aids and enhancements emerged, such as binoculars and, later, directional microphones. During the 19th century, the post was intercepted, and telephones were tapped. During the 20th century, cameras enabled transmission of image, video and sound to remote locations, and recording for future use (e.g. Parenti 2003).

With the surge in stored personal data that accompanied the application of computing to administration in the 1970s and 1980s, dataveillance emerged (Clarke 1988). Monitoring people through their digital personae rather than through physical observation of their behaviour is much more economical, and hence many more people can be subjected to it (Clarke 1994). The dataveillance epidemic made it more important than ever to clearly distinguish between personal surveillance - of an identified person who has previously come to attention - and mass surveillance - of many people, not necessarily previously identified, about some or all of whom suspicion could be generated.

Location data is of a very particular nature, and hence it has become necessary to distinguish location surveillance as a sub-set of the general category of dataveillance. There are several categories of location surveillance with different characteristics (Clarke & Wigan 2011):

  • capture of an individual's location at a point in time. Depending on the context, this may support inferences being drawn about an individual's behaviour, purpose, intention and associates
  • real-time monitoring of a succession of locations and hence of the person's direction of movement. This is far richer data, and supports much more confident inferences being drawn about an individual's behaviour, purpose, intention and associates
  • predictive tracking, by extrapolation from the person's direction of movement, enabling inferences to be drawn about near-future behaviour, purpose, intention and associates
  • retrospective tracking, on the basis of the data trail of the person's movements, enabling reconstruction of a person's behaviour, purpose, intention and associates at previous times
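
The predictive-tracking category rests on extrapolation from a person's direction and rate of movement. A deliberately naive constant-velocity sketch (timestamps in seconds, positions in metres in a local planar frame, all values illustrative):

```python
def predict(fixes, t_future):
    """Extrapolate a future position from the last two timestamped
    fixes ((t, x, y) tuples), assuming constant velocity.
    """
    (t1, x1, y1), (t2, x2, y2) = fixes[-2:]
    dt = t2 - t1
    vx, vy = (x2 - x1) / dt, (y2 - y1) / dt
    return (x2 + vx * (t_future - t2), y2 + vy * (t_future - t2))

# A device moving east at 10 m/s, extrapolated 10 s ahead:
print(predict([(0, 0.0, 0.0), (10, 100.0, 0.0)], 20))  # (200.0, 0.0)
```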

Information arising at different times, and from different forms of surveillance, can be combined, in order to offer a more complete picture of a person's activities, and enable yet more inferences to be drawn, and suspicions generated. This is the primary sense in which the term 'überveillance' is applied: "Überveillance has to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought). Überveillance can be a predictive mechanism for a person's expected behaviour, traits, likes, or dislikes; or it can be based on historical fact; or it can be something in between ... Überveillance is more than closed circuit television feeds, or cross-agency databases linked to national identity cards, or biometrics and ePassports used for international travel. Überveillance is the sum total of all these types of surveillance and the deliberate integration of an individual's personal data for the continuous tracking and monitoring of identity and location in real time" (Michael & Michael 2010. See also Michael & Michael 2007, Michael et al. 2008, Michael et al. 2010, Clarke 2010).

A comprehensive model of surveillance includes consideration of geographical scope, and of temporal scope. Such a model assists the analyst in answering key questions about surveillance: of what? for whom? by whom? why? how? where? and when? (Clarke 2009c). Distinctions are also needed based on the extent to which the subject has knowledge of surveillance activities. It may be overt or covert. If covert, it may be merely unnotified, or alternatively express measures may be undertaken in order to obfuscate, and achieve secrecy. A further element is the notion of 'sousveillance', whereby the tools of surveillance are applied, by those who are commonly watched, against those who are commonly the watchers (Mann et al. 2003).

These notions are applied in the following sections in order to establish the extent to which location and tracking of mobile devices is changing the game of surveillance, and to demonstrate that location surveillance is intruding more deeply into personal freedoms than previous forms of surveillance.

4. Applications

This section presents a typology of applications of mobile device location, as a means of narrowing down to the kinds of uses that have particularly serious privacy implications. These are commonly referred to as location-based services (LBS). One category of applications provides information services for the benefit of the mobile device's user, such as navigation aids, and search and discovery tools for the locations variously of particular, identified organisations, and of organisations that sell particular goods and services. Users of LBS of these kinds can be reasonably assumed to be aware that they are disclosing their location. Depending on the design, the disclosures may also be limited to specific service-providers and specific purposes, and the transmissions may be secured.

Another, very different category of application is use by law enforcement agencies (LEAs). The US E-911 mandate of 1999 was nominally a public safety measure, to enable people needing emergency assistance to be quickly and efficiently located. In practice, the facility also delivered LEAs means for locating and tracking people of interest, through their mobile devices. Personal surveillance may be justified by reasonable grounds for suspicion that the subject is involved in serious crime, and may be specifically authorised by judicial warrant. Many countries have always been very loose in their control over LEAs, however, and many others have drastically weakened their controls since 2001. Hence, in any given jurisdiction and context, each and all of the controls may be lacking.

Yet worse, LEAs use mobile location and tracking for mass surveillance, without any specific grounds for suspicion about any of the many people caught up in what is essentially a dragnet-fishing operation (e.g. Mery 2009). Examples might include monitoring the area adjacent to a meeting-venue watching out for a blacklist of device-identifiers known to have been associated with activists in the past, or collecting device-identifiers for use on future occasions. In addition to netting the kinds of individuals who are of legitimate interest, the 'by-catch' inevitably includes threatened species. There are already extraordinarily wide-ranging (and to a considerable extent uncontrolled) data retention requirements in many countries.

Of further concern is the use of Automated Number Plate Recognition (ANPR) for mass surveillance purposes. This has been out of control in the UK since 2006, and has been proposed or attempted in various other countries as well (Clarke 2009a). Traffic surveillance is expressly used not only for retrospective analysis of the movements of individuals of interest to LEAs, but also as a means of generating suspicions about other people (Lewis 2008).

Beyond LEAs, many government agencies perform social control functions, and may be tempted to conduct location and tracking surveillance. Examples would include benefits-paying organisations tracking the movements of benefits-recipients about whom suspicions have arisen. It is not too far-fetched to anticipate zealous public servants concerned about fraud control imposing location surveillance on all recipients of some particularly valuable benefit, or as a security precaution on every person visiting a sensitive area (e.g. a prison, a power plant, a national park).

Various forms of social control are also exercised by private sector organisations. Some of these organisations, such as placement services for the unemployed, may be performing outsourced public sector functions. Others, such as workers' compensation providers, may be seeking to control personal insurance claimants, and similarly car-hire companies and insurance providers may wish to monitor motor vehicles' distance driven and roads used (Economist 2012).

A further privacy-invasive practice that is already common is the acquisition of location and tracking data by marketing corporations, as a by-product of the provision of location-based services, but with the data then applied to further purposes other than that for which it was intended. Some uses rely on statistical analysis of large holdings ('data mining'). Many uses are, on the other hand, very specific to the individual, and are for such purposes as direct or indirect targeting of advertisements and the sale of goods and services. Some of these applications combine location data with data from other sources, such as consumer profiling agencies, in order to build up such a substantial digital persona that the individual's behaviour is readily influenced. This takes the activity into the realms of überveillance.

All such services raise serious privacy concerns, because the data is intensive and sensitive, and attractive to organisations. Companies may gain rights in relation to the data through market power, or by trickery - such as exploitation of a self-granted right to change the Terms of Service (Clarke 2011). Once captured, the data may be re-purposed by any organisation that gains access to it, because the value is high enough that they may judge the trivial penalties that generally apply to breaches of privacy laws to be well worth the risk.

A recently-emerged, privacy-invasive practice is the application of the mobile device signature (MDS) form of tracking, in such locations as supermarkets. This is claimed by its providers to offer deep observational insights into the behaviour of customers, including dwell-times in front of displays, possibly linked with the purchaser's behaviour. This raises concerns a little different from other categories of location and tracking technologies, and is accordingly considered in greater depth in the following section.

It is noteworthy that an early review identified a wide range of LBS, which the authors classified into mobile guides, transport, gaming, assistive technology and location-based health (Raper et al. 2007b). Yet that work completely failed to notice that a vast array of applications were emergent in surveillance, law enforcement and national security, despite the existence of relevant literature from at least 1999 onwards (Clarke 2001, Michael & Masters 2006).

5. Implications

The previous sections have introduced many examples of risks to citizens and consumers arising from location surveillance. This section presents an analysis of the categories and of the degree of seriousness with which they should be viewed. The first topic addressed is the privacy of personal location data. Other dimensions of privacy are then considered, and then the specific case of MDS is examined. The treatment here is complementary to earlier articles that have looked more generally at particular applications such as location-based mobile advertising, e.g. Cleff (2007, 2010) and King & Jessen (2010). See also Art. 29 (2011).

5.1 Locational Privacy

Knowing where someone has been, knowing what they are doing right now, and being able to predict where they might go next is a powerful tool for social control and for chilling behaviour (Abbas 2011). Humans do not move around in a random manner (Song et al. 2010).

One interpretation of 'locational privacy' is that it "is the ability of an individual to move in public space with the expectation that under normal circumstances their location will not be systematically and secretly recorded for later use" (Blumberg & Eckersley 2009). A more concise definition is "the ability to control the extent to which personal location information is ... [accessible and] used by others" (van Loenen et al. 2009). Hence 'tracking privacy' is the interest an individual has in controlling information about their sequence of locations.

Location surveillance is deeply intrusive into data privacy, because it is very rich, and enables a great many inferences to be drawn (Clarke 2001, Dobson & Fisher 2003, Michael et al. 2006a, Clarke & Wigan 2011). As demonstrated by Raper et al. (2007a, pp. 32-33), most of the technical literature that considers privacy is merely concerned about it as an impediment to deployment and adoption, and how to overcome the barrier rather than how to solve the problem. Few authors adopt a positive approach to privacy-protective location technologies. The same authors' review of applications (Raper et al. 2007b) includes a single mention of privacy, and that is in relation to just one of the scores of sub-categories of application that they catalogue.

Most service-providers are cavalier in their handling of personal data, and extravagant in their claims. For example, Skyhook claims that it "respects the privacy of all users, customers, employees and partners"; but, significantly, it makes no mention of the privacy of the people whose locations, through the locations of their Wifi routers, it collects and stores (Skyhook 2012).

Consent is critical in such LBS as personal location chronicle systems, people-followers and footpath route-tracker systems that systematically collect personal location information from a device they are carrying (Collier 2011c). The data handled by such applications is highly sensitive because it can be used to conduct behavioural profiling of individuals in particular settings. The sensitivity exists even if the individuals remain 'nameless', i.e. if each identifier is a temporary or pseudo-identifier and is not linked to other records. Service-providers, and any other organisations that gain access to the data, achieve the capacity to make judgements on individuals based on their choices of, for example, which retail stores they walk into and which they do not. For example, if a subscriber visits a particular religious bookstore within a shopping mall on a weekly basis, the assumption can be reasonably made that they are in some way affiliated to that religion (Samuel 2008).

It is frequently asserted that individuals cannot have a reasonable expectation of privacy in a public space. Contrary to those assertions, however, privacy expectations always have existed in public places, and continue to exist (VLRC 2010). Tracking the movements of people as they go about their business is a breach of a fundamental expectation that people will be 'let alone'. In policing, for example, in most democratic countries, it is against the law to covertly track an individual or their vehicle without specific, prior approval in the form of a warrant. This principle has, however, been compromised in many countries since 2001. Warrantless tracking using a mobile device generally results in the evidence, which has been obtained without the proper authority, being inadmissible in a court of law (Samuel 2008). Some law enforcement agencies have argued for the abolition of the warrant process because the bureaucracy involved may mean that the suspect cannot be prosecuted for a crime they have likely committed (Ganz 2005). These issues are not new; but far from eliminating a warrant process, the appropriate response is to invest the energy in streamlining this process (Bronitt 2010).

Privacy risks arise not only from locational data of high integrity, but also from data that is or becomes associated with a person and that is inaccurate, misleading, or wrongly attributed to that individual. High levels of inaccuracy and unreliability were noted above in respect of all forms of location and tracking technologies. In the case of MDS services, claims have been made of one-to-two metre locational accuracy. This has yet to be supported by experimental test cases, however, and hence there is uncertainty about the reliability of inferences that the service-provider or the shop-owner draw. If the data is the subject of a warrant or subpoena, the data's inaccuracy could result in false accusations and even a miscarriage of justice, with the 'wrong person' finding themselves in the 'right place' at the 'right time'.

5.2 Privacy More Broadly

Privacy has multiple dimensions. One analysis, in Clarke (2006a), identifies four distinct aspects. Privacy of Personal Data, variously also 'data privacy' and 'information privacy', is the most widely-discussed dimension of the four. Individuals claim that data about themselves should not be automatically available to other individuals and organisations, and that, even where data is possessed by another party, the individual must be able to exercise a substantial degree of control over that data and its use. The last five decades have seen the application of information technologies to a vast array of abuses of data privacy. The degree of privacy-intrusiveness is a function of both the intensity and the richness of the data. Where multiple sources are combined, the impact is particularly likely to chill behaviour. An example is the correlation of video-feeds with mobile device tracking. The previous sub-section addressed that dimension.

Privacy of the Person, or 'bodily privacy', extends from freedom from torture and right to medical treatment, via compulsory immunisation and imposed treatments, to compulsory provision of samples of body fluids and body tissue, and obligations to submit to biometric measurement. Locational surveillance gives rise to concerns about personal safety. Physical privacy is directly threatened where a person who wishes to inflict harm is able to infer the present or near-future location of their target. Dramatic examples include assassins, kidnappers, 'standover merchants' and extortionists. But even people who are neither celebrities nor notorieties are subject to stalking and harassment (Fusco et al. 2012).

Privacy of Personal Communications is concerned with the need of individuals for freedom to communicate among themselves, without routine monitoring of their communications by other persons or organisations. Issues include 'mail covers', the use of directional microphones, 'bugs' and telephonic interception, with or without recording apparatus, and third-party access to email-messages. Locational surveillance thereby creates new threats to communications privacy. For example, the equivalent of 'call records' can be generated by combining the locations of two device-identifiers in order to infer that a face-to-face conversation occurred.
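
The 'call record' equivalent described above, inferring a face-to-face conversation from the co-location of two device-identifiers, reduces to a threshold test over two location trails. A minimal sketch; the 20-metre and 60-second thresholds and the planar coordinates are illustrative assumptions:

```python
import math

def co_locations(trail_a, trail_b, max_m=20.0, max_s=60.0):
    """Flag pairs of fixes from two devices that fall within both a
    distance and a time threshold: a crude 'meeting record'.

    Trails are lists of (t_seconds, x_metres, y_metres) in a local
    planar frame.
    """
    hits = []
    for ta, xa, ya in trail_a:
        for tb, xb, yb in trail_b:
            if abs(ta - tb) <= max_s and math.hypot(xa - xb, ya - yb) <= max_m:
                hits.append((ta, tb))
    return hits
```

Applied to two trails that pass within 7 metres of each other 10 seconds apart, the function reports a single candidate meeting.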

Privacy of Personal Behaviour encompasses 'media privacy', but particular concern arises in relation to sensitive matters such as sexual preferences and habits, political activities and religious practices. Some privacy analyses, particularly in Europe, extend this discussion to personal autonomy, liberty and the right of self-determination (e.g. King & Jessen 2010). The notion of 'private space' is vital to economic and social aspects of behaviour, is relevant in 'private places' such as the home and toilet cubicles, but is also relevant and important in 'public places', where systematic observation and the recording of images and sounds are far more intrusive than casual observation by the few people in the vicinity.

Locational surveillance gives rise to rich sets of data about individuals' activities. The knowledge, or even suspicion, that such surveillance is undertaken, chills their behaviour. The chilling factor is vital in the case of political behaviour (Clarke 2008). It is also of consequence in economic behaviour, because the inventors and innovators on whom new developments depend are commonly 'different-thinkers' and even 'deviants', who are liable to come to attention in mass surveillance dragnets, with the tendency to chill their behaviour, their interactions and their creativity.

Surveillance that generates accurate data is one form of threat. Surveillance that generates inaccurate data, or wrongly associates data with a particular person, is dangerous as well. Many inferences that arise from inaccurate data will be wrong, of course, but that will not prevent those inferences being drawn, resulting in unjustified behavioural privacy invasiveness, including unjustified association with people who are, perhaps for perfectly good reasons, themselves under suspicion.

In short, all dimensions of privacy are seriously affected by location surveillance. For deeper treatments of the topic, see Michael et al. (2006b) and Clarke & Wigan (2011).

5.3 Locational Privacy and MDS

The recent innovation of tracking by means of mobile device signatures (MDS) gives rise to some issues additional to, or different from, mainstream device-location technologies. This section accordingly considers this particular technique's implications in greater depth. Limited reliable information is currently available, and the analysis is of necessity based on supplier-published sources (PI 2010a, 2010b) and media reports (Collier 2010a, 2010b, 2010c).

A company called Path Intelligence (PI) markets an MDS service to shopping mall-owners, to enable them to better value their floorspace in terms of rental revenues, and to identify points of on-foot traffic congestion to on-sell physical advertising and marketing floorspace (PI 2010a). The company claims to detect each phone (and hence person) that enters a zone, and to capture data, including:

  • how long each device and person stay, including dwell times in front of shop windows;
  • repeat visits by shoppers in varying frequency durations; and
  • typical route and circuit paths taken by shoppers as they go from shop to shop during a given shopping experience.
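
Metrics of this kind could, in principle, be derived from nothing more than pseudonymous (device-signature, zone, timestamp) sightings. The sketch below is our own reconstruction, not PI's proprietary method; the 300-second visit-gap parameter is an assumption:

```python
from collections import defaultdict

def dwell_times(sightings, gap_s=300):
    """Aggregate per-device, per-zone dwell times from
    (device, zone, t_seconds) sightings; a new visit is assumed to
    start after a gap of more than gap_s seconds.

    Returns {(device, zone): [visit_duration_seconds, ...]}.
    """
    by_key = defaultdict(list)
    for dev, zone, t in sorted(sightings, key=lambda s: s[2]):
        by_key[(dev, zone)].append(t)
    visits = defaultdict(list)
    for key, times in by_key.items():
        start = prev = times[0]
        for t in times[1:]:
            if t - prev > gap_s:
                visits[key].append(prev - start)  # close the visit
                start = t                         # open a new one
            prev = t
        visits[key].append(prev - start)
    return dict(visits)
```

Repeat-visit counts then fall out as the length of each duration list, which is the kind of aggregate a mall operator would purchase.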

For malls, PI is able to determine such things as whether or not shoppers who shop at one establishment will also shop at another in the same mall, and whether or not people will go out of their way to visit a particular retail outlet independent of its location. For retailers, PI says it is able to provide information on conversion rates by department or even product line, and even which areas of the store might require more attention by staff during specific times of the day or week (PI 2012).

PI says that it uses "complex algorithms" to determine the geographic position of a mobile, using strategically located "proprietary equipment" in a campus setting (PI 2010a). The company states that it is conducting "data-driven analysis", and that it is not collecting, or at least not disclosing, any personal information such as a name, mobile telephone number or contents of a short message service (SMS). It states that it only ever provides aggregated data at varying zone levels to the shopping mall-owners. This is presumably justified on the basis that, using MDS techniques, direct identifiers are unlikely to be available, and a pseudo-identifier needs to be assigned. There is no explicit definition of what constitutes a zone. It is clear, however, that minimally-aggregated data at the highest geographic resolution is available for purchase, and at a higher price than more highly-aggregated data.

Shoppers have no relationship with the company, and it appears unlikely that they would even be aware that data about them is being collected and used. The only disclosure appears to be that "at each of our installations our equipment is clearly visible and labelled with our logo and website address" (PI 2010a), but this is unlikely to be visible to many people, and in any case would not inform anyone who saw it.

In short, the company is generating revenue by monitoring signals from the mobile devices of people who visit a shopping mall for the purchase of goods and services. The data collection is performed without the knowledge of the person concerned (Renegar et al. 2008). The company is covertly collecting personal data and exploiting it for profit. There is no incentive or value proposition for the individual whose mobile is being tracked. No clear statement is provided about collection, storage, retention, use and disclosure of the data (Arnold 2008). Even if privacy were not a human right, this would demand statutory intervention on the public policy grounds of commercial unfairness. The company asserts that "our privacy approach has been reviewed by the [US Federal Trade Commission] FTC, which determined that they are comfortable with our practices" (PI 2010a). It makes no claims of such 'approval' anywhere else in the world.

The service could be extended beyond a mall and the individual stores within it, to, for example, associated walkways and parking areas, and surrounding areas such as government offices, entertainment zones and shopping-strips. Applications can also be readily envisaged on hospital and university campuses, and in airports and other transport hubs. From prior research, this is likely to expose the individual's place of employment, and even their residence (Michael et al. 2006). Even if only aggregated data is sold to businesses, the individual records remain available to at least the service-provider.

The scope exists to combine this form of locational surveillance with video-surveillance such as in-store CCTV, and indeed this is claimed to be already a feature of the company's offering to retail stores. To the extent that a commonly-used identifier can be established (e.g. through association with the person's payment or loyalty card at a point-of-sale), the full battery of local and externally-acquired customer transaction histories and consolidated 'public records' data can be linked to in-store behaviour (Michael & Michael 2007). Longstanding visual surveillance is intersecting with well-established data surveillance, and being augmented by locational surveillance, giving breath to dataveillance, or what is now being referred to by some as 'smart surveillance' (Wright et al. 2010, IBM 2011).

Surreptitious collection of personal data is (with exemptions and exceptions) largely against the law, even when undertaken by law enforcement personnel. The MDS mechanism also flies in the face of telephonic interception laws. How, then, can it be in any way acceptable for a form of warrantless tracking to be undertaken by or on behalf of corporations or mainstream government agencies, of shoppers in a mall, or travellers in an airport, or commuters in a transport hub? Why should a service-provider have the right to do what a law enforcement agency cannot normally do?

6. Controls

The tenor of the discussion to date has been that location surveillance harbours enormous threats not only to location privacy, but also to personal safety, the freedom to communicate, freedom of movement, and freedom of behaviour. This section examines the extent to which protections exist, firstly in the form of natural or intrinsic controls, and secondly in the form of legal provisions. The existing safeguards are found to be seriously inadequate, and it is therefore necessary to also examine the prospects for major enhancements to law, in order to achieve essential protections.

6.1 Intrinsic Controls

A variety of forms of safeguard exist against harmful technologies and unreasonable applications of them. The intrinsic economic control has largely evaporated, partly because the tools use electronics and the components are produced in high volumes at low unit cost. Another reason is that the advertising and marketing sectors are highly sophisticated, already hold and exploit vast quantities of personal data, and are readily geared up to exploit yet more data.

Neither the oxymoronic notion of 'business ethics' nor the personal morality of executives in business and government acts as a significant brake on the behaviour of corporations and governments. These are weak barriers, readily rationalised away in the face of claimed efficiency gains in, for example, marketing communications, fraud control, criminal justice and the control of anti-social behaviour.

A further category of intrinsic control is 'self-regulatory' arrangements within relevant industry sectors. In 2010, for example, the Australian Mobile Telecommunications Association (AMTA) released industry guidelines to promote the privacy of people using LBS on mobile devices (AMTA 2010). The guidelines were as follows:

  1. Every LBS must be provided on an opt-in basis with a specific request from a user for the service
  2. Every LBS must comply with all relevant privacy legislation
  3. Every LBS must be designed to guard against consumers being located without their knowledge
  4. Every LBS must allow consumers to maintain full control
  5. Every LBS must enable customers to control who uses their location information and when that is appropriate, and be able to stop or suspend a service easily should they wish

The second point is a matter for parliaments, privacy oversight agencies and law enforcement agencies, and its inclusion in industry guidelines is for-information-only. The remainder, meanwhile, are at best 'aspirational', and at worst mere window-dressing. Codes of this nature are simply ignored by industry members. They are primarily a means to hold off the imposition of actual regulatory measures. Occasional short-term constraints may arise from flurries of media attention, but the 'responsible' organisations escape by suggesting that bad behaviour was limited to a few 'cowboy' organisations or was a one-time error that won't be repeated.

A case study of industry self-regulation is provided by the Biometrics Code issued by the misleadingly-named Australian industry-and-users association, the Biometrics 'Institute' (BI 2004). During the period 2009-12, the privacy advocacy organisation, the Australian Privacy Foundation (APF), submitted to the Privacy Commissioner on multiple occasions that the Code failed to meet the stipulated requirements and under the Commissioner's own Rules had to be de-registered. The Code never had more than five subscribers (out of a base of well over 100 members - which was itself only a sub-set of organisations active in the area), and had no signatories among the major biometrics vendors or users, because all five subscribers were small organisations or consultants. In addition, none of the subscribers appears to have ever provided a link to the Code on their websites or in their Privacy Policy Statements (APF 2012).

The Commissioner finally ended the farce in April 2012, citing the "low numbers of subscribers", but avoided its responsibilities by permitting the 'Institute' to "request" revocation, over two years after the APF had made the same request (OAIC 2012). The case represents an object lesson in the vacuousness of self-regulation and the business-friendliness of a captive privacy oversight agency.

If economics, morality and industry-sector politics are inadequate, perhaps competition and organisational self-interest might work. Yet repeated proposals that privacy is a strategic factor for corporations and government agencies have fallen on stony ground (Clarke 1996, 2006b).

The public can endeavour to exercise countervailing power against privacy-invasive practices. In practice, however, individuals acting alone are of little or no consequence to organisations that are intent on the application of location surveillance. Moreover, consumer organisations lack funding, professionalism and reach, and only occasionally attract sufficient media attention to force any meaningful responses from organisations deploying surveillance technologies.

Individuals may have direct surveillance countermeasures available to them, but relatively few people have the combination of motivation, technical competence and persistence to overcome lethargy and the natural human desire to believe that the institutions surrounding them are benign. In addition, some government agencies, corporations and (increasingly prevalent) public-private partnerships seek to deny anonymity, pseudonymity and multiple identities, and to impose so-called 'real name' policies, for example as a solution to the imagined epidemics of cyber-bullying, hate speech and child pornography. Individuals who use cryptography and other obfuscation techniques have to overcome the endeavours of business and government to stigmatise them as criminals with 'something to hide'.

6.2 Legal Controls

It is clear that natural or intrinsic controls have been utter failures in privacy matters generally, and will be in locational privacy matters as well. That leaves legal safeguards for personal freedoms as the sole protection. There are enormous differences among domestic laws relating to location surveillance. This section accordingly limits itself to generalities and examples.

Privacy laws are (with some qualifications, mainly in Europe) very weak instruments. Even where public servants and parliaments have an actual intention to protect privacy, rather than merely to overcome public concerns by passing placebo statutes, the draft Bills are countered by strong lobbying by government agencies and industry, to the extent that measures that were originally portrayed as being privacy-protective reach the statute books as authority for privacy breaches and surveillance (Clarke 2000).

Privacy laws, once passed, are continually eroded by exceptions built into subsequent legislation, and by technological capabilities that were not contemplated when the laws were passed. In most countries, location privacy has yet to be specifically addressed in legislation. Even where it is encompassed by human rights and privacy laws, the coverage is generally imprecise and ambiguous. More direct and specific regulation may exist, however. In Australia, for example, the Telecommunications (Interception and Access) Act and the Surveillance Devices Act define and criminalise inappropriate interception and access, use, communication and publication of location information that is obtained from mobile device traffic (AG 2005). On the other hand, when Google Inc. intercepted wi-fi signals and recorded the data that they contained, the Privacy Commissioner absolved the company (Riley 2010), and the Australian Federal Police refused to prosecute despite the action - whether it was intentional, 'inadvertent' or merely plausibly deniable - being a clear breach of the criminal law (Moses 2010).

The European Union determined a decade ago that location data that is identifiable to individuals is to some extent at least subject to existing data protection laws (EU 2002). However, the wording of that so-called 'e-Privacy Directive' countenances the collection of "location data which are more precise than is necessary for the transmission of communications", without clear controls over the justification, proportionality and transparency of that collection (para. 35). In addition, the e-Privacy Directive only applies to telecommunications service providers, not to other organisations that acquire location and tracking data. King & Jessen (2010) discuss various gaps in the protective regimes in Europe.

The EU's Advisory Body (essentially a Committee of European Data Protection Commissioners) has issued an Opinion that mobile locati