Bots Trending Now: Disinformation and Calculated Manipulation of the Masses

Bot Developments

A bot (short for robot) performs highly repetitive tasks by automatically gathering or posting information based on a set of algorithms. Internet-based bots can create new content and interact with other users like any human would. Bots are not neutral: they always have an underlying intent toward direct or indirect benefit or harm. The power always lies with the individuals or organizations unleashing the bot, which is imbued with its developer's subjectivity and bias [1].

Bots can be overt or covert to their subjects; they can deliberately “listen” and then manipulate situations by providing real information or disinformation (also known as automated propaganda). They can target individuals or groups, successfully alter or even disrupt group-think, and equally silence activists trying to bring attention to a given cause (e.g., human rights abuses by governments). On the flip side, bots can be used as counterstrategies to raise awareness of political wrongdoing (e.g., censorship), but they can also be used for terrorist causes appealing to a global theater (e.g., ISIS) [2].

Software engineers and computer programmers have developed bots that can perform sophisticated conversational analytics, bots that analyze human sentiment on social media platforms such as Facebook [3] and Twitter [4], and bots that extract value from unstructured data using a plethora of big data techniques. It won't be long before we have bots that analyze audio using natural language processing, bots that analyze and respond to uploaded videos on YouTube, and even bots that respond with humanlike speech contextually adapted for age, gender, and even culture. The convergence of this suite of capabilities is known as artificial intelligence [5]. Bots can be invisible; they can appear as a 2D embodied agent on a screen (an avatar or dialog window), as a 3D object (e.g., a toy), or as a humanoid robot (e.g., Bina48 [6] and Pepper).
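As a crude illustration of the sentiment analysis mentioned above, the following Python sketch scores posts against a tiny hand-made word lexicon. The word lists, scores, and example posts are all invented for illustration; real systems use trained models over far larger vocabularies.

```python
# Toy lexicon-based sentiment scorer, illustrating (very crudely) how a bot
# might gauge sentiment in social media posts. Lexicon and posts are invented.
POSITIVE = {"great", "win", "support", "love", "strong"}
NEGATIVE = {"fake", "corrupt", "weak", "disaster", "lies"}

def sentiment(post: str) -> int:
    """Return a crude score: count of positive words minus negative words."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = [
    "great rally tonight we love the support",
    "this candidate is a corrupt disaster",
]
scores = [sentiment(p) for p in posts]  # first post positive, second negative
```

In practice a bot would feed such scores into a decision rule, for example choosing which hashtags to amplify.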

Bots that Pass the Turing Test

Most consumers who use instant messaging chat programs to interact with their service providers may not realize that they have likely interacted with a chat bot that can crawl a provider's public Internet pages for information [7]. After three or four interactions with the bot, which can last anywhere between 5 and 10 minutes, a human customer service representative might intervene to answer a more complex problem directly. This is known as a hybrid delivery model, where bot and human work together to resolve a customer inquiry. The customer may detect a slower than usual response in the chat window, but is willing to wait given the asynchronous mode of communication and the mere fact that they don't have to converse with a real person over the telephone. The benefit to the consumer is said to be bypassing a human clerk and the wait times for a representative; the benefit to the service provider is saving the cost of human resources, including ongoing training.
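The hybrid delivery model described above can be sketched in a few lines: the bot answers routine questions from a small FAQ (here hard-coded, standing in for content crawled from the provider's public pages) and hands off to a human agent when nothing matches. All questions and answers below are hypothetical.

```python
# Minimal sketch of a hybrid bot/human support flow. The FAQ entries are
# invented; a real system would be populated by crawling the provider's site.
FAQ = {
    "opening hours": "We are open 9am-5pm, Monday to Friday.",
    "reset password": "Use the 'Forgot password' link on the login page.",
}

def answer(query: str) -> str:
    """Return a canned reply when a known topic appears in the query,
    otherwise signal a handoff to a human representative."""
    q = query.lower()
    for key, reply in FAQ.items():
        if key in q:
            return reply      # bot handles the routine inquiry
    return "HANDOFF"          # complex problem: route to a human
```

The "HANDOFF" sentinel is where the slower, human-answered turn in the chat window would begin.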

Bots that interact with humans and go undetected as non-human are considered successful in their implementation and are said to pass the Turing Test [8]. Devised in 1950 by English mathematician Alan M. Turing, the “imitation game” asked whether a remote human interrogator, within a fixed time frame, could distinguish between a computer and a human subject based on their replies to various questions posed by the interrogator [9].

Bot Impacts Across the Globe

Bots usually have Internet/social media accounts that look like real people, generate new content like any human would, and interact with other users. Politicalbots.org reported that approximately 19 million bot accounts were tweeting in support of either Donald Trump or Hillary Clinton in the week before the U.S. presidential election [10]. Pro-Trump bots worked to sway public opinion by covertly taking over pro-Clinton hashtags like #ImWithHer and spreading fake news stories [11]. These pervasive bots are said to have swayed public opinion.

Yet bots have not been utilized in the U.S. alone, but also in the U.K. (Brexit's mood contagion [12]), Germany (fake news [1]), France (robojournalism [13]), Italy (popularity questioned [14]), and even Australia (the Coalition's fake followers [15]). Unsurprisingly, political bots have also been used in Turkey (Erdogan's 6000-strong robot army [16], [17]), Syria (Twitter spambots [18]), Ecuador (surveillance [19]), Mexico (Peñabots [20]), Brazil, Rwanda, Russia (troll houses [21]), China (tracking Tibetan protestors [22]), Ukraine (social bots [23]), and Venezuela (6000 bots generating anti-U.S. sentiment [24] with #ObamaYankeeGoHome [25]).

Whether it is personal attacks meant to cause a chilling effect, spamming attacks on hashtags meant to redirect trending, overinflated follower numbers meant to project political strength, or deliberate social media messaging used to perform sweeping surveillance, bots are polluting political discourse on a grand scale. So much so that some politicians themselves are now calling for action against these bots, with everything from demands for ethical conduct in society, to calls for more structured regulation [26] of political parties, to the implementation of criminal penalties for offenders who create and deploy malicious bot strategies.

Provided below are demonstrative examples of the use of bots in Australia, the U.K., Germany, Syria, and China, with each example offering an alternative case in which bots have been used to further specific political agendas.

Fake Followers in Australia

In 2013, the Liberal Party internally investigated a surge in the Twitter followers of then Opposition Leader Tony Abbott. On the night of August 10, 2013, Abbott's Twitter following soared from 157,000 to 198,000 [27]. In the days preceding this period, his following had been growing steadily at about 3000 per day. The Liberal Party had to declare on its Facebook page that someone had been purchasing “fake Twitter followers for Tony Abbott's Twitter account,” but a spokeswoman later said it was someone neither connected with the Liberal Party nor associated with the Liberal campaign, and that the damage had been done using a spambot [27], an example of which is shown in Figure 1.

Figure 1. Twitter image taken from [28].

 

The Liberals acted quickly to contact Twitter, which removed only about 8000 “fake followers,” and by that same evening Mr. Abbott's following had grown again to 197,000. A later analysis indicated that the Coalition had been spamming Twitter with exactly the same messages from different accounts, most of them not even from Australian shores. The ploy meant that Abbott got about 100,000 likes on Facebook in a single week, previously unheard of for any Liberal Party leader. Shockingly, one unofficial audit noted that about 95% of Mr. Abbott's 203,000 followers were fake, with 4% “active” and only 1% genuine [15]. The audit was corroborated by the social media monitoring tools StatusPeople and SocialBakers, which determined in a report that around 41 percent of Abbott's most recent 50,000 Twitter followers were fake, unmanned Twitter accounts [29]. The social media monitoring companies noted that the number of fake followers was likely even higher. It is well known that most of the Coalition's supporters do not use social media [30]. Another example of the suspected use of bots during the 2013 election campaign can be seen in Figure 2.

Figure 2. Twitter image taken from [31].
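The spamming pattern noted in the audit, exactly the same messages posted from many different accounts, lends itself to a simple detection heuristic: flag any text posted verbatim by several distinct accounts. The sketch below uses invented data; real audits combine many such signals.

```python
from collections import defaultdict

def find_copypasta(tweets, min_accounts=3):
    """tweets: iterable of (account, text) pairs. Return the set of
    normalized texts posted by at least `min_accounts` distinct accounts."""
    by_text = defaultdict(set)
    for account, text in tweets:
        by_text[text.strip().lower()].add(account)
    return {t for t, accounts in by_text.items() if len(accounts) >= min_accounts}

# Invented sample: three accounts repeat the same slogan, one is organic.
sample = [
    ("a", "Vote for X!"),
    ("b", "vote for x!"),
    ("c", "Vote for X! "),
    ("d", "I like trains"),
]
flagged = find_copypasta(sample)
```

Normalizing case and whitespace before grouping is what lets near-identical copies collapse into one flagged message.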

Fake Trends and Robo-Journalists in the U.K.

As the U.K.'s June 2016 referendum on European Union membership drew near, researchers discovered that automated social media accounts were swaying votes both for and against Britain's exit from the EU. One study found that 54% of sampled accounts were pro-Leave, while 20% were pro-Remain [32]. And of the 1.5 million tweets with hashtags related to the referendum posted between June 5 and June 12, about half a million were generated by just 1% of the accounts sampled.

As more and more of the citizenry turn to social media as their primary information source, bots can sway decisions one way or the other. After the Brexit results were disclosed, many pro-Remain supporters claimed that social media had had an undue influence by discouraging “Remain” voters from actually going to the polls [33] (see Figure 3). While there are only 15 million Twitter users in the U.K., it is possible that robo-journalists (content-gathering bots) and human journalists who relied on fake social media content further propelled the “fake news,” affecting more than just the Twittersphere.

 

Figure 3. Twitter image taken from [34].

Fake News and Echo Chambers in Germany

German Chancellor Angela Merkel has expressed concern over the potential for social bots to influence this year's German national election [35]. She brought to the fore the ways in which fake news and bots have manipulated public opinion online by spreading false and malicious information. She said: “Today we have fake sites, bots, trolls - things that regenerate themselves, reinforcing opinions with certain algorithms and we have to learn to deal with them” [36]. The right-wing Alternative for Germany (AfD) already has more Facebook likes than Merkel's Christian Democrats (CDU) and the center-left Social Democrats (SPD) combined. Merkel is worried the AfD might use Trump-like strategies on social media channels to sway the vote.

It is not just that the bots are generating fake news [35]; the algorithms that Facebook deploys to share content between user accounts also create “echo chambers” and outlets for reverberation [37]. However, in Germany, Facebook, which has been criticized for failing to police hate speech, was legally classified as a “media company” in 2016, which means it will now be held accountable for the content it publishes. And while the major political parties have responded by saying they will not utilize “bots for votes,” outside geopolitical forces (e.g., Russia) are now also chiming in, attempting to drive social media sentiment with their own hidden agendas [35].

Spambots and Hijacking Hashtags in Syria

During the Arab Spring, online activists were able to provide eyewitness accounts of uprisings in real time. In Syria, protesters used the hashtags #Syria, #Daraa, and #Mar15 to appeal for support from a global theater [18]. It did not take long for government intelligence officers to threaten online protesters with verbal assaults and one-on-one intimidation techniques. Syrian blogger Anas Qtiesh wrote: “These accounts were believed to be manned by Syrian mokhabarat (intelligence) agents with poor command of both written Arabic and English, and an endless arsenal of bile and insults” [38]. But when protesters continued despite the harassment, spambots created by the Bahraini company EGHNA were co-opted to create pro-regime accounts [39]. Pro-regime messages then flooded hashtags that carried pro-revolution narratives.

This essentially drowned out protesters' voices with irrelevant information, such as photography of Syria. @LovelySyria, @SyriaBeauty, and @DNNUpdates dominated #Syria with a flood of predetermined tweets every few minutes from EGHNA's media server [40]. Figure 4 provides an example of such tweets. Others who were using Twitter to portray the realities of the conflict in Syria publicly opposed the use of the spambots (see Figure 5) [43].

Figure 4. Twitter image taken from [41].

Figure 5. Twitter image taken from [42].
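The flooding pattern described above, a predetermined tweet posted every few minutes from a media server, suggests a simple bot-likeness heuristic: the gaps between an account's posts are suspiciously regular. The sketch below uses invented timestamps and an arbitrary threshold; it is one weak signal, not a classifier.

```python
from statistics import pstdev

def looks_scheduled(timestamps, max_stdev_seconds=5.0):
    """timestamps: sorted posting times in seconds. True when the gaps
    between consecutive posts are nearly constant (machine-like)."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return len(gaps) >= 2 and pstdev(gaps) <= max_stdev_seconds

bot_like = looks_scheduled([0, 180, 360, 540, 720])     # every 3 minutes
human_like = looks_scheduled([0, 45, 900, 1000, 7200])  # irregular bursts
```

Sophisticated spambots jitter their schedules precisely to defeat this kind of check, which is why interval regularity is usually combined with content and network features.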

Since 2014, the Islamic State terror group has “ghost-tweeted” its messages to make it look like it has a large, sympathetic following [44]. This has been a deliberate act to attract resources, both human and financial, from global constituents. Tweets have included allegations of mass killings of Iraqi soldiers and more [45]. This activity shows how extremists are employing the same social media strategies as some governments and social activists.

Sweeping Surveillance in China

In May 2016, China was exposed for purportedly fabricating 488 million social media comments annually in an effort to distract users' attention from bad news and politically sensitive issues [46]. A recent three-month study found that 13% of messages on Sina Weibo (Twitter's equivalent in China) had been deleted in a bid to crack down on what government officials identified as politically charged messages [47]. It is likely that bots were used to censor messages containing key terms that matched a list of banned words. Typically, this might have included Mandarin words such as “Tibet,” “Falun Gong,” and “democracy” [48].
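The keyword matching described above can be sketched in a few lines. The banned-term list mirrors the examples in the text, and the sample feed is invented; real censorship pipelines are far more elaborate (synonyms, homophones, images).

```python
# Sketch of keyword-based censorship: hide any post whose text contains a
# banned term. Terms follow the examples in the text; the feed is invented.
BANNED = {"tibet", "falun gong", "democracy"}

def passes_filter(post: str) -> bool:
    """True when the post contains none of the banned terms."""
    text = post.lower()
    return not any(term in text for term in BANNED)

feed = ["lovely weather in beijing", "rally for democracy tonight"]
visible = [p for p in feed if passes_filter(p)]  # only the first survives
```

Simple substring matching like this also explains the collateral damage such filters cause: innocuous posts containing a banned string are deleted along with political ones.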

China employs a classic hybrid model of online propaganda that comes into action only after some period of social unrest or protest, when there is a surge in message volumes. Typically, the primary messaging is left to government officials, with backup support from bots methodically spreading messages of positivity and ensuring political security through pro-government cheerleading. While on average one in every 178 posts is believed to be curated for propaganda purposes, the posts are not continuous and appear to overwhelm dissent only at key times [49]. Distraction online, it seems, is the best way to overcome opposition. That distraction is carried out in conjunction with a cap on the number of messages that can be sent from “public accounts” with broadcasting capabilities.

What Effect are Bots Having on Society?

The deliberate act of spreading falsehoods via the Internet, and more specifically via social media, to make people believe something that is not true is certainly a form of propaganda. While it might create short-term gains in the eyes of political leaders, it inevitably causes significant public distrust in the long term. In many ways, it is a denial of citizen service that attacks fundamental human rights. It preys on the premise that most citizens in society are like sheep: a game of “follow the leader” ensues, making a mockery of the “right to know.” We are using faulty data to come to phony conclusions, to cast our votes, and to decide our futures. Disinformation on the Internet is now rife - and if the Internet has become our primary source of truth, then we might well believe anything.

REFERENCES

1. S.C. Woolley, "Automating power: Social bot interference in global politics", First Monday, vol. 21, no. 4.

2. H.M. Roff, D. Danks, J.H. Danks, "Fight ISIS by thinking inside the bot: How we can use artificial intelligence to distract ISIS recruiters", Slate, [online] Available: http://www.slate.com/articles/technology/future_tense/2015/10/using_chatbots_to_distract_isis_recruiters_on_social_media.htm.

3. M. Fidelman, "10 Facebook messenger bots you need to try right now", Forbes, [online] Available: https://www.forbes.com/sites/markfidelman/2016/05/19/10-facebook-messenger-bots-you-need-to-try-right-now/#24546a4b325a.

4. D. Guilbeault, S.C. Woolley, "How Twitter bots are shaping the election", Atlantic, [online] Available: https://www.theatlantic.com/technology/archive/2016/11/election-bots/506072/.

5. K. Hammond, "What is artificial intelligence?", Computer World, [online] Available: http://www.computerworld.com/article/2906336/emerging-technology/what-is-artificial-intelligence.html

6. Bina 48 meets Bina Rothblatt - Part Two, [online] Available: https://www.youtube.com/watch?v=G5IqcRILeCc.

7. M. Vakulenko, Beyond the ‘chatbot’ - The messaging quadrant, [online] Available: https://www.visionmobile.com/blog/2016/05/beyond-chatbot-messaging-quadrant.

8. "Turing test: Artificial Intelligence", Encyclopaedia Britannica, [online] Available: https://www.britannica.com/technology/Turing-test.

9. D. Proudfoot, "What Turing himself said about the Imitation Game", Spectrum, [online] Available: http://spectrum.ieee.org/geek-life/history/what-turing-himself-said-about-the-imitation-game.

10. S.C. Woolley, Resource for understanding political bots, [online] Available: http://politicalbots.org/?p=797.

11. N. Byrnes, "How the bot-y politic influenced this election", Technology Rev., [online] Available: https://www.technologyreview.com/s/602817/how-the-bot-y-politic-influenced-this-election/.

12. I. Lapowsky, "Brexit is sending markets diving. Twitter could be making it worse", Wired, [online] Available: https://www.wired.com/2016/06/brexit-sending-markets-diving-twitter-making-worse/.

13. "Bots step in to 2016 election news coverage", France 24, [online] Available: http://www.france24.com/en/20161105-bots-step-2016-election-news-coverage.

14. A. Vogt, "Hot or bot? Italian professor casts doubt on politician's Twitter popularity", The Guardian, [online] Available: https://www.theguardian.com/world/2012/jul/22/bot-italian-politician-twitter-grillo.

15. T. Peel, "The Coalition's Twitter fraud and deception", Independent, [online] Available: https://independentaustralia.net/politics/politics-display/the-coalitions-twitter-fraud-and-deception.

16. C. Letsch, "Social media and opposition to blame for protests says Turkish PM", The Guardian, [online] Available: https://www.theguardian.com/world/2013/jun/02/turkish-protesters-control-istanbul-square.

17. E. Poyrazlar, "Turkey's leader bans his own Twitter bot army", Vocativ, [online] Available: http://www.vocativ.com/world/turkey-world/turkeys-leader-nearly-banned-twitter-bot-army/.

18. J.C. York, "Syria's Twitter spambots", The Guardian, [online] Available: https://www.theguardian.com/commentisfree/2011/apr/21/syria-twitter-spambots-pro-revolution.

19. R. Morla, "Ecuadorian websites report on hacking team get taken down", Panam Post, [online] Available: http://panampost.com/rebeca-morla/2015/07/13/ecuadorian-websites-report-on-hacking-team-get-taken-down/.

20. A. Najar, "¿Cuánto poder tienen los Peñabots, los tuiteros que combaten la crítica en México?" ["How much power do the Peñabots, the tweeters who counter criticism in Mexico, have?"], BBC, [online] Available: http://www.bbc.com/mundo/noticias/2015/03/150317_mexico_internet_poder_penabot_an.

21. S. Walker, "Salutin’ Putin: Inside a Russian troll house", The Guardian, [online] Available: https://www.theguardian.com/world/2015/apr/02/putin-kremlin-inside-russian-troll-house.

22. B. Krebs, "Twitter bots target Tibetan protests", Krebs on Security, [online] Available: http://krebsonsecurity.com/2012/03/twitter-bots-target-tibetan-protests/.

23. S. Hegelich, D. Janetzko, "Are social bots on Twitter political actors? Empirical evidence from a Ukrainian social botnet", Proc. Tenth Int. AAAI Conf. Web and Social Media, pp. 579-582, 2016.

24. M.C. Forelle, P.N. Howard, A. Monroy-Hernandez, S. Savage, "Political bots and the manipulation of public opinion in Venezuela", SSRN, [online] Available: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2635800.

29. H. Polites, "The perils of polling Twitter bots", The Australian, [online] Available: http://www.theaustralian.com.au/business/business-spectator/the-perils-of-polling-twitter-bots/news-story/97d733c6650991d20a03d25a4229b42e.

30. A. Bruns, Follower accession: How Australian politicians gained their Twitter followers, SBS, [online] Available: http://www.sbs.com.au/news/article/2013/07/08/follower-accession-how-australian-politicians-gained-their-twitter-followers.

31. S. Fazakerley, Paid parental leave is a winner for Tony Abbott, [online] Available: https://twitter.com/stuartfaz/status/369068662163910656/photo/1?ref_src=twsrc%5Etfw&ref_url=http%3A%2F%2Ftheconversation.com%2Fbots-without-borders-how-anonymous-accounts-hijack-political-debate-70347.

32. P.N. Howard, B. Kollanyi, "Bots #StrongerIn and #Brexit: Computational propaganda during the UK-EU Referendum", SSRN, [online] Available: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2798311.

33. A. Bhattacharya, "Watch out for the Brexit bots", Quartz, [online] Available: https://qz.com/713980/watch-out-for-the-brexit-botsf.

34. M. A. Carter, N/A, [online] Available: https://twitter.com/rob_cart123/status/746091911354716161?ref_src=twsrc%5Etfw&ref_url=http%3A%2F%2Ftheconversation.com%2Fbots-without-borders-how-anonymous-accounts-hijack-political-debate-70347.

35. C. Copley, "Angela Merkel fears social bots may manipulate German election", Sydney Morning Herald, [online] Available: http://www.smh.com.au/world/angela-merkel-fears-social-bots-may-manipulate-german-election-20161124-gsx5cu.html.

36. I. Tharoor, "'Fake news' threatens Germany's election too, says Angela Merkel", Sydney Morning Herald, [online] Available: http://www.smh.com.au/world/fake-news-threatens-germanys-election-too-says-angela-merkel-20161123-gsw7kp.html.

37. F. Floridi, "Fake news and a 400-year-old problem: we need to resolve the ‘post-truth’ crisis", The Guardian, [online] Available: https://www.theguardian.com/technology/2016/nov/29/fake-news-echo-chamber-ethics-infosphere-internet-digital.

38. A. Qtiesh, The blast inside, [online] Available: http://www.anasqtiesh.com/.

39. SYRIA - Syria's Twitter spambots, [online] Available: https://wikileaks.org/gifiles/docs/19/1928607_syria-syria-s-twitter-spam-bots-html.

40. A. Qtiesh, Spam bots flooding Twitter to drown info about #Syria Protests (Updated), [online] Available: https://advox.globalvoices.org/2011/04/18/spam-bots-flooding-twitter-to-drown-info-about-syria-protests/.

41. N/A, [online] Available: https://twitter.com/SyriaBeauty/status/202585453919076353?ref_src=twsrc%5Etfw&ref_url=http%3A%2F%2Ftheconversation.com%2Fbots-without-borders-how-anonymous-accounts-hijack-political-debate-70347.

42. N/A, [online] Available: https://twitter.com/BritishLebanese/status/60075290055024640?ref_src=twsrc%5Etfw&ref_url=http%3A%2F%2Ftheconversation.com%2Fbots-without-borders-how-anonymous-accounts-hijack-political-debate-70347.

43. L. Shamy, To everyone who can hear me!, [online] Available: https://twitter.com/Linashamy/status/808422105809387520?ref_src=twsrc%5Etfw&ref_url=http%3A%2F%2Ftheconversation.com%2Fbots-without-borders-how-anonymous-accounts-hijack-political-debate-70347.

44. S. Woolley, Spammers scammers. and trolls: Political bot manipulation, [online] Available: http://politicalbots.org/?p=295.

45. R. Nordland, A.J. Rubin, "Massacre claim shakes Iraq", New York Times, [online] Available: https://www.nytimes.com/2014/06/16/world/middleeast/iraq.html?_r=1.

46. S. Oster, "China fakes 488 million social media posts a year: Study", Bloomberg News, [online] Available: https://www.bloomberg.com/news/articles/2016-05-19/china-seen-faking-488-million-internet-posts-to-divert-criticism.

47. G. King, J. Pan, M.E. Roberts, How the Chinese Government fabricates social media posts for strategic distraction not engaged argument, [online] Available: http://gking.harvard.edu/files/gking/files/50c.pdf.

48. Y. Yang, The perfect example of political propaganda: The Chinese Government's persecution against Falun Gong, [online] Available: http://www.globalmediajournal.com/open-access/the-perfect-example-of-political-propaganda-the-chinese-governments-persecution-against-falun-gong.php?aid=35171.

49. B. Feldman, "How the Chinese Government uses social media to stop dissent", New York Magazine, [online] Available: http://nymag.com/selectall/2016/05/china-posts-propaganda-on-social-media-as-misdirection.htm.

ACKNOWLEDGMENT

This article is adapted from an article published in The Conversation titled “Bots without borders: how anonymous accounts hijack political debate,” on January 24, 2017. Read the original article at http://theconversation.com/bots-without-borders-how-anonymous-accounts-hijack-political-debate-70347. Katina Michael would like to thank Michael Courts and Amanda Dunn from The Conversation for their editorial support, and Christiane Barro from Monash University for the inspiration to write the piece. Dr. Roba Abbas was also responsible for integrating the last draft with earlier work.

Citation: Katina Michael, 2017, "Bots Trending Now: Disinformation and Calculated Manipulation of the Masses", IEEE Technology and Society Magazine, vol. 36, no. 2, pp. 6-11.

Sociology of the Docile Body

Abstract

Figure: Cover of Michel Foucault, Discipline and Punish: The Birth of the Prison (trans. Alan Sheridan).

Embedded radio-frequency identification (RFID), sensor technologies, biomedical devices, and a new breed of nanotechnologies are now being commercialized in a variety of contexts and use cases. As these technologies gather momentum in the marketplace, consumers will need to navigate the changing cybernetic landscape. The trichotomy facing consumers is: (1) to adopt RFID implants as a means of self-expression or to resolve a technological challenge; (2) to adopt RFID implants for diagnostic or prosthetic purposes to aid in restorative health; or (3) to submit to enforced adoption stemming from institutional or organizational top-down control that has no direct benefit to the end-user. This paper uses the penal metaphor to explore the potential negative impact of enforced microchipping. The paper concludes with a discussion of the importance of protecting human rights and freedoms, including the right to opt out of sub-dermal devices.

Section I. Introduction

Radio-frequency identification (RFID) implant technology, sensor technology, biomedical devices, and nanotechnology continue to find increasing application in a variety of vertical markets. Significant factors driving continued innovation include convergence in devices, miniaturization, storage capacity, and materials. The most common implantable devices are used in the medical domain, for example, heart pacemakers and implantable cardioverter defibrillators (ICDs). In non-medical applications, implantable devices are used for identification, [close-range] location and condition monitoring, and care and convenience use cases [1].

RFID implants can be passive or active, and predominantly function to broadcast a unique ID when triggered by a reader within a specific read range. Sensors onboard an RFID device can, for instance, provide additional data such as an individual's temperature, pulse rate, and heart rate. Biomedical devices usually have a specific function, like the provision of an artificial knee or hip, and can contain RFID and other specific sensors. An example cited in Ratner & Ratner that demonstrates the potential for nanotechnology to bring together RFID, sensors, and the biomedical realm is the injection of nanobots into a soldier's bloodstream: “The sensors would circulate through the bloodstream and could be monitored at a place where blood vessels are closest to the surface, such as the eye… While quite invasive, so-called in vivo sensors could also have other uses in continually monitoring the health of a soldier” [2], p. 42f.

The next step in the miniaturization path for RFID microchips is nanotechnology, which allows for working at the nanoscale, that is, the molecular level [3], p. 90. Humancentric implants are discussed in [4], pp. 198-214, in the context of the ethical and social implications of nanotechnology. Regardless of the breakthroughs to come in these humancentric embedded surveillance devices (ESDs), we will soon be moving the discussion beyond merely how the technologies are aiding humanity, and regardless of whether such technologies are mobilized to aid human health or to impair it, the fundamental concerns will rest with human willingness to adopt the technology, not with what the technology claims to eradicate in and of itself. In order to later contextualize the issues surrounding the human right of refusal, this paper will now present a material view of implantable technologies in their nascent stage. A clear distinction will be made between nanotechnologies that can be used as a mechanism of control and, for example, biomedical technologies that are freely chosen and designed for the sole purpose of improving human health, with no benefit extending beyond the aid of the individual.

Section II. Previous Work

Although cybernetic technologies have boundless potential to surface under an array of interchangeable names, for the purpose of this paper RFID implants will be investigated, given the degree of global attention they have received [5]-[8]. In Western civilization, RFID is used to track merchandise, and similar devices are used in our family pets to locate them should they roam astray [9]. Now RFID is being considered for 24/7 human location monitoring. In order to offer a pragmatic perspective that does not deviate from one source of research to another, Hervé Aubert's 2011 article entitled “RFID technology for human implant devices” [10] is utilized as the primary source of data, given its seminal contribution to the field.

A. Experimental Stages of Cybernetic Innovations

Aubert investigates one type of RFID known as the VeriChip™, a device presently engineered to provide a data bank of important records on the individual [5], in particular a personal health record (PHR) for high-risk patients [11], [12]. In addition, this implantable RFID, known for the remote identification of persons or animals, is being considered for the purpose of protective human surveillance [13]. RFID devices are being considered not only for identifying and locating humans, but also for their potential to “remotely control human biological functions” [10], [14], p. 676. According to Aubert, this nanotechnology is not conducive to use as a “spychip” with current-day technologies, as it cannot successfully be connected to a Global Positioning System (which offers real-time tracking): a GPS would require an implant that far surpasses the size of what could realistically be embedded in the human body, and would therefore defeat the notion of a submicron global surveillance system for monitoring human activity. However, there is nothing to say that off-body data receivers, powered by wireless supplies, cannot be stationed at short range to monitor passive responders such as subdermal RFIDs [15]-[17]. Currently the anticipated range is dependent on the inductive coupling, measured in MHz [5].

Aubert concludes by arguing that RFID is not suitable for the real-time tracking of humans, as its capability to transmit the location of the body is too limited in range, permitting receivers to read passive implanted devices only within a free-space range of 10 cm or less. This limitation makes communication with GPS satellites, in an attempt to locate bodies, impossible. Once again, this is not to refute the claim that interrogators, stationed territorially, can in turn transmit their data to a centralized global positioning system. Regardless, researchers argue that nanotechnologies “[w]ill not exclusively revolve around the idea of centralization of surveillance and concentration of power, […but their greatest potential for negative impact will be centred around] constant observation at decentralized levels” [18], p. 283. In addition, depending on the context, monitoring does not have to be continuous; discrete readings can provide particular types of evidence. It may well be enough to read an RFID at a given access node (either on entry or exit), or to know that a given unique ID is inside a building, or even headed in a given direction [19]. Two or more points of reading can also provide intricate details about distance, speed, and time, as equipment readers have their own GPS and IP location [20], [21]. It will be simple enough to tether an implant to a mobile phone or any other device with an onboard GPS chipset. Nokia, for instance, had an RFID reader in one of its 2004 handsets [22].
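The claim that two or more points of reading yield distance, speed, and time can be made concrete with a short sketch: two timestamped reads of the same tag ID at readers with known GPS positions give the distance between readers, the elapsed time, and hence an average speed. The coordinates and times below are invented; the haversine formula computes great-circle distance.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points."""
    R = 6371000.0  # mean Earth radius in metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def speed_between_reads(read1, read2):
    """Each read is (timestamp_seconds, lat, lon) of the same tag ID at a
    reader whose own GPS position is known. Returns metres per second."""
    (t1, la1, lo1), (t2, la2, lo2) = read1, read2
    return haversine_m(la1, lo1, la2, lo2) / (t2 - t1)
```

For example, reads 10 seconds apart at readers roughly 111 m apart imply the tag moved at about 11 m/s, enough to infer a vehicle rather than a pedestrian.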

Although such technologies are far from perfected, at least to the degree of synoptic centralization, and setting aside concerns surrounding information privacy, subdermal implants designed for the surveillance of humans have been identified as a central ethical challenge [23]. In particular, chips may be either injected or worn as external tags on the body, such as a PayBand [24] or Fitbit. This in itself is not what creates the most obvious challenge; rather, it is that such devices have the potential to be implemented with or without the individual's consent, provoking discussion around the need for legislation to keep pace with technological advances [25]. Although the chip is being suggested for use in a number of ways, bioethicists suggest that before these new applications of nanotechnologies become a present-day reality, “[w]e need to examine carefully the very real dangers that RFID implants could pose to our privacy and our freedom” [5], p. 27. Despite this concern, skin-embedded devices are being employed in a multiplicity of ways, more recently by biohacking communities who are increasingly commercializing their ideas and prototypes [26].

Aubert lists various possible health benefits of embedded RFID chips, such as the capacity “[t]o transmit measurements of chemical or biological data inside the body” and to “[m]onitor biological activity” while modifying physiological functions and offering therapeutic means, for example, monitoring glucose concentrations in patients with diabetes [10], p. 676. Another possible health benefit is the monitoring of brain activity through “[t]ransponders embedded within the skull” [10], p. 681. Increasingly, implants are being used in techniques such as deep brain stimulation (DBS) and vagus nerve stimulation (VNS) to treat a variety of illnesses [27]. As outlined in Aubert's 2011 article, these transponders communicate with implanted probes, enabling localized microstimulation to be administered in response to neuronal signals.

At this point, it becomes necessary to distinguish technology that is engineered to monitor human organs and is freely adopted as a mechanism to improve one's health from technology that arrives through a top-down implementation, in which the individual is given no choice pertaining to adoption. These two scenarios were demonstrated in a TEDx talk delivered by Katina Michael in 2012 within the “convenience/care” versus “control” contexts [28].

B. Human Versus Machine

Figure: Docile Bodies | Vestoj. A chain gang in South Carolina, c. 1929-1931. Photo: Doris Umann. http://vestoj.com/docile-bodies/

There is a necessary distinction between human and machine: between biomedical technology designed, for example, to improve human health or to serve as a means of self-expression (both freely chosen by the individual), and technology designed for a benefit external to the individual, with the ability to be used as a mechanism of control over the citizen. A heart monitor, created to sustain a human, is designed only to benefit the patient in a life-sustaining way; such a device has no apparatus external to this cause that could be used to invoke power over the individual, and it therefore carries no mandate other than improving or maintaining the individual's health [29]. Generally, the decision to adopt such a biomedical implant is made by the patient and, in most developed nations, through a process of consent. Because such a device currently has no mechanism for top-down control, stakeholders (i.e., hospitals, medical device purchasers, inbound logistics managers, or buyers) have no hidden agenda for adoption. This type of biomedical device currently possesses no ability to monitor any type of human activity that could create an imbalance of power over the user (in this instance, the patient).

More recently, one of the largest suppliers of biomedical devices, Medtronic, has begun to blur the line between devices for care and devices for control. Apart from the hard line that most manufacturers of implants hold on who owns the data emanating from the device [30], companies specialising in biomedical devices are now beginning to engage with secondary uses of their implants [31]. Just as wearable devices such as the FitBit are now being used for evidentiary purposes, it will not be long before biomedical devices originally introduced for prosthetic or diagnostic purposes are used to set individualised health insurance premiums, and more. As noted in [29], even in care-related implant applications there is an underlying dimension of control that may propel function creep or scope creep. These are the types of issues that bring science and the arts together. George Grant wrote [32], p. 17:

The thinker who has most deeply pondered our technological destiny has stated that the new copenetrated arts and sciences are now proceeding to the apogee of their determining power around the science of cybernetics; […] the mobilization of the objective arts and sciences at their apogee comes more and more to be unified around the planning and control of human activity.

Section III. Research Approach

Hence, it is important to understand the trichotomy of skin-embedded technologies: technology adoption as a post-modern indicator of the autonomous self exercising human rights [33]; acceptable bio-Western technologies whose sole function is to improve one's existing health conditions (also freely chosen by the individual); and technologies with the potential to be used as mechanisms of organizational control, implanted through imposed order [34]. When disambiguating the ways in which technology can be used, it is essential to recognize that this differentiation requires no thorough understanding of the purpose of the biotechnology or its utility. The plumb line rests not on the trichotomy of the technology's utility but on the individual's moral freedom and human right to accept or refuse. Therefore, the plumb line remains freedom of choice, not the device's distinct utility.

Currently, the question is being posed as to whether legislation will keep pace. This suggests either that a higher articulation of existing constitutional protections is required, or that new legislation be enacted that explicitly defends the right of the individual to choose for oneself [35].

The ways in which subdermal technology may aid correctional facilities will be expounded on more thoroughly in the next section. A historical look at a specific top-down and bottom-up institution is examined, not as a raw set of material facts, but in order to draw an inference about the way in which incremental correctional ideologies remain a prevailing influence today, turning the individual's outward gaze inward toward self-censorship [36]. Some researchers argue it is highly improbable that laws will be enacted to enforce subdermal devices, with the exception of use in criminals [37]. Therefore, the next section is devoted to an investigation of the penal system.

Section IV. The Penal Metaphor

Because the prisoner is noted as the central focus of a possible industry en route to legalizing the implementation of subdermal RFIDs, it becomes imperative to investigate the penal system from an ideological perspective in order to assess its susceptibility [38], pp. 157-249; [39], p. 35. This paper concludes that there needs to be a distinction between spatial autonomy and moral autonomy: moral freedom is the higher good, and the right to obtain this good supersedes losses that could be incurred as a result of the state invoking disciplinary measures [32].

Generation after generation, civilization oscillates over freedom of choice, blurring the distinction between freely adopting governing rules of belief, following an individualized interrogation of their ethical underpinnings, and conforming to a systematic ruling government without understanding its fundamental doctrine. Often such systems strive to maintain order through imposed indoctrination, in which the people accept the ideologies of the dominant class through a constant infiltration of information not conducive to the independent thinking of the autonomous self; it is argued that when this knowledge becomes singular, it is a form of soft despotism [40]. Through various mechanisms of social control, such as a prevailing slant propagated through the media, persons embodied in space have been led to a place where the individual is losing the ability to see the distinction and thereby to choose for oneself. The specific slant contained within the dominant message directs Western society toward an external message whose constancy softly coerces the viewer or listener in one specific direction [32].

A. A Look at the System as an Apparatus of Control

As the high-tech industry evolves, the media continues to endorse such change, and those adopting a consumerist mentality continue to commoditize their own bodies as a source of consumer capitalism [41] through the latest technological upgrade. It stands to reason that human adaptation to body-modifying devices will become more and more acceptable as a means to live within society, function in commerce, and progress in self-actualization [42]. The authors of this paper argue that when any such movement coerces the people in one specific direction, it is a form of soft despotism, whether invoked intentionally or otherwise [40].

It is within this investigation of the governing forces over the masses that the focus shifts from the history of the penal institution itself to the state's reliance on cumulative rationale. Theorists argue that it is this overreliance on human rationale that propels history in one specific direction and thus becomes the force evoking a certain type of social order and governance [43].

In order to elucidate her notion of how biotechnology can turn us from outside within, Ann Light first turns our attention to the penal system [36]. Theorists argue that the open persecution of punishment found within the penal process has radically shifted to become less detectable and more hidden [44]. This is a far cry from the open persecution experienced by, say, Joan of Arc [45]: now, largely due to humanitarianism, the public spectacle of the executioner leading the persecuted to the stake appears to the witnessing public as an act of savagery equivalent to the crime itself [44]. Hence the mechanism becomes more hidden and, in this sense, is argued to be less pervasive [44]. But is it?

Theorists view the apparatus of the persecutor as moving from control over the body to a much more sophisticated apparatus, which slackens the hold on the tangible physical body in exchange for a far more intricate part of the self. This shifts the focus from the external body to the human mind, which is considered as the seat of the soul and the final battleground [46]. Theorists go on to state that these more sophisticated systems of control will only be confirmed to actually exist as history unfolds [36].

The Panopticon, for example, is a model that can be deemed a control mechanism that is less pervasive in that it moves away from physical punishment to psychological punishment [44]. Specifically, the sanctioned individual believes the monitoring of one's behavior to be constant, shifting what is in fact periodic surveillance into a perceived continual presence. The constancy found in this form of surveillance is argued to imprint permanence on human cognition [36]. It is what M.G. Michael has termed uberveillance: a type of big brother on the inside looking out [47]. In order that the reader may have a clearer understanding of the Panopticon, below is a description of Bentham's institution:

“The hollow interior of the circular Panopticon has an incongruous resemblance to a dovecote with all the doves behind bars. The prisoners' cells are in the circumference, but are open at all times to inspection from the observation tower in the center of the building. The theory of the Panopticon relies on the fiction that each prisoner, alone in his cell, believes that he is under constant observation: yet it is patently impossible that the contractor and his small staff within the central tower could watch 3,000 prisoners at once. So that the prisoners may not know whom he is watching, or whether he is present at all, the contractor must at all times be invisible; and Bentham thought much about deceptive lighting systems to preserve the illusion of the contractor's permanent presence, a “dark spot” at the center of the Panopticon. Observation of a single prisoner for several hours, followed by punishment for any misdemeanors, would convince all the rest of this constant vigilance. Although the contraptions such as Venetian blinds, pinholes and speaking tubes which delighted Bentham have lost some technological credibility, the general principle is readily applicable to modern methods of surveillance” [48], pp. 4-5.

Upon reviewing the detailed description of the institution designed by Bentham, it is easy to see how the panoptic system supports the shift from the body to the mind, which then turns the imprisoned body's gaze inward [36]. Out of fear of punishment, the embodied experience is to begin to self-monitor.

Although some argue Bentham's Panopticon never came to fruition, Michael Ignatieff views it as a “[s]ymbolic caricature of the characteristic features of disciplinary thinking [of] his age” [48], p. 5. Crowther argues:

[According to] Bentham, the Panopticon was not an enclosed relationship between the prisoner and the state, removed from the outside world, but a prison constantly open to public scrutiny. The contractor in his watchtower could be joined at any minute not only by magistrates, but by the prisoners' relatives, the curious, or the concerned, “The great open committee of the tribunal of the world.”

This invokes two types of control of the incarcerated: according to sociology theorists, a top-down approach to surveillance is referred to as organizational surveillance, whereas a bottom-up approach, in which the common citizen becomes the watch-guard, is referred to as inverse surveillance [49]. Bentham became aware of the possible negative impact that constant surveillance by the state and the public could have on the prisoners' sensibilities, and therefore suggested that the prisoner wear a disguise. The mask would conceal the individual's identity, while each unique disguise would represent the crime that was committed. Hence, Bentham did make a frail attempt to resolve the way in which the apparatus' constancy could impair one's well-being [48].

The Panopticon illustrated here is merely representational: the physical apparatus of control is reflected upon as a means of helping the reader relate to the modern-day ideological shift within organizational control, designed to turn the gaze of the end-user, the prisoner, and the like, to self-monitoring. Western civilization, which once employed an external gaze and had previously sought a voice in politics, is being turned from outside within. According to Ann Light [36], digital technology is promoting this shift.

Section V. Discussion

A. The Impact of Bio-Tech Constancy on the Human Psyche

Whether this surveillance transpires every moment of every day [50], or only in the sanctioned individual's mind, is of little importance: it is the unknown, or fear of what is “ever-lurking,” that has the greatest potential to negatively impact the human psyche. When the interrogator is no longer human but the receptor is a machine, something even more demoralizing transpires, as the removal of human contact can be likened to placing the prisoner in a type of mechanical quarantine [36], [51].

Embedded surveillance devices (although currently engineered only for short range, such as within a correctional facility) can be considered the all-seeing pervasive eye, the interrogator. However, the individual being tracked may lack knowledge about what is on the other side: the receptor. This can create a monster greater than real life, as it adds insurmountable pressure arising from the unknown and from the inability to understand the boundaries and limitations of the surveillance technology. This becomes that much more of an infringement when the device is placed under the individual's skin. Illustratively speaking, rather than being seen for what it is (a mark of servitude, a passive information bank, a personal identifier, or a location monitor), the inductive coupling device has the potential to be mistakenly deemed the predator. In support of this notion, modern-day scholars refer to the reader as the interrogator.

As stated earlier, the external public gaze of the community and the state will shift from the external all-seeing eye to one that is internalized, regardless of whether the device is passive or active. Over and above Foucault's notion of self-policing, this process could be further accentuated by the person's inability to comprehend the full purpose or limitations of the surveillance ID system to which they are subject. This internalization has the potential to create a feeling of “the beast within” rather than a threat from without. The writers of this paper argue that this internalization of the gaze within the body will heighten the negative impact on one's psyche, ultimately degrading one's state of consciousness [52].

In this sense, Bentham's panoptic vision was never really defeated but has merely reached a higher level of sophistication, or barbarianism, depending on how it is looked upon. Rather than institutions embracing practices designed to rehabilitate the prisoner and bring the individual to an eventual state of freedom, bio-tech adoption could impair the recovery process, its constancy heightening psychological fears and making the surveillance near impossible ever to be disabled within the mind of the end-user. Hence, as Bentham's notion of free enterprise is accepted on a much more hidden level and the self turns to policing its own actions, this utter enclosure can be argued to lead the human body to a state of utter docility. This is a subject of debate for psychologists, bioethicists, and social scientists alike, and, in keeping with the phenomenological approach, must also include the insider's perspective.

Section VI. Conclusion

Imprisonment is transpiring on many levels and can be argued to be the system that has led Western civilization, incrementally, to the place it is today, where moral relativism rules the people, causing the moral voice of conviction designed for political and public engagement to be displaced by a turning inward to oneself as a form of self-expression [34]. This may be seen as the result of top-down governing institutions esteeming systematic rationale over the individual's voice, inadvertently marginalizing the embodied self in favor of other forces such as the economy. As the ruling system continues to overextend its control, it ever so gently coerces society in one direction only, massaging the spirit of Epicureanism, which endorses giving human passion full rein over one's own body as the final self-embodied means of conveying a message, while the governing institutions easily rule over a docile society. In this sense, bio-tech with its constancy may be seen as just one more apparatus designed to control the mind: although hidden, it most certainly is invasive. With current considerations for adoption, it brings Orwell's claim to the forefront when he wrote in 1984: “Nothing was your own except the few cubic centimetres inside your skull” [53], p. 27.

References

1. K. Michael, A. Masters, "Applications of human transponder implants in mobile commerce", Proceedings of the 8th World Multiconference on Systemics Cybernetics and Informatics, pp. 505-512, 2004.

2. D. Ratner, M. A. Ratner, Nanotechnology and Homeland Security, New Jersey:Prentice Hall, 2005.

3. M. H. Fulekar, Nanotechnology: Importance and Applications, New York:I K International Publishing House, 2010.

4. F. Allhoff et al., What is Nanotechnology and Why Does it Matter? From Science to Ethics, West Sussex: Wiley Blackwell, 2010.

5. K. R. Foster, J. Jaeger, "RFID inside - The murky ethics of implanted chips", IEEE Spectrum, vol. 44, pp. 24-29, 2007.

6. A. Masters, K. Michael, "Lend me your arms: The use and implications of humancentric RFID", Electronic Commerce Research and Applications, vol. 6, pp. 29-39, 2007.

7. K. Michael, M. G. Michael, "The diffusion of RFID implants for access control and epayments: a case study on Baja Beach Club in Barcelona", IEEE Symposium on Technology and Society, pp. 242-252, 2010.

8. K. Michael, M. G. Michael, J. Pitt, "Implementing ‘Namebars’ Using Microchip Implants: The Black Box Beneath the Skin" in This Pervasive Day: The Potential and Perils of Pervasive Computing, London: Imperial College London Press, pp. 163-206, 2010.

9. W. A. Herbert, "No Direction Home: Will the Law Keep Pace With Human Tracking Technology to Protect Individual Privacy and Stop Geoslavery", Law and Policy for the Information Society, vol. 2, pp. 436, 2006.

10. H. Aubert, "RFID technology for human implant devices", Comptes Rendus Physique, vol. 12, pp. 675-683, 2011.

11. K. Michael et al., "Microchip implants for humans as unique identifiers: a case study on VeriChip", Conference on Ethics Technology and Identity (ETI), pp. 81-84, 2008.

12. K. Michael, "The technological trajectory of the automatic identification industry: the application of the systems of innovation (SI) framework for the characterisation and prediction of the auto-ID industry", 2003.

13. A. Masters, K. Michael, "Lend me your arms: The use and implications of humancentric RFID", Electronic Commerce Research and Applications, vol. 6, pp. 29-39, 2007.

14. M. Michaud-Shields, "Personal Augmentation – The Ethics and Operational Considerations of Personal Augmentation in Military Operations", Canadian Military Journal, vol. 15, 2014.

15. "JOVIX", GPS vs. RFID, May 2016, [online] Available: http://atlasrfid.com/jovix-education/auto-id-basics/gps-vs-rfid/.

16. M. Roberti, Has RFID Been Integrated With GPS?, September 2016, [online] Available: http://www.rfidjournal.com/blogs/experts/entry?10729.

17. R. Ip et al., "Location and Interactive services not only at your fingertips but under your skin", IEEE International Symposium on Technology and Society, pp. 1-7, 2009.

18. J. van den Hoven, P. E. Vermaas, "Nano-Technology and Privacy: On Continuous Surveillance Outside the Panopticon", Journal of Medicine & Philosophy, vol. 32, pp. 283-297, 2007.

19. K. Michael, T. Y. Chew, Locat'em: Towards Hierarchical Positioning Systems, 2005, [online] Available: http://works.bepress.com/kmichael/145/.

20. K. Michael et al., "The emerging ethics of humancentric GPS tracking and monitoring", International Conference on Mobile Business, pp. 34-44, 2006.

21. K. Michael et al., "Location-Based Intelligence - Modeling Behavior in Humans using GPS", Proceedings of the International Symposium on Technology and Society, pp. 1-8, 2006.

22. B. Violino, Nokia Unveils RFID Phone Reader, March 2004, [online] Available: http://www.rfidjournal.com/articles/view?834.

23. K. Michael, M. G. Michael, "The social cultural religious and ethical implications of automatic identification", Proceedings of the Seventh International Conference in Electronic Commerce Research, pp. 433-450, 2004.

24. D. Buckey, DirectCash Payments Inc. Announces Launch of DC TAG, August 2015, [online] Available: http://pay.band/tag/visa-paywave/.

25. A. Friggieri et al., "The legal ramifications of microchipping people in the United States of America-A state legislative comparison", International Symposium on Technology and Society, pp. 1-8, 2009.

26. L. McIntyre et al., "RFID: Helpful New Technology or Threat to Privacy and Civil Liberties?", IEEE Potentials, vol. 34, pp. 13-18, 2015.

27. K. Michael, "Mental Health Implantables and Side Effects", IEEE Technology and Society Magazine, vol. 34, pp. 5-7, 2015.

28. K. Michael, TEDxUWollongong: Microchipping People, May 2012, [online] Available: https://www.youtube.com/watch?v=fnghvVR5Evc.

29. A. Masters, K. Michael, "Humancentric applications of RFID implants: the usability contexts of control convenience and care", The Second IEEE International Workshop on Mobile Commerce and Services, pp. 32-41, 2005.

30. N. Olson, Joseph Carvalko, A Review of The TechnoHuman Shell, December 2013, [online] Available: http://ieet.org/index.php/IEET/print/8510.

31. E. Strickland, Medtronic Wants to Implant Sensors in Everyone, June 2014, [online] Available: http://spectrum.ieee.org/tech-talk/biomedical/devices/medtronic-wants-to-implant-sensors-in-everyone.

32. G. Grant, Technology & Justice, Ontario:House of Anansi Press Ltd, 1986.

33. S. R. Bradley-Munn, K. Michael, "Whose Body Is It? The Body as Physical Capital in a Techno-Society", IEEE Consumer Electronics Magazine, vol. 5, 2016.

34. S. R. Bradley-Munn et al., "The Social Phenomenon of Body-Modifying in a World of Technological Change: Past Present Future" in IEEE Conference on Norbert Wiener in the 21st Century, Melbourne, 2016.

35. Y. Poullet, "Data protection legislation: What is at stake for our society and democracy?", Computer Law and Security Review, vol. 25, pp. 211-226, 2009.

36. A. Light, "The Panopticon reaches within: how digital technology turns us inside out", Identity in the Information Society, vol. 3, pp. 583-598, 2010.

37. K. Johnson et al., "Consumer Awareness in Australia on the Prospect of Humancentric RFID Implants for Personalized Applications", The Sixth International Conference on Mobile Business, 2007.

38. D. Klitou, Privacy-Invading Technologies and Privacy by Design: Safeguarding Privacy Liberty and Security in the 21st Century, London:Springer, 2014.

39. M. N. Gasson et al., Human ICT Implants: Technical Legal and Ethical, The Hague: Springer, 2012.

40. P. A. Rahe, Soft Despotism Democracy's Drift: What Tocqueville Teaches Today, New Haven:Yale University Press, 2009.

41. C. Klesse, C. Malacrida, J. Low, "Part XIV: Consumer Bodies ‘Modern Primitivism’: Non-mainstream Body Modification and Racialized Representation" in Sociology of the Body: A reader, Don Mills, Ontario: Oxford University Press, 2008.

42. A. H. Maslow, "A Theory of Human Motivation", Psychological Review, vol. 50, pp. 370-396, 1943.

43. P. Rahe et al., Soft Despotism Democracy's Drift: What Tocqueville Teaches Today (The Heritage Foundation: First Principles Series Report #28 on Political Though), September 2009, [online] Available: http://www.heritage.org/research/reports/2009/09/softdespotism-democracys-drift-what-tocqueville-teaches-today.

44. M. Foucault, Discipline & Punish: The Birth of the Prison, New York: Vintage Books, 1977.

45. A. Williamson, Biography of Joan of Arc (Jeanne d'Arc), April 1999, [online] Available: http://joan-of-arc.org/joanofarc_biography.html.

46. F. Frangipane, The Three Battlegrounds: An In-Depth View of the Three Arenas of Spiritual Warfare: The Mind the Church and the Heavenly Places, Cedar Rapids: Arrow Publications, Inc., 1989.

47. K. Michael, M. G. Michael, "From Dataveillance to Überveillance and the Realpolitik of the Transparent Society" in The Social Implications of National Security, Wollongong, 2007.

48. A. Crowther, "Penal Peepshow: Bentham's Prison that Never Was", Times Literary Supplement, vol. 23, pp. 4-5, February 1996.

49. T. Timan, N. Oudshoorn, "Mobile cameras as new technologies of surveillance? How citizens experience the use of mobile cameras in public nightscapes", Surveillance Society Journal, vol. 10, pp. 167-181, 2012.

50. B. Welsh, "The Entire History of You" in Black Mirror, UK, 2011.

51. C. Malacrida, J. Low, Sociology of the Body: A Reader, Don Mills, Ontario:Oxford University Press, 2008.

52. K. Michael, J. Pitt et al., "Be Vigilant: There are Limits to Veillance" in The Computer After Me, London, pp. 189-204, 2014.

53. G. Orwell, Nineteen Eighty-Four, London: Signet Classic.

Keywords: Radio-frequency identification, Implants, Biomedical monitoring, Global Positioning System, Surveillance, Context, social sciences, cybernetics, prosthetics, radiofrequency identification, docile body sociology, penal metaphor, institutional top-down control, organizational top-down control, restorative health, diagnostic purpose, prosthetic purpose, RFID implants, cybernetic landscape, nanotechnology, biomedical device, sensor technology, human rights, freedom of choice, opt-out, penal control, constancy

Citation: S.B. Munn, Katina Michael, M.G. Michael, "Sociology of the docile body", 2016 IEEE International Symposium on Technology and Society (ISTAS16), 20-22 Oct. 2016, Kerala, India, DOI: 10.1109/ISTAS.2016.7764047

Digital Wearability Scenarios: Trialability on the Run

Introduction

What happens when experimental technologies are deployed into society by market leaders without much forethought of the consequences on everyday life? When state-based regulations are deliberately ignored by rapid innovation design practices, giving birth to unconventional and radical production, a whole series of impacts play out in real life. One such example is Google's Glass product: an optical head-mounted display unit that is effectively a wearable computer. In early 2013, Google reached out to U.S. citizens asking potential Glass users to send a Twitter message with the #IfIHadGlass hashtag to qualify for consideration and to pay US$1,500 for the product if numbered among the eligible for its early adoption. About 8,000 consumers in the United States allegedly were invited to purchase the Explorer edition of Glass. By April 2013, Google had opened up Glass to its “Innovation in the Open” (I/O) developer community, and by May 2014, they allowed purchases of the product from anywhere in the world.

The early adopters of the open beta product quickly became tech evangelists for the Google brand. As was expected, the touted benefits of Glass, by the self-professed “Glassholes,” were projected as mainstream benefits to society via YouTube and Hangout. Tech-savvy value-added service providers who stood to gain from the adoption and citizens who wished to be recognized as forward-thinking, entrepreneurial, and cool came to almost instantaneous fame. There were, however, only a few dissenting voices that were audible during the trialability phase of diffusion, with most people in society either not paying much attention to “yet another device launch” by Google or ignoring folk who were just geeks working on hip stuff. About the biggest thought people had when confronted by one of these “glasses” in reality was “What's that?” followed by “Are you recording me?” The media played an interesting role in at least highlighting some of the potential risks of the technology, but for the most part, Glass was depicted as a next-generation technology that was here now and that even Australia's own then-Prime Minister Julia Gillard had to try out. Yep, another whiz-bang product that most of us would not dare to live without.

With apparently no limits set, users of Glass have applied the device to diverse contexts, from the operating theater in hospitals to preschools in education and evidence gathering in policing. Yes, it is here, right now. Google claims no responsibility for how its product is applied by individual consumers, and why should they—they're a tech company, right? Caveat emptor! But from the global to the local, Glass has received some very mixed reactions from society at large.

Scenario-Planning Approach

This article focuses on the social-ethical implications of Glass-style devices in a campus setting. It uses secondary sources of evidence to inspire nine short scenarios that depict a plausible “day in the life” of a person possessing a body-worn video camera. A scenario is “an internally consistent view of what the future might turn out to be” [1]: one gleans the current state of technology to map the future trajectory [2, p. 402]. Scenarios afford researchers two distinct capacities: 1) the opportunity to anticipate possible and desirable changes brought to society by the introduction of a new technology, known as proactivity, and 2) the opportunity to prepare for action before a technology is introduced into the mainstream, known as preactivity [3, p. 8]. While change is inevitable as technology develops and is diffused into society, we should be able to assess possible strategic directions to better prepare for expected changes and, to an extent, unexpected changes. This article aims to raise awareness of the possible social, cultural, and ethical implications of body-worn video recorders. It purposefully focuses on signs of threats and opportunities that body-worn recording devices presently raise in a campus setting such as a university [1, p. 59]. A similar approach was used successfully in [4] with respect to location-based services in 2007.

In February 2013, Katina and M.G. Michael were invited to write an opinion piece about the ethics of wearable cameras for Communications of the ACM (CACM) [5]. Upon the article's acceptance in September of the same year, the CACM editor provided the option of submitting a short video to accompany the article online, to act as a summary of the issues addressed. Encouraged by the University of Wollongong's videographer, Adam Preston from Learning, Teaching and Curriculum, after some initial correspondence on prospective scenarios, it was jointly decided to simulate the Glass experience with a head-mounted GoPro camera [6] and to discuss on camera some of the themes presented in the article within a university campus setting (Figure 1). A few months prior, in June, Katina had hosted the International Symposium on Technology and Society (ISTAS13) with wearable pioneer Prof. Steve Mann [7]. Ethics approval for filming the three-day international symposium with a variety of wearable recorders had been gained from the University of Wollongong's Human Research Ethics Committee (HREC), even though the event was based at the University of Toronto. Importantly, it must be emphasized that the scenarios themselves are fictitious in terms of the characters and continuity. They did not happen in the manner stated, but, like a tapestry, they have been woven together to tell a larger story. That story is titled “Recording on the Run.” Each scenario can be read in isolation but, when placed side by side with other scenarios, becomes a telling narrative of what might be with respect to societal implications if such recording devices proliferate.

Figure 1. A GoPro device clipped to an elastane headband ready to mount on a user. Photo courtesy of Katina Michael.


Having hired the videographer for 2 h to do the filming for CACM, we preplanned a walkthrough on the University of Wollongong's campus (Figure 2). Deniz Gokyer (Figures 3 and 4) was approached to participate in the video to play the protagonist GoPro wearer, as he was engaged in a master's major project on wearables in the School of Information Systems and Technology. Lifelogging Web sites such as Gloggler.mobi that publish point-of-view (POV) video content direct from a mobile device were also used to support claims made in the scenarios. The key question pondered at the conclusion of the scenarios is, how do we deal with the ever-increasing complexity in the global innovation environment that continues to emerge around us with seemingly no boundaries whatsoever? The scenarios are deliberately not interpreted by the authors, to allow for debate and discussion. The primary purpose of the article was to demonstrate that body-worn recording products can have some very significant expected and unexpected side effects and can additionally conflict with state laws and regulations as well as campus-based policies and guidelines.

Figure 2. (a) The making of a short video to discuss the ethical implications of wearable devices for CACM. (b) The simultaneous GoPro view emanating from the user's head-mounted device. Screenshots courtesy of Adam Preston.


Figure 3. Deniz Gokyer simulating an ATM withdrawal while wearing a GoPro. Photo courtesy of Adam Preston.


Figure 4. The aftereffect of wearing a GoPro mounted on an elastic band for 2 h. Photo courtesy of Katina Michael.


Recording on the Run

Scenario 1: The Lecture

Anthony rushed into his morning lecture on structures some 10 min late. Everyone had their heads down taking copious notes and listening to their dedicated professor as he provided some guidance on how to prepare for the final examination, which was worth 50% of their total mark. Anthony was mad at himself for being late, but the bus driver had not accepted his AUD$20 note in lieu of the now-standard Opal card. Prof. Markson turned to the board and began writing the practice equations wildly, knowing that he had so much to get through. Anthony made sure to keep his hands free of anything that would sidetrack him. Instead, he recorded the lecture with a GoPro on his head. Some of the girls giggled in the back row as he probably looked rather stupid, but the laughter soon subsided and everyone got back to work, copying down Markson's examples. At one stage, Markson turned to look at what the giggles were about, made startling eye contact with Anthony, and probably thought to himself: “What's that? Whatever it is, it's not going to help him pass—nothing but calculators are allowed in exam situations.”

Anthony caught sight of Sophie, who motioned for him to go to the back row, but by then, he thought it would probably be better recording from the very front and he would cause less disruption by just sitting there. Markson was a little behind the times when it came to innovation in teaching, but he was a brilliant lecturer and tutor. Anthony thought to himself, if anyone asked for the recording, he would make sure that it was available to them. The other students took note of the device that was firmly strapped to his head with a band but were somewhat unfazed. Anthony had always argued that recording with a GoPro is nothing more than recording with a mobile phone. He surfed a lot at Austinmer Beach, and he thought the video he took of himself on the board was just awesome, even though his girlfriend thought it was vain. It was like a motion selfie.

Scenario 2: The Restroom

It had been one long day, practically like any other, save for the fact that today Anthony had chosen to wear the GoPro on a head-mounted bandana to record his lectures. They were in the serious part of the session, and he wanted to make sure that he had every opportunity to pass. Anthony was so tired from pulling an all-nighter with assessment tasks that he didn't even realize that he had walked into the restroom toward the end of his morning lecture with the device switched on and recording everything in full view. Lucky for him, no one had been accidentally caught on film while in midstream. Instead, as he walked in, he was greeted by someone who was walking out and a second guy who avoided eye contact but likely noticed the camera on Anthony's head from the reflection in the mirror while washing his hands. The third one didn't even care but just kept on doing what he was doing, and the fourth locked his eyes to the camera with rage for a while. They didn't speak, but Anthony could sense what he thought—“what the heck?” Anthony was an attractive young man who sported tattoos and always tried to look different in some way. He hated conformity. Now that he had watched the video to extract the lecture material, he wondered why no one had stopped him to punch the living daylights out of him in the restroom. Anthony had thought people were getting used to the pervasiveness of cameras everywhere—not just in the street and in lecture theaters but also in restrooms and probably soon in their homes as well.

Scenario 3: The Corridor

By this time, Anthony was feeling rather hungry. In fact, he was so hungry that he was beginning to feel very weak. All of those late nights were beginning to catch up now. Sophie demanded that they go eat before the afternoon lecture. As they walked out of the main tower building, they bumped into an acquaintance from the previous session. Oxford, as he was known by his Aussie name, was always polite. The conversation went something like this. “Hello Oxford! How are you?” said Sophie. Oxford replied, “I'm fine, thank you. Good to see you guys!” Sophie quickly pointed to Anthony's head-mounted camera and said, “Oxford, can you believe how desperate Anthony has become? He's even recording his lectures with this thing now!” Oxford, who was surprised, remarked, “Oh yeah. I've never seen one of these before. Are you recording right now, Anthony?” “Yes, I am,” Anthony affirmed, “but to be honest, I completely forgot about it—I'm dreaming about food right now.” Anthony patted his tummy, which was by now making grumbling noises. “Want to come with us to the café near the gymnasium?” Anthony asked.

“He just filmed most of the structures lecture—I'm thinking like, this might be the coolest thing that might stick,” Sophie reflected, ignoring Anthony. “No kidding,” Oxford said, “You're recording me right now? I'm not exactly thrilled about this, but ‘hi,’ for what it's worth.” Oxford waved to the camera and smiled. Sophie interjected, “Oxford, it is not like he's making a movie of you, haha!” Sophie grabbed Oxford's arm and pulled it toward her—the playful jab was meant to make it clear she was joking. But suddenly, the mood turned serious rather than lighter. Oxford continued, “No, I'm not quite good in front of the camera…like I don't like pictures being taken of me or even recordings of my voice. It's probably the way I was raised back home.”

Anthony told Oxford not to worry because he was not looking at him, and so, therefore, nothing but his voice was really being recorded. Little did he realize that he was breaking local New South Wales laws, or at least that was what he would find out later in the day when someone from security spotted him on campus. Sophie asked with curiosity, “Do you think someone should ask you if they want to record you on campus?” Oxford thought that was a no-brainer—“Of course they should ask. You're wearing this thing on your head, and there's nothing telling people passing by whether you are watching them and recording them. C'mon Anthony, you're a smart guy, you should know this stuff; you're studying engineering, aren't you? We're supposed to be the ones that think of everything before it actually happens. You might as well be a walking CCTV camera.” There was dead silence among the friends. Then Anthony blurted out, “But I'm not watching you; you just happen to be in my field of view.”

Sophie began to consider the deeper implications while Anthony was getting flustered. He wanted to eat, and they were just beginning a philosophical conversation. “C'mon Oxford, come with us, we're starving…and we can talk more at lunch, even though we should be studying.” As they walked, Sophie continued: “It's not like this is the worst form of camera that could be watching. I saw this thing on the news a couple of weeks ago. The cameras are getting tinier; you cannot even see them. The company was called OzSpy, I think, and they're importing cheap stuff from Asia, but I don't think it's legal in every state. The cameras are now embedded in USBs, wristbands, pens, keyfobs, bags, and t-shirts. How do you know you're being recorded with that kind of stuff?” Oxford was beginning to feel uneasy. Anthony felt like taking off the contraption but left it on because he was just too lazy to put the thing back in its box and then back on again in less than 2 h. Oxford confessed again: “I feel uncomfortable around cameras, and it's not because I'm doing anything wrong.” They walked quietly for a few minutes and then got to the café. Sophie pointed to the wall as they queued. “Look up there. It's not like we're not always under surveillance. What's the difference if it is on a building wall versus on someone's head?”

Anthony wished they'd change the subject because it was starting to become a little boring to him. Oxford thoughtfully replied to Sophie, “Maybe it's your culture or something, but I even wave to CCTV cameras because it's only for security to see on campus. But if someone else is recording me, I don't know how he or she will use the footage against me. I don't like that at all. I think if you're recording me to show other people, then I don't think it's okay at all.” Sophie chuckled, “Hey, Oxford, this way Anthony will never forget you even when you have finished your degree and return to Thailand in ten years; when he is rich and famous, he'll remember the good old days.” The truth was that Oxford never wanted to return to Thailand; he liked the opportunities in Australia but added, “Okay, so you will remember me and my voice forever.”

By this time, Anthony was at the front of the queue. “Guys, can we forget about this now? I need to order. Okay, Oxford, I promise to delete it if that makes you feel better.” Oxford said, “No, Anthony, you don't understand me. I don't mind if you keep this for old times' sake, but just don't put it on the Internet. I mean don't make it public, that's all. Guys, I just remembered I have to go and return some library books so I don't get a fine. It's been nice chatting. Sorry I cannot stay for lunch. Good luck in your finals—let's catch up and do something after exams.” “Sure thing,” Sophie said. “See ya.” As Oxford left and Anthony ordered food, she exclaimed, “Your hair is going to be great on the video!” Oxford replied, “I know my hair is always great, but this jacket I am wearing is pretty old.” Oxford continued from afar, “Anthony, remind me to wear something nicer next time. Bye now.” Sophie waved as Oxford ran into the distance.

Scenario 4: Ordering at the Cafe

Anthony ordered a cappuccino and his favorite chicken and avocado toastie. The manager, who was in his 50s, asked for Anthony's name to write on the cup. “That will be $10, thanks.” Anthony handed over a note and waited for change. “And how are you today?” asked the manager. “I'm fine thanks.” “Yeah, good,” replied the manager, “Okay, see you later, and have a good one.” Anthony muttered, “I'll try.” Next it was Sophie's turn to order. “What's up with him?” asked the café manager. “What's that thing on his head? He looks like a goose.” Sophie cracked up laughing and struck up a conversation with the manager. She was known to be friendly to everyone.

Anthony went to the service area waiting for his cappuccino and toastie. For once, the line was not a mile long. The male attendant asked Anthony, “What's with the camera?” By then, Anthony had decided that he'd play along—sick of feeling like he had to defend himself, yet again. He wasn't holding a gun after all. What was the big deal? He replied, “What's with the camera, mate? Well, I'm recording you right now.” “Oh, okay, awesome,” said the male attendant. Anthony probed, “How do you feel about that?” The male attendant answered, “Well, I don't really like it man.” “Yeah, why not?” asked Anthony, trying to figure out what all the hoo-ha was about. There were CCTV cameras crawling all over campus, and many of them were now even embedded in light fixtures.

“Hey, Josie, Josie—how do you feel about being filmed?” exclaimed the male attendant to the female barista cheekily. “I don't really mind. I always wanted to be an actress when I was little, here's my chance!” “Yeah?!” asked Anthony, in a surprised tone. “Are you filming me right now? Are you going to make me look real good?” laughed the barista in a frisky voice. Anthony smiled and, by then, Sophie had joined him at the service area, a little jealous. “What's this for?” asked Josie. She had worked on campus for a long time and was used to serving all sorts of weirdos. “No reason. I just filmed my structures class. And now, well now, I've just decided to keep the camera rolling.” Josie asked again, “Are you really filming me right now?” Anthony reaffirmed, “Yes.”

Sophie looked on in disbelief. The camera had just become the focal point for flirtation. She wasn't liking it one bit. Josie asked Anthony again, “Why are you filming?” Anthony didn't know why he blurted out what he did but he said, “Umm…to sort of get the reactions of people. Like how they act when they see someone actually recording them.” The male attendant interrupted, “You know what you should do? You should go up to him,” pointing to the manager, “and just stare at him, like just stare him in the face.” “I will, I will,” said Anthony. Egging Anthony on, the male attendant smiled, “Stand in front of the queue there, and just stare at him. He'll love it, he'll love it, trust me. You'd make his day man.” “Hey, where's my cappuccino and toastie?” demanded Anthony. The male attendant handed the food over and got Sophie's food ready too. “And this must be yours.” “Yes,” Sophie replied. The male attendant insisted: “Focus on him now, don't focus on me, all right?” “Yup, ok, see you later. Cheers.” Anthony felt a little diminished; although he was surprised that the barista talked to him for as long as she did, he wasn't about to pick a fight with an old bloke. What he was doing was harmless, he thought; he left the counter to take a seat, but considered switching off the device.

Scenario 5: Finding a Table at the Cafe

Sophie found a table with two seats left in a sunny spot and put her things down. Lack of sleep during exam time meant that everyone generally felt cold. Anthony sat down also. At the large oblong table was a small group of three—two girls and a guy. Sophie went looking for serviettes, as they forgot them at the counter. As soon as Anthony pulled up a chair to sit down, one of the girls got up and said, “And you have a lovely afternoon.” Anthony replied, “Thank you and you too.” Speechless, the other two students at the table picked up whatever was left of their drinks and left not long after. As Sophie returned, she saw the small group leaving and whispered, “Anthony, maybe you should take that thing off. You're getting quite a bit of attention. It's not good. A joke's a joke. Alright, I could cope with the classroom situation, but coming to the café and telling people you're recording. Surely, you are not, right? You're just kidding, right?” “Listen, Sophie, I'm recording you now. The battery pack lasts a while, about an hour, before it needs replacing. I'm going to have to charge the backup during the next lecture.” “Anthony,” Sophie whined, “c'mon, just turn it off.” Anthony acted like he was turning it off reluctantly although he had not. “Now put it away,” Sophie insisted. “No, I'm going to leave it on my head,” Anthony said. “I couldn't be bothered, to tell you honestly. Just don't forget to remind me to turn it back on when we are in class.” “Good,” said Sophie.

By then, two girls asked if they could sit down at the table. “Sure,” said Sophie. The girls were known to Sophie from the Residence, but they had only ever exchanged niceties. “My name is Klara,” said one of the girls. “And my name is Cygneta,” said the other. “I'm Sophie, and this is my boyfriend Anthony. Nice to finally get to talk to you. That'd be right. Just when we should all be studying, we're procrastinating and socializing.” Anthony was happy for the change of conversation, or so he thought.

“I know what that is, Anthony! It's a GoPro,” Cygneta exclaimed. “Sophie, Sophie, I wouldn't let my man carry that thing around on campus filming all those pretty ladies.” Cygneta giggled childishly, and Klara joined her in harmony but did not know anything about the contraption on Anthony's head. Sophie was reminded why she had never bothered approaching Cygneta at the Residence. Those two were inseparable and always too cute—the typical creative arts and marketing students. Sophie retorted, “Well, he's not filming right now. He just filmed the lecture we were in.” Anthony made Sophie think twice. “How do you know I'm not filming right now?” Sophie said, “Because the counter on the LCD is not ticking.” Cygneta had used a GoPro to film her major project and shared with the group that the LCD could be toggled not to show a counter. Sophie didn't like it one bit. It made her doubt Anthony.

Anthony proceeded to ask Klara, “How do you feel when you see someone recording you?” “Yeah, not great. I feel, like, really awkward,” confessed Klara. Then Anthony asked the million-dollar question: “What if most people wore a Google Glass on campus and freed themselves of having to carry an iPhone?” Klara at this point was really confused. “Google what?” Sophie repeated, “Google Glass” in unison with Anthony. Shaking her head from side to side, Klara said, “Nah, I'm not into that kind of marketing at all.” “But it's the perfect marketing tool to gather information,” mused Anthony. “Maybe you're going to start using it one day as well? Don't you think?” Klara looked at Sophie and Anthony and replied, “What do you mean? Sorry?” Anthony repeated, “Do you reckon you're gonna be using Google Glass in a couple of years?” Klara turned to Cygneta for advice. “What in the world is Google Glass? It sounds dangerous?” Anthony explained, “It's a pair of glasses that you can wear, but it's a computer at the same time.” Klara let out a sigh. “I had no idea that even existed, and I think I'm a good marketing student and on top of things.”

By this stage, Sophie was feeling slighted and decided to finish her food, which was now cold. Anthony, caught off guard by Klara's lack of awareness, reaffirmed, “So you don't reckon you'd be wearing glasses that can record and work as a phone, or a headband capable of reading brain waves?” Cygneta said, “Probably not,” and Klara also agreed, “No. I like my phone just fine. At least I can choose when I want to switch it off. Who knows what could happen with these glasses? It's a bit too out there for me. That stuff's for geeks, I think. And anyway, there's nothing interesting in my life to capture—just one big boring stream of uni, work, and home.”

Sophie pointed out an interesting fact: “Hey girls, did you know that there's no law in Australia that forbids people from video recording others in public? If it's happening out on the street, then it ain't private.” Cygneta replied, “Yeah, I heard this in the news the other day; one of the ministers was caught on video heavily cursing another minister while listening to his speech. He was waiting for his turn to give a speech of his own, apparently, and he didn't even notice someone was recording him. What an idiot!”

Sophie asked Anthony to accompany her to the bank. Lunch was almost over, and the lecture was now less than an hour away. The pair had not studied, although at the very next table was a group of six buried in books from the structures class. Klara and Cygneta went to order a meal at the café and said goodbye. Anthony reluctantly got up from the table and followed Sophie to the study group. Sophie bravely asked, “Anyone got any solutions yet to the latest practice questions?” People looked up, and the “little master,” so codenamed for his genius, said, “Not yet.” None of the other engineering students, mostly of Asian background, could have cared less about the camera mounted on Anthony's head. Sophie found this disturbing and startling. She immediately thought about those little drones being developed and how men seemed to purchase these toys way more than any woman she knew. Who knew what the future would hold for humankind, she thought. Maybe the guys would end up loving their machines so much they'd forget spending time with real people! Sophie liked the challenge of engineering, but it was at times strange to be in a room full of guys.

The power to exclude, delete, or misrepresent an event is with the wearer and not the passive passerby.

Scenario 6: A Visit to the C.A.B. Bank

Sophie was beginning to really tire of the GoPro shenanigans. She asked Anthony to wait outside the bank since he would not take off the contraption. Sophie was being pushed to the limit. Stressed out with exams coming up and a boyfriend who seemed preoccupied with proving a point, whatever that point was, she just needed things to go smoothly at the bank. Luckily, this was the less popular bank on campus, and there was hardly anyone in it. Sophie went right up to the attendant but called out for Anthony to help her with her bag while she rummaged in her handbag for her driver's license. Anthony sat down on one of the sitting cubes and, looking up, realized he was now in the “being recorded” position in the bank himself. One attendant left the bank smiling directly into the camera and at Anthony. He thought, “How's that for security?” Another teller leaned over the screen and asked Anthony, “Is there anything we can help you with?” Anthony said, “I'm waiting for my girlfriend,” which seemed to appease the teller all too easily.

It was now time for Sophie to withdraw money at the teller. Anthony really didn't mind because Sophie was always there to support him, no matter how long it took. They reflected that they had no more than 30 min left to do a couple more errands, including visiting the ATM and going to the library. There were four people in the queue at the ATM. Anthony grabbed Sophie's hand and whispered in her ear, “Sophie, do you realize something? If I was recording right now, I'd be able to see the PINs of all the people in front of us.” Sophie shushed Anthony. “You're going to get us in trouble today. Enough's enough.” “No really, Sophie, we've got to tell security. They're worried about tiny cameras looking down and skimming devices, but what about the cameras people are wearing now?” Sophie squeezed Anthony's hand—“Anthony, you are going to get us in serious trouble. And this is not the time to be saving the world from cybercriminals.” Anthony moved away from the queue, realizing that his face was probably being recorded on CCTV. The last thing he ever wanted was to be in trouble. He went to pull the GoPro off his head at once; it was becoming rather hot even though it had been a cool day, and it was beginning to feel uncomfortable and heavy on his back and neck muscles. By the time he could get his act together, Sophie had made her transaction and they were hurriedly off to the library just before class.

Scenario 7: In the Library

As they rushed into the library to get some last-minute resources, Anthony and Sophie decided to split up. Sophie was going to the reserved collection to ask for access to notes that the special topics lecturer had put on closed reserve, and Anthony was going to do some last-minute bibliographic searches for the group assignment that was due in a few days. Why was it that things were always crammed into the last two weeks of the session? How on earth was any human being able to survive those kinds of demands? Anthony grabbed Sophie's bag and proceeded to the front computers. It was packed in the library because everyone was trying to do their final assignments. As Anthony hovered behind the other students, he remembered the shoulder-surfing phenomenon he had considered at the ATM. It was exactly the same. Anthony made sure not to look forward. As soon as there was an empty computer, he'd be next. He conducted some library searches standing up and then spotted two guys moving away from a sit-down desk area. Given all the stuff he was carrying, he thought he'd ask the guys nearby if they had finished. They said yes and tried to vacate the space as fast as they could, being courteous to Anthony's needs. By this time, Anthony was also sweating profusely and had begun to look stressed out.

The cameras are now embedded in USBs, wristbands, pens, keyfobs, bags, and t-shirts.

Anthony dumped his stuff on the ground, and the shorter of the two men said, “Are you wearing a camera on your head?” Anthony muttered to himself, “Oh no, not again.” Had he been able to take the device off his head effortlessly, he would have. After wearing it for over 2 h straight, it had developed an octopus-like suction to his forehead. “Yeah, yeah, it's a camera.” This camera had brought him nothing but bad luck all day. Okay, so he had taped most of the first lecture in the morning, but it had not been any good since. Sophie was angry with him over the café discussions, Oxford was not interested in being filmed without his knowledge, and Anthony's shoulders were really starting to ache and he was developing a splitting headache. “You guys would not happen to be from civil engineering?” Anthony asked in the hope that he and Sophie might get some hints for the forthcoming group assignment. “Nah, we're from commerce.” Both men walked away after saying goodbye, and Anthony was left to ponder. Time was running out quickly, so he left his things where they were and decided to go to the desk and ask for help directly.

“Hello, I am wondering if you would be willing to help me. My name is Anthony, and I am doing research on…” The librarian studied Anthony's head closely. “Umm…can I just ask what's happening here? Please tell me you are not recording this conversation,” asked the librarian politely. “What?” said Anthony, completely oblivious to the camera mounted on his head. He then came to his senses. “Oh that? That's just a GoPro. I've not got it on. See?” He brought his head nearer to the librarian, who put on her glasses. “Now, I'm looking for…” “I'm sorry, young man, I'm going to have to call down the manager on duty. You just cannot come into the library looking like that. In fact, even onto campus.”

Anthony felt like all of his worst nightmares were coming true. He felt like running, but his and Sophie's belongings were at the cubicle and besides, the library security CCTV had been recording for the last few minutes. His parents would never forgive him if anything jeopardized his studies. Sophie was still likely photocopying in closed reserve. What would she think if she came out to be greeted by all this commotion? The manager of “The Library”—oh he felt a bad feeling in the pit of his stomach. Anthony knew he had done nothing wrong, but that was not the point at this time. The librarian seemed less informed than even he was of his citizen rights, and while she was on the phone, hurriedly trying to get through to the manager, Sophie returned with materials.

“Where are our bags? My laptop is in there, Anthony.” Anthony signaled over to the cubicle, didn't go into details, and asked Sophie to return to the desk to do some more searches while he was with the librarian. Surprisingly, she complied immediately, given the time on the clock. Anthony was relieved. “Look,” he said to the librarian, “I am not crazy, and I know what I am doing is legal.” She gestured to him to wait until she got off the phone. “Right-o, so the manager's at lunch, and so I'll have to have a chat with you. First and foremost, when you're taking footage of the students, you need permission and all that sort of thing. I'm just here to clarify that to you.” “Look, umm, Sue, I'm not recording right now, so I guess I can wear whatever I want and look as stupid as I want so long as I'm not being a public nuisance.” “Young man, can I have your student ID card please?” Anthony claimed he did not have one with him; really, he was trying to avoid returning to where Sophie was and getting hit with even more questions. Anthony proceeded to give the librarian his full name.

“Well, Anthony Fielding, it is against university policy to go around recording people in a public or private space,” stated the librarian firmly. Anthony, by now, had had enough. “Look, Sue, for the second time, I've not recorded anyone in the library. I did record part of my lecture today with this device. It is called a GoPro. Why hasn't anyone but me heard about it?” “Well, we have heard of Google Glass here, and we know for now, we don't want just anyone waltzing around filming indiscriminately. That doesn't help anyone on campus,” the librarian responded. “Okay, based on my experience today, I know you are right,” Anthony admitted. “But can you at least point me toward a library policy that clearly stipulates what we can and cannot do with cameras? And why is this kind of camera one that you're alarmed about rather than a more flexible handheld one like this one?” Anthony pulled out his iPhone 6. The librarian seemed oblivious to what Anthony was trying to argue. Meanwhile, Anthony glanced over to Sophie, half-smiling, indicating they would have to make a move soon by pointing at his watch and then the exit.

“Look, I know you mean well. But…” Anthony was interrupted again by the librarian. “Anthony Fielding, it is very important you understand what I am about to tell you; otherwise you might end up getting yourself in quite a bit of trouble. If you're recording students, you actually have to inform the student and ask if it's okay, because quite a lot of them are hesitant about being filmed.” Anthony retorted, “I know, I know, do unto others as you'd have them do unto you, but I already told you, I'm not recording…But which policy do you want to refer me to and I'll go and read it, I promise.” The librarian hesitated and murmured behind her computer, “Ah…I'll have to look…look…look and find it for you, but I just…I just know that…” The librarian realized the students were going to be late for a lecture. “Look, if you're right and there is no policy, assuming I've not made an error, then we need to develop one.” “Look, Sue, I don't mean to be rude, but we've already filmed in a lecture theater today. I wouldn't call a public theater private in any capacity. Sure people can have private conversations in a theater, but they shouldn't be talking about private things unless they want to actively share them during class discussion time.” “Look, that's a bit of a gray area,” the librarian answered. “I think I am going to have to ask security to come over. It's just that I don't think the safety of others is being put first. For starters, you should take that thing off.” Anthony realized that things were now serious. He attempted to take off the band, which was soaking wet from sweat given his latest predicament.

Sophie realized something was wrong when she was walking with the bags back to the information desk. “Anthony, what's happening?” Sophie had a worried look on her face. “I've been asked to wait for security,” said Anthony. “Can you please not worry and just leave for class? I won't feel so bad if you go on without me.” Sophie responded, “Anthony, I told you this thing was trouble—you should have just taken it off—oh Anthony!” “What now?” said Anthony. “Your forehead…are you okay? It's all red and wrinkly and sweaty. Are you feeling okay?” Sophie put her hand on Anthony's forehead and realized he was running a fever. “Look, is this really necessary? My boyfriend has not done anything wrong. He's taken off the device. If you want to see the lecture footage, we'll show you. But really, the guy has to pass this subject. Please can we go to the lecture theater?” The librarian was unequivocally unemotional. Anthony looked at Sophie and she nodded okay and left for class with all the bags. “Please ring me if you need anything, and I'll be here in a flash.” Sophie kissed Anthony goodbye.

Scenario 8: Security on Campus

Moments later, security arrived on the scene. Anthony challenged the security guards and emphasized that he had done nothing wrong. Anthony was escorted back to the security office on campus some 500 m away. At this point, he was told he was not being detained and that university security staff simply wanted to have a chat with him. Anthony became deeply concerned when several security staff greeted him at the front desk. They welcomed him inside, asked him to take a seat, and offered him a cup of coffee.

“Anthony, there has been a spate of thefts on campus of late. We'd like to ask you where you got your GoPro camera.” “Well, it was a birthday present from my older brother a few months ago,” Anthony explained. “He knows I've always made home movies from when I was a youngster, and he thought I might use it to film my own skateboarding stunts.” “Right,” said the security officer, “Could you let me take a look at the serial number at the bottom of the unit?” “Sure,” said Anthony, “and then can I go? I haven't stolen anything.” The security staff inspected the device and checked the serial number against their database, handing it back to Anthony. “Ok, you're free to go now.” “What? And I thought you were going to interrogate me for the footage I took today!”

“Look Anthony, that's a delicate issue. Yeah, under the Surveillance Devices Act, for you to be able to record somebody you need their explicit permission, which is why you'll see wherever we've got cameras we've got signage that states you're being filmed, and even then we've got a strict policy about what we do with the recordings. We can't let anybody view it unless it's police and so on, but it's really strict.” Anthony replied, “What happens when Google Glass begins to proliferate on campus? The GoPro, which will be obvious, won't be what you're looking out for, but rather Glass being misused or covert devices.” “Look,” the security officer responded, “the way it works at universities is that we are concerned with the here and now. I can't predict what will happen in about three months' time, right?” At this point Anthony was thinking about his lecture and how he was running late yet again; however, this time through no fault of his own.

“Is she with you?” asked the security manager. “Who do you mean?” questioned Anthony. “That young lady over there,” the manager replied, pointing through the screen door. “Oh, that's my girlfriend, Sophie. I reckon she was worried about me and came to see what was going on.” Sophie had her iPhone out and was recording the goings on. Anthony just had to ask, “Am I right? Is my girlfriend allowed to do that? She isn't trespassing. The university campus is a public space for all to enjoy.” The security manager replied, “Actually, she's recording me, but she's not really allowed to do that without giving me some sort of notification. We might have cameras crawling all over this campus for student and staff safety, but our laws state if people don't want to be recorded, then you should not be recording them. On top of this, you would probably realize that when you walk around the campus in large areas like the walkways, they're actually facing the road, they're not facing people. So yes, you need permission for what she's doing there or adequate signage explaining what is going on.”

Sophie put the phone down and knocked on the door. “Can I come inside?” “Of course you can,” said the security manager. “Join the party!” “Anthony, Prof. Gabriel is asking for you; otherwise, he'll count you absent and you won't get your 10% participation mark for the session. I told him I knew where you were. If we get back within 15 min, you're off the hook.” “Hang on Sophie,” Anthony continued, “I'd like to solve this problem now to avoid any future misunderstandings. After all, I'm about to enter the classroom and record it for my own learning and progress. What do you think? Is that against the law?” Anthony asked the security manager. The security manager pondered for a long while. “Look, we get lots and lots of requests asking us to investigate the filming of an individual; we take that very seriously. But there is no law against that taking place in a public space.” “Is a lecture theater a public space?” Anthony prompted. The security manager replied, “I think you should be allowed to use head-mounted display video cameras if it's obvious what you're doing, unless a bystander asks you to cease recording. The lecture rooms are open and are usually mixed with the reception areas, which makes them public areas; so if you want to gain access to the room, obviously you can because it's a public area. You don't have to use a swipe card to get in, you see. But then there are still things that you can't do in a public area, like you can't ride a bicycle in there; or if someone is giving a lecture, you can't interrupt the lecture. That sort of thing.”

Anthony started speaking from the experience of his day. “I was queueing in front of the ATM today, and I realized that I could easily see the activities of the people in front of me and the same in the library. When I hover around somebody's computer, I can see their screen and what they're up to on the Internet. It bothered even me after my experience today; unintentionally I'm seeing someone's ATM PIN, I'm seeing someone searching on Google about how to survive HIV, which is personal and highly sensitive private stuff. No one should be seeing that. I just wore my GoPro to record my lecture for study purposes, but these kinds of devices in everyday life must be very disturbing for the people being recorded. That's why I'm curious what would happen on campus.” The security manager interrupted, “We already have some policies in place. For example, you can make a video recording, but what are you going to do with it? Are you going to watch it yourself or are you going to e-mail it around? You can't do that using your university e-mail account. You can't download, transfer, or copy videos using university Internet, your university account, or your university e-mail account. Look it up; there are also rules about harassment…It's fairly strict and already organized in that regard. But if you're asking where the university is applying policies, you're asking the wrong people because we don't get involved in policy making. You should be talking to the legal department. We don't make the policies; we just follow the procedures. Every citizen of this nation also has to abide by state and federal laws.”

The explanation satisfied Anthony. He realized that the security manager was not the person to talk to for any further inquiries. “Thank you for taking the time to answer my questions; you've been very helpful,” Anthony said as he headed to the door to attend his class with Sophie. He did need that 10% attendance mark from Prof. Gabriel if he wanted to be in the running for a Distinction grade.

Scenario 9: Sophie's Lecture

After their last lecture together, Anthony was happy thinking he was almost done for the day and would be heading back home, but Sophie had one more hour of tutorial. Anthony walked Sophie to her last tutorial's classroom. “C'mon Anthony, it'll only take half an hour tops. After this class, we can leave together; bear with it for just a while,” Sophie insisted. “Okay,” said Anthony; his mind was overflowing with thoughts of the final exams and the questions raised by his unique experience with the GoPro all day.

They arrived a few minutes late. Sophie quietly opened the door as Anthony walked in behind her. The lecturer caught a glimpse of Anthony with the GoPro on his head. The lecturer asked Anthony, “Are you in this class?” “No, I'm just with a friend,” replied Anthony as he was still trying to walk in and take a seat. “Okay and you're wearing a camera?” “Yeah?!” Anthony replied, confused by the tone of the lecturer. “Take it off!” the lecturer exclaimed. “You don't have permission to wear a camera in my class!” Silence fell over the classroom. As the lecturer's tone became more aggravated, everyone stopped, trying to understand what was going on. “Ok, but it's not…” The lecturer refused to hear any explanation. “You're not supposed to interrupt my class, and you're not supposed to be wearing a camera, so please take the camera off and leave the class!”

Anthony saw no point in explaining himself and left the class. Sophie, in shock, followed Anthony outside to check up on him and make sure he was all right. “Oh Anthony, I don't know how many times I told you to take it off all day…Are you ok?” Anthony was shocked as well. “I don't understand why he got so upset.” Anthony was facing the lecture theater's glass door; it opened and the lecturer stepped out and asked, “Excuse me, are you filming inside the class?” “Professor…” Anthony tried to say he was sorry for the trouble and that he wasn't even recording. “No! Were you filming inside the class?” the lecturer asked again. “I'm sorry if I caused you trouble, professor, the camera is not even on.” The professor, angry at both of them for interrupting his class with such a silly incident, asked them to leave and returned to the lecture theater. Sophie was surprised. “He's a very nice person; I don't understand why he got so upset.” Anthony's shock turned into anger. “I thought this was a public space and I don't think there's any policy that forbids me to record the lecture! Couldn't he at least say it nicely? You get back in, I'll see you after your class, and meanwhile I'll take this darn thing off.” Anthony kissed Sophie goodbye and left for the library without the GoPro on his head.

Conclusion

Wearable computers—digital glasses, watches, headbands, armbands, and other apparel that can lifelog and record visual evidence—tell you where you are on the Earth's surface and how to navigate to your destination, alert you of your physical condition (heart and pulse rate monitors), and even inform you when you are running late to catch a plane, offering rescheduling advice. These devices are windows to others through social networking, bridges to storage centers, and, even on occasion, companions as they listen to your commands and respond like a personal assistant. Google Glass, for instance, is a wearable computer with an optical head-mounted display that acts on voice commands like “take a picture” and allows for hands-free recording. You can share what you see live with your social network, and it provides directions right in front of your eyes. Glass even syncs your deadlines with speed, distance, and time data critical to forthcoming appointments.

But Google is not alone. Microsoft was in the business of lifelogging more than a decade ago with its SenseCam device, which has now been replaced by the Autographer. Initially developed as a memory aid for those suffering from dementia, the Autographer takes a 5-megapixel picture about 2,000 times a day, and a day's images can be replayed in fast-forward mode in about 5 min. It is jam-packed with sensors that provide a context for the photo, including an accelerometer, light sensor, magnetometer, infrared motion detector, and thermometer, as well as a GPS chipset. The slim-line Narrative Clip is the latest gadget to enter the wearable space. Far less obtrusive than Glass or the Autographer, it can be pinned onto your shirt, takes a snapshot every 30 s, and is so lightweight that you quickly forget you are even wearing it.

These devices make computers part of the human interface. But what are the implications of inviting all this technology onto the body? We seem to be producing innovations at an ever-increasing rate and expect adoption to match that cycle of change. But while humans have limitations, technologies do not. We can keep developing at an incredible speed, but there are many questions about trust, privacy, security, and the effects on psychological well-being that, if left unaddressed, could have major risks and often negative societal effects. The most invasive feature of all of these wearables, however, is the image sensor that can take pictures in an outward-looking fashion.

The claim is often made that we are under surveillance by CCTV even within leisure centers and change rooms. But having a Glass device, Autographer, or Narrative Clip recording while you are in a private space, like a “public” washroom, provides all sorts of nightmare scenarios. The camera is looking outward, not at you. Those who believe that they will remember to turn off the camera, will not be tempted to keep the camera “rolling,” or will “delete” the data gathered at a later date are only kidding themselves. We can hardly delete our e-mail records, let alone the thousands of pictures or images we take each day. The recording of sensitive data might also increase criminality rather than reduce it. The power to exclude, delete, or misrepresent an event is with the wearer and not the passive passerby. There is an asymmetry here that cannot be rectified unless the passive participant becomes an active wearer themselves. And this is not only unfeasible, but we would argue undesirable. At what point do we say enough is enough?

We are challenging fundamental human rights through the thoughtless adoption of new technologies that are enslaving us to a paradigm of instantaneous reality-TV-style living. We are seduced into providing ever more of our personal selves without any concerns for the protection of our personal data. Who owns the data emanating from these devices if the information is stored somewhere other than the device itself? Does that mean I lose my capacity to own my own set of histories relating to my physiological characteristics as they are sold on to third-party suppliers? Who will return my sense of self after I have given it away to someone else? We need to face up to these real and proportional matters because they not only have lawful implications but implications for our humanity.

References

[1] M. Lindgren and H. Bandhold, Scenario Planning: The Link Between Future and Strategy. Basingstoke, Hampshire: Palgrave Macmillan, 2010, p. 22. 

[2] S. Inayatullah, “Humanity 3000: A comparative analysis of methodological approaches to forecasting the long-term,” Foresight, vol. 14, no. 5, pp. 401–417, 2012. 

[3] M. Godet, “The art of scenarios and strategic planning,” Technol. Forecast. Social Change, vol. 65, no. 1, pp. 3–22, 2000. 

[4] L. Perusco and K. Michael, “Control, trust, privacy, and security: Evaluating location-based services,” IEEE Technol. Soc. Mag., vol. 26, no. 1, pp. 4–16, 2007. 

[5] K. Michael and M. G. Michael. (2013). No limits to watching. Commun. ACM. [Online]. 56(11), 26–28. Available: http://cacm.acm.org/magazines/2013/11/169022-no-limits-to-watching/abstract

[6] Y. Gokyer, K. Michael, and A. Preston. Katina Michael discusses pervasive video recording in the accompaniment to “No Limits to Watching” on ACM's vimeo channel. [Online]. Available: http://vimeo.com/77810226

[7] K. Michael. (2013). Social implications of wearable computing and augmediated reality in every day life. In Proc. IEEE Symposium on Technology and Society (ISTAS13), Toronto,

Citation: D. Gokyer and K. Michael, “Digital wearability scenarios: Trialability on the run,” IEEE Consumer Electronics Magazine, vol. 4, no. 2, pp. 82–91, 2015, DOI: 10.1109/MCE.2015.2393005.

Surveillance, Social Networks, and Suicide

Saint Augustine's "Confessions"

Saint Augustine of Hippo (354–430 CE) [1], one of the most revered doctors of the ecclesia catholica, might not have been so highly esteemed had he flourished centuries afterwards in a world of uberveillance [2]. One of the unique aspects of Augustine's life that endeared him to the community of the faithful, both past and present, was his rising up from the “fornications” [3] and the “delight in thievery” [4] to become a paradigm for both the eastern and western churches of the penitent who becomes a saint. But would the celebrated bishop and author of The City of God have risen to such prominence and reverence had his early and formative life been chronicled on Facebook and “serialized” on YouTube? Would Augustine's long and grueling years of penitence and good works have been recognized? That we have his stylized and erudite Confessions on paper is another matter altogether; as to its impact, the written record cannot be compared to capturing someone in the act on closed circuit television (CCTV). The audio-visual evidence is there forever to be rerun at whim by those who have access. And what of the multitude of other canonized “sinners” who in their own time and private space might not only mature by engaging with their humanity, indeed with their flaws and weaknesses, but also aspire to sainthood through repentance? If these “lives of the saints” were rerun before us, would we view such consecrated men and women in the same way? Where context is lacking or missing, then all interpretation of content, however compelling to the contrary, must be viewed with a high degree of suspicion.

Even in the political and civil rights arena, for example, had the private lives of colossal and “untouchable” figures such as John F. Kennedy and Martin Luther King been subjected to never-ending uberveillance, how might that not only have affected the biography of these two men, but changed the course of history itself? Moreover, how would knowledge of such bio-intrusive surveillance have altered both Kennedy's and King's decision-making processes and life habits? We know, for instance, particularly from the seminal study of M.F. Keen, that the surveillance of prominent sociologists in the United States played a role in shaping the American sociological tradition. Certainly, J. Edgar Hoover's FBI [5] might have kept a detailed account of the supposed meanderings and subversions of its “suspects,” but these records, whether true or false, were not universally accessible and were limited by the state of information and communication technology at the time [6]. And what of the private lives of popes and patriarchs, kings and queens, great philanthropists, and other exalted figures; how might they have stood up to the nowadays literal “fly on the wall” shadowing [7]?

The incongruity behind traditional surveillance technologies (including wholesale surveillance and “dataveillance”) is that, generally, individuals of power and influence are not subjected to the extreme and exaggerated types of surveillance techniques designed and planned for everyone else. The exceptions, of course, are occasions of blackmail and industrial espionage, when the powerful and influential make use of whatever apparatus is at their disposal to spy on and turn against their own. It is not our blanket assertion that all influential and powerful people must necessarily be corrupt. It is fundamentally a matter of control revolving around authority, access, and opportunity. We return then, to the perennial question of who will guard the guards themselves: Quis custodiet ipsos custodes?

Even uniquely enlightened persons such as Siddhartha Gautama and Jesus of Nazareth needed private space not only to engage inwardly and to reflect on their respective missions, but also to do discrete battle with their respective “temptations.” Uberveillance makes private space inch-by-inch obsolete [8]. Private space is that location that we all, saint and sinner alike, need – to make our mistakes in secret, to mature into wisdom, and to discover what we are and are not capable of. In losing large chunks of our privacy we are also forfeiting a critical component of our personal identity, which for a substantial group of philosophers, following on from John Locke, is “the identity of consciousness” [9]. There is, then, the potential for personality disorders to develop, particularly anxiety disorders or phobic neuroses.

The unbridled rush and push to create the transparent society, as David Brin [10] very well described it, has social implications that are largely ignored, or at best marginalized. The social implications of information security measures that are connected to never-ending surveillance or indeed to other network applications have serious and often irreversible psychological consequences, of which only a few can be cited here: increased cases of mental illness (new forms of obsessive compulsive disorder and paranoia); a rise in related suicides; decreased levels of trust (at all spheres of relationships); and the impossibility of a “fresh start.” The traditionally received idea of the unconditional absolution of sin [11] in the secrecy of the confessional already does not exist in the world of some religious communities; believers are encouraged to log on and to “confess” online [12], [13]. These types of social networks are especially dangerous for individuals already battling mental illness, who might afterwards deeply regret having uploaded imaginary or real indiscretions for everyone to read.

The author of a noteworthy article published in Newsweek [14], commenting on the high-profile suicides of two internationally recognized digital technologists, Theresa Duncan and Jeremy Blake, put it well when he surmised “for some, technology and mental illness have long been thought to exist in a kind of dark symbiosis.” The startling suicides first of Duncan and soon after that of her partner Blake, for whom “the very technologies that had infused their work and elevated their lives became tools to reinforce destructive delusions,” are a significant, albeit sad reminder that even those heavily involved in new technologies are not immune from delusional and paranoid torment, whether based on fact or not.

And that is precisely the point: with covert shadowing you can never be completely sure that your paranoia is groundless. Long-term research at a clinical level remains to be conducted on the subject of never-ending surveillance and mental illness. There is some evidence to suggest that a similar paranoia played at least some part in another shocking suicide, that of the Chinese American novelist and journalist Iris Chang [15], the author of The Rape of Nanking.

Iris Chang promoting her book "The Rape of Nanking". In Wikipedia we read:

It was later discovered that she had left behind three suicide notes each dated November 8, 2004. "Statement of Iris Chang" stated:

I promise to get up and get out of the house every morning. I will stop by to visit my parents then go for a long walk. I will follow the doctor's orders for medications. I promise not to hurt myself. I promise not to visit Web sites that talk about suicide.

The next note was a draft of the third:

When you believe you have a future, you think in terms of generations and years. When you do not, you live not just by the day — but by the minute. It is far better that you remember me as I was—in my heyday as a best-selling author—than the wild-eyed wreck who returned from Louisville. ... Each breath is becoming difficult for me to take—the anxiety can be compared to drowning in an open sea. I know that my actions will transfer some of this pain to others, indeed those who love me the most. Please forgive me.

The third note included:

There are aspects of my experience in Louisville that I will never understand. Deep down I suspect that you may have more answers about this than I do. I can never shake my belief that I was being recruited, and later persecuted, by forces more powerful than I could have imagined. Whether it was the CIA or some other organization I will never know. As long as I am alive, these forces will never stop hounding me.

Days before I left for Louisville I had a deep foreboding about my safety. I sensed suddenly threats to my own life: an eerie feeling that I was being followed in the streets, the white van parked outside my house, damaged mail arriving at my P.O. Box. I believe my detention at Norton Hospital was the government's attempt to discredit me.

The application of technology is rarely unbiased. Once a technique [16] is set in motion and diffused into our society it progressively becomes irreversible, particularly given the key component of interoperability and the vast amounts of capital invested in twenty-first century machinery. However, our comprehension of this hi-tech diffusion is not on commensurate levels. Cross-disciplinary discourse, public debate, and legislation lag far behind the establishment of the infrastructure and application of the technology. In simple terms, this lag is the “too much change in too short a period of time,” which Alvin Toffler famously referred to as “Future Shock” [17].

The situation is, unfortunately, reminiscent of that time in Alamogordo, New Mexico, in 1945, when some of those engaged in the Manhattan Project, including one of the group's top physicists, the Nobel laureate Enrico Fermi, were taking side bets on the eve of the test on whether they would “ignite the atmosphere” once the atomic bomb was tested [18]. But the “fallout” from uberveillance is distributed, and it will initially, at least, be invisible to all except the approved operators of the data vacuum. The setting and foreboding of notable dystopian novels, which warn of “dangerous and alienating future societies” – Yevgeny Zamyatin's We (1921), Aldous Huxley's Brave New World (1932), Ayn Rand's Anthem (1938), George Orwell's 1984 (1949), Ray Bradbury's Fahrenheit 451 (1953) – where “dissent is bad” and the deified State “knows all” is being gradually realized. This is especially worrying, for as Noam Chomsky and others point out, we are concurrently witnessing a “growing democratic deficit” [19], [20].

Great strides are also being made in the field of biomedical engineering in the application of engineering principles and techniques to the medical field [21]. New technologies will heal and give hope to many who are suffering from life-debilitating and life-threatening diseases. The broken will walk again. The blind will see. The deaf will hear. Even bionic tongues are on the drawing board. Hearts and kidneys and other organs will be built anew. The fundamental point is that society at large must be able to distinguish between positive and negative applications of technological advancements before we diffuse and integrate such innovations into our day-to-day existence.

Nanotechnology, which is behind many of these marvelous medical wonders, will interconnect with the surveillance field and quite literally make the notion of “privacy” – that is, revealing ourselves selectively – an artifact. We must do whatever is in our lawful power to check, mitigate, and legislate against the unwarranted and abusive use of uber-intrusive surveillance applications. We are talking about applications with such incredible capabilities that they will potentially have the power to dehumanize us and reach into the secret layers of our humanity. These are not unruly exaggerations when we consider that wireless sensors and motes, body area networks (BANs), and brain-computer interfaces (BCIs) are already established technologies and that the era of mind control, particularly through pioneering advancements in brain-scanning technology, is getting steadily closer.

The incongruity behind traditional surveillance technologies is that, generally, individuals of power and influence are not subjected to the extreme and exaggerated types of surveillance techniques designed and planned for everyone else.

The argument most often heard in the public domain is “if you have nothing to hide, why worry?” There are, however, at least three problems with this popular mantra. First, freedom implies not only being “free of chains” in the practical sense, to be permitted to go about one's daily business freely and without undue constraint, but nowadays also without your every move being tracked, monitored, and recorded.

Second, there is a metaphysical freedom connected to trust, which also implies to be able to dream, to think, and to believe without outside coercion.

And finally, whether we care to admit it or not, we all have something to hide. Disruption of any of these freedoms or rights would affect our decision-making processes and contribute to unhealthy personality development where what we “want” to do (or engage in) becomes what we think we must do (and theatrically engage in).

To artificially build a personality or to hold on to a set system of synthetically engineered beliefs is to deconstruct the human entity to the point where both initiative and creativity (two key components of a healthy individual) are increasingly diminished, and ultimately eradicated. Humancentric implants for surveillance will alter the “inner man” as much as the externals of technological innovation will transform the “outer man.” There are those who would argue that the body is obsolete and should be fused with machines; there are others who would support mind and identity downloading. In the context of such futuristic scenarios, Andrew Ross has aptly spoken of the “technocolonization of the body” [22]. Others on the cutting edge of the digital world are using technology in ways supposedly never intended by the manufacturers.

If the elements of this discussion that might point to the potential mushrooming of new totalitarian regimes seem paradoxical – after all, we are living and reveling in a postmodern and liberal society where the individual cult on a mass scale is idolized and thriving – then we should stand back for a moment and reconsider the emerging picture. Two prominent features of the murderous regimes of Stalin and Hitler were the obsession with state secrecy and the detailed collection of all sorts of evidence documented in scrupulous registers [23]. Related to this collection of information was the well-known and beastly numbering of minorities, prisoners, and political dissidents. In our time, as privacy experts such as David Lyon warn, this type of “social sorting” is in evidence once more [24]. Where are we heading today? In response, a number of states in the United States (including North Dakota and Wisconsin) have already passed anti-chipping bills banning the forced implantation of RFID tags or transponders into people [25].

In 1902 Georges Méliès' short science-fiction film A Trip to the Moon (Le Voyage dans la Lune) spawned the fantastic tradition of putting celluloid form onto the predictive word. A more recent representative of this tradition is James Bond in Casino Royale (2006). In this movie, Bond becomes a “marked” man, chipped in his left arm, just above the wrist, by his government minders. “So you can keep an eye on me?” the famous spy sarcastically rejoins. The chip is not only for identification purposes but has multiple functions and applications, including the ability to act as a global positioning system (GPS) receiver for chronicling his every move. Later in the film, when Bond is captured by his arch-nemesis, the banker Le Chiffre, he has the microchip, which looks more like a miniature spark plug, cut out of his arm with a blade. These kinds of scenarios are no longer the exclusive domain of the novelist, the conspiracy theorist, the religious apocalypticist, or the intellectual property of the tech-visionary. We have the ability and potential to upgrade these information-gathering mechanisms to unprecedented and sci-fi proportions.

Unique lifetime identifiers are more touted than ever before by both the private and public sectors as they have become increasingly synonymous with tax file and social security numbers. The supposed benefits of this permanent cradle-to-grave identification are energetically broadcast at various national and international forums, and especially in the contexts of white collar crime and national security. We are living in times in which commercial innovations will possibly match the internal complexity of the neuron with the help of the appositely called “labs-on-chips.” Writers dealing with these subjects have been speaking less of future shock and more along the lines of hyper-future shock. The key question, so far as identification and information-gathering technology is concerned, is: How are we as a concerned and informed community going to curb and regulate the broad dispersal and depth-charged reaches of surveillance? And how are we going to do this without denying the many positive and desirable applications of the infrastructures that underlie these technologies, particularly in the domain of healing the sick and the injured?

A great deal of this discussion should revolve around the related ethics of emerging technologies, and as we have noted, this discourse is especially critical when we consider the “unintentional” and hidden consequences of innovation. However, one of the methodological weaknesses in this global debate is the direct focus by some of the interlocutors on meta-ethics alone. What we must understand, if we are to make any practical progress in our negotiations, is that this subject must first be approached from the perspective of normative and applied ethics. The lines of distinction between all three of these approaches will at times remain unclear and even merge, but there are some litmus tests (human rights for example) for determining the morality and the ultimate price of our decisions.

Readers might well be asking what technology has to do with some of the metaphysical issues that we are raising here. Perhaps it would be sensible to periodically remind ourselves, as a recent discerning researcher also has pointed out [26], that two of our greatest thinkers, Plato and Aristotle, both warned of the inherent dangers of glorifying techne (art, skill). Techne should be subject to “reason and law”. Furthermore, Plato and Aristotle argued that techne represents “imperfect human imitation of nature.” The pertinent question in this instance might be why modern societies have gradually moved away from asking about or seeking out these metaphysical connections. Such general apathy, with a few honorable exceptions, towards a philosophical critique of technology can probably be traced to a defensive response of the western economic tradition to Karl Marx's “critique of Victorian progress.”

In relation to surveillance and to ubiquitous location determination technologies, we are at a critical junction; some might well argue that we long ago decided which road to travel. Maybe these commentators are right. Perhaps there is no longer a place for trusty wisdom in our world. Just the same, full-scale uberveillance has not yet arrived. We must moderate the negative fallout of science and control technology and, as Jacques Ellul [16] would say, “transcend” it: lest its control of us become non-negotiable and we ourselves become the frogs in the slowly warming water.

The insightful and expertly considered papers that follow were presented at the IEEE-SSIT International Symposium on Technology and Society (ISTAS) 2010, in Wollongong, Australia. In one way or another, each of the writers directly investigates issues related to both the technological and social implication spheres broached in this paper. Though their approaches or methodologies might differ in some evident places, they all agree that the rapid pace of the development and application of new surveillance techniques, without due diligence and the involvement of the scientific and public communities at large, has the built-in potential for great disaster in terms of societal loss of privacy, erosion of freedoms, and disintegration of trust.

Acknowledgment

Excerpts of this article were originally published in Quadrant magazine [27] in 2009. Quadrant is Australia's leading intellectual journal of ideas, literature, poetry, and historical and political debate.

Keywords

Special issues and sections, Social network services, History, Technological innovation, Surveillance

Citation: MG Michael, Katina Michael, 2011, IEEE Technology and Society Magazine, Vol. 30, No. 3, Fall, 2011, pp. 13-17, DOI: 10.1109/MTS.2011.942312

Using a social informatics framework to study the effects of location-based social networking on relationships between people: A review of literature

Abstract

This paper is predominantly a review of literature on the emerging mobile application area known as location-based social networking. The study applies the social informatics framework to the exploratory question of what effect location-based social networking may have on relationships between people. The classification model used in the paper brings together previous research on location-based services and online social networking. Specifically, the wider study is concerned with literature that identifies the impact of technology on trust with respect to friendship. This paper attempts to draw out the motivations behind using location-based social networking applications and the implications these may have for individual privacy and, more broadly, for one's social life. It relies heavily on the domain of social informatics, with a view to establishing a theoretical underpinning for the mutual shaping between context and information and communication technology design.

Section 1. Introduction

The purpose of this paper is to provide a review of the relevant literature on the effects of location-based social networking (LBSN) on relationships between people. There are three main areas of literature reviewed. The first area is literature related to the domain of social informatics. The purpose of reviewing this literature is to guide the conduct of the wider research study. The second area of literature reviewed is the social informatics based studies on online social networking (OSN), location based services (LBS), and location based social networking (LBSN). The purpose of reviewing the literature on online social networking and location based services is that these technologies precede location based social networking. LBSN is the composite of LBS and OSN, and therefore the literature on each of these technologies provides insight into core concepts related to location based social networking. The intersection between LBS, OSN and LBSN also uncovers an area which has been under-researched, predominantly due to its newness in the field of information and communication technology (ICT). The third area of literature reviewed by this research is the literature on trust and friendship. The purpose of briefly reviewing this literature is to provide an outline of the social theory that forms the background of the wider study. Prior to reviewing the literature, a classification model is presented which summarizes the literature in the domain, in addition to providing a roadmap for this paper.

Section 2. Background

Location Based Social Networking (LBSN) applications such as Google Latitude, Loopt and BrightKite enhance our ability to perform social surveillance. These applications enable users to view and share real-time location information with their “friends”. LBSN applications offer users the ability to look up the location of another “friend” remotely using a smartphone, desktop or other device, anytime and anywhere. Users invite their friends to participate in LBSN, and a process of consent follows. Friends have the ability to alter their privacy settings to allow their location to be monitored by another at differing levels of accuracy (e.g., suburb, pinpoint at the street address level, or manual location entry). Individuals can invite friends they have met in the physical space, friends they have met virtually in an online social network, their parents, their siblings, their extended family, partners, even strangers to join them in an LBSN setting.
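The per-friend accuracy settings described above can be thought of as a simple disclosure policy. The following Python sketch is purely illustrative: the function names, accuracy levels, and coordinate-rounding choices are our own assumptions, not the actual behavior of Google Latitude, Loopt, or BrightKite.

```python
# Hypothetical sketch of per-friend location granularity in an LBSN.
# Accuracy levels and rounding factors are illustrative assumptions only.

def coarsen(lat, lon, level):
    """Return a location disclosed at the requested accuracy level."""
    if level == "hidden":
        return None                                # share nothing
    if level == "suburb":
        return (round(lat, 1), round(lon, 1))      # ~11 km grid cells
    if level == "street":
        return (round(lat, 3), round(lon, 3))      # ~110 m grid cells
    return (lat, lon)                              # pinpoint accuracy

# Each user keeps a per-friend setting, consulted before any disclosure.
settings = {"alice": "street", "bob": "suburb", "carol": "hidden"}

def location_for(friend, lat, lon):
    # Unknown requesters default to "hidden", i.e., deny by default.
    return coarsen(lat, lon, settings.get(friend, "hidden"))
```

The deny-by-default lookup reflects the consent process the paper describes: no location is revealed to anyone who has not been explicitly granted a disclosure level.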

With the emergence of this technology it is crucial to consider that “technology alone, even good technology alone is not sufficient to create social or economic value” [1]. Further to not contributing “sufficient” economic or social value, Kling and other scholars have identified that technologies can have negative impacts on society [2]. Consider the case of persons who have befriended each other in the virtual space, only to meet in the physical space and to encounter unforeseen consequences by doing so [3]. As location based social networking technologies are used between what is loosely termed “friends,” they have the potential to impact friendships, which are integral not only to the operation of society but also to the individual's well being [4].

Section 3. Classification Model

The classification model of the literature review expressed in Figure 1 summarizes the current social informatics based scholarship on location based services, online social networking and location based social networking applications. The arrows indicate the researchers' view that location based social networking applications are novel in that they have been designed to provide additional functionality for social networking. The classification model also summarizes the scholarship on trust and technology and introduces the social theory of trust and friendship. The purpose of reviewing this literature is first to identify studies relating trust to LBS and OSN, and then to understand how technology has the potential to impact upon human trust. It must be stated upfront, however, that studies relating to this particular research question are scarce, given that the first popular LBSN application was launched at the beginning of 2009 [5], with only beta applications existing in August 2008. Second, the purpose of reviewing the literature on trust and friendship is to develop a social theory to inform the research.

Figure 1. Classification Model

In order to logically understand the literature it is organized in a top-down approach. First the paper addresses enquiries in the domain of social informatics. Second the literature on online social networking and location based services is reviewed, providing a background to the types of issues pertinent to location based social networking. The review of the literature specifically on LBSN then follows. Once the gap in current research is presented, previous works on ‘trust and technology’, and ‘trust and friendship’ are presented.

Section 4. Socio-Technical Network Influences

The social implications of technologies have been explored under several different theoretical frameworks, including technological determinism, social shaping of technology, critical information theory and social informatics. This research adopts the approach of social informatics. Thus the overall aim of the research is to engage in a holistic and empirical study of the ‘consequences’ of location based social networking applications. This section provides a definition and outline of social informatics, how and why it has developed and how it can be used as a framework for further research. This section concludes with a justification for the adoption of this particular approach against a backdrop of other possible theories.

4.1. Definition of Social Informatics

Social informatics research focuses upon the relationships between information and communication technologies (ICTs) and the larger social context they exist within [6]. The Encyclopedia of Library and Information Science defines social informatics as [7]:

“the systematic, interdisciplinary study of the design, uses and consequences of information technologies that takes into account their interaction with institutional and cultural contexts. Thus, it is the study of the social aspects of computers, telecommunications, and related technologies, and examines issues such as the ways that IT shape organizational and social relations, or the ways in which social forces influence the use and design of IT… Social Informatics research strategies are usually based on empirical data… [and] use data to analyze the present and recent past to better understand which social changes are possible, which are plausible and which are most likely in the future.”

One of the key concepts underlying the approach of social informatics is that information and communication technologies are not designed in social isolation; a social context does exist, and it influences the manner in which ICT is developed and used and, ultimately, the social impact it has [7].

4.2. The Development of Social Informatics

Social informatics research was born from the dissatisfaction with previous information systems research methods that were focused on either exploring the deterministic effects of technology upon society, or society upon technology. These theories are respectively referred to as technological determinism and social shaping of technology.

Technological deterministic research studies focus on the impact of technology upon society. The research approach aims to answer questions such as:

“What would be the impact of computers on organizational behavior if we did X? What would be the changes in social life if we did X? Will computer systems improve or degrade the quality of work?… ‘What will happen, X or Y?’ The answer was, sometimes X, and sometimes Y. There was no simple, direct effect” [8].

Technological determinism has failed to produce satisfactory predictions, and this has led to the formation of social informatics research [9]. Technological determinism was also seen by the proponents of the social shaping of technology as being only a partial truth, and “oversimplistic” [10].

The social shaping of technology approach proposes that technology is not an autonomous entity, as it is shaped by social forces. This is in direct opposition to technological determinism, which depicts technology as an “autonomous entity, which develops according to an internal logic and in a direction of its own, and then has determinate impacts on society” [11]. Social shaping of technology studies aim to show that technology is in fact a social product: it does not mold society, but rather society molds it, and this can be seen by investigating the social forces at play in the creation and use of technology [12]. Examples of approaches in the social shaping of technology include the social construction of technology and actor network theory. These theories focused on the role of either knowledge or actors in the development of technology. Technological determinism focuses on the impacts of technology, while the social shaping of technology focuses on the context. Social informatics, on the other hand, “investigates how the influences and nodes in a sociotechnical network shape each other” [13].

Social informatics does not ask deterministic questions (‘What will happen, X or Y?’); instead, social informatics researchers ask ‘When will X happen? And under what conditions?’, providing a nuanced conceptual understanding of the operation of technology in social life [9]. In contrast to technological determinism and social shaping of technology theories, the social informatics framework highlights the mutual shaping of technology and society, both molding each other at the same time.

4.3. Examples of Social Informatics Research

Figure 2. Bidirectional Shaping between Context and ICT Design

Social informatics takes a nuanced approach to investigating technologies and explores the bidirectional shaping between context and ICT design, implementation and use [13] (Figure 2). This approach, which combines the social aspects and the technical aspects of technology, has been found to be useful for understanding the social shaping and ‘consequences’ of information communication technologies [9]. Examples of social informatics research include the vitality of electronic journals [14], the adoption and use of Lotus Notes within organizations [15], public access to information via the internet [16], and many other studies. Social informatics research also investigates new social phenomena that materialize when people use technology, for example, the unintended effects of behavioral control in virtual teams [17]. Research falling in this area is perceived as the future direction for social informatics research [9].

4.4. Social Informatics as a Framework

Social informatics is not described as a theory, but as a “large and growing federation of scholars focused on common problems”, with no single theory or theoretical notion being pursued [13]. What social informatics does provide is a framework for conducting research. What follows is a description of the framework, its key elements and distinguishing features.

4.4.1. Key Features of Social Informatics Research

Social informatics research is problem-oriented, empirical, theory-based and interdisciplinary, with a focus on informatics (Table 1). In addition, there are several key distinguishing features of the framework. First, social informatics does not prescribe a specific methodology, although the majority of methods employed by researchers in this field are qualitative. Second, social informatics is inclusive of normative, analytical or critical approaches to research. Third, this type of research “investigate[s] how influences and nodes at different levels in the network shape each other” [13], engaging in analysis of the interconnected levels of the social context. Fourth, research in this field can be seen to fall within three broad themes:

  1. ICT uses lead to multiple and sometimes paradoxical effects;

  2. ICT uses shape thought and action in ways that benefit some groups more than others, and these differential effects often have moral and ethical consequences; and

  3. a reciprocal relationship exists between ICT design, implementation, use and the context in which these occur [13].

When adopting the framework of social informatics, the main focus of social informatics should not be overshadowed. The research should be focused upon the idea that “ICT are inherently socio-technical, situated and socially shaped” [18] and that in order to understand their impacts we need to explore, explain and theorize about their socio-technical contexts [13].

Table 1. Key Features of Social Informatics Research (adapted from [13])

4.5. Justification for Using the Social Informatics Framework

There are two primary justifications for adopting a social informatics approach. First, the goals and achievements of social informatics accord with the researchers' goals and motivation. Second, the holistic method of enquiry adopted by social informatics research provides meaningful data. Social informatics researchers aim to develop “reliable knowledge about information technology and social change based on systematic empirical research, in order to inform both public policy issues and professional practice” [8]. This is in accordance with the researchers' goal to identify the credible threats that LBSN poses to friends and society, with a view to preventing or minimizing their effect. Social informatics research has also developed an “increased understanding of the design, use, configuration and/or consequences of ICTs so that they are actually workable for people and can fulfill their intended functions” [9]. In essence, this is the primary motivation behind this study: to increase our understanding of location based social networking so that it can be workable and fulfill its intended function in society without causing individuals harm.

The method of enquiry adopted by social informatics researchers is usually based on conducting a holistic and interdisciplinary investigation into the bidirectional relationship between context and ICT design, use and implementation. This study takes into account the social theory surrounding trust and relationships, thus providing meaningful data on the implications of location based social networking for trust. For Kling, the fact that information and communication technologies were increasingly becoming enmeshed in the lives of more and more people created a pressing need to explore the ultimate social consequences of the ensuing changes [8]. Kling considered that studying new and emerging applications early in the process of diffusion granted significant opportunities to shape the forms and uses of new technologies.

4.6. Alternative Theories and Approaches to the Study of the Social Implications of Technology

Two alternative approaches to social informatics were discussed in section 4.2, i.e., technological determinism and the social shaping of technology. A third possible theory that was considered was critical social theory (founded by Jürgen Habermas). Critical social theory has four distinct attributes: (1) it is sensitive to lifeworlds of the organizational actors and is oriented to interpreting and mapping the meanings of their actions from their perspectives, (2) adopts pluralistic methods, (3) does not separate the subjects of inquiry from their context and (4) recognizes that the context is not only important to meaning construction, but to social activity as well [19]. Thus, we can say, that critical social theory is similar to social informatics in three main ways: (1) both approaches are sensitive to the context surrounding the subject of enquiry, (2) both focus on the inter-relationship between context and subject, and (3) both approaches employ pluralistic methods. However, the main focus of the two approaches is markedly different.

Critical information theory focuses on “questioning the conventional wisdom of prevailing schools of thought and institutional practices with a primary focus on issues related to justice and power” [20]. In applying this kind of approach to ICT we would be aiming to “discover and expose attempts to design and (mis)use IS to deceive, manipulate, exploit, dominate and disempower people” [21]. This is not the aim of the research problem presented here. While location based social networking can admittedly cause harm if misused (e.g., stalking by ex-partners), it can also be incredibly beneficial (e.g., on a family holiday in a foreign country). Thus, the aim of the research is to understand both the positive and negative implications of the use of location based social networking in society, not just to look at issues of justice and power.

The following section provides an overview of the key literature on the use, design, implementation, context and implications of online social networking, location based services, and location based social networking.

Section 5. Online Social Networking Sites

Current studies on online social networking sites use varied methods involving case studies, surveys, interviews and observations to investigate the use, implications, design and context of the emerging application. The literature on OSN falls into three broad areas of study: (1) purpose, motivation and patterns of use, (2) effect on interpersonal relationships, and (3) threats to privacy, trust and security.

5.1. Purpose, Motivation and Patterns of Use

These studies on online social networking outline the purpose for which OSN is used, the motivation behind an individual's use of OSN, and how users go about the adoption of OSN applications.

5.1.1. Purpose of Online Social Networking

The purpose of OSN has been identified as the public articulation of individual social connections [22], the creation of an information ground [23] or a means of satisfying “our human tendencies towards togetherness” [24]. Boyd's study on Friendster users revealed that OSN “reshaped how groups of people verbally identify relationships and solidified the importance of creative play in social interactions” [22]. Boyd identified the value of networks and how users presented themselves on Friendster, examined who users connected with, from existing friends to “hook-ups” to “familiar strangers,” and highlighted the dilemma caused by fakesters in the network.

Counts and Fisher's study explored OSN, exposing the “types and usefulness of information shared in everyday life, the way the system fits into participants' communication and social ‘ecosystem’ and the ways in which the system functions as an information ground” [23]. More than just a source of information, OSN also functions to provide “a logical extension of our human tendencies towards togetherness” [24]. Weaver and Morrison performed case studies on four social networking sites (MySpace, Facebook, Wikipedia and YouTube) to explore the range of socialization that can occur, revealing the core purpose of connecting to people.

5.1.2. Motivation Behind the Use of Online Social Networking

Lampe, Ellison and Steinfield have conducted two major survey studies on the use of OSN. The first study was in 2006, and the second was in 2008. The purpose of the first study was to answer the question - “Are Facebook members using the site to make new online connections, or to support already existing offline connections?” The results revealed that Facebook users are primarily interested in increasing “their awareness of those in their offline community” [25]. The second study incorporated three surveys and interviews in order to explore whether the use, perception of audience and attitudes of users of Facebook changed over time with the introduction of new features to Facebook. The results again revealed that the primary use of Facebook was to maintain existing offline connections, in order to: keep in touch with friends, learn more about existing classmates and people that users have met socially offline [26]. Both studies were conducted upon undergraduate university populations.

Joinson [27] performed a use and motivation study on a random sample of Facebook users, not limited to campus-based populations, which supported the conclusions of both Lampe, Ellison and Steinfield studies. Furthermore, the study by Joinson probed further, identifying seven unique uses and gratifications of online social networks, including social connection, shared identities, content, social investigation, social network surfing and status updating, and identifying that different uses and gratifications relate differentially to patterns of usage [27].

5.1.3. Patterns of Use of Online Social Networking

Other studies of the use of online social networking have looked at how the information provided by social networking sites can be used to understand patterns of use. Hancock, Toma and Fenner [28] explored how people use information available on social networking sites to initiate relationships. They asked participants to befriend partners via an instant messaging conversation by using profile information readily available on Facebook. This use of asymmetric information revealed that the information helped in linking persons together, but in only 2 out of 133 scenarios did the users realize that the information had been gained from their Facebook profile rather than from the real-time instant messaging conversation(s) they had had with the friend. This study highlighted the rich source of information about the self which is available online, as well as the unintended consequences of others strategically plotting to use that information for their own relational goals.

Online social networking researchers have also explored patterns of use among different groups of people and communities. Ahn and Han [29] investigated the typological characteristics of online networking services. Chapman and Lahav [30] conducted an ethnographic interview study of cross-cultural differences in OSN usage patterns. Results from the interviews identified three dimensions of cultural difference for typical social networking behaviors: users' goals, typical patterns of self-expression and common interaction behaviors. The study was limited to interviews with participants from the United States, France, China and South Korea, and future work is therefore required to evaluate the presented results.

Other studies have explored usage among different age groups. Arjan, Pfeil and Zaphiris [31] explored users' MySpace friend networks with webcrawlers to compare teenage (13–19) networks with those of older people (60+). The findings showed that teenage users had larger networks with more users of the same age than older users. Furthermore, when representing themselves online, teenagers used more self-references, negative emotions and cognitive words than older people. The limitations of this study are the small sample size and the limited frame of reference, that is, the differences between teenagers and older people without reference to intermediate age groups. Another study, by Schrammel, Köffel and Tscheligi [32], surveyed users of various online communities to explore information disclosure behavior in different types of online communities. They identified that users disclose more information in business and social contexts, with students being more freehanded with information than employed people, and females being more cautious than males. Studies relating to the use of OSN have also explored its potential application to other contexts including the workplace [33][34], student learning [35], citizen involvement [36] and connecting women in information technology [37].

5.2. The Effect of Online Social Networking on Interpersonal Relationships

Online social networking is used in the context of being social, creating connections with users and expanding networks [38]. The implications of using OSN to create or maintain relationships have been explored by several researchers, highlighting the nature of intimate online relationships and social interactions as well as the benefits and detriments of the use of OSN upon relationships. Boyd's study concentrated on intimacy and trust within the OSN site Friendster, highlighting that intimate computing hinges upon issues surrounding trust: trust in the technology, and ultimately trust in the other users to operate by the same set of rules [39]. Dwyer [40] has presented a preliminary framework modeling how attitudes towards privacy and impression management translate into social interactions within MySpace. Other issues explored in the literature include whether interactions between users flow from the declaration of friends, and whether users interact evenly or lopsidedly with friends. These questions were explored by Chun et al. in a quantitative case study of the OSN site Cyworld, which reported a high degree of reciprocity among users [41].
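The notion of reciprocity in a directed interaction log can be made concrete with a short sketch. The function and toy wall-post log below are our own illustration of one plausible measure, not the metric used in the Cyworld study itself.

```python
def reciprocity(interactions):
    """Fraction of distinct directed pairs (a -> b) that are matched
    by at least one interaction in the opposite direction (b -> a)."""
    pairs = set(interactions)
    if not pairs:
        return 0.0
    return sum((b, a) in pairs for (a, b) in pairs) / len(pairs)

# Toy wall-post log: ann and bob interact both ways; cat never replies.
log = [("ann", "bob"), ("bob", "ann"), ("ann", "cat")]
print(reciprocity(log))  # two of the three directed pairs are reciprocated
```

A value near 1 would indicate the even, two-way interaction Chun et al. reported, while a value near 0 would indicate lopsided communication.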

The benefits and detriments of OSN upon interpersonal relationships have not been extensively explored. A survey of undergraduate university students conducted by Ellison, Steinfield and Lampe [42] identified that using Facebook benefits the maintenance and growth of social capital among “friends” and also improves psychological well-being. However, although OSN sites reinforce peer communication, Subrahmanyam and Greenfield [43] point out that this may be at the expense of communication within the family, expressing the need for further research into the effects of OSN upon real-world communications and relationships.

5.3. Implications of Use: Privacy, Trust and Security

5.3.1. Privacy

Privacy in online social networking sites has received significant attention, with researchers exploring patterns of information revelation and implications upon privacy [44], the use of OSN policies to ensure privacy [45], differences in perceptions of privacy across different OSN [46], the privacy risks presented by OSN [47], mechanisms to enhance privacy on OSN [48], user strategies to manage privacy [49], and the notion of privacy and privacy risk in OSN [50].

The work of Levin and others at Ryerson University (the Ryerson Study) provides the largest survey on usage, attitudes and perceptions of risk of online social networking sites [50]. The design of the survey incorporated quantitative questions, scenarios and short-answer questions to understand the level of risk and responsibility one feels when revealing information online. This study identified that young Canadians have a unique perception of network privacy “according to which personal information is considered private as long as it is limited to their social network” [50]. A further contribution of this study, along with other privacy studies [44][46], is the implication of the use of online social networking sites upon trust.

5.3.2. Trust

There are very few studies that explore the concept of trust in online social networking. The majority of studies which do look at trust focus upon algorithms [51] or frameworks [52] that provide users of OSN with trust ratings. A few other studies have mentioned or examined online social networking sites in terms of their impact upon trust in relationships. Gross and Acquisti [44] have noted that: “trust in and within online social networks may be assigned differently and have a different meaning than in their offline counterparts…[and that] trust may decrease within an online social network”. However, they did not investigate this aspect of OSN further. Three studies have investigated the impact of OSN upon trust. The first, by Dwyer, Hiltz and Passerini [46], compares perceptions of trust and privacy between different OSN applications. The second study, conducted by Ryerson University, identifies the potential for OSN to impact upon trust, and the third study, by Gambi and Reader, is ongoing and aims to determine whether trust is important in online friendships and how it is developed.

Dwyer, Hiltz and Passerini [46] compared perceptions of trust and privacy concern between MySpace and Facebook. Trust was measured with the following two quantitative questions: “I feel that my personal information is protected by [social networking sites]” and “I believe most of the profiles I view on [social networking sites] are exaggerated to make the person look more appealing”. The outcome of the study was focused upon trust in the users and in the online social network itself, but it did not shed light upon the effect of OSN upon trust in relationships.

The Ryerson study provides some exploration into the impact of online social networking sites upon trust in relationships, by presenting scenarios in which users had experienced a loss of trust with other members of the site. The participants were asked whether they had experienced, or knew of someone who had experienced, such a scenario. The first scenario described a user who went out partying; photographs taken of the occasion were displayed on Facebook, resulting in a loss of trust by the user's family. Sixty-four percent of respondents had either experienced this scenario directly or indirectly, or heard of it happening to someone else. The second scenario involving trust concerned a comment posted on a user's wall indicating that the individual had been involved in shoplifting, such that no matter what the user claimed, everyone still believed that he/she was a shoplifter. In this scenario, seventy-six percent of respondents reported that they had not heard of this occurring. The Ryerson study therefore presented a glimpse into the potential effect of the use of online social networking sites upon trust. Another snapshot is provided by Gambi and Reader [53], who administered an online questionnaire to online social networking users to determine whether trust is important in online friendships, and how trust is developed online. Despite the low number of studies in the area of trust and OSN, it is clear from the currency of the three studies that this is an emerging area of research.

5.3.3. Security

Studies in online social networking have explored the impact of OSN on the security of user information and identity. A recent study by Bilge, Strufe, Balzarotti and Kirda [54] identifies the ease with which a potential attacker could perform identity theft attacks upon OSN and suggests improvements in OSN security.

Section 6. Location Based Services

The focus of the literature on location based services, as with social networking, is not on the technological aspects of design but on use and its implications from a social informatics perspective. In this vein, the literature reviewed identified the different contexts of use of LBS and the implications of use, including trust, control, privacy and security.

6.1. Context of Use of Location Based Services

The literature identifies both current and future applications of LBS to track and monitor human subjects. These applications include employee monitoring [55], government surveillance [56], law enforcement [57], source of evidence [58], patient monitoring [59], locating family members for safety [60][61][62], locating students at school [63], identifying kidnapped victims [60], and socializing with friends [64][65]. The following section details the literature on human-centric LBS in terms of its social implications.

6.2. Implications of Using Location Based Services

Michael, Fusco and Michael's research note on the ethics of LBS provides a concise summary of the literature on the socio-ethical implications of LBS available prior to 2008. The research note identifies trust, control, security and privacy [66] as the four implications of LBS. The literature pertaining to each of these implications is described below.

6.2.1. Trust

The literature on trust and location based services has predominantly employed scenarios [67] and theory-based discussion of workplace practices [68], or addressed consumer trust with respect to LBS [69]. To the researcher's knowledge, the investigation of trust and LBS is limited to these works.

6.2.2. Control

Dobson and Fisher provide an account of the concept of “geoslavery”, which is defined as “the practice in which one entity, the master, coercively or surreptitiously monitors and exerts control over the physical location of another individual, the slave” [70]. While Dobson and Fisher provide a theoretical account of the potential for “geoslavery” and the human rights issues which accompany it, Troshynski, Lee and Dourish examine the application of “geoslavery” upon paroled sex offenders who have been tracked using a LBS device [57].

Troshynski, Lee and Dourish's work draws upon two focus groups of paroled sex offenders to explore the ways that LBS frame people's everyday experience of space. The findings from the focus groups draw out the notion of accountabilities of presence. Troshynski et al. define accountabilities of presence as the notion that “[l]ocations are not merely disclosed, rather users are held accountable for their presence and absence at certain time and places” [57]. This presence need not be the actual physical location but the location that is disclosed to the observer. For instance, the paroled sex offenders were “primarily concerned with understanding how their movement appear to their parole officers” [57]. This concept of being held to account is a mechanism of enforcing control.

A handful of studies have mentioned the parallel between LBS and Michel Foucault's Panopticon design for prisons [71][57][72]. The Panopticon prison was designed to be round so that guards could observe prisoners from the centre without the prisoners knowing whether they were being observed. Foucault argued “that the omni-present threat of surveillance renders the actual exercise of power (or violence) unnecessary; the mechanisms of pervasive surveillance induce discipline and docility in those who are surveilled” [57]. LBS represent a modern form of the Panopticon, exerting implicit control through the ability to observe.

6.2.3. Security

LBS can be used to provide security, for example in law enforcement to make “police more efficient in the war against crime” [73] and in border security [63]. However, they can also present a threat to security [74].

6.2.4. Privacy

LBS pose a threat to privacy in the way that information is collected, stored, used and disclosed [75][74][76]. The threat to privacy is further exacerbated by the aggregation and centralization of personal information, enabling location information to be combined with other personal information [77]. However, while privacy is important, a hypothetical study requiring users to “imagine” the existence of an LBS provided evidence that users were “not overly concerned about their privacy” [78]. Two other studies showed that in situations of emergency, individuals are more willing to forgo some of their privacy [60][79].

Section 7. Location Based Social Networking

The current literature on location based social networking explores users' willingness and motivations for disclosing location information and presents several user studies, which draw out different findings on the implications of using LBSN.

7.1. Disclosure of Location Information

Grandhi, Jones and Karam [80] conducted a survey to gauge attitudes towards the disclosure of location information and the use of LBSN applications. The findings from the short survey indicated a general interest in LBSN services. The majority of respondents stated that they would disclose their personal location data, that demographics and geotemporal routines did matter, and finally that social relationships are important in predicting when or with whom individuals want to share personal location data.

7.2. LBSN User Studies

7.2.1. LBSN Studies Based on Perceptions and Closed Environments

Several user studies have been conducted on location based social networking [81]. One of the earliest involved a two-phase study comparing perceived privacy concerns with actual privacy concerns within a closed LBS environment [82]. Barkhuus found that although users were concerned about their location privacy in general, when confronted with a closed environment the concern diminished. Another user study observed the configuration of privacy settings on a work-related location based service [83]. The study found that grouping permissions provided a convenient balance between privacy and control. Moving beyond privacy alone, Consolvo and Smith [84] conducted a three-phase study: first they explored whether social networking users would use location-enhanced computing, second they recorded the response of users to in-situ hypothetical requests for information, and third they asked participants to reflect upon the first two phases. The captured results included what participants were willing to disclose, the relationship between participant and requestor, the effect of where participants were located, the activity or mode, privacy classifications, what people want to know about another's location, and privacy and security concerns. The limitation of this and prior research on LBSN technologies was its hypothetical nature, or that it took place within a controlled environment. The following studies employed actual or tailored LBSN.

7.2.2. Semi-Automated and Customizable LBSN Studies

Brown and Taylor [61] implemented the Whereabouts Clock, a location based service which displayed the location of family members on a clock face with four values: at any given point in time, an individual had the status of being at home, at work, at school, or elsewhere. This study revealed that LBSN within the family context could help coordination and communication and provide reassurance and connectedness, although it also caused some unnecessary anxiety. Privacy was found not to be an issue among family members using the Whereabouts Clock. The LBSN technology used in this study was more sophisticated than in prior studies but rather limited in geographic granularity.

Humphreys performed a year-long qualitative field study on the mobile social network Dodgeball, which allowed users to ‘check in’ at a location that was then broadcast to people on their given network. The outcomes of this study revealed patterns of use of LBSN, the creation of a “third space” by LBSN, and the resultant social molecularization caused by Dodgeball use [85]. The limitation of this study again lies in the technology employed; the location information was not automated or real-time, as Dodgeball required the user to consciously provide manual location updates.

Barkhuus and Brown [86] conducted a trial of Connecto in order to investigate the emergent practices around LBSN. Connecto allowed users to tag physical locations; the phone would then automatically change the user's displayed location to the tagged location, providing a closer simulation of real-time automated LBSN. The outcomes of this study demonstrated that users could use Connecto to establish a repartee and were self-conscious about the location they disclosed. By publishing their location, users were found to engage in ongoing story-telling with their friends via a process of mutual monitoring. This act was seen as “part of friendship relations” and added to an “ongoing relationship state.” There was also the expectation that users had to “have seen each others' location or else risk falling ‘out of touch’ with the group” [86].

7.2.3. Real-time LBSN Studies

LBSN studies published after 2008 use methods that take advantage of sophisticated real-time automated LBSN applications. Tsai and Kelley [87] developed Locyoution, a Facebook application which automatically located user laptops using wireless fidelity (Wi-Fi) access points and Skyhook positioning technology. The aim of the study was to investigate how important feedback is for managing personal privacy in ubiquitous systems. Participants were divided into two groups: one group received no information about who had requested their location, while the other group was able to view their location disclosure history. The four major findings of the study were that (1) providing feedback to users makes them more comfortable about sharing location, (2) feedback is a desired feature and makes users more willing to share location information, (3) time- and group-based rules are effective for managing privacy, and (4) peers and technical savviness have a significant impact upon use.
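The third finding, that time- and group-based rules are effective for managing location privacy, can be sketched as a simple rule check. The group names, time windows and function below are hypothetical illustrations, not drawn from the Locyoution system itself.

```python
from datetime import time

# Hypothetical disclosure rules: each requester group may see the
# user's location only inside an allowed time window.
RULES = {
    "friends":    (time(9, 0), time(22, 0)),
    "colleagues": (time(9, 0), time(17, 0)),
}

def may_disclose(group, at):
    """Return True if a requester in `group` may see the location at clock time `at`."""
    window = RULES.get(group)
    if window is None:          # unknown requesters are always refused
        return False
    start, end = window
    return start <= at <= end

print(may_disclose("colleagues", time(12, 30)))  # True: within working hours
print(may_disclose("colleagues", time(20, 0)))   # False: outside the window
print(may_disclose("stranger", time(12, 0)))     # False: no rule for this group
```

Rules of this shape let a user reason about disclosure per audience rather than per request, which is what made them effective for the study's participants.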

Vihavainen and Oulasvirta [88] performed three field trials of Jaiku, a mobile microblogging service that automates the disclosure and diffusion of location information. The focus of the field trials was on investigating the use, user response and user understanding of automation. The results revealed that automation caused issues related to control, understanding, emergent practices and privacy. This study is significant as one of the first to investigate the implications of automated location disclosure upon user perceptions. It does not, however, investigate the implications of the use of automated LBSN upon social relationships.

An ethnographic study by Page and Kobsa explored people's attitudes towards and adoption of Google Latitude, a real-time and automated LBSN. The focus of this study was upon “how participants perceive[d] Latitude to be conceptually situated within the ecology of social networking and communication technologies” [65], based upon technology adoption, social norms, audience management, information filtering and benefits. This study, while innovative, presented preliminary results based upon 12 interviews of users and non-users of Latitude.

The user studies conducted upon LBSN have matured over time, with more recent studies employing sophisticated LBSN which provide automated real-time location disclosure. These studies provide insight into user perceptions and use of LBSN; however, issues of control, security and trust have been neglected, although they are becoming increasingly pertinent to both location based services and online social networking technologies. Furthermore, there has been no more than a cursory investigation into the implications of using LBSN upon social relationships.

Section 8. Towards a Study Investigating the Social Implications of LBSN on Relationships

Location based social networking is an emerging and evolving technology with current applications still very much in their infancy. Previous works reflect the state of the technology in late 2008, utilizing hypothetical scenario methods or unsophisticated non-real-time incarnations of LBSN. While new research has begun to utilize more sophisticated mobile software applications such as Google Latitude, a sober full-length study is absent from the literature. The need for such a study is escalating, however, as LBSN applications proliferate and more mobile Internet users become aware of LBSN and/or adopt the technology. What remains to be explored in the area of LBSN are the concepts of control, security and trust, and the effect of these emerging technologies upon social relationships.

In the months between February and May 2010, the number of fully-fledged LBSN applications more than doubled from fifty to over one hundred [89]. This is a substantial increase when one considers that in late 2009 there were about 30 functional LBSN applications, of which only about 8 were generally considered usable, reliable or worth using. Today, innovative developers are simply piggybacking on top of the Google platform and offering niche LBSN applications targeted at dating services, adventure sports, hobbyists, expertise and qualifications, and other demographic profiling categories. Table 2 shows a list of over 100 LBSN applications. Although this is not an exhaustive list, one can only imagine the potential for such services, and the unforeseen consequences (positive and negative) that may ensue from their widespread adoption.

TABLE 2. A List of LBSN Applications [89]

8.1. Trust and Technology

Many studies concerning trust and technology focus upon trust in technology. Trust is an important aspect of human interaction, including human interaction with technology; however, that interaction is a two-way event, and only minimal research has been undertaken to observe the impact of technology upon trust. Two studies have been found which focus upon the effect of technology upon trust.

Vasalou, Hopfensitz and Pitt [90] examined how trust can break down in online interactions, whether through intentional acts, unintentional acts or exceptional acts. Their paper, titled “In praise of forgiveness: ways for repairing trust breakdowns in one-off online interactions”, also proposes methods for fairly assessing the offender, to determine whether the offender committed an intentional act that resulted in the trust breakdown or whether the act was unintentional or exceptional.

The second study that looked at the effect of technology on trust was conducted by Piccoli and Ives [17], who explored trust and the unintended effects of behavior control in virtual teams. This study was based upon observations of the conduct of virtual teams. The findings showed that behavior control mechanisms increase vigilance and make salient those instances in which individuals perceive team members to have failed to uphold their obligations [17].

8.2. Social Theory

Social informatics studies incorporate social theory into the study of technology. This research will incorporate the theory of trust and its importance within friendships.

8.2.1. Trust

Trust is defined as the willingness of an individual to be vulnerable where there is the presence of risk and dependence or reliance between the parties [91]. There are two important things to note about this definition of trust. First, trust is not a behavior or choice but a state of mind in which the individual is willing to make themselves vulnerable. Second, trust is not a control mechanism but a substitute for control [92], although the relationship between trust and control is more complex than this [93]. In order to understand trust more fully it is important to understand the bases upon which trust is formed and the dynamic nature of trust.

Trust is formed upon three bases: (1) cognitive, (2) emotional or relational, and (3) behavioral [94]. The cognitive basis of trust refers to the “evidence of trustworthiness” or “good reason” to trust. It is not that evidence or knowledge amounts to trust, but that “when social actors no longer need or want any further evidence or rational reasons for their confidence in the objects of trust” they are able to make the cognitive “leap” into trust [94]. The emotional basis of trust refers to the emotional bond between parties which provides the interpersonal platform for trust. Finally, behavioral trust is the behavioral enactment of trust. To illustrate, consider two individuals A and B, where A trusts B with task X. If B performs task X, the trust that A has in B is confirmed: this is the behavioral enactment of trust. In the same way, acting incongruently can reduce trust. The behavioral basis of trust also feeds into the fact that trust is a dynamic concept: “a trustor takes a risk in a trustee that leads to a positive outcome, the trustor's perceptions of the trustee are enhanced. Likewise, perceptions of the trustee will decline when trust leads to unfavorable conclusions” [92].
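The dynamic, behavioral character of trust described above can be caricatured in a few lines of code. The linear update rule, its parameters and the class name are our own illustrative assumptions, not a model taken from the cited literature.

```python
class TrustRelation:
    """Toy model of behavioral trust: fulfilled tasks raise the
    trustor's perception of the trustee, breaches lower it."""

    def __init__(self, level=0.5, step=0.1):
        self.level = level      # 0.0 = no trust, 1.0 = full trust
        self.step = step        # size of each upward/downward revision

    def observe(self, fulfilled):
        """Update trust after the trustee performs (or fails) a task."""
        delta = self.step if fulfilled else -self.step
        self.level = min(1.0, max(0.0, self.level + delta))
        return self.level

# A trusts B with task X: success confirms and raises trust,
# a later failure erodes it again.
a_trusts_b = TrustRelation()
a_trusts_b.observe(True)     # trust rises
a_trusts_b.observe(False)    # trust falls back toward its starting level
```

The point of the sketch is only that trust here is a state updated by observed behavior, not a one-off choice, echoing the risk-and-outcome dynamic quoted from [92].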

8.2.2. Trust and Friendship

Trust is a vitally important element of friendship, securing the “stability of social relationships” [4]. Friendships are described as being “based on trust, reciprocity and equality… which is an important source of solidarity and self-esteem” [4]. Trust is also described as a timelessly essential factor of friendships: “the importance of mutual commitment, loyalty and trust between friends will increase and may become an essential element of modern friendship regardless of other changes, which may be expected as the nature of social communication and contracts is transformed” [4].

Section 9. Conclusion

Online social networking technologies have already transformed the way in which people interact in the virtual space. Generally, younger people are more inclined to interact via features on online social networks than with traditional forms of online communication such as electronic mail. The ability to look up a “friend's” location using a location based social network now grants individuals even greater freedom to interact with one another in an almost omniscient manner. Not only do we now know the ‘who’ (identity) of a person, but we also know the ‘whereabouts’ (location) of a person, and from the profile data available on the online social network we also know something more about their ‘context.’ If used appropriately, these new applications have the potential to strengthen individual relationships and provide an unforeseen level of convenience between “friends”, including partners, siblings, parent-child and employer-employee relationships. However, there is also the danger that these technologies can be misused and threaten fundamental threads upon which society is built, such as trust. This literature review has attempted to establish what research has already been conducted in the area of LBSN, and what has yet to be done. Our future work will focus on participant fieldwork with real-time automated LBSN, with a view to understanding the impact of LBSN on trust between people, and the broader social implications of this emerging technology upon society.

References

1. R. Kling, "What is social informatics and why does it matter?", The Information Society, vol. 23, pp. 205-220, 2007.

2. K. Robert, K. Sara, "Internet paradox revisited", Journal of Social Issues, vol. 58, pp. 49-74, 2002.

3. A. Drummond, Teenager missing after Facebook meeting, 14 May 2010.

4. B. Misztal, Trust in Modern Societies: The Search for the Bases of Social Order, Cambridge: Blackwell Publishers, 1998.

5. See where your friends are with Google Latitude, February 2009.

6. R. Kling, H. Rosenbaum, "Social informatics in information science: An introduction", Journal of the American Society for Information Science, vol. 49, pp. 1047-1052, 1998.

7. R. Kling, "Social Informatics", Encyclopedia of Library and Information Science, pp. 2656-2661, 2003.

8. R. Kling, "Learning About Information Technologies and Social Change: The Contribution of Social Informatics", The Information Society, vol. 16, pp. 217-232, 2000.

9. R. Kling, "Social Informatics: A New Perspective on Social Research about Information and Communication Technologies", Prometheus, vol. 18, pp. 245-264, 2000.

10. D. Mackenzie, D. Mackenzie, "Introductory Essay: The Social Shaping of Technology" in The Social Shaping of Technology, Philadelphia:Open University Press, pp. 2-27, 1999.

11. S. Russell, R. Williams, K. Sorensen, R. Williams, "Social Shaping of Technology: Frameworks Findings and Implications for Policy With Glossary of Social Shaping Concepts" in Shaping Technology Guiding Policy: Concepts Spaces and Tools, Chetenham:Elgar, pp. 37-131, 2002.

12. R. Williams, D. Edge, "The Social Shaping of Technology", Research Policy, vol. 25, pp. 856-899, 1996.

13. S. Sawyer, K. Eschenfelder, "Social informatics: Perspectives examples and trends", Annual Review of Information Science and Technology, vol. 36, pp. 427-465, 2002.

14. R. Kling, L. Covi, "Electronic journals and legitimate media in the systems of scholarly communication", The Information Society, vol. 11, pp. 261-271, 1995. 

15. W. Orlikowski, "Learning from notes: Organizational issues in GroupWare implementation", The Information Society, vol. 9, pp. 237-250, 1993.

16. B. Kahin, J. Keller, Public Access to the Internet, Cambridge: MIT Press, 1995.

17. G. Piccoli, B. Ives, "Trust and the Unintended Effects of Behavior Control in Virtual Teams", MIS Quarterly, vol. 27, pp. 365-395, 2003.

18. S. Sawyer, A. Tapia, "From Findings to Theories: Institutionalizing Social Informatics", The Information Society, vol. 23, pp. 263-275, 2007.

19. O. K. Ngwenyama, A. S. Lee, "Communication Richness in Electronic Mail: Critical Social Theory and the Contextuality of Meaning", MIS Quarterly, vol. 21, pp. 145-167, 1997.

20. S. Hansen, N. Berente, "Wikipedia Critical Social Theory and the Possibility of Rational Discourse", The Information Society, vol. 25, pp. 38-59, 2009.

21. D. Cecez-Kecmanovic, "Doing critical IS research: the question of methodology" in Qualitative Research in Information Systems: Issues and Trends, Hershey:Idea Group Publishing, pp. 141-163, 2001.

22. D. M. Boyd, "Friendster and publicly articulated social networking", CHI '04 on Human Factors in Computing Systems, Vienna, Austria, 2004.

23. S. Counts, K. E. Fisher, "Mobile Social Networking: An Information Grounds Perspective", Proceedings of the 41st Annual Hawaii International Conference on System Sciences, 2008.

24. A. C. Weaver, B. B. Morrison, "Social Networking", Computer, vol. 41, pp. 97-100, 2008.

25. C. Lampe, N. Ellison, C. Steinfield, "A face(book) in the crowd: social Searching vs. social browsing", Proceedings of the 2006 20th Anniversary Conference on Computer Supported Cooperative Work, 2006.

26. C. Lampe, N. B. Ellison, C. Steinfield, "Changes in use and perception of facebook", Proceedings of the ACM 2008 conference on Computer supported cooperative work, 2008.

27. A. N. Joinson, Proceeding of the twenty-sixth annual SIGCHI conference on Human factors in computing systems, 2008.

28. J. T. Hancock, C. L. Toma, "I know something you don't: the use of asymmetric personal information for interpersonal advantage", Proceedings of the ACM 2008 conference on Computer supported cooperative work, 2008.

29. Y.-Y. Ahn, S. Han, Proceedings of the 16th international conference on World Wide Web, 2007.

30. C. N. Chapman, M. Lahav, "International ethnographic observation of social networking sites" in CHI '08 extended abstracts on Human factors in computing systems, Florence, Italy:, 2008.

31. R. Arjan, U. Pfeil, P. Zaphiris, "Age differences in online social networking", Conference on Human Factors in Computing Systems, 2008.

32. J. Schrammel, C. Kaffel, Tscheligi, "How much do you tell?: information disclosure behavior indifferent types of online communities", Proceedings of the fourth international conference on Communities and technologies, 2009.

33. J. DiMicco, D. R. Millen, "Motivations for social networking at work", Proceedings of the ACM 2008 conference on Computer supported cooperative work, 2008.

34. M. M. Skeels, J. Grudin, "When social networks cross boundaries: a case study of workplace use of facebook and linkedin", Proceedings of the ACM 2009 international conference on Supporting group work, 2009.

35. I. Liccardi, A. Ounnas, "The role of social networks in students' learning experiences" in Working group reports on ITiCSE on Innovation and technology in computer science education, Dundee, Scotland:, 2007.

36. S. Bystein, J. Rose, "The Role of Social Networking Services in eParticipation", Proceedings of the 1st International Conference on Electronic Participation, 2009.

37. R. M. Beth, M. C. John, "wConnect: a facebook-based developmental learning community to support women in information technology", Proceedings of the fourth international conference on Communities and technologies, 2009.

38. J. Donath, D. Boyd, "Public displays of connection", BT Technology Journal, vol. 22, pp. 71-82, 2004.

39. D. Boyd, "Reflections on Friendster Trust and Intimacy", Ubiquitous Computing Workshop application for the Intimate Ubiquitous Computing Workshop, 2003.

40. C. Dwyer, "Digital Relationships in the “MySpace” Generation: Results From a Qualitative Study", Proceedings of the 40th Annual Hawaii International Conference on System Sciences, 2007.

41. H. Chun, H. Kwak, "Comparison of online social relations in volume vs interaction: a case study of cyworld", Proceedings of the 8th ACM SIGCOMM conference on Internet measurement, 2008.

42. N. Ellison, C. Steinfield, C. Lampe, "The Benefits of Facebook “Friends:” Social Capital and College Students Use of Online Social Network Sites", Journal of Computer-Mediated Communication, vol. 12, pp. 1143-1168, 2007.

43. K. Subrahmanyam, P. Greenfield, "Online communication and adolescent relationships", The Future of Children, vol. 18, pp. 119-128, 2008.

44. R. Gross, A. Acquisti, "Information Revelation and Privacy in Online Social Networks", Workshop on Privacy in Electronic Society, 2005.

45. J. Snyder, D. Carpenter, " MySpace.com - A Social Networking Site and Social Contract Theory ", Information Systems Education Journal, vol. 5, pp. 3-11, 2007.

46. C. Dwyer, S. Hiltz, Passerini, "Trust and privacy concern within social networking sites: A comparison of Facebook and MySpace", Proceedings of the Thirteenth Americas Conference on Information Systems (AMCIS), 2007.

47. D. Rosenblum, "What Anyone Can Know: The Privacy Risks of Social Networking Sites", IEEE Security & Privacy, vol. 5, pp. 40-49, 2007.

48. M. Mohammad, C. O. Paul, "Privacy-enhanced sharing of personal content on the web", Proceeding of the 17th international conference on World Wide Web, 2008.

49. S. Katherine, L. H. Richter, "Strategies and struggles with privacy in an online social networking community", Proceedings of the 22nd British HCI Group Annual Conference on HCI 2008: People and Computers XXII: Culture Creativity Interaction, vol. 1, 2008.

50. A. Levin, M. Foster, The Next Digital Divide: Online Social Network Privacy, 2008.

51. J. Golbeck, U. Kuter, "The Ripple Effect: Change in Trust and Its Impact Over a Social Network", Computing with Social Trust, pp. 169-181, 2009.

52. C. James, L. Ling, "Socialtrust: tamper-resilient trust establishment in online communities", Proceedings of the 8th ACM/IEEE-CS joint conference on Digital libraries, 2008.

53. S. Gambi, W. Reader, "The Development of Trust in Close Friendships Formed within Social Network Sites", Proceedings of the WebSci'09: Society On-Line, 2009.

54. L. Bilge, T. Strufe, Balzarotti, Kirda, "All your contacts are belong to us: automated identity theft attacks on social networks", Proceedings of the 18th international conference on World wide web, 2009.

 55. G. Kaupins, R. Minch, "Legal and ethical implications of employee location monitoring", International Journal of Technology and Human Interaction, vol. 2, pp. 16-20, 2006.

56. G. D. Smith, "Private eyes are watching you: with the implementation of the E-911 mandate who will watch every move you make? (Telecommunications Act of 1996: Ten Years Later Symposium)", Federal Communications Law Journal, vol. 58, pp. 705-721, 2006.

57. E. Troshynski, C. Lee, Dourish, "Accountabilities of presence: reframing location-based systems", Proceeding of the twenty-sixth annual SIGCHI conference on Human factors in computing systems, 2008.

58. C. Strawn, "Expanding The Potential for GPS Evidence Acquisition", Small Scale Digital Device Forensics Journal, vol. 3, pp. 1-12, 2009.

59. Y. Xiao, B. Shen, "Security and Privacy in RFID and application in telemedicine", IEEE Communications Magazine, vol. 44, pp. 64-72, 2006.

60. A. Masters, K. Michael, "Lend me your arms: The use and implications of humancentric RFID", Electronic Commerce Research Applications, vol. 6, pp. 29-39, 2007.

61. B. Brown, A. Taylor, "Locating Family Values: A Field Trial of the {Whereabouts} Clock", UbiComp 2007, 2007.

62. L.-D. Chou, N.-H. Lai, Y.-W. Chen, Y.-J. Chang, L.-F. Huang, W.-L. Chiang, H.-Y. Chiu, J.-Y. Yang, "Management of mobile social network services for families with Developmental Delay Children", 10th International Conference on e-health Networking Applications and Services: HealthCom 2008, 2008.

63. D. J. Glasser, K. W. Goodman, "Chips tags and scanners: Ethical challenges for radio frequency identification", Ethics and Information Technology, vol. 9, pp. 101-109, 2007.

64. L. Nan, C. Guanling, "Analysis of a Location-Based Social Network", International Conference on Computational Science and Engineering, 2009.

65. X. Page, A. Kobsa, "The Circles of Latitude: Adoption and Usage of Location Tracking in Online Social Networking", International Conference on Computational Science and Engineering, 2009.

66. M. G. Michael, S. J. Fusco, K. Michael, "A Research Note on Ethics in the Emerging Age of überveillance", Computer Communications, vol. 31, pp. 1192-1199, 2008.

67. L. Perusco, K. Michael, "Humancentric applications of precise location based services", International Conference on eBusiness Engineering, 2005.

68. J. Weckert, "Trust and monitoring in the workplace", IEEE Symposium on Technology and Society, 2000.

69. G. Borriello, "RFID: tagging the world", Communications of the ACM, vol. 48, pp. 34-37, 2005.

70. J. E. Dobson, P. F. Fisher, "Geoslavery", IEEE Technology and Society Magazine, vol. 22, pp. 47-52, 2003.

71. P. Joore, "Social aspects of location-monitoring systems: the case of Guide Me and of My-SOS", Social Science Information, vol. 47, pp. 253-274, 2008.

72. J. E. Dobson, P. F. Fisher, "The Panopticon's Changing Geography", The Geographical Review, vol. 97, pp. 307-323, 2007.

73. E. M. Dowdell, "You are here! Mapping the boundaries of the Fourth Amendment with GPS technology", Rutgers Computer and Technology Law Journal, vol. 32, pp. 109-131, 2005.

74. V. Lockton, R. Rosenberg, "RFID: The Next Serious Threat to Privacy", Ethics and Information Technology, vol. 7, pp. 221-231, 2005.

75. S. L. Garfinkel, A. Juels, "RFID Privacy: An Overview of Problems and Proposed Solutions", IEEE Security and Privacy, pp. 34-43, 2005.

76. M. Gadzheva, "Privacy concerns pertaining to location-based services", International Journal of Intercultural Information Management, vol. 1, pp. 49, 2007.

77. J. L. Wang, M. Loui, "Privacy and ethical issues in location-based tracking systems", Proceedings of the IEEE Symposium on Technology and Society, 2009.

78. L. Barkhuus, A. Dey, "Location-Based Services for Mobile Telephony: a study of user's privacy concerns", Proceedings of the INTERACT 9th IFIP TC13 International Conference on Human-Computer Interaction, 2003.

79. A. Aloudat, K. Michael, R. Abbas, "Location-Based Services for Emergency Management: A Multi-Stakeholder Perspective", Eighth International Conference on Mobile Business (ICMB 2009), 2009.

80. S. A. Grandhi, Q. Jones, Karam, "Sharing the big apple: a survey study of people place and locatability" in presented at CHI '05 extended abstracts on Human factors in computing systems, Portland, OR:, 2005.

81. S. J. Fusco, K. Michael, M. G. Michael, R. Abbas, "Exploring the Social Implications of Location Based Social Networking: An inquiry into the perceived positive and negative impacts of using LBSN between friends", International Conference on Mobile Business, 2010.

82. L. Barkhuus, "Privacy in Location-Based Services Concern vs. Coolness", HCI 2004 workshop: Location System Privacy and Control, 2004.

83. S. Patil, J. Lai, "Who gets to know what when: configuring privacy permissions in an awareness application", Proceedings of the SIGCHI conference on Human factors in computing systems, 2005.

84. S. Consolvo, I. E. Smith, "Location disclosure to social relations: why when & what people want to share", Proceedings of the SIGCHI conference on Human factors in computing systems, 2005.

85. L. Humphreys, "Mobile Social Networks and Social Practice: A Case Study of Dodgeball", Journal of Computer-Mediated Communication, vol. 13, pp. 341-360, 2008.

86. L. Barkhuus, B. Brown, "From awareness to repartee: sharing location within social groups", Proceeding of the twenty-sixth annual SIGCHI conference on Human factors in computing systems, 2008.

87. J. Y. Tsai, P. Kelley, "Who's viewed you?: the impact of feedback in a mobile location-sharing application", Proceedings of the 27th international conference on Human factors in computing systems, 2009.

88. S. Vihavainen, A. Oulasvirta, "I can't lie anymore!”: The implications of location automation for mobile social applications", 6th Annual International Mobile and Ubiquitous Systems: Networking & Services, 2009.

89. C. Schapsis, Location Based Social Networks Links: A list of Location Based Social Networks, 2010.

90. A. Vasalou, A. Hopfensitz, J. Pitt, "In praise of forgiveness: Ways for repairing trust breakdowns in one-off online interactions", International Journal of Human-Computer Studies, vol. 66, pp. 466-480, 2008.

91. D. Rousseau, S. Sitkin, "Not So Different After All: A Cross-Discipline View of Trust", Academy of Management Review, vol. 22, pp. 393-404, 1998.

92. R. C. Mayer, J. H. Davis, "An Integrative Model of Organizational Trust", Academy of Management Review, vol. 20, pp. 709-734, 1995.

93. K. Bijlsma-Frankema, A. C. Costa, "Understanding the Trust-Control Nexus", International Sociology, vol. 20, pp. 259-282, 2005.

94. J. D. Lewis, A. Weigert, "Trust as a Social Reality", Social Forces, vol. 63, pp. 967-985, 1985.

Acknowledgments

The authors would like to acknowledge the funding support of the Australian Research Council (Discovery grant DP0881191): “Toward the Regulation of the Location-Based Services Industry: Influencing Australian Government Telecommunications Policy”.

Keywords

Informatics, Social network services, Space technology, Privacy, Communications technology, Information systems, Social implications of technology, Context, Surveillance, Smart phones, social networking (online), data privacy, mobile computing, social aspects of automation, information and communication technology design, social informatics, location-based social networking, mobile application, classification model, location based service, online social networking, trust, friendship, privacy, social life

Citation:  Sarah Jean Fusco, Katina Michael and M. G. Michael, "Using a social informatics framework to study the effects of location-based social networking on relationships between people: A review of literature",  2010 IEEE International Symposium on Technology and Society (ISTAS), 7-9 June 2010, Wollongong, Australia, DOI: 10.1109/ISTAS.2010.5514641

The legal, social and ethical controversy of the collection and storage of fingerprint profiles and DNA samples in forensic science

Abstract

The collection and storage of fingerprint profiles and DNA samples in the field of forensic science for nonviolent crimes is highly controversial. While biometric techniques such as fingerprinting have been used in law enforcement since the early 1900s, DNA presents a more invasive and contentious technique, as most sampling is of an intimate nature (e.g. a buccal swab). A fingerprint is a pattern residing on the surface of the skin, whereas a DNA sample must in the vast majority of cases be extracted from the body (at times even requiring the breaking of the skin). This paper aims to balance the need to collect DNA samples in violent crimes where direct evidence is lacking against the systematic collection of DNA from citizens who have committed acts such as petty crimes. The legal, ethical and social issues surrounding the proliferation of DNA collection and storage are explored, with a view to outlining the threats that such a regime may pose to citizens in the not-too-distant future, especially persons belonging to ethnic minority groups.

SECTION 1. Introduction

The aim of this paper is to apply the science, technology and society (STS) studies approach, which combines history, social study and the philosophy of science, to the legal history of DNA sampling and profiling in the United Kingdom since the first forensic use of DNA in a criminal court case in 1988. The paper begins by defining the application of biometrics to the field of criminal law, in particular the use of fingerprint and DNA identification techniques. It then presents the differences between fingerprint and DNA evidence, and focuses on distinguishing between DNA profiles and samples, and DNA databanks and databases. Finally, the paper presents the legal, ethical and social concerns over the proliferation of DNA collection and storage in particular jurisdictions prior to 2010 (e.g. the United Kingdom). The paper points to the pressing need for a review of the Police and Criminal Evidence Act 1984, and of the procedures for DNA collection and storage in the U.K.'s National DNA Database (NDNAD), which was established in 1995. Some examples of the state of play in the United States are provided as well.

SECTION 2. Conceptual Framework

It is of no surprise that in recent years there has been a convergence between science and technology studies (STS) and law and society (L&S) studies. Some commentators, like this author, believe there is a need to define a new theoretical framework that amalgamates these increasingly converging areas. Lynch et al. [6], [p.14] write: “[w]hen law turns to science or science turns to law, we have the opportunity to examine how these two powerful systems work out their differences.” This convergence has its roots in legal disputes in the fields of health, safety and environmental regulation. For instance, advances in technology have challenged one's right to live or die. New innovations can stretch traditional regulatory distinctions, or they can challenge and even evade them.

In this paper we study the “DNA controversy” using the conceptual framework found in Figure 1, which depicts the role of the major stakeholders in the debate. In the early 1990s the “DNA Wars” [6] focused on two major problems with respect to the techno-legal accountability of DNA evidence in a court of law. The first had to do with the potential for error in the forensic laboratory, and the second with the combination of genetic and statistical datasets. These were not just legal and administrative matters, but issues that were technical and scientific in nature. The key players included expert lawyers, scientists who actively participated in legal challenges and public policy debates, and the media who investigated and reported the controversy [6]. Putting an end to the controversy would require law, science and the public to come together in a head-on confrontation. And that is indeed what occurred. By the late 1990s DNA had become an acceptable method of suspect identification, and a great number of onlookers prematurely rushed to declare closure, although, as commentators have noted, there was no moment of truth or definitive judgment that settled the matter. What many did not recognize at the time, however, is that the DNA controversy would return, in places like the United Kingdom, bigger and more intense than ever before.

Figure 1. The core set diagram: studying the DNA controversy

It is of great interest to read that closure in the DNA controversy became most visible when the NDNAD, and some of the legislation and policy surrounding it, facilitated talks between European nations with respect to harmonization. According to Lynch et al. [6], [p.229]:

“[e]fforts were made to “harmonize” DNA profile and database standards in Europe, and other international efforts were made to coordinate forensic methods in order to track suspected “mobile” criminals and terrorists across national borders. These international efforts to implement and standardize DNA profiling contributed to closure in particular localities by demonstrating that the technique was widely used and had become a fixture of many criminal justice systems.”

While this may have signified closure to those working within an STS and L&S approach, harmonization was certainly not reached. Far from it: the U.K., which had been responsible for the initial harmonization efforts, later lost its way. What made onlookers believe that closure had fully occurred were the technical, legal and administrative fixes that had taken place. But closure in this instance did not mean the complete end of the controversy; what was coming was much greater disquiet in the U.K., and this period was named ‘post-closure’ by the STS and L&S commentators. Post-closure signals a period of time after closure is established, when issues that were once closed may be reopened. In the case of the NDNAD in the U.K. it was not old issues that were reopened during post-closure, but new issues that were introduced by so-called legal fixes. These legal fixes had social implications, so it was not until the public, the media and non-government organizations, alongside self-interest groups, were satisfied that change would be imminent that post-closure seemed a real possibility. The threat to the post-closure of the DNA controversy, however, is the burgeoning demand for DNA samples in fields such as epidemiology research and the recent commercialization of DNA sample collection and storage for everyday citizens (e.g. DNA home kits selling for less than US$100). DNA is no longer seen as useful just for forensic science or health, and this is placing incredible pressure on an advanced identification technique that is increasingly becoming commoditized.

SECTION 3. Background: What is Biometrics?

As defined by the Association for Biometrics (AFB), a biometric is “ … a measurable, unique physical characteristic or personal trait to recognize the identity, or verify the claimed identity, of an enrollee.” The physical characteristics that can be used for identification include: facial features (full face and profile), fingerprints, palmprints, footprints, hand geometry, ear (pinna) shape, retinal blood vessels, striation of the iris, surface blood vessels (e.g., in the wrist), and electrocardiac waveforms [1]. Other examples of biometric types include DNA (deoxyribonucleic acid), odor, skin reflectance, thermogram, gait, keystroke, and lip motion. Biometrics have seven characteristics: universality, in that every person should possess the given characteristic; uniqueness, in that no two persons should share the same pattern; permanence, in that the characteristic does not change over time; collectability, in that it can be measured and quantified; performance, in that the measurement is accurate; acceptability, in that users are willing to have it taken; and circumvention resistance, meaning that the system of identification theoretically cannot be duped [2]. The two most popular methods of identification in criminal law today, when direct evidence such as a first-hand eyewitness account is lacking, are fingerprinting and DNA.

SECTION 4. What is Fingerprinting?

Fingerprints are classified according to a number of characteristics or unique pattern types, which include arches, loops and whorls [3], [p.228]. If one inspects the epidermis of the fingertips closely, one can see that it is made up of ridge and valley structures forming a unique geometric pattern. Ridge endings (and other local ridge discontinuities) are given a special name: minutiae. Identifying an individual by the relative position of minutiae and the number of ridges between them is the traditional algorithm used to compare pattern matches. As fingerprints do not change from birth until death, unless they are accidentally or deliberately deformed, it is argued that they can provide absolute proof of identity. The science of fingerprint identification is called dactyloscopy [4], [p.4].
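
The minutiae comparison just described can be illustrated with a toy sketch. The coordinates, tolerance and function name below are illustrative assumptions, not any operational matcher; a real system would also use ridge counts between minutiae and their orientations, whereas this sketch keeps only positional proximity:

```python
import math

def match_score(print_a, print_b, tol=5.0):
    """Count minutiae in print_a that have a counterpart in print_b
    within `tol` units -- a crude proxy for comparing the relative
    positions of minutiae between two prints."""
    matched, used = 0, set()
    for (xa, ya) in print_a:
        for i, (xb, yb) in enumerate(print_b):
            if i not in used and math.hypot(xa - xb, ya - yb) <= tol:
                matched += 1
                used.add(i)
                break
    return matched

# Hypothetical minutiae coordinates from two prints
scene = [(10, 12), (40, 45), (70, 20)]
suspect = [(11, 13), (39, 46), (90, 90)]
print(match_score(scene, suspect))  # 2: two of three minutiae align
```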

4.1. Fingerprinting as Applied to Criminal Law

Fingerprints left behind at the scene of a crime (SOC) can be used as physical evidence for the purposes of human identification. They have the capacity to link a person (e.g. a suspect) to a particular location at a given time. This can happen in one of two ways: (i) the suspect's fingerprints are taken and cross-matched with those found at the scene of a crime; or (ii) a successful match is found using computer technology to compare the fingerprints found at the scene of a crime with a database of previous offenders. It should be noted that fingerprinting in criminal law is not new. Manual standards, for instance, have existed since the 1920s, when the Federal Bureau of Investigation (FBI) in the U.S. started processing fingerprint cards. These standards ensured completeness, quality and permanency.

By the early 1970s, due to progress in computer processing power and storage and the rise of more sophisticated software applications, law enforcement began to use automatic machines to classify, store, and retrieve fingerprint data. The FBI led the way by introducing the Integrated Automated Fingerprint Identification System (IAFIS), which could scan a fingerprint image, convert the minutiae to digital information and compare it to thousands of other fingerprints [5], [p.411]. Today, very large computer databases containing millions of fingerprints of persons who have been arrested are used to make comparisons with prints obtained from new crime scenes. These comparisons can take literally seconds or minutes, depending on the depth of the search required. Sometimes successful matches can be made; other times the fingerprints cannot be matched. When fingerprints cannot be matched, it is inferred that a new offender has committed the crime. These ‘new’ prints are still stored on the database so that, if the person is later apprehended by direct evidence after a second offence, earlier crimes can be traced back to them, creating a trail of criminal events linked to the same individual with the potential to solve multiple crimes. Commonly, a list of prints that come closest to matching the print found at the scene of a crime is returned for further examination by an expert, who then deems which single print is the closest match. In recent years, background checks are even conducted on individuals using fingerprints as a means to gain employment (such as in early childhood settings) [4], [p.5], or during the process of adoption or other security clearance requirements.
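
The store-and-trace workflow described above, in which unmatched scene prints are retained so that later offences link back to the same unknown offender, might be sketched as follows. The class name, labels and crude similarity rule are illustrative assumptions only:

```python
import math

class PrintDatabase:
    """Toy sketch of the store-and-link workflow: unmatched scene
    prints are retained so later crimes by the same (still unknown)
    person can be linked into one trail."""

    def __init__(self, tol=5.0):
        self.records = {}          # label -> list of minutiae sets
        self.tol = tol
        self._next_unknown = 1

    def _matches(self, a, b):
        # crude similarity: every minutia of a has a near neighbour in b
        return all(
            any(math.hypot(x - u, y - v) <= self.tol for (u, v) in b)
            for (x, y) in a
        )

    def submit(self, minutiae):
        """Return the label of a matching record, creating an
        'unknown offender' record when no match is found."""
        for label, prints in self.records.items():
            if any(self._matches(minutiae, p) for p in prints):
                prints.append(minutiae)
                return label
        label = f"unknown-{self._next_unknown}"
        self._next_unknown += 1
        self.records[label] = [minutiae]
        return label

db = PrintDatabase()
first = db.submit([(10, 12), (40, 45)])   # no match -> new record
second = db.submit([(11, 13), (41, 44)])  # links to the same trail
print(first, second)                      # unknown-1 unknown-1
```

If the offender is later identified by direct evidence, the whole `unknown-1` trail can be attributed to them at once, which is the point the paragraph makes about solving multiple crimes.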

SECTION 5. What is DNA?

DNA fingerprinting, DNA (geno)typing, DNA profiling, identity testing and identification analysis all denote the ability to characterize one or more rare features of an individual's genome, that is, their hereditary makeup. DNA contains the blueprints that are responsible for our cells, tissues, organs, and body [4], [p.8]. In short, it can be likened to “God's signature” [6], [p.259]. Every single human has a unique composition, save for identical twins, who share the same genotype but have subtly different phenotypes. When DNA samples are taken from blood cells, saliva or hair bulb specimens of the same person, the structure of the DNA remains the same. Thus only one sample is required as the basis for DNA profiling, and it can come from any tissue of the body [7], [p.1]. DNA fingerprinting was discovered in 1985 by English geneticist Dr Alec Jeffreys. He found that certain regions of DNA contained sequences that repeated themselves over and over again, one after the other, and that different individuals had a different number of repeated sections. He developed a technique to examine the length variation of these DNA repeat sequences, thus creating the ability to perform identification tests [8], [pp.2f].

The smallest building block of DNA is known as the nucleotide. Each nucleotide contains a deoxyribose, a phosphate group and a base. When analyzing DNA structures, it is the sequence of bases that is important for the purposes of identification [9], [p.11]. There are four bases through which a genetic code is described: Adenine (A), Thymine (T), Guanine (G) and Cytosine (C). When trying to understand DNA sequences as they might appear in written form, consider that ‘A’ only binds with ‘T’, and ‘G’ only binds with ‘C’ (see Figure 2, comparing rows one and two). These base pairs are repeated millions of times in every cell, and it is their order of sequence that determines the characteristics of each person. It is repetitive DNA sequences that are utilized in DNA profiling [10], [p.2].

Figure 2. A typical DNA sequence

For example, in Figure 2 the base sequences of the two strands, known together as the double helix, are written for a fictitious DNA sample. While the labels “5′” and “3′” have been included for illustrative purposes, a sequence is written plainly as CTTAGCCATAGCCTA. From this sequence we can deduce the second strand given the binding rules described above. Furthermore, in specific applications of DNA testing, various polymorphisms may be considered, which denote the type of repeat for a given stretch of DNA. For instance, the tetranucleotide repeat is merely a stretch of DNA where a specific four-nucleotide motif is repeated [9], [p.10].
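
The base-pairing rule just described (A binds with T, G with C) makes deducing the second strand entirely mechanical, and the tetranucleotide-repeat idea can be illustrated the same way. The function names and the repeat example sequence below are made up for illustration:

```python
PAIRS = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand):
    """Deduce the second strand of the double helix from the first,
    using the base-pairing rules A<->T and G<->C."""
    return "".join(PAIRS[base] for base in strand)

def max_tandem_repeats(strand, motif):
    """Longest run of back-to-back copies of `motif` (a
    tetranucleotide repeat when len(motif) == 4)."""
    best = run = 0
    i = 0
    while i <= len(strand) - len(motif):
        if strand[i:i + len(motif)] == motif:
            run += 1
            best = max(best, run)
            i += len(motif)
        else:
            run = 0
            i += 1
    return best

print(complement("CTTAGCCATAGCCTA"))                  # GAATCGGTATCGGAT
print(max_tandem_repeats("AATCGATCGATCGTT", "ATCG"))  # 3 consecutive copies
```

It is length variation of exactly this kind of repeat run, differing between individuals, that Jeffreys' technique exploits for identification.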

DNA profiling can be applied to a broad range of applications including diagnostic medicine, family relationship analysis (proof of paternity and inheritance cases), and animal and plant sciences [7], [p.31]. The most high-profile use of DNA, however, is in the area of forensic science, popularized by modern-day television series such as CSI: Miami and Cold Case. Episodes from these series, such as “Death Pool” [11] and “Dead Air” [12], allow members of the public to visualize how DNA might be used to gather evidence towards prosecution in a court of law. Although Hollywood is well known for its farcical and inaccurate representations, these episodes still demonstrate the potential of DNA. DNA profiling can eliminate a suspect with a discrimination power so high that it can be considered a major identification mechanism [13], [p.1]. There is no doubt that forensic DNA analysis has made a huge impact on criminal justice and the law since its inception in U.K. courts with the 1988 investigation into the deaths of schoolgirls Lynda Mann in 1983 and Dawn Ashworth in 1986 [14]. Since that time, DNA has been used successfully in criminal law to help prove guilt or innocence [15], in family law to prove parentage, and in immigration law to prove blood relations in cases related to citizenship [4], [p.xiii].

5.1. DNA as Applied to Criminal Law

In forensic DNA analysis today, nuclear DNA is the primary basis for individual identification, with mitochondrial DNA relied upon where nuclear material is too scarce or degraded to type [9], [p.5]. According to Koblinsky et al., it is the moderately repetitious DNA that is of interest to forensic analysts [4], [pp.17f]:

“It has been shown that 99.9% of human DNA is the same in every individual. In fact, every individual's DNA has a relatively small number of variations from others. It is that variation of 1 in every 1000 bases that allows us to distinguish one individual from another through forensic genetic testing.”

As in the case of dactyloscopy, an individual's DNA can be left behind at the scene of a crime or on a victim. Natural fibers transferred through human contact, for example from a perpetrator to a victim, or left behind (sometimes in microscopic quantities) at a scene of a crime, can be used for evidentiary purposes. The DNA found in hair, for example, can be compared to hair specimens taken from a crime suspect or to a DNA profile stored in an existing DNA databank. Synthetic fibers not containing DNA, such as threads from a piece of clothing worn by a perpetrator, can also be used to link a suspect to a crime. The transfer of fibers from one person to another upon physical contact is known as the Locard exchange principle [4], [p.3].

It is important to note that all physical evidence, DNA included, should only ever be considered circumstantial evidence. It provides only a basis for inference about the claim being made, and can be used in logical reasoning to prove or disprove an assertion. In a criminal case, DNA alone cannot be used to prove someone's guilt or innocence. Rather, DNA may be able to point investigators to ‘what happened’, ‘the order of events that took place’, ‘who was involved’, ‘where an event took place’ and ‘how it might have taken place’; in that manner the forensic scientist is conducting a reconstruction by means of association (Table 1) [16], [p.1]. Thus the job of an investigator is to put all the pieces of the puzzle together, gathering as much information as possible from all available sources of evidence, including eyewitness accounts, physical evidence and archival records [4], [p.1].

Table 1. A theoretical framework for the discipline of criminalistics [16], [p.2]

As more sophisticated techniques have emerged to analyze DNA samples taken at the scene of a crime, less and less DNA is needed for a correct reading. How much DNA do you need? It all depends on the richness of the sample. For instance, a 2002 US State Police handbook noted that a clump of pulled hair contained enough material for successful RFLP (Restriction Fragment Length Polymorphism) typing. A single hair root provided enough nuclear DNA for PCR STR (polymerase chain reaction short tandem repeat) typing, but not enough for RFLP. And a hair shaft contained sufficient mitochondria for successful mtDNA (mitochondrial DNA) typing, but was inadequate for PCR STR or RFLP typing [16], [p.61]. A blood, saliva, urine, bone, teeth, skin or semen sample could be considered richer than a hair root for extraction purposes, but DNA analysis is very much dependent on the level of degradation the sample has been exposed to.

Environmental factors can be harmful to DNA collected from a scene of a crime and can lead to the deterioration, destruction or contamination of evidence, all contestable issues a lawyer may have to deal with in a court of law [4], [p.xiii]. For instance, heat, moisture, bacteria, ultraviolet (UV) rays and common chemicals can all contribute to the degradation process [9], [p.61]. When a sample undergoes some level of degradation, the integrity of the chain of custody is said to have been compromised. To get around such problems, experts have proposed bringing the laboratory closer to policing practice. The concept of a “lab in a van” or “lab on a chip” (LOC) proposes the use of a mobile laboratory where analysis and interpretation of evidence is possible even at the scene of a crime [6], [p.153]. Advancements in mobile technologies continue to allow even very tiny biological substances to undergo DNA testing, resulting in accurate identification. Even a cigarette butt with saliva on it, containing epithelial cells, can be screened for DNA evidence [4], [p.6].

SECTION 6. Comparing DNA and Fingerprinting

To begin with, traditional fingerprint classification techniques have been around far longer than DNA identification, although both fingerprints and DNA have been features of the human body from the start. In its manual form, the Galton-Henry system of fingerprint classification first made its impact on the practices of Scotland Yard in 1901. So whereas fingerprint recognition can happen using manual methods, DNA testing can only happen using laboratory systems, even if analysis now takes the form of a mobile lab on a chip. DNA is also a pervasive and invasive biometric: everyone possesses DNA, and DNA belongs to the internals of what makes up the body. For a DNA reading, a hair shaft must be detached from the scalp, teeth and skin and bones have to be ‘dismembered’ from the body, and blood, urine and saliva are extracted from the body [17], [p.374].

In most states, the police can take non-intimate samples if a person has been arrested for a serious recordable offence, and in other states DNA can be taken for offences such as begging, being drunk and disorderly, or taking part in an illegal demonstration. In the U.K., for instance, the DNA does not have to be directly relevant to investigating the offence for which a person is being arrested, and the person does not have to be charged before the sample is taken. The police are not allowed to take more than one successful sample from the same body part during the course of an investigation. The police can take an intimate sample only with a person's written consent, even if they have been arrested. However, there is a burgeoning debate at present about what actually constitutes consent during such a process: is it true consent, or merely compliance with, or acknowledgment of, required police procedures by the individual under arrest?

Fingerprints are different in that, while belonging to the body, they are a feature on the surface of the body and do not constitute mass. Fingerprints are patterns that appear on the skin, but they are not the fiber we know as skin. Fingerprinting also excludes a small portion of the population: those who do not have particular fingers, hands or arms, or whose fingers have been severely deformed through accidental or deliberate damage. Despite these differences, scientists claim that forensic DNA testing has emerged as an accurate measure of someone's identification, with reliability equal to that of fingerprint recognition [4], [p.5].

6.1. Intimate and Non-Intimate Measures: Other Biometrics Versus DNA Sampling

6.1.1. The United States and Other Biometrics

The notion of “intimacy” is very much linked to the literature on DNA, not to that of biometrics in general. Although historically there has been some contention that a fingerprint sample is both “intimate” and “private”, the proliferation of fingerprint, handprint and facial recognition systems now used in government and commercial applications has rendered this debate somewhat redundant. This is not to say that the storage of personal attributes is without its own commensurate risks, but large-scale applications enforced by such acts as the United States Enhanced Border Security and Visa Entry Reform Act of 2002 mean that fingerprint, hand and facial recognition systems have now become commonplace. In fact, this trend promises to continue through multimodal biometrics, the adoption of several biometrics toward individual authentication. Few travelers, at the time of transit, directly challenge the right of authorities to take such personal details and to store them on large databases in the name of national security. However sentiment, at least in North America, was different prior to the September 11 terrorist attacks on the Twin Towers [18].

In 1997 biometrics were touted as a type of personal data wholly owned by the individual bearer, with statutory implications depending on the governing jurisdiction [19]. It followed that a mandatory requirement by a government agency to collect and store fingerprint data might have been in breach of an individual's legitimate right to privacy. In the U.S., court cases on this issue have consistently found that certain biometrics do not violate federal protections such as the Fourth Amendment. It seems that the [20]:

“ … real test for constitutionality of biometrics … appears to be based on the degree of physical intrusiveness of the biometric procedure. Those that do not break the skin are probably not searches, while those that do are”.

In the context of DNA we can almost certainly claim that there is “physical intrusiveness” of a different nature to the collection of surface-level fingerprints (figure 2). In the collection of blood samples we must “break” or “pierce” the skin; in the collection of saliva samples we enter the mouth and touch its inner lining with buccal swabs; in the removal of a hair or clump of hair we are “pulling” the hair, shaft and root, from the scalp. And it is here, in these examples, that consent and policing powers and authority become of greatest relevance and significance.

Figure 2. Left: finger “prints” on the surface of the skin. Right: DNA blood “sample” taken by pricking the skin

6.1.2. Britain and DNA

In the world of DNA there is a simple classification, followed by most law enforcement agencies, that denotes samples as being of either an “intimate” or a “non-intimate” nature. In the British provisions of the original Police and Criminal Evidence Act of 1984 (PACE), section 65 defines intimate samples as “a sample of blood, semen or any other tissue fluid, urine, saliva or pubic hair, or a swab taken from a person's body orifice”, and non-intimate samples as “hair other than pubic hair; a sample taken from a nail or from under a nail; a swab taken from any part of a person's body other than a body orifice” [21], [p.80]. Generally, it must be noted that at times police can take a sample by force, while on other occasions they require consent. In Britain, prior to 2001, intimate samples from a person in custody were only obtainable through the express authority of a police officer at the rank of superintendent, and only with the written permission of the person who had been detained (section 62) [21]. Non-intimate samples could be taken from an individual without consent but with permission from a police officer of superintendent rank (section 63). In both instances there had to be reasonable grounds for suspecting that the person from whom the sample would be taken had been involved in a serious offence [21]. And beyond reasonable grounds, there had to be, theoretically at least, the potential to confirm or disprove the suspect's involvement through obtaining a DNA sample [22], [p.29]. Over time, Acts such as PACE have been watered down, leading to controversial strategic choices in law enforcement practices, such as the trend towards growing national DNA databases at a rapid rate.

6.2. Continuity of Evidence

Table 2. Ways to mitigate the effect of DNA evidence

Policing and forensic investigative work are no different to any other “system” of practice; they are required to maintain sophisticated audit trails, even beyond those of corporate organizations, to ensure that a miscarriage of justice does not take place. However, continuity of evidence is much easier to establish for fingerprints than for DNA, which is far more complex. A fingerprint found at a crime scene does not undergo the same type of degradation as a DNA sample. Thus it is much easier to claim a fingerprint match in a court of law than a DNA closeness match. Providing physical evidence in the form of a DNA sample or profile requires the litigator to prove that the sample was handled with the utmost care throughout the whole chain of custody, and that a particular set of standard procedures was followed for the collection, transportation and handling of the material. The proof that these procedures were followed can be found in a series of paper trails which track the movements of samples [6], [p.114].

Beyond the actual location of the evidence, continuity of evidence has to do with how a DNA sample is stored and handled: information related to the temperature of the place where the sample was found and of the place of storage, whether samples surrounding the one being analyzed were contaminated, how samples are identified and qualified using techniques such as barcode labels or tags, how samples were tested and under what conditions, and how frequently samples were accessed, by whom and for what purposes [4], [p.43]. When DNA forensic testing was in its infancy, knowledgeable lawyers would contest DNA evidence in court by pointing to the micro-level practices of the particular laboratories tasked with the analytical process. Attention was first focused on the need to standardize procedures and to develop accreditation processes for laboratories and personnel in the 1989 case People v Castro 545 N.Y.S.2d 985 (Sup. Ct. 1989). When DNA testing began it was a very unregulated field, with one commentator famously noting that “clinical laboratories [were required to] meet higher standards to be allowed to diagnose strep throat than forensic labs [were required to] meet to put a defendant on death row” [9], [p.55]. But it must be said that, given the advancement in quality procedures, attacks on DNA evidence now rarely focus on the actual standards, and more so on whether the standards were followed appropriately [9], [p.61].

In the event that a defense lawyer attempts to lodge an attack on the DNA evidence being presented in a court of law, they will almost always claim human error with respect to procedures not being followed in accordance with industry standards. Human error cannot be eradicated from any system, and, no matter how small the chance, there is always the possibility that a sample has been wrongly labeled or contaminated by external agents [9]. Worse still is the potential for a forensic expert to provide erroneous or misleading results, whether through lack of experience, a miscalculation of statistical probabilities or deliberate perjury. The latter is complex to prove in court. Some have explained away these human errors leading to wrongful conviction as the result of undue political pressure placed on lab directors, and subsequently analysts, for a timely response to a violent crime [16], [p.157]. As Michaelis et al. note [9], [p.69]:

“[i]n far too many cases, the directors of government agencies such as forensic testing laboratories are subjected to pressure from politicians and government officials to produce results that are politically expedient, sometimes at the expense of quality assurance … Laboratory directors are too often pressured to produce results quickly, or to produce results that will lead to a conviction, rather than allowed to take the time required to ensure quality results.”

Thus attacks on DNA evidence can be made by attacking the chain of custody, among other strategies shown in Table 2.

SECTION 7. The Difference Between Databases and Databanks

7.1. Of Profiles and Samples

In almost any biometric system, four steps are required to match one biometric with another. First, data is acquired from the subject, usually in the form of an image (e.g. a fingerprint or iris). Second, the transmission channel, which acts as the link between the primary components, transfers the data to the signal processor. Third, the processor takes the raw biometric image and begins coding the biometric by segmentation, which results in a feature extraction and a quality score. The matching algorithm then attempts to find a record that is identical, resulting in a match score. Finally, a decision is made based on the resultant scores, and an acceptance or rejection is determined [23]. At the computer level, a biometric image is translated into a string of bits, that is, a series of ones and zeros. Thus a fingerprint is coded into a numeric value, and these values are compared in the matching algorithm against other existing values. Simply put, the input value is the actual fingerprint image, and the output value is a coded value. This coded value is unique in that it can determine an individual profile.
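The four steps above can be sketched in miniature. The encoding scheme, function names and acceptance threshold below are illustrative assumptions only; a real system would extract minutiae or iris codes rather than hash pixels into bits:

```python
# Minimal sketch of the four-stage biometric pipeline described above:
# acquisition -> feature extraction (coding) -> matching -> decision.
# The "feature extraction" here is a toy hash of pixel values into a
# fixed-length bit string, purely for illustration.

def extract_template(image: list[int], bits: int = 64) -> list[int]:
    """Code the raw biometric image into a fixed-length string of bits."""
    template = [0] * bits
    for i, pixel in enumerate(image):
        template[i % bits] ^= pixel & 1  # toy coding step, not a real algorithm
    return template

def match_score(a: list[int], b: list[int]) -> float:
    """Fraction of agreeing bits between two templates (1.0 = identical)."""
    agree = sum(1 for x, y in zip(a, b) if x == y)
    return agree / len(a)

def decide(score: float, threshold: float = 0.9) -> str:
    """Accept or reject based on the resulting match score."""
    return "accept" if score >= threshold else "reject"

enrolled = extract_template([12, 7, 255, 3, 88] * 40)
probe = extract_template([12, 7, 255, 3, 88] * 40)  # same subject re-presented
print(decide(match_score(enrolled, probe)))  # identical input -> "accept"
```

The point of the sketch is the flow, not the coding: whatever the extraction algorithm, the comparison is ultimately between two numeric values, and the acceptance decision is a threshold on their similarity.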

With respect to the extraction of a DNA sample, the process is much more complex, as are its evaluation and interpretation. A DNA sample differs from a fingerprint image. A sample is a piece of the body, or something issuing from the body, while in the case of fingerprints an image is an outward bodily aspect. When a DNA sample undergoes processing, it too is coded into a unique value of As, Ts, Gs and Cs. This value is referred to as a DNA profile. Storing DNA profiles in a computer software program is considered a different practice to storing the actual feature-rich DNA sample in a DNA store. Some members of the community have volunteered DNA samples using commercial DNA test kits such as “DNA Exam” by the BioSynthesis Corporation [24]. For example, the DNA Diagnostics Center [25] states that one may:

“ … elect to take advantage of [the] DNA banking service without any additional charge if [one] orders a DNA profile [and that the company] will store a sample of the tested individual's DNA in a safe, secure facility for 15 years, in case the DNA sample is ever needed for additional testing”.

The controversy over storing “samples” taken by force in the crime arena has to do with the potential for DNA to generate information such as a person's predisposition to disease, or other characteristics that a person might consider confidential. It is the application of new algorithms or extraction/evaluation/interpretation techniques to an existing sample that is of greatest concern to civil liberties advocates. Profiles are usually unique combinations of 16 markers [26]; they can only be used to match, and cannot be used toward further fact-finding discoveries, although some believe that conclusions might be drawn from profiles in the future. In a given population there are several different alleles for any single marker, and some of these may appear more frequently than others. The best markers are those with the greatest number of different alleles and an even distribution of allele frequencies [9], [p.19].
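The arithmetic behind marker rarity can be shown in a short sketch. The Hardy-Weinberg genotype estimates (p² for a homozygote, 2pq for a heterozygote) and the product rule across independent markers are standard population-genetics tools, but the allele frequencies and the three-marker profile below are invented for illustration:

```python
# Illustrative calculation of how allele frequencies translate into the
# rarity of a DNA profile. Hardy-Weinberg estimates give the expected
# genotype frequency at each marker, and the product rule multiplies
# across independent markers. All numbers here are hypothetical.

def genotype_frequency(p: float, q: float) -> float:
    """Expected population frequency of a genotype with allele
    frequencies p and q (p^2 if homozygous, 2pq if heterozygous)."""
    return p * p if p == q else 2 * p * q

# Hypothetical profile: (allele-1 frequency, allele-2 frequency) per marker.
profile = [(0.10, 0.20), (0.05, 0.05), (0.15, 0.30)]

rarity = 1.0
for p, q in profile:
    rarity *= genotype_frequency(p, q)  # product rule across markers

print(f"Expected profile frequency: 1 in {1 / rarity:,.0f}")
```

Markers with many alleles of even frequency keep each per-marker genotype frequency small, which is why multiplying across 16 such markers yields the vanishingly small profile frequencies quoted in court.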

7.2. Of Databases and Databanks

Although textbooks would have us believe there is a clear-cut distinction between what constitutes a database and a databank, in actual fact the terms are used interchangeably in most generalist computing literature. Most dictionaries, for example, define the term database without an entry for databank. A database is a file of information assembled in an orderly manner by a program designed to record and manipulate data, and one that can be queried using specific criteria. Commercial database products include Oracle and Microsoft Access. The International Organization for Standardization (ISO), however, does define a databank: “a set of data related to a given subject and organized in such a way that it can be consulted by users” [27]. The distinction is still quite subtle, but we can extrapolate from these definitions that databases are generic information stores, while databanks are specific to a subject [28].

In the study of DNA with respect to criminal law, the distinction between databases and databanks is far more crystallized, although readers are still bound to be confused by contradictory statements made by some authors. Still, in most cases, a databank is used to investigate crimes and to identify suspects, while a database is used to estimate the rarity of a particular DNA profile in the larger population [9], [p.99]. Databanks contain richer personal information related to samples, even if the identity of the person is unknown. For example, a databank can contain unique profiles of suspects and convicted criminals, and content about physical crime stains and records of DNA profiles generated by specific probes at specific loci [10], [p.40]. Databases are much more generic than databanks, containing information that is representative of the whole populace or a segment of it. For example, a database can contain statistical information relating to the population frequencies of various DNA markers, generated from random samples for particular ethnic groups or for the population at large. Databanks may contain rich personal data about offenders and cases [16], [pp.157f], but databases contain only minimal information such as the DNA profile, ethnic background and gender of the corresponding individuals.

Table 3. The NDNAD database attributes [30]

The premise of the DNA databank is that the DNA profile data of known offenders can be searched in an attempt to solve otherwise unsolved crimes, known as ‘cold cases’. Databanks are valuable in that they can help investigators string together a series of crimes that would otherwise go unrelated, allowing the investigator to go across space and time after all other avenues have been exhausted [9], [p.99]. With respect to violent crimes, we know that offenders are highly prone to re-offending, and we also know that violent crimes often provide rich DNA sample sources such as bones, blood or semen. Thus DNA left at the scene of a crime can be searched against a DNA databank in the hope of a “close” match [16], [p.157]. The probative value of the DNA evidence is greater the rarer the DNA profile is in the larger population [9], [p.19].
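A minimal sketch of this cold-hit search idea follows. The marker names are real STR loci, but the allele values, record names and the two-marker reporting threshold are invented for illustration and do not reflect any real databank's matching rules:

```python
# Sketch of a cold-hit search: a profile recovered from a crime scene is
# compared against every record in an offender databank, and records
# sharing enough markers are reported as "close" hits for follow-up.

def shared_markers(scene: dict, record: dict) -> int:
    """Count markers at which two profiles carry the same allele pair."""
    return sum(1 for locus, alleles in scene.items()
               if record.get(locus) == alleles)

def cold_hit_search(scene: dict, databank: dict, min_shared: int = 2) -> list:
    """Return names of databank records sharing at least min_shared markers."""
    return [name for name, record in databank.items()
            if shared_markers(scene, record) >= min_shared]

# Hypothetical crime-scene profile and two offender records.
scene_profile = {"D3S1358": (15, 16), "vWA": (17, 17), "FGA": (21, 24)}
databank = {
    "record-001": {"D3S1358": (15, 16), "vWA": (17, 17), "FGA": (21, 24)},
    "record-002": {"D3S1358": (14, 15), "vWA": (16, 18), "FGA": (20, 22)},
}
print(cold_hit_search(scene_profile, databank))  # -> ['record-001']
```

The lower the reporting threshold, the more adventitious (coincidental) hits a search will return, which is exactly the proportionality concern raised later in this article.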

Different jurisdictions have different standards on the criteria for inclusion in DNA databanks, what attribute information is stored in individual records, and who has access. In the United States, for instance, different states have different rules, some allowing DNA databanks to be accessed by law enforcement agencies alone, and others allowing public officials access for purposes outside law enforcement [9], [p.100]. In the U.S., the CODIS (Combined DNA Index System) system was launched in 1998–99 by the FBI. It contains two searchable databases, one with previous offenders and another with DNA profiles gathered from evidence at crime scenes [9], [p.16]. In the case of the U.K., the National DNA Database (NDNAD) of England, Wales and Northern Ireland contains very detailed information for each criminal justice (CJ) record (see table 3), and profiles are searched against each other on a daily basis, with close-hit results forwarded to the appropriate police personnel. It is quite ironic that the 1995 NDNAD is a databank, yet is so large that it is considered a database by most, as is evident from the fact that the word “database” appears in the NDNAD acronym [29], [p.2].

SECTION 8. Legal, Ethical and Social Concerns

The collection, storage and use of DNA samples, profiles and fingerprints raise a number of legal, ethical and social concerns. While some of the concerns over the collection and storage of an individual's fingerprints by the State have dissipated over the last decade, the debate over the storage of DNA samples and profiles rages more than ever before. It was around the turn of the century that a number of social, ethical and legal issues were raised with respect to DNA sampling, but councils and institutes, through lack of knowledge or expertise, could offer little in the way of a possible solution or way forward to the DNA controversy [31], [p.34]. At the heart of the techno-legal “controversy” is a clash of ideals arising from a collision of disciplines. For many medical practitioners working on topics related to consent or confidentiality, the legal position on DNA acts as a barrier to important medical research. While few would dispute the importance of data protection laws and the ethical reasons behind balancing the right to privacy against other rights and interests, some in the medical field believe that the law has not been able to deal with exceptions where the use of DNA data could be considered proportionate, for instance in the area of epidemiology. There are those like Iverson who argue that consent requirements could be relaxed for the sake of the common good:

“We are not arguing that epidemiological research should always proceed without consent. But it should be allowed to do so when the privacy interference is proportionate. Regulators and researchers need to improve their ability to recognize these situations. Our data indicate a propensity to over-predict participants' distress and under-predict the problems of using proxies in place of researchers. Rectifying these points would be a big step in the right direction” [32], [p.169].

Thinking in this utilitarian way, the use of DNA evidence in criminal cases, especially violent crimes, is something most people would agree is a legitimate use of technology within the confines of the law. The application of DNA to assist in civil cases, again, would seem appropriate where family and state-to-citizen disputes can only be settled by the provision of genetic evidence. Volunteering DNA samples to appropriate organizations and institutions is also something an individual has the freedom to do, despite the fact that a large portion of the population would not participate in a systematic collection of such personal details. Voluntary donation of a DNA sample usually happens for one of three reasons: (i) to assist practitioners in the field of medical research; (ii) to assist in DNA cross-matching exercises with respect to criminal cases; and (iii) to aid individuals who may need to use their own DNA in the future, for any number of potential purposes. As Carole McCartney reminds us:

“[f]orensic DNA technology has multiple uses in the fight against crime, and ongoing research looks to expand its usefulness further in the future. While the typical application of DNA technology in criminal investigations is most often unproblematic, there needs to be continued vigilance over the direction and implications of research and future uses” [33], [p.189].

Table 4. Legal, ethical and social issues related to use of DNA in criminal law

It is in this parallel development that we can see an evolution of sorts occurring in the collection of highly intimate personal information. On the one hand we have the law, on the other medical discovery, both on parallel trajectories that will have spillover effects on one another. For many, the appropriate use of DNA in the medical research and criminal law fields can only have positive benefits for the community at large. There is no denying this to be the case. However, the real risks cannot be overlooked. Supplementary industries can see the application of DNA in a plethora of programs, including the medical insurance of ‘at risk’ claimants to an unforeseen level of precision, the measurement of an individual's predisposition to a particular behavioral characteristic for employment purposes [34], [p.897], and the ability to tinker with the genes of unborn children to ensure the “right” type of citizens are born into the world. All of this might sound like the stuff of science fiction, but these are all areas under current exploration.

For now, we have the ability to identify issues that have quickly escalated in importance in the DNA debate. For this we have several high-profile cases in Europe to thank, but especially the most recent, heard in the European Court of Human Rights (ECtHR) on 4 December 2008: S and Marper v. the United Kingdom [35]. This landmark case, against all odds, made the U.K. (and to some extent the rest of the world) stop and think about the course it had taken. For the U.K. this meant a re-evaluation of its path forward via a community consultation process regarding the decade-old initiatives of the NDNAD. The main issues that the case and its predecessors brought to the fore are summarized in Table 4. The table should be read from left to right, one row at a time. The left column indicates what most authors studying the socio-ethical issues regard as acceptable uses of DNA, and the right column indicates what most authors regard as either debatable or unacceptable uses.

Of greatest concern to most civil libertarians is the issue of proportionality and the potential for a disproportionate number of profiles to be gathered relative to other state practices, moving towards a blanket-coverage databank. Blanket coverage can be achieved by sampling a populace; a census approach is not required. Maintaining DNA profiles for some 15–20% of the total population means that familial searching could be conducted on the rest, making associations between persons with a high degree of accuracy [4], [p.274], something that would be possible in the U.K. by 2018 if it maintained the same rate of sampling. This is not without its dangers, as it promotes adventitious searching and close matches that might not categorically infer someone's guilt or innocence.

Table 5. Social, ethical and legal issues pertaining to DNA databanks identified by the National Institute of Justice in the United States in 2000 [31], [pp.35f]

In addition, large databanks are not without their bias. Police records are already filled disproportionately with members of minority groups of particular ethnic origins, which can affect the probability of a close match despite someone's innocence. Being on the database means there is a chance a result might list you as a suspect based on having a similar DNA profile to someone else. And ultimately, storing the profiles of innocent people on the NDNAD would do little to prevent crime, and would lead, before too long, to a de facto sampling of all state citizens.

The driving force behind such a campaign could only be the obtaining of DNA samples from persons (including innocent people, or ‘innocents’), either via some event triggering contact between an individual and the police or via an avenue at birth [10], [p.40]. Police powers have increased since the worldwide terrorist attacks post-2000 especially, and this has come at a tradeoff with an individual's right to privacy [36], [p.14]. Notions of consenting to provide a DNA sample to law enforcement personnel have been challenged where the use of force has been applied. And not consenting to a sample being taken, even if you are innocent, has its own implications and can be equally incriminating. Legislative changes have thus encroached on individual rights: whereas a warrant based on reasonable grounds was once required to take a DNA sample from a suspect's body, today it is questionable whether this caveat actually exists.

Beyond the obvious downsides of retaining the DNA profile or sample of innocent people who are in actual fact law-abiding citizens, there is the potential for persons to feel aggrieved because they have not been left alone to go about their private business. Innocent persons treated like criminals may end up losing their trust in law enforcement agencies. This is no small social issue, given that there are about 1 million innocent people on the NDNAD in the U.K. In this context, it is not difficult to see how some individuals or groups might grow to possess an anti-police or anti-government sentiment, feeling in some way that they have been wronged or singled out. In some of these ‘mistaken identity’ situations, surely it would have been better to prove someone's innocence using other available evidence, such as closed circuit television (CCTV), without the need to take an intimate DNA sample first. Despite these problems, it seems anyone coming under police suspicion in the U.K. will have their DNA taken anyway [33], [p.175].

Of a most sensitive nature is the collection of DNA samples for an indefinite period of time [4], [p.7]. In most countries, samples are taken, DNA profiles are determined and stored in computer databases, and the samples are subsequently destroyed. The long-term storage of DNA samples from those who have committed petty rather than violent crimes raises the question of the government's motivation for such storage [4]. There are critics who believe that the retention of samples is “an unjustifiable infringement on an individual's privacy” [33], [p.189].

Much has changed with respect to social, ethical and legal issues since 2000, in both the United States and the United Kingdom. But the table still provides a historical insight into the growing list of issues identified at the turn of the century.

Equally alarming is the storage of samples from innocents and from minors. Even more disturbing is the storage of samples with which no personal details have been associated. DNA databanks are no different to other databanks kept by the state: they can be lost, they can be accessed by unauthorized persons, and results can be misrepresented either accidentally or deliberately [33], [p.188]. The stakes, however, are much higher in the case of DNA than in fingerprinting or other application areas, because the information contained in a DNA sample or profile is much richer in potential use. All of this raises issues pertaining to how changes in the law affect society, and how ethics might be understood within a human rights context.

SECTION 9. Conclusion

The legal, social and ethical issues surrounding the collection, use and storage of DNA profiles and samples are probably more evident today than at any other time in history. On the one hand we have the necessity to advance technology and to use it in situations in which it is advantageous to the whole community; on the other hand this same technology can impinge on the rights of individuals (if we let it), through sweeping changes to legislation. Whether we are discussing the need for DNA evidence in criminal law, civil law, epidemiological research or other general use, consent should be the core focus of any and every collection instance. The unlimited retention of DNA samples collected from those arrested but not charged is another issue where legislative reforms are needed in a number of European jurisdictions, although this trend seems to be gathering momentum now more so outside Europe. Another issue is the redefinition of what constitutes an intimate or non-intimate sample; here, most clearly, we have a problem in a plethora of jurisdictions with regard to the watering down of which DNA procedures are considered invasive as opposed to non-invasive with respect to the human body. The bottom line is that we can still convict criminals who have committed serious recordable offences without needing to take DNA samples from persons committing petty crimes, despite statistics alleging links between those committing serious and petty offences. So long as a profile is in a database it can be searched, and the problem with this is that so-called ‘matches’ (adventitious in nature) can be as often ‘incorrect’ as they are ‘correct’. This possibility alone has serious implications for human rights. The time to debate and discuss these matters is now, before the widespread use of DNA becomes commonplace in general societal applications.

SECTION 10. Afterword

Although members of society should not expect to learn of a black market for DNA profiles just yet, it is merely a matter of time before the proliferation and use of such profiles makes them more attractive to members of illicit networks. There is now overwhelming evidence that identity theft is on the rise worldwide (although estimates vary depending on the study and state). The systematic manipulation of identification numbers, such as social security numbers, credit card numbers, and even driver's license numbers, is now well documented. Victims of identity theft know too well the pains of having to prove who they are to government agencies and financial institutions, and of providing adequate evidence that they should not be held liable for information and monetary transactions they did not commit. Today's type of identity theft has its limitations, however: stealing a number is unlike stealing somebody's godly signature. While credit card numbers can be replaced, one's DNA or fingerprints cannot. This resonates with the well-known response of Sir Thomas More to Norfolk in A Man for All Seasons: “you might as well advise a man to change the color of his eyes [another type of biometric]”, knowing all too well that this was impossible. While some have proclaimed the end of the DNA controversy, at least from a quality assurance and scientific standpoint, the real controversy is perhaps just beginning.

ACKNOWLEDGEMENTS

The author would like to acknowledge Associate Professor Clive Harfield of the Centre for Transnational Crime Prevention in the Faculty of Law at the University of Wollongong for his mentorship in the areas of U.K. law and policing in 2009. The author also wishes to extend her sincere thanks to Mr Peter Mahy, Partner at Howells LLC and the lawyer who represented S & Marper in front of the Grand Chamber at the European Court of Human Rights, for his willingness to share his knowledge on the NDNAD controversy via a primary interview.

References

1. J. R. Parks, P. L. Hawkes, "Automated personal identification methods for use with smart cards" in Integrated Circuit Cards Tags and Tokens: new technology and applications, Oxford: BSP Professional Books, pp. 92-135, 1990.

2. A. K. Jain, L. Hong, S. Pankanti, R. Bolle, "An identity-authentication system using fingerprints", Proceedings of the IEEE, vol. 85, pp. 1365-1387, 1997.

3. J. Cohen, Automatic Identification and Data Collection Systems, London:McGraw-Hill Book Company, pp. 228, 1994.

4. L. Kobilinsky, T. F. Liotti, J. Oeser-Sweat, DNA: Forensic and Legal Applications, New Jersey: Wiley, 2005.

5. P. T. Higgins, "Standards for the electronic submission of fingerprint cards to the FBI", Journal of Forensic Identification, vol. 45, pp. 409-418, 1995.

6. M. Lynch, S. A. Cole, R. McNally, K. Jordan, Truth Machine: the Contentious History of DNA Fingerprinting, Chicago:The University of Chicago Press, 2008.

7. L. T. Kirby, DNA Fingerprinting: An Introduction, New York:Stockton Press, 1990.

8. J. M. Butler, Forensic DNA Typing: Biology, Technology and Genetics of STR Markers, Amsterdam: Elsevier Academic Press, pp. 2, 2005.

9. R. C. Michaelis, R. G. Flanders, P. H. Wulff, A Litigator's Guide to DNA: from the Laboratory to the Courtroom, Amsterdam:Elsevier, 2008.

10. C. A. Price, DNA Evidence: How Reliable Is It? An Analysis of Issues Which May Affect the Validity and Reliability of DNA Evidence, Legal Research Foundation, vol. 38, 1994.

11. A. Donahue, E. Devine, S. Hill, "Death Pool (Season 5 Episode 3)", CSI Miami, 2006.

12. J. Haynes, S. Hill, "Dead Air (Season 4 Episode 21)", CSI Miami, 2006.

13. B. Selinger, "The Scientific Basis of DNA Technology" in J. Vernon, B. Selinger (eds.), DNA and Criminal Justice, Canberra, vol. 2, 1989.

14. Man jailed in first DNA case wins murder appeal, May 2009, [online] Available: http://uk.reuters.com/article/idUKTRE54D3cc20090514?pageNumber=1&virtualBrandChannel=0.

15. The Innocence Project-Home, 2009, [online] Available: http://www.innocenceproject.org/.

16. N. Rudin, K. Inman, An Introduction to Forensic DNA Analysis, London:CRC Press, 2002.

17. A. Roberts, N. Taylor, "Privacy and the DNA Database", European Human Rights Law Review, vol. 4, pp. 374, 2005.

18. K. Michael, M. G. Michael, Automatic Identification and Location Based Services: from Bar Codes to Chip Implants, 2009.

19. R. Van Kralingen, C. Prins, J. Grijpink, Using your body as a key; legal aspects of biometrics, 1997, [online] Available: http://cwis.kub.nl/~frw/people/kraling/content/biomet.htm.

20. S. O'Connor, "Collected tagged and archived: legal issues in the burgeoning use of biometrics for personal identification", Stanford Technology Law Review, 1998, [online] Available: http://www.jus.unitn.it/USERS/pascuzzi/privcomp99-00/topics/6/firma/connor.txt.

21. S. Ireland, "What Authority Should Police Have to Detain Suspects to take Samples?", DNA and Criminal Justice, pp. 80, 1989.

22. I. Freckelton, "DNA Profiling: Forensic Science Under the Microscope" in J. Vernon, B. Selinger (eds.), DNA and Criminal Justice, Canberra, vol. 2, pp. 29, 1989.

23. K. Raina, J. D. Woodward, N. Orlans, "How Biometrics Work" in J. D. Woodward, N. M. Orlans, P. T. Higgins, Biometrics, pp. 29, 2002.

24. Identity DNA Tests, 2009, [online] Available: http://www.800dnaexam.com/Identity_dna_tests.aspx.

25. Profiling, 2009, [online] Available: http://www.dnacenter.com/dna-testing/profiling.html.

26. Biosciences Federation and the Royal Society of Chemistry, "Forensic Use of Bioinformation: A response from the Biosciences Federation and the Royal Society of Chemistry to the Nuffield Council on Bioethics", January 2007, [online] Available: http://www.rsc.org/images/ForensicBioinformation_tcm18-77563.pdf.

27. J. C. Nader, "Data bank" in Prentice Hall's Illustrated Dictionary of Computing, pp. 152, 1998.

28. DNA Safeguarding for security and identification, 2009, [online] Available: http://www.dnatesting.com/dna-safeguarding/index.php.

29. British Academy of Forensic Sciences, "Response to the Nuffield Bioethics Council consultation - The forensic use of bioinformation: ethical issues, November 2006 to January 2007", 2007, [online] Available: http://www.nuffieldbioethics.org/fileLibrary/pdf/British_Academy-of_Forensic_Sciences.pdf.

30. What happens when someone is arrested?, 2009, [online] Available: http://www.genewatch.org/sub-539483.

31. "The Future of Forensic DNA Testing: Predictions of the Research and Development Working Group", National Institute of Justice, 2000.

32. A. Iversen, K. Liddell, N. Fear, M. Hotopf, S. Wessely, "Consent confidentiality and the Data Protection Act", British Medical Journal, vol. 332, pp. 169, 2006.

33. C. McCartney, "The DNA Expansion Programme and Criminal Investigation", The British Journal of Criminology, vol. 46, pp. 175-189, 2006.

34. D. Meyerson, "Why Courts Should Not Balance Rights Against the Public Interest", Melbourne University Law Review, vol. 33, pp. 897, 2007.

35. "Grand Chamber I Case of S. and Marper v. The United Kingdom (Applications nos. 30562/04 and 30566/04) Judgment", European Court of Human Rights Strasbourg, December 2008.

36. J. Kearney, P. Gunn, "Meet the Experts-Part III DNA Profiling", pp. 14, 1991.

Keywords

Law, Legal factors, Fingerprint recognition, DNA, Forensics, Biometrics, Sampling methods, Skin, Sociotechnical systems, History, Controlled Indexing
social sciences, criminal law, ethical aspects, fingerprint identification, forensic science, social issue, fingerprint profile collection, fingerprint profile storage, DNA sample, nonviolent crime, biometric technique, buccal swab, legal issue, ethical issue

Citation: Katina Michael, "The legal, social and ethical controversy of the collection and storage of fingerprint profiles and DNA samples in forensic science", 2010 IEEE International Symposium on Technology and Society (ISTAS), 7-9 June 2010, Wollongong, Australia

Toward a State of Überveillance

Introduction

Überveillance is an emerging concept; neither its application nor its power has yet fully arrived [38]. For some time, Roger Clarke's [12, p. 498] 1988 dataveillance concept has been prevalent: the “systematic use of personal data systems in the investigation or monitoring of the actions of one or more persons.”

Almost twenty years on, technology has developed so much and the national security context has altered so greatly [52], that there is a pressing need to formulate a new term to convey both the present reality, and the Realpolitik (policy primarily based on power) of our times. However, if it had not been for dataveillance, überveillance could not be. It must be emphasized that dataveillance will always exist - it will provide the scorecard for the engine being used to fulfill überveillance.

Dataveillance to Überveillance

Überveillance takes that which was static or discrete in the dataveillance world, and makes it constant and embedded. Consider überveillance as not only automatic and having to do with identification, but also about real-time location tracking and condition monitoring. That is, überveillance connotes the ability to automatically locate and identify - in essence, the ability to perform automatic location identification (ALI). Überveillance has to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought). Überveillance can be a predictive mechanism for a person's expected behavior, traits, likes, or dislikes; or it can be based on historical fact; or it can be something in between. The inherent problem with überveillance is that facts do not always add up to truth (i.e., as in the case of an exclusive disjunction, where T ⊕ T = F), and predictions based on überveillance are not always correct.
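The exclusive-disjunction point above can be made concrete with a minimal sketch (ours, not the authors'): under XOR, two individually true inputs yield a false result, a loose formal analogy for the claim that individually accurate facts do not necessarily combine into truth.

```python
def xor(a: bool, b: bool) -> bool:
    """Exclusive disjunction: true only when exactly one input is true."""
    return a != b

# Two true "facts" combined under exclusive disjunction give False: T XOR T = F
print(xor(True, True))    # False
print(xor(True, False))   # True
print(xor(False, False))  # False
```

The analogy is illustrative only: it shows that the combining rule, not just the inputs, determines the outcome.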

Überveillance is more than closed circuit television feeds, or cross-agency databases linked to national identity cards, or biometrics and ePassports used for international travel. Überveillance is the sum total of all these types of surveillance and the deliberate integration of an individual's personal data for the continuous tracking and monitoring of identity and location in real time. In its ultimate form, überveillance has to do with more than automatic identification technologies that we carry with us. It has to do with under-the-skin technology that is embedded in the body, such as microchip implants; it is that which cuts into the flesh - a charagma (mark) [61]. Think of it as Big Brother on the inside looking out. This charagma is virtually meaningless without the hybrid network architecture that supports its functionality: making the person a walking online node, i.e., beyond luggable netbooks, smart phones, and contactless cards. We are referring here to the lowest common denominator, the smallest unit of tracking - presently a tiny chip inside the body of a human being, which could one day work similarly to the black box.

Implants cannot be left behind, cannot be lost, and supposedly cannot be tampered with; they are always on, can link to objects, and make the person seemingly otherworldly. This act of “chipification” is best illustrated by the ever-increasing uses of implant devices for medical prosthesis and for diagnostics [54]. Humancentric implants are giving rise to the Electrophorus [36, p. 313], the bearer of electric technology; an individual entity very different from the sci-fi notion of Cyborg as portrayed in such popular television series as the Six Million Dollar Man (1974–1978). In its current state, the Electrophorus relies on a device being triggered wirelessly when it enters an electromagnetic field; these properties now mean that systems can interact with people within a spatial dimension, unobtrusively [62]. And it is surely not simple coincidence that alongside überveillance we are witnessing the philosophical reawakening (throughout most of the fundamental streams running through our culture) of Nietzsche's Übermensch - the overcoming of the “all-too-human” [25].

Legal and Ethical Issues

In 2005 the European Group on Ethics (EGE) in Science and New Technologies, established by the European Commission (EC), submitted an Opinion on ICT implants in the human body [45]. The thirty-four page document outlines legal and ethical issues having to do with ICT implants, and is based on the European Union Treaty (Article 6) which has to do with the “fundamental rights” of the individual. Fundamental rights have to do with human dignity, the right to the integrity of the person, and the protection of personal data. From the legal perspective the following was ascertained [45, pp. 20–21]:

  • the existence of a recognised serious but uncertain risk, currently applying to the simplest types of ICT implants in the human body, requires application of the precautionary principle. In particular, one should distinguish between active and passive implants, reversible and irreversible implants, and between offline and online implants;
  • the purpose specification principle mandates at least a distinction between medical and non-medical applications. However, medical applications should also be evaluated stringently, partly to prevent them from being invoked as a means to legitimize other types of application;
  • the data minimization principle rules out the lawfulness of ICT implants that are only aimed at identifying patients, if they can be replaced by less invasive and equally secure tools;
  • the proportionality principle rules out the lawfulness of implants such as those that are used, for instance, exclusively to facilitate entrance to public premises;
  • the principle of integrity and inviolability of the body rules out that the data subject's consent is sufficient to allow all kinds of implant to be deployed; and
  • the dignity principle prohibits transformation of the body into an object that can be manipulated and controlled remotely - into a mere source of information.

ICT implants for non-medical purposes violate fundamental legal principles. ICT implants also have numerous ethical issues, including the requirement for: non-instrumentalization, privacy, non-discrimination, informed consent, equity, and the precautionary principle (see also [8], [27], [29]). It should be stated, however, that the EGE, while not recommending ICT implants for non-medical applications because they are fundamentally fraught with legal and ethical issues, did state the following [45, p. 32]:

ICT implants for surveillance in particular threaten human dignity. They could be used by state authorities, individuals and groups to increase their power over others. The implants could be used to locate people (and also to retrieve other kinds of information about them). This might be justified for security reasons (early release for prisoners) or for safety reasons (location of vulnerable children).

However, the EGE insists that such surveillance applications of ICT implants may only be permitted if the legislator considers that there is an urgent and justified necessity in a democratic society (Article 8 of the Human Rights Convention) and there are no less intrusive methods. Nevertheless the EGE does not favor such uses and considers that surveillance applications, under all circumstances, must be specified in legislation. Surveillance procedures in individual cases should be approved and monitored by an independent court.

The same general principles should apply to the use of ICT implants for military purposes. Although this Opinion was certainly useful, we have growing concerns about the development of the information society, the lack of public debate and awareness regarding this emerging technology, and the pressing need for regulation that has not occurred commensurate to developments in this domain.

Herein rests the problem of human rights and striking a “balance” between freedom, security, and justice. First, we contend that it is a fallacy to speak of a balance. In the microchip implant scenario, there will never be a balance, so long as someone else has the potential to control the implant device or the stored data about us that is linked to the device. Second, we are living in a period where chip implants for the purposes of segregation are being discussed seriously by health officials and politicians. We are speaking here of the identification of groups of people in the name of “health management” or “national security.” We will almost certainly witness new and more fixed forms of “electronic apartheid.”

Consider the very real case where the “Papua Legislative Council was deliberating a regulation that would see microchips implanted in people living with HIV/AIDS so authorities could monitor their actions” [50]. Similar discussions on “registration” were held regarding asylum seekers and illegal immigrants in the European Union [18]. RFID implants or the “tagging” of populations in Asia (e.g., Singapore) were also considered “the next step” in the containment and eradication of the Severe Acute Respiratory Syndrome (SARS) in 2003 [43]. Apart from disease outbreaks, RFID has also been discussed as a response and recovery device for emergency services personnel dispatched to terrorist disasters [6], and for the identification of victims of natural disasters, such as in the case of the Boxing Day Tsunami [10]. The question remains whether there is a truly legitimate use function of chip implants for the purposes of emergency management as opposed to other applications. Definition plays a critical role in this instance. A similar debate has ensued in the use of the Schengen Information System II in the European Union where differing states have recorded alerts on individuals based on their understanding of a security risk [17].

In June of 2006, legislative analyst Anthony Gad reported in brief 06-13 for the Legislative Reference Bureau [16] that the:

2005 Wisconsin Act 482, passed by the legislature and signed by Governor Jim Doyle on May 30, 2006, prohibits the required implanting of microchips in humans. It is the first law of its kind in the nation reflecting a proactive attempt to prevent potential abuses of this emergent technology.

A number of states in the United States have passed similar laws [63], despite the fact that at the national level, the U.S. Food and Drug Administration [15] has allowed radio frequency identification implants for medical use in humans. The Wisconsin Act [59] states:

The people of the state of Wisconsin, represented in senate and assembly, do enact as follows: SECTION 1. 146.25 of the statutes is created to read: 146.25 Required implanting of microchip prohibited. (1) No person may require an individual to undergo the implanting of a microchip. (2) Any person who violates sub. (1) may be required to forfeit not more than $10,000. Each day of continued violation constitutes a separate offense.

North Dakota followed Wisconsin's example. Governor John Hoeven signed a two-sentence bill into North Dakota state law on April 4, 2007. The bill was criticized by some who said that while it protected citizens from being “injected” with an implant, it did not prevent someone from making them swallow it [51]. And indeed, a number of swallowable capsule technologies for a variety of purposes have now been patented in the U.S. and worldwide. As in a number of other states, California Governor Arnold Schwarzenegger signed bill SB 362, proposed by state Senator Joe Simitian, barring “employers and others from forcing people to have a radio frequency identification (RFID) device implanted under their skin” [28], [60]. According to the Californian Office of Privacy Protection [9] this bill

… would prohibit a person from requiring any other individual to undergo the subcutaneous implanting of an identification device. It would allow an aggrieved party to bring an action against a violator for injunctive relief or for the assessment of civil penalties to be determined by the court.

The bill, which went into effect January 1, 2008, did not receive support from the technology industry on the contention that it was “unnecessary.”

Interestingly, however, it is in the United States that most chip implant applications have occurred, despite the calls for caution. The first human-implantable passive RFID microchip (the VeriChip™) was approved for medical use in October of 2004 by the U.S. Food and Drug Administration. Nine hundred hospitals across the United States have registered to use VeriChip's VeriMed system, and the corporation's focus has now moved to “patient enrollment”, including people with diabetes, Alzheimer's, and dementia [14]. The VeriMed™ Patient Identification System is used for “rapidly and accurately identifying people who arrive in an emergency room and are unable to communicate” [56].

In February of 2006 [55], CityWatcher.com reported two of its employees had “glass encapsulated microchips with miniature antennas embedded in their forearms … merely a way of restricting access to vaults that held sensitive data and images for police departments, a layer of security beyond key cards and clearance codes.” Implants may soon be applied to the corrective services sector [44]. In 2002, 27 of 50 American states were using some form of satellite surveillance to monitor parolees. Similar schemes have been used in Sweden since 1994. In the majority of cases, parolees wear wireless wrist or ankle bracelets and carry small boxes containing the vital tracking and positioning technology. The positioning transmitter emits a constant signal that is monitored at a central location [33]. Despite continued claims by researchers that RFID is only used for identification purposes, Health Data Management disclosed that VeriChip (the primary commercial RFID implant patient ID provider) had enhanced its patient wander application by adding the ability to follow the “real-time location of patients, the ability to define containment areas for different classes of patients, and one-touch alerting. The system now also features the ability to track equipment in addition to patients” [19]. A number of these issues have moved the American Medical Association to produce an ethics code for RFID chip implants [4], [41], [47].

Outside the U.S., we find several applications for human-centric RFID. VeriChip's Scott Silverman stated in 2004 that 7000 chip implants had been given to distributors [57]. Today the number of VeriChip implantees worldwide is estimated to be about 2000. So where did all these chips go? As far back as 2004, a nightclub known as the Baja Beach Club, with venues in Barcelona, Spain [11] and Rotterdam, The Netherlands, was offering “its VIP clients the opportunity to have a syringe-injected microchip implanted in their upper arms that not only [gave] them special access to VIP lounges, but also [acted] as a debit account from which they [could] pay for drinks” [39]. Microchips have also been implanted in a number of Mexican officials in the law enforcement sector [57]. “Mexico's top federal prosecutors and investigators began receiving chip implants in their arms … in order to get access to restricted areas inside the attorney general's headquarters.” In this instance, the implant acted as an access control security device despite the documented evidence that RFID is not a secure technology (see Gartner Research report [42]).

Despite the obvious issues related to security, a few unsolicited studies forecast that VeriChip (now under the new corporate name Positive ID) will sell between 1 million and 1.4 million chips by 2020 [64, p. 21]. While these forecasts may seem overinflated to some researchers, one need only consider the very real possibility that some Americans may opt in to adopting a Class II device that is implantable, life-supporting, or life-sustaining in exchange for more affordable and better quality health care (see section C of the Health Care bill titled National Medical Device Registry [65, pp. 1001–1012]). There is also the real possibility that future pandemic outbreaks even more threatening than the H1N1 influenza may require all citizens to become implanted for early detection, depending on their travel patterns [66].

In the United Kingdom, The Guardian [58] reported that 11-year-old Danielle Duval had an active chip (i.e., containing a rechargeable battery) implanted in her. Her mother believes that it is no different from tracking a stolen car, albeit for a more important application. Mrs. Duval is considering implanting her younger daughter, aged 7, as well, but will wait until the child is a bit older, “so that she fully understands what's happening.” In Tokyo, the Kyowa Corporation in 2004 manufactured a schoolbag with a GPS device fitted into it, to meet parental concerns about crime, and in 2005 Yokohama City children were involved in a four-month RFID bracelet trial using the I-Safety system [53]. In 2007, Trutex, a company in Lancashire, England, was seriously considering fitting the school uniforms it manufactures with RFID [31]. What might be next? Will concerned parents force microchip implants on minors?

Recently, decade-old experimental studies on microchip implants in rats have come to light tying the device to tumors [29]. The American Veterinary Medical Association [3] was so concerned that they released the following statement:

The American Veterinary Medical Association (AVMA) is very concerned about recent reports and studies that have linked microchip identification implants, commonly used in dogs and cats, to cancer in dogs and laboratory animals…. In addition, removal of the chip is a more invasive procedure and not without potential complications. It's clear that there is a need for more scientific research into this technology. [emphasis added]

We see here evidence pointing to the notion of “no return” - an admittance that removal of the chip is not easy, and not without complications.

The Norplant System was a levonorgestrel contraceptive insert that over 1 million women in the United States, and over 3.6 million women worldwide, had received through 1996 [2]. The implants were inserted just under the skin of the upper arm in a surgical procedure under local anesthesia and could be removed in a similar fashion. As of 1997, there were 2700 Norplant suits pending in the state and federal courts across the United States alone. Most of the claims had to do with “pain or damage associated with insertion or removal of the implants … [p]laintiffs have contended that they were not adequately warned, however, concerning the degree or severity of these events” [2]. Thus, concerns about the potential for widespread health implications caused by humancentric implants have also been around for some time. In 2003, Covacio provided evidence for why implants may impact humans adversely, categorizing the effects into thermal (i.e., whole or partial rise in body heating), stimulation (i.e., excitation of nerves and muscles), and other effects, most of which are currently unknown [13].

Role of Emerging Technologies

Wireless networks are now commonplace. What is not yet common are formal service level agreements to hand-off transactions between different types of networks. These architectures and protocols are being developed, and it is only a matter of time before existing technologies have the capability to track individuals between indoor and outdoor locations seamlessly, or a new technology is created to do what present-day networks cannot [26]. For instance, a wristwatch device with GPS capabilities to be worn under the skin translucently is one idea that was proposed in 1998. Hengartner and Steenkiste [23] forewarn that “[l]ocation is a sensitive piece of information” and that “releasing it to random entities might pose security and privacy risks.”

There is nowhere to hide in this digital society, and nothing remains private (in due course, perhaps, not even our thoughts). Nanotechnology, the engineering of functional systems at the molecular level, is also set to change the way we perceive surveillance - microscopic bugs (some 50 000 times smaller than the width of a human hair) will be more parasitic than even the most advanced silicon-based auto-ID technologies. In the future we may be wearing hundreds of microscopic implants, each relating to an exomuscle or an exoskeleton, and each with the power to interact with literally millions of objects in the “outside world.” The question is not whether state governments will invest in this technology: they are already making these investments [40]. The question is whether the next generation will view this technology as super “cool” and convenient, and opt in without comprehending the consequences of their compliance.

The social implications of these über-intrusive technologies will obey few limits and no political borders. They will affect our day-to-day existence and our family and community relations. They will give rise to mental health problems, even more complex forms of paranoia and obsessive compulsive disorder. Many scholars now agree that with the support of modern neuroscience, “the intimate relation between bodily and psychic functions is basic to our personal identity” [45, p. 3]. Religious observances will be affected; for example, in the practice of confession and a particular understanding of absolution from “sin” - people might confess as much as they might want, but the records on the database, the slate, will not be wiped clean. The list of social implications is limited only by our imaginations. The peeping Tom that we carry on the inside will have manifest consequences for that which philosophers and theologians normally term self-consciousness.

Paradoxical Levels of Überveillance

In all of these factors rest the multiple paradoxical levels of überveillance. In the first instance, it will be one of the great blunders of the new political order to think that chip implants (or indeed nanodevices) will provide the last inch of detail required to know where a person is, what they are doing, and what they are thinking. Authentic ambient context will always be lacking, and this could further aggravate potential “puppeteers” of any comprehensive surveillance system. Marcus Wigan captures this critical facet of context when he speaks of “asymmetric information held by third parties.” Second, chip implants will not necessarily make a person smarter or more aware (unless someone can afford chip implants that have that effect), but on the contrary, and under the “right” circumstances, may make us increasingly unaware and mute. Third, chip implants are not the panacea they are made out to be - they can fail, they can be stolen, they are not tamper-proof, and they may cause harmful effects to the body. They are a foreign object whose primary function is to relate to the outside world, not to the body itself (as in the case of pacemakers and cochlear implants). Fourth, chip implants at present do not give a person greater control over her space, but allow others to control and to decrease the individual's autonomy, and as a result decrease interpersonal trust at both societal and state levels. Trust is inexorably linked to both metaphysical and moral freedom. Therefore the naive position routinely heard in the public domain, that if you have “nothing to hide, why worry?”, misses the point entirely. Fifth, chip implants will create a presently unimaginable digital divide - we are not referring to computer access here, or Internet access, but access to another mode of existence. The “haves” (implantees) and the “have-nots” (non-implantees) will not be on speaking terms; perhaps this suggests a fresh interpretation of the biblical tower of Babel (Gen. 11:9).

In the scenario, where a universal ID is instituted, unless the implant is removed within its prescribed time, the body will adopt the foreign object and tie it to tissue. At this moment, there will be no exit strategy and no contingency plan; it will be a life sentence to upgrades, virus protection mechanisms, and inescapable intrusion. Imagine a working situation where your computer - the one that stores all your personal data - has been hit by a worm, and becomes increasingly inoperable and subject to overflow errors and connectivity problems. Now imagine the same thing happening with an embedded implant. There would be little choice other than to upgrade or to opt out of the networked world altogether.

A decisive step towards überveillance will be a unique and "non-refundable" identification number (ID). The universal drive to provide us all with cradle-to-grave unique lifetime identifiers (ULIs), which will replace our names, is gaining momentum, especially after September 11. Philosophers have argued that names are the signification of identity and origin; our names possess both sense and reference [24, p. 602f]. Two of the twentieth century's greatest political consciences (one who survived the Stalinist purges and the other the Holocaust), Aleksandr Solzhenitsyn and Primo Levi, have warned us of the connection between murderous regimes and the numbering of individuals. It is far easier to extinguish an individual if you are rubbing out a number rather than a life history.

Aleksandr Solzhenitsyn recounts in The Gulag Archipelago 1918–56 (2007, p. 346f):

[Corrective Labor Camps] quite blatantly borrowed from the Nazis a practice which had proved valuable to them - the substitution of a number for the prisoner's name, his “I”, his human individuality, so that the difference between one man and another was a digit more or less in an otherwise identical row of figures … [i]f you remember all this, it may not surprise you to hear that making him wear numbers was the most hurtful and effective way of damaging a prisoner's self-respect.

Primo Levi writes similarly in his own well-known account of the human condition in The Drowned and the Saved (1989, p. 94f):

Altogether different is what must be said about the tattoo [the number], an altogether autochthonous Auschwitzian invention … [t]he operation was not very painful and lasted no more than a minute, but it was traumatic. Its symbolic meaning was clear to everyone: this is an indelible mark, you will never leave here; this is the mark with which slaves are branded and cattle sent to the slaughter, and this is what you have become. You no longer have a name; this is your new name.

And many centuries before both Solzhenitsyn and Levi were to become acknowledged as two of the greatest political consciences of our times, an exile on the isle of Patmos - during the reign of the Emperor Domitian - referred to the abuses of the emperor cult which was practiced in Asia Minor away from the more sophisticated population of Rome [37, pp. 176–196]. He was Saint John the Evangelist, commonly recognized as the author of the Book of Revelation (c. A.D. 95):

16 Also it causes all, both small and great, both rich and poor, both free and slave, to be marked on the right hand or the forehead, 17 so that no one can buy or sell unless he has the mark, that is, the name of the beast or the number of its name. 18 This calls for wisdom: let him who has understanding reckon the number of the beast, for it is a human number, its number is six hundred and sixty-six (Rev 13:16–18) [RSV, 1973].

The technological infrastructures—the software, the middleware, and the hardware for ULIs—are readily available to support a diverse range of humancentric applications, and increasingly those embedded technologies which will eventually support überveillance. Multi-national corporations, particularly those involved in telecommunications, banking, and health are investing millions (expecting literally billions in return) in identifiable technologies that have a tracking capability. At the same time the media, which in some cases may yield more sway with people than government institutions themselves, squanders its influence and is not intelligently challenging the automatic identification (auto-ID) trajectory. As if in chorus, blockbuster productions from Hollywood are playing up all forms of biometrics as not only hip and smart, but also as unavoidable mini-device fashion accessories for the upwardly mobile and attractive. Advertising plays a dominant role in this cultural tech-rap. Advertisers are well aware that the market is literally limitless and demographically accessible at all levels (and more tantalizingly from cradle-to-grave consumers). Our culture, which in previous generations was for the better part the vanguard against most things detrimental to our collective well-being, is dangerously close to bankrupt (it already is idol worshipping) and has progressively become fecund territory for whatever idiocy might take our fancy. Carl Bernstein [7] captured the atmosphere of recent times very well:

We are in the process of creating what deserves to be called the idiot culture. Not an idiot sub-culture, which every society has bubbling beneath the surface and which can provide harmless fun; but the culture itself. For the first time the weird and the stupid and the coarse are becoming our cultural norm, even our cultural ideal.

Despite the technological fixation with which most of the world is engaged, there is a perceptible mood of collective disquiet that something is not as it should be. In the face of it, this self-deception of "wellness" is not only taking a stronger hold on us, but is also being rationalized and deconstructed on many levels. We must break free of this dangerous daydream to make out the cracks that have already started to appear on the gold-tinted rim of this seeming 21st-century utopia. The machine, the new technicized "gulag archipelago," is ever pitiless and without conscience. It can crush bones, break spirits, and rip out hearts without pausing.

The authors of this article are not anti-government, nor are they conspiracy theorists (though we now know better than to rule out all conspiracy theories). Nor do they believe that these dark scenarios are inevitable. But we do believe that we are close to the point of no return. Others believe that point is much closer [1]. It remains for individuals to speak up, to argue for, and to demand regulation, as has happened in several states in the United States where Acts have been established to prevent microchipping without an individual's consent, i.e., the compulsory electronic tagging of citizens. Our politicians, with a few exceptions, will not legislate on this issue of their own accord, and for a number of reasons: it would involve a multifaceted industry and absorb too much of their time, and there is the fear they might be labelled anti-technology or, worse still, as failing to do all that they can to fight against "terror." This is one of the components of the modern-day Realpolitik, which in its push for a transparent society is bulldozing ahead without any true sensibility for the richness, fullness, and sensitivity of the undergrowth. As an actively engaged community, as a body of concerned researchers with an ecumenical conscience and voice, we can make a difference by postponing or even avoiding some of the doomsday scenarios outlined here.

Finally, the authors would like to underscore three main points. First, nowhere is it suggested in this paper that medical prosthetic or therapeutic devices are unwelcome technological innovations. Second, the positions, projections, and beliefs expressed in this summary do not necessarily reflect those of the individual contributors to this special section. And third, the authors of the papers do embrace all that is vital and dynamic in technology, but reject its rampant application and diffusion without studied consideration of the potential effects and consequences.

References

1. Surveillance Society Clock 23:54, American Civil Liberties Union, Oct. 2007, [online] Available: http://www.aclu.org/privacy/spying/surveillancesocietyclock.html.

2. Norplant system contraceptive inserts, Oct. 2007, [online] Available: http://www.ama-assn.org/ama/pub/category/print/13593.html.

3. "Breaking news: Statement on microchipping", American Veterinary Medical Association, Oct. 2007, [online] Available: http://www.avma.org/aa/microchip/breaking_news_070913_pf.asp.

4. B. Bacheldor, "AMA issues Ethics Code for RFID chip implants", RFID J., Oct. 2007, [online] Available: http://www.rfidjournal.com/article/articleprint/3487/-1/1/.

5. E. Ball, K. Bond, Bess Marion v. Eddie Cafka and ECC Enterprises Inc., Oct. 2007, [online] Available: http://www.itmootcourt.com/2005%20Briefs/Petitioner/Team18.pdf.

6. "Implant chip to identify the dead", BBC News, Jan. 2006, [online] Available: http://news.bbc.co.uk/1/hi/technology/4721175.stm.

7. C. Bernstein, The Guardian, June 1992.

8. P. Burton, K. Stockhausen, The Australian Medical Association's Submission to the Legal and Constitutional Committee's Inquiry into the Privacy Act 1988, Oct. 2007, [online] Available: http://www.ama.com.au/web.nsf/doc/WEEN-69X6DV/$file/Privacy_Submission_to_Senate_Committee.doc.

9. California privacy legislation, State of California: Office of Privacy Protection, July 2007, [online] Available: http://www.privacy.ca.gov/califlegis.htm.

10. "Thai wave disaster largest forensic challenge in years: Expert", Channel News Asia, Feb. 2005, [online] Available: http://www.channelnewsasia.com/stories/afp_asiapacific/view/125459/1/.html.

11. C. Chase, "VIP Verichip", Baja Beach House - Zona VIP, Oct. 2007, [online] Available: http://www.baja-beachclub.com/bajaes/asp/zonavip2.aspx.

12. R. A. Clarke, "Information technology and dataveillance", Commun. ACM, vol. 31, no. 5, pp. 498-512, 1988.

13. S. Covacio, "Technological problems associated with the subcutaneous microchips for human identification (SMHId)", InSITE-Where Parallels Intersect, pp. 843-853, June 2003.

14. "13 diabetics implanted with VeriMed RFID microchip at Boston diabetes EXPO", Medical News Today, Oct. 2007, [online] Available: http://www.medicalnewstoday.com/articles/65560.php.

15. "Medical devices; General hospital and personal use devices; classification of implantable radiofrequency transponder system for patient identification and health information", U.S. Food and Drug Administration-Department of Health and Human Services, vol. 69, no. 237, Oct. 2007, [online] Available: http://www.fda.gov/ohrms/dockets/98fr/0427077.htm.

16. A. Gad, "Legislative Brief 06-13: Human Microchip Implantation", Legislative Briefs from the Legislative Reference Bureau, June 2006, [online] Available: http://www.legis.state.wi.us/lrb/pubs/Lb/06Lb13.pdf.

17. E. Guild, D. Bigo, "The Schengen Border System and Enlargement" in Police and Justice Co-operation and the New European Borders, European Monographs, pp. 121-138, 2002.

18. M. Hawthorne, "Refugees meeting hears proposal to register every human in the world", Sydney Morning Herald, July 2003, [online] Available: http://www.smh.com.au/breaking/2001/12/14/FFX058CU6VC.html.

19. "VeriChip enhances patient wander app", Health Data Management, Oct. 2007, [online] Available: http://healthdatamanagement.com/HDMSearchResultsDetails.cfm?articleId=12361.

20. "VeriChip buys monitoring tech vendor", Health Data Management, July 2005, [online] Available: http://healthdatamanagement.com/HDMSearchResultsDetails.cfm?articleId=12458.

21. "Chips keep tabs on babies, moms", Health Data Management, Oct. 2005, [online] Available: http://healthdatamanagement.com/HDMSearchResultsDetails.cfm?articleId=15439.

22. "Baylor uses RFID to track newborns", Health Data Management, July 2007, [online] Available: http://healthdatamanagement.com/HDMSearchResultsDetails.cfm?articleId=15439.

23. U. Hengartner, P. Steenkiste, "Access control to people location information", ACM Trans. Information Syst. Security, vol. 8, no. 4, pp. 424-456, 2005.

24. "Names" in Oxford Companion to Philosophy, Oxford, U.K.: Oxford Univ. Press, p. 602f, 1995.

25. "Nietzsche, Friedrich" in Oxford Companion to Philosophy, Oxford, U.K.: Oxford Univ. Press, pp. 619-623, 1995.

26. "RFID tags equipped with GPS", Navigadget, Oct. 2007, [online] Available: http://www.navigadget.com/index.php/2007/06/27/rfid-tags-equipped-with-gps/.

27. "Me & my RFIDs", IEEE Spectrum, vol. 4, no. 3, pp. 14-25, Mar. 2007.

28. K. C. Jones, "California passes bill to ban forced RFID tagging", InformationWeek, Sept. 2007, [online] Available: http://www.informationweek.com/shared/printableArticle.jhtml?articleID=201803861.

29. T. Lewan, "Microchips implanted in humans: High-tech helpers or Big Brother's surveillance tools?", The Associated Press, Oct. 2007, [online] Available: http://abcnews.go.com/print?id=3401306.

30. T. Lewan, "Chip implants linked to animal tumors", Associated Press/WashingtonPost.com, Oct. 2007, [online] Available: http://www.washingtonpost.com/wp-dyn/content/article/2007/09/09/AR2007090900467.html.

31. J. Meikle, "Pupils face tracking bugs in school blazers", The Guardian, Aug. 2007, [online] Available: http://www.guardian.co.uk/uk_news/story/0,2152979,00.

32. K. Michael, Selected Works of Dr. Katina Michael, Australia, Wollongong:Univ. of Wollongong, Oct. 2007, [online] Available: http://ro.uow.edu.au/kmichael/.

33. K. Michael, A. Masters, "Realised applications of positioning technologies in defense intelligence" in Applications of Information Systems to Homeland Security and Defense, IDG Press, pp. 164-192, 2006.

34. K. Michael, A. Masters, "The advancement of positioning technologies in defence intelligence" in Applications of Information Systems to Homeland Security and Defense, IDG Press, pp. 193-214, 2006.

35. K. Michael, M. G. Michael, "Towards chipification: The multifunctional body art of the net generation" in Cultural Attitudes Towards Technology and Communication, Tartu, Estonia, pp. 622-641, 2006.

36. K. Michael, M. G. Michael, "Homo electricus and the continued speciation of humans" in The Encyclopedia of Information Ethics and Security, IGI Global, pp. 312-318, 2007.

37. M. G. Michael, Ch IX: Imperial cult in The Number of the Beast 666 (Revelation 13:16-18): Background Sources and Interpretation, Macquarie Univ., 1998.

38. M. G. Michael, "Überveillance: 24/7 × 365 - People tracking and monitoring", Proc. 29th International Conference of Data Protection and Privacy Commissioners: Privacy Horizons Terra Incognita, Sept. 25-28, 2007, [online] Available: http://www.privacyconference2007.gc.ca/Terra_Incognita_program_E.html.

39. S. Morton, "Barcelona clubbers get chipped", BBC News, Oct. 2007, [online] Available: http://news.bbc.co.uk/2/hi/technology/3697940.stm.

40. D. Ratner, M. A. Ratner, Nanotechnology and Homeland Security: New Weapons for New Wars, New Jersey, U.S.A.: Prentice Hall, 2004.

41. J. H. Reichman, "RFID labeling in humans, American Medical Association House of Delegates: Resolution: 6 (A-06)", Reference Committee on Amendments to Constitution and Bylaws, 2006, [online] Available: http://www.ama-assn.org/amal/pub/upload/mm/471/006a06.doc.

42. M. Reynolds, "Despite the hype, microchip implants won't deliver security", Gartner Research, Oct. 2007, [online] Available: http://www.gartner.com/DisplayDocument?doc_cd=121944.

43. "Singapore fights SARS with RFID", RFID J., Aug. 2005, [online] Available: http://www.rfidjournal.com/article/articleprint/446/-1/1/.

44. "I am not a number - Tracking Australian prisoners with wearable RFID tech", RFID Gazette, Oct. 2007, [online] Available: http://www.rfidgazette.org/2006/08/i_am_not_a_numb.html.

45. S. Rodotà, R. Capurro, "Ethical aspects of ICT implants in the human body", Opinion of the European Group on Ethics in Science and New Technologies to the European Commission N° 20, adopted on 16/03/2005, Oct. 2007, [online] Available: http://ec.europa.eu/european_group_ethics/docs/avis20_en.pdf.

46. "Papua Legislative Council deliberating microchip regulation for people with HIV/AIDS", Radio New Zealand International, Oct. 2007, [online] Available: http://www.rnzi.com/pages/news.php?op=read&id=33896.

47. R. M. Sade, "Radio frequency ID devices in humans, Report of the Council on Ethical and Judicial Affairs: CEJA Report 5-A-07", Reference Committee on Amendments to Constitution and Bylaws, Oct. 2007, [online] Available: http://www.ama-assn.org/amal/pub/upload/mm/369/ceja_5a07.pdf.

48. B. K. Schuerenberg, "Implantable RFID chip takes root in CIO: Beta tester praises new mobile device, though some experts see obstacles to widespread adoption", Health Data Management, Feb. 2005, [online] Available: http://www.healthdatamanagement.com/HDMSearchResultsDetails.cfm?articleId=12232.

49. B. K. Schuerenberg, "Patients let RFID get under their skin", Health Data Management, Nov. 2005, [online] Available: http://healthdatamanagement.com/HDMSearchResultsDetails.cfm?articleId=12601.

50. N. D. Somba, "Papua considers 'chipping' people with HIV/AIDS", The Jakarta Post, Oct. 2007, [online] Available: http://www.thejakartapost.com/yesterdaydetail.asp?fileid=20070724.G04.

51. M. L. Songini, "N.D. bans forced RFID chipping: Governor wants a balance between technology, privacy", ComputerWorld, Oct. 2007, [online] Available: http://www.computerworld.com/action/article.do?command=viewArticleBasic&taxonomyId=15&articleId=9016385&intsrc=hm_topic.

52. D. M. Snow, National Security for a New Era: Globalization and Geopolitics, Addison-Wesley, 2005.

53. C. Swedberg, "RFID watches over school kids in Japan", RFID J., Oct. 2007, [online] Available: http://www.rfidjournal.com/article/articleview/2050/1/1/.

54. C. Swedberg, "Alzheimer's care center to carry out VeriChip pilot", RFID J., Oct. 2007, [online] Available: http://www.rfidjournal.com/article/articleview/3340/1/1/.

55. "Chips: High tech aids or tracking tools?", Fairfax Digital: The Age, Oct. 2007, [online] Available: http://www.theage.com.au/news/Technology/Microchip-Implants-Raise-Privacy-Concern/2007/07/22/1184560127138.html.

56. "VeriChip Corporation adds more than 200 hospitals at the American College of Emergency Physicians (ACEP) Conference", VeriChip News Release, Oct. 11, 2007, [online] Available: http://www.verichipcorp.com/news/1192106879.

57. W. Weissert, "Microchips implanted in Mexican officials", Associated Press, Oct. 2007, [online] Available: http://www.msnbc.msn.com/id/5439055/.

58. J. Wilson, "Girl to get tracker implant to ease parents' fears", The Guardian, Oct. 2002, [online] Available: http://www.guardian.co.uk/Print/0,3858,4493297,00.html.

59. Wisconsin Act 482, May 2006, [online] Available: http://www.legis.state.wi.us/2005/data/acts/05Act482.pdf.

60. J. Woolfolk, "Back off Boss: Forcible RFID implants outlawed in California", Mercury News, Oct. 2007, [online] Available: http://www.mercurynews.com/portlet/article/html/fragments/print_article.jsp?articleId=7162880.

61. Macquarie Dictionary, Sydney University, p. 1094, 2009.

62. K. Michael, M. G. Michael, Innovative Automatic Identification and Location-Based Services: From Bar Codes to Chip Implants, Hershey, PA: IGI Global, p. 401, 2009.

63. A. Griggieri, K. Michael, M. G. Michael, "The legal ramifications of microchipping people in the United States of America - A state legislative comparison", Proc. 2009 IEEE Int. Symp. Technology and Society, pp. 1-8, 2009.

64. A. Marburger, J. Coon, K. Fleck, T. Kremer, VeriChip™: Implantable RFID for the Health Industry, June 2005, [online] Available: http://www.thecivilrightonline.com/docs/Verichip_Implantable%20RFID.pdf.

65. 111th Congress, 1st Session, H.R. 11, A Bill: To provide affordable quality health care for all Americans and reduce the growth in health care spending, and for other purposes, Apr. 1, 2010, [online] Available: http://waysandmeans.house.gov/media/pdf/111/AAHCA09001xml.pdf.

66. PositiveID, Health-ID, May 2010, [online] Available: http://www.positiveidcorp.com/health-id.html.

Citation: M. G. Michael, K. Michael, "Toward a State of Überveillance", IEEE Technology and Society Magazine, vol. 29, no. 2, pp. 9-16, Summer 2010 (published 01 June 2010), DOI: 10.1109/MTS.2010.937024.