My Research Programme (2002 - Now)

Implantable Medical Device Tells All: Uberveillance Gets to the Heart of the Matter

In 2015, I provided evidence at an Australian inquiry into the use of subsection 313(3) of the Telecommunications Act 1997 by government agencies to disrupt the operation of illegal online services [1]. I stated to the Standing Committee on Infrastructure and Communications that mandatory metadata retention laws meant blanket surveillance coverage of Australians and visitors to Australia. The intent behind asking Australian service providers to keep subscriber search history data for up to two years was to grant government and law enforcement organizations the ability to search Internet Protocol–based records in the event of suspected criminal activity.

Importantly, I told the committee that, while instituting programs of surveillance through metadata retention laws would likely help to speed up criminal investigations, we should also note that every individual is a consumer, and such programs ultimately come back to bite innocent people through some breach of privacy or security. Enter the idea of uberveillance, which, I told the committee, is “exaggerated surveillance” that allows for interference [1] and that I believe is a threat to our human rights [2]. I strongly advised that invoking section 313 of the Telecommunications Act 1997 should require judicial oversight through the process of a search warrant. My recommendations fell on deaf ears, and, today, we even have the government deliberating over whether to relax metadata laws to allow information to be accessed for both criminal and civil litigation [3], which includes divorces, child custody battles, and business disputes. In June 2017, Australian Prime Minister Malcolm Turnbull even stated that “global social media and messaging companies” need to assist security services’ efforts to fight terrorism by “providing access to encrypted communications” [52].

Consumer Electronics Leave Digital Data Footprints

Of course, Australia is not alone in having metadata retention laws. Numerous countries have adopted these laws or similar directives since 2005, keeping certain types of data for anywhere from 30 days to indefinitely, although the standard length is somewhere between one and two years. For example, since 2005, Italy has retained subscriber information at Internet cafes for 30 days. I recall traveling to Verona in 2008 for the European Conference on Information Systems, forgetting my passport in my hotel room, and being unable to use an Internet cafe to send a message back home because I was carrying no recognized identity information. When I asked why I was unable to send a simple message, I was handed an antiterrorism information leaflet. Italy also retains telephone data for up to two years and Internet service provider (ISP) data for up to 12 months.

Similarly, the United Kingdom retains all telecommunications data for one to two years. It also maintains postal information (sender, receiver data), banking data for up to seven years, and vehicle movements for up to two years. In Germany, metadata retention was established in 2008 under the directive Gesetz zur Neuregelung der Telekommunikationsüberwachung und anderer verdeckter Ermittlungsmaßnahmen sowie zur Umsetzung der Richtlinie 2006/24/EG, but it was overturned in 2010 by the Federal Constitutional Court of Germany, which ruled that the law was unconstitutional because it violated the fundamental right to secrecy of correspondence. In 2015, the matter was revisited, and a compromise was reached to retain telecommunications metadata for up to ten weeks. Mandatory data retention in Sweden was challenged by one holdout ISP, Bahnhof, which was threatened with an approximately US$605,000 fine in November 2014 if it did not comply [4]. Bahnhof defended its stance of protecting the privacy and integrity of its customers by offering a no-logs virtual private network free of charge [5].

Some European Union countries have been deliberating whether to extend metadata retention to chats and social media, but, in the United States, many corporations, including market giants Amazon and Google, voluntarily retain subscriber data. It was reported in The Guardian in 2013 that the United States records Internet metadata not only for itself but for the world at large through the National Security Agency (NSA), which uses its MARINA database to conduct pattern-of-life analysis [6]. Additionally, the 2008 Amendments Act to the Foreign Intelligence Surveillance Act of 1978 increased the time allotted for warrantless surveillance and made additional provisions for emergency eavesdropping. Under section 702 of the amended act, all American citizens’ metadata is now stored. Phone records are kept by the NSA in the MAINWAY telephony metadata collection database [53], and short message service and other text messages worldwide are retained in DISHFIRE [7], [8].

Emerging Forms of Metadata in an Internet of Things World

Figure 1. An artificial pacemaker (serial number 1723182) from St. Jude Medical, with electrode, which was removed from a deceased patient prior to cremation. (Photo courtesy of Wikimedia Commons.)

The upward movement toward a highly interconnected world through the Web of Things and people [9] will only mean that even greater amounts of data will be retained by corporations and government agencies around the world, extending beyond traditional forms of telecommunications data (e.g., phone records, e-mail correspondence, Internet search histories, metadata of images, videos, and other forms of multimedia). It should not surprise us that even medical devices are being touted as soon to be connected to the Internet of Things (IoT) [10]. Heart pacemakers, for instance, already send a steady stream of data back to the manufacturer’s data warehouse (Figure 1). Cardiac rhythmic data is stored on the implantable cardioverter-defibrillator’s (ICD’s) memory and is transmitted wirelessly to a home bedside monitor. Via a network connection, the data find their way to the manufacturer’s data store (Figure 2).

Figure 2. The standard setup for an EKG. A patient lies in a bed with EKG electrodes attached to his chest, upper arms, and legs. A nurse oversees the painless procedure. The ICD in a patient produces an EKG (A), which can automatically be sent to an ICD manufacturer's data store (B). (Image courtesy of Wikimedia Commons.)

In health speak, the ICD setup in the patient’s home is a type of remote monitoring that usually happens when the ICD recipient is in a state of rest, most often while sleeping overnight. It is a bit like how routine computer data backups are scheduled for when network traffic is at its lowest. In the future, an ICD’s proprietary firmware updates may well travel in the other direction, from the manufacturer down to the device, much like a Windows operating system update is installed on a desktop. In the following section, we will explore the implications of access to personal cardiac data emanating from heart pacemakers in two cases.
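The nightly remote-monitoring flow described above (device memory, to bedside monitor, to manufacturer's data store) can be sketched in a few lines of code. This is a toy illustration only: the class and field names are my own, and real ICD telemetry uses proprietary formats and protocols.

```python
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class RhythmRecord:
    timestamp: datetime
    heart_rate_bpm: int


@dataclass
class ICD:
    serial: str
    memory: list = field(default_factory=list)  # onboard cardiac rhythm log

    def log(self, record: RhythmRecord) -> None:
        self.memory.append(record)

    def transmit(self) -> list:
        # Hand the accumulated readings to the bedside monitor and clear memory.
        batch, self.memory = self.memory, []
        return batch


class ManufacturerStore:
    """Stand-in for the manufacturer's remote data warehouse."""

    def __init__(self):
        self.records = {}

    def upload(self, serial: str, batch: list) -> None:
        self.records.setdefault(serial, []).extend(batch)


class BedsideMonitor:
    """Pulls readings from the implant overnight and forwards them upstream."""

    def __init__(self, store: ManufacturerStore):
        self.store = store

    def nightly_sync(self, icd: ICD) -> int:
        batch = icd.transmit()
        self.store.upload(icd.serial, batch)
        return len(batch)
```

Note that in this sketch, as in the real setup Campos confronted, nothing in the pipeline routes a copy of the readings back to the patient.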

CASE 1: HUGO CAMPOS DENIED ACCESS TO HIS PERSONAL CARDIAC DATA

Figure 3. A conventional radiograph of a single-chamber pacemaker. (Photo courtesy of Wikimedia Commons.)

In 2007, scientist Hugo Campos collapsed at a train station and was later horrified to find out that he had to get an ICD for his genetic heart condition. ICDs usually last about seven years before they require replacement (Figure 3). A few years into wearing the device, Campos, a high-end quantified-self user who measured his sleep, exercise, and even alcohol consumption, became inquisitive about how he might gain access to the data generated by his ICD (Figure 4). He made some requests to the ICD’s manufacturer and was told that he was unable to receive the information he sought, despite his doctor having full access. Some doctors could even remotely download a patient’s historical data on a mobile app for 24/7 support during emergency situations (Figure 5). Campos’s heart specialist did grant him access to written interrogation reports, but Campos only saw him about once every six months after his condition stabilized. Additionally, the logs were of no consequence to him on paper, and the fields and layout were predominantly decipherable only by a doctor (Figure 6).

Figure 4. The Nike FuelBand is a wearable computer that has become one of the most popular devices driving the so-called quantified-self trend. (Photo courtesy of Wikimedia Commons.)

Dissatisfied with his denied access, Campos took matters into his own hands and purchased a device on eBay that could help him get the data. He also attended a specialist ICD course and then intercepted the cardiac rhythms being recorded [11]. He got to the data stream but realized that, to make sense of it from a patient perspective, a patient-centric app had to be built. Campos quickly deduced that regulatory and liability concerns were at the heart of the matter from the manufacturer’s perspective. How does a manufacturer continue to improve its product if it does not continually get feedback from the actual ICDs in the field? If manufacturers offered mobile apps for patients, might patients misread their own diagnoses? Is a manufacturer there to enhance life alone or to make a patient feel better about bearing an ICD? Can an ICD be misused by a patient? Or, in the worst-case scenario, what happens in the case of device failure? Or patient death? Would the proof lie onboard? Would the data tell the true story? These are all very interesting questions.

Figure 5. The medical waveform format encoding rule software on a BlackBerry device. It displays medical waveforms, such as EKG (shown), electroencephalogram, and blood pressure. Some doctors have software that allows them to interrogate EKG information, but patients presently do not have access to their own ICD data. (Photo courtesy of Wikimedia Commons.)

Campos might well have acted not only to get what he wanted (access to his data his own way) but also to raise awareness globally as to the type of data being stored remotely by ICDs in patients. He noted in his TEDxCambridge talk in 2011 [12]:

the ICD does a lot more than just prevent a sudden cardiac arrest: it collects a lot of data about its own function and about the patient’s clinical status; it monitors its own battery life; the amount of time it takes to deliver a life-saving shock; it monitors a patient’s heart rhythm, daily activity; and even looks at variations in chest impedance to look if there is build-up of fluids in the chest; so it is a pretty complex little computer you have built into your body. Unfortunately, none of this invaluable data is available to the patient who originates it. I have absolutely no access to it, no knowledge of it.

Doctors, on the other hand, have full 24/7 unrestricted access to this information; even some of the manufacturers of these medical devices offer the ability for doctors to access this information through mobile devices. Compare this with the patients’ experience who have no access to this information. The best we can do is to get a printout or a hardcopy of an interrogation report when you go into the doctor’s office.

Figure 6. An EKG chart. Twelve different derivations of an EKG of a 23-year-old Japanese man. A similar log was provided to Hugo Campos upon his request for six months' worth of EKG readings. (Photo courtesy of Wikimedia Commons.)

Campos decided to sue the manufacturer after he was informed that the data being generated by his ICD measuring his own heart activity was “proprietary data” [13]. Perhaps this is the new side of big data. But it is fraught with legal implications and, as far as I am concerned, blatantly dangerous. If we deduce that a person’s natural biometric data (in this instance, the cardiac rhythm of an individual) belong to a third party, then we are headed into murky waters when we speak of even more invasive technology like deep-brain stimulators [14]. It not only means that the device is not owned by the electrophorus (the bearer of technology) [15], [16], but quite possibly that the cardiac rhythms unique to the individual are also owned by the device manufacturer. We should not be surprised. The “Software and Services” section of Google Glass’s terms of use states that Google has the right to “remotely disable or remove any such Glass service from user systems” at its “sole discretion” [17]. Placing this in the context of ICDs means that a third party almost indelibly has the right to switch someone off.

CASE 2: ROSS COMPTON’S PACEMAKER DATA IS SUBPOENAED FOR CRIMINAL INVESTIGATIONS

Enter the Ross Compton case of Middletown, Ohio. M.G. Michael and I have dubbed it one of the first authentic uberveillance cases in the world, because the technology was not just wearable but embedded. The story goes something like this: On 27 January 2017, 59-year-old Ross Compton was indicted on arson and insurance fraud charges. Police gained a search warrant to obtain his heart pacemaker readings (heart and cardiac rhythms) and called his alibi into question. Data from Compton’s pacemaker before, during, and after the fire in his home broke out were disclosed by the heart pacemaker manufacturer after a subpoena was served. The insurer’s bill for the damage was estimated at about US$400,000. Police became suspicious of Compton when they traced gasoline to Compton’s shoes, trousers, and shirt.

In his statement of events to police, Compton told a story that misaligned and conflicted with his call to 911. Forensic analysts found traces of multiple fires having been lit in various locations in the home. Yet, Compton told police he had rushed his escape, breaking a window with his walking stick to throw some hastily packed bags out and then fleeing the flames himself to safety. Compton also told police that he had an artificial heart with a pump attached, a fact that he thought might help his cause but that was to be his undoing. In this instance, his pacemaker acted akin to a black box recording on an airplane [18].

After securing the heart pacemaker data set, an independent cardiologist was asked to assess the telemetry data and determine whether Compton’s heart function was commensurate with the exertion needed to make a break with personal belongings during a life-threatening fire [19]. The cardiologist noted that, based on the evidence he was given to interpret, it was “highly improbable” that a man who suffered with the medical conditions that Compton did could manage to collect, pack, and remove the number of items that he did from his bedroom window, escape himself, and then proceed to carry these items in front of his house, out of harm’s way (see “Columbo, How to Dial a Murder”). Compton’s own cardio readings, in effect, snitched on him, and none were happier than the law enforcement officer in charge of the case, Lieutenant Jimmy Cunningham, who noted that the pacemaker data, while only a supporting piece of evidence, was vital in proving Compton’s guilt after gasoline was found on his clothing. Evidence-based policing has now well outstripped the more traditional intelligence-led policing approach and has become entrenched given the new realm of big data availability [20], [21].

Columbo, How to Dial a Murder [S1] Columbo says to the murderer:
“You claim that you were at the physician's getting your heart examined…which was true [Columbo unravels a roll of EKG readings]…the electrocardiogram, Sir. Just before three o’clock your physician left you alone for a resting trace. At that moment you were lying down in a restful position and your heart showed a calm, slow, easy beat [pointing to the EKG readout]. Look at this part, right here [Columbo points to the reading], lots of sudden stress, lots of excitement, right here at three o’clock, your heart beating like a hammer just before the dogs attacked…Oh you killed him with a phone call, Sir…I’ll bet my life on it. Very simple case. Not that I’m particularly bright, Sir…I must say, I found you disappointing, I mean your incompetence, you left enough clues to sink a ship. Motive. Opportunity. And for a man of your intelligence Sir, you got caught on a lot of stupid lies. A lot.” [S1] Columbo: How to Dial a Murder. Directed by James Frawley. 1978. Los Angeles, CA: Universal Pictures Home Entertainment, 2006. DVD.

Consumer Electronics Tell a Story

Several things are now of interest to the legal community: first and foremost, how is the search warrant for a person’s pacemaker data executed? In case 1, Campos was denied access to his own ICD data stream by the manufacturer, and yet his doctor had full access. In case 2, Compton’s own data provided authorities with the extra evidence they needed to accuse him of fraud. This is yet another example of seemingly private data being used against an individual (in this instance, the person from whose body the data emanated), but in the future, for instance, the data from one person’s pacemaker might well implicate other members of the public. For example, the pacemaker might be able to prove that someone’s heart rate substantially increased during an episode of domestic violence [22] or that an individual was unfaithful in a marriage based on the cross-matching of his or her time stamp and heart rate data with another.
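To make concrete what such cross-matching might involve, here is a minimal, hypothetical sketch: given two sets of timestamped heart-rate readings, it returns the minutes in which both series show an elevated rate. The threshold value and function names are my own illustration, not drawn from any real forensic tool.

```python
from datetime import datetime


def elevated_minutes(readings, threshold=100):
    """readings: iterable of (timestamp, bpm) pairs.

    Returns the set of minute-resolution timestamps whose reading
    exceeds the threshold.
    """
    return {ts.replace(second=0, microsecond=0)
            for ts, bpm in readings if bpm > threshold}


def co_elevated(readings_a, readings_b, threshold=100):
    """Minutes in which both people's series show an elevated heart rate."""
    return sorted(elevated_minutes(readings_a, threshold) &
                  elevated_minutes(readings_b, threshold))
```

Trivial as the matching logic is, it illustrates why correlating two bodies' telemetry requires nothing more exotic than a shared clock.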

Of course, a consumer electronics device does not have to be embedded to tell a story (Figure 7). It can also be wearable or luggable, as in the case of a Fitbit that was used as a truth detector in an alleged rape case that turned out to be completely fabricated [23]. Lawyers are now beginning to experiment with other wearable gadgetry that helps to show the impact of personal injury cases from accidents (work and nonwork related) on a person’s ability to return to his or her normal course of activities [24] (Figure 8). We can certainly expect to see a rise in criminal and civil litigation that makes use of a person’s Android S Health data, for instance, which measure things like steps taken, stress, heart rate, SpO2, and even location and time (Figure 9). But cases like Compton’s open the floodgates.

Figure 7. A Fitbit, which measures calories, steps, distance, and floors. (Photo courtesy of Wikimedia Commons.)

Figure 8. A close-up of a patient wearing the iRhythm ZIO XT patch, nine days after its placement. (Photo courtesy of Wikimedia Commons.)

I have pondered the evidence itself: are heart rate data really any different from other biometric data, such as deoxyribonucleic acid (DNA)? Are they perhaps more revealing than DNA? Should they be dealt with in the same way? For example, is the chain of custody coming from a pacemaker equal to that of a DNA sample and profile? In some ways, heart rates can be considered a behavioral biometric [25], whereas DNA is actually a cellular sample [26]. No doubt we will be debating the challenges, and extreme perspectives will be hotly contested. But it seems nothing is off limits. If it exists, it can be used for or against you.

Figure 9. (a) and (b) The health-related data from Samsung's S Health application. Unknown to most is that Samsung has diversified its businesses to be a parent company to one of the world's largest health insurers. (Photos courtesy of Katina Michael.)

The Paradox of Uberveillance

In 2006, M.G. Michael coined the term uberveillance to denote “an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices in the human body” [27]. No doubt Michael’s background as a former police officer in the early 1980s, together with his cross-disciplinary studies, had something to do with his insights into the creation of the term [28]. This kind of surveillance does not watch from above, rather it penetrates the body and watches from the inside, looking out [29].

Furthermore, uberveillance “takes that which was static or discrete…and makes it constant and embedded” [30]. It is real-time location and condition monitoring and “has to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought)” [30]. Uberveillance can be used prospectively or retrospectively. It can be applied as a “predictive mechanism for a person’s expected behavior, traits, likes, or dislikes; or it can be based on historical fact” [30].

In 2008, the term uberveillance was entered into the official Macquarie Dictionary of Australia [31]. In research that has spanned more than two decades on the social implications of implantable devices for medical and nonmedical applications, I predicted [15] that the technological trajectory of implantable devices that were once used solely for care purposes would one day be used retrospectively for tracking and monitoring purposes. Even if the consumer electronics in question were there to provide health care (e.g., the pacemaker example) or convenience (e.g., a near-field-communication-enabled smartphone), the underlying dominant function of the service would be control [32]. The socioethical implications of pervasive and persuasive emerging technologies have yet to be fully understood, but increasingly, they will emerge to take center stage in court hearings, like the emergence of DNA evidence and then subsequently global positioning system (GPS) data [33].

Medical device implants provide a very rich source of human activity monitoring, such as the electrocardiogram (EKG), heart rate, and more. Companies specializing in implantables, Medtronic among them, have proposed a future where even healthy people carry a medical implant packed with sensors that could be life sustaining and detect heart problems (among other conditions), reporting them to a care provider and signaling when assistance might be required [34]. Heart readings provide an individual’s rhythmic biometrics and, at the same time, can record increases and decreases in activity. One could extrapolate that it won’t be long before our health insurance providers are asking for the same evidence in exchange for reduced premiums.

Figure 10. A pacemaker cemetery. (Photo courtesy of Wikimedia Commons.)

The future might well be one where we all carry a black box implantable recorder of some sort [35], an alibi that proves our innocence or guilt, minute by minute (Figure 10). Of course, an electronic eye constantly recording our every move brings a new connotation to the wise words expressed in the story of Pinocchio: always let your conscience be your guide. The future black boxes may not be as forgiving as Jiminy Cricket and may be more like Black Mirror’s “The Entire History of You” [36]. But if we assume that these technologies, whether implantable, wearable, or even luggable, are to be completely trusted, then we are wrong.

The contribution of M.G. Michael’s uberveillance is in the emphasis that the uberveillance equation is a paradox. Yes, there are near-real-time data flowing continuously from more points of view than ever [37], closed-circuit TV looking down, smartphones in our pockets recording location and movement, and even implantables in some of us ensuring nontransferability of identity [38]. The proposition is that all this technology in sum total is bulletproof and foolproof, omniscient and omnipresent, a God’s-eye view that cannot be challenged, but for the fact that the infrastructure, the devices, and the software are all too human. And while uberveillance is being touted for good through an IoT world that will collectively make us and our planet more sustainable, there is one big crack in the utopian vision: the data can misrepresent, misinform, and be subject to information manipulation [39]. Researchers are already studying the phenomenon of complex visual information manipulation: how to tell whether data have been tampered with, whether a suspect has been introduced into or removed from a scene of a crime, and other forensic visual analytics [40]. It is why Vladimir Radunovic, director of cybersecurity and e-diplomacy programs at the DiploFoundation, cited M.G. Michael’s contribution that “big data must be followed by big judgment” [41].

What happens in the future if we go down the path of constant bodily monitoring of vital organs and vital signs, where we are all bearing some device or at least wearing one? Will we be in control of our own data, or, as seems obvious at present, will we not be in control? And how might self-incrimination play a role in our daily lives? Or, even worse, might we face individual expectations that can be met only by performing to a theater 24/7 so that our health statistics stack up under whatever measure and cross-examination they are put to, personally or publicly [42]? Can we believe the authenticity of every data stream coming out of a sensor onboard consumer electronics? The answer is no.

Having run many years of GPS data-logging experiments, I can say that a lot can go wrong with sensors, and they are susceptible to outside environmental conditions. For instance, they can log your location miles away (even on another continent), the temperature gauge can play up, time stamps can revert to different time zones, the speed of travel can be wildly inaccurate due to propagation delays in satellites, readings may not be at regular intervals due to some kind of interference, and memory overflow and battery issues, while getting better, are still problematic. The short and long of it is that technology cannot be trusted. At best, it can act as supporting evidence but should never replace eyewitness accounts. Additionally, “the inherent problem with uberveillance is that facts do not always add up to truth (i.e., as in the case of an exclusive disjunction T ⊕ T = F), and predictions based on uberveillance are not always correct” [30].
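A minimal sketch of the kind of plausibility check such logging experiments call for: flag any GPS fix whose timestamp goes backward or that implies a physically implausible travel speed from the previous fix. The 200-km/h threshold and the function names are my own illustration, not from any particular logger.

```python
from datetime import datetime
from math import radians, sin, cos, asin, sqrt


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371.0 * 2 * asin(sqrt(a))


def flag_suspect_fixes(fixes, max_speed_kmh=200.0):
    """fixes: list of (timestamp, lat, lon) tuples in log order.

    Returns the indices of fixes that are implausible relative to the
    previous fix: a clock running backward, or an implied speed beyond
    max_speed_kmh (e.g., a "jump to another continent").
    """
    suspect = []
    for i in range(1, len(fixes)):
        t0, la0, lo0 = fixes[i - 1]
        t1, la1, lo1 = fixes[i]
        dt_h = (t1 - t0).total_seconds() / 3600.0
        if dt_h <= 0:  # timestamp reverted or duplicated
            suspect.append(i)
            continue
        if haversine_km(la0, lo0, la1, lo1) / dt_h > max_speed_kmh:
            suspect.append(i)  # physically implausible jump
    return suspect
```

Filters like this can only flag data as suspect; they cannot restore a trustworthy record, which is precisely the point made above.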

Conclusion

While device manufacturers are challenging in court the claims that their ICDs are hackable [43], highly revered security experts like Bruce Schneier are heavily cautioning against going down the IoT path, no matter how inviting it might look. In his acclaimed blog, Schneier recently wrote [44]:

All computers are hackable…The industry is filled with market failures that, until now, have been largely ignorable. As computers continue to permeate our homes, cars, businesses, these market failures will no longer be tolerable. Our only solution will be regulation, and that regulation will be foisted on us by a government desperate to “do something” in the face of disaster…We also need to reverse the trend to connect everything to the internet. And if we risk harm and even death, we need to think twice about what we connect and what we deliberately leave uncomputerized. If we get this wrong, the computer industry will look like the pharmaceutical industry, or the aircraft industry. But if we get this right, we can maintain the innovative environment of the internet that has given us so much.

The cardiac implantables market is predicted to become a US$43 billion industry by 2020 [45]. Obviously, the stakes are high and getting higher with every breakthrough implantable innovation we develop and bring to market. We will need to address some very pressing questions at hand, as Schneier suggests, through some form of regulation if we are to maintain consumer privacy rights and data security. Joe Carvalko, a former telecommunications engineer and U.S. patent attorney, an associate editor of IEEE Technology and Society Magazine, and a pacemaker recipient, has added much to this discussion already [46], [47]. I highly recommend several of his publications, including “Who Should Own In-the-Body Medical Data in the Age of eHealth?” [48] and an ABA publication coauthored with Cara Morris, The Science and Technology Guidebook for Lawyers [49]. Carvalko is a thought leader in this space, and I encourage you to listen to his podcast [50] and also to read his speculative fiction novel, Death by Internet [51], which is hot off the press and wrestles with some of the issues raised in this article.

REFERENCES

[1] K. Michael, M. Thistlethwaite, M. Rowland, and K. Pitt. (2015, Mar. 6). Standing Committee on Infrastructure and Communications, Section 313 of the Telecommunications Act 1997. [Online]. Available: http://parlinfo.aph.gov.au/parlInfo/search/display/display.w3p;db=COMMITTEES;id=committees%2Fcommrep%2Fd8727a07-ba09-4a91-9920-73d21e446d1d%2F0006;query=Id%3A%22committees%2Fcommrep%2Fd8727a07-ba09-4a91-9920-73d21e446d1d%2F0000%22

[2] S. Bronitt and K. Michael, “Human rights, regulation, and national security,” IEEE Technol. Soc. Mag., vol. 31, pp. 15–16, 2012.

[3] B. Hall. (2016, Dec. 22). Australians’ phone and email records could be used in civil lawsuits. Sydney Morning Herald. [Online]. Available: http://www.smh.com.au/federal-politics/political-news/australians-phone-and-email-records-could-be-used-in-civil-lawsuits-20161222-gtgdy6.html

[4] PureVPN. (2015, Oct. 14). Data retention laws—an update. [Online]. Available: https://www.purevpn.com/blog/data-retention-laws-by-countries/

[5] D. Crawford. (2014, Nov. 18). Renegade Swedish ISP offers all customers VPN. Best VPN. [Online]. Available: https://www.bestvpn.com/blog/11806/renegade-swedish-isp-offers-customers-vpn/

[6] J. Ball. (2013, Oct. 1). NSA stores metadata of millions of web users for up to a year, secret files show. Guardian. [Online]. Available: https://www.theguardian.com/world/2013/sep/30/nsa-americans-metadata-year-documents

[7] J. S. Granick, American Spies: Modern Surveillance, Why You Should Care, and What to Do About It. Cambridge, U.K.: Cambridge Univ. Press, 2017.

[8] A. Gregory, American Surveillance: Intelligence, Privacy, and the Fourth Amendment. Madison: Univ. of Wisconsin Press, 2016.

[9] K. Michael, G. Roussos, G. Q. Huang, A. Chattopadhyay, R. Gadh, B. S. Prabhu, and P. Chu, “Planetary-scale RFID services in an age of uberveillance,” Proc. IEEE, vol. 98, no. 9, pp. 1663–1671, 2010.

[10] N. Lars. (2015, Mar. 26). Connected medical devices, apps: Are they leading the IoT revolution—or vice versa? Wired. [Online]. Available: https://www.wired.com/insights/2014/06/connected-medical-devices-apps-leading-iot-revolution-vice-versa/

[11] H. Campos. (2015). The heart of the matter. Slate. [Online]. Available: http://www.slate.com/articles/technology/future_tense/2015/03/patients_should_be_allowed_to_access_data_generated_by_implanted_devices.html

[12] H. Campos. (2011). Fighting for the right to open his heart data: Hugo Campos at TEDxCambridge 2011. [Online]. Available: https://www.youtube.com/watch?v=oro19-l5M8k

[13] D. Hinckley. (2016, Feb. 22). This big brother/big data business goes way beyond Apple and the FBI. Huffington Post. [Online]. Available: http://www.huffingtonpost.com/david-hinckley/this-big-brotherbigdata_b_9292744.html

[14] K. Michael, “Mental health, implantables, and side effects,” IEEE Technol. Soc. Mag., vol. 34, no. 2, pp. 5–17, 2015.

[15] K. Michael, “The technological trajectory of the automatic identification industry: The application of the systems of innovation (SI) framework for the characterisation and prediction of the auto-ID industry,” Ph.D. dissertation, School of Information Technology and Computer Science, Univ. of Wollongong, Wollongong, Australia, 2003.

[16] K. Michael and M. G. Michael, “Homo electricus and the continued speciation of humans,” in The Encyclopedia of Information Ethics and Security, M. Quigley, Ed. Hershey, PA: IGI Global, 2007, pp. 312–318.

[17] Google Glass. (2014, Aug. 19). Glass terms of use. [Online]. Available: https://www.google.com/glass/termsofuse/

[18] K. Michael and M. G. Michael, “Implementing ‘namebers’ using microchip implants: The black box beneath the skin,” in This Pervasive Day: The Potential and Perils of Pervasive Computing, J. Pitt, Ed. London, U.K.: Imperial College Press, 2011.

[19] D. Smith. (2017, Feb. 4). Pacemaker data used to charge alleged arsonist. Jonathan Turley. [Online]. Available: https://jonathanturley.org/2017/02/04/pacemaker-data-used-to-charge-alleged-arsonist/

[20] K. Michael, “Big data and policing: The pros and cons of using situational awareness for proactive criminalisation,” presented at the Human Rights and Policing Conf., Australian National University, Canberra, Apr. 16, 2013.

[21] K. Michael and G. L. Rose, “Human tracking technology in mutual legal assistance and police inter-state cooperation in international crimes,” in From Dataveillance to Überveillance and the Realpolitik of the Transparent Society (The Second Workshop on Social Implications of National Security), K. Michael and M. G. Michael, Eds. Wollongong, Australia: University of Wollongong, 2007.

[22] F. Gerry, “Using data to combat human rights abuses,” IEEE Technol. Soc. Mag., vol. 33, no. 4, pp. 42–43, 2014.

[23] J. Gershman. (2016, Apr. 21). Prosecutors say Fitbit device exposed fibbing in rape case. Wall Street Journal. [Online]. Available: http://blogs.wsj.com/law/2016/04/21/prosecutors-say-fitbit-device-exposed-fibbing-in-rape-case/

[24] P. Olson. (2014, Nov. 16). Fitbit data now being used in the courtroom. Forbes. [Online]. Available: https://www.forbes.com/sites/parmyolson/2014/11/16/fitbit-data-court-room-personal-injury-claim/#459434e37379

[25] K. Michael and M. G. Michael, “The social and behavioural implications of location-based services,” J. Location Based Services, vol. 5, no. 3–4, pp. 121–137, Sept.–Dec. 2011.

[26] K. Michael, “The European court of human rights ruling against the policy of keeping fingerprints and DNA samples of criminal suspects in Britain, Wales and Northern Ireland: The case of S. and Marper v United Kingdom,” in The Social Implications of Covert Policing (Workshop on the Social Implications of National Security, 2009), S. Bronitt, C. Harfield, and K. Michael, Eds. Wollongong, Australia: University of Wollongong, 2010, pp. 131–155.

[27] M. G. Michael and K. Michael, “National security: The social implications of the politics of transparency,” Prometheus, vol. 24, no. 4, pp. 359–364, 2006.

[28] M. G. Michael, “On the ‘birth’ of uberveillance,” in Uberveillance and the Social Implications of Microchip Implants, M. G. Michael and K. Michael, Eds. Hershey, PA: IGI Global, 2014.

[29] M. G. Michael and K. Michael, “A note on uberveillance,” in From Dataveillance to Überveillance and the Realpolitik of the Transparent Society (The Second Workshop on Social Implications of National Security), M. G. Michael and K. Michael, Eds. Wollongong, Australia: University of Wollongong, 2007.

[30] M. G. Michael and K. Michael, “Toward a state of uberveillance,” IEEE Technol. Soc. Mag., vol. 29, pp. 9–16, 2010.

[31] M. G. Michael and K. Michael, “Uberveillance,” in Fifth Edition of the Macquarie Dictionary, S. Butler, Ed. Sydney, Australia: Sydney University, 2009.

[32] A. Masters and K. Michael, “Lend me your arms: The use and implications of humancentric RFID,” Electron. Commerce Res. Applicat., vol. 6, no. 1, pp. 29–39, 2007.

[33] K. D. Stephan, K. Michael, M. G. Michael, L. Jacob, and E. P. Anesta, “Social implications of technology: The past, the present, and the future,” Proc. IEEE, vol. 100, pp. 1752–1781, 2012.

[34] E. Strickland. (2014, June 10). Medtronic wants to implant sensors in everyone. IEEE Spectrum. [Online]. Available: http://spectrum.ieee.org/tech-talk/biomedical/devices/medtronic-wants-to-implant-sensors-in-everyone

[35] K. Michael, “The benefits and harms of national security technologies,” presented at the Int. Women in Law Enforcement Conf., Hyderabad, India, 2015.

[36] J. Armstrong and B. Welsh. (2011). “The entire history of you,” Black Mirror, C. Brooker, Ed. [Online]. Available: https://www.youtube.com/watch?v=Sw3GIR70HAY

[37] K. Michael, “Sousveillance and point of view technologies in law enforcement,” presented at the Sixth Workshop on the Social Implications of National Security: Sousveillance and Point of View Technologies in Law Enforcement, University of Sydney, Australia, 2012.

[38] K. Albrecht and K. Michael, “Connected: To everyone and everything,” IEEE Technol. Soc. Mag., vol. 32, pp. 31–34, 2013.

[39] M. G. Michael, “The paradox of the uberveillance equation,” IEEE Technol. Soc. Mag., vol. 35, no. 3, pp. 14–16, 20, 2016.

[40] K. Michael, “The final cut—tampering with direct evidence from wearable computers,” presented at the Fifth Int. Conf. Multimedia Information Networking and Security (MINES 2013), Beijing, China, 2013.

[41] V. Radunovic, “Internet governance, security, privacy and the ethical dimension of ICTs in 2030,” IEEE Technol. Soc. Mag., vol. 35, no. 3, pp. 12–14, 2016.

[42] K. Michael. (2011, Sept. 12). The microchipping of people and the uberveillance trajectory. Social Interface. [Online]. Available: http://socialinterface.blogspot.com.au/2011/08/microchipping-of-people-and.html

[43] O. Ford. (2017, Jan. 12). Post-merger Abbott moves into 2017 with renewed focus, still faces hurdles. J.P. Morgan Healthcare Conf. 2017. [Online]. Available: http://www.medicaldevicedaily.com/servlet/com.accumedia.web.Dispatcher?next=bioWorldHeadlines_article&forceid=94497

[44] B. Schneier. (2017, Feb. 1). Security and the Internet of Things: Schneier on security. [Online]. Available: https://www.schneier.com/blog/archives/2017/02/security_and_th.html

[45] IndustryARC. (2015, July 30). Cardiac implantable devices market to reach $43 billion by 2020. GlobeNewswire. [Online]. Available: https://globenewswire.com/news-release/2015/07/30/756345/10143745/en/Cardiac-Implantable-Devices-Market-to-Reach-43-Billion-By-2020.html

[46] J. Carvalko, The Techno-Human Shell: A Jump in the Evolutionary Gap. Mechanicsburg, PA: Sunbury Press, 2013.

[47] J. Carvalko and C. Morris, “Crowdsourcing biological specimen identification: Consumer technology applied to health-care access,” IEEE Consum. Electron. Mag., vol. 4, no. 1, pp. 90–93, 2014.

[48] J. Carvalko, “Who should own in-the-body medical data in the age of ehealth?” IEEE Technol. Soc. Mag., vol. 33, no. 2, pp. 36–37, 2014.

[49] J. Carvalko and C. Morris, The Science and Technology Guidebook for Lawyers. New York: ABA, 2014.

[50] K. Michael and J. Carvalko. (2016, June 20). Joseph Carvalko speaks with Katina Michael on his non-fiction and fiction pieces. [Online]. Available: https://www.youtube.com/watch?v=p4JyVCba6VM

[51] J. Carvalko, Death by Internet. Mechanicsburg, PA: Sunbury Press, 2016.

[52] R. Pearce. (2017, June 7). “No-one’s talking about backdoors” for encrypted services, says PM’s cyber guy. Computerworld. [Online]. Available: https://www.computerworld.com.au/article/620329/no-one-talking-about-backdoors-says-pm-cyber-guy/

[53] M. Ambinder. (2013, Aug. 14). An educated guess about how the NSA is structured. The Atlantic. [Online]. Available: https://www.theatlantic.com/technology/archive/2013/08/an-educated-guess-about-how-the-nsa-is-structured/278697/

Acknowledgment

A short form of this article was presented as a video keynote speech for the Fourth International Conference on Innovations in Information, Embedded and Communication Systems in Coimbatore, India, on 17 March 2017. The video is available at https://www.youtube.com/watch?v=bEKLDhNfZio.

Keywords

Metadata, Electrocardiography, Pacemakers, Heart beat, Telecommunication services, Implants, Biomedical equipment, cardiology, criminal law, medical computing, police data processing, transport protocols, implantable medical device, heart, Australian inquiry, government agencies, illegal online services, mandatory metadata retention laws, government organizations, law enforcement organizations, Internet Protocol

Citation: Katina Michael, 2017, "Implantable Medical Device Tells All: Uberveillance Gets to the Heart of the Matter", IEEE Consumer Electronics Magazine, Vol. 6, No. 4, Oct. 2017, pp. 107–115, DOI: 10.1109/MCE.2017.2714279.

 

Using a Social-Ethical Framework to Evaluate Location-Based Services

Abstract


The idea of an Internet of Things has matured since its inception as a concept in 1999. People today speak openly of a Web of Things and People, and even more broadly of an Internet of Everything. As our relationships become more and more complex and enmeshed through the use of advanced technologies, we have pondered ways to simplify flows of communications, to collect meaningful data, and to use them to make timely decisions with respect to optimisation and efficiency. At their core, these flows of communications are pathways to registers of interaction, and tell the intricate story of outputs at various units of analysis: things, vehicles, animals, people, organisations, industries, even governments. In this trend toward evidence-based enquiry, data is the enabling force driving the growth of IoT infrastructure. This paper uses the case of location-based services, which are integral to IoT approaches, to demonstrate that new technologies are complex in their effects on society. Fundamental to IoT is the spatial element, and through this capability, the tracking and monitoring of everything, from the smallest nut and bolt, to the largest shipping liner, to the mapping of planet Earth, and from the whereabouts of the minor to that of the prime minister. How this information is stored, who has access to it, and what it will be used for remain open and contestable questions. In this case study of location-based services we concentrate on control and trust, two overarching themes that have been very much neglected, and use the outcomes of this research to inform the development of a socio-ethical conceptual framework that can be applied to minimise the unintended negative consequences of advanced technologies. We posit that it is not enough to claim objectivity through information ethics approaches alone, and present instead a socio-ethical impact framework. Sociality therefore binds together that higher ideal of praxis where the living thing (e.g. the human) is the central and most valued actor of a system.

Agenda:

Introduction 1.1

Control 1.2

Surveillance 1.2.1

Common surveillance metaphors 1.2.2

Applying surveillance metaphors to LBS 1.2.3

‘Geoslavery’ 1.2.4

From state-based to citizen level surveillance 1.2.5

Dataveillance 1.2.6

Risks associated with dataveillance 1.2.7

Loss of control 1.2.8

Studies focussing on user requirements for control 1.2.9

Monitoring using LBS: control versus care? 1.2.10

Sousveillance 1.2.11

Sousveillance, ‘reflectionism’ and control 1.2.12

Towards überveillance 1.2.13

Implications of überveillance on control 1.2.14

Comparing the different forms of ‘veillance’ 1.2.15

Identification 1.2.16

Social sorting 1.2.17

Profiling 1.2.18

Digital personas and dossiers 1.2.19

Trust 1.3

Trust in the state 1.3.1

Balancing trust and privacy in emergency services 1.3.2

Trust-related implications of surveillance in the interest of national security 1.3.3

Need for justification and cultural sensitivity 1.3.4

Trust in corporations/LBS/IoT providers 1.3.5

Importance of identity and privacy protection to trust 1.3.6

Maintaining consumer trust 1.3.7

Trust in individuals/others 1.3.8

Consequences of workplace monitoring 1.3.9

Location-monitoring amongst friends 1.3.10

Location tracking for protection 1.3.11

LBS/IoT is a ‘double-edged sword’ 1.3.12

Discussion 1.4

The Internet of Things (IoT) and LBS: extending the discussion on control and trust 1.4.1

Control- and trust-related challenges in the IoT 1.4.2

Ethical analysis: proposing a socio-ethical conceptual framework 1.4.3

The need for objectivity 1.4.4

Difficulties associated with objectivity 1.4.5

Conclusion 1.5

 

Introduction 1.1

Locative technologies are a key component of the Internet of Things (IoT). Some scholars go so far as to say that location is the single most important component enabling the monitoring and tracking of subjects and objects. Knowing where something or someone is, is of greater importance than knowing who they are, because it or they can then be found, independent of what or who they are. Location also grants us that unique position on the earth’s surface, providing one of the vital pieces of information forming the distance, speed and time matrix. A unique ID, formed around an IP address in an IoT world, presents us with the capability to label every living and non-living thing and to recollect it, adding to its history over its longer-term physical lifetime. But without knowing where something is, even if we know that an action is required toward some level of maintenance, we cannot be responsive. Since the introduction of electronic databases, providing accurate records for transaction processing has been a primary aim. Today, however, we are attempting to increase visibility using high-resolution geographic details, we are contextualising events through discrete and sometimes continuous sensor-based rich audio-visual data collection, and we are observing how mobile subjects and objects interact with the built environment. We are no longer satisfied with an approach that merely identifies all things; we wish to be able to recollect or activate them on demand, to understand associations and affiliations, and to create a digital chronicle of their history that provides insights toward sustainability.
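
The identify/locate/recollect cycle described above can be illustrated with a minimal sketch. All names here (Registry, Thing, record_fix, locate) are hypothetical and stand in for no real IoT API; the IPv6-style string merely plays the role of the unique ID discussed in the text.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, List

@dataclass
class LocationFix:
    """A single timestamped position report for a thing."""
    timestamp: datetime
    latitude: float
    longitude: float

@dataclass
class Thing:
    uid: str                                    # unique ID, e.g. an IPv6 address
    label: str
    history: List[LocationFix] = field(default_factory=list)

class Registry:
    """Hypothetical register mapping unique IDs to location histories."""

    def __init__(self) -> None:
        self._things: Dict[str, Thing] = {}

    def register(self, uid: str, label: str) -> Thing:
        # Label a thing once; its digital chronicle accumulates thereafter.
        thing = Thing(uid, label)
        self._things[uid] = thing
        return thing

    def record_fix(self, uid: str, lat: float, lon: float) -> None:
        # Each fix adds to the thing's history over its physical lifetime.
        fix = LocationFix(datetime.now(timezone.utc), lat, lon)
        self._things[uid].history.append(fix)

    def locate(self, uid: str) -> LocationFix:
        # "Recollect on demand": return the latest known position.
        return self._things[uid].history[-1]

registry = Registry()
registry.register("2001:db8::1", "shipping container 42")
registry.record_fix("2001:db8::1", -34.4278, 150.8931)   # Wollongong
registry.record_fix("2001:db8::1", -33.8688, 151.2093)   # Sydney
latest = registry.locate("2001:db8::1")
```

The point of the sketch is that once a unique ID exists, the location history follows almost for free, which is precisely why the spatial element carries the surveillance implications examined below.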

There is thus mounting pressure to ethically justify the social and behavioural tracking of people and things in everyday life. Simply because we have the means to do something does not mean we should do it. We are told that through the new knowledge gained from big data we can reduce carbon emissions, eradicate poverty, grant all people equity in health services, better provision for expected food shortages, and utilise energy resources optimally; in short, make the world a better place. This utopian view might well be the vision that the tech sector wishes to adopt as an honourable marketing strategy, but the reality of thousands of years of history tells us that technology does not, of its own accord, necessarily make things better. In fact, it has often made some aspects of life, such as conflict and war, much worse through the use of sophisticated modern techniques. We could argue that IoT will allow for care-based surveillance that will bring aid to individuals and families according to need, but the reality is that wherever people are concerned, technology may be exploited as a means of control. Control on its own is not necessarily an evil; it all depends on how the functionality of a given technology is applied. Applied negatively, the recipient of this control orientation learns distrust instead of trust, which then causes a chain reaction throughout society, especially with respect to privacy and security. We need only look at the techniques espoused by some governments in the last 200 years to acknowledge that heinous crimes against humanity (e.g. democide) have been committed with new technological armaments (Rummel, 1997) to the detriment of the citizenry.

A socio-ethical framework is proposed as a starting point for seeking to understand the social implications of location services, applicable to current and future applications within IoT infrastructure. To stop at critiquing services using solely an information ethics-based approach is to fall short. Today’s converging services and systems require a greater scope of orientation to ask more generally how society may be affected at large, not just whether information is being collected, stored, and shared appropriately. To ask questions about how location services and IoT technology will directly and indirectly change society has far greater importance for the longer term vision of person-to-person and person-to-thing interactions than simply studying various attributes in a given register.

Studies addressing the social implications of emerging technologies, such as LBS, generally reflect on the risks and ethical dilemmas resulting from the implementation of a particular technology within a given social context. While numerous approaches to ethics exist, all are inextricably linked to ideas of morality, and an ability to distinguish good conduct from bad. Ethics, in simple terms, can be considered as the “study of morality” (Quinn 2006, p. 55), where morality refers to a “system of rules for guiding human conduct and principles for evaluating those rules” (Tavani 2007, p. 32). This definition is shared by Elliot and Phillips (2004, p. 465), who regard ethics as “a set of rules, or a decision procedure, or both, intended to provide the conditions under which the greatest number of human beings can succeed in ‘flourishing’, where ‘flourishing’ is defined as living a fully human life” (O'Connor and Godar 2003, p. 248).

According to the literature, there are two prominent ethical dilemmas that emerge with respect to locating a person or thing in an Internet of Things world: first, the risk of unauthorised disclosure of one’s location, which is a breach of privacy; and second, the possibility of increased monitoring leading to unwarranted surveillance by institutions and individuals. The socio-ethical implications of LBS in the context of IoT can therefore be explored based on these two major factors. IoT more broadly, however, can be examined by studying numerous social and ethical dilemmas from differing perspectives. Michael et al. (2006a, pp. 1-10) propose a framework for considering the ethical challenges emerging from the use of GPS tracking and monitoring solutions in the control, convenience and care usability contexts. The authors examine these contexts in view of the four ethical dimensions of privacy, accuracy, property and accessibility (Michael et al. 2006a, pp. 4-5). Alternatively, Elliot and Phillips (2004, p. 463) discuss the social and ethical issues associated with m-commerce and wireless computing in view of the privacy and access, security and reliability challenges. The authors claim that factors such as trust and control are of great importance in the organisational context (Elliot and Phillips 2004, p. 470). Similar studies propose that the major themes regarding the social implications of LBS be summarised as control, trust, privacy and security (Perusco et al. 2006; Perusco and Michael 2007). These themes provide a conceptual framework for reviewing relevant literature in a structured fashion, given that a large number of studies are available in the respective areas.

This article, in the first instance, focusses on the control- and trust-related socio-ethical challenges arising from the deployment of LBS, two themes that are yet to receive comprehensive coverage in the literature. This is followed by an examination of LBS in the context of the Internet of Things (IoT) and the ensuing ethical considerations. A socio-ethical framework is then proposed as a valid starting point for addressing the social implications of LBS and delivering a conceptual framework that is applicable to current LBS use cases and future applications within an Internet of Things world.

Control 1.2

Control, according to the Oxford Dictionary (2012a), refers to “the power to influence or direct people’s behaviour or the course of events”. With respect to LBS, this theme is examined in terms of a number of important concepts, notably surveillance, dataveillance, sousveillance and überveillance scholarship.

Surveillance 1.2.1

A prevailing notion in relation to control and LBS is the idea of exerting power over individuals through various forms of surveillance. Surveillance, according to sociologist David Lyon, “is the focused, systematic and routine attention to personal details for the purposes of influence, management, protection or direction,” although Lyon admits that there are exceptions to this general definition (Lyon 2007, p. 14). Surveillance has also been described as the process of methodically monitoring the behaviour, statements, associates, actions and/or communications of an individual or individuals, and is centred on information collection (Clarke 1997; Clarke 2005, p. 9).

The act of surveillance, according to Clarke (1988; 1997) can either take the form of personal surveillance of a specific individual or mass surveillance of groups of interest. Wigan and Clarke (2006, p. 392) also introduce the categories of object surveillance of a particular item and area surveillance of a physical enclosure. Additional means of expressing the characteristics of surveillance exist. For example, the phrase “surveillance schemes” has been used to describe the various surveillance initiatives available (Clarke 2007a, p. 28). Such schemes have been demonstrated through the use of a number of mini cases or vignettes, which include, but are not limited to, baby monitoring, acute health care, staff movement monitoring, vehicle monitoring, goods monitoring, freight interchange-point monitoring, monitoring of human-attached chips, monitoring of human-embedded chips, and continuous monitoring of chips (Clarke 2007c; Clarke 2007b, pp. 47-60). The vignettes are intended to aid in understanding the desirable and undesirable social impacts resulting from respective schemes.

Common surveillance metaphors 1.2.2

In examining the theme of control with respect to LBS, it is valuable to initially refer to general surveillance scholarship to aid in understanding the link between LBS and surveillance. Surveillance literature is somewhat dominated by the use of metaphors to express the phenomenon. A prevalent metaphor is that of the panopticon, first introduced by Jeremy Bentham (Bentham and Bowring 1843), and later examined by Michel Foucault (1977). Foucault’s seminal piece Discipline and Punish traces the history of punishment, commencing with the torture of the body in the eighteenth century, through to more modern forms of punishment targeted at the soul (Foucault 1977). In particular, Foucault’s account offers commentary on the notions of surveillance, control and power through his examination of Bentham’s panopticon, which are pertinent in analysing surveillance in general and monitoring facilitated by LBS in particular. The panopticon, or “Inspection-House” (Bentham and Bowring 1843, p. 37), refers to Bentham’s design for a prison based on the essential notion of “seeing without being seen” (p. 44). The architecture of the panopticon is as follows:

“The building is circular. The apartments of the prisoners occupy the circumference. You may call them, if you please, the cells... The apartment of the inspector occupies the centre; you may call it if you please the inspector's lodge. It will be convenient in most, if not in all cases, to have a vacant space or area all round, between such centre and such circumference.  You may call it if you please the intermediate or annular area” (Bentham and Bowring 1843, pp. 40-41).

Foucault (1977, p. 200) further illustrates the main features of the inspection-house, and their subsequent implications on constant visibility:

“By the effect of backlighting, one can observe from the tower [‘lodge’], standing out precisely against the light, the small captive shadows in the cells of the periphery. They are like so many cages, so many small theatres, in which each actor is alone, perfectly individualized and constantly visible...Full lighting and the eye of a supervisor [‘inspector’] capture better than darkness, which ultimately protected. Visibility is a trap.”

While commonly conceived as ideal for the prison arrangement, the panopticon design is applicable and adaptable to a wide range of establishments, including but not limited to work sites, hospitals, schools, or any establishment in which individuals “are to be kept under inspection” (Bentham and Bowring 1843, p. 37). It has been suggested, however, that the panopticon functions as a tool for mass (as opposed to personal) surveillance in which large numbers of individuals are monitored, in an efficient sense, by a small number (Clarke 2005, p. 9). This differs from the more efficient, automated means of dataveillance (to be examined shortly). In enabling mass surveillance, the panopticon theoretically allows power to be at once visible and unverifiable: the inmate can always see the tower but never knows from where, or whether, he is being observed, and so comes to discipline himself. Foucault (1977, pp. 202-203) provides a succinct summary of this point:

“He who is subjected to a field of visibility, and who knows it, assumes responsibility for the constraints of power; he makes them play spontaneously upon himself; he inscribes in himself the power relation in which he simultaneously plays both roles; he becomes the principle of his own subjection.”

This self-disciplinary mechanism functions similarly to, and can somewhat be paralleled with, various notions in George Orwell’s classic novel Nineteen Eighty-Four (Orwell 1949), also a common reference point in surveillance literature. Nineteen Eighty-Four has been particularly influential in the surveillance realm, notably due to the use of “Big Brother” as a symbol of totalitarian, state-based surveillance. Big Brother’s inescapable presence is reflected in the nature of surveillance activities: monitoring is constant and omnipresent, and “[n]othing was your own except the few cubic centimetres inside your skull” (Orwell 1949, p. 29). The oppressive authority figure of Big Brother possesses the ability to persistently monitor and control the lives of individuals, employing numerous mechanisms to exert power and control over his populace as a reminder of his unavoidable gaze.

One such mechanism is the use of telescreens as the technological solution enabling surveillance practices to be applied. The telescreens operate as a form of self-disciplinary tool by way of reinforcing the idea that citizens are under constant scrutiny (in a similar fashion to the inspector’s lodge in the panopticon metaphor). The telescreens inevitably influence behaviours, enabling the state to maintain control over actions and thoughts, and to impose appropriate punishments in the case of an offence. This is demonstrated in the following excerpt:

“It was terribly dangerous to let your thoughts wander when you were in any public place or within range of a telescreen. The smallest thing could give you away. A nervous tic, an unconscious look of anxiety, a habit of muttering to yourself – anything that carried with it the suggestion of abnormality, of having something to hide. In any case, to wear an improper expression on your face (to look incredulous when a victory was announced, for example) was itself a punishable offence” (Orwell 1949, p. 65).

The Internet of Things, with its ability to locate and determine who or what is related to one another using a multiplicity of technologies, will enable authorities in power to infer what someone is likely to do in a given context. Past behavioural patterns can, for example, reveal a likely course of action with relatively no prediction required. IoT in all its glory will provide complete visibility. The question is: what are the risks associated with providing that kind of capability to the state or private enterprise? In scenario analysis we can ponder how IoT in a given context will be used for good, how it will be used for bad, and a neutral case in which it will have no effect whatsoever because the data stream is ignored by the system owner. While IoT has been touted as the ultimate in providing great organisational operational returns, one can see how it lends itself to location-based tracking and monitoring under the panopticon metaphor. Paper records and registers were used during World War II for the purposes of segregation; IoT, and especially the ability to “locate on demand,” may well be used for similar types of control purposes.

Applying surveillance metaphors to LBS 1.2.3

The aforementioned surveillance metaphors can be directly applied to the case of LBS within IoT. In the first instance, it can be perceived that the exploitation of emerging technologies, such as LBS, extends the notion of the panopticon in a manner that allows for inspection or surveillance to take place regardless of geographic boundaries or physical locations. When applying the idea of the panopticon to modern technologies, Lyon suggests that “Bentham’s panopticon gives way to the electronic superpanopticon” (Lyon 2001, p. 108). With respect to LBS, this superpanopticon is not limited to and by the physical boundaries of a particular establishment, but is rather reliant on the nature and capabilities of the mobile devices used for ‘inspection’. In an article titled “The Panopticon's Changing Geography,” Dobson and Fisher (2007) also discuss the progress and various manifestations of surveillance technology, specifically the panopticon, and the consequent implications for power relationships. From Bentham's architectural design, to the electronic panopticon depicted by Orwell, to contemporary forms of electronic surveillance including LBS and covert human tracking, Dobson and Fisher (2007, pp. 308-311) claim that all forms of watching enable continuous surveillance either as part of their primary or secondary purpose. They compare four means of surveillance: analogue technologies as used by spies, which have unlimited geographic coverage and are very expensive to own and operate; Bentham’s original panopticon, where the geographic view was internal to a building; Orwell’s Big Brother view, which was bound by the extent of television cables; and finally human tracking systems, which are limited only by the availability and granularity of cell phone towers.

A key factor in applying the panopticon metaphor to IoT is that individuals, through the use of mobile location devices and technologies, will be constantly aware of their visibility and will assume the knowledge that an ‘inspector’ may be monitoring their location and other available information remotely at any given time. Mobile location devices may similarly replace Orwell’s idea of the telescreens as Big Brother’s primary surveillance technology, resulting in a situation in which the user is aiding in the process of location data collection and thereby surveillance. This creates, as maintained by Andrejevic (2007, p. 95), a “widening ‘digital enclosure’ within which a variety of interactive devices that provide convenience and customization to users double as technologies for gathering information about them.”

‘Geoslavery’ 1.2.4

Furthermore, in extreme situations, LBS may facilitate a new form of slavery, “geoslavery”, which Dobson and Fisher (2003, pp. 47-48) reveal is “a practice in which one entity, the master, coercively or surreptitiously monitors and exerts control over the physical location of another individual, the slave. Inherent in this concept is the potential for a master to routinely control time, location, speed, and direction for each and every movement of the slave or, indeed, of many slaves simultaneously.” In their seminal work, the authors flag geoslavery as a fundamental human rights issue (Dobson and Fisher 2003, p. 49), one that has the potential to somewhat fulfil Orwell's Big Brother prophecy, differing only in relation to the sophistication of LBS in comparison to visual surveillance and also in terms of who is in control. While Orwell’s focus is on the state, Dobson and Fisher (2003, p. 51) caution that geoslavery can also be performed by individuals “to control other individuals or groups of individuals.”

1.2.5 From state-based to citizen-level surveillance

Common to both Discipline and Punish and Nineteen Eighty-Four is the perspective that surveillance activities are conducted at the higher level of the “establishment”; that is, institutional and/or state-based surveillance. However, it must be noted that similar notions can be applied at the consumer or citizen level. Mark Andrejevic (2007, p. 212), in his book iSpy: Surveillance and Power in the Interactive Era, terms this form of surveillance “lateral or peer-to-peer surveillance.” This form of surveillance is characterised by “increasing public access to the means of surveillance – not just by corporations and the state, but by individuals” (Andrejevic 2007, p. 212). Similarly, Barreras and Mathur (2007, pp. 176-177) state that wireless location tracking capabilities are no longer limited to law enforcement, but are open to any interested individual. Abbas et al. (2011, pp. 20-31) further the discussion by focussing on related notions, specifically the implications of covert LBS-based surveillance at the community level, where technologies typically associated with policing and law enforcement are increasingly available for use by members of the community. With further reference to LBS, Dobson and Fisher (2003, p. 51) claim that the technology empowers individuals to control other individuals or groups, while also facilitating extreme activities. For instance, child protection, partner tracking and employee monitoring can now take on extreme forms through the employment of LBS (Dobson and Fisher 2003, p. 49). According to Andrejevic (2007, p. 218), this “do-it-yourself” approach assigns the act of monitoring to citizens. In essence, higher degrees of control are granted to individuals, thereby encouraging their participation in the surveillance process (Andrejevic 2007, pp. 218-222). It is important to understand IoT in the context of this multifaceted “watching”. IoT will not only be used by organisations and government agencies; individuals in a community will also be granted access to information at fine-grained units of aggregation. This has implications at a multiplicity of levels. Forces of control will be manifold.

1.2.6 Dataveillance

The same sentiments can be applied to the related, and to an extent superseding, notion of data surveillance, commonly referred to as dataveillance. Coined by Roger Clarke in the mid-1980s, dataveillance is defined as “the systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons” (Clarke 1988). Clarke (2005, p. 9) maintains that this process is automated and therefore a relatively economical activity when compared with other forms of surveillance, in that dataveillance activities are centred on examining the data trails of individuals. For example, traditional forms of surveillance rely on expensive visual monitoring techniques, whereas dataveillance is a largely economically efficient alternative (Clarke 1994; 2001d, p. 11). Visual behavioural monitoring (that is, traditional surveillance) remains an issue, but is nonetheless overshadowed by the challenges associated with dataveillance, particularly with reference to personal and mass dataveillance (Clarke 2005, pp. 9-10). That is, personal dataveillance presents risks to the individual based primarily on the potential for the collected data/information to be incorrect or outdated, while mass dataveillance is risky in that it may generate suspicion amongst individuals (Albrecht & Michael, 2013).

1.2.7 Risks associated with dataveillance

Clarke’s early and influential work on “Information Technology and Dataveillance” recognises that information technology is accelerating the growth of dataveillance, which presents numerous benefits and risks (Clarke 1988, pp. 498, 505-507). Clarke lists advantages in terms of safety and government applications, while noting the dangers associated with both personal and mass dataveillance (Clarke 1988, pp. 505-507). These risks can indeed be extended or applied to the use of location and tracking technologies to perform dataveillance activities, resulting in what can be referred to as “dataveillance on the move” (Michael and Michael 2012). The specific risks include: the ability for behavioural patterns to be exposed and cross-matched, potentially yielding revelations that may be harmful from a political and personal perspective; a rise in the use of “circumstantial evidence”; transparency of behaviour, resulting in the misuse of information relating to an individual’s conduct; and “actual repression of the readily locatable and trackable individual” (Clarke 2001b, p. 219). Emerging from this analysis, and that concerning surveillance and related metaphors, is the significant matter of loss of control.
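The first of these risks, behavioural patterns being exposed through the examination of a location data trail, can be illustrated with a minimal sketch. The data and function name below are hypothetical and do not come from any cited system:

```python
from collections import Counter

def recurring_pattern(trail, min_count=2):
    """Return the (place, hour) observations that recur at least
    `min_count` times in an individual's location trail."""
    counts = Counter(trail)
    return {obs for obs, n in counts.items() if n >= min_count}

# Hypothetical data trail: (place, hour-of-day) observations.
trail = [("clinic", 9), ("cafe", 13), ("clinic", 9), ("gym", 18), ("clinic", 9)]
patterns = recurring_pattern(trail)
# patterns == {("clinic", 9)} — the routine clinic visit is exposed,
# a pattern that could be harmful if cross-matched with other records.
```

Even this toy example shows how little processing is needed to turn raw location records into a sensitive inference about an individual's routine.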

1.2.8 Loss of control

Michael et al. (2006a, p. 2) state, in the context of GPS tracking, that the issue of control is a leading ethical challenge given the invasive nature of this form of monitoring. The mode of control can differ depending on the context. For instance, in the business context control may take the form of directing or ‘pushing’ advertisements to a specific individual, while at the personal/individual level it could signify control in the manner of “self-direction” (Perusco et al. 2006, p. 93). Other forms of social control can also be exercised by governments and organisations (Clarke 2003b), while emerging LBS solutions intended for the consumer sector extend the notion of control to community members (Abbas et al. 2011). This is an area that has not been adequately addressed in the literature. The subsequent risks to the individual are summarised in the following passage:

“Location technologies therefore provide, to parties that have access to the data, the power to make decisions about the entity subject to the surveillance, and hence exercise control over it. Where the entity is a person, it enables those parties to make determinations, and to take action, for or against that person’s interests. These determinations and actions may be based on place(s) where the person is, or place(s) where the person has been, but also on place(s) where the person is not, or has not been” (Wigan and Clarke 2006, p. 393).

Therefore, GPS and other location devices and technologies may result in decreased levels of control from the perspective of the individual being monitored. For example, in an article based on the use of scenarios to represent the social implications associated with the implementation of LBS, Perusco and Michael (2007) demonstrate the various facets of control in relation to LBS. The discussion is generally centred on the loss of control, which can be experienced in numerous ways, such as when a device does not operate accurately, or when an individual constantly monitors a family member in an attempt to care for them (Perusco and Michael 2007, pp. 6-7, 10). The authors raise valuable ideas with respect to control, such as the need to understand the purpose of control, the notion of consent, and the development of methods to deal with location inaccuracies, amongst others (p. 14). Perusco and Michael further assert that control has a flow-on effect on other issues, such as trust, with the authors questioning whether it is viable to control individuals given the likely risk that trust may be relinquished in the process (p. 13).

Concurrent with loss of control, the issue of pre-emptive control with respect to LBS is a delicate one, specifically in relation to suspected criminals or offenders. Perusco et al. (2006, p. 92) state that the punishment of a crime is typically proportionate to the committed offence; thus the notion of pre-emptive monitoring can be considered fundamentally flawed, given that individuals are being punished without having committed an offence. Rather, they are suspected of being a threat. According to Clarke and Wigan (2011), a person is perceived as a threat based on their “personal associations”, which can be determined using location and tracking technologies to establish the individual’s location in relation to others, and thus to control them based on such details. This is where IoT fundamentally comes into play. While location information can tell us much about where an individual is at any point in time, it is IoT that will reveal inter-relationships, the frequency of interaction, and the specifics of measurable transactions. IoT is the layer that will bring things under scrutiny in new ways.
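The claim that aggregated location data reveals inter-relationships and frequency of interaction can be sketched as a simple co-location count over location logs. All names, data and the log format (person, place, time) below are hypothetical:

```python
from collections import defaultdict
from itertools import combinations

def colocation_counts(logs):
    """Count how often each pair of individuals is observed at the same
    (place, time) cell — a crude proxy for 'personal associations'."""
    present = defaultdict(set)              # (place, time) -> set of people
    for person, place, time in logs:
        present[(place, time)].add(person)
    pairs = defaultdict(int)
    for people in present.values():
        for a, b in combinations(sorted(people), 2):
            pairs[(a, b)] += 1
    return dict(pairs)

# Hypothetical logs: (person, place, hour-of-day).
logs = [
    ("alice", "station", 8), ("bob", "station", 8),
    ("alice", "cafe", 13),   ("bob", "cafe", 13),
    ("carol", "park", 13),
]
# colocation_counts(logs) == {("alice", "bob"): 2}
```

The repeated co-location of two individuals is exactly the kind of “personal association” that, under a pre-emptive control regime, could mark someone as a threat without any offence having been committed.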

This calls for an evaluation of LBS solutions that can be used for covert operations. Covert monitoring using LBS is often considered a useful technique, one that promotes less opposition than overt forms of monitoring, as summarised below:

“Powerful economic and political interests are seeking to employ location and tracking technologies surreptitiously, to some degree because their effectiveness is greater that way, but mostly in order to pre-empt opposition” (Clarke 2001b, p. 221).

Covert applications of LBS are increasingly available for the monitoring and tracking of social relations such as a partner or a child (Abbas et al. 2011). Regardless of whether covert or overt, using LBS for monitoring is essentially about control, irrespective of whether the act of controlling is motivated by necessity, or for more practical or supportive purposes (Perusco et al. 2006, p. 93). 

1.2.9 Studies focussing on user requirements for control

The control dimension is also significant in studies focussing on LBS users, namely, literature concerned with user-centric design, and user adoption and acceptance of LBS and related mobile solutions. In a paper focussing on understanding user requirements for the development of LBS, Bauer et al. (2005, p. 216) report on a user’s “fear” of losing control while interacting with mobile applications and LBS that may infringe on their personal life. The authors perceive loss of control to be a security concern requiring attention, and suggest that developers attempt to relieve the apprehension associated with increased levels of personalisation through ensuring that adequate levels of control are retained (Bauer et al. 2005, p. 216). This is somewhat supported by the research of Xu and Teo (2004, pp. 793-803), in which the authors suggest that there exists a relationship between control, privacy and intention to use LBS. That is, a loss of control results in a privacy breach, which in turn impacts on a user’s intention to embrace LBS.

The aforementioned studies, however, fail to explicitly incorporate the concept of value into their analyses. Due to the lack of literature discussing the three themes of privacy, value and control, Renegar et al. (2008, pp. 1-2) present the privacy-value-control (PVC) trichotomy as a paradigm beneficial for measuring user acceptance and adoption of mobile technologies. This paradigm stipulates the need to achieve harmony amongst the concepts of privacy, value and control in order for a technology to be adopted and accepted by the consumer. However, the authors note that perceptions of privacy, value and control are dependent on a number of factors or entities, including the individual, the technology and the service provider (Renegar et al. 2008, p. 9). Consequently, Renegar et al. conclude that privacy does not obstruct the process of adoption; rather, adoption must take into account the value proposition in addition to the amount of control granted.

1.2.10 Monitoring using LBS: control versus care?

The focus of the preceding sections has been on the loss of control, the dangers of pre-emptive control, covert monitoring, and user perspectives relating to the control dimension. However, this analysis should not be restricted to the negative implications arising from the use of LBS, but rather should incorporate both the control and care applications of LBS. For instance, while discussions of surveillance typically invoke sinister images, numerous authors warn against assuming this subjective viewpoint. Surveillance should not be considered disagreeable in itself. Rather, “[t]he problem has been the presumptiveness of its proponents, the lack of rational evaluation, and the exaggerations and excesses that have been permitted” (Clarke 2007a, p. 42). This viewpoint is reinforced in the work of Elliot and Phillips (2004, p. 474), and can also be applied to dataveillance.

The perspective that surveillance inevitably results in negative consequences, such as individuals possessing excessive amounts of control over each other, should be avoided. For instance, Lyon (2001, p. 2) speaks of the dual aspects of surveillance in that “[t]he same process, surveillance – watching over – both enables and constrains, involves care and control.” Michael et al. (2006a) reinforce such ideas in the context of GPS tracking and monitoring. The authors claim that GPS tracking has been employed for control purposes in various situations, such as policing/law enforcement, the monitoring of parolees and sex offenders, the tracking of suspected terrorists and the monitoring of employees (Michael et al. 2006a, pp. 2-3). However, the authors argue that additional contexts such as convenience and care must not be ignored, as GPS solutions may potentially simplify or enable daily tasks (convenience) or be used for healthcare or the protection of vulnerable groups (care) (Michael et al. 2006a, pp. 3-4). Perusco and Michael (2005) further note that the tracking of such vulnerable groups indicates that monitoring activities are no longer limited to those convicted of a particular offence, but rather can be employed for protection and safety purposes. Table 1 provides a summary of GPS tracking and monitoring applications in the control, convenience and care contexts, adapted from Michael et al. (2006a, pp. 2-4), identifying the potentially constructive uses of GPS tracking and monitoring.

Table 1: GPS monitoring applications in the control, convenience and care contexts, adapted from Michael et al. (2006a, pp. 2-4)

It is crucial that in evaluating LBS control literature and establishing the need for LBS regulation, both the control and care perspectives are incorporated. The act of monitoring should not immediately conjure up sinister thoughts. The focus should preferably be directed to the important question of purpose or motives. Lyon (2007, p. 3) feels that purpose may exist anywhere on the broad spectrum between care and control. Therefore, as expressed by Elliot and Phillips (2004, p. 474), a crucial factor in evaluating the merit of surveillance activities and systems is determining “how they are used.” These sentiments are also applicable to dataveillance. It is helpful at this point to discuss alternative and related practices that may incorporate location information throughout the monitoring process.

1.2.11 Sousveillance

The term sousveillance, coined by Steve Mann, comes from the French sous, meaning ‘from below’, and veiller, meaning ‘to watch’ (Mann et al. 2003, p. 332). It is primarily a form of “inverse surveillance” (Mann et al. 2003, p. 331), whereby an individual is in essence “surveilling the surveillers” (p. 332). Sousveillance is reliant on the use of wearable computing devices to capture audiovisual and sensory data (Mann 2005, p. 625). A major concern with respect to sousveillance, according to Mann (2005, p. 637), is the dissemination of the recorded data, which for the purposes of this investigation may include images of locations and corresponding geographic coordinates.

1.2.12 Sousveillance, ‘reflectionism’ and control

Relevant to the theme of control, it has been argued that sousveillance can be utilised as a form of resistance to unwarranted surveillance and control by institutions. According to Mann et al. (2003, p. 333), sousveillance is a type of reflectionism in which individuals can actively respond to bureaucratic monitoring and to an extent “neutralize surveillance”. Sousveillance can thus be employed in response to social control in that surveillance activities are reversed:

“The surveilled become sousveillers who engage social controllers (customs officials, shopkeepers, customer service personnel, security guards, etc.) by using devices that mirror those used by these social controllers” (Mann et al. 2003, p. 337).

Sousveillance differs from surveillance in that traditional surveillance activities are “centralised” and “localized”, whereas sousveillance is dispersed in nature and “delocalized” in its global coverage (Ganascia 2010, p. 496). As such, sousveillance requires new metaphors for understanding its fundamental aspects. A useful metaphor proposed by Ganascia (2010, p. 496) for describing sousveillance is the catopticon, which can be contrasted with the panopticon metaphor. At the heart of the catopticon are the following principles:

“total transparency of society, fundamental equality, which gives everybody the ability to watch – and consequently to control – everybody else, [and] total communication, which enables everyone to exchange with everyone else” (Ganascia 2010, p. 497).

This exchange may include the dissemination of location details, thus signalling the need to incorporate sousveillance into LBS regulatory discussions. A noteworthy element of sousveillance is that it shifts the ability to control from the state/institution (surveillance) to the individual. While this can initially be perceived as an empowering feature, excessive amounts of control, if unchecked, may prove detrimental. That is, control may be granted to individuals to disseminate their location (and other) information, or the information of others, without the necessary precautions in place and in an unguarded fashion. The implications of this exercise are sinister in their extreme forms. When considered within the context of IoT, sousveillance ideals are likely compromised. Yes, I can fight back against state control and Big Brother with sousveillance, but in doing so I potentially unleash a thousand or more little brothers, each with the capacity to (mis)use the information being gathered.

1.2.13 Towards überveillance

The concepts of surveillance, dataveillance and sousveillance have been examined with respect to their association with location services in an IoT world. It is therefore valuable, at this point, to introduce the related notion of überveillance. Überveillance, a term coined by M.G. Michael in 2006, can be described as “an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices in the human body” (Michael et al. 2006b; Macquarie Dictionary 2009, p. 1094). Überveillance combines the dimensions of identification, location and time, potentially allowing for forecasting and uninterrupted real-time monitoring (Michael and Michael 2007, pp. 9-10), and in its extreme forms can be regarded as “Big Brother on the inside looking out” (p. 10).

Überveillance is considered by several authors to be the contemporary notion that will supplant surveillance. For instance, Clarke (2007a, p. 27) suggests that the concept of surveillance is somewhat outdated and that contemporary discussions should be focussed on the notion of überveillance. It has further been suggested that überveillance is built on the existing notion of dataveillance. That is, “[ü]berveillance takes that which was static or discrete in the dataveillance world, and makes it constant and embedded” (Michael and Michael 2007, p. 10). The move towards überveillance thus marks the evolution from physical, visual forms of monitoring (surveillance), through to the increasingly sophisticated and ubiquitous embedded chips (überveillance) (Michael & Michael 2010; Gagnon et al. 2013). Albrecht and McIntyre (2005), who describe these embedded chips as “spychips”, focused predominantly on the RFID tracking of people through retail goods and services, and devote considerable space to the Internet of Things concept. Perakslis and Wolk (2006) studied the social acceptance of RFID implants as a security method, and Perakslis later went on to incorporate überveillance into her research into behavioural motivators and personality factors in the adoption of humancentric IoT applications.

Given that überveillance is an emerging term (Michael and Michael 2007, p. 9), diverse interpretations have been proposed. For example, Clarke (2007a) offers varying definitions of the term, suggesting that überveillance can be understood as any of the following: omni-surveillance, an apocalyptic notion that “applies across all space and all time (omnipresent), and supports some organisation that is all-seeing and even all-knowing (omniscient)”, which can be achieved through the use of embedded chips for instance (p. 33); exaggerated surveillance, referring to “the extent to which surveillance is undertaken... its justification is exaggerated” (p. 34); and/or meta-, supra-, or master-surveillance, which “could involve the consolidation of multiple surveillance threads in order to develop what would be envisaged by its proponents to be superior information” (p. 38). Shay et al. (2012) acknowledge:

“The pervasive nature of sensors coupled with recent advances in data mining, networking, and storage technologies creates tools and data that, while serving the public good, also create a ubiquitous surveillance infrastructure ripe for misuse. Roger Clarke’s concept of dataveillance and M.G. Michael and Katina Michael’s more recent uberveillance serve as important milestones in awareness of the growing threat of our instrumented world.”

All of these definitions indicate direct ways in which IoT applications can be rolled out, whether for vehicle management in heavy traffic conditions, the tracking of suspects in a criminal investigation, or even the monitoring of employees in a workplace. Disturbing is the manner in which a whole host of applications, particularly in tollways and public transportation, are being used for legal purposes without the knowledge of the driver and commuter. “Tapping” token cards is not only encouraged but mandatory at most metropolitan train stations in developed countries. Little do commuters know that the data gathered by these systems can be requested by a host of government agencies without a warrant.

1.2.14 Implications of überveillance on control

Irrespective of interpretation, the subject of current scholarly debate relates to the implications of überveillance on individuals in particular, and society in general. In an article discussing the evolution of automatic identification (auto-ID) techniques, Michael and Michael (2005) present an account of the issues associated with implantable technologies in humancentric applications. The authors note the evident trend of deploying a technology into the marketplace, prior to assessing the potential consequences (Michael and Michael 2005, pp. 22-33). This reactive approach causes apprehension in view of chip implants in particular, given the inexorable nature of embedded chips, and the fact that once the chip is accepted by the body, it is impossible to remove without an invasive surgical procedure, as summarised in the following excerpt:

“[U]nless the implant is removed within a short time, the body will adopt the foreign object and tie it to tissue. At this moment, there will be no exit strategy, no contingency plan, it will be a life enslaved to upgrades, virus protection mechanisms, and inescapable intrusion” (Michael and Michael 2007, p. 18).

Other concerns relevant to this investigation have also been raised. It is indicated that “über-intrusive technologies” are likely to leave substantial impressions on individuals, families and other social relations, with the added potential of affecting psychological well-being (Michael and Michael 2007, p. 17). Apart from implications for individuals, concerns requiring remedies also emerge at the broader social level. For instance, if a state of überveillance is to be avoided, caution must be exercised in deploying technologies without due reflection on the corresponding implications. Namely, this will involve the introduction of appropriate regulatory measures, encompassing proactive consideration of the social implications of emerging technologies, and individuals assuming responsibility for promoting regulatory measures (Michael and Michael 2007, p. 20). It will also require a measured attempt to achieve some form of “balance” (Clarke 2007a, p. 43). The implications of überveillance are of particular relevance to LBS regulatory discussions, given that “overarching location tracking and monitoring is leading toward a state of überveillance” (Michael and Michael 2011, p. 2). As such, research into LBS regulation in Australia must be sensitive to both the significance of LBS to überveillance and the anticipated trajectory of the latter.

Unfortunately, the same cannot be said for IoT-specific regulation. IoT is a fluid concept, and in many ways nebulous. It is made up of a host of technologies that are being integrated and converging over time. It comprises layer upon layer of infrastructure that has emerged since the inception of the first telephone lines, through to today's cloud and wireless Internet. IoT requires new protocols and new applications, but it is difficult to point to a specific technology, application or system that can be subjected to some form of external oversight. Herein lie the problems of potential unauthorised disclosure of data, or even misuse of data, when government agencies require private enterprise to act upon their requests, or private enterprises work together in sophisticated ways to exploit the consumer.

1.2.15 Comparing the different forms of ‘veillance’

Various terms ending in ‘veillance’ have been introduced throughout this paper, all of which imply and encompass the process of monitoring. Prior to delving into the dangers of this activity and the significance of LBS monitoring on control, it is helpful to compare the main features of each term. A comparison of surveillance, dataveillance, sousveillance, and überveillance is provided in Table 2.

It should be noted that with the increased use of techniques such as surveillance, dataveillance, sousveillance and überveillance, the threat of becoming a surveillance society looms. According to Ganascia (2010, p. 491), a surveillance society is one in which the data gathered from the aforementioned techniques is utilised to exert power and control over others. This results in dangers such as the potential for identification and profiling of individuals (Clarke 1997), the latter of which can be associated with social sorting (Gandy 1993).

Table 2: Comparison of the different forms of ‘veillance’

1.2.16 Identification

Identity and identification are ambiguous terms with philosophical and psychological connotations (Kodl and Lokay 2001, p. 129). Identity can be perceived as “a particular presentation of an entity, such as a role that the entity plays in particular circumstances” (Clarke and Wigan 2011). With respect to information systems, human identification specifically (as opposed to object identification) is therefore “the association of data with a particular human being” (Kodl and Lokay 2001, pp. 129-130). Kodl and Lokay (2001, pp. 131-135) claim that numerous methods exist to identify individuals prior to performing a data linkage, namely, using appearance, social interactions/behaviours, names, codes and knowledge, amongst other techniques. With respect to LBS, these identifiers significantly contribute to the dangers pertaining to surveillance, dataveillance, sousveillance and überveillance. That is, LBS can be deployed to simplify and facilitate the process of tracking and be used for the collection of profile data that can potentially be linked to an entity using a given identification scheme. In a sense, LBS in their own right become an additional form of identification feeding the IoT scheme (Michael and Michael, 2013).
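Kodl and Lokay's point that identifiers such as codes enable data linkage can be illustrated with a minimal sketch, in which an otherwise anonymous location trail is joined to a registry via a shared card code. All names, identifiers and record fields below are hypothetical:

```python
def link_identity(records, registry):
    """Join anonymous location records to named individuals via a
    shared identifier (here, a hypothetical travel-card code)."""
    return [
        {"name": registry.get(rec["card"], "unknown"), **rec}
        for rec in records
    ]

# Hypothetical registry (card code -> registered name) and trail.
registry = {"C-42": "A. Citizen"}
records = [{"card": "C-42", "place": "station", "time": "08:02"}]
linked = link_identity(records, registry)
# linked[0]["name"] == "A. Citizen" — the trail is no longer anonymous.
```

The single join operation is the crux: once any one identifier in a trail maps to a registry, every record sharing that identifier becomes "the association of data with a particular human being."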

Thus, in order to address the regulatory concerns pertaining to LBS, it is crucial to appreciate the challenges regarding the identification of individuals. Of particular importance is the recognition that once an individual has been identified, they can be subjected to varying degrees of control. As such, in any scheme that enables identification, Kodl and Lokay (2001, p. 136) note the need to balance human rights with other competing interests, particularly given that identification systems may be exploited by powerful entities for control purposes, such as by governments to exercise social control. For an historical account of identification techniques, from manual methods through to automatic identification systems including those built on LBS, see Michael and Michael (2009, pp. 43-60). Civil libertarians and concerned individuals have asserted, since at least the 1970s, that automatic identification (auto-ID) technology “impinges on human rights, the right to privacy, and that eventually it will lead to totalitarian control of the populace” (Michael and Michael 2009, p. 364). These views are also pertinent to the notion of social sorting.

1.2.17 Social sorting

In relation to the theme of control, information derived from surveillance, dataveillance, sousveillance and überveillance techniques can also serve the purpose of social sorting, labelled by Oscar Gandy (1993, p. 1) the “panoptic sort.” Relevant to this discussion, the information may relate to an individual’s location. In Gandy’s influential work The Panoptic Sort: A Political Economy of Personal Information, the author relies on the work of Michel Foucault and other critical theorists (refer to pp. 3-13) in examining the panoptic sort as an “antidemocratic system of control” (Gandy 1993, p. 227). According to Gandy, in this system, individuals are exposed to prejudiced forms of categorisation based on both economic and political factors (pp. 1-2). Lyon (1998, p. 94) describes the database management practices associated with social sorting, classing them as a form of consumer surveillance, in which customers are grouped by “social type and location.” Such clustering forms the basis for the exclusion and marginalisation of individuals (King 2001, pp. 47-49). As a result, social sorting is presently used for the profiling of individuals and in the market research realm (Bennett and Regan 2004, p. 452).

1.2.18 Profiling

Profiling “is a technique whereby a set of characteristics of a particular class of person is inferred from past experience, and data-holdings are then searched for individuals with a close fit to that set of characteristics” (Clarke 1993). The process is centred on the creation of a profile or model related to a specific individual, based on data aggregation processes (Casal 2004, p. 108). Assorted terms have been employed in labelling this profile. For instance, the model created of an individual using the data collected through dataveillance techniques has been referred to by Clarke (1997) as “the digital persona”, and is related to the “digital dossiers” idea introduced by Solove (2004, pp. 1-7). According to Clarke (1994), the use of networked systems, namely the internet, involves communicating and exposing data and, at times, recognisable aspects of behaviour, both of which are utilised in the creation of a personality.
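Clarke's two-step definition, inferring a characteristic set and then searching data-holdings for close fits, can be sketched as follows. The attributes, identifiers and threshold below are hypothetical, a toy illustration rather than any operational profiling system:

```python
def profile_matches(profile, holdings, threshold=0.75):
    """Return the individuals in `holdings` whose attributes fit
    `profile` in at least `threshold` of its characteristics."""
    hits = []
    for person, attrs in holdings.items():
        shared = sum(attrs.get(key) == value for key, value in profile.items())
        if shared / len(profile) >= threshold:
            hits.append(person)
    return hits

# Hypothetical inferred profile and data-holdings.
profile = {"suburb": "N", "night_travel": True, "cash_topup": True, "age_band": "20s"}
holdings = {
    "p1": {"suburb": "N", "night_travel": True, "cash_topup": True, "age_band": "30s"},
    "p2": {"suburb": "S", "night_travel": False, "cash_topup": False, "age_band": "20s"},
}
# profile_matches(profile, holdings) == ["p1"]  (3 of 4 characteristics fit)
```

Note that p1 is flagged despite differing on one characteristic: a "close fit" sweeps individuals in on circumstantial grounds, which is precisely the risk raised in the earlier discussion of dataveillance.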

1.2.19 Digital personas and dossiers

The resulting personality is referred to as the digital persona. Similarly, digital dossiers refer to the compilation of comprehensive electronic data related to an individual, utilised in the creation of the “digital person” (Solove 2004, p. 1), also referred to as “digital biographies” (Solove 2002, p. 1086). In examining the need for LBS regulation throughout the globe, a given regulatory response or framework must appreciate the ease with which (past, present and future) location information can be compiled and integrated into an individual’s digital persona or dossier. Once such information is reproduced and disseminated, the control implications are magnified.

With respect to the theme of control, an individual can exercise only a limited amount of influence over their digital persona, as some aspects of creating an electronic personality may not be within their direct control. The scope of this article does not allow for reflection on the digital persona in great detail; however, Clarke (1994) offers a thorough investigation of the term and associated notions, such as the passive and active digital persona, in addition to the significance of the digital persona to dataveillance techniques such as computer matching and profiling. What is significant to this research, however, is the distinction between the physical and the digital persona and the resultant implications in relation to control, as summarised in the following extract:

“The physical persona is progressively being replaced by the digital persona as the basis for social control by governments, and for consumer marketing by corporations. Even from the strictly social control and business efficiency perspectives, substantial flaws exist in this approach. In addition, major risks to individuals and society arise” (Clarke 1994).

The same sentiments apply with respect to digital dossiers. In particular, Solove (2004, p. 2) notes that individuals are unaware of the ways in which their electronic data is exploited by government and commercial entities, and “lack the power to do much about it.” It is evident that profile data is advantageous for both social control and commercial purposes (Clarke 2001d, p. 12), the latter of which is associated with market research and sorting activities, which have evolved from ideas of “containment” of mobile consumer demand to the “control” model (Arvidsson 2004, pp. 456, 458-467). The control model in particular has been strengthened, but not solely driven, by emerging technologies including LBS, as explained:

“The control paradigm thus permits a tighter and more efficient surveillance that makes use of consumer mobility rather than discarding it as complexity. This ability to follow the consumer around has been greatly strengthened by new technologies: software for data mining, barcode scans, internet tracking devices, and lately location based information from mobile phones” (Arvidsson 2004, p. 467).

Social sorting, particularly for profiling and market research purposes, thus introduces numerous concerns relating to the theme of control, one of which is the ensuing consequences relating to personal privacy. This specifically includes the privacy of location information. In sum, examining the current regulatory framework for LBS in Australia, and determining the need for LBS regulation, necessitates an appreciation of the threats associated with social sorting using information derived from LBS solutions. Additionally, the benefits and risks associated with surveillance, dataveillance, sousveillance and überveillance for control must be measured and carefully contemplated in the proposed regulatory response.

Trust 1.3

Trust is a significant theme relating to LBS, given the importance of the notion to: (a) “human existence” (Perusco et al. 2006, p. 93; Perusco and Michael 2007, p. 10), (b) relationships (Lewis and Weigert 1985, pp. 968-969), (c) intimacy and rapport within a domestic relationship (Boesen et al. 2010, p. 65), and (d) LBS success and adoption (Jorns and Quirchmayr 2010, p. 152). Trust can be defined, in general terms, as the “firm belief in the reliability, truth, or ability of someone or something” (Oxford Dictionary 2012b). A definition of trust that has been widely cited in relevant literature is “the willingness of a party to be vulnerable to the actions of another party based on the expectation that the other will perform a particular action important to the trustor, irrespective of the ability to monitor or control that other party” (Mayer et al. 1995, p. 712). Related to electronic relationships or transactions, the concept has been defined as the “confident reliance by one party on the behaviour of other parties” (Clarke 2001c, p. 291), and it has been suggested that in the electronic-commerce domain, in particular, trust is intimately associated with the disclosure of information (Metzger 2004).

In reviewing literature concerning trust, Fusco et al. (2011, p. 2) claim that trust is typically described as a dynamic concept falling into the categories of cognitive (evidence based), emotional (faith-based), and/or behavioural (conduct-based) trust. For further reading, the major sources on trust can be found in: Lewis and Weigert's (1985) sociological treatment of trust, the influential work of Mayer et al. (1995) and the authors’ updated work Schoorman et al. (2007) centred on organisational trust, Weckert’s (2000) comprehensive review of trust in the context of workplace monitoring using electronic devices, research on trust in electronic-commerce (refer to McKnight and Chervany 2001; Pavlou 2003; Kim et al. 2009) and mobile-commerce (see Siau and Shen 2003; Yeh and Li 2009), the work of Valachich (2003) that introduces and evaluates trust in terms of ubiquitous computing environments, Dwyer et al.’s (2007) article on trust and privacy issues in social networks, Yan and Holtmanns’ (2008) examination of issues associated with digital trust management, the work of Chen et al. (2008) covering the benefits and concerns of LBS usage including privacy and trust implications, and the research by Junglas and Spitzmüller (2005) that examines privacy and trust issues concerning LBS by presenting a research model that incorporates these aspects amongst others.

For the purpose of this paper, the varying definitions and categorisations are acknowledged. However, trust will be assessed in terms of the relationships dominating existing LBS/IoT scholarship which comprise the government-citizen relationship centred on trust in the state, the business-consumer relationship associated with trust in corporations/LBS providers, and the consumer-consumer relationship concerned with trust in individuals/others.

Trust in the state 1.3.1

Trust in the state broadly covers LBS solutions implemented by government, thus representing the government-citizen relationship. Dominating current debates and literature are LBS government initiatives in the form of emergency management schemes, in conjunction with national security applications utilising LBS, which depending on the nature of their implementation may impact on citizens’ trust in the state. These concerns are typically expressed as a trade-off between security and safety. At present there are very few examples of fully-fledged IoT systems to point to, although increasingly quasi-IoT systems are being deployed using wireless sensor networks of varying kinds, e.g. for bushfire management and for fisheries. These systems do not include a direct human stakeholder but are still relevant as they may trigger flow-on effects that do impact citizenry.

Balancing trust and privacy in emergency services 1.3.2

In the context of emergency management, Aloudat and Michael (2011, p. 58) maintain that the dominant theme between government and consumers in relation to emergency warning messages and systems is trust. This includes trust in the LBS services being delivered and in the government itself (Aloudat and Michael 2011, p. 71). While privacy is typically believed to be the leading issue confronting LBS, in emergency and life-threatening situations it is overwhelmed by trust-related challenges, given that users are generally willing to relinquish their privacy in the interest of survival (Aloudat and Michael 2010, p. 2). Furthermore, the success of these services is reliant on trust in the technology, the service, and the accuracy/reliability/timeliness of the emergency alert. On the whole, this success can be measured in terms of citizens’ confidence in their government’s ability to sensibly select and implement a fitting emergency service utilising enhanced LBS features. In a paper that examines the deployment of location services in Dutch public administration, van Ooijen and Nouwt (2009, p. 81) assess the impact of government-based LBS initiatives on the government-citizen relationship, recommending that governments employ care in gathering and utilising location-based data about the public, to ensure that citizens' trust in the state is not compromised.

Trust-related implications of surveillance in the interest of national security 1.3.3

Trust is also prevalent in discussions relating to national security. National security has been regarded as a priority area for many countries for over a decade, and as such has prompted the implementation of surveillance schemes by government. Wigan and Clarke (2006, p. 392) discuss the dimension of trust as a significant theme contributing to the social acceptance of a particular government surveillance initiative, which may incorporate the location and tracking of individuals and objects. The implementation of surveillance systems by the state, including those incorporating LBS, can diminish the public’s confidence in the state, given the potential for such mechanisms to be perceived as a form of authoritarian control. Nevertheless, a situation where national security and safety are considered to be in jeopardy may entail (partial) acceptance of various surveillance initiatives that would otherwise be perceived as objectionable. In such circumstances, trust in government plays a crucial role in determining individuals’ willingness to compromise various civil liberties. This is explained by Davis and Silver (2004, p. 35) below:

“The more people trust the federal government or law enforcement agencies, the more willing they are to allow the government leeway in fighting the domestic war on terrorism by conceding some civil liberties.”

However, in due course it is expected that such increased security measures (even if initially supported by citizens) will yield a growing gap between government and citizens, “potentially dampening citizen participation in government and with it reducing citizens’ trust in public institutions and officials” (Gould 2002, p. 77). This is because both the perceived degree of threat and trust in government diminish over time, resulting in the public’s reluctance to surrender their rights for the sake of security (Sanquist et al. 2008, p. 1126). In order to build and maintain trust, governments are required to be actively engaged in developing strategies to build confidence both in their own abilities and in the technology under consideration, and are challenged to recognise “the massive harm that surveillance measures are doing to public confidence in its institutions” (Wigan and Clarke 2006, p. 401). It has been suggested that a privacy impact assessment (PIA) aids in establishing trust between government and citizens (Clarke 2009, p. 129). Carefully considered legislation is an alternative technique to enhance levels of trust. With respect to LBS, governments are responsible for proposing and enacting regulation that is in the best interest of citizens, incorporating citizen concerns into this process and encouraging suitable design of LBS applications, as explained in the following quotation:

“...new laws and regulations must be drafted always on the basis of citizens’ trust in government authorities. This means that citizens trust the government to consider the issues at stake according to the needs and wishes of its citizens. Location aware services can influence citizens’ trust in the democratic society. Poorly designed infrastructures and services for storing, processing and distributing location-based data can give rise to a strong feeling of being threatened. Whereas a good design expands the feeling of freedom and safety, both in the private and in the public sphere/domain” (Beinat et al. 2007, p. 46).

One of the biggest difficulties that will face stakeholders is identifying when current LBS systems become a part of bigger IoT initiatives. Major changes in systems will require a re-evaluation of impact assessments of different types.

Need for justification and cultural sensitivity 1.3.4

Techniques of this nature will fail to gain acceptance, however, if surveillance schemes lack adequate substantiation at the outset, as trust is threatened by the “absence of justification for surveillance, and of controls over abuses” (Wigan and Clarke 2006, p. 389). From a government perspective, this situation may prove detrimental, as Wigan and Clarke (2006, p. 401) claim that transparency and trust are prerequisites for ensuring public confidence in the state, noting that “[t]he integrity of surveillance schemes, in transport and elsewhere, is highly fragile.” Aside from adequate justification of surveillance schemes, cultural differences associated with the given context need to be acknowledged as factors influencing the level of trust citizens hold in government. As explained by Dinev et al. (2005, p. 3) in their cross-cultural study of American and Italian Internet users’ privacy and surveillance concerns, “[a]ttitudes toward government and government initiatives are related to the culture’s propensity to trust.” In comparing the two contexts, Dinev et al. claim that Americans readily accept government surveillance in return for increased levels of security, whereas Italians’ low levels of trust in government result in opposing viewpoints (pp. 9-10).

Trust in corporations/LBS/IoT providers 1.3.5

Trust in corporations/LBS/IoT providers emerges from the level of confidence a user places in an organisation and their respective location-based solution(s), which may be correlated to the business-consumer relationship. In the context of consumer privacy, Culnan and Bies (2003, p. 327) assert that perceived trust in an organisation is closely linked to the extent to which an organisation's practices are aligned with its policies. A breach in this trust affects the likelihood of personal information disclosure in the future (Culnan and Bies 2003, p. 328), given the value of trust in sustaining lasting customer relationships (p. 337). Reducing this “trust gap” (Culnan and Bies 2003, pp. 336-337) is a defining element for organisations in achieving economic and industry success, as it may impact on a consumer’s decision to contemplate location data usage (Chen et al. 2008, p. 34). Reducing this gap requires that control over location details remain with the user, as opposed to the LBS provider or network operator (Giaglis et al. 2003, p. 82). Trust can thus emerge from a user’s perception that they are in command (Junglas and Spitzmüller 2005, p. 3). 

Küpper and Treu (2010, pp. 216-217) concur with these assertions, explaining that the lack of uptake of first-generation LBS applications was chiefly a consequence of the dominant role of the network operator over location information. This situation has been somewhat rectified since the introduction of GPS-enabled devices capable of determining location information without input from the network operator and higher emphasis on a user-focussed model (Bellavista et al. 2008, p. 85; Küpper and Treu 2010, p. 217). Trust, however, is not exclusively concerned with a network operator’s ability to determine location information, but also with the possible misuse of location data. As such, it has also been framed as a potential resolution to location data misappropriation, explained further by Jorns and Quirchmayr (2010, p. 152) in the following excerpt:

“The only way to completely avoid misuse is to entirely block location information, that is, to reject such services at all. Since this is not an adequate option... trust is the key to the realization of mobile applications that exchange sensitive information.”

There is much to learn from the covert and overt location tracking of subscribers by large corporations. Increasingly, the dubious practices of information and communication technology giants Google, Apple and Microsoft in retaining location information are being reported, with only small penalties being applied in countries in the European Union and Asia. What is disturbing in this trend is that even smaller suppliers of location-based applications are beginning to unleash unethical (but seemingly not illegal) solutions at shopping malls and other campus-based locales (Michael & Clarke 2013).

Importance of identity and privacy protection to trust 1.3.6

In delivering trusted LBS solutions, Jorns and Quirchmayr (2010, pp. 151-155) further claim that identity and privacy protection are central considerations that must be built into a given solution, proposing an LBS architecture that integrates such safeguards. That is, identity protection may involve the use of false dummies, dummy users and landmark objects, while privacy protection generally relies on decreasing the resolution of location data, employing supportive regulatory techniques and ensuring anonymity and pseudonymity (Jorns and Quirchmayr 2010, p. 152). Similarly, and with respect to online privacy, Clarke (2001c, p. 297) suggests that an adequate framework must be introduced that “features strong and comprehensive privacy laws, and systematic enforcement of those laws.” These comments, also applicable to LBS in a specific sense, were made in the context of economic rather than social relationships, referring primarily to government and corporations, but are also relevant to trust amongst social relations.
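One of the privacy safeguards Jorns and Quirchmayr mention, decreasing the resolution of location data, can be sketched as rounding coordinates to a coarser grid before disclosure. The coordinates and chosen precision below are illustrative assumptions only:

```python
# Sketch of privacy protection by decreasing location resolution: coordinates
# are rounded to a coarser grid before disclosure, so the precise position
# cannot be recovered. The precision chosen here is an illustrative assumption.

def reduce_resolution(lat, lon, decimals=2):
    """Round a coordinate pair; two decimal places is roughly 1.1 km of latitude."""
    return (round(lat, decimals), round(lon, decimals))

precise_fix = (-34.40631, 150.88239)  # hypothetical precise reading
coarse_fix = reduce_resolution(*precise_fix)
print(coarse_fix)  # (-34.41, 150.88)
```

Anonymity and pseudonymity, the other safeguards listed, would operate alongside such coarsening by decoupling the identity attached to the coordinates.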

It is important to recognise that issues of trust are closely related to privacy concerns from the perspective of users. In an article titled “Trust and Transparency in Location-Based Services: Making Users Lose their Fear of Big Brother”, Böhm et al. (2004, pp. 1-3) claim that operators and service providers are charged with the difficult task of earning consumer trust, and that this may be achieved by addressing user privacy concerns and adhering to relevant legislation. Additional studies also point to the relationship between trust and privacy, claiming that trust can aid in reducing the perceived privacy risk for users. For example, Xu et al. (2005, p. 905) suggest that enhancing trust can reduce the perceived privacy risk, which in turn influences a user’s decision to disclose information, and that “service provider’s interventions including joining third party privacy seal programs and introducing device-based privacy enhancing features could increase consumers’ trust beliefs and mitigate their privacy risk perceptions”. Chellappa and Sin (2005, pp. 188-189), in examining the link between trust and privacy, stress the importance of trust-building mechanisms, which include a consumer’s familiarity and previous experience with the organisation.

Maintaining consumer trust 1.3.7

The primary consideration in relation to trust in the business-consumer relationship is that all efforts be targeted at establishing and building trust in corporations and LBS/IoT providers. Once trust has been compromised, the situation cannot easily be repaired, a point applicable to trust in any context. This point is explained by Kaasinen (2003, p. 77) in an interview-based study regarding user requirements in location-aware mobile applications:

“The faith that the users have in the technology, the service providers and the policy-makers should be regarded highly. Any abuse of personal data can betray that trust and it will be hard to win it back again.”

Trust in individuals/others 1.3.8

Trust in the consumer-to-consumer setting is determined by the level of confidence existing between an individual and their social relations, which may include friends, parents, other family members, employers and strangers, categories adapted from Levin et al. (2008, pp. 81-82). Yan and Holtmanns (2008, p. 2) express the importance of trust for social interactions, claiming that “[s]ocial trust is the product of past experiences and perceived trustworthiness.” It has been suggested that LBS monitoring can erode trust between the individual engaged in monitoring and the subject being monitored, as the very act implies that trust is lacking in a given relationship (Perusco et al. 2006, p. 93). These concerns are echoed in Michael et al. (2008). Previous studies relevant to LBS and trust generally focus on: the workplace situation, that is, trust between an employer and their employee; trust amongst ‘friends’ subscribed to a location-based social networking (LBSN) service, which may include any of the predefined categories above; and the tracking of family members, such as children, for safety and protection purposes, and the related trust implications.

Consequences of workplace monitoring 1.3.9

With respect to trust in an employer’s use of location-based applications and location data, a prevailing subject in existing literature is the impact of employee monitoring systems on staff. For example, in studying the link between electronic workplace monitoring and trust, Weckert (2000, p. 248) reported that trust is a significant issue resulting from excessive monitoring, in that monitoring may contribute to deterioration in professional work relationships between an employer and their employee and consequently reduce or eliminate trust. Weckert’s work reveals that employers often substantiate electronic monitoring based on the argument that the “benefits outweigh any loss of trust”, and may include gains for the involved parties; notably, for the employer in the form of economic benefits, for the employee to encourage improvements to performance and productivity, and for the customer who may experience enhanced customer service (p. 249). Chen and Ross (2005, p. 250), on the other hand, argue that an employer’s decision to monitor their subordinates may be related to a low degree of existing trust, which could be a result of unsuitable past behaviour on the part of the employee. As such, employers may perceive monitoring as necessary in order to manage employees. Alternatively, from the perspective of employees, trust-related issues materialise as a result of monitoring, which may leave an impression on job attitudes, including satisfaction and dedication, as covered in a paper by Alder et al. (2006) in the context of internet monitoring.

When applied to location monitoring of employees using LBS, the trust-related concerns expressed above are indeed warranted. Particularly, Kaupins and Minch (2005, p. 2) argue that the appropriateness of location monitoring in the workplace can be measured from either a legal or ethical perspective, which inevitably results in policy implications for the employer. The authors emphasise that location monitoring of employees can often be justified in terms of the security, productivity, reputational and protective capabilities of LBS (Kaupins and Minch 2005, p. 5). However, Kaupins and Minch (2005, pp. 5-6) continue to describe the ethical factors “limiting” location monitoring in the workplace, which entail the need for maintaining employee privacy and the restrictions associated with inaccurate information, amongst others. These factors will undoubtedly affect the degree of trust between an employer and employee.

However, the underlying concern relevant to this discussion of location monitoring in the workplace is not only the suitability of employee monitoring using LBS. While this is a valid issue, the challenge remains centred on the deeper trust-related consequences. Regardless of the technology or applications used to monitor employees, it can be concluded that a work atmosphere lacking trust results in sweeping consequences that extend beyond the workplace, expressed in the following excerpt:

“A low trust workplace environment will create the need for ever increasing amounts of monitoring which in turn will erode trust further. There is also the worry that this lack of trust may become more widespread. If there is no climate of trust at work, where most of us spend a great deal of our life, why should there be in other contexts? Some monitoring in some situations is justified, but it must be restricted by the need for trust” (Weckert 2000, p. 250).

Location-monitoring amongst friends 1.3.10

These concerns are certainly applicable to the use of LBS applications amongst other social relations. Recent literature merging the concepts of LBS, online social networking and trust is particularly focused on the use of LBSN applications amongst various categories of friends. For example, Fusco et al.'s (2010) qualitative study examines the impact of LBSN on trust amongst friends, employing a focus group methodology in achieving this aim. The authors reveal that trust may suffer as a consequence of LBSN usage in several ways: disclosure of location information and potential monitoring activities can encourage misuse of the application in order to conceal things; excessive questioning can contribute to the deterioration of trust amongst social relations; and trust may come to be placed in the application rather than the friend (Fusco et al. 2010, p. 7). Further information relating to Fusco et al.’s study, particularly the manner in which LBSN applications adversely impact on trust, can be found in a follow-up article (Fusco et al. 2011).

Location tracking for protection 1.3.11

It has often been suggested that monitoring in familial relations can offer a justified means of protection, particularly in relation to vulnerable individuals such as Alzheimer’s or dementia sufferers and children. With specific reference to the latter, trust emerges as a central theme relating to child tracking. In an article by Boesen et al. (2010), location tracking in families is evaluated, including the manner in which LBS applications are incorporated within the familial context. The qualitative study conducted by the authors revealed that the initial decision to use LBS by participants with children was motivated by a lack of existing trust within the given relationship, with participants reporting an improvement in their children's behaviour after a period of tracking (Boesen et al. 2010, p. 70). Boesen et al., however, warn of the trust-related consequences, claiming that “daily socially-based trusting interactions are potentially replaced by technologically mediated interactions” (p. 73). Lack of trust in a child is considered detrimental to their growth, and signalling to a child that they are not trusted through the use of technology, specifically location monitoring applications, may result in long-term implications. The importance of trust to the growth of a child and the dangers associated with ubiquitous forms of supervision are explained in the following excerpt:

“Trust (or at least its gradual extension as the child grows) is seen as fundamental to emerging self-control and healthy development... Lack of private spaces (whether physical, personal or social) for children amidst omni-present parental oversight may also create an inhibiting dependence and fear” (Marx and Steeves 2010, p. 218).

Furthermore, location tracking of children and other individuals in the name of protection may result in undesirable and contradictory consequences relevant to trust. Barreras and Mathur (2007, p. 182), in an article that describes the advantages and disadvantages of wireless location tracking, argue that technologies originally intended to protect family members (notably children, and other social relations such as friends and employees), can impact on trust and be regarded as “unnecessary surveillance.” The outcome of such tracking and reduced levels of trust may also result in a “counterproductive” effect if the tracking capabilities are deactivated by individuals, rendering them incapable of seeking assistance in actual emergency situations (Barreras and Mathur 2007, p. 182).

LBS/IoT is a ‘double-edged sword’ 1.3.12

In summary, location monitoring and tracking by the state, corporations and individuals is often justified in terms of the benefits that can be delivered to the party responsible for monitoring/tracking and the subject being tracked. As such, Junglas and Spitzmüller (2005, p. 7) claim that location-based services can be considered a “double-edged sword” in that they can aid in the performance of tasks in one instance, but may also generate Big Brother concerns. Furthermore, Perusco and Michael (2007, p. 10) note the linkage between trust and freedom. Accordingly, Perusco et al. (2006, p. 97) suggest a number of questions that must be considered in the context of LBS and trust: “Does the LBS context already involve a low level of trust?”; “If the LBS context involves a moderate to high level of trust, why are LBS being considered anyway?”; and “Will the use of LBS in this situation be trust-building or trust-destroying?” In answering these questions, the implications of LBS/IoT monitoring for trust must be appreciated, given they are significant, irreparable, and closely tied to what is considered the central challenge in the LBS domain, privacy.

This paper has provided comprehensive coverage of the themes of control and trust with respect to the social implications of LBS. The subsequent discussion will extend the examination to cover LBS in the context of the IoT, providing an ethical analysis and stressing the importance of a robust socio-ethical framework.

Discussion 1.4

The Internet of Things (IoT) and LBS: extending the discussion on control and trust 1.4.1

The Internet of Things (IoT) is an encompassing network of connected intelligent “things”, “comprised of smart machines interacting and communicating with other machines, objects, environments and infrastructures” (Freescale Semiconductor Inc. and ARM Inc. 2014, p. 1). The phrase was originally coined by Kevin Ashton in 1999, and a definitive definition is yet to be agreed upon (Ashton 2009, p. 1; Kranenburg and Bassi 2012, p. 1). Various related terms are often used interchangeably with the IoT, such as the Internet of Everything, the Internet of Things and People, and the Web of Things and People. The IoT can, however, be described in terms of its core characteristics and/or the features it encompasses. At the crux of the IoT concept is the integration of the physical and virtual worlds, and the capability for “things” within these realms to be operated remotely through the employment of intelligent or smart objects with embedded processing functionality (Mattern and Floerkemeier 2010, p. 242; Ethics Subgroup IoT 2013, p. 3). These smart objects are capable of storing historical and varied forms of data, used as the basis for future interactions and the establishment of preferences. That is, once the data is processed, it can be utilised to “command and control” things within the IoT ecosystem, ideally enhancing the everyday lives of individuals (Michael, K. et al. 2010).

According to Ashton (2009, p. 1), the IoT infrastructure should “empower computers” and exhibit less reliance on human involvement in the collection of information. It should also allow for “seamless” interactions and connections (Ethics Subgroup IoT 2013, p. 2). Potential use cases include personal/home applications, health/patient monitoring systems, and remote tracking and monitoring which may include applications such as asset tracking amongst others (Ethics Subgroup IoT 2013, p. 3).

As can be anticipated with an ecosystem of this scale, the nature of interactions with the physical/virtual worlds and the varied “things” within them will undoubtedly be affected, dramatically altering the state of play. In the context of this paper, the focus is ultimately on the ethical concerns emerging from the use of LBS within the IoT infrastructure, characterised by its ubiquitous/pervasive nature, in view of the discussion above regarding control and trust. It is valuable at this point to identify the important role of LBS in the IoT infrastructure.

While the IoT can potentially encompass a myriad of devices, the mobile phone will likely feature as a key element within the ecosystem, providing connectivity between devices (Freescale Semiconductor Inc. and ARM Inc. 2014, p. 2). In essence, smart phones can therefore be perceived as the “mediator” between users, the internet and additional “things”, as is illustrated in Mattern and Floerkemeier (2010, p. 245, see figure 2). Significantly, most mobile devices are equipped with location and spatial capabilities, providing “localization”, whereby intelligent devices “are aware of their physical location, or can be located” (Mattern and Floerkemeier 2010, p. 244). An example of an LBS application in the IoT would be indoor navigation capabilities in the absence of GPS; or, in effect, seamless navigation between the outdoor and indoor environments.
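As an aside on how such GPS-free localisation can work in principle, the following sketch estimates a device's position from distances to three fixed indoor beacons (trilateration). The beacon positions and range measurements are hypothetical, and a real deployment would have to contend with noisy range estimates:

```python
import math

# Sketch of indoor localisation without GPS: trilateration from distance
# measurements to three fixed beacons. Positions and ranges are hypothetical.

def trilaterate(beacons, dists):
    """Solve for (x, y) given three beacon positions and distances to each."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    d1, d2, d3 = dists
    # Subtracting the first range equation from the other two linearises the system.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero if the beacons are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
device = (3.0, 4.0)  # the position to be recovered
ranges = [math.dist(device, b) for b in beacons]
print(trilaterate(beacons, ranges))  # ≈ (3.0, 4.0)
```

The same geometric idea underpins Wi-Fi and Bluetooth beacon positioning indoors, which is what makes seamless outdoor-to-indoor navigation plausible once GPS signal is lost.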

1.4.2 Control- and trust-related challenges in the IoT

It may be argued that the LBS control and trust implications discussed throughout this paper (in addition to ethical challenges such as privacy and security) will carry over into the IoT environment. However, it has also been suggested that “the IoT will essentially create much richer environments in which location-based and location-aware technology can function” (Blouin 2014), and in doing so the ethical challenges will be amplified. It has further been noted that ethical issues, including trust and control amongst others, will “gain a new dimension in light of the increased complexity” in the IoT environment (Ethics Subgroup IoT 2013, p. 2).

In relation to control and the previously identified surveillance metaphors, for instance, it is predicted that there will be less reliance on Orwell's notion of Big Brother, whereby surveillance is conducted by a single entity. Rather, the concept of “some brother” will emerge. Some brother can be defined as “a heterogeneous ‘mass’ consisting of innumerable social actors, e.g. public sector authorities, citizens' movements and NGOs, economic players, big corporations, SMEs and citizens” (Ethics Subgroup IoT 2013, p. 16). As can be anticipated, the ethical consequences and dangers can potentially multiply in such a scenario.

Following on from this idea is that of a lack of transparency. The IoT will inevitably result in the merging of both the virtual and physical worlds, in addition to public and private spaces. It has been suggested that a lack of transparency regarding information access will create a sense of discomfort and will accordingly result in diminishing levels of trust (Ethics Subgroup IoT 2013, p. 8). The trust-related issues (relevant to LBS) are likely to be consistent with those discussed throughout this paper, possibly varying in intensity/severity depending on a given scenario. For example, the consequences of faulty IoT technology have the potential to be greater than those of conventional Internet services given the integration of the physical and virtual worlds, thereby impacting users’ trust in the IoT (Ethics Subgroup IoT 2013, p. 11). Therefore, trust considerations must primarily be examined in terms of: (a) trust in technology, and (b) trust in individuals/others.

Dealing with these (and other) challenges requires an ethical analysis in which appropriate conceptual and practical frameworks are considered. A preliminary examination is provided in the subsequent section, followed by dialogue regarding the need for objectivity in socio-ethical studies and the associated difficulties in achieving this.

1.4.3 Ethical analysis: proposing a socio-ethical conceptual framework

Research into the social and ethical implications of LBS, emerging technologies in general, and the IoT can be categorized in many ways and many frameworks can be applied. For instance, it may be regarded as a strand of “cyberethics”, defined by Tavani (2007, p. 3) as “the study of moral, legal and social issues involving cybertechnology”. Cybertechnology encompasses technological devices ranging from individual computers through to networked information and communication technologies. When considering ethical issues relating to cybertechnology and technology in general, Tavani (2007, pp. 23-24) notes that the latter should not necessarily be perceived as neutral. That is, technology may have “embedded values and biases” (Tavani 2007, p. 24), in that it may inherently provide capabilities to individuals to partake in unethical activities. This sentiment is echoed by Wakunuma and Stahl (2014, p. 393) in a paper examining the perceptions of IS professionals in relation to emerging ethical concerns.

Alternatively, research in this domain may be classed as a form of “computer ethics” or “information ethics”, which can be defined and applied using numerous approaches. While this article does not attempt to provide an in-depth account of information ethics, a number of its crucial characteristics are identified. In the first instance, the value of information ethics is in its ability to provide a conceptual framework for understanding the array of ethical challenges stemming from the introduction of new ICTs (Mathiesen 2004, p. 1). According to Floridi (1999), the question at the heart of information ethics is “what is good for an information entity and the infosphere in general?” The author continues that “more analytically, we shall say that [information ethics] determines what is morally right or wrong, what ought to be done, what the duties, the ‘oughts’ and the ‘ought nots’ of a moral agent are…” However, Capurro (2006, p. 182) disagrees, claiming that information ethics is additionally about “what is good for our bodily being-in-the-world with others in particular?” This involves contemplation of other “spheres” such as the ecological, political, economic, and cultural and is not limited to a study of the infosphere as suggested by Floridi. In this sense, the significance of context, environment and intercultural factors also becomes apparent.

Following on from these notions, there is the need for a robust ethical framework that is multi-dimensional in nature and explicitly covers the socio-ethical challenges emerging from the deployment of a given technology. This would include, but not be limited to, the control and trust issues identified throughout this paper, other concerns such as privacy and security, and any challenges that emerge as the IoT takes shape. This article proposes a broader, more robust socio-ethical conceptual framework as an appropriate means of examining and addressing ethical challenges relevant to LBS; both LBS in general and as a vital mediating component within the IoT. This framework is illustrated in Figure 1. Central to the socio-ethical framework is the contemplation of individuals as part of a broader social network or society, whilst considering the interactions amongst various elements of the overall “system”. The four themes underpinning socio-ethical studies include the investigation of what the human purpose is, what is moral, how justice is upheld and the principles that guide the usage of a given technique. Participants; their interactions with systems; people's concerns and behavioural expectations; cultural and religious beliefs; structures, rules and norms; and fairness, personal benefits and personal harms are all areas of interest in a socio-ethical approach.
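One way a practitioner might operationalise the framework's areas of interest is as a structured assessment checklist, recording what has been considered for a given deployment and how complete the assessment is. The sketch below is a hypothetical rendering under that assumption — the field names and the child-tracking entries are illustrative, not part of the published framework:

```python
from dataclasses import dataclass, field

@dataclass
class SocioEthicalAssessment:
    """Hypothetical checklist mirroring the framework's areas of interest."""
    participants: list = field(default_factory=list)
    system_interactions: list = field(default_factory=list)
    concerns_and_expectations: list = field(default_factory=list)
    cultural_religious_beliefs: list = field(default_factory=list)
    structures_rules_norms: list = field(default_factory=list)
    benefits_and_harms: list = field(default_factory=list)

    def coverage(self) -> float:
        """Fraction of framework components with at least one entry."""
        areas = [self.participants, self.system_interactions,
                 self.concerns_and_expectations,
                 self.cultural_religious_beliefs,
                 self.structures_rules_norms, self.benefits_and_harms]
        return sum(bool(a) for a in areas) / len(areas)

# Partial assessment of a child-tracking LBS application:
a = SocioEthicalAssessment(
    participants=["child", "parent"],
    system_interactions=["parent views child's location trail"],
    benefits_and_harms=["safety assurance", "privacy invasion"],
)
print(f"{a.coverage():.2f}")  # 3 of 6 components populated
```

The value of such a structure is simply that it makes gaps visible: an assessment that never records cultural beliefs or behavioural expectations can be flagged as incomplete before conclusions are drawn.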

Figure 1: Proposed socio-ethical framework, in terms of the major components that require consideration


This article is intended to offer a preliminary account of the socio-ethical conceptual framework being proposed. Further research would examine and test its validity, whilst also providing a more detailed account of the various components within and how a socio-ethical assessment would be conducted based on the framework, and the range of techniques that could be applied.

1.4.4 The need for objectivity

Regardless of categorization and which conceptual framework is adopted, numerous authors stress that the focus of research and debates should not be skewed towards the unethical uses of a particular technology; rather, an objective stance should be embraced. Such objectivity must nonetheless ensure that social interests are adequately represented. With respect to location and tracking technologies, Clarke (2001b, p. 220) claims that social interests have been somewhat overshadowed by the economic interests of LBS organisations, a situation that requires rectifying. While information technology professionals are not necessarily liable for how technology is deployed, they must nonetheless recognise its implications and be engaged in the process of introducing and promoting adequate safeguards (Clarke 1988, pp. 510-511). It has been argued that IS professionals are generally uninterested in the ethical challenges associated with emerging ICTs, and are instead concerned with the job or the technologies themselves (Wakunuma and Stahl 2014, p. 383).

This is particularly the case for LBS, given that the industry and technology have developed more quickly than the equivalent social implications scholarship and research, an unfavourable situation given the potential for LBS to have profound impacts on individuals and society (Perusco et al. 2006, p. 91). In a keynote address centred on defining the emerging notion of überveillance, Clarke (2007a, p. 34) discusses the need to measure the costs and disbenefits arising from surveillance practices in general, where costs refer to financial measures, and disbenefits to all non-economic impacts. This involves weighing the negatives against the potential advantages, a response that is applicable to LBS and pertinent to seeking objectivity.

1.4.5 Difficulties associated with objectivity

However, a major challenge with respect to an impartial approach for LBS is the interplay between the constructive and the potentially damaging consequences that the technology facilitates. For instance, and with specific reference to wireless technologies in a business setting, Elliot and Phillips (2004, p. 474) maintain that such systems facilitate monitoring and surveillance which can be applied in conflicting scenarios. Positive applications, according to Elliot and Phillips, include monitoring to improve effectiveness or provide employee protection in various instances, although this view has been frequently contested. Alternatively, negative uses involve excessive monitoring, which may compromise privacy or lead to situations in which an individual is subjected to surveillance or unauthorised forms of monitoring.

Additional studies demonstrate the complexities arising from the dual, and opposing, uses of a single LBS solution. It has been illustrated that any given application, for instance parent, healthcare, employee and criminal tracking applications, can be simultaneously perceived as ethical and unethical (Michael et al. 2006a, p. 7). A closer look at the scenario involving parents tracking children, as explained by Michael et al. (2006a, p. 7), highlights that child tracking can help ensure the safety of a child on the one hand, while invading their privacy on the other. Therefore, the dual and opposing uses of a single LBS solution become problematic and situation-dependent, and indeed increasingly difficult to examine objectively. Dobson and Fisher (2003, p. 50) maintain that technology cannot be perceived as either good or evil in that it is not directly the cause of unethical behaviour; rather, technologies serve to “empower those who choose to engage in good or bad behaviour.”

This is similarly the case in relation to the IoT, as public approval of the IoT is largely centred on “the conventional dualisms of ‘security versus freedom’ and ‘comfort versus data privacy’” (Mattern and Floerkemeier 2010, p. 256). Assessing the implications of the IoT infrastructure as a whole is increasingly difficult.

An alternative obstacle is associated with the extent to which LBS threaten the integrity of the individual. Explicitly, the risks associated with location and tracking technologies “arise from individual technologies and the trails that they generate, from compounds of multiple technologies, and from amalgamated and cross-referenced trails captured using multiple technologies and arising in multiple contexts” (Clarke 2001b, p. 218). The consequent social implications or “dangers” are thus a product of individuals being convicted, correctly or otherwise, of having committed a particular action (Clarke 2001b, p. 219). A wrongly accused individual may perceive the disbenefits arising from LBS as outweighing the benefits.

However, in situations where integrity is not compromised, an LBS application can be perceived as advantageous. For instance, Michael et al. (2006c, pp. 1-11) refer to the potentially beneficial uses of LBS in their paper on the Avian Flu Tracker prototype, which is intended to manage and contain the spread of the infectious disease by relying on spatial data to communicate with individuals in a defined location. The authors demonstrate that their proposed system, which is intended to operate on a subscription or opt-in basis, is beneficial for numerous stakeholders such as government, health organisations and citizens (Michael et al. 2006c, p. 6).

Thus, a common challenge confronting researchers with respect to the study of morals, ethics and technology is that the field of ethics is subjective. That is, what constitutes right and wrong behaviour varies depending on the beliefs of a particular individual, which are understood to be based on cultural and other factors specific to the individual in question. One such factor is an individual’s experience with the technology, as can be seen in the previous example centred on the notion of an unjust accusation. Given these subjectivities and the potential for inconsistency from one individual to the next, Tavani (2007, p. 47) asserts that there is the need for ethical theories to direct the analysis of moral issues (relating to technology), given that numerous complications or disagreements exist in examining ethics.

1.5 Conclusion

This article has provided a comprehensive review of the control- and trust-related challenges relevant to location-based services, in order to identify and describe the major social and ethical considerations within each of the themes. The relevance of the IoT to such discussions has been demonstrated, and a socio-ethical framework proposed to encourage discussion and further research into the socio-ethical implications of the IoT, with a focus on LBS and/or localization technologies. The proposed socio-ethical conceptual framework requires further elaboration, and it is recommended that a thorough analysis, beyond information ethics, be conducted on the basis of this paper in future work. The IoT is by its very nature subject to socio-ethical dilemmas because, for the greater part, the human is removed from decision-making processes and is instead subject to a machine.

References

Abbas, R., Michael, K., Michael, M.G. & Aloudat, A.: Emerging Forms of Covert Surveillance Using GPS-Enabled Devices. Journal of Cases on Information Technology 13(2), 2011, 19-33.

Albrecht, K. & McIntyre, L.: Spychips: How Major Corporations and Government Plan to Track Your Every Purchase and Watch Your Every Move. Thomas Nelson, 2005.

Albrecht, K. & Michael, K.: Connected: To Everyone and Everything. IEEE Technology and Society Magazine, Winter, 2013, 31-34.

Alder, G.S., Noel, T.W. & Ambrose, M.L.: Clarifying the Effects of Internet Monitoring on Job Attitudes: The Mediating Role of Employee Trust. Information & Management, 43, 2006, 894-903.

Aloudat, A. & Michael, K.: The Socio-Ethical Considerations Surrounding Government Mandated Location-Based Services During Emergencies: An Australian Case Study, in M. Quigley (ed.), ICT Ethics and Security in the 21st Century: New Developments and Applications. IGI Global, Hershey, PA, 2010, 1-26.

Aloudat, A. & Michael, K.: Toward the Regulation of Ubiquitous Mobile Government: A case Study on Location-Based Emergency Services in Australia. Electronic Commerce Research, 11(1), 2011, 31-74.

Andrejevic, M.: ISpy: Surveillance and Power in the Interactive Era. University Press of Kansas, Lawrence, 2007.

Arvidsson, A.: On the ‘Pre-History of the Panoptic Sort’: Mobility in Market Research. Surveillance & Society, 1(4), 2004, 456-474.

Ashton, K.: That "Internet of Things" Thing. RFID Journal, 2009, www.rfidjournal.com/articles/pdf?4986

Barreras, A. & Mathur, A.: Chapter 18. Wireless Location Tracking, in K.R. Larsen and Z.A. Voronovich (eds.), Convenient or Invasive: The Information Age. Ethica Publishing, United States, 2007, 176-186.

Bauer, H.H., Barnes, S.J., Reichardt, T. & Neumann, M.M.: Driving the Consumer Acceptance of Mobile Marketing: A Theoretical Framework and Empirical Study. Journal of Electronic Commerce Research, 6(3), 2005, 181-192.

Beinat, E., Steenbruggen, J. & Wagtendonk, A.: Location Awareness 2020: A Foresight Study on Location and Sensor Services. Vrije Universiteit, Amsterdam, 2007, http://reference.kfupm.edu.sa/content/l/o/location_awareness_2020_2_108_86452.pdf

Bellavista, P., Küpper, A. & Helal, S.: Location-Based Services: Back to the Future. IEEE Pervasive Computing, 7(2), 2008, 85-89.

Bennett, C.J. & Regan, P.M.: Surveillance and Mobilities. Surveillance & Society, 1(4), 2004, 449-455.

Bentham, J. & Bowring, J.: The Works of Jeremy Bentham. Published under the Superintendence of His Executor, John Bowring, Volume IV, W. Tait, Edinburgh, 1843.

Blouin, D.: An Intro to Internet of Things. 2014, www.xyht.com/spatial-itgis/intro-to-internet-of-things/

Boesen, J., Rode, J.A. & Mancini, C.: The Domestic Panopticon: Location Tracking in Families. UbiComp’10, Copenhagen, Denmark, 2010, 65-74.

Böhm, A., Leiber, T. & Reufenheuser, B.: Trust and Transparency in Location-Based Services: Making Users Lose Their Fear of Big Brother. Proceedings Mobile HCI 2004 Workshop On Location Systems Privacy and Control, Glasgow, UK, 2004, 1-4.

Capurro, R.: Towards an Ontological Foundation of Information Ethics. Ethics and Information Technology, 8, 2006, 175-186.

Casal, C.R.: Impact of Location-Aware Services on the Privacy/Security Balance, Info: the Journal of Policy, Regulation and Strategy for Telecommunications. Information and Media, 6(2), 2004, 105-111.

Chellappa, R. & Sin, R.G.: Personalization Versus Privacy: An Empirical Examination of the Online Consumer’s Dilemma. Information Technology and Management, 6, 2005, 181-202.

Chen, J.V., Ross, W. & Huang, S.F.: Privacy, Trust, and Justice Considerations for Location-Based Mobile Telecommunication Services. info, 10(4), 2008, 30-45.

Chen, J.V. & Ross, W.H.: The Managerial Decision to Implement Electronic Surveillance at Work. International Journal of Organizational Analysis, 13(3), 2005, 244-268.

Clarke, R.: Information Technology and Dataveillance. Communications of the ACM, 31(5), 1988, 498-512.

Clarke, R.: Profiling: A Hidden Challenge to the Regulation of Data Surveillance. 1993, http://www.rogerclarke.com/DV/PaperProfiling.html.

Clarke, R.: The Digital Persona and Its Application to Data Surveillance. 1994, http://www.rogerclarke.com/DV/DigPersona.html.

Clarke, R.: Introduction to Dataveillance and Information Privacy, and Definitions of Terms. 1997, http://www.anu.edu.au/people/Roger.Clarke/DV/Intro.html.

Clarke, R.: Person Location and Person Tracking - Technologies, Risks and Policy Implications. Information Technology & People, 14(2), 2001b, 206-231.

Clarke, R.: Privacy as a Means of Engendering Trust in Cyberspace Commerce. The University of New South Wales Law Journal, 24(1), 2001c, 290-297.

Clarke, R.: While You Were Sleeping… Surveillance Technologies Arrived. Australian Quarterly, 73(1), 2001d, 10-14.

Clarke, R.: Privacy on the Move: The Impacts of Mobile Technologies on Consumers and Citizens. 2003b, http://www.anu.edu.au/people/Roger.Clarke/DV/MPrivacy.html.

Clarke, R.: Have We Learnt to Love Big Brother? Issues, 71, June, 2005, 9-13.

Clarke, R.: What's 'Privacy'? 2006, http://www.rogerclarke.com/DV/Privacy.html.

Clarke, R.: Chapter 3. What 'Uberveillance' Is and What to Do About It, in K. Michael and M.G. Michael (eds.), The Second Workshop on the Social Implications of National Security, University of Wollongong, Wollongong, Australia, 2007a, 27-46.

Clarke, R.: Chapter 4. Appendix to What 'Uberveillance' Is and What to Do About It: Surveillance Vignettes, in K. Michael and M.G. Michael (eds.), The Second Workshop on the Social Implications of National Security, University of Wollongong, Wollongong, Australia, 2007b, 47-60.

Clarke, R.: Surveillance Vignettes Presentation. 2007c, http://www.rogerclarke.com/DV/SurvVign-071029.ppt.

Clarke, R.: Privacy Impact Assessment: Its Origins and Development. Computer Law & Security Review, 25(2), 2009, 123-135.

Clarke, R. & Wigan, M.: You Are Where You've Been: The Privacy Implications of Location and Tracking Technologies. 2011, http://www.rogerclarke.com/DV/YAWYB-CWP.html.

Culnan, M.J. & Bies, R.J.: Consumer Privacy: Balancing Economic and Justice Considerations. Journal of Social Issues, 59(2), 2003, 323-342.

Davis, D.W. & Silver, B.D.: Civil Liberties vs. Security: Public Opinion in the Context of the Terrorist Attacks on America. American Journal of Political Science, 48(1), 2004, 28-46.

Dinev, T., Bellotto, M., Hart, P., Colautti, C., Russo, V. & Serra, I.: Internet Users’ Privacy Concerns and Attitudes Towards Government Surveillance – an Exploratory Study of Cross-Cultural Differences between Italy and the United States. 18th Bled eConference eIntegration in Action, Bled, Slovenia, 2005, 1-13.

Dobson, J.E. & Fisher, P.F.: Geoslavery. IEEE Technology and Society Magazine, 22(1), 2003, 47-52.

Dobson, J.E. & Fisher, P.F.: The Panopticon's Changing Geography. Geographical Review, 97(3), 2007, 307-323.

Dwyer, C., Hiltz, S.R. & Passerini, K.: Trust and Privacy Concern within Social Networking Sites: A Comparison of Facebook and Myspace. Proceedings of the Thirteenth Americas Conference on Information Systems, Keystone, Colorado, 2007, 1-12.

Elliot, G. & Phillips, N.: Mobile Commerce and Wireless Computing Systems. Pearson Education Limited, Great Britain, 2004.

Ethics Subgroup IoT: Fact sheet- Ethics Subgroup IoT - Version 4.0, European Commission. 2013, 1-21, http://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&uact=8&ved=0CB0QFjAA&url=http%3A%2F%2Fec.europa.eu%2Finformation_society%2Fnewsroom%2Fcf%2Fdae%2Fdocument.cfm%3Fdoc_id%3D1751&ei=5i7RVK-FHczYavKWgPgL&usg=AFQjCNG_VgeaUP_DIJVwSiPIww3bC9Ug_w

Freescale Semiconductor Inc. and ARM Inc.: Whitepaper: What the Internet of Things (IoT) Needs to Become a Reality. 2014, 1-16, cache.freescale.com/files/32bit/doc/white_paper/INTOTHNGSWP.pdf

Floridi, L.: Information Ethics: On the Philosophical Foundation of Computer Ethics. Ethics and Information Technology, 1, 1999, 37-56.

Foucault, M.: Discipline and Punish: The Birth of the Prison. Second Vintage Books Edition May 1995, Vintage Books: A Division of Random House Inc, New York, 1977.

Fusco, S.J., Michael, K., Aloudat, A. & Abbas, R.: Monitoring People Using Location-Based Social Networking and Its Negative Impact on Trust: An Exploratory Contextual Analysis of Five Types of “Friend” Relationships. IEEE Symposium on Technology and Society, Illinois, Chicago, 2011.

Fusco, S.J., Michael, K., Michael, M.G. & Abbas, R.: Exploring the Social Implications of Location Based Social Networking: An Inquiry into the Perceived Positive and Negative Impacts of Using LBSN between Friends. 9th International Conference on Mobile Business, Athens, Greece, IEEE, 2010, 230-237.

Gagnon, M., Jacob, J.D., Guta, A.: Treatment adherence redefined: a critical analysis of technotherapeutics. Nurs Inq. 20(1), 2013, 60-70.

Ganascia, J.G.: The Generalized Sousveillance Society. Social Science Information, 49(3), 2010, 489-507.

Gandy, O.H.: The Panoptic Sort: A Political Economy of Personal Information. Westview, Boulder, Colorado, 1993.

Giaglis, G.M., Kourouthanassis, P. & Tsamakos, A.: Chapter IV. Towards a Classification Framework for Mobile Location-Based Services, in B.E. Mennecke and T.J. Strader (eds.), Mobile Commerce: Technology, Theory and Applications. Idea Group Publishing, Hershey, US, 2003, 67-85.

Gould, J.B.: Playing with Fire: The Civil Liberties Implications of September 11th. Public Administration Review, 62, 2002, 74-79.

Jorns, O. & Quirchmayr, G.: Trust and Privacy in Location-Based Services. Elektrotechnik & Informationstechnik, 127(5), 2010, 151-155.

Junglas, I. & Spitzmüller, C.: A Research Model for Studying Privacy Concerns Pertaining to Location-Based Services. Proceedings of the 38th Hawaii International Conference on System Sciences, 2005, 1-10.

Kaasinen, E.: User Acceptance of Location-Aware Mobile Guides Based on Seven Field Studies. Behaviour & Information Technology, 24(1), 2003, 37-49.

Kaupins, G. & Minch, R.: Legal and Ethical Implications of Employee Location Monitoring. Proceedings of the 38th Hawaii International Conference on System Sciences. 2005, 1-10.

Kim, D.J., Ferrin, D.L. & Rao, H.R.: Trust and Satisfaction, Two Stepping Stones for Successful E-Commerce Relationships: A Longitudinal Exploration. Information Systems Research, 20(2), 2009, 237-257.

King, L.: Information, Society and the Panopticon. The Western Journal of Graduate Research, 10(1), 2001, 40-50.

Kodl, J. & Lokay, M.: Human Identity, Human Identification and Human Security. Proceedings of the Conference on Security and Protection of Information, Idet Brno, Czech Republic, 2001, 129-138.

Kranenburg, R.V. and Bassi, A.: IoT Challenges, Communications in Mobile Computing. 1(9), 2012, 1-5.

Küpper, A. & Treu, G.: Next Generation Location-Based Services: Merging Positioning and Web 2.0., in L.T. Yang, A.B. Waluyo, J. Ma, L. Tan and B. Srinivasan (eds.), Mobile Intelligence. John Wiley & Sons Inc, Hoboken, New Jersey, 2010, 213-236.

Levin, A., Foster, M., West, B., Nicholson, M.J., Hernandez, T. & Cukier, W.: The Next Digital Divide: Online Social Network Privacy. Ryerson University, Ted Rogers School of Management, Privacy and Cyber Crime Institute, 2008, www.ryerson.ca/tedrogersschool/privacy/Ryerson_Privacy_Institute_OSN_Report.pdf.

Lewis, J.D. & Weigert, A.: Trust as a Social Reality. Social Forces, 63(4), 1985, 967-985.

Lyon, D.: The World Wide Web of Surveillance: The Internet and Off-World Power Flows. Information, Communication & Society, 1(1), 1998, 91-105.

Lyon, D.: Surveillance Society: Monitoring Everyday Life. Open University Press, Philadelphia, PA, 2001.

Lyon, D.: Surveillance Studies: An Overview. Polity, Cambridge, 2007.

Macquarie Dictionary.: 'Uberveillance', in S. Butler, Fifth Edition of the Macquarie Dictionary, Australia's National Dictionary. Sydney University, 2009, 1094.

Mann, S.: Sousveillance and Cyborglogs: A 30-Year Empirical Voyage through Ethical, Legal, and Policy Issues. Presence, 14(6), 2005, 625-646.

Mann, S., Nolan, J. & Wellman, B.: Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments. Surveillance & Society, 1(3), 2003, 331-355.

Mathiesen, K.: What is Information Ethics? Computers and Society, 32(8), 2004, 1-11.

Mattern, F. and Floerkemeier, K.: From the Internet of Computers to the Internet of Things, in Sachs, K., Petrov, I. & Guerrero, P. (eds.), From Active Data Management to Event-Based Systems and More. Springer-Verlag Berlin Heidelberg, 2010, 242-259.

Marx, G.T. & Steeves, V.: From the Beginning: Children as Subjects and Agents of Surveillance. Surveillance & Society, 7(3/4), 2010, 192-230.

Mayer, R.C., Davis, J.H. & Schoorman, F.D.: An Integrative Model of Organizational Trust. The Academy of Management Review, 20(3), 1995, 709-734.

McKnight, D.H. & Chervany, N.L.: What Trust Means in E-Commerce Customer Relationships: An Interdisciplinary Conceptual Typology. International Journal of Electronic Commerce, 6(2), 2001, 35-59.

Metzger, M.J.: Privacy, Trust, and Disclosure: Exploring Barriers to Electronic Commerce. Journal of Computer-Mediated Communication, 9(4), 2004.

Michael, K. & Clarke, R.: Location and Tracking of Mobile Devices: Überveillance Stalks the Streets. Computer Law and Security Review, 29(2), 2013, 216-228.

Michael, K., McNamee, A. & Michael, M.G.: The Emerging Ethics of Humancentric GPS Tracking and Monitoring. International Conference on Mobile Business, Copenhagen, Denmark, IEEE Computer Society, 2006a, 1-10.

Michael, K., McNamee, A., Michael, M.G., and Tootell, H.: Location-Based Intelligence – Modeling Behavior in Humans using GPS. IEEE International Symposium on Technology and Society, 2006b.

Michael, K., Stroh, B., Berry, O., Muhlbauer, A. & Nicholls, T.: The Avian Flu Tracker - a Location Service Proof of Concept. Recent Advances in Security Technology, Australian Homeland Security Research Centre, 2006c, 1-11.

Michael, K. and Michael, M.G.: Australia and the New Technologies: Towards Evidence Based Policy in Public Administration (1 ed). Wollongong, Australia: University of Wollongong, 2008, Available at: http://works.bepress.com/kmichael/93

Michael, K. & Michael, M.G.: Microchipping People: The Rise of the Electrophorus. Quadrant, 49(3), 2005, 22-33.

Michael, K. and Michael, M.G.: From Dataveillance to Überveillance (Uberveillance) and the Realpolitik of the Transparent Society (1 ed). Wollongong: University of Wollongong, 2007. Available at: http://works.bepress.com/kmichael/51.

Michael, K. & Michael, M.G.: Innovative Automatic Identification and Location-Based Services: From Bar Codes to Chip Implants. IGI Global, Hershey, PA, 2009.

Michael, K. & Michael, M.G.: The Social and Behavioral Implications of Location-Based Services. Journal of Location-Based Services, 5(3/4), 2011, 1-15, http://works.bepress.com/kmichael/246.

Michael, K. & Michael, M.G.: Sousveillance and Point of View Technologies in Law Enforcement: An Overview, in The Sixth Workshop on the Social Implications of National Security: Sousveillance and Point of View Technologies in Law Enforcement, University of Sydney, NSW, Australia, Feb. 2012.

Michael, K., Roussos, G., Huang, G.Q., Gadh, R., Chattopadhyay, A., Prabhu, S. and Chu, P.: Planetary-scale RFID Services in an Age of Uberveillance. Proceedings of the IEEE, 98.9, 2010, 1663-1671.

Michael, M.G. and Michael, K.: National Security: The Social Implications of the Politics of Transparency. Prometheus, 24(4), 2006, 359-364.

Michael, M.G. & Michael, K.: Towards a State of Uberveillance. IEEE Technology and Society Magazine, 29(2), 2010, 9-16.

Michael, M.G. & Michael, K. (eds): Uberveillance and the Social Implications of Microchip Implants: Emerging Technologies. Hershey, PA, IGI Global, 2013.

O'Connor, P.J. & Godar, S.H.: Chapter XIII. We Know Where You Are: The Ethics of LBS Advertising, in B.E. Mennecke and T.J. Strader (eds.), Mobile Commerce: Technology, Theory and Applications, Idea Group Publishing, Hershey, US, 2003, 245-261.

Orwell, G.: Nineteen Eighty Four. McPherson Printing Group, Maryborough, Victoria, 1949.

Oxford Dictionary: Control, Oxford University Press, 2012a http://oxforddictionaries.com/definition/control?q=control.

Oxford Dictionary: Trust, Oxford University Press, 2012b, http://oxforddictionaries.com/definition/trust?q=trust.

Pavlou, P.A.: Consumer Acceptance of Electronic Commerce: Integrating Trust and Risk with the Technology Acceptance Model. International Journal of Electronic Commerce, 7(3), 2003, 69-103.

Perusco, L. & Michael, K.: Humancentric Applications of Precise Location Based Services, in IEEE International Conference on e-Business Engineering, Beijing, China, IEEE Computer Society, 2005, 409-418.

Perusco, L. & Michael, K.: Control, Trust, Privacy, and Security: Evaluating Location-Based Services. IEEE Technology and Society Magazine, 26(1), 2007, 4-16.

Perusco, L., Michael, K. & Michael, M.G.: Location-Based Services and the Privacy-Security Dichotomy, in Proceedings of the Third International Conference on Mobile Computing and Ubiquitous Networking, London, UK, Information Processing Society of Japan, 2006, 91-98.

Quinn, M.J.: Ethics for the Information Age. Second Edition, Pearson/Addison-Wesley, Boston, 2006.

Renegar, B., Michael, K. & Michael, M.G.: Privacy, Value and Control Issues in Four Mobile Business Applications, in 7th International Conference on Mobile Business (ICMB2008), Barcelona, Spain, IEEE Computer Society, 2008, 30-40.

Rozenfeld, M.: The Value of Privacy: Safeguarding your information in the age of the Internet of Everything, The Institute: the IEEE News Source, 2014, http://theinstitute.ieee.org/technology-focus/technology-topic/the-value-of-privacy.

Rummel, R.J.: Death by Government. Transaction Publishers, New Brunswick, New Jersey, 1997.

Sanquist, T.F., Mahy, H. & Morris, F.: An Exploratory Risk Perception Study of Attitudes toward Homeland Security Systems. Risk Analysis, 28(4), 2008, 1125-1133.

Schoorman, F.D., Mayer, R.C. & Davis, J.H.: An Integrative Model of Organizational Trust: Past, Present, and Future. Academy of Management Review, 32(2), 2007, 344-354.

Shay, L.A., Conti, G., Larkin, D. & Nelson, J.: A framework for analysis of quotidian exposure in an instrumented world. IEEE International Carnahan Conference on Security Technology (ICCST), 2012, 126-134.

Siau, K. & Shen, Z.: Building Customer Trust in Mobile Commerce. Communications of the ACM, 46(4), 2003, 91-94.

Solove, D.: Digital Dossiers and the Dissipation of Fourth Amendment Privacy. Southern California Law Review, 75, 2002, 1083-1168.

Solove, D.: The Digital Person: Technology and Privacy in the Information Age. New York University Press, New York, 2004.

Tavani, H.T.: Ethics and Technology: Ethical Issues in an Age of Information and Communication Technology. John Wiley, Hoboken, N.J., 2007.

Valacich, J.S.: Ubiquitous Trust: Evolving Trust into Ubiquitous Computing Environments. Business, Washington State University, 2003, 1-2.

van Ooijen, C. & Nouwt, S.: Power and Privacy: The Use of LBS in Dutch Public Administration, in B. van Loenen, J.W.J. Besemer and J.A. Zevenbergen (eds.), Sdi Convergence. Research, Emerging Trends, and Critical Assessment, Nederlandse Commissie voor Geodesie Netherlands Geodetic Commission 48, 2009, 75-88.

Wakunuma, K.J. & Stahl, B.C.: Tomorrow’s Ethics and Today’s Response: An Investigation into The Ways Information Systems Professionals Perceive and Address Emerging Ethical Issues. Information Systems Frontiers, 16, 2014, 383-397.

Weckert, J.: Trust and Monitoring in the Workplace. IEEE International Symposium on Technology and Society, 2000. University as a Bridge from Technology to Society, 2000, 245-250.

Wigan, M. & Clarke, R.: Social Impacts of Transport Surveillance. Prometheus, 24(4), 2006, 389-403.

Xu, H. & Teo, H.H.: Alleviating Consumers’ Privacy Concerns in Location-Based Services: A Psychological Control Perspective. Twenty-Fifth International Conference on Information Systems, 2004, 793-806.

Xu, H., Teo, H.H. & Tan, B.C.Y.: Predicting the Adoption of Location-Based Services: The Role of Trust and Perceived Privacy Risk. Twenty-Sixth International Conference on Information Systems, 2005, 897-910.

Yan, Z. & Holtmanns, S.: Trust Modeling and Management: From Social Trust to Digital Trust, in R. Subramanian (ed.), Computer Security, Privacy and Politics: Current Issues, Challenges and Solutions. IGI Global, 2008, 290-323.

Yeh, Y.S. & Li, Y.M.: Building Trust in M-Commerce: Contributions from Quality and Satisfaction. Online Information Review, 33(6), 2009, 1066-1086.

Citation: Roba Abbas, Katina Michael, M.G. Michael, "Using a Social-Ethical Framework to Evaluate Location-Based Services in an Internet of Things World", IRIE, International Review of Information Ethics, http://www.i-r-i-e.net/ Source: http://www.i-r-i-e.net/inhalt/022/IRIE-Abbas-Michael-Michael.pdf Dec 2014

Author(s):

Honorary Fellow Dr Roba Abbas:

·         School of Information Systems and Technology, University of Wollongong, Northfields Avenue, Wollongong NSW 2522, Australia

·         Phone: +612 4221 3555, Email: roba@uow.edu.au, Web: http://www.technologyandsociety.org/members/2013/7/25/dr-roba-abbas

·         Relevant publications:

o    R. Abbas, K. Michael, M.G. Michael, R. Nicholls, Sketching and validating the location-based services (LBS) regulatory framework in Australia, Computer Law and Security Review 29, No.5 (2013): 576-589.

o    R. Abbas, K. Michael, M.G. Michael, The Regulatory Considerations and Ethical Dilemmas of Location-Based Services (LBS): A Literature Review, Information Technology & People 27, No.1 (2014): 2-20.

Associate Professor Katina Michael:

·         School of Information Systems and Technology, University of Wollongong, Northfields Avenue, Wollongong NSW 2522, Australia

·         Phone: +612 4221 3937, Email: katina@uow.edu.au, Web: http://ro.uow.edu.au/kmichael

·         Relevant publications:

o    K. Michael, R. Clarke, Location and Tracking of Mobile Devices: Überveillance Stalks the Streets, Computer Law and Security Review 29, No.3 (2013): 216-228.

o    K. Michael, M. G. Michael, Innovative Automatic Identification and Location-Based Services: From Bar Codes to Chip Implants, IGI Global, (2009).

o    L. Perusco, K. Michael, Control, trust, privacy, and security: evaluating location-based services, IEEE Technology and Society Magazine 26, No.1 (2007): 4-16.

Honorary Associate Professor M.G. Michael:

·         School of Information Systems and Technology, University of Wollongong, Northfields Avenue, Wollongong NSW 2522, Australia

·         Phone: +612 4221 3937, Email: mgm@uow.edu.au, Web: http://ro.uow.edu.au/mgmichael

·         Relevant publications:

o    M.G. Michael and K. Michael (eds) Uberveillance and the Social Implications of Microchip Implants: Emerging Technologies, Hershey, PA: IGI Global, (2013).

o    K. Michael, M. G. Michael, The Social and Behavioral Implications of Location-Based Services, Journal of Location-Based Services, Volume 5, Issue 3-4, (2011), 121-137.

o    M.G. Michael, K. Michael, Towards a State of Uberveillance, IEEE Technology and Society Magazine, 29, No.2, (2010): 9-16.

o    M. G. Michael, S. J. Fusco, K. Michael, A Research Note on Ethics in the Emerging Age of Uberveillance, Computer Communications, 31 No.6, 2008: 1192-1199.

Be Vigilant: There Are Limits to Veillance

The Computer After Me: Awareness and Self-Awareness in Autonomic Systems

Chapter 13: Be Vigilant: There Are Limits to Veillance


Katina Michael, M. G. Michael, Christine Perakslis

The following sections are included:

  • Introduction

  • From Fixed to Mobile Sensors

  • People as Sensors

  • Enter the Veillances

    • Surveillance

    • Dataveillance

    • Sousveillance

    • Überveillance

  • Colliding Principles

    • From ‘drone view’ to ‘person view’

    • Transparency and open data

    • Surveillance, listening devices and the law

    • Ethics and values

    • The unintended side effects of lifelogging

    • Pebbles and shells

    • When bad is good

    • Censorship

  • Summary and Conclusions: Mind/Body Distinction

13.1 Introduction

Be vigilant; we implore the reader. Yet, vigilance requires hard mental work (Warm et al., 2008). Humans have repeatedly shown evidence of poor performance on vigilance tasks, especially when facing such factors as complex or novel data, time pressure, and information overload (Ware, 2000). For years, researchers have investigated vigilance, from its positive impact on the survival of the ground squirrel in Africa to the vigilance decrement behind the poor performance of air traffic controllers. Scholars seem to agree: fatigue has a negative bearing on vigilance.

In our society, we have become increasingly fatigued, both physically and cognitively. It has been widely documented that employees are increasingly faced with time starvation, and that consequently self-imposed sleep deprivation is one of the primary reasons for increasing fatigue, as employees forego sleep in order to complete more work (see, for example, the online publications by the Society of Human Resources1 and the National Sleep Foundation2). Widespread access to technology exacerbates the problem, by making it possible to stay busy round the clock.

Our information-rich world, which leads to information overload and novel data, and our 24/7/365 connectivity, which leads to time pressure, both contribute to fatigue and so work against vigilance. However, the lack of vigilance, or the failure to accurately perceive, identify, or analyze bona fide threats, can lead to serious negative consequences, even a life-threatening state of affairs (Capurro, 2013).

This phenomenon, which can be termed vigilance fatigue, can be brought about by four factors:

·       Prolonged exposure to ambiguous, unspecified, and ubiquitous threat information.

·       Information overload.

·       Overwhelming pressure to maintain exceptional, error-free performance.

·       Faulty strategies for structuring informed decision-making under conditions of uncertainty and stress.

Therefore, as we ask the reader to be vigilant in this transformative, and potentially disruptive, transition toward the ‘computer after me’, we feel obligated to articulate clearly the potential threats associated with veillance. We believe we must ask the challenging and unpopular questions now. We must disclose and discuss the existence of risk, the values at stake, and the possibility of harm related to veillance. We owe it to the reader, in this world of increasing vigilance fatigue, to provide unambiguous, specified threat information and to bring it to their attention.

13.2 From Fixed to Mobile Sensors

Embedded sensors have provided us with a range of benefits and conveniences that many of us take for granted in our everyday life. We now find commonplace the auto-flushing lavatory and the auto-dispensing of soap and water for hand washing. Many of these practices are not only convenient but help to maintain health and hygiene. We even have embedded sensors in lamp-posts that can detect on-coming vehicles and are so energy efficient that they turn on as they detect movement, and then turn off again to conserve resources. However, these fixtures are static; they form basic infrastructure that often has ‘eyes’ (e.g. an image and/or motion sensor), but does not have ‘legs’.

What happens when these sensors – for identification, location, condition monitoring, point-of-view (POV) and more – become embeddable in mobile objects and begin to follow and track us everywhere we go? Our vehicles, tablets, smart phones, and even contactless smart cards are equipped to capture, synthesize, and communicate a plethora of information about our behaviors, traits, likes and dislikes, as we lug them around everywhere we go. Automatic licence plate scanners are mounted not only on streetlights or bridges, but now also on patrol cars. These scanners snap photos of passing automobiles and store such data as plate numbers, times, and locations within massive databases (Clarke, 2009). Stores are combining the use of static fixtures with mobile devices to better understand the psychographics and demographics of their shoppers (Michael and Clarke, 2013). The combination of these monitoring tools is powerful. Cell phone identifiers are used to track the movements of customers (even if the customer is not connected to the store’s WiFi network), while surveillance cameras collect biometric analytics to analyze facial expressions and moods. Along with an augmented capability to customize and personalize marketing efforts, stores can identify how long one tarries in an aisle, the customer’s reaction to a sale item, the age of the shopper, and even who did or did not walk by a certain display.
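The dwell-time analytics described above require nothing more exotic than timestamped sightings of a device identifier in a zone. As a minimal sketch (the event format, zone names, and device IDs here are hypothetical, not any vendor's actual schema), the aggregation could look like this:

```python
from collections import defaultdict
from datetime import datetime

def dwell_times(sightings):
    """Aggregate per-device dwell time (seconds) in each zone.

    `sightings` is an iterable of (iso_timestamp, device_id, zone) events,
    e.g. from in-store sensors. The interval between consecutive sightings
    of the same device is attributed to the zone it was last seen in.
    """
    last_seen = {}               # device_id -> (datetime, zone)
    totals = defaultdict(float)  # (device_id, zone) -> seconds
    for ts, dev, zone in sorted(sightings):
        t = datetime.fromisoformat(ts)
        if dev in last_seen:
            prev_t, prev_zone = last_seen[dev]
            totals[(dev, prev_zone)] += (t - prev_t).total_seconds()
        last_seen[dev] = (t, zone)
    return dict(totals)

# Synthetic example: one shopper lingering in an aisle before checkout
events = [
    ("2014-05-01T10:00:00", "ab:cd", "aisle-3"),
    ("2014-05-01T10:02:30", "ab:cd", "aisle-3"),
    ("2014-05-01T10:04:00", "ab:cd", "checkout"),
]
print(dwell_times(events))  # → {('ab:cd', 'aisle-3'): 240.0}
```

The point of the sketch is how little input is needed: a bare identifier and timestamps already yield the "how long did they tarry" metric, before any biometric layer is added.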

The human has now become an extension (voluntarily or involuntarily) of these location-based and affect-based technological breakthroughs; we the end-users are in fact the end-point of a complex network of networks. The devices we carry take on a life of their own, sending binary data upstream and downstream in the name of better connectivity, awareness, and ambient intelligence. ‘I am here’, the device continuously signals to the nearest access node, handshaking a more accurate location fix, as well as providing key behavioral indicators which can easily become predictors of future behaviors. However, it seems as if we, as a society, demand ever more communications technology – or at least that is the idea we are being sold. Technology has its many benefits: few people are out of reach now, and communication becomes easier, more personalized, and much more flexible. Through connectivity, people’s input is garnered and responses can be felt immediately. Yet, just as Newton’s action–reaction law comes into play in the physical realm, there are reactions to consider for the human not only in the physical realm, but also in the mental, emotional, and spiritual realms (Loehr and Schwartz, 2001), when we live our lives not only in the ordinary world, but also within the digital world.

Claims have been made that our life has become so busy today that we are grasping to gain back seconds in our day. It could be asked: why should we waste time and effort by manually entering all these now-necessary passwords, when a tattoo or pill could transmit an 18-bit authentication signal for automatic logon from within our bodies? We are led to believe that individuals are demanding uninterrupted connectivity; however, research has shown that some yearn to have the freedom to ‘live off the grid’, even if for only a short span of time (Pearce and Gretzel, 2012).

A recent front cover of the US business magazine Fast Company read “Unplug. My life was crazy. So I disconnected for 25 days. You should too”. The content within the publication includes coping mechanisms of senior-level professionals who are working to mitigate the consequences of perpetual connectivity through technology. One article reveals the digital dilemmas we now face (e.g. how much should I connect?); another provides tips on how to do a digital detox (e.g. disconnecting because of the price we pay); and yet another outlines how to bring sanity to your crazy, wired life with eight ways the busiest connectors give themselves a break (e.g. taking time each day to exercise in a way that makes it impossible to check your phone; ditching the phone to ensure undivided attention is given to colleagues; or establishing a company ‘Shabbat’ in which it is acceptable to unplug one day a week). Baratunde Thurston, CEO and co-founder of Cultivated Wit (and considered by some to be the world’s most connected man), wrote:

I love my devices and my digital services, I love being connected to the global hive mind – but I am more aware of the price we pay: lack of depth, reduced accuracy, lower quality, impatience, selfishness, and mental exhaustion, to name but a few. In choosing to digitally enhance lives, we risk not living them.
— (Thurston, 2013, p. 77)

13.3 People as Sensors

Enter Google Glass, Autographer, Memoto, TrackStick, Fitbit, and other wearable devices that are worn like spectacles or apparel, or tied round the neck. The more pervasive innovations such as electronic tattoos, nanopatches, smart pills, and ICT implants seamlessly become a ‘part’ of the body once attached, swallowed, embedded, or injected. These technologies are purported to be lifestyle choices that can provide a myriad of conveniences and productivity gains, as well as improved health and well-being functionality. Wearables are believed to have such benefits as enhancements to self-awareness, communication, memory, sensing, recognition, and logistical skills. Common experiences can be augmented, for example when a theme park character (apparently) knows your child’s name because of a wrist strap that acts as an admissions ticket, wallet, and ID.

Gone are the days when there was a stigma around electronic bracelets being used to track those on parole; these devices are now becoming much like a fashion statement and a desirable method not only for safety and security, but also for convenience and enhanced experiences. However, one must consider that an innocuous method for convenience may prove to create ‘people as sensors’, in which information is collected from the environment using unobtrusive measures, but with the wearer – as well as those around the wearer – possibly unaware of the extent of the data collection. In addition to issues around privacy, other questions must be asked, such as: what will be done with the data now and well into the future?

The metaphor of ‘people as sensors’, also referred to as Citizens as Sensors (Goodchild, 2007), is being espoused, as on-board chipsets allow an individual to look out toward another object or subject (e.g. using an image sensor), or to look inward toward oneself (e.g. measuring physiological characteristics with embedded surveillance devices). As optional prosthetic devices are incorporated into users, devices are recognized by some as becoming an extension of the person’s mind and body. New developments in ‘smart skin’ offer even more solutions. The skin can become a function of the user’s habits, personality, mood, or behavior. For example, when inserted into a shoe, the smart skin can analyze and improve the technical skill of an athlete, factors associated with body stresses related to activity, or even health issues that may result from the wearer’s use of high-heeled shoes (Papakostas et al., 2002). Simply put, human beings who function in analog are able to communicate digitally through the devices that they wear or bear. This is quite a different proposition from the typical surveillance camera that is bolted onto a wall overlooking the streetscape or mall and has a pre-defined field of view.

Fig. 13.1 People as sensors: from surveillance to uberveillance

‘People as sensors’ is far more pervasive than dash-cams used in police vehicles, and can be likened to law enforcement agencies putting on body-worn devices to collect real-time data from the field (see Figure 13.1). When everyday citizens are wearing and bearing these devices, they form a collective network by contributing individual subjective (and personal) observations of themselves and their surroundings. There are advantages; the community is believed to benefit from relevant, real-time information on such issues as public safety, street damage, weather observations, traffic patterns, and even public health (cf. Chapter 12). People, using their everyday devices, can enter information into a data warehouse, which could also reduce the cost of intensive physical networks that otherwise need to be deployed. There are also vulnerabilities, however murky, such as the risk of U-VGI (Un-Volunteered Geographical Information) when mass movements are tracked in a cell phone network to ascertain traffic distribution (Resch, 2013).

Consider it a type of warwalking on foot rather than wardriving.3 It seems that opt-in and opt-out features are not deemed necessary, perhaps due to the perceived anonymity of individual user identifiers. How to ‘switch off’, ‘turn off’, ‘unplug’, or select the ‘I do not consent’ feature in a practical way is a question that many have pondered, but one with arguably few pragmatic solutions, if any.

With ‘citizens as sensors’ there is an opt-in for those subscribing, but issues need to be considered for those in the vicinity of the bearer who did not consent to subscribe or to be recorded. Researchers contend that even the bearer must be better educated on the potential privacy issues (Daskala, 2011). For example, user-generated information yields longitude and latitude coordinates, time and date stamps, and speed and elevation details which tell us significant aspects about a person’s everyday life, leading to insight about current and predictive behavioral patterns. Data could also be routinely intercepted (and stored indefinitely), as has been alleged in the recent National Security Agency (NSA) scandal. Even greater concerns arise from the potential use of dragnet electronic surveillance to be mined for information (now or in the future) to extract and synthesize rich heterogeneous data containing personal visual records and ‘friends lists’ of the new media. Call detail records (CDRs) may just be the tip of the iceberg.
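To make concrete how little is needed to turn raw coordinates and timestamps into behavioral insight, consider a minimal sketch (the grid precision, hour thresholds, and sample coordinates are illustrative assumptions, not any agency's actual method): simply counting where a device dwells at night versus during office hours is enough to expose a person's likely home and workplace.

```python
from collections import Counter
from datetime import datetime

def infer_routine(points, precision=3):
    """Guess likely 'home' and 'work' grid cells from (iso_timestamp, lat, lon) logs.

    Coordinates are rounded to a coarse grid (~100 m at 3 decimal places);
    the most frequent night-time cell is taken as 'home', the most frequent
    office-hours cell as 'work'.
    """
    night, day = Counter(), Counter()
    for ts, lat, lon in points:
        hour = datetime.fromisoformat(ts).hour
        cell = (round(lat, precision), round(lon, precision))
        if hour >= 22 or hour < 6:
            night[cell] += 1
        elif 9 <= hour < 17:
            day[cell] += 1
    home = night.most_common(1)[0][0] if night else None
    work = day.most_common(1)[0][0] if day else None
    return home, work

# Synthetic fixes over two days (hypothetical coordinates)
log = [
    ("2014-05-01T23:10:00", -34.405, 150.878),
    ("2014-05-02T02:40:00", -34.405, 150.878),
    ("2014-05-02T10:15:00", -34.406, 150.891),
    ("2014-05-02T14:30:00", -34.406, 150.891),
    ("2014-05-02T23:55:00", -34.405, 150.878),
]
home, work = infer_routine(log)
print(home, work)  # → (-34.405, 150.878) (-34.406, 150.891)
```

A handful of passive location fixes, with no content interception at all, already supports the kind of predictive behavioral profiling the passage describes.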

The quantified-self movement, which aggregates data from the many inputs of a person’s daily life, is being used for self-tracking and community building so individuals can work toward improving their daily functioning (e.g. how they look, feel, and live). Because devices can look inward toward oneself, one can mine very personal data (e.g. body mass index and heart rate) which can then be combined with the outward (e.g. the vital role of your community support network) to yield such quantifiers as a higi score defining a person with a cumulative grade (e.g. your score today out of a possible 999 points).4

Wearables, together with other technologies, assist in the process of taking in multiple and varied data points to synthesize the person’s mental and physical performance (e.g. sleep quality), psychological states such as moods and stimulation levels (e.g. excitement), and other inputs such as food, air quality, location, and human interactions. Neurologically, information is addictive; yet, humans may make worse decisions when more information is at hand. Humans are also believed to overestimate the value of missing data, which may lead to an endless pursuit, or perhaps an overvaluing, of useless information (Bastardi and Shafir, 1998). More consequentially, too much introspection can itself reduce the quality of individuals’ decisions.

13.4 Enter the Veillances

Katina Michael and M. G. Michael (2009) made a presentation that, for the first time at a public gathering, considered surveillance, dataveillance, sousveillance and überveillance all together. As a specialist term, veillance was first used in an important blogpost exploring equiveillance by Ian Kerr and Steve Mann (2006), in which the ‘valences of veillance’ were briefly described. But in contrast to Kerr and Mann, Michael and Michael were pondering the intensification of a state of überveillance through increasingly pervasive technologies, which can provide details from the big-picture view right down to the minuscule personal details.

But what does veillance mean? And how is it understood in different contexts? What does it mean to be watched by a CCTV camera, to have one’s personal details deeply scrutinized, to watch another, to watch oneself? And so we continue by defining the four types of veillance that have received attention in recognized peer-reviewed journal publications and the wider corpus of literature.

13.4.1 Surveillance

First, there is the much-embraced idea of surveillance, a term recognized since the early nineteenth century, from the French sur meaning ‘over’ and veiller meaning ‘to watch’. According to the Oxford English Dictionary, veiller stems from the Latin vigilare, which means ‘to keep watch’.

13.4.2 Dataveillance

Dataveillance was conceived by Clarke (1988a) as “the systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons” (although in the Oxford English Dictionary it is now defined as “the practice of monitoring the online activity of a person or group”). The term was introduced in response to government agency data-matching initiatives linking taxation records and social security benefits, among other commercial data mining practices. At the time it was a powerful response to the proposed Australia Card of 1987 (Clarke, 1988b), which was never implemented by the Hawke Government; the Howard Government’s attempt to introduce an Access Card almost two decades later in 2005 was also unsuccessful. It is remarkable that the same issues ensue today, only on a greater magnitude, with more consequences and advanced capabilities in analytics, data storage, and converging systems.

13.4.3 Sousveillance

Sousveillance was defined by Steve Mann in 2002, but practiced since 1995, as “the recording of an activity from the perspective of a participant in the activity”.5 However, its initial introduction into the literature came in the inaugural Surveillance and Society journal in 2003 with a meaning of ‘inverse surveillance’ as a counter to organizational surveillance (Mann et al., 2003). Mann prefers to interpret sousveillance as under-sight, which maintains integrity, contra surveillance as over-sight (Mann, 2004a), which reduces to hypocrisy if governments responsible for surveillance pass laws to make sousveillance illegal.

Whereas dataveillance is the systematic use of personal data systems in the monitoring of people, sousveillance is the inverse of monitoring people; it is the continuous capture of personal experience (Mann, 2004b). For example, dataveillance might include the linking of someone’s tax file number with their bank account details and communications data. Sousveillance, on the other hand, is a voluntary act of logging what people might see as they move through the world. Surveillance is thus considered watching from above, whereas sousveillance is considered watching from below. In contrast, dataveillance is the monitoring of a person’s activities, which presents the individual with numerous social dangers (Clarke, 1988a).

13.4.4 Uberveillance

Überveillance, conceived by M. G. Michael in 2006, is defined in the Australian Law Dictionary as: “ubiquitous or pervasive electronic surveillance that is not only ‘always on’ but ‘always with you’, ultimately in the form of bodily invasive surveillance”. The Macquarie Dictionary of Australia entered the term officially in 2008 as “an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices in the human body”. Michael and Michael (2007) defined überveillance as having “to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought)”.

Überveillance is a compound word, conjoining the German über meaning ‘over’ or ‘above’ with the French veillance. The concept is very much linked to Friedrich Nietzsche’s vision of the übermensch, a man with powers beyond those of an ordinary human being, like a superman with amplified abilities (Michael and Michael, 2010). Überveillance is analogous to big brother on the inside looking out: for example, heart, pulse, and temperature sensor readings emanating wirelessly from the body in binary bits, or amplified eyes such as inserted contact lens ‘glass’ that might provide a visual display and access to the Internet or social networking applications.

Überveillance brings together all forms of watching from above and from below, from machines that move to those that stand still, from animals and from people, acquired involuntarily or voluntarily using obtrusive or unobtrusive devices (Michael et al., 2010). The network infrastructure underlies the ability to collect data directly from the sensor devices worn by the individual, and big data analytics ensures an interpretation of the unique behavioral traits of the individual, implying more than just predicted movement, but intent and thought (Michael and Miller, 2013).

It has been said that überveillance is that part of the veillance puzzle that brings together the sur, data, and sous to an intersecting point (Stephan et al., 2012). In überveillance, there is the ‘watching’ from above component (sur), there is the ‘collecting’ of personal data and public data for mining (data), and there is the watching from below (sous), which can draw together social networks and strangers, all coming together via wearable and implantable devices on/in the human body. Überveillance can be used for good in the practice of health, for instance, but we contend that, independent of its application for non-medical purposes, it will always have an underlying control factor (Masters and Michael, 2006).

13.5 Colliding Principles

13.5.1 From ‘drone view’ to ‘person view’

It can be argued that, because a CCTV camera is monitoring activities from above, we should have the ‘counter-right’ to monitor the world around us from below. It therefore follows that, if Google can record ‘street views’, the average citizen should also be able to engage in that same act, which we may call ‘person view’. Our laws as a rule do not forbid recording the world around us (or even each other for that matter), so long as we are not encroaching on someone else’s well-being or privacy (e.g. stalking, or making material public without express consent). While we have Street View today, it will only be a matter of time before we have ‘drones as a service’ (DaaS) products that systematically provide even better high-resolution imagery than ‘satellite views’. If we can make ‘drone view’ available on Google Maps, we could probably also make ‘person view’ available. Want to look up not only a street, but a person, if they are logged in and registered? Then search ‘John Doe’ and find the nearest camera pointing toward him, and/or emanating from him. Call it a triangulation of sorts.

13.5.2 Transparency and open data

The benefits of this kind of transparency, argue numerous scholars, are that not only will we have a perfect source of open data to work with, but that there will be less crime as people consider the repercussions of being caught doing wrong in real time. However, this is quite an idealistic and ethically flawed paradigm. Criminals, and non-criminals for that matter, find ways around all secure processes, no matter how technologically foolproof. At that point, the technical elite might well be systematically hiding or erasing their recorded misdemeanours while no doubt keeping the innocent person under 24/7/365 watch. There are, however, varying degrees of transparency, and most of these have to do with economies of scale and/or are context-based; they have to be. In short, transparency needs to be context-related.

13.5.3 Surveillance, listening devices and the law

At what point do we actually believe that in a public space our privacy is not invaded by such incremental innovations as little wearable cameras, half the size of a matchbox, worn as lifelogging devices? One could speculate that the very smallness of these devices makes them unobtrusive and not easily detectable to the naked eye, meaning that they are covert in nature and blatantly break the law in some jurisdictions where they are worn and operational (Abbas et al., 2011). Some of these devices not only capture images every 30 seconds, but also record audio, making them potentially a form of unauthorized surveillance. It is also not always apparent when these devices are on or off. We must consider that the “unrestricted freedom of some may endanger the well-being, privacy, or safety of others” (Rodota and Capurro, 2005, p. 23). Where are the distinctions between the wearer’s right to capture his or her own personal experiences on the one hand (i.e. the unrestricted freedom of some), and intrusion into another’s private sphere, in which he or she does not want to be recorded and is perhaps even disturbed by the prospect of losing control over his or her privacy (i.e. endangering the well-being or privacy of others)?

13.5.4 Ethics and values

Enter ethics and values. Ethics in this debate are greatly important. They have been dangerously pushed aside, for it is ethics that determine the degree of importance, that is the value, we place on the levels of our decision-making. When is it right to take photographs and record another individual (even in a public space), and when is it wrong? Do I physically remove my wearable device when I enter a washroom, a leisure centre, a hospital, a funeral, someone else’s home, a bedroom? Do I need to ask express permission from someone to record them, even if I am a participant in a shared activity? What about unobtrusive devices that blur the line between wearables and implantables, such as miniature recording devices embedded in spectacle frames or eye sockets, and possibly in the future embedded in contact lenses? Do I have to tell my future partner or prospective employer? Should I declare these during the immigration process before I enter the secure zone?

At the same time, independent of how much crowdsourced evidence is gathered for a given event, wearables and implantables are not infallible: their sensors can easily misrepresent reality through inaccurate or incomplete readings, and data can be further misconstrued post-capture (Michael and Michael, 2007). This is the limitation of an überveillance society – devices are equipped with a myriad of sensors and are celebrated as achieving near omnipresence, but the reality is that they will never be able to achieve omniscience. Finite knowledge and imperfect awareness create much potential for inadequate or incomplete interpretations.

Some technologists believe that they need to rewrite the books on metaphysics and ontology, as a result of old and outmoded definitions in the traditional humanities. We must be wary of our increasingly ‘technicized’ environment, however, and continue to test ourselves on the values we hold as canonical, those that go towards defining a free and autonomous human being. The protection of personal data has been deemed by the EU to be an autonomous individual right.

Yet, with such pervasive data collection, how will we protect “the right of informational self-determination on each individual – including the right to remain master of the data concerning him or her” (Rodota and Capurro, 2005, p. 17)? If we rely on bio-data to drive our next move based on what our own wearable sensors tell some computer application is the right thing to do, we may well lose a great part of our freedom and the life-force of improvisation and spontaneity. By allowing this data to drive our decisions, we make ourselves prone to algorithmic faults in software programs, among other significant problems.

13.5.5 The unintended side effects of lifelogging

Lifelogging captures continuous first-person recordings of a person’s life and can now be dynamically integrated into social networking and other applications. If lifelogging is recording your daily life with technical tools, then many people are unintentionally participating in a form of lifelogging by recording their lives through social networks – although, technically, data capture in social media happens in bursts (e.g. the upload of a photograph), as opposed to the continuous first-person recording of dedicated platforms (e.g. glogger.mobi) (Daskala, 2011). Lifelogging is believed to have such benefits as affecting how we remember, increasing productivity, reducing an individual’s sense of isolation, building social bonds, capturing memories, and enhancing communication.

Governing bodies could also derive benefit from lifelogging application data, using it to better understand public opinion or to forecast emerging health issues for society. However, memories gathered by lifelogs can have side effects. Not every image and not every recording you take will be a happy one, and replaying these and other moments might be detrimental to our well-being. History shows that ‘looking back’ may become traumatic, as in Marina Lutz’s experience of having most of the first 16 years of her life either recorded or photographed by her father (see the short film The Marina Experiment).

Researchers have discovered that personality development and mental health could also be negatively impacted by lifelogging applications. Vulnerabilities include a high potential to be influenced by others, suggestibility, a weak perception of self, and a resulting low self-esteem (Daskala, 2011). There is also a risk that wearers may post undesirable or personal depictions of another person, causing that person emotional harm through a negative perception of himself or herself among third parties (Daskala, 2011). We have already witnessed such events in other social forums, with tragic consequences such as suicides.

Lifelogging data may also create unhealthy competition, for example in gamification programs that use higi scores to compare your quality of life to that of others. Studies report psychological harm among those who perceive that they do not meet peer expectations (Daskala, 2011); how much more so when intimate data about one’s physical, emotional, psychological, and social network is integrated, measured, and calculated to sum up quality of life in a three-digit score (Michael and Michael, 2011). Even the effect of sharing positive lifelogging data should be reconsidered. Various reports have claimed that watching other people’s lives can develop into an obsession and can incite envy, feelings of inadequacy, or a sense of being insufficiently accomplished, especially when comparing oneself to others.

13.5.6 Pebbles and shells

Perhaps lifelogs could have the opposite effect of their intended purpose, without ever denying the numerous positives. We may become wrapped up in the self rather than in the common good, playing to a theater, and not allowing ourselves to flourish in other ways lest we be perceived as anything but normal. Such logging, posted onto public Internet archival stores, might well serve to promote a conflicting identity of the self, constant validation through page ranks, hit counts and likes, and other forms of electronic exhibitionism. Researchers purport that lifelogging activities are likely to lead to an over-reliance and excessive dependency on electronic devices and systems, with emotionally concerning, ongoing cognitive reflections as messages are posted or seen, and this could come at the expense of more important aspects of life (Daskala, 2011).

Isaac Newton gave us much to consider when he said, “I was like a boy playing on the sea-shore, and diverting myself now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me” (Brewster, 2001). Society at large must question whether the measurement of Google hits, higi scores, clicks, votes, recordings, and analysis of data to quantify ‘the self’ could become a dangerously distracting exercise if left unbalanced. These measurements, which are multi-varied and enormously insightful, may be of value – and of great enjoyment and fascination – much like Newton’s pebbles and shells. However, what is the ocean we may overlook – or ignore – as we scour the beach for pebbles and shells?

13.5.7 When bad is good

Data collection and analysis systems, such as lifelogging, may not appropriately allow for individuals to progress in self-awareness and personal development upon tempered reflection. How do we aptly measure the contradictory aspects of life, such as the healing that often comes through tears, the expending of energy (exercise) to gain energy (physical health), or the unique wonder that is realized only through the pain of self-sacrifice (e.g. veritable altruistic acts)? Harvard researchers Loehr and Schwartz (2001) provide us with further evidence of how the bad (or the unpleasant) can be good relative to personal development, through an investigation in which a key participant went by the name of ‘Richard’.

Richard was an individual progressing in self-awareness, as documented during an investigation in which researchers were working to determine how executives could achieve peak performance, leading to increased capacity for endurance, determination, strength, flexibility, self-control, and focus. The researchers found that executives who perform to full potential over the long term tap into energy at all levels of the ‘pyramid of performance’, which has four ascending levels of progressive capacities: physical, emotional, mental, and spiritual.

The tip of the pyramid was identified as spiritual capacity, defined by the researchers as “an energy that is released by tapping into one’s deepest values and defining a strong sense of purpose” (Loehr and Schwartz, 2001, p. 127). The spiritual capacity, above all else, was found to be the sustenance – or the fuel – of the ideal performance state (IPS); the state in which individuals ‘bring their talent and skills to full ignition and to sustain high performance over time’ (op. cit., p. 122). However, as Richard worked to realize his spiritual capacity, he experienced significant pain during a two-year period. He reported being overcome by emotion, consumed with grief, and filled with longing as he learned to affirm what mattered most in his life. The two-year battle resulted in Richard ‘tapping into a deeper sense of purpose with a new source of energy’ (op. cit., p. 128); however, one must question if technology would have properly quantified the bad as the ultimate good for Richard. Spiritual reflections on the trajectory of technology (certainly since it has now been plainly linked to teleology) are not out of place nor should they be discouraged.

13.5.8 Censorship

Beyond the veillance (the ‘watching’) of oneself, i.e. the inward gaze, is the outward veillance and watching of the other. But this point of eye (PoE) does not necessarily mean a point of view (PoV), or even a wider-angle field of view (FoV), particularly in the context of ‘glass’. Our gaze too is subjective, and who or what will connote this censorship at the time when it really matters? The outward watching too may not tell the full story, despite its rich media capability to gather both audio and video. Audio-visual accounts have their own pitfalls. We have long known how vitally important eye gaze is for all of the social primates, and particularly for humans; there will be consequences to any artificial tampering with this basic natural instinct. Hans Holbein’s famous painting The Ambassadors (1533), with its patent reference to anamorphosis, speaks volumes of the critical distinction between PoE and PoV. Take a look, if you are not already familiar with this double portrait and still life. Can you see the skull? The secret lies in the perspective and in the tilt of the head.

13.6 Summary and Conclusions: Mind/Body Distinction

In the future, corporate marketing may hire professional lifeloggers (or mobile robotic contraptions) to log other people’s lives with commercial devices. Unfortunately, because of inadequate privacy policies or a lack of harmonized legislation, we, as consumers, may find no laws that would preclude companies from this sort of ‘live to life’ hire if we do not pull the reins on the obsession to auto-photograph and audio-record everything in sight. And this needs to happen right now. We have already fallen behind and are playing a risky game of catch-up. Ethics is not the overriding issue for technology companies or developers; innovation is their primary focus because, in large part, they have a fiduciary responsibility to turn a profit. We must in turn, as an informed and socially responsive community, forge together to dutifully consider the risks. At what point will we leap from tracking the mundane, which is of the body (e.g. GPS coordinates of location), toward the tracking of the mind, by bringing all of these separate components together using über-analytics and an über-view? We must ask the hard questions now. We must disclose and discuss the existence of risk, the values at stake, and the possibility of harm.

It is significant that as researchers we are once more, at least in some places, speaking on the importance of the Cartesian mind/body distinction, and of the catastrophic consequences should the two continue to be confused when it comes to etymological implications and ontological categories. The mind and the body are not identical, even if we are to argue from Leibniz’s Law of Identity that two things can only be identical if they share exactly the same qualities at the same time. Here as well, vigilance is enormously important, that we might not disremember the real distinction between machine and human.

References

Abbas, R., Michael, K., Michael, M. G., & Aloudat, A. (2011). Emerging Forms of Covert Surveillance Using GPS-Enabled Devices. Journal of Cases on Information Technology, 13(2), 19-33.

ACLU. (2013). You Are Being Tracked: How License Plate Readers Are Being Used to Record Americans' Movements. from http://www.aclu.org/technology-and-liberty/you-are-being-tracked-how-license-plate-readers-are-being-used-record

Adler, I. (2013). How Our Digital Devices Are Affecting Our Personal Relationships. 90.9 WBUR.

ALD (Ed.). (2010). Uberveillance: Oxford University Press.

Australian Privacy Foundation. (2005). Human Services Card.   Retrieved 6 June 2013, from http://www.privacy.org.au/Campaigns/ID_cards/HSCard.html

Bastardi, A., & Shafir, E. (1998). On the Pursuit and Misuse of Useless Information. Journal of Personality and Social Psychology, 75(1), 19-32.

Brewster, D. (2001). Memoirs of the Life, Writings, and Discoveries of Sir Isaac Newton (1855) Volume II. Ch. 27: Adamant Media Corporation.

Capurro, R. (2013). Medicine in the information and knowledge society. Conference paper.

Carpenter, L. (2011). Marina Lutz interview: The sins of my father. The Observer   Retrieved 20 April 2013, from http://www.guardian.co.uk/artanddesign/2011/apr/17/photography-children

Clarke, R. (1988a). Information Technology and Dataveillance. Communications of the ACM, 31(5), 498-512.

Clarke, R. (1988b). Just another piece of plastic in your wallet: the `Australian card' scheme. ACM SIGCAS Computers and Society, 18(1), 7-21.

Clarke, R. (2009, 7 April 2009). The Covert Implementation of Mass Vehicle Surveillance in Australia. Paper presented at the Fourth Workshop on the Social Implications of National Security: Covert Policing, Canberra, Australia.

Clifford, S., & Hardy, Q. (2013). Attention, Shoppers: Store Is Tracking Your Cell.   Retrieved 14 July, from http://www.nytimes.com/2013/07/15/business/attention-shopper-stores-are-tracking-your-cell.html?pagewanted=all

Collins, L. (2008). Annals of Crime. Friend Game. Behind the online hoax that led to a girl’s suicide. The New Yorker.

DailyMail. (2013). Stores now tracking your behavior and moods through cameras.   Retrieved 6 August, from http://www.dailymail.co.uk/news/article-2364753/Stores-tracking-behavior-moods-cameras-cell-phones.html?ito=feeds-newsxml

ENISA. (2011). To log or not to log?: Risks and benefits of emerging life-logging applications. European Network and Information Security Agency   Retrieved 6 July 2013, from http://www.enisa.europa.eu/activities/risk-management/emerging-and-future-risk/deliverables/life-logging-risk-assessment/to-log-or-not-to-log-risks-and-benefits-of-emerging-life-logging-applications

FastCompany. (2013). #Unplug. Fast Company, July/August(177).

Frankel, T. C. (2012, 20 October). Megan Meier's mom is still fighting bullying. stltoday.com   Retrieved 4 November 2012

Friedman, R. (2012). Why Too Much Data Disables Your Decision Making. Psychology Today: Glue   Retrieved December 4, 2012, from http://www.psychologytoday.com/blog/glue/201212/why-too-much-data-disables-your-decision-making

Goodchild, M. F. (2007). Citizens as sensors: the world of volunteered geography. GeoJournal, 69, 211–221.

Greenwald, G. (2013). NSA collecting phone records of millions of Verizon customers daily. The Guardian   Retrieved 10 August 2013, from http://www.theguardian.com/world/2013/jun/06/nsa-phone-records-verizon-court-order

Hans Holbein the Younger. (1533). The Ambassadors.

Hayes, A. (2010). Uberveillance (Triquetra).   Retrieved 6 May 2013, from http://archive.org/details/Uberveillancetriquetra

HIGI. (2013). Your Score for Life.   Retrieved 29 June 2013, from https://higi.com/about/score

Intellitix. (2013). Reshaping the Event Horizon.   Retrieved 6 July 2013, from http://www.intellitix.com/intellitix/home/

Kerr, I., & Mann, S. (2006). Exploring Equiveillance. ID TRAIL MIX.

Krause. (2012). Vigilance Fatigue in Policing.   Retrieved 22 July, from http://www.fbi.gov/stats-services/publications/law-enforcement-bulletin/december-2012/vigilance-fatigue-in-policing

Levin, A. (2013). Waiting for Public Outrage. Paper presented at the IEEE International Symposium on Technology and Society, Toronto, Canada.

Loehr, J., & Schwartz, T. (2001). The Making of a Corporate Athlete. Harvard Business Review, January, 120-129.

Lutz, M. (2012). The Marina Experiment.   Retrieved 29 May 2013, from www.themarinaexperiment.com

Macquarie (Ed.). (2009). Uberveillance: Sydney University.

Magid, L. (2013). Wearables and Sensors Big Topics at All Things D. Forbes.

Mann, S. (2004a). Continuous lifelong capture of personal experience with EyeTap. Paper presented at the ACM International Multimedia Conference, Proceedings of the 1st ACM workshop on Continuous archival and retrieval of personal experiences (CARPE 2004), New York.

Mann, S. (2004b). Sousveillance: inverse surveillance in multimedia imaging. Paper presented at the Proceedings of the 12th annual ACM international conference on Multimedia, New York, NY, USA.

Mann, S., Nolan, J., & Wellman, B. (2003). Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments. Surveillance and Society, 1(3), 331-355.

Masters, A., & Michael, K. (2006). Lend me your arms: the use and implications of humancentric RFID. Electronic Commerce Research and Applications, 6(1), 29-39.

Michael, K. (2010). Stop social network pitfalls. Illawarra Mercury.

Michael, K. (2013a). Big Data and the Dangers of Over-Quantifying Oneself. Computer Magazine (Multimedia)   Retrieved June 7, 2013, from http://www.youtube.com/watch?v=mn_9YHV2RGQ&list=PLHJB2bhmgB7cbB-oafjt68XbzyPV46szi&index=7

Michael, K. (2013b). Snowden's Revelations Just the Tip of the Iceberg.   Retrieved 6 July 2013, from http://uberveillance.com/blog/2013/7/23/snowdens-revelations-just-the-tip-of-the-iceberg

Michael, K. (2013c). Social Implications of Wearable Computing and Augmediated Reality in Every Day Life (IEEE Symposium on Technology and Society, ISTAS13). Toronto: IEEE.

Michael, K. (2013d). Wearable computers challenge human rights. ABC Science Online.

Michael, K., & Clarke, R. (2013). Location and tracking of mobile devices: Überveillance stalks the streets. Computer Law & Security Review, 29(3), 216-228.

Michael, K., & Michael, M. G. (2009). Teaching Ethics in Wearable Computing:  the Social Implications of the New ‘Veillance’. EduPOV   Retrieved June 18, from http://www.slideshare.net/alexanderhayes/2009-aupov-main-presentation?from_search=3

Michael, K., & Michael, M. G. (2012). Converging and coexisting systems towards smart surveillance. Awareness Magazine: Self-awareness in autonomic systems, June.

Michael, K., & Michael, M. G. (Eds.). (2007). From Dataveillance to Überveillance and the Realpolitik of the Transparent Society. Wollongong, NSW, Australia.

Michael, K., & Miller, K. W. (2013). Big Data: New Opportunities and New Challenges. IEEE Computer, 46(6), 22-24.

Michael, K., Roussos, G., Huang, G. Q., Gadh, R., Chattopadhyay, A., Prabhu, S., et al. (2010). Planetary-scale RFID Services in an Age of Uberveillance. Proceedings of the IEEE, 98(9), 1663-1671.

Michael, M. G., & Michael, K. (2007). Uberveillance. Paper presented at the 29th International Conference of Data Protection and Privacy Commissioners. Privacy Horizons: Terra Incognita, Location Based Tracking Workshop, Montreal, Canada.

Michael, M. G., & Michael, K. (2010). Towards a State of Uberveillance. IEEE Technology and Society Magazine, 29(2), 9-16.

Michael, M. G., & Michael, K. (2011). The Fall-Out from Emerging Technologies: on Matters of Surveillance, Social Networks and Suicide. IEEE Technology and Society Magazine, 30(3), 15-18.

mX. (2013). Hard to Swallow.   Retrieved 6 August 2013, from http://www.mxnet.com.au/story/hard-to-swallow/story-fnh38q9o-1226659271059

Orcutt, M. (2013). Electronic “Skin” Emits Light When Pressed. MIT Tech Review.

Oxford Dictionary. (2013). Dataveillance.   Retrieved 6 May 2013, from http://oxforddictionaries.com/definition/english/surveillance

Oxford Dictionary. (2013). Surveillance.   Retrieved 6 May 2013, from http://oxforddictionaries.com/definition/english/surveillance

Papakostas, T. V., Lima, J., & Lowe, M. (2002). 5:3 A Large Area Force Sensor for Smart Skin Applications. Sensors; Proceedings of IEEE, 5(3).

Pearce, P., & Gretzel, U. (2012). Tourism in technology dead zones: documenting experiential dimensions. International Journal of Tourism Sciences, 12(2), 1-20.

Pivtohead. (2013). Wearable Imaging: True point of view.   Retrieved 22 June 2013, from http://pivothead.com/#

Pokin, S. (2007). MySpace' hoax ends with suicide of Dardenne Prairie teen. St. Louis Post-Dispatch.

Resch, B. (2013). People as Sensors and Collective Sensing-Contextual Observations Complementing Geo-Sensor Network Measurements. Paper presented at the Progress in Location-Based Services, Lecture Notes in Geoinformation and Cartography.

Roberts, P. (1984). Information Visualization for Stock Market Ticks: Toward a New Trading Interface. Massachusetts Institute of Technology, Boston.

Rodota, S., & Capurro, R. (2005). Ethical Aspects of ICT Implants in the Human Body. The European Group on Ethics in Science and New Technologies (EGE)   Retrieved June 3, 2006, from http://ec.europa.eu/bepa/european-group-ethics/docs/avis20_en.pdf

SHRM. (2011). from http://www.shrm.org/publications/hrnews/pages/fatiguefactors.aspx

Spence, R. (2009). Eyeborg.   Retrieved 22 June 2010, from http://eyeborg.blogspot.com.au/

Stephan, K. D., Michael, K., Michael, M. G., Jacob, L., & Anesta, E. (2012). Social Implications of Technology: Past, Present, and Future. Proceedings of the IEEE, 100(13), 1752-1781.

SXSchedule. (2013). Better Measure: Health Engagement & higi Score.   Retrieved 29 June 2013, from http://schedule.sxsw.com/2013/events/event_IAP4888

Thurston, B. (2013). I have left the internet. Fast Company, July/August(177), 66-78, 104-105.

Ware, C. (2000). Information Visualization: Perception for Design. San Francisco, CA: Morgan Kaufmann.

Warm, J. S., Parasuraman, R., & Matthews, G. (2008). Vigilance Requires Hard Mental Work and Is Stressful. Human Factors, 433-441.

Williams, R. B. (2012). Is Facebook Good Or Bad For Your Self-Esteem? Psychology Today: Wired for Success.

Wordnik. (2013). Sousveillance.   Retrieved 6 June 2013, from http://www.wordnik.com/words/sousveillance

Endnotes

1 http://www.shrm.org/

2 www.sleepfoundation.org 

3 Someone searching for a WiFi wireless network connection using a mobile device in a moving vehicle.

4 http://higi.com/about/score; http://schedule.sxsw.com

5 http://www.wordnik.com/words/sousveillance

Citation: Katina Michael, M. G. Michael, and Christine Perakslis (2014) Be Vigilant: There Are Limits to Veillance. The Computer After Me: pp. 189-204. DOI: https://doi.org/10.1142/9781783264186_0013

Uberveillance and the Social Implications of Microchip Implants: Preface

Uberveillance and the Social Implications of Microchip Implants: Emerging Technologies

In addition to common forms of spatial units such as satellite imagery and street views, emerging automatic identification technologies are exploring the use of microchip implants in order to further track an individual’s personal data, identity, location, and condition in real time.

Uberveillance and the Social Implications of Microchip Implants: Emerging Technologies presents case studies, literature reviews, ethnographies, and frameworks supporting the emerging technologies of RFID implants, while also highlighting the current and predicted social implications of human-centric technologies. This book is essential for professionals and researchers engaged in the development of these technologies, and provides insight and support to those inquiring into embedded micro technologies.

Preface

Katina Michael, University of Wollongong, Australia

M.G. Michael, University of Wollongong, Australia

INTRODUCTION

Uberveillance can be defined as an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices into the human body. These embedded technologies can take the form of traditional pacemakers, radio-frequency identification (RFID) tag and transponder implants, smart swallowable pills, nanotechnology patches, multi-electrode array brain implants, and even smart dust to mention but a few form factors. To an extent, head-up displays like electronic contact lenses that interface with the inner body (i.e. the eye which sits within a socket) can also be said to be embedded and contributing to the uberveillance trajectory, despite their default categorisation as body wearables.

Uberveillance has to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought). Uberveillance can be a predictive mechanism for a person’s expected behaviour, traits, likes, or dislikes based on historical fact; or it can be about real-time measurement and observation; or it can be something in between. The inherent problem with uberveillance is that facts do not always add up to truth, and predictions or interpretations based on uberveillance are not always correct, even if there is direct visual evidence available (Shih, 2013). Uberveillance is more than closed circuit television feeds, or cross-agency databases linked to national identity cards, or biometrics and ePassports used for international travel. Uberveillance is the sum total of all these types of surveillance and the deliberate integration of an individual’s personal data for the continuous tracking and monitoring of identity, location, condition, and point of view in real-time (Michael & Michael, 2010b).

In its ultimate form, uberveillance has to do with more than automatic identification and location-based technologies that we carry with us. It has to do with under-the-skin technology that is embedded in the body, such as microchip implants. Think of it as Big Brother on the inside looking out. It is like a black box embedded in the body which records and gathers evidence, and in this instance, transmitting specific measures wirelessly back to base. This implant is virtually meaningless without the hybrid network architecture that supports its functionality: making the person a walking online node. We are referring here, to the lowest common denominator, the smallest unit of tracking – presently a tiny chip inside the body of a human being. But it should be stated that electronic tattoos and nano-patches that are worn on the body can also certainly be considered mechanisms for data collection in the future. Whether wearable or bearable, it is the intent and objective which remains important, the notion of “people as sensors.” The gradual emergence of the so-called human cloud, that cloud computing platform which allows for the Internetworking of human “points of view” using wearable recording technology (Nolan, 2013), will also be a major factor in the proactive profiling of individuals (Michael & Michael, 2011).

AUDIENCE

This present volume aims to equip the general public with much-needed educational information about the technological trajectory of RFID implants through exclusive primary interviews, case studies, literature reviews, ethnographies, surveys, and frameworks supporting emerging technologies. It was in 1997 that bioartist Eduardo Kac (Figure 1) implanted a microchip in his leg in a live performance titled Time Capsule (http://www.ekac.org/timec.html) in Brazil (Michael & Michael, 2009). The following year, in an unrelated experiment, Kevin Warwick had an implant injected into his left arm (Warwick, 2002; K. Michael, 2003). By 2004, the VeriChip Corporation had its VeriChip product approved by the Food and Drug Administration (FDA) (Michael, Michael & Ip, 2008). Since that point, there has been a great deal of misinformation and confusion surrounding the microchip implant, but also a lot of build-up on the part of the proponents of implantables.

Figure 1. 

Eduardo Kac implanting himself in his left leg with an RFID chip using an animal injector kit on 11 November 1997. Courtesy Eduardo Kac. More at http://www.ekac.org/figs.html.

Radio-frequency identification (RFID) is not an inherently secure technology; in fact, it can be argued that it is just the opposite (Reynolds, 2004). So why someone would wish to implant something beneath the skin for non-medical reasons is quite surprising, despite the touted advantages. One of the biggest issues, not commonly discussed in public forums, has to be the increasing number of people who suffer from paranoid or delusional thoughts with respect to enforced implantation or implantation by stealth. We have already encountered significant problems in the health domain, where, for example, a clinical psychologist can no longer readily discount the claims of patients who identify with having been implanted or tracked and monitored using inconspicuous forms of ID. This will be especially true in the era of smart dust, almost invisible to the naked eye, which has yet to fully arrive. Civil libertarians, religious advocates, and so-named conspiracy theorists will not be the only groups to discuss the real potential of microchipping people; for this reason, the discussion will move into the public policy forum, inclusive of all stakeholders in the value chain.

Significantly, this book will also provide researchers and professionals who are engaged in the development or implementation of emerging services with awareness of the social implications of human-centric technologies. These implications cannot be ignored by operational stakeholders, such as engineers and the scientific elite, if we hope to enact long-term beneficial change with new technologies that will have a positive impact on humanity. We cannot hold the attitude that says, let us see how far we can go with technology and worry about the repercussions later: to do so would be short-sighted and would ignore the importance of socio-technical sustainability. Ethics are apparently irrelevant to the engineer who is innovating in a market-driven and research-funded environment. To be sure, there are some notable exceptions where a middle-of-the-way approach is pursued, notably in the medical and educational contexts. Engineering ethics do, of course, exist, though they are unfortunately often denigrated and misinterpreted as discourses on “goodness” or appeals to the categorical imperative. Nevertheless, industry as a whole has a social responsibility to consumers at large: to ensure that it has considered what the misuse of its innovations might mean in varied settings and scenarios, to ensure that there are limited, if any, health effects from the adoption of particular technologies, and to ensure that adverse event reports are maintained by a centralised administrative office with recognised oversight (e.g. an independent ombudsman).

Equally, government agencies must respond with adequate legislative and regulatory controls to ensure that there are consequences for the misuse of new technologies. It is not enough, for example, for a company like Google to come out and openly “bar” applications for its Glass product, such as biometric recognition and pornography, especially when it is very aware that these are two application areas for which its device will be exploited. Google is trying to maintain its brand by stating clearly that it is not affiliated with negative uses of its product, knowing too well that this proclamation is quite meaningless, and by no means legally binding. And herein lies one of the great quandaries: few would deny that Google’s search rank and page algorithms have also made us beneficiaries of some extraordinary inventiveness.

According to a survey by CAST, one in five persons has reported wanting to see a Google Glass ban (Nolan, 2013). Therefore, the marketing and design approach nowadays, which is broadly evident across the universal corporate spectrum, seems to be:

We will develop products and make money from them, no matter how detrimental they may be to society. We will push the legislative/regulatory envelope as much as we can, until someone says: Stop. You’ve gone too far! The best we can do as a developer is place a warning on the packaging, just like cigarette notices, and if people choose to do the wrong thing our liability as a company is removed completely because we have provided the prior warning and only see beneficial uses. If our product is used for ill, then that is not our problem; the criminal justice system can deal with that occurrence. And if non-users of our technology are entangled in a given controversy, then our best advice to people is to realign the asymmetry by adopting our product.

INSPIRATION

This edited volume came together over a three-year period. We formed our editorial board and sent out the call for book chapters soon after the IEEE conference we hosted at the University of Wollongong, the International Symposium on Technology and Society (ISTAS), on 7-10 June 2010, sponsored by IEEE’s Society on the Social Implications of Technology (SSIT) (http://iibsor.uow.edu.au/conferences/ISTAS/home/index.html). The symposium was dedicated to emerging technologies, and a great many papers were presented from a wide range of views on the debate over the microchipping of people. It was a highlight to see this sober conversation happening between experts coming at the debate from different perspectives, different cultural contexts, and different lifeworlds. A great deal of the spirit of that conversation has taken root in this book. The audio-visual proceedings aired on the Australian Broadcasting Corporation’s much-respected 7.30 Report and received wide coverage in major media outlets. The significance lies not in the press coverage but in the fact that the topic is now relevant to the everyday person. Citizens will need to make a personal decision: do I receive an implant or not? Do I carry an identifier on the surface of my skin or not? Do I succumb to 24x7 monitoring by being fully “connected” to the grid or not?

Individuals who were present at ISTAS10 and were also key contributors to this volume include keynote speakers Professor Rafael Capurro, Professor Roger Clarke, Professor Kevin Warwick, Dr Katherine Albrecht, Dr Mark Gasson, Mr Amal Graafstra, and attendees Professor Marcus Wigan, Associate Professor Darren Palmer, Dr Ian Warren, Dr Mark Burdon, and Mr William A. Herbert. Each of these presenters has been an instrumental voice in the discussion on Embedded Surveillance Devices (ESDs) in living things (animals and humans), and on tracking and monitoring technologies. They have dedicated a portion of their professional lives to investigating the possibilities and the effects of a world filled with microchips, beyond those in desktop computers and high-tech gadgetry. They have also been able to connect the practice of an Internet of Things (IoT) involving not only machine-to-machine but nested forms of machine-to-people-to-machine interactions, and have considered the implications. When one is surrounded by such passionate voices, it is difficult not to be inspired onward to such an extensive work.

A further backdrop to the book is the annual workshops we began in 2006 on the Social Implications of National Security, which have had ongoing sponsorship from the Australian Research Council’s Research Network for a Secure Australia (RNSA). Following ISTAS10, we held a workshop on the “Social Implications of Location-Based Services” at the University of Wollongong’s Innovation Campus and were fortunate to have Professor Rafael Capurro, Professor Andrew Goldsmith, Professor Peter Eklund, and Associate Professor Ulrike Gretzel present their work (http://iibsor.uow.edu.au/conferences/ISTAS/workshops/index.html). Worthy of note, the workshop proceedings, which are available online, have been recognised as major milestones for the Research Network in official government documentation. For example, the Department of the Prime Minister and Cabinet (PM&C), among other high-profile agencies in Australia and abroad, has requested copies of the works for its library.

In 2012, the topic of our annual RNSA workshop was “Sousveillance and the Social Implications of Point of View Technologies in Law Enforcement,” held at the University of Sydney (http://works.bepress.com.ezproxy.uow.edu.au/kmichael/249/). Professor Kevin Haggerty keynoted that event, speaking on a theme titled “Monitoring within and beyond the Police Organisation,” and later graciously contributed the foreword to this book, as well as presenting on biomimetics at the University of Wollongong. The workshop again acted to bring exceptional voices together to discuss audio-visual body-worn recording technologies, including Professor Roger Clarke, Professor David Lyon, Associate Professor Nick O’Brien, Associate Professor Darren Palmer, Dr Saskia Hufnagel, Dr Jann Karp, Mr Richard Kay, Mr Mark Lyell, and Mr Alexander Hayes.

In 2013, the theme of the National Security workshop was “Unmanned Aerial Vehicles - Pros and Cons in Policing, Security & Everyday Life,” held at Ryerson University in Canada. This workshop had presentations from Professor Andrew Clement, Associate Professor Avner Levin, Mr Ian Hannah, and Mr Matthew Schroyer. It was the first time in the workshop’s eight-year history that it was held outside Australian borders. While drones are not greatly discussed in this volume, they demonstrate one scenario view of the fulfilment of uberveillance. Case in point: the drone as killing machine signifies the importance of a remote-controlled macro-to-micro view. First, something needs to be able to scan the skies to look down on the ground; then, once the target has been identified and tracked, it can be extinguished with ease. One need only look at the Israel Defence Force’s pinpoint strike on Ahmed Jabari, the head of the Hamas Military Wing, to note the intrinsic link between the macro and micro levels of detail (K. Michael, 2012). How much “easier” could this kind of strike have been if the GPS chipset in the mobile phone carried by an individual communicated with a chip implant embedded in the body? RFID can be a tracking mechanism, despite the claims of some researchers that it has a read range of only 10 cm. That may well be the case for your typical wall-mounted reader, but a mobile phone can act as a continuous reader if in range, as can a set of traffic lights, lampposts, or even Wi-Fi access nodes, depending on the on-board technology and the power of the reader equipment being used. A telltale example of the potential risks can be seen in the rollout of Real ID driver’s licenses in the USA since the enactment of the REAL ID Act of 2005.

In 2013, it was also special to meet some of our book contributors for the first time at ISTAS13, held at the University of Toronto on the theme of “Wearable Computers and Augmediated Reality in Everyday Life,” among them Professor Steve Mann, Associate Professor Christine Perakslis, and Dr Ellen McGee. As so often happens when a thematic interest area brings people together from multiple disciplines, an organic group of interdisciplinary voices has begun to form at www.technologyandsociety.org. The holistic nature of this group is especially stimulating in the sharing of its diverse perspectives. Building upon these initial conversations, and ensuring they continue as the social shaping of technology occurs in the real world, is paramount.

As we brought together this edited volume, we struck up a very fruitful collaboration with Dr Jeremy Pitt, Reader at Imperial College London, contributing a large chapter to his disturbingly wonderful edited volume This Pervasive Day: The Potential and Perils of Pervasive Computing (2012). Jeremy’s book is a considered forecast of the social impact of new technologies, inspired by Ira Levin’s This Perfect Day (1970). Worthy of particular note is our participation in the session entitled “Heaven and Hell: Visions for Pervasive Adaptation” at the European Future Technologies Conference and Exhibition (Paechter, 2011). What is important to draw out from this is that pervasive computing will indeed have a divisive impact on its users: for some it will offer incredible benefits, while for others it will be debilitating in its everyday effect. We hope, similarly, to have been able to remain objective in this edited volume, offering viewpoints from diverse positions on the topic of humancentric RFID. This remained one of our principal aims and fundamental goals.

Questioning technology’s trajectory is extremely important, especially when technology no longer has a medical corrective or prosthetic application but one based on entertainment and convenience services. What happens to us when we embed a device that we cannot remove of our own accord? Is this fundamentally different to wearing or lugging something around? Without a doubt, it is! And what of those technologies presently being developed in laboratories across the world for microscopic forms of ID and pinhole video capture? What will be their impact on our society with respect to covert surveillance? Indeed, the line between overt and covert surveillance is blurring: the two become indistinguishable when we are surrounded by surveillance and are inside the thick fog itself. The other notion that is completely misconstrued is that there is logic in the equation positing a trade-off between privacy and convenience. There is no trade-off. The two variables cannot be discussed on equal footing: you cannot give a little of your privacy away for convenience and hope to have it still intact thereafter. No amount of monetary or value-based recompense will correct this asymmetry. We would be hoodwinking ourselves if we were to be “bought out” by such a business model. There is no consolation for privacy loss. We cannot be made to feel better after giving away a part of ourselves. It is not like scraping one’s knee against the concrete with the expectation that the scab will heal after a few days. Privacy loss is to be perpetually bleeding, perpetually exposed.

Additionally, in the writing of this book we also managed a number of special issue journals in 2010 and 2011, all of which acted to inform the direction of the edited volume as a whole. These included special issues on “RFID – A Unique Radio Innovation for the 21st Century” in the Proceedings of the IEEE (together with Rajit Gadh, George Roussos, George Q. Huang, Shiv Prabhu, and Peter Chu); “The Social Implications of Emerging Technologies” in Case Studies in Information Technology with IGI (together with Dr Roba Abbas); “The Social and Behavioral Implications of Location-Based Services” in the Journal of Location Based Services with Routledge; and “Surveillance and Uberveillance” in IEEE Technology and Society Magazine. In 2013, Katina also guest edited a volume for IEEE Computer on “Big Data: Discovery, Productivity and Policy” with Keith W. Miller. If there are any doubts about the holistic work supporting uberveillance, we hope that these internationally recognised journals, among others associated with our guest editorship, indicate the thoroughness and robustness of our approach, and the recognition that others have generously given to the incremental work we have completed.

It should also not go without notice that, since 2006, the term uberveillance has been embedded into dozens of graduate and undergraduate technical and non-technical courses across the globe. From the University of New South Wales and Deakin University to the University of Salford, and from the University of Malta right through to the University of Texas at El Paso and Western Illinois University, we are extremely encouraged by correspondence from academics and researchers noting the term’s insertion into course outlines, chosen textbooks, lecture schedules, major assessable items, recommended readings, and research training. These citations have acted to inform and to interrogate the subjects that connect us. That our research conclusions resonate with you, without necessarily implying that you have always agreed with us, is indeed substantial.

OUTLINE

Uberveillance and the Social Implications of Microchip Implants: Emerging Technologies follows on from a 2009 IGI Premier Reference Source book titled Automatic Identification and Location-Based Services: From Bar Codes to Chip Implants. This volume consists of 6 sections and 18 chapters, with 7 exclusive addendum primary interviews and panels. The strength of the volume is in its 41 author contributions. Contributors come from diverse professional and research backgrounds in the field of emerging technologies, law, and social policy, including information and communication sciences, administrative sciences and management, criminology, sociology, law and regulation, philosophy, ethics and policy, government, and political science, among others. Moreover, the book will provide insights and support to everyday citizens who may be questioning the trajectory of micro and miniature technologies or the potential for humans to be embedded with electromagnetic devices. Body-wearable technologies are also directly relevant, as they will act as complementary, if not supplementary, innovations to various forms of implants.

Section 1 is titled “The Veillances,” with a specific background context of uberveillance. This section inspects the antecedents of surveillance, Roger Clarke’s dataveillance thirty years on, Steve Mann’s sousveillance, and MG Michael’s uberveillance. These three neologisms are inspected under the umbrella of the “veillances” (from the French veiller), which stems from the Latin vigilare, meaning to “keep watch” (Oxford Dictionary, 2012).

In 2009, Katina Michael and MG Michael presented a plenary paper titled “Teaching Ethics in Wearable Computing: the Social Implications of the New ‘Veillance’” (K. Michael & Michael, 2009d). It was the first time that surveillance, dataveillance, sousveillance, and uberveillance were considered together at a public gathering. Certainly, as a specialist term, it should be noted that “veillance” was first used in an important blog post by Ian Kerr and Steve Mann (2006) exploring equiveillance, in which “the valences of veillance” were briefly described. In contrast to Kerr and Mann (2006), Michael and Michael (2006) were pondering the intensification of a state of uberveillance through increasingly pervasive technologies that can provide details from the big-picture view right down to the minuscule personal details.

Alexander Hayes (2010) pictorialized this representation using the triquetra, also known as the trinity knot and Celtic triangle (Figure 2), and describes its application to uberveillance in the educational context in chapter 3. Hayes uses mini cases to illustrate the importance of understanding the impact of body-worn video across sectors. He concludes by warning that commercial entities should not engage in “techno-evangelism” when selling to the education sector but should rather maintain the purposeful intent of the use of point-of-view and body-worn video recorders within the specific educational context. Hayes also emphasises the urgent need for serious discussion of the socio-ethical implications of wearable computers.

Figure 2. 

Uberveillance triquetra (Hayes, 2010). See also Michael and Michael (2007).


By 2013, K. Michael had published proceedings from the International Symposium on Technology and Society (ISTAS13) using the veillance concept as a theme (http://veillance.me), with numerous papers submitted to the conference exploring veillance perspectives (Ali & Mann, 2013; Hayes, et al., 2013; K. Michael, 2013; Minsky, et al., 2012; Paterson, 2013). Two other crucial references to veillance include “in press” papers by Michael and Michael (2013) and Michael, Michael, and Perakslis (2014). But what does veillance mean? And how is it understood in different contexts? What does it mean to be watched by a CCTV camera; to have one’s personal details deeply scrutinized; to watch another; or to watch oneself?

Dataveillance (see Interview 1.1), conceived by Roger Clarke of the Australian National University (ANU) in 1988, “is the systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons” (Clarke, 1988a). According to the Oxford Dictionary, dataveillance is summarized as “the practice of monitoring the online activity of a person or group” (Oxford Dictionary, 2013). It is hard to believe that this term was introduced a quarter of a century ago, in response to government agency data-matching initiatives linking taxation records and social security benefits, among other commercial data mining practices. At the time it was a powerful statement in response to the Australia Card proposal of 1987 (Clarke, 1988b), which was never implemented by the Hawke Government, despite the Howard Government’s attempt to introduce an Access Card almost two decades later in 2005 (Australian Privacy Foundation, 2005). The same issues ensue today, only on a more momentous scale, with far more consequences and advanced capabilities in analytics, data storage, and converging systems.

Sousveillance (see chapter 2), conceived by Steve Mann of the University of Toronto in 2002 but practiced since at least 1995, is the “recording of an activity from the perspective of a participant in the activity” (Wordnik, 2013). However, its initial introduction into the literature came in the inaugural issue of the Surveillance and Society journal in 2003, with the meaning of “inverse surveillance” as a counter to organizational surveillance (Mann, Nolan, & Wellman, 2003). Mann prefers to interpret sousveillance as under-sight, which maintains integrity, contra surveillance as over-sight, which equates to hypocrisy (Mann, 2004).

Whereas dataveillance is the systematic use of personal data systems in the monitoring of people, sousveillance is the inverse: the continuous capture of personal experience. For example, dataveillance might include the linking of someone’s tax file number with their bank account details and communications data. Sousveillance, on the other hand, is a voluntary act of logging what one sees as one moves through the world. Surveillance is thus considered watching from above, whereas sousveillance is considered watching from below. Dataveillance, in contrast, is the monitoring of a person’s online activities, which presents the individual with numerous social dangers (Clarke, 1988a).

Uberveillance (see chapter 1), conceived by MG Michael of the University of Wollongong (UOW) in 2006, is commonly defined as “ubiquitous or pervasive electronic surveillance that is not only ‘always on’ but ‘always with you,’ ultimately in the form of bodily invasive surveillance” (ALD, 2010). The term officially entered the Macquarie Dictionary of Australia in 2008 as “an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices in the human body” (Macquarie, 2009, p. 1094). The concern over uberveillance is directly related to the misinformation, misinterpretation, and information manipulation of citizens’ data. We can strive for omnipresence through real-time remote sharing and monitoring, but we will never achieve simple omniscience (Michael & Michael, 2009).

Uberveillance is a compound word conjoining the German über, meaning over or above, with the French veillance. The concept is very much linked to Friedrich Nietzsche’s vision of the Übermensch, a man with powers beyond those of an ordinary human being, like a superman with amplified abilities (Honderich, 1995; M. G. Michael & Michael, 2010b). Uberveillance is analogous to embedded devices that quantify the self and measure indiscriminately: for example, heart, pulse, and temperature sensor readings emanating wirelessly from the body in binary bits, or even amplified eyes such as an inserted contact lens “glass” that might provide visual display and access to the Internet or social networking applications.

Uberveillance brings together all forms of watching from above and from below, from machines that move to those that stand still, from animals and from people, acquired involuntarily or voluntarily using obtrusive or unobtrusive devices (Figure 3) (K. Michael, et al., 2010). The network infrastructure underlies the ability to collect data directly from the sensor devices worn by the individual, and big data analytics ensures an interpretation of the unique behavioral traits of the individual, implying not just predicted movement but intent and thought (K. Michael & Miller, 2013).

Figure 3. From surveillance to uberveillance (K. Michael, et al., 2009b)

It has been said that uberveillance is that part of the veillance puzzle that brings together the sur, data, and sous to an intersecting point (Stephan, et al., 2012). In uberveillance, there is the “watching” from above component (sur); there is the “collecting” of personal data and public data for mining (data); and there is the watching from below (sous), which can draw together social networks and strangers, all coming together via wearable and implantable devices on/in the human body. Uberveillance can be used for good, but we contend that, independent of its application for non-medical purposes, it will always have an underlying control factor of power and authority (Masters & Michael, 2005; Gagnon, et al., 2013).

Section 2 is dedicated to applications of humancentric implantables in both the medical and non-medical space. Chapter 4 is written by Kevin Warwick, professor of cybernetics at the University of Reading, and his senior research fellow, Dr Mark Gasson. In 1998, Warwick was responsible for Cyborg 1.0, and later for Cyborg 2.0 in 2002. In chapter 4, Warwick and Gasson describe implants, tracking and monitoring functionality, Deep Brain Stimulation (DBS), and magnetic implants. They are pioneers in the implantables arena, but after initially investigating ID and location interactivity in a closed campus environment using humancentric RFID approaches, Warwick has begun to focus his efforts on medical solutions that can aid the disabled, teaming up with Professor Tipu Aziz, a neurosurgeon from the University of Oxford. He has also explored person-to-person interfaces using implantable devices for bi-directional functionality.

Following on from the Warwick and Gasson chapter are two interviews and a modified presentation transcript demonstrating three different kinds of RFID implant applications. Interview 4.1 is with Mr Serafin Vilaplana, the former IT Manager at the Baja Beach Club, who implemented the RFID implants for club patronage in Barcelona, Spain. The RFID implants were used to attract VIP patrons, perform basic access control, and make electronic payments. Katina Michael had the opportunity to interview Serafin after being invited to attend a Women in Engineering (WIE) conference in Spain in mid-2009, organised by the Georgia Institute of Technology. It was on this connected journey that Katina Michael also met with Mark Gasson for the very first time, during a one-day conference at the London School of Economics, where they discussed a variety of incremental innovations in RFID.

In late May 2009, Mr Gary Retherford, a Six Sigma black belt specialising in security, contacted Katina to be formally interviewed after coming across the Michaels’ work on the Internet. Retherford was responsible for instituting the Citywatcher.com employee access control program using the VeriChip implantable device in 2006. Interview 4.2 presents a candid discussion between Retherford and K. Michael on the risk-versus-reward debate with respect to RFID implantables. While Retherford can see the potential for ID tokens being embedded in the body, Michael raises some very important matters with respect to the security questions inherent in RFID. Plainly, Michael argues that if we invite technology into the body, then we are inviting a whole host of computer “connectedness” issues (e.g. viruses, denial-of-service attacks, server outages, susceptibility to hacking) into the human body as well. Retherford believes these are matters that can be overcome with the right technology, and predicts a time when RFID implant maintenance may well be as straightforward as visiting a Local Service Provider (LSP).

Presentation 4.3 was delivered at IEEE ISTAS10 by Mr Amal Graafstra and can be found on the Internet here: http://www.youtube.com/watch?v=kraWt1adY3k. This chapter presents the do-it-yourselfer perspective, as opposed to receiving an implant that someone else uses in their operations or commercial applications. Quite possibly, the DIY culture may have an even greater influence on the diffusion of RFID implantables than the commercial arena. DIYers are usually circumspect about commercial RFID implant offerings, which they cannot customise, or which require an implant to be injected into a pre-defined bodily space they cannot physically control. Graafstra’s published interview in 2009, as well as his full-length paper on the RFID subculture with K. Michael and M.G. Michael (2010), still stand as the most informative dialogue on the motivations of DIYers. In 2012, Graafstra founded his own company, DangerousThings.com, touting the benefits of RFID implantables within the DIY/hacking community. Notably, a footer disclaimer statement reads: “Certain things sold at the Dangerous Things Web shop are dangerous. You are purchasing, receiving, and using the items you acquired here at your own peril. You're a big boy/girl now, you can make your own decisions about how you want to use the items you purchase. If this makes you uncomfortable, or you are unable to take personal responsibility for your actions, don't order!”

Chapter 5 closes section 2 and is written by Maria Burke and Chris Speed on applications of technology with an emphasis on memory, knowledge browsing, knowledge recovery, and knowledge sharing. This chapter reports on outcomes from the Tales of Things Electronic Memory (TOTeM) large-grant research project in the United Kingdom. Burke and Speed take a fresh perspective on how technology is influencing societal and organisational change by focusing on Knowledge Management (KM). The chapter does not explicitly address RFID; rather, it explores technologies already widely diffused under the broad category of tagging systems, such as quick response (QR) codes, essentially 2D barcodes. The authors also acknowledge that tagging systems rely on underlying infrastructure, such as wireless networks and the Internet more broadly, through devices we carry such as smartphones. In the context of this book, one might also read this chapter with a view to how memory aids might support an ageing population, or those suffering from Alzheimer’s disease, for example.

Section 3 is about the adoption of RFID tags and transponders by various demographics. Christine Perakslis examines the willingness to adopt RFID implants in chapter 6. She looks specifically at how personality factors play a role in the acceptance of uberveillance. She reports on a preliminary study, as well as comparing outcomes from two separate studies conducted in 2005 and 2010. In her important findings, she discusses RFID implants as lifesaving devices, their use for trackability in case of an emergency, and their potential to increase safety and security and to speed up airport checkpoints. Yet the purpose of the Perakslis study is not to identify implantable applications as such but to investigate differences between and among personality dimensions and levels of willingness toward implanting an RFID chip in the human body. Specifically, Perakslis examines levels of willingness toward the uberveillance trajectory using the Myers-Briggs Type Indicator (MBTI).

In Interview 6.1, Katina Michael converses with a 16-year-old male from Campbelltown, NSW, about tattoos, implants, and amplification. The interview is telling with respect to the prevalence of the “coolness” factor and group dynamics in youth. Though tattoos have traditionally been used to identify with an affinity group, we learn that implants would only resonate with youth if they were functional in an advanced manner, beyond mere identification. This interview demonstrates the intrinsic connection between technology and the youth subculture, which will more than likely be among the early adopters of implantable devices, yet at the same time remains highly susceptible to peer-group pressure and brand-driven advertising.

In chapter 7, Randy Basham considers the potential for RFID chip technology to be used in the elderly for surveillance purposes. The chapter not only focuses on the adoption of technology but emphasises the value conflicts that RFID poses for the elderly demographic. Among these conflicts are resistance to change, technophobia, matters of informed consent, the risk of physical harm, Western religious opposition, concerns over privacy and GPS tracking, and transhumanism. Basham, who sits on the Human Services Information Technology Applications (HUSITA) board of directors, provides major insights into resistance to change with respect to humancentric RFID. It is valuable to read Basham’s chapter alongside the earlier interview transcript of Gary Retherford, to consider how new technologies like RFID implantables may be diffused widely into society. Minors and the elderly are particularly dependent demographics in this space and require special attention. It is pertinent to note that protests by CASPIAN, led by Katherine Albrecht in 2007, blocked the chipping of elderly patients who were suffering from Alzheimer’s disease (Lewan, 2007; ABC, 2007). If one contemplates the trajectory of technology crossover in the surveillance atmosphere, one might imagine an implantable solution with a Unique Lifetime Identifier (ULI) that follows people from cradle to grave and becomes the fundamental componentry powering human interactions.

Section 4 draws on laws, directives, regulations, and standards with respect to challenges arising from the practice of uberveillance. Chapter 8 investigates how the collection of DNA profiles and samples in the United Kingdom is fast becoming uncontrolled. The National DNA Database (NDNAD) of the UK holds records for more than 8% of the population, with much higher proportions for minority groups such as the Black Ethnic Minority (BEM). Author Katina Michael argues that such practices drive further adoption of what one could term national security technologies. However, developments and innovations in this space are fraught with ethical challenges. The risks associated with familial searching, overlaid with medical research, further compound the possibility that people may carry a microchip implant with some form of DNA identifier linked to a Personal Health Record (PHR). This is particularly pertinent when considering the European Union (EU) decision to step up cross-border police and judicial cooperation in criminal matters among EU countries, allowing for the exchange of DNA profiles between the authorities responsible for the prevention and investigation of criminal offences (see the Prüm Treaty).

Chapter 9 presents outcomes from a large Australian Research Council-funded project on the night-time economy in Australia. In this chapter, ID scanners and uberveillance are considered in light of trade-offs between privacy and crime prevention. Do ID scanners prevent or minimise crime in particular hot spots, or do they simply cause a chilling effect and redistribute crime to new areas? Darren Palmer and his co-authors demonstrate how ID scanners are becoming a normalized precondition of entry into one Australian night-time economy. They demonstrate that the implications of technological determinism amongst policy makers, police, and crime prevention theories need to be critically assessed, and that the value of ID scanners needs to be reconsidered in context. In chapter 10, Jann Karp writes on global tracking systems in Australian interstate trucking. She investigates driver perspectives and attitudes on the modern practice of fleet management, on the practice of tracking vehicles, and on what that means to truck drivers. Whereas chapter 9 investigates the impact of emerging technology on consumers, chapter 10 gives an employee perspective. While Palmer et al. question the effectiveness of ID scanners in pubs and clubs, Karp poses the challenging question: is locational surveillance of drivers in the trucking industry a help or a hindrance?

Chapter 11, written by Mark Burdon et al., surveys legislative developments in tracking, in relation to the “Do Not Track” initiatives. The chapter focuses on online behavioral profiling, in contrast to chapter 8, which focuses on DNA profiling and sampling. US legislative developments are compared with those in the European Union, New Zealand, Canada, and Australia. Burdon et al. provide an excellent analysis of the problems. Recommendations for ways forward are presented in a bid for members of our communities to be able to provide meaningful and educated consent, but also for the appropriate regulation of transborder information flows. This is a substantial piece of work, and one of the most informative chapters on Do Not Track initiatives available in the literature.

Chapter 12 by Kyle Powys Whyte and his nine co-authors from Michigan State University completes section 4 with a paper on emerging standards in the livestock industry. The chapter looks at the benefits of nanobiosensors in livestock traceability systems, but does not neglect to raise the social and ethical dimensions related to standardising this industry. Whyte et al. argue that future development of nanobiosensors should include processes that engage diverse actors in ways that elicit productive dialogue on the social and ethical contexts. A number of practical recommendations are presented at the conclusion of the chapter, such as the role of “anticipatory governance” as linked to Science and Technology Studies (STS). One need only consider the findings of this priming chapter, and how these results may be applied in light of the relationship between non-humancentric RFID and humancentric RFID chipping. Indeed, the opening sentence of the chapter points to the potential: “uberveillance of humans will emerge through embedding chips within nonhumans in order to monitor humans.”

Section 5 contains the critical chapter dedicated to the health implications of microchipping living things. In chapter 13, Katherine Albrecht (2010) uncovers significant problems related to microchip-induced cancer in mice and rats. A meta-analysis of eleven clinical studies published in oncology and toxicology journals between 1996 and 2006 is examined in detail in this chapter. Albrecht goes beyond the prospective social implications of microchipping humans when she presents the physical adverse reactions to implants in animals. Albrecht concludes her chapter with solid recommendations for policy-makers, veterinarians, pet owners, and oncology researchers, among others. When the original report was first launched (http://www.antichips.com/cancer/), Todd Lewan (2007) of the Associated Press had an article published in the Washington Post titled “Chip Implants Linked to Animal Tumors.” Albrecht is to be commended for this pioneering study, choosing to focus on health-related matters which will become increasingly relevant in the adoption of invasive and pervasive technologies.

The sixth and final section addresses the emerging socio-ethical implications of RFID tags and transponders in humans. Chapter 14 addresses some of the underlying philosophical aspects of privacy within pervasive surveillance. Alan Rubel chooses to investigate the commercial arena, penal supervision, and child surveillance in this book chapter. He asks: what is the potential for privacy loss? The intriguing and difficult question that Rubel attempts to answer is whether privacy losses (and gains) are morally salient. Rubel posits that determining whether privacy loss is morally weighty, or of sufficient moral weight to give rise to a right to privacy, requires an examination of reasons why privacy might be valuable. He describes both instrumental value and intrinsic value and presents a brief discussion on surveillance and privacy value.

Panel 14.1 is a slightly modified transcription of the debate over microchipping people recorded at IEEE ISTAS10 (https://www.youtube.com/watch?v=dI3Rps-VFdo). This distinguished panel is chaired by lawyer William Herbert. Panel members included Rafael Capurro, who was a member of the European Group on Ethics in Science and New Technologies (EGE), and who co-authored the landmark 2005 Opinion “On the ethical aspects of ICT implants in the human body.” Capurro, who is the director of the International Center for Information Ethics, was able to provide a highly specialist ethical contribution to the panel. Mark Gasson and Amal Graafstra, both of whom are RFID implantees, introduced their respective expert testimonies. Chair of the Australian Privacy Foundation Roger Clarke and CASPIAN director Katherine Albrecht represented the privacy and civil liberties positions in the debate. The transcript demonstrates the complexity and multi-layered dimensions surrounding humancentric RFID, and the divisive nature of the issue at hand: whether or not to microchip people.

In chapter 15 we are introduced to the development of brain-computer interfaces, brain-machine interfaces, and neuromotor prostheses. Here Ellen McGee examines sophisticated technologies that are used for more than just identification purposes. She writes of brain implants that are surgically implanted and affixed, as opposed to simple implantable devices that are injected into the arm with a small injector kit. These advanced technologies will allow for radical enhancement and augmentation. It is clear from McGee's fascinating work that these kinds of leaps in human function and capability will cause major ethical, safety, and justice dilemmas. McGee clearly articulates the need for discourse and regulation in the broad field of neuroprosthetics. She especially emphasises the importance of privacy and autonomy. McGee concludes that there is an urgent need for debate on these issues, and questions whether or not it is wise to pursue such irreversible developments.

Ronnie Lipschutz and Rebecca Hester complement the work of McGee, going beyond possibilities to the outright assumption that the human will assimilate into the cellular society. They proclaim “We are the Borg!”, and in doing so point to a future scenario where not only bodies are read, but minds as well. They describe “re(b)organization” as a new phenomenon occurring in our society today. Chapter 16 is strikingly challenging for this reason, and makes one speculate about what or who the driving forces behind this cyborgization process are. This chapter will also prove of special interest to those who are conversant with Cartesian theory. Lipschutz and Hester conclude by outlining the very real need for a legal framework to deal with hackers who penetrate biodata systems and alter individuals' minds and bodies, or who may even kill a person by tampering with or reprogramming their medical device remotely.

Interview 16.1 directly alludes to this cellular society. Videographer Jordan Brown interviews Katina Michael on the notion of the “screen bubble.” What is the screen culture doing to us? Rather than looking up as we walk around, we divert our attention to the screen in the form of a smartphone, iPad, or even a digital wearable glass device. We look down increasingly, and not at each other. We peer into lifeless windows of data, rather than into one another's eyes. What could this mean, and what are some of the social implications of this altering of our natural gaze? The discussion between Brown and K. Michael applies not just to the implantables space, but to the wearables phenomenon as well.

The question of faith in a data-driven and information-saturated society is adeptly addressed by Marcus Wigan in the Epilogue. Wigan calls for a new moral imperative. He asks the very important question: who are the vulnerable now? What is the role of information ethics, and where should targeted efforts be made to address these overarching issues which affect all members of society: from children to the elderly, from the employed to the unemployed, from those in positions of power to the powerless? It is the emblematic conclusion to a book on uberveillance.

REFERENCES

ABC. (2007). Alzheimer's patients lining up for microchip. ABCNews. Retrieved from http://abcnews.go.com/GMA/OnCall/story?id=3536539

Albrecht, K. (2010). Microchip-induced tumors in laboratory rodents and dogs: A review of the literature 1990–2006. In Proceedings of IEEE International Symposium on Technology and Society (ISTAS10). Wollongong, Australia: IEEE.

Ali, A., & Mann, S. (2013). The inevitability of the transition from a surveillance-society to a veillance-society: Moral and economic grounding for sousveillance. In Proceedings of IEEE International Symposium on Technology and Society (ISTAS13). Toronto, Canada: IEEE.

Australian Privacy Foundation. (2005). Human services card. Australian Privacy Foundation. Retrieved 6 June 2013, from http://www.privacy.org.au/Campaigns/ID_cards/HSCard.html

Clarke, R. (1988a). Information technology and dataveillance. Communications of the ACM, 31(5), 498–512. 10.1145/42411.42413

Clarke, R. (1988b). Just another piece of plastic in your wallet: The ‘Australian card’ scheme. ACM SIGCAS Computers and Society, 18(1), 7–21. 10.1145/47649.47650

Gagnon, M., Jacob, J. D., & Guta, A. (2013). Treatment adherence redefined: A critical analysis of technotherapeutics. Nursing Inquiry, 20(1), 60–70. 10.1111/j.1440-1800.2012.00595.x

Graafstra, A. (2009). Interview 14.2: The RFID do-it-yourselfer. In Michael, K., & Michael, M. G. (Eds.), Innovative automatic identification and location based services: From bar codes to chip implants (pp. 427–449). Hershey, PA: IGI Global.

Graafstra, A., Michael, K., & Michael, M. G. (2010). Social-technical issues facing the humancentric RFID implantee sub-culture through the eyes of Amal Graafstra. In Proceedings of IEEE International Symposium on Technology and Society (ISTAS10). Wollongong, Australia: IEEE.

Hayes, A. (2010). Uberveillance (triquetra). Retrieved 6 May 2013, from http://archive.org/details/Uberveillancetriquetra

Hayes, A., Mann, S., Aryani, A., Sabbine, S., Blackall, L., Waugh, P., & Ridgway, S. (2013). Identity awareness of research data in veillance and social computing. In Proceedings of IEEE International Symposium on Technology and Society (ISTAS13). Toronto, Canada: IEEE.

Kerr, I., & Mann, S. (n.d.). Exploring equiveillance. ID TRAIL MIX. Retrieved 26 September 2013 from http://wearcam.org/anonequiveillance.htm

Levin I. (1970). This perfect day: A novel. New York: Pegasus.

Lewan, T. (2007, September 8). Chip implants linked to animal tumors. Washington Post. Retrieved from http://www.washingtonpost.com/wp-dyn/content/article/2007/09/08/AR2007090800997_pf.html

Macquarie. (2009). Uberveillance. In S. Butler (Ed.), Macquarie dictionary (5th ed.). Sydney, Australia: Sydney University.

Mann, S. (2004). Sousveillance: Inverse surveillance in multimedia imaging. In Proceedings of the 12th Annual ACM International Conference on Multimedia. New York, NY: ACM.

Mann, S., Nolan, J., & Wellman, B. (2003). Sousveillance: Inventing and using wearable computing devices for data collection in surveillance environments. Surveillance & Society, 1(3), 331–355.

Masters, A., & Michael, K. (2005). Humancentric applications of RFID implants: The usability contexts of control, convenience and care. In Proceedings of the Second IEEE International Workshop on Mobile Commerce and Services. Munich, Germany: IEEE Computer Society.

Michael, K. (2003). The automatic identification trajectory. In Lawrence, E., Lawrence, J., Newton, S., Dann, S., Corbitt, B., & Thanasankit, T. (Eds.), Internet commerce: Digital models for business. Sydney, Australia: John Wiley & Sons.

Michael, K. (2012). Israel, Palestine and the benefits of waging war through Twitter. The Conversation. Retrieved 22 November 2012, from http://theconversation.com/israel-palestine-and-the-benefits-of-waging-war-through-twitter-10932

Michael, K. (2013a). High-tech lust. IEEE Technology and Society Magazine, 32(2), 4–5. 10.1109/MTS.2013.2259652

Michael, K. (Ed.). (2013b). Social implications of wearable computing and augmediated reality in every day life. In Proceedings of IEEE Symposium on Technology and Society. Toronto, Canada: IEEE.

Michael, K., McNamee, A., & Michael, M. G. (2006). The emerging ethics of humancentric GPS tracking and monitoring. In Proceedings of International Conference on Mobile Business. Copenhagen, Denmark: IEEE Computer Society.

Michael, K., & Michael, M. G. (Eds.). (2007). From dataveillance to überveillance and the realpolitik of the transparent society. Wollongong, Australia: Academic Press.

Michael, K., & Michael, M. G. (2009a). Innovative automatic identification and location-based services: From bar codes to chip implants. Hershey, PA: IGI Global. 10.4018/978-1-59904-795-9

Michael, K., & Michael, M. G. (2009c). Predicting the socioethical implications of implanting people with microchips. PerAda Magazine. Retrieved from http://www.perada-magazine.eu/view.php?article=1598-2009-04-02&category=Citizenship

Michael, K., & Michael, M. G. (2009d). Teaching ethics in wearable computing: The social implications of the new ‘veillance’. EduPOV. Retrieved June 18, from http://www.slideshare.net/alexanderhayes/2009-aupov-main-presentation?from_search=3

Michael, K., & Michael, M. G. (2010). Implementing namebers using implantable technologies: The future prospects of person ID. In Pitt, J. (Ed.), This pervasive day: The potential and perils of pervasive computing (pp. 163–206). London: Imperial College London.

Michael, K., & Michael, M. G. (2011). The social and behavioral implications of location-based services. Journal of Location-Based Services, 5(3–4), 121–137. 10.1080/17489725.2011.642820

Michael, K., & Michael, M. G. (2013). No limits to watching? Communications of the ACM, 56(11), 26–28. 10.1145/2527187

Michael, K., Michael, M. G., & Abbas, R. (2009b). From surveillance to uberveillance (Australian Research Council Discovery Grant Application). Wollongong, Australia: University of Wollongong.

Michael, K., Michael, M. G., & Ip, R. (2008). Microchip implants for humans as unique identifiers: A case study on VeriChip. In Proceedings of Conference on Ethics, Technology, and Identity. Delft, The Netherlands: Delft University of Technology.

Michael, K., Michael, M. G., & Perakslis, C. (2014). Be vigilant: There are limits to veillance. In Pitt, J. (Ed.), The computer after me. London: Imperial College Press.

Michael, K., & Miller, K. W. (2013). Big data: New opportunities and new challenges. IEEE Computer, 46(6), 22–24. 10.1109/MC.2013.196

Michael, K., Roussos, G., Huang, G. Q., Gadh, R., Chattopadhyay, A., & Prabhu, S. (2010). Planetary-scale RFID services in an age of uberveillance. Proceedings of the IEEE, 98(9), 1663–1671. 10.1109/JPROC.2010.2050850

Michael, M. G. (2000). For it is the number of a man. Bulletin of Biblical Studies, 19, 79–89.

Michael, M. G., & Michael, K. (2009). Uberveillance: Microchipping people and the assault on privacy. Quadrant, 53(3), 85–89.

Michael, M. G., & Michael, K. (2010). Towards a state of uberveillance. IEEE Technology and Society Magazine, 29(2), 9–16. 10.1109/MTS.2010.937024

Minsky, M. (2013). The society of intelligent veillance. In Proceedings of IEEE International Symposium on Technology and Society (ISTAS13). Toronto, Canada: IEEE.

Nolan, D. (2013, June 7). The human cloud. Monolith. Retrieved from http://www.monolithmagazine.co.uk/the-human-cloud/

Oxford Dictionary. (2012). Dataveillance. Retrieved 6 May 2013, from http://oxforddictionaries.com/definition/english/surveillance

Paechter, B., Pitt, J., Serbedzijac, N., Michael, K., Willies, J., & Helgason, I. (2011). Heaven and hell: Visions for pervasive adaptation. In Fet11 essence. Budapest, Hungary: Elsevier. 10.1016/j.procs.2011.12.025

Paterson, N. (2013). Veillances: Protocols & network surveillance. In Proceedings of IEEE International Symposium on Technology and Society (ISTAS13). Toronto, Canada: IEEE.

Pitt J. (Ed.). (2012). This pervasive day: The potential and perils of pervasive computing. London: Imperial College London.

Pitt J. (2014). The computer after me. London: Imperial College Press.

Reynolds, M. (2004). Despite the hype, microchip implants won't deliver security. Gartner. Retrieved 6 May 2013, from http://www.gartner.com/DisplayDocument?doc_cd=121944

Rodotà, S., & Capurro, R. (2005). Ethical aspects of ICT implants in the human body. Opinion of the European Group on Ethics in Science and New Technologies to the European Commission, 20.

Shih, T. K. (2013). Video forgery and motion editing. In Proceedings of International Conference on Advances in ICT for Emerging Regions. ICT.

Stephan, K. D., Michael, K., Michael, M. G., Jacob, L., & Anesta, E. (2012). Social implications of technology: Past, present, and future. Proceedings of the IEEE, 100(13), 1752–1781. 10.1109/JPROC.2012.2189919

Superman. (1995). In Honderich, T. (Ed.), Oxford companion to philosophy. Oxford, UK: Oxford University Press.

Uberveillance. (2010). In Australian law dictionary. Oxford, UK: Oxford University Press.

Warwick K. (2002). I, cyborg. London: Century.

Wordnik. (2013). Sousveillance. Retrieved 6 June 2013, from http://www.wordnik.com/words/sousveillance

Perceived barriers for implanting microchips in humans

Abstract

This quantitative, descriptive study investigated whether there was a relationship between the countries of residence of small business owners (N = 453) within four countries (Australia, India, the UK, and the USA) and perceived barriers to RFID (radio frequency identification) transponders being implanted into humans for employee ID. Participants were asked what they believed were the greatest barriers to instituting chip implants for access control in organizations. Participants had six options from which to select. There were significant chi-square analyses reported relative to respondents' countries and: 1) a perceived barrier of technological issues (X² = 11.86, df = 3, p = .008); 2) a perceived barrier of philosophical issues (right of control over one's body) (X² = 31.21, df = 3, p < .001); and 3) a perceived barrier of health issues (unknown risks related to implants) (X² = 10.88, df = 3, p = .012). There were no significant chi-square analyses reported with respect to countries of residence and: 1) religious issues (mark of the beast), 2) social issues (digital divide), and 3) cultural issues (incisions into the skin are taboo). Thus, the researchers concluded that there were relationships between the respondents' countries and the perception of barriers to instituting microchip implants.

SECTION I. Introduction

The purpose of this study was to investigate whether there were relationships between the countries of residence (Australia, India, the UK, and the USA) of small business owners and perceived barriers to instituting RFID (radio frequency identification) transponders implanted into the human body for identification and access control purposes in organizations [1]. Participants were asked what they believed were the greatest barriers to instituting chip implants for access control in organizations [2]. Participants had six options from which to select all that apply, as well as an option to specify other barriers [3]. The options for perceived barriers included:

  • technological issues: RFID is inherently an insecure technology
  • social issues: there will be a digital divide between those with implants for identification and those with legacy electronic identification
  • cultural issues: incisions into the skin are taboo
  • religious issues: mark of the beast
  • philosophical issues: right of control over one's body
  • health issues: there are unknown risks related to implants that remain in the body over the long term
  • other issues.

There were significant chi-square analyses reported relative to respondents' countries and: 1) the perceived barrier of technological issues; 2) the perceived barrier of philosophical issues (right of control over one's body); and 3) the perceived barrier of health issues (unknown risks related to implants). There were no significant chi-square analyses reported with respect to countries and religious issues (mark of the beast), social issues (digital divide), and cultural issues (incisions into the skin are taboo).
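The chi-square tests of independence reported above can be sketched in a few lines of code. The contingency table below is hypothetical (the paper reports the test statistics, not the raw counts); the function shows the standard Pearson computation for a country-by-barrier table.

```python
# Chi-square test of independence: country of residence vs. whether a
# respondent selected a given barrier. The counts below are HYPOTHETICAL,
# chosen only so the rows sum to the study's N = 453.

def chi_square(table):
    """Pearson chi-square statistic and degrees of freedom for a 2-D table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# Rows: Australia, India, UK, USA; columns: selected the barrier, did not.
observed = [
    [40, 75],   # hypothetical counts
    [70, 45],
    [35, 80],
    [45, 63],
]
stat, df = chi_square(observed)
print(f"X^2 = {stat:.2f}, df = {df}")  # df = 3 for a 4x2 table
```

The statistic is then compared against the chi-square distribution with 3 degrees of freedom to obtain the p-values the authors report.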

RFID implants are capable of omnipresent electronic surveillance. RFID tags or transponders can be implanted into the human body to track the who, what, where, when, and how of human life [4]. This act of embedding devices into human beings for surveillance purposes is known as uberveillance [5]. While the tiny embedded RFID chips do not have global positioning capabilities, an RFID reader (fixed or mobile) can capture time stamps and exit and entry sequences to denote when someone is coming or going and which direction they are travelling in, and then make inferences about time, location, distance, and speed.
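The kind of inference described above needs nothing more than two time-stamped reads at known reader positions. The sketch below is illustrative only: the reader names, positions, and tag ID are hypothetical, but they show how direction and average speed fall out of a pair of read events.

```python
# Inferring movement from fixed-reader RFID events: a minimal sketch.
# Reader locations, the tag ID, and timestamps are HYPOTHETICAL.
from datetime import datetime

# Known positions of fixed readers along a corridor, in metres from the entry.
READER_POSITION = {"entry-gate": 0.0, "lab-door": 42.0}

# Two reads of the same tag: (tag id, reader id, timestamp).
events = [
    ("tag-4F2A", "entry-gate", datetime(2014, 3, 5, 9, 0, 0)),
    ("tag-4F2A", "lab-door",   datetime(2014, 3, 5, 9, 0, 30)),
]

# Order the reads in time, then derive displacement, speed, and direction.
(_, r1, t1), (_, r2, t2) = sorted(events, key=lambda e: e[2])
distance = READER_POSITION[r2] - READER_POSITION[r1]   # signed: + = inward
elapsed = (t2 - t1).total_seconds()
speed = abs(distance) / elapsed                        # metres per second
direction = "entering" if distance > 0 else "leaving"

print(f"{direction} at {speed:.1f} m/s")  # -> entering at 1.4 m/s
```

With more readers, the same logic chains into a trajectory, which is precisely why reader placement, not GPS, is what makes implanted tags surveillance-capable.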

In this paper, the authors present a brief review of the literature, key findings from the study, and a discussion on possible implications of the findings. Professionals working in the field of emerging technologies could use these findings to better understand how countries of residence may affect perceptions of barriers in instituting chip implants in humans.

SECTION II. Review of Literature

A. Implants and Social Acceptance

In 2004, the FDA (Food & Drug Administration) of the United States approved an implantable chip for use in humans in the U.S. [6]. The implanted chip was, and is, being marketed by a variety of commercial enterprises as a potential method to detect and treat diseases, as well as a potential lifesaving device. If a person were brought to an emergency room unconscious, a scanner in the hospital doorway could read the person's unique ID on the implanted chip. The ID would then be used to unlock the personal health records (PHR) of the patient from a database [7]. Authorized health professionals would then have access to all pertinent medical information of that individual (i.e. medical history, previous surgeries, allergies, heart condition, blood type, diabetes) to care for the patient appropriately. Additionally, the chip is being touted as a solution to kidnappings in Mexico (e.g. by the Xega Company), among many other uses [8].

B. Schools: RFID Tracking

A rural elementary school in California planned to implement RFID-tagged ID cards for school children; however, the American Civil Liberties Union (ACLU) successfully fought to have the program revoked. Veritable risks articulated by the ACLU included identity theft, and kidnapping in the event that the system was hacked, enabling a perpetrator to access the locations of schoolchildren.

However, with school districts looking to offset cuts in state funding, which are partly based on attendance figures, RFID technology provides a method to count students more accurately. In addition to increased revenues, administrators are facing the reality of increasing security issues; thus more school districts are adopting RFID to track students and improve safety. For many years in Tokyo, students have worn mandatory RFID bracelets; they are tracked not only in the school, but also to and from school [9] [10]. In other examples, bags are fitted with GPS units.

In 2012, the Northside Independent School District in San Antonio, Texas began a pilot program to track 6.2% of its 100,000 students through RFID-tagged ID cards. Northside was not the first district in Texas; two other school districts in Houston successfully use the technology, with reported gains of hundreds of thousands of dollars in revenue due to improved attendance. The school board unanimously approved the program, but only after first debating privacy issues. Chip readers on campuses and on school buses detect a student's location, and authorized administrators have access to the information. The pilot program cost $525,000 to launch, and the district expected to gain approximately $1.7 million in the first year through higher attendance figures, as well as Medicaid reimbursements for the busing of special education students. However, students could forget or lose the cards, which would negatively affect the system [3]. One of Northside's sophomore students, Andrea Hernandez, refused to wear the RFID tag around her neck for religious reasons. Initially, the school expelled her, but when the case went to court she was reinstated, a judge ruling that her constitutional rights had been violated [11].

C. Medical Devices: RFID Implants

Recent technological developments are reaching new levels with the integration of silicon and biology; implanted devices can now interact directly with the brain [12]. Implantable devices for medical purposes are often highly beneficial to restore functions that were lost. Such current medical implants include cardiovascular pacers, cochlear and brainstem implants for patients with hearing disorders, implantable drug delivery pumps, implantable neurostimulation devices for such patients as those with urinary incontinence, chronic pain, or epilepsy, deep brain stimulation for patients with Parkinson's, and artificial chip-controlled legs [13].

D. RFID in India

Although India has been identified as a significant prospective market for RFID due to issues with the supply chain and a need for transparency, some contend that the slow adoption of RFID solutions can be traced to unskilled RFID solution providers. Inexperienced systems integrators and vendors are believed to account for failed trials, leaving companies disillusioned with the technology, subsequently abandoning solutions and decrying the technology's benefits loudly and publicly. A secondary technological threat to RFID adoption is believed to be related to price competitiveness in India. In such a price-sensitive environment, RFID players are known to quote the lowest costs per tag, thereby using inferior hardware. Thus, customers perceive RFID to be inconsistent and unreliable for use in the business setting [14]. The compulsory biometrics rollout instituted by the Unique Identification Authority of India (UIDAI) stands in direct contrast to the experience of RFID (Fig. 1).

Fig. 1. Taking fingerprints for Aadhaar, a 12-digit unique number issued to all residents of India. The number is stored in a centralized database and linked to basic demographic and biometric information. The system institutes multimodal biometrics. Creative commons: fotokannan.

E. RFID in Libraries

In 2010, researchers reported that many corporate libraries had begun deploying RFID. RFID tags are placed into books and other media, and are used in libraries for such purposes as automating stock verification, locating misplaced items, checking patrons in and out without human interaction, and detecting theft. In India, several deployment and implementation issues were identified: consumer privacy issues/ethical concerns, costs, lack of standards and regulations in India (e.g. data ownership, data collection limitations), user confusion (e.g. lack of training and experience with the technology), and the immaturity of the technology (e.g. lack of accuracy, scalability, etc.) [15].

F. RFID and OEMS/Auto Component Manufacturers

In India, suppliers are not forced to conform to stringent regulations like those that exist in other countries. For example, the TREAD Act in the U.S. provided the impetus for OEMs to invest in track-and-trace solutions; failure to comply with the regulations can carry a maximum fine of $15 million and a criminal penalty of up to 15 years. Indian suppliers are not only free from such compliance regulations, but are also cost conscious, with low volumes of high-value cars. It is believed that the cost of RFID solutions is not yet justified in the Indian market [16].

G. Correctional Facilities: RFID Tracking

A researcher studied a correctional facility in Cleveland, Ohio to evaluate the impact of RFID technology in deterring misconduct such as sexual assaults. The technology was considered because of its value in confirming inmate counts and perimeter controls. In addition, corrections officers can utilize such technology to check inmate locations against predetermined schedules, to detect if rival gang members are in close proximity, to classify and track the proximity of former intimate partners, to single out inmates with food allergies or health issues, and even to identify inmates who may attempt to move through the cafeteria line twice [17].

The results of the study indicated that RFID did not deter inmate misconduct, although the researchers articulated many issues that affected the results. Significant technological challenges abounded for the correctional facility as RFID tracking was implemented and included system inoperability, signal interference (e.g. “blind spots” where bracelets could not be detected), and transmission problems [18] [17].

H. Social Concerns

Social concerns plague epidermal electronics for nonmedical purposes [19]. In the United States, many states have crafted legislation to balance the potential benefits of RFID technology with the disadvantages associated with privacy and security concerns [20]. California, Georgia, Missouri, North Dakota, and Wisconsin are among states in the U.S. which have passed legislation to prohibit forced implantation of RFID in humans [21]. The “Microchip Consent Act of 2010”, which became effective on July 1, 2010 in the state of Georgia, not only stated that no person shall be required to be implanted with a microchip (regardless of a state of emergency), but also that voluntary implantation of any microchip may only be performed by a physician under the authority of the Georgia Composite Medical Board.

Through the work of Rodotà and Capurro in 2005, the European Group on Ethics in Science and New Technologies to the European Commission examined the ethical questions arising from science and new technologies. The role of the opinion was to raise awareness concerning the dilemmas created by both medical and non-medical implants in humans, which affect the intimate relation between bodily and psychic functions basic to our personal identity [22]. The opinion stated that information and communications technology implants should not be used to manipulate mental functions or to change a personal identity. Additionally, the opinion stated that principles of data protection must be applied to protect personal data embedded in implants [23]. The implants were identified in the opinion as a threat to human dignity when used for surveillance purposes, although the opinion stated that this might be justifiable for security and/or safety reasons [24].

I. Increased Levels of Willingness to Adopt: 2005–2010

Researchers continue to investigate social acceptance of the implantation of this technology into human bodies. In 2006, researchers reported higher levels of acceptance of the implantation of a chip within their bodies when college students perceived benefits from this technology [25]. Utilizing the same questions posed in 2005 to college students attending both private and public institutions of higher education, the researchers once again in 2010 investigated levels of willingness to implant RFID chips, to understand whether there had been shifts in college students' willingness to implant RFID chips for various reasons [25] [26]. In both studies, students were asked: “How willing would you be to implant an RFID chip in your body as a method (to reduce identity theft, as a potential lifesaving device, to increase national security)?” A 5-point Likert-type scale was utilized, varying from “Strongly Unwilling” to “Strongly Willing”. Comparison of the 2005 and 2010 results revealed a clear shift: levels of willingness moved from unwillingness toward either neutrality or willingness to implant a chip in the human body to reduce identity theft, as a potential lifesaving device, and to increase national security. Levels of unwillingness decreased in all aforementioned areas as follows [26]. Between 2005 and 2010, the unwillingness (“Strongly unwilling” and “Somewhat unwilling”) of college students to implant an RFID chip into their bodies decreased by 22.4% when considering RFID implants as a method to reduce identity theft, by 19.9% when considering RFID implants as a potential lifesaving device, and by 16.3% when considering RFID implants to increase national security [26].

J. RFID Implant Study: German Tech Conference Delegates

A 2010 survey conducted by BITKOM, a German information technology industry lobby group, of individuals attending a technology conference reported that 23% of 1,000 respondents would be prepared to have a chip inserted under their skin for certain benefits; 72% of respondents, however, reported they would not allow implantation of a chip under any circumstances. Sixteen percent (16%) of respondents reported they would accept an implant to allow emergency services to rescue them more quickly in the event of a fire or accident [27].

K. Ask India: Are Implants a More Secure Technology?

Previously, researchers reported a significant chi-square analysis relating country of residence to perceptions of chip implants as a more secure technology for identification/access control in organizations. More participants from India than expected responded “yes” to implants as a more secure technology (46 vs. 19.8; adjusted residual = 7.5). When compared against the other countries in the study, fewer residents from the UK responded “yes” than expected (9 vs. 19.8), and fewer residents from the USA responded “yes” than expected (11 vs. 20.9). In rank order, the countries contributing to this significant relationship were India, the UK, and the USA; no such differences in opinion were found for respondents from Australia [28].

Due to heightened security threats, there appears to be a surge in demand for security in India [29][30]. A progression of mass-casualty assaults carried out by extremist Pakistani nationals against hotels and government buildings in India has brought more awareness to the potential threats against less secure establishments [30]. The government is working to institute security measures at the individual level with a form of national ID card that will house key biometric data of the individual. In local and regional settings, technological infrastructure is developing rapidly in metro and non-metro areas because of the increase in MNCs (multi-national corporations) now locating in India. Although the neighborhood “chowkidar” (human guard/watchman) was previously a more popular measure for localized security, advances in, and the reliability and availability of, security technology are believed to be driving the adoption of electronic access security as a replacement for more traditional security measures [29][30].

L. Prediction of Adoption of Technology

Many models have been developed and utilized to understand the factors that affect the acceptance of technology, such as: the Moguls Model of Computing by Ndubisi, Gupta, and Ndubisi in 2005; Diffusion of Innovation Theory by Rogers in 1983; the Theory of Planned Behavior by Ajzen in 1991; the Model of PC Utilization attributed to Thompson, Higgins, and Howell in 1991; Protection Motivation Theory (PMT) by Rogers in 1985; and the Theory of Reasoned Action attributed to Fishbein and Ajzen in 1975, with additional revisions by the same authors in 1980 [31].

Researchers in Berlin, Germany investigated consumers' reactions to RFID in retail. After viewing an introductory stimulus film about RFID services in retail, participants evaluated the technology and potential privacy mechanisms. Participants were asked to rate, on a five-point Likert-type scale (ranging from “not at all sensitive” to “extremely sensitive”), their attitudes toward privacy on statements such as: “Generally, I want to disclose the least amount of data about myself” or “To me it is irrelevant if somebody knows what I buy for my daily needs.” Participants reported moderate privacy awareness and, interestingly, a moderate expectation that legal regulations would result in sufficient privacy protection. Results showed that the extent to which people value the protection of their privacy strongly influences how willing they are to accept RFID in retail. Participants were aware of privacy problems with RFID-based services; however, if retailers articulated that they valued customers' privacy, participants appeared more likely to adopt the technology. Thus, privacy protection (and the communication of it) was found to be an essential element of RFID rollouts [32].

SECTION III. Methodology

This quantitative, descriptive study investigated whether there were relationships between countries of residence and perceived barriers to RFID chip implants in humans for identification and access control purposes in organizations. The survey took place between April 4, 2011 and April 18, 2011, and took an average of 10 minutes to complete online. Participants, who were small business owners in four countries (Australia, India, the UK, and the USA), were asked: “As a senior executive, what do you believe are the greatest barriers in instituting chip implants for access control in organizations?” Relative to gender, 51.9% of participants were male and 48.1% were female. Participants ranged from 18 to 71 years of age; the mean age was 44 and the median age was 45. Eighty percent of the organizations surveyed had fewer than five employees. Table I shows the survey participants' industry sectors.

Table I Senior executive's industry sector


The study employed one instrument that collected key data on the business profile, the technologies currently utilized for identification and access control at the organization, and the senior executives' perceptions of RFID implants in humans for identification and access control in organizations. Twenty-five percent of the small business owners who participated in the survey said they had electronic ID access to their premises. Twenty percent reported that employee ID cards came equipped with a photograph, and fewer than five percent stated they had experienced a security breach in the 12 months preceding the study.

Descriptive statistics, including frequency counts and measures of central tendency, were run, and chi-square analyses were conducted to examine whether there were relationships between the respondents' countries and each of the perceived barriers to instituting microchips in humans.
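The chi-square procedure with adjusted residuals used throughout the findings can be sketched as follows. This is a minimal illustration only: the contingency counts below are hypothetical, not the study's data, and the residual formula is the standard one for standardized (adjusted) residuals.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 4x2 contingency table:
# rows = country (Australia, India, UK, USA), cols = barrier selected (yes / no).
# Counts are illustrative only.
observed = np.array([
    [78, 104],  # Australia
    [37, 145],  # India
    [55, 127],  # UK
    [58, 124],  # USA
])

chi2, p, df, expected = chi2_contingency(observed)

# Adjusted residuals: (O - E) / sqrt(E * (1 - row proportion) * (1 - column proportion))
n = observed.sum()
row_prop = observed.sum(axis=1, keepdims=True) / n
col_prop = observed.sum(axis=0, keepdims=True) / n
adj_resid = (observed - expected) / np.sqrt(expected * (1 - row_prop) * (1 - col_prop))

print(f"chi2={chi2:.2f}, df={df}, p={p:.4f}")
print(np.round(adj_resid, 1))  # cells with |residual| > 2.0 drive the relationship
```

A cell such as India/yes with far fewer observed than expected counts produces a large negative adjusted residual, which is the pattern the rule of thumb (|residual| > 2.0) flags.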

SECTION IV. Findings

A significant relationship was reported between respondents' countries and three of the six choices provided in the multichotomous question: “As a senior executive, what do you believe are the greatest barriers in instituting chip implants for access control in organizations?”

A. Barrier: Technological Issues

The significant chi-square analysis indicated that there was a relationship between the respondents' countries and the perceived barrier of technological issues. Using the rule of identifying adjusted residuals greater than 2.0, examination of the adjusted residuals indicated that the relationship arose because more participants from India than expected selected “technological issues (RFID is inherently an insecure technology)” as a barrier to instituting chip implants (45 vs. 31.1; adjusted residual = 3.4).

B. Barrier: Philosophical Issues

The second significant chi-square analysis (df = 3) indicated that there was a relationship between the respondents' countries and the perceived barrier of philosophical issues (right of control over one's body). An examination of the adjusted residuals indicated that the relationship was mostly created by fewer participants from India than expected selecting philosophical issues as a barrier to instituting chip implants (37 vs. 61.3; adjusted residual = 5.3). In addition, more residents from Australia than expected (78 vs. 62.9; adjusted residual = 3.3) selected philosophical issues as a barrier. In rank order, the countries contributing to this significant relationship were India, followed by Australia; no such differences in opinion were found for respondents from the UK and the USA.

C. Barrier: Health Issues

The third significant chi-square analysis indicated there was a relationship between the respondents' countries and the perceived barrier of health issues (unknown risks related to implants). An examination of the adjusted residuals indicated that the relationship was mostly created by more residents of India than expected selecting health issues as a barrier to instituting chip implants (57 vs. 43.3; adjusted residual = 3.1). In addition, fewer residents from the USA than expected (36 vs. 45.7; adjusted residual = 2.1) selected health issues as a barrier. In rank order, the countries contributing to this significant relationship were India, followed by the USA; no such differences in opinion were found for respondents from Australia and the UK.

D. Barrier: Social Issues, Religious Issues, and Cultural Issues

There were no significant chi-square analyses reported with respect to respondents' countries and social issues (digital divide), religious issues (mark of the beast), and cultural issues (incisions into the skin are taboo). Thus, in this study the researchers concluded no such differences in opinion were found for respondents' countries of residence and the barriers of social issues, religious issues, and cultural issues.

E. Statistical Summary

When asked whether radio-frequency identification (RFID) transponders surgically implanted beneath the skin of an employee would be a more secure technology for employee identification in the organization, only eighteen percent believed so. When subsequently asked how many staff in their organization would opt for an employee ID chip implant instead of the current technology if it were available, respondents indicated that eighty percent would not opt in. These figures are consistent with an in-depth interview conducted with consultant Gary Retherford, who was responsible for the first small-business adoption of RFID implants for access control at Citywatcher.com in 2006 [33]–[35]. In terms of the perceived barriers to instituting an RFID implant for access control in organizations, senior executives stated the following (in order of greatest to least): 61% said health issues; 55% philosophical issues; 43% social issues; 36% cultural issues; 31% religious issues; and 28% technological issues.

F. Open-Ended Question

When senior executives were asked if they themselves would adopt an RFID transponder surgically implanted beneath the skin, the responses were summarized into three categories: no, unsure, and yes [36]. We present a representative list of these responses below; a future study will provide in-depth qualitative content analysis.

1) No, I Would Not Get an RFID Implant

“No way would I. Animals are microchipped, not humans.”

“Absurd and unnecessary.”

“I absolutely would not have any such device implanted.”

“Hate it and object strongly.”

“No way.”

“No thanks.”

“Yuk.”

“Absolutely creepy and unnecessary.”

“Would not consider it.”

“I would leave the job.”

“I don't like the idea one bit. The idea is abhorrent. It is invasive both physically and psychologically. I would never endorse it.”

“Would never have it done.”

“Disagree invading my body's privacy.”

“Absolutely vehemently opposed.”

“This proposal is a total violation of human rights.”

“Yeah right!! and get sent straight to hell! not this little black duck!”

“I do not believe you should put things in your body that God did not supply you with …”

“I wouldn't permit it. This is a disgraceful suggestion. The company does not OWN the employees. Slavery was abolished in developed countries more than 100 years ago. How dare you even suggest such a thing. You should be ashamed.”

“I would sooner stick pins in my eyeballs.”

“It's just !@;#%^-Nazi's???”

2) I am Unsure about Getting an RFID Implant

“A bit overkill for identification purposes.”

“Uncomfortable.”

“Maybe there is an issue with OH&S and personal privacy concern.”

“Unsure.”

“Only if I was paid enough to do this, $100000 minimum.”

“Unsure, seems very robotic.”

“I'm not against this type of device but I would not use it simply for business security.”

“A little skeptical.”

“A little apprehensive about it.”

3) Yes, I would Get an RFID Implant

“Ok, but I would be afraid that it could be used by outside world, say police.”

“Sick!”

“It is a smart idea.”

“It would not be a problem for me, but I own the business so no philosophical issues for me.”

“I'd think it was pretty damn cool.”

SECTION V. Discussion: Perceived Barriers

A. Barrier: Technological Issues

The literature revealed many technological barriers to non-implantable chips; this study suggests the same barrier is also perceived for implantable chips, and the two are likely related [37]. More Indian participants than expected selected technological issues (RFID is inherently an insecure technology) as a barrier to instituting chip implants for access control; no such differences of opinion were found for the other countries in the study. However, the literature revealed in other analyses that more Indian participants than expected answered “yes” when asked if implants are a more secure technology for instituting identification/access control in an organization. The findings appear to suggest that although Indian participants perceive RFID implants as more secure than methods such as manual processes, paper-based systems, smartcards, or biometric/RFID cards, they are likely to view the technology as undeveloped and still too emergent. Further research is needed to substantiate this conclusion, although a review of the literature revealed that RFID solution providers are already abundant in India, with many new companies launching at a rapid pace. Without standards and regulations, providers are unskilled and uneducated in the technology, delivering solutions that often do not prove successful in implementation. Customers then deem the technology inconsistent and ineffective in its current state. In addition, RFID players undercut each other, offering cheap pricing for cheap, underperforming hardware. Therefore, the researchers' preliminary conclusion is that adoption of implants in India is likely to be inhibited not only now but well into the future if implementations of non-implantable RFID solutions continue to misrepresent the capabilities of the technology. It is likely that, well before accepting implantable chips, individuals in India would need to be assured of the consistency and effectiveness of RFID chips in non-human applications.

B. Barrier: Philosophical Issues

Fewer Indian participants than expected selected philosophical issues (right of control over one's body) as a barrier, while more Australian participants than expected selected this as a barrier. The researchers concluded that this is fertile ground for future research [38]. The deep cultural assumptions of each country are likely to influence participants' responses. For example, although Indian philosophies vary, many emphasize the continuity of the soul or spirit rather than the temporary state of the flesh (the body). Further research could inform these findings by exploring how and why participants in India, versus participants in Australia, perceive their own right of control over their bodies.

C. Barrier: Health Issues

More Indian participants than expected selected health issues (unknown risks related to implants) as a barrier to instituting implants, while fewer American participants than expected selected this as a barrier. The researchers conclude that these results may reflect perceived successes with current uses of the technology. The literature revealed that participants from India are experiencing poor implementations of the technology. Conversely, Americans are increasingly exposed to the use of surgically implanted chips in pets (often with no choice if the pet is adopted from a shelter), with few or no health issues arising [39]. In addition, segments of the healthcare industry are advocating for RFID in the supply chain (e.g., the blood supply) with much success. To inform these findings, further research is needed to explore how participants from each country describe the unknown risks related to implants.

SECTION VI. Conclusion

In conclusion, the authors recognize there are significant social implications to implanting chips in humans. Although voluntary chipping has been embraced by certain individuals, the chipping of humans remains rare and is mostly a topic of discussion and debate. Privacy and security issues abound and are not to be minimized. However, in the future we may see an increased demand for, and acceptance of, chipping, especially as the global environment intensifies. When considering the increase in natural disasters over the past two years, the rising tensions between nations such as those faced by India with terrorism by extremists from neighboring countries, and the recent contingency plans to enact border controls to mitigate refugees fleeing failing countries in the Eurozone, the tracking of humans may once again come to the forefront, as it did post-9/11 when rescuers raced against the clock to locate survivors in the rubble.

India is of particular interest in this study; participants from this country contributed most in many of the analyses. India is categorized as a developing country (or newly industrialized country) and is the second most populous country in the world. The government of India is already utilizing national identification cards housing biometrics, although the rollout has been delayed as officials work to solve issues around cards being stolen or misplaced, as well as how to prevent fraudulent use after the cardholder's death. Technological infrastructure is improving in even the more remote regions of India as MNCs (multi-national corporations) locate business divisions in the country. The findings, set against the backdrop of the literature review, bring to light what appears to be a population statistically more open to (and possibly ready for) implant technology than those of the developed countries in the study. However, ill-informed RFID players in India are selling low-quality products; there appears to be a lack of standards and insufficient knowledge of the technology among those who should know it best. Further research is necessary not only to understand the Indian perspective, but also to better understand the environment now and into the future.

References

1. K. Michael and M. G. Michael, "The Diffusion of RFID Implants for Access Control and ePayments: Case Study on Baja Beach Club in Barcelona, " in IEEE International Symposium on Technology and Society (ISTAS10), Wollongong, Australia, 2010, pp. 242-252.

2. K. Michael and M. G. Michael, "Implementing Namebers Using Microchip Implants: The Black Box Beneath The Skin, " in This Pervasive Day: The Potential and Perils of Pervasive Computing, J. Pitt, Ed., ed London, United Kingdom: Imperial College Press, 2012, pp. 163-203.

3. K. Michael and M. G. Michael, "The Social, Cultural, Religious and Ethical Implications of Automatic Identification, " in The Seventh International Conference on Electronic Commerce Research, Dallas, Texas, 2004, pp. 432-450.

4. M. G. Michael and K. Michael, "A note on uberveillance, " in From dataveillance to uberveillance and the realpolitik of the transparent society, K. Michael and M. G. Michael, Eds., ed Wollongong: University of Wollongong, 2006, pp. 9-25.

5. M. G. Michael and K. Michael, Eds., Uberveillance and the Social Implications of Microchip Implants (Advances in Human and Social Aspects of Technology). Hershey, PA: IGI Global, 2014.

6. J. Stokes. (2004, October 14, 2004). FDA approves implanted RFID chip for humans. Available: http://arstechnica.com/uncategorized/2004/10/4305-2/

7. K. Michael, et al., "Microchip Implants for Humans as Unique Identifiers: A Case Study on VeriChip, " in Conference on Ethics, Technology, and Identity, Delft, Netherlands, 2008.

8. K. Opam. (2011, August 22, 2011). RFID Implants Won't Rescue the People Kidnapped in Mexico. Available: http://gizmodo.com/5833237/rfid-implants-wont-work-if-youve-beenkidnapped-in-mexico

9. C. Swedberg. (2005, June 12, 2012). L.A. County Jail to track inmates. Available: http://www.rfidjournal.com/article/articleview/1601/1/1

10. F. Vara-Orta. (2012, May 31, 2012). Students will be tracked via chips in IDs. Available: http://www.mysanantonio.com/news/education/article/Students-willbe-tracked-via-chips-in-IDs-3584339.php#ixzz1vszm9Wn4

11. Newstaff. (November 27, 2012, May 13, 2014). Texas School: Judge Overturns Student's Expulsion over RFID Chip. Available: http://www.govtech.com/Texas-School-Wear-RFID-Chip-or-Get-Expelled.html

12. M. Gasson, "ICT implants: The invasive future of identity?, " Advances in Information and Communication Technology, vol. 262, pp. 287-295, 2008.

13. K. D. Stephan, et al., "Social Implications of Technology: Past, Present, and Future, " Proceedings of the IEEE, vol. 100, pp. 1752-1781 2012.

14. R. Kumar. (2011, June 1, 2012). India's Big RFID Adoption Challenges. Available: http://www.rfidjournal.com/article/articleview/8145/1/82/

15. L. Radha, "Deployment of RFID (Radio Frequency Identification) at Indian academic libraries: Issues and best practice. , " International Journal of Library and Information Science, vol. 3, pp. 34-37, 2011.

16. H. Saranga, et al. (2010, June 2, 2012). Scope for RFID Implementation in the Indian Auto Components Industry. Available: http://tejasiimb.org/articles/73.php

17. N. LaVigne, "An evaluability assessment of RFID use in correctional settings, " in Final report submitted to the National Institute of Justice, ed. Washington DC: USA, 2006.

18. R. Halberstadt and N. LaVigne, "Evaluating the use of radio frequency identification device (RFID) technology to prevent and investigate sexual assaults in a correctional setting, " The Prison Journal, vol. 91, pp. 227-249, 2011.

19. A. Masters and K. Michael, "Lend me your arms: The use and implications of humancentric RFID, " Electronic Commerce and Applications, vol. 6, pp. 29-39, 2007.

20. K. Albrecht and L. McIntyre, Spychips: How Major Corporations and Government Plan to Track Your Every Purchase and Watch Your Every Move. New York: Plume, 2006.

21. A. Friggieri, et al., "The Legal Ramifications of Microchipping People in the United States of America-A State Legislative Comparison, " in IEEE International Symposium on Technology and Society (ISTAS '09), Phoenix, Arizona, 2009.

22. G. G. Assembly. (2010, January 12, 2011). Senate Bill 235. Available: http://www1.legis.ga.gov/legis/2009-10/versions/sb235-As-passed-Senate-5.htm

23. M. G. Michael and K. Michael, "Towards a State of Uberveillance, " IEEE Technology and Society Magazine, vol. 29, pp. 9-16, 2010.

24. S. Rodota and R. Capurro, "Opinion no. 20: Ethical aspects of ICT implants in the human body, " in European Group on Ethics in Science and New Technologies (EGE), ed, 2005.

25. C. Perakslis and R. Wolk, "Social acceptance of RFID as a biometric security method, " IEEE Symposium on Technology and Society Magazine, vol. 25, pp. 34-42, 2006.

26. C. Perakslis, "Consumer Willingness to Adopt RFID Implants: Do Personality Factors Play a Role in the Acceptance of Uberveillance?, " in Uberveillance and the Social Implications of Microchip Implants, M. G. Michael and K. Michael, Eds., ed Hershey, PA: IGI Global, 2014, pp. 144-160.

27. A. Donoghue. (2010, March 2, 2010). CeBIT: Quarter Of Germans Happy To Have Chip Implants. Available: http://www.techweekeurope.co.uk/news/cebit-quarter-of-germanshappy-to-have-chip-implants-5590

28. R. Achille, et al., "Ethical Issues to consider for Microchip Implants in Humans, " Ethics in Biology, Engineering and Medicine vol. 3, pp. 77-91, 2012.

29. S. Das. (2009, May 1, 2012). Surveillance: Big Brothers Watching. Available: http://dqindia.ciol.commakesections.asp/09042401.asp

30. M. Krepon and N. Cohn. (2011, May 1, 2012). Crises in South Asia: Trends and Potential Consequences. Available: http://www.stimson.org/books-reports/crises-in-south-Asia-trends-Andconsequences

31. C. Jung, Psychological types. Princeton, NJ: Princeton University Press, 1923 (1971).

32. M. Rothensee and S. Spiekermann, "Between Extreme Rejection and Cautious Acceptance Consumers' Reactions to RFID-Based IS in Retail, " Science Computer Review, vol. 26, pp. 75-86, 2008.

33. K. Michael and M. G. Michael, "The Future Prospects of Embedded Microchips in Humans as Unique Identifiers: The Risks versus the Rewards, " Media, Culture &Society, vol. 35, pp. 78-86, 2013.

34. WND. (October 2, 2006, May 13, 2014). Employees Get Microchip Implants. Available: http://www.wnd.com/2006/02/34751/

35. K. Michael, "Citywatcher.com, " in Uberveillance and the Social Implications of Microchip Implants, M. G. Michael and K. Michael, Eds., ed Hershey, PA: IGI Global, 2014, pp. 133-143.

36. K. Michael, et al., "Microchip Implants for Employees in the Workplace: Findings from a Multi-Country Survey of Small Business Owners, " presented at the Surveillance and/in Everyday Life: Monitoring Pasts, Presents and Futures, University of Sydney, NSW, 2012.

37. M. N. Gasson, et al., "Human ICT Implants: Technical, Legal and Ethical Considerations, " in Information Technology and Law Series vol. 23, ed: Springer, 2012, p. 184.

38. S. O. Hansson, "Implant ethics, " Journal of Med Ethics, vol. 31, pp. 519-525, 2005.

39. K. Albrecht, "Microchip-induced tumours in laboratory rodents and dogs: A review of literature, " in Uberveillance and the Social Implications of Microchip Implants, M. G. Michael and K. Michael, Eds., ed Hershey, PA: IGI Global, 2014, pp. 281-318.

Keywords: Radiofrequency identification, Implants, Educational institutions, Organizations, Access control, Australia, transponders, authorisation, microprocessor chips, organisational aspects, radiofrequency identification, institutional microchips, perceived barriers, microchips implant, transnational study, small business owners, RFID transponders, radio frequency identification transponders, employee ID, chip implants,access control, organizations, chi-square analysis, technological issues, philosophical issues, health issues, religious issues, social issues, digital divide, cultural issues, USA, RFID, radio frequency identification, implants, microchips, uberveillance, barriers, access control, employee identification, security, small business, Australia, India, UK

Citation: Christine Perakslis, Katina Michael, M. G. Michael, Robert Gable, "Perceived barriers for implanting microchips in humans", 2014 IEEE Conference on Norbert Wiener in the 21st Century (21CW), Date of Conference: 24-26 June 2014, Date Added to IEEE Xplore: 08 September 2014. DOI: 10.1109/NORBERT.2014.6893929

Location and Tracking of Mobile Devices

Location and Tracking of Mobile Devices: Überveillance Stalks the Streets

Review Version of 7 October 2012

Published in Computer Law & Security Review 29, 3 (June 2013) 216-228

Katina Michael and Roger Clarke

© Katina Michael and Xamax Consultancy Pty Ltd, 2012

Available under an AEShareNet licence or a Creative Commons licence.

This document is at http://www.rogerclarke.com/DV/LTMD.html

Abstract

During the last decade, location-tracking and monitoring applications have proliferated in mobile cellular and wireless data networks, and through self-reporting by applications running on smartphones equipped with onboard global positioning system (GPS) chipsets. It is now possible to determine a smartphone user's location not merely to a cell, but to a small area within it. Innovators have been quick to capitalise on these location-based technologies for commercial purposes, and have gained access to a great deal of sensitive personal data in the process. In addition, law enforcement agencies utilise these technologies, can do so inexpensively, and hence can track many more people. Moreover, these agencies seek the power to conduct tracking covertly, and without a judicial warrant. This article investigates the dimensions of the problem of tracking people through the devices that they carry. Location surveillance has very serious negative implications for individuals, yet there are very limited safeguards. It is incumbent on legislatures to address these problems, through both domestic laws and multilateral processes.

1. Introduction

Personal electronic devices travel with people, are worn by them, and are, or soon will be, inside them. Those devices are increasingly capable of being located, and, by recording the succession of locations, tracked. This creates a variety of opportunities for the people concerned. It also gives rise to a wide range of opportunities for organisations, at least some of which are detrimental to the person's interests.

Commonly, the focus of discussion of this topic falls on mobile phones and tablets. It is intrinsic to the network technologies on which those devices depend that the network operator has at least some knowledge of the location of each handset. In addition, many such devices have onboard global positioning system (GPS) chipsets, and self-report their coordinates to service-providers. The scope of this paper encompasses those already-well-known forms of location and tracking, but it extends beyond them.

The paper begins by outlining the various technologies that enable location and tracking, and identifies those technologies' key attributes. The many forms of surveillance are then reviewed, in order to establish a framework within which applications of location and tracking can be characterised. Applications are described, and their implications summarised. Controls are considered, whereby potential harm to the interests of individuals can be prevented or mitigated.

2. Relevant Technologies

The technologies considered here involve a device that has the following characteristics:

  • it is conveniently portable by a human, and
  • it emits signals that:
    • enable some other device to compute the location of the device (and hence of the person), and
    • are sufficiently distinctive that the device is reliably identifiable at least among those in the vicinity, and hence the device's (and hence the person's) successive locations can be detected, and combined into a trail

The primary form-factors for mobile devices are currently clam-shape (portable PCs), thin rectangles suitable for the hand (mobile phones), and flat forms (tablets). Many other form-factors are also relevant, however. Anklets imposed on dangerous prisoners, and even as conditions of bail, carry RFID tags. Chips are carried in cards of various sizes, particularly the size of credit-cards, and used for tickets for public transport and entertainment venues, aircraft boarding-passes, toll-road payments and in some countries to carry electronic cash. Chips may conduct transactions with other devices by contact-based means, or contactless, using radio-frequency identification (RFID) or its shorter-range version near-field communication (NFC) technologies. These capabilities are in credit and debit cards in many countries. Transactions may occur with the cardholder's knowledge, with their express consent, and with an authentication step to achieve confidence that the person using the card is authorised to do so. In a variety of circumstances, however, some and even all of those safeguards are dispensed with. The electronic versions of passports that are commonly now being issued carry such a chip, and have an autonomous communications capability. The widespread issue of cards with capabilities uncontrolled by, and in many cases unknown to, the cardholder, is causing consternation among segments of the population that have become aware of the schemes.

Such chips can be readily carried in other forms, including jewellery such as finger-rings, and belt-buckles. Endo-prostheses such as replacement hips and knees and heart pacemakers can readily carry chips. A few people have voluntarily embedded chips directly into their bodies for such purposes as automated entry to premises (Michael & Michael 2009).

In order to locate and track such devices, any sufficiently distinctive signals may in principle suffice. See Raper et al. (2007a) and Mautz (2011). In practice, the signals involved are commonly those transmitted by a device in order to take advantage of wireless telecommunications networks. The scope of the relevant technologies therefore also encompasses the signals, devices that detect the signals, and the networks over which the data that the signals contain are transmitted.

In wireless networks, it is generally the case that the base station or router needs to be aware of the identities of devices that are currently within the cell. A key reason for this is to conserve limited transmission capacity by sending messages only when the targeted device is known to be in the cell. This applies to all of:

  • cellular mobile originally designed for voice telephony and extended to data (in particular those using the '2G'/'3G' standards GSM/GPRS, CDMA2000 and UMTS/HSPA and the '4G' standard LTE)
  • wireless local area networks (WLANs, commonly Wifi / IEEE 802.11x - RE 2010a)
  • wireless wide area networks (WWANs, commonly WiMAX / IEEE 802.16x - RE 2010b).

Devices in such networks are uniquely identified by various means (Clarke & Wigan 2011). In cellular networks, there is generally a clear distinction between the entity (the handset) and the identity it is adopting at any given time (which is determined by the module inserted in it). Depending on the particular standards used, what is commonly referred to as 'the SIM-card' is an R-UIM, a CSIM or a USIM. These modules store an International Mobile Subscriber Identity (IMSI), which constitutes the identity that the handset is currently adopting. Among other things, this enables network operators to determine whether or not to provide service, and what tariff to apply to the traffic. However, cellular network protocols may also involve transmission of a code that distinguishes the handset itself, within which the module is currently inserted. A useful generic term for this is the device 'entifier' (Clarke 2009b). Under the various standards, it may be referred to as an International Mobile Equipment Identity (IMEI), an Electronic Serial Number (ESN), or a Mobile Equipment Identifier (MEID).

In Wifi and WiMAX networks, the device entifier may be a processor-id or more commonly a network interface card identifier (NIC Id). In various circumstances, other device-identifiers may be used, such as a phone-number, or an IP-address may be used as a proxy. In addition, the human using the device may be directly identified, e.g. by means of a user account-name.

A WWAN cell may cover a large area, indicatively with a radius of 50km. Telephony cells may have a radius as large as 2-3 km or as little as a hundred metres. WLANs using Wifi technologies have a cell-size of less than 1 hectare, indicatively a 50-100 metre radius, but in practice often constrained by environmental factors to only 10-30 metres.

The base-station or router knows the identities of devices that are within its cell, because this is a technically necessary feature of the cell's operation. Mobile devices auto-report their presence 10 times per second. Meanwhile, the locations of base-stations for cellular services are known with considerable accuracy by the telecommunications providers. And, in the case of most private Wifi services, the location of the router is mapped to c. 30-100 metre accuracy by services such as Skyhook and Google Locations, which perform what have been dubbed 'war drives' in order to maintain their databases - in Google's case in probable violation of the telecommunications interception and/or privacy laws of at least a dozen countries (EPIC 2012).

Knowing that a device is within a particular mobile phone, WiMAX or Wifi cell provides only a rough indication of location. In order to generate a more precise estimate, within a cell, several techniques are used (McGuire et al. 2005). These include the following (adapted from Clarke & Wigan 2011. See also Figueiras & Frattasi 2010):

  • directional analysis. A single base-station may comprise multiple receivers at known locations and pointed in known directions, enabling the handset's location within the cell to be reduced to a sector within the cell, and possibly a narrow one, although without information about the distance along the sector;
  • triangulation. This involves multiple base-stations serving a single cell, at known locations some distance apart, and each with directional analysis capabilities. Particularly with three or more stations, this enables an inference that the device's location is within a small area at the intersection of the multiple directional plots;
  • signal analysis. This involves analysis of the characteristics of the signals exchanged between the handset and base-station, in order to infer the distance between them. Relevant signal characteristics include the apparent response-delay (Time Difference of Arrival - TDOA, also referred to as multilateration), and strength (Received Signal Strength Indicator - RSSI), perhaps supplemented by direction (Angle Of Arrival - AOA).
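To make the signal-analysis and triangulation ideas concrete, the following sketch (ours, for illustration only; the function names and parameter values are assumptions, not any operator's actual implementation) estimates range from received signal strength via a log-distance path-loss model, then combines ranges from three base-stations at known locations into a position fix by linearised least squares:

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.7):
    """Estimate distance (m) from received signal strength using the
    log-distance path-loss model: RSSI = P0 - 10*n*log10(d).
    tx_power_dbm is the assumed RSSI at 1 m; path_loss_exp is environment-dependent."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def trilaterate(stations, distances):
    """Least-squares position fix from >= 3 non-collinear (x, y) stations
    and corresponding range estimates, by linearising the range equations."""
    (x1, y1), d1 = stations[0], distances[0]
    # Subtract the first range equation from each of the others to obtain
    # the linear system A.p = b in the unknown position p = (x, y)
    a_rows, b_vals = [], []
    for (xi, yi), di in zip(stations[1:], distances[1:]):
        a_rows.append((2 * (xi - x1), 2 * (yi - y1)))
        b_vals.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    # Solve the 2x2 normal equations (A^T A) p = A^T b
    ata = [[sum(r[i] * r[j] for r in a_rows) for j in range(2)] for i in range(2)]
    atb = [sum(r[i] * b for r, b in zip(a_rows, b_vals)) for i in range(2)]
    det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
    x = (atb[0] * ata[1][1] - atb[1] * ata[0][1]) / det
    y = (atb[1] * ata[0][0] - atb[0] * ata[1][0]) / det
    return x, y
```

In practice multipath, shadowing and antenna variation make RSSI-derived ranges noisy, which is one reason the accuracy figures reported in field tests vary so widely.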

The precision and reliability of these techniques varies greatly, depending on the circumstances prevailing at the time. The variability and unpredictability result in many mutually inconsistent statements by suppliers, in the general media, and even in the technical literature.

Techniques for cellular networks generally provide reasonably reliable estimates of location to within an indicative 50-100m in urban areas and some hundreds of metres elsewhere. Worse performance has been reported in some field-tests, however. For example, Dahunsi & Dwolatzky (2012) found the accuracy of GSM location in Johannesburg to be in the range 200-1400m, and highly variable, with "a huge difference between the predicted and provided accuracies by mobile location providers".

The web-site of the Skyhook Wifi-router positioning service claims 10-metre accuracy, 1-second time-to-first-fix and 99.8% reliability (SHW 2012). On the other hand, tests have resulted in far lower accuracy measures, including an average positional error of 63m in Sydney (Gallagher et al. 2009) and "median values for positional accuracy in [Las Vegas, Miami and San Diego, which] ranged from 43 to 92 metres ... [and] the replicability ... was relatively poor" (Zandbergen 2012, p. 35). Nonetheless, a recent research article suggested the feasibility of "uncooperatively and covertly detecting people 'through the wall' [by means of their WiFi transmissions]" (Chetty et al. 2012).
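Positional-error figures of the kind quoted above are, in essence, great-circle distances between the position estimated by the service and a surveyed ground-truth position. A minimal haversine computation (a standard formula; the cited studies do not document their exact methods, so this is illustrative only) is:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude
    coordinates, using a mean Earth radius of 6,371 km."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

A positioning test then reduces to computing this distance for each fix against the surveyed position, and reporting the mean or median error.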

Another way in which a device's location may become known to other devices is through self-reporting of the device's position, most commonly by means of an inbuilt Global Positioning System (GPS) chip-set. This provides coordinates and altitude based on broadcast signals received from a network of satellites. In any particular instance, the user of the device may or may not be aware that location is being disclosed.

Despite widespread enthusiasm and a moderate level of use, GPS is subject to a number of important limitations. The signals are subject to interference from atmospheric conditions, buildings and trees, and the time to achieve a fix on enough satellites and deliver a location measure may be long. This results in variability in its practical usefulness in different circumstances, and in its accuracy and reliability. Civil-use GPS coordinates are claimed to provide accuracy within a theoretical 7.8m at a 95% confidence level (USGov 2012), but various reports suggest practical accuracies of 15m, 20m or 30m, and sometimes as poor as 100m. It may also be affected by radio interference and jamming. The original and still-dominant GPS service operated by the US Government was subject to intentional degradation in the US's national interests. This 'Selective Availability' feature still exists, although it has been subject to a decade-long policy of non-use; and future generations of GPS satellites may no longer support it.

Hybrid schemes exist that use two or more sources in order to generate more accurate location-estimates, or to generate estimates more quickly. In particular, Assisted GPS (A-GPS) utilises data from terrestrial servers accessed over cellular networks in order to more efficiently process satellite-derived data (e.g. RE 2012).

Further categories of location and tracking technologies emerge from time to time. A current example uses means described by the present authors as 'mobile device signatures' (MDS). A device may monitor the signals emanating from a user's mobile device, without being part of the network that the user's device is communicating with. The eavesdropping device may detect particular signal characteristics that distinguish the user's mobile device from others in the vicinity. In addition, it may apply any of the various techniques mentioned above, in order to locate the device. If the signal characteristics are persistent, the eavesdropping device can track the user's mobile device, and hence the person carrying it. No formal literature on MDS has yet been located. The supplier's brief description is at PI (2010b).

The various technologies described in this section are capable of being applied to many purposes. The focus in this paper is on their application to surveillance.

3. Surveillance

The term surveillance refers to the systematic investigation or monitoring of the actions or communications of one or more persons (Clarke 2009c). Until recent times, surveillance was visual, and depended on physical proximity of an observer to the observed. The volume of surveillance conducted was kept in check by the costs involved. Surveillance aids and enhancements emerged, such as binoculars and, later, directional microphones. During the 19th century, the post was intercepted, and telephones were tapped. During the 20th century, cameras enabled transmission of image, video and sound to remote locations, and recording for future use (e.g. Parenti 2003).

With the surge in stored personal data that accompanied the application of computing to administration in the 1970s and 1980s, dataveillance emerged (Clarke 1988). Monitoring people through their digital personae rather than through physical observation of their behaviour is much more economical, and hence many more people can be subjected to it (Clarke 1994). The dataveillance epidemic made it more important than ever to clearly distinguish between personal surveillance - of an identified person who has previously come to attention - and mass surveillance - of many people, not necessarily previously identified, about some or all of whom suspicion could be generated.

Location data is of a very particular nature, and hence it has become necessary to distinguish location surveillance as a sub-set of the general category of dataveillance. There are several categories of location surveillance with different characteristics (Clarke & Wigan 2011):

  • capture of an individual's location at a point in time. Depending on the context, this may support inferences being drawn about an individual's behaviour, purpose, intention and associates
  • real-time monitoring of a succession of locations and hence of the person's direction of movement. This is far richer data, and supports much more confident inferences being drawn about an individual's behaviour, purpose, intention and associates
  • predictive tracking, by extrapolation from the person's direction of movement, enabling inferences to be drawn about near-future behaviour, purpose, intention and associates
  • retrospective tracking, on the basis of the data trail of the person's movements, enabling reconstruction of a person's behaviour, purpose, intention and associates at previous times

Information arising at different times, and from different forms of surveillance, can be combined, in order to offer a more complete picture of a person's activities, and enable yet more inferences to be drawn, and suspicions generated. This is the primary sense in which the term 'überveillance' is applied: "Überveillance has to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought). Überveillance can be a predictive mechanism for a person's expected behaviour, traits, likes, or dislikes; or it can be based on historical fact; or it can be something in between ... Überveillance is more than closed circuit television feeds, or cross-agency databases linked to national identity cards, or biometrics and ePassports used for international travel. Überveillance is the sum total of all these types of surveillance and the deliberate integration of an individual's personal data for the continuous tracking and monitoring of identity and location in real time" (Michael & Michael 2010. See also Michael & Michael 2007, Michael et al. 2008, Michael et al. 2010, Clarke 2010).

A comprehensive model of surveillance includes consideration of geographical scope, and of temporal scope. Such a model assists the analyst in answering key questions about surveillance: of what? for whom? by whom? why? how? where? and when? (Clarke 2009c). Distinctions are also needed based on the extent to which the subject has knowledge of surveillance activities. It may be overt or covert. If covert, it may be merely unnotified, or alternatively express measures may be undertaken in order to obfuscate, and achieve secrecy. A further element is the notion of 'sousveillance', whereby the tools of surveillance are applied, by those who are commonly watched, against those who are commonly the watchers (Mann et al. 2003).

These notions are applied in the following sections in order to establish the extent to which location and tracking of mobile devices is changing the game of surveillance, and to demonstrate that location surveillance is intruding more deeply into personal freedoms than previous forms of surveillance.

4. Applications

This section presents a typology of applications of mobile device location, as a means of narrowing down to the kinds of uses that have particularly serious privacy implications. These are commonly referred to as location-based services (LBS). One category of applications provides information services that are for the benefit of the mobile device's user, such as navigation aids, and search and discovery tools for the locations variously of particular, identified organisations, and of organisations that sell particular goods and services. Users of LBS of these kinds can be reasonably assumed to be aware that they are disclosing their location. Depending on the design, the disclosures may also be limited to specific service-providers and specific purposes, and the transmissions may be secured.

Another, very different category of application is use by law enforcement agencies (LEAs). The US E-911 mandate of 1999 was nominally a public safety measure, to enable people needing emergency assistance to be quickly and efficiently located. In practice, the facility also delivered LEAs means for locating and tracking people of interest, through their mobile devices. Personal surveillance may be justified by reasonable grounds for suspicion that the subject is involved in serious crime, and may be specifically authorised by judicial warrant. Many countries have always been very loose in their control over LEAs, however, and many others have drastically weakened their controls since 2001. Hence, in any given jurisdiction and context, each and all of the controls may be lacking.

Yet worse, LEAs use mobile location and tracking for mass surveillance, without any specific grounds for suspicion about any of the many people caught up in what is essentially a dragnet-fishing operation (e.g. Mery 2009). Examples might include monitoring the area adjacent to a meeting-venue watching out for a blacklist of device-identifiers known to have been associated with activists in the past, or collecting device-identifiers for use on future occasions. In addition to netting the kinds of individuals who are of legitimate interest, the 'by-catch' inevitably includes threatened species. There are already extraordinarily wide-ranging (and to a considerable extent uncontrolled) data retention requirements in many countries.

Of further concern is the use of Automated Number Plate Recognition (ANPR) for mass surveillance purposes. This has been out of control in the UK since 2006, and has been proposed or attempted in various other countries as well (Clarke 2009a). Traffic surveillance is expressly used not only for retrospective analysis of the movements of individuals of interest to LEAs, but also as a means of generating suspicions about other people (Lewis 2008).

Beyond LEAs, many government agencies perform social control functions, and may be tempted to conduct location and tracking surveillance. Examples would include benefits-paying organisations tracking the movements of benefits-recipients about whom suspicions have arisen. It is not too far-fetched to anticipate zealous public servants concerned about fraud control imposing location surveillance on all recipients of some particularly valuable benefit, or as a security precaution on every person visiting a sensitive area (e.g. a prison, a power plant, a national park).

Various forms of social control are also exercised by private sector organisations. Some of these organisations, such as placement services for the unemployed, may be performing outsourced public sector functions. Others, such as workers' compensation providers, may be seeking to control personal insurance claimants, and similarly car-hire companies and insurance providers may wish to monitor motor vehicles' distance driven and roads used (Economist 2012).

A further privacy-invasive practice that is already common is the acquisition of location and tracking data by marketing corporations, as a by-product of the provision of location-based services, but with the data then applied to further purposes other than that for which it was intended. Some uses rely on statistical analysis of large holdings ('data mining'). Many uses are, on the other hand, very specific to the individual, and are for such purposes as direct or indirect targeting of advertisements and the sale of goods and services. Some of these applications combine location data with data from other sources, such as consumer profiling agencies, in order to build up such a substantial digital persona that the individual's behaviour is readily influenced. This takes the activity into the realms of überveillance.

All such services raise serious privacy concerns, because the data is intensive and sensitive, and attractive to organisations. Companies may gain rights in relation to the data through market power, or by trickery - such as exploitation of a self-granted right to change the Terms of Service (Clarke 2011). Once captured, the data may be re-purposed by any organisation that gains access to it, because the value is high enough that they may judge the trivial penalties that generally apply to breaches of privacy laws to be well worth the risk.

A recently-emerged, privacy-invasive practice is the application of the mobile device signature (MDS) form of tracking, in such locations as supermarkets. This is claimed by its providers to offer deep observational insights into the behaviour of customers, including dwell-times in front of displays, possibly linked with the purchaser's behaviour. This raises concerns a little different from other categories of location and tracking technologies, and is accordingly considered in greater depth in the following section.

It is noteworthy that an early review identified a wide range of LBS, which the authors classified into mobile guides, transport, gaming, assistive technology and location-based health (Raper et al. 2007b). Yet that work completely failed to notice that a vast array of applications were emergent in surveillance, law enforcement and national security, despite the existence of relevant literature from at least 1999 onwards (Clarke 2001, Michael & Masters 2006).

5. Implications

The previous sections have introduced many examples of risks to citizens and consumers arising from location surveillance. This section presents an analysis of the categories and of the degree of seriousness with which they should be viewed. The first topic addressed is the privacy of personal location data. Other dimensions of privacy are then considered, and then the specific case of MDS is examined. The treatment here is complementary to earlier articles that have looked more generally at particular applications such as location-based mobile advertising, e.g. Cleff (2007, 2010) and King & Jessen (2010). See also Art. 29 (2011).

5.1 Locational Privacy

Knowing where someone has been, knowing what they are doing right now, and being able to predict where they might go next is a powerful tool for social control and for chilling behaviour (Abbas 2011). Humans do not move around in a random manner (Song et al. 2010).

One interpretation of 'locational privacy' is that it "is the ability of an individual to move in public space with the expectation that under normal circumstances their location will not be systematically and secretly recorded for later use" (Blumberg & Eckersley 2009). A more concise definition is "the ability to control the extent to which personal location information is ... [accessible and] used by others" (van Loenen et al. 2009). Hence 'tracking privacy' is the interest an individual has in controlling information about their sequence of locations.

Location surveillance is deeply intrusive into data privacy, because it is very rich, and enables a great many inferences to be drawn (Clarke 2001, Dobson & Fisher 2003, Michael et al. 2006a, Clarke & Wigan 2011). As demonstrated by Raper et al. (2007a, pp. 32-33), most of the technical literature that considers privacy is merely concerned about it as an impediment to deployment and adoption, and how to overcome the barrier rather than how to solve the problem. Few authors adopt a positive approach to privacy-protective location technologies. The same authors' review of applications (Raper et al. 2007b) includes a single mention of privacy, and that is in relation to just one of the scores of sub-categories of application that they catalogue.

Most service-providers are cavalier in their handling of personal data, and extravagant in their claims. For example, Skyhook claims that it "respects the privacy of all users, customers, employees and partners"; but, significantly, it makes no mention of the privacy of the people whose locations, through the locations of their Wifi routers, it collects and stores (Skyhook 2012).

Consent is critical in such LBS as personal location chronicle systems, people-followers and footpath route-tracker systems that systematically collect personal location information from a device they are carrying (Collier 2011c). The data handled by such applications is highly sensitive because it can be used to conduct behavioural profiling of individuals in particular settings. The sensitivity exists even if the individuals remain 'nameless', i.e. if each identifier is a temporary or pseudo-identifier and is not linked to other records. Service-providers, and any other organisations that gain access to the data, achieve the capacity to make judgements on individuals based on their choices of, for example, which retail stores they walk into and which they do not. For example, if a subscriber visits a particular religious bookstore within a shopping mall on a weekly basis, the assumption can be reasonably made that they are in some way affiliated to that religion (Samuel 2008).

It is frequently asserted that individuals cannot have a reasonable expectation of privacy in a public space. Contrary to those assertions, however, privacy expectations always have existed in public places, and continue to exist (VLRC 2010). Tracking the movements of people as they go about their business is a breach of a fundamental expectation that people will be 'let alone'. In policing, for example, in most democratic countries, it is against the law to covertly track an individual or their vehicle without specific, prior approval in the form of a warrant. This principle has, however, been compromised in many countries since 2001. Warrantless tracking using a mobile device generally results in the evidence, which has been obtained without the proper authority, being inadmissible in a court of law (Samuel 2008). Some law enforcement agencies have argued for the abolition of the warrant process because the bureaucracy involved may mean that the suspect cannot be prosecuted for a crime they have likely committed (Ganz 2005). These issues are not new; but far from eliminating a warrant process, the appropriate response is to invest the energy in streamlining this process (Bronitt 2010).

Privacy risks arise not only from locational data of high integrity, but also from data that is or becomes associated with a person and that is inaccurate, misleading, or wrongly attributed to that individual. High levels of inaccuracy and unreliability were noted above in respect of all forms of location and tracking technologies. In the case of MDS services, claims have been made of one-to-two metre locational accuracy. This has yet to be supported by experimental test cases, however, and hence there is uncertainty about the reliability of inferences that the service-provider or the shop-owner draw. If the data is the subject of a warrant or subpoena, the data's inaccuracy could result in false accusations and even a miscarriage of justice, with the 'wrong person' finding themselves in the 'right place' at the 'right time'.

5.2 Privacy More Broadly

Privacy has multiple dimensions. One analysis, in Clarke (2006a), identifies four distinct aspects. Privacy of Personal Data, variously also 'data privacy' and 'information privacy', is the most widely-discussed dimension of the four. Individuals claim that data about themselves should not be automatically available to other individuals and organisations, and that, even where data is possessed by another party, the individual must be able to exercise a substantial degree of control over that data and its use. The last five decades have seen the application of information technologies to a vast array of abuses of data privacy. The degree of privacy-intrusiveness is a function of both the intensity and the richness of the data. Where multiple sources are combined, the impact is particularly likely to chill behaviour. An example is the correlation of video-feeds with mobile device tracking. The previous sub-section addressed that dimension.

Privacy of the Person, or 'bodily privacy', extends from freedom from torture and right to medical treatment, via compulsory immunisation and imposed treatments, to compulsory provision of samples of body fluids and body tissue, and obligations to submit to biometric measurement. Locational surveillance gives rise to concerns about personal safety. Physical privacy is directly threatened where a person who wishes to inflict harm is able to infer the present or near-future location of their target. Dramatic examples include assassins, kidnappers, 'standover merchants' and extortionists. But even people who are neither celebrities nor notorieties are subject to stalking and harassment (Fusco et al. 2012).

Privacy of Personal Communications is concerned with the need of individuals for freedom to communicate among themselves, without routine monitoring of their communications by other persons or organisations. Issues include 'mail covers', the use of directional microphones, 'bugs' and telephonic interception, with or without recording apparatus, and third-party access to email-messages. Locational surveillance thereby creates new threats to communications privacy. For example, the equivalent of 'call records' can be generated by combining the locations of two device-identifiers in order to infer that a face-to-face conversation occurred.

Privacy of Personal Behaviour encompasses 'media privacy', but particular concern arises in relation to sensitive matters such as sexual preferences and habits, political activities and religious practices. Some privacy analyses, particularly in Europe, extend this discussion to personal autonomy, liberty and the right of self-determination (e.g. King & Jessen 2010). The notion of 'private space' is vital to economic and social aspects of behaviour, is relevant in 'private places' such as the home and toilet cubicles, but is also relevant and important in 'public places', where systematic observation and the recording of images and sounds are far more intrusive than casual observation by the few people in the vicinity.

Locational surveillance gives rise to rich sets of data about individuals' activities. The knowledge, or even suspicion, that such surveillance is undertaken, chills their behaviour. The chilling factor is vital in the case of political behaviour (Clarke 2008). It is also of consequence in economic behaviour, because the inventors and innovators on whom new developments depend are commonly 'different-thinkers' and even 'deviants', who are liable to come to attention in mass surveillance dragnets, with the tendency to chill their behaviour, their interactions and their creativity.

Surveillance that generates accurate data is one form of threat. Surveillance that generates inaccurate data, or wrongly associates data with a particular person, is dangerous as well. Many inferences that arise from inaccurate data will be wrong, of course, but that won't prevent those inferences being drawn, resulting in unjustified behavioural privacy invasiveness, including unjustified association with people who are, perhaps for perfectly good reasons, themselves under suspicion.

In short, all dimensions of privacy are seriously affected by location surveillance. For deeper treatments of the topic, see Michael et al. (2006b) and Clarke & Wigan (2011).

5.3 Locational Privacy and MDS

The recent innovation of tracking by means of mobile device signatures (MDS) gives rise to some issues additional to, or different from, mainstream device-location technologies. This section accordingly considers this particular technique's implications in greater depth. Limited reliable information is currently available, and the analysis is of necessity based on supplier-published sources (PI 2010a, 2010b) and media reports (Collier 2010a, 2010b, 2010c).

A company called Path Intelligence (PI) markets an MDS service to shopping mall-owners, to enable them to better value their floorspace in terms of rental revenues, and to identify points of on-foot traffic congestion to on-sell physical advertising and marketing floorspace (PI 2010a). The company claims to detect each phone (and hence person) that enters a zone, and to capture data, including:

  • how long each device and person stay, including dwell times in front of shop windows;
  • the frequency of repeat visits by shoppers; and
  • typical route and circuit paths taken by shoppers as they go from shop to shop during a given shopping experience.

For malls, PI is able to denote such things as whether or not shoppers who shop at one establishment will also shop at another in the same mall, and whether or not people will go out of their way to visit a particular retail outlet independent of its location. For retailers, PI says it is able to provide information on conversion rates by department or even product line, and even which areas of the store might require more attention by staff during specific times of the day or week (PI 2012).
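PI's algorithms are proprietary and undocumented. Purely as an illustration of how metrics of this kind could in principle be derived, the following sketch (all names, the data layout and the 5-minute visit-gap threshold are our assumptions, not PI's method) computes per-device dwell-times and visit counts per zone from a log of pseudonymous sightings:

```python
from collections import defaultdict

def dwell_and_repeats(observations, gap_s=300):
    """Derive dwell-time and visit counts from pseudonymous sightings,
    each a (device_pseudo_id, zone, unix_timestamp) tuple. A new 'visit'
    is counted when a device reappears in a zone after more than gap_s."""
    sightings = defaultdict(list)
    for dev, zone, ts in observations:
        sightings[(dev, zone)].append(ts)
    stats = {}
    for (dev, zone), stamps in sightings.items():
        stamps.sort()
        visits, dwell = 1, 0
        start = prev = stamps[0]
        for ts in stamps[1:]:
            if ts - prev > gap_s:        # gap exceeded: close current visit
                dwell += prev - start
                visits += 1
                start = ts
            prev = ts
        dwell += prev - start            # close the final visit
        stats[(dev, zone)] = {"visits": visits, "total_dwell_s": dwell}
    return stats
```

Note that even this toy version never needs a name or phone number: a persistent pseudo-identifier is sufficient to build the behavioural profile, which is precisely why 'nameless' data of this kind remains sensitive.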

PI says that it uses "complex algorithms" to determine the geographic position of a mobile, using strategically located "proprietary equipment" in a campus setting (PI 2010a). The company states that it is conducting "data-driven analysis", but that it is not collecting, or at least not disclosing, any personal information such as a name, mobile telephone number or the contents of a short message service (SMS) message. It states that it only ever provides aggregated data, at varying zone levels, to the shopping mall-owners. This is presumably justified on the basis that, using MDS techniques, direct identifiers are unlikely to be available, and a pseudo-identifier needs to be assigned. There is no explicit definition of what constitutes a zone. It is clear, however, that minimally-aggregated data at the highest geographic resolution is available for purchase, at a higher price than more highly-aggregated data.
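PI's "complex algorithms" are proprietary and unpublished. As a hedged illustration of the general class of technique, the sketch below applies textbook multilateration: the range equations of three receivers at known positions are linearised by subtraction, which eliminates the quadratic terms, and the resulting 2x2 linear system is solved. The receiver coordinates and range estimates are invented for the example:

```python
def trilaterate(p1, r1, p2, r2, p3, r3):
    """Estimate (x, y) of a device given three receiver positions p1..p3
    and estimated ranges r1..r3 (e.g. derived from signal timing/strength)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equation at p1 from those at p2 and p3
    # removes the x^2 and y^2 terms, leaving two linear equations.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # solve the 2x2 system by Cramer's rule
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# A device 5 units from each of three receivers at known positions:
x, y = trilaterate((0, 0), 5.0, (6, 0), 5.0, (0, 8), 5.0)
print((x, y))  # the device lies at (3.0, 4.0)
```

Real deployments must cope with noisy range estimates, typically by least-squares over more than three receivers, but the underlying geometry is no more exotic than this.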

Shoppers have no relationship with the company, and it appears unlikely that they would even be aware that data about them is being collected and used. The only disclosure appears to be that "at each of our installations our equipment is clearly visible and labelled with our logo and website address" (PI 2010a), but this is unlikely to be visible to many people, and in any case would not inform anyone who saw it.

In short, the company is generating revenue by monitoring signals from the mobile devices of people who visit a shopping mall for the purchase of goods and services. The data collection is performed without the knowledge of the person concerned (Renegar et al. 2008). The company is covertly collecting personal data and exploiting it for profit. There is no incentive or value proposition for the individual whose mobile is being tracked. No clear statement is provided about collection, storage, retention, use and disclosure of the data (Arnold 2008). Even if privacy were not a human right, this would demand statutory intervention on the public policy grounds of commercial unfairness. The company asserts that "our privacy approach has been reviewed by the [US Federal Trade Commission] FTC, which determined that they are comfortable with our practices" (PI 2010a). It makes no claims of such 'approval' anywhere else in the world.

The service could be extended beyond a mall and the individual stores within it, to, for example, associated walkways and parking areas, and surrounding areas such as government offices, entertainment zones and shopping-strips. Applications can also be readily envisaged on hospital and university campuses, and in airports and other transport hubs. From prior research, this is likely to expose the individual's place of employment, and even their residence (Michael et al. 2006). Even if only aggregated data is sold to businesses, the individual records remain available to at least the service-provider.
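The prior research cited above found that extended tracking tends to expose workplace and residence. A simple heuristic, assumed here purely for illustration and not drawn from any cited study's method, takes the modal cell observed during working hours and during night hours as workplace and home respectively:

```python
from collections import Counter

def infer_anchors(observations):
    """observations: iterable of (hour_of_day, cell_id) pairs for one device.
    Returns the most frequent cell during office hours ('work') and during
    night hours ('home'), or None where no observations fall in the window."""
    work = Counter(c for h, c in observations if 9 <= h < 17)
    home = Counter(c for h, c in observations if h >= 22 or h < 6)
    return {"work": work.most_common(1)[0][0] if work else None,
            "home": home.most_common(1)[0][0] if home else None}

obs = [(10, "cell-CBD"), (14, "cell-CBD"), (23, "cell-suburb"),
       (2, "cell-suburb"), (12, "cell-mall")]
print(infer_anchors(obs))  # {'work': 'cell-CBD', 'home': 'cell-suburb'}
```

That a few lines of counting suffice underlines why even 'anonymous' pseudo-identified location trails are readily re-identifiable: home and work locations together are close to unique for most individuals.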

The scope exists to combine this form of locational surveillance with video-surveillance such as in-store CCTV, and indeed this is claimed to be already a feature of the company's offering to retail stores. To the extent that a commonly-used identifier can be established (e.g. through association with the person's payment or loyalty card at a point-of-sale), the full battery of local and externally-acquired customer transaction histories and consolidated 'public records' data can be linked to in-store behaviour (Michael & Michael 2007). Longstanding visual surveillance is intersecting with well-established data surveillance, and being augmented by locational surveillance, giving breath to dataveillance, or what is now being referred to by some as 'smart surveillance' (Wright et al. 2010, IBM 2011).

Surreptitious collection of personal data is (with exemptions and exceptions) largely against the law, even when undertaken by law enforcement personnel. The MDS mechanism also flies in the face of telephonic interception laws. How, then, can it be in any way acceptable for a form of warrantless tracking to be undertaken by or on behalf of corporations or mainstream government agencies, of shoppers in a mall, or travellers in an airport, or commuters in a transport hub? Why should a service-provider have the right to do what a law enforcement agency cannot normally do?

6. Controls

The tenor of the discussion to date has been that location surveillance harbours enormous threats to location privacy, but also to personal safety, the freedom to communicate, freedom of movement, and freedom of behaviour. This section examines the extent to which protections exist, firstly in the form of natural or intrinsic controls, and secondly in the form of legal provisions. The existing safeguards are found to be seriously inadequate, and it is therefore necessary to also examine the prospects for major enhancements to law, in order to achieve essential protections.

6.1 Intrinsic Controls

A variety of forms of safeguard exist against harmful technologies and unreasonable applications of them. The intrinsic economic control has largely evaporated, partly because the tools use electronics and the components are produced in high volumes at low unit cost. Another reason is that the advertising and marketing sectors are highly sophisticated, already hold and exploit vast quantities of personal data, and are readily geared up to exploit yet more data.

Neither the oxymoronic notion of 'business ethics' nor the personal morality of executives in business and government act as any significant brake on the behaviours of corporations and governments, because they are very weak barriers, and they are readily rationalised away in the face of claims of enhanced efficiencies in, for example, marketing communications, fraud control, criminal justice and control over anti-social behaviour.

A further category of intrinsic control is 'self-regulatory' arrangements within relevant industry sectors. In 2010, for example, the Australian Mobile Telecommunications Association (AMTA) released industry guidelines to promote the privacy of people using LBS on mobile devices (AMTA 2010). The guidelines were as follows:

  1. Every LBS must be provided on an opt-in basis with a specific request from a user for the service
  2. Every LBS must comply with all relevant privacy legislation
  3. Every LBS must be designed to guard against consumers being located without their knowledge
  4. Every LBS must allow consumers to maintain full control
  5. Every LBS must enable customers to control who uses their location information and when that is appropriate, and be able to stop or suspend a service easily should they wish

The second point is a matter for parliaments, privacy oversight agencies and law enforcement agencies, and its inclusion in industry guidelines is for-information-only. The remainder, meanwhile, are at best 'aspirational', and at worst mere window-dressing. Codes of this nature are simply ignored by industry members. They are primarily a means to hold off the imposition of actual regulatory measures. Occasional short-term constraints may arise from flurries of media attention, but the 'responsible' organisations escape by suggesting that bad behaviour was limited to a few 'cowboy' organisations or was a one-time error that won't be repeated.

A case study in industry self-regulation is provided by the Biometrics Code issued by the misleadingly-named Australian industry-and-users association, the Biometrics 'Institute' (BI 2004). During the period 2009-12, the privacy advocacy organisation, the Australian Privacy Foundation (APF), submitted to the Privacy Commissioner on multiple occasions that the Code failed to meet the stipulated requirements and under the Commissioner's own Rules had to be de-registered. The Code never had more than five subscribers (out of a base of well over 100 members - which was itself only a sub-set of organisations active in the area), and had no signatories among the major biometrics vendors or users, because all five subscribers were small organisations or consultants. In addition, none of the subscribers appear to have ever provided a link to the Code on their websites or in their Privacy Policy Statements (APF 2012).

The Commissioner finally ended the farce in April 2012, citing the "low numbers of subscribers", but avoided its responsibilities by permitting the 'Institute' to "request" revocation, over two years after the APF had made the same request (OAIC 2012). The case represents an object lesson in the vacuousness of self-regulation and the business-friendliness of a captive privacy oversight agency.

If economics, morality and industry-sector politics are inadequate, perhaps competition and organisational self-interest might work. On the other hand, repeated proposals that privacy is a strategic factor for corporations and government agencies have fallen on stony ground (Clarke 1996, 2006b).

The public can endeavour to exercise countervailing power against privacy-invasive practices. On the other hand, individuals acting alone are of little or no consequence to organisations that are intent on the application of location surveillance. Moreover, consumer organisations lack funding, professionalism and reach, and only occasionally attract sufficient media attention to force any meaningful responses from organisations deploying surveillance technologies.

Individuals may have direct surveillance countermeasures available to them, but relatively few people have the combination of motivation, technical competence and persistence to overcome lethargy and the natural human desire to believe that the institutions surrounding them are benign. In addition, some government agencies, corporations and (increasingly prevalent) public-private partnerships seek to deny anonymity, pseudonymity and multiple identities, and to impose so-called 'real name' policies, for example as a solution to the imagined epidemics of cyber-bullying, hate speech and child pornography. Individuals who use cryptography and other obfuscation techniques have to overcome the endeavours of business and government to stigmatise them as criminals with 'something to hide'.

6.2 Legal Controls

It is clear that natural or intrinsic controls have been utter failures in privacy matters generally, and will be in locational privacy matters as well. That leaves legal safeguards for personal freedoms as the sole protection. There are enormous differences among domestic laws relating to location surveillance. This section accordingly limits itself to generalities and examples.

Privacy laws are (with some qualifications, mainly in Europe) very weak instruments. Even where public servants and parliaments have an actual intention to protect privacy, rather than merely to overcome public concerns by passing placebo statutes, the draft Bills are countered by strong lobbying by government agencies and industry, to the extent that measures that were originally portrayed as being privacy-protective reach the statute books as authority for privacy breaches and surveillance (Clarke 2000).

Privacy laws, once passed, are continually eroded by exceptions built into subsequent legislation, and by technological capabilities that were not contemplated when the laws were passed. In most countries, location privacy has yet to be specifically addressed in legislation. Even where it is encompassed by human rights and privacy laws, the coverage is generally imprecise and ambiguous. More direct and specific regulation may exist, however. In Australia, for example, the Telecommunications (Interception and Access) Act and the Surveillance Devices Act define and criminalise inappropriate interception and access, use, communication and publication of location information that is obtained from mobile device traffic (AG 2005). On the other hand, when Google Inc. intercepted wi-fi signals and recorded the data that they contained, the Privacy Commissioner absolved the company (Riley 2010), and the Australian Federal Police refused to prosecute despite the action - whether it was intentional, 'inadvertent' or merely plausibly deniable - being a clear breach of the criminal law (Moses 2010).

The European Union determined a decade ago that location data that is identifiable to individuals is to some extent at least subject to existing data protection laws (EU 2002). However, the wording of that so-called 'e-Privacy Directive' countenances the collection of "location data which are more precise than is necessary for the transmission of communications", without clear controls over the justification, proportionality and transparency of that collection (para. 35). In addition, the e-Privacy Directive only applies to telecommunications service providers, not to other organisations that acquire location and tracking data. King & Jessen (2010) discuss various gaps in the protective regimes in Europe.

The EU's Advisory Body (essentially a Committee of European Data Protection Commissioners) has issued an Opinion that mobile location data is generally capable of being associated with a person, and hence is personal data, and hence is subject to the EU Directive of 1995 and national laws that implement that Directive (Art. 29 2011). Consent is considered to be generally necessary, and that consent must be informed, and sufficiently granular (pp. 13-18).

It is unclear, however, to what extent this Opinion has actually caused, and will in the future cause, organisations that collect, store, use and disclose location data to change their practices. This uncertainty exists in respect of national security, law enforcement and social control agencies, which have, or which can arrange, legal authority that overrides data protection laws. It also applies to non-government organisations of all kinds, which can take advantage of exceptions, exemptions, loopholes, non-obviousness, obfuscation, unenforceability within each particular jurisdiction, and extra-jurisdictionality, to operate in ways that are in apparent breach of the Opinion.

Legal authorities for privacy-invasions are in a great many cases vague rather than precise, and in many jurisdictions power in relation to specific decisions is delegated to an LEA (in such forms as self-written 'warrants'), or even a social control agency (in the form of demand-powers), rather than requiring a decision by a judicial officer based on evidence provided by the applicant.

Citizens in many countries are subject to more or less legitimate surveillance, of various degrees and orders of granularity, by their governments, in the name of law enforcement and national security. Many Parliaments have granted powers to national security agencies to use location technology to track citizens and to intercept telecommunications. Moreover, many Parliaments have failed the public by permitting a warrant to be signed by a Minister, or even a public servant, rather than a judicial officer (Jay 1999). Worse still, these already-gross breaches of the principle of a free society are in effect being extended to authorise private organisations to track the mobiles of ordinary citizens because doing so may lead to better services planning, or more efficient advertising and marketing (Collier 2011a).

Data protection legislation in all countries evidences massive weaknesses. There are manifold exemptions and exceptions, and there are intentional and accidental exclusions, for example through limitations in the definitions of 'identified' and 'personal data'. Even the much-vaunted European laws fail to cope with extra-territoriality and are largely ignored by US-based service-providers. They are also focussed exclusively on data, leaving large gaps in safeguards for physical, communications and behavioural privacy.

Meanwhile, a vast amount of abuse of personal data is achieved through the freedom of corporations and government agencies to pretend that Terms imposed on consumers and citizens without the scope to reject them are somehow the subject of informed and freely-given consent. For example, petrol-stations, supermarkets and many government agencies pretend that walking past signs saying 'area subject to CCTV' represents consent to gather, transmit, record, store, use and disclose data. The same approach is being adopted in relation to highly-sensitive location data, and much-vaunted data protection laws are simply subverted by the mirage of consent.

At least notices such as 'you are now being watched' or 'smile, you are being recorded' inform customers that they are under observation. On the other hand, people are generally oblivious to the fact that their mobile subscriber identity is transmitted from their mobile phone and multilaterated to yield a reasonably precise location in a shopping mall (Collier 2011a, b, c). Further, there is no meaningful sense in which they can be claimed to have consented to providing location data to a third party, in this case a location service-provider with whom they have never had contact. And the emergent combination of MDS with CCTV sources becomes a pervasive view of the person, an 'über' view, providing a set of über-analytics to - at this stage - shopping complex owners and their constituents.

What rights do employees have if such a system were instituted in an employment setting? Are workplace surveillance laws in place that would protect employees from constant monitoring? A similar problem applies to people at airports, or on hospital, university, industrial or government campuses. No social contract has been entered into between the parties, rendering the subscriber powerless.

Since the collapse of the Technology Assessment movement, technological deployment proceeds unimpeded, and public risks are addressed only after they have emerged and the clamour of concern has risen to a crescendo. A reactive force is at play, rather than proactive measures being taken to ensure avoidance or mitigation of potential privacy breaches. In Australia, for example, safeguards for location surveillance exist at best incidentally, in provisions under separate legislative regimes and in separate jurisdictions, and at worst not at all. No overarching framework exists to provide consistency among the laws. This causes confusion and inevitably results in inadequate protections (ALRC 2008).

6.3 Prospective Legal Controls

Various learned studies have been conducted, but gather dust. In Australia, the three major law reform commissions have all reported, and all have been ignored by the legislatures (NSWLRC 2005, ALRC 2008, VLRC 2010).

One critical need is for the fundamental principle to be recovered, to the effect that the handling of personal data requires either consent or legal authority. Consent is meaningless as a control over unreasonable behaviour, however, unless it satisfies a number of key conditions. It must be informed, it must be freely-given, and it must be sufficiently granular, not bundled (Clarke 2002). In a great many of the circumstances in which organisations claim to have consent to gather, store, use and disclose location data, the consumer does not appreciate the scope of the handling that the service-provider is authorising itself to perform; the Terms are imposed by the service-provider, and may be varied or completely re-written without consultation, a period of notice, or indeed any notice at all; and consent is bundled, rather than the individual being able to construct the pattern of consents and denials that suits their personal needs. Discussions all too frequently focus on the specifically-US notion of 'opt-out' (or 'presumed consent'), with consent debased to 'opt-in', and deprecated as inefficient and business-unfriendly.
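The granular, unbundled consent called for above can be sketched as a default-deny register in which each purpose must be opted into individually. The purpose names are invented for the example; this is an illustration of the principle, not any regulator's or vendor's scheme:

```python
class ConsentRegister:
    """Per-purpose consent with a default of denial: the opposite of
    bundled Terms and 'presumed consent' (opt-out) regimes."""

    def __init__(self):
        self._grants = set()

    def opt_in(self, purpose):
        self._grants.add(purpose)

    def opt_out(self, purpose):
        # Withdrawal must be as easy as the grant
        self._grants.discard(purpose)

    def permits(self, purpose):
        # Default-deny: anything not explicitly granted is refused
        return purpose in self._grants

c = ConsentRegister()
c.opt_in("navigation")            # consent to location use for routing only
print(c.permits("navigation"))    # True
print(c.permits("ad-targeting"))  # False: not bundled with the navigation grant
```

The point of the sketch is that granularity is technically trivial; where it is absent, the absence reflects a business-model choice, not an engineering constraint.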

Recently, some very weak proposals have been put forward, primarily in the USA. In 2011, for example, two US Senators proposed a Location Privacy Protection Bill (Cheng 2011). An organisation that collected location data from mobile or wireless data devices would have to state explicitly, in plain English, in its privacy policy what was being collected. This would represent only a partial implementation of the already very weak 2006 recommendation of the Internet Engineering Task Force Geographic Location/Privacy (IETF GEOPRIV) working group, which decided that technical systems should include 'Fair Information Practices' (FIPs) to defend against harms associated with the use of location technologies (EPIC 2006). FIPs, however, are themselves only a highly cut-down version of effective privacy protections, and the Bill proposes only a small fraction of FIPs. It would be close to worthless to consumers, and close to legislative authorisation for highly privacy-invasive actions by organisations.

Two other US senators tabled a GPS Bill, nominally intended to "balance the needs of Americans' privacy protections with the legitimate needs of law enforcement, and maintains emergency exceptions" (Anderson 2011). The scope is very narrow - next would have to come the Wi-Fi Act, the A-GPS Act, etc. That approach is obviously unviable in the longer term as new innovations emerge. Effective legislation must have appropriate generality rather than excessive technology-specificity, and should be based on semantics not syntax. Yet worse, these Bills would provide legal authorisation for grossly privacy-invasive location and tracking. IETF engineers, and now Congressmen, want to compromise human rights and increase the imbalance of power between business and consumers.

7. Conclusions

Mobile device location technologies and their applications are enabling surveillance, and producing an enormous leap in intrusions into data privacy and into privacy of the person, privacy of personal communications, and privacy of personal behaviour.

Existing privacy laws are entirely incapable of protecting consumers and citizens against the onslaught. Even where consent is claimed, it generally fails the tests of being informed, freely-given and granular.

There is an urgent need for outcries from oversight agencies, and responses from legislatures. Individual countries can provide some degree of protection, but the extra-territorial nature of so much of the private sector, and the use of corporate havens, in particular the USA, mean that multilateral action is essential in order to overcome the excesses arising from the US laissez faire traditions.

One approach to the problem would be location privacy protection legislation, although it would need to embody the complete suite of protections rather than the mere notification that the technology breaches privacy. An alternative approach is amendment of the current privacy legislation and other anti-terrorism legislation in order to create appropriate regulatory provisions, and close the gaps that LBS providers are exploiting (Koppel 2010).

The chimeras of self-regulation, and the unenforceability of guidelines, are not safeguards. Sensitive data like location information must be subject to actual, enforced protections, with guidelines and codes no longer used as a substitute, but merely playing a supporting role. Unless substantial protections for personal location information are enacted and enforced, there will be an epidemic of unjustified, disproportionate and covert surveillance, conducted by government and business, and even by citizens (Gillespie 2009, Abbas et al. 2011).

References

Abbas R. (2011) 'The social and behavioural implications of location-based services: An observational study of users' Journal of Location Based Services, 5, 3-4 (December 2011)

Abbas R., Michael K., Michael M.G. & Aloudat A. (2011) 'Emerging forms of covert surveillance using GPS-enabled devices', Journal of Cases on Information Technology, 13, 2 (2011) 19-33

AG (2005) 'What the Government is doing: Surveillance Devices Act 2004', 25 May 2005, Australian Government, at http://www.ag.gov.au/agd/www/nationalsecurity.nsf/AllDocs/9B1F97B59105AEE6CA2570C0014CAF5?OpenDocument

ALRC (2008) 'For your information: Australian privacy law and practice (ALRC Report 108)', Australian Government, 2, pp. 1409-10, http://www.alrc.gov.au/publications/report-108

AMTA (2010) 'New mobile telecommunications industry guidelines and consumer tips set benchmark for Location Based Services', Australian Mobile Telecommunications Association, 2010, at http://www.amta.org.au/articles/New.mobile.telecommunications.industry.guidelines.and.consumer.tips.set.benchmark.for.Location.Based.Services

Anderson N. (2011) 'Bipartisan bill would end government's warrantless GPS tracking', Ars Technica, June 2011, at http://arstechnica.com/tech-policy/news/2011/06/bipartisan-bill-would-end-governments-warrantless-gps-tracking.ars

APF (2012) 'Revocation of the Biometrics Industry Code' Australian Privacy Foundation, March 2012, at http://www.privacy.org.au/Papers/OAIC-BiomCodeRevoc-120321.pdf

Arnold B. (2008) 'Privacy guide', Caslon Analytics, May 2008, at http://www.caslon.com.au/privacyguide19.htm

Art. 29 (2011) 'Opinion 13/2011 on Geolocation services on smart mobile devices' Article 29 Data Protection Working Party, 881/11/EN WP 185, 16 May 2011, at http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2011/wp185_en.pdf

BI (2004) 'Privacy Code' Biometrics Institute, Sydney, April 2004, at http://web.archive.org/web/20050424120627/http://www.biometricsinstitute.org/displaycommon.cfm?an=1&subarticlenbr=8

Blumberg A.J. & Eckersley P. (2009) 'On locational privacy, and how to avoid losing it forever' Electronic Frontier Foundation, August 2009, at https://www.eff.org/wp/locational-privacy

Bronitt S. (2010) 'Regulating covert policing methods: from reactive to proactive models of admissibility', in S. Bronitt, C. Harfield and K. Michael (eds.), The Social Implications of Covert Policing, 2010, pp. 9-14

Cheng J. (2011) 'Franken's location-privacy bill would close mobile-tracking 'loopholes'', Wired, 17 June 2011, at http://www.wired.com/epicenter/2011/06/franken-location-loopholes/

Chetty K., Smith G.E. & Woodbridge K. (2012) 'Through-the-Wall Sensing of Personnel Using Passive Bistatic WiFi Radar at Standoff Distances' IEEE Transactions on Geoscience and Remote Sensing 50, 4 (April 2012) 1218-1226

Clarke R. (1988) 'Information technology and dataveillance', Communications of the ACM, 31(5), May 1988, pp498-512, at http://www.rogerclarke.com/DV/CACM88.html

Clarke R. (1994) 'The Digital Persona and its Application to Data Surveillance' The Information Society 10,2 (June 1994) 77-92, at http://www.rogerclarke.com/DV/DigPersona.html

Clarke R. (1996) 'Privacy and Dataveillance, and Organisational Strategy' Proc. I.S. Audit & Control Association (EDPAC'96), Perth, Western Australia, May 1996, at http://www.rogerclarke.com/DV/PStrat.html

Clarke R. (2000) 'Submission to the Commonwealth Attorney-General re: 'A privacy scheme for the private sector: Release of Key Provisions' of 14 December 1999' Xamax Consultancy Pty Ltd, January 2000, at http://www.anu.edu.au/people/Roger.Clarke/DV/PAPSSub0001.html

Clarke R. (2001) 'Person-Location and Person-Tracking: Technologies, Risks and Policy Implications' Information Technology & People 14, 2 (Summer 2001) 206-231, at http://www.rogerclarke.com/DV/PLT.html

Clarke R. (2002) 'e-Consent: A Critical Element of Trust in e-Business' Proc. 15th Bled Electronic Commerce Conference, Bled, Slovenia, June 2002, at http://www.rogerclarke.com/EC/eConsent.html

Clarke R. (2006a) 'What's 'Privacy'?' Xamax Consultancy Pty Ltd, August 2006, at http://www.rogerclarke.com/DV/Privacy.html

Clarke R. (2006b) 'Make Privacy a Strategic Factor - The Why and the How' Cutter IT Journal 19, 11 (October 2006), at http://www.rogerclarke.com/DV/APBD-0609.html

Clarke R. (2008) 'Dissidentity: The Political Dimension of Identity and Privacy' Identity in the Information Society 1, 1 (December, 2008) 221-228, at http://www.rogerclarke.com/DV/Dissidentity.html

Clarke R. (2009a) 'The Covert Implementation of Mass Vehicle Surveillance in Australia' Proc 4th Workshop on the Social Implications of National Security: Covert Policing, April 2009, ANU, Canberra, at http://www.rogerclarke.com/DV/ANPR-Surv.html

Clarke R. (2009b) 'A Sufficiently Rich Model of (Id)entity, Authentication and Authorisation' Proc. IDIS 2009 - The 2nd Multidisciplinary Workshop on Identity in the Information Society, LSE, 5 June 2009, at http://www.rogerclarke.com/ID/IdModel-090605.html

Clarke R. (2009c) 'A Framework for Surveillance Analysis' Xamax Consultancy Pty Ltd, August 2009, at http://www.rogerclarke.com/DV/FSA.html

Clarke R. (2010) 'What is Überveillance? (And What Should Be Done About It?)' IEEE Technology and Society 29, 2 (Summer 2010) 17-25, at http://www.rogerclarke.com/DV/RNSA07.html

Clarke R. (2011) 'The Cloudy Future of Consumer Computing' Proc. 24th Bled eConference, June 2011, at http://www.rogerclarke.com/EC/CCC.html

Clarke R. & Wigan M. (2011) 'You are where you've been: The privacy implications of location and tracking technologies' Journal of Location Based Services 5, 3-4 (December 2011) 138-155, PrePrint at http://www.rogerclarke.com/DV/YAWYB-CWP.html

Cleff E.B. (2007) 'Implementing the legal criteria of meaningful consent in the concept of mobile advertising' Computer Law & Security Review 23,2 (2007) 262-269

Cleff E.B. (2010) 'Effective approaches to regulate mobile advertising: Moving towards a coordinated legal, self-regulatory and technical response' Computer Law & Security Review 26, 2 (2010) 158-169

Collier K. (2011a) 'Stores spy on shoppers', Herald Sun, 12 October 2011, at http://www.heraldsun.com.au/news/more-news/stores-spy-on-shoppers/story-fn7x8me2-1226164244739

Collier K. (2011b) 'Shopping centres' Big Brother plan to track customers', Herald Sun, 14 October 2011, at http://www.heraldsun.com.au/news/more-news/shopping-centres-big-brother-plan-to-track-customers/story-fn7x8me2-1226166191503

Collier K. (2011c) ''Creepy' Path Intelligence retail technology tracks shoppers', news.com.au, 14 October 2011, at http://www.news.com.au/money/creepy-retail-technology-tracks-shoppers/story-e6frfmci-1226166413071

Dahunsi F. & Dwolatzky B. (2012) 'An empirical investigation of the accuracy of location-based services in South Africa' Journal of Location Based Services 6, 1 (March 2012) 22-34

Dobson J. & Fisher P. (2003) 'Geoslavery' IEEE Technology and Society 22 (2003) 47-52, cited in Raper et al. (2007)

Economist (2012) 'Vehicle data recorders - Watching your driving' The Economist, 23 June 2012, at http://www.economist.com/node/21557309

EPIC (2006) 'Privacy and human rights report 2006' Electronic Privacy Information Center, WorldLII, 2006, at http://www.worldlii.org/int/journals/EPICPrivHR/2006/PHR2006-Location.html

EPIC (2012) 'Investigations of Google Street View' Electronic Privacy Information Center, 2012, at http://epic.org/privacy/streetview/

EU (2002) 'Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications)' Official Journal L 201 , 31/07/2002 P. 0037 - 0047, European Commission, at http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32002L0058:en:HTML

Figueiras J. & Frattasi S. (2010) 'Mobile Positioning and Tracking: From Conventional to Cooperative Techniques' Wiley, 2010

Fusco S.J., Abbas R., Michael K. & Aloudat A. (2012) 'Location-Based Social Networking and its Impact on Trust in Relationships' IEEE Technology and Society Magazine 31, 2 (Summer 2012) 39-50, at http://works.bepress.com/cgi/viewcontent.cgi?article=1326&context=kmichael

Gallagher T. et al. (2009) 'Trials of commercial Wi-Fi positioning systems for indoor and urban canyons' Proc. IGNSS Symposium, 1-3 December 2009, Queensland, cited in Zandbergen (2012)

Ganz J.S. (2005) 'It's already public: why federal officers should not need warrants to use GPS vehicle tracking devices', Journal of Criminal Law and Criminology 95, 4 (Summer 2005) 1325-37

Gillespie A.A. (2009) 'Covert surveillance, human rights and the law', Irish Criminal Law Journal, 19, 3 (August 2009) 71-79

IBM (2011) 'IBM Smart Surveillance System (Previous PeopleVision Project)', IBM Research, 30 October 2011, at http://www.research.ibm.com/peoplevision/

Jay D.M. (1999) 'Use of covert surveillance obtained by search warrant', Australian Law Journal, 73, 1 (Jan 1999) 34-36

King N.J. & Jessen P.W. (2010) 'Profiling the mobile customer - Privacy concerns when behavioural advertisers target mobile phones' Computer Law & Security Review 26, 5 (2010) 455-478 and 26, 6 (2010) 595-612

Koppel A. (2010) 'Warranting a warrant: Fourth Amendment concerns raised by law enforcement's warrantless use of GPS and cellular phone tracking', University of Miami Law Review 64, 3 (April 2010) 1061-1089

Lewis P. (2008) 'Fears over privacy as police expand surveillance project' The Guardian, 15 September 2008, at http://www.guardian.co.uk/uk/2008/sep/15/civilliberties.police

McGuire M., Plataniotis K.N. & Venetsanopoulos A.N. (2005) 'Data fusion of power and time measurements for mobile terminal location' IEEE Transaction on Mobile Computing 4 (2005) 142-153, cited in Raper et al. (2007)

Mann S., Nolan J. & Wellman B. (2003) 'Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments' Surveillance & Society 1, 3 (June 2003) 331-355, at http://www.surveillance-and-society.org/articles1(3)/sousveillance.pdf

Mautz R. (2011) 'Overview of Indoor Positioning Technologies' Keynote, Proc. IPIN'2011, Guimaraes, September 2011, at http://www.geometh.ethz.ch/people/.../IPIN_Keynote_Mautz_2011.pdf

Mery D. (2009) 'The mobile phone as self-inflicted surveillance - And if you don't have one, what have you got to hide?' The Register, 10 April 2009, at http://www.theregister.co.uk/2009/04/10/mobile_phone_tracking/

Michael K. & Michael M.G. (2007) 'From Dataveillance to Überveillance and the Realpolitik of the Transparent Society' University of Wollongong, 2007, at http://works.bepress.com/kmichael/51

Michael K. & Michael M.G. (2009) 'Innovative Automatic Identification and Location-Based Services: From Bar Codes to Chip Implants' IGI Global, 2009

Michael M.G. & Michael K. (2010) 'Towards a state of uberveillance' IEEE Technology and Society Magazine 29, 2 (Summer 2010) 9-16, at http://works.bepress.com/kmichael/187

Michael K., McNamee A., Michael M.G. & Tootell H. (2006a) 'Location-Based Intelligence - Modeling Behavior in Humans using GPS' Proc. Int'l Symposium on Technology and Society, New York, 8-11 June 2006, at http://ro.uow.edu.au/cgi/viewcontent.cgi?article=1384&context=infopapers

Michael K., McNamee A. & Michael M.G. (2006b) 'The Emerging Ethics of Humancentric GPS Tracking and Monitoring' Proc. Int'l Conf. on Mobile Business, Copenhagen, Denmark IEEE Computer Society, 2006, at http://ro.uow.edu.au/cgi/viewcontent.cgi?article=1384&context=infopapers

Michael M.G., Fusco S.J. & Michael K. (2008) 'A Research Note on Ethics in the Emerging Age of Uberveillance (Überveillance)' Computer Communications, 31(6), 2008, 1192-1199, at http://works.bepress.com/kmichael/32/

Michael K. & Masters A. (2006) 'Realized Applications of Positioning Technologies in Defense Intelligence' in Abbass H. & Essam D. (eds.) 'Applications of Information Systems to Homeland Security and Defense' Idea Group Publishing, 2006, at http://works.bepress.com/kmichael/2

Michael K., Roussos G., Huang G.Q., Gadh R., Chattopadhyay A., Prabhu S. & Chu P. (2010) 'Planetary-scale RFID services in an age of uberveillance' Proceedings of the IEEE 98, 9 (2010) 1663-1671

Moses A. (2010) 'Google escapes criminal charges for Wi-Fi snooping', The Sydney Morning Herald, 6 December 2010, at http://www.smh.com.au/technology/security/google-escapes-criminal-charges-for-wifi-snooping-20101206-18lot.html

NSWLRC (2005) 'Surveillance' Report 108, NSW Law Reform Commission, 2005, at http://www.lawlink.nsw.gov.au/lawlink/lrc/ll_lrc.nsf/pages/LRC_r108toc

OAIC (2012) '' Office of the Australian Information Commissioner, April 2012, at http://www.comlaw.gov.au/Details/F2012L00869/Explanatory%20Statement/Text

Otterberg A.A. (2005) 'Note: GPS tracking technology: The case for revisiting Knotts and shifting the Supreme Court's theory of the public space under the Fourth Amendment', Boston College Law Review 46 (2005) 661-704

Parenti C. (2003) 'The Soft Cage: Surveillance in America From Slavery to the War on Terror'  Basic Books, 2003

PI (2010a) 'Our Commitment to Privacy', Path Intelligence, 2010, heading changed in late 2012 to 'Privacy by design', at http://www.pathintelligence.com/en/products/footpath/privacy

PI (2010b) 'FootPath Technology', Path Intelligence, 2010, at http://www.pathintelligence.com/en/products/footpath/footpath-technology

PI (2012) 'Retail' Path Intelligence, 2012, at http://www.pathintelligence.com/en/industries/retail

Raper J., Gartner G., Karimi H. & Rizos C. (2007a) 'A critical evaluation of location based services and their potential' Journal of Location Based Services 1, 1 (March 2007) 5-45

Raper J., Gartner G., Karimi H. & Rizos C. (2007b) 'Applications of location-based services: a selected review' Journal of Location Based Services 1, 2 (June 2007) 89-111

RE (2010a) 'IEEE 802.11 standards tutorial' Radio-Electronics.com, apparently of 2010, at http://www.radio-electronics.com/info/wireless/wi-fi/ieee-802-11-standards-tutorial.php

RE (2010b) 'WiMAX IEEE 802.16 technology tutorial' Radio-Electronics.com, apparently of 2010, at http://www.radio-electronics.com/info/wireless/wimax/wimax.php

RE (2012) 'Assisted GPS, A-GPS' Radio-Electronics.com, apparently of 2012, at http://www.radio-electronics.com/info/cellulartelecomms/location_services/assisted_gps.php

Renegar B.D., Michael K. & Michael M.G. (2008) 'Privacy, value and control issues in four mobile business applications' Proc. 7th Int'l Conf. on Mobile Business, 2008, pp. 30-40

Riley J. (2010) 'Gov't 'travesty' in Google privacy case', ITWire, Wednesday 3 November 2010, 20:44, at http://www.itwire.com/it-policy-news/regulation/42898-govt-travesty-in-google-privacy-case

Samuel I.J. (2008) 'Warrantless location tracking', New York University Law Review, 83 (2008) 1324-1352

SHW (2012) 'Skyhook Location Performance', at http://www.skyhookwireless.com/location-technology/performance.php

Skyhook (2012) Website Entries, including 'Frequently Asked Questions' at http://www.skyhookwireless.com/whoweare/faq.php, 'Privacy Policy' at http://www.skyhookwireless.com/whoweare/privacypolicy.php and 'Location Privacy' at http://www.skyhookwireless.com/whoweare/privacy.php

Song C., Qu Z., Blumm N. & Barabási A.-L. (2010) 'Limits of predictability in human mobility' Science 327, 5968 (2010) 1018-1021

USGov (2012) 'GPS Accuracy' National Coordination Office for Space-Based Positioning, Navigation, and Timing, February 2012, at http://www.gps.gov/systems/gps/performance/accuracy/

van Loenen B., Zevenbergen J. & de Jong J. (2009) 'Balancing Location Privacy with National Security: A Comparative Analysis of Three Countries through the Balancing Framework of the European Court Of Human Rights' Ch. 2 of Patten N.J. et al. 'National Security: Institutional Approaches', Nova Science Publishers, 2009

VLRC (2010) 'Surveillance in Public Spaces' Victorian Law Reform Commission, Final Report 18, March 2010, at http://www.lawreform.vic.gov.au/wps/wcm/connect/justlib/Law+Reform/resources/3/6/36418680438a4b4eacc0fd34222e6833/Surveillance_final_report.pdf

Wright D., Friedewald M., Gutwirth S., Langheinrich M., Mordini E., Bellanova R., De Hert P., Wadhwa K. & Bigo D. (2010) 'Sorting out smart surveillance' Computer Law & Security Review 26, 4 (2010) 343-354

Zandbergen P.A. (2012) 'Comparison of WiFi positioning on two mobile devices' Journal of Location Based Services 6, 1 (March 2012) 35-50

Acknowledgements

A preliminary version of the analysis presented in this paper appeared in the November 2011 edition of Precedent, the journal of the Lawyers Alliance. The article has been significantly upgraded as a result of comments provided by the referees and editor.

Author Affiliations

Katina Michael is an Associate Professor in the School of Information Systems and Technology at the University of Wollongong. She is the editor in chief of the IEEE Technology and Society Magazine, is on the editorial board of Computers & Security, and is a co-editor of 'Social Implications of Covert Policing' (2010). She is a Board member of the Australian Privacy Foundation and a representative of the Consumer Federation of Australia.

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in the Cyberspace Law & Policy Centre at the University of N.S.W., and a Visiting Professor in the Research School of Computer Science at the Australian National University. He is currently Chair of the Australian Privacy Foundation, and an Advisory Board member of Privacy International.

Location and tracking of mobile devices: Uberveillance stalks the streets

Abstract

During the last decade, location-tracking and monitoring applications have proliferated, in mobile cellular and wireless data networks, and through self-reporting by applications running in smartphones that are equipped with onboard global positioning system (GPS) chipsets. It is now possible to pinpoint a smartphone user's location not merely to a cell, but to a small area within it. Innovators have been quick to capitalise on these location-based technologies for commercial purposes, and have gained access to a great deal of sensitive personal data in the process. In addition, law enforcement agencies utilise these technologies, can do so inexpensively, and hence can track many more people. Moreover, these agencies seek the power to conduct tracking covertly, and without a judicial warrant. This article investigates the dimensions of the problem of people-tracking through the devices that they carry. Location surveillance has very serious negative implications for individuals, yet there are very limited safeguards. It is incumbent on legislatures to address these problems, through both domestic laws and multilateral processes.

1. Introduction

Personal electronic devices travel with people, are worn by them, and are, or soon will be, inside them. Those devices are increasingly capable of being located, and, by recording the succession of locations, tracked. This creates a variety of opportunities for the people concerned. It also gives rise to a wide range of opportunities for organisations, at least some of which are detrimental to the person's interests.

Commonly, the focus of discussion of this topic falls on mobile phones and tablets. It is intrinsic to the network technologies on which those devices depend that the network operator has at least some knowledge of the location of each handset. In addition, many such devices have onboard global positioning system (GPS) chipsets, and self-report their coordinates to service-providers. The scope of this paper encompasses those already well-known forms of location and tracking, but it extends beyond them.

The paper begins by outlining the various technologies that enable location and tracking, and identifies those technologies' key attributes. The many forms of surveillance are then reviewed, in order to establish a framework within which applications of location and tracking can be characterised. Applications are described, and their implications summarised. Controls are considered, whereby potential harm to the interests of individuals can be prevented or mitigated.

2. Relevant technologies

The technologies considered here involve a device that has the following characteristics:

• it is conveniently portable by a human, and

• it emits signals that:

• enable some other device to compute the location of the device (and hence of the person), and

• are sufficiently distinctive that the device is reliably identifiable, at least among those in the vicinity, so that the device's (and hence the person's) successive locations can be detected and combined into a trail.

The primary form-factors for mobile devices are currently clam-shape (portable PCs), thin rectangles suitable for the hand (mobile phones), and flat forms (tablets). Many other form-factors are also relevant, however. Anklets imposed on dangerous prisoners, and even as conditions of bail, carry RFID tags. Chips are carried in cards of various sizes, particularly the size of credit-cards, and used for tickets for public transport and entertainment venues, aircraft boarding-passes, toll-road payments and in some countries to carry electronic cash. Chips may conduct transactions with other devices by contact-based means, or contactless, using radio-frequency identification (RFID) or its shorter-range version near-field communication (NFC) technologies. These capabilities are in credit and debit cards in many countries. Transactions may occur with the cardholder's knowledge, with their express consent, and with an authentication step to achieve confidence that the person using the card is authorised to do so. In a variety of circumstances, however, some and even all of those safeguards are dispensed with. The electronic versions of passports that are commonly now being issued carry such a chip, and have an autonomous communications capability. The widespread issue of cards with capabilities uncontrolled by, and in many cases unknown to, the cardholder, is causing consternation among segments of the population that have become aware of the schemes.

Such chips can be readily carried in other forms, including jewellery such as finger-rings, and belt-buckles. Endo-prostheses such as replacement hips and knees and heart pacemakers can readily carry chips. A few people have voluntarily embedded chips directly into their bodies for such purposes as automated entry to premises (Michael and Michael, 2009).

In order to locate and track such devices, any sufficiently distinctive signals may in principle suffice. See Raper et al. (2007a) and Mautz (2011). In practice, the signals involved are commonly those transmitted by a device in order to take advantage of wireless telecommunications networks. The scope of the relevant technologies therefore also encompasses the signals, devices that detect the signals, and the networks over which the data that the signals contain are transmitted.

In wireless networks, it is generally the case that the base-station or router needs to be aware of the identities of devices that are currently within the cell. A key reason for this is to conserve limited transmission capacity by sending messages only when the targeted device is known to be in the cell. This applies to all of:

• cellular mobile networks, originally designed for voice telephony and extended to data (in particular the ‘2G'/‘2.5G' standards GSM/GPRS, the ‘3G' standards CDMA2000 and UMTS/HSPA, and the ‘4G' standard LTE)

• wireless local area networks (WLANs, commonly Wifi/IEEE 802.11x – RE, 2010a)

• wireless wide area networks (WWANs, commonly WiMAX/IEEE 802.16x – RE, 2010b).

Devices in such networks are uniquely identified by various means (Clarke and Wigan, 2011). In cellular networks, there is generally a clear distinction between the entity (the handset) and the identity it is adopting at any given time (which is determined by the module inserted in it). Depending on the particular standards used, what is commonly referred to as ‘the SIM-card’ is an R-UIM, a CSIM or a USIM. These modules store an International Mobile Subscriber Identity (IMSI), which constitutes the handset's identifier. Among other things, this enables network operators to determine whether or not to provide service, and what tariff to apply to the traffic. However, cellular network protocols may also involve transmission of a code that distinguishes the handset itself, within which the module is currently inserted. A useful generic term for this is the device ‘entifier’ (Clarke, 2009b). Under the various standards, it may be referred to as an International Mobile Equipment Identity (IMEI), ESN, or MEID.
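As an aside, the final digit of an IMEI is a Luhn check digit, so a system that captures entifiers can at least verify that an observed value is well-formed before logging it. A minimal illustrative sketch in Python (the function name is ours, and this is not drawn from the cited standards documents):

```python
def imei_check_digit_ok(imei: str) -> bool:
    """Return True if a 15-digit IMEI passes its Luhn check digit."""
    if len(imei) != 15 or not imei.isdigit():
        return False
    total = 0
    for i, ch in enumerate(reversed(imei)):
        d = int(ch)
        if i % 2 == 1:      # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9      # equivalent to summing the two digits
        total += d
    return total % 10 == 0
```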

Vendor-specific solutions may also provide additional functionality to a handset, unbeknown to the end-user. For example, every mobile device manufactured by Apple has a 40-character Unique Device Identifier (UDID), which enabled not only Apple itself, but also marketers, to track devices and hence their users. It has also been alleged that data emanating from these devices is routinely accessible to law enforcement agencies. Since late 2012, Apple has prevented marketers from using the UDID, but has added an Identifier for Advertisers (IFA or IDFA). The IFA is temporary and can be blocked, but it is open for tracking by default, and turning it off is difficult and likely to result in reduced services (Edwards, 2012). In short, Apple devices are specifically designed to enable tracking of consumers by Apple, by any government agency that has authority to gain access to the data, and by all consumer-marketing corporations, although in the last case with a low-grade option available to the user to suppress tracking.

In Wifi and WiMAX networks, the device entifier may be a processor-id or more commonly a network interface card identifier (NIC Id). In various circumstances, other device-identifiers may be used, such as a phone number, or an IP-address may be used as a proxy. In addition, the human using the device may be directly identified, e.g. by means of a user-account name.

A WWAN cell may cover a large area, with an indicative radius of 50 km. Telephony cells may have a radius as large as 2–3 km or as little as a hundred metres. WLANs using Wifi technologies have a cell-size of less than 1 ha, indicatively a 50–100 m radius, but in practice often constrained by environmental factors to only 10–30 m.

The base-station or router knows the identities of devices that are within its cell, because this is a technically necessary feature of the cell's operation. Mobile devices auto-report their presence 10 times per second. Meanwhile, the locations of base-stations for cellular services are known with considerable accuracy by the telecommunications providers. And, in the case of most private Wifi services, the location of the router is mapped to c. 30–100 m accuracy by services such as Skyhook and Google Locations, which perform what have been dubbed ‘war drives’ in order to maintain their databases – in Google's case in probable violation of the telecommunications interception and/or privacy laws of at least a dozen countries (EPIC, 2012).

Knowing that a device is within a particular mobile phone, WiMAX or Wifi cell provides only a rough indication of location. In order to generate a more precise estimate, within a cell, several techniques are used (McGuire et al., 2005). These include the following (adapted from Clarke and Wigan, 2011; see also Figueiras and Frattasi, 2010):

• directional analysis. A single base-station may comprise multiple receivers at known locations and pointed in known directions, enabling the handset's location within the cell to be reduced to a sector within the cell, and possibly a narrow one, although without information about the distance along the sector;

• triangulation. This involves multiple base-stations serving a single cell, at known locations some distance apart, and each with directional analysis capabilities. Particularly with three or more stations, this enables an inference that the device's location is within a small area at the intersection of the multiple directional plots;

• signal analysis. This involves analysis of the characteristics of the signals exchanged between the handset and base-station, in order to infer the distance between them. Relevant signal characteristics include the apparent response-delay (Time Difference of Arrival – TDOA, also referred to as multilateration), and strength (Received Signal Strength Indicator – RSSI), perhaps supplemented by direction (Angle Of Arrival – AOA).
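In the simplest case, the second and third techniques reduce to solving intersecting range circles. The sketch below is illustrative only (the function name, coordinates and the assumption of exact ranges are ours, not drawn from the cited sources): it linearises the three circle equations by subtracting the first from the other two, and solves the resulting 2×2 system.

```python
def trilaterate(stations, dists):
    """Estimate (x, y) from three base-stations at known positions and
    range estimates inferred from signal analysis."""
    (x1, y1), (x2, y2), (x3, y3) = stations
    d1, d2, d3 = dists
    # Linear system A @ [x, y] = b, from subtracting circle equations.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    if abs(det) < 1e-12:
        raise ValueError("stations are collinear; no unique fix")
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

With noisy ranges derived from RSSI or TDOA measurements, practical systems solve an over-determined version of the same system by least squares across many base-stations.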

The precision and reliability of these techniques varies greatly, depending on the circumstances prevailing at the time. The variability and unpredictability result in many mutually inconsistent statements by suppliers, in the general media, and even in the technical literature.

Techniques for cellular networks generally provide reasonably reliable estimates of location to within an indicative 50–100 m in urban areas and some hundreds of metres elsewhere. Worse performance has been reported in some field-tests, however. For example, Dahunsi and Dwolatzky (2012) found the accuracy of GSM location in Johannesburg to be in the range 200–1400 m, and highly variable, with “a huge difference between the predicted and provided accuracies by mobile location providers”.

The website of the Skyhook Wifi-router positioning service claims 10-m accuracy, 1-s time-to-first-fix and 99.8% reliability (SHW, 2012). On the other hand, tests have resulted in far lower accuracy measures, including an average positional error of 63 m in Sydney (Gallagher et al., 2009) and “median values for positional accuracy in [Las Vegas, Miami and San Diego, which] ranged from 43 to 92 metres… [and] the replicability… was relatively poor” (Zandbergen, 2012, p. 35). Nonetheless, a recent research article suggested the feasibility of “uncooperatively and covertly detecting people ‘through the wall’ [by means of their WiFi transmissions]” (Chetty et al., 2012).

Another way in which a device's location may become known to other devices is through self-reporting of the device's position, most commonly by means of an inbuilt Global Positioning System (GPS) chipset. This provides coordinates and altitude based on broadcast signals received from a network of satellites. In any particular instance, the user of the device may or may not be aware that location is being disclosed.

Despite widespread enthusiasm and a moderate level of use, GPS is subject to a number of important limitations. The signals are subject to interference from atmospheric conditions, buildings and trees, and the time to achieve a fix on enough satellites and deliver a location measure may be long. This results in variability in its practical usefulness in different circumstances, and in its accuracy and reliability. Civil-use GPS coordinates are claimed to provide accuracy within a theoretical 7.8 m at a 95% confidence level (USGov, 2012), but various reports suggest 15 m, 20 m or 30 m, and sometimes as much as 100 m, in practice. It may be affected by radio interference and jamming. The original and still-dominant GPS service operated by the US Government was subject to intentional degradation in the US's national interests. This ‘Selective Availability' feature still exists, although subject to a decade-long policy not to use it; and future generations of GPS satellites may no longer support it.
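To give error figures of this order concrete meaning, the standard haversine formula converts a difference in coordinates into metres on the ground: a shift of 0.0001° of latitude, for instance, is roughly 11 m, of the same order as the claimed civil-use accuracy. A minimal sketch (illustrative only; the function name is ours):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two coordinate pairs."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))
```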

Hybrid schemes exist that use two or more sources in order to generate more accurate location-estimates, or to generate estimates more quickly. In particular, Assisted GPS (A-GPS) utilises data from terrestrial servers accessed over cellular networks in order to more efficiently process satellite-derived data (e.g. RE, 2012).

Further categories of location and tracking technologies emerge from time to time. A current example uses means described by the present authors as ‘mobile device signatures’ (MDS). A device may monitor the signals emanating from a user's mobile device, without being part of the network that the user's device is communicating with. The eavesdropping device may detect particular signal characteristics that distinguish the user's mobile device from others in the vicinity. In addition, it may apply any of the various techniques mentioned above, in order to locate the device. If the signal characteristics are persistent, the eavesdropping device can track the user's mobile device, and hence the person carrying it. No formal literature on MDS has yet been located. The supplier's brief description is at PI (2010b).
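In the absence of formal literature, the essence of such tracking can be sketched abstractly: each sighting records a timestamp, a location, and whatever persistent signature the eavesdropping device can extract, and sightings that share a signature are chained into a per-device trail. The data shapes below are hypothetical, and not a description of any vendor's system:

```python
from collections import defaultdict

def build_trails(observations):
    """Group timestamped sightings into per-device trails, keyed on a
    persistent signal signature (e.g. a hypothetical radio fingerprint).
    Each observation is a (timestamp, signature, location) tuple."""
    trails = defaultdict(list)
    for timestamp, signature, location in sorted(observations):
        trails[signature].append((timestamp, location))
    return dict(trails)
```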

The various technologies described in this section are capable of being applied to many purposes. The focus in this paper is on their application to surveillance.

3. Surveillance

The term surveillance refers to the systematic investigation or monitoring of the actions or communications of one or more persons (Clarke, 2009c). Until recent times, surveillance was visual, and depended on physical proximity of an observer to the observed. The volume of surveillance conducted was kept in check by the costs involved. Surveillance aids and enhancements emerged, such as binoculars and, later, directional microphones. During the 19th century, the post was intercepted, and telephones were tapped. During the 20th century, cameras enabled transmission of image, video and sound to remote locations, and recording for future use (e.g. Parenti, 2003).

With the surge in stored personal data that accompanied the application of computing to administration in the 1970s and 1980s, dataveillance emerged (Clarke, 1988). Monitoring people through their digital personae rather than through physical observation of their behaviour is much more economical, and hence many more people can be subjected to it (Clarke, 1994). The dataveillance epidemic made it more important than ever to clearly distinguish between personal surveillance – of an identified person who has previously come to attention – and mass surveillance – of many people, not necessarily previously identified, about some or all of whom suspicion could be generated.

Location data is of a very particular nature, and hence it has become necessary to distinguish location surveillance as a sub-set of the general category of dataveillance. There are several categories of location surveillance with different characteristics (Clarke and Wigan, 2011):

• capture of an individual's location at a point in time. Depending on the context, this may support inferences being drawn about an individual's behaviour, purpose, intention and associates

• real-time monitoring of a succession of locations and hence of the person's direction of movement. This is far richer data, and supports much more confident inferences being drawn about an individual's behaviour, purpose, intention and associates

• predictive tracking, by extrapolation from the person's direction of movement, enabling inferences to be drawn about near-future behaviour, purpose, intention and associates

• retrospective tracking, on the basis of the data trail of the person's movements, enabling reconstruction of a person's behaviour, purpose, intention and associates at previous times
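The predictive category need involve nothing more sophisticated than extrapolating velocity from the last two fixes in a trail. The sketch below is a deliberately minimal illustration (the function name and data shape are ours; real systems apply far richer movement models, e.g. Song et al., 2010):

```python
def predict_position(trail, t_future):
    """Linearly extrapolate a future (x, y) fix from the last two
    timestamped fixes in a trail of (t, x, y) tuples."""
    (t1, x1, y1), (t2, x2, y2) = trail[-2], trail[-1]
    if t2 == t1:
        return x2, y2           # no elapsed time; cannot estimate velocity
    vx = (x2 - x1) / (t2 - t1)  # velocity components from the last leg
    vy = (y2 - y1) / (t2 - t1)
    dt = t_future - t2
    return x2 + vx * dt, y2 + vy * dt
```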

Information arising at different times, and from different forms of surveillance, can be combined, in order to offer a more complete picture of a person's activities, and enable yet more inferences to be drawn, and suspicions generated. This is the primary sense in which the term ‘überveillance' is applied: “Überveillance has to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought). Überveillance can be a predictive mechanism for a person's expected behaviour, traits, likes, or dislikes; or it can be based on historical fact; or it can be something in between… Überveillance is more than closed circuit television feeds, or cross-agency databases linked to national identity cards, or biometrics and ePassports used for international travel. Überveillance is the sum total of all these types of surveillance and the deliberate integration of an individual's personal data for the continuous tracking and monitoring of identity and location in real time” (Michael and Michael, 2010; see also Michael and Michael, 2007; Michael et al., 2008, 2010; Clarke, 2010).

A comprehensive model of surveillance includes consideration of geographical scope, and of temporal scope. Such a model assists the analyst in answering key questions about surveillance: of what? for whom? by whom? why? how? where? and when? (Clarke, 2009c). Distinctions are also needed based on the extent to which the subject has knowledge of surveillance activities. It may be overt or covert. If covert, it may be merely unnotified, or alternatively express measures may be undertaken in order to obfuscate, and achieve secrecy. A further element is the notion of ‘sousveillance’, whereby the tools of surveillance are applied, by those who are commonly watched, against those who are commonly the watchers (Mann et al., 2003).

These notions are applied in the following sections in order to establish the extent to which location and tracking of mobile devices is changing the game of surveillance, and to demonstrate that location surveillance is intruding more deeply into personal freedoms than previous forms of surveillance.

4. Applications

This section presents a typology of applications of mobile device location, as a means of narrowing down to the kinds of uses that have particularly serious privacy implications. These are commonly referred to as location-based services (LBS). One category of applications provides information services for the benefit of the mobile device's user, such as navigation aids, and search and discovery tools for the locations of particular, identified organisations, and of organisations that sell particular goods and services. Users of LBS of these kinds can reasonably be assumed to be aware that they are disclosing their location. Depending on the design, the disclosures may also be limited to specific service-providers and specific purposes, and the transmissions may be secured.

Another, very different category of application is use by law enforcement agencies (LEAs). The US E-911 mandate of 1999 was nominally a public safety measure, to enable people needing emergency assistance to be quickly and efficiently located. In practice, the facility also delivered LEAs means for locating and tracking people of interest, through their mobile devices. Personal surveillance may be justified by reasonable grounds for suspicion that the subject is involved in serious crime, and may be specifically authorised by judicial warrant. Many countries have always been very loose in their control over LEAs, however, and many others have drastically weakened their controls since 2001. Hence, in any given jurisdiction and context, each and all of the controls may be lacking.

Yet worse, LEAs use mobile location and tracking for mass surveillance, without any specific grounds for suspicion about any of the many people caught up in what is essentially a dragnet-fishing operation (e.g. Mery, 2009). Examples might include monitoring the area adjacent to a meeting-venue watching out for a blacklist of device-identifiers known to have been associated with activists in the past, or collecting device-identifiers for use on future occasions. In addition to netting the kinds of individuals who are of legitimate interest, the ‘by-catch’ inevitably includes threatened species. There are already extraordinarily wide-ranging (and to a considerable extent uncontrolled) data retention requirements in many countries.

Of further concern is the use of Automated Number Plate Recognition (ANPR) for mass surveillance purposes. This has been out of control in the UK since 2006, and has been proposed or attempted in various other countries as well (Clarke, 2009a). Traffic surveillance is expressly used not only for retrospective analysis of the movements of individuals of interest to LEAs, but also as a means of generating suspicions about other people (Lewis, 2008).

Beyond LEAs, many government agencies perform social control functions, and may be tempted to conduct location and tracking surveillance. Examples would include benefits-paying organisations tracking the movements of benefits-recipients about whom suspicions have arisen. It is not too far-fetched to anticipate zealous public servants concerned about fraud control imposing location surveillance on all recipients of some particularly valuable benefit, or as a security precaution on every person visiting a sensitive area (e.g. a prison, a power plant, a national park).

Various forms of social control are also exercised by private sector organisations. Some of these organisations, such as placement services for the unemployed, may be performing outsourced public sector functions. Others, such as workers' compensation providers, may be seeking to control personal insurance claimants, and similarly car-hire companies and insurance providers may wish to monitor motor vehicles' distance driven and roads used (Economist, 2012; Michael et al., 2006b).

A further privacy-invasive practice that is already common is the acquisition of location and tracking data by marketing corporations, as a by-product of the provision of location-based services, but with the data then applied to further purposes other than that for which it was intended. Some uses rely on statistical analysis of large holdings (‘data mining’). Many uses are, on the other hand, very specific to the individual, and are for such purposes as direct or indirect targeting of advertisements and the sale of goods and services. Some of these applications combine location data with data from other sources, such as consumer profiling agencies, in order to build up such a substantial digital persona that the individual's behaviour is readily influenced. This takes the activity into the realms of überveillance.
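The linkage step described above, in which location traces are combined with externally sourced profile data to assemble a digital persona, can be sketched in a few lines. The identifiers, attributes and segment labels below are hypothetical, not any vendor's actual schema:

```python
# Hypothetical linkage of location traces with a bought-in consumer
# profile, keyed on a shared device identifier -- the step that turns
# raw location fixes into a richer 'digital persona'.
locations = {"dev-77": ["gym", "casino", "casino", "pharmacy"]}
profiles = {"dev-77": {"age_band": "35-44", "segment": "high-spend"}}

# Join the two holdings on the common key.
personas = {
    dev: {"profile": profiles.get(dev, {}), "places": places}
    for dev, places in locations.items()
}

print(personas["dev-77"]["profile"]["segment"])  # high-spend
```

The point of the sketch is how little is required: once any common key exists between two holdings, the combination is a dictionary join, and the resulting record is far more revealing than either source alone.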

All such services raise serious privacy concerns, because the data is intensive and sensitive, and attractive to organisations. Companies may gain rights in relation to the data through market power, or by trickery – such as exploitation of a self-granted right to change the Terms of Service (Clarke, 2011). Once captured, the data may be re-purposed by any organisation that gains access to it, because the value is high enough that they may judge the trivial penalties that generally apply to breaches of privacy laws to be well worth the risk.

A recently-emerged, privacy-invasive practice is the application of the mobile device signature (MDS) form of tracking, in such locations as supermarkets. This is claimed by its providers to offer deep observational insights into the behaviour of customers, including dwell times in front of displays, possibly linked with the purchaser's behaviour. This raises concerns a little different from other categories of location and tracking technologies, and is accordingly considered in greater depth in the following section.

It is noteworthy that an early review identified a wide range of LBS, which the authors classified into mobile guides, transport, gaming, assistive technology and location-based health (Raper et al., 2007b). Yet that work completely failed to notice that a vast array of applications were emergent in surveillance, law enforcement and national security, despite the existence of relevant literature from at least 1999 onwards (Clarke, 2001; Michael and Masters, 2006).

5. Implications

The previous sections have introduced many examples of risks to citizens and consumers arising from location surveillance. This section presents an analysis of the categories and of the degree of seriousness with which they should be viewed. The first topic addressed is the privacy of personal location data. Other dimensions of privacy are then considered, and then the specific case of MDS is examined. The treatment here is complementary to earlier articles that have looked more generally at particular applications such as location-based mobile advertising, e.g. Cleff (2007, 2010) and King and Jessen (2010). See also Art. 29 (2011).

5.1. Locational privacy

Knowing where someone has been, knowing what they are doing right now, and being able to predict where they might go next is a powerful tool for social control and for chilling behaviour (Abbas, 2011). Humans do not move around in a random manner (Song et al., 2010).

One interpretation of ‘locational privacy’ is that it “is the ability of an individual to move in public space with the expectation that under normal circumstances their location will not be systematically and secretly recorded for later use” (Blumberg and Eckersley, 2009). A more concise definition is “the ability to control the extent to which personal location information is… [accessible and] used by others” (van Loenen et al., 2009). Hence ‘tracking privacy’ is the interest an individual has in controlling information about their sequence of locations.

Location surveillance is deeply intrusive into data privacy, because it is very rich, and enables a great many inferences to be drawn (Clarke, 2001; Dobson and Fisher, 2003; Michael et al., 2006a; Clarke and Wigan, 2011). As demonstrated by Raper et al. (2007a, pp. 32–33), most of the technical literature that considers privacy is merely concerned about it as an impediment to deployment and adoption, and how to overcome the barrier rather than how to solve the problem. Few authors adopt a positive approach to privacy-protective location technologies. The same authors' review of applications (Raper et al., 2007b) includes a single mention of privacy, and that is in relation to just one of the scores of sub-categories of application that they catalogue.

Most service-providers are cavalier in their handling of personal data, and extravagant in their claims. For example, Skyhook claims that it “respects the privacy of all users, customers, employees and partners”; but, significantly, it makes no mention of the privacy of the people whose locations, through the locations of their Wifi routers, it collects and stores (Skyhook, 2012).

Consent is critical in such LBS as personal location chronicle systems, people-followers and footpath route-tracker systems that systematically collect personal location information from a device they are carrying (Collier, 2011c). The data handled by such applications is highly sensitive because it can be used to conduct behavioural profiling of individuals in particular settings. The sensitivity exists even if the individuals remain ‘nameless’, i.e. if each identifier is a temporary or pseudo-identifier and is not linked to other records. Service-providers, and any other organisations that gain access to the data, achieve the capacity to make judgements on individuals based on their choices of, for example, which retail stores they walk into and which they do not. For example, if a subscriber visits a particular religious bookstore within a shopping mall on a weekly basis, the assumption can be reasonably made that they are in some way affiliated to that religion (Samuel, 2008).
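The profiling described above needs no name, only a stable pseudo-identifier and a log of zone visits. A minimal sketch of the inference, in which the identifiers, store names and visit threshold are all hypothetical:

```python
from collections import Counter
from datetime import datetime

# Hypothetical visit log: (pseudo_id, store, timestamp). Note that no
# name or phone number is needed for the inference to work -- weekly
# visits alone support the affiliation judgement.
visits = [
    ("d41c", "religious_bookstore", datetime(2012, 3, 3)),
    ("d41c", "religious_bookstore", datetime(2012, 3, 10)),
    ("d41c", "religious_bookstore", datetime(2012, 3, 17)),
    ("9f2a", "electronics", datetime(2012, 3, 5)),
]

# Count visits to each store per pseudo-identifier.
counts = Counter((pid, store) for pid, store, _ in visits)

# Flag pseudo-identifiers with repeated visits to a 'sensitive' store:
# the profiler never learns who 'd41c' is, yet still draws the inference.
SENSITIVE = {"religious_bookstore"}
flagged = {pid for (pid, store), n in counts.items()
           if store in SENSITIVE and n >= 3}

print(flagged)  # {'d41c'}
```

This is why the 'namelessness' of a pseudo-identifier offers so little protection: the judgement attaches to the device, and hence to its carrier, whether or not a name is ever linked in.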

It is frequently asserted that individuals cannot have a reasonable expectation of privacy in a public space (Otterberg, 2005). Contrary to those assertions, however, privacy expectations always have existed in public places, and continue to exist (VLRC, 2010). Tracking the movements of people as they go about their business is a breach of a fundamental expectation that people will be ‘let alone’. In policing, for example, in most democratic countries, it is against the law to covertly track an individual or their vehicle without specific, prior approval in the form of a warrant. This principle has, however, been compromised in many countries since 2001. Warrantless tracking using a mobile device generally results in the evidence, which has been obtained without the proper authority, being inadmissible in a court of law (Samuel, 2008). Some law enforcement agencies have argued for the abolition of the warrant process because the bureaucracy involved may mean that the suspect cannot be prosecuted for a crime they have likely committed (Ganz, 2005). These issues are not new; but far from eliminating a warrant process, the appropriate response is to invest the energy in streamlining this process (Bronitt, 2010).

Privacy risks arise not only from locational data of high integrity, but also from data that is or becomes associated with a person and that is inaccurate, misleading, or wrongly attributed to that individual. High levels of inaccuracy and unreliability were noted above in respect of all forms of location and tracking technologies. In the case of MDS services, claims have been made of 1–2 m locational accuracy. This has yet to be supported by experimental test cases, however, and hence there is uncertainty about the reliability of the inferences that the service-provider or the shop-owner draws. If the data is the subject of a warrant or subpoena, its inaccuracy could result in false accusations and even a miscarriage of justice, with the ‘wrong person’ finding themselves in the ‘right place’ at the ‘right time’.

5.2. Privacy more broadly

Privacy has multiple dimensions. One analysis, in Clarke (2006a), identifies four distinct aspects. Privacy of Personal Data, variously also ‘data privacy’ and ‘information privacy’, is the most widely discussed dimension of the four. Individuals claim that data about themselves should not be automatically available to other individuals and organisations, and that, even where data is possessed by another party, the individual must be able to exercise a substantial degree of control over that data and its use. The last five decades have seen the application of information technologies to a vast array of abuses of data privacy. The degree of privacy intrusiveness is a function of both the intensity and the richness of the data. Where multiple sources are combined, the impact is particularly likely to chill behaviour. An example is the correlation of video-feeds with mobile device tracking. The previous sub-section addressed that dimension.

Privacy of the Person, or ‘bodily privacy’, extends from freedom from torture and right to medical treatment, via compulsory immunisation and imposed treatments, to compulsory provision of samples of body fluids and body tissue, and obligations to submit to biometric measurement. Locational surveillance gives rise to concerns about personal safety. Physical privacy is directly threatened where a person who wishes to inflict harm is able to infer the present or near-future location of their target. Dramatic examples include assassins, kidnappers, ‘standover merchants’ and extortionists. But even people who are neither celebrities nor notorieties are subject to stalking and harassment (Fusco et al., 2012).

Privacy of Personal Communications is concerned with the need of individuals for freedom to communicate among themselves, without routine monitoring of their communications by other persons or organisations. Issues include ‘mail covers’, the use of directional microphones, ‘bugs’ and telephonic interception, with or without recording apparatus, and third-party access to email-messages. Locational surveillance thereby creates new threats to communications privacy. For example, the equivalent of ‘call records’ can be generated by combining the locations of two device-identifiers in order to infer that a face-to-face conversation occurred.
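The location-derived equivalent of a 'call record' amounts to a simple co-location test over two devices' fixes. A sketch follows, with invented coordinates, and with distance and time thresholds that are assumptions rather than any operator's actual parameters:

```python
import math

# Hypothetical location fixes: device_id -> list of (t_seconds, x_m, y_m).
fixes = {
    "A": [(0, 0.0, 0.0), (60, 5.0, 5.0)],
    "B": [(55, 7.0, 6.0), (120, 200.0, 90.0)],
}

def colocated(fixes_a, fixes_b, max_dist_m=10.0, max_dt_s=30.0):
    """Infer a possible face-to-face encounter: any pair of fixes that
    are close in both space and time (thresholds are assumptions)."""
    for ta, xa, ya in fixes_a:
        for tb, xb, yb in fixes_b:
            close_in_time = abs(ta - tb) <= max_dt_s
            close_in_space = math.hypot(xa - xb, ya - yb) <= max_dist_m
            if close_in_time and close_in_space:
                return True
    return False

print(colocated(fixes["A"], fixes["B"]))  # True
```

Two devices five seconds and a few metres apart are inferred to have 'met', even though no communication ever traversed a network; this is why locational surveillance creates a new threat to communications privacy rather than merely duplicating interception.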

Privacy of Personal Behaviour encompasses ‘media privacy’, but particular concern arises in relation to sensitive matters such as sexual preferences and habits, political activities and religious practices. Some privacy analyses, particularly in Europe, extend this discussion to personal autonomy, liberty and the right of self-determination (e.g. King and Jessen, 2010). The notion of ‘private space’ is vital to economic and social aspects of behaviour, is relevant in ‘private places’ such as the home and toilet cubicles, but is also relevant and important in ‘public places’, where systematic observation and the recording of images and sounds are far more intrusive than casual observation by the few people in the vicinity.

Locational surveillance gives rise to rich sets of data about individuals' activities. The knowledge, or even suspicion, that such surveillance is undertaken, chills their behaviour. The chilling factor is vital in the case of political behaviour (Clarke, 2008). It is also of consequence in economic behaviour, because the inventors and innovators on whom new developments depend are commonly ‘different-thinkers’ and even ‘deviants’, who are liable to come to attention in mass surveillance dragnets, with the tendency to chill their behaviour, their interactions and their creativity.

Surveillance that generates accurate data is one form of threat. Surveillance that generates inaccurate data, or wrongly associates data with a particular person, is dangerous as well. Many inferences that arise from inaccurate data will be wrong, of course, but that won't prevent those inferences being drawn, resulting in unjustified behavioural privacy invasiveness, including unjustified association with people who are, perhaps for perfectly good reasons, themselves under suspicion.

In short, all dimensions of privacy are seriously affected by location surveillance. For deeper treatments of the topic, see Michael et al. (2006b) and Clarke and Wigan (2011).

5.3. Locational privacy and MDS

The recent innovation of tracking by means of mobile device signatures (MDS) gives rise to some issues additional to, or different from, mainstream device location technologies. This section accordingly considers this particular technique's implications in greater depth. Limited reliable information is currently available, and the analysis is of necessity based on supplier-published sources (PI, 2010a, 2010b) and media reports (Collier, 2011a,b,c).

Path Intelligence (PI) markets an MDS service to shopping mall-owners, to enable them to better value their floor space in terms of rental revenues, and to identify points of on-foot traffic congestion to on-sell physical advertising and marketing floor space (PI, 2010a). The company claims to detect each phone (and hence person) that enters a zone, and to capture data, including:

• how long each device and person stay, including dwell times in front of shop windows;

• repeat visits by shoppers in varying frequency durations; and

• typical route and circuit paths taken by shoppers as they go from shop to shop during a given shopping experience.

For malls, PI is able to denote such things as whether or not shoppers who shop at one establishment will also shop at another in the same mall, and whether or not people will go out of their way to visit a particular retail outlet independent of its location. For retailers, PI says it is able to provide information on conversion rates by department or even product line, and even which areas of the store might require more attention by staff during specific times of the day or week (PI, 2012).
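Metrics of this kind — dwell times, repeat visits, routes — follow mechanically from a stream of timestamped zone detections. A minimal illustration of dwell-time computation is given below; the detection stream and zone names are invented, and this is a sketch of the general technique, not PI's actual algorithm:

```python
from itertools import groupby

# Hypothetical detection stream for one pseudo-identifier:
# (timestamp_seconds, zone). Consecutive detections in the same zone
# are merged into a single 'dwell'.
pings = [(0, "entrance"), (30, "shop_window_3"), (60, "shop_window_3"),
         (150, "shop_window_3"), (180, "food_court")]

dwells = []
for zone, group in groupby(pings, key=lambda p: p[1]):
    ts = [t for t, _ in group]
    # Dwell time = span between first and last detection in the zone.
    dwells.append((zone, ts[-1] - ts[0]))

print(dwells)  # [('entrance', 0), ('shop_window_3', 120), ('food_court', 0)]
```

From the same stream, repeat visits are just recurrences of a zone for the same pseudo-identifier across days, and 'circuit paths' are the sequence of zone names; nothing in the computation requires the shopper's identity.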

PI says that it uses “complex algorithms” to denote the geographic position of a mobile phone, using strategically located “proprietary equipment” in a campus setting (PI, 2010a). The company states that it is conducting “data-driven analysis”, but is not collecting, or at least that it is not disclosing, any personal information such as a name, mobile telephone number or contents of a short message service (SMS). It states that it only ever provides aggregated data at varying zone levels to the shopping mall-owners. This is presumably justified on the basis that, using MDS techniques, direct identifiers are unlikely to be available, and a pseudo-identifier needs to be assigned. There is no explicit definition of what constitutes a zone. It is clear, however, that minimally-aggregated data at the highest geographic resolution is available for purchase, and at a higher price than more highly-aggregated data.

Shoppers have no relationship with the company, and it appears unlikely that they would even be aware that data about them is being collected and used. The only disclosure appears to be that “at each of our installations our equipment is clearly visible and labelled with our logo and website address” (PI, 2010a), but this is unlikely to be visible to many people, and in any case would not inform anyone who saw it.

In short, the company is generating revenue by monitoring signals from the mobile devices of people who visit a shopping mall for the purchase of goods and services. The data collection is performed without the knowledge of the person concerned (Renegar et al., 2008). The company is covertly collecting personal data and exploiting it for profit. There is no incentive or value proposition for the individual whose mobile is being tracked. No clear statement is provided about collection, storage, retention, use and disclosure of the data (Arnold, 2008). Even if privacy were not a human right, this would demand statutory intervention on the public policy grounds of commercial unfairness. The company asserts that “our privacy approach has been reviewed by the [US Federal Trade Commission] FTC, which determined that they are comfortable with our practices” (PI, 2010a). It makes no claims of such ‘approval’ anywhere else in the world.

The service could be extended beyond a mall and the individual stores within it, to, for example, associated walkways and parking areas, and surrounding areas such as government offices, entertainment zones and shopping-strips. Applications can also be readily envisaged on hospital and university campuses, and in airports and other transport hubs. From prior research, this is likely to expose the individual's place of employment, and even their residence (Michael et al., 2006a,b). Even if only aggregated data is sold to businesses, the individual records remain available to at least the service-provider.

The scope exists to combine this form of locational surveillance with video-surveillance such as in-store CCTV, and indeed this is claimed to be already a feature of the company's offering to retail stores. To the extent that a commonly-used identifier can be established (e.g. through association with the person's payment or loyalty card at a point-of-sale), the full battery of local and externally acquired customer transaction histories and consolidated ‘public records’ data can be linked to in-store behaviour (Michael and Michael, 2007). Longstanding visual surveillance is intersecting with well-established data surveillance, and being augmented by locational surveillance, giving breath to dataveillance, or what is now being referred to by some as ‘smart surveillance’ (Wright et al., 2010; IBM, 2011).

Surreptitious collection of personal data is (with exemptions and exceptions) largely against the law, even when undertaken by law enforcement personnel. The MDS mechanism also flies in the face of telephonic interception laws. How, then, can it be in any way acceptable for a form of warrantless tracking to be undertaken by or on behalf of corporations or mainstream government agencies, of shoppers in a mall, or travellers in an airport, or commuters in a transport hub? Why should a service-provider have the right to do what a law enforcement agency cannot normally do?

6. Controls

The tenor of the discussion to date has been that location surveillance harbours enormous threats to location privacy, but also to personal safety, the freedom to communicate, freedom of movement, and freedom of behaviour. This section examines the extent to which protections exist, firstly in the form of natural or intrinsic controls, and secondly in the form of legal provisions. The existing safeguards are found to be seriously inadequate, and it is therefore necessary to also examine the prospects for major enhancements to law, in order to achieve essential protections.

6.1. Intrinsic controls

A variety of forms of safeguard exist against harmful technologies and unreasonable applications of them. The intrinsic economic control has largely evaporated, partly because the tools use electronics and the components are produced in high volumes at low unit cost. Another reason is that the advertising and marketing sectors are highly sophisticated, already hold and exploit vast quantities of personal data, and are readily geared up to exploit yet more data.

Neither the oxymoronic notion of ‘business ethics’ nor the personal morality of executives in business and government acts as any significant brake on the behaviours of corporations and governments, because they are very weak barriers, and they are readily rationalised away in the face of claims of enhanced efficiencies in, for example, marketing communications, fraud control, criminal justice and control over anti-social behaviour.

A further category of intrinsic control is ‘self-regulatory’ arrangements within relevant industry sectors. In 2010, for example, the Australian Mobile Telecommunications Association (AMTA) released industry guidelines to promote the privacy of people using LBS on mobile devices (AMTA, 2010). The guidelines were as follows:

1. Every LBS must be provided on an opt-in basis with a specific request from a user for the service

2. Every LBS must comply with all relevant privacy legislation

3. Every LBS must be designed to guard against consumers being located without their knowledge

4. Every LBS must allow consumers to maintain full control

5. Every LBS must enable customers to control who uses their location information and when that is appropriate, and be able to stop or suspend a service easily should they wish

The second point is a matter for parliaments, privacy oversight agencies and law enforcement agencies, and its inclusion in industry guidelines is for information only. The remainder, meanwhile, are at best ‘aspirational’, and at worst mere window-dressing. Codes of this nature are simply ignored by industry members. They are primarily a means to hold off the imposition of actual regulatory measures. Occasional short-term constraints may arise from flurries of media attention, but the ‘responsible’ organisations escape by suggesting that bad behaviour was limited to a few ‘cowboy’ organisations or was a one-time error that will not be repeated.

A case study of the industry self-regulation is provided by the Biometrics Code issued by the misleadingly named Australian industry-and-users association, the Biometrics ‘Institute’ (BI, 2004). During the period 2009–2012, the privacy advocacy organisation, the Australian Privacy Foundation (APF), submitted to the Privacy Commissioner on multiple occasions that the Code failed to meet the stipulated requirements and under the Commissioner's own Rules had to be de-registered. The Code never had more than five subscribers (out of a base of well over 100 members – which was itself only a sub-set of organisations active in the area), and had no signatories among the major biometrics vendors or users, because all five subscribers were small organisations or consultants. In addition, none of the subscribers appear to have ever provided a link to the Code on their websites or in their Privacy Policy Statements (APF, 2012).

The Commissioner finally ended the farce in April 2012, citing the “low numbers of subscribers”, but avoided its responsibilities by permitting the ‘Institute’ to “request” revocation, over two years after the APF had made the same request (OAIC, 2012). The case represents an object lesson in the vacuousness of self-regulation and the business friendliness of a captive privacy oversight agency.

If economics, morality and industry sector politics are inadequate, perhaps competition and organisational self-interest might work. On the other hand, repeated proposals that privacy is a strategic factor for corporations and government agencies have fallen on stony ground (Clarke, 1996, 2006b).

The public can endeavour to exercise countervailing power against privacy-invasive practices. On the other hand, individuals acting alone are of little or no consequence to organisations that are intent on the application of location surveillance. Moreover, consumer organisations lack funding, professionalism and reach, and only occasionally attract sufficient media attention to force any meaningful responses from organisations deploying surveillance technologies.

Individuals may have direct surveillance countermeasures available to them, but relatively few people have the combination of motivation, technical competence and persistence to overcome lethargy and the natural human desire to believe that the institutions surrounding them are benign. In addition, some government agencies, corporations and (increasingly prevalent) public–private partnerships seek to deny anonymity, pseudonymity and multiple identities, and to impose so-called ‘real name’ policies, for example as a solution to the imagined epidemics of cyber-bullying, hate speech and child pornography. Individuals who use cryptography and other obfuscation techniques have to overcome the endeavours of business and government to stigmatise them as criminals with ‘something to hide’.

6.2. Legal controls

It is clear that natural or intrinsic controls have been utter failures in privacy matters generally, and will be in locational privacy matters as well. That leaves legal safeguards for personal freedoms as the sole protection. There are enormous differences among domestic laws relating to location surveillance. This section accordingly limits itself to generalities and examples.

Privacy laws are (with some qualifications, mainly in Europe) very weak instruments. Even where public servants and parliaments have an actual intention to protect privacy, rather than merely to overcome public concerns by passing placebo statutes, the draft Bills are countered by strong lobbying by government agencies and industry, to the extent that measures that were originally portrayed as being privacy-protective reach the statute books as authority for privacy breaches and surveillance (Clarke, 2000).

Privacy laws, once passed, are continually eroded by exceptions built into subsequent legislation, and by technological capabilities that were not contemplated when the laws were passed. In most countries, location privacy has yet to be specifically addressed in legislation. Even where it is encompassed by human rights and privacy laws, the coverage is generally imprecise and ambiguous. More direct and specific regulation may exist, however. In Australia, for example, the Telecommunications (Interception and Access) Act and the Surveillance Devices Act define and criminalise inappropriate interception and access, use, communication and publication of location information that is obtained from mobile device traffic (AG, 2005). On the other hand, when Google Inc. intercepted wi-fi signals and recorded the data that they contained, the Privacy Commissioner absolved the company (Riley, 2010), and the Australian Federal Police refused to prosecute despite the action – whether it was intentional, ‘inadvertent’ or merely plausibly deniable – being a clear breach of the criminal law (Moses, 2010; Stilgherrian, 2012).

The European Union determined a decade ago that location data that is identifiable to individuals is to some extent at least subject to existing data protection laws (EU, 2002). However, the wording of that so-called ‘e-Privacy Directive’ countenances the collection of “location data which are more precise than is necessary for the transmission of communications”, without clear controls over the justification, proportionality and transparency of that collection (para. 35). In addition, the e-Privacy Directive only applies to telecommunications service-providers, not to other organisations that acquire location and tracking data. King and Jessen (2010) discuss various gaps in the protective regimes in Europe.

The EU's Advisory Body (essentially a Committee of European Data Protection Commissioners) has issued an Opinion that mobile location data is generally capable of being associated with a person, and hence is personal data, and hence is subject to the EU Directive of 1995 and national laws that implement that Directive (Art. 29, 2011). Consent is considered to be generally necessary, and that consent must be informed, and sufficiently granular (pp. 13–18).

It is unclear, however, to what extent this Opinion has actually caused, and will in the future cause, organisations that collect, store, use and disclose location data to change their practices. This uncertainty exists in respect of national security, law enforcement and social control agencies, which have, or which can arrange, legal authority that overrides data protection laws. It also applies to non-government organisations of all kinds, which can take advantage of exceptions, exemptions, loopholes, non-obviousness, obfuscation, unenforceability within each particular jurisdiction, and extra-jurisdictionality, to operate in ways that are in apparent breach of the Opinion.

Legal authorities for privacy-invasions are in a great many cases vague rather than precise, and in many jurisdictions power in relation to specific decisions is delegated to a LEA (in such forms as self-written ‘warrants’), or even a social control agency (in the form of demand-powers), rather than requiring a decision by a judicial officer based on evidence provided by the applicant.

Citizens in many countries are subject to more or less legitimate surveillance of various degrees and orders of granularity, by their government, in the name of law enforcement and national security. However, many Parliaments have granted powers to national security agencies to use location technology to track citizens and to intercept telecommunications. Moreover, many Parliaments have failed the public by permitting a warrant to be signed by a Minister, or even a public servant, rather than a judicial officer (Jay, 1999). Worse still, it appears that these already gross breaches of the principle of a free society are in effect being extended to the authorisation of a private organisation to track mobiles of ordinary citizens because it may lead to better services planning, or more efficient advertising and marketing (Collier, 2011a).

Data protection legislation in all countries evidences massive weaknesses. There are manifold exemptions and exceptions, and there are intentional and accidental exclusions, for example through limitations in the definitions of ‘identified’ and ‘personal data’. Even the much vaunted European laws fail to cope with extraterritoriality and are largely ignored by US-based service-providers. They are also focused exclusively on data, leaving large gaps in safeguards for physical, communications and behavioural privacy.

Meanwhile, a vast amount of abuse of personal data is achieved through the freedom of corporations and government agencies to pretend that Terms imposed on consumers and citizens without the scope to reject them are somehow the subject of informed and freely given consent. For example, petrol stations, supermarkets and many government agencies pretend that walking past signs saying ‘area subject to CCTV’ represents consent to gather, transmit, record, store, use and disclose data. The same approach is being adopted in relation to highly sensitive location data, and much vaunted data protection laws are simply subverted by the mirage of consent.

At least notices such as ‘you are now being watched’ or ‘smile, you are being recorded’ inform customers that they are under observation. On the other hand, people are generally oblivious to the fact that their mobile subscriber identity is transmitted from their mobile phone and multilaterated to yield a reasonably precise location in a shopping mall (Collier, 2011a,b,c). Further, there is no meaningful sense in which they can be claimed to have consented to providing location data to a third party, in this case a location service-provider with whom they have never had contact. And the emergent combination of MDS with CCTV sources becomes a pervasive view of the person, an ‘über’ view, providing a set of über-analytics to – at this stage – shopping complex owners and their constituents.
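The multilateration referred to above can be illustrated in its simplest, noise-free form: given ranges from three receivers of known position, the handset's position follows from a small linear system. The receiver positions and ranges below are invented, and real deployments must cope with measurement error; this is a sketch of the geometry, not any vendor's algorithm:

```python
import math

# Hypothetical receiver positions (metres) and the handset's true
# position, from which noise-free ranges are derived.
rx = [(0.0, 0.0), (20.0, 0.0), (0.0, 20.0)]
true_pos = (8.0, 6.0)
d = [math.dist(r, true_pos) for r in rx]  # 'measured' ranges

# Linearise by subtracting the first range equation from the others,
# then solve the resulting 2x2 linear system for (x, y).
(x1, y1), (x2, y2), (x3, y3) = rx
a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
b1 = d[0]**2 - d[1]**2 + x2**2 - x1**2 + y2**2 - y1**2
b2 = d[0]**2 - d[2]**2 + x3**2 - x1**2 + y3**2 - y1**2
det = a11 * a22 - a12 * a21
x = (b1 * a22 - b2 * a12) / det
y = (a11 * b2 - a21 * b1) / det

print(round(x, 3), round(y, 3))  # 8.0 6.0
```

The essential point for the privacy argument is that the handset contributes nothing voluntarily: the position is computed entirely from signals the device must emit in order to function, which is why no meaningful consent can be said to exist.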

What rights do employees have if such a system were instituted in an employment setting (Michael and Rose, 2007, p. 252–3)? Are workplace surveillance laws in place that would protect employees from constant monitoring (Stern, 2007)? A similar problem applies to people at airports, or on hospital, university, industrial or government campuses. No social contract has been entered into between the parties, rendering the subscriber powerless.

Since the collapse of the Technology Assessment movement, technological deployment proceeds unimpeded, and public risks are addressed only after they have emerged and the clamour of concern has risen to a crescendo. A reactive force is at play, rather than proactive measures being taken to ensure avoidance or mitigation of potential privacy breaches (Michael et al., 2011). In Australia, for example, safeguards for location surveillance exist at best incidentally, in provisions under separate legislative regimes and in separate jurisdictions, and at worst not at all. No overarching framework exists to provide consistency among the laws. This causes confusion and inevitably results in inadequate protections (ALRC, 2008).

6.3. Prospective legal controls

Various learned studies have been conducted, but gather dust. In Australia, the three major law reform commissions have all reported, and all have been ignored by the legislatures (NSWLRC, 2005; ALRC, 2008; VLRC, 2010).

One critical need is for the fundamental principle to be recovered, to the effect that the handling of personal data requires either consent or legal authority. Consent is meaningless as a control over unreasonable behaviour, however, unless it satisfies a number of key conditions. It must be informed, it must be freely given, and it must be sufficiently granular, not bundled (Clarke, 2002). In a great many of the circumstances in which organisations claim to have consent to gather, store, use and disclose location data, the consumer does not appreciate the scope of handling that the service-provider authorises itself to perform; the Terms are imposed by the service-provider and may be varied or completely re-written without consultation, a period of notice, or indeed any notice at all; and consent is bundled, rather than the individual being able to construct a pattern of consents and denials that suits their personal needs. Discussions all too frequently focus on the specifically-US notion of ‘opt-out’ (or ‘presumed consent’), with consent debased to ‘opt-in’, and deprecated as inefficient and business-unfriendly.

Recently, some very weak proposals have been put forward, primarily in the USA. In 2011, for example, two US Senators proposed a Location Privacy Protection Bill (Cheng, 2011). An organisation that collected location data from mobile or wireless data devices would have to state explicitly in their privacy policies what was being collected, in plain English. This would represent only a partial implementation of the already very weak 2006 recommendation of the Internet Engineering Task Force for Geographic Location/Privacy (IETF GEOPRIV) working group, which decided that technical systems should include ‘Fair Information Practices’ (FIPs) to defend against harms associated with the use of location technologies (EPIC, 2006). FIPs, however, are themselves only a highly cut-down version of effective privacy protections, and the Bill proposes only a small fraction of FIPs. It would be close to worthless to consumers, and close to legislative authorisation for highly privacy-invasive actions by organisations.

Two other US senators tabled a GPS Bill, nominally intended to “balance the needs of Americans' privacy protections with the legitimate needs of law enforcement, and maintains emergency exceptions” (Anderson, 2011). The scope is very narrow – next would have to come the Wi-Fi Act, the A-GPS Act, etc. That approach is obviously unviable in the longer term as new innovations emerge. Effective legislation must have appropriate generality rather than excessive technology-specificity, and should be based on semantics not syntax. Yet worse, these Bills would provide legal authorisation for grossly privacy-invasive location and tracking. IETF engineers, and now Congressmen, want to compromise human rights and increase the imbalance of power between business and consumers.

7. Conclusions

Mobile device location technologies and their applications are enabling surveillance, and producing an enormous leap in intrusions into data privacy and into privacy of the person, privacy of personal communications, and privacy of personal behaviour.

Existing privacy laws are entirely incapable of protecting consumers and citizens against the onslaught. Even where consent is claimed, it generally fails the tests of being informed, freely given and granular.

There is an urgent need for outcries from oversight agencies, and responses from legislatures. Individual countries can provide some degree of protection, but the extra-territorial nature of so much of the private sector, and the use of corporate havens, in particular the USA, mean that multilateral action is essential in order to overcome the excesses arising from the US laissez faire traditions.

One approach to the problem would be location privacy protection legislation, although it would need to embody the complete suite of protections rather than the mere notification that the technology breaches privacy. An alternative approach is amendment of the current privacy legislation and other anti-terrorism legislation in order to create appropriate regulatory provisions, and close the gaps that LBS providers are exploiting (Koppel, 2010).

The chimeras of self-regulation, and the unenforceability of guidelines, are not safeguards. Sensitive data like location information must be subject to actual, enforced protections, with guidelines and codes no longer used as a substitute, but merely playing a supporting role. Unless substantial protections for personal location information are enacted and enforced, there will be an epidemic of unjustified, disproportionate and covert surveillance, conducted by government and business, and even by citizens (Gillespie, 2009; Abbas et al., 2011).

Acknowledgements

A preliminary version of the analysis presented in this paper appeared in the November 2011 edition of Precedent, the journal of the Lawyers Alliance. The article has been significantly updated as a result of comments provided by the referees and editor.

References

R. Abbas, The social and behavioural implications of location-based services: an observational study of users, Journal of Location Based Services, 5 (December 2011), pp. 3-4

R. Abbas, K. Michael, M.G. Michael, A. Aloudat, Emerging forms of covert surveillance using GPS-enabled devices, Journal of Cases on Information Technology, 13 (2) (2011), pp. 19-33

AG, What the government is doing: Surveillance Device Act 2004, Australian Government (25 May 2005) at http://www.ag.gov.au/agd/www/nationalsecurity.nsf/AllDocs/9B1F97B59105AEE6CA25700C0014CAF5?OpenDocument

ALRC, For your information: Australian privacy law and practice (ALRC report 108), Australian Government (2008), vol. 2, pp. 1409-10, at http://www.alrc.gov.au/publications/report-108

AMTA, New mobile telecommunications industry guidelines and consumer tips set benchmark for location based services, Australian Mobile Telecommunications Association (2010) at http://www.amta.org.au/articles/New.mobile.telecommunications.industry.guidelines.and.consumer.tips.set.benchmark.for.Location.Based.Services

N. Anderson, Bipartisan bill would end government's warrantless GPS tracking, Ars Technica (June 2011) at http://arstechnica.com/tech-policy/news/2011/06/bipartisan-bill-would-end-governments-warrantless-gps-tracking.ars

APF, Revocation of the biometrics industry code, Australian Privacy Foundation (March 2012) at http://www.privacy.org.au/Papers/OAIC-BiomCodeRevoc-120321.pdf

B. Arnold, Privacy guide, Caslon Analytics (May 2008), at http://www.caslon.com.au/privacyguide19.htm

Art. 29, Opinion 13/2011 on geolocation services on smart mobile devices, Article 29 Data Protection Working Party, 881/11/EN WP 185, at http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2011/wp185_en.pdf (16 May 2011)

BI, Privacy code, Biometrics Institute, Sydney (April 2004) at http://web.archive.org/web/20050424120627/http://www.biometricsinstitute.org/displaycommon.cfm?an=1&subarticlenbr=8

A.J. Blumberg, P. Eckersley, On locational privacy, and how to avoid losing it forever, Electronic Frontier Foundation (August 2009), at https://www.eff.org/wp/locational-privacy

S. Bronitt, Regulating covert policing methods: from reactive to proactive models of admissibility, S. Bronitt, C. Harfield, K. Michael (Eds.), The social implications of covert policing (2010), pp. 9-14

J. Cheng, Franken's location-privacy bill would close mobile-tracking ‘loopholes’, Wired (17 June 2011), at http://www.wired.com/epicenter/2011/06/franken-location-loopholes/

K. Chetty, G.E. Smith, K. Woodbridge, Through-the-wall sensing of personnel using passive bistatic WiFi radar at standoff distances, IEEE Transactions on Geoscience and Remote Sensing, 50 (4) (April 2012), pp. 1218-1226

R. Clarke, Information technology and dataveillance, Communications of the ACM, 31 (5) (May 1988), pp. 498-512, at http://www.rogerclarke.com/DV/CACM88.html

R. Clarke, The digital persona and its application to data surveillance, The Information Society, 10 (2) (June 1994), pp. 77-92, at http://www.rogerclarke.com/DV/DigPersona.html

Clarke R. Privacy and dataveillance, and organisational strategy. In: Proc. I.S. Audit & Control Association (EDPAC'96), Perth, Western Australia; May 1996, at http://www.rogerclarke.com/DV/PStrat.html.

R. Clarke, Submission to the Commonwealth Attorney-General re: ‘a privacy scheme for the private sector: release of key provisions’ of 14 December 1999, Xamax Consultancy Pty Ltd (January 2000) at http://www.anu.edu.au/people/Roger.Clarke/DV/PAPSSub0001.html

R. Clarke, Person-location and person-tracking: technologies, risks and policy implications, Information Technology & People, 14 (2) (Summer 2001), pp. 206-231, at http://www.rogerclarke.com/DV/PLT.html

Clarke R. e-Consent: a critical element of trust in e-business. In: Proc. 15th Bled electronic commerce conference, Bled, Slovenia; June 2002, at http://www.rogerclarke.com/EC/eConsent.html.

R. Clarke, What's ‘privacy’? Xamax Consultancy Pty Ltd (2006), August 2006, at http://www.rogerclarke.com/DV/Privacy.html

R. Clarke, Make privacy a strategic factor – the why and the how, Cutter IT Journal, 19 (11) (2006), at http://www.rogerclarke.com/DV/APBD-0609.html

R. Clarke, Dissidentity: the political dimension of identity and privacy, Identity in the Information Society, 1 (1) (December 2008), pp. 221-228, at http://www.rogerclarke.com/DV/Dissidentity.html

Clarke R. The covert implementation of mass vehicle surveillance in Australia. In: Proc 4th workshop on the social implications of national security: covert policing, April 2009, ANU, Canberra; 2009a, at http://www.rogerclarke.com/DV/ANPR-Surv.html.

Clarke R. A sufficiently rich model of (id)entity, authentication and authorisation. In: Proc. IDIS 2009 – the 2nd multidisciplinary workshop on identity in the Information Society, LSE, 5 June 2009; 2009b, at http://www.rogerclarke.com/ID/IdModel-090605.html.

R. Clarke, A framework for surveillance analysis, Xamax Consultancy Pty Ltd (2009), August 2009, at http://www.rogerclarke.com/DV/FSA.html

R. Clarke, What is überveillance? (And what should be done about it?) IEEE Technology and Society, 29 (2) (Summer 2010), pp. 17-25, at http://www.rogerclarke.com/DV/RNSA07.html

Clarke R. The cloudy future of consumer computing. In: Proc. 24th Bled eConference; June 2011, at http://www.rogerclarke.com/EC/CCC.html.

R. Clarke, M. Wigan, You are where you've been: the privacy implications of location and tracking technologies, Journal of Location Based Services, 5 (3–4) (December 2011), pp. 138-155, http://www.rogerclarke.com/DV/YAWYB-CWP.html

E.B. Cleff, Implementing the legal criteria of meaningful consent in the concept of mobile advertising, Computer Law & Security Review, 23 (2) (2007), pp. 262-269

E.B. Cleff, Effective approaches to regulate mobile advertising: moving towards a coordinated legal, self-regulatory and technical response, Computer Law & Security Review, 26 (2) (2010), pp. 158-169

K. Collier, Stores spy on shoppers, Herald Sun (2011), 12 October 2011, at http://www.heraldsun.com.au/news/more-news/stores-spy-on-shoppers/story-fn7x8me2-1226164244739

K. Collier, Shopping centres' Big Brother plan to track customers, Herald Sun (2011), 14 October 2011, at http://www.heraldsun.com.au/news/more-news/shopping-centres-big-brother-plan-to-track-customers/story-fn7x8me2-1226166191503

K. Collier, ‘Creepy’ path intelligence retail technology tracks shoppers, news.com.au (2011), 14 October 2011, at http://www.news.com.au/money/creepy-retail-technology-tracks-shoppers/story-e6frfmci-1226166413071

F. Dahunsi, B. Dwolatzky, An empirical investigation of the accuracy of location-based services in South Africa, Journal of Location Based Services, 6 (1) (March 2012), pp. 22-34

J. Dobson, P. Fisher, Geoslavery, IEEE Technology and Society, 22 (2003), pp. 47-52, cited in Raper et al. (2007)

Economist, Vehicle data recorders – watching your driving, The Economist (23 June 2012), at http://www.economist.com/node/21557309

J. Edwards, Apple has quietly started tracking iphone users again, and it's tricky to opt out, Business Insider (11 October 2012) at http://www.businessinsider.com/ifa-apples-iphone-tracking-in-ios-6-2012-10

EPIC, Privacy and human rights report 2006, Electronic Privacy Information Center, WorldLII (2006) at http://www.worldlii.org/int/journals/EPICPrivHR/2006/PHR2006-Location.html

EPIC, Investigations of Google street view, Electronic Privacy Information Center (2012), at http://epic.org/privacy/streetview/

EU Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications)

Official Journal, L 201 (2002), 31/07/2002 P. 0037-0047, European Commission, at http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32002L0058:en:HTML

J. Figueiras, S. Frattasi, Mobile positioning and tracking: from conventional to cooperative techniques, Wiley (2010)

S.J. Fusco, R. Abbas, K. Michael, A. Aloudat, Location-based social networking and its impact on trust in relationships, IEEE Technology and Society Magazine, 31 (2) (Summer 2012), pp. 39-50, at http://works.bepress.com/cgi/viewcontent.cgi?article=1326&context=kmichael

Gallagher T et al. Trials of commercial Wi-Fi positioning systems for indoor and urban canyons. In: Proc. IGNSS symposium, Queensland; 1–3 December 2009, cited in Zandbergen (2012).

J.S. Ganz, It's already public: why federal officers should not need warrants to use GPS vehicle tracking devices, Journal of Criminal Law and Criminology, 95 (4) (Summer 2005), pp. 1325-1337

A.A. Gillespie, Covert surveillance, human rights and the law, Irish Criminal Law Journal, 19 (3) (August 2009), pp. 71-79

IBM, IBM smart surveillance system (previously the PeopleVision project), IBM Research (30 October 2011), at http://www.research.ibm.com/peoplevision/

D.M. Jay, Use of covert surveillance obtained by search warrant, Australian Law Journal, 73 (1) (Jan 1999), pp. 34-36

N.J. King, P.W. Jessen, Profiling the mobile customer – privacy concerns when behavioural advertisers target mobile phones, Computer Law & Security Review, 26 (5) (2010), pp. 455-478, and 26 (6) (2010), pp. 595-612

A. Koppel, Warranting a warrant: fourth amendment concerns raised by law enforcement's warrantless use of GPS and cellular phone tracking, University of Miami Law Review, 64 (3) (April 2010), pp. 1061-1089

P. Lewis, Fears over privacy as police expand surveillance project, The Guardian (15 September 2008) at http://www.guardian.co.uk/uk/2008/sep/15/civilliberties.police

B. van Loenen, J. Zevenbergen, J. de Jong, Balancing location privacy with national security: a comparative analysis of three countries through the balancing framework of the European court of human rights, N.J. Patten, et al. (Eds.), National security: institutional approaches, Nova Science Publishers (2009), [chapter 2]

M. McGuire, K.N. Plataniotis, A.N. Venetsanopoulos, Data fusion of power and time measurements for mobile terminal location, IEEE Transaction on Mobile Computing, 4 (2005), pp. 142-153, cited in Raper et al. (2007)

S. Mann, J. Nolan, B. Wellman, Sousveillance: inventing and using wearable computing devices for data collection in surveillance environments, Surveillance & Society, 1 (3) (June 2003), pp. 331-355, at http://www.surveillance-and-society.org/articles1(3)/sousveillance.pdf

Mautz R. Overview of indoor positioning technologies. Keynote. In: Proc. IPIN'2011, Guimaraes; September 2011, at http://www.geometh.ethz.ch/people/.../IPIN_Keynote_Mautz_2011.pdf.

D. Mery, The mobile phone as self-inflicted surveillance – and if you don't have one, what have you got to hide? The Register (10 April 2009) at http://www.theregister.co.uk/2009/04/10/mobile_phone_tracking/

K. Michael, M.G. Michael, From dataveillance to überveillance and the Realpolitik of the Transparent Society, University of Wollongong (2007) at http://works.bepress.com/kmichael/51

K. Michael, M.G. Michael, Innovative automatic identification and location-based services: from bar codes to chip implants, IGI Global (2009)

M.G. Michael, K. Michael, Towards a state of uberveillance, IEEE Technology and Society Magazine, 29 (2) (Summer 2010), pp. 9-16, at http://works.bepress.com/kmichael/187

Michael K, McNamee A, Michael MG, Tootell H., Location-based intelligence – modeling behavior in humans using GPS. In: Proc. int'l symposium on technology and society, New York, 8–11 June 2006; 2006a, at http://ro.uow.edu.au/cgi/viewcontent.cgi?article=1384&context=infopapers.

Michael K, McNamee A, Michael MG. The emerging ethics of humancentric GPS tracking and monitoring. In: Proc. int'l conf. on mobile business, Copenhagen, Denmark. IEEE Computer Society; 2006b, at http://ro.uow.edu.au/cgi/viewcontent.cgi?article=1384&context=infopapers.

M.G. Michael, S.J. Fusco, K. Michael, A research note on ethics in the emerging age of uberveillance, Computer Communications, 31 (6) (2008), pp. 1192-1199, at http://works.bepress.com/kmichael/32/

K. Michael, A. Masters, Realized applications of positioning technologies in defense intelligence, H.A. Abbass, D. Essam (Eds.), Applications of information systems to homeland security and defense, Idea Group Publishing (2006), at http://works.bepress.com/kmichael/2

K. Michael, G. Rose, Human tracking technology in mutual legal assistance and police inter-state cooperation in international crimes, K. Michael, M.G. Michael (Eds.), From dataveillance to überveillance and the realpolitik of the transparent society. 1st ed, University of Wollongong, Wollongong (2007), pp. 241-256.

K. Michael, G. Roussos, G.Q. Huang, R. Gadh, A. Chattopadhyay, S. Prabhu, et al., Planetary-scale RFID services in an age of uberveillance, Proceedings of the IEEE, 98 (9) (2010), pp. 1663-1671

K. Michael, M.G. Michael, R. Abbas, The importance of scenarios in the prediction of the social implications of emerging technologies and services, Journal of Cases on Information Technology (JCIT), 13 (2) (2011), pp. i-vii

A. Moses, Google escapes criminal charges for Wi-Fi snooping, The Sydney Morning Herald (6 December 2010) at http://www.smh.com.au/technology/security/google-escapes-criminal-charges-for-wifi-snooping-20101206-18lot.html

NSWLRC, Surveillance, Report 108, NSW Law Reform Commission (2005) at http://www.lawlink.nsw.gov.au/lawlink/lrc/ll_lrc.nsf/pages/LRC_r108toc

OAIC. Office of the Australian Information Commissioner; April 2012, at http://www.comlaw.gov.au/Details/F2012L00869/Explanatory%20Statement/Text.

A.A. Otterberg, Note: GPS tracking technology: the case for revisiting Knotts and shifting the Supreme Court's theory of the public space under the fourth amendment, Boston College Law Review, 46 (2005), pp. 661-704

C. Parenti, The soft cage: surveillance in America from slavery to the war on terror, Basic Books (2003)

PI, Our commitment to privacy, Path Intelligence (2010), heading changed in late 2012 to ‘privacy by design’, at http://www.pathintelligence.com/en/products/footpath/privacy

PI, FootPath technology, Path Intelligence (2010) at http://www.pathintelligence.com/en/products/footpath/footpath-technology

PI, Retail, Path Intelligence (2012), at http://www.pathintelligence.com/en/industries/retail

J. Raper, G. Gartner, H. Karimi, C. Rizos, A critical evaluation of location based services and their potential, Journal of Location Based Services, 1 (1) (2007), pp. 5-45

J. Raper, G. Gartner, H. Karimi, C. Rizos, Applications of location-based services: a selected review, Journal of Location Based Services, 1 (2) (2007), pp. 89-111

RE, IEEE 802.11 standards tutorial, Radio-Electronics.com (undated, apparently 2010), at http://www.radio-electronics.com/info/wireless/wi-fi/ieee-802-11-standards-tutorial.php

RE, WiMAX IEEE 802.16 technology tutorial, Radio-Electronics.com (undated, apparently 2010), at http://www.radio-electronics.com/info/wireless/wimax/wimax.php

RE, Assisted GPS, A-GPS, Radio-Electronics.com (undated, apparently 2012), at http://www.radio-electronics.com/info/cellulartelecomms/location_services/assisted_gps.php

Renegar BD, Michael K, Michael MG. Privacy, value and control issues in four mobile business applications. In: Proc. 7th int'l conf. on mobile business; 2008. p. 30–40.

J. Riley, Gov't ‘travesty’ in Google privacy case, ITWire, 20 (Wednesday 3 November 2010), p. 44, at http://www.itwire.com/it-policy-news/regulation/42898-govt-travesty-in-google-privacy-case

I.J. Samuel, Warrantless location tracking, New York University Law Review, 83 (2008), pp. 1324-1352

SHW, Skyhook location performance, Skyhook Wireless (2012), at http://www.skyhookwireless.com/location-technology/performance.php

Skyhook. (2012). Website entries, including ‘frequently asked questions’ at http://www.skyhookwireless.com/whoweare/faq.php, ‘privacy policy’ at http://www.skyhookwireless.com/whoweare/privacypolicy.php and ‘location privacy’ at http://www.skyhookwireless.com/whoweare/privacy.php.

C. Song, Z. Qu, N. Blumm, A.-L. Barabási, Limits of predictability in human mobility, Science, 327 (5968) (2010), pp. 1018-1021.

A. Stern, Man fired thanks to GPS tracking, Center Networks (31 August 2007), at http://www.centernetworks.com/man-fired-thanks-to-gps-tracking

Stilgherrian, Forget government data retention, Google has you wired, Crikey (2 October 2012), at http://www.crikey.com.au/2012/10/02/forget-government-data-retention-google-has-you-wired/

USGov, GPS accuracy, National Coordination Office for Space-Based Positioning, Navigation, and Timing (February 2012), at http://www.gps.gov/systems/gps/performance/accuracy/

VLRC, Surveillance in public spaces, Victorian Law Reform Commission (March 2010), Final report 18, at http://www.lawreform.vic.gov.au/wps/wcm/connect/justlib/Law+Reform/resources/3/6/36418680438a4b4eacc0fd34222e6833/Surveillance_final_report.pdf

D. Wright, M. Friedewald, S. Gutwirth, M. Langheinrich, E. Mordini, R. Bellanova, et al., Sorting out smart surveillance, Computer Law & Security Review, 26 (4) (2010), pp. 343-354

P.A. Zandbergen, Comparison of WiFi positioning on two mobile devices, Journal of Location Based Services, 6 (1) (March 2012), pp. 35-50

Keywords: Location-based systems (LBS), Cellular mobile, Wireless LAN, GPS, Mobile device signatures (MDS), Privacy, Surveillance, Überveillance

Citation: Katina Michael and Roger Clarke, "Location and tracking of mobile devices: Überveillance stalks the streets", Computer Law & Security Review, Vol. 29, No. 3, June 2013, pp. 216-228, DOI: https://doi.org/10.1016/j.clsr.2013.03.004

Social Implications of Technology: The Past, the Present, and the Future

Abstract

The social implications of a wide variety of technologies are the subject matter of the IEEE Society on Social Implications of Technology (SSIT). This paper reviews the SSIT's contributions since the Society's founding in 1982, and surveys the outlook for certain key technologies that may have significant social impacts in the future. Military and security technologies, always of significant interest to SSIT, may become more autonomous with less human intervention, and this may have both good and bad consequences. We examine some current trends such as mobile, wearable, and pervasive computing, and find both dangers and opportunities in these trends. We foresee major social implications in the increasing variety and sophistication of implant technologies, leading to cyborgs and human-machine hybrids. The possibility that the human mind may be simulated in and transferred to hardware may lead to a transhumanist future in which humanity redesigns itself: technology would become society.

SECTION I. Introduction

“Scientists think; engineers make.” Engineering is fundamentally an activity, as opposed to an intellectual discipline. The goal of science and philosophy is to know; the goal of engineering is to do something good or useful. But even in that bare-bones description of engineering, the words “good” and “useful” have philosophical implications.

Because modern science itself has existed for only 400 years or so, the discipline of engineering in the sense of applying scientific knowledge and principles to the satisfaction of human needs and desires is only about two centuries old. But for such a historically young activity, engineering has probably done more than any other single human development to change the face of the material world.

It took until the mid-20th century for engineers to develop the kind of self-awareness that leads to thinking about engineering and technology as they relate to society. Until about 1900, most engineers felt comfortable in a “chain-of-command” structure in which the boss—whether it be a military commander, a corporation, or a wealthy individual—issued orders that were to be carried out to the best of the engineer's technical ability. Fulfillment of duty was all that was expected. But as the range and depth of technological achievements grew, engineers, philosophers, and the public began to realize that we had all better take some time and effort to think about the social implications of technology. That is the purpose of the IEEE Society on Social Implications of Technology (SSIT): to provide a forum for discussion of the deeper questions about the history, connections, and future trends of engineering, technology, and society.

This paper is not focused on the history or future of any particular technology as such, though we will address several technological issues in depth. Instead, we will review the significant contributions of SSIT to the ongoing worldwide discussion of technology and society, and how technological developments have given rise to ethical, political, and social issues of critical importance to the future. SSIT is the one society in IEEE where engineers and allied professionals are encouraged to be introspective—to think about what they are doing, why they are doing it, and what effects their actions will have. We believe the unique perspective of SSIT enables us to make a valuable contribution to the panoply of ideas presented in this Centennial Special Issue of the Proceedings of the IEEE.

 

SECTION II. The Past

A. Brief History of SSIT

SSIT as a technical society in IEEE was founded in 1982, after a decade as the Committee on Social Responsibility in Engineering (CSRE). In 1991, SSIT held its first International Symposium on Technology and Society (ISTAS), in Toronto, ON, Canada. Beginning in 1996, the Symposium has been held annually, with venues intentionally located outside the continental United States every few years in order to increase international participation.

SSIT total membership was 1705 as of December 2011. Possibly because SSIT does not focus exclusively on a particular technical discipline, it is rare that SSIT membership is a member's primary connection to IEEE. As SSIT's parent organization seeks ways to increase its usefulness and relevance to the rapidly changing engineering world of the 21st century, SSIT will both chronicle and participate in the changes taking place both in engineering and in society as a whole. For a more detailed history of the first 25 years of SSIT, see [1].

B. Approaches to the Social Implications of Technology

In the historical article referred to above [1], former SSIT president Clint Andrews remarked that there are two distinct intellectual approaches which one can take with regard to questions involving technology and society. The CSRE and the early SSIT followed what he calls the “critical science” approach, which “tends to focus on the adverse effects of science and technical change.” Most IEEE societies are organized around a particular set of technologies. The underlying assumption of many in these societies is that these particular technologies are beneficial, and that the central issues to be addressed are technical, e.g., having to do with making the technologies better, faster, and cheaper. Andrews viewed this second “technological optimism” trend as somewhat neglected by SSIT in the past, and expressed the hope that a more balanced approach might attract a larger audience to the organization's publications and activities. It is important to note, however, that from the very beginning, SSIT has called for a greater emphasis on the development of beneficial technology such as environmentally benign energy sources and more efficient electrical devices.

In considering technology in its wider context, issues that go unquestioned in a purely technical forum may become open to question. Technique A may be more efficient and a fraction of the cost of technique B in storing data with similar security provisions, but what if a managed offshore shared storage solution is not the best thing to do under a given set of circumstances? The question of whether A or B is better technologically (and economically) is thus subsumed in the larger question of whether and why the entire technological project is going to benefit anyone, and who it may benefit, and who it may harm. The fact that opening up a discussion to wider questions sometimes leads to answers that cast doubt on the previously unquestioned goodness of a given enterprise is probably behind Andrews' perception that, on balance, the issues joined by SSIT have predominantly fallen into the critical-science camp. Just as no one expects the dictates of conscience to be in complete agreement with one's instinctive desires, a person seeking unalloyed technological optimism in the pages or discussions hosted by SSIT will probably be disappointed. But the larger aim is to reach conclusions about technology and society that most of us will be thankful for some day, if not today. Another aim is to ensure that we bring issues to light and propose ways forward to safeguard against negative effects of technologies on society.

C. Major Topic Areas of SSIT

In this section, we will review some (but by no means all) topics that have become recurring themes over the years in SSIT's quarterly peer-reviewed publication, the IEEE Technology & Society Magazine. The articles cited are representative only in the sense that they fall into categories that have been dealt with in depth, and are not intended to be a “best of” list. These themes fall into four broad categories: 1) war, military technology (including nuclear weapons), and security issues, broadly defined; 2) energy technologies, policies and related issues: the environment, sustainable development, green technology, climate change, etc.; 3) computers and society, information and communications technologies (ICT), cybersystems, cyborgs, and information-driven technologies; and 4) groups of people who have historically been underprivileged, unempowered, or otherwise disadvantaged: Blacks, women, residents of developing nations, the handicapped, and so on. Education and healthcare also fit in the last category because the young and the ill are in a position of dependence on those in power.

1. Military and Security Issues

Concern about the Vietnam War was a strong motivation for most of the early members of the Committee for Social Responsibility in Engineering, the predecessor organization of SSIT. The problem of how and even whether engineers should be involved in the development or deployment of military technology has continued to appear in some form throughout the years, although the end of the Cold War changed the context of the discussion. This category goes beyond formal armed combat if one includes technologies that tend to exert state control or monitoring on the public, such as surveillance technologies and the violation of privacy by various technical means. In the first volume of the IEEE Technology & Society Magazine published in 1982, luminaries such as Adm. Bobby R. Inman (ret.) voiced their opinions about Cold War technology [2], and the future trend toward terrorism as a major player in international relations was foreshadowed by articles such as “Technology and terrorism: privatizing public violence,” published in 1991 [3]. Opinions voiced in the Magazine on nuclear technology ranged from Shanebrook's 1999 endorsement of a total global ban on nuclear weapons [4] to Andrews' thorough review of national responses to energy vulnerability, in which he pointed out that France has developed an apparently safe, productive, and economical nuclear-powered energy sector [5]. In 2009, a special section of five articles appeared on the topic of lethal robots and their implications for ethical use in war and peacekeeping operations [6]. And in 2010, the use of information and communication technologies (ICT) in espionage and surveillance was addressed in a special issue on “Überveillance,” defined by authors M.G. Michael and K. Michael as the use of electronic means to track and gather information on an individual, together with the “deliberate integration of an individual's personal data for the continuous tracking and monitoring of identity and location in real time” [7].

2. Energy and Related Technologies and Issues

From the earliest years of the Society, articles on energy topics such as alternative fuels appeared in the pages of the IEEE Technology & Society Magazine. A 1983 article on Brazil's then-novel effort to supplement imported oil with alcohol from sugarcane [8] presaged today's controversial U.S. federal mandate for the ethanol content in motor fuels. The Spring 1984 issue hosted a debate on nuclear power generation between H. M. Gueron, director of New York's Con Edison Nuclear Coal and Fuel Supply division at the time [9], and J. J. MacKenzie, a senior staff scientist with the Union of Concerned Scientists [10]. Long before greenhouse gases became a household phrase, bandied about in debates between Presidential candidates, the Magazine published an article examining the need to increase the U.S.'s peak electrical generating capacity because the increase in average temperature due to increasing atmospheric carbon dioxide would increase the demand for air conditioning [11]. The larger implications of global warming apparently escaped the attention of the authors, focused as they were on the power-generating needs of the state of Minnesota. By 1990, the greenhouse effect was of sufficient concern to show up on the legislative agendas of a number of nations, and although Cruver attributed this to the “explosion of doomsday publicity,” he assessed the implications of such legislation for future energy and policy planning [12]. Several authors in a special issue on the social implications of systems concepts viewed the Earth's total environment in terms of a complex system in 2000 [13]. The theme of ISTAS 2009 was the social implications of sustainable development, and this theme was addressed in six articles in the resulting special issue of the IEEE Technology & Society Magazine for Fall 2010.
The record of speculation, debate, forecasting, and analysis sampled here shows not only that SSIT has carried out its charter by examining the social implications of energy technology and related issues, but also that it has been a leader and forerunner in trends that later became large-scale public debates.

3. Computing, Telecommunications, and Cyberspace

Fig. 1. BRLESC-II computer built by U.S. Army personnel for use at the Ballistics Research Lab, Aberdeen Proving Grounds between about 1967 and 1978, A. V. Kurian at console. Courtesy of U.S. Army Photos.

In the early years of SSIT, computers were primarily huge mainframes operated by large institutions (Fig. 1). But with the personal computer revolution and especially the explosion of the Internet, SSIT has done its part to chronicle and examine the history, present state, and future trends of the hardware, software, human habits and interactions, and the complex of computer and communications technologies that are typically subsumed under the acronym of ICT.

As we now know, the question of intellectual property has been vastly complicated by the ready availability of peer-to-peer software, high-speed network connections, and legislation passed to protect such rights. In a paper published in 1998, Davis addressed the question of protection of intellectual property in cyberspace [14]. As the Internet grew, so did the volume of papers on all sorts of issues it raised, from the implications of electronic profiling [15] to the threats and promises of facial recognition technology [16]. One of the more forward-looking themes addressed in the pages of the Magazine came in 2005 with a special issue on sustainable pervasive computing [17]. This issue provides an example of how both the critical science and the technological optimism themes cited by Andrews above can be brought together in a single topic. And to show that futuristic themes are not shirked by the IEEE Technology and Society Magazine authors, in 2011 Clarke speculated in an article entitled “Cyborg rights” on the limits and problems that may come as people physically merge with increasingly advanced hardware (implanted chips, sensory enhancements, and so on) [18].

4. Underprivileged Groups

Last but certainly not least, the pages of the IEEE Technology & Society Magazine have hosted articles inspired by the plight of underprivileged peoples, broadly defined. This includes demographic groups such as women and ethnic minorities and those disadvantaged by economic issues, such as residents of developing countries. While the young and the ill are not often formally recognized as underprivileged in the conventional sense, in common with other underprivileged groups they need society's help in order to survive and thrive, in the form of education and healthcare, respectively. An important subset of education is the theme of engineering ethics, a subject of vital interest to many SSIT members and officials since the organization's founding.

In its first year, the Magazine carried an article on ethical issues in decision making [19]. A special 1998 issue on computers and the Internet as used in the K-12 classroom explored these matters in eight focused articles [20]. The roles of ethics and professionalism in the personal enjoyment of engineering were explored by Florman (author of the book The Introspective Engineer) in an interview with the Magazine's managing editor Terri Bookman in 2000 [21]. An entire special issue was devoted to engineering ethics in education the following year, after changes in the U.S. Accreditation Board for Engineering and Technology's policies made it appear that ethics might receive more attention in college engineering curricula [22].

The IEEE Technology & Society Magazine has hosted many articles on the status of women, both as a demographic group and as a minority in the engineering profession. Articles and special issues on themes involving women have on occasion been the source of considerable controversy, even threatening the organization's autonomy at one point [1, p. 9]. In 1999, ISTAS was held for the first time in conjunction with two other IEEE entities: the IEEE Women in Engineering Committee and the IEEE History Center. The resulting special issue that came out in 2000 carried articles as diverse as the history of women in the telegraph industry [23], the challenges of being both a woman and an engineering student [24], and two articles on technology and the sex industry [25], [26].

Engineering education in a global context was the theme of a Fall 2005 special issue of the IEEE Technology and Society Magazine, and education has been the focus of several special issues and ISTAS meetings over the years [27]–[29]. The recent development termed “humanitarian engineering” was explored in a special issue only two years ago, in 2010 [30]. Exemplified by the U.S.-based Engineers without Borders organization, these engineers pursue projects, and sometimes careers, based not only on profit and market share, but also on the degree to which they can help people who might not otherwise benefit from their engineering talents.

SECTION III. The Present

Fig. 2. Cow bearing an Australian National Livestock Identification System (NLIS) RFID tag on its ear. The cow's identity is automatically detected as it goes through the drafting gates and the appropriate feed is provided for the cow based on historical data on its milk yields. Courtesy of Adam Trevarthen.

Emerging technologies that will act to shape the next few years are complex in their makeup with highly meshed value chains that resemble more a process or service than an individual product [31]. At the heart of this development is convergence: convergence in devices, convergence in applications, convergence in content, and convergence in infrastructure. The current environment is typified by the move toward cloud computing solutions and Web 2.0 social media platforms with ubiquitous access via a myriad of mobile or fixed devices, some of which will be wearable on people and animals (Fig. 2) or embedded in systems (e.g., vehicles and household appliances).

Simultaneous with these changes is the emergence of web services that may or may not require a human operator for decision making in a given business process, reliance upon data streams from automatic identification devices [e.g., radio-frequency identification (RFID) tags], the accuracy and reliability of location-based services [e.g., using Global Positioning Systems (GPS)], and condition monitoring techniques (e.g., using sensors to measure temperature or other physiological data). Most of this new technology will be invisibly located in miniaturized semiconductors, which are set to reach such economies of scale that technology evangelists commonly note that every single living and nonliving thing will come equipped with a chip “on board.”

Fig. 3. Business woman checking in for an interstate trip using an electronic ticket sent to her mobile phone. Her phone also acts as a mobile payment mechanism and has built-in location services features. Courtesy of NXP Semiconductors 2009.

The ultimate vision of a Web of Things and People (WoTaP)—smart homes using smart meters, smart cars using smart roads, smart cities using smart grids—is one where pervasive and embedded systems will play an active role toward sustainability and renewable energy efficiency. The internetworked environment will need to be facilitated by a fourth-generation mobility capability which will enable even higher amounts of bandwidth to the end user as well as seamless communication and coordination by intelligence built into the cloud. Every smart mobile transaction will be validated by a precise location and linked back to a subject (Fig. 3).

In the short term, some of the prominent technologies that will impact society will be autonomous computing systems with built-in ambient intelligence which will amalgamate the power of web services and artificial intelligence (AI) through multiagent systems, robotics, and video surveillance technologies (e.g., even the use of drones) (Fig. 4). These technologies will provide advanced business and security intelligence. While these systems will lead to impressive uses in green initiatives and in making direct connections between people and dwellings, people and artifacts, and even people and animals, they will require end users to give up personal information related to identity, place, and condition to be drawn transparently from smart devices.

Fig. 4. A facial recognition system developed by Argus Solutions in Australia. Increasingly facial recognition systems are being used in surveillance and usually based on video technology. Digital images captured from video or still photographs are compared with other precaptured images. Courtesy of Argus Solutions 2009.

The price of all of this will be that very little remains private any longer. While the opportunities that present themselves with emerging technologies are enormous with a great number of positive implications for society—for instance, a decrease in the number of traffic accidents and fatalities, a reduction in the carbon emission footprint by each household, greater social interconnectedness, etc.—ultimately these gains too will be susceptible to limitations. Who the designated controller is and what they will do with the acquired data is something we can only speculate about. We return then, to the perennial question of “who will guard the guards themselves”: Quis custodiet ipsos custodes? [32]

A. Mobile and Pervasive Computing

In our modern world, data collection from many of our most common activities runs from the moment we step out our front door in the morning until we go to sleep at night. In addition to near-continual data collection, we have become a society of people that voluntarily broadcasts to the world a great deal of personal information. Vacation photos, major life events, and trivialities ranging from where we are having dinner to our most mundane thoughts all form part of the stream of data through which we electronically share our inner lives. This combination of the data that is collected about us and the data that is freely shared by us could form a breathtakingly detailed picture of an individual's life, if it could ever all be collected in one place. Most of us would consider ourselves fortunate that, historically, most of this data was never correlated and was usually highly anonymized. However, in general, it is becoming easier to correlate and deanonymize data sets.
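The claim that separately released data sets can be re-linked is easy to demonstrate. The sketch below, with entirely invented records and field names, shows how a release that is “anonymous” on its own can be re-identified by joining it to a public record through shared quasi-identifiers such as postal code, birth year, and gender.

```python
# Illustrative only: all records and fields below are hypothetical.

public_records = [  # e.g., a public roll where names are attached
    {"name": "Jane Doe", "zip": "2500", "birth_year": 1975, "gender": "F"},
    {"name": "John Roe", "zip": "2519", "birth_year": 1980, "gender": "M"},
]

anonymized_records = [  # e.g., a "deidentified" release with a sensitive field
    {"zip": "2500", "birth_year": 1975, "gender": "F", "diagnosis": "asthma"},
    {"zip": "2519", "birth_year": 1980, "gender": "M", "diagnosis": "none"},
]

def reidentify(public, anonymized, keys=("zip", "birth_year", "gender")):
    """Join the two releases on their shared quasi-identifiers."""
    index = {tuple(r[k] for k in keys): r for r in anonymized}
    matches = []
    for person in public:
        hit = index.get(tuple(person[k] for k in keys))
        if hit is not None:
            # The "anonymous" sensitive field is now tied to a name.
            matches.append({"name": person["name"], **hit})
    return matches

print(reidentify(public_records, anonymized_records))
```

When the quasi-identifier combination is rare in the population, a single exact match is enough to deanonymize a record, which is why seemingly innocuous attributes carry real privacy risk.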

1. Following Jane Doe's Digital Data Trail

Let us consider a hypothetical “highly tracked” individual [33]. Our Jane Doe leaves for work in the morning, and gets in her Chevrolet Impala, which has OnStar service to monitor her car. OnStar will contact emergency services if Jane has an accident, but will also report to the manufacturer any accident or mechanical failure the car's computer is aware of [34]. Jane commutes along a toll road equipped with electronic toll collection (ETC). The electronic toll system tracks where and at what time Jane enters and leaves the toll road (Fig. 5).

Fig. 5. Singapore's Electronic Road Pricing (ERP) system. The ERP uses a dedicated short-range radio communication system to deduct ERP charges from CashCards. These are inserted in the in-vehicle units of vehicles before each journey. Each time vehicles pass through a gantry when the system is in operation, the ERP charges are automatically deducted. Courtesy of Katina Michael 2003.

When she gets to work, she uses a transponder ID card to enter the building she works in (Fig. 6), which logs the time she enters and by what door. She also uses her card to log into the company's network for the morning. Her company's Internet firewall software monitors any websites she visits. At lunch, she eats with colleagues at a local restaurant. When she gets there, she “checks in” using a geolocation application on her phone—for doing so, the restaurant rewards her with a free appetizer [35].

Fig. 6. Employee using a contactless smart card to gain entry to her office premises. The card is additionally used to access elevators in the building, rest rooms, and secure store areas, and is the only means of logging into the company intranet. Courtesy of NXP Semiconductors 2009.

She then returns to work for the afternoon, again using her transponder ID badge to enter. After logging back into the network, she posts a review of the restaurant on a restaurant review site, or maybe a social networking site. At the end of the work day, Jane logs out and returns home along the same toll road, stopping to buy groceries at her local supermarket on the way. When she checks out at the supermarket, she uses her customer loyalty card to automatically use the store's coupons on her purchases. The supermarket tracks Jane's purchases so it can alert her when things she buys regularly are on sale.
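Each system in Jane's day keeps its own log, innocuous in isolation. The sketch below (hypothetical data throughout) shows how trivially those logs merge into a single chronological reconstruction of her movements once pooled.

```python
# Invented event logs standing in for the systems that tracked Jane's day.
toll_log    = [("07:42", "ETC", "entered toll road, gantry 3"),
               ("17:31", "ETC", "entered toll road, gantry 9")]
badge_log   = [("08:05", "badge", "entered office, north door"),
               ("13:02", "badge", "re-entered office, north door")]
checkin_log = [("12:15", "geolocation app", "checked in at restaurant")]
loyalty_log = [("18:03", "loyalty card", "groceries, 14 items")]

def build_timeline(*logs):
    """Merge independent event logs into one chronological record."""
    return sorted((event for log in logs for event in log), key=lambda e: e[0])

for time, system, event in build_timeline(toll_log, badge_log,
                                          checkin_log, loyalty_log):
    print(f"{time}  [{system}]  {event}")
```

No single operator needs to do the surveillance; a simple sort over pooled records is sufficient, which is the crux of the correlation concern raised above.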

During Jane's day, her movements were tracked by several different systems. During almost all of the time she spent out of the house, her movements were being followed. But Jane “opted in” to almost all of that tracking; it was her choice, as the benefits she received outweighed the perceived costs. The toll collection transponder in her car allows her to spend less time in traffic [36]. She is happy to share her buying habits with various merchants because those merchants reward her for doing so [37]. In this world it is all about building up bonus points and getting rewarded. Sharing her opinions on review and social networking sites lets Jane keep in touch with her friends and lets them know what she is doing.

While many of us might choose to allow ourselves to be monitored for the individual benefits that accrue to us personally, the data being gathered about collective behaviors are much more valuable to business and government agencies. Clarke developed the notion of dataveillance to give a name to the “systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons” in the 1980s [38]. ETC is used by millions of people in many countries. The more people who use it, as opposed to paying tolls at tollbooths, the faster traffic can flow for everyone. Everyone also benefits when ETC allows engineers to better monitor traffic flows and plan highway construction to avoid the busiest times of traffic. Geolocation applications let businesses reward first-time and frequent customers, and they can follow traffic to their business and see what customers do and do not like. Businesses such as grocery stores or drug stores that use customer loyalty cards are able to monitor buying trends to see what is popular and when. Increasingly shoppers are being introduced to the near-field communication (NFC) capability on their third-generation (3G) smartphone (Fig. 7).

Fig. 7. Purchasing grocery items effortlessly by using the near-field communication (NFC) capability on your 3G smartphone. Courtesy of NXP Semiconductors 2009.
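The loyalty-card analytics described above amount to simple aggregation. A minimal sketch, with invented purchase records:

```python
# Hypothetical loyalty-card records: (card id, date, item purchased).
from collections import Counter

purchases = [
    ("card_1021", "2024-06-01", "coffee"),
    ("card_1021", "2024-06-08", "coffee"),
    ("card_3307", "2024-06-08", "coffee"),
    ("card_3307", "2024-06-08", "bread"),
    ("card_1021", "2024-06-15", "coffee"),
]

# Store-level view: which items are trending across all shoppers.
item_popularity = Counter(item for _, _, item in purchases)
print(item_popularity.most_common(1))  # [('coffee', 4)]

# Shopper-level view: the per-card habits behind
# "your regular item is on sale" alerts.
per_card = Counter((card, item) for card, _, item in purchases)
print(per_card[("card_1021", "coffee")])  # 3
```

The same few lines serve both purposes the text describes: aggregate trend-spotting for the business and individual habit profiles for targeted offers.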

Some of these constant monitoring tools are truly personal and are controlled by and report back only to the user [39]. For example, there are now several adaptive home thermostat systems that learn a user's temperature preferences over time and allow users to track their energy usage and change settings online. For the health conscious, “sleep monitoring” systems allow users to track not only the hours of sleep they get per night, but also the percentage of time spent in light sleep versus rapid eye movement (REM) sleep, and their overall “sleep quality” [40].
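Commercial thermostat vendors do not publish their learning algorithms, but “learning a user's temperature preferences over time” can be sketched with something as simple as an exponential moving average of manual overrides. Everything below is an illustrative assumption, not a description of any real product.

```python
# Minimal sketch of an adaptive thermostat: each manual override nudges the
# learned setpoint toward the user's demonstrated preference.

class LearningThermostat:
    def __init__(self, initial_setpoint=20.0, learning_rate=0.3):
        self.setpoint = initial_setpoint      # degrees Celsius
        self.learning_rate = learning_rate    # how quickly to adapt

    def user_adjusts(self, chosen_temperature):
        """Blend a manual override into the learned preference
        (an exponential moving average)."""
        self.setpoint += self.learning_rate * (chosen_temperature - self.setpoint)
        return self.setpoint

thermostat = LearningThermostat()
for temp in [22.0, 22.0, 21.5]:       # user nudges the dial over several days
    thermostat.user_adjusts(temp)
print(round(thermostat.setpoint, 2))  # prints 21.16
```

The privacy point in the text follows directly: even this toy model only works because it retains a running record of the occupant's behavior.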

Fig. 8. Barcodes printed on individual packaged items on pallets. Order information is shown on the forklift's on-board laptop and the driver scans items that are being prepared for shipping using a handheld gun to update inventory records wirelessly. Courtesy AirData Pty Ltd, Motorola Premier Business Partner, 2009.

Businesses offer and customers use various mobile and customer tracking services because the offer is valued by both parties (Fig. 8). However, serious privacy and legal issues continue to arise [41]. ETC records have been subpoenaed in both criminal and civil cases [42]. Businesses in liquidation have sold their customer databases, violating the privacy agreements they gave to their customers when they were still in business. Geolocation services and social media that show a user's location or allow them to share where they have been or where they are going can be used in court cases to confirm or refute alibis [43].

Near-constant monitoring and reporting of our lives will only grow as our society becomes increasingly comfortable sharing more and more personal details (Fig. 9). In addition to the basic human desire to tell others about ourselves, information about our behavior as a group is hugely valuable to both governments and businesses. The benefits to individuals and to society as a whole are great, but the risks to privacy are also significant [44]. More information about group behaviors can let us allocate resources more efficiently, plan better for future growth, and generate less waste. More information about our individual patterns can allow us to do the same thing on a smaller scale—to waste less fuel heating our homes when there is no one present, or to better understand our patterns of human activity.

Fig. 9. A five step overview of how the Wherify location-based service works. The information retrieved by this service included a breadcrumb of each location (in table and map form), a list of time and date stamps, latitude and longitude coordinates, nearest street address, and location type. Courtesy of Wherify Wireless Location Services, 2009.

B. Social Computing

When we think of human evolution, we often think of biological adaptations to better survive disease or digest foods. But our social behaviors are also a product of evolution. Being able to read facial expressions and other nonverbal cues is an evolved trait and an essential part of human communication. In essence, we have evolved as a species to communicate face to face. Our ability to understand verbal and nonverbal cues has been essential to our ability to function in groups and therefore our survival [45].

The emoticon came very early in the life of electronic communication. This is not surprising, given how necessary facial expressions are for giving context to written words in the casual and humor-filled atmosphere of the Internet precursors. Many other attempts to add context to the quick, casual writing style of the Internet have been made, mostly with less success. Indeed, the problem of communication devolving from normal conversation into meaningless shouting matches has been around almost as long as electronic communication itself. More recently, the “anonymous problem”—the problem of people anonymously harassing others without fear of response or retribution—has come under discussion in online forums and communities. And of course, we have seen the recent tragic consequences of cyberbullying [46]. In general, people will be much crueler to other people online than they would ever be in person; many of our evolved social mechanisms depend on seeing and hearing who we are communicating with.

The question we are faced with is this: Given that we now exist and interact in a world that our social instincts were not evolved to handle, how will we adapt to the technology, or more likely, how will the technology we use to communicate with adapt to us? We are already seeing the beginning of that adaptation: more and more social media sites require a “real” identity tied to a valid e-mail address. And everywhere on the Internet, “reputation” is becoming more and more important [177].

Reference sites, such as Wikipedia, control access based on reputation: users gain more privileges on the site to do things such as editing controversial topics or banning other users based on their contributions to the community—writing and editing articles or contributing to community discussions. On social media and review sites, users that are not anonymous have more credibility, and again reputation is gained with time and contribution to the community.
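A reputation-gated privilege scheme of the kind described can be sketched in a few lines; the actions and threshold values below are invented for illustration and do not reflect any particular site's policy.

```python
# Minimum reputation required for each action (hypothetical thresholds).
PRIVILEGES = {
    "read": 0,
    "edit_article": 10,
    "edit_controversial_topic": 500,
    "ban_user": 2000,
}

def allowed(reputation, action):
    """A user may perform an action once their contributions have earned
    enough community reputation."""
    return reputation >= PRIVILEGES[action]

user_reputation = 650   # earned by writing and editing articles
print(allowed(user_reputation, "edit_controversial_topic"))  # True
print(allowed(user_reputation, "ban_user"))                  # False
```

The design choice mirrors the offline norm the section describes: standing in the community, accumulated slowly and publicly, substitutes for the trust cues we would otherwise get face to face.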

It is now becoming standard practice for social media of all forms to allow users to control who can contact them and make it very easy to block unwanted contact. In the future, these trends will be extended. Any social media site with a significant amount of traffic will have a way for users to build and maintain a reputation and to control access accordingly. The shift away from anonymity is set to continue and this is also evident in the way search engine giants, like Google, are updating their privacy statements—from numerous policies down to one. Google states: “When you sign up for a Google Account, we ask you for personal information. We may combine the information you submit under your account with information from other Google services or third parties in order to provide you with a better experience and to improve the quality of our services” [47].

Fig. 10. Wearable high-definition video calling and recording attire. Courtesy of Xybernaut 2002.

When people use technology to socialize, they are often doing it on mobile platforms. Therefore, the futures of social and mobile computing are inevitably intertwined. The biggest change that is coming to the shared mobile/social computing space is the final spread of WiFi and high-density mobile phone networks. There are still huge geographical areas where there is no way of wirelessly connecting to the Internet or where the connection is so slow as to be unusable. As high-speed mobile Internet spreads, extra bandwidth could help the problems inherent in communicating without being able to see the other person. High-definition (HD) video calling on mobile phones will make person-to-person communications easier and more context rich (Fig. 10). HD video calling and conferencing will make everything from business meetings to long-distance relationships easier by allowing the participants to pick up on unspoken cues.

As more and more of our social interactions go online, the online world will be forced to adapt to our evolved human social behaviors. It will become much more like offline communication, with reputation and community standing being deeply important. True anonymity will become harder and harder to come by, as the vast majority of social media will require some proof of identity. For example, this practice is already occurring in countries like South Korea [48].

While we cannot predict all the ways in which our online interactions will become more immersive, we can say for certain that they will. The beauty of all of these changes is that it will become as easy to maintain or grow a personal relationship on the other side of the world as it is across town. As countries and regions currently without high-speed data networks come online, they can integrate into a new global community, allowing us all to know one another, with a diverse array of unknown consequences.

C. Wearable Computing

Fig. 11. The prototype GPS Locator for Children with a built-in pager, a request for 911, GPS technology, and a key fob to manually lock and unlock the locator. This specific device is no longer being marketed, despite the apparent need in some contexts. Courtesy of Wherify Wireless Location Services, 2003.

According to Siewiorek [49, p. 82], the first wearable device was prototyped in 1961 but it was not until 1991 that the term “wearable computer” was first used by a research group at Carnegie Mellon University (Pittsburgh, PA). This coincided with the rise of the laptop computer, early models of which were known as “luggables.” Wearable computing can be defined as “anything that can be put on and adds to the user's awareness of his or her environment …mostly this means wearing electronics which have some computational power” [50, p. 2012]. While the term “wearables” is generally used to describe wearable displays and custom computers in the form of necklaces, tiepins, and eyeglasses, the definition has been broadened to incorporate iPads, iPods, personal digital assistants (PDAs), e-wallets, GPS watches (Fig. 11), and other mobile accessories such as smartphones, smart cards, and electronic passports that require the use of belt buckles or clip-on satchels attached to conventional clothing [51, p. 330]. The iPlant (Internet implant) is probably not far off either [52].

Wearable computing has reinvented the way we work and go about our day-to-day business and is set to make even greater changes in the foreseeable future [53]. In 2001, it was predicted that highly mobile professionals would be taking advantage of smart devices to “check messages, finish a presentation, or browse the Web while sitting on the subway or waiting in line at a bank” [54, p. 44]. This vision has indeed been realized but devices like netbooks are still being lugged around instead of worn in the true sense.

The next phase of wearables will be integrated into our very clothing and accessories, with some even pointing to the body itself being used as an input mechanism. Harrison of Carnegie Mellon's Human–Computer Interaction Institute (HCII), together with Microsoft researchers, produced Skinput, which turns the body that travels everywhere with us into one giant touchpad [55]. These are all exciting innovations, and few would deny the positives that will come from the application of this cutting-edge research. The challenge will be to avoid rushing this technology into the marketplace without commensurate testing of prototypes and due consideration of function creep. Function or scope creep occurs when a device or application is used for something other than originally intended.

Early prototypes of wearable computers throughout the 1980s and 1990s could have been described as outlandish, bizarre, or even weird. For the greater part, wearable computing efforts have focused on head-mounted displays (a visual approach) that unnaturally interfered with human vision and made proximity to others cumbersome [56, p. 171]. But the long-term aim of researchers is to make wearable computing inconspicuous as soon as technical improvements allow for it (Fig. 12). The end user should look as “normal” as possible [57, p. 177].

Fig. 12. Self-portraits of Mann with wearable computing kit from the 1980s to the 1990s. Prof. Mann started working on his WearComp invention as far back as his high school days in the 1970s. Courtesy of Steve Mann.

New technologies like the “Looxcie” [58] wearable recorders have come a long way since the clunky point-of-view head-mounted recording devices of the 1980s, allowing people to effortlessly record and share their lives as they experience them in different contexts. Mann has aptly coined the term sousveillance, a type of inverse panopticon, from the French sous (below) and veiller (to watch). A whole body of literature has emerged around the notion of sousveillance, which refers to the recording of an activity by a participant in the activity, typically by way of small wearable or portable personal technologies. The glogger.mobi online platform demonstrates the great power of sousveillance. But there are still serious challenges, such as privacy concerns, that need to be overcome if wearable computing is to become commonplace [59]. Just as Google has created StreetView, can the individual participate in PersonView without the consent of neighbors or strangers [7], despite the public versus private space debate? Connected to privacy is also the critical issue of autonomy (and, if we were to agree with Kant, human dignity), that is, our right to make informed and uncoerced decisions.

While mass-scale commercial production of wearable clothing is still some time away, some even calling it the unfulfilled pledge [60], shirts with simple memory functions have been developed and tested. Sensors will play a big part in the functionality of smartware, helping to determine the environmental context, and undergarments closest to the body will be used to measure body functions such as temperature, blood pressure, and heart and pulse rates. For now, however, the aim is to develop ergonomically astute wearable computing that is actually useful to the end user. Head-mounted displays attached to the head with a headband may be practical for miners carrying out occupational health and safety (OH&S) tasks but are unattractive for everyday consumer users. Displays of the next generation will be mounted on or concealed within eyeglasses themselves [61, p. 48].

Mann [57, p. 31] predicts that wearable computing will become so common one day, interwoven into everyday clothing-based computing, that “we will no doubt feel naked, confused, and lost without a computer screen hovering in front of our eyes to guide us,” just as we would feel our nakedness without the conventional clothing of today.

1. Wearables in the Medical Domain

Unsurprisingly, wearables have also found a niche market in the medical domain. In the mid-1990s, researchers began to describe a small wearable device that continuously monitored glucose levels so that the right amount of insulin could be calculated for the individual, reducing the incidence of hypoglycemic episodes [62]. The Glucoday [63] and GlucoChip [64] are just two products demonstrating the potential to go beyond wearables toward in vivo techniques in medical monitoring.

Medical wearables even have the capability to check and monitor products in one's blood [65, p. 88]. Today medical wearable device applications include: “monitoring of myocardial ischemia, epileptic seizure detection, drowsiness detection …physical therapy feedback, such as for stroke victim rehabilitation, sleep apnea monitoring, long-term monitoring for circadian rhythm analysis of heart rate variability (HRV)” [66, p. 44].

Some of the current shortcomings of medical wearables are similar to those of conventional wearables, namely the size and weight of the device, which can be too large and too heavy. In addition, wearing the devices for long periods of time can be irritating due to the number of sensors that may need to be worn for monitoring. The gel applied for contact resistance between the electrode and the skin can also dry up, which is a nuisance. Other obstacles to the widespread diffusion of medical wearables include government regulations and the manufacturers' requirement for limited liability in the event that an incorrect diagnosis is made by the equipment.

But wearable products have improved much over the past ten years. Owing to commensurate breakthroughs in the miniaturization of computing components, wearable devices are now usually quite small. Consider Toumaz Technology's Digital Plaster invention, known as the Sensium Life Pebble TZ203002 (Fig. 13). The Digital Plaster contains a Sensium silicon chip, powered by a tiny battery, which sends data via a cell phone or a PDA to a central computer database. The Life Pebble has the ability to enable continuous, auditable acquisition of physiological data without interfering with the patient's activities. The device can continuously monitor electrocardiogram (ECG), heart rate, physical activity, and skin temperature. In an interview with M. G. Michael in 2006, Toumazou noted how the Digital Plaster had been applied in epilepsy control and depression. He said that by monitoring the electrical and chemical responses, clinicians could predict the onset of either a depressive episode or an epileptic fit; once the onset was predicted, the nerve could be stimulated to counter the seizure [67]. He added that this truly signified “personal healthcare.”

Fig. 13. Prof. Christofer Toumazou with a patient wearing the “digital plaster”; a tiny electronic device meant to be embedded in ordinary medical plaster that includes sensors for monitoring health-related metadata such as blood pressure, temperature, and glucose levels. Courtesy of Toumaz Technology 2008.

 

D. Robots and Unmanned Aerial Systems and Vehicles

Fig. 14. Predator Drone aircraft: this plane comes in the armed and reconnaissance versions and the models are known as RQ-1 and MQ-1.

Autonomous systems are those which are self-governed. In practice, there are many degrees of autonomy ranging from the highly constrained and supervised to unconstrained and intelligent. Some systems are referred to as “semiautonomous” in order to suggest that the machines are tasked or supervised by a human operator. An unmanned vehicle may be a remotely piloted “dumb” vehicle or an autonomous vehicle (Fig. 14). Robots may be designed to perform repetitive tasks in a highly constrained environment or with intelligence and a high level of autonomy to make judgments in a dynamic and unpredictable environment. As technology advancements allow for a high level of autonomy and expansion from industrial applications to caregiving and warfighting, society is coming to grips with the present and the future of increasingly autonomous systems in our homes, workplaces, and battlefields.

 

Robot ethics, particularly with respect to autonomous weapons systems, has received increasing attention in the last few years [68]. While some call for an outright stop to the development of such technology [69], others seek to shape the technology with ethical and moral implications in mind [6], [70]–[73]. Driving robotics weapons development underground or refusing to engage in dialog over the ethical issues will not give ethicists an opportunity to participate in shaping the design and use of such weapons. Arkin [6] and Operto [74], among others, argue that engineers must not shy away from these ethical challenges. Furthermore, the technological cat is out of the bag: “Autonomy is subtle in its development—it is occurring in a step-by-step process, rather than through the creation of a disruptive invention. It is far less likely that we will have a sudden development of a ‘positronic brain’ or its equivalent, but rather a continual and gradual relinquishment of authority to machines through the constant progress of science, as we have already seen in automated trains, elevators, and numerous other examples, that have vanished into the background noise of civilization. Autonomy is already here by some definitions” [70].

The evolution of the development and deployment of unmanned aerial vehicles and other autonomous or semiautonomous systems has outpaced the analysis of social implications and ethics of their design and use [70], [75]. Sullivan argues that the evolution of unmanned vehicles for military deployment should not be confused with the more general trend of increasing autonomy in military applications [75]. Use of robots often provides a tactical advantage due to sensors, data processing, and physical characteristics that outperform humans. Robots can act without emotion, bias, or self-preservation influencing judgment, which may be a liability or advantage. Risks to robot deployment in the military, healthcare industry, and elsewhere include trust of autonomous systems (a lack of, or too much) and diffusion of blame or moral buffering [6], [72].

For such critical applications in the healthcare domain, and lethal applications in weapons, the emotional and physical distance of operating a remote system (e.g., drone strikes via a video-game style interface) may negatively influence the moral decision making of the human operator or supervisor, while also providing some benefit of emotional protection against post-traumatic stress disorder [71], [72]. Human–computer interfaces can promote ethical choices in the human operator through thoughtful or model-based design, as suggested by Cummings [71] and Asaro [72].

For the ethical behavior of the autonomous system itself, Arkin proposes that robot soldiers could be more humane than humans if technologically constrained to the laws of war and rules of engagement, which they could follow without the distortions of emotion, bias, or a sense of self-preservation [6], [70]. Asaro argues that such laws are not, in fact, objective and static but rather meant for human interpretation in each case, and therefore could not be implemented in an automated system [72]. More broadly, Operto [74] agrees that a robot (in any application) can only act within the ethics incorporated into its laws, but that a learning robot, in particular, may not behave as its designers anticipate.

Fig. 15. Kotaro, a humanoid robot created at the University of Tokyo (Tokyo, Japan), presented at the University of Arts and Industrial Design Linz (Linz, Austria) during the Ars Electronica Festival 2008. Courtesy of Manfred Werner-Tsui.

Robot ethics is just one part of the landscape of social implications for autonomous systems. The field of human–robot interaction explores how robot interfaces and socially adaptive robots influence the social acceptance, usability, and safety of robots [76] (Fig. 15). For example, robots used for social assistance and care, such as for the elderly and small children, introduce a host of new social implications questions. Risks of developing an unhealthy attachment or loss of human social contact are among the concerns raised by Sharkey and Sharkey [77]. Interface design can influence these and other risks of socially assistive robots, such as a dangerous misperception of the robot's capabilities or a compromise of privacy [78].

 

Autonomous and unmanned systems have related social implication challenges. Clear accountability and enforcing morality are two common themes in the ethical design and deployment of such systems. These themes are not unique to autonomous and unmanned systems, but perhaps the science fiction view of robots run amok raises the question “how can we engineer a future where we can benefit from these technologies while maintaining our humanity?”

 

SECTION IV. The Future

Great strides are being taken in the field of biomedical engineering: the application of engineering principles and techniques to the medical field [79]. New technologies such as prospective applications of nanotechnology, microcircuitry (e.g., implantables), and bionics will heal and give hope to many who are suffering from life-debilitating and life-threatening diseases [80]. The lame will walk again. The blind will see just as the deaf have heard. The dumb will sing. Even bionic tongues are on the drawing board. Hearts and kidneys and other organs will be built anew. The fundamental point is that society at large should be able to distinguish between positive and negative applications of technological advancements before we diffuse and integrate such innovations into our day-to-day existence.

The Bionics Institute [81], for instance, is future-focused on the possibilities of bionic hearing, bionic vision, and neurobionics, stating: “Medical bionics is not just a new frontier of medical science, it is revolutionizing what is and isn't possible. Where once there was deafness, there is now the bionic ear. And where there was blindness, there may be a bionic eye.” The Institute reaffirms its commitment to continuing innovative research and leading the way on the proposed “world-changing revolution.”

A. Cochlear Implants—Helping the Deaf to Hear

Fig. 16. Cochlear's Nucleus Freedom implant with Contour Advance electrode which is impervious to magnetic fields up to 1.5 Tesla. Courtesy of Cochlear Australia.

In 2000, more than 32 000 people worldwide already had cochlear implants [82], thanks to the global efforts of people such as Australian Professor Graeme Clark, the founder of Cochlear, Inc. [83]. Clark performed his first transplant in Rod Saunders's left ear at the Royal Eye and Ear Hospital in Melbourne, Australia, on August 1, 1978, when “he placed a box of electronics under Saunders's skin and a bundle of electrodes in his inner ear” [84]. In 2006, that number had grown to about 77 500 for the Nucleus implant (Fig. 16) alone, which had about 70% of the market share [85]. Today, there are over 110 000 cochlear implant recipients, with about 30 000 added annually, and their personal stories are testament enough to the ways in which new technologies can change lives dramatically for the better [86]. Cochlear implants can restore hearing to people who have severe hearing loss, a form of diagnosed deafness. Unlike a standard hearing aid, which works like an amplifier, the cochlear implant picks up sound with a microphone and changes it into electronic signals. These signals are sent to the microchip implant via radio frequency (RF), stimulating nerve fibers in the inner ear. The brain then interprets the signals transmitted via the nerves as sound.
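The signal chain just described, sound captured, split into frequency bands, and mapped to per-electrode stimulation levels, can be illustrated with a toy filter-bank sketch. This is not Cochlear's actual processing strategy; the Goertzel-style band probe, the band edges, and the 0–255 current scale are all illustrative assumptions.

```python
import math

def band_energies(samples, sample_rate, bands):
    """Estimate the energy in each frequency band with a Goertzel-style
    probe at the band's center frequency -- a crude stand-in for the
    implant processor's filter bank."""
    energies = []
    n = len(samples)
    for lo, hi in bands:
        f = (lo + hi) / 2.0                      # probe the band center
        coeff = 2.0 * math.cos(2.0 * math.pi * f / sample_rate)
        s_prev = s_prev2 = 0.0
        for x in samples:
            s = x + coeff * s_prev - s_prev2
            s_prev2, s_prev = s_prev, s
        power = s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2
        energies.append(power / n)
    return energies

def to_stimulation_levels(energies, max_level=255):
    """Compress band energies into per-electrode current levels (0-255),
    mimicking the map from sound envelope to electrode stimulus."""
    peak = max(energies) or 1.0
    return [round(max_level * e / peak) for e in energies]
```

Feeding a pure 1 kHz tone through three such bands drives mainly the middle electrode, which is the tonotopic principle the real device exploits.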

 

Today, cochlear implants (which are also commonly known as bionic ears) are being used to overcome deafness; tomorrow, they may be open to the wider public as a performance-enhancing technique [87, pp. 10–11]. Audiologist Steve Otto of the Auditory Brainstem Implant Project at the House Ear Institute (Los Angeles, CA) predicts that one day “implantable devices [will] interface microscopically with parts of the normal system that are still physiologically functional” [88]. He is quoted as saying that this may equate to “ESP for everyone.” Otto's prediction that implants will one day be used by persons who do not require them for remedial purposes has been supported by numerous other high profile scientists. A major question is whether this is the ultimate trajectory of these technologies.

For Christofer Toumazou, however, Executive Director of the Institute of Biomedical Engineering, Imperial College London (London, U.K.), there is a clear distinction between repairing human functions and creating a “Superman.” He said that “trying to give someone that can hear super hearing is not fine.” For Toumazou, the basic ethical paradigm should be that we hope to repair the human and not recreate the human [67].

B. Retina Implants—On a Mission to Help the Blind to See

Fig. 17. Visual cortical implant designed by Prof. Mohamad Sawan, a researcher at Polystim Neurotechnologies Laboratory at the Ecole Polytechnique de Montreal (Montreal, QC, Canada). The basic principle of Prof. Sawan's technology consists of stimulating the visual cortex by implanting a silicon microchip on a network of electrodes, made of biocompatible materials, wherein each electrode injects a stimulating electrical current in order to provoke a series of luminous points to appear (an array of pixels) in the field of vision of the blind person. This system is composed of two distinct parts: the implant and an external controller. Courtesy of Mohamad Sawan 2009, made available under Creative Commons License.

The hope is that retina implants will be as successful as cochlear implants in the future [89]. Just as cochlear implants cannot be used for persons suffering from complete deafness, retina implants are not a solution for totally blind persons but rather those suffering from aged macular degeneration (AMD) and retinitis pigmentosa (RP). Retina implants have brought together medical researchers, electronic specialists, and software designers to develop a system that can be implanted inside the eye [90]. A typical retina implant procedure is as follows: “[s]urgeons make a pinpoint opening in the retina to inject fluid in order to lift a portion of the retina from the back of the eye, creating a pocket to accommodate the chip. The retina is resealed over the chip, and doctors inject air into the middle of the eye to force the retina back over the device and close the incisions” [91] (Fig. 17).

 

Brothers Alan and Vincent Chow, one an engineer, the other an ophthalmologist, developed the artificial silicon retina (ASR) and began the company Optobionics Corporation in 1990. This was a marriage between biology and engineering: “In landmark surgeries at the University of Illinois at Chicago Medical Center …the first artificial retinas made from silicon chips were implanted in the eyes of two blind patients who have lost almost all of their vision because of retinal disease.” In 1993, Branwyn [92, p. 3] reported that a team at the National Institutes of Health (NIH) led by Dr. Hambrecht, implanted a 38-electrode array into a blind female's brain. It was reported that she saw simple light patterns and was able to make out crude letters. The following year the same procedure was conducted by another group on a blind male resulting in the man seeing a black dot with a yellow ring around it. Rizzo of Harvard Medical School's Massachusetts Eye and Ear Infirmary (Boston, MA) has cautioned that it is better to talk down the possibilities of the retina implant so as not to give false hopes. The professor himself had expressed that they are dealing with “science fiction stuff” and that there are no long-term guarantees that the technology will ever fully restore sight, although significant progress is being made by a number of research institutes [93, p. 5].

Among these pioneers are researchers at The Johns Hopkins University Medical Center (Baltimore, MD). Brooks [94, p. 4] describes how the retina chip developed by the medical center will work: “a kind of miniature digital camera…is placed on the surface of the retina. The camera relays information about the light that hits it to a microchip implanted nearby. This chip then delivers a signal that is fed back to the retina, giving it a big kick that stimulates it into action. Then, as normal, a signal goes down the optic nerve and sight is at least partially restored.” In 2009, at the age of 56, Barbara Campbell had an array of electrodes implanted in each eye [95] and while her sight is nowhere near fully restored, she is able to make out shapes and see shades of light and dark. Experts believe that this approach is still more realistic in restoring sight to those suffering from particular types of blindness, even more than stem cell therapy, gene therapy, or eye transplants [96] where the risks still outweigh the advantages.

C. Tapping Into the Heart and Brain

Fig. 18. An artificial pacemaker from St. Jude Medical (St. Paul, MN), with electrode 2007. Courtesy of Steven Fruitsmaak.

If it was possible as far back as 1958 to successfully implant two transistors the size of an ice hockey puck in the heart of a 43-year-old man [97], then the things that will become possible by 2020 are constrained by the imagination as much as by technological limitations. Heart pacemakers (Fig. 18) are still being further developed today, but for the greater part, researchers are turning their attention to the possibilities of brain pacemakers. In the foreseeable future brain implants may help sufferers of Parkinson's, paralysis, nervous system problems, speech-impaired persons, and even cancer patients. The research is still in its formative years and the obstacles are great because of the complexity of the brain; but scientists are hopeful of major breakthroughs in the next 20 years.

 

The brain pacemaker endeavors are bringing together people from a variety of disciplines, headed mainly by neurosurgeons. By using brain implants electrical pulses can be sent directly to nerves via electrodes. The signals can be used to interrupt incoherent messages to nerves that cause uncontrollable movements or tremors. By tapping into the right nerves in the brain, particular reactions can be achieved. Using a technique that was discovered almost accidentally in France in 1987, the following extract describes the procedure of “tapping into” the brain: “Rezai and a team of functional neurosurgeons, neurologists and nurses at the Cleveland Clinic Foundation in Ohio had spent the next few hours electronically eavesdropping on single cells in Joan's brain attempting to pinpoint the precise trouble spot that caused a persistent, uncontrollable tremor in her right hand. Once confident they had found the spot, the doctors had guided the electrode itself deep into her brain, into a small duchy of nerve cells within the thalamus. The hope was that when sent an electrical current to the electrode, in a technique known as deep-brain stimulation, her tremor would diminish, and perhaps disappear altogether” [98]. Companies such as Medtronic Incorporated of Minnesota (Minneapolis, MN) now specialize in brain pacemakers [98]. Medtronic's Activa implant has been designed specifically for sufferers of Parkinson's disease [93].

More recently, there has been some success with ameliorating epileptic attacks through closed-loop technology, also known as smart stimulation. The implant devices can detect an onset of epileptiform activity through a demand-driven process. This means that the battery power in the active implant lasts longer because of increased efficiency, i.e., it is not always stimulating in anticipation of an attack, and adverse effects of having to remove and install new implants more frequently are forgone [99]. Similarly, it has been said that technology such as deep brain stimulation, which has physicians implant electrodes in the brain and electrical pacemakers in the patient's clavicle for Parkinson's Disease, may well be used to overcome problems with severely depressed persons [100].
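The battery argument above is easy to see in a toy model: stimulate only in windows where a simple energy detector flags epileptiform-like activity, rather than stimulating continuously in anticipation of an attack. The window length, threshold, and energy measure here are illustrative assumptions, not the detection algorithm of any real implant.

```python
def detect_onset(eeg, window=8, threshold=5.0):
    """Flag fixed-size windows whose mean absolute amplitude exceeds a
    threshold -- a toy stand-in for epileptiform-onset detection."""
    flags = []
    for i in range(0, len(eeg) - window + 1, window):
        segment = eeg[i:i + window]
        energy = sum(abs(x) for x in segment) / window
        flags.append(energy > threshold)
    return flags

def pulses_delivered(eeg, window=8, threshold=5.0):
    """Count stimulation pulses under demand-driven operation: one pulse
    per flagged window instead of continuous stimulation, which is where
    the battery saving of closed-loop 'smart stimulation' comes from."""
    return sum(detect_onset(eeg, window, threshold))
```

On a quiet recording the device delivers no pulses at all; only the windows containing a high-amplitude burst trigger stimulation.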

Currently, the technology is being used to treat thousands of people who are severely depressed or suffering from obsessive compulsive disorder (OCD) who have been unable to respond to other forms of treatment such as cognitive behavioral therapy (CBT) [101]. It is estimated that 10% of people suffering from depression do not respond to conventional methods. Although hard figures are difficult to obtain, several thousands of depressed persons worldwide have had brain pacemakers installed that have software which can be updated wirelessly and remotely. The trials have been based on decades of research by Prof. Helen Mayberg, from Emory University School of Medicine (Atlanta, GA), who first began studying the use of subcallosal cingulate gyrus deep brain stimulation (SCG DBS) for depression in 1990.

In her research, Mayberg has used a device that is no larger than a matchbox with a battery-powered generator that sits in the chest and produces electric currents. The currents are sent to an area deep in the brain via tiny wires which are channeled under the skin on either side of the neck. Surprisingly, the procedure to have this type of implant installed requires only local anesthetic and is an outpatient procedure. In 2005, Mayberg told a meeting at the Science Media Centre in London: “This is a very new way to think about the nature of depression …We are not just exciting the brain, we are using electricity to retune and remodulate…We can interrupt or switch off an abnormally functioning circuit” [102].

Ongoing trials today continue to show promising results. The outcome of a 20-patient clinical trial of persons with depression treated with SCG DBS published in 2011, showed that: “At 1 year, 11 (55%) responded to surgery with a greater than 50% reduction in 17-item Hamilton Depression Scale scores. Seven patients (35%) achieved or were within 1 point of achieving remission (scores < 8). Of note, patients who responded to surgery had a significant improvement in mood, anxiety, sleep, and somatic complains related to the disease. Also important was the safety of the procedure, with no serious permanent adverse effects or changes in neuropsychological profile recorded” [103].

Despite the early signs that these procedures may offer long-term solutions for hundreds of thousands of people, some research scientists believe that tapping into the human brain is a long shot. The brain is commonly understood to be “wetware,” and plugging hardware into this “wetware” would seem to be a type mismatch, at least according to Steve Potter, a senior research fellow in biology working at the California Institute of Technology's Biological Imaging Center (Pasadena, CA). Instead, Potter is pursuing the cranial route as a “digital gateway to the brain” [88]. Others believe that it is impossible to figure out exactly what all the millions of neurons in the brain actually do. Whether or not we eventually succeed in “reverse-engineering” the human brain, the topic of implants for both therapeutic and enhancement purposes has aroused significant controversy in the past, and promises to do so even more in the future.

D. Attempting to Overcome Paralysis

In more speculative research, surgeons believe that brain implants may be a solution for persons who are suffering from paralysis, such as spinal cord damage. In these instances, the nerves in the legs are still theoretically “working”; it is just that they cannot make contact with the brain which controls their movement. If somehow signals could be sent to the brain, bypassing the lesion point, it could conceivably mean that paralyzed persons regain at least part of their capability to move [104]. In 2000, Reuters [105] reported that a paralyzed Frenchman (Marc Merger) “took his first steps in 10 years after a revolutionary operation to restore nerve functions using a microchip implant…Merger walks by pressing buttons on a walking frame which acts as a remote control for the chip, sending impulses through fine wires to stimulate legs muscles…” It should be noted, however, that the system only works for paraplegics whose muscles remain alive despite damage to the nerves. Yet there are promising devices like the Bion that may one day be able to control muscle movement using RF commands [106]. Brooks [94] reports that researchers at the University of Illinois in Chicago (Chicago, IL) have “invented a microcomputer system that sends pulses to a patient's legs, causing the muscles to contract. Using a walker for balance, people paralyzed from the waist down can stand up from a sitting position and walk short distances…Another team, based in Europe…enabled a paraplegic to walk using a chip connected to fine wires in his legs.” These techniques are known as functional neuromuscular stimulation systems [107]. In the case of American Rob Summers, who became a paraplegic after an accident, doctors implanted an epidural stimulator and electrodes into his spinal cord. “The currents mimic those normally sent by the brain to initiate movement” [108].

Others working to help paraplegics to walk again have invested time in military technology like exoskeletons [109] meant to aid soldiers in lifting greater weights, and also to protect them during battle. Ekso Bionics (Berkeley, CA), formerly Berkeley Bionics, has been conducting trials of an electronic suit in the United States since 2010. The current Ekso model will be fully independent and powered by artificial intelligence in 2012. The Ekso “provides nearly four hours of battery power to its electronic legs, which replicate walking by bending the user's knees and lifting their legs with what the company claims is the most natural gait available today” [110]. This is yet another example of how military technology has been commercialized toward a health solution [111].

E. Granting a Voice to the Speech Impaired

Speech-impairment microchip implants work differently from cochlear and retina implants. Whereas in the latter two, hearing and sight are restored, in implants for speech impairment the voice is not restored, but an outlet for communication is created, possibly with the aid of a voice synthesizer. At Emory University, neurosurgeon Roy E. Bakay and neuroscientist Phillip R. Kennedy were responsible for critical breakthroughs early in the research. In 1998, Versweyveld [112] reported two successful implants of a neurotrophic electrode into the brains of a woman and a man who were suffering from amyotrophic lateral sclerosis (ALS) and brainstem stroke, respectively. In an incredible process, Bakay and Kennedy's device uses the patient's brain processes—thoughts, if you will—to move a cursor on a computer screen. “The computer chip is directly connected with the cortical nerve cells…The neural signals are transmitted to a receiver and connected to the computer in order to drive the cursor” [112]. This procedure has major implications for brain–computer interfaces (BCIs), especially bionics. Bakay predicted that by 2010 prosthetic devices would grant patients who are immobile the ability to turn on the TV just by thinking about it, and by 2030 would grant severely disabled persons the ability to walk independently [112], [113].
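The cursor-driving idea, neural signals in, cursor motion out, can be sketched as a minimal linear decoder. The labeled rate channels, gains, and update rule below are hypothetical illustrations; Bakay and Kennedy's device decoded raw cortical signals, not neatly named channels like these.

```python
def decode_velocity(rates, gains):
    """Turn firing rates (spikes/s) from four notional channels into a
    2-D cursor velocity with a simple linear decoder. Channel names and
    gains are hypothetical, not the Bakay/Kennedy parameters."""
    vx = gains["x"] * (rates["right"] - rates["left"])
    vy = gains["y"] * (rates["up"] - rates["down"])
    return vx, vy

def move_cursor(position, rates, gains, dt=0.05):
    """Advance the on-screen cursor by one time step of dt seconds."""
    vx, vy = decode_velocity(rates, gains)
    return (position[0] + vx * dt, position[1] + vy * dt)
```

Run in a loop at 20 Hz, an imbalance between opposing channels steadily drags the cursor across the screen, which is all a patient needs to select letters or switch on the TV.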

F. Biochips for Diagnosis and Smart Pills for Drug Delivery

It is not unlikely that biochips will be implanted in people at birth in the not too distant future. “They will make individual patients aware of any pre-disposition to susceptibility” [114]. That is, biochips will be used for point-of-care diagnostics and also for the identification of needed drugs, even to detect pandemic viruses and biothreats for national security purposes [115]. Biosensors “represent the technological counterpart of our sense organs, coupling the recognition by a biological recognition element with a chemical or physical transducer, transferring the signal to the electrical domain” [116]. Types of biosensors include enzymes, antibodies, receptors, nucleic acids, cells (using a biochip configuration), biomimetic sequences of RNA (ribonucleic acid) or DNA (deoxyribonucleic acid), and molecularly imprinted polymers (MIPs). Biochips, on the other hand, “automate highly repetitive laboratory tasks by replacing cumbersome equipment with miniaturized, microfluidic assay chemistries combined with ultrasensitive detection methodologies. They achieve this at significantly lower costs per assay than traditional methods—and in a significantly smaller amount of space. At present, applications are primarily focused on the analysis of genetic material for defects or sequence variations” [117].
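The recognition-element-plus-transducer coupling described above amounts, in the simplest case, to a calibration curve: binding events produce an electrical signal, which is inverted to estimate concentration and make a diagnostic call. A minimal linear sketch, with illustrative sensitivity, baseline, and detection-limit values:

```python
def transduce(concentration, sensitivity=0.8, baseline=0.05):
    """Linear calibration: the recognition element binds the analyte and
    the transducer reports a voltage proportional to concentration.
    Sensitivity, baseline, and units are illustrative."""
    return baseline + sensitivity * concentration

def diagnose(voltage, sensitivity=0.8, baseline=0.05, detection_limit=0.1):
    """Invert the calibration to estimate concentration, and flag a
    positive result when the estimate clears the assay's detection limit."""
    estimated = (voltage - baseline) / sensitivity
    return estimated, estimated >= detection_limit
```

Real assays are rarely this linear, but the same two steps, transduce then invert against a calibration, underlie point-of-care readouts.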

With respect to the treatment of illness, drug delivery will not require patients to swallow pills or take routine injections; instead, chemicals will be stored on a microprocessor and released as prescribed. The idea is known as “pharmacy-on-a-chip” and originated with scientists at the Massachusetts Institute of Technology (MIT, Cambridge, MA) in 1999 [118]. The following extract is from The Lab [119]: “Doctors prescribing complicated courses of drugs may soon be able to implant microchips into patients to deliver timed drug doses directly into their bodies.”

Microchips being developed at Ohio State University (OSU, Columbus, OH) can be swathed with chemical substances such as pain medication, insulin, different treatments for heart disease, or gene therapies, allowing physicians to work at a more detailed level [119]. The breakthroughs have major implications for diabetics, especially those who require insulin at regular intervals throughout the day. Researchers at the University of Delaware (Newark, DE) are working on “smart” implantable insulin pumps that may relieve people with Type I diabetes [120]. The delivery would be based on a mathematical model stored on a microchip and working in connection with glucose sensors that would instruct the chip when to release the insulin. The goal is for the model to be able to simulate the activity of the pancreas so that the right dosage is delivered at the right time.
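The sensor-plus-model loop described for the Delaware pump can be caricatured as a simple control rule: dose in proportion to how far glucose sits above target and how fast it is rising. The proportional-derivative form, gains, and units below are illustrative assumptions, not a validated pancreas model.

```python
def insulin_dose(glucose, prev_glucose, target=100.0,
                 kp=0.02, kd=0.05, dt=5.0):
    """Proportional-derivative dosing rule: the dose grows with how far
    glucose (mg/dL) sits above target and how fast it is climbing over
    the dt-minute sensor interval. Gains and units are illustrative."""
    error = glucose - target                 # distance above target
    trend = (glucose - prev_glucose) / dt    # rate of climb
    dose = kp * error + kd * trend
    return max(dose, 0.0)                    # a pump can only deliver insulin
```

A pancreas-simulating model would replace this rule with differential equations of glucose-insulin dynamics, but the closed loop, sense, compute, release, is the same.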

Fig. 19. The VeriChip microchip, the first microchip implant to be cleared by the U.S. Food and Drug Administration (FDA) for humans, is a passive microchip that contains a 16-digit number, which can be used to retrieve critical medical information on a patient from a secure online database. The company that owns the VeriChip technology is developing a microscopic glucose sensor to put on the end of the chip to eliminate a diabetic's need to draw blood to get a blood glucose reading. Courtesy of PositiveID Corporation.

Beyond insulin pumps, we are now nearing a time where automated closed-loop insulin detection (Fig. 19) and delivery will become a tangible treatment option and may serve as a temporary cure for Type 1 diabetes until stem cell therapy becomes available. “Closed-loop insulin delivery may revolutionize not only the way diabetes is managed but also patients' perceptions of living with diabetes, by reducing the burden on patients and caregivers, and their fears of complications related to diabetes, including those associated with low and high glucose levels” [121]. It is only a matter of time before these lab-centric results are replicated in real-life conditions in sufferers of Type 1 diabetes.
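The control principle described here, sensing glucose and computing a release from a stored model, can be sketched as a simple proportional controller. The gain, target, and safety cap below are illustrative placeholders; real artificial-pancreas algorithms use far richer models, such as model-predictive control:

```python
def insulin_dose(glucose_mgdl, target_mgdl=110.0,
                 gain_units_per_mgdl=0.02, max_units=5.0):
    """Toy proportional controller for closed-loop insulin delivery:
    release insulin in proportion to how far the sensed glucose level
    sits above target, capped at a safety maximum. All parameter
    values are illustrative, not clinical recommendations."""
    error = glucose_mgdl - target_mgdl
    if error <= 0:
        return 0.0                           # never dose at or below target
    return min(max_units, gain_units_per_mgdl * error)

assert insulin_dose(110.0) == 0.0            # at target: no release
assert insulin_dose(210.0) == 2.0            # 100 mg/dL above target
assert insulin_dose(500.0) == 5.0            # clipped at the safety maximum
```

The safety cap illustrates why such controllers must be conservative: an unbounded response to a faulty sensor reading would be dangerous in a device the patient cannot easily switch off.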

G. To Implant or Not to Implant, That Is the Question

There are potentially 500 000 hearing-impaired persons who could benefit from cochlear implants [122] but not every deaf person wants one [123]. “Some deaf activists…are critical of parents who subject children to such surgery [cochlear implants] because, as one charged, the prosthesis imparts ‘the non-healthy self-concept of having had something wrong with one's body’ rather than the ‘healthy self-concept of [being] a proud Deaf’” [124]. Assistant Professor Scott Bally of Audiology at Gallaudet University (Washington, DC) has said, “Many deaf people feel as though deafness is not a handicap. They are culturally deaf individuals who have successfully adapted themselves to being deaf and feel as though things like cochlear implants would take them out of their deaf culture, a culture which provides a significant degree of support” [92]. Putting this delicate debate aside, it is here that some delineation can be made between implants that are used to treat an ailment or disability (i.e., giving sight to the blind and hearing to the deaf), and implants that may be used for enhancing human function (i.e., memory). There are some citizens, like Amal Graafstra of the United States [125], who are getting chip implants for convenience-oriented social living solutions that would instantly usher in a world of keyless entry everywhere (Fig. 20). And there are other citizens who are concerned about the direction of the human species, as credible scientists predict fully functional neural implants. “[Q]uestions are raised as to how society as a whole will relate to people walking around with plugs and wires sprouting out of their heads. And who will decide which segments of the society become the wire-heads” [92]?

Fig. 20. Amal Graafstra demonstrating an RFID-operated door latch application he developed. Over the RFID tag site on his left hand is a single steristrip that remained after implantation for a few days. His right hand is holding the door latch.
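At its core, the convenience application shown in Fig. 20 is an identifier lookup: the passive tag carries only an ID, and all of the authorization logic lives outside the body. A minimal sketch, with hypothetical tag IDs:

```python
# Tag IDs enrolled for this door (illustrative values, not real tag data).
AUTHORIZED_TAGS = {"0x0A5F31C2"}

def unlock_latch():
    """Stand-in for the actuator that releases the door latch."""
    print("latch released")

def on_tag_read(tag_id: str) -> bool:
    """Called by the reader when a passive RFID tag enters its field.
    Because the tag stores nothing but an ID, access can be granted
    or revoked in the back-end without touching the implant itself."""
    if tag_id in AUTHORIZED_TAGS:
        unlock_latch()
        return True
    return False

assert on_tag_read("0x0A5F31C2") is True   # enrolled implant opens the door
assert on_tag_read("0xDEADBEEF") is False  # unknown tag is refused
```

This separation of identifier and authorization is also what makes such systems vulnerable: anyone who can clone or replay the ID inherits the access rights attached to it.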

SECTION V. Überveillance and Function Creep

Section IV focused on implants that were attempts at “orthopedic replacements”: corrective in nature, required to repair a function that is either lying dormant or has failed altogether. Implants of the future, however, will attempt to add new “functionality” to native human capabilities, either through extensions or additions. Globally acclaimed scientists have pondered on the ultimate trajectory of microchip implants [126]. The literature is admittedly mixed in its viewpoints of what will and will not be possible in the future [127].

For those of us working in the domain of implantables for medical and nonmedical applications, the message is loud and clear: implantables will be the next big thing. At first, it will be “hip to get a chip.” The extreme novelty of the microchip implant will mean that early adopters will race to see how far they can push the limits of the new technology. Convenience solutions will abound [128]. Implantees will not be able to get enough of the new product, and the benefits of the technology will be touted to consumers in a myriad of ways, although these perceived benefits will not always be realized. The technology will probably first be tested where there will be the least effective resistance from the community at large, that is, on prison inmates [129], and then on those suffering from dementia. These incremental steps in pilot trials and deployment are fraught with moral consequences. Prisoners cannot opt out when jails adopt tracking technology, and those suffering from cognitive disorders have not provided, and could not provide, their consent. From there, it will conceivably not take long for the technology to be used on the elderly, on children, and on those suffering from clinical depression.

The functionality of the implants will range from passive ID-only to active multiapplication; the most invasive will be medical devices that can, on request or through algorithmic reasoning, release drugs or electrically stimulate the body for mental and physical stability. There will also be a segment of the consumer and business markets that will adopt the technology for no clear reason and without too much thought, save for the fact that the technology is new and seems to be the way advanced societies are heading. This segment will probably not be overly concerned with any discernible abridgement of their human rights or the fine-print “terms and conditions” agreement they have signed, but will take an implant on the promise that they will have greater connectivity to the Internet, for example. These consumers will thrive on ambient intelligence, context-aware pervasive applications, and an augmented reality—ubiquity in every sense.

But it is certain that the new technology will also have consequences far greater than what we can presently envision. Questions about the neutrality of technology are immaterial in this new “plugged-in” order of existence. For Brin [130, p. 334], the question ultimately has to do with the choice between privacy and freedom. In his words, “[t]his is one of the most vile dichotomies of all. And yet, in struggling to maintain some beloved fantasies about the former, we might willingly, even eagerly, cast the latter away.” And thus there are two possibilities, just as Brin [130] writes in his amazingly insightful book, The Transparent Society, of “the tale of two cities.” Either implants embedded in humans, which require associated infrastructure, will create a utopia where there is built-in intelligence for everything and everyone in every place, or implants embedded in humans will create a dystopia that is destructive and diminishes one's freedom of choice, individuality, and finally that indefinable essence which is at the core of making one feel “human.” A third possibility, the middle way between these two alternatives, would seem unlikely, except for the “off the grid” dissenter.

In Section V-A, we portray some of the attractions people may feel that will draw them into the future world of implanted technologies. In Section V-B, we portray some of the problems associated with implanting technology under the skin that would drive people away from opting in to such a future.

A. The Positive Possibilities

Bearing a unique implant will make the individual feel special because they bear a unique ID. Each person will have one implant which will coordinate hundreds of smaller nanodevices, but each nanodevice will have the capacity to act on its own accord. The philosophy espoused behind taking an implant will be one of protection: “I bear an implant and I have nothing to hide.” It will feel safe to have an implant because emergency services, for example, will be able to rapidly respond to your calls for help or any unforeseen events that automatically log problems to do with your health.

Fewer errors are also likely to happen if you have an implant, especially with financial systems. Businesses will experience a rise in productivity as they will understand how precisely their business operates to the nearest minute, and companies will be able to introduce significant efficiencies. Losses in back-end operations, such as the effects of product shrinkage, will diminish as goods will be followed down the supply chain from their source to their destination customer, through the distribution center and retailer.

It will take some years for the infrastructure supporting implants to grow and thrive with a substantial consumer base. The function creep will not become apparent until well after the early majority have adopted implants and downloaded and used a number of core applications to do with health, banking, and transport which will all be interlinked. New innovations will allow for a hybrid device and supplementary infrastructure to grow so powerful that living without automated tracking, location finding, and condition monitoring will be almost impossible.

B. The Existential Risks

It will take some years for the negative fallout from microchip implants to be exposed. At first only the victims of the fallout will speak out through formal exception reports on government agency websites. The technical problems associated with implants will pertain to maintenance, updates, viruses, cloning, hacking, radiation shielding, and onboard battery problems. But the greater problems will be the impact on the physiology and mental health of the individual: new manifestations of paranoia and severe depression will lead to people continually wanting reassurance about their implant's functionality. Issues about implant security, virus detection, and a personal database which is error free will be among the biggest issues facing implantees. Despite this, those who believe in the implant singularity (the piece of embedded technology that will give each person ubiquitous access to the Internet) will continue to stack up points and rewards and add to their social network, choosing rather to ignore the warnings of the ultimate technological trajectory of mind control and geoslavery [131]. It will have little to do with survival of the fittest at this point, although most people will buy into the notion of an evolutionary path toward the Homo Electricus [132]: a transhumanist vision [133] that we can do away with the body and become one with the Machine, one with the Cosmos—a “nuts and bolts” Nirvana where one's manufactured individual consciousness connects with the advanced consciousness evolving from the system as a whole. In this instance, it will be the ecstatic experience of being drawn ever deeper into the electric field of the “Network.”

Some of the more advanced implants will be able to capture and validate location-based data, alongside recordings (visual and audio capture). The ability to conduct überveillance via the implant will be linked to a type of blackbox recorder as in an airplane's cockpit. Only in this case the cockpit will be the body, and the recorder will be embedded just beneath the translucent layer of the skin that will be used for memory recollection and dispute resolution. Outwardly ensuring that people are telling the full story at all times, there will be no lies or claims to poor memory. Überveillance is an above and beyond, an exaggerated, an omnipresent 24/7 electronic surveillance (Fig. 21). It is a surveillance that is not only “always on” but “always with you.” It is ubiquitous because the technology that facilitates it, in its ultimate implementation, is embedded within the human body. The problem with this kind of bodily invasive surveillance is that omnipresence in the “material” world will not always equate with omniscience, hence the real concern for misinformation, misinterpretation, and information manipulation [7]. While it might seem like the perfect technology to aid in real-time forensic profiling and criminalization, it will be open to abuse, just like any other technique, and more so because of the preconception that it is infallible.

Fig. 21. The überveillance triquetra as the intersection of surveillance, dataveillance, and sousveillance. Courtesy of Alexander Hayes.

SECTION VI. Technology Roadmapping

According to Andrews cited in [1], a second intellectual current within the IEEE SSIT has begun to emerge which is more closely aligned with most of the IEEE technical societies, as well as economics and business. The proponents of this mode participate in “technology foresight” and “roadmapping” activities, and view technology more optimistically, looking to foster innovation without being too concerned about its possible negative effects [1, p. 14]. Braun [134, p. 133] writes that “[f]orecasts do not state what the future will be…they attempt to glean what it might be.” Thus, one with technology foresight can be trusted insofar as their knowledge and judgment go—they may possess foresight through their grasp of current knowledge, through past experiences which inform their forecasts, and through raw intuition.

Various MIT Labs, such as the Media Lab, have been engaged in visionary research since before 1990, giving society a good glimpse of where technology might be headed some 20–30 years ahead of time. It is from such elite groups that visionaries typically emerge whose main purpose is to envision the technologies that will better our wellbeing and generally make life more productive and convenient in the future. Consider the current activities of the MIT Media Lab's Affective Computing Research Group directed by Prof. Rosalind W. Picard that is working hard on technology aids encapsulating “affect sensing” in response to the growing problem of autism [135]. The Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology. The work of Picard's group was made possible by the foundations laid by the Media Lab's predecessor researchers.

On the global technological roadmap we can now point to the following systems which are already under development but have not yet been widely diffused into the market:

  • alternative fuels heralding innovations like self-driving electric cars and ocean-powered energy, as well as the rise of biofuels;

  • the potential for 3-D printing which will revolutionize prototyping and manufacturing practices and possibly reconstruct human tissue;

  • hologram projections for videoconferencing and televisions that respond to gestures as well as pen-sized computing which will do away with keyboards and screens;

  • quantum computing and cryptography;

  • next-generation prosthetics (Fig. 22);

  • cognitive machines such as robot humanoids;

  • carbon nanotubes and nanotech computing which will make our current silicon chips look gargantuan;

  • genetic engineering breakthroughs and regenerative health treatment such as stem cell treatment;

  • electronic banking that will not use physical cash for transactions but the singularity chip (e.g., implant);

  • ubiquitous high-speed wireless networks;

  • crowdsourced surveillance toward real-time forensic profiling and criminalization;

  • autogenerated visual life logs and location chronicles;

  • enhanced batteries that last longer;

  • body power to charge digital equipment [136];

  • brainwave-based technologies in health/gaming;

  • brain-reading technology for interrogation [137].

Fig. 22. Army Reserve Staff Sgt. Alfredo De Los Santos displays what the X2 microprocessor knee prosthetic can do by walking up a flight of stairs at the Military Advanced Training Center at Walter Reed Army Medical Center (Washington, DC), December 8, 2009. Patients at Walter Reed are testing next-generation prosthetics. Courtesy of the U.S. Army.

It is important to note that while these new inventions have the ability to make things faster and better for most living in more developed countries, they can act to increase the ever-widening gap between the rich and the poor. New technologies will not necessarily aid in eradicating the poverty cycle in parts of Africa and South America. In fact, new technologies can have the opposite effect—they can create an ever greater chasm in equity and access to knowledge.

Technology foresight is commonly exercised by one engaged in the act of prediction. Predictive studies are more often than not based on past and present trends, and use this knowledge to provide a roadmap of future possibilities. There is some degree of imagination in prediction, and certainly the creative element is prevalent. Predictions are not meant to be wild, but calculated wisely, with evidence showing that a given course or path is likely in the future. However, this does not mean that all predictions come true. Predictive studies can be about new inventions and new form factors, the recombination of existing innovations in new ways (hybrid architectures, for example), or the mutation of an existing innovation. Some predictive studies have heavy quantitative forecasting components that use complex models, some based on historical data inputs, to predict the introduction of new innovations.

Before an invention has been diffused into the market, scenario planning is conducted to understand how the technology might be used, who might take it up, and what percentage of society will be willing to adopt the product over time (i.e., consumption analysis). “Here the emphasis is on predicting the development of the technology and assessing its potential for adoption, including an analysis of the technology's market” [138, p. 328].
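Such consumption analysis is often formalized with diffusion models. The sketch below uses the classic Bass diffusion model of technology adoption; the market size and the innovation and imitation coefficients are illustrative assumptions, not estimates for any real implant market:

```python
import math

def adopters(t, market_size=1_000_000, p=0.03, q=0.38):
    """Cumulative adopters at time t under the Bass diffusion model.
    p is the coefficient of innovation (external influence, e.g.,
    advertising) and q the coefficient of imitation (word of mouth).
    Parameter values here are purely illustrative."""
    e = math.exp(-(p + q) * t)
    return market_size * (1 - e) / (1 + (q / p) * e)

assert adopters(0) == 0.0                     # no adopters at launch
assert adopters(5) < adopters(10) < 1_000_000 # S-curve rises toward saturation
```

Fitting p and q to early sales data is one common way scenario planners estimate what percentage of society will take up a product over time.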

Even Microsoft founder Bill Gates [139, p. 274] accepted that his predictions may not come true. But his insights in The Road Ahead are to be commended, even though they were understandably broad. Gates wrote, “[t]he information highway will lead to many destinations. I've enjoyed speculating about some of these. Doubtless I've made some foolish predictions, but I hope not too many.” Allaby [140, p. 206] writes, “[f]orecasts deal in possibilities, not inevitabilities, and this allows forecasters to explore opportunities.”

For the greater part, forecasters raise challenging, thought-provoking issues about how existing inventions or innovations will impact society. They give scenarios for a technology's projected pervasiveness, how it may affect other technologies, what potential benefits or drawbacks it may introduce, how it will affect the economy, and much more.

Kaku [141, p. 5] has argued “that predictions about the future made by professional scientists tend to be based much more substantially on the realities of scientific knowledge than those made by social critics, or even those by scientists of the past whose predictions were made before the fundamental scientific laws were completely known.” He believes that among the scientific body today there is a growing concern regarding predictions that, for the greater part, come from consumers of technology rather than from those who shape and create it. Kaku is, of course, correct, insofar as scientists should be consulted, since they are the ones actually making things possible after discoveries have occurred. But a balanced view, encompassing the perspectives of different disciplines, is necessary and extremely important.

In the 1950s, for instance, when technical experts forecasted improvements in computer technology, they envisaged even larger machines—but science fiction writers predicted microminiaturization. They “[p]redicted marvels such as wrist radios and pocket-sized computers, not because they foresaw the invention of the transistor, but because they instinctively felt that some kind of improvement would come along to shrink the bulky computers and radios of that day” (Bova, 1988, quoted in [142, p. 18]). The methodologies used to predict in each discipline should be respected. The question of who is more correct in predicting the future is perhaps the wrong question. For example, some of Kaku's own predictions in Visions can be found in science fiction movies dating back to the 1960s.

In speculating about the next 500 years, Berry [142, p. 1] writes, “[p]rovided the events being predicted are not physically impossible, then the longer the time scale being considered, the more likely they are to come true…if one waits long enough everything that can happen will happen.”

SECTION VII. The Next 50 Years: Brain–Computer Interface

When Ellul [143, p. 432] in 1964 predicted the use of “electronic banks” in his book The Technological Society, he was not referring to the computerization of financial institutions or the use of automatic teller machines (ATMs). Rather it was in the context of the possibility of the dawn of a new entity: the conjoining of man with machine. Ellul was predicting that one day knowledge would be accumulated in electronic banks and “transmitted directly to the human nervous system by means of coded electronic messages…[w]hat is needed will pass directly from the machine to the brain without going through consciousness…” As unbelievable as this man–machine complex may have sounded at the time, 45 years later visionaries are still predicting that such scenarios will be possible by the turn of the 22nd century. A large proportion of these visionaries are cyberneticists. Cybernetics is the study of nervous system controls in the brain as a basis for developing communications and controls in sociotechnical systems. Parenthetically, in some places writers continue to confuse cybernetics with robotics; they might overlap in some instances, but they are not the same thing.

Kaku [141, pp. 112–116] observes that scientists are working steadily toward a brain–computer interface (Fig. 23). The first step is to show that individual neurons can grow on silicon and then to connect the chip directly to a neuron in an animal. The next step is to mimic this connectivity in a human, and the last is to decode millions of neurons which constitute the spinal cord in order to interface directly with the brain. Cyberpunk science fiction writers like William Gibson [144] refer to this notion as “jacking-in” with the wetware: plugging in a computer cable directly with the central nervous system (i.e., with neurons in the brain analogous to software and hardware) [139, p. 133].

Fig. 23. Brain–computer interface schema. (1) Pedestal. (2) Sensor. (3) Electrode. Courtesy of Balougador under creative commons license.

In terms of the current state of development we can point to the innovation of miniature wearable media, orthopedic replacements (including pacemakers), bionic prosthetic limbs, humanoid robots (i.e., a robot that looks like a human in appearance and is autonomous), and RFID implants. Traditionally, the term cyborg has been used to describe humans who have some mechanical parts or extensions. Today, however, we are on the brink of building a new sentient being, a bearer of electricity, a modern man belonging to a new race, beyond that which can be considered merely part man part machine. We refer here to the absolute fusion of man and machine, where the subject itself becomes the object; where the toolmaker becomes one with his tools [145]. The question at this point of coalescence is how human will the new species be [146], and what are the related ethical, metaphysical, and ontological concerns? Does the evolution of the human race as recorded in history come to an end when technology can be connected to the body in a wired or wireless form?

A. From Prosthetics to Amplification

Fig. 24. Cyborg 2.0 Project. Kevin Warwick with wife Irena during the Cyborg 2.0 project. Courtesy of Kevin Warwick.

While orthopedic replacements, corrective in nature and required to repair a function that is either lying dormant or has failed altogether, have been around since the 1950s [147], implants of the future will attempt to add new functionality to native human capabilities, either through extensions or additions. Warwick's Cyborg 2.0 project [148], for instance, intended to prove that two persons with respective implants could communicate sensation and movement by thoughts alone. In 2002, the BBC reported that a tiny silicon square with 100 electrodes was connected to the professor's median nerve and linked to a transmitter/receiver in his forearm. Although “Warwick believe[d] that when he move[d] his own fingers, his brain [would] also be able to move Irena's” [104, p. 1], the outcome of the experiment was described at best as sending “Morse-code” messages (Fig. 24). Warwick [148] is still of the belief that a person's brain could be directly linked to a computer network [149]. Commercial players are also intent on keeping ahead, continually funding projects in this area of research.

If Warwick is right, then terminals like telephones would eventually become obsolete if thought-to-thought communication became possible. Warwick describes this as “putting a plug into the nervous system” [104] to be able to allow thoughts to be transferred not only to another person but to the Internet and other media. While Warwick's Cyborg 2.0 may not have achieved its desired outcomes, it did show that a form of primitive Morse-code-style nervous-system-to-nervous-system communication is realizable [150]. Warwick is bound to keep trying to achieve his project goals given his philosophical perspective. And if Warwick does not succeed, he will have at least left behind a legacy and enough stimuli for someone else to succeed in his place.

B. The Soul Catcher Chip

The Soul Catcher chip was conceived by former Head of British Telecom Research, Peter Cochrane. Cochrane [151, p. 2] believes that the human body is merely a carcass that serves as a transport mechanism just like a vehicle, and that the most important part of our body is our brain (i.e., mind). Similarly, Miriam English has said: “I like my body, but it's going to die, and it's not a choice really I have. If I want to continue, and I want desperately to see what happens in another 100 years, and another 1000 years…I need to duplicate my brain in order to do that” [152]. Soul Catcher is all about the preservation of a human, way beyond the point of physical debilitation. The Soul Catcher chip would be implanted in the brain, and act as an access point to the external world [153]. Consider being able to download the mind onto computer hardware and then creating a global nervous system via wireless Internet [154] (Fig. 25). Cochrane has predicted that by 2050 downloading thoughts and emotions will be commonplace. Billinghurst and Starner [155, p. 64] predict that this kind of arrangement will free up the human intellect to focus on creative rather than computational functions.

Fig. 25. Ray Kurzweil predicts that by 2013 supercomputer power will be sufficient for human brain functional simulation and by 2025 for human brain neural simulation for uploading. Courtesy of Ray Kurzweil and Kurzweil Technologies 2005.
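Forecasts of the kind shown in Fig. 25 typically rest on a Moore's-Law-style assumption that computing capacity doubles over some fixed period, which reduces to a one-line extrapolation. The doubling period below is an assumption chosen for illustration only:

```python
def extrapolate(value_now, year_now, year_future, doubling_years=1.5):
    """Naive exponential extrapolation: capacity doubles every
    `doubling_years`. The doubling period is an illustrative
    assumption; forecasts of 'human-brain-scale' computing differ
    widely on both the doubling rate and the target capacity."""
    return value_now * 2 ** ((year_future - year_now) / doubling_years)

# Over 15 years at a 1.5-year doubling time, capacity grows 2**10 = 1024-fold.
assert extrapolate(1.0, 2005, 2020) == 1024.0
```

The fragility of such forecasts lies in the exponent: a small error in the assumed doubling period compounds into decades of difference in when a predicted threshold is reached.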

Cochrane's beliefs are shared by many others engaged in the transhumanist movement (especially Extropians like Alexander Chislenko). Transhumanism (sometimes known by the abbreviations “> H” or “H+”) is an international cultural movement that consists of intellectuals who look at ways to extend life through the application of emerging sciences and technologies. Minsky [156] believes that this will be the next stage in human evolution—a way to achieve true immortality “replacing flesh with steel and silicon” [141, p. 94]. Chris Winter of British Telecom has claimed that Soul Catcher will mean “the end of death.” Winter predicts that by 2030, “[i]t would be possible to imbue a newborn baby with a lifetime's experiences by giving him or her the Soul Catcher chip of a dead person” [157]. The philosophical implications behind such movements are gigantic; they reach deep into every branch of traditional philosophy, especially metaphysics with its special concerns over cosmology and ontology.

SECTION VIII. The Next 100 Years: Homo Electricus

A. The Rise of the Electrophorus

Fig. 26. Drawing showing the operation of an Electrophorus, a simple manual electrostatic generator invented in 1762 by Swedish Professor Johan Carl Wilcke. Image by Amédée Guillemin (died 1893).

Microchip implants are integrated circuit devices encased in RFID transponders that can be active or passive and are implantable into animals or humans usually in the subcutaneous layer of the skin. The human who has been implanted with a microchip that can send or receive data is an Electrophorus, a bearer of “electric” technology [158]. The Macquarie Dictionary definition of “electrophorus” is “an instrument for generating static electricity by means of induction,” and refers to an instrument used in the early years of electrostatics (Fig. 26).

We have repurposed the term electrophorus to apply to humans implanted with microchips. One who “bears” is in some way intrinsically or spiritually connected to that which they are bearing, in the same way an expecting mother is to the child in her womb. The root electro comes from the Greek word meaning “amber,” and phorus means to “wear, to put on, to get into” [159, p. 635]. When an Electrophorus passes through an electromagnetic zone, he/she is detected and data can be passed from an implanted microchip (or in the future directly from the brain) to a computer device.

To electronize something is “to furnish it with electronic equipment” and electrotechnology is “the science that deals with practical applications of electricity.” The term “electrophoresis” has been borrowed here, to describe the “electronic” operations that an electrophorus is involved in. McLuhan and Zingrone [160, p. 94] believed that “electricity is in effect an extension of the nervous system as a kind of global membrane.” They argued that “physiologically, man in the normal use of technology (or his variously extended body) is perpetually modified by it and in turn finds ever new ways of modifying his technology” [161, p. 117].

The term “electrophorus” seems to be much more suitable today for expressing the human-electronic combination than the term “cyborg.” “Electrophorus” distinguishes strictly electrical implants from mechanical devices such as artificial hips. It is not surprising then that these crucial matters of definition raise philosophical and sociological questions of consciousness and identity, which science fiction writers have been addressing creatively. The Electrophorus belongs to the emerging species of Homo Electricus. In its current state, the Electrophorus relies on a device being triggered wirelessly when it enters an electromagnetic field. In the future, the Electrophorus will act like a network element or node, allowing information to pass through him or her, to be stored locally or remotely, and to send out messages and receive them simultaneously and allow some to be processed actively, and others as background tasks.

At the point of becoming an Electrophorus (i.e., a bearer of electricity), Brown [162] makes the observation that “[y]ou are not just a human linked with technology; you are something different and your values and judgment will change.” Some suspect that it will even become possible to alter behavior of people carrying brain implants, whether the individual wills it or not. Maybury [163] believes that “[t]he advent of machine intelligence raises social and ethical issues that may ultimately challenge human existence on earth.”

B. The Prospects of Transhumanism

Fig. 27. The transhumanism symbol. Courtesy of Antonu under Creative Commons license.

Thought-to-thought communications may seem outlandish today, but it is only one of many futuristic hopes of the movement termed transhumanism. Probably the most representative organization for this movement is the World Transhumanist Association (WTA), which recently adopted the doing-business-as name of “Humanity+” (Fig. 27). The WTA's website [164] carries the following succinct statement of what transhumanism is, penned originally by Max More in 1990: “Transhumanism is a class of philosophies of life that seek the continuation and acceleration of the evolution of intelligent life beyond its currently human form and human limitations by means of science and technology, guided by life-promoting principles and values.” Whether or not transhumanism yet qualifies as a philosophy, it cannot be denied that it has produced its share of both proponents and critics.


Proponents of transhumanism claim that the things they want are the things everyone wants: freedom from pain, freedom from suffering, freedom from all the limitations of the human body (including mental as well as physical limitations), and ultimately, freedom from death. One of the leading authors in the transhumanist movement is Ray Kurzweil, whose 652-page book The Singularity Is Near [165] prophesies a time in the not-too-distant future when evolution will accelerate exponentially and bring to pass all of the above freedoms as “the matter and energy in our vicinity will become infused with the intelligence, knowledge, creativity, beauty, and emotional intelligence (the ability to love, for example) of our human-machine civilization. Our civilization will then expand outward, turning all the dumb matter and energy we encounter into sublimely intelligent—transcendent—matter and energy” [165, p. 389].

Despite the almost theological tone of the preceding quote, Kurzweil has established a sound track record as a technological forecaster, at least when it comes to Moore's-Law-type predictions of the progress of computing power. But the ambitions of Kurzweil [178] and his allies go far beyond next year's semiconductor roadmap to encompass the future of all humanity. If the fullness of the transhumanist vision is realized, the following achievements will come to pass:

  • human bodies will cease to be the physical instantiation of human minds, replaced by as-yet-unknown hardware with far greater computational powers than the present human brain;

  • human minds will experience, at their option, an essentially eternal existence in a world free from the present restrictions of material embodiment in biological form;

  • limitations on will, intelligence, and communication will all be overcome, so that to desire a thing or experience will be to possess it.

The Transhumanist Declaration, last modified in 2009 [166], recognizes that these plans have potential downsides, and calls for reasoned debate to avoid the risks while realizing the opportunities. The sixth item in the Declaration, for example, declares that “[p]olicy making ought to be guided by responsible and inclusive moral vision, taking seriously both opportunities and risks, respecting autonomy and individual rights, and showing solidarity with and concern for the interests and dignity of all people around the globe.” The key phrase in this item is “moral vision.” While many self-declared transhumanists may agree on the moral vision which should guide their endeavors, the movement has also inspired some of the most vigorous and categorically critical invective to be found in the technical and public-policy literature.

Possibly the best known of the vocal critics of transhumanism is Francis Fukuyama, a political scientist who nominated transhumanism as his choice for the world's most dangerous idea [167]. As with most utopian notions, the main problem Fukuyama sees with transhumanism is the transition between our present state and the transhumanists' future vision of completely realized eternal technological bliss (Fig. 28). Will some people be uploaded to become immortal, almost omniscient transhumans while others are left behind in their feeble, mortal, disease-ridden human bodies? Are the human goods that transhumanists say are basically the same for everyone really so? Or are they more complex and subtle than typical transhumanist pronouncements acknowledge? As Fukuyama points out in his Foreign Policy essay [167], “Our good characteristics are intimately connected to our bad ones… if we never felt jealousy, we would also never feel love. Even our mortality plays a critical function in allowing our species as a whole to survive and adapt (and transhumanists are about the last group I would like to see live forever).”


Fig. 28. Brain in a vat with the thought: “I'm walking outside in the sun” being transmitted to the computer. Image reproduced under the Creative Commons license.

Transhumanists themselves admit that their movement performs some of the functions of a religion when it “offers a sense of direction and purpose.” But in contrast to most religions, transhumanists explicitly hope to “make their dreams come true in this world” [168]. Nearly all transhumanist programs and proposals arise from a materialist–reductionist view of the world, which assumes that the human mind is at most an epiphenomenon of the brain, that all of the human brain's functions will eventually be simulated by hardware (on computers of the future), and that the experience known as consciousness can be realized in artificial hardware in essentially the same form as it is presently realized in the human body. Some of the assumptions of transhumanism are based less on facts and more on faith. Just as Christians take on faith that God revealed Himself in Jesus Christ, transhumanists take on faith that machines will inevitably become conscious.

Fig. 29. The Shadow Dextrous Hand shakes the human hand. How technology might become society—a future agreement. Courtesy of Shadow Robot Company 2008.

In keeping with the transhumanists' call for responsible moral vision, the IEEE SSIT has been, and will continue to be, a forum where the implications for society of all sorts of technological developments can be debated and evaluated. In a sense, the transhumanist program is the ultimate technological project: to redesign humanity itself to a set of specifications determined by us. If the transhumanists succeed, technology will become society, and the question of the social implications of technology will be moot (Fig. 29). Perhaps the best attitude to take toward transhumanism is to pay attention to their prophecies, but, as the Old Testament God advised the Hebrews, “if the thing follow not, nor come to pass…the prophet hath spoken it presumptuously…” [169].


SECTION IX. Ways forward

In sum, identifying and predicting what the social implications of past, present and future technologies might be can lead us to act in one of four ways, which are not mutually exclusive.

First, we can take the “do nothing” approach and meekly accept the risks associated with new techniques. We stop being obsessed by both confirmed and speculative consequences and, instead, try to see how far the new technologies might take us and what we might become or transform into as a result. While humans might not always like change, we are by nature, if we might hijack Heraclitus, in a continual state of flux. We might reach new potentials as a populace, become extremely efficient at doing business with each other, and make a positive impact on our natural environment by doing so. The downside to this approach is that it appears to be an all-or-nothing approach with no built-in decision points. For, as Jacques Ellul [170] forewarned: “what is at issue here is evaluating the danger of what might happen to our humanity in the present half-century, and distinguishing between what we want to keep and what we are ready to lose, between what we can welcome as legitimate human development and what we should reject with our last ounce of strength as dehumanization.”

The second option is that we let case law determine for us what is legal or illegal based on existing laws, or on new or amended laws we might introduce as a result of the new technologies. We can take the stance that the courts are in the best position to decide on what we should and should not do with new technologies. If we break the law in a civil or criminal capacity, then there is a penalty; we already have civil and criminal codes concerning workplace surveillance, telecommunications interception and access, surveillance devices, data protection and privacy, cybercrime, and so on. There is also the continual review of existing legislation by law-reform commissions and the like. New legislation can also be introduced to guard against other dangers or harms that might eventuate as a result of the new techniques.

The third option is that we can introduce industry regulations that stipulate how advanced applications should be developed (e.g., ensuring privacy impact assessments are done before commercial applications are launched), and that technical expectations on accuracy, reliability, and storage of data are met. It is also important that the right balance be found between regulations and freedom so as not to stifle the high-tech industry at large.

Finally, the fourth option would be to adopt the “Amish method”: complete abandonment of technology that has progressed beyond a certain point of development. This is in some respect “living off the grid” [171].

Although obvious, it is important to underline that none of these options is mutually exclusive or foolproof. The most appropriate response may well be at times to introduce industry regulations or codes, at other times to do nothing, and in other cases to rely on legislative amendments despite the length of time it takes to develop them. In still other cases, the safeguards may need to be built into the technology itself.


SECTION X. Conclusion

If we put our trust in Kurzweil's [172] Law of Accelerating Returns, we are likely headed into a great period of discovery unprecedented in any era of history. This being the case, the time for inclusive dialog is now, not after widespread diffusion of such innovations as “always on” cameras, microchip implants, unmanned drones and the like. We stand at a critical moment of decision, as the mythological Pandora did as she was about to open her box. There are many lessons to be learned from history, especially from such radical developments as the atomic bomb and the resulting arms race. Joy [173] has raised serious fears about continuing unfettered research into “spiritual machines.” Will humans have the foresight to say “no” or “stop” to new innovations that could potentially be a means to a socially destructive scenario? Implants that may prolong life expectancy by hundreds if not thousands of years may appeal at first glance, but they could well create unforeseen devastation in the form of technological viruses, plagues, or a grim escalation in the levels of crime and violence.

To many scientists of the positivist tradition, anchored solely to an empirical world view, the notion of whether something is right or wrong is in a way irrelevant. For these researchers, a moral stance has little or nothing to do with technological advancement but is really an ideological position. The extreme of this view is exemplified by an attitude of “let's see how far we can go,” not “is what we are doing the best thing for humanity?” and certainly not by the thought of “what are the long-term implications of what we are doing here?” As an example, one need only consider the mad race to clone the first animal, and many have long suspected that an “underground” scientific race to clone the first human continues.

In the current climate of innovation, particularly since the proliferation of the desktop computer and the birth of new digital knowledge systems, some observers believe that engineers, and professionals more broadly, lack accountability for the tangible and intangible costs of their actions [174, p. 288]. Because science-enabled engineering has proved so profitable for multinational corporations, they have gone to great lengths to persuade the world that science should not be stopped, for the simple reason that it will always make things better. This ignores the possibility that even seemingly small advancements into the realm of the Electrophorus for any purpose other than medical prostheses will have dire consequences for humanity [175]. According to Kuhns, “Once man has given technique its entry into society, there can be no curbing of its gathering influence, no possible way of forcing it to relinquish its power. Man can only witness and serve as the ironic beneficiary-victim of its power” [176, p. 94].

Clearly, none of the authors of this paper desire to stop technological advance in its tracks. But we believe that considering the social implications of past, present, and future technologies is more than an academic exercise. As custodians of the technical means by which modern society exists and develops, engineers have a unique responsibility to act with forethought and insight. The time when following the orders of a superior was all that an engineer had to do is long past. With great power comes great responsibility. Our hope is that the IEEE SSIT will help and encourage engineers worldwide to consider the consequences of their actions throughout the next century.

References

1. K. D. Stephan, "Notes for a history of the IEEE society on social implications of technology", IEEE Technol. Soc. Mag., vol. 25, no. 4, pp. 5-14, 2006.

2. B. R. Inman, "One view of national security and technical information", IEEE Technol. Soc. Mag., vol. 1, no. 3, pp. 19-21, Sep. 1982.

3. S. Sloan, "Technology and terrorism: Privatizing public violence", IEEE Technol. Soc. Mag., vol. 10, no. 2, pp. 8-14, 1991.

4. J. R. Shanebrook, "Prohibiting nuclear weapons: Initiatives toward global nuclear disarmament", IEEE Technol. Soc. Mag., vol. 18, no. 2, pp. 25-31, 1999.

5. C. J. Andrews, "National responses to energy vulnerability", IEEE Technol. Soc. Mag., vol. 25, no. 3, pp. 16-25, 2006.

6. R. C. Arkin, "Ethical robots in warfare", IEEE Technol. Soc. Mag., vol. 28, no. 1, pp. 30-33, 2009.

7. M. G. Michael, K. Michael, "Toward a state of überveillance", IEEE Technol. Soc. Mag., vol. 29, no. 2, pp. 9-16, 2010.

8. V. Baranauskas, "Large-scale fuel farming in Brazil", IEEE Technol. Soc. Mag., vol. 2, no. 1, pp. 12-13, Mar. 1983.

9. H. M. Gueron, "Nuclear power: A time for common sense", IEEE Technol. Soc. Mag., vol. 3, no. 1, pp. 3-9, Mar. 1984.

10. J. J. Mackenzie, "Nuclear power: A skeptic's view", IEEE Technol. Soc. Mag., vol. 3, no. 1, pp. 9-15, Mar. 1984.

11. E. Larson, D. Abrahamson, P. Ciborowski, "Effects of atmospheric carbon dioxide on U. S. peak electrical generating capacity", IEEE Technol. Soc. Mag., vol. 3, no. 4, pp. 3-8, Dec. 1984.

12. P. C. Cruver, "Greenhouse effect prods global legislative initiatives", IEEE Technol. Soc. Mag., vol. 9, no. 1, pp. 10-16, Mar./Apr. 1990.

13. B. Allenby, "Earth systems engineering and management", IEEE Technol. Soc. Mag., vol. 19, no. 4, pp. 10-24, Winter 2000.

14. J. C. Davis, "Protecting intellectual property in cyberspace", IEEE Technol. Soc. Mag., vol. 17, no. 2, pp. 12-25, 1998.

15. R. Brody, "Consequences of electronic profiling", IEEE Technol. Soc. Mag., vol. 18, no. 1, pp. 20-27, 1999.

16. K. W. Bowyer, "Face-recognition technology: Security versus privacy", IEEE Technol. Soc. Mag., vol. 23, no. 1, pp. 9-20, 2004.

17. D. Bütschi, M. Courant, L. M. Hilty, "Towards sustainable pervasive computing", IEEE Technol. Soc. Mag., vol. 24, no. 1, pp. 7-8, 2005.

18. R. Clarke, "Cyborg rights", IEEE Technol. Soc. Mag., vol. 30, no. 3, pp. 49-57, 2011.

19. E. Levy, D. Copp, "Risk and responsibility: Ethical issues in decision-making", IEEE Technol. Soc. Mag., vol. 1, no. 4, pp. 3-8, Dec. 1982.

20. K. R. Foster, R. B. Ginsberg, "Guest editorial: The wired classroom", IEEE Technol. Soc. Mag., vol. 17, no. 4, pp. 3, 1998.

21. T. Bookman, "Ethics professionalism and the pleasures of engineering: T&S interview with Samuel Florman", IEEE Technol. Soc. Mag., vol. 19, no. 3, pp. 8-18, 2000.

22. K. D. Stephan, "Is engineering ethics optional", IEEE Technol. Soc. Mag., vol. 20, no. 4, pp. 6-12, 2001.

23. T. C. Jepsen, "Reclaiming history: Women in the telegraph industry", IEEE Technol. Soc. Mag., vol. 19, no. 1, pp. 15-19, 2000.

24. A. S. Bix, "‘Engineeresses’ invade campus", IEEE Technol. Soc. Mag., vol. 19, no. 1, pp. 20-26, 2000.

25. J. Coopersmith, "Pornography videotape and the internet", IEEE Technol. Soc. Mag., vol. 19, no. 1, pp. 27-34, 2000.

26. D. M. Hughes, "The internet and sex industries: Partners in global sexual exploitation", IEEE Technol. Soc. Mag., vol. 19, no. 1, pp. 35-41, 2000.

27. V. Cimagalli, M. Balsi, "Guest editorial: University technology and society", IEEE Technol. Soc. Mag., vol. 20, no. 2, pp. 3, 2001.

28. G. L. Engel, B. M. O'Connell, "Guest editorial: Ethical and social issues criteria in academic accreditation", IEEE Technol. Soc. Mag., vol. 21, no. 3, pp. 7, 2002.

29. J. C. Lucena, G. Downey, H. A. Amery, "From region to countries: Engineering education in Bahrain Egypt and Turkey", IEEE Technol. Soc. Mag., vol. 25, no. 2, pp. 4-11, 2006.

30. C. Didier, J. R. Herkert, "Volunteerism and humanitarian engineering—Part II", IEEE Technol. Soc. Mag., vol. 29, no. 1, pp. 9-11, 2010.

31. K. Michael, G. Roussos, G. Q. Huang, R. Gadh, A. Chattopadhyay, S. Prabhu, P. Chu, "Planetary-scale RFID services in an age of uberveillance", Proc. IEEE, vol. 98, no. 9, pp. 1663-1671, Sep. 2010.

32. M. G. Michael, K. Michael, "The fall-out from emerging technologies: On matters of surveillance social networks and suicide", IEEE Technol. Soc. Mag., vol. 30, no. 3, pp. 15-18, 2011.

33. M. U. Iqbal, S. Lim, "Privacy implications of automated GPS tracking and profiling", IEEE Technol. Soc. Mag., vol. 29, no. 2, pp. 39-46, 2010.

34. D. Kravets, "OnStar tracks your car even when you cancel service", Wired, Sep. 2011.

35. L. Evans, "Location-based services: Transformation of the experience of space", J. Location Based Services, vol. 5, no. 3-4, pp. 242-260, 2011.

36. M. Wigan, R. Clarke, "Social impacts of transport surveillance", Prometheus, vol. 24, no. 4, pp. 389-403, 2006.

37. B. D. Renegar, K. Michael, "The privacy-value-control harmonization for RFID adoption in retail", IBM Syst. J., vol. 48, no. 1, pp. 8:1-8:14, 2009.

38. R. Clarke, "Information technology and dataveillance", Commun. ACM, vol. 31, no. 5, pp. 498-512, 1988.

39. H. Ketabdar, J. Qureshi, P. Hui, "Motion and audio analysis in mobile devices for remote monitoring of physical activities and user authentication", J. Location Based Services, vol. 5, no. 3-4, pp. 182-200, 2011.

40. E. Singer, "Device tracks how you're sleeping", Technol. Rev. Authority Future Technol., Jul. 2009.

41. L. Perusco, K. Michael, "Control trust privacy and security: Evaluating location-based services", IEEE Technol. Soc. Mag., vol. 26, no. 1, pp. 4-16, 2007.

42. K. Michael, A. McNamee, M. G. Michael, "The emerging ethics of humancentric GPS tracking and monitoring", ICMB M-Business-From Speculation to Reality, 2006.

43. S. J. Fusco, K. Michael, M. G. Michael, R. Abbas, "Exploring the social implications of location based social networking: An inquiry into the perceived positive and negative impacts of using LBSN between friends", 9th Int. Conf. Mobile Business/9th Global Mobility Roundtable (ICMB-GMR), 2010.

44. M. Burdon, "Commercializing public sector information privacy and security concerns", IEEE Technol. Soc. Mag., vol. 28, no. 1, pp. 34-40, 2009.

45. R. W. Picard, "Future affective technology for autism and emotion communication", Philosoph. Trans. Roy. Soc. London B Biol. Sci., vol. 364, no. 1535, pp. 3575-3584, 2009.

46. R. M. Kowalski, S. P. Limber, P. W. Agatston, Cyber Bullying: The New Moral Frontier, U.K., London: Wiley-Blackwell, 2007.

47. Google: Policies and Principles, Oct. 2011.

48. K.-S. Lee, "Surveillant institutional eyes in South Korea: From discipline to a digital grid of control", Inf. Soc., vol. 23, no. 2, pp. 119-124, 2007.

49. D. P. Siewiorek, "Wearable computing comes of age", IEEE Computer, vol. 32, no. 5, pp. 82-83, May 1999.

50. L. Sydänheimo, M. Salmimaa, J. Vanhala, M. Kivikoski, "Wearable and ubiquitous computer aided service maintenance and overhaul", IEEE Int. Conf. Commun., vol. 3, pp. 2012-2017, 1999.

51. K. Michael, M. G. Michael, Innovative Automatic Identification and Location-Based Services, New York: Information Science Reference, 2009.

52. K. Michael, M. G. Michael, "Implementing Namebers using microchip implants: The black box beneath the skin" in This Pervasive Day: The Potential and Perils of Pervasive Computing, U.K., London: Imperial College Press, pp. 101-142, 2011.

53. S. Mann, "Wearable computing: Toward humanistic intelligence", IEEE Intell. Syst., vol. 16, no. 3, pp. 10-15, May/Jun. 2001.

54. B. Schiele, T. Jebara, N. Oliver, "Sensory-augmented computing: Wearing the museum's guide", IEEE Micro, vol. 21, no. 3, pp. 44-52, May/Jun. 2001.

55. C. Harrison, D. Tan, D. Morris, "Skinput: Appropriating the skin as an interactive canvas", Commun. ACM, vol. 54, no. 8, pp. 111-118, 2011.

56. N. Sawhney, C. Schmandt, "Nomadic radio: A spatialized audio environment for wearable computing", Proc. IEEE 1st Int. Symp. Wearable Comput., pp. 171-172, 1997.

57. S. Mann, "Eudaemonic computing (‘underwearables’)", Proc. IEEE 1st Int. Symp. Wearable Comput., pp. 177-178, 1997.

58. Looxcie: Overview, Jan. 2012.

59. T. Starner, "The challenges of wearable computing: Part 1", IEEE Micro, vol. 21, no. 4, pp. 44-52, Jul./Aug. 2001.

60. G. Tröster, "Smart clothes—The unfulfilled pledge", IEEE Perv. Comput., vol. 10, no. 2, pp. 87-89, Feb. 2011.

61. M. B. Spitzer, "Eyeglass-based systems for wearable computing", Proc. IEEE 1st Int. Symp. Wearable Comput., pp. 48-51, 1997.

62. R. Steinkuhl, C. Sundermeier, H. Hinkers, C. Dumschat, K. Cammann, M. Knoll, "Microdialysis system for continuous glucose monitoring", Sens. Actuators B Chem., vol. 33, no. 1-3, pp. 19-24, 1996.

63. J. C. Pickup, F. Hussain, N. D. Evans, N. Sachedina, "In vivo glucose monitoring: The clinical reality and the promise", Biosens. Bioelectron., vol. 20, no. 10, pp. 1897-1902, 2005.

64. C. Thomas, R. Carlson, Development of the Sensing System for an Implantable Glucose Sensor, Jan. 2012.

65. J. L. Ferrero, "Wearable computing: One man's mission", IEEE Micro, vol. 18, no. 5, pp. 87-88, Sep.-Oct. 1998.

66. T. Martin, "Issues in wearable computing for medical monitoring applications: A case study of a wearable ECG monitoring device", Proc. IEEE 4th Int. Symp. Wearable Comput., pp. 43-49, 2000.

67. M. G. Michael, "The biomedical pioneer: An interview with C. Toumazou" in Innovative Automatic Identification and Location-Based Services, New York: Information Science Reference, pp. 352-363, 2009.

68. R. Capurro, M. Nagenborg, Ethics and Robotics, Germany, Heidelberg: Akademische Verlagsgesellschaft, 2009.

69. R. Sparrow, "Predators or plowshares? Arms control of robotic weapons", IEEE Technol. Soc. Mag., vol. 28, no. 1, pp. 25-29, 2009.

70. R. C. Arkin, "Governing lethal behavior in robots [T&S Interview]", IEEE Technol. Soc. Mag., vol. 30, no. 4, pp. 7-11, 2011.

71. M. L. Cummings, "Creating moral buffers in weapon control interface design", IEEE Technol. Soc. Mag., vol. 23, no. 3, pp. 28-33, 41, 2004.

72. P. Asaro, "Modeling the moral user", IEEE Technol. Soc. Mag., vol. 28, no. 1, pp. 20-24, 2009.

73. J. Canning, "You've just been disarmed. Have a nice day!", IEEE Technol. Soc. Mag., vol. 28, no. 1, pp. 13-15, 2009.

74. F. Operto, "Ethics in advanced robotics", IEEE Robot. Autom. Mag., vol. 18, no. 1, pp. 72-78, Mar. 2011.

75. J. M. Sullivan, "Evolution or revolution? The rise of UAVs", IEEE Technol. Soc. Mag., vol. 25, no. 3, pp. 43-49, 2006.

76. P. Salvini, M. Nicolescu, H. Ishiguro, "Benefits of human-robot interaction", IEEE Robot. Autom. Mag., vol. 18, no. 4, pp. 98-99, Dec. 2011.

77. A. Sharkey, N. Sharkey, "Children the elderly and interactive robots", IEEE Robot. Autom. Mag., vol. 18, no. 1, pp. 32-38, Mar. 2011.

78. D. Feil-Seifer, M. J. Mataric, "Socially assistive robotics", IEEE Robot. Autom. Mag., vol. 18, no. 1, pp. 24-31, Mar. 2011.

79. J. D. Bronzino, The Biomedical Engineering Handbook: Medical Devices and Systems, FL, Boca Raton: CRC Press, 2006.

80. C. Hassler, T. Boretius, T. Stieglitz, "Polymers for neural implants", J. Polymer Sci. B Polymer Phys., vol. 49, no. 1, pp. 18-33, 2011.

81. Bionic Hearing Bionic Vision Neurobionics, Jan. 2012.

82. A. Manning, "Implants sounding better: Smaller faster units overcome ‘nerve deafness’", USA Today, pp. 7D, 2000.

83. G. M. Clark, Sounds From Silence, Australia, Melbourne: Allen & Unwin, 2003.

84. G. Carman, Eureka Moment From First One to Hear With Bionic Ear, Feb. 2008.

85. J. F. Patrick, P. A. Busby, P. J. Gibson, "The development of the Nucleus Freedom™ cochlear implant system", Trends Amplification, vol. 10, no. 4, pp. 175-200, 2006.

86. "Personal stories", Cochlear, Jan. 2012.

87. R. A. Cooper, "Quality of life technology: A human-centered and holistic design", IEEE Eng. Med. Biol., vol. 27, no. 2, pp. 10-11, Mar./Apr. 2008.

88. S. Stewart, "Neuromaster", Wired 8.02.

89. J. Dowling, "Current and future prospects for optoelectronic retinal prostheses", Eye, vol. 23, pp. 1999-2005, 2009.

90. D. Ahlstrom, "Microchip implant could offer new kind of vision", The Irish Times.

91. More Tests of Eye Implants Planned, pp. 1-2, 2001.

92. G. Branwyn, "The desire to be wired", Wired 1.4.

93. W. Wells, The Chips Are Coming.

94. M. Brooks, "The cyborg cometh", Worldlink: The Magazine of the World Economic Forum.

95. E. Strickland, "Birth of the bionic eye", IEEE Spectrum, Jan. 2012.

96. S. Adee, "Researchers hope to mime 1000 neurons with high-res artificial retina", IEEE Spectrum, Jan. 2012.

97. D. Nairne, Building Better People With Chips and Sensors.

98. S. S. Hall, "Brain pacemakers", MIT Enterprise Technol. Rev.

99. E. A. C. Pereira, A. L. Green, R. J. Stacey, T. Z. Aziz, "Refractory epilepsy and deep brain stimulation", J. Clin. Neurosci., vol. 19, no. 1, pp. 27-33, 2012.

100. "Brain pacemaker could help cure depression research suggests", Biomed. Instrum. Technol., vol. 45, no. 2, pp. 94, 2011.

101. H. S. Mayberg, A. M. Lozano, V. Voon, H. E. McNeely, D. Seminowicz, C. Hamani, J. M. Schwalb, S. H. Kennedy, "Deep brain stimulation for treatment-resistant depression", Neuron, vol. 45, no. 5, pp. 651-660, 2005.

102. B. Staff, "Brain pacemaker lifts depression", BBC News, Jun. 2005.

103. C. Hamani, H. Mayberg, S. Stone, A. Laxton, S. Haber, A. M. Lozano, "The subcallosal cingulate gyrus in the context of major depression", Biol. Psychiatry, vol. 69, no. 4, pp. 301-308, 2011.

104. R. Dobson, Professor to Try to Control Wife via Chip Implant.

105. "Chip helps paraplegic walk", Wired News.

106. D. Smith, "Chip implant signals a new kind of man", The Age.

107. "Study of an implantable functional neuromuscular stimulation system for patients with spinal cord injuries", ClinicalTrials.gov, Feb. 2009.

108. R. Barrett, "Electrodes help paraplegic walk" in Lateline Australian Broadcasting Corporation, Australia, Sydney: ABC, May 2011.

109. M. Ingebretsen, "Intelligent exoskeleton helps paraplegics walk", IEEE Intell. Syst., vol. 26, no. 1, pp. 21, 2011.

110. S. Harris, "US researchers create suit that can enable paraplegics to walk", The Engineer, Oct. 2011.

111. D. Ratner, M. A. Ratner, Nanotechnology and Homeland Security: New Weapons for New Wars, NJ, Upper Saddle River: Pearson Education, 2004.

112. L. Versweyveld, "Chip implants allow paralysed patients to communicate via the computer", Virtual Medical Worlds Monthly.

113. S. Adee, "The revolution will be prosthetized: DARPA's prosthetic arm gives amputees new hope", IEEE Spectrum, vol. 46, no. 1, pp. 37-40, 2009.

114. E. Wales, "It's a living chip", The Australian, pp. 4, 2001.

115. Our Products: MBA Multiplex Bio Threat Assay, Jan. 2012.

116. F. W. Scheller, "From biosensor to biochip", FEBS J., vol. 274, no. 21, pp. 5451, 2007.

117. A. Persidis, "Biochips", Nature Biotechnol., vol. 16, pp. 981-983, 1998.

118. A. C. LoBaido, "Soldiers with microchips: British troops experiment with implanted electronic dog tag", WorldNetDaily.com.

119. "Microchip implants for drug delivery", ABC: News in Science.

120. R. Bailey, "Implantable insulin pumps", Biology About.com.

121. D. Elleri, D. B. Dunger, R. Hovorka, "Closed-loop insulin delivery for treatment of type 1 diabetes", BMC Med., vol. 9, no. 120, 2011.

122. D. L. Sorkin, J. McClanahan, "Cochlear implant reimbursement cause for concern", HealthyHearing, May 2004.

123. J. Berke, "Parental rights and cochlear implants: Who decides about the implant?", About.com: Deafness, May 2009.

124. D. O. Weber, "Me myself my implants my micro-processors and I", Softw. Develop. Mag., Jan. 2012.

125. A. Graafstra, K. Michael, M. G. Michael, "Social-technical issues facing the humancentric RFID implantee sub-culture through the eyes of Amal Graafstra", Proc. IEEE Int. Symp. Technol. Soc., pp. 498-516, 2010.

126. E.