Implantable Medical Device Tells All: Uberveillance Gets to the Heart of the Matter

In 2015, I provided evidence at an Australian inquiry into the use of subsection 313(3) of the Telecommunications Act 1997 by government agencies to disrupt the operation of illegal online services [1]. I stated to the Standing Committee on Infrastructure and Communications that mandatory metadata retention laws meant blanket-coverage surveillance for Australians and visitors to Australia. The intent behind asking Australian service providers to keep subscriber search history data for up to two years was to grant government and law enforcement organizations the ability to search Internet Protocol–based records in the event of suspected criminal activity.

Importantly, I told the committee that, while instituting programs of surveillance through metadata retention laws would likely help to speed up criminal investigations, we should also note that every individual is a consumer, and such programs ultimately come back to bite innocent people through some breach of privacy or security. Enter the idea of uberveillance, which, I told the committee, is “exaggerated surveillance” that allows for interference [1] and that I believe is a threat to our human rights [2]. I strongly advised that invoking section 313 of the Telecommunications Act 1997 requires judicial oversight through the process of a search warrant. My recommendations fell on deaf ears, and, today, we even have the government deliberating over whether it should relax metadata laws to allow information to be accessed for both criminal and civil litigation [3], which includes divorces, child custody battles, and business disputes. In June 2017, Australian Prime Minister Malcolm Turnbull even stated that “global social media and messaging companies” need to assist security services’ efforts to fight terrorism by “providing access to encrypted communications” [52].

Consumer Electronics Leave Digital Data Footprints

Of course, Australia is not alone in having metadata retention laws. Numerous countries have adopted these laws or similar directives since 2005, keeping certain types of data for anywhere between 30 days and indefinitely, although the standard length is somewhere between one and two years. For example, since 2005, Italy has retained subscriber information at Internet cafes for 30 days. I recall traveling to Verona in 2008 for the European Conference on Information Systems, forgetting my passport in my hotel room, and being unable to use an Internet cafe to send a message back home because I was carrying no recognized identity information. When I asked why I was unable to send a simple message, I was handed an antiterrorism information leaflet. Italy also retains telephone data for up to two years and Internet service provider (ISP) data for up to 12 months.

Similarly, the United Kingdom retains all telecommunications data for one to two years. It also maintains postal information (sender and receiver data), banking data for up to seven years, and vehicle movement data for up to two years. In Germany, metadata retention was established in 2008 under the directive Gesetz zur Neuregelung der Telekommunikationsüberwachung und anderer verdeckter Ermittlungsmaßnahmen sowie zur Umsetzung der Richtlinie 2006/24/EG, but it was overturned in 2010 by the Federal Constitutional Court of Germany, which ruled the law unconstitutional because it violated the fundamental right to secrecy of correspondence. In 2015, the issue was revisited, and a compromise was reached to retain telecommunications metadata for up to ten weeks. Mandatory data retention in Sweden was challenged by one holdout ISP, Bahnhof, which was threatened with an approximately US$605,000 fine in November 2014 if it did not comply [4]. Bahnhof defended its stance of protecting the privacy and integrity of its customers by offering a no-logs virtual private network free of charge [5].

Some European Union countries have been deliberating whether to extend metadata retention to chats and social media, but, in the United States, many corporations voluntarily retain subscriber data, including market giants Amazon and Google. It was reported in The Guardian in 2014 that the United States records Internet metadata for not only itself but the world at large through the National Security Agency (NSA), using its MARINA database to conduct pattern-of-life analysis [6]. Additionally, the 2008 Amendments Act to the Foreign Intelligence Surveillance Act of 1978 increased the time allotted for warrantless surveillance and added provisions for emergency eavesdropping. Under section 702 of the Amendments Act, all American citizens’ metadata is now stored. Phone records are kept by the NSA in the MAINWAY telephony metadata collection database [53], and short message service and other text messages worldwide are retained in DISHFIRE [7], [8].

Emerging Forms of Metadata in an Internet of Things World

Figure 1. An artificial pacemaker (serial number 1723182) from St. Jude Medical, with electrode, which was removed from a deceased patient prior to cremation. (Photo courtesy of Wikimedia Commons.)

The upward movement toward a highly interconnected world through the Web of Things and people [9] will only mean that even greater amounts of data will be retained by corporations and government agencies around the world, extending beyond traditional forms of telecommunications data (e.g., phone records, e-mail correspondence, Internet search histories, metadata of images, videos, and other forms of multimedia). It should not surprise us that even medical devices are being touted as soon to be connected to the Internet of Things (IoT) [10]. Heart pacemakers, for instance, already send a steady stream of data back to the manufacturer’s data warehouse (Figure 1). Cardiac rhythmic data is stored on the implantable cardioverter-defibrillator’s (ICD’s) memory and is transmitted wirelessly to a home bedside monitor. Via a network connection, the data find their way to the manufacturer’s data store (Figure 2).

Figure 2. The standard setup for an EKG. A patient lies in a bed with EKG electrodes attached to his chest, upper arms, and legs. A nurse oversees the painless procedure. The ICD in a patient produces an EKG (A), which can automatically be sent to an ICD manufacturer's data store (B). (Image courtesy of Wikimedia Commons.)

In health speak, the ICD setup in the patient’s home is a type of remote monitoring that usually happens when the ICD recipient is in a state of rest, most often while sleeping overnight. It is a bit like how routine computer data backups happen, when network traffic is at its lowest. In the future, an ICD’s proprietary firmware updates may well travel back down to the device remotely from the manufacturer, like installing a Windows operating system update on a desktop. In the following section, we will explore the implications of access to personal cardiac data emanating from heart pacemakers in two cases.
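The overnight, backup-like transmission routine described above can be sketched in a few lines. This is purely an illustrative sketch: the window times, function names, and record queue are my own assumptions, not any manufacturer's actual protocol.

```python
from datetime import datetime, time

# Assumed overnight rest window; real devices follow manufacturer-defined
# and clinician-configured schedules.
REST_WINDOW_START = time(1, 0)  # 1:00 a.m.
REST_WINDOW_END = time(5, 0)    # 5:00 a.m.

def in_rest_window(now: datetime) -> bool:
    """True when the clock time falls inside the overnight window."""
    return REST_WINDOW_START <= now.time() <= REST_WINDOW_END

def records_to_transmit(now: datetime, pending: list) -> list:
    """Return the queued cardiac records to send on to the manufacturer's
    data store, or an empty list if the patient is likely still active."""
    return list(pending) if in_rest_window(now) else []
```

Like a scheduled network backup, nothing is sent while the patient is likely awake and active; records queued during the day go out in one overnight batch.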

CASE 1: HUGO CAMPOS DENIED ACCESS TO HIS PERSONAL CARDIAC DATA

Figure 3. The conventional radiography of a single-chamber pacemaker. (Photo courtesy of wikimedia commons.)

In 2007, scientist Hugo Campos collapsed at a train station and was later horrified to find out that he had to get an ICD for his genetic heart condition. ICDs usually last about seven years before they require replacement (Figure 3). A few years into wearing the device, Campos, a high-end quantified-self user who measured his sleep, exercise, and even alcohol consumption, became curious about how he might gain access to the data generated by his ICD (Figure 4). He made some requests to the ICD’s manufacturer and was told that he was unable to receive the information he sought, despite his doctor having full access. Some doctors could even remotely download a patient’s historical data on a mobile app for 24/7 support during emergency situations (Figure 5). Campos’s heart specialist did grant him access to written interrogation reports, but Campos only saw him about once every six months after his condition stabilized. Additionally, the logs were of little use to him on paper, as the fields and layout were decipherable predominantly only by a doctor (Figure 6).

Figure 4. The Nike FuelBand is a wearable computer that has become one of the most popular devices driving the so-called quantified-self trend. (Photo courtesy of Wikimedia Commons.)

Dissatisfied with being denied access, Campos took matters into his own hands and purchased a device on eBay that could help him get the data. He also attended a specialist ICD course and then intercepted the cardiac rhythms being recorded [11]. He got to the data stream but realized that, to make sense of it from a patient perspective, a patient-centric app had to be built. Campos quickly deduced that regulatory and liability concerns were at the heart of the matter from the manufacturer’s perspective. How does a manufacturer continue to improve its product if it does not continually get feedback from the actual ICDs in the field? If manufacturers offered mobile apps for patients, might patients misread their own diagnoses? Is a manufacturer there to enhance life alone or to make a patient feel better about bearing an ICD? Can an ICD be misused by a patient? Or, in the worst-case scenario, what happens in the case of device failure? Or patient death? Would the proof lie onboard? Would the data tell the true story? These are all very interesting questions.

Figure 5. The medical waveform format encoding rule software on a BlackBerry device. It displays medical waveforms, such as EKG (shown), electroencephalogram, and blood pressure. Some doctors have software that allows them to interrogate EKG information, but patients presently do not have access to their own ICD data. (Photo courtesy of Wikimedia Commons.)

Campos might well have acted not only to get what he wanted (access to his data, his own way) but also to raise awareness globally as to the type of data being stored remotely by ICDs in patients. He noted in his TEDxCambridge talk in 2011 [12]:

the ICD does a lot more than just prevent a sudden cardiac arrest: it collects a lot of data about its own function and about the patient’s clinical status; it monitors its own battery life; the amount of time it takes to deliver a life-saving shock; it monitors a patient’s heart rhythm, daily activity; and even looks at variations in chest impedance to look if there is build-up of fluids in the chest; so it is a pretty complex little computer you have built into your body. Unfortunately, none of this invaluable data is available to the patient who originates it. I have absolutely no access to it, no knowledge of it.

Doctors, on the other hand, have full 24/7 unrestricted access to this information; even some of the manufacturers of these medical devices offer the ability for doctors to access this information through mobile devices. Compare this with the patients’ experience who have no access to this information. The best we can do is to get a printout or a hardcopy of an interrogation report when you go into the doctor’s office.

Figure 6. An EKG chart. Twelve different derivations of an EKG of a 23-year-old Japanese man. A similar log was provided to Hugo Campos upon his request for six months' worth of EKG readings. (Photo courtesy of Wikimedia Commons.)

Campos decided to sue the manufacturer after he was informed that the data generated by his ICD measuring his own heart activity were “proprietary data” [13]. Perhaps this is the new side of big data. But it is fraught with legal implications and, as far as I am concerned, blatantly dangerous. If we accept that a person’s natural biometric data (in this instance, the cardiac rhythm of an individual) belong to a third party, then we are headed into murky waters when we speak of even more invasive technology like deep-brain stimulators [14]. It not only means that the device is not owned by the electrophorus (the bearer of technology) [15], [16], but quite possibly that the cardiac rhythms unique to the individual are also owned by the device manufacturer. We should not be surprised. The “Software and Services” section of Google Glass’s terms of use states that Google has the right to “remotely disable or remove any such Glass service from user systems” at its “sole discretion” [17]. Placing this in the context of ICDs means that a third party effectively has the right to switch someone off.

CASE 2: ROSS COMPTON’S PACEMAKER DATA IS SUBPOENAED FOR CRIMINAL INVESTIGATIONS

Enter the Ross Compton case of Middletown, Ohio. M.G. Michael and I have dubbed it one of the first authentic uberveillance cases in the world, because the technology was not just wearable but embedded. The story goes something like this: on 27 January 2017, 59-year-old Ross Compton was indicted on arson and insurance fraud charges. Police had gained a search warrant to obtain his heart pacemaker readings (heart rate and cardiac rhythms) and called his alibi into question. Data from Compton’s pacemaker from before, during, and after the fire broke out in his home were disclosed by the heart pacemaker manufacturer after a subpoena was served. The insurer’s bill for the damage was estimated at about US$400,000. Police became suspicious of Compton when they traced gasoline to his shoes, trousers, and shirt.

In his statement of events to police, Compton told a story that conflicted with his 911 call. Forensic analysts found traces of multiple fires having been lit in various locations in the home. Yet Compton told police he had rushed his escape, breaking a window with his walking stick to throw some hastily packed bags out and then fleeing the flames himself to safety. Compton also told police that he had an artificial heart with a pump attached, a fact that he thought might help his cause but that was to be his undoing. In this instance, his pacemaker acted akin to a black box recording on an airplane [18].

After the heart pacemaker data set was secured, an independent cardiologist was asked to assess the telemetry data and determine whether Compton’s heart function was commensurate with the exertion needed to make a break with personal belongings during a life-threatening fire [19]. The cardiologist noted that, based on the evidence he was given to interpret, it was “highly improbable” that a man who suffered from Compton’s medical conditions could manage to collect, pack, and remove the number of items that he did from his bedroom window, escape himself, and then proceed to carry these items to the front of his house, out of harm’s way (see “Columbo, How to Dial a Murder”). Compton’s own cardio readings, in effect, snitched on him, and none were happier than the law enforcement officer in charge of the case, Lieutenant Jimmy Cunningham, who noted that the pacemaker data, while only a supporting piece of evidence, were vital in proving Compton’s guilt after gasoline was found on his clothing. Evidence-based policing has now well outstripped the more traditional intelligence-led approach, becoming entrenched given the new realm of big data availability [20], [21].

Columbo, How to Dial a Murder [S1]

Columbo says to the murderer: “You claim that you were at the physician’s getting your heart examined…which was true [Columbo unravels a roll of EKG readings]…the electrocardiogram, Sir. Just before three o’clock your physician left you alone for a resting trace. At that moment you were lying down in a restful position and your heart showed a calm, slow, easy beat [pointing to the EKG readout]. Look at this part, right here [Columbo points to the reading], lots of sudden stress, lots of excitement, right here at three o’clock, your heart beating like a hammer just before the dogs attacked…Oh you killed him with a phone call, Sir…I’ll bet my life on it. Very simple case. Not that I’m particularly bright, Sir…I must say, I found you disappointing, I mean your incompetence, you left enough clues to sink a ship. Motive. Opportunity. And for a man of your intelligence Sir, you got caught on a lot of stupid lies. A lot.”

[S1] Columbo: How to Dial a Murder. Directed by James Frawley. 1978. Los Angeles, CA: Universal Pictures Home Entertainment, 2006. DVD.

Consumer Electronics Tell a Story

Several things are now of interest to the legal community: first and foremost, how is the search warrant for a person’s pacemaker data executed? In case 1, Campos was denied access to his own ICD data stream by the manufacturer, and yet his doctor had full access. In case 2, Compton’s own data provided authorities with the extra evidence they needed to accuse him of fraud. This is yet another example of seemingly private data being used against an individual (in this instance, the person from whose body the data emanated), but in the future, for instance, the data from one person’s pacemaker might well implicate other members of the public. For example, the pacemaker might be able to prove that someone’s heart rate substantially increased during an episode of domestic violence [22] or that an individual was unfaithful in a marriage based on the cross matching of his or her time stamp and heart rate data with another.
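The cross-matching scenario sketched above is technically trivial once two time-stamped logs exist. The following is a hypothetical illustration only; the log format, resting-rate thresholds, and function name are my own assumptions for the sake of the example, not any vendor's API.

```python
def elevated_overlap(log_a, log_b, resting_a=70, resting_b=70):
    """Given two lists of (timestamp, beats_per_minute) readings, return the
    timestamps at which both subjects' heart rates exceed their resting
    rates, i.e., the kind of correlation a litigant might seek."""
    b_by_time = dict(log_b)
    return [t for t, bpm in log_a
            if bpm > resting_a and b_by_time.get(t, 0) > resting_b]

# Two hypothetical logs sampled at the same five-minute marks.
log_a = [("21:00", 68), ("21:05", 95), ("21:10", 102)]
log_b = [("21:00", 72), ("21:05", 98), ("21:10", 65)]
shared = elevated_overlap(log_a, log_b)  # ["21:05"]
```

That a few dictionary lookups suffice is precisely the point: once the data are retained, correlating them across people is the easy part.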

Of course, a consumer electronics device does not have to be embedded to tell a story (Figure 7). It can also be wearable or luggable, as in the case of a Fitbit that was used as a truth detector in an alleged rape case that turned out to be completely fabricated [23]. Lawyers are now beginning to experiment with other wearable gadgetry that helps to show the impact of personal injury from accidents (work and nonwork related) on a person’s ability to return to his or her normal course of activities [24] (Figure 8). We can certainly expect to see a rise in criminal and civil litigation that makes use of a person’s Samsung S Health data, for instance, which measure things like steps taken, stress, heart rate, SpO2, and even location and time (Figure 9). But cases like Compton’s open the floodgates.

Figure 7. A Fitbit, which measures calories, steps, distance, and floors. (Photo courtesy of Wikimedia Commons.)

Figure 8. A close-up of a patient wearing the iRhythm ZIO XT patch, nine days after its placement. (Photo courtesy of Wikimedia Commons.)

I have pondered the evidence itself: are heart rate data really any different from other biometric data, such as deoxyribonucleic acid (DNA)? Are they perhaps more revealing than DNA? Should they be dealt with in the same way? For example, is the chain of custody for data coming from a pacemaker equal to that of a DNA sample and profile? In some ways, heart rates can be considered a behavioral biometric [25], whereas DNA is actually a cellular sample [26]. No doubt we will be debating the challenges, and extreme perspectives will be hotly contested. But it seems nothing is off limits. If it exists, it can be used for or against you.

Figure 9. (a) and (b) The health-related data from Samsung's S Health application. Unknown to most is that Samsung has diversified its business to become the parent company of one of the world's largest health insurers. (Photos courtesy of Katina Michael.)

The Paradox of Uberveillance

In 2006, M.G. Michael coined the term uberveillance to denote “an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices in the human body” [27]. No doubt Michael’s background as a former police officer in the early 1980s, together with his cross-disciplinary studies, had something to do with his insights into the creation of the term [28]. This kind of surveillance does not watch from above; rather, it penetrates the body and watches from the inside, looking out [29].

Furthermore, uberveillance “takes that which was static or discrete…and makes it constant and embedded” [30]. It is real-time location and condition monitoring and “has to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought)” [30]. Uberveillance can be used prospectively or retrospectively. It can be applied as a “predictive mechanism for a person’s expected behavior, traits, likes, or dislikes; or it can be based on historical fact” [30].

In 2008, the term uberveillance was entered into the official Macquarie Dictionary of Australia [31]. In research spanning more than two decades on the social implications of implantable devices for medical and nonmedical applications, I predicted [15] that implantable devices once used solely for care purposes would one day be used retrospectively for tracking and monitoring purposes. Even if the consumer electronics in question were there to provide health care (e.g., the pacemaker example) or convenience (e.g., a near-field-communication-enabled smartphone), the underlying dominant function of the service would be control [32]. The socioethical implications of pervasive and persuasive emerging technologies have yet to be fully understood, but increasingly they will take center stage in court hearings, as DNA evidence did and, subsequently, global positioning system (GPS) data [33].

Medical device implants provide a very rich source of human activity monitoring, such as the electrocardiogram (EKG), heart rate, and more. Companies like Medtronic, among others specializing in implantables, have proposed a future where even healthy people carry a medical implant packed with sensors that could be life sustaining, detect heart problems (among other conditions), report them to a care provider, and signal when assistance might be required [34]. Heart readings provide an individual’s rhythmic biometrics and, at the same time, can record increases and decreases in activity. One could extrapolate that it won’t be long before our health insurance providers are asking for the same evidence in exchange for reduced premiums.

Figure 10. A pacemaker cemetery. (Photo courtesy of Wikimedia Commons.)

The future might well be one where we all carry a black-box implantable recorder of some sort [35], an alibi that proves our innocence or guilt, minute by minute (Figure 10). Of course, an electronic eye constantly recording our every move brings a new connotation to the wise words expressed in the story of Pinocchio: always let your conscience be your guide. Future black boxes may not be as forgiving as Jiminy Cricket, behaving more like Black Mirror’s “The Entire History of You” [36]. And if we assume that these technologies, whether implantable, wearable, or even luggable, are to be completely trusted, then we are wrong.

The contribution of M.G. Michael’s uberveillance is in the emphasis that the uberveillance equation is a paradox. Yes, there are near-real-time data flowing continuously from more points of view than ever [37]: closed-circuit TV looking down, smartphones in our pockets recording location and movement, and even implantables in some of us ensuring nontransferability of identity [38]. The proposition is that all this technology in sum total is bulletproof and foolproof, omniscient and omnipresent, a God’s-eye view that cannot be challenged, but for the fact that the infrastructure, the devices, and the software are all too human. And while uberveillance is being touted for good through an IoT world that will collectively make us and our planet more sustainable, there is one big crack in the utopian vision: the data can misrepresent, misinform, and be subject to information manipulation [39]. Researchers are already studying the phenomenon of complex visual information manipulation: how to tell whether data have been tampered with or a suspect has been introduced into or removed from a crime scene, and other forensic visual analytics [40]. It is why Vladimir Radunovic, director of cybersecurity and e-diplomacy programs at the DiploFoundation, cited M.G. Michael’s maxim that “big data must be followed by big judgment” [41].

What happens in the future if we go down the path of constant bodily monitoring of vital organs and vital signs, where we are all bearing some device or at least wearing one? Will we be in control of our own data, or, as seems obvious at present, will we not? How might self-incrimination play a role in our daily lives? Or, even worse, will we face individual expectations that can be met only by playing to a theater 24/7, so that our health statistics stack up to whatever measure and cross-examination they are put under, personally or publicly [42]? Can we believe the authenticity of every data stream coming out of a sensor onboard consumer electronics? The answer is no.

Having run many years of GPS data-logging experiments, I can say that a lot can go wrong with sensors, and they are susceptible to outside environmental conditions. For instance, they can log your location miles away (even on another continent), the temperature gauge can play up, time stamps can revert to different time zones, the speed of travel can be wildly inaccurate due to propagation delays in satellites, readings may not come at regular intervals due to some kind of interference, and memory overflow and battery issues, while getting better, are still problematic. The long and short of it is that technology cannot be trusted. At best, it can act as supporting evidence but should never replace eyewitness accounts. Additionally, “the inherent problem with uberveillance is that facts do not always add up to truth (i.e., as in the case of an exclusive disjunction T + T = F), and predictions based on uberveillance are not always correct” [30].
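The failure modes listed above are exactly what a forensic analyst would need to screen for before treating a GPS log as evidence. The following is a minimal sketch of such sanity checks; the thresholds and record layout are my own assumptions.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # Earth radius of roughly 6371 km

def flag_anomalies(points, max_kmh=200.0):
    """points: list of (epoch_seconds, latitude, longitude) fixes.
    Flags non-increasing time stamps and jumps implying an implausible
    speed, such as a logger suddenly placing its bearer on another
    continent."""
    flags = []
    for (t0, la0, lo0), (t1, la1, lo1) in zip(points, points[1:]):
        if t1 <= t0:
            flags.append((t1, "timestamp not increasing"))
            continue
        speed = haversine_km(la0, lo0, la1, lo1) / ((t1 - t0) / 3600.0)
        if speed > max_kmh:
            flags.append((t1, "implausible speed"))
    return flags
```

Checks like these can label a reading as suspect, but they cannot recover what the true reading should have been, which is why such logs belong in the supporting-evidence category.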

Conclusion

While device manufacturers are challenging in court claims that their ICDs are hackable [43], highly revered security experts like Bruce Schneier are heavily cautioning against going down the IoT path, no matter how inviting it might look. In his acclaimed blog, Schneier recently wrote [44]:

All computers are hackable…The industry is filled with market failures that, until now, have been largely ignorable. As computers continue to permeate our homes, cars, businesses, these market failures will no longer be tolerable. Our only solution will be regulation, and that regulation will be foisted on us by a government desperate to “do something” in the face of disaster…We also need to reverse the trend to connect everything to the internet. And if we risk harm and even death, we need to think twice about what we connect and what we deliberately leave uncomputerized. If we get this wrong, the computer industry will look like the pharmaceutical industry, or the aircraft industry. But if we get this right, we can maintain the innovative environment of the internet that has given us so much.

The cardiac implantables market is predicted to become a US$43 billion industry by 2020 [45]. Obviously, the stakes are high and getting higher with every breakthrough implantable innovation we develop and bring to market. We will need to address some very pressing questions, as Schneier suggests, through some form of regulation if we are to maintain consumer privacy rights and data security. Joe Carvalko, a former telecommunications engineer and U.S. patent attorney, an associate editor of IEEE Technology and Society Magazine, and a pacemaker recipient, has already added much to this discussion [46], [47]. I highly recommend several of his publications, including “Who Should Own In-the-Body Medical Data in the Age of eHealth?” [48] and an ABA publication coauthored with Cara Morris, The Science and Technology Guidebook for Lawyers [49]. Carvalko is a thought leader in this space, and I encourage you to listen to his podcast [50] and also to read his speculative fiction novel, Death by Internet [51], which is hot off the press and wrestles with some of the issues raised in this article.

REFERENCES

[1] K. Michael, M. Thistlethwaite, M. Rowland, and K. Pitt. (2015, Mar. 6). Standing Committee on Infrastructure and Communications, Section 313 of the Telecommunications Act 1997. [Online]. Available: http://parlinfo.aph.gov.au/parlInfo/search/display/display.w3p;db=COMMITTEES;id=committees%2Fcommrep%2Fd8727a07-ba09-4a91-9920-73d21e446d1d%2F0006;query=Id%3A%22committees%2Fcommrep%2Fd8727a07-ba09-4a91-9920-73d21e446d1d%2F0000%22

[2] S. Bronitt and K. Michael, “Human rights, regulation, and national security,” IEEE Technol. Soc. Mag., vol. 31, pp. 15–16, 2012.

[3] B. Hall. (2016, Dec. 22). Australians’ phone and email records could be used in civil lawsuits. Sydney Morning Herald. [Online]. Available: http://www.smh.com.au/federal-politics/political-news/australians-phone-and-email-records-could-be-used-in-civil-lawsuits-20161222-gtgdy6.html

[4] PureVPN. (2015, Oct. 14). Data retention laws—an update. [Online]. Available: https://www.purevpn.com/blog/data-retention-laws-by-countries/

[5] D. Crawford. (2014, Nov. 18). Renegade Swedish ISP offers all customers VPN. Best VPN. [Online]. Available: https://www.bestvpn.com/blog/11806/renegade-swedish-isp-offers-customers-vpn/

[6] J. Ball. (2013, Oct. 1). NSA stores metadata of millions of web users for up to a year, secret files show. Guardian. [Online]. Available: https://www.theguardian.com/world/2013/sep/30/nsa-americans-metadata-year-documents

[7] J. S. Granick, American Spies: Modern Surveillance, Why You Should Care, and What to Do About It. Cambridge, U.K.: Cambridge Univ. Press, 2017.

[8] A. Gregory, American Surveillance: Intelligence, Privacy, and the Fourth Amendment. Madison: Univ. of Wisconsin Press, 2016.

[9] K. Michael, G. Roussos, G. Q. Huang, A. Chattopadhyay, R. Gadh, B. S. Prabhu, and P. Chu, “Planetary-scale RFID services in an age of uberveillance,” Proc. IEEE, vol. 98, no. 9, pp. 1663–1671, 2010.

[10] N. Lars. (2015, Mar. 26). Connected medical devices, apps: Are they leading the IOT revolution—or vice versa? Wired. [Online]. Available: https://www.wired.com/insights/2014/06/connected-medical-devices-apps-leading-iot-revolution-vice-versa/

[11] H. Campos. (2015). The heart of the matter. Slate. [Online]. Available: http://www.slate.com/articles/technology/future_tense/2015/03/patients_should_be_allowed_to_access_data_generated_by_implanted_devices.html

[12] H. Campos. (2011). Fighting for the right to open his heart data: Hugo Campos at TEDxCambridge 2011. [Online]. Available: https://www.youtube.com/watch?v=oro19-l5M8k

[13] D. Hinckley. (2016, Feb. 22). This big brother/big data business goes way beyond Apple and the FBI. Huffington Post. [Online]. Available: http://www.huffingtonpost.com/david-hinckley/this-big-brotherbigdata_b_9292744.html

[14] K. Michael, “Mental health, implantables, and side effects,” IEEE Technol. Soc. Mag., vol. 34, no. 2, pp. 5–17, 2015.

[15] K. Michael, “The technological trajectory of the automatic identification industry: The application of the systems of innovation (SI) framework for the characterisation and prediction of the auto-ID industry,” Ph.D. dissertation, School of Information Technology and Computer Science, Univ. of Wollongong, Wollongong, Australia, 2003.

[16] K. Michael and M. G. Michael, “Homo electricus and the continued speciation of humans,” in The Encyclopedia of Information Ethics and Security, M. Quigley, Ed. Hershey, PA: IGI Global, 2007, pp. 312–318.

[17] Google Glass. (2014, Aug. 19). Glass terms of use. [Online]. Available: https://www.google.com/glass/termsofuse/

[18] K. Michael and M. G. Michael, “Implementing ‘namebers’ using microchip implants: The black box beneath the skin,” in This Pervasive Day: The Potential and Perils of Pervasive Computing, J. Pitt, Ed. London, U.K.: Imperial College Press, 2011.

[19] D. Smith. (2017, Feb. 4). Pacemaker data used to charge alleged arsonist. Jonathan Turley. [Online]. Available: https://jonathanturley.org/2017/02/04/pacemaker-data-used-to-charge-alleged-arsonist/

[20] K. Michael, “Big data and policing: The pros and cons of using situational awareness for proactive criminalisation,” presented at the Human Rights and Policing Conf., Australian National University, Canberra, Apr. 16, 2013.

[21] K. Michael and G. L. Rose, “Human tracking technology in mutual legal assistance and police inter-state cooperation in international crimes,” in From Dataveillance to Überveillance and the Realpolitik of the Transparent Society (The Second Workshop on Social Implications of National Security), K. Michael and M. G. Michael, Eds. Wollongong, Australia: University of Wollongong, 2007.

[22] F. Gerry, “Using data to combat human rights abuses,” IEEE Technol. Soc. Mag., vol. 33, no. 4, pp. 42–43, 2014.

[23] J. Gershman. (2016, Apr. 21). Prosecutors say Fitbit device exposed fibbing in rape case. Wall Street Journal. [Online]. Available: http://blogs.wsj.com/law/2016/04/21/prosecutors-say-fitbit-device-exposed-fibbing-in-rape-case/

[24] P. Olson. (2014, Nov. 16). Fitbit data now being used in the courtroom. Forbes. [Online]. Available: https://www.forbes.com/sites/parmyolson/2014/11/16/fitbit-data-court-room-personal-injury-claim/#459434e37379

[25] K. Michael and M. G. Michael, “The social and behavioural implications of location-based services,” J. Location Based Services, vol. 5, no. 3–4, pp. 121–137, Sept.–Dec. 2011.

[26] K. Michael, “The European court of human rights ruling against the policy of keeping fingerprints and DNA samples of criminal suspects in Britain, Wales and Northern Ireland: The case of S. and Marper v United Kingdom,” in The Social Implications of Covert Policing (Workshop on the Social Implications of National Security, 2009), S. Bronitt, C. Harfield, and K. Michael, Eds. Wollongong, Australia: University of Wollongong, 2010, pp. 131–155.

[27] M. G. Michael and K. Michael, “National security: The social implications of the politics of transparency,” Prometheus, vol. 24, no. 4, pp. 359–364, 2006.

[28] M. G. Michael, “On the ‘birth’ of uberveillance,” in Uberveillance and the Social Implications of Microchip Implants, M. G. Michael and K. Michael, Eds. Hershey, PA: IGI Global, 2014.

[29] M. G. Michael and K. Michael, “A note on uberveillance,” in From Dataveillance to Überveillance and the Realpolitik of the Transparent Society (The Second Workshop on Social Implications of National Security), M. G. Michael and K. Michael, Eds. Wollongong, Australia: University of Wollongong, 2007.

[30] M. G. Michael and K. Michael, “Toward a state of uberveillance,” IEEE Technol. Soc. Mag., vol. 29, pp. 9–16, 2010.

[31] M. G. Michael and K. Michael, “Uberveillance,” in Fifth Edition of the Macquarie Dictionary, S. Butler, Ed. Sydney, Australia: Sydney University, 2009.

[32] A. Masters and K. Michael, “Lend me your arms: The use and implications of humancentric RFID,” Electron. Commerce Res. Applicat., vol. 6, no. 1, pp. 29–39, 2007.

[33] K. D. Stephan, K. Michael, M. G. Michael, L. Jacob, and E. P. Anesta, “Social implications of technology: The past, the present, and the future,” Proc. IEEE, vol. 100, pp. 1752–1781, 2012.

[34] E. Strickland. (2014, June 10). Medtronic wants to implant sensors in everyone. IEEE Spectrum. [Online]. Available: http://spectrum.ieee.org/tech-talk/biomedical/devices/medtronic-wants-to-implant-sensors-in-everyone

[35] K. Michael, “The benefits and harms of national security technologies,” presented at the Int. Women in Law Enforcement Conf., Hyderabad, India, 2015.

[36] J. A. Brian Welsh. (2011). “The entire history of you,” Black Mirror, C. Brooker, Ed. [Online]. Available: https://www.youtube.com/watch?v=Sw3GIR70HAY

[37] K. Michael, “Sousveillance and point of view technologies in law enforcement,” presented at the Sixth Workshop on the Social Implications of National Security: Sousveillance and Point of View Technologies in Law Enforcement, University of Sydney, Australia, 2012.

[38] K. Albrecht and K. Michael, “Connected: To everyone and everything,” IEEE Technol. Soc. Mag., vol. 32, pp. 31–34, 2013.

[39] M. G. Michael, “The paradox of the uberveillance equation,” IEEE Technol. Soc. Mag., vol. 35, no. 3, pp. 14–16, 20, 2016.

[40] K. Michael, “The final cut—tampering with direct evidence from wearable computers,” presented at the Fifth Int. Conf. Multimedia Information Networking and Security (MINES 2013), Beijing, China, 2013.

[41] V. Radunovic, “Internet governance, security, privacy and the ethical dimension of ICTs in 2030,” IEEE Technol. Soc. Mag., vol. 35, no. 3, pp. 12–14, 2016.

[42] K. Michael. (2011, Sept. 12). The microchipping of people and the uberveillance trajectory. Social Interface. [Online]. Available: http://socialinterface.blogspot.com.au/2011/08/microchipping-of-people-and.html

[43] O. Ford. (2017, Jan. 12). Post-merger Abbott moves into 2017 with renewed focus, still faces hurdles. J.P. Morgan Healthcare Conf. 2017. [Online]. Available: http://www.medicaldevicedaily.com/servlet/com.accumedia.web.Dispatcher?next=bioWorldHeadlines_article&forceid=94497

[44] B. Schneier. (2017, Feb. 1). Security and the Internet of Things: Schneier on security. [Online]. Available: https://www.schneier.com/blog/archives/2017/02/security_and_th.html

[45] IndustryARC. (2015, July 30). Cardiac implantable devices market to reach $43 billion by 2020. GlobeNewswire. [Online]. Available: https://globenewswire.com/news-release/2015/07/30/756345/10143745/en/Cardiac-Implantable-Devices-Market-to-Reach-43-Billion-By-2020.html

[46] J. Carvalko, The Techno-Human Shell: A Jump in the Evolutionary Gap. Mechanicsburg, PA: Sunbury Press, 2013.

[47] J. Carvalko and C. Morris, “Crowdsourcing biological specimen identification: Consumer technology applied to health-care access,” IEEE Consum. Electron. Mag., vol. 4, no. 1, pp. 90–93, 2014.

[48] J. Carvalko, “Who should own in-the-body medical data in the age of ehealth?” IEEE Technol. Soc. Mag., vol. 33, no. 2, pp. 36–37, 2014.

[49] J. Carvalko and C. Morris, The Science and Technology Guidebook for Lawyers. New York: ABA, 2014.

[50] K. Michael and J. Carvalko. (2016, June 20). Joseph Carvalko speaks with Katina Michael on his non-fiction and fiction pieces. [Online]. Available: https://www.youtube.com/watch?v=p4JyVCba6VM

[51] J. Carvalko, Death by Internet. Mechanicsburg, PA: Sunbury Press, 2016.

[52] R. Pearce. (2017, June 7). “No-one’s talking about backdoors” for encrypted services, says PM’s cyber guy. Computerworld. [Online]. Available: https://www.computerworld.com.au/article/620329/no-one-talking-about-backdoors-says-pm-cyber-guy/

[53] M. Ambinder. (2013, Aug. 14). An educated guess about how the NSA is structured. The Atlantic. [Online]. Available: https://www.theatlantic.com/technology/archive/2013/08/an-educated-guess-about-how-the-nsa-is-structured/278697/

Acknowledgment

A short form of this article was presented as a video keynote speech for the Fourth International Conference on Innovations in Information, Embedded and Communication Systems in Coimbatore, India, on 17 March 2017. The video is available at https://www.youtube.com/watch?v=bEKLDhNfZio.

Keywords

Metadata, Electrocardiography, Pacemakers, Heart beat, Telecommunication services, Implants, Biomedical equipment, cardiology, criminal law, medical computing, police data processing, transport protocols, implantable medical device, heart, Australian inquiry, government agencies, illegal online services, mandatory metadata retention laws, government organizations, law enforcement organizations, Internet protocol

Citation: Katina Michael, 2017, "Implantable Medical Device Tells All: Uberveillance Gets to the Heart of the Matter", IEEE Consumer Electronics Magazine, Vol. 6, No. 4, Oct. 2017, pp. 107–115, DOI: 10.1109/MCE.2017.2714279.

 

Be Vigilant: There Are Limits to Veillance

The Computer After Me: Awareness and Self-Awareness in Autonomic Systems

Chapter 13: Be Vigilant: There Are Limits to Veillance


Katina Michael, M. G. Michael, Christine Perakslis

The following sections are included:

  • Introduction

  • From Fixed to Mobile Sensors

  • People as Sensors

  • Enter the Veillances

    • Surveillance

    • Dataveillance

    • Sousveillance

    • Überveillance

  • Colliding Principles

    • From ‘drone view’ to ‘person view’

    • Transparency and open data

    • Surveillance, listening devices and the law

    • Ethics and values

    • The unintended side effects of lifelogging

    • Pebbles and shells

    • When bad is good

    • Censorship

  • Summary and Conclusions: Mind/Body Distinction

13.1 Introduction

Be vigilant; we implore the reader. Yet vigilance requires hard mental work (Warm et al., 2008). Humans have repeatedly shown poor performance on tasks requiring vigilance, especially when facing factors such as complex or novel data, time pressure, and information overload (Ware, 2000). For years, researchers have investigated vigilance, from its positive impact upon the survival of the ground squirrel in Africa to the decrement behind the poor performance of air traffic controllers. Scholars seem to agree: fatigue has a negative bearing on vigilance.

In our society, we have become increasingly fatigued, both physically and cognitively. It has been widely documented that employees are increasingly faced with time starvation, and that consequently self-imposed sleep deprivation is one of the primary reasons for increasing fatigue, as employees forego sleep in order to complete more work (see, for example, the online publications by the Society of Human Resources1 and the National Sleep Foundation2). Widespread access to technology exacerbates the problem, by making it possible to stay busy round the clock.

Our information-rich world, which leads to information overload and novel data, and our 24/7/365 connectivity, which leads to time pressure, both contribute to fatigue and so work against vigilance. However, a lack of vigilance – the failure to accurately perceive, identify, or analyze bona fide threats – can lead to serious negative consequences, even a life-threatening state of affairs (Capurro, 2013).

This phenomenon, which can be termed vigilance fatigue, can be brought about by four factors:

·       Prolonged exposure to ambiguous, unspecified, and ubiquitous threat information.

·       Information overload.

·       Overwhelming pressure to maintain exceptional, error-free performance.

·       Faulty strategies for structuring informed decision-making under conditions of uncertainty and stress.

Therefore, as we ask the reader to be vigilant in this transformative, and potentially disruptive, transition toward the ‘computer after me’, we feel obligated to articulate clearly the potential threats associated with veillance. We believe we must ask the challenging and unpopular questions now. We must disclose and discuss the existence of risk, the values at stake, and the possibility of harm related to veillance. We owe it to the reader in this world of increasing vigilance fatigue to provide unambiguous, specified threat information and to bring it to their attention.

13.2 From Fixed to Mobile Sensors

Embedded sensors have provided us with a range of benefits and conveniences that many of us take for granted in our everyday life. We now find commonplace the auto-flushing lavatory and the auto-dispensing of soap and water for hand washing. Many of these practices are not only convenient but help to maintain health and hygiene. We even have embedded sensors in lamp-posts that can detect on-coming vehicles and are so energy efficient that they turn on as they detect movement, and then turn off again to conserve resources. However, these fixtures are static; they form basic infrastructure that often has ‘eyes’ (e.g. an image and/or motion sensor), but does not have ‘legs’.

What happens when these sensors – for identification, location, condition monitoring, point-of-view (POV) and more – become embeddable in mobile objects and begin to follow and track us everywhere we go? Our vehicles, tablets, smart phones, and even contactless smart cards are equipped to capture, synthesize, and communicate a plethora of information about our behaviors, traits, likes and dislikes, as we lug them around everywhere we go. Automatic licence plate scanners are mounted not only in streetlights or on bridges, but now also on patrol cars. These scanners snap photos of passing automobiles and store such data as plate numbers, times, and locations within massive databases (Clarke, 2009). Stores are combining the use of static fixtures with mobile devices to better understand the psychographics and demographics of their shoppers (Michael and Clarke, 2013). The combination of these monitoring tools is powerful. Cell phone identifiers are used to track the movements of customers (even if the customer is not connected to the store’s WiFi network), with the surveillance cameras collecting biometric analytics to analyze facial expressions and moods. Along with an augmented capability to customize and personalize marketing efforts, the stores can identify how long one tarries in an aisle, the customer’s reaction to a sale item, the age of the shopper, and even who did or did not walk by a certain display.

The human has now become an extension (voluntarily or involuntarily) of these location-based and affect-based technological breakthroughs; we the end-users are in fact the end-point of a complex network of networks. The devices we carry take on a life of their own, sending binary data up and down stream in the name of better connectivity, awareness, and ambient intelligence. ‘I am here’, the device continuously signals to the nearest access node, handshaking a more accurate location fix, as well as providing key behavioral indicators which can easily become predictors of future behaviors. However, it seems as if we, as a society, are demanding more and more communications technology – or so that is the idea we are being sold. Technology has its many benefits: few people are out of reach now, and communication becomes easier, more personalized, and much more flexible. Through connectivity, people’s input is garnered and responses can be felt immediately. Yet, just as Newton’s action–reaction law comes into play in the physical realm, there are reactions to consider for the human not only in the physical realm, but also in the mental, emotional, and spiritual realms (Loehr and Schwartz, 2001), when we live our lives not only in the ordinary world, but also within the digital world.

Claims have been made that our life has become so busy today that we are grasping to gain back seconds in our day. It could be asked: why should we waste time and effort by manually entering all these now-necessary passwords, when a tattoo or pill could transmit an 18-bit authentication signal for automatic logon from within our bodies? We are led to believe that individuals are demanding uninterrupted connectivity; however, research has shown that some yearn to have the freedom to ‘live off the grid’, even if for only a short span of time (Pearce and Gretzel, 2012).

A recent front cover of the US business magazine Fast Company read “Unplug. My life was crazy. So I disconnected for 25 days. You should too”. The content within the publication includes coping mechanisms of senior-level professionals who are working to mitigate the consequences of perpetual connectivity through technology. One article reveals the digital dilemmas we now face (e.g. how much should I connect?); another article provides tips on how to do a digital detox (e.g. disconnecting because of the price we pay); and yet another article outlines how to bring sanity to your crazy, wired life with eight ways the busiest connectors give themselves a break (e.g. taking time each day to exercise in a way that makes it impossible to check your phone; ditching the phone to ensure undivided attention is given to colleagues; or establishing a company ‘Shabbat’ in which it is acceptable to unplug one day a week). Baratunde Thurston, CEO and cofounder of Cultivated Wit (and considered by some to be the world’s most connected man), wrote:

I love my devices and my digital services, I love being connected to the global hive mind – but I am more aware of the price we pay: lack of depth, reduced accuracy, lower quality, impatience, selfishness, and mental exhaustion, to name but a few. In choosing to digitally enhance lives, we risk not living them.
— (Thurston, 2013, p. 77)

13.3 People as Sensors

Enter Google Glass, Autographer, Memoto, TrackStick, Fitbit, and other wearable devices that are worn like spectacles, apparel, or tied round the neck. The more pervasive innovations such as electronic tattoos, nanopatches, smart pills, and ICT implants seamlessly become a ‘part’ of the body once attached, swallowed, embedded, or injected. These technologies are purported to be lifestyle choices that can provide a myriad of conveniences and productivity gains, as well as improved health and well-being functionality. Wearables are believed to have such benefits as enhancements to self-awareness, communication, memory, sensing, recognition, and logistical skills. Common experiences can be augmented, for example when a theme park character (apparently) knows your child’s name because of a wrist strap that acts as an admissions ticket, wallet, and ID.

Gone are the days when there was a stigma around electronic bracelets being used to track those on parole; these devices are now becoming much like a fashion statement and a desirable method not only for safety and security, but also for convenience and enhanced experiences. However, one must consider that an innocuous method for convenience may prove to create ‘people as sensors’ in which information is collected from the environment using unobtrusive measures, but with the wearer – as well as those around the wearer – possibly unaware of the extent of the data collection. In addition to issues around privacy, other questions must be asked such as: what will be done with the data now and well into the future?

The metaphor of ‘people as sensors’, also referred to as Citizens as Sensors (Goodchild, 2007), is being espoused, as on-board chipsets allow an individual to look out toward another object or subject (e.g. using an image sensor), or to look inward toward oneself (e.g. measuring physiological characteristics with embedded surveillance devices). As optional prosthetic devices are incorporated into users, devices are recognized by some as becoming an extension of the person’s mind and body. New developments in ‘smart skin’ offer even more solutions. The skin can become a function of the user’s habits, personality, mood, or behavior. For example, when inserted into a shoe, the smart skin can analyze and improve the technical skill of an athlete, factors associated with body stresses related to activity, or even health issues that may result from the wearer’s use of high-heeled shoes (Papakostas et al., 2002). Simply put, human beings who function in analog are able to communicate digitally through the devices that they wear or bear. This is quite a different proposition from the typical surveillance camera that is bolted onto a wall overlooking the streetscape or mall and has a pre-defined field of view.

Fig. 13.1 People as sensors: from surveillance to uberveillance

‘People as sensors’ is far more pervasive than dash-cams used in police vehicles, and can be likened to law enforcement agencies putting on body-worn devices to collect real-time data from the field (see Figure 13.1). When everyday citizens are wearing and bearing these devices, they form a collective network by contributing individual subjective (and personal) observations of themselves and their surroundings. There are advantages; the community is believed to benefit from relevant, real-time information on such issues as public safety, street damage, weather observations, traffic patterns, and even public health (cf. Chapter 12). People, using their everyday devices, can enter information into a data warehouse, which could also reduce the cost of intensive physical networks that otherwise need to be deployed. The picture is murkier when it comes to vulnerability: there is, for instance, the risk of U-VGI (Un-Volunteered Geographical Information), as when mass movements in a cell phone network are tracked to ascertain traffic distribution (Resch, 2013).

Consider it a type of warwalking on foot rather than wardriving.3 It seems that opt-in and opt-out features are not deemed necessary, perhaps due to the perceived anonymity of individual user identifiers. How to ‘switch off’, ‘turn off’, ‘unplug’, or select the ‘I do not consent’ feature in a practical way is a question that many have pondered, but with arguably a limited number of pragmatic solutions, if any.

With ‘citizens as sensors’ there is an opt-in for those subscribing, but issues need to be considered for those in the vicinity of the bearer who did not consent to subscribe or to be recorded. Researchers contend that even the bearer must be better educated on the potential privacy issues (Daskala, 2011). For example, user-generated information yields longitude and latitude coordinates, time and date stamps, and speed and elevation details which tell us significant aspects about a person’s everyday life, leading to insight about current and predictive behavioral patterns. Data could also be routinely intercepted (and stored indefinitely), as has been alleged in the recent National Security Agency (NSA) scandal. Even greater concerns arise from the potential use of dragnet electronic surveillance to be mined for information (now or in the future) to extract and synthesize rich heterogeneous data containing personal visual records and ‘friends lists’ of the new media. Call detail records (CDRs) may just be the tip of the iceberg.
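To make concrete how routine coordinate-and-timestamp records of this kind can reveal behavioral patterns, here is a minimal illustrative sketch in Python. All records, field layouts, and thresholds are invented for this example and do not come from any real system; it simply shows that a likely ‘home’ location falls out of nothing more than where a device reports itself overnight:

```python
from collections import Counter
from datetime import datetime

# Hypothetical location records: (ISO timestamp, latitude, longitude)
records = [
    ("2017-03-01T23:10:00", -34.4278, 150.8931),
    ("2017-03-02T02:45:00", -34.4279, 150.8930),
    ("2017-03-02T13:20:00", -34.4054, 150.8784),  # daytime, elsewhere
    ("2017-03-03T00:05:00", -34.4278, 150.8932),
    ("2017-03-03T23:55:00", -34.4277, 150.8931),
]

def infer_home(records, night_start=22, night_end=6, grid=0.001):
    """Guess 'home' as the most frequent coarse grid cell seen at night."""
    cells = Counter()
    for ts, lat, lon in records:
        hour = datetime.fromisoformat(ts).hour
        if hour >= night_start or hour < night_end:
            # Snap coordinates to a ~100 m grid cell
            cells[(round(lat / grid) * grid, round(lon / grid) * grid)] += 1
    cell, _ = cells.most_common(1)[0]
    return cell

print(infer_home(records))
```

Four of the five invented fixes occur overnight in the same grid cell, so that cell is flagged as the probable residence; the same trivial logic extends to inferring workplaces, commutes, and habits.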

The quantified-self movement, which incorporates data from many inputs of a person’s daily life, is being used for self-tracking and community building so individuals can work toward improving their daily functioning (e.g. how you look, feel, and live). Because devices can look inward toward oneself, one can mine very personal data (e.g. body mass index and heart rate) which can then be combined with the outward (e.g. the vital role of your community support network) to yield such quantifiers as a higi score defining a person with a cumulative grade (e.g. your score today out of a possible 999 points).4

Wearables, together with other technologies, assist in the process of taking in multiple and varied data points to synthesize the person’s mental and physical performance (e.g. sleep quality), psychological states such as moods and stimulation levels (e.g. excitement), and other inputs such as food, air quality, location, and human interactions. Neurologically, information is addictive; yet, humans may make worse decisions when more information is at hand. Humans are also believed to overestimate the value of missing data, which may lead to an endless pursuit, or perhaps an overvaluing, of useless information (Bastardi and Shafir, 1998). Even more consequential, too much introspection may also reduce the quality of individuals’ decisions.
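As a purely hypothetical illustration of how such a cumulative grade might be assembled, the toy Python function below folds a few inward and outward inputs into a single score out of 999. The weightings, normalizations, and field names are invented here, loosely echoing the higi score idea above, and do not reflect any vendor's actual formula:

```python
def wellness_score(resting_hr, bmi, sleep_hours, social_checkins):
    """Toy cumulative grade out of 999 from a few daily inputs.
    All weightings are invented for illustration only."""
    # Each component is normalized to the range 0..1, then weighted.
    hr_component = max(0.0, min(1.0, (100 - resting_hr) / 50))   # 50 bpm -> 1.0
    bmi_component = max(0.0, min(1.0, 1 - abs(bmi - 22) / 10))   # 22 taken as 'ideal'
    sleep_component = max(0.0, min(1.0, sleep_hours / 8))
    social_component = max(0.0, min(1.0, social_checkins / 5))
    weighted = (0.35 * hr_component + 0.25 * bmi_component
                + 0.25 * sleep_component + 0.15 * social_component)
    return round(weighted * 999)

print(wellness_score(resting_hr=60, bmi=24, sleep_hours=7, social_checkins=3))
```

The point of the sketch is not the arithmetic but the aggregation: heterogeneous, individually innocuous measurements are reduced to one ranked number about a person.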

13.4 Enter the Veillances

Katina Michael and M. G. Michael (2009) made a presentation that, for the first time at a public gathering, considered surveillance, dataveillance, sousveillance, and überveillance all together. As a specialist term, veillance was first used in an important blogpost exploring equiveillance by Ian Kerr and Steve Mann (2006), in which the ‘valences of veillance’ were briefly described. But in contrast to Kerr and Mann, Michael and Michael were pondering the intensification of a state of überveillance through increasingly pervasive technologies, which can provide details from the big-picture view right down to the minuscule personal details.

But what does veillance mean? And how is it understood in different contexts? What does it mean to be watched by a CCTV camera, to have one’s personal details deeply scrutinized, to watch another, to watch oneself? And so we continue by defining the four types of veillance that have received attention in recognized peer-reviewed journal publications and the wider corpus of literature.

13.4.1 Surveillance

First, there is the much-embraced idea of surveillance, recognized in the early nineteenth century, from the French sur meaning ‘over’ and veiller meaning ‘to watch’. According to the Oxford English Dictionary, veiller stems from the Latin vigilare, which means ‘to keep watch’.

13.4.2 Dataveillance

Dataveillance was conceived by Clarke (1988a) as “the systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons” (although in the Oxford English Dictionary it is now defined as “the practice of monitoring the online activity of a person or group”). The term was introduced in response to government agency data-matching initiatives linking taxation records and social security benefits, among other commercial data mining practices. At the time it was a powerful response to the Australia Card proposal of 1987 (Clarke, 1988b), which was never implemented by the Hawke Government; the Howard Government’s attempt to introduce an Access Card almost two decades later, in 2005, was also unsuccessful. It is remarkable that the same issues persist today, only at a greater magnitude, with more consequences and advanced capabilities in analytics, data storage, and converging systems.

13.4.3 Sousveillance

Sousveillance was defined by Steve Mann in 2002, but practiced since 1995, as “the recording of an activity from the perspective of a participant in the activity”.5 However, its initial introduction into the literature came in the inaugural Surveillance and Society journal in 2003 with a meaning of ‘inverse surveillance’ as a counter to organizational surveillance (Mann et al., 2003). Mann prefers to interpret sousveillance as under-sight, which maintains integrity, contra to surveillance as over-sight (Mann, 2004a), which reduces to hypocrisy if governments responsible for surveillance pass laws to make sousveillance illegal.

Whereas dataveillance is the systematic use of personal data systems in the monitoring of people, sousveillance is the inverse of monitoring people; it is the continuous capture of personal experience (Mann, 2004b). For example, dataveillance might include the linking of someone’s tax file number with their bank account details and communications data. Sousveillance, on the other hand, is a voluntary act of logging what people might see as they move through the world. Surveillance is thus considered watching from above, whereas sousveillance is considered watching from below. In contrast, dataveillance is the monitoring of a person’s activities, which presents the individual with numerous social dangers (Clarke, 1988a).

13.4.4 Überveillance

Überveillance, conceived by M. G. Michael in 2006, is defined in the Australian Law Dictionary as: “ubiquitous or pervasive electronic surveillance that is not only ‘always on’ but ‘always with you’, ultimately in the form of bodily invasive surveillance”. The Macquarie Dictionary of Australia entered the term officially in 2008 as “an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices in the human body”. Michael and Michael (2007) defined überveillance as having “to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought)”.
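The who/where/when triad in this definition can be pictured as the raw record an embedded device emits, from which the why/what/how must be inferred by an analytics layer. The Python sketch below is purely illustrative: the record fields, identifiers, and the crude ‘habitual place’ rule are all invented here, standing in for far richer real-world inference:

```python
from dataclasses import dataclass

@dataclass
class VeillanceRecord:
    """The observable triad an embedded sensor can report directly."""
    who: str      # ID, e.g. an implant's unique identifier
    where: tuple  # location as (latitude, longitude)
    when: str     # ISO 8601 timestamp

def infer_motivation(trail):
    """Toy stand-in for the 'why/what/how' layer: repeated visits
    to the same place are flagged as habitual behavior."""
    seen = {}
    for r in trail:
        seen[r.where] = seen.get(r.where, 0) + 1
    # A place visited three or more times is labeled 'habitual'
    return {place: "habitual" for place, count in seen.items() if count >= 3}

trail = [
    VeillanceRecord("implant-042", (-34.41, 150.88), "2017-05-01T08:00:00"),
    VeillanceRecord("implant-042", (-34.41, 150.88), "2017-05-02T08:05:00"),
    VeillanceRecord("implant-042", (-34.41, 150.88), "2017-05-03T07:58:00"),
    VeillanceRecord("implant-042", (-34.50, 150.90), "2017-05-03T12:00:00"),
]
print(infer_motivation(trail))
```

Even this trivially simple rule moves from the observable (ID, location, time) to a claim about intent, which is precisely the leap the definition describes.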

Überveillance is a compound word, conjoining the German über meaning ‘over’ or ‘above’ with the French veillance. The concept is very much linked to Friedrich Nietzsche’s vision of the übermensch, who is a man with powers beyond those of an ordinary human being, like a super-man with amplified abilities (Michael and Michael, 2010). Überveillance is analogous to big brother on the inside looking out. Think, for example, of heart, pulse, and temperature sensor readings emanating wirelessly from the body in binary bits, or even of amplified eyes such as inserted ‘glass’ contact lenses that might provide a visual display and access to the Internet or social networking applications.

Überveillance brings together all forms of watching from above and from below, from machines that move to those that stand still, from animals and from people, acquired involuntarily or voluntarily using obtrusive or unobtrusive devices (Michael et al., 2010). The network infrastructure underlies the ability to collect data direct from the sensor devices worn by the individual, and big data analytics ensures an interpretation of the unique behavioral traits of the individual, implying more than just predicted movement, but intent and thought (Michael and Miller, 2013).

It has been said that überveillance is that part of the veillance puzzle that brings together the sur, data, and sous to an intersecting point (Stephan et al., 2012). In überveillance, there is the ‘watching’ from above component (sur), there is the ‘collecting’ of personal data and public data for mining (data), and there is the watching from below (sous), which can draw together social networks and strangers, all coming together via wearable and implantable devices on/in the human body. Überveillance can be used for good in the practice of health, for instance, but we contend that, independent of its application for non-medical purposes, it will always have an underlying control factor (Masters and Michael, 2006).

13.5 Colliding Principles

13.5.1 From ‘drone view’ to ‘person view’

It can be argued that, because a CCTV camera is monitoring activities from above, we should have the ‘counter-right’ to monitor the world around us from below. It therefore follows that, if Google can record ‘street views’, then the average citizen should also be able to engage in that same act, which we may call ‘person view’. Our laws as a rule do not forbid recording the world around us (or even each other, for that matter), so long as we are not encroaching on someone else’s well-being or privacy (e.g. stalking, or making material public without expressed consent). While we have Street View today, it will only be a matter of time before we have ‘drones as a service’ (DaaS) products that systematically provide even better high-resolution imagery than ‘satellite views’. If we can make ‘drone view’ available on Google Maps, we could probably also make ‘person view’ available. Want to look up not only a street, but a person, if they are logged in and registered? Then search ‘John Doe’ and find the nearest camera pointing toward him, and/or emanating from him. Call it a triangulation of sorts.

13.5.2 Transparency and open data

The benefits of this kind of transparency, argue numerous scholars, are that not only will we have a perfect source of open data to work with, but that there will be less crime as people consider the repercussions of being caught doing wrong in real time. However, this is quite an idealistic paradigm, and it is ethically flawed. Criminals, and non-criminals for that matter, find ways around all secure processes, no matter how technologically foolproof. At that point, the technical elite might well be systematically hiding or erasing their recorded misdemeanours while no doubt keeping the innocent person under 24/7/365 watch. There are, however, varying degrees of transparency, and most of these have to do with economies of scale and/or are context-based; they have to be. In short, transparency needs to be context-related.

13.5.3 Surveillance, listening devices and the law

At what point do we actually believe that in a public space our privacy is not invaded by such incremental innovations as little wearable cameras, half the size of a matchbox, worn as lifelogging devices? One could speculate that the sheer size of these devices makes them unobtrusive and not easily detectable to the naked eye, meaning that they are covert in nature and blatantly break the law in some jurisdictions where they are worn and operational (Abbas et al., 2011). Some of these devices not only capture images every 30 seconds, but also record audio, making them potentially a form of unauthorized surveillance. It is also not always apparent when these devices are on or off. We must consider that the “unrestricted freedom of some may endanger the well-being, privacy, or safety of others” (Rodota and Capurro, 2005, p. 23). Where are the distinctions between the wearer’s right to capture his or her own personal experiences on the one hand (i.e. the unrestricted freedom of some), and intrusion into another’s private sphere in which he or she does not want to be recorded, and is perhaps even disturbed by the prospect of losing control over his or her privacy (i.e. endangering the well-being or privacy of others)?

13.5.4 Ethics and values

Enter ethics and values. Ethics in this debate are greatly important. They have been dangerously pushed aside, for it is ethics that determine the degree of importance, that is the value, we place on the levels of our decision-making. When is it right to take photographs and record another individual (even in a public space), and when is it wrong? Do I physically remove my wearable device when I enter a washroom, a leisure centre, a hospital, a funeral, someone else’s home, a bedroom? Do I need to ask express permission from someone to record them, even if I am a participant in a shared activity? What about unobtrusive devices that blur the line between wearables and implantables, such as miniature recording devices embedded in spectacle frames or eye sockets and possibly in the future embedded in contact lenses? Do I have to tell my future partner or prospective employer? Should I declare these during the immigration process before I enter the secure zone?

At the same time, independent of how much crowdsourced evidence is gathered for a given event, wearables and implantables are not infallible: their sensors can easily misrepresent reality through inaccurate or incomplete readings, and data can be further misconstrued after capture (Michael and Michael, 2007). This is the limitation of an überveillance society – devices are equipped with a myriad of sensors; they are celebrated as achieving near omnipresence, but the reality is that they will never be able to achieve omniscience. Finite knowledge and imperfect awareness create much potential for inadequate or incomplete interpretations.

Some technologists believe that they need to rewrite the books on metaphysics and ontology, as a result of old and outmoded definitions in the traditional humanities. We must be wary of our increasingly ‘technicized’ environment, however, and continue to test ourselves on the values we hold as canonical, which go towards defining a free and autonomous human being. The protection of personal data has been deemed by the EU to be an autonomous individual right.

Yet, with such pervasive data collection, how will we protect “the right of informational self-determination on each individual – including the right to remain master of the data concerning him or her” (Rodota and Capurro, 2005, p. 17)? If we rely on bio-data to drive our next move based on what our own wearable sensors tell some computer application is the right thing to do, we may very well lose a great part of our freedom and the life-force of improvisation and spontaneity. By allowing this data to drive our decisions, we make ourselves prone to algorithmic faults in software programs, among other significant problems.

13.5.5 The unintended side effects of lifelogging

Lifelogging captures continuous first-person recordings of a person’s life and can now be dynamically integrated into social networking and other applications. If lifelogging is recording your daily life with technical tools, then many people are unintentionally participating in a form of lifelogging by recording their lives through social networks – although, technically, data capture in social media happens in bursts (e.g. the upload of a photograph) rather than as a continuous first-person recording (e.g. glogger.mobi) (Daskala, 2011). Lifelogging is believed to have such benefits as affecting how we remember, increasing productivity, reducing an individual’s sense of isolation, building social bonds, capturing memories, and enhancing communication.

Governing bodies could also derive benefit from lifelogging application data to better understand public opinion or forecast emerging health issues for society. However, memories gathered by lifelogs can have side effects. Not every image or recording you take will be a happy one, and replaying these and other moments might be detrimental to our well-being. History shows that ‘looking back’ may become traumatic: consider Marina Lutz’s experience of having much of the first 16 years of her life either recorded or photographed by her father (see the short film The Marina Experiment).

Researchers have discovered that personality development and mental health could also be negatively impacted by lifelogging applications. Vulnerabilities include high influence potential by others, suggestibility, weak perception of self, and a resulting low self-esteem (Daskala, 2011). There is also a risk that wearers may post undesirable or personal expressions of another person, causing that person emotional harm due to a negative perception of himself or herself among third parties (Daskala, 2011). We have already witnessed such events in other social forums, with tragic consequences such as suicides.

Lifelogging data may also create unhealthy competition, for example in gamification programs that use higi scores to compare your quality of life to that of others. Studies report psychological harm among those who perceive they do not meet peer expectations (Daskala, 2011); how much more so when intimate data about one’s physical, emotional, psychological, and social network is integrated, measured, and calculated to sum up quality of life in a three-digit score (Michael and Michael, 2011). Even the effect of sharing positive lifelogging data should be reconsidered. Various reports have claimed that watching other people’s lives can develop into an obsession and can incite envy, feelings of inadequacy, or the sense that one is not accomplished enough, especially when comparing oneself to others.

13.5.6 Pebbles and shells

Perhaps lifelogs could have the opposite effect of their intended purpose, without ever denying their numerous positives. We may become wrapped up in the self, rather than in the common good, playing to a theatre and not allowing ourselves to flourish in other ways lest we are perceived as anything but normal. Such logging, posted onto public Internet archival stores, might well serve to promote a conflicting identity of the self: constant validation through page ranks, hit counts and likes, and other forms of electronic exhibitionism. Researchers purport that lifelogging activities are likely to lead to an over-reliance and excessive dependency on electronic devices and systems, with emotionally concerning, ongoing cognitive reflections as messages are posted or seen, and this could be at the expense of more important aspects of life (Daskala, 2011).

Isaac Newton gave us much to consider when he said, “I was like a boy playing on the sea-shore, and diverting myself now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me” (Brewster, 2001). Society at large must question whether the measurement of Google hits, higi scores, clicks, votes, recordings, and the analysis of data to quantify ‘the self’ could become a dangerously distracting exercise if left unbalanced. These measurements, which are multi-varied and enormously insightful, may be of value – and of great enjoyment and fascination – much like Newton’s pebbles and shells. However, what is the ocean we may overlook – or ignore – as we scour the beach for pebbles and shells?

13.5.7 When bad is good

Data collection and analysis systems, such as lifelogging, may not appropriately allow for individuals to progress in self-awareness and personal development upon tempered reflection. How do we aptly measure the contradictory aspects of life, such as the healing that often comes through tears, the expending of energy (exercise) to gain energy (physical health), or the unique wonder that is realized only through the pain of self-sacrifice (e.g. veritable altruistic acts)? Harvard researchers Loehr and Schwartz (2001) provide us with further evidence of how the bad (or the unpleasant) can be good relative to personal development, through an investigation in which a key participant went by the name of ‘Richard’.

Richard was an individual progressing in self-awareness, as documented during an investigation in which researchers were working to determine how executives could achieve peak performance leading to increased capacity for endurance, determination, strength, flexibility, self-control, and focus. The researchers found that executives who perform to full potential for the long term tap into energy at all levels of the ‘pyramid of performance’, which has four ascending levels of progressive capacities: physical, emotional, mental, and spiritual.

The tip of the pyramid was identified as spiritual capacity, defined by the researchers as “an energy that is released by tapping into one’s deepest values and defining a strong sense of purpose” (Loehr and Schwartz, 2001, p. 127). The spiritual capacity, above all else, was found to be the sustenance – or the fuel – of the ideal performance state (IPS); the state in which individuals ‘bring their talent and skills to full ignition and to sustain high performance over time’ (op. cit., p. 122). However, as Richard worked to realize his spiritual capacity, he experienced significant pain during a two-year period. He reported being overcome by emotion, consumed with grief, and filled with longing as he learned to affirm what mattered most in his life. The two-year battle resulted in Richard ‘tapping into a deeper sense of purpose with a new source of energy’ (op. cit., p. 128); however, one must question if technology would have properly quantified the bad as the ultimate good for Richard. Spiritual reflections on the trajectory of technology (certainly since it has now been plainly linked to teleology) are not out of place nor should they be discouraged.

13.5.8 Censorship

Beyond the veillance (the ‘watching’) of oneself, i.e. the inward gaze, is the outward veillance and watching of the other. But this point of eye (PoE) does not necessarily mean a point of view (PoV), or even a wider-angle field of view (FoV), particularly in the context of ‘glass’. Our gaze too is subjective, and who or what will connote this censorship at the time when it really matters? The outward watching too may not tell the full story, despite its rich media capability to gather both audio and video. Audio-visual accounts have their own pitfalls. We have long known how vitally important eye gaze is for all of the social primates, and particularly for humans; there will be consequences to any artificial tampering with this basic natural instinct. Hans Holbein’s famous painting The Ambassadors (1533), with its patent reference to anamorphosis, speaks volumes of the critical distinction between PoE and PoV. Take a look, if you are not already familiar with this double portrait and still life. Can you see the skull? The secret lies in the perspective and in the tilt of the head.

13.6 Summary and Conclusions: Mind/Body Distinction

In the future, corporate marketing may hire professional lifeloggers (or mobile robotic contraptions) to log other people’s lives with commercial devices. Unfortunately, because of inadequate privacy policies or a lack of harmonized legislation, we, as consumers, may find no laws that would preclude companies from this sort of ‘live to life’ hire if we do not pull the reins on the obsession to auto-photograph and audio-record everything in sight. And this needs to happen right now. We have already fallen behind and are playing a risky game of catch-up. Ethics is not the overriding issue for technology companies or developers; innovation is their primary focus because, in large part, they have a fiduciary responsibility to turn a profit. We must in turn, as an informed and socially responsive community, forge together to dutifully consider the risks. At what point will we leap from tracking the mundane, which is of the body (e.g. location via GPS coordinates), toward the tracking of the mind, by bringing all of these separate components together using über-analytics and an über-view? We must ask the hard questions now. We must disclose and discuss the existence of risk, the values at stake, and the possibility of harm.

It is significant that as researchers we are once more, at least in some places, speaking of the importance of the Cartesian mind/body distinction, and of the catastrophic consequences should the two continue to be confused when it comes to etymological implications and ontological categories. The mind and the body are not identical, even if we are to argue from Leibniz’s Law of Identity that two things can only be identical if they share exactly the same qualities at the same time. Here as well, vigilance is enormously important, that we might not disremember the real distinction between machine and human.

References

Abbas, R., Michael, K., Michael, M. G., & Aloudat, A. (2011). Emerging Forms of Covert Surveillance Using GPS-Enabled Devices. Journal of Cases on Information Technology, 13(2), 19-33.

ACLU. (2013). You Are Being Tracked: How License Plate Readers Are Being Used to Record Americans' Movements. from http://www.aclu.org/technology-and-liberty/you-are-being-tracked-how-license-plate-readers-are-being-used-record

Adler, I. (2013). How Our Digital Devices Are Affecting Our Personal Relationships. 90.9 WBUR.

ALD (Ed.). (2010). Uberveillance: Oxford University Press.

Australian Privacy Foundation. (2005). Human Services Card.   Retrieved 6 June 2013, from http://www.privacy.org.au/Campaigns/ID_cards/HSCard.html

Bastardi, A., & Shafir, E. (1998). On the Pursuit and Misuse of Useless Information. Journal of Personality and Social Psychology, 75(1), 19-32.

Brewster, D. (2001). Memoirs of the Life, Writings, and Discoveries of Sir Isaac Newton (1855) Volume II. Ch. 27: Adamant Media Corporation.

Capurro, R. (2013). Medicine in the information and knowledge society. Conference paper.

Carpenter, L. (2011). Marina Lutz interview: The sins of my father. The Observer   Retrieved 20 April 2013, from http://www.guardian.co.uk/artanddesign/2011/apr/17/photography-children

Clarke, R. (1988a). Information Technology and Dataveillance. Communications of the ACM, 31(5), 498-512.

Clarke, R. (1988b). Just another piece of plastic in your wallet: the `Australian card' scheme. ACM SIGCAS Computers and Society, 18(1), 7-21.

Clarke, R. (2009, 7 April 2009). The Covert Implementation of Mass Vehicle Surveillance in Australia. Paper presented at the Fourth Workshop on the Social Implications of National Security: Covert Policing, Canberra, Australia.

Clifford, S., & Hardy, Q. (2013). Attention, Shoppers: Store Is Tracking Your Cell.   Retrieved 14 July, from http://www.nytimes.com/2013/07/15/business/attention-shopper-stores-are-tracking-your-cell.html?pagewanted=all

Collins, L. (2008). Annals of Crime. Friend Game. Behind the online hoax that led to a girl’s suicide. The New Yorker.

DailyMail. (2013). Stores now tracking your behavior and moods through cameras.   Retrieved 6 August, from http://www.dailymail.co.uk/news/article-2364753/Stores-tracking-behavior-moods-cameras-cell-phones.html?ito=feeds-newsxml

ENISA. (2011). To log or not to log?: Risks and benefits of emerging life-logging applications. European Network and Information Security Agency   Retrieved 6 July 2013, from http://www.enisa.europa.eu/activities/risk-management/emerging-and-future-risk/deliverables/life-logging-risk-assessment/to-log-or-not-to-log-risks-and-benefits-of-emerging-life-logging-applications

FastCompany. (2013). #Unplug. Fast Company, July/August(177).

Frankel, T. C. (2012, 20 October). Megan Meier's mom is still fighting bullying. stltoday.com   Retrieved 4 November 2012

Friedman, R. (2012). Why Too Much Data Disables Your Decision Making. Psychology Today: Glue   Retrieved December 4, 2012, from http://www.psychologytoday.com/blog/glue/201212/why-too-much-data-disables-your-decision-making

Goodchild, M. F. (2007). Citizens as sensors: the world of volunteered geography. GeoJournal, 69, 211–221.

Greenwald, G. (2013). NSA collecting phone records of millions of Verizon customers daily. The Guardian   Retrieved 10 August 2013, from http://www.theguardian.com/world/2013/jun/06/nsa-phone-records-verizon-court-order

Hans Holbein the Younger. (1533). The Ambassadors.

Hayes, A. (2010). Uberveillance (Triquetra).   Retrieved 6 May 2013, from http://archive.org/details/Uberveillancetriquetra

HIGI. (2013). Your Score for Life.   Retrieved 29 June 2013, from https://higi.com/about/score

Intellitix. (2013). Reshaping the Event Horizon.   Retrieved 6 July 2013, from http://www.intellitix.com/intellitix/home/

Kerr, I., & Mann, S. (2006). Exploring Equiveillance. ID TRAIL MIX.

Krause. (2012). Vigilance Fatigue in Policing.   Retrieved 22 July, from http://www.fbi.gov/stats-services/publications/law-enforcement-bulletin/december-2012/vigilance-fatigue-in-policing

Levin, A. (2013). Waiting for Public Outrage. Paper presented at the IEEE International Symposium on Technology and Society, Toronto, Canada.

Loehr, J., & Schwartz, T. (2001). The Making of a Corporate Athlete. Harvard Business Review, January, 120-129.

Lutz, M. (2012). The Marina Experiment.   Retrieved 29 May 2013, from www.themarinaexperiment.com

Macquarie (Ed.). (2009). Uberveillance: Sydney University.

Magid, L. (2013). Wearables and Sensors Big Topics at All Things D. Forbes.

Mann, S. (2004a). Continuous lifelong capture of personal experience with EyeTap. Paper presented at the ACM International Multimedia Conference, Proceedings of the 1st ACM workshop on Continuous archival and retrieval of personal experiences (CARPE 2004), New York.

Mann, S. (2004b). Sousveillance: inverse surveillance in multimedia imaging. Paper presented at the Proceedings of the 12th annual ACM international conference on Multimedia, New York, NY, USA.

Mann, S., Nolan, J., & Wellman, B. (2003). Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments. Surveillance and Society, 1(3), 331-355.

Masters, A., & Michael, K. (2006). Lend me your arms: the use and implications of humancentric RFID. Electronic Commerce Research and Applications, 6(1), 29-39.

Michael, K. (2010). Stop social network pitfalls. Illawarra Mercury.

Michael, K. (2013a). Big Data and the Dangers of Over-Quantifying Oneself. Computer Magazine (Multimedia)   Retrieved June 7, 2013, from http://www.youtube.com/watch?v=mn_9YHV2RGQ&list=PLHJB2bhmgB7cbB-oafjt68XbzyPV46szi&index=7

Michael, K. (2013b). Snowden's Revelations Just the Tip of the Iceberg.   Retrieved 6 July 2013, from http://uberveillance.com/blog/2013/7/23/snowdens-revelations-just-the-tip-of-the-iceberg

Michael, K. (2013c). Social Implications of Wearable Computing and Augmediated Reality in Every Day Life (IEEE Symposium on Technology and Society, ISTAS13). Toronto: IEEE.

Michael, K. (2013d). Wearable computers challenge human rights. ABC Science Online.

Michael, K., & Clarke, R. (2013). Location and tracking of mobile devices: Überveillance stalks the streets. Computer Law & Security Review, 29(3), 216-228.

Michael, K., & Michael, M. G. (2009). Teaching Ethics in Wearable Computing:  the Social Implications of the New ‘Veillance’. EduPOV   Retrieved June 18, from http://www.slideshare.net/alexanderhayes/2009-aupov-main-presentation?from_search=3

Michael, K., & Michael, M. G. (2012). Converging and coexisting systems towards smart surveillance. Awareness Magazine: Self-awareness in autonomic systems, June.

Michael, K., & Michael, M. G. (Eds.). (2007). From Dataveillance to Überveillance and the Realpolitik of the Transparent Society. Wollongong, NSW, Australia.

Michael, K., & Miller, K. W. (2013). Big Data: New Opportunities and New Challenges. IEEE Computer, 46(6), 22-24.

Michael, K., Roussos, G., Huang, G. Q., Gadh, R., Chattopadhyay, A., Prabhu, S., et al. (2010). Planetary-scale RFID Services in an Age of Uberveillance. Proceedings of the IEEE, 98(9), 1663-1671.

Michael, M. G., & Michael, K. (2007). Uberveillance. Paper presented at the 29th International Conference of Data Protection and Privacy Commissioners. Privacy Horizons: Terra Incognita, Location Based Tracking Workshop, Montreal, Canada.

Michael, M. G., & Michael, K. (2010). Towards a State of Uberveillance. IEEE Technology and Society Magazine, 29(2), 9-16.

Michael, M. G., & Michael, K. (2011). The Fall-Out from Emerging Technologies: on Matters of Surveillance, Social Networks and Suicide. IEEE Technology and Society Magazine, 30(3), 15-18.

mX. (2013). Hard to Swallow.   Retrieved 6 August 2013, from http://www.mxnet.com.au/story/hard-to-swallow/story-fnh38q9o-1226659271059

Orcutt, M. (2013). Electronic “Skin” Emits Light When Pressed. MIT Tech Review.

Oxford Dictionary. (2013). Dataveillance.   Retrieved 6 May 2013, from http://oxforddictionaries.com/definition/english/surveillance

OxfordDictionary. (2013). Surveillance.   Retrieved 6 May 2013, from http://oxforddictionaries.com/definition/english/surveillance

Papakostas, T. V., Lima, J., & Lowe, M. (2002). 5:3 A Large Area Force Sensor for Smart Skin Applications. Sensors; Proceedings of IEEE, 5(3).

Pearce, P., & Gretzel, U. (2012). Tourism in technology dead zones: documenting experiential dimensions. International Journal of Tourism Sciences, 12(2), 1-20.

Pivtohead. (2013). Wearable Imaging: True point of view.   Retrieved 22 June 2013, from http://pivothead.com/#

Pokin, S. (2007). MySpace' hoax ends with suicide of Dardenne Prairie teen. St. Louis Post-Dispatch.

Resch, B. (2013). People as Sensors and Collective Sensing-Contextual Observations Complementing Geo-Sensor Network Measurements. Paper presented at the Progress in Location-Based Services, Lecture Notes in Geoinformation and Cartography.

Roberts, P. (1984). Information Visualization for Stock Market Ticks: Toward a New Trading Interface. Massachusetts Institute of Technology, Boston.

Rodota, S., & Capurro, R. (2005). Ethical Aspects of ICT Implants in the Human Body. The European Group on Ethics in Science and New Technologies (EGE)   Retrieved June 3, 2006, from http://ec.europa.eu/bepa/european-group-ethics/docs/avis20_en.pdf

SHRM. (2011). from http://www.shrm.org/publications/hrnews/pages/fatiguefactors.aspx

Spence, R. (2009). Eyeborg.   Retrieved 22 June 2010, from http://eyeborg.blogspot.com.au/

Stephan, K. D., Michael, K., Michael, M. G., Jacob, L., & Anesta, E. (2012). Social Implications of Technology: Past, Present, and Future. Proceedings of the IEEE, 100(13), 1752-1781.

SXSchedule. (2013). Better Measure: Health Engagement & higi Score.   Retrieved 29 June 2013, from http://schedule.sxsw.com/2013/events/event_IAP4888

Thurston, B. (2013). I have left the internet. Fast Company, July/August(177), 66-78, 104-105.

Ware, C. (2000). Information Visualization: Perception for Design. San Francisco, CA: Morgan Kaufmann.

Warm, J. S., Parasuraman, R., & Matthews, G. (2008). Vigilance Requires Hard Mental Work and Is Stressful. Human Factors, 433-441.

Williams, R. B. (2012). Is Facebook Good Or Bad For Your Self-Esteem? Psychology Today: Wired for Success.

Wordnik. (2013). Sousveillance.   Retrieved 6 June 2013, from http://www.wordnik.com/words/sousveillance

Endnotes

1 http://www.shrm.org/

2 www.sleepfoundation.org 

3 Someone searching for a WiFi wireless network connection using a mobile device in a moving vehicle.

4 http://higi.com/about/score; http://schedule.sxsw.com

5 http://www.wordnik.com/words/sousveillance

Citation: Katina Michael, M. G. Michael, and Christine Perakslis (2014) Be Vigilant: There Are Limits to Veillance. The Computer After Me: pp. 189-204. DOI: https://doi.org/10.1142/9781783264186_0013

Legal Ramifications of Microchipping People in the United States of America

Abstract

The ability to microchip people for unique positive identification, and for tracking and monitoring applications, is coming under increasing scrutiny from the legal profession, civil libertarians, politicians in positions of power, human rights advocates, and, last but not least, citizens across jurisdictions. The United States is among the few nations internationally that have moved to enact state-level legislation regarding the microchipping of people in a variety of contexts. This paper provides an overview of nine state laws/bills in the United States of America that have either enacted anti-chipping legislation or recently proposed bills regarding the enforced chipping of persons. The aim of the paper is to highlight excerpts of legislation, to identify the relevant stakeholders the legislation is directed toward, and to briefly describe how it may affect their chipping practices. As a final outcome, the paper seeks to broadly compare state legislation, identifying differences in penalties and fines, and to show the complexity of this kind of approach to protecting the rights of citizens against unscrupulous uses of advanced information technologies.

Section 1.

Introduction

The capability to implant people with microchips has its roots in the field of medicine, as far back as the innovation of pacemakers in the late 1950s [1][2]. Embedded chip-on-a-card technology that could identify the cardholder, commonly known as smart cards or integrated circuit cards, was patented and prototyped for the first time in France by Roland Moreno in 1974 [3]. But it was not until 1998 that the first demonstrated microchip implantation in a human for identification and tracking purposes was officially reported, achieved by Professor Kevin Warwick of the University of Reading in the Cyborg 1.0 experiment [4]. While United States patents regarding apparatus allowing subcutaneous implants, such as guns for dispensing “pellets” comprising a case with a hollow needle attached to it, date back to the 1970s [5], it was not until later that patents pertaining to medical devices stipulated a unique identification mechanism allowing for the collection of individual patient diagnostic data.

In 1987, going beyond unique ID, a location tracking device was patented by Dr Daniel Man, a plastic surgeon residing in Florida in the United States [6]. The abstract description of the patent reads: “[a] new apparatus for location and monitoring of humans has been developed. The device employs a unique programmable signal generator and detection system to locate and monitor the movement of individuals. It additionally utilizes a physiological monitoring system to signal a warning for the necessity for immediate help. The device is small enough to be implanted in young children as well as adults. The power supply and signal generator are designed to function at a low duty cycle for prolonged periods before recharging” [7].

Section 2.

Advancements in Implantable Technology and the Law

The challenges brought about by implantable technology outside the biomedical arena were for the greater part ignored until the mid-1990s. Few could debate against the obvious benefits brought about by the advancement of medical-related technologies to patients suffering from curable diseases or illnesses, and the lifestyle enhancements these technologies promised and delivered, especially in the area of prosthesis. Even today, few could argue that implants for genuine therapeutic purposes pose any real danger to society at large if applied correctly; in fact, they act to prolong life and aid sufferers to go about living as normal a life as possible.

We can point to medical breakthroughs, such as those by Alfred Mann, that are likely to help hundreds of thousands of people in the future to better cope with the treatment of diabetes, cancer, and autoimmune and inflammatory diseases via automated drug delivery technologies [8]. Implantable technologies have already helped the deaf hear, and are likely to help the blind see and to correct functional neural deficits using electrostimulation techniques, and much more. The promise of nanotechnology has brought with it the prospect of implantable treatments for Parkinson's disease, epilepsy, Tourette's syndrome (which is now beyond the experimental stage), and even obsessive-compulsive disorder (OCD).

Responsible, well-tested, and regulated applications of nanotechnology within the biomedical domain can only have positive impacts on the individual who is a recipient of an implant [9]. But in today's commercial context, even biomedical technologies can serve dual purposes, opening up a number of critical moral questions [10] regarding who is actually in control [11] and at what cost [12]. For as Mark N. Gasson writes regarding information and communication technology (ICT) implantable devices, “[a] number of wider moral, ethical and legal issues stem from enhancement applications and it is difficult to foresee the social consequences, the fundamental changes on our very conception of self and the impact on our identity of adoption long term. As a result, it is necessary to acknowledge the possibilities and is timely to have debate to address the wider implications these possibilities may bring” [13].

It is the “legal issues” pertaining to ICT implants which have been addressed by only a few researchers and their respective groups. As there are now several commercial organizations marketing a variety of applications using ICT implants for identification and location tracking purposes, some states in the USA have acted as ‘first movers’ to quell citizen concerns over the potential for enforced chipping, and to safeguard individuals' human rights. Of course, this is all set against a backdrop of national-level concern about national security, and of consecutive governments that have introduced widespread radio-frequency identification (RFID) tracking and monitoring capabilities in passports, driver's licenses, tollways, and so on.

Section 3.

Seminal Works

Of the scant research addressing the legal dilemmas of ICT implants, two works can be considered landmark and representative of the literature. Elaine M. Ramesh, from the Franklin Pierce Law School, wrote in anticipation of human microchip implants and offered initial insights into the legal implications even before Warwick's Cyborg 1.0 experiment [14]. Almost a decade later, William A. Herbert, a member of the New York State Public Employment Relations Board, wrote a second paper addressing the legal issues related to advanced technologies like Global Positioning Systems (GPS), biometrics, and RFID implants [15]. To date, this article remains the most complete on the topic at large.

Ramesh uses a qualitative approach and discusses the rights that may be infringed by humancentric microchip implants in the areas of common law, constitutional rights, the Fourth Amendment, the Fifth Amendment, and property rights. The scenarios and case analyses relating to the above laws provided by Ramesh were limited by the fact that commercial diffusion of RFID implants only occurred in 2003, with pre-registration beginning in 2002 [16]. Ramesh explains that the human body is not generally accepted as “property”, which is her rationale for the gap in the legal system. If property ownership of one's body could be confirmed (that is, if we could claim ownership of our own bodies and do with them what we will), then property law would apply as protection, giving an individual the right to refuse implantation of a microchip without any consequences, as the individual's body would be his or her ‘owned property’ (Ramesh, 1997). However, this same legislation would bring with it a minefield of other problems to do with ownership and the rights associated with “selling” one's body or individual body parts.

After the events of September 11, 2001 and the enactment of the USA PATRIOT Act, Herbert [15] analyzed then-current state and federal laws within the context of employer practices across the United States. Herbert describes the laws and relevant cases, along with potential solutions. He justifies his research by pointing to the concern that American labor law grants employers greater power over most employee privacy expectations. Herbert's findings indicate that “[t]he scope and nature of current legal principles regarding individual privacy are not sufficient to respond to the rapid development and use of human tracking technology” [15]. It is this very disproportionate “power” relationship, which could be further propagated and exploited by ICT implants, that Michael and Michael have termed “uberveillance” [17].

Since Herbert's seminal paper, a number of states have enacted what has come to be known in the popular sense as anti-chipping legislation. The rest of this paper is dedicated to providing excerpts of laws and bills for nine U.S. states related to ICT implants for humans [18]. Seven state laws/bills were collected during the main study period in 2007, with two additional laws/bills found in 2009. It must be underscored that this list of states should not be considered an exhaustive list of legislation.

For the states investigated during the main study period in 2007, a legislative excerpt is presented, stakeholders pertaining to the law are identified, and a brief description of how chipping practices in that state may be affected is provided. For the two additional acts/bills found in 2009, only an excerpt is presented, with no further analysis. As a final outcome, the paper broadly compares the seven state acts/bills, identifying differences in penalties and fines, and shows the complexity of this kind of approach to protecting the rights of citizens against unscrupulous uses of advanced information technologies. The main contribution of this paper is to bring the state laws together, making similarities and differences easier to identify, and to open future research opportunities in comparing United States federal and state legislation towards harmonization and the resolution of conflict.

Section 4.

State of California

4.1 SB 362, Identification Devices: Subcutaneous Implanting

SECTION 1. Section 52.7 is added to the Civil Code, to read:

(a) Except as provided in subdivision (g), a person shall not require, coerce, or compel any other individual to undergo the subcutaneous implanting of an identification device.
(b)(1) Any person who violates subdivision (a) may be assessed an initial civil penalty of no more than ten thousand dollars ($10,000), and no more than one thousand dollars ($1,000) for each day the violation continues until the deficiency is corrected.
(g) This section shall not in any way modify existing statutory or case law regarding the rights of parents or guardians, the rights of children or minors, or the rights of dependent adults.

4.2 Definition

The language used to define the implant, “subcutaneous implanting of an identification device” (2007 California SB 362), provides longevity for the legislation, as it can be applied to any device that can be implanted and used for identification, rather than specifically naming a microchip, RFID tag, or commercial product [19].

4.3 Who it affects?

“Except as provided in subdivision (g), a person shall not require” (2007 California SB 362) prevents an individual from forcing the implantation of the device on another; however, it does allow the Government of California and the Government of the United States to use the technology as they see fit.

4.4 Exceptions

Subdivision (g), as stated in the above extract of SB 362, refers to the “existing statutory or case law regarding the rights of parents or guardians” (2007 California SB 362). Because of this clause, a parent and/or legal guardian may sign the written consent form for an implant on behalf of any child under the age of 15 under California family law.

“A minor may only consent to the minor's medical care or dental care if all of the following conditions are satisfied: (1) The minor is 15 years of age or older. (2) The minor is living separate and apart from the minor's parents or guardian, whether with or without the consent of a parent or guardian and regardless of the duration of the separate residence. (3) The minor is managing the minor's own financial affairs, regardless of the source of the minor's income” (California Family Code §6922(a)). If these conditions are not satisfied, then the parent or guardian retains rights over the child, including the right to consent to the child's implantation.

A minor may sign his or her own consent for the use of an implantable microchip if it is used for the sole purpose of aiding in the treatment of a psychological disability under California Family Code §6924.

“A minor who is 12 years of age or older may consent to mental health treatment … if both of the following requirements are satisfied: (1) The minor, in the opinion of the attending professional person, is mature enough to participate intelligently in the outpatient services or residential shelter services. (2) The minor (A) would present a danger of serious physical or mental harm to self or to others without the mental health treatment or counseling or residential shelter services, or (B) is the alleged victim of incest or child abuse” (California Family Code §6924).

Section 5.

State of Colorado

5.1 HB 07–1082, a Bill for an Act Concerning a Prohibition On Requiring an Individual To Be Implanted with a Microchip

A person may not require an individual to be implanted with a microchip.
A violation of this section is a Class 3 Misdemeanor punishable as provided in section 18–1.3–501. Each day in which a person violates this section shall constitute a separate offence.

5.2 Definition

The term “microchip” is used to describe the device; however, no formal definition is provided. Therefore, any device containing a microchip, or a device of similar or more advanced capabilities, falls within the definition of a ‘microchip’ and must adhere to this bill.

The crime of forcing the implantation of a microchip is defined as a “class 3 misdemeanor” (2007 Colorado HB 1082), which according to the Colorado Revised Statutes carries a maximum sentence of six months imprisonment, a fine of up to $750, or both, per offence [20].

5.3 Who it affects?

“A person may not require an individual” (2007 Colorado HB 1082) binds all individuals within the state of Colorado; however, it does not protect against United States federal legislation.

5.4 Exceptions

The bill does not outline any clause whereby the legislation is void, and therefore no explicit loopholes exist. However, this allows the judicial branch to make decisions case by case, based on each individual's specific circumstances, and it has the power to place prior legislation, statute, or the constitution above HB 1082, deeming it null and void for the case in question. The judicial branch is the branch of the courts that determines which law applies to each specific case, enforces it, and determines the sentence/punishment based upon the law written by the legislative branch [21]. The same exception applies to the majority of the states presented below.

Section 6.

State of Florida

6.1 SB 2220, an Act Relating To Implanted Microchips; Prohibiting the Implanting Of a Microchip or Similar Monitoring Device

It is a felony of the third degree, punishable as provided in s. 775.082, s. 775.083, or s. 775.084, Florida Statutes, to knowingly implant, for tracking or identification purposes, a microchip or similar monitoring device into a person without providing full disclosure to that person regarding the use of the device and obtaining the person's informed written consent.

6.2 Definition

The implantable microchip in Florida SB 2220 is defined as “a microchip or similar monitoring device” (2007 Florida SB 2220), which therefore makes the legislation (if enacted) applicable to any technology used for the purpose of monitoring, tracking, tracing, or identification.

The crime of forcing the implantation of a microchip is defined as a “felony of the third degree” (2007 Florida SB 2220), which according to Florida Criminal Code §775.082 (penalties) and §775.083 (fines) is punishable, “[f]or a felony of the third degree, by a term of imprisonment not exceeding 5 years” (Florida Criminal Code §775.082) and a fine of “$5,000, when the conviction is of a felony of the third degree” (Florida Criminal Code §775.083).

6.3 Who it affects?

“Into a person without providing full disclosure to that person regarding the use of the device and obtaining the person's informed written consent” (2007 Florida SB 2220) protects all individuals within the state of Florida; however, it does not protect against United States federal legislation. The use of the device must also be outlined to the individual, and acknowledgment of the individual's understanding of the implant's use must be obtained prior to the implantation and operation of the device.

Section 7.

State of North Dakota

7.1 SB 2415, an Act Relating To Implanted Microchips in Individuals; and To Provide a Penalty

SECTION 1. A new section to chapter 12.1–15 of the North Dakota Century Code is created and enacted as follows: Implanting microchips prohibited. A person may not require that an individual have inserted into that individual's body a microchip containing a radio frequency identification device. A violation of this section is a class A misdemeanor.

7.2 Definition

The implantable microchip in North Dakota SB 2415 is defined as a “microchip containing a radio frequency identification device” (2007 North Dakota SB 2415). The legislation is therefore limited by its definition and permits the use of devices whose primary means of achieving their purpose is not radio frequency. The use of innovations such as microwaves or barcodes may therefore be argued to be immune to the legislation.

The crime of forcing the implantation of a microchip is defined as a “class A misdemeanor” (2007 North Dakota SB 2415), which according to North Dakota Century Code §12.1–32 carries “up to one year in prison, $2000 fine or both” (North Dakota Century Code §12.1–32).

7.3 Who it affects?

“A person may not require that an individual have inserted into that individual's body” (2007 North Dakota SB 2415). Therefore, no individual is obliged to agree to the implantation of a microchip, regardless of status.

Section 8.

State of Ohio

8.1 SB 349, a Bill To Prohibit an Employer From Requiring an Employee of the Employer To Have Inserted Into the Employee's Body a Radio Frequency Identification Tag

Sec. 4113.81. No employer shall require an employee of the employer to have inserted into the employee's body a radio frequency identification tag. Any employer who violates this section shall be subject to a fine of not more than one hundred fifty dollars per violation.
As used in this section:
“Radio frequency identification tags” mean a silicon chip containing an antenna that stores data and transmits that data to a wireless receiver.
“Employer” means the state, any political subdivision of the state, or any person employing one or more individuals in the state.

8.2 Definition

The implantable microchip is defined as a “radio frequency identification tag” (2006 Ohio SB 349) in the main text, which may seem open to the use of other technologies; however, definition (A) states: “Radio frequency identification tags mean a silicon chip containing an antenna that stores data and transmits that data to a wireless receiver” (2006 Ohio SB 349). The legislation therefore covers any technology that achieves its purpose by the above method.

The preamble of this bill proposes an amendment to Ohio Code 4113, the Miscellaneous Labor Provisions Code, which provides legislation ranging from dismissal laws to wages to whistleblowing (Ohio Code §4113). This is a clear indication that there was no intention for the bill to protect every individual of the state, but rather to protect an employee from an employer.

8.3 Who it affects?

Ohio's proposed legislation is unique in the subject it affects: “No employer shall require an employee” (2006 Ohio SB 349). Unlike the other states, Ohio directs the legislation only at employers, thereby protecting an employee from unfair dismissal for refusing implantation.

8.4 Exceptions

The 2006 Ohio SB 349 leaves itself open to attack. By addressing only the employee-employer relationship, the legislation does not prevent state government, hospitals, doctors, parents, or any other individual from microchipping a person, unless that person's lawyer can prove a violation of §2903.13 of the Ohio Code (assault), whereby “[n]o person shall knowingly cause or attempt to cause physical harm to another or to another's unborn” (Ohio Code §2903.13); the coercion and physical act of microchipping could then be classed as assault.

The punishment outlined in 2006 Ohio SB 349 does not reference any Ohio Code section or place the offence in a misdemeanour or felony class; instead it specifies an exact figure of $150, a modest addition to the original price of purchasing and using a commercial implant product. If an organisation wants to utilise the technology for convenience and security, $150 per employee (or per violation) may be considered an investment rather than a crime.

Section 9.

State of Oklahoma

9.1 HB 2092, SB 47 an Act Prohibiting the Forced Implantation Of a Microchip

No person shall require an individual to undergo the implanting of a microchip.
Any person convicted of violating the provisions of this section shall be subject to a fine of not more than Ten Thousand Dollars ($10,000.00). Each day of continued violation shall constitute a separate offense.

9.2 Definition

The term “microchip” is used to describe the implantable microchip; however, no formal definition is provided. Therefore, any device containing a microchip, or a device of similar or more advanced capabilities, falls within the definition of a ‘microchip’ and must adhere to this bill.

9.3. Who it affects?

“No person shall require an individual” (2007 Oklahoma HB 2092) binds all individuals within the state of Oklahoma; however, it does not protect against United States federal legislation.

Section 10.

State of Wisconsin

10.1 2005 Wisconsin Act 482 Prohibiting the Required Implanting Of Microchip in an Individual and Providing a Penalty

The people of the state of Wisconsin, represented in senate and assembly, do enact as follows: SECTION 1. 146.25 of the statutes is created to read: 146.25 Required implanting of microchip prohibited.
No person may require an individual to undergo the implanting of a microchip.
Any person who violates sub. (1) may be required to forfeit not more than $10,000. Each day of continued violation constitutes a separate offense.

10.2 Definition

The term microchip is used, but no definition is provided; therefore, any device containing a microchip, or a device of similar or more advanced capabilities, falls within the definition of a ‘microchip.’

10.3 Who it affects?

“No person may require an individual to undergo the implanting of a microchip” (2005 Wisconsin Act 482) binds all individuals within the state of Wisconsin; however, it does not protect against United States federal legislation.

Section 11.

State of Georgia

11.1 HB 38, Microchip Consent Act

SECTION 2… (1) ‘Implantation’ includes any means intended to introduce a microchip internally, beneath the skin, or applied to the skin of a person. (2) ‘Microchip’ means any microdevice, sensor, transmitter, mechanism, electronically readable marking, or nanotechnology that is passively or actively capable of transmitting or receiving information. This definition shall not include pacemakers. (3) ‘Person’ means any individual, irrespective of age, legal status, or legal capacity. (4) ‘Require’ includes physical violence, threat, intimidation, retaliation, the conditioning of any private or public benefit or care on consent to implantation, including employment, promotion, or other benefit, or by any means that causes a person to acquiesce to implantation when he or she otherwise would not.
No person shall be required to be implanted with a microchip. This Code section shall be subject to a two-year statute of limitations beginning from the date of discovery that a microchip has been implanted.
Any person required to have a microchip implanted in violation of this Code section shall be entitled to pursue criminal charges in addition to filing a civil action for damages. Each day that a microchip remains implanted shall be subject to damages of not less than $10,000.00 per day, and each day shall be considered a separate violation of this Code section.
The voluntary implantation of any microchip or similar device may only be performed by a physician and shall be regulated under the authority of the Composite State Board of Medical Examiners.
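The scale of Georgia's per-day damages provision can be illustrated with a small, purely arithmetic sketch. The $10,000-per-day floor comes from the bill's text above; the 730-day horizon (matching the bill's two-year statute of limitations) is an assumption chosen for illustration, not a figure from the bill.

```python
# Illustrative arithmetic only: Georgia HB 38 sets civil damages of not
# less than $10,000.00 for each day a microchip remains implanted, with
# each day treated as a separate violation.

DAMAGES_PER_DAY = 10_000  # minimum damages per day, per the bill text

def minimum_damages(days_implanted: int) -> int:
    """Lower bound on civil damages for a chip implanted for the given number of days."""
    return DAMAGES_PER_DAY * days_implanted

# A violation running for the full two-year limitation period (~730 days)
# would attract minimum damages of $7,300,000.
print(minimum_damages(730))
```

This makes concrete why HB 38's per-day structure is far more punitive than the flat fines of most other states surveyed here.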

Section 12.

State of Missouri

285.035.1. No employer shall require an employee to have personal identification microchip technology implanted into an employee for any reason.

For purposes of this section, “personal identification microchip technology” means a subcutaneous or surgically implanted microchip technology device or product that contains or is designed to contain a unique identification number and personal information that can be non-invasively retrieved or transmitted with an external scanning device. Any employer who violates this section is guilty of a class A misdemeanor.

Section 13.

Cross-case comparison

From the seven (7) states studied in 2007, it is clear that there are subtle yet possibly detrimental differences between the legislation enacted (e.g. in the case of North Dakota and Wisconsin) and the legislation pending enactment.

13.1 Stakeholder & Other Definitions

Citizen: Refers to any other citizen within the state of the (enacted / pending) legislation other than the subject (oneself).

Employer: Refers to the manager, management, owner, franchiser or CEO of an organization by where the subject is currently employed on any basis (full time, casual, part time, or probation).

Government: Refers to the state government and anyone employed by the state government including law enforcement personnel.

Hospitals (Doctors): Refers to any healthcare practitioner of the subject, including general practitioners, psychologists, psychiatrists, social workers, and nurses, where the subject may be deemed to be suffering a mental illness.

Parents: Refers to the parents and guardians of a minor as defined by the state, and the carer/guardian/solicitor of a subject deemed mentally ill or elderly.

Yourself: Refers to the subject, an individual wishing to approve the implantation of a microchip into their body.

Fine: Refers to a monetary fine payable for the offence of coercing an individual to be chipped. If a period of time (day(s), month(s), year(s)) is included in this field, then jail time for the period indicated is part of the maximum sentence for the crime.

Consecutive Day: Refers to the punishment (jail time/monetary fine) applicable for each day on which the crime continues.

13.2 Fines and Punishment

The following section provides a breakdown of the key elements within the Acts and Bills for each state and shows what is permitted by law and what is disallowed with regard to ICT implants in states of the U.S.A. Section 13.2 should be read together with Table 1.

Table 1. U.S. State Anti-Chipping Laws/Bills Comparison Chart as of October 2007

The yellow colored sections of the table represent a fine or punishment that can be seen as too light in comparison to the other states. In California, each day the offence continues after the initial offence attracts a further civil penalty of up to $1,000, on top of the initial penalty of up to $10,000. According to the United States Census Bureau, a citizen of California on average earns 6.666% more than the average American and 17.7% more than the average citizen of North Dakota [22], and yet the proposed fine in California is only 10% of the fine quoted in North Dakota's enacted legislation (2007 North Dakota SB 2415).

Ohio put in place a maximum penalty of $150, which is not too much of an added expense on top of the roughly $200 outlay per microchip [23]. This fine is not comparable to any of the other states and may pose a risk rather than a benefit if the bill becomes enacted and employers act on the proposed $350.00 ‘investment.’

The peach colored section of Table 1 outlines the three states (Colorado, Florida, and North Dakota) proposing jail time as part of the maximum sentence if an individual is in breach of the legislation. These jail times arise from the classification of the offence as a felony or a misdemeanor of a particular class; the classification is then cross-referenced to the State Code in order to determine the maximum sentence. Even though these states vary in punishment and do not have a monetary fine comparable with Oklahoma and Wisconsin, the fact that they reference a classification under a criminal code protects the legislation for generations to come. The fine attached to a classification may be changed if the legislative or judicial assembly makes a proposal, and such changes often occur in response to changes in inflation or the Consumer Price Index (CPI), keeping the fine comparable in years to come. States that propose a fixed fine do not allow for inflation or CPI, and the punishment may become more relaxed in effect as society develops over subsequent decades.
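The point about fixed fines losing real value can be made concrete with a small, illustrative calculation. The 3% annual inflation rate and 20-year horizon below are assumptions chosen for the sketch, not figures from any statute or from the CPI itself.

```python
# Illustrative only: a fixed statutory fine loses purchasing power over
# time, whereas a classification-based fine can be revised with the CPI.

def real_value(nominal: float, inflation_rate: float, years: int) -> float:
    """Purchasing power of a fixed nominal fine after `years` of steady inflation."""
    return nominal / ((1 + inflation_rate) ** years)

fixed_fine = 10_000.0  # e.g. the flat figure in the Oklahoma/Wisconsin texts
eroded = real_value(fixed_fine, 0.03, 20)
print(f"${fixed_fine:,.0f} is worth about ${eroded:,.0f} in base-year dollars after 20 years")
```

Under these assumed parameters, the flat $10,000 fine retains only a little over half its original deterrent value after two decades, which is the drafting weakness the paragraph above identifies.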

The green colored sections of Table 1 outline who is allowed to enforce the implantation of a microchip upon an individual without direct punishment under the enacted or proposed bill of that state. In the case of Ohio, only an employer who is a citizen of Ohio is prevented from chipping an employee of an Ohio state-registered firm (2006 Ohio SB 349). California is the only state of the seven to include clauses under which an exemption from punishment could apply: subdivision (g) of 2007 California SB 362 allows the parents and guardians of minors to enforce the implantation of a device under certain circumstances outlined in §6922 and §6924 of the California Family Code. The absence of such a clause does not mean that similar exemptions cannot arise in the other six states: the judiciary has the power to set the legislation aside if it deems other legislation, such as a Family Act, more relevant to the case or superior to the microchipping legislation, and the defendant's lawyer has the ability to invoke such acts or codes to refute the microchipping legislation.
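The cross-case comparison in this section can be summarized in a small machine-readable sketch. All dollar figures, jail terms, and the enacted/pending status below are those quoted in the text above (with North Dakota and Wisconsin listed as enacted, per Section 13); the field names and the simplified flat structure are my own assumptions, not part of any statute.

```python
# A minimal sketch of the seven-state comparison (Table 1), using only
# the figures quoted in this paper. "daily_fine" reflects per-day or
# per-separate-offence penalties where the text states them.

laws = {
    "California":   {"instrument": "SB 362 (2007)",   "max_fine": 10_000, "daily_fine": 1_000,  "max_jail": None,       "enacted": False},
    "Colorado":     {"instrument": "HB 07-1082",      "max_fine": 750,    "daily_fine": 750,    "max_jail": "6 months", "enacted": False},
    "Florida":      {"instrument": "SB 2220 (2007)",  "max_fine": 5_000,  "daily_fine": None,   "max_jail": "5 years",  "enacted": False},
    "North Dakota": {"instrument": "SB 2415 (2007)",  "max_fine": 2_000,  "daily_fine": None,   "max_jail": "1 year",   "enacted": True},
    "Ohio":         {"instrument": "SB 349 (2006)",   "max_fine": 150,    "daily_fine": None,   "max_jail": None,       "enacted": False},
    "Oklahoma":     {"instrument": "HB 2092 / SB 47", "max_fine": 10_000, "daily_fine": 10_000, "max_jail": None,       "enacted": False},
    "Wisconsin":    {"instrument": "Act 482 (2005)",  "max_fine": 10_000, "daily_fine": 10_000, "max_jail": None,       "enacted": True},
}

# The disparities the paper highlights fall out directly:
lightest = min(laws, key=lambda s: laws[s]["max_fine"])   # lightest monetary penalty
enacted = sorted(s for s in laws if laws[s]["enacted"])   # laws actually in force by 2007
with_jail = sorted(s for s in laws if laws[s]["max_jail"])  # states attaching jail time
print(lightest, enacted, with_jail)
```

Encoding the comparison this way also suggests a path for the future federal/state harmonization study the paper proposes, since additional jurisdictions can be appended and queried uniformly.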

Section 14. 

Conclusion

The more the development and deployment of the implantable microchip gather momentum across markets and jurisdictions, the greater the propensity for case law to emerge related to specific implantable ICT technologies. The problem with state laws, as demonstrated in the U.S.A., is that legislation is not uniform, at least at the state level; even more anomalous is a comparison between state and federal legislation, which will be the focus of a forthcoming study.

References

1. C. M. Banbury, Surviving Technological Innovation in the Pacemaker Industry 1959-1990. New York: Garland Publishing, 1997.

2. J. H. Schulman, "Human Implantable Technologies," in Career Development in Bioengineering and Biotechnology, G. Madhavan, Ed., 2009, pp. 167-172.

3. R. A. Lindley, Smart Card Innovation. Australia: Saim, 1997.

4. K. Warwick, I, Cyborg. UK: Century, 2002.

5. J. B. Wyatt, P. D. George, and K. Van Dyck, "Implant Gun, Pfizer Inc.," in Appl. No.: 05/046,159 United States Patent, 15 June 1970.

6. D. Man, "Dr. Man Plastic Surgery," 2009.

7. D. Man, "Implantable homing device," in United States Patent: 4,706,689. Boca Raton, Florida: USPTO, 8 January 1987.

8. A. Mann, "Where Technology and Life Unite," Alfred Mann Foundation, 2009.

9. M. Treder, "Radical Prosthetic Implants," Institute for Ethics and Emerging Technologies, 2009.

10. K. Michael and M. G. Michael, "Microchipping People: The Rise of the Electrophorus," Quadrant, vol. 414, pp. 22-33, 2005.

11. K. Michael, M. Michael, and R. Ip, "Microchip Implants for Humans as Unique Identifiers: a Case Study on VeriChip," presented at 3TU: Ethics, Identity and Technology, The Hague, The Netherlands, 2007.

12. M. G. Michael and K. Michael, "Uberveillance: Microchipping People and the Assault on Privacy," Quadrant, vol. LIII, pp. 85-89, 2009.

13. M. N. Gasson, "ICT Implants: the Invasive Future of Identity," in IFIP International Federation for Information Processing: The Future of Identity in the Information Society, vol. 262, S. Fischer-Hübner, P. Duquenoy, A. Zuccato, and L. Martucci, Eds. Boston: Springer, 2008, pp. 287-295.

14. E. M. Ramesh, "Time Enough? Consequences of Human Microchip Implantation," Franklin Pierce Law Centre, vol. 8, 1997.

15. W. A. Herbert, "No Direction Home: Will The Law Keep Pace With Human Tracking Technology to Protect Individual Privacy and Stop Geoslavery?," I/S - A Journal of Law and Policy for the Information Society, vol. 2, pp. 409-472, 2006.

16. ADSX, "Get Chipped™: VeriChip™ preregistration program," in Applied Digital Solutions, 2002.

17. K. Michael and M. G. Michael, Innovative Auto-ID and Location-Based Services: from Bar Codes to Chip Implants. Hershey: Information Science Reference, 2009.

18. W. Kluwer, "States regulate use of microchips as tracking device," CCH® Internet Research NetWork, 2009.

19. C. E. Lyon, "California Bans Mandatory Implanting of Identification Devices," Morrison & Foerster, November 2007.

20. Fort Lewis College, "Colorado Revised Statutes," 2007.

21. US Library of Congress, "Federal Judiciary Branch," 21 July 2007.

22. US Census Bureau, "Current Population Survey (CPS): Annual Social and Economic Supplement," 2007.

23. T. Chin, "Tiny Implant Puts Portable Medical," in American Medical News, April 24 2006.

IEEE Keywords: Law, Legal factors, Implants, Legislation, Monitoring, Humans, Medical diagnostic imaging, Signal generators, Diseases, Pharmaceutical technology

INSPEC: public administration, legislation, rights protection, legal ramification, microchipping people, United States of America, state legislation, state law, state bill, antichipping legislation

Citation: Angelo Friggieri, Katina Michael and M. G. Michael, 2009, "The legal ramifications of microchipping people in the United States of America: A state legislative comparison," ISTAS '09, IEEE International Symposium on Technology and Society, 18-20 May, Tempe, Arizona, DOI: 10.1109/ISTAS.2009.5155900.

The Auto-ID Trajectory - Chapter Seven: Ten Cases in the Selection and Application of Auto-ID

The overall purpose of this chapter is to present the auto-ID selection environment by exploring ten embedded case studies. The cases illustrate the pervasiveness of each auto-ID technology within the vertical sectors synonymous with that technology's uptake. The focus now shifts from the technology provider as the central actor in innovation (as highlighted in ch. 6) to the service provider stakeholder, who adopts a particular technology on behalf of its members and end users. It will be shown that new commercial applications do act to drive incremental innovations which shape a technology's long-term trajectory. The four levels of analysis to be conducted can be seen in exhibit 7.1, with three examples to help the reader understand the format of the forthcoming micro-inquiry. This chapter dedicates equal space to each case and, for the first time, shows that coexistence between auto-ID technologies is not only possible but happening presently, and very likely to continue into the future.
