Implantable Medical Device Tells All: Uberveillance Gets to the Heart of the Matter

In 2015, I provided evidence at an Australian inquiry into the use of subsection 313(3) of the Telecommunications Act 1997 by government agencies to disrupt the operation of illegal online services [1]. I stated to the Standing Committee on Infrastructure and Communications that mandatory metadata retention laws meant blanket surveillance coverage for Australians and visitors to Australia. The intent behind asking Australian service providers to keep subscriber search history data for up to two years was to grant government and law enforcement organizations the ability to search Internet Protocol–based records in the event of suspected criminal activity.

Importantly, I told the committee that, while instituting programs of surveillance through metadata retention laws would likely help to speed up criminal investigations, we should also note that every individual is a consumer, and such programs ultimately come back to bite innocent people through some breach of privacy or security. Enter the idea of uberveillance, which, I told the committee, is “exaggerated surveillance” that allows for interference [1] and that I believe is a threat to our human rights [2]. I strongly advised that invoking section 313 of the Telecommunications Act 1997 requires judicial oversight through the process of a search warrant. My recommendations fell on deaf ears, and, today, we even have the government deliberating over whether it should relax metadata laws to allow information to be accessed for both criminal and civil litigation [3], which includes divorces, child custody battles, and business disputes. In June 2017, Australian Prime Minister Malcolm Turnbull even stated that “global social media and messaging companies” need to assist security services’ efforts to fight terrorism by “providing access to encrypted communications” [52].

Consumer Electronics Leave Digital Data Footprints

Of course, Australia is not alone in having metadata retention laws. Numerous countries have adopted these laws or similar directives since 2005, retaining certain types of data for anywhere from 30 days to indefinitely, although the standard length is somewhere between one and two years. For example, since 2005, Italy has retained subscriber information at Internet cafes for 30 days. I recall traveling to Verona in 2008 for the European Conference on Information Systems, forgetting my passport in my hotel room, and being unable to use an Internet cafe to send a message back home because I was carrying no recognized identity information. When I asked why I was unable to send a simple message, I was handed an antiterrorism information leaflet. Italy also retains telephone data for up to two years and Internet service provider (ISP) data for up to 12 months.

Similarly, the United Kingdom retains all telecommunications data for one to two years. It also maintains postal information (sender and receiver data), banking data for up to seven years, and vehicle movement data for up to two years. In Germany, metadata retention was established in 2008 under the Gesetz zur Neuregelung der Telekommunikationsüberwachung und anderer verdeckter Ermittlungsmaßnahmen sowie zur Umsetzung der Richtlinie 2006/24/EG, but the law was overturned in 2010 by the Federal Constitutional Court of Germany, which ruled it unconstitutional because it violated the fundamental right to secrecy of correspondence. In 2015, the issue was revisited, and a compromise was reached to retain telecommunications metadata for up to ten weeks. Mandatory data retention in Sweden was challenged by one holdout ISP, Bahnhof, which was threatened with an approximately US$605,000 fine in November 2014 if it did not comply [4]. It defended its stance of protecting the privacy and integrity of its customers by offering a no-logs virtual private network free of charge [5].

Some European Union countries have been deliberating whether to extend metadata retention to chats and social media, but, in the United States, many corporations voluntarily retain subscriber data, including market giants Amazon and Google. It was reported in The Guardian in 2013 that the United States records Internet metadata for not only itself but the world at large through the National Security Agency (NSA), using its MARINA database to conduct pattern-of-life analysis [6]. Additionally, the 2008 Amendments Act to the Foreign Intelligence Surveillance Act of 1978 increased the time allotted for warrantless surveillance and added provisions for emergency eavesdropping. Under section 702 of the Foreign Intelligence Surveillance Act of 1978 Amendments Act, the metadata of all American citizens is now stored. Phone records are kept by the NSA in the MAINWAY telephony metadata collection database [53], and short message service and other text messages worldwide are retained in DISHFIRE [7], [8].

Emerging Forms of Metadata in an Internet of Things World

Figure 1. An artificial pacemaker (serial number 1723182) from St. Jude Medical, with electrode, which was removed from a deceased patient prior to cremation. (Photo courtesy of Wikimedia Commons.)

The upward movement toward a highly interconnected world through the Web of Things and people [9] will only mean that even greater amounts of data will be retained by corporations and government agencies around the world, extending beyond traditional forms of telecommunications data (e.g., phone records, e-mail correspondence, Internet search histories, metadata of images, videos, and other forms of multimedia). It should not surprise us that even medical devices are being touted as soon to be connected to the Internet of Things (IoT) [10]. Heart pacemakers, for instance, already send a steady stream of data back to the manufacturer’s data warehouse (Figure 1). Cardiac rhythmic data is stored on the implantable cardioverter-defibrillator’s (ICD’s) memory and is transmitted wirelessly to a home bedside monitor. Via a network connection, the data find their way to the manufacturer’s data store (Figure 2).

Figure 2. The standard setup for an EKG. A patient lies in a bed with EKG electrodes attached to his chest, upper arms, and legs. A nurse oversees the painless procedure. The ICD in a patient produces an EKG (A), which can automatically be sent to an ICD manufacturer's data store (B). (Image courtesy of Wikimedia Commons.)

In health speak, the ICD setup in the patient’s home is a type of remote monitoring that usually happens when the ICD recipient is in a state of rest, most often while sleeping overnight. It is a bit like how normal computer data backups happen, when network traffic is at its lowest. In the future, an ICD’s proprietary firmware updates may well travel back down to the device from the remote manufacturer, much like installing a Windows operating system update on a desktop. In the following section, we will explore the implications of access to personal cardiac data emanating from heart pacemakers in two cases.
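As a rough illustration of the data flow just described, the Python sketch below models a simplified nightly relay from a bedside monitor to a manufacturer’s data store. It is a minimal sketch under stated assumptions: the endpoint URL, device serial, record fields, and the 2 a.m. upload window are all hypothetical and are not drawn from any actual manufacturer’s protocol, which would use proprietary, authenticated channels.

```python
import json
import time
import urllib.request
from datetime import datetime

# Hypothetical endpoint; real manufacturers use proprietary, authenticated channels.
MANUFACTURER_ENDPOINT = "https://example-manufacturer.invalid/telemetry"

def read_icd_buffer():
    """Stand-in for the wireless read of the ICD's onboard memory by the bedside monitor."""
    return [
        {"timestamp": datetime.utcnow().isoformat(), "heart_rate_bpm": 62,
         "rhythm": "sinus", "battery_pct": 87},
    ]

def nightly_upload(records):
    """Relay the buffered records to the manufacturer's data store over the home network."""
    body = json.dumps({"device_serial": "DEMO-0000", "records": records}).encode("utf-8")
    req = urllib.request.Request(MANUFACTURER_ENDPOINT, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    while True:
        # Upload at around 2 a.m., when the patient is typically at rest,
        # much like scheduling computer backups for low-traffic hours.
        if datetime.now().hour == 2:
            nightly_upload(read_icd_buffer())
        time.sleep(3600)  # check once an hour
```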

CASE 1: HUGO CAMPOS DENIED ACCESS TO HIS PERSONAL CARDIAC DATA

Figure 3. A conventional radiograph of a single-chamber pacemaker. (Photo courtesy of Wikimedia Commons.)

In 2007, scientist Hugo Campos collapsed at a train station and was later horrified to find out that he had to get an ICD for his genetic heart condition. ICDs usually last about seven years before they require replacement (Figure 3). A few years into wearing the device, being a high-end quantified-self user who measured his sleep, exercise, and even alcohol consumption, Campos became curious about how he might gain access to the data generated by his ICD (Figure 4). He made some requests to the ICD’s manufacturer and was told that he was unable to receive the information he sought, despite his doctor having full access. Some doctors could even remotely download a patient’s historical data on a mobile app for 24/7 support during emergency situations (Figure 5). Campos’s heart specialist did grant him access to written interrogation reports, but Campos only saw him about once every six months after his condition stabilized. Additionally, the printed logs were of little use to him, as the fields and layout were predominantly decipherable only by a doctor (Figure 6).

Figure 4. The Nike FuelBand is a wearable computer that has become one of the most popular devices driving the so-called quantified-self trend. (Photo courtesy of Wikimedia Commons.)

Dissatisfied at being denied access, Campos took matters into his own hands and purchased a device on eBay that could help him get to the data. He also attended a specialist ICD course and then intercepted the cardiac rhythms being recorded [11]. He got to the data stream but realized that, to make sense of it from a patient perspective, a patient-centric app had to be built. Campos quickly deduced that regulatory and liability concerns were at the heart of the matter from the manufacturer’s perspective. How does a manufacturer continue to improve its product if it does not continually get feedback from the actual ICDs in the field? If manufacturers offered mobile apps for patients, might patients misread their own diagnoses? Is a manufacturer there to enhance life alone or to make a patient feel better about bearing an ICD? Can an ICD be misused by a patient? Or, in the worst-case scenario, what happens in the case of device failure? Or patient death? Would the proof lie onboard? Would the data tell the true story? These are all very interesting questions.

Figure 5. The medical waveform format encoding rule software on a BlackBerry device. It displays medical waveforms, such as EKG (shown), electroencephalogram, and blood pressure. Some doctors have software that allows them to interrogate EKG information, but patients presently do not have access to their own ICD data. (Photo courtesy of Wikimedia Commons.)

Campos might well have acted not only to get what he wanted (access to his data his own way) but also to raise global awareness of the type of data being stored remotely by ICDs in patients. He noted in his TEDxCambridge talk in 2011 [12]:

the ICD does a lot more than just prevent a sudden cardiac arrest: it collects a lot of data about its own function and about the patient’s clinical status; it monitors its own battery life; the amount of time it takes to deliver a life-saving shock; it monitors a patient’s heart rhythm, daily activity; and even looks at variations in chest impedance to look if there is build-up of fluids in the chest; so it is a pretty complex little computer you have built into your body. Unfortunately, none of this invaluable data is available to the patient who originates it. I have absolutely no access to it, no knowledge of it.

Doctors, on the other hand, have full 24/7 unrestricted access to this information; even some of the manufacturers of these medical devices offer the ability for doctors to access this information through mobile devices. Compare this with the patients’ experience who have no access to this information. The best we can do is to get a printout or a hardcopy of an interrogation report when you go into the doctor’s office.

Figure 6. An EKG chart: twelve different derivations of an EKG of a 23-year-old Japanese man. A similar log was provided to Hugo Campos upon his request for six months’ worth of EKG readings. (Photo courtesy of Wikimedia Commons.)

Campos decided to sue the manufacturer after he was informed that the data generated by his ICD while measuring his own heart activity were “proprietary data” [13]. Perhaps this is the new side of big data. But it is fraught with legal implications and, as far as I am concerned, blatantly dangerous. If we deduce that a person’s natural biometric data (in this instance, the cardiac rhythm of an individual) belong to a third party, then we are headed into murky waters when we speak of even more invasive technology like deep-brain stimulators [14]. It means not only that the device is not owned by the electrophorus (the bearer of technology) [15], [16], but quite possibly that the cardiac rhythms unique to the individual are also owned by the device manufacturer. We should not be surprised. In the “Software and Services” section of Google Glass’s terms of use, Google states that it has the right to “remotely disable or remove any such Glass service from user systems” at its “sole discretion” [17]. Placing this in the context of ICDs means that a third party, in effect, has the right to switch someone off.

CASE 2: ROSS COMPTON’S PACEMAKER DATA IS SUBPOENAED FOR CRIMINAL INVESTIGATIONS

Enter the Ross Compton case of Middletown, Ohio. M.G. Michael and I have dubbed it one of the first authentic uberveillance cases in the world, because the technology was not just wearable but embedded. The story goes something like this: On 27 January 2017, 59-year-old Ross Compton was indicted on arson and insurance fraud charges. Police gained a search warrant to obtain his heart pacemaker readings (heart rate and cardiac rhythms) and called his alibi into question. Data from Compton’s pacemaker before, during, and after the fire broke out in his home were disclosed by the heart pacemaker manufacturer after a subpoena was served. The insurer’s bill for the damage was estimated at about US$400,000. Police became suspicious of Compton when they traced gasoline to his shoes, trousers, and shirt.

In his statement of events to police, Compton told a story that conflicted with his call to 911. Forensic analysts found traces of multiple fires having been lit in various locations in the home. Yet Compton told police he had rushed his escape, breaking a window with his walking stick to throw some hastily packed bags out and then fleeing the flames himself to safety. Compton also told police that he had an artificial heart with a pump attached, a fact that he thought might help his cause but that was to be his undoing. In this instance, his pacemaker acted akin to a black box recording on an airplane [18].

After the heart pacemaker data set was secured, an independent cardiologist was asked to assess the telemetry data and determine whether Compton’s heart function was commensurate with the exertion needed to make a break with personal belongings during a life-threatening fire [19]. The cardiologist noted that, based on the evidence he was given to interpret, it was “highly improbable” that a man who suffered with the medical conditions that Compton did could manage to collect, pack, and remove the number of items that he did from his bedroom window, escape himself, and then proceed to carry these items to the front of his house, out of harm’s way (see “Columbo, How to Dial a Murder”). Compton’s own cardio readings, in effect, snitched on him, and none were happier than the law enforcement officer in charge of the case, Lieutenant Jimmy Cunningham, who noted that the pacemaker data, while only a supporting piece of evidence, was vital in proving Compton’s guilt after gasoline was found on his clothing. Evidence-based policing has now well outstripped the more traditional intelligence-led policing approach and has become entrenched given the new realm of big data availability [20], [21].

Columbo, How to Dial a Murder [S1] Columbo says to the murderer:
“You claim that you were at the physician’s getting your heart examined…which was true [Columbo unravels a roll of EKG readings]…the electrocardiogram, Sir. Just before three o’clock your physician left you alone for a resting trace. At that moment you were lying down in a restful position and your heart showed a calm, slow, easy beat [pointing to the EKG readout]. Look at this part, right here [Columbo points to the reading], lots of sudden stress, lots of excitement, right here at three o’clock, your heart beating like a hammer just before the dogs attacked…Oh you killed him with a phone call, Sir…I’ll bet my life on it. Very simple case. Not that I’m particularly bright, Sir…I must say, I found you disappointing, I mean your incompetence, you left enough clues to sink a ship. Motive. Opportunity. And for a man of your intelligence Sir, you got caught on a lot of stupid lies. A lot.” [S1] Columbo: How to Dial a Murder. Directed by James Frawley. 1978. Los Angeles, CA: Universal Pictures Home Entertainment, 2006. DVD.

Consumer Electronics Tell a Story

Several things are now of interest to the legal community: first and foremost, how is a search warrant for a person’s pacemaker data executed? In case 1, Campos was denied access to his own ICD data stream by the manufacturer, and yet his doctor had full access. In case 2, Compton’s own data provided authorities with the extra evidence they needed to accuse him of fraud. This is yet another example of seemingly private data being used against an individual (in this instance, the person from whose body the data emanated), but, in the future, the data from one person’s pacemaker might well implicate other members of the public. For example, the pacemaker might be able to prove that someone’s heart rate substantially increased during an episode of domestic violence [22] or that an individual was unfaithful in a marriage based on the cross-matching of his or her time stamp and heart rate data with another person’s.
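To illustrate how trivially such cross-matching could be performed once time-stamped readings exist, the Python sketch below pairs two hypothetical people’s heart-rate records that spike within the same few minutes. The records, threshold, and time window are invented for illustration only and do not reflect any real data set or forensic procedure.

```python
from datetime import datetime, timedelta

# Hypothetical, illustrative records: (timestamp, heart rate in bpm) for two individuals.
person_a = [(datetime(2017, 1, 15, 22, 5), 58), (datetime(2017, 1, 15, 23, 10), 96)]
person_b = [(datetime(2017, 1, 15, 23, 12), 101), (datetime(2017, 1, 16, 0, 30), 60)]

def coincident_spikes(a, b, threshold=90, window=timedelta(minutes=5)):
    """Return pairs of readings where both people's heart rates exceed the
    threshold within the same time window, i.e., the kind of trivial
    cross-matching alluded to above."""
    matches = []
    for t1, hr1 in a:
        for t2, hr2 in b:
            if hr1 >= threshold and hr2 >= threshold and abs(t1 - t2) <= window:
                matches.append((t1, hr1, t2, hr2))
    return matches

print(coincident_spikes(person_a, person_b))
```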

Of course, a consumer electronics device does not have to be embedded to tell a story (Figure 7). It can also be wearable or luggable, as in the case of a Fitbit that was used as a truth detector in an alleged rape case that turned out to be completely fabricated [23]. Lawyers are now beginning to experiment with other wearable gadgetry that helps to show the impact of personal injury cases from accidents (work and nonwork related) on a person’s ability to return to his or her normal course of activities [24] (Figure 8). We can certainly expect to see a rise in criminal and civil litigation that makes use of a person’s Samsung S Health data, for instance, which measure things like steps taken, stress, heart rate, SpO2, and even location and time (Figure 9). But cases like Compton’s open the floodgates.

Figure 7. A Fitbit, which measures calories, steps, distance, and floors. (Photo courtesy of Wikimedia Commons.)

Figure 8. A close-up of a patient wearing the iRhythm ZIO XT patch, nine days after its placement. (Photo courtesy of Wikimedia Commons.)

I have pondered the evidence itself: are heart rate data really any different from other biometric data, such as deoxyribonucleic acid (DNA)? Are they perhaps more revealing than DNA? Should they be dealt with in the same way? For example, is the chain of custody for data coming from a pacemaker equal to that of a DNA sample and profile? In some ways, heart rates can be considered a behavioral biometric [25], whereas DNA is actually a cellular sample [26]. No doubt we will be debating the challenges, and extreme perspectives will be hotly contested. But it seems nothing is off limits. If it exists, it can be used for or against you.

Figure 9. (a) and (b) Health-related data from Samsung's S Health application. Unknown to most is that Samsung has diversified its businesses to become a parent company of one of the world's largest health insurers. (Photos courtesy of Katina Michael.)

The Paradox of Uberveillance

In 2006, M.G. Michael coined the term uberveillance to denote “an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices in the human body” [27]. No doubt Michael’s background as a former police officer in the early 1980s, together with his cross-disciplinary studies, had something to do with his insights into the creation of the term [28]. This kind of surveillance does not watch from above; rather, it penetrates the body and watches from the inside, looking out [29].

Furthermore, uberveillance “takes that which was static or discrete…and makes it constant and embedded” [30]. It is real-time location and condition monitoring and “has to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought)” [30]. Uberveillance can be used prospectively or retrospectively. It can be applied as a “predictive mechanism for a person’s expected behavior, traits, likes, or dislikes; or it can be based on historical fact” [30].

In 2008, the term uberveillance was entered into the official Macquarie Dictionary of Australia [31]. In research spanning more than two decades on the social implications of implantable devices for medical and nonmedical applications, I predicted [15] that implantable devices once used solely for care purposes would, along their technological trajectory, one day be used retrospectively for tracking and monitoring purposes. Even if the consumer electronics in question were there to provide health care (e.g., the pacemaker example) or convenience (e.g., a near-field-communication-enabled smartphone), the underlying dominant function of the service would be control [32]. The socioethical implications of pervasive and persuasive emerging technologies have yet to be fully understood, but they will increasingly take center stage in court hearings, as DNA evidence did and, subsequently, global positioning system (GPS) data [33].

Medical device implants provide a very rich source of human activity monitoring, such as the electrocardiogram (EKG), heart rate, and more. Companies like Medtronic, among others specializing in implantables, have proposed a future where even healthy people carry a medical implant packed with sensors that could be life sustaining, detect heart problems (among other conditions), report them to a care provider, and signal when assistance might be required [34]. Heart readings provide an individual’s rhythmic biometrics and, at the same time, can record increases and decreases in activity. One could extrapolate that it won’t be long before our health insurance providers are asking for the same evidence in exchange for reduced premiums.

Figure 10. A pacemaker cemetery. (Photo courtesy of Wikimedia Commons.)

The future might well be one where we all carry a black box implantable recorder of some sort [35], an alibi that proves our innocence or guilt, minute by minute (Figure 10). Of course, an electronic eye constantly recording our every move brings a new connotation to the wise words expressed in the story of Pinocchio: always let your conscience be your guide. Future black boxes may not be as forgiving as Jiminy Cricket and may be more like Black Mirror’s “The Entire History of You” [36]. And if we assume that these technologies, whether implantable, wearable, or even luggable, can be completely trusted, then we are wrong.

The contribution of M.G. Michael’s uberveillance is in the emphasis that the uberveillance equation is a paradox. Yes, there are near-real-time data flowing continuously from more points of view than ever [37]: closed-circuit TV looking down, smartphones in our pockets recording location and movement, and even implantables in some of us ensuring nontransferability of identity [38]. The proposition is that all this technology in sum total is bulletproof and foolproof, omniscient and omnipresent, a God’s-eye view that cannot be challenged, except for the fact that the infrastructure, the devices, and the software are all too human. And while uberveillance is being touted for good through an IoT world that will collectively make us and our planet more sustainable, there is one big crack in the utopian vision: the data can misrepresent, misinform, and be subject to information manipulation [39]. Researchers are already studying the phenomenon of complex visual information manipulation: how to tell whether data have been tampered with, whether a suspect has been introduced into or removed from a crime scene, and other forensic visual analytics [40]. It is why Vladimir Radunovic, director of cybersecurity and e-diplomacy programs at the DiploFoundation, cited M.G. Michael’s observation that “big data must be followed by big judgment” [41].

What happens in the future if we go down the path of constant bodily monitoring of vital organs and vital signs, where we are all bearing some device or at least wearing one? Will we be in control of our own data, or, as seems obvious at present, will we not be in control? And how might self-incrimination play a role in our daily lives, or, even worse, what of individual expectations that can be met only by performing to a theater 24/7 so that our health statistics stack up to whatever measure and cross-examination they are put under, personally or publicly [42]? Can we believe the authenticity of every data stream coming out of a sensor onboard consumer electronics? The answer is no.

Having run many years of GPS data-logging experiments, I can say that a lot can go wrong with sensors, and they are susceptible to outside environmental conditions. For instance, they can log your location miles away (even on another continent), the temperature gauge can play up, time stamps can revert to different time zones, the speed of travel can be wildly inaccurate due to satellite propagation delays, readings may not come at regular intervals due to some kind of interference, and memory overflow and battery issues, while getting better, are still problematic. The long and short of it is that technology cannot be trusted. At best, it can act as supporting evidence but should never replace eyewitness accounts. Additionally, “the inherent problem with uberveillance is that facts do not always add up to truth (i.e., as in the case of an exclusive disjunction, T + T = F), and predictions based on uberveillance are not always correct” [30].
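The Python sketch below shows the kind of basic plausibility checks such logs would need before being trusted as evidence: out-of-range coordinates, implausible speeds, and time stamps that run backward. The thresholds, field names, and sample records are hypothetical and purely illustrative, not taken from any actual experiment.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Fix:
    timestamp: datetime
    lat: float
    lon: float
    speed_kmh: float

def plausibility_flags(fixes, max_speed_kmh=200.0):
    """Flag common GPS-logger faults described above: coordinates outside valid
    ranges, implausible speeds, and non-monotonic time stamps."""
    flags = []
    prev_time = None
    for i, f in enumerate(fixes):
        if not (-90 <= f.lat <= 90 and -180 <= f.lon <= 180):
            flags.append((i, "coordinate out of range"))
        if f.speed_kmh > max_speed_kmh:
            flags.append((i, "implausible speed"))
        if prev_time is not None and f.timestamp <= prev_time:
            flags.append((i, "time stamp not increasing"))
        prev_time = f.timestamp
    return flags

# Illustrative log: the second fix sits on another continent, reports an absurd
# speed, and carries a time stamp earlier than the previous reading.
log = [
    Fix(datetime(2017, 3, 1, 9, 0, 0), -34.4, 150.9, 4.2),
    Fix(datetime(2017, 3, 1, 8, 0, 0), 51.5, -0.1, 900.0),
]
print(plausibility_flags(log))
```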

Conclusion

While device manufacturers are challenging in court the possibility that their ICDs are hackable [43], highly revered security experts like Bruce Schneier are strongly cautioning against going down the IoT path, no matter how inviting it might look. In his acclaimed blog, Schneier recently wrote [44]:

All computers are hackable…The industry is filled with market failures that, until now, have been largely ignorable. As computers continue to permeate our homes, cars, businesses, these market failures will no longer be tolerable. Our only solution will be regulation, and that regulation will be foisted on us by a government desperate to “do something” in the face of disaster…We also need to reverse the trend to connect everything to the internet. And if we risk harm and even death, we need to think twice about what we connect and what we deliberately leave uncomputerized. If we get this wrong, the computer industry will look like the pharmaceutical industry, or the aircraft industry. But if we get this right, we can maintain the innovative environment of the internet that has given us so much.

The cardiac implantables market is predicted to reach US$43 billion by 2020 [45]. Obviously, the stakes are high and getting higher with every breakthrough implantable innovation we develop and bring to market. We will need to address some very pressing questions, as Schneier suggests, through some form of regulation if we are to maintain consumer privacy rights and data security. Joe Carvalko, a former telecommunications engineer and U.S. patent attorney, an associate editor of IEEE Technology and Society Magazine, and a pacemaker recipient, has already added much to this discussion [46], [47]. I highly recommend several of his publications, including “Who Should Own In-the-Body Medical Data in the Age of eHealth?” [48] and an ABA publication coauthored with Cara Morris, The Science and Technology Guidebook for Lawyers [49]. Carvalko is a thought leader in this space, and I encourage you to listen to his podcast [50] and also to read his speculative fiction novel, Death by Internet [51], which is hot off the press and wrestles with some of the issues raised in this article.

REFERENCES

[1] K. Michael, M. Thistlethwaite, M. Rowland, and K. Pitt. (2015, Mar. 6). Standing Committee on Infrastructure and Communications, Section 313 of the Telecommunications Act 1997. [Online]. Available: http://parlinfo.aph.gov.au/parlInfo/search/display/display.w3p;db=COMMITTEES;id=committees%2Fcommrep%2Fd8727a07-ba09-4a91-9920-73d21e446d1d%2F0006;query=Id%3A%22committees%2Fcommrep%2Fd8727a07-ba09-4a91-9920-73d21e446d1d%2F0000%22

[2] S. Bronitt and K. Michael, “Human rights, regulation, and national security,” IEEE Technol. Soc. Mag., vol. 31, pp. 15–16, 2012.

[3] B. Hall. (2016, Dec. 22). Australians’ phone and email records could be used in civil lawsuits. Sydney Morning Herald. [Online]. Available: http://www.smh.com.au/federal-politics/political-news/australians-phone-andemail-records-could-be-used-in-civil-lawsuits-20161222-gtgdy6.html

[4] PureVPN. (2015, Oct. 14). Data retention laws—an update. [Online]. Available: https://www.purevpn.com/blog/data-retention-laws-by-countries/

[5] D. Crawford. (2014, Nov. 18). Renegade Swedish ISP offers all customers VPN. Best VPN. [Online]. Available: https://www.bestvpn.com/blog/11806/renegade-swedish-isp-offers-customers-vpn/

[6] J. Ball. (2013, Oct. 1). NSA stores metadata of millions of web users for up to a year, secret files show. Guardian. [Online]. Available: https://www.theguardian.com/world/2013/sep/30/nsa-americans-metadata-year-documents

[7] J. S. Granick, American Spies: Modern Surveillance, Why You Should Care, and What to Do About It. Cambridge, U.K.: Cambridge Univ. Press, 2017.

[8] A. Gregory, American Surveillance: Intelligence, Privacy, and the Fourth Amendment. Madison: Univ. of Wisconsin Press, 2016.

[9] K. Michael, G. Roussos, G. Q. Huang, A. Chattopadhyay, R. Gadh, B. S. Prabhu, and P. Chu, “Planetary-scale RFID services in an age of uberveillance,” Proc. IEEE, vol. 98, no. 9, pp. 1663–1671, 2010.

[10] N. Lars. (2015, Mar. 26). Connected medical devices, apps: Are they leading the IOT revolution—or vice versa? Wired. [Online]. Available: https://www.wired.com/insights/2014/06/connected-medical-devicesapps-leading-iot-revolution-vice-versa/

[11] H. Campos. (2015). The heart of the matter. Slate. [Online]. Available: http://www.slate.com/articles/technology/future_tense/2015/03/patients_should_be_allowed_to_access_data_generated_by_implanted_devices.html

[12] H. Campos. (2011). Fighting for the right to open his heart data: Hugo Campos at TEDxCambridge 2011. [Online]. Available: https://www.youtube.com/watch?v=oro19-l5M8k

[13] D. Hinckley. (2016, Feb. 22). This big brother/big data business goes way beyond Apple and the FBI. Huffington Post. [Online]. Available: http://www.huffingtonpost.com/david-hinckley/this-big-brotherbigdata_b_9292744.html

[14] K. Michael, “Mental health, implantables, and side effects,” IEEE Technol. Soc. Mag., vol. 34, no. 2, pp. 5–17, 2015.

[15] K. Michael, “The technological trajectory of the automatic identification industry: The application of the systems of innovation (SI) framework for the characterisation and prediction of the auto-ID industry,” Ph.D. dissertation, School of Information Technology and Computer Science, Univ. of Wollongong, Wollongong, Australia, 2003.

[16] K. Michael and M. G. Michael, “Homo electricus and the continued speciation of humans,” in The Encyclopedia of Information Ethics and Security, M. Quigley, Ed. Hershey, PA: IGI Global, 2007, pp. 312–318.

[17] Google Glass. (2014, Aug. 19). Glass terms of use. [Online]. Available: https://www.google.com/glass/termsofuse/

[18] K. Michael and M. G. Michael, “Implementing ‘namebers’ using microchip implants: The black box beneath the skin,” in This Pervasive Day: The Potential and Perils of Pervasive Computing, J. Pitt, Ed. London, U.K.: Imperial College Press, 2011.

[19] D. Smith. (2017, Feb. 4). Pacemaker data used to charge alleged arsonist. Jonathan Turley. [Online]. Available: https://jonathanturley.org/2017/02/04/pacemaker-data-used-to-charge-alleged-arsonist/

[20] K. Michael, “Big data and policing: The pros and cons of using situational awareness for proactive criminalisation,” presented at the Human Rights and Policing Conf., Australian National University, Canberra, Apr. 16, 2013.

[21] K. Michael and G. L. Rose, “Human tracking technology in mutual legal assistance and police inter-state cooperation in international crimes,” in From Dataveillance to Überveillance and the Realpolitik of the Transparent Society (The Second Workshop on Social Implications of National Security), K. Michael and M. G. Michael, Eds. Wollongong, Australia: University of Wollongong, 2007.

[22] F. Gerry, “Using data to combat human rights abuses,” IEEE Technol. Soc. Mag., vol. 33, no. 4, pp. 42–43, 2014.

[23] J. Gershman. (2016, Apr. 21). Prosecutors say Fitbit device exposed fibbing in rape case. Wall Street Journal. [Online]. Available: http://blogs.wsj.com/law/2016/04/21/prosecutors-say-fitbit-device-exposedfibbing-in-rape-case/

[24] P. Olson. (2014, Nov. 16). Fitbit data now being used in the courtroom. Forbes. [Online]. Available: https://www.forbes.com/sites/parmyolson/2014/11/16/fitbit-data-court-room-personal-injury-claim/#459434e37379

[25] K. Michael and M. G. Michael, “The social and behavioural implications of location-based services,” J. Location Based Services, vol. 5, no. 3–4, pp. 121–137, Sept.–Dec. 2011.

[26] K. Michael, “The European court of human rights ruling against the policy of keeping fingerprints and DNA samples of criminal suspects in Britain, Wales and Northern Ireland: The case of S. and Marper v United Kingdom,” in The Social Implications of Covert Policing (Workshop on the Social Implications of National Security, 2009), S. Bronitt, C. Harfield, and K. Michael, Eds. Wollongong, Australia: University of Wollongong, 2010, pp. 131–155.

[27] M. G. Michael and K. Michael, “National security: The social implications of the politics of transparency,” Prometheus, vol. 24, no. 4, pp. 359–364, 2006.

[28] M. G. Michael, “On the ‘birth’ of uberveillance,” in Uberveillance and the Social Implications of Microchip Implants, M. G. Michael and K. Michael, Eds. Hershey, PA: IGI Global, 2014.

[29] M. G. Michael and K. Michael, “A note on uberveillance,” in From Dataveillance to Überveillance and the Realpolitik of the Transparent Society (The Second Workshop on Social Implications of National Security), M. G. Michael and K. Michael, Eds. Wollongong, Australia: University of Wollongong, 2007.

[30] M. G. Michael and K. Michael, “Toward a state of uberveillance,” IEEE Technol. Soc. Mag., vol. 29, pp. 9–16, 2010.

[31] M. G. Michael and K. Michael, “Uberveillance,” in Fifth Edition of the Macquarie Dictionary, S. Butler, Ed. Sydney, Australia: Sydney University, 2009.

[32] A. Masters and K. Michael, “Lend me your arms: The use and implications of humancentric RFID,” Electron. Commerce Res. Applicat., vol. 6, no. 1, pp. 29–39, 2007.

[33] K. D. Stephan, K. Michael, M. G. Michael, L. Jacob, and E. P. Anesta, “Social implications of technology: The past, the present, and the future,” Proc. IEEE, vol. 100, pp. 1752–1781, 2012.

[34] E. Strickland. (2014, June 10). Medtronic wants to implant sensors in everyone. IEEE Spectrum. [Online]. Available: http://spectrum.ieee.org/tech-talk/biomedical/devices/medtronic-wants-to-implant-sensorsin-everyone

[35] K. Michael, “The benefits and harms of national security technologies,” presented at the Int. Women in Law Enforcement Conf., Hyderabad, India, 2015.

[36] J. A. Brian Welsh. (2011). “The entire history of you,” Black Mirror, C. Brooker, Ed. [Online]. Available: https://www.youtube.com/watch?v=Sw3GIR70HAY

[37] K. Michael, “Sousveillance and point of view technologies in law enforcement,” presented at the Sixth Workshop on the Social Implications of National Security: Sousveillance and Point of View Technologies in Law Enforcement, University of Sydney, Australia, 2012.

[38] K. Albrecht and K. Michael, “Connected: To everyone and everything,” IEEE Technology and Soc. Mag., vol. 32, pp. 31–34, 2013.

[39] M. G. Michael, “The paradox of the uberveillance equation,” IEEE Technol. Soc. Mag., vol. 35, no. 3, pp. 14–16, 20, 2016.

[40] K. Michael, “The final cut—tampering with direct evidence from wearable computers,” presented at the Fifth Int. Conf. Multimedia Information Networking and Security (MINES 2013), Beijing, China, 2013.

[41] V. Radunovic, “Internet governance, security, privacy and the ethical dimension of ICTs in 2030,” IEEE Technol. Soc. Mag., vol. 35, no. 3, pp. 12–14, 2016.

[42] K. Michael. (2011, Sept. 12). The microchipping of people and the uberveillance trajectory. Social Interface. [Online]. Available: http://socialinterface.blogspot.com.au/2011/08/microchipping-of-people-and.html

[43] O. Ford. (2017, Jan. 12). Post-merger Abbott moves into 2017 with renewed focus, still faces hurdles. J.P. Morgan Healthcare Conf. 2017. [Online]. Available: http://www.medicaldevicedaily.com/servlet/com.accumedia.web.Dispatcher?next=bioWorldHeadlines_article&forceid=94497

[44] B. Schneier. (2017, Feb. 1). Security and the Internet of Things: Schneier on security. [Online]. Available: https://www.schneier.com/blog/archives/2017/02/security_and_th.html

[45] IndustryARC. (2015, July 30). Cardiac implantable devices market to reach $43 billion by 2020. GlobeNewswire. [Online]. Available: https://globenewswire.com/news-release/2015/07/30/756345/10143745/en/Cardiac-Implantable-Devices-Market-to-Reach-43-Billion-By-2020.html

[46] J. Carvalko, The Techno-Human Shell: A Jump in the Evolutionary Gap. Mechanicsburg, PA: Sunbury Press, 2013.

[47] J. Carvalko and C. Morris, “Crowdsourcing biological specimen identification: Consumer technology applied to health-care access,” IEEE Consum. Electron. Mag., vol. 4, no. 1, pp. 90–93, 2014.

[48] J. Carvalko, “Who should own in-the-body medical data in the age of ehealth?” IEEE Technol. Soc. Mag., vol. 33, no. 2, pp. 36–37, 2014.

[49] J. Carvalko and C. Morris, The Science and Technology Guidebook for Lawyers. New York: ABA, 2014.

[50] K. Michael and J. Carvalko. (2016, June 20). Joseph Carvalko speaks with Katina Michael on his non-fiction and fiction pieces. [Online]. Available: https://www.youtube.com/watch?v=p4JyVCba6VM

[51] J. Carvalko, Death by Internet. Mechanicsburg, PA: Sunbury Press, 2016.

[52] R. Pearce. (2017, June 7). “No-one’s talking about backdoors” for encrypted services, says PM’s cyber guy. Computerworld. [Online]. Available: https://www.computerworld.com.au/article/620329/no-onetalking-about-backdoors-says-pm-cyber-guy/

[53] M. Ambinder. (2013, Aug. 14). An educated guess about how the NSA is structured. The Atlantic. [Online]. Available: https://www.theatlantic.com/technology/archive/2013/08/an-educated-guess-abouthow-the-nsa-is-structured/278697/

Acknowledgment

A short form of this article was presented as a video keynote speech for the Fourth International Conference on Innovations in Information, Embedded and Communication Systems in Coimbatore, India, on 17 March 2017. The video is available at https://www.youtube.com/watch?v=bEKLDhNfZio.

Keywords

Metadata, Electrocardiography, Pacemakers, Heart beat, Telecommunication services, Implants, Biomedical equipment, cardiology, criminal law, medical computing, police data processing, transport protocols, implantable medical device, heart, Australian inquiry, government agencies, illegal online services, mandatory metadata retention laws, government organizations, law enforcement organizations, Internet protocol

Citation: Katina Michael, 2017, "Implantable Medical Device Tells All: Uberveillance Gets to the Heart of the Matter", IEEE Consumer Electronics Magazine, Vol. 6, No. 4, Oct. 2017, pp. 107 - 115, DOI: 10.1109/MCE.2017.2714279.

 

High-Tech Child's Play in the Cloud

Introduction

The “Internet of Things” mantra promotes the potential for the interconnectedness of everyone and everything [1]. The fundamental premise is that embedded sensors (including audio and image sensors) will herald an age of convenience, security, and quick response [2]. We have become so oblivious to the presence and placement of sensors in civil infrastructure (e.g., shopping centers and lampposts) and computing devices (e.g., laptops and smartphones) that we do not question their placement in places of worship, restrooms, and, especially, children's toys [3].

The risk with consumer desensitization over the “sensors everywhere” paradigm is, at times, complacency, but, for the greater part, apathy. When functionality is hidden inside a black box or is wireless, consumers can underestimate the potential for harm. The old adage “what you don't know won't hurt you” is not true in this context and neither is the “I have nothing to hide” principle. Form factors can play a significant role in disarming buyers of white goods for households and gifts for minors. In context, the power of a sensor looks innocent when it is located in a children's toy, as opposed to sitting atop a mobile closed-circuit television policing unit.

Barbie is Watching

The Mattel Vidster is a digital tapeless camcorder that was marketed as a children's toy. It features a 28-mm LCD display, a 2x digital zoom, and records into AVI 320 × 240 video files encoded with the M-JPEG codec at 15 frames/s, with 22-kHz monaural sound. It also takes still photos.

An example of this shift in context is Mattel's Video Girl Barbie doll, launched in July 2010 [4]. It features a fully functional standard-definition pinhole video camera embedded in Barbie's chest, with a viewing screen on her back. Young children (Mattel is targeting ages six years and above) are encouraged by the design to use the “doll's-eye view” to record Barbie's point of view for up to 30 min. They can then create movies using the accompanying StoryTeller software. Video Girl comes with a (pink) USB plug-in cord for easy upload of the recorded footage. Initially, Mattel provided storage space in the cloud for video makers to share movies (http://barbie.com/videogirl), but the company later backtracked and eliminated this video-sharing capability. We have speculated that one of Mattel's reasons for doing so was that it faced the prospect of footage recorded at ground level that exposed young, carefree children at play.

The Barbie Video Girl doll—Create movies from Barbie's point-of-view with a real video camera inside the doll (the camera lens is in the necklace, and the video screen is on her back).

In his book Cybercrime, Jonathan Clough makes it clear that child pornography offenses are stipulated in Title 3, Article 9 of the Cybercrime Convention as producing, offering or making available, distributing or transmitting, procuring, or possessing child pornography [5, p. 281]. While definitions of what constitutes an offense under child pornography laws vary greatly from one country to the next, court cases worldwide are providing clear precedents for unacceptable behaviors. It is quite possible that Mattel did not wish to find itself in the precarious situation of “offering or making available” debatable imagery of young children or of becoming a potential, albeit accidental, accessory to possession. In essence, this places the manufacturer at the mercy of those who would label it a groomer or even a procurer of child pornography, an engineer of another insidious arm of the child pornographer. Three of the offenses that fall under the “making available” category of child pornography laws are to publish, to make available, and to show [5, p. 287]. Mattel had obviously not thought through all the pros and cons associated with video sharing by minors. In fact, most social media websites, Facebook and Instagram included, have policies that preclude those under the age of 13 from registering and participating.

Four months after the official launch of Video Girl, the U.S. Federal Bureau of Investigation (FBI) privately issued a warning that the doll could be used to produce child pornography [6]. On 30 November 2010, in a situational information report “cybercrime alert” from its Sacramento field office, the FBI publicly stated that there was “no reported evidence that the doll had been used in any way other than intended” [7], [8]. However, the report also revealed an instance in which an individual convicted of distributing child pornography had given the Barbie doll to a six-year-old girl, as well as numerous instances in which a concealed video camera had recorded child pornography. All of these events are unsurprising [9]. The most obvious form of possession, with respect to the Barbie, would be if the accused had the item in his or her “present manual custody.” For example, if the defendant was found to be holding a Video Girl Barbie doll containing child pornography images or video, then, subject to the requirement of knowledge, he or she would be in possession of those images or video. In addition, if the doll was found in the defendant's physical control (e.g., in his or her house), even that would constitute an offense.

There are professionals who have filmed Video Girl Barbie in a sexualized manner [10], but that in itself is not an offense. Although the YouTube video that compares the camera quality of the Canon 7D to Video Girl is unlisted (only people who know the link to the video can view it, and unlisted videos do not appear in YouTube search results), it sadly shows, through arguably borderline “adult” humor, what distortion is possible through adult eyes. In the YouTube comments for the video, Naxell wrote, “[t]hat USB in the back and the leg batteries make this seem like some kind of bizarre multipurpose sex gynoid,” while Marcos Vidal wrote, “Well, think on the Barbie's use; it can spy—with Cannon 7D, it's a lot harder.” While no one is claiming that Vidal was referring to the recording of a child for duplicitous reasons, it certainly suggests that Barbie could be used as a covert camera. Essentially, it takes a form of child's play and makes it an asset of the cloud for future use and possible manipulation. This is a fundamental issue in the new type of cybercrime: “the advent of digital technology has transformed the way in which child pornography is produced and distributed” [5, p. 251]. In essence, child pornography can be defined as “the sexual depiction of a child under a certain age” [5, p. 255].

Marketing Mishaps

While we do not need to point to a video someone has made of Barbie and her superpower recording prowess “under the hood,” we can simply look at Mattel's poor taste in its advertising strategy for the Video Girl doll as a children's toy. The key question is whether those who engineered the doll at Mattel understand that they are accountable for the purposeful user design and user experience they have created [11]. In a press release, the company stated, “Mattel products are designed with children and their best interests in mind. Many of Mattel's employees are parents themselves, and we understand the importance of child safety—it is our number one priority” [12].

The Barbie Video Girl doll is “doll vision” for ages 6 and above.

At the time of the online media content review in early 2011, one of the authors, Katina Michael, was horrified to find some disturbing ways in which Mattel had soft-launched the product. In fact, the doll sold out at Wal-Mart in its first release. The other author, Alexander Hayes, purchased a Barbie Video Girl in 2010 to inform his Ph.D. research on point-of-view technologies, and he told Katina that the doll was “hideous…a manifestation of the most cruel manner in which to permeate a child's play.” Katina agreed and noted that the purchased Barbie would remain forever unopened because the packaging itself formed part of the bigger picture they would use as a stimulus for discussion with public audiences. Katina used the packaged Barbie during her presentation at the Fourth Regional Conference on Cybercrime and International Criminal Cooperation, which was well attended by law enforcement agencies, legal personnel, and scholars of the social implications of technology [13]. The Video Girl Barbie also made further appearances at the February 2012 SINS Workshop, “Point-of-View Technologies in Law Enforcement” [14], and at an invited workshop at which Katina and Alexander spoke, the 2013 INFORMA Policing Technology Conference, on the theme “Bring Your Own Body-Worn Device” [15].

In July 2010, Mattel released Barbie Video Girl, a doll with a pinhole video camera in its chest enabling clips up to 30 min to be recorded.

Perhaps the most disturbing and disappointing aspect of the Video Girl Barbie was the way in which the doll was marketed. On the packaging was the statement “I am a real working video camera.” This vernacular is akin to that of adult sex workers and does not fit with the societal moral and ethical frameworks by which we protect innocent children. It is questionable why the word working was introduced into the phraseology. In essence, Video Girl Barbie is a photoborg [16]. She is reminiscent of Mattel's Vidster video camera toy for kids [17], cloaked in the form of a Barbie doll. Elsewhere, Mattel mentions: “Necklace is a real camera lens!” But the location of the camera on the chest looks less like a necklace and more like cleavage, and there is an additional statement: “This Barbie has a hidden video camera” [18]. There was also a picture of Barbie depicted on her knees with a visual didactic stating “for easy shooting,” indicating the three steps to making a movie. The storytelling video demo scenario Mattel used had to do with cats at the vet and was generally in poor taste. The cat was depicted getting her heartbeat monitored in one video scene, getting an X-ray in another, and then finding herself in a basket with another cat and finding love, with a heart symbol depicted above the cats' heads.

Comments varied for iJustine's video “OMG Video Girl,” which has more than 1.4 million YouTube views [19]. Here was a female adult commenting on a toy for kids. Taylor Johnson wrote, “My Favorite was the vet Barbie! Haha!” Mssjasmine commented, “That doll is kinda creepy (like a pedophile would buy that to watch little kids…ew).” Sam Speirs similarly wrote, “This ‘toy’ of yours will/could be used as a major predator trap! And I know that the idea was for the girls to have a camera [to] do stuff, but, seriously, it's a concealed camera in a popular little girl's toy…Creepy, if you ask me!” Another reviewer of children's toys wrote: “Barbie sees everything from a whole different angle” [20]. Several “Boycott Barbie” websites could be found in 2011, including the “Get Rid of Barbie Video Girl” Facebook page and “Boycott Porno Barbie.”

A child plays with traditional dolls. Today, we are making dolls that are connected to the cloud and use artificial intelligence to listen to questions from children and provide them answers over the Internet without human intervention. Soon, we will be asking the question “what is real?”

Perhaps the worst example of Mattel's approach to this product was its initial press release (sent to TechCrunch by the PR firm responsible), which stated: “Unsuspecting subjects won't know that Barbie is watching their every move…” [21]. The issues for Mattel to consider have much to do with corporate responsibility. Setting aside the potential for pedophiles to use this technology to cause harm, what happens if innocents produce content that is illegal and would otherwise mean criminalization? Could the doll be used to groom and seduce victims of child pornography?

Hello? Barbie is Listening

But Mattel, like most high-tech manufacturers, has not stopped there. Convergence has become an integral part of the development cycle. If the Barbie Video Girl doll seemed amazing as a concept, then the Hello Barbie doll has outdone it. In Mattel's own words, Hello Barbie is “a whole new way to play with Barbie!” She differs from Barbie Video Girl in several ways. The doll still comes equipped with a whole bunch of electronics, but Hello Barbie uses speech-recognition technology to hold a conversation with a child and allows only still-shot photo capture. The product information page on Mattel's website reads:

Using Wi-Fi and speech-recognition technology, Hello Barbie doll can interact uniquely with each child by holding conversations, playing games, sharing stories, and even telling jokes! […] Use is simple after set up—push the doll's belt buckle to start a conversation, and release to hear her respond […] To get started, download the Hello Barbie companion app to your own smart device from your device's app store (not included). Parents must also set up a ToyTalk account and connect the doll to use the conversational features. Hello Barbie doll can remember up to three different Wi-Fi locations [22].

Thus, the doll transmits data back to a service called ToyTalk. Forbes reported that ToyTalk has terms of service and a privacy policy that allow it to “share audio recordings with third-party vendors who assist [Mattel] with speech recognition.” Customer “recordings and photos may also be used for research and development purposes, such as to improve speech recognition technology and artificial intelligence algorithms and create better entertainment experiences” [23]. There is, however, a “SafePlay” option, where parents and guardians are still “in control of their child's data and can manage this data through the ToyTalk account at any time” [22].

To manage SafePlay, parents must visit www.mattel.com/hellobarbiefaq to get more information, or call +1 888 256 0224—and every parent will certainly have time to do this [24]. “Parents must also set up a ToyTalk account and connect to use the conversational features…Use of Hello Barbie involves recording of voice data; see ToyTalk's privacy policy at http://www.toytalk.” Of course, it is not the parents who will end up downloading these apps but the children.

Continued Infiltration

This raises many questions about the trajectory of toys and everyday products that increasingly contain networked features, which introduce new parameters to what was once innocent child's play, unseen and carefree. First, Samsung launched a television set that can hear household conversations [25], and now we are to believe that it is the real Barbie who is “chatting” with our children. Are we too blind to see what is occurring? Is this really play? Or is it the best way of gathering marketing data and further manipulating those too young to know that the Barbie talking to them is not real but actually a robot of sorts? Just as we were once oblivious to the fact that our typed entries in search boxes were being collated to study our habits, likes, and dislikes, we are presently oblivious to the onslaught of products that are trying to infiltrate our homes and even our minds.

A spate of products has entered the market doing exactly the same thing as Hello Barbie but targeting a variety of vertical segments—from the Amazon Echo, for families who allegedly need a cloud connector because they cannot spell words like cantaloupe [26], [27], to Nest's thermostat and smoke-detection capability, which doubles as human activity monitoring and tracking (Nest says so openly in its promotional commercials) [28], to Dropcam's reconnaissance video recordings of what happens in your household 24/7, just in case there is a perpetrator who dares to enter [29].

Cayla is Talking—And It's Not Always Pretty

Perhaps our “favorite” is the My Friend Cayla doll [30], which, like Hello Barbie, connects to the cloud. She is seemingly innocent but has shown herself to be the stuff of nightmares, akin to the horror movie Child's Play featuring the character Chucky [31]. On the Australian Cayla page, potential buyers are again greeted by a splash page with a cat on it: “I love my cat Lily. I will tell you her story.” Cayla is depicted talking to two little girls. The British Christmas best seller is effectively a Bluetooth headset dressed as a doll. With the help of a Wi-Fi connection on the paired smartphone (as with Hello Barbie), she can answer a whole lot of tough questions, Amazon Echo style, and you would be surprised at her capacity [32]. But security researcher Ken Munro of Pen Test Partners put Cayla to the test and identified major security flaws that could give perpetrators a way in. In essence, Cayla was hacked: she was made to speak a list of 1,500 strong words and expletives, and her responses to questions were modified [33].

This reminds us of the 2015 article in IEEE Technology and Society Magazine by K. Albrecht and L. McIntyre on IP cameras that double as baby monitors [34]. The moral of the story is the same whether the cloud-connected device is a children's monitor, a children's toy, a desktop game for kids, a television console, a Q&A tool for households, or a plain-old Wi-Fi-enabled smoke detector or thermostat: if it's connected, it's vulnerable to security hacks and privacy breaches [35]. Worse still, if the device can talk back to you, then you need to think about the logic behind the process and about what we are teaching our children regarding what is human and what is not. If these electronic products are going back to the Internet for their answers, then don't be surprised if nonphysical autonomous software robots one day begin to spit out bizarre answers and manipulative responses based on whatever is out there online.
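The "if it's connected, it's vulnerable" point is easy to verify at home. Below is a minimal sketch, in Python, of a household inventory scan that lists which devices on the local network are quietly listening on ports commonly exposed by cloud-connected gear. The subnet (a typical 192.168.1.0/24 is assumed) and the port list are illustrative choices, not drawn from the article.

import ipaddress
import socket

# Ports commonly exposed by connected household devices
# (web configuration pages, video streams, debug consoles).
COMMON_PORTS = {23: "telnet", 80: "http", 443: "https", 554: "rtsp", 8080: "http-alt"}

def audit_home_network(subnet: str = "192.168.1.0/24", timeout: float = 0.2) -> None:
    """Print every host on the subnet with one of the common ports open:
    a rough inventory of what in the house is listening for connections."""
    for host in ipaddress.ip_network(subnet).hosts():
        for port, name in COMMON_PORTS.items():
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.settimeout(timeout)
                if s.connect_ex((str(host), port)) == 0:
                    print(f"{host}\t{port}/{name} open")

if __name__ == "__main__":
    audit_home_network()

Running such a scan in an ordinary home typically turns up more listening devices than the household realizes it owns, which is precisely the concern raised here.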

As Kate Darling said in a Berkman talk at Harvard University in 2013, “[s]o not to undermine everything that I've just said here, but I do wonder…Say McDonald's gets its hands on a whole bunch of children's toys that are social robots and interacts with the kids socially, and the toys are telling the kids…to eat more McDonald's, and the kids are responding to that. That is something that we also need to think about and talk about, when these things start to happen. They could be used for good and for evil” [36]. If only that were all they would be saying to the next generation!

Katina visited the My Friend Cayla website recently and found this message: “Due to changes in the external website which Cayla gets some information from, she is temporarily unable to answer some types of questions. Cayla can still talk about herself, do maths and spelling, and all other functions are unaffected. A free app update will be issued (for both iOS and Android users) within the next two weeks with a fix. Thank you for your understanding” [37]. Keeping our children safe and aware of the difference between virtual and real is one thing, but, if we aren't careful, we will soon welcome a future where My Friend Cayla might well be facing off against Hello Barbie in another Child's Play blockbuster.

References

1. K. Albrecht, K. Michael, "Connected to everyone and everything", IEEE Technol. Soc. Mag., vol. 32, no. 4, pp. 31-34, 2014.

2. M. G. Michael, K. Michael, C. Perakslis, "Uberveillance the Web of Things and People: What is the culmination of all this surveillance?", IEEE Consumer Electron. Mag., vol. 4, no. 2, pp. 107-113, 2015.

3. K. Michael, "Wearable computers challenge human rights", ABC Science, July 2013, [online] Available: http://www.abc.net.au/science/articles/2013/07/24/3809675.htm.

4. Barbie's video girl, Sept. 2015, [online] Available: http://service.mattel.com/us/TechnicalProductDetail.aspx?prodno=R4807&siteid=27&catid1=508.

5. J. Clough, Principles of Cybercrime, Cambridge Univ. Press, 2010.

6. A. Toor, FBI says video Barbie girl could be used for ‘child pornography production’, Dec. 2010, [online] Available: http://www.switched.com/2010/12/03/fbi-video-barbie-girl-could-be-used-for-child-pornography/.

7. FBI memo raises Barbie child pornography fears, BBC News, Dec. 2010.

8. M. Martinez, FBI: New Barbie ‘Video Girl’ doll could be used for child porn, CNN, Dec. 2010.

9. D. M. Hughes, "The use of new communications and information technologies for sexual exploitation of women and children", Hastings Women's Law J., vol. 13, pp. 127, 2002.

10. Canon 7D vs. Barbie Video Girl, Dec. 2010, [online] Available: http://www.youtube.com/watch?v=uLmgXk4RlOc.

11. A. Hayes, FBI pornography Barbie, Dec. 2010, [online] Available: http://uberveillance.com/blog/2010/12/30/fbi-pornography-barbie.html?rq=barbie.

12. S. Fox, "FBI target new Barbie as child pornography threat", LiveScience, Sept. 2015, [online] Available: http://www.livescience.com/10319-fbi-targets-barbie-child-pornography-threat.html.

13. K. Michael, "The FBI's cybercrime alert on Mattel's Barbie video girl: A possible method for the production of child pornography or just another point of view", Conf. Cybercrime and Int. Criminal Cooperation, 2011-May-19–20.

14. K. Michael, M. G. Michael, "Point of view technologies in law enforcement" in The Social Implications of National Security, Sydney Univ., 2012. 

15. K. Michael, A. Hayes, "WORKSHOP | Body worn video recorders: The socio-technical implications of gathering direct evidence", INFORMA Police Technology Forum 2013, 2013-Mar.

16. K. Michael, "Wearables and lifeblogging: The socioethical implications", IEEE Consumer Electron. Mag., vol. 4, no. 2, pp. 80, 2015.

17. Mattel's Vidster is for kids, Sept. 2015, [online] Available: http://gizmodo.com/124713/mattels-vidster-is-for-kids.

18. VideoGirl, May 2011, [online] Available: http://www.barbie.com/videogirl/.

19. OMG Video Girl!, May 2011, [online] Available: http://www.youtube.com/watch?v=kSCfbSKSxMc.

20. "TimeToPlayMag", Barbie video girl doll from Mattel, [online] Available: http://www.youtube.com/watch?v=YKqrTycSHIQ&NR=1&feature=fvwp.

21. P. Carr, Feds finally closing the net on America's most wanted Barbie (since Klaus), May 2013, [online] Available: http://techcrunch.com/2010/12/03/you-can-brush-my-hair-arrest-me-anywhere/.

22. "Hello Barbie™ Doll—Light brown hair", Mattel Shop, Sept. 2015, [online] Available: http://shop.mattel.com/product/index.jsp?productId=71355596.

23. J. Steinberg, This new toy records your children's private moments—Buyer beware, Forbes, Mar. 2015.

24. High-tech Barbie sparks privacy concerns parental backlash, ABC News, Sept. 2015.

25. N. Grimm, Samsung warns customers new Smart TVs “listen in” on users' personal conversations, ABC News, Mar. 2015.

26. Introducing Amazon Echo, Dec. 2015, [online] Available: https://www.youtube.com/watch?v=KkOCeAtKHIc.

27. Amazon Echo, Sept. 2015, [online] Available: http://www.amazon.com/gp/product/B00X4WHP5E?*Version*=1&*entries*=0.

28. L. Whitney, Google closes $3.2 billion purchase of Nest, C|NET, Feb. 2014.

29. G. Kumpara, Google and NEST acquire Dropcam for $555 million, TechCrunch, June 2014.

30. My Friend Cayla, Sept. 2015, [online] Available: http://www.myfriendcayla.com/.

31. "MovieClips Extras", Child's play behind the scenes—Making a nightmare (1988)—HD, Sept. 2015, [online] Available: https://www.youtube.com/watch?v=2EUwq9acGB8.

32. D. Moye, Talking Doll Cayla hacked to spew filthy things, Huffington Post, Sept. 2015.

33. N. Oakley, My Friend Cayla doll can be HACKED warns expert—Watch kids' toy quote 50 Shades and Hannibal, Sept. 2015, [online] Available: http://www.mirror.co.uk/news/technology-science/technology/friend-cayla-doll-can-hacked-5110112.

34. K. Albrecht, L. McIntyre, "Privacy nightmare: When baby monitors go bad", IEEE Technol. Soc. Mag., vol. 34, no. 3, pp. 14-19, 2015.

35. K. Goldberg, "Cloud Robotics Intro", Talks at Google, Sept. 2015, [online] Available: https://www.youtube.com/watch?v=IzUXT3_7tWc.

36. K. Darling, Kate Darling on near-term ethical legal and societal issues in robotics, Berkman Centre, Sept. 2015.

37. Meet Cayla, My Friend Cayla, Sept. 2015, [online] Available: http://myfriendcayla.co.uk/cayla.

Keywords: Cameras, Sensors, Consumer electronics, Motion pictures, Computer crime, YouTube, Context, social aspects of automation, cloud computing, Internet of Things, children toys, high-tech child play, cloud, embedded sensors, civil infrastructure, computing devices

Citation: Katina Michael, Alexander Hayes, High-Tech Child's Play in the Cloud: Be safe and aware of the difference between virtual and real, IEEE Consumer Electronics Magazine (Volume 5, Issue 1, Jan. 2016), pp. 123–128, Date of Publication: 11 December 2015. DOI: 10.1109/MCE.2015.2484878