Implantable Medical Device Tells All: Uberveillance Gets to the Heart of the Matter

In 2015, I provided evidence at an Australian inquiry into the use of subsection 313(3) of the Telecommunications Act 1997 by government agencies to disrupt the operation of illegal online services [1]. I stated to the Standing Committee on Infrastructure and Communications that mandatory metadata retention laws meant blanket coverage surveillance for Australians and visitors to Australia. The intent behind asking Australian service providers to keep subscriber search history data for up to two years was to grant government and law enforcement organizations the ability to search Internet Protocol–based records in the event of suspected criminal activity.

Importantly, I told the committee that, while instituting programs of surveillance through metadata retention laws would likely help to speed up criminal investigations, we should also note that every individual is a consumer, and such programs ultimately come back to bite innocent people through some breach of privacy or security. Enter the idea of uberveillance, which, I told the committee, is “exaggerated surveillance” that allows for interference [1] and that I believe is a threat to our human rights [2]. I strongly advised that invoking section 313 of the Telecommunications Act 1997 requires judicial oversight through the process of a search warrant. My recommendations fell on deaf ears, and, today, we even have the government deliberating over whether it should relax metadata laws to allow information to be accessed for both criminal and civil litigation [3], which includes divorces, child custody battles, and business disputes. In June 2017, Australian Prime Minister Malcolm Turnbull even stated that “global social media and messaging companies” need to assist security services’ efforts to fight terrorism by “providing access to encrypted communications” [52].

Consumer Electronics Leave Digital Data Footprints

Of course, Australia is not alone in having metadata retention laws. Numerous countries have adopted these laws or similar directives since 2005, keeping certain types of data for anywhere from 30 days to indefinitely, although the standard length is between one and two years. For example, since 2005, Italy has retained subscriber information at Internet cafes for 30 days. I recall traveling to Verona in 2008 for the European Conference on Information Systems, forgetting my passport in my hotel room, and being unable to use an Internet cafe to send a message back home because I was carrying no recognized identity information. When I asked why I was unable to send a simple message, I was handed an antiterrorism information leaflet. Italy also retains telephone data for up to two years and Internet service provider (ISP) data for up to 12 months.

Similarly, the United Kingdom retains all telecommunications data for one to two years. It also maintains postal information (sender, receiver data), banking data for up to seven years, and vehicle movements for up to two years. In Germany, metadata retention was established in 2008 under the law Gesetz zur Neuregelung der Telekommunikationsüberwachung und anderer verdeckter Ermittlungsmaßnahmen sowie zur Umsetzung der Richtlinie 2006/24/EG, but it was overturned in 2010 by the Federal Constitutional Court of Germany, which ruled the law unconstitutional because it violated the fundamental right to secrecy of correspondence. In 2015, data retention was revisited, and a compromise was reached to retain telecommunications metadata for up to ten weeks. Mandatory data retention in Sweden was challenged by one holdout ISP, Bahnhof, which was threatened with an approximately US$605,000 fine in November 2014 if it did not comply [4]. Bahnhof defended its stance to protect the privacy and integrity of its customers by offering a no-logs virtual private network free of charge [5].

Some European Union countries have been deliberating whether to extend metadata retention to chats and social media, but, in the United States, many corporations voluntarily retain subscriber data, including market giants Amazon and Google. It was reported in The Guardian in 2013 that the United States records Internet metadata for not only itself but the world at large through the National Security Agency (NSA), which uses its MARINA database to conduct pattern-of-life analysis [6]. Additionally, the 2008 Amendments Act to the Foreign Intelligence Surveillance Act of 1978 increased the time allotted for warrantless surveillance and added provisions for emergency eavesdropping. Under section 702 of the Foreign Intelligence Surveillance Act of 1978 Amendments Act, all American citizens’ metadata is now stored. Phone records are kept by the NSA in the MAINWAY telephony metadata collection database [53], and short message service and other text messages worldwide are retained in DISHFIRE [7], [8].

Emerging Forms of Metadata in an Internet of Things World

Figure 1. An artificial pacemaker (serial number 1723182) from St. Jude Medical, with electrode, which was removed from a deceased patient prior to cremation. (Photo courtesy of Wikimedia Commons.)

The upward movement toward a highly interconnected world through the Web of Things and people [9] will only mean that even greater amounts of data will be retained by corporations and government agencies around the world, extending beyond traditional forms of telecommunications data (e.g., phone records, e-mail correspondence, Internet search histories, metadata of images, videos, and other forms of multimedia). It should not surprise us that even medical devices are being touted as soon to be connected to the Internet of Things (IoT) [10]. Heart pacemakers, for instance, already send a steady stream of data back to the manufacturer’s data warehouse (Figure 1). Cardiac rhythmic data is stored on the implantable cardioverter-defibrillator’s (ICD’s) memory and is transmitted wirelessly to a home bedside monitor. Via a network connection, the data find their way to the manufacturer’s data store (Figure 2).

Figure 2. The standard setup for an EKG. A patient lies in a bed with EKG electrodes attached to his chest, upper arms, and legs. A nurse oversees the painless procedure. The ICD in a patient produces an EKG (A), which can automatically be sent to an ICD manufacturer's data store (B). (Image courtesy of Wikimedia Commons.)

In health speak, the ICD setup in the patient’s home is a type of remote monitoring that usually happens when the ICD recipient is in a state of rest, most often while sleeping overnight. It is a bit like how routine computer data backups happen, when network traffic is at its lowest. In the future, an ICD’s proprietary firmware updates may well travel back down to the device, remotely from the manufacturer, much like installing a Windows operating system update on a desktop. In the following section, we will explore the implications of access to personal cardiac data emanating from heart pacemakers in two cases.

CASE 1: HUGO CAMPOS DENIED ACCESS TO HIS PERSONAL CARDIAC DATA

Figure 3. A conventional radiograph of a single-chamber pacemaker. (Photo courtesy of Wikimedia Commons.)

In 2007, scientist Hugo Campos collapsed at a train station and was later horrified to find out that he had to get an ICD for his genetic heart condition. ICDs usually last about seven years before they require replacement (Figure 3). A few years into wearing the device, Campos, a high-end quantified-self user who measured his sleep, exercise, and even alcohol consumption, became inquisitive about how he might gain access to the data generated by his ICD (Figure 4). He made some requests to the ICD’s manufacturer and was told that he was unable to receive the information he sought, despite his doctor having full access. Some doctors could even remotely download a patient’s historical data on a mobile app for 24/7 support during emergency situations (Figure 5). Campos’s heart specialist did grant him access to written interrogation reports, but Campos only saw him about once every six months after his condition stabilized. Additionally, the logs were of no consequence to him on paper, and the fields and layout were predominantly decipherable only by a doctor (Figure 6).

Figure 4. The Nike FuelBand is a wearable computer that has become one of the most popular devices driving the so-called quantified-self trend. (Photo courtesy of Wikimedia Commons.)

Dissatisfied with his denied access, Campos took matters into his own hands and purchased a device on eBay that could help him get the data. He also attended a specialist ICD course and then intercepted the cardiac rhythms being recorded [11]. He got to the data stream but realized that, to make sense of it from a patient perspective, a patient-centric app had to be built. Campos quickly deduced that regulatory and liability concerns were at the heart of the matter from the manufacturer’s perspective. How does a manufacturer continue to improve its product if it does not continually get feedback from the actual ICDs in the field? If manufacturers offered mobile apps for patients, might patients misread their own diagnoses? Is a manufacturer there to enhance life alone or to make a patient feel better about bearing an ICD? Can an ICD be misused by a patient? Or, in the worst-case scenario, what happens in the case of device failure? Or patient death? Would the proof lie onboard? Would the data tell the true story? These are all very interesting questions.

Figure 5. The medical waveform format encoding rule software on a BlackBerry device. It displays medical waveforms, such as EKG (shown), electroencephalogram, and blood pressure. Some doctors have software that allows them to interrogate EKG information, but patients presently do not have access to their own ICD data. (Photo courtesy of Wikimedia Commons.)

Campos might well have acted not only to get what he wanted (access to his data his own way) but also to raise awareness globally as to the type of data being stored remotely by ICDs in patients. He noted in his TEDxCambridge talk in 2011 [12]:

the ICD does a lot more than just prevent a sudden cardiac arrest: it collects a lot of data about its own function and about the patient’s clinical status; it monitors its own battery life; the amount of time it takes to deliver a life-saving shock; it monitors a patient’s heart rhythm, daily activity; and even looks at variations in chest impedance to look if there is build-up of fluids in the chest; so it is a pretty complex little computer you have built into your body. Unfortunately, none of this invaluable data is available to the patient who originates it. I have absolutely no access to it, no knowledge of it.

Doctors, on the other hand, have full 24/7 unrestricted access to this information; even some of the manufacturers of these medical devices offer the ability for doctors to access this information through mobile devices. Compare this with the patients’ experience who have no access to this information. The best we can do is to get a printout or a hardcopy of an interrogation report when you go into the doctor’s office.

Figure 6. An EKG chart. Twelve different derivations of an EKG of a 23-year-old Japanese man. A similar log was provided to Hugo Campos upon his request for six months’ worth of EKG readings. (Photo courtesy of Wikimedia Commons.)

Campos decided to sue the manufacturer after he was informed that the data being generated from his ICD measuring his own heart activity was “proprietary data” [13]. Perhaps this is the new side of big data. But it is fraught with legal implications and, as far as I am concerned, blatantly dangerous. If we deduce that a person’s natural biometric data (in this instance, the cardiac rhythm of an individual) belong to a third party, then we are headed into murky waters when we speak of even more invasive technology like deep-brain stimulators [14]. It not only means that the device is not owned by the electrophorus (the bearer of technology) [15], [16], but quite possibly that the cardiac rhythms unique to the individual are also owned by the device manufacturer. We should not be surprised. In the “Software and Services” section of Google Glass’s terms of use, Google states that it has the right to “remotely disable or remove any such Glass service from user systems” at its “sole discretion” [17]. Placing this in the context of ICDs means that a third party all but irrevocably has the right to switch someone off.

CASE 2: ROSS COMPTON’S PACEMAKER DATA IS SUBPOENAED FOR CRIMINAL INVESTIGATIONS

Enter the Ross Compton case of Middletown, Ohio. M.G. Michael and I have dubbed it one of the first authentic uberveillance cases in the world, because the technology was not just wearable but embedded. The story goes something like this: On 27 January 2017, 59-year-old Ross Compton was indicted on arson and insurance fraud charges. Police gained a search warrant to obtain his heart pacemaker readings (heart and cardiac rhythms) and called his alibi into question. Data from Compton’s pacemaker before, during, and after the fire in his home broke out were disclosed by the heart pacemaker manufacturer after a subpoena was served. The insurer’s bill for the damage was estimated at about US$400,000. Police became suspicious of Compton when they traced gasoline to Compton’s shoes, trousers, and shirt.

In his statement of events to police, Compton told a story that conflicted with his 911 call. Forensic analysts found traces of multiple fires having been lit in various locations in the home. Yet Compton told police he had rushed his escape, breaking a window with his walking stick to throw some hastily packed bags out and then fleeing the flames himself to safety. Compton also told police that he had an artificial heart with a pump attached, a fact that he thought might help his cause but that was to be his undoing. In this instance, his pacemaker acted akin to a black box recording on an airplane [18].

After securing the heart pacemaker data set, an independent cardiologist was asked to assess the telemetry data and determine whether Compton’s heart function was commensurate with the exertion needed to make a break with personal belongings during a life-threatening fire [19]. The cardiologist noted that, based on the evidence he was given to interpret, it was “highly improbable” that a man suffering from Compton’s medical conditions could manage to collect, pack, and remove the number of items that he did from his bedroom window, escape himself, and then proceed to carry these items to the front of his house, out of harm’s way (see “Columbo, How to Dial a Murder”). Compton’s own cardio readings, in effect, snitched on him, and none were happier than the law enforcement officer in charge of the case, Lieutenant Jimmy Cunningham, who noted that the pacemaker data, while only a supporting piece of evidence, was vital in proving Compton’s guilt after gasoline was found on his clothing. Evidence-based policing has now well outstripped the more traditional intelligence-led policing approach and has become entrenched given the new realm of big data availability [20], [21].

Columbo, How to Dial a Murder [S1]

Columbo says to the murderer: “You claim that you were at the physician’s getting your heart examined…which was true [Columbo unravels a roll of EKG readings]…the electrocardiogram, Sir. Just before three o’clock your physician left you alone for a resting trace. At that moment you were lying down in a restful position and your heart showed a calm, slow, easy beat [pointing to the EKG readout]. Look at this part, right here [Columbo points to the reading], lots of sudden stress, lots of excitement, right here at three o’clock, your heart beating like a hammer just before the dogs attacked…Oh you killed him with a phone call, Sir…I’ll bet my life on it. Very simple case. Not that I’m particularly bright, Sir…I must say, I found you disappointing, I mean your incompetence, you left enough clues to sink a ship. Motive. Opportunity. And for a man of your intelligence Sir, you got caught on a lot of stupid lies. A lot.”

[S1] Columbo: How to Dial a Murder. Directed by James Frawley. 1978. Los Angeles, CA: Universal Pictures Home Entertainment, 2006. DVD.

Consumer Electronics Tell a Story

Several things are now of interest to the legal community: first and foremost, how is the search warrant for a person’s pacemaker data executed? In case 1, Campos was denied access to his own ICD data stream by the manufacturer, and yet his doctor had full access. In case 2, Compton’s own data provided authorities with the extra evidence they needed to accuse him of fraud. This is yet another example of seemingly private data being used against an individual (in this instance, the person from whose body the data emanated), but, in the future, the data from one person’s pacemaker might well implicate other members of the public. For example, the pacemaker might be able to prove that someone’s heart rate substantially increased during an episode of domestic violence [22] or that an individual was unfaithful in a marriage based on the cross-matching of his or her time stamp and heart rate data with another person’s.

Of course, a consumer electronics device does not have to be embedded to tell a story (Figure 7). It can also be wearable or luggable, as in the case of a Fitbit that was used as a truth detector in an alleged rape case that turned out to be completely fabricated [23]. Lawyers are now beginning to experiment with other wearable gadgetry that helps to show the impact of personal injury cases from accidents (work and nonwork related) on a person’s ability to return to his or her normal course of activities [24] (Figure 8). We can certainly expect to see a rise in criminal and civil litigation that makes use of a person’s Samsung S Health data, for instance, which measure things like steps taken, stress, heart rate, SpO2, and even location and time (Figure 9). But cases like Compton’s open the floodgates.

Figure 7. A Fitbit, which measures calories, steps, distance, and floors. (Photo courtesy of Wikimedia Commons.)

Figure 8. A close-up of a patient wearing the iRhythm ZIO XT patch, nine days after its placement. (Photo courtesy of Wikimedia Commons.)

I have pondered on the evidence itself: are heart rate data really any different from other biometric data, such as deoxyribonucleic acid (DNA)? Is it perhaps more revealing than DNA? Should it be dealt with in the same way? For example, is the chain of custody coming from a pacemaker equal to that of a DNA sample and profile? In some way, heart rates can be considered a behavioral biometric [25], whereas DNA is actually a cellular sample [26]. No doubt we will be debating the challenges, and extreme perspectives will be hotly contested. But it seems nothing is off limits. If it exists, it can be used for or against you.

Figure 9. (a) and (b) The health-related data from Samsung's S Health application. Unknown to most is that Samsung has diversified its businesses to become a parent company of one of the world's largest health insurers. (Photos courtesy of Katina Michael.)

The Paradox of Uberveillance

In 2006, M.G. Michael coined the term uberveillance to denote “an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices in the human body” [27]. No doubt Michael’s background as a police officer in the early 1980s, together with his cross-disciplinary studies, had something to do with his insights into the creation of the term [28]. This kind of surveillance does not watch from above; rather, it penetrates the body and watches from the inside, looking out [29].

Furthermore, uberveillance “takes that which was static or discrete…and makes it constant and embedded” [30]. It is real-time location and condition monitoring and “has to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought)” [30]. Uberveillance can be used prospectively or retrospectively. It can be applied as a “predictive mechanism for a person’s expected behavior, traits, likes, or dislikes; or it can be based on historical fact” [30].

In 2008, the term uberveillance was entered into the official Macquarie Dictionary of Australia [31]. In research that has spanned more than two decades on the social implications of implantable devices for medical and nonmedical applications, I predicted [15] that the technological trajectory of implantable devices that were once used solely for care purposes would one day be used retrospectively for tracking and monitoring purposes. Even if the consumer electronics in question were there to provide health care (e.g., the pacemaker example) or convenience (e.g., a near-field-communication-enabled smartphone), the underlying dominant function of the service would be control [32]. The socioethical implications of pervasive and persuasive emerging technologies have yet to really be understood, but increasingly, they will emerge to take center stage in court hearings, like the emergence of DNA evidence and then subsequently global positioning system (GPS) data [33].

Medical device implants provide a very rich source of human activity monitoring, such as the electrocardiogram (EKG), heart rate, and more. Companies like Medtronic, among others specializing in implantables, have proposed a future where even healthy people carry a medical implant packed with sensors that could be life sustaining and detect heart problems (among others), reporting them to a care provider and signaling when assistance might be required [34]. Heart readings provide an individual’s rhythmic biometrics and, at the same time, can record increases and decreases in activity. One could extrapolate that it won’t be long before our health insurance providers are asking for the same evidence in exchange for reduced premiums.

Figure 10. A pacemaker cemetery. (Photo courtesy of Wikimedia Commons.)

The future might well be one where we all carry a black box implantable recorder of some sort [35], an alibi that proves our innocence or guilt, minute by minute (Figure 10). Of course, an electronic eye constantly recording our every move brings a new connotation to the wise words expressed in the story of Pinocchio: always let your conscience be your guide. The future black boxes may not be as forgiving as Jiminy Cricket and may be more like Black Mirror’s “The Entire History of You” [36]. But if we assume that these technologies, whether implantable, wearable, or even luggable, are to be completely trusted, then we are wrong.

The contribution of M.G. Michael’s uberveillance is in the emphasis that the uberveillance equation is a paradox. Yes, there are near-real-time data flowing continuously from more points of view than ever [37], closed-circuit TV looking down, smartphones in our pockets recording location and movement, and even implantables in some of us ensuring nontransferability of identity [38]. The proposition is that all this technology in sum total is bulletproof and foolproof, omniscient and omnipresent, a God’s eye view that cannot be challenged, but for the fact that the infrastructure, the devices, and the software are all too human. And while uberveillance is being touted for good through an IoT world that will collectively make us and our planet more sustainable, there is one big crack in the utopian vision: the data can misrepresent, misinform, and be subject to information manipulation [39]. Researchers are already studying the phenomenon of complex visual information manipulation: how to tell whether data have been tampered with, whether a suspect has been introduced into or removed from a scene of a crime, and other forensic visual analytics [40]. It is why Vladimir Radunovic, director of cybersecurity and e-diplomacy programs at the DiploFoundation, cited M.G. Michael’s contribution that “big data must be followed by big judgment” [41].

What happens in the future if we go down the path of constant bodily monitoring of vital organs and vital signs, where we are all bearing some device or at least wearing one? Will we be in control of our own data, or, as seems obvious at present, will we not be in control? And how might self-incrimination play a role in our daily lives? Or, even worse, might we feel compelled to perform to a theater 24/7 so that our health statistics stand up to whatever measure and cross-examination they are put under, personally or publicly [42]? Can we believe the authenticity of every data stream coming out of a sensor onboard consumer electronics? The answer is no.

Having run many years of GPS data-logging experiments, I can say that a lot can go wrong with sensors, and they are susceptible to outside environmental conditions. For instance, they can log your location miles away (even on another continent), the temperature gauge can play up, time stamps can revert to different time zones, the speed of travel can be wildly inaccurate due to propagation delays in satellites, readings may not occur at regular intervals due to some kind of interference, and memory overflow and battery issues, while getting better, are still problematic. The long and short of it is that technology cannot be trusted. At best, it can act as supporting evidence but should never replace eyewitness accounts. Additionally, “the inherent problem with uberveillance is that facts do not always add up to truth (i.e., as in the case of an exclusive disjunction T + T = F), and predictions based on uberveillance are not always correct” [30].

Conclusion

While device manufacturers are challenging in the courts the possibility that their ICDs are hackable [43], highly revered security experts like Bruce Schneier are cautioning strongly against going down the IoT path, no matter how inviting it might look. In his acclaimed blog, Schneier recently wrote [44]:

All computers are hackable…The industry is filled with market failures that, until now, have been largely ignorable. As computers continue to permeate our homes, cars, businesses, these market failures will no longer be tolerable. Our only solution will be regulation, and that regulation will be foisted on us by a government desperate to “do something” in the face of disaster…We also need to reverse the trend to connect everything to the internet. And if we risk harm and even death, we need to think twice about what we connect and what we deliberately leave uncomputerized. If we get this wrong, the computer industry will look like the pharmaceutical industry, or the aircraft industry. But if we get this right, we can maintain the innovative environment of the internet that has given us so much.

The cardiac implantables market is predicted to reach US$43 billion by 2020 [45]. Obviously, the stakes are high and getting higher with every breakthrough implantable innovation we develop and bring to market. We will need to address some very pressing questions, as Schneier suggests, through some form of regulation if we are to maintain consumer privacy rights and data security. Joe Carvalko, a former telecommunications engineer and U.S. patent attorney, an associate editor of IEEE Technology and Society Magazine, and a pacemaker recipient, has added much to this discussion already [46], [47]. I highly recommend several of his publications, including “Who Should Own In-the-Body Medical Data in the Age of eHealth?” [48] and an ABA publication coauthored with Cara Morris, The Science and Technology Guidebook for Lawyers [49]. Carvalko is a thought leader in this space, and I encourage you to listen to his podcast [50] and also to read his speculative fiction novel, Death by Internet [51], which is hot off the press and wrestles with some of the issues raised in this article.

REFERENCES

[1] K. Michael, M. Thistlethwaite, M. Rowland, and K. Pitt. (2015, Mar. 6). Standing Committee on Infrastructure and Communications, Section 313 of the Telecommunications Act 1997. [Online]. Available: http://parlinfo.aph.gov.au/parlInfo/search/display/display.w3p;db=COMMITTEES;id=committees%2Fcommrep%2Fd8727a07-ba09-4a91-9920-73d21e446d1d%2F0006;query=Id%3A%22committees%2Fcommrep%2Fd8727a07-ba09-4a91-9920-73d21e446d1d%2F0000%22

[2] S. Bronitt and K. Michael, “Human rights, regulation, and national security,” IEEE Technol. Soc. Mag., vol. 31, pp. 15–16, 2012.

[3] B. Hall. (2016, Dec. 22). Australians’ phone and email records could be used in civil lawsuits. Sydney Morning Herald. [Online]. Available: http://www.smh.com.au/federal-politics/political-news/australians-phone-and-email-records-could-be-used-in-civil-lawsuits-20161222-gtgdy6.html

[4] PureVPN. (2015, Oct. 14). Data retention laws—an update. [Online]. Available: https://www.purevpn.com/blog/data-retention-laws-by-countries/

[5] D. Crawford. (2014, Nov. 18). Renegade Swedish ISP offers all customers VPN. Best VPN. [Online]. Available: https://www.bestvpn.com/blog/11806/renegade-swedish-isp-offers-customers-vpn/

[6] J. Ball. (2013, Oct. 1). NSA stores metadata of millions of web users for up to a year, secret files show. Guardian. [Online]. Available: https://www.theguardian.com/world/2013/sep/30/nsa-americans-metadata-year-documents

[7] J. S. Granick, American Spies: Modern Surveillance, Why You Should Care, and What to Do About It. Cambridge, U.K.: Cambridge Univ. Press, 2017.

[8] A. Gregory, American Surveillance: Intelligence, Privacy, and the Fourth Amendment. Madison: Univ. of Wisconsin Press, 2016.

[9] K. Michael, G. Roussos, G. Q. Huang, A. Chattopadhyay, R. Gadh, B. S. Prabhu, and P. Chu, “Planetary-scale RFID services in an age of uberveillance,” Proc. IEEE, vol. 98, no. 9, pp. 1663–1671, 2010.

[10] N. Lars. (2015, Mar. 26). Connected medical devices, apps: Are they leading the IOT revolution—or vice versa? Wired. [Online]. Available: https://www.wired.com/insights/2014/06/connected-medical-devices-apps-leading-iot-revolution-vice-versa/

[11] H. Campos. (2015). The heart of the matter. Slate. [Online]. Available: http://www.slate.com/articles/technology/future_tense/2015/03/patients_should_be_allowed_to_access_data_generated_by_implanted_devices.html

[12] H. Campos. (2011). Fighting for the right to open his heart data: Hugo Campos at TEDxCambridge 2011. [Online]. Available: https://www.youtube.com/watch?v=oro19-l5M8k

[13] D. Hinckley. (2016, Feb. 22). This big brother/big data business goes way beyond Apple and the FBI. Huffington Post. [Online]. Available: http://www.huffingtonpost.com/david-hinckley/this-big-brotherbigdata_b_9292744.html

[14] K. Michael, “Mental health, implantables, and side effects,” IEEE Technol. Soc. Mag., vol. 34, no. 2, pp. 5–17, 2015.

[15] K. Michael, “The technological trajectory of the automatic identification industry: The application of the systems of innovation (SI) framework for the characterisation and prediction of the auto-ID industry,” Ph.D. dissertation, School of Information Technology and Computer Science, Univ. of Wollongong, Wollongong, Australia, 2003.

[16] K. Michael and M. G. Michael, “Homo electricus and the continued speciation of humans,” in The Encyclopedia of Information Ethics and Security, M. Quigley, Ed. Hershey, PA: IGI Global, 2007, pp. 312–318.

[17] Google Glass. (2014, Aug. 19). Glass terms of use. [Online]. Available: https://www.google.com/glass/termsofuse/

[18] K. Michael and M. G. Michael, “Implementing ‘namebers’ using microchip implants: The black box beneath the skin,” in This Pervasive Day: The Potential and Perils of Pervasive Computing, J. Pitt, Ed. London, U.K.: Imperial College Press, 2011.

[19] D. Smith. (2017, Feb. 4). Pacemaker data used to charge alleged arsonist. Jonathan Turley. [Online]. Available: https://jonathanturley.org/2017/02/04/pacemaker-data-used-to-charge-alleged-arsonist/

[20] K. Michael, “Big data and policing: The pros and cons of using situational awareness for proactive criminalisation,” presented at the Human Rights and Policing Conf., Australian National University, Canberra, Apr. 16, 2013.

[21] K. Michael and G. L. Rose, “Human tracking technology in mutual legal assistance and police inter-state cooperation in international crimes,” in From Dataveillance to Überveillance and the Realpolitik of the Transparent Society (The Second Workshop on Social Implications of National Security), K. Michael and M. G. Michael, Eds. Wollongong, Australia: University of Wollongong, 2007.

[22] F. Gerry, “Using data to combat human rights abuses,” IEEE Technol. Soc. Mag., vol. 33, no. 4, pp. 42–43, 2014.

[23] J. Gershman. (2016, Apr. 21). Prosecutors say Fitbit device exposed fibbing in rape case. Wall Street Journal. [Online]. Available: http://blogs.wsj.com/law/2016/04/21/prosecutors-say-fitbit-device-exposed-fibbing-in-rape-case/

[24] P. Olson. (2014, Nov. 16). Fitbit data now being used in the courtroom. Forbes. [Online]. Available: https://www.forbes.com/sites/parmyolson/2014/11/16/fitbit-data-court-room-personal-injury-claim/#459434e37379

[25] K. Michael and M. G. Michael, “The social and behavioural implications of location-based services,” J. Location Based Services, vol. 5, no. 3–4, pp. 121–137, Sept.–Dec. 2011.

[26] K. Michael, “The European court of human rights ruling against the policy of keeping fingerprints and DNA samples of criminal suspects in Britain, Wales and Northern Ireland: The case of S. and Marper v United Kingdom,” in The Social Implications of Covert Policing (Workshop on the Social Implications of National Security, 2009), S. Bronitt, C. Harfield, and K. Michael, Eds. Wollongong, Australia: University of Wollongong, 2010, pp. 131–155.

[27] M. G. Michael and K. Michael, “National security: The social implications of the politics of transparency,” Prometheus, vol. 24, no. 4, pp. 359–364, 2006.

[28] M. G. Michael, “On the ‘birth’ of uberveillance,” in Uberveillance and the Social Implications of Microchip Implants, M. G. Michael and K. Michael, Eds. Hershey, PA: IGI Global, 2014.

[29] M. G. Michael and K. Michael, “A note on uberveillance,” in From Dataveillance to Überveillance and the Realpolitik of the Transparent Society (The Second Workshop on Social Implications of National Security), M. G. Michael and K. Michael, Eds. Wollongong, Australia: University of Wollongong, 2007.

[30] M. G. Michael and K. Michael, “Toward a state of uberveillance,” IEEE Technol. Soc. Mag., vol. 29, pp. 9–16, 2010.

[31] M. G. Michael and K. Michael, “Uberveillance,” in Fifth Edition of the Macquarie Dictionary, S. Butler, Ed. Sydney, Australia: Sydney University, 2009.

[32] A. Masters and K. Michael, “Lend me your arms: The use and implications of humancentric RFID,” Electron. Commerce Res. Applicat., vol. 6, no. 1, pp. 29–39, 2007.

[33] K. D. Stephan, K. Michael, M. G. Michael, L. Jacob, and E. P. Anesta, “Social implications of technology: The past, the present, and the future,” Proc. IEEE, vol. 100, pp. 1752–1781, 2012.

[34] E. Strickland. (2014, June 10). Medtronic wants to implant sensors in everyone. IEEE Spectrum. [Online]. Available: http://spectrum.ieee.org/tech-talk/biomedical/devices/medtronic-wants-to-implant-sensors-in-everyone

[35] K. Michael, “The benefits and harms of national security technologies,” presented at the Int. Women in Law Enforcement Conf., Hyderabad, India, 2015.

[36] J. A. Brian Welsh. (2011). “The entire history of you,” Black Mirror, C. Brooker, Ed. [Online]. Available: https://www.youtube.com/watch?v=Sw3GIR70HAY

[37] K. Michael, “Sousveillance and point of view technologies in law enforcement,” presented at the Sixth Workshop on the Social Implications of National Security: Sousveillance and Point of View Technologies in Law Enforcement, University of Sydney, Australia, 2012.

[38] K. Albrecht and K. Michael, “Connected: To everyone and everything,” IEEE Technology and Soc. Mag., vol. 32, pp. 31–34, 2013.

[39] M. G. Michael, “The paradox of the uberveillance equation,” IEEE Technol. Soc. Mag., vol. 35, no. 3, pp. 14–16, 20, 2016.

[40] K. Michael, “The final cut—tampering with direct evidence from wearable computers,” presented at the Fifth Int. Conf. Multimedia Information Networking and Security (MINES 2013), Beijing, China, 2013.

[41] V. Radunovic, “Internet governance, security, privacy and the ethical dimension of ICTs in 2030,” IEEE Technol. Soc. Mag., vol. 35, no. 3, pp. 12–14, 2016.

[42] K. Michael. (2011, Sept. 12). The microchipping of people and the uberveillance trajectory. Social Interface. [Online]. Available: http://socialinterface.blogspot.com.au/2011/08/microchipping-of-people-and.html

[43] O. Ford. (2017, Jan. 12). Post-merger Abbott moves into 2017 with renewed focus, still faces hurdles. J.P. Morgan Healthcare Conf. 2017. [Online]. Available: http://www.medicaldevicedaily.com/servlet/com.accumedia.web.Dispatcher?next=bioWorldHeadlines_article&forceid=94497

[44] B. Schneier. (2017, Feb. 1). Security and the Internet of Things: Schneier on security. [Online]. Available: https://www.schneier.com/blog/archives/2017/02/security_and_th.html

[45] IndustryARC. (2015, July 30). Cardiac implantable devices market to reach $43 billion by 2020. GlobeNewswire. [Online]. Available: https://globenewswire.com/news-release/2015/07/30/756345/10143745/en/Cardiac-Implantable-Devices-Market-to-Reach-43-Billion-By-2020.html

[46] J. Carvalko, The Techno-Human Shell: A Jump in the Evolutionary Gap. Mechanicsburg, PA: Sunbury Press, 2013.

[47] J. Carvalko and C. Morris, “Crowdsourcing biological specimen identification: Consumer technology applied to health-care access,” IEEE Consum. Electron. Mag., vol. 4, no. 1, pp. 90–93, 2014.

[48] J. Carvalko, “Who should own in-the-body medical data in the age of ehealth?” IEEE Technol. Soc. Mag., vol. 33, no. 2, pp. 36–37, 2014.

[49] J. Carvalko and C. Morris, The Science and Technology Guidebook for Lawyers. New York: ABA, 2014.

[50] K. Michael and J. Carvalko. (2016, June 20). Joseph Carvalko speaks with Katina Michael on his non-fiction and fiction pieces. [Online]. Available: https://www.youtube.com/watch?v=p4JyVCba6VM

[51] J. Carvalko, Death by Internet. Mechanicsburg, PA: Sunbury Press, 2016.

[52] R. Pearce. (2017, June 7). “No-one’s talking about backdoors” for encrypted services, says PM’s cyber guy. Computerworld. [Online]. Available: https://www.computerworld.com.au/article/620329/no-onetalking-about-backdoors-says-pm-cyber-guy/

[53] M. Ambinder. (2013, Aug. 14). An educated guess about how the NSA is structured. The Atlantic. [Online]. Available: https://www.theatlantic.com/technology/archive/2013/08/an-educated-guess-about-how-the-nsa-is-structured/278697/

Acknowledgment

A short form of this article was presented as a video keynote speech for the Fourth International Conference on Innovations in Information, Embedded and Communication Systems in Coimbatore, India, on 17 March 2017. The video is available at https://www.youtube.com/watch?v=bEKLDhNfZio.

Keywords

Metadata, Electrocardiography, Pacemakers, Heart beat, Telecommunication services, Implants, Biomedical equipment, biomedical equipment, cardiology, criminal law, medical computing, police data processing, transport protocols, implantable medical device, heart, Australian inquiry, government agencies, illegal online services, mandatory metadata retention laws, government organizations, law enforcement organizations, Internet protocol

Citation: Katina Michael, 2017, "Implantable Medical Device Tells All: Uberveillance Gets to the Heart of the Matter", IEEE Consumer Electronics Magazine, Vol. 6, No. 4, Oct. 2017, pp. 107–115, DOI: 10.1109/MCE.2017.2714279.

 

Cloud Computing Data Breaches: A Socio-Technical Review of Literature

Abstract


As more and more personal, enterprise, and government data, services, and infrastructure move to the cloud for storage and processing, the potential for data breaches increases. Major corporations that have outsourced some of their IT requirements to the cloud have already become victims of cyber attacks. Who is responsible and how to respond to these data breaches are just two pertinent questions facing cloud computing stakeholders who have entered an agreement on cloud services. This paper reviews literature in the domain of cloud computing data breaches using a socio-technical approach. Socio-technical theory encapsulates three major dimensions: the social, the technical, and the environmental. The outcomes of the search are presented in a thematic analysis. The seven key themes identified from the literature are security, data availability, privacy, trust, data flow, service level agreements, and regulation. The paper considers complex issues, pre-empting the need for a better way to deal with breaches that affect not only the enterprise and the cloud computing provider but, more importantly, end-users who rely on online services and have had their credentials compromised.

Section I. Introduction

Traditionally, enterprise networks were managed by internal IT staff who had access to the underlying infrastructure that stored and processed organizational data. Cloud computing has emerged to overcome traditional barriers such as limited IT budgets, increased use of outdated technology, and the inability of corporations to expand IT infrastructure services to users when needed [1]. Cloud computing is Internet-based infrastructure and application service delivery through a controlled and manageable environment that is provided with a pay-as-you-go agreement structure. Cloud computing has acted to lower hardware and software costs [2]. Buyya et al. [3] analogize that cloud computing is similar to utility-based services such as water, electricity, gas, and telephony. Cloud computing allows for adjusting resources in an ad hoc manner for a predefined duration with minimal management effort [4]. Customers only pay for what is utilized in an affordable manner, and computing requirements can be scaled down when no longer needed [3].

While cloud computing is seen as a utility, [5] state that cloud computing models are undeveloped technology structures that have immense potential for improvement. This is despite the fact that [6] argues that cloud computing concepts are not new and that its models have been adopted from technologies such as time-sharing mainframes, clustering, and grid computing. Yet [7] elaborates that cloud computing technology is far more advanced than other technology, outpacing the regulatory environment because it transcends legal boundaries. For example, cloud computing has allowed data to reside somewhere other than the data owner's home location [8]. There are three layers generally acknowledged “as a service” within the cloud computing context: infrastructure, platform, and software. Business customers (e.g., online merchants) may opt for one or more cloud service layers depending on the needs of their company and the needs of end-users (i.e., the customer's customer).

A. Infrastructure as a Service

Infrastructure as a Service (IaaS) enables the cloud consumer to acquire and provision hardware infrastructure services through the use of cloud provider web interfaces [9]. Through an abstracted view of the hardware, consumers are able to provision infrastructure on a pay-as-you-go basis that can be adjusted in an ad hoc manner [10]. The IaaS delivery model also provides the ability to provision system images, scale storage and processing requirements, and define network topologies through the cloud provider's user interface management portal [10]. The infrastructure is offered through time-shared facilities that allow storage, processing, and network services to be utilized as a service [1]. According to [6, p. 44], IaaS “allows companies to essentially rent a data center environment without the need and worry to create and maintain the same data center footprint in their own company”.
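As a concrete illustration of the IaaS provisioning described above, the following minimal sketch uses the AWS boto3 SDK (an assumed provider and client library, not one drawn from the reviewed literature); the region and machine image identifier are placeholders.

    # Minimal IaaS provisioning sketch; assumes the AWS boto3 SDK, with
    # placeholder region and AMI ID (hypothetical values, not from the paper).
    import boto3

    ec2 = boto3.client("ec2", region_name="ap-southeast-2")

    # Rent a small virtual machine on a pay-as-you-go basis via the provider's API.
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # hypothetical machine image
        InstanceType="t2.micro",
        MinCount=1,
        MaxCount=1,
    )
    instance_id = response["Instances"][0]["InstanceId"]
    print("Provisioned instance:", instance_id)

    # Scale back down when the capacity is no longer needed.
    ec2.terminate_instances(InstanceIds=[instance_id])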

B. Platform as a Service

Platform as a Service (PaaS) enables cloud customers to deliver web-based applications to their users [8]. PaaS also provides cloud customers with facilities for on-demand web application utilization without the need to manage the underlying complex network infrastructure. According to [8, p. 49], the principal characteristics of PaaS are “services to develop, test, deploy, host, and manage applications to support the application development life cycle”. An added benefit of PaaS is that cloud customers can test developed applications without the need to utilize organizationally owned infrastructure [6].

C. Software as a Service

Software as a Service (SaaS) allows cloud customers to utilize software resources through a web-based user interface [8]. The SaaS model allows cloud customers to use software applications without the need to store, process, and maintain backend infrastructure and platform repositories [6]. The level of abstraction increases as cloud customers migrate from IaaS to SaaS delivery models; hence, responsibility is handed to cloud providers to handle the SaaS model [11]. Furthermore, [12] discuss multi-tenant SaaS architecture, in which tenants share common resources and underlying instances of both database and object code.
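To make the multi-tenancy point concrete, the sketch below (purely illustrative, using Python's standard sqlite3 module; the table and tenant names are hypothetical) shows the shared-schema style in which all tenants share one database instance and rows are partitioned by a tenant identifier.

    # Shared-schema multi-tenancy sketch; table and tenant names are hypothetical.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE invoices (
        tenant_id  TEXT NOT NULL,  -- which customer organization owns the row
        invoice_no INTEGER,
        amount     REAL
    )""")
    db.executemany("INSERT INTO invoices VALUES (?, ?, ?)",
                   [("acme", 1, 120.0), ("acme", 2, 80.0), ("globex", 1, 50.0)])

    # Every query issued by the SaaS application must be scoped to the tenant;
    # a missing tenant_id filter on shared infrastructure is exactly the kind of
    # isolation failure that can turn into a data breach.
    for row in db.execute("SELECT invoice_no, amount FROM invoices WHERE tenant_id = ?",
                          ("acme",)):
        print(row)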

Section II. Security

Several authors [13], [14], [15] agree that security concerns are among the biggest issues that must be addressed to enable growth in cloud computing services. The use of public clouds demands tighter restrictions that cloud providers must incorporate into their service models. Legal requirements that cloud providers must adhere to are yet to be standardized and, as a result, remain the biggest obstacle to continued substantial growth of the cloud model [14]. Svantesson and Clarke [5] emphasize that the issue of security within the cloud computing context should be reviewed rigorously by potential business customers and end-users before adoption to ensure that confidentiality, integrity, availability, and privacy policies are addressed by the provider.

A recent study [16] focuses on explaining the concerns over network boundaries in the cloud computing model, where the risk of attack is increased as a result of outdated security solutions. The continued usage of cloud computing will result in more devices being connected outside the traditional network boundary, which will in turn mean that the underlying data that is stored may be compromised. Similarly, [7] states that, whereas once a user was only allowed to log on if they were on the physical network, they can now log on from almost any device with a network connection. In traditional enterprise networks, organisations had access to security settings and configurations, whilst in the cloud computing model the network boundary is managed by the cloud provider.

Subashini and Kavitha [17, p. 3] state that “guaranteeing the security of corporate data in the cloud is difficult, if not impossible”. The state of cloud security is under stress, as security threats and vulnerabilities may not be noticed by the cloud customer and their end-users [18]. This in turn raises alarms for disaster recovery plans to be specified in service level agreements to avoid contract breaches. Kshetri [15] elaborates that security and privacy issues come to the fore as customers start to be concerned that data may be used without the explicit consent of the end-user. To complement the latter, [13] details further concerns, such as loss of control over data through malicious or non-malicious intent, an issue that can never be completely eradicated.

A. Cloud Computing Data Security Encryption Keys

There have been numerous studies seeking to secure the cloud computing model from risks and threats, but these have had little impact on the overall industry [19]. The outcome of [19] proposes key-based encryption through simulation modelling that allows data to be stored on cloud infrastructure, with participants accessing certain data according to their encryption key permission. To critique the former, [20] state that security mechanisms that involve encryption key solutions degrade performance levels and do not meet scalability requirements in cloud computing environments. Given the performance and scalability issues of encryption keys, Esayas [13] also elaborates that encryption keys might not meet business requirements, as the effectiveness of such a technique is not suitable for all cloud computing services. Implementing security is essential in overall cloud models, although it increases overheads that diminish the return and benefits.
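The sketch below illustrates the general technique of key-based protection of data before it reaches cloud storage; it assumes the third-party Python cryptography package and is an illustration of the idea, not the specific scheme proposed in [19].

    # Client-side, key-based encryption sketch; assumes the "cryptography" package.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # held by the data owner, never by the provider
    cipher = Fernet(key)

    record = b"customer-id=1723, balance=135.50"
    ciphertext = cipher.encrypt(record)  # only the ciphertext is uploaded to the cloud

    # A participant holding the correct key can recover the plaintext; if the key
    # is mismanaged or lost, the data cannot be decrypted -- the operational risk
    # noted by [13] and [20].
    assert cipher.decrypt(ciphertext) == record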

Many industry and academic experts state that encryption keys will pave the way for secure cloud computing, from a perpetrator and insider-attack point of view. Insider breaches are becoming more common, as attacks may be deliberate or simply the result of administrative error [21]. In review, [18] detail that traditional security solutions change when enterprises adopt cloud computing and that existing encryption standards are outdated for the cloud computing model, inhibiting their effective use for privacy protection. The ability to overcome these issues will give the cloud customer peace of mind with respect to data integrity and end-user confidence [22].

Another approach to encryption-key security has been demonstrated using simulation to protect data from unauthorized access and violation. The concept introduces data coloring to protect different types of data. This distorts the original data, and only owners who have the same color key can view the data. Yet [13] and [20] argue that security keys are highly volatile in cloud environments, in that if the decryption key is mismanaged, the data cannot be decrypted. As further criticism, the simulation-based data coloring security solution has limited overall usefulness, as it relies on simplistic arithmetic calculations [13].

B. Disadvantages of Traditional Security Practices in Cloud

Perpetrator and insider attacks are considered high-impact security threats to cloud computing. Pek et al. [23] detail that security issues are not being offset by either hardware or software protocols. In assessment, [17] surveys existing traditional security solutions and concludes that cloud data needs higher levels of security to overcome vulnerabilities and threats. Traditional security models such as intrusion detection systems, intrusion prevention systems, and network firewalls do not effectively address the security issues that are being experienced in the cloud computing model [17].

Salah [24] introduces a proof-of-concept cloud-based network security overlay. For this simulation, Salah [24] uses intrusion detection systems, network firewalls, and anti-virus tools that are intended for cloud environments. The results demonstrate that significant cost savings can be achieved with this implementation, although network latency and increased network bandwidth utilization are recorded. [25] emphasize that cloud environments are far greater in complexity and design than traditional enterprise environments. Physical and virtual machines are rapidly being deployed in data centers, and managing the security of this environment using the traditional security methodology is stagnant and unrealistic. For example, Salah [24] does not include solutions for when an intrusion has compromised a particular virtual machine, or how a cloud provider and customer should respond. This is of particular concern, as [25] state that once a virtual machine has been compromised, the attacker can gain access to the lower-level hypervisor of the machine.

C. Data Security Issues in Virtualised Environments

Virtualization first came onto the market in IBM mainframes, through the use of a hypervisor to initiate virtual machines [21]. Virtualization concepts and technical background explanations are not being explicitly detailed to the cloud customer, adding to security concerns. Sensitive data that resides on the cloud computing model is susceptible to threats and vulnerabilities arising from virtualization techniques [26]. A primary design issue is to denote the sensitivity of the data that is being stored and assign low and high security controls for that virtual machine. According to [23], sensitive and nonsensitive data should not be stored on the same physical machine, although this has not been publicized to cloud customers.

Data in virtualized environments, according to [17], is an important topic, as data location and data ownership are key enablers of trust relationships between provider and customer. To complement the former, [14] elaborate that trust can be diminished by concerns relating to data breaches. Data security breaches in virtualized environments can affect one or many tenants that reside on a single physical machine, at times without notifications being issued to customers or consumers [14]. As many tenants reside on a single physical machine, customer data may be accessed by unauthorized personnel if the virtualized environment is compromised [27].

In [28], the definition of sensitive data relates to software configuration data, network configurations, and resource allocations for virtualized environments. If we compare this with [14], they define sensitive data with respect to an individual's social information. Throughout the study, [28] state that current security measures for virtualized environments are lacking and that increased prevention, detection, and protection measures need to be in place. These measures include an increase in the level of policy standards and managerial say during cloud provider assessment for cloud services. [29] emphasizes that the lack of service level agreement acknowledgement during cloud provider assessment plays a pivotal role in ruling out important components of cloud services.

D. Outsourcing Sensitive Data to Virtualised Environments

When cloud customers outsource their workload to a cloud provider due to resource constraints or volatile computation requirements, [20] state that “current outsourcing practice operates in plaintext - that is, it reveals both data and computation results to the commercial public cloud”. This should be concerning to cloud customers, as they very often store data that is likely to contain sensitive information (e.g., corporate intellectual property). The management practices of data security in virtualized cloud environments, according to [14], are simply inadequate for sensitive data to be stored. Rocha et al. [21] detail that system and network administrators have log-in credentials to access the virtualization management layer of the physical machine. With this level of access coupled with plaintext data outsourcing, virtualized cloud environments are demonstrably not suitable for sensitive data storage [5].

E. Cloud Security Auditing and Certification Compliance

Standards that include auditing and certifications are considered to be inadequate for the cloud computing model [15]. To complement the former, [30] state that auditing and certifications have not been widely implemented and adopted by cloud providers. A set of security standards and best practices are being developed by the Cloud Security Alliance (CSA), although current cloud providers are yet to demonstrate enthusiasm or optimism that these will play a role in avoiding security breaches [15].

F. Billing Monitoring Security Concerns on the Cloud

The continuation of monitoring services from cloud providers offers timely and effective billing solutions for cloud customers. However, this is also a security matter, given that providers need to monitor customer traffic to bill accordingly [31]. The lack of standards for monitoring services increases privacy concerns, as cloud customers can neither apply security metrics nor monitor what is being scanned [30]. Pek et al. [23] regard access to the management portal of the cloud computing model as integral to the overall security status of virtual environments.

G. Cloud Security Requirements and Modelling Approach

In their proposed framework modelling approach, [32] address privacy and security requirements analysis for cloud customers through a rigorous process that selects the most suitable cloud vendor. The conceptual framework incorporates different cloud computing stakeholders, iterative requirements processing, and a security modelling language. The authors acknowledge that the main limitation of this proposed conceptual framework is that privacy is treated as a subset requirement of security. [33] agree that the lack of service level agreement analysis during the conceptual framework process is a major contributor to ineffectively measuring cloud provider services.

Chen and Zhao [27] develop a data life cycle conceptual framework through a semantic review of current literature. The various stages address everything from initial data generation by cloud customers to how data destruction is performed by cloud providers once a cloud service is terminated. Throughout the framework, the lack of monitoring of service level agreements with respect to data location, sharing, privacy and security is of particular concern. The framework also provides no insight into overall compliance and regulatory status. [5] emphasize that cloud customers and end-users need to acknowledge the importance of a cloud provider's compliance and regulatory status.

The architecture proposed by [34] is a proxy-based cloud service that enables collaboration between multi-cloud consumers and providers on an ad-hoc basis. The concept allows data sharing and processing without establishing agreements, negotiated contracts or business rules. In another study, [25] elaborate on the significance of establishing standard service level agreements and contracts for cloud services and how to monitor them on a continuous basis. Modi et al. [25] also describe underlying Internet Protocol (IP) and proxy service vulnerabilities; through these, attacks such as man-in-the-middle, domain name system (DNS) spoofing and address resolution protocol (ARP) spoofing can target proxy-based cloud models. Comparatively, [34] and [35] looked at collaboration between multi-vendor and customer clouds in an alternative way. Yang's [35] simulation involved service level agreements for customers participating in cloud federation services. The measured components of the SLA were Quality of Service (QoS) attributes such as connection latency, bandwidth and threshold limits. The security that [35] incorporated in the simulation relied on encryption and authentication methods that are standard practice for online activities.
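
The QoS attributes named above lend themselves to a simple threshold check. The following Python sketch is purely illustrative: the metric names and limits are invented for this example and are not taken from [35].

```python
# Illustrative SLA threshold check; metric names and limits are invented.
SLA_THRESHOLDS = {
    "connection_latency_ms": {"max": 50.0},   # latency must stay below 50 ms
    "bandwidth_mbps": {"min": 100.0},         # bandwidth must stay above 100 Mb/s
}

def check_sla(samples: dict) -> list:
    """Return human-readable violations for one measurement cycle."""
    violations = []
    for metric, limits in SLA_THRESHOLDS.items():
        value = samples.get(metric)
        if value is None:
            violations.append(f"{metric}: no measurement collected")
            continue
        if "max" in limits and value > limits["max"]:
            violations.append(f"{metric}: {value} exceeds max {limits['max']}")
        if "min" in limits and value < limits["min"]:
            violations.append(f"{metric}: {value} below min {limits['min']}")
    return violations

print(check_sla({"connection_latency_ms": 72.3, "bandwidth_mbps": 140.0}))
```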

Yang and Jia [36] introduce a conceptual framework for enabling dynamic auditing of data stored on a cloud service. They define the key categories that need attention: increased confidentiality, dynamic auditing and batch auditing. The results were compelling, as the costs of processing these audits were reduced, and the intervention of a third-party auditor within the process avoided bias in the results. [17] emphasizes that the compliance and regulatory status of cloud providers is crucial to the cloud customer. A key difference between the studies is that [36] do not consider the attitude of cloud vendors towards participating in, and approving of, such audits; cloud vendors may well avoid these scenarios and decline to take part in data confidentiality checks.
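
The idea of a third-party auditor can be illustrated with a deliberately simplified integrity check. Real auditing schemes of the kind surveyed in [36] rely on compact cryptographic proofs and random challenges rather than re-hashing whole objects; the Python sketch below only conveys the basic comparison step and uses invented data.

```python
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# The customer registers the expected digest with the auditor at upload time.
expected = digest(b"customer record v1")

def audit(stored_bytes: bytes, expected_digest: str) -> bool:
    """The auditor challenges the provider and compares what comes back."""
    return digest(stored_bytes) == expected_digest

print(audit(b"customer record v1", expected))   # True  -> data intact
print(audit(b"tampered record", expected))      # False -> integrity violation
```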


Section III. Data Availability

A. Multiple Availability Zones

Cloud vendors with multiple availability zones use this capability to distribute network load and to spread critical services across a larger number of geo-redundant sites. Sun et al. [26] state that replication technology is used in multiple availability zone setups to avoid data loss, although this method is prone to cross-border complications if data is stored in differently regulated jurisdictions. The study by [37] examined how data availability is affected by the use of virtualization and raised associated security issues. Sun et al. [26] focused on data availability through offloading services to alternative servers for load distribution. By comparison, [37] recommended keeping high-availability applications in-house until further developments are made to the cloud computing model, although that article is now somewhat dated.

B. Enhancing Data Security to Maintain Data Availability

In the study by [38], enhanced security was achieved through security mechanisms such as double authentication and digital signatures; data availability followed from the data being stored and retrieved securely. In comparison, the study by [39] aimed to increase data availability through a two-stage process: using a trusted third party to maintain visibility of the security mechanisms in use, and using enhanced security mechanisms to protect the data. Thus, [38] propose their solution through an experimental case, whereas [39] demonstrate theirs only through the expected security tools and their capabilities. In contrast, using a literature review, [40] indicate that virtualization security is very important to data availability; their analysis of current security mechanisms suggests that virtualization security is under-managed and in need of enhanced management practices.
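
As an illustration of the digital-signature element mentioned above, the following Python sketch signs a record before upload and verifies it on retrieval. It assumes the third-party cryptography package and an Ed25519 key pair; it is not the specific mechanism used in [38].

```python
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()   # held by the data owner
public_key = private_key.public_key()        # shared with whoever verifies

record = b"patient file 42, version 3"
signature = private_key.sign(record)         # signed before upload

def verify(data: bytes, sig: bytes) -> bool:
    """Check integrity and authenticity when the data is later retrieved."""
    try:
        public_key.verify(sig, data)
        return True
    except InvalidSignature:
        return False

print(verify(record, signature))                        # True
print(verify(b"patient file 42, ALTERED", signature))   # False
```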

C. Data Availability Priorities

Sakr et al. [41], in their cloud computing survey, investigated cloud challenges using their developed model. While they identified several advantages, such as utilization and bandwidth improvements, there were substantial drawbacks in cloud storage techniques that raised concerns. Their findings indicated that the availability of the service strongly affects the cloud computing model, as even slight downtime or service degradation impacts use of the service. Similarly, the study by [42] indicates that performance delivery through service availability was the most significant issue. By way of critique, the findings of [42] were empirically based, whereas those of [41] were derived from a literature survey.

Section IV. Privacy

A. Defining Information Privacy in Technology

Meanings of information privacy vary across disciplines. According to the Australian Privacy Law and Practice Report 108: “Information privacy, [is] the establishment of rules governing the collection and handling of personal data such as credit information, and medical and government records.” It is also known as data protection [43]. Information privacy can be considered an important concept when studying cloud computing. It has four sub-components [44]:

  • Psychologically: people need private space;
  • Sociologically: people need to be free to behave… but without the continual threat of being observed;
  • Economically: people need to be free to innovate; and
  • Politically: people need to be free to think and argue and act.

It is important to note that information privacy is not only something that is important to a cloud computing business customer, but also an end-user who is likely to be an everyday consumer.

B. Technological Advances Outpace Privacy Regulation

The Australian Privacy Law and Practice Report 108 noted that the “…Privacy Act regulates the handling of personal information.” Although the Act was originally designed for public sector agencies, the Information Privacy Principles (IPPs) now have a broader reach [43, p. 138]. Complicating the issue of privacy, especially information privacy, is how it is interpreted, or for that matter ignored, by different legal systems. Gavison [45, p. 465] summarizes the problem of privacy in an ever-changing technological world: “Advances in the technology of surveillance and the recording, storage, and retrieval of information have made it either impossible or extremely costly for individuals to protect the same level of privacy that was once enjoyed.”

C. Sensitive Data Storage on Cloud Infrastructure

The EU Directive defines sensitive data as personal data that includes health records, criminal activities, or religious philosophy [14]. Similarly, [27] define e-commerce and health care systems data as sensitive. [28] define sensitive data to include personal attributes and security configuration files. Subashini and Kavitha [17] state that sensitive data holds value to the end-user and needs to be protected. In addition, [46] discusses the need for each cloud customer to assess suitability and evaluate the security controls that the cloud provider offers. Sun et al.'s [26] key point is that cloud customers must first acknowledge that their sensitive data is stored on cloud computing infrastructure, and cloud providers need to assure that it is kept confidential. [15] states that cloud customers are cautious about using the cloud computing model to store sensitive data. [14] state that protecting sensitive data in cloud computing is the biggest challenge for cloud customers.

Cloud customers are especially anxious about the release of their information to third-party vendors without their knowledge [22]. Sensitive data stored on cloud provider infrastructure is often non-aggregated; all data is tightly coupled, allowing any stakeholder with access to the data to make use of it [47]. [17] detail that cloud customers holding non-aggregated data are vulnerable to insider breaches, as data can be taken without the cloud customer's knowledge. Non-aggregated data elements are, moreover, often either weakly encrypted or stored in the clear. Ter [48] also discusses the importance of cloud customers decoupling sensitive data from non-sensitive data as a minimal standard if cloud computing is utilized. The ability to process large quantities of data and query datasets at immense speed is available using cloud computing [47], yet this very capability raises concerns about privacy and the security mechanisms protecting sensitive data. Privacy concerns are also raised because the data retention and deletion practices of cloud providers, with respect to virtualization techniques, have not been elaborated [28].

D. EU and AUS Data Privacy

King and Raja [14] detail privacy rights that cloud customers have if they choose to store data in an EU-based cloud. It follows that cloud providers need to assure that they act according to local regulations. The complexity arises when a cloud customer in Australia, for instance, is subject to foreign laws as their data is stored in another jurisdiction [46].

Section V. Trust

Cloud providers interpret trust as either a security or a privacy issue [15]. In comparison, [18] state that trust is strengthened by tighter technical and social means that enable transparency for cloud customers. End-users of cloud computing (i.e. everyday consumers) lack trust because cloud providers limit the amount of information provided directly to them on data transfer, storage and processing. End-users may also be concerned about confidentiality [49]. A large subset of cloud end-users are concerned that their data may be used inappropriately for other purposes. Nguyen [7, p. 2205] writes that when a cloud customer “[m]aintain[s] personal property on a third party's premises, he or she retains a reasonable expectation of privacy, even if that third party has the right to access the property for some purposes.”

A. Increasing Cloud Trust with Security Technology Solutions

Wu et al. [50], in their research, enable trust by increasing levels of security. They introduce a trusted third party to provide the secret key used to encrypt stored data, increasing the likelihood that consumers have stronger protection against data violation in the form of secure envelopes. This kind of solution, however, incurs higher network traffic costs. In agreement, [19] and [20] note that adding encryption degrades system performance and scalability.
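
A minimal sketch of the envelope idea follows, assuming the trusted third party simply issues a symmetric key that the customer uses to seal data before it reaches the provider. It relies on the third-party cryptography package's Fernet recipe and omits the non-repudiation aspects that are central to Wu et al.'s [50] actual scheme.

```python
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# In the scheme described above a trusted third party would issue this key;
# here it is generated locally purely for illustration.
ttp_issued_key = Fernet.generate_key()
envelope = Fernet(ttp_issued_key)

plaintext = b"quarterly sales ledger"
sealed = envelope.encrypt(plaintext)          # what the cloud provider stores
# The provider only ever sees ciphertext; the key holder can open the envelope.
assert envelope.decrypt(sealed) == plaintext
print(len(sealed), "bytes stored at the provider")
```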

The global dispersion of cloud customers and data centers alters the traditional domain trust relationship, as cloud customers and servers might not be in the same trusted domain [20]. Moreover, traditional methods of enabling and enhancing trust are simply unrealistic when the amount of data to process is growing exponentially [49]. Integrity mechanisms once used in traditional enterprise data centers focused on independent and isolated servers; hashing entire files is not feasible in cloud data center technology [49]. This creates uncertainty for cloud consumers who lack background knowledge of cloud computing. It was also found that cloud customers have little or no knowledge of trust-related issues in cloud computing.
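
One common workaround for the cost of hashing entire files, sketched below in standard-library Python, is to hash fixed-size blocks so that individual blocks can later be spot-checked without rereading the whole object. This is an illustrative simplification, not a scheme proposed in [49].

```python
import hashlib

def block_digests(path: str, block_size: int = 4 * 1024 * 1024) -> list:
    """Hash a file in fixed-size blocks so individual blocks can be
    spot-checked later without re-reading the whole file."""
    digests = []
    with open(path, "rb") as f:
        while True:
            block = f.read(block_size)
            if not block:
                break
            digests.append(hashlib.sha256(block).hexdigest())
    return digests

def root_digest(digests: list) -> str:
    """Collapse per-block digests into one short commitment value."""
    return hashlib.sha256("".join(digests).encode()).hexdigest()
```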

B. Enhancing Trust from Social and Technical Perspectives

Enabling trust is notably difficult to sustain, as trust is dynamic in nature and subject to other factors that may influence the cloud customer's behavior [26]. Improving trust in cloud computing is not solely a technical issue; it also involves social structures [22]. Throughout, [15] describes security and privacy concerns as being shaped by the emotions, authority and power of the individuals who use cloud computing resources.

Kshetri [15] details the importance of increasing security while lowering privacy concerns by enhancing the trust relationship between cloud provider and customer. In support, King and Raja [14] state that security weaknesses lead to lower consumption of cloud computing and a further decrease in customers handing over their data. King and Raja [14] also hold that policymakers need to enforce standards and practices within the cloud computing industry. With respect to customer trust, [51] states that enhancing transparency around security will only better support trust. Turning to socio-technical trust issues, [28] describe trust in relation to technological and virtualized concepts: enabling trust between virtualized systems means overcoming vulnerabilities in hardware and software design. A key security platform used in virtualized environments is the Trusted Platform Module (TPM), an industry standard for establishing a root of trust in hardware design and components [28].
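
The root-of-trust idea behind the TPM can be illustrated conceptually: each measured component extends a platform configuration register (PCR) by hashing the old register value together with the component's digest, so any change in the boot chain changes the final value. The Python sketch below mimics that extend operation in software and is not an interface to a real TPM.

```python
import hashlib

def pcr_extend(pcr_value: bytes, measurement: bytes) -> bytes:
    """TPM-style extend: new PCR = H(old PCR || H(measured component))."""
    return hashlib.sha256(pcr_value + hashlib.sha256(measurement).digest()).digest()

pcr = bytes(32)  # registers start at zero
for component in (b"firmware", b"bootloader", b"hypervisor", b"vm image"):
    pcr = pcr_extend(pcr, component)

print(pcr.hex())  # changing any component changes the final value
```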

Cloud customer trust concerns are likely to continue as failures in both the technical and social structures of cloud computing remain unresolved [14]. Kshetri [15] states that this is ongoing, as cloud providers fail to give cloud customers adequate and meaningful information, further diminishing trust. Trusting cloud providers with corporate transactions needs better management [17]. Cloud customers with sensitive data will continue to weigh up and investigate cloud computing. Further research is required in the area of constructing regulatory frameworks that cover trust relationships between all parties to a service level agreement (SLA) [14].

C. Increasing Trust with Service Level Agreement Visibility

According to [5], cloud providers grant minimal visibility of the acceptance terms and agreements for their offered services. Rapid response times for service deployments are critical factors for end-user acceptance, yet enquiring into and reading through the terms and agreements of the proposed service rarely happens, as most customers (and their end-users) will not have read, or even be aware of, the terms and agreements they sign up to [5]. King and Raja [14] state that trust will be jeopardized as privacy and security concerns continue to rise. [5], [14] and [15] share the perception that trust will be further diminished as cloud providers lack the enthusiasm and impetus to address these concerns. As a result, cloud providers continue to have full authority over customer data [52], although this has recently begun to change, with many suggesting mandatory data breach notification and even commensurate penalties for untimely communications about breaches.

Section VI. Data Flow

A. Data Flow Between Multiple Jurisdictions

In network operations, data flow is essential to overall planning and lifecycle management tasks for IT departments. Understanding where data is being transferred affects the type of cloud computing model chosen and the overall data storage techniques. Critical and sensitive data belonging to end-users of cloud solutions may contain personal information that cannot be shared with third-party vendors. Fears amongst end-users of cloud computing models are greatest when cross-border data flows occur without prior warning from the cloud provider. Esayas [13] examined the EU Data Protection Directive, which dates back to 1995, and stated that privacy protection for the cloud customer is rather limited once data is transferred between jurisdictions. In support, [31] detail privacy acts and regulatory bodies in various jurisdictions that clearly lack sufficient power to prevent cloud providers from transferring data to another jurisdiction.

B. Cloud Infrastructure Outpacing the Legal Framework

Adrian [46] argues that technology developments generally outpacing privacy law and regulation is a key contributor to privacy concerns. This is particularly important to cloud customers, as the legal framework is outdated and insufficient for current cloud computing models [46]. Outdated privacy laws and regulations are difficult to tie to cross-border data flows, as foreign corporations claim ownership of the data that was transferred [5]. Complications and confusion arise when legal frameworks become uncertain to cloud customers. To add to the severity of the problem, cloud customers are often unaware of the specific physical storage location of their data. Australia's failure to update and strengthen the Privacy Act 1988 is particularly problematic for cloud customers [46]. Svantesson and Clarke [5, p. 392] state that cloud computing “extends beyond mere compliance with data protection laws to encompass public expectations and policy issues that are not, or not yet, reflected in the law”.

C. Australia and EU Legal Frameworks Compared

The transfer of data over the Internet performed by cloud providers does not map neatly onto the notion of cross-border data transfer within the Australian Privacy Act 1988 [13]. In comparing the Australian Privacy Act 1988 and the European Union (EU) Data Protection Directive 1995, [14] explicitly mention that the EU Directive prohibits member states from transferring data across borders to jurisdictions whose laws and regulations fall below an acceptable standard. In contrast, [46] mentions that conflicting judgments often occur, as enforcing these rules is difficult to sustain in foreign countries.

King and Raja [14] explain that EU member states have far tighter privacy laws and regulations than Australia where cross-border data flows are concerned. The EU Directive gives cloud customers basic rights over their data and knowledge of where it is physically stored. Cross-border data flows out of Australia are treated differently, as the common law applies to these scenarios, which imposes far fewer restrictions than the EU Directive [52]. Simply put, the EU Directive states that cross-border data flows cannot occur if the foreign jurisdiction does not provide the same level of protection and enforcement [53].

Compliance and regulation restrict certain jurisdictions from transferring data to foreign jurisdictions, depending on the location from which the data originated and where it is being transferred [17]. With respect to the EU Directive, even if the cloud consumer is located outside the EU, data generated within the EU cannot be transferred outside the EU [14]. Determining where data is transferred between jurisdictions is often difficult, as the flow of data can be altered at any time without the cloud customer's knowledge [14].
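
The EU adequacy rule described above can be caricatured as a simple policy check. In the Python sketch below the jurisdiction labels and the set of “adequate” destinations are placeholders chosen for illustration, not a statement of actual adequacy decisions.

```python
# Placeholder jurisdiction labels; not a statement of real adequacy decisions.
ADEQUATE_DESTINATIONS = {"EU", "CH", "NZ"}

def transfer_allowed(origin: str, destination: str) -> bool:
    """Toy rule: EU-origin data may only flow to 'adequate' destinations."""
    if origin != "EU":
        return True   # this sketch only constrains EU-origin data
    return destination in ADEQUATE_DESTINATIONS

print(transfer_allowed("EU", "US"))   # False under this toy rule
print(transfer_allowed("EU", "NZ"))   # True
```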

The concerns of cloud customers over sensitive data are often overlooked and underestimated as cloud providers continue to transfer data to other jurisdictions. This has also raised concerns, particularly with sensitive data that end-users of cloud services generate from online applications. Sensitive data stored within traditional enterprise networks has been controlled by authorized personnel with tight restrictions using an access-control matrix; these restrictions comprise both physical security and security solutions such as authorization and cryptography. Regardless of data location, cloud consumers need to have control over data flow between jurisdictions [17].
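
The access-control matrix mentioned above is straightforward to illustrate: subjects form the rows, objects the columns, and each cell lists the rights a subject holds over an object. The Python sketch below uses invented subjects, objects and rights.

```python
# Toy access-control matrix: subjects x objects -> set of rights.
acl_matrix = {
    "db_admin":   {"customer_db": {"read", "write"}, "audit_log": {"read"}},
    "hr_officer": {"payroll_file": {"read"}},
}

def permitted(subject: str, obj: str, right: str) -> bool:
    """Look up whether the subject holds the given right over the object."""
    return right in acl_matrix.get(subject, {}).get(obj, set())

print(permitted("db_admin", "customer_db", "write"))    # True
print(permitted("hr_officer", "customer_db", "read"))   # False
```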

D. Google Docs Privacy Policy: An Example

In their analysis of the Google Docs privacy policy, [5] state that cloud end-users of the Google SaaS model have minimal knowledge of where their data is being transferred and processed. In complement, [53] declare that Google's service agreements accept no liability for the privacy and security of cloud end-user data. The privacy policy does not provide fundamental information about how third-party gadgets collect, manipulate and store cloud end-user data when using Google Docs [5]. This sits uneasily with the EU position, under which data residing within the EU cannot be transferred to non-EU jurisdictions even when the data owners are not EU-based residents [14]. Confusion for end-users of cloud computing is high, as cross-border data flows are often not highlighted or detailed to the cloud end-user during sign-up for a cloud-based application. Cloud computing providers claim that cross-border data flows allow higher service guarantees to the cloud customer and their respective customers. Acceptable service level agreements for cloud consumers can be taken into consideration when developing cloud strategies; a unified service level agreement will help improve confidence for future cloud computing migration [54].

Section VII. Service Level Agreements

Buyya et al. [3], in their seminal study, describe the importance of service level agreements in cloud computing. Service level agreements provide the needed protection between cloud provider and cloud customer. Similarly, [55] detail that SLAs are important documents that set expectations for both the cloud customer and the provider. With cloud computing being dynamic in nature and resources being adjusted on an ad-hoc basis, [56] discuss the need for the SLA to be self-adaptable and autonomic. For unexpected service disruption to be avoided, cloud providers need to assure that service guarantees are met in a timely fashion [57].

A. Cloud Computing Service Level Agreement Importance

The issues associated with cloud computing persist, and several factors considered by [17] are raised as important: service level agreements, security and privacy constraints. Service level agreements are pivotal in establishing a contract between provider and customer in the adoption of cloud computing technologies and services. Cloud customers need to be selective and to incorporate security technology and privacy protection policies within service level agreements [2]. Interpreting SLAs on behalf of cloud customers enables proper decisions to be made by key managerial staff. SLAs also provide customers with the ability to terminate a contract if service levels are not met by the cloud provider.

B. Service Level Agreements and Negotiation Strategies

Karadsheh's [58] findings propose a security model and an SLA negotiation process, derived from understanding business security requirements prior to undertaking cloud computing activities. Throughout the study, the aim was to build confidence in the enterprise by applying the right requirements. Karadsheh's [58] first step is due diligence on the cloud provider, followed by applying the needed security policies and establishing whether the cloud provider can adhere to them. The remaining component is SLA negotiation: questions on data location, privacy agreements and backup strategies serve as measurable attributes, and if these are satisfied, a cloud provider is selected. In complement, [33] and [58] discuss the importance of understanding the SLA prior to cloud computing usage, enabling all parties to set their legal and technical expectations.

C. Public Cloud Provider SLA Content Analysis Approach

Pauley [59] designed a transparency scorecard framework to measure security, privacy, auditing and SLA attributes. The scorecard questions were based on SLA guarantees, SLA management procedures and records of SLA usage, and the scorecard was designed to allow cloud consumers to identify the cloud provider that best suited their intended application. Pauley's [59] approach compared cloud customer requirements with publicly available information from cloud providers and used a self-service method for the analysis. In comparison, [60] analyzed cloud provider applicability with SLAs, without reference to security, privacy and auditing. Qiu et al. [60] gathered SLAs from public cloud providers that placed no restrictions on viewing their SLAs; their sample size was larger than in [59]. Qiu et al. [60] also applied the content analysis technique to the data within the SLAs and followed up with a case study and interviews with the cloud customer.
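
A transparency scorecard of this kind reduces, in essence, to counting how many disclosure questions a provider's public material answers. The Python sketch below is a toy version in that spirit; the questions and the provider's answers are invented and do not reproduce Pauley's [59] instrument.

```python
# Invented yes/no transparency questions and answers, for illustration only.
QUESTIONS = [
    "publishes an SLA with explicit guarantees",
    "documents SLA management procedures",
    "reports historical SLA performance",
    "describes its security controls publicly",
]

def transparency_score(answers: dict) -> float:
    """Percentage of questions answered 'yes' by the provider's public material."""
    return 100.0 * sum(answers.get(q, False) for q in QUESTIONS) / len(QUESTIONS)

provider_a = {QUESTIONS[0]: True, QUESTIONS[1]: True, QUESTIONS[2]: False}
print(f"{transparency_score(provider_a):.0f}%")   # 50%
```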

The findings of [59] detailed that out of six public cloud providers chosen (Google, Amazon Web Services, Microsoft Azure, IBM, Terremark, Savvis) only two scored greater than 50% on the SLA scorecard; the results were masked and the cloud providers were not identified. Qiu et al. [60] analyzed more SLA attributes than [59], providing greater insight into the true value of SLAs for cloud computing. Some of the added attributes in the second study that proved significant were definitions of data protection policy, backup policy and regulatory compliance policy, which were missing from the first study.

Baset [29] details the importance of understanding the variability of SLAs from the cloud provider perspective. The author introduces the attributes of service guarantee, time period, granularity, exclusions, service credit and service violation monitoring; these are the key attributes analyzed throughout the study using content analysis of publicly available SLAs. Qiu et al.'s [60] study includes additional attributes that define the obligations from both the provider's and the customer's points of view. An important finding from [29] is that service violation incident reporting was not specified in the actual SLA of any cloud provider except Amazon Compute, which factored in five days of incident reporting. For cloud customers who stipulate that cloud providers acknowledge data breaches, disruptions or security-related incidents, such provisions are alarmingly noted as “not available” within the service level agreement. The study from [29] also indicates that the SLAs analyzed between October 2008 and April 2010 did not change to reflect the actual status of cloud provider technology. Baset [29] also argues that enterprise SLAs should comprise more than just availability and performance, and should also cover privacy, security and disaster recovery.

D. Measuring Cloud Provider Service Level Agreements

Throughout organizational use of cloud computing, defining the SLA is crucial; the service being utilized will be directly affected if the SLA does not fit the cloud consumer's requirements. In their proposed framework, [61] evaluate and rank the SLA attributes of cloud providers. They utilize the service measurement index (SMI), whose attributes are accountability, agility, cost, performance, assurance, security, privacy and usability, and they extend this concept by introducing user experience as another attribute. The framework applies the Analytic Hierarchy Process (AHP) so that cloud consumers can evaluate and rank cloud providers based on the SMI attributes. The framework is applied in a case study comprising three cloud providers (Amazon EC2, Microsoft Azure and Rackspace). Based on the user requirements, the attributes are weighted in a ranking matrix, resulting in a total weight across the quality-of-service attributes. The study concluded that S3 (the anonymized name given to service provider 3) was the best in terms of performance, although S1 (service provider 1) provided the best quality/cost ratio. In comparison with [29], [59] and [60], [61] introduce the established SMI and AHP frameworks to evaluate and measure attributes against known metrics, rather than analyzing SLAs from an individual customer's perspective.
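
The ranking step can be illustrated with a simple weighted sum over a few SMI-style attributes. In the Python sketch below the weights and per-provider scores are invented, and a real AHP derives its weights from pairwise comparisons rather than assigning them directly as done here.

```python
# Invented weights and scores; real AHP derives weights from pairwise comparisons.
weights = {"performance": 0.4, "cost": 0.3, "security": 0.2, "usability": 0.1}

providers = {
    "S1": {"performance": 0.6, "cost": 0.9, "security": 0.7, "usability": 0.8},
    "S2": {"performance": 0.7, "cost": 0.6, "security": 0.8, "usability": 0.7},
    "S3": {"performance": 0.9, "cost": 0.5, "security": 0.8, "usability": 0.9},
}

def overall(scores: dict) -> float:
    """Weighted-sum aggregate over the SMI-style attributes."""
    return sum(weights[attr] * scores[attr] for attr in weights)

for name, scores in sorted(providers.items(), key=lambda kv: overall(kv[1]), reverse=True):
    print(name, round(overall(scores), 2))
```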

E. A Brief Analysis of Google Service Agreements

Svantesson and Clarke's [5] analysis of the Google Docs service terms finds that cloud customers have very little knowledge of how their data is used and where it resides. [53] also details that Google's service agreements provide no protection on either privacy or security issues for cloud customer data. With respect to cloud customer protection, [62] concludes that Google's service agreements state that the Internet search giant has the right to use content that is obtained and publicly displayed through its Google services. Google can willingly use customer data by accessing, indexing and caching it without the end customer's knowledge [62]. These agreements are enforced, often without the knowledge of the cloud customer or the cloud customer's own customers [5].

Section VIII. Regulation

The regulation of cloud computing in the U.S. has yet to mature and in certain circumstances lacks adequate protection for the confidentiality, integrity and availability of cloud customer data [14]. Comparing U.S. cloud computing regulation to the EU is challenging, as the EU has tighter restrictions on what is deemed acceptable and unacceptable [14]. Current regulatory rights lack the ability to protect data owned by cloud customers when the data resides in a jurisdiction different from that of the data owner [18]. Conflicting regulatory rights from different jurisdictions can force foreign laws to be applied. Adrian [46] describes new regulation for cloud computing models as inevitably risky and costly, as change would impact individual entities. Constructing new regulations would impose burdens on existing and established rights, as all entities would need to learn and adapt to the new regulations [46]. Similarly, imposing new regulatory laws onto an ecosystem that has not yet matured can be a challenging task for all participants involved [20].

Robison's [62] discussion of the United States Stored Communications Act (SCA) implies that this strong and deterministic legal infrastructure is simply outdated for today's technology, including cloud computing. The author describes and contrasts how cloud providers incorporate terms of service (ToS) and privacy policies into the agreed service. In comparison, [7] discusses the Stored Communications Act and proposes recommendations and future frameworks, including removing the distinction between remote computing services (RCS) and electronic communication services (ECS), requiring warrants, and implementing a statutory suppression remedy in the SCA. Both studies take the SCA as their foundation, although [62] aims to cooperate with and provide guidelines for future use of cloud computing, whereas Nguyen's [7] objective is to propose the alteration of the legal infrastructure itself. Both Robison [62] and Nguyen [7] aim to reconcile the positions of cloud providers and cloud customers and to strengthen privacy protection. Removing the ECS/RCS distinction and requiring a warrant prevents “searches from turning into fishing expeditions” [7, p. 2213]: current court orders require less justification to compel a search for data, while warrants require that searches be made on reasonable grounds.

Section IX. Conclusion

This paper has used a socio-technical approach to review literature in the field of cloud computing. From an analysis of the technical works in the field, it is clear that security concerns are among the most critical issues facing stakeholders of the cloud computing value chain. It is apparent that most previous studies have focused on enhancing security technology without reviewing the actual attacks that have been successfully launched against cloud providers, indicating that cloud data breaches are ill-defined and under-researched in scholarly works. Two concerns fundamental to cloud computing security need further attention: the first is pre-cloud data breach manageability and the second is post-cloud security manageability. Scholarly works have focused largely on simulating security solutions, while underestimating the importance of incorporating externalities within their studies. Externalities concern government and industry regulations, integral components that are presently only scantly mentioned in the literature. Importantly, social, technical and environmental concerns together have been largely overlooked, with works focusing on either social-technical or technical-environmental pairings, without reference to all three aspects of the cloud computing value chain.

The second part of this paper examined the social aspect of cloud computing, covering privacy and trust-based concerns in previous works. The studies in this area demonstrated that privacy and trust within cloud computing not only support continued usage of these services but also shape concerns about their utilization; at the very heart of cloud data breaches are privacy and trust. The scholarly works reviewed also identified issues with respect to environmental concerns, such as data flow, regulation and service level agreements, that were either misinterpreted or missing from government statutory legislation and prospective cloud providers' terms of service. It was obvious from the review that much research to date in the cloud computing field has focused more on technical solutions than on the actual social implications of cloud computing data breaches. This signifies the need for a balanced approach, particularly with respect to the social requirements of cloud customers and of the end-users of cloud solutions, who may not even be aware that they are using cloud services.

In terms of the environmental aspect of cloud computing, we found that “systems” today not only have a global reach but that the technology itself is sprawled across a global landscape. Cloud providers do not simply operate from one location; for reasons of redundancy, cost and legal boundaries they may operate various components of a system scattered all over the world. It may even be impossible for the cloud provider to say which part of a given transaction occurs locally as opposed to across a border. Previous works, with the exception of a small number of papers, have not addressed this regulatory and legal aspect of cloud computing, and even fewer studies say anything significant about the vulnerability of cloud computing end-users (i.e. everyday consumers) with respect to regulation once a data breach has occurred. What happens when hackers successfully breach a cloud computing service and personal data from a cloud customer's services is stolen or leaked? Who is informed? How are they informed? When are the end-users of the cloud customer notified of a breach? How is a cloud customer supported for damage to their brand from a successful security breach, and more importantly, how does a consumer of a service based on cloud infrastructure reclaim their personal information once it has been compromised, and how are they compensated for the loss? In conclusion, there is an urgent need for research that takes a balanced approach to cloud computing data breaches and incorporates the end-user, not just the cloud provider and the cloud business customer, into the study. A balance also needs to be struck between the social, technical and environmental aspects in finding a practicable solution to security breaches, for they will continue to occur and are inevitable.

References

1. D. N. Chorafas, Cloud Computing Strategies, Boca Raton, Florida:Taylor & Francis Group, 2011.
2. G. Pallis, "Cloud Computing: The New Frontier of Internet Computing" in Internet Computing, IEEE, vol. 14, pp. 70-73, 2010.
3. R. Buyya et al., "Cloud computing and emerging IT platforms: Vision hype and reality for delivering computing as the 5th utility", Future Generation Computer Systems, vol. 25, pp. 599-616, 2009.
4. P. Mell, T. Grance, "The NIST Definition of Cloud Computing National Institute of Standards and Technology", 2011.
5. D. Svantesson, R. Clarke, "Privacy and consumer risks in cloud computing", Computer Law & Security Review, vol. 26, pp. 391-397, 2010.
6. M. H. Hugos, D. Hulitzky, Business in the cloud: what every business needs to know about cloud computing, New York:John Wiley & Sons, 2010.
7. T. M. Nguyen, "Cloud cover: privacy protections and the Stored Communications Act in the age of cloud computing", Notre Dame Law Review, vol. 86, pp. 2189.
8. J. W. Rittinghouse, J. F. Ransome, Cloud computing: implementation management and security, Boca Raton, FL:Taylor & Francis Group, 2010.
9. W. Wang et al., "Cloud-DLS: Dynamic trusted scheduling for Cloud computing", Expert Systems with Applications, vol. 39, pp. 2321-2329, 2012.
10. C. Baun et al., Cloud computing: web-based dynamic IT services, Berlin/Heidelberg:Springer, 2011.
11. J. R. Winkler, Securing the cloud: cloud computing security techniques and tactics, Burlington, MA:Elsevier, 2011.
12. B. R. Rimal, N. Antonopoulos, L. Gillam et al., "Chapter 2. A Taxonomy Survey and Issues of Cloud Computing Ecosystems" in Cloud Computing: Principles Systems and Applications, ed London, UK:Springer-Verlag, pp. 21-46, 2010.
13. S. Y. Esayas, "A walk in to the cloud and cloudy it remains: The challenges and prospects of ‘processing’ and ‘transferring’ personal data", Computer Law & Security Review, vol. 28, pp. 662-678, 2012.
14. N. J. King, V. T. Raja, "Protecting the privacy and security of sensitive customer data in the cloud", Computer Law & Security Review, vol. 28, pp. 308-319, 2012.
15. N. Kshetri, "Privacy and security issues in cloud computing: The role of institutions and institutional evolution", Telecommunications Policy, vol. 37, pp. 372-386, 2013.
16. R. Oppliger, "Security and privacy in an online world", Computer, vol. 44, pp. 21, 2011.
17. S. Subashini, V. Kavitha, "A survey on security issues in service delivery models of cloud computing", Journal of Network and Computer Applications, vol. 34, pp. 1-11, 2011.
18. H. Takabi et al., "Security and Privacy Challenges in Cloud Computing Environments", IEEE Security & Privacy, vol. 8, pp. 24-31, 2010.
19. M. Zhou et al., "Privacy enhanced data outsourcing in the cloud", Journal of Network and Computer Applications, vol. 35, pp. 1367-1373, 2012.
20. K. Ren et al., "Security Challenges for the Public Cloud", IEEE Internet Computing, vol. 16, no. 00, pp. 69-73, 2012.
21. F. Rocha et al., "The Final Frontier: Confidentiality and Privacy in the Cloud", Computer, vol. 44, pp. 44-50, 2011.
22. J. Hwang, D. Li, "Trusted cloud computing (or) controlling the cloud?", Computer Law & Security Review, vol. 14, pp. 14-22.
23. G. Pek et al., "A survey of security issues in hardware virtualization", ACM Computing Surveys, vol. 45, pp. 1-34, 2013.
24. K. Salah et al., "Using Cloud Computing to Implement a Security Overlay Network" in Security & Privacy, IEEE, vol. 11, pp. 44-53, 2013.
25. C. Modi et al., "A survey on security issues and solutions at different layers of Cloud computing", The Journal of Supercomputing, vol. 63, pp. 561-592, 2013.
26. D. Sun et al., "Surveying and Analyzing Security Privacy and Trust Issues in Cloud Computing Environments", Procedia Engineering, vol. 15, pp. 2852-2856, 2011.
27. D. Chen, H. Zhao, "Data Security and Privacy Protection Issues in Cloud Computing", Proceedings of the 2012 International Conference on Computer Science and Electronics Engineering (ICCSEE), 2012.
28. M. Pearce et al., "Virtualization: Issues security threats and solutions", ACM Comput. Surv., vol. 45, pp. 1-39, 2013.
29. S. A. Baset, "Cloud SLAs: present and future", SIGOPS Oper. Syst. Rev., vol. 46, pp. 57-66, 2012.
30. B. Grobauer et al., "Understanding Cloud Computing Vulnerabilities" in Security & Privacy, IEEE, vol. 9, pp. 50-57, 2011.
31. G. Aceto et al., "Cloud monitoring: A survey", Computer Networks, vol. 57, pp. 2093-2115, 2013.
32. H. Mouratidis et al., "A framework to support selection of cloud providers based on security and privacy requirements", Journal of Systems and Software, vol. 86, pp. 2276-2293, 2013.
33. A. Arenas et al., "Bridging the Gap between Legal and Technical Contracts" in Internet Computing, IEEE, vol. 12, pp. 13-19, 2008.
34. M. Singhal et al., "Collaboration in multicloud computing environments: framework and security issues", Computer, vol. 46, pp. 76, 2013.
35. X. Yang et al., "A business-oriented Cloud federation model for real-time applications", Future Generation Computer Systems, vol. 28, pp. 1158-1167, 2012.
36. K. Yang, X. Jia, "Data storage auditing service in cloud computing: challenges methods and opportunities", World Wide Web, vol. 15, pp. 409-428, 2012.
37. P. Hofmann, D. Woods, "Cloud Computing: The Limits of Public Clouds for Business Applications" in Internet Computing, IEEE, vol. 14, pp. 90-93, 2010.
38. S. K. Sood, "A combined approach to ensure data security in cloud computing", Journal of Network and Computer Applications, vol. 35, pp. 1831-1838, 2012.
39. D. Zissis, D. Lekkas, "Addressing cloud computing security issues", Future Generation Computer Systems, vol. 28, pp. 583-592, 2012.
40. H. Y. Tsai et al., "Threat as a Service?: Virtualization's Impact on Cloud Security", IT Professional, vol. 14, pp. 32-37, 2012.
41. S. Sakr et al., "A Survey of Large Scale Data Management Approaches in Cloud Environments" in Communications Surveys & Tutorials, IEEE, vol. 13, pp. 311-336, 2011.
42. A. Benlian, T. Hess, "Opportunities and risks of software-as-a-service: Findings from a survey of IT executives", Decision Support Systems, vol. 52, pp. 232-246, 2011.
43. "For Your Information: Australian Privacy Law and Practice", ALRC Report 108, 2008.
44. R. Clarke, "What's Privacy?", 2006, [online] Available: www.rogerclarke.com.
45. R. Gavison, "Privacy and the Limits of Law", The Yale Law Journal, vol. 89, pp. 421-471, 1980.
46. A. Adrian, "How much privacy do clouds provide? An Australian perspective", Computer Law & Security Review, vol. 29, pp. 48-57, 2013.
47. H. Wang, "Privacy-Preserving Data Sharing in Cloud Computing", Journal of Computer Science & Technology, vol. 25, pp. 401-414, 2010.
48. K. L. Ter, "Singapore's Personal Data Protection legislation: Business perspectives", Computer Law & Security Review, vol. 29, pp. 264-273, 2013.
49. Z. Xiao, Y. Xiao, "Security and Privacy in Cloud Computing", IEEE Communications Surveys & Tutorials, vol. 15, pp. 843-859, 2013.
50. W. Wu et al., "How to achieve non-repudiation of origin with privacy protection in cloud computing", Journal of Computer and System Sciences, vol. 79, pp. 1200.
51. N. Ismail, "Cursing the Cloud (or) Controlling the Cloud?", Computer Law & Security Review, vol. 27, pp. 250-257, 2011.
52. A. Gray, "Conflict of laws and the cloud", Computer Law & Security Review, vol. 29, pp. 58-65, 2013.
53. N. Kshetri, S. Murugesan, "Cloud Computing and EU Data Privacy Regulations", Computer, vol. 46, pp. 86-89, 2013.
54. Y. Wei, M. B. Blake, "Service-Oriented Computing and Cloud Computing: Challenges and Opportunities", IEEE Internet Computing, vol. 14, pp. 72-75, 2010.
55. V. Kumar, P. Pradhan, "Role of Service Level Agreements in SaaS Business Scenario", IUP Journal of Information Technology, vol. 9, pp. 64-76, 2013.
56. A. Kertesz et al., "An interoperable and self-adaptive approach for SLA-based service virtualization in heterogeneous Cloud environments", Future Generation Computer Systems, vol. 32, pp. 54-68, 2014.
57. A. G. Garcia et al., "SLA-driven dynamic cloud resource management", Future Generation Computer Systems, vol. 31, pp. 1-11, 2014.
58. L. Karadsheh, "Applying security policies and service level agreement to IaaS service model to enhance security and transition", Computers & Security, vol. 31, pp. 315-326, 2012.
59. W. A. Pauley, "Cloud Provider Transparency: An Empirical Evaluation" in Security & Privacy, IEEE, vol. 8, pp. 32-39, 2010.
60. M. M. Qiu et al., "Systematic Analysis of Public Cloud Service Level Agreements and Related Business Values", presented at the Proceedings of the 2013 IEEE International Conference on Services Computing, 2013.
61. S. K. Garg et al., "A framework for ranking of cloud computing services", Future Generation Computer Systems, vol. 29, pp. 1012-1023, 2013.
62. W. Robison, "Free at what cost?: Cloud computing privacy under the stored communications act", Georgetown Law Journal, vol. 98, pp. 1195-1239, 2010.

Citation: David Kolevski, Katina Michael, "Cloud computing data breaches a socio-technical review of literature", 2015 International Conference on Green Computing and Internet of Things (ICGCIoT), 8-10 Oct. 2015, Noida, India, DOI: 10.1109/ICGCIoT.2015.7380702

Heaven and Hell: Visions for Pervasive Adaptation

Abstract

With everyday objects becoming increasingly smart and the “info-sphere” being enriched with nano-sensors and networked to computationally-enabled devices and services, the way we interact with our environment has changed significantly, and will continue to change rapidly in the next few years. Being user-centric, novel systems will tune their behaviour to individuals, taking into account users’ personal characteristics and preferences. But having a pervasive adaptive environment that understands and supports us “behaving naturally” with all its tempting charm and usability, may also bring latent risks, as we seamlessly give up our privacy (and also personal control) to a pervasive world of business-oriented goals of which we simply may be unaware.

1. Visions of pervasive adaptive technologies

This session considered some implications for the future, inviting participants to evaluate alternative utopian/dystopian visions of pervasive adaptive technologies. It was designed to appeal to anyone interested in the personal, social, economic and political impacts of pervasive, ubiquitous and adaptive computing.

The session was sponsored by projects from the FET Proactive Initiative on Pervasive Adaptation (PerAda), which targets technologies and design methodologies for pervasive information and communication systems capable of autonomously adapting in dynamic environments. The session was based on themes from the PerAda book entitled “This Pervasive Day”, to be published in 2011 by Imperial College Press, which includes several authors from the PerAda projects, who are technology experts in artificial intelligence, adaptive systems, ambient environments, and pervasive computing. The book offers visions of “user heaven” and “user hell”, describing technological benefits and useful applications of pervasive adaptation, but also potential threats of technology. For example, positive advances in sensor networks, affective computing and the ability to improve user-behaviour modeling using predictive analytics could be offset by results that ensure that neither our behaviour, nor our preferences, nor even our feelings will be exempt from being sensed, digitised, stored, shared, and even sold. Other potentially undesirable outcomes to privacy, basic freedoms (of expression, representation, demonstration etc.), and even human rights could emerge.

One of the major challenges, therefore, is how to improve pervasive technology (still in its immature phase) in order to optimise benefits and reduce the risks of negative effects. Increasingly FET research projects are asked to focus on the social and economic impacts of science and technology, and this session aimed to engage scientists in wider issues, and consider some of the less attractive effects as well as the benefits from pervasive adaptation. Future and emerging technology research should focus on the social and economic impacts of practical applications. The prospect of intelligent services increasingly usurping user preferences as well as a certain measure of human control creates challenges across a wide range of fields.

2. Format

The networking session took the form of a live debate, primed by several short “starter” talks by “This Pervasive Day” authors who each outlined “heaven and hell” scenarios. The session was chaired by Ben Paechter, Edinburgh Napier University, and coordinator of the PerAda coordination action. The other speakers were as follows:

Pervasive Adaptation and Design Contractualism.

Jeremy Pitt, Imperial College London, UK, editor of “This Pervasive Day”.

This presentation described some of the new channels, applications and affordances for pervasive computing and stressed the need to revisit the user-centric viewpoint of the domain of Human-Computer Interaction. In dealing with the issues of security and trust in such complex systems, capable of widespread data gathering and storage, Pitt suggested that there is a requirement for Design Contractualism, where the designer makes moral and ethical judgments and encodes them in the system. No privacy or security model is of any value if the system developers will not respect the implicit social contract on which the model depends.

Micro-chipping People, The Risk vs Reward Debate

Katina Michael, University of Wollongong, Australia

Michael discussed the rise of RFID chip implantation in people as a surveillance mechanism, making comparisons with the CCTV cameras that are becoming commonplace in streets and buildings worldwide. These devices are ushering in an age of “Uberveillance”, she claims, with corporations, governments and individuals being increasingly tempted to read and record the biometric and locative data of other individuals. This constant tracking of location and monitoring of physical condition raises serious questions concerning security and privacy that researchers will have to face in the near future.

Who is more adaptive: the technology or ourselves?

Nikola Serbedzija, Fraunhofer FIRST, Germany

Serbedzija discussed how today's widespread information technologies may be affecting how we are as humans. We are now entering a world where information is replacing materiality, and where control over our individual data allows us to construct ourselves as we wish to be seen by others. Serbedzija then presented examples of research into ethically critical systems, including a reflective approach to designing empathetic systems that use our personal, physical data to assist us in our activities, for example in vehicle co-driving situations.

3. Conclusion

Following the presentations, the discussion was opened up to the floor and panellists answered questions from conference delegates. This was augmented by a “tweet wall”, open to delegates for sending comments and opinions via Twitter, which was displayed on screen during the discussion session.

Keywords: Pervasive adaptation, ubiquitous computing, sensor networks, affective computing, privacy, security

Citation: Ben Paechter, Jeremy Pitt, Nikola Serbedzija, Katina Michael, Jennifer Willies, Ingi Helgason, 2011, "Heaven and Hell: Visions for Pervasive Adaptation", Procedia Computer Science: The European Future Technologies Conference and Exhibition 2011, Vol. 7, pp. 81-82, DOI: https://doi.org/10.1016/j.procs.2011.12.025