Location-Based Privacy, Protection, Safety, and Security

Abstract

This chapter will discuss the interrelated concepts of privacy and security with reference to location-based services, with a specific focus on the notion of location privacy protection. The latter can be defined as the extent and level of control an individual possesses over the gathering, use, and dissemination of personal information relevant to their location, whilst managing multiple interests. Location privacy in the context of wireless technologies is a significant and complex concept given the dual and opposing uses of a single LBS solution. That is, an application designed or intended for constructive uses can simultaneously be employed in contexts that violate the (location) privacy of an individual. For example, a child or employee monitoring LBS solution may offer safety and productivity gains (respectively) in one scenario, but when employed in secondary contexts may be regarded as a privacy-invasive solution. Regardless of the situation, it is valuable to initially define and examine the significance of “privacy” and “privacy protection,” prior to exploring the complexities involved.

16.1 Introduction

Privacy is often expressed as the most complex issue facing location-based services (LBS) adoption and usage [44, p. 82; 61, p. 5; 66, pp. 250–254; 69, pp. 414–415]. This is due to numerous factors such as the significance of the term in relation to human rights [65, p. 9]. According to a report by the Australian Law Reform Commission (ALRC), “privacy protection generally should take precedence over a range of other countervailing interests, such as cost and convenience” [3, p. 104]. The intricate nature of privacy is also a result of the challenges associated with accurately defining the term [13, p. 4; 74, p. 68]. That is, privacy is a difficult concept to articulate [65, p. 13], as the term is liberally and subjectively applied, and the boundaries constituting privacy protection are unclear. Additionally, privacy literature is dense and contains varying interpretations, theories, and discrepancies as to what constitutes privacy. However, as maintained by [65, p. 67], “[o]ne point on which there seems to be near-unanimous agreement is that privacy is a messy and complex subject.” Nonetheless, as asserted by [89, p. 196], privacy is fundamental to the individual due to various factors:

The intensity and complexity of life, attendant upon advancing civilization, have rendered necessary some retreat from the world, and man, under the refining influence of culture, has become more sensitive to publicity, so that solitude and privacy have become more essential to the individual.

The Oxford English Dictionary definition of security is the “state of being free from danger or threat.” A designation of security applicable to this research is “a condition in which harm does not arise, despite the occurrence of threatening events; and as a set of safeguards designed to achieve that condition” [92, pp. 390–391]. Security and privacy are often confused in LBS scholarship. Elliot and Phillips [40, p. 463] warn that “[p]rivacy is not the same as security,” although the two themes are related [70, p. 14]. Similarly, Clarke [21] states that the term privacy is often used by information and communication technology professionals to describe data and data transmission security. The importance of security is substantiated by the fact that it is considered “a precondition for privacy and anonymity” [93, p. 2], and as such the two themes are intimately connected. In developing this chapter and surveying security literature relevant to LBS, it became apparent that existing scholarship is varied, but nonetheless entails exploration of three key areas. These include: (1) security of data or information, (2) personal safety and physical security, and (3) security of a nation or homeland/national security, interrelated categories adapted from [70, p. 12].

This chapter will discuss the interrelated concepts of privacy and security with reference to LBS, with a specific focus on the notion of location privacy protection. The latter can be defined as the extent and level of control an individual possesses over the gathering, use, and dissemination of personal information relevant to their location [38, p. 1, 39, p. 2, 53, p. 233], whilst managing multiple interests (as described in Sect. 16.1.1). Location privacy in the context of wireless technologies and LBS is a significant and complex concept given the dual and opposing uses of a single LBS solution. That is, an application designed or intended for constructive uses can simultaneously be employed in contexts that violate the (location) privacy of an individual. For example, a child or employee monitoring LBS solution may offer safety and productivity gains (respectively) in one scenario, but when employed in secondary contexts may be regarded as a privacy-invasive solution. Regardless of the situation, it is valuable to initially define and examine the significance of “privacy” and “privacy protection,” prior to exploring the complexities involved.

16.1.1 Privacy: A Right or an Interest?

According to Clarke [26, pp. 123–129], the notions of privacy and privacy protection emerged as important social issues in the 1960s. An enduring definition of privacy is the “right to be let alone” [89, p. 193]. This definition requires further consideration, however, as it is quite simplistic and does not encompass the diverse dimensions of privacy. For further reading on the development of privacy and the varying concepts, including that of Warren and Brandeis, see [76]. Numerous scholars have attempted to provide a more workable definition of privacy than that offered by Warren and Brandeis.

For instance, [21] maintains that perceiving privacy simply as a right is problematic and narrow, and that privacy should rather be viewed as an interest or collection of interests, which encompasses a number of facets or categories. As such, privacy is defined as “the interest that individuals have in sustaining a ‘personal space’, free from interference by other people and organisations” [21, 26]. In viewing privacy as an interest, the challenge is in balancing multiple interests in the name of privacy protection. This, as Clarke [21] maintains, includes opposing interests in the form of one’s own interests and the interests of other people, organizations, or society. As such, Clarke refers to privacy protection as “a process of finding appropriate balances between privacy and multiple competing interests.”

16.1.2 Alternative Perspectives on Privacy

Solove’s [80] taxonomy of privacy offers a unique, legal perspective on privacy by grouping privacy challenges under the categories of information collection, information processing, information dissemination, and invasion. Refer to [80, pp. 483–558] for an in-depth overview of the taxonomy, which includes subcategories of the privacy challenges. Nissenbaum [65, pp. 1–2], on the other hand, maintains that existing scholarship generally expresses privacy in view of restricting access to, and maintaining control over, personal information. For example, Quinn [73, p. 213] insists that the central theme in privacy debates is that of access, including physical access to an individual, in addition to information access. With respect to LBS and location privacy, Küpper and Treu [53, pp. 233–234] agree with the latter, distinguishing three categories of access: (1) third-party access by intruders and law enforcement personnel/authorities, (2) unauthorized access by providers within the supply chain for malicious purposes, and (3) access by other LBS users. Nissenbaum [65, pp. 1–2] disputes the interpretation focused on access and control, noting that individuals are not interested in “simply restricting the flow of information but ensuring that it flows appropriately.” As such, Nissenbaum offers the framework of contextual integrity as a means of determining when certain systems and practices violate privacy and transform existing information flows inappropriately [65, p. 150]. The framework serves as a possible tool that can assist in justifying the need for LBS regulation.

A primary contribution from Nissenbaum is her emphasis on the importance of context in determining the privacy-violating nature of a specific technology-based system or practice. In addition to an appreciation of context, Nissenbaum recognizes the value of perceiving technology with respect to social, economic, and political factors and interdependencies. That is, devices and systems should be considered as socio-technical units [65, pp. 5–6].

In relation to privacy, and given the importance of socio-technical systems, the complexities embedded within privacy may, therefore, arise from the fact that the term can be examined from a number of perspectives. For instance, it can be understood in terms of its philosophical, psychological, sociological, economic, and political significance [21, 26]. Alternatively, privacy theory can provide varying means of interpretation, given that available approaches draw inspiration from multiple disciplines such as computer science and engineering, amongst others [65, p. 67]. It is also common to explore privacy through its complex dimensions.

According to Privacy International, for instance, the term comprises the aspects of information privacy, bodily privacy, privacy of communications, and territorial privacy [72]. Similarly, in providing a contemporary definition of privacy, Clarke [26] uses Maslow’s hierarchy of needs to define the various categories of privacy; that is, “privacy of the person,” “privacy of personal behavior,” “privacy of personal communications,” and “privacy of personal data.” Clarke argues that since the late 1960s the term has been confined, in a legal sense, to the last two categories. That is, privacy laws have been restricted in their focus in that they are predominantly based on the OECD fair information principles, and lack coverage of other significant categories of privacy. Therefore, the label of information privacy, typically interchangeable with data privacy, is utilized in reference to the combination of communications and data privacy [21], and is cited by [58, pp. 5–7] as a significant challenge in the information age.

16.2 Background

16.2.1 Defining Information Privacy

In Alan Westin’s prominent book Privacy and Freedom, information privacy is defined as “the right of individuals, groups and institutions to determine for themselves, when, how and to what extent information about them is communicated to others” [90, p. 7]. Information in this instance is personal information that can be linked to or identify a particular individual [33, p. 326]. For a summary of information privacy literature and theoretical frameworks, presented in tabular form, refer to [8, pp. 15–17].

16.2.2 Information Privacy Through the Privacy Calculus Perspective

For the purpose of this chapter, it is noteworthy that information privacy can be studied through differing lenses, one of which is the privacy calculus theoretical perspective. Xu et al. [95, p. 138] explain that “the calculus perspective of information privacy interprets the individual’s privacy interests as an exchange where individuals disclose their personal information in return for certain benefits.” It can be regarded as a form of “cost–benefit analysis” conducted by the individual, where privacy is likely to be (somewhat) relinquished if there is a perceived net benefit resulting from information disclosure [33, p. 327]. This perspective acknowledges the claim that privacy-related issues and concerns are not constant, but rather depend on perceptions, motivations, and conditions that are context or situation dependent [78, p. 353]. A related notion is the personalization–privacy paradox, which is based on the interplay between an individual’s willingness to reap the benefits of personalized services and the expense of divulging personal information, which may potentially threaten or invade their privacy. An article by Awad and Krishnan [8] examines this paradox, with specific reference to online customer profiling to deliver personalized services. The authors recommend that organizations work on increasing the perceived benefit and value of personalized services to ensure “the potential benefit of the service outweighs the potential risk of a privacy invasion” [8, p. 26].
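
The cost–benefit logic of the privacy calculus can be expressed as a minimal sketch. This is purely illustrative and not drawn from the cited studies; the function name, numeric weights, and scenario labels are assumptions.

```python
# Toy model of the privacy calculus: disclose personal information
# only when the perceived net benefit of disclosure is positive.
# All weights and scenarios below are illustrative assumptions.

def disclose(perceived_benefit: float, perceived_risk: float) -> bool:
    """Return True if the individual's cost-benefit analysis favors disclosure."""
    return perceived_benefit - perceived_risk > 0

# Context-dependence: the same individual may decide differently per situation.
navigation = disclose(perceived_benefit=0.8, perceived_risk=0.3)  # convenience outweighs risk
profiling = disclose(perceived_benefit=0.2, perceived_risk=0.7)   # risk outweighs benefit
print(navigation, profiling)  # True False
```

The point of the sketch is the context dependence noted in [78, p. 353]: the inputs to the comparison shift with the situation, so the same service can yield opposite disclosure decisions.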

In the LBS context, more specifically, Xu et al. [94] build on the privacy calculus framework to investigate the personalization–privacy paradox as it pertains to overt and covert personalization in location-aware marketing. The results of the study suggest that the personalization approach (overt or covert) affects both perceived privacy risk and perceived value. A complete overview of results can be found in [94, pp. 49–50]. For further information regarding the privacy calculus and the personalization–privacy paradox in the context of ubiquitous commerce applications, including LBS, refer to [78]. The privacy-related frameworks and concepts presented in this section are introductory in nature, intended to enable an appreciation of the varied perspectives on privacy and information privacy, and of the importance of context, rather than to treat privacy and information privacy exhaustively. Such notions are particularly pertinent when reflecting on privacy and the role of emerging information and communication technologies (ICTs) in greater detail.

16.2.3 Emerging Technologies, m-Commerce and the Related Privacy Challenges

It has been suggested that privacy concerns have been amplified (but not driven) by the emergence and increased use of ICTs, the driving force being the manner in which these technologies are implemented by organizations [21, 26]. In the m-commerce domain, mobile technologies are believed to heighten the threat to consumer privacy. That is, the intensity of marketing activities can potentially be increased with the availability of timely location details and, more significantly, tracking information, thus enabling consumer behaviors to be influenced to a greater extent [25]. The threat, however, is not derived solely from usage by organizations. Specifically, technologies originally introduced for use by government and organizational entities are presently available for adoption by members of the community. For further elaboration, refer to Abbas et al. [1] and chapter 8 of Andrejevic [4]. Thus, location (information) privacy protection emerges as a substantial challenge for the government, business, and consumer sectors.

16.2.4 Defining Location (Information) Privacy

Location privacy, regarded as a subset of information privacy, has been defined and presented in various ways. Duckham [38, p. 1] believes that location privacy is “the right of individuals to control the collection, use, and communication of personal information about their location.” Küpper and Treu [53, p. 233] define location privacy as “the capability of the target person to exercise control about who may access her location information in which situation and in which level of detail.” Both definitions focus on the aspect of control, cited as a focal matter regarding location privacy [39, p. 2]. With specific reference to LBS, location privacy and related challenges are considered to be of utmost importance. For example, Perusco and Michael [70, pp. 414–415], in providing an overview of studies relating to the social implications of LBS, claim that the principal challenge is privacy.

In [61, p. 5] Michael et al. also state, with respect to GPS tracking, that privacy is the “greatest concern,” resulting in the authors proposing a number of questions relating to the type of location information that should be revealed to other parties, the acceptability of child tracking and employee monitoring, and the requirement for a warrant in the tracking of criminals and terrorists. Similarly, Bennett and Crowe [12, pp. 9–32] reveal the privacy threats to various individuals, for instance those in emergency situations, mobile employees/workers, vulnerable groups (e.g., elderly), family members (notably children and teenagers), telematics application users, rental car clients, recreational users, prisoners, and offenders. In several of these circumstances, location privacy must often be weighed against other conflicting interests, an example of which is the emergency management situation. For instance, Aloudat [2, p. 54] refers to the potential “deadlock” between privacy and security in the emergency context, noting public concerns associated with the move towards a “total surveillance society.”

16.2.5 Data or Information Security

It has been suggested that data or information security in the LBS domain involves prohibiting unauthorized access to location-based information, which is considered a prerequisite for privacy [88, p. 121]. This form of security is concerned with “implementing security measures to ensure that collected data is only accessed for the agreed-upon purpose” [46, p. 1]. It is not, however, limited to access but is also related to “unwanted tracking” and the protection of data and information from manipulation and distortion [10, p. 185]. The techniques and approaches available to prevent unauthorized access and minimize chances of manipulation include the use of “spatially aware access control systems” [34, p. 28] and security- and privacy-preserving functionality [9, p. 568]. The intricacies of these techniques are beyond the scope of this investigation. Rather, this section is restricted to coverage of the broad data and information security challenges and the resultant impact on LBS usage and adoption.
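
The “agreed-upon purpose” principle quoted above can be made concrete with a small sketch of a purpose-bound access check. The roles, purposes, and granularity levels here are hypothetical assumptions for illustration, not drawn from the spatially aware access control literature cited.

```python
from typing import Optional

# Hypothetical policy table: (requester role, stated purpose) -> finest
# location granularity that was agreed upon when the data was collected.
POLICY = {
    ("emergency_service", "emergency_response"): "exact",
    ("fleet_manager", "work_hours_dispatch"): "street",
    ("advertiser", "marketing"): "city",
}

def permitted_granularity(role: str, purpose: str) -> Optional[str]:
    """Return the finest granularity this requester may receive,
    or None when the role/purpose pair was never agreed upon."""
    return POLICY.get((role, purpose))

print(permitted_granularity("emergency_service", "emergency_response"))  # exact
print(permitted_granularity("advertiser", "emergency_response"))         # None
```

The check denies any request whose purpose differs from the one agreed upon at collection time, even for an otherwise trusted role, which is the essence of purpose binding.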

16.2.6 Impact of Data or Information Security on LBS Market Adoption

It has been suggested that data and information security is a fundamental concern influencing LBS market adoption. From a legal standpoint, security is an imperative concept, particularly in cases where location information is linked to an individual [41, p. 22]. In such situations, safeguarding location data or information has often been described as a decisive aspect impacting user acceptance. These claims are supported in [85, p. 1], which notes that user acceptance of location- and context-aware m-business applications is closely linked to security challenges. Hence, from the perspective of organizations wishing to be “socially-responsive,” Chen et al. [19, p. 7] advise that security breaches must be avoided in the interest of economic stability:

Firms must reassure customers about how location data are used…A security lapse, with accompanying publicity in the media and possible ‘negligence’ lawsuits, may prove harmful to both sales and the financial stability of the firm.

Achieving satisfactory levels of security in location- and context-aware services, however, is a tricky task given the general issues associated with the development of security solutions; inevitable conflicts between protection and functionality; mobile-specific security challenges; inadequacy of standards to account for complex security features; and privacy and control-related issues [85, pp. 1–2]. Furthermore, developing secure LBS involves consideration of multiple factors; specifically those related to data or information accuracy, loss, abuse, unauthorized access, modification, storage, and transfer [83, p. 10]. There is the additional need to consider security issues from multiple stakeholder perspectives, in order to identify shared challenges and accurately assess their implications and the manner in which suitable security features can be integrated into LBS solutions. Numerous m-business security challenges relevant to LBS from various perspectives are listed in [85]. Data security challenges relevant to LBS are also discussed in [57, pp. 44–46].

16.3 Privacy and Security Issues

16.3.1 Access to Location Information Versus Privacy Protection

The issue of privacy in emergency situations, in particular, is delicate. For instance, Quinn [73, p. 225] remarks on the benefits of LBS in safety-related situations, with particular reference to the Enhanced 911 directive in the US, which stipulates that the location of mobile phones be provided in emergency situations, aiding in emergency response efforts. The author goes on to identify “loss of privacy” as a consequence of this service, specifically in cases where location details are provided to third parties [73, p. 226]. Such claims imply that there may be conflicting aims in developing and utilizing LBS. Duckham [38, p. 1] explains this point, stating that the major challenge in the LBS realm is managing the competing aims of enabling improved access to location information versus allowing individuals to maintain a sufficient amount of control over such information. The latter is achieved through the deployment of techniques for location privacy protection.

16.3.2 Location Privacy Protection

It is valid at this point to discuss approaches to location privacy protection. Bennett and Grant [13, p. 7] claim that general approaches to privacy protection in the digital age may come in varied forms, including, but not limited to, privacy-enhancing technologies, self-regulation approaches, and advocacy. In terms of LBS, substantial literature is available proposing techniques for location privacy protection, at both the theoretical and practical levels. A number of these techniques are best summarized in [39, p. 13] as “regulation, privacy policies, anonymity, and obfuscation.” A review of complementary research on the topic of privacy and LBS indicates that location privacy has predominantly been examined in terms of the social challenges and trade-offs from theoretical and practical perspectives; the technological solutions available to maintain location privacy; and the need for other regulatory response(s) to address location privacy concerns. The respective streams of literature are inspected further in this chapter.
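
Of the four techniques summarized in [39, p. 13], obfuscation is the most readily illustrated in code: the deliberate degradation of the quality of released location information. The rounding precision and coordinates below are illustrative assumptions.

```python
# Sketch of location obfuscation by coordinate coarsening: rounding to two
# decimal places reveals an area roughly a kilometre across rather than an
# exact position. The precision chosen here is an illustrative assumption.

def obfuscate(lat: float, lon: float, decimals: int = 2) -> tuple:
    """Release a deliberately coarsened position."""
    return (round(lat, decimals), round(lon, decimals))

print(obfuscate(-34.40581, 150.87842))  # (-34.41, 150.88)
```

Anonymity techniques work differently, dissociating identity from location rather than coarsening it; in both cases some service quality is traded for privacy, echoing the trade-offs discussed below.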

16.3.3 Social Challenges and Trade-Offs

In reviewing existing literature, the social implications of LBS with respect to privacy tend to be centered on the concepts of invasion, trade-off, and interrelatedness and complexity. The first refers primarily to the perceived and actual intrusion or invasion of privacy resulting from LBS development, deployment, usage, and other aspects. Alternatively, the trade-off notion signifies the weighing of privacy interest against other competing factors, notably privacy versus convenience (including personalization) and privacy versus national security. On the other hand, the factors of interrelatedness and complexity refer to the complicated relationship between privacy and other ethical dilemmas or themes such as control, trust, and security.

With respect to the invasion concept, Westin notes that concerns regarding invasion of privacy were amplified during the 1990s in both the social and political spheres [91, p. 444]. Concentrating specifically on LBS, [62, p. 6] provides a summary of the manner in which LBS can be perceived as privacy-invasive, claiming that GPS tracking activities can threaten or invade the privacy of the individual. According to the authors, such privacy concerns can be attributed to a number of issues regarding the process of GPS tracking. These include: (1) questionable levels of accuracy and reliability of GPS data, (2) potential to falsify the data post-collection, (3) capacity for behavioral profiling, (4) ability to reveal spatial information at varying levels of detail depending on the GIS software used, and (5) potential for tracking efforts to become futile upon extended use as an individual may become nonchalant about the exercise [62, pp. 4–5]. Other scholars examine the invasion concept in various contexts. Varied examples include [55] in relation to mobile advertising, [51] in view of monitoring employee locations, and [79] regarding privacy invasion and legislation in the United States concerning personal location information.

Current studies declare that privacy interests must often be weighed against other, possibly competing, factors, notably the need for convenience and national security. That is, various strands of LBS literature are fixed on addressing the trade-off between convenience and privacy protection. For instance, in a field study of mobile guide services, Kaasinen [50, p. 49] supports the need for resolving such a trade-off, arguing that “effortless use” often results in lower levels of user control and, therefore, privacy. Other scholars reflect on the trade-off between privacy and national security. In an examination of the legal, ethical, social, and technological issues associated with the widespread use of LBS, Perusco et al. [71] propose the LBS privacy–security dichotomy. The dichotomy is a means of representing the relationship between the privacy of the individual and national security concerns at the broader social level [71, pp. 91–97]. The authors claim that a balance must be achieved between both factors. They also identify the elements contributing to privacy risk and security risk, expressing the privacy risks associated with LBS to be omniscience, exposure, and corruption, claiming that the degree of danger is reduced with the removal of a specific risk [71, pp. 95–96]. The lingering question proposed by the authors is “how much privacy are we willing to trade in order to increase security?” [71, p. 96]. Whether in the interest of convenience or national security, existing studies focus on the theoretical notion of the privacy calculus. This refers to a situation in which an individual attempts to balance perceived value or benefits arising from personalized services against loss of privacy in determining whether to disclose information (refer to [8, 33, 78, 94, 95]).

The relationship between privacy and other themes is a common topic of discussion in existing literature. That is, privacy, control, security, and trust are key and interrelated themes concerning the social implications of LBS [71, pp. 97–98]. It is, therefore, suggested that privacy and the remaining social considerations be studied in light of these associations rather than as independent themes or silos of information. In particular, privacy and control literature are closely correlated, and as such the fields of surveillance and dataveillance must be flagged as crucial in discussions surrounding privacy. Additionally, there are studies which suggest that privacy issues are closely linked to notions of trust and perceived risk in the minds of users [44, 48, 49], thereby affecting a user’s decision to engage with LBS providers and technologies. It is commonly acknowledged in LBS privacy literature that resolutions will seek consensus between issues of privacy, security, control, risk, and trust—all of which must be technologically supported.

16.3.4 Personal Safety and Physical Security

LBS applications are often justified as valid means of maintaining personal safety, ensuring physical security and generally avoiding dangerous circumstances, through solutions that can be utilized for managing emergencies, tracking children, monitoring individuals suffering from illness or disability, and preserving security in employment situations. Researchers have noted that safety and security efforts may be enhanced merely through knowledge of an individual’s whereabouts [71, p. 94], offering care applications with notable advantages [61, p. 4].

16.3.5 Applications in the Marketplace

Devices and solutions that capitalize on these facilities have thus been developed, and are now commercially available for public use. They include GPS-enabled wristwatches, bracelets, and other wearable items [59, pp. 425–426], in addition to supportive applications that enable remote viewing or monitoring of location (and other) information. Assistive applications are one such example, such as technologies and solutions suited to the navigation requirements of vision-impaired or blind individuals [75, p. 104; example applications are described on pp. 104–105].

Alternative applications deliver tracking capabilities as their primary function; an example is the Australian-owned Fleetfinder PT2 Personal Tracker, which is advertised as a device capable of safeguarding children, teenagers, and the elderly [64]. These devices and applications promise “live on-demand” tracking and “a solid sense of reassurance” [15], which may be appealing for parents, carers, and individuals interested in protecting others. Advertisements and product descriptions are often emotionally charged, taking advantage of an individual’s (parent or carer) desire to maintain the safety and security of loved ones:

Your child going missing is every parent’s worst nightmare. Even if they’ve just wandered off to another part of the park the fear and panic is instant… [It] will help give you peace of mind and act as an extra set of eyes to look out for your child. It will also give them a little more freedom to play and explore safely [56].

16.3.6 Risks Versus Benefits of LBS Security and Safety Solutions

Despite such promotion and endorsement, numerous studies point to the dangers of LBS safety and security applications. Since their inception, individuals and users have voiced privacy concerns, which have been largely disregarded by proponents of the technology, chiefly vendors, given the (seemingly) voluntary nature of technology and device usage [6, p. 7]. The argument that technology adoption is optional, thereby placing the onus on the user, is certainly weak and flawed, particularly in situations where an individual is incapable of making an informed decision regarding monitoring activities, and given covert deployment options that may render monitoring activities obligatory. The consequences arising from covert monitoring are explored in [59] (refer to pp. 430–432 for implications of covert versus overt tracking of family members) and [1]. Covert and/or mandatory overt monitoring of minors and individuals suffering from illness is particularly problematic, raising questions in relation to the necessity of consent processes, the suitability of tracking, and what constitutes appropriate use.

In [59, p. 426] Mayer claims that there is a fine line between using tracking technologies, such as GPS, for safety purposes within the family context and improper use. Child tracking, for instance, has been described as a controversial area centered on the safety versus trust and privacy debate [77, p. 7]. However, the argument is not limited to issues of trust and privacy. Patel discusses the dynamics in the parent–child relationship and conveys a number of critical points in relation to wearable and embedded tracking technologies. In particular, Patel provides a legal perspective on child (teenager) monitoring [68, pp. 430–435] and examines other emergent issues or risks, notably linked to embedded monitoring solutions, such as medical complications, psychological repercussions, and unintended or secondary use [68, pp. 444–455]. Patel explains how parental fears regarding child safety, some of which are unfounded, together with the media’s publicizing of such cases, fuel parents’ desire to monitor teenagers; the author nonetheless argues that the decision to be monitored, particularly using embedded devices, should ultimately lie with the teenager [68, pp. 437–442].

16.3.7 Safety of “Vulnerable” Individuals

Similarly, monitoring individuals with an illness or intellectual disability, such as a person with dementia who may wander, raises a unique set of challenges in addition to the aforementioned concerns associated with consent, psychological issues, and misuse in the child or teenager tracking scenario. For instance, while dementia-wandering and other similar applications are designed to facilitate the protection and security of individuals, they can concurrently be unethical in situations where reliability and responsiveness, amongst other factors, are in question [61, p. 7]. A recent qualitative focus group study seeking the attitudes of varied stakeholders in relation to the use of GPS for individuals with cognitive disabilities [54, p. 360] made clear that this is an area fraught with indecisiveness as to the suitability of assistive technologies [54, p. 358]. The recommendations emerging from [54, pp. 361–364] indicate the need to “balance” safety with independence and privacy, to ensure that the individual suffering from dementia is involved in the decision to utilize tracking technologies, and that a consent process is in place, among other suggestions that are technical and device-related.

While much can be written about LBS applications in the personal safety and physical security categories, including their advantages and disadvantages, this discussion is limited to introductory material. Relevant to this chapter is the portrayal of the tensions arising from the use of solutions originally intended for protection and the resultant consequences, some of which are indeed inadvertent. That is, while the benefits of LBS are evident in their ability to maintain safety and security, they can also introduce risks, such as the use of LBS for cyberstalking. In establishing the need for LBS regulation, it is therefore necessary to appreciate that there will always be a struggle between the benefits and risks of LBS implementation and adoption.

16.3.8 National Security

Safety and security debates are not restricted to family situations but may also incorporate, as [59, p. 437] indicates, public safety initiatives and considerations, amongst others, that can contribute to the decline in privacy. These schemes include national security, which has been regarded as a priority area by various governments for over a decade. The Australian government affirms that the nation’s security can be compromised or threatened through various acts of “espionage, foreign interference, terrorism, politically motivated violence, border violations, cyber attack, organised crime, natural disasters and biosecurity events” [7]. Accordingly, technological approaches and solutions have been proposed and implemented to support national security efforts in Australia, and globally. Positioning technologies, specifically, have been adopted as part of government defense and security strategies, a detailed examination of which can be found in [60], thus facilitating increased surveillance. Surveillance schemes have, therefore, emerged as a result of the perceived and real threats to national security promoted by governments [92, p. 389], and according to [63, p. 2] have been legitimized as a means of ensuring national security, thereby granting governments “extraordinary powers that never could have been justified previously” [71, p. 94]. Cho [20, p. 216] maintains that the fundamental question is “which is the greater sin—to invade privacy or to maintain surveillance for security purposes?”

16.3.9 Proportionality: National Security Versus Individual Privacy

The central theme surfacing in relevant LBS scholarship is that of proportionality; that is, measuring the prospective security benefits against the impending privacy- and freedom-related concerns. For example, [71, pp. 95–96] proposes the privacy–security dichotomy as a means of illustrating the need for balance between an individual’s privacy and a nation’s security, where the privacy and security elements within the model contain subcomponents that collectively amplify risk in a given context. A key point to note in view of this discussion is that while the implementation of LBS may enhance security levels, this will inevitably come at the cost of privacy [71, pp. 95–96] and freedom [61, p. 9].

Furthermore, forsaking privacy corresponds to relinquishing personal freedom, a consequential cost of heightened security in threatening situations. Such circumstances blunt the perceived severity of invasive techniques and increase, to some degree, individuals’ tolerance of them [41, p. 12]. In particular, they “tilt the balance in favor of sacrificing personal freedom for the sake of public safety and security” [36, p. 50]. For example, Davis and Silver [35] report that the trade-off between civil liberties and privacy is often correlated with an individual’s sense of threat. In reporting on a survey of Americans conducted after the events of September 11, 2001, the authors conclude that civil liberties are often relinquished in favor of security in high-threat circumstances [35, p. 35], in that citizens are “willing to tolerate greater limits on civil liberties” [35, p. 74]. Similarly, in a dissertation centered on the social implications of auto-ID and LBS technologies, Tootell [86] presents the Privacy, Security, and Liberty Trichotomy as a means of understanding the interaction between the three values [86, chapter 6]. Tootell concludes that a dominant value will always exist that is unique to each individual [86, pp. 162–163].

Researchers such as Gould [45, p. 75] have found that while people are generally approving of enhanced surveillance, they simultaneously harbor uncertainties regarding government monitoring. From a government standpoint, there is a commonly held, albeit weak, view that if an individual has nothing to hide, then privacy is insignificant; this argument is particularly popular in relation to state-based surveillance [81, p. 746]. However, the perspective has inherent flaws, as the right to privacy should not be narrowly perceived in terms of the concealment of what would be considered unfavorable activities, a point discussed further in [81, pp. 764–772]. Moreover, the “civil liberties vs. security trade-off has mainly been framed as one of protecting individual rights or civil liberties from the government as the government seeks to defend the country against a largely external enemy” [35, p. 29].

Wigan and Clarke state, in relation to national security, that “surveillance systems are being developed without any guiding philosophy that balances human rights against security concerns, and without standards or guidance in relation to social impact assessment, and privacy design features” [92, p. 400]. Solove [82, p. 362] agrees that a balance can be achieved between security and liberty, through oversight and control processes that restrict prospective uses of personal data. In the current climate, given the absence of such techniques, fears of an Orwellian society dominated by intense and excessive forms of surveillance materialize. However, Clarke [27, p. 39] proposes a set of “counterveillance” principles in response to extreme forms of surveillance introduced in the name of national security, which include:

independent evaluation of technology; a moratorium on technology deployments; open information flows; justification of proposed measures; consultation and participation; evaluation; design principles; balance; independent controls; nymity and multiple identity; and rollback.

The absence of such principles creates a situation in which extremism reigns, producing a flow-on effect with potentially dire consequences in view of privacy, but also trust and control.

16.4 Solutions

16.4.1 Technological Solutions

In discussing technology and privacy in general, Krumm [52, p. 391] notes that computation-based mechanisms can be employed both to safeguard and to invade privacy. It is, therefore, valuable to distinguish between privacy-invasive technologies (PITs) and privacy-enhancing technologies (PETs). Clarke [23] examines the conflict between PITs and PETs, which are tools that can be employed to invade and protect privacy interests, respectively. Technologies can invade privacy either deliberately as part of their primary purpose, or alternatively their invasive nature may emerge in secondary uses [24, p. 209]. The aspects contributing to the privacy-invasive nature of location and tracking technologies or transactions include the awareness level of the individual, whether the individual has a choice, and the capability of performing an anonymous transaction, amongst others [22]. In relation to LBS, [23] cites person-location and person-tracking systems as potential PITs that require the implementation of countermeasures, which to date have come in the form of PETs or “counter-PITs.”

Existing studies suggest that the technological solutions (i.e., counter-PITs) available to address the LBS privacy challenge are chiefly concerned with degrading the ability to pinpoint location, or alternatively masking the identity of the user. For example, [62, p. 7] suggests that “[l]evels of privacy can be controlled by incorporating intelligent systems and customizing the amount of detail in a given geographic information system”, thus enabling the ethical use of GPS tracking systems. Similarly, other authors present models that anonymize user identity through the use of pseudonyms [14], architectures and algorithms that decrease location resolution [46], and systems that introduce degrees of obfuscation [37]. Notably, scholars such as Duckham [37, p. 7] consider location privacy protection as involving multiple strategies, citing regulatory techniques and privacy policies as supplementary strategies to techniques that are more technological in nature, such as obfuscation.
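The two families of counter-PITs described above, degrading location resolution and adding controlled inaccuracy, can be sketched in a few lines. The following Python fragment is illustrative only: the function names, parameters, and grid-snapping approach are assumptions for exposition, not the specific algorithms of [14], [37], or [46].

```python
import math
import random

def snap_to_grid(lat, lon, cell_deg=0.01):
    """Degrade location resolution by snapping coordinates to the
    centre of a fixed grid cell (0.01 degrees is roughly 1.1 km of
    latitude), so only the coarse cell is ever disclosed."""
    return (math.floor(lat / cell_deg) * cell_deg + cell_deg / 2,
            math.floor(lon / cell_deg) * cell_deg + cell_deg / 2)

def obfuscate(lat, lon, max_offset_deg=0.005, rng=random):
    """Introduce a degree of obfuscation by adding a bounded random
    offset, trading location quality for privacy."""
    return (lat + rng.uniform(-max_offset_deg, max_offset_deg),
            lon + rng.uniform(-max_offset_deg, max_offset_deg))

# A reported position is coarsened before leaving the device:
reported = snap_to_grid(-34.4278, 150.8931)
```

Either technique embodies the trade-off noted in Table 16.1: the privacy gain is paid for directly in degraded location quality, since every consumer of the data receives the same coarsened or perturbed coordinates.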

16.4.2 Need for Additional Regulatory Responses

Clarke and Wigan [31] examine the threats posed by location and tracking technologies, particularly those relating to privacy, stating that “[t]hose technologies are now well-established, yet they lack a regulatory framework.” A suitable regulatory framework for LBS (one that addresses privacy amongst other social and ethical challenges) may be built on numerous approaches, including the technical approaches described in Sect. 16.4.1. Other approaches are explored by Xu et al. [95] in their quasi-experimental survey of privacy challenges relevant to push versus pull LBS. The approaches include compensation (incentives), industry self-regulation, and government regulation strategies [95, p. 143]. According to Xu et al., these “intervention strategies” may have an impact on the privacy calculus in LBS [95, pp. 136–137]. Notably, their survey of 528 participants found that self-regulation has a considerable bearing on perceived risk for both push and pull services, whereas the effects of compensation and government regulation on perceived risk vary depending on the type of service. That is, compensation increases perceived benefit in the push but not the pull model and, similarly, government regulation reduces perceived privacy risk in the push-based model [95, p. 158].

It should be acknowledged that a preliminary step in seeking a solution to the privacy dilemma, addressing the identified social concerns, and proposing appropriate regulatory responses is to clearly identify and assess the privacy-invasive elements of LBS in a given context; Australia is used as the example in this instance. Possible techniques that can be employed to identify risks and implications, and consequently possible mitigation strategies, include conducting a Privacy Impact Assessment (PIA) or employing other novel models such as the framework of contextual integrity.

16.4.3 Privacy Impact Assessment (PIA)

A PIA can be defined as “a systematic process that identifies and evaluates, from the perspectives of all stakeholders, the potential effects on privacy of a project, initiative or proposed system or scheme, and includes a search for ways to avoid or mitigate negative privacy impacts” [29, 30]. The PIA tool, originally linked to technology and impact assessments [28, p. 125], is effectively a “risk management” technique that involves addressing both the positive and negative impacts of a project or proposal, but with a greater focus on the latter [67, pp. 4–5].

PIAs were established and developed from 1995 to 2005, and possess a number of distinct qualities: a PIA is focused on a particular initiative; takes a forward-looking and preventative, as opposed to retrospective, approach; broadly considers the various aspects of privacy (i.e., privacy of person, personal behavior, personal communication, and personal data); and is inclusive in that it accounts for the interests of relevant entities [28, pp. 124–125]. In the Australian context, the development of PIAs can be observed in the work of Clarke [30], who provides an account of PIA maturity pre-2000, post-2000, and as at 2010.

16.4.4 Framework of Contextual Integrity

The framework of contextual integrity, introduced by [65], is an alternative approach that can be employed to assess whether LBS, as a socio-technical system, violates privacy and thus contextual integrity. An overview of the framework is provided in [65, p. 14]:

The central claim is that contextual integrity captures the meaning of privacy in relation to personal information; predicts people’s reactions to new technologies because it captures what we care about when we question, protest, and resist them; and finally, offers a way to carefully evaluate these disruptive technologies. In addition, the framework yields practical, step-by-step guidelines for evaluating systems in question, which it calls the CI Decision Heuristic and the Augmented CI Decision Heuristic.

According to Nissenbaum [65], the primary phases within the framework are: (1) explanation, which entails assessing a new system or practice in view of “context-relative informational norms” [65, p. 190]; (2) evaluation, which involves “comparing altered flows in relation to those that were previously entrenched” [65, p. 190]; and (3) prescription, a process based on evaluation whereby, if a system or practice is deemed “morally or politically problematic,” there are grounds for it to be resisted, redesigned, or discarded [65, p. 191]. Within these phases are distinct stages: establish the prevailing context, determine the key actors, ascertain what attributes are affected, establish changes in principles of transmission, and red-flag the practice if there are modifications in actors, attributes, or principles of transmission [65, pp. 149–150].
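The stage-by-stage comparison at the heart of the heuristic can be illustrated programmatically. The sketch below is a loose paraphrase, not Nissenbaum’s formal apparatus: the data structure, field names, and flag messages are assumptions introduced purely to show how an entrenched information flow might be compared against a novel one.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    """A context-relative information flow: which actor sends which
    attributes to which recipient, under which transmission principle
    (e.g. voluntary disclosure, compulsion, automatic collection)."""
    sender: str
    recipient: str
    attributes: frozenset
    transmission_principle: str

def red_flags(entrenched: Flow, novel: Flow):
    """Compare a novel flow against the entrenched norm and flag
    any change in actors, attributes, or transmission principles."""
    flags = []
    if (novel.sender, novel.recipient) != (entrenched.sender, entrenched.recipient):
        flags.append("actors changed")
    if novel.attributes - entrenched.attributes:
        flags.append("new attributes transmitted")
    if novel.transmission_principle != entrenched.transmission_principle:
        flags.append("transmission principle changed")
    return flags

# Hypothetical LBS example: a child's whereabouts, once shared casually
# with a parent, are now continuously logged by a service provider.
norm = Flow("child", "parent",
            frozenset({"approximate whereabouts"}), "voluntary disclosure")
lbs = Flow("child", "LBS provider",
           frozenset({"approximate whereabouts", "continuous GPS trace"}),
           "automatic collection")
```

In this toy comparison all three red flags are raised, which in the framework’s terms would trigger the evaluation and prescription phases rather than settle the question outright.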

The framework of contextual integrity and, similarly, PIAs are relevant to this study, and may be considered as valid tools for assessing the privacy-invasive or violating nature of LBS and justifying the need for some form of regulation. This is particularly pertinent as LBS present unique privacy challenges, given their reliance on knowing the location of the target. That is, the difficulty in maintaining location privacy is amplified due to the fact that m-commerce services and mobility in general, by nature, imply knowledge of the user’s location and preferences [40, p. 463]. Therefore, it is likely that there will always be a trade-off ranging in severity. Namely, one end of the privacy continuum will demand that stringent privacy mechanisms be implemented, while the opposing end will support and justify increased surveillance practices.

16.5 Challenges

16.5.1 Relationship Between Privacy, Security, Control and Trust

A common thread in discussions relating to privacy and security implications of LBS throughout this chapter has been the interrelatedness of themes; notably, the manner in which a particular consideration is often at odds with other concerns. The trade-off between privacy/freedom and safety/security is a particularly prevalent exchange that must be considered in the use of many ICTs [36, p. 47]. In the case of LBS, it has been observed that the need for safety and security conflicts with privacy concerns, potentially resulting in contradictory outcomes depending on the nature of implementation. For example, while LBS facilitate security and timely assistance in emergency situations, they simultaneously have the potential to threaten privacy based on the ability for LBS to be employed in tracking and profiling situations [18, p. 105]. According to Casal [18, p. 109], the conflict between privacy and security, and the lack of adequate regulatory frameworks, has a flow-on effect in that trust in ICTs is diminished. Trust is also affected in the family context, where tracking or monitoring activities result in a lack of privacy between family members [59, p. 436]. The underlying question, according to Mayer [59, p. 435], concerns the power struggle between those seeking privacy and those seeking information:

What will be the impact within families as new technologies shift the balance of power between those looking for privacy and those seeking surveillance and information?

Mayer’s [59] question alludes to the relevance of the theme of control, in that surveillance can be perceived as a form of control and influence. Therefore, it can be observed that inextricable linkages exist between several themes presented or alluded to throughout this chapter; notably privacy and security, but also the themes of control and trust. In summary, privacy protection requires security to be maintained, which in turn results in enhanced levels of control, leading to decreased levels of trust, which is a supplement to privacy [70, pp. 13–14]. The interrelatedness of themes is illustrated in Fig. 16.1.

Fig. 16.1: Relationship between control, trust, privacy, and security, after [70, p. 14]

It is thus evident that the idea of balance resurfaces, with the requirement to weigh multiple and competing themes and interests. This notion is not new with respect to location monitoring and tracking. For instance, Mayer [59, p. 437] notes, in the child tracking context, that there is the requirement to resolve numerous questions and challenges in a legal or regulatory sense, noting that “[t]he key is balancing one person’s need for privacy with another person’s need to know, but who will define this balancing point?” Issues of age, consent, and reciprocal monitoring are also significant. Existing studies on location disclosure amongst social relations afford the foundations for exploring the social and ethical challenges for LBS, whilst simultaneously appreciating technical considerations or factors. Refer to [5, 16, 32, 42, 43, 47, 62, 84, 87].

16.6 Conclusion

This chapter has provided an examination of privacy and security with respect to location-based services. There is a pressing need to ensure LBS privacy threats are not dismissed from a regulatory perspective. Dismissing them would introduce genuine dangers, such as psychological, social, cultural, scientific, economic, political, and democratic harm; dangers associated with profiling; increased visibility; publicly damaging revelations; and oppression [31]. Additionally, the privacy considerations unique to the “locational or mobile dimension” require educating the general public regarding disclosure and increased transparency on the part of providers in relation to collection and use of location information [11, p. 15]. Thus, in response to the privacy challenges associated with LBS, and based on current scholarship, this research recognizes the need for technological solutions, in addition to commitment and adequate assessment or consideration at the social and regulatory levels. Specifically, the privacy debate involves contemplation of privacy policies and regulatory frameworks, in addition to technical approaches such as obfuscation and maintaining anonymity [37, p. 7]. That is, privacy-related technical solutions must also be allied with supportive public policy and socially acceptable regulatory structures.

For additional readings relevant to LBS and privacy, which include ample general references for further investigation, refer to: [17] on the privacy challenges of privacy-invasive geo-mash-ups, the inadequacy of information privacy laws, and potential solutions in the form of technological measures, social standards, and legal frameworks; [12], a report submitted to the Office of the Privacy Commissioner of Canada, focused on mobile surveillance, its privacy dangers, and legal consequences; and [57], a report to the Canadian Privacy Commissioner dealing with complementary issues associated with mobility, location technologies, and privacy.

Based on the literature presented throughout this chapter, a valid starting point in determining the privacy-invasive nature of specific LBS applications is to review and employ the available solution(s). These solutions or techniques are summarized in Table 16.1, in terms of the merits and benefits of each approach and the extent to which they offer means of overcoming or mitigating privacy-related risks. The selection of a particular technique is dependent on the context or situation in question. Once the risks are identified it is then possible to develop and select an appropriate mitigation strategy to reduce or prevent the negative implications of utilizing certain LBS applications. This chapter is intended to provide a review of scholarship in relation to LBS privacy and security, and should be used as the basis for future research into the LBS privacy dilemma, and related regulatory debate.

Table 16.1 Summary of solutions and techniques

Solution/Technique | Merits | Limitations

Technological mechanisms

Merits:

• Provide location obfuscation and anonymity in required situations

• Myriad of solutions available depending on the level of privacy required

• In-built mechanisms requiring limited user involvement

• Unlike regulatory solutions, technological solutions encourage industry development

Limitations:

• Result in degradation of location quality/resolution

Regulatory mechanisms

Merits:

• Variety of techniques available, such as industry self-regulation and government legislation

• Can offer legal protection to individuals in defined situations/scenarios

Limitations:

• Can be limiting in terms of advancement of the LBS industry

Impact assessments, contextual frameworks, and internal policies

Merits:

• Provide a proactive approach to identifying privacy (and related) risks

• Used to develop suitable mitigation strategies

• Preventative and inclusive in nature

Limitations:

• Tend to be skewed in focus, concentrating primarily on negative implications

• Can be limiting in terms of advancement of the LBS industry

References

1. Abbas R, Michael K, Michael MG, Aloudat A (2011) Emerging forms of covert surveillance using GPS-enabled devices. J Cases Inf Technol (JCIT) 13(2):19–33

2. Aloudat A (2012) Privacy vs security in national emergencies. IEEE Technol Soc Mag Spring 2012:50–55

3. ALRC (2008) For your information: Australian privacy law and practice (ALRC Report 108). http://www.alrc.gov.au/publications/report-108. Accessed 12 Jan 2012

4. Andrejevic M (2007) iSpy: surveillance and power in the interactive era. University Press of Kansas, Lawrence

5. Anthony D, Kotz D, Henderson T (2007) Privacy in location-aware computing environments. Pervas Comput 6(4):64–72

6. Applewhite A (2002) What knows where you are? Personal safety in the early days of wireless. Pervas Comput 3(12):4–8

7. Attorney General’s Department (2012) Telecommunications interception and surveillance. http://www.ag.gov.au/Telecommunicationsinterceptionandsurveillance/Pages/default.aspx. Accessed 20 Jan 2012

8. Awad NF, Krishnan MS (2006) The personalization privacy paradox: an empirical evaluation of information transparency and the willingness to be profiled online for personalization. MIS Q 30(1):13–28

9. Ayres G, Mehmood R (2010) Locpris: a security and privacy preserving location based services development framework. In: Setchi R, Jordanov I, Howlett R, Jain L (eds) Knowledge-based and intelligent information and engineering systems, vol 6279, pp 566–575

10. Bauer HH, Barnes SJ, Reichardt T, Neumann MM (2005) Driving the consumer acceptance of mobile marketing: a theoretical framework and empirical study. J Electron Commer Res 6(3):181–192

11. Bennett CJ (2006) The mobility of surveillance: challenges for the theory and practice of privacy protection. In: Paper prepared for the 2006 Meeting of the international communications association, Dresden Germany, June 2006, pp 1–20.

12. Bennett CJ, Crowe L (2005) Location-based services and the surveillance of mobility: an analysis of privacy risks in Canada. A report to the Office of the Privacy Commissioner of Canada, under the 2004–05 Contributions Program, June 2005. http://www.colinbennett.ca/recent-publications/reports-2/

13. Bennett CJ, Grant R (1999) Introduction. In: Bennett CJ, Grant R (eds) Visions of privacy: policy choices for the digital age. University of Toronto Press, Toronto, pp 3–16.

14. Beresford AR, Stajano F (2004) Mix zones: user privacy in location-aware services. In: Proceedings of the Second IEEE Annual conference on pervasive computing and communications workshops (PERCOMW’04) pp 127–131.

15. Brickhouse Security (2012) Lok8u GPS Child Locator. http://www.brickhousesecurity.com/child-locator.html. Accessed 9 Feb 2012

16. Brown B, Taylor AS, Izadi S, Sellen A, Kaye J, Eardley R (2007) Locating family values: a field trial of the whereabouts clock. In: UbiComp ‘07 Proceedings of the 9th international conference on Ubiquitous computing, pp 354–371.

17. Burdon M (2010) Privacy invasive geo-mashups: privacy 2.0 and the limits of first generation information privacy laws. Univ Illinois J Law Technol Policy (1):1–50

18. Casal CR (2004) Impact of location-aware services on the privacy/security balance. Info: J Policy Regul Strategy Telecommun Inf Media 6(2):105–111

19. Chen JV, Ross W, Huang SF (2008) Privacy, trust, and justice considerations for location-based mobile telecommunication services. Info 10(4):30–45

20. Cho G (2005) Geographic information science: mastering the legal issues. Wiley, Hoboken.

21. Clarke R (1997) Introduction to dataveillance and information privacy, and definitions of terms. http://www.anu.edu.au/people/Roger.Clarke/DV/Intro.html

22. Clarke R (1999) Relevant characteristics of person-location and person-tracking technologies. http://www.rogerclarke.com/DV/PLTApp.html

23. Clarke R (2001a) Introducing PITs and PETs: technologies affecting privacy. http://www.rogerclarke.com/DV/PITsPETs.html

24. Clarke R (2001) Person location and person tracking—technologies, risks and policy implications. Inf Technol People 14(2):206–231

25. Clarke R (2003b) Privacy on the move: the impacts of mobile technologies on consumers and citizens. http://www.anu.edu.au/people/Roger.Clarke/DV/MPrivacy.html

26. Clarke R (2006) What’s ‘Privacy’? http://www.rogerclarke.com/DV/Privacy.html

27. Clarke R (2007a) Chapter 3. What ‘Uberveillance’ is and what to do about it. In: Michael K, Michael MG (eds) The Second workshop on the social implications of national security (from Dataveillance to Uberveillance and the Realpolitik of the Transparent Society). University of Wollongong, IP Location-Based Services Research Program (Faculty of Informatics) and Centre for Transnational Crime Prevention (Faculty of Law), Wollongong, Australia, pp 27–46

28. Clarke R (2009) Privacy impact assessment: its origins and development. Comput Law Secur Rev 25(2):123–135

29. Clarke R (2010a) An evaluation of privacy impact assessment guidance documents. http://www.rogerclarke.com/DV/PIAG-Eval.html

30. Clarke R (2010b) PIAs in Australia—a work-in-progress report. http://www.rogerclarke.com/DV/PIAsAust-11.html

31. Clarke R, Wigan M (2011) You are where you’ve been: the privacy implications of location and tracking technologies. http://www.rogerclarke.com/DV/YAWYB-CWP.html

32. Consolvo S, Smith IE, Matthews T, LaMarca A, Tabert J, Powledge P (2005) Location disclosure to social relations: why, when, & what people want to share. In: CHI 2005(April), pp 2–7, Portland, Oregon, USA, pp. 81–90

33. Culnan MJ, Bies RJ (2003) Consumer privacy: balancing economic and justice considerations. J Soc Issues 59(2):323–342

34. Damiani ML, Bertino E, Perlasca P (2007) Data security in location-aware applications: an approach based on RBAC. Int J Inf Comput Secur 1(1/2):5–38

35. Davis DW, Silver BD (2004) Civil Liberties Vs. Security: public opinion in the context of the terrorist attacks on America. Am J Polit Sci 48(1):28–46

36. Dobson JE, Fisher PF (2003) Geoslavery. IEEE Technol Soc Mag 22(1):47–52

37. Duckham M (2008) Location privacy protection through spatial information hiding. http://www.privacy.vic.gov.au/privacy/web2.nsf/files/20th-meeting-16-july-2008-duckham-presentation/$file/pvn_07_08_duckham.pdf

38. Duckham M (2010) Moving forward: location privacy and location awareness. In: SPRINGL’10 November 2, 2010, San Jose, CA, USA, pp 1–3

39. Duckham M, Kulik L (2006) Chapter 3. location privacy and location-aware computing. In: Drummond J, Billen R, Forrest D, Joao E (eds) Dynamic and Mobile Gis: investigating change in space and time. CRC Press, Boca Raton, pp 120. http://www.geosensor.net/papers/duckham06.IGIS.pdf

40. Elliot G, Phillips N (2004) Mobile commerce and wireless computing systems. Pearson Education Limited, Great Britain 532 pp

41. FIDIS (2007) D11.5: The legal framework for location-based services in Europe. http://www.fidis.net/

42. Fusco SJ, Michael K, Aloudat A, Abbas R (2011) Monitoring people using location-based social networking and its negative impact on trust: an exploratory contextual analysis of five types of “Friend” Relationships. In: IEEE symposium on technology and society (ISTAS11), Illinois, Chicago, IEEE 2011

43. Fusco SJ, Michael K, Michael MG, Abbas R (2010) Exploring the social implications of location based social networking: an inquiry into the perceived positive and negative impacts of using LBSN between friends. In: 9th international conference on mobile business (ICMB2010), Athens, Greece, IEEE, pp 230–237

44. Giaglis GM, Kourouthanassis P, Tsamakos A (2003) Chapter IV. Towards a classification framework for mobile location-based services. In: Mennecke BE, Strader TJ (eds) Mobile commerce: technology, theory and applications. Idea Group Publishing, Hershey, US, pp 67–85

45. Gould JB (2002) Playing with fire: the civil liberties implications of September 11th. In: Public Administration Review, 62 (Special Issue: Democratic Governance in the Aftermath of September 11, 2001), pp 74–79

46. Gruteser M, Grunwald D (2003) Anonymous usage of location-based services through spatial and temporal cloaking. In: ACM/USENIX international conference on mobile systems, applications and services (MobiSys), pp 31–42

47. Iqbal MU, Lim S (2007) Chapter 16. Privacy implications of automated GPS tracking and profiling. In: Michael K, Michael MG (eds) From Dataveillance to Überveillance and the Realpolitik of the Transparent Society (Workshop on the Social Implications of National Security, 2007) University of Wollongong, IP Location-Based Services Research Program (Faculty of Informatics) and Centre for Transnational Crime Prevention (Faculty of Law), Wollongong, pp 225–240

48. Jorns O, Quirchmayr G (2010) Trust and privacy in location-based services. Elektrotechnik & Informationstechnik 127(5):151–155

49. Junglas I, Spitzmüller C (2005) A research model for studying privacy concerns pertaining to location-based services. In: Proceedings of the 38th Hawaii international conference on system sciences, pp 1–10

50. Kaasinen E (2003) User acceptance of location-aware mobile guides based on seven field studies. Behav Inf Technol 24(1):37–49

51. Kaupins G, Minch R (2005) Legal and ethical implications of employee location monitoring. In: Proceedings of the 38th Hawaii international conference on system sciences, pp 1–10

52. Krumm J (2008) A survey of computational location privacy. Pers Ubiquit Comput 13(6):391–399

53. Küpper A, Treu G (2010) Next generation location-based services: merging positioning and web 2.0. In: Yang LT, Waluyo AB, Ma J, Tan L, Srinivasan B (eds) Mobile intelligence. Wiley Inc, Hoboken, pp 213–236

54. Landau R, Werner S (2012) Ethical aspects of using GPS for tracking people with dementia: recommendations for practice. Int Psychogeriatr 24(3):358–366

55. Leppäniemi M, Karjaluoto H (2005) Factors influencing consumers’ willingness to accept mobile advertising: a conceptual model. Int. J Mobile Commun 3(3):197–213

56. Loc8tor Ltd (2011) Loc8tor Plus. http://www.loc8tor.com/childcare/. Accessed 9 Feb 2012

57. Lyon D, Marmura S, Peroff P (2005) Location technologies: mobility, surveillance and privacy (a report to the Office of the Privacy Commissioner of Canada under the Contributions Program). The Surveillance Project, Queen’s University, Canada. www.sscqueens.org/sites/default/files/loctech.pdf

58. Mason RO (1986) Four ethical challenges in the information age. MIS Q 10(1):4–12

59. Mayer RN (2003) Technology, families, and privacy: can we know too much about our loved ones? J Consum Policy 26:419–439

60. Michael K, Masters A (2006) The advancement of positioning technologies in defense intelligence. In: Abbass H, Essam D (eds) Applications of information systems to homeland security and defense. Idea Publishing Group, United States, pp 196–220

61. Michael K, McNamee A, Michael MG (2006) The emerging ethics of humancentric GPS tracking and monitoring. International conference on mobile business. IEEE Computer Society, Copenhagen, Denmark, pp 1–10

62. Michael K, McNamee A, Michael MG, Tootell H (2006) Location-based intelligence—modeling behavior in humans using GPS. IEEE international symposium on technology and society. IEEE, New York, United States, pp 1–8

63. Michael K, Clarke R (2012) Location privacy under dire threat as Uberveillance stalks the streets. In: Precedent (Focus on Privacy/FOI), vol 108, pp 1–8 (online version) & 24–29 (original article). http://works.bepress.com/kmichael/245/

64. Neltronics 2012 (2012) Fleetfinder PT2 Personal Tracker. http://www.fleetminder.com.au/gps-systems/fleetfinder+PT2. Accessed 9 Feb 2012

65. Nissenbaum H (2010) Privacy in context: technology, policy, and the integrity of social life. Stanford Law Books, Stanford 288 pp

66. O’Connor PJ, Godar SH (2003) Chapter XIII. We know where you are: the ethics of LBS advertising. In: Mennecke BE, Strader TJ (eds) Mobile commerce: technology, theory and applications. Idea Group Publishing, Hershey, pp 245–261

67. Office of the Victorian Privacy Commissioner 2009 (2010) Privacy impact assessments: a single guide for the Victorian public sector. www.privacy.vic.gov.au/privacy/web.nsf/content/guidelines. Accessed 3 March 2010

68. Patel DP (2004) Should teenagers get Lojacked against their will? An argument for the ratification of the United Nations convention on the rights of the child. Howard L J 47(2):429–470

69. Perusco L, Michael K (2005) Humancentric applications of precise location based services. IEEE international conference on e-business engineering. IEEE Computer Society, Beijing, China, pp 409–418

70. Perusco L, Michael K (2007) Control, trust, privacy, and security: evaluating location-based services. IEEE Technol Soc Mag 26(1):4–16

71. Perusco L, Michael K, Michael MG (2006) Location-based services and the privacy-security dichotomy. In: Proceedings of the 3rd international conference on mobile computing and ubiquitous networking, London, UK. Information Processing Society of Japan, pp. 91–98

72. Privacy International (2007) Overview of privacy. www.privacyinternational.org/article.shtml?cmd[347]=x-347-559062. Accessed 3 Dec 2009

73. Quinn MJ (2006) Ethics for the information age, 2nd edn. Pearson/Addison-Wesley, Boston 484 pp

74. Raab CD (1999) Chapter 3. From balancing to steering: new directions for data protection. In: Bennett CJ, Grant R (eds) Visions of privacy: policy choices for the digital age. University of Toronto Press, Toronto, pp 68–93

75. Raper J, Gartner G, Karimi HA, Rizos C (2007) Applications of location-based services: a selected review. J Locat Based Serv 1(2):89–111

76. Richards NM, Solove DJ (2007) Privacy’s other path: recovering the law of confidentiality. Georgetown Law J 96:123–182

77. Schreiner K (2007) Where We At? Mobile phones bring GPS to the masses. IEEE Comput Graph Appl 2007:6–11

78. Sheng H, Fui-Hoon Nah F, Siau K (2008) An experimental study on ubiquitous commerce adoption: impact of personalization and privacy concerns. J Assoc Inf Syst 9(6):344–376

79. Smith GD (2006) Private eyes are watching you: with the implementation of the E-911 Mandate, Who will watch every move you make? Federal Commun Law J 58:705–726

80. Solove DJ (2006) A taxonomy of privacy. Univ Pennsylvania Law Rev 154(3):477–557

81. Solove DJ (2007) I’ve Got Nothing to Hide’ and other misunderstandings of privacy. San Diego Law Rev 44:745–772

82. Solove DJ (2008) Data mining and the security-liberty debate. Univ Chicago Law Rev 74:343–362

83. Steinfield C (2004) The development of location based services in mobile commerce. In: Priessl B, Bouwman H, Steinfield C (eds) E-life after the Dot.Com bust. www.msu.edu/~steinfie/elifelbschap.pdf, pp 1–15

84. Tang KO, Lin J, Hong J, Siewiorek DP, Sadeh N (2010) Rethinking location sharing: exploring the implications of social-driven vs. purpose-driven location sharing. In: UbiComp 2010, Sep 26–Sep 29, Copenhagen, Denmark, pp 1–10

85. Tatli EI, Stegemann D, Lucks S (2005) Security challenges of location-aware mobile business. In: The Second IEEE international workshop on mobile commerce and services, 2005. WMCS ‘05, pp 1–10

86. Tootell H (2007) The social impact of using automatic identification technologies and location-based services in national security. PhD Thesis, School of Information Systems and Technology, Informatics, University of Wollongong

87. Tsai JY, Kelley PG, Drielsma PH, Cranor LF, Hong J, Sadeh N (2009) Who's viewed you? The impact of feedback in a mobile location-sharing application. In: CHI 2009, April 3–9, 2009, Boston, Massachusetts, USA, pp 1–10

88. Wang S, Min J, Yi BK (2008) Location based services for mobiles: technologies and standards (Presentation). In: IEEE ICC 2008, Beijing, pp 1–123

89. Warren S, Brandeis L (1890) The right to privacy. Harvard Law Rev 4:193–220

90. Westin AF (1967) Privacy and freedom. Atheneum, New York 487 pp

91. Westin AF (2003) Social and political dimensions of privacy. J Social Issues 59(2):431–453

92. Wigan M, Clarke R (2006) Social impacts of transport surveillance. Prometheus 24(4):389–403

93. Wright T (2004) Security, privacy and anonymity. Crossroads 11:1–8

94. Xu H, Luo X, Carroll JM, Rosson MB (2011) The personalization privacy paradox: an exploratory study of decision making process for location-aware marketing. Decis Support Syst 51(2011):42–52

95. Xu H, Teo HH, Tan BYC, Agarwal R (2009) The role of push-pull technology in privacy calculus: the case of location-based services. J Manage Inf Syst 26(3):135–173

Citation: Abbas R., Michael K., Michael M.G. (2015) "Location-Based Privacy, Protection, Safety, and Security." In: Zeadally S., Badra M. (eds) Privacy in a Digital, Networked World. Computer Communications and Networks. Springer, Cham, DOI: https://doi.org/10.1007/978-3-319-08470-1_16

Social Implications of Technology: The Past, the Present, and the Future

Abstract

The social implications of a wide variety of technologies are the subject matter of the IEEE Society on Social Implications of Technology (SSIT). This paper reviews the SSIT's contributions since the Society's founding in 1982, and surveys the outlook for certain key technologies that may have significant social impacts in the future. Military and security technologies, always of significant interest to SSIT, may become more autonomous with less human intervention, and this may have both good and bad consequences. We examine some current trends such as mobile, wearable, and pervasive computing, and find both dangers and opportunities in these trends. We foresee major social implications in the increasing variety and sophistication of implant technologies, leading to cyborgs and human-machine hybrids. The possibility that the human mind may be simulated in and transferred to hardware may lead to a transhumanist future in which humanity redesigns itself: technology would become society.

SECTION I. Introduction

“Scientists think; engineers make.” Engineering is fundamentally an activity, as opposed to an intellectual discipline. The goal of science and philosophy is to know; the goal of engineering is to do something good or useful. But even in that bare-bones description of engineering, the words “good” and “useful” have philosophical implications.

Because modern science itself has existed for only 400 years or so, the discipline of engineering in the sense of applying scientific knowledge and principles to the satisfaction of human needs and desires is only about two centuries old. But for such a historically young activity, engineering has probably done more than any other single human development to change the face of the material world.

It took until the mid-20th century for engineers to develop the kind of self-awareness that leads to thinking about engineering and technology as they relate to society. Until about 1900, most engineers felt comfortable in a “chain-of-command” structure in which the boss—whether it be a military commander, a corporation, or a wealthy individual—issued orders that were to be carried out to the best of the engineer's technical ability. Fulfillment of duty was all that was expected. But as the range and depth of technological achievements grew, engineers, philosophers, and the public began to realize that we had all better take some time and effort to think about the social implications of technology. That is the purpose of the IEEE Society on Social Implications of Technology (SSIT): to provide a forum for discussion of the deeper questions about the history, connections, and future trends of engineering, technology, and society.

This paper is not focused on the history or future of any particular technology as such, though we will address several technological issues in depth. Instead, we will review the significant contributions of SSIT to the ongoing worldwide discussion of technology and society, and how technological developments have given rise to ethical, political, and social issues of critical importance to the future. SSIT is the one society in IEEE where engineers and allied professionals are encouraged to be introspective—to think about what they are doing, why they are doing it, and what effects their actions will have. We believe the unique perspective of SSIT enables us to make a valuable contribution to the panoply of ideas presented in this Centennial Special Issue of the Proceedings of the IEEE.

 

SECTION II. The Past

A. Brief History of SSIT

SSIT as a technical society in IEEE was founded in 1982, after a decade as the Committee on Social Responsibility in Engineering (CSRE). In 1991, SSIT held its first International Symposium on Technology and Society (ISTAS), in Toronto, ON, Canada. Beginning in 1996, the Symposium has been held annually, with venues intentionally located outside the continental United States every few years in order to increase international participation.

SSIT total membership was 1705 as of December 2011. Possibly because SSIT does not focus exclusively on a particular technical discipline, it is rare that SSIT membership is a member's primary connection to IEEE. As SSIT's parent organization seeks ways to increase its usefulness and relevance to the rapidly changing engineering world of the 21st century, SSIT will both chronicle and participate in the changes taking place both in engineering and in society as a whole. For a more detailed history of the first 25 years of SSIT, see [1].

B. Approaches to the Social Implications of Technology

In the historical article referred to above [1], former SSIT president Clint Andrews remarked that there are two distinct intellectual approaches which one can take with regard to questions involving technology and society. The CSRE and the early SSIT followed what he calls the “critical science” approach, which “tends to focus on the adverse effects of science and technical change.” Most IEEE societies are organized around a particular set of technologies. The underlying assumption of many in these societies is that these particular technologies are beneficial, and that the central issues to be addressed are technical, e.g., having to do with making the technologies better, faster, and cheaper. Andrews viewed this second “technological optimism” trend as somewhat neglected by SSIT in the past, and expressed the hope that a more balanced approach might attract a larger audience to the organization's publications and activities. It is important to note, however, that from the very beginning, SSIT has called for a greater emphasis on the development of beneficial technology, such as environmentally benign energy sources and more efficient electrical devices.

In considering technology in its wider context, issues that are unquestionable in a purely technical forum may become open to question. Technique A may be more efficient and a fraction of the cost of technique B in storing data with similar security provisions, but what if a managed offshore shared storage solution is not the best thing to do under a given set of circumstances? The question of whether A or B is better technologically (and economically) is thus subsumed in the larger question of whether and why the entire technological project is going to benefit anyone, and who it may benefit, and who it may harm. The fact that opening up a discussion to wider questions sometimes leads to answers that cast doubt on the previously unquestioned goodness of a given enterprise is probably behind Andrews' perception that on balance, the issues joined by SSIT have predominantly fallen into the critical-science camp. Just as no one expects the dictates of conscience to be in complete agreement with one's instinctive desires, a person seeking unalloyed technological optimism in the pages or discussions hosted by SSIT will probably be disappointed. But the larger aim is to reach conclusions about technology and society that most of us will be thankful for some day, if not today. Another aim is to ensure that we bring issues to light and propose ways forward to safeguard against negative effects of technologies on society.

C. Major Topic Areas of SSIT

In this section, we will review some (but by no means all) topics that have become recurring themes over the years in SSIT's quarterly peer-reviewed publication, the IEEE Technology & Society Magazine. The articles cited are representative only in the sense that they fall into categories that have been dealt with in depth, and are not intended to be a “best of” list. These themes fall into four broad categories: 1) war, military technology (including nuclear weapons), and security issues, broadly defined; 2) energy technologies, policies and related issues: the environment, sustainable development, green technology, climate change, etc.; 3) computers and society, information and communications technologies (ICT), cybersystems, cyborgs, and information-driven technologies; and 4) groups of people who have historically been underprivileged, unempowered, or otherwise disadvantaged: Blacks, women, residents of developing nations, the handicapped, and so on. Education and healthcare also fit in the last category because the young and the ill are in a position of dependence on those in power.

1. Military and Security Issues

Concern about the Vietnam War was a strong motivation for most of the early members of the Committee on Social Responsibility in Engineering, the predecessor organization of SSIT. The problem of how and even whether engineers should be involved in the development or deployment of military technology has continued to appear in some form throughout the years, although the end of the Cold War changed the context of the discussion. This category goes beyond formal armed combat if one includes technologies that tend to exert state control or monitoring on the public, such as surveillance technologies and the violation of privacy by various technical means. In the first volume of the IEEE Technology & Society Magazine published in 1982, luminaries such as Adm. Bobby R. Inman (ret.) voiced their opinions about Cold War technology [2], and the future trend toward terrorism as a major player in international relations was foreshadowed by articles such as “Technology and terrorism: privatizing public violence,” published in 1991 [3]. Opinions voiced in the Magazine on nuclear technology ranged from Shanebrook's 1999 endorsement of a total global ban on nuclear weapons [4] to Andrews' thorough review of national responses to energy vulnerability, in which he pointed out that France has developed an apparently safe, productive, and economical nuclear-powered energy sector [5]. In 2009, a special section of five articles appeared on the topic of lethal robots and their implications for ethical use in war and peacekeeping operations [6]. And in 2010, the use of information and communication technologies (ICT) in espionage and surveillance was addressed in a special issue on “Überveillance,” defined by authors M.G. Michael and K. Michael as the use of electronic means to track and gather information on an individual, together with the “deliberate integration of an individual's personal data for the continuous tracking and monitoring of identity and location in real time” [7].

2. Energy and Related Technologies and Issues

From the earliest years of the Society, articles on energy topics such as alternative fuels appeared in the pages of the IEEE Technology & Society Magazine. A 1983 article on Brazil's then-novel effort to supplement imported oil with alcohol from sugarcane [8] presaged today's controversial U.S. federal mandate for the ethanol content in motor fuels. The Spring 1984 issue hosted a debate on nuclear power generation between H. M. Gueron, director of New York's Con Edison Nuclear Coal and Fuel Supply division at the time [9], and J. J. MacKenzie, a senior staff scientist with the Union of Concerned Scientists [10]. Long before greenhouse gases became a household phrase bandied about in debates between Presidential candidates, the Magazine published an article examining the need to increase the U.S.'s peak electrical generating capacity because the increase in average temperature due to increasing atmospheric carbon dioxide would increase the demand for air conditioning [11]. The larger implications of global warming apparently escaped the attention of the authors, focused as they were on the power-generating needs of the state of Minnesota. By 1990, the greenhouse effect was of sufficient concern to show up on the legislative agendas of a number of nations, and although Cruver attributed this to the “explosion of doomsday publicity,” he assessed the implications of such legislation for future energy and policy planning [12]. Several authors in a special issue on the social implications of systems concepts viewed the Earth's total environment in terms of a complex system in 2000 [13]. The theme of ISTAS 2009 was the social implications of sustainable development, and this theme was addressed in six articles in the resulting special issue of the IEEE Technology & Society Magazine for Fall 2010.
The record of speculation, debate, forecasting, and analysis sampled here shows that not only has SSIT carried out its charter by examining the social implications of energy technology and related issues, but also it has shown itself a leader and forerunner in trends that later became large-scale public debates.

3. Computing, Telecommunications, and Cyberspace

Fig. 1. BRLESC-II computer built by U.S. Army personnel for use at the Ballistics Research Lab, Aberdeen Proving Grounds between about 1967 and 1978, A. V. Kurian at console. Courtesy of U.S. Army Photos.

In the early years of SSIT, computers were primarily huge mainframes operated by large institutions (Fig. 1). But with the personal computer revolution and especially the explosion of the Internet, SSIT has done its part to chronicle and examine the history, present state, and future trends of the hardware, software, human habits and interactions, and the complex of computer and communications technologies that are typically subsumed under the acronym of ICT.

As we now know, the question of intellectual property has been vastly complicated by the ready availability of peer-to-peer software, high-speed network connections, and legislation passed to protect such rights. In a paper published in 1998, Davis addressed the question of protection of intellectual property in cyberspace [14]. As the Internet grew, so did the volume of papers on all sorts of issues it raised, from the implications of electronic profiling [15] to the threats and promises of facial recognition technology [16]. One of the more forward-looking themes addressed in the pages of the Magazine came in 2005 with a special issue on sustainable pervasive computing [17]. This issue provides an example of how both the critical science and the technological optimism themes cited by Andrews above can be brought together in a single topic. And to show that futuristic themes are not shirked by the IEEE Technology and Society Magazine authors, in 2011 Clarke speculated in an article entitled “Cyborg rights” on the limits and problems that may come as people physically merge with increasingly advanced hardware (implanted chips, sensory enhancements, and so on) [18].

4. Underprivileged Groups

Last but certainly not least, the pages of the IEEE Technology & Society Magazine have hosted articles inspired by the plight of underprivileged peoples, broadly defined. This includes demographic groups such as women and ethnic minorities and those disadvantaged by economic issues, such as residents of developing countries. While the young and the ill are not often formally recognized as underprivileged in the conventional sense, in common with other underprivileged groups they need society's help in order to survive and thrive, in the form of education and healthcare, respectively. An important subset of education is the theme of engineering ethics, a subject of vital interest to many SSIT members and officials since the organization's founding.

In its first year, the Magazine carried an article on ethical issues in decision making [19]. A special 1998 issue on computers and the Internet as used in the K-12 classroom explored these matters in eight focused articles [20]. The roles of ethics and professionalism in the personal enjoyment of engineering was explored by Florman (author of the book The Introspective Engineer) in an interview with the Magazine's managing editor Terri Bookman in 2000 [21]. An entire special issue was devoted to engineering ethics in education the following year, after changes in the U.S. Accreditation Board for Engineering and Technology's policies made it appear that ethics might receive more attention in college engineering curricula [22].

The IEEE Technology & Society Magazine has hosted many articles on the status of women, both as a demographic group and as a minority in the engineering profession. Articles and special issues on themes involving women have on occasion been the source of considerable controversy, even threatening the organization's autonomy at one point [1, p. 9]. In 1999, ISTAS was held for the first time in conjunction with two other IEEE entities: the IEEE Women in Engineering Committee and the IEEE History Center. The resulting special issue that came out in 2000 carried articles as diverse as the history of women in the telegraph industry [23], the challenges of being both a woman and an engineering student [24], and two articles on technology and the sex industry [25], [26].

Engineering education in a global context was the theme of a Fall 2005 special issue of the IEEE Technology and Society Magazine, and education has been the focus of several special issues and ISTAS meetings over the years [27]–[28][29]. The recent development termed “humanitarian engineering” was explored in a special issue only two years ago, in 2010 [30]. Exemplified by the U.S.-based Engineers without Borders organization, these engineers pursue projects, and sometimes careers, based not only on profit and market share, but also on the degree to which they can help people who might not otherwise benefit from their engineering talents.

SECTION III. The Present

Fig. 2. Cow bearing an Australian National Livestock Identification System (NLIS) RFID tag on its ear. The cow's identity is automatically detected as it goes through the drafting gates and the appropriate feed is provided for the cow based on historical data on its milk yields. Courtesy of Adam Trevarthen.

Emerging technologies that will act to shape the next few years are complex in their makeup with highly meshed value chains that resemble more a process or service than an individual product [31]. At the heart of this development is convergence: convergence in devices, convergence in applications, convergence in content, and convergence in infrastructure. The current environment is typified by the move toward cloud computing solutions and Web 2.0 social media platforms with ubiquitous access via a myriad of mobile or fixed devices, some of which will be wearable on people and animals (Fig. 2) or embedded in systems (e.g., vehicles and household appliances).

Simultaneous with these changes is the emergence of web services that may or may not require a human operator for decision making in a given business process, reliance upon data streams from automatic identification devices [e.g., radio-frequency identification (RFID) tags], the accuracy and reliability of location-based services [e.g., using Global Positioning Systems (GPS)], and condition monitoring techniques (e.g., using sensors to measure temperature or other physiological data). Most of this new technology will be invisibly located in miniaturized semiconductors, which are set to reach such economies of scale that technology evangelists commonly predict every single living and nonliving thing will come equipped with a chip “on board.”

Fig. 3. Business woman checking in for an interstate trip using an electronic ticket sent to her mobile phone. Her phone also acts as a mobile payment mechanism and has built-in location services features. Courtesy of NXP Semiconductors 2009.

The ultimate vision of a Web of Things and People (WoTaP)—smart homes using smart meters, smart cars using smart roads, smart cities using smart grids—is one where pervasive and embedded systems will play an active role toward sustainability and renewable energy efficiency. The internetworked environment will need to be facilitated by a fourth-generation mobility capability which will enable even higher amounts of bandwidth to the end user as well as seamless communication and coordination by intelligence built into the cloud. Every smart mobile transaction will be validated by a precise location and linked back to a subject (Fig. 3).

In the short term, some of the prominent technologies that will impact society will be autonomous computing systems with built-in ambient intelligence which will amalgamate the power of web services and artificial intelligence (AI) through multiagent systems, robotics, and video surveillance technologies (e.g., even the use of drones) (Fig. 4). These technologies will provide advanced business and security intelligence. While these systems will lead to impressive uses in green initiatives and in making direct connections between people and dwellings, people and artifacts, and even people and animals, they will require end users to give up personal information related to identity, place, and condition to be drawn transparently from smart devices.

Fig. 4. A facial recognition system developed by Argus Solutions in Australia. Increasingly facial recognition systems are being used in surveillance and usually based on video technology. Digital images captured from video or still photographs are compared with other precaptured images. Courtesy of Argus Solutions 2009.

The price of all of this will be that very little remains private any longer. While the opportunities that present themselves with emerging technologies are enormous with a great number of positive implications for society—for instance, a decrease in the number of traffic accidents and fatalities, a reduction in the carbon emission footprint by each household, greater social interconnectedness, etc.—ultimately these gains too will be susceptible to limitations. Who the designated controller is and what they will do with the acquired data is something we can only speculate about. We return then, to the perennial question of “who will guard the guards themselves”: Quis custodiet ipsos custodes? [32]

A. Mobile and Pervasive Computing

In our modern world, data collection from many of our most common activities begins the moment we step out our front door in the morning and continues until we go to sleep at night. In addition to this near-continual data collection, we have become a society that voluntarily broadcasts a great deal of personal information to the world. Vacation photos, major life events, and trivialities ranging from where we are having dinner to our most mundane thoughts all form part of the stream of data through which we electronically share our inner lives. The combination of the data collected about us and the data freely shared by us could form a breathtakingly detailed picture of an individual's life, if it could ever all be collected in one place. Most of us would consider ourselves fortunate that, historically, most of this data was never correlated and was usually highly anonymized. In general, however, it is becoming easier to correlate and deanonymize data sets.
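How correlation defeats anonymization can be made concrete with a small sketch. All datasets, field names, and records below are invented for illustration; the technique shown — joining a "de-identified" dataset to a public one on shared quasi-identifiers such as postcode, birth year, and sex — is the classic record-linkage attack:

```python
# Hypothetical "anonymized" records: names stripped, quasi-identifiers kept.
health_records = [
    {"zip": "2522", "birth_year": 1970, "sex": "F", "diagnosis": "asthma"},
    {"zip": "2500", "birth_year": 1985, "sex": "M", "diagnosis": "flu"},
]

# Hypothetical public dataset (e.g., a voter roll) that includes names.
voter_roll = [
    {"name": "J. Doe",   "zip": "2522", "birth_year": 1970, "sex": "F"},
    {"name": "A. Smith", "zip": "2500", "birth_year": 1985, "sex": "M"},
]

def link(records, roll):
    """Re-identify each record whose quasi-identifiers match exactly one person."""
    linked = []
    for rec in records:
        matches = [v for v in roll
                   if (v["zip"], v["birth_year"], v["sex"]) ==
                      (rec["zip"], rec["birth_year"], rec["sex"])]
        if len(matches) == 1:  # a unique match de-anonymizes the record
            linked.append((matches[0]["name"], rec["diagnosis"]))
    return linked

print(link(health_records, voter_roll))
```

Neither dataset alone names a patient; joined on three mundane attributes, every record here re-acquires a name, which is why "anonymized" location and transaction data offers weaker protection than it appears to.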

1. Following Jane Doe's Digital Data Trail

Let us consider a hypothetical “highly tracked” individual [33]. Our Jane Doe leaves for work in the morning, and gets in her Chevrolet Impala, which has OnStar service to monitor her car. OnStar will contact emergency services if Jane has an accident, but will also report to the manufacturer any accident or mechanical failure the car's computer is aware of [34]. Jane commutes along a toll road equipped with electronic toll collection (ETC). The electronic toll system tracks where and at what time Jane enters and leaves the toll road (Fig. 5).

Fig. 5. Singapore's Electronic Road Pricing (ERP) system. The ERP uses a dedicated short-range radio communication system to deduct ERP charges from CashCards. These are inserted in the in-vehicle units of vehicles before each journey. Each time vehicles pass through a gantry when the system is in operation, the ERP charges are automatically deducted. Courtesy of Katina Michael 2003.

When she gets to work, she uses a transponder ID card to enter the building she works in (Fig. 6), which logs the time she enters and by what door. She also uses her card to log into the company's network for the morning. Her company's Internet firewall software monitors any websites she visits. At lunch, she eats with colleagues at a local restaurant. When she gets there, she “checks in” using a geolocation application on her phone—for doing so, the restaurant rewards her with a free appetizer [35].

 

Fig. 6. Employee using a contactless smart card to gain entry to her office premises. The card is additionally used to access elevators in the building, rest rooms, and secure store areas, and is the only means of logging into the company intranet. Courtesy of NXP Semiconductors 2009.

She then returns to work for the afternoon, again using her transponder ID badge to enter. After logging back into the network, she posts a review of the restaurant on a restaurant review site, or maybe a social networking site. At the end of the work day, Jane logs out and returns home along the same toll road, stopping to buy groceries at her local supermarket on the way. When she checks out at the supermarket, she uses her customer loyalty card to automatically use the store's coupons on her purchases. The supermarket tracks Jane's purchases so it can alert her when things she buys regularly are on sale.

During Jane's day, her movements were tracked by several different systems. During almost all of the time she spent out of the house, her movements were being followed. But Jane "opted in" to almost all of that tracking; it was her choice, as the benefits she received outweighed her perceived costs. The toll collection transponder in her car allows her to spend less time in traffic [36]. She is happy to share her buying habits with various merchants because those merchants reward her for doing so [37]. In this world it is all about building up bonus points and getting rewarded. Sharing her opinions on review and social networking sites lets Jane keep in touch with her friends and lets them know what she is doing.

While many of us might choose to allow ourselves to be monitored for the individual benefits that accrue to us personally, the data being gathered about collective behaviors are much more valuable to business and government agencies. In the 1980s, Clarke developed the notion of dataveillance to give a name to the "systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons" [38]. ETC is used by millions of people in many countries. The more people who use it, as opposed to paying tolls at tollbooths, the faster traffic can flow for everyone. Everyone also benefits when ETC allows engineers to better monitor traffic flows and plan highway construction to avoid the busiest times of traffic. Geolocation applications let businesses reward first-time and frequent customers, and they can follow traffic to their business and see what customers do and do not like. Businesses such as grocery stores or drug stores that use customer loyalty cards are able to monitor buying trends to see what is popular and when. Increasingly, shoppers are being introduced to the near-field communication (NFC) capability on their third-generation (3G) smartphones (Fig. 7).

Fig. 7. Purchasing grocery items effortlessly by using the near-field communication (NFC) capability on your 3G smartphone. Courtesy of NXP Semiconductors 2009.

Some of these constant monitoring tools are truly personal and are controlled by and report back only to the user [39]. For example, there are now several adaptive home thermostat systems that learn a user's temperature preferences over time and allow users to track their energy usage and change settings online. For the health conscious, "sleep monitoring" systems allow users to track not only the hours of sleep they get per night, but also the percentage of time spent in light sleep versus rapid eye movement (REM) sleep, and their overall "sleep quality" [40].
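A thermostat that "learns a user's temperature preferences over time" can be approximated with a very simple learning rule. The sketch below is a hypothetical illustration, not any vendor's actual algorithm: it keeps one setpoint per hour of day and blends in the user's manual overrides with an exponential moving average.

```python
class AdaptiveThermostat:
    """Learns a per-hour temperature preference from manual overrides."""

    def __init__(self, default=20.0, alpha=0.3):
        self.alpha = alpha                  # learning rate for new overrides
        self.setpoints = [default] * 24     # one learned setpoint per hour of day

    def record_override(self, hour, temp):
        """Blend a manual adjustment into the learned setpoint for that hour."""
        old = self.setpoints[hour]
        self.setpoints[hour] = (1 - self.alpha) * old + self.alpha * temp

    def target(self, hour):
        return self.setpoints[hour]

t = AdaptiveThermostat()
for _ in range(10):              # the user repeatedly turns up the heat at 7 a.m.
    t.record_override(7, 22.0)
print(round(t.target(7), 1))     # drifts from the 20.0 default toward 22.0
```

Because each hour is tracked separately, the same device also yields the per-hour usage record that the user can review online.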

Fig. 8. Barcodes printed on individual packaged items on pallets. Order information is shown on the forklift's on-board laptop and the driver scans items that are being prepared for shipping using a handheld gun to update inventory records wirelessly. Courtesy AirData Pty Ltd, Motorola Premier Business Partner, 2009.

Businesses offer and customers use various mobile and customer tracking services because the offer is valued by both parties (Fig. 8). However, serious privacy and legal issues continue to arise [41]. ETC records have been subpoenaed in both criminal and civil cases [42]. Businesses in liquidation have sold their customer databases, violating the privacy agreements they gave to their customers when they were still in business. Geolocation services and social media that show a user's location or allow them to share where they have been or where they are going can be used in court cases to confirm or refute alibis [43].

 

Near-constant monitoring and reporting of our lives will only grow as our society becomes increasingly comfortable sharing more and more personal details (Fig. 9). In addition to the basic human desire to tell others about ourselves, information about our behavior as a group is hugely valuable to both governments and businesses. The benefits to individuals and to society as a whole are great, but the risks to privacy are also significant [44]. More information about group behaviors can let us allocate resources more efficiently, plan better for future growth, and generate less waste. More information about our individual patterns can allow us to do the same thing on a smaller scale—to waste less fuel heating our homes when there is no one present, or to better understand our patterns of human activity.

 

Fig. 9. A five step overview of how the Wherify location-based service works. The information retrieved by this service included a breadcrumb of each location (in table and map form), a list of time and date stamps, latitude and longitude coordinates, nearest street address, and location type. Courtesy of Wherify Wireless Location Services, 2009.

 

B. Social Computing

When we think of human evolution, we often think of biological adaptations to better survive disease or digest foods. But our social behaviors are also a product of evolution. Being able to read facial expressions and other nonverbal cues is an evolved trait and an essential part of human communication. In essence, we have evolved as a species to communicate face to face. Our ability to understand verbal and nonverbal cues has been essential to our ability to function in groups and therefore to our survival [45].

The emoticon came very early in the life of electronic communication. This is not surprising, given how necessary facial expressions are for giving context to written words in the casual, humor-filled atmosphere of the Internet's precursors. Many other attempts to add context to the quick, casual writing style of the Internet have been made, mostly with less success. Indeed, the problem of communication devolving from normal conversations into meaningless shouting matches has been around almost as long as electronic communication itself. More recently, the "anonymous problem"—the problem of people anonymously harassing others without fear of response or retribution—has come under discussion in online forums and communities. And of course, we have seen the recent tragic consequences of cyberbullying [46]. In general, people will be much crueler to other people online than they would ever be in person; many of our evolved social mechanisms depend on seeing and hearing who we are communicating with.

The question we are faced with is this: Given that we now exist and interact in a world that our social instincts were not evolved to handle, how will we adapt to the technology, or more likely, how will the technology we use to communicate with adapt to us? We are already seeing the beginning of that adaptation: more and more social media sites require a “real” identity tied to a valid e-mail address. And everywhere on the Internet, “reputation” is becoming more and more important [177].

Reference sites, such as Wikipedia, control access based on reputation: users gain more privileges on the site to do things such as editing controversial topics or banning other users based on their contributions to the community—writing and editing articles or contributing to community discussions. On social media and review sites, users that are not anonymous have more credibility, and again reputation is gained with time and contribution to the community.
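Reputation-gated access control of this kind reduces to comparing an earned score against per-privilege thresholds. A minimal sketch follows; the threshold values and privilege names are illustrative only, not Wikipedia's actual rules.

```python
# Illustrative thresholds: reputation points required for each privilege.
THRESHOLDS = {
    "edit_article": 0,
    "edit_controversial_topic": 500,
    "ban_user": 2000,
}

def earn(user, points):
    """Credit reputation for contributions (articles, edits, discussion)."""
    user["reputation"] = user.get("reputation", 0) + points

def can(user, privilege):
    """A user holds a privilege once their reputation meets its threshold."""
    return user.get("reputation", 0) >= THRESHOLDS[privilege]

alice = {"name": "alice", "reputation": 0}
earn(alice, 650)                                # contributions to the community
print(can(alice, "edit_controversial_topic"))   # True
print(can(alice, "ban_user"))                   # False
```

The point of the design is that privileges are earned gradually through contribution rather than granted at sign-up, which is what makes the reputation meaningful.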

It is now becoming standard practice for social media of all forms to allow users to control who can contact them and make it very easy to block unwanted contact. In the future, these trends will be extended. Any social media site with a significant amount of traffic will have a way for users to build and maintain a reputation and to control access accordingly. The shift away from anonymity is set to continue and this is also evident in the way search engine giants, like Google, are updating their privacy statements—from numerous policies down to one. Google states: “When you sign up for a Google Account, we ask you for personal information. We may combine the information you submit under your account with information from other Google services or third parties in order to provide you with a better experience and to improve the quality of our services” [47].

Fig. 10. Wearable high-definition video calling and recording attire. Courtesy of Xybernaut 2002.

When people use technology to socialize, they are often doing it on mobile platforms. Therefore, the futures of social and mobile computing are inevitably intertwined. The biggest change coming to the shared mobile/social computing space is the final spread of WiFi and high-density mobile phone networks. There are still huge geographical areas where there is no way of wirelessly connecting to the Internet or where the connection is so slow as to be unusable. As high-speed mobile Internet spreads, extra bandwidth could help alleviate the problems inherent in communicating without being able to see the other person. High-definition (HD) video calling on mobile phones will make person-to-person communications easier and more context rich (Fig. 10). HD video calling and conferencing will make everything from business meetings to long-distance relationships easier by allowing the participants to pick up on unspoken cues.

 

As more and more of our social interactions go online, the online world will be forced to adapt to our evolved human social behaviors. It will become much more like offline communication, with reputation and community standing being deeply important. True anonymity will become harder and harder to come by, as the vast majority of social media will require some proof of identity. For example, this practice is already occurring in countries like South Korea [48].

While we cannot predict all the ways in which our online interactions will become more immersive, we can say for certain that they will. The beauty of all of these changes is that it will become as easy to maintain or grow a personal relationship on the other side of the world as it is across town. As countries and regions currently without high-speed data networks come online, they can integrate into a new global community, allowing us all to know each other, with a diverse array of as-yet-unknown consequences.

C. Wearable Computing

Fig. 11. The prototype GPS Locator for Children with a built-in pager, a request for 911, GPS technology, and a key fob to manually lock and unlock the locator. This specific device is no longer being marketed, despite the apparent need in some contexts. Courtesy of Wherify Wireless Location Services, 2003.

According to Siewiorek [49, p. 82], the first wearable device was prototyped in 1961 but it was not until 1991 that the term “wearable computer” was first used by a research group at Carnegie Mellon University (Pittsburgh, PA). This coincided with the rise of the laptop computer, early models of which were known as “luggables.” Wearable computing can be defined as “anything that can be put on and adds to the user's awareness of his or her environment …mostly this means wearing electronics which have some computational power” [50, p. 2012]. While the term “wearables” is generally used to describe wearable displays and custom computers in the form of necklaces, tiepins, and eyeglasses, the definition has been broadened to incorporate iPads, iPods, personal digital assistants (PDAs), e-wallets, GPS watches (Fig. 11), and other mobile accessories such as smartphones, smart cards, and electronic passports that require the use of belt buckles or clip-on satchels attached to conventional clothing [51, p. 330]. The iPlant (Internet implant) is probably not far off either [52].

 

Wearable computing has reinvented the way we work and go about our day-to-day business and is set to make even greater changes in the foreseeable future [53]. In 2001, it was predicted that highly mobile professionals would be taking advantage of smart devices to “check messages, finish a presentation, or browse the Web while sitting on the subway or waiting in line at a bank” [54, p. 44]. This vision has indeed been realized but devices like netbooks are still being lugged around instead of worn in the true sense.

The next phase of wearables will be integrated into our very clothing and accessories, with some even pointing to the body itself being used as an input mechanism. Harrison of Carnegie Mellon's Human–Computer Interaction Institute (HCII), working with Microsoft researchers, produced Skinput, which turns the body that travels everywhere with us into one giant touchpad [55]. These are all exciting innovations, and few would deny the positives that will come from the application of this cutting-edge research. The challenge will be to avoid rushing this technology into the marketplace without commensurate testing of prototypes and due consideration of function creep. Function or scope creep occurs when a device or application is used for a purpose other than that for which it was originally intended.

Early prototypes of wearable computers throughout the 1980s and 1990s could have been described as outlandish, bizarre, or even weird. For the greater part, wearable computing efforts have focused on head-mounted displays (a visual approach) that unnaturally interfered with human vision and made proximity to others cumbersome [56, p. 171]. But the long-term aim of researchers is to make wearable computing inconspicuous as soon as technical improvements allow for it (Fig. 12). The end user should look as "normal" as possible [57, p. 177].

 

Fig. 12. Self-portraits of Mann with wearable computing kit from the 1980s to the 1990s. Prof. Mann started working on his WearComp invention as far back as his high school days in the 1970s. Courtesy of Steve Mann.

New technologies like the "Looxcie" [58] wearable recorders have come a long way since the clunky point-of-view head-mounted recording devices of the 1980s, allowing people to effortlessly record and share their life as they experience it in different contexts. Mann has aptly coined the term sousveillance, derived from the French sous (below) and veiller (to watch): a type of inverse panopticon. A whole body of literature has emerged around the notion of sousveillance, which refers to the recording of an activity by a participant in the activity, typically by way of small wearable or portable personal technologies. The glogger.mobi online platform demonstrates the great power of sousveillance. But there are still serious challenges, such as privacy concerns, that need to be overcome if wearable computing is to become commonplace [59]. Just as Google has created StreetView, can the individual engage in PersonView without his neighbor's or a stranger's consent [7], notwithstanding the public versus private space debate? Connected to privacy is also the critical issue of autonomy (and, if we were to agree with Kant, human dignity), that is, our right to make informed and uncoerced decisions.

While mass-scale commercial production of wearable clothing is still some time away, with some even calling it the unfulfilled pledge [60], shirts with simple memory functions have been developed and tested. Sensors will play a big part in the functionality of smartware, helping to determine the environmental context, and undergarments closest to the body will be used to measure body functions such as temperature, blood pressure, and heart and pulse rates. For now, however, the aim is to develop ergonomically astute wearable computing that is actually useful to the end user. Head-mounted displays attached to the head with a headband may be practical for miners carrying out occupational health and safety (OH&S) procedures but are unattractive for everyday consumer users. Displays of the next generation will be mounted or concealed within eyeglasses themselves [61, p. 48].

Mann [57, p. 31] predicts that wearable computing will become so common one day, interwoven into every day clothing-based computing, that “we will no doubt feel naked, confused, and lost without a computer screen hovering in front of our eyes to guide us,” just like we would feel our nakedness without the conventional clothing of today.

1. Wearables in the Medical Domain

Unsurprisingly, wearables have also found a niche market in the medical domain. In the mid-1990s, researchers began to describe a small wearable device that continuously monitored glucose levels so that the right amount of insulin could be calculated for the individual, reducing the incidence of hypoglycemic episodes [62]. The Glucoday [63] and GlucoChip [64] are just two products demonstrating the potential to go beyond wearables toward in vivo techniques in medical monitoring.

Medical wearables even have the capability to check and monitor products in one's blood [65, p. 88]. Today medical wearable device applications include: “monitoring of myocardial ischemia, epileptic seizure detection, drowsiness detection …physical therapy feedback, such as for stroke victim rehabilitation, sleep apnea monitoring, long-term monitoring for circadian rhythm analysis of heart rate variability (HRV)” [66, p. 44].

Some of the current shortcomings of medical wearables are similar to those of conventional wearables, namely the size and the weight of the device which can be too large and too heavy. In addition, wearing the devices for long periods of time can be irritating due to the number of sensors that may be required to be worn for monitoring. The gel applied for contact resistance between the electrode and the skin can also dry up, which is a nuisance. Other obstacles to the widespread diffusion of medical wearables include government regulations and the manufacturers' requirement for limited liability in the event that an incorrect diagnosis is made by the equipment.

But wearable products have improved greatly over the past ten years. Due to commensurate breakthroughs in the miniaturization of computing components, wearable devices are now usually quite small. Consider Toumaz Technology's Digital Plaster invention, known as the Sensium Life Pebble TZ203002 (Fig. 13). The Digital Plaster contains a Sensium silicon chip, powered by a tiny battery, which sends data via a cell phone or a PDA to a central computer database. The Life Pebble has the ability to enable continuous, auditable acquisition of physiological data without interfering with the patient's activities. The device can continuously monitor electrocardiogram (ECG), heart rate, physical activity, and skin temperature. In an interview with M. G. Michael in 2006, Toumazou noted how the Digital Plaster had been applied in epilepsy control and depression. He said that by monitoring the electrical and chemical responses they could predict the onset of either a depressive episode or an epileptic fit, and then, once predicted, the nerve could be stimulated to counter the seizure [67]. He added that this truly signified "personal healthcare."

Fig. 13. Prof. Christofer Toumazou with a patient wearing the "digital plaster"; a tiny electronic device meant to be embedded in ordinary medical plaster that includes sensors for monitoring health-related data such as blood pressure, temperature, and glucose levels. Courtesy of Toumaz Technology 2008.

 

D. Robots and Unmanned Aerial Systems and Vehicles

Fig. 14. Predator drone aircraft: this plane comes in reconnaissance and armed versions, designated RQ-1 and MQ-1, respectively.

Autonomous systems are those which are self-governed. In practice, there are many degrees of autonomy, ranging from the highly constrained and supervised to the unconstrained and intelligent. Some systems are referred to as "semiautonomous" to suggest that the machines are tasked or supervised by a human operator. An unmanned vehicle may be a remotely piloted "dumb" vehicle or an autonomous vehicle (Fig. 14). Robots may be designed to perform repetitive tasks in a highly constrained environment, or with intelligence and a high level of autonomy to make judgments in a dynamic and unpredictable environment. As technological advancements allow for higher levels of autonomy and an expansion from industrial applications to caregiving and warfighting, society is coming to grips with the present and future of increasingly autonomous systems in our homes, workplaces, and battlefields.

 

Robot ethics, particularly with respect to autonomous weapons systems, has received increasing attention in the last few years [68]. While some call for an outright stop to the development of such technology [69], others seek to shape the technology with ethical and moral implications in mind [6], [70]–[73]. Driving robotics weapons development underground or refusing to engage in dialog over the ethical issues will not give ethicists an opportunity to participate in shaping the design and use of such weapons. Arkin [6] and Operto [74], among others, argue that engineers must not shy away from these ethical challenges. Furthermore, the technological cat is out of the bag: "Autonomy is subtle in its development—it is occurring in a step-by-step process, rather than through the creation of a disruptive invention. It is far less likely that we will have a sudden development of a ‘positronic brain’ or its equivalent, but rather a continual and gradual relinquishment of authority to machines through the constant progress of science, as we have already seen in automated trains, elevators, and numerous other examples, that have vanished into the background noise of civilization. Autonomy is already here by some definitions" [70].

The evolution of the development and deployment of unmanned aerial vehicles and other autonomous or semiautonomous systems has outpaced the analysis of social implications and ethics of their design and use [70], [75]. Sullivan argues that the evolution of unmanned vehicles for military deployment should not be confused with the more general trend of increasing autonomy in military applications [75]. Use of robots often provides a tactical advantage due to sensors, data processing, and physical characteristics that outperform humans. Robots can act without emotion, bias, or self-preservation influencing judgment, which may be a liability or advantage. Risks to robot deployment in the military, healthcare industry, and elsewhere include trust of autonomous systems (a lack of, or too much) and diffusion of blame or moral buffering [6], [72].

For such critical applications in the healthcare domain, and lethal applications in weapons, the emotional and physical distance of operating a remote system (e.g., drone strikes via a video-game style interface) may negatively influence the moral decision making of the human operator or supervisor, while also providing some benefit of emotional protection against post-traumatic stress disorder [71], [72]. Human–computer interfaces can promote ethical choices in the human operator through thoughtful or model-based design, as suggested by Cummings [71] and Asaro [72].

For ethical behavior of the autonomous system itself, Arkin proposes that robot soldiers could be more humane than humans, if technologically constrained to the laws of war and rules of engagement, which they could follow without the distortions of emotion, bias, or a sense of self-preservation [6], [70]. Asaro argues that such laws are not, in fact, objective and static but rather are meant for human interpretation in each case, and therefore could not be implemented in an automated system [72]. More broadly, Operto [74] agrees that a robot (in any application) can only act within the ethics incorporated into its laws, but that a learning robot, in particular, may not behave as its designers anticipate.

Fig. 15. Kotaro, a humanoid robot created at the University of Tokyo (Tokyo, Japan), presented at the University of Arts and Industrial Design Linz (Linz, Austria) during the Ars Electronica Festival 2008. Courtesy of Manfred Werner-Tsui.

Robot ethics is just one part of the landscape of social implications for autonomous systems. The field of human–robot interaction explores how robot interfaces and socially adaptive robots influence the social acceptance, usability, and safety of robots [76] (Fig. 15). For example, robots used for social assistance and care, such as for the elderly and small children, introduce a host of new questions about social implications. Risks of developing an unhealthy attachment or losing human social contact are among the concerns raised by Sharkey and Sharkey [77]. Interface design can influence these and other risks of socially assistive robots, such as a dangerous misperception of the robot's capabilities or a compromise of privacy [78].

 

Autonomous and unmanned systems have related social implication challenges. Clear accountability and enforcing morality are two common themes in the ethical design and deployment of such systems. These themes are not unique to autonomous and unmanned systems, but perhaps the science fiction view of robots run amok raises the question “how can we engineer a future where we can benefit from these technologies while maintaining our humanity?”

 

SECTION IV. The Future

Great strides are being taken in the field of biomedical engineering: the application of engineering principles and techniques to the medical field [79]. New technologies such as prospective applications of nanotechnology, microcircuitry (e.g., implantables), and bionics will heal and give hope to many who are suffering from life-debilitating and life-threatening diseases [80]. The lame will walk again. The blind will see just as the deaf have heard. The dumb will sing. Even bionic tongues are on the drawing board. Hearts and kidneys and other organs will be built anew. The fundamental point is that society at large should be able to distinguish between positive and negative applications of technological advancements before we diffuse and integrate such innovations into our day-to-day existence.

The Bionics Institute [81], for instance, is future-focused on the possibilities of bionic hearing, bionic vision, and neurobionics, stating: “Medical bionics is not just a new frontier of medical science, it is revolutionizing what is and isn't possible. Where once there was deafness, there is now the bionic ear. And where there was blindness, there may be a bionic eye.” The Institute reaffirms its commitment to continuing innovative research and leading the way on the proposed “world-changing revolution.”

A. Cochlear Implants—Helping the Deaf to Hear

Fig. 16. Cochlear's Nucleus Freedom implant with Contour Advance electrode which is impervious to magnetic fields up to 1.5 Tesla. Courtesy of Cochlear Australia.

In 2000, more than 32 000 people worldwide already had cochlear implants [82], thanks to the global efforts of people such as Australian Professor Graeme Clark, the founder of Cochlear, Inc. [83]. Clark performed his first implant surgery on Rod Saunders's left ear at the Royal Eye and Ear Hospital in Melbourne, Australia, on August 1, 1978, when "he placed a box of electronics under Saunders's skin and a bundle of electrodes in his inner ear" [84]. By 2006, that number had grown to about 77 500 for the Nucleus implant (Fig. 16) alone, which held about 70% of the market share [85]. Today, there are over 110 000 cochlear implant recipients, with about 30 000 added annually, and their personal stories are testament enough to the ways in which new technologies can change lives dramatically for the better [86]. Cochlear implants can restore hearing to people who have severe hearing loss, a form of diagnosed deafness. Unlike a standard hearing aid, which works like an amplifier, the cochlear implant uses a microphone to pick up sound and convert it into electronic signals. The signals are sent to the microchip implant via radio frequency (RF), stimulating nerve fibers in the inner ear. The brain then interprets the signals that are transmitted via the nerves as sound.
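The signal chain just described (sound captured, split into frequency channels, and mapped to electrode stimulation levels) can be sketched in miniature. This is a toy illustration only; real speech processors use strategies such as continuous interleaved sampling with many more channels and clinically fitted loudness maps.

```python
import math

def channel_envelopes(samples, rate, bands):
    """Crude per-band energy estimate: correlate the signal with each band's
    center frequency (a stand-in for a real filterbank with envelope detection)."""
    out = []
    for lo, hi in bands:
        f = (lo + hi) / 2.0
        re = sum(s * math.cos(2 * math.pi * f * i / rate) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * f * i / rate) for i, s in enumerate(samples))
        out.append(math.hypot(re, im) / len(samples))
    return out

def stimulation_levels(envelopes, max_current_ua=1000):
    """Map band envelopes to electrode currents; the loudest band gets full scale."""
    peak = max(envelopes) or 1.0
    return [round(e / peak * max_current_ua) for e in envelopes]

rate = 16000
t = [i / rate for i in range(512)]
samples = [math.sin(2 * math.pi * 1000 * x) for x in t]   # a pure 1 kHz tone
bands = [(200, 600), (600, 1400), (1400, 3000)]           # three toy channels
print(stimulation_levels(channel_envelopes(samples, rate, bands)))
```

For the 1 kHz test tone, nearly all of the energy lands in the middle channel, so the corresponding electrode would receive the strongest stimulation, which is the essence of how place coding in the cochlea is exploited.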

 

Today, cochlear implants (which are also commonly known as bionic ears) are being used to overcome deafness; tomorrow, they may be open to the wider public as a performance-enhancing technique [87, pp. 10–11]. Audiologist Steve Otto of the Auditory Brainstem Implant Project at the House Ear Institute (Los Angeles, CA) predicts that one day "implantable devices [will] interface microscopically with parts of the normal system that are still physiologically functional" [88]. He is quoted as saying that this may equate to "ESP for everyone." Otto's prediction that implants will one day be used by persons who do not require them for remedial purposes has been supported by numerous other high-profile scientists. A major question is whether this is the ultimate trajectory of these technologies.

For Christofer Toumazou, however, Executive Director of the Institute of Biomedical Engineering, Imperial College London (London, U.K.), there is a clear distinction between repairing human functions and creating a "Superman." He said, "trying to give someone that can hear super hearing is not fine." For Toumazou, the basic ethical paradigm should be that we hope to repair the human and not recreate the human [67].

B. Retina Implants—On a Mission to Help the Blind to See

Fig. 17. Visual cortical implant designed by Prof. Mohamad Sawan, a researcher at Polystim Neurotechnologies Laboratory at the Ecole Polytechnique de Montreal (Montreal, QC, Canada). The basic principle of Prof. Sawan's technology consists of stimulating the visual cortex by implanting a silicon microchip on a network of electrodes, made of biocompatible materials, wherein each electrode injects a stimulating electrical current in order to provoke a series of luminous points to appear (an array of pixels) in the field of vision of the blind person. This system is composed of two distinct parts: the implant and an external controller. Courtesy of Mohamad Sawan 2009, made available under Creative Commons License.

The hope is that retina implants will be as successful as cochlear implants in the future [89]. Just as cochlear implants cannot be used for persons suffering from complete deafness, retina implants are not a solution for totally blind persons but rather those suffering from aged macular degeneration (AMD) and retinitis pigmentosa (RP). Retina implants have brought together medical researchers, electronic specialists, and software designers to develop a system that can be implanted inside the eye [90]. A typical retina implant procedure is as follows: “[s]urgeons make a pinpoint opening in the retina to inject fluid in order to lift a portion of the retina from the back of the eye, creating a pocket to accommodate the chip. The retina is resealed over the chip, and doctors inject air into the middle of the eye to force the retina back over the device and close the incisions” [91] (Fig. 17).

 

Brothers Alan and Vincent Chow, one an engineer and the other an ophthalmologist, developed the artificial silicon retina (ASR) and founded the company Optobionics Corporation in 1990. This was a marriage between biology and engineering: "In landmark surgeries at the University of Illinois at Chicago Medical Center …the first artificial retinas made from silicon chips were implanted in the eyes of two blind patients who have lost almost all of their vision because of retinal disease." In 1993, Branwyn [92, p. 3] reported that a team at the National Institutes of Health (NIH), led by Dr. Hambrecht, implanted a 38-electrode array into a blind woman's brain. It was reported that she saw simple light patterns and was able to make out crude letters. The following year the same procedure was conducted by another group on a blind man, resulting in him seeing a black dot with a yellow ring around it. Rizzo of Harvard Medical School's Massachusetts Eye and Ear Infirmary (Boston, MA) has cautioned that it is better to talk down the possibilities of the retina implant so as not to give false hopes. Rizzo himself has said that researchers are dealing with "science fiction stuff" and that there are no long-term guarantees that the technology will ever fully restore sight, although significant progress is being made by a number of research institutes [93, p. 5].

Among these pioneers are researchers at The Johns Hopkins University Medical Center (Baltimore, MD). Brooks [94, p. 4] describes how the retina chip developed by the medical center will work: "a kind of miniature digital camera…is placed on the surface of the retina. The camera relays information about the light that hits it to a microchip implanted nearby. This chip then delivers a signal that is fed back to the retina, giving it a big kick that stimulates it into action. Then, as normal, a signal goes down the optic nerve and sight is at least partially restored." In 2009, at the age of 56, Barbara Campbell had an array of electrodes implanted in each eye [95], and while her sight is nowhere near fully restored, she is able to make out shapes and see shades of light and dark. Experts believe that this approach is still more realistic for restoring sight to those suffering from particular types of blindness than stem cell therapy, gene therapy, or eye transplants [96], where the risks still outweigh the advantages.

C. Tapping Into the Heart and Brain

Fig. 18. An artificial pacemaker from St. Jude Medical (St. Paul, MN), with electrode, 2007. Courtesy of Steven Fruitsmaak.

If it was possible as far back as 1958 to successfully implant two transistors the size of an ice hockey puck in the heart of a 43-year-old man [97], the things that will become possible by 2020 are constrained by the imagination as much as by technological limitations. Heart pacemakers (Fig. 18) are still being further developed today, but for the greater part, researchers are turning their attention to the possibilities of brain pacemakers. In the foreseeable future, brain implants may help sufferers of Parkinson's disease, paralysis, nervous system problems, and speech impairment, and even cancer patients. The research is still in its formative years and the obstacles are great because of the complexity of the brain, but scientists are hopeful of major breakthroughs in the next 20 years.

 

The brain pacemaker endeavors are bringing together people from a variety of disciplines, headed mainly by neurosurgeons. By using brain implants electrical pulses can be sent directly to nerves via electrodes. The signals can be used to interrupt incoherent messages to nerves that cause uncontrollable movements or tremors. By tapping into the right nerves in the brain, particular reactions can be achieved. Using a technique that was discovered almost accidentally in France in 1987, the following extract describes the procedure of “tapping into” the brain: “Rezai and a team of functional neurosurgeons, neurologists and nurses at the Cleveland Clinic Foundation in Ohio had spent the next few hours electronically eavesdropping on single cells in Joan's brain attempting to pinpoint the precise trouble spot that caused a persistent, uncontrollable tremor in her right hand. Once confident they had found the spot, the doctors had guided the electrode itself deep into her brain, into a small duchy of nerve cells within the thalamus. The hope was that when sent an electrical current to the electrode, in a technique known as deep-brain stimulation, her tremor would diminish, and perhaps disappear altogether” [98]. Companies such as Medtronic Incorporated of Minnesota (Minneapolis, MN) now specialize in brain pacemakers [98]. Medtronic's Activa implant has been designed specifically for sufferers of Parkinson's disease [93].

More recently, there has been some success with ameliorating epileptic attacks through closed-loop technology, also known as smart stimulation. The implant devices can detect the onset of epileptiform activity through a demand-driven process. This means that the battery in the active implant lasts longer because of increased efficiency (it is not always stimulating in anticipation of an attack), and the adverse effects of having to remove and install new implants more frequently are avoided [99]. Similarly, it has been said that technology such as deep brain stimulation, in which physicians implant electrodes in the brain and an electrical pacemaker in the patient's clavicle for Parkinson's disease, may well be used to overcome problems with severely depressed persons [100].
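The demand-driven principle behind such closed-loop stimulators can be illustrated with a minimal sketch: stimulate only when a detector flags epileptiform-like activity, rather than continuously, so the duty cycle (and hence battery drain) drops. The amplitude-threshold detector and all figures below are invented for illustration; real devices use far more sophisticated onset-detection algorithms.

```python
# Toy sketch (not a medical algorithm): demand-driven ("closed-loop")
# stimulation fires only when a crude detector flags epileptiform-like
# activity, whereas open-loop stimulation runs all the time.

def detect_epileptiform(window, threshold=3.0):
    """Flag a signal window whose mean absolute amplitude exceeds a threshold."""
    return sum(abs(s) for s in window) / len(window) > threshold

def closed_loop_duty(signal, window_size=4):
    """Fraction of windows in which the implant actually stimulates."""
    windows = [signal[i:i + window_size]
               for i in range(0, len(signal) - window_size + 1, window_size)]
    fired = sum(1 for w in windows if detect_epileptiform(w))
    return fired / len(windows)

# Mostly quiet signal with one burst of high-amplitude activity.
signal = [0.1] * 12 + [5.0, 6.0, 5.5, 6.2] + [0.1] * 12
duty = closed_loop_duty(signal)
print(f"closed-loop duty cycle: {duty:.2f} (open-loop would be 1.00)")
```

With one burst in seven windows, the device stimulates roughly 14% of the time instead of continuously, which is the efficiency gain the text describes.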

Currently, the technology is being used to treat thousands of people who are severely depressed or suffering from obsessive compulsive disorder (OCD) and who have been unable to respond to other forms of treatment such as cognitive behavioral therapy (CBT) [101]. It is estimated that 10% of people suffering from depression do not respond to conventional methods. Although hard figures are difficult to obtain, several thousand depressed persons worldwide have had brain pacemakers installed whose software can be updated wirelessly and remotely. The trials have been based on decades of research by Prof. Helen Mayberg, from Emory University School of Medicine (Atlanta, GA), who first began studying the use of subcallosal cingulate gyrus deep brain stimulation (SCG DBS) for depression in 1990.

In her research, Mayberg has used a device that is no larger than a matchbox, with a battery-powered generator that sits in the chest and produces electric currents. The currents are sent to an area deep in the brain via tiny wires which are channeled under the skin on either side of the neck. Surprisingly, the procedure to install this type of implant requires only local anesthetic and is performed on an outpatient basis. In 2005, Mayberg told a meeting at the Science Media Centre in London: "This is a very new way to think about the nature of depression …We are not just exciting the brain, we are using electricity to retune and remodulate…We can interrupt or switch off an abnormally functioning circuit" [102].

Ongoing trials today continue to show promising results. The outcome of a 20-patient clinical trial of persons with depression treated with SCG DBS, published in 2011, showed that: "At 1 year, 11 (55%) responded to surgery with a greater than 50% reduction in 17-item Hamilton Depression Scale scores. Seven patients (35%) achieved or were within 1 point of achieving remission (scores < 8). Of note, patients who responded to surgery had a significant improvement in mood, anxiety, sleep, and somatic complaints related to the disease. Also important was the safety of the procedure, with no serious permanent adverse effects or changes in neuropsychological profile recorded" [103].

Despite the early signs that these procedures may offer long-term solutions for hundreds of thousands of people, some research scientists believe that tapping into the human brain is a long shot. The brain is commonly understood to be "wetware," and plugging hardware into this "wetware" would seem to be a type mismatch, at least according to Steve Potter, a senior research fellow in biology working at the California Institute of Technology's Biological Imaging Center (Pasadena, CA). Instead, Potter is pursuing the cranial route as a "digital gateway to the brain" [88]. Others believe that it is impossible to figure out exactly what all the millions of neurons in the brain actually do. Whether or not we eventually succeed in "reverse-engineering" the human brain, the topic of implants for both therapeutic and enhancement purposes has aroused significant controversy in the past, and promises to do so even more in the future.

D. Attempting to Overcome Paralysis

In more speculative research, surgeons believe that brain implants may be a solution for persons suffering from paralysis caused by, for example, spinal cord damage. In these instances, the nerves in the legs are still theoretically "working"; it is just that they cannot make contact with the brain which controls their movement. If somehow signals could be sent to the brain, bypassing the lesion point, it could conceivably mean that paralyzed persons regain at least part of their capability to move [104]. In 2000, Reuters [105] reported that a paralyzed Frenchman (Marc Merger) "took his first steps in 10 years after a revolutionary operation to restore nerve functions using a microchip implant…Merger walks by pressing buttons on a walking frame which acts as a remote control for the chip, sending impulses through fine wires to stimulate legs muscles…" It should be noted, however, that the system only works for paraplegics whose muscles remain alive despite damage to the nerves. Yet there are promising devices like the Bion that may one day be able to control muscle movement using RF commands [106]. Brooks [94] reports that researchers at the University of Illinois in Chicago (Chicago, IL) have "invented a microcomputer system that sends pulses to a patient's legs, causing the muscles to contract. Using a walker for balance, people paralyzed from the waist down can stand up from a sitting position and walk short distances…Another team, based in Europe…enabled a paraplegic to walk using a chip connected to fine wires in his legs." These techniques are known as functional neuromuscular stimulation systems [107]. In the case of Australian Rob Summers, who became a paraplegic after an accident, doctors implanted an epidural stimulator and electrodes into his spinal cord. "The currents mimic those normally sent by the brain to initiate movement" [108].

Others working to help paraplegics to walk again have invested time in military technology like exoskeletons [109] meant to aid soldiers in lifting greater weights, and also to protect them during battle. Ekso Bionics (Berkeley, CA), formerly Berkeley Bionics, has been conducting trials of an electronic suit in the United States since 2010. The current Ekso model will be fully independent and powered by artificial intelligence in 2012. The Ekso “provides nearly four hours of battery power to its electronic legs, which replicate walking by bending the user's knees and lifting their legs with what the company claims is the most natural gait available today” [110]. This is yet another example of how military technology has been commercialized toward a health solution [111].

E. Granting a Voice to the Speech Impaired

Speech-impairment microchip implants work differently than cochlear and retina implants. Whereas in the latter two hearing and sight are restored, in implants for speech impairment the voice is not restored; rather, an outlet for communication is created, possibly with the aid of a voice synthesizer. At Emory University, neurosurgeon Roy E. Bakay and neuroscientist Phillip R. Kennedy were responsible for critical breakthroughs early in the research. In 1998, Versweyveld [112] reported two successful implants of a neurotrophic electrode into the brains of a woman and a man who were suffering from amyotrophic lateral sclerosis (ALS) and brainstem stroke, respectively. In an incredible process, Bakay and Kennedy's device uses the patient's brain processes—thoughts, if you will—to move a cursor on a computer screen. "The computer chip is directly connected with the cortical nerve cells…The neural signals are transmitted to a receiver and connected to the computer in order to drive the cursor" [112]. This procedure has major implications for brain–computer interfaces (BCIs), especially bionics. Bakay predicted that by 2010 prosthetic devices would grant immobile patients the ability to turn on the TV just by thinking about it, and that by 2030 they would grant severely disabled persons the ability to walk independently [112], [113].

F. Biochips for Diagnosis and Smart Pills for Drug Delivery

It is not unlikely that biochips will be implanted in people at birth in the not too distant future. "They will make individual patients aware of any pre-disposition to susceptibility" [114]. That is, biochips will be used for point-of-care diagnostics and also for the identification of needed drugs, even to detect pandemic viruses and biothreats for national security purposes [115]. The way that biosensors work is that they "represent the technological counterpart of our sense organs, coupling the recognition by a biological recognition element with a chemical or physical transducer, transferring the signal to the electrical domain" [116]. Types of biosensors include enzymes, antibodies, receptors, nucleic acids, cells (using a biochip configuration), biomimetic sequences of RNA (ribonucleic acid) or DNA (deoxyribonucleic acid), and molecularly imprinted polymers (MIPs). Biochips, on the other hand, "automate highly repetitive laboratory tasks by replacing cumbersome equipment with miniaturized, microfluidic assay chemistries combined with ultrasensitive detection methodologies. They achieve this at significantly lower costs per assay than traditional methods—and in a significantly smaller amount of space. At present, applications are primarily focused on the analysis of genetic material for defects or sequence variations" [117].

With respect to treatment for illness, drug delivery will not require patients to swallow pills or take routine injections; instead, chemicals will be stored on a microprocessor and released as prescribed. The idea is known as "pharmacy-on-a-chip" and originated with scientists at the Massachusetts Institute of Technology (MIT, Cambridge, MA) in 1999 [118]. The following extract is from The Lab [119]: "Doctors prescribing complicated courses of drugs may soon be able to implant microchips into patients to deliver timed drug doses directly into their bodies."

Microchips being developed at Ohio State University (OSU, Columbus, OH) can be coated with chemical substances such as pain medication, insulin, different treatments for heart disease, or gene therapies, allowing physicians to work at a more detailed level [119]. The breakthroughs have major implications for diabetics, especially those who require insulin at regular intervals throughout the day. Researchers at the University of Delaware (Newark, DE) are working on "smart" implantable insulin pumps that may relieve people with Type 1 diabetes [120]. The delivery would be based on a mathematical model stored on a microchip and working in connection with glucose sensors that would instruct the chip when to release the insulin. The goal is for the model to be able to simulate the activity of the pancreas so that the right dosage is delivered at the right time.
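The sensor-driven dosing idea behind such a "smart" pump can be sketched in miniature: a model maps a sensed glucose reading to an insulin dose, aiming to mimic the pancreas. The proportional rule, target, gain, and safety cap below are all invented for illustration and bear no relation to any clinical dosing model.

```python
# Hypothetical sketch of sensor-driven insulin dosing (NOT a clinical model):
# dose in proportion to how far glucose is above target, capped for safety.

TARGET_MG_DL = 100.0   # desired blood glucose level (assumed target)
GAIN_U_PER_MG = 0.02   # insulin units per mg/dL above target (assumed)
MAX_BOLUS_U = 3.0      # cap on a single dose (assumed safety limit)

def insulin_dose(glucose_mg_dl):
    """Dose insulin only when glucose is above target, capped for safety."""
    excess = glucose_mg_dl - TARGET_MG_DL
    if excess <= 0:
        return 0.0                      # never dose at or below target
    return min(GAIN_U_PER_MG * excess, MAX_BOLUS_U)

# Simulated glucose sensor readings.
readings = [90, 140, 220, 400]
doses = [insulin_dose(g) for g in readings]
print(doses)  # low readings get nothing; a very high reading hits the cap
```

The point of the sketch is the closed loop: the sensor reading, not a fixed schedule, decides when and how much to release, which is what the glucose-sensor-instructed chip in the text would do.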

Fig. 19. The VeriChip microchip, the first microchip implant to be cleared by the U.S. Food and Drug Administration (FDA) for humans, is a passive microchip that contains a 16-digit number, which can be used to retrieve critical medical information on a patient from a secure online database. The company that owns the VeriChip technology is developing a microscopic glucose sensor to put on the end of the chip to eliminate a diabetic's need to draw blood to get a blood glucose reading. Courtesy of PositiveID Corporation.

Beyond insulin pumps, we are now nearing a time where automated closed-loop insulin detection (Fig. 19) and delivery will become a tangible treatment option and may serve as a temporary cure for Type 1 diabetes until stem cell therapy becomes available. "Closed-loop insulin delivery may revolutionize not only the way diabetes is managed but also patients' perceptions of living with diabetes, by reducing the burden on patients and caregivers, and their fears of complications related to diabetes, including those associated with low and high glucose levels" [121]. It is only a matter of time before these lab-centric results are replicated in real-life conditions in sufferers of Type 1 diabetes.

 

 

G. To Implant or Not to Implant, That Is the Question

There are potentially 500 000 hearing-impaired persons who could benefit from cochlear implants [122], but not every deaf person wants one [123]. "Some deaf activists…are critical of parents who subject children to such surgery [cochlear implants] because, as one charged, the prosthesis imparts 'the non-healthy self-concept of having had something wrong with one's body' rather than the 'healthy self-concept of [being] a proud Deaf'" [124]. Assistant Professor Scott Bally of Audiology at Gallaudet University (Washington, DC) has said, "Many deaf people feel as though deafness is not a handicap. They are culturally deaf individuals who have successfully adapted themselves to being deaf and feel as though things like cochlear implants would take them out of their deaf culture, a culture which provides a significant degree of support" [92]. Putting this delicate debate aside, it is here that some delineation can be made between implants that are used to treat an ailment or disability (i.e., giving sight to the blind and hearing to the deaf), and implants that may be used for enhancing human function (i.e., memory). There are some citizens, like Amal Graafstra of the United States [125], who are getting chip implants for convenience-oriented social living solutions that would instantly usher in a world of keyless entry everywhere (Fig. 20). And there are other citizens who are concerned about the direction of the human species, as credible scientists predict fully functional neural implants. "[Q]uestions are raised as to how society as a whole will relate to people walking around with plugs and wires sprouting out of their heads. And who will decide which segments of the society become the wire-heads" [92]?

 

Fig. 20. Amal Graafstra demonstrating an RFID-operated door latch application he developed. Over the RFID tag site on his left hand is a single steristrip that remained after implantation for a few days. His right hand is holding the door latch.
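The control logic behind an RFID-operated door latch of this kind is straightforward to sketch: the reader reports a tag's unique ID, and the controller actuates the latch only for enrolled IDs. The tag values and function names below are hypothetical, and a real build would of course drive actual reader and latch hardware.

```python
# Minimal sketch of RFID keyless-entry logic: unlock only for enrolled tag
# IDs. Tag values are made up; on real hardware the "unlock" branch would
# pulse the latch solenoid instead of returning a string.

AUTHORIZED_TAGS = {"e007000012345678"}   # enrolled implant tag IDs (hypothetical)

def handle_tag_read(tag_id, authorized=AUTHORIZED_TAGS):
    """Return the latch action for a presented tag."""
    if tag_id.lower() in authorized:
        return "unlock"    # enrolled tag: open the latch
    return "ignore"        # unknown tag: leave the door locked

print(handle_tag_read("E007000012345678"))  # enrolled tag -> unlock
print(handle_tag_read("deadbeefdeadbeef"))  # unknown tag -> ignore
```

Note that a passive ID of this sort is an identifier, not a secret: anyone who can read and replay the tag ID can open the door, which is one of the security concerns raised later in the chapter.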

 

SECTION V. Überveillance and Function Creep

Section IV focused on implants that were attempts at "orthopedic replacements": corrective in nature, required to repair a function that is either lying dormant or has failed altogether. Implants of the future, however, will attempt to add new "functionality" to native human capabilities, either through extensions or additions. Globally acclaimed scientists have pondered the ultimate trajectory of microchip implants [126]. The literature is admittedly mixed in its viewpoints on what will and will not be possible in the future [127].

For those of us working in the domain of implantables for medical and nonmedical applications, the message is loud and clear: implantables will be the next big thing. At first, it will be "hip to get a chip." The extreme novelty of the microchip implant will mean that early adopters will race to see how far they can push the limits of the new technology. Convenience solutions will abound [128]. Implantees will not be able to get enough of the new product, and the benefits of the technology will be touted to consumers in a myriad of ways, although these perceived benefits will not always be realized. The technology will probably be first tested where there will be the least effective resistance from the community at large, that is, on prison inmates [129], then on those suffering from dementia. These incremental steps in pilot trials and deployment are fraught with moral consequences. Prisoners cannot opt out from jails adopting tracking technology, and those suffering from cognitive disorders have not provided and could not provide their consent. From there it will conceivably not take long for it to be used on the elderly, on children, and on those suffering from clinical depression.

The functionality of the implants will range from passive ID-only to active multiapplication, and most invasive will be the medical devices that can, upon request or algorithmic reasoning, release drugs or electrically stimulate the body for mental and physical stability. There will also be a segment of the consumer and business markets who will adopt the technology for no clear reason and without too much thought, save for the fact that the technology is new and seems to be the way advanced societies are heading. This segment will probably not be overly concerned with any discernible abridgement of their human rights or the fine-print "terms and conditions" agreement they have signed, but will take an implant on the promise that they will have greater connectivity to the Internet, for example. These consumers will thrive on ambient intelligence, context-aware pervasive applications, and an augmented reality—ubiquity in every sense.

But it is certain that the new technology will also have consequences far greater than what we can presently envision. Questions about the neutrality of technology are immaterial in this new "plugged-in" order of existence. For Brin [130, p. 334], the question ultimately has to do with the choice between privacy and freedom. In his words, "[t]his is one of the most vile dichotomies of all. And yet, in struggling to maintain some beloved fantasies about the former, we might willingly, even eagerly, cast the latter away." And thus there are two possibilities, just as Brin [130] writes in his amazingly insightful book, The Transparent Society, of "the tale of two cities." Either implants embedded in humans, together with their associated infrastructure, will create a utopia where there is built-in intelligence for everything and everyone in every place, or they will create a dystopia which will be destructive and will diminish one's freedom of choice, individuality, and finally that indefinable essence which is at the core of making one feel "human." A third possibility, the middle way between these two alternatives, would seem unlikely, except for the "off the grid" dissenter.

In Section V-A, we portray some of the attractions people may feel that will draw them into the future world of implanted technologies. In Section V-B, we portray some of the problems associated with implanting technology under the skin that would drive people away from opting in to such a future.

A. The Positive Possibilities

Bearing an implant will make the individual feel special because they carry a unique ID. Each person will have one implant, which will coordinate hundreds of smaller nanodevices, but each nanodevice will have the capacity to act on its own accord. The philosophy espoused in taking an implant will be one of protection: "I bear an implant and I have nothing to hide." It will feel safe to have an implant because emergency services, for example, will be able to respond rapidly to your calls for help or to any unforeseen events that automatically log problems to do with your health.

Fewer errors are also likely to happen if you have an implant, especially with financial systems. Businesses will experience a rise in productivity as they will understand precisely how their business operates, to the nearest minute, and companies will be able to introduce significant efficiencies. Losses in back-end operations, such as the effects of product shrinkage, will diminish as goods will be followed down the supply chain from their source to their destination customer, through the distribution center and retailer.

It will take some years for the infrastructure supporting implants to grow and thrive with a substantial consumer base. The function creep will not become apparent until well after the early majority have adopted implants and downloaded and used a number of core applications to do with health, banking, and transport which will all be interlinked. New innovations will allow for a hybrid device and supplementary infrastructure to grow so powerful that living without automated tracking, location finding, and condition monitoring will be almost impossible.

B. The Existential Risks

It will take some years for the negative fallout from microchip implants to be exposed. At first only the victims of the fallout will speak out, through formal exception reports on government agency websites. The technical problems associated with implants will pertain to maintenance, updates, viruses, cloning, hacking, radiation shielding, and onboard battery problems. But the greater problems will be the impact on the physiology and mental health of the individual: new manifestations of paranoia and severe depression will lead to people continually wanting reassurance about their implant's functionality. Implant security, virus detection, and the maintenance of an error-free personal database will be among the biggest issues facing implantees. Despite this, those who believe in the implant singularity (the piece of embedded technology that will give each person ubiquitous access to the Internet) will continue to stack up points and rewards and add to their social network, choosing rather to ignore the warnings of the ultimate technological trajectory of mind control and geoslavery [131]. It will have little to do with survival of the fittest at this point, although most people will buy into the notion of an evolutionary path toward the Homo Electricus [132]: a transhumanist vision [133] that we can do away with the body and become one with the Machine, one with the Cosmos—a "nuts and bolts" Nirvana where one's manufactured individual consciousness connects with the advanced consciousness evolving from the system as a whole. In this instance, it will be the ecstatic experience of being drawn ever deeper into the electric field of the "Network."

Some of the more advanced implants will be able to capture and validate location-based data, alongside recordings (visual and audio capture). The ability to conduct überveillance via the implant will be linked to a type of blackbox recorder, as in an airplane's cockpit. Only in this case the cockpit will be the body, and the recorder will be embedded just beneath the translucent layer of the skin, to be used for memory recollection and dispute resolution. By outwardly ensuring that people are telling the full story at all times, it promises a world with no lies or claims of poor memory. Überveillance is an above and beyond, an exaggerated, an omnipresent 24/7 electronic surveillance (Fig. 21). It is a surveillance that is not only "always on" but "always with you." It is ubiquitous because the technology that facilitates it, in its ultimate implementation, is embedded within the human body. The problem with this kind of bodily invasive surveillance is that omnipresence in the "material" world will not always equate with omniscience, hence the real concern for misinformation, misinterpretation, and information manipulation [7]. While it might seem like the perfect technology to aid in real-time forensic profiling and criminalization, it will be open to abuse, just like any other technique, and more so because of the preconception that it is infallible.

 

Fig. 21. The überveillance triquetra as the intersection of surveillance, dataveillance, and sousveillance. Courtesy of Alexander Hayes.

 

SECTION VI. Technology Roadmapping

According to Andrews cited in [1], a second intellectual current within the IEEE SSIT has begun to emerge which is more closely aligned with most of the IEEE technical societies, as well as economics and business. The proponents of this mode participate in “technology foresight” and “roadmapping” activities, and view technology more optimistically, looking to foster innovation without being too concerned about its possible negative effects [1, p. 14]. Braun [134, p. 133] writes that “[f]orecasts do not state what the future will be…they attempt to glean what it might be.” Thus, one with technology foresight can be trusted insofar as their knowledge and judgment go—they may possess foresight through their grasp of current knowledge, through past experiences which inform their forecasts, and through raw intuition.

Various MIT Labs, such as the Media Lab, have been engaged in visionary research since before 1990, giving society a good glimpse of where technology might be headed some 20–30 years ahead of time. It is from such elite groups that visionaries typically emerge whose main purpose is to envision the technologies that will better our wellbeing and generally make life more productive and convenient in the future. Consider the current activities of the MIT Media Lab's Affective Computing Research Group directed by Prof. Rosalind W. Picard that is working hard on technology aids encapsulating “affect sensing” in response to the growing problem of autism [135]. The Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology. The work of Picard's group was made possible by the foundations laid by the Media Lab's predecessor researchers.

On the global technological roadmap we can now point to the following systems which are already under development but have not yet been widely diffused into the market:

  • alternative fuels heralding innovations like self-driving electric cars and ocean-powered energy, as well as the rise of biofuels;

  • the potential for 3-D printing which will revolutionize prototyping and manufacturing practices and possibly reconstruct human tissue;

  • hologram projections for videoconferencing and televisions that respond to gestures as well as pen-sized computing which will do away with keyboards and screens;

  • quantum computing and cryptography;

  • next-generation prosthetics (Fig. 22);

  • cognitive machines such as robot humanoids;

  • carbon nanotubes and nanotech computing which will make our current silicon chips look gargantuan;

  • genetic engineering breakthroughs and regenerative health treatment such as stem cell treatment;

  • electronic banking that will use not physical cash for transactions but the singularity chip (e.g., an implant);

  • ubiquitous high-speed wireless networks;

  • crowdsourced surveillance toward real-time forensic profiling and criminalization;

  • autogenerated visual life logs and location chronicles;

  • enhanced batteries that last longer;

  • body power to charge digital equipment [136];

  • brainwave-based technologies in health/gaming;

  • brain-reading technology for interrogation [137].

 

Fig. 22. Army Reserve Staff Sgt. Alfredo De Los Santos displays what the X2 microprocessor knee prosthetic can do by walking up a flight of stairs at the Military Advanced Training Center at Walter Reed Army Medical Center (Washington, DC), December 8, 2009. Patients at Walter Reed are testing next-generation prosthetics. Courtesy of the U.S. Army.

It is important to note that while these new inventions have the ability to make things faster and better for most living in more developed countries, they can act to increase the ever-widening gap between the rich and the poor. New technologies will not necessarily aid in eradicating the poverty cycle in parts of Africa and South America. In fact, new technologies can have the opposite effect—they can create an ever greater chasm in equity and access to knowledge.

Technology foresight is commonly exercised by those engaged in the act of prediction. Predictive studies are more often than not based on past and present trends, and use this knowledge to provide a roadmap of future possibilities. There is some degree of imagination in prediction, and certainly the creative element is prevalent. Predictions are not meant to be wild, but calculated wisely, with evidence showing that a given course or path is likely in the future. However, this does not mean that all predictions come true. Predictive studies can be about new inventions and new form factors, the recombination of existing innovations in new ways (hybrid architectures, for example), or the mutation of an existing innovation. Some predictive studies have heavy quantitative forecasting components that use complex models to predict the introduction of new innovations, some even based on historical data inputs.

Before an invention has been diffused into the market, scenario planning is conducted to understand how the technology might be used, who might take it up, and what percentage of society will be willing to adopt the product over time (i.e., consumption analysis). “Here the emphasis is on predicting the development of the technology and assessing its potential for adoption, including an analysis of the technology's market” [138, p. 328].

Even Microsoft founder Bill Gates [139, p. 274] accepted that his predictions might not come true. But his insights in The Road Ahead are to be commended, even though they were understandably broad. Gates wrote, “[t]he information highway will lead to many destinations. I've enjoyed speculating about some of these. Doubtless I've made some foolish predictions, but I hope not too many.” Allaby [140, p. 206] writes, “[f]orecasts deal in possibilities, not inevitabilities, and this allows forecasters to explore opportunities.”

For the greater part, forecasters raise challenging, thought-provoking issues about how existing inventions or innovations will impact society. They give scenarios for a technology's projected pervasiveness, how it may affect other technologies, what potential benefits or drawbacks it may introduce, how it will affect the economy, and much more.

Kaku [141, p. 5] has argued “that predictions about the future made by professional scientists tend to be based much more substantially on the realities of scientific knowledge than those made by social critics, or even those by scientists of the past whose predictions were made before the fundamental scientific laws were completely known.” He believes there is a growing concern among the scientific body today that predictions for the greater part come from consumers of technology rather than from those who shape and create it. Kaku is, of course, correct insofar as scientists should be consulted, since they are the ones actually making things possible after discoveries have occurred. But a balanced view encompassing the perspectives of different disciplines is necessary and extremely important.

In the 1950s, for instance, when technical experts forecasted improvements in computer technology, they envisaged even larger machines—but science fiction writers predicted microminiaturization. They “[p]redicted marvels such as wrist radios and pocket-sized computers, not because they foresaw the invention of the transistor, but because they instinctively felt that some kind of improvement would come along to shrink the bulky computers and radios of that day” (Bova, 1988, quoted in [142, p. 18]). The methodologies used as vehicles to predict in each discipline should be respected. The question of who is more correct in predicting the future is perhaps the wrong question. For example, some of Kaku's own predictions in Visions can be found in science fiction movies dating back to the 1960s.

In speculating about the next 500 years, Berry [142, p. 1] writes, “[p]rovided the events being predicted are not physically impossible, then the longer the time scale being considered, the more likely they are to come true…if one waits long enough everything that can happen will happen.”

 

SECTION VII. The Next 50 Years: Brain–Computer Interface

When Ellul [143, p. 432] in 1964 predicted the use of “electronic banks” in his book The Technological Society, he was not referring to the computerization of financial institutions or the use of automatic teller machines (ATMs). Rather, it was in the context of the possibility of the dawn of a new entity: the conjoining of man with machine. Ellul was predicting that one day knowledge would be accumulated in electronic banks and “transmitted directly to the human nervous system by means of coded electronic messages…[w]hat is needed will pass directly from the machine to the brain without going through consciousness…” As unbelievable as this man–machine complex may have sounded at the time, 45 years later visionaries are still predicting that such scenarios will be possible by the turn of the 22nd century. A large proportion of these visionaries are cyberneticists. Cybernetics is the study of nervous system controls in the brain as a basis for developing communications and controls in sociotechnical systems. Parenthetically, in some places writers continue to confuse cybernetics with robotics; the two may overlap in some instances, but they are not the same thing.

Kaku [141, pp. 112–116] observes that scientists are working steadily toward a brain–computer interface (Fig. 23). The first step is to show that individual neurons can grow on silicon and then to connect the chip directly to a neuron in an animal. The next step is to mimic this connectivity in a human, and the last is to decode millions of neurons which constitute the spinal cord in order to interface directly with the brain. Cyberpunk science fiction writers like William Gibson [144] refer to this notion as “jacking-in” with the wetware: plugging in a computer cable directly with the central nervous system (i.e., with neurons in the brain analogous to software and hardware) [139, p. 133].

 


Fig. 23. Brain–computer interface schema. (1) Pedestal. (2) Sensor. (3) Electrode. Courtesy of Balougador under creative commons license.

In terms of the current state of development we can point to the innovation of miniature wearable media, orthopedic replacements (including pacemakers), bionic prosthetic limbs, humanoid robots (i.e., robots that look human in appearance and are autonomous), and RFID implants. Traditionally, the term cyborg has been used to describe humans who have some mechanical parts or extensions. Today, however, we are on the brink of building a new sentient being, a bearer of electricity, a modern man belonging to a new race, beyond that which can be considered merely part man, part machine. We refer here to the absolute fusion of man and machine, where the subject itself becomes the object; where the toolmaker becomes one with his tools [145]. The question at this point of coalescence is how human the new species will be [146], and what are the related ethical, metaphysical, and ontological concerns? Does the evolution of the human race as recorded in history come to an end when technology can be connected to the body in a wired or wireless form?

A. From Prosthetics to Amplification


Fig. 24. Cyborg 2.0 Project. Kevin Warwick with wife Irena during the Cyborg 2.0 project. Courtesy of Kevin Warwick.

While orthopedic replacements, corrective in nature, have been around since the 1950s [147] and are required to repair a function that is either lying dormant or has failed altogether, implants of the future will attempt to add new functionality to native human capabilities, either through extensions or additions. Warwick's Cyborg 2.0 project [148], for instance, intended to prove that two persons with respective implants could communicate sensation and movement by thoughts alone. In 2002, the BBC reported that a tiny silicon square with 100 electrodes was connected to the professor's median nerve and linked to a transmitter/receiver in his forearm. Although “Warwick believe[d] that when he move[d] his own fingers, his brain [would] also be able to move Irena's” [104, p. 1], the outcome of the experiment was described at best as sending “Morse-code” messages (Fig. 24). Warwick [148] is still of the belief that a person's brain could be directly linked to a computer network [149]. Commercial players are also intent on keeping ahead, continually funding projects in this area of research.

 

If Warwick is right, then terminals like telephones would eventually become obsolete if thought-to-thought communication became possible. Warwick describes this as “putting a plug into the nervous system” [104] to be able to allow thoughts to be transferred not only to another person but to the Internet and other media. While Warwick's Cyborg 2.0 may not have achieved its desired outcomes, it did show that a form of primitive Morse-code-style nervous-system-to-nervous-system communication is realizable [150]. Warwick is bound to keep trying to achieve his project goals given his philosophical perspective. And if Warwick does not succeed, he will have at least left behind a legacy and enough stimuli for someone else to succeed in his place.

 

B. The Soul Catcher Chip

The Soul Catcher chip was conceived by former Head of British Telecom Research, Peter Cochrane. Cochrane [151, p. 2] believes that the human body is merely a carcass that serves as a transport mechanism just like a vehicle, and that the most important part of our body is our brain (i.e., mind). Similarly, Miriam English has said: “I like my body, but it's going to die, and it's not a choice really I have. If I want to continue, and I want desperately to see what happens in another 100 years, and another 1000 years…I need to duplicate my brain in order to do that” [152]. Soul Catcher is all about the preservation of a human, way beyond the point of physical debilitation. The Soul Catcher chip would be implanted in the brain, and act as an access point to the external world [153]. Consider being able to download the mind onto computer hardware and then creating a global nervous system via wireless Internet [154] (Fig. 25). Cochrane has predicted that by 2050 downloading thoughts and emotions will be commonplace. Billinghurst and Starner [155, p. 64] predict that this kind of arrangement will free up the human intellect to focus on creative rather than computational functions.

 

Fig. 25. Ray Kurzweil predicts that by 2013 supercomputer power will be sufficient for human brain functional simulation and by 2025 for human brain neural simulation for uploading. Courtesy of Ray Kurzweil and Kurzweil Technologies 2005.

Cochrane's beliefs are shared by many others engaged in the transhumanist movement (especially Extropians like Alexander Chislenko). Transhumanism (sometimes known by the abbreviations “>H” or “H+”) is an international cultural movement that consists of intellectuals who look at ways to extend life through the application of emerging sciences and technologies. Minsky [156] believes that this will be the next stage in human evolution—a way to achieve true immortality “replacing flesh with steel and silicon” [141, p. 94]. Chris Winter of British Telecom has claimed that Soul Catcher will mean “the end of death.” Winter predicts that by 2030, “[i]t would be possible to imbue a newborn baby with a lifetime's experiences by giving him or her the Soul Catcher chip of a dead person” [157]. The philosophical implications behind such movements are gigantic; they reach deep into every branch of traditional philosophy, especially metaphysics with its special concerns over cosmology and ontology.

 

SECTION VIII. The Next 100 Years: Homo Electricus

A. The Rise of the Electrophorus


Fig. 26. Drawing showing the operation of an Electrophorus, a simple manual electrostatic generator invented in 1762 by Swedish Professor Johan Carl Wilcke. Image by Amédée Guillemin (died 1893).

Microchip implants are integrated circuit devices encased in RFID transponders that can be active or passive and are implantable into animals or humans, usually in the subcutaneous layer of the skin. The human who has been implanted with a microchip that can send or receive data is an Electrophorus, a bearer of “electric” technology [158]. The Macquarie Dictionary definition of “electrophorus” is “an instrument for generating static electricity by means of induction,” and refers to an instrument used in the early years of electrostatics (Fig. 26).

 

We have repurposed the term electrophorus to apply to humans implanted with microchips. One who “bears” is in some way intrinsically or spiritually connected to that which they are bearing, in the same way an expecting mother is to the child in her womb. The root electro comes from the Greek word meaning “amber,” and phorus means to “wear, to put on, to get into” [159, p. 635]. When an Electrophorus passes through an electromagnetic zone, he/she is detected and data can be passed from an implanted microchip (or in the future directly from the brain) to a computer device.

To electronize something is “to furnish it with electronic equipment” and electrotechnology is “the science that deals with practical applications of electricity.” The term “electrophoresis” has been borrowed here, to describe the “electronic” operations that an electrophorus is involved in. McLuhan and Zingrone [160, p. 94] believed that “electricity is in effect an extension of the nervous system as a kind of global membrane.” They argued that “physiologically, man in the normal use of technology (or his variously extended body) is perpetually modified by it and in turn finds ever new ways of modifying his technology” [161, p. 117].

The term “electrophorus” seems to be much more suitable today for expressing the human-electronic combination than the term “cyborg.” “Electrophorus” distinguishes strictly electrical implants from mechanical devices such as artificial hips. It is not surprising then that these crucial matters of definition raise philosophical and sociological questions of consciousness and identity, which science fiction writers have been addressing creatively. The Electrophorus belongs to the emerging species of Homo Electricus. In its current state, the Electrophorus relies on a device being triggered wirelessly when it enters an electromagnetic field. In the future, the Electrophorus will act like a network element or node, allowing information to pass through him or her, to be stored locally or remotely, and to send and receive messages simultaneously, processing some actively and others as background tasks.

Brown [162] makes the observation that at the point of becoming an Electrophorus (i.e., a bearer of electricity), “[y]ou are not just a human linked with technology; you are something different and your values and judgment will change.” Some suspect that it will even become possible to alter the behavior of people carrying brain implants, whether the individual wills it or not. Maybury [163] believes that “[t]he advent of machine intelligence raises social and ethical issues that may ultimately challenge human existence on earth.”

B. The Prospects of Transhumanism


Fig. 27. The transhumanism symbol. Courtesy of Antonu under Creative Commons license.

Thought-to-thought communications may seem outlandish today, but it is only one of many futuristic hopes of the movement termed transhumanism. Probably the most representative organization for this movement is the World Transhumanist Association (WTA), which recently adopted the doing-business-as name of “Humanity+” (Fig. 27). The WTA's website [164] carries the following succinct statement of what transhumanism is, penned originally by Max More in 1990: “Transhumanism is a class of philosophies of life that seek the continuation and acceleration of the evolution of intelligent life beyond its currently human form and human limitations by means of science and technology, guided by life-promoting principles and values.” Whether or not transhumanism yet qualifies as a philosophy, it cannot be denied that it has produced its share of both proponents and critics.

 

Proponents of transhumanism claim that the things they want are the things everyone wants: freedom from pain, freedom from suffering, freedom from all the limitations of the human body (including mental as well as physical limitations), and ultimately, freedom from death. One of the leading authors in the transhumanist movement is Ray Kurzweil, whose 652-page book The Singularity Is Near [165] prophesies a time in the not-too-distant future when evolution will accelerate exponentially and bring to pass all of the above freedoms as “the matter and energy in our vicinity will become infused with the intelligence, knowledge, creativity, beauty, and emotional intelligence (the ability to love, for example) of our human-machine civilization. Our civilization will then expand outward, turning all the dumb matter and energy we encounter into sublimely intelligent—transcendent—matter and energy” [165, p. 389].

Despite the almost theological tone of the preceding quote, Kurzweil has established a sound track record as a technological forecaster, at least when it comes to Moore's-Law-type predictions of the progress of computing power. But the ambitions of Kurzweil [178] and his allies go far beyond next year's semiconductor roadmap to encompass the future of all humanity. If the fullness of the transhumanist vision is realized, the following achievements will come to pass:

  • human bodies will cease to be the physical instantiation of human minds, replaced by as-yet-unknown hardware with far greater computational powers than the present human brain;

  • human minds will experience, at their option, an essentially eternal existence in a world free from the present restrictions of material embodiment in biological form;

  • limitations on will, intelligence, and communication will all be overcome, so that to desire a thing or experience will be to possess it.
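The Moore's-Law-type forecasting for which Kurzweil has a sound track record reduces, at its core, to exponential extrapolation. A minimal sketch of that arithmetic (the doubling period and values below are illustrative assumptions, not Kurzweil's own figures):

```python
def extrapolate_capability(base_value, base_year, target_year,
                           doubling_period_years=2.0):
    """Project a capability forward assuming steady exponential
    (Moore's-Law-style) growth with a fixed doubling period."""
    doublings = (target_year - base_year) / doubling_period_years
    return base_value * 2 ** doublings

# With a two-year doubling period, twenty years means ten doublings,
# i.e., roughly a thousandfold (2**10 = 1024) increase in capability.
projected = extrapolate_capability(1.0, 2000, 2020)  # → 1024.0
```

The assumption doing all the work is the constant doubling period; the further the target year lies from the base year, the more sensitive the projection becomes to small errors in that single parameter, which is why long-range forecasts of this kind diverge so widely.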

The Transhumanist Declaration, last modified in 2009 [166], recognizes that these plans have potential downsides, and calls for reasoned debate to avoid the risks while realizing the opportunities. The sixth item in the Declaration, for example, declares that “[p]olicy making ought to be guided by responsible and inclusive moral vision, taking seriously both opportunities and risks, respecting autonomy and individual rights, and showing solidarity with and concern for the interests and dignity of all people around the globe.” The key phrase in this item is “moral vision.” While many self-declared transhumanists may agree on the moral vision which should guide their endeavors, the movement has also inspired some of the most vigorous and categorically critical invective to be found in the technical and public-policy literature.

Possibly the most well known of the vocal critics of transhumanism is Francis Fukuyama, a political scientist who nominated transhumanism as his choice for the world's most dangerous idea [167]. As with most utopian notions, the main problem Fukuyama sees with transhumanism is the transition between our present state and the transhumanists' future vision of completely realized eternal technological bliss (Fig. 28). Will some people be uploaded to become immortal, almost omniscient transhumans while others are left behind in their feeble, mortal, disease-ridden human bodies? Are the human goods that transhumanists say are basically the same for everyone really so? Or are they more complex and subtle than typical transhumanist pronouncements acknowledge? As Fukuyama points out in his Foreign Policy essay [167], “Our good characteristics are intimately connected to our bad ones… if we never felt jealousy, we would also never feel love. Even our mortality plays a critical function in allowing our species as a whole to survive and adapt (and transhumanists are about the last group I would like to see live forever).”

 


Fig. 28. Brain in a vat with the thought: “I'm walking outside in the sun” being transmitted to the computer. Image reproduced under the Creative Commons license.

Transhumanists themselves admit that their movement performs some of the functions of a religion when it “offers a sense of direction and purpose.” But in contrast to most religions, transhumanists explicitly hope to “make their dreams come true in this world” [168]. Nearly all transhumanist programs and proposals arise from a materialist–reductionist view of the world which assumes that the human mind is at most an epiphenomenon of the brain, all of the human brain's functions will eventually be simulated by hardware (on computers of the future), and that the experience known as consciousness can be realized in artificial hardware in essentially the same form as it is presently realized in the human body. Some of the assumptions of transhumanism are based less on facts and more on faith. Just as Christians take on faith that God revealed Himself in Jesus Christ, transhumanists take on faith that machines will inevitably become conscious.


Fig. 29. The shadow dextrous hand shakes the human hand. How technology might become society—a future agreement. Courtesy of Shadow Robot Company 2008.

In keeping with the transhumanists' call for responsible moral vision, the IEEE SSIT has been, and will continue to be, a forum where the implications for society of all sorts of technological developments can be debated and evaluated. In a sense, the transhumanist program is the ultimate technological project: to redesign humanity itself to a set of specifications, determined by us. If the transhumanists succeed, technology will become society, and the question of the social implications of technology will be moot (Fig. 29). Perhaps the best attitude to take toward transhumanism is to pay attention to their prophecies, but, as the Old Testament God advised the Hebrews, “if the thing follow not, nor come to pass…the prophet hath spoken it presumptuously…” [169].

 

 

SECTION IX. Ways Forward

In sum, identifying and predicting what the social implications of past, present and future technologies might be can lead us to act in one of four ways, which are not mutually exclusive.

First, we can take the “do nothing” approach and meekly accept the risks associated with new techniques. We stop being obsessed by both confirmed and speculative consequences and instead try to see how far the new technologies might take us and what we might become or transform into as a result. While humans might not always like change, we are by nature, if we might hijack Heraclitus, in a continual state of flux. We might reach new potentials as a populace, become extremely efficient at doing business with each other, and make a positive impact on our natural environment by doing so. The downside to this approach is that it appears to be an all-or-nothing approach with no built-in decision points. For as Jacques Ellul [170] forewarned: “what is at issue here is evaluating the danger of what might happen to our humanity in the present half-century, and distinguishing between what we want to keep and what we are ready to lose, between what we can welcome as legitimate human development and what we should reject with our last ounce of strength as dehumanization.”

The second option is that we let case law determine for us what is legal or illegal based on existing laws, or new or amended laws we might introduce as a result of the new technologies. We can take the stance that the courts are in the best position to decide on what we should and should not do with new technologies. If we break the law in a civil or criminal capacity, then there is a penalty and we have civil and criminal codes concerning workplace surveillance, telecommunications interception and access, surveillance devices, data protection and privacy, cybercrime, and so on. There is also the continual review of existing legislation by law-reform commissions and the like. New legislation can also be introduced to curb against other dangers or harms that might eventuate as a result of the new techniques.

The third option is that we can introduce industry regulations that stipulate how advanced applications should be developed (e.g., ensuring privacy impact assessments are done before commercial applications are launched), and that technical expectations on accuracy, reliability, and storage of data are met. It is also important that the right balance be found between regulations and freedom so as not to stifle the high-tech industry at large.

Finally, the fourth option would be to adopt the “Amish method”: complete abandonment of technology that has progressed beyond a certain point of development. This is in some respect “living off the grid” [171].

Although obvious, it is important to underline that none of these options are mutually exclusive or foolproof. The final solution may well be at times to introduce industry regulations or codes, at other times to do nothing, and in other cases to rely on legislative amendments despite the length of time it takes to develop these. In other cases, the safeguards may need to be built into the technology itself.

 

SECTION X. Conclusion

If we put our trust in Kurzweil's [172] Law of Accelerating Returns, we are likely headed into a great period of discovery unprecedented in any era of history. This being the case, the time for inclusive dialog is now, not after widespread diffusion of such innovations as “always on” cameras, microchip implants, unmanned drones and the like. We stand at a critical moment of decision, as the mythological Pandora did as she was about to open her box. There are many lessons to be learned from history, especially from such radical developments as the atomic bomb and the resulting arms race. Joy [173] has raised serious fears about continuing unfettered research into “spiritual machines.” Will humans have the foresight to say “no” or “stop” to new innovations that could potentially be a means to a socially destructive scenario? Implants that may prolong life expectancy by hundreds if not thousands of years may appeal at first glance, but they could well create unforeseen devastation in the form of technological viruses, plagues, or a grim escalation in the levels of crime and violence.

To many scientists of the positivist tradition, anchored solely to an empirical world view, the notion of whether something is right or wrong is in a way irrelevant. For these researchers, a moral stance has little or nothing to do with technological advancement but is really an ideological position. The extreme of this view is exemplified by an attitude of “let's see how far we can go,” not “is what we are doing the best thing for humanity?” and certainly not by the thought of “what are the long-term implications of what we are doing here?” As an example, one need only consider the mad race to clone the first animal, and many have long suspected that an “underground” scientific race to clone the first human continues.

In the current climate of innovation, particularly since the proliferation of the desktop computer and the birth of new digital knowledge systems, some observers believe that engineers, and professionals more broadly, lack accountability for the tangible and intangible costs of their actions [174, p. 288]. Because science-enabled engineering has proved so profitable for multinational corporations, they have gone to great lengths to persuade the world that science should not be stopped, for the simple reason that it will always make things better. This ignores the possibility that even seemingly small advancements into the realm of the Electrophorus, for any purpose other than medical prostheses, will have dire consequences for humanity [175]. According to Kuhns, “Once man has given technique its entry into society, there can be no curbing of its gathering influence, no possible way of forcing it to relinquish its power. Man can only witness and serve as the ironic beneficiary-victim of its power” [176, p. 94].

Clearly, none of the authors of this paper desire to stop technological advance in its tracks. But we believe that considering the social implications of past, present, and future technologies is more than an academic exercise. As custodians of the technical means by which modern society exists and develops, engineers have a unique responsibility to act with forethought and insight. The time when following the orders of a superior was all that an engineer had to do is long past. With great power comes great responsibility. Our hope is that the IEEE SSIT will help and encourage engineers worldwide to consider the consequences of their actions throughout the next century.

References

1. K. D. Stephan, "Notes for a history of the IEEE society on social implications of technology", IEEE Technol. Soc. Mag., vol. 25, no. 4, pp. 5-14, 2006.

2. B. R. Inman, "One view of national security and technical information", IEEE Technol. Soc. Mag., vol. 1, no. 3, pp. 19-21, Sep. 1982.

3. S. Sloan, "Technology and terrorism: Privatizing public violence", IEEE Technol. Soc. Mag., vol. 10, no. 2, pp. 8-14, 1991.

4. J. R. Shanebrook, "Prohibiting nuclear weapons: Initiatives toward global nuclear disarmament", IEEE Technol. Soc. Mag., vol. 18, no. 2, pp. 25-31, 1999.

5. C. J. Andrews, "National responses to energy vulnerability", IEEE Technol. Soc. Mag., vol. 25, no. 3, pp. 16-25, 2006.

6. R. C. Arkin, "Ethical robots in warfare", IEEE Technol. Soc. Mag., vol. 28, no. 1, pp. 30-33, 2009.

7. M. G. Michael, K. Michael, "Toward a state of überveillance", IEEE Technol. Soc. Mag., vol. 29, no. 2, pp. 9-16, 2010.

8. V. Baranauskas, "Large-scale fuel farming in Brazil", IEEE Technol. Soc. Mag., vol. 2, no. 1, pp. 12-13, Mar. 1983.

9. H. M. Gueron, "Nuclear power: A time for common sense", IEEE Technol. Soc. Mag., vol. 3, no. 1, pp. 3-9, Mar. 1984.

10. J. J. Mackenzie, "Nuclear power: A skeptic's view", IEEE Technol. Soc. Mag., vol. 3, no. 1, pp. 9-15, Mar. 1984.

11. E. Larson, D. Abrahamson, P. Ciborowski, "Effects of atmospheric carbon dioxide on U. S. peak electrical generating capacity", IEEE Technol. Soc. Mag., vol. 3, no. 4, pp. 3-8, Dec. 1984.

12. P. C. Cruver, "Greenhouse effect prods global legislative initiatives", IEEE Technol. Soc. Mag., vol. 9, no. 1, pp. 10-16, Mar./Apr. 1990.

13. B. Allenby, "Earth systems engineering and management", IEEE Technol. Soc. Mag., vol. 19, no. 4, pp. 10-24, Winter 2000.

14. J. C. Davis, "Protecting intellectual property in cyberspace", IEEE Technol. Soc. Mag., vol. 17, no. 2, pp. 12-25, 1998.

15. R. Brody, "Consequences of electronic profiling", IEEE Technol. Soc. Mag., vol. 18, no. 1, pp. 20-27, 1999.

16. K. W. Bowyer, "Face-recognition technology: Security versus privacy", IEEE Technol. Soc. Mag., vol. 23, no. 1, pp. 9-20, 2004.

17. D. Bütschi, M. Courant, L. M. Hilty, "Towards sustainable pervasive computing", IEEE Technol. Soc. Mag., vol. 24, no. 1, pp. 7-8, 2005.

18. R. Clarke, "Cyborg rights", IEEE Technol. Soc. Mag., vol. 30, no. 3, pp. 49-57, 2011.

19. E. Levy, D. Copp, "Risk and responsibility: Ethical issues in decision-making", IEEE Technol. Soc. Mag., vol. 1, no. 4, pp. 3-8, Dec. 1982.

20. K. R. Foster, R. B. Ginsberg, "Guest editorial: The wired classroom", IEEE Technol. Soc. Mag., vol. 17, no. 4, pp. 3, 1998.

21. T. Bookman, "Ethics professionalism and the pleasures of engineering: T&S interview with Samuel Florman", IEEE Technol. Soc. Mag., vol. 19, no. 3, pp. 8-18, 2000.

22. K. D. Stephan, "Is engineering ethics optional", IEEE Technol. Soc. Mag., vol. 20, no. 4, pp. 6-12, 2001.

23. T. C. Jepsen, "Reclaiming history: Women in the telegraph industry", IEEE Technol. Soc. Mag., vol. 19, no. 1, pp. 15-19, 2000.

24. A. S. Bix, "‘Engineeresses’ invade campus", IEEE Technol. Soc. Mag., vol. 19, no. 1, pp. 20-26, 2000.

25. J. Coopersmith, "Pornography videotape and the internet", IEEE Technol. Soc. Mag., vol. 19, no. 1, pp. 27-34, 2000.

26. D. M. Hughes, "The internet and sex industries: Partners in global sexual exploitation", IEEE Technol. Soc. Mag., vol. 19, no. 1, pp. 35-41, 2000.

27. V. Cimagalli, M. Balsi, "Guest editorial: University technology and society", IEEE Technol. Soc. Mag., vol. 20, no. 2, pp. 3, 2001.

28. G. L. Engel, B. M. O'Connell, "Guest editorial: Ethical and social issues criteria in academic accreditation", IEEE Technol. Soc. Mag., vol. 21, no. 3, pp. 7, 2002.

29. J. C. Lucena, G. Downey, H. A. Amery, "From region to countries: Engineering education in Bahrain Egypt and Turkey", IEEE Technol. Soc. Mag., vol. 25, no. 2, pp. 4-11, 2006.

30. C. Didier, J. R. Herkert, "Volunteerism and humanitarian engineering—Part II", IEEE Technol. Soc. Mag., vol. 29, no. 1, pp. 9-11, 2010.

31. K. Michael, G. Roussos, G. Q. Huang, R. Gadh, A. Chattopadhyay, S. Prabhu, P. Chu, "Planetary-scale RFID services in an age of uberveillance", Proc. IEEE, vol. 98, no. 9, pp. 1663-1671, Sep. 2010.

32. M. G. Michael, K. Michael, "The fall-out from emerging technologies: On matters of surveillance social networks and suicide", IEEE Technol. Soc. Mag., vol. 30, no. 3, pp. 15-18, 2011.

33. M. U. Iqbal, S. Lim, "Privacy implications of automated GPS tracking and profiling", IEEE Technol. Soc. Mag., vol. 29, no. 2, pp. 39-46, 2010.

34. D. Kravets, "OnStar tracks your car even when you cancel service", Wired, Sep. 2011.

35. L. Evans, "Location-based services: Transformation of the experience of space", J. Location Based Services, vol. 5, no. 34, pp. 242-260, 2011.

36. M. Wigan, R. Clarke, "Social impacts of transport surveillance", Prometheus, vol. 24, no. 4, pp. 389-403, 2006.

37. B. D. Renegar, K. Michael, "The privacy-value-control harmonization for RFID adoption in retail", IBM Syst. J., vol. 48, no. 1, pp. 8:1-8:14, 2009.

38. R. Clarke, "Information technology and dataveillance", Commun. ACM, vol. 31, no. 5, pp. 498-512, 1988.

39. H. Ketabdar, J. Qureshi, P. Hui, "Motion and audio analysis in mobile devices for remote monitoring of physical activities and user authentication", J. Location Based Services, vol. 5, no. 34, pp. 182-200, 2011.

40. E. Singer, "Device tracks how you're sleeping", Technol. Rev. Authority Future Technol., Jul. 2009.

41. L. Perusco, K. Michael, "Control trust privacy and security: Evaluating location-based services", IEEE Technol. Soc. Mag., vol. 26, no. 1, pp. 4-16, 2007.

42. K. Michael, A. McNamee, M. G. Michael, "The emerging ethics of humancentric GPS tracking and monitoring", ICMB M-Business-From Speculation to Reality, 2006.

43. S. J. Fusco, K. Michael, M. G. Michael, R. Abbas, "Exploring the social implications of location based social networking: An inquiry into the perceived positive and negative impacts of using LBSN between friends", 9th Int. Conf. Mobile Business/9th Global Mobility Roundtable (ICMB-GMR), 2010.

44. M. Burdon, "Commercializing public sector information privacy and security concerns", IEEE Technol. Soc. Mag., vol. 28, no. 1, pp. 34-40, 2009.

45. R. W. Picard, "Future affective technology for autism and emotion communication", Philosoph. Trans. Roy. Soc. London B Biol. Sci., vol. 364, no. 1535, pp. 3575-3584, 2009.

46. R. M. Kowalski, S. P. Limber, P. W. Agatston, Cyber Bullying: The New Moral Frontier, U.K., London: Wiley-Blackwell, 2007.

47. Google: Policies and Principles, Oct. 2011.

48. K.-S. Lee, "Surveillant institutional eyes in South Korea: From discipline to a digital grid of control", Inf. Soc., vol. 23, no. 2, pp. 119-124, 2007.

49. D. P. Siewiorek, "Wearable computing comes of age", IEEE Computer, vol. 32, no. 5, pp. 82-83, May 1999.

50. L. Sydnheimo, M. Salmimaa, J. Vanhala, M. Kivikoski, "Wearable and ubiquitous computer aided service maintenance and overhaul", IEEE Int. Conf. Commun., vol. 3, pp. 2012-2017, 1999.

51. K. Michael, M. G. Michael, Innovative Automatic Identification and Location-Based Services, New York: Information Science Reference, 2009.

52. K. Michael, M. G. Michael, "Implementing Namebers using microchip implants: The black box beneath the skin" in This Pervasive Day: The Potential and Perils of Pervasive Computing, U.K., London:Imperial College Press, pp. 101-142, 2011.

53. S. Mann, "Wearable computing: Toward humanistic intelligence", IEEE Intell. Syst., vol. 16, no. 3, pp. 10-15, May/Jun. 2001.

54. B. Schiele, T. Jebara, N. Oliver, "Sensory-augmented computing: Wearing the museum's guide", IEEE Micro, vol. 21, no. 3, pp. 44-52, May/Jun. 2001.

55. C. Harrison, D. Tan, D. Morris, "Skinput: Appropriating the skin as an interactive canvas", Commun. ACM, vol. 54, no. 8, pp. 111-118, 2011.

56. N. Sawhney, C. Schmandt, "Nomadic radio: A spatialized audio environment for wearable computing", Proc. IEEE 1st Int. Symp. Wearable Comput., pp. 171-172, 1997.

57. S. Mann, "Eudaemonic computing (‘underwearables’)", Proc. IEEE 1st Int. Symp. Wearable Comput., pp. 177-178, 1997.

58. LooxieOverview, Jan. 2012.

59. T. Starner, "The challenges of wearable computing: Part 1", IEEE Micro, vol. 21, no. 4, pp. 44-52, Jul./Aug. 2001.

60. G. Trster, "Smart clothes—The unfulfilled pledge", IEEE Perv. Comput., vol. 10, no. 2, pp. 87-89, Feb. 2011.

61. M. B. Spitzer, "Eyeglass-based systems for wearable computing", Proc. IEEE 1st Int. Symp. Wearable Comput., pp. 48-51, 1997.

62. R. Steinkuhl, C. Sundermeier, H. Hinkers, C. Dumschat, K. Cammann, M. Knoll, "Microdialysis system for continuous glucose monitoring", Sens. Actuators B Chem., vol. 33, no. 13, pp. 19-24, 1996.

63. J. C. Pickup, F. Hussain, N. D. Evans, N. Sachedina, "In vivo glucose monitoring: The clinical reality and the promise", Biosens. Bioelectron., vol. 20, no. 10, pp. 1897-1902, 2005.

64. C. Thomas, R. Carlson, Development of the Sensing System for an Implantable Glucose Sensor, Jan. 2012.

65. J. L. Ferrero, "Wearable computing: One man's mission", IEEE Micro, vol. 18, no. 5, pp. 87-88, Sep.-Oct. 1998.

66. T. Martin, "Issues in wearable computing for medical monitoring applications: A case study of a wearable ECG monitoring device", Proc. IEEE 4th Int. Symp. Wearable Comput., pp. 43-49, 2000.

67. M. G. Michael, "The biomedical pioneer: An interview with C. Toumazou" in Innovative Automatic Identification and Location-Based Services, New York: Information Science Reference, pp. 352-363, 2009.

68. R. Capurro, M. Nagenborg, Ethics and Robotics, Germany, Heidelberg: Akademische Verlagsgesellschaft, 2009.

69. R. Sparrow, "Predators or plowshares? Arms control of robotic weapons", IEEE Technol. Soc. Mag., vol. 28, no. 1, pp. 25-29, 2009.

70. R. C. Arkin, "Governing lethal behavior in robots [T&S Interview]", IEEE Technol. Soc. Mag., vol. 30, no. 4, pp. 7-11, 2011.

71. M. L. Cummings, "Creating moral buffers in weapon control interface design", IEEE Technol. Soc. Mag., vol. 23, no. 3, pp. 28-33, 41, 2004.

72. P. Asaro, "Modeling the moral user", IEEE Technol. Soc. Mag., vol. 28, no. 1, pp. 20-24, 2009.

73. J. Canning, "You've just been disarmed. Have a nice day!", IEEE Technol. Soc. Mag., vol. 28, no. 1, pp. 13-15, 2009.

74. F. Operto, "Ethics in advanced robotics", IEEE Robot. Autom. Mag., vol. 18, no. 1, pp. 72-78, Mar. 2011.

75. J. M. Sullivan, "Evolution or revolution? The rise of UAVs", IEEE Technol. Soc. Mag., vol. 25, no. 3, pp. 43-49, 2006.

76. P. Salvini, M. Nicolescu, H. Ishiguro, "Benefits of human-robot interaction", IEEE Robot. Autom. Mag., vol. 18, no. 4, pp. 98-99, Dec. 2011.

77. A. Sharkey, N. Sharkey, "Children the elderly and interactive robots", IEEE Robot. Autom. Mag., vol. 18, no. 1, pp. 32-38, Mar. 2011.

78. D. Feil-Seifer, M. J. Mataric, "Socially assistive robotics", IEEE Robot. Autom. Mag., vol. 18, no. 1, pp. 24-31, Mar. 2011.

79. J. D. Bronzino, The Biomedical Engineering Handbook: Medical Devices and Systems, FL, Boca Raton:CRC Press, 2006.

80. C. Hassler, T. Boretius, T. Stieglitz, "Polymers for neural implants", J. Polymer Sci. B Polymer Phys., vol. 49, no. 1, pp. 18-33, 2011.

81. Bionic Hearing Bionic Vision Neurobionics, Jan. 2012.

82. A. Manning, "Implants sounding better: Smaller faster units overcome ‘nerve deafness’", USA Today, pp. 7D, 2000.

83. G. M. Clark, Sounds From Silence, Australia, Melbourne: Allen & Unwin, 2003.

84. G. Carman, Eureka Moment From First One to Hear With Bionic Ear, Feb. 2008.

85. J. F. Patrick, P. A. Busby, P. J. Gibson, "The development of the nucleus FreedomTM cochlear implant system", Sage Publications, vol. 10, no. 4, pp. 175-200, 2006.

86. "Personal stories", Cochlear, Jan. 2012.

87. R. A. Cooper, "Quality of life technology: A human-centered and holistic design", IEEE Eng. Med. Biol., vol. 27, no. 2, pp. 10-11, Mar./Apr. 2008.

88. S. Stewart, "Neuromaster", Wired 8.02.

89. J. Dowling, "Current and future prospects for optoelectronic retinal prostheses", Eye, vol. 23, pp. 1999-2005, 2009.

90. D. Ahlstrom, "Microchip implant could offer new kind of vision", The Irish Times.

91. More Tests of Eye Implants Planned, pp. 1-2, 2001.

92. G. Branwyn, "The desire to be wired", Wired 1.4.

93. W. Wells, The Chips Are Coming.

94. M. Brooks, "The cyborg cometh", Worldlink: The Magazine of the World Economic Forum.

95. E. Strickland, "Birth of the bionic eye", IEEE Spectrum, Jan. 2012.

96. S. Adee, "Researchers hope to mime 1000 neurons with high-res artificial retina", IEEE Spectrum, Jan. 2012.

97. D. Nairne, Building Better People With Chips and Sensors.

98. S. S. Hall, "Brain pacemakers", MIT Enterprise Technol. Rev..

99. E. A. C. Pereira, A. L. Green, R. J. Stacey, T. Z. Aziz, "Refractory epilepsy and deep brain stimulation", J. Clin. Neurosci., vol. 19, no. 1, pp. 27-33, 2012.

100. "Brain pacemaker could help cure depression research suggests", Biomed. Instrum. Technol., vol. 45, no. 2, pp. 94, 2011.

101. H. S. Mayberg, A. M. Lozano, V. Voon, H. E. McNeely, D. Seminowicz, C. Hamani, J. M. Schwalb, S. H. Kennedy, "Deep brain stimulation for treatment-resistant depression", Neuron, vol. 45, no. 5, pp. 651-660, 2005.

102. B. Staff, "Brain pacemaker lifts depression", BBC News, Jun. 2005.

103. C. Hamani, H. Mayberg, S. Stone, A. Laxton, S. Haber, A. M. Lozano, "The subcallosal cingulate gyrus in the context of major depression", Biol. Psychiatry, vol. 69, no. 4, pp. 301-308, 2011.

104. R. Dobson, Professor to Try to Control Wife via Chip Implant.

105. "Chip helps paraplegic walk", Wired News.

106. D. Smith, "Chip implant signals a new kind of man", The Age.

107. "Study of an implantable functional neuromuscular stimulation system for patients with spinal cord injuries", Clinical Trials.gov, Feb. 2009.

108. R. Barrett, "Electrodes help paraplegic walk" in Lateline Australian Broadcasting Corporation, Australia, Sydney: ABC, May 2011.

109. M. Ingebretsen, "Intelligent exoskeleton helps paraplegics walk", IEEE Intell. Syst., vol. 26, no. 1, pp. 21, 2011.

110. S. Harris, "US researchers create suit that can enable paraplegics to walk", The Engineer, Oct. 2011.

111. D. Ratner, M. A. Ratner, Nanotechnology and Homeland Security: New Weapons for New Wars, NJ, Upper Saddle River: Pearson Education, 2004.

112. L. Versweyveld, "Chip implants allow paralysed patients to communicate via the computer", Virtual Medical Worlds Monthly.

113. S. Adee, "The revolution will be prosthetized: DARPA's prosthetic arm gives amputees new hope", IEEE Spectrum, vol. 46, no. 1, pp. 37-40, 2009.

114. E. Wales, "It's a living chip", The Australian, pp. 4, 2001.

115. Our Products: MBAMultiplex Bio Threat Assay, Jan. 2012.

116. F. W. Scheller, "From biosensor to biochip", FEBS J., vol. 274, no. 21, pp. 5451, 2007.

117. A. Persidis, "Biochips", Nature Biotechnol., vol. 16, pp. 981-983, 1998.

118. A. C. LoBaido, "Soldiers with microchips: British troops experiment with implanted electronic dog tag", WorldNetDaily.com.

119. "Microchip implants for drug delivery", ABC: News in Science.

120. R. Bailey, "Implantable insulin pumps", Biology About.com.

121. D. Elleri, D. B. Dunger, R. Hovorka, "Closed-loop insulin delivery for treatment of type 1 diabetes", BMC Med., vol. 9, no. 120, 2011.

122. D. L. Sorkin, J. McClanahan, "Cochlear implant reimbursement cause for concern", HealthyHearing, May 2004.

123. J. Berke, "Parental rights and cochlear implants: Who decides about the implant?", About.com: Deafness, May 2009.

124. D. O. Weber, "Me myself my implants my micro-processors and I", Softw. Develop. Mag., Jan. 2012.

125. A. Graafstra, K. Michael, M. G. Michael, "Social-technical issues facing the humancentric RFID implantee sub-culture through the eyes of Amal Graafstra", Proc. IEEE Int. Symp. Technol. Soc., pp. 498-516, 2010.

126. E. M. McGee, G. Q. Maguire, "Becoming borg to become immortal: Regulating brain implant technologies", Cambridge Quarterly Healthcare Ethics, vol. 16, pp. 291-302, 2007.

127. P. Moore, Enhancing Me: The Hope and the Hype of Human Enhancement, U.K., London: Wiley, 2008.

128. A. Masters, K. Michael, "Humancentric applications of RFID implants: The usability contexts of control convenience and care", Proc. 2nd IEEE Int. Workshop Mobile Commerce Services, pp. 32-41, 2005.

129. J. Best, "44000 prison inmates to be RFID-chipped", silicon.com, Nov. 2010.

130. D. Brin, The Transparent Society: Will Technology Force Us to Choose Between Privacy and Freedom?, MA, Boston: Perseus Books, 1998.

131. J. E. Dobson, P. F. Fischer, "Geoslavery", IEEE Technol. Soc. Mag., vol. 22, no. 1, pp. 47-52, 2003.

132. K. Michael, M. G. Michael, "Homo Electricus and the Continued Speciation of Humans" in The Encyclopedia of Information Ethics and Security, PA, Hershey: IGI, pp. 312-318, 2007.

133. S. Young, Designer Evolution: A Transhumanist Manifesto, New York: Prometheus Books, 2006.

134. E. Braun, Wayward Technology, U.K., London: Frances Pinter, 1984.

135. R. el Kaliouby, R. Picard, S. Baron-Cohen, "Affective computing and autism", Ann. New York Acad. Sci., vol. 1093, no. 1, pp. 228-248, 2006.

136. D. Bhatia, S. Bairagi, S. Goel, M. Jangra, "Pacemakers charging using body energy", J. Pharmacy Bioallied Sci., vol. 2, no. 1, pp. 51-54, 2010.

137. V. Arstila, F. Scott, "Brain reading and mental privacy", J. Humanities Social Sci., vol. 15, no. 2, pp. 204-212, 2011.

138. R. Westrum, Technologies and Society: The Shaping of People and Things, CA, Belmont: Wadsworth, 1991.

139. B. Gates, The Road Ahead, New York: Penguin, 1995.

140. M. Allaby, Facing The Future: The Case for Science, U.K., London: Bloomsbury, 1996.

141. M. Kaku, Visions: How Science Will Revolutionise the 21st Century and Beyond, U.K., Oxford: Oxford Univ. Press, 1998.

142. A. Berry, The Next 500 Years: Life in the Coming Millennium, New York: Gramercy Books, 1996.

143. J. Ellul, The Technological Society, New York: Vintage Books, 1964.

144. W. Gibson, Neuromancer, New York: Ace Books, 1984.

145. M. McLuhan, Understanding Media: The Extensions of Man, MA, Cambridge: MIT Press, 1964.

146. A. Toffler, Future Shock, New York: Bantam Books, 1981.

147. C. M. Banbury, Surviving Technological Innovation in the Pacemaker Industry 19591990, New York: Garland, 1997.

148. K. Warwick, I Cyborg, U.K., London: Century, 2002.

149. M. G. Michael, K. Warwick, "The professor who has touched the future" in Innovative Automatic Identification and Location-Based Services, New York: Information Science Reference, pp. 406-422, 2009.

150. D. Green, "Why I am not impressed with Professor Cyborg", BBC News.

151. P. Cochrane, Tips For Time Travellers: Visionary Insights Into New Technology Life and the Future on the Edge of Technology, New York: McGraw-Hill, 1999.

152. I. Walker, "Cyborg dreams: Beyond Human: Background Briefing", ABC Radio National, Jan. 2012.

153. W. Grossman, "Peter Cochrane will microprocess your soul", Wired 6.11.

154. R. Fixmer, "The melding of mind with machine may be the next phase of evolution", The New York Times.

155. M. Billinghurst, T. Starner, "Wearable devices: New ways to manage information", IEEE Computer, vol. 32, no. 1, pp. 57-64, Jan. 1999.

156. M. Minsky, Society of Mind, New York: Touchstone, 1985.

157. R. Uhlig, "The end of death: ‘Soul Catcher’ computer chip due", The Electronic Telegraph.

158. K. Michael, M. G. Michael, "Microchipping people: The rise of the electrophorus", Quadrant, vol. 414, no. 3, pp. 22-33, 2005.

159. K. Michael, M. G. Michael, "Towards chipification: The multifunctional body art of the net generation", Cultural Attitudes Towards Technol. Commun., 2006.

160. E. McLuhan, F. Zingrone, Essential McLuhan, NY, New York:BasicBooks, 1995.

161. M. Dery, Escape Velocity: Cyberculture at the End of the Century, U.K., London: Hodder and Stoughton, 1996.

162. J. Brown, "Professor Cyborg", Salon.com, Jan. 2012.

163. M. T. Maybury, "The mind matters: Artificial intelligence and its societal implications", IEEE Technol. Soc. Mag., vol. 9, no. 2, pp. 7-15, Jun./Jul. 1990.

164. Philosophy, Jan. 2012.

165. R. Kurzweil, The Singularity Is Near, New York: Viking, 2005.

166. Transhumanist Declaration, Jan. 2010.

167. F. Fukuyama, "Transhumanism", Foreign Policy, no. 144, pp. 42-43, 2004.

168. How Does Transhumanism Relate to Religion? in Transhumanist FAQ, Jan. 2012.

169.

170. J. Ellul, What I Believe, MI, Grand Rapids: Eerdmans, 1989.

171. J. M. Wetmore, "Amish Technology: Reinforcing values and building community", IEEE Technol. Soc. Mag., vol. 26, no. 2, pp. 10-21, 2007.

172. R. Kurzweil, The Age of Spiritual Machines, New York: Penguin Books, 1999.

173. B. Joy, "Why the future doesn't need us", Wired 8.04.

174. K. J. O'Connell, "Uses and abuses of technology", Inst. Electr. Eng. Proc. Phys. Sci. Meas. Instrum. Manage. Educ. Rev., vol. 135, no. 5, pp. 286-290, 1988.

175. D. F. Noble, The Religion of Technology: The Divinity of Man and the Spirit of Invention, New York: Penguin Books, 1999.

176. W. Kuhns, The Post-Industrial Prophets: Interpretations of Technology, New York: Harper Colophon Books, 1971.

177. D. J. Solove, The Future of Reputation, CT, New Haven: Yale Univ. Press., 2007.

178. J. Rennie, "Ray Kurzweil's slippery futurism", IEEE Spectrum, Dec. 2010.

Keywords

Technology forecasting, social implications of technology, history, social factors, human factors, social aspects of automation, human-robot interaction, mobile computing, pervasive computing, wearable computing, IEEE society, SSIT, society founding, social impacts, military technologies, security technologies, cyborgs, human-machine hybrids, human mind, transhumanist future, humanity redesigns, Überveillance, corporate activities, engineering education, ethics, future of technology, sociotechnical systems

Citation: Karl D. Stephan, Katina Michael, M. G. Michael, Laura Jacob, Emily P. Anesta, "Social Implications of Technology: The Past, the Present, and the Future", Proceedings of the IEEE, vol. 100, Special Centennial Issue, pp. 1752-1781, May 13, 2012, doi: 10.1109/JPROC.2012.2189919.


Using a social informatics framework to study the effects of location-based social networking on relationships between people: A review of literature

Abstract


This paper is predominantly a review of literature on the emerging mobile application area known as location-based social networking (LBSN). The study applies the social informatics framework to the exploratory question of what effect location-based social networking may have on relationships between people. The classification model used in the paper relates previous research on location-based services and online social networking. Specifically, the wider study is concerned with literature that identifies the impact of technology on trust with respect to friendship. This paper attempts to draw out the motivations behind using location-based social networking applications and the implications these may have for individual privacy and, more broadly, for one's social life. It relies heavily on the domain of social informatics, with a view to establishing a theoretical underpinning for the mutual shaping of context and information and communication technology design.

Section 1. Introduction

The purpose of this paper is to provide a review of the relevant literature on the effects of location-based social networking (LBSN) on relationships between people. Three main areas of literature are reviewed. The first is literature on the domain of social informatics, reviewed to guide the conduct of the wider research study. The second is the body of social informatics-based studies on online social networking (OSN), location-based services (LBS), and location-based social networking (LBSN). The literature on online social networking and location-based services is reviewed because these technologies precede location-based social networking: LBSN is the composite of LBS and OSN, and the literature on each of these technologies therefore provides insight into core concepts related to location-based social networking. The intersection between LBS, OSN and LBSN also uncovers an area which has been under-researched, predominantly due to its newness in the field of information and communication technology (ICT). The third area reviewed is the literature on trust and friendship, covered briefly to outline the social theory that forms the background of the wider study. Prior to reviewing the literature, a classification model is presented which summarizes the literature in the domain and provides a roadmap for this paper.

Section 2. Background

Location-based social networking (LBSN) applications such as Google Latitude, Loopt and BrightKite enhance our ability to perform social surveillance. These applications enable users to view and share real-time location information with their "friends". LBSN applications offer users the ability to look up the location of another "friend" remotely using a smartphone, desktop or other device, anytime and anywhere. Users invite their friends to participate in LBSN, and a process of consent follows. Friends have the ability to alter their privacy settings to allow their location to be monitored by another at differing levels of accuracy (e.g. suburb only, a pinpoint at the street-address level, or manual location entry). Individuals can invite friends they have met in the physical space, friends they have met virtually in an online social network, their parents, their siblings, their extended family, partners, even strangers to join them in an LBSN setting.
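The invitation-plus-consent flow and the differing levels of location accuracy described above can be sketched in code. The following Python model is purely illustrative: the class names, granularity levels and example addresses are assumptions for the sketch, not the API of any actual LBSN application.

```python
from dataclasses import dataclass, field
from enum import Enum


class Granularity(Enum):
    """Levels of location accuracy a user may grant to a friend."""
    HIDDEN = 0  # share nothing
    MANUAL = 1  # only locations the user enters by hand
    SUBURB = 2  # coarse: suburb/city level
    STREET = 3  # precise: pinpoint street address


@dataclass
class User:
    name: str
    # Per-friend consent: which viewers may see this user's location,
    # and at what level of accuracy.
    grants: dict = field(default_factory=dict)

    def invite(self, friend: "User",
               level: Granularity = Granularity.SUBURB) -> None:
        """Invitation plus consent: sharing begins only if the friend accepts."""
        if friend.accepts(self):
            self.grants[friend.name] = level

    def accepts(self, inviter: "User") -> bool:
        # Stub: a real client would prompt the invited user for consent.
        return True

    def visible_location(self, viewer: "User",
                         exact=("13 Example St", "Kiama")):
        """Return the location as the viewer is permitted to see it."""
        street, suburb = exact
        level = self.grants.get(viewer.name, Granularity.HIDDEN)
        if level is Granularity.STREET:
            return f"{street}, {suburb}"
        if level is Granularity.SUBURB:
            return suburb
        return None  # HIDDEN, or MANUAL with no manual entry made
```

The key design point mirrored from the paper is that disclosure is asymmetric and per-relationship: each friend is granted an individual accuracy level, and anyone without an explicit grant sees nothing.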

With the emergence of this technology it is crucial to consider that "technology alone, even good technology alone is not sufficient to create social or economic value" [1]. Beyond failing to contribute "sufficient" economic or social value, Kling and other scholars have identified that technologies can have negative impacts on society [2]. Consider the case of persons who have befriended each other in the virtual space, only to meet in the physical space and encounter unforeseen consequences by doing so [3]. As location-based social networking technologies are used between what are loosely termed "friends," they have the potential to impact friendships, which are integral not only to the operation of society but also to the individual's well-being [4].

Section 3. Classification Model

The classification model of the literature review, expressed in Figure 1, summarizes the current social informatics-based scholarship on location-based services, online social networking and location-based social networking applications. The arrows indicate the researchers' view that location-based social networking applications are novel in that they have been designed to provide additional functionality for social networking. The classification model also summarizes the scholarship on trust and technology and introduces the social theory of trust and friendship. The purpose of reviewing this literature is first to identify studies relating trust to LBS and OSN, and then to understand how technology has the potential to impact upon human trust. It must be stated upfront that studies relating to this particular research question are scarce, given that the first popular LBSN application was launched at the beginning of 2009 [5], with only beta applications existing in August 2008. Second, the purpose of reviewing the literature on trust and friendship is to develop a social theory to inform the research.

Figure 1. Classification Model

In order to understand the literature logically, it is organized in a top-down approach. First, the paper addresses enquiries in the domain of social informatics. Second, the literature on online social networking and location-based services is reviewed, providing a background to the types of issues pertinent to location-based social networking. The review of the literature specifically on LBSN then follows. Once the gap in current research is presented, previous works on 'trust and technology' and 'trust and friendship' are presented.

Section 4. Socio-Technical Network Influences

The social implications of technologies have been explored under several different theoretical frameworks, including technological determinism, the social shaping of technology, critical information theory and social informatics. This research adopts the approach of social informatics; thus the overall aim of the research is to engage in a holistic and empirical study of the 'consequences' of location-based social networking applications. This section provides a definition and outline of social informatics: how and why it has developed, and how it can be used as a framework for further research. The section concludes with a justification for adopting this particular approach against a backdrop of other possible theories.

4.1. Definition of Social Informatics

Social informatics research focuses upon the relationships between information and communication technologies (ICTs) and the larger social context within which they exist [6]. The Encyclopedia of Library and Information Science defines social informatics as [7]:

“the systematic, interdisciplinary study of the design, uses and consequences of information technologies that takes into account their interaction with institutional and cultural contexts. Thus, it is the study of the social aspects of computers, telecommunications, and related technologies, and examines issues such as the ways that IT shape organizational and social relations, or the ways in which social forces influence the use and design of IT… Social Informatics research strategies are usually based on empirical data… [and] use data to analyze the present and recent past to better understand which social changes are possible, which are plausible and which are most likely in the future.”

One of the key concepts underlying the approach of social informatics is that information and communication technologies are not designed in social isolation: a social context does exist, and it influences the manner in which ICT is developed and used, and ultimately the social impact it has [7].

4.2. The Development of Social Informatics

Social informatics research was born from dissatisfaction with previous information systems research methods, which focused on exploring either the deterministic effects of technology upon society, or of society upon technology. These approaches are referred to, respectively, as technological determinism and the social shaping of technology.

Technologically deterministic research studies focus on the impact of technology upon society. This research approach aims to answer questions such as:

“What would be the impact of computers on organizational behavior if we did X? What would be the changes in social life if we did X? Will computer systems improve or degrade the quality of work?… ‘What will happen, X or Y?’ The answer was, sometimes X, and sometimes Y. There was no simple, direct effect” [8].

Technological determinism failed to produce satisfactory predictions, and this has led to the formation of social informatics research [9]. Technological determinism was also seen by proponents of the social shaping of technology as only a partial truth, and as "oversimplistic" [10].

The social shaping of technology approach proposes that technology is not an autonomous entity, as it is shaped by social forces. This is in direct opposition to technological determinism, which depicts technology as an "autonomous entity, which develops according to an internal logic and in a direction of its own, and then has determinate impacts on society" [11]. Social shaping of technology studies aim to show that technology is in fact a social product: it does not mold society, but rather society molds it, and this can be seen by investigating the social forces at play in the creation and use of technology [12]. Examples of approaches within the social shaping of technology include the social construction of technology and actor-network theory, which focus on the role of knowledge or of actors, respectively, in the development of technology. Technological determinism focuses on the impacts of technology, while the social shaping of technology focuses on the context. Social informatics, on the other hand, "investigates how the influences and nodes in a sociotechnical network shape each other" [13].

Social informatics does not ask deterministic questions ('What will happen, X or Y?'); instead, social informatics researchers ask 'When will X happen, and under what conditions?', providing a nuanced conceptual understanding of the operation of technology in social life [9]. In contrast to technological determinism and social shaping of technology theories, the social informatics framework highlights the mutual shaping of technology and society, each molding the other at the same time.

4.3. Examples of Social Informatics Research

Figure 2. Bidirectional Shaping between Context and ICT Design

Social informatics takes a nuanced approach to investigating technologies, exploring the bidirectional shaping between context and ICT design, implementation and use [13] (Figure 2). This approach, which combines the social and technical aspects of technology, has been found useful for understanding the social shaping and 'consequences' of information and communication technologies [9]. Examples of social informatics research include the vitality of electronic journals [14], the adoption and use of Lotus Notes within organizations [15], public access to information via the Internet [16], and many other studies. Social informatics research also investigates new social phenomena that materialize when people use technology, for example the unintended effects of behavioral control in virtual teams [17]. Research falling in this area is perceived as the future direction for social informatics research [9].

4.4. Social Informatics as a Framework

Social informatics is not described as a theory, but as a “large and growing federation of scholars focused on common problems”, with no single theory or theoretical notion being pursued [13]. What social informatics does provide is a framework for conducting research. What follows is a description of the framework, its key elements and distinguishing features.

4.4.1. Key Features of Social Informatics Research

Social informatics research is problem-oriented, empirical, theory-based and interdisciplinary, with a focus on informatics (Table 1). In addition, there are several key distinguishing features of the framework. First, social informatics does not prescribe a specific methodology, although the majority of methods employed by researchers in this field are qualitative. Second, social informatics is inclusive of normative, analytical and critical approaches to research. Third, this type of research "investigate[s] how influences and nodes at different levels in the network shape each other" [13], engaging in analysis of the interconnected levels of the social context. Fourth, research in this field can be seen to fall within three broad themes:

  1. ICT uses lead to multiple and sometimes paradoxical effects,

  2. ICT uses shape thought and action in ways that benefit some groups more than others, and these differential effects often have moral and ethical consequences; and

  3. a reciprocal relationship exists between ICT design, implementation, use and the context in which these occur [13].

When adopting the framework of social informatics, its main focus should not be overshadowed. The research should remain focused upon the idea that “ICT are inherently socio-technical, situated and social shaped” [18] and that, in order to understand their impacts, we need to explore, explain and theorize about their socio-technical contexts [13].

Table 1. Key Features of Social Informatics Research (adapted from [13])

4.5. Justification for Using the Social Informatics Framework

There are two primary justifications for adopting a social informatics approach. First, the goals and achievements of social informatics accord with the researchers' goal and motivation. Second, the holistic method of enquiry adopted by social informatics research provides meaningful data. Social informatics researchers aim to develop “reliable knowledge about information technology and social change based on systematic empirical research, in order to inform both public policy issues and professional practice” [8]. This accords with the researchers' goal to identify the credible threats that LBSN pose to friends and society, with a view to preventing or minimizing their effect. Social informatics research has also developed an “increased understanding of the design, use, configuration and/or consequences of ICTs so that they are actually workable for people and can fulfill their intended functions” [9]. In essence, this is the primary motivation behind this study: to increase our understanding of location based social networking so that it can be workable and fulfill its intended function in society without causing individuals harm.

The method of enquiry adopted by social informatics researchers is usually based on conducting a holistic and interdisciplinary investigation into the bidirectional relationship between context and ICT design, use and implementation. This study takes into account the social theory surrounding trust and relationships, thus providing meaningful data on the implications of location based social networking upon trust. For Kling, the fact that information and communication technologies were becoming increasingly enmeshed in the lives of more and more people created a pressing need to explore the ultimate social consequences of the ensuing changes [8]. Kling considered that studying new and emerging applications early in the process of diffusion granted significant opportunities to shape the forms and uses of new technologies.

4.6. Alternative Theories and Approaches to the Study of the Social Implications of Technology

Two alternative approaches to social informatics were discussed in section 4.2, i.e., technological determinism and the social shaping of technology. A third theory that was considered was critical social theory (founded by Jürgen Habermas). Critical social theory has four distinct attributes: (1) it is sensitive to the lifeworlds of organizational actors and is oriented to interpreting and mapping the meanings of their actions from their perspectives, (2) it adopts pluralistic methods, (3) it does not separate the subjects of inquiry from their context, and (4) it recognizes that the context is important not only to meaning construction, but to social activity as well [19]. Thus, critical social theory is similar to social informatics in three main ways: (1) both approaches are sensitive to the context surrounding the subject of enquiry, (2) both focus on the inter-relationship between context and subject, and (3) both approaches employ pluralistic methods. However, the main focus of the two approaches is markedly different.

Critical social theory focuses on “questioning the conventional wisdom of prevailing schools of thought and institutional practices with a primary focus on issues related to justice and power” [20]. In applying this kind of approach to ICT, we would aim to “discover and expose attempts to design and (mis)use IS to deceive, manipulate, exploit, dominate and disempower people” [21]. This is not the aim of the research problem presented here: while location based social networking can admittedly cause harm if misused (e.g. stalking by ex-partners), it can also be incredibly beneficial (e.g. for a family traveling on holiday in a foreign country). Thus, the aim of the research is to understand the positive and negative implications of the use of location based social networking in society, not just to examine issues of justice and power.

The following section provides an overview of the key literature on the use, design, implementation, context and implications of online social networking, location based services, and location based social networking.

Section 5. Online Social Networking Sites

Current studies on online social networking sites use varied methods involving case studies, surveys, interviews and observations to investigate the use, implications, design and context of the emerging application. The literature on OSN falls into three broad areas of study: (1) purpose, motivation and patterns of use, (2) effect on interpersonal relationships, and (3) threats to privacy, trust and security.

5.1. Purpose, Motivation and Patterns of Use

These studies on online social networking outline the purpose for which OSN is used, the motivation behind an individual's use of OSN, and how users go about the adoption of OSN applications.

5.1.1. Purpose of Online Social Networking

The purpose of OSN has been identified as the public articulation of individual social connections [22], the creation of an information ground [23] or a means of satisfying “our human tendencies towards togetherness” [24]. Boyd's study on Friendster users revealed that OSN “reshaped how groups of people verbally identify relationships and solidified the importance of creative play in social interactions” [22]. Boyd identified the value of networks and how users presented themselves on Friendster, examined who users connected with, from existing friends to “hook-ups” to “familiar strangers,” and highlighted the dilemma caused by fakesters in the network.

Counts and Fisher's study explored OSN, exposing the “types and usefulness of information shared in everyday life, the way the system fits into participants communication and social “ecosystem” and the ways in which the system functions as an information ground” [23]. Beyond serving as a source of information, OSN also functions to provide “a logical extension of our human tendencies towards togetherness” [24]. Weaver and Morrison performed case studies on four social networking sites (MySpace, Facebook, Wikipedia and YouTube) to explore the range of socialization that can occur, revealing the core purpose of connecting people.

5.1.2. Motivation Behind the Use of Online Social Networking

Lampe, Ellison and Steinfield have conducted two major survey studies on the use of OSN, the first in 2006 and the second in 2008. The purpose of the first study was to answer the question: “Are Facebook members using the site to make new online connections, or to support already existing offline connections?” The results revealed that Facebook users are primarily interested in increasing “their awareness of those in their offline community” [25]. The second study incorporated three surveys and interviews in order to explore whether the use, perception of audience and attitudes of Facebook users changed over time with the introduction of new features. The results again revealed that the primary use of Facebook was to maintain existing offline connections: to keep in touch with friends and to learn more about classmates and people that users had met socially offline [26]. Both studies were conducted upon undergraduate university populations.

Joinson [27] performed a use and motivation study on a random sample of Facebook users, not limited to campus-based populations, which supported the conclusions of both Lampe, Ellison and Steinfield studies. Furthermore, Joinson's study probed further, identifying seven unique uses and gratifications of online social networks, including social connection, shared identities, content, social investigation, social network surfing and status updating, and finding that different uses and gratifications relate differentially to patterns of usage [27].

5.1.3. Patterns of Use of Online Social Networking

Other studies of the use of online social networking have examined how the information provided by social networking sites can be used to understand patterns of use. Hancock, Toma and Fenner [28] explored how people use information available on social networking sites to initiate relationships. They asked participants to befriend partners via an instant messaging conversation by using profile information readily available on Facebook. This use of asymmetric information revealed that the information helped in linking persons together, yet in only 2 out of 133 scenarios did users realize that the information had been gained from their Facebook profile rather than from the real-time instant messaging conversation(s) they had had with the friend. This study highlighted the rich source of information about the self which is available online, as well as the unintended consequences of others strategically plotting to use that information for their own relational goals.

Online social networking researchers have also explored patterns of use among different groups of people and communities. Ahn and Han [29] investigated the typological characteristics of online networking services. Chapman and Lahav [30] conducted an ethnographic interview study of the cross-cultural differences in usage patterns of OSN. Results from the interviews identified three dimensions of cultural difference for typical social networking behaviors: users' goals, typical patterns of self-expression and common interaction behaviors. The study was limited to interviews with participants from the United States, France, China and South Korea, and therefore future work is required to evaluate the presented results.

Other studies have explored usage among different age groups. Arjan, Pfeil and Zaphiris [31] explored users' MySpace friend networks with webcrawlers to compare teenage (13–19) networks with those of older people (60+). The findings of the study showed that teenage users had larger networks, with more users of the same age, than older users. Furthermore, when representing themselves online, teenagers use more self-referencing, negative emotions and cognitive words than older people. The limitations of this study are its small sample size and limited frame of reference; that is, it considers the differences between teenagers and older people without reference to intermediate age groups. A third study, by Schrammel, Köffel and Tscheligi [32], surveyed users of various online communities to explore information disclosure behavior in different types of online communities. They identified that users disclose more information in business and social contexts, with students being more freehanded with information than employed people, and females being more cautious than males. Studies relating to the use of OSN have also explored its potential application to other contexts, including the workplace [33][34], student learning [35], citizen involvement [36] and connecting women in information technology [37].

5.2. The Effect of Online Social Networking on Interpersonal Relationships

Online social networking is used in the context of being social, creating connections with users and expanding networks [38]. The implication of using OSN to create or maintain relationships has been explored by several researchers, highlighting the nature of intimate online relationships and social interactions as well as the benefits and detriments of the use of OSN upon relationships. Boyd's study concentrated on intimacy and trust within the OSN site Friendster, highlighting that intimate computing hinges upon issues surrounding trust: trust in the technology, and ultimately trust in the other users to operate by the same set of rules [39]. Dwyer [40] has presented a preliminary framework modeling how attitudes towards privacy and impression management translate into social interactions within MySpace. Other issues that have been explored in the literature include whether interactions between users flow from the declaration of friends, and whether users interact evenly or lopsidedly with friends. These questions were explored by Chun et al. in a quantitative case study of the OSN site Cyworld, which reported a high degree of reciprocity among users [41].

The benefits and detriments of OSN upon interpersonal relationships have not been extensively explored. A survey of undergraduate university students conducted by Ellison, Steinfield and Lampe [42] identified that using Facebook benefits the maintenance and growth of social capital among “friends” and also improves psychological well-being. However, although OSN sites reinforce peer communication, Subrahmanyam and Greenfield [43] point out that this may be at the expense of communication within the family, and express the need for further research into the effects of OSN upon real-world communications and relationships.

5.3. Implications of Use: Privacy, Trust and Security

5.3.1. Privacy

Privacy in online social networking sites has received significant attention, with researchers exploring patterns of information revelation and implications upon privacy [44], the use of OSN policies to ensure privacy [45], differences in perceptions of privacy across different OSN [46], the privacy risks presented by OSN [47], mechanisms to enhance privacy on OSN [48], user strategies to manage privacy [49], and the notion of privacy and privacy risk in OSN [50].

The work of Levin and others at Ryerson University (the Ryerson Study) provides the largest survey on usage, attitudes and perceptions of risk of online social networking sites [50]. The design of the survey incorporated quantitative questions, scenarios and short answer questions to understand the level of risk and responsibility one feels when revealing information online. This study identified that young Canadians have a unique perception of network privacy, “according to which personal information is considered private as long as it is limited to their social network” [50]. A further contribution of this study, along with other privacy studies [44][46], is the implication of the use of online social networking sites upon trust.

5.3.2. Trust

There are very few studies that explore the concept of trust in online social networking. The majority of studies which do examine trust are focused upon algorithms [51] or frameworks [52] that provide users of OSN with trust ratings. A few other studies have mentioned or examined online social networking sites in terms of their impact upon trust in relationships. Gross and Acquisti [44] have mentioned that “trust in and within online social networks may be assigned differently and have a different meaning than in their offline counterparts…[and that] trust may decrease within an online social network”. However, they did not investigate this aspect of OSN further. Three studies have investigated the impact of OSN upon trust. The first, by Dwyer, Hiltz and Passerini [46], compares perceptions of trust and privacy between different OSN applications. The second study, conducted by Ryerson University, identifies the potential for OSN to impact upon trust, and the third study, by Gambi and Reader, is currently ongoing and aims to determine whether trust is important in online friendships and how it is developed.
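Trust-rating algorithms of the kind cited above typically infer how much one user should trust a stranger from chains of existing trust relationships in the social graph. The following is a minimal sketch of one common approach, transitive trust propagation, offered for illustration only; it is not the specific algorithm of [51]:

```python
# Toy transitive trust inference over a social graph: trust along a
# path is the product of the edge trusts on that path, and the
# inferred rating is the best value over all simple paths.

def inferred_trust(graph, source, target):
    """graph maps each user to {neighbour: trust in [0, 1]}.
    Depth-first search over simple paths, multiplying edge trusts;
    returns the maximum path trust, or 0.0 if target is unreachable."""
    best = 0.0
    stack = [(source, 1.0, {source})]
    while stack:
        node, trust, visited = stack.pop()
        if node == target:
            best = max(best, trust)
            continue
        for neighbour, edge_trust in graph.get(node, {}).items():
            if neighbour not in visited:
                stack.append((neighbour, trust * edge_trust, visited | {neighbour}))
    return best
```

For example, if a trusts b at 0.9 and b trusts d at 0.8, a's inferred trust in d via that path is 0.72; a stronger competing path would override it. Real systems add path-length limits and discounting, which are omitted here.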

Dwyer, Hiltz and Passerini [46] compared perceptions of trust and privacy concern between MySpace and Facebook. Trust was measured with the following two quantitative questions: “I feel that my personal information is protected by [social networking sites]” and “I believe most of the profiles I view on [social networking sites] are exaggerated to make the person look more appealing”. The outcome of the study focused upon trust in the users and the online social network itself, but it did not shed light upon the effect of OSN upon trust in relationships.

The Ryerson study provides some exploration into the impact of online social networking sites upon trust in relationships, by presenting scenarios where users had experienced a loss of trust with other members of the site. The participants were then asked whether they had experienced, or knew of someone who had experienced, such a scenario. The first scenario presented a user who went out partying; photographs of the occasion were taken and displayed on Facebook, resulting in a loss of trust by the family. Sixty-four percent of respondents either experienced this scenario directly or indirectly, or heard of it happening to someone else. The second scenario that focused on trust involved a comment posted upon a user's wall indicating that the individual had been involved in shoplifting, such that no matter what the user claimed, everyone still believed that he/she was a shoplifter. In this scenario, seventy-six percent of respondents reported that they had not heard of this occurring. The Ryerson study therefore presented a glimpse into the potential effect of the use of online social networking sites upon trust. Another snapshot is provided by Gambi and Reader [53], who administered an online questionnaire to online social networking users to determine whether trust was important in online friendships, and how trust is developed online. Despite the low number of studies in the area of trust and OSN, it is clear from the currency of these three studies that this is an emerging area of research.

5.3.3. Security

Studies in online social networking have explored the impact of OSN on the security of user information and identity. A recent study by Bilge, Strufe, Balzarotti and Kirda [54] identifies the ease with which a potential attacker could perform identity theft attacks upon OSN and suggests improvements in OSN security.

Section 6. Location Based Services

The focus of the literature on location based services, as with social networking, does not surround the technological aspects of design, but rather use and implications from a social informatics perspective. In this vein, the literature reviewed identifies the different contexts of use of LBS and the implications of use, including trust, control, privacy and security.

6.1. Context of Use of Location Based Services

The literature identifies both current and future applications of LBS to track and monitor human subjects. These applications include employee monitoring [55], government surveillance [56], law enforcement [57], a source of evidence [58], patient monitoring [59], locating family members for safety [60][61][62], locating students at school [63], identifying kidnap victims [60], and socializing with friends [64][65]. The following section details the literature on humancentric LBS in terms of its social implications.

6.2. Implications of Using Location Based Services

Michael, Fusco and Michael's research note on the ethics of LBS provides a concise summary of the literature on the socio-ethical implications of LBS available prior to 2008. The research note identifies trust, control, security and privacy [66] as the four implications of LBS. The literature pertaining to each of these implications will now be described.

6.2.1. Trust

The literature on trust and location based services has predominantly used scenarios [67] and theory-based discussion of workplace practices [68], and has addressed consumer trust with respect to LBS [69]. To the researcher's knowledge, the investigation of trust and LBS is limited to these works.

6.2.2. Control

Dobson and Fisher provide an account of the concept of “geoslavery”, defined as “the practice in which one entity, the master, coercively or surreptitiously monitors and exerts control over the physical location of another individual, the slave” [70]. While Dobson and Fisher provide a theoretical account of the potential for “geoslavery” and the human rights issues which accompany it, Troshynski, Lee and Dourish examine the application of “geoslavery” to paroled sex offenders who have been tracked using an LBS device [57].

Troshynski, Lee and Dourish's work draws upon two focus groups of paroled sex offenders to explore the ways that LBS frame people's everyday experience of space. The findings from the focus groups draw out the notion of accountabilities of presence, defined by Troshynski et al. as the notion that “[l]ocations are not merely disclosed, rather users are held accountable for their presence and absence at certain time and places” [57]. This presence need not be their actual physical location, but the location that is disclosed to the observer. For instance, the paroled sex offenders were “primarily concerned with understanding how their movement appear to their parole officers” [57]. This concept of being held to account is a mechanism of enforcing control.

A handful of studies have drawn a parallel between LBS and the Panopticon prison design analyzed by Michel Foucault [71][57][72]. The Panopticon was designed to be round so that the guards could observe the prisoners from the centre without the prisoners knowing whether they were being observed or not. Foucault argued “that the omni-present threat of surveillance renders the actual exercise of power (or violence) unnecessary; the mechanisms of pervasive surveillance induce discipline and docility in those who are surveilled” [57]. LBS represent a modern form of the Panopticon, exerting implicit control through the ability to observe.

6.2.3. Security

LBS can be used to provide security, for example in law enforcement, making “police more efficient in the war against crime” [73], and in border security [63]. However, they can also present a threat to security [74].

6.2.4. Privacy

LBS pose a threat to privacy in the way that information is collected, stored, used and disclosed [75][74][76]. The threat to privacy is further exacerbated by the aggregation and centralization of personal information, enabling location information to be combined with other personal information [77]. However, while privacy is important, a hypothetical study requiring users to “imagine” the existence of an LBS provided evidence that users were “not overly concerned about their privacy” [78]. Two other studies showed that in situations of emergency, individuals are more willing to forgo some of their privacy [60][79].

Section 7. Location Based Social Networking

The current literature on location based social networking explores users' willingness and motivations for disclosing location information and presents several user studies, which draw out different findings on the implications of using LBSN.

7.1. Disclosure of Location Information

Grandhi, Jones and Karam [80] conducted a survey to gauge attitudes towards disclosure of location information, and use of LBSN applications. The findings from the short survey indicated that there was a general interest in LBSN services. The majority of respondents stated that they would disclose their personal location data, that demographics and geotemporal routines did matter, and finally that social relationships are important in predicting when or with whom individuals want to share personal location data.

7.2. LBSN User Studies

7.2.1. LBSN Studies Based on Perceptions and Closed Environments

Several user studies have been conducted on location based social networking [81]. One of the earliest was a two-phase study comparing perceived privacy concerns with actual privacy concerns within a closed LBS environment [82]. Barkhuus found that although users were concerned about their location privacy in general, when confronted with a closed environment the concern diminished. Another user study observed the configuration of privacy settings on a work-related location based service [83]. The study found that grouping permissions provided a convenient balance between privacy and control. Moving beyond the concept of privacy alone, Consolvo and Smith [84] conducted a three-phase study: first, they explored whether social networking users would use location-enhanced computing; second, they recorded the response of users to in-situ hypothetical requests for information; and third, they asked participants to reflect upon phases one and two. The captured results included what participants were willing to disclose, the relationship between participant and requestor, the effect of where participants were located, the activity or mode, privacy classifications, what people want to know about another's location, and privacy and security concerns. The limitation of this research, and of prior research on LBSN technologies, was its hypothetical nature, or that it took place within a controlled environment. The following studies employed actual or tailored LBSN.

7.2.2. Semi-Automated and Customizable LBSN Studies

Brown and Taylor [61] implemented the Whereabouts Clock, a location based service which displayed the location of family members on a clock face with four values: at any given point in time, an individual had the status of being at home, at work, at school, or elsewhere. This study revealed that LBSN within the family context could help co-ordination and communication and provide reassurance and connectedness, although it also caused some unnecessary anxiety. Privacy was found not to be an issue among family members using the Whereabouts Clock. The LBSN technology used in this study was more sophisticated than in prior studies, but it was rather limited in geographic granularity.
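The Whereabouts Clock's coarse granularity can be sketched as a simple reduction of raw position data to one of four display values. The zone coordinates and radius below are hypothetical, chosen purely for illustration:

```python
# Sketch of a four-value location display in the style of the
# Whereabouts Clock: raw positions are reduced to home/work/school/
# elsewhere, so the display reveals whereabouts without fine-grained
# coordinates. Zone centres (x, y) and the radius are hypothetical.

from math import hypot

ZONES = {"home": (0.0, 0.0), "work": (5.0, 5.0), "school": (9.0, 1.0)}
RADIUS = 1.0  # how close a reading must be to count as "in" a zone


def clock_status(x: float, y: float) -> str:
    """Map a position to one of four values: home, work, school, elsewhere."""
    for label, (zx, zy) in ZONES.items():
        if hypot(x - zx, y - zy) <= RADIUS:
            return label
    return "elsewhere"
```

The deliberate coarseness is the design point: any position outside the three tagged zones collapses into "elsewhere", which is one reason privacy was less of an issue among family members.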

Humphreys performed a year-long qualitative field study on the mobile social network known as Dodgeball, which allowed users to ‘check in’ at a location which was then broadcast to people on their given network. The outcomes of this study revealed patterns of use of LBSN, the creation of a “third space” by LBSN, and the resultant social molecularization caused by Dodgeball use [85]. The limitation of this study again lies in the technology employed: the location information was not automated or real-time, as Dodgeball required the user to consciously provide manual location updates.

Barkhuus and Brown [86] conducted a trial using Connecto in order to investigate the emergent practices around LBSN. Connecto allowed users to tag physical locations, after which the phone would automatically change the user's displayed location to the tagged location, providing a closer simulation of real-time automated LBSN. The outcomes of this study demonstrated that users could use Connecto to establish a repartee and were self-conscious about the locations they disclosed. By publishing their location, users were found to engage in ongoing story-telling with their friends via a process of mutual monitoring. This act was seen as a “part of friendship relations” and added to an “ongoing relationship state.” There was also the additional expectation that users had to “have seen each others' location or else risk falling ‘out of touch’ with the group” [86].
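Connecto's tag-then-automate behavior can be illustrated with a toy sketch (not the actual implementation): the user manually labels a sensed radio fingerprint once, and the displayed location thereafter updates automatically whenever that fingerprint is observed again.

```python
# Toy model of Connecto-style location sharing. A "fingerprint" stands
# in for whatever radio signature the phone senses at a place; the
# names here are illustrative, not Connecto's own identifiers.

class TaggedLocationSharer:
    def __init__(self) -> None:
        self._tags: dict[str, str] = {}   # fingerprint -> user-chosen label
        self.displayed = "unknown"        # what friends currently see

    def tag(self, fingerprint: str, label: str) -> None:
        """User manually names the place currently sensed as `fingerprint`."""
        self._tags[fingerprint] = label
        self.displayed = label

    def sense(self, fingerprint: str) -> str:
        """Automatic update on each new radio observation."""
        self.displayed = self._tags.get(fingerprint, "unknown")
        return self.displayed
```

The design splits disclosure into a one-off manual act (tagging) and an ongoing automatic one (sensing), which is what made the trial a closer approximation of real-time automated LBSN than Dodgeball's purely manual check-ins.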

7.2.3. Real-time LBSN Studies

LBSN studies published after the 2008 calendar year use methods that take advantage of sophisticated real-time automated LBSN applications. Tsai and Kelley [87] developed Locyoution, a Facebook application which automatically located users' laptops via wireless fidelity (Wi-Fi) access points, leveraging the SkyHook technology. The aim of the study was to investigate how important feedback is for managing personal privacy in ubiquitous systems. Participants were divided into two groups: one group received no information about who had requested their location, while the other group was able to view their location disclosure history. The four major findings of the study were that (1) providing feedback to users makes them more comfortable about sharing location, (2) feedback is a desired feature and makes users more willing to share location information, (3) time and group based rules are effective for managing privacy, and (4) peers and technical savviness have a significant impact upon use.

Vihavaninen and Oulasvirta [88] performed three field trials of Jaiku, a mobile microblogging service that automates the disclosure and diffusion of location information. The focus of the field trials was on investigating the use, user response and user understanding of automation. The results revealed that automation caused issues related to control, understanding, emergent practices and privacy. This study is significant as one of the first to investigate the implications of automated location disclosure upon user perceptions. The study, however, does not investigate the implications of the use of automated LBSN upon social relationships.

An ethnographic study by Page and Kobsa explored people's attitudes towards, and adoption of, Google Latitude, a real-time and automated LBSN. The focus of this study was upon “how participants perceive[d] Latitude to be conceptually situated within the ecology of social networking and communication technologies” [65], based upon technology adoption, social norms, audience management, information filtering and benefits. This study, while innovative, presented preliminary results based upon 12 interviews of users and non-users of Latitude.

The user studies conducted upon LBSN have matured over time, with more recent studies employing sophisticated LBSN which provide automated real-time location disclosure. These studies provide insight into user perceptions and use of LBSN; however, issues of control, security and trust have been neglected, although they are becoming increasingly pertinent to both location based services and online social networking technologies. Furthermore, there has been no more than a cursory investigation into the implications of using LBSN upon social relationships.

Section 8. Towards a Study Investigating the Social Implications of LBSN on Relationships

Location based social networking is an emerging and evolving technology, with current applications still very much in their infancy. Previous works reflect the state of the technology in late 2008, utilizing hypothetical scenario methods or unsophisticated non-real-time incarnations of LBSN. While new research has begun to utilize more sophisticated mobile software applications such as Google Latitude, a sober full-length study is absent from the literature. The need for such a study, however, is escalating as LBSN applications proliferate and more and more mobile Internet users become aware of LBSN and/or adopt the technology. What remains to be explored in the area of LBSN are the concepts of control, security and trust, and the effect of these emerging technologies upon social relationships.

Between February and May 2010, the number of fully-fledged LBSN applications more than doubled, from fifty to over one hundred [89]. This is a substantial increase when one considers that in late 2009 there were about 30 functional LBSN applications, but only about 8 that would generally be considered usable, reliable, or worth using. Today, innovative developers are piggybacking on top of the Google platform and offering niche LBSN applications targeted at dating services, adventure sports, hobbyists, expertise and qualifications, and other demographic profiling categories. Table 2 shows a list of over 100 LBSN applications. Although this is not an exhaustive list, one can only imagine the potential for such services, and the unforeseen consequences (positive and negative) that may ensue from their widespread adoption.

TABLE 2. A List of LBSN Applications [89]

8.1. Trust and Technology

Many studies concerning trust and technology focus upon trust in technology. Trust is an important aspect of human interaction, including human interaction with technology; however, that interaction is a two-way event, and only minimal research has been undertaken to observe the impact of technology upon trust. Two studies were found that focus upon the effect of technology upon trust.

Vasalou, Hopfensitz and Pitt [90] examined how trust can break down in online interactions. Trust breakdowns can result not only from intentional acts but also from unintentional or exceptional ones. Their paper, “In praise of forgiveness: ways for repairing trust breakdowns in one-off online interactions,” also proposes methods for fairly assessing an offender, in order to determine whether the act that resulted in the trust breakdown was intentional, unintentional or exceptional.

The second study that examined the effect of technology on trust was conducted by Piccoli and Ives [17], and explored trust and the unintended effects of behavior control in virtual teams. The study was based upon observations of the conduct of virtual teams. The findings showed that behavior control mechanisms increase vigilance and make salient those instances in which individuals perceive team members to have failed to uphold their obligations [17].

8.2. Social Theory

Social informatics studies incorporate social theory into the study of technology. This research draws upon the theory of trust and its importance within friendships.

8.2.1. Trust

Trust is defined as the willingness of an individual to be vulnerable where there is the presence of risk and dependence or reliance between the parties [91]. There are two important things to note about this definition of trust. First, trust is not a behavior or choice but a state of mind in which the individual is willing to make themselves vulnerable. Second, trust is not a control mechanism but a substitute for control [92], although the relationship between trust and control is more complex than this [93]. In order to understand trust more fully it is important to understand the bases upon which trust is formed and the dynamic nature of trust.

Trust is formed upon three bases: (1) cognitive, (2) emotional or relational, and (3) behavioral [94]. The cognitive basis of trust refers to the “evidence of trustworthiness” or “good reason” to trust. It is not that evidence or knowledge amounts to trust, but that “when social actors no longer need or want any further evidence or rational reasons for their confidence in the objects of trust,” they are able to make the cognitive “leap” into trust [94]. The emotional basis of trust refers to the emotional bond between parties, which provides the interpersonal platform for trust. Finally, behavioral trust is the behavioral enactment of trust. To illustrate behavioral trust, consider two individuals, A and B, where A trusts B with task X. If B performs task X, then the trust that A has in B will be confirmed; trust is thereby behaviorally enacted. In the same way, acting incongruently can reduce trust. The behavioral basis of trust also feeds into the fact that trust is a dynamic concept: when “a trustor takes a risk in a trustee that leads to a positive outcome, the trustor's perceptions of the trustee are enhanced. Likewise, perceptions of the trustee will decline when trust leads to unfavorable conclusions” [92].

8.2.2. Trust and Friendship

Trust is a vitally important element of friendship. Trust secures the “stability of social relationships” [4]. Friendships are described as being “based on trust, reciprocity and equality… which is an important source of solidarity and self-esteem” [4]. Trust is also described as a timelessly essential factor of friendships: “the importance of mutual commitment, loyalty and trust between friends will increase and may become an essential element of modern friendship regardless of other changes, which may be expected as the nature of social communication and contracts is transformed” [4].

Section 9. Conclusion

Online social networking technologies have already transformed the way in which people interact in the virtual space. Generally, younger people are more inclined to interact via features on online social networks than with traditional forms of online communication such as electronic mail. The ability to look up a “friend's” location using a location based social network now grants individuals even greater freedom to interact with one another in an almost omniscient manner. Not only do we now know the ‘who’ (identity) of a person, but we also know the ‘whereabouts’ (location) of a person, and from the profile data available on the online social network we also know something more about one's ‘context.’ If used appropriately, these new applications have the potential to strengthen individual relationships and provide an unforeseen level of convenience between “friends,” including partners, siblings, parents and children, and employers and employees. However, there is also the danger that these technologies can be misused and threaten fundamental threads that society is built upon, such as trust. This literature review has attempted to establish what previous research has already been conducted in the area of LBSN, and what has yet to be done. Our future work will focus on participant real-time automated LBSN fieldwork, with a view to understanding the impact of LBSN on trust between people, and the broader social implications of this emerging technology upon society.

References

1. R. Kling, "What is social informatics and why does it matter?", The Information Society, vol. 23, pp. 205-220, 2007.

2. K. Robert, K. Sara, "Internet paradox revisited", Journal of Social Issues, vol. 58, pp. 49-74, 2002.

3. A. Drummond, Teenager missing after Facebook meeting, 14 May 2010.

4. B. Misztal, Trust in Modern Societies - The Search for the Bases of Social Order, Cambridge: Blackwell Publishers, 1998.

5. See where your friends are with Google Latitude, February 2009.

6. R. Kling, H. Rosenbaum, "Social informatics in information science: An introduction", Journal of the American Society for Information Science, vol. 49, pp. 1047-1052, 1998.

7. R. Kling, "Social Informatics", Encyclopedia of Library and Information Science, pp. 2656-2661, 2003.

8. R. Kling, "Learning About Information Technologies and Social Change: The Contribution of Social Informatics", The Information Society, vol. 16, pp. 217-232, 2000.

9. R. Kling, "Social Informatics: A New Perspective on Social Research about Information and Communication Technologies", Prometheus, vol. 18, pp. 245-264, 2000.

10. D. MacKenzie, J. Wajcman, "Introductory Essay: The Social Shaping of Technology" in The Social Shaping of Technology, Philadelphia: Open University Press, pp. 2-27, 1999.

11. S. Russell, R. Williams, K. Sorensen, R. Williams, "Social Shaping of Technology: Frameworks Findings and Implications for Policy With Glossary of Social Shaping Concepts" in Shaping Technology Guiding Policy: Concepts Spaces and Tools, Cheltenham: Elgar, pp. 37-131, 2002.

12. R. Williams, D. Edge, "The Social Shaping of Technology", Research Policy, vol. 25, pp. 856-899, 1996.

13. S. Sawyer, K. Eschenfelder, "Social informatics: Perspectives examples and trends", Annual Review of Information Science and Technology, vol. 36, pp. 427-465, 2002.

14. R. Kling, L. Covi, "Electronic journals and legitimate media in the systems of scholarly communication", The Information Society, vol. 11, pp. 261-271, 1995. 

15. W. Orlikowski, "Learning from notes: Organizational issues in GroupWare implementation", The Information Society, vol. 9, pp. 237-250, 1993.

16. B. Kahin, J. Keller, Public Access to the Internet, Cambridge: MIT Press, 1995.

17. G. Piccoli, B. Ives, "Trust and the Unintended Effects of Behavior Control in Virtual Teams", MIS Quarterly, vol. 27, pp. 365-395, 2003.

18. S. Sawyer, A. Tapia, "From Findings to Theories: Institutionalizing Social Informatics", The Information Society, vol. 23, pp. 263-275, 2007.

19. O. K. Ngwenyama, A. S. Lee, "Communication Richness in Electronic Mail: Critical Social Theory and the Contextuality of Meaning", MIS Quarterly, vol. 21, pp. 145-167, 1997.

20. S. Hansen, N. Berente, "Wikipedia Critical Social Theory and the Possibility of Rational Discourse", The Information Society, vol. 25, pp. 38-59, 2009.

21. D. Cecez-Kecmanovic, "Doing critical IS research: the question of methodology" in Qualitative Research in Information Systems: Issues and Trends, Hershey:Idea Group Publishing, pp. 141-163, 2001.

22. D. M. Boyd, "Friendster and publicly articulated social networking" in CHI '04 on Human Factors in Computing Systems, Vienna, Austria, 2004.

23. S. Counts, K. E. Fisher, "Mobile Social Networking: An Information Grounds Perspective", Proceedings of the 41st Annual Hawaii International Conference on System Sciences, 2008.

24. A. C. Weaver, B. B. Morrison, "Social Networking", Computer, vol. 41, pp. 97-100, 2008.

25. C. Lampe, N. Ellison, C. Steinfield, "A face(book) in the crowd: social Searching vs. social browsing", Proceedings of the 2006 20th Anniversary Conference on Computer Supported Cooperative Work, 2006.

26. C. Lampe, N. B. Ellison, C. Steinfield, "Changes in use and perception of facebook", Proceedings of the ACM 2008 conference on Computer supported cooperative work, 2008.

27. A. N. Joinson, Proceeding of the twenty-sixth annual SIGCHI conference on Human factors in computing systems, 2008.

28. J. T. Hancock, C. L. Toma, "I know something you don't: the use of asymmetric personal information for interpersonal advantage", Proceedings of the ACM 2008 conference on Computer supported cooperative work, 2008.

29. Y.-Y. Ahn, S. Han, Proceedings of the 16th international conference on World Wide Web, 2007.

30. C. N. Chapman, M. Lahav, "International ethnographic observation of social networking sites" in CHI '08 extended abstracts on Human factors in computing systems, Florence, Italy:, 2008.

31. R. Arjan, U. Pfeil, P. Zaphiris, "Age differences in online social networking", Conference on Human Factors in Computing Systems, 2008.

32. J. Schrammel, C. Köffel, M. Tscheligi, "How much do you tell?: information disclosure behavior in different types of online communities", Proceedings of the fourth international conference on Communities and technologies, 2009.

33. J. DiMicco, D. R. Millen, "Motivations for social networking at work", Proceedings of the ACM 2008 conference on Computer supported cooperative work, 2008.

34. M. M. Skeels, J. Grudin, "When social networks cross boundaries: a case study of workplace use of facebook and linkedin", Proceedings of the ACM 2009 international conference on Supporting group work, 2009.

35. I. Liccardi, A. Ounnas, "The role of social networks in students' learning experiences" in Working group reports on ITiCSE on Innovation and technology in computer science education, Dundee, Scotland:, 2007.

36. S. Bystein, J. Rose, "The Role of Social Networking Services in eParticipation", Proceedings of the 1st International Conference on Electronic Participation, 2009.

37. R. M. Beth, M. C. John, "wConnect: a facebook-based developmental learning community to support women in information technology", Proceedings of the fourth international conference on Communities and technologies, 2009.

38. J. Donath, D. Boyd, "Public displays of connection", BT Technology Journal, vol. 22, pp. 71-82, 2004.

39. D. Boyd, "Reflections on Friendster Trust and Intimacy", Ubiquitous Computing Workshop application for the Intimate Ubiquitous Computing Workshop, 2003.

40. C. Dwyer, "Digital Relationships in the “MySpace” Generation: Results From a Qualitative Study", Proceedings of the 40th Annual Hawaii International Conference on System Sciences, 2007.

41. H. Chun, H. Kwak, "Comparison of online social relations in volume vs interaction: a case study of cyworld", Proceedings of the 8th ACM SIGCOMM conference on Internet measurement, 2008.

42. N. Ellison, C. Steinfield, C. Lampe, "The Benefits of Facebook “Friends:” Social Capital and College Students Use of Online Social Network Sites", Journal of Computer-Mediated Communication, vol. 12, pp. 1143-1168, 2007.

43. K. Subrahmanyam, P. Greenfield, "Online communication and adolescent relationships", The Future of Children, vol. 18, pp. 119-128, 2008.

44. R. Gross, A. Acquisti, "Information Revelation and Privacy in Online Social Networks", Workshop on Privacy in Electronic Society, 2005.

45. J. Snyder, D. Carpenter, "MySpace.com - A Social Networking Site and Social Contract Theory", Information Systems Education Journal, vol. 5, pp. 3-11, 2007.

46. C. Dwyer, S. Hiltz, Passerini, "Trust and privacy concern within social networking sites: A comparison of Facebook and MySpace", Proceedings of the Thirteenth Americas Conference on Information Systems (AMCIS), 2007.

47. D. Rosenblum, "What Anyone Can Know: The Privacy Risks of Social Networking Sites", IEEE Security & Privacy, vol. 5, pp. 40-49, 2007.

48. M. Mohammad, C. O. Paul, "Privacy-enhanced sharing of personal content on the web", Proceeding of the 17th international conference on World Wide Web, 2008.

49. S. Katherine, L. H. Richter, "Strategies and struggles with privacy in an online social networking community", Proceedings of the 22nd British HCI Group Annual Conference on HCI 2008: People and Computers XXII: Culture Creativity Interaction, vol. 1, 2008.

50. A. Levin, M. Foster, The Next Digital Divide: Online Social Network Privacy, 2008.

51. J. Golbeck, U. Kuter, "The Ripple Effect: Change in Trust and Its Impact Over a Social Network", Computing with Social Trust, pp. 169-181, 2009.

52. C. James, L. Ling, "Socialtrust: tamper-resilient trust establishment in online communities", Proceedings of the 8th ACM/IEEE-CS joint conference on Digital libraries, 2008.

53. S. Gambi, W. Reader, "The Development of Trust in Close Friendships Formed within Social Network Sites", Proceedings of the WebSci'09: Society On-Line, 2009.

54. L. Bilge, T. Strufe, Balzarotti, Kirda, "All your contacts are belong to us: automated identity theft attacks on social networks", Proceedings of the 18th international conference on World wide web, 2009.

55. G. Kaupins, R. Minch, "Legal and ethical implications of employee location monitoring", International Journal of Technology and Human Interaction, vol. 2, pp. 16-20, 2006.

56. G. D. Smith, "Private eyes are watching you: with the implementation of the E-911 mandate who will watch every move you make? (Telecommunications Act of 1996: Ten Years Later Symposium)", Federal Communications Law Journal, vol. 58, pp. 705-721, 2006.

57. E. Troshynski, C. Lee, Dourish, "Accountabilities of presence: reframing location-based systems", Proceeding of the twenty-sixth annual SIGCHI conference on Human factors in computing systems, 2008.

58. C. Strawn, "Expanding The Potential for GPS Evidence Acquisition", Small Scale Digital Device Forensics Journal, vol. 3, pp. 1-12, 2009.

59. Y. Xiao, B. Shen, "Security and Privacy in RFID and application in telemedicine", IEEE Communications Magazine, vol. 44, pp. 64-72, 2006.

60. A. Masters, K. Michael, "Lend me your arms: The use and implications of humancentric RFID", Electronic Commerce Research Applications, vol. 6, pp. 29-39, 2007.

61. B. Brown, A. Taylor, "Locating Family Values: A Field Trial of the {Whereabouts} Clock", UbiComp 2007, 2007.

62. L.-D. Chou, N.-H. Lai, Y.-W. Chen, Y.-J. Chang, L.-F. Huang, W.-L. Chiang, H.-Y. Chiu, J.-Y. Yang, "Management of mobile social network services for families with Developmental Delay Children", 10th International Conference on e-health Networking Applications and Services: HealthCom 2008, 2008.

63. D. J. Glasser, K. W. Goodman, "Chips tags and scanners: Ethical challenges for radio frequency identification", Ethics and Information Technology, vol. 9, pp. 101-109, 2007.

64. L. Nan, C. Guanling, "Analysis of a Location-Based Social Network", International Conference on Computational Science and Engineering, 2009.

65. X. Page, A. Kobsa, "The Circles of Latitude: Adoption and Usage of Location Tracking in Online Social Networking", International Conference on Computational Science and Engineering, 2009.

66. M. G. Michael, S. J. Fusco, K. Michael, "A Research Note on Ethics in the Emerging Age of überveillance", Computer Communications, vol. 31, pp. 1192-1199, 2008.

67. L. Perusco, K. Michael, "Humancentric applications of precise location based services", International Conference on eBusiness Engineering, 2005.

68. J. Weckert, "Trust and monitoring in the workplace", IEEE Symposium on Technology and Society, 2000.

69. G. Borriello, "RFID: tagging the world", Communications of the ACM, vol. 48, pp. 34-37, 2005.

70. J. E. Dobson, P. F. Fisher, "Geoslavery", IEEE Technology and Society Magazine, vol. 22, pp. 47-52, 2003.

71. P. Joore, "Social aspects of location-monitoring systems: the case of Guide Me and of My-SOS", Social Science Information, vol. 47, pp. 253-274, 2008.

72. J. E. Dobson, P. F. Fisher, "The Panopticon's Changing Geography", The Geographical Review, vol. 97, pp. 307-323, 2007.

73. E. M. Dowdell, "You are here! Mapping the boundaries of the Fourth Amendment with GPS technology", Rutgers Computer and Technology Law Journal, vol. 32, pp. 109-131, 2005.

74. V. Lockton, R. Rosenberg, "RFID: The Next Serious Threat to Privacy", Ethics and Information Technology, vol. 7, pp. 221-231, 2005.

75. S. L. Garfinkel, A. Juels, "RFID Privacy: An Overview of Problems and Proposed Solutions", IEEE Security and Privacy, pp. 34-43, 2005.

76. M. Gadzheva, "Privacy concerns pertaining to location-based services", International Journal of Intercultural Information Management, vol. 1, p. 49, 2007.

77. J. L. Wang, M. Loui, "Privacy and ethical issues in location-based tracking systems", Proceedings of the IEEE Symposium on Technology and Society, 2009.

78. L. Barkhuus, A. Dey, "Location-Based Services for Mobile Telephony: a study of user's privacy concerns", Proceedings of the INTERACT 9th IFIP TC13 International Conference on Human-Computer Interaction, 2003.

79. A. Aloudat, K. Michael, R. Abbas, "Location-Based Services for Emergency Management: A Multi-Stakeholder Perspective", Eighth International Conference on Mobile Business (ICMB 2009), 2009.

80. S. A. Grandhi, Q. Jones, Karam, "Sharing the big apple: a survey study of people place and locatability" in presented at CHI '05 extended abstracts on Human factors in computing systems, Portland, OR:, 2005.

81. S. J. Fusco, K. Michael, M. G. Michael, R. Abbas, "Exploring the Social Implications of Location Based Social Networking: An inquiry into the perceived positive and negative impacts of using LBSN between friends", International Conference on Mobile Business, 2010.

82. L. Barkhuus, "Privacy in Location-Based Services Concern vs. Coolness", HCI 2004 workshop: Location System Privacy and Control, 2004.

83. S. Patil, J. Lai, "Who gets to know what when: configuring privacy permissions in an awareness application", Proceedings of the SIGCHI conference on Human factors in computing systems, 2005.

84. S. Consolvo, I. E. Smith, "Location disclosure to social relations: why when & what people want to share", Proceedings of the SIGCHI conference on Human factors in computing systems, 2005.

85. L. Humphreys, "Mobile Social Networks and Social Practice: A Case Study of Dodgeball", Journal of Computer-Mediated Communication, vol. 13, pp. 341-360, 2008.

86. L. Barkhuus, B. Brown, "From awareness to repartee: sharing location within social groups", Proceeding of the twenty-sixth annual SIGCHI conference on Human factors in computing systems, 2008.

87. J. Y. Tsai, P. Kelley, "Who's viewed you?: the impact of feedback in a mobile location-sharing application", Proceedings of the 27th international conference on Human factors in computing systems, 2009.

88. S. Vihavainen, A. Oulasvirta, "“I can't lie anymore!”: The implications of location automation for mobile social applications", 6th Annual International Mobile and Ubiquitous Systems: Networking & Services, 2009.

89. C. Schapsis, Location Based Social Networks Links: A list of Location Based Social Networks, 2010.

90. A. Vasalou, A. Hopfensitz, J. Pitt, "In praise of forgiveness: Ways for repairing trust breakdowns in one-off online interactions", International Journal of Human-Computer Studies, vol. 66, pp. 466-480, 2008.

91. D. Rousseau, S. Sitkin, "Not So Different After All: A Cross-Discipline View of Trust", Academy of Management Review, vol. 23, pp. 393-404, 1998.

92. R. C. Mayer, J. H. Davis, "An Integrative Model of Organizational Trust", Academy of Management Review, vol. 20, pp. 709-734, 1995.

93. K. Bijlsma-Frankema, A. C. Costa, "Understanding the Trust-Control Nexus", International Sociology, vol. 20, pp. 259-282, 2005.

94. J. D. Lewis, A. Weigert, "Trust as a Social Reality", Social Forces, vol. 63, pp. 967-985, 1985.

Acknowledgments

The authors would like to acknowledge the funding support of the Australian Research Council (Discovery grant DP0881191): “Toward the Regulation of the Location-Based Services Industry: Influencing Australian Government Telecommunications Policy”.

Keywords

Informatics, Social network services, Space technology, Privacy, Communications technology, Information systems, Social implications of technology, Context, Surveillance, Smart phones, social networking (online), data privacy, mobile computing, social aspects of automation, information and communication technology design, social informatics, location-based social networking, mobile application, classification model, location based service, online social networking, trust, friendship, privacy, social life

Citation:  Sarah Jean Fusco, Katina Michael and M. G. Michael, "Using a social informatics framework to study the effects of location-based social networking on relationships between people: A review of literature",  2010 IEEE International Symposium on Technology and Society (ISTAS), 7-9 June 2010, Wollongong, Australia, DOI: 10.1109/ISTAS.2010.5514641