Social Implications of Technology: The Past, the Present, and the Future

Abstract

The social implications of a wide variety of technologies are the subject matter of the IEEE Society on Social Implications of Technology (SSIT). This paper reviews the SSIT's contributions since the Society's founding in 1982, and surveys the outlook for certain key technologies that may have significant social impacts in the future. Military and security technologies, always of significant interest to SSIT, may become more autonomous with less human intervention, and this may have both good and bad consequences. We examine some current trends such as mobile, wearable, and pervasive computing, and find both dangers and opportunities in these trends. We foresee major social implications in the increasing variety and sophistication of implant technologies, leading to cyborgs and human-machine hybrids. The possibility that the human mind may be simulated in and transferred to hardware may lead to a transhumanist future in which humanity redesigns itself: technology would become society.

SECTION I. Introduction

“Scientists think; engineers make.” Engineering is fundamentally an activity, as opposed to an intellectual discipline. The goal of science and philosophy is to know; the goal of engineering is to do something good or useful. But even in that bare-bones description of engineering, the words “good” and “useful” have philosophical implications.

Because modern science itself has existed for only 400 years or so, the discipline of engineering in the sense of applying scientific knowledge and principles to the satisfaction of human needs and desires is only about two centuries old. But for such a historically young activity, engineering has probably done more than any other single human development to change the face of the material world.

It took until the mid-20th century for engineers to develop the kind of self-awareness that leads to thinking about engineering and technology as they relate to society. Until about 1900, most engineers felt comfortable in a “chain-of-command” structure in which the boss—whether it be a military commander, a corporation, or a wealthy individual—issued orders that were to be carried out to the best of the engineer's technical ability. Fulfillment of duty was all that was expected. But as the range and depth of technological achievements grew, engineers, philosophers, and the public began to realize that we had all better take some time and effort to think about the social implications of technology. That is the purpose of the IEEE Society on Social Implications of Technology (SSIT): to provide a forum for discussion of the deeper questions about the history, connections, and future trends of engineering, technology, and society.

This paper is not focused on the history or future of any particular technology as such, though we will address several technological issues in depth. Instead, we will review the significant contributions of SSIT to the ongoing worldwide discussion of technology and society, and how technological developments have given rise to ethical, political, and social issues of critical importance to the future. SSIT is the one society in IEEE where engineers and allied professionals are encouraged to be introspective—to think about what they are doing, why they are doing it, and what effects their actions will have. We believe the unique perspective of SSIT enables us to make a valuable contribution to the panoply of ideas presented in this Centennial Special Issue of the Proceedings of the IEEE.

 

SECTION II. The Past

A. Brief History of SSIT

SSIT as a technical society in IEEE was founded in 1982, after a decade as the Committee on Social Responsibility in Engineering (CSRE). In 1991, SSIT held its first International Symposium on Technology and Society (ISTAS), in Toronto, ON, Canada. Since 1996, the Symposium has been held annually, with venues intentionally located outside the continental United States every few years in order to increase international participation.

SSIT total membership was 1705 as of December 2011. Possibly because SSIT does not focus exclusively on a particular technical discipline, it is rare that SSIT membership is a member's primary connection to IEEE. As SSIT's parent organization seeks ways to increase its usefulness and relevance to the rapidly changing engineering world of the 21st century, SSIT will both chronicle and participate in the changes taking place in engineering and in society as a whole. For a more detailed history of the first 25 years of SSIT, see [1].

B. Approaches to the Social Implications of Technology

In the historical article referred to above [1], former SSIT president Clint Andrews remarked that there are two distinct intellectual approaches that one can take with regard to questions involving technology and society. The CSRE and the early SSIT followed what he calls the “critical science” approach, which “tends to focus on the adverse effects of science and technical change.” Most IEEE societies are organized around a particular set of technologies. The underlying assumption of many in these societies is that these particular technologies are beneficial, and that the central issues to be addressed are technical, e.g., having to do with making the technologies better, faster, and cheaper. Andrews viewed this second “technological optimism” trend as somewhat neglected by SSIT in the past, and expressed the hope that a more balanced approach might attract a larger audience to the organization's publications and activities. It is important to note, however, that from the very beginning, SSIT has called for a greater emphasis on the development of beneficial technology such as environmentally benign energy sources and more efficient electrical devices.

In considering technology in its wider context, issues that go unquestioned in a purely technical forum may become open to question. Technique A may be more efficient and a fraction of the cost of technique B in storing data with similar security provisions, but what if a managed offshore shared storage solution is not the best thing to do under a given set of circumstances? The question of whether A or B is better technologically (and economically) is thus subsumed in the larger question of whether and why the entire technological project is going to benefit anyone, and who it may benefit, and who it may harm. The fact that opening up a discussion to wider questions sometimes leads to answers that cast doubt on the previously unquestioned goodness of a given enterprise is probably behind Andrews' perception that, on balance, the issues joined by SSIT have predominantly fallen into the critical-science camp. Just as no one expects the dictates of conscience to be in complete agreement with one's instinctive desires, a person seeking unalloyed technological optimism in the pages or discussions hosted by SSIT will probably be disappointed. But the larger aim is to reach conclusions about technology and society that most of us will be thankful for some day, if not today. A further aim is to bring issues to light and propose ways forward that safeguard against the negative effects of technologies on society.

C. Major Topic Areas of SSIT

In this section, we will review some (but by no means all) topics that have become recurring themes over the years in SSIT's quarterly peer-reviewed publication, the IEEE Technology & Society Magazine. The articles cited are representative only in the sense that they fall into categories that have been dealt with in depth, and are not intended to be a “best of” list. These themes fall into four broad categories: 1) war, military technology (including nuclear weapons), and security issues, broadly defined; 2) energy technologies, policies and related issues: the environment, sustainable development, green technology, climate change, etc.; 3) computers and society, information and communications technologies (ICT), cybersystems, cyborgs, and information-driven technologies; and 4) groups of people who have historically been underprivileged, unempowered, or otherwise disadvantaged: Blacks, women, residents of developing nations, the handicapped, and so on. Education and healthcare also fit in the last category because the young and the ill are in a position of dependence on those in power.

1. Military and Security Issues

Concern about the Vietnam War was a strong motivation for most of the early members of the Committee on Social Responsibility in Engineering, the predecessor organization of SSIT. The problem of how and even whether engineers should be involved in the development or deployment of military technology has continued to appear in some form throughout the years, although the end of the Cold War changed the context of the discussion. This category goes beyond formal armed combat if one includes technologies that tend to exert state control or monitoring on the public, such as surveillance technologies and the violation of privacy by various technical means. In the first volume of the IEEE Technology & Society Magazine published in 1982, luminaries such as Adm. Bobby R. Inman (ret.) voiced their opinions about Cold War technology [2], and the future trend toward terrorism as a major player in international relations was foreshadowed by articles such as “Technology and terrorism: privatizing public violence,” published in 1991 [3]. Opinions voiced in the Magazine on nuclear technology ranged from Shanebrook's 1999 endorsement of a total global ban on nuclear weapons [4] to Andrews' thorough review of national responses to energy vulnerability, in which he pointed out that France has developed an apparently safe, productive, and economical nuclear-powered energy sector [5]. In 2009, a special section of five articles appeared on the topic of lethal robots and their implications for ethical use in war and peacekeeping operations [6]. And in 2010, the use of information and communication technologies (ICT) in espionage and surveillance was addressed in a special issue on “Überveillance,” defined by authors M.G. Michael and K. Michael as the use of electronic means to track and gather information on an individual, together with the “deliberate integration of an individual's personal data for the continuous tracking and monitoring of identity and location in real time” [7].

2. Energy and Related Technologies and Issues

From the earliest years of the Society, articles on energy topics such as alternative fuels appeared in the pages of the IEEE Technology & Society Magazine. A 1983 article on Brazil's then-novel effort to supplement imported oil with alcohol from sugarcane [8] presaged today's controversial U.S. federal mandate for the ethanol content in motor fuels. The Spring 1984 issue hosted a debate on nuclear power generation between H. M. Gueron, director of New York's Con Edison Nuclear Coal and Fuel Supply division at the time [9], and J. J. MacKenzie, a senior staff scientist with the Union of Concerned Scientists [10]. Long before “greenhouse gases” became a household phrase bandied about in debates between Presidential candidates, the Magazine published an article examining the need to increase the U.S.'s peak electrical generating capacity because the increase in average temperature due to increasing atmospheric carbon dioxide would increase the demand for air conditioning [11]. The larger implications of global warming apparently escaped the attention of the authors, focused as they were on the power-generating needs of the state of Minnesota. By 1990, the greenhouse effect was of sufficient concern to show up on the legislative agendas of a number of nations, and although Cruver attributed this to the “explosion of doomsday publicity,” he assessed the implications of such legislation for future energy and policy planning [12]. Several authors in a special issue on the social implications of systems concepts viewed the Earth's total environment in terms of a complex system in 2000 [13]. The theme of ISTAS 2009 was the social implications of sustainable development, and this theme was addressed in six articles in the resulting special issue of the IEEE Technology & Society Magazine for Fall 2010.
The record of speculation, debate, forecasting, and analysis sampled here shows that not only has SSIT carried out its charter by examining the social implications of energy technology and related issues, but it has also shown itself a leader and forerunner in trends that later became large-scale public debates.

3. Computing, Telecommunications, and Cyberspace

Fig. 1. BRLESC-II computer built by U.S. Army personnel for use at the Ballistics Research Lab, Aberdeen Proving Grounds, between about 1967 and 1978; A. V. Kurian at console. Courtesy of U.S. Army Photos.

In the early years of SSIT, computers were primarily huge mainframes operated by large institutions (Fig. 1). But with the personal computer revolution and especially the explosion of the Internet, SSIT has done its part to chronicle and examine the history, present state, and future trends of the hardware, software, human habits and interactions, and the complex of computer and communications technologies that are typically subsumed under the acronym of ICT.

As we now know, the question of intellectual property has been vastly complicated by the ready availability of peer-to-peer software, high-speed network connections, and legislation passed to protect such rights. In a paper published in 1998, Davis addressed the question of protection of intellectual property in cyberspace [14]. As the Internet grew, so did the volume of papers on all sorts of issues it raised, from the implications of electronic profiling [15] to the threats and promises of facial recognition technology [16]. One of the more forward-looking themes addressed in the pages of the Magazine came in 2005 with a special issue on sustainable pervasive computing [17]. This issue provides an example of how both the critical science and the technological optimism themes cited by Andrews above can be brought together in a single topic. And to show that futuristic themes are not shirked by the IEEE Technology and Society Magazine authors, in 2011 Clarke speculated in an article entitled “Cyborg rights” on the limits and problems that may come as people physically merge with increasingly advanced hardware (implanted chips, sensory enhancements, and so on) [18].

4. Underprivileged Groups

Last but certainly not least, the pages of the IEEE Technology & Society Magazine have hosted articles inspired by the plight of underprivileged peoples, broadly defined. This includes demographic groups such as women and ethnic minorities and those disadvantaged by economic issues, such as residents of developing countries. While the young and the ill are not often formally recognized as underprivileged in the conventional sense, in common with other underprivileged groups they need society's help in order to survive and thrive, in the form of education and healthcare, respectively. An important subset of education is the theme of engineering ethics, a subject of vital interest to many SSIT members and officials since the organization's founding.

In its first year, the Magazine carried an article on ethical issues in decision making [19]. A special 1998 issue on computers and the Internet as used in the K-12 classroom explored these matters in eight focused articles [20]. The roles of ethics and professionalism in the personal enjoyment of engineering were explored by Florman (author of the book The Introspective Engineer) in an interview with the Magazine's managing editor Terri Bookman in 2000 [21]. An entire special issue was devoted to engineering ethics in education the following year, after changes in the U.S. Accreditation Board for Engineering and Technology's policies made it appear that ethics might receive more attention in college engineering curricula [22].

The IEEE Technology & Society Magazine has hosted many articles on the status of women, both as a demographic group and as a minority in the engineering profession. Articles and special issues on themes involving women have on occasion been the source of considerable controversy, even threatening the organization's autonomy at one point [1, p. 9]. In 1999, ISTAS was held for the first time in conjunction with two other IEEE entities: the IEEE Women in Engineering Committee and the IEEE History Center. The resulting special issue that came out in 2000 carried articles as diverse as the history of women in the telegraph industry [23], the challenges of being both a woman and an engineering student [24], and two articles on technology and the sex industry [25], [26].

Engineering education in a global context was the theme of a Fall 2005 special issue of the IEEE Technology and Society Magazine, and education has been the focus of several special issues and ISTAS meetings over the years [27]–[28][29]. The recent development termed “humanitarian engineering” was explored in a special issue only two years ago, in 2010 [30]. Exemplified by the U.S.-based Engineers without Borders organization, these engineers pursue projects, and sometimes careers, based not only on profit and market share, but also on the degree to which they can help people who might not otherwise benefit from their engineering talents.

SECTION III. The Present

Fig. 2.  Cow bearing an Australian National Livestock Identification System (NLIS) RFID tag on its ear. The cow's identity is automatically detected as it goes through the drafting gates and the appropriate feed is provided for the cow based on historical data on its milk yields. Courtesy of Adam Trevarthen.

Emerging technologies that will act to shape the next few years are complex in their makeup with highly meshed value chains that resemble more a process or service than an individual product [31]. At the heart of this development is convergence: convergence in devices, convergence in applications, convergence in content, and convergence in infrastructure. The current environment is typified by the move toward cloud computing solutions and Web 2.0 social media platforms with ubiquitous access via a myriad of mobile or fixed devices, some of which will be wearable on people and animals (Fig. 2) or embedded in systems (e.g., vehicles and household appliances).

Simultaneous with these changes are the emergence of web services that may or may not require a human operator for decision making in a given business process, reliance upon data streams from automatic identification devices [e.g., radio-frequency identification (RFID) tags], the accuracy and reliability of location-based services [e.g., using Global Positioning Systems (GPS)], and condition monitoring techniques (e.g., using sensors to measure temperature or other physiological data). Most of this new technology will be invisibly located in miniaturized semiconductors, which are set to reach such economies of scale that technology evangelists commonly predict every single living and nonliving thing will come equipped with a chip “on board.”

Fig. 3. Business woman checking in for an interstate trip using an electronic ticket sent to her mobile phone. Her phone also acts as a mobile payment mechanism and has built-in location services features. Courtesy of NXP Semiconductors 2009.

The ultimate vision of a Web of Things and People (WoTaP)—smart homes using smart meters, smart cars using smart roads, smart cities using smart grids—is one where pervasive and embedded systems will play an active role toward sustainability and renewable energy efficiency. The internetworked environment will need to be facilitated by a fourth-generation mobility capability which will enable even higher amounts of bandwidth to the end user as well as seamless communication and coordination by intelligence built into the cloud. Every smart mobile transaction will be validated by a precise location and linked back to a subject (Fig. 3).

In the short term, some of the prominent technologies that will impact society will be autonomous computing systems with built-in ambient intelligence which will amalgamate the power of web services and artificial intelligence (AI) through multiagent systems, robotics, and video surveillance technologies (e.g., even the use of drones) (Fig. 4). These technologies will provide advanced business and security intelligence. While these systems will lead to impressive uses in green initiatives and in making direct connections between people and dwellings, people and artifacts, and even people and animals, they will require end users to give up personal information related to identity, place, and condition to be drawn transparently from smart devices.

Fig. 4. A facial recognition system developed by Argus Solutions in Australia. Increasingly, facial recognition systems are being used in surveillance, usually based on video technology. Digital images captured from video or still photographs are compared with other precaptured images. Courtesy of Argus Solutions 2009.

The price of all of this will be that very little remains private any longer. The opportunities presented by emerging technologies are enormous, with a great number of positive implications for society—for instance, a decrease in the number of traffic accidents and fatalities, a reduction in the carbon emission footprint of each household, greater social interconnectedness, etc.—but ultimately these gains, too, will be subject to limitations. Who the designated controller is and what they will do with the acquired data is something we can only speculate about. We return, then, to the perennial question of “who will guard the guards themselves”: Quis custodiet ipsos custodes? [32]

A. Mobile and Pervasive Computing

In our modern world, data collection from many of our most common activities begins from the moment we step out our front door in the morning and continues until we go to sleep at night. In addition to this near-continual data collection, we have become a society of people who voluntarily broadcast a great deal of personal information to the world. From vacation photos and major life events to trivialities such as where we are having dinner and our most mundane thoughts, all of it forms part of the stream of data through which we electronically share our inner lives. This combination of the data that is collected about us and the data that is freely shared by us could form a breathtakingly detailed picture of an individual's life, if it could ever all be collected in one place. Most of us would consider ourselves fortunate that most of this data was historically never correlated and is usually highly anonymized. However, in general, it is becoming easier to correlate and deanonymize data sets.
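How easily an “anonymized” data set can be deanonymized is worth illustrating. The sketch below shows a linkage attack in miniature: records stripped of names are joined with a public roster on shared quasi-identifiers (postal code, birth year, sex). All records, field names, and values here are invented for illustration only.

```python
# Linkage ("deanonymization") sketch: an anonymized data set is re-identified
# by joining it with a public one on shared quasi-identifiers.
# Every record below is fabricated for illustration.

anonymized_health = [
    {"zip": "2500", "birth_year": 1975, "sex": "F", "diagnosis": "asthma"},
    {"zip": "2519", "birth_year": 1982, "sex": "M", "diagnosis": "diabetes"},
]

public_roster = [
    {"name": "J. Doe", "zip": "2500", "birth_year": 1975, "sex": "F"},
    {"name": "R. Roe", "zip": "2526", "birth_year": 1990, "sex": "M"},
]

def link(records, roster, keys=("zip", "birth_year", "sex")):
    """Re-identify records whose quasi-identifiers match exactly one person."""
    reidentified = []
    for rec in records:
        matches = [p for p in roster if all(p[k] == rec[k] for k in keys)]
        if len(matches) == 1:  # a unique match defeats the anonymization
            reidentified.append({"name": matches[0]["name"], **rec})
    return reidentified

print(link(anonymized_health, public_roster))
```

The attack succeeds whenever a combination of seemingly innocuous attributes is unique to one person in both data sets, which is why removing names alone is weak protection.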

1. Following Jane Doe's Digital Data Trail

Let us consider a hypothetical “highly tracked” individual [33]. Our Jane Doe leaves for work in the morning, and gets in her Chevrolet Impala, which has OnStar service to monitor her car. OnStar will contact emergency services if Jane has an accident, but will also report to the manufacturer any accident or mechanical failure the car's computer is aware of [34]. Jane commutes along a toll road equipped with electronic toll collection (ETC). The electronic toll system tracks where and at what time Jane enters and leaves the toll road (Fig. 5).

Fig. 5. Singapore's Electronic Road Pricing (ERP) system. The ERP uses a dedicated short-range radio communication system to deduct ERP charges from CashCards. These are inserted in the in-vehicle units of vehicles before each journey. Each time vehicles pass through a gantry when the system is in operation, the ERP charges are automatically deducted. Courtesy of Katina Michael 2003.

When she gets to work, she uses a transponder ID card to enter the building she works in (Fig. 6), which logs the time she enters and by what door. She also uses her card to log into the company's network for the morning. Her company's Internet firewall software monitors any websites she visits. At lunch, she eats with colleagues at a local restaurant. When she gets there, she “checks in” using a geolocation application on her phone—for doing so, the restaurant rewards her with a free appetizer [35].

 

Fig. 6. Employee using a contactless smart card to gain entry to her office premises. The card is additionally used to access elevators in the building, rest rooms, and secure store areas, and is the only means of logging into the company intranet. Courtesy of NXP Semiconductors 2009.

She then returns to work for the afternoon, again using her transponder ID badge to enter. After logging back into the network, she posts a review of the restaurant on a restaurant review site, or maybe a social networking site. At the end of the work day, Jane logs out and returns home along the same toll road, stopping to buy groceries at her local supermarket on the way. When she checks out at the supermarket, she uses her customer loyalty card to automatically use the store's coupons on her purchases. The supermarket tracks Jane's purchases so it can alert her when things she buys regularly are on sale.

During Jane's day, her movements were tracked by several different systems. During almost all of the time she spent out of the house, her movements were being followed. But Jane “opted in” to almost all of that tracking; it was her choice as the benefits she received outweighed her perceived costs. The toll collection transponder in her car allows her to spend less time in traffic [36]. She is happy to share her buying habits with various merchants because those merchants reward her for doing so [37]. In this world it is all about building up bonus points and getting rewarded. Sharing her opinions on review and social networking sites lets Jane keep in touch with her friends and lets them know what she is doing.

While many of us might choose to allow ourselves to be monitored for the individual benefits that accrue to us personally, the data being gathered about collective behaviors are much more valuable to business and government agencies. In the 1980s, Clarke developed the notion of dataveillance to give a name to the “systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons” [38]. ETC is used by millions of people in many countries. The more people who use it, as opposed to paying tolls at tollbooths, the faster traffic can flow for everyone. Everyone also benefits when ETC allows engineers to better monitor traffic flows and plan highway construction to avoid the busiest times of traffic. Geolocation applications let businesses reward first-time and frequent customers, and they can follow traffic to their business and see what customers do and do not like. Businesses such as grocery stores or drug stores that use customer loyalty cards are able to monitor buying trends to see what is popular and when. Increasingly, shoppers are being introduced to the near-field communication (NFC) capability on their third-generation (3G) smartphones (Fig. 7).

Fig. 7. Purchasing grocery items effortlessly by using the near-field communication (NFC) capability on your 3G smartphone. Courtesy of NXP Semiconductors 2009.

Some of these constant monitoring tools are truly personal and are controlled by and report back only to the user [39]. For example, there are now several adaptive home thermostat systems that learn a user's temperature preferences over time and allow users to track their energy usage and change settings online. For the health conscious, “sleep monitoring” systems allow users to track not only the hours of sleep they get per night, but also the percentage of time spent in light sleep versus rapid eye movement (REM) sleep, and their overall “sleep quality” [40].
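One plausible way such an adaptive thermostat could “learn” is by keeping an exponential moving average of the user's manual setpoints for each hour of the day. The sketch below is hypothetical logic under that assumption; it does not describe any real product's algorithm, and the class, parameter names, and learning rate are invented.

```python
# Sketch of a "learning" thermostat: blend each manual adjustment into a
# per-hour preference using an exponential moving average.
# Hypothetical illustration; no vendor's actual algorithm is implied.

class LearningThermostat:
    def __init__(self, default=20.0, alpha=0.3):
        self.alpha = alpha           # learning rate: weight of each new adjustment
        self.prefs = [default] * 24  # learned setpoint (deg C) for each hour of day

    def record_adjustment(self, hour, setpoint):
        """Blend a manual adjustment into the learned preference for that hour."""
        old = self.prefs[hour]
        self.prefs[hour] = (1 - self.alpha) * old + self.alpha * setpoint

    def schedule(self, hour):
        """Temperature the thermostat will target at a given hour."""
        return round(self.prefs[hour], 1)

t = LearningThermostat()
for _ in range(10):           # the user repeatedly turns the heat up at 7 a.m.
    t.record_adjustment(7, 22.0)
print(t.schedule(7))          # drifts from the 20.0 default toward 22.0
```

The same pattern, a running summary of user behavior stored on the device, underlies many of the personal monitoring tools described above, and it is also what makes their data valuable in aggregate.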

Fig. 8. Barcodes printed on individual packaged items on pallets. Order information is shown on the forklift's on-board laptop and the driver scans items that are being prepared for shipping using a handheld gun to update inventory records wirelessly. Courtesy AirData Pty Ltd, Motorola Premier Business Partner, 2009.

Businesses offer and customers use various mobile and customer tracking services because the offer is valued by both parties (Fig. 8). However, serious privacy and legal issues continue to arise [41]. ETC records have been subpoenaed in both criminal and civil cases [42]. Businesses in liquidation have sold their customer databases, violating the privacy agreements they gave to their customers when they were still in business. Geolocation services and social media that show a user's location or allow them to share where they have been or where they are going can be used in court cases to confirm or refute alibis [43].

 

Near-constant monitoring and reporting of our lives will only grow as our society becomes increasingly comfortable sharing more and more personal details (Fig. 9). In addition to the basic human desire to tell others about ourselves, information about our behavior as a group is hugely valuable to both governments and businesses. The benefits to individuals and to society as a whole are great, but the risks to privacy are also significant [44]. More information about group behaviors can let us allocate resources more efficiently, plan better for future growth, and generate less waste. More information about our individual patterns can allow us to do the same thing on a smaller scale—to waste less fuel heating our homes when there is no one present, or to better understand our patterns of human activity.

 

Fig. 9. A five step overview of how the Wherify location-based service works. The information retrieved by this service included a breadcrumb of each location (in table and map form), a list of time and date stamps, latitude and longitude coordinates, nearest street address, and location type. Courtesy of Wherify Wireless Location Services, 2009.

 

B. Social Computing

When we think of human evolution, we often think of biological adaptations to better survive disease or digest foods. But our social behaviors are also a product of evolution. Being able to read facial expressions and other nonverbal cues is an evolved trait and an essential part of human communication. In essence, we have evolved as a species to communicate face to face. Our ability to understand verbal and nonverbal cues has been essential to our ability to function in groups and therefore our survival [45].

The emoticon came very early in the life of electronic communication. This is not surprising, given how necessary facial expressions are for giving context to written words in the casual, humor-filled atmosphere of the Internet's precursors. Many other attempts to add context to the quick, casual writing style of the Internet have been made, mostly with less success. Indeed, the problem of communication devolving from normal conversations to meaningless shouting matches has been around almost as long as electronic communication itself. More recently, the “anonymous problem”—the problem of people anonymously harassing others without fear of response or retribution—has come under discussion in online forums and communities. And of course, we have seen the recent tragic consequences of cyberbullying [46]. In general, people will be much crueler to other people online than they would ever be in person; many of our evolved social mechanisms depend on seeing and hearing who we are communicating with.

The question we are faced with is this: Given that we now exist and interact in a world that our social instincts did not evolve to handle, how will we adapt to the technology, or, more likely, how will the technology we use to communicate adapt to us? We are already seeing the beginning of that adaptation: more and more social media sites require a “real” identity tied to a valid e-mail address. And everywhere on the Internet, “reputation” is becoming more and more important [177].

Reference sites, such as Wikipedia, control access based on reputation: by contributing to the community (writing and editing articles or joining community discussions), users gain privileges such as editing controversial topics or banning other users. On social media and review sites, users who are not anonymous have more credibility, and again reputation is earned through time and contribution to the community.
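The reputation-gating pattern described above can be sketched as a simple threshold model. This is a hypothetical illustration: the privilege names, thresholds, and scoring formula are assumptions for the sketch, not any real site's policy.

```python
# Hypothetical sketch of reputation-gated privileges on a community site.
# Thresholds, privilege names, and the scoring formula are illustrative
# assumptions, not any real site's policy.

PRIVILEGE_THRESHOLDS = {
    "comment": 0,         # anyone may join discussions
    "edit_article": 10,   # requires a modest contribution history
    "edit_controversial": 500,
    "ban_user": 2000,
}

def reputation(edits_made: int, discussions_joined: int, is_anonymous: bool) -> int:
    """Reputation grows with contributions; anonymity caps credibility."""
    score = 5 * edits_made + 2 * discussions_joined
    return min(score, 9) if is_anonymous else score

def may(user_reputation: int, privilege: str) -> bool:
    """A user holds a privilege once reputation reaches its threshold."""
    return user_reputation >= PRIVILEGE_THRESHOLDS[privilege]

newcomer = reputation(edits_made=1, discussions_joined=2, is_anonymous=False)
veteran = reputation(edits_made=300, discussions_joined=150, is_anonymous=False)

print(may(newcomer, "edit_article"))       # newcomer score 9 -> False
print(may(veteran, "edit_controversial"))  # veteran score 1800 -> True
```

The point of the sketch is the asymmetry the text describes: privileges accrue only through contribution over time, and anonymity (here, a hard cap on the score) limits what a user can do.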

It is now becoming standard practice for social media of all forms to allow users to control who can contact them and to make it very easy to block unwanted contact. In the future, these trends will be extended: any social media site with a significant amount of traffic will offer a way for users to build and maintain a reputation and to control access accordingly. The shift away from anonymity is set to continue, and it is also evident in the way search engine giants such as Google are consolidating their privacy statements, from numerous policies down to one. Google states: “When you sign up for a Google Account, we ask you for personal information. We may combine the information you submit under your account with information from other Google services or third parties in order to provide you with a better experience and to improve the quality of our services” [47].

Fig. 10. Wearable high-definition video calling and recording attire. Courtesy of Xybernaut 2002.

When people use technology to socialize, they are often doing it on mobile platforms. Therefore, the futures of social and mobile computing are inevitably intertwined. The biggest change coming to the shared mobile/social computing space is the final spread of WiFi and high-density mobile phone networks. There are still huge geographical areas with no way of wirelessly connecting to the Internet, or where the connection is so slow as to be unusable. As high-speed mobile Internet spreads, the extra bandwidth could help address the problems inherent in communicating without being able to see the other person. High-definition (HD) video calling on mobile phones will make person-to-person communications easier and more context rich (Fig. 10). HD video calling and conferencing will make everything from business meetings to long-distance relationships easier by allowing the participants to pick up on unspoken cues.


As more and more of our social interactions go online, the online world will be forced to adapt to our evolved human social behaviors. It will become much more like offline communication, with reputation and community standing being deeply important. True anonymity will become harder and harder to come by, as the vast majority of social media will require some proof of identity. For example, this practice is already occurring in countries like South Korea [48].

While we cannot predict all the ways in which our online interactions will become more immersive, we can say for certain that they will. The beauty of all of these changes is that it will become as easy to maintain or grow a personal relationship on the other side of the world as across town. As countries and regions currently without high-speed data networks come online, they can integrate into a new global community, allowing us all to know one another, with a diverse array of as-yet-unknown consequences.

C. Wearable Computing

Fig. 11. The prototype GPS Locator for Children with a built-in pager, 911 emergency request, GPS technology, and a key fob to manually lock and unlock the locator. This specific device is no longer being marketed, despite the apparent need in some contexts. Courtesy of Wherify Wireless Location Services, 2003.

According to Siewiorek [49, p. 82], the first wearable device was prototyped in 1961 but it was not until 1991 that the term “wearable computer” was first used by a research group at Carnegie Mellon University (Pittsburgh, PA). This coincided with the rise of the laptop computer, early models of which were known as “luggables.” Wearable computing can be defined as “anything that can be put on and adds to the user's awareness of his or her environment …mostly this means wearing electronics which have some computational power” [50, p. 2012]. While the term “wearables” is generally used to describe wearable displays and custom computers in the form of necklaces, tiepins, and eyeglasses, the definition has been broadened to incorporate iPads, iPods, personal digital assistants (PDAs), e-wallets, GPS watches (Fig. 11), and other mobile accessories such as smartphones, smart cards, and electronic passports that require the use of belt buckles or clip-on satchels attached to conventional clothing [51, p. 330]. The iPlant (Internet implant) is probably not far off either [52].


Wearable computing has reinvented the way we work and go about our day-to-day business and is set to make even greater changes in the foreseeable future [53]. In 2001, it was predicted that highly mobile professionals would be taking advantage of smart devices to “check messages, finish a presentation, or browse the Web while sitting on the subway or waiting in line at a bank” [54, p. 44]. This vision has indeed been realized but devices like netbooks are still being lugged around instead of worn in the true sense.

The next phase of wearables will be integrated into our very clothing and accessories, with some even pointing to the body itself being used as an input mechanism. Harrison of Carnegie Mellon's Human–Computer Interaction Institute (HCII), together with Microsoft researchers, produced Skinput, which turns the body that travels everywhere with us into one giant touchpad [55]. These are all exciting innovations, and few would deny the positives that will come from the application of this cutting-edge research. The challenge will be to avoid rushing this technology into the marketplace without commensurate testing of prototypes and due consideration of function creep. Function or scope creep occurs when a device or application is used for something other than that for which it was originally intended.

Early prototypes of wearable computers throughout the 1980s and 1990s could have been described as outlandish, bizarre, or even weird. For the greater part, wearable computing efforts have focused on head-mounted displays (a visual approach) that unnaturally interfered with human vision and made proximity to others cumbersome [56, p. 171]. But the long-term aim of researchers is to make wearable computing inconspicuous as soon as technical improvements allow (Fig. 12). The end user should look as “normal” as possible [57, p. 177].


Fig. 12. Self-portraits of Mann with wearable computing kit from the 1980s to the 1990s. Prof. Mann started working on his WearComp invention as far back as his high school days in the 1970s. Courtesy of Steve Mann.

New technologies like the “Looxcie” [58] wearable recorders have come a long way since the clunky point-of-view head-mounted recording devices of the 1980s, allowing people to effortlessly record and share their lives as they experience them in different contexts. Mann has aptly coined the term sousveillance, a type of inverse panopticon, from the French words sous (below) and veiller (to watch). A whole body of literature has emerged around the notion of sousveillance, which refers to the recording of an activity by a participant in that activity, typically by way of small wearable or portable personal technologies. The glogger.mobi online platform demonstrates the great power of sousveillance. But there are still serious challenges, such as privacy concerns, that need to be overcome if wearable computing is to become commonplace [59]. Just as Google has created StreetView, can the individual engage in “PersonView” without the consent of neighbors or strangers [7], the public versus private space debate notwithstanding? Connected to privacy is the critical issue of autonomy (and, if we were to agree with Kant, human dignity), that is, our right to make informed and uncoerced decisions.

While mass-scale commercial production of wearable clothing is still some time away, with some even calling it an unfulfilled pledge [60], shirts with simple memory functions have been developed and tested. Sensors will play a big part in the functionality of such smartware, helping to determine the environmental context; undergarments closest to the body will be used for monitoring bodily functions such as temperature, blood pressure, and heart and pulse rates. For now, however, the aim is to develop ergonomically astute wearable computing that is actually useful to the end user. Head-mounted displays attached with a headband may be practical for miners subject to occupational health and safety (OH&S) requirements but are unattractive to everyday consumer users. Next-generation displays will be mounted on, or concealed within, the eyeglasses themselves [61, p. 48].

Mann [57, p. 31] predicts that wearable computing, interwoven into everyday clothing, will one day become so common that “we will no doubt feel naked, confused, and lost without a computer screen hovering in front of our eyes to guide us,” just as we would feel naked without the conventional clothing of today.

1. Wearables in the Medical Domain

Unsurprisingly, wearables have also found a niche market in the medical domain. In the mid-1990s, researchers began to describe a small wearable device that continuously monitored glucose levels so that the right amount of insulin could be calculated for the individual, reducing the incidence of hypoglycemic episodes [62]. The Glucoday [63] and GlucoChip [64] are just two products demonstrating the potential to go beyond wearables toward in vivo techniques in medical monitoring.

Medical wearables even have the capability to check and monitor products in one's blood [65, p. 88]. Today medical wearable device applications include: “monitoring of myocardial ischemia, epileptic seizure detection, drowsiness detection …physical therapy feedback, such as for stroke victim rehabilitation, sleep apnea monitoring, long-term monitoring for circadian rhythm analysis of heart rate variability (HRV)” [66, p. 44].

Some of the current shortcomings of medical wearables are similar to those of conventional wearables, namely the size and weight of the device, which can be too large and too heavy. In addition, wearing the devices for long periods can be irritating due to the number of sensors that may need to be worn for monitoring. The gel applied for contact resistance between the electrode and the skin can also dry up, which is a nuisance. Other obstacles to the widespread diffusion of medical wearables include government regulations and manufacturers' requirements for limited liability in the event that the equipment makes an incorrect diagnosis.

But wearables have improved greatly over the past ten years. Thanks to commensurate breakthroughs in the miniaturization of computing components, wearable devices are now usually quite small. Consider Toumaz Technology's Digital Plaster invention, known as the Sensium Life Pebble TZ203002 (Fig. 13). The Digital Plaster contains a Sensium silicon chip, powered by a tiny battery, which sends data via a cell phone or a PDA to a central computer database. The Life Pebble enables continuous, auditable acquisition of physiological data without interfering with the patient's activities; it can continuously monitor electrocardiogram (ECG), heart rate, physical activity, and skin temperature. In an interview with M. G. Michael in 2006, Toumazou noted how the Digital Plaster had been applied in epilepsy control and depression. He said that by monitoring the electrical and chemical responses, researchers could predict the onset of either a depressive episode or an epileptic fit; once predicted, the nerve could be stimulated to counter the seizure [67]. He added that this truly signified “personal healthcare.”

Fig. 13. Prof. Christofer Toumazou with a patient wearing the “digital plaster”: a tiny electronic device, meant to be embedded in ordinary medical plaster, that includes sensors for monitoring health-related data such as blood pressure, temperature, and glucose levels. Courtesy of Toumaz Technology 2008.


D. Robots and Unmanned Aerial Systems and Vehicles

Fig. 14. Predator drone aircraft: this plane comes in reconnaissance and armed versions, designated RQ-1 and MQ-1, respectively.

Autonomous systems are those that are self-governed. In practice, there are many degrees of autonomy, ranging from the highly constrained and supervised to the unconstrained and intelligent. Some systems are referred to as “semiautonomous” to indicate that the machines are tasked or supervised by a human operator. An unmanned vehicle may be a remotely piloted “dumb” vehicle or an autonomous vehicle (Fig. 14). Robots may be designed to perform repetitive tasks in a highly constrained environment, or to exercise intelligence and a high level of autonomy in making judgments in a dynamic and unpredictable environment. As technological advances allow for higher levels of autonomy and expansion from industrial applications to caregiving and warfighting, society is coming to grips with the present and future of increasingly autonomous systems in our homes, workplaces, and battlefields.
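The spectrum of autonomy described above can be made concrete as a human-in-the-loop gate: a “semiautonomous” system proposes actions but defers to a human operator for consequential ones. The action names, modes, and approval rule below are assumptions for illustration, not any fielded system's design.

```python
from enum import Enum

class Autonomy(Enum):
    REMOTE_PILOTED = 1   # human issues every command
    SEMIAUTONOMOUS = 2   # machine proposes; human approves critical actions
    AUTONOMOUS = 3       # machine acts on its own within its constraints

# Which actions count as "critical" is an illustrative assumption.
CRITICAL_ACTIONS = {"engage_target", "administer_drug"}

def decide(action: str, mode: Autonomy, human_approves) -> bool:
    """Return True if the action may proceed under the given autonomy mode."""
    if mode is Autonomy.REMOTE_PILOTED:
        return human_approves(action)      # human decides everything
    if mode is Autonomy.SEMIAUTONOMOUS:
        if action in CRITICAL_ACTIONS:
            return human_approves(action)  # consequential: defer to the human
        return True                        # routine: machine proceeds
    return True                            # fully autonomous: machine proceeds

# In semiautonomous mode, a routine action proceeds without approval,
# while a critical one is gated on the operator.
always_deny = lambda a: False
print(decide("adjust_course", Autonomy.SEMIAUTONOMOUS, always_deny))  # True
print(decide("engage_target", Autonomy.SEMIAUTONOMOUS, always_deny))  # False
```

The sketch makes visible where accountability sits in each mode, which is exactly the design question the robot-ethics literature discussed below is contesting.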


Robot ethics, particularly with respect to autonomous weapons systems, has received increasing attention in the last few years [68]. While some call for an outright stop to the development of such technology [69], others seek to shape the technology with ethical and moral implications in mind [6], [70]–[73]. Driving robotic weapons development underground, or refusing to engage in dialog over the ethical issues, will not give ethicists an opportunity to participate in shaping the design and use of such weapons. Arkin [6] and Operto [74], among others, argue that engineers must not shy away from these ethical challenges. Furthermore, the technological cat is out of the bag: “Autonomy is subtle in its development—it is occurring in a step-by-step process, rather than through the creation of a disruptive invention. It is far less likely that we will have a sudden development of a ‘positronic brain’ or its equivalent, but rather a continual and gradual relinquishment of authority to machines through the constant progress of science, as we have already seen in automated trains, elevators, and numerous other examples, that have vanished into the background noise of civilization. Autonomy is already here by some definitions” [70].

The development and deployment of unmanned aerial vehicles and other autonomous or semiautonomous systems have outpaced the analysis of the social implications and ethics of their design and use [70], [75]. Sullivan argues that the evolution of unmanned vehicles for military deployment should not be confused with the more general trend of increasing autonomy in military applications [75]. Use of robots often provides a tactical advantage due to sensors, data processing, and physical characteristics that outperform humans. Robots can act without emotion, bias, or self-preservation influencing judgment, which may be a liability or an advantage. Risks of robot deployment in the military, the healthcare industry, and elsewhere include trust in autonomous systems (whether too little or too much) and the diffusion of blame or moral buffering [6], [72].

For such critical applications in the healthcare domain, and lethal applications in weapons, the emotional and physical distance of operating a remote system (e.g., drone strikes via a video-game-style interface) may negatively influence the moral decision making of the human operator or supervisor, while also providing some benefit of emotional protection against post-traumatic stress disorder [71], [72]. Human–computer interfaces can promote ethical choices in the human operator through thoughtful or model-based design, as suggested by Cummings [71] and Asaro [72].

For ethical behavior of the autonomous system itself, Arkin proposes that robot soldiers could be more humane than humans if technologically constrained to the laws of war and rules of engagement, which they could follow without the distortions of emotion, bias, or a sense of self-preservation [6], [70]. Asaro argues that such laws are not, in fact, objective and static, but rather meant for human interpretation case by case, and therefore could not be implemented in an automated system [72]. More broadly, Operto [74] agrees that a robot (in any application) can only act within the ethics incorporated into its laws, but that a learning robot, in particular, may not behave as its designers anticipate.

Fig. 15. Kotaro, a humanoid robot created at the University of Tokyo (Tokyo, Japan), presented at the University of Arts and Industrial Design Linz (Linz, Austria) during the Ars Electronica Festival 2008. Courtesy of Manfred Werner-Tsui.

Robot ethics is just one part of the landscape of social implications for autonomous systems. The field of human–robot interaction explores how robot interfaces and socially adaptive robots influence the social acceptance, usability, and safety of robots [76] (Fig. 15). For example, robots used for social assistance and care, such as for the elderly and small children, introduce a host of new social implications questions. Risks of developing an unhealthy attachment or losing human social contact are among the concerns raised by Sharkey and Sharkey [77]. Interface design can influence these and other risks of socially assistive robots, such as a dangerous misperception of the robot's capabilities or a compromise of privacy [78].


Autonomous and unmanned systems have related social implication challenges. Clear accountability and enforcing morality are two common themes in the ethical design and deployment of such systems. These themes are not unique to autonomous and unmanned systems, but perhaps the science fiction view of robots run amok raises the question “how can we engineer a future where we can benefit from these technologies while maintaining our humanity?”


SECTION IV. The Future

Great strides are being taken in the field of biomedical engineering: the application of engineering principles and techniques to the medical field [79]. New technologies such as prospective applications of nanotechnology, microcircuitry (e.g., implantables), and bionics will heal and give hope to many who are suffering from life-debilitating and life-threatening diseases [80]. The lame will walk again. The blind will see just as the deaf have heard. The dumb will sing. Even bionic tongues are on the drawing board. Hearts and kidneys and other organs will be built anew. The fundamental point is that society at large should be able to distinguish between positive and negative applications of technological advancements before we diffuse and integrate such innovations into our day-to-day existence.

The Bionics Institute [81], for instance, is future-focused on the possibilities of bionic hearing, bionic vision, and neurobionics, stating: “Medical bionics is not just a new frontier of medical science, it is revolutionizing what is and isn't possible. Where once there was deafness, there is now the bionic ear. And where there was blindness, there may be a bionic eye.” The Institute reaffirms its commitment to continuing innovative research and leading the way on the proposed “world-changing revolution.”

A. Cochlear Implants—Helping the Deaf to Hear

Fig. 16. Cochlear's Nucleus Freedom implant with Contour Advance electrode which is impervious to magnetic fields up to 1.5 Tesla. Courtesy of Cochlear Australia.

In 2000, more than 32 000 people worldwide already had cochlear implants [82], thanks to the global efforts of people such as Australian Professor Graeme Clark, the founder of Cochlear, Inc. [83]. Clark performed his first implant in Rod Saunders's left ear at the Royal Eye and Ear Hospital in Melbourne, Australia, on August 1, 1978, when “he placed a box of electronics under Saunders's skin and a bundle of electrodes in his inner ear” [84]. By 2006, that number had grown to about 77 500 for the Nucleus implant (Fig. 16) alone, which had about 70% of the market share [85]. Today, there are over 110 000 cochlear implant recipients, with about 30 000 added annually, and their personal stories are testament enough to the ways in which new technologies can change lives dramatically for the better [86]. Cochlear implants can restore hearing to people who have severe hearing loss, a form of diagnosed deafness. Unlike a standard hearing aid, which works like an amplifier, the cochlear implant acts like a microphone, changing sound into electronic signals. The signals are sent to the microchip implant via radio frequency (RF), stimulating nerve fibers in the inner ear. The brain then interprets the signals transmitted via the nerves as sound.
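The processing chain just described (sound captured, split into frequency bands, each band's energy mapped to a stimulation level for an electrode in the inner ear) can be sketched in simplified form. The band edges, electrode count, and compression law below are illustrative assumptions, not any manufacturer's actual coding strategy.

```python
import math

SAMPLE_RATE = 8000
FRAME = 256
# One frequency band per electrode; edges are illustrative assumptions.
BAND_EDGES_HZ = [200, 500, 1000, 2000, 3500]

def tone(freq_hz, n=FRAME, rate=SAMPLE_RATE):
    """A pure test tone standing in for microphone input."""
    return [math.sin(2 * math.pi * freq_hz * t / rate) for t in range(n)]

def bin_magnitude(samples, k):
    """Magnitude of the k-th DFT bin (naive DFT; fine for a sketch)."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * k * t / n) for t, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * k * t / n) for t, s in enumerate(samples))
    return math.hypot(re, im)

def electrode_levels(samples):
    """Map each band's energy to a crude log-compressed stimulation level."""
    n = len(samples)
    levels = []
    for f_lo, f_hi in zip(BAND_EDGES_HZ, BAND_EDGES_HZ[1:]):
        k_lo = round(f_lo * n / SAMPLE_RATE)
        k_hi = round(f_hi * n / SAMPLE_RATE)
        energy = sum(bin_magnitude(samples, k) for k in range(k_lo, k_hi))
        levels.append(math.log1p(energy))  # compression: loudness grows slowly
    return levels

levels = electrode_levels(tone(700))  # 700 Hz falls in the 500-1000-Hz band
loudest = max(range(len(levels)), key=lambda i: levels[i])
print(loudest)  # expect band index 1 (500-1000 Hz)
```

The sketch shows why the text calls the implant a microphone rather than an amplifier: nothing is made louder; sound is analyzed and re-encoded as per-electrode stimulation levels for the nerve fibers to carry.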


Today, cochlear implants (which are also commonly known as bionic ears) are being used to overcome deafness; tomorrow, they may be open to the wider public as a performance-enhancing technique [87, pp. 10–11]. Audiologist Steve Otto of the Auditory Brainstem Implant Project at the House Ear Institute (Los Angeles, CA) predicts that one day “implantable devices [will] interface microscopically with parts of the normal system that are still physiologically functional” [88]. He is quoted as saying that this may equate to “ESP for everyone.” Otto's prediction that implants will one day be used by persons who do not require them for remedial purposes has been supported by numerous other high profile scientists. A major question is whether this is the ultimate trajectory of these technologies.

For Christofer Toumazou, however, Executive Director of the Institute of Biomedical Engineering, Imperial College London (London, U.K.), there is a clear distinction between repairing human functions and creating a “Superman.” He said, “trying to give someone that can hear super hearing is not fine.” For Toumazou, the basic ethical paradigm should be that we hope to repair the human, not recreate the human [67].

B. Retina Implants—On a Mission to Help the Blind to See

Fig. 17. Visual cortical implant designed by Prof. Mohamad Sawan, a researcher at Polystim Neurotechnologies Laboratory at the Ecole Polytechnique de Montreal (Montreal, QC, Canada). The basic principle of Prof. Sawan's technology consists of stimulating the visual cortex by implanting a silicon microchip on a network of electrodes, made of biocompatible materials, wherein each electrode injects a stimulating electrical current in order to provoke a series of luminous points to appear (an array of pixels) in the field of vision of the blind person. This system is composed of two distinct parts: the implant and an external controller. Courtesy of Mohamad Sawan 2009, made available under Creative Commons License.

The hope is that retina implants will be as successful as cochlear implants in the future [89]. Just as cochlear implants cannot be used for persons suffering from complete deafness, retina implants are not a solution for totally blind persons, but rather for those suffering from age-related macular degeneration (AMD) and retinitis pigmentosa (RP). Retina implants have brought together medical researchers, electronic specialists, and software designers to develop a system that can be implanted inside the eye [90]. A typical retina implant procedure is as follows: “[s]urgeons make a pinpoint opening in the retina to inject fluid in order to lift a portion of the retina from the back of the eye, creating a pocket to accommodate the chip. The retina is resealed over the chip, and doctors inject air into the middle of the eye to force the retina back over the device and close the incisions” [91] (Fig. 17).


Brothers Alan and Vincent Chow, one an engineer, the other an ophthalmologist, developed the artificial silicon retina (ASR) and founded the company Optobionics Corporation in 1990. This was a marriage between biology and engineering: “In landmark surgeries at the University of Illinois at Chicago Medical Center …the first artificial retinas made from silicon chips were implanted in the eyes of two blind patients who have lost almost all of their vision because of retinal disease.” In 1993, Branwyn [92, p. 3] reported that a team at the National Institutes of Health (NIH), led by Dr. Hambrecht, implanted a 38-electrode array into a blind female's brain. It was reported that she saw simple light patterns and was able to make out crude letters. The following year, the same procedure was conducted by another group on a blind male, resulting in the man seeing a black dot with a yellow ring around it. Rizzo of Harvard Medical School's Massachusetts Eye and Ear Infirmary (Boston, MA) has cautioned that it is better to play down the possibilities of the retina implant so as not to raise false hopes. Rizzo himself has said that they are dealing with “science fiction stuff” and that there are no long-term guarantees the technology will ever fully restore sight, although significant progress is being made by a number of research institutes [93, p. 5].

Among these pioneers are researchers at The Johns Hopkins University Medical Center (Baltimore, MD). Brooks [94, p. 4] describes how the retina chip developed by the medical center will work: “a kind of miniature digital camera…is placed on the surface of the retina. The camera relays information about the light that hits it to a microchip implanted nearby. This chip then delivers a signal that is fed back to the retina, giving it a big kick that stimulates it into action. Then, as normal, a signal goes down the optic nerve and sight is at least partially restored.” In 2009, at the age of 56, Barbara Campbell had an array of electrodes implanted in each eye [95], and while her sight is nowhere near fully restored, she is able to make out shapes and see shades of light and dark. Experts believe that this approach is still more realistic for restoring sight to those suffering from particular types of blindness than stem cell therapy, gene therapy, or eye transplants [96], where the risks still outweigh the advantages.

C. Tapping Into the Heart and Brain

Fig. 18. An artificial pacemaker from St. Jude Medical (St. Paul, MN), with electrode, 2007. Courtesy of Steven Fruitsmaak.

If it was possible as far back as 1958 to successfully implant two transistors the size of an ice hockey puck in the heart of a 43-year-old man [97], the things that will become possible by 2020 are constrained as much by the imagination as by technological limitations. Heart pacemakers (Fig. 18) are still being further developed today, but for the greater part, researchers are turning their attention to the possibilities of brain pacemakers. In the foreseeable future, brain implants may help sufferers of Parkinson's disease, paralysis, nervous system problems, and speech impairments, and even cancer patients. The research is still in its formative years, and the obstacles are great because of the complexity of the brain; but scientists are hopeful of major breakthroughs in the next 20 years.


The brain pacemaker endeavors are bringing together people from a variety of disciplines, headed mainly by neurosurgeons. By using brain implants, electrical pulses can be sent directly to nerves via electrodes. The signals can be used to interrupt incoherent messages to nerves that cause uncontrollable movements or tremors. By tapping into the right nerves in the brain, particular reactions can be achieved. The following extract describes the procedure of “tapping into” the brain, using a technique that was discovered almost accidentally in France in 1987: “Rezai and a team of functional neurosurgeons, neurologists and nurses at the Cleveland Clinic Foundation in Ohio had spent the next few hours electronically eavesdropping on single cells in Joan's brain attempting to pinpoint the precise trouble spot that caused a persistent, uncontrollable tremor in her right hand. Once confident they had found the spot, the doctors had guided the electrode itself deep into her brain, into a small duchy of nerve cells within the thalamus. The hope was that when [they] sent an electrical current to the electrode, in a technique known as deep-brain stimulation, her tremor would diminish, and perhaps disappear altogether” [98]. Companies such as Medtronic Incorporated of Minnesota (Minneapolis, MN) now specialize in brain pacemakers [98]. Medtronic's Activa implant has been designed specifically for sufferers of Parkinson's disease [93].

More recently, there has been some success in ameliorating epileptic attacks through closed-loop technology, also known as smart stimulation. The implanted devices detect the onset of epileptiform activity through a demand-driven process. This means that the battery in the active implant lasts longer because of increased efficiency (it is not always stimulating in anticipation of an attack), and the adverse effects of having to remove and install new implants more frequently are avoided [99]. Similarly, it has been suggested that technology such as deep brain stimulation, in which physicians implant electrodes in the brain and an electrical pacemaker in the patient's clavicle for Parkinson's disease, may also be used to help severely depressed persons [100].
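The demand-driven idea can be sketched as a simple control loop: stimulate only when a detector flags likely epileptiform activity, rather than continuously. The detector (a bare amplitude threshold) and the battery-cost figures below are illustrative assumptions, not any real device's design.

```python
# Minimal sketch of demand-driven ("closed-loop") stimulation versus
# always-on ("open-loop") stimulation. Threshold and costs are assumptions.

DETECTION_THRESHOLD = 0.8   # normalized signal amplitude (assumed)
STIM_COST = 10.0            # battery units per stimulation window (assumed)
SENSE_COST = 0.1            # battery units per sensing window (assumed)

def looks_epileptiform(window):
    """Placeholder detector: flags high-amplitude activity in a window."""
    return max(abs(s) for s in window) > DETECTION_THRESHOLD

def battery_used(windows, closed_loop=True):
    """Total battery drain; open-loop stimulates on every window."""
    cost = 0.0
    for w in windows:
        cost += SENSE_COST                       # sensing is always on
        if not closed_loop or looks_epileptiform(w):
            cost += STIM_COST                    # stimulate only on demand
    return cost

quiet = [[0.1, -0.2, 0.15]] * 95   # normal background activity
onset = [[0.5, 0.95, -0.9]] * 5    # suspected seizure onset
recording = quiet + onset

print(round(battery_used(recording, closed_loop=True), 1))   # 60.0
print(round(battery_used(recording, closed_loop=False), 1))  # 1010.0
```

With stimulation gated on detection, only 5 of the 100 windows draw stimulation power, which is the efficiency gain (and longer implant life) the text attributes to closed-loop devices.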

Currently, the technology is being used to treat thousands of people who are severely depressed or suffering from obsessive compulsive disorder (OCD) and who have been unable to respond to other forms of treatment such as cognitive behavioral therapy (CBT) [101]. It is estimated that 10% of people suffering from depression do not respond to conventional methods. Although hard figures are difficult to obtain, several thousand depressed persons worldwide have had brain pacemakers installed, with software that can be updated wirelessly and remotely. The trials have been based on decades of research by Prof. Helen Mayberg of Emory University School of Medicine (Atlanta, GA), who first began studying the use of subcallosal cingulate gyrus deep brain stimulation (SCG DBS) for depression in 1990.

In her research, Mayberg has used a device no larger than a matchbox, with a battery-powered generator that sits in the chest and produces electric currents. The currents are sent to an area deep in the brain via tiny wires channeled under the skin on either side of the neck. Surprisingly, installing this type of implant requires only local anesthetic and is an outpatient procedure. In 2005, Mayberg told a meeting at the Science Media Centre in London: “This is a very new way to think about the nature of depression …We are not just exciting the brain, we are using electricity to retune and remodulate…We can interrupt or switch off an abnormally functioning circuit” [102].

Ongoing trials today continue to show promising results. The outcome of a 20-patient clinical trial of persons with depression treated with SCG DBS, published in 2011, showed that: “At 1 year, 11 (55%) responded to surgery with a greater than 50% reduction in 17-item Hamilton Depression Scale scores. Seven patients (35%) achieved or were within 1 point of achieving remission (scores < 8). Of note, patients who responded to surgery had a significant improvement in mood, anxiety, sleep, and somatic complaints related to the disease. Also important was the safety of the procedure, with no serious permanent adverse effects or changes in neuropsychological profile recorded” [103].

Despite early signs that these procedures may offer long-term solutions for hundreds of thousands of people, some research scientists believe that tapping into the human brain is a long shot. The brain is commonly understood to be “wetware,” and plugging hardware into this wetware would seem to be a type mismatch, at least according to Steve Potter, a senior research fellow in biology working at the California Institute of Technology's Biological Imaging Center (Pasadena, CA). Instead, Potter is pursuing the cranial route as a “digital gateway to the brain” [88]. Others believe that it is impossible to figure out exactly what all the millions of neurons in the brain actually do. Whether or not we eventually succeed in “reverse-engineering” the human brain, the topic of implants for both therapeutic and enhancement purposes has aroused significant controversy in the past, and promises to do so even more in the future.

D. Attempting to Overcome Paralysis

In more speculative research, surgeons believe that brain implants may be a solution for persons suffering from paralysis, such as spinal cord damage. In these instances, the nerves in the legs are still theoretically “working”; it is just that they cannot make contact with the brain, which controls their movement. If signals could somehow be routed around the lesion point, paralyzed persons could conceivably regain at least part of their capability to move [104]. In 2000, Reuters [105] reported that a paralyzed Frenchman (Marc Merger) “took his first steps in 10 years after a revolutionary operation to restore nerve functions using a microchip implant…Merger walks by pressing buttons on a walking frame which acts as a remote control for the chip, sending impulses through fine wires to stimulate leg muscles…” It should be noted, however, that the system only works for paraplegics whose muscles remain alive despite damage to the nerves. Yet there are promising devices like the Bion that may one day be able to control muscle movement using RF commands [106]. Brooks [94] reports that researchers at the University of Illinois at Chicago (Chicago, IL) have “invented a microcomputer system that sends pulses to a patient's legs, causing the muscles to contract. Using a walker for balance, people paralyzed from the waist down can stand up from a sitting position and walk short distances…Another team, based in Europe…enabled a paraplegic to walk using a chip connected to fine wires in his legs.” These techniques are known as functional neuromuscular stimulation systems [107]. In the case of American Rob Summers, who became a paraplegic after an accident, doctors implanted an epidural stimulator and electrodes into his spinal cord. “The currents mimic those normally sent by the brain to initiate movement” [108].
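The control logic in these functional neuromuscular stimulation systems can be sketched at a very high level: a button on the walking frame selects a movement, and a controller emits a timed pulse train to the corresponding muscle group. Everything below is an assumption for illustration only (the channel names, frequencies, and durations are hypothetical, not taken from any of the cited systems).

```python
# Illustrative sketch of a functional neuromuscular stimulation (FNS)
# controller. All channel names and pulse parameters are hypothetical.

def pulse_train(frequency_hz, duration_s):
    """Return the stimulation times (in seconds) for one pulse train."""
    period = 1.0 / frequency_hz
    n_pulses = int(duration_s * frequency_hz)
    return [round(i * period, 4) for i in range(n_pulses)]

MOVEMENTS = {
    # walker button -> (muscle channel, stimulation frequency, duration)
    "stand":  ("quadriceps", 25, 1.5),
    "step_l": ("left_peroneal", 30, 0.8),
    "step_r": ("right_peroneal", 30, 0.8),
}

def on_button(button):
    """Map a walker button press to a channel and its pulse schedule."""
    channel, freq, dur = MOVEMENTS[button]
    return channel, pulse_train(freq, dur)

channel, times = on_button("stand")
print(channel, len(times))  # 37 pulses over 1.5 s on the quadriceps channel
```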

Others working to help paraplegics walk again have invested time in military technology like exoskeletons [109], meant to aid soldiers in lifting greater weights and to protect them during battle. Ekso Bionics (Berkeley, CA), formerly Berkeley Bionics, has been conducting trials of an electronic suit in the United States since 2010, and the current Ekso model is slated to become fully independent and powered by artificial intelligence in 2012. The Ekso “provides nearly four hours of battery power to its electronic legs, which replicate walking by bending the user's knees and lifting their legs with what the company claims is the most natural gait available today” [110]. This is yet another example of how military technology has been commercialized toward a health solution [111].

E. Granting a Voice to the Speech Impaired

Speech-impairment microchip implants work differently than cochlear and retinal implants. Whereas the latter two restore hearing and sight, implants for speech impairment do not restore the voice; instead, an outlet for communication is created, possibly with the aid of a voice synthesizer. At Emory University, neurosurgeon Roy E. Bakay and neuroscientist Phillip R. Kennedy were responsible for critical breakthroughs early in the research. In 1998, Versweyveld [112] reported two successful implants of a neurotrophic electrode into the brains of a woman and a man who were suffering from amyotrophic lateral sclerosis (ALS) and brainstem stroke, respectively. In a remarkable process, Bakay and Kennedy's device uses the patient's brain processes—thoughts, if you will—to move a cursor on a computer screen. “The computer chip is directly connected with the cortical nerve cells…The neural signals are transmitted to a receiver and connected to the computer in order to drive the cursor” [112]. This procedure has major implications for brain–computer interfaces (BCIs), especially bionics. Bakay predicted that by 2010 prosthetic devices would grant immobile patients the ability to turn on the TV just by thinking about it, and that by 2030 they would grant severely disabled persons the ability to walk independently [112], [113].
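The core idea of driving a cursor from cortical signals can be illustrated with a minimal decoding loop. This is a sketch under stated assumptions, not Kennedy and Bakay's actual decoder: we assume two recorded units whose firing rates are mapped linearly to cursor velocity, with an arbitrary baseline and gain.

```python
# Minimal sketch of a BCI decoding loop: firing rates from two
# hypothetical recorded units drive cursor velocity on screen.
# The linear mapping, baseline, and gain are purely illustrative.

def decode_velocity(rate_x, rate_y, baseline=10.0, gain=0.5):
    """Map firing rates (spikes/s) to cursor velocity (pixels/tick)."""
    return (gain * (rate_x - baseline), gain * (rate_y - baseline))

def step_cursor(pos, rates):
    """Advance the cursor one tick given the latest firing rates."""
    vx, vy = decode_velocity(*rates)
    return (pos[0] + vx, pos[1] + vy)

pos = (100.0, 100.0)
for rates in [(14, 10), (14, 10), (10, 6)]:  # a short spike-rate stream
    pos = step_cursor(pos, rates)
print(pos)  # -> (104.0, 98.0): right twice, then down once
```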

F. Biochips for Diagnosis and Smart Pills for Drug Delivery

It is not unlikely that biochips will be implanted in people at birth in the not too distant future. “They will make individual patients aware of any pre-disposition to susceptibility” [114]. That is, biochips will be used for point-of-care diagnostics and also for the identification of needed drugs, even to detect pandemic viruses and biothreats for national security purposes [115]. Biosensors work in that they “represent the technological counterpart of our sense organs, coupling the recognition by a biological recognition element with a chemical or physical transducer, transferring the signal to the electrical domain” [116]. Types of biosensors include enzymes, antibodies, receptors, nucleic acids, cells (using a biochip configuration), biomimetic sequences of RNA (ribonucleic acid) or DNA (deoxyribonucleic acid), and molecularly imprinted polymers (MIPs). Biochips, on the other hand, “automate highly repetitive laboratory tasks by replacing cumbersome equipment with miniaturized, microfluidic assay chemistries combined with ultrasensitive detection methodologies. They achieve this at significantly lower costs per assay than traditional methods—and in a significantly smaller amount of space. At present, applications are primarily focused on the analysis of genetic material for defects or sequence variations” [117].

With respect to the treatment of illness, drug delivery will not require patients to swallow pills or take routine injections; instead, chemicals will be stored on a microprocessor and released as prescribed. The idea is known as “pharmacy-on-a-chip” and originated with scientists at the Massachusetts Institute of Technology (MIT, Cambridge, MA) in 1999 [118]. The following extract is from The Lab [119]: “Doctors prescribing complicated courses of drugs may soon be able to implant microchips into patients to deliver timed drug doses directly into their bodies.”

Microchips being developed at Ohio State University (OSU, Columbus, OH) can be coated with chemical substances such as pain medication, insulin, different treatments for heart disease, or gene therapies, allowing physicians to work at a more detailed level [119]. The breakthroughs have major implications for diabetics, especially those who require insulin at regular intervals throughout the day. Researchers at the University of Delaware (Newark, DE) are working on “smart” implantable insulin pumps that may bring relief to people with Type 1 diabetes [120]. Delivery would be based on a mathematical model stored on a microchip, working in conjunction with glucose sensors that instruct the chip when to release the insulin. The goal is for the model to simulate the activity of the pancreas so that the right dosage is delivered at the right time.
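The sensor-model-release loop described above can be sketched as follows. Real controllers of this kind would embed a physiological model of the pancreas; the simple proportional rule, target value, gain, and safety cap below are our own illustrative assumptions, not the Delaware group's model.

```python
# Toy sketch of closed-loop insulin delivery: each glucose sensor
# reading is fed to a dosing rule stored on the chip. The target,
# gain, and cap are hypothetical values chosen for illustration.

TARGET_MG_DL = 100     # desired blood glucose (mg/dL)
GAIN_U_PER_MG = 0.01   # insulin units per mg/dL above target
MAX_BOLUS_U = 2.0      # hard safety cap on any single dose

def insulin_dose(glucose_mg_dl):
    """Return the insulin bolus (units) for one sensor reading."""
    error = glucose_mg_dl - TARGET_MG_DL
    if error <= 0:
        return 0.0                          # never dose at or below target
    return min(GAIN_U_PER_MG * error, MAX_BOLUS_U)

for reading in (90, 140, 400):
    print(reading, "->", insulin_dose(reading))
```

A real system would of course add glucose prediction, insulin-on-board tracking, and fault detection; the point here is only the shape of the closed loop.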

Fig. 19. The VeriChip microchip, the first microchip implant to be cleared by the U.S. Food and Drug Administration (FDA) for humans, is a passive microchip that contains a 16-digit number, which can be used to retrieve critical medical information on a patient from a secure online database. The company that owns the VeriChip technology is developing a microscopic glucose sensor to put on the end of the chip to eliminate a diabetic's need to draw blood to get a blood glucose reading. Courtesy of PositiveID Corporation.

Beyond insulin pumps, we are now nearing a time when automated closed-loop insulin detection (Fig. 19) and delivery will become a tangible treatment option, and may serve as a temporary cure for Type 1 diabetes until stem cell therapy becomes available. “Closed-loop insulin delivery may revolutionize not only the way diabetes is managed but also patients' perceptions of living with diabetes, by reducing the burden on patients and caregivers, and their fears of complications related to diabetes, including those associated with low and high glucose levels” [121]. It is only a matter of time before these lab-centric results are replicated in real-life conditions in sufferers of Type 1 diabetes.

 

 

G. To Implant or Not to Implant, That Is the Question

There are potentially 500 000 hearing impaired persons who could benefit from cochlear implants [122], but not every deaf person wants one [123]. “Some deaf activists…are critical of parents who subject children to such surgery [cochlear implants] because, as one charged, the prosthesis imparts ‘the non-healthy self-concept of having had something wrong with one's body’ rather than the ‘healthy self-concept of [being] a proud Deaf’” [124]. Scott Bally, Assistant Professor of Audiology at Gallaudet University (Washington, DC), has said, “Many deaf people feel as though deafness is not a handicap. They are culturally deaf individuals who have successfully adapted themselves to being deaf and feel as though things like cochlear implants would take them out of their deaf culture, a culture which provides a significant degree of support” [92]. Putting this delicate debate aside, it is here that some delineation can be made between implants that are used to treat an ailment or disability (i.e., giving sight to the blind and hearing to the deaf), and implants that may be used to enhance human function (e.g., memory). There are some citizens, like Amal Graafstra of the United States [125], who are getting chip implants for convenience-oriented social living solutions that would usher in a world of keyless entry everywhere (Fig. 20). And there are other citizens who are concerned about the direction of the human species, as credible scientists predict fully functional neural implants. “[Q]uestions are raised as to how society as a whole will relate to people walking around with plugs and wires sprouting out of their heads. And who will decide which segments of the society become the wire-heads” [92]?
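The keyless-entry application is technically simple, which is part of its appeal: a passive implanted tag presents only a fixed ID, and the door controller checks that ID against an access list before actuating the latch. The sketch below is illustrative only; the tag IDs and function names are hypothetical, not taken from Graafstra's implementation.

```python
# Minimal sketch of an RFID keyless-entry check: an implanted passive
# tag carries a fixed ID; the reader looks it up before unlocking.
# Tag IDs and door names here are hypothetical.

AUTHORIZED_TAGS = {
    "04A2246B1F80": "front_door",   # hypothetical implanted tag ID
}

def on_tag_read(tag_id, door):
    """Return True (unlock) only if the tag is authorized for this door."""
    return AUTHORIZED_TAGS.get(tag_id) == door

print(on_tag_read("04A2246B1F80", "front_door"))  # True: latch opens
print(on_tag_read("DEADBEEF0001", "front_door"))  # False: stays locked
```

Note that because a passive tag simply replays its ID, such a scheme is only as secure as the secrecy of that ID, a point that feeds directly into the cloning concerns raised later in this paper.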

 

Fig. 20. Amal Graafstra demonstrating an RFID-operated door latch application he developed. Over the RFID tag site on his left hand is a single steristrip that remained after implantation for a few days. His right hand is holding the door latch.

 

SECTION V. Überveillance and Function Creep

Section IV focused on implants that are essentially “orthopedic replacements”: corrective in nature, required to repair a function that is either lying dormant or has failed altogether. Implants of the future, however, will attempt to add new “functionality” to native human capabilities, either through extensions or additions. Globally acclaimed scientists have pondered the ultimate trajectory of microchip implants [126]. The literature is admittedly mixed in its viewpoints on what will and will not be possible in the future [127].

For those of us working in the domain of implantables for medical and nonmedical applications, the message is loud and clear: implantables will be the next big thing. At first, it will be “hip to get a chip.” The extreme novelty of the microchip implant will mean that early adopters will race to see how far they can push the limits of the new technology. Convenience solutions will abound [128]. Implantees will not be able to get enough of the new product, and the benefits of the technology will be touted to consumers in a myriad of ways, although these perceived benefits will not always be realized. The technology will probably first be tested where there will be the least effective resistance from the community at large, that is, on prison inmates [129], and then on those suffering from dementia. These incremental steps in pilot trials and deployment are fraught with moral consequences. Prisoners cannot opt out when jails adopt tracking technology, and those suffering from cognitive disorders have not provided, and could not provide, their consent. From there it will conceivably not take long for the technology to be used on the elderly, on children, and on those suffering from clinical depression.

The functionality of the implants will range from passive ID-only to active multiapplication; most invasive will be the medical devices that can, upon request or through algorithmic reasoning, release drugs or electrically stimulate the body for mental and physical stability. There will also be a segment of the consumer and business markets who will adopt the technology for no clear reason and without much thought, save for the fact that the technology is new and seems to be the way advanced societies are heading. This segment will probably not be overly concerned with any discernible abridgment of their human rights, or with the fine-print “terms and conditions” agreement they have signed, but will take an implant on the promise of, for example, greater connectivity to the Internet. These consumers will thrive on ambient intelligence, context-aware pervasive applications, and an augmented reality—ubiquity in every sense.

But it is certain that the new technology will also have consequences far greater than what we can presently envision. Questions about the neutrality of technology are immaterial in this new “plugged-in” order of existence. For Brin [130, p. 334], the question ultimately has to do with the choice between privacy and freedom. In his words, “[t]his is one of the most vile dichotomies of all. And yet, in struggling to maintain some beloved fantasies about the former, we might willingly, even eagerly, cast the latter away.” And thus there are two possibilities, just as Brin [130] writes in his insightful book The Transparent Society, of “the tale of two cities.” Either implants embedded in humans, together with the infrastructure they require, will create a utopia in which there is built-in intelligence for everything and everyone in every place, or they will create a dystopia that diminishes one's freedom of choice, individuality, and finally that indefinable essence which is at the core of making one feel “human.” A third possibility, a middle way between these two alternatives, would seem unlikely, except for the “off the grid” dissenter.

In Section V-A, we portray some of the attractions people may feel that will draw them into the future world of implanted technologies. In Section V-B, we portray some of the problems associated with implanting technology under the skin that would drive people away from opting in to such a future.

A. The Positive Possibilities

Bearing an implant will make individuals feel special because they carry a unique ID. Each person will have one implant which will coordinate hundreds of smaller nanodevices, though each nanodevice will have the capacity to act of its own accord. The philosophy espoused in taking an implant will be one of protection: “I bear an implant and I have nothing to hide.” It will feel safe to have an implant because emergency services, for example, will be able to respond rapidly to your calls for help, or to any unforeseen events that automatically log problems with your health.

Fewer errors are also likely to happen if you have an implant, especially with financial systems. Businesses will experience a rise in productivity as they will understand how precisely their business operates to the nearest minute, and companies will be able to introduce significant efficiencies. Losses in back-end operations, such as the effects of product shrinkage, will diminish as goods will be followed down the supply chain from their source to their destination customer, through the distribution center and retailer.

It will take some years for the infrastructure supporting implants to grow and thrive with a substantial consumer base. The function creep will not become apparent until well after the early majority have adopted implants and downloaded and used a number of core applications to do with health, banking, and transport which will all be interlinked. New innovations will allow for a hybrid device and supplementary infrastructure to grow so powerful that living without automated tracking, location finding, and condition monitoring will be almost impossible.

B. The Existential Risks

It will take some years for the negative fallout from microchip implants to be exposed. At first only the victims of the fallout will speak out, through formal exception reports on government agency websites. The technical problems associated with implants will pertain to maintenance, updates, viruses, cloning, hacking, radiation shielding, and onboard battery problems. But the greater problems will be the impact on the physiology and mental health of the individual: new manifestations of paranoia and severe depression will lead to people continually wanting reassurance about their implant's functionality. Implant security, virus detection, and an error-free personal database will be among the biggest issues facing implantees. Despite this, those who believe in the implant singularity (the piece of embedded technology that will give each person ubiquitous access to the Internet) will continue to stack up points and rewards and add to their social networks, choosing to ignore warnings about the ultimate technological trajectory of mind control and geoslavery [131]. It will have little to do with survival of the fittest at this point, although most people will buy into the notion of an evolutionary path toward the Homo Electricus [132]: a transhumanist vision [133] in which we do away with the body and become one with the Machine, one with the Cosmos—a “nuts and bolts” Nirvana where one's manufactured individual consciousness connects with the advanced consciousness evolving from the system as a whole. In this instance, it will be the ecstatic experience of being drawn ever deeper into the electric field of the “Network.”

Some of the more advanced implants will be able to capture and validate location-based data, alongside visual and audio recordings. The ability to conduct überveillance via the implant will make it a kind of black box recorder, as in an airplane's cockpit. Only in this case the cockpit will be the body, and the recorder will be embedded just beneath the translucent layer of the skin, to be used for memory recollection and dispute resolution. Ostensibly this would ensure that people tell the full story at all times; there would be no lies or claims of poor memory. Überveillance is an above-and-beyond, exaggerated, omnipresent 24/7 electronic surveillance (Fig. 21). It is a surveillance that is not only “always on” but “always with you.” It is ubiquitous because the technology that facilitates it, in its ultimate implementation, is embedded within the human body. The problem with this kind of bodily invasive surveillance is that omnipresence in the “material” world will not always equate with omniscience; hence the real concern for misinformation, misinterpretation, and information manipulation [7]. While it might seem like the perfect technology to aid in real-time forensic profiling and criminalization, it will be open to abuse, just like any other technique, and more so because of the preconception that it is infallible.

 

Fig. 21. The überveillance triquetra as the intersection of surveillance, dataveillance, and sousveillance. Courtesy of Alexander Hayes.

 

SECTION VI. Technology Roadmapping

According to Andrews, cited in [1], a second intellectual current within the IEEE SSIT has begun to emerge which is more closely aligned with most of the IEEE technical societies, as well as with economics and business. The proponents of this mode participate in “technology foresight” and “roadmapping” activities, and view technology more optimistically, looking to foster innovation without being too concerned about its possible negative effects [1, p. 14]. Braun [134, p. 133] writes that “[f]orecasts do not state what the future will be…they attempt to glean what it might be.” Thus, those with technology foresight can be trusted insofar as their knowledge and judgment go: they may possess foresight through their grasp of current knowledge, through past experiences which inform their forecasts, and through raw intuition.

Various MIT labs, such as the Media Lab, have been engaged in visionary research since before 1990, giving society a good glimpse of where technology might be headed some 20–30 years ahead of time. It is from such elite groups that visionaries typically emerge, whose main purpose is to envision technologies that will better our wellbeing and generally make life more productive and convenient in the future. Consider the current activities of the MIT Media Lab's Affective Computing Research Group, directed by Prof. Rosalind W. Picard, which is working on technology aids encapsulating “affect sensing” in response to the growing problem of autism [135]. The Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology. The work of Picard's group was made possible by the foundations laid by the Media Lab's predecessor researchers.

On the global technological roadmap we can now point to the following systems which are already under development but have not yet been widely diffused into the market:

  • alternative fuels, heralding innovations like self-driving electric cars and ocean-powered energy, as well as the rise of biofuels;

  • the potential for 3-D printing which will revolutionize prototyping and manufacturing practices and possibly reconstruct human tissue;

  • hologram projections for videoconferencing and televisions that respond to gestures as well as pen-sized computing which will do away with keyboards and screens;

  • quantum computing and cryptography;

  • next-generation prosthetics (Fig. 22);

  • cognitive machines such as robot humanoids;

  • carbon nanotubes and nanotech computing which will make our current silicon chips look gargantuan;

  • genetic engineering breakthroughs and regenerative health treatment such as stem cell treatment;

  • electronic banking that will conduct transactions not with physical cash but with the singularity chip (i.e., an implant);

  • ubiquitous high-speed wireless networks;

  • crowdsourced surveillance toward real-time forensic profiling and criminalization;

  • autogenerated visual life logs and location chronicles;

  • enhanced batteries that last longer;

  • body power to charge digital equipment [136];

  • brainwave-based technologies in health/gaming;

  • brain-reading technology for interrogation [137].

 

Fig. 22. Army Reserve Staff Sgt. Alfredo De Los Santos displays what the X2 microprocessor knee prosthetic can do by walking up a flight of stairs at the Military Advanced Training Center at Walter Reed Army Medical Center (Washington, DC), December 8, 2009. Patients at Walter Reed are testing next-generation prosthetics. Courtesy of the U.S. Army.

It is important to note that while these new inventions have the ability to make things faster and better for most living in more developed countries, they can act to increase the ever-widening gap between the rich and the poor. New technologies will not necessarily aid in eradicating the poverty cycle in parts of Africa and South America. In fact, new technologies can have the opposite effect—they can create an ever greater chasm in equity and access to knowledge.

Technology foresight is commonly exercised by those engaged in the act of prediction. Predictive studies are more often than not based on past and present trends, and use this knowledge to provide a roadmap of future possibilities. There is some degree of imagination in prediction, and certainly the creative element is prevalent. Predictions are not meant to be wild, but calculated wisely, with evidence showing that a given course or path is likely in the future. This does not mean, however, that all predictions come true. Predictive studies can be about new inventions and new form factors, the recombination of existing innovations in new ways (hybrid architectures, for example), or the mutation of an existing innovation. Some predictive studies have heavy quantitative forecasting components that use complex models to predict the introduction of new innovations, some even based on historical data inputs.

Before an invention has been diffused into the market, scenario planning is conducted to understand how the technology might be used, who might take it up, and what percentage of society will be willing to adopt the product over time (i.e., consumption analysis). “Here the emphasis is on predicting the development of the technology and assessing its potential for adoption, including an analysis of the technology's market” [138, p. 328].

Even the founder of Microsoft, Bill Gates [139, p. 274], accepted that his predictions might not come true. But his insights in The Road Ahead are to be commended, even though they were understandably broad. Gates wrote, “[t]he information highway will lead to many destinations. I've enjoyed speculating about some of these. Doubtless I've made some foolish predictions, but I hope not too many.” Allaby [140, p. 206] writes, “[f]orecasts deal in possibilities, not inevitabilities, and this allows forecasters to explore opportunities.”

For the greater part, forecasters raise challenging, thought-provoking issues about how existing inventions or innovations will impact society. They give scenarios for a technology's projected pervasiveness, how it may affect other technologies, what potential benefits or drawbacks it may introduce, how it will affect the economy, and much more.

Kaku [141, p. 5] has argued “that predictions about the future made by professional scientists tend to be based much more substantially on the realities of scientific knowledge than those made by social critics, or even those by scientists of the past whose predictions were made before the fundamental scientific laws were completely known.” He believes that among the scientific body today there is growing concern over predictions that for the greater part come from consumers of technology rather than from those who shape and create it. Kaku is, of course, correct insofar as scientists should be consulted, since they are the ones actually making things possible after discoveries have occurred. But a balanced view encompassing the perspectives of different disciplines is necessary and extremely important.

In the 1950s, for instance, when technical experts forecasted improvements in computer technology, they envisaged even larger machines—but science fiction writers predicted microminiaturization. They “[p]redicted marvels such as wrist radios and pocket-sized computers, not because they foresaw the invention of the transistor, but because they instinctively felt that some kind of improvement would come along to shrink the bulky computers and radios of that day” (Bova, 1988, quoted in [142, p. 18]). The methodologies each discipline uses as vehicles for prediction should be respected. The question of who is more correct in predicting the future is perhaps the wrong question. For example, some of Kaku's own predictions in Visions can be found in science fiction movies dating back to the 1960s.

In speculating about the next 500 years, Berry [142, p. 1] writes, “[p]rovided the events being predicted are not physically impossible, then the longer the time scale being considered, the more likely they are to come true…if one waits long enough everything that can happen will happen.”

 

SECTION VII. The Next 50 Years: Brain–Computer Interface

When Ellul [143, p. 432] predicted the use of “electronic banks” in his 1964 book The Technological Society, he was not referring to the computerization of financial institutions or the use of automatic teller machines (ATMs). Rather, it was in the context of the possible dawn of a new entity: the conjoining of man with machine. Ellul was predicting that one day knowledge would be accumulated in electronic banks and “transmitted directly to the human nervous system by means of coded electronic messages…[w]hat is needed will pass directly from the machine to the brain without going through consciousness…” As unbelievable as this man–machine complex may have sounded at the time, 45 years later visionaries are still predicting that such scenarios will be possible by the turn of the 22nd century. A large proportion of these visionaries are cyberneticists. Cybernetics is the study of nervous system controls in the brain as a basis for developing communications and controls in sociotechnical systems. Parenthetically, in some places writers continue to confuse cybernetics with robotics; the two may overlap in some instances, but they are not the same thing.

Kaku [141, pp. 112–116] observes that scientists are working steadily toward a brain–computer interface (Fig. 23). The first step is to show that individual neurons can grow on silicon and then to connect the chip directly to a neuron in an animal. The next step is to mimic this connectivity in a human, and the last is to decode millions of neurons which constitute the spinal cord in order to interface directly with the brain. Cyberpunk science fiction writers like William Gibson [144] refer to this notion as “jacking-in” with the wetware: plugging in a computer cable directly with the central nervous system (i.e., with neurons in the brain analogous to software and hardware) [139, p. 133].

 

Fig. 23. Brain–computer interface schema. (1) Pedestal. (2) Sensor. (3) Electrode. Courtesy of Balougador under creative commons license.

In terms of the current state of development, we can point to the innovation of miniature wearable media, orthopedic replacements (including pacemakers), bionic prosthetic limbs, humanoid robots (i.e., robots that are autonomous and look human in appearance), and RFID implants. Traditionally, the term cyborg has been used to describe humans who have some mechanical parts or extensions. Today, however, we are on the brink of building a new sentient being, a bearer of electricity, a modern man belonging to a new race, beyond that which can be considered merely part man, part machine. We refer here to the absolute fusion of man and machine, where the subject itself becomes the object; where the toolmaker becomes one with his tools [145]. The question at this point of coalescence is how human the new species will be [146], and what are the related ethical, metaphysical, and ontological concerns? Does the evolution of the human race as recorded in history come to an end when technology can be connected to the body in a wired or wireless form?

A. From Prosthetics to Amplification

Fig. 24. Cyborg 2.0 Project. Kevin Warwick with wife Irena during the Cyborg 2.0 project. Courtesy of Kevin Warwick.

While corrective orthopedic replacements have been around since the 1950s [147], serving to repair a function that is either lying dormant or has failed altogether, implants of the future will attempt to add new functionality to native human capabilities, either through extensions or additions. Warwick's Cyborg 2.0 project [148], for instance, intended to prove that two persons with respective implants could communicate sensation and movement by thoughts alone. In 2002, the BBC reported that a tiny silicon square with 100 electrodes was connected to the professor's median nerve and linked to a transmitter/receiver in his forearm. Although “Warwick believe[d] that when he move[d] his own fingers, his brain [would] also be able to move Irena's” [104, p. 1], the outcome of the experiment was described at best as sending “Morse-code” messages (Fig. 24). Warwick [148] still believes that a person's brain could be directly linked to a computer network [149]. Commercial players are also intent on keeping ahead, continually funding projects in this area of research.

 

If Warwick is right, then terminals like telephones would eventually become obsolete if thought-to-thought communication became possible. Warwick describes this as “putting a plug into the nervous system” [104] to be able to allow thoughts to be transferred not only to another person but to the Internet and other media. While Warwick's Cyborg 2.0 may not have achieved its desired outcomes, it did show that a form of primitive Morse-code-style nervous-system-to-nervous-system communication is realizable [150]. Warwick is bound to keep trying to achieve his project goals given his philosophical perspective. And if Warwick does not succeed, he will have at least left behind a legacy and enough stimuli for someone else to succeed in his place.
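The “Morse-code-style” signaling described above can be sketched as a minimal illustration: any message reduced to a simple on/off symbol train. The code table below is standard International Morse; the framing is purely illustrative and not Warwick's actual signaling scheme.

```python
# Illustrative sketch: encoding a message as Morse symbols, the kind of
# primitive on/off channel the Cyborg 2.0 experiment was likened to.
# This is a toy encoder, not a model of any nervous-system interface.

MORSE = {
    "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
    "F": "..-.", "G": "--.", "H": "....", "I": "..", "J": ".---",
    "K": "-.-", "L": ".-..", "M": "--", "N": "-.", "O": "---",
    "P": ".--.", "Q": "--.-", "R": ".-.", "S": "...", "T": "-",
    "U": "..-", "V": "...-", "W": ".--", "X": "-..-", "Y": "-.--",
    "Z": "--..",
}

def encode(message: str) -> str:
    """Encode letters as Morse symbols, separating letters with spaces."""
    return " ".join(MORSE[c] for c in message.upper() if c in MORSE)

print(encode("SOS"))  # ... --- ...
```

Even such a narrow channel demonstrates the point made in the text: once a binary pulse can cross the interface, arbitrary messages can in principle follow.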

 

B. The Soul Catcher Chip

The Soul Catcher chip was conceived by Peter Cochrane, former Head of British Telecom Research. Cochrane [151, p. 2] believes that the human body is merely a carcass that serves as a transport mechanism, just like a vehicle, and that the most important part of our body is our brain (i.e., mind). Similarly, Miriam English has said: “I like my body, but it's going to die, and it's not a choice really I have. If I want to continue, and I want desperately to see what happens in another 100 years, and another 1000 years…I need to duplicate my brain in order to do that” [152]. Soul Catcher is all about the preservation of a human well beyond the point of physical debilitation. The Soul Catcher chip would be implanted in the brain and act as an access point to the external world [153]. Consider being able to download the mind onto computer hardware and then creating a global nervous system via wireless Internet [154] (Fig. 25). Cochrane has predicted that by 2050 downloading thoughts and emotions will be commonplace. Billinghurst and Starner [155, p. 64] predict that this kind of arrangement will free up the human intellect to focus on creative rather than computational functions.

 

Fig. 25. Ray Kurzweil predicts that by 2013 supercomputer power will be sufficient for human brain functional simulation and by 2025 for human brain neural simulation for uploading. Courtesy of Ray Kurzweil and Kurzweil Technologies 2005.

Cochrane's beliefs are shared by many others engaged in the transhumanist movement (especially Extropians like Alexander Chislenko). Transhumanism (sometimes known by the abbreviations “>H” or “H+”) is an international cultural movement of intellectuals who look at ways to extend life through the application of emerging sciences and technologies. Minsky [156] believes that this will be the next stage in human evolution—a way to achieve true immortality by “replacing flesh with steel and silicon” [141, p. 94]. Chris Winter of British Telecom has claimed that Soul Catcher will mean “the end of death.” Winter predicts that by 2030, “[i]t would be possible to imbue a newborn baby with a lifetime's experiences by giving him or her the Soul Catcher chip of a dead person” [157]. The philosophical implications behind such movements are gigantic; they reach deep into every branch of traditional philosophy, especially metaphysics with its special concerns over cosmology and ontology.

 

SECTION VIII. The Next 100 Years: Homo Electricus

A. The Rise of the Electrophorus

Fig. 26. Drawing showing the operation of an electrophorus, a simple manual electrostatic generator invented in 1762 by Swedish Professor Johan Carl Wilcke. Image by Amédée Guillemin (died 1893).

Microchip implants are integrated circuit devices encased in RFID transponders, either active or passive, that can be implanted into animals or humans, usually in the subcutaneous layer of the skin. The human who has been implanted with a microchip that can send or receive data is an Electrophorus, a bearer of “electric” technology [158]. The Macquarie Dictionary defines “electrophorus” as “an instrument for generating static electricity by means of induction,” referring to an instrument used in the early years of electrostatics (Fig. 26).
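The active/passive distinction above can be made concrete with a toy model. The class names, read ranges, and the ten-fold range multiplier for active tags below are invented for illustration only; they are not drawn from any RFID standard or from the paper.

```python
# Toy model (not a real RFID stack) contrasting passive and active transponders:
# a passive tag responds only when the reader's field energizes it, while an
# active tag carries its own power source and can respond at greater distances.

from dataclasses import dataclass

@dataclass
class Tag:
    uid: str      # unique identifier stored on the implant
    active: bool  # active tags carry a battery; passive tags do not

class Reader:
    def __init__(self, field_range_m: float):
        self.field_range_m = field_range_m

    def interrogate(self, tag: Tag, distance_m: float):
        """Return the tag's UID if a read occurs, else None."""
        if tag.active:
            # Assumed for illustration: active tags reach ten times farther.
            return tag.uid if distance_m <= self.field_range_m * 10 else None
        # Passive tags must harvest energy from the reader's near field.
        return tag.uid if distance_m <= self.field_range_m else None

reader = Reader(field_range_m=0.5)
implant = Tag(uid="E004-0123-89AB", active=False)
print(reader.interrogate(implant, distance_m=0.3))  # within field: UID returned
print(reader.interrogate(implant, distance_m=2.0))  # outside field: None
```

The sketch captures the triggering behavior described later in this section: in its current state, the Electrophorus is read only when it enters an electromagnetic field.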

 

We have repurposed the term electrophorus to apply to humans implanted with microchips. One who “bears” is in some way intrinsically or spiritually connected to that which they are bearing, in the same way an expecting mother is to the child in her womb. The root electro comes from the Greek word meaning “amber,” and phorus means to “wear, to put on, to get into” [159, p. 635]. When an Electrophorus passes through an electromagnetic zone, he/she is detected and data can be passed from an implanted microchip (or in the future directly from the brain) to a computer device.

To electronize something is “to furnish it with electronic equipment” and electrotechnology is “the science that deals with practical applications of electricity.” The term “electrophoresis” has been borrowed here, to describe the “electronic” operations that an electrophorus is involved in. McLuhan and Zingrone [160, p. 94] believed that “electricity is in effect an extension of the nervous system as a kind of global membrane.” They argued that “physiologically, man in the normal use of technology (or his variously extended body) is perpetually modified by it and in turn finds ever new ways of modifying his technology” [161, p. 117].

The term “electrophorus” seems much more suitable today for expressing the human–electronic combination than the term “cyborg.” “Electrophorus” distinguishes strictly electrical implants from mechanical devices such as artificial hips. It is not surprising, then, that these crucial matters of definition raise philosophical and sociological questions of consciousness and identity, which science fiction writers have been addressing creatively. The Electrophorus belongs to the emerging species of Homo Electricus. In its current state, the Electrophorus relies on a device being triggered wirelessly when it enters an electromagnetic field. In the future, the Electrophorus will act like a network element or node, allowing information to pass through him or her, to be stored locally or remotely, and to send and receive messages simultaneously, with some processed actively and others as background tasks.

At the point of becoming an Electrophorus (i.e., a bearer of electricity), Brown [162] observes, “[y]ou are not just a human linked with technology; you are something different and your values and judgment will change.” Some suspect that it will even become possible to alter the behavior of people carrying brain implants, whether the individual wills it or not. Maybury [163] believes that “[t]he advent of machine intelligence raises social and ethical issues that may ultimately challenge human existence on earth.”

B. The Prospects of Transhumanism

Fig. 27. The transhumanism symbol. Courtesy of Antonu under Creative Commons license.

Thought-to-thought communications may seem outlandish today, but it is only one of many futuristic hopes of the movement termed transhumanism. Probably the most representative organization for this movement is the World Transhumanist Association (WTA), which recently adopted the doing-business-as name of “Humanity+” (Fig. 27). The WTA's website [164] carries the following succinct statement of what transhumanism is, penned originally by Max More in 1990: “Transhumanism is a class of philosophies of life that seek the continuation and acceleration of the evolution of intelligent life beyond its currently human form and human limitations by means of science and technology, guided by life-promoting principles and values.” Whether transhumanism yet qualifies as a philosophy, it cannot be denied that it has produced its share of both proponents and critics.

 

Proponents of transhumanism claim that the things they want are the things everyone wants: freedom from pain, freedom from suffering, freedom from all the limitations of the human body (including mental as well as physical limitations), and ultimately, freedom from death. One of the leading authors in the transhumanist movement is Ray Kurzweil, whose 652-page book The Singularity Is Near [165] prophesies a time in the not-too-distant future when evolution will accelerate exponentially and bring to pass all of the above freedoms as “the matter and energy in our vicinity will become infused with the intelligence, knowledge, creativity, beauty, and emotional intelligence (the ability to love, for example) of our human-machine civilization. Our civilization will then expand outward, turning all the dumb matter and energy we encounter into sublimely intelligent—transcendent—matter and energy” [165, p. 389].

Despite the almost theological tone of the preceding quote, Kurzweil has established a sound track record as a technological forecaster, at least when it comes to Moore's-Law-type predictions of the progress of computing power. But the ambitions of Kurzweil [178] and his allies go far beyond next year's semiconductor roadmap to encompass the future of all humanity. If the fullness of the transhumanist vision is realized, the following achievements will come to pass:

  • human bodies will cease to be the physical instantiation of human minds, replaced by as-yet-unknown hardware with far greater computational powers than the present human brain;

  • human minds will experience, at their option, an essentially eternal existence in a world free from the present restrictions of material embodiment in biological form;

  • limitations on will, intelligence, and communication will all be overcome, so that to desire a thing or experience will be to possess it.

The Transhumanist Declaration, last modified in 2009 [166], recognizes that these plans have potential downsides, and calls for reasoned debate to avoid the risks while realizing the opportunities. The sixth item in the Declaration, for example, declares that “[p]olicy making ought to be guided by responsible and inclusive moral vision, taking seriously both opportunities and risks, respecting autonomy and individual rights, and showing solidarity with and concern for the interests and dignity of all people around the globe.” The key phrase in this item is “moral vision.” While many self-declared transhumanists may agree on the moral vision which should guide their endeavors, the movement has also inspired some of the most vigorous and categorically critical invective to be found in the technical and public-policy literature.

Possibly the best known of the vocal critics of transhumanism is Francis Fukuyama, a political scientist who nominated transhumanism as his choice for the world's most dangerous idea [167]. As with most utopian notions, the main problem Fukuyama sees with transhumanism is the transition between our present state and the transhumanists' future vision of completely realized eternal technological bliss (Fig. 28). Will some people be uploaded to become immortal, almost omniscient transhumans while others are left behind in their feeble, mortal, disease-ridden human bodies? Are the human goods that transhumanists say are basically the same for everyone really so? Or are they more complex and subtle than typical transhumanist pronouncements acknowledge? As Fukuyama points out in his Foreign Policy essay [167], “Our good characteristics are intimately connected to our bad ones… if we never felt jealousy, we would also never feel love. Even our mortality plays a critical function in allowing our species as a whole to survive and adapt (and transhumanists are about the last group I would like to see live forever).”

 

Fig. 28. Brain in a vat with the thought “I'm walking outside in the sun” being transmitted to the computer. Image reproduced under the Creative Commons license.

Transhumanists themselves admit that their movement performs some of the functions of a religion when it “offers a sense of direction and purpose.” But in contrast to most religions, transhumanists explicitly hope to “make their dreams come true in this world” [168]. Nearly all transhumanist programs and proposals arise from a materialist–reductionist view of the world which assumes that the human mind is at most an epiphenomenon of the brain, that all of the brain's functions will eventually be simulated by hardware (on computers of the future), and that the experience known as consciousness can be realized in artificial hardware in essentially the same form as it is presently realized in the human body. Some of the assumptions of transhumanism are based less on facts and more on faith. Just as Christians take on faith that God revealed Himself in Jesus Christ, transhumanists take on faith that machines will inevitably become conscious.

Fig. 29. The Shadow Dextrous Hand shakes the human hand. How technology might become society—a future agreement. Courtesy of Shadow Robot Company 2008.

In keeping with the transhumanists' call for responsible moral vision, the IEEE SSIT has been, and will continue to be, a forum where the implications for society of all sorts of technological developments can be debated and evaluated. In a sense, the transhumanist program is the ultimate technological project: to redesign humanity itself to a set of specifications, determined by us. If the transhumanists succeed, technology will become society, and the question of the social implications of technology will be moot (Fig. 29). Perhaps the best attitude to take toward transhumanism is to pay attention to their prophecies, but, as the Old Testament God advised the Hebrews, “if the thing follow not, nor come to pass…the prophet hath spoken it presumptuously…” [169].

 

 

SECTION IX. Ways Forward

In sum, identifying and predicting what the social implications of past, present, and future technologies might be can lead us to act in one of four ways, which are not mutually exclusive.

First, we can take the “do nothing” approach and meekly accept the risks associated with new techniques. We stop being obsessed by both confirmed and speculative consequences and instead try to see how far the new technologies might take us and what we might become or transform into as a result. While humans might not always like change, we are by nature, if we might hijack Heraclitus, in a continual state of flux. We might reach new potentials as a populace, become extremely efficient at doing business with each other, and make a positive impact on our natural environment by doing so. The downside to this approach is that it appears to be an all-or-nothing approach with no built-in decision points, for as Jacques Ellul [170] forewarned: “what is at issue here is evaluating the danger of what might happen to our humanity in the present half-century, and distinguishing between what we want to keep and what we are ready to lose, between what we can welcome as legitimate human development and what we should reject with our last ounce of strength as dehumanization.”

The second option is to let case law determine what is legal or illegal, based on existing laws or on new or amended laws introduced in response to the new technologies. We can take the stance that the courts are in the best position to decide what we should and should not do with new technologies. If we break the law in a civil or criminal capacity, then there is a penalty; we already have civil and criminal codes concerning workplace surveillance, telecommunications interception and access, surveillance devices, data protection and privacy, cybercrime, and so on. There is also the continual review of existing legislation by law-reform commissions and the like. New legislation can also be introduced to guard against other dangers or harms that might eventuate as a result of the new techniques.

The third option is that we can introduce industry regulations that stipulate how advanced applications should be developed (e.g., ensuring privacy impact assessments are done before commercial applications are launched), and that technical expectations on accuracy, reliability, and storage of data are met. It is also important that the right balance be found between regulations and freedom so as not to stifle the high-tech industry at large.

Finally, the fourth option would be to adopt the “Amish method”: complete abandonment of technology that has progressed beyond a certain point of development. This is in some respect “living off the grid” [171].

Although obvious, it is important to underline that none of these options is foolproof, and none excludes the others. The appropriate response may at times be to introduce industry regulations or codes, at other times to do nothing, and in other cases to rely on legislative amendments despite the length of time it takes to develop these. In still other cases, the safeguards may need to be built into the technology itself.

 

SECTION X. Conclusion

If we put our trust in Kurzweil's [172] Law of Accelerating Returns, we are likely headed into a great period of discovery unprecedented in any era of history. This being the case, the time for inclusive dialog is now, not after widespread diffusion of such innovations as “always on” cameras, microchip implants, unmanned drones and the like. We stand at a critical moment of decision, as the mythological Pandora did as she was about to open her box. There are many lessons to be learned from history, especially from such radical developments as the atomic bomb and the resulting arms race. Joy [173] has raised serious fears about continuing unfettered research into “spiritual machines.” Will humans have the foresight to say “no” or “stop” to new innovations that could potentially be a means to a socially destructive scenario? Implants that may prolong life expectancy by hundreds if not thousands of years may appeal at first glance, but they could well create unforeseen devastation in the form of technological viruses, plagues, or a grim escalation in the levels of crime and violence.
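The exponential extrapolation underlying Kurzweil's Law of Accelerating Returns can be made concrete with a line of arithmetic. The two-year doubling period below is an assumed Moore's-Law-style figure chosen for illustration, not a number taken from the paper.

```python
# Illustrative arithmetic only: fixed-period doubling of the kind that
# underlies Moore's-Law-style technology forecasts.

def growth_factor(years: float, doubling_period_years: float = 2.0) -> float:
    """Compounded growth after `years` given a fixed doubling period."""
    return 2.0 ** (years / doubling_period_years)

print(growth_factor(10))  # 2^5 = 32x over a decade
print(growth_factor(40))  # 2^20 = 1,048,576x over four decades
```

The point of the example is the forecaster's dilemma this section raises: under sustained doubling, capabilities a million-fold beyond today's arrive within a working lifetime, which is why the dialog must happen before diffusion, not after.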

To many scientists of the positivist tradition, anchored solely to an empirical world view, the notion of whether something is right or wrong is in a way irrelevant. For these researchers, a moral stance has little or nothing to do with technological advancement but is really an ideological position. The extreme of this view is exemplified by an attitude of “let's see how far we can go,” not “is what we are doing the best thing for humanity?” and certainly not by the thought of “what are the long-term implications of what we are doing here?” As an example, one need only consider the mad race to clone the first animal, and many have long suspected that an “underground” scientific race to clone the first human continues.

In the current climate of innovation, particularly since the proliferation of the desktop computer and the birth of new digital knowledge systems, some observers believe that engineers, and professionals more broadly, lack accountability for the tangible and intangible costs of their actions [174, p. 288]. Because science-enabled engineering has proved so profitable for multinational corporations, they have gone to great lengths to persuade the world that science should not be stopped, for the simple reason that it will always make things better. This ignores the possibility that even seemingly small advancements into the realm of the Electrophorus, for any purpose other than medical prostheses, will have dire consequences for humanity [175]. According to Kuhns, “Once man has given technique its entry into society, there can be no curbing of its gathering influence, no possible way of forcing it to relinquish its power. Man can only witness and serve as the ironic beneficiary-victim of its power” [176, p. 94].

Clearly, none of the authors of this paper desires to stop technological advance in its tracks. But we believe that considering the social implications of past, present, and future technologies is more than an academic exercise. As custodians of the technical means by which modern society exists and develops, engineers have a unique responsibility to act with forethought and insight. The time when following the orders of a superior was all that an engineer had to do is long past. With great power comes great responsibility. Our hope is that the IEEE SSIT will help and encourage engineers worldwide to consider the consequences of their actions throughout the next century.


88. S. Stewart, "Neuromaster", Wired 8.02.

89. J. Dowling, "Current and future prospects for optoelectronic retinal prostheses", Eye, vol. 23, pp. 1999-2005, 2009.

90. D. Ahlstrom, "Microchip implant could offer new kind of vision", The Irish Times.

91. More Tests of Eye Implants Planned, pp. 1-2, 2001.

92. G. Branwyn, "The desire to be wired", Wired 1.4.

93. W. Wells, The Chips Are Coming.

94. M. Brooks, "The cyborg cometh", Worldlink: The Magazine of the World Economic Forum.

95. E. Strickland, "Birth of the bionic eye", IEEE Spectrum, Jan. 2012.

96. S. Adee, "Researchers hope to mime 1000 neurons with high-res artificial retina", IEEE Spectrum, Jan. 2012.

97. D. Nairne, Building Better People With Chips and Sensors.

98. S. S. Hall, "Brain pacemakers", MIT Enterprise Technol. Rev..

99. E. A. C. Pereira, A. L. Green, R. J. Stacey, T. Z. Aziz, "Refractory epilepsy and deep brain stimulation", J. Clin. Neurosci., vol. 19, no. 1, pp. 27-33, 2012.

100. "Brain pacemaker could help cure depression research suggests", Biomed. Instrum. Technol., vol. 45, no. 2, pp. 94, 2011.

101. H. S. Mayberg, A. M. Lozano, V. Voon, H. E. McNeely, D. Seminowicz, C. Hamani, J. M. Schwalb, S. H. Kennedy, "Deep brain stimulation for treatment-resistant depression", Neuron, vol. 45, no. 5, pp. 651-660, 2005.

102. B. Staff, "Brain pacemaker lifts depression", BBC News, Jun. 2005.

103. C. Hamani, H. Mayberg, S. Stone, A. Laxton, S. Haber, A. M. Lozano, "The subcallosal cingulate gyrus in the context of major depression", Biol. Psychiatry, vol. 69, no. 4, pp. 301-308, 2011.

104. R. Dobson, Professor to Try to Control Wife via Chip Implant.

105. "Chip helps paraplegic walk", Wired News.

106. D. Smith, "Chip implant signals a new kind of man", The Age.

107. "Study of an implantable functional neuromuscular stimulation system for patients with spinal cord injuries", Clinical Trials.gov, Feb. 2009.

108. R. Barrett, "Electrodes help paraplegic walk" in Lateline Australian Broadcasting Corporation, Australia, Sydney: ABC, May 2011.

109. M. Ingebretsen, "Intelligent exoskeleton helps paraplegics walk", IEEE Intell. Syst., vol. 26, no. 1, pp. 21, 2011.

110. S. Harris, "US researchers create suit that can enable paraplegics to walk", The Engineer, Oct. 2011.

111. D. Ratner, M. A. Ratner, Nanotechnology and Homeland Security: New Weapons for New Wars, NJ, Upper Saddle River: Pearson Education, 2004.

112. L. Versweyveld, "Chip implants allow paralysed patients to communicate via the computer", Virtual Medical Worlds Monthly.

113. S. Adee, "The revolution will be prosthetized: DARPA's prosthetic arm gives amputees new hope", IEEE Spectrum, vol. 46, no. 1, pp. 37-40, 2009.

114. E. Wales, "It's a living chip", The Australian, pp. 4, 2001.

115. Our Products: MBAMultiplex Bio Threat Assay, Jan. 2012.

116. F. W. Scheller, "From biosensor to biochip", FEBS J., vol. 274, no. 21, pp. 5451, 2007.

117. A. Persidis, "Biochips", Nature Biotechnol., vol. 16, pp. 981-983, 1998.

118. A. C. LoBaido, "Soldiers with microchips: British troops experiment with implanted electronic dog tag", WorldNetDaily.com.

119. "Microchip implants for drug delivery", ABC: News in Science.

120. R. Bailey, "Implantable insulin pumps", Biology About.com.

121. D. Elleri, D. B. Dunger, R. Hovorka, "Closed-loop insulin delivery for treatment of type 1 diabetes", BMC Med., vol. 9, no. 120, 2011.

122. D. L. Sorkin, J. McClanahan, "Cochlear implant reimbursement cause for concern", HealthyHearing, May 2004.

123. J. Berke, "Parental rights and cochlear implants: Who decides about the implant?", About.com: Deafness, May 2009.

124. D. O. Weber, "Me myself my implants my micro-processors and I", Softw. Develop. Mag., Jan. 2012.

125. A. Graafstra, K. Michael, M. G. Michael, "Social-technical issues facing the humancentric RFID implantee sub-culture through the eyes of Amal Graafstra", Proc. IEEE Int. Symp. Technol. Soc., pp. 498-516, 2010.

126. E. M. McGee, G. Q. Maguire, "Becoming borg to become immortal: Regulating brain implant technologies", Cambridge Quarterly Healthcare Ethics, vol. 16, pp. 291-302, 2007.

127. P. Moore, Enhancing Me: The Hope and the Hype of Human Enhancement, U.K., London: Wiley, 2008.

128. A. Masters, K. Michael, "Humancentric applications of RFID implants: The usability contexts of control convenience and care", Proc. 2nd IEEE Int. Workshop Mobile Commerce Services, pp. 32-41, 2005.

129. J. Best, "44000 prison inmates to be RFID-chipped", silicon.com, Nov. 2010.

130. D. Brin, The Transparent Society: Will Technology Force Us to Choose Between Privacy and Freedom?, MA, Boston: Perseus Books, 1998.

131. J. E. Dobson, P. F. Fischer, "Geoslavery", IEEE Technol. Soc. Mag., vol. 22, no. 1, pp. 47-52, 2003.

132. K. Michael, M. G. Michael, "Homo Electricus and the Continued Speciation of Humans" in The Encyclopedia of Information Ethics and Security, PA, Hershey: IGI, pp. 312-318, 2007.

133. S. Young, Designer Evolution: A Transhumanist Manifesto, New York: Prometheus Books, 2006.

134. E. Braun, Wayward Technology, U.K., London: Frances Pinter, 1984.

135. R. el Kaliouby, R. Picard, S. Baron-Cohen, "Affective computing and autism", Ann. New York Acad. Sci., vol. 1093, no. 1, pp. 228-248, 2006.

136. D. Bhatia, S. Bairagi, S. Goel, M. Jangra, "Pacemakers charging using body energy", J. Pharmacy Bioallied Sci., vol. 2, no. 1, pp. 51-54, 2010.

137. V. Arstila, F. Scott, "Brain reading and mental privacy", J. Humanities Social Sci., vol. 15, no. 2, pp. 204-212, 2011.

138. R. Westrum, Technologies and Society: The Shaping of People and Things, CA, Belmont: Wadsworth, 1991.

139. B. Gates, The Road Ahead, New York: Penguin, 1995.

140. M. Allaby, Facing The Future: The Case for Science, U.K., London: Bloomsbury, 1996.

141. M. Kaku, Visions: How Science Will Revolutionise the 21st Century and Beyond, U.K., Oxford: Oxford Univ. Press, 1998.

142. A. Berry, The Next 500 Years: Life in the Coming Millennium, New York: Gramercy Books, 1996.

143. J. Ellul, The Technological Society, New York: Vintage Books, 1964.

144. W. Gibson, Neuromancer, New York: Ace Books, 1984.

145. M. McLuhan, Understanding Media: The Extensions of Man, MA, Cambridge: MIT Press, 1964.

146. A. Toffler, Future Shock, New York: Bantam Books, 1981.

147. C. M. Banbury, Surviving Technological Innovation in the Pacemaker Industry 19591990, New York: Garland, 1997.

148. K. Warwick, I Cyborg, U.K., London: Century, 2002.

149. M. G. Michael, K. Warwick, "The professor who has touched the future" in Innovative Automatic Identification and Location-Based Services, New York: Information Science Reference, pp. 406-422, 2009.

150. D. Green, "Why I am not impressed with Professor Cyborg", BBC News.

151. P. Cochrane, Tips For Time Travellers: Visionary Insights Into New Technology Life and the Future on the Edge of Technology, New York: McGraw-Hill, 1999.

152. I. Walker, "Cyborg dreams: Beyond Human: Background Briefing", ABC Radio National, Jan. 2012.

153. W. Grossman, "Peter Cochrane will microprocess your soul", Wired 6.11.

154. R. Fixmer, "The melding of mind with machine may be the next phase of evolution", The New York Times.

155. M. Billinghurst, T. Starner, "Wearable devices: New ways to manage information", IEEE Computer, vol. 32, no. 1, pp. 57-64, Jan. 1999.

156. M. Minsky, Society of Mind, New York: Touchstone, 1985.

157. R. Uhlig, "The end of death: ‘Soul Catcher’ computer chip due", The Electronic Telegraph.

158. K. Michael, M. G. Michael, "Microchipping people: The rise of the electrophorus", Quadrant, vol. 414, no. 3, pp. 22-33, 2005.

159. K. Michael, M. G. Michael, "Towards chipification: The multifunctional body art of the net generation", Cultural Attitudes Towards Technol. Commun., 2006.

160. E. McLuhan, F. Zingrone, Essential McLuhan, NY, New York:BasicBooks, 1995.

161. M. Dery, Escape Velocity: Cyberculture at the End of the Century, U.K., London: Hodder and Stoughton, 1996.

162. J. Brown, "Professor Cyborg", Salon.com, Jan. 2012.

163. M. T. Maybury, "The mind matters: Artificial intelligence and its societal implications", IEEE Technol. Soc. Mag., vol. 9, no. 2, pp. 7-15, Jun./Jul. 1990.

164. Philosophy, Jan. 2012.

165. R. Kurzweil, The Singularity Is Near, New York: Viking, 2005.

166. Transhumanist Declaration, Jan. 2010.

167. F. Fukuyama, "Transhumanism", Foreign Policy, no. 144, pp. 42-43, 2004.

168. How Does Transhumanism Relate to Religion? in Transhumanist FAQ, Jan. 2012.

169.

170. J. Ellul, What I Believe, MI, Grand Rapids: Eerdmans, 1989.

171. J. M. Wetmore, "Amish Technology: Reinforcing values and building community", IEEE Technol. Soc. Mag., vol. 26, no. 2, pp. 10-21, 2007.

172. R. Kurzweil, The Age of Spiritual Machines, New York: Penguin Books, 1999.

173. B. Joy, "Why the future doesn't need us", Wired 8.04.

174. K. J. O'Connell, "Uses and abuses of technology", Inst. Electr. Eng. Proc. Phys. Sci. Meas. Instrum. Manage. Educ. Rev., vol. 135, no. 5, pp. 286-290, 1988.

175. D. F. Noble, The Religion of Technology: The Divinity of Man and the Spirit of Invention, New York: Penguin Books, 1999.

176. W. Kuhns, The Post-Industrial Prophets: Interpretations of Technology, New York: Harper Colophon Books, 1971.

177. D. J. Solove, The Future of Reputation, CT, New Haven: Yale Univ. Press., 2007.

178. J. Rennie, "Ray Kurzweil's slippery futurism", IEEE Spectrum, Dec. 2010.

Keywords

Technology forecasting, social implications of technology, history, social factors, human factors, social aspects of automation, human-robot interaction, mobile computing, pervasive computing, wearable computing, IEEE society, SSIT, society founding, social impacts, military technologies, security technologies, cyborgs, human-machine hybrids, human mind, transhumanist future, humanity redesigns, Überveillance, corporate activities, engineering education, ethics, future of technology, sociotechnical systems

Citation: Karl D. Stephan, Katina Michael, M. G. Michael, Laura Jacob, Emily P. Anesta, "Social Implications of Technology: The Past, the Present, and the Future", Proceedings of the IEEE, vol. 100, Special Centennial Issue, pp. 1752-1781, May 13, 2012, doi: 10.1109/JPROC.2012.2189919.

The legal, social and ethical controversy of the collection and storage of fingerprint profiles and DNA samples in forensic science

Abstract

The collection and storage of fingerprint profiles and DNA samples in the field of forensic science for nonviolent crimes is highly controversial. While biometric techniques such as fingerprinting have been used in law enforcement since the early 1900s, DNA presents a more invasive and contentious technique, as most sampling is of an intimate nature (e.g. a buccal swab). A fingerprint is a pattern residing on the surface of the skin, while a DNA sample must be extracted in the vast majority of cases (at times even requiring the skin to be broken). This paper aims to weigh the need to collect DNA samples in violent crimes where direct evidence is lacking against the systematic collection of DNA from citizens who have committed acts such as petty crimes. The legal, ethical and social issues surrounding the proliferation of DNA collection and storage are explored, with a view to outlining the threats that such a regime may pose to citizens in the not-too-distant future, especially persons belonging to ethnic minority groups.

SECTION 1. Introduction

The aim of this paper is to apply the science, technology and society (STS) studies approach, which combines the history, social study and philosophy of science, to the legal history of DNA sampling and profiling in the United Kingdom since the first forensic use of DNA in a criminal court case in 1988. The paper begins by defining the application of biometrics to the field of criminal law, in particular the use of fingerprint and DNA identification techniques. It then presents the differences between fingerprint and DNA evidence, distinguishing between DNA profiles and samples, and between DNA databanks and databases. Finally, the paper presents the legal, ethical and social concerns raised by the proliferation of DNA collection and storage in particular jurisdictions prior to 2010 (e.g. the United Kingdom). The paper points to the pressing need for a review of the Police and Criminal Evidence Act 1984, and of the procedures for DNA collection and storage in the U.K.'s National DNA Database (NDNAD), which was established in 1995. Some examples of the state of play in the United States are provided as well.

SECTION 2. Conceptual Framework

It is no surprise that in recent years there has been a convergence between science and technology studies (STS) and law and society (L&S) studies. Some commentators, this author included, believe there is a need to define a new theoretical framework that amalgamates these increasingly converging areas. Lynch et al. [6], [p.14] write: “[w]hen law turns to science or science turns to law, we have the opportunity to examine how these two powerful systems work out their differences.” This convergence has its roots in legal disputes in the fields of health, safety and environmental regulation. For instance, advances in technology have challenged one's right to live or die. New innovations can stretch the traditional distinctions on which regulation rests, or they can challenge and even evade them.

In this paper we study the “DNA controversy” using the conceptual framework shown in Figure 1, which depicts the role of the major stakeholders in the debate. In the early 1990s the “DNA Wars” [6] focused on two major problems with respect to the techno-legal accountability of DNA evidence in a court of law. The first had to do with the potential for error in the forensic laboratory, and the second with the combination of genetic and statistical datasets. The disputes concerned not just legal and administrative matters, but issues that were technical and scientific in nature. The key players included expert lawyers, scientists who actively participated in legal challenges and public policy debates, and the media who investigated and reported the controversy [6]. To put an end to the controversy would require the coming together of law, science and the public in a head-on confrontation. And that is indeed what occurred. By the late 1990s DNA had become an accepted method of suspect identification, and a great number of onlookers rushed prematurely to declare the controversy closed, although, as commentators have noted, there was no moment of truth or definitive judgment that ended it. What many did not recognize at the time, however, is that the DNA controversy would return, in places like the United Kingdom, bigger and more intense than ever before.

Figure 1. The core set diagram: studying the DNA controversy

Interestingly, closure in the DNA controversy became most visible when the NDNAD, and some of the legislation and policy surrounding it, facilitated talks between nations in Europe with respect to harmonization. According to Lynch et al. [6], [p.229]:

“[e]fforts were made to “harmonize” DNA profile and database standards in Europe, and other international efforts were made to coordinate forensic methods in order to track suspected “mobile” criminals and terrorists across national borders. These international efforts to implement and standardize DNA profiling contributed to closure in particular localities by demonstrating that the technique was widely used and had become a fixture of many criminal justice systems.”

While it may have signified closure to those working within an STS and L&S approach, harmonization was certainly not reached. Far from it: the U.K., which had been responsible for the initial harmonization efforts, later lost its way. What made onlookers believe that closure had fully occurred were the technical, legal and administrative fixes that had taken place. But closure in this instance did not mean the complete end of the controversy; what was coming was much greater disquiet in the U.K., and this period was named ‘post-closure’ by the STS and L&S commentators. Post-closure signals a period after closure is established, when issues that were once closed may be reopened. In the case of the NDNAD in the U.K. it was not old issues that were reopened during post-closure, but new issues introduced by so-called legal fixes. These legal fixes had social implications, so it was not until the public, the media, and non-government organizations, alongside self-interest groups, were satisfied that change would be imminent that post-closure seemed a real possibility. The threat to the post-closure of the DNA controversy, however, is the burgeoning demand for DNA samples in fields such as epidemiological research and the recent commercialization of DNA sample collection and storage for everyday citizens (e.g. DNA home kits selling for less than US$100). DNA is no longer seen as useful just for forensic science or health, and this is placing incredible pressure on an advanced identification technique that is increasingly becoming commoditized.

SECTION 3. Background: What is Biometrics?

As defined by the Association for Biometrics (AFB), a biometric is “ … a measurable, unique physical characteristic or personal trait to recognize the identity, or verify the claimed identity, of an enrollee.” The physical characteristics that can be used for identification include: facial features (full face and profile), fingerprints, palmprints, footprints, hand geometry, ear (pinna) shape, retinal blood vessels, striation of the iris, surface blood vessels (e.g., in the wrist), and electrocardiac waveforms [1]. Other examples of biometric types include DNA (deoxyribonucleic acid), odor, skin reflectance, thermogram, gait, keystroke, and lip motion. Biometrics have seven characteristics: universality, in that every person should possess the given characteristic; uniqueness, in that no two persons should have the same pattern; permanence, in that the pattern does not change over time; collectability, in that it can be measured and quantified; performance, in that the measure is accurate; acceptability, in that users are willing to submit to it; and resistance to circumvention, in that the system of identification theoretically cannot be duped [2]. The two most popular methods of identification in criminal law today, when direct evidence such as a first-hand eyewitness account is lacking, are fingerprinting and DNA.

SECTION 4. What is Fingerprinting?

Fingerprints are classified according to a number of characteristics or unique pattern types, which include arches, loops and whorls [3], [p.228]. If one inspects the epidermis of the fingertips closely, one can see that it is made up of ridge and valley structures forming a unique geometric pattern. The ridge endings are given a special name: minutiae. The traditional matching algorithm identifies an individual by comparing the relative positions of minutiae and the number of ridges between them. As fingerprints do not change from birth until death unless they are accidentally or deliberately deformed, it is argued that they can provide absolute proof of identity. The science of fingerprint identification is called dactyloscopy [4], [p.4].
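The relative-position matching idea can be illustrated with a toy sketch. The representation below (minutiae as plain (x, y) points, a fixed distance tolerance, the function name `matching_minutiae`) is hypothetical and greatly simplified; real fingerprint systems also align the prints and compare ridge counts and orientations.

```python
# Toy illustration (not a real AFIS algorithm): minutiae are modeled as
# (x, y) points, and two prints are compared by counting the scene-of-crime
# minutiae that fall within a small distance tolerance of some candidate
# minutia.
from math import hypot

def matching_minutiae(scene, candidate, tolerance=2.0):
    """Count scene minutiae that have a candidate minutia within tolerance."""
    count = 0
    for (x1, y1) in scene:
        if any(hypot(x1 - x2, y1 - y2) <= tolerance for (x2, y2) in candidate):
            count += 1
    return count

# Two of the three scene minutiae lie close to candidate minutiae.
scene = [(10, 12), (40, 8), (25, 30)]
candidate = [(11, 12), (41, 9), (90, 90)]
print(matching_minutiae(scene, candidate))  # 2
```

A scoring rule over such counts is what lets a database search return a ranked list of near matches for an expert to examine.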

4.1. Fingerprinting as Applied to Criminal Law

Fingerprints left behind at the scene of a crime (SOC) can be used to collect physical evidence for the purposes of human identification. They have the capacity to link a person (e.g. a suspect) to a particular location at a given time. This can happen in one of two ways: (i) the suspect's fingerprints are taken and cross-matched with those found at the scene of a crime; or (ii) a successful match is found using computer technology to compare the fingerprints found at the scene of a crime with a database of previous offenders. It should be noted that fingerprinting in criminal law is not new. Manual standards, for instance, have existed since the 1920s, when the Federal Bureau of Investigation (FBI) in the U.S. started processing fingerprint cards. These standards ensured completeness, quality and permanency.

By the early 1970s, due to progress in computer processing power and storage and the rise of more sophisticated software applications, law enforcement began to use automatic machines to classify, store, and retrieve fingerprint data. The FBI led the way by introducing the Integrated Automated Fingerprint Identification System (IAFIS), which could scan a fingerprint image, convert the minutiae to digital information, and compare it to thousands of other fingerprints [5], [p.411]. Today, very large computer databases containing millions of fingerprints of persons who have been arrested are used to make comparisons with prints obtained from new crime scenes. These comparisons can take seconds or minutes, depending on the depth of the search required. Sometimes successful matches can be made; other times the fingerprints cannot be matched. When fingerprints cannot be matched, it is inferred that a new offender has committed the crime. These ‘new’ prints are still stored in the database so that, if the person commits a second offence and is apprehended by direct evidence, earlier crimes can be traced back to them, creating a trail of criminal events linked to the same individual with the potential to solve multiple crimes. Commonly, a list of the prints that come closest to matching the print found at the scene of a crime is returned for further examination by an expert, who then deems which single print is the closest match. In recent years, background checks using fingerprints are even conducted on individuals as a means to gain employment, such as in early childhood education [4], [p.5], or during the process of adoption or other security clearance requirements.

SECTION 5. What is DNA?

DNA fingerprinting, DNA (geno)typing, DNA profiling, identity testing and identification analysis all denote the ability to characterize one or more rare features of an individual's genome, that is, their hereditary makeup. DNA contains the blueprints that are responsible for our cells, tissues, organs, and body [4], [p.8]. In short, it can be likened to “God's signature” [6], [p.259]. Every single human has a unique composition, save for identical twins, who share the same genotype but have subtly different phenotypes. When DNA samples are taken from blood cells, saliva or hair bulb specimens of the same person, the structure of the DNA remains the same. Thus only one sample is required as the basis for DNA profiling, and it can come from any tissue of the body [7], [p.1]. DNA fingerprinting was discovered in 1985 by the English geneticist Dr Alec Jeffreys. He found that certain regions of DNA contained sequences that repeated themselves over and over again, one after the other, and that different individuals had a different number of repeated sections. He developed a technique to examine the length variation of these DNA repeat sequences, thus creating the ability to perform identification tests [8], [pp.2ff].

The smallest building block of DNA is the nucleotide. Each nucleotide contains a deoxyribose sugar, a phosphate group and a base. When analyzing DNA structures, it is the sequence of bases that matters for the purposes of identification [9], [p.11]. There are four bases through which the genetic code is described: Adenine (A), Thymine (T), Guanine (G) and Cytosine (C). When trying to understand DNA sequences as they might appear in written form, consider that ‘A’ only binds with ‘T’, and ‘G’ only binds with ‘C’ (see Figure 2, comparing rows one and two). These base pairs are repeated millions of times in every cell, and it is their order of sequence that determines the characteristics of each person. It is repetitive DNA sequences that are utilized in DNA profiling [10], [p.2].

Figure 2. A typical DNA sequence

For example, in Figure 2 the base sequences of the two strands, known as the double helix, are written for a fictitious DNA sample. While the labels “5′” and “3′” have been included for illustrative purposes, a sequence is written plainly as CTTAGCCATAGCCTA. From this sequence we can deduce the second strand given the binding rules described above. Furthermore, in specific applications of DNA testing, various polymorphisms may be considered, which denote the type of repeat for a given stretch of DNA. For instance, a tetranucleotide repeat is merely a stretch of DNA where a specific four-nucleotide motif is repeated [9], [p.10].
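Deducing the second strand is a purely mechanical application of the binding rules, as a short sketch shows (an illustrative example added here, not part of the original analysis; the function name `complement` is ours):

```python
# Derive the complementary DNA strand using the pairing rules
# A <-> T and G <-> C described in the text.
PAIRS = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand: str) -> str:
    """Return the complementary strand for a written base sequence."""
    return "".join(PAIRS[base] for base in strand)

# The fictitious sequence from Figure 2:
print(complement("CTTAGCCATAGCCTA"))  # GAATCGGTATCGGAT
```

Applying the rules twice returns the original strand, which is why either strand alone fully determines the other.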

DNA profiling can be applied to a broad range of applications including diagnostic medicine, family relationship analysis (proof of paternity and inheritance cases), and the animal and plant sciences [7], [p.31]. The most high-profile use of DNA, however, is in the area of forensic science, popularized by modern-day television series such as CSI: Miami and Cold Case. Episodes from these series, such as “Death Pool” [11] and “Dead Air” [12], allow members of the public to visualize how DNA might be used to gather evidence towards prosecution in a court of law. Although Hollywood is well known for its farcical and inaccurate representations, these episodes still demonstrate the potential of DNA. DNA profiling can eliminate a suspect with a discrimination power so high that it can be considered a major identification mechanism [13], [p.1]. There is no doubt that forensic DNA analysis has made a huge impact on criminal justice and the law since its inception in U.K. courts with the 1988 investigation into the deaths of schoolgirls Lynda Mann in 1983 and Dawn Ashworth in 1986 [14]. Since that time, DNA has been used successfully in criminal law to help prove guilt or innocence [15], in family law to prove parentage, and in immigration law to prove blood relations in cases related to citizenship [4], [p.xiii].

5.1. DNA as Applied to Criminal Law

In forensic DNA analysis today, mitochondrial DNA is used for identification when nuclear DNA does not possess the right properties for individual identification [9], [p.5]. According to Koblinsky et al., it is the moderately repetitious DNA that is of interest to forensic analysts [4], [pp.17f]:

“It has been shown that 99.9% of human DNA is the same in every individual. In fact, every individual's DNA has a relatively small number of variations from others. It is that variation of 1 in every 1000 bases that allows us to distinguish one individual from another through forensic genetic testing.”

As in dactyloscopy, an individual's DNA can be left behind at a scene of a crime or on a victim. Natural fibers transferred through human contact, for example from a perpetrator to a victim, or natural fibers, sometimes microscopic in nature, left behind at a scene of a crime, can be used for evidentiary purposes. The DNA found in hair, for example, can be compared to hair specimens taken from a crime suspect or to a DNA profile stored in an existing DNA databank. Synthetic fibers not containing DNA, such as threads from a piece of clothing worn by a perpetrator, can also be used to link a suspect to a crime. The transfer of fibers from one person to another upon physical contact is known as the Locard exchange principle [4], [p.3].

It is important to note that all physical evidence, DNA included, should only ever be considered circumstantial evidence. It provides only a basis for inference about the claim being made, and can be used in logical reasoning to prove or disprove an assertion. In a criminal case, DNA alone cannot be used to prove someone's guilt or innocence. Rather, DNA may be able to point investigators to ‘what happened’, ‘the order of events that took place’, ‘who was involved’, ‘where an event took place’ and ‘how it might have taken place’; in that manner the forensic scientist conducts a reconstruction by means of association (Table 1) [16], [p.1]. Thus the job of an investigator is to put all the pieces of the puzzle together, gathering as much information as possible from all available sources of evidence, including eyewitness accounts, physical evidence and archival records [4], [p.1].

Table 1. A theoretical framework for the discipline of criminalistics [16], [p.2]

As more sophisticated techniques for analyzing DNA samples taken at the scene of a crime have emerged, the mass of DNA needed for a correct reading has decreased. How much DNA do you need? It all depends on the richness of the sample. For instance, a 2002 US State Police handbook noted that a clump of pulled hair contained enough material for successful RFLP (restriction fragment length polymorphism) typing. A single hair root provided enough nuclear DNA for PCR STR (polymerase chain reaction short tandem repeat) typing, but not enough for RFLP. And a hair shaft contained sufficient mitochondria for successful mtDNA (mitochondrial DNA) typing, but was inadequate for PCR STR or RFLP typing [16], [p.61]. A blood, saliva, urine, bone, teeth, skin or semen sample can be considered richer than a hair root for extraction purposes, but DNA analysis is very much dependent on the level of degradation the sample has been exposed to.

Environmental factors can be harmful to DNA collected from a scene of a crime and can lead to the deterioration, destruction, or contamination of evidence, all contestable issues a lawyer may have to deal with in a court of law [4], [p.xiii]. For instance, heat, moisture, bacteria, ultraviolet (UV) rays and common chemicals can contribute to the degradation process [9], [p.61]. When a sample undergoes some level of degradation, the chain of custody is said to have been infringed upon. To get around such problems, experts have proposed bringing the laboratory closer to policing practice. The concept of a “lab in a van” or “lab on a chip” (LOC) proposes a mobile laboratory in which analysis and interpretation of evidence is possible even at the scene of a crime [6], [p.153]. Advancements in mobile technologies continue to allow even very tiny biological substances to undergo DNA testing resulting in accurate identification. Even a cigarette butt bearing saliva, which contains epithelial cells, can be screened for DNA evidence [4], [p.6].

SECTION 6. Comparing DNA and Fingerprinting

To begin with, traditional fingerprint classification techniques have been around far longer than DNA identification, although both fingerprints and DNA have been features of the human body from the beginning. In its manual form, the Galton-Henry system of fingerprint classification first made its impact on the practices of Scotland Yard in 1901. So whereas fingerprint recognition can be performed using manual methods, DNA testing can only happen in a laboratory setting, even if analysis now takes the form of a mobile lab on a chip. DNA is also both a pervasive and an invasive biometric. That is, DNA is possessed by everyone, and DNA belongs to the internals of what makes up the body. For a DNA reading, a hair shaft has to be detached from the scalp; teeth, skin, and bone have to be ‘dismembered’ from the body; and blood, urine, and saliva have to be extracted from the body [17], [p.374].

In most states, the police can take non-intimate samples if a person has been arrested for a serious recordable offence, and in other states DNA can be taken for offences such as begging, being drunk and disorderly, and taking part in an illegal demonstration. In the U.K., for instance, the DNA does not have to be directly relevant to investigating the offence for which a person is being arrested, and the person does not have to be charged before the sample is taken. The police are not allowed to take more than one successful sample from the same body part during the course of an investigation. The police can take an intimate sample only with a person's written consent, even if they have been arrested. However, there is a burgeoning debate at present about what actually constitutes consent during such a process: is it true consent, or merely compliance with, or acknowledgment of, required police procedures by the individual under arrest?

Fingerprints are different in that, while belonging to the body, they are a feature on the surface of the body and do not constitute mass. Fingerprints are patterns that appear on the skin, but they are not the fiber we know as skin. Fingerprints also exclude a small portion of the population: those who do not have particular fingers, hands, or arms, or whose fingers have been severely deformed through accidental or deliberate damage. Despite these differences, scientists claim that forensic DNA testing has emerged as an accurate measure of someone's identity, with reliability equal to that of fingerprint recognition [4], [p.5].

6.1. Intimate and Non-Intimate Measures: Other Biometrics Versus DNA Sampling

6.1.1. The United States and Other Biometrics

The notion of “intimacy” is very much linked to the literature on DNA, and not to biometrics in general. Although historically there has been some contention that a fingerprint sample is both “intimate” and “private”, the proliferation of fingerprint, handprint, and facial recognition systems now used for government and commercial applications has rendered this debate somewhat redundant. This is not to say that the storage of personal attributes is without its own commensurate risks, but large-scale applications enforced by such acts as the United States Enhanced Border Security and Visa Entry Reform Act of 2002 mean that fingerprint, hand, and facial recognition systems have now become commonplace. In fact, this trend promises to continue through multimodal biometrics, the adoption of several biometrics toward individual authentication. Few travelers, at the time of transit, directly challenge the right of authorities to take such personal details and to store them on large databases in the name of national security. However, sentiment, at least in North America, was different prior to the September 11 terrorist attacks on the Twin Towers [18].

In 1997 biometrics were touted as a type of personal data wholly owned by the individual bearer, with statutory implications depending on the governing jurisdiction [19]. It followed that a mandatory requirement by a government agency to collect and store fingerprint data may have been in breach of an individual's legitimate right to privacy. In the U.S., court cases on this issue have consistently found that certain biometrics do not violate federal laws such as the Fourth Amendment. It seems that the [20]:

“ … real test for constitutionality of biometrics … appears to be based on the degree of physical intrusiveness of the biometric procedure. Those that do not break the skin are probably not searches, while those that do are”.

In the context of DNA we can almost certainly claim that there is “physical intrusiveness” of a different nature to the collection of surface-level fingerprints (Figure 2). In the collection of blood samples we must “break” or “pierce” the skin; in the collection of saliva samples we enter the mouth and touch the inner lining of the mouth with buccal swabs; in the removal of a hair or clump of hair we are “pulling” the hair out of its follicle; and so on. And it is here, in these examples, where consent, policing powers, and authority become of greatest relevance and significance.

Figure 2. Left: finger “prints” on the surface of the skin. Right: DNA blood “sample” taken by pricking the skin.

6.1.2. Britain and DNA

In the world of DNA, there is a simple classification, followed by most law enforcement agencies, that denotes samples as being of either an “intimate” or a “non-intimate” nature. In the British provisions of the original Police and Criminal Evidence Act of 1984 (PACE), section 65 defines intimate samples as “a sample of blood, semen or any other tissue fluid, urine, saliva or pubic hair, or a swab taken from a person's body orifice” and non-intimate samples as “hair other than pubic hair; a sample taken from a nail or from under a nail; a swab taken from any part of a person's body other than a body orifice” [21], [p.80]. Generally, it must be noted that at times police can take a sample by force, but on other occasions they require consent. In Britain, prior to 2001, intimate samples from a person in custody were only obtainable with the express authority of a police officer at the rank of superintendent, and only with the written permission of the person who had been detained (section 62) [21]. Non-intimate samples could be taken from an individual without consent, but with permission from a police officer of superintendent rank (section 63). In both instances, there had to be reasonable grounds for suspecting that the person from whom the sample would be taken had been involved in a serious offence [21]. And beyond reasonable grounds there had to be, theoretically at least, the potential to confirm or disprove the suspect's involvement through obtaining a DNA sample [22], [p.29]. Over time, Acts such as PACE have been watered down, leading to controversial strategic choices in law enforcement practice, such as the trend towards growing national DNA databases at a rapid rate.

6.2. Continuity of Evidence

Table 2. Ways to mitigate the effect of DNA evidence

Policing and forensic investigative work are no different to any other “system” of practice; they are required to maintain sophisticated audit trails, even beyond those of corporate organizations, to ensure that a miscarriage of justice does not take place. However, it is much easier to prove continuity of evidence for a fingerprint than for DNA, which is far more complex. A fingerprint found at a crime scene does not undergo the same type of degradation as a DNA sample. Thus it is much easier to claim a fingerprint match in a court of law than a DNA closeness match. Providing physical evidence in the form of a DNA sample or profile requires the litigator to prove that the sample was handled with the utmost care throughout the whole chain of custody, and that it followed a particular set of standard procedures for the collection, transportation, and handling of the material. The proof that these procedures were followed can be found in a series of paper trails which track the movements of samples [6], [p.114].

Beyond the actual location of the evidence, continuity of evidence has to do with how a DNA sample is stored and handled; the temperature of the place where the sample was found and of the place of storage; whether samples surrounding the one being analyzed were contaminated; how samples are identified and qualified using techniques such as barcode labels or tags; how samples were tested and under what conditions; and how frequently samples were accessed, by whom, and for what purposes [4], [p.43]. When DNA forensic testing was in its infancy, knowledgeable lawyers would contest the DNA evidence in court by pointing to the micro-level practices of the particular laboratories that had been tasked with the analytical process. The first time attention was focused on the need to standardize procedures and to develop accreditation processes for laboratories and personnel was the 1989 case People v Castro 545 N.Y.S.2d 985 (Sup. Ct. 1989). When DNA testing began it was a largely unregulated field, with one commentator famously noting that “clinical laboratories [were required to] meet higher standards to be allowed to diagnose strep throat than forensic labs [were required to] meet to put a defendant on death row” [9], [p.55]. But it must be said that, given the advancement in quality procedures, attacks on DNA evidence rarely focus on the actual standards, and focus instead on whether or not the standards were followed appropriately [9], [p.61].
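The audit-trail requirement described above can be pictured as a simple append-only log of handling events. The following is a hedged, minimal sketch: the field names (`sample_barcode`, `storage_temp_c`, etc.) are illustrative assumptions, not any agency's actual schema.

```python
# Hedged sketch of a chain-of-custody audit trail: every handling event is
# logged so the movements of a sample can be reconstructed in court.
# All field names here are illustrative, not a real agency schema.
from dataclasses import dataclass, field

@dataclass
class CustodyEvent:
    sample_barcode: str
    action: str            # e.g. "collected", "transported", "tested"
    handler: str
    location: str
    storage_temp_c: float  # temperature at the time of the event

@dataclass
class ChainOfCustody:
    events: list = field(default_factory=list)

    def log(self, event: CustodyEvent) -> None:
        """Append-only: events are never edited or removed."""
        self.events.append(event)

    def handlers(self) -> list:
        """Who touched the sample, in order: the question a court asks."""
        return [e.handler for e in self.events]

chain = ChainOfCustody()
chain.log(CustodyEvent("S-0001", "collected", "Officer A", "crime scene", 18.0))
chain.log(CustodyEvent("S-0001", "tested", "Analyst B", "lab 3", 4.0))
print(chain.handlers())  # ['Officer A', 'Analyst B']
```

A gap or inconsistency in such a log (a missing handler, an unexplained temperature change) is precisely the kind of detail a defense lawyer would probe.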

In the event that a defense lawyer attempts to lodge an attack on the DNA evidence being presented in a court of law, they will almost always claim human error with respect to the procedures not being followed in accordance with industry standards. Human error cannot be eradicated from any system, and no matter how small the chance, there is always the possibility that a sample has been wrongly labeled or contaminated by external agents [9]. Worse still is the potential for a forensic expert to provide erroneous or misleading results, whether through a lack of experience, a miscalculation of statistical probabilities, or deliberate perjury. The latter is complex to prove in court. Some have attributed such human errors, and the wrongful convictions they produce, to undue political pressure placed on lab directors and subsequently on analysts for a timely response to a violent crime [16], [p.157]. As Michaelis et al. note [9], [p.69]:

“[i]n far too many cases, the directors of government agencies such as forensic testing laboratories are subjected to pressure from politicians and government officials to produce results that are politically expedient, sometimes at the expense of quality assurance … Laboratory directors are too often pressured to produce results quickly, or to produce results that will lead to a conviction, rather than allowed to take the time required to ensure quality results.”

Thus attacks on DNA evidence can be made by attacking the chain of custody among other strategies shown in Table 2.

SECTION 7. The Difference Between Databases and Databanks

7.1. Of Profiles and Samples

In almost any biometric system, there are four steps required to match one biometric with another. First, data is acquired from the subject, usually in the form of an image (e.g., a fingerprint or iris). Second, the transmission channel, which acts as the link between the primary components, transfers the data to the signal processor. Third, the processor takes the raw biometric image and begins coding the biometric through segmentation, which results in a feature extraction and a quality score; the matching algorithm then attempts to find a record that is identical, resulting in a match score. Finally, a decision is made based on the resultant scores, and an acceptance or rejection is determined [23]. At the computer level, a biometric image is translated into a string of bits, that is, a series of ones and zeros. Thus a fingerprint is coded into a numeric value, and these values are compared in the matching algorithm against other existing values. Simply put, the input value is the actual fingerprint image, and the output value is a coded value unique enough to determine an individual profile.
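The acquire-code-match-decide loop described above can be sketched in a few lines. This is a hedged, toy illustration: the "feature extraction" (a parity encoding), the enrolled templates, and the acceptance threshold are all invented placeholders, not a real fingerprint algorithm.

```python
# Toy sketch of the biometric matching loop: acquisition, coding,
# matching against enrolled records, and a threshold decision.
# The encoding and threshold below are illustrative placeholders only.

def extract_features(image):
    """Simplified 'coding' step: turn a raw image into a bit string."""
    return [pixel % 2 for pixel in image]  # toy parity-based encoding

def match_score(template_a, template_b):
    """Fraction of positions where two coded templates agree."""
    agreements = sum(a == b for a, b in zip(template_a, template_b))
    return agreements / len(template_a)

def decide(raw_image, enrolled_templates, threshold=0.9):
    """Full loop: code the probe, score it against every record, decide."""
    probe = extract_features(raw_image)
    scores = {name: match_score(probe, t)
              for name, t in enrolled_templates.items()}
    best_name, best_score = max(scores.items(), key=lambda kv: kv[1])
    accepted = best_score >= threshold  # acceptance/rejection decision
    return (best_name if accepted else None), best_score

enrolled = {"alice": [1, 0, 1, 1, 0, 0, 1, 0],
            "bob":   [0, 1, 1, 0, 1, 0, 0, 1]}
probe_image = [3, 2, 5, 7, 4, 6, 9, 2]   # encodes to alice's template
print(decide(probe_image, enrolled))     # ('alice', 1.0)
```

The key point the sketch illustrates is that the system never compares images directly: only the coded templates and their scores are involved in the decision.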

With respect to the extraction of a DNA sample, the process is much more complex, as are its evaluation and interpretation. A DNA sample differs from a fingerprint image: a sample is a piece of the body, or something issuing from the body, while a fingerprint image is an outward bodily aspect. When a DNA sample undergoes processing, it too is coded into a unique value of As, Ts, Gs, and Cs. This value is referred to as a DNA profile. Storing DNA profiles in a computer software program is considered a different practice to storing the actual feature-rich DNA sample in a DNA store. Some members of the community have volunteered DNA samples using commercial DNA test kits such as “DNA Exam” by the BioSynthesis Corporation [24]. For example, the DNA Diagnostics Center [25] states that one may:

“ … elect to take advantage of [the] DNA banking service without any additional charge if [one] orders a DNA profile [and that the company] will store a sample of the tested individual's DNA in a safe, secure facility for 15 years, in case the DNA sample is ever needed for additional testing”.

The controversy over storing “samples” by force in the crime arena has to do with the potential for DNA to generate information such as a person's predisposition to disease, or other characteristics that a person might consider confidential. It is the application of new algorithms or extraction/evaluation/interpretation techniques to an existing sample that is of greatest concern to civil liberties advocates. Profiles are usually unique combinations of 16 markers [26]; they can only be used to match, and cannot be used toward further fact-finding discoveries, although some believe that conclusions might be drawn from profiles in the future. In a given population, there are several different alleles for any single marker, and some of these may appear more frequently than others. The best markers are those with the greatest number of different alleles and an even distribution of allele frequencies [9], [p.19].
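The link between allele frequencies and a profile's rarity can be made concrete with the standard "product rule": per-locus genotype frequencies, computed from population allele frequencies under Hardy-Weinberg assumptions, are multiplied across independent markers. The sketch below is a hedged illustration; the allele frequencies are invented for the example, not real population data.

```python
# Hedged illustration of the product rule for estimating how rare a DNA
# profile is in a population. Allele frequencies below are invented.

def genotype_frequency(p, q):
    """Hardy-Weinberg expected frequency at one locus:
    p^2 for a homozygote (p == q), 2pq for a heterozygote."""
    return p * p if p == q else 2 * p * q

def profile_rarity(loci):
    """Multiply per-locus genotype frequencies across independent markers."""
    rarity = 1.0
    for p, q in loci:
        rarity *= genotype_frequency(p, q)
    return rarity

# Three hypothetical markers; each pair holds the two allele frequencies.
profile = [(0.1, 0.2),    # heterozygote: 2 * 0.1 * 0.2 = 0.04
           (0.05, 0.05),  # homozygote:   0.05^2       = 0.0025
           (0.3, 0.1)]    # heterozygote: 2 * 0.3 * 0.1 = 0.06
print(profile_rarity(profile))  # 0.04 * 0.0025 * 0.06 = 6e-06
```

This also shows why markers with many, evenly distributed alleles are "best": lower per-locus frequencies multiply into a much rarer, and hence more probative, overall profile.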

7.2. Of Databases and Databanks

Although textbooks would have us believe that there is a clear-cut distinction between what constitutes a database and a databank, in actual fact the terms are used interchangeably in most generalist computing literature. Most dictionaries, for example, will define the term database without an entry for databank. A database is a file of information assembled in an orderly manner by a program designed to record and manipulate data, and which can be queried using specific criteria. Commercial database products include Oracle and Microsoft Access. The International Organization for Standardization, however, does define a databank, as “a set of data related to a given subject and organized in such a way that it can be consulted by users” [27]. The distinction is subtle, but we can extrapolate from these definitions that databases are generic information stores, while databanks are specific to a subject [28].

In the study of DNA with respect to criminal law, the distinction between databases and databanks is much more crystallized, although readers are still bound to be confused by contradictory statements made by some authors. Still, in most cases, a databank is used to investigate crimes and identify suspects, while a database is used to estimate the rarity of a particular DNA profile in the larger population [9], [p.99]. Databanks contain richer personal information related to samples, even if the identity of the person is unknown. For example, a databank can contain unique profiles of suspects and convicted criminals, and content about physical crime stains and records of DNA profiles generated by specific probes at specific loci [10], [p.40]. Databases are much more generic than databanks, containing information representative of the whole populace or a segment of it. For example, a database can contain statistical information on the population frequencies of various DNA markers, generated from random samples for particular ethnic groups or for the population at large. Databanks may contain rich personal data about offenders and cases [16], [pp.157f], but databases contain only minimal information such as the DNA profile, ethnic background, and gender of the corresponding individuals.
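The contrast drawn above can be summarized as two different record shapes. The following is a minimal, hedged sketch: the field names are illustrative assumptions chosen to mirror the text, not any real system's schema.

```python
# Minimal sketch of the databank/database distinction drawn in the text:
# a databank record carries case-rich, subject-specific data tied to a
# profile, while a database record holds only anonymous population
# statistics. All field names are illustrative.
from dataclasses import dataclass

@dataclass
class DatabankRecord:            # subject- and case-specific
    profile: str                 # coded STR profile
    subject_status: str          # e.g. "suspect", "convicted", "crime stain"
    case_reference: str          # links the profile to an investigation
    collecting_force: str        # which police force took the sample

@dataclass
class DatabaseRecord:            # anonymous population statistics
    marker: str                  # e.g. an STR locus name
    allele: str
    ethnic_group: str
    population_frequency: float  # used to estimate profile rarity

bank_entry = DatabankRecord("D3:14/16;TH01:6/9", "suspect", "CASE-42", "Force X")
stat_entry = DatabaseRecord("TH01", "9", "whole population", 0.12)
```

Note that nothing in `DatabaseRecord` identifies a person, which is exactly why its contents are used for rarity estimates rather than for investigation.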

Table 3. The NDNAD database attributes [30]

The premise of the DNA databank is that DNA profile data of known offenders can be searched in an attempt to solve unsolved crimes, known as ‘cold cases’. Databanks are valuable in that they can help investigators string together a series of crimes that would otherwise go unrelated, allowing the investigator to work across space and time after all other avenues have been exhausted [9], [p.99]. With respect to violent crimes, we know that offenders are highly prone to re-offending, and we also know that violent crimes often provide rich DNA sample sources such as bones, blood, or semen. Thus DNA left at the scene of a crime can be searched against a DNA databank in the hope of a “close” match [16], [p.157]. The rarer the DNA profile in the larger population, the greater the probative value of the DNA evidence [9], [p.19].
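The idea of searching a crime-scene profile against a databank for full or "close" hits can be sketched as follows. This is a hedged toy model, assuming profiles are represented as marker-to-allele-pair mappings and scoring candidates by the number of loci that agree; real systems use far more sophisticated matching and statistical weighting.

```python
# Hedged sketch of searching a crime-scene DNA profile against a databank,
# returning full and 'close' hits ranked by how many loci agree.
# Profile representation and record IDs are illustrative assumptions.

def shared_loci(profile_a, profile_b):
    """Count markers present in both profiles with identical allele pairs."""
    return sum(1 for locus, alleles in profile_a.items()
               if profile_b.get(locus) == alleles)

def search_databank(scene_profile, databank, min_shared=2):
    """Return (record_id, shared-locus count) hits, best first."""
    hits = [(record_id, shared_loci(scene_profile, profile))
            for record_id, profile in databank.items()]
    hits = [h for h in hits if h[1] >= min_shared]
    return sorted(hits, key=lambda h: -h[1])

scene = {"D3": (14, 16), "TH01": (6, 9), "FGA": (21, 23)}
databank = {
    "CJ001": {"D3": (14, 16), "TH01": (6, 9), "FGA": (21, 23)},  # full hit
    "CJ002": {"D3": (14, 16), "TH01": (6, 9), "FGA": (20, 24)},  # close hit
    "CJ003": {"D3": (12, 13), "TH01": (7, 8), "FGA": (19, 22)},  # no hit
}
print(search_databank(scene, databank))  # [('CJ001', 3), ('CJ002', 2)]
```

The sketch also makes the later concern about adventitious matches tangible: a "close" hit like CJ002 is a partial agreement, not proof of identity, which is why the rarity of the profile in the wider population matters so much.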

Different jurisdictions have different standards on the criteria for inclusion in DNA databanks, on what attribute information is stored in individual records, and on who has access. In the United States, for instance, different states have different rules, some allowing DNA databanks to be accessed by law enforcement agencies alone, and others allowing public officials to have access for purposes outside law enforcement [9], [p.100]. In the U.S., the CODIS (Combined DNA Index System) system was launched in 1998-99 by the FBI. It contains two searchable databases, one with previous offenders and another with DNA profiles gathered from evidence at crime scenes [9], [p.16]. In the U.K., the National DNA Database (NDNAD) of England, Wales, and Northern Ireland contains very detailed information for each criminal justice (CJ) record (see Table 3), and profiles are searched against each other on a daily basis, with close-hit results forwarded to the appropriate police personnel. It is somewhat ironic that the NDNAD, established in 1995, is strictly a databank, yet is so large that most consider it a database, as the word “database” in the NDNAD acronym itself attests [29], [p.2].

SECTION 8. Legal, Ethical and Social Concerns

The collection, storage, and use of DNA samples, profiles, and fingerprints raise a number of legal, ethical, and social concerns. While some of the concerns over the collection and storage of an individual's fingerprints by the State have dissipated over the last decade, the debate over the storage of DNA samples and profiles rages more than ever before. It was around the turn of the century that a number of social, ethical, and legal issues were raised with respect to DNA sampling, but councils and institutes, through lack of knowledge or expertise, could hardly offer anything in the way of a possible solution or way forward to the DNA controversy [31], [p.34]. At the heart of the techno-legal “controversy” is a clash of ideals arising from a collision of disciplines. For many medical practitioners working on topics related to consent or confidentiality, the legal position on DNA is one which acts as a barrier to important medical research. While few would dispute the importance of data protection laws and the ethical reasons behind balancing the right to privacy against other rights and interests, some in the medical field believe that the law has not been able to deal with exceptions where the use of DNA data could be considered proportionate, for instance, in the area of epidemiology. There are those like Iverson who argue that consent requirements could be relaxed for the sake of the common good:

“We are not arguing that epidemiological research should always proceed without consent. But it should be allowed to do so when the privacy interference is proportionate. Regulators and researchers need to improve their ability to recognize these situations. Our data indicate a propensity to over-predict participants' distress and under-predict the problems of using proxies in place of researchers. Rectifying these points would be a big step in the right direction” [32], [p.169].

Thinking in this utilitarian way, the use of DNA evidence in criminal cases, especially violent crimes, is something most people would agree is a legitimate use of technology and within the confines of the law. The application of DNA to assist in civil cases, again, would seem appropriate where family and state-to-citizen disputes can only be settled by the provision of genetic evidence. Volunteering DNA samples to appropriate organizations and institutions is also something an individual has the freedom to do, despite the fact that a large portion of the population would not participate in a systematic collection of such personal details. Voluntary donation of a DNA sample usually happens for one of three reasons: (i) to assist practitioners in the field of medical research; (ii) to assist in DNA cross-matching exercises with respect to criminal cases; and (iii) to provide for an individual's own potential future need to use their DNA, for any number of possible purposes. For as Carole McCartney reminds us:

“[f]orensic DNA technology has multiple uses in the fight against crime, and ongoing research looks to expand its usefulness further in the future. While the typical application of DNA technology in criminal investigations is most often unproblematic, there needs to be continued vigilance over the direction and implications of research and future uses” [33], [p.189].

Table 4. Legal, ethical and social issues related to use of DNA in criminal law

It is in this parallel development that we can see an evolution of sorts occurring in the collection of highly intimate personal information. On the one hand we have the law; on the other we have medical discovery; both are on parallel trajectories that will have overflow effects on one another. For many, the appropriate use of DNA in the medical research field and the criminal law field can only have positive benefits for the community at large. There is no denying this to be the case. However, the real risks cannot be overlooked. Supplementary industries can see the application of DNA in a plethora of programs, including medically insuring ‘at risk’ claimants to an unforeseen level of precision, measuring an individual's predisposition to a particular behavioral characteristic for employment purposes [34], [p.897], and tinkering with the genes of unborn children to ensure the “right” type of citizens are born into the world. All of these might sound like the stuff of science fiction, but they are all areas under current exploration.

For now, we have the ability to identify issues that have quickly escalated in importance in the DNA debate. For this we have several high-profile cases in Europe to thank, but especially the most recent, heard in the European Court of Human Rights (ECtHR) on 4 December 2008: S and Marper v. the United Kingdom [35]. This landmark case, against all odds, made the U.K. (and to some extent the rest of the world) stop and think about the course it had taken. For the U.K. this meant a re-evaluation of its path forward, via a community consultation process, regarding the decade-old initiatives of the NDNAD. The main issues that the case and its predecessors brought to the fore are summarized in Table 4. The table should be read from left to right, one row at a time. The left column indicates what most authors studying the socio-ethical issues regard as an acceptable use of DNA, and the right column indicates what most authors regard as either debatable or unacceptable uses of DNA.

Of greatest concern to most civil libertarians is the issue of proportionality, and the potential for a disproportionate number of profiles to be gathered, relative to other state practices, towards a blanket-coverage databank. Blanket coverage can be achieved by sampling a populace; a census approach is not required. Maintaining DNA profiles for some 15-20% of the total population means one could conduct familial searching on the rest, making associations between persons with a high degree of accuracy [4], [p.274], something that would be possible in the U.K. by 2018 if it maintained the same rate of sampling. This is not without its dangers, as it promotes adventitious searching and close matches that might not categorically infer someone's guilt or innocence.

Table 5. Social, ethical and legal issues pertaining to DNA databanks identified by the National Institute of Justice in the United States in 2000 [31], [pp.35f].

In addition, large databanks are not without their biases. Police records are already filled with the presence of minority groups of particular ethnic origin, for instance, which can affect the probability of a close match despite someone's innocence. Being on the database means there is a chance a result might list you as a suspect based on having a similar DNA profile to someone else. And ultimately, the fact that innocent people would have their profiles stored on the NDNAD would do little in the way of preventing crime, and would lead, before too long, to a de facto sampling of all state citizens.

The driving force behind such a campaign could only be achieved by obtaining DNA samples from persons (including innocent people, or ‘innocents’), either via some event triggering contact between an individual and the police, or via an avenue at birth [10], [p.40]. Police powers have increased, especially since the worldwide terrorist attacks post-2000, and this has led to a tradeoff with an individual's right to privacy [36], [p.14]. Notions of consenting to provide a DNA sample to law enforcement personnel have been challenged where force has been applied. And not consenting to a sample being taken, even if one is innocent, has its own implications and can be equally incriminating. So legislative changes have encroached on individual rights: whereas a warrant was once required to take a DNA sample from a suspect's body based on reasonable grounds, today it is questionable whether this caveat actually exists.

Beyond the obvious downsides of retaining the DNA profile or sample of innocent people who are in actual fact law-abiding citizens, there is the potential for persons to feel aggrieved because they have not been left alone to go about their private business. Innocent persons who are treated like criminals may end up losing their trust in law enforcement agencies. This is no small social issue, given that there are about 1 million innocent people on the NDNAD in the U.K. And in this context, it is not difficult to see how some individuals or groups might come to possess an anti-police or anti-government sentiment, feeling in some way that they have been wronged or singled out. In some of these ‘mistaken identity’ situations, surely it would have been better to prove someone's innocence using other available evidence, such as closed circuit television (CCTV), without the need to take an intimate DNA sample first. Despite these problems, it seems anyone coming under police suspicion in the U.K. will have their DNA taken anyway [33], [p.175].

Of a most sensitive nature is the collection of DNA samples for an indefinite period of time [4], [p.7]. In most countries, samples are taken, DNA profiles are determined and stored in computer databases, and the samples are subsequently destroyed. The long-term storage of DNA samples from those who have committed petty crimes, rather than violent crimes, raises the question of the government's motivation for such storage [4]. There are critics who believe that the retention of samples is “an unjustifiable infringement on an individual's privacy” [33], [p.189].

Much has changed with respect to the social, ethical, and legal issues identified in Table 5 since its publication in 2000, both in the United States and the United Kingdom. But the table still provides a historical insight into the growing list of issues that were identified at the turn of the century.

Equally alarming is the storage of samples from innocents and from minors. Even more disturbing is the storage of samples with which no personal details have been associated. DNA databanks are no different to other databanks kept by the state: they can be lost, they can be accessed by unauthorized persons, and their results can be misrepresented either accidentally or deliberately [33], [p.188]. The stakes, however, are much higher in the case of DNA than in fingerprinting or other application areas, because the information contained in a DNA sample or profile is far richer in potential use. All of this raises issues pertaining to how changes in the law affect society, and how ethics might be understood within a human rights context.

SECTION 9. Conclusion

The legal, social, and ethical issues surrounding the collection, use, and storage of DNA profiles and samples are probably more evident today than at any other time in history. On the one hand we have the necessity to advance technology and to use it in situations in which it is advantageous to the whole community; on the other hand, this same technology can impinge on the rights of individuals (if we let it) through sweeping changes to legislation. Whether we are discussing the need for DNA evidence in criminal law, civil law, epidemiological research, or other general use, consent should be the core focus of any and every collection instance. Unlimited retention of DNA samples collected from those arrested but not charged is another issue where legislative reforms are needed in a number of European jurisdictions, although this trend now seems to be gathering momentum outside Europe as well. A further issue is the redefinition of what constitutes an intimate or non-intimate sample; here, most clearly, a plethora of jurisdictions have a problem with the watering down of which DNA procedures are considered invasive as opposed to non-invasive with respect to the human body. The bottom line is that we can still convict criminals who have committed serious recordable offences without needing to take DNA samples from persons committing petty crimes, despite statistics alleging links between those committing serious and petty offences. So long as a profile is in a database it can be searched, and the problem with this is that so-called ‘matches’ (adventitious in nature) can be as readily ‘incorrect’ as ‘correct’. This possibility alone has serious implications for human rights. The time to debate and discuss these matters is now, before the widespread use of DNA becomes commonplace in general societal applications.

SECTION 10. Afterword

Although members of society should not expect to learn of a black market for DNA profiles just yet, it is merely a matter of time before the proliferation and use of such profiles makes them more attractive to members of illicit networks. There is now overwhelming evidence that identity theft is on the rise worldwide (although estimates vary depending on the study and state). The systematic manipulation of identification numbers, such as social security numbers, credit card numbers, and even driver's license numbers, is now well documented. Victims of identity theft know too well the pain of having to prove who they are to government agencies and financial institutions, and of providing adequate evidence that they should not be held liable for information and monetary transactions they did not commit. Today's type of identity theft has its limitations, however: stealing a number is unlike stealing somebody's bodily signature. While credit card numbers can be replaced, one's DNA or fingerprints cannot. This resonates with the well-known response of Sir Thomas More to Norfolk in A Man for All Seasons: “you might as well advise a man to change the color of his eyes [another type of biometric]”, knowing all too well that this was impossible. While some have proclaimed the end of the DNA controversy, at least from a quality assurance and scientific standpoint, the real controversy is perhaps just beginning.

ACKNOWLEDGEMENTS

The author would like to acknowledge Associate Professor Clive Harfield of the Centre for Transnational Crime Prevention in the Faculty of Law at the University of Wollongong for his mentorship in the areas of U.K. law and policing in 2009. The author also wishes to extend her sincere thanks to Mr Peter Mahy, Partner at Howells LLP and the lawyer who represented S. and Marper before the Grand Chamber at the European Court of Human Rights, for his willingness to share his knowledge of the NDNAD controversy via a primary interview.

References

1. J. R. Parks, P. L. Hawkes, "Automated personal identification methods for use with smart cards" in Integrated Circuit Cards Tags and Tokens: new technology and applications, Oxford: BSP Professional Books, pp. 92-135, 1990.

2. A. K. Jain, L. Hong, S. Pankanti, R. Bolle, "An identity-authentication system using fingerprints", Proceedings of the IEEE, vol. 85, pp. 1365-1387, 1997.

3. J. Cohen, Automatic Identification and Data Collection Systems, London:McGraw-Hill Book Company, pp. 228, 1994.

4. L. Kobilinsky, T. F. Liotti, J. Oeser-Sweat, DNA: Forensic and Legal Applications, New Jersey: Wiley, 2005.

5. P. T. Higgins, "Standards for the electronic submission of fingerprint cards to the FBI", Journal of Forensic Identification, vol. 45, pp. 409-418, 1995.

6. M. Lynch, S. A. Cole, R. McNally, K. Jordan, Truth Machine: the Contentious History of DNA Fingerprinting, Chicago:The University of Chicago Press, 2008.

7. L. T. Kirby, DNA Fingerprinting: An Introduction, New York:Stockton Press, 1990.

8. J. M. Butler, Forensic DNA Typing: Biology, Technology, and Genetics of STR Markers, 2nd ed., Amsterdam: Elsevier Academic Press, 2005.

9. R. C. Michaelis, R. G. Flanders, P. H. Wulff, A Litigator's Guide to DNA: from the Laboratory to the Courtroom, Amsterdam:Elsevier, 2008.

10. C. A. Price, DNA Evidence: How Reliable Is It? An Analysis of Issues Which May Affect the Validity and Reliability of DNA Evidence, Legal Research Foundation, vol. 38, 1994.

11. A. Donahue, E. Devine, S. Hill, "Death Pool (Season 5 Episode 3)", CSI Miami, 2006.

12. J. Haynes, S. Hill, "Dead Air (Season 4 Episode 21)", CSI Miami, 2006.

13. B. Selinger, "The Scientific Basis of DNA Technology" in DNA and Criminal Justice, J. Vernon, B. Selinger (Eds.), Canberra, vol. 2, 1989.

14. Man jailed in first DNA case wins murder appeal, May 2009, [online] Available: http://uk.reuters.com/article/idUKTRE54D3CC20090514?pageNumber=1&virtualBrandChannel=0.

15. The Innocence Project - Home, 2009, [online] Available: http://www.innocenceproject.org/.

16. N. Rudin, K. Inman, An Introduction to Forensic DNA Analysis, London:CRC Press, 2002.

17. A. Roberts, N. Taylor, "Privacy and the DNA Database", European Human Rights Law Review, vol. 4, pp. 374, 2005.

18. K. Michael, M. G. Michael, Automatic Identification and Location Based Services: from Bar Codes to Chip Implants, 2009.

19. R. Van Kralingen, C. Prins, J. Grijpink, Using your body as a key; legal aspects of biometrics, 1997, [online] Available: http://cwis.kub.nl/~frw/people/kraling/content/biomet.htm.

20. S. O'Connor, "Collected tagged and archived: legal issues in the burgeoning use of biometrics for personal identification", Stanford Technology Law Review, 1998, [online] Available: http://www.jus.unitn.it/USERS/pascuzzi/privcomp99-00/topics/6/firma/connor.txt.

21. S. Ireland, "What Authority Should Police Have to Detain Suspects to take Samples?", DNA and Criminal Justice, pp. 80, 1989.

22. I. Freckelton, "DNA Profiling: Forensic Science Under the Microscope" in DNA and Criminal Justice, J. Vernon, B. Selinger (Eds.), Canberra, vol. 2, pp. 29, 1989.

23. K. Raina, J. D. Woodward, N. Orlans, J. D. Woodward, N. M. Orlans, P. T. Higgins, "How Biometrics Work", Biometrics, pp. 29, 2002.

24. Identity DNA Tests, 2009, [online] Available: http://www.800dnaexam.com/Identity_dna_tests.aspx.

25. Profiling, 2009, [online] Available: http://www.dnacenter.com/dna-testing/profiling.html.

26. Biosciences Federation and The Royal Society of Chemistry, "Forensic Use of Bioinformation: A response from the Biosciences Federation and the Royal Society of Chemistry to the Nuffield Council on Bioethics", January 2007, [online] Available: http://www.rsc.org/images/ForensicBioinformation_tcm18-77563.pdf.

27. J. C. Nader, "Data bank", Prentice Hall's Illustrated Dictionary of Computing, pp. 152, 1998.

28. DNA Safeguarding for security and identification, 2009, [online] Available: http://www.dnatesting.com/dna-safeguarding/index.php.

29. The British Academy of Forensic Sciences, "Response to the Nuffield Bioethics Council Consultation - The Forensic use of bioinformation: ethical issues, between November 2006 and January 2007", 2007, [online] Available: http://www.nuffieldbioethics.org/fileLibrary/pdf/British_Academy_of_Forensic_Sciences.pdf.

30. What happens when someone is arrested?, 2009, [online] Available: http://www.genewatch.org/sub-539483.

31. "The Future of Forensic DNA Testing: Predictions of the Research and Development Working Group", National Institute of Justice, 2000.

32. A. Iversen, K. Liddell, N. Fear, M. Hotopf, S. Wessely, "Consent confidentiality and the Data Protection Act", British Medical Journal, vol. 332, pp. 169, 2006.

33. C. McCartney, "The DNA Expansion Programme and Criminal Investigation", The British Journal of Criminology, vol. 46, pp. 175-189, 2006.

34. D. Meyerson, "Why Courts Should Not Balance Rights Against the Public Interest", Melbourne University Law Review, vol. 33, pp. 897, 2007.

35. Grand Chamber, Case of S. and Marper v. The United Kingdom (Applications nos. 30562/04 and 30566/04), Judgment, European Court of Human Rights, Strasbourg, December 2008.

36. J. Kearney, P. Gunn, "Meet the Experts-Part III DNA Profiling", pp. 14, 1991.

Keywords

Law, Legal factors, Fingerprint recognition, DNA, Forensics, Biometrics, Sampling methods, Skin, Sociotechnical systems, History
Controlled Indexing: social sciences, criminal law, ethical aspects, fingerprint identification, forensic science, social issues, fingerprint profile collection, fingerprint profile storage, DNA sample, nonviolent crime, biometric technique, buccal swab, legal issue, ethical issue

Citation: Katina Michael, "The legal, social and ethical controversy of the collection and storage of fingerprint profiles and DNA samples in forensic science", 2010 IEEE International Symposium on Technology and Society (ISTAS), 7-9 June 2010, Wollongong, Australia

The Social, Cultural, Religious and Ethical Implications of Automatic Identification

Katina Michael, School of Information Technology & Computer Science, University of Wollongong, NSW, Australia 2500, katina@uow.edu.au

M.G. Michael, American Academy of Religion, PO Box U184, University of Wollongong, NSW, Australia 2500, mgm@uow.edu.au

Full Citation: Katina Michael, M.G. Michael, 2004, The Social, Cultural, Religious and Ethical Implications of Automatic Identification, Seventh International Conference on Electronic Commerce Research (ICER-7), University of Texas, Dallas, Texas, USA, June 10-13. Sponsored by ATSMA, IFIP Working Group 7.3, INFORMS Information Society.

Abstract

The number of automatic identification (auto-ID) technologies being utilized in eBusiness applications is growing rapidly. With an increasing trend toward miniaturization and wireless capabilities, auto-ID technologies are becoming more and more pervasive. The pace at which new product innovations are being introduced far outstrips the ability of citizens to absorb what these changes actually mean, and what their likely impact will be upon future generations. This paper attempts to cover a broad spectrum of issues, spanning the social, cultural, religious and ethical implications of auto-ID, with an emphasis on human transponder implants. Previous work is brought together and presented in a way that offers a holistic view of the current state of proceedings, together with an up-to-date bibliography on the topic. The concluding point of this paper is that the long-term side effects of new auto-ID technologies should be considered at the outset, not after they have enjoyed widespread diffusion.

1.  Introduction

Automatic identification is the process of identifying a living or nonliving object without direct human intervention. Before auto-ID, only manual identification techniques existed, such as tattoos [[i]] and fingerprints, which did not allow for the automatic capture of data (see exhibit 1.1). Auto-ID becomes an e-business application enabler when authorization or verification is required before a transaction can take place. Many researchers credit the vision of a cashless society to the capabilities of auto-ID. Since the 1960s, automatic identification has proliferated, especially in mass-market applications such as electronic banking and citizen ID. Together with increases in computer processing power, storage equipment and networking capabilities, miniaturization and mobility have heightened the significance of auto-ID to e-business, especially mobile commerce. Citizens now carry multiple devices with multiple IDs, including ATM cards, credit cards, private and public health insurance cards, retail loyalty cards, school student cards, library cards, gym cards, licenses to drive automobiles, passports to travel by air and ship, voting cards, etc. More sophisticated auto-ID devices, like smart cards and radio-frequency identification (RFID) tags and transponders that house unique lifetime identifiers (ULI) or biometric templates, are increasingly being considered for business-to-consumer (B2C) and government-to-citizen (G2C) transactions. For example, the United States (US) is enforcing the use of biometrics on passports due to the increasing threat of terrorism, and Britain has openly announced plans to begin implanting illegal immigrants with RFID transponders. Internationally, countries are also taking measures to decrease the multi-million dollar cost of fraudulent claims made to social security by updating their citizen identification systems.

Exhibit 1.1     Manual versus Automatic Identification Techniques

2.  Literature Review

The relative ease of performing electronic transactions by using auto-ID has raised a number of social, cultural, religious and ethical issues. Among others, civil libertarians, religious advocates and conspiracy theorists have long cast doubts on the technology and the ultimate use of the information gathered by it. Claims that auto-ID technology impinges on human rights, the right to privacy, and that eventually it will lead to totalitarian control of the populace have been put forward since the 1970s. This paper aims to explore these themes with a particular emphasis on emerging human transponder implant technology. At present, several US companies are marketing e-business services that allow for the tracking and monitoring of individuals using RFID implants in the subcutaneous layer of the skin or Global Positioning System (GPS) wristwatches worn by enrollees. To date previous literature has not consistently addressed philosophical issues related to chip implants for humans in the context of e-business. In fact, popular online news sources like CNN [[ii]] and the BBC [[iii]] are among the few mainline publishers discussing the topic seriously, albeit in a fragmented manner. The credible articles on implanting humans are mostly interviews conducted with proponents of the technology, such as Applied Digital Solutions (ADSX) [[iv]] representatives who are makers of the VeriChip system solution [[v]]; Professor Kevin Warwick of the University of Reading who is known for his Cyborg 1.0 and 2.0 projects [[vi]]; and implantees like the Jacobs family in the US who bear RF/ID transponder implants [[vii]]. Block passages from these interviews are quoted throughout this paper to bring some of the major issues to the fore using a holistic approach.

More recently, academic papers on human transponder implants covering various perspectives have surfaced on the following topics: legal and privacy [[viii], [ix]], ethics and culture [[x]], technological problems and health concerns [[xi]], technological progress [[xii]], and trajectories [[xiii], [xiv]]. While there is a considerable amount of other popular material available, especially on the Internet, related to human chip implants, much of it is subjective and not properly sourced. One major criticism of these reports is that the reader is left pondering the authenticity of the accounts provided, with little evidence to support the respective claims and conclusions. Authorship of this literature is another problem. Often these articles are contributed anonymously, and when they do cite an author’s name, the level of technical understanding portrayed by the individual is severely lacking, to the detriment of what he/she is trying to convey, even if there is a case to be argued. Thus, the gap this paper seeks to fill is twofold: first, to provide a sober presentation of cross-disciplinary perspectives on topical auto-ID issues, with an emphasis on human transponder implants; and second, to document some of the more thought-provoking discussion that has already taken place on the topic, complemented by a complete introductory bibliography.

3.  Method

Articles on auto-ID in general have failed to address the major philosophical issues using a holistic approach. For instance, Woodward [[xv]] is one of the few authors to have published anything substantial about religious issues, with respect to biometric technology, in a recognized journal. Previously the focus has largely been on privacy concerns and Big Brother fears. While such themes are discussed in this paper as well, the goal is to cover a broader list of issues than the commonplace. This is the very reason why two researchers with two very different backgrounds, one in science and the other in humanities, have collaborated to write this paper. A qualitative strategy is employed in this investigation to explore the major themes identified in the literature review. It should be noted, however, that legal, regulatory, economic and related policy issues, such as standards, have been omitted because the aim of this paper is not to inform a purely technical audience or an audience which is strictly concerned with policy. It is aimed rather at the potential end-user of auto-ID devices and at technology companies who are continually involved in the process of auto-ID innovation.

Original material is quoted extensively to ensure that the facts are presented “as is.” There is nothing lost in simplified translations and the full weight of argument is retained, raw and uncut. The authors therefore cannot be accused of bias or misrepresentation. The sheer breadth of literature used for this investigation ensures reliability and validity in the findings. The narrative reporting style helps to guide readers through the paper, allowing individuals to form their own opinions and interpretations of what is being presented. Evidence for the issues discussed has been gathered from a wide variety of sources including offline and online documentation. High level content analysis has been performed to aid in the grouping of categories of discussion including social, cultural, religious and ethical issues that form the skeleton of the main body of the article as a way to identify emerging trends and patterns. Subcategories are also useful in identifying the second tier themes covered, helping to reduce complexity in analysis. The subcategories also allow for links to be made between themes. A highly intricate thread runs through the whole paper telling the story of not just auto-ID but the impacts of the information technology and telecommunications (IT&T) revolution [[xvi]]. There is therefore a predictive element to the paper as well which is meant to confront the reader with some present and future scenarios. The ‘what if’ questions are important as it is hoped they will generate public debate on the major social, cultural, religious and ethical implications of RFID implants in humans.

4. Towards Ubiquitous Computing

Section 4 is wholly dedicated to providing a background in which to understand auto-ID innovation; it will also grant some perspective to the tremendous pace of change in IT&T; and note some of the more grounded predictions about the future of computing. The focus is on wearable and ubiquitous computing within which auto-ID will play a crucial role. This section will help the reader place the evidence presented in the main body of the article into an appropriate context. The reader will thus be able to interpret the findings more precisely once the basic setting has been established, allowing each individual to form their own opinions about the issues being presented.

From personal computers (PCs) to laptops to personal digital assistants (PDAs) and from landline phones to cellular phones to wireless wristwatches, miniaturization and mobility have acted to shift the way in which computing is perceived by humans. Lemonick [[xvii]] captures this pace of change well in the following excerpt:

[i]t took humanity more than 2 million years to invent wheels but only about 5,000 years more to drive those wheels with a steam engine. The first computers filled entire rooms, and it took 35 years to make the machines fit on a desk - but the leap from desktop to laptop took less than a decade… What will the next decade bring, as we move into a new millennium? That’s getting harder and harder to predict.

Once a stationary medium, computers are now portable; they go wherever humans go [[xviii]]. This can be described as technology becoming more human-centric, “where products are designed to work for us, and not us for them” [[xix]]. Thus, the paradigm shift is from desktop computing to wearable computing [[xx]]. Quite remarkably, in the pursuit of miniaturization little has been lost in terms of processing power. “The enormous progress in electronic miniaturization make it possible to fit many components and complex interconnection structures into an extremely small area using high-density printed circuit and multichip substrates” [[xxi]]. We now have so-called Matchbox PCs that are no larger than a box of matches yet house fully functional operating systems [[xxii]]. The development of wearable computer systems has been rapid: Salonen [[xxiii]], among others [[xxiv]], is of the belief that “quite soon we will see a wide range of unobtrusive wearable and ubiquitous computing equipment integrated into our everyday wear”. The next ten years will see wearable computing devices become an integral part of our daily lives, especially as the price of devices keeps falling. Whether noticeable to users or not, the change has already begun. Technology is increasingly becoming an extension of the human body, whether by carrying smart cards or electronic tags [[xxv]] or even PDAs and mobile phones. Furui [[xxvi]] predicts that “[p]eople will actually walk through their day-to-day lives wearing several computers at a time.” Cochrane described this phenomenon as technology being an omnipresent part of our lives. Not only will devices become small and compact but they will be embedded in our bodies, invisible to anyone else [[xxvii]].
For the time being, however, we are witnessing a transition period in which auto-ID devices especially are being trialled on those who either i) desperately require them for medical purposes or ii) cannot challenge their application, as in the case of armed forces or prison inmates. Eventually, the new technology will be opened to the wider market on a voluntary basis, but it will most likely become a de facto compulsory standard (as the mobile phone is today), and inevitably mandatory once it is linked to some kind of requirement for survival. Upon reflection, this is the pattern that most successful high-tech innovations throughout history have followed.

Mark Weiser first coined the term “ubiquitous computing” to encompass all those small information systems (IS) devices, including calculators, electronic calendars and communicators, that users would carry with them every day [[xxviii]]. It is important to make the distinction between ubiquitous and wearable computing. They “have been posed as polar opposites even though they are often applied in very similar applications” [[xxix]]. Kaku [[xxx]] stated that ubiquitous computing is the time “when computers are all connected to each other and the ratio of computers to people flips the other way, with as many as one hundred computers for every person.” This latter definition implies a ubiquitous environment that allows the user to seamlessly interface with the computer systems around them. Environments of the future are predicted to be context-aware, so that users are not disturbed in every context, save for when it is suitable [[xxxi]]. Kortuem [[xxxii]] stated that “[s]uch environments might be found at the home, at the office, at factory floors, or even vehicles.” There is some debate, however, about where to place sensors in these environments: for example, should they be located around the room, or on the individual? Locating sensors around the room enforces certain conditions on an individual, while locating sensors on an individual means that the person is actually in control of their context. The latter case also requires less localized infrastructure and affords a greater degree of freedom. Rhodes et al. [29] argue that by “properly combining wearable computing and ubiquitous computing, a system can have the advantages of both.”
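The context-aware behaviour described above reduces to a simple rule: an incoming interruption is delivered only if the sensed context permits it. A toy sketch follows; the context names and the policy are hypothetical illustrations, not drawn from any system in the literature:

```python
# Toy context-aware notification filter: sensors carried on the person
# report a context label, and the wearable decides whether to interrupt
# the user. Contexts and policy below are invented for illustration.

INTERRUPTIBLE = {"idle", "commuting"}
DO_NOT_DISTURB = {"meeting", "driving", "sleeping"}

def should_notify(context: str, urgent: bool = False) -> bool:
    """Deliver a notification only when the context allows it; urgent
    messages override a do-not-disturb context."""
    if context in INTERRUPTIBLE:
        return True
    if context in DO_NOT_DISTURB:
        return urgent
    return False  # unknown context: stay quiet by default

print(should_notify("idle"))                   # True
print(should_notify("meeting"))                # False
print(should_notify("driving", urgent=True))   # True
```

Note that the policy lives on the wearable, not in the room: this is the design point made above, that sensors on the individual leave the person in control of their own context.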

5.  Social Issues

5.1 Privacy Concerns and Big Brother Fears

Starner [[xxxiii]] makes the distinction between privacy and security concerns. “Security involves the protection of information from unauthorized users; privacy is the individual’s right to control the collection and use of personal information.” Mills [[xxxiv]] is of the opinion that some technology, like communications, is not neutral but totalitarian in nature and that it can make citizens passive. “These glamorous technologies extend and integrate cradle-to-grave surveillance, annihilating all concept of a right to personal privacy, and help consolidate the power of the national security state… every technology, being a form of power, has implicit values and politics…” Over the years, terms like Big Brother [[xxxv], [xxxvi]] and function creep [[xxxvii]] have proliferated, corresponding to the all-seeing eyes of government and to the misuse and abuse of data. In most western countries, data matching programs were constructed, linked to a unique citizen ID, to cross-check details provided by citizens, claims made, and benefits distributed [[xxxviii], [xxxix]]. More recently, however, the trend has been toward information centralization between government agencies under the auspices of a national ID, to reduce fraud [[xl]] and to combat terrorism [[xli]]. Computers now allow for the storage and searching of gathered data like never before [[xlii]]. The range of automated data collection devices continues to increase, spanning systems such as bar codes (with RF capabilities), magnetic-stripe cards, smart cards and a variety of biometric techniques, increasing the rapidity and ease with which information is gathered. RFID transponders especially have added a greater granularity of precision in in-building and campus-wide solutions, given the wireless edge, allowing information to be gathered within a closed environment, anywhere/anytime, transparently to the individual carrying the RFID badge or tag.
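The data-matching programs mentioned above amount, in essence, to joining agency records on the shared citizen ID and flagging discrepancies. A minimal sketch follows; the agencies, field names, IDs and figures are all hypothetical, invented purely to illustrate the mechanism:

```python
# Cross-agency data matching keyed on a unique citizen ID: flag cases
# where the income declared to one (hypothetical) agency disagrees with
# the income declared to another. All records below are invented.

tax_records = {
    "C1001": {"declared_income": 52_000},
    "C1002": {"declared_income": 18_500},
}
welfare_records = {
    "C1002": {"declared_income": 9_000},   # disagrees with the tax record
    "C1003": {"declared_income": 11_000},  # no tax record to match against
}

def match_records(tax: dict, welfare: dict) -> list:
    """Return citizen IDs present in both databases whose figures disagree."""
    flagged = []
    for cid in tax.keys() & welfare.keys():  # intersection on the shared ID
        if tax[cid]["declared_income"] != welfare[cid]["declared_income"]:
            flagged.append(cid)
    return flagged

print(match_records(tax_records, welfare_records))  # ['C1002']
```

The sketch also makes the privacy concern concrete: once a unique ID is shared across agencies, the join is trivial, which is exactly why the centralization trend described above attracts scrutiny.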

Now, while auto-ID itself is supposed to ensure privacy, it is the ease with which data can be collected that has some advocates concerned about the ultimate use of personal information. While the devices are secure, breaches in privacy can happen at any level, especially at the database level where information is ultimately stored after it is collected [[xliii]]. How this information is used, how it is matched with other data, and who has access to it is what has caused many citizens to be cautious about auto-ID in general [[xliv]]. Data mining has also altered how data is filtered, sifted and utilized, all in the name of customer relationship management (CRM). It is not difficult to obtain telemarketing lists, census information aggregated to a granular level, and mapping tools to represent market segments visually. Rothfeder [[xlv]] states:

[m]edical files, financial and personnel records, Social Security numbers, and telephone call histories- as well as information about our lifestyle preferences, where we shop, and even what car we drive- are available quickly and cheaply.

Looking forward, the potential for privacy issues linked to chip implants is something that has been considered, though mostly by the media. Privacy advocates warn that such a chip would impact civil liberties in a disastrous way [[xlvi]]. Even Warwick himself is aware that chip implants do not promote an air of assurance:

Looking back, Warwick admits that the whole experiment [Cyborg 1.0] “smacked of Big Brother.” He insists, however, that it’s important to raise awareness of what’s already technically possible so that we can remain in the driver’s seat. “I have a sneaking suspicion,” he says, “that as long as we’re gaining things, we’ll yell ‘Let’s have Big Brother now!’ It’s when we’re locked in and the lights start going off - then Big Brother is a problem.” [[xlvii]]

In this instance, Warwick has made an important observation. So long as individuals are “gaining” they generally will voluntarily part with a little more information. It is when they stop gaining and blatantly start being taken advantage of that the idea of Big Brother is raised. On that point, chip implants promise the convenience of not having to carry a multitude of auto-ID devices, perhaps not even a wallet or purse.

According to McGinity [18] “[e]xperts say it [the chip] could carry all your personal information, medical background, insurance, banking information, passport information, address, phone number, social security number, birth certificate, marriage license.” This kind of data collection is considered by civil libertarians to be “crypto-fascism or high-tech slavery” [[xlviii]]. The potential for abuse cannot be overstated [[xlix]]. Salkowski agrees pointing to the ADSX VeriChip system, stating that police, parents and ADSX employees could abuse their power. “It might even be possible for estranged spouses, employers and anyone else with a grudge to get their hands on tracking data through a civil subpoena” [[l]]. Hackers too, could try their hand at collecting data without the knowledge of the individual, given that wireless transmission is susceptible to interception. At the same time, the chip implant may become a prerequisite to health insurance and other services. “You could have a scenario where insurance companies refuse to insure you unless you agree to have a chip implant to monitor the level of physical activity you do” says Pearson of British Telecom [[li]]. This should not be surprising given that insurance companies already ask individuals for a medical history of illnesses upon joining a new plan. Proponents say the chip would just contain this information more accurately [7]. Furthermore, “[c]ost-conscious insurance companies are sure to be impressed, because the portability of biomems [i.e., a type of medical chip implant] would allow even a seriously ill patient to be monitored after surgery or treatment on an outpatient basis” [[lii]]. Now a chip storing personal information is quite different to one used to monitor health 24x7x365 and then to relay diagnoses to relevant stakeholders. 
As Chris Hoofnagle, an attorney for the Electronic Privacy Information Centre in Washington, D.C., pointed out, “[y]ou always have to think about what the device will be used for tomorrow” [[liii]]. In its essential aspect, this is exactly the void this paper has tried to fill.

5.2 Mandatory Proof of Identification

In the US in 2001, several bills were passed in Congress to create three new Acts related to the biometric identification of citizens and aliens: the Patriot Act, the Aviation and Transportation Security Act, and the Enhanced Border Security and Visa Entry Reform Act. If terrorist attacks continue to increase in frequency, there is a growing prospect of the use of chip implants for identification purposes and of GPS for tracking and monitoring. It is not an impossible scenario that one day these devices may be incorporated into national identification schemes. During the SARS (severe acute respiratory syndrome) outbreak, Singapore [[liv]] and Taiwan [[lv]] considered going as far as tagging their whole populations with RFID devices to automatically monitor the spread of the virus. Yet, independent of such random and sporadic events, governments worldwide are already moving toward the introduction of a single unique ID to cater for a diversity of citizen applications. Opinions on the possibility of widespread chip implants in humans range from “it would be a good idea,” to “it would be a good idea, but only for commercial applications, not government applications,” to “this should never be allowed to happen”. Leslie Jacobs, who was one of the first to receive a VeriChip, told Scheeres [[lvi]], “[t]he world would be a safer place if authorities had a tamper-proof way of identifying people… I have nothing to hide, so I wouldn’t mind having the chip for verification… I already have an ID card, so why not have a chip?” It should be noted that some tracking and monitoring systems can be turned off and on by the wearer, making monitoring theoretically voluntary [[lvii]]. Sullivan, a spokesperson for ADSX, said: “[i]t will not intrude on personal privacy except in applications applied to the tracking of criminals” [49].
ADSX has claimed on a number of occasions that it has received more than two thousand emails from teenagers volunteering to be the next to be “chipped” [[lviii]]. There are others, like McClimans [[lix]], who believe that everyone should get chipped. Cunha Lima, a Brazilian politician who also has a chip implant, is not ignorant of the potential for invasion of privacy but believes the benefits outweigh the costs, and that so long as the new technology is voluntary and not mandatory there is nothing to worry about. He has said, “[i]f one chooses to ‘be chipped,’ then one has considered the consequences of that action” [[lx]]. Lima argues that he feels more secure with an implant given the number of kidnappings of high-profile people in South America each year: at least this way his location is always known.

Professor Brad Myers of the Computer Science Department at Carnegie Mellon University believes that chip implant technology has a place but should not be used by governments. Yet the overriding sentiment is that chip implants will be used by government before too long. Salkowski [50] has said, “[i]f you doubt there are governments that would force at least some of their citizens to carry tracking implants, you need to start reading the news a little more often.” Black [53] echoes these sentiments: “Strictly voluntary? So far so good. But now imagine that same chip being used by a totalitarian government to keep track of or round up political activists or others who are considered enemies of the state. In the wrong hands, the VeriChip could empower the wrong people.” In a report written by Ramesh [[lxi]] for the Franklin Pierce Law Center, the prediction is made that:

[a] national identification system via microchip implants could be achieved in two stages: Upon introduction as a voluntary system, the microchip implantation will appear to be palatable. After there is a familiarity with the procedure and a knowledge of its benefits, implantation would be mandatory.

Bob Gellman, a Washington privacy consultant, likens this to “a sort of modern version of tattooing people, something that for obvious reasons- the Nazis tattooed numbers on people- no one proposes” [49, [lxii], [lxiii]]. The real issue at hand, as Gellman sees it, is “who will be able to demand that a chip be implanted in another person.” Mieszkowski supports Gray by observing how quickly a new technological “option” can become a requirement. Resistance after the voluntary adoption stage can be rather futile if momentum is carrying the device towards a mandatory role.

McMurchie [[lxiv]] reveals the subtle progression toward embedded devices:

[a]s we look at wearable computers, it’s not a big jump to say, ‘OK, you have a wearable, why not just embed the device?’… And no one can rule out the possibility that employees might one day be asked to sport embedded chips for ultimate access control and security…

Professor Chris Hables Gray uses the example of prospective military chip implant applications. How can a marine, for instance, resist implantation? Timothy McVeigh, the convicted Oklahoma City bomber, claimed that during the Gulf War he was implanted with a microchip against his will. The claims have been denied by the U.S. military [[lxv]]; however, the British Army is supposedly considering projects such as APRIL (Army Personnel Rationalization Individual Listings) [51]. Some cyberpunks have attempted to counteract the possibility of enforced implantation. One punk known by the name of “Z.L.” is an avid reader of MIT specialist publications like open|DOOR, the MIT magazine on bioengineering and beyond. Z.L.’s research has indicated that:

[i]t is only a matter of time… before technology is integrated within the body. Anticipating the revolution, he has already taught himself how to do surgical implants and other operations. “The state uses technology to strengthen its control over us,” he says. “By opposing this control, I remain a punk. When the first electronic tags are implanted in the bodies of criminals, maybe in the next five years, I’ll know how to remove them, deactivate them and spread viruses to roll over Big Brother” [25].

5.3 Health Risks

Public concern about electromagnetic fields from cellular phones was a contentious issue in the late 1990s. Now it seems that the majority of people in More Developed Countries (MDCs) have become so dependent on mobile phones that they disregard the potential health risks associated with the technology [[lxvi]]. Though very little has been proven conclusively, most terminal manufacturers do include a warning in their packaging, encouraging users not to touch the antenna of the phone during transmission [[lxvii]]. Covacio [11] is among the few authors to discuss the potential technological problems associated with microchips for human ID from a health perspective. In his paper he provides evidence as to why implants may impact humans adversely, categorizing the effects into thermal (i.e. whole or partial rise in body heating), stimulation (i.e. excitation of nerves and muscles) and other effects, most of which are currently unknown. He states that research into RFID and mobile telephone technology [11]:

...has revealed a growing concern with the effects of radio frequency and non-ionizing radiation on organic matter. It has been revealed a number of low-level, and possible high-level risks are associated with the use of radio-frequency technology. Effects of X-rays and gamma rays have been well documented in medical and electronic journals…

In considering future wearable devices, Salonen [[lxviii]] puts forward the idea of directing the antenna away from the head, where “there may be either a thermal insult produced by power deposition in tissue (acute effects) or other (long-term) effects”, to a position midway between the shoulder and elbow, where radiation can be directed outward from the body. Yet chip implants may also pose problems, particularly active implants, which contain batteries and are prone to leakage if transponders are accidentally broken. Geers et al. [[lxix]] write the following regarding animal implants:

Another important aspect is the potential toxic effect of the battery when using active transponders. Although it should be clear that pieces of glass or copper from passive tags are not allowed to enter the food chain. When using electronic monitoring with the current available technology, a battery is necessary to guarantee correct functioning of sensors when the transponder is outside the antenna field. If the transponder should break in the animal’s body, battery fluid may escape, and the question of toxological effects has to be answered.

In fact, we need only consider the very real problems that women with failed silicone breast implants have had to suffer. Will individuals with chip implants, twenty years down the track, be tied up in similar court battles and with severe medical problems? Surgical implantation, it must also be stated, causes some degree of stress in an animal, and it takes between four and seven days for the animal to return to equilibrium [69]. Most certainly some discomfort must be felt by humans as well. In the Cyborg 1.0 project, Warwick was advised to leave the implant under his skin for only ten days. According to Trull [[lxx]], Warwick was taking antibiotics to fight the possibility of infection. Warwick also reportedly told his son while playing squash during Cyborg 1.0: “Whatever you do, don’t hit my arm. The implant could just shatter, and you’ll have ruined your father’s arm for life” [[lxxi]]. It is also worth noting Warwick’s appearance after the Cyborg 2.0 experiment: he looked pale and weary in press release photographs, like someone who had undergone a major operation. Covacio [11] believes that widespread implantation of microchips in humans will ultimately have detrimental effects on them and on the environment at large. Satellite technology (i.e. the use of GPS to locate individuals), microwave RF and related technological gadgetry will ultimately “increase health problems and consequentially increase pressure on health services already under economic duress.”

6. Cultural Issues

6.1 The Net Generation

When the ENIAC was first made known to the public in February of 1946, reporters used “anthropomorphic” and “awesome characterizations” to describe the computer. The news was received with skepticism by citizens who feared the unknown. In an article titled “The Myth of the Awesome Thinking Machine”, Martin [[lxxii]] noted that the ENIAC was referred to in headlines as “a child, a Frankenstein, a whiz kid, a predictor and controller of weather, and a wizard”. Photographs of the ENIAC used in publications usually depicted the computer as completely filling a small room, wall-to-wall and floor-to-ceiling. People are usually shown interacting with the machine, feeding it instructions, waiting for results and monitoring its behavior. One could almost imagine that the persons in the photographs are ‘inside the body’ of the ENIAC [[lxxiii]]. Sweeping changes have taken place since that time, particularly since the mid-1980s. Consumers now own personal computers (PCs) in their homes (increasingly networked), carry laptop computers, mobile phones and chip cards, and closely interact with public automated kiosks. Relatively speaking, it has not taken long for people to adapt to the changes that this new technology has heralded. Today we speak of a Net Generation (N-Geners) who never knew a world without computers or the Internet [[lxxiv]]; for them the digital world is as ubiquitous as the air that they breathe. What is important to N-Geners is not how they got to where they are today but what digital prospects the future holds.

“[O]ur increasing cultural acceptance of high-tech gadgetry has led to a new way of thinking: robotic implants could be so advantageous that people might actually want to become cybernetic organisms, by choice. The popularization of the cyberpunk genre has demonstrated that it can be hip to have a chip in your head” [70].

6.2 Science Fiction Genre

The predictions of science fiction writers have often been promoted through print, sound and visual media. Below is a list of sci-fi novels, films and television series that undoubtedly have influenced, and are still influencing, the trajectory of auto-ID. Chris Hables Gray tells his students “…that a lot of the best cyborgology has been done in the mass media and in fiction by science fiction writers, and science fiction movie producers, because they’re thinking through these things” [[lxxv]]. The popular 1970s series The Six Million Dollar Man, for instance, began as follows: “We can rebuild him. We have the technology. We have the capability to make the world’s first Bionic man.” Today bionic limbs are a reality and no longer science fiction [[lxxvi]]. More recently, AT&T’s Wireless mMode magazine alluded to Star Trek [[lxxvii]]:

They also talked about their expectations- one media executive summed it up best, saying, “Remember that little box that Mr. Spock had on Star Trek? The one that did everything? That’s what I’d like my phone to be…”

Beyond auto-ID we find a continuing legacy in the sci-fi genre toward the electrification of humans- from Frankenstein to Davros in Dr Who, and from Total Recall to Johnny Mnemonic (see exhibit 1.2). While all this is indeed ‘merely’ sci-fi, it gives some form to the word, allowing the imagination to be captured in powerful images, sounds and models. What next? A vision of mechanized misery [[lxxviii]] as portrayed in Fritz Lang’s 1927 cult film classic Metropolis? Only this time, instead of being at the mercy of the Machine, we have gone one step further and invited the Machine to reside inside the body, hailing it as a ‘technological breakthrough’ as well. As several commentators have noted, “[w]e live in an era that… itself often seems like science fiction, and Metropolis has contributed powerfully to that seeming” [[lxxix]].

Exhibit 1.2     Sci-Fi Film Genre Pointing to the Electrification of Humans

Some of the more notable predictions and social critiques are contained within the following novels: Frankenstein (Shelley 1818), Paris in the 20th Century (Verne 1863), Looking Backward (Bellamy 1888), The Time Machine (Wells 1895), R.U.R. (Čapek 1920), Brave New World (Huxley 1932), 1984 (Orwell 1949), I, Robot (Asimov 1950), Foundation (Asimov 1951-53, 1982), 2001: A Space Odyssey (Clarke 1968), Blade Runner (Dick 1968), Neuromancer (Gibson 1984), The Marked Man (Ingrid 1989), The Silicon Man (Platt 1991), Silicon Karma (Easton 1997). The effect of film has been even more substantial on the individual, as films have put some form to the predictions. These include: Metropolis (Fritz Lang 1927), Forbidden Planet (Fred Wilcox 1956), Fail Safe (Sidney Lumet 1964), 2001: A Space Odyssey (Stanley Kubrick 1968), THX-1138 (George Lucas 1971), The Terminal Man (Mike Hodges 1974), Zardoz (John Boorman 1974), Star Wars (George Lucas 1977), Moonraker (Lewis Gilbert 1979), Star Trek (Robert Wise 1979), For Your Eyes Only (John Glen 1981), Blade Runner (Ridley Scott 1982), War Games (John Badham 1983), 2010: The Year We Make Contact (Peter Hyams 1984), RoboCop (Paul Verhoeven 1987), Total Recall (Paul Verhoeven 1990), The Terminator series, Sneakers (Phil Alden Robinson 1992), Patriot Games (Phillip Noyce 1992), The Lawnmower Man (Brett Leonard 1992), Demolition Man (Marco Brambilla 1993), Jurassic Park (Steven Spielberg 1993), Hackers (Iain Softley 1995), Johnny Mnemonic (Robert Longo 1995), The NET (Irwin Winkler 1995) [[lxxx]], Gattaca (Andrew Niccol 1997), Enemy of the State (Tony Scott 1998), Fortress 2 (Geoff Murphy 1999), The Matrix (L. Wachowski & A. Wachowski 1999), Mission Impossible 2 (John Woo 2000), The 6th Day (Roger Spottiswoode 2000).
Other notable television series include: Dr Who, Lost in Space, Dick Tracy, The Jetsons, Star Trek, Batman, Get Smart, Six Million Dollar Man, Andromeda, Babylon 5, Gasaraki, Stargate SG-1, Neon Genesis Evangelion, FarScape, and X-Files.  

6.3 Shifting Cultural Values

Auto-ID and more generally computer and network systems have influenced changes in language, art [[lxxxi]], music and film. An article by Branwyn [[lxxxii]] summarizes these changes well.

Language [[lxxxiii]]: “Computer network and hacker slang is filled with references to “being wired” or “jacking in” (to a computer network), “wetware” (the brain), and “meat” (the body)”.
Music: “Recent albums by digital artists Brian Eno, Clock DVA, and Frontline Assembly sport names like Nerve Net, Man Amplified and Tactical Neural Implant.” See also the 1978 album by Kraftwerk titled “The Man Machine”.
Film: “Science fiction films, from Robocop to the recent Japanese cult film Tetsuo: The Iron Man, imprint our imaginations with images of the new.”

Apart from the plethora of new terms that have been born from the widespread use of IT&T, and more specifically from extropians (many of which have religious connotations or allusions [[lxxxiv]]), it is art, especially body art, that is being heavily influenced by chip implant technology. Mieszkowski [49] believes that “chipification” will be the next big wave in place of tattoos, piercing and scarification (see exhibit 1.3). In the U.S. it was estimated in 2001 that about two hundred Americans had permanently changed their bodies, at around nine hundred dollars per implant, following a method developed by Steve Hayworth and Jon Cobb [25].

Exhibit 1.3     The New Fashion: Bar Code Tattoos, Piercing & Chips

Canadian artist Nancy Nisbet has implanted microchips in her hands to better understand how implant technology may affect human identity. The artist told Scheeres [[lxxxv]], “I am expecting the merger between human and machines to proceed whether we want it to or not…” As far back as 1997, Eduardo Kac “inserted a chip into his ankle during a live performance in Sao Paulo, then registered himself in an online pet database as both owner and animal” [86]. Perhaps Kac’s main contribution was not the actual implant ceremony but the subsequent registration in a pet database. Other artists, like Natasha Vita-More and Stelarc, have ventured beyond localized chip implants. Their vision is of a complete prosthetic body that will comprise nanotechnology, artificial intelligence, robotics, cloning, and even nanobots [75]. More calls her future body design Primo 3M Plus. Stelarc’s live performances, however, have been heralded as the closest thing there is to imagining a world in which the human body has become obsolete [[lxxxvi]].

A Stelarc performance… usually involves a disturbing mix of amplified sounds of human organs and techno beats, an internal camera projecting images of his innards, perhaps a set of robotic legs or an extra arm, or maybe tubes and wires connecting the performer’s body to the internet with people in another country manipulating the sensors, jerking him into a spastic dance. It’s a dark vision, but it definitely makes you think [75].

Warwick [[lxxxvii]] believes that the new technologies “will dramatically change [art], but not destroy it.”

6.4 Medical Marvels or Human Evolution

As Sacleman wrote in 1967, “...the impact of automation on the individual involve[d] a reconstruction of his values, his outlook and his way of life” [[lxxxviii]]. Marshall McLuhan [[lxxxix], [xc]] was one of the first explorers to probe how the psycho-social complex was influenced by electricity: “Electricity continually transforms everything, especially the way people think, and confirms the power of uncertainty in the quest for absolute knowledge” [[xci]]. Numerous examples can be given to illustrate these major cultural changes- from the use of electricity for household warmth, to wide area networks (WANs) enabling voice and data communications across long distances, to magnetic-stripe cards used for credit transactions [[xcii], [xciii], [xciv], [xcv]]. But what of the direct unification of humans and technology, i.e., the fusion between flesh and electronic circuitry [[xcvi], [xcvii], [xcviii]]? Consider for a moment the impact that chip implants have had on the estimated 23,000 cochlear implant recipients in the US. A medical marvel perhaps, but it too is not without controversy. There are potentially 500,000 hearing-impaired persons who could benefit from cochlear implants [[xcix]], but not every deaf person wants one.

Some deaf activists… are critical of parents who subject children to such surgery [cochlear implants] because, as one charged, the prosthesis imparts “the nonhealthy self-concept of having had something wrong with one’s body” rather than the “healthy self-concept of [being] a proud Deaf” [[c]].

Assistant Professor Scott Bally of Audiology at Gallaudet University has said: “Many deaf people feel as though deafness is not a handicap. They are culturally deaf individuals who have successfully adapted themselves to being deaf and feel as though things like cochlear implants would take them out of their deaf culture, a culture which provides a significant degree of support” [82].

Putting this delicate debate aside, it is here that some delineation can be made between implants that are used to treat an ailment or disability (i.e. giving sight to the blind and hearing to the deaf), and implants that may be used to enhance human function (e.g. memory). Some citizens are concerned about the direction of the human species as credible scientists predict fully functional neural implants. “[Q]uestions are raised as to how society as a whole will relate to people walking around with plugs and wires sprouting out of their heads. And who will decide which segments of the society become the wire-heads” [82]? Those who can afford the procedures, perhaps? And what of the possibility of brain viruses that could be fatal, and of technological obsolescence that may require people to undergo frequent operations? Maybury [[ci]] believes that humans are already beginning to suffer from a type of “mental atrophy” worse than that which occurred during the industrial revolution, and that the only way to fight it is to hang on to those essential skills that are required for human survival. The question remains whether it is society that shapes technology [[cii]] or technology that shapes society [[ciii]]. Inevitably it is a dynamic process of push and pull that causes cultural transformations over time.

7 Religious Issues

7.1 The Mark of the Beast

Ever since the bar code symbology UPC (Universal Product Code) became widespread, some Christian groups have linked auto-ID to the “mark” in the Book of Revelation (13:18): “the number of the beast… is 666” [[civ], [cv], [cvi]]. Coincidentally, the left (101), centre (01010) and right (101) guard patterns of the UPC resemble the bar pattern that encodes the digit 6- that is, 6, 6, 6 (see exhibit 1.4). As it became an established standard for every non-perishable item to be bar coded, a close association was drawn with the prophecy: “so that no one could buy or sell unless he had the mark” (Rev 13:17). In full, verses 16-18 of chapter 13 of Revelation read as follows:

He also forced everyone, small and great, rich and poor, free and slave, to receive a mark on his right hand or on his forehead, so that no one could buy or sell unless he had the mark, which is the name of the beast or the number of his name. This calls for wisdom. If anyone has insight, let him calculate the number of the beast, for it is man’s number. His number is 666. [[cvii]]

According to some Christians, this reference would appear to allude to a mark on or in the human body, the prediction being made that the UPC would eventually end up on or under human skin [[cviii]]. As the selection environment of auto-ID devices grew, the interpretation of the prophecy developed further as to the actual guise of the mark: it was no longer interpreted to be ‘just’ the bar code (see exhibit 1.4). Some of the more prominent religious web sites that discuss auto-ID and the number of the beast include: http://www.666soon.com (2003), http://www.greaterthings.com (2003), http://www.countdown.com.org (2003), http://www.raidersnewsupdate.com (2003), http://www.light1998.com (2003) and http://www.av1611.org (1996). At first the sites focused on bar code technology; now they have grown to encompass a plethora of auto-ID technologies, especially biometrics and looming chip implants. For a thorough analysis of the background, sources and interpretation of the “number of the beast” see M.G. Michael’s thesis [[cix]].
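The guard-bar coincidence at the root of this association is easy to check against the published UPC-A encoding tables. The short Python sketch below uses the standard right-hand digit patterns (1 = bar module, 0 = space module) to show that the left, centre and right guards all reduce to the bar run 101, which is also how the right-hand code for the digit 6 (1010000) begins; note that in the specification the guards merely frame the symbol and carry no digit value at all.

```python
# Standard UPC-A right-hand (even-parity) digit patterns: 1 = bar, 0 = space.
RIGHT = {
    "0": "1110010", "1": "1100110", "2": "1101100", "3": "1000010",
    "4": "1011100", "5": "1001110", "6": "1010000", "7": "1000100",
    "8": "1001000", "9": "1110100",
}

# Guard (border) patterns that frame every UPC-A symbol.
GUARDS = {"left": "101", "centre": "01010", "right": "101"}

for name, pattern in GUARDS.items():
    bars = pattern.strip("0")  # drop leading/trailing space modules
    # Each guard's visible bars match the opening modules of the code for 6,
    # which is the whole basis of the popular "6, 6, 6" reading.
    print(f"{name} guard {pattern}: bars {bars}, "
          f"matches start of '6' ({RIGHT['6']}): {RIGHT['6'].startswith(bars)}")
```

The sketch illustrates why the reading is called a coincidence of appearance rather than of encoding: the guards are fixed delimiters, and no digit is actually encoded at those positions.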

Card technology such as magnetic-stripe and smart cards became the next focus, as devices that would gradually pave the way for a permanent ID for all citizens globally: “He also forced everyone, small and great, rich and poor, free and slave, to receive a mark…” (Rev 13:16). When biometrics was introduced, the association was immediately made that the “mark” [charagma] would appear on the “right hand” (i.e. palmprint or fingerprint) or on the “forehead” (facial/iris recognition) as was supposedly prophesied (Rev 13:16). For the uses of charagma in early Christian literature see Arndt and Gingrich [[cx]]. Short of calling this group of people fundamentalists, as Woodward [15] calls one prominent leader, Davies is more circumspect [[cxi]]:

“I think they’re legitimate [claims]. People have always rejected certain information practices for a variety of reasons: personal, cultural, ethical, religious and legal. And I think it has to be said that if a person feels bad for whatever reason, about the use of a body part then that’s entirely legitimate and has to be respected”.

Finally, RFID transponders made their way into pets and livestock for identification, and that is when some Christian groups announced that the ‘authentic’ mark was now possible, and that it was only a matter of time before it would find its way into citizen applications [[cxii]]. Terry Cook [[cxiii]], for instance, an outspoken religious commentator and popular author, “worries the identification chip could be the ‘mark of the beast’, an identifying mark that all people will be forced to wear just before the end times, according to the Bible” [[cxiv]]. The description of an implant procedure for sows given by Geers et al. [69], especially the section about an incision being made in the skin, is what some religious advocates fear may happen to humans as well in the future.

When the thermistor was implanted the sows were restrained with a lasso. The implantation site was locally anaesthetized with a procaine (2%) injection, shaved and disinfected. After making a small incision in the skin, the thermistor was implanted subcutaneously, and the incision was closed by sewing. The thermistor (accuracy 0.1°C) was wire-connected to a data acquisition system linked to a personal computer.

“Religious advocates say it [i.e. transponder implants] represents ‘the mark of the Beast’, or the anti-Christ” [[cxv]]. Christians who take this mark, for whatever reason, are said to be denouncing the seal of baptism, and accepting the Antichrist in place of Christ [[cxvi], [cxvii], [cxviii]]. Horn [[cxix]] explains:

[m]any Christians believe that, before long, an antichrist system will appear. It will be a New World Order, under which national boundaries dissolve, and ethnic groups, ideologies, religions, and economics from around the world, orchestrate a single and dominant sovereignty… According to popular Biblical interpretation, a single personality will surface at the head of the utopian administration… With imperious decree the Antichrist will facilitate a one-world government, universal religion, and globally monitored socialism. Those who refuse his New World Order will inevitably be imprisoned or destroyed.

References discussing the New World Order include Barnet and Cavanagh [[cxx]], Wilshire [[cxxi]], and Smith [[cxxii]].

Exhibit 1.4     The Mark of the Beast as Shown on GreaterThings.com

Companies that specialize in the manufacture of chip implant solutions, whether for animals or for humans, have been targeted by some religious advocates. The bad publicity has not been welcomed by these companies- some have even notably “toned down” the graphic visuals on their web sites so that they do not attract the wrong ‘type’ of web surfers. While they are trying to promote an image of safety and security, some advocates have associated company brands and products with apocalyptic labels. Some of the company and product names include: Biomark, BioWare, BRANDERS, MARC, Soul Catcher, Digital Angel and Therion Corporation. Perhaps the interesting thing to note is that religious advocates and civil libertarians agree that ultimately chip implant technology will be used by governments to control citizens. ADSX is one of the companies that has publicly stated that it does not want adverse publicity after pouring hundreds of thousands of dollars into research and development and the multi-million dollar purchase of the Destron Fearing company. So concerned was the company that it even appeared on the Christian talk show The 700 Club, emphasizing that the device would create a lot of benefits and was not meant to fulfill prophecy [60]. A spokesperson for ADSX said: “[w]e don’t want the adverse publicity. There are a number of privacy concerns and religious implications- fundamentalist Christian groups regard [i.e., implanting computer chips] as the Devil’s work” [51]. According to Gary Wohlscheid, the president of The Last Day Ministries, the VeriChip could well be the mark. Wohlscheid believes that of all the auto-ID technologies with the potential to be the mark, the VeriChip is the closest. Of the VeriChip he says, however: “[i]t’s definitely not the final product, but it’s a step toward it. Within three to four years, people will be required to use it. Those that reject it will be put to death” [56].
These are, of course, the positions of those who have entered the debate from the so-called fundamentalist literalist perspective and represent the more vocal and visible spectrum of contemporary “apocalyptic” Christianity. In this context, “fundamentalism” seems to be a common label today for anyone within the Christian community who questions the trajectory of technological advancement.

With respect to the potential of brain chips in the perceived quest for “immortality” [13, 14], many Christians across the denominational confessions see this as an attempt to usurp the Eternal Life promised by God, in Jesus Christ, through the Holy Spirit. This is similar to the case of human cloning, where specialist geneticists are accused of trying to play God by usurping the Creator’s role. However, the area is notoriously grey: when, for instance, do implants for medical breakthroughs become acceptable, as opposed to those required purely for identification? In the future the technology in question could end up merging the two functions on a single device. This is a real and very possible outcome when all factors, both market and ethical, are taken on board by the relevant stakeholders. Ultimately, for most members of a believing religious community, this subject revolves around the most important question of individual freedom and the right to choose [[cxxiii], [cxxiv]].

8. Ethical Issues

In an attempt to make our world a safer place we have inadvertently infringed on our privacy and our freedom through the use of surveillance cameras and other ancillary technologies. We equip our children with mobile phones, attach tracking devices to them or make them carry such devices [[cxxv]] in their bags, and soon we might even be implanting them with microchips [[cxxvi]]. This all comes at a price- yet it seems more and more people are willing to pay it as heinous crimes become common events in a society that should know better. Take the example of 11-year-old Danielle Duval, who is about to have an active chip (i.e. containing a rechargeable battery) implanted in her. Her mother believes that it is no different from tracking a stolen car, simply that it is being used for another, more important application. Mrs Duval is considering implanting her younger daughter, aged 7, as well, but will wait until the child is a bit older, “so that she fully understands what’s happening” [[cxxvii]]. One could be excused for asking whether Danielle, at the age of 11, can actually fully comprehend the implications of the procedure she is about to undergo. It seems that the age of consent would be a more appropriate age.

Warwick has said that an urgent debate is required on this matter (i.e. whether every child should be implanted by law), and on whether signals from the chips should be emitted on a 24x7 basis or only triggered during emergencies. Warwick holds the position that “we cannot prejudge ethics” [87]. He believes that ethics can only be debated, and conclusions reached, after people become aware of the technical possibilities once they have been demonstrated. He admits that ethics may differ between countries and cultures [[cxxviii]]. The main ethical problem related to chip implants seems to be that they are under the skin [70] and cannot simply be removed by the user at their convenience. In fact, there is nothing to stop anyone from getting multiple implants all over their body, rendering some applications useless. Tien of the Electronic Frontier Foundation (EFF) is convinced that if a technology is there to be abused, whether it is chip implants or national ID cards, then it will be, because that is just human nature [[cxxix]]. Similarly, Kidscape, a charity aimed at reducing the incidence of sexual abuse of children, believes that implants will not act to curb crime. Kidscape holds the position that rather than being given a false sense of security by a tracking implant that could be tampered with by an offender, children should be educated on the possible dangers. Implanted tracking devices may sound entirely foolproof, but emergency personnel, whether police or ambulance, cannot just magically appear at the scene of a crime in time to stop an offender from committing violence against a hostage.

8.1 The Prospect of International ID Implants

There are numerous arguments for why implanting a chip in a person is outright unconstitutional. But perhaps the under-explored area, as Gellman puts it, concerns the legal and social issues of who would have power over the chip and the information gathered by its means [49]. Gellman is correct in his summation of the problem, but science has a proven way of going into uncharted territory first and asking the questions about implications later. ADSX, for instance, has already launched the VeriChip solution. Sullivan, a spokesperson for the company, told Salkowski [50]:

“I’m certainly not a believer in the abuse of power,” he offered, suggesting that Congress could always ban export of his company’s device. Of course, he admits he wouldn’t exactly lobby for that law. “I’m a businessman,” he said.

Black [53] makes the observation that the US government might well place constraints on international sales of the VeriChip if it felt the device could be used against it by an enemy. Consider the governance issues surrounding GPS technology, which has been in operation far longer than human RFID implants.

“Good, neutral, or perhaps undesirable outcomes are now possible… Tension arises between some of the civil/commercial applications and the desire to preclude an adversary’s use of GPS. It is extremely difficult (technically, institutionally, politically, and economically) to combine the nonmilitary benefits of the system that require universality of access, ease of use, and low cost with military requirements for denial of the system to adversaries. Practical considerations require civil/commercial applications to have relatively easy access” [[cxxx]].

From a different angle, Rummler [131] points out that the monitoring and tracking of individuals raises serious legal implications regarding the individual’s capacity to maintain their right to freedom. He wrote: “[o]nce implanted with bio-implant electronic devices, humans might become highly dependent on the creators of these devices for their repair, recharge, and maintenance. It could be possible to modify the person technologically… thus placing them under the absolute control of the designers of the technology.” The Food and Drug Administration’s (FDA) Dr. David Feigal has been vocal about the need for devices such as the VeriChip not to take medical applications lightly, and for companies wishing to specialize in health-related implants to be in close consultation with the FDA [132], [133]. There is also the possibility that such developments, i.e. the regulation of chip implants, may ultimately be used against an individual. The Freedom of Information Act, for instance, already allows U.S. authorities to access automatic vehicle toll-passes to provide evidence in court [2]; there is nothing to suggest this will not happen with RFID transponder implants as well, despite the myriad promises made by ADSX. Professor Gray is adamant that there is no stopping technological evolution, no matter how sinister some technologies may appear, and that we need to become accustomed to the fact that new technologies will continually infringe upon the constitution [49].

8.2 Beyond Chip Implants

Luggables, like mobile phones, do create a sense of attachment between the user and the device, but the devices are still physically separate; they can accidentally be left behind. Wearable computers, on the other hand, are a part of the user: they are worn, and they “create an intimate human-computer-symbiosis in which respective strengths combine” [134]. Mann calls this human-computer symbiosis “humanistic intelligence” (HI), as opposed to HCI (human-computer interaction).

[W]e prefer not to think of the wearer and the computer with its associated I/O apparatus as separate entities. Instead, we regard the computer as a second brain and its sensory modalities as additional senses, which synthetic synesthesia merges with the wearer’s senses. [135]

Exhibit 1.5     The Process of Transformation

Human-computer electrification is set to make this bond irrevocable (see exhibit 1.5). Once on that path there is no turning back. If at present all this seems impossible, a myth, a far-fetched prediction, owing to end-user resistance and other obstacles facing the industry today, history should teach us otherwise. This year alone, millions of babies will be born into a world in which companies listed on the New York Stock Exchange specialize in chip implant devices for humans. “They” will grow up believing that these technologies are not only “normal” but also quite useful, just like other high-tech technologies before them such as the Internet, PCs, and smart cards. Consider the case of Cynthia Tam, aged two, who is an avid computer user:

“[i]t took a couple of days for her to understand the connection between the mouse in her hand and the cursor on the screen and then she was off… The biggest problem for Cynthia’s parents is how to get her to stop… for Cynthia, the computer is already a part of her environment… Cynthia’s generation will not think twice about buying things on the Internet, just like most people today don’t think twice when paying by credit card, or using cash points for withdrawals and deposits” [136].

But you do not have to be a newborn baby to adapt to technological change. Even grandmothers and grandfathers surf the web these days and send emails as a cheaper alternative to post or telephone [74]. And migrants struggling with a foreign language will memorize key combinations to withdraw money even if they do not fully understand the actions they are commanding throughout the process. Schiele [137] believes that our personal habits are shaped by technological change and that, over time, new technologies that at first seem appropriate only for technophiles eventually find themselves being used by the average person: “[o]ver time our culture will adjust to incorporate the devices.” Gotterbarn is in agreement [10]:

We enthusiastically adopt the latest gadget for one use, but then we start to realize that it gives us power for another use. Then there is the inevitable realization that we have overlooked the way it impacts other people, giving rise to professional and ethical issues.

What is apparent regardless of how far electrophoresis is taken is that the once irreconcilable gap between human and machine is closing (see exhibit 1.6).

Beyond chip implants for tracking there are the possibilities associated with neural prosthetics and the potential to directly link computers to humans [138]. Warwick is also well aware that one of the major obstacles to cyber-humans is the associated moral issues [139], [140]: who gives anyone the right to conduct complex procedures on a perfectly healthy person, and who will take responsibility for any complications that present themselves? Rummler [131] asks whether it is ethical to link computers to humans in the first place, and whether limitations should be placed on what procedures can be conducted even if they are possible. For instance, could this be considered a violation of human rights? And, more to the point, what will it mean in the future to call oneself “human”? McGrath [141] asks “how human”?

As technology fills you up with synthetic parts, at what point do you cease to be fully human? One quarter? One third?... At bottom lies one critical issue for a technological age: are some kinds of knowledge so terrible they simply should not be pursued? If there can be such a thing as a philosophical crisis, this will be it. These questions, says Rushworth Kidder, president of the Institute for Global Ethics in Camden, Maine, are especially vexing because they lie at “the convergence of three domains- technology, politics and ethics- that are so far hardly on speaking terms.”

At the point of becoming an electrophorus (i.e. a bearer of electricity), “[y]ou are not just a human linked with technology; you are something different and your values and judgment will change” [142]. Some suspect that it will even become possible to alter behavior in people with brain implants [51], whether they will it or not. Maybury [101] believes that “[t]he advent of machine intelligence raises social and ethical issues that may ultimately challenge human existence on earth.”

 


Exhibit 1.6     Marketing Campaigns that Point to the Electrophorus

Gotterbarn [10] argues that our view of computer technologies generally progresses through several stages:

1) naïve innocence and technological wonder, 2) power and control, and 3) finally, sometimes because of disasters during the second stage, an understanding of the essential relationship between technologies and values.

Bill Joy, the chief scientist of Sun Microsystems, feels a sense of unease about the predictions made by Ray Kurzweil in The Age of Spiritual Machines [138], not only because Kurzweil has proven technically competent in the past, but because of his ultimate vision for humanity: “a near immortality by becoming one with robotic technology” [143]. Joy was severely criticized for being narrow-minded, even a fundamentalist of sorts, after publishing his paper in Wired, but all he did was dare to ask the questions: ‘do we know what we are doing? Has anyone really carefully thought about this?’ Joy believes [143]:

[w]e are being propelled into this new century with no plan, no control, no brakes. Have we already gone too far down the path to alter course? I don’t believe so, but we aren’t trying yet, and the last chance to assert control- the fail-safe point- is rapidly approaching.

Surely there is a pressing need for ethical dialogue [144] on auto-ID innovation and, more generally, on IT&T. If there has ever been a time when engineers have had to act in a socially responsible manner [145], it is now, as we stand at a defining crossroads.

The new era of biomedical and genetic research merges the worlds of engineering, computer and information technology with traditional medical research. Some of the most significant and far-reaching discoveries are being made at the interface of these disciplines. [146]

9. Conclusion

The principal objective of this paper was to encourage critical discussion on the exigent topic of human implants in e-business applications by documenting the central social, cultural, religious and ethical issues. The evidence provided indicates that technology-push, rather than market-pull, has been the driving force behind many of the new RFID transponder implant applications. What is most alarming is the rate of change in technological capabilities without a commensurate response in the form of informed community involvement or ethical discourse on what these changes actually “mean”, not only for the present but also for the future. It seems that the norm now is to introduce a technology, stand back to observe its general effects on society, and then act to rectify problems as they arise. The concluding point of this paper is that the long-term side effects of a technology should be considered at the outset and not after the event. One need only bring to mind the Atomic Bomb and the Chernobyl disaster to see what is possible, if not inevitable, once a technology is set on its ultimate trajectory [103]. As citizens it is our duty to remain knowledgeable about scientific developments and to discuss the possible ethical implications again and again [10]. In the end we can point the finger at the Mad Scientists [75], but we too must be socially responsible, lest we become our own worst enemy [147]. It is certainly a case of caveat emptor: let the buyer beware.

10. References

[1] Cohen, T., The Tattoo, Savvas, Sydney (1994).

[2] Sanchez-Klein, J., “Cyberfuturist plants chip in arm to test human-computer interaction”, CNN Interactive, http://www.cnn.com/TECH/computing/9808/28/armchip.idg/index.html, [Accessed 28 August 1998], pp. 1-2 (1998).

[3] Jones, C., “Kevin Warwick: Saviour of humankind?”, BBC News, http://news.bbc.co.uk/2/hi/in_depth/uk/2000/newsmakers/1069029.stm, [Accessed 4 January 2003], pp. 1-4 (2000).

[4] ADSX, “Homepage”, http://www.adsx.com, Applied Digital Solutions, [Accessed 1 March 2004], p. 1 (2004).

[5] ADSX, “VeriChip Corporation”, Applied Digital Solutions, http://www.4verichip.com/, [Accessed 1 April 2004], p. 1 (2004).

[6] Warwick, K., “Professor of Cybernetics, University of Reading”, Kevin Warwick, http://www.kevinwarwick.com, [Accessed 14 November 2002], pp. 1-2 (2002).

[7] Goldman, J., “Meet ‘The Chipsons’: ID chips implanted successfully in Florida family”, ABC News: techtv, http://abcnews.go.com/sections/scitech/TechTV/techtv_chipfamily020510.html, [Accessed 13 November 2003], pp. 1-2 (2002).

[8] Ramesh, E.M., “Time Enough: consequences of the human microchip implantation”, Franklin Pierce Law Centre, http://www.fplc.edu/risk/vol8/fall/ramesh.htm, [Accessed 1 March 2004], pp. 1-26 (2004).

[9] Unatin, D., “Progress v. Privacy: the debate over computer chip implants”, JOLT: Notes, http://www.lawtechjournal.com/notes/2002/24_020819_unatin.php, [Accessed 1 March 2004], pp. 1-3 (2002).

[10] Gotterbarn, D., “Injectable computers: once more into the breach! The life cycle of computer ethics awareness”, inroads- The SIGCSE Bulletin, Vol. 35, No. 4, December, pp. 10-12, (2003).

[11] Covacio, S., “Technological problems associated with the subcutaneous microchips for human identification (SMHId)”, InSITE - Where Parallels Intersect, June, pp. 843-853 (2003).

[12] Warwick, K., “I, Cyborg”, 2003 Joint Lecture: The Royal Society of Edinburgh and The Royal Academy of Engineering, The Royal Society of Edinburgh, pp. 1-16 (2003).

[13] Norman, D.A., “Cyborgs”, Communications of the ACM, Vol. 44, No. 3, March, pp. 36-37 (2001).

[14] Bell, G. & Gray, J., “Futuristic forecasts of tools and technologies: digital immortality”, Communications of the ACM, March, Vol. 44, No. 3, pp. 29-31 (2001).

[15] Woodward, J.D., “Biometrics: privacy’s foe or privacy’s friend?”, Proceedings of the IEEE, Vol. 85, No. 9, pp. 1480-1492 (1997).

[16] Rosenberg, R.S., The Social Impact of Computers, Elsevier Academic Press, California (2004).

[17] Lemonick, M.D., “Future tech is now”, Time Australia, 17 July, pp. 44-79 (1995).

[18] McGinity, M., “Body of the technology: It’s just a matter of time before a chip gets under your skin”, Communications of the ACM, 43(9), September, pp. 17-19 (2000).

[19] Stephan, R., “The ultrahuman revolution”, MoneyZone.com, http://www.moneyzone.com/MTM_features3.28.cfm, [Accessed 29 November 2001], pp. 1-3 (2001).

[20] Sheridan, J.G. et al., “Spectators at a geek show: an ethnographic inquiry into wearable computing”, IEEE The Fourth International Symposium on Wearable Computers, pp. 195-196 (2000).

[21] Lukowicz, P., “The wearARM modular low-power computing core”, IEEE Micro, May-June, pp. 16-28 (2001).

[22] DeFouw, G. & Pratt, V., “The matchbox PC: a small wearable platform”, The Third International Symposium on Wearable Computers, pp. 172-175 (1999).

[23] Salonen, P. et al., “A small planar inverted-F antenna for wearable applications”, IEEE Tenth International Conference on Antennas and Propagation, Vol. 1, pp. 82-85 (1997).

[24] Mann S., “Wearable computing: a first step toward personal imaging”, IEEE Computer, February, pp. 25-32 (1997).

[25] Millanvoye, M., “Teflon under my skin”, UNESCO, http://www.unesco.org/courier/2001_07/uk/doss41.htm, [Accessed 29 November 2001], pp. 1-2 (2001).

[26] Furui, S., “Speech recognition technology in the ubiquitous/wearable computing environment”, IEEE International Conference on Acoustics, Speech, and Signal Processing, Vol. 6, pp. 3735-3738 (2000).

[27] Pickering, C., “Silicon man lives”, Forbes ASAP, http://www.cochrane.org.uk/opinion/interviews/forbes.htm, [Accessed 22 November 2001], pp. 1-2 (1999).

[28] Sydänheimo, L. et al., “Wearable and ubiquitous computer aided service, maintenance and overhaul”, IEEE International Conference on Communications, Vol. 3, pp. 2012-2017 (1999).

[29] Rhodes, B. J. et al., “Wearable computing meets ubiquitous computing: reaping the best of both worlds”, The Third International Symposium on Wearable Computers, pp. 141-149 (1999).

[30] Kaku, M., Visions: how science will revolutionize the 21st century and beyond, Oxford University Press, Oxford (1998).

[31] van Laerhoven, K. & Cakmacki, O., “What shall we teach our pants?”, IEEE The Fourth International Symposium on Wearable Computers, pp. 77-83 (2000).

[32] Kortuem, G. et al., “Context-aware, adaptive wearable computers as remote interfaces to ‘intelligent’ environments”, Second International Symposium on Wearable Computers, pp. 58-65 (1998).

[33] Starner, T., “The challenges of wearable computing: part 2”, IEEE Micro, July-August, pp. 54-67 (2001).

[34] Mills, S. (ed.), Turning Away From Technology: a new vision for the 21st century, Sierra Club Books, San Francisco (1997).

[35] Davies, S., Big Brother: Australia’s growing web of surveillance, Simon and Schuster, Sydney (1992).

[36] Davies, S., Monitor: extinguishing privacy on the information superhighway, PAN, Sydney (1996).

[37] Hibbert, C., “What to do when they ask for your social security number”, in Computerization and Controversy: value conflicts and social choices, (ed.) Rob Kling, Academic Press, New York, pp. 686-696 (1996).

[38] Kusserow, R.P., “The government needs computer matching to root out waste and fraud”, in Computerisation and Controversy: value conflicts and social choices, (ed.) Rob Kling, Academic Press, New York, part 6, section E, pp. 653f (1996).

[39] Privacy Commissioner, Selected Extracts from the Program Protocol Data-Matching Program (Assistance and Tax), Privacy Commission, Sydney (1990).

[40] Jones, D., “UK government launches smart card strategy”, Ctt, Vol. 11, No. 6, February, p. 2 (2000).

[41] Michels, S., “National ID”, Online NewsHour, http://www.pbs.org/newshour/bb/fedagencies/jan-june02/id_2-26.html, [Accessed 2 September 2001], pp. 1-8 (2002).

[42] Rosenberg, R.S., The Social Impact of Computers, Sydney, Elsevier, pp. 339-405 (2004).

[43] Brin, D., The Transparent Society: will technology force us to choose between privacy and freedom, Perseus Books, Massachusetts (1998).

[44] Branscomb, A. W., Who Owns Information: from privacy to public access, BasicBooks, USA (1994).

[45] Rothfeder, J., “Invasion of privacy”, PC World, Vol. 13, No. 11, pp. 152-162 (1995).

[46] Newton, J. “Reducing ‘plastic’ counterfeiting”, European Convention on Security and Detection, Vol. 408, pp. 198-201 (1995).

[47] Masterson, U.O., “A day with ‘Professor Cyborg’”, MSNBC, http://www.msnbc.com/news/394441.asp, [Accessed 29 November 2001], pp. 1-6 (2000).

[48] Associated Press, “Chip in your shoulder? Family wants info device”, USA Today: Tech, http://www.usatoday.com/life/cyber/tech/2002/04/01/verichip-family.htm, [Accessed 15 October 2002], pp. 1-2 (2002).

[49] Mieszkowski, K., “Put that silicon where the sun don’t shine”, Salon.com, http://www.salon.com/tech/feature/2000/09/07/chips/, Parts 1-3, [Accessed 11 November 2001], pp. 1-3 (2000).

[50] Salkowski, J., “Go track yourself”, StarNet Dispatches, http://dispatches.azstarnet.com/joe/2000/0104-946929954.htm, [Accessed 29 November 2001], pp. 1-4 (2000).

[51] LoBaido, A.C., “Soldiers with microchips: British troops experiment with implanted, electronic dog tag”, WorldNetDaily.com, http://www.fivedoves.com/letters/oct2001/chrissa102.htm, [Accessed 20 November 2001], pp. 1-2 (2001).

[52] Swissler, M.A., “Microchips to monitor meds”, Wired, http://www.wired.com/news/technology/0,1282,39070,00.html, [Accessed 29 November 2001], pp. 1-3 (2000).

[53] Black, J., “Roll up your sleeve – for a chip implant”, Illuminati Conspiracy, http://www.conspiracyarchive.com/NWO/chip_implant.htm, [Accessed 15 October 2002], pp. 1-6 (2002).

[54] RFID, “Singapore fights SARS with RFID”, RFID Journal, http://216.121.131.129/article/articleprint/446/-1/1/, [Accessed 1 May 2004], pp. 1-2 (2003).

[55] RFID, “Taiwan uses RFID to combat SARS”, RFID Journal, http://216.121.131.129/article/articleprint/520/-1/1/, [Accessed 1 May 2004], pp. 1-2 (2003).

[56] Scheeres, J. “They want their id chips now”, Wired News, http://www.wired.com/news/privacy/0,1848,50187,00.html, [Accessed 15 October 2002], pp. 1-2 (2002).

[57] Wherify, “Frequently Asked Questions”, Wherify Wireless, http://www.wherifywireless.com/faq.asp, [Accessed 15 April 2004], pp. 1-7 (2004).

[58] Scheeres, J., “Kidnapped? GPS to the rescue”, Wired News, http://www.wired.com/news/business/0,1367,50004,00.html, [Accessed 15 October 2002], pp. 1-2 (2002).

[59] McClimans, F., “Is that a chip in your shoulder, or are you just happy to see me?”, CNN.com, http://www.cnn.com/TECH/computing/9809/02/chippotent.idg/index.html, [Accessed 22 November 2001], pp. 1-4 (1998).

[60] Scheeres, J., “Politician wants to ‘get chipped’”, Wired News, http://www.wired.com/news/technology/0,1282,50435,00.html, [Accessed 15 October 2002], pp. 1-2 (2002).

[61] Horn, T., “Opinionet contributed commentary”, Opinionet, http://www.opinionet.com/commentary/contributors/ccth/ccth13.htm, [Accessed 29 November 2001], pp. 1-4 (2000).

[62] Levi, P., The Drowned and the Saved, trans. Raymond Rosenthal, Summit Books, London (1988).

[63] Lifton, R.J., The Nazi Doctors: medical killing and the psychology of genocide, Basic Books, New York (1986).

[64] McMurchie, L., “Identifying risks in biometric use”, Computing Canada, Vol. 25, No. 6, p. 11, (1999).

[65] Nairne, D., “Building better people with chips and sensors”, scmp.com, http://special.scmp.com/mobilecomputing/article/FullText_asp_ArticleID-20001009174, [Accessed 29 November 2001], pp. 1-2 (2000).

[66] National Radiological Protection Board, “Understanding radiation: ionizing radiation and how we are exposed to it”, NRPB, http://www.nrpb.org/radiation_topics/risks/index.htm, [Accessed 1 May 2004], pp. 1-2 (2004).

[67] Australian Communications Authority, Human exposure to radiofrequency electromagnetic energy: information for manufacturers, importers, agents, licensees or operators of radio communications transmitters, Australian regulations, Melbourne (2000).

[68] Salonen, P. et al., “A small planar inverted-F antenna for wearable applications”, IEEE Tenth International Conference on Antennas and Propagation, Vol. 1, pp. 82-85 (1997).

[69] Geers, R. et al., Electronic Identification, Monitoring and Tracking of Animals, CAN International, New York (1997).

[70] Trull, D., “Simple Cyborg”, Parascope, http://www.parascope.com/articles/slips/fs29_2.htm, [Accessed 20 November 2001], pp. 1-4 (1998).

[71] Witt, S., “Professor Warwick chips in”, Computerworld, 11 January, p. 89 (1999).

[72] Martin, C.D., “The myth of the awesome thinking machine”, Communications of the ACM, 36(4), pp. 120-133 (1993).

[73] Michael, K., “The automatic identification trajectory: from the ENIAC to chip implants”, in Internet Commerce: digital models for business, E. Lawrence et al., John Wiley and Sons, Queensland, pp. 131-134, 136 (2002).

[74] Tapscott, D., Growing up digital: the rise of the net generation, McGraw- Hill, New York (1998).

[75] Walker, I., “Cyborg dreams: Beyond Human”, Background Briefing ABC Radio National, 4 November, pp. 1-15 (2001).

[76] Anonymous, “Will a chip every day keep the doctor away?”, PhysicsWeb, http://physicsweb.org/article/world/14/7/11, [Accessed 29 November 2001], pp. 1-2 (2001).

[77] Goldberg, H., “Building a better mMode”, mMode Magazine, http://www.mmodemagazine.com/features/bettermmode.asp, [Accessed 1 April 2004], pp. 1-4 (2004).

[78] Wilmington, M., “Movie review, ‘Metropolis (Re-release)’”, Metromix.com, http://metromix.chicagotribune.com/search/mmx-17922_lgcy.story, [Accessed 3 May 2004], pp. 1-3 (2004).

[79] McRoy, J., “Science fiction studies”, DePauw University, Vol. 28, No. 3, http://www.depauw.edu/sfs/birs/bir85b.htm, [Accessed 3 May 2004], pp. 1-3 (2001).

[80] Anonymous, “The NET”, MovieWeb, http://movieweb.com/movie/thenet/ index.html, [Accessed 3 May 2004], pp. 1-5 (2001).

[81] King, B., “Robots: It’s an art thing”, http://www.wired.com/news/print/0,1294,48253,00.html, [Accessed 4 January 2003], pp. 1-2 (2001).

[82] Branwyn, G., “The desire to be wired”, Wired, September/October (1993).

[83] Schirato, T. & Yell, S. Communication & Cultural Literacy: an introduction, Allen and Unwin, NSW (1996).

[84] Dery, M., Escape Velocity: cyberculture at the end of the century, Hodder and Stoughton, London (1996).

[85] Scheeres, J., “New body art: Chip implants”, Wired News, http://www.wired.com/news/culture/0,1284,50769,00.html, [Accessed 15 October 2002], pp. 1-2 (2002).

[86] Tysome, T., “Dance of a cyborg”, The Australian, p. 35 (2001).

[87] Warwick, K., “Frequently asked questions”, Professor Kevin Warwick, http://www2.cyber.rdg.ac.uk/kevinwarwick/FAQ.html, [Accessed 20 November 2001], pp. 1-4 (2001).

[88] Sackman, H., Computers, System Science, and Evolving Society: the challenge of man-machine digital systems, Wiley, New York (1967).

[89] McLuhan, M., Understanding Media: the extensions of man, The MIT Press, England (1999).

[90] McLuhan, M. & Powers, B.R., The Global Village: transformations in world life and media in the 21st century, Oxford University Press, New York (1989).

[91] McLuhan, E. & Zingrone, F., Essential McLuhan, BasicBooks, USA (1995).

[92] Ellul, J., The Technological Society, Vintage Books, New York (1964).

[93] Toffler, A., Future Shock, Bantam Books, New York (1970).

[94] Gates, B., The Road Ahead, The Penguin Group, New York (1995).

[95] Negroponte, N., Being Digital, Hodder and Stoughton, Australia (1995).

[96] Moravec, H., Mind Children: the future of robot and human intelligence, Harvard University Press, Cambridge (1988).

[97] Moravec, H., Robot: mere machine to transcendent mind, Oxford University Press, Oxford (1999).

[98] Paul, G.S. & Cox, E.D. Beyond Humanity: cyberevolution and future minds, Charles River Media, Massachusetts (1996).

[99] Sorkin, D.L. & McClanahan, J., “Cochlear implant reimbursement cause for concern”, HealthyHearing, http://www.healthyhearing.com/healthyhearing/newroot/articles/arc_disp.asp?id=147&catid=1055, [Accessed 3 May 2004], pp. 1-4 (2004).

[100] Weber, D.O., “Me, myself, my implants, my micro-processors and I”, Software Development Magazine, http://www.sdmagazine.com/print/documentID=11149, [Accessed 29 November 2001], pp. 1-6 (2000).

[101] Maybury, M.T., “The mind matters: artificial intelligence and its societal implications”, IEEE Technology and Society Magazine, June/July, pp. 7-15 (1990).

[102] Bijker, W.E. & Law, J. (eds), Shaping Technology/Building Society: studies in sociotechnical change, The MIT Press, Massachusetts (1992).

[103] Pool, R. Beyond Engineering: how society shapes technology, Oxford University Press, New York (1997).

[104] Hristodoulou, M. Hieromonk, “In the last days”, in Geron Paisios, Mount Athos, Greece, (in Greek), pp. 181-192 (1994).

[105] Relfe, M.S., The New Money System, Ministries Inc., Alabama (1982).

[106] Relfe, M.S., When Your Money Fails, League of Prayer, Alabama (1981).

[107] Barker, K. et al. (eds), The NIV Study Bible, Zondervan Publishing House, Michigan, pp. 1939-1940 (1995).

[108] Watkins, T., “WARNING: 666 IS COMING!”, Dial-the-Truth Ministries, http://www.secis.com/truth [Accessed 1 August 1996], now http://www.av1611.org, pp. 1-6 (1996).

[109] Michael, M.G., The Number of the Beast, 666 (Revelation 13:16-18): Background, Sources and Interpretation, Macquarie University, MA (Hons) Thesis, Sydney, Australia (1998).

[110] Arndt, W.F. & Gingrich, F.W., A Greek-English Lexicon of the New Testament and Other Early Christian Literature, The University of Chicago Press, Chicago, p. 876 (1979).

[111] Roethenbaugh, G., “Simon Davies- Is this the most dangerous man in Europe?”, Biometrics in Human Services, Vol. 2, No. 5, pp. 2-5 (1998).

[112] Decker, S., “Technology raises concerns: Pros and cons of scientific advances weighed as Christians discuss issue”, The Falcon Online Edition,  http://www.thefalcononline.com//story/2270, [Accessed 1 April 2003], pp. 1-3 (2002).

[113] Cook, T.L. The Mark of the New World Order, ASIN, USA (1999).

[114] Newton, C., “U.S. to weigh computer chip implant”, Netscape: Daily News, http://dailynews.netscape.com/mynsnews/story.tmpl?table=n&cat=51180&id=200202261956000188605, [Accessed 15 October 2002], pp. 1-2 (2002).

[115] Associated Press, “Chip in your shoulder? Family wants info device”, USA Today: Tech, http://www.usatoday.com/life/cyber/tech/2002/04/01/verichip-family.htm, [Accessed 15 October 2002], pp. 1-2 (2002).

[116] Michael, M.G., “For it is the number of a man”, Bulletin of Biblical Studies, Vol. 19, January-June, pp. 79-89 (2000).

[117] Michael, M.G., “666 or 616 (Rev 13:18): Arguments for the authentic reading of the Seer's conundrum”, Bulletin of Biblical Studies, Vol. 19, July-December, pp. 77-83 (2000).

[118] Bauckham, R., The Climax of Prophecy: Studies on the Book of Revelation, T & T Clark: Edinburgh, pp. 384-452 (1993).

[119] Horn, T., “Opinionet contributed commentary”, Opinionet, http://www.opinionet.com/commentary/contributors/ccth/ccth13.htm, [Accessed 29 November 2001], pp. 1-4 (2000).

[120] Barnet, R.J. & Cavanagh, J., Global Dreams: imperial corporations and the new world order, Simon and Schuster, New York (1994).

[121] Wilshire, B., The Fine Print, Brian Wilshire, Australia (1992).

[122] Smith, B., Warning, Smith Family Evangelism, New Zealand (1980).

[123] Stahl, W.A., God and the Chip: religion and the culture of technology, EDSR, Canada (1999).

[124] Noble, D.F., The Religion of Technology: the divinity of man and the spirit of invention, Penguin Books, England (1999).

[125] Sensormatic, “SafeKids™”, Sensormatic, http://www.sensormatic.com/html/safekids/index.htm, [Accessed 3 June 1999], pp. 1-2 (1999).

[126] Raimundo, N., “Digital angel or big brother?”, SCU, http://cseserv.engr.scu.edu/StudentWebPages/NRaimundo/ResearchPaper.htm, [Accessed 15 December 2002] (2002).

[127] Wilson, J., “Girl to get tracker implant to ease parents’ fears”, The Guardian, http://www.guardian.co.uk/Print/0,3858,4493297,00.html, [Accessed 15 October 2002], pp. 1-2 (2002).

[128] Ermann, M.D. et al. (eds), Computers, Ethics, and Society, Oxford University Press, New York (1997).

[129] Eng, P., “I, Chip? Technology to meld chips into humans draws closer”, ABCNEWS.com, http://abcnews.go.com/sections/scitech/DailyNews/chipimplant020225.html, [Accessed 15 October 2002], pp. 1-3 (2002).

[130] Pace, S. et al. (eds), The Global Positioning System: assessing national policies, Rand Corporation, New York (1996).

[131] Rummler, D.M., “Societal issues in engineering”, ENGR 300, pp. 1-3 (2001).

[132] Associated Press, “Company gets okay to sell ID-only computer chip implant”, The Detroit News, http://www.detnews.com/2002/technology/0204/05/technology-457686.htm, [Accessed 15 October 2002] (2002).

[133] Associated Press, “ID chip ready for implant”, USA Today: Tech, http://www.usatoday.com/life/cyber/tech/2002/04/04/implant-chip.htm, [Accessed 15 October 2002], pp. 1-2.

[134] Billinghurst, M. & Starner T., “Wearable devices: new ways to manage information”, IEEE Computer, January, Vol. 32, No. 1, pp. 57-64 (1999).

[135] Mann, S., “Wearable computing: toward humanistic intelligence”, IEEE Intelligent Systems, May/June, pp. 10-15 (2001).

[136] Chan, T., “Welcome to the Internet, baby!”, Telecom Asia, p. 38 (2001).

[137] Schiele, B. et al., “Sensory-augmented computing: wearing the museum’s guide”, IEEE Micro, pp. 44-52.

[138] Kurzweil, R., The Age of Spiritual Machines: when computers exceed human intelligence, Penguin Books, New York (1999).

[139] Irwin, A., “Brain implant lets man control computer by thought”, Telegraph.co.uk, 1238, http://www.telegraph.co.uk/et?ac=000118613908976, [Accessed 22 November 2001], pp. 1-3 (1998).

[140] Warwick, K., “Are chip implants getting under your skin?”, Compiler, http://www.synopsys.com/news/pubs/compiler/art3_chipimplan-mar03.html, [Accessed 1 March 2004], pp. 1-5 (2003).

[141] McGrath, P., “Technology: Building better humans”, Newsweek, http://egweb.mines.edu/eggn482/admin/Technology.htm, [Accessed 29 November], pp. 1-3 (2001).

[142] Anonymous, “Professor Cyborg”, Salon.com, http://www.salon.com/tech/feature/1999/10/20/cyborg/index1.html, 3 parts, [Accessed 29 November 2001], pp. 1-3 (1999).

[143] Joy, B., “Why the future doesn’t need us”, Wired, 8.04, http://www.wired.com/wired/archive/8.04/joy_pr.html, [Accessed 4 January 2003], pp. 1-19 (2000).

[144] Masey, S. “Can we talk? The need for ethical dialogue”, The IEE, p. 4/1, (1998).

[145] Wenk, E., “The design of technological megasystems: new social responsibilities for engineers”, IEEE, pp. 47-61 (1990).

[146] Boehringer, B., “Benefits of the OHSU/OGI merger”, The Oregon Opportunity: A New Era of Medical Breakthroughs, http://www.ohsu.edu/about/opportunity/ohsu_ogi.htm, [Accessed 20 November 2001], pp. 1-2 (2001).

[147] Ebert, R., “Enemy of the State”, Ebert on Movies, http://www.suntimes.com/ebert/ebert_reviews/1998/11/112006.html, pp. 1-3 (2001).

 

Biographical Note

Dr Katina Michael is a lecturer in Information Technology at the University of Wollongong in Australia. In 1996 she completed her Bachelor of Information Technology degree with a co-operative scholarship from the University of Technology, Sydney (UTS) and in 2003 she was awarded her Doctor of Philosophy with the thesis “The Auto-ID Trajectory” from the University of Wollongong. She has an industrial background in telecommunications and has held positions as a systems analyst with United Technologies and Andersen Consulting. Most of her work experience was acquired as a senior network and business planner with Nortel Networks (1996-2001). In this capacity she consulted for Asia’s largest telecommunication operators and service providers. Katina now teaches and researches in eBusiness and her main academic interests are in the areas of automatic identification devices, third generation wireless applications, geographic information systems, and technology forecasting.

Dr M.G. Michael is a church historian and New Testament scholar. He has spoken at numerous international conferences and has written two highly regarded dissertations on the Book of Revelation. His specialist interests are in apocalypticism, millennial studies, and Orthodox mysticism. He has completed a Doctor of Philosophy at the Australian Catholic University, a Master of Arts (Honours) at Macquarie University, a Master of Theology and Bachelor of Arts at Sydney University and a Bachelor of Theology at the Sydney College of Divinity.

The Auto-ID Trajectory - Chapter Ten: Conclusion

The principal conclusions from the findings given in chapter nine are threefold. First, an evolutionary process of development is present in the auto-ID technology system (TS). Incremental steps, whether by way of technological recombinations or mutations, have led to revolutionary changes in the auto-ID industry, both at the device level and at the application level. The evolutionary process in the auto-ID TS does not imply a ‘survival of the fittest’ approach,[1] but rather a model of coexistence in which each particular auto-ID technique follows a path that ultimately influences the success of the whole industry. The patterns of migration, integration and convergence can be considered either mutations or recombinations of existing auto-ID techniques for the creation of new auto-ID innovations. Second, forecasting technological innovations is important in predicting future trends on the basis of past and current events. Analysing the process of innovation between intervals of widespread diffusion of individual auto-ID technologies sheds light on the auto-ID trajectory. Third, that technology is autonomous by nature has been shown by the changes in uses of auto-ID: from non-living to living things, from government to commercial applications, and from external identification devices in the form of tags and badges to medical implants inserted under the skin.
