Wiener's Cybernetics Legacy and the Growing Need for the Interdisciplinary Approach [Scanning Our Past]

I. Wiener's Contribution

Fig. 1. Norbert Wiener (courtesy of MIT Museum).

Norbert Wiener (Fig. 1) was born in Columbia, MO, USA, in 1894. His father, Leo Wiener, was the first Jew to be appointed a tenured, full professor at Harvard University [1, p. 25], and he raised Norbert in a multidisciplinary intellectual milieu that included American pragmatism [31]. Wiener wrote his first philosophical paper at age 10, on the role of approximation, uncertainty, and incompleteness in all human knowledge [2, p. 96] (Fig. 2). A famous child prodigy, Norbert graduated from Tufts University with a degree in mathematics at age 14 and earned a Ph.D. from Harvard University at age 18 in the philosophy of mathematical logic. At 19, he undertook postdoctoral study under Bertrand Russell, having written his dissertation on Whitehead and Russell’s Principia Mathematica. Key sources on Wiener’s life and work include his own two-volume autobiography [2], [3], along with Heims [4], [14], Masani [5], Conway and Siegelman [1], Galison [6], and Kline [7].

In 1919 Wiener gained a position at the Massachusetts Institute of Technology (MIT), becoming a full Professor of Mathematics in 1932. Building on Gibbs’ work in statistical mechanics, Wiener examined Brownian motion and advanced probability theory with what is now called the ‘‘Wiener measure’’ [5, p. 83]. He made significant contributions to harmonic analysis and Fourier transforms but is best known for his role in the development of cybernetics. Over time, Wiener developed a strong interest in feedback mechanisms in both science and engineering. This work included research he did with engineer Julian Bigelow1 during World War II on an antiaircraft fire-control system, from which Bigelow, Wiener, and neurophysiologist Arturo Rosenblueth proposed a general explanation of purposeful (teleological) behavior in both animals and machines in 1943 [6].

Wiener and Bigelow set out to determine how to shoot down an enemy airplane by predicting its future position. They managed to develop a curved-flight predictor, but the Wiener–Bigelow system did not go into operation during World War II because of the complexity of its implementation. Yet this research led to sophisticated filtering techniques, and Wiener quickly connected the preliminary results with nervous disorders. When Wiener asked Rosenblueth about potential physiological parallels to his findings, Rosenblueth reinforced Wiener’s view of the similarity of control in the human and the machine, at least on a statistical basis [3, p. 253]. These discoveries led Wiener to propose the new field of cybernetics [9].

II. Defining Cybernetics

When we consider the introduction of a new term into the literature, we need to reflect on its antecedents; a new term almost always emerges from a body of existing knowledge. Neurophysiologist W. Ross Ashby in Britain, communication and control system engineers in the United States, and many other researchers wrote about humans and feedback control mechanisms, anticipating cybernetics before it was known by that name [29, ch. 4], [30]. But it was in 1944 that Wiener and the group of scientists and engineers around him and Rosenblueth became aware of the ‘‘essential unity of the set of problems centering about communication, control, and statistical mechanics, whether in the machine or in living tissue’’ [10, p. 19]. It was Wiener himself who put the stake in the ground, pronouncing that the entire field of control and communication theory, whether in machine or animal, should be called ‘‘cybernetics.’’ Wiener credited James Clerk Maxwell (1868) [10, p. 19] with the first significant paper on feedback mechanisms, which refers to governors by the Latin form of the word. Wiener instead took the Greek form for ‘‘steersman,’’ written ‘‘κυβερνήτης,’’ and coined ‘‘cybernetics.’’2 Wiener was intrigued by how the human brain worked and looked for ‘‘common elements in the functioning of automatic machines and of the human nervous system’’ [11, p. 19].

Fig. 2. Norbert Wiener at age 9 (courtesy of MIT Museum).

At the time of the first large-scale electronic computers (like the ENIAC), journalists had begun to anthropomorphize machine elements (e.g., ‘‘electronic brains’’ [12]), and Wiener sought to compare the two forms more closely. For example, he compared the human brain to the digital computer and noted that although vacuum-tube computers used much more energy than their biological counterparts, the energy spent on each calculation was very small [10, p. 155]. For Wiener the central analogy of cybernetics was a generalized feedback control system. Wiener believed cybernetics could model animals and automatic machines because both have ‘‘sensors, effectors, brains, and feedback-paths with which they communicate (exchange information) with the outside world and operate in and on that world’’ [7, p. 12]. It was by focusing on these feedback patterns that Wiener made interdisciplinary observations regarding the human brain that tied together neuroscience and electronics. In doing so he drew on analogies such as likening electroshock treatment of a psychiatric patient to erasing a computer’s memory.

Similarly, Wiener was intrigued by how the human body constantly provides feedback to the brain to regulate basic physiological vital signs such as blood pressure and temperature, a process known as homeostasis.
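The negative-feedback principle at work in homeostasis can be captured in a minimal sketch. The following toy controller (a hypothetical illustration; the setpoint, gain, and drift values are invented, not physiological) repeatedly senses the gap between a quantity and its goal and acts against it:

```python
# Minimal proportional (negative-feedback) loop: sense the error between
# the current state and a goal, then act against that error.
# All parameter values are illustrative only.

def regulate(setpoint=37.0, value=39.0, gain=0.3, drift=0.05, steps=25):
    """Drive a quantity (say, body temperature) back toward a setpoint."""
    trace = []
    for _ in range(steps):
        error = setpoint - value       # sense: distance from the goal
        value += gain * error + drift  # act against the error; drift persists
        trace.append(round(value, 2))
    return trace

# The value settles near 37 despite the constant disturbance, with the small
# residual offset characteristic of purely proportional control.
print(regulate())
```

The same loop structure describes a thermostat, an automatic pilot, or the body's regulation of blood pressure, which is what made the analogy so productive for Wiener.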

He even did research on such cybernetic devices as medical prostheses and the automatic control of insulin injection. Although much of the rich discourse that Wiener and other cyberneticians developed in the 1950s has been abandoned in today’s flattened talk of an information age, the noun ‘‘information’’ and the prefix ‘‘cyber’’ today inform how ‘‘we talk, think, and act on our digital present and future’’ [7, p. 4].

Wiener’s most famous book, Cybernetics: or Control and Communication in the Animal and the Machine, was published in 1948 [10]. In it he drew together a multidisciplinary approach developed in the early meetings of a series of conferences organized by the Josiah Macy, Jr., Foundation in New York City, chaired by Warren McCulloch and held between 1946 and 1953. These conferences on feedback mechanisms in biology and society attracted an interdisciplinary group of several dozen specialists, including John von Neumann and Claude Shannon, alongside whom Wiener made major contributions to the foundation of modern information theory. These included the ‘‘Wiener filter,’’ a method of reducing the noise present in a signal, initially developed as part of his military research in 1942 and released publicly in 1949 [13].
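The intuition behind the Wiener filter can be stated compactly: attenuate each frequency component in proportion to how much of its power is signal rather than noise, i.e., with gain S/(S+N) per frequency. The sketch below is a toy frequency-domain illustration that assumes, unrealistically, that the clean signal's spectrum is known; all signal and noise values are invented:

```python
# Toy frequency-domain Wiener filter: gain H = S / (S + N) per bin, where
# S and N are the signal and noise power spectra. Values are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 512
t = np.arange(n) / n
clean = np.sin(2 * np.pi * 5 * t)                # the "message" signal
noisy = clean + 0.5 * rng.standard_normal(n)     # additive white noise

S = np.abs(np.fft.rfft(clean)) ** 2              # signal power spectrum
N = np.full_like(S, (0.5 ** 2) * n)              # flat white-noise spectrum
H = S / (S + N)                                  # per-bin gain between 0 and 1

estimate = np.fft.irfft(H * np.fft.rfft(noisy), n=n)
print(f"mean-square error before: {np.mean((noisy - clean) ** 2):.3f}, "
      f"after: {np.mean((estimate - clean) ** 2):.3f}")
```

Wiener's own formulation [13] concerned optimal linear prediction and smoothing of stationary time series; the sketch conveys only the signal-versus-noise weighting at its core.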

To the surprise of book publishers and the press, the highly mathematical Cybernetics rose to fame, aided by a concerted publicity campaign, and soon there was discussion that it might even serve to bring the disciplines closer together by providing a common universal language that could be understood and applied by all. The Macy feedback conference series adopted the term cybernetics as its title, which helped spread cybernetics and information theory throughout the social sciences and biology, forming a basis for a later discourse on information. Attendees and thought leaders linked to the Macy conferences believed that they could model the behavior of humans and society because both contained information-feedback loops [14]. According to Kline, ‘‘[t]he allure of cybernetics rested on its promise to model mathematically the purposeful behavior of all organisms, as well as inanimate systems’’ [7, p. 4].

III. Wiener's Interdisciplinary Approach

Wiener placed priority on the ability of scientists to step beyond their own discipline in order to achieve interesting results (Fig. 3). For him, multidisciplinary work involved more than having people of different backgrounds working in the same team: ‘‘The mathematician need not have the skill to conduct a physiological experiment, but he must have the skill to understand one, to criticize one, and to suggest one. The physiologist need not be able to prove a certain mathematical theorem, but he must be able to grasp its physiological significance and to tell the mathematician for what he should look’’ [10, p. 3].

Fig. 3. Norbert Wiener at blackboard (courtesy of MIT Museum).

Key to this type of interdisciplinarity was the fact that Wiener saw significant parallels between human and machine processes. For example, he wrote: ‘‘In the ear, the transposition of music from one fundamental pitch to another is nothing but a translation of the logarithm of the frequency, and may consequently be performed by a group-scanning apparatus’’ [10, p. 141]. Some have criticized the liberal humanist subject position of Wiener’s (first-order) cybernetics (e.g., Hayles [15]). Wiener’s concern was not with distinguishing the human from the machine, but with ensuring that humans did not themselves become part of a machine in the organizational sense: ‘‘When human atoms are knit into an organization in which they are used, not in their full right as responsible human beings, but as cogs and levers and rods, it matters little that their raw material is flesh and blood. What is used as an element in a machine, is in fact an element in the machine’’ [16, p. 185].
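The ear example quoted above can be restated in modern notation (a paraphrase, not Wiener's own formula): transposing a tone by a frequency ratio r (r = 2 for an octave) sends f to rf, which on a logarithmic frequency axis is a pure translation,

```latex
\[
  \log(rf) = \log f + \log r ,
\]
```

so any mechanism that can shift a pattern along the log-frequency axis, such as the group-scanning apparatus Wiener describes, performs pitch transposition.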

While Wiener’s work on control and communication helped lay a theoretical basis for computerized factory automation and for many information technologies of the 21st century, he was among the early group of scientists and engineers to call attention to the potentially negative impacts of computers and cybernetics. His warnings about the effect of automation on employment were popularized as the subject of science fiction writer Kurt Vonnegut’s first novel, Player Piano [17]. But it is perhaps now, more than ever before, as we ponder both the utopian and dystopian possibilities of cybernetics, that Wiener’s warnings should be heeded. In the postwar era, Wiener strove to mathematically model humans and machines (with some caveats regarding human society).

We are now edging toward what was once typically relegated to the realm of science fiction. Science fiction and science fact were once scarcely on speaking terms; now they are at times indistinguishable from one another. If we can map the activity of the 100 billion neurons in the human brain, as we cracked the code of the human genome in DNA, where does the trajectory lead? If we can delete the memory of a computer and reprogram it, can we also do that to people who use embedded brain stimulators? And ultimately, who is in control of the decision-making processes?

Generally, engineers in research institutes tend to reflect on ethics only when they have a commercialized product going to market, if such reflection happens at all. Ethicists are more often than not bolted on to a nationally funded project or center to justify the existence of the entity in question and to assure the nonacademic community that nothing remiss will take place outside established national guidelines. Among the scientists and engineers who began to question science and technology in the Cold War, Wiener, with his strong interdisciplinary background, asked profound questions about societal implications from the problem definition phase onward. While this did not mean he stopped his research when he understood his work might collide with fundamental human values, he had at least drawn a discernible line where he thought his ideas might signal negative consequences for humanity at large, if pursued unchecked.

IV. The "Inter" in Interdisciplinary: The Value of Sharing Science

In many ways, between the 1920s and 1960s, Wiener’s questions foreshadowed those of the field of science, technology, and society (STS), often called science and technology studies. He was able to do so because of his family grounding in the humanities and his willingness to cross disciplinary boundaries to consider questions usually reserved for the social sciences. The Macy Conferences held between 1946 and 1953 aimed at breaking down the disciplinary barriers in the sciences. Mathematicians, engineers, biologists, social scientists, and humanists debated how the wartime theories of communications and control engineering applied to both humans and machines. Some anthropologists, like Margaret Mead and her then-husband Gregory Bateson, hoped that cybernetics would bring greater rigor to the social sciences. But the aim of the Macy conferences was for cybernetics to be an interdiscipline in which people could communicate with one another across fields, creating a ‘‘hybrid field of knowledge existing between and within disciplines’’ [7, pp. 3 and 62].3

Macy conference series chair Warren McCulloch noted at the end of the first year of the group’s existence that members had come from such diverse disciplines that they had to begin by learning each other’s vocabularies before they could even hope to understand one another and carry out even a simple conversation. He wrote about research results that had been ‘‘gathered by extremely dissimilar methods, by observers biased by disparate endowment and training, and related to one another only through a babel of laboratory slangs and technical jargons. Our most notable agreement is that we have learned to know one another a bit better, and to fight fair in our shirt sleeves’’ [18, p. 69]. Margaret Mead was initially impressed with the usefulness of cybernetics as a common language that could cross disciplinary boundaries and lead to real interdisciplinary research, but she was utterly deflated when ‘‘the whole thing fragmented’’ upon trying to set up a project in the 1960s for cross-disciplinary communication between the United States and the USSR [7, p. 182].

One of the important characteristics of the field of STS today is its examination of claims for a single trajectory of technology independent of society, such as ‘‘progress’’ [19]. Wiener writes: ‘‘[t]hose who uphold the idea of progress as an ethical principle regard this unlimited and quasi-spontaneous process of change as a Good Thing, and as the basis on which they guarantee to future generations a Heaven on Earth’’ [16, p. 42]. For Wiener, faith in progress was not merely to be limited but was itself undesirable: ‘‘[t]he simple faith in progress is not a conviction belonging to strength, but one belonging to acquiescence and hence to weakness’’ [16, p. 47]. Parallels to this approach can be found in the work of Ellul [20], Borgmann [21], Feenberg [22], Marcuse [23], Bauman [24], and others.

Having reached a perspective that he felt required action, Wiener sought to achieve change through the activity of the scientist or engineer. Wiener’s approach is pragmatic rather than sentimental:

Such interests in the humanitarian duties of the scientist as I now have are due more to the direct impact of the moral problems besetting the research man of the present day than to any original conviction that the scientist is primarily a philanthropist [2, p. 73].

Retaining this pragmatic perspective, Wiener often pointed to the difficulty of achieving what he classified as desirable outcomes. He wrote that the scientific method leaves scientists ill-equipped to tackle societal issues:

The scientist is thus disposed to regard his opponent as an honorable enemy. This attitude is necessary for his effectiveness as a scientist, but tends to make him the dupe of unprincipled people in war and in politics [16, p. 36].

Wiener wrote extensively on the need to allow the individual space to be creative. For Wiener the individual could not be separated from the community, and while he advocated opportunity for the individual (particularly to be creative) his focus throughout his writings was on improving the community:

Whatever benefits are awarded for scientific creation should have the good of the community as their purpose even more than the good of the individual. As such, they should be contingent on a full and free publication of the new ideas of the discoverer. The truth can make us free only when it is a freely obtainable truth [25, p. 154].

For Wiener, ‘‘the most fruitful areas for the growth of the sciences were those which had been neglected as a no-man’s land between the various established fields... A man may be a topologist or an acoustician or a coleopterist. He will be filled with the jargon of his field, and will know all its literature and all its ramifications, but, more frequently than not, he will regard the next subject as something belonging to his colleague three doors down the corridor, and will consider any interest in it on his own part as an unwarrantable breach of privacy’’ [10, pp. 2–3]. Further, in regard to scientific work, ‘‘the simple coexistence of two items of information is of relatively small value, unless these two items can be effectively combined in some mind or organ which is able to fertilize one by means of the other. This is the very opposite of the organization in which each member travels a preassigned path, and in which the sentinels of science, when they come to the end of their beats, present arms, do an about face, and march back in the direction from which they have come’’ [16, p. 126].

V. Norbert's Warning to the Future: Benefit and Harm

Much of Wiener’s writing regarding technology and society can be placed in the field of ethics. Wiener felt, and often conveyed, a sense of ethical responsibility. In this he confounds writers on the philosophy of technology such as Ellul, who presents ‘‘human techniques’’ as deterministic and despairs of the scientist or technologist addressing ethical issues [20]. Wiener wrote of the implications for the scientist and engineer:

A certain precise mixture of religion, pornography, and pseudo science will sell an illustrated newspaper. A certain blend of wheedling, bribery, and intimidation will induce a young scientist to work on guided missiles or the atomic bomb [10, pp. 159–160].

For Wiener these challenges were existential: ‘‘[f]or the first time in history, it has become possible for a limited group of a few thousand people to threaten the absolute destruction of millions, and this without any highly specific immediate risk to themselves’’ [3, p. 300]. And it is here that he emphasized that scientists had to think hard about whom they would disclose their ideas to.

Beyond the threat of war, Wiener identified a separate concern related to his own technical work. Here the sense of responsibility of the professional is clear.

So far the moral problem of warfare had not concerned me directly. However, in the fall of 1944 a complex of events took place which had a very considerable effect on my later career and thought. I had already begun to reflect on the relation between the high-speed computing machine and the automatic factory... the automatic factory was not far off. I wondered whether I had not got into a moral situation in which my first duty might be to speak to others concerning material which could be socially harmful [3, p. 295]. I thus decided that I would have to turn from a position of the greatest secrecy to a position of the greatest publicity, and bring to the attention of all the possibilities and dangers of the new developments [3, p. 308].

Wiener’s writings contain many examples of humanitarian concerns across a wide range of topics, such as the following: ‘‘[t]he principal type of surgical intervention which has been practiced is known as prefrontal lobotomy, and consists in the removal or isolation of a portion of the prefrontal lobe of the cortex. It has recently been having a certain vogue, probably not unconnected with the fact that it makes the custodial care of many patients easier. Let me remark in passing that killing them makes their custodial care still easier’’ [10, p. 148].

Just as he questioned a naive faith in progress, for Wiener technology itself had both a human potential and an antihuman potential. ‘‘Thus the new industrial revolution is a twoedged sword. It may be used for the benefit of humanity, but only if humanity survives long enough to enter a period in which such a benefit is possible. It may also be used to destroy humanity, and if it is not used intelligently it can go very far in that direction’’ [16, p. 162].

He drew a personal responsibility from this in a letter published in the January 1947 issue of the Atlantic Monthly, titled ‘‘A scientist rebels,’’ on the basis of which Wiener can be described as a founder of information ethics [26]:

The policy of the government itself during and after the war, say in the bombing of Hiroshima and Nagasaki, has made clear that to provide scientific information is not a necessarily innocent act, and may entail the gravest consequences... It is perfectly clear also that to disseminate information about a weapon in the present state of our civilization is to make it practically certain that the weapon will be used... If therefore I do not desire to participate in the bombing or poisoning of defenseless peoples, and I most certainly do not, I must take a serious responsibility as to those to whom I disclose my scientific ideas [27].

Following his Atlantic Monthly statement, he proposed that medical and other humanitarian applications could provide an alternative vocation to military research. His own work included ‘‘my general interest in the philosophy of prosthesis. I have believed that much could be done with artificial limbs by realizing that the deprivation of the amputee is quite as much sensory as motor’’ [3, p. 287].

He also presented the view that even without a human–machine connection, machines impact humans: ‘‘[l]et us remember that the automatic machine, whatever we think of any feelings it may have or may not have, is the precise economic equivalent of slave labor. Any labor which competes with slave labor must accept the economic conditions of slave labor’’ [16, p. 162].

The debate over whether his contribution was humanistic or antihumanistic has continued well after his death (see, e.g., [6]). Wiener’s discoveries meet his own prediction of the two-edged sword of technology. He was unusual among scientists in drawing attention to this.

By the 1960s much of the technology that Wiener and others laid the theoretical foundation for in the 1940s had appeared, at least in a rudimentary form. In the last of his books published in his lifetime (he died in 1964), which deals with the relationship between cybernetic technology and religion, Wiener makes a comment which is relevant in the first quarter of the 21st century: ‘‘[t]he world of the future will be an ever more demanding struggle against the limitations of our intelligence, not a comfortable hammock in which we can lie down to be waited upon by our robot slaves’’ [28, p. 69].

VI. Conclusion

For Wiener, human fields of endeavor are multidisciplinary by nature. He criticized the ‘‘deification of the businessman’’ as inimical to human creativity. He described the richness and diversity of the many civilizations and cultures, and suggested that European and North American researchers should reach out to nonwestern cultures to renew human vitality after the two catastrophic world wars of the 20th century. ‘‘Megabuck science’’ and political orthodoxy ‘‘can be expected to end in lowering the intellectual water table and turning vast areas of the soul which need our cultivation into dead and useless deserts’’ [25, p. 33]. Interdisciplinarity can produce sustainable and interoperable solutions; it can foresee problems, and it can go a long way toward solving them. Arguably, interdisciplinary approaches lead to a more organized world and to enriched communities. And the question of whether one should continue to investigate a given idea or breakthrough raises ethical considerations that, viewed from a multidisciplinary perspective, can cast light on whether further development is likely to yield benefit or harm. Wiener argued that knowledge and wisdom are to be found in multiple sources and discoveries.

Footnotes

1 Julian Bigelow was later the chief engineer on mathematician John von Neumann’s project to build a pioneering digital computer at Princeton University [8].

2 It is important to note that a significant typographic error was made in Wiener’s Cybernetics [10] and carried forward in subsequent editions. The kappa in the Greek form of steersman (κυβερνήτης), originally written correctly in the 1948 edition, was wrongly replaced with a chi (χ) in the 1961 second edition.

3 Although Wiener’s philosophy evolved over time, we present quotations without context in this section and the next one, because his positions on the relationship between cybernetic machines and society remained fairly constant from 1947 to his last book in 1964.

References

[1] F. Conway and J. Siegelman, Dark Hero of the Information Age: In Search of Norbert Wiener, the Father of Cybernetics. New York, NY, USA: Basic Books, 2006.

[2] N. Wiener, Ex-Prodigy: My Childhood and Youth. New York, NY, USA: Simon and Schuster, 1955.

[3] N. Wiener, I am a Mathematician: The Later Life of a Prodigy. New York, NY, USA: Doubleday, 1956.

[4] S. J. Heims, John von Neumann and Norbert Wiener: From Mathematics to the Technologies of Life and Death. Cambridge, MA, USA: MIT Press, 1980.

[5] P. R. Masani, Norbert Wiener 1894–1964. Basel, Switzerland: Birkhauser-Verlag, 1990.

[6] P. Galison, ‘‘The ontology of the enemy: Norbert Weiner and the cybernetic vision,’’ Critical Inquiry, vol. 21, pp. 228–266, 1994.

[7] R. R. Kline, The Cybernetics Moment: Or Why We Call Our Age the Information Age. Baltimore, MD, USA: Johns Hopkins Univ. Press, 2015.

[8] W. Aspray, John von Neumann and the Origins of Modern Computing. Cambridge, MA, USA: MIT Press, 1990.

[9] G. Adamson, ‘‘Norbert Wiener and Prasanta Chandra Mahalanobis: Technology and nation-building in post-independence India,’’ in Proc. IEEE Conf. Technol. Soc. Asia, Singapore, 2012.

[10] N. Wiener, Cybernetics: Or Control and Communication in the Animal and the Machine, 2nd ed. Cambridge, MA, USA: MIT Press, 1961.

[11] N. Wiener, ‘‘Cybernetics,’’ Sci. Amer., vol. 179, pp. 14–19, 1948.

[12] C. D. Martin, ‘‘ENIAC: The press conference that shook the world,’’ IEEE Technol. Soc. Mag., vol. 14, no. 4, pp. 3–10, Winter 1995.

[13] N. Wiener, Extrapolation, Interpolation, and Smoothing of Stationary Time Series. Cambridge, MA, USA: MIT Press, 1949.

[14] S. J. Heims, The Cybernetics Group. Cambridge, MA, USA: MIT Press, 1990.

[15] N. K. Hayles, How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago, IL, USA: Univ. Chicago Press, 1999.

[16] N. Wiener, The Human Use of Human Beings: Cybernetics and Society. Boston, MA, USA: Houghton Mifflin, 1954.

[17] K. Vonnegut, Player Piano, 2nd ed. New York, NY, USA: Delacorte Press, 1969.

[18] W. S. McCulloch, ‘‘Summary of the points of agreement reached in the previous nine conferences on cybernetics,’’ in Trans. 10th Conf. Cybern., New York, 1955, pp. 69–80.

[19] G. Adamson, ‘‘Norbert Wiener on technology and society,’’ in Proc. IEEE Conf. Norbert Wiener in the 21st Century, Boston, MA, USA, 2014, DOI: 10.1109/NORBERT.2014.6893930.

[20] J. Ellul, The Technological Society, J. Wilkinson, Trans. New York, NY, USA: Alfred A. Knopf, 1964.

[21] A. Borgmann, Technology and the Character of Contemporary Life: A Philosophical Inquiry. Chicago, IL, USA: Univ. Chicago Press, 1984.

[22] A. Feenberg, ‘‘Democratic rationalization,’’ in Readings in the Philosophy of Technology, D. M. Kaplan, Ed. Oxford, U.K.: Rowman & Littlefield, 2004, pp. 209–225.

[23] H. Marcuse, Eros and Civilization. Boston, MA, USA: Beacon Press, 1966.

[24] Z. Bauman, Wasted Lives: Modernity and its Outcasts. Cambridge, U.K.: Polity, 2004.

[25] N. Wiener, Invention: The Care and Feeding of Ideas. Cambridge, MA, USA: MIT Press, 1993.

[26] T. Bynum, ‘‘Computer and information ethics,’’ Stanford Encyclopedia of Philosophy, Mar. 27, 2014. [Online]. Available: http://plato.stanford.edu/archives/spr2011/entries/ethics-computer/

[27] N. Wiener, ‘‘A scientist rebels,’’ The Atlantic Monthly, vol. 179, p. 46, 1947.

[28] N. Wiener, God and Golem, Inc. Boston, MA, USA: MIT Press, 1964.

[29] A. Pickering, The Cybernetic Brain: Sketches of Another Future. Chicago, IL, USA: Univ. Chicago Press, 2010.

[30] D. Mindell, Between Human and Machine: Feedback, Control, Computing Before Cybernetics. Baltimore, MD, USA: Johns Hopkins Univ. Press, 2002.

[31] B. Peters, ‘‘Toward a genealogy of cold war communication sciences: The strange loops of Leo and Norbert Wiener,’’ Russian J. Commun., vol. 5, pp. 31–43, 2013.

Keywords: Cybernetics, History, Wiener, Norbert, interdisciplinary approach, Wiener cybernetics legacy

Citation: Greg Adamson, Ronald R. Kline, Katina Michael, M. G. Michael, "Wiener's Cybernetics Legacy and the Growing Need for the Interdisciplinary Approach", Proceedings of the IEEE, Vol. 103, No. 11, pp. 2208 - 2214, Nov. 2015.

Social Implications of Technology: The Past, the Present, and the Future

Abstract

The social implications of a wide variety of technologies are the subject matter of the IEEE Society on Social Implications of Technology (SSIT). This paper reviews the SSIT's contributions since the Society's founding in 1982, and surveys the outlook for certain key technologies that may have significant social impacts in the future. Military and security technologies, always of significant interest to SSIT, may become more autonomous with less human intervention, and this may have both good and bad consequences. We examine some current trends such as mobile, wearable, and pervasive computing, and find both dangers and opportunities in these trends. We foresee major social implications in the increasing variety and sophistication of implant technologies, leading to cyborgs and human-machine hybrids. The possibility that the human mind may be simulated in and transferred to hardware may lead to a transhumanist future in which humanity redesigns itself: technology would become society.

SECTION I. Introduction

“Scientists think; engineers make.” Engineering is fundamentally an activity, as opposed to an intellectual discipline. The goal of science and philosophy is to know; the goal of engineering is to do something good or useful. But even in that bare-bones description of engineering, the words “good” and “useful” have philosophical implications.

Because modern science itself has existed for only 400 years or so, the discipline of engineering in the sense of applying scientific knowledge and principles to the satisfaction of human needs and desires is only about two centuries old. But for such a historically young activity, engineering has probably done more than any other single human development to change the face of the material world.

It took until the mid-20th century for engineers to develop the kind of self-awareness that leads to thinking about engineering and technology as they relate to society. Until about 1900, most engineers felt comfortable in a “chain-of-command” structure in which the boss—whether it be a military commander, a corporation, or a wealthy individual—issued orders that were to be carried out to the best of the engineer's technical ability. Fulfillment of duty was all that was expected. But as the range and depth of technological achievements grew, engineers, philosophers, and the public began to realize that we had all better take some time and effort to think about the social implications of technology. That is the purpose of the IEEE Society on Social Implications of Technology (SSIT): to provide a forum for discussion of the deeper questions about the history, connections, and future trends of engineering, technology, and society.

This paper is not focused on the history or future of any particular technology as such, though we will address several technological issues in depth. Instead, we will review the significant contributions of SSIT to the ongoing worldwide discussion of technology and society, and how technological developments have given rise to ethical, political, and social issues of critical importance to the future. SSIT is the one society in IEEE where engineers and allied professionals are encouraged to be introspective—to think about what they are doing, why they are doing it, and what effects their actions will have. We believe the unique perspective of SSIT enables us to make a valuable contribution to the panoply of ideas presented in this Centennial Special Issue of the Proceedings of the IEEE.

 

SECTION II. The Past

A. Brief History of SSIT

SSIT as a technical society in IEEE was founded in 1982, after a decade as the Committee on Social Implications of Technology (CSIT). In 1991, SSIT held its first International Symposium on Technology and Society (ISTAS), in Toronto, ON, Canada. Beginning in 1996, the Symposium has been held annually, with venues intentionally located outside the continental United States every few years in order to increase international participation.

SSIT total membership was 1705 as of December 2011. Possibly because SSIT does not focus exclusively on a particular technical discipline, it is rare that SSIT membership is a member's primary connection to IEEE. As SSIT's parent organization seeks ways to increase its usefulness and relevance to the rapidly changing engineering world of the 21st century, SSIT will both chronicle and participate in the changes taking place both in engineering and in society as a whole. For a more detailed history of the first 25 years of SSIT, see [1].

B. Approaches to the Social Implications of Technology

In the historical article referred to above [1], former SSIT president Clint Andrews remarked that there are two distinct intellectual approaches which one can take with regard to questions involving technology and society. The CSIT and the early SSIT followed what he calls the “critical science” approach which “tends to focus on the adverse effects of science and technical change.” Most IEEE societies are organized around a particular set of technologies. The underlying assumption of many in these societies is that these particular technologies are beneficial, and that the central issues to be addressed are technical, e.g., having to do with making the technologies better, faster, and cheaper. Andrews viewed this second “technological optimism” trend as somewhat neglected by SSIT in the past, and expressed the hope that a more balanced approach might attract a larger audience to the organization's publications and activities. It is important to note, however, that from the very beginning, SSIT has called for a greater emphasis on the development of beneficial technology such as environmentally benign energy sources and more efficient electrical devices.

In considering technology in its wider context, issues that are unquestionable in a purely technical forum may become open to question. Technique A may be more efficient and a fraction of the cost of technique B in storing data with similar security provisions, but what if a managed offshore shared storage solution is not the best thing to do under a given set of circumstances? The question of whether A or B is better technologically (and economically) is thus subsumed in the larger question of whether and why the entire technological project is going to benefit anyone, and who it may benefit, and who it may harm. The fact that opening up a discussion to wider questions sometimes leads to answers that cast doubt on the previously unquestioned goodness of a given enterprise is probably behind Andrews' perception that on balance, the issues joined by SSIT have predominantly fallen into the critical-science camp. Just as no one expects the dictates of conscience to be in complete agreement with one's instinctive desires, a person seeking unalloyed technological optimism in the pages or discussions hosted by SSIT will probably be disappointed. But the larger aim is to reach conclusions about technology and society that most of us will be thankful for some day, if not today. Another aim is to ensure that we bring issues to light and propose ways forward to safeguard against negative effects of technologies on society.

C. Major Topic Areas of SSIT

In this section, we will review some (but by no means all) topics that have become recurring themes over the years in SSIT's quarterly peer-reviewed publication, the IEEE Technology & Society Magazine. The articles cited are representative only in the sense that they fall into categories that have been dealt with in depth, and are not intended to be a “best of” list. These themes fall into four broad categories: 1) war, military technology (including nuclear weapons), and security issues, broadly defined; 2) energy technologies, policies and related issues: the environment, sustainable development, green technology, climate change, etc.; 3) computers and society, information and communications technologies (ICT), cybersystems, cyborgs, and information-driven technologies; and 4) groups of people who have historically been underprivileged, unempowered, or otherwise disadvantaged: Blacks, women, residents of developing nations, the handicapped, and so on. Education and healthcare also fit in the last category because the young and the ill are in a position of dependence on those in power.

1. Military and Security Issues

Concern about the Vietnam War was a strong motivation for most of the early members of the Committee on Social Responsibility in Engineering, the predecessor organization of SSIT. The problem of how and even whether engineers should be involved in the development or deployment of military technology has continued to appear in some form throughout the years, although the end of the Cold War changed the context of the discussion. This category goes beyond formal armed combat if one includes technologies that tend to exert state control or monitoring on the public, such as surveillance technologies and the violation of privacy by various technical means. In the first volume of the IEEE Technology & Society Magazine published in 1982, luminaries such as Adm. Bobby R. Inman (ret.) voiced their opinions about Cold War technology [2], and the future trend toward terrorism as a major player in international relations was foreshadowed by articles such as “Technology and terrorism: privatizing public violence,” published in 1991 [3]. Opinions voiced in the Magazine on nuclear technology ranged from Shanebrook's 1999 endorsement of a total global ban on nuclear weapons [4] to Andrews' thorough review of national responses to energy vulnerability, in which he pointed out that France has developed an apparently safe, productive, and economical nuclear-powered energy sector [5]. In 2009, a special section of five articles appeared on the topic of lethal robots and their implications for ethical use in war and peacekeeping operations [6]. And in 2010, the use of information and communication technologies (ICT) in espionage and surveillance was addressed in a special issue on “Überveillance,” defined by authors M. G. Michael and K. Michael as the use of electronic means to track and gather information on an individual, together with the “deliberate integration of an individual's personal data for the continuous tracking and monitoring of identity and location in real time” [7].

2. Energy and Related Technologies and Issues

From the earliest years of the Society, articles on energy topics such as alternative fuels appeared in the pages of the IEEE Technology & Society Magazine. A 1983 article on Brazil's then-novel effort to supplement imported oil with alcohol from sugarcane [8] presaged today's controversial U.S. federal mandate for the ethanol content in motor fuels. The Spring 1984 issue hosted a debate on nuclear power generation between H. M. Gueron, director of New York's Con Edison Nuclear Coal and Fuel Supply division at the time [9], and J. J. MacKenzie, a senior staff scientist with the Union of Concerned Scientists [10]. Long before greenhouse gases became a household phrase bandied about in debates between Presidential candidates, the Magazine published an article examining the need to increase the U.S.'s peak electrical generating capacity, because the increase in average temperature due to increasing atmospheric carbon dioxide would increase the demand for air conditioning [11]. The larger implications of global warming apparently escaped the attention of the authors, focused as they were on the power-generating needs of the state of Minnesota. By 1990, the greenhouse effect was of sufficient concern to show up on the legislative agendas of a number of nations, and although Cruver attributed this to the “explosion of doomsday publicity,” he assessed the implications of such legislation for future energy and policy planning [12]. Several authors in a 2000 special issue on the social implications of systems concepts viewed the Earth's total environment in terms of a complex system [13]. The theme of ISTAS 2009 was the social implications of sustainable development, and this theme was addressed in six articles in the resulting special issue of the IEEE Technology & Society Magazine for Fall 2010. The record of speculation, debate, forecasting, and analysis sampled here shows not only that SSIT has carried out its charter by examining the social implications of energy technology and related issues, but also that it has been a leader and forerunner in trends that later became large-scale public debates.

3. Computing, Telecommunications, and Cyberspace

Fig. 1. BRLESC-II computer built by U.S. Army personnel for use at the Ballistics Research Lab, Aberdeen Proving Grounds between about 1967 and 1978, A. V. Kurian at console. Courtesy of U.S. Army Photos.

In the early years of SSIT, computers were primarily huge mainframes operated by large institutions (Fig. 1). But with the personal computer revolution and especially the explosion of the Internet, SSIT has done its part to chronicle and examine the history, present state, and future trends of the hardware, software, human habits and interactions, and the complex of computer and communications technologies that are typically subsumed under the acronym of ICT.

As we now know, the question of intellectual property has been vastly complicated by the ready availability of peer-to-peer software, high-speed network connections, and legislation passed to protect such rights. In a paper published in 1998, Davis addressed the question of protection of intellectual property in cyberspace [14]. As the Internet grew, so did the volume of papers on all sorts of issues it raised, from the implications of electronic profiling [15] to the threats and promises of facial recognition technology [16]. One of the more forward-looking themes addressed in the pages of the Magazine came in 2005 with a special issue on sustainable pervasive computing [17]. This issue provides an example of how both the critical science and the technological optimism themes cited by Andrews above can be brought together in a single topic. And to show that futuristic themes are not shirked by the IEEE Technology and Society Magazine authors, in 2011 Clarke speculated in an article entitled “Cyborg rights” on the limits and problems that may come as people physically merge with increasingly advanced hardware (implanted chips, sensory enhancements, and so on) [18].

4. Underprivileged Groups

Last but certainly not least, the pages of the IEEE Technology & Society Magazine have hosted articles inspired by the plight of underprivileged peoples, broadly defined. This includes demographic groups such as women and ethnic minorities and those disadvantaged by economic issues, such as residents of developing countries. While the young and the ill are not often formally recognized as underprivileged in the conventional sense, in common with other underprivileged groups they need society's help in order to survive and thrive, in the form of education and healthcare, respectively. An important subset of education is the theme of engineering ethics, a subject of vital interest to many SSIT members and officials since the organization's founding.

In its first year, the Magazine carried an article on ethical issues in decision making [19]. A special 1998 issue on computers and the Internet as used in the K-12 classroom explored these matters in eight focused articles [20]. The roles of ethics and professionalism in the personal enjoyment of engineering were explored by Florman (author of the book The Introspective Engineer) in an interview with the Magazine's managing editor Terri Bookman in 2000 [21]. An entire special issue was devoted to engineering ethics in education the following year, after changes in the U.S. Accreditation Board for Engineering and Technology's policies made it appear that ethics might receive more attention in college engineering curricula [22].

The IEEE Technology & Society Magazine has hosted many articles on the status of women, both as a demographic group and as a minority in the engineering profession. Articles and special issues on themes involving women have on occasion been the source of considerable controversy, even threatening the organization's autonomy at one point [1, p. 9]. In 1999, ISTAS was held for the first time in conjunction with two other IEEE entities: the IEEE Women in Engineering Committee and the IEEE History Center. The resulting special issue that came out in 2000 carried articles as diverse as the history of women in the telegraph industry [23], the challenges of being both a woman and an engineering student [24], and two articles on technology and the sex industry [25], [26].

Engineering education in a global context was the theme of a Fall 2005 special issue of the IEEE Technology and Society Magazine, and education has been the focus of several special issues and ISTAS meetings over the years [27]–[28][29]. The recent development termed “humanitarian engineering” was explored in a special issue only two years ago, in 2010 [30]. Exemplified by the U.S.-based Engineers without Borders organization, these engineers pursue projects, and sometimes careers, based not only on profit and market share, but also on the degree to which they can help people who might not otherwise benefit from their engineering talents.

SECTION III. The Present

Fig. 2. Cow bearing an Australian National Livestock Identification System (NLIS) RFID tag on its ear. The cow's identity is automatically detected as it goes through the drafting gates and the appropriate feed is provided for the cow based on historical data on its milk yields. Courtesy of Adam Trevarthen.

Emerging technologies that will shape the next few years are complex in their makeup, with highly meshed value chains that resemble a process or service more than an individual product [31]. At the heart of this development is convergence: convergence in devices, convergence in applications, convergence in content, and convergence in infrastructure. The current environment is typified by the move toward cloud computing solutions and Web 2.0 social media platforms with ubiquitous access via a myriad of mobile or fixed devices, some of which will be wearable on people and animals (Fig. 2) or embedded in systems (e.g., vehicles and household appliances).

Simultaneous with these changes come the emergence of web services that may or may not require a human operator for decision making in a given business process, a reliance upon data streams from automatic identification devices [e.g., radio-frequency identification (RFID) tags], questions about the accuracy and reliability of location-based services [e.g., using the Global Positioning System (GPS)], and condition monitoring techniques (e.g., using sensors to measure temperature or other physiological data). Most of this new technology will be invisibly located in miniaturized semiconductors, which are set to reach such economies of scale that technology evangelists commonly note that every single living and nonliving thing will come equipped with a chip “on board.”

Fig. 3. Business woman checking in for an interstate trip using an electronic ticket sent to her mobile phone. Her phone also acts as a mobile payment mechanism and has built-in location services features. Courtesy of NXP Semiconductors 2009.

The ultimate vision of a Web of Things and People (WoTaP)—smart homes using smart meters, smart cars using smart roads, smart cities using smart grids—is one where pervasive and embedded systems will play an active role toward sustainability and renewable energy efficiency. The internetworked environment will need to be facilitated by a fourth-generation mobility capability which will enable even higher amounts of bandwidth to the end user as well as seamless communication and coordination by intelligence built into the cloud. Every smart mobile transaction will be validated by a precise location and linked back to a subject (Fig. 3).

In the short term, some of the prominent technologies that will impact society will be autonomous computing systems with built-in ambient intelligence which will amalgamate the power of web services and artificial intelligence (AI) through multiagent systems, robotics, and video surveillance technologies (e.g., even the use of drones) (Fig. 4). These technologies will provide advanced business and security intelligence. While these systems will lead to impressive uses in green initiatives and in making direct connections between people and dwellings, people and artifacts, and even people and animals, they will require end users to give up personal information related to identity, place, and condition to be drawn transparently from smart devices.

Fig. 4. A facial recognition system developed by Argus Solutions in Australia. Increasingly facial recognition systems are being used in surveillance and usually based on video technology. Digital images captured from video or still photographs are compared with other precaptured images. Courtesy of Argus Solutions 2009.

The price of all of this will be that very little remains private any longer. While the opportunities presented by emerging technologies are enormous, with a great number of positive implications for society (for instance, a decrease in the number of traffic accidents and fatalities, a reduction in each household's carbon emission footprint, greater social interconnectedness, etc.), ultimately these gains too will be subject to limitations. Who the designated controller is and what they will do with the acquired data is something we can only speculate about. We return, then, to the perennial question of “who will guard the guards themselves”: Quis custodiet ipsos custodes? [32]

A. Mobile and Pervasive Computing

In our modern world, data collection from many of our most common activities begins from the moment we step out our front door in the morning and continues until we go to sleep at night. In addition to near-continual data collection, we have become a society of people who voluntarily broadcast a great deal of personal information to the world. Vacation photos, major life events, and trivialities ranging from where we are having dinner to our most mundane thoughts all form part of the stream of data through which we electronically share our inner lives. The combination of the data collected about us and the data freely shared by us could form a breathtakingly detailed picture of an individual's life, if it could ever all be collected in one place. Most of us would consider ourselves fortunate that most of this data was historically never correlated and is usually highly anonymized. In general, however, it is becoming easier to correlate and deanonymize data sets.
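A toy record-linkage example shows why anonymization alone can fail once data sets are joined on shared quasi-identifiers, a linkage attack long studied in the privacy literature. All names, fields, and values below are invented for illustration:

```python
# Toy linkage attack: join a "de-identified" data set to a public one on
# shared quasi-identifiers. All data and field names are invented.
health = [  # names removed, but quasi-identifiers remain
    {"zip": "13850", "dob": "1969-07-31", "sex": "F", "diagnosis": "asthma"},
    {"zip": "02139", "dob": "1980-01-02", "sex": "M", "diagnosis": "flu"},
]
voters = [  # public roll: names alongside the same quasi-identifiers
    {"name": "Jane Doe", "zip": "13850", "dob": "1969-07-31", "sex": "F"},
]

def key(record):
    return (record["zip"], record["dob"], record["sex"])

names_by_key = {key(v): v["name"] for v in voters}
for r in health:
    name = names_by_key.get(key(r))
    if name:  # a unique (zip, dob, sex) combination re-identifies the record
        print(f"{name} -> {r['diagnosis']}")
```

Each data set is innocuous on its own; it is the join that is revealing.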

1. Following Jane Doe's Digital Data Trail

Let us consider a hypothetical “highly tracked” individual [33]. Our Jane Doe leaves for work in the morning, and gets in her Chevrolet Impala, which has OnStar service to monitor her car. OnStar will contact emergency services if Jane has an accident, but will also report to the manufacturer any accident or mechanical failure the car's computer is aware of [34]. Jane commutes along a toll road equipped with electronic toll collection (ETC). The electronic toll system tracks where and at what time Jane enters and leaves the toll road (Fig. 5).

Fig. 5. Singapore's Electronic Road Pricing (ERP) system. The ERP uses a dedicated short-range radio communication system to deduct ERP charges from CashCards. These are inserted in the in-vehicle units of vehicles before each journey. Each time vehicles pass through a gantry when the system is in operation, the ERP charges are automatically deducted. Courtesy of Katina Michael 2003.

When she gets to work, she uses a transponder ID card to enter the building she works in (Fig. 6), which logs the time she enters and by what door. She also uses her card to log into the company's network for the morning. Her company's Internet firewall software monitors any websites she visits. At lunch, she eats with colleagues at a local restaurant. When she gets there, she “checks in” using a geolocation application on her phone—for doing so, the restaurant rewards her with a free appetizer [35].

 

Fig. 6. Employee using a contactless smart card to gain entry to her office premises. The card is additionally used to access elevators in the building, rest rooms, and secure store areas, and is the only means of logging into the company intranet. Courtesy of NXP Semiconductors 2009.

She then returns to work for the afternoon, again using her transponder ID badge to enter. After logging back into the network, she posts a review of the restaurant on a restaurant review site, or maybe a social networking site. At the end of the work day, Jane logs out and returns home along the same toll road, stopping to buy groceries at her local supermarket on the way. When she checks out at the supermarket, she uses her customer loyalty card to automatically use the store's coupons on her purchases. The supermarket tracks Jane's purchases so it can alert her when things she buys regularly are on sale.

During Jane's day, her movements were tracked by several different systems. During almost all of the time she spent out of the house, her movements were being followed. But Jane “opted in” to almost all of that tracking; it was her choice as the benefits she received outweighed her perceived costs. The toll collection transponder in her car allows her to spend less time in traffic [36]. She is happy to share her buying habits with various merchants because those merchants reward her for doing so [37]. In this world it is all about building up bonus points and getting rewarded. Sharing her opinions on review and social networking sites lets Jane keep in touch with her friends and lets them know what she is doing.

While many of us might choose to allow ourselves to be monitored for the individual benefits that accrue to us personally, the data being gathered about collective behaviors are much more valuable to business and government agencies. Clarke developed the notion of dataveillance to give a name to the “systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons” in the 1980s [38]. ETC is used by millions of people in many countries. The more people who use it, as opposed to paying tolls at tollbooths, the faster traffic can flow for everyone. Everyone also benefits when ETC allows engineers to better monitor traffic flows and plan highway construction to avoid the busiest times of traffic. Geolocation applications let businesses reward first-time and frequent customers, and they can follow traffic to their business and see what customers do and do not like. Businesses such as grocery stores or drug stores that use customer loyalty cards are able to monitor buying trends to see what is popular and when. Increasingly shoppers are being introduced to the near-field communication (NFC) capability on their third-generation (3G) smartphone (Fig. 7).

Fig. 7. Purchasing grocery items effortlessly by using the near-field communication (NFC) capability on your 3G smartphone. Courtesy of NXP Semiconductors 2009.
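To make the mechanics of this kind of automatic charging and logging concrete, the following sketch, written in Python, shows how a gantry-side tolling system of the general sort described above might deduct a time-of-day charge from a prepaid account and record the passage. It is a minimal illustration under stated assumptions: the account structure, tariff, and identifiers are invented for the sketch and do not describe Singapore's ERP or any other deployed system.

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class Account:
        balance: float
        passages: list = field(default_factory=list)  # the "dataveillance" trail

    def toll_for(now: datetime) -> float:
        """Hypothetical time-of-day tariff (peak pricing), in dollars."""
        if 7 <= now.hour < 10:
            return 3.00   # morning peak
        if 17 <= now.hour < 20:
            return 2.50   # evening peak
        return 1.00       # off-peak

    def on_gantry_read(accounts: dict, transponder_id: str, gantry: str) -> None:
        """Called when the short-range radio reader identifies a passing transponder."""
        now = datetime.now()
        account = accounts[transponder_id]
        account.balance -= toll_for(now)
        account.passages.append((gantry, now))  # every passage is logged: place and time

    accounts = {"TX-42": Account(balance=20.00)}
    on_gantry_read(accounts, "TX-42", "CBD-North")

Note that the toll cannot be charged without identifying the vehicle; it is the resulting passage log, not the payment itself, that turns such systems into subpoenable records of movement.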

Some of these constant monitoring tools are truly personal and are controlled by, and report back only to, the user [39]. For example, there are now several adaptive home thermostat systems that learn a user's temperature preferences over time and allow users to track their energy usage and change settings online. For the health-conscious, “sleep monitoring” systems allow users to track not only the hours of sleep they get per night, but also the percentage of time spent in light sleep versus rapid eye movement (REM) sleep, and their overall “sleep quality” [40].
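As a rough illustration of how such a “learning” thermostat can infer preferences, the Python sketch below maintains a per-hour setpoint as an exponential moving average of the user's manual adjustments. The smoothing factor and the schedule representation are assumptions made for illustration, not the algorithm of any particular product.

    class LearningThermostat:
        """Learns a per-hour temperature setpoint from the user's manual adjustments."""

        def __init__(self, default_temp: float = 20.0, alpha: float = 0.3):
            self.alpha = alpha                   # how quickly new habits override old ones
            self.schedule = [default_temp] * 24  # one learned setpoint per hour of day

        def user_adjusts(self, hour: int, temp: float) -> None:
            # Exponential moving average: recent manual choices count for more.
            self.schedule[hour] = (1 - self.alpha) * self.schedule[hour] + self.alpha * temp

        def setpoint(self, hour: int) -> float:
            return self.schedule[hour]

    t = LearningThermostat()
    for _ in range(5):               # a work week of turning the heat up at 7 a.m.
        t.user_adjusts(7, 23.0)
    print(round(t.setpoint(7), 1))   # drifts from 20.0 toward 23.0 (prints 22.5)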

Fig. 8. Barcodes printed on individual packaged items on pallets. Order information is shown on the forklift's onboard laptop, and the driver scans items being prepared for shipping with a handheld gun, updating inventory records wirelessly. Courtesy of AirData Pty Ltd, Motorola Premier Business Partner, 2009.

Businesses offer, and customers use, various mobile and customer tracking services because both parties value the exchange (Fig. 8). However, serious privacy and legal issues continue to arise [41]. ETC records have been subpoenaed in both criminal and civil cases [42]. Businesses in liquidation have sold their customer databases, violating the privacy agreements they gave their customers while still in business. Geolocation services and social media that show users' locations, or allow them to share where they have been or where they are going, can be used in court cases to confirm or refute alibis [43].


Near-constant monitoring and reporting of our lives will only grow as our society becomes increasingly comfortable sharing more and more personal details (Fig. 9). In addition to the basic human desire to tell others about ourselves, information about our behavior as a group is hugely valuable to both governments and businesses. The benefits to individuals and to society as a whole are great, but the risks to privacy are also significant [44]. More information about group behaviors can let us allocate resources more efficiently, plan better for future growth, and generate less waste. More information about our individual patterns can let us do the same on a smaller scale: waste less fuel heating our homes when no one is present, or better understand our own patterns of activity.


Fig. 9. A five-step overview of how the Wherify location-based service works. The information retrieved by this service included a breadcrumb trail of locations (in table and map form), time and date stamps, latitude and longitude coordinates, the nearest street address, and the location type. Courtesy of Wherify Wireless Location Services, 2009.


B. Social Computing

When we think of human evolution, we often think of biological adaptations that help us survive disease or digest foods. But our social behaviors are also a product of evolution. Being able to read facial expressions and other nonverbal cues is an evolved trait and an essential part of human communication. In essence, we have evolved as a species to communicate face to face. Our ability to understand verbal and nonverbal cues has been essential to our ability to function in groups, and therefore to our survival [45].

The emoticon came very early in the life of electronic communication. This is not surprising, given how necessary facial expressions are for giving context to written words, especially in the casual, humor-filled atmosphere of the Internet's precursors. Many other attempts have been made to add context to the quick, casual writing style of the Internet, mostly with less success. Indeed, the problem of communication devolving from normal conversation into meaningless shouting matches has been around almost as long as electronic communication itself. More recently, the “anonymous problem” (people anonymously harassing others without fear of response or retribution) has come under discussion in online forums and communities. And of course, we have seen the recent tragic consequences of cyberbullying [46]. In general, people will be much crueler to others online than they would ever be in person; many of our evolved social mechanisms depend on seeing and hearing whom we are communicating with.

The question we are faced with is this: given that we now exist and interact in a world that our social instincts did not evolve to handle, how will we adapt to the technology, or, more likely, how will the technology we use to communicate adapt to us? We are already seeing the beginning of that adaptation: more and more social media sites require a “real” identity tied to a valid e-mail address. And everywhere on the Internet, “reputation” is becoming more and more important [177].

Reference sites, such as Wikipedia, control access based on reputation: users gain privileges on the site, such as editing controversial topics or banning other users, according to their contributions to the community, whether writing and editing articles or taking part in community discussions. On social media and review sites, users who are not anonymous have more credibility; there, again, reputation is earned with time and contribution to the community.
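The gist of such reputation-gated access control fits in a few lines of Python. The point values and privilege names below are invented for illustration and do not reproduce Wikipedia's actual rules.

    # Hypothetical thresholds: contribution earns reputation, and reputation
    # unlocks progressively more sensitive actions.
    PRIVILEGES = {
        "edit_article": 0,
        "edit_controversial_topic": 500,
        "ban_user": 2000,
    }

    class Member:
        def __init__(self, name: str):
            self.name = name
            self.reputation = 0

        def contribute(self, points: int) -> None:
            self.reputation += points  # e.g., accepted edits, helpful discussion posts

        def may(self, action: str) -> bool:
            return self.reputation >= PRIVILEGES[action]

    jane = Member("jane")
    jane.contribute(600)
    print(jane.may("edit_controversial_topic"))  # True
    print(jane.may("ban_user"))                  # False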

It is now becoming standard practice for social media of all forms to allow users to control who can contact them, and to make it very easy to block unwanted contact. In the future, these trends will be extended: any social media site with a significant amount of traffic will give users a way to build and maintain a reputation and to control access accordingly. The shift away from anonymity is set to continue, and it is also evident in the way search engine giants like Google are consolidating their privacy statements, from numerous policies down to one. Google states: “When you sign up for a Google Account, we ask you for personal information. We may combine the information you submit under your account with information from other Google services or third parties in order to provide you with a better experience and to improve the quality of our services” [47].

Fig. 10. Wearable high-definition video calling and recording attire. Courtesy of Xybernaut 2002.

When people use technology to socialize, they are often doing so on mobile platforms; the futures of social and mobile computing are therefore inevitably intertwined. The biggest change coming to the shared mobile/social computing space is the completed spread of WiFi and high-density mobile phone networks. There are still huge geographical areas with no way of wirelessly connecting to the Internet, or where the connection is so slow as to be unusable. As high-speed mobile Internet spreads, the extra bandwidth could ease the problems inherent in communicating without being able to see the other person. High-definition (HD) video calling on mobile phones will make person-to-person communications easier and more context rich (Fig. 10). HD video calling and conferencing will make everything from business meetings to long-distance relationships easier by allowing the participants to pick up on unspoken cues.


As more and more of our social interactions move online, the online world will be forced to adapt to our evolved human social behaviors. It will become much more like offline communication, with reputation and community standing deeply important. True anonymity will become harder and harder to come by, as the vast majority of social media will require some proof of identity. For example, this practice is already occurring in countries like South Korea [48].

While we cannot predict all the ways in which our online interactions will become more immersive, we can say for certain that they will. The beauty of these changes is that it will become as easy to maintain or grow a personal relationship on the other side of the world as across town. As countries and regions currently without high-speed data networks come online, they will integrate into a new global community, allowing us all to know one another, with a diverse array of consequences we cannot yet foresee.

C. Wearable Computing

Fig. 11. The prototype GPS Locator for Children, with a built-in pager, 911 request capability, GPS technology, and a key fob to manually lock and unlock the locator. This specific device is no longer being marketed, despite the apparent need in some contexts. Courtesy of Wherify Wireless Location Services, 2003.

According to Siewiorek [49, p. 82], the first wearable device was prototyped in 1961, but it was not until 1991 that the term “wearable computer” was first used, by a research group at Carnegie Mellon University (Pittsburgh, PA). This coincided with the rise of the laptop computer, early models of which were known as “luggables.” Wearable computing can be defined as “anything that can be put on and adds to the user's awareness of his or her environment …mostly this means wearing electronics which have some computational power” [50, p. 2012]. While the term “wearables” generally describes wearable displays and custom computers in the form of necklaces, tiepins, and eyeglasses, the definition has been broadened to incorporate iPads, iPods, personal digital assistants (PDAs), e-wallets, GPS watches (Fig. 11), and other mobile accessories, such as smartphones, smart cards, and electronic passports, that require belt buckles or clip-on satchels attached to conventional clothing [51, p. 330]. The iPlant (Internet implant) is probably not far off either [52].


Wearable computing has reinvented the way we work and go about our day-to-day business, and it is set to bring even greater changes in the foreseeable future [53]. In 2001, it was predicted that highly mobile professionals would take advantage of smart devices to “check messages, finish a presentation, or browse the Web while sitting on the subway or waiting in line at a bank” [54, p. 44]. This vision has indeed been realized, but devices like netbooks are still being lugged around rather than worn in the true sense.

The next phase of wearables will be integrated into our very clothing and accessories; some researchers even point to the body itself being used as an input mechanism. Harrison of Carnegie Mellon's Human–Computer Interaction Institute (HCII), together with Microsoft researchers, produced Skinput, which turns the body that travels everywhere with us into one giant touchpad [55]. These are all exciting innovations, and few would deny the positives that will come from the application of this cutting-edge research. The challenge will be to avoid rushing the technology into the marketplace without commensurate testing of prototypes and due consideration of function creep. Function (or scope) creep occurs when a device or application is used for something other than what it was originally intended for.

Early prototypes of wearable computers throughout the 1980s and 1990s could fairly have been described as outlandish, bizarre, or even weird. For the greater part, wearable computing efforts have focused on head-mounted displays (a visual approach) that unnaturally interfered with human vision and made proximity to others cumbersome [56, p. 171]. But the long-term aim of researchers is to make wearable computing inconspicuous as soon as technical improvements allow (Fig. 12). The end user should look as “normal” as possible [57, p. 177].


Fig. 12. Self-portraits of Mann with wearable computing kit from the 1980s to the 1990s. Prof. Mann started working on his WearComp invention as far back as his high school days in the 1970s. Courtesy of Steve Mann.

New technologies like the “Looxcie” wearable recorder [58] have come a long way since the clunky point-of-view head-mounted recording devices of the 1980s, allowing people to effortlessly record and share their lives as they experience them in different contexts. Mann aptly coined the term sousveillance, from the French sous (below) and veiller (to watch): a type of inverse panopticon. A whole body of literature has emerged around sousveillance, the recording of an activity by a participant in that activity, typically by way of small wearable or portable personal technologies. The glogger.mobi online platform demonstrates the great power of sousveillance. But there are still serious challenges, such as privacy concerns, to be overcome if wearable computing is to become commonplace [59]. Just as Google has created StreetView, can the individual participate in “PersonView” without the consent of neighbors or strangers [7], the public-versus-private-space debate notwithstanding? Connected to privacy is the critical issue of autonomy (and, if we agree with Kant, human dignity): our right to make informed and uncoerced decisions.

While mass-scale commercial production of wearable clothing is still some time away, some even calling it an unfulfilled pledge [60], shirts with simple memory functions have been developed and tested. Sensors will play a big part in the functionality of this smartware, helping to determine environmental context, and undergarments closest to the body will be used to measure bodily functions such as temperature, blood pressure, and heart and pulse rates. For now, however, the aim is to develop ergonomically astute wearable computing that is actually useful to the end user. Head-mounted displays attached with a headband may be practical for miners for occupational health and safety (OH&S) purposes, but they are unattractive to everyday consumers. Displays of the next generation will be mounted on, or concealed within, the eyeglasses themselves [61, p. 48].

Mann [57, p. 31] predicts that wearable computing will one day become so common, so interwoven into everyday clothing-based computing, that “we will no doubt feel naked, confused, and lost without a computer screen hovering in front of our eyes to guide us,” just as we would feel naked without the conventional clothing of today.

1. Wearables in the Medical Domain

Unsurprisingly, wearables have also found a niche market in the medical domain. In the mid-1990s, researchers began to describe a small wearable device that continuously monitored glucose levels so that the right amount of insulin could be calculated for the individual, reducing the incidence of hypoglycemic episodes [62]. The Glucoday [63] and GlucoChip [64] are just two products demonstrating the potential to go beyond wearables toward in vivo techniques in medical monitoring.

Medical wearables can even check and monitor substances in one's blood [65, p. 88]. Today, medical wearable device applications include “monitoring of myocardial ischemia, epileptic seizure detection, drowsiness detection …physical therapy feedback, such as for stroke victim rehabilitation, sleep apnea monitoring, long-term monitoring for circadian rhythm analysis of heart rate variability (HRV)” [66, p. 44].

Some of the current shortcomings of medical wearables are shared with conventional wearables, namely the size and weight of the device. In addition, wearing the devices for long periods can be irritating because of the number of sensors that must be worn for monitoring. The gel applied to reduce contact resistance between the electrode and the skin can also dry up, which is a nuisance. Other obstacles to the widespread diffusion of medical wearables include government regulations and manufacturers' insistence on limited liability in the event that the equipment makes an incorrect diagnosis.

But wearable products have improved considerably over the past ten years. Thanks to commensurate breakthroughs in the miniaturization of computing components, wearable devices are now usually quite small. Consider Toumaz Technology's Digital Plaster invention, known as the Sensium Life Pebble TZ203002 (Fig. 13). The Digital Plaster contains a Sensium silicon chip, powered by a tiny battery, which sends data via a cell phone or a PDA to a central computer database. The Life Pebble enables continuous, auditable acquisition of physiological data without interfering with the patient's activities; it can continuously monitor the electrocardiogram (ECG), heart rate, physical activity, and skin temperature. In an interview with M. G. Michael in 2006, Toumazou noted how the Digital Plaster had been applied to epilepsy control and depression. He said that by monitoring electrical and chemical responses, the onset of either a depressive episode or an epileptic fit could be predicted; once predicted, the nerve could be stimulated to counter the seizure [67]. He added that this truly signified “personal healthcare.”

Fig. 13. Prof. Christofer Toumazou with a patient wearing the “digital plaster”: a tiny electronic device, meant to be embedded in ordinary medical plaster, that includes sensors for monitoring health-related data such as blood pressure, temperature, and glucose levels. Courtesy of Toumaz Technology 2008.


D. Robots and Unmanned Aerial Systems and Vehicles

Fig. 14. Predator drone aircraft: the plane comes in reconnaissance and armed versions, designated RQ-1 and MQ-1, respectively.

Autonomous systems are those that are self-governed. In practice, there are many degrees of autonomy, ranging from the highly constrained and supervised to the unconstrained and intelligent. Some systems are referred to as “semiautonomous” to indicate that the machines are tasked or supervised by a human operator. An unmanned vehicle may be a remotely piloted “dumb” vehicle or an autonomous vehicle (Fig. 14). Robots may be designed to perform repetitive tasks in a highly constrained environment, or to exercise judgment with intelligence and a high level of autonomy in a dynamic and unpredictable one. As technological advances allow greater autonomy, and as applications expand from industry to caregiving and warfighting, society is coming to grips with the present and future of increasingly autonomous systems in our homes, workplaces, and battlefields.


Robot ethics, particularly with respect to autonomous weapons systems, has received increasing attention in the last few years [68]. While some call for an outright halt to the development of such technology [69], others seek to shape the technology with ethical and moral implications in mind [6], [70]–[73]. Driving robotic weapons development underground, or refusing to engage in dialog over the ethical issues, would deny ethicists the opportunity to participate in shaping the design and use of such weapons. Arkin [6] and Operto [74], among others, argue that engineers must not shy away from these ethical challenges. Furthermore, the technological cat is out of the bag: “Autonomy is subtle in its development—it is occurring in a step-by-step process, rather than through the creation of a disruptive invention. It is far less likely that we will have a sudden development of a ‘positronic brain’ or its equivalent, but rather a continual and gradual relinquishment of authority to machines through the constant progress of science, as we have already seen in automated trains, elevators, and numerous other examples, that have vanished into the background noise of civilization. Autonomy is already here by some definitions” [70].

The development and deployment of unmanned aerial vehicles and other autonomous or semiautonomous systems has outpaced the analysis of the social implications and ethics of their design and use [70], [75]. Sullivan argues that the evolution of unmanned vehicles for military deployment should not be confused with the more general trend of increasing autonomy in military applications [75]. The use of robots often provides a tactical advantage, with sensors, data processing, and physical characteristics that outperform humans. Robots can act without emotion, bias, or self-preservation influencing their judgment, which may be either a liability or an advantage. Risks of deploying robots in the military, the healthcare industry, and elsewhere include misplaced trust in autonomous systems (too little, or too much) and the diffusion of blame, or moral buffering [6], [72].

For critical applications in the healthcare domain, and for lethal applications in weaponry, the emotional and physical distance of operating a remote system (e.g., drone strikes via a video-game-style interface) may negatively influence the moral decision making of the human operator or supervisor, while also providing some emotional protection against post-traumatic stress disorder [71], [72]. Human–computer interfaces can promote ethical choices by the human operator through thoughtful or model-based design, as suggested by Cummings [71] and Asaro [72].

As for the ethical behavior of the autonomous system itself, Arkin proposes that robot soldiers could be more humane than humans if technologically constrained to the laws of war and rules of engagement, which they could follow without the distortions of emotion, bias, or a sense of self-preservation [6], [70]. Asaro argues that such laws are not, in fact, objective and static, but are meant to be interpreted by humans case by case, and therefore could not be implemented in an automated system [72]. More broadly, Operto [74] agrees that a robot (in any application) can only act within the ethics incorporated into its laws, but notes that a learning robot, in particular, may not behave as its designers anticipate.

Fig. 15. Kotaro, a humanoid robot created at the University of Tokyo (Tokyo, Japan), presented at the University of Arts and Industrial Design Linz (Linz, Austria) during the Ars Electronica Festival 2008. Courtesy of Manfred Werner-Tsui.

Robot ethics is just one part of the landscape of the social implications of autonomous systems. The field of human–robot interaction explores how robot interfaces and socially adaptive robots influence the social acceptance, usability, and safety of robots [76] (Fig. 15). For example, robots used for social assistance and care, such as for the elderly and small children, introduce a host of new questions. Risks of developing an unhealthy attachment, or of losing human social contact, are among the concerns raised by Sharkey and Sharkey [77]. Interface design can influence these and other risks of socially assistive robots, such as a dangerous misperception of the robot's capabilities or a compromise of privacy [78].


Autonomous and unmanned systems raise related challenges. Clear accountability and the enforcement of morality are two common themes in the ethical design and deployment of such systems. These themes are not unique to autonomous and unmanned systems, but perhaps the science fiction view of robots run amok sharpens the question: how can we engineer a future in which we benefit from these technologies while maintaining our humanity?


SECTION IV. The Future

Great strides are being taken in the field of biomedical engineering: the application of engineering principles and techniques to the medical field [79]. New technologies, such as prospective applications of nanotechnology, microcircuitry (e.g., implantables), and bionics, will heal and give hope to many who suffer from life-debilitating and life-threatening diseases [80]. The lame will walk again. The blind will see, just as the deaf have heard. The dumb will sing. Even bionic tongues are on the drawing board. Hearts, kidneys, and other organs will be built anew. The fundamental point is that society at large should be able to distinguish between positive and negative applications of technological advances before we diffuse and integrate such innovations into our day-to-day existence.

The Bionics Institute [81], for instance, is future-focused on the possibilities of bionic hearing, bionic vision, and neurobionics, stating: “Medical bionics is not just a new frontier of medical science, it is revolutionizing what is and isn't possible. Where once there was deafness, there is now the bionic ear. And where there was blindness, there may be a bionic eye.” The Institute reaffirms its commitment to continuing innovative research and leading the way on the proposed “world-changing revolution.”

A. Cochlear Implants—Helping the Deaf to Hear

Fig. 16. Cochlear's Nucleus Freedom implant with Contour Advance electrode which is impervious to magnetic fields up to 1.5 Tesla. Courtesy of Cochlear Australia.

In 2000, more than 32 000 people worldwide already had cochlear implants [82], thanks to the global efforts of people such as Australian Professor Graeme Clark, the founder of Cochlear, Inc. [83]. Clark performed his first implant surgery on Rod Saunders's left ear at the Royal Eye and Ear Hospital in Melbourne, Australia, on August 1, 1978, when “he placed a box of electronics under Saunders's skin and a bundle of electrodes in his inner ear” [84]. By 2006, the number had grown to about 77 500 for the Nucleus implant (Fig. 16) alone, which held about 70% of the market [85]. Today, there are over 110 000 cochlear implant recipients, with about 30 000 added annually, and their personal stories are testament enough to the ways in which new technologies can change lives dramatically for the better [86]. Cochlear implants can restore hearing to people who have severe hearing loss, a form of diagnosed deafness. Unlike a standard hearing aid, which works like an amplifier, the cochlear implant acts like a microphone, changing sound into electronic signals. The signals are sent to the microchip implant via radio frequency (RF), stimulating nerve fibers in the inner ear; the brain then interprets the signals transmitted via the nerves as sound.
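In signal-processing terms, the chain just described can be sketched compactly. The continuous interleaved sampling (CIS) strategy reported for many processors splits sound into frequency bands and lets each band's envelope set the stimulation level of one electrode; the Python sketch below illustrates the idea, with band edges, sampling rate, and compression chosen purely for illustration rather than taken from any commercial processor.

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    FS = 16000  # assumed audio sampling rate, Hz
    # Eight logarithmically spaced analysis bands (illustrative), 200 Hz to 7 kHz.
    EDGES = np.logspace(np.log10(200), np.log10(7000), 9)

    def electrode_envelopes(sound: np.ndarray) -> np.ndarray:
        """Map a sound waveform to per-electrode stimulation envelopes (CIS-style)."""
        rows = []
        for lo, hi in zip(EDGES[:-1], EDGES[1:]):
            b, a = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="band")
            band = filtfilt(b, a, sound)   # isolate one frequency band
            env = np.abs(hilbert(band))    # slowly varying envelope of that band
            rows.append(np.sqrt(env))      # crude loudness compression
        return np.array(rows)              # one row of stimulation levels per electrode

    t = np.arange(FS) / FS
    sweep = np.sin(2 * np.pi * (300 + 2000 * t) * t)  # test tone sweeping upward
    print(electrode_envelopes(sweep).shape)           # (8, 16000)

As the test tone sweeps upward, energy moves from the low-frequency rows (mapped to apical electrodes) to the high-frequency rows, mirroring the place coding of the healthy cochlea.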


Today, cochlear implants (also commonly known as bionic ears) are used to overcome deafness; tomorrow, they may be open to the wider public as a performance-enhancing technique [87, pp. 10–11]. Audiologist Steve Otto of the Auditory Brainstem Implant Project at the House Ear Institute (Los Angeles, CA) predicts that one day “implantable devices [will] interface microscopically with parts of the normal system that are still physiologically functional” [88]. He is quoted as saying that this may equate to “ESP for everyone.” Otto's prediction that implants will one day be used by persons who do not require them for remedial purposes has been supported by numerous other high-profile scientists. A major question is whether this is the ultimate trajectory of these technologies.

For Christofer Toumazou, however, Executive Director of the Institute of Biomedical Engineering, Imperial College London (London, U.K.), there is a clear distinction between repairing human functions and creating a “Superman”: “trying to give someone that can hear super hearing is not fine.” For Toumazou, the basic ethical paradigm should be that we hope to repair the human, not to recreate the human [67].

B. Retina Implants—On a Mission to Help the Blind to See

Fig. 17. Visual cortical implant designed by Prof. Mohamad Sawan, a researcher at Polystim Neurotechnologies Laboratory at the Ecole Polytechnique de Montreal (Montreal, QC, Canada). The basic principle of Prof. Sawan's technology consists of stimulating the visual cortex by implanting a silicon microchip on a network of electrodes, made of biocompatible materials, wherein each electrode injects a stimulating electrical current in order to provoke a series of luminous points to appear (an array of pixels) in the field of vision of the blind person. This system is composed of two distinct parts: the implant and an external controller. Courtesy of Mohamad Sawan 2009, made available under Creative Commons License.

The hope is that retina implants will one day be as successful as cochlear implants [89]. Just as cochlear implants cannot be used by persons suffering from complete deafness, retina implants are not a solution for the totally blind, but rather for those suffering from age-related macular degeneration (AMD) and retinitis pigmentosa (RP). Retina implants have brought together medical researchers, electronics specialists, and software designers to develop a system that can be implanted inside the eye [90]. A typical retina implant procedure is as follows: “[s]urgeons make a pinpoint opening in the retina to inject fluid in order to lift a portion of the retina from the back of the eye, creating a pocket to accommodate the chip. The retina is resealed over the chip, and doctors inject air into the middle of the eye to force the retina back over the device and close the incisions” [91] (Fig. 17).


Brothers Alan and Vincent Chow, one an engineer, the other an ophthalmologist, developed the artificial silicon retina (ASR) and founded Optobionics Corporation in 1990. This was a marriage between biology and engineering: “In landmark surgeries at the University of Illinois at Chicago Medical Center …the first artificial retinas made from silicon chips were implanted in the eyes of two blind patients who have lost almost all of their vision because of retinal disease.” In 1993, Branwyn [92, p. 3] reported that a team at the National Institutes of Health (NIH), led by Dr. Hambrecht, implanted a 38-electrode array into the brain of a blind woman, who reportedly saw simple light patterns and was able to make out crude letters. The following year, the same procedure was conducted by another group on a blind man, who reported seeing a black dot with a yellow ring around it. Rizzo of Harvard Medical School's Massachusetts Eye and Ear Infirmary (Boston, MA) has cautioned that it is better to play down the possibilities of the retina implant so as not to raise false hopes. He has said that the field is dealing with “science fiction stuff” and that there are no long-term guarantees the technology will ever fully restore sight, although significant progress is being made by a number of research institutes [93, p. 5].

Among these pioneers are researchers at The Johns Hopkins University Medical Center (Baltimore, MD). Brooks [94, p. 4] describes how the retina chip developed by the medical center will work: “a kind of miniature digital camera…is placed on the surface of the retina. The camera relays information about the light that hits it to a microchip implanted nearby. This chip then delivers a signal that is fed back to the retina, giving it a big kick that stimulates it into action. Then, as normal, a signal goes down the optic nerve and sight is at least partially restored.” In 2009, at the age of 56, Barbara Campbell had an array of electrodes implanted in each eye [95], and while her sight is nowhere near fully restored, she is able to make out shapes and see shades of light and dark. Experts believe this approach remains more realistic for restoring sight to those suffering from particular types of blindness than stem cell therapy, gene therapy, or eye transplants [96], where the risks still outweigh the advantages.

C. Tapping Into the Heart and Brain

Fig. 18. An artificial pacemaker from St. Jude Medical (St. Paul, MN), with electrode, 2007. Courtesy of Steven Fruitsmaak.

If it was possible as far back as 1958 to successfully implant a two-transistor device the size of an ice hockey puck in the heart of a 43-year-old man [97], the things that will become possible by 2020 are constrained as much by the imagination as by technological limitations. Heart pacemakers (Fig. 18) are still being refined today, but for the greater part, researchers are turning their attention to the possibilities of brain pacemakers. In the foreseeable future, brain implants may help sufferers of Parkinson's disease, paralysis, nervous-system problems, and speech impairment, and even cancer patients. The research is still in its formative years, and the obstacles are great because of the complexity of the brain; but scientists are hopeful of major breakthroughs in the next 20 years.


The brain pacemaker endeavors bring together people from a variety of disciplines, headed mainly by neurosurgeons. Using brain implants, electrical pulses can be sent directly to nerves via electrodes; the signals can be used to interrupt the incoherent messages to nerves that cause uncontrollable movements or tremors. By tapping into the right nerves in the brain, particular reactions can be achieved. Using a technique that was discovered almost accidentally in France in 1987, the following extract describes the procedure of “tapping into” the brain: “Rezai and a team of functional neurosurgeons, neurologists and nurses at the Cleveland Clinic Foundation in Ohio had spent the next few hours electronically eavesdropping on single cells in Joan's brain attempting to pinpoint the precise trouble spot that caused a persistent, uncontrollable tremor in her right hand. Once confident they had found the spot, the doctors had guided the electrode itself deep into her brain, into a small duchy of nerve cells within the thalamus. The hope was that when sent an electrical current to the electrode, in a technique known as deep-brain stimulation, her tremor would diminish, and perhaps disappear altogether” [98]. Companies such as Medtronic, Inc. (Minneapolis, MN) now specialize in brain pacemakers [98]; Medtronic's Activa implant has been designed specifically for sufferers of Parkinson's disease [93].

More recently, there has been some success in ameliorating epileptic attacks through closed-loop technology, also known as smart stimulation. The implanted devices detect the onset of epileptiform activity and stimulate on demand. Because the implant is not stimulating continuously in anticipation of an attack, its battery lasts longer, and the adverse effects of removing and installing new implants frequently are forgone [99]. Similarly, it has been suggested that deep brain stimulation, in which physicians implant electrodes in the brain and an electrical pacemaker in the patient's clavicle for Parkinson's disease, may be used to help severely depressed persons [100].
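A toy version of such demand-driven stimulation is sketched below, using a line-length detector, one of the simple onset features described in the seizure-detection literature. The window size, threshold, and test signal are arbitrary assumptions, not clinical parameters.

    import numpy as np

    WINDOW = 256       # samples per analysis window (assumed)
    THRESHOLD = 40.0   # line-length value that triggers stimulation (assumed)

    def line_length(x: np.ndarray) -> float:
        """Sum of absolute sample-to-sample differences; rises sharply at seizure onset."""
        return float(np.sum(np.abs(np.diff(x))))

    def closed_loop(eeg: np.ndarray) -> list:
        """Return the window start indices at which the device would stimulate."""
        events = []
        for i in range(0, len(eeg) - WINDOW + 1, WINDOW):
            if line_length(eeg[i:i + WINDOW]) > THRESHOLD:
                events.append(i)  # stimulate on demand only, conserving the battery
        return events

    rng = np.random.default_rng(0)
    quiet = 0.1 * rng.standard_normal(4 * WINDOW)        # baseline activity
    burst = np.sin(np.linspace(0, 120 * np.pi, WINDOW))  # large rhythmic discharge
    print(closed_loop(np.concatenate([quiet, burst])))   # fires only at the burst: [1024]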

Currently, the technology is being used to treat thousands of people who are severely depressed or suffering from obsessive–compulsive disorder (OCD) and who have been unable to respond to other forms of treatment, such as cognitive behavioral therapy (CBT) [101]. An estimated 10% of people suffering from depression do not respond to conventional methods. Although hard figures are difficult to obtain, several thousand depressed persons worldwide have had brain pacemakers installed whose software can be updated wirelessly and remotely. The trials build on decades of research by Prof. Helen Mayberg of Emory University School of Medicine (Atlanta, GA), who first began studying the use of subcallosal cingulate gyrus deep brain stimulation (SCG DBS) for depression in 1990.

In her research, Mayberg has used a device no larger than a matchbox, with a battery-powered generator that sits in the chest and produces electric currents. The currents are sent to an area deep in the brain via tiny wires channeled under the skin on either side of the neck. Surprisingly, installing this type of implant requires only local anesthetic and is an outpatient procedure. In 2005, Mayberg told a meeting at the Science Media Centre in London: “This is a very new way to think about the nature of depression …We are not just exciting the brain, we are using electricity to retune and remodulate…We can interrupt or switch off an abnormally functioning circuit” [102].

Ongoing trials continue to show promising results. The outcome of a 20-patient clinical trial of persons with depression treated with SCG DBS, published in 2011, showed that: “At 1 year, 11 (55%) responded to surgery with a greater than 50% reduction in 17-item Hamilton Depression Scale scores. Seven patients (35%) achieved or were within 1 point of achieving remission (scores < 8). Of note, patients who responded to surgery had a significant improvement in mood, anxiety, sleep, and somatic complaints related to the disease. Also important was the safety of the procedure, with no serious permanent adverse effects or changes in neuropsychological profile recorded” [103].

Despite early signs that these procedures may offer long-term solutions for hundreds of thousands of people, some research scientists believe that tapping into the human brain is a long shot. The brain is commonly understood to be “wetware,” and plugging hardware into this wetware would seem to be a type mismatch, at least according to Steve Potter, a senior research fellow in biology at the California Institute of Technology's Biological Imaging Center (Pasadena, CA). Potter is instead pursuing the cranial route as a “digital gateway to the brain” [88]. Others believe it is impossible to figure out exactly what all the millions of neurons in the brain actually do. Whether or not we eventually succeed in “reverse engineering” the human brain, the topic of implants for both therapeutic and enhancement purposes has aroused significant controversy in the past, and promises to do so even more in the future.

D. Attempting to Overcome Paralysis

In more speculative research, surgeons believe that brain implants may be a solution for persons suffering from paralysis, such as spinal cord damage. In these instances, the nerves in the legs still theoretically “work”; they simply cannot make contact with the brain, which controls their movement. If signals could somehow be sent to the brain, bypassing the lesion point, paralyzed persons could conceivably regain at least part of their ability to move [104]. In 2000, Reuters [105] reported that a paralyzed Frenchman, Marc Merger, “took his first steps in 10 years after a revolutionary operation to restore nerve functions using a microchip implant…Merger walks by pressing buttons on a walking frame which acts as a remote control for the chip, sending impulses through fine wires to stimulate leg muscles…” It should be noted, however, that the system works only for paraplegics whose muscles remain alive despite damage to the nerves. There are also promising devices, like the Bion, that may one day be able to control muscle movement using RF commands [106]. Brooks [94] reports that researchers at the University of Illinois at Chicago (Chicago, IL) have “invented a microcomputer system that sends pulses to a patient's legs, causing the muscles to contract. Using a walker for balance, people paralyzed from the waist down can stand up from a sitting position and walk short distances…Another team, based in Europe…enabled a paraplegic to walk using a chip connected to fine wires in his legs.” These techniques are known as functional neuromuscular stimulation systems [107]. In the case of Rob Summers, who became a paraplegic after an accident, doctors implanted an epidural stimulator and electrodes into his spinal cord. “The currents mimic those normally sent by the brain to initiate movement” [108].

Others working to help paraplegics walk again have invested time in military technology like exoskeletons [109], originally meant to aid soldiers in lifting greater weights and to protect them during battle. Ekso Bionics (Berkeley, CA), formerly Berkeley Bionics, has been conducting trials of an electronic suit in the United States since 2010. The current Ekso model is slated to become fully independent and powered by artificial intelligence in 2012. The Ekso “provides nearly four hours of battery power to its electronic legs, which replicate walking by bending the user's knees and lifting their legs with what the company claims is the most natural gait available today” [110]. This is yet another example of military technology being commercialized toward a health solution [111].

E. Granting a Voice to the Speech Impaired

Speech-impairment microchip implants work differently from cochlear and retina implants. Whereas the latter two restore hearing and sight, implants for speech impairment do not restore the voice; instead, they create an outlet for communication, possibly with the aid of a voice synthesizer. At Emory University, neurosurgeon Roy E. Bakay and neuroscientist Phillip R. Kennedy were responsible for critical breakthroughs early in the research. In 1998, Versweyveld [112] reported two successful implants of a neurotrophic electrode into the brains of a woman and a man suffering from amyotrophic lateral sclerosis (ALS) and brainstem stroke, respectively. In a remarkable process, Bakay and Kennedy's device uses the patient's brain processes (thoughts, if you will) to move a cursor on a computer screen: “The computer chip is directly connected with the cortical nerve cells…The neural signals are transmitted to a receiver and connected to the computer in order to drive the cursor” [112]. This procedure has major implications for brain–computer interfaces (BCIs), especially bionics. Bakay predicted that by 2010 prosthetic devices would grant immobile patients the ability to turn on the TV just by thinking about it, and that by 2030 they would grant severely disabled persons the ability to walk independently [112], [113].
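In present-day terms, driving a cursor from cortical signals is a decoding problem, and its simplest textbook form is a linear map from neural firing rates to cursor velocity. The Python sketch below illustrates that idea only; the weights and firing rates are purely illustrative, since a real interface calibrates both to the individual patient.

    import numpy as np

    # Illustrative decoder: each column maps one recorded neuron's firing rate
    # to a contribution in (x, y) cursor velocity.
    W = np.array([[0.8, -0.2,  0.1],
                  [0.1,  0.7, -0.5]])
    BASELINE = np.array([10.0, 12.0, 8.0])  # assumed resting rates, spikes/s

    def decode_velocity(rates: np.ndarray) -> np.ndarray:
        """Linear decoder: deviations from baseline firing drive the cursor."""
        return W @ (rates - BASELINE)

    cursor = np.zeros(2)
    for rates in ([12.0, 12.0, 8.0], [14.0, 13.0, 8.0], [10.0, 18.0, 9.0]):
        cursor += 0.1 * decode_velocity(np.array(rates))  # integrate 100-ms bins
        print(cursor)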

F. Biochips for Diagnosis and Smart Pills for Drug Delivery

It is not unlikely that biochips will be implanted in people at birth in the not-too-distant future. “They will make individual patients aware of any pre-disposition to susceptibility” [114]. That is, biochips will be used for point-of-care diagnostics and for the identification of needed drugs, even to detect pandemic viruses and biothreats for national security purposes [115]. Biosensors “represent the technological counterpart of our sense organs, coupling the recognition by a biological recognition element with a chemical or physical transducer, transferring the signal to the electrical domain” [116]. Types of biosensors include enzymes, antibodies, receptors, nucleic acids, cells (using a biochip configuration), biomimetic sequences of RNA (ribonucleic acid) or DNA (deoxyribonucleic acid), and molecularly imprinted polymers (MIPs). Biochips, on the other hand, “automate highly repetitive laboratory tasks by replacing cumbersome equipment with miniaturized, microfluidic assay chemistries combined with ultrasensitive detection methodologies. They achieve this at significantly lower costs per assay than traditional methods—and in a significantly smaller amount of space. At present, applications are primarily focused on the analysis of genetic material for defects or sequence variations” [117].

With respect to the treatment of illness, drug delivery will not require patients to swallow pills or take routine injections; instead, chemicals will be stored on a microprocessor and released as prescribed. The idea, known as “pharmacy-on-a-chip,” originated with scientists at the Massachusetts Institute of Technology (MIT, Cambridge, MA) in 1999 [118]. The following extract is from The Lab [119]: “Doctors prescribing complicated courses of drugs may soon be able to implant microchips into patients to deliver timed drug doses directly into their bodies.”

Microchips being developed at Ohio State University (OSU, Columbus, OH) can be coated with chemical substances such as pain medication, insulin, different treatments for heart disease, or gene therapies, allowing physicians to work at a more detailed level [119]. The breakthroughs have major implications for diabetics, especially those who require insulin at regular intervals throughout the day. Researchers at the University of Delaware (Newark, DE) are working on “smart” implantable insulin pumps that may relieve people with Type 1 diabetes [120]. Delivery would be based on a mathematical model stored on a microchip, working in conjunction with glucose sensors that instruct the chip when to release the insulin. The goal is for the model to simulate the activity of the pancreas so that the right dosage is delivered at the right time.

Fig. 19. The VeriChip microchip, the first microchip implant to be cleared by the U.S. Food and Drug Administration (FDA) for humans, is a passive microchip that contains a 16-digit number, which can be used to retrieve critical medical information on a patient from a secure online database. The company that owns the VeriChip technology is developing a microscopic glucose sensor to put on the end of the chip to eliminate a diabetic's need to draw blood to get a blood glucose reading. Courtesy of PositiveID Corporation.

Beyond insulin pumps, we are now nearing a time when automated closed-loop insulin detection (Fig. 19) and delivery will become a tangible treatment option, one that may serve as a temporary cure for Type 1 diabetes until stem cell therapy becomes available. “Closed-loop insulin delivery may revolutionize not only the way diabetes is managed but also patients' perceptions of living with diabetes, by reducing the burden on patients and caregivers, and their fears of complications related to diabetes, including those associated with low and high glucose levels” [121]. It is only a matter of time before these lab-centric results are replicated under real-life conditions in sufferers of Type 1 diabetes.
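A toy version of the control loop conveys the idea: glucose is sensed, compared with a target, and the infusion rate adjusted accordingly. Deployed artificial-pancreas algorithms are model predictive and far more careful about safety; the proportional-derivative rule and every constant below are illustrative assumptions only, not dosing guidance.

    TARGET = 100.0        # target blood glucose, mg/dL (assumed)
    KP, KD = 0.02, 0.005  # proportional and derivative gains (illustrative)
    MAX_RATE = 3.0        # safety cap on infusion, units/h (assumed)

    def insulin_rate(glucose_now: float, glucose_prev: float, dt_hours: float) -> float:
        """Demand-driven dosing: respond to high glucose and to how fast it is rising."""
        error = glucose_now - TARGET
        trend = (glucose_now - glucose_prev) / dt_hours
        rate = KP * error + KD * trend
        return min(max(rate, 0.0), MAX_RATE)  # never negative, never above the cap

    # Simulated sensor readings taken every 5 minutes (1/12 hour):
    readings = [100, 120, 150, 180, 170, 140]
    for prev, now in zip(readings, readings[1:]):
        print(f"glucose {now:3d} mg/dL -> infuse {insulin_rate(now, prev, 1 / 12):.2f} U/h")

The essential property, as the passage above suggests, is that dosing becomes a function of measurement rather than of a fixed schedule, which is precisely what reduces the burden on the patient.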


G. To Implant or Not to Implant, That Is the Question

There are potentially 500 000 hearing-impaired persons who could benefit from cochlear implants [122], but not every deaf person wants one [123]. “Some deaf activists…are critical of parents who subject children to such surgery [cochlear implants] because, as one charged, the prosthesis imparts ‘the non-healthy self-concept of having had something wrong with one's body’ rather than the ‘healthy self-concept of [being] a proud Deaf’” [124]. Assistant Professor Scott Bally of Audiology at Gallaudet University (Washington, DC) has said, “Many deaf people feel as though deafness is not a handicap. They are culturally deaf individuals who have successfully adapted themselves to being deaf and feel as though things like cochlear implants would take them out of their deaf culture, a culture which provides a significant degree of support” [92]. Putting this delicate debate aside, some delineation can be made here between implants used to treat an ailment or disability (e.g., giving sight to the blind and hearing to the deaf) and implants that may be used to enhance human function (e.g., memory). There are some citizens, like Amal Graafstra of the United States [125], who are getting chip implants for convenience-oriented social living solutions that would herald a world of keyless entry everywhere (Fig. 20). And there are other citizens who are concerned about the direction of the human species, as credible scientists predict fully functional neural implants. “[Q]uestions are raised as to how society as a whole will relate to people walking around with plugs and wires sprouting out of their heads. And who will decide which segments of the society become the wire-heads” [92]?


Fig. 20. Amal Graafstra demonstrating an RFID-operated door latch application he developed. Over the RFID tag site on his left hand is a single steristrip that remained after implantation for a few days. His right hand is holding the door latch.


SECTION V. Überveillance and Function Creep

Section IV focused on implants that are essentially “orthopedic replacements”: corrective in nature, required to repair a function that is either lying dormant or has failed altogether. Implants of the future, however, will attempt to add new “functionality” to native human capabilities, either through extensions or additions. Globally acclaimed scientists have pondered the ultimate trajectory of microchip implants [126]. The literature is admittedly mixed in its viewpoints on what will and will not be possible in the future [127].

For those of us working in the domain of implantables for medical and nonmedical applications, the message is loud and clear: implantables will be the next big thing. At first, it will be “hip to get a chip.” The extreme novelty of the microchip implant will mean that early adopters race to see how far they can push the limits of the new technology. Convenience solutions will abound [128]. Implantees will not be able to get enough of the new product, and the benefits of the technology will be touted to consumers in a myriad of ways, although these perceived benefits will not always be realized. The technology will probably first be tested where resistance from the community at large is least effective: on prison inmates [129], then on those suffering from dementia. These incremental steps in pilot trials and deployment are fraught with moral consequences. Prisoners cannot opt out when jails adopt tracking technology, and those suffering from cognitive disorders have not provided, and could not provide, their consent. From there, it will conceivably not take long for the technology to be used on the elderly, on children, and on those suffering from clinical depression.

The functionality of the implants will range from passive ID-only to active multiapplication; most invasive will be the medical devices that can, upon request or algorithmic reasoning, release drugs or electrically stimulate the body for mental and physical stability. There will also be a segment of the consumer and business markets that adopts the technology for no clear reason and without much thought, save for the fact that the technology is new and seems to be the way advanced societies are heading. This segment will probably not be overly concerned with any discernible abridgment of their human rights, or with the fine-print “terms and conditions” they have signed, but will take an implant on the promise of, for example, greater connectivity to the Internet. These consumers will thrive on ambient intelligence, context-aware pervasive applications, and augmented reality: ubiquity in every sense.

But it is certain that the new technology will also have consequences far greater than we can presently envision. Questions about the neutrality of technology are immaterial in this new “plugged-in” order of existence. For Brin [130, p. 334], the question ultimately has to do with the choice between privacy and freedom. In his words, “[t]his is one of the most vile dichotomies of all. And yet, in struggling to maintain some beloved fantasies about the former, we might willingly, even eagerly, cast the latter away.” And thus there are two possibilities, just as Brin [130] writes in his insightful book The Transparent Society, of “the tale of two cities.” Either implants embedded in humans, together with their associated infrastructure, will create a utopia with built-in intelligence for everything and everyone in every place, or they will create a dystopia that is destructive, diminishing one's freedom of choice, individuality, and, finally, that indefinable essence at the core of feeling “human.” A third possibility, a middle way between these two alternatives, seems unlikely, except for the “off the grid” dissenter.

In Section V-A, we portray some of the attractions people may feel that will draw them into the future world of implanted technologies. In Section V-B, we portray some of the problems associated with implanting technology under the skin that would drive people away from opting in to such a future.

A. The Positive Possibilities

Bearing a unique implant will make the individual feel special, because he or she bears a unique ID. Each person will have one implant coordinating hundreds of smaller nanodevices, each nanodevice with the capacity to act on its own accord. The philosophy espoused behind taking an implant will be one of protection: “I bear an implant and I have nothing to hide.” It will feel safe to have an implant because emergency services, for example, will be able to respond rapidly to calls for help, or to unforeseen events that automatically log problems with one's health.

Fewer errors are also likely for those who have an implant, especially within financial systems. Businesses will experience a rise in productivity as they come to understand precisely how their operations run, to the nearest minute, and companies will be able to introduce significant efficiencies. Losses in back-end operations, such as product shrinkage, will diminish as goods are followed down the supply chain from their source to their destination customer, through the distribution center and retailer.

It will take some years for the infrastructure supporting implants to grow and thrive with a substantial consumer base. The function creep will not become apparent until well after the early majority have adopted implants and have downloaded and used a number of core applications to do with health, banking, and transport, all of which will be interlinked. New innovations will allow hybrid devices and supplementary infrastructure to grow so powerful that living without automated tracking, location finding, and condition monitoring will be almost impossible.

B. The Existential Risks

It will take some years for the negative fallout from microchip implants to be exposed. At first, only the victims of the fallout will speak out, through formal exception reports on government agency websites. The technical problems associated with implants will pertain to maintenance, updates, viruses, cloning, hacking, radiation shielding, and onboard battery life. But the greater problems will be the impact on the physiology and mental health of the individual: new manifestations of paranoia and severe depression will lead people to continually seek reassurance about their implant's functionality. Implant security, virus detection, and an error-free personal database will be among the biggest issues facing implantees. Despite this, those who believe in the implant singularity (the piece of embedded technology that will give each person ubiquitous access to the Internet) will continue to stack up points and rewards and add to their social networks, choosing to ignore the warnings about the ultimate technological trajectory of mind control and geoslavery [131]. It will have little to do with survival of the fittest at this point, although most people will buy into the notion of an evolutionary path toward the Homo Electricus [132]: a transhumanist vision [133] that we can do away with the body and become one with the Machine, one with the Cosmos; a “nuts and bolts” Nirvana where one's manufactured individual consciousness connects with the advanced consciousness evolving from the system as a whole. In this instance, it will be the ecstatic experience of being drawn ever deeper into the electric field of the “Network.”

Some of the more advanced implants will be able to capture and validate location-based data alongside recordings (visual and audio capture). The ability to conduct überveillance via the implant will be linked to a type of blackbox recorder, as in an airplane cockpit. Only in this case the cockpit will be the body, and the recorder will be embedded just beneath the translucent layer of the skin, to be used for memory recollection and dispute resolution. With the record outwardly ensuring that people tell the full story at all times, there will be no lies or claims to poor memory. Überveillance is an above-and-beyond, exaggerated, omnipresent 24/7 electronic surveillance (Fig. 21). It is a surveillance that is not only “always on” but “always with you”; it is ubiquitous because the technology that facilitates it, in its ultimate implementation, is embedded within the human body. The problem with this kind of bodily invasive surveillance is that omnipresence in the “material” world will not always equate with omniscience, hence the real concern for misinformation, misinterpretation, and information manipulation [7]. While it might seem the perfect technology to aid real-time forensic profiling and criminalization, it will be open to abuse, just like any other technique, and more so because of the preconception that it is infallible.


Fig. 21. The überveillance triquetra as the intersection of surveillance, dataveillance, and sousveillance. Courtesy of Alexander Hayes.

 

SECTION VI. Technology Roadmapping

According to Andrews cited in [1], a second intellectual current within the IEEE SSIT has begun to emerge which is more closely aligned with most of the IEEE technical societies, as well as economics and business. The proponents of this mode participate in “technology foresight” and “roadmapping” activities, and view technology more optimistically, looking to foster innovation without being too concerned about its possible negative effects [1, p. 14]. Braun [134, p. 133] writes that “[f]orecasts do not state what the future will be…they attempt to glean what it might be.” Thus, one with technology foresight can be trusted insofar as their knowledge and judgment go—they may possess foresight through their grasp of current knowledge, through past experiences which inform their forecasts, and through raw intuition.

Various MIT Labs, such as the Media Lab, have been engaged in visionary research since before 1990, giving society a good glimpse of where technology might be headed some 20–30 years ahead of time. It is from such elite groups that visionaries typically emerge whose main purpose is to envision the technologies that will better our wellbeing and generally make life more productive and convenient in the future. Consider the current activities of the MIT Media Lab's Affective Computing Research Group directed by Prof. Rosalind W. Picard that is working hard on technology aids encapsulating “affect sensing” in response to the growing problem of autism [135]. The Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology. The work of Picard's group was made possible by the foundations laid by the Media Lab's predecessor researchers.

On the global technological roadmap we can now point to the following systems which are already under development but have not yet been widely diffused into the market:

  • alternative fuels, heralding innovations like self-driving electric cars and ocean-powered energy, as well as the rise of biofuels;

  • the potential for 3-D printing which will revolutionize prototyping and manufacturing practices and possibly reconstruct human tissue;

  • hologram projections for videoconferencing and televisions that respond to gestures as well as pen-sized computing which will do away with keyboards and screens;

  • quantum computing and cryptography;

  • next-generation prosthetics (Fig. 22);

  • cognitive machines such as robot humanoids;

  • carbon nanotubes and nanotech computing which will make our current silicon chips look gargantuan;

  • genetic engineering breakthroughs and regenerative health treatment such as stem cell treatment;

  • electronic banking that will use not physical cash but the singularity chip (e.g., an implant) for transactions;

  • ubiquitous high-speed wireless networks;

  • crowdsourced surveillance toward real-time forensic profiling and criminalization;

  • autogenerated visual life logs and location chronicles;

  • enhanced batteries that last longer;

  • body power to charge digital equipment [136];

  • brainwave-based technologies in health/gaming;

  • brain-reading technology for interrogation [137].

 

Fig. 22. Army Reserve Staff Sgt. Alfredo De Los Santos displays what the X2 microprocessor knee prosthetic can do by walking up a flight of stairs at the Military Advanced Training Center at Walter Reed Army Medical Center (Washington, DC), December 8, 2009. Patients at Walter Reed are testing next-generation prosthetics. Courtesy of the U.S. Army.

It is important to note that while these new inventions have the ability to make things faster and better for most living in more developed countries, they can act to increase the ever-widening gap between the rich and the poor. New technologies will not necessarily aid in eradicating the poverty cycle in parts of Africa and South America. In fact, new technologies can have the opposite effect—they can create an ever greater chasm in equity and access to knowledge.

Technology foresight is commonly exercised by those engaged in the act of prediction. Predictive studies are more often than not based on past and present trends, using this knowledge to provide a roadmap of future possibilities. There is some degree of imagination in prediction, and certainly the creative element is prevalent. Predictions are not meant to be wild, but calculated wisely, with evidence showing that a given course or path is likely in the future. However, this does not mean that all predictions come true. Predictive studies can be about new inventions and new form factors, the recombination of existing innovations in new ways (hybrid architectures, for example), or the mutation of an existing innovation. Some predictive studies have heavy quantitative forecasting components that use complex models to predict the introduction of new innovations, some even based on historical data inputs.

Before an invention has been diffused into the market, scenario planning is conducted to understand how the technology might be used, who might take it up, and what percentage of society will be willing to adopt the product over time (i.e., consumption analysis). “Here the emphasis is on predicting the development of the technology and assessing its potential for adoption, including an analysis of the technology's market” [138, p. 328].
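To give a concrete flavor of such consumption analysis, the sketch below simulates the Bass diffusion model, a standard quantitative tool in the forecasting literature for projecting what share of a market adopts an innovation over time. The coefficients of innovation and imitation and the market size used here are illustrative assumptions only, not estimates for any of the technologies discussed in this paper.

```python
# A minimal sketch of the Bass diffusion model, a standard tool in
# adoption forecasting. All parameter values are illustrative
# assumptions, not estimates for any technology discussed here.

def bass_adoption(p, q, m, years):
    """Return the cumulative number of adopters for each year.

    p: coefficient of innovation (adoption from external influence)
    q: coefficient of imitation (adoption from word of mouth)
    m: market potential (total eventual adopters)
    """
    cumulative = 0.0
    trajectory = []
    for _ in range(years):
        # New adopters come from those yet to adopt (m - cumulative),
        # driven by innovators (p) and imitators (q * cumulative / m).
        new_adopters = (p + q * cumulative / m) * (m - cumulative)
        cumulative += new_adopters
        trajectory.append(cumulative)
    return trajectory

# Hypothetical run: a market of 1 million potential adopters over 15 years.
for year, n in enumerate(bass_adoption(p=0.03, q=0.38, m=1_000_000, years=15), start=1):
    print(f"year {year:2d}: {n:9.0f} cumulative adopters")
```

The characteristic S-shaped curve this produces is what scenario planners mean when they speak of early adopters, the early majority, and eventual market saturation.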

Even Microsoft founder Bill Gates [139, p. 274] accepted that his predictions might not come true. But his insights in The Road Ahead are to be commended, even though they were understandably broad. Gates wrote, "[t]he information highway will lead to many destinations. I've enjoyed speculating about some of these. Doubtless I've made some foolish predictions, but I hope not too many." Allaby [140, p. 206] writes, "[f]orecasts deal in possibilities, not inevitabilities, and this allows forecasters to explore opportunities."

For the greater part, forecasters raise challenging, thought-provoking issues about how existing inventions or innovations will impact society. They give scenarios for a technology's projected pervasiveness, how it may affect other technologies, what potential benefits or drawbacks it may introduce, how it will affect the economy, and much more.

Kaku [141, p. 5] has argued, "that predictions about the future made by professional scientists tend to be based much more substantially on the realities of scientific knowledge than those made by social critics, or even those by scientists of the past whose predictions were made before the fundamental scientific laws were completely known." He believes that among the scientific body today there is a growing concern regarding predictions that for the greater part come from consumers of technology rather than those who shape and create it. Kaku is, of course, correct insofar as scientists should be consulted, since they are the ones actually making things possible after discoveries have occurred. But a balanced view encompassing the perspectives of different disciplines is necessary and extremely important.

In the 1950s, for instance, when technical experts forecasted improvements in computer technology, they envisaged even larger machines—but science fiction writers predicted microminiaturization. They "[p]redicted marvels such as wrist radios and pocket-sized computers, not because they foresaw the invention of the transistor, but because they instinctively felt that some kind of improvement would come along to shrink the bulky computers and radios of that day" (Bova, 1988, quoted in [142, p. 18]). The methodologies used as vehicles to predict in each discipline should be respected. The question of who is more correct in terms of predicting the future is perhaps the wrong question. For example, some of Kaku's own predictions in Visions can be found in science fiction movies dating back to the 1960s.

In speculating about the next 500 years, Berry [142, p. 1] writes, “[p]rovided the events being predicted are not physically impossible, then the longer the time scale being considered, the more likely they are to come true…if one waits long enough everything that can happen will happen.”

 

SECTION VII. The Next 50 Years: Brain–Computer Interface

When Ellul [143, p. 432] in 1964 predicted the use of “electronic banks” in his book The Technological Society, he was not referring to the computerization of financial institutions or the use of automatic teller machines (ATMs). Rather it was in the context of the possibility of the dawn of a new entity: the conjoining of man with machine. Ellul was predicting that one day knowledge would be accumulated in electronic banks and “transmitted directly to the human nervous system by means of coded electronic messages…[w]hat is needed will pass directly from the machine to the brain without going through consciousness…” As unbelievable as this man–machine complex may have sounded at the time, 45 years later visionaries are still predicting that such scenarios will be possible by the turn of the 22nd century. A large proportion of these visionaries are cyberneticists. Cybernetics is the study of nervous system controls in the brain as a basis for developing communications and controls in sociotechnical systems. Parenthetically, in some places writers continue to confuse cybernetics with robotics; they might overlap in some instances, but they are not the same thing.

Kaku [141, pp. 112–116] observes that scientists are working steadily toward a brain–computer interface (Fig. 23). The first step is to show that individual neurons can grow on silicon and then to connect the chip directly to a neuron in an animal. The next step is to mimic this connectivity in a human, and the last is to decode millions of neurons which constitute the spinal cord in order to interface directly with the brain. Cyberpunk science fiction writers like William Gibson [144] refer to this notion as “jacking-in” with the wetware: plugging in a computer cable directly with the central nervous system (i.e., with neurons in the brain analogous to software and hardware) [139, p. 133].

 


Fig. 23. Brain–computer interface schema. (1) Pedestal. (2) Sensor. (3) Electrode. Courtesy of Balougador under Creative Commons license.

In terms of the current state of development, we can point to the innovation of miniature wearable media, orthopedic replacements (including pacemakers), bionic prosthetic limbs, humanoid robots (i.e., robots that look human in appearance and are autonomous), and RFID implants. Traditionally, the term cyborg has been used to describe humans who have some mechanical parts or extensions. Today, however, we are on the brink of building a new sentient being, a bearer of electricity, a modern man belonging to a new race, beyond that which can be considered merely part man part machine. We refer here to the absolute fusion of man and machine, where the subject itself becomes the object; where the toolmaker becomes one with his tools [145]. The question at this point of coalescence is how human the new species will be [146], and what are the related ethical, metaphysical, and ontological concerns? Does the evolution of the human race as recorded in history come to an end when technology can be connected to the body in a wired or wireless form?

A. From Prosthetics to Amplification


Fig. 24. Cyborg 2.0 Project. Kevin Warwick with wife Irena during the Cyborg 2.0 project. Courtesy of Kevin Warwick.

While corrective orthopedic replacements, required to repair a function that is lying dormant or has failed altogether, have been around since the 1950s [147], implants of the future will attempt to add new functionality to native human capabilities, either through extensions or additions. Warwick's Cyborg 2.0 project [148], for instance, set out to prove that two persons with respective implants could communicate sensation and movement by thoughts alone. In 2002, the BBC reported that a tiny silicon square with 100 electrodes was connected to the professor's median nerve and linked to a transmitter/receiver in his forearm. Although "Warwick believe[d] that when he move[d] his own fingers, his brain [would] also be able to move Irena's" [104, p. 1], the outcome of the experiment was described at best as sending "Morse-code" messages (Fig. 24). Warwick [148] remains of the belief that a person's brain could be directly linked to a computer network [149]. Commercial players are also intent on keeping ahead, continually funding projects in this area of research.

 

If Warwick is right and thought-to-thought communication becomes possible, then terminals like telephones will eventually become obsolete. Warwick describes this as "putting a plug into the nervous system" [104], allowing thoughts to be transferred not only to another person but to the Internet and other media. While Warwick's Cyborg 2.0 may not have achieved its desired outcomes, it did show that a form of primitive Morse-code-style nervous-system-to-nervous-system communication is realizable [150]. Warwick is bound to keep trying to achieve his project goals given his philosophical perspective. And if he does not succeed, he will at least have left behind a legacy and enough stimuli for someone else to succeed in his place.

 

B. The Soul Catcher Chip

The Soul Catcher chip was conceived by former Head of British Telecom Research, Peter Cochrane. Cochrane [151, p. 2] believes that the human body is merely a carcass that serves as a transport mechanism just like a vehicle, and that the most important part of our body is our brain (i.e., mind). Similarly, Miriam English has said: "I like my body, but it's going to die, and it's not a choice really I have. If I want to continue, and I want desperately to see what happens in another 100 years, and another 1000 years…I need to duplicate my brain in order to do that" [152]. Soul Catcher is all about the preservation of a human well beyond the point of physical debilitation. The Soul Catcher chip would be implanted in the brain and act as an access point to the external world [153]. Consider being able to download the mind onto computer hardware and then creating a global nervous system via wireless Internet [154] (Fig. 25). Cochrane has predicted that by 2050 downloading thoughts and emotions will be commonplace. Billinghurst and Starner [155, p. 64] predict that this kind of arrangement will free up the human intellect to focus on creative rather than computational functions.

 

Fig. 25. Ray Kurzweil predicts that by 2013 supercomputer power will be sufficient for human brain functional simulation and by 2025 for human brain neural simulation for uploading. Courtesy of Ray Kurzweil and Kurzweil Technologies 2005.

Cochrane's beliefs are shared by many others engaged in the transhumanist movement (especially Extropians like Alexander Chislenko). Transhumanism (sometimes known by the abbreviations ">H" or "H+") is an international cultural movement that consists of intellectuals who look at ways to extend life through the application of emerging sciences and technologies. Minsky [156] believes that this will be the next stage in human evolution—a way to achieve true immortality "replacing flesh with steel and silicon" [141, p. 94]. Chris Winter of British Telecom has claimed that Soul Catcher will mean "the end of death." Winter predicts that by 2030, "[i]t would be possible to imbue a newborn baby with a lifetime's experiences by giving him or her the Soul Catcher chip of a dead person" [157]. The philosophical implications behind such movements are gigantic; they reach deep into every branch of traditional philosophy, especially metaphysics with its special concerns over cosmology and ontology.

 

SECTION VIII. The Next 100 Years: Homo Electricus

A. The Rise of the Electrophorus


Fig. 26. Drawing showing the operation of an Electrophorus, a simple manual electrostatic generator invented in 1762 by Swedish Professor Johan Carl Wilcke. Image by Amédée Guillemin (died 1893).

Microchip implants are integrated circuit devices encased in RFID transponders that can be active or passive and are implantable into animals or humans usually in the subcutaneous layer of the skin. The human who has been implanted with a microchip that can send or receive data is an Electrophorus, a bearer of “electric” technology [158]. The Macquarie Dictionary definition of “electrophorus” is “an instrument for generating static electricity by means of induction,” and refers to an instrument used in the early years of electrostatics (Fig. 26).

 

We have repurposed the term electrophorus to apply to humans implanted with microchips. One who "bears" is in some way intrinsically or spiritually connected to that which they are bearing, in the same way an expectant mother is to the child in her womb. The root electro comes from the Greek word for "amber," and phorus means to "wear, to put on, to get into" [159, p. 635]. When an Electrophorus passes through an electromagnetic zone, he/she is detected, and data can be passed from an implanted microchip (or in the future directly from the brain) to a computer device.

To electronize something is “to furnish it with electronic equipment” and electrotechnology is “the science that deals with practical applications of electricity.” The term “electrophoresis” has been borrowed here, to describe the “electronic” operations that an electrophorus is involved in. McLuhan and Zingrone [160, p. 94] believed that “electricity is in effect an extension of the nervous system as a kind of global membrane.” They argued that “physiologically, man in the normal use of technology (or his variously extended body) is perpetually modified by it and in turn finds ever new ways of modifying his technology” [161, p. 117].

The term "electrophorus" seems much more suitable today for expressing the human-electronic combination than the term "cyborg." "Electrophorus" distinguishes strictly electrical implants from mechanical devices such as artificial hips. It is not surprising, then, that these crucial matters of definition raise philosophical and sociological questions of consciousness and identity, which science fiction writers have been addressing creatively. The Electrophorus belongs to the emerging species of Homo Electricus. In its current state, the Electrophorus relies on a device being triggered wirelessly when it enters an electromagnetic field. In the future, the Electrophorus will act like a network element or node, allowing information to pass through him or her, to be stored locally or remotely, and to send and receive messages simultaneously, processing some actively and others as background tasks.
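Because the preceding paragraphs describe the Electrophorus in architectural terms, a passive device energized only when it enters a reader's electromagnetic field, a minimal sketch of that interaction may be useful. Everything in it, from the field names to the identifier format and power threshold, is a hypothetical illustration rather than a description of any deployed implant or RFID air-interface protocol.

```python
# A minimal, hypothetical sketch of a passive read event: a tag holds
# only an identifier and responds when the reader's field supplies
# enough power. Field names, sizes, and the threshold are assumptions,
# not any real implant or air-interface protocol.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class PassiveTag:
    uid: str  # e.g., a 96-bit identifier, hex-encoded (illustrative)

    def interrogate(self, field_strength: float, threshold: float = 1.0):
        # A passive transponder answers only if the reader's field
        # induces enough power to energize its circuitry.
        if field_strength < threshold:
            return None  # tag stays silent outside the field
        return {
            "uid": self.uid,
            "read_at": datetime.now(timezone.utc).isoformat(),
        }

tag = PassiveTag(uid="3074257bf7194e4000001a85")
print(tag.interrogate(field_strength=0.2))  # out of range -> None
print(tag.interrogate(field_strength=2.5))  # in range -> read event
```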

Of the point of becoming an Electrophorus (i.e., a bearer of electricity), Brown [162] observes that "[y]ou are not just a human linked with technology; you are something different and your values and judgment will change." Some suspect that it will even become possible to alter the behavior of people carrying brain implants, whether the individual wills it or not. Maybury [163] believes that "[t]he advent of machine intelligence raises social and ethical issues that may ultimately challenge human existence on earth."

B. The Prospects of Transhumanism


Fig. 27. The transhumanism symbol. Courtesy of Antonu under Creative Commons license.

Thought-to-thought communications may seem outlandish today, but it is only one of many futuristic hopes of the movement termed transhumanism. Probably the most representative organization for this movement is the World Transhumanist Association (WTA), which recently adopted the doing-business-as name of "Humanity+" (Fig. 27). The WTA's website [164] carries the following succinct statement of what transhumanism is, penned originally by Max More in 1990: "Transhumanism is a class of philosophies of life that seek the continuation and acceleration of the evolution of intelligent life beyond its currently human form and human limitations by means of science and technology, guided by life-promoting principles and values." Whether or not transhumanism yet qualifies as a philosophy, it cannot be denied that it has produced its share of both proponents and critics.

 

Proponents of transhumanism claim that the things they want are the things everyone wants: freedom from pain, freedom from suffering, freedom from all the limitations of the human body (including mental as well as physical limitations), and ultimately, freedom from death. One of the leading authors in the transhumanist movement is Ray Kurzweil, whose 652-page book The Singularity Is Near [165] prophesies a time in the not-too-distant future when evolution will accelerate exponentially and bring to pass all of the above freedoms as “the matter and energy in our vicinity will become infused with the intelligence, knowledge, creativity, beauty, and emotional intelligence (the ability to love, for example) of our human-machine civilization. Our civilization will then expand outward, turning all the dumb matter and energy we encounter into sublimely intelligent—transcendent—matter and energy” [165, p. 389].

Despite the almost theological tone of the preceding quote, Kurzweil has established a sound track record as a technological forecaster, at least when it comes to Moore's-Law-type predictions of the progress of computing power (the simple doubling arithmetic behind such predictions is sketched after the list below). But the ambitions of Kurzweil [178] and his allies go far beyond next year's semiconductor roadmap to encompass the future of all humanity. If the fullness of the transhumanist vision is realized, the following achievements will come to pass:

  • human bodies will cease to be the physical instantiation of human minds, replaced by as-yet-unknown hardware with far greater computational powers than the present human brain;

  • human minds will experience, at their option, an essentially eternal existence in a world free from the present restrictions of material embodiment in biological form;

  • limitations on will, intelligence, and communication will all be overcome, so that to desire a thing or experience will be to possess it.
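Returning to the Moore's-Law-type forecasting noted before this list: the arithmetic behind such extrapolations is simply compounded doubling, as the sketch below shows. The 18-month doubling period is a commonly quoted rule of thumb, used here purely for illustration and not as a parameter endorsed by this paper.

```python
# A minimal sketch of Moore's-Law-style extrapolation: a capability
# that doubles every fixed interval T grows by a factor of 2**(t / T).
# The 18-month doubling period is a commonly quoted rule of thumb,
# used here purely for illustration.

def growth_factor(years, doubling_period_years=1.5):
    """Multiplicative growth after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

for horizon in (5, 10, 20, 40):
    print(f"after {horizon:2d} years: x{growth_factor(horizon):,.0f}")
```

It is exactly this compounding, roughly a hundred-million-fold gain over 40 years under the assumed doubling rate, that gives transhumanist forecasts their urgency, and their vulnerability: a small error in the assumed doubling period changes the projected date of any milestone by decades.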

The Transhumanist Declaration, last modified in 2009 [166], recognizes that these plans have potential downsides, and calls for reasoned debate to avoid the risks while realizing the opportunities. The sixth item in the Declaration, for example, declares that “[p]olicy making ought to be guided by responsible and inclusive moral vision, taking seriously both opportunities and risks, respecting autonomy and individual rights, and showing solidarity with and concern for the interests and dignity of all people around the globe.” The key phrase in this item is “moral vision.” While many self-declared transhumanists may agree on the moral vision which should guide their endeavors, the movement has also inspired some of the most vigorous and categorically critical invective to be found in the technical and public-policy literature.

Possibly the best known of the vocal critics of transhumanism is Francis Fukuyama, a political scientist who nominated transhumanism as his choice for the world's most dangerous idea [167]. As with most utopian notions, the main problem Fukuyama sees with transhumanism is the transition between our present state and the transhumanists' future vision of completely realized eternal technological bliss (Fig. 28). Will some people be uploaded to become immortal, almost omniscient transhumans while others are left behind in their feeble, mortal, disease-ridden human bodies? Are the human goods that transhumanists say are basically the same for everyone really so? Or are they more complex and subtle than typical transhumanist pronouncements acknowledge? As Fukuyama points out in his Foreign Policy essay [167], "Our good characteristics are intimately connected to our bad ones… if we never felt jealousy, we would also never feel love. Even our mortality plays a critical function in allowing our species as a whole to survive and adapt (and transhumanists are about the last group I would like to see live forever)."

 


Fig. 28. Brain in a vat with the thought: “I'm walking outside in the sun” being transmitted to the computer. Image reproduced under the Creative Commons license.

Transhumanists themselves admit that their movement performs some of the functions of a religion when it "offers a sense of direction and purpose." But in contrast to most religions, transhumanists explicitly hope to "make their dreams come true in this world" [168]. Nearly all transhumanist programs and proposals arise from a materialist-reductionist view of the world which assumes that the human mind is at most an epiphenomenon of the brain, that all of the human brain's functions will eventually be simulated by hardware (on computers of the future), and that the experience known as consciousness can be realized in artificial hardware in essentially the same form as it is presently realized in the human body. Some of the assumptions of transhumanism are based less on facts and more on faith. Just as Christians take on faith that God revealed Himself in Jesus Christ, transhumanists take on faith that machines will inevitably become conscious.


Fig. 29. The shadow dextrous hand shakes the human hand. How technology might become society—a future agreement. Courtesy of Shadow Robot Company 2008.

In keeping with the transhumanists' call for responsible moral vision, the IEEE SSIT has been, and will continue to be, a forum where the implications for society of all sorts of technological developments can be debated and evaluated. In a sense, the transhumanist program is the ultimate technological project: to redesign humanity itself to a set of specifications, determined by us. If the transhumanists succeed, technology will become society, and the question of the social implications of technology will be moot (Fig. 29). Perhaps the best attitude to take toward transhumanism is to pay attention to their prophecies, but, as the Old Testament God advised the Hebrews, “if the thing follow not, nor come to pass…the prophet hath spoken it presumptuously…” [169].

 

 

SECTION IX. Ways Forward

In sum, identifying and predicting what the social implications of past, present and future technologies might be can lead us to act in one of four ways, which are not mutually exclusive.

First, we can take the "do nothing" approach and meekly accept the risks associated with new techniques. We stop being obsessed by both confirmed and speculative consequences and, instead, try to see how far the new technologies might take us and what we might become or transform into as a result. While humans might not always like change, we are by nature, if we might hijack Heraclitus, in a continual state of flux. We might reach new potentials as a populace, become extremely efficient at doing business with each other, and make a positive impact on our natural environment by doing so. The downside to this approach is that it appears to be an all-or-nothing approach with no built-in decision points. For as Jacques Ellul [170] forewarned: "what is at issue here is evaluating the danger of what might happen to our humanity in the present half-century, and distinguishing between what we want to keep and what we are ready to lose, between what we can welcome as legitimate human development and what we should reject with our last ounce of strength as dehumanization."

The second option is that we let case law determine for us what is legal or illegal based on existing laws, or new or amended laws we might introduce as a result of the new technologies. We can take the stance that the courts are in the best position to decide what we should and should not do with new technologies. If we break the law in a civil or criminal capacity, then there is a penalty, and we have civil and criminal codes concerning workplace surveillance, telecommunications interception and access, surveillance devices, data protection and privacy, cybercrime, and so on. There is also the continual review of existing legislation by law-reform commissions and the like. New legislation can also be introduced to curb other dangers or harms that might eventuate as a result of the new techniques.

The third option is that we can introduce industry regulations that stipulate how advanced applications should be developed (e.g., ensuring privacy impact assessments are done before commercial applications are launched), and that technical expectations on accuracy, reliability, and storage of data are met. It is also important that the right balance be found between regulations and freedom so as not to stifle the high-tech industry at large.

Finally, the fourth option would be to adopt the “Amish method”: complete abandonment of technology that has progressed beyond a certain point of development. This is in some respect “living off the grid” [171].

Although obvious, it is important to underline that none of these options is foolproof, nor, as noted, are they mutually exclusive. The best approach may well be at times to introduce industry regulations or codes, at other times to do nothing, and in other cases to rely on legislative amendments despite the length of time these take to develop. In still other cases, the safeguards may need to be built into the technology itself.

 

SECTION X. Conclusion

If we put our trust in Kurzweil's [172] Law of Accelerating Returns, we are likely headed into a great period of discovery unprecedented in any era of history. This being the case, the time for inclusive dialog is now, not after widespread diffusion of such innovations as “always on” cameras, microchip implants, unmanned drones and the like. We stand at a critical moment of decision, as the mythological Pandora did as she was about to open her box. There are many lessons to be learned from history, especially from such radical developments as the atomic bomb and the resulting arms race. Joy [173] has raised serious fears about continuing unfettered research into “spiritual machines.” Will humans have the foresight to say “no” or “stop” to new innovations that could potentially be a means to a socially destructive scenario? Implants that may prolong life expectancy by hundreds if not thousands of years may appeal at first glance, but they could well create unforeseen devastation in the form of technological viruses, plagues, or a grim escalation in the levels of crime and violence.

To many scientists of the positivist tradition anchored solely to an empirical world view, the notion of whether something is right or wrong is in a way irrelevant. For these researchers, a moral stance has little or nothing to do with technological advancement but is really an ideological position. The extreme of this view is exemplified by an attitude of "let's see how far we can go," not "is what we are doing the best thing for humanity?" and certainly not by the thought of "what are the long-term implications of what we are doing here?" As an example, one need only consider the mad race to clone the first animal, and many have long suspected that an "underground" scientific race to clone the first human continues.

In the current climate of innovation, particularly since the proliferation of the desktop computer and the birth of new digital knowledge systems, some observers believe that engineers, and professionals more broadly, lack accountability for the tangible and intangible costs of their actions [174, p. 288]. Because science-enabled engineering has proved so profitable for multinational corporations, they have gone to great lengths to persuade the world that science should not be stopped, for the simple reason that it will always make things better. This ignores the possibility that even seemingly small advancements into the realm of the Electrophorus for any purpose other than medical prostheses will have dire consequences for humanity [175]. According to Kuhns, "Once man has given technique its entry into society, there can be no curbing of its gathering influence, no possible way of forcing it to relinquish its power. Man can only witness and serve as the ironic beneficiary-victim of its power" [176, p. 94].

Clearly, none of the authors of this paper desires to stop technological advance in its tracks. But we believe that considering the social implications of past, present, and future technologies is more than an academic exercise. As custodians of the technical means by which modern society exists and develops, engineers have a unique responsibility to act with forethought and insight. The time when following the orders of a superior was all that an engineer had to do is long past. With great power comes great responsibility. Our hope is that the IEEE SSIT will help and encourage engineers worldwide to consider the consequences of their actions throughout the next century.

References

1. K. D. Stephan, "Notes for a history of the IEEE society on social implications of technology", IEEE Technol. Soc. Mag., vol. 25, no. 4, pp. 5-14, 2006.

2. B. R. Inman, "One view of national security and technical information", IEEE Technol. Soc. Mag., vol. 1, no. 3, pp. 19-21, Sep. 1982.

3. S. Sloan, "Technology and terrorism: Privatizing public violence", IEEE Technol. Soc. Mag., vol. 10, no. 2, pp. 8-14, 1991.

4. J. R. Shanebrook, "Prohibiting nuclear weapons: Initiatives toward global nuclear disarmament", IEEE Technol. Soc. Mag., vol. 18, no. 2, pp. 25-31, 1999.

5. C. J. Andrews, "National responses to energy vulnerability", IEEE Technol. Soc. Mag., vol. 25, no. 3, pp. 16-25, 2006.

6. R. C. Arkin, "Ethical robots in warfare", IEEE Technol. Soc. Mag., vol. 28, no. 1, pp. 30-33, 2009.

7. M. G. Michael, K. Michael, "Toward a state of überveillance", IEEE Technol. Soc. Mag., vol. 29, no. 2, pp. 9-16, 2010.

8. V. Baranauskas, "Large-scale fuel farming in Brazil", IEEE Technol. Soc. Mag., vol. 2, no. 1, pp. 12-13, Mar. 1983.

9. H. M. Gueron, "Nuclear power: A time for common sense", IEEE Technol. Soc. Mag., vol. 3, no. 1, pp. 3-9, Mar. 1984.

10. J. J. Mackenzie, "Nuclear power: A skeptic's view", IEEE Technol. Soc. Mag., vol. 3, no. 1, pp. 9-15, Mar. 1984.

11. E. Larson, D. Abrahamson, P. Ciborowski, "Effects of atmospheric carbon dioxide on U. S. peak electrical generating capacity", IEEE Technol. Soc. Mag., vol. 3, no. 4, pp. 3-8, Dec. 1984.

12. P. C. Cruver, "Greenhouse effect prods global legislative initiatives", IEEE Technol. Soc. Mag., vol. 9, no. 1, pp. 10-16, Mar./Apr. 1990.

13. B. Allenby, "Earth systems engineering and management", IEEE Technol. Soc. Mag., vol. 19, no. 4, pp. 10-24, Winter 2000.

14. J. C. Davis, "Protecting intellectual property in cyberspace", IEEE Technol. Soc. Mag., vol. 17, no. 2, pp. 12-25, 1998.

15. R. Brody, "Consequences of electronic profiling", IEEE Technol. Soc. Mag., vol. 18, no. 1, pp. 20-27, 1999.

16. K. W. Bowyer, "Face-recognition technology: Security versus privacy", IEEE Technol. Soc. Mag., vol. 23, no. 1, pp. 9-20, 2004.

17. D. Bütschi, M. Courant, L. M. Hilty, "Towards sustainable pervasive computing", IEEE Technol. Soc. Mag., vol. 24, no. 1, pp. 7-8, 2005.

18. R. Clarke, "Cyborg rights", IEEE Technol. Soc. Mag., vol. 30, no. 3, pp. 49-57, 2011.

19. E. Levy, D. Copp, "Risk and responsibility: Ethical issues in decision-making", IEEE Technol. Soc. Mag., vol. 1, no. 4, pp. 3-8, Dec. 1982.

20. K. R. Foster, R. B. Ginsberg, "Guest editorial: The wired classroom", IEEE Technol. Soc. Mag., vol. 17, no. 4, pp. 3, 1998.

21. T. Bookman, "Ethics professionalism and the pleasures of engineering: T&S interview with Samuel Florman", IEEE Technol. Soc. Mag., vol. 19, no. 3, pp. 8-18, 2000.

22. K. D. Stephan, "Is engineering ethics optional", IEEE Technol. Soc. Mag., vol. 20, no. 4, pp. 6-12, 2001.

23. T. C. Jepsen, "Reclaiming history: Women in the telegraph industry", IEEE Technol. Soc. Mag., vol. 19, no. 1, pp. 15-19, 2000.

24. A. S. Bix, "‘Engineeresses’ invade campus", IEEE Technol. Soc. Mag., vol. 19, no. 1, pp. 20-26, 2000.

25. J. Coopersmith, "Pornography videotape and the internet", IEEE Technol. Soc. Mag., vol. 19, no. 1, pp. 27-34, 2000.

26. D. M. Hughes, "The internet and sex industries: Partners in global sexual exploitation", IEEE Technol. Soc. Mag., vol. 19, no. 1, pp. 35-41, 2000.

27. V. Cimagalli, M. Balsi, "Guest editorial: University technology and society", IEEE Technol. Soc. Mag., vol. 20, no. 2, pp. 3, 2001.

28. G. L. Engel, B. M. O'Connell, "Guest editorial: Ethical and social issues criteria in academic accreditation", IEEE Technol. Soc. Mag., vol. 21, no. 3, pp. 7, 2002.

29. J. C. Lucena, G. Downey, H. A. Amery, "From region to countries: Engineering education in Bahrain Egypt and Turkey", IEEE Technol. Soc. Mag., vol. 25, no. 2, pp. 4-11, 2006.

30. C. Didier, J. R. Herkert, "Volunteerism and humanitarian engineering—Part II", IEEE Technol. Soc. Mag., vol. 29, no. 1, pp. 9-11, 2010.

31. K. Michael, G. Roussos, G. Q. Huang, R. Gadh, A. Chattopadhyay, S. Prabhu, P. Chu, "Planetary-scale RFID services in an age of uberveillance", Proc. IEEE, vol. 98, no. 9, pp. 1663-1671, Sep. 2010.

32. M. G. Michael, K. Michael, "The fall-out from emerging technologies: On matters of surveillance social networks and suicide", IEEE Technol. Soc. Mag., vol. 30, no. 3, pp. 15-18, 2011.

33. M. U. Iqbal, S. Lim, "Privacy implications of automated GPS tracking and profiling", IEEE Technol. Soc. Mag., vol. 29, no. 2, pp. 39-46, 2010.

34. D. Kravets, "OnStar tracks your car even when you cancel service", Wired, Sep. 2011.

35. L. Evans, "Location-based services: Transformation of the experience of space", J. Location Based Services, vol. 5, no. 3-4, pp. 242-260, 2011.

36. M. Wigan, R. Clarke, "Social impacts of transport surveillance", Prometheus, vol. 24, no. 4, pp. 389-403, 2006.

37. B. D. Renegar, K. Michael, "The privacy-value-control harmonization for RFID adoption in retail", IBM Syst. J., vol. 48, no. 1, pp. 8:1-8:14, 2009.

38. R. Clarke, "Information technology and dataveillance", Commun. ACM, vol. 31, no. 5, pp. 498-512, 1988.

39. H. Ketabdar, J. Qureshi, P. Hui, "Motion and audio analysis in mobile devices for remote monitoring of physical activities and user authentication", J. Location Based Services, vol. 5, no. 3-4, pp. 182-200, 2011.

40. E. Singer, "Device tracks how you're sleeping", Technol. Rev. Authority Future Technol., Jul. 2009.

41. L. Perusco, K. Michael, "Control trust privacy and security: Evaluating location-based services", IEEE Technol. Soc. Mag., vol. 26, no. 1, pp. 4-16, 2007.

42. K. Michael, A. McNamee, M. G. Michael, "The emerging ethics of humancentric GPS tracking and monitoring", ICMB M-Business-From Speculation to Reality, 2006.

43. S. J. Fusco, K. Michael, M. G. Michael, R. Abbas, "Exploring the social implications of location based social networking: An inquiry into the perceived positive and negative impacts of using LBSN between friends", 9th Int. Conf. Mobile Business/9th Global Mobility Roundtable (ICMB-GMR), 2010.

44. M. Burdon, "Commercializing public sector information privacy and security concerns", IEEE Technol. Soc. Mag., vol. 28, no. 1, pp. 34-40, 2009.

45. R. W. Picard, "Future affective technology for autism and emotion communication", Philosoph. Trans. Roy. Soc. London B Biol. Sci., vol. 364, no. 1535, pp. 3575-3584, 2009.

46. R. M. Kowalski, S. P. Limber, P. W. Agatston, Cyber Bullying: The New Moral Frontier, U.K., London: Wiley-Blackwell, 2007.

47. Google: Policies and Principles, Oct. 2011.

48. K.-S. Lee, "Surveillant institutional eyes in South Korea: From discipline to a digital grid of control", Inf. Soc., vol. 23, no. 2, pp. 119-124, 2007.

49. D. P. Siewiorek, "Wearable computing comes of age", IEEE Computer, vol. 32, no. 5, pp. 82-83, May 1999.

50. L. Sydänheimo, M. Salmimaa, J. Vanhala, M. Kivikoski, "Wearable and ubiquitous computer aided service maintenance and overhaul", IEEE Int. Conf. Commun., vol. 3, pp. 2012-2017, 1999.

51. K. Michael, M. G. Michael, Innovative Automatic Identification and Location-Based Services, New York: Information Science Reference, 2009.

52. K. Michael, M. G. Michael, "Implementing Namebers using microchip implants: The black box beneath the skin" in This Pervasive Day: The Potential and Perils of Pervasive Computing, U.K., London: Imperial College Press, pp. 101-142, 2011.

53. S. Mann, "Wearable computing: Toward humanistic intelligence", IEEE Intell. Syst., vol. 16, no. 3, pp. 10-15, May/Jun. 2001.

54. B. Schiele, T. Jebara, N. Oliver, "Sensory-augmented computing: Wearing the museum's guide", IEEE Micro, vol. 21, no. 3, pp. 44-52, May/Jun. 2001.

55. C. Harrison, D. Tan, D. Morris, "Skinput: Appropriating the skin as an interactive canvas", Commun. ACM, vol. 54, no. 8, pp. 111-118, 2011.

56. N. Sawhney, C. Schmandt, "Nomadic radio: A spatialized audio environment for wearable computing", Proc. IEEE 1st Int. Symp. Wearable Comput., pp. 171-172, 1997.

57. S. Mann, "Eudaemonic computing (‘underwearables’)", Proc. IEEE 1st Int. Symp. Wearable Comput., pp. 177-178, 1997.

58. LooxieOverview, Jan. 2012.

59. T. Starner, "The challenges of wearable computing: Part 1", IEEE Micro, vol. 21, no. 4, pp. 44-52, Jul./Aug. 2001.

60. G. Tröster, "Smart clothes—The unfulfilled pledge", IEEE Perv. Comput., vol. 10, no. 2, pp. 87-89, Feb. 2011.

61. M. B. Spitzer, "Eyeglass-based systems for wearable computing", Proc. IEEE 1st Int. Symp. Wearable Comput., pp. 48-51, 1997.

62. R. Steinkuhl, C. Sundermeier, H. Hinkers, C. Dumschat, K. Cammann, M. Knoll, "Microdialysis system for continuous glucose monitoring", Sens. Actuators B Chem., vol. 33, no. 1-3, pp. 19-24, 1996.

63. J. C. Pickup, F. Hussain, N. D. Evans, N. Sachedina, "In vivo glucose monitoring: The clinical reality and the promise", Biosens. Bioelectron., vol. 20, no. 10, pp. 1897-1902, 2005.

64. C. Thomas, R. Carlson, Development of the Sensing System for an Implantable Glucose Sensor, Jan. 2012.

65. J. L. Ferrero, "Wearable computing: One man's mission", IEEE Micro, vol. 18, no. 5, pp. 87-88, Sep.-Oct. 1998.

66. T. Martin, "Issues in wearable computing for medical monitoring applications: A case study of a wearable ECG monitoring device", Proc. IEEE 4th Int. Symp. Wearable Comput., pp. 43-49, 2000.

67. M. G. Michael, "The biomedical pioneer: An interview with C. Toumazou" in Innovative Automatic Identification and Location-Based Services, New York: Information Science Reference, pp. 352-363, 2009.

68. R. Capurro, M. Nagenborg, Ethics and Robotics, Germany, Heidelberg: Akademische Verlagsgesellschaft, 2009.

69. R. Sparrow, "Predators or plowshares? Arms control of robotic weapons", IEEE Technol. Soc. Mag., vol. 28, no. 1, pp. 25-29, 2009.

70. R. C. Arkin, "Governing lethal behavior in robots [T&S Interview]", IEEE Technol. Soc. Mag., vol. 30, no. 4, pp. 7-11, 2011.

71. M. L. Cummings, "Creating moral buffers in weapon control interface design", IEEE Technol. Soc. Mag., vol. 23, no. 3, pp. 28-33, 41, 2004.

72. P. Asaro, "Modeling the moral user", IEEE Technol. Soc. Mag., vol. 28, no. 1, pp. 20-24, 2009.

73. J. Canning, "You've just been disarmed. Have a nice day!", IEEE Technol. Soc. Mag., vol. 28, no. 1, pp. 13-15, 2009.

74. F. Operto, "Ethics in advanced robotics", IEEE Robot. Autom. Mag., vol. 18, no. 1, pp. 72-78, Mar. 2011.

75. J. M. Sullivan, "Evolution or revolution? The rise of UAVs", IEEE Technol. Soc. Mag., vol. 25, no. 3, pp. 43-49, 2006.

76. P. Salvini, M. Nicolescu, H. Ishiguro, "Benefits of human-robot interaction", IEEE Robot. Autom. Mag., vol. 18, no. 4, pp. 98-99, Dec. 2011.

77. A. Sharkey, N. Sharkey, "Children the elderly and interactive robots", IEEE Robot. Autom. Mag., vol. 18, no. 1, pp. 32-38, Mar. 2011.

78. D. Feil-Seifer, M. J. Mataric, "Socially assistive robotics", IEEE Robot. Autom. Mag., vol. 18, no. 1, pp. 24-31, Mar. 2011.

79. J. D. Bronzino, The Biomedical Engineering Handbook: Medical Devices and Systems, FL, Boca Raton: CRC Press, 2006.

80. C. Hassler, T. Boretius, T. Stieglitz, "Polymers for neural implants", J. Polymer Sci. B Polymer Phys., vol. 49, no. 1, pp. 18-33, 2011.

81. Bionic Hearing Bionic Vision Neurobionics, Jan. 2012.

82. A. Manning, "Implants sounding better: Smaller faster units overcome ‘nerve deafness’", USA Today, pp. 7D, 2000.

83. G. M. Clark, Sounds From Silence, Australia, Melbourne: Allen & Unwin, 2003.

84. G. Carman, Eureka Moment From First One to Hear With Bionic Ear, Feb. 2008.

85. J. F. Patrick, P. A. Busby, P. J. Gibson, "The development of the Nucleus Freedom™ cochlear implant system", Sage Publications, vol. 10, no. 4, pp. 175-200, 2006.

86. "Personal stories", Cochlear, Jan. 2012.

87. R. A. Cooper, "Quality of life technology: A human-centered and holistic design", IEEE Eng. Med. Biol., vol. 27, no. 2, pp. 10-11, Mar./Apr. 2008.

88. S. Stewart, "Neuromaster", Wired 8.02.

89. J. Dowling, "Current and future prospects for optoelectronic retinal prostheses", Eye, vol. 23, pp. 1999-2005, 2009.

90. D. Ahlstrom, "Microchip implant could offer new kind of vision", The Irish Times.

91. More Tests of Eye Implants Planned, pp. 1-2, 2001.

92. G. Branwyn, "The desire to be wired", Wired 1.4.

93. W. Wells, The Chips Are Coming.

94. M. Brooks, "The cyborg cometh", Worldlink: The Magazine of the World Economic Forum.

95. E. Strickland, "Birth of the bionic eye", IEEE Spectrum, Jan. 2012.

96. S. Adee, "Researchers hope to mime 1000 neurons with high-res artificial retina", IEEE Spectrum, Jan. 2012.

97. D. Nairne, Building Better People With Chips and Sensors.

98. S. S. Hall, "Brain pacemakers", MIT Enterprise Technol. Rev.

99. E. A. C. Pereira, A. L. Green, R. J. Stacey, T. Z. Aziz, "Refractory epilepsy and deep brain stimulation", J. Clin. Neurosci., vol. 19, no. 1, pp. 27-33, 2012.

100. "Brain pacemaker could help cure depression research suggests", Biomed. Instrum. Technol., vol. 45, no. 2, pp. 94, 2011.

101. H. S. Mayberg, A. M. Lozano, V. Voon, H. E. McNeely, D. Seminowicz, C. Hamani, J. M. Schwalb, S. H. Kennedy, "Deep brain stimulation for treatment-resistant depression", Neuron, vol. 45, no. 5, pp. 651-660, 2005.

102. B. Staff, "Brain pacemaker lifts depression", BBC News, Jun. 2005.

103. C. Hamani, H. Mayberg, S. Stone, A. Laxton, S. Haber, A. M. Lozano, "The subcallosal cingulate gyrus in the context of major depression", Biol. Psychiatry, vol. 69, no. 4, pp. 301-308, 2011.

104. R. Dobson, Professor to Try to Control Wife via Chip Implant.

105. "Chip helps paraplegic walk", Wired News.

106. D. Smith, "Chip implant signals a new kind of man", The Age.

107. "Study of an implantable functional neuromuscular stimulation system for patients with spinal cord injuries", Clinical Trials.gov, Feb. 2009.

108. R. Barrett, "Electrodes help paraplegic walk" in Lateline Australian Broadcasting Corporation, Australia, Sydney: ABC, May 2011.

109. M. Ingebretsen, "Intelligent exoskeleton helps paraplegics walk", IEEE Intell. Syst., vol. 26, no. 1, pp. 21, 2011.

110. S. Harris, "US researchers create suit that can enable paraplegics to walk", The Engineer, Oct. 2011.

111. D. Ratner, M. A. Ratner, Nanotechnology and Homeland Security: New Weapons for New Wars, NJ, Upper Saddle River: Pearson Education, 2004.

112. L. Versweyveld, "Chip implants allow paralysed patients to communicate via the computer", Virtual Medical Worlds Monthly.

113. S. Adee, "The revolution will be prosthetized: DARPA's prosthetic arm gives amputees new hope", IEEE Spectrum, vol. 46, no. 1, pp. 37-40, 2009.

114. E. Wales, "It's a living chip", The Australian, pp. 4, 2001.

115. Our Products: MBA Multiplex Bio Threat Assay, Jan. 2012.

116. F. W. Scheller, "From biosensor to biochip", FEBS J., vol. 274, no. 21, pp. 5451, 2007.

117. A. Persidis, "Biochips", Nature Biotechnol., vol. 16, pp. 981-983, 1998.

118. A. C. LoBaido, "Soldiers with microchips: British troops experiment with implanted electronic dog tag", WorldNetDaily.com.

119. "Microchip implants for drug delivery", ABC: News in Science.

120. R. Bailey, "Implantable insulin pumps", Biology About.com.

121. D. Elleri, D. B. Dunger, R. Hovorka, "Closed-loop insulin delivery for treatment of type 1 diabetes", BMC Med., vol. 9, no. 120, 2011.

122. D. L. Sorkin, J. McClanahan, "Cochlear implant reimbursement cause for concern", HealthyHearing, May 2004.

123. J. Berke, "Parental rights and cochlear implants: Who decides about the implant?", About.com: Deafness, May 2009.

124. D. O. Weber, "Me myself my implants my micro-processors and I", Softw. Develop. Mag., Jan. 2012.

125. A. Graafstra, K. Michael, M. G. Michael, "Social-technical issues facing the humancentric RFID implantee sub-culture through the eyes of Amal Graafstra", Proc. IEEE Int. Symp. Technol. Soc., pp. 498-516, 2010.

126. E. M. McGee, G. Q. Maguire, "Becoming borg to become immortal: Regulating brain implant technologies", Cambridge Quarterly Healthcare Ethics, vol. 16, pp. 291-302, 2007.

127. P. Moore, Enhancing Me: The Hope and the Hype of Human Enhancement, U.K., London: Wiley, 2008.

128. A. Masters, K. Michael, "Humancentric applications of RFID implants: The usability contexts of control convenience and care", Proc. 2nd IEEE Int. Workshop Mobile Commerce Services, pp. 32-41, 2005.

129. J. Best, "44000 prison inmates to be RFID-chipped", silicon.com, Nov. 2010.

130. D. Brin, The Transparent Society: Will Technology Force Us to Choose Between Privacy and Freedom?, MA, Boston: Perseus Books, 1998.

131. J. E. Dobson, P. F. Fischer, "Geoslavery", IEEE Technol. Soc. Mag., vol. 22, no. 1, pp. 47-52, 2003.

132. K. Michael, M. G. Michael, "Homo Electricus and the Continued Speciation of Humans" in The Encyclopedia of Information Ethics and Security, PA, Hershey: IGI, pp. 312-318, 2007.

133. S. Young, Designer Evolution: A Transhumanist Manifesto, New York: Prometheus Books, 2006.

134. E. Braun, Wayward Technology, U.K., London: Frances Pinter, 1984.

135. R. el Kaliouby, R. Picard, S. Baron-Cohen, "Affective computing and autism", Ann. New York Acad. Sci., vol. 1093, no. 1, pp. 228-248, 2006.

136. D. Bhatia, S. Bairagi, S. Goel, M. Jangra, "Pacemakers charging using body energy", J. Pharmacy Bioallied Sci., vol. 2, no. 1, pp. 51-54, 2010.

137. V. Arstila, F. Scott, "Brain reading and mental privacy", J. Humanities Social Sci., vol. 15, no. 2, pp. 204-212, 2011.

138. R. Westrum, Technologies and Society: The Shaping of People and Things, CA, Belmont: Wadsworth, 1991.

139. B. Gates, The Road Ahead, New York: Penguin, 1995.

140. M. Allaby, Facing The Future: The Case for Science, U.K., London: Bloomsbury, 1996.

141. M. Kaku, Visions: How Science Will Revolutionise the 21st Century and Beyond, U.K., Oxford: Oxford Univ. Press, 1998.

142. A. Berry, The Next 500 Years: Life in the Coming Millennium, New York: Gramercy Books, 1996.

143. J. Ellul, The Technological Society, New York: Vintage Books, 1964.

144. W. Gibson, Neuromancer, New York: Ace Books, 1984.

145. M. McLuhan, Understanding Media: The Extensions of Man, MA, Cambridge: MIT Press, 1964.

146. A. Toffler, Future Shock, New York: Bantam Books, 1981.

147. C. M. Banbury, Surviving Technological Innovation in the Pacemaker Industry 1959–1990, New York: Garland, 1997.

148. K. Warwick, I Cyborg, U.K., London: Century, 2002.

149. M. G. Michael, K. Warwick, "The professor who has touched the future" in Innovative Automatic Identification and Location-Based Services, New York: Information Science Reference, pp. 406-422, 2009.

150. D. Green, "Why I am not impressed with Professor Cyborg", BBC News.

151. P. Cochrane, Tips For Time Travellers: Visionary Insights Into New Technology Life and the Future on the Edge of Technology, New York: McGraw-Hill, 1999.

152. I. Walker, "Cyborg dreams: Beyond Human: Background Briefing", ABC Radio National, Jan. 2012.

153. W. Grossman, "Peter Cochrane will microprocess your soul", Wired 6.11.

154. R. Fixmer, "The melding of mind with machine may be the next phase of evolution", The New York Times.

155. M. Billinghurst, T. Starner, "Wearable devices: New ways to manage information", IEEE Computer, vol. 32, no. 1, pp. 57-64, Jan. 1999.

156. M. Minsky, Society of Mind, New York: Touchstone, 1985.

157. R. Uhlig, "The end of death: ‘Soul Catcher’ computer chip due", The Electronic Telegraph.

158. K. Michael, M. G. Michael, "Microchipping people: The rise of the electrophorus", Quadrant, vol. 414, no. 3, pp. 22-33, 2005.

159. K. Michael, M. G. Michael, "Towards chipification: The multifunctional body art of the net generation", Cultural Attitudes Towards Technol. Commun., 2006.

160. E. McLuhan, F. Zingrone, Essential McLuhan, NY, New York: Basic Books, 1995.

161. M. Dery, Escape Velocity: Cyberculture at the End of the Century, U.K., London: Hodder and Stoughton, 1996.

162. J. Brown, "Professor Cyborg", Salon.com, Jan. 2012.

163. M. T. Maybury, "The mind matters: Artificial intelligence and its societal implications", IEEE Technol. Soc. Mag., vol. 9, no. 2, pp. 7-15, Jun./Jul. 1990.

164. Philosophy, Jan. 2012.

165. R. Kurzweil, The Singularity Is Near, New York: Viking, 2005.

166. Transhumanist Declaration, Jan. 2010.

167. F. Fukuyama, "Transhumanism", Foreign Policy, no. 144, pp. 42-43, 2004.

168. How Does Transhumanism Relate to Religion? in Transhumanist FAQ, Jan. 2012.

169.

170. J. Ellul, What I Believe, MI, Grand Rapids: Eerdmans, 1989.

171. J. M. Wetmore, "Amish Technology: Reinforcing values and building community", IEEE Technol. Soc. Mag., vol. 26, no. 2, pp. 10-21, 2007.

172. R. Kurzweil, The Age of Spiritual Machines, New York: Penguin Books, 1999.

173. B. Joy, "Why the future doesn't need us", Wired 8.04.

174. K. J. O'Connell, "Uses and abuses of technology", Inst. Electr. Eng. Proc. Phys. Sci. Meas. Instrum. Manage. Educ. Rev., vol. 135, no. 5, pp. 286-290, 1988.

175. D. F. Noble, The Religion of Technology: The Divinity of Man and the Spirit of Invention, New York: Penguin Books, 1999.

176. W. Kuhns, The Post-Industrial Prophets: Interpretations of Technology, New York: Harper Colophon Books, 1971.

177. D. J. Solove, The Future of Reputation, CT, New Haven: Yale Univ. Press, 2007.

178. J. Rennie, "Ray Kurzweil's slippery futurism", IEEE Spectrum, Dec. 2010.

Keywords

Technology forecasting, social implications of technology, history, social factors, human factors, social aspects of automation, human-robot interaction, mobile computing, pervasive computing, IEEE society, SSIT, society founding, social impacts, military technologies, security technologies, cyborgs, human-machine hybrids, human mind, transhumanist future, humanity redesigns, wearable computing, überveillance, corporate activities, engineering education, ethics, future of technology, sociotechnical systems

Citation: Karl D. Stephan, Katina Michael, M. G. Michael, Laura Jacob, Emily P. Anesta, "Social implications of technology: The past, the present, and the future", Proc. IEEE, vol. 100, Special Centennial Issue, pp. 1752-1781, May 2012, doi: 10.1109/JPROC.2012.2189919.