Unintended Consequences: The Paradox of Technological Potential
Originally, this theme issue of IEEE Potentials was prospectively titled “The Dark Side of Technology.” We wanted to address some of the dangerous scenarios that arise from the development of new technologies without proper constraint, restraint, ethical judgment, or regard for people and the planet. Certainly, there was no shortage of case studies or scenarios from which to choose.
But something didn't sit quite right with us as we prepared the call for papers. While paying special attention to the potential for harm is a timely exercise, focusing solely on the dark side of technology, without offering plausible solutions to overcome, or ideally avoid, the darkness, seemed too simplistic an approach to what is undoubtedly an innately complex issue. To make this issue as useful as possible, we also discuss the need to address the potential for good: the allure and appeal of new technologies and, in turn, the subsequent risks and cautions.
For instance, social media connects us to friends and loved ones in unprecedented ways, and yet for some people, it has become a trigger for addictive behaviors: endless gazing into glowing handheld devices instead of engaging with the people in our presence. Leading researchers speak of online addictions to the Internet, social media (e.g., Facebook), and video games (e.g., Minecraft), while others devote their studies to the good that can come from those very tools.
The coming of age of artificial intelligence (AI) gives rise to incredible new opportunities in industry, health care, manufacturing, and beyond…and yet, in the halls of universities and research institutes, there are musings that perhaps as our technology becomes more humanlike, we are on a trajectory to become more like machines.
The pace and schedule of our lives, once determined by the seasons, the rise and fall of the sun, and our circadian clocks, is now set by the digital metronome of connective devices: status updates ("likes" and favorites), instant-message pings, and e-mail notifications, compelling us to be on, always, racing to keep up with the influx of information, media, and interconnection. Where the advent of the Internet promised the freedom of asynchronous communication, whereby we could connect with anyone at any time, whenever it was convenient for our personal schedules and priorities, the rise of smartphones enabled with push notifications taunts us to keep up, constantly, to connect synchronously, all the time, anywhere, stripping us of the promised potential of this new technology almost as quickly as it was presented to us in the first place. At some point, it becomes impossible to optimize a business process any further without compromising the human being within that process.
We cannot—nor should we wish to—stop the march of time. The potential of new technologies to make us smarter, healthier, and even happier than ever before is immense. But as the speed at which these technologies advance increases on a near-daily basis, it would serve us well to be thoughtful about how these innovations are being developed and to remain mindful of their use, application, and impact on the people and places they touch. With possibility comes responsibility: this is the paradox of technological potential. Underlying this potential is also a thriving market economy that enslaves all consumers into a lifetime of adopting one upgrade after another. Think of how the myth of Sisyphus might be understood in this context: left for eternity trying to push a heavy boulder up a hill, to live up to an unfulfilled promise.
A Complex Dynamic
With this in mind, we set out to explore the complex dynamic of unintended consequences. We cast our net wide as we sought out critical reviews and analyses, case examples, commentaries, interviews, and opinion pieces from researchers, futurists, practitioners, and storytellers, examining the hidden implications of our ever-digital lives. Just as the Renaissance era saw a boom in innovation, thanks to an appreciation of convergent skills and values, from science and mathematics to fine art and music, we are now living during a time when the line between science fiction and reality is blurring: our relationships are mediated, our memories are archived, and our identities are public documents.
The number of submissions to this theme issue was substantial. We received far more articles than we could accommodate and, as such, decided to narrow the scope of this collection to examine specifically how the paradox of technological potential is impacting society. Two special issues have been compiled—the first focuses on "society," which appears in this issue of IEEE Potentials, and the second spotlights the "self" and is scheduled to be published in IEEE Technology and Society Magazine in March 2017. It is also our hope that a column might follow within IEEE Potentials to further engage readers with these issues.
In this issue, you will find articles on the implications of rapidly advancing technology for government, organizations, privacy, and autonomy. We begin with an interview with Douglas Rushkoff, one of the founding fathers of the study of technological paradox. In this conversation with Ramona Pringle, Rushkoff explains how he not-so-secretly hopes for a public revolt or revolution, possibly the only way out of what he sees as an otherwise inescapable growth trap that we find ourselves racing to keep up with. Pringle presents Rushkoff's ideas alongside highlights of his most recent book, Throwing Rocks at the Google Bus: How Growth Became the Enemy of Prosperity.
From there, we continue with the concept of political revolution with an article by Brian Martin examining the role of social media and networked technology in protest and revolt. Repressive regimes aren't the only governments impacted by the rise of new tools.
The next article, written by Jesse Hirsh, looks into the state of predictive policing and how the role of those charged to serve and protect society is impacted by the onslaught of data and tools for processing that information. It should not be surprising to learn that Facebook is the first port of call for most policing organizations during criminal investigations, because it is free and can be used in tandem with biometric facial search software that crawls the web. More importantly, the move toward evidence-based policing is now well and truly entrenched as a practice, not just a philosophy. Intelligence is no longer enough. But with this big data movement for the "collective good," there are asymmetric impacts on the "individual." There is something wrong with the digital chronicling of our lives, the blanket surveillance coverage; perchance the good might turn evil, and we might need the data trail to prove it. In the not too distant future, we will be turning our attention to transparency, tampering, and trust.
How do we achieve proactive policing in actuality, a type of total information awareness? Through a range of veillances that Christine Perakslis et al. consider as “border crossings.” Surveillance technologies can now extend from the sky, to the street, to the person around you, to within you, as in the case of uberveillance, which denotes embedded surveillance devices. Will we each be asked to carry a tiny microchip implant, an alibi, that traces our steps, our location, and our condition for securitization?
Avner Levin explores this new age of veillance, offering suggestions for retaining the last remnants of privacy within a public space inundated by cameras and social media platforms that compel us to share the ever-changing details of where we are, who we are with, and what's on our mind.
As we consider how we can cling to privacy in this connected age, we shift our focus to the device with which we are most intimately connected, our smartphones. Peter Corcoran examines networks, connectedness, and our relationships with these mobile devices. Can we give up some of our privacy to help develop sustainable cities through technologically facilitating the wisdom of the crowd?
Expanding on the issue of privacy and networks, the next article by Jukka Vuorinen and Pekka Tetri surveys the paradoxical complexity of information security, whereby they argue that the “tools of control make systems uncontrollable.” In other words, the greater the inbuilt security in a system, the greater the costs to productivity. It is well known in the national security community, for instance, that foolproof security is not only a fallacy but is not desired by government agencies (e.g., secret services and intelligence agencies) that require access to make judgements.
Wilhelm E.J. Klein presents a philosophical investigation of our lack of moral intuitions toward technology, where the reader is pushed to ask if he or she would accept from a human, or a friend, the same rules of engagement we enter into so unquestioningly with our pervasive technologies. Klein makes a major distinction between humans and machines.
Finally, Ann Cavoukian, former long-standing information and privacy commissioner of Ontario, Canada, puts forward a proposal for an international task force focused on privacy as a solution toward the unchecked trajectories of new technologies. She argues that state surveillance is not just a privacy issue; it negatively impacts everything from innovation to prosperity, on a societal level. Best of all, she offers suggestions for how we might pivot at this critical juncture.
Tech Is Not Neutral
With all great innovation comes responsibility, an inevitable dark side, and with the exponential growth of technology, the window within which we can examine the ethics and consequences of our adoption of new technologies becomes increasingly narrow. The claim that technology is neutral is one of the most dangerous of all modern mantras. Technology can never be neutral because it cannot be disassociated from the humans who design it, sell it, buy it, and use it. It is endowed with the aspirations of its creators, and it also serves to create a technological elite. We shape our tools, and then, in turn, those tools shape us.
This is our time. Our time to make decisions about the world we want to live in, today, five years from now, and 50 years in the future. Can we envision that world and reverse engineer it to come to be? Can we develop a script, or a guide, of how to get to the future we want to inhabit? How do we coexist—with each other and with the increasingly complex systems and tools we are building—in a productive, positive way, amidst rapid technological change? Many would argue that is what we are trying to achieve through our current developments. But we challenge readers to take a pause and look at both the promise and the potential for peril with some of these so-called advancements—whether it be Internet-driven AI-based toys for children, body-worn recording devices that hear and see everything around them covertly, self-driving vehicles, or anthropomorphic robots that make the aged feel less lonely and depressed. All of these advancements appear under the philosophy of “Web of Things and People.”
We—the curious, the crafty, the creative, and the compassionate—have an obligation to truly understand our new technologies and the infrastructures being built around us and to use these powerful new tools to shape our lives, not let them shape us. How can we use these technologies to better ourselves? How can we use them to do good in the world? They will not do these things on their own. What we do with them is our choice and our responsibility.
Instead of fear mongering, let us raise alarms where there is cause for concern, and let us praise the cautious optimist, the one who is watchful, questioning, and critical, but with the openness to new solutions and the optimism that positive change is possible. How do we adjust our course, as a society, before it is too late? Can we acknowledge and take responsibility where we have made missteps and when new technologies have been exploited for harmful intent?
What we have gathered is a collection of disruptive perspectives, articles that present solutions and blueprints while questioning the status quo. In the following articles, you will read cautionary tales, studies on impact assessment and response, design principles, and ideas that examine changing standards, regulations, and approaches to corporate social responsibility. We hope you enjoy reading this issue, but we also hope you are disquieted, that your attention is piqued, and that by the time you reach the back cover of this magazine, you are left with more questions than answers.
As playwright Eugene Ionesco once stated, “It is not the answer that enlightens, but the question.”
Keywords: technology, social factors, human factors, ethics, technological innovation, social implications of technology, social networking (online), social aspects of automation, Google, unintended consequences, social media, networked technology
Citation: Ramona Pringle, Katina Michael, and M.G. Michael, 2016, "Unintended Consequences: The Paradox of Technological Potential," IEEE Potentials, Vol. 35, No. 5, Sept.–Oct. 2016, pp. 7–10. DOI: 10.1109/MPOT.2016.2569672