Each week in the lead up to Brave Conversations, we'll be chatting to two featured speakers about their take on the issues facing humans and technology, now and in the future.
Follow us on Twitter for updates at @braveconvos and #braveconversations
The Rise of The Machine
What are the biggest challenges emerging with regard to humans and the Web?
Michael: Humans are increasingly (directly or indirectly) reliant upon the Web as a means of satisfying fundamental needs, such as water, food and shelter. I would say the biggest challenge emerging is the securitization of critical sociotechnical systems that rely on the Web either upstream or downstream and are responsible for the ongoing production and delivery of human needs.
I have often reflected: here we all are, billions of interconnected people, hundreds of millions of interconnected corporations, hundreds of sovereign states, varied national and supranational laws, closed and open economic infrastructures each with their own financial systems, geographic expanses each with their unique environmental conditions, all relying on the Web. Here we are pouring our hearts and souls into this Complex Living Web by uploading text, images and video, busily dotting the i’s and crossing the t’s toward long-term sustainability, growth and development. Data… data… and more data… My biggest fear? The annihilation of this Web of Everything into a Web of Nothing – a fragmented, broken, virus-ridden and desolate e-wasteland, filled with bot-generated disinformation. What then remains as truth?
We don’t have paper backups any longer, we don’t even keep tape drives any longer, and the lifetime of a social media-based message is diminishing. The paradox is that while we can store everything in our daily digital lifelogs, we seemingly don’t remember what happened to us yesterday, let alone history before the Web. Yes, “it’s all in the Cloud,” we are told, but we forget the Cloud is actually on the ground, and highly vulnerable and susceptible to acts of God or to human attack.
From a socio-ethical perspective, I am very concerned that we are generating bits and bytes almost continuously, by being hooked to online applications driven by the Web, some of them purposefully addictive. The systematic extraction of personal information from individuals will leave our spirits with little to bear that is sacred, and yet some companies will reap the profits, leaving us profusely naked in the process. “Yes, they already know more about us through psychometrics than we know about our family, friends and even ourselves”. We may well be adding much in terms of the “number of words” and the “number of images” and the number of “location points of interest”, but what does that profit us as humans? Do we understand the temporality of life? Our own limitedness? How do we understand life and death in this scenario of digital chronicling? Yes, in the short term it helps us reflect on our practices, but in the longer term, all the data in the world also has a finite lifetime because the earth is finite (according to Science). My worry is that we are preoccupied with the wrong aspects of our life (the temporal, not the eternal), and in so doing our choices are made not on the spiritual relationship questions, but on the technocratic prospects of machine learning and artificial intelligence. We worship technology today, and most of us don’t even know it. Proponents of this view say that technology has an answer for everything. And that is a fallacy.
Ackland: I’m not going to say these are necessarily the “biggest” challenges, but they are what I spend some time thinking about, so they reflect my own interests.
People having to sign up to proprietary online services in order to not have a degraded social life or travel experience. I’ve never felt the need to go on e.g. Facebook because I’m old enough that I wasn’t going to be excluded from social circles I wanted to be part of. I avoided Google Maps (particularly with the location tracker) for many years because I didn’t want Google to have that information, but in the past couple of years I gave in because my ability to travel, in new cities in particular, was compromised. So now Google knows what I do, and I don’t like it.
Relatedly, it is harder to maintain an online professional presence without doing so via one of the walled gardens e.g. Academia.edu, LinkedIn. I don’t want to invest time in putting together an academic profile on e.g. academia.edu when I know I can’t get my data out of it.
Misinformation, fake news, social bots (in e.g. Twitter) pushing particular political agendas. A lot being said about this at the moment.
Filter bubbles – the phenomenon whereby people receive information that supports their previously held opinions, because they only connect with people similar to themselves in social media (homophily), and because recommender algorithms in social media serve up news and information that matches their previous behaviour. Social media could be leading to more polarisation, although to be honest some people have been saying this since Web 1.0 (‘echo chambers’, ‘cyberbalkanization’). But the 2016 US presidential election put misinformation and filter bubbles back into the spotlight.
Machine learning – not obviously just to do with the web, but the web is important in recent improvements in machine learning (e.g. data for deep learning). Machine learning may have very negative impacts on particular sectors of the workforce.
Cybersecurity – don’t really need to add too much here, as it’s something everyone is talking about.
MOOCs – as an academic, this is obviously something that is on the radar for me. Perhaps MOOCs will do to academia what the Web did to the music industry and newspaper industries.
Amazing that I’m having to say this, given email is such a relatively old technology: spam. I can’t believe the number of emails for fake conferences, fake journals and general rubbish that I receive.
Censorship of the web. Most people think certain things shouldn’t be on the web (e.g. relating to inciting hate, child pornography etc.) but the problem is the grayer area (as the saying goes: one person’s terrorist is another person’s freedom fighter). Countries are always going to insist that they have the right to censor material that some (e.g. in the West) think shouldn’t be censored. How to come up with a system that objectively classifies material that should/shouldn’t be censored, without imposing moral judgments?
Abusive and uncivil behaviour on the web, cyber-bullying: shutting down particular voices.
Michael: I agree with your censorship stance… interestingly there is a loose connection between disinformation and censorship… countries can say “open” to everything and then “drown out” perspectives and ideas by sheer volume. In effect, society is at the mercy of those that control the “algorithm”.
What are the three most important things that we can do about them?
Michael: We need to build in contingencies. What if there was no Web? What would that mean? How would we go on? Consider from across vertical sectors.
Critical systems that are required for humans to exist must have off-grid backup and disaster recovery plans. Water is #1. Better understanding of interconnected and interdependent systems. Water affects electricity, electricity affects telecoms, telecoms in turn affects banking… it is a highly-meshed domino effect.
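The “highly-meshed domino effect” Michael describes can be sketched as a tiny dependency graph: knock out one system and trace what fails downstream. The systems and dependencies below are illustrative, not a model of any real infrastructure.

```python
# Sketch of cascading failure across interdependent critical systems.
# Each system maps to the systems that depend on it (illustrative only).
DEPENDENTS = {
    "water": ["electricity"],
    "electricity": ["telecoms"],
    "telecoms": ["banking"],
    "banking": [],
}

def cascade(failed_system):
    """Return the set of systems knocked out by one initial failure."""
    out = set()
    stack = [failed_system]
    while stack:
        system = stack.pop()
        if system not in out:
            out.add(system)
            stack.extend(DEPENDENTS.get(system, []))
    return out

print(sorted(cascade("water")))     # a water failure takes out everything downstream
print(sorted(cascade("telecoms")))  # a telecoms failure only reaches banking
```

Even this toy version makes the point about contingency planning: a failure at the top of the chain (water) reaches every system below it, which is why off-grid backup matters most for the upstream systems.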
Media literacy is paramount. We need to get ready for a new breed of medical conditions resulting from too much sitting, not enough verbal communication with each other, and repetitive, technology-triggered actions akin to Obsessive Compulsive Disorder. Beyond that, we must work out how to maintain privacy, since privacy is a fundamental component of freedom (i.e. of human rights).
Ackland: We are never going back to the time when people created their own websites – it was only a small subset of people who could do this anyway. So we need technology that allows ordinary users to produce and consume information and connect with people. But we need some way for people to keep ownership of their data and move it easily, rather than having it stuck in a walled garden. I always thought peer-to-peer online social networks such as Diaspora (https://diasporafoundation.org/) seemed very promising, but of course there is a major network effect going on here: who wants to be on a peer-to-peer online social network that none of your friends are on?
Machine learning – tax the robot? (Bill Gates is calling for this).
Web censorship – a crazy idea I’ve had for a long time is that a “censorship trading scheme”, similar to the carbon trading schemes set up to reduce greenhouse emissions, could be something to pursue here. Each country (or this could be done at the level of individual organisations) gets a certain allocation of censorship credits that it can “spend” on censoring particular websites, and on preventing other websites from being censored. That would lead to a price of censorship for a given website. Child pornography sites would have a price of zero, because presumably no country would use its censorship credits to prevent a child pornography site from being censored. However, politically-oriented sites would have a non-zero price. China might want to spend its credits on censoring Free Tibet websites, and some western governments might want to spend credits to keep these sites uncensored. Of course this couldn’t be policed etc., but it would lead to a market that could be used to objectively ‘price’ censorship and would potentially lead to less censorship. When I proposed this at a conference on web censorship, people were baying for my blood. I didn’t expect it to go down well, but I got up and said it because I was sick of hearing people ready to impose their values while acknowledging they had no way of objectively quantifying how ‘bad’ it is to censor a sex site, compared to a political site.
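One way to see how such a scheme could imply a “price” of censorship is a toy tally of credits spent for and against censoring each site. All actors, site names and numbers here are invented for illustration; the point is only that a site nobody defends prices at zero, while a contested site acquires a non-zero price.

```python
# Toy sketch of the "censorship trading scheme" idea: actors spend
# credits to censor a site (positive) or keep it uncensored (negative).
from collections import defaultdict

# (actor, site, credits) bids -- purely illustrative values.
bids = [
    ("country_a", "political_site", +30),
    ("country_b", "political_site", -50),
    ("country_a", "abuse_site", +10),
    # nobody spends credits defending abuse_site
]

def settle(bids):
    """Tally both sides per site and report an implied censorship price."""
    censor = defaultdict(int)
    protect = defaultdict(int)
    for _, site, credits in bids:
        if credits >= 0:
            censor[site] += credits
        else:
            protect[site] += -credits
    result = {}
    for site in set(censor) | set(protect):
        # The "price" of censoring a site is what its defenders spent:
        # to censor it, the other side must outbid that amount.
        result[site] = {
            "price": protect[site],
            "censored": censor[site] > protect[site],
        }
    return result

print(settle(bids))
```

In this run the undefended site is censored at a price of zero, while the political site stays up because its defenders outbid its censors at a price of 50 – the kind of objective quantification the proposal was reaching for.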
For more, see Brave Conversations here