David Auerbach is a writer and software engineer who has worked for Google and Microsoft. He also teaches the history of computing at the New Center for Research & Practice in Seattle, USA. His new book is Meganets: How Digital Forces Beyond Our Control Commandeer Our Daily Lives and Inner Realities. He argues that the widespread concern about artificial intelligence is legitimate, but that the problem is already all around us, in the form of vast technological networks that no one, neither governments nor their owners, can control.
Your book deals with the threat to social and economic stability posed by meganets. How do you define a meganet?
The definition I use for a meganet is an opaque, evolving, persistent data network that greatly influences how people view the world. It is always on, and it consists of both a large server-technology component and millions upon millions of users who are constantly active, using those services and influencing them. All of these users play a small role in the collective authorship of how these algorithms behave. The effect is to contribute to a severe fracturing of society, in which we are literally becoming incapable of understanding one another as we split into self-selected, like-minded groups that enforce unanimity and uniformity and prevent any large-scale social consensus.
You identify social media platforms, cryptocurrencies and the so-called metaverse as aspects of this distorting combination of advanced technology and mass participation. You were a software engineer at Google and Microsoft. When did you start to worry about this phenomenon?
Well, the problem came to me sometime after the advent of social media. That’s when you saw these feedback loops, where people acted on algorithms, which then acted on people, and that’s when the public discourse seemed to be changing for the worse. But it took me quite a while to understand what the hell was going on, to realize that we have less control over these systems than we think, and even the people who run them have less control than we think.
One point that you make is that, as much as we want to attribute the lack of oversight at companies like Facebook to greed or indifference, these are not the real problems.
To some degree, the tech companies deserve blame for what they spread, but if you see this as a conspiracy to make money by making our lives miserable, you're going to get nowhere. There's a leaked internal Facebook memo saying that the narrative they least want is to be perceived as not being in control of their systems. Being evil is actually a better look for them than not being in control. But unfortunately, the narrative of not being in control is much more accurate.
So do the people who work within these meganets recognize the problems that you diagnose?
I think it depends on your position. The rank and file feel it, or certainly those I've spoken to do. If you're Mark Zuckerberg, you're dealing with an enormous amount of cognitive dissonance, because you feel that whatever you do, you get criticized. But at the same time, you don't want to admit to a level of helplessness. My suspicion is that executives are in varying degrees of denial about the magnitude of the problem.
Is this a universal problem, or really one that affects liberal free-market societies that cannot impose draconian control measures?
It is universal, although the involvement differs. One of the ironies I found was that China is actually less aggressive in deploying government-driven meganets than, say, India. That is because, in an authoritarian society, the danger to the party from a fallible government-run meganet is greater than in a society where failures can be attributed to the free market or to third parties.
If we think that these meganets are distorting our view of the world, making it more volatile and unstable, why can't we just switch them off?
These systems are too complex and diffuse to stop. It would be like closing the stock market, except that the degree of complexity is much greater than that of the stock market. What we should be aiming for is to mitigate the dysfunctional effects of meganets and exert some indirect influence on them. What we can't do is control them at the fine-grained level that people ask for.
What did you think of the recent story of Microsoft's Sydney chatbot going rogue and revealing its possible dark side to a New York Times reporter?
What people don't understand is that a lot of what Sydney was saying is our collective unconscious, our collective data being filtered through its algorithms, which makes it very much a reflection of us; it couldn't exist without the human component. People see it as a single, separate agent because that's what it appears to be. But we cannot understand what it does or why unless we recognize that it is the product of the relationship between programmed algorithms and the mass of our data on which they operate.
You write: “The social history of computing is a story of how we turned ourselves, our lives, our actions, our purchases, our words into data, online and offline.” Do you think we are aware of that process and somehow want it?
The fact is that we see ourselves more and more as collections of labels. "I am this, this and this": what people attack as identity politics. I think that's a misleading diagnosis, because it's really about classification and taxonomization. In a way, we now speak a more quantitative language, and qualitative richness falls through the cracks and is eliminated. So I think there is an awareness of it, but not necessarily an awareness of where it comes from. Do we want it? On some level, yes, because meganets have this potential to create an incredible sense of belonging, one where you're constantly surrounded by people you feel right at home with.
You discuss the GameStop case in the book, in which Wall Street firms lost billions of dollars after a subreddit group dramatically drove up the video game company's share price. What does that show about the power of meganets?
The meganet allows for decentralized forms of association that have never been possible before. And that devolves power to a greater extent than ever before, but it does so in a disorganized way, so it's not driven by rationality or conscious deliberation; you're dealing with the hive mind. It is not the wisdom of crowds, it is the chaos of crowds.
One solution you suggest for slowing down these fracturing social processes is, so to speak, to fight chaos with chaos. Could you explain that?
Meganets like to profile people demographically and match them up. This tends to create homogeneity and growing doctrinairism. If you were to scramble that, simply to keep things from congealing, it would at least slow the process down. There are several ways to do it. I think TikTok already has a way of injecting smorgasbord content into its algorithms; I believe the idea was that it was showing too many pro-anorexia videos in a row, or something like that. But if you focus on one type of content, you'll be playing whack-a-mole. So the question is, can you do it in such a way that you get heterogeneous content more generally, across the board?
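The diversity-injection idea described above can be sketched as a toy re-ranking step. This is purely illustrative, not any platform's actual algorithm: every name and parameter here (diversify_feed, inject_every, the category labels) is hypothetical, and a real recommender pipeline is vastly more complex.

```python
import random

def diversify_feed(ranked_items, categories, inject_every=4, rng=None):
    """Interleave items from not-yet-shown categories into a
    relevance-ranked feed, breaking up long homogeneous runs.

    ranked_items: list of item ids, best-first.
    categories:   dict mapping item id -> category label.
    inject_every: after this many top-ranked items, inject one item
                  from a category the user hasn't been shown yet.
    """
    rng = rng or random.Random(0)  # seeded for reproducibility
    feed, seen_cats = [], set()
    pool = list(ranked_items)
    while pool:
        # Keep a run of top-ranked items in their original order.
        run, pool = pool[:inject_every], pool[inject_every:]
        feed.extend(run)
        seen_cats.update(categories[i] for i in run)
        # Inject one item whose category hasn't appeared yet, if any exists.
        fresh = [i for i in pool if categories[i] not in seen_cats]
        if fresh:
            pick = rng.choice(fresh)
            pool.remove(pick)
            feed.append(pick)
            seen_cats.add(categories[pick])
    return feed

# Example: seven items of one category, one of another ranked last.
ranked = ["a1", "a2", "a3", "a4", "a5", "a6", "a7", "b1"]
cats = {i: "A" for i in ranked[:-1]}
cats["b1"] = "B"
feed = diversify_feed(ranked, cats, inject_every=4)
# "b1" is pulled up from last place to position 5, after the first run.
```

The design point matches the interview: rather than suppressing one flagged content type (whack-a-mole), the re-ranker nudges toward heterogeneity across the board, at the modest cost of slightly lower average relevance.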