Transcript

This transcript was completed with the aid of computer voice recognition software. If you notice an error in this transcript, please let us know by contacting us here.

 

David Skok: Welcome to Big Tech. I'm David Skok, the Editor-in-chief of The Logic, a publication focused on the innovation economy.

Taylor Owen: I'm Taylor Owen, a senior fellow at the Centre for International Governance Innovation and a professor of public policy at McGill University.

While David focuses on the business of tech, I look at the way technology is impacting society.

David Skok: And this is a podcast that explores a world reliant on tech, those creating it, and those trying to govern it.

 

[CLIP]

Stuart Varney: The CEO of the New York Times says that Facebook's fake news algorithm will damage democracy.

SOURCE: Fox Business YouTube Channel

https://youtu.be/ypdaYrG-rbE

“Facebook algorithms damaging democracy?”

June 15, 2018

[CLIP]

Michael Smerconish: So, does spending too much time on social media, like Facebook, Instagram, Snapchat, actually increase loneliness and depression? That's the new finding of a brand new study by researchers at the University of Pennsylvania ...

SOURCE: CNN YouTube Channel

https://youtu.be/OHQ4fhWoeLs

“Can social media use cause depression?”

November 24, 2018

[CLIP]

Tristan Harris: We've never had a media device that, literally, a billion people are being programmed in the same way, where so much influence is in the hands of a few technology designers.

SOURCE: PBS NewsHour YouTube Channel

https://youtu.be/MacJ4p0vITM

“Your phone is trying to control your life”

January 30, 2017

[CLIP]

Charlie Angus: While we were playing on our phones and apps, our democratic institutions, our form of civil conversation seem to have been upended by frat boy billionaires from California.

SOURCE: Charlie Angus YouTube Channel

https://youtu.be/xw7n9-5__QE

“Charlie Angus at International Grand Committee: Democracy ‘Upended by Frat Boy Billionaires’”

November 28, 2018

 

David Skok: It's become common wisdom that modern technology isn't always good for us. Instagram and Snapchat are making us lonelier, Facebook is destabilizing our democracies, and smartphones are changing the way we think.

Taylor Owen: These arguments aren't new, and of course they are debatable, but few make them as compellingly, and as enthusiastically, as Douglas Rushkoff. Rushkoff is a prominent media theorist, and the author of books like Present Shock, and Throwing Rocks At The Google Bus. His most recent book is called Team Human, which is also the name of his podcast.

David Skok: Rushkoff argues that a lot of our technology is anti-human, it isolates us when it could be connecting us. And, it generates animosity where it could encourage empathy. But, although he is an impassioned and outspoken critic of technology, he doesn't blame the tech itself for our societal woes. Rather, the way we use it.

Taylor Owen: A quick note to our listeners, this interview was recorded in February prior to the onset of the COVID-19 pandemic. We wanted to bring it to you anyways though because we thought it was a fascinating conversation and might offer a bit of relief from the COVID news cycle we’re all absorbed in. We hope you enjoy it.

[MUSIC]

David Skok: Douglas Rushkoff, welcome to Big Tech.

Douglas Rushkoff: Hi, good to be with you.

David Skok: You've said that tech de-socializes us, and alienates us from our souls, that it's anti-human. These are big words. What do you mean?

Douglas Rushkoff: Well, I don't think that tech does this, unless we tell tech to do that. I mean, tech is really good at doing lots of different things, but we have programmed our technology to extend a really obsolete industrial age program, which is remove human beings from the equation wherever you can. Use unqualified labor, make processes repeatable, convince consumers that they need stuff so they buy things that they don't really need, monitor consumer behaviour, use behavioural finance and other techniques to addict people to certain cycles. That's stuff we've been doing a long time. And so those of us who were exposed to computer technology, and networking in the '80s and early '90s thought, oh wow, we finally have a way to break free of this, and to start using media and technology to connect with other people, rather than just be manipulated by companies. We pivoted away from those possibilities in the late '90s, and now we're living in a world where technology's being used, primarily, to monitor, and control human behaviour, and to keep us addicted to the sorts of behaviours that ... Well, that threaten the extinction of our species. And we tend to do so, by keeping people from connecting with one another. The more rapport and connection you have with other people, the less vulnerable you are to some of these really primitive means of throwing you into fear, and panic, and getting you to behave impulsively, or reactively. It's come to the point where technologies, even like Facebook which may have, at one point, been intended to help college guys find girls to date, which at least was humanly connective, is now really just a tool of market research and social controllers.

Taylor Owen: Let's take the Facebook example, we've talked a lot on this show about all the ills and challenges with Facebook. But, can you walk us through some of the behavioural manipulation piece of this? So, you open Facebook, you're exposed to a feed of content. How is the intent to manipulate our behaviour surfaced?

Douglas Rushkoff: Well, I guess it serves ...

[BEEP]

Taylor Owen: Two questions into our interview, the line goes down.

David Skok: He's really gone, eh? That never happens.

Taylor Owen: After a couple minutes of troubleshooting, we reconnect.

Douglas Rushkoff: Dude, that was you, not me.

David Skok: Can you hear me now?

Douglas Rushkoff: Yeah, that was you guys, though.

Taylor Owen: There we were, three self-professed technology experts, trying to figure out which one of us was responsible for the crappy Internet connection.

Douglas Rushkoff: That's weird. I mean, I could check my Internet speed. Let me see. I get a little bleep when it goes away from you. Is that what you get?

Taylor Owen: We get a bleep when you come back.

Douglas Rushkoff: Oh. I get a bleep when you go away.

David Skok: That's just great.

Taylor Owen: With the connection up and running, we dive back in.

Taylor Owen: How exactly is Facebook controlling us?

Douglas Rushkoff: Basically, you're asking how do we know, or what are the ways that Facebook is trying to dehumanize us?

Taylor Owen: Yeah.

Douglas Rushkoff: Principally, the newsfeed on your Facebook page is composed by an algorithm that was ported from the slot machines of Las Vegas, in order to induce addictive behaviour. The value system on what constitutes a successful post has really nothing to do with the post itself, but how much traction it got, how many stories it generated, how many people clicked on it. We end up adopting the metrics of a market researcher or advertiser as our values. And to do that in a professional space, maybe to do that on LinkedIn is one thing, because that's a place where you're just trying to make money or something. But, to expect your social realm, which is what Facebook supposedly is, your social medium, to somehow correspond to the values of the market reduces human connection and behaviour to that which can be exploited for financial gain. That's a real problem.

David Skok: How is this different to any other time in human history? Or, is it different? I mean, technology has advanced. The story that I often recall is the one of the Lumiere brothers, sitting, creating a motion picture for the first time, and people sitting in a cinema in 1897, and seeing a train come towards them and ducking under the seats. Is this any different to prior technological revolutions?

Douglas Rushkoff: Well, when the train came at them in their seats, were they told that, if you vote for Teddy Roosevelt, this train will not hurt you? Were people measuring the impulses that could push them back into their seats, so we know how to control a population?

David Skok: So, it's the feedback loop, it's the measuring of the reaction to the technology that's different?

Douglas Rushkoff: Well, it's two things. One, it's the intent. I believe the people who were making the trains were doing it as a form of entertainment, where the people paid a nickel or a dime to go in, and they had this experience, and that was what they were doing. On Facebook, people are not paying with their money, they're paying with things they don't even know they're delivering to a company whose primary intent is not to entertain you with no strings attached, but rather to addict you to a platform in order to predict and manipulate your behaviour in the long-term. I think they're really different things. One is an extension of the arts, and the other is an extension of programmatic advertising.

Taylor Owen: Before we leave the different moments in media history, you've spent a lot of your career looking at those different moments. It drives me crazy when we hear the constant argument that, well, each new revolution in communications technology had negative consequences, and it took us decades to figure it out. Then, ultimately, we did. In part, because it seems to me there's something fundamentally different about the Internet. Do you think that's true? Is this a different moment?

Douglas Rushkoff: It may be. I mean, they're all different moments. The shift from an oral culture to scribal culture, that was a really big one. Once we could write things down, we got history, and we got a future. That's when, instead of living in a circular, timeless culture, we moved into a society of progress, and each year we're going to be better than the last. In a lot of ways, that was very positive, to make the world a better place. In a lot of ways, it was really negative because it led to Colonialism, it led to interest, and corporatization. It's happened before, in broadcast media as well. Broadcast media became a tool for manipulating people, that's what television is. Television was really ... Maybe it wasn't invented to do this, but it certainly proliferated for its ability to create consumer demand for products that people didn't need. The whole economic model of America was based on how do we get people to buy stuff they don't need, in order to get jobs for people in factories, to make the stuff that's surplus. We created a consumer society. The Internet stood a chance of breaking us from that cycle of consumer desire. The problem with the Internet, when it first came around, was in 1994 they did a study, and they found out that the average Internet connected home was watching nine hours less television a week. They were watching nine hours fewer commercials, there were no ads online. The Internet was an ad free zone, back then. They're watching nine hours less commercial television, this could crash the whole economy, so that's why they put advertising online. The reason Netscape went public, and we got the Dot Com boom, was so that the Internet could be, rather than this new, weird other thing, an extension of television.

Taylor Owen: But, even more powerfully, because there's more perceived choice?

Douglas Rushkoff: Right, but even more powerfully because it feeds back, and works in real time, and all that. In some sense, I feel like we haven't genuinely moved into the digital media environment, we're using digital media to do television. Trump is not the first digital candidate, he's kind of the last TV candidate because the tweets and things he says only matter if CNN and MSNBC broadcast them.

David Skok: We're talking a lot about communication. I'm curious if, more broadly, you see a delineation between communication technology and innovation, versus overall industrial or societal innovation? Is there a distinction, in your mind, or in your view, among communication technology and how it impacts us, versus more broad technology?

Douglas Rushkoff: Oh, that's interesting. I wonder how much of the Internet we think of as communication technology, as opposed to some version of one way, broadcast technology. Communication always, as a word, to me has communal in it, it has some sense of parity. So, I think of the telephone as a communications device, and I think of the early Internet as a communications device. But, I think of the Web as this other thing, this flat catalog. I guess, you mean media, computers, and technology versus other stuff, like machines, and nano, and robots, and all that?

Taylor Owen: Yeah. We struggle with this all the time, in the scope of digital technology studies. Is it just communications and media, or does it include AI? Clearly it includes things like algorithms, but how far can you extend that?

Douglas Rushkoff: Digital media is really anything that we relate to, or manipulate through a symbol system, rather than directly. Digital is everywhere, you can have a digital thermometer, digital television, digital ... It doesn't just have to be what we think of as communications media to be digital. The biases of digital end up impacting everything. What I'm hoping is that some of the biases of digital are able to overcome the biases of industrial, corporate, extractive Capitalism. Oddly enough, I feel like I'm defending digital against something else, that it's never been digital that I've been upset with. People think, oh, you used to like digital, and now you hate digital. It's like, no, digital's been the same, I used to love the way that we applied digital, and now I hate the way we're applying digital. There's a really big difference. It's like, I like hammers as long as people aren't hitting each other in the face with them.

Taylor Owen: I mean, in some ways, it's the way that the digital's been appropriated by power, right?

Douglas Rushkoff: Yeah.

Taylor Owen: It's not just humans, it's a certain manifestation of human control and power that has reshaped the digital?

Douglas Rushkoff: Yeah. It happens to every great culture. I mean, you're doing this great thing, think about the early '90s. You're more likely to find a computer programmer going to raves at night, than looking at their Charles Schwab stock indexes and things. Those of us who were involved in computers, our parents were worried for us. They thought that if you're going into computers, it would be like you're going into Dungeons and Dragons or something, that this is a useless hobby. When big business came along, I think it felt to a lot of hackers like we would maintain control of this thing, that they were coming to us. Or, that the only way for this thing to get to the next level was by accepting money from those folks, or making some kind of a compromise. I don't think we realized just how powerful those people are. It's like we missed the '60s, and didn't realize how quickly an entire important movement would be co-opted the minute it is deemed a threat to the status quo.

David Skok: I think of John Perry Barlow, and his "Declaration of the Independence of Cyberspace," his utopian vision of the internet back in 1996. Do you think Barlow was naïve?

Douglas Rushkoff: Naïve would be the best word you could use for it. I think people like me were naïve to believe it.

Taylor Owen: You believed it at the time, right?

Douglas Rushkoff: Yeah. I mean, I was a kid. Or, a kid, I felt like a kid anyways, in my 20s. But, I was taking E, and going to raves, and here's this wise adult, who lived through the '60s, and wrote lyrics for the Grateful Dead, and was as cool as cool gets, and understood these machines, and email, and Usenet, and was everywhere on The Well, and friends with Stewart Brand and Howard Rheingold, was saying right after Operation Sundevil ... Which is when they arrested all these nice little hacker kids, just for poking around AT&T, and they were doing the Communications Decency Act, trying to shut down the web because they were scared of child porn or some other bullshit propagandistic fear. Along came John Perry Barlow and said, "Hey government and law enforcement, get off our net!"

Taylor Owen: You can't control us.

Douglas Rushkoff: Right, we're going to take care of it ourselves. What I didn't realize, because I hadn't read Marx yet at the time, was that government and business balance each other out, kind of like fungus and bacteria in your body. If you get rid of all your probiotics, the candida is going to run amok in your body. So, you get rid of government, and corporations are going to run amok on there. Really, what he had done, and I didn't know what a Libertarian was, what he did was espouse a California Libertarian ideology that business can take care of itself, and will take care of all this. It turned out, no, because corporations are not fans of the free market, corporations are fans of-

Taylor Owen: Of monopolies.

Douglas Rushkoff: Right, exactly, and something regulated in their favour. So, we didn't end up getting even the free market that Barlow might have spoken about. But yeah, that was a big turning point. I mean, for me, the turning point was the day that Marc Andreessen took Netscape public. Netscape was a non-profit thing, and the day that Netscape went public was the same day that Jerry Garcia, the guitarist for the Grateful Dead, died. I always took that to be symbolic of that's the day that we-

Taylor Owen: Jesus Christ.

Douglas Rushkoff: That we left behind the 1960s values that were informing that early homebrew computer club, West Coast/Bay Area, psychedelic, rave, fantasy role-playing, hypertext Gaia hypothesis, Terrence McKenna, Ralph Abraham, Esalen, neural-network-of-humans Internet, and traded it in for this dot-com, extractive, exponential Kurzweilian nightmare.

David Skok: I don't know how to follow that, but I'll try. One of the things that these corporations do care about, for whatever reasons, is productivity. That productivity has resulted in these technologies, even allowing us to make this podcast, apart from the Internet breaking down, relatively easy to do. What are the positives of this technology? Do you believe that there are positives that have come from what I would call the pursuit of productivity, that corporations have pursued?

Douglas Rushkoff: I mean, some of the productivity is illusory. Maybe our fidelity is somehow higher, in this rickety little thing that we're trying to use to connect, and our listeners won't know that we've been disconnected four times as we tried to do this thing on our various pieces of technology. I challenge some of the productivity, just in terms of whatever. But, I agree. Thanks to word processing, I can write a novel a heck of a lot easier than I could do it on my Smith Corona. So yes, productivity has gone up, but what we don't have is an economy that knows how to incorporate higher productivity in a way that benefits human beings. Look at something as simple as energy, if we actually had a usable, renewable energy source, it would screw up the oil companies to the extent that it would crash our whole economy. We have an economy built on non-renewable extraction. How do you shift from an extractive economy to a renewable one? That's the same kind of a question, how could we cope with a world where robots tilled the fields, instead of humans? Would we let ourselves just lie in lawn chairs, and drink iced tea, while the robots did all the work, could we tolerate that? No, because we have an economy based in scarcity, we don't want to give something to someone unless they have a job, so we create jobs to get people to do stuff that we don't need, in order to justify letting them participate in the spoils of productivity. It's an economic question, not a technological question. And one that someone, like even Andrew Yang, was closer to addressing than most other people on the public circuit right now.

Taylor Owen: He, in some ways, represents almost an ideology that we can fix this problem, right? That there's not just necessarily technological solutions, but that there's an economic reordering that can happen that would change the directionality of these technologies.

Douglas Rushkoff: There is.

Taylor Owen: Do you think that's the case?

Douglas Rushkoff: Yeah. I don't know that it's UBI. I wrote a whole book about this, called Throwing Rocks At The Google Bus, whose original title was Distributism: Operating System For a New Economy. I mean yeah, distributism is the idea that, rather than redistributing the spoils of capitalism after the fact, you pre-distribute the means of production before the fact. It's basically platform cooperatives, so that the drivers of Uber own the platform. Even while they're training their robot replacements, while they drive, they're going to own those robot replacements once their jobs are gone.

Taylor Owen: The co-op movement though, I hear you, but why hasn't ... For something like Uber, that's a fairly replicable technology, why hasn't the co-op platform movement taken hold in that space?

Douglas Rushkoff: Because Uber has multiple billions of dollars to operate at a loss.

Taylor Owen: So, it's just pure subsidization? Yeah.

Douglas Rushkoff: Yeah, they're subsidized with stockholder money in order to be the last man standing, put everybody out of business, and then everybody out of work. They become the sole provider of goods, and the sole employer in the same community.

David Skok: Do you see that changing at all, with the WeWork tale? The VCs that I've been talking to are pretty clear that they're going to be pursuing more profitability than scale in the companies they're investing in.

Douglas Rushkoff: Hopefully, yeah. I went to a Pepsi shareholders meeting, and they were there shouting the number, like 4.3, 4.3, it was their growth target for the year. I got up there to do my keynote, and I'm like, "Dudes, you're one of the 50 biggest companies in the world. If you have to grow at 4.3% in order to be okay, then we are all fucked." How big can you get, and why do you have to get bigger? That's a problem of math, not a problem of business.

David Skok: Yeah. I wonder, and this is going into a direction that I did not expect us to go down, but would the market not just create different investment vehicles then, that would give them that growth? I guess, is it a fundamental human desire for growth that drives this, rather than a particular asset class, and what you can get on that return?

Douglas Rushkoff: Only since the invention of interest-based currency, from the 11th to the 13th Century. After Feudalism, there was a couple of centuries when people started to trade with one another. We got marketplaces outside the towns, and people had local currencies that acted like poker chips. They were valueless, except for their ability to enhance the velocity of transactions, so that the guy with bread could get a chicken, and the guy with chicken could get shoes. You used little chips, and that was money. The problem with that was people were getting wealthy, the former peasants of Medievalism were becoming the middle class, and the Aristocracy who didn't work were getting relatively poorer. So, they came up with some laws, and the biggest law was nobody's allowed to use those local currencies anymore. Instead, you have to borrow central currency, at interest, from the central bank. You have to pay it back, but you have to pay back more than you borrowed. If you have to pay back more money than you borrowed, where does that more money come from? Growth. So, we have growth, growth became a requirement of business. Not because the business needs to grow in order to survive, but because the banker needs growth in order to make money, simply by having money. That system no longer works. It's stupid, it's exploitational, and it runs amok in an extractive economy put on digital steroids. So no, it's not intrinsic or endemic, whatever, to business or commerce or innovation, it's an artifact of a banking system that was created by Monarchs in the 12th Century, who have long since left the building.

Taylor Owen: So, if this is a structural problem, both embedded in the economic model, but also in the way the technologies are structured themselves, and the power that's on top of that, how do you see us breaking that cycle?

Douglas Rushkoff: I mean, one of two ways, either reform or collapse. You know, so there's people like me, who are talking about reform, and alternatives, and how it's not so bad, and it might be fun. You get to spend more time playing, and we have enough stuff, and we don't have to have wars, and all that. Or, it breaks, and when it breaks, there's a lot of casualties. But, you come up with something else. I still think some sort of Renaissance is possible.

Taylor Owen: Renaissance, to a certain degree, requires human ingenuity, and human agency. I find that, if you're talking about the lack of human agency, and particularly when we step into conversations around transhumanism, and the way the technologies are actually, fundamentally changing what that human agency looks like or is even capable of, is that an added risk here, that a Renaissance might not even be possible because we're losing so much agency?

Douglas Rushkoff: I mean, it may not, but it may be. That's why I'm doing my whole Team Human shtick, to say celebrate your humanity, look into the eyes of another person, establish rapport, retrieve the social mechanisms that were painstakingly evolved over 800,000 years, and use them to engage with others, and forge solidarity, and experience yourself as part of the greater human organism again.

Taylor Owen: Is Team Human going to win?

Douglas Rushkoff: If Team Human gets to go on, then Team Human has won. We never fully win, but do we get to go on another day, another year, another century? I'm concerned, given how many species we've annihilated over the last century, I'm concerned for ours as well. I still don't even really know how much we can do about it, at this point. Have we tipped? Is the three degrees, four degrees inevitable? And then, what other sort of system effects happen at that point?

David Skok: Douglas, it speaks to the humanity of our species that we have thinkers and writers like you in our world, and we're grateful that you took the time to talk to us today. Thank you so much.

Douglas Rushkoff: Oh, that means a lot to me. It's great that you're doing this. At least we're having the conversation. But, it always gets down to what can I do, what can I do? I think that the first step, and people are never willing to just take this, is the big question is, what is your comportment as you move through life? In other words, I've become ... I don't mean this in a defeatist way, but I've become less concerned with some goal, we've all got to come together so we can do this, or do that. And rather, what's your approach? How are you approaching each moment, how are you approaching other people? What level of kindness, and grace, and humility are you bringing to each of the moments that you live? I feel like that's really the best starting place to retrieve our humanity.

David Skok: That was Douglas Rushkoff, author of Team Human.

David Skok: Make sure you subscribe to Big Tech on Apple Podcasts, Spotify or wherever you get your podcasts. We release new episodes on Thursdays every other week.

Taylor Owen: Big Tech is presented by the Centre for International Governance Innovation and The Logic and produced by ANTICA Productions.

 

For media inquiries, usage rights or other questions please contact CIGI.
The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.