Revision Path has had quite a journey, so for our 6th anniversary, we're going to take some time to go over the highlights (and lowlights) of the past year, and give you a sneak peek into what's coming up on the horizon. Also, we've got a special treat — a conversation with Glitch CEO Anil Dash and CryptoHarlem founder Matt Mitchell from this year's inaugural Forums @ Civic Hall event titled "The State of the Internet 2019"! A huge thanks to all of our listeners and supporters over the past six years!
Links from The State of the Internet 2019
Like this episode? Then subscribe to us on Apple Podcasts, Google Podcasts, SoundCloud, Spotify, or wherever you find your favorite podcasts.
Subscribe and leave us a 5-star rating and a review! Thanks so much to all of you who have already rated and reviewed us!
Revision Path is brought to you by Glitch and sponsored by Facebook Design, Google Design, and Mailchimp.
Powered by Simplecast. Sign up today for a 14-day free trial!
You can also follow Revision Path on Facebook, Twitter, and Instagram. Come chat with us! And thanks for listening!
[NOTE: This transcript is only for the recorded portion of the episode from The State of the Internet 2019 on February 28, 2019.]
Danielle Tomson: ... and with that, I will hand over the mic to Matt Mitchell.
Matt Mitchell: Hey. Hello my fellow internet users. Welcome to State of the Internet. Hey, I'm Matt Mitchell and I do a couple things. I'm just going to go over that real quick before I break into my State of the Internet talk. I founded this thing called CryptoHarlem and that is teaching folks and everyone in New York, but really folks in the inner-city, really black folks about all sides of technology, the good, and the bad, and also how the bad tends to be kind of meted out and harmful to those folks and what they can do to protect themselves.
MM: There's a lot of circumvention technologies, learning about cyber information security, things like that, and I'm the Director of Digital Safety and Privacy for a nonprofit that's based in Berlin, Germany called Tactical Technology Collective. They do a lot of great work. If you went to this event called The Glass Room, which was in New York last year, they're the ones that threw that thing. I saw you nodding out there. That's great. Okay, so let's jump in.
MM: So we're talking about the state of the internet and we're looking at where we are. It's 2019, kind of kicking it off in New York City, home of the rest-in-peace, formerly HQ2 space, right? Although I heard our governor was on the phone begging, please take us back, so we'll see what happens. But in all seriousness, what does the internet look like for us in the future? What does the future of technology look like? I'm going to talk about some things that I'm seeing and where I think we might be heading. I think in 2019 we have a fork in the road; we have to go one way or the other. One way we could go is a way where tech companies kind of run the show. And then there's another way we can go, and that's where the people run the show. The humans.
MM: Now it's weird, because when you open up a tech company, there's actually humans inside. I think what's very important for us to do is to try to see why there are two roads and why they're not converging in the middle. Why don't they benefit each other, etcetera. In the past we've had this thing called "Disrupt Harlem," and I was there speaking to young folks in Harlem, in my neighborhood, and I was telling them about opportunities for them. I asked them, "How many people do you think work at Twitter?" And these kids were like, I don't know, a hundred thousand people. And I was like, I don't know, I think globally you're looking at maybe thousands of people; let's say it's 3,000 or something like that. And I was like, how many of those people do you think are black folks? And they were like, maybe half of them. I don't know, maybe 10% or whatever. That's part of the problem: when you look at the actual numbers, they're a lot smaller than what you would think.
MM: For a lot of these tech companies you're going to see entire staff reports; there's an EEO-1 report. That's a report that tech companies give to the US government (they can choose to publish it online too), and instead of "10%" or "5%," it has the actual numbers of people in different positions. When you look at a lot of tech companies, and the Center for Investigative Reporting looked at a lot of these companies, you'll see that a lot of these numbers don't meet any kind of bar for what a diverse organization would look like. Let alone engineering or executives; for just any kind of marginalized group, very low. I think at one point, Twitter had 70-something black folks working there total. They were just not quite at the number you're thinking of when you have thousands of employees. And I like to wonder, how does that happen? How do these tech firms, these companies, not look like the people who use them? We use these tech firms, we use their software, we use their technology, but we don't ask them to represent us.
MM: A lot of us are consuming the technology on our mobile phones, but you can't develop a program on a mobile phone. Right? You're not someone who can construct the technology; that's not something you can control and look into. Also, we're living in a day and age where tech firms are influencing us in our politics. Tactical Tech has this data and politics area of our website where we do a lot of research and we look at these influence companies. We found over 250 organizations that work with political platforms and different governments across the globe. They're kind of like the Cambridge Analyticas of the world. They're trying to influence you through the platforms you use, whether that's Twitter or Google or whatever. Through that influence, they've been able to kind of guarantee certain outputs: we can sway an election, we can change people's viewpoints. That's something we really need to be careful of, because it's not just fake news or misinformation. It's people's lives that can be lost.
MM: In places like India, with WhatsApp, we see a lot of innocent people where misinformation has said that this person is guilty of some crime, and then a mob decided to find justice. Then they stone or hang that person.
MM: Or places like Myanmar, where the government, that's the military, said, this is a page that could be popular; we're going to take over this page and use it. A lot of the people who live here are reading about these snipers and these military units, and we're going to use it to change the way they think about a marginalized group in our country. A lot of people point to this takeover and this kind of propaganda on social media as what led to genocide in Myanmar. You see some platforms have said, we weren't designed with any safeguards to protect against these things happening. We weren't designed to help people not fall for this. What we're going to do now is ban accounts. In Myanmar, the heads of the military's accounts, hundreds of people's accounts, were banned from platforms like Facebook.
MM: So how do we, as people, prevent this from happening? I think it comes from learning about what has happened in the past and trying to push for positive change. There are groups like Color of Change, which is a racial justice organization; I used to work there in a 10-month fellowship, helping them with their digital security. They try to work with platforms to diversify their workforce, because they feel like if there are more of us at these platforms, there will be less chance for them to go off the rails. And they also try to work with platforms by pointing out where they are doing wrong.
MM: Last year, a New York Times report found that Color of Change was targeted by a platform that paid a lobbying firm, Definers Public Affairs, to go after them: research them, do some opposition research, and try to create a misinformation campaign against them and other groups that were saying, look, we need to be careful about these different technology platforms, the ones we use and put our information on, that can be used against us. It's kind of strange that a nonprofit that's just trying to educate people gets targeted by a huge lobbying campaign. But that's the day and age that we're in right now.
MM: So what can we do as regular internet users to fight against, let's say, misinformation campaigns, to fight against people trying to influence us to change the way we think, the way we work, etc.? I have a lot of faith, because I look at things like the Amazon HQ2 debacle and I see something that says: as regular people, we have a voice, we can actually make change happen, and we don't have to have this kind of defeatist attitude, that the platforms have to always exist the way they are; they always have to be staffed the way they are; they always have to employ the same people that they do; they always have to fight for the same causes the way they do. We can ask for anything we want, because at the end of the day, it's our information that fuels these platforms. It's our data that they monetize.
MM: And at Tactical Tech we have this thing called the Data Detox Kit, which I recommend you all look for. It's kind of like an eight-day detox, kind of like a health detox, and you can go through it and say, maybe I'm going to use Twitter less today. Maybe I'm just not going to like and share that on Facebook today. It walks you through little things you can do. We also have on our website information on politics and elections, and research that shows things like links in the Brazilian election to the use of WhatsApp to sway the electorate. By looking at things like this and understanding how it works, you can be more critical about what you're seeing on these platforms. I don't think that leaving a platform is a bad idea. And I don't think that's impossible. I think that we have a lot of information on how to do that.
MM: There's an article on Tactical Tech's platform that talks about, let's say you wanted to leave Facebook, how you could go about doing that. And what if you didn't want to leave, but you just wanted to change your privacy settings and have less of your data shared? You can do that too.
MM: And I wanted to talk about the work that I do in Harlem with CryptoHarlem. One thing that we found is a lot of social media platforms are being monitored by law enforcement, and that targets folks who have a high incidence of contact with law enforcement. If you're living in the inner city, in the Bronx, in Harlem, black and brown youth are getting swept up through these gang conspiracy charges. Right? And gang conspiracy charges require very little evidence. Just having someone's number in your phone, or just a like on someone's Facebook post, is enough for you to be considered a suspect. There are cases like this guy Jelani Henry, who lives around the corner in Harlem. Jelani was in Rikers for 18 months waiting for his day in court over such a charge, connected to social media and a phone number. And only because he decided not to sign anything, only because he decided not to take any pleas, was he let out at the end of that time.
MM: And when we think about how that happened: all of our platforms, whether they're search engines like Google or social media platforms like Facebook, have an inroad, a back door, a window for law enforcement to use them. There aren't many rules or regulations on that. There aren't many challenges to how that works. So if you go to Facebook and you write posts, whether they're private and only between you and another party, or something on your wall, all of that can be used against you. It's not something that your defense has access to, but it is something that a prosecution would. If you go to sites like Facebook.com/records, or lers.google.com (that's the Google version of it), you'll see a page that's like, hi, law enforcement, this is how you can use our platform to access data, find information on users, etc. I think that's something a lot of users aren't aware of, and it's something that's not used universally against all users. What we find is that for a lot of young people, a lot of folks who live in the inner city, this is used to kind of prey on them.
MM: But I'm not just trying to talk to you about negative things and the bad parts of technology. The platforms do have the occasional positive, and that's a good thing to see. One thing that I think is encouraging is that platforms are bringing encryption to their products. Right? If you're using Facebook Messenger, it has a secret conversation feature, which is encrypted. It uses the same crypto that Signal uses, but it's not the default way messages are sent. There's a lot of talk of Twitter one day encrypting your direct messages. And it's only by having that privacy, by knowing that what you've posted and what you've shared and what you've created cannot be accessed by another entity or another group, that you can really be safe online. Alright? Okay.
DT: You're done?
MM: I'm done. Okay. I gave my two minutes. Alright thank you.
Anil Dash: Thank you Matt. Matt is modest because the work he does is so important especially with people who are most vulnerable, so I am very appreciative to get the chance to follow him. And thank you to everybody for having me here tonight.
AD: Did you all get something to eat? [inaudible 00:12:47] we provided the food so I want you to eat. I'm an Asian parent. I need to feed you. Food is very important. I want to make sure everybody's feeling good.
AD: There's a couple things I want to get a chance to go over tonight, and I think so much of it is grounded in the ideas that Matt has brought up. When we think about the state of the internet, we look forward. I want to go back a little bit, because I think it makes sense to sort of look back at where we've come from, and Civic Hall especially feels like one of those places; this is a community I've been connected to for a long time. Ten or twelve years ago there was the rise of what we started to call civic tech. That's old enough that people called it Gov 2.0. Does anybody even remember that vintage? Yeah. You'd put like a point-O on things; that was the hotness for like 5 minutes.
AD: So that era has fortunately passed, but what's most interesting to me is that back then, the people who were drawn to engaging with the civic and social impacts of this technology seemed a little bit out there. Right? Especially if, like me, you were a programmer. I'd been designing apps, building software. I worked in Silicon Valley, and I left and came back home to New York for all the reasons you do that. Not just pizza.
AD: And you know, I came back and I just sort of felt like, am I throwing my whole career away? Am I ever going to work in this industry again if I say what we do matters in the world, that the impact it has on society matters, that maybe we should have a little bit of regulation, that maybe we should be a little bit mindful of what's going to happen to the people who actually use these apps? I was frankly terrified. And I know I was not alone. I think most people who had that reckoning, not just ten, twelve years ago, but 20 years ago, and as recently as a few years ago, felt like: if I speak up on the ethical, social, even moral impacts of the apps and the tech we use, am I saying I can no longer be part of the industry and part of this enormous wealth that's being created, this enormous opportunity that's been created? Because we did also see, by the way, the richest companies in the history of the world being built, which is a nice thing to want to be proximate to. They give you free snacks.
AD: So that was something that jumped out at me, and the interesting thing is to now look at this through the lens of 10 years later. Well, we're at the end of the beginning. I don't have to make the case to you about this stuff mattering in the world. Even if you hadn't just heard Matt articulate it so well, you would say, yeah, of course. Of course the apps that I'm using, of course the information that I'm sharing, of course the data that's being collected about me is impacting my life. Of course these technologies are skewing what's happening in media, in politics, in culture.
AD: And that represents a great milestone. That's a big shift, because it used to be that if you were the person talking about this stuff, if you were somebody that cared about this stuff, you were Chicken Little. You were the one saying the sky is falling, and they wouldn't believe you. So that's a big shift, and I want to reflect on that, because I think it's something we almost take for granted. But shifts in culture that large don't happen by accident. They don't happen casually. They don't happen on their own. There is no intrinsic momentum. I don't actually believe the moral arc of the universe bends toward good; I think we bend it ourselves. So that's something that is really important to reflect on. The good news is we don't have to explain that technology can make things worse. The bad news is that's because we can see it. And, you know, this crowd in particular is mindful of that change, but maybe we don't always reflect on how it happened.
AD: One of the moments that jumped out at me most: two years ago now, there was the "tech won't build it" movement. Right? This was the first time I could recall seeing the consumer internet industry, the people who build the apps, who build the technology, gathering together and saying, in this case, we won't build a religious registry. We won't be part of building a system that leads to injustice. We learned from history; we learned from IBM providing machines to the Nazis in the forties and thinking, wow, that was the worst thing we could possibly do with technology, let's not echo that again. And the interesting thing is, this was not top-down. This was ordinary workers. This was the people who code, who design, who work at every tech company out there. At pretty much every major company, and a lot of minor ones, people spoke up. I grew up in a union household. My mom worked in a union for years. I worked in construction with tradespeople who were in unions. The mechanism is not new; this is an ancient mechanism. But to see it in tech, and to see people banding together where they had been really, somewhat militantly, anti-organization for a long time: we don't belong to things. We don't join things unless it's a Slack channel, right?
AD: So that was a really, really big shift, and it kind of happened casually and in a very nerdy way. You had to go to GitHub and make a pull request, which is like the arcane incantation of belonging for some classes of coders. I struggled to do it, but I know how to do it, so I could put my name on the list. Now, they made it too hard to participate, but it was just easy enough for a lot of people to put their name on the list and say, "I'm not going to be party to this." That was a revolutionary moment. That was extraordinary.
AD: Probably most people, even in this space, wouldn't be able to rattle that off the top of their head. If you think about it, you're like, "Yeah, I vaguely remember that." That's very telling. The most dramatic turning points in the history of social tech, civic tech, whatever we want to call it, sort of go a little bit under-reported. They don't get as much attention as they deserve in the moment, and then they fade pretty quickly. Now, a little harder to ignore was the Google walkout.
AD: The Google walkout was obviously dramatic. It was the biggest political surprise of last year, though it's seldom framed that way. We talk about it in the specifics of the working conditions that people at Google, particularly women at Google, had to endure, which is right and just, and that is a starting point. But what it represents is the first time there was an organized labor movement at a major technology company. That's radical. That's stunning. That's transformative.
AD: We saw labor movements transforming education, with public school teachers across America, and we were like, "Sure, yeah, teachers. That's what they do." The fact that they were successful was surprising, but the fact that they were organized was not. The same thing happened at Google, and we have kind of already forgotten about it. These are generational changes. These are fantastically huge changes, the likes of which no technology company has ever seen before. And they had huge, huge impact. Direct impact.
AD: Without that, the rest doesn't happen. There is no other mechanism that can change things. One of the things I most want to talk about is that we can identify the problems that Matt gave voice to, that I can articulate. It's everything from sentencing software that replicates and amplifies the injustices of race and class that have resulted in the biases in our criminal justice system for decades, down to the minor things, the things that seem a little less serious: "Is this photo filter going to work for my face? Is this app going to properly classify the people I connect to and recognize that they are humans, first of all?"
AD: And that there are people who are worthy of being treated as first-class users of an application. Those things are minor. They're small. The other extreme is the misinformation, the manipulation, the things that lead to elections being distorted or mass violence taking place. The worst things that humans know how to do to each other, technology can help amplify. So at every single point on that incredibly long continuum, from "Does this suit my vanity for how I look in this photo filter?" to "Are we persecuting vulnerable people?", the institutions that we would now rely on to help rein in those problems are not up to the task.
AD: If we ask, "Can traditional media and journalism report on these problems?" Not while they're fiscally being gutted, and while very few of their staff writers have the technical fluency to report on these things. And if they do have that technical fluency, they'll probably leave journalism and go work at the tech companies. If we say government is going to rein these things in, how many members of Congress, especially before this last wave of Congresspeople came in, could install an app on their smartphone?
AD: Three, yeah. It's one hand. You can count them on one hand, right? I think AOC knows how to now, so we've doubled the number of people who can install an app on their phone. If you don't have that basic level of fluency, like "I've got Snapchat on my phone," there is no way you're making fluent policy. That's extraordinary. Think about it: if every member of the Finance Committee in the Senate said, "I've never had a bank account. I don't know how to use an ATM. I don't really know anything about money, but I've got a nephew and he's got a bank account," you would say, "You're probably not qualified for this. You're a little out of your depth." Right?
AD: You will hear that verbatim from people talking about technology. The people who regulate technology will be like, "I don't have a smartphone, but I've got a nephew and he made a website." And they're not even embarrassed. They're not even humiliated. When you can brag about your incompetence, there's a very big problem. Of course, that's a broader cultural problem too. But, coming all the way back to-
Audience-Female: [inaudible 00:22:13] bank account.
AD: Yeah, yeah, right. And so there's this challenge: we would rely on government and justice, we would rely on media, to be the watchdogs and to rein in the abuses, but they're not up to the task. Then if I say, "I'm a technologist. I used to be a decent coder; I'm pretty bad these days, but I still try," and I probably have the fluency to at least inform a conversation about it, but I'm not going to enough City Council meetings, and I'm not going to enough state legislature meetings, to deal with the volume of policy and regulation that needs to be created. Who is going and volunteering?
AD: Are you all going? I hope so. Some of you are, a couple, right? It needs to be every one of us every night to go to the number of conversations that are happening. We're going to have to do what in tech we call scaling. We're going to have to have a lot more people participating to make these institutions fluent enough in the technical challenges that they're facing. That's really, really hard because these are paths that don't cross. So often, these are very, very siloed.
AD: Somewhere between contempt and derision: that is the attitude of most of Silicon Valley toward policy makers and regulators. There are exceptions, but that's the default stance of a lot of the rank and file, separate from leadership, the people writing code, the designers. There's a lot of skepticism. Similarly, there are a lot of people making policy thinking, it's just an app. It's just tech. It's nothing new under the sun. It's just a faster and louder version of what we already had; we can use TV and radio regulations to regulate this too.
AD: So it's a misunderstanding of kind. It's a category error about the conversation that needs to happen. The same is true for media and journalism as well. The shocking thing is that that disconnect persists. There are some places where it's getting better; obviously, there are people in each of these cohorts who are trying to bridge the gaps. But the disconnect fundamentally still exists, even as the problems get more serious. Usually when a problem gets more urgent, people start to come together and say, "Let's fix this together." This is one of the places that convenes those conversations, but generally there aren't enough of those conversations taking place, and they certainly aren't increasing at a pace commensurate with the severity of the problems being caused.
AD: So that's something I've spent a lot of time thinking about, because I'm culpable. I built social platforms 15 to 20 years ago. The first time that somebody ever doxxed us, by publishing our home address, was on a platform that I had built, where I had helped create policies under which we thought that would be a fair expression of freedom of speech. I felt differently after it was my child whose address they were publishing. And I didn't know what I didn't know. I reflect on that every day.
AD: There are people who have been made vulnerable due to choices I made in apps that I shipped and put in front of millions of people. And I wasn't even the one who succeeded. I didn't become a billionaire, and nobody uses those tools anymore. I skated away. Fortunately, the stuff I made is obsolete; I wasn't good enough at it, or ruthless enough, or whatever combination it takes. So I'm not as culpable. But I have friends who were there, and they did end up on the other side of it. It's nice, because they have private jets and shit, but the downside is real harm. Real harm. Not just "this makes me feel creeped out, this person is making me feel uncomfortable," but real harm.
AD: When we think about those real harms, who is paying that price? A lot of us have seen the recent stories about the people responsible for moderating communities, and the incredibly gut-wrenching things that they face. Worse than the problem of moderating the most mortifying things that humans could share with each other is almost the surprise aspect of it. Things veer from the banal, the annoying, the trivial, to the extremely severe. If you think about it, the thing that people crave most, that pretty much every mammal craves most, is that intermittent reward: "Am I going to get a piece of cheese when I push the button?"
AD: Intermittent severe emotional punishment is probably the worst thing that we can do to humans, and we're relying on thousands of them, who pay that price, to be a huge part of the barrier between us and the worst things these platforms can do. I get it. I ran a team like that. Fortunately, again, we weren't big enough to have to deal with the worst things all the time. But when we see those things happen, the impulse that we have as technologists, that I've had as a technologist, has been to say, "Let's just hire more moderators. Let's use AI. Let's solve this by trying to stuff the cat back into the bag."
AD: It doesn't work that way. The designs have to change at a very, very fundamental level. I don't just mean better flagging, better reporting. The economics of the platforms have to change. When we have systems that make more money as they gather more data, that make more money as they get more attention, there have to be really, really fundamental changes. The good news, there is some good news in this that I can offer you, is people are trying to do this. People have always been trying to do this.
AD: There is an independent internet. There are independent creators. There are small platforms. There are even moderately-sized platforms. There are platforms where people talk about what they're reading or what they're knitting. We're trying to build one where people are just sharing fun little apps. Those of you who are old enough remember back in the day: GeoCities, and your Neopets page. Neopets had some problems, but there weren't mass waves of Nazis on there, so that's good.
AD: What it takes me back to is thinking about where we've solved problems in other domains outside of technology. If we look at what it's going to take to teach our kids well, then we get involved in our schools. We get involved locally. We show up. We have these conversations. We talk to the people who are teaching, and we learn from them too. There are analogies to this in art, in local newspapers, in local media, but in particular, I did say food is important.
AD: I looked at the food we consume. Imagine, if you look at your phone right now, how many of the apps on your phone do you know who wrote the code in those apps?
AD: Zero, right? Now imagine if the only food you ate was fast food. You go to McDonald's, and the only food you eat off their menu is what they tell you: "Well, this is the food that has the highest profit margin for us, and you don't get to see the menu. We'll just give it to you. Don't worry about it, you eat it. The algorithm has chosen the right burger for you. And we won't tell you about the sourcing of the food." There will be no local food. There will be no farm-to-table. There will be no going to Whole Foods and seeing who grew the apple that you're eating. None of that stuff.
AD: Then that's every meal. That's breakfast, lunch and dinner. Is there any possibility of being healthy with that as your diet? I don't think so. And then I look at my phone. I look at my tablet. I look at my PC. I look at all the things that I have and I say, "There has to be a path forward where some of what we consume digitally is made by people we know, maybe even people who love us, maybe some of those people are local and not far away, not in Silicon Valley, not on the other side of the world-" I mean, this is New York City. We are not hurting for creative people.
AD: Then to be able to say, "What percentage of my diet is that? What percentage of what I consume, what percentage of what I encourage others to use of where I put my investment, my time and my data, are things that are created by people who I trust, who I know, who I feel share my values, who have thought about these issues, and are building systems that prevent the harms that anticipate the negative reactions as opposed to trying to put the cat back in the bag after the fact?"
AD: I hope that is the diet that we all get to enjoy. Thank you.
AD: You sure?
Speaker 2: [inaudible 00:30:40].
AD: Maurice is going to come up. He's going to join us. I'm lucky I get to have Maurice as a coworker and a colleague. It's very fun.
Maurice Cherry: Hi everybody. All right. What's up?
AD: Hey Maurice.
MC: How are we doing? No, we're doing pretty good, you know, all things considered. A round of applause again for both of them. Those were two really great conversations. [inaudible 00:31:12]
AD: Yeah, we're on.
MC: I kind of can't hear myself, but that's okay. So I have some questions for both of you. We'll start off with you, Anil, since you went last.
AD: All right.
MC: You kind of started off saying that we're basically at the end of the beginning, and you've been around the Internet for a long time. I've been around the Internet for a long time.
AD: You called me old, right?
MC: We're both old.
AD: All right.
MC: You know, so what do you think was the tipping point for all of this kind of public perception of tech changing?
AD: I think everybody had a different one. Culturally, you can't ignore the 2016 election; obviously it was a reckoning for people about how media manipulation works, and how misinformation works, and those things. But I think everybody had a moment where they said, "Why is that ad showing up? How did they know?" The first time you saw an ad that was like, "How did they know I was looking at that? How did they know I was going to buy a new pair of boots?"
AD: The moment you have that first, "Whoa. Whoa, whoa, whoa, I don't understand how this happened. I don't know why this is on my screen." I think that was the sort of moment, and everybody had a version of that in some aspect of their life. Pretty much a billion people had that experience over the last [inaudible 00:32:25] five years.
MC: So it's kind of been like a collective reckoning-
AD: Yeah, yeah.
MC: Essentially. We all saw re-targeted ads and thought, "Wait a minute."
AD: Yeah, or something like that. Just some aspect of like, "This surprised me in a way I didn't expect in a very personal way."
MC: Okay, all right. Matt, we've got a question for you. Near the end of your talk, you were sort of saying that we have the option to elect out of using some services, you know. Some might go as far as saying it's a boycott. Someone may take it all the way and say, "Oh, we've got to stage these tech boycotts," et cetera. Certainly, I think we've seen that with several platforms over the past year and a half or so. They don't work. The tech boycotts don't work in general.
MC: I mean, I think it's something where we certainly want people to know that it is an option, but en masse it doesn't seem to really move the needle. Why do you think people are still doing them if this change doesn't happen?
MM: I think directly affected people will always fight for liberation, and at the end of the day ... I'm vegan, right? You don't have to be vegan, but I am. I might have a milk boycott. It's just not something I drink. I believe in animal rights, but I'm not having a meat boycott. It's just I choose not to eat that because it's not good for me. The first thing that people tell you is that these things always existed, and you must always use them. One generation will pass, and then they'll accept it. Then a second generation will pass, and it will cement that. But it is not true.
MM: These boycotts don't work because they're not boycotts. It's not a boycott, right? It's like, stop using this 'cause it's bad. Get crack out of our neighborhood, period.
AD: So this is interesting, 'cause when you said that boycotts can happen, that was the first thing you said where I was like, I don't know if I agree, right? And it jumped out to me because if I say, "I don't want to do anything with Google. I don't want to be involved, I delete my account, do whatever," but I still use email, and you're on Gmail on the other end, then Google's still got my email. There is no "they can't have my data," right? If I'm participating in the digital world, they're gonna have an outline of me, a shadow of me, right? Some version. And this is the thing I worry about, especially for marginalized communities. If I say, "I'm opting out of all that," well, all right, how are you gonna get a job if you're not on LinkedIn? How are you gonna open your network to connect to opportunity? That's the thing I worry about. I can object to ... there are communities where misinformation is spread on WhatsApp, period. You need to talk about some of them.
AD: But my cousins in India are on WhatsApp, and that is what it is, and we didn't ... they grew up without running water, to be able to connect to someone on the other side of the world to see baby pictures, I'm like, "look, I'm going to be there." And so the cost of what would it be like to go back to being disconnected for them, it's too high, so this is almost this coercive effect of your network.
AD: I totally agree, economically it's not a boycott at all, because it's not about the economics of it. That's not how that works.
MM: Well I would say, real quick, when it comes to LinkedIn, they're not hiring us anyway. It doesn't matter. You can ... I mean, I'll show you the numbers.
AD: No that's ... math is math.
MM: You can be a PhD out of MIT, you are not getting that job. So ...
MC: Shots fired.
MM: Look, WhatsApp or nothing, that is not true. There's many alternatives.
MM: And people will come up with their own alternatives headquartered in their own cities. Helping their own people.
MM: When I look at people who are South Asian, I'm not seeing moderators who are going to see the most despicable content and get paid nothing, and feel like they won the lottery to get that job. I'm going to say, "Look, you have your own capital, you have your own resources, you have your own community. Start your own thing." Let's open up that Glitch India office, and take over.
AD: It's wild because, you know, I would go to India as a kid. Everything ... they didn't have Coca-Cola, they had Thums Up. It's because, "We're not going to have western companies come in and create soda for us, we're going to make our own sugar water," right? And it was such an interesting thing where it's like, "Yeah, we rot our own teeth." But they went into ownership and equity as a starting point because it was such a recent thing for them to be independent.
MM: Yeah. Thums Up, man, all day.
AD: Yeah, that's good, though.
MC: Question for both of you. Mostly for Matt, but both you can answer this. Matt, you mentioned Color of Change is one organization that is out there walking the walk and talking the talk as it relates to pointing out the fact that tech needs to be fixed in these certain areas. Are there other organizations or people or agencies that we need to be on the lookout for?
MM: Oh my goodness. It's a crazy long list, you know?
MC: Plug away.
MM: Okay. So, we have Color of Change, which is a civil rights organization that uses technology to uplift all of us, so definitely join their mailing list, because that's how they work. It's free, and you're going to get one email from them this year that's going to be worth signing up for.
AD: And if you're on social media, follow Rashad Robinson, their director. He's incredible.
MM: And he's an amazing dresser.
MM: Rashad. Amazing.
AD: Yeah, that [inaudible 00:37:33] He's got it.
MM: Then there's a group called Code 2040. And they're like, look, in the year 2040 we want these tech companies to just change a little bit to look like the actual people. And they're like, look, we need to get more people of color in these companies and this is our path and our mission. And we're going to do it by 2040, and we're going to have all these people coding, and they're going to work at these tech companies. Change from within type of thing, right?
MM: There's a group called Dev Color, and they're like, look, we're the black engineers at Facebook, at Google, at all these places, and we're one of few. And the hard thing isn't hiring an engineer of color, right? I mean, even though you look at the stats and you think it's impossible to hire a black woman. The real hard thing is keeping them, because you face so many micro-aggressions day after day that you end up running and going to the non-profit space or something like that, or the government. There's a lot of people I know who work in the government who are like, I used to work at Twitter as an engineer. It just didn't feel welcoming. I had to go, you know?
MM: Whether you're LGBTQ, gender non-conforming, trans, you're black, you're Latino, you're whatever you are, you're different, you're reminded of that difference even though you're told this is a welcoming space. And I think that's what makes people drop out and leave. So that's something that they're doing, Dev Color.
MM: Let's see. There's ... What's that? Girls Who Code, of course. Yes.
AD: Black Girls Code. [inaudible 00:38:53]
MM: Black Girls Code. Blah Blah Code. Code Code Code.
AD: Identity code.
MM: Yeah, which is awesome. Let's see. There's Hack the Hood, which does a lot of work out on the West Coast. They're trying to get people from Silicon Valley, from these tech firms, to volunteer their time to educate folks in the inner city, and that's Hack the Hood. They're doing a lot of good. It rhymes.
MM: I mean, there's a lot of these groups. There's an insane number of them. I wrote this tweet and I was just trying to do a Black History tweet for only groups that do this and it was ... I was like, wait, there's too many.
AD: There's so many.
MM: And the problem is even though there's so many, we still hear well, there's a pipeline problem. We're still here. Where are these people? We can't find them. And I think that that's, there's a reason why we hear that. Because the face it's coming from always looks the same.
MC: I'll also throw out The Design Justice Collective is another good one for any designers out there.
MM: And there's this great blog and podcast, Revision Path. Revision Path interviews people, so you can catch up with people who are doing this work day after day and see where your stories align.
MC: Thank you.
MM: You know, downwards.
AD: Lay the plug, I like that.
MC: Today is also our sixth anniversary, so ... Six years in the game.
MC: So Anil, let's go back to two points that you mentioned that I wanted to touch on. You said that few people in positions of power, government, et cetera, are fluent in how tech makes us vulnerable. I think any of us who have seen the hearings where Mark Zuckerberg is testifying to Congress are like, what are they talking about, right? But you also said that many institutions have to help fix the internet. It's not just a singular solution. How do we connect the dots so those people can help out the ones that are in power?
AD: You know, I think we need people in Congress that are nominally fluent in the contemporary world, that exist on Earth in 2019. Sorry, there's just ... there are so many members of Congress where you're like, are you even on this planet? Are you in current reality? And then on top of that, to be sort of technically fluent. And you do look at the new wave of folks that are coming in where you're like, at least they have, like I said, used a smartphone, engaged in the world. So then it's possible to contextualize here are the challenges, to have them sort of speak to it.
AD: You know, voting is never, especially in a time with so much disenfranchisement, going to be the only answer, but I think it's a pillar. Like it is one of the core things we can do. I think a lot of it is old-fashioned civic engagement. And I said it glibly, but I mean it very sincerely. Like going to city council meetings, going to community board meetings, going to school board meetings, talking about what you know. Where everybody in this room has a level of technical fluency that is orders of magnitude better than most of the people that create the policy that shapes our lives.
AD: And you might feel like, oh well, there's somebody here in this room who's a great programmer and I don't know all that. It doesn't matter. What you know is enough. You showing up is enough, and it's hard. It's hard to make time. I think that's one of the things where like for me I'm balancing am I going to have dinner with my son tonight, or am I going to a bunch of folks that are yelling about parking spots, and talk about technology? You got to, at least sometimes, you gotta do it. But if everybody did that like once a month, one night a month, it would change incredibly rapidly.
MM: And I think to kind of piggy back off that point one thing you'll notice even when trying to find some of these spaces or go to some of these places, the design is not that great. They don't really have things together, but that's often because they don't have people like the folks in this audience that are there to help them.
MC: So, speaking to what you said: you go, and you're kind of automatically the expert. Not saying that it's a big ego thing, but just consider it. You'd be surprised how even just a little bit of knowledge goes a long way on a civic level.
MC: So question for the both of you about diversity and inclusion.
AD: I'm for it, so I don't mean to brag.
MC: You're my boss so that's good to know.
MC: So diversity and inclusion has been this [inaudible 00:43:10] call in tech for years now, and I think certainly it's something that people are very excited about. I would say there's even a whole cottage industry around it at this point. But companies are talking the talk; they're not walking the walk. Matt, you mentioned the EEO-1 reports. Anil, you touched on it briefly in your talk. They're just not doing better overall as it relates to recruitment, hiring, or retention. So what do you think needs to happen in this industry in order for substantive change to happen along those lines?
AD: A couple things. The people talking about it would have to mean it. That would be a starting point. And most don't, and I think the EEO-1 reports are really good examples. So these are federally mandated reports where publicly traded companies have to report on who makes up their companies. And the wildest shit about this is that Google, as a publicly traded company, just didn't do them for a long time. For like ten years. And it's not like, "Uh, teacher, a dog ate my homework." They're just like, nope. Nope. It wasn't until they were going to get sued about it, and people inside started reporting their numbers independently as employees, that they started doing this. And at that point you can't even argue you're working in good faith. You are hiding your numbers. Period. And that should've been the reckoning.
AD: And then they sort of papered over it, and they have a good ability to communicate, and pretty much every company in the Valley was doing this. But there was a shift where all of a sudden they're like, all right, now we're proudly publishing our numbers, and our goal is to increase things by one percent over the next four years, if we're lucky, for black representation, and 0.2 percent for Latino representation. And I'm like, you're the same people saying we can get a reusable rocket to go to the moon and come back, or you're the same people saying we're going to get a permanently flying drone providing wi-fi over disconnected parts of the world. And you're like, can you hire in absolute numbers six black women instead of four? And they're like, that's too hard. I don't buy it. So I don't think they mean it.
MC: Do you got anything to add to that, Matt?
MM: Yeah, I would say as a developer, whenever I see a bug in the code, I want to fix that. And these tech firms, when there's a problem on their platform or there's a server outage or something, they fix it. And if they wanted to fix the D&I issue, it wouldn't exist. It doesn't make you more money. It doesn't translate to you getting more people on your platform. So the incentives aren't there.
AD: Well, it would. It actually does translate to more users, right?
MM: I don't think so. I don't think so, honestly. It's going to stop things like Snapchat having a blackface filter. It'll stop things like that. But it's not gonna ...
MC: Knock on wood.
MM: You know, the code that's written by a black woman with dreadlocks or a gender non-conforming, awesome person with purple hair and a dog collar, that code runs just as great as a white cis male's code. And I think that at the end of the day, until we start our own things that look like us, we're not gonna ... This isn't a problem that's worth solving to people who don't care about it.
AD: Yeah, there's not a pain point. To take the point from earlier about boycotts: Delete Uber was powerful enough that it cost Travis his job, right? And that's not to say there was that much economic impact, but it had the combination of economic impact and PR impact and all those sorts of things hitting together. And also he was vulnerable, because there were other cultural issues, so there was like a perfect storm when he did this thing. Who's going to get fired for not following through on D&I commitments, right? That's actually where it's at. And can they be fired?
AD: In the case of Snapchat, I'm sorry if you own Snapchat stock, but you don't have any voting rights. They managed to go public with the company in a way where you have no rights as a shareholder to do anything. So the odds of them fixing inclusion problems at that company, even if you're a shareholder who's like, I own your stock and I care about this issue, are zero. They're only going to change if they want to.
AD: And I look at this with our company, where we work. To a first approximation, as recently as five years ago the company was pretty much a hundred percent white men, right? Very talented ones, but it was not an inclusive organization, period. And we're shifting. We have a lot of work to do, but you all hear me around the office: this is a thing we're going to do, we're going to spend time on it, we're going to set goals. And we're going to get to proportional representation on our staff in the US. And I'm accountable to it. That's not that hard to say.
AD: The same is true of, we're going to grow our users and we're going to try to make more money. Guess what? We're a company. That's what we do. The difference is it's not any more outrageous for me to say we have those goals, and to say I'm going to be accountable to them, than it is about our business goals or about being good at design or any of the things that companies want to do. But how many tech CEOs will say it? They'll say, we're screwing up, we're not doing a good enough job. We really want to try hard this year, and look, we sponsor this conference or whatever. That tends to be the ... that's how you kick it down the road. And that's why I say it's a lie. Because if you were trying to delay the conversation and put it off, what would you do? You'd be like, look over there, we did the thing. We bought t-shirts for somebody, and so now we're good.
MC: Final question and then we'll kick it out to the audience for Q and A. Now let's go ahead and address the inevitable. And the audience, you are actually all a part of this. We're preaching to the choir here. I mean, I think everyone here kind of knows the ramifications and the instances of things that are happening on the internet, otherwise you would not be here tonight I would think. How do we take the messages from this event and spread it to those who not only need to hear it the most, but those who it affects the most?
MM: I would say when you walk out of here, you take the elevator and hit the street, just text someone from your phone. When you see a friend, just talk to them. I was at this thing yesterday and it's the one thing I remembered from it, or the one thing that impacted me. It's about those human relations. I think if AOC can have the power to get Amazon out of Long Island City then you have the power to, in your social network and your family and your friends, get them to care one degree more. And that's all we can do. I think that makes all the difference in the world.
AD: Yeah, I think any movement is one on one, door to door, person by person, face to face. It's a ground game. And it's the same as like any candidate you've ever cared about, if you've knocked on doors and been that person that nine people in a row slam the door, and you're like, this was the one. They're going to be excited and they're going to actually take the card when I hand it to them or whatever it is. It's the same thing. If you're saying I care about the impact that tech is going to have on people's lives, you gotta have those conversations. And you gotta have them over and over and over, and most of them are not going to be initially productive. And you gotta still stick with it. It's not glamorous. It is not like - There's no revelatory moment where somebody's like, I see the light. You know what I mean? It's just like this. It's the rest of our lives actually, in every day for the rest of our lives. So that's the fun, yay. It's a lot of work.
MC: Give them both a round of applause.