May 24, 2024


Listen on: Apple Podcasts | Spotify | Pandora

This is the full transcript for season 7, episode 6 — The algorithm: Letters of recommendation — of the Quartz Obsession podcast.

Gabriela: I want to get a little meta here, listeners. For one reason or another, you opened up some digital streaming tool and it led you to me. Or rather, my voice hosting this podcast. Maybe you’re on your computer and someone you know shared this link on one of your social feeds. Maybe you’re a fan who follows us and your phone dropped you a push alert with a new episode.

Maybe you were on a platform where you tend to listen to other things: audiobooks, albums, radio episodes. And our little show served itself up on your screen. Suggested listening. Shows you might like. Recommended for you. Somewhere inside the depths of your screen, someone or something has been quietly determining what you’ll want to listen to next.

All right, Bruce, you already have to listen to my voice all day in the office and on Zoom calls. How would you feel if the algorithm pulled you to this episode?

Bruce: I would feel like it got it right, that it knows me pretty well. I love talking about algorithms with Quartz editors.

Gabriela: I’m Gabriela Riccardi, the host of Quartz Obsession Season 7, where we’re taking a closer look at the technologies and the ideas that define our lives.

So, get yourself ready to learn about machine learning, because today I’m talking to Bruce Gil from Quartz about algorithms.

So when we’re talking about algorithms, at least here, we’re talking about recommendation algorithms: the sprawling, peculiar strings of numbers in charge of figuring out what you like, and dictating so much of what you see online. With that in mind, Bruce, explain to me, how did you first get into the algorithm?

Where do we find recommendation algorithms in our daily lives?

Bruce: My relationship with the algorithm has been, for the most part, pretty positive. I’m a pretty indecisive person by nature. I love the idea of having something, I guess, helping me decide what to eat. Like, Grubhub, please give me recommendations. Netflix, tell me what’s good to watch. Spotify, give me your curated playlist.

But I will say there was something a little bit different about TikTok. And I think a lot of people have a similar experience where it felt like it knew you too well. And at first I thought that was amazing. Like, again, it was a very constant stream of good, entertaining content.

Gabriela: What made it seem like it was too good? What were the kinds of things that it was serving up for you?

Bruce: It was just a lot of things that I’m not sure how it knew I would enjoy or like. Obviously there were some standard things, if you know about me: Drag Race clips. There were some ASMR videos. I love seeing, you know, people tapping on things or cutting soap.

I don’t know why.

Gabriela: It scratches the itch in your brain. Sometimes it’s inexplicable.

Bruce: Exactly. And then another weird thing that I noticed was that it kind of told me a little bit about my history. It would show me memes from, like, old times, stuff that I used to watch with my family when I was a kid.

And I’m like, how did this get in here? ’Cause that was not something I was searching for myself on TikTok. Eventually I just noticed that it was almost too efficient, where I was spending hours on it. It would mess with my sleep. I did end up getting rid of the app from my phone, just for myself, personal preference. And once I started learning more about these algorithms, it did feel a little deceiving, like kind of a little trickster that knows more about your interests than you probably know yourself.

Gabriela: We all hear this pretty commonly: we sort of dip our toe into a new social platform or a different digital ecosystem, and suddenly it is pulling out all these things that it knows about us. We make the jokes that our phones and our devices are listening to us, but sometimes it’s just so uncanny. It feels like a lot of people are arriving at this conclusion today: the algorithm has gotten so sophisticated that it’s telling you about you in ways you can’t even articulate yourself.

Okay, Bruce, let’s just break this down to the most basic of levels. What exactly is an algorithm?

What is an algorithm?

Bruce: So an algorithm, in its broadest sense, is just a process or specific set of instructions that, when you follow or perform them, results in a desired outcome. A lot of times people like to describe it as almost like a recipe to solve a problem.

They’re usually used to solve math problems, but now with computers, they’re used to solve almost everything.

Gabriela: So it goes back, you know, all of us learn in maybe elementary school or middle school, the idea of the formula. At least in my classrooms, I learned about PEMDAS, which is like the order of operations for how to like put numbers together so that they arrive at their logical conclusions.

Algorithms are just much, much more sophisticated, more evolved versions of that. But at their basic level, they’re just kind of formulas or instructions. So let’s take it back for a second. When did people start using algorithms for the first time? How did this idea of an algorithm originate?

What are the origins of algorithms?

Bruce: Algorithms are a pretty ancient idea. Some of the earliest recorded algorithms were written on clay tablets around 2000 BC in Babylonia.

Gabriela: Oh my gosh.

Bruce: Right, crazy. There were Greek mathematicians, too. One famous algorithm is the Euclidean algorithm. This one goes back to 300 BC, so it’s kind of more recent.

Gabriela: Oh, yes. So much more recent.

Bruce: It’s an algorithm to find the greatest common divisor of two non-zero numbers. And it’s basically a very simple algorithm where you do a series of division problems until you get the greatest common divisor, the biggest factor that the two numbers share.

Gabriela: Okay.

Bruce: I’m sure people are familiar with it from grade school. I think it’s one of like the first formulas or like factoring equations that we learned back in math class.
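[For the curious, the process Bruce describes can be written in just a few lines of Python; the sample numbers below are made up purely for illustration.]

```python
def gcd(a: int, b: int) -> int:
    """Euclidean algorithm: repeat division problems, keeping the
    remainder each time, until the remainder hits zero. The last
    non-zero remainder is the greatest common divisor."""
    while b != 0:
        a, b = b, a % b  # replace (a, b) with (b, a mod b)
    return a

# 48 = 2*18 + 12, then 18 = 1*12 + 6, then 12 = 2*6 + 0 -> the GCD is 6
print(gcd(48, 18))  # -> 6
```

Each division step shrinks the numbers, which is why the algorithm finishes quickly even for very large inputs.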

Gabriela: How do we go from literal ancient times, Euclid, old mathematics, things being written on tablets, to the 21st century today?

What are some of the foundational algorithms?

Bruce: Once computers came onto the scene, they made it easier for people to perform larger, more complex algorithms, and they’re used for all sorts of things, like sorting large sets of data.

There are these two algorithms that I like to think about as the foundational algorithms behind these newer TikTok-style recommendation algorithms: Google PageRank and the Netflix Prize. They’re really the ones that people first started interacting with. Let’s start with Google, ’cause Google is something we use every day, and its origins are kind of crazy too.

The original Google algorithm was written in 1998 by two graduate students at Stanford, Sergey Brin and Lawrence Page. They wrote it as part of an academic paper while they were in school. And the idea was to bring order to the web. Back at that time, search engines weren’t as helpful as they are now.

They were still figuring out how to like give users the most relevant and useful search results. And so these two college students thought of an idea of how to make sure when you Google something, you’re actually getting reliable information and not just somebody’s like random blog.

Gabriela: So you’re telling me Larry Page and Sergey Brin began Google as just an academic paper, and it birthed so many of the ways we navigate the internet today.

And also a literal verb that I think everybody has in their vocabulary: to Google something. I just heard you say it, you know; it’s shaped even our language. Yeah.

Bruce: It’s so interesting to think about how, like, now that we, like, kind of complain about the algorithm, they were so helpful and, like, game changing for the way we, like, all navigate the internet now.

Gabriela: So tell me how PageRank worked once it came off of, like, the academic paper and was an actual tool that existed for real.

Bruce: So now the Google algorithm is way more complex, but back then their idea was that you rank these pages based on the number of links pointing to them and the quality of those links. So each page gets a rank based on how many other pages are linking back to it, and that also takes into account the PageRank of those pages.

So if you have high-quality pages linking back to you, your rank goes up.

Gabriela: That’s how we get the New York Times and the Chicago Tribune and the Washington Post and the LA Times floating to the top of Google’s search results, as opposed to mynewsblog.com.us.co.uk.gov. I don’t know. Someone’s random, you know, opining on the internet.

That’s not going to be ranked as highly because it’s a less reputable source. It doesn’t exist within this, you know, ecosystem of recommending things to each other.

Bruce: Exactly.
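[The idea Bruce and Gabriela just walked through can be sketched in a few lines: each page’s score is split among the pages it links to, a small "damping" share goes to every page, and the process repeats until scores settle. The four-page link graph below is invented for illustration and is not from the original paper.]

```python
import numpy as np

# A tiny made-up link graph: links[i] lists the pages that page i links to.
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
n = 4
d = 0.85  # the damping factor used in the original PageRank formulation

rank = np.full(n, 1.0 / n)  # start with every page ranked equally
for _ in range(50):  # power iteration: repeat until the ranks settle
    new = np.full(n, (1 - d) / n)  # the small share every page gets for free
    for page, outlinks in links.items():
        for target in outlinks:
            # each page passes its rank, split evenly, to the pages it links to
            new[target] += d * rank[page] / len(outlinks)
    rank = new

# Page 2 collects links from three different pages, so it ends up ranked highest
print(rank)
```

Notice that page 0 also ranks fairly high despite having only one inbound link, because that link comes from the top-ranked page 2: quality of links matters, not just quantity.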

How did Netflix build recommendation algorithms?

Gabriela: So tell me about the second foundational algorithm.

Bruce: So this one I think was a huge game changer on like recommendation algorithms.

Back in 2006, when Netflix wasn’t even a streaming service, it was a DVD rental service, they launched a competition called the Netflix Prize, where they offered $1 million to anyone who could improve their recommendation software by 10 percent.

Gabriela: Ten percent does not sound like that much. It’s worth a million dollars?

Bruce: Yes, but it did take people like at least two years to reach that benchmark.

Gabriela: Wow. Okay. So clearly this is a higher hurdle than my instinct tells me. Okay. So tell me, how did they get there over the two years?

Bruce: What Netflix did was that they made public all this data on how their users are rating movies.

So there were a hundred million ratings of 17,770 movies from about 480,000 customers. The task was to use that data to create an algorithm that would predict whether someone would like a movie based on their previous ratings, and not just their own ratings, but the ratings of everyone in this data set. So, everyone that uses Netflix.

And so 30,000 people signed up to be a part of this. And then there were so many forums and discussion posts, with people sharing their ideas. People started to think about things like: are more recent ratings more accurate than ratings made months ago? Does the time of day when someone made a rating affect things? And they started adding these factors into their algorithms, and they all built upon each other’s work. There’s this one contestant, Simon Funk, who was very influential.

He was one of the first top contenders to just blog and make his code public so people could work off of it. And the big innovation he came up with was using this mathematical technique called singular value decomposition.

Gabriela: This sounds very over my head, but tell me more.

Bruce: So, basically, it’s a way to automate finding similarities between movies and users that like those movies.

So all these ratings are just numbers, and the algorithm can find that certain groups of people all seem to like certain kinds of movies. So it kind of creates buckets for you. And those could be genres: there’s a certain type of person that likes action movies, or there’s a certain type of person that likes Tom Hanks movies.

The algorithm can learn that if you like romantic comedies, other people who like romantic comedies aren’t really sci-fi fans. And so it knows not to recommend you any sci-fi movies.
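[A minimal sketch of the Funk-style factorization Bruce describes, using a made-up four-user, four-movie ratings matrix (0 means unrated): two small "taste" matrices are fit by gradient descent on the observed ratings only, and their product predicts the missing ones. The numbers and hyperparameters are illustrative, not from the actual competition entries.]

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ratings: rows are users, columns are movies, 0 means "not yet rated".
# Users 0-1 love movies 0-1; users 2-3 love movies 2-3.
R = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
    [0, 1, 4, 5],
], dtype=float)

k = 2          # number of latent "taste" dimensions (e.g. rom-com vs. sci-fi)
lr, reg = 0.01, 0.02  # learning rate and regularization strength
U = rng.normal(scale=0.1, size=(4, k))  # per-user taste vectors
M = rng.normal(scale=0.1, size=(4, k))  # per-movie taste vectors

# Gradient descent over the observed (non-zero) ratings only.
for _ in range(2000):
    for u, m in zip(*R.nonzero()):
        err = R[u, m] - U[u] @ M[m]          # how far off is the prediction?
        U[u] += lr * (err * M[m] - reg * U[u])  # nudge the user's tastes
        M[m] += lr * (err * U[u] - reg * M[m])  # nudge the movie's profile

# Predict the missing rating for user 0, movie 2. Since similar users rated
# that movie poorly, the predicted rating comes out low.
print(U[0] @ M[2])
```

This is the "buckets" idea in action: the two latent dimensions are never labeled by anyone, but the factorization discovers the rom-com-versus-sci-fi split on its own from the numbers.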

Gabriela: It’s just so interesting to hear this one. It sounds like it came from making public this treasure trove of user data. I mean, this was obviously such early days that user privacy was not as hot-button a topic as it is today. But it gave them this crowdsourcing effect, where a lot of people were testing various inputs. I think it’s fascinating that the time of day that you’re surfing Netflix might affect the choice that you make on what to stream.

Like, what do you want to put on in the morning? Maybe you’re having your breakfast versus what you’re putting on just before you go to sleep at night. Like, do you want something more energizing in the morning and soothing at night or maybe, you know, like vice versa. It’s fascinating to think about that.

Bruce: Netflix did anonymize the data for the Netflix Prize, but people were able to figure out who some of these users were by connecting the data with IMDb pages. And that kind of ended up shutting down the Netflix competition; they had planned a second one, but they ended up canceling those plans because of privacy concerns.

Gabriela: Oh, wow. What’s old is new in some ways. Data concerns then, data concerns now, privacy concerns then, privacy concerns now. So it sounds like, in the case of PageRank, this was designed to bring in orderliness and streamline things. Pages, websites, get linked to more, have a higher rank. The rank keeps moving up if the pages that you’re linked to have a high rank themselves.

It’s sort of an organizing system that’s very hierarchical. This Netflix prize example sounds like it’s really content based and personality based and profile based and so it’s sort of a sprawling complexity that establishes different preferential connections. Totally interesting.

So, I’m curious, what are some of the different ways that algorithms are being used right here, right now? How have they advanced since PageRank, or a crowdsourced contest to beat Netflix at its own algorithm?

How have algorithms changed in recent decades?

Bruce: There’s just so much more data now. There’s so many ways to track people. You know, Netflix, in that example, was relying on users rating films.

It was like a hundred million movie ratings. But now that Netflix is a streaming service, it can actually track what you’re watching. You don’t even have to rate something to get pulled into this algorithm.

Gabriela: What do you think about the impact that recommendation algorithms have on us culturally? Can you tell me a little bit, Bruce, about how they shape and influence us in ways that we might not recognize?

How do algorithms shape and influence us in ways that we might not recognize?

Bruce: Going back to our conversation earlier, TikTok is such a clear example of how culture is being shaped by these algorithms. It’s also one of the more powerful recommendation algorithms. If you look at its design, it’s kind of like the perfect machine learning platform, in that an algorithm learns by collecting more data and improving upon itself, and on this platform you have about a billion users watching all these two-minute clips.

Every swipe or every minute you keep watching a video is another data point for it to know what to recommend you next. And that’s what I was talking about earlier when I said I felt kind of tricked. There might be times where like, I don’t think I like cat videos. I’m not liking cat videos. But for some reason the algorithm can track that I watched them all the way through and will keep recommending these cat videos.

So in a way, the TikTok algorithm and the design of the app are so powerful at showing you stuff that’s addicting, stuff it knows you will watch, because you’re giving it all the info it needs to know that you’re going to keep watching.
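[The tracking Bruce describes, where watching a clip all the way through counts as a signal even if you never tap "like", might be sketched like this. The scoring scheme, weights, and topics here are entirely hypothetical, not TikTok’s actual system.]

```python
from collections import defaultdict

def update_interest(interests, topic, watched_fraction, liked):
    """Blend explicit (likes) and implicit (watch-through) signals
    into a running per-topic interest score."""
    signal = 1.0 if liked else watched_fraction  # finishing a clip counts too
    # Exponential moving average: recent behavior gradually shifts the score.
    interests[topic] = 0.9 * interests[topic] + 0.1 * signal

interests = defaultdict(float)

# The user never taps "like" on cat videos, but keeps watching them through...
for _ in range(20):
    update_interest(interests, "cats", watched_fraction=1.0, liked=False)
# ...while swiping away from a cooking clip almost immediately.
update_interest(interests, "cooking", watched_fraction=0.1, liked=False)

# Cat videos end up with by far the higher score, so they keep coming back.
print(max(interests, key=interests.get))  # -> cats
```

This is why "I’m not liking cat videos" doesn’t help: under implicit feedback, watch time speaks louder than the like button.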

Gabriela: Oh my gosh, I relate. I am not on TikTok myself because I’m already allocating too many of my off-time hours to my screens.

But the same thing happens to me on my Instagram discover feeds, my Pinterest explore pages. Oh my gosh. I am just getting constantly reinforced with the things I know I like, and then I’m also getting pushed things that I didn’t think I liked, but somehow can’t look away from. So talk to me about the darker side.

What are some of the threats or the safety concerns that recommendation algorithms pose? And especially as we continue to refine our algorithms and they get more sophisticated by the day.

What are some of the threats or the safety concerns that recommendation algorithms pose?

Bruce: Some data shows that they’re responsible for a lot of the content people watch on YouTube. Their recommendation algorithm drives about 70 percent of the videos that are watched on the app.

And so when people are relying on these algorithms for the stuff they watch, you have to be very careful on what it’s showing people.

Gabriela: What the algorithms inadvertently coach us to keep looking at.

Bruce: Right. The YouTube algorithm has improved, but in the past it’s been shown that it would funnel people into these rabbit holes of conspiracy videos.

Another example is Facebook. Their algorithm used to prioritize the angry emoji reaction. And as a result of overweighting those kinds of reactions, it helped spread misinformation on the platform.

Gabriela: And violent content too. Facebook is a fascinating case study in this. Some of this information only came to light to the public when a whistleblower released sort of a treasure trove of files that revealed that Facebook’s algorithm was recommending content that made people angry, content that reflected violence, content that didn’t necessarily contain truth.

I actually interviewed that whistleblower, Frances Haugen, last year, and when I asked her, what ultimately pushed you to take this large and risky step? To collect what you saw going wrong with an algorithm at your workplace, this powerful, powerful technology company, and release it to the public. And she said, “I needed to be able to sleep at night again.”

Once she saw the darkest stuff happening in the depths of the black box of algorithms, and how it was, you know, sort of spilling into real-life user interactions on the web, it really distilled how dangerous an algorithm gone wrong could be. There are very clear case studies in which, when unmitigated or handled irresponsibly or just left to teach itself things, algorithms can go dark places.

Bruce: Yeah, and I think it’s important to note right now that like even lawmakers are concerned. I know the [U.S.] House of Representatives passed a bill to potentially block or ban TikTok if it doesn’t sever its relationship with the Chinese company ByteDance over concerns about data but also just over concerns of how much control the company has on what viewers are getting recommended.

I know in the past there’s been reports that TikTok has suppressed posts by LGBT users, disabled users, or suppressed certain political topics on the platform.

Gabriela: Yeah, we’ve also seen that Meta, which is obviously the parent company of Facebook and Instagram and Threads, they now have a switch that you can toggle on and off to either be served political content or not be served political content because that’s just been such a trigger point for what algorithms might bring up, surface, and distribute when left sort of unchecked.

Bruce, we’ve obviously talked about some of the dangers that are lurking inside algorithms, but also some of the delights. If the algorithms are serving you cute kittens, or like very soothing ASMR, or maybe your next favorite spy thriller, what’s your take after looking into them so deeply? Do you think that we’re better off for having these sophisticated algorithms sort of powering the edges of our digital lives?

Do recommendation algorithms leave our digital lives better-off — or worse?

Bruce: So I guess my relationship status with algorithms is complicated at the moment. I think I’m conflicted. There’s this new book by New Yorker writer Kyle Chayka, [called] Filterworld: How Algorithms Flatten Culture. The basic premise is that because we’re discovering so many new things through these algorithms, content makers are feeling pressured to go viral and do well on these platforms, and that means targeting the widest audience possible.

And he argues this goes all the way up from Netflix and, you know, all these movie studios releasing very similar superhero movies, to musicians making songs designed to spark a TikTok dance challenge. And so I do think there’s something to be said about algorithms taking away a little bit of the magic of discovering things on your own.

And now that we’re all like going straight to the same sources, um, I think the internet is becoming a little less fun.

Gabriela: In some ways, the algorithm is taking away your ability for self discovery and just kind of handing you things.

Bruce, this has been such a fun conversation. Thanks for coming on the podcast. I’m never going to look at my explore and discover pages the same again.

Bruce Gil covers breaking news and health for Quartz. This episode was produced by Ready Freddy Media. Additional support from Angel Fajardo, Quartz Executive Editor Susan Housen, and Head of Video David Weinstein. Our theme music is by Taka Yasuzawa and Alex Tsukira. If you like what you heard, follow us on Apple Podcasts, Spotify, wherever you’re listening, and tell your friends about us.

Wanna beat the algorithm? Send this episode link to five friends who would love it, to show you know them better than a machine ever could. Then head to qz.com/obsession to sign up for Quartz’s weekly Obsession email and browse hundreds of interesting backstories. I’m Gabriela Riccardi. Thanks for listening.

All right, Bruce, are you ready? We are both going to open up our Instagram accounts and we are going to click, click, click, tap, tap, tap. All right. I’m looking, I’m looking. Oh, some of this stuff is stuff I don’t even want to speak into the microphone.

Bruce: So, basically, there’s a man, like, he has what looks like a glass Stanley cup, and he’s mixing a whole bunch of flavored syrups and things into a water.

Part of me feels like this has to be a parody of, like, WaterTok.

Gabriela: I just found this illustration of a shrimp smoking a cigarette. And I don’t even know where it came from! I do follow some art accounts and some illustrators. I hope that shrimp is having a nice evening. Somewhere outside preferably. No one, no one likes you smoking inside.


