My Teacher Is an Algorithm: Silicon Valley Billionaires Want to Replace Teachers with Technology
Facebook’s Mark Zuckerberg is in. So is Reed Hastings from Netflix. In fact, it’s hard to find a Silicon Valley billionaire who doesn’t want to disrupt public education by replacing teachers with algorithms.
In the latest episode of the Have You Heard podcast, co-hosts Jennifer Berkshire and Jack Schneider talk to Common Sense Media’s Bill Fitzgerald about how so-called “personalized learning” is actually a misnomer. Learning by algorithm, says Fitzgerald, isn’t particularly personal, or even human. And the closed learning systems that Zuckerberg et al. are so enthusiastic about give adults far too much opportunity to limit the content that kids are exposed to, one reason why fans of religious education, including Secretary of Education Betsy DeVos, are all over this trend. Don’t be fooled by Silicon Valley’s talk of equity and civil rights as part of its sales pitch for personalized learning, says Fitzgerald. Automating the learning experiences of the most vulnerable students will only exacerbate the country’s stark educational inequities. You can hear the entire episode here.
Have You Heard: We just played a clip of Mark Zuckerberg explaining why he’s betting big on personalized learning, but as you explain, a more accurate name for the product he’s pushing might be “algorithmically mediated learning.” Break it down for us.
Bill Fitzgerald: When we talk about algorithmically mediated learning, we're talking about a very specific type of interaction, one that, even though it's called personalized learning, actually cuts people out of the project, cuts people out of the equation. So we have the semantics of the term, which sound very human, but in some implementations, not all, but in some, what we call personalized learning is actually less personal and less human. There's a piece of software with a set of predetermined paths and outcomes. You can get there different ways, but the roads you're going to travel are already built. You just have a choice about whether you turn right or left, but not ultimately about where you go.
Have You Heard: There is something so sad about the images of students sitting in front of their terminals at Rocketship or at Carpe Diem, another “blended learning” prototype that is now scaling back because it turns out that students don’t actually like being educated this way. Where exactly do students fit into this vision of education?
Fitzgerald: One of the things I always look for when I'm assessing a system is what mechanisms are in place for student voice. That's something that's not present in many places. This is a problem that exists across the board in education, period: we are often hesitant to step aside and make room for students, who really are the experts in their own experience, to share that expertise with us. From a learning-system standpoint, that's a design flaw, but from a systems standpoint, I think the adults in the room need to step aside. We need to make room for voices that are not our own and for voices that might disagree with things we feel strongly about. You also see algorithms being used to limit the content that kids are given access to. I’m thinking, for example, about information about LGBTQ issues or female reproductive health. It gets down to controlling who can access what, and what resources might be put into allowing or controlling what people can access.
Have You Heard: There was a story in the New York Times this spring about how Google is taking over the classroom. You were quoted as saying that the Silicon Valley “disrupters” have appropriated the language of equity in a way that mostly benefits them. Explain what you mean by that.
Fitzgerald: We have these social and ethical and moral issues, and algorithms can effectively embed them and make them less visible. Because it's an automated process, we’re trained to think that it's more objective, when the reality is that its lack of objectivity just gets performed automatically every single time. So I think there's some large-scale misunderstanding of what algorithms actually do and how they can exacerbate biases. We have examples of this all over the place. Recently Uber rolled out pricing that shifts based on what an individual user is expected to pay, or would feel comfortable paying. That’s a great example of a personalized system that takes advantage of the person using the system and also cuts labor out, because the shift in prices doesn't accrue to the driver; it goes straight to corporate. So we see these examples of personalized systems that really create greater inequities. I think there's a tendency in some of the more marketing-focused conversations about personalized learning to downplay the role that humans have in creating these systems, in an attempt to make it seem like these things conquer some of these really hard and intractable social issues. But they're still a creation that we put out into the world, and we humans have a say. We downplay that at our peril.
Have You Heard: One of the most disturbing elements of the personalized learning debate is that there is so little debate. Boosters abound while skeptical voices like yours are few. What sorts of questions should be front and center right now?
Fitzgerald: We need to ask questions about what technologies are used in what locations, and why. We need to ask why one environment, one setting, one school in one district makes sense, whereas another one doesn't. What is the full range of reasons we are making the technological decisions that we are? Because these questions lead us into the uncomfortable conversations that we need to have if we are actually going to do this right. I think there are elements of personalized and even algorithmically mediated learning that can be helpful, but if we're going to roll these out, we need to be really clear about what we're leaving behind and why we're doing it.
(This is an edited transcript. You can hear the entire podcast here)