For TeachLab’s first Failure to Disrupt Book Club episode, Justin Reich is joined by friend and colleague Audrey Watters, a well-respected writer on educational technology, for her insight on Justin’s new book.
They reflect on and play clips from the first live webinar Book Club from September 21st, where they were joined by special guest Chris Gilliard, Professor of English at Macomb Community College, to get his unique privacy and surveillance perspective on the book’s introductory chapter and edtech in the age of remote learning.
“I mean, one for one, the pandemic highlighted that it's a place where kids go because their parents have to go to work. It's a place where kids get fed. It's all these things. As an educator, I don't want to overstate this. I believe strongly in education, but it is a place where people, in some ways, it holds people until they're adults. I'm trying to state that in the least offensive way possible. I mean “watch” in all the different ways you might think about it. Watch as in oversee. Watch as in take care of. Watch as in monitor.” -Chris Gilliard
Resources and Links
Watch the full Book Club webinar here!
Check out Justin Reich’s new book!
Produced by Aimee Corrigan and Garrett Beazley
Recorded and mixed by Garrett Beazley
Justin Reich: From the home studios of the Teaching Systems Lab at MIT, this is TeachLab, a podcast about the art and craft of teaching. I'm Justin Reich. This is our first book club highlights episode. This fall, I released a new book, Failure to Disrupt: Why Technology Alone Can't Transform Education that tries to look at the past of education technology to understand where we are now and where we might head in the future.
Justin Reich: Each week, we've hosted a book club with guests coming in to talk about each of the 10 chapters of the book. And we've had some really great conversations that we wanted to share with our TeachLab audience. To help me do that, I've invited my friend and colleague and inveterate education technology writer, Audrey Watters to join us. Audrey, thanks for being with us.
Audrey Watters: Thanks for having me.
Justin Reich: So in our first book club session, we were joined by Chris Gilliard, who's a professor of English at Macomb Community College, and it was actually Audrey who introduced me to Chris and Chris's work. Although, I realized once I met him that I was very familiar with him from Twitter, not because I followed him, but because his work was retweeted and found its way into my timeline so often. Audrey, what inspired you to invite Chris to the book club?
Audrey Watters: I think that Chris is one of the most important voices when it comes to thinking about privacy in education, education technology, and in technology in general. One of the things that's also important is that too often, I think when we have education conversations, we tend to not really recognize where most students are in school and in higher education: most students are at community colleges. We have a lot of discussions with folks from schools like MIT, but really, where most students are attending, it's more like a Macomb Community College. So I thought Chris was a great person to talk about perhaps a different kind of set of issues than often gets highlighted.
Justin Reich: Yeah. Different set of issues and a different set of perspectives that often get highlighted. But, you're exactly right that if you read the New York Times or the Washington Post, you will have a steady diet of information about Stanford and Yale and Harvard and MIT and Penn, and come to the belief that most college students are four year students enrolled in residential programs. When the vast majority of people involved in higher education in the U.S. are working class adults who are fitting things in around their jobs and other kinds of things.
Justin Reich: So the core argument of the introduction is that, if you want to understand the future of education and education technology, you should look to the past because education systems are conservative and they're not conservative because teachers and educators are lazy or can't think of new and interesting ideas. It's because school systems are immensely complex and they balance lots of competing goals and needs and incentives and stakeholders. And they're actually pretty well optimized as the machine learning folks would say for meeting that set of constraints. And when people come in claiming that they can dramatically change some part of the system, they often are underestimating how much they can mess things up in other parts of the delicate balances that are created by that system.
Justin Reich: So the book is broken up into two parts. In the first part of the book, we look at a series of technologies that are about learning at scale, learning environments with many, many learners and few experts to guide them. We look at instructor guided things like massive open online courses. We look at algorithm guided things like intelligent tutors. We look at peer guided network learning communities like Scratch or the original Connectivist massive open online courses.
Justin Reich: And we look at these things because new technologies are usually not actually all that new. They usually largely borrow from their older lineages. So if we can make some sense of what previous generations of technology have done, how they fit into existing systems, how efficacious they've been in terms of learning, then we can make some good guesses about what new entrants will look like.
Justin Reich: And then in the second half of the book, I argue that all of these technologies get hobbled in their usefulness by four kinds of dilemmas that show up over and over again, the curse of the familiar, the EdTech Matthew effect, the trap of routine assessment, and then the toxic power of data and experiment. Those are dilemmas that we'll talk about throughout the book club to try to see if these are areas where education technology runs into friction in trying to be useful and helpful and ethical, and see what we can do to try to address some of those frictions, both in technology design and in the way that we manage educational communities, schools, and colleges, and other things like that, to be able to use these new technologies. So, that's what the introduction to the book is trying to do.
Justin Reich: I think there were two parts of the conversation that we had that really stood out to me. Chris talked about school as a place where people watch and are watched. And I thought it was just a great description because I think one of the things that's happened during the pandemic is that I am certainly realizing more and more things that are just in the everyday water of school. The most taken for granted parts of what we do in schools, that when you translate them into online environments, they operate very, very differently. Watching someone while you're sitting in a room with them is a very different experience from watching them in a little box on a video conference call. And those differences, parts of them are subtle because parts of video conferencing work, you can see the other person, but you can't see them and you can't watch in the same way and you can't be watched in the same way. So that was an idea that sparked a lot of thinking to me about what online learning can and can't do during a pandemic and beyond.
Justin Reich: Now, let's play a clip from our conversation with Chris.
Chris Gilliard: As I mentioned, I'm interested in privacy and surveillance, and one of the things you mention in the book is how school as an institution serves multiple functions. One of those functions is to watch people. And I'm really interested in how that function, which is not always openly stated, but as like EdTech, more and more EdTech moves in from other industries, whether that's prison or platforms or whatever, that surveillance aspect gets magnified and more openly articulated.
Justin Reich: Chris, will you say more about, a sentence that you said before, which I found very compelling. Because I don't think I've ever thought about it this way. A purpose of school is to watch people. What do you mean by that? What are some examples of that?
Chris Gilliard: Yeah. Well, for one, the pandemic highlighted that. That it's a place where kids go because their parents have to go to work. It's a place where kids get fed. It's like all these things. And as an educator, I don't want to overstate this, I believe strongly in education, but it is a place where people, in some ways, it holds people until they are adults. I'm trying to state that in the least offensive way possible. But I mean, watch in all the different ways you might think about it. Watch as in oversee, watch as in take care of, watch as in monitor. And so that is often, it's a function that's not... I think, again, the pandemic has really highlighted the extent to which that is true. But I also think that most people understood that to some extent or another, so a lot of EdTech and it could be the LMS or it could be cameras in schools or whatever it is. That surveillance function has really blossomed, not the right word.
Justin Reich: Blossomed in the way that Kudzu blossoms.
Chris Gilliard: In the last 10 years or something.
Justin Reich: [crosstalk 00:09:22] poison mushrooms blossom too.
Chris Gilliard: Yeah.
Justin Reich: What do you think, Audrey?
Audrey Watters: I often say that one of the things that ed tech does is it confuses surveillance for care, and I think that we see this with the kinds of watching that most ed tech does. It really does, as Chris says, crank up a certain kind of monitoring and I think tries to convince people that it's in the service of caring for students. We need to make sure that they're safe. We need to make sure that they're not going to harm themselves or other students. We have to make sure that they're not cheating. But it's so different than the kinds of caring relationships that are built between people when you have that. When you have that ed tech mediation in between, it really is mostly surveillance.
Justin Reich: I was talking with some educators about the experience of parents who don't speak English in schools, and one of their concerns in serving these families was that an enormous number of the difficulties that these families run into can be mediated by administrative assistants in schools. Most school buildings have a receptionist with a human being at the front of it and you can go to that person with a wide variety of concerns and issues, and they can say, "Oh, I know the person who speaks that language. I've got a person who can connect you with that family there. We can use a few broken translator words to get you the information that you need to fix some particular problem." And it's exactly the kind of interaction, which if you dramatically over simplify it, you think that you could replace with technology. You could say, "Oh, I bet administrative assistants, 85% of their queries are about people who are tardy to school. Let's replace that with a text bot or something like that."
Justin Reich: And then, "Well, why don't we just translate X percent of the documents into these languages, and then that'll cover a bunch of the rest of the queries," not realizing how much subtlety and nuance, or even how important it is that when you come in with a question that you think is going to be perceived as a silly question, it's not just getting the answer to it, it's having someone with a smile on the other end saying, "No, no, no, no, no. We're going to make sure that your kids are okay," all of those other parts of the communication, which are every bit as important as what's being communicated, and it's all that nuance and subtlety and humanity that gets lost. But I think an incredibly powerful way to summarize it is replacing care with surveillance or replacing care with communication. That's very helpful.
Justin Reich: And then the second conversation that I thought was a really useful piece of criticism was Audrey pushing back on describing technologies as learning at scale, along the lines of scale is a value that entrepreneurs have, that venture capitalists have, and they often think about scale differently and I think in less productive ways than educators think about scale. And certainly one of the big themes of the book is that you can't achieve meaningful scale in education technology through distribution. If you scale anything up in a way that really makes a difference, it's because you've scaled communities. It's because you've scaled networks of teachers and educators who come up with better ideas about how to do teaching and learning with technology.
Justin Reich: But Audrey I thought made a very sensible point that scale is a word that really has some salience and some resonance in people who think about scale of the first way, and the very adoption of the term can put the conversation in that framing on those terms. And it might be worthwhile doing some more thinking about if what I'm really interested in is scale through community, maybe scale isn't even the right word to be using from the beginning. Let's listen to what you had to say.
Audrey Watters: We're maybe jumping ahead to some of the other parts that we were thinking of doing during the book club, the stump the chump thing. But this is one of the things I would like to push back at you, Justin, is this idea of scale. And I know learning at scale is your jam, but for me, that's the problem with this word scale. Does scale mean something different than public education? Does scale mean something different than adequately coming up with the funding, public funding, taxpayer supported funding, that supports access for everybody to have educational opportunities? Does scale mean something different than open, for example? And if not, why not? And if so, what does it mean to talk about learning at scale versus, for example, public education?
Justin Reich: That was great. I guess learning at scale for me, there are lots of learning environments with many, many learners and few experts to guide them. And some of the ones historically have been printed books and printed textbooks, some of which have been integrated in a variety of ways into public education systems. And some of them have been deliberately ways of creating new pathways into education, like the Harvard Classics, this library of books that one of the Harvard presidents published in the early 20th century, which says, "Read these 50 books and this is basically as good as a Harvard education is, and it will be free and accessible," and so forth. Children's television is another mechanism, which is about serving many, many learners with few experts to guide them, and the availability of the internet just creates lots of new pathways for these kinds of large scale learning environments to exist, which build on existing efforts, but are not exactly the same as existing technologies.
Justin Reich: The proliferation of adaptive tutors, of massive open online courses of peer learning communities, they seem to be things that are not quite like books and television, that they have a different set of affordances. I suppose. I think it comes back to my somewhat pragmatic optimism that we could build these things and we can build terrible things with them, or we can build great things with them. And I think a point where we agree, it's going to matter a lot, what is the political economy in which we generate these things, a political economy in which we have very robust support for public education, for public higher education, is one in which we're going to build technologies and people are going to be like, "Cool, this can slot in here. This is how we can prepare people extra for these things," or stuff like that.
Justin Reich: And then I think there are other political economies, including the one that we're in particularly in higher education, with austerity and adjunctification where, as we shrink higher education, we shrink the value of what we can generate. I mean, I will also say that some of the artifact of being interested in learning at scale too, one of the things that I was interested in doing with the book, which I think the vast majority of public is not particularly interested in, it's weird that it's still in there, but I just observed that there are different communities of people that study things that try to operate at scale. So I proposed these three genres of learning at scale that we're going to read about in the next few weeks, instructor guided things, algorithm guided things, and peer guided things. And I observed that it tends to be different communities of people who build and study these things. But I actually think they have a bunch of similar kinds of challenges and problems, and so part of what learning at scale is meant to do is to be like, "Oh, well, let's get people to come together and say, 'Oh, well maybe there's some things about making more equitable technologies that folks in Scratch have figured out that might be useful for the people who are working at edX or Khan Academy or other kinds of things like that.'" And, duh.
Justin Reich: And then sometimes, I think, "That is a weird piece of scholarly politics to try to weave into your book, Justin. Most people are not going to find that helpful or interesting." But you can find bits and pieces in there. I don't know, Audrey. Does that help at all? Or Chris, do you have reactions to that, to what learning at scale is?
Chris Gilliard: Go ahead, Audrey. I'm still trying to process it. So go ahead, Audrey, if you're ...
Audrey Watters: I think you're right. I'm pushing at you purposefully, but I do think that it matters in some ways, though, how much we let these narratives ... again, we're circling back on things again, but how much powerful narratives seem to seize, particularly, imaginations of politicians and administrators, right? That there's something about these techno fantasies that really resonate.
Audrey Watters: I just remember during the year of the MOOC, the ways in which people lost their minds, administrators lost their minds. I remember when they ... UVA fired its [crosstalk 00:01:29].
Justin Reich: My alma mater.
Audrey Watters: Yeah. The board fired the president because they thought that she wasn't moving quickly enough. And all of the David Brooks op-eds and saying like, "This is it. This is the end. Everyone get on board." "Higher ed will never be the same. It's the end of college as we know it," I think TechCrunch pronounced.
Audrey Watters: And it was very much part of this narrative that you could see it be really crafted and repeated by people who might've had a background in teaching machines to think, but didn't really necessarily have a background in teaching humans to learn. And so it was ... This is such a ... It's such a powerful ... politically so powerful.
Justin Reich: So one way I might reinterpret your critique is something like, "Justin, it was the charismatics who invented this at scale phrase. Why are you using it?"
Audrey Watters: Why are you using it?
Justin Reich: Because the frame gives them a privileged higher ground. That's a great critique and one I hadn't thought of, and I hope that people will keep thinking about that, too.
Justin Reich: Audrey, do you have another term with the benefit of hindsight and reflection that you think would be a better one?
Audrey Watters: Public education. I'm being slightly facetious, but not really actually. I think it is a commitment. It is a commitment. And perhaps it is the same commitment to make sure that everybody has access to education, to high quality education.
Audrey Watters: But to me, it's like, "How do we talk about the funding mechanism? How do we talk about building capacity politically, socially, not just how do we build capacity, technologically?," which, to me, that scale piece really seems to put its thumb right there on the scale, if you will.
Justin Reich: Yeah, yeah.
Audrey Watters: I think that technology, for ed tech in particular, for a very long time has been promised and seen as if not a silver bullet, then certainly some necessity that we simply must have in order to move our classrooms forward into the 21st century.
Audrey Watters: I think it's worth pushing back on that. I think it's worth pushing back on the uncritical cheerleading that too often has happened. If we buy iPads, things will be better. If we buy computers, things will be better. As long as students are doing digital work, it's better than the analog work.
Audrey Watters: And I think it's always worth asking questions about the vendors that come to school to sell their products and the vision that they have for the future as well. We can think about the vision of the future that Facebook has. Facebook says it's about building community. But we can look at the reality and see that actually Facebook's been quite detrimental to community. Facebook's been quite detrimental to democracy.
Audrey Watters: And I think we have to ask those same kinds of questions for ed tech, that promises to make things better, faster, cheaper. What if we're actually adopting things that are detrimental to democracy?
Justin Reich: Yeah.
Audrey Watters: And during the year of the MOOC, one of the more popular, perhaps better venture funded startups was Udacity, founded by Sebastian Thrun, an artificial intelligence researcher at Stanford/Google. And he set out to prove, by partnering with San Jose State University, that his online courses would be better, cheaper, faster than the kinds of courses that San Jose State offered.
Audrey Watters: And the experiment failed, and it failed rather dramatically. The students who took the Udacity courses at San Jose State did much more poorly than the students who took the regular classes at the school.
Audrey Watters: And that's important for a number of reasons. Obviously, nobody likes to come up with an ed tech intervention in which people do worse. But it's also important that San Jose State, a public university in California, is actually one of the most ethnically diverse universities in the country. And so it wasn't just that students broadly speaking did worse. These were students of color who did worse. This was actually an intervention that TechCrunch, for example, promised would end higher education as we know it, and it actually did real damage to students in the classes.
Justin Reich: Yes, I think that's exactly right. Let's play the clip.
Chris Gilliard: Yeah. I think part of it is that we're stuck with technology invented by people who actually didn't think about those questions. So the big example I'm working with ... I'm using now is like whole thread that went around Twitter and made it into a bunch of different magazines about Zoom backgrounds and how often people with dark skin, their face is not picked up in it when you use a virtual background.
Chris Gilliard: And there's a, I think he's an educational technologist who posted a thread on this. And he literally posted his head and he's a bald, appears to be white guy. And he has a virtual background that works fine.
Chris Gilliard: And then a faculty member who was seeking his assistance is what appears to be a dark skin, black male. And so he looks like the headless horseman. So it's just like a body with no head.
Chris Gilliard: And Zoom, it's been in ... So the ... Well, I'll shorten this. The people who made Zoom didn't think about these things. They didn't think about harassment. They didn't think about Zoom bombing. They didn't ... There's all these things they didn't.
Chris Gilliard: And so it's a difficult question to do risk reward because it forces that question onto the user, when those questions should have been asked and answered or addressed, or at least gamed out to some extent way before that. And now, we're just stuck using technology that wasn't invented for us or for the purpose in which people are using it.
Audrey Watters: It comes back actually to some of the things I'm interested in in this introduction, Justin, is what is it ... Is there something about ... Is it the culture? Is it about the disciplinary training that technologists have? Is it something about this idea of wanting to engineer society or engineer school that, I think, leads us to end up with these technologies being built by people who haven't thought about these things? And how do we get here with the folks, with the engineering crowd missing the boat so dramatically on these questions that…
Justin Reich: Well, I think you're asking great questions. Chris introduced us to this idea that the technologies that we use in education are often not designed by educators and therefore they don't even have a hope of having these considerations because Zoom was designed for people who are thinking about like board meetings and corporate meetings and things like that.
Audrey Watters: There wouldn't be any black people on the board.
Audrey Watters: We know from research that disciplinary practices at school, for example, tend to play out in ways that are biased against students of color. Black girls in particular end up being expelled or suspended at far, far higher rates than white girls. And so if we think about what are the practices that already happen in schools, what are the ways in which racial bias already happens in schools, we have to ask what happens when we add technology to the mix? What happens when we actually have very little insight into the decision making practices that go into these algorithms? And so if we know, for example, that schools make decisions that are biased, and then we know that schools are adopting technology on top of this, that's often built with data that would reflect that bias, are we going to see ed tech sort of reinscribe, and actually maybe even obscure, some of that bias, just because we can't actually look at the ways in which these algorithms were crafted?
Audrey Watters: We don't know why Twitter makes the decisions that it does because we have no insight into the Twitter algorithm. We have no insight into the Facebook algorithm. We have no insight into the algorithms that schools use in their ed tech as well.
Justin Reich: I think when people look back on 2020, they're going to recognize it as an extraordinary demonstration of the durability and conservatism of education systems. For the most part, in colleges and secondary schools, faculty members walked away from their lecterns and they sat down in front of their webcams and they kept teaching roughly the way they were doing before, and to the extent that they had to make adaptations, they were mostly focused on making those adaptations in ways that aligned and cohered with what they were doing before. There were however many dozens of panels about reinventing education, and re-imagining education and those kinds of things, but it's not overwhelmingly what happened on the ground. And people can critique that.
Justin Reich: You can say that trying to do school as we were doing it during a pandemic makes no sense. Or people can celebrate that maybe we maintained this continuity because we actually, in schools, do the best job we can. That there's room for improvement, but there's not sort of dramatic new forms out there that are going to be tremendously better. But for me, as someone who tries to help schools get better by starting from, "All right, what does the evidence say about the reality of schools?" I think it really does put the onus on people who are imagining huge transformations in education systems to be able to answer the question, "All right. But if this is what happened during a global pandemic, why do you think that there's some other time in which we would dramatically reorganize the education of our society around new technologies to make that work?"
Justin Reich: I think it's good evidence that instead of trying to massively reshape our systems, we're better off realizing that really good education is an education that gets a thousand little things right, and we should keep trying to get those pieces more and more right over time.
Justin Reich: Audrey, thanks so much for joining us.
Audrey Watters: Thank you, Justin.
Justin Reich: Looking forward to more conversations in the weeks ahead.
Justin Reich: That was Audrey Watters reflecting on our book club conversation with Chris Gilliard. Special thanks to Chris Gilliard and Audrey Watters for being part of our first live book club session for Failure to Disrupt. If you'd like to see the full conversation, you can find the full webinar on our Teaching Systems Lab YouTube page.
Justin Reich: I'm Justin Reich. Thanks for listening to TeachLab. Please subscribe to TeachLab to get future episodes on how educators from all walks of life are tackling distance learning during COVID-19. As you probably know by this point, I've released a new book, Failure to Disrupt: Why Technology Alone Can't Transform Education, available from booksellers everywhere. You can read reviews, related media, and sign up for online events at failuretodisrupt.com. That's failuretodisrupt.com.
Justin Reich: This episode of TeachLab was produced by Aimee Corrigan and Garrett Beazley, recorded and sound mixed by Garrett Beazley. Stay safe until next time.