Justin Reich is joined by Mike Caulfield, a digital information literacy expert at Washington State University who has worked with a wide variety of organizations on digital literacy initiatives to combat misinformation. Together they discuss critical thinking, problems with traditional approaches to evaluating sources, and the SIFT method.
“SIFT. S-I-F-T. First is just “Stop”. If you find yourself emotional, if you find something that you've just got to share... Whatever is the trigger, the emotion, your excitement about sharing it, your rage, seeing something that just strikes you as a little bit odd... Whatever is the trigger, stop and ask yourself, do I really know what I'm looking at here? And you might… You might look at the source, and you might be like, oh yeah. I know this person. Most of the time, a lot of the time, you don't. A lot of the time, it just landed on your doorstep.”
- Mike Caulfield
Resources and Links
Check out Mike Caulfield’s Sifting Through the Coronavirus Pandemic!
Check out Mike’s book Web Literacy for Student Fact-Checkers!
Check out Mike Caulfield's blog: Hapgood!
Check out Justin Reich’s book, Failure To Disrupt!
Join our self-paced online edX course: Sorting Truth from Fiction: Civic Online Reasoning
Join our self-paced online edX course: Becoming a More Equitable Educator: Mindsets and Practices
Transcript
https://teachlabpodcast.simplecast.com/episodes/mike-caulfield/transcript
Produced by Aimee Corrigan and Garrett Beazley. Recorded and mixed by Garrett Beazley
Justin Reich: From the home studios of the Teaching Systems Lab at MIT, this is TeachLab, a podcast about the art and craft of teaching. I am Justin Reich. Today we're happy to welcome Mike Caulfield. Mike is a digital information literacy expert working at Washington State University. He's worked with a wide variety of organizations on digital literacy initiatives to combat misinformation, including AASCU's American Democracy Project, the National Writing Project, and CIVIX Canada. He's a winner of the Rita Allen Misinformation Solutions Prize, and the author of the award-winning textbook, Web Literacy for Student Fact-Checkers.
He's an early believer in the idea of civic digital literacies, and Mike's award-winning work has been covered by the New York Times, the Chronicle of Higher Education, NPR, and the MIT Technology Review. Mike, thanks so much for joining us today.
Mike Caulfield: My pleasure, Justin.
Justin Reich: Mike, we've talked over the years about OER, open educational resources, about digital information literacy, but I actually don't know the story of how you got interested in this particular topic. Where does it begin for you?
Mike Caulfield: Yeah, it actually begins, I don't know, it might begin before we met. When I used to work at Keene State College in New Hampshire, a public liberal arts college, we were looking at digital literacy outcomes. We were rolling out a new, testable outcome to be integrated into classes. I led a committee on that, and we came up with something that, if people are familiar with Howard Rheingold's work, mirrored some of Howard's work. We had participatory technologies, collaborative technologies, you know that I've been big into collaborative technologies.
Justin Reich: Like wikis and those kinds of things.
Mike Caulfield: Like wikis and so forth. And then we had this one thing called critical consumption. It turned out to be the one that a lot of instructors picked to be part of their courses, was this critical consumption thing. The thing about university-level outcomes is you assess them. So, at the end of the semester, people were assessing them, and I got this call from the library that said, hey, could you come over here? I want to show you something. I get called in, and they say, we did the assessment. We taught these critical consumption things using this thing called CRAAP, C-R-A-A-P, and we did the assessment, and here are some of the things that we found when we actually assessed at the end of the semester, the sources that students were using.
One of the sources was a website that was Government Slaves, and the student was, I think, citing them on water policy or something. It had the whole colloidal silver advertisements on the side. And so it was really clear that something had gone...
Justin Reich: This is a website which is just filled with conspiracy theory and strangeness.
Mike Caulfield: Yeah, exactly.
Justin Reich: You've had a series of students who have been nominally trained to be able to evaluate these sources, and you're now doing a more formal university-wide evaluation...
Mike Caulfield: Yeah.
Justin Reich: And as the tests are coming back in from these various classrooms, people are tapping you on the shoulder and saying, Mike, I think something is really, really wrong here.
Mike Caulfield: Right. I agreed. We started to look into this, and what we found at that point was that what a lot of students were actually missing was not critical thinking skills, as we think of critical thinking skills. They were missing some real basics, like what is this site I'm looking at? Can I find out? If I do a web search, is this claim associated with conspiracy theory, or is this a consensus reality claim? We tried to advance that. It was part of something in AASCU, the American Association of State Colleges and Universities, for a while called the E-Citizenship Project, and we tried to advance this from 2010 to 2013. But at that time, there just wasn't a whole lot of interest in this particular issue, and so... I put that aside, and I think I met you around when I put that asi... Again, not out of a lack of interest on my part, but out of a lack of interest from anyone else in this particularly weird issue. I went and did a bunch of stuff on collaborative tech.
That's where you get into the wiki, that's where you get into Federated Wiki, the idea of, if you've seen my stuff about how we live in the stream, but we want to get back to the web as garden, all this stuff that... I don't know. Kind of theoretical. Then, of course, 2016 happens, and a bunch of people that had worked with me before on these issues started saying, hey, maybe we should have another go at this. I agreed, but I think something really lucky happened.
As I was starting to find the old stuff and pull it together, I had this grab bag of tricks that students could do. It's not even a tool belt or box. It's not even that organized. It's just like a bag of hammers and wrenches. Here, go and do a search on whether you can see this news story other places. And if you don't see that news story other places, maybe that's not a real news story. Look up some stuff that was, I think, too advanced. Look up who owns a domain, which was way too advanced, now that we look back at it. All these 14 or 15 things you could do to test the veracity or credibility of a source or a claim.
Justin Reich: These are the kinds of things that conceivably would have helped the young person who found themselves on governmentslaves.org.
Mike Caulfield: Yeah, exactly.
Justin Reich: Citing resources on water policy.
Mike Caulfield: Yeah, exactly. You could look up and you could say, hey, is Government Slaves a well-known publication in Wikipedia? Is this a well-known publication? You could look at some of the stories that they had on that and throw them into a web search and see if they're associated with Snopes fact checks. Things like that. Yeah, this big bag of, I think, useful stuff, but just all over the place. And then I encountered the work of Sam Wineburg and Sarah McGrew, who had written a piece where they looked at students and their ability to make these claim and source credibility evaluations. It was like going back in time to exactly that place. I could put myself back in that library. I remember really specifically the library room I was in as they were showing me these papers. And then I was looking at their work, and I was like, this is exactly the same thing. And so I wrote this long...
Justin Reich: And what I would describe to listeners who aren't familiar, Mike, what Sam and Sarah's research shows, is that if you ask Stanford freshmen and tenured history professors to do pretty reasonable information literacy tasks, they are shockingly bad at those tasks. And if you ask professional fact checkers at news magazines, they are 100% accurate with much less time. Basically, we can find a group of people for whom we can give all these web literacy challenges, and they always get it right, and they always do it relatively quickly. And they simply use different strategies than the Stanford freshmen and the tenured historians use. Would you summarize the research that way?
Mike Caulfield: Yeah, yeah. Although, it's even weirder than that, Justin. I'll tell you, it's even weirder than that, because actually the one that I was seeing was the one in November 2016, which was the first one, which just showed students, showed nothing but students. And of course, like you said, they're shockingly bad at this stuff. They look at a picture of mutated daisies, and it says, hey, this was found near Fukushima. Is this good evidence that conditions are unsafe around this nuclear power plant that's had this issue? And students, they just react to it and they say, yeah it is, because look at those daisies. You can't trust nuclear power. And other students say, no, this isn't good. Anybody can put anything up on the internet, so it's not good evidence. And nobody actually just does the simple thing, which is just look up, hey, has anybody talked about this daisy on the web? Could we just go and see if other people who actually know more than me have looked at this daisy and talked about it?
No one does that. No one does that. And what I saw at the time, which was fascinating, was that everybody was looking at Sam and Sarah's article, and they were all going, ain't it awful? Ain't it awful? What we really need to have is more critical thinking. And so my large rant in November was, I just read this study from these two people, I'm looking at the news coverage of it, and nobody is actually getting what's going on here. It's not about critical thinking. It's that the students don't actually go and do things. It's about critical doing. It's about getting off the page and just doing these really quick things. And so I went into my whole history with Government Slaves, and it was a long, long rant, but basically said, look, everybody is looking at this study wrong.
The story as it's developed, I think you are probably the one connection that I actually had to Sam. It may have come through some other way, but Sam Wineburg ended up reading it, and he said, you know what, we got to talk. And also, there's another study that we're doing, and I want to tell you about this study. And that's that study that you're talking about.
Justin Reich: Got it.
Mike Caulfield: With the fact checkers. It ended up being this lucky piece that they were doing this research. And they had a theoretical frame that took my bag of hammers and wrenches and screwdrivers, and started to slot it into something that was a little more streamlined and direct.
Justin Reich: A schema.
Mike Caulfield: A schema. Yeah, a schema. We could talk a little bit about what that schema looks like, but that's how it started. That's how it started. Basically, old work that was shelved, and then when I was seeing this new work, I was like, this is Groundhog Day.
Justin Reich: I think a lot of people who encounter the research that you've done, the research that Sam has done with his team have a similar experience of going, wow, this is really bad. This is really a problem. There's two parts of the problem. One is that we ask all kinds of people to do basic evaluation of information on the web and they don't perform well in relatively straightforward tasks. And then two, when you dig a little bit deeper, the approach they seem to be taking is wrong, but it's exactly what they're being taught to do.
Mike Caulfield: Yeah.
Justin Reich: The strategies, you talk about the CRAAP checklist, which I don't think we enumerated what that was, but this is a pretty standard source evaluation. The CRAAP stands for currency, relevance, authority, accuracy, and purpose. Why don't the standard approaches that we're using work, and why do we keep teaching them?
Mike Caulfield: CRAAP, and this was something we actually learned back in that library room. What the librarians were using was this CRAAP model, and what we learned as we looked at it was that what the students were doing was actually predicted by what they were taught. You mentioned what CRAAP is about. Currency, relevance, authority, accuracy, purpose. It's really a set of questions. I think if you actually map out what people usually do with CRAAP, it's about 26 questions, because each one of those things has five questions under it. But it was never developed, actually, for people to evaluate web resources. At some point it was transferred to that. At some point it became repurposed, but the sorts of questions that are asked there are actually questions that were developed initially back in the 1970s and 1980s as collection selection criteria.
If you imagine yourself as a librarian and you've got to decide, hey, do I spend money on this book? Or do I spend money on this book? You've got to have some sort of selection criteria. It's got to be transparent so that you can explain to this person, hey, this is why I got Carl Sagan's book, and this is why I did not get your healing with crystals book. Because it's public money. Actually, some of the earliest standards of this came out of medical libraries that were fighting this battle as to where the money went. It is a good model for that, because what you look at is you look at something like currency.
Currency just says, hey, is this the most current information available? If you're a library, you want to make sure you have current books. You want to make sure you have the state-of-the-art. Does the purpose suit our clientele? Is the accuracy of the book high? Does it seem to be high? It's the sort of thing, if you're going to be a librarian, spending a few hours going through and trying to decide what set of books to get, it's not a bad set of decision-making criteria. But it really fails on the web, because when you look at the way people use it on a webpage, what they tend to do is they tend to come to the page, and they tend to see this as a set of items on a checklist, and the more that you can check off, the better the source must be.
So, if I go and I see a piece of misinformation, but it's absolutely current, it just came out, somehow this is still a check for it. It's like, okay, well, it's up to date. That's good. That's one point for. And then I look at it, and well, there's no spelling errors here. That's a high level of accuracy. That's two points for. And then you have a whole bunch of stuff under this that are just bizarre internet folklorish things, oh, well, if it's a dot-org, it's more likely to be better than if it's a dot-com. There's never been any truth to that whatsoever, but that's been adopted. And so you're like, well, this is a dot-com, so that's a strike against.
And two things are going to happen. One is, none of this stuff really matters, right? None of this stuff really matters. What matters is, what do the people in the know, people you can trust, really think about this? And then the second thing that happens is, even if it did matter, if you're overwhelmed by complexity and then you go through 26 questions, and a third of them end up being looks not so good, and two-thirds of them end up being looks good, you haven't solved your complexity problem. You've taken something that's a relatively complex, overwhelming task, and you've just expanded the complexity of it. And now you're going to have to weight all these things. You're going to be like, well, the third that is bad, is that particularly bad? It doesn't actually solve any problem you have.
Justin Reich: [crosstalk] No, it's great. Some of that background I didn't know. I didn't know that the CRAAP test emerged from evaluating books to be included in libraries. And it makes sense that a set of criteria to evaluate books to be included in libraries is not the right criteria to use to sort truth from fiction online.
So those are the things that are happening. Presumably, we keep teaching these things over and over again because there's a conservatism and inertia that's inherent in systems. The internet emerges, the CRAAP test gets spread around. I actually have this hypothesis that one of the reasons why these checklist approaches emerged and spread in the late 1990s and the early 2000s was that they actually did kind of work okay, that misinformation sites had more of these flaggable kinds of errors than more credible sites did. And so, there was maybe a period in which using a CRAAP test might have worked, although that might simply be me being too charitable to the checklist.
Mike Caulfield: I found a study in 1998 or '99 that looked at this. And this was even before it was called CRAAP. There was a previous one that was, I forget, COCOA. It's the same things, but it was mapped onto different terms. The term CRAAP comes out of 2004, when they come up with that particular acronym. It was showing, even in the late 90s, that the librarians applying this approach, they were simply not finding any correlation with student success in these things. But I will say though, I will say that the social problem around that has become more intense. It certainly is the case that the amount of time that we have to evaluate individual things has shrunk, that we are used to seeing a lot more things fly by us in a day than we used to. And so, the necessity of getting something that is more manageable and more quick has certainly gone up.
And then yeah, I would say that some of the things like accuracy, where people look at things like layout and things like that, some of those problems are more intense now. In the 1990s, you didn't have WordPress. You didn't have blogging software that was just really simple to make a decent-looking site. If you found something that was a decent-looking site, you could at least know there was some money behind it. It wasn't some crank in the basement. I guess what I'm saying, it's a little bit of a mixed bag. There were warning signs even at the beginning, but it certainly has been the case that something which may have sometimes had a half-decent effect has now really become useless.
Justin Reich: It's also the case that in educational systems, periodically we adopt programs that have a lot of face validity and don't work, and use them for long periods of time before realizing they don't work, and use them for long periods of time after there's good science suggesting that... The D.A.R.E. program to prevent drug abuse in school is a good example. There's lots of other good examples.
Mike Caulfield: John Warner said something interesting about CRAAP that I learned from. He said that the thing about CRAAP is that it's a really bad tool, but it is a half-decent statement of values. Right? If you think about what you value in a work, and this makes sense, if you think about it as initially developed, in coming out of library collection criteria, it's a half-decent statement about what you value in works that you select, right?
Justin Reich: Mm-hmm (affirmative).
Mike Caulfield: But it's not a good way to go about determining it. I think the face validity of it partially comes from the fact that people can look at CRAAP and say, well yeah, these are things that I would value in a work, but are they the questions that students should be asking? Sometimes those are two separate processes, and I think that's where the difficulty comes from.
Justin Reich: Let's move on, then, to what those questions are that students should be asking. You have these 14 or 15 tools, you reorganize them into a schema, you organize them around this idea called SIFT. Walk us through your introduction to how students should be evaluating sources online.
Mike Caulfield: The idea of SIFT, the broad idea is this idea, lateral reading, that again, is theorized by McGrew and Wineburg and Breakstone and others. And that's the idea that to really learn about the credibility of a claim or a source, you probably want to go and see what the rest of the knowledge network says about it. If you really want to learn if a news story is true that comes to you from some rando, go and see what other people are saying about the news story. Don't delve deeply into that person's work on it, just go and see, is this a generally reported story? The general principle around a source is, the about page of a website is a good starting point, but that's what the site itself is saying about itself, and you see the problem there. If it's not a trustworthy site, then the about page is not necessarily trustworthy either. So go and see. Don't look at what Robert F. Kennedy, Jr. says about himself. Look at what other people say about Robert F. Kennedy, Jr. and his work in vaccines. Get off the site and see what the larger community says.
SIFT is a middle layer to that that reminds us of four things. The first is just stop. It's an acronym, SIFT. S-I-F-T. First is just stop. If you find yourself emotional, if you find something that you've just got to share... Maybe you do see a lot of spelling errors or something. Something that makes you go, hmm, okay? Whatever is the trigger, the emotion, your excitement about sharing it, your rage, seeing something that just strikes you as a little bit odd... Whatever is the trigger, stop and ask yourself, do I really know what I'm looking at here? And you might. You might. If you have some expertise in something, you might be looking at something, you're like, yeah. I know enough about this subject to know that this is probably true. And you might be looking at it, if you stop and you look, you might look at the source, and you might be like, oh yeah. I know this person.
Most of the time, a lot of the time, you don't. A lot of the time, it just landed on your doorstep. A subject you've heard nothing...
Justin Reich: Tumbled down your feed.
Mike Caulfield: Tumbled down your feed. I have this whole little parable I tell about this bottle in a lake, and I say, the web is sort of like this. You're walking along the beach and you see this bottle in a lake, it's bobbing in the waves, and you go and grab the bottle and you pull out this note, and the note says, N95 masks don't prevent COVID. So you start, and you dig into the note, and you look, and it has some footnotes and it has some charts and there's some data there, and it makes a really logical argument about the size of the mask weave and all this stuff. You go to your friend, you say, hey, I found this note in a bott... N95 masks don't work. And your friend is like, well, that's very disturbing if true. And how do you know that? I said, well, I found this note in a bottle floating in the lake. And the person says, well, that doesn't sound great. And you're like, but no. I looked at the note. I looked really deeply at the note and I followed all the logical arguments. The note has lots of footnotes and so forth.
The point again, is that the web is sort of like that. The web tumbles this stuff at your feet, which may have, again, we use this term face validity, may have some sort of validity on its face, but you want to get away from that. So, you're going to stop and you're going to say, hey, I pulled this bottle out of the lake. I actually don't know who it's from, and I'm not actually a virologist. So, I'm probably going to have to zoom out here.
Investigate the source is really simple. It's not a Pulitzer Prize-winning investigation. It is as simple as, I want to find out who is this person that's telling me this? Are they in a position to know? Is there a reason they would know more about this subject than the average person, either through their profession, maybe they're a reporter, or as an expert? And do they have some reputational incentives to try to be as truthful and as non-spinny as possible? What's their, we don't talk so much about bias, but we do talk about agenda. What's their agenda? Is their agenda to get you to buy nutritional supplements, or is their agenda to inform? And those are two separate agendas. Look at that. Or, is their agenda, maybe they're a comedian and they're into satire. That would inform you as well.
And then, find other coverage. Find better coverage is, if that source in itself is not sufficient to say, oh, okay, this is a really solid source, I'm not going to just take this at face value, then go and find something better. You don't have to come with the... It's not the dance. You don't have to dance with the person that brought you. You can actually go and you can find a better source.
Some things are a little more complex, and it turns out you can't find better coverage on something, it turns out the source that initially provided you the claim is not reliable, and so the trace, the T, the trace is about, okay, well this person makes this claim. Let's follow the links that they provide and see if it gets us to a better or more reliable source. That's a last step, because you want to be careful that you're not sucked into their game of pulling you into their knowledge network. It's always better to zoom out, pick your own expert, pick the best expert, pick the best source, pick better coverage, than to follow their web in, but sometimes you've got to do that.
Justin Reich: It strikes me there. That highlights an important distinction with the older CRAAP checklist models, which is the idea of CRAAP, which I'm sure virtually no one has ever done outside of an assessment criteria, is that you're supposed to use all of these criteria to evaluate a website, and with the SIFT model, you're really supposed to stop and quit as soon as you get triggered that something is not right here. That it might be that...
Mike Caulfield: Or that something is right. Or that something is right. That's one of the weirdest things, is we give students permission to stop when it looks like it's good enough. Students get a little freaked out. They do, they do. This is one of the most bizarre things about teaching students, is you'll find something like a viral image of a gigantic tortoise or something, and it turns out to be true. Or little windows that have been put into cows. I saw you liked that tweet earlier. Little windows, but it turns out that it will look like it's true, and you ask the student and you say, hey, so what do we think about this? And the student is like, well, it's reported in a couple of things that look pretty reputable. And you're like, oh yeah, so what do you think? And they're like, good enough. And then they kind of freeze. They've never said good enough before. And they're like, but now we're going to have to delve in deeper. It's like, no. You're sharing a viral cow image.
If three major reporting sources have verified the viral cow image, I don't think it's your job as a consumer of information to somehow outdo the Irish Times or something there. There is. There's a stopping rule. There's a stopping rule when you come to something and it's fishy enough, you just say, hey, I'm going to pass. When you come to something and you're like, I am of the opinion that enough people I trust did enough work on this that is good enough, you stop there too.
Justin Reich: How would you contrast this with critical thinking? How is this not critical think... I get the sense that the argument that you make, that Sam makes, is that this is something that's different from critical thinking.
Mike Caulfield: I've got to say this really clearly. Alex Jones came after me really hard on this, as did a bunch of other people, that I was somehow anti-critical thinking. You can find the video of him going, Michael Caulfield, an academic...
Justin Reich: Congratulations.
Mike Caulfield: People will go, Michael Caulfield is against critical thinking. And I'm not. I'm against critical thinking as taught. The problem with critical thinking as taught is it's largely taught as an individual epistemology. The idea is that you, with your own super smart brain, are going to somehow directly verify these arguments through the logical train of the arguments, through the data that you find in the document, through looking for any inconsistencies that might surface, through getting some sense of whether it's well-sourced or not well-sourced, that you are going to do that. Critical thinking, as generally taught, is a very individualistic epistemology. It removes the most important thing in most decisions we make, which is social epistemology. That is, we need to know how to read, and read relatively quickly, the state of knowledge or opinion of expert and professional communities on things that we're interested in.
This simply really hasn't been made a piece of critical thinking, which is, again, so much about direct verification. One of the things I talk about is that we spend, for example, so much time putting students into science labs where they replicate these various experiments, and that's not bad, to learn the techniques of science, but students grow up, graduate high school, and most of them can tell you, hey, mitochondria, powerhouse of the cell.
Justin Reich: The powerhouse of the cell, if you will.
Mike Caulfield: Powerhouse of the cell. I don't know how much it goes beyond that. But if you ask a question like, what's the difference in mission between the CDC and the NIH, which is actually going to be much more useful to you in the course of your life, understanding that the CDC is a community-engaged agency, which actually spends a lot of its time hashing out its decisions and suggestions with stakeholders, and the NIH is a research agency, and these serve synergistic but different functions, that's going to be a lot more useful to you, understanding what the CDC means when it says this. And who they may have talked to. That sort of stuff, which is pretty easily teachable, is not anywhere in any curriculum.
But that's the social epistemology stuff. That's saying, we've built these vast infrastructures of knowledge creation and knowledge testing, knowledge verification, these vast infrastructures, and then we teach students the way you verify things is you get a test tube and you drop some stuff in it. That's not the way you verify things. The way you verify things in a complex, multi-professional society that has high degrees of specializations is you learn what the knowledge infrastructure is in an area, how it works, what consensus looks like in a community, what dissensus looks like in a community, what an emerging majority looks like in a community, and if you can get a read on that relatively quickly...
If you understand, for example, that by the time the American Physical Society says, climate change is real, and we're writing up the memo for the entire association, that you have an academic society that just wrote a memo for its... An academic society coming to that point, that means that the actual intellectual question was resolved a while back. The APS saying that about climate change is bigger than any individual paper is ever going to be. An entire society of physicists saying this is a statement of our profession. Understanding that, the fundamental difference between a statement that can be read as a consensus of a profession, and oh, someone forwarded me this paper that says actually there was something called the Little Ice Age, and we had grapes in a place that we didn't expect, that these are fundamentally different things. Not something that anybody understands coming out of high school, and probably the most important thing that a person can learn.
Justin Reich: Let me then...
Mike Caulfield: I'm backing up from the mike.
Justin Reich: Back up from the mike, Mike Caulfield. This is something which has interested me about these lateral reading SIFT approaches, is that the SIFT approach is very simple. The first few steps that you take are relatively straightforward. Learning the basics of information and scientific consensus production is harder. Learning the details, like you brought up, of what's the difference between the CDC and the NIH, obviously that one pair of facts is not that hard to figure out, but multiply that by every government agency that exists, and you now have a pretty extensive background knowledge that people need to go beyond the SIFT.
Mike Caulfield: Yeah.
Justin Reich: That's my interpretation. If you want people to be good at information literacy, the first thing that you do is you teach them some basic steps, most of which amount to: instead of trying to figure out whether the note in the bottle is accurate by looking really closely at it, you just ask what other people think about the message that's in the note.
Mike Caulfield: Yeah.
Justin Reich: That's a relatively straightforward first step. But then, everything that you need to know to be able to evaluate every possible claim is a lifetime's pursuit of knowledge.
Mike Caulfield: It is, but here's what we have found. And I would love to get more rigorous research verifying this, but I can tell you that we have found, over and over again, that it is a virtuous cycle. That SIFT is the first step. I'll give you an example. We did, it was actually a pretty rigorous assessment of this intervention we did at CUNY Staten Island, and the results we got from this intervention, in terms of students being able to assess the credibility of various sources and claims, really great results. Really great results, but...
Justin Reich: Students haven't learned the SIFT, you bring it in, you bring in some curriculum, you give them some teaching and training, afterwards, they do much, much better. They make far...
Mike Caulfield: They do much, much better. Right. If you look at students that are coming to correct conclusions and using lateral reading, you're seeing these tenfold increases in an intervention that's three weeks. So, you're pretty excited. Your listeners will know this, but this is not content. This is skill.
Justin Reich: Yep.
Mike Caulfield: So, when you're seeing 46% of students on at least one prompt coming to a level of mastery in three weeks, that's not 46% of students getting 90% on a vocab list. That's 46% of students applying a set of skills, coming from something like 3%. That's a gigantic increase that is shocking to me. A lot of what I do involves talking to people that are just in the misinformation world, and sometimes I feel like they do not understand what a profoundly weird educational result that is. Because I have done so many assessments and so many other things. I have never seen anything skills-based that works anything like... But anyway, I'll back off of that. I'm ramping that up.
But here's the more important thing about that CUNY Staten Island intervention. We did it in September or early October of the year. And then in May, we get this letter from the Director of Scholarships, the scholarship office in CUNY Staten Island. And they don't know anything about what we did. They didn't know we did this intervention in all the freshmen classes or whatever. But they write this letter and they say, hey, I direct the scholarship office, and there's a scholarship that is this big deal scholarship at CUNY Staten Island, and a lot of students apply to it, and it has a social engagement piece to it. And so because the students that are going to apply to it have to go to this interview process with the funder, with the granting organization, we have them in and we do little run-through interviews.
And every year, she says, we ask the students, where do you get your news? And the students basically say, on the web. And we'll have one student out of a dozen who says, oh, well, the New York Times. And that's it. She says, this year, every single student gave us multiple sources they consult for news. Students were talking about how they might prefer a local source for local news, that for international news they might go to Reuters, and for something that might be more scientific, they go to some other sci... And they were rattling off all these sources, and here's the thing, is we didn't necessarily teach them those sources. But they're getting on the web into this process of saying, oh, I'm just going to do a news search. What do they start to do?
Some things keep popping up. Whenever they're looking at something that's about landscape, about wildlife, maybe National Geographic pops up. And they suddenly think, oh, National Geographic, I've done the Wikipedia search, that actually... So there's a virtuous cycle that starts here. Now, there's some stuff you've got to prime. For example, students don't understand the difference between an advocacy group and a research group, or all the things in between. That's a concept you've got to prime, because if they don't understand, hey, look, advocacy is really a mission-first organization. A research group might have a mission, but it has certain principles. They try to put some guardrails on that.
There's certain things you've got to prime. What's the difference between an opinion column in a newspaper, and what's the difference between an art... Some stuff you've got to prime. But a lot of this is just a virtuous cycle. We do see this where students, even in the course of the classroom session--
Justin Reich: It puts them on a better path.
Mike Caulfield: Yeah.
Justin Reich: The path that we have students on right now is not working, is not giving them the skills that they need to be successful in information literacy, and even when they have a certain skepticism, it leads them to this really desperate, you can't trust anything on the web kind of skepticism, which is, in some respects, equally poisonous to, well, everything is true.
Mike Caulfield: Yep.
Justin Reich: If you start pointing them in the right direction of lateral reading, then you get them interested and excited about how you can do verification, how knowledge is formed in this society, and they start developing more and more interest in all those various spokes.
Mike Caulfield: And every question that comes up, because of the way this is structured, every question that comes up ends up being a little foray out into our knowledge infrastructure. Because they're not just focusing on this note that came to them in the bottle, which will have no benefit for them outside that note if that's what they focus on, because what they're doing is venturing out and saying, hey, what are the sources out there? What is the APS? Is that a big organization for physicists? Because they're venturing out there and they're learning these things, every individual question that comes to them, even if it's about whether scientists put windows in cows, even the most trivial question ends up building their knowledge of what's out there to answer questions. What are the better sources to answer questions? What's a good source for this? What are some of the proficiencies?
Over time, and this is one of the things that I'm really interested in exploring, I think you can give students ridiculous prompts. I'm into this whole animal behavior thing right now, where I've got 57 prompts that are just on unusually large dogs, or a tiger getting loose in Houston. The interesting thing about that is, because even the trivial prompts push you out into the real social knowledge infrastructure, over time, you start to build this thing. That's the really interesting thing about the CUNY Staten Island thing to me, is that because I've seen these kids in class, not the particular ones in Staten Island, but all the ones that I've taught, I know those students were proud to be rattling off all these things. They were listing these things off like a hipster lists off indie bands.
They had a pride that they actually understood the information landscape. They were basically saying, look, you want me to draw you a map? I'll draw you a map. National Geographic is over here, and over here you have Reuters, and you've got some of these sources in India, which can be dodgy at times, but... You start to get this map. And so even trivial questions start to build this deeper understanding of this knowledge infrastructure, and that's a piece of it that's really exciting to me.
Justin Reich: Well, Mike, it's a story about an incredibly difficult problem, but with some really hopeful solutions. And that sense of self-efficacy that students are developing is a great place to sign off. Thanks so much for joining us on TeachLab.
Mike Caulfield: My pleasure.
Justin Reich: I'm Justin Reich. Thanks for listening to TeachLab. You can check out our show notes for links to all kinds of parts of Mike's work, from Web Literacy for Student Fact-Checkers to Sifting Through the Coronavirus Pandemic, and other resources that we discussed today.
Be sure to subscribe to TeachLab for future episodes, and consider leaving us a review. You can find my new book, Failure to Disrupt: Why Technology Alone Can't Transform Education, at booksellers everywhere, and you can check out related content at failuretodisrupt.com. That's failuretodisrupt.com.
My colleagues at the Teaching Systems Lab and I have two courses you can sign up for on edX. If you enjoyed the topic of today's episode, you can join me and Sam Wineburg from Stanford University in Sorting Truth From Fiction: Civic Online Reasoning, where you'll learn the skills and practices of information literacy, lateral reading, the SIFT technique, that folks like fact-checkers use to sort fact from fiction online, and you'll learn great ways to teach these strategies to your students.
And, you can join me and Vanderbilt professor Rich Milner in a free self-paced online course for educators called Becoming a More Equitable Educator: Mindsets and Practices. Through inquiry and practice, you'll cultivate a better understanding of yourself and your students. You'll find new resources to help all students thrive, and develop an action plan to work in your community to advance the lifelong work of equitable teaching. Even if you've taken this course in the past, we'd love to have you back. Bring your colleagues, form a learning circle in your school or community, and come and participate in our online community. You can find the links to these courses on edX in our show notes, and you can enroll now.
This episode of TeachLab was produced by Aimee Corrigan and Garrett Beazley, recorded and sound mixed by Garrett Beazley. Stay safe until next time.