
College Students in the Age of Surveillance

Project Information Literacy's Alison Head on how algorithms shape the way information is tracked online, and how it affects teaching and learning.

This newest generation of college students is aware that algorithms tend to skew the truth online, but many feel it is par for the course. Alison Head, a researcher and the director of Project Information Literacy, explores how algorithm-driven platforms are shaping the ways college students access news and information, and how that has the potential to change the college landscape.

TRANSCRIPT

Jill Anderson: I'm Jill Anderson. This is the Harvard EdCast. Information scientist Alison Head wanted to know how aware college students, and even faculty, are of the ways algorithms shape the news and information they receive online. She's the founder of Project Information Literacy, a research institute that studies what it is like to be a student in the digital age. What she found out is that most students know the internet tracks their online moves, but feel resigned to live with it even though it bothers them. I spoke to Alison about that, about how all this tracking of information impacts teaching and learning, and about the serious questions it raises about what's going on in our world.


Alison Head: We characterize it as an epistemological crisis, in the sense that it's harder to know what to believe these days, what is true, and where to find it. We felt some of that in the news study, but I think this study in particular called out a gap between faculty and students in how they think about an information landscape that is being shaped and influenced. Really, what's doing that shaping, what's correlating information, is algorithms. I mean, that's what they're trying to do: they're trying to match your needs and interests.

Jill Anderson: And so where was the divide between students and faculty on that?

Alison Head: That's a great question, and we really didn't know going in, which makes the research really exciting. When we went in, we thought, "Well..." It was almost like the news study: do students engage with news? We found they did in that study. These are really companion studies. In the second study we asked a very broad question to begin with, which was: how familiar were students? We did student focus groups, 103 students in total, from eight colleges and universities across the U.S. And we found that students were more familiar than faculty. As a point of comparison, we did in-depth interviews with 37 faculty members. Both are small samples; it's qualitative research, so it explores the relationships between the individuals in the sample. But we found faculty were less aware than students.

Just backing up for a second, I love this quote. One student said, "I can't really tell you how they work because I'm not an information group, but it's like algorithms have this magic potion. And they can find out what you've searched. And then if you like CNN, the next time you search, that's what you're going to get, not BBC." So there was a recognition that their information was being manipulated. Faculty, on the other hand, and this is a great contrast, knew that information was manipulated. So their approach was really to abandon what we call the internet giants, like Google, like Facebook, like Instagram. Even the different programs on a campus use different algorithms.
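To make concrete what that student is describing, here is a minimal, purely illustrative sketch in Python of click-history personalization. The class name, the weighting scheme, and the source names are invented for this example; no real platform's code works exactly this way.

from collections import Counter

# Toy model of the behavior the student describes: past clicks on a
# news source bias which source is shown first the next time.
class ToyNewsRanker:
    def __init__(self):
        self.clicks = Counter()  # per-source click history

    def record_click(self, source):
        self.clicks[source] += 1

    def rank(self, sources):
        # Sources clicked more often float to the top, so a frequent
        # CNN reader keeps getting CNN before BBC.
        return sorted(sources, key=lambda s: -self.clicks[s])

ranker = ToyNewsRanker()
for _ in range(3):
    ranker.record_click("CNN")
ranker.record_click("BBC")
print(ranker.rank(["BBC", "CNN", "Reuters"]))  # ['CNN', 'BBC', 'Reuters']

Real recommender systems are far more elaborate, but the feedback loop is the same: what you clicked before shapes what you are shown next.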

Faculty were prone to being overwhelmed, like students, and to feeling helpless and hopeless, like students, but then they retreated to peer-reviewed resources or resources they considered tried and true. NPR came up a lot for news; so did the New York Times. So what faculty did was try to abandon platforms that use algorithms. Of course, the irony is that algorithms are being used more and more, and that's going to be a difficult strategy to uphold. Students, we found, actually had a set of defensive practices that could certainly be built on in educational institutions, which is one of our recommendations, as a way to apply it back and have a certain agency over the algorithms.

Jill Anderson: Can you talk a little bit about some of the dangers of not being aware and knowledgeable about these algorithms, or about how certain media companies might be skewing what you're seeing or what students are seeing?

Alison Head: We talk about this a lot in the study, and I do think it's one of the more fascinating findings: they were resigned yet indignant. And resigned, really, if you think about it, like you said, because what do you really do about a company like Facebook or Amazon? These are zillion-dollar companies, and we're talking about an unregulated media environment. As one student said, "Look, algorithms help sort my information. I'm going to exchange convenience for my privacy. That's part of the deal. This is a free application for me, and that's the price I pay." But there were larger concerns, as algorithms continue to be used. Not all algorithms are bad; it's just code. But it's another matter when code collects information about users in a secretive way and then combines it with machine learning and AI, which is happening more and more, to make automated decisions about things like who gets into college.

And the students in our focus groups said, "Well, I already got in." But there are other things going on here, in their world. Even a learning management system like Canvas collects a lot of personal data, which is used on campuses to study retention and to reach students who may be most at risk. But you have to look under the hood and say, "Well, how do you come up with your algorithms for identifying the students who are at risk? Does it come down to zip code?" Does this redlining also exist in who gets a loan to buy a car, or who gets a job interview? There are also a number of concerns about facial recognition, that the big data sets being used are actually kind of small and really not representative.

While we were out in the field, there was a lot of discussion about some AI technology in the UK that used algorithms to collect data about who gets an interview. Well, they had no dark-skinned people in the sample they were comparing applicants to. So that means if you're a person of color, you don't make the cut right off. So there are these inequalities that are reinforced by this very fast, automated decision making that is determined at the programming level. It's concerning.

On campuses, something like Canvas, it's up for sale, it's a for-profit company. Harvard uses it; a number of institutions across the U.S. use it. What happens to that data? This swapping of data? We first saw this with Cambridge Analytica, when algorithms became part of a public conversation. But what does that mean in an educational setting? Or what about putting an Echo Dot in a student's dorm room? Sure, it can help answer questions about where the library is or the cafeteria. But is it listening? What data is it collecting? Do students have the ability to opt out of Canvas? They don't.

Jill Anderson: The things that you were just mentioning, all of these, I want to say hidden, but more institutional things that track information, were the students aware of that? Did they think about it and talk about that in the study?

Alison Head: The adjective that came up over and over again was the creepiness of algorithms. And when we asked a follow-up question, "Well, what does that mean, and what's your tipping point? When have algorithms gone too far for you?", more often than anything else in the focus groups, students talked about algorithms that followed them from device to device, from platform to platform. I think we've all been in that situation where you're looking for a pair of shoes and all of a sudden it comes up on Amazon, even though you looked on Zappos. And it may come up on your Facebook page.

When you think of algorithms, often in these discussions the most visible thing is ads, the use of algorithms for generating personalized ad content. But there were other things going on that also concerned students: inequalities in society, and what shapes your worldview. Because really, in a larger sense, why should we care about this? Well, it's fundamental to how a democracy works. If you think of democracy as being an informed part of your community, but you have a hard time telling, again back to the epistemological crisis, what's true and what's not, this becomes really problematic.

Jill Anderson: Did you pick up on any way that this is impacting teaching and learning?

Alison Head: First of all, looking at individuals who were older than themselves, students felt that their teachers were really often out of step. We talk about this very tangible example, this idea of defensive practices. What do you do to fight algorithms? If you're a little bit indignant, or a lot, how do you have agency? That distinction between faculty and students, and what they then did, was key. Students often had more sophisticated strategies, often learned in high school from peers, like using a VPN, a virtual private network, to do an end run around firewalls. Sometimes international travel, a year abroad, had taught them that particular skill.

Faculty, meanwhile, were telling students things like, "Well, you'd better clean your cache of cookies." And that's algorithm worries 1A. So they weren't seen as a source of learning. But where students really saw value in the classroom is key here, and that is that conversations around social justice were translated into this idea of algorithmic justice. That is: what action can you take? What's available to you? How can you continue learning to be somebody who is, what we call, algorithm literate, aware of what tools exist and what things you can do? There are projects out there that are starting to develop this. The University of Amsterdam is developing tools. The MIT Media Lab, down the street from you, is developing an algorithmic justice program for high school students.

So we're starting to see more of a presence in classrooms. But how this is channeled into the curriculum is something faculty are really not thinking about. A number of faculty weren't even aware of what was going on with the data collected by learning management systems. And you see these stories over and over again in the press: the University of Missouri is starting to use mobile phones to keep track of whether students are in class or not. So there really is a divide in awareness.

Jill Anderson: We might have some educators listening, hopefully, and they might be thinking, "Maybe I should do something about this. Maybe it's not a college level class, but..."

Alison Head: Yeah. Well, let's talk about the other end of the continuum. Think about child development: say you're teaching a class in an ed school about child development. How do filtered content and surveillance, which is basically what this is, shape early values and learning? Could you pull different news studies in to support that? Could you raise that as an issue in your class? Something we've done to help educators think about how they can integrate this beyond the study is to offer further readings. And on the project site at projectinfolit.org, we have a landing page with a pinboard; we're adding new news stories every day that can serve as fodder for the classroom. Our further readings also include a syllabus: how do you teach journalists about algorithms? Not necessarily how to cover them, but how news is shaped by algorithms and what they'd cover.

One of our original studies was about Wikipedia and Wikipedia use. Wikipedia was seen as a really bad thing early on. We had survey data from over 8,000 students, and I thought, "I'm going to ask a question about Wikipedia." It was the sinful source at the time; in 2009, 2010, students were told, "Do not use Wikipedia," which is really kind of a modern-day encyclopedia, but better, because it's crowdsourced and updated, and it now has a lot more fact-checking and has become a lot more rigorous. At the time we really dove deeply into understanding Wikipedia and how Wikipedia could play a role in the classroom.

And I think librarians, as well as faculty, do a number of exercises along the lines of: write a Wikipedia entry, create knowledge, and submit it. Tell me what you're interested in. As a freshman, or first-year student, or sophomore, you write an entry and you find sources to support what you're saying are facts about something. Say you like skeet shooting, or say you like Asian cooking of a certain kind from a certain province. That exploration, creating knowledge and understanding what goes into it, has turned out to be a very powerful thing. And I think we're at a moment now where we can think of algorithms beyond just a computer science class, and really think about how they impact, like Wikipedia, our understanding of knowledge and what's true and what's not.

Jill Anderson: Alison Head is the founder and director of Project Information Literacy, a national research institute that studies what it is like to be a student in the digital age. She's also a visiting scholar at the Harvard Graduate School of Education. I'm Jill Anderson. This is the Harvard EdCast, produced by the Harvard Graduate School of Education. Thanks for listening, and please subscribe.

About the Harvard EdCast


In the complex world of education, we keep the focus simple: what makes a difference for learners, educators, parents, and our communities.

The Harvard EdCast is a weekly podcast about the ideas that shape education, from early learning through college and career. We talk to teachers, researchers, policymakers, and leaders of schools and systems in the US and around the world, looking for positive approaches to the challenges and inequities in education. One of the driving questions we explore: how can the transformative power of education reach every learner? Through authentic conversation, we work to lower the barriers of education's complexities so that everyone can understand.
