Harvard Graduate School of Education

Learning in the Age of Algorithms

Project Information Literacy explores how students understand and navigate information — and what has to change

January 21, 2020

#FakeNews, deep fakes, Russian hackers, and tricky algorithms — in 2020, the truth may be hard to come by. And today’s college students are feeling the uncertainty intensely, according to a newly released study conducted by principal investigator Alison Head and her team at Project Information Literacy.

Head collected qualitative research from a diverse sample of 103 students and 37 professors at eight colleges and universities across the country to explore the ways students conduct research and obtain and validate news and information in a shifting information landscape.

What she found is that many students are in the middle of an “epistemological crisis,” says Head, a visiting scholar at Harvard Graduate School of Education. These students — members of what the report calls “a pivotal generation born before the constant connectivity of social media” — are deeply skeptical consumers of information. They’ve grown up with the internet, and algorithms have always influenced the news and information they receive. But most educators aren’t well positioned to help students navigate this new world, where information is shaped and filtered differently for different users.

“Democracy calls for an informed public, and yet, students have questions about what to believe," says Head. "The whole issue of credibility and not knowing what to trust is so critically important, as is having the necessary evaluation skills and abilities at a time of dramatic change to what we experience and know. That’s where education comes in.”

A Generational Disconnect on Algorithms and Trust

The new information landscape is a product of algorithmic data collection used by social media platforms, like Facebook and YouTube, news sites, and even our cellphones. Companies use invisible computer codes to track users’ interactions in order to personalize their web experiences and influence their buying and viewing behavior, leading to concerns about privacy, accuracy of information, the preservation of shared norms, and authenticity.
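The tracking-and-personalization loop described above can be sketched in a few lines. The following toy ranker (purely illustrative; the names and logic are assumptions, not any platform's actual code) promotes content that matches a user's past clicks, which is the basic mechanism behind a "filter bubble":

```python
# Illustrative sketch of engagement-based personalization (hypothetical,
# not any real platform's algorithm): rank a feed by how often the user
# has previously clicked each article's topic.
from collections import Counter

def rank_feed(articles, click_history):
    """Order articles by the user's measured interest in their topics.

    articles: list of (title, topic) tuples
    click_history: list of topic strings the user has clicked before
    """
    interest = Counter(click_history)  # topic -> number of past clicks
    # Highest-interest topics float to the top; unseen topics sink.
    return sorted(articles, key=lambda a: interest[a[1]], reverse=True)

history = ["sports", "sports", "politics"]
feed = [("Election recap", "politics"),
        ("Hurricane update", "weather"),
        ("Playoff preview", "sports")]
print(rank_feed(feed, history))
```

Run repeatedly, a loop like this feeds users more of what they already engage with, which is why the study's participants described being "compartmentalized into filter bubbles."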

Data is collected everywhere. Even schools, from elementary schools to higher education institutions, collect data on students through learning management systems (LMS), thrusting schools directly into debates about how LMS companies like Canvas protect or use that data.

While the ethics and impact of algorithm-driven content on platforms like Facebook or an LMS are murky and still emerging, the Project Information Literacy study raises immediate flags.

  • A disconnect between faculty and students: “Where the study is most revealing from an educational standpoint is this gap between faculty and students,” says Head, noting that students report that faculty often tell them to use more trusted sources like peer-reviewed journals obtained through the school’s database or to use sites with “.org” instead of “.com.” But students, having grown up with the internet, know that there is far more news and information on the web, and they lack direction in navigating it.
  • Potential for peer learning: While students reported skepticism both of news sources and of traditional authority figures, the team found there was trust within their peer group. Students often looked to one another for ways to navigate and undermine the collection of their own personal data, using virtual private networks (VPNs), for instance, to shield their browsing activity. For Head, this pointed to an opportunity for continued learning in higher education through peer learning opportunities.
       
  • Mixing indignation with resignation: “Students have what we called an ‘ambivalent bond’ with algorithmic-driven platforms,” Head says. “In a lot of ways, they’re torn. They’re resigned to a big, powerful, social media economy and unable to abandon sites like Instagram or Google, but they still objected to certain advertising practices these platforms used that they saw as ‘creepy’ and resented being compartmentalized into filter bubbles.”
  • Potential for activism: During the focus groups, researchers noted that students became particularly indignant about algorithms parsing content in ways that inadvertently amplify societal inequalities. They also observed a concern for future generations and the implications of a world of big data. The right academic content and a greater awareness of the implications of algorithmic platforms could help students take a stand and exert their agency.

What does this mean for curriculum, citizenship, and equity?

  • Rethink the research project: The research project, often in the form of a final paper, is something most students tackle in higher education. But are educators asking them to conduct research in a way that is applicable to how they will later have to navigate the flood of information as citizens and as professionals? Are students encouraged to explore topics using a wide range of resources and ask questions of their own? 
     
  • A rise of isolation, distrust, and extremism: Because algorithms personalize news and advertising to influence behavior, we no longer see the same information online, and as a result we don’t experience reality in the same way; consider what has happened with beliefs about climate change. “There’s an inability to find information that is trustworthy. I think the slippage we’ve seen, especially with news, is the difficulty students have telling facts from opinions,” says Head. “We’ve seen this contribute to a growing lack of student confidence in being able to say this is how I’m going to vote because this or that is true, or this is what I’m passionate about and I have figured out the argument using facts I trust.”
  • Cross-disciplinary collaboration is critical: To truly teach students how to navigate information and make decisions holistically, all disciplines need to consider the ways in which algorithms influence their particular field. Maybe it’s a philosophical look at the ethics of algorithms, or a study of propaganda that extends to the differences in students’ Google results. Head noted one professor who asked his students to consider Fitbits and the ways in which information about physical activity could be used by insurance companies.
       
  • Consider algorithmic justice alongside social justice: Injustice doesn’t just exist in social interactions but is perpetuated in virtual ones, too. “The agency, as well as the need for a more just society, are also part of programming decisions that go into coding algorithms and how data collected are ultimately used,” Head says. “Students need to know that algorithms, depending on how they are used, can reinforce existing biases and prejudices. They can be used to determine things like who gets called in for a job interview or approved for a loan. There are real-life consequences for us all.”