6 Advocating Digital Citizenship

Technology Trends and Media Literacy

We have all seen the growth of content and media in our lifetimes. Librarians reading this book may remember card catalogs and rows and rows of magazines and journals to be retrieved by inquiring students. Technology has changed all our lives, and it has changed how we talk and teach about media literacy.

How we find information has changed. There are fewer gatekeepers and arbiters of truth. The new gatekeepers are algorithms, and those algorithms come from search engines, social media feeds, YouTube channels, and e-commerce sites. They are not always understood even by their creators. In an article in Futurism, author Bahar Gholipour writes, "It's difficult to figure out if an algorithm is biased or fair, even for computer experts. One reason is that the details behind an algorithm's creation are often considered proprietary information . . . in more advanced cases, the algorithms are so complex that even their creators don't know exactly how they work. This is AI's so-called black box problem—our inability to see the inside of an algorithm and understand how it arrives at a decision" (Gholipour, 2018).

Humans, like many algorithms, have biases. These biases, embedded in human-created content, have contributed to the erosion of trust in the media. Gatekeepers are not necessarily a negative thing: gatekeepers can protect the castle and keep out destructive hordes. But they also exclude others from the treasure inside. With the massive increase in information, a desire for convenience, and our distrust of these gatekeepers, algorithms have become the new castle walls. And like their human builders, algorithms are faulty. Some of the faults of algorithms include:

■ Inscrutability: Unlike with a human decision, there is often no default avenue of recourse when an algorithm denies you a job or a loan.

■ Bias: Algorithms learn from associations between common words.
Humans learn this way too, and those associations can lead to stereotypes, generalizations, and harmful ideas about groups of people. Both humans and algorithms have biases, but humans are more likely to be aware of their biases.

■ Lack of accountability: Because their inner workings are secret and protected, algorithms are not subject to regulation or checks and balances. Developers should evaluate these algorithms, and many do, but algorithms often run untested and unchallenged.

■ Engagement issues: Algorithms are designed to get people to engage with and stay on a platform or product as long as possible. This can affect users' mental health and make it harder for them to step away from the screen.
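To see how learning from word associations can bake in bias, consider a toy sketch in Python. The corpus below is entirely made up for illustration; no real system works on three sentences, but the mechanism is the same at scale: the counting code has no opinions of its own, it simply mirrors whatever skew exists in the text it is given.

```python
from collections import Counter
from itertools import combinations

# A hypothetical, deliberately skewed training corpus.
corpus = [
    "the nurse said she would help",
    "the nurse said she was busy",
    "the engineer said he would help",
]

# Count how often each pair of words appears in the same sentence.
pair_counts = Counter()
for sentence in corpus:
    words = set(sentence.split())
    for a, b in combinations(sorted(words), 2):
        pair_counts[(a, b)] += 1

# The learned "association" is just the skew in the source text:
# "nurse" co-occurs with "she" twice, "engineer" with "he" once.
print(pair_counts[("nurse", "she")])
print(pair_counts[("engineer", "he")])
```

If the training text associates a profession with one gender, the counts do too, and any downstream prediction built on those counts inherits the stereotype. The algorithm is not malicious; it is faithful to biased data.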