"Startups and tech giants alike are racing to create deepfake detection devices to try and prevent a surge in AI-generated disinformation in the age of ChatGPT ..."
A surge in awareness about disinformation among pupils and teachers has been accompanied by a rise in the number of teachers who bring up this thorny issue in the classroom. But the gap between demand and supply remains largely unchanged. The share of teachers saying digital literacy is important is still nearly 30 percentage points above those who say it is being taught.
In this blog post, Pip Divall, Clinical Librarian Service Manager at University Hospitals of Leicester NHS Trust and CILIP Information Literacy Group’s Health Sector rep, gathers together useful resources for tackling Covid-19 vaccine misinformation during the global pandemic.
To make sense of the information streaming to and at us through media and social media, we need to be able to detect and identify misinformation, misleading information, and disinformation. This talk shares key tools and practices that can help us navigate what can seem like a minefield of misinformation, so we can better sift valuable information from what could be harmful. Chris Coward studies information at the University of Washington Information School. As director of the UW's Technology & Social Change Group and co-founder of the Center for an Informed Public, Chris focuses on issues of access, digital inclusion, digital literacy, civic engagement, and most recently, misinformation. Fake news, disinformation, misinformation, hype, rumor, and distortion can create minefields and labyrinths. He shares clues on how to strengthen constructive interactions and free yourself from the escape room of popular misinformation. This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at https://www.ted.com/tedx
The spread of false and misleading news on social media is of great societal concern. Why do people share such content, and what can be done about it? In a first survey experiment (N=1,015), we demonstrate a disconnect between accuracy judgments and sharing intentions.
How can we stop the spread of misleading, sometimes dangerous content while maintaining an internet with freedom of expression at its core? Misinformation expert Claire Wardle explores the new challenges of our polluted online environment and maps out a plan to transform the internet into a place of trust -- with the help of everyday users. "Together, let's rebuild our information commons," she says.
I believe one way to help stem the spread of misinformation is to educate our students by embedding the topics of media literacy and digital citizenship throughout all curriculum areas. Thoughts? JE
Beneath the spread of all “fake news,” misinformation, disinformation, digital falsehoods and foreign influence lies society’s failure to teach its citizenry information literacy: how to think critically about the deluge of information that confronts them in our modern digital age. Instead, society has prioritized speed over accuracy, sharing over reading, commenting over understanding. Children are taught to regurgitate what others tell them and to rely on digital assistants to curate the world rather than learn to navigate the informational landscape on their own. Schools no longer teach source triangulation, conflict arbitration, separating fact from opinion, citation chaining, conducting research or even the basic concept of verification and validation. In short, we’ve stopped teaching society how to think about information, leaving our citizenry adrift in the digital wilderness increasingly saturated with falsehoods without so much as a compass or map to help them find their way to safety. The solution is to teach the world's citizenry the basics of information literacy.
A well-presented essay/article on the necessity of teaching our 'citizens' digital and information literacy, and of cultivating a persistent and healthy scepticism towards the information they read online. My favourite phrase in the piece: "Most importantly, we must emphasize verification and validation over virality and velocity."
Science and technology can already help us fight fake news, but the problem is partly cultural in origin: habits such as compulsive sharing reflect a lapse of conscience. Addressing it takes honest, conscientious work by individual users within an organization, asking ourselves: what do we think, what do we mean, what do we feel?
In the past year, many educational institutions began to address the challenge of digital misinformation. As head of a multi-institutional project that addresses these issues, I found this heartening. Less encouraging, however, was the persistence of many myths about how misinformation works, what its risks are and how we might address it. In the hope we might have a more productive 2019, I thought I’d outline some of those myths and realities below.
"For the first time, A.I.-generated personas, often used for corporate trainings, were detected in a state-aligned information campaign — opening a new chapter in online manipulation ..."
In this 15-minute presentation, MIT’s David Rand summarizes what recent research says about psychological factors related to belief in information, both true and false. Repetition, alignment with prior beliefs, and hearing from trusted sources are factors that correlate with more belief in information, regardless of its truth.
False content online has only multiplied over the years. But the fake news designation has also been used to serve all kinds of purposes—including, increasingly, to disparage real news reporters—so most experts now avoid the term. Instead, researchers usually talk about disinformation, which is purposefully false, and misinformation, which is unwittingly false (either because the publisher made a mistake or because the person sharing the content did). As false content spreads through social media networks, it can oscillate between the two, and it can manifest in various forms, including memes, tweets, or “imposter” content made to imitate real news stories.
In an age of democracy via social media, platforms are struggling to combat visual mis/disinformation such as 'spliced' images and deepfakes. Digital media literacy has never been so important.
People of all ages struggle to evaluate the integrity of the digital information that rains down with every web search and social media scroll. When the Stanford History Education Group released findings showing that most students couldn’t tell sponsored ads from real articles, among other miscues, it intensified the scramble for tools and strategies to help students discern better.
But a more recent study by Stanford’s Sam Wineburg and Sarah McGrew suggests that many of the techniques that students and teachers employ — which include checklists and other practices most recommended for digital literacy — are often misleading.
A better solution for navigating our cluttered online environment, they say, can be found in the practices of professional fact-checkers. Their approach, which harnesses the power of the web to determine trustworthiness, is more likely to expose dubious information.
The following guidelines for interrogating online information, inspired by the fact-checkers' techniques, will increase students' odds of identifying unreliable sources (and consuming reliable ones).
"Bad News is a website that offers simulations that show visitors how misinformation is spread through social media. Bad News is available in two versions. The regular version is intended for those who are high school age or older. Bad News Junior is appropriate for middle school and older elementary school students. The difference between the two versions is found in the news topics that are used in the simulations."
This summer, a new California law goes into effect, aimed at supporting media literacy in my home state's school systems. Effective July 1, the statute requires the state Department of Education to provide online resources on media literacy for use by school districts. And some U.S. senators have reportedly floated similar legislation at the national level. These efforts can't come soon enough, given how fast unreliable and provocative online information is dividing the country and challenging the very stability of our democracy.
Laws can only go so far, however. We need to get teachers and parents involved in grassroots efforts to promote media literacy at all levels of education. If you have a high school student in your household as I do, it's time to talk with other parents, reach out to the social studies department, and get organized. If you are a teacher, you should either embrace whatever proactive measures your students' parents want to make or be the first to encourage such a coalition. We need leadership on both sides.
American tech companies positioned themselves as entities that brought positive change by connecting people and spreading information. Perceptions are shifting.
At the start of this decade, the Arab Spring blossomed with the help of social media. That is the sort of story the tech industry loves to tell about itself: It is bringing freedom, enlightenment and a better future for all mankind.
Mark Zuckerberg, the Facebook founder, proclaimed that this was exactly why his social network existed. In a 2012 manifesto for investors, he said Facebook was a tool to create “a more honest and transparent dialogue around government.” The result, he said, would be “better solutions to some of the biggest problems of our time.”
Now tech companies are under fire for creating problems instead of solving them. At the top of the list is Russian interference in last year’s presidential election. Social media might have originally promised liberation, but it proved an even more useful tool for stoking anger. The manipulation was so efficient and so lacking in transparency that the companies themselves barely noticed it was happening.
The election is far from the only area of concern. Tech companies have accrued a tremendous amount of power and influence. Amazon determines how people shop, Google how they acquire knowledge, Facebook how they communicate. All of them are making decisions about who gets a digital megaphone and who should be unplugged from the web.
Their amount of concentrated authority resembles the divine right of kings, and is sparking a backlash that is still gathering force.