You probably know you can’t believe everything you see on the Internet. But you may still be surprised to find how easily fake science makes its way through YouTube and other social media sites — and how intentionally it’s being promoted.
A new study from a researcher at RWTH Aachen University in Germany about the prevalence of inaccurate climate science and conspiracy theories on YouTube illustrates the grim reality, but also a way to fix it.
The study used 10 different search terms on YouTube, such as “climate change,” “climate science,” “geoengineering” and “climate hacking,” and analyzed the results to see which videos supported the scientific consensus around climate change and which did not.
It also used an internet tool called Tor, which anonymizes users, in order to avoid YouTube’s practice of personalizing search results based on previously watched videos, location and other demographics.
Overall, most videos in the 200-video sample disagreed with the scientific consensus on climate change, and of those, 85 percent actively spread conspiracy theories. Videos that agreed with the consensus received more total views than those that disagreed, but only by about 2,300 views; each category drew almost 17 million views in all.
YouTube has taken some steps to counter this, outlined in an update it published on its blog in July 2018. One major change was the addition of blurbs drawn from Wikipedia and Encyclopædia Britannica next to videos on “well-established historical and scientific topics that have often been subject to misinformation,” such as the moon landing.
But the study’s author, Joachim Allgaier, explores another solution that doesn’t involve YouTube changing its guidelines or algorithms. In the study, he writes, “YouTube and other online video-sharing websites have an enormous potential as tools for science and environmental education … [T]he professional communities from these subject areas will do well to engage effectively with these communication channels.”
In other words, scientists should step up to the plate and produce more YouTube videos that fit with the facts.
Getting more scientists actively engaged in science education is no easy feat, but on YouTube, at least, it is already happening. Channels like SciShow, Physics Girl, The Brain Scoop and more provide a wide range of science content. In Allgaier’s study, four videos from “Science YouTubers,” including SciShow, appeared in the sample and ranked third in views among all the videos sampled.
Scientists don’t necessarily have to create a new YouTube channel, buy fancy video equipment and hire a team of scriptwriters or video editors to have an impact. SciShow, for instance, is hosted by non-scientist YouTuber Hank Green and others, but it hires scientists as consultants to develop curricula and video ideas. Crash Course, a channel owned by Complexly (the same education company behind SciShow, owned by Green and his brother John), also hires chemistry and geography experts.
There’s another angle on this that Allgaier notes. When he used the search term “chemtrails,” all but one of the videos in the sample actively supported the common conspiracy theory. If someone searched for that term, there might be no scientifically accurate videos to counter all the conspiracy content. A similar pattern appeared with the term “geoengineering,” and Allgaier writes that the scientific term has been “hijacked” by conspiracy advocates to push their own agenda.
So, there’s something to be said for using those terms to intentionally push more real science videos into the search results for common conspiracies like chemtrails, and possibly reclaim words like “geoengineering” that have been corrupted.
Fixing misinformation and fake science online is not an easy task, and it won’t happen overnight. But maybe, as Allgaier suggests, instead of waiting for YouTube to take action, scientists can get the ball rolling themselves.