Websites like YouTube openly host videos that falsely claim vaccines harm people, that the US government knew in advance about the 9/11 attacks, that the moon landings were faked, and other conspiracy theories.
Finding this content is not hard for those who search for it. But a more insidious issue is how recommendation algorithms present this misinformation to users who haven't asked for it. Indeed, many commentators complain that these algorithms can create “misinformation filter bubbles” in which users are fed a worrying diet of demonstrably false ideas.
Services like YouTube do not publish their recommendation algorithms. This makes it hard to know how easily users can fall into misinformation filter bubbles, or how hard it is for them to get out again.