While I tend to agree with those who say that the proliferation of fake news and harmful pseudoscience on social media sites like Facebook is a symptom of bigger problems, such as a lack of critical thinking skills among readers and knee-jerk tribalism, the platform itself isn’t helping matters. Although Facebook has acknowledged some responsibility and has reportedly made changes to improve the situation, I’m not sure enough is being done.
Below is a screenshot of one of our posts linking to an article that aims to address various cancer-related myths, which Facebook has decided to pair with three stories pushing dangerous cancer myths. Have a look.
While this may be inadvertent, it is nevertheless unacceptable. The problem here isn’t simply the fact that misinformation is making its way onto Facebook. I happen to be of the mind that people should be free to use social media sites to say whatever they please. That, of course, means that dangerous rumors and outright lies are going to make the rounds, but I feel more progress can be made by confronting these things and having discussions than by banning such material outright. No doubt there are lines to be drawn, and I am not advocating some sort of safe haven for fake news or medical quackery, but that is a bit beyond the scope of this post.
The idea that conversation is the best path to combating pseudoscience and propaganda on social media is, admittedly, idealistic. The average comment exchange on Facebook is probably anything but productive. But one of the biggest hurdles skeptical and science communicators face is that Facebook is not a level playing field. The screenshot above is a perfect example of why.
Anytime a skeptical or science page attempts to get solid, comprehensive information out there, chances are Facebook is using the opportunity to spread rumors and harmful pseudoscience right along with it. Facebook users run the gamut from savvy to novice, so it isn’t unreasonable to think some people may see the links paired with our post and assume that we are somehow endorsing them. People could come away thinking that the myths we are debunking are that cancer is a legitimate disease, that chemotherapy is helpful, and that doctors want to help. This defeats the purpose of posting such links in the first place.
With our article Facebook has decided to include three dangerous and well-debunked claims. The first is that cancer cannot survive in an alkaline environment. The idea here is that if we change our diet to include alkaline foods, we can change our blood pH to be more alkaline and thus kill cancer cells. This claim has been addressed by a number of sources, including this article from Quackwatch. In short, blood pH is tightly regulated by our bodies, so no matter how many alkaline foods you eat, your blood pH will stay within a narrow range. Further, regular healthy cells cannot function in a highly alkaline environment either, so even if you did manage to change your blood pH, you would be harming more than just the cancer. The goal should be to eat a balanced and varied diet. The idea that by eating certain foods we can bring about significant and lasting changes in blood pH is as absurd as the idea that eating cold foods will result in significant and lasting changes in body temperature. Both are kept in a narrow range by the body, and if they aren’t, you’ll need to seek actual medical attention, not simply eat a plate of broccoli.
The next claim is that a combination of lemon juice and baking soda produces effects ten thousand times stronger than chemotherapy. The article itself talks almost exclusively about lemons, with the baking soda seeming to be an afterthought to “normalize the pH of the body.” A quick search of Snopes turns up a debunking of this one.
The best that can be said at this point is that citrus fruits may potentially harbor anti-cancer properties that could help ward off cancer. No reputable scientific or medical studies have reported that lemons have definitively been found to be a “proven remedy against cancers of all types,” nor has any of the (conveniently unnamed) “world’s largest drug manufacturers” reported discovering that lemons are “10,000 times stronger than chemotherapy” and that their ingestion can “destroy malignant [cancer] cells.”
In addition to the information from Snopes, the idea also fails on several basic levels. First, if chemicals in lemons were found to have reliable anti-cancer properties, those chemicals would be isolated and given to patients as part of, you guessed it, chemotherapy. Chemotherapy is nothing more than a collection of chemicals that have been demonstrated to have an effect on cancer. By demonizing chemotherapy, the authors are bashing the very thing they are promoting, but lack the understanding to realize it. Additionally, chemotherapy is tailored to the situation of each individual patient. Treatments can vary depending on the type of cancer, the progression of the disease, and the circumstances and lifestyle of the patient. Doctors would never recommend chemotherapy without carefully considering these factors and examining a biopsy of the cancer. The blunt idea that a substance ten thousand times stronger than chemo would be good for the patient ignores all of these protocols and reveals a deep ignorance of what cancer is and how chemotherapy works. Does Facebook really think this is an idea that needs to be put in front of more eyes?
The last claim is the one I find most egregious. This article spreads the idea that doctors are diagnosing healthy patients with cancer for no other reason than to charge them for chemo treatments. The idea here is to undermine the trust people put in doctors so that the choice of alternative treatments seems more reasonable. The article cites a single doctor who, apparently, admitted to “intentionally and wrongfully diagnosing healthy people with cancer.” It then goes on to state, “Like him there are thousands of legally practicing Doctors and oncologists in the United States and abroad who are guilty of the same crimes, but because they fly below the radar, they are never caught.” One has to wonder, though: if these doctors are never caught, how does the author of the article know they exist? While I have no doubt that this happens – one can find examples of individual doctors doing all sorts of nefarious things – the article gives no justification for claiming that this is a widespread problem or that medical establishments condone such behavior.
In addition to the terrible information Facebook is promoting with these links, even worse is the inclusion of a “share” button immediately after each headline. If you spend any time trying to promote good information on Facebook, you’ll quickly realize that a big part of the problem is people reading nothing more than headlines, making their own assumptions, and then walking away with the idea that those assumptions are backed up by proper journalism. If people never actually read the content of the articles, they have no chance of using their BS-detecting skills to judge the integrity of the information. By adding a share button directly after the headline, Facebook is encouraging this behavior. One of the suggestions Facebook gives to prospective page owners is that they should share quality links with engaging content that readers will find useful, but the dynamics of the platform itself seem to suggest that this is not all that important after all. All it takes is a sensational headline that stokes the fear and cynicism readers may be harboring, and you can influence the public narrative and reinforce the myths that do so much harm, and which take hours and hours of good skeptical journalism to debunk.
Let me reiterate that I am not asking Facebook to police what is and isn’t allowed on its platform. However, the result of this feature is that anytime science communicators want to share what they think is good information, they must weigh the benefits against the possibility that they will be putting dangerous misinformation in front of more eyes. This is information that has the potential to cause real harm in a real person’s life. For some people this creates a legitimate moral quandary. I don’t think people should have to face such a quandary simply because they want to share something on Facebook.
While I sympathize with Facebook and acknowledge that the phenomenon of fake news and pseudoscience on social media is a problem that can’t be fixed with a few tweaks to algorithms, there is no excuse for the practice seen here. By giving a boost to misinformation and pseudoscience anytime that pseudoscience is being addressed, Facebook helps to ensure that the efforts of skeptical and science communicators are counterproductive and a waste of time.
The placement of the inappropriate articles is just FB’s garbage associative algorithms at work. They are the same as eBay and Amazon “suggested products”, programmed with idiotic simplicity to look at the highest-ranking words in a post and insert suggestions based upon them (e.g. if I post “Trump must die”, the suggestions are for “We love Trump” pages; if I buy windscreen wipers for a particular car, Amazon extrapolates that I am some sort of windscreen-wiper fetishist and need to feed my addiction by buying more of them for a variety of cars).
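To make that concrete, here is a toy sketch in Python of what matching on nothing but raw keyword overlap, with no notion of stance, looks like. This is purely illustrative, not Facebook’s actual code; every function name and headline below is invented for the example:

```python
# Toy illustration of naive keyword-overlap "related content" matching.
# NOT Facebook's real system; it just shows how stance-blind keyword
# matching pairs a debunking post with the very myths it debunks.
from collections import Counter
import re

STOPWORDS = {"the", "a", "an", "of", "to", "and", "that", "is", "in", "for"}

def top_keywords(text, n=10):
    """Return the n most frequent non-stopword tokens in the text."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS)
    return {word for word, _ in counts.most_common(n)}

def related(post, candidates, k=3):
    """Rank candidate headlines by raw keyword overlap with the post."""
    post_keys = top_keywords(post)
    scored = [(len(post_keys & top_keywords(c)), c) for c in candidates]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [c for score, c in scored[:k] if score > 0]

post = "Debunking common cancer myths: chemotherapy, diet, and your doctor"
candidates = [
    "Alkaline diet: the cancer cure doctors won't tell you about",
    "Lemon and baking soda: 10,000 times stronger than chemotherapy",
    "Doctors admit diagnosing healthy patients with cancer for profit",
    "How to choose a reliable oncologist",
]
print(related(post, candidates))
```

Because the matcher only counts shared keywords, the debunking post scores highest against the pseudoscience headlines that reuse its vocabulary, while the one sensible headline, which shares no keywords with it, is dropped.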
The algorithms won’t improve because the acceptable hit rate, as with spam email, is so low that it makes commercial sense to keep them stupid and collect the click-through ad revenue.