When “Mad” Mike Hughes launched himself into the sky in a homemade rocket last year, leaving a torn parachute behind in a blast of steam propulsion, he was doing what scientists and doctors have done for millennia, testing a hypothesis against firsthand experience. As it happened, Hughes’ hypothesis, that the Earth is flat, has been under assault since antiquity. Still, Hughes, a stuntman who once made a world-record distance jump in a stretch limo, spent years building rockets so he could get far enough away from his subject to see for himself. Whether his last flight scratched the itch is unknown; after the bungled launch, his rocket obeyed the laws of projectiles, attributed to Galileo, and nosedived into the desert.

As crazy as Hughes’ passion and demise might sound to anyone who has seen the curvature of the Earth from the window of a plane, watched a large ship sail over the horizon or who simply trusts a couple thousand years of science, it points to a question that philosophers have been furrowing their brows at for just as long: How do we know what we know? And the related question: Are we sure?

A Google search is hardly rocket science, but at a moment when the flimsiest false statement circulates freely online, and “do your own research” has become the go-to defense for the mother of all conspiracy theories, the one about a cabal of blood-drinking pedophile satanists, it seems our diligence is failing worse than ever.

Some of our inherent gullibility is addressed in epistemology, which is the philosophical study of knowledge, or how we know what we know.

Epistemologists differentiate several categories, including things we know from perception (bearing witness), reflection (thinking) and testimony (learning from a source).

A common example is the idea that Antarctica is cold, which we all accept as true, though very few of us have been there. It’s a harmless and useful assumption, as are most ideas that we get from the testimony of others. But the fact that we typically get reliable information leaves us less prepared when the information is unreliable.

Daniel Kahneman, a Nobel Prize–winning psychologist, explained the exploitable gaps in our wiring by dividing the brain into two types of functions, which he called System 1 and System 2.

System 1 is immediate and unconscious. It detects the direction of a sound source, allows you to drive on an empty road, completes the phrase “bread and ____” or the math problem 2+2. System 2 kicks in whenever you have to actively engage your mind to solve a problem, as in doing a double-digit math problem or focusing on one person’s voice in a crowded room. These two aspects of mental processing work seamlessly together, which is mostly a very good thing, Kahneman says. But they also have predictable gaps, and when something slips through, the inherent properties of the system often allow it to go unnoticed.

In his book “Thinking, Fast and Slow,” Kahneman offers this simple problem and says, “Do not try to solve it but listen to your intuition”:

A bat and a ball cost $1.10.

The bat costs one dollar more than the ball.

How much does the ball cost?


If 10 cents popped into your mind, that’s System 1 at work. If that’s your final answer, then your mind never called upon System 2 to determine whether your intuition was right.
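Calling on System 2 means actually checking the intuition against the constraints. A minimal worked check (using exact fractions to sidestep floating-point rounding):

```python
# The intuitive answer fails the constraint check: if the ball were
# $0.10, the bat would be $1.10 and the pair would total $1.20.
from fractions import Fraction  # exact dollar arithmetic

total = Fraction("1.10")
difference = Fraction("1.00")

# ball + (ball + difference) = total  =>  ball = (total - difference) / 2
ball = (total - difference) / 2
bat = ball + difference

assert bat + ball == total and bat - ball == difference
print(f"ball = ${float(ball):.2f}, bat = ${float(bat):.2f}")
# prints: ball = $0.05, bat = $1.05
```

The correct answer is 5 cents, not 10: the difference constraint is what the intuitive answer silently drops.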

Kahneman fills 400 pages with examples of cognitive biases, flawed logic, motivated reasoning and other ways in which our minds don’t quite work the way we think they should, sometimes with bizarre results. In one experiment, participants were asked to make four-word sentences from sets of five words. A group given the words Florida, forgetful, bald, gray, wrinkle took longer to walk down a hallway after the experiment than another group, as though they had internalized the attributes of old age; study subjects “primed” with images of money were less likely to help someone who spilled a cup of pencils, and so on.

Among political applications, consider that System 1 will unconsciously connect a description of a person with a stereotype, while System 2 is used to check the validity of a complex logical argument. When System 2 isn’t called to investigate, the mind, which Kahneman calls “a machine for leaping to conclusions,” knows just what to do.

Much has been written about how the speed of online communication and the casual prompts of social media news feeds seem designed to perfectly exploit the part of our brain that delivers quick reactions.

On January 13, Jordan Davis in the Federalist took “corporate media journalists and left-wing activists” to task for quoting Republican Rep. Louie Gohmert of Texas out of context, making him appear to support insurrection. Accompanying tweets attributed this quote to Gohmert: “I just don’t know why there aren’t uprisings all over the country. Maybe there will be.”

Davis noted that Gohmert, as is evident in video clips from the hearing, was quoting Democratic House Speaker Nancy Pelosi:

“While Gohmert specified that what he was saying was a quote, taken directly from House Speaker Nancy Pelosi’s previous statements encouraging uprisings, journalists and activists quickly twisted his words, publicly speculating and claiming he was calling for increased political violence around the nation.”

But it gets stranger, because Pelosi’s quote, from 2018, was related to U.S. immigration policy, and taken out of context by Gohmert: “And in order to do away with that crown jewel [the U.S. refugee resettlement system],” Pelosi said, “they’re doing away with children being with their moms. I just don’t even know why there aren’t uprisings all over the country. And maybe there will be, when people realize that this is a policy that they defend.”

The Electric Collage

Lately, it’s rare that a day passes when I don’t try to verify something I saw online — Googling a quote for attribution (Did Gandhi really say that?) or a headline to see if it’s raised enough eyebrows to warrant a fact-check. I save images and re-upload them to reverse image search engines, and have even ventured into photo forensics (using software to look for signs of manipulation). I’m not uniquely good at any of this, and it’s not a passion. It just seems like someone should grab the occasional suspicious-looking widget off the assembly line and give it a closer look.
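Reverse image search engines generally work by comparing compact perceptual fingerprints rather than exact bytes, so a lightly doctored copy still lands near its source. Below is a toy average-hash sketch of that general idea in pure Python; the 4×4 grids stand in for real downscaled grayscale images, and this is not any particular engine’s actual algorithm:

```python
# Toy perceptual hashing: fingerprint an image by which pixels are
# brighter than the image's mean, then compare fingerprints by
# Hamming distance. Small distance suggests the same underlying image.

def average_hash(pixels):
    """One bit per pixel: 1 if brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

original = [
    [200, 210,  30,  40],
    [190, 205,  35,  45],
    [ 60,  70, 220, 230],
    [ 65,  75, 215, 225],
]

# "Doctored" copy: same scene with a small patch repainted,
# like a sign edited in a photo.
doctored = [row[:] for row in original]
doctored[0][2] = 250
doctored[0][3] = 250

# A completely different (synthetic) image.
unrelated = [[(i * 7 + j * 13) % 255 for j in range(4)] for i in range(4)]

d_copy = hamming(average_hash(original), average_hash(doctored))
d_other = hamming(average_hash(original), average_hash(unrelated))
print(f"distance to doctored copy: {d_copy}, to unrelated image: {d_other}")
```

The doctored copy stays close to the original while the unrelated image lands far away, which is why a reverse image search can surface the undoctored source of a manipulated photo.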

A couple of examples come to mind. There was the black-and-white photo of a group of Klanswomen holding a sign reading “Women of the Democrat Party.” The sign had been digitally altered; it originally said “Lancaster County.”

Then there was the photo of the sunken motorboat with a Trump flag still waving from a pole on its stern, which made the rounds after five boats sank during a pro-Trump boat parade in Texas. The photo of the boat was real, but it was taken in Michigan several months earlier, and the Trump flag was added digitally.

Both of these examples were circulated widely enough to invite scrutiny by experienced fact-checkers, so it didn’t take any exceptional effort for me to get to the bottom of them. Still, comments on social media suggested the images were more widely accepted at face value than challenged. Kahneman’s System 1 was driving the boat.

Recently, I found myself researching an image that hadn’t been fact-checked by any news organization. It was a color photo of two women at a pro-Trump demonstration holding signs with gratuitous misspellings (“Stop the Steel” and “Stop the Electric Collage”). Of course, I thought, misspelled signs aren’t unheard of, but these seemed too far afield to be real, and indeed they weren’t. Using TinEye.com, a reverse image search tool, I found the original photo, copies of which were far outnumbered by the doctored version. In the original, the signs read: “Stop the Steal” and “Trump Won Big.”

So where did “Stop the Electric Collage” come from? Searching the term produced just one other image from mid-December 2020, posted without context in a Medium blog post. It showed a confident-looking woman posing for the camera holding a white sign bearing the phrase in black block letters. The photo reappeared in a tweet a few days later with the caption: “Not Photoshopped. #MAGA”

Again, as arcane as the Electoral College is, it seemed hard to believe that someone would screw up the name that badly, and that, as one commenter noted, no one in the crowd at the rally would point it out. A reverse image search turned up nothing but a few copies of the same image. Several commenters asked if it might be Photoshopped, to which the author, @Roshan_Rinaldi, replied with vague, coy comments that left the door open — “People are saying (emoji with tongue sticking out).”

Commenters mostly did one of two things. They riffed on the idea of an “electric collage,” whatever that might be, or they seconded the idea that Trump supporters are as dumb as the sign falsely suggested. A comment that included what appeared to be the original undoctored photo received no likes or comments.

When I’ve pointed out Photoshopped images, the response I typically get is along the lines of, “I know, but I thought it was funny,” which suggests the persuasive message is more important than veracity.

“Deep fake” videos, altered to show something that didn’t happen, have been tagged as the next level of subterfuge, one that will require even more sophisticated forensic tools to detect.

In an academic paper, currently a preprint, researchers Soubhik Barari, Christopher Lucas and Kevin Munger found that deep fakes did a good job of fooling people, but no better than low-tech fakes, like a photo of a person with the text of a fabricated quote superimposed on it.

Greg Miller highlighted the study in “The Enduring Allure of Conspiracy Theories” in Knowable Magazine and described how various cognitive biases lead us to accept bad information.

The media bias chart

Assuming we want reliable information that is unbiased, where should we look? Within our lifetimes, the major national and international media were once seen as trusted neutral observers. But with the proliferation of opinion-heavy cable news channels that started in the late 1980s after the revocation of the Fairness Doctrine, and later internet-only aggregators and news-like sites, that trust has mostly disappeared. Today even the news organizations that strive for unbiased reporting are often the hardest hit with allegations of bias, often simultaneously by people with opposing worldviews.

Starting in 2018, Vanessa Otero, a Denver, Colorado–area patent attorney and founder of Ad Fontes Media, set out to analyze news stories for bias and reliability. The resulting media bias chart, first published in 2019 and updated regularly, plots a sampling of 8,500 news articles on a matrix with the horizontal axis corresponding to political leaning and the vertical axis representing the trustworthiness of the reporting: the higher on the chart, the more accurate; the lower, the more likely it is heavy on opinion and analysis and, in the lowest regions, traffics in extremism.

At the top center are the wire services along with the military paper Stars and Stripes and The Weather Channel.

Most of the legacy newspapers and broadcasters — a cluster of maybe two dozen news outlets including the major networks, NPR, The New York Times, BBC, US News and World Report, Al Jazeera and many others — sit high and somewhat left of center. By contrast, the reliable zone to the right of center is sparse, with the Wall Street Journal, Financial Times, Christian Science Monitor and not much else. The country’s most watched network, Fox News, sits farther down and to the right, in the orange region, which the chartmakers warn is likely to show “selective, incomplete, unfair persuasion, propaganda or other issues.” 

Both sides have their ideological fringes. The right dips deeper into unreliability and extremism. There you’ll find two new conservative networks, One America News Network and Newsmax, that came to prominence by backing President Donald Trump.

InfoWars, the conspiracy-heavy site founded by Alex Jones, sits near the bottom right. In 2016, InfoWars created its own media bias chart, with the horizontal axis representing a continuum of “tyranny” to “freedom” and the vertical axis representing “state run/corporate/foreign influences” at the bottom and “independent” at the top. The site placed itself at the apex of freedom and independence.

The Ad Fontes chart relies on human judges, one self-declared left-leaning, one centrist and one right-leaning, for each article. The chartmakers acknowledge the element of subjectivity but argue that a fair result is still possible, comparing their process to “grading standardized written exams (like AP tests and the bar exam), or judging athletic competitions.”
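As described, each article’s placement is an aggregate of ratings from politically balanced analysts. A hypothetical sketch of that averaging step (the scales, function and numbers here are invented for illustration, not Ad Fontes’s actual methodology):

```python
# Hypothetical scoring sketch: each analyst assigns an article a
# bias score (negative = left-leaning, positive = right-leaning)
# and a reliability score; the chart plots the averages.
from statistics import mean

def score_article(ratings):
    """ratings: list of (bias, reliability) tuples, one per analyst."""
    bias = mean(b for b, _ in ratings)
    reliability = mean(r for _, r in ratings)
    return bias, reliability

# Invented example: three analysts rate the same article.
ratings = [(-6.0, 48.0),   # left-leaning analyst
           (-2.0, 44.0),   # centrist analyst
           ( 1.0, 46.0)]   # right-leaning analyst

bias, reliability = score_article(ratings)
print(f"plotted at bias={bias:.1f}, reliability={reliability:.1f}")
```

Averaging across raters with opposing leanings is the mechanism that is meant to cancel out individual analysts’ biases, much like panels of judges in athletic competitions.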

One of the interesting, undeclared features of the chart is that the news organizations naturally fall along a bell curve, the midpoint of which lies in the center of the ideological spectrum. In this way, it could be said that the truth is most resonant in the center.

Which raises the question of the relationship between partisanship and reliability. Is it possible for an article to be very liberal or very conservative and still be very accurate? And is accuracy really what we want, or are we looking for something else?

I spoke with a philosophy professor recently about the political turmoil and calls for “unity.” He said an underlying problem is that we have competing value frameworks based on cultural, religious and social demographics. As he saw it, the conflict can’t end without either limiting free speech or limiting the number of competing interests. On the one hand, he said, a country with a relatively homogeneous population with shared culture and religious beliefs will have fewer reasons to be in conflict. On the other hand, you can’t have a liberal system and root out undesirable views without exerting total control, as China does. An ethnostate or a totalitarian one. He was not happy about either of the options. “There’s a real chance we’re stepping in it right now,” he said.

Shortly after “Mad” Mike Hughes’ death last year, Space.com published an article titled “‘Mad’ Mike wasn’t trying to prove ‘flat Earth’ theory on ill-fated homemade rocket launch.”

Hughes had made the flat-Earth goals of his rocketry well known during his early flights. But the article countered that in an interview with the website, “Hughes clarified, ‘although I do believe in the flat Earth, this was never an attempt to prove that. This flat Earth has nothing to do with the steam rocket launches, it never did, it never will. I’m a daredevil!’”

At the time of his death, Hughes was being filmed for a Science Channel series called “Homemade Astronauts,” so it may be that the channel had asked him to downplay his political views.

Or maybe he’d seen enough of the Earth to know it was round, and enough of the Electric Collage to know it couldn’t be stopped.