Elon Musk Also Has a Problem with Wikipedia

By Margaret Talbot
March 4, 2025

Lately, Musk’s beef has merged with a general conviction on the right that the site is biased against conservatives.
Illustration by Ricardo Tomás

If you have spent time on Wikipedia—and especially if you’ve delved at all into the online encyclopedia’s inner workings—you will know that it is, in almost every aspect, the inverse of Trumpism. That’s not a statement about its politics. The thousands of volunteer editors who write, edit, and fact-check the site manage to adhere remarkably well, over all, to one of its core values: the neutral point of view. Like many of Wikipedia’s principles and procedures, the neutral point of view is the subject of a practical but sophisticated epistemological essay posted on Wikipedia. Among other things, the essay explains, N.P.O.V. means not stating opinions as facts, and also, just as important, not stating facts as opinions. (So, for example, the third sentence of the entry titled “Climate change” states, with no equivocation, that “the current rise in global temperatures is driven by human activities, especially fossil fuel burning since the Industrial Revolution.”)

If Wikipedia is Trumpism’s opposite—or, as a lot of people like to say, only half facetiously, the only good place left on the Internet—it’s because of its principles and procedures. You may not find everything you want to know about a given subject (“Ben Whishaw,” “Suede (band),” “Olfactory bulb,” to name three of the last pages I visited). But Wikipedia articles—especially those on big and contentious subjects—are remarkably forthcoming about why they include or omit what they do. Click on the “Talk” tab at the top of almost any high-profile article (try the one for Robert F. Kennedy, Jr., for instance) and you can read lengthy, civil, precise, citation-studded debates among editors about terms (“conspiracy theorist,” “anti-vaccine activist”), sources, and themes to emphasize in the entry. Click on the “View history” tab and a list of all the edits ever made to that page unfurls before you. If you want to add or correct something and you are able to supply references for it that meet Wikipedia’s increasingly stringent standards for reliability, you can make an edit and see if it passes muster. Since the start of the new Trump Administration, there have been abrupt erasures of the historical and scientific record on multiple federal-government sites: pages dealing with H.I.V. statistics, contraception, gender-affirming care, and flu vaccination, for instance, all taken down from the C.D.C.’s Web site, though in some cases later restored; references to transgender people, who were crucial participants in the Stonewall uprising, hastily scrubbed from the National Park Service’s Stonewall National Monument page. These absences and excisions go unexplained on the pages themselves—though we know that they represent muddled capitulations to Trump’s executive orders—in a way that Wikipedia’s edits never do.

The site, twenty-four years into its existence, still largely works as a collaborative democracy of unpaid contributors striving for consensus and veracity. Tamzin Hadasa Kelly, a twenty-eight-year-old Wikipedia administrator who has been contributing to the site since 2012, told me, “The vast, vast majority of content on the site is produced completely volunteer. We don’t have ranks, we don’t have editorial structure, we don’t have assigned topics.” Wikipedia does have an arbitration committee, a sort of supreme court that adjudicates rule-violating conduct on the site, but to a great extent, Kelly says, “it’s just us.” If she has to block people from editing because their work is consistently subpar—maybe they don’t plagiarize but they tend to paraphrase too closely or chronically fail to cite sources—at least, she says, “I’m not worrying that I’m taking away their livelihood.” (It can be painful in a different way: “When someone is trying their best, it really sucks to tell them, I know you’re trying your best and not getting paid for this, but that’s still not good enough. I’m going to need you to stop.”)

Since the content is not monetized, and the site accepts no advertising, the articles rarely devolve into mere clickbait. What the Internet scholar Yochai Benkler calls Wikipedia’s “nonmarket utility” has helped insure its integrity. At a time when other social-media sites have abandoned whatever safeguards they had in place against mis- and disinformation—Meta has eliminated fact checking, X has been flooded with free-floating dreck of murky provenance and purpose, ChatGPT obligingly spits out hallucination-filled answers like a student who hasn’t done the reading—Wikipedia is a bastion of transparency, punctiliousness, and accessible knowledge.

So maybe it should come as no surprise that Elon Musk has lately taken time from his busy schedule of dismantling the federal government, along with many of its sources of reliable information, to attack Wikipedia. On January 21st, after the site updated its page on Musk to include a reference to the much-debated stiff-armed salute he made at a Trump inaugural event, he posted on X that “since legacy media propaganda is considered a ‘valid’ source by Wikipedia, it naturally simply becomes an extension of legacy media propaganda!” He urged people not to donate to the site: “Defund Wikipedia until balance is restored!” It’s worth taking a look at how the incident is described on Musk’s page, quite far down, and judging for yourself. What I see is a paragraph that first describes the physical gesture (“Musk thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together”), goes on to say that “some” viewed it as a Nazi or a Roman salute, then quotes Musk disparaging those claims as “politicized,” while noting that he did not explicitly deny them. (There is also now a separate Wikipedia article, “Elon Musk salute controversy,” that goes into detail about the full range of reactions.)

This is not the first time Musk has gone after the site. In December, he posted on X, “Stop donating to Wokepedia.” And that wasn’t even his first bad Wikipedia pun. “I will give them a billion dollars if they change their name to Dickipedia,” he wrote, in an October, 2023, post. It seemed to be an ego thing at first. Musk objected to being described on his page as an “early investor” in Tesla, rather than as a founder, which is how he prefers to be identified, and seemed frustrated that he couldn’t just buy the site. But lately Musk’s beef has merged with a general conviction on the right that Wikipedia—which, like all encyclopedias, is a tertiary source that relies on original reporting and research done by other media and scholars—is biased against conservatives.

The Heritage Foundation, the think tank behind the Project 2025 policy blueprint, has plans to unmask Wikipedia editors who maintain their privacy using pseudonyms (these usernames are displayed in the article history but don’t necessarily make it easy to identify the people behind them) and whose contributions on Israel it deems antisemitic. (That story was reported by the Forward in January, and was based on leaked Heritage documents. Mike Howell, of Heritage, told me that this “investigation” of Wikipedia, which, he said, “is where information is laundered,” will be “shared with the appropriate policymakers to help inform a strategic response.”) The New York Post ran an editorial in February with the headline “Big Tech must block Wikipedia until it stops censoring and pushing disinformation.” The editorial cited a “bombshell new report” from something called the Media Research Center, which identifies itself as a conservative-movement leader in “combating the left’s efforts to manipulate the electoral process, silence opposing voices online, and undermine American values.”

The “bombshell” was that Wikipedia maintains what the Post described as “a blacklist” of sources. But the list it apparently refers to is neither a blacklist nor a secret. You can find it, along with many discussions between editors, under the entry “Wikipedia: Reliable sources/Perennial sources.” The list classifies a number of sources by degrees of reliability, and those that occupy the lowest category include conservative outlets such as Newsmax and Breitbart News, on the basis that they have promoted unsubstantiated conspiracy theories. It cautions against using things said on Fox News talk shows as “statements of fact,” but notes that they can sometimes be used for “attributed opinions.” It urges skepticism about some other sources because they allow clients to pay for coverage, run press releases or other user-generated content with little oversight, or operate essentially as partisan forums for activists of various political stripes. Those make up a motley assortment, including Amazon user reviews, the publication BroadwayWorld, and the progressive site Daily Kos. The compendium is not, the entry stresses, meant to function as a list of “pre-approved sources that can always be used without regard for the ordinary rules of editing,” nor as a “list of banned sources that can never be used or should be removed on sight.”

Amy Bruckman, a professor of interactive computing at Georgia Institute of Technology, published a book in 2022 called “Should You Believe Wikipedia?: Online Communities and the Construction of Knowledge.” Bruckman told me that the answer to the title’s question can depend on the article itself. An entry on an obscure subject that few editors have had a chance to scrutinize, or that is missing multiple citations, may be of little value. Many Wikipedia articles come with a header warning that they don’t yet meet the site’s criteria for sourcing. (If editors spot the missing citations and fix them, the header will be removed.) “One thing you can say with confidence,” Bruckman noted. “A highly popular Wikipedia page is generally something you can rely on. You should of course check citations—they should not take you to somebody’s blog post. They should be published by a reputable journalistic outlet or a scientific paper, or something else you can trust.” But when it comes to the major articles, “there are literally hundreds of people checking them every day, and there is surprisingly little political bias. It’s amazing to me how many controversial topics result in balanced articles.”

Like the Encyclopedia Britannica before it, but with much wider coverage—Wikipedia exists in more than three hundred languages, and there are currently almost seven million articles in the English version—it’s almost always a place to start, not finish, serious research on a topic. And it certainly has blind spots that have led to the overrepresentation or underrepresentation of some topics. Bruckman finds Wikipedia “studiously nonpartisan” except in its “bias towards covering things people find fun. It has way too much content on science-fiction and fantasy novels compared to specific topics in science. They could maybe use more fish scientists and fewer fans of Terry Pratchett.” To its editors’ annoyance, they sometimes have to contend with an article that some entity—a P.R. or reputation-management firm, for instance—has been paid by a client to produce. (Editors can and do challenge, correct, or delete these, but there can be a lag with less prominent subjects.) Some articles on math and science, though they may be technically correct, can be almost impenetrable for the general reader. (Look up the statistical term “confidence interval,” which I had occasion to do recently, and see if you are as flummoxed as I was.) There has been a historic gender and racial imbalance among frequent contributors to the English-language Wikipedia—what data there is suggests that the majority are white and male. Ryan McGrady, a researcher at the University of Massachusetts, Amherst, who both writes about and contributes to Wikipedia, told me that he sees plenty of articles on “sports, American politics, video games—they’re all well covered in English in particular. But there’s not as much high-quality information about places in Africa apart from big cities, for example, or the culture of places that are not largely English-speaking.” Those biases, presumably, would not be particularly vexing for the anti-D.E.I. crusader Musk.

A key difference between Wikipedia and many of its Trumpian critics is that Wikipedia admits when it is wrong. It is, by definition, a work in progress, and it makes its progress legible. Performative invincibility is not its brand. Some of the best sources on Wikipedia’s flaws can be found on Wikipedia itself. (See, for example, the entry “Wikipedia: Systemic bias.”) In a recent interview with New York magazine, Jimmy Wales, the site’s co-founder, said that claims of anti-right bias were “something I look at and focus on and think about, but whenever I try to find problematic examples, it’s pretty hard.”

What may be the worst violation of Wikipedia’s principles did not involve the English-language version. Between 2011 and 2020, a small group of far-rightists essentially took over Croatian Wikipedia. Articles valorized or whitewashed the Fascist puppet regime that ruled Croatia during the Second World War; whereas the English-language and other Wikipedia sites continued to correctly identify a place called Jasenovac as a concentration camp, the Croatian version adopted a more anodyne phrase: “collection and labor camp.” Benjamin Mako Hill, a professor at the University of Washington who co-authored a research paper on the Croatian Wikipedia debacle, told me that contributors without a political agenda were eventually able to wrest control back, banning a small group of Fascist apologists from editing articles. “It’s hard to imagine” such a takeover “happening in English-language Wikipedia,” he said, “just because it’s so big.” And Hill said that this was the only example he knew of in which an entire Wikipedia site was targeted and captured.

Still, it’s not hard to imagine the risk of such a scenario. A new study, by university researchers in Amsterdam, which looked at tens of millions of tweets by politicians in twenty-six countries over the course of six years, found that, in comparison with other political movements, “radical-right populism”—of the sort that Musk, with his admiration for Germany’s ultranationalist AfD party, seems drawn to—“is the strongest determinant for the propensity to spread misinformation.” Distrust of “institutions such as mainstream media” and “hostility towards democratic institutions” were the prime motivators.

But Wikipedia has proved remarkably resilient. Wales has stressed that the site is not for sale. And for two decades, a long time in tech years, it has stayed true to its crowdsourced, democratic ethos and to its commitment to facts. In 2025 America, that counts as a beacon of hope. ♦
