Fact-checking is the act of verifying factual assertions in non-fictional text in order to determine their veracity and correctness. This may be done either before (ante hoc) or after (post hoc) the text has been published or otherwise disseminated.[1] Fact-checking may be done privately, as when a magazine editor verifies the contents of a news article before or after publication; this is called internal fact-checking.[2] Alternatively, the fact-checking analysis may itself be published, in which case it is called external fact-checking.[2]

Ante hoc fact-checking (fact-checking before dissemination) aims to remove errors and allow text to proceed to dissemination (or to rejection if it fails confirmations or other criteria). Post hoc fact-checking is most often followed by a written report of inaccuracies, sometimes with a visual metric from the checking organization (e.g., Pinocchios from The Washington Post Fact Checker, or TRUTH-O-METER ratings from PolitiFact). Several organizations are devoted to post hoc fact-checking, such as FactCheck.org and PolitiFact.

Research on the impact of fact-checking is relatively recent, but existing studies suggest that fact-checking does correct misperceptions among citizens and discourages politicians from spreading misinformation.

Post hoc fact-checking

External post hoc fact-checking by independent organizations began in the United States in the early 2000s.[2]

Consistency across fact-checkers

One study finds that the fact-checkers PolitiFact, FactCheck.org, and The Washington Post's Fact Checker overwhelmingly agree in their evaluations of claims.[3][4] However, a study by Morgan Marietta, David C. Barker and Todd Bowser found "substantial differences in the questions asked and the answers offered," concluding that this limited the "usefulness of fact-checking for citizens trying to decide which version of disputed realities to believe."[5] A paper by Chloe Lim, a Ph.D. student at Stanford University, found little overlap in the statements that fact-checkers check. Out of 1,065 fact-checks by PolitiFact and 240 fact-checks by The Washington Post's Fact Checker, only 70 statements were checked by both. The fact-checkers gave consistent ratings for 56 of those 70 statements, meaning that roughly one out of every five times, the two fact-checkers disagreed on a statement's accuracy.[6]


Studies of post hoc fact-checking have made clear that such efforts often change the behavior of both the speaker (making them more careful in their pronouncements) and the listener or reader (making them more discerning about the factual accuracy of content). Observed effects include audiences that are completely unswayed by corrections on the most divisive subjects, a tendency to be more persuaded by corrections of negative reporting (e.g., "attack ads"), and minds changed only when the person in error was someone reasonably like-minded to begin with.[7]

Correcting misperceptions

A 2015 study found evidence of a "backfire effect" (correcting false information may make partisan individuals cling more strongly to their views): "Corrective information adapted from the Centers for Disease Control and Prevention (CDC) website significantly reduced belief in the myth that the flu vaccine can give you the flu as well as concerns about its safety. However, the correction also significantly reduced intent to vaccinate among respondents with high levels of concern about vaccine side effects--a response that was not observed among those with low levels of concern."[8] A 2017 study attempted to replicate the findings of the 2015 study but failed to do so.[9]

A 2016 study found little evidence for the "backfire effect": "By and large, citizens heed factual information, even when such information challenges their partisan and ideological commitments."[10] A study of Donald Trump supporters during the 2016 race similarly found little evidence for the backfire effect: "When respondents read a news article about Mr. Trump's speech that included F.B.I. statistics indicating that crime had "fallen dramatically and consistently over time," their misperceptions about crime declined compared with those who saw a version of the article that omitted corrective information (though misperceptions persisted among a sizable minority)."[11][12] A 2018 study found no evidence of a backfire effect.[13]

Studies have shown that fact-checking can affect citizens' belief in the accuracy of claims made in political advertisement.[14] A paper by a group of Paris School of Economics and Sciences Po economists found that falsehoods by Marine Le Pen during the 2017 French presidential election campaign (i) successfully persuaded voters, (ii) lost their persuasiveness when fact-checked, and (iii) did not reduce voters' political support for Le Pen when her claims were fact-checked.[15] A 2017 study in the Journal of Politics found that "individuals consistently update political beliefs in the appropriate direction, even on facts that have clear implications for political party reputations, though they do so cautiously and with some bias... Interestingly, those who identify with one of the political parties are no more biased or cautious than pure independents in their learning, conditional on initial beliefs."[16]

A study by Yale University cognitive scientists Gordon Pennycook and David G. Rand found that Facebook tags on fake articles "did significantly reduce their perceived accuracy relative to a control without tags, but only modestly".[17] A Dartmouth study led by Brendan Nyhan found that Facebook tags had a greater impact than the Yale study found.[18][19] A "disputed" tag on a false headline reduced the number of respondents who considered the headline accurate from 29% to 19%, whereas a "rated false" tag pushed the number down to 16%.[18] The Yale study found evidence of a backfire effect among Trump supporters younger than 26, whereby the presence of both untagged and tagged fake articles made the untagged fake articles appear more accurate.[17] In response to research questioning the effectiveness of the "disputed" tags, Facebook dropped them in December 2017 and instead began placing articles that fact-check a fake news story next to that story's link whenever it is shared on Facebook.[20]

Based on the findings of a 2017 study in the journal Psychological Science, the most effective ways to reduce misinformation through corrections are:[21]

  • limiting detailed descriptions of, or arguments in favor of, the misinformation;
  • walking through the reasons why a piece of misinformation is false rather than just labelling it false;
  • presenting new and credible information which allows readers to update their knowledge of events and understand why they developed an inaccurate understanding in the first place;
  • using video, as videos appear to be more effective than text at increasing attention and reducing confusion, making them better suited to correcting misperceptions.

A forthcoming study in the Journal of Experimental Political Science found "strong evidence that citizens are willing to accept corrections to fake news, regardless of their ideology and the content of the fake stories."[22]

A paper by Andrew Guess (Princeton University), Brendan Nyhan (Dartmouth College) and Jason Reifler (University of Exeter) found that consumers of fake news, Trump supporters in particular, tended to have less favorable views of fact-checking.[23] The paper also found that fake news consumers rarely encountered fact-checks: "only about half of the Americans who visited a fake news website during the study period also saw any fact-check from one of the dedicated fact-checking website (14.0%)."[23]

A 2018 study found that Republicans were more likely to correct their false information on voter fraud if the correction came from Breitbart News rather than a non-partisan neutral source such as PolitiFact.[24]

Political discourse

A 2015 experimental study found that fact-checking can discourage politicians from spreading misinformation, suggesting it may improve political discourse by raising the reputational and electoral costs of making false statements. The researchers sent state legislators "a series of letters about the risks to their reputation and electoral security if they were caught making questionable statements. The legislators who were sent these letters were substantially less likely to receive a negative fact-checking rating or to have their accuracy questioned publicly, suggesting that fact-checking can reduce inaccuracy when it poses a salient threat."[25]

Political preferences

One experimental study found that fact-checking during debates affected viewers' assessments of the candidates' debate performance and produced "greater willingness to vote for a candidate when the fact-check indicates that the candidate is being honest."[26]

A study of Trump supporters during the 2016 presidential campaign found that while fact-checks of false claims made by Trump reduced his supporters' belief in the false claims in question, the corrections did not alter their attitudes towards Trump.[27]

Controversies and criticism

Political fact-checking is sometimes criticized as being opinion journalism.[28][29] In September 2016, a Rasmussen Reports national telephone and online survey found that "just 29% of all Likely U.S. Voters trust media fact-checking of candidates' comments. Sixty-two percent (62%) believe instead that news organizations skew the facts to help candidates they support."[30][31]

Informal fact-checking

Individual readers perform some types of fact-checking, such as comparing claims in one news story against claims in another.

Rabbi Moshe Benovitz has observed that "modern students use their wireless worlds to augment skepticism and to reject dogma." He says this has positive implications for values development:

"Fact-checking can become a learned skill, and technology can be harnessed in a way that makes it second nature… By finding opportunities to integrate technology into learning, students will automatically sense the beautiful blending of… their cyber… [and non-virtual worlds]. Instead of two spheres coexisting uneasily and warily orbiting one another, there is a valuable experience of synthesis…".[32]

Detecting fake news

Fake news has become increasingly prevalent in recent years, with over 100 false articles and rumors spread concerning the 2016 United States presidential election alone.[33] These fake news articles tend to come from satirical news websites or from individual websites with an incentive to propagate false information, either as clickbait or to serve a purpose.[33] Because such articles intentionally promote incorrect information, they are quite difficult to detect.[34] When evaluating a source of information, one must look at many attributes, including but not limited to the content itself and the social media engagement it generates.[34] The language of fake news, specifically, is typically more inflammatory than that of real articles, in part because the purpose is to confuse and generate clicks.[34] Modeling techniques such as n-gram encodings and bag of words have served as linguistic techniques to determine the legitimacy of a news source.[34] Researchers have further determined that visual-based cues also play a role in categorizing an article; in particular, features can be designed to assess whether a picture is legitimate, providing more clarity about the news item.[34] There are also many social-context features that can play a role, as well as the pattern by which the news spreads. Websites such as Snopes try to detect this information manually, while certain universities are trying to build mathematical models to do this themselves.[33]
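The n-gram and bag-of-words encodings mentioned above can be sketched in a few lines of Python. This is an illustrative toy, not any particular research system; the example headline is invented:

```python
from collections import Counter

def ngrams(tokens, n):
    """All contiguous word n-grams in a token sequence."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bag_of_words_features(text, n=2):
    """Unigram counts plus n-gram counts: a simple bag-of-words encoding."""
    tokens = text.lower().split()
    features = Counter(tokens)          # unigram counts
    features.update(ngrams(tokens, n))  # add bigram counts
    return features

headline = "Shocking secret the media will not tell you"
feats = bag_of_words_features(headline)
# feats maps words and word pairs to their counts; e.g. the
# bigram ("will", "not") appears once in this headline.
```

A classifier would then compare such feature counts against counts learned from articles of known veracity.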

Organizations and individuals

Some individuals and organizations publish their fact-checking efforts on the internet. These may have a special subject-matter focus, such as Snopes.com's focus on urban legends or the Reporters' Lab at Duke University's focus on providing resources to journalists.

Ongoing research in fact-checking and detecting fake news


Since the 2016 United States presidential election, fake news has been a popular topic of discussion by President Trump and news outlets. Fake news had become omnipresent, and a great deal of research has gone into understanding, identifying, and combating it. Several researchers began by examining the use of fake news to influence the 2016 presidential campaign. One study found evidence that pro-Trump fake news was selectively targeted at conservatives and pro-Trump supporters in 2016.[35] The researchers found social media sites, Facebook in particular, to be powerful platforms for spreading fake news to targeted groups in order to appeal to their sentiments during the 2016 presidential race. Additionally, researchers from Stanford, NYU, and NBER found evidence that engagement with fake news on Facebook and Twitter remained high throughout 2016.[36] Recently, much work has gone into detecting and identifying fake news through machine learning and artificial intelligence. In 2018, researchers at MIT's CSAIL (Computer Science and Artificial Intelligence Laboratory) created and tested a machine learning algorithm to identify false information by looking for common patterns, words, and symbols that typically appear in fake news.[37] They also released an open-source data set with a large catalog of historical news sources and their veracity scores, to encourage other researchers to explore and develop new methods and technologies for detecting fake news.

Despite the ongoing research at top universities and institutions, there is much debate about the effectiveness of such technology in identifying fake news. There is still not enough good training data for machine learning and AI scientists to build highly accurate predictive models for detecting fake news. Nonetheless, much research is ongoing to better understand fake news and its characteristics.
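One simple way such text classification can work is a naive Bayes model over word counts. The sketch below is purely illustrative; the toy training data is invented, and this is not the MIT CSAIL system. Its tiny training set also illustrates the data-scarcity problem described above:

```python
import math
from collections import Counter, defaultdict

def train_naive_bayes(labeled_docs):
    """Count words per class from (text, label) pairs."""
    word_counts = defaultdict(Counter)  # label -> Counter of words
    doc_counts = Counter()              # label -> number of documents
    for text, label in labeled_docs:
        word_counts[label].update(text.lower().split())
        doc_counts[label] += 1
    return word_counts, doc_counts

def classify(text, word_counts, doc_counts):
    """Pick the class with the highest Laplace-smoothed log-probability."""
    vocab = {w for c in word_counts.values() for w in c}
    total_docs = sum(doc_counts.values())
    best_label, best_score = None, float("-inf")
    for cls in word_counts:
        score = math.log(doc_counts[cls] / total_docs)  # log prior
        total_words = sum(word_counts[cls].values())
        for word in text.lower().split():
            # add-one smoothing keeps unseen words from zeroing the score
            score += math.log((word_counts[cls][word] + 1)
                              / (total_words + len(vocab)))
        if score > best_score:
            best_label, best_score = cls, score
    return best_label

# Tiny invented training set -- real systems need far more labeled data.
training = [
    ("shocking secret they will not tell you", "fake"),
    ("you will not believe this miracle cure", "fake"),
    ("senate passes budget bill after debate", "real"),
    ("court rules on state election law", "real"),
]
word_counts, doc_counts = train_naive_bayes(training)
label = classify("shocking miracle they will not tell", word_counts, doc_counts)
```

With so few examples the model merely memorizes vocabulary, which is why large, well-labeled corpora are the bottleneck the text describes.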

Ante hoc fact-checking

Among the benefits of printing only checked copy is that it averts serious, sometimes costly, problems. These problems can include lawsuits for mistakes that damage people or businesses, but even small mistakes can cause a loss of reputation for the publication. The loss of reputation is often the more significant motivating factor for journalists.[38]

Fact checkers verify that the names, dates, and facts in an article or book are correct.[38] For example, they may contact a person who is quoted in a proposed news article and ask the person whether this quotation is correct, or how to spell the person's name. Fact-checkers are primarily useful in catching accidental mistakes; they are not guaranteed safeguards against those who wish to commit journalistic frauds.

As a career

Professional fact checkers have generally been hired by newspapers, magazines, and book publishers, probably starting in the early 1920s with the creation of Time magazine in the US.[38][2] Fact checkers may be aspiring writers, future editors, or freelancers engaged in other projects; others are career professionals.[38]

Historically, the field was considered women's work, and from the time of the first professional American fact checker through at least the 1970s, the fact checkers at a media company might be entirely female or primarily so.[38]

The number of people employed in fact-checking varies by publication. Some organizations have substantial fact-checking departments. For example, The New Yorker magazine had 16 fact checkers in 2003.[38] Others may hire freelancers per piece, or may combine fact-checking with other duties. Magazines are more likely to use fact checkers than newspapers.[2] Television and radio programs rarely employ dedicated fact checkers, and instead expect others, including senior staff, to engage in fact-checking in addition to their other duties.[38]

Checking original reportage

Stephen Glass began his journalism career as a fact-checker. He went on to invent fictitious stories, which he submitted as reportage, and which fact-checkers at The New Republic (and other weeklies for which he worked) never flagged. Michael Kelly, who edited some of Glass's concocted stories, blamed himself, rather than the fact-checkers, saying: "Any fact-checking system is built on trust ... If a reporter is willing to fake notes, it defeats the system. Anyway, the real vetting system is not fact-checking but the editor."[39]

Education on fact-checking

With the circulation of fake news on the internet, many organizations have dedicated time to creating guidelines to help readers verify the information they are consuming. Many universities across America provide students with resources and tools to help them verify their sources, including research guides that support thorough research using reputable academic sources. Organizations like FactCheck.org, OntheMedia.org, and PolitiFact.com provide procedural guidelines that help individuals navigate the process of fact-checking a source.

Books on professional fact-checking

  • Sarah Harrison Smith worked in, and for a time headed, the fact-checking department of The New York Times. She is the author of the book The Fact Checker's Bible.
  • Jim Fingal worked for several years as a fact-checker at The Believer and McSweeney's and is co-author, with John D'Agata, of The Lifespan of a Fact, an inside look at the struggle between fact-checker (Fingal) and author (D'Agata) over an essay that pushed the limits of acceptable "artistic license" for a work of non-fiction.

Alumni of the role

The following individuals have been reliably reported to have worked as fact-checkers at some point in their careers, often as a stepping stone to other journalistic endeavors or to an independent writing career:

See also


  1. ^ Fellmeth, Aaron X.; Horwitz, Maurice (2009). "Ante hoc". Ante hoc – Oxford Reference. Oxford University Press. doi:10.1093/acref/9780195369380.001.0001. ISBN 9780195369380. Archived from the original on 21 February 2015.
  2. ^ a b c d e Graves, Lucas; Amazeen, Michelle A. (25 February 2019), "Fact-Checking as Idea and Practice in Journalism", Oxford Research Encyclopedia of Communication, Oxford University Press, doi:10.1093/acrefore/9780190228613.013.808, ISBN 9780190228613, retrieved 14 March 2019
  3. ^ Amazeen, Michelle A. (1 October 2016). "Checking the Fact-Checkers in 2008: Predicting Political Ad Scrutiny and Assessing Consistency". Journal of Political Marketing. 15 (4): 433–464. doi:10.1080/15377857.2014.959691. hdl:2144/27297. ISSN 1537-7857.
  4. ^ Amazeen, Michelle A. (2 January 2015). "Revisiting the Epistemology of Fact-Checking". Critical Review. 27 (1): 1–22. doi:10.1080/08913811.2014.993890. hdl:2144/27304. ISSN 0891-3811.
  5. ^ Marietta, Morgan; Barker, David C.; Bowser, Todd (2015). "Fact-Checking Polarized Politics: Does The Fact-Check Industry Provide Consistent Guidance on Disputed Realities?" (PDF). The Forum. 13 (4): 577. doi:10.1515/for-2015-0040. Retrieved 27 September 2016.
  6. ^ "Checking how fact-checkers check".
  7. ^ Amazeen, Michelle (2015) "Monkey Cage: Sometimes political fact-checking works. Sometimes it doesn't. Here's what can make the difference.," The Washington Post (online), 3 June 2015, see [1], accessed 27 July 2015.
  8. ^ Nyhan, Brendan; Reifler, Jason (9 January 2015). "Does correcting myths about the flu vaccine work? An experimental evaluation of the effects of corrective information". Vaccine. 33 (3): 459–464. doi:10.1016/j.vaccine.2014.11.017. hdl:10871/21566. ISSN 1873-2518. PMID 25499651.
  9. ^ Haglin, Kathryn (1 July 2017). "The limitations of the backfire effect". Research & Politics. 4 (3): 2053168017716547. doi:10.1177/2053168017716547. ISSN 2053-1680.
  10. ^ Wood, Thomas; Porter, Ethan (5 August 2016). "The Elusive Backfire Effect: Mass Attitudes' Steadfast Factual Adherence". SSRN 2819073.
  11. ^ Nyhan, Brendan (5 November 2016). "Fact-Checking Can Change Views? We Rate That as Mostly True". The New York Times. ISSN 0362-4331. Retrieved 5 November 2016.
  12. ^ Nyhan, Brendan; Porter, Ethan; Reifler, Jason; Wood, Thomas J. (21 January 2019). "Taking Fact-Checks Literally But Not Seriously? The Effects of Journalistic Fact-Checking on Factual Beliefs and Candidate Favorability". Political Behavior. doi:10.1007/s11109-019-09528-x. ISSN 1573-6687.
  13. ^ Guess, Andrew; Coppock, Alexander (2018). "Does Counter-Attitudinal Information Cause Backlash? Results from Three Large Survey Experiments". British Journal of Political Science: 1–19. doi:10.1017/S0007123418000327. ISSN 0007-1234.
  14. ^ Fridkin, Kim; Kenney, Patrick J.; Wintersieck, Amanda (2 January 2015). "Liar, Liar, Pants on Fire: How Fact-Checking Influences Citizens' Reactions to Negative Advertising". Political Communication. 32 (1): 127–151. doi:10.1080/10584609.2014.914613. ISSN 1058-4609.
  15. ^ Rodriguez, Barrera; David, Oscar; Guriev, Sergei M.; Henry, Emeric; Zhuravskaya, Ekaterina (18 July 2017). "Facts, Alternative Facts, and Fact Checking in Times of Post-Truth Politics". SSRN 3004631.
  16. ^ Hill, Seth J. (16 August 2017). "Learning Together Slowly: Bayesian Learning about Political Facts". The Journal of Politics. 79 (4): 1403–1418. doi:10.1086/692739. ISSN 0022-3816.
  17. ^ a b Pennycook, Gordon; Rand, David G. (12 September 2017). "Assessing the Effect of "Disputed" Warnings and Source Salience on Perceptions of Fake News Accuracy". SSRN 3035384.
  18. ^ a b Nyhan, Brendan (23 October 2017). "Why the Fact-Checking at Facebook Needs to Be Checked". The New York Times. ISSN 0362-4331. Retrieved 23 October 2017.
  19. ^ Clayton, Katherine; Blair, Spencer; Busam, Jonathan A.; Forstner, Samuel; Glance, John; Green, Guy; Kawata, Anna; Kovvuri, Akhila; Martin, Jonathan (11 February 2019). "Real Solutions for Fake News? Measuring the Effectiveness of General Warnings and Fact-Check Tags in Reducing Belief in False Stories on Social Media". Political Behavior. doi:10.1007/s11109-019-09533-0. ISSN 1573-6687.
  20. ^ "Facebook stops putting "Disputed Flags" on fake news because it doesn't work". Axios. 27 December 2017. Retrieved 28 December 2017.
  21. ^ Chokshi, Niraj (18 September 2017). "How to Fight 'Fake News' (Warning: It Isn't Easy)". The New York Times. ISSN 0362-4331. Retrieved 19 September 2017.
  22. ^ Porter, Ethan; Wood, Thomas; Kirby, David (17 November 2017). "Sex Trafficking, Russian Infiltration, Birth Certificates, and Pedophilia: A Survey Experiment Correcting Fake News". SSRN 3073294.
  23. ^ a b "Selective Exposure to Misinformation: Evidence from the consumption of fake news during the 2016 U.S. presidential campaign" (PDF).
  24. ^ Holman, Mirya R.; Lay, J. Celeste (2018). "They See Dead People (Voting): Correcting Misperceptions about Voter Fraud in the 2016 U.S. Presidential Election". Journal of Political Marketing: 1–38. doi:10.1080/15377857.2018.1478656.
  25. ^ Nyhan, Brendan; Reifler, Jason (1 July 2015). "The Effect of Fact-Checking on Elites: A Field Experiment on U.S. State Legislators". American Journal of Political Science. 59 (3): 628–40. doi:10.1111/ajps.12162. hdl:10871/21568. ISSN 1540-5907.
  26. ^ Wintersieck, Amanda L. (5 January 2017). "Debating the Truth". American Politics Research. 45 (2): 304–331. doi:10.1177/1532673x16686555.
  27. ^ Nyhan, Brendan; Porter, Ethan; Reifler, Jason; Wood, Thomas J. (n.d.). "Taking Fact-checks Literally But Not Seriously? The Effects of Journalistic Fact-checking on Factual Beliefs and Candidate Favorability" (PDF).
  28. ^ Riddell, Kelly (26 September 2016). "Eight examples where 'fact-checking' became opinion journalism". Washington Times. Retrieved 27 September 2016.
  29. ^ Graves, Lucas (2016). Deciding What's True: The Rise of Political Fact-Checking in American Journalism. Columbia University Press. p. 27. ISBN 9780231542227. Retrieved 27 September 2016.
  30. ^ Reports, Rasmussen. "Voters Don't Trust Media Fact-Checking - Rasmussen Reports™". Retrieved 17 October 2016.
  31. ^ Lejeune, Tristan (30 September 2016). "Poll: Voters don't trust media fact-checkers". Retrieved 17 October 2016.
  32. ^ Moshe Benovitz et al., 2012, "Education: The Social Media Revolution: What Does It Mean for Our Children?" Jewish Action (online), August 24, 2012, New York, NY, USA:Orthodox Union, see [2], accessed 28 July 2015.
  33. ^ a b c Allcott, Hunt (2017). "Social Media and Fake News in the 2016 Election". The Journal of Economic Perspectives. 31: 211–235 – via JSTOR.
  34. ^ a b c d e Liu, Huan; Tang, Jiliang; Wang, Suhang; Sliva, Amy; Shu, Kai (7 August 2017). "Fake News Detection on Social Media: A Data Mining Perspective".
  35. ^ Guess, Andrew (9 January 2018). "Selective Exposure to Misinformation: Evidence from the consumption of fake news during the 2016 U.S. presidential campaign" (PDF). Dartmouth. Retrieved 5 March 2019.
  36. ^ Allcott, Hunt (October 2018). "Trends in the Diffusion of Misinformation on Social Media" (PDF). Stanford. Retrieved 5 March 2019.
  37. ^ Hao, Karen. "AI is still terrible at spotting fake news". MIT Technology Review. Retrieved 6 March 2019.
  38. ^ a b c d e f g Harrison Smith, Sarah (2004). The Fact Checker's Bible: A Guide to Getting it Right. New York: Anchor Books. pp. 8–12. ISBN 0385721064. OCLC 53919260.
  39. ^ John Watson (2 April 2017). "What is Fact Checking? – FactCheck Sri Lanka". Factchecksrilanka.com. Archived from the original on 7 November 2017. Retrieved 7 December 2017.
  40. ^ "An Interview With Susan Choi". Archived from the original on 18 February 2001. Retrieved 18 November 2006.CS1 maint: BOT: original-url status unknown (link)
  41. ^ "CNN.com – Transcripts". Transcripts.cnn.com. 1 June 2006. Retrieved 18 October 2011.
  42. ^ "Contributors". Archived from the original on 19 March 2006. Retrieved 17 November 2006.CS1 maint: BOT: original-url status unknown (link)
  43. ^ "William Gaddis (American author)". Britannica.com. Retrieved 18 October 2011.
  44. ^ Skurnick, Lizzie. "Content". Mediabistro.com. Retrieved 18 October 2011.
  45. ^ "Hodge, Roger D." Archived from the original on 8 March 2007. Retrieved 18 November 2006.CS1 maint: BOT: original-url status unknown (link)
  46. ^ Kirkpatrick, David D. "David Kirkpatrick". The New York Times.
  47. ^ "Swarthmore College Bulletin". Swarthmore.edu. July 2011. Archived from the original on 27 October 2008. Retrieved 18 October 2011.
  48. ^ "Sean Wilsey – About Sean Wilsey – Penguin Group". Us.penguingroup.com. Archived from the original on 27 September 2011. Retrieved 18 October 2011.

Further reading

External links

    Africa Check

    Africa Check is a non-profit fact checking organisation set up in 2012 to promote accuracy in public debate and the media in Africa. The organisation's goal is to raise the quality of information available to society across the continent. Africa Check is an independent organisation with offices in Johannesburg, Nairobi, Lagos, Dakar and London, producing reports in English and French testing claims made by public figures, institutions and the media against the best available evidence.


    AltNews.in is an Indian fact checking website, run by former software engineer Pratik Sinha. The website was launched on 9 February 2017 to combat the phenomenon of fake news.


    FactCheck.org is a nonprofit website that describes itself as a "consumer advocate for voters that aims to reduce the level of deception and confusion in U.S. politics". It is a project of the Annenberg Public Policy Center of the Annenberg School for Communication at the University of Pennsylvania, and is funded primarily by the Annenberg Foundation. FactCheck.org has won four Webby Awards in the Politics category, in 2008, 2010, 2011 and 2012. Most of its content consists of rebuttals to what it considers inaccurate, misleading, or false claims made by politicians. FactCheck.org has also targeted misleading claims from various partisan groups. Other features include:

    Ask FactCheck: users can ask questions that are usually based on an online rumor.

    Viral Spiral: a page dedicated to the most popular online myths that the site has debunked. It clarifies the answer as well as links readers to a full article on the subject.

    Party Lines: talking points that have been repeatedly used by multiple members of a political party.

    Mailbag: page for readers' sent letters and praise or disapproval of something said on the site.

    Fake news website

    Fake news websites (also referred to as hoax news websites) are Internet websites that deliberately publish fake news—hoaxes, propaganda, and disinformation purporting to be real news—often using social media to drive web traffic and amplify their effect. Unlike news satire, fake news websites deliberately seek to be perceived as legitimate and taken at face value, often for financial or political gain. Such sites have promoted political falsehoods in Germany, Indonesia and the Philippines, Sweden, Myanmar, and the United States. Many sites originate in, or are promoted by, Russia, North Macedonia, Romania, and some individuals in the United States.

    Full Fact

    Full Fact is a charity based in London to check and correct facts reported in the news.

    Glenn Kessler (journalist)

    Glenn Kessler (born July 6, 1959) is an American diplomatic correspondent who writes columns and helms the "Fact Checker" feature for The Washington Post.

    Gossip Cop

    Gossip Cop is a website that fact-checks celebrity reporting. Based in New York City, Gossip Cop investigates entertainment stories that are published in magazines and newspapers, as well as on the web, to ascertain whether they are true or false. To help visitors quickly identify the truth value of every story, the site features a 0-10 scale next to each article. A rating of 0 means the rumor is completely untrue, fiction or even fake news, while a rating of 10 means the report is 100 percent fact or true. Gossip Cop participates in International Fact-Checking Network events, including attending Global Fact 4 in Madrid in July 2017 and Global Fact 5 in Rome in June 2018.

    The website was created by Michael Lewittes, a veteran entertainment journalist. During his 25-year career, Lewittes has served as an "Access Hollywood" producer, the news director for Us Weekly, an editor at The New York Post, and a columnist for the New York Daily News. A graduate of Yale College, Michael was also a correspondent on the E! series, "The Gossip Show." Along with co-founder Dan Abrams, Lewittes launched the site on July 29, 2009 with appearances on Good Morning America and The Today Show. As a result, Gossip Cop received a tremendous amount of publicity, including features in The New York Times and People Magazine. Today, the site is known for its dogged investigations. In November 2017, Elle magazine called Gossip Cop "the Robert Mueller of the celebrity news world."


    Kallxo is an online platform for reporting corruption, fraud, conflicts of interest, and other related misuses of official position, including negligence and cases of infringement of Kosovo citizens' rights.

    Kallxo is part of the International Fact-Checking Network, IFCN, by the Poynter Institute.

    List of fact-checking websites

    This list of fact-checking websites includes websites that provide fact-checking services about both political and non-political subjects.

    The Reporters' Lab at Duke University maintains a database of fact-checking organizations that is managed by Mark Stencel and Bill Adair. The database tracks more than 100 non-partisan organizations around the world. The Lab's inclusion criteria are based on whether the organization

    examines all parties and sides;

    examines discrete claims and reaches conclusions;

    tracks political promises;

    is transparent about sources and methods;

    discloses funding/affiliations;

    and whether its primary mission is news and information.

    Media (communication)

    Media are the communication outlets or tools used to store and deliver information or data. The term refers to components of the mass media communications industry, such as print media, publishing, the news media, photography, cinema, broadcasting (radio and television), and advertising.

    The development of early writing and paper enabled longer-distance communication systems such as mail, including in the Persian Empire (Chapar Khaneh and Angarium) and Roman Empire, which can be interpreted as early forms of media. Writers such as Howard Rheingold have framed early forms of human communication as early forms of media, such as the Lascaux cave paintings and early writing. Another framing of the history of media starts with the Chauvet Cave paintings and continues with other ways to carry human communication beyond the short range of the voice: smoke signals, trail markers, and sculpture.

    The term media in its modern application relating to communication channels was first used by Canadian communications theorist Marshall McLuhan, who stated in Counterblast (1954): "The media are not toys; they should not be in the hands of Mother Goose and Peter Pan executives. They can be entrusted only to new artists, because they are art forms." By the mid-1960s, the term had spread to general use in North America and the United Kingdom. The phrase "mass media" was, according to H.L. Mencken, used as early as 1923 in the United States. The term "medium" (the singular form of "media") is defined as "one of the means or channels of general communication, information, or entertainment in society, as newspapers, radio, or television."

    Media Bias/Fact Check

    Media Bias/Fact Check is a website that rates factual accuracy and political bias in news media. The site classifies media sources on a political bias spectrum, as well as on the accuracy of their factual reporting. The site is run by founder and editor Dave Van Zandt.

    The Columbia Journalism Review describes Media Bias/Fact Check as an amateur attempt at categorizing media bias and Van Zandt as an "armchair media analyst." Van Zandt describes himself as someone with "more than 20 years as an arm chair researcher on media bias and its role in political influence." The Poynter Institute notes, "Media Bias/Fact Check is a widely cited source for news stories and even studies about misinformation, despite the fact that its method is in no way scientific."

    The site has been used by researchers at the University of Michigan to create a tool called the "Iffy Quotient", which draws data from Media Bias/Fact Check and NewsWhip to track the prevalence of 'fake news' and questionable sources on social media. The site was also used by a research group at the Massachusetts Institute of Technology in the initial training of an AI to fact-check and detect bias on a website.

    Media coverage of 2019 India–Pakistan standoff

    The media coverage of the 2019 India–Pakistan standoff, which escalated following an attack in Pulwama on 14 February 2019 through to the Balakot airstrike and its aftermath, was criticised as largely "jingoistic" and "nationalistic", to the extent that the media were accused of war-mongering and of fighting the battle between India and Pakistan through their newsrooms. During the escalation, fake videos and misinformation were prevalent on social media and were reported to further escalate tensions between India and Pakistan. Once tensions began de-escalating, the media coverage shifted to comparisons between "India and Pakistan" and "Narendra Modi and Imran Khan" in terms of who won the "perception battle".


    Micropædia

    The 12-volume Micropædia is one of the three parts of the 15th edition of Encyclopædia Britannica, the other two being the one-volume Propædia and the 17-volume Macropædia. The name Micropædia is a neologism coined by Mortimer J. Adler from the ancient Greek words for "small" and "instruction"; the best English translation is perhaps "brief lessons".

    The Micropædia was introduced in 1974 with 10 volumes containing 102,214 short articles, all of which were strictly shorter than 750 words. This limit was relaxed in the major re-organization of the 15th edition; many articles were merged, resulting in roughly 65,000 articles in 12 volumes. In general, the 750-word limit is still respected and most articles run only one or two paragraphs; however, a few longer articles can be found in the 2007 Micropædia, such as the Internet entry, which takes up a full page.

    With rare exceptions (<3%), the ~65,000 articles of the Micropædia have no bibliographies and no named contributors. The Micropædia is intended primarily for quick fact-checking and as a guide to the 700 longer articles of the Macropædia, which do have identified authors and bibliographies.


    PolitiFact

    PolitiFact.com is a nonprofit project operated by the Poynter Institute in St. Petersburg, Florida, with offices there and in Washington, D.C. It began in 2007 as a project of the Tampa Bay Times (then the St. Petersburg Times), with reporters and editors from the newspaper and its affiliated news media partners reporting on the accuracy of statements made by elected officials, candidates, their staffs, lobbyists, interest groups and others involved in U.S. politics. Its journalists evaluate original statements and publish their findings on the PolitiFact.com website, where each statement receives a "Truth-O-Meter" rating. The ratings range from "True" for completely accurate statements to "Pants on Fire" (from the taunt "Liar, liar, pants on fire") for false and ridiculous claims.

    PunditFact, a related site that was also created by the Times' editors, is devoted to fact-checking claims made by political pundits. Both PolitiFact and PunditFact were funded primarily by the Tampa Bay Times and ad revenues generated on the website until 2018, and the Times continues to sell ads for the site now that it is part of Poynter, a non-profit organization that also owns the newspaper. PolitiFact increasingly relies on grants from several nonpartisan organizations, and in 2017 launched a membership campaign and began accepting donations from readers.

    In addition to political claims, the site monitors the progress elected officials make on their campaign promises, including a "Trump-O-Meter" for President Donald Trump and an "Obameter" for President Barack Obama. PolitiFact.com's local affiliates review promises by elected officials of regional relevance, as evidenced by PolitiFact Tennessee's "Haslam-O-Meter" tracking Tennessee Governor Bill Haslam's efforts and Wisconsin's "Walk-O-Meter" tracking Wisconsin Governor Scott Walker's efforts.

    PolitiFact has won several awards, and has been both praised and criticized by independent observers, conservatives and liberals alike. Both liberal and conservative bias have been alleged at different points, and criticisms have been made that PolitiFact attempts to fact-check statements that cannot be truly "fact-checked".


    Rappler

    Rappler is an online news website based in the Philippines with a bureau in Jakarta, Indonesia. It started as a Facebook page named MovePH in August 2011 and later evolved into a complete website on January 1, 2012. Along with web-based text news content, it was also among the first news websites in the Philippines to extensively use online multimedia, including video, pictures, text and audio. It also uses social media sites for news distribution. According to its own website, the name Rappler is a portmanteau of the words "rap" (to discuss) and "ripple" (to make waves).

    In 2018, it became the subject of legal proceedings brought by arms of the Philippine government. Rappler and its staff said it was being targeted for its revelations of misappropriations by government and elected officials.


    Snopes

    Snopes, formerly known as the Urban Legends Reference Pages, claims to be one of the first online fact-checking websites. It has been described as a "well-regarded source for sorting out myths and rumors" on the Internet. It has also been seen as a source for validating and debunking urban legends and similar stories in American popular culture.

    The Straight Dope

    "The Straight Dope" was a question-and-answer newspaper column written by Cecil Adams and illustrated by Slug Signorino, first published in 1973 in the Chicago Reader and syndicated nationally in the United States. Following the column of June 27, 2018, "The Straight Dope" was placed on hiatus, with no decision made regarding its future.


    USAFacts

    USAFacts, launched circa 2017, is a non-profit organization and website that offers a non-partisan portrait of the US population, its government's finances, and the government's impact on society.


    Vetting

    Vetting is the process of performing a background check on someone before offering them employment or conferring an award, or, more generally, of fact-checking before making a decision. In addition, in intelligence gathering, assets are vetted to determine their usefulness.

    This page is based on Wikipedia articles written by their contributors.
    Text is available under the CC BY-SA 3.0 license; additional terms may apply.
    Images, videos and audio are available under their respective licenses.